Pattie Maes is a professor in MIT’s Program in Media Arts and Sciences. She founded and directed the MIT Media Lab’s Fluid Interfaces group. Previously, she founded and ran the Software Agents group. She currently serves as associate department head of the Media Arts and Sciences department. Prior to joining the Media Lab, Maes was a visiting professor and a research scientist at the MIT Artificial Intelligence Lab. She holds bachelor’s and PhD degrees in computer science from the Vrije Universiteit Brussel in Belgium.
Pattie talked about the limitations of current computer technology – that it is impractical to Google a person or product every time you see it. Wearable technology can do this for you. She showed off an array of items that cost less than $350, including a camera, wearable projector, mirror, phone, and colourful caps worn on the fingers. Using this, you can walk up to any wall and begin using a computer – with the camera tracking the gestures you make by recognising the coloured caps on your fingertips. Even without a wall, you can project onto your hand – e.g. as a simple dialling pad for a phone.
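The finger tracking described here can be sketched roughly as marker-colour detection: scan each camera frame for pixels matching a cap’s distinctive colour, and take their centroid as the fingertip position. The sketch below is a hypothetical, pure-Python illustration on a toy frame, not the actual SixthSense code; a real system would use a computer-vision library on live video, typically in HSV colour space.

```python
# Toy "camera frame": a grid of (r, g, b) pixels. This sketch only shows
# the marker-detection idea behind colour-cap finger tracking.

MARKER = (255, 0, 0)  # hypothetical cap colour: pure red

def find_fingertip(frame, marker, tol=30):
    """Return the (row, col) centroid of pixels within `tol` of the marker
    colour on every channel, or None if the marker is not visible."""
    hits = []
    for r, row in enumerate(frame):
        for c, pixel in enumerate(row):
            if all(abs(p - m) <= tol for p, m in zip(pixel, marker)):
                hits.append((r, c))
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

# A 4x4 black frame with a red cap covering a 2x2 patch in the top-left corner.
frame = [[(0, 0, 0)] * 4 for _ in range(4)]
for r in range(2):
    for c in range(2):
        frame[r][c] = (250, 10, 5)

print(find_fingertip(frame, MARKER))  # -> (0.5, 0.5)
```

Tracking one distinctly coloured cap per finger lets the same loop recover several fingertip positions per frame, which is what makes multi-finger gestures readable.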
Pattie showed some examples:
- Shopping for paper towels, with reviews and product information projected onto each one.
- Buying a book, you can see an overall star rating on the front cover. Turning the page reveals user comments; turning again reveals annotations from professionals.
- When talking to someone, a ‘tag cloud’ can be projected onto their T-shirt, showing traits that person is known for, or information from blogs associated with them.
- Looking at your boarding pass shows whether the flight is delayed or the gate has changed.
- Drawing on your wrist projects the current time.
Pranav Mistry is a PhD student in the Fluid Interfaces Group at MIT’s Media Lab. Before his studies at MIT, he worked with Microsoft as a UX researcher; he’s a graduate of IIT. Mistry is passionate about integrating the digital informational experience with our real-world interactions.
Gestures are everything, and come naturally to us. Pranav asks why we can’t interact with computers in the same way we interact with the physical world. He experimented with different input systems for computers:
- A hacked mouse turned into a glove, allowing the computer to read hand movements.
- Sticky notes that the computer could write and read, with messages either sent on as SMS or treated as input to the computer.
- A pen that can draw in three dimensions.
- A computer map built into a table.
People are interested in information, not necessarily the computers or pixels that display it. His next step was to try to eliminate the computer. SixthSense is a helmet-mounted computer that projects onto a wall and tracks your fingers using a camera. You can gesture at any wall to use the computer. One gesture immediately takes a photo; another sends it as an email. Some extra features, acting as an interface between the physical and digital worlds, are:
- Recognising an object such as a book and projecting a review onto it (a three-star rating, etc.).
- Projecting videos or images onto newspapers.
- Looking at a boarding pass to see if the flight is delayed.
- Playing Pong on the ground with your feet.
- Projecting onto a piece of paper and using it as a touch screen (playing racing games, drawing with a finger, browsing the web).
- ‘Copy and paste’ from the physical world onto the paper screen.
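The gesture-to-action behaviour above (one gesture takes a photo, another sends it) can be sketched as a simple dispatch: classify the tracked fingertip positions into a named gesture, then look that name up in an action table. Everything below – the pinch rule, the thresholds, the action names – is a hypothetical illustration, not the actual SixthSense implementation.

```python
import math

def classify(thumb, index, pinch_threshold=20.0):
    """Classify two tracked fingertip positions (x, y in pixels) into a
    gesture name. Hypothetical rule: fingertips close together = 'pinch'."""
    distance = math.dist(thumb, index)
    return "pinch" if distance < pinch_threshold else "open"

# Gesture -> action dispatch table (placeholder actions for the sketch).
ACTIONS = {
    "pinch": lambda: "take photo",
    "open": lambda: "idle",
}

def handle(thumb, index):
    """Run the action associated with the current gesture."""
    return ACTIONS[classify(thumb, index)]()

print(handle((100, 100), (105, 103)))  # fingertips ~6 px apart -> 'take photo'
print(handle((100, 100), (200, 180)))  # far apart -> 'idle'
```

The design point is separation of concerns: the vision layer only has to emit gesture names, so new gestures and actions can be added by extending the table rather than rewriting the tracker.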
SixthSense has the potential to keep us more connected to the physical world, and to keep us human rather than machines sitting in front of other machines.
At the end of the video, he announced that the software would be made open source for others to experiment with. The hardware is relatively cheap at ~$300.