
Podcast: Sixth Sense AR Tech: TED 2009

Podcast Source:

View Podcast Online? Yes

Podcast length: 9 minutes

Podcast Description

This podcast comes from TED 2009. TED stands for Technology, Entertainment, Design; it is a conference held annually since 1984 that has become one of the most elite technological events on the calendar. This podcast is a talk by Pattie Maes of the MIT Media Lab's new Fluid Interfaces Group, presenting the work of her student, Pranav Mistry, and the Sixth Sense technological augmentation system that was the talk of the conference.

Presenter Biographies

Pattie Maes

Pattie Maes was the key architect behind what was once called "collaborative filtering" and has become a key to Web 2.0: the immense engine of recommendations -- or "things like this" -- fueled by other users. In the 1990s, Maes' Software Agents program at MIT created Firefly, a technology (and then a startup) that let users choose songs they liked, and find similar songs they’d never heard of, by taking cues from others with similar taste. This brought a sea change in the way we interact with software, with culture and with one another.

Now Maes is working on a similarly boundary-breaking initiative. Her newly founded Fluid Interfaces Group, also part of the MIT Media Lab, aims to rethink the ways in which humans and computers interact, partially by redefining both human and computer. In Maes' world (and really, in all of ours), the computer is no longer a distinct object, but a source of intelligence that’s embedded in our environment. By outfitting ourselves with digital accessories, we can continually learn from (and teach) our surroundings.

Pranav Mistry

Pranav Mistry is a PhD student in the Fluid Interfaces Group at MIT's Media Lab. Before his studies at MIT, he worked with Microsoft as a UX researcher. Mistry is passionate about integrating the digital informational experience with our real-world interactions.

Previous projects from Mistry's work at MIT include Quickies, intelligent sticky notes that can be searched and can send reminders; a pen that draws in 3D; and TaPuMa, a tangible public map that can act as a Google of the physical world. His research interests also include Gestural and Tangible Interaction, Ubiquitous Computing, AI, Machine Vision, Collective Intelligence and Robotics.

Transcript Available? No

Audio file available? No

Podcast Download? Yes

28 MB

Podcast viewing notes

Fairly unsurprisingly, the podcast opens with Pattie Maes describing the purpose of the original project: to create a 'sixth sense' that bolts onto the natural senses humans already have - yes, we know there are more than five natural senses - and augments the experience of the world.

The initial section is background information, in which she describes the problem of having access to so much information yet being unable to reach it in real-time when you are out and about - when you meet someone, say, and want to look up their bio as you talk to them. These are the challenges Sixth Sense sought to overcome.

The unit itself consists of a webcam, miniature projector, and mirror set up as a single device which is worn round the user's neck, and four or five brightly coloured, reflective surfaces worn around the fingertips. Additionally, a mobile phone with in-built web browser is used for internet connectivity on the go.

It is thus essentially a machine vision system: it gauges the distance to any object or surface, in the same way virtual keyboards do, then projects augmented reality information onto that surface.

The colours on each finger - the size of pen tops in the prototype, but conceivably just brightly painted nails - are used by the machine vision to track and differentiate the user's fingers in real-time, and thus for gesture recognition. The system is gesture controlled, and can, for example, turn any wall or flat surface into an impromptu computer display for the processing power the user wears. One finger is used to 'hold' down a virtual image while another finger drags, to enlarge or shrink it for example. A single finger can drag items around a desktop, while a frame formed by two index fingers and two thumbs tells the system to use the webcam to take a picture, right then, of whatever it is looking at.
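The core of that tracking step - find the brightly coloured marker pixels in each webcam frame and reduce them to one point per finger - can be sketched very simply. The following is a hypothetical illustration, not Mistry's actual code: it thresholds a frame against a target RGB colour and returns the centroid of the matching pixels, which a real system would run once per marker colour, per frame, before feeding the points into gesture recognition.

```python
def track_marker(frame, target, tol=30):
    """Return the (row, col) centroid of pixels within `tol` of the
    `target` RGB colour, or None if no pixel matches.

    `frame` is a list of rows, each row a list of (r, g, b) tuples."""
    rows, cols, count = [], [], 0
    for r, row in enumerate(frame):
        for c, px in enumerate(row):
            # A pixel "matches" if every channel is close to the target.
            if all(abs(px[i] - target[i]) <= tol for i in range(3)):
                rows.append(r)
                cols.append(c)
                count += 1
    if count == 0:
        return None
    return (sum(rows) // count, sum(cols) // count)

# Synthetic 6x6 "frame": black background with a near-red marker blob
# covering rows 2-3, columns 3-4 (standing in for one fingertip cap).
frame = [[(0, 0, 0)] * 6 for _ in range(6)]
for r in (2, 3):
    for c in (3, 4):
        frame[r][c] = (250, 10, 10)

print(track_marker(frame, target=(255, 0, 0)))  # → (2, 3)
```

A deployed version would use a proper vision library, work in HSV colour space to be robust to lighting, and smooth the centroids over time, but the per-frame logic is the same: colour threshold, then centroid.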

If no flat surface is handy, as the video shows, the system is smart enough to project onto an uneven surface such as the palm of the user's own hand, and carry out the interaction there.

Additional Research Links

Sixth Sense: Integrating Information with the Real World

Pranav Mistry

Fluid Interfaces Group

MIT Media Lab

Staff Comments
