
A couple of years ago at TED, we heard from a young grad student named Jeff. Jeff showed off a new type of interface that anyone who watched the elections on CNN is now familiar with, but that at the time no one had seen: multi-touch. It’s on your iPhone, and you are going to see it in more and more places. Everyone in the audience knew that we were among the first to see a new technology that was going to take off.
Today we got to see another such technology. An MIT Media Lab researcher showed a technology her team has been working on that allows us to integrate data with our other senses. The device she demonstrated, built from off-the-shelf parts (like a simple webcam), “sees” and can search for information based on what it sees. For example, you can look at a book in a bookstore, and the device will pull up an Amazon review and project it onto the book’s pages for you to read. If you see something you would like to photograph, you simply form your hands into a frame; the device “sees” the gesture and takes a picture of whatever is inside it. It can also project numbers onto your hand, which then act as a keypad for your cell phone.
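The device's basic loop, as described, is: see an object, search for information about it, then project that information back onto the object. Here is a toy sketch of that loop; every function name and all the data below are hypothetical stand-ins (the real system would use computer vision and live web queries), included only to make the flow concrete:

```python
# Toy sketch of the "see -> search -> project" loop described above.
# All names and data are hypothetical; this is not the actual system.

def see(camera_frame):
    """Stand-in for object recognition: map a raw frame to a label."""
    # A real device would run image recognition on the webcam frame here.
    return camera_frame.get("label", "unknown")

def search(label):
    """Stand-in for an information lookup (e.g., an Amazon review query)."""
    fake_reviews = {
        "book": "4.5 stars - 'A gripping read.'",
    }
    return fake_reviews.get(label, "no information found")

def project(surface, text):
    """Stand-in for projection: a real device beams text onto the object."""
    return f"[projected onto {surface}] {text}"

frame = {"label": "book", "surface": "book pages"}
info = search(see(frame))
print(project(frame["surface"], info))
```

The point of the sketch is the separation of concerns: recognition, lookup, and display are independent stages, which is what lets one cheap camera and projector support several different tricks (book reviews, framing gestures, a keypad on your palm).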