Projected metadata: a new way of interfacing

A friend of mine passed on this video on a new type of wearable tech called ‘Sixth Sense’.

Built as a device cobbled together from only $350 worth of off-the-shelf components, what gives ‘Sixth Sense’ its psychological edge is the tactile element… After all, what could be more intuitive than manipulating things with your hands and fingertips? Dive into the vid at around the 02:16 mark, when the speaker (Pattie Maes) starts demoing the technology (video after the jump):

Sigh…

And it’s so simple, it leaves you almost breathless with the irritation of it. Not to mention the niggly feeling that this technology could slip completely under the radar, while no one is watching, and become the first big step on our journey into the crushing arms of some much-anticipated borganism. Long before the “wires” grow out of our myriad devices and embed themselves into us like ivy into crumbling brick, there will be this. This… tactile data thingy.


[update 2009-10-12]: just noticed that a new app for Google’s Android phone is sneaking up on precisely this type of technology. The Android app is called ‘Google Goggles’, and it’s not so much about projected metadata as it is about feeding visual metadata into search algorithms: instead of typing in search keywords, you take a snapshot with your snazzy Android phone and send the photo off to Google, which returns search results relevant to the place / item / person / artefact in your photo.

Compared to the device demo’d at TED, it’s the flip side of the same coin, if you ask me (watch the small vid and you’ll see what I mean):
