There is a certain conceptual neatness to this research shared by Slashdot. Developed by CMU’s Chris Harrison, the system uses acoustics to sense taps and gestures on your arm. Harrison has even demonstrated it paired with a pico projector, so imagine an actual keypad displayed on your forearm or palm, and accurately keying input on it.
There is a certain amount of overlap with Sixth Sense, about which I’ve written previously. It has me wondering whether the two systems could be combined in a complementary fashion. The ability to input data without the projector active seems like a huge plus on top of everything else Sixth Sense can do with output and interaction.
Unfortunately, Harrison’s system doesn’t look as compact, though that may improve as it makes its way to market. The Singularity Hub author, Aaron Saenz, who noted the similarity between the two systems (and a third from Microsoft), also considers the other difference that occurred to me: Sixth Sense is open source. Saenz sees that openness as possibly helping Harrison bootstrap his work to market the same way Pranav Mistry is trying to do.
I think just one of them being open might facilitate a fruitful combination of the two. If both were open, who knows what sort of wild, post-human augmentation they might unleash. Eventually.