Here’s an interesting concept. Why not turn yourself into a touchscreen?

Chris Harrison at Carnegie Mellon University in Pittsburgh, Pennsylvania, together with Dan Morris and Desney Tan at Microsoft’s research lab, created a system called Skinput that turns the user’s own hands and arms into touchscreens by detecting the distinct ultralow-frequency sounds produced when different parts of the skin are tapped. A microchip-sized “pico” projector renders interactive elements directly onto the user’s forearm and hand.
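The core idea, recognizing where the skin was tapped from its acoustic signature, can be sketched as a simple nearest-profile classifier. The feature vectors, location labels, and values below are invented purely for illustration; they are not Skinput’s actual signal pipeline.

```python
import math

# Hypothetical acoustic "fingerprints" (e.g. energy in a few low-frequency
# bands) for taps at three skin locations; all values are made up for
# illustration only.
TAP_PROFILES = {
    "wrist":   [0.9, 0.2, 0.1],
    "forearm": [0.4, 0.8, 0.3],
    "palm":    [0.1, 0.3, 0.9],
}

def classify_tap(features):
    """Return the location whose stored profile is nearest (by Euclidean
    distance) to the observed tap's feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TAP_PROFILES, key=lambda loc: dist(TAP_PROFILES[loc], features))

# A tap whose features sit closest to the "wrist" profile:
print(classify_tap([0.85, 0.25, 0.15]))
```

The real system reportedly distinguishes taps because bone density, soft tissue, and joint proximity vary across the arm, giving each location a characteristic acoustic response; a trained classifier over those responses is the natural fit.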

Check out the video too!

