One of the more exciting fields of play that we’ve been involved in at Supertouch is Physical Computing. It is one of my favorite fields of interactive tech because it throws out the “traditional” ideas of computer interfaces and UI, replacing them with much older, very human interfaces. At the same time, observing user experience is so much more gratifying for me – I can watch how people physically use my build (often without any need for instruction) rather than thinking of UX in terms of what is on a flat computer monitor.

What is it? Physical Computing can mean many things. In my world it means the use of sensors on (analog) physical objects which, when manipulated, cause digital things to happen. That is a pretty broad description, so here are a couple of cases where we’ve used it.
In the video above, we have constructed a sphere (clear in order to demonstrate the small footprint of the sensor and battery pack) which controls a 3D rendering of the Earth onscreen. Live seismic data is loaded at runtime. As you rotate the sphere in your hand, you see live data representing earthquake activity around the world. Because the sphere is untethered and the motion is so responsive, you can become immersed in the story of the data, detached from the fact that you are using a computer. The sensor is small enough that it can be hidden in almost any object; it need not be a sphere (e.g., put it in a sneaker to demonstrate products, or in a 3D-printed bone to teach about the human form). The data can be anything too; in fact, we later switched to television viewing data comparing viewership of popular shows around the world. This demonstration was created with the help of my good friends Noah Zerkin, on hardware, and Robert Hodgin, who built the Cinder application.
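The post doesn’t detail how the quake data ends up on the globe, but the core step in any build like this is mapping each event’s latitude/longitude onto the surface of the rendered sphere. Here’s a minimal sketch of that conversion (the function name and the unit-sphere convention are my own, purely illustrative; the actual Cinder app’s internals may differ):

```python
import math

def latlon_to_xyz(lat_deg, lon_deg, radius=1.0):
    """Convert a latitude/longitude pair (in degrees) to Cartesian
    coordinates on a sphere of the given radius, so an event can be
    drawn at the right spot on a 3D globe."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    x = radius * math.cos(lat) * math.cos(lon)
    y = radius * math.cos(lat) * math.sin(lon)
    z = radius * math.sin(lat)
    return (x, y, z)

# A quake at the equator / prime meridian lands on the +x axis:
print(latlon_to_xyz(0.0, 0.0))  # (1.0, 0.0, 0.0)
```

Once each event is a point on the sphere, rotating the rendered globe to match the orientation reported by the sensor in the physical sphere is just applying that same rotation to the camera or model matrix.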
Another example that we did very early on was detailed in my older blog post. In short, I used a set of BMX bicycle handlebars that I purchased from a bike shop. We were hired by Pfizer to provide children suffering from hemophilia with an activity that would leave a memorable impression on children and parents alike. The event attracted families from all over the United States to San Francisco, and many families treated the event as a chance at a vacation. We assumed many of the children we would meet had seen some key tourist sites around SF in the few days prior, so we decided to include those locations in the game to create an emotional connection. The “game” was very basic, yet effective. These children don’t get to partake in many activities that most children take for granted. We enabled the kids to take a virtual bike ride by using the handlebars (equipped with sensors) and see many of the actual tourist sites that were still fresh in their memories. I knew we’d succeeded when I repeatedly heard “look mommy, we were there together.” The use of handlebars rather than the traditional joystick or gaming controller allowed us to better tell a story, because we removed any preconceptions about a video game experience and provided something unforeseen.
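The post doesn’t specify what sensor sat on the handlebars, but a typical approach is a rotation sensor (e.g., a potentiometer on the stem) whose raw reading gets mapped to a steering angle for the virtual ride. This is an illustrative sketch under that assumption – the function name, the 10-bit ADC range, and the 45-degree limit are all hypothetical values of my own:

```python
def steering_angle(raw, raw_min=0, raw_max=1023, max_angle=45.0):
    """Map a raw sensor reading (assumed here to be a 10-bit ADC value
    from a potentiometer on the handlebar stem) to a steering angle in
    degrees, centered at 0 and clamped to +/- max_angle."""
    raw = max(raw_min, min(raw_max, raw))       # clamp out-of-range noise
    center = (raw_min + raw_max) / 2.0
    half_range = (raw_max - raw_min) / 2.0
    return (raw - center) / half_range * max_angle

print(steering_angle(511.5))  # 0.0  (bars centered)
print(steering_angle(1023))   # 45.0 (full right turn)
print(steering_angle(0))      # -45.0 (full left turn)
```

Keeping the mapping in one small pure function like this makes it easy to recalibrate on site – swap in whatever min/max the physical build actually produces without touching the rest of the game code.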
Adding similar sensors to a real skateboard allowed us to control the Tony Hawk’s Pro Skater HD video game.
Finally, I can’t mention physical computing without mentioning “wearables” – a VERY exciting field with many implications for changing daily life. I hope to write a future blog post concentrating on wearables, including our experiments and possible commercial applications.
Here’s a video of Noah holding his prototype up for the camera – a board of his own design. Fun stuff.