The decades-long miniaturization of technology has dramatically changed our world, putting powerful computers that once filled entire rooms into our pockets. Many experts predict that the next frontier is the human body itself, with implantable devices under the skin and inside muscles and organs. But beyond concerns about ethics and invasiveness, placing a computer inside the body sacrifices a key element of technology: interactivity.
In recent years, UChicago CS assistant professor Pedro Lopes has explored what’s possible with technologies that sit on the body: wearable devices that influence a user’s motion and perception. His vision of human-computer integration creates interactive devices that “borrow” parts of the user’s body for input and output, expanding both what users can do and who can do it.
With a new NSF CAREER grant, the agency’s most prestigious award in support of early-career faculty, Lopes will embark upon the next phase of that mission, inventing and testing technologies that interface with smell, touch, temperature, and other senses.
“There’s a big limitation coming around the corner in the way that we’re designing technology,” Lopes said. “At some point, implantables are so deep into your tissues that they don’t even expose a user interface, so they’re not even interactive anymore; they have to do things automatically. I’m not so interested in that future, but more interested in understanding: is there any way to circumvent these limitations? The CAREER will empower us to do more of this work, to discover more of these approaches that allow us to trick the human body into producing those sensations without needing to externally generate them.”
The work builds upon Lopes’ previous experiments with electrical muscle stimulation (EMS), in which electrodes on the skin activate muscles to help users operate an unfamiliar tool, draw a complicated data plot, or photograph a fast-moving baseball. With members of his Human-Computer Integration Lab at UChicago CS, Lopes has also invented devices that make virtual reality and other “out of body” sensory experiences more realistic, using chemicals to produce illusions of temperature (with PhD student Jas Brooks) or wearable devices that simulate the grasp of small children (with postdoctoral researcher Jun Nishida).