With CAREER Award, Asst. Prof. Pedro Lopes Explores Human-Computer Integration

April 05, 2021

The long-term shrinking trend of technology has dramatically changed our world, putting powerful computers that once filled entire rooms into our pockets. Many experts predict that the next frontier is the human body itself, with implantable devices under the skin and inside muscles and organs. But beyond concerns about ethics and invasiveness, placing a computer inside the body sacrifices a key element of technology: interactivity.

In recent years, UChicago CS assistant professor Pedro Lopes has explored what’s possible with technologies that sit on the body: wearable devices that influence a user’s motion and perception. His vision of human-computer integration creates new interactive devices that “borrow” parts of the user’s body for input and output to expand potential and accessibility.

With a new NSF CAREER grant, the agency’s most prestigious award in support of early-career faculty, Lopes will embark upon the next phase of that mission, inventing and testing technologies that interface with smell, touch, temperature, and other senses. 

“There's a big limitation coming around the corner in the way that we're designing technology,” Lopes said. “At some point, implantables are so deep into your tissues that they don't even expose a user interface, so they're not even interactive anymore, they have to do things automatically. I'm not so interested in that future, but more interested in understanding, is there any way to circumvent these limitations? The CAREER will empower us to do more of this work, to discover more of these approaches that allow us to trick the human body into producing those sensations without needing to externally generate them.”

The work builds upon Lopes’ previous experiments with electrical muscle stimulation (EMS), where electrodes on the skin activate muscles to help users operate an unfamiliar tool, draw a complicated data plot, or take a photograph of a fast-moving baseball. With members of his Human-Computer Integration Lab at UChicago CS, Lopes has also invented devices for making virtual reality or other “out of body” sensory experiences more realistic, using chemicals to produce illusions of temperature (with PhD student Jas Brooks) or wearable devices that simulate the grasp of small children (with postdoctoral researcher Jun Nishida).

Left: HandMorph (Jun Nishida et al.). Right: Trigeminal-based Temperature Illusions (Jas Brooks et al.). For more, see https://lab.plopes.org/.

The CAREER grant will support more investigations into these sensory domains, creating new devices that can produce and manipulate smell, temperature, and skin sensations. Current approaches for controlling these senses typically involve large, bulky machinery, but Lopes will explore indirect strategies that are more easily miniaturized and thus practical for wearable technology.

“My idea is not to add more electronics that externally induce sensations, such as warming the skin with a heating pad or pushing the hand with an exoskeleton, but to add only the minimum components needed to trick the user's biology into internally inducing the sensation,” Lopes said.

That mission could inspire a device that stimulates a user’s trigeminal nerve inside the nose to simulate smells or an arm pad that secretes chemicals to create sensations of heat, tingling, or vibrations in the skin. These technologies may help make virtual experiences more realistic, or restore some olfactory function to anosmics — people who have lost their sense of smell. On the latter application, Lopes has started a collaboration with Jayant Pinto, a professor of surgery at UChicago Medicine who treats patients with loss of smell. 

Lopes will also continue research into how devices that take control of a person’s body or senses interact with the user’s sense of agency. In previous work, Lopes located a brain area where activity changed according to whether a person felt they initiated a movement or attributed their action to the device’s control. In the baseball photography experiment, Lopes and Nishida also found that adjusting the timing of the EMS device’s “assistance” could trick people into thinking they took the photograph themselves.

In a collaboration with Professor of Psychology Howard Nusbaum, Lopes will pursue further experiments identifying this neural signal of agency and using it to create integrated devices that retain the user’s feeling of self-control while benefiting from the technological enhancement. Such a balance might also be crucial for testing whether the use of these devices can teach people new skills. For instance, would a wearable that activates a person’s muscles to help them learn to drum in time produce improvements even after the device was removed?

“Does it still work when you take off the electrodes? Do you get permanently sped up? We’re testing that question in new ways,” Lopes said. “The speedups we're seeing are small, on the order of tens of milliseconds, but that improvement in reaction time is very significant if you're trying to hit a baseball or a tennis ball. But our tests so far are very constrained, we want to use this opportunity to make the problem more natural and more complex.”

Lopes’ grant will also support new education, mentoring, and outreach initiatives. Since his arrival at UChicago, Lopes has participated in events designed to encourage female and non-binary participation in computer science, such as the compileHer capstone event and Ada Lovelace Week. He is also developing new courses in human-computer interaction for the UChicago College Prep program, as well as a new Engineering Electronics onto Printed Circuit Boards course that will teach UChicago students outside computer science the fundamentals of creating electronic devices.

In designing hands-on experiences with students who may not have any prior computer science experience, Lopes hopes to offer new doorways that expand participation in the field — a second route by which his devices can increase human potential.

“It’s using human-computer interaction to produce a good first interaction with computer science,” Lopes said. “Many people’s introduction to computer science is through learning about abstract algorithms or programming and while that may work for many, it might not reach everyone. So, what if you would use tangible physical devices and user interfaces in general to motivate someone's interest? That’s my goal.”