Rapid advances in AI have produced increasingly capable robots that are now appearing in our homes, workplaces, and everyday environments. These robot assistants typically ship with manufacturer-defined default settings, but as with any consumer product, one size rarely fits all. For robots to gain widespread acceptance in personal spaces like our homes, they need to be customizable to individual preferences. That customization, however, presents an interesting dilemma.

Many cutting-edge AI-powered robots are marketed as autonomous, intelligent entities capable of natural language interaction. This led Ph.D. Candidate Alex Wuqi Zhang to wonder: might allowing users to customize or program a robot’s behavior inadvertently undermine the very perception of social agency and autonomy that makes these intelligent products appealing in the first place?

Zhang first got the idea for this research from a scene in the movie Interstellar. “One very specific moment that kindled this idea was when Cooper, played by Matthew McConaughey, was adjusting the humor and honesty settings of his robot TARS,” Zhang said. “I found it jarring to adjust the personality/behavior of a robot that had such a strong social presence.”

As a researcher passionate about human-centered technology, Zhang found the perfect environment to explore this question in Assistant Professor Sarah Sebo’s Human-Robot Interaction (HRI) lab at the University of Chicago’s Department of Computer Science. There, he designed a study to examine how different levels of customization affect our perception of robots as social beings. His paper, “Balancing User Control and Perceived Robot Social Agency through the Design of End-User Robot Programming Interfaces,” presented at the 2025 ACM/IEEE International Conference on Human-Robot Interaction, revealed a clear pattern: when users could program granular details of a robot’s behavior, they began to view it less as an autonomous social agent and more as a preprogrammed tool. This insight earned the paper a Best Paper nomination and highlighted a key tension in robot design.

For his experiment, Zhang developed a prototype using Hello Robot’s Stretch platform, transforming it into an at-home robot butler. He also engineered three different end-user programming interfaces that varied in the level of customization they allowed: a high-granularity interface allowing detailed customization of every behavior, a low-granularity option offering simple macro adjustments through sliders and buttons, and a control condition with no customization options.
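
The article does not include the interfaces themselves, but the contrast between the two customization levels can be sketched roughly in code. The Python below is purely illustrative: the field names, slider values, and behavior steps are assumptions made for the sake of the example, not details from Zhang’s actual system.

```python
# Hypothetical illustration of customization granularity -- not the study's actual code.

# High-granularity interface: the user scripts individual behaviors step by step.
high_granularity_program = [
    {"action": "greet", "speech": "Welcome home! Can I take your bag?"},
    {"action": "take_bag", "arm_height_cm": 95, "grip_force": "gentle"},
    {"action": "small_talk", "topic": "user_day", "follow_up_questions": 2},
    {"action": "offer_snack", "options": ["chips", "granola bar", "fruit"]},
]

# Low-granularity interface: the user only adjusts a few macro sliders,
# and the robot maps them to preset behaviors internally.
low_granularity_settings = {
    "chattiness": 0.7,   # slider from 0.0 (quiet) to 1.0 (talkative)
    "formality": 0.3,    # slider from casual to formal
    "proactivity": 0.5,  # how often the robot offers help unprompted
}

def run_butler(program=None, settings=None):
    """Dispatch either a detailed behavior script or macro-level presets."""
    if program is not None:
        for step in program:
            print(f"Executing step: {step['action']}")
    elif settings is not None:
        print(f"Running preset behaviors tuned by: {settings}")
    else:
        print("Running manufacturer defaults (control condition).")

run_butler(program=high_granularity_program)
```

The default branch of `run_butler` stands in for the control condition, in which participants could not customize the robot at all.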

“The engineering behind building this prototype of a robot butler that could handle all these different tasks at once was like conducting a small orchestra,” Zhang explained. “We had to coordinate everything from how it moves and talks to how it sees the world around it.”

The study took place in a conference room thoughtfully transformed into a living space, where participants engaged with the robot in everyday scenarios. They handed over their bags upon entering, engaged in casual conversation about their day, and requested snacks from a nearby selection. Each participant experienced one of the three interface conditions, allowing researchers to observe how different levels of customization influenced their perception of the robot as a social entity.

The significance of this research extends beyond academic interest. As Zhang points out, the customization of AI tools and robots has become increasingly central to product design, yet the research community has largely overlooked how different levels of customization might affect the fundamental way users perceive these technologies.

“We know customization can better align products with user preferences,” Zhang reflected, “but what might we inadvertently damage in the product experience ecosystem? Understanding these trade-offs is crucial for the overall user experience.” He believes that for many robot applications on the horizon, maintaining social capabilities could be central to long-term acceptance, making perceived social agency a factor designers may want to weigh carefully.

The research raises important questions for companies developing the next generation of AI companions and assistants: How much control should users have? Where is the balance between customization and maintaining the perception of autonomous intelligence?

Building on these findings, Zhang has already begun exploring new territory with a follow-up study investigating multi-modal inputs for guiding robot behavior. This continued research aims to further refine our understanding of human-robot relationships in an increasingly automated world.

To learn more about this and other practical work from the HRI lab, visit their research page here.
