A natural behavior planner for multi-personal human-robot interaction within the simulated environment

Abstract

In recent years, diffusion models have achieved remarkable success in generating realistic human motions. However, existing robot pose-learning approaches largely focus on single-task, one-to-one scenarios and fail to account for multi-person social interactions. This limitation leads to rigid, context-insensitive behaviors that are ill-suited to real-world service scenarios. Consequently, current systems often produce robotic behaviors that lack the fluidity and responsiveness expected in human-centered environments, a shortcoming underscored by affordance theory in robotics. To address this issue, we propose RoboActor, a novel human-robot interaction behavior planner that draws inspiration from theatrical acting to orchestrate both deliberate and automatic actions. Our framework leverages large language models (LLMs) to disentangle primary command-driven tasks from secondary, context-induced subtasks. In this way, RoboActor generates lifelike and socially appropriate behaviors in multi-person settings, significantly enhancing the naturalness, engagement, and realism of service robots in everyday social applications.

Citation

Chen, Y., Zheng, P., Zhou, Z., Soo, C.-E. K., Wang, H., & Yu, C. (2026). A Natural Behavior Planner for Multi-Personal Human-Robot Interaction within the Simulated Environment. Design and Artificial Intelligence, 2(1), 100062. https://doi.org/10.1016/j.daai.2026.100062

Publisher

Elsevier BV
