Sensible Agent: A Framework for Unobtrusive Interaction with Proactive AR Agents

Proactive AR agents promise context-aware assistance, but their interactions often rely on explicit voice prompts or responses, which can be disruptive or socially awkward. We introduce Sensible Agent, a framework designed for unobtrusive interaction with these proactive agents. Sensible Agent dynamically adapts both “what” assistance to offer and, crucially, “how” to deliver it, based on real-time multimodal context sensing. Informed by an expert workshop (n=12) and a data annotation study (n=40), the framework leverages egocentric cameras, multimodal sensing, and Large Multimodal Models (LMMs) to infer context and suggest appropriate actions delivered via minimally intrusive interaction modes. We demonstrate our prototype on an XR headset through a user study (n=10) in both AR and VR scenarios. Results indicate that Sensible Agent significantly reduces perceived intrusiveness and interaction effort compared to a voice-prompted baseline, while maintaining high utility.
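To make the joint “what + how” decision described above concrete, the sketch below shows one way such a pipeline could be wired up: sensed context is summarized and passed to an LMM, which returns both a suggestion and an unobtrusive delivery mode. This is a minimal illustration under assumptions, not the paper's implementation; the `Context` fields, the `lmm.generate()` call, and all names are hypothetical.

```python
# Hypothetical sketch of a "what + how" decision loop for a proactive AR agent.
# Assumes an LMM client exposing generate(prompt) that returns a JSON string;
# all field names and the interaction-mode vocabulary are illustrative.
import json
from dataclasses import dataclass


@dataclass
class Context:
    scene_caption: str    # e.g., caption derived from the egocentric camera
    audio_activity: str   # e.g., "conversation", "quiet", "music"
    hands_busy: bool      # e.g., inferred from hand tracking


def decide_assistance(lmm, ctx: Context) -> dict:
    """Ask the LMM both WHAT to offer and HOW to deliver it unobtrusively."""
    prompt = (
        "Given the user's current context, propose one proactive suggestion "
        "('what') and a minimally intrusive interaction mode ('how', e.g., "
        "small icon, short text card, head-nod confirmation).\n"
        f"Scene: {ctx.scene_caption}\n"
        f"Audio: {ctx.audio_activity}\n"
        f"Hands busy: {ctx.hands_busy}\n"
        "Respond as JSON with keys 'what' and 'how'."
    )
    return json.loads(lmm.generate(prompt))


# Example: while the user carries groceries during a conversation, the agent
# might suggest directions ('what') delivered as a glanceable icon ('how').
```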

Publications


Sensible Agent: A Framework for Unobtrusive Interaction with Proactive AR Agents

Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology (UIST), 2025.
Keywords: Proactive Agents, Augmented Reality, Unobtrusive Interaction, Context-Awareness, Multimodal Interaction, Human-Agent Interaction, Large Multimodal Models, Adaptive Interfaces


