
“SocialMind: LLM-based Proactive AR Social Assistive System with Human-like Perception for In-situ Live Interactions” was recently accepted and published in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT).
In this paper, we propose SocialMind, the first LLM-based proactive Augmented Reality (AR) social assistive system that provides users with in-situ social assistance. SocialMind employs human-like perception, leveraging multi-modal sensors to extract verbal and nonverbal cues, social factors, and implicit personas, and incorporates these social cues into LLM reasoning to generate social suggestions. Additionally, SocialMind employs a multi-tier collaborative generation strategy and a proactive update mechanism to display suggestions on AR glasses, ensuring they reach users in a timely manner without disrupting the natural flow of conversation. Evaluations on three public datasets and a user study with 20 participants show that SocialMind achieves 38.3% higher engagement than baselines, and 95% of participants are willing to use SocialMind in their live social interactions.
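To give a rough sense of the idea, the sketch below shows one way such a multi-tier pipeline could be wired up. This is not the authors' implementation; all names (`SocialCues`, `fast_local_model`, `cloud_llm`) are hypothetical stand-ins. The premise is that a low-latency on-device tier keeps a suggestion on screen immediately, while a slower, higher-quality LLM tier proactively replaces it once its result arrives.

```python
from dataclasses import dataclass, field

@dataclass
class SocialCues:
    """Illustrative container for the cues extracted from multi-modal
    sensors; the field names here are hypothetical."""
    verbal: str = ""                                    # partner's transcribed speech
    nonverbal: list = field(default_factory=list)       # e.g., ["smiling", "nodding"]
    social_factors: list = field(default_factory=list)  # e.g., ["first meeting"]
    persona: str = ""                                   # implicit persona inferred over time

def build_prompt(cues: SocialCues) -> str:
    """Fold all extracted social cues into a single LLM prompt."""
    return (
        "You are an in-situ social assistant. Suggest a brief next utterance.\n"
        f"Partner said: {cues.verbal}\n"
        f"Nonverbal cues: {', '.join(cues.nonverbal) or 'none'}\n"
        f"Social factors: {', '.join(cues.social_factors) or 'none'}\n"
        f"Inferred persona: {cues.persona or 'unknown'}"
    )

def fast_local_model(prompt: str) -> str:
    # Tier 1 stand-in: a small on-device model with low latency but
    # generic output, so the user is never left waiting.
    return "Ask a follow-up question about what they just said."

def cloud_llm(prompt: str) -> str:
    # Tier 2 stand-in: a large cloud LLM with higher quality but higher
    # latency; a real system would issue an API call here.
    return "Say you also enjoy hiking and ask about their favorite trail."

def assist(cues: SocialCues, display) -> None:
    """Multi-tier generation with a proactive update: show the fast
    suggestion immediately, then replace it on the AR display once the
    higher-quality suggestion is ready."""
    prompt = build_prompt(cues)
    display(fast_local_model(prompt))  # shown at once on the glasses
    display(cloud_llm(prompt))         # proactively updates the display

# Example usage, with print() standing in for the AR display.
assist(
    SocialCues(verbal="I went hiking last weekend.",
               nonverbal=["smiling"],
               social_factors=["casual small talk"],
               persona="outdoor enthusiast"),
    display=print,
)
```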
This work was done in collaboration with the AIoT Lab at The Chinese University of Hong Kong.