Augmented Communication

Our research on "Augmented Communication" aims to enhance remote communication in virtual and augmented reality through the integration of cutting-edge technologies such as machine learning, eye tracking, visual augmentation, and gesture recognition. Through this research, we have developed innovative solutions such as Visual Captions, which proactively suggests relevant visuals to aid in open-vocabulary conversations; ThingShare, a video-conferencing system that facilitates the sharing of physical objects; GazeChat, a remote communication system that utilizes gaze-awareness to represent users in 3D profile photos; and CollaboVR, a framework that enables multi-user communication in virtual reality through the design of interactive and reconfigurable layouts. Our goal is to further the state-of-the-art in real-time systems for augmented communication in VR and AR, ultimately making remote communication more universally accessible and effective.
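To make the idea concrete, the sketch below shows, under stated assumptions, how a Visual Captions-style pipeline might turn a rolling speech transcript into candidate visuals: the recent utterances are sent to a large language model, and its reply is parsed into image-search queries to offer on screen. The prompt wording, the `query_llm` callable, and the "query -- reason" output format are hypothetical illustrations, not the published system.

```python
# Hypothetical sketch in the spirit of Visual Captions (not the published
# implementation): feed a rolling transcript window to a text-in/text-out LLM
# and parse its reply into candidate visuals. The prompt wording, the
# `query_llm` callable, and the "query -- reason" format are assumptions.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class VisualSuggestion:
    query: str      # image-search query to render as a candidate visual
    rationale: str  # why this visual supports the current utterance


PROMPT_TEMPLATE = (
    "You are assisting a video call. Given the recent utterances below, "
    "suggest up to {k} helpful images, one per line, formatted as "
    "'<image search query> -- <reason>'.\n\nTranscript:\n{transcript}\n"
)


def suggest_visuals(
    transcript_window: List[str],
    query_llm: Callable[[str], str],  # wraps whichever LLM the system uses
    k: int = 3,
) -> List[VisualSuggestion]:
    """Turn the last few utterances into candidate visuals to offer on screen."""
    prompt = PROMPT_TEMPLATE.format(k=k, transcript="\n".join(transcript_window))
    suggestions: List[VisualSuggestion] = []
    for line in query_llm(prompt).splitlines():
        query, sep, rationale = line.partition("--")
        if sep:  # keep only lines that follow the expected format
            suggestions.append(VisualSuggestion(query.strip(), rationale.strip()))
    return suggestions[:k]
```

In an actual meeting client, the returned queries would be resolved to images and offered to the speaker for on-the-fly display.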

Publications

Visual Captions: Augmenting Verbal Communication With On-the-Fly Visuals

Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI), 2023.
Keywords: augmented communication, large language models, video-mediated communication, online meeting, collaborative work, augmented reality


ThingShare: Ad-Hoc Digital Copies of Physical Objects for Sharing Things in Video Meetings

Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI), 2023.
Keywords: video-mediated communication, object-centered meetings, online meeting, collaborative work, augmented communication

GazeChat: Enhancing Virtual Conferences With Gaze-Aware 3D Photos

Proceedings of the 34th Annual ACM Symposium on User Interface Software and Technology (UIST), 2021.
Keywords: eye contact, gaze awareness, video conferencing, video-mediated communication, gaze interaction, augmented communication, augmented conversation

Videos

Talks

Interactive Graphics for a Universally Accessible Metaverse

Ruofei Du

Invited Talk at UMD, hosted by Prof. Amitabh Varshney, College Park, Maryland.


Computational Interaction for a Universally Accessible Metaverse

Ruofei Du

Invited Talk at GAMES Seminar, Remote Talk.


Fusing Physical and Virtual Worlds into An Interactive Metaverse

Ruofei Du

Invited Talk at UCLA, hosted by Prof. Yang Zhang, Remote Talk.


Polymerizing Physical and Virtual Worlds into An Interactive Metaverse

Ruofei Du

Invited Talk at Birmingham City University, hosted by Prof. Arthur Theil, Remote Talk.


Blending Physical and Virtual Worlds into An Interactive Metaverse

Ruofei Du

Invited Talk at Wayne State University, Remote Talk.


Fusing Physical and Virtual Worlds into Interactive Mixed Reality

Ruofei Du

Invited Talk at George Mason University, Remote Talk.


Cited By

  • I Cannot See Students Focusing on My Presentation; Are They Following Me? Continuous Monitoring of Student Engagement Through "Stungage". arXiv:2204.08193. Snigdha Das, Sandip Chakraborty, and Bivas Mitra.
  • Local Free-View Neural 3D Head Synthesis for Virtual Group Meetings. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). Sebastian Rings and Frank Steinicke.
  • Augmented Chironomia for Presenting Data to Remote Audiences. arXiv:2208.04451. Brian D. Hall, Lyn Bartram, and Matthew Brehmer.
  • A State of the Art and Scoping Review of Embodied Information Behavior in Shared, Co-Present Extended Reality Experiences. Electronic Imaging. Kathryn Hays, Arturo Barrera, Lydia Ogbadu-Oladapo, Olumuyiwa Oyedare, Julia Payne, Mohotarema Rashid, Jennifer Stanley, Lisa Stocker, Christopher Lueg, Michael Twidale, and Ruth West.
  • A Reference Framework for Evaluating Virtual Conferences. Universitätsbibliothek Linz. Lisa-Marie Huber and Alexander Gindlhumer.
  • A Reference Framework for Evaluating Virtual Conferences. Johannes Kepler University Linz. Alexander Gindlhumer.