Title: Robots and Routines: Exploring the Future of Social Robots in Family Life
Challenges in human-robot interaction (HRI) often involve facilitating sustained interactions over long periods, fostering engagement with multiple individuals, and operating in real-world settings. The home environment embodies all three challenges, given that multiple family members regularly interact with technology in their household. In this talk, I will describe my doctoral research, in which I take a family-centered approach to understanding, designing, and evaluating how social robots can take part in setting and maintaining shared family routines to support long-term HRI. I will present my prior work, including participatory design sessions with families to understand their preferences for having social robots in their homes and the interactions we prototyped for robot-facilitated family routines. The remainder of my research will include a series of field studies and evaluations investigating whether long-term engagement in robot-facilitated shared routines can promote stronger family relationships and interpersonal connections.
Bengisu Çağıltay is a fourth-year PhD student in the Computer Sciences department at the University of Wisconsin-Madison, People and Robots Laboratory. Through qualitative and design-based research, she explores how social robots can be used in family life and facilitate family routines. She received her PhD minor in Human Development and Family Studies, an MS degree in Cognitive Science (’20) from Middle East Technical University, and a BS degree in Computer Science (’18) from Bilkent University. Her prior work has been published and recognized in HCI venues, including the ACM CHI, IDC, and HRI conferences. Her work is supported by funding from the NSF.
Title: Building the Future of Educational VR: Towards an Immersive and Social Learning Experience
Virtual reality (VR) has the potential to revolutionize the way we learn and teach, enhancing and supplementing the traditional learning experience by providing new ways to interact with information and people. However, its full potential in education has yet to be realized, as work in this space requires resolving cutting-edge technical challenges and addressing context-specific, user-centered, and pedagogical concerns in real-world settings. In this talk, I will introduce my research, which pursues the vision of educational VR through 1) empirical investigations from a multi-stakeholder perspective to understand educational VR adoption and usage, and 2) immersive technologies that enhance social learning experiences. I will advocate for key strategies to drive the future of educational VR: lowering the barrier to entry for VR creation tools, fostering collaborative and social experiences, and emphasizing community engagement as the foundation for inclusive VR adoption.
Qiao (Georgie) Jin is a PhD candidate in the Computer Science and Engineering Department at the University of Minnesota, GroupLens Research. Her research focuses on using AR/VR/MR to support remote learning, collaboration, and social connection. Her work has been published and recognized at top-tier venues in HCI and social computing (e.g., ACM CHI, CSCW, and IDC), including a CHI Best Paper Honorable Mention and the Best Toio Award at the UIST Student Contest. She is also a recipient of the Doctoral Dissertation Fellowship from the University of Minnesota, which supports her research.
Title: Pushing Visualizations Beyond the Desktop with Everyday Devices and Novel Interfaces
Visualizations play a critical societal role, from uncovering scientific insights to supporting analytical decision-making. However, most visualizations are siloed desktop applications, which limits the iterative, collaborative nature of data analysis. Sense-making happens in the head as well as in the physical world: we think with objects, with our bodies, with marks on paper, and with other people in a distributed, socially situated manner. How can we bridge the digital and physical worlds for data visualization? My research investigates how data interactions can go beyond the traditional desktop setup in two ways: using everyday devices like smartphones, and through novel platforms for interaction like 3D-printed physicalizations with sensing capabilities. In this talk, I’ll discuss how these beyond-the-desktop platforms offer new opportunities to (1) reimagine our interactions with data, (2) broaden education and accessibility, and (3) even reveal more insights about the users themselves. These examples highlight a future where interactive data visualizations are a more seamless and natural extension of our everyday lives.
Sandra Bae is a PhD student at CU Boulder whose work pushes data beyond the flat screen. Her research focuses on understanding when data should be presented in 2D versus 3D, and digitally versus physically. Her interdisciplinary research combines techniques from human-computer interaction (HCI), data visualization, and digital fabrication. Her work has been presented at top HCI and visualization research venues, including ACM CHI and IEEE VIS, and has been awarded a Best Paper Honorable Mention (VIS 2023). In addition, Bae’s research has been recognized with the NSF PEGS21 Fellowship (2018), the NASA JPL Master’s Thesis Educational Fellowship (2019), the Achievement Reward for College Scientists (2021, 2022, 2023), and a Rising Star in EECS designation (2023).