
2022 CHCI Student Symposium Celebrates Student Research

May 10, 2022

The 2022 CHCI Student Research Symposium was held in-person April 29th at the Moss Arts Center, celebrating student scholarship and creativity. Over forty students showcased research in posters, demos, and art installations. The student symposium was organized by the CHCI Student Council.

The Best Presentation Prize was designed and created by Student Council member Don Park (Industrial Design), an aspiring toy designer.

First Place

Best Presentation award co-winners Shane Bennett and Mia Saade

MOVIS Scale 
Shane Bennett 

This project uses a tablet as a window into a Virtual Reality environment rendered in Unity. The software, VR Viewfinder, was tested for effectiveness with individual users in prior semesters and is being tested at a larger scale (10 users) this semester. The project is being evaluated from both biophysical and technical perspectives, with a current focus on technical stress testing. The biophysical examination will be done with real participants, while the static test will be done without motion. In the static stress test, multiple Vive Trackers will be placed in the real environment, and data will be collected to verify accuracy and precision across different physical layouts and numbers of trackers. The trackers will be represented in the virtual environment using a rendered camera, creating the window illusion. Dynamic testing, currently in the design phase, will be done in a later semester.

Arts in Extended Reality
Mia Saade

This research project aims to create and refine a taxonomy of interactive VR art by collecting feedback from focus groups and from experts in technology and art, with a view to defining and describing this emerging field of study. The study will build on the reported experiences of users in a cross-reality art space using either a headset or a tablet, to establish whether and how those users enjoy VR art experiences together in a shared physical environment. Our study further builds on prior research, notably the VR ViewFinder Project (#19-1166), and on a literature review of aesthetic experience, social VR, and VR art. It tackles a key limitation of studies on the impact of the VR ViewFinder on the art experience, which lack precise measures of its benefits.

Second Place

Second place best presentation award winners, Hamda Almahri and Jiayuan Dong

Robot Musical Theater for promoting STEAM Ed & Integrated Art Performances
Jiayuan Dong, Hamda Almahri

The robot musical theater project consists of two parts: an afterschool program at Eastern Montgomery Elementary School and Robot Musical Theater for Climate Change Education at the Cube. The afterschool program, held in Fall 2021, was a 13-week program that promoted Science, Technology, Engineering, Arts, and Math (STEAM) education to elementary students at Eastern Montgomery Elementary School through effective human-robot interaction (HRI). The robot musical theater was performed at the VT Science Festival 2021. The goal of the current phase of the project is to address climate change by using social robots with immersive systems (e.g., the Cyclorama) to facilitate STEAM education. Children, teachers, and parents left the program with positive reviews, and the results suggest the robot theater program maintains and advances students' interest and curiosity in STEAM education.

Third Place

Third place best presentation award winners, from left to right: Timothy Palamarchuk, Vikram Bala, Lauren Sartori, Forrest Meng, and Soham Gandhi.

Haptic Tactics: Advancing Manufacturing with VR Haptic Drilling
Forrest Meng, Lauren Sartori, Timothy Palamarchuk, Soham Gandhi, Vikram Bala

As we enter Industry 4.0, there are endless opportunities for innovation. Our team aims to reimagine current manufacturing tasks using a state-of-the-art VR haptic system that enables mechanics to perform their tasks more effectively and safely. Our current scope is recreating the sensation of drilling in aircraft. Our haptic drilling controller employs compliant actuation to recreate resistive forces that mimic those of a real drilling task. Additionally, we are developing a virtual environment that interacts with the haptic system. To our knowledge, our system is unique: current solutions in this problem space focus on vibrotactile feedback, which cannot render the 'feel' of the operation. Finally, our approach is human-centered, employing a human-in-the-loop model for control, modeling, and simulation.

Over forty students presented at the CHCI Student Symposium.

Presentations

iThrive
Shiva Ghasemi

We present our investigation into how Augmented Reality (AR) can be used in healthcare training, with the aim of increasing efficiency in Lean Healthcare (LH) and improving telemedicine visits for infants and children. We employ telemedicine carts, systems that integrate cameras, displays, and network access to bring remote physicians right to the patient's side. This allows patients to communicate with a healthcare provider using technology rather than physically visiting a doctor's office or hospital. To achieve this, we propose an AR training program that will improve both online and in-person learning experiences. AR overlays digital information on real-world environments; in this study, it serves as interactive learning support to increase engagement and immersion in telehealthcare. Digital and virtual objects (e.g., graphics, text, sounds) are superimposed on the existing environment to create an AR learning experience for telemedicine operators. The training system was designed using insights from both contextual inquiry (qualitative) and eye-tracking (quantitative) techniques to inform the design solutions. This paper contributes to an understanding of how intuitive AR experiences benefit the four parties involved in this research: patients, parents, doctors, and nurses/operators.

Klaxon. My Dear Sweet Friend
Nikita Shokhov

This 360 VR experience is a laboratory for understanding and poetically interpreting whiteness and the double consciousness of Black people. It is a journey through the intimate world of memories and thoughts of a woman, moving through several stages of her life and her different selves. Through symbolic language, a viewer contemplates her life-long process of becoming. An innovative approach in immersive storytelling bridges cinematic and post-dramatic theater experience and employs gaze-interactive visual poetry. The visual narrative unfolds in the subconscious state between dream and wakefulness.

Audio to Augment Your Reality (AAYR) - A Grounded Theory Study 
Abhraneil Dam, Arsh Siddiqui 

We are working to generate and refine a taxonomy for audio augmented reality; we have created a definition and discerned various form factors, use cases, and potential limitations of those use cases. To compile the taxonomy, we have been interviewing experts in audio and in augmented reality, along with undergraduate and graduate students at Virginia Tech, and reviewing previous papers that have studied audio augmented reality.

To develop these definitions well, we used a grounded theory approach, splitting the research framework into three stages: segmenting our data, coding and identifying the properties of our data, and developing a set of propositions from the data. We intend to use this study as a basis for further quantitative experiments.

Sonically-Enhanced In-Vehicle Gesture Interactions
Ahmad Abu Shamat

In-vehicle touchscreen displays offer many benefits, but they can also distract drivers. Car accidents have been on the rise as drivers' focus shifts toward secondary tasks. The goal of the project is to focus on auditory in-vehicle gestures, which should decrease the time spent completing secondary tasks and allow drivers to keep their attention on driving. A Leap Motion camera will be used to record participants performing air gestures, such as swipe and select. Four auditory cues (earcon, spearcon, auditory icon, and no audio) will be explored to determine which is most beneficial. Participants will complete four different driving scenarios. Data will be collected through NASA-TLX surveys, questionnaires, eye-tracking data, and other measures.

Immersive Sports: Improving soccer training through the use of VR and three-dimensional technology
Fernando Branco Moraes

In collaboration with the Grado Department of Industrial and Systems Engineering and the Applied Research in Immersive Environments and Simulations (ARIES) program within University Libraries, this project focuses on using Virtual Reality (VR) and three-dimensional (3D) technology to find ways of improving soccer athletes' performance both in training and in games. By critically analyzing players' characteristics, choices, behavior, and data, VR and 3D technology will simulate scenarios that match as closely as possible the parts of the game the players feel need improvement. Through analysis of previous papers, data collection, testing with humans, and documentation, this research will integrate the worlds of computer science and sports to explore possible outcomes. The objective is to improve soccer athletes' training through familiarization and immersion in realistic game scenarios, recreated as faithfully as possible with crowd pressure and weather effects.

Effects of Acoustic Situation Awareness on Pedestrian Safety within VR
Nitin Ayyala, Abhraneil Dam

Our project immerses participants in a virtual reality simulation of a crosswalk with oncoming traffic. Participants must cross the road without being hit by a car while trying to detect an ambulance siren in the distance, all while music plays through their earphones. They are given two different sets of earphones, and the music shifts randomly between lyrical and non-lyrical. The scenario also randomly introduces a bus into the environment.

Muskan Gupta presents Platform vs Users.

Platform vs Users: Investigating Content Warnings & Trigger Warnings on Social Media
Muskan Gupta, Emily Altland

Our user study examines how social media platforms currently handle content warnings compared to user-added trigger warnings. We conducted a survey (n=91) of social media users to collect quantitative and qualitative insights on when users should take responsibility for trigger warnings and when a platform should handle content warnings. We followed up with semi-structured interviews (n=4) to gain more clarity on how users interact with platforms concerning their triggers. We compare how people think these warnings are handled and how they should be handled on social media. Participants agreed there are features social media platforms could adopt to make social media a less triggering place without restricting freedom, while remaining aware of possible privacy concerns. Participants were also sensitive to the fact that social media alone can never be perfect, and recommended ways that users could be more considerate of their followers and friends.

Immersive Space to Think
Lee Lisle

Immersive Space to Think is a design approach for sensemaking of non-quantitative datasets in Augmented or Virtual Reality (AR/VR). In our approach, users can move and orient artifacts in 3D immersive space in order to organize their thoughts. It supports offloading cognition onto the environment through various methods, such as highlighting key phrases or sentences in text documents, annotating artifacts through attached notes, or creating labels to categorize other artifacts.

Use of Mixed Reality in Safety Training
Anvitha Nachiappan, Roshan George, Joaquin Lara Azocar, Shail Patel

Abbott Nutrition is dedicated to developing science-based nutrition products and supporting healthcare professionals with research to help people live healthier, better lives. Abbott trains its employees on lock-out tag-out, a safety procedure that ensures equipment is restricted and labeled while undergoing maintenance. Trainees currently practice the procedures on a lock-out tag-out cart, which contains valves and switches like those found on the production floor. The team developed a mixed reality solution using Microsoft Guides and the HoloLens to enhance this training process, improving safety and employee knowledge retention and aiding in Abbott's digital transformation initiatives. The implementation of our solution is expected to reduce the number of lock-out tag-out incidents, saving $176,000 in injury costs and $117,320 from the change in training processes, and to accelerate digital transformation goals for an additional impact of $94,500, for a total impact of $387,820 over the next three years.

Haptic Tactics: Advancing Manufacturing with VR Haptic Drilling
Forrest Meng, Lauren Sartori, Timothy Palamarchuk, Soham Gandhi, Vikram Bala

As we enter Industry 4.0, there are endless opportunities for innovation. Our team aims to reimagine current manufacturing tasks using a state-of-the-art VR haptic system that enables mechanics to perform their tasks more effectively and safely. Our current scope is recreating the sensation of drilling in aircraft. Our haptic drilling controller employs compliant actuation to recreate resistive forces that mimic those of a real drilling task. Additionally, we are developing a virtual environment that interacts with the haptic system. To our knowledge, our system is unique: current solutions in this problem space focus on vibrotactile feedback, which cannot render the 'feel' of the operation. Finally, our approach is human-centered, employing a human-in-the-loop model for control, modeling, and simulation.

Arts in Extended Reality
Mia Saade

This research project aims to create and refine a taxonomy of interactive VR art by collecting feedback from focus groups and from experts in technology and art, with a view to defining and describing this emerging field of study. The study will build on the reported experiences of users in a cross-reality art space using either a headset or a tablet, to establish whether and how those users enjoy VR art experiences together in a shared physical environment. Our study further builds on prior research, notably the VR ViewFinder Project (#19-1166), and on a literature review of aesthetic experience, social VR, and VR art. It tackles a key limitation of studies on the impact of the VR ViewFinder on the art experience, which lack precise measures of its benefits.

Orientation Device - Dragzina
Nikita Shokhov

The purpose of the ORIENTATION DEVICE AR project and the DRAGZINA scene, in particular, is to challenge the perception of space familiar to the audience through the prism of queerness. That is why a reflected interior augmented with holographic performance is key. This work comprehends the notion of disorientation of a queer person in the patriarchal system. Furthermore, it celebrates the performative essence of queerness. Working with queer activists who express their identity in creative practices, we want to give them a platform to present themselves to a wide international audience through the poetics of augmented reality and documentary video holograms.

Robot Musical Theater for promoting STEAM Ed & Integrated Art Performances
Jiayuan Dong

The robot musical theater project consists of two parts: an afterschool program at Eastern Montgomery Elementary School and Robot Musical Theater for Climate Change Education at the Cube. The afterschool program, held in Fall 2021, was a 13-week program that promoted Science, Technology, Engineering, Arts, and Math (STEAM) education to elementary students at Eastern Montgomery Elementary School through effective human-robot interaction (HRI). The robot musical theater was performed at the VT Science Festival 2021. The goal of the current phase of the project is to address climate change by using social robots with immersive systems (e.g., the Cyclorama) to facilitate STEAM education. Children, teachers, and parents left the program with positive reviews, and the results suggest the robot theater program maintains and advances students' interest and curiosity in STEAM education.

Automated Vehicles for Seniors
Scott Zieger, Chihab Nadri

Research on improving the driving experience and safety for seniors has revolved around their differences from younger drivers in sensory and motor skills, mostly in manual driving situations. As automation technology improves, automated systems will soon be able to take over and relieve driving tasks. This is particularly helpful for older drivers, who may experience severe decreases in sensory capabilities, driving cessation, or heightened anxiety. The purpose of our project is to understand older drivers' requirements at increasing levels of automation and subsequently design automotive displays targeting those requirements.

Skye Taylor and Caitlyn Sanford presenting The Effects of In-Vehicle Agents on Trust and Performance with Conditionally Automated Vehicles

The Effects of In-Vehicle Agents on Trust and Performance with Conditionally Automated Vehicles
Jing Zang, Skye Taylor, Caitlyn Sanford

The study uses a driving simulator to look at the effects of in-vehicle agents on driving performance and behavior in conditionally automated vehicles. In this study, the vehicle is semi-autonomous: participants do not have to drive the entire time, but the in-vehicle agent requests that they take over control for certain events. After each drive, participants fill out several questionnaires. The study also uses an advanced physiological sensor, the Empatica E4 wristband, to collect signals such as heart rate and skin conductance.

VR Haptics
Grady Orr, Kristina Pratt

We have conducted two studies on the effect of meaningful haptics in a virtual environment on presence and social presence when engaging in a game. The first study was a user study of a haptic glove developed by the researchers and its effect on social presence. The second study investigated the effect of controller method (traditional versus hand tracking) and haptics on social presence when playing with more than one player. The first study is based on a virtual reality rhythm game, and the second is built around a cooperative VR sculpting game.

Eidolon: An Asymmetrical VR vs. PC Game
Zachary Gaydos

Eidolon is a virtual reality (VR) versus PC asymmetrical game where one VR player faces four PC players. PC players press colored buttons around an arena to complete an objective while the VR player tries to hunt them and send them all to jail. After one team completes their objective in the arena, they ascend to the next arena with a new PC player objective. The team that wins two of three arenas wins the match. Eidolon’s VR game mechanics explore unique interactions using VR controllers and its PC game mechanics focus on encouraging cooperation and codependency between teammates. 

After School Robot Theater
Hamda Almahri

We study how robot theater can promote STEM to young children through an after-school program that teaches the students about robots and programming using art.

Social Emotions of Robots
Isabella Villarente

In this project, we are designing an escape room where each participant works together with a robot (Pepper) to escape by finding clues. There are three emotional conditions: neutral (control), happy, and angry. We are focusing on basic emotions for the first experiment before moving on to secondary emotions. Before the experiment begins, the participant takes a survey about their current emotions and then writes about an experience that made them feel happy or angry in order to induce those emotions (no writing for neutral). Pepper then helps the participant successfully escape the room by providing hints. After the experiment, the participant takes another emotion survey that also asks about robot trust. Overall, we are investigating cognitive appraisal and the effect of induced emotions on human trust in robots.

Automated in-vehicle Agent Interactions in Autonomous Driving
Genevieve Montavon, Ravi Parikh, Manhua Wang

Speech style matters: Evaluation and comparison of in-vehicle intelligent agents with different speech styles and embodiment conditions in autonomous driving.

In-Vehicle Auditory Alerts for Highway Rail-Crossings
Hamza Al Matar, Chihab Nadri

Accidents at Highway-Rail Grade Crossings (HRGCs) have been a serious source of injury in the United States. The development of warning technologies has helped this trend decrease over time, though issues persist. Research in driving safety is primarily focused on understanding human psychology and cognitive abilities. One way to improve driver perception and awareness of HRGCs is a hybrid auditory warning that combines speech and non-speech components. Such a warning is especially helpful in manual driving tasks, when visual attention is already occupied. The purpose of this research is to study and quantify the effectiveness of hybrid auditory alerts in improving driver safety.

MOVIS Scale 
Shane Bennett 

This project uses a tablet as a window into a Virtual Reality environment rendered in Unity. The software, VR Viewfinder, was tested for effectiveness with individual users in prior semesters and is being tested at a larger scale (10 users) this semester. The project is being evaluated from both biophysical and technical perspectives, with a current focus on technical stress testing. The biophysical examination will be done with real participants, while the static test will be done without motion. In the static stress test, multiple Vive Trackers will be placed in the real environment, and data will be collected to verify accuracy and precision across different physical layouts and numbers of trackers. The trackers will be represented in the virtual environment using a rendered camera, creating the window illusion. Dynamic testing, currently in the design phase, will be done in a later semester.

fMRI Sonification
Allison Tieman, Chihab Nadri

Sonification is an increasingly used tool for representing data sets auditorily through music. Traditionally, fMRI data is conveyed mainly through visual means. However, introducing sonification to fMRI data representation opens up a new realm of possibilities for intuitively displaying differences and patterns in data. In this study, we seek to exhibit differences in brain activity between neurotypical individuals and individuals with schizophrenia or autism spectrum disorder (ASD). The sonification of this fMRI data will allow for an easily distinguishable way to understand and immersively experience the vast differences in cognitive processing between neurological states. MaxMSP was used to modify the tempo and pitch of music clips based on the fMRI data. The music clips spanned three genres, both with and without lyrics. For our initial study, the sound samples will be played comparatively for participants for subjective assessment and evaluation.

Project SHARP
Drew Bowman

This project aims to bridge the gap between live coding and version history: SHARP (State-History Augmentation for Rapid Programming) creates histories for the patterns used in a Tidal live coding piece. SHARP runs in the Atom editor and works with TidalCycles.