Faculty and students from the School of Interactive Computing are working to improve the lives of underserved groups and communities, such as autistic employees and visually impaired social media users.
Those are just two examples of the Georgia Tech-led research presented at the 25th Conference on Computer-Supported Cooperative Work and Social Computing (CSCW), which wraps up today. In all, 19 papers authored or co-authored by Georgia Tech faculty and students were presented at the virtual conference, which focuses on technologies impacting groups, organizations, communities, and networks.
Jennifer Kim, an assistant professor in the School of Interactive Computing, is researching how technology can make the work environment more inclusive of neurodiverse people, while Stanley Cantrell, a Ph.D. student advised by Interactive Computing and School of Psychology professor Bruce Walker, is exploring how Facebook can be more accessible to the visually impaired.
Kim’s paper, The Workplace Playbook VR, is based on a study she conducted in South Korea with researchers from Seoul National University and Hanyang University. The study looks at how virtual reality can foster a more inclusive environment for neurodiverse employees.
Kim and her team designed a virtual reality program that provides data to families of neurodiverse workers to give them an idea of what they may be struggling with in the workplace.
“Families of autistic individuals can’t go to the workplace with them, so they really didn’t know what struggles they were going through, but by seeing how they do with the virtual reality and the available data, this made an opportunity for parents and therapists to understand this individual and have more empathetic communication with them,” Kim said.
But the study took an unexpected turn when the researchers began showing the VR program to the coworkers of the neurodiverse employees.
“What we didn’t expect was this case of being able to use this virtual reality for neurotypical coworkers,” Kim said. “(The neurodiverse individuals) liked that sharing this data can open up a conversation about how they are different from their neurotypical co-workers and how those neurotypical coworkers should change their behaviors to better interact with neurodiverse people.”
Kim said that adding the goal of modifying the behavior of coworkers helped the paper stand apart from previous research projects.
“We shouldn’t just focus on neurodiverse people,” she said. “There are a lot of technologies for neurodiverse people, but there isn’t much research on how we can change behaviors of neurotypical people to better interact and understand the perspective of neurodiverse people.”
Kim said the companies studied in her research have already reported a noticeable difference in how much more comfortable neurodiverse employees feel in their work environments.
“Managers have told us it’s going to be really helpful for new neurotypical employees to better understand what neurodiverse employees like and what are their behavioral characteristics so they can understand by their first day of work how to communicate with them and what to expect,” she said.
Stanley Cantrell has a broader definition of accessibility than the one normally understood. In his view, accessibility features in social media sites like Facebook must go beyond simply allowing visually impaired users to functionally use the website. These users deserve an equitable experience that rivals that of their sighted counterparts.
“We know there are about 285 million people worldwide living with some form of visual impairment,” Cantrell said. “They want to do the same things that sighted individuals do on Facebook, but the technology doesn’t facilitate rich engagement for individuals living with disabilities like visual impairment.”
In his paper, Sonification of Emotion in Social Media: Affect and Accessibility in Facebook Reactions, Cantrell explores making Facebook Reactions more emotionally engaging for visually impaired users. Working with Walker, who is the director of the Sonification Lab at Georgia Tech, Cantrell and his collaborators produced 48 different sounds that can be associated with Facebook Reactions, such as Like, Love, Sad, and Angry.
Cantrell has always had an interest in universal design, also known as inclusive design, but it blossomed during his internship with Facebook. He said although Facebook currently meets basic accessibility standards, he wanted to reimagine the experience for visually impaired users.
“Facebook’s screen reader accommodations check the box of making this accessible, but does it make it rich and engaging and delightful? What ways can we use sound to transform this visual information, but also in a way that’s engaging and doesn’t disrupt the experience,” he said.
Cantrell recruited 75 participants for his study, including 11 visually impaired subjects, to evaluate each of the 48 sonifications that he and his collaborators designed.
“Before we could begin designing sonifications, we had to first understand how sighted people interpret each Facebook Reaction,” he said. “Sometimes Reactions can have different meanings based on the context.
“We did some legwork prior to the study to see the different ways that Facebook Reactions could be used. For example, we found that the Haha Reaction can be used to laugh at something funny, or it can be used to bully someone.”
Looking beyond the scope of his paper, Cantrell said he hopes to make sound-enabled emojis a feature for text messaging, and he hopes it will be something that both sighted and visually impaired users can enjoy.
“We knew we didn’t want this to be just for blind people,” he said. “We wanted this to be an accessibility feature that could be useful to anyone.”