In my recent posts, “Exploring Our Universe Through Sound” and “How a Blind Oceanographer Studies Temperature-Regulating Currents,” I discussed how sonification can make data more accessible to the blind and low-vision community. However, sonification is usually derived from an existing visual representation, which prevents blind and low-vision users from authoring their own data representations.
A team of researchers from MIT and University College London (UCL) has developed a groundbreaking software system called Umwelt. Unlike existing tools, Umwelt doesn’t rely on an initial visual chart—it allows blind and low-vision users to create customized, multimodal data representations from scratch.
Here’s how Umwelt works:
- Authoring Environment: Umwelt provides an editor where users can upload datasets and design custom representations, such as scatterplots.
- Three Modalities: Representations can combine any of three modalities:
  - Visualization: Traditional visual elements, such as charts.
  - Textual Description: Descriptive text.
  - Sonification: Converting data into nonspeech audio (see the sketch after this list).
- Interactive Exploration: Umwelt’s viewer lets users seamlessly switch between modalities, enabling interactive exploration of data.
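
To make the sonification modality concrete: at its simplest, sonification maps data values to audible parameters such as pitch. The sketch below is purely an illustration of that idea, not Umwelt’s actual code; the names (`value_to_frequency`, `sonify`) are hypothetical. It linearly maps a numeric series to sine-wave pitches and writes a short audio clip using only Python’s standard library:

```python
# A minimal sonification sketch (not Umwelt's implementation): each data
# value becomes a short sine-wave tone, so rising values sound as rising pitch.
import math
import struct
import wave

SAMPLE_RATE = 44100   # audio samples per second
NOTE_SECONDS = 0.25   # duration of each data point's tone

def value_to_frequency(value, lo, hi, f_lo=220.0, f_hi=880.0):
    """Linearly map a value in [lo, hi] to a pitch in [f_lo, f_hi] Hz."""
    if hi == lo:
        return (f_lo + f_hi) / 2
    return f_lo + (value - lo) / (hi - lo) * (f_hi - f_lo)

def sonify(values, path="sonification.wav"):
    """Render a numeric series as a sequence of tones in a mono WAV file."""
    lo, hi = min(values), max(values)
    frames = bytearray()
    for v in values:
        freq = value_to_frequency(v, lo, hi)
        for n in range(int(SAMPLE_RATE * NOTE_SECONDS)):
            sample = math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
            frames += struct.pack("<h", int(sample * 32767 * 0.5))
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)              # mono
        wav.setsampwidth(2)              # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(bytes(frames))

# Example: listen to a small temperature series as a rising-and-falling melody.
sonify([12.1, 13.4, 15.0, 14.2, 16.8, 18.3])
```

A real tool like Umwelt layers far more on top of this, such as mapping choices for different fields and synchronized navigation across modalities, but the core move of translating a data dimension into an auditory one is the same.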
This innovative software aims to break down barriers and empower blind and low-vision individuals in their data analysis endeavors. Read the full article here.
“We have to remember that blind and low-vision people aren’t isolated. They exist in these contexts where they want to talk to other people about data,” says Jonathan Zong, an electrical engineering and computer science (EECS) graduate student and lead author of a paper introducing Umwelt. “I am hopeful that Umwelt helps shift the way that researchers think about accessible data analysis. Enabling the full participation of blind and low-vision people in data analysis involves seeing visualization as just one piece of this bigger, multisensory puzzle.”