In my recent posts, “Exploring Our Universe Through Sound” and “How a Blind Oceanographer Studies Temperature-Regulating Currents,” I discussed how sonification can make data more accessible for the blind and low-vision community. However, a sonification is usually derived from an existing visual representation, which prevents blind and low-vision users from creating their own data representations.

A team of researchers from MIT and University College London (UCL) has developed a groundbreaking software system called Umwelt. Unlike existing tools, Umwelt doesn’t rely on an initial visual chart; it allows blind and low-vision users to create customized, multimodal data representations from scratch.

Here’s how Umwelt works:

  • Authoring Environment: Umwelt provides an editor where users can upload datasets and design custom representations, such as scatterplots.
  • Three Modalities: Users can incorporate three modalities into their representations:
    1. Visualization: Traditional visual elements.
    2. Textual Description: Descriptive text.
    3. Sonification: Converting data into nonspeech audio.
  • Interactive Exploration: Umwelt’s viewer lets users seamlessly switch between modalities, enabling interactive exploration of data.
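To make the sonification modality concrete, here is a minimal sketch of the underlying idea: mapping data values onto a non-speech audio parameter such as pitch. This is not Umwelt’s actual implementation, just an illustration; the function name, dataset, and pitch range below are all hypothetical.

```python
def sonify(values, low_hz=220.0, high_hz=880.0):
    """Map each data value linearly onto a pitch range (in Hz).

    Hypothetical sketch: the lowest value becomes the lowest pitch,
    the highest value the highest pitch, and everything else falls
    proportionally in between.
    """
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [low_hz + (v - lo) / span * (high_hz - low_hz) for v in values]

# Hypothetical example: ocean temperature readings in degrees Celsius.
temps = [12.1, 14.8, 13.5, 16.2]
pitches = sonify(temps)  # rising temperatures become rising pitches
```

A real tool would then synthesize these frequencies as audio; the point here is only that the data-to-sound mapping is defined directly from the dataset, with no visual chart as an intermediate step.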

This innovative software aims to break down barriers and empower blind and low-vision individuals in their data analysis endeavors. Read the full article here.