Maryland Today | The project aims to convey big data with sound and touch

Whether it’s functioning effectively at work or keeping up with the news, modern society increasingly expects people to be not only data literate but also data savvy. Charts and infographics that can make a sea of numbers tangible often present a barrier for blind users, and existing accessibility tools such as screen readers cannot yet provide an overview of large datasets or charts, much less support detailed exploration of them.

Two University of Maryland researchers funded with $433,000 from the National Science Foundation are turning to sound and touch to help people analyze massive data.

“The fundamental impact this work will have on the millions of blind or visually impaired people in the United States and around the world cannot be overstated,” said Niklas Elmqvist, who is leading the two-year project with his colleague, College of Information Studies Professor Jonathan Lazar. Elmqvist, who also holds an appointment at the University of Maryland’s Institute for Advanced Computer Studies, added that the research will have a broad impact on the general population, since most people experience some vision loss as they age.

As the role of big data continues to grow in scope and importance, the accessibility gap could widen further if researchers don’t step in, they said. The pandemic highlighted this urgent need: one study found that half of blind users depended on the help of sighted people to access important data on COVID-19.

The UMD team will work with the blindness community and technology organizations to assess popular accessibility tools and contexts, and conduct the research through the lens of two real-world settings that require large data sets – higher education and employment.

Their overall goal is to develop high-bandwidth data representations based on sound, touch, and physical computing that enable blind users to explore, analyze, and understand large data sets as easily as sighted users.

The underlying approach for this work is called “sensory substitution” — using assistive technologies to functionally substitute one sense for another, said Lazar, who is director of UMD’s Trace Research and Development Center, which works to improve the accessibility of technology.

For example, Lazar and other UMD faculty are known for a tool they developed called iSonic, which creates an audio version of a map that allows a user to sweep left to right to hear different pitches representing the data.
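The core idea behind pitch-based sonification like iSonic’s can be illustrated with a minimal sketch (this is an illustrative example, not the actual iSonic implementation): each data value is mapped linearly onto an audio frequency range, so that sweeping left to right across a data series produces a sequence of rising and falling pitches. The function names and the 220–880 Hz range here are the author’s own assumptions for demonstration.

```python
def value_to_pitch(value, vmin, vmax, fmin=220.0, fmax=880.0):
    """Linearly map one data value onto a frequency range in Hz."""
    if vmax == vmin:
        # Flat data: return the midpoint pitch for every value.
        return (fmin + fmax) / 2.0
    t = (value - vmin) / (vmax - vmin)  # normalize to [0, 1]
    return fmin + t * (fmax - fmin)


def sonify_series(series, fmin=220.0, fmax=880.0):
    """Return one pitch per data point, in left-to-right sweep order."""
    vmin, vmax = min(series), max(series)
    return [value_to_pitch(v, vmin, vmax, fmin, fmax) for v in series]


# Example: hypothetical per-region counts swept left to right.
counts = [120, 450, 300, 890, 60]
pitches = sonify_series(counts)
# The smallest value maps to the lowest pitch (220 Hz),
# the largest to the highest (880 Hz).
```

In a real tool, each frequency would drive an audio synthesizer as the user sweeps a pointer or keyboard focus across the display; the mapping itself is the sensory-substitution step.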

Elmqvist also has extensive experience in this area — he helped design an essential oil diffuser that transmits data by smell, and is currently working with the Maryland Department of Education to create a computer science data course that is accessible to blind high school students.

As Elmqvist discussed at a community TEDx talk at Montgomery Blair High School in Silver Spring, Maryland, many current sensory substitution techniques are not scalable because they take too long to develop, are too expensive, or are not widely available.

Therefore, the iSchool professors say that a fundamental goal is to develop cost-effective solutions that are compatible with blind users’ existing software and equipment.
