CookAR is a computer vision-powered prototype AR system that provides real-time object affordance augmentations to support safe and efficient interaction with kitchen tools for people with low vision.

Meet the UW-Madison lab developing technology for the blind, visually impaired

MadAbility, run by Yuhang Zhao, is developing access technology that relies on machine learning to fill information gaps for people who are blind or have low vision.

MadAbility, the University of Wisconsin-Madison Accessibility Lab, is working to develop technologies that give blind and low-vision individuals equal access to information and to the world.

In an era of visual media like TikTok, YouTube and virtual reality headsets, MadAbility aims to create high-tech solutions for the blind and visually impaired.

Principal Investigator Yuhang Zhao told The Daily Cardinal that modern society is built on — and many of our technologies are centered around — visual information. Exhibit A: smartphones.

“When you use a smartphone, you have a very fancy touch display and vivid colors that you can interact with,” Zhao said. “The whole [of] virtual reality is based on visual experience. But all of these emerging technologies leave blind and low vision people out.”

One technology MadAbility is developing is A11yBits, a “tangible toolkit” allowing blind and low-vision people to assemble their own personalized devices to support their unique needs.

Each kit includes a set of four sensing modules and four feedback modules that can be mixed and matched like LEGO pieces. The sensing modules detect environmental information and user commands: a motion module detects movement, a voice module recognizes speech, a timer module tracks time and a temperature module detects the current temperature. The feedback modules send auditory, visual and vibration alerts based on the input. 
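
That mix-and-match design resembles a simple pairing architecture in which any sensor can drive any alert. The Python sketch below is a rough illustration of that idea only; the class names, threshold and wiring are hypothetical assumptions for this example, not the lab's actual hardware or firmware.

    # Hypothetical A11yBits-style pairing: any sensing module can drive
    # any feedback module. All names and values here are illustrative.

    class SensingModule:
        """Base class for modules that detect environmental conditions."""
        def __init__(self, name):
            self.name = name

        def triggered(self, reading):
            raise NotImplementedError

    class TemperatureSensor(SensingModule):
        """Fires when a reading reaches a user-set threshold (degrees C)."""
        def __init__(self, threshold_c):
            super().__init__("temperature")
            self.threshold_c = threshold_c

        def triggered(self, reading):
            return reading >= self.threshold_c

    class FeedbackModule:
        """Delivers an alert in one modality: audio, light or vibration."""
        def __init__(self, modality):
            self.modality = modality

        def alert(self, source):
            print(f"[{self.modality}] alert from {source} module")

    # Mix and match like LEGO pieces: pair a heat sensor with vibration.
    sensor = TemperatureSensor(threshold_c=90)
    feedback = FeedbackModule("vibration")
    if sensor.triggered(reading=95):
        feedback.alert(sensor.name)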

Zhao said customizable access technology like A11yBits is important because individuals' visual abilities, living conditions and prior experiences can differ widely.

Because A11yBits’ digital components may be challenging for blind or low-vision individuals without a technical background to use, MadAbility also developed an AI agent to help users understand the modules’ functions and combine them into effective solutions.

Zhao said she doesn’t think access technology should merely assist people with disabilities. Rather, she sees the individual and the technology as collaborators. 

“A lot of our technologies follow that principle: what are people's current abilities and what are their preferences? We can [then] leverage those to build our technologies,” she said.

Recipe walkthroughs, cooking safety and AI in the kitchen

MadAbility also developed an AI system called AROMA that helps blind individuals follow video recipes in the kitchen. The user wears a phone on their chest to capture the cooking process while an AI agent describes information from a chosen video recipe, responds to the user’s input and issues alerts or corrective suggestions if the user makes an error.


For example, if a person using AROMA to make a pepperoni pizza accidentally added pepperoni before cheese, the system would recognize the mistake and generate an alert, Zhao said.
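
Conceptually, that kind of alert can come from comparing each detected action against the recipe's expected step order. The Python sketch below illustrates the idea under stated assumptions; the step list and the detected action are hypothetical stand-ins, not AROMA's published pipeline.

    # Illustrative step-order check for an AROMA-like assistant. The step
    # names and detected action are assumptions for this example only.

    recipe_steps = ["spread sauce", "add cheese", "add pepperoni", "bake"]

    def check_step(detected_action, next_step_index):
        """Return an alert string if a later step is performed too early."""
        expected = recipe_steps[next_step_index]
        if detected_action == expected:
            return None  # on track; the assistant advances to the next step
        if detected_action in recipe_steps[next_step_index + 1:]:
            return f"Alert: '{detected_action}' should come after '{expected}'."
        return None

    # Pepperoni detected while cheese is still the expected step -> alert.
    print(check_step("add pepperoni", next_step_index=1))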

CookAR is AROMA’s “sister” system for people with low vision. The augmented reality (AR) system helps low-vision users cook more safely and efficiently through AR glasses that highlight “grabbable areas,” such as the handle of a knife, in green and “hazardous areas,” such as the blade, in red.
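
As a rough illustration of the highlighting step, the sketch below maps detected tool parts to overlay colors following the article's green/red scheme; the labels and detection inputs are assumed for the example, not CookAR's actual code.

    # Sketch of the color-mapping step in CookAR-style highlighting. The
    # green/red scheme follows the article; the detections are assumed.

    AFFORDANCE_COLORS = {
        "grabbable": (0, 255, 0),   # green overlay, e.g. a knife handle
        "hazardous": (255, 0, 0),   # red overlay, e.g. a knife blade
    }

    def overlay_color(affordance_label):
        """Return the RGB overlay color for a detected affordance, if any."""
        return AFFORDANCE_COLORS.get(affordance_label)

    detections = [("knife handle", "grabbable"), ("knife blade", "hazardous")]
    for part, affordance in detections:
        print(part, "->", overlay_color(affordance))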

“The fact is, low vision people still have vision, and they want to use their vision,” Zhao said.

Zhao said many existing technologies treat blind and low-vision people as “the same,” providing only audio and haptic feedback even though people with low vision retain partial sight. CookAR and AROMA aim to meet the distinct needs of both groups.

Zhao said she wants to continue exploring how AI and people with disabilities, especially those who are blind or have low vision, can collaborate to complete tasks “more smoothly, efficiently, safely and confidently.”

A11yBits, AROMA and CookAR were developed in collaboration with a professor from the University of Texas at Dallas, a team from Notre Dame and a student from the University of Washington, respectively.

The MadAbility Lab also investigates how its technologies can be applied to broader audiences in areas like mental health and gender identity and expression. The lab plans to host a workshop in early May for members of the blind and low-vision community to try its technologies and provide feedback.
