New phone case provides workaround for inaccessible touch screens

October 26, 2023
Written By:
Derek Smith, College of Engineering

Touch screens are everywhere but not built for everyone. A new device could help bridge that gap, helping users access ticket kiosks, restaurant menus and more.

A new smartphone case could soon enable people with visual impairments, tremors and spasms to use touch screens independently.

Developed at the University of Michigan, BrushLens could help users perceive, locate and tap buttons and keys on the touch screen menus now ubiquitous in restaurant kiosks, ATMs and other public terminals.

“So many technologies around us require some assumptions about users’ abilities, but seemingly intuitive interactions can actually be challenging for people,” said Chen Liang, a doctoral student in computer science and engineering.

A touchscreen monitor displays several food items on a menu. The monitor's screen is backlit and the frame is black. Someone is holding a smartphone connected to BrushLens against the screen while moving it from left to right. BrushLens appears as an off-white case around the edges of the phone.
BrushLens helps users interface with touchscreens by perceiving, locating, and tapping the screen on their behalf, which makes touchscreens more accessible. Image credit: Chen Liang, doctoral student, Computer Science and Engineering

Liang is the first author of a paper accepted by the Association for Computing Machinery Symposium on User Interface Software and Technology in San Francisco. He will demo BrushLens at 7 p.m. Pacific Time Oct. 30 and present the paper at 9 a.m. Pacific Time Oct. 31.

“People have to be able to operate these inaccessible touch screens in the world. Our goal is to make that technology accessible to everyone,” Liang said.

Liang works in the lab of Anhong Guo, U-M assistant professor of computer science and engineering. Guo led the development of BrushLens with Alanson Sample, an associate professor in the same department.

Users can comb through a touch screen interface by holding a phone connected to BrushLens against the touch screen and dragging the phone across it. The phone sees what’s on the screen with its camera, then reads the options aloud using the phone’s built-in screen reader. Users indicate their menu choice through the screen reader or an enlarged, easy-to-tap button in the BrushLens app.

When given a target, BrushLens divides the screen into a grid, then guides the user’s hand toward the section of the screen containing their menu choice by saying the coordinates of both the target and the device. Once those coordinates overlap, pushbuttons or autoclickers on the underside of the phone case, depending on the model, tap the screen for the user.
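The guidance loop described above can be illustrated with a minimal sketch. This is not the actual BrushLens code: all names (`GRID_ROWS`, `to_cell`, `guide_step`) are hypothetical, the grid resolution is assumed, and where the real app announces the coordinates of both target and device, this sketch simplifies that into a directional hint until the two grid cells overlap.

```python
# Hypothetical sketch of grid-based guidance: divide the screen into a
# grid, compare the device's cell with the target's cell, and announce
# a direction until they overlap, at which point the clicker actuates.

GRID_ROWS, GRID_COLS = 4, 3  # assumed grid resolution

def to_cell(x, y, width, height):
    """Map a screen coordinate (pixels) to a (row, col) grid cell."""
    row = min(int(y / height * GRID_ROWS), GRID_ROWS - 1)
    col = min(int(x / width * GRID_COLS), GRID_COLS - 1)
    return row, col

def guide_step(device_xy, target_xy, width, height):
    """Return a spoken instruction, or 'tap' once the cells overlap."""
    d_row, d_col = to_cell(*device_xy, width, height)
    t_row, t_col = to_cell(*target_xy, width, height)
    if (d_row, d_col) == (t_row, t_col):
        return "tap"  # trigger the pushbutton/autoclicker
    parts = []
    if t_row != d_row:
        parts.append("down" if t_row > d_row else "up")
    if t_col != d_col:
        parts.append("right" if t_col > d_col else "left")
    return "move " + " and ".join(parts)

# Device near the top-left, target near the bottom-right of a
# 1080x1920 screen: the app would speak "move down and right".
print(guide_step((100, 100), (1000, 1800), 1080, 1920))
```

Because the user only needs to reach the right grid cell, not the exact pixel, the hardware can compensate for imprecise hand movement by performing the final tap itself.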

“The user doesn’t have to precisely locate where the button is and perform the touch gesture,” Liang said.

A person is sitting at a tan-colored desk. A black-bordered screen lies flat on the desk and displays a green menu with several images of drinks. Several black cords are connected to the screen and run off the desk. The person is moving a phone inside a BrushLens phone case across the screen. A red arrow points to her menu choice in the top right corner of the screen, from the viewer's perspective.
BrushLens helped people with visual impairments locate items on a touchscreen menu in study trials. Image credit: Chen Liang, doctoral student, Computer Science and Engineering

Ten study participants, six with visual impairments and four with tremors or spasms, tested the hardware and app.

“As a blind person, touch screens are pretty much inaccessible to me unless I have some help or I can plug headphones into the kiosk,” said study participant Sam Rau. “Somebody else has to order for you, or they have to help you out with it. I don’t want to be in a situation where I always have to rely on the kindness of others.”

It took some time for Rau to figure BrushLens out, but once he became familiar with the device, he was excited by the tool’s potential.

A touchscreen menu shows a picture of a salad bowl with broccoli cheddar soup. Green dashed lines divide the menu into grids. Two black-bordered rectangles, which are smartphones, are resting over the right-hand portion of the screen. The leftmost phone shows a list of menu items to choose from. The rightmost phone displays a large red arrow in the center of the phone screen that is pointing to a chosen menu item. A message below the arrow reads "You will hear a clicking sound if BrushLens has actuated the button."
Once the user indicates their menu choice, the BrushLens companion app will direct users to the correct location on the screen. The device connects wirelessly to the app. Image credit: Chen Liang, doctoral student, Computer Science and Engineering

“I thought about myself going into a Panera Bread and being able to order from the kiosk,” Rau said. “I could actually see myself accomplishing something that I otherwise thought impossible.”

The picture shows the underside of BrushLens, a white phone case. A window in the center of the case reveals a smartphone camera. Thirteen black circles, which are the clickers, surround the window in a circle.
Autoclickers on the bottom of BrushLens tap the screen for users, which helps people with tremors and spasms tap their desired option on the screen. A window in the center of the case allows the phone’s camera to view the items on the touchscreen menu. Image credit: Chen Liang, doctoral student, Computer Science and Engineering

Likewise, BrushLens worked as intended for users whose tremors or spasms cause them to make unwanted selections on touch screens. For one participant with cerebral palsy, BrushLens improved their accuracy by nearly 74%.

The inventors of BrushLens recently applied for a patent with the help of Innovation Partnerships, U-M’s central hub for research commercialization. The team hopes to bring the product to users as an affordable phone accessory.

“The parts that we used are relatively affordable. Each clicker costs only $1,” Liang said. “The whole device is definitely under $50, and that’s a conservative estimate.”

The team plans to further streamline their design so that it easily fits in a pocket. Offloading the battery and processing to the phone, for example, could make the design cheaper and less bulky.

“It doesn’t have to be much more complex than a TV remote,” said study co-author Yasha Iravantchi, a doctoral student in computer science and engineering.

The companion app could also be improved by allowing users to directly interface with it via voice commands, Liang said.

Participants were enrolled in the trial study with the help of the Disability Network, the University of Michigan Council for Disability Concerns and the James Weiland research group in the U-M Department of Biomedical Engineering. The research was funded by a Google Research Scholar Award.