Auto industry deadlines loom for impaired-driver detection tech; U-M offers a low-cost solution

March 1, 2024
Written By:
Jim Lynch, College of Engineering

As the comment period closes on the new federal requirement, a U-M team demonstrates that upgrades to current technologies could do the job

Professor of Electrical & Computer Engineering Mohammed Islam holds a prototype development kit of a hybrid camera (right) and a “direct time of flight” sensor (left), a sensor found in smartphone cameras for proximity sensing. Image credit: Jeremy Little, Michigan Engineering

Cameras similar to those already on newer model cars, combined with facial recognition tools, could read the “tells” of impairment in the face and upper body of a driver, University of Michigan engineers have shown.

This low-cost system could effectively detect drunk, drowsy or distracted drivers before they get on the road—or while they are on the road. A federal requirement that all new passenger vehicles include such a safeguard was passed as part of the 2021 Infrastructure Investment and Jobs Act, and the compliance deadline could arrive as soon as 2026.

The standards that these new safety systems will have to meet are currently under discussion, and the National Highway Traffic Safety Administration’s comment period closes March 5. While the details are up in the air, the U-M team is confident that their system can meet the new requirements in a cost-effective way.

“You already see these 3D camera technologies in products like smartphones, tablets and mixed reality devices,” said Mohammed Islam, U-M professor of electrical engineering and computer science who leads the project. “And these are small, inexpensive cameras that can easily be mounted on the rearview mirror, the steering column or other places in the driver’s cockpit.

The system aims to detect drunk, drowsy and distracted drivers by looking for five signs that can be observed from the driver’s face and body. Image credit: Michigan Engineering

“In many new vehicles, Advanced Driver Assistance Systems (ADAS) cameras are already onboard to track driver alertness. They’ve already been matured and are cost-effective.”

Islam’s team proposes augmenting existing ADAS cameras with infrared Light Detection and Ranging (LiDAR) or structured-light 3D cameras costing roughly $5-$10 each. In proof-of-concept experiments, the team used artificial intelligence tools to interpret data captured by the 3D cameras and identify five signs that a driver may be impaired:

  • Increased blood flow to the face. Consuming alcohol causes surface blood vessels to expand, sending more blood to the face and resulting in the redness and puffiness we associate with being drunk. Blood absorbs the infrared light sent out by the 3D (LiDAR) cameras, so an increase in blood flow creates a measurable variance from the driver’s baseline readings.
  • Heart rate. By viewing the blood vessels, the infrared cameras can also observe the driver’s pulse.
  • Eye behavior. The AI can identify drooping eyelids or decreases in blinking.
  • Head position and body posture. Variations in the driver’s baseline head position and overall posture are measured and monitored.
  • Respiratory rate. 3D cameras can observe the rise and fall of the chest, and the AI can compare the driver’s respiratory rate to their baseline. Alcohol consumption typically reduces respiratory rate.
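Several of the signs above come down to the same operation: comparing a current measurement against the driver’s own baseline and flagging large deviations. The sketch below illustrates that logic in Python; the feature names, thresholds and data layout are hypothetical assumptions for illustration, not the U-M team’s actual model.

```python
# Hypothetical sketch of baseline-deviation flagging for driver vital signs.
# Thresholds and feature names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class DriverBaseline:
    heart_rate_bpm: float
    respiratory_rate_bpm: float
    blink_rate_per_min: float

def impairment_flags(baseline, current, rel_tol=0.20):
    """Flag each vital sign that deviates more than rel_tol (relative) from baseline."""
    flags = {}
    for field in ("heart_rate_bpm", "respiratory_rate_bpm", "blink_rate_per_min"):
        base = getattr(baseline, field)
        now = current[field]
        flags[field] = abs(now - base) / base > rel_tol
    return flags

# Example: elevated heart rate and depressed breathing trip the flags
baseline = DriverBaseline(heart_rate_bpm=65, respiratory_rate_bpm=14, blink_rate_per_min=17)
reading = {"heart_rate_bpm": 84, "respiratory_rate_bpm": 10, "blink_rate_per_min": 16}
print(impairment_flags(baseline, reading))
# → {'heart_rate_bpm': True, 'respiratory_rate_bpm': True, 'blink_rate_per_min': False}
```

A real system would combine these per-signal flags (and the blood-flow and posture cues) in a learned model rather than fixed thresholds, but the baseline-relative comparison is the common core.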

The team demonstrated that the system can measure vital signs, detect drowsiness and provide data that correlates with breathalyzer readings. The researchers are now working with Tier 1 auto suppliers, including DENSO, and Tier 2 suppliers that make the cameras to further develop and potentially commercialize the technology.

The system is cheaper and harder to cheat than in-auto breathalyzers, which could cost as much as $200 per vehicle. A breathalyzer may be defeated by someone else performing the test on the driver’s behalf—or by opening the windows or diluting the air near the driver.

The use of light to measure blood may raise equity concerns, since pulse oximeters were shown to give incorrect readings for patients with darker skin. However, the infrared light used by 3D cameras is not affected by the melanin that determines skin color.

The 3D cameras also overcome two other challenges that prevent conventional cameras from performing this role effectively. By measuring only the infrared light sent out by the camera, the system can ignore changes in ambient light. In addition, the 3D camera can track the motion of the driver, preventing a different angle from looking like a change in the driver’s face.
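Because the system measures only the camera’s own infrared light, the pulse can be recovered from the periodic variation in reflectance of the skin over time. A minimal sketch of that idea, using a band-limited spectral peak, is below; the frame rate, frequency band and synthetic signal are illustrative assumptions rather than the team’s published method.

```python
# Hypothetical sketch: estimating heart rate from the mean infrared
# reflectance of a facial region per frame (remote photoplethysmography).
# Frame rate, band limits and signal parameters are assumptions.
import numpy as np

def estimate_heart_rate_bpm(ir_signal, fps, lo_hz=0.7, hi_hz=3.0):
    """Return the dominant frequency in the plausible pulse band, in beats per minute."""
    x = np.asarray(ir_signal, dtype=float)
    x = x - x.mean()                              # remove the DC baseline
    spectrum = np.abs(np.fft.rfft(x))             # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)  # frequency of each bin
    band = (freqs >= lo_hz) & (freqs <= hi_hz)    # typical resting-pulse range
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic check: a 1.2 Hz pulse (72 bpm) buried in sensor noise
fps = 30
t = np.arange(0, 20, 1.0 / fps)
signal = 0.05 * np.sin(2 * np.pi * 1.2 * t) \
    + np.random.default_rng(0).normal(0, 0.01, t.size)
print(round(estimate_heart_rate_bpm(signal, fps)))  # → 72
```

Restricting the search to a physiologically plausible band is what makes the estimate robust to slow drift (e.g. residual motion) and high-frequency sensor noise.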

Drunk driving remains one of the leading causes of death on America’s roads. In 2021, the most recent year for which data are available, NHTSA reported 13,384 deaths linked to drunk driving—up 14% from the previous year. The agency estimates an average of 37 people die each day in drunk driving accidents.

This technology was developed with support from DENSO, Omni Sciences and the University of Michigan. Omni Sciences is a U-M startup founded by Islam, in which Islam has a financial interest. U-M is seeking partners to bring the technology to market.

Related research:

Contactless Vital Sign Monitoring System for In-Vehicle Driver Monitoring Using a Near-Infrared Time of Flight Camera, Appl. Sci. 2022, 12(9), 4416. (DOI: 10.3390/app12094416)

Contactless Vital Sign Monitoring System for Heart and Respiratory Rate Measurements with Motion Compensation Using a Near-Infrared Time-of-Flight Camera, Appl. Sci. 2021, 11(22), 10913. (DOI: 10.3390/app112210913)