New in-car AI can flag drunk drivers by constantly scanning their face


The computers built into cars could one day tell if a driver is drunk simply by looking at their facial features, researchers say. By constantly “watching” the driver for typical signs of inebriation, such a system could even help reduce drunk-driving accidents.

The project, outlined in a paper published April 9 as part of an Institute of Electrical and Electronics Engineers (IEEE) and Computer Vision Foundation (CVF) conference, gives in-car computing systems the power to assess the driver’s intoxication level as soon as they get in — with 75% accuracy.

It goes beyond existing computer-aided methods that rely on observable behaviors like steering patterns, pedal usage and vehicle speed. Those data points can only be collected and processed when the vehicle has been moving for an extended period.

In contrast, the new project uses a single color camera that watches for variables like gaze direction and head position. The overall system can also incorporate 3D and infrared footage of the driver’s face and rearview videos showing driver posture, alongside steering interactions, event logs, and screen recordings of driving behavior.


“Our system has the capability to identify intoxication levels at the beginning of a drive, allowing for the potential prevention of impaired drivers from being on the road,” Ensiyeh Keshtkaran, a doctoral student at Edith Cowan University, Australia, who contributed to the project, said in a statement.

She added that because the software fits seamlessly into the digital architecture of smart vehicles — alongside features like eye tracking and driver monitoring systems — it could easily be migrated to other environments, such as smartphones.

Curbing a public health crisis

The World Health Organization (WHO) estimates that alcohol impairment is involved in 20% to 30% of fatal car accidents worldwide. In Australia, where the project was born, 30% of fatal crashes involve blood alcohol levels over the legal limit of 0.05%.

“Although efforts are underway to integrate driver alcohol detection systems into future vehicle generations, and the advent of autonomous cars is on the horizon, the persistent issue of drunk driving remains an urgent concern,” Keshtkaran said.

The study collected video footage of drivers spanning a range of ages, drinking habits and levels of driving experience as they used simulators at three levels of intoxication: sober, mildly intoxicated and severely intoxicated. The researchers worked with software company MiX by Powerfleet to gather data from alcohol-impaired drivers in controlled but realistic environments.

The algorithm then searched for discernible facial cues of intoxication in the video footage and successfully predicted a prospective driver’s state in three-quarters of cases. Some common visual cues of intoxication include bloodshot eyes, flushed face, droopy eyelids and a dazed look, according to material published by the Oregon Liquor and Cannabis Commission.
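The paper's actual model is not reproduced here, but the task it describes — mapping facial cues extracted from camera footage to one of three intoxication levels — can be sketched as a simple classifier. The feature names (eye redness, gaze stability) and the centroid values below are illustrative assumptions for this sketch, not figures from the study.

```python
# Sketch of a three-class intoxication classifier over facial-cue features.
# Features and centroids are hypothetical stand-ins for cues the study's
# algorithm might extract (e.g. bloodshot eyes, unsteady gaze).

# Hypothetical per-class centroids over two toy features,
# (eye_redness, gaze_stability), each scaled to the range 0..1.
CENTROIDS = {
    "sober":  (0.1, 0.9),
    "low":    (0.5, 0.6),
    "severe": (0.9, 0.2),
}

def classify(eye_redness: float, gaze_stability: float) -> str:
    """Return the intoxication level whose centroid is nearest (squared Euclidean)."""
    def sq_dist(level: str) -> float:
        cr, cs = CENTROIDS[level]
        return (eye_redness - cr) ** 2 + (gaze_stability - cs) ** 2
    return min(CENTROIDS, key=sq_dist)

if __name__ == "__main__":
    print(classify(0.05, 0.95))  # near the "sober" centroid
    print(classify(0.95, 0.10))  # near the "severe" centroid
```

In a real system the features would come from a face-analysis pipeline running on the in-car camera feed, and the decision boundaries would be learned from labeled footage rather than hand-set.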

Project lead Syed Zulqarnain Gilani, senior lecturer in the School of Science at Edith Cowan University, said the next step is to improve the resolution of the image data the algorithm receives, letting it make even more accurate predictions. “If low-resolution videos are proven sufficient, this technology can be employed by surveillance cameras installed on roadsides,” Gilani said in the statement.

But for now, the finding represents a major leap forward because the system can identify intoxication levels before the car even moves. That could usher in a future in which smart cars won't start with a drunk driver behind the wheel, or could even alert the authorities if a driver is too intoxicated.

This post was originally published on Live Science
