
A “Not-So-Simple” Matter of Perception for Safe Self-Driving Cars and ADAS

[Stock photo: a bicycle crash at night in the rain]

This blog post was originally published at Algolux's website. It is reprinted here with the permission of Algolux.

Why are darkness and bad weather considered “corner cases” by the industry?

The chilly rain this past weekend was unusual; it should be warm and sunny in northern California by now. It also capped two weeks of excellent conferences on computer vision and automotive sensing and perception… AutoSens Detroit and the Embedded Vision Summit, where many discussions centered on the challenges of computer vision in harsh conditions and corner cases.

I’ll comment on takeaways from those events in forthcoming posts, but a close call in the rain this weekend made me want to share a personal story about the need for cars to see better, one that connects to some recent posts I read.

It was lightly raining and getting dark while I was driving to pick up a few last-minute items for a party. As I neared an unprotected intersection, a bike rider turned wide onto the road. I saw him a bit late, but I swerved and slowed down in time, so there was no real danger. It happened near where my son was hit by a car while biking home from school in the rain a few months ago.

He luckily escaped with only a bad wrist sprain and various scrapes and bruises, but it could have been much worse. The driver had stopped her car, a quite new Japanese luxury sedan, by the side of the road to pick up her grandchild from school and pulled out without seeing my son. There were no obstructions behind the car, and neither the driver nor the Autonomous Emergency Braking (AEB) system in that car model (if that option had been purchased) saw him in time to stop the car.

Road accidents will be significantly reduced with more robust and accurate computer vision

Needless to say, this happens a LOT. In the US alone, roughly 45,000 cyclists are involved in road accidents per year. Improving the computer vision accuracy of Advanced Driver Assistance Systems (ADAS) like the AEB system above is a fundamental and urgent requirement, one that automotive OEMs are working on. But these systems still struggle with everyday use cases… driving at night, in the rain or fog, dealing with snow and dirty sensors. This will become a bigger problem as more self-driving autonomy is introduced into vehicles.
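
To make that dependency concrete, here is a deliberately simplified sketch in Python of how an emergency-braking decision hinges on the perception stack. The names and thresholds are hypothetical, not any production AEB logic; the point is that when rain or darkness drags a detection’s confidence below threshold, the braking logic downstream never even considers the cyclist.

```python
# A deliberately simplified sketch (hypothetical names and thresholds, not any
# production AEB implementation). The braking decision is only as good as the
# perception feeding it: if rain or darkness pushes a cyclist's detection score
# below threshold, the brake logic never sees him at all.

from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    label: str                 # e.g. "cyclist", "pedestrian", "vehicle"
    confidence: float          # detector score in [0, 1]
    distance_m: float          # estimated range to the object
    closing_speed_mps: float   # speed at which we are closing on it


CONFIDENCE_THRESHOLD = 0.6     # hypothetical tuning value
TTC_BRAKE_THRESHOLD_S = 1.5    # brake if predicted collision is this close


def should_emergency_brake(detections: List[Detection]) -> bool:
    for det in detections:
        if det.confidence < CONFIDENCE_THRESHOLD:
            # A cyclist reported at 0.4 confidence on a dark, rainy road is
            # discarded here, which is exactly the failure mode described above.
            continue
        if det.closing_speed_mps <= 0:
            continue  # not on a collision course
        time_to_collision_s = det.distance_m / det.closing_speed_mps
        if time_to_collision_s < TTC_BRAKE_THRESHOLD_S:
            return True
    return False


# A dimly lit cyclist 12 m ahead, closing at 10 m/s, but seen at only 0.42
# confidence: no braking, even though the time to collision is just 1.2 s.
print(should_emergency_brake([Detection("cyclist", 0.42, 12.0, 10.0)]))  # False
```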

“As an optical system, Lidar and camera are susceptible to adverse weather and its performance usually degrades significantly with increasing levels of adversity.” This quote is from Andreas Haja’s comprehensive article on autonomous driving, which outlines various ADAS and autonomous vehicle computer vision and perception components and architectures, with a focus on multi-sensor fusion.
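
The usual answer to that single-sensor fragility is multi-sensor fusion. As a rough illustration only (a minimal late-fusion sketch with made-up reliability weights, not the architectures surveyed in Haja’s article), each sensor’s detection score can be weighted by how much that sensor can be trusted under the current conditions:

```python
# A minimal late-fusion sketch with made-up reliability numbers; an
# illustration of the principle, not a real fusion architecture. Each sensor's
# detection score is weighted by how much that sensor can be trusted under the
# current conditions, so neither camera nor lidar is trusted unconditionally.

from typing import Dict

# Hypothetical per-condition reliability of each sensor.
SENSOR_RELIABILITY = {
    "camera": {"clear_day": 0.9, "night": 0.4, "heavy_rain": 0.5, "snow": 0.6},
    "lidar":  {"clear_day": 0.9, "night": 0.9, "heavy_rain": 0.6, "snow": 0.3},
}


def fused_confidence(per_sensor_scores: Dict[str, float], condition: str) -> float:
    """Weight each sensor's score by its reliability in `condition`, then normalize."""
    weighted_sum, total_weight = 0.0, 0.0
    for sensor, score in per_sensor_scores.items():
        weight = SENSOR_RELIABILITY[sensor][condition]
        weighted_sum += weight * score
        total_weight += weight
    return weighted_sum / total_weight if total_weight else 0.0


# A cyclist the camera barely sees at night but the lidar picks up clearly:
print(round(fused_confidence({"camera": 0.3, "lidar": 0.8}, "night"), 2))      # 0.65
print(round(fused_confidence({"camera": 0.3, "lidar": 0.8}, "clear_day"), 2))  # 0.55
```

Even this toy version shows the idea: a cyclist the camera barely sees at night still comes through with usable confidence because the lidar return carries more weight in the dark, and vice versa in heavy snow.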

That tradeoff is echoed in Aparna Narayanan’s recent article on how close (or far away!) self-driving cars are: “Cameras are good at object detection but struggle with low light. Lidar works in every lighting condition but suffers in scenarios like snow.”

I wanted to reference these posts as they highlight some key takeaways.

Most traditional automotive OEMs and next-gen mobility providers, with the exception of Tesla, see achieving SAE Level 4 or Level 5 autonomy as a set of pragmatic steps, building on advancements in ADAS and progressively expanding autonomous capabilities and use cases as the technology evolves.

This is being done by restricting the use cases to manageable “bites” when introducing new ADAS and autonomous functions. Today’s ADAS capabilities are active only in clear highway driving conditions, for self-parking applications, and in back-up scenarios. Autonomous vehicle and autopilot testing is limited to highly mapped, geo-fenced areas during clear daytime conditions. The consensus is that these systems have a long way to go before being as capable as a human driver. As such, leaders at GM, Waymo, Uber, Cruise, Aurora, and others are hedging on the timeframe for practical fully autonomous deployment.

Addressing this needs to start with more robust and accurate approaches to autonomous vision, and the industry can’t treat darkness and bad weather as a “corner case” problem. The good news is that many are focusing on advancing the state of the art, and I’m excited to be part of a team that has proven we can see in the dark, look through rain and other harsh conditions, and massively improve these systems, compared with the best the industry has today, for safer human-driven and self-driving vehicles.

Here are some examples showing detection well beyond the state of the art in very difficult scenarios…



Please also see my prior blog post, “Spring Showers Bring Flowers… and More Traffic Fatalities,” for some metrics comparing Eos results against alternative perception networks.

As always, drive safely and I look forward to your questions and comments!

Dave Tokic
VP Marketing, Algolux
