Rain Dampens the Autonomous Vehicle Parade
Elon Musk has said that Tesla plans to implement autonomous driving using only cameras to see what's coming. Tesla's approach contrasts with what the rest of the automotive industry is doing. Most autonomous driving schemes depend on a blend of cameras, radar, and lidar to figure out what lies in a vehicle's path. The reasoning behind this multi-sensor approach is that no single sensing technology handles every scenario; each has situations it finds confusing, and the others can fill in the gaps.
Given Tesla's position on camera sensing, it is interesting to review a few of the problems that plague autonomous vehicle sensors despite years of testing and research. First, consider cameras. The classic example of a problematic scenario for vehicle cameras is a vehicle cresting a hill while facing the setting sun. The bright sunlight momentarily blinds the camera, just as it might a human driver. As the vehicle descends the hill into a dark valley, the camera can be blinded again while it adjusts to the relative darkness.
There are also “edge cases” likely to confuse cameras but not other kinds of sensors. In artificial intelligence parlance, edge cases are oddball real-world situations that a machine-learning algorithm hasn't encountered during training. The classic example of an edge case that might flummox camera sensors is scenery painted on the back of a truck: the camera mounted on the car behind the truck may fail to realize the scenery is just a painting.
There may be ways of getting around such difficulties using only camera technology—adding an infrared camera to the mix might help matters. But there is another obstacle that plagues cameras and other autonomous driving sensors: bad weather.
A single strategically placed water drop (or snowflake) on a lens can incapacitate a camera. Raindrops that don't hit the lens can still produce intensity variations in images and video frames that complicate the process of identifying obstacles. In particular, every raindrop blocks some of the light otherwise reflected from objects in view. Rain streaks also lower the overall contrast of the scene. In tests conducted at Michigan State University a few years ago, researchers found camera-based algorithms failed to detect as many as 20% of objects in light rain. In heavier rain, the miss rate for vision algorithms jumped to as many as 40%.
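To make the contrast claim concrete, here is a minimal sketch (in Python, with made-up numbers) of a simplified rain model: uniform attenuation standing in for light blocked by airborne drops, plus bright streak pixels. Measuring RMS contrast before and after shows the kind of degradation a detection algorithm has to cope with. None of this reflects any particular research group's model.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def add_rain(image, attenuation=0.7, n_streaks=100, streak_len=7):
    """Simplified rain model: uniform attenuation plus bright streaks.

    Drops in the air block some of the light reflected from objects
    (modeled here as uniform attenuation); streaks that cross the frame
    add near-white pixels of their own.
    """
    rainy = image * attenuation
    h, w = image.shape
    for _ in range(n_streaks):
        r = rng.integers(0, h - streak_len)
        c = rng.integers(0, w)
        rainy[r:r + streak_len, c] = 0.9  # short, near-vertical streak
    return np.clip(rainy, 0.0, 1.0)

def rms_contrast(image):
    """RMS contrast: the standard deviation of pixel intensities."""
    return image.std()

# Synthetic scene: brighter sky above a darker road.
scene = np.vstack([np.full((60, 120), 0.8), np.full((60, 120), 0.2)])
rainy_scene = add_rain(scene)

print(f"clear contrast: {rms_contrast(scene):.3f}")        # 0.300
print(f"rainy contrast: {rms_contrast(rainy_scene):.3f}")  # noticeably lower
```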
There are de-raining algorithms designed to help camera-based systems detect objects. But so far, they don't work well. One reason: These algorithms have mainly been tested on synthetic rainy scenes that are much less complex than the real thing.
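To give a flavor of the general idea, here is one classic, simple de-raining heuristic: a per-pixel temporal median over a few video frames. This is a sketch of the concept, not any of the algorithms the researchers actually tested.

```python
import numpy as np

def temporal_median_derain(frames):
    """Naive de-raining: per-pixel median over a short frame window.

    A rain streak occupies any given pixel for only a frame or two, so
    the median across frames recovers the (roughly static) background.
    """
    stack = np.stack(frames, axis=0)   # shape (n_frames, H, W)
    return np.median(stack, axis=0)

# Toy example: a static background with streak pixels that move each frame.
rng = np.random.default_rng(1)
background = np.full((64, 64), 0.3)
frames = []
for _ in range(5):
    frame = background.copy()
    rows = rng.integers(0, 64, size=40)
    cols = rng.integers(0, 64, size=40)
    frame[rows, cols] = 0.95           # transient rain-bright pixels
    frames.append(frame)

clean = temporal_median_derain(frames)
print(np.abs(clean - background).max())  # ~0: the streaks vanish
```

Note that this particular heuristic also smears anything that moves, and real driving scenes are full of motion, which hints at how much harder the real problem is than the toy one.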
Radar and lidar have their own problems when it comes to navigating through rain or snow. It's easy to see why: both sensing mechanisms locate and identify what's ahead from reflections returning off objects. When those reflections come back from raindrops or snowflakes instead, it's hard to tell what lies beyond the rain and snow.
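The shared principle is time-of-flight ranging: the sensor measures how long an emitted pulse takes to bounce back and converts that delay to distance. A back-of-the-envelope sketch with hypothetical targets shows why an echo off a raindrop is indistinguishable, in form, from an echo off a distant car:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(t_seconds):
    """Both radar and lidar infer range from an echo's round-trip time."""
    return C * t_seconds / 2.0

# A raindrop 2 m away and a car 30 m away both produce valid-looking
# echoes; the timing alone does not say which reflection matters.
for target, distance_m in [("raindrop", 2.0), ("car", 30.0)]:
    t = 2.0 * distance_m / C
    print(f"{target}: echo after {t * 1e9:.1f} ns "
          f"-> range {range_from_round_trip(t):.1f} m")
```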
It turns out that light rain doesn't bother lidar much. But heavy rain can form dense clumps of drops that a lidar sensor may interpret as obstacles. Moreover, tests have shown that spray coming off other vehicles after a rainstorm can create false lidar targets. (Readers who want a taste of this problem might consider following an 18-wheeler during a thunderstorm on a turnpike for a while.) Ditto for snow. Test data on lidar behavior in snow is sparse, but indications are that snow can prevent lidar from seeing objects or create false returns. For example, one test in Finland and Sweden found that snow swirl, such as that kicked up by a car you are following, can garble lidar readings.
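One common mitigation exploits geometry: snow and spray returns tend to be sparse and isolated, while real surfaces produce dense clusters of points. The sketch below applies a basic radius outlier filter to a synthetic point cloud; it is a toy illustration of the general idea, not the filtering used in the tests described above.

```python
import numpy as np

def radius_outlier_filter(points, radius=0.5, min_neighbors=3):
    """Drop lidar returns that have few nearby neighbors.

    Snow and road spray tend to produce isolated returns, while real
    surfaces (cars, walls) yield dense clusters. This is a brute-force
    O(n^2) sketch; production stacks use spatial indexes instead.
    """
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    # Count neighbors within `radius`, excluding the point itself.
    neighbor_counts = (dists < radius).sum(axis=1) - 1
    return points[neighbor_counts >= min_neighbors]

rng = np.random.default_rng(2)
# Dense "wall" of returns 10 m ahead, plus scattered snow returns.
wall = np.column_stack([np.full(200, 10.0),
                        rng.uniform(-2, 2, 200),
                        rng.uniform(0, 2, 200)])
snow = rng.uniform([0, -5, 0], [10, 5, 3], (50, 3))
cloud = np.vstack([wall, snow])

kept = radius_outlier_filter(cloud)
print(f"{len(cloud)} points in, {len(kept)} kept")  # most snow is dropped
```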
Radar fares better than lidar in the rain and snow. Wet snow seems to cause the most problems. Radar's biggest drawback in wet weather seems to be a degraded ability to detect objects with smaller radar cross-sections, like pedestrians.
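The textbook radar range equation helps explain why small targets fade first: received power scales with the target's radar cross-section (RCS) and falls off with the fourth power of range, and rain adds path loss on top. Every number below (transmit power, antenna gain, attenuation, cross-sections) is purely an illustrative assumption.

```python
import math

def received_power(p_t, gain, wavelength, rcs, range_m,
                   rain_atten_db_per_km=0.0):
    """Textbook radar range equation plus a two-way rain loss term.

    P_r = P_t * G^2 * lambda^2 * sigma / ((4*pi)^3 * R^4),
    then attenuated by rain over the out-and-back path.
    """
    p_r = (p_t * gain**2 * wavelength**2 * rcs) / \
          ((4 * math.pi) ** 3 * range_m ** 4)
    two_way_loss_db = 2 * rain_atten_db_per_km * (range_m / 1000.0)
    return p_r * 10 ** (-two_way_loss_db / 10.0)

# 77 GHz automotive radar; assume RCS ~10 m^2 for a car, ~0.5 m^2 for a
# pedestrian, and a notional 10 dB/km of heavy-rain attenuation.
wavelength = 3e8 / 77e9
for name, rcs in [("car", 10.0), ("pedestrian", 0.5)]:
    clear = received_power(1.0, 100.0, wavelength, rcs, 50.0)
    rainy = received_power(1.0, 100.0, wavelength, rcs, 50.0,
                           rain_atten_db_per_km=10.0)
    print(f"{name}: clear {clear:.2e} W, heavy rain {rainy:.2e} W")
```

With these assumptions, the pedestrian's return is already 13 dB weaker than the car's before the rain loss is applied, so it is the first to sink toward the noise and clutter floor.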
Of course, autonomous vehicle sensing in bad weather is still a research topic. One approach under investigation uses two sensors instead of one to detect objects. If one sensor detects an object when the other doesn't, sophisticated math tries to determine which one is right. We suspect automakers might not be excited about this specific scheme because it increases the number of sensors on the vehicle, but other approaches are getting attention as well.
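To give a flavor of what that "sophisticated math" could look like, here is a minimal Bayesian log-odds fusion sketch with invented reliability figures; actual systems are far more elaborate, and this is not any automaker's method. Each sensor's report shifts the odds that an obstacle is present, weighted by how trustworthy that sensor is in the current conditions:

```python
import math

def log_odds(p):
    return math.log(p / (1.0 - p))

def fuse_detections(prior, sensor_reports):
    """Fuse independent sensor reports on 'obstacle present?' in log-odds.

    Each report is (detected, hit_rate, false_alarm_rate). Bayes' rule in
    log-odds form simply sums each sensor's evidence; a sensor with a high
    false-alarm rate (say, lidar in road spray) contributes weak evidence.
    """
    l = log_odds(prior)
    for detected, hit_rate, false_alarm in sensor_reports:
        if detected:
            l += math.log(hit_rate / false_alarm)
        else:
            l += math.log((1.0 - hit_rate) / (1.0 - false_alarm))
    return 1.0 / (1.0 + math.exp(-l))  # back to a probability

# Lidar fires but is spray-prone (30% false alarms); radar stays silent.
p = fuse_detections(prior=0.1,
                    sensor_reports=[(True, 0.9, 0.3),     # lidar
                                    (False, 0.95, 0.05)]) # radar
print(f"P(obstacle) = {p:.2f}")
```

With these made-up numbers, the spray-prone lidar detection is outvoted by the silent radar, so the fused obstacle probability stays low (about 0.02).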
Eventually, automakers will likely be able to see their way through all kinds of weather. At least for vehicles, then, a variation on an old quote may come true: There's no such thing as bad weather, only different kinds of driving weather.
