
4.5.2023

Inclement weather - A full-stack challenge

When you’re driving through a rain shower or fog, you make a number of conscious and unconscious adjustments to your driving style that help you keep driving safely and comfortably. If fog or heavy rain lowers visibility, you may proactively slow down and follow less closely. If roads are wet and slippery, you may apply softer acceleration and braking to avoid skidding. And if you do feel the tires slip a bit, you correct for it.

Driving in inclement weather is one of the most critical unlocks for scaling autonomous vehicle fleets to new geographies and growing AV adoption, and it's also one of the biggest technical hurdles the industry is working to overcome.

At Cruise, tackling inclement weather is a full-stack challenge. Our fleet relies on a robust, multimodal sensor suite, a powerful software stack, and lightning-fast response times to perform safe and comfortable maneuvers when the weather is less than perfect. But getting here wasn’t easy: the challenge requires tight coordination across hardware, software, systems engineering, manufacturing, data science, simulation, and operations. Over the last year, we’ve been implementing a range of improvements across the technologies powering our fleet, from our custom-built sensor suite and continuously learning AI to our internal tooling and operations software.

Recently we saw that work pay off. This winter, California experienced a historic weather event in which a series of atmospheric rivers descended upon the state, week after week, dumping record amounts of rain. Here’s a look at a driverless AV traversing Golden Gate Park this winter, handling light rain, puddles, and headlight reflections and glare on wet roads:

This ride occurred on 12/10/2022 at 10:49am.

Take a look at some of the technical upgrades we made here at Cruise to make drives like this a reality.

Starting with sensors

On the hardware side, our sensor suite gives us a detailed view of the world around us, even during a storm. Multimodal data from our sensor suite provide us with safety-critical information that helps us make smart predictions and safe decisions on wet roads. Key sensors include:

  • Optical cameras: these sensors most closely approximate human vision, allowing us to distinguish objects and colors and to read a person’s direction and intention. While optical camera performance may degrade on rainy or foggy nights, our cameras still pick up high-contrast 2D detections from objects like headlights and stop signs reasonably well.

  • Long- and short-range lidar: gives us highly accurate 3D point-cloud representations of the scenes and objects surrounding an AV. While fog and rain reduce lidar range, we’ve found lidar to be a valuable tool for making localized, real-time measurements of fog density near each AV (see the sketch after this list).

  • Radar: gives us longer-range views and lets us measure the speed of a vehicle or object. It’s especially effective at spotting vehicles on the road at distances beyond what humans can see, even in heavy fog. Radar enables an AV to recognize fast-moving objects and their radial velocity, and it is the sensor least affected by weather conditions.

  • Audio: allows us to detect sounds near the vehicle, such as emergency vehicle sirens.
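
As one illustration of how lidar can double as a fog sensor, here is a minimal sketch of a fog-density proxy computed from a single lidar sweep. The function name, thresholds, and input format are all hypothetical assumptions, not Cruise's implementation; the underlying idea is simply that airborne water droplets backscatter laser pulses into clusters of weak, short-range returns, and the fraction of such returns rises with fog density.

```python
import numpy as np

def estimate_fog_density(ranges, intensities,
                         near_range_m=5.0, intensity_cutoff=0.15):
    """Rough per-sweep fog-density proxy from lidar returns.

    Airborne water droplets backscatter laser pulses, producing weak,
    short-range returns. The fraction of such returns is a crude but
    fast stand-in for local fog density.

    ranges:      (N,) return distances in meters
    intensities: (N,) normalized return intensities in [0, 1]
    """
    ranges = np.asarray(ranges)
    intensities = np.asarray(intensities)

    # Weak returns very close to the sensor look like droplet
    # backscatter rather than solid objects.
    droplet_like = (ranges < near_range_m) & (intensities < intensity_cutoff)
    return droplet_like.mean()  # ~0 when clear; rises with fog density


# Synthetic sweep, just to show the call signature.
rng = np.random.default_rng(0)
print(estimate_fog_density(rng.uniform(1.0, 120.0, 10_000),
                           rng.uniform(0.0, 1.0, 10_000)))
```

Because each AV computes this from its own returns, the estimate is inherently local and real-time, which matters when fog density varies block by block.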

Our in-house designed, high-resolution radar system can easily detect moving objects up to 200 meters away. The point cloud is color-coded by ego-motion compensated radial velocity, which is a direct measurement from radar. Green objects are static, while non-green objects are in motion.
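
For readers curious about that color coding, here is a minimal sketch of ego-motion compensation for radar radial velocities. The function and the simplified 2D setup are illustrative assumptions, not Cruise's code; the geometry itself is standard: a static point seen from a radar moving with velocity v_ego has raw radial velocity -dot(r_hat, v_ego), so adding that term back recovers the point's own motion in the world frame.

```python
import numpy as np

def compensate_radial_velocity(points_xy, measured_vr, ego_velocity_xy):
    """Remove the ego-motion component from raw radar radial velocities.

    A static object seen from a moving radar appears to recede or
    approach at -dot(r_hat, v_ego), where r_hat is the unit ray from
    the sensor to the point. Adding that component back makes static
    points land near zero (the green points in the visualization).

    points_xy:       (N, 2) point positions in the sensor frame, meters
    measured_vr:     (N,) raw radial velocities from the radar, m/s
    ego_velocity_xy: (2,) ego velocity in the same frame, m/s
    """
    points_xy = np.asarray(points_xy, dtype=float)
    r_hat = points_xy / np.linalg.norm(points_xy, axis=1, keepdims=True)
    return np.asarray(measured_vr) + r_hat @ np.asarray(ego_velocity_xy)


# Ego moving 10 m/s forward (+x). A static point dead ahead reads -10 m/s
# raw; compensation brings it to ~0. A car ahead doing 15 m/s reads +5 raw.
points = np.array([[50.0, 0.0], [80.0, 0.0]])
raw_vr = np.array([-10.0, 5.0])
print(compensate_radial_velocity(points, raw_vr, np.array([10.0, 0.0])))
# -> [ 0. 15.]
```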

In addition to our tried-and-true radar sensors, we’ve thought carefully about how to increase the effectiveness of optical sensors in inclement weather. We’ve integrated many custom-designed sensor cleaners, wipers, and air puffers across our suite to minimize water-droplet flare and condensation on sensor lenses:

Our AVs are equipped with custom sensor-cleaning solutions that can blow water off of cameras or wash dirt off of them, driven by machine-learned camera degradation detectors.
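
A rough sketch of how learned degradation scores might drive the cleaning hardware follows. The class names, score interface, and thresholds here are assumptions for illustration, not Cruise's API; the takeaway is that cleaning is triggered by per-camera classifier outputs rather than a fixed wipe schedule.

```python
from enum import Enum, auto

class CleaningAction(Enum):
    NONE = auto()
    AIR_PUFF = auto()       # blow water droplets off the lens
    WASH_AND_WIPE = auto()  # spray fluid and wipe for dirt

def choose_cleaning_action(water_score: float, dirt_score: float,
                           puff_threshold: float = 0.6,
                           wash_threshold: float = 0.6) -> CleaningAction:
    """Map per-camera degradation scores to a cleaning action.

    water_score and dirt_score are assumed outputs in [0, 1] from a
    learned image-degradation classifier. Washing handles dirt and
    also clears water, so it takes priority.
    """
    if dirt_score >= wash_threshold:
        return CleaningAction.WASH_AND_WIPE
    if water_score >= puff_threshold:
        return CleaningAction.AIR_PUFF
    return CleaningAction.NONE


# Droplets on the lens but no dirt: trigger the air puffer.
print(choose_cleaning_action(water_score=0.8, dirt_score=0.1))
# -> CleaningAction.AIR_PUFF
```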

Turning sensor data into action

Aside from reduced visibility, there are other factors humans unconsciously take into account when driving on wet roads. Early in our road data collection, certain weather-related scenarios were perceived as “phantom objects” in our vehicle’s path. These included large puddles, which often reflected light from other objects, and “rooster tail” splashes trailing the moving vehicles ahead of an AV. Strong reflections and headlight glare on puddles and wet roads also had the potential to trigger false positives. Understanding that these conditions could degrade vehicle performance, we proactively fed rainy-road data into our perception and prediction stack, generating mountains of “splash and puddle data” in simulation to harden our fleet. As a result, our AI has learned to detect and filter out these phantom objects.
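
To make that filtering step concrete, here is a minimal sketch of how a learned “phantom” score could be used to drop splashes and puddle reflections before they reach planning. The Detection class, field names, and threshold are hypothetical; the conservative threshold reflects the fact that dropping a real obstacle is far costlier than braking for a splash.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    object_id: int
    # Assumed output of a classifier trained on rainy-road data:
    # probability that this detection is a splash, puddle reflection,
    # or headlight glare rather than a physical obstacle.
    phantom_prob: float

def filter_phantom_objects(detections, threshold: float = 0.9):
    """Drop detections the perception stack believes are phantoms.

    Only detections the model is highly confident are phantoms get
    removed; borderline cases stay in the scene and are handled by
    downstream planning.
    """
    return [d for d in detections if d.phantom_prob < threshold]


scene = [Detection(1, 0.02),   # real vehicle
         Detection(2, 0.97)]   # rooster-tail splash behind it
print([d.object_id for d in filter_phantom_objects(scene)])  # -> [1]
```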

Wet roads also affect AV motion planning and controls. One of the main reasons we proactively drive more gently in adverse weather is to account for the reduced friction between our tires and the road. Just as human drivers do, when an AV encounters inclement weather it softens its acceleration and deceleration rates, limits its steering, gives more distance to other road users, and lowers its maximum speed. Nevertheless, inclement conditions still increase the likelihood of excessive tire slip for any vehicle on the road, especially when driving over tram rails or metal grates (which are slippery even in dry conditions!). To compensate, we redesigned how our planning and controls stack responds to excessive slip events so that it handles them more comfortably. Just as a trained driver would, each AV now detects rain and adjusts its driving behavior; if it still experiences excessive tire slip, it responds by adjusting its plan in real time, keeping the AV and its occupants on a safe trajectory.
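
Here is a minimal sketch of the kind of limit derating described above. Every name and number in it is illustrative rather than Cruise's tuning; what it shows is the two-stage response: proactively soften limits when rain is detected, then tighten them further if excessive slip is actually measured, prompting the planner to replan onto a gentler trajectory.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class MotionLimits:
    max_speed_mps: float
    max_accel_mps2: float
    max_decel_mps2: float
    follow_gap_s: float  # time gap kept to the vehicle ahead

# Illustrative dry-road baseline (13.4 m/s is roughly 30 mph).
DRY = MotionLimits(max_speed_mps=13.4, max_accel_mps2=2.0,
                   max_decel_mps2=3.5, follow_gap_s=1.5)

def limits_for_conditions(base: MotionLimits, rain_detected: bool,
                          slip_detected: bool) -> MotionLimits:
    """Derate motion limits for reduced tire friction.

    Rain proactively softens acceleration and braking and widens the
    following gap; a measured slip event tightens limits further so
    the planner replans onto a gentler trajectory. Scale factors are
    illustrative only.
    """
    limits = base
    if rain_detected:
        limits = replace(limits,
                         max_speed_mps=limits.max_speed_mps * 0.85,
                         max_accel_mps2=limits.max_accel_mps2 * 0.7,
                         max_decel_mps2=limits.max_decel_mps2 * 0.7,
                         follow_gap_s=limits.follow_gap_s * 1.4)
    if slip_detected:
        limits = replace(limits,
                         max_accel_mps2=limits.max_accel_mps2 * 0.5,
                         max_decel_mps2=limits.max_decel_mps2 * 0.5)
    return limits


print(limits_for_conditions(DRY, rain_detected=True, slip_detected=False))
```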

Owning the full development stack, from hardware sensors all the way up to the AI software that powers decision making, creates a tight development feedback loop that helps us make many rapid improvements in a short period of time. Since beginning this work a year ago, we’ve leveraged tens of thousands of miles of supervised and simulated rain drives to enable autonomous driving in moderately heavy rain. Our fleet maintained 86% uptime during California’s historic winter storms. And when San Francisco’s infamous Karl the Fog returns later this year, our AVs will be ready.

While we recognize that there is a lot more to do to tackle inclement weather in San Francisco and beyond, we’re proud of the methodical, incremental improvements we’ve made to our stack, enabling rides in 2023 that weren’t possible in 2022. As we expand into new geographies, we’ll continue to implement new solutions that help us travel more safely and comfortably through ice, snow, dust storms, and more.