What Crazy Dash Cam Videos Teach Us About Self-Driving Cars

If you want to really understand the challenge facing the engineers programming these cars, there's no better illustration than dash cam footage of the myriad things that can, and do, happen on public roads.

The first self-driving cars are expected to hit showrooms within five years. Their autonomous capabilities will be largely limited to highways, where there aren't things like pedestrians and cyclists to deal with, and you won't fully cede control. As long as the road is clear, the car's in charge. But when all that computing power senses trouble, like construction or rough weather, it will have you take the wheel.

The problem is, that switch will not---because it cannot---happen immediately.

The primary benefits of autonomous technology are increased safety and decreased congestion. A secondary upside to letting the car do the driving is that you can focus on crafting pithy tweets, texting, or doing anything else you'd rather be doing. And while any rules the feds concoct will likely prohibit catching Zs behind the wheel, there's no doubt someone will try it.

Audi’s testing has shown it takes an average of 3 to 7 seconds---and as long as 10---for a driver to snap to attention and take control, even when prompted by flashing lights and verbal warnings. This means engineers must ensure an autonomous Audi can handle any situation for at least that long. This is not insignificant, because a lot can happen in 10 seconds, especially when a vehicle is moving more than 100 feet per second.
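
To put that in perspective, here's a quick back-of-the-envelope calculation (the 70 mph cruising speed is our own assumption, not an Audi figure; it works out to just over 100 feet per second):

    # Back-of-the-envelope sketch: how far a car travels while it waits
    # for a human to take the wheel. The 70 mph speed is an assumption
    # for typical highway pace, not a number from Audi's testing.
    def handover_distance_ft(speed_mph, handover_seconds):
        feet_per_second = speed_mph * 5280 / 3600  # convert mph to ft/s
        return feet_per_second * handover_seconds

    for seconds in (3, 7, 10):
        print(f"{seconds} s at 70 mph: {handover_distance_ft(70, seconds):.0f} ft")
    # Prints:
    # 3 s at 70 mph: 308 ft
    # 7 s at 70 mph: 719 ft
    # 10 s at 70 mph: 1027 ft

Ten seconds at highway speed is more than three football fields of pavement, all of it covered while the car waits for a distracted human to re-engage.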

Which brings us to dash cam videos. If you want to really understand the challenge facing the engineers programming these cars, there's no better way to see the myriad things that can, and do, happen on public roads. In the videos below---a tiny selection from a vast library of our favorite YouTube genre---people fall off bridges. Cows fall out of trucks. Tsunamis strike and buildings explode.

It's impressive when the drivers in the videos dodge the threat, and they deserve the benefit of the doubt when they don't. Self-driving cars will face a higher bar, since every failure to avoid danger will be a PR disaster and, quite possibly, a legal liability for the automaker. The technology does have advantages: Self-driving cars don't get drunk, angry, distracted, or sleepy. They ignore the idiot who just flipped them off, and resist the urge to show that guy over there who's boss. But most of all, computers have reaction times so fast they are as good as immediate. They replace human instincts---which behind the wheel are often the wrong response---with distinct, predetermined rules.

One basic rule works for a wide variety of situations that involve blocked roads, even some pretty bat-guano-crazy ones: Hitting the brakes and trying to steer around the object is often the right move. That works when you're dealing with cows falling out of trucks:

Or a guy jumping off a bridge onto a highway (and walking it off):

Or a sinkhole swallowing a truck:

The car's software doesn't need to know what a cow is to know it's an obstacle that should be avoided. But the true joy---and in our case, educational value---of dash cam videos is finding the truly bananas situations, the ones where it's hard to know what the right response is, let alone how a robot would handle itself. What happens when a tsunami hits?

Or a building collapses (after being hit by a car), sending debris flying into the road?

Or a plane nearly hits a highway as it crashes into a river, scaring but not actually endangering drivers?

Or you encounter a forest fire?

The point is, the world's highways are crazy, unpredictable places where anything can happen. And they don't even have the pedestrians and cyclists and buses and taxis and delivery vans and countless other things that make autonomous driving in an urban setting so tricky. So how do you prepare for every situation imaginable?

You don't.

You might be able to use computer simulations to teach autonomous software how to handle the most insane circumstances you can imagine, says Jurg Schlinkheider, Audi’s head of driver assistance systems, but that's not the best approach: It's an infinite amount of work, and there's no way to know if your simulations are even accurate. "If it comes to one instance, it's going to be a problem. And then you have to sort out what happened." In other words, you're not going to save the first automated car that gets hit by a flash flood or a mudslide.

That doesn't mean human drivers have the upper hand here. If one car finds itself in an unexpected and dangerous situation, it could send an alert to a server that would warn other cars in the area to keep away, potentially saving lives. More importantly, the technology can learn: The act of "sorting out what happened" means using data from the car to figure out how to handle, or at least recognize, that situation the next time it occurs. That knowledge doesn't stay with the individual car, the way a human's lessons stay locked inside a single brain. Anything an automaker learns from one of its cars on the road, it can relay to every other vehicle it makes (or maybe even to those made by its competitors). So the first time a tornado bears down on a self-driving car, you may not be able to do anything about it. But the second time it happens, you'll be better prepared.
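
As a purely hypothetical sketch (every class and function name here is invented for illustration, not drawn from any automaker's actual system), the alert fan-out described above might work something like this:

    # Hypothetical sketch of the hazard-alert fan-out described above.
    # All names are invented for illustration; this is not a real automaker API.
    from dataclasses import dataclass

    @dataclass
    class HazardReport:
        latitude: float
        longitude: float
        description: str  # e.g. "road washed out", "debris across lanes"

    class FleetServer:
        """Collects hazard reports and warns nearby cars in the fleet."""
        def __init__(self):
            self.cars = []

        def register(self, car):
            self.cars.append(car)

        def broadcast(self, report, reporter=None, radius_miles=5.0):
            # Warn every connected car near the hazard, except the one reporting it.
            for car in self.cars:
                if car is not reporter and car.distance_to(report) <= radius_miles:
                    car.receive_warning(report)

    class Car:
        def __init__(self, name, latitude, longitude, server):
            self.name, self.latitude, self.longitude = name, latitude, longitude
            self.server = server
            server.register(self)

        def distance_to(self, report):
            # Crude flat-grid approximation (~69 miles per degree); fine for a sketch.
            return 69 * ((self.latitude - report.latitude) ** 2 +
                         (self.longitude - report.longitude) ** 2) ** 0.5

        def report_hazard(self, description):
            report = HazardReport(self.latitude, self.longitude, description)
            self.server.broadcast(report, reporter=self)

        def receive_warning(self, report):
            print(f"{self.name}: slowing down and rerouting around '{report.description}'")

    server = FleetServer()
    reporter = Car("car-A", 37.77, -122.42, server)
    neighbor = Car("car-B", 37.78, -122.41, server)
    reporter.report_hazard("road washed out")  # car-B gets the warning

A production system would obviously need far more than this (authentication, map matching, expiring stale reports), but the shape of the idea is that simple: one car's bad day becomes every nearby car's warning.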