The skinny is that thousands of drivers across the US are being monitored by AI systems that actively watch drivers and their surroundings for “events” that trigger negative feedback for the driver. It actually looks as though these systems are distracting drivers and causing them to make erratic, dangerous decisions.

Background: AI is really, really stupid. I cannot express how wrong the general public is about what “they can do with AI these days.” Per the aforementioned Motherboard article:

It’s easy to get lost in the individual morality surrounding this issue, but the big picture affects us all. After all, there are no long-term studies on the effects of constant surveillance on drivers. Even if we imagine a world where AI doesn’t make mistakes, there are considerations beyond just driving safely to care about. What effects will these systems have on driver stress levels? Numerous employees interviewed by Motherboard expressed anxiety over the constant “dinging” from the system.

And this system is far from perfect. Drivers report being dinged for glancing at their side mirrors – something that’s necessary for safe driving. This means drivers have to choose between driving safely and earning their full paychecks and bonuses. That’s a tough position to put people in. Here’s more from the Motherboard piece:

But there are even bigger areas of concern. The system flags “events,” many of which drivers don’t know about, and sends them to humans for evaluation. Supposedly, every event is checked by a person. That may sound fair, but it’s far from it.

Deeper: Let’s set aside for a moment that the people doing the checking are incentivized to meet quotas as well and certainly have their own biases. Logic dictates that if one group of people generates more events than another, it should be demonstrable that the first group engaged in less-safe behavior than the second. But that’s not how AI works.
AI is biased just like humans are. When an AI is purported to detect “drowsiness” or “distracted driving,” for example, we’re getting a computer’s interpretation of a human experience. This means you could get “dinged” for looking drowsy just because you’re Black or because your face doesn’t look typical enough to the AI. It means white men are likely to generate fewer dings than other drivers. And, no matter what you look like, it means you can lose money for being cut off in traffic, checking your side mirrors, or anything else an algorithm can get wrong. We don’t know what the effects will be of having a machine tell thousands upon thousands of drivers, every day, to conduct themselves unsafely on public roadways.
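To make the statistical point concrete, here is a toy simulation, not anything based on the real system: every number in it is invented for illustration. It models two groups of drivers with identical real behavior, watched by a detector whose false-positive rate differs by group. The group the detector misreads more often racks up far more “events” despite driving exactly the same.

```python
import random

random.seed(42)

# Hypothetical numbers for illustration only -- both groups behave identically.
TRUE_UNSAFE_RATE = 0.02            # 2% of frames show genuinely unsafe behavior
FALSE_POSITIVE = {"A": 0.01, "B": 0.05}  # detector wrongly flags group B 5x as often
FRAMES_PER_SHIFT = 10_000

def dings_per_shift(group: str) -> int:
    """Count how many 'events' the detector flags over one shift."""
    dings = 0
    for _ in range(FRAMES_PER_SHIFT):
        truly_unsafe = random.random() < TRUE_UNSAFE_RATE
        if truly_unsafe:
            dings += 1  # assume real events are always caught
        elif random.random() < FALSE_POSITIVE[group]:
            dings += 1  # safe behavior, flagged anyway
    return dings

for group in ("A", "B"):
    print(group, dings_per_shift(group))
```

Running this, group B ends a shift with roughly twice as many dings as group A, even though both groups' true unsafe-behavior rate is identical. Event counts alone can't tell you which group drove more safely; they also encode the detector's error rates.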