Amazon's AI driver-monitoring cameras are denying drivers bonuses
How far do you have to turn your head, in your own car, to see your mirrors? A few degrees? Probably just enough that your nose is no longer perfectly aligned with the front of the car. To Netradyne's Driveri safety system, now installed in Amazon Prime delivery vehicles, that's enough to count as a demerit.
Amazon has a long history of deploying apps that track a driver's speed, location, and driving habits, but earlier this year the company started installing the Driveri monitoring system in its Prime delivery vans. Now drivers have told Vice's Motherboard that they are being penalized for glancing at their mirrors, changing radio stations, or being cut off by other drivers.
Driveri uses a two- to four-camera setup, depending on the model installed, to capture interior and exterior views of the vehicle. The system's AI-driven brain analyzes these video streams to assess the road around the vehicle and record any unsafe driving incidents. According to a Vimeo video hosted by Karolina Haraldsdottir, a senior director of last-mile safety at Amazon, the company tracks sixteen specific metrics.
When the system detects unsafe driving, it uploads a clip from the camera to a cloud portal visible to Amazon. In some cases, this is accompanied by a verbal warning from the device's speakers. Drivers, however, say the threshold for what counts as unsafe driving is far too low. From Motherboard:
Netradyne cameras routinely punish drivers for so-called "events" that are beyond their control or that do not constitute unsafe driving. The cameras will ding them if they check a rearview mirror or adjust the radio, stop past a stop sign at a blind intersection, or get cut off by another car in heavy traffic, they said.
A low threshold for unsafe driving, followed by a verbal warning, is an experience familiar to anyone who has been through driver's ed. What's unfamiliar to Amazon's drivers, however, is the score the company generates each week from this Driveri data. These scores are used internally to determine whether a driver is eligible for bonuses or performance rewards, and drivers feel they are being unfairly penalized.
Many of these errors appear to stem from poorly trained AI image recognition. An Amazon driver from Oklahoma told Motherboard that a large percentage of false infractions involved stop signs:
"Either we stop past the stop sign so we can see around a bush or tree and it dings us, or it registers things that aren't stop signs as stop signs. A few times we've been out in the country on a dirt road, where there is no stop sign, but the camera flags a stop sign."
An AI is only as good as the data it is trained on, and these systems seem to need a lot more training before they are ready for prime time. Tech companies may enjoy solving problems with machine learning, but it's worth remembering the flaws that persist in their algorithms. When people's paychecks are on the line, rushing automation rarely ends well.