Uber May Not Be Meeting The Safety Regulations for Self-Driving Cars


Uber settled with the family of the victim of the March 19 crash in Tempe, and the settlement appears to include a hush clause. This unfortunate tragedy probably never should have happened ... IF UBER HAD NOT SCALED BACK ITS LiDAR ARRAY. There is now evidence suggesting that Uber's system is very likely to blame for the accident that killed a pedestrian crossing the street. It's easy to blame the pedestrian, because she was most likely crossing illegally on a stretch of road where vehicles travel fast, and she did not appear to realize how quickly the Uber Volvo XC90 was approaching when it should have been slowing down. Even so, this appears to be Uber's fault at this point. Some investigators have pointed out that it was dark and the lighting was poor, but the whole purpose of LiDAR is to be able to "see in the dark." There is no excuse for Uber not to have that capability, especially when testing self-driving cars on public roads.


LiDAR uses pulses of light to measure the distance to objects, stationary or moving. Many engineers and experts agree that the Velodyne LiDAR array on Uber's XC90 should have detected the crossing pedestrian. The unit Uber uses, the HDL-64E, has a detection range of 120 meters. Ex-Uber employees have also disclosed that Uber weakened its safety margins by reducing the number of LiDAR sensors in its array from 7 to 1. Fewer LiDARs can create blind spots; there is still no conclusive evidence that this happened here, but it could well have contributed to the accident. If the LiDAR did detect the pedestrian, Uber's self-driving software should have slowed down and applied the brakes, stopping within 8 feet of the pedestrian and preventing the accident. The safety driver's role is also questionable: camera footage inside the car shows the driver looking down at something just before the accident occurred. Could the safety driver have reacted in time by braking or swerving away? We won't know for sure until more details of the investigation come out. It will also be interesting to see what data Uber's self-driving software recorded before the crash, to understand whether the LiDAR itself was the issue.
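To put those numbers in perspective, here is a rough back-of-the-envelope sketch in Python. This is not Uber's software; the reaction time, deceleration rate, and the 40 mph speed are assumptions chosen purely for illustration. It shows how a LiDAR turns a pulse's round-trip time into a range, and how the stopping distance from city speeds compares with a 120-meter detection range.

```python
# Back-of-the-envelope sketch (illustrative assumptions, not Uber's actual software):
# 1) A LiDAR estimates range from the round-trip time of a light pulse.
# 2) Compare the HDL-64E's quoted ~120 m detection range with the distance a car
#    needs to come to a stop from an assumed 40 mph.

C = 299_792_458.0  # speed of light, m/s


def lidar_range_m(round_trip_time_s: float) -> float:
    """Distance to a target from a pulse's round-trip time (halved for the return leg)."""
    return C * round_trip_time_s / 2.0


def stopping_distance_m(speed_mps: float, reaction_time_s: float = 0.5,
                        decel_mps2: float = 6.0) -> float:
    """Distance traveled while the system reacts, plus braking distance v^2 / (2a).
    The reaction time and deceleration are assumed values for illustration only."""
    return speed_mps * reaction_time_s + speed_mps ** 2 / (2.0 * decel_mps2)


if __name__ == "__main__":
    # A pulse returning after 0.8 microseconds corresponds to roughly 120 m,
    # about the detection range quoted for the HDL-64E.
    print(f"Range for a 0.8 us round trip: {lidar_range_m(0.8e-6):.1f} m")

    # Assumed 40 mph ~= 17.9 m/s for illustration.
    speed = 40 * 0.44704
    print(f"Stopping distance from 40 mph: {stopping_distance_m(speed):.1f} m")
```

Under these assumed numbers the car needs on the order of 35 meters to stop, far less than the 120 meters at which the LiDAR should have been able to see the pedestrian.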

In the end, Uber's crash is going to affect the entire self-driving car industry, as regulators become stricter about issuing permits and validating systems.

Further Reading:
Ex-Uber Employees Reveal LiDAR Array Was Reduced:
https://cleantechnica.com/2018/03/29/uber-reduced-lidar-array-in-self-driving-cars-from-7-units-to-1-in-2016-creating-pedestrian-blindspots/

Who Is At Fault - Humans or Self-Driving Cars?
https://medium.com/self-driving-cars/when-self-driving-cars-cause-accidents-who-is-at-fault-e026303f24c

Uber's Self-Driving Technology is Questionable
https://www.engadget.com/2018/03/23/ubers-self-driving-policies-tech-face-questions-after-fatal-cr/


This whole business of driverless cars frightens me. This post raises the question of future maintenance of these cars: if they can operate with a reduced number of sensors, what happens when the remaining sensors become faulty? And what happens if a malicious person disables them remotely?
I would never get into a taxi that didn't have a driver.

When done properly and regulated to high safety standards, it can work. The potential is tremendous, and the benefits far outweigh the risks, but many people still have concerns.

I saw a related problem that caused 4 deaths in the US recently. It seems a tourist was driving a rented car and using a sat nav, which told him to make a U-turn that was illegal at that point. He made the turn and was hit by a large 4x4. If that had been a driverless car using the sat nav, who would have been at fault?

That was tragic. That kind of accident can actually be prevented by driverless car systems, since they don't rely on GPS alone; they have many types of sensors to detect things that humans cannot see or anticipate. That is in theory, of course, but when applied to real-world situations they often have a high success rate. The Uber incident is the exception, and we now know there may have been negligence on Uber's part.

There is always neglect with a few cars, and that is the worry. At least with a driver there is a chance to control failures.
