Uber and the State of Self-Driving Cars
In light of the recent Uber debacle, an unfortunate event that cannot be wished upon anyone, we extend our condolences to the affected family.
Here we will explore the event and then address the questions that arose as a result of this accident.
Autonomous driving has been around for quite a while, with significant improvements over the years that have led to it being touted as the answer to our transportation, road, and safety problems. But its rise to fame hasn't been without setbacks, as we have witnessed a number of accidents involving autonomous cars.
In this scenario, the first of its kind involving a pedestrian, an Uber vehicle in autonomous driving mode with a safety driver behind the wheel struck a 49-year-old woman, who later passed away due to the injuries she sustained. The accident was soon followed by Uber ceasing all testing of its autonomous vehicles. Regardless, the accident has raised eyebrows among the public at large, fuelling distrust of the technology and posing the question: "Are self-driving cars as safe as advertised?"
Following a preliminary investigation, the police have come forward and said that Uber is unlikely to be at fault for the accident. The investigation found that there was no systematic defect in the car itself, that the emergency back-up driver was not under any influence, and that there was little to nothing he could have done to intervene before the collision. Elaine Herzberg (the deceased) is said to have abruptly walked from a centre median into a lane of traffic, an account confirmed by the car's dash-cams.
The question one would ask is: what effect will this have on the future of self-driving cars? Should we cease research and production? Or should we opt for stricter regulation?
Here are my two cents on the matter, starting with the question of whether or not self-driving cars are safe.
Are human-driven cars safe? Any road trip is a gamble that rests on the trust we place in the person behind the wheel, and on the assurance that, should anything happen, the law can extend its arm and exercise its proceedings. Currently we don't have fully autonomous self-driving cars, only semi-autonomous ones, and regulators have been very strict about manufacturers not creating the impression that cars drive themselves. Just two years ago Mercedes, after complaints, pulled an ad for its driver-assistance system for overstating the technology's capabilities.
The "are self driving cars safe?" question is flawed, what we should ask rather is "are they safer?" from where we are now I can say with confidence that they have the potential to outdrive us as far as safety is concerned. Also, road safety rests not only on autonomous driving but to the build of the car because if you are going to bring together Electric engines and AI then you have yourself a better car. How?
The Tesla Model S has been praised for its safety ratings, not in relation to its self-driving capabilities but to the build of the car itself: with the removal of the gasoline engine, gearbox, and fuel tank, we get a car that is easier to drive with improved handling, and with Tesla's placement of the battery pack along the floor, we get a lowered centre of gravity, which further improves handling and cornering ability; the rough calculation below shows why.
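One common back-of-the-envelope measure of how a lower centre of gravity helps is the static stability factor (SSF) used in rollover resistance ratings; the figures below are illustrative placeholders, not Tesla's published specifications:

$$\text{SSF} = \frac{t}{2h}$$

where $t$ is the track width and $h$ is the height of the centre of gravity. Assuming a track of $t = 1.6\,\text{m}$, lowering the centre of gravity from $h = 0.55\,\text{m}$ to $h = 0.45\,\text{m}$ raises the SSF from roughly $1.45$ to $1.78$; the higher the SSF, the more lateral force it takes to tip the car in a hard corner.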
Back to AI and autonomous driving as the subject of this blog: should we cease research, production, and testing? Accidents are inevitable, and on the road they are always more likely to happen; yet among the accidents that have already involved self-driving cars, we haven't had one in which the car itself was responsible enough to warrant a halt in R&D. To explore a few: when a Tesla Model S was involved in a fatal accident, it was discovered that the crash did not result from a defect or bug in the system; the car was said to have performed as intended. It was also discovered that the driver was not attentive to the road and that the accident was due to his own negligence. Likewise, in this Uber scenario, neither the car nor the safety driver was at fault, as explained above. No doubt we are still going to see more accidents while the technology is in its testing phase, but that alone isn't enough to bring about an end to self-driving cars. We are still on the road towards level-five autonomy (full automation, with no human driver required), and companies like Nvidia have made amazing strides towards that goal, which only means there's still room for improvement, and that room can't be left unfilled.
Should we impose stricter regulations? We certainly cannot regulate innovation away, and regulation alone isn't enough to bring about an end to something; it will only impede development in that jurisdiction, since companies can always move to more permissive jurisdictions. So we can't bring about an end to self-driving cars; the rocket has already taken off and cannot be called back. What is needed are guided R&D programmes and clear procedures for how rollouts and tests are to be conducted.
I'm a lover of technology and innovation, and being an optimisation problem solver, I always advocate for improvements and optimisations to how things are done, because no system was created to be static and permanent; as the world changes, systems lose their relevance, and it's up to us to come together to bring about the next great thing. We must be willing to change with the times and the environment in order to remain relevant and efficient.