To Crash but not Burn
Take it in. One deep breath at a time. It’s true, you’re moving, but your hands aren’t on the wheel. A phantom sits behind it instead, invisible hands steering the car, weaving through traffic with deft movements – like a scene from a horror film.
In the tech-savvy universe we occupy, this hardly sounds like science fiction. With self-driving cars being pushed by the big names in tech – Tesla, Google, Apple and the rest – autonomy is just another landmark on the exponential growth curve that technology is tracing today.
We are already cyborgs, in a sense: we interface with machines every day. So what is so scary about the self-driving car? Before we answer that, who is working to make it happen? Just about everybody, or so it seems. Google, some say, are the pioneers, having begun as far back as eight years ago – followed closely by companies like Uber and even China’s Baidu. The carmakers have joined the race too – Tesla has shown great promise in this area – but solutions are also being developed by Volvo, Daimler, BMW, Audi, Jaguar; the list goes on.
The word ‘scary’ has always hovered over this discussion because of the safety concerns that need to be addressed. Every year, nearly 1.3 million people die in car crashes, with an additional 20–50 million sustaining some sort of injury. Road crashes are also the ninth leading cause of death, as ASIRT (the Association for Safe International Road Travel) asserts.
Autonomous driving, and its continued development, is the answer to this – or the attempted answer, at least. The driver can’t be drunk anymore, because the machine never will be. The driver needn’t even learn to drive, because the car can safely carry you from point A to point B without that worry.
Let us go back to the recent hitch regarding Autopilot. Many other companies are trying to keep the system two-way – meaning that if the machine fails to register something on the road, or cannot make a decision, there is provision for the human to intervene and take over. Tesla, by contrast, has advocated for Autopilot with no human agency, claiming it to be twice as safe as a human driver.
The NHTSA also has a scale for automation, from Level 0 to Level 4: Level 0 means the driver has full control, while Level 4 offers a fully ‘passive driver experience’ with optimized mapping and no driver intervention. Many trucks, for the record, already have a Level 3 system in place, which gives drivers a 20-second warning to intervene when the software cannot cope.
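The scale above can be sketched as a simple lookup – a hypothetical illustration in code, with level descriptions paraphrased rather than quoted from NHTSA:

```python
# A minimal sketch of the NHTSA 0-4 automation scale described above.
# Descriptions are paraphrased summaries, not official NHTSA wording.
AUTOMATION_LEVELS = {
    0: "No automation: the driver has full control at all times",
    1: "Function-specific automation, e.g. conventional cruise control",
    2: "Combined-function automation, e.g. adaptive cruise plus lane centering",
    3: "Limited self-driving: the car drives, but must warn the driver to take over",
    4: "Full self-driving: a passive driver experience, no intervention needed",
}

def takeover_warning_needed(level: int) -> bool:
    """Only Level 3 relies on handing control back to a human on short notice."""
    return level == 3

print(AUTOMATION_LEVELS[3])
print(takeover_warning_needed(3))
```

The trucks mentioned above sit at Level 3, which is exactly why the takeover warning matters: the human is still the fallback.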
Tesla’s vaunted system can brake, accelerate, turn, read signs and road markers, and avoid certain obstacles, improving as the tests pile up. It has amassed significant data – over 47 million autonomous miles driven – from which to make some judgement about its safety.
In May, however, a Tesla Model S crashed into a trailer in Florida because the system could not tell the trailer’s white side apart from the brightly lit sky, and so failed to apply the brakes.
But does this crash mean a definitive end for development in this vein? Hardly.
Surging Ahead of Collisions
There is absolutely no doubt as to the tragic nature of this affair – which is exactly why Tesla has decided to amplify its efforts in this direction.
To understand why this is so complex, consider this: driving is not simply a manual, purely technical skill that people possess or acquire. It carries moral questions too, along with demanding a clear sense of space and time so that accidents on the road can be avoided. You may be a perfect driver, but you are pitted against a million others who use algorithms of their own, so to speak, when they drive. Programming the massive amount of data that drivers crunch when making split-second, life-and-death decisions becomes immeasurably harder when you consider the relatively static nature of machine intelligence, which requires explicit algorithms for decision-making.
Tesla has updated its radar-based software so that it can build better 3-D maps of the surrounding world and distinguish harmless roadside objects (signposts, markers, highway overpasses) from genuine obstacles – something radar struggled with earlier. Radar now compensates for the camera’s inability to see through fog, sleet, rain and other obstructive conditions (such as telling the white trailer apart from the bright sky).
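The intuition behind fusing the two sensors can be shown in a toy example – this is not Tesla’s actual software, and every name and threshold here is invented for illustration:

```python
# A toy illustration of why radar complements the camera: camera confidence
# collapses in glare or fog, while radar still returns a range reading.
# All function names and thresholds are hypothetical.
def obstacle_ahead(radar_range_m, camera_confidence, visibility_good):
    """Decide whether an object ahead warrants braking, weighting sensors by conditions."""
    radar_sees_object = radar_range_m is not None and radar_range_m < 50.0
    camera_sees_object = camera_confidence > 0.8
    if visibility_good:
        # In clear conditions, require both sensors to agree, to avoid
        # phantom braking on overpasses and signposts that radar alone flags.
        return radar_sees_object and camera_sees_object
    # In glare or fog the camera is unreliable: trust radar on its own.
    return radar_sees_object

# A bright sky washing out a white trailer: camera confidence collapses,
# but radar still reports an object 30 m ahead.
print(obstacle_ahead(30.0, 0.1, visibility_good=False))
```

The design trade-off is visible even in this sketch: demanding sensor agreement avoids false alarms, while deferring to radar in poor visibility avoids exactly the kind of missed obstacle seen in the Florida crash.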
These developments, however, carry implications that go beyond the technology itself.
One of the most persistent issues with this technology remains split-second decision-making. Machines still cannot weigh right and wrong the way the human ethical compass can – they operate on far more rigid, objective standards. Developers are unsure what kind of algorithm to code in to resolve these dilemmas: should the AI protect the people in the car, or the ones outside it? The many, or the few? Questions like these need answers that account for a near-infinity of conditions that may occur.
Another moral dilemma is the redundancy of jobs in this massive race toward automated driving. Yes, automation has plenty of benefits – reduced driver stress, lower costs, better safety, efficient parking, fuel savings – and it boosts the sharing economy in a significant way: companies can run their own rental fleets without the human-error factor, providing mobility to non-drivers like never before. But a massive jobs problem is bound to follow.
Start with the more than 8.7 million trucking-related jobs that could potentially be lost in the United States, then add the roughly 1.5 million drivers that Uber has worldwide. Then count the thousands of other companies that employ people to ferry passengers or goods from one place to another, and the figures do not promise a plush, cushy future for all those drivers out of work.
The race is two-way – job redundancy on one hand, safety in autonomous systems on the other. The push for innovation must weigh not only the technological advantages but also the social costs that come with them. As the AI takes another step toward the wheel, let us hope that the drivers who step away from it are offered a comprehensive solution.