
Human-Controlled Vs. No Control At All

Some automakers are toying with eliminating pedals and steering wheels entirely. It sounds interesting on paper, but what if the car’s safety features fail and a passenger needs to intervene? Human intervention is often seen as a major risk to AVs, but would cars really be better off without any human control?

“I think that’s the question everyone has,” said Michael Schuldenfrei, a technology fellow at Optimal+, a big data analytics company. “Even an aircraft, which can do just about everything automatically – takeoff through landing – even then people still have manual overrides for everything. No one is pretending you can do it completely automatically.”

Maybe not, but Christophe Begue, director of solution strategy and business development at IBM’s Global Electronics Industry Team, said that you could argue it’s possible to remove pilots. Automobiles are a whole other story.

“Driving a car is more complicated than flying a plane,” said Begue.

Between traffic lights, intersections, varying road designs and fluctuating speed limits, it’s easy to see why. The number of planes in the air is also dwarfed by the number of cars jamming the world’s busiest roads, which adds to the complex nature of driving.

“I think it’s going to take quite a bit of time unless it’s in a very controlled environment, like a controlled campus,” Begue added.

Controlled environments are already being tested. But while there are some shuttles that offer low-speed, pedal and wheel-free mobility in a geo-fenced environment, most automakers are not yet willing to drop human controls.

“I think what you see is everyone is being very aggressive about going to completely autonomous level 5 autonomy in the vehicle,” said Schuldenfrei. “And then as soon as something happens, they back away very fast. The Uber [incident] is a classic example.”

Schuldenfrei does not expect an override-free AV to arrive as quickly as the hype suggests. However, he is concerned that no amount of control will keep passengers safe if something goes wrong.

“Even if you have an override, when the car is driving itself, you’re already opening up tremendous risk that the driver won’t even be concentrating,” said Schuldenfrei. “So if something goes wrong, the driver [may not] notice.”

Schuldenfrei also thinks about the classic moral dilemmas faced by human drivers. If a human driver has to crash into a tree to avoid hitting a bunch of kids, he or she will likely do so without thinking twice. It’s a natural instinct. Autonomous cars would have to be programmed to do the same. No amount of machine learning will change that.

“So what do you do: do you drive into a tree and kill the driver or drive into the kids and kill the kids?” Schuldenfrei asked. “On the flipside, where is the liability and to what extent does it play a role? If you look at the statistics, autonomous cars are significantly safer per mile, per kilometer driven than human-driven cars. But it doesn’t mean they’re not going to make a mistake ever or that there won’t be a situation where you’ll be in an accident.”

Begue thought about all this for a moment. It takes a lot to develop an autonomous car, but the progress has been impressive, to say the least. New AV tests are cropping up all over the place.

“I live in San Francisco,” said Begue. “I cannot go out into the street and walk for more than 30 minutes without seeing one or two of these autonomous cars. Five years ago, would I have imagined that I’d see one every day? Probably not. Things are moving pretty fast.”

About the author:

Louis Bedigian is an experienced journalist and contributor to various automotive trade publications. He is a dynamic writer, editor and communications specialist with expertise in the areas of journalism, promotional copy, PR, research and social networking.