Every day, millions of drivers interact with each other just by waving, nodding and gesturing in a multitude of ways. These motions are second nature to human drivers, but what will happen when robots are thrown into the mix? Can self-driving cars learn to understand our most subtle movements?
In these early days of AV development, the question is still open for debate. But at least one analyst is convinced that driverless cars are making progress in this area. During a rain-soaked day at the Consumer Electronics Show last January, Mark C. Boyadjis, global connected car lead at IHS Markit, came face-to-face with an autonomous car that, if nothing else, knew when it was time to stop. “I’m crossing one of the streets at the convention center,” Boyadjis recalled. “I’ve got ‘green’ and ‘walk’ signs, but there’s this right turn lane. I’m at the corner and the cars keep turning in front of me, ignoring the fact that I’m there. Car after car, maybe six or seven different vehicles, turn in front of me.”
As you might have guessed, human drivers were at the helm of those vehicles. “And then the Lyft automated BMW approaches,” Boyadjis added. “I motion one foot [forward], like I’m getting out there, and the vehicle slams on its brakes. I know it wasn’t the driver. I laughed, waved, crossed the street and thought, ‘Score 1 for artificial intelligence, 0 for humanity.’ It was one of those situations where the AI needs to be able to filter out the noise. It needs to be able to filter out the rain, the hundreds of thousands of people that are there [at CES] – all of it – and it still achieved its victory in that one instance.”
Boyadjis said he was “floored” by the experience, which shows how far AVs have come. He also acknowledged the many challenges that lie ahead, particularly in teaching cars the additional skills they will need before full autonomy is attained. “With machine learning we’ve now begun to teach machines how to learn and that’s the whole premise,” he said. “But they still need to learn and you can only learn with experience, and that – by its very nature – takes time.”
In the case of humans, experienced professionals (cab drivers, truck drivers, etc.) have an advantage over new recruits. They’ve been on the job longer and they know what to expect from the road’s varying conditions. It is not physically possible to replicate that experience for new drivers – they have to acquire it over time. Machine learning is no different. “The public needs to understand that this is a very slow uptick,” said Boyadjis. “But at a certain point in time we need to trust those platforms as if they were humans (or better than humans) because that is where we’re headed.”
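To make that idea concrete, here is a minimal, purely illustrative Python sketch of the dynamic Boyadjis describes. It uses synthetic data as a stand-in for real driving logs (and assumes scikit-learn is available); the point is simply that a model's held-out accuracy climbs as its pool of "experience" grows, then levels off.

```python
# Illustrative sketch only: a toy learning curve showing that a model's
# accuracy improves as it accumulates experience (training examples).
# The synthetic dataset stands in for labeled driving scenarios, e.g.
# "pedestrian stepping out" vs. "pedestrian waiting at the curb".
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=20_000, n_features=20,
                           n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Train on progressively larger slices of "experience" and watch
# held-out accuracy rise, then plateau.
for n in (100, 1_000, 5_000, 15_000):
    model = LogisticRegression(max_iter=1_000)
    model.fit(X_train[:n], y_train[:n])
    acc = model.score(X_test, y_test)
    print(f"{n:>6} examples -> test accuracy {acc:.3f}")
```

The plateau is the catch: the easy gains come quickly, while the rare situations that matter most arrive only with time on the road.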
Brian L. Gallagher, CEO of Andromeda Interfaces, a company that designs HMI display solutions, also expects a long process of data gathering; without that data, the cars will not be able to advance to the next level. “I think going forward with full autonomy, there’s still a process of learning and collecting information to better understand the different edge cases,” said Gallagher. “[This happens] before you can allow the vehicle to drive [people] to work on the freeways without having any issues. And you need to be able to prove [that] having the robot offers more than the human driver, especially when it comes to human error.”
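Gallagher's point about harvesting edge cases can be sketched the same way. The snippet below is a hypothetical illustration, not any fleet's actual pipeline: every name in it (Detection, CONFIDENCE_FLOOR, log_for_review) is invented for the example. It simply flags the scenes the on-board model is least confident about so they can be labeled offline and folded back into training.

```python
# Hedged sketch of an edge-case harvesting loop: events the on-board
# model is least sure about get flagged for offline labeling and
# retraining. All names here are hypothetical, not a vendor's API.
from dataclasses import dataclass

CONFIDENCE_FLOOR = 0.80  # below this, treat the scene as an edge case

@dataclass
class Detection:
    scene_id: str
    label: str         # e.g. "pedestrian_crossing"
    confidence: float  # model's self-reported certainty, 0.0-1.0

def log_for_review(det: Detection) -> None:
    # In a real fleet this would queue sensor clips for human labeling.
    print(f"edge case: {det.scene_id} ({det.label}, conf={det.confidence:.2f})")

def triage(detections: list[Detection]) -> None:
    for det in detections:
        if det.confidence < CONFIDENCE_FLOOR:
            log_for_review(det)

triage([
    Detection("scene-001", "pedestrian_waiting", 0.97),
    Detection("scene-002", "pedestrian_crossing", 0.55),  # flagged
])
```

In practice, this feedback loop is how the "process of learning and collecting information" Gallagher describes gets turned into proof that the robot outperforms the human driver.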
About the author:
Louis Bedigian is an experienced journalist and contributor to various automotive trade publications. He is a dynamic writer, editor and communications specialist with expertise in the areas of journalism, promotional copy, PR, research and social networking.