Before giving his talk "Maps for robots vs maps for humans. What is the difference?" at the online ScaleUp 360° Auto Vision conference, Vadym Vlasenko, Automated Driving Solutions Tech Lead at Intellias, gives us a quick overview of this pressing topic. This interview offers a sneak peek at the key factors that distinguish maps for robotic systems, like autonomous vehicles, from maps for humans.
In addition, Vadym shares the unique contribution to HD mapping and navigation that software development vendors like Intellias can offer to OEMs, mobility service providers, and Tier 1 suppliers working to get autonomous cars on the roads.
The message I want to deliver is quite simple: automated driving is around the corner, and its success requires accurate and rapidly updated HD maps that differ from our common perception of the maps and navigation systems we use every day.
As a driver, you basically use maps to build a route from point A to point B. Then you follow voice guidance from a navigation system to reach your destination without getting lost on the way. And of course, you add a stop to grab a coffee. For automated systems, maps work differently.
Just as drivers rely on their senses to react to emergencies, an automated driving system receives data from sensors that complement the built-in navigation system. The onboard computer overlays this data on basic map tiles to build an exact picture of the surrounding environment and recognize potential hazards along the road.
What seems as simple as making a car follow a map turns out to be a complex system. Automated vehicles have to rely on a fusion of sensor data and enriched geospatial data uploaded as layers on maps and processed in milliseconds. And this is the key difference from simpler maps for humans, which focus more on guiding and showing additional context, like coffee shops by the road.
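To make the idea of overlaying live sensor data on layered map data concrete, here is a minimal Python sketch; the layer names, lane IDs, and decision logic are hypothetical and vastly simplified, not an actual automated-driving stack:

```python
from dataclasses import dataclass

# Hypothetical, simplified map layers: geometry, speed limits, and signs keyed by lane ID.
HD_MAP_LAYERS = {
    "lane_geometry": {"lane_42": [(0.0, 0.0), (50.0, 0.2), (100.0, 0.5)]},
    "speed_limits":  {"lane_42": 80},              # km/h
    "road_signs":    {"lane_42": ["yield_ahead"]},
}

@dataclass
class SensorDetection:
    """A single fused detection from camera/lidar/radar (illustrative only)."""
    object_type: str        # e.g. "pedestrian", "vehicle"
    distance_m: float       # distance ahead along the current lane
    lane_id: str

def assess_hazard(detection: SensorDetection, ego_speed_kmh: float) -> str:
    """Overlay a live detection on static map layers to decide on a reaction."""
    limit = HD_MAP_LAYERS["speed_limits"].get(detection.lane_id, 50)
    signs = HD_MAP_LAYERS["road_signs"].get(detection.lane_id, [])
    if detection.object_type == "pedestrian" and detection.distance_m < 30:
        return "brake"
    if ego_speed_kmh > limit or "yield_ahead" in signs:
        return "slow_down"
    return "continue"

print(assess_hazard(SensorDetection("pedestrian", 25.0, "lane_42"), ego_speed_kmh=60))
```

In a real vehicle this decision loop runs in milliseconds and fuses many detections at once; the sketch only shows why the map layers and the sensor stream have to be combined at all.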
A fusion of technologies works to make automated driving predictable and safe, and the same fusion of efforts by the active players in the autonomous revolution, such as OEMs, mobility service providers, and software development vendors, is the key to success. We see engineering service and solution providers collaborating closely and contributing their unique expertise. This readiness for partnership will help self-driving cars take their final turn onto the main street as participants in daily traffic.
At Intellias, we work on creating maps for fully automated vehicles and a hands-off experience, which means driving under controlled circumstances where the vehicle can take control when the driver takes their hands off the wheel. We are responsible for the full life cycle of 3D map development, from source data elicitation to map compilation and publishing. This work is part of a long-term engagement with a Tier 1 mapping service provider that serves a wide range of global OEMs. The partnership unites a software development vendor, a Tier 1 solution provider, and OEMs to create additional value for each participant and share experience for a better outcome.
An autonomous vehicle sees our world as a complex network of objects, such as roads, road lanes, bicycle lanes, pedestrian areas, and road signs, and the relationships between them. A vehicle comprehends the world through data. The list of possible data sources is huge, ranging from car sensors and connected infrastructure to GPS devices and mobile phones. This data comes in different formats, such as images, video streams, and spatial coordinates.
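As an illustration of modeling the world as objects and relationships, here is a minimal sketch; the classes and attributes are hypothetical and far simpler than a standardized HD map model such as NDS:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class RoadSign:
    sign_type: str                    # e.g. "speed_limit_50"
    position: tuple                   # (latitude, longitude)

@dataclass
class Lane:
    lane_id: str
    lane_type: str                    # "driving", "bicycle", "pedestrian"
    successors: List[str] = field(default_factory=list)   # connected lane IDs
    signs: List[RoadSign] = field(default_factory=list)   # signs that apply here

# Two connected driving lanes and a bicycle lane alongside them.
lanes = {
    "L1": Lane("L1", "driving", successors=["L2"],
               signs=[RoadSign("speed_limit_50", (52.52, 13.40))]),
    "L2": Lane("L2", "driving"),
    "B1": Lane("B1", "bicycle"),
}

# The "relationships" are references between objects: a route is a walk over
# successor links, and signs are resolved per lane.
route = ["L1"] + lanes["L1"].successors
print(route)                           # ['L1', 'L2']
```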
What we hear from our clients who develop HD maps for automated vehicles is that they have to withstand floods of data from everywhere. The key challenge is to structure all this data and understand what’s relevant to place on the map.
Another challenge is standardizing mapping data. OEMs and mapping service providers like HERE Technologies, TomTom, and Nvidia all publish maps in their own proprietary formats, whereas the NDS (Navigation Data Standard) format offers data consistency across providers thanks to its guarantees of map integrity and reliability.
I can tell you from my experience that to meet these challenges we have to do a lot of backstage work with maps. Among the steps we take on our clients' projects, I would highlight unifying data into the NDS format for ease of use, optimizing map pipelines to streamline compilation, and implementing over-the-air (OTA) updates to keep maps current.
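To illustrate the OTA-update idea at its simplest, here is a hypothetical sketch (not the NDS specification or any client's actual pipeline): the in-car map keeps a version per tile and fetches only the tiles whose server version is newer.

```python
# Hypothetical sketch of incremental over-the-air (OTA) map updates:
# compare per-tile versions and download only what has changed.

def tiles_to_update(local_versions: dict, server_versions: dict) -> list:
    """Return tile IDs whose server version is newer than the local copy."""
    return [
        tile_id
        for tile_id, server_v in server_versions.items()
        if server_v > local_versions.get(tile_id, 0)
    ]

local  = {"tile_100": 3, "tile_101": 7}
server = {"tile_100": 4, "tile_101": 7, "tile_102": 1}

print(tiles_to_update(local, server))   # ['tile_100', 'tile_102'], only stale or new tiles
```

Keeping updates incremental like this is what makes it feasible to refresh HD maps over the air without re-downloading entire regions.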
To deal with the limitless data flows that can oversaturate maps, we've been developing a real-time streaming perception stack suitable for generating HD map content from sensor systems mounted in and on vehicles. Rapid data streaming makes it possible to create and maintain 3D maps for precise positioning and lateral and longitudinal control of a vehicle on the road surface. After thorough testing and quality control, this geographically tiled and functionally layered data becomes suitable for direct-to-car and OEM cloud consumption.
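As a rough sketch of what "geographically tiled and functionally layered" can mean in practice (the grid tiling and layer names here are hypothetical, not our production stack), incoming observations are grouped by tile index and functional layer before being published:

```python
import math
from collections import defaultdict

def tile_id(lat: float, lon: float, zoom: int = 14) -> tuple:
    """Map a coordinate to a slippy-map-style tile index (simplified)."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return (zoom, x, y)

# Hypothetical streamed observations produced by a perception stack.
observations = [
    {"layer": "lane_markings", "lat": 52.5200, "lon": 13.4050, "value": "dashed"},
    {"layer": "road_signs",    "lat": 52.5201, "lon": 13.4049, "value": "stop"},
]

# Group observations by geographic tile, then by functional layer.
tiled = defaultdict(lambda: defaultdict(list))
for obs in observations:
    tiled[tile_id(obs["lat"], obs["lon"])][obs["layer"]].append(obs["value"])

for tile, layers in tiled.items():
    print(tile, dict(layers))
```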
The future will be built on partnerships and cross-industry experience. Widespread adoption of autonomous driving technology won't be possible without HD mapping. Meanwhile, the technology still faces challenges. To solve them, mapping service providers and auto manufacturers will forge strategic alliances with software development vendors. Through tight collaboration, they will aim to alleviate cost burdens and accelerate market entry for their high-precision mapping solutions for automated cars.
I also expect a major change in road and city infrastructure development. Municipalities, together with the active players in driving automation, will create dedicated roads and geofenced areas where autonomous vehicles will be permitted and thereby tested in close-to-real-world environments. To achieve that, I expect tighter collaboration between automakers, mapping service vendors, and legislators to reduce the complexity of requirements for building maps and to simplify driving regulations and road rules.
In one to two years, we'll still hear echoes of the recent pandemic. People will want to preserve some private space and keep a social distance without being locked down at home. In my opinion, automated driving, when there's no one at the wheel except a passenger, can save the day.
I think manufacturers and other players involved in the development of autonomous driving will take a more realistic view of how technology should serve people's needs. Technology can't dictate to people what's good or bad for them. Automation isn't going anywhere; it's still in demand for safety features. But some accompanying trends, like the sharing economy, can slow its pace on the way to a massive rollout.
Join Vadym Vlasenko's webinar session at ScaleUp 360° Auto Vision 2020:
Session title: Maps for robots vs maps for humans. What is the difference?
Date: Wednesday, June 3, 2020, 10:45 CET
Follow this link to sign up now for free: Online registration
Signing up automatically gives you access to all webinar sessions at ScaleUp 360° Auto Vision, June 03 - 04, 2020
ScaleUp 360° Auto Vision – the digital event deep-diving into computer vision and sensor fusion technologies for autonomous vehicle perception. Experience two days of digital case studies by stakeholders involved in computer vision, sensor hardware, image processing, and sensor fusion in the Level 5 automation scene. Learn, engage, and discuss automotive tech innovation in real time with thought leaders across the globe – right from your desk.