Niels Koch | Component Owner Radar Systems for Driver Assistance | AUDI AG
In the run-up to ScaleUp 360° Automotive AI, we.CONECT spoke with Niels Koch, Component Owner Radar Systems for Driver Assistance at AUDI AG, about the prospect of autonomous vehicles entering the market on a large scale in the coming years.
Niels Koch: Autonomous driving is an extremely complex topic that requires a completely new way of thinking and a different problem-solving approach than anything done before. Hence it is an interesting setup for engineering and solution finding. I tend to say that we as engineers face “Golden Times” for invention and innovation. Time to think smart!
Niels Koch: None! The next five years will only show how many unsolved problems there are. These must be overcome first before a customer-ready product can be defined and launched. I am very pessimistic about the next 10 years, but beyond that I think we will see flying taxis first and then some robo-vehicles entering the market.
Niels Koch: I only see tremendous hype around the advances in artificial intelligence and its sub-streams, driven by some breakthroughs in pattern recognition on pictures/video and speech. On the radar side there are not even usable concepts to be seen. As radar sensors are intended to be the master sensor in the autonomous vehicle, I do not expect AI there for years to come. Radar is currently not a focus for AI and machine learning concepts, so I don’t expect any breakthroughs on that frontier.
Niels Koch: There will be newcomers in the industry who will enter the market with extremely progressive promises on autonomous driving to mark their entry point. They can and will do so because they have nothing to lose and no track record or reputation at stake. They must be progressive.
The classic car manufacturers, however, have close to 100 years of history and a lot of reputation to lose, so I don’t expect anybody from the old-car world to push autonomous driving forward aggressively. Here we need to distinguish between individual traffic, cargo-fleet traffic and closed-area traffic. In some localized areas we will see autonomous vehicles earlier, e.g. airport shuttles or heavy-truck traffic in gold mines. As autonomous driving does not by itself satisfy customers’ wishes for mobility and individualism together with the safety aspects, I personally do not see individual traffic becoming automated in the next 20 years. As I said earlier, I think we will see flying taxis in the market earlier than robo-taxis.
Niels Koch: The session will show a systematic method for integrating radar sensors into the vehicle chassis and how the contradictory requirements can be resolved. It also gives a brief insight into best practices and the dos & don’ts of radar sensor integration into vehicles.
Niels Koch: The first car I was allowed to drive was my father’s Ford Escort in 1990; the first car I bought myself was a Mazda 626, followed by various BMW and AUDI models.
The perfect car for me would be a crossover with a BMW engine and all-wheel xDrive powertrain delivering at least 400 PS and 800 Nm of torque, the shape and exterior of an Audi R8 but with a trunk like a VW Golf, and an old-fashioned Mercedes-Benz interior, good for 4 passengers including the driver and able to carry at least 12 packs of beer.
The only driver assistance system would be an Adaptive Cruise Control (ACC) radar sensor at the front and perhaps a navigation system. As I grow older, a park-assist camera at the rear might also be nice. The rest as pure as possible.
Join Niels Koch’s webinar session at 2020 ScaleUp 360° Automotive AI:
Session title: Developing an efficient Method for Vehicular Radar System Integration
Date: February 04, 2020, 09 AM (CET)
Follow this link to sign up now for free: Online registration
Signing up automatically gives you access to all webinar sessions at ScaleUp 360° Automotive AI, February 04–05, 2020.
ScaleUp 360° Automotive AI – the digital event deep-diving into Level +5 automation. Experience 2 days with 12 live webinars and case studies by stakeholders involved in deep driving, imaging, computer vision, sensor fusion and perception in the Level +5 automation scene.