From L2+ to L5 autonomous driving: can full-scene simulation rendering accelerate the industry's pace?


When will fully autonomous driving arrive? The industry has no firm timeline yet. What is certain is that raising autonomy from L2+/L3 to L5 brings a series of major challenges, both in how autonomous driving systems are tested and in how their algorithms are trained.

Author: Zhang Huijuan



What are the key challenges in raising the level of autonomous driving?

How can automakers raise the level of autonomous driving, and what are the key challenges? Advancing autonomous driving to a higher level depends, first of all, on complex systems whose capabilities must surpass those of the best human drivers in order to achieve the ultimate vision of fully autonomous driving.

Tom Goetzl, vice president and general manager of the Automotive and Energy Solutions Division of Keysight, believes that there are currently two major gaps:

One is the gap between road testing and software simulation testing. Today's sensors and control modules are tested in simulation environments with software-in-the-loop capabilities. Although software simulation is useful, it cannot fully reproduce real-world conditions or imperfect sensor responses, and fully autonomous vehicles must know how to handle such situations.

By road-testing complete systems integrated into prototype or street-legal vehicles, automakers can verify the performance of final products before bringing them to market. Road testing is very important and an indispensable part of the development process. However, given the cost, the time required, and the difficulty of reproducing test conditions, relying entirely on road testing is impractical: a vehicle would need hundreds of years of testing to reach sufficient reliability to drive safely and without fail on urban and rural roads.

The second is the gap in training advanced driver assistance system (ADAS)/autonomous vehicle (AV) algorithms under real conditions. He pointed out that in-vehicle radar testing matters greatly for training autonomous driving algorithms, which use data from on-board radar sensors to decide how the vehicle should respond in specific driving conditions. If an algorithm is not properly trained, it may make unexpected decisions that endanger drivers, passengers, or pedestrians.
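As a toy illustration of how such an algorithm consumes radar detections (the decision rule and thresholds below are invented for the example, not drawn from any real ADAS system), consider a simple time-to-collision check:

```python
# Toy illustration only: a decision rule that consumes radar detections.
# The time-to-collision threshold is an invented example value.
def brake_decision(detections, ttc_threshold_s=2.0):
    """Brake if any target's time-to-collision falls below the threshold.

    detections: list of (range_m, closing_speed_mps) tuples from the radar.
    """
    for range_m, closing_speed in detections:
        # Only approaching targets (positive closing speed) can collide.
        if closing_speed > 0 and range_m / closing_speed < ttc_threshold_s:
            return True
    return False

# A car 30 m ahead closing at 20 m/s has a TTC of 1.5 s, so we brake:
print(brake_decision([(30.0, 20.0)]))   # True
# A car 100 m ahead closing at 10 m/s (TTC 10 s) does not trigger braking:
print(brake_decision([(100.0, 10.0)]))  # False
```

If such a rule were mistuned, or trained only on sparse scenes, it could fail exactly in the dense, edge-case situations the article describes.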

In other words, the combination of sensors, precise algorithms, and powerful processors is key to realizing autonomous driving: sensors perceive the surrounding environment, while processors and algorithms make correct decisions and ensure compliance with road traffic rules. New ADAS functions must be proven safe and reliable, and road-testing an immature system prematurely is very risky. It is therefore necessary to simulate real scenes and verify the actual sensors, electronic control unit (ECU) code, artificial-intelligence logic, and other components. By testing more scenarios as early as possible, OEMs can know when development will be complete and when ADAS functions can be released.

How can innovative technology fill these gaps?

How can these gaps be filled effectively? Traditional test systems face difficulties. Some use multiple radar target simulators (RTS), each presenting several point targets to the radar sensor and simulating horizontal and vertical positions by mechanically moving the antenna. This mechanical operation slows down the overall test.

In addition, some solutions use antenna walls containing only a few RTSs, which means a target can appear anywhere in the scene, but not everywhere at the same time. In a static or quasi-static environment, this method can test a few laterally moving targets, but it is limited by the speed of the robotic arm.

Existing radar sensor test solutions also have a limited field of view (FOV) and cannot distinguish targets less than 4 meters apart. When testing radar sensors, too few targets cannot reflect the complete driving scene or reproduce the complexity of the real environment.
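To see why a 4-meter resolution limit matters, note that two targets a fixed lateral distance apart subtend a smaller and smaller angle as range grows. The sketch below (with assumed ranges chosen for illustration) computes that angle:

```python
import math

# Illustrative only: the angle subtended at the radar by two targets a
# fixed lateral distance apart shrinks with range, which is why a 4 m
# resolution limit matters most for distant objects. The ranges below
# are assumptions, not specifications from the article.
def angular_separation_deg(lateral_gap_m, range_m):
    """Angle (degrees) between two targets lateral_gap_m apart at range_m."""
    return math.degrees(math.atan2(lateral_gap_m, range_m))

for rng in (20, 50, 100):
    print(rng, "m:", round(angular_separation_deg(4.0, rng), 2), "deg")
```

At 100 m the 4 m gap shrinks to roughly 2.3 degrees, so a sensor (or test system) without fine angular resolution sees the two targets as one.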


Therefore, filling these gaps requires a new radar sensor testing method. Tom Goetzl said that this method does not detect individual simulated targets but simulates a complete traffic scene, allowing testing in the laboratory before road tests begin. By performing full-scene simulation in the laboratory, OEMs can freely combine repeatable complex scenarios, high-density (stationary or moving) targets, and environmental features, testing more driving scenarios as early as possible and thereby significantly accelerating advanced driver assistance system (ADAS)/autonomous vehicle (AV) algorithm learning.

Move real road scenes into the laboratory

Based on these requirements and challenges, Keysight recently announced a radar scene simulator solution, which simulates not a single target but the entire traffic scene for target detection. The solution uses hundreds of miniature radio-frequency (RF) front ends to form a scalable simulation screen that can present up to 512 targets at distances as close as 1.5 meters.


According to reports, the radar scene simulator uses a full-scene rendering method to simulate both distant and nearby targets across a wide, continuous field of view (FOV), enabling customers to use extremely complex multi-target scenes to quickly test the integrated vehicle radar sensors in autonomous driving systems. Automotive OEMs thus gain the following key advantages:

First, a broader field of view. The radar scene simulator not only lets the radar sensor find more targets across a wider continuous field of view, but also supports simulating both short-range and long-range targets. This avoids blind spots in the radar's field of view and improves algorithm training, so that multiple targets can be detected and distinguished efficiently in dense, complex scenes. Self-driving cars can therefore make decisions based on the overall situation rather than only on the information the test equipment happens to present.

Second, testing against complex real-world conditions. When testing radar sensors, too few targets cannot reflect the complete driving scene or reproduce the complexity of the real environment. The radar scene simulator lets automakers set traffic density, speeds, distances, and the total number of targets in the laboratory to faithfully simulate realistic driving scenarios. Whether a scenario is common or extreme, it can be tested in advance to minimize risk.
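As a purely hypothetical sketch of how such a laboratory scenario might be parameterized (the class and field names below are illustrative inventions, not Keysight's actual API):

```python
from dataclasses import dataclass

# Hypothetical sketch of a lab test-scenario description; field names are
# invented for illustration and do not reflect any real product interface.
@dataclass
class RadarTestScenario:
    traffic_density: str      # e.g. "sparse", "urban", "dense"
    target_count: int         # number of simultaneous simulated targets
    speeds_kmh: tuple         # (min, max) target speeds
    distances_m: tuple        # (nearest, farthest) simulated target range
    repeatable: bool = True   # lab scenarios replay deterministically

# A dense rush-hour scene that would be hard to stage safely on a real road:
rush_hour = RadarTestScenario(
    traffic_density="dense",
    target_count=256,
    speeds_kmh=(0, 60),
    distances_m=(1.5, 300),
)
print(rush_hour.traffic_density, rush_hour.target_count)
```

The point of such a structure is that the same scene can be replayed exactly, which road testing cannot guarantee.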

Third, faster learning. The Keysight radar scene simulator provides a realistic environment for testing complex scenes in the laboratory; previously, such scene tests could only be carried out on the road. With the test methods the simulator provides, carmakers can test repeatable, high-density complex scenes in advance. A scene can include stationary or moving targets as well as variable environmental characteristics, which significantly improves the learning speed of ADAS/AD algorithms and avoids the inefficiency of manual or automated road testing.

Fourth, higher scene resolution. To transition smoothly and quickly to L4/L5 autonomous driving, automakers need to test whether the radar can distinguish obstacles on the road. Keysight uses point clouds (multiple reflection points per target) to improve target resolution and close this technology gap.
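A minimal sketch of the point-cloud idea: one vehicle target is represented by several reflection points rather than a single point. The geometry and coordinates below are invented for illustration:

```python
# Illustrative sketch: modeling one target as a "point cloud" of several
# reflection points, as the article describes, instead of a single point.
# Vehicle dimensions and positions are invented example values.
def car_point_cloud(center_x, center_y, length=4.5, width=1.8):
    """Return reflection points for a car: four corners plus the center."""
    half_l, half_w = length / 2, width / 2
    points = [(center_x, center_y)]  # strongest return: body center
    for dx in (-half_l, half_l):
        for dy in (-half_w, half_w):
            points.append((center_x + dx, center_y + dy))
    return points

# One car 50 m ahead becomes five reflection points instead of one:
cloud = car_point_cloud(50.0, 0.0)
print(len(cloud))  # 5
```

With multiple reflection points per target, a radar can tell a wide truck from two motorcycles, which a single-point representation cannot express.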

Tom Goetzl believes that by rendering full scenes, the radar scene simulator allows road tests to be completed in the laboratory, providing automakers with a pioneering solution.


Scenario testing is more urgent for autonomous driving than for traditional vehicles

Research institutions have shown that fully verifying autonomous driving performance requires hundreds of millions or even tens of billions of miles of test mileage. This is an enormous challenge for the entire test chain: it demands not only huge resources but also huge amounts of time. Even with all existing test cars running together, it could take hundreds of years to cover the required mileage.
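A rough back-of-the-envelope calculation shows why "hundreds of years" is plausible. The fleet size and average speed below are assumptions chosen for illustration, not figures from the article:

```python
# Rough feasibility check of the "hundreds of years" claim. Fleet size,
# average speed, and the mileage target are illustrative assumptions.
def fleet_test_years(target_miles, fleet_size, avg_mph, hours_per_day=24):
    """Years a fleet needs to accumulate target_miles of road testing."""
    miles_per_car_per_year = avg_mph * hours_per_day * 365
    return target_miles / (miles_per_car_per_year * fleet_size)

# 10 billion miles, with 100 cars driving nonstop at an average of 40 mph:
years = fleet_test_years(10e9, fleet_size=100, avg_mph=40)
print(round(years))  # roughly 285 years
```

Even under these generous assumptions (no downtime, no repeated scenarios), the timescale is centuries, which is the core argument for moving most testing into the laboratory.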

Zhu Xiaoyue, Business Development Manager, Greater China, Automotive and New Energy Business Unit, Keysight

Zhu Xiaoyue, Business Development Manager for the Greater China region of Keysight's Automotive and New Energy Division, said that moving real road scenes into the laboratory for simulation testing aims to let the radar sensors on the car "see" more in the lab: complex scenes very close to real roads, including corner cases, ultimately accelerating the iteration of the whole algorithm and thus the path to the ultimate goal of autonomous driving. Keysight has announced a cooperation with Renault in the global market; meanwhile, China's autonomous driving market is very active, and autonomous driving solution providers and leading OEMs have corresponding needs.

She added that the industry now broadly agrees that multi-sensor fusion is the development trend. Cameras are getting stronger, are efficient at tracking and locking onto targets, and can also measure distance and speed, but they have inherent weaknesses in special weather such as rain and fog, and they struggle in certain scenes and corner cases. Overall, multi-sensor fusion is the future, and the millimeter-wave radars within such systems need radar scene simulators like this for support.

Ma Jianrui, Business Manager, Greater China, Automotive and New Energy Business Unit, Keysight

Ma Jianrui, business manager for the Greater China region of Keysight's Automotive and New Energy Division, said that autonomous driving is undoubtedly an important direction in the development and evolution of the automobile, and the testing methods for traditional cars and autonomous vehicles clearly differ. Traditional cars focus on road testing, and completing the legally required tests in a few typical scenarios is enough. For self-driving cars, however, not only the driver but also the system is involved.

People, systems, and cars together form a complex system, so its test methods also differ from traditional vehicle test scenarios. There are roughly two types of testing for autonomous vehicles: scenario-based virtual testing and testing on actual roads. Traditional cars have accumulated today's safety data through tens of millions, even hundreds of millions, of kilometers of actual driving, covering the various situations that may occur. Self-driving cars need the same kind of data assurance, but how can that data be obtained? Traditional road testing would theoretically take hundreds of years or even longer, and even then it could not cover all autonomous-vehicle test scenarios, because road testing is relatively limited and simple. Scenario testing and virtual testing are therefore obviously far more urgent for autonomous driving than for traditional testing.

Of course, even a realistic laboratory-simulated scene will differ from the actual road. The key to Keysight's solution is to put as much complex testing as possible in the laboratory and use road testing as the final stage of performance verification, thereby greatly improving test efficiency.
