Verifying ADAS in vehicles and in labs

Posted: 14 Apr 2015

Keywords: Advanced Driver Assistance Systems (ADAS), ECU, algorithm, XCP

Behind the wheel, humans obtain information about their environment through their sensory organs, specifically their eyes and ears. Signal processing in the brain interprets the collected information, decisions are made, and actions are initiated. Decisions might include whether a space on the side of the road is large enough for parking or whether the distance to the car ahead needs to be adjusted. Advanced Driver Assistance Systems (ADAS) support the driver in making these decisions, thereby enhancing safety and improving comfort and convenience as well as economy.

Access to sensor and algorithm data
Driver assistance systems must be able to reliably detect the environment, acting as a kind of "attentive passenger". Radar, ultrasonic and video sensors are very often used to provide ECUs with information on the driving situation or the vehicle's environment. Complex algorithms process the sensor data to detect objects such as road signs, parked vehicles and other road users, and they initiate actions. To verify the sensor system, it may be sufficient to simply measure the results of the algorithm and compare them to reality. An example is the distance measuring radar of an Adaptive Cruise Control system: the sensor detects objects by the return reflections of the radar beam, and the ECU supplies distance information for each detected object as coordinates.
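To make this concrete, the object data that such an ECU delivers can be pictured as a list of detected objects with an identifier and coordinates relative to the ego vehicle. The following C structure is only a hypothetical sketch of that idea; the actual ECU-internal layout, field names and units are manufacturer-specific.

    /* Hypothetical layout of a radar object list as an ECU might expose it
       for measurement; all names and sizes are illustrative only. */
    #include <stdint.h>

    #define MAX_RADAR_OBJECTS 32

    typedef struct {
        uint8_t id;             /* object identifier assigned by the tracker   */
        float   distance_m;     /* longitudinal distance to the object [m]     */
        float   lateral_m;      /* lateral offset relative to the ego lane [m] */
        float   rel_speed_mps;  /* relative speed [m/s]                        */
    } RadarObject;

    typedef struct {
        uint32_t    timestamp_us;               /* ECU time base [us]      */
        uint8_t     object_count;               /* number of valid entries */
        RadarObject objects[MAX_RADAR_OBJECTS]; /* detected objects        */
    } RadarObjectList;

Measuring only this result structure is far less demanding than acquiring the raw radar reflections that feed the detection algorithm.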

In this case, it is not necessary to acquire all of the radar reflections in the sensor. If, however, the data is being logged for later stimulation in the laboratory, all input variables of the algorithm must be measured; over 100,000 signals at a data rate of several megabytes per second would then not be unusual.
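One plausible way to capture the algorithm inputs for later replay is to write time-stamped blocks of raw sensor data to a log stream. The record format below is only a sketch of that idea (real-world measurement formats such as ASAM MDF are considerably more elaborate), and all names are hypothetical.

    /* Sketch of a time-stamped log record for raw sensor input data;
       field names and sizes are illustrative, not a real file format. */
    #include <stdint.h>
    #include <stdio.h>

    typedef struct {
        uint64_t timestamp_us;  /* acquisition time of this data block       */
        uint32_t channel_id;    /* which sensor / signal group it belongs to */
        uint32_t payload_bytes; /* number of raw bytes that follow           */
    } LogRecordHeader;

    /* Append one block of raw sensor data to an open log file. */
    int log_sensor_block(FILE *log, uint64_t t_us, uint32_t channel,
                         const void *data, uint32_t len)
    {
        LogRecordHeader hdr = { t_us, channel, len };
        if (fwrite(&hdr, sizeof hdr, 1, log) != 1) return -1;
        if (fwrite(data, 1, len, log) != len) return -1;
        return 0;
    }

At several megabytes per second, sustained disk bandwidth and lossless transport from the ECU to the logger become the limiting factors rather than the file format itself.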

Image processing ECUs with video sensors are used for road sign detection systems or lane-keeping assistants. An algorithm analyzes the video images and detects road signs or lane markings. Data processing in the ECU typically requires a high level of microcontroller performance. Whether the sensor data originates from a video or a radar system, on the other hand, has little impact on the measurement instrumentation requirements: a high-performance solution for transporting the measurement data is essential in either case. In evaluating and optimizing the algorithms, the measurement instrumentation must be able to acquire all of the algorithm's input and output variables as well as all necessary intermediate variables within the algorithm without incurring additional controller load (figure 1).

Figure 1: Acquisition of the inputs and outputs, the environment, and all data relevant to evaluating the algorithm; display of all data and adjustment of parameters.
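One way to make such intermediate variables accessible without loading the controller is to keep them in statically allocated memory, so that a memory-access-based measurement interface (for example via a debug or trace port) can sample them by address while the algorithm runs. The fragment below merely sketches this idea; the variable names and the filter are placeholders, and how the addresses are published to the measurement tool (e.g. via an A2L description) is tool- and compiler-specific.

    /* Sketch: inputs, intermediates and outputs of the detection algorithm
       kept as statically allocated variables so an external measurement
       interface can read them by address; names are illustrative only. */
    #include <stdint.h>

    volatile float   g_raw_range_m;       /* input: raw range from the sensor */
    volatile float   g_filtered_range_m;  /* intermediate: filtered range     */
    volatile uint8_t g_object_valid;      /* output: object confirmed flag    */

    void range_filter_step(float raw_range_m)
    {
        g_raw_range_m = raw_range_m;
        /* simple first-order low-pass as a stand-in for the real algorithm */
        g_filtered_range_m = 0.9f * g_filtered_range_m + 0.1f * raw_range_m;
        g_object_valid = (g_filtered_range_m < 150.0f) ? 1u : 0u;
    }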

Serial bus systems such as CAN and FlexRay run into their performance limits in terms of the necessary data throughput rates. Therefore, controller-specific interfaces such as Nexus, DAP or Aurora are used to transport the large quantities of measurement data. It makes sense to rely on established and proven standards to avoid having to develop a separate solution for each measurement task. The VX1000 measurement and calibration hardware from Vector is well suited to this: a small plug-on PC board (plug-on device, or POD) taps the data at the controller interface and forwards it to a base module, which converts it to the standardized XCP on Ethernet and transfers the data stream to the PC at a high throughput rate [1].
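To give an idea of what arrives on the PC side, each XCP on Ethernet message carries a short transport header, a length field and a counter, in front of the actual XCP packet, which in turn begins with a packet identifier followed by the payload (for cyclic measurement, the sampled DAQ data). The decoder below is only a rough sketch; little-endian header fields are assumed for illustration, and the ASAM XCP specification governs the exact framing.

    /* Rough sketch of parsing one XCP-on-Ethernet message from a receive
       buffer: 2-byte length, 2-byte counter, then the XCP packet (PID plus
       data). Little-endian header assumed for illustration only. */
    #include <stdint.h>
    #include <stddef.h>

    typedef struct {
        uint16_t       len;  /* length of the XCP packet that follows */
        uint16_t       ctr;  /* message counter for loss detection    */
        const uint8_t *pkt;  /* pointer to PID + data bytes           */
    } XcpEthMessage;

    /* Returns 0 on success, -1 if the buffer is too short. */
    int parse_xcp_eth(const uint8_t *buf, size_t buf_len, XcpEthMessage *out)
    {
        if (buf_len < 4) return -1;
        out->len = (uint16_t)(buf[0] | (buf[1] << 8));
        out->ctr = (uint16_t)(buf[2] | (buf[3] << 8));
        if ((size_t)out->len + 4u > buf_len) return -1;
        out->pkt = buf + 4; /* first byte of the packet is the PID */
        return 0;
    }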

Validating sensor data with reality
The ECU's object detection results must now be verified against reality. Is the distance to the vehicle ahead on the road actually 45.5 meters? To compare the sensor data with reality, it is first necessary to acquire that reality. A camera, which is independent of the sensor system, records the driving situation. Developers can now quickly and reliably verify the object detection algorithms of their ECUs by comparing the objects detected by the ECU with the video image.

Figure 2: Video image of the environment with objects detected by the distance measuring radar system and display of objects from a bird's eye perspective.
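Conceptually, the comparison against the reference boils down to checking whether the distance the ECU reports for a matched object lies within a tolerance band around the ground-truth distance taken from the reference recording. The helper below is only a minimal sketch of that check; a real validation pipeline also has to associate objects between the two data sources and compensate for timing offsets.

    /* Minimal sketch: does the ECU-reported distance agree with the
       reference (ground-truth) distance within a given tolerance? */
    #include <math.h>
    #include <stdbool.h>

    bool distance_matches(float ecu_distance_m, float reference_m,
                          float tolerance_m)
    {
        return fabsf(ecu_distance_m - reference_m) <= tolerance_m;
    }

    /* Example: the ECU reports 45.5 m and the reference says 45.8 m;
       distance_matches(45.5f, 45.8f, 0.5f) evaluates to true. */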

