
EETE JUNE 2013

Roadmap to automated driving takes shape: silicon chauffeur takes over

By Christoph Hammerschmidt

The European Community intends to reduce the number of traffic fatalities by 50 percent by 2020; the United Nations has put a similar programme in place to improve road traffic safety and cut back the number of traffic accidents on a global scale. This gives carmakers and tier ones a good argument to switch to automated driving: a chauffeur made of silicon and algorithms is less prone to fatigue and emotions, and thus a safer driver than a human. Another big driver for automation on the road is the reduction of greenhouse gas emissions.

Many of the building blocks required to automate the act of driving are already in place, such as Electronic Stability Control (ESC), which today is one of the most widespread driver assistance systems in the automotive world, explains Wolf-Henning Scheider, who oversees the Chassis Control Systems business for Robert Bosch GmbH. According to Scheider, ESC has proved to be "plainly the best technical measure to prevent accidents". But systems that eventually could take over control of the car will, of course, be much more sophisticated.

To enable a vehicle to drive autonomously (no driver required) or at least automatically (a driver is present but only monitors the activities of the electronic driver), it needs, first of all, a machine perception of its surroundings. Sensors are therefore a precondition for the next generation of vehicles. The vehicle then needs the ability to process the data provided by the sensors, and the ability to translate the results of its algorithms into instructions that affect its longitudinal and lateral movements. The third element required for automatic driving, the actuators, is already in place: electric power steering, electric brakes (already required for ESC) and throttle-by-wire are commonplace in today's vehicles.
Algorithms and computing resources are subject to research and advanced development, much like the sensors. Not all sensor technologies are suited to volume cars. The LIDAR (laser) sensors used in Google's famous experimental autonomous vehicles provide good spatial resolution, but they are rather susceptible to errors induced by unfavourable weather conditions such as snow or fog. They are also much too expensive for deployment in series vehicles: the high mechanical and optical content prevents the price from dropping significantly at high volumes, explains Ralf Herrtwich, director for Daimler Benz Research and Advanced Development of Driver Assistance Systems.

To create a 360° surround image for the car's brains, manufacturers typically rely on a combination of proven technologies: radar, stereo and mono cameras, infrared and ultrasound. As an example, Daimler places four radar sensors at the corners of the vehicle, plus one long-distance radar (range: some 270 metres) centrally at the front and one at the rear. Radar and cameras complement each other ideally, Herrtwich explained: while radar is ideal for determining the relative speed of and distance to an object, cameras, along with pattern recognition software, offer images with excellent resolution and can discriminate between different objects. In addition, stereo cameras can provide a spatial model of their surroundings.

The electronic systems that steer the vehicle through traffic need exact and trustworthy information to compute correct driving decisions. The signals from different sensors are therefore combined and cross-checked for plausibility and significance, an approach called "sensor fusion". Already in today's driver assistance systems, the signals from radar and video sensors are used to control several safety functions. For example, the video signals are used to "warn" the airbag controller when a collision seems unavoidable, before the mechanical sensors trigger the ignition.
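The sensor-fusion idea described above, combining the strengths of each sensor and rejecting implausible readings, can be sketched in a few lines of Python. All names, data structures and thresholds here are hypothetical illustrations; production systems use far more elaborate techniques such as Kalman filtering and track association.

```python
# Illustrative sensor-fusion sketch: take range and relative speed from
# radar, object classification from the camera, and discard tracks where
# the two sensors disagree on distance (a simple plausibility check).
from dataclasses import dataclass

@dataclass
class RadarTrack:
    distance_m: float       # radar excels at measuring range...
    rel_speed_mps: float    # ...and relative speed

@dataclass
class CameraTrack:
    distance_m: float       # coarser range estimate, e.g. from stereo vision
    object_class: str       # pattern recognition result: "car", "pedestrian", ...

def fuse(radar: RadarTrack, camera: CameraTrack, max_dev_m: float = 5.0):
    """Cross-check both sensors; return a fused track or None if implausible."""
    if abs(radar.distance_m - camera.distance_m) > max_dev_m:
        return None  # sensors disagree: not plausible, discard the track
    # Take the best attribute from each sensor.
    return {
        "distance_m": radar.distance_m,
        "rel_speed_mps": radar.rel_speed_mps,
        "object_class": camera.object_class,
    }

track = fuse(RadarTrack(42.0, -3.5), CameraTrack(40.8, "car"))
print(track)
```

The design point is the one Herrtwich makes: neither sensor alone is trusted, and each contributes only the quantity it measures well.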
“Video-controlled pre-activation of airbags and safety belts saves valuable time in the case of an accident”, explains Gerhard Steiger, director of Chassis Control Systems for Bosch.

At the next level, vehicles will be connected, be it by car-to-car or car-to-infrastructure schemes, whereby the infrastructure includes relevant traffic data stored in the cloud. Current local weather information, variable speed limits or traffic congestion data will be used to configure the internal systems of the car accordingly, Steiger said.

Currently, carmakers and suppliers are discussing the roadmap towards automated driving. Steiger believes that the techniques necessary for automatic driving will gradually emerge from today's advanced driver assistance systems. In contrast to more or less all OEMs who do not yet

Fig 1: Radar and video sensors generate a 360° surround model of the car's environment.
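The video-based pre-activation Steiger describes can be sketched as a time-to-collision (TTC) check: if the camera's estimate of when impact will occur drops below a threshold, belts are pre-tensioned and the airbag controller is armed before the mechanical crash sensors fire. The function names and the threshold below are hypothetical illustrations, not Bosch's actual logic.

```python
# Hypothetical pre-crash sketch: derive time-to-collision from distance
# and closing speed, and pre-activate restraint systems when a collision
# appears unavoidable.
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """TTC = distance / closing speed; infinite if the object is not closing."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return distance_m / closing_speed_mps

def preactivate(distance_m: float, closing_speed_mps: float,
                ttc_threshold_s: float = 0.6) -> bool:
    """Return True when TTC falls below the (assumed) activation threshold."""
    return time_to_collision(distance_m, closing_speed_mps) < ttc_threshold_s

print(preactivate(5.0, 15.0))   # TTC ~ 0.33 s: collision imminent -> True
print(preactivate(50.0, 10.0))  # TTC = 5 s: no pre-activation -> False
```

The time gained is exactly the difference between the camera seeing the collision coming and the mechanical sensors registering the impact itself.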

