Delphi selected to build Audi's autopilot computer
By Christoph Hammerschmidt

Automated driving is ante portas – but what will the computers look like that steer the vehicles through the evening rush hour and along the high-speed autobahn? Audi has now introduced the electronic brain of its future self-driving cars. The zFAS (German for zentrales Fahrerassistenzsteuergerät, or central driver assistance controller) will be gradually introduced across Audi's model range, starting within the next two years. The contract to manufacture the high-performance computing platform has been awarded to supplier Delphi, which was also involved in the development. It is safe to assume that Delphi's current project of sending a self-driving vehicle across the United States, from San Francisco to New York, draws on the company's development work for the zFAS, all the more so as the vehicle used for the cross-country trip is an Audi. Delphi was not the only company involved in the zFAS development, however: partners such as TTTech (the real-time networking company that recently received major funding from Audi as well as from Infineon), Mobileye (contributing its camera signal processing expertise) and high-performance microprocessor manufacturer nVidia also played important parts.

The zFAS is the vehicle's sensor data hub. Radar, lidar, ultrasound and camera data are processed to create a complete model of the vehicle's surroundings in real time. The computed findings are then made available to all driver assistance systems distributed around the vehicle. Its paramount significance for processing the sensor signals makes it the central hub for all functions of piloted driving, Audi says. Until now, driver assistance systems have mostly been managed by physically separate electronic control units (ECUs). Audi claims to be the first carmaker to implement this function as a central domain architecture, combining all related functions, sensors, electronics hardware and software architecture in one unit that follows a holistic concept. Safety aspects have been a focus of the concept, Audi assures, without elaborating. The platform is the size of a tablet computer. Data processing is split between Mobileye's EyeQ3 microprocessor and the Tegra K1 from nVidia. With these processors, the performance of the platform equals the combined computing power of all the ECUs in a state-of-the-art mid-sized car, Audi says. The modular approach of the platform ensures scalability.

Audi's development roadmap provides for self-learning vehicles as one of the next development steps. The data generated by the zFAS will be fed via a mobile radio connection to a backend in the cloud. There, the data are processed by machine learning algorithms and then sent back to the car. Thus, the zFAS continuously increases its performance and improves its ability to handle complex situations over time.
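A minimal sketch of the centralized domain-controller idea described above, assuming nothing about Audi's or Delphi's actual software: a single hub ingests readings from radar, lidar, ultrasound and camera, fuses them into one environment model and publishes that model to every subscribed assistance function, the role that used to be spread across separate ECUs. All class and function names are hypothetical, and the "fusion" step is only a placeholder for the tracking and mapping a real controller would run on dedicated hardware such as the EyeQ3 and Tegra K1.

# Hypothetical sketch of a central driver assistance controller (not Audi/Delphi code)
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class SensorReading:
    sensor: str          # e.g. "radar", "lidar", "ultrasound", "camera"
    timestamp: float     # seconds
    payload: dict        # raw measurement; format depends on the sensor

@dataclass
class EnvironmentModel:
    timestamp: float
    objects: List[dict] = field(default_factory=list)   # fused object list

class CentralAssistanceController:
    """One domain controller: fuses all sensor inputs into a single model
    and distributes it to every subscribed assistance function."""

    def __init__(self) -> None:
        self._subscribers: List[Callable[[EnvironmentModel], None]] = []
        self._latest: Dict[str, SensorReading] = {}

    def subscribe(self, assistance_function: Callable[[EnvironmentModel], None]) -> None:
        self._subscribers.append(assistance_function)

    def ingest(self, reading: SensorReading) -> None:
        # Keep the newest reading per sensor, rebuild the model, publish it.
        self._latest[reading.sensor] = reading
        model = self._fuse()
        for fn in self._subscribers:
            fn(model)

    def _fuse(self) -> EnvironmentModel:
        # Placeholder fusion: a real controller would run object detection,
        # tracking and occupancy-grid mapping here.
        ts = max((r.timestamp for r in self._latest.values()), default=0.0)
        objects = [{"source": s, **r.payload} for s, r in self._latest.items()]
        return EnvironmentModel(timestamp=ts, objects=objects)

# Usage: one controller serves all assistance functions instead of one ECU each.
hub = CentralAssistanceController()
hub.subscribe(lambda m: print("adaptive cruise sees", len(m.objects), "objects"))
hub.ingest(SensorReading("radar", 0.01, {"range_m": 42.0, "bearing_deg": -3.0}))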
IDT to speed CERN's data analytics
By Julien Happich

Striking a three-year collaboration agreement with the European Organization for Nuclear Research (CERN), Integrated Device Technology (IDT) will provide its low-latency RapidIO interconnect technology to help improve data acquisition and analysis in some of the world's most advanced fundamental physics research. Massive volumes of data are collected by the experiments on CERN's Large Hadron Collider (LHC), the world's largest and most powerful particle accelerator.

Teams from IDT and CERN will use the IDT technology to improve the quality and timeliness of this data collection, as well as the initial analysis and reconstruction work at the experiments' data farms and the CERN Data Centre. The LHC produces millions of collisions every second in each detector, generating approximately one petabyte of data per second. This data is vital to CERN's quest to answer fundamental questions about the universe.

The RapidIO technology provides a low-latency connection between clusters of computer processors, dramatically speeding the movement of data. Widely used in 4G base stations, IDT's low-latency RapidIO products can also enable real-time data analytics and data management for high-performance computing (HPC) and data centres.

As part of the mandate for the fifth phase of the CERN openlab partnership, several of the LHC experiments are exploring the possibility of moving from custom-built hardware and backplanes to fully programmable heterogeneous computing with a low-latency interconnect between large clusters of processors. IDT's current RapidIO 20 Gbps interconnect products will be used in the first stage of the collaboration, with an upgrade path to RapidIO 10xN 40 Gbps technology as research at CERN progresses.

"This CERN collaboration is about enabling programmable real-time mission-critical data analytics," said Sailesh Chittipeddi, IDT's vice president of Global Operations and chief technology officer. "Since the job spans multiple processors, the interconnect between them has to be ultra-low latency, and our technology – already used across 4G wireless base station deployments worldwide – is ideally suited to CERN's real-time interconnect needs."

Because of the volume of real-time data CERN collects, current implementations are done in custom-built ASIC hardware. Using algorithms implemented in hardware, the data is sampled and only 1 percent is selected for further analysis. "The bottleneck for better data acquisition, selection and analytics is superior real-time interconnect," said Alberto Di Meglio, head of CERN openlab.

The collaboration is based on industry-standard IT form-factor solutions suitable for deployment in HPC clusters and data centres. Engineers will use heterogeneous servers based on specifications from RapidIO.org that target the Open Compute Project's High Performance Computing initiative, which IDT co-chairs.
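The data-reduction figures in the article lend themselves to a quick back-of-envelope check. The sketch below is illustrative only and is not CERN or IDT code: it applies the stated numbers (roughly one petabyte per second produced, about 1 percent selected) and uses a made-up trigger predicate as a stand-in for the selection algorithms that actually run in custom ASIC hardware.

# Illustrative only; rates taken from the article, trigger logic is hypothetical.
PETABYTE = 10 ** 15

raw_rate_bytes_per_s = 1 * PETABYTE      # ~1 PB/s produced by the detectors
selection_fraction = 0.01                # ~1 percent kept for further analysis

kept_rate = raw_rate_bytes_per_s * selection_fraction
print(f"Data kept for analysis: {kept_rate / 10**12:.0f} TB/s")   # ~10 TB/s

def trigger(event: dict) -> bool:
    """Hypothetical selection predicate: keep only events whose summed
    transverse energy exceeds a threshold (real triggers are far richer)."""
    return sum(event.get("et_gev", [])) > 200.0

sample_events = [
    {"et_gev": [35.0, 50.0]},           # rejected
    {"et_gev": [120.0, 95.0, 40.0]},    # accepted
]
selected = [e for e in sample_events if trigger(e)]
print(f"Selected {len(selected)} of {len(sample_events)} sample events")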

