
EDNE MAY 2015

Sensors

Sensor fusion enhances device performance

By Keith Nicholson, Amithash Kankanallu Jagadish, Bosch Sensortec

With the ongoing revolution in powerful and intelligent devices such as smartphones, new applications are being enabled at a rapid pace, and system development often fails to keep up with new and changing requirements. Today, new applications such as indoor navigation and augmented reality, which make use of motion or positional data, force users to accept somewhat crude sensor fusion implementations originally developed for simple gaming applications. End users, however, now easily notice the considerable shortcomings and inaccuracies of these implementations.

Sensor fusion is an engineering technique that combines data from various system sensors to deliver more accurate, complete and dependable sensor signals or derived sensory information. For sensor fusion to be consistently accurate, the engineer needs a deep understanding of each sensor's strengths and weaknesses before deciding how the data from these sensors is best combined. One approach that is being successfully implemented uses a fusion library based on sensor signals from accelerometers, magnetometers and gyroscopes, and compensates for each sensor's shortcomings to provide highly accurate, reliable and stable orientation data.

As end users become exposed to these new applications, they demand more accurate and reliable solutions. Indoor navigation, where sensors are used to track users between known fixed locations, is comparable to early GPS equipment: only a superior quality of sensor fusion can provide the level of realism, accuracy and therefore user confidence required. OEMs are aware of this, and most see it as an opportunity to differentiate their products. Another example is the progression from virtual to augmented reality.
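The article does not describe the fusion library's internals, but the idea of offsetting one sensor's weakness with another's strength can be sketched with a simple complementary filter: the gyroscope gives smooth angular rates that drift over time, while the accelerometer gives a noisy but drift-free tilt reference from gravity. The function names and the 0.98 weighting below are illustrative assumptions, not the library's actual design.

```python
import math

def accel_to_pitch(ax, ay, az):
    """Derive a drift-free but noisy pitch angle (radians) from the
    measured gravity vector."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Fuse the integrated gyro rate (smooth, but drifts) with the
    accelerometer-derived pitch (noisy, but drift-free).  alpha weights
    the gyro path; (1 - alpha) slowly pulls the estimate back to the
    accelerometer reference, cancelling gyro drift."""
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

# Illustration: a stationary device tilted 10 degrees.  The gyro reports
# a small bias (0.001 rad/s) that would accumulate if integrated alone;
# the accelerometer anchors the estimate to the true tilt.
true_pitch = math.radians(10.0)
pitch = 0.0
for _ in range(500):
    pitch = complementary_filter(pitch, gyro_rate=0.001,
                                 accel_pitch=true_pitch, dt=0.01)
```

After a few seconds of simulated samples the estimate settles close to the true 10-degree tilt despite the gyro bias. A magnetometer plays the analogous anchoring role for heading (yaw), which gravity cannot observe.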
In virtual reality (VR) systems the user is isolated from the real world and immersed in an artificial world. In augmented reality (AR) systems, users remain in touch with the real world while interacting with virtual objects around them. With existing technology, the lag in information delivery can actually cause nausea in the user, and such misalignment in AR can result in a very negative user experience.

The big challenge for OEMs and platform developers (i.e. OS developers) is to ensure that all devices deliver the performance required for these applications to work consistently. For example, Android devices span many different software and hardware combinations, each resulting in a different output quality. There are currently no standards and no standard test procedures, which means that application developers cannot rely on Android sensor data to achieve consistent performance across many different platforms.

The following is a proposal for a motion-tracking camera system to analyse and compare the performance of different hardware/software combinations, and thus set minimum performance criteria. The performance analysis is accomplished by measuring four key performance indicators (KPIs) of the system:

- Static accuracy
- Dynamic accuracy
- Orientation stabilisation time
- Calibration time

The camera-based system produces an orientation vector based on the movement of an object (the smartphone) by tracking markers on the object. This orientation can then be compared with the vectors created by sensors in the phone. These vectors are simultaneously recorded using a data-recording application,
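The article does not specify how the camera and phone orientations are compared numerically. Assuming both are expressed as unit quaternions (a common representation for device orientation, though the article names only "orientation vectors"), a static-accuracy measurement reduces to the angular difference between the reference and measured orientations:

```python
import math

def quat_angle_error(q_ref, q_meas):
    """Angular difference in radians between two unit quaternions
    (w, x, y, z): e.g. a camera-derived reference orientation and the
    phone's fused orientation.  The absolute value of the dot product
    handles the q / -q double-cover ambiguity."""
    dot = abs(sum(a * b for a, b in zip(q_ref, q_meas)))
    dot = min(1.0, dot)  # guard against rounding slightly above 1.0
    return 2.0 * math.acos(dot)

# Illustration: the phone's reported orientation is off by 5 degrees
# about the z axis relative to the camera reference.
half = math.radians(5.0) / 2.0
q_ref = (1.0, 0.0, 0.0, 0.0)
q_meas = (math.cos(half), 0.0, 0.0, math.sin(half))
error_deg = math.degrees(quat_angle_error(q_ref, q_meas))
```

Averaging this error over a stationary recording would give the static-accuracy KPI; evaluating it sample-by-sample against the time-aligned camera track would give dynamic accuracy.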

