EE Times-Asia > Sensors/MEMS

Boost device performance with sensor fusion

Posted: 15 Jan 2015

Keywords: sensor fusion, sensors, magnetometer, gyroscope, accelerometers

Amid the current revolution in powerful, intelligent devices such as smart phones, new applications are being enabled at a rapid pace, and system development often fails to keep up with new and changing requirements. Today, applications such as indoor navigation and augmented reality, which rely on motion or positional data, force users to accept somewhat crude sensor fusion implementations originally developed for simple gaming. End users now readily notice the considerable shortcomings and inaccuracies of these implementations.

Sensor fusion is a creative engineering technique that combines data from various system sensors to deliver more accurate, complete and dependable sensor signals or derived sensory information. For sensor fusion to be consistently accurate, the engineer needs a deep understanding of each sensor's strengths and weaknesses before deciding how the data from these sensors is best combined. One approach that is being successfully implemented uses a fusion library based on sensor signals from accelerometers, magnetometers and gyroscopes, compensating for each sensor's shortcomings to provide highly accurate, reliable and stable orientation data.
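A classic way to combine a drifting-but-responsive gyroscope with a noisy-but-drift-free accelerometer is a complementary filter. The sketch below is illustrative only (the function name and weight are assumptions, not the fusion library described in this article), and shows the idea for a single tilt angle:

```python
import math

def complementary_filter(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer estimates of one tilt angle.

    The gyroscope integrates rotation quickly but drifts over time; the
    accelerometer reading is drift-free but noisy. Weighting the
    integrated gyro estimate by `alpha` and the accelerometer angle by
    (1 - alpha) combines the strengths of both sensors.
    """
    return alpha * (prev_angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

Called once per sample interval `dt`, the filter tracks fast rotations via the gyroscope term while the small accelerometer weight slowly pulls the estimate back toward the drift-free reference.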

As end users become exposed to these new applications, they demand more accurate and reliable solutions. Indoor navigation, where sensors track users between known fixed locations, resembles early GPS equipment: only high-quality sensor fusion can provide the realism, accuracy and therefore user confidence required. OEMs are aware of this, and most see it as an opportunity to differentiate their products.

Another example is the progression from virtual to augmented reality. In virtual reality (VR) systems the user is isolated from the real world and immersed in an artificial world. In augmented reality (AR) systems, users remain in touch with the real world while interacting with virtual objects around them. With existing technology, the lag in information delivery can actually nauseate the user, and such misalignment in AR can result in a very negative user experience.

The big challenge for OEMs and platform developers (i.e. OS developers) is to ensure that all devices deliver the performance required for these applications to work consistently. For example, in Android devices there are many different software and hardware combinations, each resulting in a different output quality. There are currently no standards and no standard test procedures, which means that application developers cannot rely on Android sensor data to achieve consistent performance across many different platforms.

The following proposes a motion-tracking camera system to analyse and compare the performance of different hardware/software combinations, and thus to set minimum performance criteria. The performance analysis is accomplished by measuring four key performance indicators (KPIs) of the system:
- Static accuracy
- Dynamic accuracy
- Orientation stabilisation time
- Calibration time

The camera-based system produces an orientation vector based on the movement of an object (the smart phone) by tracking markers on the object. Orientation can then be compared with the vectors created by sensors in the phone. These vectors are simultaneously recorded using a data recording application, which allows a direct comparison of end user devices.
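Comparing the camera-derived orientation vector with the vector reported by the phone's sensors reduces to computing the angular error between the two. A minimal, hypothetical KPI helper (not part of the test system described here) might look like this:

```python
import math

def angular_error_deg(v_ref, v_test):
    """Angle in degrees between a camera-reference orientation vector
    and the device-reported orientation vector."""
    dot = sum(a * b for a, b in zip(v_ref, v_test))
    norm_ref = math.sqrt(sum(a * a for a in v_ref))
    norm_test = math.sqrt(sum(b * b for b in v_test))
    # Clamp to guard against floating-point values just outside [-1, 1]
    cos_theta = max(-1.0, min(1.0, dot / (norm_ref * norm_test)))
    return math.degrees(math.acos(cos_theta))
```

Static accuracy could then be reported as the mean of this error while the device is at rest, and dynamic accuracy as the error during motion.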

This article introduces the concept of sensor fusion within a smart phone context. It discusses how sensor fusion software is used to improve overall accuracy and introduces a test method including performance result measurement on a number of flagship smart phones.

The fusion library described uses accelerometer, magnetometer and gyroscope sensor signals to compensate for each other's shortcomings and provides highly accurate, reliable and stable orientation data. Let's look at the strengths and weaknesses of these critical devices and how they compensate for each other's drawbacks.

The orientation of an object describes how it is placed in 3-D space, and is typically given relative to a frame of reference specified by a coordinate system. At least three independent values, as components of a 3-dimensional vector, are needed to describe true orientation. During a rotation, all points of the body change their position except those lying on the rotation axis.
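The three independent values are commonly expressed as Euler angles (yaw, pitch, roll). As a worked illustration, assuming a Z-Y-X rotation order (one of several possible conventions), they map to a 3x3 rotation matrix as follows:

```python
import math

def rotation_matrix(yaw, pitch, roll):
    """Z-Y-X (yaw-pitch-roll) rotation matrix from three Euler angles
    in radians. Rows/columns follow the usual aerospace convention."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
```

With all three angles zero the matrix is the identity, i.e. the body frame coincides with the reference frame.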

The magnetometer
A magnetometer is highly sensitive to interfering local magnetic fields and distortions, which cause errors in the calculated magnetic heading. The gyroscope can be used to detect such interference: a heading change registered while no rotation is measured indicates a disturbance. Sensor fusion can then compensate accurately by giving more weight to the gyroscope data than to that of the magnetometer.
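This compensation strategy can be sketched as an adaptive blend of the two heading sources. The function name, weights and threshold below are illustrative assumptions, not the fusion library's actual API:

```python
def fused_heading(prev_heading, gyro_yaw_rate, mag_heading, dt,
                  mag_weight=0.05, disturbance_threshold_deg=10.0):
    """Blend gyroscope-integrated heading with magnetometer heading.

    If the magnetometer heading jumps while the gyroscope reports almost
    no rotation, the jump is treated as magnetic interference and the
    magnetometer reading is ignored; otherwise a small magnetometer
    weight corrects long-term gyroscope drift. All units in degrees.
    """
    gyro_heading = prev_heading + gyro_yaw_rate * dt
    jump = abs(mag_heading - gyro_heading)
    if jump > disturbance_threshold_deg and abs(gyro_yaw_rate) < 1.0:
        # Heading changed with no measured rotation: likely interference.
        return gyro_heading
    return (1.0 - mag_weight) * gyro_heading + mag_weight * mag_heading
```

When the disturbance passes, the small magnetometer weight gradually re-anchors the heading to magnetic north, removing accumulated gyroscope drift.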




