Implementing proximity gesture for automotive HMI

Posted: 10 Feb 2016

Keywords: capacitive proximity sensors, gesture recognition, infotainment, rawcounts, capacitance

As the hand continues to pass over the console, the top and bottom sensors are triggered while the left sensor remains triggered. As the hand moves further towards the right sensor, the right sensor is triggered and the left sensor stops sensing the hand, because the hand has moved outside its region of detection. As the hand passes over the right sensor, the top and bottom sensors no longer detect the hand's presence, and when the hand moves further away, the right sensor too stops sensing it. The order in which the sensors are triggered will therefore be one of the following, depending on the position of the hand and the sensitivities of the individual sensors:

Left → top → bottom → right
Left → bottom → top → right
Left → bottom → right
Left → top → right

All of the above sensor activation sequences are mapped to the left-to-right gesture. A PSoC is used in this case to implement the capacitive proximity sensors. A capacitance-to-digital converter inside the PSoC, known as CapSense Sigma-Delta (CSD), is used to measure the capacitance. The output of the CSD module is referred to as rawcounts: the higher the rawcounts, the greater the capacitance sensed by the sensor. The presence of a hand close to a proximity sensor increases its capacitance.
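As a rough illustration of how such a mapping might be coded, the sketch below records the order in which the sensors were triggered and compares it against the valid left-to-right sequences listed above. The sensor enumeration, the sequence table and the function name are assumptions made for this example, not the actual firmware.

/* Sketch: classify a recorded sensor-trigger order as a left-to-right gesture.
 * Sensor IDs, the sequence table and the function name are illustrative only. */
#include <stdbool.h>
#include <stddef.h>

typedef enum { SENSOR_LEFT, SENSOR_RIGHT, SENSOR_TOP, SENSOR_BOTTOM } sensor_t;

/* Valid trigger orders for the left-to-right gesture (terminated by -1). */
static const int left_right_seqs[4][5] = {
    { SENSOR_LEFT, SENSOR_TOP,    SENSOR_BOTTOM, SENSOR_RIGHT, -1 },
    { SENSOR_LEFT, SENSOR_BOTTOM, SENSOR_TOP,    SENSOR_RIGHT, -1 },
    { SENSOR_LEFT, SENSOR_BOTTOM, SENSOR_RIGHT,  -1,           -1 },
    { SENSOR_LEFT, SENSOR_TOP,    SENSOR_RIGHT,  -1,           -1 },
};

/* Return true if the recorded trigger order (length n) matches any valid sequence. */
bool is_left_right_gesture(const int *order, size_t n)
{
    if (n == 0 || n > 4)
        return false;                 /* at most four sensors can be involved */

    for (size_t s = 0; s < 4; s++) {
        size_t i = 0;
        while (i < n && left_right_seqs[s][i] == order[i])
            i++;
        if (i == n && left_right_seqs[s][i] == -1)
            return true;              /* full match, no extra sensors expected */
    }
    return false;
}

The same table-driven comparison would be repeated with a second table for the right-to-left sequences described next.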

When the rawcounts of a sensor rise above its baseline value by a certain threshold, we say the sensor is triggered by the presence of an object in its proximity (a minimal sketch of this check follows the sequence list below). The rawcounts of the four sensors as a hand draws a straight line from left to right, as shown in figure 2 (top), are plotted in figure 2 (bottom). The plot confirms the order of sensor activation described above. If the hand moves in the opposite direction, that is, for a right-to-left gesture, the order in which the left and right sensors are triggered is reversed with respect to the sequences above. The sensor triggering sequence for a right-to-left gesture will therefore be one of the following:

Right → top → bottom → left
Right → bottom → top → left
Right → bottom → left
Right → top → left
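For reference, the following is a minimal sketch of the trigger check described above. It assumes a hypothetical read_rawcount() routine in place of the real CapSense scan/read API and a fixed per-sensor baseline; the threshold value shown is arbitrary and would need tuning for a real console.

/* Sketch: decide whether a proximity sensor is triggered.
 * read_rawcount() is a hypothetical stand-in for the CapSense scan/read call;
 * baseline tracking is simplified to one stored value per sensor. */
#include <stdbool.h>
#include <stdint.h>

#define NUM_SENSORS     4
#define PROX_THRESHOLD  200     /* counts above baseline; tuning-dependent */

extern uint16_t read_rawcount(int sensor);  /* hypothetical hardware access */
static uint16_t baseline[NUM_SENSORS];      /* rawcounts with no hand present */

bool sensor_triggered(int sensor)
{
    uint16_t raw = read_rawcount(sensor);
    /* A hand near the sensor raises its capacitance, hence its rawcounts. */
    return raw > baseline[sensor] + PROX_THRESHOLD;
}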

The two gestures described above involve movement of the hand in the horizontal direction. Similarly, if the hand draws a straight line in the vertical direction, the result is either a top-to-bottom or a bottom-to-top gesture, depending on the direction of hand movement.

The top-to-bottom and bottom-to-top gestures can be associated with simple actions such as scrolling down or up through a menu or a track list, as shown in figure 3.

Figure 3: Proximity gesture of a hand drawing a straight line in the vertical direction to scroll through a menu.

The left-to-right and right-to-left gestures can be associated with skipping to the next track or album in a music player application. The same gestures can also be used in place of button presses to turn the interior lights of a car on or off, by placing the proximity sensors as shown in figure 4.
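A simple dispatcher could then map each recognised gesture to an infotainment action. This is only a sketch: the gesture enumeration and the action functions (next_track, previous_track, scroll_menu) are placeholders for whatever hooks the HMI application provides.

/* Sketch: map recognised gestures to HMI actions.
 * Gesture names and action functions are illustrative placeholders. */
typedef enum { GESTURE_LEFT_RIGHT, GESTURE_RIGHT_LEFT,
               GESTURE_TOP_BOTTOM, GESTURE_BOTTOM_TOP } gesture_t;

extern void next_track(void);       /* hypothetical media-player hooks */
extern void previous_track(void);
extern void scroll_menu(int step);  /* positive = down, negative = up  */

void handle_gesture(gesture_t g)
{
    switch (g) {
    case GESTURE_LEFT_RIGHT: next_track();     break;
    case GESTURE_RIGHT_LEFT: previous_track(); break;
    case GESTURE_TOP_BOTTOM: scroll_menu(+1);  break;
    case GESTURE_BOTTOM_TOP: scroll_menu(-1);  break;
    }
}

With the sensors placed near the cabin light instead, the same left-to-right and right-to-left cases could just as well call a light on/off routine.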

Figure 4: A hand drawing a straight-line gesture to control the cabin lights of a car.

