
Beyond Mobile Gestures

4.76/5 (16 votes)
8 Dec 2013 · CPOL · 8 min read · 27.7K views
What is next for mobile gestures? The past and future of mobile gestures.

What Are Mobile Gestures?  

Mobile gesture technology lets you control your device with movements or touches. Gestures act as shortcuts for complex actions, and their success depends on how naturally they fit into users' behavior. A gesture can be a movement of the device itself, a movement captured by the device's camera, or a touch on the screen or on the device body.


Gestures have become one of the main criteria customers weigh when choosing a new smartphone, alongside battery life, screen size, weight, and processor. Phone manufacturers invest heavily in such features to win customers, and the companies that ship the same operating system (Android) compete on them most directly: the leading firms in this race are Samsung, Motorola, LG, HTC, Sony, Huawei, ZTE, and Google's Nexus line. Apple is not in the same competition, but it clearly has its own gesture strategy. These features are also frequent subjects of patent applications.


History  

While smartphones offer users a wide range of tasks, such as voice communication, SMS, multimedia, internet, GPS, and mobile games, their input options used to be limited. Originally the only input method was a hardware keyboard. Hardware keyboards were then surpassed by touch screens, and once touch screens and multi-touch arrived, shortcut movements grew popular. Movements made by touching the screen are called screen gestures.

Screen Gestures 

Some of the well-known and popular screen gestures are:

- spread or pinch to resize an image,

- swipe right to delete a message or answer a call,

- slide to unlock the phone.

Screen gestures were an evolutionary step in gesture history. They are still in use, although their share of gesture input is shrinking.


Image 1

Motion Gestures 

The second invention on this subject was motion gestures, also called 3D gestures. Once devices were equipped with sensors, motion gestures became user favorites. The main sensors in today's devices are accelerometers, gyroscopes, and orientation sensors.


The motion gestures most popular with phone users are:


- raising the phone to your face to answer an incoming call,

- turning the device over to mute an incoming call,

- double-tapping the top of the device to jump to the top of a list,

- tilting the phone left or right to steer a craft in a game.
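The turn-over-to-mute gesture above, for instance, reduces to a simple check on the accelerometer: when the phone lies face down, gravity reads roughly -9.81 m/s² on the z-axis. Here is a minimal, hypothetical sketch in plain Java, with hard-coded sample values standing in for the readings an Android SensorEvent would deliver:

```java
public class FlipToMuteDetector {
    // Gravity is ~9.81 m/s^2; allow some tolerance for sensor noise.
    private static final float GRAVITY = 9.81f;
    private static final float TOLERANCE = 1.5f;

    // Returns true when the accelerometer reports the device lying face down:
    // x and y near zero, z near -g (screen pointing at the table).
    public static boolean isFaceDown(float x, float y, float z) {
        return Math.abs(x) < TOLERANCE
            && Math.abs(y) < TOLERANCE
            && Math.abs(z + GRAVITY) < TOLERANCE;
    }
}
```

A gesture service would call `isFaceDown` on each accelerometer update during an incoming call and silence the ringer once it returns true.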

Mobile phone manufacturers ship motion gestures in the ROM of their products, and users can enable them from the settings menu. The image below shows the motion gestures menu of the Samsung Galaxy S3.


Image 2


Users can turn each gesture on or off individually. Direct Call and Smart Alert are my favorites on this phone. With Direct Call on, bringing the phone to your face while a contact is on screen directly calls that contact. With Smart Alert, if the phone has been lying with the screen off, it shows pending notifications as soon as you pick it up. The image below illustrates Smart Alert.


Image 3


If your phone does not offer these gestures in its settings menu, you can download a motion gesture application from Google Play. Listening to sensors, or registering a service to start at boot, does not require root permissions, so an application with the appropriate permissions can be installed safely. What is more, with such applications, or with your own program, you can bind your own action to a specific input; for example, you could send a message whenever the phone is held at a particular angle.
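That "send a message at a certain angle" idea can be sketched as a pitch check computed from accelerometer values. This is a hypothetical plain-Java sketch; on Android the x, y, z values would come from a SensorEvent, and the 40-50 degree band is an arbitrary example:

```java
public class TiltTrigger {
    // Pitch angle in degrees, derived from accelerometer axes:
    // atan2 of the y-axis against the combined x/z gravity component.
    public static double pitchDegrees(double x, double y, double z) {
        return Math.toDegrees(Math.atan2(y, Math.sqrt(x * x + z * z)));
    }

    // Fire the custom action when the device is tilted into a chosen band,
    // e.g. held upright between 40 and 50 degrees.
    public static boolean shouldTrigger(double x, double y, double z) {
        double pitch = pitchDegrees(x, y, z);
        return pitch >= 40.0 && pitch <= 50.0;
    }
}
```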

Air Gestures 

The latest invention in gestures is air gestures, also called wave control or hover control. They let device owners control the phone without touching it: using the proximity sensor and image processing technology, the camera or sensor captures hand movements and turns them into input for an action.


The ones users have adopted most so far are:

- holding your hand over the device screen to mute a call,

- passing your hand over the camera to go to the next browser page,

- waving your hand to skip to the next song,

- moving your hand from left to right, or vice versa, to capture the screen.


The air gesture on the left illustrates muting a call: while the phone rings, the user can mute it by holding a hand over the phone for a short while. An incoming call raises a service that watches for the hand movement.

Image 4

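The hold-to-mute behavior can be modeled as a tiny state machine over proximity readings: mute only after the sensor has been covered continuously for some dwell time, so a brief accidental wave does not trigger it. A plain-Java sketch, where the 600 ms threshold is an assumed example value:

```java
public class HoverMuteDetector {
    // How long the hand must cover the sensor before we mute (assumed value).
    private static final long HOLD_MILLIS = 600;

    private long coveredSince = -1; // -1 means the sensor is currently uncovered

    // Feed each proximity reading with its timestamp; returns true once
    // the sensor has been covered continuously for HOLD_MILLIS.
    public boolean onProximity(boolean near, long timestampMillis) {
        if (!near) {
            coveredSince = -1; // hand lifted: reset the dwell timer
            return false;
        }
        if (coveredSince < 0) {
            coveredSince = timestampMillis; // hand just arrived: start timing
        }
        return timestampMillis - coveredSince >= HOLD_MILLIS;
    }
}
```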

Another air gesture accepts or rejects an incoming call: the user moves a hand from left to right to accept, or right to left to reject. When you accept this way, the call is answered and the speaker turns on. This is a handy feature if your girlfriend calls while you are washing the dishes. :)


Image 5


On the Samsung Galaxy S4 and some Sony models, these gestures ship in the ROM by default and can be used by turning them on in the settings menu. There are also applications on Google Play that provide some of these features, but to use them the device needs a better-than-average proximity sensor.

The Architecture Of Gestures 

Sensors cannot stay on all the time because of power consumption, which is why a specific trigger is needed to turn them on. The main action listener should be registered at boot so the gesture is available after startup. After startup, the service listens for the main action; when the expected action occurs, it attaches the sensor listeners, and when the sensor listeners catch the input, the output action runs. Once the main action completes, the sensors should be released with a detach function.

The architecture below illustrates the point:

 Image 6 

 

Let's describe the call-a-contact gesture using this architecture. A broadcast receiver service is registered at boot to catch the contacts screen. Whenever the contacts screen is shown, the service starts the sensor listeners to capture the user's movements. If the user moves the phone to his face, the output action runs, which is calling the contact. When the contacts screen closes, the service orders the sensor listeners to detach. Registering the main action can be done from a settings menu.
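The attach/detach lifecycle above can be sketched as a small state machine. This is a hypothetical plain-Java model of the flow, not the actual Android SensorManager API:

```java
public class GestureLifecycle {
    enum State { IDLE, LISTENING }

    private State state = State.IDLE;

    // Main action observed (e.g. contacts screen opened): attach sensors.
    public void onMainAction() {
        if (state == State.IDLE) {
            state = State.LISTENING;   // attach sensor listeners here
        }
    }

    // Sensor input arrived; returns true only while listening, meaning
    // the output action (e.g. dial the contact on screen) should run.
    public boolean onSensorInput() {
        return state == State.LISTENING;
    }

    // Main action finished (e.g. contacts screen closed): detach sensors
    // so they stop draining the battery.
    public void onMainActionDone() {
        state = State.IDLE;
    }
}
```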

This architecture is designed for an Android device; it may differ on other embedded operating systems. 

 

Limitations 

 

These handy gestures bring some difficulties when we apply them. Without a main action, as described above, they cannot work, and an application that keeps the sensors on the whole time the phone is up can drain the battery quickly. In addition, sensor sensitivity is not good on all phones: some phones delay sensor updates, and the sensor range differs between models.

 

For example, on some devices the proximity sensor range is 0 to 5. Devices equipped with the best sensors are more expensive than others, and the processor and memory are still bottlenecks for image processing and continuous sensor listening. 
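Because the reported range differs per device, gesture code usually reduces the raw proximity value to a binary near/far decision against the sensor's own reported maximum rather than a hard-coded number. A minimal sketch, where 5.0 is an assumed example maximum:

```java
public class ProximityNormalizer {
    private final float maxRange; // device-reported maximum, e.g. 5.0 on some models

    public ProximityNormalizer(float maxRange) {
        this.maxRange = maxRange;
    }

    // Treat any reading below the sensor's maximum as "near"; many devices
    // only ever report 0 (covered) or maxRange (uncovered) anyway.
    public boolean isNear(float reading) {
        return reading < maxRange;
    }
}
```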

Next Step 

Mobile technology is improving very fast. Gestures will grow more popular in the near future, the current limitations will be solved, and gestures will become stable with new equipment.

1) I believe new sensors will be introduced in the near future; pressure or resistance sensors are candidates. Sensor value ranges may be redefined, and sensitivity will improve. 

2) The battery consumption problem will be solved. Then a gesture such as drawing a letter on the screen while the screen is off could be detected, because there would be no consumption problem and the gesture would not need a main action. 

3) Location: gestures will be registered per location; for example, if the user is in a shopping center, a gesture could be defined to buy an item.

4) Speech technology will become more popular, and gestures will combine with speech: speech plus a hand movement will produce a unique output. 

5) Once phones gain wearable companions, gesture usage will increase dramatically, and a new method will be introduced: body gestures, letting devices be controlled with body movements. The Samsung Gear was released alongside the Samsung Galaxy Note 3, and Google Glass is in the same competition. However, I think wearable devices need more time to become natural human behavior.  

6) Mind waves: there are devices that can capture brain waves, and if you look at the iOS or Android application markets you can find applications and games that work with them. They will become part of gesture input, letting users control their devices with mind waves. Check out MindWave Mobile: http://www.neurosky.com/ . 

Conclusion 

Mobile gesture technology will become one of the most important preferences for users when they choose a phone. Important limitations remain, but phone manufacturers are investing to solve them; once the problems are solved, gestures will become even more popular and handy. 

Screen gestures, motion gestures, and air gestures are today's gestures; body gestures and mind wave gestures will be the future's. Today gesture input is one-dimensional, but in the future location and speech services will also provide a range or an option list as input.  

 

Revision History 

------ 

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)