
Smartphones

Editor: Roy Want • Google • roywant@acm.org

Gesture Search: Random Access to Smartphone Content


Yang Li

Editor's Intro
One of the great opportunities for smartphones is to take human-computer interface design beyond hard and soft keyboards into the more natural realm of gestures. This department provides an introduction to the topic, by one of Google's leaders in this area.

Roy Want

Smartphones with touchscreens (in particular, Apple's iPhone and Google's Android phones) are the predominant, fastest-growing consumer computing devices. With annually increasing computing power and capacity, smartphones typically contain gigabytes of storage space and thus can store hundreds of contacts and music tracks and tens of videos and applications. In addition, smartphones offer rich functionality that users often need to configure for their own specific needs (for example, turning off the Bluetooth radio). Currently, a user must go through a series of screens or menus, or search a long list of items, to access a specific target on a smartphone. This is slow and tedious, especially because smartphone users are often occupied by real-world tasks. Essentially, there's a tension between the smartphone's rapidly growing capabilities and the user interface support it has to control them. It should be noted that most user interaction, including input and output, is conducted via the touchscreen of a smartphone, a small form-factor device.

In addition, there's a mismatch between mobile user tasks and existing mobile user interfaces. User tasks on these devices differ significantly from those performed on desktop computers. Mobile user interactions are typically event-driven and episodic (calling a friend for directions on the way to a party, for example) as opposed to structured tasks that occupy the user's undivided attention (writing a report, for example). Consequently, existing smartphone interfaces, which largely employ GUIs designed for desktop computers, are often cumbersome for mobile user tasks. Here, I discuss Gesture Search, which enables random access to smartphone data and functionalities, in contrast to existing WYSIWYG, GUI-oriented interaction.

TRADITIONAL GESTURE SHORTCUTS

Gesture shortcuts have been extensively explored.1 The approach associates a gesture with a target data item or command, and a user can invoke the target simply by performing the gesture. Because a touchscreen is the major interaction medium for these devices, the gestures of concern here are 2D trajectories generated by a user sliding his or her finger across the touchscreen. ("Gestures" has also been used to refer to finger actions for directly manipulating interfaces, but that's not my focus here.)

Although gesture shortcuts let a user quickly invoke a target, they're traditionally limited in two ways. First, before a user can benefit from them, he or she must define how each shortcut is associated with a target. Second, gesture shortcuts don't scale well, because memorizing the mapping of many shortcuts is difficult.

Gesture Search (www.google.com/mobile/gesture-search/)2 helps address these challenges by letting a user gesture the alphanumeric name of the target's keywords. This essentially leverages mnemonic information that users have already processed in their memory, eliminating the need to define a shortcut. Gesture Search's deployment to hundreds of thousands of users in the Android market has proven the effectiveness of combining gestures and search for random access to smartphone functionality.

GESTURE SEARCH

Gesture Search starts as a blank canvas on which the user can draw anywhere. This design requires low precision of user input, in contrast to using a dedicated drawing area.


Figure 1. Using Gesture Search, it takes only two gestures to bring up the Bluetooth setting: (a) the user draws a b on the screen; (b) Gesture Search displays the search results; (c) the user then draws an l on the screen; and (d) Gesture Search updates the results.

Changing a Setting

Assume a user wants to turn on the phone's Bluetooth setting, which currently requires going through a set of menus and screens. For example, on an Android phone, starting from the Home screen, the user must open the menu, select Settings, click on Wireless & networks, and finally select Bluetooth. This assumes the user knows where the Bluetooth setting is located; otherwise, finding the setting will require more time and effort.

In Gesture Search, a user can quickly reach the Bluetooth setting by drawing a few strokes. The user first draws a character b on the screen, and Gesture Search instantly shows all items starting with the letter b or B. Meanwhile, the raw stroke is scaled down and displayed at the bottom of the screen as a gesture query (see Figure 1a). Because the gesture also looks like the letter s, the system returns the items that match s as well. Showing the raw gesture instead of a recognized letter tolerates the ambiguity of hand-drawn gestures. It lets the user focus on search results instead of intermediate recognition results, which can be imperfect (for example, b versus s). In addition, Gesture Search shows the recognition in the context of search results by highlighting the matches (see Figure 1b),

which informs the user why each search result was selected.

The user then draws another character, l, directly on top of the search results (see Figure 1c). The system immediately refines the search results based on the updated gesture query (see Figure 1d). It takes only two gestures to bring up the Bluetooth item, and the user can then tap on it to open the Bluetooth setting directly. If a target item isn't shown, the user can keep adding gestures to further refine the results. Alternatively, the user can scroll through the search result list to find the item manually. Gesture Search automatically distinguishes scrolling from gesturing by analyzing the touch trajectory.

Because Gesture Search employs prefix-based matching, the user doesn't need to finish an entire word or term to fetch a result. To narrow down the search, a user can enter multiple prefixes that map to multiple terms, for example, c c for the Custom Care contact. The user adds a space between the two letters by drawing a horizontal line (moving from left to right) in the query. A user can backspace over the last gesture in the query by drawing a horizontal line from right to left, or clear the entire query by drawing a rightward horizontal line at the bottom of the screen.
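To make the multiple-prefix matching concrete, the following sketch shows one way such matching could work. It is an illustrative approximation only, not Gesture Search's actual implementation; the class and method names are hypothetical.

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative sketch of multiple-prefix matching (e.g., the query
    // "c c" matching "Custom Care"). Not Gesture Search's actual code.
    public class PrefixMatcher {

      // Returns true if each query prefix matches a distinct word of
      // the item name, in order.
      static boolean matches(String itemName, List<String> prefixes) {
        String[] words = itemName.toLowerCase().split("\\s+");
        int w = 0;
        for (String prefix : prefixes) {
          String p = prefix.toLowerCase();
          // Advance to the next word that starts with this prefix.
          while (w < words.length && !words[w].startsWith(p)) {
            w++;
          }
          if (w == words.length) {
            return false; // this prefix matched no remaining word
          }
          w++; // consume the matched word before trying the next prefix
        }
        return true;
      }

      // Filters a list of items down to those matching all prefixes.
      static List<String> search(List<String> items, List<String> prefixes) {
        List<String> results = new ArrayList<>();
        for (String item : items) {
          if (matches(item, prefixes)) {
            results.add(item);
          }
        }
        return results;
      }
    }

Under this scheme, search(items, List.of("c", "c")) would retain "Custom Care" while filtering out items that lack two words beginning with c.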

Improving Search Performance

Gesture Search learns from search history to improve search performance in many ways. In the Bluetooth example, instead of using bl to retrieve the setting, the next time the user draws a single gesture b, Gesture Search will list the Bluetooth setting as the top choice. Learning from history not only improves gesture recognition but also captures the user's mental model of how shorthand prefixes are associated with each target. Notice that this mental model evolves as a user's activities change over time. Gradually, Gesture Search automatically defines a set of optimal gesture shortcuts for each user. For privacy reasons, search history is currently maintained on the user's device.

Activating Gesture Search

Activating Gesture Search involves performing a specific motion called Double Flip: the user simply flips the phone away and then back (see www.youtube.com/watch?v=gD3ZYKIqj7A). This feature saves the user from searching for Gesture Search on the phone in the first place. Double Flip is active whenever the phone is unlocked. It is currently an experimental feature, and its motion detection and battery usage still must be improved; a user can turn the feature on in Gesture Search's settings. Besides motion activation, Gesture Search also creates a home screen shortcut for easy activation.

Interacting Eyes-Free

The coarse gesture-based interaction style of Gesture Search affords eyes-free interaction. Because Gesture Search uses the entire screen as the gesture input area, a user can use the phone as the frame of reference and doesn't need to look at the screen while gesturing. When the phone's accessibility support is turned on, Gesture Search lets a user step through search results using an arrow key and pronounces each result as it is focused.
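Returning to the learning behavior described under "Improving Search Performance": the article doesn't detail the mechanism, but its visible effect (a single b promoting the Bluetooth setting after it has been chosen once) can be approximated by counting how often each query led to each target and re-ranking accordingly. The sketch below is hypothetical, not Gesture Search's actual implementation.

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch of history-based re-ranking; the article does
    // not describe Gesture Search's actual learning mechanism.
    public class SearchHistory {

      // Maps a query string to per-target selection counts.
      private final Map<String, Map<String, Integer>> counts = new HashMap<>();

      // Record that the user picked `target` after entering `query`.
      public void recordSelection(String query, String target) {
        counts.computeIfAbsent(query, q -> new HashMap<>())
              .merge(target, 1, Integer::sum);
      }

      // Re-rank results so targets previously chosen for this query
      // float to the top; ties keep their original match-based order.
      public void rerank(String query, List<String> results) {
        Map<String, Integer> forQuery =
            counts.getOrDefault(query, new HashMap<>());
        results.sort((a, b) ->
            forQuery.getOrDefault(b, 0) - forQuery.getOrDefault(a, 0));
      }
    }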


GESTURE SEARCH FOR APPS

Currently, Gesture Search lets a user access five sources of data and functionality: contacts, applications, settings, music tracks, and bookmarks. These satisfy a major portion of a user's daily needs. However, users might want to access specific data sources that Gesture Search doesn't support. To let users benefit from Gesture Search in these situations, an application can invoke Gesture Search externally and have it search the application's own data.

For example, a developer might want to create a mobile ordering application for a restaurant to help customers search for a particular menu item in a list of hundreds of entries. Similarly, a modern mobile application that's rich in functions (such as a Web browser) might have tens of menu options or settings, often organized hierarchically (with top- and lower-level menus). It can be difficult and time-consuming for a user to discover and find a feature. With Gesture Search, developers can organize the data (or options) in a single list so that the user can find an item by simply drawing gestures.

Let's assume a developer wants to embed Gesture Search into an application that lets users find information about specific countries by drawing gestures on the touchscreen. To use Gesture Search in the application, the developer first must create a content provider (in accordance with the format required by the Android Search framework3). The developer can then invoke Gesture Search from the application and pass this data source to Gesture Search. This requires creating an Android Intent that contains the content provider's URI and uses the following action:

    com.google.android.apps.gesturesearch.SEARCH

Calling startActivityForResult invokes Gesture Search (see Figure 2).
    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
      super.onCreateOptionsMenu(menu);
      menu.add(0, GESTURE_SEARCH_ID, 0, R.string.menu_gesture_search)
          .setShortcut('0', 'g')
          .setIcon(android.R.drawable.ic_menu_search);
      return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
      switch (item.getItemId()) {
        case GESTURE_SEARCH_ID:
          try {
            Intent intent = new Intent();
            intent.setAction("com.google.android.apps.gesturesearch.SEARCH");
            intent.setData(SuggestionProvider.CONTENT_URI);
            intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
            intent.putExtra(SHOW_MODE, SHOW_ALL);
            intent.putExtra(THEME, THEME_LIGHT);
            startActivityForResult(intent, GESTURE_SEARCH_ID);
          } catch (ActivityNotFoundException e) {
            Log.e("GestureSearchExample", "Gesture Search is not installed");
          }
          break;
      }
      return super.onOptionsItemSelected(item);
    }
Figure 2. A code snippet for invoking Gesture Search from an application.

The code snippet intent.putExtra(SHOW_MODE, SHOW_ALL) specifies that all country names should appear when Gesture Search is invoked. Alternatively, the code could be parameterized to show the country names that a user recently clicked on, or only a blank screen (enough to prompt a trained user to make a gesture). Gesture Search then presents a list of all the country names. A user can certainly scroll through the list to find a target country. However, a more pleasant way is to draw gestures directly on top of the list and have the target item jump to the top (see Figure 3).
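For concreteness, the content provider mentioned earlier exposes its items through a cursor whose columns follow the Android search-suggestion format (see reference 3 for the exact requirements). The following is a minimal, hypothetical sketch; the authority string and country list are placeholders.

    import android.app.SearchManager;
    import android.content.ContentProvider;
    import android.content.ContentValues;
    import android.database.Cursor;
    import android.database.MatrixCursor;
    import android.net.Uri;
    import android.provider.BaseColumns;

    // Hypothetical minimal suggestion provider backing the country example.
    public class SuggestionProvider extends ContentProvider {
      public static final Uri CONTENT_URI =
          Uri.parse("content://com.example.countries/countries");

      // Column names required by the Android search framework.
      private static final String[] COLUMNS = {
          BaseColumns._ID,
          SearchManager.SUGGEST_COLUMN_TEXT_1,       // text shown in the list
          SearchManager.SUGGEST_COLUMN_INTENT_DATA,  // data returned on selection
      };

      private static final String[] COUNTRIES = { "Australia", "Brazil", "Canada" };

      @Override
      public Cursor query(Uri uri, String[] projection, String selection,
                          String[] selectionArgs, String sortOrder) {
        MatrixCursor cursor = new MatrixCursor(COLUMNS);
        for (int i = 0; i < COUNTRIES.length; i++) {
          cursor.addRow(new Object[] { i, COUNTRIES[i], CONTENT_URI + "/" + i });
        }
        return cursor;
      }

      // Remaining ContentProvider methods are stubs in this read-only example.
      @Override public boolean onCreate() { return true; }
      @Override public String getType(Uri uri) { return null; }
      @Override public Uri insert(Uri uri, ContentValues values) { return null; }
      @Override public int delete(Uri uri, String where, String[] args) { return 0; }
      @Override public int update(Uri uri, ContentValues values, String where,
                                  String[] args) { return 0; }
    }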

Figure 3. Gesture Search helps a user find a target in an application-specific data source.




When a user clicks on a country name, Gesture Search exits and returns the result to the calling application. (For more details about how to integrate Gesture Search with an Android application, click on API at the bottom of www.google.com/mobile/gesture-search/.)
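On the calling side, the selection comes back through the standard Android activity-result callback. The exact contents of the returned Intent are specified in the Gesture Search API documentation; the sketch below simply assumes the selected item's URI arrives as the Intent data.

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
      super.onActivityResult(requestCode, resultCode, data);
      if (requestCode == GESTURE_SEARCH_ID && resultCode == RESULT_OK
          && data != null) {
        // Assumption: the selected item's URI is returned as the Intent data;
        // consult the Gesture Search API documentation for the exact contract.
        Uri selectedItem = data.getData();
        Log.i("GestureSearchExample", "Selected: " + selectedItem);
      }
    }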


Gesture Search has enabled a new way of interacting with smartphones. It supports random access to a phone's content and functionality using gesture shortcuts, so users no longer need to manually search and navigate through the interface hierarchy.

Yang Li is a research scientist at Google. Contact him at yangli@acm.org.


REFERENCES

1. C.G. Wolf, "Can People Use Gesture Commands?" ACM SIGCHI Bulletin, vol. 18, no. 2, 1986, pp. 73–74.

2. Y. Li, "Gesture Search: A Tool for Fast Mobile Data Access," Proc. ACM Symp. User Interface Software and Technology (UIST '10), ACM Press, 2010, pp. 87–96.

3. "Building a Suggestion Table," Android Developers, 2011; http://developer.android.com/guide/topics/search/adding-custom-suggestions.html#SuggestionTable.
