User Interface Requirements for Google Glass
Glass has some specific user interface requirements that you need to follow, not only to create a great app, but also to give users the features they expect after they’ve used other Glass apps.
Don’t just port an app written for a smartphone, tablet, or laptop to Glass, because the user interfaces for those other devices are fundamentally different from the Glass user interface.
Keep three main design principles at the top of your mind as you create your app:
Ensure that the information on the screen doesn’t distract the user from what’s going on in the real world in front of him.
Show information that’s relevant and timely.
Use an interface that integrates both visual and audial models.
Glass is designed to provide quick information that adds to what the user is experiencing in real life. Too much information, however, subtracts from the user’s experience. Users are looking at real life as well as information on the Glass screen, so it’s important to make the information displayed on your Glassware as unobtrusive as possible.
One good example of how to use the Glass interface correctly is Google Search. If a user searches for jellyfish while viewing the creatures in an aquarium, for example, a brief definition of jellyfish appears on her Glass screen just above her line of sight. The definition doesn’t detract from her view of the jellyfish in the aquarium tank in front of her.
What’s more, information provided at inappropriate times will likely cause users to remove your Glassware, and to tell all their friends to avoid it, too. Consider the example shown. The user wakes up very early in the morning and puts on his Glass to see what’s going on. A few seconds later, he sees an ad for a cabbage sale.
Chances are pretty good that most users don’t need cabbage at 3:37 a.m., and most grocery stores are closed at that time anyway. If the user regularly shops at a 24-hour supermarket, however, and needs to get cabbage for the day’s holiday meal right away, the ad may be appropriate.
Your app can use GPS data from the user’s paired smartphone to “see” that the user is in or near the store and then display information about the cabbage sale.
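The location-plus-timing logic described above can be sketched in plain Java. This is only an illustration, not Glass API code: the class name, the 100-meter radius, and the store hours are all invented for the example, and a real app would get coordinates from the paired phone’s location services.

```java
// Hypothetical sketch: deciding whether a store ad is relevant right now,
// assuming we already have the user's latitude/longitude from the paired
// phone's GPS. All names, thresholds, and store hours are made up.
public class StoreProximity {

    static final double EARTH_RADIUS_M = 6_371_000.0;

    // Great-circle distance between two lat/lon points (haversine formula).
    static double distanceMeters(double lat1, double lon1,
                                 double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    // Show the sale card only if the user is within 100 m of the store
    // and the store is open at the given hour (24-hour clock).
    static boolean shouldShowSale(double userLat, double userLon,
                                  double storeLat, double storeLon,
                                  int hourOfDay, boolean open24Hours) {
        boolean nearStore =
            distanceMeters(userLat, userLon, storeLat, storeLon) < 100;
        boolean storeOpen = open24Hours || (hourOfDay >= 8 && hourOfDay < 22);
        return nearStore && storeOpen;
    }

    public static void main(String[] args) {
        // User standing just outside a 24-hour supermarket at 3 a.m.
        System.out.println(shouldShowSale(37.4220, -122.0841,
                                          37.4221, -122.0841, 3, true));
        // Same spot and time, but the store keeps normal hours.
        System.out.println(shouldShowSale(37.4220, -122.0841,
                                          37.4221, -122.0841, 3, false));
    }
}
```

Gating on both proximity and opening hours is the point: either check alone would have shown the 3:37 a.m. cabbage ad to the wrong user.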
The information on the screen should be both current and relevant to what the user is seeing on her Glass so she’s convinced that your Glassware is important. If the user isn’t convinced, she won’t let your wares take up valuable storage space on her Glass.
If a user needs to pop over to the grocery store to pick up some dinner, for example, she can use the Evernote app to display the shopping list on the Glass screen. This app saves the Glass user time because she doesn’t have to fumble for her smartphone or a handwritten list to find out what’s needed. Instead, she can focus on shopping.
Users are going to use their voices to control Glass and Glassware, so it’s important to implement a visual and audial user model. After all, a user can start any app by using his voice, and there are plenty of examples of how to use voice control within an app. The following figure, for example, shows the use of voice control in the Google Search app.
If you’re going to include voice interaction in your Glassware, keep in mind the Glass rule that a voice command or message is initiated or sent only after the user stops talking.
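The “send after the user stops talking” rule boils down to an end-of-speech timeout. The real Glass speech recognizer handles this internally; the toy class below just illustrates the idea under invented assumptions: buffer words as they arrive and commit the command only once the silence gap passes a threshold (the class name, method names, and the 1.5-second threshold are all hypothetical).

```java
// Hypothetical sketch of end-of-speech detection: accumulate recognized
// words with timestamps, and treat the utterance as finished only after
// a silence gap longer than silenceThresholdMs.
public class SilenceCommitter {
    private final long silenceThresholdMs;
    private final StringBuilder buffer = new StringBuilder();
    private long lastWordAtMs = -1;  // -1 means nothing buffered

    public SilenceCommitter(long silenceThresholdMs) {
        this.silenceThresholdMs = silenceThresholdMs;
    }

    // Called each time the recognizer produces a word, with its timestamp.
    public void onWord(String word, long nowMs) {
        if (buffer.length() > 0) buffer.append(' ');
        buffer.append(word);
        lastWordAtMs = nowMs;
    }

    // Polled periodically; returns the finished command once the user has
    // been silent long enough, or null while we're still listening.
    public String pollCommand(long nowMs) {
        if (lastWordAtMs < 0 || nowMs - lastWordAtMs < silenceThresholdMs) {
            return null;
        }
        String command = buffer.toString();
        buffer.setLength(0);
        lastWordAtMs = -1;
        return command;
    }

    public static void main(String[] args) {
        SilenceCommitter sc = new SilenceCommitter(1500);
        sc.onWord("ok", 0);
        sc.onWord("glass", 400);
        sc.onWord("get", 900);
        sc.onWord("directions", 1400);
        System.out.println(sc.pollCommand(2000)); // null: only 600 ms quiet
        System.out.println(sc.pollCommand(3000)); // "ok glass get directions"
    }
}
```

The practical consequence for your Glassware is the same either way: don’t act on a partial phrase, because the user may still be mid-sentence.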