I recently completed a short course on augmented reality and ARCore, Google’s augmented reality engine. Basically, augmented reality offers a layer of information between the user and the world through a device such as a smartphone or headset. That information might include environmental data, biometric data, visual representations, user instructions, and much more. While AR isn’t new, its potential is emerging rapidly. The following post brings together 11 takeaways from that course.
1. Target smartphones for your first AR app.
Augmented reality experiences reach audiences through two delivery modes: headsets and smartphones. For most people, smartphones will be the gateway to AR due to convenience and cost. Microsoft’s HoloLens 2, for example, costs $3,500 per device, whereas many people already carry a smartphone. Many smartphones can run AR apps even if they lack the software and hardware to create them.
2. AR has many uses.
One key advantage of AR is that it adapts to everyday life instead of requiring an entirely new world, as virtual reality does. That connection with everyday life allows for great flexibility in its applications. Shopping options are among the most popular; just choose a pouf and see if it (virtually) fits at your feet. Architecture, warehouses, and brand campaigns also are possibilities for AR applications.
3. Education has many uses for AR, too.
AR offers opportunities for education, particularly with spatial content and complex subjects. Examples for secondary education might include extreme weather, DNA strands, and sculptures. Medical fields use AR for patient diagnoses, procedures, and treatment plans. Even the first AR application was education related: training for airplane engineers.
4. Breaking immersion happens in AR.
Immersion refers to the user’s sense of being part of this created world. The primary way to break immersion is through messing with how objects behave, such as floating a lamp in mid-air or making a snake slither across the ceiling. In other words, objects should behave in AR as they would in real life.
(I found this lesson interesting as the class video showed a dancing pizza slice. Maybe it was channeling Dean Martin?)
5. Navigation is hard.
Navigation within AR experiences poses some unique challenges without clear answers. Do you include menus? How much information appears on each menu? How do people access these menus — tap, voice, swipe, something else? How do these menus affect interaction and design? Can you get away with not having any menus? Can a toggle be used to hide the menu when it’s not needed?
6. They will try to break it.
Users are wily. Some will try to mess with the application and see what they can get away with. Designers should be aware of what users might try to do, but they need not create materials for everything.
For example, some AR applications allow the device to pass through virtual objects. Sometimes, this technique can be quite cool. The app LightSpace allows you to draw in the world before you and then walk through the drawing that you created.
Sometimes, users will pass through AR objects because they can. Designers need not create worlds inside those objects unless they seek to make a joke or point.
7. Turn on the lights.
AR requires plenty of light in order to work correctly. The technologies have not quite caught up with working in dimmer environments.
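As a rough illustration of the kind of check an app might make, ARCore reports a per-frame light estimate as an average pixel intensity (roughly 0.0 for dark to 1.0 for bright), and an app could use it to warn the user when the scene looks too dark to track well. Here is a minimal Python sketch of that logic; the threshold is a made-up example value, not an official one.

```python
# Illustrative sketch only: ARCore exposes a per-frame light estimate
# (average pixel intensity, roughly 0.0 = dark to 1.0 = bright).
# The cutoff below is a hypothetical example, not an ARCore constant.

DIM_THRESHOLD = 0.25  # invented cutoff for "too dark to track well"

def lighting_ok(pixel_intensity: float) -> bool:
    """Return True if the estimated scene brightness seems workable."""
    return pixel_intensity >= DIM_THRESHOLD

# An app might use this check to prompt the user:
if not lighting_ok(0.1):
    print("Scene looks too dark. Try turning on the lights.")
```

A real app would read the intensity from the engine’s light estimate each frame rather than hard-coding it, and would probably smooth the value over several frames before nagging the user.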
8. Use textured or unique surfaces.
One temptation might be to use a clean, white surface for AR. But AR engines need distinct visual features to help them map spaces.
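To see why a blank wall causes trouble, consider a toy proxy for “texture”: the spread of pixel intensities in a grayscale patch. Real engines track distinctive feature points rather than raw variance, so this is only a sketch, and the threshold is invented.

```python
# Illustrative sketch: feature-based tracking needs visual texture.
# A crude proxy for texture is the spread of pixel intensities in a
# grayscale patch; a clean white wall scores near zero. The threshold
# is a made-up example, not a parameter of any real AR engine.
from statistics import pstdev

TEXTURE_THRESHOLD = 10.0  # hypothetical, in 0-255 intensity units

def has_enough_texture(patch: list[int]) -> bool:
    """Return True if a flat list of grayscale pixels shows variation."""
    return pstdev(patch) >= TEXTURE_THRESHOLD

flat_wall  = [250, 251, 250, 249, 250, 251]  # near-uniform white
wood_grain = [90, 140, 60, 180, 110, 200]    # varied intensities
print(has_enough_texture(flat_wall))   # False
print(has_enough_texture(wood_grain))  # True
```

The takeaway is the same as the tip above: point the camera at a rug, a wood table, or a poster, not at a bare white surface.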
9. Use first person for UI development.
While virtual reality offers both first and third person, AR is best conceived through first person. A VR experience might play out with the user observing, but an AR experience requires the user to DO something in order for the app to function. This is why writing the app from a first-person perspective can be helpful for determining navigation options and user actions.
10. There is such a thing as “too real.”
This one struck me as odd at first. If characters appear almost, but not quite, perfectly human, then users might become creeped out. The term here is “uncanny valley,” first described in connection with how humans react to robots. According to the thinking, acceptance rises as a robot looks more humanlike, until it gets close to human without quite arriving, at which point people are likely to recoil.
11. Remember screen size when designing.
Though phone screen sizes seem to increase each year, they still are quite small. Finger sizes also vary widely. Both of these are important when thinking about menu design and navigation.
Basically, everything appearing on the screen must be essential to the AR app experience. Too many small elements on the screen create clutter and make it hard to tap just the right one.
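One concrete sizing rule worth knowing here: Material Design recommends touch targets of at least 48dp, and dp convert to physical pixels relative to a 160 dpi baseline. A quick sketch of that conversion:

```python
# Sketch of a common sizing rule: Material Design recommends touch
# targets of at least 48dp. Density-independent pixels (dp) are defined
# relative to a 160 dpi baseline, so physical pixel size scales with
# the screen's density.

MIN_TARGET_DP = 48  # Material Design's recommended minimum

def min_target_px(screen_dpi: float) -> int:
    """Minimum touch-target size in physical pixels for a given density."""
    return round(MIN_TARGET_DP * screen_dpi / 160)

print(min_target_px(160))  # 48  (baseline mdpi screen)
print(min_target_px(480))  # 144 (xxhdpi phone)
```

In other words, the denser the screen, the more pixels each tappable element needs, which is one more reason not to crowd an AR view with tiny menus.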
Placement is also important. AR apps usually place the action at the center of the screen, so it is best to avoid placing any other elements such as menus or icons there.
This class was created by Google, so at times it unfortunately felt more like an ad for the company’s products than a class. The class designers also assumed much knowledge on the learner’s part, making it a bit of a challenge to make sense of everything.