SWAY
SUMMER 2018

succulents grow in a terrarium modeled from their wilder environment


AR Research

Includes environmental context mapping and real world explorations to better understand user goals within the defined environment.
 

IN THIS SECTION

AR Overview

"3D virtual objects integrated into a 3D real environment in real time" [note]https://www.cs.unc.edu/~azuma/ARpresence.pdf[/note], [note]http://ieeexplore.ieee.org/document/6681863/[/note].

 
AR PROPERTIES
  • Combination of virtual elements and real environment
  • Three-dimensional registration of virtual information
  • Real-time interactivity with virtual elements

 

IDEAL APPLICATIONS
  • Illustrating spatial and temporal concepts
  • Emphasizing contextual relationships between real and virtual
  • Providing intuitive interaction
  • Visualizing and interacting in 3D
  • Facilitating collaboration

 

AR AFFORDANCES
  • Real world annotation
  • Contextual visualization: presentation of virtual information in the context of a real environment
  • Vision-haptic visualization

Context

Contextual Awareness

AR and VR get tossed in the same bucket, but AR is no more similar to VR than it is to flat interactions (web, apps, kiosks, etc.) in general.

Mixed reality interfaces provide something critical and intangible that neither VR nor flat interactions can provide easily — environmental context — innate and in real time. With virtual world-building, the context needs to be designed and built-in. With flat interactions, the context has to be borrowed or forced (remember design’s skeuomorphic phase).

The most usable applications of this tech are going to be the ones that focus on the shifting problems of real environments, and how those problems affect people, nature, and other complex systems.


Environment

Environmental Context

To define the environment, I looked at distance, direction, surface, and immersion potential. What does the environment consist of? What are its limitations? Where are its entry and exit points?

OVERALL

Trees
Rocks
Dirt
Trails
Streams
Hills
Bugs
Wildlife
Weather
Temperature
Time of day
Visibility
Terrain

FOREST

Foggy
Rainy
Steep
Dark
Muddy
Overgrown
Dangerous
Sharp
Poisonous

TREES

Circumference
Distance between
Hazard level
Height

LIMITATIONS

No clean, running water
Limited restrooms
No electricity
No heat
Limited shelter
Limited mobility
No roads
No cars

ENTRY AND EXIT

Roads
Trails
City parks
State parks
Private property
Public property

Personal

Personal Context

Much of the personal context is already defined from user research. We can dig a little deeper into it from the environmental side and find an insight or two, but I would save that for the next round of experience testing and continue to refine the relationship between the user and the environment.

  • How does the user relate to the environment?
  • How is it meaningful to them?
  • What meaning, ideas and assumptions do they bring to it?

 

Proxemic

Social Context

Look into what brings people to natural environments, explore the cultural context of hammocking (holidays and seasons), and examine how it bridges group and solitary activity.

 

PROXEMIC ZONES

Part of the social context for an AR app includes how comfortable a person feels using it in public. 

  • How does interpersonal distance change when in nature?
  • Does this affect willingness and ability to interact with an AR app?
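As a reference point, Edward Hall's commonly cited proxemic zone boundaries can be sketched as data. The exact edges vary by source, and this project treats five meters as the practical social edge, so treat these figures as a baseline rather than a spec:

```python
# Hall's proxemic zones, approximate outer edges in meters.
# Boundaries vary by source; these are the commonly cited figures.
PROXEMIC_ZONES_M = {
    "intimate": 0.45,
    "personal": 1.2,
    "social": 3.7,
    "public": 7.6,
}

def zone(distance_m: float) -> str:
    """Classify an interpersonal distance into a proxemic zone."""
    for name, outer_edge in PROXEMIC_ZONES_M.items():
        if distance_m <= outer_edge:
            return name
    return "beyond public"
```

A quick check: a stranger standing three meters away sits in the social zone, while someone five meters out is already in the public zone by Hall's numbers, which is part of why five meters is a comfortable scan range.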

Range

AR Research Explorations

For my initial explorations, I did non-technical, real-world immersive walk-throughs of the target environment. I focused on how I was holding the phone, and made notes to log for future tests. Of note was how far someone held the phone from their body, at what position above or below their line of sight, and how often they moved the phone up and down and side to side in an attempt to get the “full picture” of a scene.

The following two explorations highlight initial findings.
 

Exploration 1

AR EXPLORATION

Walk the Scene


RESEARCH FOCUS: Experience of looking through camera view and seeing a representational hammock in a natural setting. 

METHODOLOGY: Paper prototype, real world environment.

IMPLEMENTATION: Holding a transparent sheet in front of the camera viewfinder while the camera is pointed at a group of trees.

Mixed Light

Mixed Light Matrix

This exploration was instrumental in understanding lighting changes, and in defining the mixed light matrix: the various lighting situations a person may find themselves in while using the app.

With the way sunlight filters through trees, you could easily find yourself in this common lighting situation while in the mountains or forest. The sun could be in your eyes, but shade on your screen, pointing at a mixed light scene [note] For further reading, http://www8.cs.umu.se/education/examina/Rapporter/WajidAli.pdf discusses similar problems with mixed light environments in tree detection[/note].

As you move around a grouping of trees with your phone to find spots, the shade is now in your eyes, but the sun is hitting your screen. You could experience this strobe effect multiple times during a session.
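Treated as data, the matrix is simply every combination of two light states for the user's eyes and the phone screen. The labels below are my own shorthand, not terms from the research:

```python
from itertools import product

# Sketch of the mixed light matrix: each cell pairs the light hitting the
# user's eyes with the light hitting the phone screen.
LIGHT_STATES = ["sun", "shade"]

def mixed_light_matrix():
    """Return every (eyes, screen) lighting combination a user may cycle
    through while moving around a grove."""
    return [
        {"eyes": eyes, "screen": screen, "mixed": eyes != screen}
        for eyes, screen in product(LIGHT_STATES, LIGHT_STATES)
    ]

matrix = mixed_light_matrix()
# Two of the four cells are mixed-light situations (sun/shade or shade/sun),
# and those are the ones that produce the strobe effect described above.
```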


There are other environmental scenarios where lighting plays a necessary and important role in the user experience.

FOG: Difficult to find edges and estimate distance

SNOW: Difficult to estimate height from ground

RAIN: Lowered edge detection, rain may affect sensors

FIRE: Smoke makes it difficult to see anything

 

Low Light Capabilities

A short browse through the ARKit [note]https://developer.apple.com/documentation/arkit/arlightestimate[/note] and ARCore [note]https://developers.google.com/ar/reference/java/com/google/ar/core/LightEstimate[/note] SDKs shows that low light capabilities are limited to the primary use case, well-lit indoor spaces, in these early releases of the technology. Environmental factors common outdoors, like rain or darkness, can limit the functionality of the design.

Mapping and prototyping these scenarios can help engineers understand how design is thinking about interactions with outdoor, natural environments.
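One way to prototype that conversation is a sketch that buckets an ambient light estimate into coarse conditions the design can respond to. ARKit's `ARLightEstimate.ambientIntensity` reports roughly 1000 lumens for a neutral, well-lit scene; the thresholds below are illustrative assumptions, not values from either SDK:

```python
# Hypothetical sketch: gate AR features on an ambient light estimate.
# Threshold values are assumptions for illustration only.

def lighting_condition(ambient_intensity: float) -> str:
    """Map an ambient intensity estimate (lumens) to a usability bucket."""
    if ambient_intensity >= 800:
        return "well-lit"   # the SDKs' primary use case
    if ambient_intensity >= 300:
        return "mixed"      # warn: tracking may degrade under tree cover
    return "low-light"      # fall back to a non-AR spot list at dusk
```

A design could use the "mixed" bucket to surface the strobe-effect warning from the mixed light matrix, and the "low-light" bucket to degrade gracefully instead of failing silently.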

Exploration 2

AR EXPLORATION

Measuring a Grove

 

RESEARCH FOCUS

Understanding distance and immersive potential.

  • Visibility potential of a line in 3D space
  • Modeling a real grove of trees for hang spots
  • Understanding the accuracy of eyeballing distance


METHODOLOGY

I took the hammock model for one of my primary personas (Robin), a spool of string, and a tape measure, and set out in search of a workable grove of trees.


MATERIALS
  • Persona hammock model
  • Spool of ribbon (white, which in hindsight was less than ideal)
 
IMPLEMENTATION

When I found a large enough grove, I measured out two pieces of string based on my hammock model: min height for hang point (6.2 ft) and min distance between trees (10 ft).

I measured the distances between all the trees in the grove to see which ones had about 10 feet of distance between them. I ignored all the trees that didn't fit within the model.

After identifying workable trees, I took my second measurement string for min height and marked off with ribbon the hang point for each tree.

From there, I took the spool of ribbon and wrapped lines from one hang point to the next all around the grove.
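The manual measurement above can be sketched as a spot-finding routine. Only the 10 ft minimum comes from the hammock model; the upper bound and the 2D tree positions are assumptions for illustration:

```python
from itertools import combinations
from math import dist

MIN_DISTANCE_FT = 10.0   # from the hammock model
MAX_DISTANCE_FT = 15.0   # hypothetical upper bound, for illustration

def workable_spots(trees):
    """Return tree pairs whose spacing fits the hammock model.

    `trees` maps a tree id to its (x, y) position in feet.
    """
    spots = []
    for (a, pos_a), (b, pos_b) in combinations(trees.items(), 2):
        if MIN_DISTANCE_FT <= dist(pos_a, pos_b) <= MAX_DISTANCE_FT:
            spots.append((a, b))
    return spots

# A toy grove: t1 and t2 are 12 ft apart (workable); t3 is too far from both.
grove = {"t1": (0, 0), "t2": (12, 0), "t3": (40, 0)}
```

A markerless version of the app would need to do exactly this pairwise check, but with tree positions estimated from the camera rather than a tape measure.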

Analysis

Right away, just having the 10ft string to walk and measure between trees showed me how off my original eyeballing had been. I needed more space between trees than my mind visualized.

With the abundance of trees, spaced at what seemed like reasonable distances, almost every tree could be a potential spot. I found 5 good spots in an area that covered about 8 trees.

I found myself needing to move around in a larger perimeter circle in order to get a clear view of all the options and the accurate distances between trees. The more trees, the more blocks to my line-of-sight, hence the need to constantly shift perspective.

Fatigue

Fatigue Feedback

The basic interaction is similar to taking a panoramic photo. The differences start with the user’s intention when looking into the viewfinder. When taking a photo or capturing video, you’re active and in anticipation, waiting to capture a moment. Because the scene is primary, controls gather around the edges and out of the way.

With AR, in this implementation, the scene is half as important as the digital information woven throughout. Instead of waiting to pounce on the shutter or miss the moment, the intention is centered around comparing the digital scene to the real scene, and exploring both. This will most likely translate to more physical movement, and for a longer duration, in comparison to photo-taking.


Range of Motion

For the purposes of understanding the user’s natural perspective and motion, I’m leaning heavily on Dreyfuss’ "Measure of Man and Woman" [note]http://design.data.free.fr/RUCHE/documents/Ergonomie%20Henry%20DREYFUS.pdf[/note] for my baseline human factors variables.

I’ve integrated mobile phone posture ranges [note]http://www.auspicesafety.com/2017/01/17/text-neck-forward-head-posture/[/note] into my models to illustrate how much a context switch between phone and real world can trigger a considerable amount of head and body movement outside comfortable ranges.

AR user experience

Defining AR Experience

Design for a first-person perspective, with an open discovery viewpoint. Optimize for a medium mixed-reality immersion level, made viable with a five-meter interaction range.


DEGREE OF IMMERSION
  • A medium overall mixed reality immersion level
  • Spatial cues into augmented world
  • Virtual hammock lines on real-world trees

FIELD OF VIEW
  • 360º view
  • Five meter scan range

Five meters is defined as the edge of both the social proxemic zone and comfortable depth perception.


ORIENTATION
  • Natural environment provides directionality
  • On-screen compass 

For an outdoors AR app, this variable may manifest in the form of a compass. Orientation aids are primarily defined for VR, where you can't rely on your natural senses to detect a change in direction.

Technical Feasibility

The role of design is to envision what can be, such that engineering can make it so. Knowing something is implementable is critical to product development, and this UX research and design project includes engineering assessments.

The major technical challenge is how successfully we can train a system to be markerless in an environment that thrives on everything blending in; getting a computer to see the forest from the trees.

AR Markers

At the time of writing this, there are two primary ways to overlay virtual elements onto real-world scenes: with external markers [note]https://www.kudan.eu/kudan-news/augmented-reality-fundamentals-markers/[/note] and without [note]https://www.marxentlabs.com/markerless-augmented-reality-everything-you-need-to-know/[/note]. With markers, the system is given a pre-defined pattern and asked to match it. 

Markerless AR involves dynamic pattern-matching, immersing into the context of a scene. From a user-centered perspective, implementing a markerless interaction model is ideal.

AR markers are still useful: marker-based prototyping can assist in usability testing for an augmented reality app even if it will ship as a markerless product. For example, putting markers on every tree isn't desirable or scalable for a user product, but marker-based prototypes can still be useful for testing hammock spot searches in a single grove of trees.

Adjacent AR Research

There are apps on the market that utilize augmented reality to help with the measurement of an area and placement of objects in real space. These make useful adjacent research opportunities, since they share similar goals (helping the user measure something) and use similar technologies (augmented reality) to accomplish that goal.

NEXT

Design Strategy

design criteria and engagement strategies


ERIS STASSI  |  EXPERIENCE DESIGN

 © 2019  |  CONTACT