bARk
CASE STUDY

Overview

bARk is an augmented reality (AR) app that provides a virtual companion for animal lovers. The relationship with a virtual pet offers users both entertainment and emotional support, and what makes bARk special is that it lets users keep creating memories with their pet wherever they are. The project, designed by a team of undergraduate students studying Interactive Digital Media, consists of a high-fidelity Figma prototype of the onboarding process and a pet interaction screen with AR functionality built in Unity.

Context & Challenge

bARk uses AR technology to place a digital pet into a user’s natural surroundings. With their pet in their pocket, users can interact with it through the bARk app at any point during the day. bARk gives users an animal companion that can help fill the void left by a lost furry friend or offer the pet-owning experience to people who are unable to keep a pet of their own. Any pet lover with a compatible device can interact with a bARk digital pet, so those held back by rental rules, lifestyle costs, or allergies can still interact with and care for a cute, realistic furry friend.

Goal

The goal of bARk is to provide users with quick and easy access to a furry friend for emotional support, entertainment, and companionship. Over a period of six months, bARk’s team of six designers and developers has created a functional AR build and a high-fidelity prototype.

Process & Insight

UX

The UX team was responsible for researching how a user’s experience on a mobile AR application differs from a traditional mobile application, as well as how best to conduct usability tests for AR applications. The UX team focused on writing usability test scripts and creating card sorts to better understand how users interpreted bARk’s interface. Our goal was to understand exactly how users think rather than assuming what a user might understand or like about the interface.

UI

The UI team worked primarily in Figma, iterating on designs and prototypes to meet the deliverables set by the team. Every design decision made throughout the process was based on research into leading market trends as well as the results of user testing conducted by the UX team. There was a great deal of collaboration between the UI and UX teams, to the point that each functioned as quality assurance for the other’s work. This constant flow of communication was key to successfully meeting the needs of our users within the timeframe we were given. Over the course of six months and five prototypes, we reached our goal of a high-fidelity Figma prototype.

Prototype progression: low-fidelity, mid-fidelity, and high-fidelity screens

AR

The AR team spent this period working with a few different AR platforms, ultimately teaching themselves to build the prototype in Unity. The team started development in SparkAR, working on initial button functionality. After realizing how SparkAR would limit our design, we moved on to Unity, which was used to build out the 3D model interactions for the bARk app. Development started with making sure the camera could detect planes in AR. We then added functionality to spawn an object on a detected plane on load and linked in the 3D dog model as that object. Finally, we styled our interaction buttons and linked the 3D model’s animations to them to bring the app to life.
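As a rough illustration of the plane-detection and spawn-on-load steps described above, the sketch below shows how they might look with Unity’s AR Foundation package. The class name, the dogPrefab field, and the choice to spawn on the first detected plane are illustrative assumptions, not the team’s actual code.

```csharp
// Hypothetical sketch: spawn the dog prefab on the first plane AR Foundation detects.
// Assumes the AR Foundation package; field names are placeholders assigned in the Inspector.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class SpawnPetOnPlane : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager; // reports detected surfaces
    [SerializeField] GameObject dogPrefab;        // the 3D Husky model
    GameObject spawnedPet;

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // Spawn the pet once, on the first plane that appears.
        if (spawnedPet != null || args.added.Count == 0) return;

        ARPlane plane = args.added[0];
        spawnedPet = Instantiate(dogPrefab, plane.transform.position, Quaternion.identity);
    }
}
```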

We decided to begin AR development in SparkAR, the platform used to build Instagram and Facebook filters, mainly because it let us test our project directly on Instagram and Facebook.
Our initial progress in SparkAR consisted of creating a basic button setup by following one of the platform’s tutorials, which walked us through a simple interface with functioning buttons that changed animation states on screen.

AR Mockup

The team researched developing for AR in Unity before moving into the software to make the transition more seamless. We collected a few tutorials to follow in order to understand the interface and how to create an AR build. Unity can produce an iOS build that can be opened in Xcode and run on a personal mobile device. To test our progress after each development session, we would compile and run the build on a personal device, noting any changes or adjustments to address in the next session.
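For reference, an iOS build like the one described can also be kicked off from a small editor script; the snippet below is only a hypothetical sketch with a placeholder scene path and output folder, and the team may simply have used Unity’s Build Settings window.

```csharp
// Hypothetical editor script that produces the Xcode project described above.
// Scene path and output folder are placeholders, not values from the bARk project.
#if UNITY_EDITOR
using UnityEditor;

public static class IOSBuild
{
    [MenuItem("bARk/Build iOS")]
    public static void Build()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Main.unity" }, // placeholder scene path
            locationPathName = "Builds/iOS",               // Xcode project output folder
            target = BuildTarget.iOS,
            options = BuildOptions.None
        };
        BuildPipeline.BuildPlayer(options); // open the generated project in Xcode to run on device
    }
}
#endif
```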

AR Mockup

The Husky model had built-in animations that the team wanted to use, with the goal of having the model play an idle animation. To do this, the team needed to work in Unity’s Animator to link the animations to the Husky’s avatar. Adding functionality to the buttons was the most complicated part of this build. The team successfully added an open state to the three main buttons on the home page. This was done by creating a child element of the button that holds the open-menu image and turning that element off by default. Next, a script was written to check whether the element was on or off on click, turning it off if it was on and on if it was off. This script was linked to each button’s built-in onClick event, allowing the button menus to open on click; a sketch of this kind of script appears below.
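The toggle script described above might look roughly like the following; the openMenu field name and the use of AddListener (rather than wiring the method up in the Inspector) are assumptions for illustration.

```csharp
// Hypothetical reconstruction of the menu-toggle behaviour described above.
// Assumes the open-menu image lives on a child GameObject assigned in the Inspector.
using UnityEngine;
using UnityEngine.UI;

public class MenuToggle : MonoBehaviour
{
    [SerializeField] GameObject openMenu; // child element holding the open-menu image

    void Awake()
    {
        openMenu.SetActive(false);                           // menu starts closed
        GetComponent<Button>().onClick.AddListener(Toggle);  // hook into the built-in onClick
    }

    // Flip the child on/off each time the button is clicked.
    void Toggle() => openMenu.SetActive(!openMenu.activeSelf);
}
```

A similar listener can call Animator.SetTrigger on the Husky’s Animator to play the matching animation when a menu action is tapped.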

AR Mockup
Real AR Example

Results

Within this timeframe we produced a high-fidelity Figma prototype for pet creation. We finalized the design based on data collected through extensive user testing, building a semi-functional prototype that allows users to customize their pet. In Unity, we produced an iOS build with AR functionality that supports limited interaction with the pet.

bARk provides a new way to interact with pets through modern AR technology, giving users the experience of interacting with a virtual pet in any environment. This initial build of bARk shows that there is not only an opportunity for an app of this type in the industry, but also a desire for it.