Project Brief

The final project for my Programming 1 class was simply to create an interactive experience using the Processing library for Java. I decided to treat the assignment as a product design challenge and design an augmented reality application that helps address a problem I'm passionate about.

Problem Statement

“Roughly 90% of children’s toys are made of plastic, but less than 10% of that plastic ends up being recycled”

Ideation Stage

Brainstorming

Early in the brainstorming process, I decided to create an augmented reality app. An AR app would let the children's experience live entirely within the pages of a book, with the physical product constructed entirely of paper.

Because children are the target audience, I decided the theme of the experience should be the natural curiosity and imagination of children. I especially wanted to focus on children creating fantastical objects out of the mundane.

Prototyping

3D Modeling

Once I had a list of all the objects I would need in the rooms, both collectible puzzle pieces and static furniture items, I headed into Rhino to start modeling. I decided to create all the assets for a single room first, to try to get a one-room prototype working.

Programming

Developing in Processing was an interesting change of pace, as my past experience in AR development was in Unity. After some research on Processing, I found NyARToolkit, an AR library for Processing.

Unfortunately, all the official documentation is in Japanese, so I worked closely with Google Translate to learn the basics of the library.
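For context, a marker-based NyAR4psg sketch follows a simple loop: grab a camera frame, detect the printed marker, then draw the 3D scene in the marker's coordinate space. Below is a minimal sketch of that structure, loosely based on the library's bundled examples; the camera_para.dat calibration file and patt.hiro marker pattern are the sample files that ship with NyAR4psg, and the box is just a stand-in for my room geometry.

import processing.video.*;
import jp.nyatla.nyar4psg.*;

Capture cam;
MultiMarker marker;

void setup() {
  size(640, 480, P3D);
  cam = new Capture(this, 640, 480);
  // camera calibration data and printed marker pattern bundled with the library
  marker = new MultiMarker(this, width, height, "camera_para.dat", NyAR4PsgConfig.CONFIG_PSG);
  marker.addARMarker("patt.hiro", 80);  // 80 mm printed marker
  cam.start();
}

void draw() {
  if (!cam.available()) return;
  cam.read();
  marker.detect(cam);            // look for the marker in this frame
  marker.drawBackground(cam);    // draw the camera feed behind the 3D scene
  if (!marker.isExist(0)) return;
  marker.beginTransform(0);      // switch into the marker's coordinate space
  translate(0, 0, 20);
  box(40);                       // placeholder for the room models
  marker.endTransform();
}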

The Clicking Conundrum

Even though I was using a dedicated AR library for this application, adding pointers, which would normally handle clicking, caused the application to lag whenever the user clicked. I had to find another way to implement clicking so that the user could interact with the game.

The solution I came up with was to read the hex color of the pixel the user taps and compare it to the list of hex colors used in each room. This lets the program know which object is being tapped, and from there I can define click stages and event instructions for the different stages of the game. The downside is that every model needs a unique color within its room. For instance, to differentiate between the toilet and the bathtub, even though both are white, I had to give each a slightly different tint of white, sometimes differing by only one or two points of RGB.
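In Processing terms, the core of this approach fits in a mousePressed() handler: read the color under the cursor with get() and match it against the colors assigned to objects in the current room. Here is a rough sketch of the idea, with placeholder object names, color values, and stage logic rather than my actual palette and game rules.

// Hex colors assigned to objects in the bathroom scene (placeholder values)
color TOILET  = #FFFFFF;
color BATHTUB = #FEFFFF;  // one point of red away from the toilet's white
color COIN    = #D4AF37;

int clickStage = 0;  // which step of the puzzle the player is on

void mousePressed() {
  color tapped = get(mouseX, mouseY);  // color of the pixel under the cursor

  if (tapped == COIN && clickStage == 0) {
    clickStage = 1;   // the player found the coin: advance the puzzle
  } else if (tapped == BATHTUB && clickStage == 1) {
    clickStage = 2;   // next step, and so on for each object and stage
  } else if (tapped == TOILET) {
    // objects can also trigger one-off reactions regardless of stage
  }
}

Because a color in Processing is just a packed 32-bit int, the matching is an exact == comparison, which is exactly why each object needs its own unique tint.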

Iterating

User Testing

A few days before the final submission deadline for the project, I had finished a prototype that allowed the user to at least finish the game. I decided to test it out by letting a few friends try it. Here were the main pain points:

No Onboarding

When I didn't explain the concept to my testing participants, very few of them were able to decipher what the game was about or where to start.

Solution: I added a child-like sketch of the finished rocketship to direct the user toward the end goal as well as keep the theme. As for instructions, I decided to put those in the packaging and physical product rather than in the digital application.

Clicking Difficulty

Certain objects were difficult to pick up because the scale of the game is so small. At certain stages it was quite apparent what the next step was, but the user couldn't quite select the right object.

Solution: I expanded the set of acceptable hex colors the user could tap to progress. For example, rather than only accepting the gold color used on a small gold coin, I also accepted the color of the countertop underneath the coin.
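Concretely, that meant matching each tap against a small set of acceptable colors per object instead of a single exact value. A sketch of the idea, again with placeholder color values:

// Any of these colors counts as tapping the coin (placeholder values):
// the coin's gold plus the countertop color directly underneath it
color[] coinColors = { #D4AF37, #C8B89A };

boolean tappedCoin(color tapped) {
  for (color c : coinColors) {
    if (tapped == c) return true;
  }
  return false;
}

// in mousePressed(): if (tappedCoin(get(mouseX, mouseY))) { ...advance the stage... }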

Changing Views

Some objects and clues can only be found by turning the camera and viewing the room from a new perspective. This wasn't readily apparent, especially to participants who had no prior experience with augmented reality.

Solution: Disclose the possibility of objects being hidden from the user’s default view in the physical instructions/onboarding process. This aspect is a key feature of the application and the advantage of using AR over a regular 3D game, so it was crucial to include it.

Conclusion

Intentional Frustrations

Going into this project, I expected the most difficult part to be the programming or the 3D modeling, when it actually turned out to be the game design itself. As a UX designer, I'm constantly striving to make experiences as intuitive and fluid as possible, so that the user hardly needs to think about their next step. When designing a puzzle game, however, the next step shouldn't be intuitive, and the user should have to think hard about how to progress. This got me thinking about the role frustration plays in the products we use. Should I frustrate the player of my game to create a more compelling puzzle experience, or would that just make them more likely to exit the experience? I realized I needed a general rule for deciding whether each step of the experience is fun or unhealthily frustrating:

The player should be wondering what to do, not how to do it.

That is, as soon as the player understands what the next step of the puzzle is, whether that's picking up coins or finding the hull of their rocket, they shouldn't have to wonder how to go about it; they should be able to just do it. That's why the coins need to be easy to pick up, and why the cardboard box that becomes the hull of the rocket needs to be readily apparent, recognizable, and interactive. As soon as the player starts clicking at random, they build the kind of negative frustration that makes them quit the game.

The Final Product

I was able to submit not only a fully playable product, but one that had been user-tested by a group of people and had its main bugs and pain points resolved, leaving a more elegant and delightful experience.