For my class, Integration of Immersive Realities, my group was given 5 weeks to create an engaging environment in virtual reality using Oculus and Unity, followed by another 5 weeks to create a mobile application of our choice.
Wanting to create an immersive and interesting environment, my group of two decided to create an old war submarine.
Since neither my teammate nor I had built a VR game on this scale before, we kept the scope fairly small: a submarine with 3 rooms, consisting of sleeping quarters, an engine room, and a control room. To make the environment immersive and realistic, we studied photos of real and fictional submarines to get a better understanding of what assets we would need.
There were some factors specific to VR applications that we had to account for in our early planning. Primarily, we researched and discussed ways to minimize motion sickness for our users. Since a higher frame rate often results in less motion sickness, we kept the polygon count of our models fairly low and used simple, baked lighting, without sacrificing immersion in either. We also had to consider what to avoid when designing a submarine as an immersive environment. For instance, we discussed early on whether to have the submarine rock or jerk to make the environment more realistic, but decided against it because it would leave users annoyed or even sick.
Once we started putting the submarine together, we decided that we wanted to add motion to the environment, so I began creating UI for the submarine, including radar and engine screens.
Unfortunately, the Oculus Unity SDK doesn’t work very well with constrained rigidbodies, so we had to develop our own workarounds for many of the interactions we designed. Here are some of them:
We used two different positions for the button, unpressed and pressed. It swaps between the two states when the hand enters the general area around the button.
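A minimal sketch of that two-state button, in plain Python rather than the actual Unity C# (the state names and positions here are made up for illustration):

```python
# Two pre-authored button positions; the mesh snaps between them whenever
# the hand enters the trigger volume, with no rigidbody physics involved.
UNPRESSED_Y = 0.00    # hypothetical local height of the unpressed state
PRESSED_Y = -0.01     # hypothetical local height of the pressed state

class TwoStateButton:
    def __init__(self):
        self.pressed = False
        self.y = UNPRESSED_Y

    def on_hand_enter(self):
        """Called when the hand enters the trigger area around the button."""
        self.pressed = not self.pressed
        # Snap straight to the matching pre-authored position.
        self.y = PRESSED_Y if self.pressed else UNPRESSED_Y

button = TwoStateButton()
button.on_hand_enter()   # hand reaches in: button toggles to pressed
```

Because the button only ever teleports between two known positions, there is no constrained rigidbody for the SDK to fight with.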
When the hand grabs the wheel, a custom script translates the vertical and horizontal distance the hand travels into rotation of the wheel.
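One way to express that hand-travel-to-rotation mapping is as the signed angle between where the hand grabbed the wheel and where it is now, measured around the wheel's center. This is an illustrative plain-Python sketch, not the project's actual Unity script:

```python
import math

def wheel_angle_delta(center, grab_pos, hand_pos):
    """Degrees the wheel should turn, given the hand's horizontal and
    vertical travel in the wheel's plane since it grabbed the wheel.
    All arguments are (x, y) points in that plane."""
    gx, gy = grab_pos[0] - center[0], grab_pos[1] - center[1]
    hx, hy = hand_pos[0] - center[0], hand_pos[1] - center[1]
    # Signed angle between the grab vector and the current hand vector.
    return math.degrees(math.atan2(gx * hy - gy * hx, gx * hx + gy * hy))
```

Applying the returned delta to the wheel's transform each frame lets the wheel track the hand without any physics joint at all.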
The lever is similar to the wheel, except that we used constraints in the code to make sure the lever can't go beyond its boundaries. In addition, we had to move the pivot to one end of the lever.
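The in-code constraint amounts to clamping the hand-driven angle to the lever's end stops. A minimal sketch, with made-up angle limits:

```python
# Hypothetical end stops for the lever, in degrees around its pivot.
LEVER_MIN_DEG = -40.0
LEVER_MAX_DEG = 40.0

def lever_angle(requested_deg):
    """Clamp the angle the hand is asking for to the lever's allowed range,
    so the lever can never swing past its boundaries."""
    return max(LEVER_MIN_DEG, min(LEVER_MAX_DEG, requested_deg))
```

Clamping in code like this replaces the joint limits a constrained rigidbody would normally provide.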
We submitted our 3-room interactive VR experience at midterms, and our professor was thrilled with the quality of our work. He advised us to enlist two other students in our class to help expand the experience into a more complete game. So we brought on one person to model two new rooms for the submarine, and another to give us more realistic lighting that fits the somewhat eerie atmosphere of the submarine. While they worked on those assets, my teammate and I began discussing the direction we wanted to take the game. We realized we both had 3 desires for the finished product:
The main issue we had with VR was that a user in VR goggles is so isolated from their surroundings and from others. We wanted to connect at least two people so the user could interact with someone else as they explored the submarine.
We wanted the user to solve a series of puzzles and finally escape the submarine to beat the game. To reward the user for completing puzzles, new areas would be uncovered for them to explore.
We wanted the player to have a sense of urgency as they solve the puzzles. Instead of simply implementing a win condition, we wanted the narrative to include the sinking of the submarine as a way to have the player race the clock to escape.
We decided to create a mobile app that connects the Unity game to a mobile experience. To do this, we followed the narrative of a hacker helping someone escape a sinking submarine. One key concept we wanted to incorporate was information flowing both ways between the mobile and VR players.
We then brainstormed different ways we could include puzzle information in the submarine and the application so that the players could actively help each other, and I used that to design and program a fully functional companion app for the game.
Because we were still working on the VR portion of the experience, I ran user testing with the application alone, guiding users through it. Though we couldn't test the two sides working in tandem, it was still great to get general feedback on the app by itself.
The most pressing piece of feedback was that the phone user didn't have enough to do while the VR player was exploring the submarine.
Solution: I added a mini game that the phone user can play at certain points in the game, or if they get a combination wrong in the app.
Many of the buttons and much of the text were too small, something I didn't discover until I tested with an older demographic.
Solution: I increased the size of the main buttons and a lot of the text used throughout the app to make sure it’s easy to navigate.
When inputting codes into the mobile application, users needed a more intuitive way to remove symbols they had already entered.
Solution: I ended up redoing all of the programming for code entry. Now the user simply selects a symbol that has already been entered; it is removed, and the next symbol they type takes its place.
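The reworked behavior can be sketched in plain Python (this is illustrative logic, not the app's actual code, and the slot model is an assumption):

```python
class CodeEntry:
    """Code input where tapping an entered symbol clears its slot and the
    next symbol typed lands there, instead of a separate delete button."""

    def __init__(self, length):
        self.slots = [None] * length   # None marks an empty slot
        self.cursor = 0                # next slot to fill

    def select(self, index):
        """Tapping a filled symbol removes it and moves the cursor there."""
        self.slots[index] = None
        self.cursor = index

    def type_symbol(self, symbol):
        """Put the typed symbol in the cursor slot, then jump to the next
        empty slot, if any remain."""
        if self.cursor < len(self.slots):
            self.slots[self.cursor] = symbol
            empties = [i for i, s in enumerate(self.slots) if s is None]
            self.cursor = empties[0] if empties else len(self.slots)

entry = CodeEntry(4)
for s in "ABC":
    entry.type_symbol(s)
entry.select(1)        # user taps the second symbol to remove it
entry.type_symbol("X") # the replacement fills that same slot
```

After the usage above, the code reads A, X, C with one slot still empty, which matches the select-to-replace flow described.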