FAO write-up

Notes on For Amusement Only, a project by Mat Olson (NYU ITP '24)

The following was written to accompany an unfinished build of the VR component for this project presented to a class of students learning the Unity Engine in Fall 2024. It does not represent "complete," formal documentation of the project. If you're reading this and you're not part of the class in question, please understand that I've focused mainly on the VR implementation aspects for the students' benefit.

For Amusement Only is the joint final I worked on for two classes I took at NYU's Interactive Telecommunications Program. Those classes were Winslow Porter's "Multisensory Storytelling in Virtual Reality and Original Flavor Reality" and Michelle Cortese's "Hedonomic VR Design."

The project was designed to fit the briefs for the final project assignments across both classes; respectively, those classes called for (1) a short VR storytelling experience that made use of "multisensory" aspects like scent, temperature, or haptics beyond the standard capabilities of a VR headset, and for (2) a VR project demonstrating the principles of hedonomic VR design in either the positive (i.e. an especially comfortable, pleasurable experience) or the negative (i.e. a VR experience that intentionally makes the user uncomfortable).

As you'll see if you load up the build provided, For Amusement Only was an overly ambitious project for a lone artist in a single semester, even pooling my time across two classes. I basically managed to put together a proof of concept or demo of the intended final experience.

Technical outline

The centerpiece of For Amusement Only is its custom-built VR controller, designed to resemble the frontmost part of a standard-size pinball table.

I had two motivations for creating the custom controller. First, I believed that a properly scaled box serving as the physical controls for For Amusement Only's virtual pinball table, rendered in-headset, would enhance the participant's immersion and put people with less VR headset experience at ease; the project was designed to use Meta Quest hand tracking only as a limited, secondary form of interaction and to avoid the Quest's standard controllers entirely. Second, with participants effectively blindfolded in front of the box, I could hide additional multisensory components inside the box itself, creating opportunities to surprise participants with multisensory tricks as the experience unfolds.

The project also included two additional physical components: a faux neural interface wristband to be worn by participants and a light-up sign bearing the project's name. Neither of these components was fully integrated into the build of the project presented here.

For Amusement Only was built using a 2021 long-term support version of Unity and version 43 of the Meta Quest/Oculus software development kit. The hand tracking used in the experience was not significantly altered from the Unity scene examples included in the SDK, and no other Unity packages were used in the project.

The controller, signage, and wristband (on the table, next to the headset) were conceived of as the three main elements of the project to be used along with the headset.

Story outline

The story of For Amusement Only is a satirical take on Meta's role in the VR/AR industry and its attempts to convince the public that development of "the metaverse" is both a worthwhile pursuit and one that the company can be trusted with given its track record on advertising and privacy matters.

(The "Mata" name used at the beginning of the VR experience and seen on the pinball controller's decoration is a fictional shell company for Meta. I created it so that I wasn't literally plastering Meta's real name and logo on the physical components of my project.)

For Amusement Only casts participants as subjects in a user research study conducted by Meta for its development of augmented reality glasses, a wrist-worn neural interface, and an accompanying AI assistant. These are all things that Meta is actually working on, but the specifics presented in the project are my not-so-generous interpretations of them.

In the project's fiction, the user research study is being conducted inside VR, with the AR glasses interface and AI assistant running as a parallel but separate process from the VR simulation. This is not all that dissimilar from how some AR interface tests are conducted in the real world–because AR glasses technology is still so expensive and early in development, it can be easier for companies to test ideas for how a person might use a pair of AR glasses by essentially creating a controlled testing environment inside VR.

This dual VR/AR simulation is the setup for the actual plot of For Amusement Only: a hacker alters the VR component of the test, replacing the virtual pinball table in the testing environment with a different table themed around criticisms of Meta as a company. Upon discovering this, the AI assistant guiding the participant panics and attempts to stop the participant from playing the pinball table. This would then lead to a tug of war between the hacker and the AI assistant, with the assistant becoming increasingly irate and hostile as the pinball game continues.

The AR pop-up advertisements arranged in the arcade space behind the participant in this demo build are an example of one of the measures the AI assistant could take in order to try and force a premature end to the pinball game–while playing, a pop-up could appear directly in front of the participant, forcing them to either play around it or to take a hand off of the controller to close the ad.

PRO-TIP: If you're doing a satirical project, make satirical slides about it if you have to present it in class with a deck. People love that.

How it was made–VR specifics

The VR implementation here was fairly straightforward as Unity VR projects designed for Quest go, though it's also missing one of the key components I badly wanted to get working.

The player controller entities and hand tracking components used in the project were all derived or copied straight from the examples provided in the build of the Meta SDK for Unity that I installed. While both Unity and the Meta SDK have received several updates since the versions used in this project, I'd still stand by my main piece of advice for using them together: install everything and take as many working components from the examples provided as necessary.

Deploying a build straight to a Quest headset also requires having the Android compatibility packages installed for Unity. Again, I'd recommend just installing everything and following the setup documentation provided by Meta.

The great strength of the Meta SDK for Unity is that it comes with so many solid examples, and because of the way they've been designed you can pretty easily copy-paste components or use provided prefabs in your own projects.

The drawback to this is that the documentation on how to set up things like a VR player controller or hand-tracking entities from scratch inside Unity has never been quite as helpful as you'd hope. When you run into issues, you're probably going to have to spend some time experimenting to figure out why things aren't working as intended, and a solution you find online for one problem might not apply to your current version of Unity or the SDK.

Example: I wanted this project to use the then-latest version of Meta's mixed reality features to define the position of my pinball controller in the physical space, allowing me to anchor the virtual pinball table to those coordinates so that the controller would always align perfectly with the virtual table. Documentation on how to actually set this up was scant at the time, and I wasted many hours trying to get it working. I ultimately didn't get it functioning by the time I showed this demo, and instead resorted to physically moving the controller into position to match what participants were seeing in the headset. Not ideal.
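For anyone attempting the same thing, here's a hedged sketch (not my shipped code, which never worked) of roughly how I'd expect the anchoring to look using the OVRSpatialAnchor component from the Meta SDK. The component and general approach are real; the specifics of saving and loading anchors have changed across SDK versions, and the field and method names on my side are illustrative.

```csharp
using UnityEngine;

// Sketch: pin the virtual pinball table to a real-world position using
// a Meta SDK spatial anchor. API details vary between SDK versions.
public class TableAnchor : MonoBehaviour
{
    // The virtual pinball table to align with the physical controller.
    [SerializeField] private Transform virtualTable;

    // Call once this GameObject has been positioned at the physical
    // controller's location, e.g. during a setup/calibration step.
    public void AnchorTableHere()
    {
        // Adding the component creates an anchor at this object's
        // current pose; the runtime then keeps it fixed in the room.
        var anchor = gameObject.AddComponent<OVRSpatialAnchor>();

        // Parent the table to the anchored transform so it stays
        // aligned with the physical controller.
        virtualTable.SetParent(transform, worldPositionStays: false);
    }
}
```

Persisting the anchor across sessions (so you don't have to recalibrate every boot) requires the SDK's save/load calls on top of this, which is exactly the part the documentation was thinnest on at the time.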

How it was made–controller specifics

The controller is actually much simpler than it looks. Initially I had planned to use an Arduino-compatible board and Ardity, which I know you've already been introduced to in this class. However, Ardity is designed to work primarily with a microcontroller physically connected to a computer via a COM port, and I wasn't familiar enough with the Quest's Android OS to know how a Unity app running on the headset could be made to work with a serial connection like that.

Instead, I resorted to something hackier but far simpler and more stable: to get the button inputs for the flippers, ball launch, and start game button on the pinball controller, I bought a cheap Bluetooth keyboard from Adafruit. When it arrived, I confirmed that the Quest recognized it as a keyboard and that Unity projects running in the headset would take inputs from it without issue.

The keyboard used in the controller. You can see that keys 1-4 were removed and had sockets soldered into place, allowing me to plug in arcade pushbuttons.

I then opened up the keyboard, studied the circuit board inside, and soldered on my own connection points in place of most of the number row. These points I then connected to the push buttons mounted in the controller, and that was it. No special input code required inside of Unity, just a bit of tricky soldering. With a bigger, more expensive keyboard–or with a microcontroller with Bluetooth keyboard functionality built in, if one exists–it would probably be even easier to make the modifications required to do something similar.
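Because the Quest sees the modified controller as an ordinary Bluetooth keyboard, the Unity side can stay trivial: you just poll the number-row keys with Unity's legacy input API. A minimal sketch follows; the game-logic method names are hypothetical stand-ins, and the exact key-to-button mapping here is an assumption (keys 1 through 4 were the ones rewired).

```csharp
using UnityEngine;

// Reads the pinball controller's arcade buttons, which the modified
// Bluetooth keyboard reports as ordinary number-row key presses.
public class PinballInput : MonoBehaviour
{
    void Update()
    {
        // Flippers need to be holdable, so poll GetKey (held state)
        // rather than GetKeyDown (single press).
        bool leftFlipperHeld  = Input.GetKey(KeyCode.Alpha1);
        bool rightFlipperHeld = Input.GetKey(KeyCode.Alpha2);

        if (Input.GetKeyDown(KeyCode.Alpha3))
            LaunchBall();
        if (Input.GetKeyDown(KeyCode.Alpha4))
            StartGame();

        SetFlippers(leftFlipperHeld, rightFlipperHeld);
    }

    // Hypothetical hooks into the pinball table's game logic.
    void LaunchBall() { /* apply an impulse to the queued ball */ }
    void StartGame()  { /* reset score, spawn the first ball */ }
    void SetFlippers(bool left, bool right) { /* drive flipper hinges */ }
}
```

This is the whole payoff of the keyboard hack: all the hardware complexity collapses into a few KeyCode checks.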

Notes on playing through For Amusement Only

If you want to load up the experience yourself, you'll need to follow the steps for sideloading an APK file onto your headset. You'll also need some space to move around in (I'd recommend at least 6' by 6') and your Quest headset's controllers at the ready.
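If you haven't sideloaded before, the core of it looks something like the following using Android's adb tool, assuming developer mode is already enabled on the headset and it's connected over USB. (The APK filename here is a placeholder; use whatever the provided build file is actually called.)

```shell
# Confirm the headset shows up as a connected device.
adb devices

# Install the APK onto the headset; -r replaces any existing install.
adb install -r ForAmusementOnly.apk
```

Sideloading tools like SideQuest wrap these same commands in a GUI if you'd rather not use the command line.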

While the experience was designed for use with my custom built controller, you can still play around with the pinball table if you'd like. The flippers are mapped to the triggers on each controller and pressing the B button will launch a new ball.

Prior to that point, though, you need to use hand tracking gestures to get through the beginning of the experience (it's possible that button prompts/the usual grip controls on the Quest controllers will work as well, but I haven't tested that). To switch between hand tracking and controller use, just tap the two controllers together twice. Hand tracking can also be enabled in the Quest pause menu, I believe.

When you boot the APK for the first time, you'll likely be prompted for permission to run it because of the spatial tracking features enabled in the APK. I assure you, no tracking data is actually being used beyond what's standard for the headset–the prompt is just an artifact of my attempts to get spatial anchors for the custom controller working.

If you'd rather watch a walkthrough of the experience instead, I've included a narrated video here. It's a bit long because I go into more detail about how some of the interactable elements were made. Feel free to put me on 1.5x speed or something. Hope it's helpful!–Mat