VR Prototyping #1 | Meta Horizon World Selection

Jon Lehman
6 min read · May 13, 2022

Disclaimer

I am sure the assumptions I make below miss a lot of context. This exercise was purely conceptual and meant to help me overcome technical challenges associated with creating VR experiences. I think Meta’s Horizon Worlds is great, and I look forward to seeing how the Horizon team continues to evolve their product.

Background

I have been designing products for about 9 years. Although I have a great interest in XR tech, most of my professional experience has been designing apps for web and mobile platforms (most recently for Teladoc and Apple). Over the last several years I’ve been bullish on WebXR (previously known as WebVR) and reluctant to try other tools for creating VR content, like Unity. However, WebXR isn’t yet mature enough for quickly creating VR prototypes, so I’ve started to dive into Unity for these types of projects.

Ok, to be fair I am still bullish on WebXR. It just doesn’t have what I need yet to move quickly.

Brief

  1. Create an alternative experience to select “worlds” in Meta’s Horizon Worlds.
  2. Use the exercise to build technical understanding of the challenges associated with designing spatial interfaces.

One thing I’ve noticed a lot about Meta’s Horizon Worlds (and a lot of XR experiences) is the use of 2D UI (user interfaces) throughout. There is nothing inherently wrong with this, of course... but, come on, there’s got to be some missed value that a more spatial experience could provide.

Examples of Meta Horizon Worlds home menus
Examples of Meta Horizon Worlds loading (left) and world editor (center and right)

Side note: Is there a way we could provide a better loading experience? A lot of XR apps use basic 2D loading sequences similar to traditional web and mobile apps. It feels like a missed opportunity to me. Even an interactive block or ball would bring more delight to a loading experience (a rough sketch of the idea follows).
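To sketch what I mean: in Unity you can load the next world asynchronously while the user plays with a physics toy, then activate the scene only once it’s ready. Everything here (PlayfulLoader, toyBall) is a made-up illustration, not how Horizon actually loads worlds:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical loader: spawn something playful, then swap scenes when ready.
public class PlayfulLoader : MonoBehaviour
{
    public GameObject toyBall; // interactive physics prop to occupy the user

    public IEnumerator LoadWorld(string sceneName)
    {
        // Give the user something to bat around while we wait.
        Instantiate(toyBall, transform.position + transform.forward, Quaternion.identity);

        var op = SceneManager.LoadSceneAsync(sceneName);
        op.allowSceneActivation = false;

        // Unity holds progress at 0.9 until activation is allowed.
        while (op.progress < 0.9f) yield return null;
        op.allowSceneActivation = true;
    }
}
```

Kicked off with something like StartCoroutine(LoadWorld("SomeWorld")), the ball stays interactive the whole time the world streams in.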

I am sure there are plenty of advantages to utilizing 2D UI in VR (wow, four 2-letter “words” in a row!). 2D UI is likely faster to develop, easier for a new-to-XR user to understand, and it may be less physically demanding as well. Spatial UI can be so delightful though! Some experiences, such as Nock and even the “First Steps” demo, strike a balance between 2D and spatial elements. These experiences are both intuitive and satisfying.

That being said, here are some spatial-based UI experiences that I’ve run across recently:

Nock immerses you into their game experience by utilizing their archery mechanic in the menu
First Steps uses spatial game cartridges and a console to let the user switch from one experience to another.
Rec Room has you tear away the seal on a box (with satisfying haptic feedback) to reveal new items.

Worth noting that Rec Room used to be on my list of favorite VR UI... but over the last year or so they seem to be moving their UI experience in a generic 2D direction, likely to bring parity across the massive number of platforms they now support.

I decided to make one experience entirely spatial, while the second augments the existing 2D experience in Horizon Worlds today.

Result

Fully spatial prototype.

For the spatial prototype I wanted to see if there was a way to visualize a world beyond just images and video. What if the user could actually see a live preview of a chunk of the world they’re highlighting? What if they could see the players in that world and what they’re up to? Lastly, I added a highlight above the players in each world who are friends of the user.
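Under the hood, the live preview and the friend highlight can both be done with stock Unity pieces. Here’s a simplified sketch (not my actual Bolt graphs, and all field names are illustrative); in a real project each MonoBehaviour would live in its own file:

```csharp
using UnityEngine;

// Renders a chunk of the live world to a texture shown on the menu tile.
public class WorldPreview : MonoBehaviour
{
    public Camera previewCamera; // camera placed inside the world chunk
    public Renderer tileSurface; // quad on the menu tile that shows the feed

    void Start()
    {
        var rt = new RenderTexture(512, 512, 16);
        previewCamera.targetTexture = rt;      // camera now draws into the texture
        tileSurface.material.mainTexture = rt; // tile displays the live feed
    }
}

// Floats a highlight above a friend's avatar and keeps it facing the viewer.
public class FriendMarker : MonoBehaviour
{
    public Transform avatar;    // the friend's avatar in the previewed world
    public float height = 0.3f; // how far above their head the marker sits

    void LateUpdate()
    {
        transform.position = avatar.position + Vector3.up * height;
        // Simple billboard: rotate so the marker always faces the camera.
        transform.rotation = Quaternion.LookRotation(
            transform.position - Camera.main.transform.position);
    }
}
```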

Augmentation of existing 2D UI prototype

For the 2D augmentation prototype, I recreated the existing world selection experience in Meta Horizon Worlds with one additional feature: the user can pull each of the world tiles out of the menu to make it tangible. This would allow the user to “share” a world by physically handing it to a friend. It would also open the world up to additional experiences like the ones shown in the spatial prototype.

This idea was influenced by how Rec Room allows you to take polaroid style photos in-game and place them in the virtual space.
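In code, the tear-out moment is basically “grab detected on a tile, spawn a 3D copy in the hand.” A minimal sketch, assuming the grab system (Autohand in my case) can invoke a hook like OnTilePulled; that hook name is made up, not Autohand’s real API:

```csharp
using UnityEngine;

public class TearOutTile : MonoBehaviour
{
    public GameObject tangibleTilePrefab; // low-poly card version of this world

    // Hypothetical hook invoked by the grab system when the user pinches
    // a 2D world tile and pulls it away from the menu.
    public void OnTilePulled(Transform hand)
    {
        var card = Instantiate(tangibleTilePrefab, hand.position, hand.rotation);
        // Follow the hand until the grab system takes ownership of the card.
        card.transform.SetParent(hand, worldPositionStays: true);
    }
}
```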

Findings

While putting together these experiences I discovered (or rediscovered) different characteristics worth remembering for my next project. Here are a few of those little thought nuggets.

💡 Experiencing physical/tactile responses in hand-tracking (no controller) interactions 10x’s immersion and adds delight, e.g. feeling the surface of your hand as you tap a button that is lying on top of it.

💡 Sound feedback should accompany every spatial interaction. Sound enhances immersion and helps the user quickly understand the interaction (this and the next finding are sketched in code after the list).

💡 Any 3D object the user can create should also be destroyable. Some users experience anxiety around clutter that cannot be cleaned up.

🚫 Don’t place medium-to-large interactive objects (with colliders) where the user has to reach over them; accidental triggers are likely.

😅 Bugs that come up while developing VR can be a source of humor in a way that bugs in traditional 2D digital products rarely are.

Screenshot of bug where dozens of world tiles exploded like a deck of cards
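Here are two of those findings (sound feedback and destroyability) in sketch form. OnGrabbed and OnThrownIntoTrashZone are hypothetical hooks for whatever interaction system is wired up, not a real API:

```csharp
using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class SpawnedTile : MonoBehaviour
{
    public AudioClip grabClip;
    public AudioClip destroyClip;

    public void OnGrabbed()
    {
        // Immediate audio confirms that the grab registered.
        GetComponent<AudioSource>().PlayOneShot(grabClip);
    }

    public void OnThrownIntoTrashZone()
    {
        // Anything the user can create, they can also clean up.
        AudioSource.PlayClipAtPoint(destroyClip, transform.position);
        Destroy(gameObject);
    }
}
```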

Quick Breakdown: Spatial Prototype

Rest state, on hand open
Active state, on button tap
  1. Started out with some sketches (pictured above). I like using Procreate to sketch these out.
  2. Moved over to Blender to begin putting the general/basic geometry together.
  3. Exported UV maps from Blender and created textures in Photoshop and Figma.
  4. Began building the experience in Unity using the building blocks created in Blender. I also used the following asset packages to help with the modeling and interactions:
    * Bolt / Visual Scripting for most of the logic (still dragging my feet on C#; a rough C# equivalent of the palm-menu logic is sketched after this list)
    * Autohand for the grabbing and hand-tracking interactions
    * Poly Perfect for the low-poly world models
    * Mixamo for the character animations
    * Sound effects from SND.dev and Meta’s Sound Kit
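For anyone curious what the Bolt logic boils down to, here’s a rough C# equivalent of the rest/active states pictured above. The palm reference and the 0.6 facing threshold are assumptions, not values from a real hand-tracking API:

```csharp
using UnityEngine;

public class HandMenuState : MonoBehaviour
{
    public Transform palm;        // tracked palm transform from the hand rig
    public GameObject restMenu;   // shown when the open palm faces the user
    public GameObject activeMenu; // shown after a button tap
    bool active;

    void Update()
    {
        // Treat the palm as "open toward the user" when its up axis
        // points roughly at the headset camera.
        var toHead = (Camera.main.transform.position - palm.position).normalized;
        bool facing = Vector3.Dot(palm.up, toHead) > 0.6f;

        restMenu.SetActive(facing && !active);
        activeMenu.SetActive(facing && active);
    }

    public void OnButtonTap() => active = true; // wired to the tap interaction
    public void OnDismiss() => active = false;
}
```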

Quick Breakdown: 2D Augmentation Prototype

Animated sketch showing different states.
  1. Started out with some sketches shown above.
  2. Moved over to Figma and rebuilt Meta Horizon Worlds’ existing UI (wasn’t going for pixel-perfect, just close enough).
  3. Moved over to Unity, built the general UI “canvases,” and applied the Figma-created UI. Also began putting together the interactions:
    * I used Bolt / Visual Scripting for most of the logic
    * I used DOTween for help with easing and animations (the snap-back ease is sketched after this list)
    * Autohand for the grabbing interactions and the pull-apart effect
  4. Used Blender to create the geometry that pops out of the menu, and Figma to create its texture over a UV map.
  5. Added sound effects from SND.dev and Meta’s Sound Kit.
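As an example of where DOTween earned its keep: when a tile is released near the menu, it snaps back to its slot with an overshoot ease. A minimal sketch (the slot reference and the 0.35s duration are my guesses, not production values):

```csharp
using DG.Tweening;
using UnityEngine;

public class TileSnapBack : MonoBehaviour
{
    public Transform slot; // the tile's home position in the 2D menu

    // Called when the user lets go of a tile close to the menu.
    public void ReturnToMenu()
    {
        // OutBack overshoots slightly before settling, which reads as "snappy".
        transform.DOMove(slot.position, 0.35f).SetEase(Ease.OutBack);
        transform.DORotateQuaternion(slot.rotation, 0.35f).SetEase(Ease.OutBack);
    }
}
```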
