Space Exploration in VR: Ball Game
This is the final project for my master's program course "Narrative VR", and at the 2019 NYU IDM showcase it proved to be a popular project.
RESEARCH REVIEW
Key Elements: Physical Box, Motion Controller, Ball (In-Game Character), Leap Motion
Virtual Reality (VR) is often used to create immersive experiences beyond the limitations of time and space. It plays a significant role in art (narrative VR) and gaming. However, integrating tangible interactions—connecting the physical and virtual worlds—adds a new layer of depth to VR experiences. This approach, known as Tangible VR, is gaining interest as researchers and designers explore ways to enhance user interaction.
Tangible VR allows users to physically interact with virtual environments by linking real-world objects to digital experiences. It not only increases user control but also helps address sensory conflicts between vision, balance (vestibular system), and proprioception. Various studies have explored this field:
• Chang et al. (2018) developed a tangible VR game to enhance spatial penetrative thinking ability.
• Harley et al. (2017) examined how diegetic tangible objects improve engagement in VR narratives.
• Harley et al. (2018) researched Sensory VR, incorporating senses like smell, touch, and taste into virtual experiences.
Building on these concepts, a simple educational VR game was developed that blends the physical and virtual worlds. The project aims to enhance cognitive and motor skills through tangible interactions, providing a more immersive and engaging learning experience.

GAME CONCEPTION
Inspiration: Labyrinth
Game Space:
The game takes place inside a cube, where each side represents a different landscape or world. However, only the bottom (horizontal) face is a fully interactive 3D environment; the other faces display 2D images of those worlds. This design allows players to analyze the correct path before making a move, similar to how the dream layers function in Inception.
Gameplay Mechanics:
• Route: The maze adapts to the landscape’s unique characteristics—players might navigate a ball floating on a river or rolling over hills.
• Player Role: The player controls a ball trapped in the 3D horizontal world. The ball follows realistic physics, rolling and reacting to the cube’s movements.
• Control System: The physical box in the player’s hands is mirrored by a virtual cube in the game. Rotating the real-world box directly influences the virtual environment, creating a seamless interaction between physical and digital spaces.
• Objective: The goal is to guide the ball to the end of the correct path. Successfully reaching the endpoint triggers a portal, transporting the ball into the world displayed on the walls.

Enhancing the Classic Labyrinth Game with VR
The traditional Labyrinth game challenges players to develop spatial awareness and observational skills, making it an excellent educational tool, especially for children. However, its static design and lack of engaging details can hinder player immersion. To enhance the experience, the game was reimagined in Virtual Reality (VR) by integrating narrative elements and interactive enhancements.
To ensure a consistent spatial experience between the physical and virtual worlds, a 3D-printed box was designed as a physical counterpart to the virtual cube in the game. This allows players to rotate the box in real life, ensuring a seamless interaction between the two worlds.
Additionally, Leap Motion was incorporated to track hand movements, enabling precise and natural interactions within VR. This integration improves the realism of the experience, making movements more intuitive and immersive.

HOW TO MAKE IT
This VR game was developed using Unreal Engine.
1. Box Construction
Using SolidWorks and 3D printing, a knock-down box was created, designed for easy assembly and with room to place the tracking device inside.

2. Connect Box to VR
At the start of the project, Arduino and an Inertial Measurement Unit (IMU) were used to transfer data from the physical world to Unreal Engine. However, three key challenges were encountered:
1. Unstable Data Input – The data was inconsistent, influenced by factors such as the USB interface, IMU stability, and wiring issues.
2. Excessive Wiring – The setup became cluttered with numerous wires, both inside and outside the box, adding complexity to the existing wiring of the Oculus headset.
3. Complicated Data Transfer – Initially, attempts to send data directly from Arduino to Unreal were unsuccessful. As a workaround, data was routed from Arduino through Max (Max/MSP) and then into Unreal. While this approach worked, it created a long and inefficient data-processing pipeline.

3. Motion controller
Ultimately, the IMU was replaced with a motion controller, a more mature and readily available technology for tracking the rotation of the physical box. This change eliminated the need for internal wiring, significantly simplifying the setup.
The core concept was to integrate the virtual world inside the cube as a Blueprint component within Unreal Engine, linked directly to the motion controller. As the controller moves, the entire virtual world responds accordingly, ensuring a seamless and intuitive interaction.

SCENE BUILDING
All the scenes inside the cube were created using Cinema 4D (C4D).
BALL ANIMATION DESIGN
As the only character in the game, the ball needed to simulate realistic physics and move naturally based on the rotation of the world. To achieve this, the Ball GameMode in Unreal Engine was used, allowing the ball to move according to the landscape. However, two major challenges were encountered during development:
1. Collision Issues
Initially, the ball would bounce out of the box due to the way the model was imported into Unreal. The entire project was treated as a single object, resulting in a large, unintended collision box around it. To solve this:
• The default collision from the imported model was removed.
• New collision boundaries were manually added along the ball’s intended path in each world to ensure accurate movement.
2. Trigger Event & World Transition
According to the game design, the ball triggers a box at the end of its route, transporting it to another world. However, this posed a challenge in Unreal’s Blueprint system:
• The trigger event occurs in the Blueprint Class.
• The world-switching action is controlled in the Level Blueprint.
Since Blueprint Classes and Level Blueprints cannot call each other directly, Event Dispatchers and Blueprint instance references were used to pass messages between them. This reverses the more common direction of communication, from the Level Blueprint down to Blueprint Classes, and required a custom event-handling solution.

In the end, “Method 2” was chosen: whereas “Method 1” split assets between the Level Blueprint and Blueprint Classes, “Method 2” keeps all components within their respective Blueprint Classes.
This approach offers several advantages:
• More efficient communication between different blueprints.
• Easier component management, ensuring a more organized and scalable system.
LOOKING FORWARD
Currently, the scenes in VR are static. To enhance immersion, the plan is to animate the environments in Cinema 4D (C4D), bringing elements like flowing water and swaying trees to life. These dynamic elements will add realism and engagement, encouraging players to immerse themselves further in the virtual world.
1. Incorporate Physical Feedback
To enhance the tactile experience, physical feedback will be integrated into the box. Inspired by physical installations, the game will trigger vibrations when the ball reaches the trigger box, offering a more immersive and satisfying interaction.
2. Introduce More Complex Mechanics
At this stage, the ball can smoothly roll along the route and reach the end when the cube is rotated correctly. Moving forward, more complex mechanics will be introduced, such as traps and barriers along the path, increasing the game’s difficulty and adding layers of challenge for the players.
References:
1. Jack Shen-Kuen Chang, Georgina Yeboah, Alison Doucette, Paul Clifton, Michael Nitsche, Timothy Welsh, and Ali Mazalek. 2018. A Tangible VR Game Designed for Spatial Penetrative Thinking Ability. In CHI EA '18: Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems. Paper D307.
2. Daniel Harley, Alexander Verni, Mackenzie Willis, Ashley Ng, Lucas Bozzo, and Ali Mazalek. 2018. Sensory VR: Smelling, Touching, and Eating Virtual Reality. In TEI '18: Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction. Pages 386–397.
3. Daniel Harley, Aneesh P. Tarun, Daniel Germinario, and Ali Mazalek. 2017. Tangible VR: Diegetic Tangible Objects for Virtual Reality Narratives. In DIS '17: Proceedings of the 2017 Conference on Designing Interactive Systems. Pages 1253–1263.