
AR Monster Hunter

Every hunter dreams of stepping into this beautiful world one day and experiencing the hunt beyond game consoles and keyboards. This is my final project for the Wearables class at NYU IDM. It was fun to develop this wearable so we can experience the amazing Monster Hunter world more immersively. Let’s go, hunters!!!

01 Inspiration

I am interested in the intersection of HCI and gaming, particularly using technology to enhance the gaming experience. The Adafruit Circuit Playground piqued my interest as a potential tool for controlling PC games in a new way. While some games, like Taiko no Tatsujin and JustDance on the Nintendo Switch, use unique control methods to fully immerse players, it can be harder to achieve this level of immersion in action or role-playing games on current consoles.

In Monster Hunter, the player can take on tasks from a village and complete them in order to earn rewards and money. The tasks and quests offer a sense of purpose and progression, and the gameplay is satisfying and challenging. The game also features a wide variety of monsters to hunt and defeat, each with their own unique behaviors and abilities. I plan to create a wearable suit that allows the player to input commands directly through their body movements, rather than using a keyboard. This will enhance the immersion and enjoyment of playing the game, as the player will feel more like they are truly part of the action.

circuit_playground_hidCPXLaptop.png
image-1.webp

02 Brainstorm 

I gathered the game's standard keyboard and mouse controls from the official website. I then brainstormed ways to map these inputs to natural human movements during gameplay. I was excited to explore the potential for replicating the in-game avatar's motions, with the goal of enhancing the immersive experience for players. By allowing gamers to perform similar motions to trigger actions such as walking, running, attacking monsters, and collecting materials, the gameplay in Monster Hunter would become more engaging than simply pressing keys on a keyboard.

IMG_2536.jpg
IMG_2537.jpg
IMG_2529.jpg
IMG_2530.jpg
IMG_2532.jpg
IMG_2531.jpg
IMG_2533.jpg

03 Prototyping

03-1 Mouse

The mouse is a crucial aspect of the gameplay. It controls the camera angle, or the player's point of view. I found it natural for players to want to look around in-game, as they would in real life. To replicate this, I mounted a Circuit Playground on the back of the player's head and used its accelerometer to control the mouse movements. The x-axis and z-axis were used for left/right and up/down movements, while the y-axis was used to control jumping in-game, allowing players to physically jump while playing for a more immersive experience.
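The head-tilt-to-cursor mapping can be sketched as below. This is a minimal illustration of the idea, not the exact code: the deadzone, gain, and jump threshold values are placeholders. On the device itself, the tilt readings would come from the Adafruit CircuitPlayground library's motionX()/motionZ() calls, and the deltas would be passed to the Arduino Mouse library's Mouse.move().

```cpp
#include <cmath>

// Convert one axis of accelerometer tilt (roughly -10..10 m/s^2 when
// tilting the head) into a mouse delta. A deadzone keeps small head
// tremors from drifting the camera; the gain scales tilt into pixels
// per update. Both values are illustrative and would need tuning.
int tiltToMouseDelta(float tilt, float deadzone = 1.5f, float gain = 2.0f) {
    if (std::fabs(tilt) < deadzone) return 0;  // ignore subtle motion
    return static_cast<int>(tilt * gain);      // larger tilt -> faster pan
}

// Detect a physical jump: a strong spike on the y-axis above the
// resting gravity reading triggers the in-game jump.
bool isJump(float accelY, float restY = 9.8f, float threshold = 4.0f) {
    return (accelY - restY) > threshold;
}
```

The deadzone matters in practice: without it, the camera drifts constantly because a head is never perfectly still.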

img_3740-1.webp
image-2 (1).webp

03-2 WASD

For other inputs, I decided to implement capacitive touch sensing. It's worth noting that, per the documentation, the A0 pin cannot be used for this purpose. I placed the touch sensors on the left hand, simulating the "WASD" keyboard controls. I encountered difficulty figuring out how to send keyboard commands from the Arduino to the computer, so I researched and learned about the ASCII table. I also found a reference for the Arduino Keyboard library to complete the prototype.
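Once the Keyboard library is in play, the pad-to-key mapping itself is simple. The sketch below shows the gist; the pad numbering and touch threshold are chosen for illustration. On the real device, the capacitance reading would come from CircuitPlayground.readCap(), and the returned character would be sent with Keyboard.press()/Keyboard.release().

```cpp
#include <cstddef>

// Map a capacitive pad index to its WASD character. The pad numbering
// here is illustrative; on the suit each pad is wired to one contact
// point on the left hand.
char padToKey(std::size_t pad) {
    const char keys[] = {'w', 'a', 's', 'd'};
    return (pad < 4) ? keys[pad] : '\0';  // '\0' = unmapped pad
}

// A pad counts as "touched" when its capacitance reading exceeds a
// tuned threshold; 200 is a placeholder value.
bool isTouched(int capReading, int threshold = 200) {
    return capReading > threshold;
}
```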

circuit_playground_GPIO.jpg
img_3809-1.webp
wx20221207-124740402x.webp

03-3 Attack

Imagine a player physically mimicking the motion of attacking a monster with a weapon. By placing two capacitive touch sensors near the player's elbow, the act of bending the arm triggers the in-game attack, allowing for a more immersive gameplay experience, akin to clicking a left mouse button.
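One subtlety is that the two elbow contacts stay closed for as long as the arm is bent, while an attack should be a single event, like one left click. A minimal edge detector handles this; the class name and structure here are my own illustration of the pattern, not the project's exact code.

```cpp
// Fires the attack exactly once per elbow bend: returns true only on
// the update where the two elbow contacts first touch (a rising edge).
// On the device, a true result would translate to a single
// Mouse.click(MOUSE_LEFT) from the Arduino Mouse library.
class AttackTrigger {
public:
    bool update(bool contactsTouching) {
        bool fire = contactsTouching && !wasTouching_;
        wasTouching_ = contactsTouching;
        return fire;
    }
private:
    bool wasTouching_ = false;  // state from the previous update
};
```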

img_3903-1-edited.jpg
image-1 (1).webp

03-4 Use/collect items

I placed the capacitive touch tape near the pants pocket on the raincoat to simulate the act of accessing items in a bag. This approach allows players to intuitively reach for their pocket when they wish to use or collect items. In terms of item selection, I initially attempted to utilize a softpot to simulate mouse scrolling, but found that it was not accurate enough with the conductive thread. I plan to explore alternative methods in future iterations of the project.
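Had the softpot been reliable, the scroll mapping would have looked roughly like this: quantize the analog position into item slots, then scroll by the change in slot. The slot count and 10-bit ADC range are assumptions for illustration; the slot delta would be sent as wheel ticks via Mouse.move(0, 0, delta).

```cpp
// Quantize a softpot analog reading (10-bit ADC, 0..1023) into one of
// nSlots item slots. Scrolling is then the difference between the
// current slot and the previous one.
int readingToSlot(int reading, int nSlots = 8) {
    if (reading < 0) reading = 0;        // clamp out-of-range readings
    if (reading > 1023) reading = 1023;
    return reading * nSlots / 1024;      // yields 0 .. nSlots-1
}
```

With conductive thread in the signal path, the readings jittered too much for this quantization to land on the intended slot, which is why I shelved the approach.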

img_3812.webp

03-5 Coding test

Before connecting the conductive thread from the pads to the pins on the Circuit Playground, I tested the code to ensure that the capacitive touch and x-y-z accelerometer were functioning as desired.

I then proceeded to sew the conductive thread onto the raincoat. I discovered that different types of thread have varying levels of conductivity. Some threads lost their conductivity after a certain length, such as 30 cm. It took some experimentation to find the appropriate thread to complete the raincoat prototype.

03-6 Struggles...

- Forever untying nodes of long conductive thread

- Conductive thread lost conductivity

- Raincoat was easy to tear up

- Nail polish would not stay on the raincoat

- Creating short circuits when performing motions

- Not suitable for other body sizes

img_3822.webp
img_3799.webp
img_3796.webp
img_3793.webp

Besides the hardware, calibrating the accelerometer was also very hard: the same sensor had to capture both large-scale movements and subtle ones at the same time, and each scale had to drive different key commands. I used a calibration library from MIT to calibrate the X-Y-Z readings. The entire code is on my GitHub.
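The core of the problem is separating the two movement scales. After calibration zeroes out each axis's resting offset, one way to classify a reading is with two thresholds. The band values below are placeholders, not the calibrated numbers from the actual suit.

```cpp
#include <cmath>

enum class Motion { None, Subtle, Large };

// Classify a calibrated axis reading (raw value minus the resting
// offset found during calibration) into one of three bands, so subtle
// motions and large motions can drive different key commands.
Motion classify(float raw, float offset,
                float subtleMin = 0.8f, float largeMin = 4.0f) {
    float magnitude = std::fabs(raw - offset);
    if (magnitude >= largeMin)  return Motion::Large;
    if (magnitude >= subtleMin) return Motion::Subtle;
    return Motion::None;
}
```

The tricky part in practice is choosing the band edges: too close together and a deliberate subtle motion registers as noise, too far apart and a large motion gets misread as a subtle one mid-swing.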

04 Iteration

In response to feedback from the Wearables class final, I decided to create the next prototype using actual clothing, such as a jacket or hoodie. I selected an XXL cotton hoodie from Target and began integrating the wires and circuit directly into the garment, rather than using conductive thread, to achieve a more stable connection between the pads and the Circuit Playground.

IMG_4077.HEIC
IMG_4080.HEIC
IMG_4083.JPG
IMG_4087.HEIC
IMG_4102.HEIC

05 Showcase

I had the opportunity to test this wearable suit at the NYU IDM Fall 2022 showcase and received overwhelmingly positive feedback. Many people remarked that it represented the future of embodied gaming, providing a nearly 4D gaming experience and an incredibly immersive hunting experience. I chose the Coral Highlands in Monster Hunter: World as the demo location because it is one of the most visually stunning areas in the game. When players looked around and changed the camera view to take in the beautiful scenery, they commented that it felt like VR without the need for a headset, which they found very user-friendly and accessible.

IMG_4850.heic
IMG_4855.HEIC
IMG_4218.JPG

There is certainly room for improvement and further iteration, and the participants at the showcase provided valuable feedback and suggestions. For example, although the hoodie I used was XXL, it may still be uncomfortable for some body sizes. Putting on this coat and starting the game is more convenient than with other motion capture suits, but as a wearable it is difficult to customize for different body sizes. Additionally, the "WASD" command could be improved, as it is not as intuitive as the other commands. In the next iteration, I plan to use a gyroscope sensor on the left hand so players can control the character's movement simply by walking and using hand gestures. There was also debate about whether the suit would hold up in harder game modes, with some participants saying they might not have time to reach into the pocket for recovery items during battle. This raises an interesting question about how to balance play efficiency with immersion in role-playing games. As more motion capture suits and games are developed, this open question may eventually be answered.
