VR Rhythm Game
Conductor is a VR rhythm game in which players wave both controllers, following the polylines displayed on the podium, to mimic a conductor's hand movements and lead a small band performing in a concert hall.
This was my graduation project. I designed and developed the entire game independently, and it was ultimately recognized as an outstanding graduation project.
Dec 2018 - Mar 2019
As a rhythm game lover😝, I used to play them on my phone. After trying the hugely popular VR game Beat Saber in my school's virtual reality course, I developed a strong interest in VR games and began looking for other interesting ones. Browsing the platform, I found that most of them involve hitting beats the way Beat Saber does, so I started thinking about how to make full use of the interaction possibilities of VR equipment to build a rhythm game that felt a little different.
I used the Oculus Rift, which consists of a headset and two hand controllers. The controllers serve as the main input devices and can stand in for all kinds of objects: in Beat Saber they are two lightsabers used to slash through blocks; in other games they can be weapons or simply hands. One day inspiration struck: rhythm → chart → conductor. If VR could let the player become the conductor of a band, wouldn't that also be a novel experience?
How to turn this inspiration into reality? First, I needed to break it down into actionable parts. To implement such a game, the key tasks were modeling, animation, rhythm notation (charting), and gesture recognition for the VR controllers.
The gameplay is that the player listens to the music while making the corresponding gestures based on the polylines and dots they see on the podium: a polyline means waving, and a dot means a downward strike, simulating the gestures of a conductor.
Modeling and animation
I used Maya to create the musicians and the hall. To save time, all the characters share the same model (😂), just holding different instruments. And here they are: a small identical band, and an equally identical audience. I referenced photos to build a simplified version of the hall. Since the models turned out rather ugly, let's skip this step 🤪.
C4D has a very convenient rigging workflow, so I imported the character model into C4D for rigging, then paired each character with the relevant instrument and created keyframe animations, exported the assets, and brought them back into Unity. This step also went relatively smoothly.
Rhythm notation
This was the most laborious part of the entire process. Although there are programs that automatically extract rhythm from audio, I still felt that hand-made notation delivers a better experience.
After choosing the tracks, I recorded timestamps while arranging gestures to match the melody and rhythm. I used Excel to record all the data, then used a plugin to read it into Unity, and wrote a script that generates polylines and single dots according to each timestamp and position. With this pipeline, a different notation can be generated automatically just by importing a different file.
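The actual chart loader was a Unity (C#) script; as a language-neutral illustration, here is a minimal Python sketch of the same idea. The column names (`time`, `kind`, `x`) and the `Note` structure are my assumptions, not the project's real schema.

```python
import csv
from dataclasses import dataclass

@dataclass
class Note:
    time: float  # seconds into the track
    kind: str    # "wave" (polyline segment) or "strike" (dot)
    x: float     # horizontal position on the podium

def load_chart(path):
    """Read a hand-made chart exported from Excel as CSV and
    return notes sorted by timestamp, ready for spawning."""
    notes = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            notes.append(Note(float(row["time"]), row["kind"], float(row["x"])))
    return sorted(notes, key=lambda n: n.time)
```

Sorting by timestamp lets the spawner simply walk the list in order as the song plays.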
Gesture recognition and scoring
Only two gestures need to be recognized in the game, waving and striking, and they can be distinguished by the controller's direction of movement. When a dot or polyline reaches the baseline, the game checks the time difference between the controller's movement and the note's timestamp, and assigns scores according to time-difference ranges. At the end of the track, the player receives a grade based on the total score.
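The two mechanics above, direction-based gesture classification and timing-window scoring, can be sketched as follows. The thresholds and tier names here are illustrative assumptions; the project's real values are not stated in the text.

```python
def classify_gesture(velocity):
    """Classify controller motion from an (x, y, z) velocity vector:
    mostly-downward motion counts as a strike, anything else as a wave."""
    vx, vy, vz = velocity
    horizontal = (vx**2 + vz**2) ** 0.5
    return "strike" if -vy > horizontal else "wave"

def judge(hit_time, note_time):
    """Map the absolute timing error to a (tier, points) pair
    using assumed timing windows in seconds."""
    error = abs(hit_time - note_time)
    if error <= 0.05:
        return "Perfect", 100
    if error <= 0.10:
        return "Great", 70
    if error <= 0.20:
        return "Good", 40
    return "Miss", 0
```

For example, a hit 0.03 s off the note's timestamp would land in the tightest window, while anything beyond 0.2 s counts as a miss.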
To further increase the fun and sense of realism, I added a "bow and acknowledge" action to the results step. At the end of the track, the game guides the player to bow to the audience; after the bow, the results panel appears in front of them, and the audience applauds 👏🏻 or boos 👎🏻 depending on the score. This part was well received by players on open day.
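One plausible way to detect the bow and pick the audience's reaction, sketched in Python under stated assumptions: the bow is inferred from the headset dipping below the player's standing height, and the applause/boo split uses an arbitrary 60% score threshold. Both the detection method and the numbers are my guesses, not the project's documented implementation.

```python
def is_bowing(head_y, standing_y, dip=0.25):
    """Treat a drop of the headset `dip` meters (assumed threshold)
    below the player's standing height as a bow."""
    return standing_y - head_y >= dip

def audience_reaction(total, maximum, threshold=0.6):
    """Applaud if the player reached at least `threshold` of the
    maximum score, otherwise boo (threshold is illustrative)."""
    return "applause" if total >= threshold * maximum else "boo"
```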