Tips to Improve Your Facial Expression and Body Movement as a VTuber
3 min read
According to Albert Mehrabian's well-known research on communicating feelings and attitudes, body language accounts for as much as 55% of the message. Our innermost thoughts and feelings are conveyed through our facial expressions and movements, no matter how much we try to hide them. That's why it's important for a VTuber like yourself to be able to translate your physical actions into your avatar, so that you can connect more with your audience and breathe more life into your character. Here are some tips to help you get started.
1. Know Your Character Intimately
In order for your expressions and movement to fit your avatar's personality, you must also BE your avatar. Much like a method actor, you have to set aside time for internalization: the process of getting to know your character so well that you start embodying them and thinking like them. Do they have certain personality quirks? Are they excitable and bubbly, or brooding and serious? Of course, it's always easier to pattern your character after yourself, but embodying a different persona altogether is an exciting challenge. Once you've got this down, you'll notice yourself moving and expressing like your character in no time.
2. Use The Right Software
Without the proper tools at your disposal, all your hard work and preparation will go to waste. Your avatar needs to be rigged and animated well so that your expressions translate cleanly to the stream. There are lots of options available for this, such as VTube Studio, UltraLeap, Webcam Motion Capture, and VSeeFace, all of which are approachable regardless of experience level. Below is a short description of each to help you choose:
- VTube Studio: a tool for VTubers using 2D models. This software offers face tracking, including eye tracking and winking, through a webcam or an iOS device. It also lets you add accessories to your avatar, such as sunglasses, hats, or other props, which can have their own movement tracking. Furthermore, it supports collaboration between two or more VTubers, so multiple models can appear in one livestream.
- UltraLeap: this advanced hand-tracking software (with skeletal tracking planned) requires the company's proprietary motion-tracking cameras, which enable a highly accurate depiction of your hand movements. UltraLeap also provides Unreal and Unity plugins for integrating it into your setup.
- Webcam Motion Capture: this software lets you track your movements using only a webcam, for a small monthly subscription fee. It supports hand tracking, head tracking, facial expression tracking, eye gaze tracking, eye blink detection, lip sync, and upper body tracking. It works best with 3D models in the .vrm file format, which you then load into the software's companion Webcam Motion Receiver.
- VSeeFace: this free, Windows-only software offers face tracking, including eye gaze, blink, eyebrow, and mouth tracking, all through a webcam. Those who want hand tracking will need a Leap Motion device and its accompanying software. Note, however, that VSeeFace is still in beta as of this writing, so users may encounter some bugs.
3. Practice, Practice, and Practice
After internalizing your character's personality and bringing them to life on screen, it's time to do some sample recordings. This is when you can rehearse your character's expressions and movement and adjust them as you see fit. It may be awkward to watch yourself on screen, but this step is essential for spotting any physical tics, or for going back to the software to fix something in the rigging. Make sure to devote time to reviewing these sample recordings, so that when you finally go live, you'll be confident enough to act out your character and connect with your audience effectively.
There you have it! These tips should help you express yourself well and translate your own movements and expressions into your character. Of course, the depiction won't be 100% accurate, but as long as your audience understands what you want to convey, you're all set.