How VTube Studio’s NVIDIA integration makes VTubing more accessible

VTubing can be easy, but at the high end it can be an expensive and intensive process. VTube Studio and NVIDIA want to change that by using new technologies to make the streaming medium more accessible to everyone.

Getting started with VTubing can be as simple as getting a bunch of PNG images, plugging them into a Discord bot, and going live on Twitch. However, it can also be a complex mix of Live2D or 3D models, with individual body parts animated and manipulated to follow the streamer’s every movement.

This upper level of VTubing can be quite difficult to access. It requires quality equipment – either an iPhone or iPad for face tracking, or a computer powerful enough to track the same facial expressions from a webcam. Either way, it’s a big hardware investment to get things running smoothly.

VTube Studio is one of the most popular face tracking programs for VTubing. The PC application lets users load their model, then uses a webcam or smartphone to track their movements and mirror them on the model.

Denchi, the person behind VTube Studio, knows how resource intensive the entire medium can be. Although it is primarily a PC application, many users choose to run the tracking on their smartphone and send the data to their computer to reduce CPU and GPU usage.
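The phone-to-PC setup works because the heavy lifting – running the tracking model on camera frames – stays on the phone, and only small packets of tracked parameters travel over the local network. The sketch below illustrates that idea in Python; it is not VTube Studio’s actual protocol, and the port, parameter names, and packet format are all hypothetical.

```python
import json
import socket

# Illustrative sketch only: VTube Studio's real phone-to-PC protocol is
# not reproduced here. The point is that the phone tracks the face
# locally and streams tiny parameter packets to the PC, so the PC's
# CPU/GPU never runs the tracking model itself.

def send_tracking_frame(sock, addr, params):
    """'Phone' side: serialize tracked parameters into one UDP packet."""
    sock.sendto(json.dumps(params).encode("utf-8"), addr)

def receive_tracking_frame(sock):
    """'PC' side: receive one packet and decode the parameters."""
    data, _sender = sock.recvfrom(4096)
    return json.loads(data.decode("utf-8"))

if __name__ == "__main__":
    # Loopback demo standing in for a phone and PC on the same network.
    pc = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    pc.bind(("127.0.0.1", 0))  # ephemeral port for the demo

    phone = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Hypothetical parameter names, loosely modeled on blendshape values.
    send_tracking_frame(phone, pc.getsockname(),
                        {"MouthOpen": 0.8, "EyeLeftBlink": 0.1})

    print(receive_tracking_frame(pc))
```

A JSON-over-UDP payload like this is a few dozen bytes per frame, which is why streaming tracking data costs the PC almost nothing compared to running a vision model on webcam frames.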

“I’ve tried pretty much every face tracking framework over the past few years, but they’re often unstable, very experimental, or prohibitively expensive,” they told Dexerto.

“Most people are currently using either webcam-based face tracking or iOS face tracking. The webcam face tracking that comes with VTube Studio, an open-source library called OpenSeeFace, is already pretty impressive, especially considering it was built from the ground up by a single person.

“But both webcam-based tracking and iOS tracking have their issues. Webcam tracking is relatively resource-intensive and not as accurate as iOS tracking, while iOS tracking is very accurate and tracks more facial features, but users need an expensive iPhone or iPad to use it.”

However, this barrier to entry continues to fall thanks to a new collaboration between VTube Studio and NVIDIA. NVIDIA Broadcast’s new face tracking feature reduces the load on GPUs for VTubers who want to keep everything on their computer, and the Live2D program is one of the first to benefit.

It’s been “optimized to run most of the face-tracking AI code… on its high-performance tensor cores, which all RTX-series cards have” – the same hardware that helps make AAA games look silky smooth on PC, now put to work on face tracking.

Tracking also looks smoother without sacrificing much performance – in fact, Denchi claims it potentially outperforms what’s currently on the market.

“Performance impact will be minimal and tracking can run alongside even the most demanding games,” they continued. “NVIDIA face tracking accuracy is also extremely good, coming very close to, and maybe even surpassing, the quality of current iOS tracking.”

The feature will not only help VTube Studio, but any developer who wants to use face tracking on NVIDIA GPUs. It opens up a wealth of development opportunities in the VTubing space, which could lower the entry barrier even further.

It’s an area in which NVIDIA is also firmly trying to position itself. Gerardo Delgado Cabrera, product line manager at NVIDIA Studio working on the new broadcast capabilities, said it is part of long-term plans to “optimize” the VTubing space.

“As part of NVIDIA Studio, we work with all the top creative apps – and future ones too,” he told Dexerto. “And one of the hottest areas of development in live streaming is VTubing.

“We reached out to all the top VTubing apps months ago and started working with everyone to help them optimize their apps. In fact, improvements have already been shipped via NVIDIA Studio drivers to help with optimization and stability.”

NVIDIA Broadcast’s face tracking will go live in October, with an update coming to VTube Studio at the same time. This will benefit the roughly 30% of VTube Studio users who have RTX GPUs. The update will also be completely free for everyone, and the manufacturer is working with the VTubing community to continuously add new features and updates.

That includes a new tool in NVIDIA’s augmented reality software development kit called Face Expression Estimation, which “helps animate facial meshes that can convey better emotions,” Delgado said.

It represents a big leap for the technical side of VTubing, but at the end of the day it’s just a small part of the experience. There is still plenty of room for growth in what VTubing could become, and Denchi will continue to explore this with VTube Studio.

“I think tracking will certainly improve, but I also think it’s important to remember that tracking is just one aspect of VTubing. Personally, most of the VTubers I watch regularly have very basic tracking and often fairly basic models.

“At the end of the day, VTubers aren’t really that different from regular streamers. People watch VTubers because they like their personality and stream content. While a good tracking setup can help, nothing can replace a fun personality and interesting stream content.

“That’s what I want to focus on with VTube Studio. Most of the features I plan to add in the future are focused on improving viewer interaction and collaboration with other VTubers. This is what I personally enjoy the most and, in my opinion, also distinguishes VTubers from normal streamers.”