3D Artist | News & Features | May 23

Faceware Live 2.5 out now

by Carrie Mok

Real-time facial animation has just gotten a boost

Faceware Technologies has announced the immediate release of Faceware Live 2.5.

The provider of markerless 3D facial motion capture solutions has delivered a host of upgrades, the result of advancements made to its consumer-grade facial tracking technology through work with clients including L’Oreal and Nissan.

Faceware Live produces facial animation in real-time by automatically tracking a performer’s face and instantly applying that performance to a facial model. Faceware Live requires just a single camera for tracking purposes. That camera can be an onboard computer camera or webcam, the Faceware GoPro or Pro HD Headcam Systems, or any other video capture device.

With Live 2.5, use cases include live performances that incorporate digital characters, which can now be ‘puppeted’ in real-time to interact with live audiences. Digital characters can also interact in real-time on kiosk screens in theme parks and shopping malls.

Facial animation content can also be created instantly for previs, and users can stream their own personas into digital and virtual worlds, which is ideal for training applications as well as interactive chat functionality.


New features include:
• New and Improved Face Tracking Technology
You’ll now see improved tracking with more stability, better face detection, and less jitter in virtually all scenarios. In addition to better tracking and animation under normal conditions, you’ll also see better tracking with heavy facial hair, low or poor lighting, different skin tones, glasses, and more.

• New Animation Preview Characters
You can now select from a variety of Animation Preview characters in Live’s Preferences.

• Animation Tuning
You can now tune and customize Live’s real-time animation to your face using the new suite of tools in the Animation Tuning window.

• Animation Value Multipliers
You can now use multipliers in the Animation Tuning window to enhance and exaggerate animation so that animation quality and style can be tuned to a specific user’s face.

• Animation Tuning Profiles
You can now Save and Load profiles in the Animation Tuning window for different users so that each user can have their own ideal setup.

• Command-line Calibration
You can now trigger Calibration and toggle Streaming via command-line for improved automation in your setup.

• Simulate Controls For Improved Character Rig Integration
You can now have individual controls Simulate from 0 to 1 without needing the user to move their face, which helps with character setup and rig tuning.

• Stream UV Landmark Positions
You can now stream the UV positions of each Landmark on the face alongside the animation data (see the sketch after this list).

• Stream Head Position
You can now stream your head position along with rotation so you can move your character’s head. Additionally, Live now streams a 4×4 Transform Matrix for the head along with the Position and Rotation values (see the sketch after this list).

• Floating Point Precision
You can now adjust the floating point precision of animation values to keep packet size low when streaming over a network (see the example after this list).

• Toggle Streaming of Individual Face Groups
You can now toggle whether or not you want to stream individual Face Groups to keep network packet size down when certain Face Groups are not in use.

• Keyboard Shortcuts
You can now Calibrate, toggle Streaming, and open various windows via keyboard shortcuts.
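
For the UV Landmark streaming above, the article doesn't specify the wire format, so the snippet below is a minimal Python sketch under an assumed layout: a dictionary of per-landmark (u, v) pairs normalized to the 0–1 range, converted to pixel coordinates for a debug overlay. The landmark names and packet structure are hypothetical, not Faceware's documented API.

```python
# Hypothetical example: converting streamed UV landmark positions (normalized
# 0-1 coordinates, origin assumed at the top-left of the video frame) into
# pixel coordinates for a debug overlay. The packet layout is an assumption,
# not Faceware's documented format.

from typing import Dict, Tuple

def uv_to_pixels(landmarks_uv: Dict[str, Tuple[float, float]],
                 frame_width: int,
                 frame_height: int) -> Dict[str, Tuple[int, int]]:
    """Map each landmark's (u, v) pair onto the camera frame in pixels."""
    return {
        name: (int(round(u * frame_width)), int(round(v * frame_height)))
        for name, (u, v) in landmarks_uv.items()
    }

# Example packet with made-up landmark names and values.
packet = {"nose_tip": (0.51, 0.62), "left_eye_outer": (0.38, 0.41)}
print(uv_to_pixels(packet, frame_width=1280, frame_height=720))
```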
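
For the head position feature, the useful property of a 4×4 Transform Matrix is that rotation and translation travel together, so a client can transform head-local points in a single step. The sketch below assumes a column-vector convention with the translation in the last column; Faceware's actual matrix layout isn't stated in the article, so treat both the convention and the sample values as placeholders.

```python
# Hypothetical example: applying a streamed 4x4 head transform matrix to a
# point in the character's head-local space. The matrix layout (column-vector
# convention, translation in the last column) is an assumption; check how your
# engine and the stream actually define it.

import numpy as np

def apply_head_transform(matrix_4x4, local_point):
    """Transform a 3D point by a 4x4 matrix using homogeneous coordinates."""
    m = np.asarray(matrix_4x4, dtype=float).reshape(4, 4)
    p = np.append(np.asarray(local_point, dtype=float), 1.0)  # to homogeneous
    out = m @ p
    return out[:3] / out[3]  # back to 3D (w stays 1 for rigid transforms)

# Identity rotation plus a translation of (0.0, 1.6, 0.1), made up for
# illustration; a real packet would carry the tracked head pose instead.
head_matrix = [
    [1, 0, 0, 0.0],
    [0, 1, 0, 1.6],
    [0, 0, 1, 0.1],
    [0, 0, 0, 1.0],
]
print(apply_head_transform(head_matrix, [0.0, 0.0, 0.0]))  # -> head position
```

If an engine expects separate position and rotation values instead, the same matrix can be decomposed, which is presumably why Live streams both forms.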
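
The floating point precision option is about payload size. The toy example below rounds animation values before serializing them to show the effect; JSON stands in for whatever format Live actually streams, which the article doesn't specify.

```python
# Illustrative only: rounding animation values before serializing shows why a
# lower floating point precision keeps streamed packets smaller. JSON is used
# as a stand-in; the real Faceware Live stream format may differ.

import json
import random

random.seed(7)
values = {f"control_{i:02d}": random.random() for i in range(30)}

full = json.dumps(values)                                          # full precision
reduced = json.dumps({k: round(v, 3) for k, v in values.items()})  # 3 decimals

print(len(full.encode("utf-8")), "bytes at full precision")
print(len(reduced.encode("utf-8")), "bytes at 3 decimal places")
```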