Linden Lab Unveils Puppetry Project to Animate Avatars With Webcams & Mocap Equipment in Real Time

Impressive unveiling by Linden Lab just now — real-time puppetry for Second Life avatars:

We have been working on this feature for some time and now we are ready to open it up to the Second Life community for further development and to find out what amazing things our creators will do with this new technology… The codebase is alpha level and does contain its share of rough edges that need refinement; however, the project is functionally complete, and it is possible for the scripters and creators of Second Life to start trying it out.

This is not just a lightweight “wave into your webcam and your avatar waves too” technology (which has existed for many years) but is meant to integrate many more devices — and the whole avatar:


We are excited about Puppetry’s potential to change the way we interact inside Second Life. For example, using a webcam to track your face and hands could allow your avatar to mimic your facial expressions and finger movements; more natural positioning of the avatar’s hands and feet against in-world objects might also be possible. Alternative hardware, such as a game controller or mocap equipment, could also be used to feed animation data into Second Life. There’s a lot to explore and try, and we invite the Second Life community to be involved in exploring the direction of this feature.

What’s even more exciting is that Second Life avatars recently got a Bento skeleton update, which makes them extremely articulated. You can see that in recent SL pics featured by Cajsa, where an avatar’s fingers are expressive down to the individual joints.

Read about the announcement here:

Introducing Second Life Puppetry

Photo by Alexa Linden

The idea

Wouldn’t it be cool if you could animate your avatar in real time?  What if you could wave your arm and your avatar could mimic your motions?  Or imagine if your avatar could reach out and touch something inworld or perform animations?  Linden Lab is exploring these possibilities with an experimental feature called “Puppetry.”

We have been working on this feature for some time and now we are ready to open it up to the Second Life community for further development and to find out what amazing things our creators will do with this new technology.

The codebase is alpha level and does contain its share of rough edges that need refinement; however, the project is functionally complete, and it is possible for the scripters and creators of Second Life to start trying it out.

See the section below “How to participate” to learn how to use Puppetry yourself.

Take a Look

We have some basic things working with a webcam and Second Life, but there’s more to do before it’s as animated as we want.

Puppetry Technology

Puppetry accepts target transforms for avatar skeleton bones and uses inverse kinematics (IK) to place the connecting bones so that the specified bones reach their targets. For example, the position and orientation “goal” of the hand could be specified, and IK would be used to compute how the forearm, elbow, upper arm, and shoulder should be positioned to achieve it. The IK calculation can be tricky to get right and is a work in progress.
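
If the geometry is unfamiliar, the heart of a two-bone IK solve (shoulder, elbow, hand) is just the law of cosines. Here is a minimal planar sketch in Python; it illustrates the technique, not the viewer’s actual solver, and the bone lengths in the example are made up:

```python
import math

def two_bone_ik(target_x, target_y, upper_len, fore_len):
    """Aim a two-bone chain (shoulder at the origin) so the hand reaches
    the target, using the classic law-of-cosines triangle solve."""
    # Clamp the target distance to the arm's reachable range.
    dist = math.hypot(target_x, target_y)
    dist = max(abs(upper_len - fore_len), min(upper_len + fore_len, dist))
    dist = max(dist, 1e-6)  # guard the degenerate all-joints-coincident case

    # Interior angle at the elbow, from the law of cosines.
    cos_elbow = (upper_len**2 + fore_len**2 - dist**2) / (2 * upper_len * fore_len)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))

    # Shoulder: aim straight at the target, then swing back by the
    # triangle's shoulder angle (this picks one of the two elbow poses).
    cos_swing = (upper_len**2 + dist**2 - fore_len**2) / (2 * upper_len * dist)
    swing = math.acos(max(-1.0, min(1.0, cos_swing)))
    shoulder = math.atan2(target_y, target_x) - swing

    # Absolute shoulder rotation and relative elbow bend, in radians.
    return shoulder, math.pi - elbow

# Example: a 0.3 m upper arm and 0.25 m forearm reaching for (0.4, 0.2).
print(two_bone_ik(0.4, 0.2, 0.3, 0.25))
```

The real problem is harder (full 3D, many chains, joint limits, and a choice among multiple valid poses), which is why the announcement calls the IK calculation a work in progress.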

The target data is supplied by a plug-in that runs as a separate process and communicates with the viewer through the LLSD Event API Plug-in (LEAP) system. This is a lesser-known piece of Viewer functionality that has been around for a while but has, until now, only been used for automated testing and update purposes.
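
For the curious, the LEAP wire format itself is simple: the plug-in and the viewer exchange length-prefixed LLSD notation frames over the plug-in’s stdin and stdout. Below is a rough sketch of that framing, assuming the llsd package from PyPI; a real Puppetry plug-in also has to handle the introductory frame the viewer sends to name its event pumps, and the actual payload shapes are documented on the Puppetry Development page, not here:

```python
import sys
import llsd  # assumption: the standalone `llsd` serialization package

def send(pump, data):
    """Write one LEAP frame to stdout: '<byte length>:' + LLSD notation."""
    body = llsd.format_notation({'pump': pump, 'data': data})
    sys.stdout.buffer.write(str(len(body)).encode() + b':' + body)
    sys.stdout.buffer.flush()

def receive():
    """Read one length-prefixed LLSD-notation frame from stdin."""
    length = b''
    while (ch := sys.stdin.buffer.read(1)) != b':':
        if not ch:  # EOF: the viewer closed the pipe
            raise EOFError('viewer disconnected')
        length += ch  # accumulate digits up to the ':' separator
    return llsd.parse_notation(sys.stdin.buffer.read(int(length)))
```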

The Viewer transmits the Puppetry data to the region server, which broadcasts it to other Puppetry-capable Viewers nearby. The receiving Viewers use the same IK calculations to animate the avatars in view.

For more details about the Puppetry technology, take a look at the Knowledge Base article Puppetry: How it Works.

Uses and Possibilities

We are excited about Puppetry’s potential to change the way we interact inside Second Life. For example, using a webcam to track your face and hands could allow your avatar to mimic your facial expressions and finger movements; more natural positioning of the avatar’s hands and feet against in-world objects might also be possible. Alternative hardware, such as a game controller or mocap equipment, could also be used to feed animation data into Second Life. There’s a lot to explore and try, and we invite the Second Life community to be involved in exploring the direction of this feature.

How to participate

The Puppetry feature requires a project viewer and can only be used on supporting Regions. Download the project Viewer from the Alternate Viewers page. Regions with Puppetry support exist on the Second Life Preview Grid and are named Bunraku, Marionette, and Castelet.

When using the Puppetry Viewer in one of those regions, if someone there is sending Puppetry data, you should see their avatar animated accordingly. Controlling your own avatar with Puppetry takes a bit more setup: you need a working Python 3 installation, a plug-in script to run, and any Python modules it requires. If you are interested and adventurous, please give it a try. More detailed instructions can be found on the Puppetry Development page.

What’s next

We look forward to seeing what our creators do with the new Puppetry technology. Compared to other features we have introduced, it’s quite experimental and rough around the edges, so please be patient!  We will keep refining it, but before we go further we wanted to get our residents’ thoughts.

We will be hosting an open discussion inworld on Thursday, Sept 8 at 1:00 PM SLT at the Bunraku, Marionette, and Castelet regions on the Preview Grid. We’re also happy to talk about this at the upcoming Server User Group or Content Creator meetings. Come by, let us know what you think, and hear about our future plans!

Then compare and contrast with VRChat’s recently launched Avatar Dynamics project.

My immediate guess is that most Second Life users won’t use avatar puppetry on most occasions — after all, gesticulating into your webcam or in a mocap suit quickly gets exhausting — but it will still be a huge breakthrough for live performers at music/dance/theater shows, along with conference presenters. And yes, for that other use case you thought of first.

Have a great week from all of us at Zoha Islands/Fruit Islands