News

Hand Engine 1.2.0 Released

Enhanced features including OptiTrack integration in Hand Engine 1.2.0

Announcing Hand Engine 1.2.0

In this new release, we’ve focused our attention on developing features that you’ve told us will have the highest impact on you.

And, of course, if you have some feedback we are always keen to hear it: post a comment!

Key features of this new release

Express calibration improved

Save time with improved Express calibration

We’ve updated the machine learning model, making thumb and splay detection even more accurate than before, using a calibration that can be done in seconds.

Expanded support for optical systems

Better support for optical solutions

We now synchronize with OptiTrack Motive timecode, and we’ve made it easier to use our solution with Vicon Shogun without using Python.

Automated jam sync

Less fiddly jam syncing!

No more unplugging cables: Hand Engine will manage jam syncing for you.

Use Xsens MVN across your network

Improve stability with CPU load spread across your network

We’ve added the ability to send Hand Engine data over your network to a computer running Xsens MVN Animate Pro or Plus, reducing the load on a single machine.
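
For context, the sketch below illustrates the general pattern of streaming capture frames over UDP to a configured IP address, which is how the release notes describe the feature. The port number and payload are placeholders, not Hand Engine's actual wire protocol.

```python
# General pattern only: send frames over UDP to the machine running MVN.
# The port and payload format are placeholders, not Hand Engine's protocol.
import socket

MVN_HOST = "192.168.1.42"   # IP address of the MVN machine, as set in Hand Engine
MVN_PORT = 9763             # placeholder port number

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
frame = b"example-frame-payload"            # stand-in for one capture frame
sock.sendto(frame, (MVN_HOST, MVN_PORT))    # UDP is connectionless, fire-and-forget
sock.close()
```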

Adjustable FBX frame rates

Adjustable framerates when recording to FBX

We’ve added the ability to record FBX files using industry-standard frame rates to match your target platform.

IMPORTANT: you must update Glove and Dongle firmware to the latest versions in order to use Hand Engine 1.2. You can find the latest installers and firmware updates on your account dashboard.

Detailed release notes

Major improvements and features

  • Streaming Hand Engine data over the network to a computer running MVN Animate Pro or Plus (via UDP). Users can set the IP address of the computer running MVN in Hand Engine.
  • New Express calibration – new model with improved performance, especially in regard to thumb and splay movement. 
  • Simplified and expanded timecode synchronization for optical systems (Vicon Shogun and OptiTrack Motive).
  • Improved timecode synchronization via wireless jam sync. Gloves can be synced from within Hand Engine. Reduces drift and the need for mechanical devices like UltraSync One to be plugged into the glove.
  • FBX recording changes: adjustable frame rates that conform with industry standards.

Minor changes

  • Remapping: ability to set binding pose of target asset to L-Pose or Paddle Pose. Previously target asset had to be in L-Pose.
  • Keys are now aligned with frames (bug fix).

Stability fixes

  • Crash when triggering multiple devices resolved (bug fix).
  • How to Guides link in Hand Engine updated to new Knowledge Base.

Known issues

  • Loading a session doesn’t load user changes to the Xsens Streaming IP Address or Remapping settings.
  • UI: The Multicast/Unicast dropdown menu is misaligned on different-sized monitors.
  • When loading a saved session, the captured poses for the loaded profile are missing. They reappear if you switch between profiles.

Got an idea?

What features would you like to see added to our product Roadmap? Let us know in the comments.

Indigo 2021 banner.

StretchSense at Indigo 2021

We’re Looking Forward to Seeing You

We’re excited to announce that StretchSense will be attending INDIGO 2021, the Netherlands’ premier game industry event, organised each year by Dutch Game Garden.

Run as an online event on 25 June, the programme features talks by key industry representatives from companies such as Epic Games, Sega, Guerrilla Games and more.

Meet with Sarah

Our motion capture expert Sarah Thompson is looking forward to connecting with publishers, service providers, and game companies on the international matchmaking platform MeetToMatch.

Sarah will demonstrate the benefits of stretch sensor hand motion capture and will show you why it is the best solution for your pipeline. To book a meeting with Sarah, sign in to MeetToMatch now.

Sarah Thompson from StretchSense.
Sarah Thompson from StretchSense helps gaming, VR and motion capture studios get exceptional hand mocap results.
StretchSense Xsens integration banner.

StretchSense native integration in Xsens MVN

In December 2020, Xsens released MVN 2020.2. For StretchSense customers, this was a significant development because it was the first time StretchSense MoCap Pro gloves could be captured inside either MVN Animate or MVN Analyze.

This integration provides the link between premium body and hand motion capture that studios have been asking for. You no longer need to capture hand and body data separately and combine it later in a third-party application. Instead, do it all at once inside MVN.

Crucially, because StretchSense integration is native within MVN software, all the features of our Hand Engine software are available. That includes Pose Detection (snapping to pre-defined, key artistic poses), Blend Mode (smooth animation between poses), and Hybrid Mode (smooth transitioning between saved poses). Basically, if it’s something Hand Engine can do, you can use it in MVN software.

To get the lowdown on how to take advantage of the StretchSense native integration in MVN, Katie Jo Turk of Xsens joined StretchSense Motion Capture Solutions Architect CJ Markham earlier this year for an interactive webinar. 

Katie Jo was wearing the full-body MVN link suit with StretchSense gloves, which are equipped with velcro on the top of the hand to accommodate the suit’s hand tracker.

A few clicks and you’re done

The beauty of native integration is that the MVN system is tailored to use our solution as the finger tracking hardware. All that is required are a few simple configuration steps in Hand Engine and MVN, and you can start streaming in a matter of minutes.

This means that studios using Xsens for body motion capture can easily slot our gloves into their pipeline. To make this easy we’ve put together this tutorial on our Knowledge Base.

Poor pizza guy!

To demonstrate native integration, Katie Jo and CJ role-played a familiar scenario — the fight at the front door of your house with the pizza delivery guy. 

Okay, we hope this isn’t a normal occurrence but it made for a great motion capture demo.

What to do with your data

You’ve set up our gloves and Hand Engine in MVN and finished recording your take. Now what? 

MVN has several export options, so there’s an output file type for every project. For mocap specialists and animators, you can export to FBX and BVH files, the latter being popular with MVN users who retarget to character assets in Blender.

You can then import your file into any 3D animation software. Popular choices include Maya, Unreal Engine, Unity, MotionBuilder, Cinema 4D, Houdini, and Blender.
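
As a quick illustration of that last step, here is a minimal sketch of importing a BVH take into Blender with its Python API. The file path is a placeholder, and FBX imports follow the same pattern via bpy.ops.import_scene.fbx.

```python
# Minimal Blender (bpy) sketch: import a BVH take and inspect the result.
# Run from Blender's scripting workspace; the file path is a placeholder.
import bpy

bpy.ops.import_anim.bvh(filepath="/path/to/take_001.bvh")

armature = bpy.context.object               # the importer selects the new armature
action = armature.animation_data.action     # keyframes are stored on this action
print("{} curves, frames {}-{}".format(
    len(action.fcurves),
    int(action.frame_range[0]),
    int(action.frame_range[1]),
))
```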

High fidelity finger tracking 

How does the hand motion capture stack up? The animation curve below shows the data points from the pizza-guy fight take that CJ and Katie Jo produced for the webinar.

The top curve, which shows the motion data for the right hand, flattens when Katie Jo makes a fist pose. Pose detection was used here to ensure stable data and no noise. Typically, achieving a solid pose like this would require cleanup in post-production. Art direction can instead be done in pre-production, an ethos that has been affectionately labeled “fix it in pre”.

Xsens MVN Animate StretchSense integration.

Learn more

If you’re interested in learning more about integrating the StretchSense hand motion capture solution into your Xsens pipeline, you can contact us or request a trial.

More info can be found on our Xsens MVN integration page.

OnPoint Studios CARGO setup in Unreal Engine.

Cargo, the live virtual performance featuring StretchSense gloves

OnPoint Studios in Berlin, Germany are developing a project to fuse motion capture technologies, including our hand motion capture solution, with Unreal Engine to deliver a new generation of theatre.

CARGO is a virtual production showcase OnPoint is using to improve the methods and workflows they use in their client projects.

The 5-minute CG short story, which will be live-streamed later this year, is also a way for the studio to highlight their expertise.

The story revolves around Ally — a space zoologist traversing the galaxy cataloging and experimenting with unknown species.

Ally is fully performance captured in a live Unreal Engine scene. This enabled the team to treat the performer as a game character, employing colliders and dynamic events as she interacts with the virtual world.

Head of Studio at OnPoint, Niklas Bothe, says they aren’t trying to create a new medium.

“It’s more of a vehicle to drive progress and explore new ways of creating content; to make it possible for creatives on set to be immersed in their own worlds that they crafted and to be able to produce content in this way.”

CARGO was featured on the 26 February 2021 edition of Inside Unreal — a YouTube series by Unreal Engine to showcase its creators. 

Epic Games MetaHumans in Unreal Engine.

Using StretchSense gloves to power a MetaHuman

Epic’s MetaHuman Creator allows you to create a bespoke photorealistic digital human, fully rigged and complete with hair and clothing, in a matter of minutes. 

The great news is, it takes just a few more minutes to map our MoCap Pro gloves to your new MetaHuman character in Unreal Engine.

To do that, follow these steps, outlined in the video below.

  1. Ensure you are running the latest video card drivers on your machine
  2. Put on the gloves and open Hand Engine. Ensure they are streaming into Hand Engine
  3. Open the remapping tool in Hand Engine and scan in the MetaHuman FBX (available in the Downloads section of your StretchSense account) to create the mapping table and ensure remapping is accurate
  4. Open Unreal Engine
  5. Find the MoCap Pro Live Link plugin for Unreal and install it (if you haven’t already). The plugin is available in the Downloads section of your StretchSense account and is also packaged with the Hand Engine installer
  6. Find the Blueprint for the character skeleton. Search: Skeletal Mesh
  7. Open the Blueprint, and locate the hands you want to connect to the MetaHuman. Disconnect and bypass using Live Link Pose. You’ll need to connect Live Link to the previous asset (relevant lower arm) and subsequent asset (relevant leg)
  8. Select Live Link connection from the Subject Name drop-down menu

The final step is to revel in the fact that creating your MetaHuman character and mapping our gloves to it took less time than it would to go out and get your lunch!

Request a glove trial.

Request a trial package

Now you can try our premium hand motion capture solution before you buy.

The introduction of trial periods has already been warmly received by animators and motion capture professionals who want to see how our MoCap Pro glove and Hand Engine software can streamline their animation pipeline.

Borrowing our gloves means you can test integrations with platforms that you use in your studio. It also means you can check how the glove fits as well as performance and accuracy.

What you get with a trial package

Our trial package consists of a single MoCap Pro glove and a short-term license for Hand Engine Animator, and includes support from our motion capture consultants.

We will help you set up the glove and software and take you through everything you need to know to get the best out of your experience.

You can take advantage of the trial package if you reside in the USA, Canada, the UK, Europe, Australia or New Zealand.

How to secure a trial

Fill in the Request a Trial form. Our mocap experts will be in touch within two business days.

Guitar playing motion capture featuring Xsens and MoCap Pro gloves.

How to Remap Live Guitar to a Character Asset – MoCap Pro Gloves

Recently, we produced a video showing a live motion capture guitar performance using MoCap Pro gloves. Our team produced this demo in one take with no clean up, so we thought we’d share the process that was used to create it.

The demo is a good example of the raw quality you can expect when using our gloves. You’ll see that the finger articulation is excellent – particularly during the strumming – but we need practice with asset clipping!

We made this video using Maya to edit the Renderpeople asset acquired from TurboSquid, then exported it as FBX for use in our brand new software application, Hand Engine.
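
For anyone scripting a similar clean-up and export step, the sketch below shows a minimal Maya Python version using the standard FBX plug-in commands. The node name and output path are placeholders.

```python
# Minimal Maya sketch: export a cleaned-up character hierarchy to FBX.
# The character root name and output path are placeholders.
import maya.cmds as cmds
import maya.mel as mel

# Make sure the FBX plug-in is loaded before calling the FBX MEL commands.
cmds.loadPlugin("fbxmaya", quiet=True)

# Select the character root and everything below it.
cmds.select("character_root", hierarchy=True)

# Export only the current selection (-s) to the given file (-f).
mel.eval('FBXExport -f "C:/mocap/assets/character.fbx" -s')
```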

We calibrated the gloves using Hand Engine’s “Group Calibration” function, which allows you to calibrate multiple gloves at once, and “Auto-capture”, which cycles through the training poses automatically.

It took less than four minutes to train our MoCap Pro gloves for this demonstration after putting them on our performer, Josh.

Next, we remapped onto our character using our saved profiles.

Finally, in MotionBuilder we connected to Hand Engine for our gloves and to MVN Animate for the Xsens body capture suit, and began streaming live motion from our guitarist. Since we made this video, MVN Animate has been fully integrated with our gloves, meaning you can manage both the gloves and the suit from within MVN.
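
As a small reference for this step, the devices available in MotionBuilder can be checked from its Python console using the standard pyfbsdk API. The device names you see will depend on the plug-ins installed; this is a minimal sketch, not our full setup.

```python
# Minimal MotionBuilder (pyfbsdk) sketch: list the streaming devices in the
# scene and whether they are online before starting a live take.
from pyfbsdk import FBSystem

for device in FBSystem().Scene.Devices:
    status = "online" if device.Online else "offline"
    print("{}: {}".format(device.Name, status))
```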

See our behind the scenes video below for a demonstration of the process.

Webinar Xsens StretchSense MotionBuilder.

Xsens and StretchSense – finger and body mocap in MotionBuilder webinar

It’s the middle of 2020 and many of us are locked down in an effort to stop the spread of Covid-19.

We figured this is the right time to deliver our first webinar: Xsens StretchSense finger and body mocap in MotionBuilder.

To make it all work, people were tasked with performing separate production duties in different locations.

The distributed team had to use technology that didn’t exist a few years ago in order to make live performance capture possible.

We had CJ Markham operating MotionBuilder from his garage in Seattle, Katie Jo Turk operating the Xsens suit from her living room in Los Angeles, and Todd Gisby using the MoCap Pro glove in New Zealand. This demo was performed simultaneously from distant locations.

The Covid-19 lockdowns have taught us to reevaluate how to utilize tech to keep production going and ensure that those of us in the motion capture industry can keep our jobs.

“The show must go on” as they say.

Katie Jo was able to stream live data from LA to CJ in Seattle from her Xsens link suit with the help of Xsens MVN Animate software and a VPN.

The data could then be assembled in MotionBuilder along with MoCap Pro glove data from Todd in New Zealand. Somehow it all worked!

Curating key poses before the shoot

A key focus in the webinar was the utility of Pose Detection Mode. With this functionality, the senior animator or lead animation director can curate the poses they want to see in the final animation before the capture session. When the performer approximates their hand position to one of the pre-defined poses, Pose Detection Mode will transition the character’s fingers to that pose.

This eliminates the work often needed in post to clean up the animation. This is powerful for animators because it could be a huge time and money saver.

Blend Mode allows for more nuanced and novel gestural animation. This is the performance that people look for in live applications, cinematics, and pre-visualization.
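
For a feel of the two concepts, the sketch below is an illustration only, not Hand Engine's implementation: pose detection snaps the live hand to the nearest curated pose when it gets close enough, while blending simply interpolates between poses. The distance threshold is an arbitrary assumption.

```python
# Illustrative sketch of pose detection and blending, not Hand Engine's code.
# A "pose" is a NumPy vector of joint angles.
import numpy as np

def detect_pose(live_pose, key_poses, threshold=0.15):
    """Return the name of the closest curated pose, or None if nothing is near."""
    names = list(key_poses)
    distances = [np.linalg.norm(live_pose - key_poses[name]) for name in names]
    best = int(np.argmin(distances))
    return names[best] if distances[best] < threshold else None

def blend(pose_a, pose_b, t):
    """Linearly blend between two poses, with t in [0, 1]."""
    return (1.0 - t) * pose_a + t * pose_b
```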

Full-body performance with no need for recalibration

One of the key points Katie Jo and the StretchSense team emphasized in the webinar was the fact that a vigorous performance shoot didn’t disturb the calibration of either the Xsens suit or the MoCap Pro gloves.

To illustrate this, Katie Jo performed the UCLA Bruin 8 Clap, which all true fans of the university know instinctively (we’re told). In fact, the only thing that couldn’t stand up to the 8 clap test was the Zoom app. 

Maintaining calibration is key on set when often there simply isn’t time to keep going back and fixing calibration issues.

StretchSense gloves can achieve consistent calibration because they are designed to fit snugly and comfortably on the hands while the stretch sensors are robust and do not drift.

Handling props

Where pose detection really shines is in prop interaction. You can capture specific poses for props so those hand poses stay solid when the performer is handling those props. Again, this reduces the amount of post-production work for animators.

Animators are sometimes given optical finger data that isn’t captured quite well enough to use, which presents a headache for the person having to clean it up. Reduced post-work thanks to pose detection is great for in-game animation as well as dynamic and cinematic work.

Another application where pose detection is an advantage is live VR/AR where fidelity is important to the final product.

Hand position tracking

MoCap Pro gloves can be purchased with an optional IMU (inertial measurement unit) to track rotational data for the wrist. VIVE trackers can also be used to track the position and rotation of the hand.

The glove comes with a velcro strip on the back so the inertial hand tracker from the Xsens body suit can be attached easily.

More information

The webinar is packed with information specific to the MoCap Pro glove and Xsens; it also contains a wealth of advanced mocap tips and power moves that are too extensive to go into here. Watch the webinar to learn more.

Sean Firman, an up and coming indie game developer, created a VR game demo shooting fireballs from hands

The Growing Role of Hand Mocap in VR Gaming and Films

Imagine waking up tomorrow and having to navigate your day with no use of your hands. How limited would you feel not being able to get dressed, pick up your lunch, greet co-workers, or play sport?

It’s no wonder then that films and games, designed to bring realism to make-believe fantasy scenes, are working towards artfully capturing an actor’s body language, of which hand gestures are key.

The movie industry smashed a record $40 billion in global revenue in 2018, and at $100 billion, the gaming industry is even bigger than movies and music combined. Backed with mega-budgets, studios are stepping up to the challenge of awing audiences with bigger blockbuster hits year after year. Audiences are regularly seeing advanced CGI, adrenaline-inducing stunts, and sophisticated animation and have growing expectations of escapism from directors and game studios.

Delivering more immersive experiences hinges on the ability of studios to capture the subtleties of hand motion for richer content creation. Bringing computer-generated characters to life is an arduous task; a film crew will use cameras and body markers to capture an actor’s performance on a mocap stage and then apply it to digital characters in post-processing. What audiences might not appreciate, though, is how limitations in camera technology are forcing animators to painstakingly hand-draw missing frames — all to deliver a few moments of perfection on the big screen.

The holy grail for studios is the ability to live stream hand, facial and full body motion data — from a large group of actors — with low latency. Because an actor’s mannerisms in their voice and movement all run together to form a character, it is highly advantageous to capture the full performance in one shot – this saves post-production time and costs and captures the authentic performance from the actors.

Image credit: GREE VR Studio Lab adding hand tracking to a full body suit for a Virtual YouTube showcase at SIGGRAPH Asia.

Capturing hand motion is even more critical in VR gaming than in films, as there is an inherently higher expectation of immersion. If you were shaking hands with someone in an interactive VR game and their hand suddenly slumped in yours like cold, limbless rubber, you would recoil. Research shows that, unlike with facial expressions, the closer a virtual hand model is to a real one, the more we like it.

Image credit: Sean Firman, an up and coming indie game developer, created a VR game demo shooting fireballs from hands.

An innovative alternative to this time-consuming and expensive method is to use motion capture gloves. Recent developments from smart glove manufacturers show promise of serving the underserved hand departments of studios. Mocap gloves foremost need to be designed with actor comfort in mind – they need to be lightweight so as not to hinder an actor’s performance. The gloves also need to help technicians reduce production downtime by being quick to set up, providing accurate motion data and running for a full day’s shoot without the need to recharge.
 
Using stretch sensors that deform with finger bend and spread is an attractive solution thanks to their thin form factor as well as their demonstrated stability after calibration. StretchSense believes that smart gloves are the seed of a whole new XR industry in the next decade. The movie and gaming industries are paving the way for broader opportunities in developing gesture-based input peripherals for medical, military and workplace productivity applications.
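
As an illustration only (not StretchSense's actual calibration), the basic idea can be reduced to a per-sensor mapping from raw stretch readings to joint angles, anchored by two calibration poses such as an open hand and a fist. The numbers below are made up.

```python
# Illustrative only: map a raw stretch-sensor reading to a bend angle using a
# simple two-point calibration (open hand and fist). The sensors' stability is
# what lets a mapping like this hold over a long shoot.
def calibrate(raw_open, raw_fist):
    """Return a function mapping a raw reading to a 0-90 degree bend angle."""
    span = raw_fist - raw_open
    def to_angle(raw):
        t = (raw - raw_open) / span    # normalised stretch: 0 = open, 1 = fist
        t = min(max(t, 0.0), 1.0)      # clamp to the calibrated range
        return t * 90.0                # scale to an assumed 90-degree range
    return to_angle

# Example: calibrate with readings from the two poses, then convert live data.
index_bend = calibrate(raw_open=120.0, raw_fist=480.0)
print(index_bend(300.0))               # roughly a half-bent finger
```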

References:

  1. 2018 Worldwide Box Office Hits Record as Disney Dominates
  2. Prosthetic Hands Trigger Uncanny Valley Sense of Creepiness
  3. Can Looking at a Hand Make Your Skin Crawl? Peering into the Uncanny Valley for Hands
  4. Valve’s VR Index Controllers Include 87 Sensors to Track Your Fingers and Hands
  5. Robots: Is the Uncanny Valley Real?

Stretch sensors on models’ hands to trigger LED lights as they walked down the runway.

Sensors and Fashion Come Together at NYFW

Stretch sensors on models’ hands to trigger LED lights as they walked down the runway.

I just got home from New York Fashion Week (NYFW) where I met with a whole bunch of interesting people including coders, fashion designers, photographers and engineers. It was an amazing experience, but you might be asking what this has to do with StretchSense? Well, we were on the runway!

The NYFW project was our second collaboration with Intel and our first with Chromat, a fashion label focusing on structural experiments for the human body. Becca McCharen, the CEO of Chromat who has designed for high-profile celebrities including Beyonce, Madonna and Taylor Swift, has made our sensors a fashion accessory.

We first met up with Becca through the Intel team at CES. Becca and Intel had the idea of using stretch sensors at NYFW so that models could trigger lights on their dresses to express themselves through subtle movements of their hands. Becca loved the minimalist look of our sensors and decided she wanted them exposed on the models’ hands rather than hidden inside a glove.

Once the idea was conceived, we immediately got to work outlining the major project milestones and specs — we only had a month to do it! Becca did a quick sketch on a napkin before we all dispersed back to our home cities. Chromat was to make the dress, Intel to tie it all together with their Curie platform, and StretchSense to provide the sensors.

Napkin sketch where our collaboration was born.

Once we got home, we got to work right away. Making the sensors minimalist was a new challenge, and our engineers had to remove as much material as possible while keeping a sensor that was robust, stayed on the fingers and still got a signal.

The tricky part was that we didn’t know what size sensors we needed because the casting wasn’t going to take place until the weekend before the show. In the end, we shipped “open” gloves and lots of spare material so that the Intel team and Chromat could size them on the fly.

Prototype silicone sensor system we delivered for the fashion show.

On February 5th I packed my bags for New York to meet Becca at the casting. It was exciting to see how the fashion industry works; my first experience was running into a model who asked if I thought she should walk the casting with or without heels (my recommendation was with, although she looked to be 7 feet tall — I hope it worked out for her!).

Once inside there were queues of girls waiting on the steps for their turn to model Chromat’s garments. Things were coming together and I couldn’t wait for the Monday photoshoot, which took place between 6pm and 1am. Turns out a lot of work in fashion happens late at night, which suited me fine as a night owl 🙂

One of the first shots of the models wearing stretch sensors. Subtle movements of the hands would trigger the LED strips on her dress to light up.

The Monday shoot was limited to 10-12 people to keep the work efficient, and aside from the models, most of us wore black to reduce unwanted light bouncing into the shoot. At the photoshoot, the Intel team smoothed out the final technical hurdles and we saw the whole system working for the first time: dress, stretch sensors and Curie. The models loved it, the whole team was stoked and the photos looked amazing! Thanks to Tayler Smith for the wonderful images 🙂


Alisar, Erin E and Sabina Karlsson checking out the stretch sensors during the shoot.
Late night team photo at 1am!

Fast forward to Friday, and I found myself at the Chromat AW16 Lumina fashion show at Milk Studios in the Meatpacking District. After getting through layers of security checkpoints, I made it backstage with the Intel team and photographers. This time the team was even bigger — if the Monday shoot had 10 people on site, the fashion show looked like it involved over 200 people. More models, stylists, reporters, photographers, engineers, event managers and guests, all in black of course.

At 4:45pm, with 15 minutes left until the show, everyone was hyped and busy, still working on the sensors and the models with only minutes to go!

Model getting her eye make up pre-show.

Hair and makeup.

Some last minute inspiration for the models posted at the exit of the runway.

CHROMAT AW16 | LUMINA RUNWAY from CHROMAT on Vimeo.