Mocap with mirrors

(AKA How I Fell in Love with 3DEqualizer, and the Mystery behind the ‘Mirror Points’ Checkbox)

First post! Yay!

So as you may or may not know, I’m currently working on a short sketch called ‘Isn’t It Nice?’, which involves me, my old iMac and facial motion capture. Don’t ask.

It had been in my head for a while, and it was shot in November, after my first demo reel. Shortly after that I started focusing on job hunting, realised that I wanted to move to London, then Autodesk announced they weren’t developing Softimage anymore, and I started learning Maya and Houdini, already in London, while trying to find both a job and a place to live there… Yay?

I didn’t need to do any mocap at all, but sometimes I like to experiment, so I covered my face with Blu-Tack and shot it between two large mirrors, thinking I was about to bring mocap to the masses… Meh. Turns out it had already been done.

After some frustrated attempts with NukeX’s CameraTracker, in whose defence I have to say that it isn’t intended for mocap, I started learning 3DEqualizer. Because learning two 3D packages at the same time wasn’t enough. Jokes aside, my first job as a 3D artist in a VFX house will most likely be as a matchmover, and nearly every big house uses it, so it was well worth learning. It took me about a week.

I fell in love with the software.

So these days I’ve been learning #3DEqualizer, and now I can safely say WOW. The more I use it the more I love it, now I…

Posted by Unai Martínez Barredo on Saturday, June 28, 2014

The Blu-Tack wasn’t a wise choice though, as its brightness is too close to that of my skin, so I had to manually animate a lot of points. Quick tip: if you have animated a point up to a frame where it sticks and want to track it automatically from then on, make sure all the previous splined frames are marked as tracked, so 3DE doesn’t overwrite your animation! (under Manual Tracking Controls: Edit > Set Curve > Tracked, then Delete Curve > Until End, then the End Point button).
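If you’d rather do that bookkeeping in Python, here’s a rough sketch of the same idea for the selected points of the current camera. Be warned that getPointPosition2D, setPointStatus2D and the "POINT_TRACKED" status string are assumptions on my part (check the 3DE4 Python reference before trusting it), and the last-frame number is just a placeholder:

# Rough sketch: re-mark every already-animated frame of the selected points
# as tracked, up to and including uLast, so 3DE won't overwrite them when
# automatic tracking resumes. setPointStatus2D / getPointPosition2D and the
# "POINT_TRACKED" string are assumed from the Python reference, not verified.
uGrp=tde4.getCurrentPGroup()
uCam=tde4.getCurrentCamera()
uLast=50 # hypothetical: the last frame you animated by hand
for uPnt in tde4.getPointList(uGrp,True): # True = selected points only
  for uFrm in range(1,uLast+1):
    if tde4.getPointPosition2D(uGrp,uPnt,uCam,uFrm)[0]!=-1: # frame has data
      tde4.setPointStatus2D(uGrp,uPnt,uCam,uFrm,"POINT_TRACKED")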

I tracked the mirrored points as they were shot, because 3DE has a Mirror Points checkbox under the camera’s Synchronization attributes. But every time I calc’d, the result had one camera facing my head properly, while the other two (the ones representing the points as seen through the mirrors, which had the checkbox on) ended up behind my head. That was strange, and the points looked far too flat and moved wrong. I tried turning off the checkbox for all cameras and the result was the same.

So I decided, even though I hadn’t bought a licence, to send a support request through their website. Rolf Schneider himself (founder and owner of Science-D-Visions, the company behind 3DE) answered me less than a day later. Now that’s customer support.

The news wasn’t good though: that checkbox is in fact dead, and the only workaround would be to flip the footage to track the mirrored points. But as stated above, tracking them had already been quite painful, so, as I had access to the full version of the software through a friend, I decided to learn Python and make a point-mirroring script. Because learning two 3D packages and a matchmoving/mocap suite wasn’t enough. Jokes aside (again), I already knew JavaScript, and the script itself is nothing fancy:

# Flip the selected points' 2D tracking curves horizontally for the current
# camera. Coordinates are normalised 0-1, so mirroring is just x -> 1-x;
# frames with no tracking data are stored as -1 and stay that way.
uGrp=tde4.getCurrentPGroup()
uCam=tde4.getCurrentCamera()
for uPnt in tde4.getPointList(uGrp,True): # True = selected points only
  uOut=[]
  for uKey in tde4.getPointPosition2DBlock(uGrp,uPnt,uCam,1,tde4.getCameraNoFrames(uCam)):
    if uKey[0]!=-1:
      uVal=1-uKey[0] # mirror the x coordinate
    else:
      uVal=-1 # keep untracked frames untracked
    uOut.append([uVal,uKey[1]])
  tde4.setPointPosition2DBlock(uGrp,uPnt,uCam,1,uOut)
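In case you want to try it: select the mirrored points, make the matching mirror camera the current one, and run it from the Python console (or wherever you normally run scripts in 3DE). It only touches the selected points, so the straight-on camera’s tracks stay untouched.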

Credit goes to Rolf as well, as he pointed me to the best way to get the start and end frames of the sequence (the getCameraNoFrames bit). So yeah, even after I shared my script he stayed involved with the issue and improved on the solution. Then they gave me the chance to try the full version for two weeks.

I’ve fallen in love with the company.
