Dataton Forum
mitreklov

3d mapping on a rotating object

Recommended Posts

Has anyone successfully projection mapped a 3D object with a minimum of two projectors, then rotated the real-world object and linked that rotation to the 3D model's rotation tween layer in Watchout, while keeping the mapping calibration?

Would this work?

 

A typical example would be a car on a revolve, 3D projection mapped from two projectors, with the revolve turning slowly. The encoded positional data of the revolve would be linked to the rotation tween track for the car model in WO, so the model mimics the position of the actual car. Would the initial projector calibration stay valid through a full 360-degree rotation? How would the masking work?

 

I've seen this done with other media servers, but budget demands we find an affordable solution, hence the question of whether WO will do it.

 

Any insight would be appreciated.

 

Thanks

 

 


I recall Dylan at Penmac in South Africa did something like this a while back. Hopefully he'll see this. If not, perhaps someone at Dataton can put you in contact (or send them an email).

 

Mike


Hey, the position calibration of the 3D projector (virtual camera) in WATCHOUT will remain constant; you just need to sync the virtual car's rotation with the real one. Best would be to slave both WATCHOUT and the rotating platform to timecode. The other alternative is to use a third-party tracking system, but this will introduce a small lag.
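For the tracking route, the encoder data from the revolve could be fed into WATCHOUT through a generic input. A minimal sketch, assuming a quadrature encoder on the revolve, a generic input named "revolveAngle" (hypothetical), and the setInput command of the WATCHOUT control protocol; verify the exact command syntax and port number against your version's manual:

```python
import socket

# Sketch: push the revolve's encoder angle into a WATCHOUT generic input.
# Assumptions (check against your WATCHOUT version's control protocol docs):
#   - the show defines a generic input named "revolveAngle",
#   - the car model's rotation tween is bound to that input,
#   - PORT below is the control protocol port for your setup.
PORT = 3040  # hypothetical; confirm in the WATCHOUT manual

ENCODER_COUNTS_PER_REV = 4096  # hypothetical resolution of the revolve encoder

def counts_to_degrees(counts: int) -> float:
    """Convert raw encoder counts to an angle in the range 0-360 degrees."""
    return (counts % ENCODER_COUNTS_PER_REV) * 360.0 / ENCODER_COUNTS_PER_REV

def send_angle(sock: socket.socket, degrees: float) -> None:
    """Send a normalized 0..1 rotation value to the "revolveAngle" input."""
    sock.sendall(f'setInput "revolveAngle" {degrees / 360.0:.5f}\r'.encode())
```

In practice you would poll the encoder at a fixed rate, convert counts to degrees, and stream the value continuously; the small lag Miro mentions comes from this polling/network round trip.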

 

The tricky part here is that WATCHOUT at the moment doesn't have automatic masking for 3D projections, so you need to do the masks by hand. You can set up static masks on the timeline and interpolate between them, like one mask for each 20-degree step of the rotation. And by mask I mean placing a textured quad between the 3D projector and the 3D object. It's doable but a bit tricky and time consuming. The other alternative for masking is to bake the masking into a UV-mapped video on the 3D model using a 3D editor/engine like Unity 3D. In this case you need to import the calibrated 3D projector positions into the 3D editor to get it right.
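The per-step mask interpolation described above boils down to crossfading opacity between the two pre-built masks that bracket the current rotation angle. A minimal sketch of that math, assuming one static mask per 20-degree step (the step size is taken from the post; the opacity values would be keyframed or input-driven on each mask layer in WATCHOUT, which has no scripting layer itself):

```python
STEP = 20  # degrees between pre-built masks; 360 / 20 = 18 masks total

def mask_opacities(angle: float, step: float = STEP) -> dict:
    """Return {mask_index: opacity} for the two masks bracketing `angle`.

    Opacities crossfade linearly: at an exact step only one mask is fully
    visible; halfway between steps both masks sit at 50%.
    """
    angle %= 360.0
    lower = int(angle // step)             # mask just "behind" the angle
    upper = (lower + 1) % int(360 / step)  # next mask, wrapping past 360
    t = (angle - lower * step) / step      # 0..1 blend position
    return {lower: 1.0 - t, upper: t}
```

Pre-computing these values for the whole revolution tells you exactly where to place the opacity keyframes on each of the 18 mask layers.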

 

Best Regards,

 

Miro


Haven't done it on a 3D object, but I've synced 2D content to position and rotation tweens based on automation data many times.

 

I'm not sure why your initial idea wouldn't work.


The simple answer here is that rotating, or otherwise moving, 3D mapping objects is not supported by our software.

 

What this comes down to is the blending masks that you need to blend the images from your multiple projectors together. These masks are static; they are not dynamic or calculated in real time. This means you can construct blending masks that look good from one direction, once you have calibrated your mapping projection, but when you start rotating the object the blending will not change, even if you manage to sync the rotation of the 3D object in WATCHOUT, and the result will not look good.

 

I'm not saying it's not possible, there may be ways of working around this problem. But there is no easy solution.

