
Ventmedia

Member, 26 posts

Posts posted by Ventmedia

  1. We had a load of sound issues recently on a tour - intermittent pops, crackles and dropouts. This was with a Focusrite Scarlett 18i20 (3rd Gen) on a Watchpax 40. Some days it seemed fine, then it would just start again. We tried a second sound card, new cables etc., with the same result, so we ended up stripping the audio out of the Watchpax, putting it into QLab, which then also sent timecode to Watchout. Bit of a headache mid-tour.
    The sound quality using the same sound cards has been fine via QLab, and they were also fine on a PC running WO display.
    Personally I think the inability to add sound card drivers to a Watchpax is a mistake.
     

  2. On 10/16/2021 at 3:10 PM, Walter said:

    With regards to geometry: Yes, it is definitely possible, although fairly limited and cumbersome. Individual geometry will be limited to a simple 4-point correction by using a corner tween. 

    Workflow can be a small mindf*ck, but again, if tight on budget, you will be able to manage.  

    With regards to blending = this will be trickier, but if the budget is so low you can't even afford a projector that has this built in...  Yeah, I'd take on another gig.

    Interesting response...

    Unfortunately relying on the projectors' generally limited capabilities isn't the solution for this gig - it's too complex. Neither is the question based on budgetary issues or concerns, but I appreciate the heads-up.

    Obviously Watchout wasn't built with this type of setup in mind.

  3. On 2/23/2019 at 4:02 PM, Morgan Wong said:

    If you do not need Geometry correction and blending function from Watchout, then you will be fine.

    If not, it will become very complicated to do the geometry correction for FOUR 1920x1080 sections in one display (3840x2160) output.

     

    Hi,

    Could you let me know: is it actually possible to do geometry correction and blending within Watchout for four separate 1920x1080 sections in one display (3840x2160) output via an FX4?

     

    Thanks

  4. I've been batching out a lot of HapQ content from After Effects, Premiere and Media Encoder over the last year using the AfterCodecs plugin, and it's been really solid. The only issue I've seen is that if I attempt to replace the video files from within Watchout using the browse option, it creates a nasty yellowing. Bringing in the video files fresh and then replacing them on the timeline sorts this out.

  5. Hi,

    Due to the restrictions over the last few months I've had a set-up where I am predominantly working on a show remotely via VNC, giving me access to the production and display machines. I've a camera on stage and this is generally working out OK and as planned.

    The only frustrating element is that, due to the elements of the set on stage, I am unable to see the updating status bar on the on-stage outputs when I am updating/transferring new content. It leaves me with no sense of how long is left for the transfer. Is there a way of knowing the status of data/file transfers from the production computer? The outputs are all 3D mapping projectors (I know a status bar can be seen with a 2D projector).

    thanks

     

    Barret

  6. On 10/9/2018 at 3:13 PM, matkeane said:

    I'll take Mike's word for it that it's OK to force all conditions to zero - although I've also added a switch for it, just in case the default behaviour is useful in some circumstances. And, after a busier couple of weeks than planned, I found time to make Mac/Windows builds of my helper app, which can be downloaded from my website:

    https://matkeane.com/project/watchout-conditional-layers-utility.

    The readme file in the zip has some instructions but, basically, conditional layers can be switched through the UI, or via UDP/OSC commands. The UI controls can be disabled if only network control is required. In addition to switching individual conditions, they can be assigned (in the UI) to groups and presets for more flexible recall and switching.

    Very helpful - many thanks for this 👍
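For anyone wanting to script the network side of the utility, the UDP sender itself is only a few lines of Python. The port number and message text below are placeholders, not the utility's documented format - check the readme in matkeane's zip for the actual values it expects.

```python
import socket

def send_udp(message, host="127.0.0.1", port=5555):
    """Fire one UDP datagram at the helper app. Port 5555 and the
    message format are assumptions -- see the utility's readme."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message.encode("utf-8"), (host, port))

# Hypothetical example: ask the helper app to enable condition 1.
# send_udp("condition 1 on")
```

The same pattern works from QLab's script cues or any show-control box that can send raw UDP.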

  7. I'm in the process of bringing an old show into WO. The footage is all rendered in ProRes. After finally getting it to run on the display software, it was noticeable that the blacks were not full black. I checked the original footage in After Effects, which is all good - black is black. Subsequently I've done a test, transcoding the footage to HAP and comparing, which fixes the problem within WO.
    It reminds me of the old gamma-shift issues with ProRes and QuickTime, as can be found on a multitude of forums for editors and content creators - is the same QuickTime issue causing this within WO?

    I've added a screenshot of the two test clips within Watchout, which I've then looked at in Photoshop: there is a 16% RGB shift in the blacks with the ProRes compared to 0% in the HAP footage. I'm guessing there's a gamma shift across the whole range, but I've not tested it properly. I realise ProRes is not the recommended codec, but the simple conclusion seems to be that ProRes isn't an option.

    black compare.png
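For what it's worth, the numbers can be sanity-checked with a little arithmetic. On an 8-bit scale, a 16% lift means black is sitting at a pixel value of roughly 41 instead of 0 - noticeably more than the classic limited-range (16-235) misinterpretation, which lifts black to about 6%. A minimal sketch of that arithmetic (not WATCHOUT-specific, just the maths):

```python
def black_lift_percent(sample, bit_depth=8):
    """Black-level lift of a sampled 'black' pixel, as a
    percentage of full scale (255 for 8-bit)."""
    full_scale = (1 << bit_depth) - 1
    return 100.0 * sample / full_scale

def video_to_full_range(code):
    """Expand an 8-bit video-range code (16-235) to full range
    (0-255), i.e. what a correct range conversion should do."""
    return max(0, min(255, round((code - 16) * 255 / (235 - 16))))
```

So sampling the ProRes black patch in Photoshop and feeding the value to `black_lift_percent` tells you whether the shift matches a simple range error (about 6%) or something larger, as reported above.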

  8. On 10/8/2020 at 4:10 PM, Mike Fahl said:

    You can just make the edge-blend gradient in Photoshop or similar, and add it as a separate layer on top. It's essentially a black image with a transparency gradient. Test, adjust the curvature of the gradient as required, update the media, rinse and repeat.

    Mike

    Sure, as was the method with After Effects ten years ago, as well as embedding the mapping and distortion into the artwork - it's nice to think we've moved on from there a little though.

  9. OK, I've done further tests on this and stripped down the set-up to do so. It seems the screen print/capture works when a single display is attached, but with two or more I'm unable to do it - I get the WO screens. Is this the same for other folks? Is this a bug?

     

     

  10. 31 minutes ago, Morgan Wong said:

    Connect a keyboard to the display computer,
    then use the Print Screen button,
    paste it into "Paint",
    save the file.
     

    I thought that would be the case - I've got the images displayed with the mask but when I print the screen I get the following

    ss.jpg

  11. 12 hours ago, RBeddig said:

    If you have a second computer and a capture card, you could capture the output of the display computer and generate the screen shot.

    If not, you'll need to do it on the production computer.

    I would use the feature to record and export a video. For this you need to have Quicktime installed and you've got to scale the stage window up. Then you make sure that only the screen you want to capture is fully visible in the stage window.

    Check the manual for information regarding the video export function.

    Again, RBeddig, thanks for the reply.

    Unfortunately the export-video method seems impossible, as the production computer is running on an HD monitor and the outputs of the display are 4K - I can't make the displays fully visible in the stage window to capture them. It's a shame, as this would be a good workaround.

    The capture card method I'll tinker with, but I feel it's adding significantly more kit and workflow to a process which should be relatively straightforward.

     

    Has anybody got a screenshot method that works with Watchout? A third-party app that'll do the job?

    I don't know enough about what's happening with Watchout to explain why the Watchout icon window gets captured and not the output display itself.

     

  12. Hi, it seems like a newbie question after all this, but I can't get a screenshot of the display computer whatever I try - I keep getting just a white screen or Watchout in a window, not what's being displayed. Is there a knack to this?

  13. 19 hours ago, RBeddig said:

    Well, as I said before, you could place your masks (taken as screen shots and imported as transparent images) in a layer above everything else and turn the layer on and off by sending control cues to yourself. This would allow you to enable the masks at one part of the show and to disable them when you reach a certain timecode.

    On the other side, if your servers and your production computer are good, you could of course also play sound from WATCHOUT. The sound can come from either the production computer or from a display server! Both work.

    I will give this method a test; I can see it working. My only reluctance is losing the ability to quickly tweak the blends between shows if required. One thing I haven't mentioned is that the projectors we are discussing are also on the floor. On the tour, the only variable I can't control is the floor - sometimes sprung, sometimes solid, etc. It means these projectors generally need to be checked and tweaked between every show. I'll try this out though.

    Many thanks

  14. Again, thanks for the reply. 

    There isn't a moment between the sections to have a pause unfortunately.

    I think my final way to sort this is to convince the sound designer to move the audio into Watchout, thus eliminating the timecode problem. It does make me nervous running both audio and video off one machine, but I'm willing to try it.

  15. It's a seamless show - no breaks, no cues, fully automated from timecode. The audience are in the same space as the performer; it's a mix of installation and dance. The truth is it's been running on Catalyst for about six years, and I'm hoping to bring it up to date. The images show examples of the different states I've mentioned, but there are more.

    iinfinite2018_3.jpg

    iinfinite2018_4.jpg

  16. Many thanks for the replies. The whole show is actually timecode dependent, but I can see that making a screenshot of the blends and using this as a controllable layer is a workaround, with possible control of the conditional layers. Unfortunately I won't be able to get the powers that be to add another computer and switcher. Again, thanks.

  17. I have a show/installation in which, during the first part, two projectors are mapped onto the walls of the interior space. These require an edge blend between the two, with the content mapped with virtual displays.
    Later, these same projectors are used in a completely different way, through haze. The different effect has the content on more virtual displays, mapped separately, which don't require a blend/mask.

    I've seen that an auxiliary timeline can be placed above the edge blend, which would be perfect except that this second sequence needs to follow timecode and lasts for about 30 minutes. As timecode-syncing an auxiliary timeline is possible, is there a way to switch the masks off as part of the programming of the show, i.e. a script I can send that switches the mask off and on? Or some other method I'm overlooking - maybe layer-specific masking?

    many thanks
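One scripted route worth checking against the control protocol chapter of the WATCHOUT manual for your version: the production software accepts plain-text TCP commands, and `enableLayerCond` takes a bitmask of the conditions to enable, so a mask layer set to, say, condition 1 could be toggled from an external controller. A minimal sketch, assuming the standard production control port; the host address is a placeholder:

```python
import socket

def conditions_mask(*conditions):
    """Bitmask for enableLayerCond: condition n sets bit n-1,
    so conditions 1 and 3 together give 0b101 = 5."""
    mask = 0
    for c in conditions:
        mask |= 1 << (c - 1)
    return mask

def send_watchout(cmd, host="192.168.0.10", port=3040):
    """Send one CRLF-terminated command to the WATCHOUT production
    software (host is a placeholder; verify the port in the manual)."""
    with socket.create_connection((host, port), timeout=2) as s:
        s.sendall((cmd + "\r\n").encode("ascii"))
```

Used as, for example, `send_watchout(f"enableLayerCond {conditions_mask(1)}")` to show the mask layer, and `send_watchout("enableLayerCond 0")` to turn all conditional layers off once the timecoded section starts.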

  18. On 7/2/2018 at 4:57 PM, nikolai said:

    Hi Walter, thanks for getting back to me.

    Then I have misunderstood something.

    the way I thought it would work was like this:

    put any object on stage maybe a 2x2x2 meter cube.

    use this to align the projector.

    the 3d projector should now have the position angle etc of the real world projector, sitting in 3d space in WO.

    now i should be able to both replace the geometry or move it on stage.

    the obj file in watchout would need to move using x,y,z and rotation tweens.

    calibration is done once and as long as the projector stays put it should not have to change through the show.

    have i got this wrong?

    Hi Nikolai, I realise this topic is a little old, but I'd be interested to know how you got on with this setup (3D objects moving on a set, together with the accuracy of the 3D mapping within Watchout). I've got something similar coming up in the near future and I'm still not sure if Watchout is best for the job or not.

  19. The ability to snap the corner points of a virtual display's output to the corner points of other virtual displays, and the ability to select more than one corner point at a time - both on the same virtual display and across displays, so they stay in a relative position - not unlike MadMapper and Resolume.

  20. I'd have to second this as a very helpful addition to Watchout - 3D mapping projectors being able to route through to a virtual display, which could then be composed into a more complex composition - an FX4 layout etc.

    I have a relatively old 4x-projector show that tours quite a lot, where we try to keep the touring kit to a minimum - we hope to update it to Watchout for the future. Only one output requires, or would benefit from, a 3D mapping output. The show has run via a Datapath programmed on Avolites Ai in the past, and before that Catalyst; I'm struggling to work out the best way to make this work with Watchout at the moment.

  21. I'm also going to add the request to integrate Notch blocks directly. Making content in Notch is simply revolutionary after years of rendering. Having the ability to use Notch's realtime parameters directly within the media server is the way forward for content creation for live performance.
