Walter

Dataton Partner

Posts posted by Walter

  1. 2 hours ago, JJ Myers said:

     

    Nope, I second your opinion strongly! It would be really confusing if a file didn’t show when the timeline hits its first frame. After all, I always sell WO as being basically a non-linear editing programme but without the need for pre-rendering, and this behavior would really be out of line with that (and with reality).

    Feels like the TS is referring to something like QLAB or such, which is a totally different kind of ballgame with its lack of a timeline (although I love using it whenever convenient).

  2. Hi Nikolai. So if I understand correctly, you have one front projector to cover several moving elements?

    On 6/25/2018 at 3:01 PM, nikolai said:

    So the main challenge is 3-4 set pieces that move around on the stage from scene to scene

    In that case, using 3D mapping is not an option at all, since you'll be aligning the one projector to one element, which will impact the position of every other element in the picture (you can't align multiple elements with one 3D projector, since you'll be adjusting the virtual position of the projector, not of the elements).

    So as I see it, "conventional" corner pin tweaking would be your only option. Apart from that, I doubt 3D geometry would work, as you cannot tween it. (I suppose you might be able to copy the geometry and paste it live during the show? But even if I could, I wouldn't go that route.)

     

    Thanks Thomas, thanks Mike.

    Meanwhile I already did a test. I just generated some content in C4D and rendered it to 8K HAP. The resulting bitrate was approx. 1500 Mbps. It played without any issue in the stage window in production on my 2012 MBP (Boot Camp), and I have no doubt it will play perfectly on a properly built display server, even multiple streams. (I'm used to playing numerous HAP files with a 7200x1200 px frame size.)

    But again, thanks for the insight; this DXVA app seems useful for numerous applications.
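
    For anyone who wants to sanity-check such numbers, here is a rough back-of-the-envelope sketch (Python). The ~0.5 byte/pixel figure for HAP's DXT1 texture data and the guessed Snappy ratio are my own assumptions, not official Dataton figures:

        # Rough upper-bound estimate of HAP bitrate. HAP stores DXT1 texture data
        # (~0.5 byte/pixel; HAP Q uses DXT5 at ~1 byte/pixel) and then applies
        # Snappy compression, so real files usually land below this ceiling.

        def hap_bitrate_mbps(width, height, fps, bytes_per_pixel=0.5, snappy_ratio=1.0):
            """Estimated bitrate in Mbit/s; lower snappy_ratio models Snappy gains."""
            bytes_per_second = width * height * bytes_per_pixel * fps * snappy_ratio
            return bytes_per_second * 8 / 1_000_000

        print(hap_bitrate_mbps(7680, 4320, 30))                    # ~3980 Mbps ceiling for 8K30 HAP
        print(hap_bitrate_mbps(7680, 4320, 30, snappy_ratio=0.4))  # ~1590 Mbps, close to the ~1500 Mbps measured above
        print(hap_bitrate_mbps(3840, 2160, 60))                    # ~1990 Mbps ceiling per 4K60 quadrant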

  4. Hey guys, 

    Quick question: what's your experience with 8K content rendering and playout? Is it better to use one stream in HAP at 4320p or a quad split into 4x4K?

    Does anyone know the average bitrate of an 8K HAP file?

    Thoughts are welcome!

  5. On 6/1/2018 at 2:44 PM, Stefan Duenkel said:

    Hey Walter,

     

    Thanks for your help

    Yes, WO 6.

    And yes, Watchout is installed on the M.2.

    I was wondering if the M.2 slot and the GPU have to share the 16 PCIe lanes the CPU supports.

     

    But if I read the motherboard manual right, the M.2 SSD is connected to the CPU via DMI 3.0.

    So this shouldn't be the problem.

     

    I measured the read speed of the SSD and it's over 3000 Mb/s.

     

    I can play back and loop two 4K HAP files at 30 fps simultaneously on top of each other perfectly, but not one at 60 fps.

     

    Is there a maximum bitrate per file that Watchout supports?

     

    The 4K HAP 60 fps clip is ten seconds long and has a bitrate above 600 Mb/s.

     

     

     

    Well, I’m used to playing numerous files with a bitrate of around 800 Mbps at 50 fps without any issues (at least 4 or so) using a dual SSD configuration, so driving it off an M.2 should present no issues. Not sure about the lanes, but as your read speed turns out as high as you state, this should not be the bottleneck.

     

    Watchout doesn’t have a max bitrate it can handle AFAIK, so...

     

    Btw, M.2 is flash, not SSD, right? Or am I totally lost in the world of data carriers?
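
    Since Mb/s (bits) and MB/s (bytes) are easily mixed up in these discussions, here is a quick unit check (Python, using only the numbers quoted in this thread):

        # Convert the per-file bitrates quoted in this thread (Mbit/s) into the
        # sustained disk throughput they need (MByte/s), to compare against the
        # sequential read speed measured on the media drive.

        def mbit_to_mbyte(mbit_per_s):
            return mbit_per_s / 8  # 8 bits per byte

        print(mbit_to_mbyte(600))        # the 4K60 clip above needs ~75 MB/s of reads
        print(4 * mbit_to_mbyte(800))    # four 800 Mbit/s files at once need ~400 MB/s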

  6. WO6 I presume? 

    This should indeed (at first sight) be no problem at all. Are you sure Watchout is installed on the M.2? Did you measure its read speed?

    How long is the video file you’re trying to play? 

    HAP or HAP Q should play a few 4K streams without issue.
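
    If you want a quick sanity check without extra tools, something like this rough sketch reads a large file off the drive and reports the throughput (the path is a placeholder; a dedicated benchmark tool such as CrystalDiskMark will give more reliable numbers):

        import time

        def sequential_read_speed(path, chunk_mb=64):
            """Read a file once, front to back, and report MB/s (rough; OS caching skews repeat runs)."""
            chunk = chunk_mb * 1024 * 1024
            total = 0
            start = time.perf_counter()
            with open(path, "rb") as f:
                while True:
                    data = f.read(chunk)
                    if not data:
                        break
                    total += len(data)
            seconds = time.perf_counter() - start
            return total / (1024 * 1024) / seconds

        path = r"D:\Media\clip_4k60.mov"   # placeholder: any large file on the drive under test
        print(f"{sequential_read_speed(path):.0f} MB/s")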

     

     

  7. Hi Ben,

    You state the object is a series of boxes... so not a separate object for each box?

    I’d go with separate elements and, if needed, split up your source content in Watchout using virtual displays to map onto the separate boxes. That would give you exact control per box.

  8. I’m pretty sure Dill is talking about stage size rather than display size.

     

    There used to be a limitation on how much you can scale the stage size of a display up or down relative to its actual size. It used to be a factor of 2, later it became a factor of 3 (so a 1920 px display couldn’t be smaller than 640 px in stage size).

    But in the latest releases there’s apparently no noticeable limit anymore. At least, I appear to be able to set extreme values, both up and down. 

     

    Which leaves the question: Dill, what version of WO are you using? 

  9. 7 hours ago, FUM said:

    Hi, Erik.
    Thank you for teaching me in detail.
    I am using Adobe Media Encoder and it may be difficult to codec to Hap.
    Because it does not correspond to Hap family.
    So I will try it on H264.

    Hi there. What do you mean by “it does not correspond to HAP family”?

    I encode to HAP using AME all the time. Just don’t forget to install the codec and it will show up in the QT list of the encoder.

  10. Use a virtual display for the total outline (it also makes for easy programming) and distribute that across the real-world outputs.

    Then scale and reuse this VD on an extra output, which you can use for the DSM feed. It also enables you to overscale a bit to leave out any non-essential parts of the feed (e.g. the outer sides).

    That could be one of the outputs of your W7100, or if you’re full, add in a simple WATCHPAX.

    I use the same approach for multi-viewing, and with an output from one of the main display servers you can keep track of any input signals (keeping them active at the same time).

  11. Hi there, 

     

    If that were the case, it’s hardly a Watchout issue, is it?

    I suppose Art-Net is Art-Net... and since you’ve used it successfully in the past, your network is set to 2.x.x.x with subnet 255.0.0.0?

     

    Please remember that where one device starts counting its universes at 1, others start at 0. So you might want to try changing the universe. 
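
    To illustrate the off-by-one: the universe field inside an ArtDmx packet is zero-based on the wire, so a desk labelling universes from 1 is really sending universe 0. A minimal sketch of the packet layout (Python, raw UDP; the target address and DMX values are made up for the example):

        import socket
        import struct

        def artdmx_packet(universe, dmx):
            """Build a raw ArtDmx packet; 'universe' is the zero-based value on the wire."""
            dmx = bytes(dmx)
            return (b"Art-Net\x00"                   # packet ID
                    + struct.pack("<H", 0x5000)      # OpCode ArtDmx (low byte first)
                    + struct.pack(">H", 14)          # protocol version
                    + bytes([0, 0])                  # sequence, physical
                    + struct.pack("<H", universe)    # SubUni + Net (15-bit port address)
                    + struct.pack(">H", len(dmx))    # data length, high byte first
                    + dmx)

        # A desk that labels universes from 1 is sending universe 0 on the wire:
        packet = artdmx_packet(universe=0, dmx=[255] + [0] * 511)

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, ("2.255.255.255", 6454))  # broadcast on a 2.x.x.x / 255.0.0.0 network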

  12. My two cents: 

    I haven’t installed QT on either production or display machines ever since it stopped being mandatory during the WO install, and I’ve never experienced any issue since during programming or playback with WO (of course, I haven’t used the export function, and I use dedicated machines that aren’t used for editing and such).

    A few weeks ago, when I had to work on a v4 WO installation, was the first time since then that I was asked to, and had to, install QT again.

  13. 19 hours ago, ZaakQC said:

    Well, life does not always work like that. Sometimes there are last minute changes needed. We use watchout for live shows and tv shows recording. Can't always stop the show. And there is no problem with that. It's not hard to have the cue not show the next image if it ends flush. It even makes lots of sense. It makes our life easier and makes us work faster. Snap magnet and not show next video. That is the purpose of a pause. Not to show the next video right away. Better workflow. Saves time and worries.

    Still, even in a rush, it’s so little effort to work neatly and efficiently. You don’t have to move the pause cue or media cue around. Just snap, double-click the media cue and type X.1 in the time position field (X being the actual second where the pause cue is, or vice versa).

    Personally I tend to use short opacity fades at the beginning of my cues, so media can sort of preload, but that’s kind of a holdover from the old days when it was needed (old graphics cards, bad codecs, etc.). In your environment I understand you don’t want to go that route.

    Let me clearly state I hope your request doesn’t get processed, as I don’t understand the problem and, as was said above, it’s the entire point of timeline-based cueing. If a media cue is placed at 1 second in the timeline, it makes sense that it’s visible once the timeline hits the 1 second mark, doesn’t it? In this case it’s a 2 or 3 second action: double-click the pause cue, type 0.9 and press enter. Tadaaaahhh...

  14. 3 hours ago, Menetekel said:

     

    and yes, it is a solution, but it is still a workaround.

     

    Hi Felix, 

    Glad to hear the show went fine. But on this, let's agree to disagree, as it's not a workaround; imo it's THE way to go, as in the only proper way to go. Of course the briefing to the content provider should have been OK, so you'd have been supplied with separate files from the start. :D

    Just saying as an experienced operator....

  15. On 5-3-2018 at 6:12 PM, DavidA said:

    We are getting there. At the moment we are looking into any pitfalls related to Microsoft End User License Agreements and other legal issues by making such a list available for Windows 10. We want to make sure we are on the right side of the legal fence. 

    David, why don’t you provide a “preliminary” tweaking list, stating clearly that it contains the findings so far, and marking any uncertain parts in red / italic / underlined text? I’m certain this would help the previous respondent get his system out of the disaster zone, don’t you agree?

     

    I understand and respect your take on wanting to provide a bulletproof document, but perhaps that is never going to happen anyway (because of the ongoing changes in, and different versions of, Windows 10).

    My two cents... 

    Hmmm... a lot of work? It takes literally 5 seconds to extract the WAV from the HAP file (with AME, for example), and another 10 to 15 seconds* per file to duplicate the cue in a layer below it, fill it with the correct WAV file and, of course, mute the volume on the HAP cue. As an added bonus it gives you more control over the playout / routing options.

     

    *Note: I didn’t include the time needed to insert the content disk containing all the HAP files into an AME machine, drag all the files into AME, select them all and choose WAV as the format, eject the disk, copy the WAVs onto your WO media folder and drag them into the media bin...

    Using separate audio files has been the “good practice” for a long time now, so just bite the bullet and comply ;-) 
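
    If AME isn’t at hand, the same batch extraction can be scripted; here is a sketch (Python calling ffmpeg, which does read HAP .mov files; the folder paths are placeholders and ffmpeg is assumed to be on the PATH):

        import subprocess
        from pathlib import Path

        SRC = Path(r"D:\Content\hap")    # folder with the HAP .mov files (placeholder)
        DST = Path(r"D:\Content\wav")    # where the extracted WAVs go (placeholder)
        DST.mkdir(parents=True, exist_ok=True)

        for clip in SRC.glob("*.mov"):
            wav = DST / (clip.stem + ".wav")
            # -vn drops the video stream; pcm_s16le writes a plain 16-bit WAV.
            subprocess.run(
                ["ffmpeg", "-y", "-i", str(clip), "-vn", "-acodec", "pcm_s16le", str(wav)],
                check=True,
            )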

  17. Hi all,

    Had to work on an older setup last weekend. Two separate setups with 10+ projectors each had their projectors replaced, so I was asked to redo the geometry on them.

    So far so good, although it had been a long time since I’d used v4.

    Anyway, on both systems, the first screen didn’t do live geometry updates. All the others behaved as expected.

    Could it have something to do with the first display being the cluster master?

    All is set and done so there’s no urgency here, just curious, as I never saw such behavior in the past.

     

    Grtz Walter
