
Walter

Dataton Partner
  • Posts

    313

Everything posted by Walter

  1. I would never use MPEG-2 or MPEG-4 for 4K playback. Please use the HAP codec for this and all will be fine (imo). I use Adobe Media Encoder for this with consistent success. The latest version appears to have lost native HAP support, but if you have a subscription you can always install the CC 2017 version. Otherwise, there is a third-party solution available for this, by Disguise (a command-line encoding sketch is also included after this list).
  2. Thanks for this addition! Can you briefly elaborate on why this (imo) additional setting is required? Would WO otherwise override the GPU settings?
  3. Any particular desired resolution? And number of simultaneous sequences?
  4. Hi MisterK, you don’t have to change any settings for this in Watchout. You have to go into the NVIDIA Control Panel, advanced settings. Set up a sync master (one of the outputs) and slave all other displays and the RJ45 port(s). On the next machine, use the incoming sync on the RJ45 port for all displays.
  5. What Leonard said! And PS: the setup appears to be solid for years to come (specifically for gallery/museum setups, where you normally have more than enough time to tweak content to be as efficient as possible).
  6. Pause cue

    Nope, I strongly second your opinion! It would be really confusing if a file didn’t show when the timeline hits its first frame. After all, I always sell WO as basically a non-linear editing programme, but without the need for pre-rendering, and this behaviour would really be out of line (and out of touch with reality). It feels like the TS is referring to something like QLab, which is a totally different kind of ballgame with its lack of a timeline (although I love using it whenever convenient).
  7. Hi Nikolai. So if I understand correctly, you have one front projector covering several moving elements? In that case, using 3D mapping is no option whatsoever, since you’d be aligning one projector to one element, which would impact the position of every element in the picture (you can’t align multiple elements with one 3D projector, since you’d be adjusting the virtual position of the projector, not the elements). So as I see it, “conventional” corner-pin tweaking would be your only option. Apart from that, I doubt 3D geometry would work, as you cannot tween it. (I suppose you might be able to copy the geometry and paste it live during the show? But even if I could, I wouldn’t go that route.)
  8. Hi Nikolai, will you be using one or multiple projectors (one for each stage element for example)?
  9. Wow Alex, great set design. Thanks for the technical insight!
  10. Thanks Thomas, thanks Mike. Meanwhile, I’ve already done a test: I generated some content in C4D and rendered it to 8K HAP. The resulting bitrate was approx. 1500 Mbps (a rough bitrate estimate is also sketched after this list). It played without any issue in the stage window in production on my 2012 MBP (Boot Camp), and I have no doubt it will play perfectly on a properly built display server, even as multiple streams. (I’m used to playing numerous HAP files with a 7200x1200 px frame size.) But again, thanks for the insight; this DXVA app seems useful for numerous applications.
  11. Hey guys, quick question: what’s your experience with 8K content rendering and playout? Is it better to use one stream in HAP at 4320p or a quad split into 4x 4K? Does anyone know the average bitrate of an 8K HAP file? Thoughts are welcome.
  12. Well, I’m used to playing numerous files with a bitrate of around 800 Mbps at 50 fps without any issues (at least 4 or so) using a dual-SSD configuration, so driving it off an M.2 should present no issues (see the throughput sketch after this list). Not sure about the lanes, but since the read speed goes up as well, as you state, this should not be the bottleneck. Watchout doesn’t have a maximum bitrate it can handle, AFAIK, so... By the way, M.2 is flash, not SSD, right? Or am I totally lost in the world of data carriers?
  13. WO6, I presume? This should indeed (at first sight) be no problem at all. Are you sure Watchout is installed on the M.2? Did you measure its read speed? How long is the video file you’re trying to play? HAP or HAP Q should play a few 4K streams without issue.
  14. Hi Ben, you state the object is a series of boxes... so not a separate object for each box? I’d go with separate elements and, if needed, split up your source content in Watchout using virtual displays to map onto the separate boxes. That would give you exact control per box.
  15. Sequential 3D is not possible to generate in Watchout. Can’t your active projector or display handle two full-resolution feeds and combine them into a sequential result? That should be possible.
  16. I’m pretty sure Dill is talking about stage size rather than display size. There used to be a limitation on how much you can scale the stage size of a display up or down relative to its actual size. It used to be a factor of 2, later a factor of 3 (so a 1920 px display couldn’t be smaller than 640 px in stage size). But in the latest releases there’s apparently no noticeable limit anymore; at least, I appear to be able to set extreme values, both up and down. Which leaves the question: Dill, what version of WO are you using?
  17. Are you sure only one network adapter is active? And by “the IP of the machine”, do you mean you actually checked it in the adapter settings, or is it the one that’s written on the machine?
  18. Hi there. What do you mean by “it does not correspond to the HAP family”? I encode to HAP using AME all the time. Just don’t forget to install the codec, and it will appear in the QuickTime list of the encoder.
  19. Use a virtual display for the total outline (which also makes for easy programming) and divide that across real-world outputs. Then scale and reuse this VD on an extra output, which you can use for the DSM feed. This also lets you overscale a bit to leave out any non-essential parts of the feed (the outer edges, for example). That could be one of the outputs of the W7100, or if you’re full, add a simple Watchpax. I use the same approach for multi-viewing, and with an output from one of the main display servers you can keep track of any input signals (keeping them active at the same time).
  20. Hi there, if that were the case, it’s hardly a Watchout issue, is it? I suppose Art-Net is Art-Net... and since you’ve used it successfully in the past, your network is set to 2.x.x.x with subnet 255.0.0.0? Please remember that where one device starts counting its universes at 1, others start at 0, so you might want to try changing the universe (see the Art-Net sketch after this list).
  21. What he said ;-) It’s all about EDID management.
  22. My two cents: I haven’t installed QT on either production or display machines ever since it stopped being mandatory during the WO install, and I’ve never experienced any issue since, during programming or playback with WO (of course, I haven’t used the export function, and I use dedicated machines that aren’t used for editing and such). A few weeks ago, when I had to work on a v4 WO installation, was the first time since then that I was asked to, and had to, install QT again.
  23. Pause cue

    Still, even in a rush, it’s so little effort to work neatly and efficiently. You don’t have to move the pause cue or the media cue around: just snap, double-click the media cue and type X.1 in the time position field (X being the actual second where the pause cue is, or vice versa). Personally, I tend to use short opacity fades at the beginning of my cues so media can sort of preload, but that’s kind of a heritage from the old days when that was needed (old graphics cards, bad codecs, etc.). In your environment, I understand you don’t want to go that route. Let me clearly state that I hope your request doesn’t get processed, as I don’t understand the problem and, as was said above, this is the entire point of timeline-based cueing. If a media cue is placed at 1 second in the timeline, it makes sense that it’s visible once the timeline hits the 1-second mark, doesn’t it? In this case it’s a 2- or 3-second action: double-click the pause cue, type 0.9 and press enter. Tadaaaahhh...
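
A command-line alternative for the HAP encoding discussed in posts 1 and 18: an ffmpeg build that includes the HAP encoder can do the conversion outside Adobe Media Encoder. This is a minimal sketch, assuming ffmpeg is on the PATH; the file names are placeholders.

```python
# Sketch: encode a file to HAP Q by calling ffmpeg from Python.
# Assumes an ffmpeg build with the HAP encoder; file names are placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "input.mov",     # source file
        "-c:v", "hap",         # HAP encoder
        "-format", "hap_q",    # hap, hap_alpha or hap_q
        "output.mov",
    ],
    check=True,
)
```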
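
A rough back-of-the-envelope estimate for the 8K HAP bitrate question in posts 10 and 11. HAP stores DXT-compressed frames (roughly 4 bits per pixel for plain HAP, 8 for HAP Q), optionally squeezed further with Snappy; the 0.5 compression factor below is an assumption, not a measurement.

```python
# Sketch: estimate HAP bitrate from frame size and frame rate.
# bits_per_px: ~4 for HAP (DXT1), ~8 for HAP Q (DXT5).
# snappy_factor 0.5 is an assumed average; it varies with content.

def hap_bitrate_mbps(width, height, fps, bits_per_px=4, snappy_factor=0.5):
    """Estimated HAP bitrate in Mbit/s."""
    return width * height * bits_per_px * fps * snappy_factor / 1_000_000

# One 8K (4320p) stream at 25 fps, plain HAP:
print(round(hap_bitrate_mbps(7680, 4320, 25)))      # ~1659 Mbit/s
# A quad split into four 4K streams adds up to the same total:
print(round(4 * hap_bitrate_mbps(3840, 2160, 25)))  # ~1659 Mbit/s
```

That lands in the same ballpark as the roughly 1500 Mbps mentioned in post 10, and it shows that a quad split changes the number of streams, not the total data rate.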
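
A quick throughput check for the M.2 discussion in posts 12 and 13: whether a drive can feed several simultaneous streams mostly comes down to adding up the bitrates and comparing the sum against the drive’s measured sequential read speed. The numbers below are illustrative assumptions, not benchmarks.

```python
# Sketch: compare the summed stream bitrate against a measured read speed.

def drive_headroom(stream_mbps, num_streams, drive_read_mb_per_s):
    """Fraction of the drive's read bandwidth the streams would need."""
    needed_mb_per_s = stream_mbps * num_streams / 8  # Mbit/s -> MB/s
    return needed_mb_per_s / drive_read_mb_per_s

# Four ~800 Mbit/s HAP files on an NVMe M.2 drive measured at ~2000 MB/s:
print(f"{drive_headroom(800, 4, 2000):.0%} of read bandwidth")  # 20%
```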
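
The universe off-by-one mentioned in post 20 comes from the Art-Net port address being zero-based (net, sub-net and universe fields), while many consoles label their first universe as 1. The helper below is hypothetical, just to illustrate the mapping.

```python
# Sketch: pack an Art-Net port address (zero-based fields) to illustrate
# why "universe 1" on one device may arrive as universe 0 on another.
# Function name and example values are hypothetical.

def artnet_port_address(net, sub_net, universe):
    """Pack net (0-127), sub-net (0-15) and universe (0-15) into a port address."""
    return (net << 8) | (sub_net << 4) | universe

console_universe = 1  # a console that counts from 1
print(artnet_port_address(net=0, sub_net=0, universe=console_universe - 1))  # 0
```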