Dataton Forum



Everything posted by Walter

  1. Use a virtual display for the total outline (it also makes for easy programming) and divide it across your real-world outputs. Then scale and reuse this VD on an extra output, which you can use for the DSM feed. This also lets you overscale a bit to leave out any non-essential parts of the feed (the outer edges, for example). That could be one of the outputs of your W7100, or if you're full, add a simple WATCHPAX. I use the same approach for multi-viewing: with an output from one of the main display servers you can keep track of any input signals (keeping them active at the same time).
  2. Hi there, if that were the case it's hardly a Watchout issue, is it? I suppose Art-Net is Art-Net... and since you've used it successfully in the past, is your network set to 2.x.x.x with the matching subnet mask? Please remember that where one device starts counting its universes at 1, others start at 0. So you might want to try changing the universe.
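The universe off-by-one mentioned above can be sketched like this (plain Python, purely an illustration — the function name is mine, not part of any Art-Net library):

```python
def console_to_artnet_universe(console_universe: int) -> int:
    """Map a 1-based universe number (as some consoles and media
    servers label them) to the 0-based numbering Art-Net itself uses.
    An off-by-one here is a classic reason no data arrives."""
    if console_universe < 1:
        raise ValueError("1-based universe numbers start at 1")
    return console_universe - 1

# A fixture labelled "universe 1" on a 1-based console
# actually listens on Art-Net universe 0.
print(console_to_artnet_universe(1))  # -> 0
```

So if transmitting on the labelled universe gives you nothing, try one lower (or higher) before blaming the software.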
  3. What he said ;-) It's all about EDID management.
  4. My two cents: I haven't installed QT on either production or display machines ever since it stopped being mandatory during the WO install, and I've never experienced an issue since, during programming or playback with WO (of course, I haven't used the export function, and I use dedicated machines that aren't used for editing and such). A few weeks ago, when I had to work on a v4 WO installation, was the first time since then that I was asked to install QT again.
  5. Walter

    Pause cue

    Still, even in a rush, it's so little effort to work neatly and efficiently. You don't have to move the pause cue or the media cue around. Just snap, double-click the media cue and type X.1 in the time position field (X being the actual second where the pause cue is, or vice versa). Personally I tend to use short opacity fades at the beginning of my cues, so media can sort of preload, but that's kind of a heritage from the old days where it was needed (old graphics cards, bad codecs etc.). In your environment I understand you don't want to go that route. Let me clearly state I hope your reques
  6. Hi Felix, glad to hear the show went fine. But on this point, let's agree to disagree: it's not a workaround, imo it's THE way to go, as in the only proper way to go. Of course the briefing to the content provider should have been right, so you'd have been supplied with separate files from the start. Just saying, as an experienced operator...
  7. David, why don't you provide a "preliminary" tweaking list? State clearly that it contains the findings so far, and mark any uncertain parts in red / italic / underlined script. I'm certain this would help the previous respondent get his system out of the disaster zone, don't you agree? I understand and respect your wish to provide a bulletproof document, but perhaps that is never going to happen anyway (because of the ongoing changes in, and different versions of, Win10). My two cents...
  8. Hmmm... a lot of work? It takes literally 5 seconds to extract the WAV from the HAP file (with AME, for example), another 10 to 15 seconds* per file to duplicate the cue in a layer below it and fill it with the correct WAV file, and of course muting the volume on the HAP cue. As an added bonus it gives you more control over the playout / routing options. *Note: I didn't include the time needed to insert the content disk containing all the HAP files into an AME machine, drag all the files into AME, select them all and choose WAV as the format, eject the disk, and copy the WAVs onto yo
  9. Hi all, I had to work on an older setup last weekend. Two separate setups with 10+ projectors each got their projectors replaced, so I was asked to redo the geometry on them. So far so good, although it had been a long time since I'd used v4. Anyway, on both systems, the first screen didn't do live geometry updates. All the others behaved as expected. Could it have something to do with the first display being the cluster master? All is set and done so no urgency here, just curious, as I never saw such behavior in the past. Grtz Walter
  10. Perhaps you could use a (free) program like StartupDelay. It gives you many options and is easy to configure. A few scenarios you can try: for instance, have it run a small script that pings the production machine and set WO to start afterwards, or whatever program or computer setting you think might kickstart the network settings. You can also have WO wait to start until the system is a given percentage into its idle state. Or just go out and buy a new network card, costs peanuts... ;-)
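The ping-then-start idea can be roughly sketched as follows (Python; the IP address and the launch path are placeholders I made up — adjust for your own rig):

```python
import subprocess
import sys
import time

# Hypothetical production-machine address on a 2.x.x.x Art-Net-style network.
PRODUCTION_IP = "2.0.0.10"

def host_reachable(ip: str) -> bool:
    """Return True when a single ping to `ip` succeeds."""
    count_flag = "-n" if sys.platform.startswith("win") else "-c"
    result = subprocess.run(
        ["ping", count_flag, "1", ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def wait_for_network(ip: str, timeout_s: int = 120) -> bool:
    """Poll until the host answers or the timeout expires."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if host_reachable(ip):
            return True
        time.sleep(5)
    return False

# Usage sketch -- the executable path is an assumption, not a documented one:
# if wait_for_network(PRODUCTION_IP):
#     subprocess.Popen([r"C:\WATCHOUT\display.exe"])
```

StartupDelay (or Task Scheduler) would simply run this script before launching the display software, so WO only starts once the NIC has actually come up.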
  11. Hmmm... I don't think so. Not much help, but you could use tasks instead of the main timeline and only use the main timeline to trigger the tasks... or use copy / paste from the tasks once done, to paste all items in the correct order...
  12. Thanks David and Rainer... indeed a bit off-topic, but to clarify: I'm indeed operating at large (high-end) events, and all that Rainer said applies (ALL projectors I use carry BLC). About the latency: yes, and one should avoid using warp in any case, BUT with class-A camera systems, genlocked to their camera switcher and the seamless switcher, that additional delay won't matter at all (just enable warp on all PJs then, even if a particular PJ doesn't need to be corrected).
  13. Hmmm.... that might even work ;-)
  14. What James said. This workflow works great for LED walls or multi-display solutions (i.e. 32 separate displays in some sort of creative configuration at an exhibition). If you do the blending and warping in the projectors, then this would work out perfectly too (it's always better to do this in the projectors anyway, because of the lack of black level correction in WO). About the throughput: it wouldn't scare me off personally, but perhaps I'm just spoiled with extremely beefy systems to work with ;-)
  15. Although that works on a single object, and just for colouring, you'd need another opacity tween to have it disappear. If you use it for multiple "fixtures", fading down all colours would leave you with a black object, as opposed to the first-mentioned workflow, where your objects would actually disappear or blend, and thus behave like multiple lighting beams blending with each other and appearing on / behind each other.
  16. May I suggest a different workflow? I've been using this for a while and it works great: I've been providing lighting designers with on-screen "fixtures", being either a square, circle, gobo shape or basically anything you like. Having 20 or more of these fixtures available across the canvas allows the lighting designer to play around with them as if they were projected moving heads. You can assign any property you want (position (x/y/z), size (x/y), rotation), but for colour it gets a bit more complex and tedious. I tend to provide 4 colours (RGBW), making pastel tints possible, but RGB also works well.
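The RGBW idea above boils down to additive mixing: raising the white channel desaturates the colour toward a pastel tint. A tiny sketch of that idea (plain Python, my own illustration — not WATCHOUT's internal colour maths):

```python
def rgbw_to_rgb(r: float, g: float, b: float, w: float) -> tuple:
    """Additively mix a white channel into RGB, clamping to 1.0.
    Values are normalised 0.0-1.0. Raising W alone lifts all three
    channels equally, pushing a saturated colour toward a pastel."""
    clamp = lambda v: min(1.0, max(0.0, v))
    return (clamp(r + w), clamp(g + w), clamp(b + w))

# Full red plus 50% white gives a pastel (pink-ish) red:
print(rgbw_to_rgb(1.0, 0.0, 0.0, 0.5))  # -> (1.0, 0.5, 0.5)
```

In practice that means mapping four DMX channels per fixture onto four colour tweens and letting the desk do the mixing.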
  17. Hi Steve, "And your stage layout will not be able to represent the layout of the displays". This is of course not true!! This is, I suppose, exactly what virtual displays are meant for. Just create a physical 4K display somewhere out of sight. Then create virtual displays (1080) as if you were building a normal widescreen. Then lay out the created VDs back on each quarter of the physical 4K output. If you really want, you could put (self-created) blend masks on top of these to finish it off... All programming / editing can then be done as if you were running a conventional HD work
  18. Which IP address and subnet are you using?
  19. Exactly, rotating to any degree is very well possible and easy. Just double-click a display in the stage window and adjust the rotation angle.
  20. Uhm, please elaborate on the question! What is it you're trying to achieve? Basically there are virtually no limitations to achieving anything you wish, including rotating the entire display itself, so perhaps I don't understand what you want?
  21. Thank you Erik. Any plans on developing this in the (near) future?
  22. Hey guys, just a quick question. I was trying out NDI in the b12 test version, which actually seems to work really well and extremely quickly (little latency). However, I noticed there's no audio feed to be used. Is that correct? Will the NDI feature in WO be video stream only, no audio stream? Or is this still in development? Or... am I missing something? Curious about this. Walter
  23. Your command is wrong, I'm afraid. I don't know it by heart, I have it somewhere in my documents. But if you search the forum you will find a previous post where the correct command is stated. Good luck! Walter