
Walter

Dataton Partner
  • Posts

    316
  • Joined

  • Last visited

Everything posted by Walter

  1. Hi Felix, glad to hear the show went fine. But about this, let's agree to disagree: it's not a workaround, imo it's THE way to go, as in the only proper way to go. Of course the briefing to the content provider should have been done right, so you'd have been supplied with separate files from the start. Just saying, as an experienced operator...
  2. David, why don’t you provide a “preliminary” tweaking list? State clearly that it contains the findings so far, and mark any uncertain parts in red / italic / underlined script. I’m certain this would help the previous respondent get his system out of the disaster zone, don’t you agree? I understand and respect your wish to provide a bulletproof document, but perhaps that is never going to happen anyway (because of the ongoing changes in, and different versions of, Win10). My two cents...
  3. Hmmm... a lot of work? It takes literally 5 seconds to extract the WAV from the HAP file (with AME, for example), another 10 to 15 seconds* per file to duplicate the cue in a layer below it and fill it with the correct WAV file, and of course to mute the volume on the HAP cue. As an added bonus it gives you more control over the playout / routing options. *Note: I didn’t include the time needed to insert the content disk containing all the HAP files into an AME machine, drag all the files into AME, select them all and choose WAV as the format, eject the disk, copy the WAVs onto your WO media folder and drag them into the media bin... Using separate audio files has been good practice for a long time now, so just bite the bullet and comply ;-)
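If you don't have AME handy, the same WAV extraction can be scripted with ffmpeg. A minimal sketch, assuming ffmpeg is on your PATH and your HAP files are `.mov` files in one folder (both assumptions, not anything stated above):

```python
# Sketch: build one ffmpeg command per HAP .mov file, writing a sibling .wav.
# ffmpeg on PATH and a flat folder of .mov files are assumptions for this example.
from pathlib import Path

def wav_extract_commands(source_dir):
    """Return an ffmpeg command list per .mov file found in source_dir."""
    cmds = []
    for mov in sorted(Path(source_dir).glob("*.mov")):
        wav = mov.with_suffix(".wav")
        # -vn drops the video stream; the audio comes out as a PCM WAV.
        cmds.append(["ffmpeg", "-i", str(mov), "-vn", str(wav)])
    return cmds
```

Each command can then be run with `subprocess.run()`, after which the WAVs go into the WO media folder as described above.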
  4. Hi all, I had to work on an older setup last weekend. Two separate setups with 10+ projectors each got their projectors replaced, so I was asked to redo the geometry on them. So far so good, although it’s been a long time since I used v4. Anyway, on both systems the first screen didn’t do live geometry updates. All the others behaved as expected. Could it have something to do with the first display being the cluster master? All is set and done, so no urgency here, just curious as I never saw such behavior in the past. Grtz Walter
  5. Perhaps you could use a (free) program like StartupDelay. It gives you many options and is easy to configure. A few scenarios you can try: for instance, have it run a small script that pings the production machine, and set WO to start afterwards. Or whatever program or computer setting you think might kickstart the network settings. You can also have WO wait to start until the system is a given percentage in idle state. Or just go out and buy a new network card, costs peanuts... ;-)
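The ping-then-start idea above can also be sketched in a few lines of Python. This is a minimal illustration, not the StartupDelay tool itself; the IP address and the WO executable path in the commented-out lines are placeholders for your own setup:

```python
# Sketch: wait until the production machine answers a ping, then launch a program.
# Host address and program path are placeholders, not values from the post.
import subprocess
import sys
import time

def wait_for_host(host, attempts=30, delay=2.0, ping=None):
    """Return True once `host` answers one ping; False after `attempts` tries.
    The `ping` callable can be injected for testing."""
    if ping is None:
        flag = "-n" if sys.platform.startswith("win") else "-c"  # Windows vs. Unix ping
        ping = lambda h: subprocess.run(
            ["ping", flag, "1", h],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
        ).returncode == 0
    for _ in range(attempts):
        if ping(host):
            return True
        time.sleep(delay)
    return False

# Example (hypothetical address and path):
# if wait_for_host("192.168.0.10"):
#     subprocess.Popen([r"C:\path\to\watchout.exe"])
```

The injectable `ping` parameter just keeps the network dependency out of the core retry logic.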
  6. Hmmm... I don’t think so. Not much help, but you could use tasks instead of the main timeline and only use the main timeline to trigger the tasks... or, once done, copy / paste all items out of the tasks into the correct order...
  7. Thanks David and Rainer... indeed a bit off-topic, but to clarify: I’m indeed operating at large (high-end) events, and all that Rainer said applies (ALL projectors I use carry BLC). About the latency: yes, and one should avoid using warp in any case, BUT with class-A camera systems, genlocked to their camera switcher and the seamless switcher, that additional delay won’t matter at all (just enable warp on all PJs then, even if a particular PJ doesn’t need to be corrected).
  8. Hmmm.... that might even work ;-)
  9. What James said. This workflow works great for LED walls or multi-display solutions (i.e. 32 separate displays in some sort of creative configuration at an exhibition). If you do the blending and warping in the projectors, then this would work out perfectly too (it's always better to do this in the projectors anyway because of the lack of black level correction in WO). About the throughput: it wouldn't scare me off personally, but perhaps I'm just spoiled by the extremely beefy systems I get to work with ;-)
  10. Although that works on a single object, and just for colouring, you'd need another opacity tween to have it disappear. If you use it for multiple "fixtures", fading down all colours would leave you with a black object, as opposed to the first-mentioned workflow, where your objects would actually disappear or blend, and thus behave like multiple lighting beams blending with each other and appearing on / behind each other.
  11. May I suggest a different workflow? I’ve been using this for a while and it works great: I’ve been providing lighting designers with on-screen “fixtures”, being either a square, circle, gobo shape or basically anything you like. Having 20 or more of these fixtures available across the canvas allows the lighting designer to play around with them as if they were projected moving heads. You can assign any property you want (position (x/y/z), size (x/y), rotations), but for colour it gets a bit more complex and tedious. I tend to provide 4 colours (RGBW), making pastel tints possible, but RGB also works well. Just create all of the above (however complex you need it), set the blend mode to Add and copy the cue 3 times. Replace each instance with its own colour and finally assign a DMX address to the opacity of each cue. You could even create a custom fixture for it on your lighting desk. I’ll try to post a small video of the result of a simple “party” project. https://www.dropbox.com/s/2n5j0ifgs1ongap/samplevid3.mov?dl=0
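To make the colour part of this concrete: with four stacked copies of the cue in Add blend mode (pure red, green, blue, and white), each DMX opacity channel scales its layer's contribution, and the layers sum towards white. A toy model of that mixing, with 0–255 DMX values as an assumption for illustration:

```python
# Toy model of four stacked "Add" cues: each colour layer contributes its base
# colour scaled by its DMX opacity channel (assumed 0-255 for this sketch).

def mix_rgbw(dmx_r, dmx_g, dmx_b, dmx_w):
    """Additively mix pure R, G, B and white layers per DMX opacity level."""
    layers = [
        ((255, 0, 0), dmx_r),
        ((0, 255, 0), dmx_g),
        ((0, 0, 255), dmx_b),
        ((255, 255, 255), dmx_w),  # the white layer is what enables pastel tints
    ]
    out = [0, 0, 0]
    for colour, level in layers:
        for i in range(3):
            # Add blend: sum the scaled contributions, clamping at full white.
            out[i] = min(255, out[i] + colour[i] * level // 255)
    return tuple(out)
```

For example, full red plus half white gives a pastel pink rather than pure red, which is exactly why the fourth (white) copy is worth the extra cue.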
  12. Hi Steve, “And your stage layout will not be able to represent the layout of the displays” This is of course not true!! This is, I suppose, exactly what virtual displays are meant for. Just create a physical 4K display somewhere out of sight. Then create virtual displays (1080) as if you were building a normal widescreen. Then lay the created VDs back out on each quarter of the physical 4K output. If you really want, you could put (self-created) blend masks on top of these to finish it off... All programming / editing can then be done as if you were running a conventional HD workflow! Watchout is sooooo awesome, being able to provide such a wonderful, creative, impressive and efficient workflow!!!! ;-)
  13. Which IP address and subnet are you using?
  14. Exactly, rotating by any degree is very well possible and easy. Just double-click a display in the stage window and adjust the rotation angle.
  15. Uhm, please elaborate on the question! What is it you’re trying to achieve? Basically there are virtually no limitations to achieving anything you wish, including rotating the entire display itself, so perhaps I don’t understand what you want?
  16. Thank you Erik. Any plans on developing this in the (near) future?
  17. Hey guys, just a quick question. I was trying out NDI in the b12 test version, which actually seems to work really well and extremely quickly (little latency). However, I noticed there's no audio feed to be used. Is that correct? Will the NDI feature in WO be video stream only, no audio stream? Or is this still in development? Or... am I missing something? Curious about this. Walter
  18. Your SDI/HDMI feed will have a YUV colourspace, as it's a SMPTE-format signal. A DVI capture card would be RGB as standard. In the capture card settings you should be able to tell your card to recognize the incoming signal as YUV, which will fix your issue.
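As background on why that one setting matters so much: the capture card hands over Y'CbCr samples, and interpreting those bytes directly as RGB skews every colour. A minimal sketch of the conversion, using full-range BT.601 coefficients as an assumption (the actual matrix depends on the signal's standard and range):

```python
# Sketch: convert one full-range BT.601 Y'CbCr sample to RGB.
# Coefficients assume BT.601 full range; BT.709 / limited range differ slightly.

def yuv_to_rgb(y, cb, cr):
    """Return the (R, G, B) triple for one Y'CbCr sample, each 0-255."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)
```

Neutral greys (Cb = Cr = 128) pass through unchanged, which is why a wrongly interpreted feed often looks "almost right" on grey test content but off on saturated colours.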
  19. No there is not! (As discussed widely before in tons of topics in this forum ;-)
  20. To me it seems the issue is exactly as Morgan pointed out. As the files are free-running and thus theoretically not starting at the exact same time, the software is reading the exact same sector on your hard disks, and I can imagine you're reaching a bottleneck right there, as this is the only factor that differs from running the same cues with different videos in them. If you still need to play these exact same videos, the solution is probably extremely easy: just make some copies of the original file and play those, thus not reading the exact same location on your drive so many times. Could well be, right?
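The make-copies fix is easy to script if there are many cues. A small sketch, where the source path is a placeholder for your own media file (whether the copies truly land on different disk regions is up to the filesystem, but each cue at least gets its own file handle):

```python
# Sketch: duplicate one media file N times so each cue can reference its own copy.
# The source path is a placeholder; copy placement on disk is up to the filesystem.
import shutil
from pathlib import Path

def spread_copies(source, count):
    """Create source_copy1..countN next to `source`; return the new paths."""
    src = Path(source)
    copies = []
    for i in range(1, count + 1):
        dst = src.with_name(f"{src.stem}_copy{i}{src.suffix}")
        shutil.copy2(src, dst)  # byte-identical copy, metadata preserved
        copies.append(dst)
    return copies
```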
  21. Steve, so using virtual displays didn't help you here? I'd say it is the way to solve this for you...
  22. Steve, if I understand you correctly: you have media at a certain point on your projection surface, and you need to adjust that media, plus any other media positioned at that same spot? Well, if that's the case, use a virtual display. You just position all that media in the virtual display and place the single virtual display cue in the correct position. Then all you have to do is adjust the corners of the virtual display and all your cues follow. Good luck!
  23. Your command is wrong, I'm afraid. I don't know it by heart, I have it somewhere in my documents. But if you search the forum you will find a previous post where the correct command is stated. Good luck! Walter
  24. Using a green key instead of alpha is one way, but for quality reasons I tend to render out an alpha-mask version and use that as a mask above the video (no chance of green artifacts, and also way better performance than using alpha in the video stream).
  25. Hey, I'm just thinking outside the box for the OP here, no need to judge his chosen solution. If he somehow needs this particular workflow, then this is an option for his case. We're all techies here, we're born to solve issues. Sometimes you just gotta make it work, and hey, that's what just happened (a better option than "no, you can't").