Dataton Forum

matkeane

Member
  • Content Count

    130
  • Joined

  • Last visited

2 Followers

About matkeane

Contact Methods

  • Website URL
    http://matkeane.com

Profile Information

  • Gender
    Male
  • Location
    Paris

  1. Another way to manage the load on a laptop production machine, while still using video previews, is to use Stage Tiers. If you assign parts of your show to separate Tiers, you can choose which to preview on the production machine. Alternatively, you can encode low-res, low-data-rate preview files and place them on a 'preview' Tier rendered only on the production machine, so that you have a visible timing reference (even a burnt-in timecode, in case corrections require a re-render) without overloading the machine.
  2. Hi Benoit, Ah, you're right! For some reason I thought the Datapath accepted both analogue and digital signals, but I guess I got mixed up between their older VisionAV cards and the VisionSC, which does appear to be digital only. That's a shame, but not a big problem really... Thanks and see you soon, Matthew
  3. Apologies for what is not strictly a Watchout question, but I wondered if anybody has managed to successfully capture a composite PAL source with a Datapath VisionSC-SDI4 card. The Datapath card is installed in a Watchout player (purchased from a Dataton distributor and otherwise working perfectly) currently running v6.2.2. However, I can't get the Datapath to recognise a PAL source on any of the 4 inputs in the Datapath Vision software - so I'm not even getting as far as configuring the video inputs in Watchout. I did wonder whether the Datapath still supports SD formats, but the data sheet for the VisionSC-SDI4 seems to show support for both PAL & NTSC signals (576i25, 480i29.97): https://www.datapath.co.uk/datapath-products/video-capture-cards/visionsc-range/visionsc-sdi4 The camera is a HikVision DS-2CC12D9T, which outputs either PAL or NTSC on the composite output. https://www.hikvision.com/uploadfile/image/9900_E资料模拟相机NewtemplateDatasheetD5T(18)DS2CC12D9T(A).pdf The reason I don't use an HD signal is that the camera is actually used with a legacy motion capture system which is configured for a 4:3 PAL source. Since the camera hangs alongside the main video projector for the show and needs to frame the stage, it would be handy to be able to see what I'm doing while adjusting the camera. Currently I have to use a laptop to VNC into the machine at FOH that's connected to the camera, but I was hoping I could make life a little easier by using the Datapath card to display the live input through the video projector. When the camera is connected - with the same video cable - to the machine running the motion capture software, which uses a Blackmagic capture card, the signal is recognised as a PAL source. Usually I find Datapath cards are very good at auto-detecting the input format, while Blackmagic cards are more picky.
Unfortunately I don't have access to the machine right now, and it will be a while before I can try this again, but I was just wondering if I'm missing something obvious... Thanks, and I hope everybody is staying well out there.
  4. Hi, If you search for 'countdown' on this forum you will find various ways of doing this, as well as a few feature requests for an automatic method. One easy way is to add a video with a countdown timer alongside your video media. The downside is that this adds some extra load to your machines and, if you switch your production machine to 'Video as thumbnails', you won't see the timer. Another way might be to use Watchnet to show the status of your timelines. I haven't actually done this, but I believe you can get the status of your timelines and show the remaining time until the next cue. The downside is that it requires another machine and a Watchout licence. The way I usually do this is to create a composition with a simple clock animation made from image layers. That way there are no extra video files to be played back, and it's still visible when displaying 'Video as thumbnails'. I drag it into my tasks and slide it along until it snaps to the end of my video layer, and then place it somewhere on the Stage where I can see it during playback. I often add it to a preview monitor so that colleagues can see it without me having to call it out. The downside is that it has to be placed manually, and I haven't got around to making a version longer than 1 minute, but I figure if you have more than a minute to go, there's nothing to worry about yet... There's an example Watchout project file with the media for the clock on my github here: https://github.com/matkeane/watchout-snippets/tree/master/countdown-clock
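If you go the Watchnet/external-scripting route mentioned above, the arithmetic for the readout is trivial once you have the timeline position from whatever status query you use (not shown here). A minimal Python sketch, assuming position and cue times arrive in milliseconds as Watchout reports them:

```python
def format_countdown(position_ms: int, next_cue_ms: int) -> str:
    """Return the time remaining until the next cue as M:SS.

    position_ms: current timeline position in milliseconds.
    next_cue_ms: timeline position of the next cue in milliseconds.
    Clamps to 0:00 if the cue time has already passed.
    """
    remaining_s = max(0, next_cue_ms - position_ms) // 1000
    minutes, seconds = divmod(remaining_s, 60)
    return f"{minutes}:{seconds:02d}"
```

For example, 50 s into a timeline with a cue at 1 m 50 s, `format_countdown(50_000, 110_000)` gives `"1:00"`.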
  5. I never got around to testing it, but I downloaded this Spout-to-NDI utility a while back. (Doesn't look like the site got their certificate sorted for https access, but the links still work if you edit them to use http.) http://spout.zeal.co/download-spout-to-ndi/
  6. I haven't tried nested free-running/looping. My impression, from comments on this forum by Mike Fahl and others, was that free-running and looping should only be done at one level, otherwise things get unpredictable. If I want to loop a video in a composition, I don't set free-running and looping on the cue within the composition, but only on the composition itself and that has worked reliably so far (haven't tested in v6.6 yet).
  7. Hi Tim, Nice - thanks for posting this. It's working for me on macOS 10.13.6 using a non-commercial licence for TouchDesigner 099.2019.20140. The icon images in the transport and cue-list windows seem to be missing, but other than that, it's working as described.
  8. I usually set up a static backdrop with logos or something so that it's hopefully not as noticeable when I switch to Standby. However, on a recent job using Watchout 6.5.0 and a 6-output rental machine (W9100?), any time I did an update (Ctrl-D) - whether in Online, Live or Standby mode - each output would sequentially switch to black and then back to picture output... Needless to say, that didn't go unnoticed. Standby content was in a composition on the main timeline, but the outputs were fed via Virtual Displays. There was too much else going on that day to try and identify a workaround, so I don't know if this was related to the Watchout version, the project setup, or the hardware.
  9. In the past, whenever I've had to deal with very large format PowerPoints, it's been split across multiple PCs. You just have to make sure you have _exactly_ the same number of steps on each slide (going forwards and backwards) and then use something like a PerfectCue to control both machines with the remote control (or a multi-cue, if you need 3 or 4 machines in sync). Managing changes across two presentations can be a pain, but it seems more likely that you will get a UHD output from a PowerPoint PC working than an 8K signal. Bear in mind that you probably won't get perfect sync between machines, though - even on slide transitions with supposedly identical machines - so you probably don't want to try playing video. Although if you need 8K video playback, then you have the Watchout, so...
  10. The question is not entirely clear, but I wonder whether the OP was asking about anamorphic lenses - like the concave mirror adapters used to shoot 360-degree video - or motorised mirror heads which allow a circular image to be projected in different directions... I've never seen the concave lenses used for projection, but there was a discussion on another forum about dome projections using a single projector and a convex hemispherical mirror - although apparently it's both fiddly to align and expensive. An example of motorised mirror heads can be found here: https://www.dynamicprojection.com/mirror-head-en/ - and I believe I saw a demo of something similar from Panasonic.
  11. Recent forum posts about slow network transfers when dealing with large media files reminded me of this... The ability to constrain which media gets pushed to which displays in Live Update mode would be useful in some cases. On the last couple of projects, I had two distinct sets of displays, with media created specifically for each area - content for the front screen will never be displayed on the back, and vice versa. When I activate Live Update, so that I can tweak some effects in real time, all media is pushed to all servers - resulting in a long wait while several hundred GB of files are transferred to players which will never use that content. The only workaround seems to be to temporarily deactivate displays, but then the client only gets to see half the content update live, so it's not really a solution in most cases. Other media server software seems to manage this problem with named servers and string matching on media filenames so that, for example, a media file named 'my_file_front.mov' won't be transferred to a server named 'back'. It seems a bit of a clunky solution, but it does work. I wonder whether, in Watchout, it would be possible to exclude certain folders in the media window from selected stage Tiers… That way I could put all media files destined for the front screens in one folder, right-click to bring up options for that folder, and select only the 'front' Tier to ensure media was never pushed to displays on other stage tiers.
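The filename-matching rule those other servers use is simple enough to sketch. This is a hypothetical reimplementation of the behaviour described above, not any vendor's actual code; the tier names and the trailing `_tier` convention are assumptions for illustration:

```python
def should_transfer(filename: str, server_name: str,
                    tiers: tuple = ("front", "back")) -> bool:
    """Decide whether a media file should be pushed to a named server.

    A file whose name ends in a tier tag (e.g. '_front' before the
    extension) is only sent to the server with the matching name.
    Untagged media is sent everywhere.
    """
    stem = filename.rsplit(".", 1)[0].lower()   # drop the extension
    for tier in tiers:
        if stem.endswith("_" + tier):
            return tier == server_name.lower()
    return True  # no tier tag: transfer to every server
```

So `should_transfer("my_file_front.mov", "back")` is `False`, which is exactly the behaviour the post describes; a Tier-based exclusion in the Watchout media window would achieve the same thing without encoding routing into filenames.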
  12. The 'Format' shows as MPEG-4, which is a bit surprising to me. I've only ever used HAP-encoded QuickTime movies with a .mov extension. Is this HAP video in an MPEG-4 container, or did it accidentally get renamed as an .mp4? Could that be what's confusing Watchout?
  13. Are you looping by jumping back to the start of the aux timeline, or by placing the audio tracks in a composition and setting free-run/looping on that in an aux timeline? I have found that the composition method usually works better, but I haven't actually tried with a seamless audio loop.
  14. Hi Mike, The problem I have encountered with compositions is that (sometimes) when media cues within a composition use a blend mode other than Normal, the cues 'pop' on and off instead of fading smoothly once I apply an opacity tween to the whole composition in a task. I initially ran into this with a fairly complex wait loop containing free-running loops and various blend modes. Nesting the whole thing in a composition gave hiccups with opacity tweens and looping; placed in a task, everything ran smoothly, but then it was difficult to fade everything out mid-sequence if necessary.
  15. I would find it really useful to have a master opacity and volume control for all cues within an auxiliary timeline, so that I can say something like 'Task 11 - fade out nicely over 2 seconds and then stop!'. Currently I'm doing this by creating generic inputs for my_task_opacity and my_task_volume, but then I need to add them to every media cue in each task timeline, which can be a slow process. For some simpler tasks, like live inputs, I nest a composition in a task with a fade-in/pause/fade-out, which achieves the same thing, but having to create compositions for every task is also time-consuming. If I could do something like 'setInput my_task_01.opacity 0 1000', I'd find that really useful.
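For what it's worth, the generic-input workaround can at least be driven from outside Watchout over the TCP control connection. A minimal Python sketch, assuming per-task inputs named as above; the host address is a placeholder, and the port number (3040 for the production computer) and exact setInput quoting are from memory of the Watchout control protocol, so verify them against the user guide:

```python
import socket

PRODUCTION_HOST = "192.168.0.10"  # assumption: production computer address
PRODUCTION_PORT = 3040            # assumption: Watchout production control port

def set_input(name: str, value: float, fade_ms: int = 0) -> str:
    """Build a setInput command string for a named generic input.

    Watchout commands are plain text terminated by CRLF; fade_ms is the
    transition time in milliseconds.
    """
    return f'setInput "{name}" {value} {fade_ms}\r\n'

def fade_task_out(sock: socket.socket, task: str, fade_ms: int = 2000) -> None:
    """Ramp a task's opacity and volume inputs to 0 over fade_ms."""
    sock.sendall(set_input(f"{task}_opacity", 0, fade_ms).encode("ascii"))
    sock.sendall(set_input(f"{task}_volume", 0, fade_ms).encode("ascii"))
```

Usage would be `fade_task_out(socket.create_connection((PRODUCTION_HOST, PRODUCTION_PORT)), "my_task")` - still one pair of inputs wired into every cue by hand, which is why a built-in per-task master fade would be so much nicer.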