Dataton Forum

matkeane

Member
  • Content Count

    126
  • Joined

  • Last visited

1 Follower

About matkeane

  • Rank

Contact Methods

  • Website URL
    http://matkeane.com

Profile Information

  • Gender
    Male
  • Location
    Paris

Recent Profile Visitors

699 profile views
  1. I never got around to testing it, but I downloaded this Spout-to-NDI utility a while back (doesn't look like the site got their certificate sorted for https access, but the links still work if you edit them to use http). http://spout.zeal.co/download-spout-to-ndi/
  2. I haven't tried nested free-running/looping. My impression, from comments on this forum by Mike Fahl and others, was that free-running and looping should only be set at one level; otherwise things get unpredictable. If I want to loop a video in a composition, I don't set free-running and looping on the cue within the composition, but only on the composition itself, and that has worked reliably so far (I haven't tested in v6.6 yet).
  3. Hi Tim, Nice - thanks for posting this. It's working for me on MacOS 10.13.6 using a non-commercial licence for TouchDesigner099.2019.20140. The icon images in the transport and cue-list windows seem to be missing, but other than that, it's working as described.
  4. I usually set up a static backdrop with logos or something so that it's hopefully not as noticeable when I switch to Standby. However, on a recent job using Watchout 6.5.0 and a 6-output rental machine (W9100?), any time I did an update (CTRL-D) - whether in Online, Live or Standby mode - each output would sequentially switch to black and then back to picture output... Needless to say, that didn't go unnoticed. Standby content was in a composition on the main timeline, but the outputs were fed via Virtual Displays. There was too much else going on that day to try to identify a workaround, so I don't know whether this was related to the Watchout version, the project setup, or the hardware.
  5. Some interesting topics there... For those of us unable to attend ISE, any chance that the presentations might be recorded for later viewing?
  6. In the past, whenever I've had to deal with very large-format PowerPoints, it's been split across multiple PCs. You just have to make sure you have _exactly_ the same number of steps on each slide (going forwards and backwards) and then use something like a PerfectCue to control both machines with the remote control (or a multi-cue, if you need 3 or 4 machines in sync). Managing changes across 2 presentations can be a pain, but it seems more likely that you will manage to get a UHD output from a PowerPoint PC working than an 8K signal. Bear in mind that you probably won't get perfect sync between machines though - even on slide transitions with supposedly identical machines - so you probably don't want to try playing video. Although if you need 8K video playback, then you have the Watchout so...
  7. The question is not entirely clear, but I wonder whether the OP was asking about anamorphic lenses - like the concave mirror adapters used to shoot 360 degree video - or motorised mirror heads which allow a circular image to be projected in different directions... I've never seen the concave lenses used for projection, but there was a discussion on another forum about dome projections using a single projector and a convex hemispheric mirror - although apparently it's both fiddly to align and expensive. An example of motorised mirror heads can be found here: https://www.dynamicprojection.com/mirror-head-en/ - and I believe I saw a demo of something similar from Panasonic.
  8. Recent forum posts about slow network transfers when dealing with large media files reminded me of this... The ability to constrain which media gets pushed to which displays in Live Update mode would be useful in some cases. On the last couple of projects, I had two distinct sets of displays, with media created specifically for each area - content for the front screen will never be displayed on the back, and vice versa. When I activate Live Update so that I can tweak some effects in real-time, all media is pushed to all servers - resulting in a long wait while several hundred GB of files are transferred to players which will never use that content. The only workaround seems to be to temporarily deactivate displays, but then the client only gets to see half the content update live, so it's not really a solution in most cases. Other media-server software seems to manage this problem with named servers and string matching on media filenames so that, for example, a media file named 'my_file_front.mov' won't be transferred to a server named 'back'. It seems a bit of a clunky solution, but it does work. I wonder whether, in Watchout, it would be possible to exclude certain folders in the media window from selected stage tiers... That way I could put all media files destined for the front screens in one folder, right-click to bring up options for that folder, and select only the 'front' tier to ensure media was never pushed to displays on other stage tiers.
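The name-matching convention described above (from other media-server software) can be sketched as follows. The server names and filenames are illustrative, and this is not an existing Watchout feature:

```python
SERVERS = ["front", "back"]  # hypothetical server names

def targets_for(filename: str) -> list[str]:
    """Return the servers a media file would be pushed to: if any server
    name appears in the filename, push only to the matching servers;
    otherwise the file is assumed to be shared and goes everywhere."""
    matched = [s for s in SERVERS if s in filename.lower()]
    return matched or list(SERVERS)

# 'my_file_front.mov' matches 'front', so it never reaches 'back'
assert targets_for("my_file_front.mov") == ["front"]
# no server name in the filename, so it is pushed to all servers
assert targets_for("logo_loop.mov") == ["front", "back"]
```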
  9. The 'Format' shows as Mpeg-4, which is a bit surprising to me. I've only ever used HAP encoded Quicktime movies with a .mov extension. Is this Hap video in an MPEG-4 container, or did it accidentally get renamed as an .mp4? Could that be what's confusing Watchout?
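One quick way to check what the file actually contains, assuming FFmpeg's ffprobe is available (the filename below is a placeholder):

```shell
# codec_name should read 'hap' for Hap-encoded video; a major_brand of
# 'qt  ' indicates a genuine QuickTime container, while 'isom' or 'mp42'
# would mean the file really is an MP4, not just a renamed .mov.
ffprobe -v error \
  -show_entries stream=codec_name:format_tags=major_brand \
  -of default=noprint_wrappers=1 \
  suspect_clip.mp4
```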
  10. Are you looping by jumping back to the start of the aux timeline, or by placing the audio tracks in a composition and setting free-run/looping on that in an aux timeline? I have found that the composition method usually works better, but I haven't actually tried with a seamless audio loop.
  11. Hi Mike, The problem I have encountered with compositions is that (sometimes) when media cues within a composition use a blend mode other than Normal, the cues 'pop' on and off instead of fading smoothly once I apply an opacity tween to the whole composition in a task. I initially ran into this with a fairly complex wait loop containing free-running loops and various blend modes. Nesting the whole thing in a composition gave hiccups with opacity tweens and looping; placed in a task, everything ran smoothly, but then it was difficult to fade everything out mid-sequence if necessary.
  12. I would find it really useful to have a master opacity and volume control for all cues within an auxiliary timeline, so that I can say something like 'Task 11 - fade out nicely over 2 seconds and then stop!'. Currently I'm doing this by creating generic inputs for my_task_opacity and my_task_volume, but then I need to add them to every media cue in each task timeline, which can be a slow process. For some simpler tasks, like live inputs, I nest a composition in a task with a fade-in/pause/fade-out, which achieves the same thing, but having to create compositions for every task is also time-consuming. If I could do something like 'setInput my_task_01.opacity 0 1000', I'd find that really useful.
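For what it's worth, the generic-input workaround above can at least be driven from outside Watchout via the production computer's text-based control protocol. This is a hedged sketch: the IP address is made up, and the port number (3040) and the exact setInput syntax should be verified against the Watchout manual for your version.

```python
import socket

HOST, PORT = "192.168.0.10", 3040  # production machine (hypothetical address)

def send_command(sock, command: str) -> None:
    """Watchout control commands are CRLF-terminated ASCII lines."""
    sock.sendall((command + "\r\n").encode("ascii"))

def fade_task_out(host: str = HOST, port: int = PORT) -> None:
    """Drive the 'my_task_opacity' generic input (from the post above) to 0."""
    with socket.create_connection((host, port), timeout=5) as sock:
        send_command(sock, "authenticate 1")               # minimal auth level
        send_command(sock, 'setInput "my_task_opacity" 0')  # opacity to zero
```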
  13. To piggyback on Cowboyclint's suggestion, it would be nice if it were possible to change the timecode display format for a timeline (or for all timelines based on the Project frame rate) and let Watchout deal with the conversion to milliseconds. Content creators (in my part of the world at least) are more familiar with working at 25fps, and doing timecode calculations in my head while also multiplying by 40ms per frame seems like something a computer would probably do faster and more accurately! Also, the ability to enter relative timecodes would be great - e.g. hit ctrl+J and then type +12.20 to jump forward 12 seconds and 20 frames from the current timeline position.
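The 25 fps arithmetic mentioned above is simple enough to sketch (names are illustrative): each frame lasts 1000/25 = 40 ms, so a timecode, or a relative entry like '+12.20', converts to milliseconds as follows:

```python
MS_PER_FRAME = 1000 // 25  # 40 ms per frame at 25 fps

def timecode_to_ms(hours: int, minutes: int, seconds: int, frames: int) -> int:
    """Convert a 25 fps timecode to the milliseconds Watchout works in."""
    return ((hours * 3600 + minutes * 60 + seconds) * 1000
            + frames * MS_PER_FRAME)

def relative_jump(current_ms: int, seconds: int, frames: int) -> int:
    """A relative entry like '+12.20' (12 seconds, 20 frames)."""
    return current_ms + seconds * 1000 + frames * MS_PER_FRAME

assert timecode_to_ms(0, 1, 0, 20) == 60800   # 1 min + 20 frames
assert relative_jump(5000, 12, 20) == 17800   # +12s20f from the 5 s mark
```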
  14. I'm currently working on a Watchout installation at the Paris airshow. In addition to the production and player machines, I have a Watchnet 1.4 server running and a Surface Pro that the client can use to cue various clips on demand. So far the Watchnet setup is working well, but I have a few questions... Occasionally, before or after the show opens, I need to make changes in the control booth, so I put the show in Standby (triggered by Production, not Watchnet). Is there a way to show the standby status in a Watchnet panel, so that the user outside on the floor can understand why the buttons are unresponsive? I wondered about creating a 'standby' panel and forcing the Watchnet UI to navigate to a holding page, but that would require somehow triggering a 'navigate' command on the remote UI. I can just switch the screen off, of course, but a blank screen tends to worry the client, plus it means a lot of walking back and forth! Is it possible to trigger Watchnet commands from Production? Now that I have all my scripts set up in the Watchnet server, it would sometimes be handy to be able to trigger them when I need to launch a specific task from Production, without recreating the same events in my Tasks. Is there a way to temporarily disable buttons via a script? Once certain tasks are running, I'd like to disable the other buttons until the current task is finished, to avoid lots of videos being launched at the same time. And finally, just a detail: is there a way to add a newline to button text? I was trying to add the clip duration under the title, but my attempts at adding '\n' and '<br/>' didn't get me anywhere, and extra spaces seem to get stripped out. Thanks for any help and suggestions!
  15. To expand on JFK's suggestion - I usually then put the live input and the drop shadow/border layer together in a composition so that I can move and scale the whole thing as required without 2 sets of Tweens to manage. Placing the live input in a composition doesn't seem to affect the latency.