
Mike Fahl

Member
  • Posts

    720
  • Joined

  • Last visited

Everything posted by Mike Fahl

  1. It could also be a bug in the graphics card driver. If you have access to another type of computer/graphics card, you may want to see if playing the show on that computer solves the problem. If so, I would venture that it's the hardware and/or driver that's causing the problem. Mike
  2. I recall Dylan at Penmac in South Africa did something like this a while back. Hopefully he'll see this. If not, perhaps someone at Dataton can put you in contact (or send them an email). Mike
  3. I believe there's an upper limit on the max size of a single video, ultimately dictated by the texture size supported by the graphics card, currently likely 8K-by-8K for most graphics cards. So a 12000-pixel-wide video may fail to play for this reason alone. Dealing with very wide videos often requires pre-splitting the video. In your case, the video could be pre-split into three 4K-wide videos. Playback performance may of course depend on a number of other factors, as pointed out by Jonas and others above.
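As a concrete illustration of the pre-splitting step, here is a sketch that generates ffmpeg commands cropping three 4000-pixel-wide strips out of a 12000-pixel-wide source (file names are hypothetical; ffmpeg's standard crop filter takes width:height:x:y):

```python
# Sketch: build ffmpeg commands that pre-split a 12000 px wide video
# into three 4000 px wide strips. File names are made up for the example.
def split_commands(src="wide_12000.mov", strips=3, strip_w=4000):
    cmds = []
    for i in range(strips):
        x = i * strip_w  # left edge of this strip, in pixels
        # crop=w:h:x:y -- "ih" keeps the full input height
        cmds.append(
            f'ffmpeg -i {src} -filter:v "crop={strip_w}:ih:{x}:0" strip_{i}.mov'
        )
    return cmds

for cmd in split_commands():
    print(cmd)
```

Each strip then gets its own cue in WATCHOUT, positioned side by side on stage.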
  4. The easiest is probably to use the color balance sliders of the display to just back all colors down a bit. That will decrease the overall brightness without changing the color tint.
  5. I see no reason why a plain fade to value should result in "jitter". You may want to bring this up with support. Mike
  6. Should have nothing to do with the above formula as far as I can tell. Where's the input variable coming from? Most likely, there's some "jitter" here.
  7. Try variable*10 - ((variable*10) % 10) instead. The % operator works only with integers, not with fractions.
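The effect of that expression can be checked with a quick sketch (Python's own % also accepts fractions, so the sketch truncates the product to an integer first, mimicking the integer-only % described above):

```python
def stepped(variable):
    # variable*10 - ((variable*10) % 10), with the product truncated
    # to an integer first so % operates on whole numbers only
    scaled = int(variable * 10)
    return scaled - (scaled % 10)

print(stepped(3.47))  # 34 - 4 -> 30
print(stepped(0.5))   # 5 - 5 -> 0
```

In other words, the formula snaps the scaled value down to the nearest multiple of 10, which is what removes the jitter from a continuously varying input.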
  8. I believe the OP asks about exporting a 3D Projector's view tab as a video file, while running the show, to give the client a 3D visualization of the show. As far as I know, this view can't be exported (only the stage window can). As a possible workaround, you could set up something like Camtasia to record the part of the display that shows the "3D camera view" as the show runs. I believe they have a demo version that you can run for a week or so, allowing you to test this method. Mike
  9. To clarify: the clock speed is a GLOBAL setting, so while you can't change it for one particular timeline, it will slow down or speed up ALL timelines. Also, it may or may not work with audio playback, so I'd stay away from using audio in the show if you attempt to use this speed-changing command. Mike
  10. Yes, that's fine. However, while WATCHNET can control any number of clusters, WATCHOUT production software can only control one cluster at a time. Obviously, you should not attempt to control the same cluster from both at the same time. Mike
  11. Normally, they're in the "log" folder in WATCHOUT's program folder. Mike
  12. The reason you can play a very large number of the same video all at the same time is that WATCHOUT will recognize this duplication and really only play a single instance internally, while displaying those same pixels in 100 different places on stage. Thus it will only decode that video once (which is where the heavy lifting happens), and stamp out those same pixels many times (done on the GPU, so this is very fast). Note that this optimization works only if all the specs in the cues are identical, and the cues start at exactly the same time. Mike
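The decode-once idea can be sketched as a cache keyed on the cue's specs and start time (names and structure here are hypothetical; WATCHOUT's internals obviously differ):

```python
decode_count = 0   # counts the expensive decode operations
_cache = {}

def get_decoded(file, specs, start_time):
    """Return a shared decoded instance; decode only on a cache miss."""
    global decode_count
    key = (file, specs, start_time)  # every field must match exactly
    if key not in _cache:
        decode_count += 1            # the heavy lifting, done once
        _cache[key] = f"decoded:{file}"
    return _cache[key]

# 100 identical cues share a single decode...
for _ in range(100):
    get_decoded("clip.mov", "1080p", 0.0)
print(decode_count)  # 1

# ...but one cue with a different start time forces a second decode
get_decoded("clip.mov", "1080p", 0.5)
print(decode_count)  # 2
```

This also shows why the optimization silently stops applying when any spec or start time differs: the cache key no longer matches.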
  13. It may be necessary to wait for the connect event from the aux timeline before attempting to subscribe to its "playing" status. Mike
  14. The Flash (AS3) API is probably the easiest route if you want to build something in Flash/AIR. It's fairly well documented with ASDoc files, assuming you have the complete SDK. Here's the docs: http://academy.dataton.com/sites/default/files/docs/WATCHMan/AS3/index.html I'm not sure the rest of the SDK is still available on the web, though, but you may have a copy from before. To work with Aux Timelines, call GetAuxTimeline to get access to the desired timeline, then use the methods on the returned AuxTimeline object to control or obtain info on the aux timeline. There may still be some sample code on the academy site that shows how to use this API from Flash. Mike
  15. Try selecting "premultiplied by white" for the problematic image, to see if that fixes the problem. If so, it sounds like WATCHOUT's automatic alpha type detection isn't working properly. Mike
  16. Yes, those messages sure seem legit. Just to make sure I understand this properly: are you playing 50+ videos simultaneously or sequentially? What codec is being used? There's a protocol command to check for available memory that may help you in tracking this one down. Please check with Dataton support for the exact syntax here. The idea being that you could issue this command repeatedly (using a telnet client or similar) while running those videos and then see what impact it has on memory from WATCHOUT's point of view. Mike
  17. My understanding was that you only used the dynamic image server to serve static images being dumped into a folder. Can you elaborate a bit more on your workflow here, to see if there's anything that can be made to fit? BTW, I agree with your findings in relation to the dynamic image server, and that it is likely a bug (unless I'm overlooking something in what you're trying to do, bringing me back to my question above). Mike
  18. What's the exact error message you're getting? Can you provide a screenshot of the message? Your notes on chunked HAP videos are correct. Using chunks allows the use of multiple CPU cores working on the same video. This can help if you want to play one or two huge files. But it may actually hamper performance when playing many smaller videos, as those would be spread out over multiple cores anyway. Possibly, WATCHOUT could be a bit smarter here in how it allocates threads to HAP chunks, taking the number of simultaneously played videos into account. Mike
  19. If all you need to do is load images that sit on another computer ready to go, you might want to consider the ability of the Image Proxy to load from a web server. Use something like WAMP or MAMP to turn the computer holding the images into a web server, then point the proxy there. This may be more reliable, albeit less flexible, than the Dynamic Image Server. Mike
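As a lightweight stand-in for WAMP/MAMP, even Python's built-in web server can publish a folder of images over HTTP (the folder name and file contents below are made up for the example; in practice you'd point the Image Proxy at this machine's address and port):

```python
import http.server
import pathlib
import socketserver
import threading
import urllib.request
from functools import partial

# Create a sample image folder to serve (placeholder bytes, not a real PNG)
pathlib.Path("images").mkdir(exist_ok=True)
pathlib.Path("images/logo.png").write_bytes(b"fake-png-bytes")

# Serve the current directory; port 0 lets the OS pick a free port
handler = partial(http.server.SimpleHTTPRequestHandler, directory=".")
server = socketserver.TCPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Fetch the file back over HTTP, as the Image Proxy would
port = server.server_address[1]
data = urllib.request.urlopen(f"http://127.0.0.1:{port}/images/logo.png").read()
print(data)
server.shutdown()
```

A plain static server like this has fewer moving parts than the Dynamic Image Server, which is where the reliability gain comes from.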
  20. Yes, 4096 can be problematic. Can you drop it down to 4080 instead? If so, that may help getting around this particular problem. Alternatively, you may want to look into other codecs such as HAP or H.264. Mike
  21. I have a vague memory that you can do this using a Control cue in a Composition set to kill all timelines. It should then kill all EXCEPT the timeline from which the composition is run. I made it deliberately this way to do what you requested (i.e. kill all timelines except "this"). You can delete it by de-selecting it through the Tween menu. Same for all tweens. The Move command allows you to spatially move all selected cues (even those NOT intersecting the current time position). Mike
  22. What you're asking for will work. WATCHOUT can accept multiple control sources simultaneously. Obviously, you need to take care not to send conflicting commands from the different sources, or you may confuse your users. WATCHNET talks TCP to WATCHOUT, although this has no bearing on your main question. Mike
  23. Claude, I'm truly humbled by your thoroughness. I think the way you go about things here is very relevant and gives you good and useful data. I bet everyone here would like to learn more about your findings in relation to various hardware/software configurations, in case you have anything you want to share. Mike
  24. > Mike, when you saw the issue, I assume it was in 6.1.4 as mentioned in the previous post, correct? Yes > Was the media presplit? No > Was there anything else out of the ordinary? No > Was there a preview file attached to the HapQ movie? No Mike
  25. Just a clarification. I saw this HapQ colorization problem on a regular 2D display. I.e., not related to 3D texture mapping. Mike