
Mike Fahl

Everything posted by Mike Fahl

  1. I believe WO is still 32-bit only, so throwing more than 16 GB of RAM at it shouldn't make much difference. Just make sure the RAM is installed in an optimal way for the motherboard (some motherboards benefit from parallel memory channels when modules are installed across all RAM slots). Mike
  2. Sounds like you have some video content that causes WO to lock up. To figure out which one, remove the video files one by one, and try to "update" between each, until you find the offending item, then forward that file to Dataton. Such problems are sometimes caused by rogue codecs, but since you've just installed the machine according to the guidelines, that shouldn't be the case.
  3. The Instagram API I used back then is no longer available, so this function wouldn't work now anyway. PIXILAB Blocks has built-in support for Instagram feeds, but even that has been quite constrained by the latest round of Instagram API restrictions. Mike
  4. As far as I recall, this can't be done in a WATCHNET script. In case you're not tied to WATCHNET for your application, PIXILAB Blocks provides similar control capabilities for WATCHOUT (plus a whole lot more), and offers more flexible programming, allowing for the kind of functionality you're asking for here. In Blocks, scripts are called "tasks". Tasks are organized into groups. Tasks in a group can be set to be mutually exclusive, in which case starting another task will automatically terminate any previous task in that group, which sounds like exactly what you're asking for. More about Blocks here (with a link to the manual at the bottom of the page): http://pixilab.se/blocks Mike
  5. The reason you can't create them in comps is that they can't control the comp itself (since the comp is entirely governed by its enclosing timeline). Hence, to protect you from this potentially confusing situation, I decided not to allow control cues to be created in comps. However, as you noticed, you can "cheat" by pasting control cues into the timeline. And if you're controlling another timeline (by name), this can actually be useful, as in this special case where you're telling timelines to STOP. It then also makes sense that the control cue would NOT be applied to its enclosing timeline (since control cues in comps weren't allowed for this very reason). So control cues in this context targeting the governing timeline are therefore simply ignored, which can be quite useful. If the behavior is still there, I doubt it will be removed (even though it is undocumented). And I did make it that way for a reason ;-). Mike
  6. I believe what I recalled was that you could make an aux timeline (or composition) with a bunch of control cues to kill timelines. This "bunch of control cues" can include the aux timeline that's firing the bunch (allowing you to use the same comp from all those timelines). If it DOES include the aux timeline that's firing the bunch, that one will be ignored in this context. However, it has been quite a few years since I put that in, and I doubt anyone has ever used it. I have no idea whether it still works that way. But that was my idea at the time. However, this is NOT a "shotgun" kill-all-but-me method. You still have to add an individual control cue for each target timeline in the bunch of timelines you need to deal with. This mechanism just saves you from having to create a separate set of control cues for each timeline, including all EXCEPT the firing timeline, which would make the amount of duplication far worse. Mike
  7. I made some tests with bumping the priority of WP.EXE way back, but came to the conclusion at that time that increasing the priority of WP.EXE actually made things WORSE. There are a lot of things going on under the hood, which all need to share the same CPU. Bumping the priority of some tasks usually has a detrimental effect on others, resulting in an "unbalanced" system. Also, keep in mind that there are usually TWO processes related to the display computer. WATCHPOINT.EXE is really just the "watchdog" process (to restart the display software if it crashes), while WP.EXE is the actual player process. At least that was the case last I looked. Mike – http://pixilab.se
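     A quick way to see both processes and the priority class they're actually running at is a few lines of Python with the third-party psutil package (a sketch only, assuming the process names are still WATCHPOINT.EXE and WP.EXE):

     ```python
     import psutil  # third-party: pip install psutil

     for p in psutil.process_iter(["name"]):
         name = (p.info["name"] or "").lower()
         if name in ("watchpoint.exe", "wp.exe"):
             # nice() reports the process priority class on Windows
             print(p.info["name"], p.pid, p.nice())
     ```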
  8. Quim, I doubt that will give good performance. The whole idea with HAP is to use the GPU to do the final decoding step. I don't think WO5 exposes what's needed for a DS or QT codec to do its job efficiently. Mike – http://pixilab.se
  9. I don't believe "brain surgery" at that granularity is possible with WATCHNET (at least it wasn't when I wrote it). It is, however, possible in PIXILAB Blocks, which does what WATCHNET does and a whole lot more. Read more about Blocks on Dataton's website: https://www.dataton.com/press/dataton-appoints-pixilab-as-new-solution-provider or straight from the "horse's mouth" at http://pixilab.se/blocks Mike
  10. That seems correct. There are only 30 conditions, internally mapped to bits in a 32-bit word if I recall correctly, where 1073741824 decimal corresponds to 0x40000000 hex, which has all of the low 30 bits set to zero. A nifty trick to turn all conditions OFF, rather than reverting to the setting in Preferences (which a zero here would do). Another, somewhat less cryptic, option would of course be to leave the Preference settings all turned OFF, in which case setting the value to 0 would indeed turn them all off. But if you want to keep both options, your "hack" seems valid. Mike – http://pixilab.se
  11. Kinect is no longer available AFAIK. When you say "motion sensor", perhaps you're just talking about "someone moving in front of a sensor" in general, rather than tracking some motion? If so, you may want to look at standard PIR sensors (infrared motion detectors). Although you can't connect such a sensor directly to WATCHOUT, you can do so through some kind of control system, such as PIXILAB Blocks. Mike
  12. Wouldn't the virtual display rendering delay be dependent on the order of render target processing? I.e., whether a particular virtual display is rendered before or after another one? If virtual display A comes before virtual display B in the rendering sequence, A would have been updated before its content is potentially rendered to B, resulting in no delay between the two. However, if A comes after B in the rendering sequence, there will be a 1 WO frame delay between the two. Or am I missing something here? Assuming things work as I think they do here, what's needed would be some way to control the rendering sequence order. Then this would be more predictable. Perhaps Justin's idea of sorting them top/bottom and left/right would be a good starting point. Then one could put virtual displays at negative Y coordinates, in the order one prefers them to render. My 2c anyway... Mike - http://pixilab.se
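      For anyone who wants to check the arithmetic, here's a small worked example in Python. The mapping of condition 1 to bit 0 is my assumption about the internal layout; the point is simply that 0x40000000 leaves all 30 condition bits cleared:

      ```python
      def condition_mask(*conditions):
          """Build a mask with the given condition numbers (1..30) turned on."""
          mask = 0
          for c in conditions:
              assert 1 <= c <= 30
              mask |= 1 << (c - 1)       # assuming condition 1 maps to bit 0
          return mask

      print(condition_mask(1, 3))          # 5 -> conditions 1 and 3 on
      print(hex(1073741824))               # 0x40000000 -> only bit 30 set
      print(1073741824 & ((1 << 30) - 1))  # 0 -> none of the 30 condition bits are set
      ```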
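      If you'd rather skip a full control system, a PIR sensor can also be bridged to WATCHOUT with a small script on whatever box reads the sensor. A rough sketch only, assuming the production computer's TCP control port is 3040 (as I recall) and using a made-up aux timeline name; detect_motion() is a placeholder for however your sensor is actually read:

      ```python
      import socket
      import time

      def detect_motion():
          # Placeholder: replace with a GPIO read or serial poll for your PIR sensor.
          return False

      PRODUCTION_PC = ("192.168.0.10", 3040)  # example address; 3040 = production control port, as I recall

      with socket.create_connection(PRODUCTION_PC) as s:
          while True:
              if detect_motion():
                  s.sendall(b'run "Attract"\r')  # start a hypothetical aux timeline named "Attract"
                  time.sleep(10)                 # crude cooldown so we don't retrigger immediately
              time.sleep(0.1)
      ```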
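      To make the reasoning concrete, here's a toy model (nothing WATCHOUT-specific, just my assumption about how the render order would play out), where A renders its own content each pass and B composites whatever A has rendered so far:

      ```python
      def simulate(render_order, passes=3):
          latest_a = None                   # A's most recently rendered frame
          for frame in range(passes):
              for vd in render_order:
                  if vd == "A":
                      latest_a = frame      # A renders its own content this pass
                  else:                     # vd == "B" composites A's latest output
                      print(f"pass {frame}: B composites A's frame {latest_a}")

      simulate(["A", "B"])  # B always gets A's frame from the same pass -> no extra delay
      simulate(["B", "A"])  # B gets A's frame from the previous pass -> one frame behind
      ```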
  13. Stutter may come from many sources, for example: inadequate hardware to play the content (either in isolation or along with other content), or a mismatch in framerate (e.g., 25 fps in; 60 fps out). The first point can only be fixed by having adequate hardware for the content at hand. The second point seems to be most of what you're concerned with here. Historically, this has always been a concern in WATCHOUT, and having a source framerate that's an even multiple of the output (graphics card) framerate is always advantageous. For example, if WATCHOUT outputs 60 fps, using video that plays at 60 or 30 fps is optimal. If you play 25 fps video in this case, there will always be some "temporal aliasing" going on, which can be seen as stutter. The introduction of frame blending in WATCHOUT alleviates this to some extent, by blending adjacent source video frames together when framerates don't match, to make the resulting video framerate match the output rate. This often results in smoother perceived playback, but may also introduce some blurriness due to the frame blending itself. Finally, I would see no real benefit in upsampling 30 fps to 60 fps when making the video files (regardless of codec). If the source material is 30 fps, you won't really gain anything by outputting two identical frames (at 60 fps) for every input frame. You're really just wasting resources by having to play back twice the amount of data without any advantage. If there's some processing, though (such as After Effects' vector-based frame blending), that may in some cases result in smoother playback, since you then synthesize those missing in-between frames. For some content, this may result in dramatic improvements in smoothness, while for other content it results in strange artifacts that just make things look worse. Of course, the smoothest results will be achieved when playing back content shot at 60p with an output rate of 60 fps (or 50p on 50 fps if you're in PAL land). Mike - http://pixilab.se/
  14. Are you setting the proper resolution in Windows or AMD's control panel before going online? If not, WATCHOUT will try to switch the resolution (along with the frequency). When I did this part of WATCHOUT way back, I made sure that if the resolution was set properly before going online, WATCHOUT would not attempt to make any changes at all to the display config. That may have changed since, though. Mike - http://pixilab.se/
  15. The HTTP flavor of the network video is intended to play video from a file on a server, while NDI is a more demanding "streaming" type of protocol, more commonly used with cameras and other live feeds. In our case, we want to use HTTP from a server to make it easier to update content on a separate server, without having to change the show programming. Mike
  16. Well, the network video media item in WO is supposed to handle both RTSP (proper streaming) and plain HTTP file access. The point with the HTTP video file access is that one can update a video file on a web server, and then have WO pick it up "dynamically", without someone having to edit and push a new show. Such arrangements can streamline content management when the client wants to be able to update some content on a more regular basis. It works fine for still images (using an Image Proxy that points to an image URL). It's supposed to also work for video (since 6.2.1 if I recall correctly), but that part seems broken in 6.2.2. I never got around to using it in 6.2.1, so I don't know if it worked there. Mike – http://pixilab.se/
  17. I tried playing video from an HTTP server using a Network Video media item. The video appears on the production PC, but never plays (it just shows the first frame as a still image). On the display PC it doesn't appear at all (not even as a still). Has anyone else managed to make this work in 6.2.2, or is it just broken? The video I'm trying to play is a 3840x720 H.264 MP4 file that plays fine in WATCHOUT when used as a regular media file, so the file is known to be good. Mike
  18. Yes, that definitely seems to be a networking error of some kind. I believe the underlying error is: [SocketException (0x274c): A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond] Mike – http://pixilab.se
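      Here's a small illustration of that cadence mismatch, counting how many output refreshes each source frame ends up occupying (nothing WATCHOUT-specific, just the arithmetic):

      ```python
      def repeat_pattern(source_fps, output_fps, frames=10):
          """How many output refreshes each of the first few source frames occupies."""
          pattern = []
          for i in range(frames):
              start = round(i * output_fps / source_fps)
              end = round((i + 1) * output_fps / source_fps)
              pattern.append(end - start)
          return pattern

      print(repeat_pattern(30, 60))  # [2, 2, 2, 2, ...] -> even cadence, smooth
      print(repeat_pattern(25, 60))  # [2, 3, 2, 3, 2, 2, ...] -> uneven cadence, seen as stutter
      ```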
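      For testing the HTTP side of this, any static web server will do. A minimal sketch using Python's built-in server (folder and port are examples; note it doesn't handle HTTP range requests, which some clients expect for video):

      ```python
      from functools import partial
      from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

      # Serve /path/to/videos at http://<server-ip>:8000/, then point the
      # Network Video item (or an Image Proxy) at a file URL such as .../clip.mp4
      handler = partial(SimpleHTTPRequestHandler, directory="/path/to/videos")
      ThreadingHTTPServer(("", 8000), handler).serve_forever()
      ```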
  19. > the obj file in watchout would need to move using x,y,z and rotation tweens. I guess the tricky part is to have the 3D object in WATCHOUT move exactly like the physical set piece, which would be required for this to work. But if you can accomplish that, it should work. An alternative may be to use the new tracking feature in WATCHOUT, which as far as I understand does this positioning for you, based on camera tracking data. Mike – http://pixilab.se
  20. I doubt attempting to play video through the dynamic image server will perform well. However, WATCHOUT does support playing video using a video proxy pointed at a web server. Assuming you can place the rendered video file on a web server, this may be an option. Since the OP's question was related to text titles, another option could be to use a title generator. I assume title generators can be found that output NDI too. Finally, depending on your needs, you may not need video at all if all you want is a text title. We've done dynamic text like this for WATCHOUT with Blocks on a couple of occasions. Blocks provides control over how the text is formatted (size, styles, fonts, colors, etc). The result is rendered to an image (with a proper alpha channel), which can then be loaded dynamically into WATCHOUT using an image proxy. Mike – http://pixilab.se
  21. I don't believe HAP plays through DXVA in WATCHOUT, so I doubt the tool Thomas suggests will actually tell you anything relevant. It may very well be a useful tool in other cases, but likely not in relation to HAP, which is built into WATCHOUT and makes no use of DXVA as far as I know. Mike http://pixilab.se/
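      If you want to roll the text-to-image part yourself rather than use Blocks, the same idea can be sketched with the Pillow library; the font, sizes, text and file names below are placeholders:

      ```python
      from PIL import Image, ImageDraw, ImageFont  # third-party: pip install Pillow

      font = ImageFont.truetype("arial.ttf", 96)           # font file and size are placeholders
      img = Image.new("RGBA", (1920, 200), (0, 0, 0, 0))   # fully transparent canvas
      draw = ImageDraw.Draw(img)
      draw.text((20, 40), "Speaker name", font=font, fill=(255, 255, 255, 255))
      img.save("title.png")   # point an Image Proxy at wherever this file is published
      ```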
  22. PIXILAB Blocks works very well with WATCHOUT and comes with a PJLink driver, as well as numerous others. Panels can be made using any iOS/Android device. Blocks may be overkill if all you need is just a control panel that can fire off strings. But if you have a need for a more complete control system solution – including lighting and various display technologies – it might be of interest. More technical details in the wiki. Mike
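      Just to show what "firing off strings" amounts to for a PJLink projector, here's a barebones power-on example (a sketch only, assuming PJLink authentication is disabled on the projector; the address is a placeholder and error handling is omitted):

      ```python
      import socket

      PROJECTOR = ("192.168.0.20", 4352)   # example address; 4352 is the standard PJLink port

      with socket.create_connection(PROJECTOR, timeout=5) as s:
          print(s.recv(128))           # expect something like b"PJLINK 0\r" when auth is off
          s.sendall(b"%1POWR 1\r")     # PJLink class 1 "power on" command
          print(s.recv(128))           # e.g. b"%1POWR=OK\r" on success
      ```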
  23. You didn't say specifically that you needed to play video. What video format do you want to play that contains an alpha channel?
  24. I did a write-up a while back that shows how this can be done either using the Dynamic Image Server or a separate piece of software called vMix, in conjunction with PIXILAB Moments for creating audience interaction based on their mobile phones. More here: Mike
  25. There's a new write-up in our wiki on how to use PIXILAB Moments for live audience interaction with WATCHOUT using the newly added NDI streaming video capability, which may interest some of you. Find it here: https://int.pixilab.se/docs/moments/watchout It contrasts three ways of getting such dynamic content into WATCHOUT: two based on NDI (the Dynamic Image Server and a separate piece of software called vMix), plus traditional capture card solutions, listing the pros and cons of each method. Moments is a cloud-based software for audience interaction during conferences and similar events. A short introduction video can also be found in our wiki: https://int.pixilab.se/docs/moments Mike