Dataton Forum

Mike Fahl

Dataton Partner
  • Content Count

    639
  • Joined

  • Last visited

About Mike Fahl

  • Rank
     CTO, PIXILAB AB

Contact Methods

  • Website URL
    http://pixilab.se/

Profile Information

  • Gender
    Male
  • Location
    Linköping, Sweden


  1. Mike Fahl

    Hap HapQ for WO5

    Quim, I doubt that will give good performance. The whole idea with HAP is to use the GPU to do the final decoding step. I don't think WO5 exposes what's needed for a DS or QT codec to do its job efficiently. Mike – http://pixilab.se
  2. Mike Fahl

    WATCHNET panel transfer

    I don't believe "brain surgery" at that granularity is possible with WATCHNET (at least it wasn't when I wrote it). It is however possible in PIXILAB Blocks, which does what WATCHNET does and a whole lot more. Read more about Blocks on Dataton's website: https://www.dataton.com/press/dataton-appoints-pixilab-as-new-solution-provider or direct from the "horse's mouth" at http://pixilab.se/blocks Mike
  3. Mike Fahl

    Changing layer conditions with String Cues

    That seems correct. There are only 30 conditions, internally mapped to bits in a 32-bit word if I recall correctly, where 1073741824 dec corresponds to 0x40000000 hex, a value with all 30 low bits zero. It's a nifty trick to turn all conditions OFF, rather than reverting to the settings in Preferences (which a zero here would do). Another, somewhat less cryptic, option would of course be to leave the Preference settings all turned OFF; then setting the value to 0 would indeed turn them all off. But if you want to keep both options, your "hack" seems valid. Mike – http://pixilab.se
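    A minimal sketch of the bit arithmetic described above, assuming condition 1 maps to bit 0 up through condition 30 at bit 29 (the function and variable names are purely illustrative, not anything WATCHOUT exposes):

    ```ts
    // Build the 32-bit value for a given set of enabled conditions (1..30),
    // assuming condition N maps to bit N-1 as described in the post.
    function conditionMask(conditions: number[]): number {
      return conditions.reduce((mask, c) => mask | (1 << (c - 1)), 0);
    }

    // 1073741824 = 0x40000000 sets only bit 30, so every condition bit is zero:
    // all conditions OFF, without falling back to the Preferences defaults
    // (which sending a plain 0 would do).
    const allConditionsOff = 0x40000000;

    console.log(conditionMask([1, 3]));        // 5 -> conditions 1 and 3 enabled
    console.log(allConditionsOff.toString(2)); // a 1 followed by thirty 0 bits
    ```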
  4. Mike Fahl

    Motion sensor Device / Kinect motion sensor Device

    Kinect is no longer available AFAIK. When you say "motion sensor", perhaps you're just talking about "someone moving in front of a sensor" in general, rather than tracking some motion? If so, you may want to look at standard PIR sensors (infrared motion detectors). Although you can't connect such a sensor directly to WATCHOUT, you can do so through some kind of control system, such as PIXILAB Blocks. Mike
  5. Mike Fahl

    Latency on running cues

    Wouldn't the virtual display rendering delay be dependent on the order of render target processing? I.e., whether a particular virtual display is rendered to before or after another one? If virtual display A comes before virtual display B in the rendering sequence, A would have been updated before its content is potentially rendered to B, resulting in no delay between the two. However, if A comes after B in the rendering sequence, there will be one WO frame of delay between the two. Or am I missing something here? Assuming things work as I think they do here, what's needed would be some way to control the rendering sequence order. Then this would be more predictable. Perhaps Justin's idea of sorting them top/bottom and left/right would be a good starting point. Then one could put virtual displays at negative Y coordinates, in the order one prefers them to render. My 2c anyway... Mike - http://pixilab.se
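    A toy model of that render-order effect (purely illustrative, not actual WATCHOUT internals): A is a virtual display, B shows A's output, and whichever renders first in the sequence determines whether B sees A's current or previous frame.

    ```ts
    // Simulate two render targets processed in a fixed order on every frame.
    function simulate(renderOrder: Array<"A" | "B">, frames: number): number[] {
      let aFrame = 0;                 // frame number last rendered into A
      const seenByB: number[] = [];
      for (let frame = 1; frame <= frames; frame++) {
        for (const target of renderOrder) {
          if (target === "A") aFrame = frame;   // A renders its own content
          else seenByB.push(aFrame);            // B samples whatever A holds right now
        }
      }
      return seenByB;
    }

    console.log(simulate(["A", "B"], 4)); // [1, 2, 3, 4] -> B is in sync with A
    console.log(simulate(["B", "A"], 4)); // [0, 1, 2, 3] -> B lags A by one frame
    ```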
  6. Mike Fahl

    23.98 frame rates and playback in WO

    Stutter may come from many sources, for example: inadequate hardware to play the content (either in isolation or along with other content), or a mismatch in framerate (e.g., 25 fps in; 60 fps out). The first point can only be fixed by having adequate hardware for the content at hand. The second point seems to be most of what you're concerned with here. Historically, this has always been a concern in WATCHOUT, and having a source framerate that's an even multiple of the output (graphics card) framerate is always advantageous. For example, if WATCHOUT outputs 60 fps, using video that plays at 60 or 30 fps is optimal. If you play 25 fps video in this case, there will always be some "temporal aliasing" going on, which can be seen as stutter. The introduction of frame blending in WATCHOUT alleviates this to some extent, by blending adjacent source video frames together when framerates don't match, to make the resulting video framerate match the output rate. This often results in smoother perceived playback, but may also introduce some blurriness due to the frame blending itself. Finally, I would see no real benefit in upsampling 30 fps to 60 fps when making the video files (regardless of codec). If the source material is 30 fps, you won't really gain anything by outputting two identical frames (at 60 fps) for every input frame. You're really just wasting resources by having to play back twice the amount of data without any advantage. If there's some processing, though (such as After Effects vector-based frame blending), that may in some cases result in smoother playback, since you then synthesize those missing in-between frames. For some content, this may result in dramatic improvements in smoothness, while for other content it results in strange artifacts that just make things look worse. Of course, the smoothest results will be achieved when playing back content shot at 60p with an output rate of 60 fps (or 50p on 50 fps if you're in PAL land). Mike - http://pixilab.se/
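    To make the "temporal aliasing" point concrete, here's a small sketch (illustrative only) of which source frame ends up on each output refresh when the rates don't divide evenly:

    ```ts
    // Map each output refresh to the source frame it would display.
    function sourceFramePerRefresh(sourceFps: number, outputHz: number, refreshes: number): number[] {
      const shown: number[] = [];
      for (let r = 0; r < refreshes; r++) {
        shown.push(Math.floor((r * sourceFps) / outputHz));
      }
      return shown;
    }

    // 30 fps on a 60 Hz output: every source frame held for exactly 2 refreshes -> smooth.
    console.log(sourceFramePerRefresh(30, 60, 12)); // [0,0,1,1,2,2,3,3,4,4,5,5]

    // 25 fps on a 60 Hz output: frames held for 2 or 3 refreshes in an uneven
    // pattern -> perceived as stutter unless frame blending evens it out.
    console.log(sourceFramePerRefresh(25, 60, 12)); // [0,0,0,1,1,2,2,2,3,3,4,4]
    ```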
  7. Are you setting the proper resolution in Windows or AMD's control panel before going online? If not, WATCHOUT will try to switch the resolution (along with the frequency). When I did this part of WATCHOUT way back, I made sure that if the resolution was set properly before going online, WATCHOUT would not attempt to make any changes at all to the display config. That may have changed since, though. Mike - http://pixilab.se/
  8. Mike Fahl

    Network HTTP Video not playing in 6.2.2

    The HTTP flavor of the network video is intended to play video from a file on a server, while NDI is a more demanding "streaming" type protocol, more commonly used with cameras and other live feeds. In our case, we want to use HTTP from a server to make it easier to update content on a separate server, without having to change the show programming. Mike
  9. Mike Fahl

    Network HTTP Video not playing in 6.2.2

    Well, the network video media item in WO is supposed to handle both RTSP (proper streaming) and plain HTTP file access. The point with HTTP video file access is that one can update a video file on a web server, and then have WO pick it up "dynamically", without someone having to edit and push a new show. Such arrangements can streamline content management when the client wants to be able to update some content on a more regular basis. It works fine for still images (using an Image Proxy that points to an image URL). It's supposed to also work for video (since 6.2.1 if I recall correctly), but that part seems broken in 6.2.2. I never got around to using it in 6.2.1, so I don't know if it worked there. Mike – http://pixilab.se/
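    For what it's worth, the server side of that workflow can be as simple as a static file server. The sketch below is a minimal Node.js example under assumed names (the file "loop.mp4" and port 8080 are placeholders, and a production server should also honor HTTP range requests):

    ```ts
    import * as http from "http";
    import * as fs from "fs";

    // Serve a single video file; replacing loop.mp4 on disk updates the content
    // while the URL the WATCHOUT media item points at stays the same.
    http.createServer((req, res) => {
      if (req.url === "/loop.mp4") {
        res.writeHead(200, { "Content-Type": "video/mp4" });
        fs.createReadStream("loop.mp4").pipe(res);
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(8080);
    ```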
  10. I tried playing video from an HTTP server using a Network Video media item. The video appears on the production PC, but never plays (it just shows the first frame as a still image). On the display PC it doesn't appear at all (not even as a still). Has anyone else managed to make this work in 6.2.2, or is it just broken? The video I'm trying to play is a 3840x720 H.264 MP4 file that plays fine in WATCHOUT if used as a regular media file, so the file is known to be good. Mike
  11. Mike Fahl

    Suggested for my Watchmax

    Yes, that definitely seems to be a networking error of some kind. I believe the underlying error is: [SocketException (0x274c): A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond Mike – http://pixilab.se
  12. Mike Fahl

    Recommendations for 3d mapping work flow

    > the obj file in watchout would need to move using x,y,z and rotation tweens. I guess the tricky part is to have the 3D object in WATCHOUT move exactly as the physical set piece does, which would be required for this to work. But if you can accomplish that, it should work. An alternative may be to use the new tracking feature in WATCHOUT, which as far as I understand does this positioning for you, based on camera tracking data. Mike – http://pixilab.se
  13. Mike Fahl

    NewTek NDI with Alpha

    I doubt attempting to play video through the dynamic image server will perform well. However, WATCHOUT does support playing video using a video proxy pointed at a web server. Assuming you can place the rendered video file on a web server, this may be an option. Since the OP's question was related to text titles, another option could be to use a title generator; I assume title generators that can output NDI can be found too. Finally, depending on your needs, you may not need video if all you want is a text title. We've done dynamic text like this for WATCHOUT with Blocks on a couple of occasions. Blocks provides control over how the text is formatted (size, styles, fonts, colors, etc). The result is rendered to an image (with a proper alpha channel), which can then be loaded dynamically into WATCHOUT using an image proxy. Mike – http://pixilab.se
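    As a rough illustration of the render-text-to-an-image-with-alpha idea (this is not how Blocks does it, just a sketch using the node-canvas package; file names and sizes are arbitrary):

    ```ts
    import { createCanvas } from "canvas";
    import * as fs from "fs";

    // Render a title onto a transparent canvas and save it as a PNG, which keeps
    // the alpha channel. The resulting file can then be placed wherever a
    // WATCHOUT image proxy URL points.
    function renderTitle(text: string, file: string): void {
      const canvas = createCanvas(1920, 200);   // canvas starts fully transparent
      const ctx = canvas.getContext("2d");
      ctx.font = "bold 96px sans-serif";
      ctx.fillStyle = "#ffffff";
      ctx.textBaseline = "middle";
      ctx.fillText(text, 40, 100);
      fs.writeFileSync(file, canvas.toBuffer("image/png"));
    }

    renderTitle("Next speaker: Jane Doe", "title.png");
    ```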
  14. Mike Fahl

    8K render method

    I don't believe HAP plays through DXVA in WATCHOUT, so I doubt the tool Thomas suggests will actually tell you anything relevant. It may very well be a useful tool in other cases, but likely not in relation to HAP, which is built into WATCHOUT and makes no use of DXVA as far as I know. Mike http://pixilab.se/
  15. PIXILAB Blocks works very well with WATCHOUT and comes with a PJLink driver, as well as numerous others. Panels can be made using any iOS/Android device. Blocks may be overkill if all you need is just a control panel that can fire off strings. But if you have a need for a more complete control system solution – including lighting and various display technologies – it might be of interest. More technical details in the wiki. Mike
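    For reference, this is roughly what a PJLink "power on" exchange looks like at the wire level (assuming the projector has PJLink authentication turned off; the IP address is a placeholder). A driver such as the one bundled with Blocks wraps this kind of exchange for you.

    ```ts
    import * as net from "net";

    // Connect to a projector's PJLink port (4352) and request power on.
    const socket = net.connect(4352, "192.168.0.50");

    socket.on("data", (data) => {
      const msg = data.toString();
      if (msg.startsWith("PJLINK 0")) {
        socket.write("%1POWR 1\r");               // greeting, no auth -> send power on
      } else {
        console.log("projector said:", msg.trim()); // e.g. "%1POWR=OK"
        socket.end();
      }
    });
    ```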