This is a multi-part question: I occasionally get tearing on stills in Watchout. It doesn't seem to be consistent, and I can't quite nail down why. It's almost always on big stills: 8K or 9K wide. I've tried different file formats, and nothing has been the silver bullet. So the first part of the question is: what part of the computer does the heavy lifting of fading a still, the GPU or the CPU? And would additional system RAM or video RAM help?

The second, related question is about using sync cards between machines. We have traditionally used AMD cards with the S400 sync modules and had good success. However, we have been moving to NVIDIA Quadro cards (M4000 and P4000), as they benchmark at better performance and have much more flexibility in the software. We have the appropriate Sync I and Sync II modules. In my current scenario, with two display machines using these NVIDIA cards and sync modules, I have tried various configurations: external sync on both cards, external sync on one card feeding sync to the second, and internal sync on one feeding the second. In all cases, I still intermittently get tearing between machines. It doesn't seem to happen on videos, but it does on stills. I'm curious what's happening on the back end that could cause this.

And finally, can someone elaborate on what the "sync chain master" checkbox does under the hood? Does Watchout actually talk to both the S400 and the Sync I / II modules? If you are feeding external sync to each module, does this checkbox still matter, since there's not really a "master"? I have also found that when I make an adjustment on one display machine (say, re-launch Watchout), sync struggles until I re-launch on the second machine as well. Is this expected behavior?

Just trying to understand the technology better so I can fix things quicker in the middle of the night on a show! Thanks!
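For readers unfamiliar with why unsynchronized machines disagree at a frame boundary: each display only latches a new frame at its own vblank, so if two machines' refresh phases drift apart, one can show the new frame while the other still shows the old one. This is a minimal conceptual sketch in Python (an assumption for illustration, not Watchout or driver code); `visible_frame` and the phase offsets are hypothetical names and values:

```python
# Hypothetical sketch: why free-running displays on two machines can
# disagree at a frame boundary, and why genlock (a shared sync pulse,
# i.e. identical vblank phase) avoids it. Not Watchout internals.

REFRESH_PERIOD = 1000  # microseconds per refresh (arbitrary units)

def visible_frame(t_us, phase_us, frame_schedule):
    """Return which source frame a display shows at time t_us.
    The display latches a new frame only at its own vblank, which
    occurs every REFRESH_PERIOD microseconds, offset by phase_us."""
    # Last vblank at or before t for this display:
    last_vblank = ((t_us - phase_us) // REFRESH_PERIOD) * REFRESH_PERIOD + phase_us
    # The display shows the newest frame presented at or before that vblank.
    shown = None
    for present_time, frame_id in frame_schedule:
        if present_time <= last_vblank:
            shown = frame_id
    return shown

# Both machines are told to present frame "B" at t=2500.
schedule = [(0, "A"), (2500, "B")]

# Genlocked: both displays share the same vblank phase -> always agree.
assert visible_frame(2600, 0, schedule) == visible_frame(2600, 0, schedule)

# Free-running: machine 2's vblank is offset by half a refresh.
m1 = visible_frame(2600, 0, schedule)    # vblank at 2000 -> still "A"
m2 = visible_frame(2600, 500, schedule)  # vblank at 2500 -> already "B"
print(m1, m2)  # the two machines briefly show different frames
```

The mismatch only lasts a fraction of a refresh, which may be why it shows up intermittently and is most visible on static content such as large stills, where the eye has time to notice the seam.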
Sometimes we use the display machines for applications other than Watchout as well. Last week we used a system with an Eyefinity 6950 card simply to play an MPEG-2 video in Windows Media Player, and I noticed a visual artifact called screen tearing. Searching the internet, I found discussions about Eyefinity setups that cause this artifact: "The problem is all about timing between the converters: with a mixture of inputs, the monitors have different timings." "The problem stems from using more than one type of connector for your monitors."

In our situation we used a straight DVI connection for the primary monitor and a DisplayPort-to-DVI adapter for the secondary monitor. While playing the video full screen on the secondary output, I saw the screen-tearing artifact. Connecting both monitors with DisplayPort-to-DVI adapters solved the problem.

Just wondering how Watchout handles the timings of the multiple outputs, and whether anyone has seen this artifact while using Watchout with Eyefinity cards as well?

Kind regards, Rogier Tuinte

See: http://en.wikipedia.org/wiki/Screen_tearing
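The Wikipedia article linked above describes the mechanism; as a quick illustration, tearing happens when the framebuffer is swapped while the display is mid-scanout, so the top of the screen shows the old frame and the bottom shows the new one. Here is a minimal Python sketch (an assumption for illustration, not driver code); `scanout` and `swap_row` are hypothetical names:

```python
# Hypothetical sketch: how swapping the framebuffer mid-scanout
# produces a "tear". The display scans rows top to bottom; if the
# app swaps buffers partway down, two frames are visible at once.

ROWS = 10  # toy screen height

def scanout(swap_row):
    """Scan rows top to bottom; the app swaps old->new at swap_row.
    swap_row=None means the swap waits for vblank (vsync on)."""
    frame = []
    current = "old"
    for row in range(ROWS):
        if swap_row is not None and row == swap_row:
            current = "new"      # swap landed mid-scanout
        frame.append(current)
    return frame

torn = scanout(swap_row=4)      # vsync off: swap lands mid-frame
clean = scanout(swap_row=None)  # vsync on: swap deferred to vblank

assert len(set(torn)) == 2   # two different frames visible at once
assert len(set(clean)) == 1  # one coherent frame
```

With outputs on mixed connector types, each output can effectively scan out on a slightly different clock, so a swap that is safely inside vblank for one output can land mid-scanout on the other, which matches the behavior described above.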