
benchmark software


Curtis Hutchins


  • Moderator

This is not an easy task because different media use the hardware very differently. For image sequences and HAP you need a lot of disk bandwidth, while other media are bottlenecked by RAM and CPU bandwidth. There are some good tools to monitor CPU and GPU load remotely, but most tools are very bad at monitoring memory and disk bandwidth.

 

We will look into this and compile a test show that we can upload after ISE.
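
For the disk-bandwidth part specifically, one stopgap is to read the operating system's own I/O counters while content plays. The following is a minimal sketch of that idea, assuming the third-party Python package psutil is installed on the display computer; it is not a Dataton or WATCHOUT tool, just an illustration.

```python
# Minimal sketch: estimate sustained disk read bandwidth and RAM use
# over a short window while media plays.
# Assumes the third-party psutil package is installed (pip install psutil).
import time
import psutil

INTERVAL = 1.0  # seconds between the two samples

before = psutil.disk_io_counters()
time.sleep(INTERVAL)
after = psutil.disk_io_counters()

read_mb_s = (after.read_bytes - before.read_bytes) / INTERVAL / 1e6
mem = psutil.virtual_memory()

print(f"Disk read: {read_mb_s:.1f} MB/s")
print(f"RAM used: {mem.used / 1e9:.1f} GB of {mem.total / 1e9:.1f} GB")
```

Running it during an image-sequence test gives a rough MB/s figure to compare against what the storage is rated for.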



THANK YOU! This will be helpful for all of us, I think. :D


There are many benchmarks that need to be used if you want to properly benchmark your machine, but there is no single benchmarking tool aimed at Watchout specifically.

 

I use many synthetic benchmarks when testing components, but in the end the only real measure of performance comes from a show I built specifically for this purpose. It is HUGE, but it precisely measures performance under many common real-world situations.

 

Here is how it works:

 

- I have 60 HD files, same content, different names: a sheriff star spinning on sand. It is hard to compress, so any lag or latency issue, loop hiccup or anything else is obvious. Alternatively, I used to have a file containing fractal noise and a diagonal bar going up and down (it is easier to see jerks, stutters and the like on diagonal movement than on horizontal or vertical movement). Same thing: hard to compress, and obvious if there are issues.

 

- I have 20 4K files, same content, the sheriff star on sand, for the same reason.

 

- I have 10 10240x1080 files: noise and diagonal movement, with color bars moving throughout.

 

All files are available in HAP, HAP Q, HAP Alpha, MP4 and, when possible, MP2. I also have several copies of the same movies (24x HD, 8x 4K, 4x 10240x1080) as image sequences in BMP, PNG and JPEG. That's a total of 456 discrete files.
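
For anyone who wants to build a similar library of test files, the encoding step can be scripted. The sketch below is one possible way, driving ffmpeg's HAP encoder from Python; the source clip name and output folder are placeholders rather than Claude's actual files, and the options should be checked against your own ffmpeg build.

```python
# Sketch: batch-encode one source clip into several of the codec variants
# described above, plus a PNG image sequence.
# Assumes ffmpeg with HAP support is on PATH; file names are placeholders.
import subprocess
from pathlib import Path

SOURCE = "sheriff_star_hd.mov"   # placeholder source clip
OUT = Path("bench_media")
OUT.mkdir(exist_ok=True)

VARIANTS = {
    "hap":       ["-c:v", "hap", "-format", "hap"],
    "hap_q":     ["-c:v", "hap", "-format", "hap_q"],
    "hap_alpha": ["-c:v", "hap", "-format", "hap_alpha"],
    "h264":      ["-c:v", "libx264", "-pix_fmt", "yuv420p"],
}

for name, codec_args in VARIANTS.items():
    ext = ".mov" if name.startswith("hap") else ".mp4"     # HAP lives in MOV
    out_file = OUT / f"{Path(SOURCE).stem}_{name}{ext}"
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *codec_args, str(out_file)],
                   check=True)

# PNG image sequence for the image-sequence tests
seq_dir = OUT / "png_seq"
seq_dir.mkdir(exist_ok=True)
subprocess.run(["ffmpeg", "-y", "-i", SOURCE, str(seq_dir / "frame_%05d.png")],
               check=True)
```

Copying each output under many different file names then reproduces the "same content, different name" setup described above.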

 

I then built a main timeline where it reads one HD file, then after 3 seconds it starts another, and so on and so forth until I reach my max or I see any artifact: jerks, tears, stutters, lag, anything. I note the number. I then jump in the timeline to another set of HD files in another format and take note when it displays issues. Then the same for the 4K files and finally the large files (10240x1080).

 

Then I do the same but with files playing back in aux timelines triggered by cues, and then finally in compositions playing back in virtual displays applied to 3D objects.

 

ALL tests are done with all 6 outputs active, with EDID emulation and frame lock engaged.

 

Then, once I know the real limits of my machine for the various situations and formats, I use my live inputs at 720p and 1080p and run the test again to see the limits when using live inputs, both for the content and for the live inputs' lag/latency and the fluidity of fades.

 

Lastly, I program controls for opacity, keying and scale and run the test while playing with my faders.

 

It takes me 1-2 days to benchmark my machines, but at least I truly know the limits in relevant circumstances.

 

Currently I can play back 56x HD files, 16x 4K or 7x 10240x1080 concurrently, in aux timelines, in comps, in virtual displays applied to a 3D object.

These are my 4-year-old machines. I am building new ones this very month, optimized for a live-update workflow and remote management; I'm like a kid waiting for his new toys :).


  • Dataton Partner

 

Hi Claude,

 

Thanks for this description. We are using comparable tests on our servers, but of course, as you described as well, the results depend on the encoding and the actual codec too. Your results look impressive, but which codecs are they referring to?

 

Can you share the specs of the computers you used for that test?

 

Thanks

 

Rainer

 


Claude,

 

I'm truly humbled by your thoroughness. I think the way you go about things here is very relevant and gives you good and useful data. I bet everyone here would like to learn more about your findings in relation to various hardware/software configurations, in case you have anything you want to share.

 

Mike


I am curious: when you run benchmarks, how can you tell which subsystem is being maxed out? I can try to upgrade everything when something starts to not play smoothly, but is there a benchmark you have found to run on the display computer that will store the load information for the different subsystems while the content is playing on it?
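
One approach, in the absence of a WATCHOUT-specific benchmark, is to run a small logger on the display computer that samples each subsystem once per second and writes the values to a CSV file, so the timestamps can later be lined up with the moments playback stopped being smooth. Below is a rough sketch of that idea, again assuming the third-party psutil package; GPU load is not covered and would still need a vendor tool such as the GPU driver's own monitor.

```python
# Sketch: log CPU, RAM and disk throughput to CSV once per second while a
# test show plays, so load spikes can be matched to the timeline afterwards.
# Assumes psutil is installed; stop with Ctrl+C. GPU load is not covered.
import csv
import time
import psutil

LOG_FILE = "subsystem_load.csv"   # placeholder output path
INTERVAL = 1.0                    # seconds between samples

with open(LOG_FILE, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time", "cpu_pct", "ram_used_gb",
                     "disk_read_mb_s", "disk_write_mb_s"])
    prev = psutil.disk_io_counters()
    psutil.cpu_percent()          # prime the CPU counter
    while True:
        time.sleep(INTERVAL)
        now = psutil.disk_io_counters()
        writer.writerow([
            time.strftime("%H:%M:%S"),
            psutil.cpu_percent(),
            round(psutil.virtual_memory().used / 1e9, 2),
            round((now.read_bytes - prev.read_bytes) / INTERVAL / 1e6, 1),
            round((now.write_bytes - prev.write_bytes) / INTERVAL / 1e6, 1),
        ])
        f.flush()
        prev = now
```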

