
RBeddig

Dataton Partner
  • Posts

    628
  • Joined

  • Last visited

Everything posted by RBeddig

  1. The remote feature uses a VNC program built into WATCHPOINT. This is also one of the reasons why you should not install a separate VNC program on any WATCHOUT display computer: both would probably try to use the same ports. We also use Datapath cards most of the time, but when using some older Blackmagic cards we always use the DeckLink driver without "WDM" in the name.
  2. Let me add two or three points to Jim's answer.
DMX is simple. Basically, you define the universe in the Preferences/Control tab. While you can only set one universe there, it is actually possible to send cues to more universes if needed; there is now a setting to select the universe on each cue. WATCHOUT behaves as an Art-Net node, i.e. it sends "DMX" over the network. If your DMX devices need proper analogue DMX, you will need an Art-Net-to-DMX converter (KISSBOX, ENTTEC, ...). Art-Net officially works on a 2.xxx.xxx.xxx network, but you can often change it to the addressing scheme used for your servers, e.g. 192.168.xxx.xxx. The converters allow you to do this, and ours run on 192.168.xxx.xxx networks without any issues. The enumeration of Art-Net universes can sometimes be tricky: some devices count from 0, others from 1, so changing the number by +/-1 might resolve problems.
Playing content for 3D mapping in WATCHOUT is not very difficult either. You should use a virtual screen as a container for all the content going to your projector; this gives you normal timeline control. Mark the 3D object and then drag the virtual screen media item (media window) onto your 3D object. There are some useful training videos on the Dataton website.
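To illustrate what "sending DMX over the network" means at the packet level, here is a minimal Python sketch that builds an ArtDmx packet (the Art-Net message carrying one universe of DMX data) and could send it over UDP. The node IP address is a made-up example, and the off-by-one universe issue mentioned above simply becomes passing `universe - 1` or `universe + 1` when building the packet.

```python
import socket
import struct

def build_artdmx(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDmx packet (OpCode 0x5000) for one DMX universe."""
    if len(channels) % 2:                      # DMX payload length must be even
        channels += b"\x00"
    header = b"Art-Net\x00"                    # fixed 8-byte protocol ID
    opcode = struct.pack("<H", 0x5000)         # ArtDmx, little-endian
    protver = struct.pack(">H", 14)            # protocol revision 14, big-endian
    body = struct.pack("BB", sequence, 0)      # sequence, physical port
    body += struct.pack("<H", universe)        # 15-bit port address: SubUni, Net
    body += struct.pack(">H", len(channels))   # payload length, big-endian
    return header + opcode + protver + body + channels

# Set channel 1 of universe 0 to full; all other channels to 0.
packet = build_artdmx(universe=0, channels=bytes([255] + [0] * 511))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Hypothetical converter at 192.168.0.50; 6454 is the standard Art-Net UDP port.
# sock.sendto(packet, ("192.168.0.50", 6454))
```

If a converter interprets universe numbers starting at 1 while the sender counts from 0, the same packet simply lands one universe off, which is why nudging the number by one on either side often fixes "nothing happens" situations.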
  3. Yes with Windows 10, no with Windows 7.
  4. FirePro 7100 is the older model, the new one is called Radeon Pro WX7100 now. The new cards are faster than the older models.
  5. Sorry, I was thinking of stacking the content, not the stacking order of the aux timelines. Just check my attached sample. STACKING.zip
  6. Use one Samsung 860 Pro SSD for Windows and a U.2 or M.2 SSD for data. Two SATA-3 SSDs in RAID 0 only give you something like 1 GB/s read speed. Since all the rest is quite up to date, I would go for faster storage for WATCHOUT as well. We mount the M.2 NVMe SSD in a PCIe adapter with a passive cooler; active cooling would be even better. The drives tend to get rather hot and then throttle their speed.
  7. Ever thought of using the stacking order? Probably not many people use this, but it could eliminate the need for the three content VDs.
  8. Just be aware that control cues sitting at 0:00.00 in a timeline will not be executed. Move them a fraction to the right and all is fine.
  9. Hi Phonebook, We also use capable notebooks for productions very often, and the internal battery and portability help in real life. BUT, if you plan to use many display servers with HAP Q or uncompressed codecs, you might want to use a real desktop or rack server with a 10 Gbit/s network. Files tend to get really big nowadays, and a 10 Gbit network can reduce the time to upload media drastically. Of course, the internal storage for media needs to match the throughput of the 10 Gbit network as well. This means that you'll need NVMe SSDs or a RAID 0 with at least three fast SATA SSDs for the content to make full use of the 10 Gbit network. Best regards, Rainer
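To put rough numbers on the upload-time argument, here is a small back-of-the-envelope calculation in Python. The 200 GB show size and the 80% link-efficiency factor are assumptions for illustration, not measurements:

```python
def transfer_seconds(size_gb: float, link_gbit: float, efficiency: float = 0.8) -> float:
    """Rough wire time: payload bits divided by usable link bits per second."""
    bits = size_gb * 8e9                      # decimal GB to bits
    return bits / (link_gbit * 1e9 * efficiency)

# Compare uploading an assumed 200 GB of HAP Q content over 1 vs 10 Gbit/s:
for link in (1, 10):
    t = transfer_seconds(200, link)
    print(f"{link:>2} Gbit/s: ~{t / 60:.0f} min")
```

Under these assumptions the upload drops from roughly half an hour to a few minutes, which is the "drastic" reduction mentioned above, provided the drives on both ends can actually sustain around 1 GB/s.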
  10. Hi David, Do we have any list of camera systems supporting MPCDI? I can find various docs with VESA specs but no real system (camera / software) stating that it can generate this data format. Best regards, Rainer
  11. Xicomen, Did you check the incoming level of your timecode signal with the "Timecode Tester" (in the Dataton installation folder)? I've used TC on several Academy sessions with WATCHOUT 6.2.2 and had no issues. The incoming level needs to be set in the Windows audio settings. Also be careful with the components between the timecode source and the input of your production computer: some DI boxes and other gear are not suitable for the rectangular waveform of a timecode signal. It is advisable to define the correct timecode format explicitly in the WATCHOUT settings rather than using automatic detection.
  12. Have you tried sending the string followed by a carriage return (hexadecimal value 0d)? I.e. {01@05}$0D Many command protocols require a carriage return to complete the message. It's been a while since I programmed Medialon Manager for a Lightware matrix, but according to the manual it should work either without a carriage return or with CrLf, which would be the hex values 0d 0a: {01@05}$0D$0A I don't know where you're based, but you could take a look at universe-control.com. This software can control most devices you'll find in events nowadays.
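If you want to test the terminator variants outside Medialon, a quick way is a raw TCP socket. This Python sketch shows both framings; the host, port and command string are assumptions, so check your matrix's manual for the actual control port:

```python
import socket

def frame(command: str, terminator: bytes = b"\r") -> bytes:
    """Append the line terminator the device expects (CR = hex 0d, CRLF = 0d 0a)."""
    return command.encode("ascii") + terminator

def send_command(host: str, port: int, payload: bytes) -> None:
    """Open a TCP connection, send the framed command, and close."""
    with socket.create_connection((host, port), timeout=2) as sock:
        sock.sendall(payload)

# Hypothetical matrix at 192.168.0.80, TCP port 10001:
# send_command("192.168.0.80", 10001, frame("{01@05}"))           # CR only
# send_command("192.168.0.80", 10001, frame("{01@05}", b"\r\n"))  # CRLF
```

Trying both terminators this way usually settles within a minute which framing the device actually parses.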
  13. The WASAPI mode should not matter since WASAPI and ASIO are different things. The 48k setting in the sample rate is important. If this is set to auto or other values it might not work as described above, especially when you try to check the audio signal using Dante VIA. You will probably want to change the ASIO buffer size as well (if the buffer is too small it might lead to clicking noise). 512 is a good value here.
  14. WMV is not officially supported. It may work, but there is no guarantee.
  15. Sounds more like a network issue. Could UDP be blocked somewhere? Could the UDP communication be going to a different network adaptor than the TCP commands?
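A quick way to answer both questions is a throwaway UDP listener bound on the adapter where you expect the feedback to arrive. This Python sketch is a generic diagnostic, not anything WATCHOUT-specific; the port number is an arbitrary example:

```python
import socket

# Bind to the address of the adapter you expect the UDP traffic on;
# 0.0.0.0 listens on all adapters, 3040 is just an example port.
LISTEN_ADDR = ("0.0.0.0", 3040)

def listen_once(addr, timeout=10.0):
    """Print the first UDP datagram that arrives, or report a timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.bind(addr)
        try:
            data, sender = sock.recvfrom(2048)
            print(f"got {data!r} from {sender}")
            return data
        except socket.timeout:
            print("nothing arrived - UDP blocked or wrong adapter?")
            return None

# listen_once(LISTEN_ADDR)
```

If the datagrams show up when bound to 0.0.0.0 but not when bound to the address you expected, the traffic is leaving via a different adapter; if they never show up at all, look for a firewall blocking UDP.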
  16. You could try UNIVERSE. It can control WATCHOUT and VLC from one GUI.
  17. Did you select a valid output in the Preferences settings in WATCHOUT? And try to define the position of the audio cue by using the name of a display instead of the coordinates. I assume that the layer itself is "visible" and not turned off due to the setting of conditional layers or standby layers.
  18. One way would be to use a Visual Production iOCore 2 unit in the network. This can translate numerous protocols and could trigger OSC strings after receiving TCP/UDP from WATCHOUT.
  19. The calibration points for the alignment of the texture in a 3D projector do not allow you to isolate areas. The points are actually used to triangulate the position of the projector in relation to the position and orientation of the 3D object. We always start with 5 points and align those, since up to this point the texture stays unmoved and the projector will not "fly into space". Then we add the 6th point, which will start to move the texture.
The position of the texture itself depends solely on the information in the 3D (OBJ) file. This is defined in software like 3ds Max, Blender, Cinema 4D etc. If the 3D file is not accurate, i.e. it differs a bit from the real-world object, the alignment won't be perfect at this point. WATCHOUT reads from the 3D file where the texture needs to go, not from the real object.
You can optimize the calibration with a few more calibration points (try to keep the number as low as possible). They do not need to sit at exact corners, since they are just used as handles to drag the texture away from the calculated position. This only works for small corrections: if you drag too far, WATCHOUT will tell you that the texture can no longer be calculated in relation to the geometry of the 3D file and will show red lines.
You can also use the standard grid for correction. In this grid you can easily isolate a region and prevent corrections from spilling over: just place 4 points in a rectangle (aligned vertically and horizontally). You can then add extra points inside, which will not influence anything outside the box. Of course, you can also break your texture down into smaller regions, as Walter suggests. This can help too.
We usually ignore the numbers in the position fields completely. You can use them to preposition the projectors in 3D space, but they will change automatically once you have 6 alignment points and start to move them.
Important: on the General tab you have to define the vertical and horizontal lens shift. If you do not enter correct values there (lock them after defining them), the calibration will not work accurately, since it makes a difference for the underlying formulas whether the lens is shifted or not. Purely optical physics!
  20. MIDI CC will not jump to a new value but uses a ramp. That's why it also triggers the values along the way. The timeline is not meant to run backwards, which is probably why you see different behavior going backwards.
  21. Hi, If you use capture cards, you need to split the signal and feed every server the same signal using three capture cards. Another option would be an HDMI-to-NDI converter box; you can then use the NDI signal instead. This works with the current version of WATCHOUT. We actually stopped selling and using Blackmagic capture cards due to lack of support and unreliable drivers; Datapath would be a better option. If latency is not an issue, you could also use USB3 capture devices like the INOGENI 4K device. Best regards Rainer
  22. You can send this out from WATCHOUT, e.g. a special aux timeline, or through WATCHNET where you would need to add this in the devices section.
  23. I guess that you purchased through Show Sage. You should contact them and ask for assistance. Since they are a Premium Partner, they should be able to help.
  24. Are you using two completely different sets for main and backup, i.e. two production computers and two display computers? Are you using Art-Net or DMX, i.e. an Art-Net-to-DMX converter between WATCHOUT and the projectors? Usually, Art-Net and DMX work in such a way that the last command takes precedence over earlier commands, but in my experience WATCHOUT keeps sending the current Art-Net data even if there is no change. If the first computer sends 0, because it is at the beginning of the timeline where the output should still be 0, and the other one sends a different value, it can easily happen that the value remains at 0, or jumps erratically as the receiver negotiates between the two values sent from two controllers.