
Eddy Yanto

Member
  • Posts

    26
  • Joined

  • Last visited

Everything posted by Eddy Yanto

  1. Yes, I agree with Mike that the error means WATCHOUT failed to connect and deliver the AV mute/unmute commands to the TCP device for some reason. It could be many things, hence the suggestions. Some common issues I've seen in the past were multiple active networks (wired and wireless active at the same time, connected to different networks), a firewall that was disabled temporarily but re-enabled on reboot and blocked the messages, and in some cases commands that were simply sent too fast for the target device, hence the need for a delay between commands. In SawLae's case, Epson Thailand also reached out to us yesterday. We talked and confirmed that TCP port 4352 on all the projectors had in fact always been open; they verified this using third-party TCP sender software. We also tested with the Epson projectors in our showroom, and it worked without issue (see attached test video). The only difference in our test is the ASCII command I used (which I don't think matters): %1AVMT 30$0D %1AVMT 31$0D 0001-2500.mp4
  2. Hi SawLae,
     - Make sure there's only one NIC enabled on either the WATCHOUT Production or Display machine so the command is routed over the correct network
     - Make sure the firewall is always off on both systems
     - When you switch from running the show with WATCHOUT Production/Master to WATCHOUT Display controlled by a third-party control system, issue a LOAD command to the Display machine first
     - Try putting a short delay (e.g. 50 ms) between commands (as shown in the attached video)
     - Try adding a Line Feed ($0A) to the command, e.g.:
       $25$31$41$56$4D$54$20$33$30$0A$0D
       $25$31$41$56$4D$54$20$33$31$0A$0D
     2024-04-08 11-32-30.mp4
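     The suggestions above can be sketched as a small script. This is only a rough illustration: the projector IP address is hypothetical, and it assumes the projector accepts the AVMT command without the PJLink authentication handshake, as in our raw TCP sender tests.

     ```python
     import socket
     import time

     PJLINK_PORT = 4352  # the standard PJLink port confirmed open in this thread

     def build_avmt(mute: bool) -> bytes:
         # %1AVMT 31 mutes audio and video, %1AVMT 30 unmutes; CR-terminated
         return b"%1AVMT 31\r" if mute else b"%1AVMT 30\r"

     def send_with_delay(host: str, commands, delay_s: float = 0.05) -> None:
         # One connection, with a short pause so the projector can keep up
         with socket.create_connection((host, PJLINK_PORT), timeout=3) as sock:
             for cmd in commands:
                 sock.sendall(cmd)
                 time.sleep(delay_s)

     if __name__ == "__main__":
         # Hypothetical projector address, for illustration only
         send_with_delay("192.168.1.50", [build_avmt(True), build_avmt(False)])
     ```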
  3. We did a few setups with multiple BirdDog Mini (1080p@60): some using WATCHPAX60 with 4 x 1080p NDI and some using our custom-built servers with 6 x 1080p NDI. All in all, to keep things simple and easier to troubleshoot, we only use a single NIC, even though our custom-built servers have an external 4-port network card (and on WATCHPAX60 only one port is active/usable, while the second NIC is reserved for the future). Your mileage may vary, and if you're going for 4 x 4K NDI, you might want a 10G network switch with a 10G port on your WATCHOUT display (be it WATCHPAX60 or custom-built). There are benchmarks out there stating the limit on a 1G network is about 8 x 1080p NDI streams; anything more than this exceeds the theoretical bandwidth. So it's good practice to have a bigger network pipe and to leave some bandwidth for additional traffic. Coming back to using multiple NICs: it should be possible by tweaking the priority and metric configuration (this is only possible on a custom-built server). The downside is that if any of that tweaking changes unintentionally, the NDI streams may no longer be discoverable by WATCHOUT or any other NDI tool on the custom-built server.
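     The 1G-limit figure above is easy to sanity-check with back-of-envelope arithmetic. The ~125 Mbps per full-NDI 1080p60 stream is a commonly cited ballpark, not a measured value from these setups:

     ```python
     def link_utilization(streams: int, mbps_per_stream: float = 125.0,
                          link_mbps: float = 1000.0) -> float:
         """Fraction of the link consumed by the given number of NDI streams."""
         return streams * mbps_per_stream / link_mbps

     # 8 x 1080p60 streams at ~125 Mbps each fully saturate a 1G link,
     # which is why a 10G pipe leaves comfortable headroom for 4K NDI.
     ```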
  4. You can actually send a GET or POST request using *String Output* in WATCHOUT by mimicking how the request headers work, since HTTP is a protocol built on top of TCP. Please see the steps below. Let's say you have a device that accepts GET requests at http://192.168.1.100/volume/mute Before you try it in WATCHOUT, make sure it works from a web browser. Next, create a String Output. Drag the String Output to the timeline and edit the data into a format similar to the one below. The key things are the GET URI and the Host address; note that each header line ends with CRLF ($0D$0A) and the request is terminated by a blank line:
     GET /volume/mute HTTP/1.1
     Host: 192.168.1.100
     Connection: keep-alive
     User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 Edg/120.0.0.0
     Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7
     Accept-Encoding: gzip, deflate
     Accept-Language: en-US,en;q=0.9$0D$0A$0D$0A
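     The same idea — a raw HTTP request written straight onto a TCP socket — can be prototyped with a few lines of Python before you wire it into a String Output. The host and path here are the example values from above:

     ```python
     import socket

     def build_get(host: str, path: str) -> bytes:
         # Each header line ends with CRLF; a blank line terminates the request
         return (
             f"GET {path} HTTP/1.1\r\n"
             f"Host: {host}\r\n"
             "Connection: close\r\n"
             "\r\n"
         ).encode("ascii")

     def http_get(host: str, path: str, port: int = 80) -> bytes:
         # Send the raw request over TCP, just as the String Output cue would
         with socket.create_connection((host, port), timeout=3) as sock:
             sock.sendall(build_get(host, path))
             return sock.recv(4096)

     if __name__ == "__main__":
         print(http_get("192.168.1.100", "/volume/mute"))
     ```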
  5. The short answer is no, not without conversion or additional work. But if your timeline permits and you're willing to spend some time making it work, it can be done. I dealt with 3D a lot back when I was with a UK-based projector manufacturer; below are some notes from memory. One of the common 3D formats on Blu-ray is frame packing: essentially, the left- and right-eye frames with parallax are "packed" within the same frame. For 1080p frame-packed content, the vertical resolution is doubled, so the HDMI signal coming out of the Blu-ray player is 1920x2160 px plus some additional spacing pixels. The refresh rate was limited to 24 Hz, probably due to the HDMI bandwidth available back then. When a 3D-capable projector receives the frame-packed signal, it flashes the left and right images one after another, and with an IR emitter in sync, a viewer wearing active 3D glasses sees the left frame in the left eye and the right frame in the right eye. Since the refresh rate of the signal is only 24 Hz, the 3D projector has a feature to triple-flash it: 3 x 24 Hz x 2 eyes = 144 Hz. WatchOut does stereoscopic 3D differently from a conventional Blu-ray player or other 3D playback software: in WatchOut, you'll need two separate left and right videos going to two separate outputs. In one of my demo setups not long after the launch of v5, we did a 2-channel projection and it required 4 projectors. You can then adjust the parallax setting from within WatchOut. One thing I remember is that if you set it up this way, you'll need a silver screen with passive 3D glasses for it to work. Alternatively, making it work with active glasses takes a lot more effort: you'd need to combine both outputs and convert them into a format compatible with an active 3D projector.
  6. I would concur with what RBeddig has suggested, and I think the broadcast address will also depend on how your network is set up. In a private local network, some will use class A IP addresses, e.g. 10.0.0.x, while others will use class C addresses, e.g. 192.168.1.x. So it's not ideal to hardcode a fixed broadcast address into your utility. Even within the same class, some may use a different segment, e.g. 192.168.0.x vs 192.168.1.x, and to broadcast from a machine in one segment to a machine in another, you'd also need routing, which is the router's job anyway. In one of my applications, where I broadcast a small UDP payload to discover available nodes/machines, I did the following:
     - Loop through the available network interfaces (some machines have multiple, e.g. Ethernet, Wi-Fi)
     - Discard loopback addresses like localhost or 127.0.0.1
     - Get the broadcast addresses from the detected/active interfaces
     - Add the broadcast addresses to a combo box so the user can select one
     Below is the UI and the implementation in Python (you'll need to find similar functions in your language):

     import netifaces

     # broadcast discovery
     for iface in netifaces.interfaces():
         iface_details = netifaces.ifaddresses(iface)
         if netifaces.AF_INET in iface_details:
             det = iface_details[netifaces.AF_INET]
             if det[0].get('addr') != "127.0.0.1":  # do not include loopback address
                 self.broadcastCombo.addItem(det[0].get('broadcast'))

     Hope this suggestion helps.
  7. Thanks for sharing, Matt! It's a really useful piece of software. I just tried it, the way the modules are structured is quite intuitive — the protocol, hardware, software, etc. I am barely scratching the surface but I think it can fulfill what Manuel is trying to achieve with his input values getter/setter.
  8. Hi Cristiano, You can download the utility and the guide at: https://omnigram.net/releases/RemoteApp_v2021.06.zip https://omnigram.net/releases/remote-app-guide.pdf RemoteApp will run for 10 minutes unactivated. If you need to run it permanently, just let me know the hardware ID and I'll send you the license code.
  9. If WatchOut restarted in an unintended manner, something is wrong. In that case, you'll need to plan what you want your inputs to be when WatchOut comes back online. There are two ways of doing it, depending on your show design and what you want your users to see:
     1. Send the last saved input values back into WatchOut, e.g.: setInput xPosition 1000 500
     2. Send the default input values into WatchOut, e.g.: setInput xPosition 0 500
     If you share your overall setup and design, we can probably give you some suggestions.
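     Restoring saved values after a restart is easy to script. A minimal sketch of option 1, using the same 500 ms fade as the examples above (the input names and values are whatever your show uses):

     ```python
     def restore_inputs(saved: dict) -> list:
         """Build one setInput command per saved value, ready to send
         over the control connection once WatchOut is back online."""
         return [f"setInput {name} {value} 500\r".encode("ascii")
                 for name, value in saved.items()]
     ```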
  10. A getter method or function that returns the value of an existing input doesn't exist in the published external control protocol. What you can do is use or create a proxy or intermediary application that sits between the setter device (the device that sets the input value) and WATCHOUT. This proxy will have member variables for all the input values. Whenever you set an input value, the proxy sets the member variable matching the name and remembers that value internally, as well as forwarding the input value to WATCHOUT. I did a similar thing a while back for exactly this: I needed to know the current value of a specific input before sending the subsequent command, and solved it by having the proxy/intermediary application store and remember the input. Please see this programming sketch: https://github.com/eddyyanto/watchout-proxy Eddy
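      The core of the proxy idea fits in a few lines. This is my own simplified illustration, not the code from the repository above; the send callable is injected (in practice it would write to the WATCHOUT TCP connection) so the bookkeeping can be tested on its own:

      ```python
      class InputProxy:
          """Remember every input value locally and forward the command on."""

          def __init__(self, send):
              self._send = send    # e.g. sock.sendall on the WATCHOUT connection
              self._inputs = {}

          def set_input(self, name, value, fade_ms=0):
              self._inputs[name] = value   # remember internally (the "getter" data)
              self._send(f"setInput {name} {value} {fade_ms}\r".encode("ascii"))

          def get_input(self, name, default=None):
              # Answered locally; no such query exists in the control protocol
              return self._inputs.get(name, default)
      ```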
  11. No problem, always happy to help! The standalone iPad app in the video was custom-built for a client; I think we can repurpose it with new graphics and programming for a new project. WATCHNET is good for basic scheduling and basic control, where you can bind a button action to an event, e.g. run, pause or stop a timeline, or bind a slider to a range of values to control an input in WATCHOUT, e.g. setting the opacity of an image or video with a value between 0 and 1. But it doesn't support the slightly more complex subscription-based protocol where WATCHOUT periodically sends updates to the subscriber. Also, for the user interface, you're provided with a few basic controls such as button, slider, dial, joystick, text and image. It worked great and fit a lot of use cases back then, but for slightly more complex user interaction, I'd recommend using the control protocol directly, as the programming logic can be more expansive. Side note: great to hear you're based in SG too. Feel free to reach out to me; my contact can be found on my user profile in this forum. 😉
  12. Hi Eddy, What Mike and Wiesemen suggested will work for your listed requirements. I'd also like to point out that WATCHOUT has a very straightforward control protocol to interface with, since it's basically TCP/UDP ASCII-based commands. You can use PIXILAB Blocks, WATCHNET (WATCHNET has not been in active development for a few years now since it's a mature product; you can still run it if you can get hold of a version 5 dongle license, and in fact I deployed WATCHNET a while back), or any other control system on the market, e.g. Extron, Crestron, AMX and others. Calling preloaded videos is actually really simple. Say those videos need to be independently started/paused: in most of my setups, I'd put them into auxiliary timelines, with another video/image as the background sitting in the main timeline. Once the aux timelines are uploaded and cached on the Display computer, you can just start and stop them by name with the run/halt/kill commands. In fact, WATCHOUT also has protocol support for subscribing to timeline status (both main and aux timelines) so you can sync your control system's user interface to the realtime status of the timelines. You can see the video below for a better illustration. If you're based in Singapore or APAC, give me a ping. I'd be glad to help.
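      A minimal sketch of driving an aux timeline over the control connection. The machine IP and the timeline name are placeholders; I'm assuming the standard control ports here (3040 on the production machine, 3039 on a display cluster), so double-check against your own setup:

      ```python
      import socket

      CONTROL_PORT = 3040  # assumed production control port; 3039 for display cluster

      def timeline_cmd(verb: str, aux_name: str = None) -> bytes:
          """Build a run/halt/kill command; omit the name to target the main timeline."""
          if aux_name is None:
              return f"{verb}\r".encode("ascii")
          return f'{verb} "{aux_name}"\r'.encode("ascii")

      if __name__ == "__main__":
          # Hypothetical machine address and aux timeline name
          with socket.create_connection(("192.168.1.10", CONTROL_PORT), timeout=3) as sock:
              sock.sendall(timeline_cmd("run", "Video1"))
      ```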
  13. Currently working on a physical design with a rotary encoder as the input interface to an interactive app. Incremental encoders are quite common in daily life; you can find them in the knob of a sound system or a fan speed control. Some other use cases include: timeline control with a linear duration that can go both backward and forward; range value control, e.g. volume or menu selection; and angular position/degree trigger mechanisms. In this experiment, I am using a rotary encoder with a 3D-printed wheel interface as a control for a WATCHOUT linear timeline. Turning the wheel clockwise makes the timeline go forward; likewise, turning it counter-clockwise makes the timeline go backward. The incremental rotary encoder is connected to an Arduino Ethernet controller, and the controller calculates the rotation and translates it into milliseconds as required by the gotoTime command. Each full rotation (CW/CCW) is set to 30 seconds and can be adjusted accordingly. -- Eddy Yanto Omnigram
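      The counts-to-milliseconds translation the controller does can be sketched like this (in Python rather than Arduino C, for illustration). The 600 counts-per-revolution figure is an assumption; the 30 s per rotation matches the setup described above:

      ```python
      def encoder_to_ms(delta_counts: int, counts_per_rev: int = 600,
                        ms_per_rev: int = 30_000) -> int:
          """Translate an incremental-encoder delta into a timeline offset in ms.
          One full rotation (CW positive, CCW negative) maps to 30 seconds."""
          return delta_counts * ms_per_rev // counts_per_rev

      def goto_time(position_ms: int) -> bytes:
          """Build the gotoTime command, clamping the position at zero."""
          return f"gotoTime {max(0, position_ms)}\r".encode("ascii")
      ```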
  14. I've had a few installations in the past few months using WatchPax40 and custom-built machines that required master volume control. The WatchOut control protocol doesn't yet have a command that lets you adjust the system master volume level. The closest you can get is to define an input variable and attach it to a video's or audio's volume tween (or go through DMX); you can then adjust the level using the "setInput" command. In a show that contains dozens of video or audio files, tweening the same input onto every one of them is a fair bit of extra work. I've written a utility I call Remote App for my projects. Among other things, it allows adjusting the system-wide volume level (1 to 100%) and mute/unmute through a network command in a simple-to-understand protocol. The Remote App listens on TCP port 7000 and waits for volume commands. Example commands:
      volume 60<CR>      // set system volume to 60%
      volume ?<CR>       // query system volume; responds with volume=60
      volume mute<CR>    // mute system volume
      volume unmute<CR>  // unmute system volume
      The download links for the utility and guide are: https://omnigram.net/releases/RemoteApp_v2021.06.zip https://omnigram.net/releases/remote-app-guide.pdf At the moment, upon running, the utility is in 10-minute demo mode. Happy to provide free permanent activation if anyone is interested. Here's the demo video of the utility in use. Best regards, Eddy Omnigram
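      A tiny client for the protocol above could look like this. The display machine IP is a placeholder; port 7000 and the command format are as documented above:

      ```python
      import socket

      REMOTE_APP_PORT = 7000  # the port the Remote App listens on

      def volume_cmd(arg) -> bytes:
          """Build a Remote App command: plain ASCII terminated with <CR>."""
          return f"volume {arg}\r".encode("ascii")

      def send_cmd(host: str, payload: bytes) -> bytes:
          with socket.create_connection((host, REMOTE_APP_PORT), timeout=3) as sock:
              sock.sendall(payload)
              return sock.recv(256)  # e.g. the "volume=60" reply to a query

      if __name__ == "__main__":
          send_cmd("192.168.1.20", volume_cmd(60))   # hypothetical machine IP
      ```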
  15. Hi jme, Yes, that's how I understood it and I am well aware of that. Thanks for the additional explanation to the thread. Best regards, Eddy Omnigram
  16. May I know if you've selected the same stream for both the default and alternative stream name when you add the NDI Video media? The alternative stream is actually optional. I've seen those source switching warnings many times in my setup and I'd say those warnings are pretty harmless. From observation, it didn't seem to affect the stream quality — no visible frame drop or stream lag at all.
  17. The thing I've observed about command tagging is that the tag only gets returned on the first reply. For example:
      [16]ping<CR>
      [16]Ready "6.6.5" "WATCHPOINT" "Windows" false
      If you use a subscription-based command such as getStatus, subsequent replies will not have the tag:
      [18]getStatus 1 "TaskList:mItemList:mItems:TimelineTask \"Intro\""<CR>
      [18] Status "TaskList:mItemList:mItems:TimelineTask \"Intro\"" 2 6427 1858406
      Status "TaskList:mItemList:mItems:TimelineTask \"Intro\"" 2 11460 1863418
      Status "TaskList:mItemList:mItems:TimelineTask \"Intro\"" 2 16459 1868434
      Status "TaskList:mItemList:mItems:TimelineTask \"Intro\"" 2 21475 1873450
      I remember a similar status-polling requirement for a 3-projector blended projection a while ago: we needed to query the status of multiple auxiliary timelines and also the positions of the video assets within them. Like Patrick, we ended up solving it with an external solution. We wrote a proxy utility that keeps track of the auxiliary timelines' statuses and the objects' positions (there was no command to query an object's tween position or the variables defined under Input). The earlier copy of the utility that I used for that project is preserved at: https://github.com/eddyyanto/watchout-proxy
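      Because of this mixed tagged/untagged behaviour, a reply parser has to treat the tag as optional. A minimal sketch matching the reply lines shown above:

      ```python
      import re

      def parse_reply(line: str):
          """Split an optional [tag] prefix from a reply line.
          Returns (tag, payload); tag is None for untagged subscription updates."""
          m = re.match(r"\[(\d+)\](.*)", line)
          if m:
              return int(m.group(1)), m.group(2).lstrip()
          return None, line
      ```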