
JJ Myers

Member
  • Posts: 74

Everything posted by JJ Myers

  1. I tried that. I even used a single cue with very minimal properties, and only very legacy ones at that (opacity, position, scale). I thought perhaps there was a change in the use of line returns or something of the like that was offending the older version. I did nearly a verbatim string search and found the version string on the third line to be really the only difference. Thanks for the suggestion though. I look forward to evaluating the next major version.
  2. I need to take cues from a project built in v6.1.2 and migrate some of them to a new project using v6.0.2. Using a version later than 6.0.2 is not an option for this client. Long story... let's just say the client is not comfortable with any version later than 6.0.2 at this time. So I tried the old trick of copying cues from the 6.1.2 project (opened in 6.1.4) and pasting them into a new 6.0.2 project. The cues will not paste in my 6.0.2 project. The "Paste" option is greyed out, as if 6.0.2 recognizes some sort of offending item in the clipboard. I even tried copying just one simple, single image media cue devoid of any tween data. Is there any way around this? Any find/replace tricks I can run through the copied code to make it compliant with 6.0.2? All I need to copy are some video, image, and composition cues that have some very traditional tween tracks (opacity, scale, position). The compositions contain simple image and video cues as well. Any help would be greatly appreciated.
  3. We use these: http://www.avenview.com/index.php?main_page=index&cPath=1265_1292&zenid=agevjksi3p62bsfiiga2rh6gb2 They can do 3840x1080, but not full 4K. They work very well though and are not very expensive.
  4. I think what Playground is trying to say is that the "0" key should be trapped when typing in a text field of any sort, so that it does not trigger the main timeline. I have also noticed that typing the "0" key in one of the numeric text fields for 3D projector registration will unexpectedly trigger the main timeline, making it cumbersome to have to keep exiting that dialog to set the playhead back to where I need it (e.g. a test pattern). Whether it is considered "intended" or not, it is definitely counter-productive when under the gun. To make a comparison: if you were typing "192.168.0." into the network range text field of the preferences, you would not want the playhead of the main timeline to trigger when you got to the "0" key.
  5. If all he has to do is corner pin adjustments for keystone, I would advise using Thomas' approach, and then using the corner pin tween on the virtual display layers placed on your 2D displays. That seems a lot simpler than a 3D approach, which requires creating a mesh, UV map, etc.
  6. Hey dannyrats - we have purchased a bunch of the DPx2 cards and have tested them with WO v6. Everything has been working stellar thus far. The only thing to take into consideration with the DPx2 cards is the lack of 4K DP distribution/infrastructure hardware out on the market. I have yet to find a 4K DP matrix routing solution, and there is very little out there in the form of 4K DP DAs. And of course, since HDMI/DVI is not backwards compatible with DP, you have limited options for converting signals into these cards as well... although I have found a really good solution for 2K signal conversion: https://jet.com/product/product/5f968a69eb444b838f8ecdcd2b98ba77?jcmp=pla:ggl:electronics_a1:electronics_accessories_cables_a1_other:na:na:na:na:na:2&code=PLA15&k_clickid=23b67f0c-a077-5349-4327-0000397bed7f&kpid=5f968a69eb444b838f8ecdcd2b98ba77&gclid=CPGjgcjIgMgCFQpDaQod9DkGag So if you are like me - and need to distribute various 4K sources to different capture cards - you may find these cards challenging in that regard. I think they are a little ahead of their time. I hope this helps and best of luck! JJ
  7. @NYC Todd - very well said. I could not have articulated that any better! I highly second NYC Todd's suggestion. In fact, other than a blur effect track, this is probably at the top of my wish list.
  8. Jim, with all due respect, something is getting overlooked. I have produced a video sample of what I am referring to here: https://www.dropbox.com/s/wa81yy6u5nqruof/Looping%20Composition%20of%20video%20cue.m4v?dl=0 As you can see, I followed all of your instructions to produce this working example. I will say, if you blink, you will not see it. In fact, watching this in real time will probably take a couple of passes, so you may be better off just downloading the video and then examining the footage frame by frame. The cue is a 180-frame composition out of After Effects, counting each frame. We use it to examine drop-frame scenarios. What you will see is that when the cue arrives at frame 180 (the composition's loop point), it drops to black and then continues at frame 2 of the cue's internal time base.
  9. Hi Erik, placing the video in a composition and looping the composition will not result in a seamlessly looped media cue. Compositions have not been able to jump back to their zero time base without some sort of visual jump/glitch for some time now. I even tried your suggestion with the latest version of WO and it did not work. Composition time bases have not been able to produce the same behavior that we have traditionally gotten from the internal time bases of video media cues. If I ever use the loop behavior of a composition, it is only when I have matched the first 0.5s of the composition with the last 0.5s using something static. Otherwise, drops to black occur. If one were to line up multiple iterations of the ProRes media within a composition - for a longer-than-needed duration - and then place the composition in free-run mode, that would work. But if someone needs to loop a composition for a completely TBD duration, then your suggested workaround will not work. My suggestion to anyone wishing to loop ProRes media at this time is to convert the media to HAP. We have found the HAP codec to be incredibly powerful and lightweight, while delivering very good quality. We have adjusted our workflow from MPEG2 to HAP. And HAP even supports alpha transparency! JJ
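
     As a rough sketch of the conversion step, this is roughly how one could batch-encode ProRes masters to HAP with FFmpeg driven from Python. It assumes an FFmpeg build that includes the HAP encoder, and the folder names are just placeholders:

        import subprocess
        from pathlib import Path

        # Placeholder folders -- adjust to your own media locations.
        SRC = Path("prores_masters")
        DST = Path("hap_renders")
        DST.mkdir(exist_ok=True)

        for clip in SRC.glob("*.mov"):
            out = DST / (clip.stem + "_hap.mov")
            subprocess.run([
                "ffmpeg", "-y",
                "-i", str(clip),
                "-c:v", "hap",   # add "-format", "hap_alpha" if the source carries transparency
                "-an",           # we handle audio separately in our workflow
                str(out),
            ], check=True)
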
  10. I am wondering if anyone has any insight regarding these two articles as they may or may not pertain to WO and memory restrictions on 32-bit resources: http://www.unawave.de/windows-7-tipps/32-bit-ram-barrier.html?lang=EN http://www.makeuseof.com/tag/unlock-64gb-ram-32-bit-windows-pae-patch/ I am trying to determine if this tweak to the kernel allows 32-bit applications to access more system memory. Obviously, it is advertised in these articles to allow more memory allocation to 32-bit Windows installs (beyond the 4 GB limit), but I cannot find anywhere in the articles where it states whether restrictions on 32-bit applications/resources are also removed. I suppose I could just implement the tweak and then monitor memory usage, but figured inquiring through the forum may be a better first move. Any experience/knowledge would be helpful! Thanks, JJ
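
     For the "monitor memory usage" part, a minimal sketch with the psutil package is below. The executable name is a guess on my part - check Task Manager for the actual process name on your display machine:

        import time
        import psutil

        PROCESS_NAME = "WATCHPOINT.exe"   # hypothetical name -- verify in Task Manager

        def find_process(name):
            for proc in psutil.process_iter(["name"]):
                if proc.info["name"] and proc.info["name"].lower() == name.lower():
                    return proc
            return None

        proc = find_process(PROCESS_NAME)
        if proc is None:
            raise SystemExit(f"{PROCESS_NAME} is not running")

        while True:
            mem = proc.memory_info()
            # rss = physical RAM in use, vms = virtual address space reserved
            print(f"RSS: {mem.rss / 2**20:8.1f} MB   VMS: {mem.vms / 2**20:8.1f} MB")
            time.sleep(5)
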
  11. Follow-up: turns out things didn't work out quite as planned. The client specified 1080i, and my tests were attempted with 1080i. However, unbeknownst to me, the GFX card was actually switching resolutions on me each time WO would launch. So... all my tests were actually operating at 1080p30. I found a way to force 1080i even when WO launched, but WO behaved similarly to when you aren't working with legitimate DirectX 3D outputs. My guess is that DirectX 3D is disabled when "(interlaced)" display modes are selected. So... the DVI Extenders do not do the trick. We are now going with actual scaler devices to turn 1080p60 into 1080i60, using these: http://decimator.com/Products/MiniConverters/MD-HX/MD-HX.html Hopefully this thread alleviates future pain for someone else having to do this same thing!
  12. Just to wrap up and put a bow on this post, the client ended up opting for a used CPU from our rental stock (with v6 on the horizon, we are going to need to redo our fleet anyway). So... we ended up using an old FirePro V7900 with the Blackmagic DVI Extenders. The DVI Extenders work perfectly! Once set up, all the outputs boot up with their respective, remembered EDID handshakes. Add an iPod Touch, WAP, and WatchRemote for control, and I have a low-cost "power-and-go-out-of-the-box" solution for my client! Thanks everyone for the help!
  13. I have a client that needs a WO system to feed 3x 1080p displays via HD-SDI. I have a couple of ideas that I am pondering, and I would be tremendously grateful for any forum feedback on them. Solution 1: use a standard multi-head DisplayPort GFX card, along with active DP>HDMI adapters and BM DVI extenders for the conversion. I didn't think there were active DP>HDMI adapters out there until I found the one at the link below. Relevant parts list: http://www.newegg.com/Product/Product.aspx?Item=N82E16814195118&cm_re=FirePro-_-14-195-118-_-Product https://www.blackmagicdesign.com/products/dviextender http://www.startech.com/AV/Displayport-Converters/DisplayPort-to-HDMI-Active-Video-and-Audio-Adapter-Converter-DP-to-HDMI-1920x1200~DP2HDS Solution 2: use a multi-head DVI GFX card, along with the BM extenders. Relevant parts list: http://www.newegg.com/Product/Product.aspx?Item=9SIA24G28N0982 https://www.blackmagicdesign.com/products/dviextender I like solution 2 the best, but I have had zero experience with those types of cards in WO systems with 2+ heads. Thanks for the help!
  14. Holy moly! This may possibly be one of the most brilliant things I have seen come along for WO in a long time. Definitely putting in my vote and interest in seat licenses now. If only I could vote twice...
  15. Hi, we have a new project where the requirements include AES audio output from the WO display machine. Does anyone have any hardware recommendations for an AES sound card?
  16. Well look at that! You learn something new every day
  17. You can write a Flash application (or an app from any development environment, for that matter) that connects to your WO cluster over a TCP/IP socket. You can code that application to accept strings over that socket, and then use that string data to call a PHP script somewhere that sends an email. After WO, it is all pretty standard web development stuff. How you format and parse the string data is entirely up to you. Hope that helps.
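
     To illustrate, here is a minimal sketch of that pattern in Python instead of Flash. The cluster address, port, and mail script URL are placeholders, and it assumes the strings arriving over the socket are newline-delimited:

        import socket
        import urllib.parse
        import urllib.request

        WATCHOUT_HOST = "192.168.0.10"                    # placeholder cluster address
        WATCHOUT_PORT = 3040                              # placeholder port -- check your WO control setup
        MAIL_SCRIPT = "https://example.com/sendmail.php"  # placeholder server-side mail script

        def send_email(message):
            # Hand the string off to a PHP script that does the actual mailing.
            data = urllib.parse.urlencode({"subject": "WATCHOUT event", "body": message}).encode()
            urllib.request.urlopen(MAIL_SCRIPT, data=data, timeout=10)

        with socket.create_connection((WATCHOUT_HOST, WATCHOUT_PORT)) as sock:
            buffer = b""
            while True:
                chunk = sock.recv(4096)
                if not chunk:
                    break                                 # connection closed by the cluster
                buffer += chunk
                while b"\n" in buffer:
                    line, buffer = buffer.split(b"\n", 1)
                    text = line.decode("utf-8", errors="replace").strip()
                    if text:
                        send_email(text)
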
  18. Circling back to the feature request concerning geometry control... With projection mapping becoming more and more prominent in WO shows, I am seeing a greater need for a makeover of the geometry control interface. What we would like to see is: 1. Some form of underlay of the actual display's current content underneath the geometry grid. It is very important to be able to see exactly where we are setting/adjusting a point in relation to the display's output. It would also be nice to see the warping effect on this underlay content, just like you see it on the actual display output. If performance is a concern here, we would happily sacrifice stage rendering while the display's geometry control properties tab is open. 2. Making the interface scalable. The geometry grid is too small to work with in certain projection mapping capacities. If we had the ability to scale the grid up to a larger portion of screen real estate, it would help facilitate the meticulous work required for some of the projection mapping we do. 3. The ability to delete individual direction handlers. I guess this can be accomplished by just setting the displacement value of the direction handler to zero, but the ability to simply delete a direction handler from a point would make things snappy. The more it can operate like other vector-based drawing tools (e.g. Adobe Illustrator), the more intuitive we would find it. Thanks for your time.
  19. I love feature request discussions! One of my top requests is the ability to apply X and Y rotation to an anchor point separately from the vanishing point. In other words... not have to subscribe to the global 3D/vanishing point values for a given X or Y rotation. I understand this does not work for stereoscopic display, but for those of us who are using X and Y rotation 99.99% of the time for simple "2.5D" work, it would be nice to have the same X/Y features one has in After Effects. Just to be clear: if I wanted to horizontally align multiple cues of a playing card rotating on its Y axis, I could never achieve that effect in the current version of WO. Unless I am missing something...
  20. I picked up 2 Delta 66 (6 discrete output) cards as my client specifically requested them for a project. They assured me they were the best solution for multi-channel playback in WO and insisted on using them. I unfortunately found out on site that their experience with the cards was limited to WO v4 / Win XP. Of course, all my systems are WO v5 / Win 7 64-bit. So... I have discovered the following: 1. The Delta 66 cards will play back multi-channel audio files ONLY in Windows Media Player (outside of WO) with "multi-channel" selected in Control Panel > Sound. 2. WO reports an Operating System error (614) & DirectShow error (-2147220890) when attempting to use a multi-channel audio file. This error is reported by both production and display machines, regardless of which Delta 66 configuration is selected in Control Panel > Sound. 3. WO reports an Operating System error (614) & DirectShow error (-2147220890) when attempting to use ANY sound file (including stereo) if "multi-channel" is selected in Control Panel > Sound with the Delta 66 card. I have to choose "Line 1/2" instead in order to avoid the error in WO with non-multi-channel files. My guess is that the issue relates to the Delta 66 drivers not supporting the DirectX configuration that WO uses? If the drivers for the Delta cards are not the issue, and it is something else... what are the chances it gets rectified in a later release of WO? I'm trying to determine whether I should hold onto these cards or get rid of them. They are nice because the cards themselves are PCI, instead of plug n' play... so it would be nice to use them down the road. If the drivers are the issue, then I guess I would be at the mercy of M-Audio... JJ
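
     For anyone debugging a similar setup, a quick way to see what output devices and channel counts the OS actually exposes is a sketch like this with the sounddevice package (it enumerates through PortAudio, so it will not mirror DirectShow exactly, but it shows whether the multi-channel configuration is visible at all):

        import sounddevice as sd

        for index, dev in enumerate(sd.query_devices()):
            if dev["max_output_channels"] > 0:
                api = sd.query_hostapis(dev["hostapi"])["name"]
                print(f"{index:3d}  {dev['name']}  "
                      f"(outputs: {dev['max_output_channels']}, API: {api})")
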
  21. "...seems to be all Flash (which won't help me on iOS)..." Not true. Although browser-based Flash is not supported in the iOS browser, one can write and compile apps for iOS using Flash technology. The latest versions of both Flash Builder and Flash Professional all contain the resources to successfully compile Flash and Flex projects for iOS. Read here : http://gregsramblings.com/2011/06/20/finally-its-here-flex-on-ios-android-and-blackberry-playbook/ JJ
  22. I have come across this behavior many times in the past, and have adopted the following protocol to avoid any further encounters: name your proxy files with the EXACT same name as the display. If your file is named "Display1.mpg" and your display is named "Display1", you will get an error. Therefore, you have to strip the extension from the proxy file name so that it is quite literally "Display1". Windows will warn you about stripping the extension from the file and will no longer associate it with an application, but WO will be able to correlate it with its proper display. Hope this helps! JJ
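
     If you have a long list of displays to proxy, stripping the extensions by hand gets tedious (and Windows Explorer hides extensions by default). A small sketch of that renaming step - the folder path and .mpg extension are just examples:

        from pathlib import Path

        # Example folder of proxy files named after their displays, e.g. "Display1.mpg"
        PROXY_DIR = Path(r"C:\shows\proxies")

        for proxy in PROXY_DIR.glob("*.mpg"):
            target = proxy.with_suffix("")   # "Display1.mpg" -> "Display1"
            if target.exists():
                print(f"skipping {proxy.name}: {target.name} already exists")
                continue
            proxy.rename(target)
            print(f"{proxy.name} -> {target.name}")
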
  23. I'm not sure if anyone else using multi-outputs in v5 has come across this, but I'm trying to get my head around it... My machines all have ATI FirePro V7900 cards for multi-output use. For those who are not familiar with these cards, they have 4x DP physical ports, arranged very nicely across a single PCI plane. Looking at the back of the card, my logic says... hmmmm... top --> bottom must equal either outputs 1 --> 4 or 4 --> 1. However, my latest experiences seem to reveal some (in my humble opinion) illogic from Microsoft when assigning IDs to connected displays. Call me crazy, but it seems like Windows 7 'remembers' certain combinations of connected displays, and assigns IDs based on 'remembered' EDID info of those displays for different combinations. Furthermore, I cannot figure out how Windows 7 determines these ID assignments, when it determines and 'remembers' them, and why it refuses to follow a logical, sequential order. As I've come across these experiences, I have learned NOT to fight the illogic and just take what Windows gives me. Lucky for us, we are able to sort it all out in the WATCHOUT environment. Nonetheless, it is very annoying to anal-retentive, engineer, propeller-head types like me. So, my question to the WO community: does anyone know any clever tricks, or have some further insight on how to force Windows to assign IDs according to their physical backplane order? If so... perhaps I can take a P-touch labeler to my GFX cards and be free from Microsoft trying to be smarter than me. JJ
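
     It will not force Windows to renumber anything, but for anyone who wants to see how the OS is currently mapping adapter outputs to attached monitors, here is a rough sketch that walks the Win32 EnumDisplayDevices call via ctypes. Comparing its output to the physical backplane order at least tells you what you are fighting:

        import ctypes
        from ctypes import wintypes

        class DISPLAY_DEVICE(ctypes.Structure):
            _fields_ = [
                ("cb", wintypes.DWORD),
                ("DeviceName", wintypes.WCHAR * 32),
                ("DeviceString", wintypes.WCHAR * 128),
                ("StateFlags", wintypes.DWORD),
                ("DeviceID", wintypes.WCHAR * 128),
                ("DeviceKey", wintypes.WCHAR * 128),
            ]

        user32 = ctypes.windll.user32

        def enum_devices(parent=None):
            i = 0
            while True:
                dev = DISPLAY_DEVICE()
                dev.cb = ctypes.sizeof(dev)
                if not user32.EnumDisplayDevicesW(parent, i, ctypes.byref(dev), 0):
                    break
                yield dev
                i += 1

        # First pass lists the adapter outputs (\\.\DISPLAY1, \\.\DISPLAY2, ...);
        # the second pass lists the monitor attached to each output.
        for adapter in enum_devices():
            print(adapter.DeviceName, "-", adapter.DeviceString)
            for monitor in enum_devices(adapter.DeviceName):
                print("   attached:", monitor.DeviceString, "-", monitor.DeviceID)
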
  24. You were right! They were indeed passive. I did find that using an ACTIVE DP --> DVI adapter and a passive DVI --> HDMI adapter did the trick! It was a little messy, but it worked! Thanks!
  25. That actually sounds like a cool feature request: WO providing some form of operator's monitor with feedback on an extra display output. Something "Task Manager-like", assignable from the production software. I don't know if it's doable, or how much extra load it would put on a display, but I like it conceptually. It has my vote!