Press Conference Production Notes
Some people have asked questions about the video process we used for our press conference earlier this week, where we announced the new ATEM Television Studio HD, Blackmagic Web Presenter and HyperDeck Studio Mini. So here are some notes on how we did it!
We used 4 Blackmagic URSA Mini cameras with studio viewfinders. This meant I had a big tally light to see which camera was on air, though I did keep forgetting to look at the right camera from time to time! We used PL mount lenses, with broadcast style controls, on the 2 cameras doing the wide shots. The other 2 cameras, used for close ups, just had 70mm EF photographic lenses on them.
All 4 cameras connected back to an ATEM 2 M/E Broadcast Studio 4K switcher via the ATEM Talkback Converter 4K, just using BNC cables. We didn't really need such a big switcher, but it was in the rack so we used it anyway! We did the production in Ultra HD at 2160p24. We chose 24 frames per second because the video was only for web distribution, so the lower frame rate saves a lot of data and makes the videos more responsive to view.
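To put a rough number on that frame rate saving, here's a quick back-of-the-envelope comparison against the more common 30fps (these are my own illustrative figures, not measurements from our encoder):

```python
# Rough comparison of how many frames 24p vs 30p delivers per hour.
# All else being equal, fewer frames means less data to encode and send.

def frames_per_hour(fps: int) -> int:
    """Total frames delivered in one hour at a given frame rate."""
    return fps * 60 * 60

f24 = frames_per_hour(24)      # 86,400 frames
f30 = frames_per_hour(30)      # 108,000 frames
saving = 1 - f24 / f30         # roughly 20% fewer frames at 24p
print(f"24p: {f24} frames/hour, 30p: {f30} frames/hour "
      f"({saving:.0%} fewer frames)")
```

Real bandwidth savings depend on the encoder settings, but the frame count gives a feel for why the lower rate helps for web delivery.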
For monitoring we had SmartView 4K monitors for program and preview, and SmartView Duo monitors for each camera. We also had a tech monitoring position with 2 SmartScope Duo 4K monitors running a bunch of scopes for aligning the cameras. That part went wrong: the cameras had been in regular use and were very badly misaligned. No one noticed, and it was only after we had started that the guys saw how the cameras looked against the slides and realized we had not aligned them correctly. That's one mistake we won't make again!
For the slides, I just used Apple Keynote on a MacBook Pro and fed its output via HDMI to the switcher. The MacBook Pro will do Ultra HD at 2160p24 perfectly fine, so we used a Teranex Mini HDMI to SDI converter to feed it to the switcher. I just changed the slides from the keyboard as we went through the press conference. The slides looked quite nice because I built the Keynote presentation at Ultra HD resolution, which is a custom setting in Keynote. Keynote scales images very well, so it's always a nice way to generate graphics for the switcher.
Then the program output was connected to the Blackmagic Web Presenter, and we used Open Broadcaster Software to send the stream up to the YouTube Live servers. This was a little more involved than a single link, because YouTube Live supports redundant links: you can run a primary and a backup stream up to the YouTube servers. This is fantastic, because if the primary link goes down, the servers will switch to the secondary link.
For the second link we used 4G phone internet, so we had redundant internet connections as well. The primary link was just our company ethernet, which goes out via fiber. We used 2 Web Presenters on 2 iMacs, with one iMac connected by ethernet and the other to the 4G network. This meant the whole stack was redundant: if anything went wrong, the YouTube servers would sort it out. The latency was about 35 seconds from the studio to the YouTube viewer.
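OBS handles all of this in its settings dialog, but for illustration, the primary and backup links simply point at two different ingest URLs that YouTube Live hands out. A sketch of the URL pattern (the host names follow YouTube's published RTMP ingest scheme at the time; the stream key is a placeholder for the one from your own YouTube dashboard):

```python
# Sketch of the primary vs backup ingest URLs for YouTube Live.
# STREAM_KEY below is a placeholder, not a real key.

def ingest_url(stream_key: str, backup: bool = False) -> str:
    """Build the RTMP URL for the primary or backup YouTube ingest server."""
    if backup:
        return f"rtmp://b.rtmp.youtube.com/live2?backup=1/{stream_key}"
    return f"rtmp://a.rtmp.youtube.com/live2/{stream_key}"

primary = ingest_url("STREAM_KEY")
secondary = ingest_url("STREAM_KEY", backup=True)
# Each encoder (one per iMac in our case) pushes to a different URL,
# and the YouTube servers fail over between the two streams automatically.
```

The point is that the two encoders run completely independently, on independent internet connections, so no single failure takes the stream down.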
For the 24 hours before the presentation, the YouTube event page showed a still frame. The YouTube server adds the countdown and the message about the upcoming stream, but the image itself was one we shot ourselves. About an hour before the press conference we started the stream, so we could check out the network and make sure everything was working, with enough time to fix something if it went wrong!
That opening image was just the first slide from my Keynote presentation. It was fun when the whole system was live and being streamed. The small counter running on that slide was a simple app I wrote the night before; it just counted down to the start of the event, running on a second MacBook Pro with HDMI out via a Teranex Mini to the switcher. We keyed it over the opening slide so people could see how long it was until the presentation started.
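The actual app was nothing special, and the countdown logic is about as simple as it sounds. In Python it might look something like this (the event time here is a made-up example, not our actual start time):

```python
from datetime import datetime, timedelta

def countdown_text(now: datetime, event_start: datetime) -> str:
    """Format the time remaining until the event as HH:MM:SS."""
    remaining = event_start - now
    if remaining <= timedelta(0):
        return "00:00:00"           # event has started; hold at zero
    total = int(remaining.total_seconds())
    hours, rest = divmod(total, 3600)
    minutes, seconds = divmod(rest, 60)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}"

# Example: 25 minutes before a hypothetical 9:00 am start.
start = datetime(2017, 2, 3, 9, 0, 0)
print(countdown_text(start - timedelta(minutes=25), start))  # 00:25:00
```

The real app just redrew a string like this once a second and output it full screen over HDMI, so the switcher could key it over the slide.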
Just before the presentation started we wanted to play some music, so we could check the audio was all working OK. We just played a royalty free music file from YouTube's library, connected directly to the switcher. We wanted to play "Video Killed the Radio Star" by The Buggles, but it was just too hard to get the royalties sorted out for it! Perhaps next time!
What did we get wrong? A bunch of things, but like anyone doing live production, you learn from it! Apart from the camera grades being off, the audio was also a little too hot; we needed to spend more time testing that. We also felt the audio could have used a little processing, such as some compression. And I need to say "um" less and make sure I look at the correct camera!
I think it's fantastic using the cameras with the big viewfinders on them; the tally is really clear and easy to see. It was also nice to use the Web Presenter to do the stream, and cool that the product we were launching was used to run the press conference itself!
So that’s how we used the equipment and I hope this helps!