Uli's Web Site
The iPhone Reality Show
A company named Command Guru ran the first iPhone Reality Show last week. The goal? Develop a social media application for the iPhone within one week. They had renowned Cocoa teacher Aaron Hillegass, Fetch Softworks' very own Jim Matthews, Jonathan Badeen from Casting Networks, CloudApp's Max Schoening and many, many more great and talented people.
If you didn't catch the show, I recommend you go to Command Guru's Web Site and watch the highlights (for you iPhone people, they're stored on YouTube, so you can probably view them there instead of using the Flash player on their site).
The team did great work. Although the show had a bit of a rough start with a bunch of technical issues, they kept improving on everything, and in the end it was quite an enjoyable show. If you want to see how programming in a big team works, or how people start designing an application, this show documents one such project, and you can even look at the source code they produced.
However, I'm not known for praising people on this blog. Rather, you're probably here for the technical analysis. So let's get to it. There's not much to say about the development process itself.
There is a lot more to say about the execution of the show. Now mind you, I'm not a professional audio or video person, so I'm writing this from the point of view of the audience:
The static/mobile camera rule
They followed the basic rule I've heard for good tapings of live events: always have one stationary camera that covers the whole scene, so you have material for every second, then have other, mobile cameras that get close-ups etc.
It took them several days to sort out the audio ...
... but to their credit, they did. I can understand interference from cell phones on this occasion, but they seemed to have no plan besides "the developers will switch their mics on and off manually when they have something to say". This didn't work at all. For one, the whole event was distributed across two rooms, which meant you sometimes heard people talking in one room while the camera showed the other (and the developers apparently had no way of knowing when they were on camera).
More importantly, often you actually had people in both rooms talking, or in the busy work phases, you had several teams in the big room involved in conversations. Often, only half of the participants had working mics (or had remembered to turn them on), so you got one half of each conversation.
You didn't see what was on the screens
Pretty early on they put up a camera over the shoulder of one of the graphic designers, so at least you saw the progress the graphics were making. Graphic design also has the nice property that it's much more obvious what's going on, and it's a big part of what users see of an application. It was a good compromise, but still a compromise. On the live Twitter stream next to the video player on the web site (where they showed tweets tagged with "#CGLive"), you could soon see viewers complaining that they wanted to see more of the developers, not just the designers.
This is a difficult problem to solve: You could have a screen recorder or VNC server running on each developer's Mac, but that would slow things down, cause jerky playback, and put a high burden on the network. You could set the Macs to mirror to DVI, hook them up to a video converter, and feed that into a main Mac, but that's an expensive setup. Again, to their credit, the production crew caught on and started walking around with handheld cameras, filming screens over developers' shoulders. But they'd have needed more cameras, and they weren't doing this constantly, probably because cameramen in the middle of the room screw up the picture for the stationary camera.
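As an aside, the reason a VNC-style setup can keep the network load down at all is that it doesn't stream whole frames: it only sends the screen tiles that changed since the last frame. A toy sketch of that dirty-tile idea (frames as 2D lists of pixel values, tile size picked arbitrarily for illustration, nothing like a real RFB implementation):

```python
def dirty_tiles(prev, curr, tile=2):
    """Return {(row, col): tile_pixels} for every tile that changed
    between two frames. Frames are 2D lists of pixel values."""
    changed = {}
    for ty in range(0, len(curr), tile):
        for tx in range(0, len(curr[0]), tile):
            block = [row[tx:tx + tile] for row in curr[ty:ty + tile]]
            old = [row[tx:tx + tile] for row in prev[ty:ty + tile]]
            if block != old:
                changed[(ty, tx)] = block
    return changed

# A 4x4 "screen" where a single pixel changed: only the one
# 2x2 tile containing it would need to go over the wire.
prev = [[0] * 4 for _ in range(4)]
curr = [row[:] for row in prev]
curr[3][3] = 9
updates = dirty_tiles(prev, curr)
```

Even with that optimization, a developer compiling and scrolling through code dirties most of the screen most of the time, which is why the bandwidth worry above is a real one.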
Also, the shots they did of screens were usually short, quiet transitional shots. It would have been nice to encourage a developer to think out loud, and to actually show someone debugging something, tracking down a bug for the audience to see.
You would think someone intending to do a live reality TV show about programmers would have a plan for showing more than just ten guys bent over their screens, typing. I hope the Command Guru guys will chime in with their own report on this. I can only guess there was a plan and it fell through at the last moment.
They weren't recording gap-fillers
In every real-life interaction there are boring moments where, simply put, nothing happens. Also, in every longer time span of real life, there are moments where several things happen at once. On 24, these would be the moments where you get a split-screen picture. In the case of a reality show, these two things fit together: if you record everything that goes on, and record each sound channel separately, you can actually go back and take apart the conversations that happened at the same time.
While it happens live, you pick one conversation and turn down the volume on the others for the live stream. Once things have calmed down and nothing is happening, you can superimpose the recorded picture on the current broadcast as a picture-in-picture, and turn up the sound for the second conversation you had muted before.
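The mixing step itself is just per-channel gain: keep one conversation at full volume, duck the rest, and because every channel is still being recorded, the ducked one can be brought back later. A minimal sketch, with channel names and sample values invented for illustration (a real setup would go through an audio API, but the ducking logic is the same):

```python
def mix(channels, live, duck_gain=0.0):
    """Mix named channels into one track, keeping only `live` at
    full volume and ducking all others down to `duck_gain`.
    Each channel is just a list of samples here."""
    length = max(len(samples) for samples in channels.values())
    out = [0.0] * length
    for name, samples in channels.items():
        gain = 1.0 if name == live else duck_gain
        for i, sample in enumerate(samples):
            out[i] += gain * sample
    return out

# Two simultaneous conversations, recorded on separate channels:
room_a = [0.5, 0.5, 0.5]
room_b = [0.2, 0.2, 0.2]
feeds = {"room_a": room_a, "room_b": room_b}

# Live stream: room A is "on air", room B is muted but still recorded.
live_mix = mix(feeds, live="room_a")
# Later, for the highlights, room B can be mixed back in on its own.
replay_mix = mix(feeds, live="room_b")
```

The point of recording each channel separately is exactly that the second call is possible at all: nothing is lost by ducking a channel during the live broadcast.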
Now again, this is not trivial. You'd need the audio and video to be fed into several machines, need several directors, one for the main feed and one for the second conversation, and they'd need to communicate. But maybe they could have recorded all conversations, all microphone feeds, and then put the second conversation into the highlights. Again, we will not know whether that was intended to happen and didn't work, either due to unforeseen problems, or simply because I'm imagining this as being way too easy, while it's actually impossible.
The web site started out with a bunch of short interviews with the participants in the highlights, but they were usually introduced by first name only, and that was essentially all the introduction we got. It would have been nice to use a professional live-video-editing app (like BoinxTV, made by my friends at Boinx here in Munich). Not only would the picture-in-picture thing I mentioned above have been easy (they did a few PiPs later in the week, but it seemed to be done with some sort of manual video processor, and they did it rarely), they would also have had the opportunity to show "lower thirds", little overlays at the bottom of the screen with text like "Emanuele and Matej, programmers". That would have been a great help in getting to know everyone.
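Conceptually, the PiP compositing is nothing more than copying a scaled-down frame over a corner of the main frame, which is why a live-editing app can do it in software. A toy sketch, with frames as 2D lists of pixel values (sizes made up; real software works on scaled, alpha-blended video frames):

```python
def picture_in_picture(main, inset, top=0, left=0):
    """Return a new frame with `inset` composited over `main`
    at the given row/column offset. Frames are lists of rows."""
    out = [row[:] for row in main]
    for y, row in enumerate(inset):
        for x, pixel in enumerate(row):
            out[top + y][left + x] = pixel
    return out

# A 4x4 "main feed" of zeros with a 2x2 "second conversation" of
# ones composited into the bottom-right corner:
main = [[0] * 4 for _ in range(4)]
inset = [[1] * 2 for _ in range(2)]
frame = picture_in_picture(main, inset, top=2, left=2)
```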
Of course, just having a page somewhere listing the participants with job, name and photo would have helped a lot as well.
Well, that's it from me with my uninformed guesses and obvious hindsight on how one should have done this reality TV show. Now let's hope we find out what actually happened in a bit. It was a great show, and fine for a first time. Maybe they'll do it again, and then we'll see what they come up with.
Created: 2009-12-13 | Last change: 2017-08-20
© Copyright 2003-2017 by M. Uli Kusterer, all rights reserved.