The work for the previous Sherlock movie was shared between Framestore and DNeg, but this time around it was a 50/50 split between Framestore and MPC. Framestore had two main sequences: the gun fight on the train towards the beginning of the film, and the Alpine castle environments for the finale. I was part of the castle team.
All the castle scenes were shot on a sound stage. The balcony was surrounded by green screen and needed matte paintings of the valley added. Because of the wide range of camera angles, I was asked to work with the matte painting guys to create a 3D environment in Nuke that could be used for all the shots in the sequence.
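For anyone curious why a 2.5D set-up like this works for a whole range of camera angles: the matte paintings are hung as cards in 3D space and rendered through a camera that matches the plate, and underneath it all is just a pinhole projection. This is a minimal sketch of that projection with illustrative lens values, not the show's actual set-up:

```python
# Minimal pinhole projection, the core of rendering painted cards in 3D
# through a camera. Values are illustrative, not the production lens data.

def project(point, focal=50.0, aperture=24.576):
    """Project a 3D point (camera space, -Z forward) to normalised screen x, y.

    `focal` and `aperture` are in mm, as in a Nuke camera node; the ratio
    between them sets the field of view.
    """
    x, y, z = point
    if z >= 0:
        raise ValueError("point is behind the camera")
    scale = focal / aperture
    # Divide by depth: points further from camera land closer to centre,
    # which is what lets one 3D environment hold up from many angles.
    return (scale * x / -z, scale * y / -z)
```

Move the camera and the same cards re-project correctly, which is why one environment could serve every shot in the sequence.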
Once the matte painting layout had been approved, it was just a case of working on the key and adding in some layers of snow. We used 2D snow elements for the locked-off shots, but had snow simulations rendered by the FX guys for the shots with big camera moves.
The two biggest shots I worked on were part of a sequence in which Sherlock previsualises the outcome of a confrontation. The fighting had been shot at 96fps and we had to match the re-speeds that had been done in the edit suite. We were provided with reference QuickTimes with timecode burned in and it was a case of manually matching the source frames in a retime node.
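For a constant re-speed, the mapping from timeline frames back to source frames is simple arithmetic. The sketch below uses hypothetical names and is only an illustration of that relationship; on the show the matching was done by eye against the timecode burn-ins:

```python
# Illustrative sketch of matching an editorial re-speed, not production code.
# Footage shot at 96 fps and conformed to a 24 fps timeline plays at quarter
# speed when every frame is used; `speed` is source frames advanced per
# output frame, so 0.25 would be a further 4x slow-down on top of that.

def source_frame(output_frame, output_start, source_start, speed):
    """Map an output (timeline) frame back to the source frame it shows."""
    return source_start + (output_frame - output_start) * speed
```

A retime node interpolates when this lands between whole source frames, which is why matching the editorial timings frame-by-frame mattered.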
These retimes are a Guy Ritchie signature and I remember trying to recreate them in Final Cut Pro at university after seeing them for the first time in Lock, Stock and Two Smoking Barrels. It didn't work because at the time I didn't understand that slow-motion shots are filmed at a higher frame rate. It was very cool to actually be working on some of them 11 years later.
The other element that was added to almost every shot on the balcony was cold breath. We were very nervous about this as it was easily something that could draw attention to itself and look fake. Our 2D supervisor had sourced a great collection of breath puff elements and grouped them by shape. He then did some clever scripting and created a gizmo that took the audio file for the shot and timed the elements to start at any point on the audio wave above a given threshold. We could then tweak the timings and elements for each point triggered and track the position to the mouths in the shot. A lot of it was open to interpretation as to what looked ‘real’ and there was a lot of discussion in the review sessions.
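I don't have the actual gizmo to share, but the triggering idea can be sketched in plain Python. The names, rates and thresholds here are my own, not the production tool's:

```python
# Hypothetical sketch of audio-triggered breath timing. Assumes a mono
# amplitude envelope sampled at `sample_rate`; names are illustrative.

def breath_trigger_frames(samples, sample_rate, fps=24.0,
                          threshold=0.5, min_gap_frames=12):
    """Return frame numbers where the audio envelope rises above `threshold`.

    A new trigger is only emitted on a fresh rise after the signal has
    dropped back below the threshold, and only if at least `min_gap_frames`
    have passed, so one sustained line of dialogue fires a single breath
    element rather than dozens.
    """
    triggers = []
    above = False
    last_frame = None
    for i, amp in enumerate(samples):
        frame = int(i / sample_rate * fps)
        if abs(amp) >= threshold:
            if not above and (last_frame is None
                              or frame - last_frame >= min_gap_frames):
                triggers.append(frame)
                last_frame = frame
            above = True
        else:
            above = False
    return triggers
```

From a list of trigger frames like this you could then hand each point a breath element and hand-tweak the timing, which matches the workflow described above.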
The original deadline for the project was supposed to be just before I left London, but the work rolled on for another month or so. MPC Vancouver were still working on it when I got there, but I didn't do anything on it there. It was a shame that I didn't quite see it to the end at Framestore (and I'm gutted that I missed the wrap party) but it was a good project to leave on.
You can read a more detailed account of Framestore’s Sherlock work on their website.