
Saturday, 23 May 2015

Sci-fi indie effects: the making of Infini - by Ian Failes

If you're making a sci-fi film set in the 23rd century on a deep space mining colony, you might think that heavy CG builds and a ton of visual effects shots would be necessary. But that's where the new sci-fi film, Infini, from director Shane Abbess, is different. While there are some stunning set extensions, hologram work and plenty of visual effects enhancements, the production leaned heavily on carefully constructed sets, practical effects and practical solutions for visual effects.

Director Shane Abbess (left) and VFX supervisor Steve Anderson.

“I see the role of a visual effects supervisor as not just a digital consultant,” says Infini's visual effects supervisor (and fxphd Prof.) Steve Anderson. “It’s about finding the path to the best effect for the available time and budget. That path is not always best achieved via digital methods. It’s always good to look in the back catalog for how things are done.”
We find out from Anderson how just some of the 350-odd shots were achieved for Infini, a film on which fxphd alumnus Mat Graham was a producer. The shots we cover include real-life ASCII art screens, ooze-filled blood and creature gags, and some elaborate stunt and set extension work. Most were achieved via an in-house effects unit.

Oozing with time
In the film, the presence of an alien force causes some strange phenomena to occur - including the apparent backwards motion of dripping blood.
Blood drips.
That blood effect might typically be achieved by reversing high speed film, but Anderson was aware the final result can appear problematic. “I’m always averse to doing reverse gags,” he says. “It is a little obvious to our vision memory where you can pick up on little accents of motion blur and acceleration/deceleration when it’s the wrong way around.”
When Anderson read the blood moment in the script - a drip of blood falls from the table, suspends itself in mid-air and then slowly reverses itself back up onto the table - he immediately thought of utilizing the ‘frame rate phase’ effect. “We found this reference on YouTube where some people were using a speaker driven by a signal generator to make a sine wave that’s really exactly the same as the frame rate of the camera,” explains Anderson, “where you vary that frequency ever so slightly to be a little slower or faster than the frame rate. You get this snapshot moment 25 frames a second of a repeated event, and when it’s ever so slightly out of phase it looks like it’s slowly going forwards or backwards - like high speed photography.”
Although Anderson created a first test with a speaker, he eventually settled on using a stepper motor to drive the phase effect event. “It had more torque and fine control,” he notes. “We just attached that to the table and created that vibration in sympathy with the camera. We also had to have a very narrow shutter angle otherwise it wouldn’t work. We did have to have a little bit more light than we would have needed if we went the high speed route, though.”
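What makes the gag work is simple temporal aliasing: if a vibration repeats at frequency f and the camera samples at frame rate r, the recorded motion appears to cycle at the beat frequency f − r. A minimal sketch of the arithmetic (our illustration, not code from the production):

```python
import numpy as np

FRAME_RATE = 25.0   # camera frame rate from the article (fps)
EVENT_RATE = 25.1   # vibration driven slightly faster than the frame rate (Hz)

# Sample the repeating event (a sine-driven displacement) once per frame.
frames = np.arange(250)                       # 10 seconds of footage
t = frames / FRAME_RATE                       # the moments the shutter fires
displacement = np.sin(2 * np.pi * EVENT_RATE * t)
print(np.round(displacement[:5], 3))          # drifts slowly frame to frame

# On screen the motion cycles at the beat frequency, not the true one.
beat_hz = EVENT_RATE - FRAME_RATE             # 0.1 Hz: one slow cycle per 10 s
print(f"true rate {EVENT_RATE} Hz, apparent rate {beat_hz:.2f} Hz")
# Driving the event slightly SLOWER than the frame rate flips the sign,
# which is what makes the blood drip appear to crawl back up onto the table.
```

The narrow shutter angle Anderson mentions matters because each sample must be a near-instant snapshot; a wide shutter would smear each cycle into motion blur and break the illusion.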
The 'jelly finger'.
With that blood effect so successfully achieved practically, Infini’s special effects supervisor Phillip Young consulted Anderson on another effect - the creepy ‘jelly fingers’ - that he was looking to solve. “We used the same stepper motor drivers to add all the secondary and tertiary motion to the worm-like forms,” outlines Anderson. “We had little strings to lift the heads but all of that lovely sinusoidal motion was entirely in camera and from the same phase gag to give it that slow jelly-like undulation.”

Oozing with aliens

Towards the end of the film, the image of crew member Whit Carmichael (Daniel MacPherson) is revealed amongst a number of aliens - the ‘jelly people’ - in semi-transparent form. Again, the visual effects for these shots were not completed in the usual way. “The brief was that the slime and ooze was forming and learning to take the shape of a human,” says Anderson. “We had to give the impression of Whit's face being formed on these rather exaggerated or skeletal jelly forms. In a perfect world, we would have gone a full CG route. But in the constraints of what we had to play with, we had to think our way around the problem.”
On the set of Infini, shot in Sydney, Australia.
To solve the effect, the team elected to cover a stand-in actor, Goran D. Kleut, with KY Jelly lubricant. “He has this most unique form and is able to contort his body into the most extreme shapes and poses that really fit the bill to create something that was a little bit creepy and otherworldly,” notes Anderson. “We shot his body for traditional crowd replication but of course the poor bugger was covered in KY jelly with a little bit of metal fleck added to it. He had to stand up there for a couple of hours freezing his nuts off!”
“Then Daniel as Whit stood in the same position and mimicked what Goran had done with his head movements,” adds Anderson. “We did enough takes that we had enough overlap and could marry heads to bodies.”

ASCII art

Close-up on screen.
The control room on the deep space mining station features views to the extreme outdoors, as well as several monitors with information and read-outs. These days it would not be uncommon for the graphics to be burned in during post, but for these particular shots a unique approach was undertaken. Production networked more than 20 Raspberry Pi computers and monitors running interactive ASCII-style programs that the actors could actually interact with by typing and pressing buttons.
Filming the screens.
“It was an XML based bunch of ASCII that we could rig up that illustrated the drilling process,” explains Anderson. “And an entire map of the world. We were in geek heaven writing all of that! We put little Easter eggs in there - story points and mythos for all of the universe. Anyone who wants to go pausing and scanning those screens will be rewarded.”
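The production's code isn't public, but the idea - networked Raspberry Pis rendering interactive ASCII readouts from XML definitions - can be sketched in a few lines of Python. Everything here (the element names, the screen content) is invented for illustration:

```python
import curses
import xml.etree.ElementTree as ET

# Hypothetical screen definition; the tags and readouts are our invention,
# not the production's actual XML schema.
SCREEN_XML = """
<screen refresh="0.5">
    <line>DRILL STATUS: NOMINAL</line>
    <line>CORE DEPTH:   8842m</line>
    <line>PRESS [Q] TO EXIT CONSOLE</line>
</screen>
"""

def main(stdscr):
    screen = ET.fromstring(SCREEN_XML)
    stdscr.nodelay(True)                      # don't block on keypresses
    while True:
        stdscr.erase()
        for row, line in enumerate(screen.findall("line")):
            stdscr.addstr(row, 0, line.text)  # draw each ASCII readout line
        stdscr.refresh()
        if stdscr.getch() == ord('q'):        # let the actor interact
            break
        curses.napms(int(float(screen.get("refresh")) * 1000))

curses.wrapper(main)
```

On set, each Pi would run its own screen definition, which is what let the actors type and press buttons against live displays rather than greenscreen inserts.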

Making the stunt

Plate.
CG background.
CG background.
Final shot.
The film also involved much more traditional visual effects - from CG sets, matte paintings and environments to composites. One of the more challenging shots involved Carmichael making a run for it from members of his newly-infected team and leaping over a chasm, only to fall 30 meters.
For the stunt, MacPherson (or his stunt double) was filmed running through a corridor set backed with greenscreen and leaping onto a mat, with a camera operator following in a lightweight EPIC rig. The side view involved a wire rig and greenscreen setup. The fall saw the stunt double take a 12 meter tumble to a greenscreen crash mat below.
Visual effects artists then incorporated CG geometry as backgrounds, crafting 2.5D matte paintings and ensuring depth cues were present in the shots. Similarly, scenes with outside views were aided by projecting matte paintings into the space.
“This meant that at least the light from outside an apartment would reflect in and interact with all of the reflective surfaces inside,” says Anderson, “and that would let the DOP and director frame up shots properly and not just be second-guessing visual effects.”
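The article doesn't detail the implementation, but the core of camera-projecting a matte painting is treating the painting as the image plane of a 'projector' camera: every point on the receiving geometry looks up its color from where it lands in that image. A generic sketch of the mapping, with all matrices and names our own:

```python
import numpy as np

def project_uv(points_world, view, proj):
    """Map world-space points into a matte painting's UV space.

    A minimal sketch of 2.5D camera projection: `view` and `proj` are
    standard 4x4 view/projection matrices for the projector camera.
    """
    n = points_world.shape[0]
    homo = np.hstack([points_world, np.ones((n, 1))])  # homogeneous coords
    clip = homo @ view.T @ proj.T                      # into clip space
    ndc = clip[:, :2] / clip[:, 3:4]                   # perspective divide
    return ndc * 0.5 + 0.5                             # NDC [-1,1] -> UV [0,1]

# Three points on a set-extension wall; identity matrices for brevity.
pts = np.array([[0.0, 0.0, -1.0], [0.5, 0.5, -1.0], [-0.5, 0.25, -1.0]])
print(project_uv(pts, np.eye(4), np.eye(4)))
```

Because the painting is anchored to real geometry, its light can bounce and reflect inside the set, which is exactly the benefit Anderson describes.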
For Anderson, that approach - which also involved re-creating sets digitally with photogrammetry - was about keeping the visual effects manageable and in budget.
“We were not going to burn our budget for one given effect that may not be that necessary to tell the story,” he says. “It certainly was a challenging level of scope with the time and budget but we’re very proud of what we achieved.”

Thursday, 12 February 2015

Exodus: Gods and Kings -- Grading Ridley Scott's 3D Epic

CreativeCOW presents Exodus: Gods and Kings -- Grading Ridley Scott's 3D Epic -- DaVinci Resolve Editorial

Director Ridley Scott's 3D epic Exodus: Gods and Kings takes a gritty approach to the biblical story of Moses' escape from Egypt. Colorist Stephen Nakamura of Deluxe Creative Services' Company 3 helped expand the realistic look developed by Scott and DP Dariusz Wolski, ASC, by using DaVinci Resolve 11 to grade the film, utilizing new tools that helped push the look further while still preserving necessary details for 3D. The film was shot on the RED Epic in 3D and graded in Los Angeles and London.
Nakamura's colorist credits are expansive, to say the least. He's worked with David Fincher, Kathryn Bigelow, Martin Scorsese, Quentin Tarantino – and that's just the start. His career started out in telecine, doing transfers for Warner Bros. cartoons in the early '90s, eventually moving on to telecine for other shows, commercials and music videos. In 2002, Technicolor started doing digital intermediates, so Nakamura jumped into grading Panic Room for Fincher and Confessions of a Dangerous Mind for George Clooney.

The transition to digital wasn't difficult for Nakamura. "I had been color correcting film to video in telecine, which is a film to video transfer, for over a decade. And then I'm doing a digital intermediate, which is color correcting scanned film. Even in the rudimentary days of the digital video revolution, we still made the images look filmic. It's much easier today, but even back then, we would create our own look-up tables and color correct it so it looks like a log-ish image."


Ramses (Joel Edgerton) and his army pursue the fleeing Hebrews. Photo: Kerry Brown.


Having worked with director Ridley Scott a number of times in the past, how does that relationship work for you now?


Stephen Nakamura
Stephen Nakamura: Ridley is very, very involved with color. Every movie is different for him, so it's not like he has one particular palette and all his movies look like that. Every movie he does has a selectively different palette according to the content and the feel he wants to convey to his audience. We have a session that's pretty early on where we build looks for the digital intermediate and visual effects, and we start sending palettes. Eventually they may end up reshooting, or certain visual effects will need grades, or marketing starts coming up, so I need to know what his vision is early on so I know where to go if he's too busy. I can get the looks he wants to the people that need them.


How do you translate what a director wants out of the color if they can't specifically tell you?

When I look at a movie, I start correcting the movie without any sound or direction for the most part. I kind of do a first pass of what I feel the movie should be, based on the way it was shot – and the costume design, the editing. At that point, I should kind of know what the scene is about: if it's a dramatic scene, or an edge-of-your-seat scene. I put the color in that I think would look best for that. And usually when you work with a director a few times, you kind of know how they think.

Usually when I'm putting a grade on images, I'm not that far off right off the bat. Once you get into a relationship with a director and cinematographer, you know their aesthetic sensibilities. It's like somebody that cuts your hair, they know your taste, right? It's sort of the same thing. You know the people you work with, they know 'that guy knows what I like already.' And they have me change a bit, but the basic stuff is there. They don't have to micromanage every single shot.


You've done a lot of films where the color grade is more of a formal approach, like A Single Man. Exodus is considerably more realistic-looking, especially compared to other movies with similar content. What is your part in developing a formal or realistic color grade?

A scene from A Single Man


The cinematographer is very heavily involved with the look. So when a cinematographer decides with the director, 'I want to create this look for this movie', a bleach bypass or heavily saturated or contrasty look, that is getting conveyed all through production with the production designer, costume designer, and so on. I'm the last part of that chain. I'm trying to keep their vision and add my own feel to the color that can help them convey that message emotionally through the visuals.

I'm glad you mentioned [A Single Man]. I love all the movies I work on and all the stuff I do, but as far as color correction goes, I was really proud of the way that movie looked at the end. To create that kind of Life Magazine look from moving images? I had to experiment like crazy to come up with that look. That was about as difficult as a color correction gets. I used DaVinci tools on that too, very similar to what I use today.


Christian Bale as Moses leads the Egyptians into battle.


What do you think about a colorist having a style?

I feel like a colorist shouldn't have a style. If you have a style, that might be too rigid. Every director has a different vision for a particular movie, whether it's a drama or action or romantic comedy. Everyone has a different vision of how they want the movie to feel, so as a colorist, you need to be like a chameleon. If you have a look, you don't want to impose that on everyone as a blanket thing on every movie. You need to examine the images and have a conversation with the director and cinematographer, get the feel of the story, then build a look that can help them tell their story. It's completely project-by-project.


What is the process of grading a film shot in 3D? How many grades are there?

We grade the movie in 3D first, and we're grading it at 3.5 foot-lamberts (fL). Typically the standard is 4 fL, and [20th Century] FOX likes the ability to grade at 3.5 fL just in case theaters are a little darker. We make the picture look as good as we can at 3.5 fL, and if the theaters can go to 4 or 4.5 fL then it looks fantastic, even better. We do a 3.5 fL grade, then we do a 6 fL grade, and a 14 fL 2D pass. We're doing three transfers, but really the bulk of theaters will get the 3.5 fL (3D) and 14 fL (2D) passes.


Christian Bale (left) stars as Moses and Joel Edgerton stars as Ramses. Photo: Kerry Brown.


What tools inside DaVinci Resolve helped you accomplish this color grade?

There's a recently added highlight adjustment tool in the Color Match function that was extremely helpful on this movie. What it basically does is allow you to suppress the highlights to an extent, preserving details while we're brightening the picture to make it look appropriately bright for actors' faces at 3.5 fL in 3D. Typically we grade in 3D at 3.5 fL, and when you're grading people, the key side of the actors' faces is what needs to look good.


A shocking hailstorm plagues Ramses (Joel Edgerton).


What happens, especially with daylight exteriors, is a lot of the highlights get very blown out. The highlight tool kind of pulls it down. You can dial it down in a really subtle and beautiful way, and it looks really fantastic at 3.5 fL in 3D. That tool is amazing in 3D. It works differently than a typical highlight key or a soft clip. It's a really clever design. That tool by itself in Resolve 11 was a real game changer as far as how hard I can push the grade while keeping the feel of the highlights being saved and not looking compressed.
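Resolve's internal math isn't public, but the difference Nakamura describes - suppressing highlights rather than clipping them - can be illustrated with a generic soft-knee rolloff. The curve and parameters below are our own sketch, not the actual tool:

```python
import numpy as np

def hard_clip(x, white=1.0):
    """Naive approach: everything above `white` flattens to one value."""
    return np.minimum(x, white)

def soft_rolloff(x, knee=0.8, white=1.0):
    """Generic soft-knee highlight compression (not Resolve's actual math).
    Values below the knee pass through; values above are eased toward
    `white`, so bright detail is compressed rather than destroyed."""
    out = np.asarray(x, dtype=float).copy()
    hi = out > knee
    span = white - knee
    out[hi] = knee + span * np.tanh((out[hi] - knee) / span)
    return out

x = np.linspace(0.0, 2.0, 9)      # scene values, 1.0 = nominal white
print(hard_clip(x))               # blown out: all highlights identical
print(soft_rolloff(x))            # graded: highlights keep their separation
```

A hard clip makes every blown value identical; the rolloff keeps highlight separation, which is why the result reads as "saved" rather than compressed.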

DIGITAL DOMAIN VISUAL EFFECTS TOOLS MOVA AND DROP RECEIVE ACADEMY AWARDS®


Playa Vista, CA - January 14, 2015. The Academy of Motion Picture Arts and Sciences announced on Tuesday, January 13 its recipients for the Scientific and Technical Awards. Among the honorees were two technologies available only at Digital Domain: MOVA, the facial performance capture system, and Drop, the large-scale destruction toolkit. Each year the Academy honors the behind-the-scenes innovations that have demonstrated over time a proven track record of making significant contributions to, and improving the process of, making motion pictures. MOVA has set the standard for capturing actors’ performances in films such as “The Curious Case of Benjamin Button” - Brad Pitt's aging facial effects, which contributed to an Academy Award® for visual effects - “Harry Potter and the Deathly Hallows,” “Pirates of the Caribbean: On Stranger Tides,” “Transformers: Dark of the Moon” and “Tron.” Drop has been breaking ground helping visual effects artists create destruction in such films as “Tron,” “X-Men: Days of Future Past,” “Iron Man 3,” “Thor” and the Transformers franchise. This is the first time that two different technologies Digital Domain exclusively offers have received Academy Awards® in one year. Artists at Digital Domain have been awarded this honor four times in the past for FSIM - Fluid Simulation System, STORM, NUKE and Track.

“We could not be more thrilled about this tremendous honor,” said Daniel Seah, CEO of Digital Domain. “Our teams here work tirelessly and with a passion to find new and innovative ways to improve the industry standards. It is inspiring to be around this amazingly talented group of artists and we could not be more proud of them.”

Doug Roble, who has been with Digital Domain for 21 years and serves as its Director, Software Research and Development said, “The Academy has awarded two distinct pieces of technology at Digital Domain: MOVA, a tool developed to create believable digital actors and Drop, a tool that simulates destruction on a massive scale. It's a testament to the creativity and range of the software developers and artists at this company. We're constantly pushing the technology of visual effects forward, giving filmmakers new tools and audiences things they've never seen before.”

About Mova
Since humans are really, really good at recognizing faces, the challenge for the visual effects team is to create a computer generated character that the audience will believe. The visual effects team needs to be able to capture all the detail and subtle motion of an actor's face. MOVA was developed to do exactly that. The technology uses an array of many cameras and an advanced lighting rig to record all the details of an actor's performance. Special software uses this massive amount of data to create a high-resolution animated digital 3D mesh of the actor's face. The shape of the actor's face is reconstructed down to the smallest of wrinkles, and as a result all of the subtleties of the performance are preserved. Artists are then able to use this animated face to reproduce the actor in a different scene or transfer the actor's performance to a different character altogether.

About Drop
Drop is a digital destruction effects toolkit. Simply put, Drop gives the artists the tools needed to take a 3D model of a building or object, and blow it up. First, the building will shatter. Drop takes a 3D model and lets the artist break it into bits and pieces. The building still looks the same, but now it's built out of a bunch of chunks. Next, Drop knows how things move. The artist can specify where the explosion starts and Drop will figure out how all the chunks of the building will fly apart and bounce around during the explosion.
This is very computationally expensive when you think of how much is going on in an explosion. Drop works in conjunction with "Bullet," an open-source rigid-body simulation system, so that the simulation of the motion of the chunks is very, very fast. And there can be a huge number of chunks (a lot of smaller chunks add detail to the simulation that makes it look more real). Before Drop, breaking up large objects was extremely difficult. With Drop, it's now commonplace, giving artists and directors new tools to tell an exciting story.
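Drop itself is proprietary, but the Bullet engine it rides on is open source, and its Python bindings (pybullet) make the chunk-simulation stage easy to sketch. This toy version skips the fracture step - the "chunks" are just a pre-made grid of boxes - and only shows Bullet resolving their motion after an explosive kick:

```python
import pybullet as p

p.connect(p.DIRECT)                      # headless physics server
p.setGravity(0, 0, -9.81)

ground = p.createCollisionShape(p.GEOM_PLANE)
p.createMultiBody(baseMass=0, baseCollisionShapeIndex=ground)

# A 5x5 "wall" of chunks; a real pipeline would get these from fracturing.
chunk = p.createCollisionShape(p.GEOM_BOX, halfExtents=[0.1, 0.1, 0.1])
bodies = [
    p.createMultiBody(baseMass=1.0, baseCollisionShapeIndex=chunk,
                      basePosition=[0.22 * i, 0.0, 1.0 + 0.22 * j])
    for i in range(5) for j in range(5)
]

# Kick every chunk outward from an "explosion" near the wall's base.
for b in bodies:
    pos, _ = p.getBasePositionAndOrientation(b)
    p.applyExternalForce(b, -1, [pos[0] * 400, 0, 300], pos, p.WORLD_FRAME)

for _ in range(240):                     # simulate one second at 240 Hz
    p.stepSimulation()

print(p.getBasePositionAndOrientation(bodies[0]))
p.disconnect()
```

The per-chunk positions coming out of the solver are what an artist would then feed back to the renderer, frame by frame.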
 
About Digital Domain 3.0
Founded in 1993, Digital Domain (“original Digital Domain”) delivered innovative visuals for more than 100 movies including “Iron Man 3”, the “Transformers” trilogy, “TRON: Legacy” and “Titanic.” Its artists have earned multiple Academy Awards®. A creative driving force in media applications, original Digital Domain brought its artistry to thousands of commercial, video game and music video productions. The original Digital Domain also created digital humans for concert performances and co-produced the feature film “Ender’s Game”. From facilities in California and Vancouver, British Columbia, Canada, including its own state-of-the-art virtual production studio, the Digital Domain 3.0 Group (consisting of Digital Domain 3.0, Inc., Digital Domain Productions 3.0 (BC), Ltd. and Mothership Media, Inc.) enables the creation of extraordinary visual images in traditional entertainment and advertising. www.digitaldomain.com

Wednesday, 22 October 2014

Adventures in 6K with Jackson, Wyoming's Brain Farm Cinema



Photo Credit John Schnack


Nestled in the wilderness of Jackson, Wyoming, on the edge of Grand Teton National Park, is the last place you might expect to find a high-end production house. But staffed with a small team of outdoor sports enthusiasts and adventure-seekers, and fortified with the latest in 5K and 6K cameras and technology, the Jackson Hole valley becomes a perfectly logical home for Brain Farm Digital Cinema, known for work such as the snowboarding film 'The Art of Flight', filled with the craziest of snow sports stunts by snowboarder Travis Rice, all captured and cut in house.


Travis Rice. Photo by Scott Serfas.


Whether it's a ski jump flyby or getting up close and personal with Jackson's native wildlife, getting the shot comes at great cost to Brain Farm, financially and mentally, especially when you put 6K acquisition into the mix. Pushing at the edges of technology to make great films isn't without its growing pains. But by developing strong relationships with companies like HP and experimenting with workflow changes, Brain Farm is beginning to shift its approach to dealing with its massive amounts of footage.

Shooting these athletes in the field is sometimes about luck, and sometimes about planning. "There is quite a bit of choreography between the team and athletes," said Brain Farm's head of production Chad Jackson. And there would have to be, considering teams are dragging equipment through miles of terrain, sometimes even by snowmobile. "These aren't Hollywood budgets. We do a lot more with less."


Brain Farm's head of production Chad Jackson


Whether the team's job is to execute a planned shot or be there to capture the moment when it happens, the talent behind the cameras is important, because Brain Farm isn't shooting on just anything. Their arsenal includes the Red Dragon, Phantom Flex and Phantom Flex 4K, and Shotover F1, among other tools. Brain Farm also has specialty vehicles for traveling and shooting on the road, like a customized Ford F250 with a Shotover F1 camera, or from the air, with various unmanned aerial cameras. Fujinon ultra-wide lenses, Arri Ultra and Master Primes, and Canon cinema lenses are within the team's rotation.


Photo Credit Danny Zaplac


Besides the expected challenges of shooting on a snow-covered mountain, media management becomes a concern, Jackson explains. Many of the cameras have a special process for offloading media and shoot a lot of big files, so manpower and hard drives are a must in the field.


Post production supervisor Danny Holland


Back in the climate-controlled Brain Farm headquarters, post production supervisor Danny Holland keeps things running in an offline-to-online workflow. All the media acquired in the field is transcoded to ProRes proxies and reconformed at the end – or that's how it's been so far. Holland says, "Things are changing so quickly right now. I think there's a thought process like an old crotchety IT guy: 'here's what works, and that's what we're sticking with, because we know it'll work.' And trying to be open to change and embracing new stuff as fast as you can has its advantages."
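Brain Farm's exact transcode setup isn't described, but the offline/online pattern itself is standard: generate lightweight ProRes proxies for the edit, then relink to the camera originals for the finish. A generic sketch using ffmpeg's prores_ks encoder (paths and settings are our illustration, not Brain Farm's pipeline):

```python
import subprocess
from pathlib import Path

def make_proxy(src: Path, dst_dir: Path) -> Path:
    """Transcode a camera original to a ProRes 422 Proxy for offline editing.

    A generic sketch of the offline-to-online pattern described above;
    assumes ffmpeg (with the prores_ks encoder) is on the PATH.
    """
    dst = dst_dir / (src.stem + "_proxy.mov")
    subprocess.run([
        "ffmpeg", "-i", str(src),
        "-c:v", "prores_ks", "-profile:v", "proxy",  # lightweight proxy profile
        "-vf", "scale=1920:-2",                      # downscale for the edit
        "-c:a", "copy",                              # leave audio untouched
        str(dst),
    ], check=True)
    return dst

# Usage: cut offline with the proxies, then conform back to the originals.
# make_proxy(Path("shoot_day1/clip001.mov"), Path("proxies"))
```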



The edit bay


Brain Farm has been a Mac-centric facility until recently, when HP's Z820 Workstation was introduced to the mix. Holland, a long-time Apple user, was skeptical. "It really was like this alien in our environment for me," he explains, only half-joking. One of the main concerns? How to actually integrate the Windows machine into the facility's ethernet-based shared storage. Turns out it wasn't so hard: an update to Mavericks, a switch from AFP to SMB for connecting to the server, and a little intervention from Maxx Digital's Bob Zelin, and everything was working as expected.

Another question for Apple users moving to Windows: what about my ProRes? For the last several years, ProRes has come to be at the center of acquisition, editorial, delivery and archival. It's a comfortable and ubiquitous codec – for Macs. For Holland, the issue was a little confusing at first: he jumped to Google and got no good answers. Then he connected with Open Drives CTO Jeff Brue, who had been experiencing a similar issue. Brue recommended a plug-in from Miraizon. Holland commented, "For the most part, the ProRes codec from Miraizon is pretty much 'plug and play.' Once installed, it just shows up in the drop-down menu when exporting. It's pretty simple." He noted that DaVinci Resolve doesn't currently support the plug-in, but his other apps, including Adobe Creative Cloud, are working smoothly.


Photo Credit Cameron Strand


"Consistency tends to keep things running smoothly in an offline to online workflow. Starting this integration with 80% of our media being in ProRes, it felt like the right decision to keep working with it as a mezzanine codec," Holland added. And while the solution is working well for Brain Farm, Holland hasn't dismissed the idea of building workflows around a different codec, like GoPro Cineform or Avid's newly announced DNxHR, which was previously not a contender since it was limited to HD resolutions.

And for a post supervisor, that's pretty much the extent of the technical difficulties. A little bit of codec questioning and some disk format concerns, and the integration has been happily unremarkable except for what it's added to Brain Farm's power. The first test came for Holland on a massive 4K conform in DaVinci Resolve. "It was a life saver for me to be able to work at that resolution and grade in Resolve, and I don't think any of the Mac Pros I had in house could have done that. Having the power of the Z820 was vital to the success of that work in 4K. That turned me, and I became a lot more open to the performance we got there," Holland explained.


Photo Credit Greg Wheeler


[For the sake of comparing Apples to ... non-Apples, Brain Farm's in-house Mac Pros are the legacy style: 2 x 2.26GHz quad-core with 52GB of RAM, a 256GB SSD boot drive, and a GTX 580 with 3GB of RAM, plus a Red Rocket card. The HP Z820, now the previous generation in the Z workstations, was loaded with dual Intel Xeon E5-2680 2.80GHz processors, 64GB of RAM, four 3TB drives in RAID 5, a Z Turbo Drive 256, an NVIDIA K5000 and a Thunderbolt card. If you're the type that likes to keep score at home with all this.]

Windows being the biggest hurdle for getting historically Mac people to make the jump, Holland remarked that the adjustment wasn't as profound as he originally expected. "The software is where the creative aspects are happening, so if that's working okay and functioning as it should, the power and performance just allows for us to work at a high resolution with less difficulty. So whether it's rendering something out in a matter of minutes versus 30 minutes, that can make all the difference when you're trying to upload something to send to a client in time. Those little 30 minute renders for a five minute video can add up quickly. The speed gains have been nice because we can keep working and not have to worry about the time lag to deliver or watch something."


Photo Credit John Rodosky


With the kinks worked out and the staff warmed up to the HP Workstation, what's next for Brain Farm now that they have more power on their side? For one thing, they're going to keep doing what they're doing with a little less worry. Jackson says, "We're less hesitant to shoot a lot in the field now that we have a faster, more powerful machine to transcode." And with the transcode bottleneck alleviated, Holland suggests native editing may be on the horizon as the team contemplates multiple 4K finishes in the coming months, an especially promising outlook for Brain Farm considering the power behind the newly released Z840 Workstations.

"I think everyone has to evaluate their budget and goals as they make decisions [about their system needs.] As we evaluate and look to the future and try to find a tool that could grow with us, I think we've got a nice solution with the HP Workstation....Working with [them] has been really nice because we have a dialogue for trying to solve things, and we get the space and expandability in which to do that."

Technology aside, Jackson and Holland are happy to continue shooting and cutting extreme sports in extreme places whatever the trade-offs. "I feel very fortunate. It could be drier content for sure," Holland laughed. Jackson added, "It's lots of work. Sometimes you ask if it's worth it, but then there's always a pay off."


This photo and top title graphic credit Ryan Sheets

HP's New Workstations: Powerful, Expandable, and Compatible



CreativeCOW presents HP's New Workstations: Powerful, Expandable, and Compatible -- What Computer Should I Buy? Review

Expanding its Z-series line, HP announced major updates to desktop and mobile workstations focusing on expandability, reliability and compatibility – and specifically targeting disappointed Mac users who may not have considered the PC world before.

"The resounding word among media and entertainment customers is bottleneck." said Jim Zafarana, Vice President and General Manager, Commercial Solutions, PC Global Business Unit. With the major file sizes and processing needs of 4K media and beyond, and the demand for native editing, the requirements of workstations are pretty hefty. And HP is responding: the latest Intel Xeon processors, multicore updates, and many options to keep expanding as the media needs keep exploding.



HP Z840, Z640 and Z440 workstations


DESKTOP WORKSTATIONS
The Z840 workstation is HP's most powerful solution for major IT uses and visual effects needs, with support for two Intel Xeon processors for up to 26 total processor cores. It has 7 PCIe slots, up to 10 internal drive bays, and 16 memory slots supporting up to 2TB of memory. Yeah, terabytes of memory – to clarify, right now the Z840 can support 1TB, but as soon as 128GB DIMMs are available users can take advantage of 2TB. In addition to NVIDIA graphics cards, HP has added the option for AMD cards so users can have more options.

With the massive expandability and potential power in the Z840 box, it's definitely a workstation for the power user. Or as Mike Diehl, Worldwide Product Manager (HP Z Ultimate Platforms) put it, "it's also good for those of you that like to think you're a power user – we love you too."

The Z640 workstation is a versatile solution for many in the post-production industry, with a dual Xeon processor configuration and plenty of scalability for the future including an NVIDIA or AMD graphics card option. The 8 memory DIMMs and up to 4 internal hard drives make it a popular workstation for creatives.

And the Z440 is most commonly the consumer-level workstation. Interestingly, HP keeps typically outdated connectivity like PS/2 on board for legacy customers who depend on keeping certain older technologies around for mission-critical work. Working to keep users happy, they balance the latest workstation technologies around some of the oldest.

All these workstations feature a tool-free chassis with green "touch points" that show users how to remove parts, so a task like inserting RAM or switching a hard drive can be a simple fix (or not – security screws are an option too). They're also quiet, carefully designed for industrial uses, and can be rackmounted. Each workstation features plenty of connectivity from USB to Thunderbolt. And handles on top, because they're heavy when you load them up.




LEAVING MACS BEHIND
There are some things I've never cared to write about in my professional life, and the older-than-the-internet debate of Mac vs. PC was definitely one of them. What more is there to say? But thanks to Apple, it's not only become a legitimate and important aspect of the conversation, it's become a fascinating way of examining the evolution of the post-production industry. Some creative pros fully committed to jumping from Apple's ship, while others decided to wait it out and see what the company has to offer. After all, FCP7 didn't stop working. But as older Mac Pro towers begin to show their age, more hesitant users are starting to look at what's next. Is the new Mac Pro the solution?

"We believe the [older] Mac Pro install base is about 1.2 million strong right now, and they're all looking for the next thing. Is the new Mac Pro it? It might be for some but...what used to be very expandable and very secure with a choice of graphics is now something completely different," said Jeff Wood, Vice President Worldwide Product Management Commercial Solutions Business Unit.

He added that the previous iteration of the Mac Pro was a huge part of the high end workstation market and "with 4K data coming in, multiple streams of audio on the timeline – the [new] Mac Pro is going to be very under-powered. We feel there's a big opportunity."



HP Z27i 27 inch IPS Display


So much versatility with these workstations: powerful GPUs, flexibility with processors, expandable in every way. What's keeping Mac users hesitant? Wood put it simply: "Windows."

"It's scary, but we're saving time." Danny Holland is the post-production supervisor at Brain Farm Cinema, the production facility responsible for the snowboarding film "The Art of Flight" and other extreme spots films with Red Bull Media. Currently working on finishing three films in 4K, Mac-user-for-life-Holland began integrating Z820 workstations into his otherwise Apple post-production facility. He agreed that Windows was a major challenge. It's difficult to be considered to go-to expert in a facility when you don't feel like an expert yourself.

But any difficulty has been worth the gains, especially in finishing the films in 4K on Resolve. Using platform agnostic software like Adobe Premiere and Resolve has also eased the transition for Holland: "What we use to edit and finish with, that stuff is basically the same. It doesn't change. They're tools to help us do the creative work."


Holland is looking forward to incorporating more Z workstations into Brain Farm's facility. He's especially interested in how the internal drive bays on the new Z840 might add to their ability to get work done with large amounts of media more quickly if SSDs are a part of the equation. In the meantime, he's happy with what HP is providing. "I think we've found a good solution [in our workstation] – it's not super maxed out but it's pretty high end and we'll see how it goes as we continue to test things out."


UPDATED MOBILE WORKSTATIONS
HP's Director of Worldwide Product Management (HP Z Workstations) Josh Peterson commented on the industrial durability of the mobile workstations while standing atop a ZBook 17: "Are there any MacBook Pros out there I could demonstrate on?"

Built to military spec, the new ZBook 15 and 17 both feature i5 and i7 dual and quad core Intel processors, up to 32GB of RAM and NVIDIA or AMD graphics cards. Both mobile workstations have maximum connectivity, with Thunderbolt 2, USB 3 and DisplayPort available. And like the desktops, the laptops feature tool-free removals. One difference between the workstations: the ZBook 15 has a QHD display panel, while the 17 has the option for a DreamColor display for color accuracy.




MANAGING TRANSITIONS EFFECTIVELY
HP's Fort Collins, Colorado location, where these workstations were unveiled, has a genial atmosphere that's more like going to someone's home than stepping into a facility that sells more desktop workstations than anyone else.

As I took a tour of their facility, I got the sense that work is actually done here. It may seem a silly observation, but I've walked around a company before and got the feeling I was seeing a carefully crafted facade more than anything else.



Inside the HP Hardware lab at Fort Collins, Colorado


At HP, the people were more in focus than the products they were carefully designing, testing and evaluating. In contrast to the kind of closed-off architecture that a company like Apple has become known for, this level of access and amiability could become a key factor in winning over some of that large, dissatisfied user base.



At HP's R&D lab


HP's Z workstations are expected to be available in October. Estimated U.S. pricing starts at $1,299 for Z440, $1,759 for Z640 and $2,399 for Z840.

First Naval Air Reserve Unit Descends Upon AlphaDogs

(Burbank, California--October 21, 2014) The humble beginnings of the United States military can be easily forgotten in today’s modern world where advanced military technology and drone strikes are becoming more prevalent and changing the face of warfare as we know it.



Resources were scarce at the beginning of World War I, with U.S. Naval Aviation consisting of just 48 officers and 239 enlisted men who had very limited aviation experience. Not to mention that there were only 54 aircraft available, inadequately equipped for warfare. It was during this time, in 1916, that “The First Yale Unit” - otherwise known as “The Millionaires’ Unit” - was formed. Composed of 29 Yale college students, these courageous men taught themselves how to fly aircraft such as the Sopwith Camel, SPAD VII, the Curtiss Model F flying boat and more. The students became what is now known as the first naval air reserve unit. In the documentary The Millionaires Unit: U.S. Naval Aviators In The First World War, audiences are taken on a journey back in time into the fascinating story of how the militia airmen from Yale changed the face of naval aviation forever.

Seven years in the making and sprinting to make the deadline for the EAA AirVenture Air Show in Oshkosh, WI, which was less than a month away, filmmakers Ron King and Darroch Greer trusted AlphaDogs to deliver the vision they had for the film. “When you’re bringing your baby to be cared for by strangers, it takes a sucked-in-gut and a leap of faith to sign on with a new post-house. You want the post crew to take it to heart and treat it as tenderly and thoughtfully as you would yourself. The staff confidently said they could accommodate us and their energy never flagged.”



The Millionaires Unit consisted of a wide range of source footage and mixed formats, each requiring a different look and feel for the story. AlphaDogs colorist Sean Stack worked one-on-one with the filmmakers in setting the look for the film. “Sean’s curiosity and attention to the story were thoughtful and encouraging,” said Greer. “Most of our WWI footage was shot in New Zealand with RED Epic cameras using the same color palette, yet the footage had to represent the pilots training and four separate battle/dogfight sequences. Sean was able to give them all their own verisimilitude. New Zealand is beautiful and hardly looks war-torn, yet with desaturation and special attention to black levels, Sean gave several of the scenes an ominous and deadly feel.” In addition, Stack gave vintage photographs and recently shot HD interviews an overall balanced look to fit seamlessly into the story without becoming a distraction to viewers. Dynamic range was then added, revealing detail in darker areas that would not otherwise be noticed, including the emotion on the pilots’ faces. Stack comments, “Aerial photography and vintage airplanes along with real stories of courageous young heroes. It's true American history, and it's great to take part in telling the stories.”

Due to the high amount of low-resolution archival footage, Stack decided it would be beneficial to work creatively outside of the box in an unlocked timeline to expedite the post-production workflow. “Working in an unlocked timeline is not typical. It was quite a feat of organization. We added about a dozen partial timelines together, which allowed us to get all the graded media assembled,” said Stack. “It’s a bit unusual to move forward without a completely locked full resolution timeline, but we did it anyway. I couldn’t have done it without the help of the conform editor and the director working together in a collaborative environment to keep track of the footage. It was a successful juggling act, and we finished in time for the airshow.”



Using a DaVinci Resolve workflow, Stack was able to work collaboratively with AlphaDogs' graphic design team in creating visual effects for the film. Select visual effects shots were created for the aerial dogfighting scenes and for an extended night bombing sequence, where Stack applied a very dark blue, grainy look, giving a feel of “animated realism” to the scene. The bombing sequence required POV shots of enemy searchlights, timed to a first-person account of the mission. Senior Animator & Designer Russell Frazier constructed these shots in After Effects from archival photographs and cloud footage, fabricating the searchlight beams and tracer fire. “Because the visuals illustrated a dramatic and personal account of the battle, our goal was to match the mood and impact of the author’s recollection,” said Frazier.

Another scene Frazier created required billowing smoke to be added to new footage of a vintage aircraft to simulate an emergency landing. The extensive movement of both the plane and the camera required separate tracks for each. The smoke element was created in Trapcode Particular, including the turbulence and wind effects. Greer comments, “The fact that the VFX team was working without visual references and only with written descriptions of WWI-era anti-aircraft fire and searchlights was impressive. In particular, the 3-D depth that Russell gave the searchlights from the POV of the planes was thrilling.”



AlphaDogs VP of Design Sean Williams created additional VFX scenes, giving the story added depth and vigor. “I was excited by the challenge of recreating WWI aerial dogfighting. By blending subtle VFX with beautifully shot footage of vintage planes, I hoped to heighten the drama of an already thrilling story,” said Williams. “Having almost no archival sources of period tracer fire, the evocative written words of the actual pilots were instrumental in setting the look and feel. I'm very proud to have made a contribution to such a captivating story.”

The collaborative approach between the filmmakers and AlphaDogs during the post process was instrumental in delivering a quality product that everyone was happy with. “It was fun plotting the workflow with the color and VFX teams and making sure the effects would play against the color palette. The work - shot-by-shot - continually exceeded expectations. The entire crew was first-rate and made the birthing process feel safe and exciting. Even after we dropped the ball in coordinating our workflow, the team pitched in, picked up the pieces, and attended to each problem and detail as they were presented. More importantly, they made us feel proud of our work. Indeed, we were able to walk out of AlphaDogs with smiles and pats on the back. I don’t think we could have made a better choice.”

The Millionaires' Unit is produced by Humanus Documentary Films with producers Ron King, Darroch Greer, Harry Davison, and Mike Davison. For more information visit http://www.millionnairesunit.org

About AlphaDogs:
Founded in 2002, AlphaDogs is an independently owned, full-service post-production facility located in the center of Burbank’s media district. AlphaDogs’ gifted team brings a combination of creative talent and technical expertise, paying extra attention to detail in delivering projects with a personal touch. State-of-the-art editing bays, color correction, audio mixing, visual effects, production offices and equipment rentals are available. To learn more, visit http://www.alphadogs.tv

Sunday, 5 October 2014

Running the VFX for The Maze Runner











CreativeCOW presents Running the VFX for The Maze Runner -- TV & Movie Appreciation Editorial



The Maze Runner poster
Already a popular young adult series that has spawned a trilogy and a prequel, The Maze Runner opens with a mystery: its central character awakens with no memory, in a maze whose 100-foot walls keep rearranging themselves. The maze is also populated by Grievers, slug-like creatures with six mechanical legs and venomous scorpion tails. Unsure of why he is there, or how he can get out, Thomas meets other young people caught in the maze, known as Runners, and finds that mysteries are piling up far more quickly than answers are arriving.

The Maze Runner opened with a gross of $81 million worldwide in its first three days, with some of the highest critical acclaim and audience scores this side of The Hunger Games. While YA dystopias are just now showing up in significant numbers onscreen, the literary genre has been incredibly vibrant for generations. The current wave was certainly energized in 2008 by the publication of the novel The Hunger Games, but the genre's roots go back another 50 years beyond that. As one of the most popular genres of young adult literature – even more so than vampires in love or wizards – it's only reasonable for Hollywood to tap into sources that stretch a bit deeper and wider than comics, reboots and resurrected TV series. Films based on books! Imagine that.

Some of the strength of The Maze Runner's opening is in fact due to the intensity of its readers' enthusiasm. One of those fans is the daughter of Method Studios' Sue Rowe, a VFX supervisor whose films include X-Men: The Last Stand, The Golden Compass (which won a VFX Oscar), Die Another Day, and John Carter. [Sue spoke to us about her work on that one, in a terrific story here.] In fact, Sue's daughter felt so strongly about it that Sue got swept up in it, and very simply campaigned 20th Century Fox for the job until Method was awarded the contract as the exclusive VFX house for the film.
Decayed, ivy-covered walls were at the heart of the "character" of the maze.


Method's team of 170 worked for 10 months to create 150 character shots and 380 environment shots. Environments have been one of Method's strong suits, but character animation is new. To build the company's strength in that area, they added Erik de Boer, Academy Award winner for his work with Rhythm & Hues on Life of Pi, and former Weta whiz James Jacobs, an Academy Sci-Tech Award winner for technology used for the Goblin King in The Hobbit: An Unexpected Journey.
The creatures they set themselves to work on were the Grievers, who needed to be agile and threatening as they roamed the maze at night, attacking anyone they came across.


At the heart of The Maze Runner is the maze itself, whose 100-foot concrete walls were rigged to move at some unknown point in the past, when the maze was created. The walls have decayed over the years, and parts of them break off as the walls shift. The walls are also covered with ivy, which presented Method with some particular challenges: creating complex geometries that are also deeply organic. For both the creatures and the atmospheric elements, Method developed its own software to facilitate the artists' experience, an emphasis that shines through in work that we, frankly, think looks quite impressive.
The walls are also covered with ivy, which presented Method with some particular challenges for creating complex geometries that are also deeply organic.


As we spoke with Sue and Animation Director Erik de Boer, we found them both to be terrific storytellers themselves. We were happy to get out of the way, and let them fill in the details.


Left, Animation Director Erik de Boer; Right, Method Studios' VFX Supervisor Sue Rowe.


My daughter read The Maze Runner, and she loved it. I heard that there was a film being made by Fox, so a couple colleagues here at Method just hopped on a plane down to see Fox Vice President of Visual Effects Joe Conmy to say, "We really want to work on The Maze Runner."

Shortly after that, I met up with Wes Ball, the director. He's such an unassuming guy, but he's full of energy and incredibly articulate. The way he describes things, you just get completely engaged. Within five minutes of meeting him, he was describing the opening shot of the movie, and he's bouncing around the room, acting it out, giving me sound effects and everything. I was like, "Ah! I'm in. I so want this show." So I didn't let it go until they gave it to us.


Wes Ball (right) directs Dylan O'Brien (left) on the set of THE MAZE RUNNER. Photo by Ben Rothstein.


Because Wes comes from an animation background, he was able to give us great notes. We would record the sessions with him because the noises he made and his head movements were very much in line with what he wanted the Griever character to do.

So he was pulled on board for the character, but also the environments, because he could actually do his own previs. Wes likes to work using software called Modo, and he would mock up previs scenes with cameras. I looked at his previs, and I thought, "Okay, so I need a camera 100 feet high, that means I need a Technocrane in an environment with a 50-foot reach."


Final comp and before shots.

We only had two days from the shoot to get the footage we needed, but because we planned really well, we got every angle that we needed. That's something that I'm asked to do more and more on feature films because the shoot times are so short.

We talked to Wes every day, which is the way to do it, because he's based in LA, down there with editorial, and we are in Vancouver. He would do his mock-ups in Modo, and then we'd talk about them the next day in the cineSync remote collaboration and review system. We also set up a Skype camera so that we could see the way he acted out the scenes.


THE MAZE AS A CHARACTER
As the VFX Supervisor, I was on location in the glade, which is sort of the safe area right in the center of the maze. We built a wall that was about 16 feet high and about 40 feet wide – just a simple facade, but we got a lot of shots for free in front of that, and lots of dialog shots.



The Maze had 100-foot concrete walls that were rigged to move to add a challenge to the game.


That was a pretty tricky shot in itself because the trees in that big field were already 100 feet high. So we had to take out those 100-foot-high trees and replace them with a 120-foot wall.

The first time that the character Thomas is in the maze, we needed to pull the camera back for a wide 360, so he's surrounded by the walls that are covered with ivy as the camera is moving. We approached it in a modular fashion. Especially with 3D, you have to be economical in how you build things. Render engines fall over and die if there's too much geometry.



The first time that the character Thomas is in the maze, Method needed to pull the camera back for a wide 360, so he's surrounded by the walls that are covered with ivy as the camera is moving.


We took a lot of really great reference photos from the set pieces, then we used a LiDAR scanner to do 3D laser imaging to help us glean as much as we could from the actual location and the actual set build. And then we went about building those. We wound up with 15 flat surface walls, and then five kind of different corners, and then some different style cracks, and so on.

But the biggest thing was the ivy on the walls, which I knew was going to be a technical challenge: thousands of tiny leaves, and building them in an organic way. I got a couple of guys on my team, effects technical director Harsh Mistry and visual effects artist Kuba Roth, to look into some software to build ivy and grow vines. In the end we wrote our own proprietary software. I know everyone says that, but we really did!

I loved it because it was a really creative tool. We didn't have too many varieties of walls, but then each time the ivy was grown on it, it would clearly be different than on the other walls, so it was always going to look like a unique wall. We even wrote in tools so that I could vary the thickness of the base of the ivy, and how many branches it would split into.

One of the cool touches that they gave me was that we could rotate the ivy leaves to always face the sun as it moved across the scene. It allowed us to catch a nice bit of light and add some randomness to the shot.
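Method's ivy tool is proprietary, but the sun-facing behavior Rowe describes comes down to re-orienting each leaf's normal toward the sun vector as it moves. A toy sketch of that one feature (data and names our own):

```python
import numpy as np

def face_sun(leaf_normals, sun_dir):
    """Find the axis/angle rotation carrying each leaf normal onto the sun
    direction - a toy version of the sun-facing feature described above."""
    sun = sun_dir / np.linalg.norm(sun_dir)
    rotations = []
    for n in leaf_normals:
        n = n / np.linalg.norm(n)
        axis = np.cross(n, sun)                               # rotation axis
        angle = np.arccos(np.clip(np.dot(n, sun), -1.0, 1.0))  # rotation angle
        rotations.append((axis, angle))    # hand off to the leaf's transform
    return rotations

# Two leaves on a wall; the sun sits low in the west for golden hour.
leaves = np.array([[0.0, 0.0, 1.0], [0.3, 0.1, 0.95]])
print(face_sun(leaves, np.array([-1.0, 0.0, 0.2])))
```

Jittering that angle per leaf is one easy way to get the randomness Rowe mentions without every leaf locking to the same orientation.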


Method could rotate the ivy leaves to always face the sun as it moved across the scene.


GETTING THE BUGS IN
The thing that you need to think about for the computer-generated objects you're creating is how they would really look if they were filmed. It doesn't matter that it's digital film now. It still has a certain look, and it has to do with adding motion blur, adding film grain, and matching the black and the white points on your CG to make sure it fits into your live action plates.
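As a tiny illustration of one of those steps - matching the CG layer's black and white points to the plate - here's a generic sketch with numpy (the measured plate values are invented):

```python
import numpy as np

def match_levels(cg, plate_black, plate_white):
    """Remap a CG layer's levels so its blacks and whites sit where the
    live-action plate's do. A generic compositing sketch, not any show's
    actual pipeline."""
    cg_black, cg_white = cg.min(), cg.max()
    normalized = (cg - cg_black) / (cg_white - cg_black)   # CG into 0..1
    return plate_black + normalized * (plate_white - plate_black)

cg_layer = np.array([0.0, 0.5, 1.0])        # pristine CG: pure black/white
print(match_levels(cg_layer, 0.04, 0.92))   # lifted into the plate's range
```

Pure CG blacks almost never match a filmed plate, whose blacks are lifted by flare and ambient bounce; pinning both ends to measured plate values is what lets the render sit in the photography.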

Of course, we had to build the floor of this giant maze as well. The floor had lots of greenery and tufts of grass, and all the little things that you add when you get your head around building a totally CG environment that feels real.

For example, when I was on the location in this field in Louisiana where they were shooting, there were bugs everywhere, snakes – the whole caboodle. It was a very uncomfortable place to shoot. When I came back to the office and we'd be looking at the live action, there'd be all these tiny bugs kind of flying around. And then when you pan over onto a completely CG wall, they were missing, right?

So we made a whole library of CG bugs, which my producer still laughs at me about because it wasn't in the original brief, but I knew it would be subtle stuff like that that just would keep it all alive. When we showed Wes, he loved it, but our joke was, at the end of every daily session he would say, "That was great, that was great, add more bugs!"


GOLDEN HOUR
One of the challenges of building a CG maze whose giant walls keep moving is that you need to be able to crash zoom into a close-up wall, and then another shot is a wide establisher, so we built a number of levels of detail for them. We call them LODs. We were then able to judge that if a shot was mid-distance, we could use a lower level of detail, and then the foreground would be high res. These are the kinds of judgments that allowed us to keep our costs as low as possible, while delivering shots that looked much more expensive.
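The LOD judgment Rowe describes is easy to picture as a lookup from camera distance to asset; the thresholds and asset names below are invented for illustration:

```python
# A toy version of the level-of-detail selection described above: pick the
# cheapest wall asset whose detail still holds up at the shot's distance.
LODS = [
    (10.0, "wall_hero"),             # crash zooms and close-ups: full-res
    (50.0, "wall_mid"),              # mid-distance coverage
    (float("inf"), "wall_distant"),  # wide establishers: lightest asset
]

def pick_lod(distance_to_camera: float) -> str:
    """Return the first (cheapest acceptable) LOD for a given distance."""
    for max_dist, asset in LODS:
        if distance_to_camera <= max_dist:
            return asset
    return LODS[-1][1]

for d in (3.0, 30.0, 400.0):
    print(f"{d:6.1f} units -> {pick_lod(d)}")
```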


(From left) Teresa (Kaya Scodelario), Thomas (Dylan O'Brien), Alby (Aml Ameen) and Jeff (Jacob Latimore) react to a shocking development in the Glade.


But I think the artistry on the walls was in making them feel like they had scale. The cool thing was that the story dictated that the maze doors open at dawn and close at dusk, and both those times of day, lighting-wise, are the most peaceful times. It's golden hour. So we were able to employ soft, raking light, which in a lot of cases broke up the huge scale of the wall. Wherever we could, you'll see shots where we pull wide and there's a shaft of light going down the side of the wall. It gave a shape to the scale, and caught the subtleties in the texture on the wall. That was a good thing to do because, quite often, poor CG lacks detail and texture. I wanted to do my very best to show that off, because all the detail was in fact in the CG. It turns out that golden hour is the best time to shoot, and that's true for digital worlds as well.


THE MAZE RUNNER

Dylan O'Brien stars as Thomas in THE MAZE RUNNER. ™ and © 2014 Twentieth Century Fox Film Corporation. All rights reserved.


WALLS AND FLOORS
It was a 9-week shoot, but everything for the moving maze was shot in a car park over two days, which is a ridiculously short amount of time. It was an unkempt car park, so there was a dirt floor and some greenery. And then we had about 180 feet of blue screen, which was about 40 feet high. It was held up by shipping containers, which I'd done for a previous show back in the UK, on John Carter, because the containers can stand up to all weather and don't get blown about in the wind.



The very impressive final maze walls. There were actual tufts of grass in the car park concrete where some of the filming took place – the actual landscape had a deep effect on the final VFX shots.


The floor was really important, because with any CG, contact with the ground is the hardest part to make look real. If I had a real floor on location, I could put the walls in behind the actors, but that wasn't possible. The idea is that this maze has been around for a number of years, and when the kids are running, this floor that was rigged long ago is suddenly called into action. The kids have basically set off an alarm.


Young men, including Thomas (Dylan O'Brien, center, pointing) trapped at an undisclosed locale, investigate the mysteries of a massive maze.


We needed to crack the ground and shoot up these geysers of dust to show how the ground was crumbling under them. The soil itself was very light, so first there was this fine powder shooting up – a very fine particulate that we rendered as Houdini particles. Then a layer of grit, then rubble, and then large chunks of ground. By combining multiple layers, we built up the kind of complexity that you see in nature.
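
As a rough illustration of that layering (not the actual Houdini network, which isn't public), here is a Python sketch: each debris layer gets its own size range and particle count, with heavier debris receiving less upward kick, so the layers separate naturally once gravity and drag take over. All names and values are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

LAYERS = [                         # (name, radius range in metres, particle count)
    ("powder", (0.001, 0.004), 50_000),
    ("grit",   (0.005, 0.020), 10_000),
    ("rubble", (0.050, 0.200),  2_000),
    ("chunks", (0.300, 1.000),    200),
]

def emit_geyser(origin):
    """Emit one burst per layer; heavier debris gets less upward kick."""
    burst = {}
    for name, (r_min, r_max), count in LAYERS:
        radius = rng.uniform(r_min, r_max, count)
        up = rng.uniform(2.0, 12.0, count) * (r_min / radius) ** 0.5
        velocity = np.column_stack([rng.normal(0.0, 1.0, count),
                                    up,
                                    rng.normal(0.0, 1.0, count)])
        burst[name] = dict(position=np.tile(origin, (count, 1)),
                           velocity=velocity,
                           radius=radius)
    return burst

geyser = emit_geyser(np.array([0.0, 0.0, 0.0]))
```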

It was only about 15-20 shots, but we needed to sell this whole "tsunami of wall" flying behind the kids.


Minho (Ki Hong Lee, left) and Thomas (Dylan O'Brien, right) search for a way out of the maze.


There were also repetitions in the modeling of the floor, but we were able to hide them with lots of debris. If you saw the befores and afters, it's really just a floor, blue screen, and tracking markers. We added everything else – including the bugs! There were markers for the locations of these huge metal doors, but the set piece itself was something like 6 feet by 12 feet, and needed to be completely replaced in post.

We realized that in order to really give this some tension and some fear, we needed to make sure that it wasn't just a couple of doors opening and walls shifting. There are a couple of wide shots, but the shots are mostly quite close, just overhead or just over the kids' shoulders while they're running. There was still enough room in the frame for a lot of grit and dust, to add extra tension and panic.


A group of boys known as the Gladers are shocked to discover a girl, Teresa (Kaya Scodelario), in their midst.

A-MAZE-MENT
I had such a great time on this show! It was a great team of people.

Method had actually not been known for its character work, but that was part of my plan when I joined Method: to bring together a creature team. We got Erik de Boer from Rhythm & Hues, where he'd won an Oscar for his work on Life of Pi, and also James Jacobs, who was a sci-tech winner for his work at WETA. [Ed. note: James shared a Scientific & Technical Achievement Award from the Academy "for the development of the Tissue Physically-Based Character Simulation Framework."] James did the Goblin King for The Hobbit: An Unexpected Journey, so he was able to create lots of muscle tissue and simulations for how the skin would work on top of it – perfect for something like the Grievers.


WETA whiz James Jacobs, an Academy Sci-Tech Award winner for technology used for the Goblin King (above) in The Hobbit: An Unexpected Journey, recently joined Method, and played key roles in the development and creation of the Grievers. © New Line Cinema and MGM. A Warner Bros. Pictures Release.


With a team like that, that's why we wanted to go for The Maze Runner. It was perfect for us. It featured environment work, which Method is already well respected for, but creature work was really the thing that I wanted to pull in too. I feel great that we got the job, and I think it was a success. Method has already got more creature work on the back of it.

As a company, we're kind of punching above our weight, and that's where we want to take it. Personally, I think creature work is my favorite thing. My background is in traditional animation as well as computer animation, so for me, as a VFX supervisor, this show was the culmination of so many good things.





The Grievers are the guardians of the maze.



GRIEVERS: GUARDIANS OF THE MAZE
Erik de Boer, Method Animation Supervisor for The Maze Runner
The Grievers are the guardians of the maze, to make sure that these kids don't escape. They are slug-like creatures with organic bodies and six huge mechanical legs. A bunch of them also have huge scorpion tails coming out of the back of their organic bodies. There were about 150 of these shots in all.

It was exciting to have something with a sort of robotic, mechanical feel to it, but that also needed to have the agility, and the appeal, of course, to make it an exciting character. They need the agility to be a threat to the kids, yet be clunky enough to make it plausible for these kids to outrun them, or outsmart them, long enough to escape the maze at the end of the movie.

When we started to do motion tests for these guys, we looked at nature for inspiration, as we always do. We looked at insects like cockroaches and centipedes, and we definitely found a lot of interesting stuff there. But when we did our initial test of the Grievers, they were just too insectoid.

We then started to look at more mechanical machines. We found some Burning Man hexapod creations, and we just tried to borrow some of that motion language and awkwardness. Yet, at the end of the day, the Grievers still had to be obviously competent protectors of this maze that were actually designed to be inside of that environment. To accomplish this, we gave them telescopic legs so that they could partially retract them for the narrower alleys and sections in the maze.
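
The retracting-legs idea amounts to a simple constraint: clamp each leg's reach so the stance fits the alley the Griever is in. Here is a toy Python sketch of that constraint, with all parameter names and dimensions invented for illustration; the actual rig logic was of course far richer.

```python
def leg_extension(alley_width_m, min_ext_m=1.0, max_ext_m=4.0, body_width_m=2.0):
    """Clamp each side's leg extension so the stance fits the alley."""
    room_per_side = max((alley_width_m - body_width_m) / 2.0, 0.0)
    return max(min_ext_m, min(max_ext_m, room_per_side))

print(leg_extension(12.0))   # open corridor: legs at the full 4.0 m reach
print(leg_extension(5.0))    # narrow alley: legs retracted to 1.5 m
```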

This is less visual and more of a backstory: I liked the idea of their organic hearts pumping the hydraulic fluids into the mechanical legs. There were a lot of hoses that connected the organic heart to the mechanical parts, and their lungs would be driving the pneumatic actuators at the tips of their feet – sort of jackhammers to find purchase and traction on the maze walls and floors.


The Grievers were hideous and threatening, but agile creatures that roamed the maze and attacked anyone they came across. Final comp on the left and before shots on the right.

PIPELINE
It was important to me to find a way to get animators comfortable in their sessions as quickly as possible – get them to focus on the work and take away a lot of the housekeeping that the pipeline had when we found it. I think we were really successful in that.

That sounds trivial, but it's hard to do in an environment where everybody's fighting for CPU resources. When projects go away to a render farm, they can sometimes get stuck there and never come back. We managed to keep that smooth, and we also worked on allowing the animators to have a more intuitive interaction with their rigs. Victor Barbosa designed the Griever rigs, and did a great job on them.


CG AND THE REAL WORLD
There's this spectacular end sequence called the Griever Finale, where the kids run across this long bridge towards the exit of the maze, and the Grievers are there to prevent them from doing so. They end up in a close battle on this narrow bridge. Despite the fact that you would expect these Grievers to be pretty competent, I think we found a believable way to make it a really exciting battle – back and forth a little bit until, of course, the kids, no surprise, defeat them.

What I always like to do is find any excuse to connect the animation to the physical world. In this case that could be a CGI environment, but also the physical world: the walls, the ground, or any prop or live action actor in the scene.

That can be done by moving a CGI character in and out of shadows being cast by live action elements, but also bodies bumping into things, or tails that wrap around corners. And, of course, adding to that later in the integration phase by having dust and bits of rock flying off.

Looking at what made previous CG characters successful or not, it comes down to that main collision: gravity yanking the weight down and the paws hitting the ground. If that main collision isn't there convincingly, then you basically lose the whole battle.

What we did on the tiger for Life of Pi was spend a lot more R&D time on just the paws. We had to sell that every time that paw collided with the ground, it was taking at least 80 or 100 pounds of weight through it. We'd always done tendons tensing up, the scapula firing up – but really getting the subtle shape changes into the paw, that was really important.


Pi Patel (Suraj Sharma) and a fierce Bengal tiger named Richard Parker must rely on each other to survive an epic journey. ™ and © 2012 Twentieth Century Fox Film Corporation. All rights reserved.

Now, for the Grievers on Maze Runner, we couldn't use the same tools that we had used for the tiger in Life of Pi, because the Grievers only have hard spikes that hit the ground. We used their telescopic feature to try and sell as much as possible that the weight was going into those legs. Also, because they are six-legged, the weight distribution can be a little bit sloppier, because you're not hunting for that single triangle of legs that supports the weight. With six legs there's always a bunch of them on the ground, so you can be a little bit sloppier in terms of which leg the weight is shifted over. Very often when you look at hexapods, the motion is a little bit clunky in the middle while the six legs are doing their fun, random, quick stuff. It was definitely a whole different challenge for the Grievers than it was on, for instance, the tiger.
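
That weight-distribution point has a neat geometric reading: a stance is stable when the center of mass projects inside the polygon formed by the planted feet, and with six legs that polygon almost always exists. A minimal Python illustration using scipy (not anything from Method's pipeline; the foot positions are made up):

```python
import numpy as np
from scipy.spatial import Delaunay

def com_supported(stance_feet_xz, com_xz):
    """True if the center of mass projects inside the hull of the planted feet."""
    if len(stance_feet_xz) < 3:
        return False                       # fewer than three feet down: no polygon
    hull = Delaunay(np.asarray(stance_feet_xz, dtype=float))
    return hull.find_simplex(np.asarray(com_xz, dtype=float)) >= 0

# Four of six feet planted, center of mass near the middle: still supported.
feet = [(-1.0, -1.0), (1.0, -1.0), (1.0, 1.0), (-1.0, 1.0)]
print(com_supported(feet, (0.1, 0.2)))     # True
```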

And I won't deny it: that was definitely an exciting part of this job.