Created By Anna McCraith & Callum Luckwell
Original Music by Michael Cook
Special Thanks to Alec Parkin & Kerry McCormick
The short film Storm Within was created by myself and Anna McCraith. Anna handled the character animation, compositing, and character texturing, while I dealt with the 3D modelling, rigging, environment layout, and the 3D effects you see in the film.
A brief bullet-point breakdown of my contributions, scene by scene, is below.
In the plan for our short, the wolf was originally going to be composed entirely of smoke and electricity; however, after looking into how to achieve that, we quickly decided to simply have it emit some smoke and flame.
It took several attempts before we realised that this was an unfeasible goal in the time we had, especially as a team of two with a wide spread of work to do.
The tests started simple: adding some smoke and flame to the eyes of the characters to make them appear more threatening.
After a lengthy series of tests and practice runs, I eventually figured out how to graft the smoke to the characters' geometry so that it moved with them.
This, however, created a different problem. In shots where the wolf moved, the smoke effect came out blocky and thick rather than the wispy aesthetic I was going for. When static, it worked fine, aside from a few glitches.
Rendering even the successful tests sometimes reintroduced this problem.
This mainly occurred with Zync, and was supposedly due to issues with caching the particle effect history; however, no matter what tutorials I followed and fixes I attempted, I could not get Zync to render the particles properly and without issue.
This was what the above images looked like when static, and as you can see, despite all three having identical skyboxes and light setups, the results could not be more different when rendered through Zync or a different computer.
This inconsistency drove me towards Paint Effects, where I attempted to create the fire and smoke instead.
Needless to say, it had mixed results.
Generally, the Paint Effects looked good but didn't quite fit the aesthetic of the film, and because they rendered fully 2D, without built-in transparency, it was hard to visualise before compositing how they might look.
As it stood, it was impossible to tell how this would look in the scene; the results were more shocking once comped together.
Whilst certainly terrifying, and very close to the aesthetic we wanted, the wolf seemed to lose all definition once we covered it in flame.
Additionally, the sheer amount of compositing that would need to be done for this, the hawk, and the snake proved excessive.
Given these factors, we decided to keep the smoke and flame effects to a minimum, using them sparingly to accentuate danger through After Effects.
This tutorial was interesting, and in an ideal world we'd have liked our film to feature effects like this on the characters. However, as you can see, even creating a simple static effect took time, and replicating it across dozens of shots on a moving creature proved troublesome at best.
This was our plan for the shot in which the wolf disintegrates into smoke, using a combination of render layers and effects to have it disappear.
As can be seen here, most of the smoke effects we ended up using were simple ambient pieces, with the occasional particle effect around the eyes and mouth, as shown at the beginning of this post.
The three primary effects in the short film are the lightning, the grass (and other plants), and the water.
The lightning went through several iterations before we settled on the final version; most of them ended up looking clunky or throwing off the lighting in the scene.
Ironically, the simplest execution ended up working the best, with the standard Maya lightning effect working very well.
The lightning had to be animated across the scene to follow the wolf's position, and timed to create the right aesthetic. This was achieved with the exposure settings, the timing, and a basic locator rig that let me drive keys so that the lightning stretched and 'vibrated' more the larger the distance it moved.
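The locator-driven stretch can be sketched in plain Python as the driving logic only; this is a hypothetical illustration, not the actual Maya expression we used, and the base/gain values are placeholders:

```python
import math

def lightning_amplitude(positions, base=0.2, gain=0.5):
    """Per-frame jitter amplitude for the bolt: the further the
    locator moved since the previous frame, the more the lightning
    stretches and 'vibrates'. Values are illustrative."""
    amps = [base]  # first frame has no previous position to compare
    for prev, cur in zip(positions, positions[1:]):
        amps.append(base + gain * math.dist(prev, cur))
    return amps
```

Keys driven this way mean a fast-moving bolt automatically reads as more violent, with no per-shot hand animation of the jitter.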
This tutorial was helpful for understanding all the features and parts of the lightning tool that I could utilize, and how to sculpt the effect I wanted.
This was how the lightning looked when comped into the scene. Ideally, I would have liked to spend a little more time thinning it out and making it more defined, possibly even adding hard outlines, but there were other parts that needed addressing with higher priority.
As I mentioned previously, I had made efforts to construct the grass with XGen; however, it wasn't until recently that I realised combining the Paint Effects tool with PolyScatter would achieve the same result with greater ease and effect.
As can be seen here, the XGen-based scenes ran very heavily and didn't produce an ideal result, especially as any animation would have needed to be implemented manually.
With Paint Effects, however, I could quickly and simply create something that was animated by default and resembled the kind of painterly grass we wanted.
By creating multiple different 'scales' of the same grass type, I could scatter it over multiple surfaces without creating obvious gaps, while maintaining the 'structural' functionality of the scenes – something XGen failed to do, as it often caused crashes due to how intensively it ran.
PolyScatter allowed me to instance the grass I made along any environment surface. Its downside was that, because the instances didn't generate their own effects but instead pulled from the parent, they wouldn't interact with any other geometry in the scene.
This meant that for any shots where the rabbit needed to run through grass, the grass immediately around him had to be removed and manually painted back in with collision effects.
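The clearing step amounts to a distance filter over the scattered instance positions; this is a hypothetical sketch of that logic in plain Python (function and parameter names invented), since in practice the removal was done by hand in Maya:

```python
import math

def clear_around_path(scatter_points, path, radius=1.5):
    """Drop instanced-grass positions that fall within `radius`
    of any point on the character's path; the cleared corridor is
    then repainted by hand with collision-enabled strokes."""
    return [p for p in scatter_points
            if all(math.dist(p, q) >= radius for q in path)]
```

Only the hand-painted strokes in the corridor need collision, so the heavy instanced grass elsewhere stays cheap.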
As can be seen in this scene, PolyScatter could also be used to add things like thorns across the brambles or to quickly spread clusters of rocks across the ground; all I had to do was import geometry I had created and adjust the settings until I found a result I wanted.
This was the tutorial I used to learn about the settings and options available to me when creating the grass.
Like in the above video, many of the scenes had multiple sets of instances that slowed them down immensely, so it was important to use some of them sparingly and to be careful about the amount of detail we included in the environment – not just to maintain scene fidelity, but also to ensure the aesthetic we wanted remained intact.
Water was one of the trickier parts to grasp, and we had many options available, each with their own pros and cons.
Initially, I attempted some Bifrost water tests to see how they would look; however, I found Bifrost too finicky, and it produced results that were too photo-real for our purposes.
When at a lower setting, like above, Bifrost tended to generate an almost plastic appearance, regardless of lighting adjustments.
This improved the more detail you added; however, at that stage it looked too much like an ocean to render effectively as something like the stream.
This would have been great for the ocean, but it took a while to achieve and was very finicky about how it rendered out.
However, I found that the best result could be achieved using Maya's built-in 2D water texture.
Adjusting the velocity and ripple speed let me create a single preset that I could scale through a scene to have the water increase in intensity and height; that way it needed no manual adjustment between shots.
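The single scalable preset is essentially a linear blend between calm and stormy attribute values; this is an illustrative pure-Python sketch of that idea, with placeholder numbers rather than the actual attribute values used:

```python
def water_preset(intensity, calm=(0.02, 0.5), storm=(0.15, 3.0)):
    """Blend one 2D-water preset between calm and stormy settings.
    `intensity` in [0, 1] scales the (velocity, ripple speed) pair,
    so shots don't need re-tuning by hand. Numbers are placeholders."""
    return tuple(c + intensity * (s - c) for c, s in zip(calm, storm))
```

Keying a single `intensity` value per shot then ramps the whole water surface up or down in one place.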
The water here had a much more painterly aesthetic that I really liked and that fit well with the rest of the short. It was also incredibly easy to manage once set up, though as mentioned before, its pacing was sometimes hard to wrangle.
By placing the water texture on a transparent plane, with another layer behind it carrying a very mild ramp shader, I could manipulate the appearance of depth.
In the example of the log ocean above, the water layer is highlighted; it was set up with a high level of Catclark subdivision to generate nice deformation, whilst the layer below had a displacement map and a ramp shader to give the appearance of terrain under the water and to cast a nicer effect back onto the water's surface.
This was ramped down for scenes where the water needed to be calmer, like the stream or scene 11.
Rendering the short film was a drawn-out, difficult, and often troublesome process that seemed at times to rely more on luck than on any amount of planning.
As Anna finished the animation in shots, I took them, set up the environment scene references and textures – and began the process of lighting and test rendering.
Most test renders turned out something like this, until eventually I hit the jackpot.
Generally the animation process took some time, which afforded me the opportunity to work more on lighting and set up.
As can be seen in the plan below, some scenes required a constant shifting of assets and the occasional keyed set of lights. This was particularly prevalent in the shots with lightning, which had to be not only created but rendered on a separate layer, whilst still having its accompanying lights timed to work in tandem with it.
Zync was a blessing and a curse when it came to this. We had to use multiple accounts, amassing approximately £3,000 worth of rendering; had we paid for this out of pocket, the film would never have been completed. Fortunately, by utilising the free trials available to us and rendering layers on efficient settings, we were able to cut the cost and time needed to render dramatically.
Shots frequently errored out during the rendering process, or came back with problems that proved temperamental to fix. Often, simply closing and re-opening a scene and restarting the render fixed it – or simply hiding a cube somewhere in the scene.
This has convinced me that Cloud Renderers, while vital, clearly have some anger issues.
Renders often came out like this, with the lights not casting defined shadows or appearing too dark; whilst this could be corrected in post, it was much easier to compensate for it directly through lighting.
This meant having some layers ‘overlit’ whilst others had to be underlit. This is mostly down to how Zync dealt with the colour output management.
Most of the testing was sporadic frames or short play-outs of how the lighting moved in the scenes.
Generally it worked without problems once set up, though we occasionally hit issues; these were easily fixed by caching to Alembic and baking the animation before exporting.
Most rendered shots needed minor colour correction, basically ensuring they all had similar tones and palettes per scene.
You can see in this pre-correction set of renders that each character, whilst rendered with the same set of lights, came out with clear outlines and slight pops.
The Wolf had to be rendered brighter, with a purple texture to allow him to be more easily comped and have the effects that drift with him added.
In addition to the Zync rendering, we also set up a number of Macs rendering independently; thanks to the referencing system we had established, this went fairly quickly, as a scene could be loaded instantly on any Mac with the render-setting presets imported directly.
However, the Macs rendered out noticeably lighter than Zync with the same settings, which resulted in an interesting battle when it came to colour correction.
The log ocean scene was at once one of the easiest and one of the most frustrating scenes to make.
In terms of layout, it was incredibly simple: a slight hill leading down to a fallen log stretching across a wide ocean.
However, the process of creating an ocean of the appropriate scale proved intensive, and slowed down the process of tweaking and adjusting immensely.
The log ocean was constructed with two states, one in the midst of a storm and the other with a calmer sea.
The calmer sea was lit with pale greys and a slight tint of yellow to provide highlights and contrast for the characters.
The hill leading down to it and the early parts of the ocean, by contrast, I lit using intense reds, assembled to give a vivid skyline and clear lines of action; even so, actually creating the scene raised complex issues around how to light it.
Primarily, though, the skysphere gave a red tint that then needed some forced shadows from directional lighting, and an ambient yellow light to help highlight it.
Here you can see a finished shot from the above environment, with a brown light being used to cast the highlights to create a more visceral effect.
This is probably one of my favourite shots, and the one in which I feel the lighting works best.
The above shot shows another aspect of lighting this scene that had to shift between shots: a brighter set of lights was needed for the shots that framed the rabbit against the sky, to ensure the red tones didn't blend too heavily.
The ocean itself was primarily the result of two layered ocean shaders using Maya's 2D water displacement texture, set to generate a simple but effective sequence of waves and ripples.
The log that stretched across it, both in the shot where it splinters and when it is static, was an extended version of the tree I had modelled earlier, with a light rig allowing the splinters to be moved relative to the trunk itself.
Most of the second sequence is occupied by the snake chasing the rabbit through a brambled lair and tunnel, intended to create a feeling of claustrophobia.
This was the initial background and lighting, missing the brambles and dead grass that would eventually be added.
You can see in this shot that the claustrophobic design relied heavily on dark black and red lighting; to enhance this, the floor was made pale tan and brown, with yellow grass to generate more definition.
When it came to creating the exit, it was important to make the corridor both long enough, and positioned in such a way, that the camera could glide through the scene without clipping into geometry.
This paralleled interestingly with the requirement to make the space feel closed and collapsed.
The environment often had to be adjusted between shots to better frame the characters and improve the sense of motion in the scenes.
I also added a soft area light that drifted just in front of the characters in this scene to help them pop out from the dark backdrop more easily.
The bee was a fairly simple model and rig, mainly composed of small textured spheres, with wings and legs attached by locators and IK controls to allow them to articulate easily.
The stream environment was essentially a replication of Stream version 1, but with a wider, deeper river, a darker ground and grass texture, and a deeper blue tint to the lighting.
The water in the stream had to be much more ferocious than in the scene before, though I had issues with its speed at points, as the texture seemed to tie velocity and speed to the depth of the waves, which meant I had to juggle a balance between the two.
The scene had many piecemeal and hidden components, and was initially designed to actually stretch during the shots.
This was later scrapped, so many of those additions went unused.