JURASSIC WORLD Industrial Light & Magic // 2015
NO PLATE. This jungle set design is a full 3D build using instanced objects. The ground was sculpted in 3ds Max, then populated with grass, foliage, and trees. Assets inside the aviary at the end of the shot were conformed to fit the composition of the dome breach.
PHOTOGRAMMETRY CLIFFS. The foreground grass and foliage in this shot are procedurally generated and deferred-loaded at render time for memory efficiency. The cliff is a photogrammetry reconstruction built from photographs captured during an aerial shoot in Hawaii, with additional geometry sculpting and retopology in ZBrush and Maya.
CAMOUFLAGE. For this jungle sequence, I developed a new pipeline tool to exchange instanced foliage assets between departments. All foliage set design and layout was approved with proxy renders from the Generalist team, then the plant transforms and species tags were exported for simulation and rendering by the Creature Dev team.
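A minimal sketch of the kind of interchange record such a tool might pass between departments. The schema and field names here are my illustrative assumptions, not ILM's actual format:

```python
import json

# Each plant instance carries only a species tag and a 4x4 transform
# (row-major); the heavy geometry lives in a shared asset library keyed
# by species, so the interchange file stays tiny even for huge layouts.
def export_instances(instances):
    """Serialize approved layout instances for the downstream department."""
    return json.dumps({"version": 1, "instances": instances})

def import_instances(blob):
    """Load instances on the receiving side for simulation and rendering."""
    return json.loads(blob)["instances"]

IDENTITY = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  0, 0, 0, 1]
plants = [
    {"species": "fern_large", "transform": IDENTITY},
    {"species": "palm_small", "transform": IDENTITY},
]
```

Exchanging transforms and tags rather than geometry is what keeps the hand-off cheap: the receiving department re-instances its own hero assets at the recorded positions.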
THE ISLAND. A straightforward, old-school mattepainting of Isla Nublar, using only Photoshop and Nuke.
TRANSFORMERS: AGE OF EXTINCTION Industrial Light & Magic // 2014
KNIGHTSHIP LANDING BAY. I built the 3D model of the Knightship landing bay based on concept art. To balance the warm and cool tones of the exterior and interior lighting, I separated the light contributions into multiple render channels. Intensities and color were deliberately left unadjusted before rendering, which allowed for quick iterations and reworking of the composition in Nuke, as well as greater control over the interactive lighting.
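The recombination step is conceptually just a weighted sum of the per-light channels. A toy per-pixel version (the function and channel names are mine, standing in for a grade-and-merge tree in Nuke):

```python
def recombine(channels, gains):
    """Additively recombine per-light render channels into one RGB pixel.

    channels: dict name -> (r, g, b) light contribution for the pixel
    gains:    dict name -> scalar tweaked in comp (default 1.0)
    Mirrors what a grade + plus-merge tree does in Nuke for this workflow.
    """
    out = [0.0, 0.0, 0.0]
    for name, rgb in channels.items():
        gain = gains.get(name, 1.0)
        for i in range(3):
            out[i] += gain * rgb[i]
    return tuple(out)

# Hypothetical pixel with one warm and one cool light contribution.
pixel = {"sun_warm": (0.8, 0.5, 0.2), "interior_cool": (0.1, 0.2, 0.4)}
```

Because the channels only ever add, the warm/cool balance can be pushed arbitrarily far in comp without re-rendering.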
PREHISTORIC INVASION. The mining ships that invade Earth in the opening shot of Transformers: Age of Extinction were highly detailed hero models that didn't fit into 3ds Max scene memory, so I wrote a tool that automatically created cached versions for delayed load whenever new animation takes were published by the Layout Department. Multiple sun positions were cheated in 3D to achieve the desired backlighting on each ship. The cloud layers catching sunlight over Earth were projected as displacement maps.
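In spirit, such a tool is a publish-triggered cache paired with lazy scene loading. A toy sketch under that assumption (class and hook names are illustrative, not the production code):

```python
# Toy sketch of publish-triggered proxy caching; not the production tool.
class ProxyCache:
    def __init__(self, bake_fn):
        self._bake = bake_fn   # expensive: bakes a published take to a cache
        self._caches = {}

    def on_publish(self, take_id):
        """Hook fired whenever Layout publishes a new animation take:
        bake a lightweight cached version right away."""
        self._caches[take_id] = self._bake(take_id)

    def load(self, take_id):
        """Scene load pulls the cached proxy for delayed load, so the
        hero model never has to fit in scene memory up front."""
        return self._caches[take_id]

bake_log = []
def fake_bake(take_id):
    bake_log.append(take_id)      # stand-in for writing a cache to disk
    return "proxy:" + take_id

cache = ProxyCache(fake_bake)
cache.on_publish("take_012")
```

The key property is that the expensive bake happens once, at publish time, instead of on every artist's scene open.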
PACIFIC RIM Industrial Light & Magic // 2013
LEATHERBACK TAKES A HIT. The Kaiju monster in this shot crashes into a dock crane and tears it from its base to be used as a weapon against Gipsy Danger. I was responsible for all the FX in this shot: water spray upon impact, rain, dust plume, crane sparks. ILM's fluid solver, Plume, supplied the grid-based advection, and RenderMan was used for rendering.
GIPSY TUMBLE. This shot required all the typical destruction effects: particle simulations for sparks and rain, voxel grids for the dust impact, instanced geometry for debris, fluid simulations for the streaming water.



WORK IN PROGRESS. These are a few example OpenGL renders that show the simulation data for water splashes and dust plumes on Gipsy Danger.
DOCK FIGHT. The Houdini FLIP solver was used for the streaming water effects as well as the structured splashes in these two shots. The motion of the creatures, debris, and wind provided the dynamic forces acting on the water.
UNSTOPPABLE. Sidewinder missiles only bruise monsters the size of skyscrapers. The explosions and smoke clouds were created with ILM's Plume fluid solver. The missile trails and tracers are particle simulations.
THE LONE RANGER Industrial Light & Magic // 2013
PUFF PUFF. The train smoke in this action sequence was accomplished with Houdini's fluid solver and velocity grids. Carefully balancing the heat pumped into the grid from the smokestack against dissipation once the smoke hit the cooler outside air created the curling eddies.
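That balance can be caricatured in a few lines. This is a toy 1-D stand-in for a real solver, with made-up coefficients:

```python
def step(temps, pump=1.0, dissipation=0.1, advect=0.5):
    """One toy update of a 1-D column of cells above a smokestack.
    Cool every cell, pump heat in at the base, then push a fraction of
    each cell's heat upward. Too much pump and the column saturates;
    too much dissipation and the smoke dies before it can curl."""
    temps = [t * (1.0 - dissipation) for t in temps]  # loss to cooler air
    temps[0] += pump                                  # heat from the stack
    for i in range(len(temps) - 1, 0, -1):            # upward transport
        moved = advect * temps[i - 1]
        temps[i] += moved
        temps[i - 1] -= moved
    return temps

column = [0.0] * 5
for _ in range(50):
    column = step(column)   # settles into a stable vertical heat profile
```

A production solver does this in 3D with velocity fields and turbulence, but the same pump-versus-dissipation tug-of-war governs whether the smoke billows or fizzles.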
THE AVENGERS Industrial Light & Magic // 2012
MAYDAY. Loki takes out the Quinjet. After an initial explosive engine blow-out, I wanted the resulting smoke cloud to churn with the draft of the working engine on the opposite wing. Loki's scepter energy beam was created procedurally using Python noise functions and the engine explosion was simulated with ILM's Plume fluid solver. Deep compositing with the Quinjet geometry allowed for an accurate holdout within the 3D volume.
HULK HURTING. The Chitauri chariot tracer fire seen throughout The Avengers came from a tool I developed and packaged for show-wide distribution using rule-based Python scripts. An RGB mapping of temperature distribution from the hot tip through the cooler tail allowed compositors to control the glow and energy flares. For these two shots specifically, I simulated and rendered all FX: tracer fire, explosions, dust, smoke.
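A hot-tip-to-cool-tail mapping like this can be sketched as a piecewise-linear RGB ramp over normalized tracer length (the control points below are invented for illustration; the production values and grading lived with the compositors):

```python
def tracer_rgb(u):
    """Map normalized position along a tracer (0.0 = hot tip, 1.0 = cool
    tail) to an RGB triple with a piecewise-linear ramp."""
    ramp = [(0.0, (1.0, 1.0, 1.0)),   # white-hot tip
            (0.3, (1.0, 0.8, 0.2)),   # yellow
            (0.7, (1.0, 0.2, 0.0)),   # deep red
            (1.0, (0.2, 0.0, 0.0))]   # cooled, nearly black tail
    for (u0, c0), (u1, c1) in zip(ramp, ramp[1:]):
        if u <= u1:
            t = (u - u0) / (u1 - u0)
            return tuple(a + t * (b - a) for a, b in zip(c0, c1))
    return ramp[-1][1]
```

Encoding temperature as color in the render is what hands control downstream: compositors can key the glow and flares off the RGB values instead of asking for a re-simulation.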
COWBOYS AND ALIENS Industrial Light & Magic // 2011
HE'S NOT DEAD YET. The surprise lake breach of this alien required splashing water, streaming water, spraying water, and a dash of guts and gore. Smooth particle hydrodynamics (SPH) was used for the water structure, and meshed ballistic particles were used for the spray and goo.
MORE GORE. Being slashed in the face with a laser scalpel makes even aliens cry. This sequence of shots required gooey, sticky green blood to ooze from the gaping head wound. Getting the viscosity and adhesion values just right, combined with realistic-looking arterial pressure, was the real challenge here.
NO PARACHUTE. Quick camera cuts like the ones seen in this sequence are especially difficult for FX artists. A simulation that looks great from one angle might look terrible from another, so multiple simulations need to be run with similar yet subtly different characteristics. That was the case with these four action shots in particular, where the camera framing caught every conceivable angle of the speeder plunging from the sky.
THE CURIOUS CASE OF BENJAMIN BUTTON Matteworld Digital // 2008
PARIS, 1950s. I designed and built this full-CG shot of Paris, starting from mid-century aerial reference. The hero buildings in the foreground were modeled based on photographs, but the background buildings and trees were procedurally generated to follow splines extracted from street maps. The texturing was accomplished using photo-based camera projections from multiple perspectives, with occlusions auto-filled at render time based on viewing angle. The tools I developed from the techniques used in this shot were eventually released as a consumer software product, The Mattepainting Toolkit for Maya.
ZODIAC Matteworld Digital // 2007
TRANSAMERICA PYRAMID, 1969. I designed and built this full-CG shot of San Francisco's North Beach neighborhood based on historical reference photography. The building reconstruction and period details are an accurate representation of what the neighborhood looked like just after the first girder beams of the iconic Transamerica Pyramid went up.
SACRAMENTO CAPITOL. Notice the perfectly linear translation of the camera in this shot. It's a subtle yet intentional giveaway that this is a CG shot, not a live action plate. Craig Barron and I flew over Sacramento at dusk with still cameras to capture multiple angles of the capitol building, then reconstructed their positions in 3D space with photogrammetry software. The buildings were then built using image-based modeling and textured with camera projections of the photos.
FOLLOW THE TAXI. This shot used every technique in the book. I led the team that worked on the photogrammetry build of the environment, and animated, lit, and rendered all CG elements. The shot was designed to feel natural at the beginning, as though it was captured from a helicopter, but then break believability as the taxi turns the corner and the camera tracks perfectly on its rotation axis. We even experimented with stop-motion car models, which travel opposite the taxi at the end of the shot.
MAD GOD Kaleidoscope // 2016

STEREO 360 RENDERS. The Mad God virtual reality experience was the first project my company, Kaleidoscope, produced with Tippett Studio and Wevr. I approached Tippett with the idea to create a VR experience based on Phil's stop-motion Mad God universe, using binaural audio cues to lead the viewer's gaze.
All creative direction, animation, photography, and compositing was done by Tippett Studio, and the final animated sequence was delivered to Kaleidoscope for VR release. I chose Unity as the rendering engine, targeting the Oculus Rift and Samsung GearVR as delivery platforms. Custom shaders were written in C# and GLSL to re-project and composite multiple equirectangular layers on the GPU. You can find Mad God on Wevr's Transport platform.
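The core of any equirectangular re-projection is mapping a view direction to longitude/latitude texture coordinates. Sketched here in Python rather than GLSL, assuming a right-handed frame with -Z forward and +Y up (conventions vary by engine):

```python
import math

def dir_to_equirect_uv(x, y, z):
    """Map a unit view direction to equirectangular (u, v) in [0, 1].
    u wraps with longitude (atan2); v runs pole to pole (asin).
    Assumed convention: right-handed, -Z forward, +Y up."""
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi
    return u, v
```

Compositing multiple layers then amounts to sampling each equirectangular texture with the same (u, v) per screen pixel and blending front to back on the GPU.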
JURASSIC WORLD: APATOSAURUS Industrial Light & Magic // 2015

MIXING LIVE ACTION AND CG. Felix and Paul Studios produced this short experience with ILM providing the CG elements. I was responsible for the FX that tie the dinosaur to the photography: the tree limb and leaves that become a snack for the Apatosaurus, as well as the foot impressions left on the ground.
THE LAST MOUNTAIN Kaleidoscope // 2014
LOW POLY WORLDS. Starting with a team of 2 and ramping up to 17 during peak production, I led a team of animators, modelers, and texture artists that kicked off The Last Mountain as part of San Francisco's first VR Hackathon. The project took 5 months to complete with artists and developers working remotely. This decentralized production model we developed established the foundation of Kaleidoscope and guides its mission to empower independent artists.
STAR WARS 1313 Industrial Light & Magic // 2013
REALTIME FLUID SIMULATION. Though not technically a virtual reality project, Star Wars 1313 got me really excited about the future potential of realtime graphics, and was the genesis of my decision to move into the burgeoning virtual reality industry. I worked as an FX artist with Kim Libreri and the talented team of artists and developers at LucasArts on this cinematic piece, which was rendered in 22ms/frame—the bleeding edge of rendering in 2013.
REALTIME LIGHTING. I was the lighting technical director on this sequence, which demonstrates the realism the LucasArts developers achieved with advanced shaders using Unreal Engine 3.
MICHAEL BREYMANN was one of those kids who dismantled all of his toys in an attempt to better understand how they worked. Still doing that to this day with grown-up toys, Michael is passionate about creative technology and its relationship to storytelling as an artistic medium.
For the last decade, he has worked as a technical director and graphics programmer in the visual effects industry. His production experience has resulted in the development of new software and tools designed to enhance the immediacy of interaction between artists and computers.
In 2007, he formed Glyph Software to commercialize a photogrammetry toolset, the Mattepainting Toolkit for Maya, which is now in use by freelancers and studios around the world. The features and design implementation were informed by Michael's work at Matteworld Digital from 2006-2009.
In 2009, Michael joined Industrial Light & Magic as an FX technical director working on particle and fluid simulations. After a brief stint as a pipeline consultant to TV Globo in Rio de Janeiro in 2010, Michael returned to ILM where he continued his work as an FX TD and 3D Environment Artist until he left to form Kaleidoscope, a virtual reality content company, in 2015.
As CTO of Kaleidoscope, Michael developed realtime graphics technologies and global distributed workflows for artists and developers. In January 2017, he returned to Glyph full-time to focus on building realtime content creation pipelines.
A few languages and software packages Michael commonly uses on projects include: C++, Python, Cinder, Mari, and Houdini.