Category: Film VFX

  • How Foundry’s Athera Can Change Conventional VFX Pipelines Forever

    If you work in visual effects, computer-generated graphics, or 3D design, Foundry is a VFX pipeline software company you will likely be familiar with.

    Established over 20 years ago, in 1996, and based out of London, England, Foundry has served up an ever-expanding range of software in the visual effects and 3D content spaces. Their list of clients includes the likes of Google, Pixar, Mercedes-Benz and Sony Pictures, and if that doesn’t impress you, their 2017 showreel (which you can watch below) definitely will. Studios and individuals worldwide use their portfolio of power-packed tools, including the Nuke compositing tool, the Modo 3D modeling, rendering and texturing suite, and the Flix story development platform, among many more.

    At FMX 2018, one of the foremost conferences dedicated to the world of digital visual arts, Foundry unveiled a new product to add to this already impressive portfolio called Athera. It isn’t a complete bolt from the blue though, because those intimate with the comings and goings of the visual effects world will have heard whispers, caught glimpses and may even have gotten hands-on with the suite when it was in its beta testing phase and went under a different moniker – Project Elara. Before going any further though, let’s understand what Athera brings to the VFX table.

    What is Foundry’s Athera and what does it do?

    For those unfamiliar with Athera, it is a cloud-based platform, built on Google Cloud, that allows the entire VFX production pipeline to be executed from a single service. It isn’t limited to the myriad software that makes up Foundry’s portfolio, the third-party apps that tie into the VFX workflow, or remote storage for that workflow; Athera goes above and beyond and also offers top-of-the-line computing capability as part of its ambit of services.

    Here’s a rundown of the Athera VFX pipeline and the services offered within it:

      • Virtual workstations:

        As mentioned earlier you can tap into computing capabilities that are tailor-made to run the gamut of software available in the VFX pipeline without faltering. Depending on your needs and resources, you can access the workstations through web browsers or a dedicated interface.

     

      • “Context” organization capabilities:

        Athera comes with something called “Contexts” – essentially projects or distinct pieces of work that you can label and access independently. You can switch the roles of individuals, complete teams and even full-blown organizations working on these projects without having to create a separate project each time. All your files, renders and apps remain locked to these “contexts”, making collaborative efforts on projects fairly straightforward.

     

      • Secure centralized storage:

        Using the cloud allows you to store all your raw data, project files, renders and everything else on a secure and easily accessible server rooted in a cluster of high-performance SSDs dedicated to your project.

     

    • Versatile Applications:

      The list of applications for VFX and 3D design available on Athera is staggering. Check out what’s on offer:

      • Houdini FX, Power Houdini FX
      • Nuke, Nuke X (with Cara VR optional), Power Nuke X
      • Nuke Studio, Power Nuke Studio
      • Katana + 3Delight, Power Katana + 3Delight
      • Modo
      • Mari
      • Blender
      • DJV View

    The Power options mentioned in the list above essentially run the software on workstations with twice the capability of standard ones.

    Now that we know what Athera is capable of, here are just a few reasons why using a cloud-based VFX pipeline such as this is beneficial for VFX and 3D production houses.

      • Powerful rendering capabilities at your fingertips:

        Foundry constantly upgrades their GPU-powered workstations and provides you with the latest updates of their software suite, meaning you don’t need to make an expensive outlay on all the necessary hardware and software if you choose to go down the Athera route.

     

      • Pay for what you use:

        Speaking of outlay, Athera lets you book software in 30-day blocks, and you can add and remove software access based on your needs. Prices start from $244 a month for Blender and DJV View, and the most expensive piece of kit is the enhanced Power Nuke Studio at $1,388 per month. On the rendering side, Athera charges by the second to ensure you don’t pay for more than what you need, with prices ranging from $0.43 to $3.00 per hour. You can see the entire price list and choose what you want based on your needs.

     

      • Security:

        Since your data is stored on a Google-powered cloud server, security isn’t an issue. All the data is encrypted when uploaded and remains so even when stored, and Athera has been audited by external data security experts who have given it their seal of approval. This means you don’t have to worry about hacks or data theft if you choose to use Athera.

     

      • Flexibility:

        Apart from the flexibility that Athera offers in terms of choosing which specific software you need and billing by the second for renders, you also have the option to access the virtual workspace from different locations and devices since it isn’t bound to physical servers. This allows collaborative teams that are spread out geographically to easily come in, do what they need to do, and then allow for the project to move on to its next step without the need to physically transfer the files and data, saving a lot of time and really streamlining the entire production process.

     

    Clearly, Athera and this cloud-based approach to VFX production offers a host of benefits. All you need is a fast enough internet connection, something that’s becoming more and more accessible in all four corners of the world. It is no surprise then that this tech is being lauded as the way forward, and it is only a matter of time before more cloud-based VFX services similar to this begin to crop up.

  • Photorealism – An Exciting New Trend in the VFX Industry

    Since the term was coined by Louis K. Meisel in 1969, “photorealism” has gathered much-deserved attention and applications across all traditional and modern-day artistic endeavors. In simple words, photorealism is a visual effects technique that draws on drawing, painting and all the graphic material available at a photographer or filmmaker’s disposal, and then uses different platforms and mediums to make the gathered imagery as life-like as possible.

    Although photorealism originated from pop art, it has evolved considerably over the years. Photorealism has survived this long because it has been undiluted and has managed to stay consistently compelling.

    Understanding Photorealism

    One of the most commonly-heard statements from cine-goers these days is, “Wow, that looked so real!”

    Now, how is that made possible? A huge misconception is that all a VFX artist does is make an image as realistic as possible. The approach to creating a VFX sequence depends on the object in question and whether it follows the laws of nature or not. For example, if a VFX supervisor is shooting a car or an animal, he chooses the photograph with the most detailed texture, composition or light to enhance and make it look more realistic on screen. But when it comes to preparing for a sequence that has an alien ship or sci-fi props like the ones used in Star Wars, there is not much to compare the image to. It all boils down to the VFX artist’s vision, creativity and skill to use all the available information and create magic on screen.

    The amazing benefits of photorealistic animation

    Photorealistic animation is a powerful visual tool that brings virtual worlds to life with astonishing realism. By utilizing advanced techniques and cutting-edge technology, photorealistic animation has become an essential element in various industries, from entertainment and advertising to architecture and product visualization. Let’s explore the amazing benefits of photorealistic animation.

    Immersive Visual Experience: Photorealistic animation creates an immersive visual experience that transports viewers into a virtual environment that looks incredibly realistic. The attention to detail, lifelike textures, and accurate lighting techniques combine to produce animations that are almost indistinguishable from real-life footage. This level of realism captivates audiences, allowing them to engage with the content on a deeper and more emotional level.

    Enhanced Brand Perception: Incorporating photorealistic animation into your marketing campaigns or product visualizations elevates the perception of your brand. The high-quality visuals and attention to detail in photorealistic animations create a sense of professionalism and excellence. This level of sophistication enhances your brand’s credibility, making it stand out in a competitive marketplace.

    Product Visualization: Photorealistic animation is particularly beneficial for showcasing products and prototypes. By rendering products in lifelike detail, you can present them to potential customers or investors before they are physically produced. This allows you to gather feedback, make improvements, and create buzz around your offerings. Photorealistic animation helps customers envision how the product will look and perform, making it easier for them to make purchasing decisions.

    Architectural Visualization: In architecture and real estate industries, photorealistic animation is a game-changer. It allows architects, interior designers, and property developers to showcase their designs in breathtaking detail. From showcasing realistic lighting effects to presenting interior spaces with accurate materials and textures, photorealistic animations help clients visualize the final product. This leads to better decision-making, increased client satisfaction, and improved marketing efforts.

    Commercial Visual Effects (VFX): In the entertainment industry, photorealistic animation plays a vital role in creating stunning visual effects. From blockbuster movies to TV commercials, photorealistic VFX can transport audiences into fantastical worlds, create convincing creatures, and bring unimaginable scenarios to life. These visual effects not only entertain but also contribute to the overall storytelling and audience engagement.

    Cost and Time Efficiency: Photorealistic animation can be a cost-effective alternative to traditional photography or live-action video production. It allows you to create stunning visuals without the need for physical sets, props, or expensive equipment. Additionally, it offers flexibility in terms of revisions and modifications, saving time and resources compared to reshooting or re-creating physical scenes.

    Unlimited Creative Possibilities: Photorealistic animation provides limitless creative possibilities. It allows you to visualize concepts and ideas that are difficult or impossible to capture through traditional means. Whether it’s creating imaginary worlds, showcasing futuristic products, or simulating complex scenarios, photorealistic animation enables you to unleash your imagination and bring your visions to life.

    Mind-blowing Examples of Photorealism

    To cite some examples of the use of photorealism in VFX, we can refer to the imagery used in movies like Marvel’s Guardians of the Galaxy and Disney’s The Jungle Book. The realistic imagery and the finished product give you a preview of what photorealism can do for the VFX industry – and not just within the movie business.

    1. Photorealistic changes in composition and lighting in Korath’s appearance

      photorealism in guardians of the galaxy
      Source: reddit.com/r/marvelstudios/
    2. Photorealism-infused VFX used in Maleficent

      photorealism-infused vfx in maleficient
      Source: whatsontheredcarpet.com
    3. From the production sets of The Jungle Book

      photorealism in junglebook
      Source: nongthonviet.com.vn

    The science behind creating all these effects and sequences, in layman’s terms, is called image rendering. Fortunately, no additional training in new tools or technology is required to use photorealism in the world of VFX. It is simply a matter of compiling all the information and details gathered from multiple frames and making it ready for the final cut of the film. That being said, the higher the demand for photorealism, the more time is required for the final visual output or sequence to be created.

    Which Are the Most Popular Photorealism Tools?

    The most widely-used rendering software and image-enhancing tools include 3Delight, Arnold, Artlantis, Clarisse, Maxwell Render, Octane Render and SolidWorks Visualize. These are the most in-demand tools currently being employed by VFX studios such as Sony Pictures, Marvel Studios and DreamWorks.

    Over the years, VFX companies like ILM, Weta and Pixar have come up with state-of-the-art 3D modeling, animation and rendering technologies that can convincingly simulate anything and everything. Subsequently, these technologies made their way into the commercial software sector, where anyone could buy them, and with high-tech hardware readily available, it has become easier for VFX studios to access capabilities that were once exclusively reserved for big production houses.

    The Future of VFX

    So, what comes next? As VFX technologies and filmmaking techniques evolve with each passing day, it is quite possible that very soon, it will get more and more difficult to distinguish between reality captured on camera and that synthesized on visual effects software!

    The only way for VFX studios to stand out is to never settle for anything less than the best (in terms of visual outputs) and keep scaling things up in terms of quality. At Toolbox Studio, a VFX company in India, we spend a good amount of time discussing and embracing technological advancements in the world of VFX, frequently conducting interactive sessions where younger VFX artists can learn about what is new in the industry, and sharing creative thoughts and ideas that can then be implemented on the projects that we work on.

    What are your thoughts about photorealism? Where do you see the VFX industry 10 years from now? We would love to hear your thoughts and opinions – so type away in the comments section below!

  • Compositing 101: The Evolution of VFX Compositing Through Time


    Visual effects work has so many subsets and minor elements that combine to produce awe-inspiring on-screen magic that we sometimes fail to acknowledge these cogs in the wheel. Here’s looking at one such all-important aspect: VFX compositing.

    In simple words, VFX compositing is a process of making visual effects whereby visual elements from separate sources are combined into a single image, creating the illusion that all the different elements are a part of the same scene. Chroma key techniques and green screens are all used to shoot live action shots that are later composited to create a scene for a movie.

    How Does Chroma Keying Work?

    We’ve all seen pictures or footage of actors working in front of a completely blue or green backdrop and then seen the outcome in a movie with the details filled in. Have you ever wondered why the sets are draped in blue or green instead of any other colour? Blue or green backdrops are used when shooting live-action footage that will later be composited because these are the colors considered farthest from human skin tones, making it easy to replace the background without blurring the main characters out of the shot.

    The blue or green backdrop is entirely replaced by alternate background video or CGI. Naturally, the characters avoid wearing clothing that matches the backdrop, unless of course that part of the character is meant to be blurred out or dissolved into the scene. For example, a character playing a floating head may be dressed entirely in blue so that only the head is composited.
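
    To make the keying idea concrete, here is a minimal chroma-key-and-composite sketch in Python using OpenCV. The file names and the green hue range are placeholder assumptions, and production keyers (Keylight, Primatte, Nuke’s built-in keyers) handle spill suppression, motion blur and soft edges far more gracefully than this.

    ```python
    # Minimal chroma key: mask out the green backdrop, then composite the
    # foreground over a replacement background. File names are hypothetical.
    import cv2
    import numpy as np

    fg = cv2.imread("greenscreen_plate.png")   # live-action plate
    bg = cv2.imread("background_plate.png")    # replacement background, same size

    hsv = cv2.cvtColor(fg, cv2.COLOR_BGR2HSV)
    # Rough hue/saturation/value range for a lit green screen (tuned per shot).
    mask = cv2.inRange(hsv, np.array([35, 80, 80]), np.array([85, 255, 255]))

    alpha = (255 - mask).astype(np.float32) / 255.0   # 1 = keep foreground pixel
    alpha = cv2.merge([alpha, alpha, alpha])

    comp = (fg.astype(np.float32) * alpha +
            bg.astype(np.float32) * (1.0 - alpha)).astype(np.uint8)
    cv2.imwrite("composite.png", comp)
    ```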

    Speaking of floating heads, let’s take a journey back in time to when visual effects were still at a nascent stage and explore the history of  VFX compositing for films.

    Georges Méliès and the Four Heads

    Georges Méliès was a French director and illusionist who pioneered many filmmaking techniques around the turn of the 20th century. As an illusionist, it is no wonder that he was intrigued by what one could do with a camera and wanted to create magic using film as his medium. In 1898, he created the film Un homme de têtes (A Man of Heads), which featured one of the first known uses of multiple exposures of the same subject.

    In the film, he removes his head and places it on a table next to him, where it starts looking around. After doing this a couple of times (so there are three heads on tables next to him and one where it should be), he then plays a banjo with the three heads joining him in chorus. He used a technique known as substitution splicing to achieve this effect: he would stop the camera every time he “took off” his head, place a black bag over his real one and hold a dummy head in his hands. He used a black glass screen to create a matte, leaving a portion of the film unexposed, which he would then shoot over to get the desired effect.

    Pretty cool, we reckon!

    Norman Dawn and Matte Painting

    Another pioneer of the filmmaking world, Norman O. Dawn, is credited as one of the early cinematographers who refined the matte painting technique. He developed a technique that joined together a photograph and a painting to effectively “create” a location for the scene being shot. He would place his paintings and photographs on a large sheet of glass. Black tape was then placed over the parts of the camera’s view where the painting would go, after which the live-action footage was shot.

    Frank Williams and Travelling Matte

    Glass panes limited scenes because the camera needed to remain in one place. Frank Williams overcame this by developing the travelling matte method in 1918, in which he placed actors in front of black backgrounds. The film would then be copied to create high-contrast negatives of the actors, which would then be superimposed on the required scenes without creating ghostly double exposures.

    The Dunning Process


    Notably used in King Kong in 1933, the Dunning process was created by Carroll D. Dunning. This process used blue backgrounds while a solid yellow light illuminated the foreground action (or actor). Unfortunately, this process became redundant with the advent of color films.

    Sodium-Vapour Lighting or Yellow screen

    This method was developed in the 1950s and relied on the narrow-band light of low-pressure sodium (LPS) lamps, which special black-and-white film could record. A special camera was used to record onto two spools of film simultaneously: one spool recorded the actors (and other foreground objects) while the other was used as a mask for combination with a different background. Alfred Hitchcock famously used this method in his classic movie The Birds.

    CGI and advances in blue- and green-screen technology eventually made this method impractical, and matte paintings ultimately went digital, bringing us to today’s spectacular visuals on screen.

    Benefits of VFX Compositing in Film

    VFX Compositing in film is a powerful technique that involves combining multiple visual elements, such as live-action footage, computer-generated imagery (CGI), and other assets, to create a final cohesive image or sequence. Here are some benefits of compositing in film: 

    Seamless Integration: VFX compositing allows for the seamless integration of different visual elements into a single shot. It enables filmmakers to combine live-action actors with virtual environments, creatures, or objects, resulting in a believable and immersive final image.

    Enhanced Visual Effects: VFX compositing plays a crucial role in enhancing visual effects in film. By combining various elements, compositors can create complex and realistic effects such as explosions, fire, water simulations, and fantastical creatures. It allows for the integration of CG elements with live-action footage, resulting in visually stunning and captivating scenes.

    Creative Control: VFX compositing provides filmmakers and visual effects artists with greater creative control over the final image. It allows them to manipulate and adjust various elements independently, including color grading, lighting, and overall composition. This level of control enables the creation of specific moods, atmospheres, or visual styles, enhancing the storytelling and artistic vision of the film.

    Time and Cost Efficiency: VFX compositing offers time and cost efficiency in film production. Instead of shooting complex or dangerous scenes on location, filmmakers can create those scenes digitally. This reduces the need for extensive practical sets, elaborate props, or costly location shoots, ultimately saving time and production expenses.

    Flexibility and Iteration: VFX compositing provides flexibility and the ability to iterate on visual elements. It allows artists to make adjustments, refine details, or change individual components without affecting the entire scene or requiring reshoots. This flexibility enables filmmakers to refine the visual effects, ensuring they meet the desired creative vision.

    Control over Depth and Dimension: VFX compositing in 3D animation gives artists precise control over depth and dimension in the final image. By integrating various layers and elements, compositors can create a sense of depth, add atmospheric effects, or place objects at different distances from the camera. This control enhances the realism and depth perception of the visual composition.

    How’s that for enlightenment?

    Got more questions about VFX compositing? Ask one of the experts at Toolbox Studio!

    At Toolbox Studio, we are armed with the tools and, more importantly, the people who work day in and day out to deliver high-quality VFX compositing outputs for our clients across the globe. Learn why we are considered masters of VFX compositing and browse through the services that we provide under this umbrella.

  • The Evolution of Rotoscope Animation


    What is Rotoscope & The Evolution of Rotoscope Animation

    When animated and visual-effects-enabled films were first produced in the early 1900s, the movements of the characters tended to be jerky and stiff. This all changed when the rotoscoping technique was invented by Max Fleischer in 1915. In this article, we look at what rotoscope animation is and its evolution from its conception to what it has done for the world of animation today.

    What is Rotoscopy?

    Rotoscoping is an animation technique used by animators to trace over live action footage, frame by frame, in order to produce realistic looking action. Initially this was done by using a device called a rotoscope where footage was projected on to a glass panel and painstakingly traced over by artists.

    Naturally, this equipment has now been replaced by computers, but the process is still named after the original device. Rotoscoping is also used in the VFX industry where the matte for an element is created manually on a live action plate which is then composited over another background. Rotoscoping in the digital domain is aided by motion-tracking and onion-skinning software.
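
    As a rough illustration of the onion-skinning aid mentioned above, the sketch below blends the previous and next frames at low opacity over the current frame so a roto artist can see the motion while tracing. The frame file names are placeholders; real roto tools do this interactively rather than by writing blended images to disk.

    ```python
    # Onion skin: ghost the neighbouring frames over the current one so the
    # artist tracing a matte can see where the subject is coming from and going.
    import cv2

    prev_f = cv2.imread("frame_041.png").astype(float)
    curr_f = cv2.imread("frame_042.png").astype(float)
    next_f = cv2.imread("frame_043.png").astype(float)

    # Current frame at full weight, neighbours faded in at 20% each.
    onion = 0.6 * curr_f + 0.2 * prev_f + 0.2 * next_f
    cv2.imwrite("onion_042.png", onion.astype("uint8"))
    ```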

    The Origins of Rotoscopy


    Polish-born Max Fleischer was the staff cartoonist for The Brooklyn Daily Eagle. While there, he met John Bray, an early animator who introduced him to the world of cartoons and animation. Believing that the techniques used for animation could be improved upon, Fleischer invented the rotoscope, which used a combination of a projector and an easel. His patent was granted in 1917.

    Fleischer’s first character was a clown based on his brother Dave dressed in a costume. Although the patent was still pending, Fleischer used the technique to animate a series of short cartoons called Out of the Inkwell. Fleischer then founded Fleischer Studios, which gave us legendary cartoons like Betty Boop and Popeye! A notable example of rotoscoping used in a feature-length animation was Snow White and the Seven Dwarfs, though it was met with plenty of resistance from the artists, who felt that it hindered their work.

    The Evolution of Rotoscope Animation

    Rotoscopy wasn’t just used in animated films either. The Beatles’ Yellow Submarine (1968) notably featured rotoscoping in its Lucy in the Sky with Diamonds sequence, while Ralph Bakshi famously used roto techniques in films like Wizards (1977), The Lord of the Rings (1978) and American Pop (1981).

    The early methods of rotoscopy were time-consuming to say the least, and it wasn’t until the late 1990s that rotoscoping went digital, thanks to Bob Sabiston, an animator and computer scientist and a veteran of the Massachusetts Institute of Technology (MIT) Media Lab. He developed a computer-assisted “interpolated rotoscoping” process, which he used to create the short film Snack and Drink, and built the computer program Rotoshop, which allowed one animator to do the work of many.

    Director Richard Linklater subsequently employed Sabiston and his proprietary Rotoshop software on the full-length feature movies Waking Life (2001) and A Scanner Darkly (2006), making Linklater the first director to make a whole movie using this interpolated rotoscoping technique.

    Benefits of Rotoscope Animation

    Rotoscope animation is a technique that involves tracing over live-action footage frame by frame to create animated sequences. This technique offers several benefits in the world of animation and visual effects. Here are some key advantages of rotoscope animation:

    Realistic Movement:

    Rotoscoping animation allows animators to capture the natural movement and fluidity of real-life actors or objects. By tracing over live-action footage, animators can achieve lifelike animations with accurate proportions, weight, and timing. This technique is particularly useful when animating complex actions or human characters, as it provides a realistic foundation for the animation process.

    Time-saving:

    Rotoscope animation can be a time-saving technique compared to traditional frame-by-frame animation. By using live-action footage as a reference, animators can quickly create the initial keyframes, resulting in a more efficient workflow. Rotoscoping reduces the need for animators to start from scratch, allowing them to focus on refining and enhancing the animation instead.

    Consistency and Accuracy:

    Rotoscope animation ensures consistency and accuracy in the animation process. By tracing over live-action footage, animators can maintain the original performance or movement captured in the reference footage. This technique is beneficial for achieving precise facial expressions, body movements, or intricate actions, ensuring a high level of detail and fidelity in the final animation.

    Integration with Live-Action Footage:

    Rotoscoping allows for seamless integration of animated elements with live-action footage. By tracing over the live-action frames, animators can create animations that perfectly match the movements and perspectives of the actors or objects in the scene. This integration enhances the realism of the final composite and creates a cohesive visual experience for the audience.

    Artistic Interpretation:

    Rotoscope animation provides an opportunity for artistic interpretation. While the initial tracing process captures the realism of the live-action footage, animators can also add their creative touches, exaggerate movements, or stylize the animation. This combination of realism and artistic expression allows for unique and visually appealing animations.

    Rotoscopy Now and the Future


    Rotoscopy has now become a creative animation and VFX technique in its own right. Software programs offer new ways to carry out the entire rotoscopy process without having to use physical film, making it less time-consuming. Animators can work in multiple layers, using one layer for the digitised film image and the rest for the animation or effects that need to be inserted.

    The evolution of smaller and more powerful computers and more complex software applications means desired effects can be achieved with less effort than once required. Colours can be changed and lines blurred using modern graphics software and sophisticated technology can now also track the position of composited objects in each frame taking away much of the physical labour. As software and computer power evolves, so will the art of rotoscopy and what you can do with it.

    However, all these techniques still need skilled artists to ensure that everything is in the right place at the right time. Roto artists, as they are now known, still need immense amounts of training and discipline to bring the magic to your screens.

    Toolbox Studio thrives on its ability to deliver world-class rotoscope solutions to a range of global clients. In the last 10 years, the team has grown to include 50+ in-house VFX artists who work on industry-standard software. If you have a project that needs these competencies, we can help you achieve it. Get in touch with us now!

  • Blade Runner 2049 – A Sequel That Changed the VFX Game


    The best visual effects are those that are hard to spot. Recreating realistic effects for a movie is an art and the movie Blade Runner 2049 took this art to a whole new level. Here’s a look at what makes its VFX so special.

    The director, Denis Villeneuve, had a lot to live up to. Not only was he making a sequel (and we all know how easy it is for sequels to flop) to a much-revered sci-fi classic, but he was also aiming for some ground-breaking visual effects. The Blade Runner 2049 VFX teams delved into serious detail to make sure every special effect used was relevant and masterfully executed.

    Here’s how some of those awe-inducing visual effects were brought to life to wow audiences across the globe.

    1. The Eccentric Cityscapes

    Blade Runner 2049 is set in a dystopian Los Angeles where the climate has gone bonkers. The film needed to show rain, snow and fog through much of its outdoor footage. Taking their cue from cinematographer Roger Deakins, the Double Negative VFX team put in a whole lot of hard work to get those effects just right. To understand what was needed for the film’s cityscapes, they shut down a couple of streets in Budapest and brought in huge fans driven by V8 engines to blow wind and smoke into the scene. This gave the team a basis for what the world would look like.

    The team then spent months adding details to the buildings that you only see when the shots are close; the rest of the time, the details are obscured by the fog. However, you can rest assured that were you to strip away the layers of fog and rain effects, you would be able to see the details of the buildings – just like in real life. The team avoided using auto-populating techniques on the buildings… each one was drawn by hand!

    2. The Rain

    The VFX team at DNEG studios had to put in a lot of hard work to get the rain just right. The smoke, smog and haze combined with the different light sources meant that the team had to really concentrate on how the rain looked. Elements like the lights from the advertisements or the spinners flying by meant that the team had to ensure that the rain drops were just the right size, moved and fell in the right direction and were lit correctly. The rain shots were so intense that they took the longest to render and we are not surprised.

    3. Creating Joi

    Villeneuve wanted the audience to feel that the hologram Joi was in some ways actually real. Anything that is too effects-heavy would have taken away that slight feeling of uncertainty where the lines between the real world and the hologram become blurred. A simple 2D transparency effect wouldn’t have cut it for the more realistic look they wanted. So, DNEG came up with an effect that imagines Joi being completely hollow and transparent, with her skin and dress visible on the outside. In order to achieve this, they had a multiple witness camera setup to help with the body tracking. Joi would then be recreated in CG with the exact onset HDRI lighting. This would then be sliced down the middle and the back section “shell” would be composited on to a clean background. The live action Joi would be extracted and composited over the shell and background. To add to the feeling of realness and simultaneous transparency, Joi’s transparency was highlighted when she was in front of a bright background.

    4. Joi Syncing with Mariette

    This effect, where Joi syncs with another girl, is one that probably warrants an article for itself. As DNEG’s Paul Lambert describes it, “The effect was to have Joi overlap Mariette and come in and out of sync with her by revealing different levels of transparency. We had Mariette do her performance first and then Joi would replicate that performance. We would line up Joi at the start using a mix from the on-set video department and then she would perform. In post we had to fully 3D body track both actresses and sometimes K if they were interacting with him. Once we had the tracks we were then able to project Joi and subtly move her to be closer in sync. We wanted to keep both actresses’ performances as much as possible but there were times where Denis wanted to have a sync moment so we would animate Joi from her original performance to Mariette’s animation. A big part of the integration was shadow work. Joi receives shadows from her environment as well as casting shadows on her environment. To correctly integrate both actresses into the same space shadows casted from one to the other were rendered for both of them which were then artfully composited together.”

    The film had around 1,200 visual effects shots in all. The hard work really paid off to create a worthy sequel to the original film, giving the world a whole new level of VFX to look up to.

    To know more about what goes into creating great visual effects, check out our blog dedicated to the art of VFX. If you have a project that needs some stunning VFX, look no further. From top-notch rotoscope and vfx paint services to match move and compositing solutions, Toolbox Studio can help you achieve the kind of visual magic that you see in films like Blade Runner!

  • How Netflix’s Altered Carbon Has Pushed the Limits of VFX


    Cinemagoers are fairly used to, and may in fact expect, stunning visual effects on theatre screens. The same people watching television at home are more forgiving with regards to the quality of the effects or cinematography. It was understood that big-budget VFX was reserved for the big screen until fairly recently, when some seriously ambitious storylines and concepts came into production, with web series pushing the boundaries of storytelling, VFX worlds and concepts like never before. It’s not just the VFX that sends budgets skyrocketing though – talent can cost a bomb, too!

    Netflix is one of the streaming giants at the forefront of this revolution. Its 1980s-set, Spielberg-esque sci-fi show Stranger Things cost about $6 million per episode for the first season, with the second season needing about $8 million per episode.

    Altered Carbon is another show that Netflix was really excited to produce, and while they remained tight-lipped on the numbers, there are rumours that the show’s producers were given carte blanche on the budget – and it shows!

    What is Altered Carbon All About?

    Altered Carbon is set in a dystopian world around three hundred years into the future. Gloriously cyberpunk in nature, this show tells the story of Takeshi Kovacs, a former U.N. elite soldier turned private investigator who has been brought back to life to investigate the murder of the wealthy Laurens Bancroft. This future world has the technology to store human memories in “stacks” – disk-shaped devices implanted in the vertebrae at the back of the neck. Synthetic human bodies (called sleeves) are used as vessels that can accept any stack, and only the wealthiest of humans can afford to replace their sleeves as the old ones age. Takeshi Kovacs was one of the soldiers killed in an uprising against the new world order 250 years earlier.

    The dystopian cyberpunk nature of the story has allowed the show’s producers to include some serious visual effects to create a magnificent, complex and gory world that takes television production to new heights.

    Altered Carbon Cinematography and VFX

    Contrary to what one would imagine, a lot of the show was actually shot on locations and sets instead of relying solely on VFX. Shot in 5K, the split between in-camera footage and VFX was about 60 percent camera and 40 percent VFX, which is pretty impressive considering the premise of the show.

    While the cinematographers incorporated as much as possible into live-action shots, the burden on the teams working on the Altered Carbon VFX was still pretty high. To put it into perspective, the VFX created for an hour-long episode came to about 70 shots. DNEG TV’s Steve Moncur stated, “The expectations were very simple – it had to be cinematic. Everyone wanted to produce a series of the highest quality with effects that you would ordinarily only see in a Hollywood blockbuster”, and the Altered Carbon VFX team certainly delivered on that promise.

    Creating the World

    The Altered Carbon team tried to get as many influences from real-world cities to base their world on, looking at buildings that are so high that they almost don’t seem real. Cyberpunk movie Blade Runner, film noir The Third Man and poet Edgar Allan Poe all played a big part in creating the megalopolis of Bay City.

    Steve Moncur explains, “The San Francisco area was created using a procedural city generator, producing a city populated with over 23,000 buildings made up of over 100 custom made structures.  A large majority of the buildings had interchangeable modular sections and roof dressing to help create the scale and magnitude required for this futuristic landscape, varying from 10 metres at their lowest along the trenches of Lick Town to way above the cloud line measuring at nearly 3 kilometres high to create the meth level scenes. Linking the two areas of interest across the bay was the famous Golden Gate Bridge, built to scale but populated with over 24,000 containers to set the scene of a network of slums for the less wealthy population of the city. To help sell the chaos and business of the city, generic car traffic from background to hero vehicles were created and animated flying through the city and alongside a network of pill lines and over 135 holograms.”

    The biggest challenge while creating the world was to create interesting lighting in the city as it is completely covered in clouds so using the sun as a reference was impossible. DNEG created 1,585 shots for the show.
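
    As a purely illustrative aside, the “procedural city generator” idea described above can be sketched in a few lines: scatter instances drawn from a small library of reusable building modules and vary the height, roof dressing and position of each one. Everything here (module names, lot size, counts) is invented for illustration; DNEG’s actual generator is of course vastly more sophisticated.

    ```python
    # Toy procedural city: pick a base module per lot and randomise its dressing.
    import random

    MODULES = [f"building_module_{i:03d}" for i in range(100)]  # 100 base assets
    ROOFS = ["antenna", "vents", "helipad", "billboard", None]

    city = []
    for x in range(150):
        for y in range(150):                      # 150 x 150 = 22,500 lots
            city.append({
                "module": random.choice(MODULES),
                "roof": random.choice(ROOFS),
                "height_m": random.uniform(10, 3000),  # trench level to meth level
                "position": (x * 60, y * 60),          # 60 m lots
            })

    print(len(city), "building instances")
    ```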

    The Graphics

    Many of the graphics were handled by UK-based firm Rushes, who faced their own unique set of challenges with the series but came through admirably to create some memorable effects. A lot of thought went into the technology used by the Altered Carbon characters. For example, the tech used by the police would not be as futuristic or advanced-looking as that of the extremely rich “meths”, and this was taken into consideration when designing the templates. They made sure that the tech they designed still looked practical and functional even though it was futuristic. They managed to achieve an amazing variety in technology and design in just one show.

    The Altered Carbon web series is an R-rated show which is a treat for fans of sci-fi, cyberpunk action with a compelling storyline and great visual effects. To know more about what goes into creating great visual effects, check out our blog dedicated to the art of VFX.

  • How Marvel Redefined the VFX Industry with a Decade of Hard Work: Part I


    The MCU (Marvel Cinematic Universe) has a reputation for leaving its fans with their mouths agape after every movie. And it’s not just because of their compelling storylines, incredible star casts or creator Stan Lee’s cameos. Nearly every single movie has some ground-breaking visual effects (VFX) that had probably only been a dream until the MCU made them come to life. So, what VFX software does Marvel use to achieve such stunning visuals? It’s a question on every fan’s mind.

    Ten years after we watched the first MCU movie Iron Man in 2008, we take a look at some of the earth-shattering, mind-bending (literally) VFX creations that are still incredible a decade later. The MCU has pioneered so many amazing visual effects that we feel one simple article will not do them the justice they deserve.

    So here is the first in our series of Marvel’s marvellous creations!

    1. Iron Man

    It’s only fair to start this homage at the beginning, with Iron Man (2008), the movie that set the trend for the MCU and its VFX. The first challenge was to recreate a realistic-looking and realistically moving Iron Man suit that was true to the character created by Stan Lee, Larry Lieber and artist Don Heck in 1963. Director Jon Favreau was well known for his scepticism of CGI and wanted the Iron Man armour to move like Tony Stark was actually in it. He also wanted to include shots that showed Robert Downey Jr. in the suit to make it more realistic. It soon became clear that CGI, even with motion capture technology, simply wasn’t going to cut it.

    Now this was where things got really interesting. An Iron Man suit was actually created complete with the sleek chrome finish we now know so well. Downey often wore only parts of the suit, scenes were shot, and the rest was built using CGI.  The hard part was the tracking where the VFX teams had to match the computer generated suit parts exactly to the visible parts of Downey. At other times, they had to use shots of Downey in the full suit but make the motion look completely natural using motion capture. Reproducing the metal of the suit in CG was another challenge they had to face, especially when it starts icing up in the outer layer of the atmosphere. There were about 800 effects used in the entire movie but the most stunning effect of all was that it changed Favreau’s mind about the usefulness of CG!

    2. Skinny Steve Rogers – Captain America

    Captain America: The First Avenger introduced the sickly young man Steve Rogers, who is turned into the muscular superhero Captain America. While this progression from skinny young man to buff superhero makes sense in terms of a storyline, using a muscular actor (Chris Evans) to play the skinny young man doesn’t – unless, of course, you are the MCU. They wanted to create a believable weaker version of their superhero that looked real (not like a substitute actor playing the same character in a flashback sequence). And boy, did their dedication to detail pay off! Using the expertise of VFX company Lola Visual Effects, they managed to create the perfect skinnier, younger, non-superpowered version of Captain America.

    Here’s a shortened version of how they did it:

    • They filmed Chris Evans doing his Steve Rogers thing and then used another actor as a body double, mainly for light reference on a skinnier frame. Think about it: the shadow a muscular arm casts on the body is completely different from that of a thin one.
    • For about 5% of the shots, Chris Evans’ face was projected onto the skinnier frame.
    • The third method they used was actually taking footage of Chris Evans and then slimming him down using VFX. The actor needed to use all his skills to appear smaller for these scenes, like taking smaller steps. They also used other methods, like lowering seats, to make him appear shorter than the other actors on screen with him.

    The combination of these different methods resulted in VFX that looked completely natural, like they had actually gone back in time to film the scenes when the actor was young!

    MCU definitely deserves credit for creating worlds that come to life on the silver screen, allowing their fans to escape into uncharted territories one movie at a time. Keep an eye out for the next one in our series where we look at how (everyone’s favourite) Groot was made and a whole lot more!

    In the meantime, here’s a quick look at some of our past VFX projects, which include Hollywood blockbusters such as Jupiter Ascending, Hunger Games, and 300 Rise of an Empire.

  • Why Matchmove Is Important for Integrating CG Elements into Live-Action Footage


    Matchmove, or matchmoving, is one of the most important processes in today’s VFX pipelines. It is also referred to as motion tracking, and without it one cannot integrate 3D/CG data into live-action video. Earlier, this was a tedious, time-consuming and costly process, but with technological advancements matchmoving has become more efficient, affordable and faster – and a crucial part of the modern moviemaking and visual effects industry. Today, matchmoving is used across several mediums, including movies, television and online video content.

    Matchmoving incorporates CG elements into real-world footage. Nowadays, filmmakers use virtual cameras to capture the motions and expressions of an actor performing a character in scenes that will ultimately look very different – actions that cannot be achieved with normal cinematography. This footage is known as live-action footage, and special cameras help record all of it in 3D and in real time.

     


    This helps ensure that all the data rendered in the final video will be from the same angles as captured by the live-action cameras, and this process of matching the movement of the footage captured through the special live-action virtual camera is the beginning of the matchmoving process. The job of rendering the final video and integrating CG elements into live-action footage is done at a matchmove artist’s desk using the latest software and advanced computer systems.

    Various matchmoving applications and software packages have capabilities that are useful during this process. These applications employ tracking algorithms to lock onto specific points identified in the footage and track them across multiple frames.
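
    For a feel of what such a tracker does under the hood, here is a minimal 2D point-tracking sketch using OpenCV’s Lucas-Kanade optical flow. The clip name is a placeholder, and production trackers layer sub-pixel refinement, occlusion handling and manual keyframing on top of this basic idea.

    ```python
    # Minimal 2D tracking: detect corner features, then follow them frame to frame.
    import cv2

    cap = cv2.VideoCapture("plate.mov")            # hypothetical footage
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    # Pick high-contrast corners to follow, e.g. tracking markers on set.
    points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                     qualityLevel=0.01, minDistance=10)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                         points, None)
        points = new_points[status.flatten() == 1].reshape(-1, 1, 2)
        prev_gray = gray
        # `points` now holds the per-frame 2D paths used to pin a replacement
        # element (screen insert, billboard, etc.) onto the plate.
    ```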

    Different Forms of Matchmoving

    Generally, there are two different forms of matchmoving: two-dimensional (2D) and three-dimensional (3D).

    2D Matchmoving: This process only tracks features in a 2D environment and does not account for camera movement or distortion. If your shot doesn’t include any major changes in camera angle, then 2D tracking will work efficiently: it doesn’t solve for the camera at all, it simply examines the footage to see in which direction features are moving and keeps track of how they travel. Therefore, it has limited usage and is mainly employed for replacing or changing a 2D image in live-action footage, such as a TV screen or a billboard advertisement.

    3D Matchmoving: This is a more efficient and popular process as it allows the incorporation of 3D information into the available footage. Today, the technology has reached such competence that, with the help of a 3D animation application, one can create a virtual camera that lets the filmmaker see a real-time representation of what the final footage may look like. This is called real-time matchmoving and is becoming popular among many production studios in the film industry.


    For example, the Iron Man suit in the Marvel movies was created using 3D matchmoving software. However, the method of making the suit move as a part of Tony Stark is called tracking and rotomation. This process involves both camera tracking and object tracking.

    This process may not be perfectly accurate, but it is good enough to give the team an idea of what the film or video may look like and to make changes or corrections if necessary. Nowadays, many movies are shot in studios using green screens, and matchmove technology lets the crew know about the set extensions and CG characters that will be incorporated later.

    Now that you know the importance of matchmoving in incorporating CG elements into live-action footage, if you have a project that needs these competencies, Toolbox can help you achieve it. Click here to know more.

  • 5 frontrunners of the Oscars’ VFX category and why they deserved the nomination


    The first week of March will be a feast for movie enthusiasts; the world’s biggest cinematic spectacle, the Oscars, will dazzle one and all come March 4. While most of us will speculate about who will take home the “best film” or “best actor/actress” honours, this year will witness one of the fiercest battles ever in the CGI/special effects arena, with a lineup that promises a blend of innovative storytelling and groundbreaking visual artistry.

    So what does it take to emerge as the VFX frontrunner? Plenty of imagination, ingenuity and flawless execution, of course! According to the Academy rules, nominees for outstanding visual effects are chosen based on: (a) consideration of the contribution the visual effects make to the overall production, and (b) the artistry, skill, and fidelity achieved by the visual illusions.

    The Oscar for Best Visual Effects, 2018, is tough to predict, more so because the selection process is a layered one – the nominees are determined in three stages. First, the members of the executive committee of the branch cast preferential ballots to determine 20 films for further consideration, which is then distilled to 10 semi-finalists. All branch members are invited to a screening of excerpts from these 10 films, at which potential nominees may discuss their work. Ballots are then counted using a system of re-weighted range voting to determine the final five nominees. This meticulous, multi-stage process is why the films that make the final list are held to such a high standard.
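
    For the technically curious, here is a minimal sketch of how re-weighted range voting works in general; the ballots, film names and scoring scale below are invented for illustration, and the Academy’s exact formula may differ.

    ```python
    # Re-weighted range voting: each ballot scores every film, and a ballot's
    # influence shrinks as more of the films it scored highly get selected.
    def reweighted_range_voting(ballots, num_winners, max_score=10):
        """ballots: list of dicts mapping film -> score in [0, max_score]."""
        winners = []
        while len(winners) < num_winners:
            totals = {}
            for ballot in ballots:
                spent = sum(ballot.get(w, 0) for w in winners)
                weight = max_score / (max_score + spent)
                for film, score in ballot.items():
                    if film not in winners:
                        totals[film] = totals.get(film, 0.0) + weight * score
            winners.append(max(totals, key=totals.get))
        return winners

    ballots = [
        {"Apes": 10, "Blade Runner": 8, "Kong": 4},
        {"Blade Runner": 10, "Star Wars": 7, "GotG": 5},
        {"Apes": 9, "Star Wars": 9, "Kong": 6},
    ]
    print(reweighted_range_voting(ballots, num_winners=2))
    # -> ['Apes', 'Blade Runner']
    ```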
    Interesting, isn’t it?

    Past winners

    A quick look at some of the last winners in the category reveals that a combination of great storytelling and next-level VFX always scored big with the jury.

    Recent winners:

    • 2017 – Jungle Book
    • 2016 – Ex Machina
    • 2015 – Interstellar
    • 2014 – Gravity
    • 2013 – Life of Pi

    All the past winners have been epics in their own right, and this year, we expect another worthy one to enter the Hollywood hall of fame.

    The marquee list – 2018 nominees

    So who’s leading the VFX bandwagon, circa 2018? Here’s a list of the most likely winners – films that have set benchmarks in visual storytelling, several of which are also in the running across other categories.

    1. “War for the Planet of the Apes”


    When it comes to fantasy spectacle, some movie franchises, with their jaw-dropping CGI and VFX displays, have been looked up to as game changers; ‘The Planet of the Apes’ is one of them. After failing to win the Best Visual Effects Oscar for “Rise of the Planet of the Apes” (2011) and “Dawn of the Planet of the Apes” (2014), the third outing, ‘War for the Planet of the Apes’, is shaping up to be the most likely contender to bring home glory for 20th Century Fox’s “Apes” reboot. The odds of it winning, pegged at 8/11, are pretty high. And why not? After all, Weta Digital has drawn on its depth of expertise to translate body and facial motion to its CG models, to simulate muscles and fur, and to create further photo-real environments and effects. Also, real actors in performance-capture suits were filmed on location – including in snow and rain – and the film was edited with them ‘as is.’ With an astonishing array of digital apes led by performance-capture master Andy Serkis as Caesar, the third Apes sequel could win Weta its fifth Oscar.

    A brief look: https://www.youtube.com/watch?v=h5XQq1ulspc

    2. “Blade Runner 2049”


    With Blade Runner 2049, Rodeo has raised the benchmark of retro-futuristic technology and its holographic AI personas a few levels up. While Blade Runner – the original – belongs to a pre-CGI era, the sequel features a heavily digitized, gloomy-looking future LA cityscape, flying cars and what not. Sean Young’s replicant Rachael hasn’t aged a day in this version, and some stupendous effort has gone in to help achieve this – a combination of procedures, including bringing back Sean Young and scanning her in, combining that with archival footage from the original film to create a digital double, followed by motion-captured animation of the stand-in actress. 2049 eclipses every single filmmaking limitation and pushes the envelope of imagination much further.

    Watch some of it here: https://www.youtube.com/watch?v=nO-eU6N8_7w

    3. Star Wars: The Last Jedi


    George Lucas’ space magnum opus, the first instalment of which was introduced in 1977, has become a pop-culture phenomenon, earning millions of fans worldwide. The first seven films in the franchise have bagged Academy Award nominations, and with ‘Star Wars: The Last Jedi’, expectations are sky-high.

    With every new production, Star Wars has witnessed steady evolution in VFX. The latest release is expected to be decorated with many big moments – space battles, salt flat battles, and some pretty unique critters. The seamless blend of VFX and storytelling is what Star Wars is known for. The credit goes to overall visual effects supervisor Ben Morris.

    Sneak peek: www.hollywoodreporter.com/behind-screen/star-wars-last-jedi-vfx-behind-sinister-snoke-1069559

    4. Kong: Skull Island


    Originally debuting in a 1933 film, the legacy of King Kong is a long-standing one. With ‘Kong: Skull Island’, the mighty form has earned new, hitherto unexplored dimensions. Directed by Jordan Vogt-Roberts of Successful Alcoholics and The Kings of Summer fame, and with Warner Bros. Pictures and Legendary Pictures backing the production, this was touted as a creative masterpiece even before it was unveiled, and it has stayed true to its repute. The result? A completely new take on the creature, driven by shot-on-location aesthetics – a combination of old-school and new-school techniques. Keyframe animation was used to create scale, and motion-captured facial performances were relied upon to add personality and humanity to Kong. The mythical re-imagination is truly larger than life!

    VFX credits:

    Industrial Light & Magic (VFX Supervisor: Jeff White)
    Hybride (VFX Supervisor: Philippe Theroux)
    Shade VFX (VFX Supervisor: Mitchell S. Drain)

    5. Guardians of the Galaxy Vol 2


    In 2014, James Gunn’s Guardians of the Galaxy took the box-office by storm, carving a special space in the hearts of film fans worldwide. The reprise was expected to be more adventure-packed than the first release, and it certainly did not disappoint.

    For this rollercoaster of a movie, Framestore collaborated with VFX Supervisor Christopher Townsend to deliver a mind-boggling 620+ shots, ranging from creature work and spaceships to what is being hailed as ‘the best opening sequence in the world’ and an exhilarating space chase across the galaxies. James Gunn, the director, is a maverick with a wild bent of imagination.

    The collaboration has led to groundbreaking work, which has thrilled global audiences beyond measure. Hear it straight from the horse’s mouth:

    “What we’ve achieved is something totally unique. That is because the mind of James Gunn is a very unique place; but also the demands of the work pushed our team to create the impossible.”
    – Jonathan Fawkner, VFX Supervisor

    Digital breakdown: https://www.youtube.com/watch?v=JkqTeQHFoBY

    With so many strong contenders vying for top honours, Oscars 2018 is poised to be an extremely mouthwatering affair. Stay tuned, for this is going to be one hell of a firecracker!

  • 5 Brilliantly Created VFX Characters from Star Wars: The Last Jedi


    Star Wars: The Last Jedi (also known as Star Wars: Episode VIII – The Last Jedi) is the latest entry in the Star Wars universe. The epic space opera began with Star Wars (later retitled Star Wars: Episode IV – A New Hope) in 1977, and the rest is history. Over a period of 40 years, these movies have not only entertained us but also continually set remarkable benchmarks in filmmaking and the various departments involved in it. One of the greatest highlights of every Star Wars movie has been the use of technology to produce some of the most breathtaking visuals. These movies are also popular among fans for introducing various alien, non-human and robotic VFX characters such as Chewbacca, C-3PO, R2-D2, BB-8, Yoda, Ewoks, Jar Jar Binks, Maz Kanata, Porgs, etc.

    Likewise, The Last Jedi also features some new VFX characters that have been created using creative practical effects and groundbreaking VFX. Through this blog, we are exploring how the visual effects in Star Wars: The Last Jedi helped create these iconic VFX characters.

    Star Wars: The Last Jedi Characters

    The Porgs


    These cute little creatures were inspired by the puffins that populate the Irish island where their scenes were shot. Director Rian Johnson worked with Neal Scanlan’s creature team to create the tiny orange, black, and white Porgs, which have the face of a pug and little chicken feet. In keeping with the Star Wars tradition of using practical effects, various practical puppets were created as animatronic versions for the different Porg shots. However, the puppets weren’t giving the range of performance that the director wanted, so the team had to build CG versions too.

    The Vulptices


    The Vulptices, which look like crystal foxes, are found on the mineral planet Crait in the Star Wars universe. They play an important part in the movie, and the director had a clear vision of how the characters should look and sound. Therefore, Neal and his team designed and built an animatronic puppet, as they did for the Porgs. The team also used a trained dog with drinking straws on it to study the movement of the character. However, the crystal foxes were ultimately created fully in CG due to the level of detail their look required.

    The Fathiers


    Although these VFX characters might look like goats at first glance, they are referred to as “space horses” in The Last Jedi. Owing to their contribution to the movie, they quickly became one of the most memorable additions to the long list of characters in the franchise. Fathiers are trained to run in high-stakes races and are kept captive on the casino planet Canto Bight.

    Creating the fathiers was an intricate process because they have long ears, large eyes and goat-like faces, and they run like a cat in many ways, but at a speed of 40 to 50 miles an hour, giving the character its unique quality. The team initially used a heavy shoulder puppet for various shots and then moved on to CG and VFX for the complex action scenes. They also used special effects rigs, as the actors had to ride on the fathiers’ backs in a chase sequence.

    Snoke


    The character is played by performance-capture specialist Andy Serkis. The menacing villain was first introduced in Star Wars: The Force Awakens (2015) as a heavily stylised, 25-foot-tall hologram. However, for The Last Jedi, director Rian Johnson wanted to “bring Snoke out of the shadows and make him real”. To achieve this, the VFX team had to scale him down to 7-8 feet tall and redesign the look of his face, skin and body to create an aged, realistic human character. In order to capture the details of Snoke’s disfigured face and anatomy, the team took inspiration from real-life cases of healed scar tissue and structural deformity. Along with the research team at Industrial Light & Magic (ILM), the Creature FX team created a painted sculpture and a maquette as a starting point for rebuilding the character. However, the initial concept didn’t translate well into the CG character, and it didn’t support Andy Serkis’ forceful and powerful voice, so ILM restructured Snoke’s face, made his shoulders broader and straightened his back, and the final render was approved by director Rian Johnson.

    Yoda


    Yoda is one of the most loved characters in the Star Wars universe and has featured in all three movie trilogies made until now. While making the original trilogy (1977, 1980 & 1983), George Lucas and his team resorted to practical effects and puppetry to portray the character, with Frank Oz as the puppeteer and voice of Yoda. In the prequel trilogy (1999, 2002 & 2005), the character was mostly CG and animation. In the sequel trilogy, however, Yoda is once again a physical hand puppet (voiced again by Frank Oz) that was designed and built by Neal Scanlan’s CFX team.

    In terms of VFX work on the shots involving Yoda in The Last Jedi, the team added the Force ghost effect and put a very subtle treatment around his edges. They also made some minimal adjustments to the facial presentation in 2D and cleaned up the puppeteering rigs for the final scene.

    Apart from the aforementioned, various other unique VFX characters have been introduced in The Last Jedi and the wider Star Wars universe. A comprehensive list of all the non-human characters in the franchise – creatures, monsters, droids and more – could fill countless pages, and it is only going to grow as the universe expands through new movies, cartoons, books, TV shows and what not.