Chalk Warfare, created by Sam Wickert & Eric Leigh, directed by Sam Wickert and produced by Micah Malinics, is our second indie project in this series. Chalk Warfare 4.0 was financed solely through the generous support of several key manufacturers and depends on the goodwill of friends. It's bigger, longer, and dustier than anything the team has done before.
Chalk Warfare 4.0
The concept of Chalk Warfare, developed by SoKrispyMedia, is simple: teams of friends draw chalk guns inspired by popular games and media, the drawings become real when pulled off the wall, and war begins. The first installment, 10 years ago, was limited to four people in a park. In the third episode, twelve players battled it out over a post-apocalyptic storage set and drew objects such as space portals, chalk drones, energy weapons and even an Iron Man suit. Chalk Warfare 4.0 reunites the team with old and new members in a massive new battle, produced for a YouTube audience as a fun platform for the team's creativity and technical skills.
In a truly collaborative style, Chalk Warfare not only incorporated wacky sets and sequences, but also combined a lot of computer animation work from director/VFX supervisor Sam Wickert and in-house visual effects artist Brendan Forde, as well as a variety of artists from around the country. The work included extensive 3D tracking, particle FX and fluid dynamics.
Chalk Warfare 4.0 took production to a new level, with filming in Los Angeles and South Carolina as well as an aerial unit for a skydiving sequence. "Actual production began about a year ago in California with the warehouse section," recalls Wickert. The team then shot the exterior scenes in South Carolina, and the final shoot was the skydiving sequence. In total there were about five to six shoot days spread over months, plus many smaller shoots to capture elements and cutaway pieces such as falling debris. The team only ever had half the cast at once, as half were in the warehouse while the others played their roles in rural Carolina. Even in the skydiving sequence, the same eight people doubled up, changing helmets so it looked as if 16 people were jumping.
The guns are made by filming the actors with cardboard or flat plastic gun cutouts, which are then tracked and replaced with chalk versions of the same shape. This means the entire project required an enormous amount of object and camera tracking. The film was shot mainly with prime lenses. Once the team had landed after the parachute sequence, about 80% of the footage was filmed on a rig and about 20% on a Glidecam. Typically, a close-range combat sequence was shot wide, at under 20mm, with the main shots usually closer to a 28mm lens.
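As a rough illustration of why focal length matters for that tracking and replacement work, a pinhole-camera sketch (the sensor width, resolution and point positions are invented for demonstration) shows how the lens controls where a tracked point lands on the frame:

```python
def project_point(point_3d, focal_mm, sensor_w_mm, image_w_px):
    """Project a camera-space 3D point to a horizontal pixel
    coordinate with a simple pinhole model (camera looks down +Z)."""
    x, y, z = point_3d
    # express focal length in pixels for this sensor/resolution
    f_px = focal_mm / sensor_w_mm * image_w_px
    return f_px * x / z + image_w_px / 2

# The same tracked marker, seen through a 20mm vs a 28mm lens:
# the longer lens magnifies the scene, pushing the marker further
# from the image centre (illustrative numbers only).
u_wide = project_point((0.5, 0.0, 2.0), 20.0, 25.0, 4096)
u_tele = project_point((0.5, 0.0, 2.0), 28.0, 25.0, 4096)
```

A tracked chalk gun is replaced by solving this projection in reverse for every frame: the camera solve recovers `f_px` and the camera pose, so the CG chalk asset can be rendered from the same viewpoint.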
The team shot with the URSA Mini Pro 4.6K G2 and the Pocket Cinema Camera 4K. "We'd use a two-camera setup to double-cover the flagship action scenes," Malinics said, "and sometimes even split into two units to maximize the day, as we only had a day or two at some of our locations to capture everything. The URSA Mini Pro also allowed us to use some of the cinema-quality glass and peripherals we already had access to, which continued to help us raise the production value of this project. We also used the Pocket Cinema Camera 4K for a lot of our behind-the-scenes footage. It's simple and versatile, and its footage cuts well with the URSA Mini Pro's."
Shama Mrema on the water tower
One of the many interesting aspects of this project was the team's engagement with the audience before and during production. For example, the team asked its fans to suggest weapons, and the fans even contributed an important dialogue suggestion. "In the movie, I call Shama (Mrema), who is on top of the water tower, and ask him if he's ready. Shama replies, 'Locked and loaded.' I then posted a picture of it on social media. Well, that was the actual line, but one of our followers posted the comment, 'Chalk and loaded.' And I thought, 'Oh my god, why didn't we think of that?' So we re-dubbed it, and it fits perfectly. That's exactly the kind of participation we were after."
Forde mostly focused on the massive task of tracking guns, while Wickert handled the heavier 3D sequences and chalk integration. All materials were assembled in DaVinci Resolve Fusion. "Fusion's node-based workflow has been incredibly helpful," said Forde. "On a project like this, with so many shots using similar compositing methods, we were able to use the node workflow to easily swap assets in and out, and track data to replace weapons and add them to new shots."
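The asset-swapping Forde describes can be sketched with a toy node graph (class and asset names are invented; real Fusion nodes are far richer): because each node just pulls from its inputs, replacing one upstream Loader's asset reuses every downstream node unchanged.

```python
class Node:
    """A graph node that applies a function to its inputs' outputs."""
    def __init__(self, fn, *inputs):
        self.fn, self.inputs = fn, inputs

    def evaluate(self):
        return self.fn(*(n.evaluate() for n in self.inputs))

class Loader(Node):
    """A leaf node that simply yields an asset."""
    def __init__(self, asset):
        self.asset = asset

    def evaluate(self):
        return self.asset

plate  = Loader("tracked_plate_042")   # hypothetical shot plate
weapon = Loader("chalk_rifle_v1")      # hypothetical weapon asset
merge  = Node(lambda bg, fg: f"{bg} + {fg}", plate, weapon)

before = merge.evaluate()
weapon.asset = "chalk_rifle_v2"        # swap the asset in place
after = merge.evaluate()               # downstream graph untouched
```

The same idea scales to shot templates: one compositing tree with tracked data and weapon assets swapped per shot.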
This project differed from earlier Chalk Warfare projects in two important ways. First, the scope was much larger, so the team relied on far more advance planning, especially for the complex skydiving opening. The second difference reflected technological advances: Brendan Forde built a much more robust tracking and post pipeline using tools like Frame.io, so the team could work in a distributed manner and still be highly productive.
Not only did the post process need to be tightened, but 16 people fought in this film, drastically more than in the previous Chalk Warfare episodes. "There were four teams of four, and bringing all the actors together was one of the big hurdles," says Wickert. "Especially since this time we had some people who are well-known influencers, particularly in the California section, with higher profiles and much harder schedules." Wickert points out that most of the cast and crew are personal friends. Indie productions like this rely heavily on the generosity of friends who work on them only because they want to and because they love the previous videos.
Above: the RC camera car and the chalk robot car as it appeared in the film.
Cool new technology
Among the innovations used in Chalk Warfare 4.0, a few advances were particularly interesting for the team: the new chalk particles, the use of LIDAR, and new virtual cinematography.
The chalk effects in this movie were significantly enhanced using full particle simulation. The team used tyFlow, a new particle system from Tyson Ibele. It is an Autodesk 3ds Max plugin, currently in open beta, and one of its main strengths is that everything is multithreaded. Wickert points out: "I actually bought a new computer with a lot more cores in order to use tyFlow better and carry out even crazier particle simulations for elements such as the water disintegration." The team tried cloud computing systems, but found cloud solutions for the most part problematic, given the large sim cache files and their own internet bandwidth and data transfer speeds.
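The appeal of a fully multithreaded solver is that most per-particle updates are independent, so they split cleanly across cores. A minimal sketch of that chunked update in Python (illustrative only — tyFlow is a 3ds Max plugin, and CPython threads would need the GIL released for a real speedup):

```python
from concurrent.futures import ThreadPoolExecutor

def advance(chunk, dt=0.04, gravity=-9.8):
    """Advance one chunk of (position, velocity) particle pairs
    by a single explicit Euler step."""
    return [(p + v * dt, v + gravity * dt) for p, v in chunk]

def step(particles, workers=4):
    """Split the particle list into per-worker chunks and advance
    them in parallel; each chunk is independent, which is why
    particle solvers scale across many cores."""
    size = max(1, len(particles) // workers)
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        out = []
        for result in pool.map(advance, chunks):  # map preserves order
            out.extend(result)
        return out

particles = [(float(i), 0.0) for i in range(8)]  # toy initial state
particles = step(particles)
```

Inter-particle effects such as collisions reintroduce coupling between chunks, which is where a production solver's engineering effort goes.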
tyFlow uses the latest PhysX SDK rigid body simulations and works as a fast OpenCL-accelerated solver that can simulate a wide variety of materials. "It's amazing what Tyson did, and he was very helpful; I was in direct contact with him throughout the production," commented Wickert. "The product can compete with some of the high-end particle systems like Houdini. It works right inside 3ds Max and brings some incredible features that Max couldn't really offer before."
tyFlow was also used in the sequence where water neutralizes the robot, though the actual water simulation was carried out with Phoenix FD. "The water sim was actually done with Phoenix, and our renderer was V-Ray. A great guy and friend, Allan McKay, helped us with this water simulation and the water rush," says Wickert. McKay provided the water cache, and Wickert took care of the disintegration of the robot, as well as the disintegration of the weapon it is holding as it falls apart.
While bullet strikes and close-ups of the weapons worked well with the chalk dust particles, the fine dust and particles are not visible beyond a certain distance from chalk objects. This made props built to look like chalk difficult to block and light. Once the fine detail is lost to distance from the camera, the chalk props become nothing but diffuse, flat colors that look remarkably strange and are difficult to sit into the shot. "That was a big deal in shading the robot. Honestly, that's one of the big things we found. So we decided we just had to do it with millions of particles. It was the easiest way to replicate chalk," says Forde.
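Forde's observation has a simple geometric cause: a chalk grain's on-screen size shrinks linearly with distance, and once it drops below roughly a pixel it averages into flat colour. A back-of-envelope sketch (pinhole model; the grain size, lens and sensor constants are all invented for illustration):

```python
def grain_px(grain_size_m, distance_m, focal_mm=28.0,
             sensor_w_mm=25.0, image_w_px=4096):
    """On-screen size in pixels of one chalk grain, pinhole model."""
    f_px = focal_mm / sensor_w_mm * image_w_px  # focal length in pixels
    return f_px * grain_size_m / distance_m

close = grain_px(0.001, 1.5)   # a few pixels: granular detail reads
far   = grain_px(0.001, 30.0)  # well under a pixel: flat colour
```

This is why rendering the distant robot as millions of actual particles works: enough grains straddle each pixel that the shading regains the granular break-up real chalk has.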
One of the most unexpected aspects of the South Carolina shoot was the scanning of buildings. The team needed photogrammetry of the remote location. Outside the old mill, some contractors were renovating one of the buildings the team shot at. "And one of the gentlemen who happened to be passing by was LIDAR-scanning the entire area. He asked if we wanted the files," recalls Wickert. "So we were happy to have hundreds of gigabytes of LIDAR scans of this entire environment." These scans were then combined with the team's own still images to solve the exterior location for visual effects.
Virtual real-time cinematography
For the scene in which the chalk robot breaks through a wall, the team opted for virtual production and virtual cinematography. Wickert has been using different types of real-time virtual cameras since 2017, so the team "filmed" the sequence with a live real-time virtual camera. "Things like the Unreal Engine are just so cool. We're excited to see how these powerful tools become available and get into the hands of people like us," says Wickert. "I'm excited to see talented people get inspired and take the time to download the software and learn. You no longer have those tens-of-thousands-of-dollars barriers holding you back from making this stuff. Anyone can download and use it."
The footage was not transcoded; the raw footage was brought directly into DaVinci Resolve Studio, giving the team a lot of flexibility in the dynamic range of the material. Most of the post-production was completed in Resolve, which Wickert found quick and efficient. "One of the major problems we've faced with software and editors in the past is that the software wasn't using the full power of our computers. Just because you have great hardware doesn't mean your post-production software is using it. This film combines a lot of live action with full-CG sequences containing no practical footage (other than assets), so having a fast, multithreaded, efficient editor that could handle all of this was a must. Resolve was up to the task and really made our workflow more efficient."