*Originally published by Little Black Book (lbbonline.com) on July 9, 2022.*
Shift Dynamics founder Kirk Slawek and Motorized Precision president Sean Brown talk LBB through the next frontier in virtual production
Over the past few years, the world of filmmaking has grappled with an enormous amount of change. Partly due to the pandemic - but mostly because of transformative new technologies - the potential and fabric of filmmaking have been revolutionised.
MP AR, a new app from the motion control software company Motorized Precision, has positioned itself at the crest of that revolutionary wave. By tapping into augmented reality, the app visualises cinema robots, products, and robot movement on an iPhone or iPad screen. As a result, directors and filmmakers across the production pipeline will have visibility over virtual production elements, including production pre-viz and real-time workflows, with an unprecedented level of convenience.
What makes the MP AR app distinctive is its ability to visualise AR objects both in front of and behind real-world objects, and place target QR keyframes at precise points to allow for greater accuracy. The result is a filmmaking process which, in the words of the app’s creators, ‘puts more power into a director’s hands, and improves communication with the rest of the crew’.
In order to show off the capabilities of the new technology, the video production experts Shift Dynamics were enlisted - along with their sister company, the virtual production-focused Arc Studios - to create a slick demo video using the MP AR app.
To find out more about the tech’s potential to remodel the craft of filmmaking, LBB spoke to Shift Dynamics founder Kirk Slawek, alongside the president of Motorized Precision Sean Brown.
Above: A video from Shift Dynamics showing MP AR in action.
LBB> Sean, first things first - how is this new app going to help filmmakers?
Sean> Has a director ever asked you if they can ‘just take the camera by hand to show you what I want’? Or taken out their iPhone to film something, with you left to interpret what they want? This is the app which solves that problem. You can literally give them a ‘director’s viewfinder’ that is filming the shot. So you can film the exact shot that’s in their head, as they make it, and then perform it back instantly on-set.
Or, if you go on location without bringing your robot or a tape measure, you can simply visualise the robot in any location around the world. You can see what the shot looks like in pre-viz, and send it to a client right from the location scout.
LBB> Kirk, how did you go about finding the best ways to show off these new features?
Kirk> From Shift Dynamics’ perspective, the power of the app is the ability to put it directly in the hands of the DP or the director. Whether or not you use the move they actually create, it accelerates the process on set and lets us capture the vision and communicate it more quickly. That’s a battle any filmmaker will be familiar with, where it can take a few hours to get everyone up to speed and sync the robot technician’s understanding of the moves and the pacing.
Now, when you can just give it to the DP and they can swing their move, we can tweak things where necessary, and it allows us to shoot more, more quickly. With the target modes, we saw real benefits in mapping out objects in the space so you have collision references, and you can perform more dynamic moves that follow the target through space - moves that would otherwise be almost impossible in a similar timeframe.
LBB> And are there any industries, product categories, or filmmaking styles for which you think these new capabilities will be especially useful?
Sean> I think in more live action scenarios, where you can animate the target using the iPad, there are a lot of creative applications to be had. Imagine you’re shooting a Disney film and there’s a character that’s not really there. You’d need the robot camera to look at it and follow it as it flies around. You can do that with the app’s moving target feature. I think those aspects of filmmaking will get a lot of joy out of MP AR.
On top of that, motion control has always played a big part in tabletop, car photography, mixed media, and VFX. If you have a digital character or any kind of special effects, this new app opens doors to making precise or handheld-feeling shots with integrated post-production tracking support. It sets up shots quickly to help accomplish big shoot ideas on a commercial production schedule.
LBB> What makes MP AR quite so useful for location scouting?
Sean> It gives you a one-to-one representation of a robot in any environment. So you can populate any location with a robot just by using your phone or an iPad. If you want to see if a robot fits in your kitchen, for example, you can simply use your phone to find out. You can also move the robot around and play it back to see how the robot moves - ensuring it doesn’t crash into any windows or anything like that!
LBB> What’s been the biggest challenge you faced in the process of making or using this app, and how did you overcome it?
Sean> There are a lot of ways to approach moving a camera for motion control, and we’ve kind of pioneered along the way. The ultimate challenge was taking the incredibly complex world of robotic movement and laying it out for users in a way which is intuitive and efficient. So coming up with the UI and ensuring it was simple, powerful, and ticked all the boxes we needed took a lot of time. That being said, we’re all delighted with the results.
Kirk> For us, the learning curve was initially challenging, having used the robots without the app for years. Adding the app workflow kind of flipped the way we had always worked!
We kept going back to the way we already knew how to program things - but stepping back, embracing the simplicity of the app, and utilising it always got everything back on track.
LBB> Looking at the industry more broadly, would you agree that we’re seeing an ever-increasing demand for content across all categories? If so, do you think this kind of virtual production technology can help filmmakers to rise to that challenge?
Sean> Yes. Increased demand has been the story of the past decade or more, I’d say. What we’ve tried to do with this app is all about putting motion control and the power to design into anybody’s hands, and I think that will be huge for the industry. Anybody who has the App Store on their phone or device can download this, see what's possible, and then find any Motorized Precision vendor across the world and collaborate with them. It’s the power to see what’s possible in a more accurate way than ever before.
Kirk> The app unlocked the ability to recreate some moves that we previously couldn’t have done in such a small amount of time. From a content perspective, everybody thinks robots give very linear, robotic types of moves. But the app totally changes that and allows robots to feel a little more floaty and less, well, robotic-y - all whilst maintaining the precision of the robot. My view is that this will open up lots of creative channels by utilising not only the app, but also robotics and different forms of media that haven’t been used before due to technology constraints.