Blackmagic Fusion 9: Closing the Gap

I have just finished up two days of doing demos and answering questions at Blackmagic's booth on the expo floor at SIGGRAPH. I had a fine time meeting the people behind Fusion and learning a little bit more about the company. Not to mention all the compositors, editors, and graphics artists interested in the new software. There was a healthy mix of enthusiasm, skepticism and curiosity.

And let me take a break here for some disclosures: I am not a Blackmagic employee, and I am receiving no remuneration for this article. I was paid for my time working the booth, and part of the reason I was there was to promote Fusion. This article contains my own thoughts and a little bit of advocacy for a product that I hope will be successful because I am, after all, writing a book about it. The better Fusion does, the more books I am likely to sell. So yeah, I'm hoping for a financial benefit, but it's not because BMD is paying me to say nice things about them.

Some folks said, "$299? Can't possibly be worth my time, then." If they want to go on paying several thousand to a company that doesn't really seem to care about its customers, that's fine with me, but I think they're going to realize in a couple of years that they were mistaken. It's not a fatal mistake, though, because switching from Nuke to Fusion isn't really all that hard. As a junior, it only took me a couple of weeks to make the transition. It would, of course, be harder for someone with more experience, but the industry moved from Shake to Nuke with hardly a hitch. I doubt it will be all that much different to move to Fusion. And maybe Nuke will hang around longer than I expect. Maybe the proliferation of Fusion will light a fire under the Foundry, and they'll realize they need to reposition themselves or risk being overwhelmed. On the other hand, low-cost Blackmagic cameras haven't destroyed the market for ARRI or Red, so who can say what the rest of the market will do?

Let's talk about those new features, though. I want to headline this with one that I think is being overlooked by the press releases and initial reviews:

Certified Apple ProRes Export on Windows

As of Monday, it was very difficult to correctly encode ProRes on a PC. Your options were Smoke and… nope, just Smoke. There were a few other options that could make a ProRes file, most based on FFmpeg, but one of our (MuseVFX's) clients pointed out a very subtle color shift that was common to every solution we found. As a result, they rejected our deliverables, forcing us to use one of the producers' Mac desktops to do all of our final encoding. Fusion Studio 9 is certified to export ProRes on Windows and Linux. Since it's easily scriptable and has interface options with both Resolve and Avid, Fusion has just become a must-have tool for any Windows-based post-production facility. Even if the only thing it is ever used for is encoding ProRes, it's only $299 for a permanent license. Smoke's going to cost $1500 per year. I don't know if Cinec is Apple certified, but it costs about $1000, depending on the exchange rate from Euros, for the version that supports 4K. Or you could add a Mac computer dedicated to that sole task at a minimum cost of $800, plus whatever extra IT will cost you to support a second operating system.
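Since Fusion is scriptable, that encode step can be driven from a small batch helper. Here's a minimal sketch of the non-Fusion half of such a tool: it just pairs each rendered shot directory with a ProRes .mov destination. Everything here—the function name, the directory layout, the extensions—is my own assumption for illustration; the actual Loader/Saver setup would happen inside Fusion's scripting environment, which I've left as a comment.

```python
from pathlib import Path

def prores_jobs(src_dir, dst_dir, exts=(".exr", ".dpx")):
    """Pair each shot directory containing rendered frames with a .mov target.

    Hypothetical helper: src_dir holds one subdirectory per shot, and we only
    queue shots that actually have frames on disk. The returned (shot, mov)
    pairs would then be fed to a FusionScript that builds a Loader -> Saver
    comp per job and renders it.
    """
    src_dir, dst_dir = Path(src_dir), Path(dst_dir)
    jobs = []
    for shot in sorted(p for p in src_dir.iterdir() if p.is_dir()):
        frames = [f for f in shot.iterdir() if f.suffix.lower() in exts]
        if frames:  # skip shots with nothing rendered yet
            jobs.append((shot, dst_dir / (shot.name + ".mov")))
    return jobs
```

The point is less the code than the workflow: once the delivery pass is a script, re-encoding an entire show after a client note is one command instead of an afternoon at a borrowed Mac.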

If you're already on Mac, then this isn't a big deal, but there are plenty of houses that run Windows or Linux, and for us, it's a huge benefit.

New Trackers

Yes, we all knew they were going to do it, so it's not terribly surprising that Fusion now has a camera and planar tracker. Now, I haven't used Nuke's trackers since 2012 or so, so I don't know how well they operate now. At that time, though, the planar tracker was very difficult to understand and use. I imagine it's better now, but if we compare first releases, Fusion's trackers are dead simple. I don't have the time at the moment to demonstrate, but I'll try to find some time in the near future to put up a tutorial or two. I'm sure Vito will have something to show you on that front at some point, too.

The new trackers aren't going to replace dedicated software. Mocha can do some magical stuff—I haven't had time yet to see if Fusion can match it, but my guess is that we'll keep Mocha in our pipelines for a while. PFTrack and its competitors have some additional features that just aren't going to be matched in a compositing program (at this time). If you want geometry tracking or facial performance capture, Fusion isn't going to do that for you. Focal length estimation, extracting lens distortion parameters, multi-camera solves, rapid mesh-building from point clouds—these are all things that make it worth having at least one seat of a dedicated tracker available. On the other hand, if the camera tracker is in your compositor, you don't always need a rock-solid track because you have tons of tools at your disposal to fix problems with other methods.

And again, that's $300. A bit cheaper than Mocha or Syntheyes, and far cheaper than PFTrack or 3DEqualizer.

The New Playback System

Fusion Studio now has a playback/framecycler available through its Bins system. Honestly, I haven't spent much time with the Bins. I'll be dipping into it more seriously when I write my chapter about Fusion in the pipeline. I did try to get used to the previous playback system, Generation AM, and it never quite lived up to expectations. The new player still lacks a lot of features, in my opinion, but I think it's at least on the right track to become something more useful than Generation was. We'll see. It does offer SDI output through a Blackmagic DeckLink card, and it can apparently be synced between multiple instances on the same local network, so an artist doesn't actually have to get up from their desk to discuss their notes. Is that a benefit? Honestly, I think most compositors actually need the exercise of getting up and going to the conference room occasionally!

Anyway, I'll wait until our pipeline guy has had a chance to play with the new player and see how it integrates with our pipeline before I say much more than that.

More GPU Acceleration

The engineering team claims that every tool has been rewritten to run on the GPU, potentially making Fusion even faster than it was before. I intend to do some rigorous testing on that point, both here at home, where I have an Nvidia 960, and at work, where I have a pair of Titan X's. I can say that they definitely changed something in tools that were otherwise unchanged because I found a bug (already fixed during the beta period) in the CustomVertex3D node. So they were obviously tinkering with the code in some fashion. Stay tuned for the results of my testing.

VR Tools

There are really only two truly significant pieces of the new VR suite: The spherical camera (which, yes, can do stereo), and direct integration with a VR Head Mounted Display (HMD). The other tools are fairly standard stuff for reprojection. Extremely useful and very fast, but nothing groundbreaking. It's nice to be able to reproject, paint, then restore the lat-long image right there in the compositor. No more round-tripping through Photoshop to remove the tripod from your pano images.

It is absolutely wonderful to be able to put on my Oculus Rift and see the actual quality of my composite in a spherical environment. Sure, you can use the 360 mode in the Viewer to look at your pano (also new in version 9), but nothing beats putting on the visor and seeing it as the audience will see it. And now you don't have to send it out to Unity or something to be able to look at your work. On the other hand, with the HMD covering your face, you can't see your workspace, so you can't actually do any work while you're in there. It would be great if there were a way to put a stream of your desktop on a 3d card—then you could actually work with your flow to some degree while in the VR environment.

The spherical camera is obviously a crucial component of any VR compositing work, but it also has at least one application in normal 2d work. Previously, you could fake reflections in your scene with a six-camera rig rendering a cube map. Not only did that involve at least seven Renderer3D nodes (one for each camera, and one more for the final scene), it also had the usual problem where the corners of the cube map recede unnaturally. Replace that six-camera rig with a single spherical camera, and both of those issues go away. It's still no substitute for an actual raytracer, but it's at least a little better than it was.
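For reference, the lat-long (equirectangular) image a spherical camera produces is just a longitude/latitude unwrap of every view direction around the camera, which is why one render can replace the whole cube-map rig. Here's a minimal sketch of that mapping; the axis conventions (+Z forward, +Y up) are my assumption for the example, not necessarily what Fusion uses internally.

```python
import math

def latlong_uv(x, y, z):
    """Map a 3D view direction to equirectangular (lat-long) UV in [0, 1].

    Assumed conventions: +Z is forward, +Y is up; u wraps with longitude,
    and v runs from the top of frame (straight up) to the bottom.
    """
    lon = math.atan2(x, z)                            # -pi..pi around the up axis
    lat = math.asin(y / math.sqrt(x*x + y*y + z*z))   # -pi/2..pi/2 toward the poles
    u = lon / (2 * math.pi) + 0.5                     # forward lands at frame center
    v = 0.5 - lat / math.pi                           # top row of the image is "up"
    return u, v
```

This also makes the cube-map artifact easy to see: a cube's corner directions are farther from the face planes than the face centers are, while the lat-long unwrap samples every direction at the same radius, so nothing "recedes."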

Now that I can afford a Studio license, and since I already have the Oculus at home, I may well do some more writing about VR in Fusion in the relatively near future. Maybe it will show up as an appendix or supplemental online material for the book.

Delta Keyer

On the one hand, it's great that Fusion finally has a good color difference keyer in its tool set (and it is good; maybe even a little better than Keylight, though that will take some testing to quantify). On the other, I did spend quite a bit of time on my own color difference keyer, the need for which has now been obviated.

Delta Keyer has some cool workflow features. Instead of having to mix multiple keys with a keymix or something of that sort, you can actually chain multiple nodes together, and they concatenate their functions. I need to do a little bit of experimenting to figure out exactly how that works, but it's a clever idea that should make keying a slightly less annoying problem. There is also a new cleanplate estimator that lets you use Delta Keyer a little bit like Nuke's Image Based Keyer. Create the estimated cleanplate and plug it into the Delta Keyer, and the screen will be subtracted, resulting in a cleaner key across an unevenly lit screen.
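For readers new to the terminology, here's the textbook math behind a color difference key with screen subtraction, boiled down to a single pixel. This is my own simplification of the classic technique, not Delta Keyer's actual internals: the matte measures how much the green channel exceeds the others, and the screen's contribution is then subtracted in proportion to the matte rather than simply post-multiplied away.

```python
def delta_style_key(r, g, b, screen=(0.0, 1.0, 0.0)):
    """One-pixel sketch of a green-screen color difference key (0-1 floats).

    Textbook operation only (a simplification, not Fusion's algorithm):
    returns (matte, despilled_rgb) for the pixel.
    """
    sr, sg, sb = screen
    # Matte: 1 on foreground, 0 on pure screen, partial on edges/transparency.
    matte = max(0.0, min(1.0, 1.0 - (g - max(r, b))))
    # Screen subtraction: remove the screen color weighted by (1 - matte)
    # instead of post-multiplying, which is what preserves soft edge detail.
    inv = 1.0 - matte
    out = (max(0.0, r - sr * inv),
           max(0.0, g - sg * inv),
           max(0.0, b - sb * inv))
    return matte, out
```

A cleanplate fits naturally into this picture: feed a per-pixel estimate of the screen color in as `screen`, and an unevenly lit backing subtracts out correctly instead of leaving a blotchy matte.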

They also quietly added a new mode to UltraKeyer. It now uses a screen subtraction mode instead of a post-multiply to knock out the background. This results in cleaner, higher quality edges. I'm not sure what circumstances would make me reach for UltraKeyer instead of Delta Keyer, but it's good to know that it has received that much-needed improvement.

Most of the exciting new features are only available in the Studio version, but since the price has been dropped to $299, that's not as big a deal as it was before. Heck, one Red Giant plug-in for After Effects costs more than that! That lowered price also covers all of the things that were previously Studio-only features: unlimited network rendering, external scripting, and the optical flow and stereo disparity tools. Oh, and if you already have a dongle for version 7 or 8, the update to 9 is free. Just download it from the Blackmagic website, and your dongle should authorize the new version. In fact, if you order Fusion 9 right now, you'll probably receive version 8. Don't complain: you're saving the planet by not wasting all that packaging. Or something.

So, assuming I don't stumble over any more critical new bugs, I'm pretty happy with the way Fusion is shaping up. We've got the most-requested new features in addition to a few that I didn't even know I wanted. I hope that they spend some time in the near future fixing the outstanding problems that have been documented the last couple of years (that thing with the spotlights is particularly troublesome). And I doubly hope that the industry embraces the software. Because I want to sell a lot of books!

5 Comments

      1. i shall start reading your chapters at once. i presume most of what is in F8 will still apply to F9, and if not i imagine you will be updating where necessary.

      2. You are correct. Fu9 didn't change much in the existing tools, so nothing should be invalidated. I'll do some extensive revisions with better example materials in the next draft, and any new workflows or tools will get slotted in where appropriate. The keying chapter will change the most, I think. It needed to be redone even before DeltaKeyer was released.

      3. i am very excited about the delta keyer, as i am going to have to do masses of keying – practically every shot in our feature film, in fact.
        also, if the camera and planar trackers are effective and not too time consuming i will be able to consider more complex shots. i love tracking shots (eg The West Wing), but if they involve difficult solving then a shoestring film such as ours cannot be so indulgent. i will also have to replace lots of screens, monitors and dials, and again i would rather have moving shots than static. i am extremely keen to see your examples as soon as they are ready!
