Our most recent #mochat covered the various rendering options in Cinema 4D. From Sketch and Toon to Physical to Octane to V-Ray, participants discussed how and why they use different render engines.
Last night on MoChat we talked about the pros and cons of working on either desktops or laptops. The general consensus was that laptops are great for portability—if that’s your thing—but come render time, it’s really nice to have the power of a desktop. In the end, it comes down to personal preference. For getting the most out of your preferred rig, desktop users recommend upgrading the GPU (as time/budget allows), throwing in lots of storage, and taking advantage of the expandability desktops afford. Laptop users tend to have external monitors on their desks at home to help with the lack of screen real estate, and external drives are a must.
Earlier this week, Video Copilot released their highly anticipated Element 3D plugin. I’ve had a bit of time to play with it since this week’s MoChat, and thought I would share my first impressions. This isn’t a full review or tutorial; I haven’t had enough time to invest in learning the plugin quite yet. Rather, I’m just recording my initial thoughts while working through the interface. Where I can, I’ll link to Video Copilot’s videos for more in-depth information.
First of all, everyone should be aware of what this plugin is not: a 3D modeling environment within After Effects. It was originally conceived as a 3D object particle plugin. Even then, it does not contain a physics or dynamics engine. Element does not contain an emitter, either. Rather, it has a replicator which determines where the particles live. I’m sure people will figure out creative ways to fake it (much as people have already pushed the Ray-trace engine in CS6).
What Element is very good at is bringing static 3D objects into your scene. These objects can be separated into up to five groups, each with its own settings. If an object has multiple pieces, those pieces can be separated and dispersed using the group’s Multi-Object settings. This looks like it will be good for quick shatters or particle dispersion. One trick is to bring pieces of a model in as separate objects; assigning each piece to a different group gives you a bit more flexibility with animation. Andrew Kramer demonstrates this quite well with his gun & helicopter models.
Another great feature of Element is extruding a text layer or mask. This is where it directly competes with the Ray-trace engine in CS6, and extrusion has long been one of the more frequently requested After Effects features. Element 3D is definitely quicker than the Ray-tracer, and you have some interesting animation options with the Multi-Object settings, but you lose the ability to cast shadows or interact more directly with your AE scene. One thing to watch out for, though, is that when type is separated, the dots on i’s and j’s are also separated, as seen here.
Bringing in your own objects is also fairly straightforward. Bonus points for being able to bring in C4D projects & OBJ files. This means you’re not limited to the model packs from Video Copilot (though they are very reasonably priced). You can get models from your choice of sources, including TurboSquid and The Pixel Lab. This doesn’t mean you’ll get animated objects, cameras, lighting, etc.; you’ll just get a static object. You’ll also want to make sure you have a cleaned-up project file, since all your materials will come in, even if they’re not used.
But you’ll need to rebuild your materials within Element. Your C4D object & materials will come in as all white, at least in my tests. This makes sense, since Element is not a Cinema 4D render engine but a custom one. [Update: As David Biederbeck demonstrates, if you are using bitmapped textures, you can simply point Element to your texture files and it all works, provided you have UVW coordinates. Procedural materials (colors/shaders/noise/gradients/etc.) will not come into Element from the C4D file.]
Once you start replicating objects, you can really start to have fun. I mentioned earlier that this isn’t a physics-based particle system, but you still have a lot of control over how particles are produced and transformed, all of which can be keyframed (and subsequently controlled by expressions). Where you really get to do fun things is with the Animation Engine. This is basically a transition between two groups. The immediate use case is almost like an effector with falloff in Cinema 4D. You have control over easing, transition percentage, and even a time delay for position, rotation, scale, and material transitions.
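Since those keyframable properties accept expressions, you can drive something like the Animation Engine’s transition percentage procedurally instead of hand-keyframing it. As a rough sketch (the property being driven is illustrative, not Element’s exact parameter name), here is the kind of AE-style expression logic involved, with After Effects’ built-in `linear()` re-implemented in plain JavaScript so the math is visible:

```javascript
// Sketch of an After Effects-style expression driving a 0-100
// transition percentage over one second from the layer's in-point.
// AE provides linear() natively; it's re-implemented here to show the math.
function linear(t, tMin, tMax, vMin, vMax) {
  if (t <= tMin) return vMin;                      // clamp below the range
  if (t >= tMax) return vMax;                      // clamp above the range
  return vMin + (vMax - vMin) * (t - tMin) / (tMax - tMin);
}

// In an actual AE expression you would write:
//   linear(time, inPoint, inPoint + 1, 0, 100)
var inPoint = 0;   // layer in-point, in seconds (illustrative value)
var time = 0.5;    // current comp time, in seconds (illustrative value)
var percent = linear(time, inPoint, inPoint + 1, 0, 100);
console.log(percent); // 50 at the halfway mark
```

The same pattern works for the easing, delay, and rotation properties mentioned above, since anything keyframable in the effect can be expression-driven.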
At this point, I want to bring up a major difference between Element and the Ray-trace engine (or most any other 3D software). Element is an OpenGL environment, not a ray-tracing environment. That means objects & lighting are rendered, but light interaction is not. The biggest examples of this are the lack of reflections between particles or objects, and the complete absence of shadows.
Lighting and environments, however, are nicely handled in Element. There are several environment maps (1024×512 PNG files) included, and you can use your own as well, including HDRIs. You are also not limited to setting up lights in your AE comp. Element has pre-built lighting setups to get decent results without too much work.
When it comes to compositing, Element can even output separate passes for z-depth, normals, AO, diffuse, specular, refraction, reflection, lighting, illumination, and focus. This is one case where it would be nice to have node-based compositing in After Effects. As it stands now, to get each of these passes out of Element separately you’ll need a separate layer, each with its own instance of Element, each set to output a different channel. In a node-based system, each channel would be output from a single node without the need to duplicate layers. However, Element does have the ability to adjust the opacity of the diffuse, specular, ambient lighting, reflection, refraction, and illumination channels to do a rough composite right in the plugin.
Render, Render, Render…
Lastly, the performance of Element was very impressive. I tested it on both a 2010 8-core Mac Pro with a Radeon 5870 and a 2012 MacBook Pro with a GeForce 650M. Both machines performed very well. I actually preferred the performance of the NVIDIA card over the AMD; antialiasing in particular seemed much more accurate. But this serves as a reminder that this plugin is entirely dependent on the GPU, and renders themselves will vary between machines. This is one of the reasons Video Copilot does not recommend using the plugin in a multi-machine render.
There are a few things, though, I wouldn’t mind seeing improved: shadow support (which is apparently in the works), more intuitive saving of presets (you must right-click the model or material), custom preset paths with support for OBJ & C4D files (instead of just the .epack files), support for animated models in OBJ sequences, and the ability to expand the lighting setups to the composition.
That said, Element will definitely have a place in my workflow, mostly for bringing in singular 3D assets (logos, type) or simple Cinema 4D cloner-like animations. The ability to avoid round-tripping scenes back and forth between Cinema 4D & AE alone is worth the cost of entry. If you go for the plugin, I would at least recommend the Pro bundle to get the shaders, so you can learn how the materials system works in Element.
I have a feeling we’ll see a lot of Element renders in the coming months. Almost as a rule, a lot of the initial uses will be basic shiny spheres and particles, with a few really creative uses. Then, as time goes on, we’ll see some really interesting uses that push Element to its limits.
Octane Render is a standalone, unbiased render engine which works with most modern 3D packages. It also uses your CUDA-enabled GPU for rendering.
That sounded like a lot of jargon, so let me make this simple – Octane is fast. Very, very fast. Some people will remember a CGTalk thread where the idea of rendering purely on the GPU was dismissed as fantasy. Well, I’m happy to report that my opinion has changed completely – it’s fast, accurate, and I never want to go back!
Like Maxwell Render, LuxRender, and the new Physical renderer in Cinema 4D R13, Octane will render for as long as it can until you tell it to stop. If you’ve never used an unbiased renderer, this can take some getting used to, but it’s certainly an enjoyable way to work. What’s unique about Octane (at least to people who haven’t used the hardware acceleration in apps like 3ds Max) is that it renders every change you make directly to the viewport. What you see is literally what you get. All material changes, quality settings, etc. are rendered progressively in real time, so there’s no need to sit around waiting for your GI to calculate before you can get feedback on a material.
Did I mention that Octane is fast? Let me give you an example. At work, I use a fairly old quad-core workstation with bags of RAM. A 1080p frame with AO, some reasonably complex geometry, and soft shadows takes around three and a half minutes to render in Cinema 4D’s native renderer. In Octane I can get much higher quality (including, I might add, working AO) in about a minute and a half. Better yet, this is all done on the GPU. All of it.
Another little gem hidden away here is the option to do a little instant post-work on the fly – as far as I can tell, there is no render overhead for things like exposure changes or tonemapping.
I should mention at this point that my graphics card is very modest – an NVIDIA GTX 470. It’s certainly not a high-end workstation card, but it does a good job with both Octane and general viewport navigation. And this little sub-£180 card is better at rendering a complex 3D scene than my quad-core CPU.
So. Octane is fast, it’s interactive, and its required hardware is relatively cheap. What’s the downside?
Well – for one thing, it’s in beta. Yes, a commercial engine in working beta, so sometimes you are left with decisions that feel like least-worst options. You can find examples of these all over the fairly obscure interface, and even in experiences like making a purchase and receiving a license.
Secondly, there is no official plug-in for Cinema 4D (which I’m going to assume you use if you’re reading this post), “just” an exporter script. It’s basic, but enough to get an .obj with material separations into Octane, ready to render. It’ll also send an animation over, too. A recent email from the developers indicates that an official C4D plugin could be as much as a year away – and it will cost extra.
Finally, on the cons side, there’s the realisation that your prior efforts at shading weren’t that great. With the current system, you might as well not bother shading in C4D at all, such is the difference between a material translated into Octane and its counterpart in C4D. I’ve also not managed to find a way to export a full scene including disparate objects from C4D; instead, each component must be exported and imported as a discrete node.
Summing up: incredible power, simply mind-blowing for those with mid-range or slightly older machines and the right graphics cards. However, there’s a big trade-off here in terms of living with quirks and a greatly changed workflow.
If you have any questions regarding Octane, I’ll try my best to answer them in the comments or via Twitter. Alternatively, you could head over to Refractive Software and give the demo a try for yourself.