Loading a LUT with DaVinci Resolve: Part 4
How I use LUTs on set
My workflow on set is to load all the 3D LUTs that I have created for a feature or TV pilot into my Flanders CM250 OLED lighting monitor. The monitor can hold up to 16 LUTs. I designed a total of sixteen LUTs on Need for Speed, four LUTs for Fathers and Daughters, and fourteen so far on my current project. Each LUT has to be tailored to your specific camera's sensor to be as accurate as possible.
Many people have said that the new RED LUTs I created look amazing on the BMCC and the BMPC. It all comes down to color science, contrast and personal preference. Finding your visual rhythm is essential with these lighting and exposure tools.
If you want to use the LUTs I have created for other cameras in post, you can easily do so in Resolve. All you have to do is take the .CUBE file I have created and put it into the folder where Resolve stores its .CUBE files.
When you open Resolve and go into your Color Management settings, you will again follow the steps outlined in the “A few things to consider before getting into Resolve” section of this post.
First, select the settings button in the bottom left-hand corner of Resolve.
Next, go to “Lookup Tables.”
Go to “Open LUT Folder.” (This will automatically open the folder where LUTs are saved, on both Mac and PC.)
Copy your .CUBE file into the LUT folder. You can drop it in directly, or create a folder with a specific name to keep your LUTs organized; that folder will appear in Resolve. (If you would rather script this copy step, see the sketch after these steps.)
Hit “Update Lists.”
Then hit “Apply” in the bottom right-hand corner of this window. Always hit “Apply” in Resolve.
Your LUT will appear in this list, just like the “Blackmagic URSA LUT_1.RCR URSA 1.9.10_1_2014=12=-5+1238_C0000” .CUBE file we created.
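If you would rather script that copy step, here is a minimal Python sketch. The folder locations and the file and sub-folder names below are assumptions based on default installs, so confirm them against whatever “Open LUT Folder” shows on your machine.

```python
# Minimal sketch: copy a .CUBE file into DaVinci Resolve's LUT folder.
# The folder paths below are assumptions based on default installs;
# confirm them with "Open LUT Folder" in Resolve's settings.
import platform
import shutil
from pathlib import Path

def resolve_lut_folder() -> Path:
    """Return the assumed default Resolve LUT folder for this OS."""
    if platform.system() == "Darwin":  # macOS
        return Path("/Library/Application Support/Blackmagic Design/DaVinci Resolve/LUT")
    if platform.system() == "Windows":
        return Path(r"C:\ProgramData\Blackmagic Design\DaVinci Resolve\Support\LUT")
    # Assumed default for a standard Linux install
    return Path("/opt/resolve/LUT")

def install_lut(cube_file: str, subfolder: str = "My_LUTs") -> Path:
    """Copy a .CUBE file into an organizing subfolder of the LUT directory."""
    destination = resolve_lut_folder() / subfolder
    destination.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy2(cube_file, destination))

if __name__ == "__main__":
    installed = install_lut("Blackmagic_URSA_LUT_1.cube")  # hypothetical file name
    print(f"Copied LUT to {installed} -- now hit 'Update Lists' and 'Apply' in Resolve.")
```

After running something like this, the new folder and LUT show up once you hit “Update Lists.”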
Loading LUTs into your devices
I say devices because there are many different places you can load a LUT into now. Below are a few that I have used on set or am currently using, and all of the LUTs I have created will work with them.
With Atomos Shogun
Atomos has begun to step up their game, and I really like what I see out of their Shogun Monitor/Recorder. Atomos just updated the Shogun to support larger file recording with the Sony FS7 and FS700, and they are working on more updates now.
For certain applications at Hurlbut Visuals, we like to use the Shogun with the C300; it allows us to stay in a streamlined ProRes workflow when needed. I’m looking forward to seeing future updates for Canon RAW on the C500. It’s a great monitor for my crew to use for focus pulling while we are on the MoVI, or even as a director’s viewfinder when we go wireless. Having a LUT built right into the monitor is more essential now than ever before. The ability to set up this monitor so the LUT can be turned on or off on the recording and on a standard output over 12G-SDI and HDMI is solid…. 12-Gs baby…. 12.
Download the latest Atomos Firmware: http://atomos.com/support/
With Teradek Bolt
One of the latest devices I have been loading LUTs into is the Teradek Bolt. I’ve been using the Teradek Bolt 2000 and 600 and have had great success with both, sending video with and without LUTs embedded into the system. This baby has been rock solid as we have been rolling with multiple MoVIs for the last few months. Here is how you can load your custom LUTs into your Teradek Bolt.
With SmallHD DP7 Pro
While I was shooting Need for Speed, I used the SmallHD DP7 Pro on every shot. Here is how you load LUTs into a DP7 Pro monitor.
With a Flanders Monitor
The Flanders CM250 has become my lighting monitor on set over the last few years. Getting a 3D LUT onto any Flanders takes one additional step: converting your .CUBE file to a .DAT file. There are two pieces of software endorsed by Flanders Scientific for use with their monitors. One is fairly expensive, around $500. The other, $150. Yeah, $150 baby.
This is the workflow for loading a LUT onto all of the Flanders Monitors, every model.
My team in New Orleans used the $150 program, Lattice. This software will let you convert your Resolve .CUBE to a Flanders .DAT file.
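Lattice handles the actual .DAT conversion, but it never hurts to sanity-check the .CUBE before you feed it in. Here is a small Python sketch based on the standard .CUBE layout (a LUT_3D_SIZE header followed by size-cubed rows of R G B values); the file name is just a placeholder.

```python
# Minimal sketch: sanity-check a Resolve .CUBE file before converting it
# (e.g. with Lattice) to a Flanders .DAT. Assumes the standard .CUBE layout:
# an optional TITLE, a LUT_3D_SIZE header, then size**3 rows of "R G B" floats.
from pathlib import Path

def check_cube(path: str) -> int:
    size = None
    rows = 0
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if line.upper().startswith("LUT_3D_SIZE"):
            size = int(line.split()[1])
        elif line[0].isdigit() or line[0] in "+-.":
            r, g, b = (float(v) for v in line.split())  # each data row must hold exactly 3 numbers
            rows += 1
    if size is None:
        raise ValueError("No LUT_3D_SIZE header found -- not a 3D .CUBE file?")
    if rows != size ** 3:
        raise ValueError(f"Expected {size ** 3} RGB rows for a {size}x{size}x{size} LUT, found {rows}")
    return size

if __name__ == "__main__":
    size = check_cube("Blackmagic_URSA_LUT_1.cube")  # hypothetical file name
    print(f"Looks good: {size}x{size}x{size} 3D LUT, ready for conversion.")
```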
Once you have converted the LUT file, you are ready to load it onto the monitor. At the 55-second mark of the video, you will see how to connect to your Flanders monitor.
Keeping the look consistent
One of the biggest reasons I use LUTs on my monitors is to keep a very consistent look from beginning to end. You know that when you apply that specific LUT, it will shape the color and contrast and keep you on the right track to exposing the image the way you want. Especially with changing weather conditions and moving in and out of the sun, these LUTs are essential.
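If you are curious what a monitor or Resolve is actually doing with that .CUBE data, here is a small sketch of the lookup, using numpy and assuming the LUT has already been parsed into a size x size x size x 3 array. Real hardware often uses tetrahedral rather than trilinear interpolation, but the idea is the same: the same table applied to every frame gives you the same color and contrast everywhere.

```python
# Minimal sketch: apply a 3D LUT to an image with trilinear interpolation,
# roughly the kind of lookup a monitor performs once you load the LUT into it.
# Assumes the LUT has been parsed from a .CUBE into shape (size, size, size, 3),
# indexed [blue][green][red] because .CUBE rows list red varying fastest.
import numpy as np

def apply_lut(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """image: float RGB in 0..1, shape (..., 3). lut: (N, N, N, 3)."""
    n = lut.shape[0]
    # Scale pixel values onto the LUT grid and split into cell index + fraction.
    scaled = np.clip(image, 0.0, 1.0) * (n - 1)
    idx = np.minimum(scaled.astype(int), n - 2)   # lower corner of the LUT cell
    frac = scaled - idx                           # position inside the cell
    r0, g0, b0 = idx[..., 0], idx[..., 1], idx[..., 2]
    fr, fg, fb = (frac[..., c:c + 1] for c in range(3))
    out = np.zeros_like(image)
    # Blend the 8 surrounding LUT entries (trilinear interpolation).
    for dr, dg, db in np.ndindex(2, 2, 2):
        weight = ((fr if dr else 1 - fr) *
                  (fg if dg else 1 - fg) *
                  (fb if db else 1 - fb))
        out += weight * lut[b0 + db, g0 + dg, r0 + dr]
    return out

if __name__ == "__main__":
    # A neutral "identity" LUT should leave the image untouched.
    n = 17
    ramp = np.linspace(0.0, 1.0, n)
    identity = np.stack(np.meshgrid(ramp, ramp, ramp, indexing="ij"), axis=-1)[..., ::-1]
    pixels = np.random.rand(4, 4, 3)
    assert np.allclose(apply_lut(pixels, identity), pixels, atol=1e-6)
```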
Wrapping Up
The LUT is the new extension of the chemical process of film stock. I used to have all these cocktails: baking film stock for 30 minutes at 300 degrees and then reducing to 225 for four hours. HA HA! I shot on Magnetic Sound recording film stock. Yes, you can get an image from that. At 80 ASA, it is some of the best black-and-white imagery I have ever seen. These were my tools. Now the future holds a limitless amount of colors, tones and controls that will push your creativity to the next level. I have spent countless hours in color correction with DI colorists, designing LUTs to bring the best out of the RED EPIC Dragon sensor, Canon C500, C300, C100, C100 MKII and ARRI Alexa. Grab a LUT or design your own. You now have the playbook!
What LUTs do you like to use?
How have you used them to help create your look?
Take a look at Parts 1, 2 and 3 of this series here:
How to Create 3D LUTs to Deliver the Power of your Artistic Look with DaVinci Resolve: Part 1
How to Create 3D LUTs to Deliver the Power of your Artistic Look with DaVinci Resolve: Part 2
How to Create 3D LUTs to Deliver the Power of your Artistic Look with DaVinci Resolve: Part 3
In your article, you said that the RED Dragon LUT was looking beautiful on the BMCC and BMPC. Were you referring to the pocket cam or production cam?
Also, can we get the URSA LUT you created? Thanks for all you do, and it’s such a pleasure to be a part of the Inner Circle.
Hi J Parker, I was referring to the BMCC and BMPC, not the production camera. Working on LUTs for all that is Blackmagic in the coming months.
A question: you have described LUTs for REC709, viewed on REC709 monitors (reasonable for what you are currently shooting). If shooting for something other than broadcast release, shouldn’t on-set monitors be P3 monitors, and the “master” LUTs designed for the P3 color space and 10-bit color, not 8? Then if you have people looking at dailies on their iPads, those files should be sent through a REC709/sRGB conversion LUT. That way you’re always going for the highest quality look when judging your lighting/exposure, while allowing people off-set to see what you are going for adjusted to their monitors’ more limited color gamut.
Thanks for taking the time to do this while on location!
That is one way to look at it. I think P3 is probably the worst color space available and I shy away from it at all costs. It makes things look cheap, like video, and all the whites go yellow.
Not sure I understand, since DCI P3 (aka SMPTE 231-2) is the color space of every digital projector: the color space that every (color) movie is projected in; and therefore the color space every theatrical release is graded in. REC709 is essentially a (very) limited subset of P3 (although they use different gamma power curves, since their intended viewing environments are very different). And REC709 only has a dynamic range of about 7 stops, 8 at the most.
Shane, I’ve been thinking about your response. The only way I can figure you would see what you describe is if you display REC709 data straight off of a video camera (or a Canon C300 or 5D Mk II, for example) on a monitor calibrated for P3; then the neutral tones _will_ look yellow. Because there was no conversion between the different color spaces — no conversion to account for the different color primaries used to create the image — _everything_ will be off. That’s why you need a P3-to-REC709 conversion LUT before you look at those higher color, higher dynamic range images on a REC709 monitor or an sRGB tablet, and vice versa.
But the only times REC709 should be recorded and used as the master color format on-set is when either
(a) you are shooting video for broadcast [or web] which will be sent in REC709 anyway, in “SMPTE legal” restricted color code values, or
(b) you’re using cameras (such as GoPros or Canon DSLRs) that can only create “video” files in REC709, not RAW or P3.
DCI P3 was developed because the limitations of REC709 (and NTSC) color and dynamic range fell so far short of what film could provide. P3 was the way digital projectors could match and exceed the capabilities of film, when film was scanned and projected digitally. It has a color gamut at least as wide as any film, at least 12 bits per color, with a gamma function to allow the extended dynamic range that film and theatrical projectors are capable of. So there is no way that P3 can cause a “video look.” If it did, every movie you see in a movie theater would “look like video” — which they don’t. On the other hand, REC709 — with its 10 bits per color often restricted to 8 and converted from RGB into Y’CbCr with subsampled color, further limiting its dynamic range and color gamut — is BY DEFINITION what “video” looks like!
That was the point of my question: if you are not shooting for broadcast video and are using a RED Dragon or a C500 shooting RAW, shouldn’t everything you see on-set be in P3 — which shows you close to the range of color and dynamic range that those professional cameras are actually capturing, and how it will be seen when color graded?
[Please note: typo correction to the above — that should be SMPTE 431-2 as the standard for digital cinema projection.]
W T W, I’m trying to wrap my head around all the jargon. So in essence these LUTs will take, for example, the RED log footage and convert it into a REC 709 color space that would have to be viewed on a REC 709 monitor. This will in turn limit and restrict the color depth and latitude that would otherwise be possible to see with the raw image if viewed in a DCI P3 color space. Am I somewhere on the right track with that dumbed-down explanation?