Dolby Vision on Projectors for $30? Maybe.
LLDV, aka Player Led Dolby Vision -- a technology hack for our transition period
Chances are good that any A/V projector you buy today has a built-in capability to render HDR content acceptably on your screen.
This hasn’t always been the case. For a long time, the best solution was often to simply convert the HDR content to SDR (standard dynamic range, like what you find on Blu-ray and HDTV content) for projection. And since basically all playback devices, whether UHD (“4K”) streamers or UHD (“4K”) Blu-ray players, can convert HDR to SDR, this was an easy fix.
If you are in the high-end market, there are external devices ($5k to $15k) that can further improve the HDR projection experience (e.g., Lumagen, MadVR Envy). But even in the budget market, there are some new ways to improve HDR rendering on projectors — even those that can already do it pretty well.
Case in point: Some intrepid folks figured out what has been called the Dolby Vision LLDV (or, player led Dolby Vision) “hack.”
What Is LLDV and “the hack”?
Dolby Vision is a particular flavor of HDR encoding that contains data used by the video system (player and display) to render content in the best way possible within the constraints of the technology involved. In practice, since most TVs cannot get as bright as all the highlights in a video image demand, they use metadata and direction about how to render the content as well as possible within the limits of the display.
Converting the original HDR content into a version of HDR that the display can handle — “tone mapping” — in a consistent way meant that even a less capable display could look very good, arguably close to what the original content creator intended.
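To make “tone mapping” concrete, here is a minimal sketch in Python. The roll-off curve and the numbers are my own illustration (Dolby’s actual algorithm is proprietary and far more sophisticated); the point is simply that dark and midtone values pass through mostly untouched while highlights get compressed into whatever headroom the display actually has:

```python
# Minimal tone-mapping sketch: squeeze content mastered up to `content_peak`
# nits into a display that can only reach `display_peak` nits.
# Illustrative only; this is not Dolby's proprietary algorithm.

def tone_map(nits: float, content_peak: float = 4000.0, display_peak: float = 150.0) -> float:
    knee = 0.65 * display_peak            # below this point, pass values through unchanged
    if nits <= knee:
        return nits
    # Compress everything between the knee and the content peak into the
    # remaining headroom of the display, approaching but never exceeding it.
    excess = (nits - knee) / (content_peak - knee)
    return knee + (display_peak - knee) * (excess / (1.0 + excess))

for sample in (10, 100, 500, 1000, 4000):
    print(f"{sample:5d} nits in the master -> {tone_map(sample):6.1f} nits on screen")
```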
Some tinkerers (mostly at AVS Forum) noticed that Dolby Vision can work in two different ways: most of the time, the DV content is sent to the display untouched, and the display tone maps the content to its own capability. But due to the influence of Sony, which wanted to market “DV capable” televisions that couldn’t actually do the tone mapping themselves, a second version of DV was added, where the player (UHD disc player, streaming stick, etc.) does the tone mapping BEFORE sending the content to the display.
That “player led” conversion scenario was also called “low latency” since it greatly reduces the processing done in the TV, cutting image latency, and it became known as “low latency Dolby Vision,” or LLDV. When people took a look at the LLDV content coming from consumer devices, they realized it was “just” HDR10 content in most ways, which meant that “any” display capable of displaying HDR10 content could now display DV content if you tricked the video player (e.g., a UHD disc player) into outputting the content as LLDV. What you get are the benefits of Dolby Vision tone mapping without a Dolby Vision display.
That alone was interesting, but since one of the big features of Dolby Vision is mapping to displays with different brightness capabilities, it became very appealing to map DV content to a very low brightness target, which happens to be very similar to the brightness capabilities of many projectors. You could use Dolby’s proprietary tone mapping to dynamically map Dolby Vision content to something compatible with just about any HDR display (even one with limited output, like a projector).
Case Study
In my case, I was testing this on a JVC RS500 projector, mostly using an AppleTV4K as a convenient source of Dolby Vision content.
This JVC was one of the first “HDR capable” projectors, announced in 2015 or 2016. While that was technically correct, I suppose, it really didn’t know how to handle an incoming HDR signal well, and so unlike many projectors that would follow, with workable (if not perfect) HDR handling, this JVC needed help to look even halfway good with an HDR signal.
So, step one with the JVC I tested is not really relevant to later models from JVC and other manufacturers. With this projector, it was necessary to create or import a custom gamma curve — essentially a gamma curve (or EOTF) designed to render HDR in an acceptable if unremarkable fashion. Creating such a curve is non-trivial because it involves balancing overall image brightness, black and shadow detail, and highlight retention, which are competing and contradictory goals. I would say it took the community a couple of years before consensus emerged about the ideal balance of those factors.

I should note, again, that most later projectors with “HDR compatibility” didn’t require much effort in this area and solved the “compromise” decisions by including a “slider” or multiple settings for how HDR is handled: sometimes favoring average brightness of the image, sometimes favoring better detail in the shadows, sometimes favoring better rendition of highlight detail, with each choice shortchanging the other aspects of the image to some degree. But on my projector, step one involved loading a good HDR curve.
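For the curious, here is a rough Python sketch of what such a curve does conceptually. It is not Dominic’s actual curve or JVC’s file format, just an illustration under simplifying assumptions: decode the incoming PQ (SMPTE ST 2084) signal to absolute nits, compress it toward a 350-nit peak, and re-encode it as drive levels for a gamma-2.2 projector:

```python
# Sketch of building a custom "HDR curve" (EOTF) for a projector: decode PQ
# to nits, compress to the projector's peak, then re-encode for gamma 2.2.
# The PQ constants are the published SMPTE ST 2084 values; the roll-off and
# the 350-nit target are illustrative, not the actual file loaded on the JVC.

M1 = 2610 / 16384            # 0.1593017578125
M2 = 2523 / 4096 * 128       # 78.84375
C1 = 3424 / 4096             # 0.8359375
C2 = 2413 / 4096 * 32        # 18.8515625
C3 = 2392 / 4096 * 32        # 18.6875

def pq_to_nits(signal: float) -> float:
    """Inverse PQ EOTF: normalized signal (0..1) to absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def roll_off(nits: float, peak: float = 350.0) -> float:
    """Reinhard-style compression: highlights approach `peak` without clipping."""
    return peak * nits / (nits + peak)

def drive_level(nits: float, peak: float = 350.0, gamma: float = 2.2) -> float:
    """Re-encode target nits as a 0..1 drive level for a gamma-2.2 display."""
    return (min(nits, peak) / peak) ** (1 / gamma)

# A 256-point lookup table, analogous to the curves imported into the projector.
lut = [drive_level(roll_off(pq_to_nits(i / 255))) for i in range(256)]
```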
Enter step two, which applies to every LLDV setup. Here, one “tricks” the source device (like an AppleTV) into thinking the display (projector) is a Dolby Vision capable display that wants an LLDV (player-led) Dolby Vision feed (which, you may recall, is really just an HDR10 signal with the tone mapping built in). So now you have an HDR10 signal with all the Dolby Vision dynamic metadata and tone mapping already baked in.
For a few years, the main method for tricking a UHD player (or, in my case, an AppleTV4K) into outputting LLDV was to buy a device from companies like HDFury, which would spoof the EDID (Extended Display Identification Data) coming from the TV to the player, telling the player that the display is a Dolby Vision display that requires LLDV mapped to a particular brightness value. The player would then dutifully tone map the Dolby Vision content and send it as HDR10, and the display would show content that had already been adapted to its capabilities. And, presto, a gorgeous HDR presentation on a projector, greatly exceeding simple HDR10 rendering. An HDFury device is still required for the many displays that don’t let you manually trigger HDR10 mode; many projectors and flat panels, especially projectors under $3000, lack this capability. In my case, I was able to use a $30 device from Amazon to do this trick, because my projector lets me manually trigger its HDR10 mode/gamma/EOTF and REC2020 color space. (More about this below.)
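To give a feel for what the spoof actually changes, here is a crude Python sketch (my own illustration, not HDFury’s or anyone’s firmware). As I understand it, the key ingredient the spoofed EDID adds is a Dolby Vision vendor-specific data block, identifiable by Dolby’s IEEE OUI (00-D0-46), along with the target luminance the player should tone map to. This snippet just scans a saved EDID dump for that OUI; “edid.bin” is a hypothetical file you would grab from your OS or an EDID tool:

```python
# Crude check for a Dolby Vision capability block in a saved EDID dump.
# This is not a full CTA-861 parser; it simply looks for Dolby's IEEE OUI
# (00-D0-46), which the Dolby Vision vendor-specific block carries, stored
# least-significant byte first. "edid.bin" is a placeholder path.

DOLBY_OUI_LE = bytes([0x46, 0xD0, 0x00])

def advertises_dolby_vision(edid: bytes) -> bool:
    return DOLBY_OUI_LE in edid

with open("edid.bin", "rb") as f:
    edid = f.read()

print("Dolby Vision block present:", advertises_dolby_vision(edid))
```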
Why does this LLDV stuff look better than HDR10?
Dolby Vision has a couple of advantages over HDR10 in this situation. First, HDR10 is mastered for a single target output throughout the entire program. Bright scenes and dark scenes all get the same virtual container, meaning one can run out of shadow detail in dark scenes, highlight detail in bright scenes, or overall brightness in midtones in complex scenes. Dolby Vision, on the other hand, can use different containers/parameters for each scene. For a dark scene, the container can forget about trying to retain highlight nuances and spend that “contrast budget” on shadow detail. Or vice versa, ignoring shadow detail when the scene is primarily bright and could benefit from more detail and gradation in highlights (like clouds in the sky).
Second, Dolby Vision can map its overall presentation to the actual capabilities of the display with far more nuance and accuracy (relative to industry standards, including what the people mastering the content saw on their own displays) than HDR10 can. This, combined with scene-by-scene changes optimized for the type of scene being shown, means that Dolby Vision content can be rendered in a more sophisticated, nuanced manner than HDR10 content.
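A toy Python example of why the per-scene part matters (again, my own simplification with made-up numbers, not Dolby’s math): with static metadata, the whole program is scaled for its brightest scene, so a dark scene is stuck using a sliver of the projector’s range, while per-scene metadata lets that dark scene use the full range for shadow detail:

```python
# Toy comparison of static (program-wide) vs. per-scene tone-mapping targets.
# The linear scaling and the nit values are made up purely for illustration.

DISPLAY_PEAK = 150.0                                   # what the projector can output

scenes = {"night exterior": 80.0, "daylight clouds": 3000.0}   # per-scene peak nits
program_peak = max(scenes.values())                    # all static HDR10 metadata "sees"

for name, scene_peak in scenes.items():
    static_scale = DISPLAY_PEAK / program_peak                       # one scale for every scene
    per_scene_scale = DISPLAY_PEAK / max(scene_peak, DISPLAY_PEAK)   # scale chosen per scene
    print(f"{name:16s} brightest pixel: "
          f"static {scene_peak * static_scale:5.1f} nits vs. "
          f"per-scene {scene_peak * per_scene_scale:5.1f} nits")
```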
Test Case Including Some Technical Details
For reference, here is what I did.
Custom Curve for the JVC Projector: User Dominic Chan at AVS shared some HDR curves for the JVC projectors. The one I used is this curve, 350-125, which essentially means “pretend I have a max light output of 350 nits and that diffuse white should be mapped to 125 nits,” which is about the ideal match for the lowest level Dolby Vision can map to. (As usual, one needs to remove the .txt extension on that file and unzip it before use.) Note that Dominic points out this particular curve worked best for me because I was using a lower-output lamp (i.e., it had 2000 hours on it) when I tested it, and that the curve titled 350 (without the -125) would work better if my lamp were newer/brighter. But note that the 350ntm file doesn’t do any tone mapping, and I haven’t tested it yet.
To upload this custom curve to the JVC, I used the JVC Autocal Tool. This tool can do many things, but in this case, I simply used it to upload the custom gamma file into the Import / Custom 1 slot on my RS500 projector. Dominic has created a custom tool for uploading curves to JVC and Sony projectors and shared this on AVS Forum as well.
HDMI EDID Spoof: Luckily, one no longer has to buy a device from HDFury costing hundreds of dollars simply to adjust EDID settings, provided your projector (or other display) allows you to turn on its HDR mode (HDR EOTF) manually even when the signal does not carry any HDR flags. If your projector cannot do this, and most under $3000 cannot, then you will need to step up to an HDFury device that can add HDR flags to the signal.
There are several devices on the market as of this writing, at affordable price points, each with slightly different capabilities and methods for adjusting them, as long as you can manually put your projector into HDR mode.
I purchased this unit from AVStar for about $30, but I understand this other one is about the same; to test that, I bought one of those too and did exactly the same thing.
There is a built-in feature/option that sort of lets you test things out, to see if things are generally working. But to really do it right and make it worthwhile, there is custom firmware from AVS user xnappo that lets you precisely set the DV EDID info to explicitly ask for the content to be mapped to about 330 nits peak, i.e., just about perfect for the custom curve on the projector (which is optimized for 350-nit peak content).*
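As a side note on the numbers: Dolby Vision signaling describes luminance in PQ (ST 2084) code values rather than nits. I have not inspected xnappo’s firmware, so exactly how it encodes its luminance options is an assumption on my part, but the standard conversion from a nits target to a 12-bit PQ code looks like this sketch:

```python
# Convert a luminance target in nits to a 12-bit PQ (SMPTE ST 2084) code
# value, the representation Dolby Vision signaling uses internally. How the
# EZCOO custom firmware actually encodes its luminance options is something
# I have not verified; this just shows the standard nits-to-PQ math.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def nits_to_pq_code(nits: float, bits: int = 12) -> int:
    y = (nits / 10000.0) ** M1
    signal = ((C1 + C2 * y) / (1 + C3 * y)) ** M2      # normalized PQ signal, 0..1
    return round(signal * (2 ** bits - 1))

for target in (125, 330, 350):
    print(f"{target:4d} nits -> 12-bit PQ code {nits_to_pq_code(target)}")
```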
Footnotes
*The main post from xnappo on AVS about the firmware update, most of which I have quoted below, is what I followed for this process.
How to program the EZCOO SP12H2 HDMI splitters with a custom firmware that:
Provides all possible Max Luminance options
Changes colorspace to BT2020 (vs. DCI-P3 in the original)
Disables display-led DV
Steps:
Download the vendor FW update software from here:
www.easycoolav.com/tmp/madeimg/GD32 UPDATE EX11PRO and SP12H2.rar
Note that this includes the programming software, a Word doc on how to update, a video, and their own FW update you can use to restore to the original.
Download the custom FWs from GitHub here:
https://github.com/xnappo/ez_dv_edid_fw/archive/refs/heads/main.zip
Extract both packages somewhere convenient
In the extracted vendor directory, go to:
...\gd32\GD32AllInOneProgrammer\GD32DfuDrivers_V3.6.6.6167\x64\
and double click 'GD32DfuDrivers.exe' to install the drivers.
Go to ...\gd32\GD32AllInOneProgrammer\GD32AllInOneProgrammer\GD32AllInOneProgrammer_支持C103
and double click 'GD32AllInOneProgrammer.exe' to run the programmer. (If you run into an error regarding missing qt5core.dll, install the 'vcredist2010_x86.exe' DLL package from the "Latest Visual C++ Redistributable Runtimes All-in-One Nov 2023" download. I do not think I did this and so I cannot vouch for this.)
Now we need to put the device into update mode. This involves holding the 'update' button WHILE plugging in the USB cable.
a. Plug the USB cable into your PC.
b. Set the device on your desk and find something to push the button with (It comes with a 'sim card tool' to do this, but I found a pen easier to use.)
c. Plug the cable into the EZCOO.
d. Release the button
The blue LED should start flashing every second.
In the programming software, select USB under interface, then hit connect:
Hit the 'Browse' button in the 'Download' section
Browse to where you extracted the custom FWs, change the extension from .bin to .hex in the pulldown, and select the FW you would like to install (I suggest starting with 332 or 384) then hit Open.
Hit Download, you should see the device program.
Hit disconnect, and you are done!
If you ever want to restore the factory FW, it is here in that package from the vendor:
...\gd32\EZ-SP12H2_APP_DFU_GD32_Ver2.06(FB4D).hex
Here is a video of the process: