I wish I knew about this PS5 and Xbox Series X Dolby Vision problem earlier
The PS5 and Xbox Series X aren't the media players I expected
Editor's note: This article previously, incorrectly, stated that the PS5 had Dolby Vision support in streaming apps. It does not.
As much as I'm a gamer, I'm a movie lover. So when it came time to buy a next-generation console (first the PS5 and later an Xbox Series X), I bought the disc-drive versions. Looking back at those decisions, I wonder if I made the right call.
Sure, I'd rather have the Xbox Series X's higher internal storage and 4K gaming, but after I tried to watch a 4K Blu-ray of Spider-Man: No Way Home on Microsoft's console, I realized I hadn't made the most informed buying decision. So if you're a prospective console shopper weighing PS5 vs. PS5 Digital Edition or Xbox Series X vs. Xbox Series S, let me introduce one element you might not have thought about: optical media.
And, yes, as the world of physical media slowly gets replaced by streaming, this may not be a huge deal to you. But it's enough of a concern that I'm actually looking into standalone Blu-ray players.
The PS5 and Xbox Series X's Dolby Vision issues
Dolby Vision, one of the top and most widely accepted video standards in home entertainment, is only kinda there on the Xbox Series X. I say "kinda" because folks watching the Marvel movies in order on Disney Plus or streaming the latest Netflix original series on the Xbox Series X (which stream in pristine Dolby Vision) will likely have no idea why I'm upset.
That's because streaming apps are the only way to watch something in Dolby Vision on the Xbox Series X; the PS5 doesn't offer it at all. For optical media, both consoles fall back to regular ol' HDR10, which isn't bad, but it's a step below the best the format can offer.
As we've stated on Tom's Guide, Dolby Vision television sets "should deliver a more dynamic, brighter and livelier image" than regular old HDR, though some of the specs that get it there supposedly rely on sets that aren't even out yet.
For what it's worth, optical media played back on these consoles does allow for the more-immersive sound of Dolby Atmos, just (again) not Dolby Vision.
How the heck do you support Dolby Vision for apps, but not discs? It doesn't make much sense; it feels like a case of someone not wanting to pay for it. These consoles are far from cheap, though, at $500 each.
Outlook: What are my options?
Is this a huge deal? Probably not. Is this something that makes me want to sell my Xbox Series X or PS5? Not really. But it's still the kind of thing I wish I'd known going in. I still probably wouldn't have saved the $100 to $200 by getting the digital-only version of either console, but that optical drive doesn't feel complete without Dolby Vision.
That's especially true when standalone 4K Blu-ray players start at $219 online. But I don't really want another piece of hardware cluttering up my counter-top. Maybe this is just a sign to leave optical media behind.
Many movies you buy these days come with a digital code through Movies Anywhere. I was buying physical copies, though, to avoid the loss of quality that comes with streaming.
What I'll likely do is make sure the streaming device I'm using is hard-wired to my router, to try to prevent quality dips. I wish Sony or Microsoft would just add Dolby Vision to their optical drives, but since these consoles have been out since November 2020, I'm not going to hold my breath.
Then again, Nintendo eventually added Bluetooth headphone support, so maybe not all hope is lost. For now, though, the writing continues to be on the wall for optical media. If these mammoth consoles (likely two of the more popular Blu-ray players) can't get Dolby Vision for discs, why should I think disc-based media is important?
Henry is a managing editor at Tom’s Guide covering streaming media, laptops and all things Apple, reviewing devices and services for the past seven years. Prior to joining Tom's Guide, he reviewed software and hardware for TechRadar Pro, and interviewed artists for Patek Philippe International Magazine. He's also covered the wild world of professional wrestling for Cageside Seats, interviewing athletes and other industry veterans.
Rookie311 “If these mammoth consoles can't get Dolby Vision for discs, why should I think disc-based media is important?”
For a very important reason: hi-resolution lossless audio soundtracks on Blu-ray and 4K Blu-ray movies! This is even more important than Dolby Vision to me since regular HDR is beautiful regardless. You want lossless audio where you can hear all the details of a movie and feel like it’s happening right there with you?! Then you’re going to need Dolby TrueHD, DTS-HD Master Audio, and Dolby Atmos movie soundtracks! You can’t get those tracks by streaming movies! You can only get regular lossy 5.1 Dolby Digital surround sound and Dolby Digital Plus when streaming movies. Those are not as great as hi-res lossless audio!
Xxenddex “Sure, I'd rather have the Xbox Series X's higher internal storage and 4K gaming”
What does that even mean? PS5 is absolutely as capable of 4K gaming. This made me wince so much I didn’t read any more. Sick of these crap tech articles.
Rookie311 While the Xbox Series X SSD does indeed have a meager advantage in storage size (which in itself is reduced by the variety of expandable options available to PS5), it’s really in the performance category that Microsoft’s console takes an absolute beating. The figures don’t lie – the PS5 SSD runs more than twice the speed of the Xbox Series X SSD in Raw mode and nearly twice as fast in compressed. This is a game changer, not least because not only can Sony (and you) finally say goodbye to the excruciatingly long ‘update copying’ process that would make every patch download an absolute chore, but also because Sony’s super fast custom SSD also means a return to the cartridge days of old where games would just boot up and play immediately.
treoo0 It was intended to be a comparison of the Series X vs the Series S. "...Xbox Series X's higher internal storage and 4K gaming” - vs. the Series S. Not an expressly written comparative, but easily figured out.
This author is not bad at all. He also has a valid point, one I discovered and was upset about as well. I also found the PS5 drive has better video performance than the Series X.
Cheers
Uncoolest_Evar I couldn't bring myself to invest actual money into getting a UHD Blu-ray player. It feels like buying a Laserdisc player must have felt in the mid-nineties. Yeah it's better... But it's still going to be obsolete soon...
My solution was to get an easily hackable USB Blu-ray drive off Amazon. I just install the custom firmware, rip the files to my PC and stream them over Plex to my Nvidia Shield. That fixes the problem for Dolby Vision. Those people out there with Samsung QLEDs trying to watch HDR10+ are on their own.
drew_nickel I regret buying the DISC-less PS5 because I can only buy games from the PS Store, where games stay sky-high in price $$$. I'm missing out on buying cheaper used versions at GameStop and eBay, or even the occasional discounts on Amazon. I can't even trade games with friends. In the PS Store, games like Demon's Souls, Ratchet & Clank, Returnal, Miles Morales, etc. are still $69.99 plus tax. On eBay you can find brand-new, still-sealed copies for half the price.
I made my decision thinking it would be the same landscape as PC: Steam/Epic/Amazon/Origin, etc. All those platforms practically give away games.
Just to be a stickler, I have to point out that if you have a newer HDR TV set, Dolby Vision streaming is not better than HDR10 off a UHD disc. The lower bitrate of streaming reduces detail/grain. DV streaming content is only available with single-layer MEL (minimum enhancement layer) metadata, which doesn't restore grain like FEL metadata does (and FEL is only available on the UHD discs or their exact ISO copies). The thing is that a lot of newer HDR TV sets have really good dynamic tone-mapping features. So if you're playing a UHD Blu-ray (at the full bitrate, with all the grain and detail still intact) in HDR10 with dynamic tone mapping on, you're probably getting better picture quality than watching the same title streamed in Dolby Vision. (This all depends on how well the movie was mastered and graded, but more than likely a Dolby Vision movie that is streaming won't look any better than that same movie played off the disc/ISO in HDR10.)
On the other hand, though, a UHD Blu-ray/ISO released with a dual-layer DV FEL should be considerably better than its HDR10 counterpart, mostly because of the extra range in brightness provided by the 12-bit metadata (even though no commercial panels display 12 bits, the FEL metadata is designed so those extra bits can be tone-mapped into extra brightness). Also, the FEL usually includes an overlay of extra detail/grain. It seems as if the movie studios are purposely leaving this extra detail and grain out of the HDR10 versions, but that's just speculation.
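To put rough numbers on that bitrate gap, here's a quick back-of-the-envelope Python sketch; the bitrates are ballpark assumptions, not measurements from any particular title:

# Back-of-the-envelope only: the bitrates below are ballpark assumptions,
# not measurements from any particular disc or stream.

DISC_VIDEO_MBPS = 60.0      # assumed average video bitrate of a UHD Blu-ray
STREAM_VIDEO_MBPS = 18.0    # assumed average bitrate of a 4K Dolby Vision stream
FPS = 23.976                # typical film frame rate

def megabits_per_frame(mbps: float, fps: float) -> float:
    """Average compressed data available to encode one frame."""
    return mbps / fps

disc = megabits_per_frame(DISC_VIDEO_MBPS, FPS)
stream = megabits_per_frame(STREAM_VIDEO_MBPS, FPS)
print(f"Disc:   ~{disc:.2f} Mb per frame")
print(f"Stream: ~{stream:.2f} Mb per frame")
print(f"The disc gets roughly {disc / stream:.1f}x more data per frame to spend on grain and fine detail.")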
Another thing I want to be a stickler about is that the Nvidia Shield doesn't play FEL dual-layer DV movies; it limits playback to the MEL layer only.
cdna I've been waiting years to hear why there's no DV disc support... is it a hardware limitation? A licensing issue? Is it technically difficult to implement? Too little customer interest? They don't care?
ginandbacon drew_nickel said: "I regret buying the DISC-less PS5 because I can only buy games from the PS Store, where games stay sky-high in price ..."
I purchased a Dune HD Vision Pro Solo for this reason. DV is more confusing than most people know. This started when Sony and Dolby Labs introduced LLDV (low-latency Dolby Vision). Previously there was STD (standard, display-led DV), where the display did the dynamic tone mapping, and it was done that way for one reason: at the time, the version of HDMI in use didn't support dynamic metadata, so the display had to have a dedicated chip in it to do the heavy lifting. Different displays took different amounts of time to do this depending on how powerful they were. I believe profile 4 is the only version that is STD. Going forward (from around 2018), all DV displays must support LLDV but are not required to support STD. I have read conflicting info on this, though.
My understanding is that streaming services use profile 5, which was the first version of player-led decoding (LLDV). LLDV makes sense because a console can do what the display used to have to do, but way, way faster. With display-led DV, one display may take longer than another, which would mean more lag. Obviously not an issue for movies, but a huge issue for gaming. It's software-based, so no dedicated hardware is required. I know that's how, with the Xbox, they were able to bring DV to the console via a firmware update: it was now all software-based.
UHD discs use profile 7 (MEL or FEL). There is no indication of any kind on the packaging that lets you know which is used. For movies, it's better to have the display do the dynamic tone mapping instead of the source; the display knows its "limitations," for lack of a better term.
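If it helps, here's a toy Python sketch of what that tone mapping means in practice. Dolby's real curve is proprietary, so the knee point and the master/panel brightness figures below are made-up illustrations, not anything from a spec:

# Toy illustration of dynamic tone mapping: squeezing a scene graded for a
# bright mastering monitor into the smaller range a living-room panel can show.
# All numbers are made up for illustration; Dolby's actual curve is proprietary.

MASTER_PEAK_NITS = 4000.0   # assumed peak brightness the scene was graded for
PANEL_PEAK_NITS = 800.0     # assumed peak brightness of the TV doing the mapping
KNEE_NITS = 400.0           # below this, brightness is passed through unchanged

def tone_map(scene_nits: float) -> float:
    """Map a scene luminance value onto what the panel can actually display."""
    if scene_nits <= KNEE_NITS:
        return scene_nits  # shadows and midtones are left alone
    # Compress everything above the knee into the panel's remaining headroom.
    fraction = (scene_nits - KNEE_NITS) / (MASTER_PEAK_NITS - KNEE_NITS)
    return KNEE_NITS + fraction * (PANEL_PEAK_NITS - KNEE_NITS)

for nits in (100, 400, 1000, 4000):
    print(f"{nits:>5}-nit highlight in the master -> {tone_map(nits):.0f} nits on this panel")

As I understand it, with display-led DV the TV runs something like this itself; with LLDV the player does the same math using the capabilities the TV reports.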
Dolby likes to complicate matters, just like the way it complicated Dolby Atmos when it first rolled out circa 2013-2014. They provided a white paper on this and required manufacturers of Atmos-enabled speakers to conform to certain requirements, like a high-pass filter crossover at 180Hz, before they could be certified as true Dolby Atmos-enabled speakers :P. I guess history repeats itself with the Dolby Vision standards. The problem with Dolby Vision is the complexity of the way Dolby rolls out its DV implementation. Believe it or not, there are at least six known profile levels. Within a given profile, the maximum level of a base layer (BL) or enhancement layer (EL) is restricted by the profile. If you have used MakeMKV software to make backup copies of your precious Blu-ray or 4K UHD Blu-ray titles, you should be familiar with terms like Profile H.265 Main10 4.1/5.1.
Now let's break down some of the profiles Dolby has dished out over the years.
First is the single-layer type (Profiles 5 & 8) that most LLDV-based sources prefer, e.g. Netflix, Apple TV+ and Disney Plus. Media players that come with DV decoding capability, such as the Zidoo Z9X and Nvidia Shield TV 2019, also use this implementation (LLDV). The Xbox Series X/S consoles utilize LLDV (player-led) as their preferred form of DV instead of the unadulterated version. This is why the 4K UHD Blu-ray drive is not able to read the dual-layer (usually Profile 7) data containing the DV information for the TV to process and decode.
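If you've made MakeMKV backups of discs you own and want to know which profile a file actually carries, a sufficiently recent FFmpeg build's ffprobe can report it. Here's a rough Python sketch; the "DOVI configuration record" side data only appears on newer ffprobe builds, and the file path is just a placeholder:

# Sketch: ask ffprobe (from a recent FFmpeg build) which Dolby Vision profile,
# if any, the first video stream of a file carries. Older builds may not emit
# the "DOVI configuration record" side data at all.
import json
import subprocess
import sys

def dolby_vision_profile(path: str):
    """Return the DV profile number in the file's first video stream, or None."""
    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", path],
        capture_output=True, text=True, check=True,
    )
    for stream in json.loads(result.stdout).get("streams", []):
        for side_data in stream.get("side_data_list", []):
            if side_data.get("side_data_type") == "DOVI configuration record":
                return side_data.get("dv_profile")
    return None  # no DV metadata found: plain HDR10 (or SDR)

if __name__ == "__main__":
    profile = dolby_vision_profile(sys.argv[1])  # e.g. the path to your own backup .mkv
    print("Dolby Vision profile:", profile if profile is not None else "none detected")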
With all that said, all Marvel movies are HDR10-only on the UHD discs. Disney waited until the streaming versions before implementing DV. It's nothing more than a way to get more subscribers because of the DV buzzword, similar to the IMAX versions.
Right now DV has one real advantage over HDR10: dynamic metadata. No commercial display is near 4,000 nits, much less 10,000, and no 12-bit commercial display exists. HDR10+ looks great (Bohemian Rhapsody is evidence of that), but HDR10+ content is severely lacking.
The main reason I bought the Dune is that it supports profiles 4, 5, 7 (MEL and FEL), 8 and 9. In fact, you can have it output everything as HDR or DV. It properly re-maps the color tones so the colors are correct. This doesn't really improve picture quality when the content was done correctly, but it does cut down on banding and posterization by "up-biting" to 12 bits.
Nobody is doing frame-by-frame dynamic metadata; at most it's scene-based. HDR of any kind takes basically no bandwidth. It's just a 16- or 32-character hex value, and plain text uses practically zero bandwidth. DV doesn't add any noticeable bandwidth or file size. Even if it were scene by scene, a hex value is nothing compared to sending the info for the millions of pixels on a 4K display at 24fps.
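For a sense of scale, here's some rough Python arithmetic with illustrative numbers, not a spec-accurate accounting of any one format:

# Rough arithmetic to show how tiny per-scene HDR metadata is next to the
# picture itself. Sizes are illustrative assumptions, not from any spec.

WIDTH, HEIGHT = 3840, 2160      # 4K UHD resolution
BITS_PER_PIXEL = 10 * 3         # 10 bits per channel x 3 channels, uncompressed
FPS = 24

frame_bits = WIDTH * HEIGHT * BITS_PER_PIXEL   # one uncompressed frame
metadata_bits = 300 * 8                        # assume ~300 bytes of metadata per scene

print(f"One uncompressed 4K 10-bit frame: ~{frame_bits / 1_000_000:.0f} Mb")
print(f"One scene's worth of dynamic metadata: ~{metadata_bits / 1000:.1f} kb")
print(f"That's about {frame_bits / metadata_bits:,.0f}x smaller than a single frame, at {FPS} frames per second.")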
At the end of the day DV is much more complicated than HDR10 or HDR10+ because of Dolby.
ginandbacon cdna said: "I've been waiting years to hear why there's no DV disc support... is it a hardware limitation? A licensing issue? Is it technically difficult to implement? Too little customer interest? They don't care?"
Believe it or not, there are at least six known profile levels. Within a given profile, the maximum level of a base layer (BL) or enhancement layer (EL) is restricted by the profile. UHD discs use Profile 7 (MEL and FEL).
Now let's break down some of the profiles Dolby has dished out over the years.
First is the single-layer type (Profiles 5 & 8) that most LLDV-based sources prefer, e.g. Netflix, Apple TV+ and Disney Plus. Media players that come with DV decoding capability, such as the Zidoo Z9X and Nvidia Shield TV 2019, also use this implementation (LLDV). The Xbox Series X/S consoles utilize LLDV (player-led) as their preferred form of DV instead of the unadulterated version. The built-in 4K UHD Blu-ray drive is not able to read the dual-layer Profile 7 data that carries the DV information for the display to process and decode.
Here is an example of a disc. The Xbox simply can't read the 1080p enhancement layer, which contains the actual DV info. The Xbox would need a different Blu-ray drive, and possibly additional hardware, to do this. LLDV is better for gaming (less lag), and it's a game console first. It's gotten to the point where most people never use the drive anyway.
An example of a typical MKV file encoded with a DV stream:
DISC INFO:
Disc Title: Gladiator.2000.MULTI.COMPLETE.UHD.BLURAY-EXTREME
Disc Size: 92,459,896,796 bytes
Protection: AACS2
BD-Java: Yes
Extras: Ultra HD
PLAYLIST REPORT:
Name: 00041.MPLS
Length: 2:50:56.495 (h:m:s.ms)
Size: 79,038,529,536 bytes
Total Bitrate: 61.65 Mbps
VIDEO:
Codec / Bitrate / Resolution / Frame-rate / Aspect Ratio / DV Profile / Chromacity / Bit-Depth / Colorspace
Base Layer: MPEG-H HEVC Video / 43319 kbps / 2160p / 23.976 fps / 16:9 / Main 10 Profile 5.1 High / 4:2:0 / 10 bits / HDR / BT.2020
The layer below can't be read by the Xbox; it contains the actual DV info. The base layer has the HDR10 info, so that's what the console falls back to, pretending the second layer just doesn't exist.
Enhancement Layer: MPEG-H HEVC Video / 6826 kbps / 1080p / 23.976 fps / 16:9 / Main 10 Profile 5.1 High / 4:2:0 / 10 bits / Dolby Vision / BT.2020
drew_nickel
ginandbacon said: "I purchased a Dune HD Vision Pro Solo for this reason. DV is more confusing than most people know. ..."
Awesome explanation! Thank you for clarifying 👍
Another thing: people tend to write off Samsung for not carrying DV, and they regard HDR10+ as inferior. But Samsungs do a really great job of tone mapping, and a lot of HDR10+ titles look just as good as, if not better than, their DV counterparts.