Samsung, you have to introduce Dolby Vision

(Topic created on: 05-12-2019 04:53 PM)
6723 Views
Benrc
Journeyman

This is ridiculous. DV is taking off, and your lack of support does nothing but hurt your customers.

How about you lead the way and put your customers before some ***** fisted marketing strategy? 

162 REPLIES
Soul_
Pioneer

This is quite a nightmare. I just ditched my Galaxy phone for an iPhone XS, and I am in the market for a TV this holiday season, so I ended up here during my research. Based on the universal support for DV (which is quite well documented, from consoles to PCs to Netflix to Disney+ to Apple TV+) and the stubbornness that I see here, I feel like I should just get a 77" OLED instead. I was looking at Samsung because I have a Samsung TV today, but this changes a lot for me.

Plainly, I can't see spending 5k on a TV which doesn't support any of the big use cases. As for HDR10+, I don't use Prime, and not many movies support it either. Sadly, I would rather pick Blu-ray than HD DVD, and not regret it later.

 

EDIT: Engrish

DrGravity
Journeyman

Wow, reading through this thread, there is a horrendous lack of knowledge around Dolby Vision and HDR. It seems like few people actually understand what it is they're arguing about or why they want Samsung to support it!

 

So firstly, all Dolby Vision content plays back in HDR10 on displays that don't support Dolby Vision - it's not like the content isn't compatible or you're not getting HDR from these sources.

 

Secondly, we need to look at Dolby Vision and why people feel it's important. The fact that Dolby Vision is 12-bit colour is irrelevant, as all the TVs you're going to be watching the content on are 10-bit, so that leaves us with dynamic metadata. Most people seem to think this will somehow make HDR look 100 times better on their QLED - spoiler alert, it won't!

 

There is one simple truth to HDR which you need to understand: the capabilities of your display are a thousand times more relevant to how HDR looks than whether it uses dynamic or static metadata.

 

Dynamic metadata has the most impact when your display can't hit the required peak brightness (i.e. is sub-1,000 nits). Why was LG the first to embrace DV? Because their OLEDs have crippled peak brightness (thanks, ABL), so they tried to use dynamic metadata to get over the fact that they needed to do a horrific amount of tone mapping when showing HDR content. Plus it's good for marketing, as this thread shows - people now seem to assume Dolby Vision support is some sort of wondrous thing.
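
To make that concrete, here is a toy Python sketch of how per-scene metadata and display peak brightness interact. The numbers and the simple proportional roll-off are my own illustration, not any real TV's tone-mapping algorithm:

def tone_map(pixel_nits, assumed_max_nits, display_peak_nits):
    # If the assumed content peak already fits within the display, pass the
    # pixel through; otherwise apply a crude proportional roll-off
    # (a stand-in for a real tone-mapping curve).
    if assumed_max_nits <= display_peak_nits:
        return pixel_nits
    return pixel_nits * display_peak_nits / assumed_max_nits

title_max = 4000    # static metadata: one peak value for the whole film
scene_max = 800     # dynamic metadata: this particular scene's actual peak
highlight = 700     # a bright highlight within this scene

for peak in (600, 1500):    # a dim entry-level set vs a bright flagship
    static = tone_map(highlight, title_max, peak)
    dynamic = tone_map(highlight, scene_max, peak)
    print(f"{peak}-nit display: static -> {static:.0f} nits, dynamic -> {dynamic:.0f} nits")

# The 600-nit set crushes the 700-nit highlight to ~105 nits with static
# metadata but keeps ~525 nits with per-scene metadata; the 1,500-nit set
# can pass the scene through untouched once it knows the scene only
# reaches 800 nits.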

 

By contrast, my Q9FN can hit much higher peak brightness and colour volume, allowing for much less tone mapping (or even none at all, depending on what peak brightness the content was mastered at), giving a superior HDR presentation regardless of whether it's static or dynamic metadata.

 

Of course, Samsung makes plenty of sets that don't achieve such high levels of peak brightness, hence the open-source HDR10+, which includes dynamic metadata as part of its spec. Yes, they could have gone with DV, but this brings me to my last point: why is everyone so keen to hand over the keys to the HDR kingdom to one company? Dolby Vision is completely proprietary and controlled end to end by Dolby! No thank you - I prefer open source, as should anyone who's invested in the future of HDR.

 

In short, your QLED will give a better HDR experience regardless of the format, thanks to superior peak brightness and colour volume, while Dolby owning HDR lock, stock, and barrel is not a good thing. Here endeth the lesson.

100pat
Pioneer

You have a lot of generalisations in your post which skew your arguments towards apologising for Samsung.

If you have a high-end set with no Dolby Vision, then you will not correctly see scenes in DV streaming content where the filmmaker has adjusted their settings on a scene-by-scene basis.

If you have a budget set, then any form of enhanced HDR will help compensate for your lack of nits (light output) when compared to HDR10. But how many streaming titles are available in Samsung's HDR10+ compared to those in Dolby Vision?

I suspect that many content producers and manufacturers (e.g. Netflix, Apple TV+, Disney+) wish to future-proof their products by catering for next year's 10- and 12-bit screens. The fact that they are willing to pay cash to Dolby for the use of DV speaks volumes. Or are these billion-dollar companies as stupid as you seem to think contributors in this thread are?

Why would anyone wanting to watch titles in DV buy a non-DV set? However much you argue, it's simply illogical. "I want to watch DV streams, so I'll buy a non-DV set"?

Suad
Apprentice

 

@DrGravity It's not that we are stupid, it's that we want the best when we're paying the top price.

We know that HDR10+ is as good as Dolby Vision, but the problem is that most popular streaming platforms don't support HDR10+, so we are being downgraded to HDR10.

We also know that the difference between HDR10 and Dolby Vision is only noticeable when you put two TVs next to each other. So you might think, "then I don't need Dolby Vision because I will never have two TVs next to each other." But by that logic, no one would buy expensive TVs.

Samsung is thinking about the long run, and they are calculating how much money they would lose over 10 years if they kept paying for Dolby Vision.

But they could make a deal with Netflix and Disney, and pay them to add HDR10+. 

In that case no one would complain. 

Soul_
Pioneer

Isn't that the problem? They are only thinking about how much money they can save. They are not thinking about how consumers will be unable to enjoy their products due to the lack of the most popular dynamic HDR format.

 

None of the major streaming providers, graphics card manufacturers, console manufacturers, or movie studios support HDR10+. So I pay 5k+ for a TV in 2019 and am stuck with HDR10 from 2015?

 

How do you justify that from a consumer's point of view?

 

EDIT: Please don't give me the long-game answer; I am buying a TV today to use today. Who can predict the future? Dolby Vision licensing, from what I recall, is 3 bucks per TV set. Charge me 5 bucks extra for my set if you wish, but don't give me a gimped TV. They are just being a stubborn baby. Do a poll and tell me how many people would not mind paying 5 bucks extra for DV on their TV.

Suad
Apprentice

Lol Soul, I am on your side. I was talking positively about Dolby Vision 😂

I was responding to DrGravity's reply.

Soul_
Pioneer

Sorry, I got a bit emotional when I read this thread and mistook your meaning. Why would people defend a multibillion-dollar company which is clearly taking advantage of them? It's beyond me.

GadgetMan
Student

Dr Gravity - given the mistakes that you make, I am not sure that you understand DV that well either.

"all Dolby Vision content plays back in HDR10 on displays that don't support Dolby Vision" - that is not quite correct. DV meta data can only be interpreted by a DV enabled display - it is never downsampled to HDR. Content can mastered to include both HDR10 and DV. This is actually mandated for UHD 4k blurays. So any UHD bluray with DV will also have an underlying HDR layer that any HDR display should be able to use. Not all DV content has to have an underlying HDR10 layer - so streaming companies can include DV without deciding to include HDR.

 

There are also demonstrable advantages of dynamic HDR implementations (HDR10+ or DV) over HDR10 even if the display can manage 1,000 nits, as some of Samsung's top QLEDs and the Panasonic GZ2000 OLED can do. Have a look online for comparisons. As DV is much more widely used than HDR10+, Samsung owners will miss the advantages of dynamic metadata and be limited to watching the HDR10 version - if it is included.

 

Even the brightest QLED sets cannot display all HDR content, so they have to resort to tone mapping. Some content is starting to be graded at 4,000 nits.

 

I was in Dolby's grading suite in London a couple of weeks ago, where they were demonstrating the grading process using DV. They use Dolby Pulsar monitors for this grading - these are LCD displays that support more than 4,000 nits, and they were grading the content to 4,000 nits. The Pulsar is an impressive display - and you really could see the advantage of such bright highlights over another display limited to 1,000 nits. So content with more than 1,000 nits (or beyond any current consumer display) is coming.

 

In these situations with rival formats, content is king - and the current content victor is clearly DV. So whether you think DV is a bad idea because it is a closed system is somewhat irrelevant. If you want the advantages of dynamic metadata, then DV is the only real game in town at the moment.

 

DrGravity
Journeyman

@GadgetMan wrote:

Dr Gravity - given the mistakes that you make, I am not sure that you understand DV that well either. [...]

Oh, I understand DV just fine.

 

DV has the same "core" HDR information as HDR10, and thus it is simple with Dolby's tools to add HDR10 to any Dolby Vision content - and this is indeed what happens. While you are correct that it is only mandated for UHD Blu-ray, streaming services offer both options because it is simple to do so. Please feel free to provide examples if you know of any content that will only play back in HDR on Dolby Vision devices; otherwise, implying there is "DV-only" content is simply misleading.

 

I never said there was no difference between static and dynamic metadata. What I said is that a display's peak brightness and colour volume capabilities are FAR more important to HDR presentation than which HDR format is used, and that dynamic metadata matters more on displays that can't achieve even the 1,000-nit peak brightness threshold and thus have to employ a greater degree of tone mapping. You in fact make this point for me by going on about how much better HDR looked on that 4,000-nit Dolby monitor - it will, because of the higher peak brightness of the display.

 

Yes, some content will be mastered at a higher peak brightness such as 4,000 nits or even 10,000 nits - this can be the case for both HDR10 and DV content. As such, virtually all current TVs will need to tone map the content to the display's capabilities. The crucial point here, though, is that the better the capabilities of the display (e.g. higher peak brightness), the LESS tone mapping will need to be employed and the better the HDR experience will be.
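
To put rough numbers on that, here is a small Python sketch using the standard PQ (SMPTE ST 2084) encoding. Treating the ratio of PQ code values as "how much of the grade a set can show one-to-one" is my own simplification for illustration, not how any real set actually tone maps:

def pq_encode(nits):
    # SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> 0..1 signal.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

master_peak = 4000    # peak brightness the content was graded to
for display_peak in (600, 1000, 1500, 2000):
    share = pq_encode(display_peak) / pq_encode(master_peak)
    print(f"{display_peak}-nit display covers roughly {share:.0%} of the "
          f"{master_peak}-nit grade's signal range; the rest must be tone mapped")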

 

The fact that there is currently more DV content than HDR10+ content in no way invalidates the fact that better display peak brightness = superior HDR, and that open source is always preferable to a closed ecosystem.

 

Ultimately, the point I was making was in response to the number of posts on here from people saying "I won't buy QLED, I'll buy (insert brand) because they support DV", when buying a product with lower peak brightness and poorer colour volume will give you an inferior HDR experience regardless of whether it supports DV.

Soul_
Pioneer

Way to fabricate things.

 

No one said that we wouldn't buy QLED; all we said is not to buy a QLED from Samsung. As a technology, QLED is great, but as a brand, Samsung seems to be extremely stubborn. I would happily buy a QLED TV with DV from anyone with a good track record and the features I need. Right now, I will be forced to buy OLED because of a major QLED vendor's demeanour.

 

Open ears are always better than an open mouth.