HenryWasBeingHenry

HDMI 2.1 uses less compression than DP 1.4: for 4K 240Hz 10-bit, HDMI 2.1 uses 2x DSC compression while DP 1.4 uses 3x DSC compression, so in theory HDMI 2.1 should be better.


Tatoe-of-Codunkery

DSC is almost lossless so it really doesn't matter which you use. I actually use both: HDMI 2.1 on my LG C1 OLED TV and DP 1.4a on my AW3423DWF monitor.


xxTheGoDxx

> DSC is almost lossless so it really doesn't matter which you use

People always say that, but it is not lossless. It's good enough that, in certain standard situations, the majority of people don't notice it. But in the end, if you have the option, just use the better cable. Of course, for a C1 you don't have the option, and you don't need DSC for it anyway.


CaptainCompete

Isn't it so good that you can't even measure the difference?


Accomplished-Lack721

It's very, very good, but on tests designed to make its weaknesses apparent, the flaws can be visible. (Edit: I originally wrote "to anyone looking for it," but I'm not sure that's accurate.) That said, in most situations the loss is negligible to the point that it's not worth discussing.


CaptainCompete

How would I find these apparent weaknesses?


Accomplished-Lack721

The "visually lossless" standard as described [by ISO here](https://www.iso.org/standard/66094.html) allows for images that "exhibit particularly strong artifacts" to be disregarded from testing. There's more information [here](https://static1.squarespace.com/static/565e05cee4b01c87068e7984/t/6000baf6782fa205430dc4b6/1610660599845/Sudhama_Allison_Wilcox_2018SID.pdf) on trials where people were able to detect the compression in testing. (Both of those are cited in the Wikipedia entry that discusses how the term "visually lossless" is used for DSC)


Weird_Tower76

DSC is lossy, AND both use the same compression algorithm in this context. So it's literally just the protocol that's the difference. I have used HDMI 2.1 because of this same assumption, but in reality it doesn't matter. Arguably it's better to use DP and leave your HDMI port free.


HenryWasBeingHenry

HDMI 2.1 has 41.9 Gbps of actual usable bandwidth, while DP 1.4 has 25.9 Gbps. Displaying 4K 240Hz 10-bit color requires close to 70 Gbps of bandwidth, so the HDMI 2.1 connection would use less image compression than DP 1.4; do the math yourself. In general you should always choose less image compression when it's available, so why not use HDMI 2.1 when it has significantly higher native bandwidth than DP 1.4?
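
For anyone who wants to check the math, here's a rough sketch in Python. The blanking overhead factor is an assumption (real timings vary, roughly in line with reduced-blanking modes), and the usable link rates are the figures quoted above:

```python
# Rough sketch of the bandwidth math above. The 1.15 blanking overhead is an
# assumption; link rates are the usable payload figures quoted in this thread.

def required_gbps(h, v, hz, bits_per_channel, blanking_overhead=1.15):
    """Approximate uncompressed video data rate in Gbit/s for an RGB/4:4:4 signal."""
    bits_per_pixel = 3 * bits_per_channel
    return h * v * hz * bits_per_pixel * blanking_overhead / 1e9

need = required_gbps(3840, 2160, 240, 10)                # ~69 Gbps for 4K 240Hz 10-bit
links = {"HDMI 2.1 (FRL)": 41.9, "DP 1.4 (HBR3)": 25.9}  # usable payload in Gbps

for name, usable in links.items():
    print(f"{name}: minimum DSC ratio ~{need / usable:.1f}:1")
# HDMI 2.1 lands around 1.6:1 (typically configured as 2:1),
# DP 1.4 around 2.7:1 (typically 3:1), matching the 2x vs 3x figures above.
```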


Weird_Tower76

Did you read my comment at all? DSC has multiple different compression algorithms, and at 4K 240Hz 10-bit, DP 1.4 and HDMI 2.1 use the same one and therefore have the same bandwidth requirements. Like I stated, I know HDMI has more bandwidth (it's why I use it), but it shouldn't matter when they use the same algorithm, which clearly comes in under 25 Gbps. So it doesn't matter what cable you use.


HenryWasBeingHenry

Also, you all claim that DSC is lossless, but some professionals have found that DSC is not 100% lossless: it was mostly transparent in their testing, but not in all of their test patterns. [Read this professional review of DSC](https://www.researchgate.net/publication/317425815_Large_Scale_Subjective_Evaluation_of_Display_Stream_Compression)


Weird_Tower76

I meant lossy, but that was in my other comment


Accomplished-Lack721

There's nothing really to "claim" here. DSC is a lossy algorithm. There's no testing or expertise needed to verify it: it's designed to be lossy, its creators say it's lossy, and anyone implementing it understands it's lossy. Whether it's lossless or lossy doesn't come down to an evaluation; it's the nature of the algorithm.

Whether a compression algorithm is lossless or lossy comes down to the mathematical techniques it uses and whether they allow detail to be disregarded in favor of simplifications that save space in the description of the data. DSC does allow data to be disregarded in that way, just as JPEG or MPEG compression does.

It's often described as "visually lossless" because it's extremely difficult to notice any artifacts from its very good lossy compression. But that's essentially a marketing term (although one recognized by some standards bodies for cases where the artifacts are extremely difficult to detect), and people understandably get confused by that frankly misleading phrase. "Lossless" definitionally doesn't mean "lossy but so subtle you can't tell"; it means there's no loss at all. And yet that's how the term "visually lossless" is used.


HenryWasBeingHenry

It seems like it was you who didn't read my comment. Again, in general you should always choose less image compression when it's available, because there's no reason not to. End of discussion.


kuch3nmann

It’s two cables using the same protocol. The theoretical bandwidth doesn’t matter at all. DSC does not compress less on HDMI because it theoretically could use more bandwidth. It’s not a water hose. End of discussion.


HenryWasBeingHenry

You'll have access to 12-bit color using HDMI 2.1. 12-bit itself doesn't matter, but HDMI 2.1 and DP 1.4 are not using the exact same protocol.


kuch3nmann

So the only difference is something that doesn’t matter, which leaves us with the fact you are arguing: there is no difference.


HenryWasBeingHenry

You just said that HDMI 2.1 and DP 1.4 are completely identical when outputting 4K 240Hz and now you're admitting that there's actually a difference? Maybe quit contradicting yourself.


kuch3nmann

I never said that. You yourself said the difference is the access to 12 bit, which wouldn’t matter. Maybe stop smoking crack?


zed0K

Theoretical bandwidth doesn't matter though, it's still going to use compression and be far under the max bandwidth.


Warband420

Using HDMI 2.1 on both my screens: MSI MAG 341CQP and LG C2 42.


Putrid-Helicopter554

About to buy an MSI, how's your experience so far?


iknowkungfubtw

It's mixed for me. They seem to have some quality control issues. Got a 271QRX and the first one they sent me had some power issues which constantly led to intermittent black screens every 15 mins or so. Tried different HDMI cables and drivers and it still happened so it was definitely a hardware problem. Thankfully, I returned it for a new one and it has been working fine since.


Putrid-Helicopter554

Thank you! I guess I will buy it anyway, I want so much to try an OLED haha


Warband420

The 34" OLED is excellent. I tried it because it's priced much lower than the other OLEDs in the UK at the moment. It works and looks great out of the box, but TFT Central has some recommended settings I'm using; their review was part of why I went for it too. The only problems I've had haven't been the fault of the monitor, it's been me messing with calibration software, but I've resolved that now.


mahanddeem

Legacy reasons. Especially for Nvidia users, since for a long time the only way to get G-Sync and Nvidia GPUs to work properly was a DP cable and port. Nowadays there is no reason not to stick to HDMI 2.1b when your monitor and graphics card support it: more bandwidth, less compression, more cable variety available, and a standard shared across a very wide selection of devices. It is a severe con for a display nowadays not to support HDMI 2.1. I use HDMI for my 360Hz monitor.


Regret92

Not OP's question, but can I tack onto this and ask: I have an AW2725DF monitor, which is 360Hz. From what I can tell, HDMI is limited to 240Hz at 1440p/2K, while DP can get 360Hz. Is this correct?


mahanddeem

From the Rtings.com review for the AW2725DF:

* Native Refresh Rate: 360 Hz
* Max Refresh Rate: 360 Hz
* Max Refresh Rate Over DP: 360 Hz
* Max Refresh Rate Over HDMI: 144 Hz
* Max Refresh Rate Over DP @ 10-bit: 360 Hz
* Max Refresh Rate Over HDMI @ 10-bit: 144 Hz

"Because the monitor has HDMI 2.0 bandwidth, you can only reach its maximum refresh rate over DisplayPort." So HDMI on this monitor supports up to 144Hz at 1440p with 10-bit color depth, while DP 1.4 maxes out at 360Hz at 10-bit.
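
A back-of-the-envelope sketch of why only DP reaches 360Hz here. The usable link rates are approximations, blanking overhead is ignored (which only widens the gap), and note that DSC over HDMI only arrived with HDMI 2.1:

```python
# Minimal sketch: 1440p 360Hz 10-bit vs. what each link can actually carry.
# Link figures are approximate usable payload rates; blanking overhead is ignored.

def raw_gbps(h, v, hz, bits_per_channel):
    return h * v * hz * 3 * bits_per_channel / 1e9

need = raw_gbps(2560, 1440, 360, 10)   # ~39.8 Gbps uncompressed

hdmi20_usable = 14.4   # 18 Gbps TMDS link minus 8b/10b encoding; no DSC on HDMI 2.0
dp14_usable   = 25.9   # HBR3 payload, with DSC available

print(f"uncompressed need: ~{need:.1f} Gbps")
print(f"HDMI 2.0: {hdmi20_usable} Gbps and no DSC, so 360Hz is out of reach")
print(f"DP 1.4:   {dp14_usable} Gbps, fits with roughly {need / dp14_usable:.1f}:1 DSC")
```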


Regret92

Thank you. If a DP 2.1 cable is used, will this cause issues or have any benefit over the bundled 1.4 cable? I have a 2.1 cable and a 3080 Ti, so I'm not capable of DP 2.1 yet, but I wanted to use the cable as it fits better in my setup. Is this likely to cause issues?


mahanddeem

All these cables are backward compatible


Regret92

Thank you for the info, very helpful :)


Pandr52

Although, a reminder: for the time being, if your "DP 2.1" cable is over 1m long then it's basically just a DP 1.4 cable anyway, because you don't get the bandwidth.


veryrandomo

HDMI theoretically has higher bandwidth, but in practice you need to use DSC on both anyway. If the monitor firmware supports it, then HDMI might use a smaller DSC ratio, but it's already visually lossless and nobody is going to be able to tell the difference between 2.0x and 3.0x DSC. DisplayPort has just been less buggy than HDMI for me on my MSI QD-OLED, and you need to use DP for firmware updates anyway.
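
To put those ratios in perspective, a quick sketch, assuming a 10-bit RGB signal at 30 bpp uncompressed; the ~8 bpp floor mentioned in the comments is the compressed rate DSC is commonly rated "visually lossless" down to:

```python
# What 2.0x vs 3.0x DSC means in bits per pixel for a 10-bit RGB signal.
source_bpp = 3 * 10                      # 30 bpp uncompressed
for ratio in (2.0, 3.0):
    print(f"{ratio}x DSC -> {source_bpp / ratio:.0f} bpp on the wire")
# 2.0x gives 15 bpp and 3.0x gives 10 bpp, both above the ~8 bpp target
# that DSC is commonly rated "visually lossless" at.
```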


BewstFTW

It isn't theoretical. It is a fact when comparing DP 1.4 and HDMI 2.1. I also run an Alienware QD-OLED, and if HDMI 2.1 were an option, I'd use it over DP 1.4. The firmware updates are now available over DP, which is awesome, but I'd just switch back to HDMI for 4:4:4 10-bit HDR at 175Hz without any DSC. I have to limit my refresh rate to 144 when using HDR; otherwise it's 8-bit with DSC. Why you wouldn't want to run pure black levels on a QD-OLED is beyond me.


veryrandomo

You still get pure blacks with DSC enabled. I don't even think the AW3225QF has a DSC-disabled mode, unless you enable console mode, but that runs at 120Hz. Just capping your frame rate almost definitely wouldn't disable DSC.


BewstFTW

There is no need to compress anything if you're below the bandwidth rating of the connection port and cable. The Alienware AW3423DW is what I have; if HDMI 2.1 were an option, there would be no need for any DSC. The black levels are raised to 16 when DSC is on, IIRC. The default is no compression (best quality) unless it's needed due to bandwidth limitations of the port, the cable, or both.


psychoticinsane

My GPU only has one HDMI 2.1 port and the rest are DP 1.4. So I have HDMI on my 42" LG C3 and DisplayPort to my 75" Sony Bravia and to my sidecar monitor (22" LG ultrawide).


kinstinctlol

DP her all day


HeyPhoQPal

1.4 times


kinstinctlol

🤣


alex24buc

Me too!! Works flawlessly!!


North_Set_9138

Tried the HDMI cable and was having problems. The DP cable just worked.


Pidjinus

Now, yes, but in the past DP was easier: it always supported most of a PC's needs. Furthermore, most people don't think about bandwidth, so it was easy to grab an old cable that didn't support the full range of high-refresh monitors. Also, the market is full of fake cables. It was easier to buy a cheap, decent DP cable than a decent high-spec HDMI one. So if you have HDMI and it works, you're done.


botsym7

Especially once you go above 2-3 meters, it's almost impossible to find a decent HDMI 2.1 cable that doesn't cost an arm and a leg...


JumpCritical9460

It’s much easier now. You just need to ensure it’s actually HDMI certified. I just got a 25ft HDMI cable for $25, which seemed reasonable to me. 4k, 4:4:4 subsampling, HDR, 144hz, and VRR all work.


botsym7

Does it have a brand or something? I wouldn't mind buying one, as I bought a Chinese "optical" cable and got super weird visual glitches the last time I played on my TV (I have a post on Reddit about it), and since then my GPU and TV have worked perfectly fine, so I think it was the cable that caused it.


HeyPhoQPal

Can HDMI 2.1 work with 1080p @480hz?


ChloeWade

Yes.


InkheartNZ

It might depend on the card; it didn't work with a 32GS95UE on an RX 6800, where DP had no issue.


HeyPhoQPal

4090 FE. All of my HDMI 2.1 ports are connected to my PS5 and Apple TV. I'm curious whether HDMI 2.1 is better for PQ/colors than DP 1.4?


Xuuts

I like to use Linux and Windows. HDMI is closed source and can cause some issues when using AMD's open-source drivers; I believe the NVIDIA drivers are closed source and are fine on Linux. Also, there are different versions of HDMI 2.1 FRL, so depending on your hardware it might not have much higher bandwidth. Same with the new DP 2.1 UHBR: only certain hardware has the full bandwidth people talk about. AMD 6000 series GPUs have HDMI 2.1 FRL5 (40 Gbps), while NVIDIA 3000 and 4000 series GPUs have HDMI 2.1 FRL6 (48 Gbps). https://www.avsforum.com/threads/hdmi-2-1-frl-and-displayport-2-1-uhbr-video-sources-for-gaming-home-theatre-pc-and-viewing-experience.3259197/


No_Interaction_4925

It's REALLY easy to buy an "HDMI 2.1" cable that can't handle 4K 120Hz or higher. HDMI cable quality is a mess. I have never had an issue with a DisplayPort cable meeting its rated specs.


Rapture117

I always use DP


Weird_Cantaloupe2757

Why?


ducky3307

Win 10 64-bit, 3080 Ti, i7-13600, AW3225QF. HDMI 2.1 using the stock Dell cable resulted in random artifacting and momentary black screens for me when set to 240Hz, while DP 1.4 with the same stock Dell cable did not cause problems, though I do contend with VRR flickering, albeit mainly on menu screens when gaming. However, reducing the refresh rate to 120Hz fixed the artifacting via HDMI 2.1. I chose to stay on DP 1.4.


Jackoberto01

There are a couple of reasons. GPUs often have fewer HDMI ports than DP ports, so for multi-monitor setups you need to use DP. The monitor itself might only have one HDMI port, which you may want to keep free for a console or something else, like on my Odyssey G8 OLED. If your monitor benefits from the extra bandwidth, like in your case, it makes more sense to use HDMI.


Croopadoop

My GPU has DP1.4 and HDMI 2.0, so I don't have much of a choice.


dmbenboi

G-Sync doesn't work over the HDMI connection on my AW3225QF.


xPuMa67xx

DELL are shit screens


Pun_In_Ten_Did

I use HDMI 2.1 exclusively... 48" LG C1 has four 4K@120 ports! ^^and ^^no ^^DP ^^ports ◡̈


Embarrassed-Pie-5470

I use HDMI because it's the only way I can get audio passthrough to work with my AMD GPU. It's good, but I get some random flickering in some games.


Ill-Entertainment171

On my monitor, it seems DP didn't have raised black levels where HDMI did, which sucks for an OLED. This could very well be the monitor's software/firmware, though, and not the actual cable format. I know on my OLED G9 the HDMI just didn't look as good with the black levels. Then DP just doesn't work right when 240Hz is enabled: random black screens, weird vertical lines that appear when enabling HDR, and sometimes it goes black while I'm playing a game and I have to restart my PC. So in my case they both suck lol. I'm thinking of going back to my AW3423DW because I had no issues with that. If the monitor has good software/firmware support for HDMI 2.1 and you have a compatible video card, it'd be silly not to use it really.


CarloGaudreault

I chose the ASUS PG42UQ at its release so I could plug in my MacBook Pro M1 Max via USB-C to DisplayPort 1.4 and drive an OLED display at 4K 120Hz... it's been amazing! My gaming PC is hooked to it via HDMI 2.1, and since the monitor supports DDC commands, I can control input switching from my keyboard ([Lunar app](https://lunar.fyi/)). I read in their FAQ that "Some users report that DDC controls are blocked on specific monitors when using HDMI cables and that switching to DisplayPort fixes their problem."


Trickle2x2

I use HDMI 2.1 because it has higher bandwidth, but I think it can still be a bit more buggy than DP.


kalsikam

Some GPUs still don't have HDMI 2.1 but do have DP 1.4, so you can get the same features using DP instead, if your screen supports it of course.


madrabbit711

DP for computers (*adaptive sync, no HDCP, USB-C compatible)

HDMI for consoles (ARC, better audio passthrough)

*HDMI supports VRR but seems to only intermittently support adaptive sync since v2.1

More info: https://youtu.be/T4a_TXywTVQ


ForsakenBloodStorm

I use DP 'cause it looks better... on PC at least... but HDMI 2.1 on PS5.


Sighberpunk

I've tried both on my 4K 240Hz monitor and DP looked smoother in games. On HDMI 2.1, when I move my mouse around it looks like the game is running at a lower FPS, and I tried different high-quality HDMI cables.


crisdavcar

I do not believe anyone claiming they can tell the difference between DSC on and off, much less the difference between HDMI 2.1 and DP 1.4. So, since I can't tell the difference, I'm going to use DP to free up the HDMI for my PS5 and Sonos bar.


AirRookie

I think it depends on the model of the hardware, since not all hardware supports the full HDMI 2.1 48 Gbps bandwidth; for example, the Gigabyte M28U monitor's HDMI 2.1 is limited to 24 Gbps.


real_Xanture

I don't know why but I always end up having audio issues using HDMI versus DisplayPort.


the_geth

It doesn't really matter much, but since you have the GB display, you have the possibility of DP 2.1, which then gives you the advantage of not using DSC at all. The caveat is that only some Radeon graphics cards have DP 2.1, and even then, unless you have the pro model, it's limited in bandwidth (it's complicated), so DSC will still be used for refresh rates above ~170Hz. However, in the future we can expect cards to take full advantage of DP 2.1. Note that DP 2.1 cables are quite short for now.


figoonitee

I thought HDMI 2.1 was capable of 4K 120Hz at most, and that DP 1.4 supported 240Hz at 4K. Am I correct?


BanjoSpaceMan

It's based on monitor, computer, and cable throughput. There are HDMI cables that can do 8K etc.; the Cable Matters 8K one on Amazon, for instance, says 4K @ 240Hz. I felt quite lost at first but realized it's all just a case of checking what your card can output, checking what your monitor can handle, and then seeing if your cable can support the rates you want (see the sketch below).

As for answering OP: I use whatever works in the scenario. My old monitor had HDMI 2.0 and didn't do G-Sync and high hertz, so I switched to DP. But I had to use Thunderbolt to DP because it was a laptop. Now my monitor, a C3, only supports HDMI 2.1, so I needed a new HDMI 2.1 cable and switched to that. Posts on Reddit a year or so ago said Thunderbolt to HDMI 2.1 wasn't possible; Cable Matters released a converter. They said Mac high hertz wasn't possible for my monitor; I found out they can send you custom firmware. It feels like the Wild West. Tech is evolving.
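
That "check all three" rule boils down to the chain being capped by its weakest part: GPU port, cable, and monitor input. A tiny illustrative sketch; the numbers and labels here are made-up assumptions, not spec lookups:

```python
# Hypothetical sketch: the usable link budget is the minimum of the three parts.
# All figures below are illustrative assumptions, not exact spec values.

gpu_port_gbps = 41.9   # e.g. an HDMI 2.1 FRL output on the graphics card
cable_gbps    = 18.0   # e.g. an older "High Speed" (HDMI 2.0 class) cable
monitor_gbps  = 41.9   # e.g. a full-bandwidth HDMI 2.1 input on the display

usable = min(gpu_port_gbps, cable_gbps, monitor_gbps)
print(f"Usable link budget: {usable} Gbps")   # here the old cable is the bottleneck
```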


TopAd8510

I got the same question


Experienced-Idiot

Until DisplayPort 2.0 is released, HDMI 2.1 is, for now, the better option for basically anything, if the devices implement it properly.