
It seems that 7th gen and later Intel processors can support HDMI 2.0, but only with an external "LSPCON" (Level Shifter and Protocol CONverter) component on the motherboard. See <https://www.intel.com/content/dam/support/us/en/documents/graphics/HDR_Intel_Graphics_TechWhitePaper.pdf>. As I've mentioned before, going through DisplayPort also works and does not require an LSPCON.

On Sat, Feb 08, 2020 at 01:07:35PM -0500, D. Hugh Redelmeier via talk wrote:
It seems that 7th gen and later Intel processors can support HDMI 2.0, but only with an external "LSPCON" (Level Shifter and Protocol CONverter) component on the motherboard.
As I've mentioned before, going through DisplayPort also works and does not require an LSPCON.
I love how it mentions future generations won't need it. Unfortunately they mean future generations of the graphics core, not the CPU: as far as I can tell, the same problem applies to 8th and 9th gen CPUs, since the graphics core design hasn't changed. I wonder if they will skip 2.0 and go straight to 2.1 when they finally get around to updating that design. -- Len Sorensen

On 2/10/20 10:44 AM, Lennart Sorensen via talk wrote:
On Sat, Feb 08, 2020 at 01:07:35PM -0500, D. Hugh Redelmeier via talk wrote:
It seems that 7th gen and later Intel processors can support HDMI 2.0, but only with an external "LSPCON" (Level Shifter and Protocol CONverter) component on the motherboard.
As I've mentioned before, going through DisplayPort also works and does not require an LSPCON.
I love how it mentions future generations won't need it. Unfortunately they mean future generations of the graphics core, not the CPU: as far as I can tell, the same problem applies to 8th and 9th gen CPUs, since the graphics core design hasn't changed. I wonder if they will skip 2.0 and go straight to 2.1 when they finally get around to updating that design.
If I recall correctly, HDMI 2.0, like USB 3.2 and PCI 4.0, does not require a hardware upgrade. The protocol changed, but older hardware should be able to use it, depending on whether it can handle the new requirements. This was one of the reasons AMD allowed, for a while, PCI 4.0 on older Ryzen chipsets.
If HDMI for the last two versions is like this, it really doesn't make sense not to allow it on certain older computers. Probably just a way for Intel to sell more chips.
I could be mistaken, but from memory this is what I recall, Nick

On Mon, Feb 10, 2020 at 11:50:32AM -0500, Nicholas Krause via talk wrote:
If I recall correctly, HDMI 2.0, like USB 3.2 and PCI 4.0, does not require a hardware upgrade. The protocol changed, but older hardware should be able to use it, depending on whether it can handle the new requirements. This was one of the reasons AMD allowed, for a while, PCI 4.0 on older Ryzen chipsets.
If HDMI for the last two versions is like this, it really doesn't make sense not to allow it on certain older computers. Probably just a way for Intel to sell more chips.
I could be mistaken, but from memory this is what I recall, Nick
HDMI 1.0 through 1.2 had 4.95Gbps bandwidth (like DVI). HDMI 1.3 and 1.4 had 10.2Gbps bandwidth. HDMI 2.0 has 18Gbps bandwidth. HDMI 2.1 has 48Gbps bandwidth. So hardware changes are required to increase those link speeds. Now 2.0a and 2.0b are just small additions to the protocol of 2.0. The reason people want HDMI 2.0 and 2.1 is to get 4K resolution at a decent frame rate, which HDMI 1.4 ports (like what Intel still puts in their graphics core) can't do; 30Hz doesn't cut it. 2.0 can do 4K 60Hz, but if you want HDR (so more than 24-bit color) you have to use chroma subsampling, and then text can start to look bad. 2.1 can do 120Hz and has the bandwidth for HDR as well. -- Len Sorensen
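[A quick back-of-the-envelope check of those numbers, as a sketch: the CTA-861 pixel clocks (297 MHz for 4K30, 594 MHz for 4K60) and the TMDS 8b/10b overhead are my assumptions, not figures from Len's post.]

# Rough sanity check of the HDMI link-rate numbers above.
LINK_RATES_GBPS = {          # total TMDS wire bandwidth per HDMI version
    "HDMI 1.0-1.2": 4.95,
    "HDMI 1.3/1.4": 10.2,
    "HDMI 2.0": 18.0,
}

def wire_rate_gbps(pixel_clock_mhz, bits_per_pixel=24):
    """Bits per second on the three TMDS data channels, incl. 8b/10b overhead."""
    data_rate = pixel_clock_mhz * 1e6 * bits_per_pixel  # payload bits/s
    return data_rate * 10 / 8 / 1e9                     # 10 wire bits per 8 data bits

for name, clk in [("4K @ 30 Hz", 297), ("4K @ 60 Hz", 594)]:
    needed = wire_rate_gbps(clk)
    print(f"{name}: needs ~{needed:.1f} Gbps on the wire")
    for version, rate in LINK_RATES_GBPS.items():
        print(f"  {version}: {'ok' if rate >= needed else 'too slow'}")

[This lands at roughly 8.9 Gbps for 4K30 (fits HDMI 1.3/1.4) and roughly 17.8 Gbps for 4K60 (only HDMI 2.0 and up), matching the point above.]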

On 2/10/20 12:37 PM, Lennart Sorensen wrote:
On Mon, Feb 10, 2020 at 11:50:32AM -0500, Nicholas Krause via talk wrote:
If I recall correctly, HDMI 2.0, like USB 3.2 and PCI 4.0, does not require a hardware upgrade. The protocol changed, but older hardware should be able to use it, depending on whether it can handle the new requirements. This was one of the reasons AMD allowed, for a while, PCI 4.0 on older Ryzen chipsets.
If HDMI for the last two versions is like this, it really doesn't make sense not to allow it on certain older computers. Probably just a way for Intel to sell more chips.
I could be mistaken, but from memory this is what I recall, Nick
HDMI 1.0 through 1.2 had 4.95Gbps bandwidth (like DVI). HDMI 1.3 and 1.4 had 10.2Gbps bandwidth. HDMI 2.0 has 18Gbps bandwidth. HDMI 2.1 has 48Gbps bandwidth.
So hardware changes are required to increase those link speeds.
Probably, but PCI 4.0 doubled its speed without wire changes or additions, just a protocol change. Maybe HDMI can't do that, or only for minor version changes. That was from just under 16 GB/s to just under 32 GB/s in an x16 lane. Nick
Now 2.0a and 2.0b are just small additions to the protocol of 2.0.
The reason people want HDMI 2.0 and 2.1 is to get 4K resolution at a decent frame rate, which HDMI 1.4 ports (like what Intel still puts in their graphics core) can't do; 30Hz doesn't cut it.
2.0 can do 4K 60Hz, but if you want HDR (so more than 24-bit color) you have to use chroma subsampling, and then text can start to look bad. 2.1 can do 120Hz and has the bandwidth for HDR as well.
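[A similar sketch for the HDR point: why 4K60 with 10-bit colour overflows HDMI 2.0 unless chroma is subsampled. The 594 MHz pixel clock, the 8b/10b overhead, and the detail that HDMI carries 4:2:2 in the same 24-bit-per-pixel container as 8-bit RGB are my assumptions, not from the thread.]

# Why 4K60 HDR over HDMI 2.0 ends up using chroma subsampling.
HDMI20_LIMIT_GBPS = 18.0
PIXEL_CLOCK_HZ = 594e6       # assumed CTA-861 clock for 3840x2160@60

formats = {
    "RGB 8-bit (24 bpp)": 24,
    "RGB 10-bit HDR (30 bpp)": 30,
    "YCbCr 4:2:2 10/12-bit (24 bpp on the wire)": 24,
}

for name, bpp in formats.items():
    wire = PIXEL_CLOCK_HZ * bpp * 10 / 8 / 1e9   # add 8b/10b overhead
    fits = "fits" if wire <= HDMI20_LIMIT_GBPS else "does NOT fit"
    print(f"{name}: ~{wire:.1f} Gbps -> {fits} in HDMI 2.0")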

On Mon, Feb 10, 2020 at 12:42:37PM -0500, Nicholas Krause wrote:
Probably, but PCI 4.0 doubled its speed without wire changes or additions, just a protocol change. Maybe HDMI can't do that, or only for minor version changes. That was from just under 16 GB/s to just under 32 GB/s in an x16 lane.
HDMI is just as backwards compatible as PCIe. Same connector, same wires, different signalling depending on the negotiation of the two devices involved. It's just rather pathetic that Intel's graphics chips are so outdated that they can't drive a modern display at a decent resolution and framerate. They can use it, but at a very low frame rate. Same as PCIe: you only get 4.0 speed if both the machine and the card support it, otherwise it falls back to the best that both support. If you need 4.0 speed but the machine doesn't support it, then you are out of luck. The state of Intel's graphics is like someone selling PCIe 2.0 machines while 4.0 is the current version. -- Len Sorensen
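[For reference, a small sketch of where the "just under 16 GB/s and 32 GB/s" x16 figures come from; the per-lane transfer rates and line encodings are the standard PCIe numbers, which the thread doesn't spell out.]

# Rough PCIe throughput per generation for an x16 slot.
GENS = {  # gen: (GT/s per lane, encoding payload bits, encoding total bits)
    "PCIe 1.0": (2.5, 8, 10),
    "PCIe 2.0": (5.0, 8, 10),
    "PCIe 3.0": (8.0, 128, 130),
    "PCIe 4.0": (16.0, 128, 130),
}

def x16_gbytes_per_sec(gt_per_s, payload_bits, total_bits, lanes=16):
    # GT/s are raw symbols per lane; apply encoding efficiency, then bits -> bytes
    return gt_per_s * payload_bits / total_bits * lanes / 8

for gen, (rate, p, t) in GENS.items():
    print(f"{gen}: ~{x16_gbytes_per_sec(rate, p, t):.2f} GB/s in an x16 slot")
# PCIe 3.0 -> ~15.75 GB/s and PCIe 4.0 -> ~31.51 GB/s:
# the "just under 16 and 32 GB/s" figures mentioned above.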

On 2/10/20 4:37 PM, Lennart Sorensen wrote:
On Mon, Feb 10, 2020 at 12:42:37PM -0500, Nicholas Krause wrote:
Probably, but PCI 4.0 doubled its speed without wire changes or additions, just a protocol change. Maybe HDMI can't do that, or only for minor version changes. That was from just under 16 GB/s to just under 32 GB/s in an x16 lane.
HDMI is just as backwards compatible as PCIe. Same connector, same wires, different signalling depending on the negotiation of the two devices involved.
It's just rather pathetic that Intel's graphics chips are so outdated that they can't drive a modern display at a decent resolution and framerate. They can use it, but at a very low frame rate.
Same as PCIe: you only get 4.0 speed if both the machine and the card support it, otherwise it falls back to the best that both support. If you need 4.0 speed but the machine doesn't support it, then you are out of luck. The state of Intel's graphics is like someone selling PCIe 2.0 machines while 4.0 is the current version.
That makes sense. Though I would rather have a professional-level monitor at 1080p than 4K. Color depth, accuracy and text contrast matter a lot more than resolution when it comes down to it. And frankly 4K pro is a lot more expensive due to being cutting edge. And yes, text contrast is important for programming, as is other contrast in forms of scaling/rendering text.
I would make the same argument about keyboards as well, in that I would rather have a mechanical keyboard than any laptop keyboard. And frankly a great keyboard is very underrated, similar to the above text issues for monitors. Though Intel has, at least in my view, historically been a little too conservative when updating to new buses or hardware versions, it doesn't surprise me, Nick

On Mon, Feb 10, 2020 at 5:27 PM Nicholas Krause via talk <talk@gtalug.org> wrote:
On 2/10/20 4:37 PM, Lennart Sorensen wrote:
On Mon, Feb 10, 2020 at 12:42:37PM -0500, Nicholas Krause wrote:
snip
Same as PCIe: you only get 4.0 speed if both the machine and the card support it, otherwise it falls back to the best that both support. If you need 4.0 speed but the machine doesn't support it, then you are out of luck. The state of Intel's graphics is like someone selling PCIe 2.0 machines while 4.0 is the current version.
That makes sense. Though I would rather have a professional-level monitor at 1080p than 4K. Color depth, accuracy and text contrast matter a lot more than resolution when it comes down to it. And frankly 4K pro is a lot more expensive due to being cutting edge. And yes, text contrast is important for programming, as is other contrast in forms of scaling/rendering text.
Sorry, brother - - - - - you seem to be a wee bit behind the times. 8K is the cutting edge, and in the broadcast studio IIRC it's more like 10K or 12K that is viewed as the cutting edge (not much use). 4K is now in the domain of the sub-$300 TV market.
I would make the same argument about keyboards as well, in that I would rather have a mechanical keyboard than any laptop keyboard. And frankly a great keyboard is very underrated, similar to the above text issues for monitors.
Is there a decent keyboard out there any more? I'm finding that a 30% failure rate in 6 to 9 months is now considered normal. 20 years ago things sure weren't that 'good'. Regards

| From: Nicholas Krause via talk <talk@gtalug.org>
| That makes sense. Though I would rather have a professional-level | monitor at 1080p than 4K. Color depth, accuracy and text contrast | matter a lot more than resolution when it comes down to it. | And frankly 4K pro is a lot more expensive due to being cutting edge. | And yes, text contrast is important for programming, as is other contrast | in forms of scaling/rendering text.
I don't agree. I talked about this in my lightning talk at last month's meeting.
It depends on your use case. I program and read on my monitor. I don't game or do photo-editing or watch movies.
For my use case, resolution is quite useful whereas 8-bit is enough colour depth (I don't really know about 6-bit displays with dithering to simulate 8-bit). Text contrast should not be a problem if the display is bright enough (all desktops ought to be, whereas some notebooks might not be). UltraHD TV sets (as opposed to monitors) are dirt cheap.
My biggest surprise: 4:2:2 chroma subsampling is really a non-problem for my uses. Second biggest: 30Hz refresh isn't ideal but it is OK.
| I would make the same argument about keyboards as well, in that | I would rather have a mechanical keyboard than any laptop | keyboard. And frankly a great keyboard is very underrated, | similar to the above text issues for monitors.
I don't think that that is the same argument at all.
Ultra-thin notebooks are constrained to have thin keyboards. ThinkPad non-ultrabooks have fairly good keyboards. Stand-alone keyboards give you the most freedom.
On my desktop, I use a mechanical keyboard with Cherry MX brown keys. I like it but I could live with a non-mechanical keyboard. It glows green all the time because I don't have a driver for the stupid lights.

On 2/10/20 6:49 PM, D. Hugh Redelmeier via talk wrote:
| From: Nicholas Krause via talk <talk@gtalug.org>
| That makes sense. Though I would rather have a professional-level | monitor at 1080p than 4K. Color depth, accuracy and text contrast | matter a lot more than resolution when it comes down to it. | And frankly 4K pro is a lot more expensive due to being cutting edge. | And yes, text contrast is important for programming, as is other contrast | in forms of scaling/rendering text.
I don't agree. I talked about this in my lightning talk at last month's meeting.
It depends on your use case. I program and read on my monitor. I don't game or do photo-editing or watch movies.
For my use case, resolution is quite useful whereas 8-bit is enough colour depth (I don't really know about 6-bit displays with dithering to simulate 8-bit). Text contrast should not be a problem if the display is bright enough (all desktops ought to be whereas some notebooks might not be). UltraHD TV sets (as opposed to monitors) are dirt cheap.
My biggest surprise: 4:2:2 chroma subsampling is really a non-problem for my uses. Second biggest: 30Hz refresh isn't ideal but it is OK.
Readability of text is what I was talking about, not just color contrast. A lot of people assume that reading text is the same on all monitors; it isn't, and a lot of the pro-level displays are better at this. You're free to disagree that it matters. However, in my view it does help.
In addition, I was also hinting at how well dpi is implemented at a higher resolution, which does matter. Scaling text for a higher resolution is very much dependent on this. For whatever reason, better dpi scaling and text scaling almost always go hand in hand with better color contrast.
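[To make the dpi point concrete, a rough pixels-per-inch comparison; the panel sizes and resolutions below are illustrative examples I picked, not ones mentioned in the thread.]

# Pixels-per-inch for a few example panels.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

examples = [
    ("27in 1920x1080 monitor", 1920, 1080, 27),
    ("27in 3840x2160 monitor", 3840, 2160, 27),
    ("6in 1448x1072 e-reader", 1448, 1072, 6),
]
for name, w, h, d in examples:
    print(f"{name}: ~{ppi(w, h, d):.0f} ppi")
# roughly 82, 163 and 300 ppi: text sharpness tracks pixel density,
# which is part of why e-readers and 4K panels render text so crisply.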
| I would make the same argument about keyboards as well, in that | I would rather have a mechanical keyboard than any laptop | keyboard. And frankly a great keyboard is very underrated, | similar to the above text issues for monitors.
I don't think that that is the same argument at all.
Ultra-thin notebooks are constrained to have thin keyboards. ThinkPad non-ultrabooks have fairly good keyboards. Stand-alone keyboards give you the most freedom.
On my desktop, I use a mechanical keyboard with Cherry MX brown keys. I like it but I could live with a non-mechanical keyboard. It glows green all the time because I don't have a driver for the stupid lights.
My point is that most people focus on certain things, like resolution, without all the details. This is similar to other things; maybe my explanation was faulty. Sorry for the misunderstanding, Nick

| From: Nicholas Krause via talk <talk@gtalug.org>
| Readability of text is what I was talking about, not just color contrast.
OK. Most but not all TV sets are fine for this. I would not trust the RGBW displays but I haven't tried them.
Also: go for IPS or VA technology.
I think that I mentioned this in my Lightning Talk.
| A lot of people assume that reading text is the same on all monitors, | it isn't, and a lot of the pro-level displays are better at this.
"pro level" sounds like a marketing term. Perhaps you mean: very expensive, aimed at professional _____. Photographers? Videographers? "Prosumers"? Traders? Programmers? Engineers? Architects?
| You're free | to disagree that it matters. However, in my view it does help.
Opinions can be refined by research. That's what I've tried to contribute to, on this list and in my talk.
| In addition, I was also hinting at how well dpi is implemented at a higher | resolution, which does matter. Scaling text for a higher resolution | is very much dependent on this. For whatever reason, better dpi scaling | and text scaling almost always go hand in hand with better color contrast.
If you are letting your monitor do scaling you are doing it wrong. You should let your computer do that.
One exception: if a computer only does 1920x1080, you can let a TV/monitor double the pixels in each dimension. This is dumb in the long term but sometimes you need to do it for a short time (eg. to adjust firmware settings in a server). (It's really annoying to not be able to see text during POST and the subsequent startup.)
Scaling TV or movies is an interesting problem since it extends into the 4th dimension (time). You really don't want to get into that with a monitor. In fact, you want to turn off any multi-frame processing that a TV does because it will add latency to the display.
| My point is that most people focus on certain things, like resolution, | without all the details.
True. Again, this is why my talk was "what I've learned about UltraHD". Actual experience is enlightening. Reading specs is important but not sufficient. Ergonomics is full of surprises. One of them is: not everyone is the same. That's why I tried to frame my talk as about me :-)

On Tue, Feb 11, 2020 at 8:39 AM D. Hugh Redelmeier via talk <talk@gtalug.org> wrote:
| From: Nicholas Krause via talk <talk@gtalug.org>
| Readability of text is what I was talking about not just color contrast.
OK. Most but not all TV sets are fine for this. I would not trust the RGBW displays but I haven't tried them.
Also: go for IPS or VA technology.
I think that I mentioned this in my Lightning Talk.
| A lot of people assume that reading text is the same on all monitors, | it isn't and a lot of the pro level displays are better at this.
"pro level" sounds like a marketing term. Perhaps you mean: very expensive, aimed at professional _____. Photographers? Videographers? "Prosumers"? Traders? Programmers? Engineers? Architects?
| You're free | to disagree that it matters. However, in my view it does help.
Opinions can be refined by research. That's what I've tried to contribute to, on this list and my talk.
| In addition, I was also hinting at how well dpi is implemented at a higher | resolution, which does matter. Scaling text for a higher resolution | is very much dependent on this. For whatever reason, better dpi scaling | and text scaling almost always go hand in hand with better color contrast.
If you are letting your monitor do scaling you are doing it wrong. You should let your computer do that.
One exception: if a computer only does 1920x1080, you can let a TV/monitor double the pixels in each dimension. This is dumb in the long term but sometimes you need to do it for a short time (eg. to adjust firmware settings in a server). (It's really annoying to not be able to see text during POST and the subsequent startup.)
Scaling TV or movies is an interesting problem since it extends into the 4th dimension (time). You really don't want to get into that with a monitor. In fact, you want to turn off any multi-frame processing that a TV does because it will add latency to the display.
| My point is that most people focus on certain things like resolution | without all the details.
True. Again, this is why my talk was "what I've learned about UltraHD". Actual experience is enlightening. Reading specs is important but not sufficient. Ergonomics is full of surprises. One of them is: not everyone is the same. That's why I tried to frame my talk as about me :-)
To Mr Hugh (hope I have that correct!) I looked in my email and a search doesn't return anything appropriate for GTALug + lightning talk. Would you be able to provide a link so that I might 'see' such? (Distance makes personal attendance somewhat challenging!) Regards

| From: o1bigtenor via talk <talk@gtalug.org>
| To Mr Hugh (hope I have that correct!)
You don't have to be so formal. "Hugh" is fine.
| I looked in my email and a search doesn't return anything appropriate for | GTALug + lightning talk. | | Would you be able to provide a link so that I might 'see' such? | (Distance makes personal attendance somewhat challenging!)
We have a great team recording each meeting. You could surf our YouTube channel for our other wonderful talks. Here's what Alex posted about January's meeting:
From talk@gtalug.org Sat Jan 25 22:16:48 2020
From: Alex Volkov via talk <talk@gtalug.org>
To: GTALUG Talk <talk@gtalug.org>
Date: Sat, 25 Jan 2020 22:16:44 -0500
Subject: [GTALUG] January 2020 meeting videos
Hello everyone,
Here are the videos from our January meeting:
* What I've learned about Ultra HD with D. Hugh Redelmeier -- https://youtu.be/i-eXS-1wdzI
* Chris' intro to Kubernetes -- https://youtu.be/F7z8AZjS28U
* ZEO Bedside 2020 with Seneca Cunningham -- https://youtu.be/NIMRI3wr1Yc
* LUG community organizing and meetup.com with Alex Volkov -- https://youtu.be/0I880MBDAXM
Here are these videos in a single playlist -- https://www.youtube.com/playlist?list=PLUgE6dqIXiEukZ9-hlcBfMniy_avsB6bQ
I would like to know your opinion -- should I send the email about posted videos to the announce list, or is just sending it to talk enough?
Thank you, Alex.

On Wed, Feb 12, 2020 at 11:01 AM D. Hugh Redelmeier via talk <talk@gtalug.org> wrote:
| From: o1bigtenor via talk <talk@gtalug.org>
| To Mr Hugh (hope I have that correct!)
You don't have to be so formal. "Hugh" is fine.
| I looked in my email and a search doesn't return anything appropriate for | GTALug + lightning talk. | | Would you be able to provide a link so that I might 'see' such? | (Distance makes personal attendance somewhat challenging!)
We have a great team recording each meeting. You could surf our YouTube channel for our other wonderful talks. Here's what Alex posted about January's meeting:
My search terms would not have given what you have included.
From talk@gtalug.org Sat Jan 25 22:16:48 2020 From: Alex Volkov via talk <talk@gtalug.org> To: GTALUG Talk <talk@gtalug.org> Date: Sat, 25 Jan 2020 22:16:44 -0500 Subject: [GTALUG] January 2020 meeting videos
Hello everyone,
Here are the videos from our January meeting
* What I've learned about Ultra HD with D. Hugh Redelmeier -- https://youtu.be/i-eXS-1wdzI
* Chris' intro to Kubernetes -- https://youtu.be/F7z8AZjS28U
* ZEO Bedside 2020 with Seneca Cunningham -- https://youtu.be/NIMRI3wr1Yc
* LUG community organizing and meetup.com with Alex Volkov -- https://youtu.be/0I880MBDAXM
Here are these videos in a single playlist -- https://www.youtube.com/playlist?list=PLUgE6dqIXiEukZ9-hlcBfMniy_avsB6bQ
I would like to know your opinion -- should I send the email about posted videos to the announce list, or is just sending it to talk enough?
I now remember the email. When I was looking for the email, though, I used 'lightning talk' in my search terms, and that term isn't included anywhere in the email. For me, the announcement as it was made is great. Maybe adding something like 'lightning talk videos' or some such would make it easier to search for that particular email. The title of the talk is useful, but as I am thinking of the various talks as GTALug lightning talk topic/blah/blah, finding what I'm looking for just wasn't happening. Thanking you for your assistance. Regards

On Tue, Feb 11, 2020 at 09:39:29AM -0500, D. Hugh Redelmeier via talk wrote:
OK. Most but not all TV sets are fine for this. I would not trust the RGBW displays but I haven't tried them.
My TV is RGBW, but it is OLED. Some RGBW LCD panels from LG in the past did not actually have RGB subpixels for every pixel, which was a problem. The OLED ones do have RGBW subpixels for every pixel (so about 33M subpixels). I would not use it as my computer display, although I am sure it could do it just fine. I do have a mythtv frontend connected to it, and due to HDMI 1.4 limitations on that machine, I have to force it to 1920x1080@60Hz rather than the 3840x2160@30Hz it likes to default to, since my content on mythtv is never more than 1920x1080 but is often 60Hz.
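[For anyone wanting to pin a mode the same way, a minimal sketch, assuming an X11 session with xrandr available and an output that really is named HDMI-1; check `xrandr --query` for the actual name on your machine.]

# Force an HDMI output to 1920x1080@60 instead of a 4K@30 default.
import subprocess

OUTPUT = "HDMI-1"  # hypothetical name; yours may be HDMI-A-0, HDMI-2, ...

# Show current outputs and modes so the output name can be confirmed first.
print(subprocess.run(["xrandr", "--query"], capture_output=True, text=True).stdout)

# Pin the mode and refresh rate.
subprocess.run(["xrandr", "--output", OUTPUT, "--mode", "1920x1080", "--rate", "60"],
               check=True)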
Also: go for IPS or VA technology.
For a computer monitor, sure, but for my TV, no way.
If you are letting your monitor do scaling you are doing it wrong. You should let your computer do that.
Well sure but if your HDMI link is too slow, letting the screen do it is a better option than having terrible frame rate.
One exception: if a computer only does 1920x1080, you can let a TV/monitor double the pixels in each dimension. This is dumb in the long term but sometimes you need to do it for a short time (eg. to adjust firmware settings in a server). (It's really annoying to not be able to see text during POST and the subsequent startup.)
Scaling TV or movies is an interesting problem since it extends into the 4th dimension (time). You really don't want to get into that with a monitor. In fact, you want to turn off any multi-frame processing that a TV does because it will add latency to the display.
True. Again, this is why my talk was "what I've learned about UltraHD". Actual experience is enlightening. Reading specs is important but not sufficient. Ergonomics is full of surprises. One of them is: not everyone is the same. That's why I tried to frame my talk as about me :-)
The specs are often lacking in details. -- Len Sorensen

On 2/11/20 9:39 AM, D. Hugh Redelmeier via talk wrote:
| From: Nicholas Krause via talk <talk@gtalug.org>
| Readability of text is what I was talking about not just color contrast.
OK. Most but not all TV sets are fine for this. I would not trust the RGBW displays but I haven't tried them.
Also: go for IPS or VA technology.
I think that I mentioned this in my Lightning Talk.
| A lot of people assume that reading text is the same on all monitors, | it isn't and a lot of the pro level displays are better at this.
"pro level" sounds like a marketing term. Perhaps you mean: very expensive, aimed at professional _____. Photographers? Videographers? "Prosumers"? Traders? Programmers? Engineers? Architects? When I mean pro level I mean anything that can be used professionally or is good enough. You can find budget versions of these monitors at below $500 for 27 inches these days. The higher end ones just have better color depth and other things. If you shop around you can possibly find it cheaper used or on sale and I would doubt that the ones above budget level get much better for text. However for text they would be read then a consumer display.
| You're free | to disagree that it matters. However, in my view it does help.
Opinions can be refined by research. That's what I've tried to contribute to, on this list and my talk.
| In addition, I was also hinting at how well dpi is implemented at a higher | resolution, which does matter. Scaling text for a higher resolution | is very much dependent on this. For whatever reason, better dpi scaling | and text scaling almost always go hand in hand with better color contrast.
If you are letting your monitor do scaling you are doing it wrong. You should let your computer do that.
One exception: if a computer only does 1920x1080, you can let a TV/monitor double the pixels in each dimension. This is dumb in the long term but sometimes you need to do it for a short time (eg. to adjust firmware settings in a server). (It's really annoying to not be able to see text during POST and the subsequent startup.)
Not exactly; scaling is not what you think I mean. I don't mean whether it scales well to fit the screen, i.e. text being too small, but the readability of text. This is where better dpi and text settings come in. For example, e-reader displays are very readable because of their high dpi relative to screen size and good text contrast, alongside other screen settings.
Scaling TV or movies is an interesting problem since it extends into the 4th dimension (time). You really don't want to get into that with a monitor. In fact, you want to turn off any multi-frame processing that a TV does because it will add latency to the display.
| My point is that most people focus on certain things like resolution | without all the details.
True. Again, this is why my talk was "what I've learned about UltraHD". Actual experience is enlightening. Reading specs is important but not sufficient. Ergonomics is full of surprises. One of them is: not everyone is the same. That's why I tried to frame my talk as about me :-)
Of course, the biggest problem with buying monitors these days is that it's basically a case-by-case purchase.
Nick

| From: Nicholas Krause via talk <talk@gtalug.org>
| If I recall correctly, HDMI 2.0, like USB 3.2 and PCI 4.0, does not require a hardware | upgrade. The protocol changed, but older hardware should be able to use it, depending | on whether it can handle the new requirements.
Not exactly.
HDMI 2.0 does not require new cables over HDMI 1.x. The connectors and signals are "the same". But the bandwidth is higher, so old cables may well not work. There is an optional certification system for cables.
HDMI 2.0 sources and sinks must support HDMI 1.x. So when you hook up an HDMI system, if either end is only 1.x, the result will be that a 1.x signal will be used.
If you want UltraHD @ 60 Hz, you need HDMI 2.0 on both sides and a cable that can handle the bandwidth.
| This was one of the reasons | AMD allowed, for a while, PCI 4.0 on older Ryzen chipsets.
That's not how I understand it.
This generation of Ryzen CPUs supports PCI 4.0. Certain Ryzen support chips can support PCI 4.0, even those on motherboards that predate the Ryzen CPUs that support PCI 4.0.
So: you could buy an older Ryzen motherboard and plug in a current Ryzen chip, and get PCI 4.0.
But: those motherboards were probably not tested/qualified for PCI 4.0. They might not be reliable. So AMD sent out a firmware update to disable PCI 4.0 on those motherboards.
Alternate explanation: those motherboards were cheap. AMD wanted to steer PCI 4.0 customers to new motherboards with expensive support chips. The difference in price was about $100-$150, if I remember correctly.
This, of course, upset customers that thought that they could get PCI 4.0 and had this taken away.
Luckily, hardly any consumer device is bottlenecked by PCI 3.x at the moment. In the future, GPUs and NVMe SSDs will be.
<https://www.tomshardware.com/news/amd-pcie-4.0-socket-am4-motherboard,39559.html>
| If HDMI for the last two versions is like this, it really doesn't make sense | not to allow it on certain older computers. Probably just a way for Intel | to sell more chips.
I don't think so. The bandwidths of HDMI are really quite high. Getting old chips to do it isn't likely a simple firmware upgrade.
Both HDMI 2.0 and PCI 4.0 probably consume more power.
I'm more annoyed at system manufacturers that didn't bother to include an LSPCON. We have two Dell notebooks that don't have them: one low-end and another high-end (with an UltraHD built-in screen!). Grrr.
The majority of our computers (way too many!) are Haswell and older because newer processors aren't good enough to obsolete Haswell. But the GPU is a weak spot that even an LSPCON could not fix.

On 2/10/20 12:52 PM, D. Hugh Redelmeier via talk wrote:
| From: Nicholas Krause via talk <talk@gtalug.org>
| If I recall correctly, HDMI 2.0, like USB 3.2 and PCI 4.0, does not require a hardware | upgrade. The protocol changed, but older hardware should be able to use it, depending | on whether it can handle the new requirements.
Not exactly.
HDMI 2.0 does not require new cables over HDMI 1.x. The connectors and signals are "the same". But the bandwidth is higher so old cables may well not work. There is an optional certification system for cables.
HDMI 2.0 sources and sinks must support HDMI 1.x. So when you hook up an HDMI system, if either end is only 1.x, the result will be that a 1.x signal will be used.
If you want UltraHD @ 60 Hz, you need HDMI 2.0 on both sides and a cable that can handle the bandwidth.
| This was one of the reasons | AMD allowed, for a while, PCI 4.0 on older Ryzen chipsets.
That's not how I understand it.
This generation of Ryzen CPUs supports PCI 4.0. Certain Ryzen support chips can support PCI 4.0, even those on motherboards that predate the Ryzen CPUs that support PCI 4.0.
So: you could buy an older Ryzen motherboard and plug in a current Ryzen chip, and get PCI 4.0.
But: those motherboards were probably not tested/qualified for PCI 4.0. They might not be reliable. So AMD sent out a firmware update to disable PCI 4.0 on those motherboards.
Alternate explanation: those motherboards were cheap. AMD wanted to steer PCI 4.0 customers to new motherboards with expensive support chips. The difference in price was about $100-$150, if I remember correctly.
This, of course, upset customers that thought that they could get PCI 4.0 and had this taken away.
Luckily, hardly any consumer device is bottlenecked by PCI 3.x at the moment. In the future, GPUs and NVMe SSDs will be.
That's correct, they weren't qualified or tested but could handle it. The spec does not mention wire changes, though. Also, don't forget 100Gb/s NICs, but that probably doesn't matter for most of us :).
<https://www.tomshardware.com/news/amd-pcie-4.0-socket-am4-motherboard,39559.html>
| If HDMI for the last two versions is like this, it really doesn't make sense | not to allow it on certain older computers. Probably just a way for Intel | to sell more chips.
I don't think so. The bandwidths of HDMI are really quite high. Getting old chips to do it isn't likely a simple firmware upgrade.
Both HDMI 2.0 and PCI 4.0 probably consume more power.
I'm more annoyed at system manufacturers that didn't bother to include an LSPCON. We have two Dell notebooks that don't have them: one low-end and another high-end (with an UltraHD built-in screen!). Grrr.
The majority of our computers (way too many!) are Haswell and older because newer processors aren't good enough to obsolete Haswell. But the GPU is a weak spot that even an LSPCON could not fix.
I'm assuming most are laptops, as otherwise you could probably find some inexpensive used GPUs that support that. Not sure how much that would cost, but it's an idea. Nick

On Mon, Feb 10, 2020 at 12:52:54PM -0500, D. Hugh Redelmeier via talk wrote:
Not exactly.
HDMI 2.0 does not require new cables over HDMI 1.x. The connectors and signals are "the same". But the bandwidth is higher so old cables may well not work. There is an optional certification system for cables.
HDMI 2.0 sources and sinks must support HDMI 1.x. So when you hook up an HDMI system, if either end is only 1.x, the result will be that a 1.x signal will be used.
If you want UltraHD @ 60 Hz, you need HDMI 2.0 on both sides and a cable that can handle the bandwidth.
So HDMI 1.0 to 1.2 used 165MHz TMDS signals with 8b/10b encoding (10 bits per TMDS "clock"), with 3 channels for RGB or YUV or whatever format is being sent. HDMI 1.3 and 1.4 increased the max clock to 340MHz, so almost double, but with the same signalling. HDMI 2.0 increases the clock to effectively 600MHz (well, technically 150MHz, but they do 4 bytes per clock instead of 1). HDMI 2.1 makes rather large changes. It doubles the bytes per clock (so from 6Gbps/channel to 12Gbps/channel at a 150MHz clock), and also moves from 8b/10b to 16b/18b encoding to increase efficiency. It also drops the use of a dedicated clock channel, embeds the clock in the data, and uses the old clock lane as a fourth data channel, hence the 48Gbps total bandwidth. And yes, certainly whenever they double the frequency of the signal going over the wire it has a tendency (but not a requirement) to need better cables. -- Len Sorensen
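[Putting Len's per-version figures together, a small sketch that recomputes the total link rates from the effective clock, channel count, and line coding; the numbers are the ones listed above, and treating HDMI 2.0 as an effective 600 MHz clock is my simplification.]

# Recompute the total HDMI link bandwidths from the TMDS parameters above.
VERSIONS = {
    # name: (raw Gbps per data channel, data channels, payload bits, coded bits)
    "HDMI 1.0-1.2": (1.65, 3, 8, 10),   # 165 MHz x 10 bits per clock
    "HDMI 1.3/1.4": (3.40, 3, 8, 10),   # 340 MHz x 10 bits per clock
    "HDMI 2.0":     (6.00, 3, 8, 10),   # effectively 600 MHz x 10 bits
    "HDMI 2.1":     (12.0, 4, 16, 18),  # clock lane reused as a 4th data channel
}

for name, (per_ch, channels, payload, coded) in VERSIONS.items():
    raw = per_ch * channels                 # nominal wire rate usually quoted
    usable = raw * payload / coded          # strip line-coding overhead
    print(f"{name}: {raw:.2f} Gbps raw, ~{usable:.2f} Gbps of pixel data")

[This reproduces the 4.95, 10.2, 18 and 48 Gbps figures quoted earlier in the thread.]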
participants (4): D. Hugh Redelmeier, lsorense@csclub.uwaterloo.ca, Nicholas Krause, o1bigtenor