
On Mon, Feb 10, 2020 at 12:52:54PM -0500, D. Hugh Redelmeier via talk wrote:
Not exactly.
HDMI 2.0 does not require new cables compared with HDMI 1.x. The connectors and signals are "the same". But the bandwidth is higher, so old cables may well not work. There is an optional certification system for cables.
HDMI 2.0 sources and sinks must support HDMI 1.x. So when you hook up an HDMI link, if either end only supports 1.x, a 1.x signal will be used.
If you want UltraHD @ 60 Hz, you need HDMI 2.0 on both sides and a cable that can handle the bandwidth.
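To put rough numbers on that, here is a quick back-of-the-envelope sketch in
Python. It assumes the standard CTA-861 timing for 3840x2160 @ 60 Hz (4400x2250
total, so a 594 MHz pixel clock at 8 bits per colour) and the TMDS clock limits
I go through below:

  # Does 4K60 fit under each HDMI generation's maximum TMDS clock,
  # and what link rate does that clock give you?
  PIXEL_CLOCK_4K60_MHZ = 4400 * 2250 * 60 / 1e6   # 594.0 MHz

  tmds_clock_limit_mhz = {
      "HDMI 1.0-1.2": 165,
      "HDMI 1.3/1.4": 340,
      "HDMI 2.0":     600,   # effective character rate
  }

  for version, limit_mhz in tmds_clock_limit_mhz.items():
      # 3 data channels, 10 bits per channel per TMDS clock, of which
      # 8 bits are payload (the 8b/10b-style encoding).
      raw_gbps = limit_mhz * 10 * 3 / 1000
      payload_gbps = limit_mhz * 8 * 3 / 1000
      fits = PIXEL_CLOCK_4K60_MHZ <= limit_mhz
      print(f"{version}: {limit_mhz} MHz clock, {raw_gbps:.2f} Gbps raw, "
            f"{payload_gbps:.2f} Gbps payload, 4K60 8bpc fits: {fits}")

That prints roughly 4.95/10.2/18 Gbps raw (3.96/8.16/14.4 Gbps of payload), and
only the 600 MHz case clears the 594 MHz a 4K60 mode needs.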
So HDMI 1.0 to 1.2 used a 165 MHz TMDS clock with 8b/10b encoding (10 bits per TMDS "clock") on 3 channels, for RGB or YUV or whatever format is being sent. HDMI 1.3 and 1.4 increased the maximum clock to 340 MHz, so almost double, with the same signalling.

HDMI 2.0 increases the clock to effectively 600 MHz (well, technically 150 MHz, but they do 4 bytes per clock instead of 1).

HDMI 2.1 makes rather large changes. It doubles the bytes per clock (so from 6 Gbps/channel to 12 Gbps/channel at a 150 MHz clock), and also moves from 8b/10b to 16b/18b encoding to increase efficiency. It also drops the dedicated clock channel, embeds the clock in the data, and uses the old clock lane as a fourth data channel, hence the 48 Gbps total bandwidth.

And yes, certainly, whenever they double the frequency of the signal going over the wire, it has a tendency (but not a requirement) to need better cables.

--
Len Sorensen