
On Tue, Jun 20, 2023 at 11:16:05AM -0400, James Knott via talk wrote:
> Other than a small bit about lip sync, there is nothing about syncing the signal in that.
Well, https://www.youtube.com/watch?v=5acgSK0kWTE has a lot of details on how HDMI works. I don't think it says how audio and video are synced, although given that they are sent interleaved, it probably isn't very complicated.
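As a rough illustration of why interleaved sync "isn't very complicated": MPEG-style streams tag each packet with a presentation timestamp, and the player presents whatever is due on a shared clock. This toy sketch is illustrative only; the packet layout and timings here are made up, not taken from the linked video or any spec.

```python
# Toy sketch of timestamp-based A/V sync (the general approach MPEG-style
# streams use; packet contents here are invented for illustration).
from dataclasses import dataclass

@dataclass
class Packet:
    kind: str   # "audio" or "video"
    pts: float  # presentation timestamp, in seconds

# Interleaved on the wire, but each packet says when it should be presented:
stream = [Packet("video", 0.00), Packet("audio", 0.00),
          Packet("audio", 0.02), Packet("video", 0.04),
          Packet("audio", 0.04)]

def due(stream, clock):
    """Packets whose presentation time has arrived on the shared clock."""
    return [p for p in stream if p.pts <= clock]

print([f"{p.kind}@{p.pts}" for p in due(stream, 0.02)])
# ['video@0.0', 'audio@0.0', 'audio@0.02']
```

The point is that sync comes from the timestamps, not from the interleaving order itself.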
> First off, that RTP article mentions video, not just audio. While the details may differ, the principle remains the same: the framing is embedded in the data. Don't confuse transport with signal.
> In the case of my IPTV, the exact same signal is delivered to my TV as I would receive over the old digital system. And they both use HDMI to reach my TV.
The IPTV goes to some device that decodes it and converts it to uncompressed video frames and audio. HDMI just carries uncompressed video and audio (either LPCM or some other digital audio format) to the TV. Of course, if the TV itself runs apps, the decoded video doesn't have to go over HDMI, unless the TV internally also uses HDMI between the processor and the video handling (which is actually a common way to do it). If you have an ATSC tuner, the received signal is decoded, decompressed, and again sent as uncompressed video to be displayed.

HDMI has no involvement in MPEG or ATSC or IPTV or any other compressed video system. It only cares about RGB (or YUV) video data, audio, and a few control signals. For some reason, HDMI (and the compatible DVI) decided to keep using CVT video signalling complete with blanking intervals, although with reduced blanking (CVT-RB and CVT-RBv2), and the audio and some other extra bits are put into that part of the signal. Maybe this was so analog displays could also work with DVI and HDMI?
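To put a number on how much blanking an HDMI signal still carries (and hence how much room there is for the audio and extra bits), here is a quick check using the standard 1080p60 timing figures from CEA-861 rather than the CVT-RB formula, simply because those round numbers are easy to verify; this is a sketch, not anything from the HDMI spec itself.

```python
# How much of an HDMI 1080p60 signal is blanking interval?
# (CEA-861 1080p60 timing: 2200 x 1125 total at 148.5 MHz; the audio
# "data islands" ride in the blanking portions.)
H_ACTIVE, V_ACTIVE = 1920, 1080
H_TOTAL, V_TOTAL = 2200, 1125
PIXEL_CLOCK_HZ = 148.5e6

total_px = H_TOTAL * V_TOTAL
active_px = H_ACTIVE * V_ACTIVE
blanking_fraction = (total_px - active_px) / total_px
fps = PIXEL_CLOCK_HZ / total_px

print(f"frame rate: {fps:.2f} Hz")                           # 60.00 Hz
print(f"blanking share of signal: {blanking_fraction:.1%}")  # 16.2%
```

So even with "reduced" blanking, roughly a sixth of the pixel clock is spent on non-visible signal, which is plenty of room for packed audio.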
> By comparison, consider the audio in cell phones. Way back in the dark ages of "1G" phones, the signal was analog. Then came 2G, with a few different methods (CODECs) of converting the audio to a digital signal, with a major goal being to reduce the bandwidth, to the point where three or so digital calls would use the same amount of spectrum as one analog call. The difference between 2G and 3G, which used the GSM CODEC, is that with 2G the data was a continuous stream, but with 3G the exact same audio was transmitted in packets, though not over IP. Saving bandwidth was still a goal. Then, with 4G and lots of bandwidth available, the goals shifted from saving bandwidth to providing a better-quality call. IP was now being used to carry the calls. Through all this, the goal remained the same, that is to carry a voice conversation. With the digital systems, the sync was carried along with the call data, even though different CODECs might have been used at different times.
> Again, there is absolutely no need for blanking intervals with digital TV. I suspect Hugh's question arises because he is using an analog monitor, if I read his post correctly. In that case, analog framing, including the blanking interval, has to be created.
> BTW, it is possible to have analog video without blanking intervals. Back in the 70s, I used to maintain some video terminals where the sync was fed directly to the monitor, instead of being combined with the video.
My understanding was that on a CRT it needed a bit of blanking time in order to have time to change the magnetic field so the beam could start at the beginning of the next line. Whether you used composite sync or separate H and V sync didn't matter; it still needed the time.

-- 
Len Sorensen