Re: [GTALUG] Has the graphics-card world gone mad?

| From: Evan Leibovitch via talk <talk@gtalug.org>
| [GTALUG] Has the graphics-card world gone mad?

Yes. For example, I've bought a couple of AMD RX 570-class video cards over the last two or three years, each with 8G of RAM, roughly $200 each. I recently saw a used one being offered for $500! These are obsolete! I asked the vendor if he was serious; he replied "yes, look at other folks' prices." Insanity.

The common wisdom (until the chip shortage became a meme) was that a lot of people in the US took their COVID stimulus cheques and spent them on upgrading their rigs. I have no idea if this is a too-good-to-fact-check story or if it is reality. There did seem to be a disruption in the shipping between Asia and North America, but video cards have such a high price-to-mass ratio that that should not be a problem.

Mining has gone crazy again. It's so crazy that apparently even less-than-optimal cards are being grabbed. But I don't actually know if this is just a myth.

Perceived shortage brings on hoarding. That's going on for sure.

Everyone building video cards wants to use TSMC's 7nm process. There's a bottleneck for sure. It should be possible to build mid-range GPUs with 10nm, 12nm or 14nm, I would think.

| I made the mistake of deciding that I might want to upgrade my video card. I have a
| Radeon RX 550 that is struggling to drive two 4K screens. It works, but
| plenty of flickering under both Linux and Windows.

Do you really mean flickering? Or do you mean laggy image updates? I would have thought that the card could do rock-solid 60Hz refresh but might have trouble updating the contents of the frame buffer in a timely fashion. My desktop use of a monitor is mostly static stuff, so frame buffer updates need not be very quick. My use of a TV set to view streaming content would get very annoying if the frame buffer updates were laggy. I use an AndroidTV box for streaming TV.

Processor iGPUs are getting better. Perhaps a new processor's iGPU would be good enough and might not be bid up as much. Intel Xe seems really impressive (in 11th gen Core chips). AMD APUs might be good enough. It all depends on what you are really trying to do.

You have the floor tomorrow. Perhaps a discussion on this would be interesting.

| These don't strike me as cards powerful enough to do coin mining. Why are
| they so rare and expensive? Is it the global chip shortage or something
| else? And does anyone have an idea how long we'll have to wait this out ...
| or how I can tune my card to support the screens?

"Markets can remain irrational longer than you can remain solvent." -- John Maynard Keynes

Nobody knows. I've often found panics have been used to set people's expectations of pricing. I.e. prices will stick higher. Perhaps not at peak prices, but higher than previous prices.

| From: D. Hugh Redelmeier via talk <talk@gtalug.org>
| You have the floor tomorrow. Perhaps a discussion on this would be
| interesting.

Duh. The next meeting is two weeks away. Evan will be the speaker.

Yeah, but it won't be about video cards (thankfully).

(FWIW, the issues I am having driving two 4K monitors with an RX 550 exist in both Windows and Linux, suggesting that it's not capable of driving both monitors at full spec. Either I need to dial down the frequency as Russell suggests, or I need a new horsepower card...)

- Evan

On Mon, 29 Mar 2021 at 18:15, D. Hugh Redelmeier via talk <talk@gtalug.org> wrote:
| From: D. Hugh Redelmeier via talk <talk@gtalug.org>
| You have the floor tomorrow. Perhaps a discussion on this would be
| interesting.
Duh. The next meeting is two weeks away. Evan will be the speaker.

On Tue, Mar 30, 2021 at 12:33 AM Evan Leibovitch via talk <talk@gtalug.org> wrote:
Yeah, but it won't be about video cards (thankfully).
(FWIW, the issues I am having driving two 4K monitors with an RX 550 exist in both Windows and Linux, suggesting that it's not capable of driving both monitors at full spec. Either I need to dial down the frequency as Russell suggests, or I need a new horsepower card...)
Hmmmmmmmmmmm - - - the card isn't current so finding specs isn't straightforward - - - but - - - - https://www.cnet.com/products/xfx-radeon-rx-550-graphics-card-radeon-rx-550-... gives a possible reason for the problem - - - - (one sales agency lists a 'Resolution and refresh rate', and for an RX 6900 XT said space is 7680x4320, or an 8K space) - - - the GPU just isn't rated for a 7680x2160 space.

Here I have a 7680x3000 space - - - but - - - - I'm running 2 GPUs - - - and that has its own 'problems'!

If I had to rebuild my system (not retaining any monitors) I would likely go to 3 4K monitors and maybe 3 moderate GPUs, or if finances were available - - - seldom are - - - 3 mid-upper end - - - - just checked, and even an nvidia 3090 (a brand I will no longer consider) has 5 ports yet only 4 monitors are supported, with a space of 7680x4320@60Hz - - - so a multi-GPU system is what I would need - - - and it works.

Regards

On Tue, Mar 30, 2021 at 06:49:57AM -0500, o1bigtenor via talk wrote:
Hmmmmmmmmmmm - - - the card isn't current so finding specs isn't straightforward - - - but - - - - https://www.cnet.com/products/xfx-radeon-rx-550-graphics-card-radeon-rx-550-... gives a possible reason for the problem - - - - (one sales agency lists a 'Resolution and refresh rate' and for an RX6900XT said space is 7680x4320 or an 8K space) the GPU just isn't rated for a 7680x2160 space.
Here I have a 7680x3000 space - - - but - - - - I'm running 2 gpus - - - that has its own 'problems'!
If I had to rebuild my system (not retaining any monitors) I would likely go to 3 - 4k monitors and maybe 3 moderate gpus or if finances were available - - - seldom are - - - 3 mid-upper end - - - - just checked and even an nvidia 3090 (brand I will no longer consider) where there are 5 ports yet only 4 monitors are supported and a space of 7680x4320@60Hz - - - so a multi-gpu system is what I would need - - - and it works.
Why would you need 7680x4320@60Hz? That's 8K. Three 4K monitors is less than that. Or is it that you want to run three 3K screens side by side? I found a claim that the 980Ti at least was listed as supporting 11520x2160 to run three 4K screens, so it would seem at least nvidia cards that are current can do it. I think you had to go to the 10 series to get HDMI 2.0b though; the 900 series probably required using DisplayPort to do it. -- Len Sorensen
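As a quick sanity check on the raw pixel counts (shell arithmetic only; this deliberately ignores refresh rate and blanking):

    echo "8K:     $(( 7680 * 4320 )) pixels"      # 33177600
    echo "3 x 4K: $(( 3 * 3840 * 2160 )) pixels"  # 24883200
    echo "2 x 4K: $(( 2 * 3840 * 2160 )) pixels"  # 16588800

So three 4K panels add up to 75% of a single 8K surface, and Evan's two panels to exactly half of one.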

OK, I'm now a little more confused than I was when I started.

I have two 4K screens (Samsung U28E590D) that I want to use in day to day work. Maybe some streaming, but the most intensive game I would play is Cities: Skylines. Any suggestions on what is the minimum GPU that will reasonably drive them? I've backordered a GTX 1660 at $315. Would a 1650 Super be enough?

- Evan

On Tue, 30 Mar 2021 at 10:46, Lennart Sorensen via talk <talk@gtalug.org> wrote:
On Tue, Mar 30, 2021 at 06:49:57AM -0500, o1bigtenor via talk wrote:
Hmmmmmmmmmmm - - - the card isn't current so finding specs isn't straightforward - - - but - - - -
https://www.cnet.com/products/xfx-radeon-rx-550-graphics-card-radeon-rx-550-...
gives a possible reason for the problem - - - - (one sales agency lists a 'Resolution and refresh rate' and for an RX6900XT said space is 7680x4320 or an 8K space) the GPU just isn't rated for a 7680x2160 space.
Here I have a 7680x3000 space - - - but - - - - I'm running 2 gpus - - - that has its own 'problems'!
If I had to rebuild my system (not retaining any monitors) I would likely go to 3 - 4k monitors and maybe 3 moderate gpus or if finances were available - - - seldom are - - - 3 mid-upper end - - - - just checked and even an nvidia 3090 (brand I will no longer consider) where there are 5 ports yet only 4 monitors are supported and a space of 7680x4320@60Hz - - - so a multi-gpu system is what I would need - - - and it works.
Why would you need 7680x4320@60Hz? That's 8K. 3 4K monitors is less than that. Or is it that you want to run three 3K screens side by side? I found a claim that the 980Ti at least was listed as supporting 11520x2160 to run three 4K screens so it would seem at least nvidia cards that are current can do it. I think you had to go to the 10 series to get HDMI 2.0b though, the 900 series probably required using displayport to do it.
-- Len Sorensen

Evan Leibovitch via talk wrote:
I have two 4K screens (Samsung U28E590D) that I want to use in day to day work. Maybe some streaming, but the most intensive game I would play is Cities: Skylines. Any suggestions on what is the minimum GPU that will reasonably drive them? I've backordered a GTX 1660 at $315. Would a 1650 Super be enough?
If all else fails, a Raspberry Pi 4 will drive a pair of high-end monitors, probably not with the grunt that any modern game would want, but enough to be the new X Terminal. -- Anthony de Boer

On Tue, 30 Mar 2021 at 13:08, Anthony de Boer via talk <talk@gtalug.org> wrote:
Evan Leibovitch via talk wrote:
I have two 4K screens (Samsung U28E590D) that I want to use in day to day work. Maybe some streaming, but the most intensive game I would play is Cities: Skylines. Any suggestions on what is the minimum GPU that will reasonably drive them? I've backordered a GTX 1660 at $315. Would a 1650 Super be enough?
If all else fails, a Raspberry Pi 4 will drive a pair of high-end monitors, probably not with the grunt that any modern game would want, but enough to be the new X Terminal.
That just shifts the problem. He has a good computer with a not-entirely-working video card. You're suggesting something that would put him in a situation where he'd have a passable video card with a not-entirely-working computer attached to it. Don't get me wrong: I use Raspberry Pis for a lot of things. But I don't use them as my daily driver, because they don't have the horsepower. Shifting to ARM would also break most games. -- Giles https://www.gilesorr.com/ gilesorr@gmail.com

Giles Orr wrote:
On Tue, 30 Mar 2021 at 13:08, Anthony de Boer via talk <talk@gtalug.org> wrote:
... If all else fails, a Raspberry Pi 4 will drive a pair of high-end monitors, probably not with the grunt that any modern game would want, but enough to be the new X Terminal.
That just shifts the problem. He has a good computer with a not-entirely-working video card. You're suggesting something that would put him in a situation where he'd have a passable video card with a not-entirely-working computer attached to it.
At no point did I suggest this would outperform anything, other than a lack of a usable video card on the good computer vs the Pi being able to light lots'n'lots of pixels. Back in the day it used to be customary to run X applications on the big grunty server in the machine room, talking over the network to a relatively underpowered desktop X Terminal that knew little more than how to paint stuff on the screen, and that's still a possible fallback today, with the big PC using the RPi as a terminal. Awesome PC video card > RPi4 > crappy or no video at all.
Don't get me wrong: I use Raspberry Pis for a lot of things. But I don't use them as my daily driver, because they don't have the horsepower. Shifting to ARM would also break most games.
They're more powerful than most of the machines us oldtimers have used, not an entirely bad modest desktop experience in their own right, and a possible silent-computer frontend to noisy spinning rust in another room. Granted, the ancient X protocol didn't include sound and this is probably going to be a much more viable setup for a non-gamer. But back then we had 10 Mbit thinnet shared with everyone else on the floor, while nowadays we have to make do with GigE, so the experience might actually be better than remembered. -- Anthony de Boer

| From: Anthony de Boer via talk <talk@gtalug.org>
| Back in the day it used to be customary to run X applications on the big
| grunty server in the machine room, talking over the network to a
| relatively underpowered desktop X Terminal that knew little more than how
| to paint stuff on the screen, and that's still a possible fallback today,
| with the big PC using the RPi as a terminal.

It seems that network transparent graphics is no longer a thing.

Scott Sullivan explained this at our last meeting. Here's what I absorbed (Scott may consider this a distortion):

- performance is crap because the X protocol doesn't express things in a way that engages the capabilities of modern GPUs
- security was poor. (Surely that could have been fixed.)
- network bandwidth just doesn't match CPU to GPU bandwidth
- latencies annoy folks. TCP/IP networking makes no latency guarantees
- the demand for network transparent desktops is very low among the folks that actually develop the software

On the other hand, I like network transparent graphics. I used it a lot. I laughed at Windows for not having it. I laughed at the hackiness of VNC as a solution. Well, the last laugh is on me.

Evan said that the idea that the thing on your desk was a server was confusing and stupid and that it was good that it is gone. I don't agree. After all, the internet is a network of peers (except for those behind NAT). The thing on your desk can run a server (process or service) -- mine runs a lot.
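For anyone who hasn't tried it lately, the plain old network-transparent path still works over SSH X11 forwarding; a minimal sketch (the user and host names are placeholders, and the remote sshd needs X11Forwarding enabled):

    # from the machine at your desk (the one running the X server):
    ssh -X hugh@bigserver xterm
    # -Y asks for "trusted" forwarding if an application trips over the -X restrictions

For anything GPU-heavy it behaves about as badly as the list above predicts, but for xterms and the odd dialog box it tends to be perfectly usable.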

On Tue, Mar 30, 2021 at 06:49:04PM -0400, D. Hugh Redelmeier via talk wrote:
It seems that network transparent graphics is no longer a thing.
Scott Sullivan has explained this in our last meeting. Here's what I absorbed (Scott may consider this a distortion).
- performance is crap because the X protocol doesn't express things in a way that engages the capabilities of modern GPUs
- security was poor. (Surely that could have been fixed.)
- network bandwidth just doesn't match CPU to GPU bandwidth
- latencies annoy folks. TCP/IP Networking makes no latency guarantees
- the demand for network transparent desktops is very low among the folks that actually develop the software
On the other hand, I like network transparent graphics. I used it a lot. I laughed at Windows for not having it. I laughed at the hackiness of VNC as a solution. Well, the last laugh is on me.
Evan said that the idea that the thing on your desk was a server was confusing and stupid and that it was good that it is gone. I don't agree. After all, the internet is a network of peers (except for those behind NAT). The thing on your desk can run a server (process or service) -- mine runs a lot.
Wayland has certainly declared that remote access is not within scope of the project and is something to be solved elsewhere (with vnc or rdp or similar). Perhaps that also solves the security problem of X11. -- Len Sorensen
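For completeness: one example of the "solved elsewhere" approach on the Wayland side is waypipe, which proxies the Wayland protocol over an SSH connection. A sketch, assuming it is packaged for your distro and installed on both ends, and with the host and client program as placeholders:

    waypipe ssh user@remotehost weston-terminal   # the terminal is just an example client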

I'll side with Giles here. I have an RPi 4 and it's useful for many things, but a replacement for my desktop it's not. This is not just a gaming issue; I often need dozens of browser tabs open at once, and the system is used for video, audio and graphics editing as well as communications (not just Zoom but also MS Teams, Signal, Telegram, WhatsApp and the occasional Webex and Skype). Even 16GB RAM isn't enough without swapping.

In any case, my desktop with the underpowered video isn't non-functional, it just flickers enough to be annoying; I have the luxury of being able to back-order a 1660S and wait until video card sanity returns. I could probably throttle down my video to match the Pi's spec, but long term that's not why I bought these monitors.

- Evan

On Tue, 30 Mar 2021 at 15:40, Anthony de Boer via talk <talk@gtalug.org> wrote:
Giles Orr wrote:
On Tue, 30 Mar 2021 at 13:08, Anthony de Boer via talk <talk@gtalug.org> wrote:
... If all else fails, a Raspberry Pi 4 will drive a pair of high-end monitors, probably not with the grunt that any modern game would want, but enough to be the new X Terminal.
That just shifts the problem. He has a good computer with a not-entirely-working video card. You're suggesting something that would put him in a situation where he'd have a passable video card with a not-entirely-working computer attached to it.
At no point did I suggest this would outperform anything, other than a lack of a usable video card on the good computer vs the Pi being able to light lots'n'lots of pixels.
Back in the day it used to be customary to run X applications on the big grunty server in the machine room, talking over the network to a relatively underpowered desktop X Terminal that knew little more than how to paint stuff on the screen, and that's still a possible fallback today, with the big PC using the RPi as a terminal.
Awesome PC video card > RPi4 > crappy or no video at all.
Don't get me wrong: I use Raspberry Pis for a lot of things. But I don't use them as my daily driver, because they don't have the horsepower. Shifting to ARM would also break most games.
They're more powerful than most of the machines us oldtimers have used, not an entirely bad modest desktop experience in their own right, and a possible silent-computer frontend to noisy spinning rust in another room.
Granted, the ancient X protocol didn't include sound and this is probably going to be a much more viable setup for a non-gamer. But back then we had 10 Mbit thinnet shared with everyone else on the floor, while nowadays we have to make do with GigE, so the experience might actually be better than remembered.
-- Anthony de Boer

On Tue, Mar 30, 2021 at 01:08:53PM -0400, Anthony de Boer via talk wrote:
If all else fails, a Raspberry Pi 4 will drive a pair of high-end monitors, probably not with the grunt that any modern game would want, but enough to be the new X Terminal.
It can drive one 4K screen at 60Hz. It can drive two 4K screens at 30Hz, which I sure wouldn't want to use. It rather ruins video playback in many cases. -- Len Sorensen
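One caveat if anyone tries it: on the Pi 4 even the single-screen 4Kp60 mode has to be switched on explicitly. A sketch of the relevant /boot/config.txt line, from memory, so double-check against the current Raspberry Pi documentation:

    # /boot/config.txt on a Raspberry Pi 4
    hdmi_enable_4kp60=1   # allow a 4K@60 mode to be selected (single display); runs the SoC hotter

With two 4K screens attached it stays at 30Hz regardless, as noted above.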

| From: Lennart Sorensen via talk <talk@gtalug.org>
| On Tue, Mar 30, 2021 at 01:08:53PM -0400, Anthony de Boer via talk wrote:
| > If all else fails, a Raspberry Pi 4 will drive a pair of high-end
| > monitors, probably not with the grunt that any modern game would want,
| > but enough to be the new X Terminal.
|
| It can drive one 4K screen at 60Hz. It can drive two 4K screens at 30Hz,
| which I sure wouldn't want to use. It rather ruins video playback in
| many cases.

TL;DR: at least sometimes 30Hz is fine.

Everything depends on your individual eyes, what you use the monitor for, and the setting of the monitor use.

For my desktop use, 30Hz is actually fine. It's what I've used for over six years. I have, within arm's reach, all I need to switch to 60Hz but it hasn't seemed worth the reconfiguration effort.

Details of my use:

- most of what I do is fairly static. I don't seriously watch videos on my desktop (YouTube seems fine). I don't play games.
- my monitor is 39" and perhaps 24"-30" from my face. UltraHD.
- I wear special fixed-focus glasses when using the monitor (as opposed to my regular progressive glasses).

Gamers want refresh rates well above 60Hz. To get that, they seem to be willing to choose lower resolutions. That would be a terrible trade-off for my use.

Films are traditionally 25 frames/second (each frame is flashed twice by traditional projectors).

On Tue, Mar 30, 2021 at 06:13:28PM -0400, D. Hugh Redelmeier via talk wrote:
TL;DR: at least sometimes 30Hz is fine.
Everything depends on your individual eyes, what you use the monitor for, and the setting of the monitor use.
For my desktop use, 30Hz is actually fine. It's what I've used for over six years. I have, within arm's reach, all I need to switch to 60Hz but it hasn't seemed worth the reconfiguration effort.
I find a lot of YouTube content is 60 fps, as is much of the content I have on mythtv, so due to my video card not having HDMI 2.0 I run at 1080p@60Hz rather than 2160p@30Hz. Maybe someday I will update the video card to fix that, although none of the mythtv content needs 4K.

X annoyingly thinks that when it detects the TV, it should run at 4K@30Hz rather than the explicitly configured 1080p@60Hz. This happens every time I change inputs on the TV.
Details of my use:
- most of what I do is fairly static. I don't seriously watch videos on my desktop (YouTube seems fine). I don't play games.
- my monitor is 39" and perhaps 24"-30" from my face. UltraHD.
- I wear special fixed-focus glasses when using the monitor (as opposed to my regular progressive glasses).
Gamers want refresh rates well above 60Hz. To get that, they seem to be willing to choose lower resolutions. That would be a terrible trade-off for my use.
Films are traditionally 25 frames/second (each frame is flashed twice by traditional projectors).
Well, 24, unless you are in Europe where they run the movies 4% fast when shown on TV. Shown 0.1% slow on North American TVs. -- Len Sorensen
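The 4% and 0.1% figures fall straight out of the frame rates; a quick check with bc:

    echo "scale=5; 25/24" | bc             # 1.04166 -> film sped up about 4% for 25 fps PAL/SECAM TV
    echo "scale=5; (24000/1001)/24" | bc   # .99900  -> about 0.1% slow at NTSC's 23.976 fps (2:3 pulldown)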

On Tue, Mar 30, 2021 at 7:21 PM Lennart Sorensen via talk <talk@gtalug.org> wrote:
On Tue, Mar 30, 2021 at 06:13:28PM -0400, D. Hugh Redelmeier via talk wrote:
TL;DR: at least sometimes 30Hz is fine.
Everything depends on your individual eyes, what you use the monitor for, and the setting of the monitor use.
For my desktop use, 30Hz is actually fine. It's what I've used for over six years. I have, within arm's reach, all I need to switch to 60Hz but it hasn't seemed worth the reconfiguration effort.
I find a lot of YouTube content is 60 fps, as is much of the content I have on mythtv, so due to my video card not having HDMI 2.0 I run at 1080p@60Hz rather than 2160p@30Hz. Maybe someday I will update the video card to fix that, although none of the mythtv content needs 4K.
X annoyingly thinks that when it detects the TV, it should run at 4K@30Hz rather than the explicitly configured 1080p@60Hz. This happens every time I change inputs on the TV.
Details of my use:
- most of what I do is fairly static. I don't seriously watch videos on my desktop (YouTube seems fine). I don't play games.
- my monitor is 39" and perhaps 24"-30" from my face. UltraHD.
- I wear special fixed-focus glasses when using the monitor (as opposed to my regular progressive glasses).
Gamers want refresh rates well above 60Hz. To get that, they seem to be willing to choose lower resolutions. That would be a terrible trade-off for my use.
Films are traditionally 25 frames/second (each frame is flashed twice by traditional projectors).
Well, 24, unless you are in Europe where they run the movies 4% fast when shown on TV. Shown 0.1% slow on North American TVs.
Unless you count the subliminal frame. Then it's 25 fps. https://www.ijcr.eu/articole/330_07%20Maria%20FLOREA.pdf Remember to buy popcorn and enjoy a cool refreshing coca cola.
-- Len Sorensen
-- Russell

| From: Lennart Sorensen via talk <talk@gtalug.org>
| On Tue, Mar 30, 2021 at 06:13:28PM -0400, D. Hugh Redelmeier via talk wrote:
| > TL;DR: at least sometimes 30Hz is fine.
|
| I find a lot of youtube content is 60 fps, as is much of the content I
| have on mythtv so due to my video card not having HDMI 2.0, I run at
| 1080p@60Hz rather than 2160p@30Hz. Maybe someday I will update the
| video card to fix that, although none of the mythtv content needs 4K.

On my desktop, I never run videos full-screen. A 39" diagonal screen at 24" is just too overwhelming. I don't care that the YouTube content is stored at 60Hz. Something adjusts it, I think -- I just don't do this much.

| X annoyingly thinks that when it detects the TV, it should run at 4K@30Hz
| rather than the explicitly configured 1080p@60Hz. This happens every
| time I change inputs on the TV.

In the old days, there was a key sequence that got X to cycle through the mode settings. I've not really needed it for years since I always want LCDs run at the native resolution when possible. But your use case makes sense. Does that key sequence still exist? (By default, "they" removed the key sequences for shutting down X and for shutting down the machine.)

I'm pretty sure you still use X because I know you favour Nvidia GPUs and that almost demands the proprietary driver. (My desktop is in the same boat because of a bargain I scored over eight years ago.)

| > Films are traditionally 25 frames/second (each frame is flashed
| > twice by traditional projectors).
|
| Well 24 unless you are in europe where they run the movies 4% fast when
| shown on TV. Shown 0.1% slow on north american TVs.

Duh. Silly mistake on my part.

On Tue, Mar 30, 2021 at 09:31:26PM -0400, D. Hugh Redelmeier via talk wrote:
On my desktop, I never run videos full-screen. A 39" diagonal screen at 24" is just too overwhelming. I don't care that the YouTube content is stored at 60Hz. Something adjusts it. I think -- I just don't do this much.
In the old days, there was an key sequence that got X to cycle through the mode settings. I've not really needed it for years since I always want LCDs run at the native resolution when possible. But your use case makes sense. Does that key sequence still exist? (By default, "they" removed the key sequences for shutting down X and for shutting down the machine.)
No idea. I just figure it ought to do what I told it and not try to outsmart me. I tried to set up autorandr to fix it automatically, but I can't remember if I got it working or not.
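For the record, the autorandr recipe is roughly this; a minimal sketch, with the output name a placeholder since it varies by driver:

    # put the TV into the mode you actually want, then save the whole layout as a named profile
    xrandr --output HDMI-0 --mode 1920x1080 --rate 60
    autorandr --save tv-1080p60
    # later (by hand, or from a hotplug/udev hook), re-apply whichever saved profile
    # matches the displays that are currently connected
    autorandr --change

Whether it fires automatically when the TV switches inputs depends on whether the TV actually drops and re-raises hotplug, which seems to vary by set.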
I'm pretty sure you still use X because I know you favour Nvidia GPUs and that almost demands the proprietary driver. (My desktop is in the same boat because of a bargain I scored over eight years ago.)
Yeah I don't even know if mythtv runs on anything other than X11 yet. And the machine has an nvidia card in it (used to be a GTX 275 in that machine until it decided to fuse itself to the PCIe slot of the motherboard, now it is a GTX 660 Ti that I had meant to build a game machine with years ago and never got around to it). -- Len Sorensen

Lennart Sorensen wrote:
On Tue, Mar 30, 2021 at 01:08:53PM -0400, Anthony de Boer via talk wrote:
If all else fails, a Raspberry Pi 4 will drive a pair of high-end monitors, probably not with the grunt that any modern game would want, but enough to be the new X Terminal.
It can drive one 4K screen at 60Hz. It can drive two 4K screens at 30Hz, which I sure wouldn't want to use. It rather ruins video playback in many cases.
Some of us are more into a few lovely huge xterms for hacking on code, maybe SSH sessions somewhere or a browser or MUA and such, a bit of Bach happening on the audio output, and not so much into games and video and the overwhelming-bandwidth experience. (At one point my son was looking at the price of some decent hardware for running flight sim, and found he could have quite a few hours wet rental of a Cessna 152 for that money; the latter won out. I'm of the same opinion; get up from the keyboard and experience real life rather than trying to simulate it on the computer.) -- Anthony de Boer

"Anthony de Boer via talk" <talk@gtalug.org> wrote:
(At one point my son was looking at the price of some decent hardware for running flight sim, and found he could have quite a few hours wet rental of a Cessna 152 for that money; the latter won out. I'm of the same opinion; get up from the keyboard and experience real life rather than trying to simulate it on the computer.)
I deeply miss the ancient Flight Simulator with the polygon graphics. Third-party add-ons made it infinitely customizable, just about any way one wanted. When Microsoft took it over and made it photorealistic, all that got taken out of our hands. BUT, all I have to do is develop some decent OpenGL chops, and I could have it all back.

On Tue, Mar 30, 2021 at 12:22:14PM -0400, Evan Leibovitch wrote:
OK, I'm now a little more confused than I was when I started.
I have two 4K screens (Samsung U28E590D) that I want to use in day to day work. Maybe some streaming, but the most intensive game I would play is Cities: Skylines. Any suggestions on what is the minimum GPU that will reasonably drive them? I've backordered a GTX 1660 at $315. Would a 1650 Super be enough?
Well I wouldn't think Cities: Skylines takes that much GPU power, although I don't know how much detail it supports. Of course if you spread it across 3 screens at 4K that starts to be a lot of pixels to render, even without a lot of detail. I found one page claiming a 1060 could run Cities: Skylines at Ultra settings on one 4K screen at about 53fps. A 1660 should be a bit faster than that. Not sure by how much. Of course if you were to run the game at 4K on 3 screens, that would probably hurt the frame rate a lot. Of course one could always drop to 3x1080p while running the game and get a good framerate, while running the desktop and such at 4K. Or you could play on 1 screen for the game. No idea how that game is at multi-monitor support. -- Len Sorensen

On Tuesday, 30 March 2021, o1bigtenor <o1bigtenor@gmail.com> wrote:
On Tue, Mar 30, 2021 at 9:21 AM Russell Reiter <rreiter91@gmail.com> wrote:
On Tue, Mar 30, 2021, 9:50 AM o1bigtenor via talk, <talk@gtalug.org> wrote:
On Tue, Mar 30, 2021 at 8:00 AM Russell Reiter via talk <talk@gtalug.org> wrote:
On Tue, Mar 30, 2021, 1:33 AM Evan Leibovitch via talk, <talk@gtalug.org> wrote:
Having spent quite a few hours working on things in this area I have found a few things.
IMO I think the issue is actually due to the display EDID provided by many monitor / TV manufacturers being lacking in certain format/reporting respects. While Linux autodetection generally works well in most use cases, this is the type of problem Linux users have historically faced. I think this is probably due, not in any small part, to certain anti-competitive practices.
Linux auto-detection is based on manufacturers adhering to standards.
EDID has become a joke - - - - the kernel docs talk about this - - - so it's not just my opinion.
I wouldn't necessarily call EDID a joke. I mean, what exactly is the alternative? With it you are at least able to parse the binary and output a more human-understandable XML structure, which you may then use to try and craft a customised solution for yourself.
Hmm - - - I don't think you're understanding - - - - you are assuming that the data parsed is relevant and accurate. When it is not accurate and/or incomplete - - - - what is it? IMO that's a sick joke.
What I am assuming is that data is provided and comes from a source. It's not a set of instructions, it is purely information. What makes it relevant is the place it comes from. What makes it accurate or inaccurate is that either the information accurately represents a fact a manufacturer wants to make public or it does not.
DeviceID should work better but really doesn't have any kind of real Linux connection. At least nvidia - - - - well - - - - they're not too worried about adhering to any standard either - - - why should they - - - they KNOW they own the market (at least only some 80+%). My LG 4K monitor is actually made by Goldstar. The extreme level of profits desired in the industry means that cheap manufacture wins. Also means that details that aren't considered crucial - - - well - - - they're just ignored!
Take for example the issues with devices using CCD. The lack of Linux-friendly colour profiles for CCD is one of the largest barriers to Linux users in their choices of scanners and cameras etc.
I think in this case a closer look at the EDID for each monitor, assuming they are not exact duplicates of each other, may provide a workable solution. In fact it may be fixed already in a kernel/firmware upgrade.
If that is not possible/desirable then xrandr, get-edid and parse-edid will provide a better understanding of the autogeneration of the display modelines used by Xorg or whatever server is used.
What is so very fascinating is that both get-edid and parse-edid really aren't that useful.
I have found them very useful in the past in conjunction with xrandr but personally haven't had to use them for quite a number of years. As others have said, Linux just works.
You bet - - - - it just works until it doesn't work. What does one do then? https://www.kernel.org/doc/html/v5.9/admin-guide/edid.html Except I just couldn't find anything besides this ' "make" in tools/edid/ ' which didn't do anything here. You bet I didn't know what I was doing - - - - but I also couldn't find anyone talking about what to do either!
Maybe because they already know what to do with information as postulated and assume their readership does as well. It's not all that farfetched to assume that people using Linux do read the fine manual.
It's not that the tools don't work but the tools rely on information provided and that information is all too often not correct. Changing the incorrect information - - - - well - - - I couldn't find a way to do that.
In some cases it is as easy as calling xrandr to author a modeline with the exact specifications it detects after probing, as opposed to relying solely on information provided by the manufacturers, who may not be entirely concerned with the fact that some users can't just buy the dashboard app they use in their day to day business.
Hmm - - - - how does one find a modeline?
Type xrandr modeline into google and see what pops up.
That oh so wonderful EDID indicated that the monitor was made in 2017 and the company released the product in 2020. Sorta looks like inaccurate information, and that wasn't the only thing inaccurate, so it was time to find something else.
Products are rebranded all the time. It's not really necessary to change every bit of esoterica in informatics.
Enough of the information sites are themselves outdated (lists 4 and 5 years old really aren't helpful when working with products released in the last 18 months).
In fact several solutions I have read in the past point to the fact that you can counterfeit a manufacturer's EDID to overcome video tearing and flicker on both Linux and Windows.
This is possible but to do so means hacking at the kernel level.
Not necessarily, I don't hack kernels, I operate in userland. I've solved issues with these tools for this sort of thing in the past and fully expect to do so in the future.
Best wishes and the best of luck!
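Since the modeline question came up above, the usual hand-rolled recipe looks roughly like this. A sketch only: the timing numbers are whatever cvt prints for your target mode (the ones below are its output here for 1920x1080@60), and "HDMI-1" stands in for whatever name xrandr shows for your output:

    cvt 1920 1080 60
    # prints: Modeline "1920x1080_60.00"  173.00  1920 2048 2248 2576  1080 1083 1088 1120 -hsync +vsync
    xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
    xrandr --addmode HDMI-1 "1920x1080_60.00"
    xrandr --output HDMI-1 --mode "1920x1080_60.00"

None of that survives a restart of the X server, so whatever ends up working usually gets moved into an xorg.conf snippet or a login script.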
-- Russell

Mr Russell
What I am assuming
When one works from assumptions - - - - well, it's very easy to be quite out to lunch.
Maybe because they already know what to do with information as postulated and assume their readership does as well. It's not all that farfetched to assume that people using linux do read the fine manual.
Had you actually read the page that I linked - - - I would posit not - - - you might have responded differently if you had.
Products are rebranded all the time. It's not really necessary to change every bit of esoterica in informatics.
Hmmmmm - - - - every bit of esoterica - - - - the EDID information is at best 30% correct - - - - the rest is esoterica by your definition (that incorrect information included mode lines and physical information at a minimum). Regards

On Tuesday, 30 March 2021, o1bigtenor <o1bigtenor@gmail.com> wrote:
Mr Russell
What I am assuming
<Snipped the facts>
When one works from assumptions - - - - well its very easy to be quite out to lunch.
Now you decide to edit my posts. Remember you were the one who asked how to find a modeline. I told you how to get more info by using google and keywords.
Maybe because they already know what to do with information as
postulated and assume their readership does as well. It's not all that farfetched to assume that people using linux do read the fine manual.
Had you actually read the page that I linked. I would posit not - - - you might have responded differently if you had.
What makes you think I haven't seen that page before? I've been setting up linux for more than 20 years. The EDID parser falls back to VLB for historical reasons.
Products are rebranded all the time. It's not really necessary to change every bit of esoterica in informatics.
Hmmmmm - - - - every bit of esoterica - - - - the edid information is at best 30% correct - - - - the rest is esoterica by your definition
Who told you the information provided by the manufacturer is only 30 per cent correct? Perhaps incomplete and/or poorly formed, but never incorrect. It is their own information after all. How could you make that determination now, at this time, after stating in your previous post that you don't know what you are doing?
Oh yeah, you can seem to be reasonable by eliminating my actual words and then making vague references to facts not in evidence, but what end does that serve? There is no argument that EDID information is not perfectly crafted in order to provide absolute certainty, but it is information which the authors deem valid at the time of release.
(that incorrect information
included mode lines and physical information at a minimum).
This last sentence makes no sense whatsoever because unless you can show the other 70% of information you don't cite is actually invalid it's just so much hot air.
Regards
-- Russell

On 30/03/2021 01:32, Evan Leibovitch via talk wrote:
Yeah, but it won't be about video cards (thankfully).
(FWIW, the issues I am having driving two 4K monitors with an RX 550 exist in both Windows and Linux, suggesting that it's not capable of driving both monitors at full spec. Either I need to dial down the frequency as Russell suggests, or I need a new horsepower card...)
Limit your search to cards with 4GB memory and that will exclude the overpriced mining cards.

Am running a new nvidia 1650 series card here with no issues, and doing h264 encoding of the output at the same time. Under $300 if you look around. e.g. https://www.pc-canada.com/item/PH%2DGTX1650%2DO4GD6.html - those are nice because they are powered solely by the PCIe slot, so no extra GPU cable needed.

Cheers,
Jamon

On Tue, 30 Mar 2021 at 08:55, Jamon Camisso via talk <talk@gtalug.org> wrote:
Am running a new nvidia 1650 series card here with no issues, and doing h264 encoding of the output at the same time. Under $300 if you look around.
I've been looking everywhere for 1650 cards (my first choice is actually a 1660 Super <https://www.canadacomputers.com/product_info.php?cPath=43_557_559&item_id=156451> for a few dollars more). If anyone knows of a source with stock please tell.

e.g. https://www.pc-canada.com/item/PH%2DGTX1650%2DO4GD6.html

Out of stock, at PC Canada and everywhere. What I've found is that such cards are either out of stock listed at normal prices, or available at triple the price <https://www.newegg.ca/msi-geforce-gtx-1660-super-gtx-1660-super-ventus-xs-oc/p/N82E16814137475?Description=1660%20Super&cm_re=1660_Super-_-14-137-475-_-Product> ... which is what prompted me to write my original message.

- Evan

On Tue, Mar 30, 2021, 1:33 AM Evan Leibovitch via talk, <talk@gtalug.org> wrote:
Yeah, but it won't be about video cards (thankfully).
(FWIW, the issues I am having driving two 4K monitors with an RX 550 exist in both Windows and Linux, suggesting that it's not capable of driving both monitors at full spec. Either I need to dial down the frequency as Russell suggests, or I need a new horsepower card...)
- Evan
IMO I think the issue is actually due to the display EDID provided by many monitor / TV manufacturers being lacking in certain format/reporting respects. While Linux autodetection generally works well in most use cases, this is the type of problem Linux users have historically faced. I think this is probably due, not in any small part, to certain anti-competitive practices. Take for example the issues with devices using CCD. The lack of Linux-friendly colour profiles for CCD is one of the largest barriers to Linux users in their choices of scanners and cameras etc.

I think in this case a closer look at the EDID for each monitor, assuming they are not exact duplicates of each other, may provide a workable solution. In fact it may be fixed already in a kernel/firmware upgrade.

If that is not possible/desirable then xrandr, get-edid and parse-edid will provide a better understanding of the autogeneration of the display modelines used by Xorg or whatever server is used. In fact several solutions I have read in the past point to the fact that you can counterfeit a manufacturer's EDID to overcome video tearing and flicker on both Linux and Windows.

I think Linux tools for finer-grained control of hi-def monitors are out there, you just have to figure out how to stitch them all together.

I just ran parse-edid on one of my machine's EDID collection in /sys after booting into Debian Buster and it told me mine was not valid. The displays are working OK, but neither the Toshiba TV display nor the Dell monitor is new or 4K, so definitely YMMV.
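If anyone wants to poke at the EDID themselves, a minimal sketch (paths and connector names vary by card and driver, and parse-edid comes from the read-edid package):

    # decode the EDID blob the kernel cached for each connector
    # (the file is empty for outputs with nothing connected)
    for e in /sys/class/drm/card*-*/edid; do
        echo "== $e"
        parse-edid < "$e"      # edid-decode works here too, if installed
    done
    # or query the monitor directly over DDC/I2C
    sudo get-edid | parse-edid

Comparing the two is a quick way to see whether the kernel ended up with the same data the monitor claims to send.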
On Mon, 29 Mar 2021 at 18:15, D. Hugh Redelmeier via talk <talk@gtalug.org> wrote:
| From: D. Hugh Redelmeier via talk <talk@gtalug.org>
| You have the floor tomorrow. Perhaps a discussion on this would be
| interesting.
Duh. The next meeting is two weeks away. Evan will be the speaker.

On Tue, Mar 30, 2021 at 8:00 AM Russell Reiter via talk <talk@gtalug.org> wrote:
On Tue, Mar 30, 2021, 1:33 AM Evan Leibovitch via talk, <talk@gtalug.org> wrote:
Having spent quite a few hours working on things in this area I have found a few things.
IMO I think the issue is actually due to the display EDID provided by many monitor / TV manufacturers being lacking in certain format/reporting respects. While Linux autodetection generally works well in most use cases, this is the type of problem Linux users have historically faced. I think this is probably due, not in any small part, to certain anti-competitive practices.
Linux auto-detection is based on manufacturers adhering to standards. EDID has become a joke - - - - the kernel docs talk about this - - - so it's not just my opinion. DeviceID should work better but really doesn't have any kind of real Linux connection. At least nvidia - - - - well - - - - they're not too worried about adhering to any standard either - - - why should they - - - they KNOW they own the market (at least only some 80+%). My LG 4K monitor is actually made by Goldstar. The extreme level of profits desired in the industry means that cheap manufacture wins. Also means that details that aren't considered crucial - - - well - - - they're just ignored!
Take for example the issues with devices using CCD. The lack of Linux-friendly colour profiles for CCD is one of the largest barriers to Linux users in their choices of scanners and cameras etc.
I think in this case a closer look at the EDID for each monitor, assuming they are not exact duplicates of each other, may provide a workable solution. In fact it may be fixed already in a kernel/firmware upgrade.
If that is not possible/desirable then xrandr, get-edid and parse-edid will provide a better understanding of the autogeneration of the display modelines used by Xorg or whatever server is used.
What is so very fascinating is that both get-edid and parse-edid really aren't that useful. It's not that the tools don't work but the tools rely on information provided and that information is all too often not correct. Changing the incorrect information - - - - well - - - I couldn't find a way to do that. Enough of the information sites are themselves outdated (lists 4 and 5 years old really aren't helpful when working with products released in the last 18 months).
In fact several solutions I have read in the past point to the fact that you can counterfeit a manufacturer's EDID to overcome video tearing and flicker on both Linux and Windows.
This is possible but to do so means hacking at the kernel level. I'm not up to that and I'd bet I'm not the only one out there that is that limited.
I think Linux tools for finer-grained control of hi-def monitors are out there, you just have to figure out how to stitch them all together.
That would be a reasonable assumption - - - - except that the divergence between what should be and what is - - - - well - - - to be kind - - - it seems that there is absolutely no connection. I would support the original subject - - - but - - - it's only one more example of our present climate of 'who gives a @#$%^& about the customer'. Am expecting things to only get worse. But then most of the world now wants to live on a 'stupid phone' or maybe a laptop - - - in extremis - - - so it's a Sisyphean job of gargantuan dimensions to effect ANY change. Regards
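For what it's worth, on the "couldn't find a way to change the incorrect information" point: the kernel admin-guide page linked later in this thread (admin-guide/edid) is about exactly that, overriding a monitor's EDID with a corrected blob from userland rather than patching the kernel. A sketch, with the file and connector names as placeholders:

    # drop a corrected EDID blob where the kernel firmware loader can find it
    sudo mkdir -p /lib/firmware/edid
    sudo cp fixed-edid.bin /lib/firmware/edid/
    # then add a kernel boot parameter naming the connector it applies to, e.g.
    #   drm.edid_firmware=HDMI-A-1:edid/fixed-edid.bin
    # (older kernels spell the parameter drm_kms_helper.edid_firmware)

Building the corrected blob in the first place is the fiddly part; the tools/edid directory mentioned in that document is one way, and a hex editor plus edid-decode to check the result is another.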

On Tue., Mar. 30, 2021, 09:50 o1bigtenor via talk, <talk@gtalug.org> wrote:
My LG 4k monitor is actually made by Goldstar.
LG was formerly known as Lucky Goldstar. It's probably embedded in some ancient table, like the "Acer" name for my Asus monitor. Cheers Stewart

On Tue, 30 Mar 2021 at 09:51, o1bigtenor via talk <talk@gtalug.org> wrote:
My LG 4k monitor is actually made by Goldstar.
Um, LG is an abbreviation for (and 1995 rebranding of <https://techcrunch.com/2007/02/08/the-futurist-from-lucky-goldstar-to-lg-or-brands-that-change-with-the-times/>) "Lucky Goldstar" which has been making consumer monitors since the original IBM PCs and maybe before. So all LG products are "made by Goldstar" by definition. - Evan

On 2021-03-30 10:08 a.m., Evan Leibovitch via talk wrote:
On Tue, 30 Mar 2021 at 09:51, o1bigtenor via talk <talk@gtalug.org> wrote:
My LG 4k monitor is actually made by Goldstar.
Um, LG is an abbreviation for (and 1995 rebranding of "Lucky Goldstar"
Really? I remember when I first started seeing ads for LG TVs and appliances. The ads used to say "Life's Good". I always thought that is what the letters stood for.

-- 
Cheers!
Kevin.

http://www.ve3syb.ca/               | "Nerds make the shiny things that
https://www.patreon.com/KevinCozens | distract the mouth-breathers, and
                                    | that's why we're powerful"
Owner of Elecraft K2 #2172          | #include <disclaimer/favourite>
                                    | --Chris Hardwick

On Tue, 30 Mar 2021 at 14:53, Kevin Cozens via talk <talk@gtalug.org> wrote:
The ads used to say "Life's Good". I always thought that is what the letters stood for.
They likely used the tagline hoping people would think that. It helps disassociate LG from Lucky Goldstar, since Goldstar was considered by many to be a low end brand. https://en.wikipedia.org/wiki/LG_Corporation -- Scott

On 2021-03-30 2:53 p.m., Kevin Cozens via talk wrote:
Um, LG is an abbreviation for (and 1995 rebranding of "Lucky Goldstar"
Really? I remember when I first started seeing ads for LG TVs and appliances. The ads used to say "Life's Good". I always thought that is what the letters stood for.
Life may be good, but LG products aren't. I've had a monitor and cell phone made by them. I wasn't happy with either.

On Tue, Mar 30, 2021 at 3:24 PM James Knott via talk <talk@gtalug.org> wrote:
On 2021-03-30 2:53 p.m., Kevin Cozens via talk wrote:
Um, LG is an abbreviation for (and 1995 rebranding of "Lucky Goldstar"
Really? I remember when I first started seeing ads for LG TVs and appliances. The ads used to say "Life's Good". I always thought that is what the letters stood for.
Life may be good, but LG products aren't. I've had a monitor and cell phone made by them. I wasn't happy with either.
So far so good for the 4K monitor here. They do make some decent white goods as well. Time will tell though - - - - the monitor has only been in use for a few months. I'll know much better in a few years. Regards

On 2021-03-30 4:27 p.m., o1bigtenor via talk wrote:
They do make some decent white goods as well.
Yes, they surely do. Paid an arm and a leg for an LG washer with a condenser dryer - a thing that's been standard in Europe for 30+ years but is virtually unknown in the energy-efficiency backwater that is Canada. Stewart

| From: James Knott via talk <talk@gtalug.org>
| Life may be good, but LG products aren't. I've had a monitor and cell phone
| made by them. I wasn't happy with either.

This is the TV I'd like. But I haven't overcome sticker shock. It is very good.
<https://www.lg.com/ca_en/tvs/lg-OLED65CXPUA>

On 3/31/21 12:25 AM, D. Hugh Redelmeier via talk wrote:
| From: James Knott via talk <talk@gtalug.org>
| Life may be good, but LG products aren't. I've had a monitor and cell phone
| made by them. I wasn't happy with either.
This is the TV I'd like. But I haven't overcome sticker shock. It is very good. <https://www.lg.com/ca_en/tvs/lg-OLED65CXPUA>

Sprang for an LG 65" OLED TV over Xmas and have been very pleased with it. Their WebOS is a variant of Linux that has a usable API. The image quality is stunning. I just need to get my OTA antenna working again so I can leverage mythtv.
We have a couple of other LG TVs and my biggest complaint is that they are running fine and dandy, making it impossible to find a justification to replace them. Both my wife and I had LG cell phones (in the form of a Google phone) and they served us very well. To date we have had fairly good luck with LG products but like all products YMMV.

-- 
Alvin Starr          || land: (647)478-6285
Netvel Inc.          || Cell: (416)806-0133
alvin@netvel.net     ||

On 2021-03-31 9:06 a.m., Alvin Starr via talk wrote:
Both my wife and I had LG cell phones (In the form of a Google phone) and they served us very well.
I had a Google Nexus 5, which was made by LG. It developed an intermittent audio problem with the speaker. I wound up buying a Bluetooth earpiece so that the phone would be usable. This was apparently a common problem with those phones. Also, I had to replace the battery far sooner than should have been necessary. On the other hand, my Nexus 1 and Pixel 2, made by HTC, are rock solid. They even feel sturdier than the Nexus 5. The only reason I bought the Nexus 5 was that my Nexus 1 was running out of free memory. Of course, it will soon be unusable when 3G goes away. BTW, the Nexus 1 came with my name etched on the back. I have not seen that on any other phone.

On Wed, Mar 31, 2021 at 12:25:32AM -0400, D. Hugh Redelmeier via talk wrote:
This is the TV I'd like. But I haven't overcome sticker shock. It is very good. <https://www.lg.com/ca_en/tvs/lg-OLED65CXPUA>
I opted for the Sony A1E 55" over the LG C7 a few years ago. No regrets. If I had a reason to get a new TV (which I obviously do not) I would be getting the Sony A90J at this point. Same LG OLED panel, just better electronics. I have washer, dryer, dishwasher and fridge by LG, stove and cell phone by Samsung. -- Len Sorensen

On Tue, 30 Mar 2021 at 16:25, James Knott via talk <talk@gtalug.org> wrote:
Life may be good, but LG products aren't. I've had a monitor and cell phone made by them. I wasn't happy with either.

It seems like LG heard our comments and just gave up.
<https://www.itworldcanada.com/article/lg-closes-its-smartphone-business/445441>

- Evan

On Thu, 8 Apr 2021 01:57:39 -0400 Evan Leibovitch via talk <talk@gtalug.org> wrote:
On Tue, 30 Mar 2021 at 16:25, James Knott via talk <talk@gtalug.org> wrote:
Life may be good, but LG products aren't. I've had a monitor and cell
phone made by them. I wasn't happy with either.
Evan,

I bought an LG refrigerator when I bought my house in 2005. It is working fine. Maybe I got lucky.

-- 
Howard Gibson
hgibson@eol.ca
jhowardgibson@gmail.com
http://home.eol.ca/~hgibson

Sometimes it's a crapshoot; even different models from the same company will have different levels of reliability and cost-cutting tradeoffs. One can get lucky in either direction. I have a Viking freezer in my basement that hasn't missed a beat from the day we unpacked it.

Evan Leibovitch, Toronto, Canada
@evanleibovitch / @el56

On Thu, 8 Apr 2021 at 02:05, Howard Gibson via talk <talk@gtalug.org> wrote:
On Thu, 8 Apr 2021 01:57:39 -0400 Evan Leibovitch via talk <talk@gtalug.org> wrote:
On Tue, 30 Mar 2021 at 16:25, James Knott via talk <talk@gtalug.org> wrote:
Life may be good, but LG products aren't. I've had a monitor and cell
phone made by them. I wasn't happy with either.
Evan,
I bought an LG refrigerator when I bought my house in 2005. It is working fine. Maybe I got lucky.
-- Howard Gibson hgibson@eol.ca jhowardgibson@gmail.com http://home.eol.ca/~hgibson

On Thu, Apr 08, 2021 at 02:04:29AM -0400, Howard Gibson via talk wrote:
I bought an LG refrigerator when I bought my house in 2005. It is working fine. Maybe I got lucky.
I have LG fridge, dishwasher, washer and dryer, and they made the OLED panel in my TV (but not the TV itself). No problem with any of those. Never had any of their phones. I think I have mainly had Nokia, Samsung and Blackberry cell phones. -- Len Sorensen

On Tue, Mar 30, 2021, 9:50 AM o1bigtenor via talk, <talk@gtalug.org> wrote:
On Tue, Mar 30, 2021 at 8:00 AM Russell Reiter via talk <talk@gtalug.org> wrote:
On Tue, Mar 30, 2021, 1:33 AM Evan Leibovitch via talk, <talk@gtalug.org> wrote:
Having spent quite a few hours working on things in this area I have found a few things.
IMO I think the issue is actually due to the display EDID provided by many monitor / TV manufacturers being lacking in certain format/reporting respects. While Linux autodetection generally works well in most use cases, this is the type of problem Linux users have historically faced. I think this is probably due, not in any small part, to certain anti-competitive practices.
Linux auto-detection is based on manufacturers adhering to standards.
EDID has become a joke - - - - the kernel docs talk about this - - - so it's not just my opinion.
I wouldn't necessarily call EDID a joke. I mean, what exactly is the alternative? With it you are at least able to parse the binary and output a more human-understandable XML structure, which you may then use to try and craft a customised solution for yourself.
DeviceID should work better but really doesn't have any kind of real Linux connection. At least nvidia - - - - well - - - - they're not too worried about adhering to any standard either - - - why should they - - - they KNOW they own the market (at least only some 80+%). My LG 4K monitor is actually made by Goldstar. The extreme level of profits desired in the industry means that cheap manufacture wins. Also means that details that aren't considered crucial - - - well - - - they're just ignored!
Take for example the issues with devices using CCD. The lack of Linux-friendly colour profiles for CCD is one of the largest barriers to Linux users in their choices of scanners and cameras etc.
I think in this case a closer look at the EDID for each monitor, assuming they are not exact duplicates of each other, may provide a workable solution. In fact it may be fixed already in a kernel/firmware upgrade.
If that is not possible/desirable then xrandr, get-edid and parse-edid will provide a better understanding of the autogeneration of the display modelines used by Xorg or whatever server is used.
What is so very fascinating is that both get-edid and parse-edid really aren't that useful.
I have found them very useful in the past in conjunction with xrandr but personally haven't had to use them for quite a number of years. As others have said, Linux just works.
It's not that the tools don't work but the tools rely on information provided and that information is all too often not correct. Changing the incorrect information - - - - well - - - I couldn't find a way to do that.
In some cases it is as easy as calling xrandr to author a modeline with the exact specifications it detects after probing, as opposed to relying solely on information provided by the manufacturers, who may not be entirely concerned with the fact that some users can't just buy the dashboard app they use in their day to day business.
Enough of the information sites are themselves outdated (lists 4 and 5 years old really aren't helpful when working with products released in the last 18 months).
In fact several solutions I have read in the past point to the fact that you can counterfeit a manufacturer's EDID to overcome video tearing and flicker on both Linux and Windows.
This is possible but to do so means hacking at the kernel level. I'm not up to that and I'd bet I'm not the only one out there that is that limited.
Not necessarily, I don't hack kernels, I operate in userland. I've solved issues with these tools for this sort of thing in the past and fully expect to do so in the future.
I think Linux tools for finer-grained control of hi-def monitors are out there, you just have to figure out how to stitch them all together.
That would be a reasonable assumption - - - - except that the divergence between what should be and what is - - - - well - - - to be kind - - - it seems that there is absolutely no connection.
I would support the original subject - - - but - - - it's only one more example of our present climate of 'who gives a @#$%^& about the customer'. Am expecting things to only get worse.
But then most of the world now wants to live on a 'stupid phone' or maybe a laptop - - - in extremis - - - so it's a Sisyphean job of gargantuan dimensions to effect ANY change.
Regards
Russell

On Tue, Mar 30, 2021 at 9:21 AM Russell Reiter <rreiter91@gmail.com> wrote:
On Tue, Mar 30, 2021, 9:50 AM o1bigtenor via talk, <talk@gtalug.org> wrote:
On Tue, Mar 30, 2021 at 8:00 AM Russell Reiter via talk <talk@gtalug.org> wrote:
On Tue, Mar 30, 2021, 1:33 AM Evan Leibovitch via talk, <talk@gtalug.org> wrote:
Having spent quite a few hours working on things in this area I have found a few things.
IMO I think the issue is actually due to the display EDID provided by many monitor / TV manufacturers being lacking in certain format/reporting respects. While Linux autodetection generally works well in most use cases, this is the type of problem Linux users have historically faced. I think this is probably due, not in any small part, to certain anti-competitive practices.
Linux auto-detection is based on manufacturers adhering to standards.
EDID has become a joke - - - - the kernel docs talk about this - - - so its not just my opinion.
I wouldn't necessarily call EDID a joke. I mean, what exactly is the alternative? With it you are at least able to parse the binary and output a more human-understandable XML structure, which you may then use to try and craft a customised solution for yourself.
Hmm - - - I don't think you're understanding - - - - you are assuming that the data parsed is relevant and accurate. When it is not accurate and/or incomplete - - - - what is it? IMO that's a sick joke.
DeviceID should work better but really doesn't have any kind of real linux connection. At least nvidia - - - - well - - - - they're not too worried about adhering to any standard either - - - why should they - - - they KNOW they own the market (at least only some 80+%). My LG 4k monitor is actually made by Goldstar. The extreme level of profits desired in the industry means that cheap manufacture wins. Also means that details that aren't considered crucial - - - -well - - - they're just ignored!
Take for example the issues with devices using ccd. The lack of linux friendly colour profiles for ccd is one of the largest barriers to linux users in their choices of scanners and cameras etc.
I think in this case a closer look at the EDID for each monitor, assuming they are not exact duplicates of each other, may provide a workable solution. In fact it may be fixed already in a kernel/firmware upgrade.
If that is not possible/desirable then xrandr, get-edid and parse-edid will provide a better understanding of the autogeneration of the display modelines used by Xorg or whatever server is used.
What is so very fascinating is that both get-edid and parse-edid really aren't that useful.
I have found them very useful in the past in conjunction with xrandr but personally haven't had to use them for quite a number of years. As others have said, Linux just works.
You bet - - - - it just works until it doesn't work. What does one do then? https://www.kernel.org/doc/html/v5.9/admin-guide/edid.html Except I just couldn't find anything besides this " "make" in tools/edid/ " which didn't do anything here. You bet I didn't know what I was doing - - - - but I also couldn't find anyone talking about what to do either!
Its not that the tools don't work but the tools rely on information provided and that information is all too often not correct. Changing the incorrect information - - - - well - - - I couldn't find a way to do that.
In some cases it is as easy as calling xrandr to author a modeline with the exact specifications it detects after probing, as opposed to relying solely on information provided by the manufacturers, who may not be entirely concerned with the fact that some users can't just buy the dashboard app they use in their day to day business.
Hmm - - - - how does one find a modeline? That oh so wonderful EDID indicated that the monitor was made in 2017 and the company released the product in 2020. Sorta looks like inaccurate information and that wasn't the only thing inaccurate so it was time to find something else.
Enough of the information sites are themselves outdated (lists 4 and 5 years old really aren't helpful when working with product release in the last 18 months).
In fact several solutions I have read in the past point to the fact that you can counterfeit a manufacturer's EDID to overcome video tearing and flicker on both Linux and Windows.
This is possible but to do so means hacking at the kernel level.
Not necessarily, I don't hack kernels, I operate in userland. I've solved issues with these tools for this sort of thing in the past and fully expect to do so in the future.
Best wishes and the best of luck!
participants (16)
- Alvin Starr
- Anthony de Boer
- D. Hugh Redelmeier
- Evan Leibovitch
- Giles Orr
- Howard Gibson
- James Knott
- Jamon Camisso
- Kevin Cozens
- lsorense@csclub.uwaterloo.ca
- mwilson@Vex.Net
- o1bigtenor
- Russell Reiter
- Scott Allen
- Stewart C. Russell
- Stewart Russell