
My 39" UltraHD TV, the one that I used as my main computer monitor for almost four years, stopped working. The symptom was that it just would not turn on. The status light below the screen stayed red, meaning something like "standby". Normally it turns blue when I'm using it.

Googling and watching YouTube videos convinced me that there was a chance that I could repair it. LCDs seem to have certain standard PC boards:

- T-Con (timing control)
- power supply
- processor
- LED light & video driver

Replacement boards are reasonably inexpensive, apparently from chop shops (i.e. they buy broken TVs and sell the working parts). This video shows someone fixing my model of TV: <https://www.youtube.com/watch?v=XD7rIEgYULI>

From my research, it seemed as if the most likely problem would involve the power supply module. I could get one for ~US$20 + ~$20 for shipping.

I opened up the monitor and examined the entrails. There was a burnt spot on the power supply board. I posted my problem to the BadCaps.net forum and got encouragement that a little bit of solder would fix the board. I tried this, and it worked. At least for now. I'm using the monitor to compose this mail. <https://www.badcaps.net/forum/showthread.php?p=898646>

I spent several hours researching and perhaps an hour disassembling, soldering, and reassembling. It might not have been worth that time given the value of the monitor ($350 original price, but used for 4 years and obsolete). I find it satisfying to fix a hardware problem, even though I'm a software guy.

Summary: not all hardware problems are hard.

Congrats. One less monitor in the landfill! I strive to repair all I can.

On Thu, 30 May 2019 at 13:41, D. Hugh Redelmeier via talk <talk@gtalug.org> wrote:
My 39" UltraHD TV, the one that I used as my main computer monitor for almost four years, stopped working.
The symptom was that it just would not turn on. The status light below the screen stayed red, meaning something like "standby". Normally it turns blue when I'm using it.
Googling and watching YouTube videos convinced me that there was a chance that I could repair it. LCDs seem to have certain standard PC boards.
- T-Con (timing control)
- power supply
- processor
- LED light & video driver
Replacement boards are reasonably inexpensive, apparently from chop shops (i.e. they buy broken TVs and sell the working parts).
This shows someone fixing my model of TV. <https://www.youtube.com/watch?v=XD7rIEgYULI>
From my research, it seemed as if the most likely problem would involve the power supply module. I could get one for US$~20 + ~$20 for shipping.
I opened up the monitor and examined the entrails. There was a burnt spot on the power supply board. I posted my problem to the BadCaps.net forum and got encouragement that a little bit of solder would fix the board. I tried this, and it worked. At least for now. I'm using the monitor to compose this mail.
<https://www.badcaps.net/forum/showthread.php?p=898646>
I spent several hours researching and perhaps an hour disassembling, soldering, and reassembling. It might not have been worth that time given the value of the monitor ($350 original price, but used for 4 years and obsolete). I find it satisfying to fix a hardware problem, even though I'm a software guy.
Summary: not all hardware problems are hard.

For the benefit of others, I would highly recommend Rosebud Technologies in Markham if you do not have the time, ability, or inclination to repair your broken electronics yourself. http://www.rosebudtech.ca/

Regards,

Clifford Ilkay
+1 647-778-8696

On Thu, May 30, 2019 at 1:43 PM Don Tai via talk <talk@gtalug.org> wrote:
Congrats. One less monitor in the landfill! I strive to repair all I can.
On Thu, 30 May 2019 at 13:41, D. Hugh Redelmeier via talk <talk@gtalug.org> wrote:
My 39" UltraHD TV, the one that I used as my main computer monitor for almost four years, stopped working.
The symptom was that it just would not turn on. The status light below the screen stayed red, meaning something like "standby". Normally it turns blue when I'm using it.
Googling and watching YouTube videos convinced me that there was a chance that I could repair it. LCDs seem to have certain standard PC boards.
- T-Con (timing control)
- power supply
- processor
- LED light & video driver
Replacement boards are reasonably inexpensive, apparently from chop shops (i.e. they buy broken TVs and sell the working parts).
This shows someone fixing my model of TV. <https://www.youtube.com/watch?v=XD7rIEgYULI>
From my research, it seemed as if the most likely problem would involve the power supply module. I could get one for US$~20 + ~$20 for shipping.
I opened up the monitor and examined the entrails. There was a burnt spot on the power supply board. I posted my problem to the BadCaps.net forum and got encouragement that a little bit of solder would fix the board. I tried this, and it worked. At least for now. I'm using the monitor to compose this mail.
<https://www.badcaps.net/forum/showthread.php?p=898646>
I spent several hours researching and perhaps an hour disassembling, soldering, and reassembling. It might not have been worth that time given the value of the monitor ($350 original price, but used for 4 years and obsolete). I find it satisfying to fix a hardware problem, even though I'm a software guy.
Summary: not all hardware problems are hard.

On 2019-05-30 02:09 PM, Clifford Ilkay via talk wrote:
For the benefit of others, I would highly recommend Rosebud Technologies in Markham if you do not have the time, ability, or inclination to repair your broken electronics yourself. http://www.rosebudtech.ca/
FWIW, I recently took a 42" Sharp TV to the recycling centre. I bought a new 43" Sharp for $349. That's getting close to what a repair is likely to cost. Much of my career has been in repair, though not TVs. I've often found it's better to toss & replace than to attempt a repair, and I have experience replacing large (over 100 lead) surface-mount chips.

Incidentally, my new TV uses about 1/5th the power of the old one, so there's that to consider too.

| From: James Knott via talk <talk@gtalug.org>

| FWIW, I recently took a 42" Sharp TV to the recycling centre. I bought
| a new 43" Sharp for $349. That's getting close to what a repair is
| likely to cost. Much of my career has been in repair, though not TVs.
| I've often found it's better to toss & replace than to attempt a
| repair, and I have experience replacing large (over 100 lead)
| surface-mount chips.
|
| Incidentally, my new TV uses about 1/5th the power of the old one, so
| there's that to consider too.

Thanks for the datapoint. I agree with your point. I certainly bow to your experience. For example, I solder less than once a year, so I'm not very proficient.

Sharp used to be a very good brand for LCD TVs. They pioneered in that market. But they essentially went broke competing with inexpensive Chinese brands. Since 2015, the Sharp TV brand in North America has been licensed by Hisense (a state-owned Chinese enterprise). Since 2016, Sharp itself has been owned by Foxconn (a Taiwan-based group). In 2017, Sharp (Foxconn) sued Hisense for damaging the reputation of the brand (the suit was since dropped). Just this month, Sharp re-acquired the brand for TVs in the US (and eventually Canada).

So: before 2015, Sharp was an interesting brand. Now: it is a low-end brand (often with good value for money). Future: who knows. But: there have been real improvements in technology over that same time period, so a new Sharp TV might be better than an old Sharp TV.

This is another example of how branding can mean something and then stop meaning what one thought it did.

Power reduction: interesting. Standby or operating power? I wonder why? Fluorescent backlight vs LED? How old was your old one?

I intend to buy a much better monitor (which might be a TV). The thing is that (1) I'm cheap, and (2) my idea of "much better" is always coming soon but never here. Sharp is expected to put its high-end 8K TVs on sale in the US this fall. One more reason to wait :-)

(I have several models of Sharp Zaurus PDAs that run Linux. Too bad that line died.)

On 2019-05-31 09:33 AM, D. Hugh Redelmeier via talk wrote:
So: before 2015, Sharp was an interesting brand. Now: it is a low-end brand (often with good value for money). Future: who knows. But: there have been real improvements in technology over that same time period, so a new Sharp TV might be better than an old Sharp TV.
This is another example of how branding can mean something and then stop meaning what one thought it did.
Power reduction: interesting. Standby or operating power? I wonder why? Fluorescent backlight vs LED? How old was your old one?
I intend to buy a much better monitor (which might be a TV). The thing is that (1) I'm cheap, and (2) my idea of "much better" is always coming soon but never here.
My old set was just over 10 years old. It had a fluorescent backlight. The new set, despite a slightly larger screen, is a bit smaller, and certainly thinner, than the old one. I agree about what's happening to the tech companies. You may see some low-end sets from RCA or Westinghouse. Of course, those have absolutely nothing to do with the companies that were behind those names for so many years. In fact, RCA and its subsidiary, the NBC network, were largely responsible for the development of the NTSC standard, which was the HDTV of its day.

I don't think you can get a TV or monitor that is not made in China, but someone please tell me I'm wrong.

On Fri, 31 May 2019 at 09:54, James Knott via talk <talk@gtalug.org> wrote:
On 2019-05-31 09:33 AM, D. Hugh Redelmeier via talk wrote:
So: before 2015, Sharp was an interesting brand. Now: it is a low-end brand (often with good value for money). Future: who knows. But: there have been real improvements in technology over that same time period, so a new Sharp TV might be better than an old Sharp TV.
This is another example of how branding can mean something and then stop meaning what one thought it did.
Power reduction: interesting. Standby or operating power? I wonder why? Fluorescent backlight vs LED? How old was your old one?
I intend to buy a much better monitor (which might be a TV). The thing is that (1) I'm cheap, and (2) my idea of "much better" is always coming soon but never here.
My old set was just over 10 years old. It had a fluorescent backlight. The new set, despite a slightly larger screen, is a bit smaller, and certainly thinner, than the old one. I agree about what's happening to the tech companies. You may see some low-end sets from RCA or Westinghouse. Of course, those have absolutely nothing to do with the companies that were behind those names for so many years. In fact, RCA and its subsidiary, the NBC network, were largely responsible for the development of the NTSC standard, which was the HDTV of its day.

I've seen some screens with a 'Made in Mexico' label.

https://www.npr.org/sections/money/2012/11/30/166180397/why-mexico-is-the-wo...

According to this article from 2012, Mexico is the largest exporter of flat-screen TVs.

Alex.

On 2019-05-31 10:19 a.m., Don Tai via talk wrote:
I don't think you can get a TV or monitor that is not made in China, but someone please tell me I'm wrong.
On Fri, 31 May 2019 at 09:54, James Knott via talk <talk@gtalug.org <mailto:talk@gtalug.org>> wrote:
On 2019-05-31 09:33 AM, D. Hugh Redelmeier via talk wrote:
> So: before 2015, Sharp was an interesting brand. Now: it is a low-end brand (often with good value for money). Future: who knows. But: there have been real improvements in technology over that same time period, so a new Sharp TV might be better than an old Sharp TV.
>
> This is another example of how branding can mean something and then stop meaning what one thought it did.
>
> Power reduction: interesting. Standby or operating power? I wonder why? Fluorescent backlight vs LED? How old was your old one?
>
> I intend to buy a much better monitor (which might be a TV). The thing is that (1) I'm cheap, and (2) my idea of "much better" is always coming soon but never here.
My old set was just over 10 years old. It had a fluorescent backlight. The new set, despite a slightly larger screen, is a bit smaller, and certainly thinner, than the old one. I agree about what's happening to the tech companies. You may see some low-end sets from RCA or Westinghouse. Of course, those have absolutely nothing to do with the companies that were behind those names for so many years. In fact, RCA and its subsidiary, the NBC network, were largely responsible for the development of the NTSC standard, which was the HDTV of its day.

On Fri, May 31, 2019 at 10:19:10AM -0400, Don Tai via talk wrote:
I don't think you can get a TV or monitor that is not made in China, but someone please tell me I'm wrong.
Define "made". :)

Certainly as of a couple of years ago, LG OLED TVs were assembled in Mexico for the North American market, Poland for the European market, and who knows where (probably China) for the rest of the world. The panel is made in South Korea, and the other components are made in China. So where are they "made"?

--
Len Sorensen

| From: Don Tai via talk <talk@gtalug.org>

| I don't think you can get a TV or monitor that is not made in China, but
| someone please tell me I'm wrong.

Sharp had Japanese factories. They had Mexican factories. Those are places that the old TV might have been manufactured.

The more you look at it, the more the question becomes complicated. China is not the lowest-cost country for manufacturing. But it does have manufacturing networks that don't exist anywhere else in the world. And Chinese companies have been moving up the "value chain". It takes a while for perceptions to catch up to reality.

When we buy products, we cannot afford to do the research to have a high probability of making the right choice. We lean on brands. Companies know this, with a few results:

- most companies try to cultivate a brand identity
  + advertising [GM]
  + high standards [Rolls Royce]
  + niche image [Tesla, Bugatti, Land Rover]
- sometimes companies just exploit the delay between cheapening a product and the marketplace recognizing that effect. [Porsche makes SUVs! RCA, Westinghouse, Marantz, ...]
- many companies are at a disadvantage because their brand is unknown and therefore not trusted. [Hisense]

For whatever reason, few PRC companies have brands that are valuable in North America. I'm old enough that I remember that being true of Japanese brands. I think I first heard of Sharp in 1965 on a visit to Hong Kong. In Japan, on that same trip, I first heard of the Japanese car brands that became ubiquitous in North America a few years later.

Something manufactured in China with a "Motorola" brand (owned by Lenovo, a PRC company) may seem like a safer bet than one with a "Umidigi" or "Doogee" brand. Remember when Motorola was a US company? When they had their own important microprocessors (6800, 68000, etc.)?

Too many Chinese products that have interested me have been "fire and forget": no support, no updates. If you look at single-board computers (think Raspberry Pi), there are many Chinese competitors that are technically superior until you look at these issues.

On 2019-05-31 11:29 AM, D. Hugh Redelmeier via talk wrote:
For whatever reason, few PRC companies have brands that are valuable in North America. I'm old enough that I remember that being true of Japanese brands. I think I first heard of Sharp in 1965 on a visit to Hong Kong. In Japan, on that same trip, I first heard of the Japanese car brands that became ubiquitous in North America a few years later.
Something manufactured in China with a "Motorola" brand (owned by Lenovo, a PRC company) may seem like a safer bet than one with a "Umidigi" or "Doogee" brand. Remember when Motorola was a US company? When they had their own important microprocessors (6800, 68000, etc.)?
Too many Chinese products that have interested me have been "fire and forget": no support, no updates. If you look at single-board computers (think Raspberry Pi), there are many Chinese competitors that are technically superior until you look at these issues.
The first time I recall hearing about Sharp was on a cassette deck I bought in the early 70s, IIRC.

As for brands, I go with Lenovo ThinkPads. This is in part because I used to work for IBM and most of my work was on ThinkPads, but also the ThinkPad line just seems to be better quality than the regular Lenovo products. Also, if it doesn't have a TrackPoint, I'm not interested.

BTW, a friend bought a Lenovo, not a ThinkPad, and it had that horrible combined English/French keyboard, which wasn't compatible with either the original English or the French keyboard. She soon returned it for that reason.

On Fri, May 31, 2019 at 11:40:05AM -0400, James Knott via talk wrote:
BTW, a friend bought a Lenovo, not a ThinkPad, and it had that horrible combined English/French keyboard, which wasn't compatible with either the original English or the French keyboard. She soon returned it for that reason.
The non-ThinkPads from Lenovo are a lot less durable than the ThinkPads.

ThinkPads are generally just great, although the *x40 models that removed the physical buttons for the TrackPoint were pretty universally hated. And a few models recently have had BIOS disasters (switch the graphics setting from automatic to discrete and the machine is bricked, and this is the documented recommended setting for Linux users. So much for QA testing.)

--
Len Sorensen

| From: D. Hugh Redelmeier via talk <talk@gtalug.org>

| - sometimes companies just exploit the delay between cheapening
|   a product and the marketplace recognizing that effect.
|
|   [Porsche makes SUVs! RCA, Westinghouse, Marantz, ...]

Of course these were pretty off-brand for Porsche too:
<https://en.wikipedia.org/wiki/Elefant>
<https://en.wikipedia.org/wiki/Panzer_VIII_Maus>

On Fri, May 31, 2019 at 11:29:08AM -0400, D. Hugh Redelmeier via talk wrote:
Something manufactured in China with a "Motorola" brand (owned by Lenovo, a PRC company) may seem like a safer bet than one with a "Umidigi" or "Doogee" brand. Remember when Motorola was a US company? When they had their own important microprocessors (6800, 68000, etc.)?
I wonder how many pieces Motorola split into. The microprocessor business was split off and, I think, renamed Freescale; NXP later took it over. Qualcomm tried to acquire NXP but was not allowed to. The cell phones went to Google and then Lenovo. Their enterprise stuff (wifi and logistics management devices) went to Zebra, which then sold the wifi part to Extreme Networks. Plenty of other bits went in who knows what direction.

--
Len Sorensen

On Fri, 7 Jun 2019 at 10:19, Lennart Sorensen via talk <talk@gtalug.org> wrote:
On Fri, May 31, 2019 at 11:29:08AM -0400, D. Hugh Redelmeier via talk wrote:
Something manufactured in China with a "Motorola" brand (owned by Lenovo, a PRC company) may seem like a safer bet than one with a "Umidigi" or "Doogee" brand. Remember when Motorola was a US company? When they had their own important microprocessors (6800, 68000, etc.)?
I wonder how many pieces Motorola split into. The microprocessor business was split off and, I think, renamed Freescale; NXP later took it over. Qualcomm tried to acquire NXP but was not allowed to. The cell phones went to Google and then Lenovo. Their enterprise stuff (wifi and logistics management devices) went to Zebra, which then sold the wifi part to Extreme Networks. Plenty of other bits went in who knows what direction.
I own a few shares (and loved getting annual reports back in the Iridium project days, when the company had a major Space Division), so I got some notices of things. I had a few shares of Freescale at one point...

Having shareholdings didn't lead to getting all that much knowledge about the "spinning" :-( Your list is more complete than what I was aware of.

--
When confronted by a difficult problem, solve it by reducing it to the question, "How would the Lone Ranger handle this?"

On 2019-06-07 11:55 AM, Christopher Browne via talk wrote:
I own a few shares (and loved getting annual reports back in the Iridium project days, when the company had a major Space Division), so I got some notices of things. I had a few shares of Freescale at one point...
Having shareholdings didn't lead to getting all that much knowledge about the "spinning" :-( Your list is more complete than what I was aware of.
He probably would have sold his shares. ;-)

On Fri, May 31, 2019 at 09:33:49AM -0400, D. Hugh Redelmeier via talk wrote:
I intend to buy a much better monitor (which might be a TV). The thing is that (1) I'm cheap, and (2) my idea of "much better" is always coming soon but never here.
Isn't that always the problem.
Sharp is expected to put its high-end 8k TVs on sale in the US this fall. One more reason to wait :-)
I hope they figure out HDMI 2.1 before they do that, rather than using their proprietary quad HDMI 2.0 arrangement for 8K.

--
Len Sorensen

Hi all.

I'd like to take the opportunity of this thread to ask about the suitability of using a TV as a computer monitor. Right now I have a dual-screen setup with one 24" and one 22". The colour doesn't quite match between the two of them, and some thick bezels prevent useful work with a window that spans both monitors.

Now, it's possible to replace them with a single 32" widescreen monitor <https://www.amazon.ca/dp/B01BMES072/> for about $550. For $477 I could get what looks to be a top-tier 43" Samsung 4K.

I am wondering if the lower price is because of greater volumes and consumer orientation rather than any inherent quality of the screen. As Father's Day approaches, I expect some deals a-coming here.

How viable is it to use a TV as a monitor, ignore the "smart" crap, and just plug in the HDMI? Are there features or specs needed for a TV to make it usable for close viewing?

Thanks!

- Evan

On 2019-05-31 01:45 PM, Evan Leibovitch via talk wrote:
How viable is it to use a TV as a monitor, ignore the "smart" crap, and just plug in the HDMI? Are there features or specs needed for a TV to make it usable for close viewing?
I have used my TVs on occasion as a monitor, and it worked fine. On the other hand, my monitor is connected to a Rogers box for watching TV. As for close viewing, that would depend entirely on the TV. Some are better than others. For example, my new Sharp TV seems to have a Sharper (sorry <g>) image than the old Sharp TV.

I have used a projector for "TV" (Netflix/YouTube) watching for the last 5 years. It has 2 HDMI ports, and is quite nice. It's "only" 1080p, but when it's a 90" screen, that's pretty good. I had a BenQ, but the bulb was always a problem, so I recently replaced it with an XGIMI, and it's awesome! I occasionally plug the HDMI into my laptop, and that works well too. It claims to be 4K (speaking of brands) and it will take a 4096x2160 signal, but the underlying display is 1920x1080.

I also have a 39" "RCA" that I bought at Loblaws 5 years ago for $199 (I think; it *might* have been as much as $299, but I don't think so)... The colours aren't perfect (mostly meaning that the range is limited), but it's our bedroom screen, and it works pretty well as a monitor for my laptop, too.

../Dave

On May 31, 2019, 1:50 PM -0400, James Knott via talk wrote:
On 2019-05-31 01:45 PM, Evan Leibovitch via talk wrote:
How viable is it to use a TV as a monitor, ignore the "smart" crap, and just plug in the HDMI? Are there features or specs needed for a TV to make it usable for close viewing?
I have used my TVs on occasion as a monitor, and it worked fine. On the other hand, my monitor is connected to a Rogers box for watching TV. As for close viewing, that would depend entirely on the TV. Some are better than others. For example, my new Sharp TV seems to have a Sharper (sorry <g>) image than the old Sharp TV.

On Fri., May 31, 2019, 13:46 Evan Leibovitch via talk, <talk@gtalug.org> wrote:
Right now I have a dual screen setup with one 24" and one 22". The colour
doesn't quite match between the two of them
Have you tried colour calibrating them? Different vendors have different white points, and almost every manufacturer ships with everything set as blue-white as possible so it'll look brightest in the store. A properly calibrated monitor looks pink by comparison, but it should be possible to get each monitor calibrated for seamless colours.

(To go back to the "where are they made?" subtopic, Ontario used to be a major source of an element critical to high-quality colour CRTs: yttrium, I think it was. The uranium mines around Elliot Lake were a major source, but now neither is particularly in demand.)

Stewart
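
PS: For X11 users, a rough first pass at matching white points (a sketch only; the output names and gamma numbers here are hypothetical placeholders, and this is no substitute for profiling with a real colorimeter) is to nudge the bluer monitor's per-channel gamma with xrandr:

    import subprocess

    # Hypothetical output names and factors: check `xrandr --query` for your
    # outputs, then adjust by eye (or with a colorimeter) until whites match.
    OUTPUTS = {
        "HDMI-1": "1.0:1.0:1.0",    # reference monitor, left untouched
        "DP-1":   "1.0:0.97:0.90",  # pull the bluer monitor toward the first
    }

    for output, gamma in OUTPUTS.items():
        # xrandr's --gamma takes R:G:B correction factors for one output.
        subprocess.run(["xrandr", "--output", output, "--gamma", gamma],
                       check=True)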

On 2019-05-31 03:16 PM, Stewart Russell via talk wrote:
(To go back the the "where are they made?" subtopic, Ontario used to be a major source of an element critical to high quality colour CRTs: yttrium, I think it was. The uranium mines around Elliot Lake were a major source, but now neither is particularly in demand)
RCA used to have a picture tube plant in Midland. It's probably gone now. ;-) Incidentally, it's been years since I've seen a CRT set on sale. Around the time I got my first HDTV, 10 years ago, there was a single small CRT TV on sale at Walmart. I wonder if they ever sold it.

On 2019-05-31 3:45 p.m., James Knott via talk wrote:
RCA used to have a picture tube plant in Midland. It's probably gone now. ;-)
Yup, the little white dot plinked away some time ago.

Midland had some surprising high tech for a while. Leitz had their lens and camera factory there. I'd heard it was originally to secure a supply of optical balsam for lens manufacture, but this may not be the case. Some of Leica's signature lens designs originated in Midland. The operation is now owned by Raytheon.

On Fri, 31 May 2019, Evan Leibovitch via talk wrote:
How viable is it to use a TV as a monitor, ignore the "smart" crap, and just plug in the HDMI? Are there features or specs needed for a TV to make it usable for close viewing?
I use a 49" LG 1080p smart TV for a monitor via HDMI. I love it.

On Fri, 31 May 2019 at 13:46, Evan Leibovitch via talk <talk@gtalug.org> wrote:
I'd like to take the opportunity of this thread to ask about the suitability of using a TV as a computer monitor. Right now I have a dual-screen setup with one 24" and one 22". The colour doesn't quite match between the two of them, and some thick bezels prevent useful work with a window that spans both monitors.
Now, it's possible to replace them with a single 32" widescreen monitor <https://www.amazon.ca/dp/B01BMES072/> for about $550. For $477 I could get what looks to be a top-tier 43" Samsung 4K.
I am wondering if the lower price is because of greater volumes and consumer orientation rather than any inherent quality of the screen. As Father's Day approaches, I expect some deals a-coming here.
How viable is it to use a TV as a monitor, ignore the "smart" crap, and just plug in the HDMI? Are there features or specs needed for a TV to make it usable for close viewing?
Someone once told me a couple of things (colour behaviour, maybe response time) are different between TVs and monitors - but many people use them interchangeably without much difficulty. If you're worried about colour correction for photography ... a TV is probably not as good as a "monitor." But most people don't really notice.

For a while Best Buy sold a 43" 4K monitor (not TV) made by Philips. It was about $500 a year and a half ago, and I jumped on it. It has some weird ghosting issues (hard to see, temporary, but they can last minutes), but in every other respect I love it. Of course it completely dominates my desk ... but for the first time in my life I'm not routinely looking for another monitor to park my next window on. I don't game, and I seem to recall its response time probably wouldn't have been great for that. But I do all my day-to-day work on it, and occasionally watch movies on it. It's great; I recommend the experience.

I would recommend that you go 4K at that size, not 1920x1080. Even if your eyesight is poor, higher resolution gives you more choices and better future-proofing (for almost no increase in price these days).

--
Giles
https://www.gilesorr.com/
gilesorr@gmail.com

| From: Evan Leibovitch via talk <talk@gtalug.org>
| Subject: Re: [GTALUG] war story: fixing an LCD TV

| I'd like to take the opportunity of this thread to ask about the
| suitability of using a TV as a computer monitor.

I've been mentioning this on the mailing list for almost 4 years.

| Right now I have a dual-screen setup with one 24" and one 22". The colour
| doesn't quite match between the two of them

Colour is a subtle topic about which I'm not qualified to talk. I don't think Linux is up to snuff here either.

HDR (High Dynamic Range) means a confusing variety of things, especially after marketing has had at it. But "normal" computer pixels have 8 bits per colour in each pixel. HDR often means 10 bits per colour in each pixel. So if you care about colour, you might want HDR. You also need to worry about chroma subsampling (TVs often do this).

| and some thick bezels prevent
| useful work with a window that spans both monitors.

And an UltraHD display has the same number of pixels as four FullHD monitors, not just two. A 24" FullHD monitor has the same pixel density as a 48" UltraHD monitor.

I want to maximize information, so I prefer 40" over 32". If you want to maximize beauty, 32" might be better. I don't have experience with 43", but it might be better for information than 40".

| Now, it's possible to replace them with a single 32" widescreen monitor
| <https://www.amazon.ca/dp/B01BMES072/> for about $550. For $477 I could get
| what looks to be a top-tier 43" Samsung 4K.

Top-tier Samsungs are more than that. <https://www.bestbuy.ca/en-ca/product/samsung-43-4k-uhd-hdr-qled-tizen-smart-tv-qn43q60rafxzc/13407245> That "Quantum Dot" technology might be good -- I've not looked at it.

| I am wondering if the lower price is because of greater volumes and
| consumer orientation rather than any inherent quality of the screen. As
| Father's Day approaches, I expect some deals a-coming here.

There is some silly divergence. Off the top of my head:

- monitors tend to be more expensive than TVs for the same level. Quality improves faster in TVs (model lifetime is short). I'm pretty sure this is a function of the market size.
- TVs only do HDMI or worse. Monitors have DisplayPort, which supports higher bandwidth (depending on the state of leapfrogging standards). Monitors generally support HDMI as well.
- older HDMI standards didn't support UltraHD well.
- TVs often do chroma subsampling, without it being mentioned in the spec sheets. Monitors do not.
- TVs can be annoyingly "smart".
- some monitors have USB hubs (mildly useful).
- some monitors don't support sound (inconvenient).
- viewing angles differ between TV and monitor uses, so the tradeoffs differ. Also refresh rates, if you are a gamer.
- TVs sometimes do interpolation to make videos smoother. I wonder if that impairs accuracy. It seems to increase latency. You will want to disable some TV processing tricks (not always clearly documented).

| How viable is it to use a TV as a monitor, ignore the "smart" crap and just
| plug in the HDMI? Are there features or specs needed for a TV to make it
| usable for close viewing?

It is easy and has been for years. There are gotchas if some part of your video chain is a few years older: UltraHD wasn't widely supported. At least HDMI 2.0 or DisplayPort 1.2 is needed to support 3840x2160@60Hz. And then there is HDR.

There is a deal at Costco, ending tomorrow, for a 40" Samsung that might fit your bill. Read this thread for some context: <http://forums.redflagdeals.com/costco-samsung-40-4k-model-40nu7100-369-99-2287167/>

Other deals will surely come up.
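
A quick back-of-the-envelope check of those pixel-count and density claims (a sketch; the sizes and resolutions are just the ones discussed above):

    import math

    def ppi(diagonal_inches, width_px, height_px):
        """Pixels per inch of a display with the given diagonal and resolution."""
        return math.hypot(width_px, height_px) / diagonal_inches

    # UltraHD (3840x2160) has four times the pixels of FullHD (1920x1080).
    assert 3840 * 2160 == 4 * (1920 * 1080)

    # A 24" FullHD panel and a 48" UltraHD panel have the same pixel density.
    print(round(ppi(24, 1920, 1080)))  # ~92 PPI
    print(round(ppi(48, 3840, 2160)))  # ~92 PPI
    print(round(ppi(40, 3840, 2160)))  # ~110 PPI: denser, more information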

On Sat, Jun 01, 2019 at 05:39:15PM -0400, D. Hugh Redelmeier via talk wrote:
HDR (High Dynamic Range) means a confusing variety of things, especially after marketing has had at it. But "normal" computer pixels have 8 bits per colour in each pixel. HDR often means 10 bits per colour in each pixel. So if you care about colour, you might want HDR.
8-bit HDR does exist, but it is rarely used since it tends to cause color banding.
You also need to worry about chroma subsampling (TVs often do this).
As long as you have HDMI 2.0, you should be able to use 8 bits per channel at 4K and 60Hz without chroma subsampling (which ruins the clarity of text). Going to HDR requires either dropping the refresh rate or using chroma subsampling. Neither is likely to be that desirable, unless you are just doing photo editing (and hence frame rate is less important than color range), or you are doing video, in which case chroma subsampling isn't an issue (it probably isn't a problem for photos either).
There is some silly divergence. Off the top of my head:
- monitors tend to be more expensive than TVs for the same level. Quality improves faster in TVs (model lifetime is short). I'm pretty sure this is a function of the market size.
- TVs only do HDMI or worse. Monitors have DisplayPort, which supports higher bandwidth (depending on the state of leapfrogging standards). Monitors generally support HDMI as well.
- older HDMI standards didn't support UltraHD well.
- TVs often do chroma subsampling, without it being mentioned in the spec sheets. Monitors do not.
The TV does not do it; the signal sent to it may. The TV uses whatever it is sent, but of course what formats it supports varies, so the source device may not have a choice. A monitor does it too if sent such a signal. Connect a Blu-ray player to a monitor and you will get a signal with chroma subsampling, because that is how video on Blu-ray works.

--
Len Sorensen
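
A rough sanity check of that bandwidth claim (a sketch: 14.4 Gbit/s is HDMI 2.0's data rate after 8b/10b encoding, the frame timing is the standard CTA-861 total of 4400x2250 samples for 4K including blanking, and real 4:2:2 packing on HDMI differs slightly):

    # Does a given 4K60 pixel format fit in HDMI 2.0's bandwidth?
    HDMI20_DATA_GBPS = 14.4                # 18 Gbit/s TMDS minus 8b/10b overhead
    TOTAL_W, TOTAL_H, HZ = 4400, 2250, 60  # CTA-861 4K60 frame incl. blanking

    def needed_gbps(bits_per_pixel):
        return TOTAL_W * TOTAL_H * HZ * bits_per_pixel / 1e9

    for name, bpp in [("8-bit RGB 4:4:4", 24), ("10-bit RGB 4:4:4", 30),
                      ("10-bit YCbCr 4:2:2", 20)]:
        rate = needed_gbps(bpp)
        print(f"{name}: {rate:.2f} Gbit/s, fits={rate <= HDMI20_DATA_GBPS}")
    # 8-bit 4:4:4 just fits (~14.26); 10-bit 4:4:4 does not (~17.82);
    # 10-bit 4:2:2 fits again (~11.88) -- hence the tradeoff above.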

| From: Lennart Sorensen via talk <talk@gtalug.org>
| To: D. Hugh Redelmeier <hugh@mimosa.com>, GTALUG Talk <talk@gtalug.org>

Thanks for this informative message.

| On Sat, Jun 01, 2019 at 05:39:15PM -0400, D. Hugh Redelmeier via talk wrote:
| > HDR (High Dynamic Range) means a confusing variety of things,
| > especially after marketing has had at it. But "normal" computer
| > pixels have 8 bits per colour in each pixel. HDR often means 10 bits
| > per colour in each pixel. So if you care about colour, you might want
| > HDR.
|
| 8-bit HDR does exist, but it is rarely used since it tends to cause color
| banding.

How would it qualify as HDR? OK: I know, HDR is anything marketeers think that they can get away with. I guess dithering can sort-of add a couple of bits.

| > You also need to worry about chroma subsampling (TVs often do this).
|
| As long as you have HDMI 2.0, you should be able to use 8 bits per channel
| at 4K and 60Hz without chroma subsampling (which ruins the clarity of text).

My (cheap, old) Seiki SE39UY04 TV is limited to HDMI 1.4. So at best it can do UltraHD at 30Hz, with 4:2:2 chroma subsampling. I don't know its LCD technology.

So it should be bad. But for my usage, it seems pretty good.

- I don't have a lot of dynamic content on my screen, so slow refresh doesn't have a lot of effect. The mouse cursor movement isn't as smooth as it would be with faster refresh.

- Almost nothing I do exposes the limitations of chroma subsampling. Text is the killer test case, but foreground and background for most text differ in luminance (full resolution), not chrominance (reduced resolution). Some artistic creations have text that renders badly, but even then, most artistic creations use quite large text. In my browsing, I've only encountered this once or twice.

As far as the type of panel, I don't know. I've googled for the code on the panel and get only a few hits, all useless.

The electronics are sufficiently modular that I wonder if I could graft on an HDMI 2.0 T-Con board and processor board to get a better system. I might do it if someone else pioneered.

Someone else hacked his/her SE39UY04 and discovered that it's running Linux and can be modded: <http://www.zeroepoch.com/blog/se39uy04>
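
To make the luminance-vs-chrominance point concrete, here is a minimal simulation of 4:2:2 subsampling (assuming numpy is available; a real pipeline works on YCbCr planes, and this just averages each horizontal pair of chroma samples):

    import numpy as np

    def subsample_422(y, cb, cr):
        """Keep luma (Y) at full resolution; halve chroma (Cb/Cr) horizontally."""
        def half_horizontal(c):
            paired = (c[:, 0::2] + c[:, 1::2]) / 2.0   # average column pairs
            return np.repeat(paired, 2, axis=1)        # stretch back to width
        return y, half_horizontal(cb), half_horizontal(cr)

    # A one-pixel-wide detail that lives only in chroma (like coloured text
    # on an equally bright background) smears; a luma-only detail would not.
    h, w = 4, 8
    y = np.full((h, w), 0.5)
    cb = np.zeros((h, w)); cb[:, 3] = 1.0
    _, cb2, _ = subsample_422(y, cb, np.zeros((h, w)))
    print(cb[0, 2:5])   # [0. 1. 0.]   sharp before
    print(cb2[0, 2:5])  # [0.5 0.5 0.] smeared after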

On Fri, Jun 07, 2019 at 12:15:50PM -0400, D. Hugh Redelmeier via talk wrote:
How would it qualify as HDR? OK: I know, HDR is anything marketeers think that they can get away with.
HDR defines a larger range of brightness. So without HDR, your 8-bit values 0 to 255 (well, video, rather than computers, uses 16 to 240) would cover 0 to 120 nits (or 80 if using sRGB). In HDR, 255 (or 240) would be 10000 nits instead, so the display interprets the incoming values differently. This is why you tend to get severe color banding in 8-bit HDR: your content from 0 to 120 nits now has to be covered by a much smaller range of your 8-bit values, because a lot of values cover the 120 to 10000 nit range. HDR does use logarithmic values rather than linear to help a bit, but it really needs 10 or more bits to get decent gradients. Of course no current display can do 10000 nits, but the HDR standards seem to be designed with that as the limit for the future. Dolby has a screen that can do 4000 nits, but more typical high-end LCD TVs can do 1500 to 2000 nits, with OLED limited to about 800 nits (but due to having true black and individual pixel control, the contrast is much higher than on an LCD).
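
A small sketch of that banding argument, using the SMPTE ST 2084 (PQ) curve that HDR10 uses (the constants come from the published standard; counting the code values that land below 120 nits shows how few 8-bit steps remain for SDR-range content):

    # SMPTE ST 2084 (PQ) inverse EOTF: luminance in nits -> code value in [0,1].
    M1, M2 = 1305 / 8192, 2523 / 32
    C1, C2, C3 = 107 / 128, 2413 / 128, 2392 / 128

    def pq_encode(nits):
        y = (nits / 10000.0) ** M1
        return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

    # How many code values cover the 0-120 nit (roughly SDR) range?
    for bits in (8, 10):
        steps = round(pq_encode(120) * (2 ** bits - 1))
        print(f"{bits}-bit PQ: {steps} code values below 120 nits")
    # ~134 steps at 8 bits, where plain SDR had all 256 -> visible banding;
    # ~538 steps at 10 bits -> smooth gradients again.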
I guess dithering can sort-of add a couple of bits.
Yes, 8-bit HDR with dithering is supposed to be quite good, actually. I have never tried doing HDR from a computer to my TV.
My (cheap, old) Seiki SE39UY04 TV is limited to HDMI 1.4. So at best it can do UltraHD at 30Hz, with 4:2:2 chroma subsampling. I don't know its LCD technology.
Yeah HDMI 1.4 is certainly a big limitation. 2.0 is much better.
From what I can find, it appears that TV is S-MVA, which is supposed to be similar to IPS in viewing angle (though not quite as good), but with better black levels.
So it should be bad. But for my usage, it seems pretty good.
- I don't have a lot of dynamic content on my screen, so slow refresh doesn't have a lot of effect. The mouse cursor movement isn't as smooth as it would be with faster refresh.
- Almost nothing I do exposes the limitations of chroma subsampling. Text is the killer test case, but foreground and background for most text differ in luminance (full resolution), not chrominance (reduced resolution). Some artistic creations have text that renders badly, but even then, most artistic creations use quite large text. In my browsing, I've only encountered this once or twice.
Color text is the biggest problem. Black or white text is generally no big deal.
As far as the type of panel, I don't know. I've googled for the code on the panel and get only a few hits, all useless.
The electronics are sufficiently modular that I wonder if I could graft a HDMI 2.0 T-Con board and processor board, to get a better system. I might do it if someone else pioneered.
Someone else hacked his/her SE39UY04 and discovered that it's running Linux and can be modded: <http://www.zeroepoch.com/blog/se39uy04>
Neat.

--
Len Sorensen

On Fri, May 31, 2019 at 01:45:39PM -0400, Evan Leibovitch via talk wrote:
Hi all.
I'd like to take the opportunity of this thread to ask about the suitability of using a TV as a computer monitor. Right now I have a dual-screen setup with one 24" and one 22". The colour doesn't quite match between the two of them, and some thick bezels prevent useful work with a window that spans both monitors.
Now, it's possible to replace them with a single 32" widescreen monitor <https://www.amazon.ca/dp/B01BMES072/> for about $550. For $477 I could get what looks to be a top-tier 43" Samsung 4K.
I am wondering if the lower price is because of greater volumes and consumer orientation rather than any inherent quality of the screen. As Father's Day approaches, I expect some deals a-coming here.
How viable is it to use a TV as a monitor, ignore the "smart" crap, and just plug in the HDMI? Are there features or specs needed for a TV to make it usable for close viewing?
The terrible viewing angles of VA LCD panels can be a problem when sitting close, since you will potentially be at a very bad angle for the sides of the screen. At TV viewing distance that is less of a problem (although some people, like me, avoid LCD TVs partly for this reason). So I would think that rules out Samsung for computer use.

A few TVs (mostly some models from LG) use IPS panels, which have much better viewing angles but less brightness and color volume.

OLED solves the viewing angle problem entirely, but currently the minimum size is 55", the price is quite a bit higher, and computer use is generally not recommended due to the potential for screen burn-in.

So, it's all a matter of what you think is most important. Certainly I have found IPS-based computer monitors to be far better than the other types, but I have never tried an LG IPS-based TV as a computer screen. It might actually work well.

--
Len Sorensen
participants (13)

- Alex Volkov
- Chris F.A. Johnson
- Christopher Browne
- Clifford Ilkay
- D. Hugh Redelmeier
- David Mason
- Don Tai
- Evan Leibovitch
- Giles Orr
- James Knott
- lsorense@csclub.uwaterloo.ca
- Stewart C. Russell
- Stewart Russell