
| From: Alex Volkov via talk <talk@gtalug.org>
| I'm looking to buy a used Nvidia GeForce GTX 1060 to run some ML
| tutorials.

The advantage of nvidia over AMD is wider support.  CUDA is
nvidia-only (but AMD's ROCm is intended to be easy to port to from
CUDA).

The disadvantage is that nvidia's stuff is closed source.  Yuck.

nvidia also has terrible licensing terms that can force you to buy
more expensive cards.  You probably won't be hit by this:
<https://www.theregister.co.uk/2018/01/03/nvidia_server_gpus/>

In general, nvidia does more "price discrimination".  But AMD is not
immune: AMD sells "workstation" cards for extra money.

For raw computing power per dollar, my impression is that AMD can be
a better deal.

| I got a good deal on a Dell OEM one.  Are there any pitfalls in
| running one in a non-Dell system?  More specifically, nothing even
| close to that, i.e. an AMD FX CPU and an AMD 970 chipset.

There are no problems that I know of.  I used a Dell OEM nvidia card
many years ago without issue.

Some OEM cards are a little crippled.  My 5-year-old desktop came
with an OEM AMD card.  The specs said "1920x1200 max resolution" but
also said "Dual Link DVI" (which is only needed for higher
resolutions).  So I assumed that it could do 2560x1600 like the
non-OEM versions.  It could not.  (My best guess is that they cheaped
out on a TMDS chip and did not, in fact, support dual link, but I had
no way to test.)  So check the specs.

Bonus hints: before buying the card, make sure it will fit in your
system:

- I had a problem with an RX 570 being too long for my computer's
  motherboard.

- Many cards now require extra power connectors that your power
  supply might not support.  And the number of pins on those
  connectors has changed in recent years.

- You may need a power supply with more capacity.

- With more power comes more heat -- will your case handle that?
  (Probably.)
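
The dual-link DVI story above comes down to pixel-clock arithmetic:
single-link DVI tops out at a 165 MHz pixel clock, and 2560x1600
needs more than that.  A back-of-the-envelope sketch (the 15%
blanking-overhead factor is my approximation, not a spec value):

```python
# Why 2560x1600 needs dual-link DVI: single-link DVI is limited to a
# 165 MHz pixel clock.  The blanking factor is an approximation;
# real timings (e.g. CVT reduced blanking) differ slightly.
def approx_pixel_clock_mhz(width, height, refresh_hz=60, blanking=1.15):
    # pixels per frame * frames per second * blanking overhead
    return width * height * refresh_hz * blanking / 1e6

for mode in [(1920, 1200), (2560, 1600)]:
    clk = approx_pixel_clock_mhz(*mode)
    link = "single-link OK" if clk <= 165 else "needs dual-link"
    print(f"{mode[0]}x{mode[1]}@60: ~{clk:.0f} MHz -> {link}")
```

So 1920x1200@60 squeaks under the single-link limit, while
2560x1600@60 does not -- which is why a card with only one working
TMDS link would behave exactly as mine did.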
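
On the power-supply hint: a quick budget calculation is enough to
tell whether your PSU is in the right ballpark.  The wattage figures
below are illustrative guesses for the hardware mentioned in this
thread, not measurements -- look up your actual parts:

```python
# Rough PSU headroom check.  The numbers are illustrative assumptions:
# 120 W is nvidia's rated TDP for the GTX 1060, 125 W is a typical
# AMD FX-series TDP, and 75 W is a guess for everything else.
components_w = {
    "GTX 1060 (peak)": 120,
    "AMD FX CPU": 125,
    "motherboard/RAM/drives/fans": 75,
}

total_w = sum(components_w.values())
# Leave ~30% headroom so the PSU runs in its efficient range.
recommended_psu_w = total_w * 1.3

print(f"Estimated draw: {total_w} W; recommended PSU: {recommended_psu_w:.0f} W")
```

If the recommended figure is above your PSU's rating, factor a
replacement supply into the price of that "good deal".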