The point is that until I found that out I was convinced that "anything can do 2d, you only have to worry about 3d". This had been largely true since graphics cards shipped with at least 8MB (yes, megabytes) of RAM to hold the entire framebuffer.
But it turns out that early HDMI (and to a lesser extent DisplayPort) really weren't up to the bandwidth needed for some of these resolutions (and refresh rates), so simply grabbing any cheap card with three working outputs on the back won't quite work, at least for the most demanding monitors.
I recently upgraded virtually my entire system and considered delaying the GPU upgrade for a generation (I'd expect more "4k friendly" GPUs to be cheaper), but discovered that mating my trusty GTX560ti (a fine 1080p GPU, especially if your tastes run to old games) to a 4k monitor just wasn't going to happen. So when I saw "570" pop up, I figured I had to speak up (I thought I knew GPUs pretty well, but that was a surprise).
All this fuss and he goes with a [single?] 1080 monitor and RX5700 GPU? At least attach a few more monitors to it if you are going to look like a trader...
Seeing that it is basically a 2004 game, I'd assume the cheapest/oldest existing PC you can find on craigslist.
You might want to [barely] upgrade such a thing, so look for computers with DDR3 and SATA ports (I wouldn't trust a real 2004 machine to have them). Going the used route should do wonders for things like your Windows license, monitor, case, power supply, mouse, and keyboard. If you slip a bit more RAM or an SSD into the thing, so much the better.
A quick check of my local craigslist had an old Phenom II [don't ask] based system with monitor for $120, and a vastly more powerful Intel i5 system for a bit more (but no monitor).
I also found this https://www.newegg.com/p/2NR-000A-00CT3?Item=9SIA9AX9MD6728
There are cheaper Dells on the list, but they have Pentium 4s, which tend to run hot, so I'd avoid them (although I'd prefer Core and later to any pre-Ryzen AMD chip). Also, Microcenter has some good deals on refurbished PCs, although expect to buy the monitor separately (which could easily double the price over the linked AIO).
Minecraft runs on Java, so it is a no-brainer to install (the original, that is; getting the newer one that is designed for Windows might be an issue).
Steam works directly with Ubuntu, so that is a good place to start. Just don't expect an extensive Steam library to all be there (but you might see more than you expect).
There are claims that fps is as good as windows (presumably things with a full Linux port), but that hasn't been my experience (even with Kerbal Space Program which comes with the official Linux port). Depending on your hardware (especially the GPU), you might get more "bang for your buck" with that $100 MS price than paying a similar amount to Nvidia. One great thing about Linux is that you can try your entire build with it before having to cough up for Windows, so I'd certainly recommend trying it first and seeing if you still feel that the ~$100 is warranted.
As you might guess "custom gaming PCs" are pretty expensive for what you get. I'd be curious if the "water cooling" includes the GPU as well (my guess is no, that wouldn't work with their "easy to upgrade" claim).
I'd expect it to work fine at 4k on older games like Skyrim (even SE), Witcher, [not sure about the latest Tomb Raider]: I've been doing fine on Skyrim at 4k on a Vega56 (certainly less powerful than a 2070).
I'd also recommend looking at UK build guides here : https://uk.pcpartpicker.com/guide/
Don't forget to add ~85 quid for Windows (185 if you want Pro). This does include AIO watercooling (which might just add extra noise on the AMD side), so that evens things out a bit. I don't think either Novatech or the build guides include a monitor at those prices.
Basically, PCs have been plug and play for decades, and this entire site is dedicated to custom builds. The lack of information about putting the thing together is largely due to the sheer simplicity of it: if the parts fit together, that's where they are supposed to go.
Probably the last existing [new old stock] part of that model. They are hoping that somebody will need exactly that part. If a contract calls for that SSD, getting it changed can easily cost >$17k (currency of your choice).
Or it is a mis-price, that's my real guess. Either way, nobody really cares about 280G SSDs anymore (unless something is very specific to a contract).
Shouldn't be an issue, the only two possibilities are:
Does it short two pins (can't see it squishing out enough and linking the next door pin)?
Can it physically block the pin? No, not going to happen here (and typically I'd fear a short first).
I guess it might be possible to spread out and insulate the chip from the socket on that pin, but generally speaking anything that conducts heat conducts electricity (diamond is probably the least exotic material that only conducts heat). I know I have a larger bit of thermal paste in my socket (I bought the motherboard open box), and really didn't worry about it as it won't short two pins. How in the world did you (and the guy who had mine earlier) manage this?
If you really mean "workstation" it might be for you. If you want a gaming GPU forget it. The biggest obvious advantage is 16GB of extremely fast RAM. I'd also be nervous about AMD's OpenGL software, and check it against whatever software you are using before buying this.
Finding a new one might be difficult, I doubt they made any more than the original bunch.
The problem didn't seem to be on the GPU side. The issue was that you had to set the TV to HDMI2 (done in the TV settings) and feed it a signal no deeper than 8 bits (done via the GPU settings). Then and only then would the TV default to 60Hz (presumably a limitation of the HDMI input, or possibly how it outputs high definition).
The "difficulty running 4k" is largely for more modern games. I suspect I'll be more willing to move sliders down than drop the resolution, but the option will be there. On the other hand I wanted to see what DDO (Dungeons and Dragons Online, originally launched in ~2006) looked like in 4k and the GPU was capable of >200fps at max settings (other than supersample antialiasing).
I'd say "probably not, but check your software". This type of thing is specific to whatever you use to render on the GPU. No idea even if another 1080 will improve things.
It probably isn't a storage issue, although if streaming causes a big enough hit you might try saving/buffering on your SSD. If it suddenly starts working better then you will likely need a larger SSD.
Since Spotify is giving you trouble, I suspect it is an ISP issue (nothing in a vaguely modern PC should have issues with audio).
I really have to question the idea of an AIO being louder. Sure, the pump will make some noise, but nearly all the noise comes from the fan. For equal temperature per Watt, I'd expect the AIO to come out ahead (more so with 240mm radiators). AIOs tend to be noisier only because they are bought by people who will simply overclock higher with such a system (note that this doesn't seem to matter with the latest zen2 generation, and maybe other latest-and-greatest CPUs). Note that my AIO+bulldozer was significantly louder (especially websurfing) than Wraith+Ryzen, so if I can get my adaptor+AIO working I'll have to post whether it is louder (at least before overclocking).
That said, any water-based system will have to expel any air that gets in the system and some people absolutely hate the "gurgling sound". I've never heard it in mine.
[this was supposed to be in the main thread, but at least references vagabond's link].
First of all, nearly all the big issues in "how a CPU works" have already been decided once you choose an x86 (PC) CPU. After that, there are only a few different ways a core is designed: Coffee Lake (and all previous Intel cores since Sandy Bridge: they are a straightforward progression at the block-diagram level, plus maybe different vector [SSE/AVX] extensions), Zen (Zen, Zen+, Zen2), and soon Ice Lake (try to avoid Bulldozer, Jaguar, and Atom cores).
Even if you know the ins and outs of a CPU, it wouldn't have been obvious that bulldozer would have been such a disaster (and AMD would have switched course earlier). The devil's in a lot of proprietary details. You might care about things like "typical IPC" (a number that is specific to each bit of software, and can only be averaged so well), clock speed (which thanks to boost clocks is changing to a variable similar to IPC), and cache sizes (which fortunately doesn't change, but you'll have L1, L2, L3, and maybe L4 to consider). Things that might matter in choosing a PC:
How fast is the core on its own? (basically IPC*clock rate)
How many are on the chip (and will my software use them all often)?
How big are the caches/how fast ram do they need?
Does it have SMT, and if so, how much of a boost can I expect (will it speed up my software at all)?
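Those questions can be folded into a rough single score. A minimal sketch (the function and every number in it are illustrative placeholders I made up, not benchmarks of any real chip):

```python
# Rough throughput estimate from the factors above: per-core speed (IPC * clock)
# times the parallelism your software can actually use. All inputs are
# illustrative placeholders, NOT measured values for any real CPU.

def estimated_throughput(ipc, clock_ghz, cores, smt_speedup=1.0, parallel_fraction=1.0):
    """ipc: average instructions/cycle for YOUR workload.
    clock_ghz: sustained (not peak boost) clock.
    cores: physical core count.
    smt_speedup: e.g. 1.2 if SMT buys ~20% on this workload, 1.0 if none.
    parallel_fraction: how much of the workload scales across cores (Amdahl-ish).
    """
    per_core = ipc * clock_ghz
    effective_cores = 1 + (cores * smt_speedup - 1) * parallel_fraction
    return per_core * effective_cores

# Same hypothetical 8-core chip, two workloads: a mostly single-threaded game
# vs. a well-threaded video encoder.
chip = dict(ipc=1.5, clock_ghz=4.0, cores=8, smt_speedup=1.2)
game = estimated_throughput(**chip, parallel_fraction=0.2)
encode = estimated_throughput(**chip, parallel_fraction=0.95)
print(game < encode)  # True: the encoder benefits far more from the extra cores
```

The point of the sketch is just that the same chip scores very differently depending on how parallel your software is, which is why "typical IPC" alone doesn't buy you much.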
If you can slog through the introduction in vagabond's link, it might help. Going into the specific chapters appears necessary only if you plan on designing your own computer (presumably from an FPGA or something). If it really interests you, it might make more sense to learn how the AVR in Arduinos works, and then effectively throw that all away and learn how modern CPUs work (they basically emulate a hypothetical single-issue, in-order 64-bit x86, but do so by guessing which instructions need to be executed and executing multiple ones simultaneously. So it helps to understand the ultimate goal while learning how they do it).
One down, 99 to go (or maybe 199)...
[the rest of this post is in response to Allan M. System's post]
But I was recently upgrading my system from a GTX560 to 4k and found out that even if I was willing to drop my resolution down to 1080P for gaming, I'd still be stuck with 1440P for desktop use thanks to limitations in the GTX5x0 chips. And I really didn't discover that little detail until pretty late (because nobody cared about 4k way back then).
For the longest time you could get away with just about anything for desktop use (see Intel's on-die GPUs) and assume it would work. But if you want 4k, you will need at least a GTX7xx (or RX5xx [I'd guess RX4xx as well; I think the RX570 is a rebranded RX480]).
And of course you have to be careful about your search terms (although ebay shouldn't be an issue: it is "let the seller beware").
No kidding: a Sempron CPU, and presumably DDR2 RAM? Very hard to find upgrades there that justify the bang per buck. If you are trying to stretch the budget, you could reuse the case. Don't try to reuse the power supply without carefully checking exactly what you had: Dell was notorious for "custom power supplies" that were dangerously incompatible with standard PC power supplies.
Also check any modern reviews of the 3400G: it should provide plenty of CPU grunt, but I wouldn't expect nearly as much in the way of GPU power. Also remember that the 3400G is Zen+, so compare it to the 2400G (instead of the recently released zen2 chips). You might find that you can swing a 2400G (or 1600) Ryzen plus an RX570 card.
I suspect that by the time 8 cores/16 threads becomes a significant gaming advantage, you will be able to buy older (8+ core) zen2 chips at a significant discount...
[note, the reason I'm so late to the conversation is in part due to getting my rig back together (especially the password manager!) after upgrading to a R7 2700[no x]. But I choose the R7 because it was $150, and still had to think long and hard over a $200 R5 3600. I wouldn't pay more than the $200 for an R7, and really don't expect any gaming help from the 7-8th cores].
Note as far as GPUs go that the [AMD] RX570 is likely a good choice, while a [Nvidia] GTX570 (assuming you could somehow find one) might not be. You need at least a 7xx series from Nvidia to display a 4k screen (just a 2d desktop, not assuming 60fps Crysis or anything). I'd assume that any "*50" series Nvidia card (presumably a 1050) would also be a good choice due to low power consumption and noise.
I'd assume so. You'll almost certainly wipe the drive, but if you don't care about the linux installation that shouldn't be an issue. Expect to need a new Windows license (Linux was probably loaded on the thing for just that reason).
Note that this means it was a "real Linux laptop", not a Chromebook or something. Installing Windows on ARM is almost certainly not worth it, and even an x86 Chromebook might not have the right BIOS type (although if you can install [x86] Linux, you almost certainly can install Windows). [New] laptops shipped with Linux are so vanishingly rare that they certainly start with Windows-compatible parts (unless it's ARM ChromeOS hardware).
I'd check how well Linux is working on the thing (mainly whether the wifi works, and what was done to make it work if it does) just in case Windows doesn't like the cheap laptop (insufficient memory, too old, whatever). Since Linux can occupy a tiny slice and provide a ton of free applications, I'd at least think about leaving some space for Linux after installing Windows. Right now my "/" fits fine in 32GB (just don't try to install any Steam games or other programs written for Windows), and before that it was crammed into 16GB on my first SSD (Windows winds up taking up most of the SSD space because it's such a pig).
I think the problem lies more in the developers (more specifically the graphics-engine developers) not designing with SLI/Crossfire in mind than in any issue with the number of users. And it probably works out that designing around SLI/Crossfire means losing too much (even 1%) in the common non-SLI/Crossfire case, so they were never going to consider it in the first place.
Considering that modern GPUs tend to render in small tiles it is really strange that SLI/Crossfire didn't take off [again]. But I guess the devil is always in the details, and too much information had to keep crossing the PCIe bus.
HP is now selling AMD parts, I doubt Dell will. The current situation probably has more to do with previous shortages than any bribes Intel could make: HP has already been recommending EPYC to customers when they simply didn't have the Xeons. Dell's unwavering obedience to Intel likely kept them in Intel chips over the shortages, and Dell isn't going to risk that relationship over what might well be another temporary AMD advantage.
It does look like most companies still fear Intel, though. You can't find many laptops with AMD chips, no matter how competitive the new APUs are in that segment. Somehow I suspect that anyone who shipped those units isn't going to be getting any Ice Lake chips (presumably the best laptop chips available).
Never mind the "several million dollar fines", they had to cough up at least a billion to AMD. And that was still a case of "crime pays, at least if you are listed on NYSE/NASDAQ". Pretty much everyone watching the case knew that AMD was due much more than that, but probably couldn't last long enough without it to collect.
I'm completely unsold on high refresh gaming (then again my eyes are much older than the esports set), but it seems that monitors only come in 60Hz, 144Hz, and 240Hz flavors, so if you want >60Hz fps you need a 144Hz monitor. Freesync is unlikely to be needed at the resolutions you are using, but is probably a help for a 60Hz monitor (and shouldn't really increase the price).
I'd look first at size and resolution. A high refresh rate is typically only important in things like esports, where knowing the position of an enemy that very millisecond becomes important, and it somewhat helps when rotating quickly while whatever you are focusing on (such as an enemy player) is moving in a different direction from the background (the canonical example is the opening scene in Oblivion. As an old game, you could play it at ~30Hz, but expect the opening sequence to be jarring as the front towers move much faster than the background [https://www.youtube.com/watch?v=PGERPMvw0C0 1:25-1:40]).
Dropping down to 60Hz will let you choose an IPS monitor over a TN one. This should (in theory, check reviews of the monitor in question) allow for more accurate colors at a cost of refresh in gaming. At 60Hz the issues mentioned above should be barely visible, but a slight disadvantage in esports.
Going up to 2560x1440 could easily double the price (but going with the unreviewed https://pcpartpicker.com/product/9B66Mp/aoc-q3279vwfd8-315-2560x1440-60hz-monitor-q3279vwfd8 might well also double the size and pixels, so perhaps a good value), and neither video card is likely to be held back by the 60Hz limit (it should be better for non-gaming desktop tasks, RPG, and possibly single player games).
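To put a number on why the resolution jump matters more than the refresh cap here, a quick pixel-count comparison (plain arithmetic on the resolutions mentioned, nothing assumed about any particular GPU):

```python
# GPU work scales roughly with pixels pushed per frame. Relative pixel
# counts for the resolutions under discussion, with 1080p as the baseline.
resolutions = {
    "1920x1080": 1920 * 1080,   # 1080p baseline
    "2560x1440": 2560 * 1440,   # 1440p
    "3840x2160": 3840 * 2160,   # 4k
}
base = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / base:.2f}x the pixels of 1080p")
```

1440p works out to roughly 1.78x the pixels of 1080p (and 4k to exactly 4x), so a card that was comfortably past 60fps at 1080p has a fair amount of slack to absorb the jump.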
A lot depends on your streaming/editing/compressing software. GPUs are significantly better suited for those tasks, but are often difficult to program. If it uses the GPU, you might want to look at a better GPU than a 2400G/3400G (and use a less powerful CPU if necessary).
As mentioned above, the "actual power draw" is what matters. I'd almost certainly expect that the EVGA power supply has power factor correction [PFC] (required to sell in the EU, and more or less required to avoid tripping a 15A [US standard] breaker). As long as you have more-or-less working PFC, the amps you draw will be the wattage used divided by the voltage (without power factor correction things get hairy).
The "right" answer requires a Kill-A-Watt device, but here are my guesses:
monitors: low draw (especially with LED backlights)
PS4: 120W, 140W peaks (according to google. I'd expect PFC here as well).
printer: high if laser (they operate via heat), low if inkjet.
3D printer: pretty high (anything that uses heat to operate eats power), but probably not more than a laser printer.
VR headset, similar to monitors.
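Putting the watts-over-volts rule to work on the list above, here's a hedged sketch. Every wattage below is a rough guess of my own (echoing the list, not measurements), and the whole calculation assumes working PFC:

```python
# Amps drawn at the wall, assuming working power factor correction (PFC),
# so current ~= watts / volts. Device wattages are rough guesses, NOT
# measurements -- use a Kill-A-Watt for the real numbers.
MAINS_VOLTS = 120  # US standard; use 230 for most of Europe

devices_watts = {
    "PC under gaming load": 450,
    "monitor (LED backlight)": 30,
    "PS4 at peak": 140,
    "laser printer (fusing)": 500,
    "3D printer (heated bed + hotend)": 250,
}

total_watts = sum(devices_watts.values())
total_amps = total_watts / MAINS_VOLTS
print(f"{total_watts} W -> {total_amps:.1f} A on a 15 A breaker")
```

With those guessed numbers everything lands around 11-12 A, uncomfortably close to a 15 A breaker if the laser printer fuser kicks in at the wrong moment, which is exactly why the real measurement matters.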
As far as I know, ebay is basically "let the seller beware". If there is a problem, ebay almost always sides with the buyer. The biggest danger is an overclocked CPU that dies in months: any immediate issue at most requires shipping the thing back.
The advantage of the 2200G is that you can use it after your GPU fries, at least while buying another one (I found this useful during the days of GPUs included on the motherboard, even if the thing included was barely functional). Probably not worth $20.
In an even more specific benefit, AMD APUs might be more useful for "GPU computing", as any data returned doesn't need to go over a PCIe link (just memory). Don't expect any benefit from standard software; those devs likely read the optimization manuals and assumed the PCIe bus (which takes up most of a GPU programming guide). This benefit is pretty much for homemade GPU software.
I'd at least look into a 2600 or even a 1600 (if available) with 12 threads (and check for non-PCpartpicker-listed sales of 2700 and 1700). They are likely out of stock, but it doesn't hurt to check (I managed to buy a 2700 on Prime Day in the US for about 75% the price of a new 3600).
Pcpartpicker (set to the Netherlands) shows that 1st and 2nd generation R5s are reasonably priced, and that the i5 9600KF might be a strong contender (don't forget heatsink and motherboard prices) to the "obvious" R5 3600 choice (the 3600 slightly breaks the budget. And only gets worse if you just have to pair it with a X570).
There is a R7 1800 listed as just a bit more than the R5 3600, but probably can't even keep up with it for streaming/editing tasks, let alone everything else. If somebody else has it cheaper, that might be a possibility (I don't expect anyone to sell a R5 3600 cheap, or even an i5 9600 cheaper than already listed).
Seagate had some pretty bad 2TB drives [one specific model] about 5 years ago (or more), but I don't think anybody (including Seagate) has had anything similar that sticks in people's minds since. Right now, all consumer drives are pretty good and there isn't really a good reason to avoid one.
While people like to disparage the backblaze data, if you want to measure hard drive life expectancy in an "accelerated testing methodology" (i.e. finish the test before the HDD is obsolete), Backblaze's use case is pretty much the textbook example of how to do it. It might not be very good data, but it is still the best you can find. And for years Seagates have been pretty similar to everybody else.
You definitely need to upgrade the bios before changing chips. You might even have to use an "intermediate" bios in some cases before flashing the final bios.
All the power issues depend on your motherboard choice. If you could overclock a 2700X (or simply run it at peak boost when appropriate), you shouldn't have issues with a 3700X. If you were barely running a 2600, things might not work so well.
PBO appears to be functional in MSI's B450 Tomahawk motherboard: https://www.reddit.com/r/Amd/comments/cdabjo/3600x_msi_b450_tomahawk_pbo_testing/ Don't know how often this will be true.
Looks like the coolers have pretty similar performance. Keep the DeepCool if you want RGB.
At stock clocks, pcpartpicker reports 300W for the system. An undervolted Vega64 shouldn't add much more than 100W to that, and pushing Ryzens doesn't seem to be all that effective (PBO seems to work as well as overclocking, and that should keep it within thermal limits). If you are buying new, you might want to bump it up a little, but I wouldn't replace a 600W supply until I at least saw what was involved in overclocking (and how much headroom your specific chip has).
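As a quick sanity check of that advice, the headroom arithmetic using the figures quoted in this post (the 300W system estimate and the ~100W GPU addition; both are estimates, not measurements):

```python
# PSU headroom check with the post's own rough numbers (estimates, not
# measurements): ~300W system at stock plus ~100W more for the Vega64.
psu_watts = 600
system_stock_watts = 300
vega_extra_watts = 100

load = system_stock_watts + vega_extra_watts
headroom = psu_watts - load
print(f"{load} W load leaves {headroom} W ({headroom / psu_watts:.0%}) headroom")
```

A third of the supply left over is plenty of margin for modest overclocking, which is why replacing the 600W unit sight-unseen would be premature.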
I simply don't expect "something for nothing": somehow getting RT without either new circuits or repurposing existing computational units at significant cost in delay. Vega might have been able to do something like that, but only because it had far more compute capacity than needed for current gaming needs (and was limited by insufficient pixel-pipeline circuitry). With Navi fixing that, I don't expect it to have "excess compute power" simply lying around.
Navi 10: 251mm2
TU106 (2070/2060 Super/2060): 454mm2
TU104 (2070 Super/2080/2080 Super): 545mm2
For the first time in a long, long time (since Polaris? before?) it seems AMD is punching above its weight. And it didn't require any fancy memory systems that Nvidia wasn't using (at the consumer level).
Is the 1080ti obsolete? Last I heard it was selling close to launch price (if you can find one), and used ones aren't much cheaper.
"Moore's law is dead" (technically not true: Gordon Moore merely stated that the number of transistors will double every n years. This is still true for flash devices. It is "just" that in GPUs/CPUs/similar devices the extra transistors no longer buy the performance that would justify a few billion more of them, so we tend to think of Moore's law as something in the past). If the name of the YouTube channel is any indication, GPU improvement will largely rest on architectural/design improvements and can't simply be "muscled through" by throwing more transistors at the problem or increasing clockspeed.
Will the RTX series always remain a set of GPUs with 25% of the silicon set aside for raytracing that will be unlikely to ever be effectively used? I suspect that will be the case (if RT becomes a thing expect it to want 21xx and above). Perhaps RT titles will work with the 20x0 series, who knows.
The whole idea of getting 4k@60Hz (and higher) on a console is almost unthinkable. Since AMD appears to have the contract for such things, and the 5700XT can't do it, getting even more appears difficult.
zen2 chiplet: 80mm2 (assume 40mm2 for 4 cores/1CCX).
This gives us about 300mm2 that doesn't even do 4k@60Hz (but does get close; perhaps with "console optimization" it will be enough). But RTX spends upwards of 25% of the die area on RT, so that would take us to ~350mm2 (for barely functional 4k@60Hz or raytracing, but not both at the same time). All at the massive cost of a 350mm2 7nm chip.
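Spelling out that die-size arithmetic (areas as quoted in this post; the 25% raytracing share is my assumption about RTX dies carried over from above, not a measured figure):

```python
# Die-size arithmetic for a hypothetical console APU. Areas in mm^2, taken
# from the figures quoted in the post; the 25% RT share is an assumption
# about how much of an RTX die goes to raytracing, not a measured number.
navi10 = 251            # full 5700XT-class GPU die
zen2_half_chiplet = 40  # ~half of an 80mm^2 chiplet: 4 cores / 1 CCX

console_apu = navi10 + zen2_half_chiplet  # ~291, call it ~300
rt_extra = navi10 * 0.25                  # RT units added to the GPU portion
with_rt = console_apu + rt_extra          # ~354, roughly the ~350 quoted

print(f"APU: ~{console_apu} mm^2, with RT: ~{with_rt:.0f} mm^2")
```

Either way you slice it, that's a very large die on an expensive 7nm process, which is the core of the argument against 4k@60Hz-plus-raytracing consoles.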
Console economics could traditionally get away with a massive launch chip, often with fairly high (but not too high) launch price and then shrinking the chip into an affordable system. Don't expect to shrink 7nm in the next few years (although I'd expect it to launch on some cheaper EUV 7nm).
Oddly enough, I have great hopes for raytracing. Just not in the AAA+++ titles that control console economics. Look at the existing raytracing titles: the only one that sports an obvious difference is Quake2. I'd expect the titles you will really want RT for will be indie Unity titles (I haven't heard if/when Unity will support RT, so perhaps some similar engine). Having "perfect lighting" "just work" will be a godsend for small developers; I'm not sure the big boys will be willing to give up their highly optimized engines that will likely give better results (on the other hand, it might be driven by those same big boys' bosses, who want to brutally slash headcount and see RT as the means. Don't count on that going to plan).
But "improper" PCIe3x4 drives are about $15 more than cheaper SATA devices at the 500GB level (and about the same premium at the TB level as well). And they are faster than the fastest SATA drives (I don't expect anyone to bother beating a 960 evo using SATA), while the cheap, DRAM-less QLC SATA drives have nasty performance surprises (I've heard the ADATA SU650 uses QLC caching well, but there are plenty of horror stories in that market).
I'd also recommend looking at the Inland Premium (not the PCIe x2 Pro) NVMe drive. Similar NVMe costs, but it gets rave reviews.
It comes down to whether that $15 (and time finding a good SATA drive at that price) is worth speeding up the large file transfer and slightly faster storage. I'm also curious if you can add a low-cost card to house your old drive (probably only getting one lane of PCIe) if you want a new, bigger, faster PCIe4 drive.
Unless they are willing to ship a cut-down Navi, all those sales will simply go to Nvidia (or possibly Intel). And that "cut-down Navi" would so completely violate the "40% margins" promise [which was carefully framed around 7nm to protect both the "value" 3200G/3400G chips and the 570/580/590 boards] that there would be far more pushback.
And they still need pre-RDNA software support for all those (12nm) Ryzen 3x00G APUs they are still selling, which are also strategically important for AMD. My understanding is that those chips are far closer to the Polaris design than to Vega, so they need those boards on the market as well.
It just doesn't make any sense. The only reason to halt them would be GloFo capacity issues, something I really can't believe. I'd even expect them to port the thing to a similar process at TSMC or Samsung before canceling those chips/boards. There are still plenty of 1080@60Hz systems out there, and that line is ideal for them.
If they do cancel the boards, I'd look for the mythical "cut down navi", probably all 7nm, but 12nm wouldn't surprise me at all, and GDDR5 on a whole new mask/die (implying a big bet by AMD on total sales). Probably announced in September with Rome, 3950X, and possibly a new threadripper (to be announced/shipped even later). But in practice I'd assume all "leaks" are faked until proven otherwise (especially leaks that don't make sense).
I recently bought one of these for my dad who was in a similar situation (SU650). It seemed to match a budget price with SLC caching to match speeds of SSDs with DRAM. Unfortunately I didn't have time to both copy and install the drive when I was there and dad isn't up to that (I'll see them soon enough), so I can't tell you how well it worked. But both the price and performance seemed to be there.
I checked hybrids, and the whole idea seems dead. Probably would have made sense a year or two ago.
Reviews of AMD's OpenGL implementations (fairly critical for SolidWorks work) are pretty bad. Given the choice, I'd go with nvidia. I also wouldn't expect AMD to ever bother fixing Vega code, they've moved on while nvidia has plenty of cash to provide support across the line.
That is pretty bizarre. The RX570-RX590 is a price/performance powerhouse, and not competing at all with the RX5700. The only reason I could imagine them dropping it is if they needed more wafers for the 14nm "Matisse" I/O chips from GloFo, or possibly the "Ryzen 3000" 12nm APU/low-end chips. I can't imagine GloFo being at full production (maybe they are helping Intel with their supply issues:). Even if the Ryzen chips were enough for their contractual obligations, I suspect it is worth making the 570-590 chips.
Probably some fantasy from Nvidia fanboys. That's about the only place (along with APUs) where AMD clearly makes sense over the Nvidia competition (a 5700/5700XT with a real cooler might make sense on price, but it isn't as overwhelming a win). But even then, I can't see fanboys caring about the low end (even if they buy/use those boards).
I have no idea how/if VIA has a license to x86 (32 bit). They bought Cyrix, who I know didn't have a license. They had their chips manufactured by IBM (who did) and possibly Texas Instruments (I'm less certain). They also bought Centaur, but I don't know how Centaur legally made their chips. I think all "Via x86 chips" have been Centaur (Winchip) derivatives, although typically with Cyrix branding.
Any previous license from Intel wouldn't give them any advantage with AMD64 (which is what we normally think of as x86-64 now). You don't even need a license for 32-bit x86 (the patents have long since expired), but that might not be enough to get you to SSE (and would leave you stuck with x87 [insert Cyrix joke here]). Sledgehammer shipped in 2003, so I can't imagine that any patent-based IP on the x86-64 ISA will last much longer (except all the SSE/AVX improvements and similar things).
Of course, none of this matters in China. And none of this matters if it can't compete with a China sourced Ryzen, either.
I'd expect you need to go to the used market or all the way down to the 1000 series. The best competition I'm aware of is the 2700X vs. the 3600: somewhat equal chips (the 2700X winning in traditional AMD areas, the 3600 ahead everywhere else; don't expect big differences either way), with the 2700X 10% more expensive (and only 10% faster in POVray, often 5% slower in other benchmarks).
That 1920X might sound good, but only if you got the motherboard equally cheap (or really needed the memory bandwidth and 32G+ of memory).
If it is on backorder he doesn't have it. Also, I'm not even sure that board will ever run a 3600:
the support list shows for the 450 the Pro Carbon AC,
and for the 470 both the Pro Carbon AC and the Pro Carbon.
If the thing doesn't explicitly say "Pro Carbon AC" I'd cancel the order (to be fair, all the hits returned for "450 Pro Carbon" included the AC, so I'm not sure they ever made a non-AC Pro Carbon).
Also note that the 2700X is extremely comparable to the 3600. The 3600 clearly wins in games and low-thread applications while the 2700X wins by a similar margin in video editing and high-thread applications (a more "Intel-like balance" if you will). But even in America the 2700X costs 10% more (and I'd hate to pay an extra 100 pounds).
The only clearcut "just copy AMD" solution would be to ditch the onboard graphics for more cores. The catch is that Intel cores are much more power hungry and you would need even beefier motherboards and VRMs to deal with yet more Intel cores.
"Double Speculative Execution"? What does that even mean? I really doubt there are any speculative paths that Intel isn't taking now, and Intel seems to be particularly aggressive at that: witness Meltdown and all the other Spectre-class exploits available on Intel chips. Do you really think any such chip would be faster after the "exploit mitigation" patches were slapped on? And I'd only assume that Intel isn't speculating somewhere now because the instructions are unlikely to occur: do they really want to use that much more power than Zen (it would kill them in both the server and notebook markets. That's what they want to protect, not the desktop market)?
Double/triple/quadruple cache sizes: the sizes of L1 and L2 are carefully chosen for hit rate vs. latency. They really aren't limited by area (there's a reason we have L1/L2/L3 caches, after all), and blindly doubling one of them will likely slow things down. Doubling (or more) the L3 would certainly help, assuming Intel can make that many chips (they had shortages of 14nm+++++ for two years, after all).
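The hit-rate vs. latency trade-off is easy to see with the standard average-memory-access-time formula. The cycle counts and miss rates below are made up purely for illustration, not measurements of any real chip:

```python
def amat(l1_hit, l1_miss_rate, l2_hit, l2_miss_rate, mem_latency):
    """Average memory access time (cycles) for a two-level cache hierarchy."""
    return l1_hit + l1_miss_rate * (l2_hit + l2_miss_rate * mem_latency)

# Baseline: 4-cycle L1 with a 5% miss rate.
baseline = amat(4, 0.05, 12, 0.30, 200)
# Doubled L1: miss rate improves to 4%, but latency grows to 5 cycles.
doubled = amat(5, 0.04, 12, 0.30, 200)

print(round(baseline, 2))  # 4 + 0.05 * (12 + 0.3*200) = 7.6
print(round(doubled, 2))   # 5 + 0.04 * (12 + 0.3*200) = 7.88
```

With these (hypothetical) numbers the bigger, slower L1 loses overall: the extra cycle on every hit costs more than the slightly better hit rate saves.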
Enable SMT on everything: sure, Intel could stop crippling their chips for market-segmentation reasons. They could also call themselves "Advanced Nano Devices" to confuse people with AMD. I can't imagine Intel ever deciding to stop being Intel (and yes, I know IBM was doing segmentation back when the "traitorous eight" were still working for Shockley, and AMD locks some things down when they know they have a great product. But Intel's gotta Intel.)
Intel's future depends on Sunny Cove and presumably 10nm (or possibly skipping 10nm and going to "7nm", which would be something like TSMC's "7nm++"). There aren't many "big jumps in performance for minimum time invested" unless they are willing to produce parts that have a real TDP>300W (shades of the "5GHz Bulldozer").
<quote>After all AMD weathered not really seriously competitive for quite a few years with a fraction of the revenue and resources and they managed just fine.</quote>
AMD failed with Phenom, had an "ok" product with Phenom II, and then failed badly with the entire Bulldozer line. I suspect they could afford to fail again.
Intel is almost certain to keep something like 80% of the total market, no matter how badly they fall behind. Not only that, I'd only call the 10nm process itself a "failure": their old chips are far more competitive with the Ryzens than Bulldozer (or Phenom) was with the Intel chips of their day (a fairer comparison might be Phenom II vs. Core 2). They just aren't better "enough" compared to AMD's ability to spam Zen 2 cores all over a chiplet.
It would be interesting if Dell defected to AMD, but I suspect that after sailing through the Intel shortages without supply issues, they would rather stand by Intel.
Don't forget that Intel is focusing on notebooks first with their 10nm chips, while AMD is only supplying 12nm "Ryzen 3000 (zen+)" chips to notebooks. I suppose somebody will put a 65W "real Ryzen" in some, but don't expect 7nm when you see a "Ryzen 3000 laptop". That said, I'm sure the APU group really wants to get its hands on some Navi; it looks like AMD finally has decent performance/mm2 and performance/Watt (they might need that if Intel really tries to develop a GPU).
And of course, AMD's real hopes ride on Rome. Considering what Ryzen can do, I can't imagine that Rome won't conquer plenty of server rooms/datacenters (and it certainly needed a generation or two of EPYC servers for testing before really conquering, so expect things to happen this year). Just don't expect Xeon to disappear, especially since Rome has all those cross-communication issues thanks to being an MCM (and the Infinity Fabric doesn't appear to be quite as capable as Intel's solution).
Youtubers need their conflict and drama to drive clicks. AMD also just happens to be better for a lot of tasks that youtubers specifically have (and that aren't so common for everyone else). So if you mainly do video editing, you absolutely want an AMD CPU.
For anyone buying a new motherboard and new CPU, it clearly looks like AMD is the better deal across the board. If you have an Intel getting reasonably close to 5GHz, you probably won't see any improvement in gaming (and will likely see small bits of backsliding occasionally). Expect only serious improvement in non-gaming highly parallel workloads (such as the youtubers when editing their videos).
New architecture and shrunken transistors are scheduled for laptops only next year, with desktops maybe in 2021. Long term, Intel should have little to worry about (except having competition again; no more monopoly for you).
Short term doesn't look good at all. We also haven't seen Rome in action, which is the chip (and its successors) that Intel really has to worry about. But that is more a long term issue.
Considering that Intel couldn't even make as many chips as they could otherwise sell for nearly two years, it isn't all that clear that competition will be as big a problem as it could be (although they may have to trim prices below the superior competition). Since Intel had some record profits during those shortages, I'd have to expect that Intel isn't in trouble at all.
Intel's stock might take a hit. Wall Street always wants even higher profits, regardless of whether they just set records, and presumably AMD will be eating plenty of the "shortages ending" profits. I doubt it will change R&D spending or employee head count (possibly bonuses, but even those shouldn't be an issue [the process guys might be another story]).
1080p@144Hz sounds rough for the 2700X. If you want to decrease your GPU settings until you hit those numbers, you will likely want a 3600 (but don't expect miracles, just a slight improvement over the 2700X).
The 3600 is ~5% better at gaming (and your monitor presumably won't bottleneck it) while the 2700X is ~5% better (maybe even less) at high thread throughput (like streaming). Your call.
Also note that Ryzen 3 3000 chips use zen+ (with a similarly updated GPU), while the Ryzen 5/7/9 use zen2. Don't expect a CPU performance change between Ryzen 3 2000 (non-G) and Ryzen 3 3000 (the Ryzen 3 2200G/2400G used the original zen cores).
I don't know why you would expect anything else; the flash hasn't changed, just the interface. It is like the difference between PCIe and SATA, only more so: PCIe3 already has sky-high sequential read/write speeds, PCIe4 has even higher ones, and random access performance stays more or less the same.
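The interface-side numbers are easy to compute from the spec. This assumes a x4 link (what M.2 drives use) and the 128b/130b encoding PCIe has used since 3.0:

```python
def pcie_x4_bandwidth_gbps(gt_per_s):
    """Theoretical one-way bandwidth of a PCIe x4 link in GB/s.

    gt_per_s: per-lane transfer rate in GT/s (8 for PCIe 3.0, 16 for 4.0).
    128b/130b encoding means ~1.5% of raw bits are framing overhead.
    """
    lanes = 4
    return gt_per_s * lanes * (128 / 130) / 8  # /8 converts bits to bytes

print(round(pcie_x4_bandwidth_gbps(8), 2))   # PCIe 3.0 x4: 3.94 GB/s
print(round(pcie_x4_bandwidth_gbps(16), 2))  # PCIe 4.0 x4: 7.88 GB/s
```

So sequential numbers can double on paper, but none of this changes the flash's random access latency, which is where real workloads live.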
Perhaps if Intel (or Micron) produces an Optane drive you will see a lot more, but expect that to cost an arm and a leg (or to be tiny like current Intel consumer Optane drives).
But Ryzen 4000 won't work in his motherboard, and might not even work with DDR4. We don't even know if zen3 will be any better for desktops: while zen2 seems more balanced between single- and multi-threading, AMD cares more about the server side and could well improve only the multi-threading (helping no listed game at all). So maybe wait until the 3950X lands and the price of the 3900X falls a bit; that way you can at least keep using the rest of the box while getting all the gains of a 3900X (or another higher-end Ryzen 3000).
But I certainly would check to see if I could expect any improvement (I wouldn't count on it) and also if somebody might buy the old CPU (don't expect even $80 USD if you live anywhere near a Microcenter).
Threadripper (the current edition) only makes sense if you go big (>16 cores) or really need the I/O. If you need the I/O, it is either threadripper or a server platform, so threadripper makes plenty of sense there.
The bigger problem is that, at least in Germany (the budget specified euros but not a country; Germany seemed a good place to start), the cheapest TR motherboard (according to PCPartPicker) is 260 euros, and Threadrippers run 287 euros for 12 cores and 363 euros for 16 (I'd expect CPU sales exist, but I'm less sure about motherboards). It won't fit his budget.
The numbers I've seen show little more than a 2-3% improvement from a 4.3GHz overclock (I suspect with an aftermarket cooler); 5% in POVray (of course, you could get even more benefit [an additional 10%] from a 2700 in that application).
If it isn't boosting, it might not be in a position to actually benefit from the additional frequency. It seems to boost enough on its own to match a modest manual overclock.
A quick look at the Gamers Nexus youtube review shows it 10% slower in games than the $200 3600 and 10% faster in parallel CPU workloads. If you want "more AMD" from your Ryzen, this is the chip. If you want something balanced more like an Intel, go with the 3600/3600X.
I'm on the fence about the 1600. Cheap ($80). Very cheap. But it gives up a lot of performance (and I don't expect the "next leapfrog" to go much faster).
One thing you can do while waiting for the BIOS situation to become clearer is to check the framerates you are currently getting in your listed games. Then cut the OC down to 3.7GHz or so and see if there is any change. If there isn't, then increasing your CPU performance (even by over 30%) won't help at all in GPU-limited situations.
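That test boils down to one comparison. The framerates and the 3% threshold below are made-up illustrative numbers, not measurements:

```python
def likely_gpu_limited(fps_full_clock, fps_reduced_clock, tolerance=0.03):
    """True if dropping the CPU clock moved the framerate by less than
    `tolerance` (3% by default), suggesting the GPU is the bottleneck."""
    return abs(fps_full_clock - fps_reduced_clock) / fps_full_clock < tolerance

print(likely_gpu_limited(141.0, 139.5))  # True:  a faster CPU won't help
print(likely_gpu_limited(141.0, 118.0))  # False: CPU-bound, upgrade helps
```

The same check works in reverse: drop the resolution instead of the clock, and a big framerate jump points at the GPU.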
If you later hit a CPU-limited situation, you might well take the full leap to an 8-core chip (I'm not sure about a 370 motherboard and a 12-16 core chip). You might even see prices drop a bit (don't expect such things until well after September, the 3950X, and the 550 motherboards).
There's only 20 lanes of PCI-e on either listed motherboard.
You'd have a hard time noticing the difference.
My guess is that if you really wanted the best possible I/O, you would use something like a server or Threadripper platform (and I'd expect plenty of servers are already doing this). But on a desktop, your best bet would be dropping your GPU to an x8 slot and using the "other" x8 slot to connect two NVMe cards.
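A quick lane-budget check of that layout, taking the 20 CPU lanes mentioned above as given (the exact split is an assumption for illustration):

```python
# Lane budget for the GPU-at-x8 idea, on a board exposing 20 CPU lanes.
cpu_lanes = 20
gpu_lanes = 8        # GPU dropped from x16 to x8
nvme_lanes = 4 * 3   # the usual x4 M.2 slot + two drives on the freed x8

print(gpu_lanes + nvme_lanes)  # 20: exactly fills the budget
```

So three x4 NVMe drives plus an x8 GPU uses every lane, and on most GPUs the drop from x16 to x8 costs only a few percent.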
You'd be far better off just buying a 570 and a [reliable] PCIe4 NVMe card. Pretty much all SSDs already use RAIN (redundant array of independent NAND); the difference is you typically get no warning before one bricks (with RAID you know when each drive fails).
If you really like this idea, I'd recommend waiting for a potential Zen2-based threadripper (or just get a server CPU+motherboard).
No, and things like office work and even CAD tend to be single-core. Gaming probably uses more cores (and requires more cycles) than most things. The more "pure graphics" something is, the more likely it can use all the cores (and the more likely it should be on the GPU, not the CPU).