I'm in my thirties and I've seen a lot of hardware come and go.
I did my first online gaming on a 100 MHz Pentium with a PCI graphics card and 56 kbps dial-up.
I've spent a lot of cash on hardware replacements since then, and there is one thing that has always been true.
The newest hardware isn't always the best option for your gaming PC.
I'm not saying that from a performance perspective, but rather from a cost efficiency perspective.
You can usually play modern games on one- to two-year-old hardware at max or near-max settings and get really great results.
So instead of buying the best hardware from today, or settling for mediocre hardware from today, buy the best hardware from a year ago after it's been marked down by 75%.
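To put rough numbers on what I mean by cost efficiency, here's a quick sketch of the math. Every price and frame rate below is made up purely for illustration; plug in real street prices and benchmark averages for the cards you're actually comparing.

```python
# Cost-per-frame comparison with made-up prices and FPS numbers,
# just to show the arithmetic behind "buy last year's best at a markdown".
cards = {
    "last year's flagship (marked down)": {"price": 250, "avg_fps": 95},
    "this year's flagship":               {"price": 700, "avg_fps": 110},
    "this year's mid-tier":               {"price": 300, "avg_fps": 70},
}

for name, c in cards.items():
    cost_per_frame = c["price"] / c["avg_fps"]
    print(f"{name}: {c['price']} / {c['avg_fps']} fps = {cost_per_frame:.2f} per frame")
```

The marked-down flagship wins on price per frame even though it isn't the fastest card on the list.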
Many types of hardware, like CPUs and graphics cards, come in multiple tiers, and today's lower tiers are often significantly slower than the higher tiers from three years ago.
Usually only the feature sets improve; the clock speeds on CPUs and video cards improve very little over the years.
It's been nearly 20 years since the first desktop 5 GHz CPU, and we are still buying new CPUs with sub-5 GHz clock rates. Really they have just added more multithreading features, and AMD (and later Intel) moved the memory controller onto the CPU. Even the ability to run eight cores goes back to 1996 with the 133 MHz Pentium Pro, by using an eight-CPU motherboard. (Not saying that an eight-CPU Pentium Pro box would run Crysis, though.)
For example, my little sister plays modern MMOs on my five-year-old gaming PC, which I built using an older AMD Phenom CPU and a Radeon card, and she doesn't have to turn the settings down at all.
Comments
There are rare exceptions where a company produced too much of some particular GPU and then needs to get rid of it. AMD did have clearance pricing on their Cypress GPU for roughly the month of April 2011, and their Hawaii GPU for a while some years later. But those are unusual.
An eight-socket server is not at all comparable to an eight-core CPU. For starters, going multiple sockets creates a ton of NUMA problems. It also makes a mess of your memory configuration, as every socket has to have its own memory.
Of course you are not going to upgrade on a yearly basis, and most gamers will build a computer that will run for around five years (with some upgrades in the meantime if necessary).
From a cost-efficiency perspective, for example, I bought a GTX 1060 right around when it was released for EUR 347,99, and almost two years later the price is EUR 375,80.
Also, CPU GHz has an effect on gaming performance but is not the only thing to consider, and so your theory about this "issue" is not valid.
"Its been nearly 20 years since the first desktop 5ghz CPU" - I .. did google this , but can't seems to find anything close to this. All I found is :
On March 6, 2000, AMD reached the 1 GHz milestone a few months ahead of Intel
In 2002, an Intel Pentium 4 model was introduced as the first CPU with a clock rate of 3 GHz
As of mid-2013, the highest clock rate on a production processor is the IBM zEC12, clocked at 5.5 GHz, which was released in August 2012
So if you take the 2002 P4 at 3 GHz and compare it with the Ryzen 5 1600, for example, which has a base clock of 3.2 GHz, you will see a massive difference, even though the GHz are almost the same.
If anything, we are on a very good track in terms of computers these days.
Others fail to notice changes and still measure CPU speeds by a standard that was very popular in the 1990s. Over time they develop strange superstitions as a result.
EDIT: There's a good case to be made for not buying the most expensive hardware, but advising people to buy previous-generation hardware is stupid. Some parts are occasionally available at cheap prices, but those are exceptions; usually the latest-gen hardware is the best choice unless you're buying used stuff.
Also:
https://www.engadget.com/2007/01/24/pentium-4-overclocked-to-8ghz-lets-see-your-fancy-core-2-try-t/
https://www.computerworld.com/article/2490182/computer-processors/intel-s-new-core-i7-chip-hits-5ghz.html
https://www.techradar.com/news/computing-components/processors/amd-on-its-5ghz-processor-you-don-t-buy-a-ferrari-for-the-mpg-1158675
Also, the P4s topped out at 3.8 GHz out of the box in 2005. The AMD processors at the time were 64-bit, clocked around 2 GHz, and were a lot faster because of the faster memory bus (dual-channel DDR2) and more cache.
At the time Intel was working on an entirely incompatible 64-bit CPU called Itanium, while AMD released the Athlon 64, which could run all old versions of 32-bit Windows, run an extended Intel x86 instruction set called the AMD64 instruction set, and run Windows XP 64-bit. This caused a huge shakeup at Intel, because Pentiums were too slow to compete with the Athlon chips and Intel could not run 64-bit desktop Windows (Itanium required a special version of Windows Server or Linux).
Anyway, after Intel adopted the AMD instruction set and produced the Pentium Ds, which could run 64-bit Windows XP, they ran at clock speeds comparable to the AMD chips (2 GHz-ish).
We have been bouncing around between 3 and 4 GHz for basically 16 years. They just keep adding more cores (which most games won't use) or sometimes a little more cache. The memory bus has improved significantly.
However, the progress after 2003 is embarrassingly slow compared to that of 1990 to 2003. It's like we get excited over insignificantly small gains, rush out and spend a thousand dollars, and then come home disappointed.
Maybe you should have said something like computation rate, which is expressed in ops or FLOPS (operations per second or floating-point operations per second), or maybe instructions per second. In that case, though, the information is rarely available to you at the time of purchase. Those would be actual performance benchmarks.
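To show what I mean by measuring throughput instead of reading the clock off the box, here's a crude sketch. It's nowhere near a real benchmark (pure interpreter overhead, no vectorization, single thread); it just illustrates "work done per second" as the metric you'd actually want.

```python
import time

def rough_ops_per_second(n=10_000_000):
    """Time a loop of simple integer adds and report iterations per second.
    A toy stand-in for a real ops/FLOPS benchmark, for illustration only."""
    start = time.perf_counter()
    total = 0
    for i in range(n):
        total += i
    elapsed = time.perf_counter() - start
    return n / elapsed

print(f"~{rough_ops_per_second():,.0f} loop iterations per second")
```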
The first CPUs that could realistically hit 5 GHz without exotic cooling were Sandy Bridge chips in 2011, and even that took an enormous overclock and some lucky silicon. Even with liquid nitrogen, the first that could hit 5 GHz was surely some sort of Pentium 4, likely either Northwood (2002) or Prescott (2004). That's not 20 years ago, either.
But if you really want to be pedantic, you are aware that a single chip commonly has different regions that run at different clock speeds, aren't you? If all that matters is the clock speed, then which one? Whichever number is largest?
"We all do the best we can based on life experience, point of view, and our ability to believe in ourselves." - Naropa "We don't see things as they are, we see them as we are." SR Covey
Your example of a Pentium 4 at 3.8 GHz... would hardly run modern games at the lowest resolution and lowest settings (if it ran them at all, which is doubtful), compared to my old first-gen i7 that only clocks at 3.6 GHz and still runs pretty well. Then there are newer-gen i7s that are about the same clock speed as mine but are 30% or more faster in single-core performance.
It's true that stuff isn't improving as fast as it used to, but eh, your examples are so far off the charts they aren't even applicable.
I'd also challenge you to try to run Black Desert or Bless on that five-year-old PC without turning everything to low. Even in WoW I bet you get no more than 20 fps on high settings.
Measuring instructions per second is far more useful. A computer program is a series of instructions.
Comparing PCMark benchmarks would be more useful for games because they take into account the moving around of large memory buffers that happens in games but not in other programs. (In gaming, the CPU has to copy the textures and vertex buffers into the graphics card, which is nearly 70% of the work the CPU performs in a game; the graphics card is responsible for most of the work in any game, as the CPU really just tracks player coordinates and handles game logic.)
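If you want a crude feel for what shuffling large buffers around costs, here's a toy timing of a single big in-memory copy. It has nothing to do with how PCMark actually works or how a driver uploads textures; it just shows raw copy bandwidth through system memory.

```python
import time

buf = bytearray(256 * 1024 * 1024)   # 256 MB stand-in for texture/vertex data
start = time.perf_counter()
copy = bytes(buf)                    # one full pass through system memory
elapsed = time.perf_counter() - start
print(f"copied {len(buf) / 2**20:.0f} MB in {elapsed * 1000:.0f} ms "
      f"(~{len(buf) / 2**30 / elapsed:.1f} GB/s)")
```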
Using clock speed to compare different processors would be like using top speed to compare different cars and then concluding that a Ferrari is better for transporting stuff than a delivery truck: you've got a valid unit, but you're ignoring other factors that are also important and arriving at wrong conclusions.
Hertz was a very good unit for comparing the speed of different processors back in the 1990s, but it isn't anymore. Nowadays the other factors that affect CPU speed are too significant, and using hertz alone will not give you good info on a CPU's real-world performance.
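A back-of-the-envelope way to see it: rough throughput is clock times instructions per clock (IPC). The IPC figures below are invented just to show the arithmetic, not measured values for any real chip.

```python
# Rough throughput ~= clock (GHz) * instructions per clock (IPC).
# IPC numbers are invented for illustration only.
cpus = {
    "old CPU (3.0 GHz, low IPC)":  {"ghz": 3.0, "ipc": 0.7},
    "new CPU (3.2 GHz, high IPC)": {"ghz": 3.2, "ipc": 2.5},
}

for name, c in cpus.items():
    gips = c["ghz"] * c["ipc"]   # billions of instructions per second, roughly
    print(f"{name}: ~{gips:.1f} GIPS")
```

Nearly the same hertz, wildly different real-world speed.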
I think we can all admit endorphins release when we smell new computer parts and see the styrofoam peanuts.
The truth is game developers aren't even utilizing the hardware to its full potential, so unless you're in some industry that requires it, there's no real REAL reason to be upgrading every 2-3 years. The cycle can be way more reasonable, but you know;
Mmmmmmmm, new PC parts smell...
¯\_(ツ)_/¯
But I did test. I compared a P4 2.66 GHz with an i7 920 at 2.66 GHz. The P4's release date is 2002, while the i7 920's is 2008. Six years' difference, yes?
So in six years of technology we have:
Single-core difference - P4 2.66 vs i7 920 2.66 GHz = around 170% better (for the i7)
Multi-core difference - around 1000%
Now, what did technology accomplish in six years? Around 200%+ better single-core speed, and A LOT more multi-core speed. Plus many other good things. (And I didn't take the best CPU of '08, whereas the P4 at 2.4 to 2.8 GHz was high-end in its day.) If you want to redo the math with your own scores, there's a quick sketch below.
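Here's the sketch. The scores are placeholders chosen only to reproduce the rounded percentages above; swap in your own single-core and multi-core benchmark results.

```python
def percent_better(new_score, old_score):
    """How much faster the new result is, expressed as '% better'."""
    return (new_score / old_score - 1) * 100

# Placeholder scores picked to match the rounded figures quoted above.
p4_single, i7_single = 100, 270     # hypothetical single-core scores
p4_multi,  i7_multi  = 100, 1100    # hypothetical multi-core scores

print(f"single-core: ~{percent_better(i7_single, p4_single):.0f}% better")
print(f"multi-core:  ~{percent_better(i7_multi, p4_multi):.0f}% better")
```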
I say that technology from 2000 to 2010 evolved more than it did from 1990 to 2000.
About your benchmarks remark: you do know that there are plenty of benchmarks that stress-test single-core and multi-core CPU performance for uses other than games, right?
Here is a link for reference on GPUs but CPUs suffer the same issue.
https://cs.nyu.edu/courses/spring12/CSCI-GA.3033-012/lecture12.pdf
I don't play any of the newest titles, however; most are at least two to four years post-launch, which is likely why.
There isn't anything in the MMORPG space I've been unable to play, even the newer ones, because they code to a lower hardware standard these days.
"True friends stab you in the front." | Oscar Wilde
"I need to finish" - Christian Wolff: The Accountant
Just trying to live long enough to play a new, released MMORPG, playing New Worlds atm
Fools find no pleasure in understanding but delight in airing their own opinions. Pvbs 18:2, NIV
Don't just play games, inhabit virtual worlds™
"This is the most intelligent, well qualified and articulate response to a post I have ever seen on these forums. It's a shame most people here won't have the attention span to read past the second line." - Anon
Heck, I've had an i7 920 for the last 7-8 years. The only upgrades were a GPU and an SSD (makes sense), and I could play almost any game out there on mid/high settings, which is fine.
Now I just upgraded to a 6800K, 16 GB of RAM, and an M.2 drive. I kept the GPU since it is a GTX 1060. Knowing myself, it will be around 2-3 years before I upgrade my GPU and around 5-6 years before I upgrade CPU/MB/RAM.
If it's an iteration of an existing technology, and that iteration far exceeds what I'm currently working with, then I'll typically buy it. For instance, if I'm working with a GTX 460 video card and a GTX 560 comes out, then I probably won't upgrade. The difference between the two cards performance-wise was well under 20%, IIRC.
The performance difference just isn't worth the money. But when the GTX 670 came out, I ponied up the dough because that card was a significant upgrade over the GTX 460 and the technology is iterative and not completely new, therefore the chance of it being a complete failure is low.
Now let's take a look at VR technology. It's brand new, it's not established as a "must have" product yet, it's clunky, very few games use it and, good fucking grief, the decent VR systems are expensive as shit. So no, I'm not dipping into that technology pool yet. Give it a few iterations, see if it even catches on at all, and somewhere around the Gen 3 mark, I'll buy in.
Lastly, if one technology relies too heavily on another technology catching up to it to be useful, then I'll generally wait. For instance, the price of GPUs that can push 4K monitors at high frame rates with great settings is pretty high right now. Not only that, but I do a lot of gaming on triple monitors. Swapping out all of my monitors to 4K and finding a GPU that will push them is just off the table. When they both come down significantly in price, then I'll buy in, but for now 1080p triple-monitor gaming with a single mid-range card is completely satisfactory.
Essentially, if it gives a 25% or better increase in speed/performance, then I'm up for swapping.
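Written out as a one-liner of a rule (the benchmark scores here are made up, not from real reviews):

```python
def worth_upgrading(current_score, new_score, threshold=0.25):
    """My rule of thumb: only swap a part when the newer one is at least
    25% faster than what I'm running now."""
    gain = new_score / current_score - 1
    return gain >= threshold, gain

# Made-up relative scores: GTX 460-class baseline vs two hypothetical upgrades.
for label, score in [("GTX 560-class", 115), ("GTX 670-class", 180)]:
    upgrade, gain = worth_upgrading(100, score)
    print(f"{label}: +{gain:.0%} -> {'upgrade' if upgrade else 'skip'}")
```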
So in the end I disagree with your point of view.
In fact the processors are of comparable clock speeds; they just have more cores now.
Really the performance difference between hardware generations is exaggerated.
It's really more of a luxury to have the newest than a necessity.
The only thing I've done since is upgrade from 8 GB of RAM to 16 GB, and my 7970 died last year so I got a GTX 1060. I only play MMOs on PC and I have no problems playing those with what I have.
For example, I'm in the market for a Vulkan test rig, so I'll probably buy a little newer than I'm used to this year.
I'm more of an AMD fanboi than an Nvidia fanboi. The last Nvidia card I owned was that classic FX 5200 (the first video card to require two slots and a separate power connector), which replaced my GeForce3 Ti 500, which I held on to because it was faster than the GeForce4s.
I switched to the AMD64 processors after holding out on the Cyrix CPUs until they stopped making them at 1.3 GHz. That first-gen Athlon 64 + Nvidia FX 5200 + DDR2 Windows XP 64-bit machine was about the only time that I actually just bought the best of the best (it even had a WD Raptor HDD with 10,000 RPM disks).