Cacodemon345

Why don't modern CPU speeds go over 3-4 GHz?


I don't know if this is true or not, but:

Back in the days of Doom and Wolf3D, CPUs advanced in leaps. It was possible for your PC to quickly become obsolete, even when it was only 3-5 years old.

For example, we saw the 486DX quickly made obsolete by newer Pentiums, especially the Pentium MMX. Clock speeds also surged: the first high-speed CPUs, meaning those over 500 MHz, quickly flooded the market, and 1 GHz CPUs appeared around 2000.

However, today we see that CPU clock speeds have barely increased. People are still using old Intel Core i3, i5, and i7 CPUs, and desktop CPUs top out at around 3.50 to 4.70 GHz and nothing beyond that.

Why is this happening?

35 minutes ago, Cacodemon345 said:

I want to know people's opinions here.

Opinions are useless for this question; it's a topic governed by facts alone. To wit, @RightField has already given you the answer.


Opinions? Here's an opinion... YouTubers want more video card than CPU. I hear them complain: "But Skyrim won't work on my 1.8 GHz Celeron processor, despite my having a dual 8 GB VRAM unit."


CPUs stopped increasing in clock speed because we pretty much reached the limits of the technology about 10 years ago. CPU manufacturers have compensated by adding more cores instead. But regardless, most of the work associated with doing 3D rendering has moved to GPUs anyway; CPU speed isn't very important nowadays because we don't do software rendering any more.
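
Programs can discover and exploit that growing core count at runtime. A minimal sketch of the query, assuming a POSIX-style system such as Linux or macOS (where the _SC_NPROCESSORS_ONLN query is available):

/* Report how many logical cores the OS exposes -- the axis along
 * which CPUs have kept growing since clock speeds plateaued. */
#include <stdio.h>
#include <unistd.h>

int main(void) {
    long cores = sysconf(_SC_NPROCESSORS_ONLN); /* cores currently online */
    printf("This machine exposes %ld logical cores\n", cores);
    return 0;
}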

1 hour ago, Cacodemon345 said:

@geo, ditch your Intel Celeron CPU and buy a new one. But expect it to be no more than 3.x GHz.

Who cares about the number stamped all over the CPU box? You know nothing about CPUs and how they work; please at least read up on them before giving "advice" about them. The same goes for everything else in a computer.

6 hours ago, Cacodemon345 said:

I want to know people's opinions here.

So what you really mean is you're waiting for someone just as wrong as you are to validate your ridiculous hangup on a number that stopped being relevant around the time you were born. Not that clock speed ever really was a good indicator of a CPU's performance--in the 1990s a Macintosh Quadra 700 (68040 @ 25 MHz) was much faster than a Macintosh IIfx (68030 @ 40 MHz)--but it made for good marketing.


I remember the days when CPU speed was a thing. Then they invented dual-core processors and it stopped being a thing.

 

It's like asking "Why don't car revs go higher than 9,000 RPM?" while completely ignoring everything else that determines the speed of a car.


Pretty much what RightField already linked.  CPU clock speed doesn't really translate into a useful performance metric.

 

Another thing to consider is the average number of instructions a CPU can execute in a single clock cycle (IPC).  A CPU at 10 MHz may be able to do just a single instruction per cycle, but another 10 MHz CPU may be able to do two or three.  Or, for a more real-world example, an Intel Pentium 4 EE at 3.2 GHz averages something like 3 instructions per clock cycle, while an Intel Core i5 7300U at 2.6 GHz averages almost 21.
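
To make that concrete: effective throughput is roughly clock speed times IPC. A back-of-the-envelope sketch using the rough IPC figures quoted above (they are the post's numbers, not measured values):

/* Effective throughput ~= clock (GHz) * instructions per cycle (IPC),
 * giving billions of instructions per second (GIPS). */
#include <stdio.h>

int main(void) {
    double p4_ghz = 3.2, p4_ipc = 3.0;   /* Pentium 4 EE:  ~3 IPC  */
    double i5_ghz = 2.6, i5_ipc = 21.0;  /* Core i5 7300U: ~21 IPC */

    printf("Pentium 4 EE:  ~%.1f GIPS\n", p4_ghz * p4_ipc); /* ~9.6  */
    printf("Core i5 7300U: ~%.1f GIPS\n", i5_ghz * i5_ipc); /* ~54.6 */
    return 0;
}

So the lower-clocked chip comes out roughly five to six times faster.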

 

7 hours ago, Cacodemon345 said:

I want to know people's opinions here.

So in your opinion, why is it shitty?

13 hours ago, Cacodemon345 said:

I want to know people's opinions here.

 

I can only repeat what I told you in some other thread: do some research before writing posts that make you look dumb. There's no 'opinion' to be had here. The current limits of clock speed are dictated by physics, and clocks probably won't get much faster, because we are entering the realm where hard physical constants get hit. All we can try is to get more work out of each clock cycle, but even there, there's not much headroom left to increase performance.

 

There's a good reason why most modern programming languages increasingly focus on parallelism. It still leaves much to be desired, but rest assured that these problems will eventually be overcome, and it will become much more natural to harness the power of multiple cores. At that point there will be practically no limits, because doubling the number of cores doesn't run up against hard physical limits. We can already see with graphics hardware what kind of power you can get from a radically parallel design; GPUs still manage to double their processing power every few years.
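
As an illustration of what "harnessing multiple cores" looks like in practice, here is a minimal sketch using POSIX threads to sum an array in parallel (the thread count and array size are arbitrary choices for the example):

/* Split an array across NTHREADS workers, then combine partial sums. */
#include <pthread.h>
#include <stdio.h>

#define N 1000000
#define NTHREADS 4

static double data[N];

struct chunk { int begin, end; double sum; };

static void *partial_sum(void *arg) {
    struct chunk *c = arg;
    c->sum = 0.0;
    for (int i = c->begin; i < c->end; i++)
        c->sum += data[i];
    return NULL;
}

int main(void) {
    pthread_t tid[NTHREADS];
    struct chunk chunks[NTHREADS];
    int per = N / NTHREADS;

    for (int i = 0; i < N; i++) data[i] = 1.0;

    for (int t = 0; t < NTHREADS; t++) {
        chunks[t].begin = t * per;
        chunks[t].end   = (t == NTHREADS - 1) ? N : (t + 1) * per;
        pthread_create(&tid[t], NULL, partial_sum, &chunks[t]);
    }

    double total = 0.0;
    for (int t = 0; t < NTHREADS; t++) {
        pthread_join(tid[t], NULL);
        total += chunks[t].sum;   /* combine after each worker finishes */
    }
    printf("total = %.0f\n", total);  /* prints 1000000 */
    return 0;
}

On a machine with enough cores, doubling NTHREADS roughly halves the wall-clock time for this kind of embarrassingly parallel work.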

 

So, my prediction is that programming is about to change radically, because the existing paradigm has reached its end and you cannot squeeze more out of it.

 

 

Doom is an entirely different matter, of course. It depends too heavily on serial execution, so I do not expect to ever see major leaps forward that would allow 100000+ monster maps to be played at reasonable frame rates.
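
Amdahl's law makes the point about serial code concrete: if a fraction p of a program's work can be parallelized, the best possible speedup on n cores is 1 / ((1 - p) + p / n). A sketch (the 90% figure below is an illustrative assumption, not a measurement of Doom):

#include <stdio.h>

/* Amdahl's law: upper bound on speedup when a fraction p of the
 * work is parallelizable across n cores. */
static double amdahl(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void) {
    for (int n = 1; n <= 1024; n *= 4)
        printf("%4d cores -> %.2fx speedup\n", n, amdahl(0.9, n));
    return 0;
}

Even with 90% of the work parallelized, 1024 cores top out below a 10x speedup; the serial 10% dominates.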

 

4 minutes ago, Cacodemon345 said:

I forgot to mention why no real CPU instructions are developed anymore.

[image: what-text.png]


Let me rephrase, then: you don't understand how CPUs work, and your question is not applicable.

15 minutes ago, Cacodemon345 said:

I forgot to mention why no real CPU instructions are developed anymore.

lol what.

 

Like, actual individual instructions?  Or instruction sets?

 

Also, for the record and as an example, ARMv8.1 had new instructions added to it a few years ago, if I'm not mistaken.  Some SIMD-related ones.  Also check out this list of x86 instructions and when they got added.


I meant instruction sets for x86 CPUs. Those instruction sets are sort of old; they got added about 5-8 years back.

@Memfis, I think another uncommon user here would be royaldj. He acts like he's been here since the Stone Age.

Just now, Cacodemon345 said:

I meant instruction sets for x86 CPUs. Those instruction sets are sort of old; they got added about 5-8 years back.

x86 is the instruction set.  It's been expanded upon continuously since it first appeared in the late 1970s.  Things like x86-64, MMX, 3DNow!, the virtualization extensions, SSE, AVX, etc. are all expansions to x86.
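
In fact, code can probe for those expansions at runtime. A minimal sketch, assuming GCC or Clang on an x86 machine (the builtins below are compiler-specific, not standard C):

#include <stdio.h>

int main(void) {
    __builtin_cpu_init();  /* must run before __builtin_cpu_supports */
    printf("SSE2: %s\n", __builtin_cpu_supports("sse2") ? "yes" : "no");
    printf("AVX:  %s\n", __builtin_cpu_supports("avx")  ? "yes" : "no");
    printf("AVX2: %s\n", __builtin_cpu_supports("avx2") ? "yes" : "no");
    return 0;
}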

 

It also works well for what it's used for.  m68k, ARM, POWER, MIPS, SuperH, SPARC... they've all been around for a while as well, but are still used.  To ditch an instruction set just because it's supposedly "old" is crazy.


Can you PLEASE, PLEASE, PLEAAASSEEEE cut out the incessant bullshit thread making? You always talk about things you have no understanding of. Holy fuck.


Guys why can't CPUs speak English instead of like machine code language? If it were proper words instead of all 1's and 2's like a binary code, surely they'd understand things better and there'd be less errors?

:P

This topic is now closed to further replies.