Cacodemon345 Posted January 20, 2018 I don't know if this is true or not, but back in the days of Doom and Wolf3D, CPUs advanced in leaps. It was possible for your old PC to quickly become obsolete, even when it was only 3-5 years old. For example, we saw the 486DX quickly obsoleted by the newer Pentiums, especially the Pentium MMX. There was also a surge in speed: the first high-speed CPUs, meaning those over 500 MHz, quickly flooded the market, and 1 GHz CPUs appeared in the 2000s. Today, however, CPUs have barely increased in terms of clock speed. People are still using old Intel Core i3, i5 and i7 CPUs, and desktop CPUs only run at 3.50 to 4.70 GHz and nothing beyond that. Why is this happening?
RightField Posted January 20, 2018 I did one Google search and the first hit was an article explaining it pretty well.
Cacodemon345 Posted January 20, 2018 I want to know people's opinions here.
Cacodemon345 Posted January 20, 2018 I know power and transistors affect this, but what other reasons are there?
Edward850 Posted January 20, 2018 35 minutes ago, Cacodemon345 said: I want to know people's opinions here. Opinions are useless for this question; it's a topic controlled by facts alone. To wit, @RightField has already given you the answer.
geo Posted January 20, 2018 Opinions? Here's an opinion... Youtubers want more video card than CPU. I hear them complain that Skyrim won't work on their 1.8 GHz Celeron processor despite having a dual 8 GB VRAM unit.
Cacodemon345 Posted January 20, 2018 @geo, ditch your Intel Celeron CPU and buy a new one. But expect it to be no more than 3.x GHz.
fraggle Posted January 20, 2018 CPUs stopped increasing in clock speed because we pretty much reached the limits of the technology about 10 years ago. CPU manufacturers have compensated by adding more cores instead. But regardless, most of the work associated with doing 3D rendering has moved to GPUs anyway; CPU speed isn't very important nowadays because we don't do software rendering any more.
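fraggle's point above, that gains now come from more cores rather than faster clocks, only helps if the work is actually split across them. A minimal Python sketch of that structure (the worker count and chunking are arbitrary; note that in CPython, threads won't actually speed up pure-Python arithmetic because of the GIL, but a ProcessPoolExecutor with the same shape would use real cores):

```python
# Since clock speeds plateaued, throughput gains come from spreading
# work across cores. Minimal sketch: split a sum into chunks and hand
# each chunk to a worker (worker count and chunk size are arbitrary).
from concurrent.futures import ThreadPoolExecutor

def chunked_sum(values, workers=4):
    """Sum `values` by dividing it into roughly `workers` chunks."""
    size = max(1, len(values) // workers)
    chunks = [values[i:i + size] for i in range(0, len(values), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each worker sums its own chunk; the partial sums are combined.
        return sum(pool.map(sum, chunks))

print(chunked_sum(list(range(1000))))  # same answer as sum(range(1000)): 499500
```

The key design point is that the result must not depend on how the work was divided; that's what makes a problem "embarrassingly parallel", and it's exactly the property classic Doom's serial game loop lacks.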
Cruduxy Pegg Posted January 20, 2018 1 hour ago, Cacodemon345 said: @geo, ditch your Intel Celeron CPU and buy a new one. But expect it to be no more than 3.x GHz. Who cares about just the number stamped all over the CPU box? You know nothing about CPUs and how they work; please at least read about them and how they work before giving "advice" about them. The same goes for everything in a computer.
Jon Posted January 20, 2018 This article on the Cyrix processors of the 90s is interesting reading: http://liam-on-linux.livejournal.com/49259.html
ETTiNGRiNDER Posted January 20, 2018 1 hour ago, Jon said: This article on the Cyrix processors of the 90s is interesting reading: http://liam-on-linux.livejournal.com/49259.html That is interesting. I wonder if similar reasons have anything to do with why Heretic II performance is so lousy on my AMD K6-2 based rig, even though most other stuff of the era seems to do okay.
Woolie Wool Posted January 20, 2018 6 hours ago, Cacodemon345 said: I want to know people's opinions here. So what you really mean is you're waiting for someone just as wrong as you are to validate your ridiculous hangup on a number that stopped being relevant around the time you were born. Not that clock speed ever really was a good indicator of a CPU's performance--in the 1990s a Macintosh Quadra 700 (68040 @ 25 MHz) was much faster than a Macintosh IIfx (68030 @ 40 MHz)--but it made for good marketing.
Bauul Posted January 21, 2018 I remember the days when CPU speed was a thing. Then they invented dual-core processors and it stopped being a thing. It's like asking "Why don't car revs go higher than 9,000 RPM?" and completely ignoring everything else that determines the speed of a car.
Remilia Scarlet Posted January 21, 2018 Pretty much what RightField already linked. CPU clock speed doesn't really translate into a useful performance metric. Another thing to consider is the average number of instructions a CPU can execute in a single clock cycle. A CPU at 10 MHz may be able to do just a single instruction per cycle, but another 10 MHz CPU may be able to do two or three instructions per cycle. Or for a more real-world example, an Intel Pentium 4 EE at 3.2 GHz does something like an average of 3 instructions per clock cycle, while an Intel Core i5 7300U at 2.6 GHz does almost 21 instructions per cycle. 7 hours ago, Cacodemon345 said: I want to know people's opinions here. So in your opinion, why is it shitty?
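The comparison above is just arithmetic: effective throughput is roughly clock frequency times average instructions per cycle (IPC). A quick check in Python, using the approximate numbers from the post:

```python
# Effective throughput ~ clock frequency x average IPC.
# The clock and IPC figures below are the ones quoted in the post above.
def throughput_gips(clock_ghz, ipc):
    """Rough instructions-per-second estimate, in billions (GIPS)."""
    return clock_ghz * ipc

p4_ee = throughput_gips(3.2, 3)      # Pentium 4 EE: ~9.6 billion instr/s
i5_7300u = throughput_gips(2.6, 21)  # Core i5 7300U: ~54.6 billion instr/s
print(round(i5_7300u / p4_ee, 1))    # the lower-clocked chip is ~5.7x faster
```

Which is why the 2.6 GHz chip handily beats the 3.2 GHz one: the smaller number on the box simply isn't measuring the thing that matters.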
Jerry.C Posted January 21, 2018 (edited) 13 hours ago, Cacodemon345 said: I want to know people's opinions here. I only repeat what I told you in some other thread: do some research before writing posts that make you look dumb. There's no 'opinion' to be had here; the current limits of clock speed are dictated by physics, and it probably won't get much faster because, you know, we are getting into the realm where some hard physical constants will get hit. All we can try is to get more power out of each clock cycle, but even here there's not much headroom left to increase performance. There's a good reason why most modern programming languages increasingly focus on parallelism. It still leaves much to be desired, but rest assured that these problems will eventually be overcome and it will become much more natural to actually harness the power of multiple cores; then there will be practically no limits, because doubling the number of cores is no longer playing with hard limits. We can already see with graphics hardware what kind of power you can get with a radically parallel design. They still manage to double their processing power every few years. So, my prediction is that programming is about to change radically in the future, because the existing paradigm has reached its end and you cannot squeeze more out of it. Doom is an entirely different matter, of course. It depends too much on serial execution, so I do not expect ever to see major leaps forward that allow 100000+ monster maps to be played at reasonable frame rates.
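The ceiling Jerry.C describes for serial code like Doom's is captured by Amdahl's law: if some fraction of the work is inherently serial, extra cores stop helping. A small sketch (the serial fractions below are illustrative examples, not measurements of Doom):

```python
# Amdahl's law: speedup on n cores = 1 / (serial + parallel/n).
# As n grows, the speedup approaches a hard ceiling of 1/serial.
# The serial fractions used here are illustrative, not measured.
def amdahl_speedup(serial_fraction, cores):
    """Ideal speedup for a workload with the given serial fraction."""
    parallel = 1.0 - serial_fraction
    return 1.0 / (serial_fraction + parallel / cores)

print(round(amdahl_speedup(0.05, 8), 2))     # 5% serial, 8 cores -> 5.93
print(round(amdahl_speedup(0.05, 1000), 2))  # ceiling near 1/0.05 = 20x
print(round(amdahl_speedup(0.5, 1000), 2))   # 50% serial: barely 2x, ever
```

The last line is the Doom case in miniature: a heavily serial workload sees almost no benefit from a thousand cores, which is why faster single-thread performance, not core count, is what huge monster maps actually need.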
Cacodemon345 Posted January 21, 2018 I forgot to mention why no real CPU instructions aren't developed anymore.
Edward850 Posted January 21, 2018 4 minutes ago, Cacodemon345 said: I forgot to mention why no real CPU instructions aren't developed anymore.
Edward850 Posted January 21, 2018 (edited) Let me rephrase, then: you don't understand how CPUs work, and your question is not applicable.
Remilia Scarlet Posted January 21, 2018 15 minutes ago, Cacodemon345 said: I forgot to mention why no real CPU instructions aren't developed anymore. lol what. Like, actual individual instructions? Or instruction sets? Also, for the record and as an example, ARMv8.1 had new instructions added to it a few years ago, if I'm not mistaken. Some SIMD-related ones. Also check out this list of x86 instructions and when they got added.
Memfis Posted January 21, 2018 Cacodemon345 is probably the weirdest user currently on Doomworld.
Cacodemon345 Posted January 21, 2018 (edited) I meant instruction sets for x86 CPUs. Those instruction sets are sort of old and got added about 5-8 years back. @Memfis, I think another uncommon user here would be royaldj. He acts like he's been here since the stone age.
Remilia Scarlet Posted January 21, 2018 Just now, Cacodemon345 said: I meant instruction sets for x86 CPUs. Those instruction sets are sort of old and got added about 5-8 years back. x86 is the instruction set. It's been expanded upon continuously since it first appeared in the late 1970s. Things like x86-64, MMX, 3DNow!, the virtualization-related stuff, SSE, AVX, etc. are all extensions to x86. It also works well for what it's used for. m68k, ARM, POWER, MIPS, SuperH, SPARC... they've all been around for a while as well, but are still used. To ditch an instruction set just because it's supposedly "old" is crazy.
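For reference, the major x86 SIMD extensions kept arriving long after clock speeds stalled in the mid-2000s. The years below are approximate first-shipping dates (well-documented history); a quick Python tally:

```python
# Approximate introduction years of major x86 SIMD extensions,
# by the first CPU that shipped them. New extensions kept appearing
# well past the point clock speeds stalled around the mid-2000s.
x86_simd_extensions = {
    "MMX": 1997, "SSE": 1999, "SSE2": 2000, "SSE3": 2004,
    "SSSE3": 2006, "SSE4": 2008, "AVX": 2011, "AVX2": 2013,
    "AVX-512": 2016,
}
post_2005 = [name for name, year in x86_simd_extensions.items() if year > 2005]
print(post_2005)  # ['SSSE3', 'SSE4', 'AVX', 'AVX2', 'AVX-512']
```

So the premise that x86 extensions stopped being developed doesn't hold up; they just shifted toward wider vectors and specialized operations instead of higher clocks.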
Cacodemon345 Posted January 21, 2018 I meant instruction set extensions.
Dragonfly Posted January 21, 2018 Can you PLEASE, PLEASE, PLEAAASSEEEE cut out the incessant bullshit thread making? You always talk about things you have no understanding of. Holy fuck.
Jayextee Posted January 21, 2018 Guys why can't CPUs speak English instead of like machine code language? If it were proper words instead of all 1's and 2's like a binary code, surely they'd understand things better and there'd be less errors? :P
Dragonfly Posted January 21, 2018 It is in my opinion that CPU's should speak latin, so it's at least a bit difficult to make programs on!!!!!!!
Cruduxy Pegg Posted January 21, 2018 Stupid machines doing everything as it is written, why don't they have any creativity!!