
So how many cores are enough these days?

Personally, I think IBM and the PC community missed a chance to change horses when the Power architecture came out. It's still very much alive; some of these chips have 18 cores...
 
From now on the only acceptable replies to this thread are unsigned integers directly responding to the question "So how many cores are enough these days?".

2.

Yes, I'll vote 2 myself. As I already mentioned in another post, I need one core for myself, and Firefox can have the other one. Everything's working out smoothly. That's the one good thing that came out of my laptop's charging circuitry failing so that I had to switch to the notebook. The old laptop has one core, the notebook two. The notebook has a much poorer display, but the two cores take care of the Firefox annoyance I constantly struggled with on the laptop. My office computer back home has tons of everything, but the big difference is between one core and two cores.

-Tor
 
I'm not sure what the upper limit is, but I think it's more than 2. Regardless of whether or not a particular application you're using is single-threaded, there's plenty going on behind the scenes with the OS that benefits from multiple cores.
 
Heh. I chose "two" partially tongue-in-cheek, partially because at this point I'd say a that the early Core2Duo machines sit just above the "too slow" line for general-purpose computing; IE, while I as a geek can easily find good uses for older machines a C2D is about the slowest thing I'd feel comfortable, say, handing to a computer-phobic elderly relative and setting them loose on YouTube, FaceBook, etc.

(And while SMP machines certainly existed before the C2D, particularly in the non-x86 world in the form of Apple PowerMacs, they weren't really mainstream consumer products until the Core Duo came along... and yes, strictly speaking the Athlon X2 slightly beat it to market, I've just never actually had my hands on one.)

I'm sure other people on this board would say they can still get along fine with a mid-level P4 or even slower, but, well, my personal criterion came from setting up a machine for my kid to play the Flash games on PBS.org, and sometime around 2012 or so a P4 just wouldn't cut it for that anymore. C2Ds still seem adequate, even for high-res YouTube-ing, which is pretty impressive really.
 
Heh. I chose "two" partially tongue-in-cheek, partially because at this point I'd say a that the early Core2Duo machines sit just above the "too slow" line for general-purpose computing; IE, while I as a geek can easily find good uses for older machines a C2D is about the slowest thing I'd feel comfortable, say, handing to a computer-phobic elderly relative and setting them loose on YouTube, FaceBook, etc.

A real geek should be overclocking the nuts off it :p
[attached image: 555fsb.jpg]

That's my old E8500 when it was new. Long gone now, but the overclocked i7 quad core that replaced it didn't feel any faster to use.
Although that isn't an "early C2D", even those can be punched up to a higher speed.
 
A real geek should be overclocking the nuts off it :p

Heh. I sorta gave up on overclocking sometime in the early 'aughts; if anything these days I'd rather underclock it if I knew for sure it'd make it reliable. ;)

In retrospect I should have probably said "four" because my kid's current machine is a Q6600 desktop I saved from being recycled last year. (Easy to forget that upgrade since I pretty much just swapped hard disks and apt-get'ed the nVidia driver to go with the GeForce 8000-something the new box had in it.) Think I stuck with "2" because my standards are low enough that the Q6600 still seems pretty capable and quick. (At least until I tried installing a Playstation 2 emulator on it. That was a bridge too far.)
 
C2Q on an Asus P5N-D is the way to go, I think. Side by side with a brand-new i7 there literally isn't much difference, until I feed either one a solid-state drive; then the difference is very noticeable. And yes, a true geek overclocks the pants off his computers.
 
I'll chime in: Because some applications don't play nice with the system, you must have a bare minimum of 2. A realistic number given what you get on a typical web page these days is 4 cores, so that the loading and rendering threads are on their own metal.

Because I do multimedia editing and compression, I can never have enough cores. I've been running a Core i7 920 with hyperthreading turned on (4 cores, total of 8 hardware threads) overclocked to 3.2GHz for almost 7 years. I'm waiting until 6- or 8-core desktop Intel CPUs are in my price range before I upgrade again.
 
C2Q on an Asus P5N-D is the way to go, I think. Side by side with a brand-new i7 there literally isn't much difference, until I feed either one a solid-state drive; then the difference is very noticeable. And yes, a true geek overclocks the pants off his computers.

My PC at work is a Q6600. There's a colossal difference that I can notice between that and my home i7-4930k OC'd to 4.2 GHz with almost everything I do! Granted, my work PC is running Vista...
 
I'll chime in: Because some applications don't play nice with the system, you must have a bare minimum of 2. A realistic number given what you get on a typical web page these days is 4 cores, so that the loading and rendering threads are on their own metal.

Because I do multimedia editing and compression, I can never have enough cores. I've been running a Core i7 920 with hyperthreading turned on (4 cores, total of 8 hardware threads) overclocked to 3.2GHz for almost 7 years. I'm waiting until 6- or 8-core desktop Intel CPUs are in my price range before I upgrade again.

^ This. 5 years ago, getting by on a high-end Pentium 4 system still felt adequate for day-to-day use. If you try it today with modern software and the modern web, it's an absolute snail.
 
Well, hyperthreading reduces the performance of an individual core, so unless you need the parallelism it'll give you less performance. So for my very demanding single-threaded processing, I have hyperthreading switched off on the big server.

-Tor
 
If your workloads are primarily single-threaded (i.e. you don't run any software that uses multithreading, or you tend to do only one thing at a time), then yes, disabling hyperthreading will allow each core to run faster.

Because I work in video production, whose editors and encoders are multithreaded, I see a significant throughput improvement by enabling hyperthreading.
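For reference, a quick way to check whether hyperthreading is actually enabled on whatever box you're sitting at is to compare physical cores against the logical CPUs the OS schedules on. This is just a rough sketch, assuming Python 3 plus the third-party psutil package (pip install psutil):

```python
# Compare physical cores to logical CPUs to see whether SMT/HT is active.
# Assumes Python 3 and the third-party psutil package.
import os
import psutil

logical = os.cpu_count()                    # hardware threads the OS sees
physical = psutil.cpu_count(logical=False)  # actual cores (may be None on odd platforms)

print(f"physical cores: {physical}, logical CPUs: {logical}")
if physical and logical:
    print("SMT/Hyper-Threading looks", "enabled" if logical > physical else "disabled")
```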
 
So all of us with 2- or 4-core CPUs with hyperthreading could see an improvement by disabling it?
Of course not. Only when your usage fits the case where you need maximum single-thread performance, i.e. the job you need done cannot be parallelized and you need it finished in the shortest amount of time, as in the case I described in the post you quoted.
The point is that hyper-threading doesn't create performance out of thin air; it simply interleaves two threads on the same core, so one instruction can come from one task and the next instruction from the other. There are some additional resources in a core with HT, but the execution resources are shared. This reduces the maximum performance you can get out of it, seen from a single task's point of view.
For your day-to-day usage of the PC, with an operating system that supports it, HT is probably an improvement. But it's not optimal for all cases. For those cases you can run your case-specific benchmark test, turn off HT, and run it again. If the latter improves performance, keep HT off. As I did.

-Tor
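
A rough sketch of that kind of case-specific A/B test (assuming Python 3; the busy-work function here is only a stand-in for whatever you actually run, and the idea is to time it once with HT enabled and once with it disabled in the BIOS):

```python
# Time a fixed, CPU-bound workload with 1 worker and with as many workers
# as the OS reports logical CPUs, then compare the two runs with HT on
# versus HT off. multiprocessing is used so the work really runs in parallel.
import multiprocessing as mp
import os
import time

def burn(n):
    """Purely CPU-bound busy work (stand-in for your real job)."""
    total = 0
    for i in range(n):
        total += i * i
    return total

def run(workers, chunks=16, work_per_chunk=2_000_000):
    start = time.perf_counter()
    with mp.Pool(processes=workers) as pool:
        pool.map(burn, [work_per_chunk] * chunks)
    return time.perf_counter() - start

if __name__ == "__main__":
    logical = os.cpu_count() or 1
    for w in (1, logical):
        print(f"{w:2d} worker(s): {run(w):6.2f} s")
```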
 