Topic: Is the Core i7 2600K really worth the extra cost over Core i5 2500K?  (Read 16021 times)

Deozaan

  • Charter Member
  • Joined in 2006
  • Posts: 9,778
I've been building a NewEgg wishlist for a new computer I'm saving up for, and I've been drooling over the new Core i7 2600K ever since I heard about it a few months ago. But dang! It's nearly 1/3 of the entire build cost just for the CPU!

So today I decided to look at the Core i5 2500K and compare the two. According to NewEgg's Details pages on the two CPUs, there are only three differences between the two:

The Core i7 has
  • Hyperthreading support
  • 2MB more L3 cache (8MB total)
  • 0.1 GHz higher clock speed

Is it really worth almost $100 more just for that? Is there more to it that I'm not seeing or thinking about? E.g. does the i7 overclock a lot higher or with more stability than the i5?

I went to Tom's Hardware to compare the two and of course in most cases the i7 performs better, but is it really going to be that big of a boost to justify the extra cost?
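
Just to put rough numbers on it, here's the kind of back-of-the-envelope comparison I've been doing. The prices below are approximate figures I'm assuming (around $215 for the i5 2500K and $315 for the i7 2600K), not exact quotes, so treat this as a sketch rather than anything definitive:

Code:
# Back-of-the-envelope i5-2500K vs i7-2600K comparison.
# Prices are rough assumed figures, not quotes; specs are from the NewEgg
# detail pages mentioned above (i5: 3.3GHz/6MB L3/4 threads,
# i7: 3.4GHz/8MB L3/8 threads).
i5 = {"price": 215.0, "clock_ghz": 3.3, "l3_mb": 6, "threads": 4}
i7 = {"price": 315.0, "clock_ghz": 3.4, "l3_mb": 8, "threads": 8}

def pct_more(a, b):
    """How much bigger b is than a, as a percentage."""
    return (b - a) / a * 100

print(f"Price premium: {pct_more(i5['price'], i7['price']):.0f}%")          # ~47%
print(f"Clock uplift:  {pct_more(i5['clock_ghz'], i7['clock_ghz']):.1f}%")  # ~3%
print(f"L3 cache:      {pct_more(i5['l3_mb'], i7['l3_mb']):.0f}%")          # ~33%
# Hyperthreading doubles the thread count (4 -> 8), but it only pays off
# in workloads that actually scale across that many threads.

So roughly a 47% price premium buys a 3% clock bump, 33% more cache, and Hyperthreading.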

My uses for the PC would be as a gaming machine and a media server (probably just music but possibly video), plus some (relatively) low-poly 3D modeling, and of course having a zillion tabs open in my browser. Possibly all at the same time.

40hz

  • Supporting Member
  • Joined in 2007
  • Posts: 11,859
IMO, if you're doing professional video editing, CGI/animation, or serious 3D modeling, the i7 may be worth the extra money. For everything else most mortals will be doing, the i5 2500 should do just fine. And then some. It should be overkill for what you're using it for. If I had an extra $100 burning a hole in my pocket, I'd opt for a more powerful video card rather than spend it on a hotter CPU. But that's me. I'm usually out for more 'bang for the buck' than I am for absolute specs. So I always look for that sweet spot. Which is usually two or three down from the flagship when it comes to Intel's CPUs.

Not having deep pockets does that to you.  ;D
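
To put the 'bang for the buck' idea into numbers, here's a quick sketch. The prices and relative performance scores below are invented purely for illustration (they're not real benchmarks for any actual lineup), but they show why the perf-per-dollar sweet spot tends to sit a couple of steps below the flagship:

Code:
# Illustrative only: prices and relative performance scores are made up
# to demonstrate the performance-per-dollar "sweet spot" idea; they are
# not real benchmark data for any actual CPU lineup.
lineup = [
    ("flagship",   999, 100),  # (tier, price in $, relative performance)
    ("one down",   580,  95),
    ("two down",   315,  88),
    ("three down", 215,  82),
    ("budget",     125,  30),
]

for tier, price, perf in lineup:
    print(f"{tier:>10}: {perf / price * 100:5.1f} perf points per $100")
# With these (made-up) numbers, perf-per-dollar peaks two or three steps
# below the flagship -- exactly the sweet spot described above.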



Deozaan

  • Charter Member
  • Joined in 2006
  • Posts: 9,778
If I had an extra $100 burning a hole in my pocket, I'd opt for a more powerful video card rather than spend it on a hotter CPU. But that's me. I'm usually out for more 'bang for the buck' than I am for absolute specs. So I always look for that sweet spot.

Yeah, that's the situation I'm in as well. I don't have extra money for the luxury of a new PC, so I doubt I'll be buying this "Wish List PC" for several months. But shaving about $100 off the cost would allow me to spend it elsewhere or not have to save up for so long before having the money to buy it.

I'm just so sick of my doggone slow AMD 64 3500 that I keep thinking I'm going to spend a lot of money on a really nice CPU so I won't have to worry about upgrading it for a good long time. But now that I've looked into the differences between the i5 2500K and the i7 2600K I think it's probably not really worth the extra cost.

Then again, maybe by the time I've saved up the money to buy the machine the difference in cost between the two CPUs will be more proportional to the difference in performance. . . :-\

Ath

  • Supporting Member
  • Joined in 2006
  • Posts: 3,629
so I doubt I'll be buying this "Wish List PC" for several months.
If you wait a few more months the difference will have eroded away, because by then even faster/better CPUs in the i7 line will probably have been released, and the net effect of that is that the current generation comes down in price :Thmbsup:

phitsc

  • Honorary Member
  • Joined in 2008
  • Posts: 1,198
What budget are you targeting, if you don't mind me asking?

Deozaan

  • Charter Member
  • Joined in 2006
  • Posts: 9,778
What budget are you targeting, if you don't mind me asking?

Well my budget was mostly... put together a really nice computer and then see how much it costs. :D

I've currently got it at just under $1,000 (with the i7) but I'd love to bring that down.

On the other hand, I was going to "recycle" some parts out of my current PC, like the DVD burner, HDDs, WiFi adapter, etc. But I'd also like to get an SSD, and that hasn't been added to the wish list yet. I'm still waiting for 120GB SSDs to hit about $100, so that may be a future upgrade after I buy the PC.

And I was also considering getting a couple of 2 TB drives in a RAID mirror. But as with the SSD, those could be part of a future upgrade and aren't necessarily related to this thread. ;) Also, some recent threads here on DC have discouraged RAID for home machines, so I'm not so sure I want to do that anymore...
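
For what it's worth, the mirror trade-off is easy to put numbers on. A quick sketch, assuming two identical 2 TB drives in RAID 1:

Code:
# RAID 1 (mirror) with two identical drives: you get the capacity of one
# drive and can survive one drive failing. Quick sketch for 2 x 2 TB.
drive_tb = 2
drives = 2

raw_tb = drive_tb * drives          # 4 TB of raw disk
usable_tb = drive_tb                # the mirror keeps a full copy: 2 TB usable
overhead = 1 - usable_tb / raw_tb   # 50% of the raw space goes to redundancy

print(f"Raw: {raw_tb} TB, usable: {usable_tb} TB, redundancy overhead: {overhead:.0%}")
# Note: a mirror protects against a single drive failure, not against
# accidental deletion, so it isn't a substitute for backups.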

eleman

  • Spam Killer
  • Supporting Member
  • Joined in 2009
  • Posts: 413
I'm just so sick of my doggone slow AMD 64 3500 that I keep thinking I'm going to spend a lot of money on a really nice CPU so I won't have to worry about upgrading it for a good long time.

Buy any dual-core processor (even low-end Phenoms or ancient Core 2 Duos) and you will feel like you have upgraded from a 1990s Chevy to a Corvette. So my advice to you is "don't go paying a netbook price for a CPU". The law of diminishing returns says that the extra utility a Ferrari will bring over a Corvette will be substantially smaller.

Stoic Joker

  • Honorary Member
  • Joined in 2008
  • Posts: 6,649
IMO, if you're doing professional video editing, CGI/animation, or serious 3D modeling, the i7 may be worth the extra money. For everything else most mortals will be doing, the i5 2500 should do just fine. And then some. It should be overkill for what you're using it for. If I had an extra $100 burning a hole in my pocket, I'd opt for a more powerful video card rather than spend it on a hotter CPU. But that's me. I'm usually out for more 'bang for the buck' than I am for absolute specs. So I always look for that sweet spot. Which is usually two or three down from the flagship when it comes to Intel's CPUs.

Not having deep pockets does that to you.  ;D

+1

I'm running Windows 7 x86 on an AMD 64 X2 3800+ with 2GB of (Mushkin) RAM here at the office, and it's actually quite tolerable for a 4/5 year old machine. Having 50 windows, MSVS2005, and a Virtual PC running doesn't even cripple it (Adding a 2nd VM will however).

I've set up several i5 (and i3) office workstations, and the i5s really are nice even with a basic (business) hardware package.

Carol Haynes

  • Waffles for England (patent pending)
  • Global Moderator
  • Joined in 2005
  • Posts: 8,069
AMD CPUs tend to be cheaper - have you seen the Phenom II X6 processor? I am VERY happy with mine and I haven't managed to max it out yet (despite trying)!!

On Amazon UK the Phenom II X6 3.3GHz is £80 cheaper than the quad-core i7 3.4GHz.

AMD seems like a lot better value and the mobos are the same price.
« Last Edit: June 01, 2011, 01:31 PM by Carol Haynes »

Deozaan

  • Charter Member
  • Joined in 2006
  • Posts: 9,778
I'm running Windows 7 x86 on an AMD 64 X2 3800+ with 2GB of (Mushkin) RAM here at the office, and it's actually quite tolerable for a 4/5 year old machine. Having 50 windows, MSVS2005, and a Virtual PC running doesn't even cripple it (Adding a 2nd VM will however).

I've set up several i5 (and i3) office workstations, and the i5s really are nice even with a basic (business) hardware package.

I don't know what's wrong with my PC. It's frequently sluggish even on a fresh boot of Windows 7 x86, and uses up about 1.5 GB (of 2GB) of RAM, IIRC. I think I may be having hardware (mobo) problems too, so I'm not sure how that affects it, if at all.


AMD CPUs tend to be cheaper - have you seen the Phenom II X6 processor? I am VERY happy with mine and I haven't managed to max it out yet (despite trying)!!

AMD seems like a lot better value and the mobos are the same price.
-Carol Haynes (June 01, 2011, 01:28 PM)

I probably would have been more interested in a 6 core AMD, but I asked about it earlier this year and the response I got was that even a 6-core AMD wasn't as good as a 4-core Intel. But maybe I misunderstood and that was just for the specific scenario I asked about?

Shades

  • Member
  • Joined in 2006
  • Posts: 2,939
Nope, I watched LAN parties over here in Paraguay between rich kids. AMD is definitely not bad at all (and in your case a MAJOR improvement), but an Intel quad core does blow it out of the water for gaming.

Strange as it may sound, in my experience an Intel processor with ATI/AMD video card(s) and an AMD processor with NVidia video card(s) perform best.
One of those kids I know is an NVidia fanatic and has 2 GTX 460s (SLI) on his Intel quad core. But some time ago he listened to reason and bought a big ATI/AMD card (HD5870). When I watch what he is playing on his PC, I see a lot less "flickering/pausing" in games with the single ATI card than with the dual NVidia cards.

When I mention this I get his typical fanboy answer, including the obligatory benchmark numbers, but I am not blind, and yes, he is using up-to-date drivers for his Windows 7 64-bit.

wraith808

  • Supporting Member
  • Joined in 2006
  • Posts: 11,190
^ No, I find the same thing.  I used to use NVidia for everything with my Intel setups... I switched over to AMD/ATI for my video and have noticed nothing but improvements.  I was also in the AMD processor camp for a while, but that switched a while back, and while AMD might be less expensive, it seems that Intel has more staying power in terms of how long the processor can take you between upgrades, especially since the changes in architecture are now very easy to predict.

40hz

  • Supporting Member
  • Joined in 2007
  • Posts: 11,859
FWIW I stopped using nVidia cards (too many driver hassles) and switched back over to ATI about a year ago. I haven't had any regrets about that decision.  :)




Carol Haynes

  • Waffles for England (patent pending)
  • Global Moderator
  • Joined in 2005
  • Posts: 8,069
FWIW I stopped using nVidia cards (too many driver hassles) and switched back over to ATI about a year ago. I haven't had any regrets about that decision.  :)

Ditto - the only problem I have had is cleaning off nVidia drivers to get ATI drivers to install without issue.

Stoic Joker

  • Honorary Member
  • Joined in 2008
  • Posts: 6,649
Same here too - I stepped off of nVidia back when Vista was in beta; they were just taking way too long to create a stable driver. I grabbed an ATI card (X300 IIRC) for my at-the-time test box ...(damn thing ran just fine with OOB drivers)... and I ain't looked back since.

40hz

  • Supporting Member
  • Joined in 2007
  • Posts: 11,859
Funny thing was, I started using nVidia when they got more friendly towards Linux than anybody else. Most really good graphics experiences on Linux came if you had an nV card installed. Prior to that I always opted for ATI.

Then they pooched on their Nix commitment...

Oh well. I've had my fling and I'm back with my old flame now.  ;D


Deozaan

  • Charter Member
  • Joined in 2006
  • Posts: 9,778
Hmm... That's good to know. I had an nVidia on my wishlist.

So what's a good ATI video card to add to my wishlist? I think the last time I used ATI was back when GPUs went in an AGP slot (rather than PCIe), so I'm kind of out of the loop on what is good for ATI/AMD. In fact I didn't even know that ATI and AMD merged or partnered or whatever. :-[

The nVidia I had in my wish list (GeForce GTX 550 Ti (Fermi) 1GB 192-bit GDDR5) is about $125-$150 (depending on whether you count mail-in rebates against the price or not), so approximately that budget range would be nice. But it's looking more like I'll go for the Core i5, which means I could perhaps afford to spend a bit more on the GPU if there were a significant reason to.

Carol Haynes

  • Waffles for England (patent pending)
  • Global Moderator
  • Joined in 2005
  • Posts: 8,069
The one I am using is a Sapphire Radeon HD 5670 1GB GDDR5, which supports 3 full HD digital displays and is pretty good on the games I have installed. In the UK it is about £80-£85, which makes it pretty good value.

If you do look at these, there is a range of models with a variety of output formats - make sure that you get the one with 3 digital outputs (DVI, HDMI and DisplayPort) - it comes with a DVI-to-VGA adapter and an HDMI-to-DVI adapter. Sapphire also do DisplayPort-to-DVI adapters separately.

I have mine set up with 3 x 24" Samsung HD 1920 x 1080 screens and it is fantastic for the price point.

steeladept

  • Supporting Member
  • Joined in 2006
  • Posts: 1,061
Here is a related question for those of you who know....Assuming you are not gaming or doing any high-quality artwork, is there any reason not to use onboard video?  From what I have seen, onboard video is quite capable, even for gaming (as long as it isn't cutting-edge games).  Is this true, or do you still suggest a separate card? If it is not true, I would really like to hear the reasons you would suggest sticking with a separate card.  If it is true, perhaps Deo would like to consider that as another cost-cutting area.

Oh, and just for the sake of argument, let's assume it is a single monitor.  That right there is the main reason I am considering a separate card, but if there are options there, I am interested to hear about them as well.

Deozaan

  • Charter Member
  • Joined in 2006
  • Posts: 9,778
Here is a related question for those of you who know....Assuming you are not gaming or doing any high-quality artwork, is there any reason not to use onboard video?

First of all, your assumption is wrong. I'm a gamer.  :D

Second of all, there is no such thing as "onboard video" on the motherboards that allow overclocking of the CPU I'm getting.

IIRC, there are P67 and H67 motherboards for these CPUs: the H67 series enables the CPU's built-in video (Intel HD 3000), while the P67 series unlocks the CPU's overclocking abilities but disables the built-in video. Or something.

Anyway, the point is that I want to overclock, so I'm getting a P67-series motherboard, which means I'd need a discrete GPU anyway.

But I think you're right, and most people would be happy with an H67 mobo that has the video stuff, without a discrete GPU.
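
If it helps anyone else making the same decision, here's that chipset logic as a little sketch. The feature flags are just my summary of the P67/H67 split described above, so double-check them against Intel's actual chipset specs before relying on them:

Code:
# My (possibly simplified) understanding of the Sandy Bridge chipset split:
# P67 = CPU overclocking unlocked, but no use of the CPU's integrated graphics;
# H67 = integrated graphics outputs enabled, but no CPU overclocking.
chipsets = {
    "P67": {"cpu_overclocking": True,  "integrated_graphics_out": False},
    "H67": {"cpu_overclocking": False, "integrated_graphics_out": True},
}

def pick_chipset(want_overclock, want_onboard_video):
    """Return the chipset matching the requirements, per the table above."""
    for name, features in chipsets.items():
        if (features["cpu_overclocking"] == want_overclock
                and features["integrated_graphics_out"] == want_onboard_video):
            return name
    return "no single P67/H67 board covers that combination"

print(pick_chipset(want_overclock=True, want_onboard_video=False))   # P67
print(pick_chipset(want_overclock=False, want_onboard_video=True))   # H67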

Here's a video detailing the features of these CPUs: [embedded video]

EDIT: added a screenshot of the slide seen at about 6:11 in the video above: [attachment: Core iX-2xxxK.png]

EDIT 2: I just realized that your question wasn't directed at me, but that it was a question in general about whether or not onboard video would be good enough in the scenario you described.
« Last Edit: June 03, 2011, 01:17 AM by Deozaan, Reason: General clarifications »

steeladept

  • Supporting Member
  • Joined in 2006
  • Posts: 1,061

First of all, your assumption is wrong. I'm a gamer.  :D


Sorry to imply the assumption applied to you.  I was pretty sure you were a gamer, though I didn't know how cutting-edge the games you played were.  I was asking more about the quality of onboard video, given the assumption that someone is not a hard-core, cutting-edge gamer.  Guess I should have made that more clear.   :-[

Given the rest of the specs, I didn't figure you would even have that as an option, but I threw it out there because I was too lazy to look up the specs of what you had on your list.

NOTE: Doh! Just noticed the last line in your post. Well, it is out there twice now for other readers  :P
« Last Edit: June 03, 2011, 01:12 AM by steeladept »

Deozaan

  • Charter Member
  • Joined in 2006
  • Posts: 9,778
Guess I should have made that more clear.   :-[

No, it was my fault. I skimmed the first sentence you wrote (even though I quoted it! :-[) which made it pretty clear that your question wasn't directed at/about me.

Sorry.  :-[

Stoic Joker

  • Honorary Member
  • Joined in 2008
  • Posts: 6,649
Here is a related question for those of you who know....Assuming you are not gaming or doing any high-quality artwork, is there any reason not to use onboard video?  From what I have seen, onboard video is quite capable, even for gaming (as long as it isn't cutting-edge games).

The Dell Dimension E521 I'm using here at the office does just fine with onboard video. Granted, I don't play games on it (well, okay, Mahjong once or twice), but for work stuff or just general computing it's fine by me.

And yes, it was a cost-cutting exercise. I got a base machine with the CPU selected by best price point, and added RAM later when the price dropped. It's been a great machine for what I do.

40hz

  • Supporting Member
  • Joined in 2007
  • Posts: 11,859
+1 w/Stoic. For many applications, onboard video works just fine. For general business and productivity it's usually all you'll need. :Thmbsup:


steeladept

  • Supporting Member
  • Joined in 2006
  • Posts: 1,061
I was thinking more like for Mom & Dad, who would probably be doing some gaming, but probably nothing like what your typical gamer would consider gaming.  Games that were popular 5 years ago or more, perhaps.  And of course doing quick photo editing via the stock programs that come with their camera, web browsing, maybe even video conferencing.  Do you consider those types of activities business/productivity?  My gut says that onboard would work fine, but I just don't know anymore.