  1. #1
    Community Member Draccus's Avatar
    Join Date
    May 2008
    Posts
    1,044

    Default Help me choose a video card

    I'm having a new PC built to replace my 6 year old Dell and I can't decide on a video card. These are two of my options:

    nVidia GTS 250 1gb
    ATI Radeon HD 5770 (about $60 more)
    nVidia GTX 260 (about $90 more)

    How much of a difference would I see for the extra $90? Would it make sense to go with ATI since they currently support DirectX 11?

    Basic, universal rogue build advice
    "Not in the face! Not in the faaaaaace!"

  2. #2
    Community Member
    Join Date
    May 2006
    Posts
    1,213

    Default

    oh man.

I just wrote a huge long response about how you shouldn't be upgrading the video card in a 6-year-old rig. Just as I finished trying to convince you that you need a new rig altogether, I re-read your post and saw that that's exactly what you're doing.

    To best answer your questions, I need to know your monitor's resolution. Without knowing that, I will just go ahead and recommend an option you have not offered: the GTS 250, then an upgrade to a newer nvidia card in a year or two.

    250 vs 260: the 250 is better in many ways, such as the 1GB of memory. The 260 may perform better in some/most applications, but I bet the difference is too small to notice.

250 vs 5770: DX11 support would be nice, but ATI's ongoing driver issues have kept me away from them for quite a while. Also, I have a hunch that there's a reason behind nVidia being so slow to adopt DX11 (even the 3XX series, slated for release in March, is not expected to have DX11 support). I have no data to back this up, but I suspect there's more going on there than we've been told. Also... I know there's been talk of DX11 in DDO, but I don't expect to see it anytime soon.

    What I do think is that in a year or possibly two, nVidia will come out with a new line (possibly the 4XX or a new naming convention altogether) that will support, say, DX 11.1 or DX 12... a.k.a. "DX11, fixed". In which case, having a DX11 card will do little good.

    These are all just hunches. I'm sure someone will disagree.

  3. #3
    Community Member Draccus's Avatar
    Join Date
    May 2008
    Posts
    1,044

    Default

    Thanks! I'm actually building a whole new PC, so this isn't just a video card purchase.

    I have a 20" widescreen monitor and have no idea what resolution I'd run it at.

    Basic, universal rogue build advice
    "Not in the face! Not in the faaaaaace!"

  4. #4
    Community Member
    Join Date
    May 2006
    Posts
    1,213

    Default

    Quote Originally Posted by Draccus View Post
    Thanks! I'm actually building a whole new PC, so this isn't just a video card purchase.

    I have a 20" widescreen monitor and have no idea what resolution I'd run it at.
    In that case I think any of those cards would do just fine, and you probably wouldn't be able to tell the difference between them.

  5. #5
    Community Member dasungeheuer's Avatar
    Join Date
    Oct 2009
    Posts
    118

    Default

Depends on what you'll be using it for. I have a 9800GT here, and it runs DDO fluidly at max resolution. I don't think it will run the most current games fluidly, though.

ATI does have DX11. Once DDO starts supporting it - sometime this year - you may get slightly better shadows and water. From the Turbine comments I have seen, I don't expect tessellation.

    NVidia will start supporting DX11 with their GF100 (GT300) "Fermi" boards. The only reason NVidia is late is that they got delayed - huge issues with making GT300 work due to the size of the chip. semiaccurate.com has a pretty good timeline on that debacle. fudzilla.com has the odd interesting tidbit, too, such as NVidia accelerating development of Fermi Mark 2 so they can get out from under losing money with Fermi as quickly as possible.

    I'd say figure out what games you wish to play, whether a little bit of shadow and water eye-candy in DDO is important, and then choose. If DDO is your main game, then a sub-$100 card is a reasonable choice.

    Alright, let's play the "name game". NVidia's naming is utterly confusing, as they have taken the same chip and renamed it several times during its lifetime. I'll try and unravel the renaming game for the GT(S)250 at least.

    GTS250: 8800GT -> 9800GT(X+) -> GTS250 (see also http://www.theinquirer.net/inquirer/...a-spins-gts250)

This is a chip quite capable of running DDO at max resolution. It is the G92 / 8800GT chip from way back in 2007, just shrunk once and renamed twice - the last rename was from the 9800GTX+. The 8800GT got horribly hot, but its 55nm shrink didn't. If you can get a 9800GT (or a derivative, such as the 9800GTX) for less money than the GTS250, you may as well go for that. There are some very nice low-power versions of the 9800GT out there that need minimal cooling and thus produce minimal noise. The same is true for the GT(S)250 - if you choose this chip, just get the best bang for your buck, regardless of which name you buy it under.

    1GB of video memory would come into play at 1080p resolutions - a 23" monitor and up. If you plan to replace your 20" monitor with something bigger during the lifetime of the card, then yes, 1GB makes sense.
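For illustration, here's a rough back-of-the-envelope sketch in Python of how much memory just the framebuffers take at various resolutions. The 4-bytes-per-pixel and triple-buffering figures are my own assumptions for the example; in practice textures, not framebuffers, are what actually eat most of a card's memory, so treat this as a lower bound:

```python
# Rough framebuffer arithmetic (illustrative only; textures and geometry,
# not the framebuffer, consume most video memory in a real game).
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate memory for color front/back + depth buffers, in MB."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

resolutions = {
    '20" (1680x1050)': (1680, 1050),
    '23" (1920x1080)': (1920, 1080),
    '30" (2560x1600)': (2560, 1600),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.0f} MB for framebuffers")
```

Even at 2560x1600 the framebuffers alone are well under 50 MB; it's the texture working set at high resolutions and quality settings that pushes you toward a 1GB card.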

The GTX260 is a 55nm part as well. It is actually a G200 under the hood, not just a rebadged G92 - while the GTX260M is another rebadged G92. Confused yet? NVidia EOL'd the GTX260 in the Nov/Dec timeframe, as they were losing money on it. Strakeln is right in that the performance difference from the GT(S)250 is minimal. Save your money, unless you can find a fire sale somewhere where it goes for the same price as the GT(S)250. See also http://semiaccurate.com/2009/10/06/n...gh-end-market/ for the story of how and why NVidia got out of that market.

Now on to DX11 / ATI. In the low-end, mid-range and performance DX11 segments, ATI is currently the only game in town, as NVidia have left the market to them. NVidia is going to offer a performance option with GF100 (Fermi) sometime in March. I don't expect Fermi-derivative mid-range or low-end boards anytime soon from NVidia. I think we'll see Fermi Mk2 first, and then we'll see derivatives.

I'd go and decide whether DX11 is important to you. If so, a 5750 may be a good choice when it comes to bang for the buck, or even a 5770. The 5600 range may be an option, too, just don't expect to run current-generation DX11 titles on it. DDO doesn't count: it's not current-generation, and as an MMO it's designed to run on a wider range of hardware. Some reviews to help you decide: http://www.anandtech.com/video/showdoc.aspx?i=3720 for the 5670, and http://www.anandtech.com/video/showdoc.aspx?i=3658 for the 5750/5770. The 5750 is a little bit faster than a GTS250, at about the same price point, and with DX11 support.

In the performance range, we have the 5850 and 5870, with the 5850 being the better deal. It's still too expensive, though - it should sell at around $250, not $300. Now that TSMC have resolved their process issues and yields are promised to go up - hopefully TSMC will keep that promise - I am hopeful that the prices of these will come down as NVidia launches Fermi-based boards.

    Lots of text, eh? In what seems to be your budget range, choose between the GT(S)250 or its namesakes on the one hand, or the 5750/5770 on the other hand. Choose 1GB if you want to be ready for a bigger monitor, and choose ATI if you want to use the DX11 features that Turbine is adding to DDO.

    [Edit]
A P.S.: I was pondering where Strakeln got the information that NVidia won't be supporting DX11 with the 300-series cards, and now I know what he means. He's right; it's just more of the confusing naming game. Fermi, aka GT300, aka GF100, will support DX11 - that's the part NVidia couldn't get out the door. In the meantime, as they had no new product, they started renaming old boards: first to the 200 series, and then to the 300 series. http://semiaccurate.com/2009/12/08/n...-series-grows/ and http://semiaccurate.com/2009/11/25/n...-renamed-g210/ have some details on that. It's actually so confusing I think I'll stop tracking it - the 200-series parts have been renamed themselves in many, but not all, cases, and the 300-series parts are another rename. Which is why, I think, NVidia dropped the original GT300 moniker they had reserved for Fermi and called Fermi the GF100 instead - GT300 is now used for the rebadge and marketing-spin game.

    Ouch my head.
    Last edited by dasungeheuer; 01-25-2010 at 08:08 AM.

  6. #6
    Community Member
    Join Date
    May 2006
    Posts
    1,213

    Default

    Whew... what dasungeheuer said

    One caveat:
    This is a chip quite capable of running DDO at max resolution.
    I think that needs to be interpreted as "max resolution the OP will have on the monitor he said he'll be running".

It's hard for me to say, as I've never played with a GTS250, but I did fry the bejesus out of my twin 8800GTs (playing at 2560x1600 with not-maxed graphics). I'm unsure whether the 250 would do the same... I know I was running into memory issues, for sure. (I run a GTX285 w/2GB now, all settings maxed, no problems.)

  7. #7
    Community Member dasungeheuer's Avatar
    Join Date
    Oct 2009
    Posts
    118

    Default

    Quote Originally Posted by Strakeln View Post
    One caveat:

    I think that needs to be interpreted as "max resolution the OP will have on the monitor he said he'll be running".
    Oh, yeah, thanks for catching that. I meant to say "max graphics settings on the max resolution of the monitor that the OP said he'd be running". Meaning, for a 20" monitor, that card will run DDO at decent frame rates (40-50) with max quality and draw distance settings. I know, because I run its namesake on a 22" monitor.

  8. #8
    Community Member
    Join Date
    May 2006
    Posts
    1,213

    Default

    Quote Originally Posted by dasungeheuer View Post
    Oh, yeah, thanks for catching that. I meant to say "max graphics settings on the max resolution of the monitor that the OP said he'd be running". Meaning, for a 20" monitor, that card will run DDO at decent frame rates (40-50) with max quality and draw distance settings. I know, because I run its namesake on a 22" monitor.
    I figured that's what you meant, and agree completely.

  9. #9
    Community Member Draccus's Avatar
    Join Date
    May 2008
    Posts
    1,044

    Default

    Wow, thanks a ton all! You guys saved me a bit of money.

Yesterday, I bought the PC and spec'd the GTS250. Given that I'm currently playing on a Radeon X300, I don't think I'd notice the difference between the 250 and the GTX260.

AMD Phenom II 955 Black Edition quad-core...lol...no idea what that means but it was the second-best AMD processor they had.

    Liquid cooling? Seriously? I bought liquid cooling? mkaaaayy...do I need to run a water line to my PC? Is this any good? It was a free upgrade so I hope it's an improvement.

    I only got a 600w power supply but may upgrade it to 700w before it's built. Thoughts?

    Basic, universal rogue build advice
    "Not in the face! Not in the faaaaaace!"

  10. #10
    Community Member
    Join Date
    May 2006
    Posts
    1,213

    Default

    Quote Originally Posted by Draccus View Post
    I only got a 600w power supply but may upgrade it to 700w before it's built. Thoughts?
    700W is probably sufficient, but if you can, going up to 1000W (of a decent power supply, not some walmart version) might be worthwhile (thinking forward to future upgrades). Not necessary, I'm thinking, but not a bad idea.

  11. #11
    Community Member dasungeheuer's Avatar
    Join Date
    Oct 2009
    Posts
    118

    Default

The industry has gone bat**** on PSUs while I wasn't looking, it seems. The most demanding graphics cards - way above GTS250 level - will draw about 280 watts. Add the rest of your stuff, and a 600W PSU is plenty unless you wish to run two really high-end graphics cards, at which point you'll be shelling out something on the order of 800 bucks just for the graphics cards in your system. As I can't _ever_ see you doing something so wasteful, the 600W will serve you fine now and in the future.
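To put rough numbers on that, here's a quick power-budget sketch in Python. The per-component wattages below are my own illustrative guesses for a build like this, not vendor figures:

```python
# Back-of-the-envelope system power budget (illustrative wattages, not
# vendor specs) showing the headroom a 600W PSU leaves on a build like this.
components = {
    "CPU (quad-core, full load)":   125,
    "GTS 250-class graphics card":  150,
    "Motherboard + RAM":             50,
    "Hard drives + optical drive":   30,
    "Fans, pump, USB devices":       25,
}

peak_draw = sum(components.values())   # worst case: everything loaded at once
headroom = 600 - peak_draw

print(f"Estimated peak draw: {peak_draw} W")
print(f"Headroom on a 600W PSU: {headroom} W")
```

Even with pessimistic numbers, a single-GPU system like this peaks well under 400 W, which is why a quality 600W unit leaves comfortable headroom for upgrades.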

    I have a 520W and it's plenty. Plenty and then some. For a single GPU, that is.

    [Edit]

The "Black Edition" Phenoms have great overclocking headroom, and the water cooling you got plays to that, too. Water cooling is usually also a lot quieter than fans, so you can have a quiet system, and if you feel like it, you can play with overclocking your CPU. Just take it slow when you do, and read a lot first: if you bump voltages too high in your zeal for speed, you'll kill the CPU. "OCCT" is a great stability-testing tool after an overclock. If it survives 4 hours of OCCT, with maybe an extra hour of OCCT:Linpack thrown in, it's a rock-solid overclock.
    Last edited by dasungeheuer; 01-26-2010 at 09:57 PM.

  12. #12
    Community Member Draccus's Avatar
    Join Date
    May 2008
    Posts
    1,044

    Default

I don't ever see myself overclocking my CPU. The most complicated thing I do to my computer is back up my music and photo folders. I'm not a PC guy. It's not a hobby for me and I treat it like a TV: I turn it on, watch it, and turn it off.

    That being said, it's nice to know the potential is there should I ever decide to take this stuff more seriously. I did do a lot of reading on the liquid cooling device (after I bought it, of course...lol) and it seems like pretty reliable technology. I think it's a good idea for me because my PC sits inside a hardwood computer credenza and doesn't get great air flow. The more cooling the better!

    I actually did upgrade the power supply to 700w. There COULD be a beta nVidia Fermi pre-production test card in my future and I want to ensure I have the power for it. Overkill? Yeah, but for $10, I'd rather be safe.

    Basic, universal rogue build advice
    "Not in the face! Not in the faaaaaace!"
