
Video cards

(30 posts)
  • Started 5 years ago by JKay
  • Latest reply from drifta
  • Topic Viewed 3234 times

JKay
Posts: 78

Okay, so I'm torn between the NVIDIA 9800 GX2 and the NVIDIA GTX 280. All the reviews say the GTX has a higher minimum FPS, but the GX2 seems to have better overall performance... I don't know, any help?

Posted 5 years ago
 
jack7h3r1pp3r
Posts: 2815

here are the specs:

GPU/VPU: NVIDIA GeForce GTX 280

RAMDAC: Dual 400 MHz

Fill Rate per Second: 48.2 Billion pixels

Additional Features: HDTV Ready
SLI Ready
DirectX 10
OpenGL 2.1
PCI Express 2.0
HDCP Enabled (Dual-Link)

Maximum Resolution: 2560 x 1600 (Digital)

Video Memory: 1GB

Memory Type: GDDR3

Core Clock: 602 MHz

Memory Interface: 512-bit

Memory Clock: 2214 MHz

Shader Clock: 1296 MHz

Stream Processors: 240

Memory Bandwidth: 141.7GB/sec.

Interface Type: PCI Express 2.0

Interface Speed: x16

Connector(s): Dual DVI (Dual Link)
HDTV/S-Video
VGA (w/DVI to VGA Adapter)

Multiple Monitors Support: Yes

GPU/VPU: NVIDIA GeForce 9800 GX2

RAMDAC: Dual 400 MHz

Fill Rate per Second: 76.8 Billion pixels

Additional Features: HDCP Enabled
DirectX 10
OpenGL 2.1
PCI Express 2.0
Vista Certified

Maximum Resolution: 2560 x 1600 (Digital) - Dual Link DVI

Video Memory: 1GB ( 2 x 512MB)

Memory Type: GDDR3

Core Clock: 600 MHz

Memory Interface: 512-bit (256-bit per GPU)

Memory Clock: 2000 MHz

Shader Clock: 1500 MHz

Stream Processors: 256 (128 per GPU)

Memory Bandwidth: 128.0GB/sec.

Interface Type: PCI Express 2.0

Interface Speed: x16

Connector(s): HDMI
Dual DVI (Dual Link)

Multiple Monitors Support: Yes

this is where i pulled them from:
http://www.tigerdirect.com/app.....CatId=2306

http://www.tigerdirect.com/app.....CatId=3669
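
side note: you can sanity-check the listed memory bandwidth figures from the memory clock and bus width. This is a rough back-of-the-envelope sketch; it assumes the "Memory Clock" numbers above are already the effective (DDR) rates:

```python
# Rough sanity check of the listed memory bandwidth numbers.
# Assumes the "Memory Clock" figures are effective (DDR) rates in MHz.

def mem_bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
    """Bandwidth in GB/s = effective clock (Hz) * bus width (bytes)."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# GTX 280: 2214 MHz effective on a 512-bit bus
print(mem_bandwidth_gb_s(2214, 512))      # ~141.7 GB/s, matches the spec

# 9800 GX2: 2000 MHz effective on 256 bits *per GPU*, two GPUs
print(2 * mem_bandwidth_gb_s(2000, 256))  # 128.0 GB/s combined, matches
```

both listed figures check out, so the specs at least agree with themselves.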

Posted 5 years ago
 
jack7h3r1pp3r
Posts: 2815

i think the gtx 280 is a little better overall. the 9800 is a little faster on fill rate, but other than that the gtx 280 wins

Posted 5 years ago
 
whs
Posts: 17584

These cards look like you could heat the room with them.

Posted 5 years ago
 
ScottW
Posts: 6609

Hold up. Isn't a GX2 essentially 2 GPUs in one? The GTX is equal or better with just one GPU -- that's amazing! The single GTX may require less power, generate less heat, and require less cooling. I don't know if there are any games that can't recognize 2 GPUs, but the GTX would be the clear winner in such a situation.

Posted 5 years ago
 
raphoenix
Posts: 14920

Posted 5 years ago
 
drifta
Posts: 446

i would personally go for the GTX because its FPS is good and it has a single GPU.
like ScottW said, i don't think there are many games that recognise 2 GPUs

it is really up to you, they both seem good. i don't know the price of each, so i guess you should go with whichever is cheaper

Posted 5 years ago
 
jonhill987
Posts: 161

"I don't think there are many games that recognise 2 GPUs"

Second.

Posted 5 years ago
 
JKay
Posts: 78

Okay, so I've got it down to the GTX 260. Now, I don't know why, but I've built a couple of computers on Newegg and gone almost all out on GPUs, and now I'm torn: I can get one super NVIDIA card, water-cooled, factory OCed, the works... OR I could get two of ATI's best new card for $100 (USD) more. I've been out of the video card race for a while; last time I was shopping for one, ATI was dominating NVIDIA. I know NVIDIA has been ahead since then, but these ATI cards have really good reviews, even being compared favorably to NVIDIA... Anyway, here are the specs for the NVIDIA card:
Model
Brand BFG Tech
Model BFGEGTX260896H2OCWE
Interface
Interface PCI Express 2.0 x16
Chipset
Chipset Manufacturer NVIDIA
GPU GeForce GTX 260
Core clock 675MHz (vs. 576MHz standard)
Stream Processors 192 processing cores
Memory
Memory Clock 2326MHz (vs. 1998MHz standard)
Memory Size 896MB
Memory Interface 448-bit
Memory Type GDDR3
3D API
DirectX DirectX 10
OpenGL OpenGL 2.1
Ports
HDMI 1 via Adapter
DVI 2
TV-Out HDTV / S-Video Out
General
RAMDAC 400 MHz
Max Resolution 2560 x 1600
SLI Supported Yes
Cooler Water Cooling
System Requirements 525W PCI Express-compliant system power supply with a combined 12V current rating of 38A or more*
Two 6-pin PCI Express supplementary power connectors -or- One 6-pin PCI Express and two 4-pin peripheral supplementary power connectors
NOTE: For the power requirements of multiple GeForce GTX 280-based graphics cards in an NVIDIA SLI configuration, please visit www.bfgtech.com/slipower
Power Connector 2 x 6 Pin
Dual-Link DVI Supported Yes
HDCP Ready Yes

and the ATI:
Model
Brand DIAMOND
Model 4870PE5512
Interface
Interface PCI Express 2.0 x16
Chipset
Chipset Manufacturer ATI
GPU Radeon HD 4870
Core clock 750MHz
Stream Processors 800 Stream Processing Units
Memory
Memory Clock 1800MHz
Memory Size 512MB
Memory Interface 256-bit
Memory Type GDDR5
3D API
DirectX DirectX 10.1
OpenGL OpenGL 2.0
Ports
HDMI 1 via Adapter
DVI 2
TV-Out HDTV / S-Video Out
General
RAMDAC 400 MHz
Max Resolution 2560 x 1600
CrossFire Supported Yes
Cooler With Fan
System Requirements PCI Express based PC with one x16 lane graphics slot available on the motherboard
450 Watt or greater power supply with a 75 Watt 6-pin PCI Express power connector recommended (550 Watt and two 6-pin connectors for dual ATI CrossFireX); certified power supplies are recommended
1GB of system memory recommended; installation software requires an optical drive; DVD playback requires a DVD drive; Blu-ray / HD DVD playback requires a Blu-ray / HD DVD drive
NOTE: For a complete ATI CrossFireX system, additional ATI Radeon™ HD 4800 series graphics card(s), an ATI CrossFireX Ready motherboard, and one ATI CrossFireX Bridge Interconnect cable per board (included) are required.
Power Connector 2 x 6 Pin
Dual-Link DVI Supported Yes
HDCP Ready Yes

Anybody have experience with both of these and any advice? Thanks in advance.
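
In case it helps with PSU shopping, the power requirements in the two spec blocks can be put on a comparable footing with some simple arithmetic (no assumptions beyond the listed figures, except treating the CrossFireX delta as rough per-card headroom):

```python
# Translate the listed PSU requirements into comparable watt figures.

# BFG GTX 260: "525W supply with a combined 12V current rating of 38A or more"
nvidia_12v_amps = 38
nvidia_12v_watts = nvidia_12v_amps * 12
print(nvidia_12v_watts)  # 456 -> most of the 525W budget has to be on the 12V rail

# Diamond HD 4870: 450W recommended for one card, 550W for two in CrossFireX
single_4870_w, dual_4870_w = 450, 550
print(dual_4870_w - single_4870_w)  # 100 -> roughly 100W of extra headroom per added card
```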

Posted 5 years ago
 
JKay
Posts: 78

Oh, and another thing that isn't on there: I went with the 260 over the 280 (on the NVIDIA side) because the 260 was a single slot, and with the latest and greatest NVIDIA (790i) mobo it's a bit tight if you want to eventually go SLI and use the PCIe x1 slots.

Posted 5 years ago
 
jack7h3r1pp3r
Posts: 2815

well the nvidia card looks a bit better than the ati, but if you have two of the ati's i think that would blow away the nvidia. and if you had two of the nvidia's, then that's a whole new story :)

Posted 5 years ago
 
ScottW
Posts: 6609

JKay, it's hard to make an apples-to-apples comparison with just the specs listed. For example, all of the clock numbers -- core clock, memory clock -- are applied to different GPUs. There is more memory on the Nvidia card, 896 MB to 512 MB, but if you get two of the ATIs that will total 1 GB. There are 800 stream processors on the ATI GPU to the Nvidia's 192. Is that for real? 800 stream processors is amazing!
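
One caveat on those 800 stream processors: ATI and Nvidia count "stream processors" differently, so the raw counts aren't comparable one-to-one. A rough common yardstick is peak multiply-add throughput (2 flops per processor per clock). A sketch, with one labeled assumption: the GTX 260's shader clock isn't in the specs above, so the 1242 MHz stock value is assumed (BFG's OC part likely runs higher):

```python
# Back-of-the-envelope shader throughput: SPs * shader clock * 2 flops (MAD).
# ATI and Nvidia "stream processors" are different beasts, so this is only
# a rough common yardstick, not a benchmark.

def peak_gflops(stream_processors, shader_clock_mhz):
    # multiply-add = 2 floating-point ops per processor per clock
    return stream_processors * shader_clock_mhz * 2 / 1000

# Radeon HD 4870: 800 units running at the 750 MHz core clock
# (ATI's shaders run at core clock)
print(peak_gflops(800, 750))    # 1200.0 GFLOPS

# GTX 260: 192 cores; 1242 MHz is the stock shader clock (assumed, not listed)
print(peak_gflops(192, 1242))   # ~477 GFLOPS
```

So on paper the ATI part is way ahead per card, but on-paper flops rarely translate directly into frame rates.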

When you say you could get one factory OC'd water-cooled Nvidia, is that BFG board the one? I would find that odd since the GTX 260 is the scaled-down version of the GTX 280. If you're going to soup up a chip, why not start with the faster one? All that overclocking and special cooling will not add vertex shaders and stream processors.

Here's a suggestion. Get one ATI 4870 board and try your games on it. If it doesn't give enough oomph, add a second one.

Posted 5 years ago
 
JKay
Posts: 78

I never really thought about just getting one of the ATIs and trying it out. Okay, last question about the topic... is it a good idea to run ATI cards on an NVIDIA chipset mobo?

Posted 5 years ago
 
raphoenix
Posts: 14920

@ScottW,

GOOD POST !!

I'm still trying to figure out what the Vertex Shaders, etc., etc., nomenclature means in Gamer Video Cards.

Regards,
Rick P.

Posted 5 years ago
 
jack7h3r1pp3r
Posts: 2815

ya i don't know that one either

Posted 5 years ago
 
ScottW
Posts: 6609

JKay, since we are talking about leaving room for a dual card upgrade, you should NOT use an Nvidia motherboard chipset with ATI Radeon GPUs. The Nvidia motherboard chipset is SLI-ready, but not CrossFireX ready.

Posted 5 years ago
 
jack7h3r1pp3r
Posts: 2815

oh ya, i missed that part of it. you always seem to find the smallest things, scottw

Posted 5 years ago
 
JKay
Posts: 78

I was afraid of that... it's a shame, I really like the 790i mobo. Would anyone recommend using an AMD CPU with this? I was going to use a 2.6GHz Intel Core 2 Quad, but it seems AMD makes a 2.6GHz quad-core CPU too. Would there be an advantage to using AMD with ATI, since they are one and the same?

Posted 5 years ago
 
ScottW
Posts: 6609

@Jack, God is in the details.

@JKay, you picked your GPU first, which tells me that you are building a gaming rig. You might ask yourself if the games that you will play can benefit from a quad core CPU. Last I checked, there were few games that were multi-core aware and the conventional wisdom was that you would be better off with a dual-core CPU running at a higher clock speed. Of course, in gaming things change fast so this may no longer be true.

I'm pasting a link to an article at Tom's Hardware where they review the latest AMD chipset with an onboard Radeon GPU. They call this the first *true* AMD/ATI chip since AMD's acquisition of ATI. One benefit that AMD claims is for Hybrid CrossFire. This allows the chipset GPU to assist the PCI-E card GPU. I have not seen any reviews of Hybrid CrossFire, but it is a neat concept on paper. Even with only one graphics adapter, you could still be running a dual GPU configuration. Here's the chipset review:
http://www.tomshardware.com/re.....,1785.html

Posted 5 years ago
 
ScottW
Posts: 6609

Doh! I have seen a review that touches on Hybrid CrossFire -- it's the one I just posted! See page 13 of the 780G chipset review at Tom's for benchmark results with Hybrid CrossFire.

Posted 5 years ago
 



Topic Closed

This topic has been closed to new replies.
