Last but not least, Company of Heroes: Tales of Valor, where the gap is merely 3.5%. The gap opens to 5% with 4xAA & 16xAF.
You can see from the results that both cards perform at a similar level, with the GeForce GTX260+ edging ahead by a small margin in most games. Given that both cards sell at the same price, the Nvidia card definitely makes the better choice.
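The gap and drop percentages quoted throughout this series are plain relative differences; a minimal sketch, with illustrative frame rates rather than the measured ones:

```python
def lead_percent(fps_a, fps_b):
    """Percentage lead of card A over card B (positive = A is faster)."""
    return (fps_a - fps_b) / fps_b * 100.0

def drop_percent(fps_before, fps_after):
    """Percentage frame-rate drop after enabling AA/AF."""
    return (fps_before - fps_after) / fps_before * 100.0

# Illustrative numbers only, not from the benchmark runs:
print(round(lead_percent(62.1, 60.0), 1))   # a ~3.5% lead
print(round(drop_percent(71.0, 62.06), 1))  # a ~12.6% drop
```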
Part 1: Introduction & Specification
Part 2: Test Setup
Part 3: 3Dmark & Call of Duty 5
Part 4: NSF:Undercover & FarCry2
Part 5: COH: Tales of Valor
10 May 2009
GeForce GTX260+ vs Radeon HD4890 Part5
GeForce GTX260+ vs Radeon HD4890 Part4
In Need For Speed: Undercover, again there is no significant difference between the two. With AA enabled, the GTX260+ almost achieved what is known as "free AA", while the HD4890 suffered a 12.6% drop in frame rate.
Moving on to the first DX10 game benchmark, FarCry2, the HD4890 performs much better than its counterpart. The gap is maintained after switching on AA & AF.
GeForce GTX260+ vs Radeon HD4890 Part3
Benchmark Result
In 3DMark Vantage (Performance), the GTX260+ leads by 1717 points, or 16%, thanks to its support for PhysX technology.
3DMark06, running at its default 1280x1024, shows that both cards are on par.
Moving on to Call of Duty 5 at 1920x1200 with 0xAA 0xAF, the GTX260+ is merely 2fps faster: no significant difference between the two.
The GTX260+, however, suffers a drastic drop in performance when AA & AF are turned on.
GeForce GTX260+ vs Radeon HD4890 Part2
Test Setup
Processor: Intel Core 2 Quad Q9650 @ 3.6GHz (400x9)
Memory: Corsair DDR2-1066 2x1GB
Motherboard: Asus P45
Harddisk: Seagate Barracuda 320GB 7200.10
Power Supply: GreatWall 650W
Display: Hanns.G 28" (1920x1200)
Graphics:
Gainward GeForce GTX260+ 55nm 896MB Golden Sample (625/1348/2200)
Radeon HD 4890 GDDR5 1GB (850/3900)
Driver:
Forceware
O/S: Windows Vista Ultimate SP1
Chipset: Intel P45 Vista 9.0.0.1008 WHQL
Framerate: FRAPS 2.9.8
GeForce GTX260+ vs Radeon HD4890 Part1
AMD turns up the heat with an improved core, and thus the birth of its flagship Radeon HD4890. As part of its pricing strategy, ATI announced that the HD4890 will retail for under $260, which is also the price point of an overclocked GTX260+. Today, we will find out which card offers the best bang for the buck.
Sample GPU-Z screenshot:
GeForce GTX260+ Core 216 with a 55nm core
Specification Comparison

| Product | GeForce GTX 260+ | Radeon HD4890 |
|---|---|---|
| Core Code | GT200 | RV790 |
| Process | 55nm | 55nm |
| Transistors | 1400 million | 959 million |
| Core Clock | 625MHz | 850MHz |
| Shader Clock | 1348MHz | 850MHz |
| Shader Processors | 216 | 800 |
| ROP | 28 | 16 |
| TMU | 72 | 40 |
| Memory Clock | 2200MHz | 3900MHz |
| Memory Bus Width | 448-bit | 256-bit |
| Memory Size/Type | 896MB GDDR3 | 1GB GDDR5 |
15 March 2009
FEAR 2: Project Origin - GPU Graphics Performance
Test Platform
Intel Core 2 Quad Q8200 @ 2.33GHz
Gigabyte X48
Apacer DDR2-1066 5-5-5-15 2x1GB
Seagate 7200.10 SATA 500Gb
Nvidia GeForce 9600GSO (650/1400Mhz)
Nvidia GeForce 9600GT (650/1800MHz)
Nvidia GeForce 9800GT (600/1800MHz)
Nvidia GeForce 9800GTX+ (738/2200MHz)
Nvidia GeForce GTX260 (576/1998MHz)
AMD Radeon HD4670 (750/2000MHz)
AMD Radeon HD4830 (575/1800MHz)
AMD Radeon HD4850 (625/2000MHz)
AMD Radeon HD4870 (750/3600MHz)
Windows Vista Ultimate SP1
Forceware 182.06 / Catalyst 9.2

1440x900
There is no option for GTX260 to run the game at 1440x900.

1680x1050 4xAA
Project Origin seems to be less demanding on the system than previous releases in the series; even lower-end graphics cards like the HD4670 and 9600GSO are able to handle the game at maximum graphics detail, scoring above 50fps.

1920x1200 4xAA
The HD4870 512MB dominates this game at every resolution. On the mainstream side, the HD4670 512MB & 9600GSO 256MB manage to churn out playable frame rates, even at this high resolution with anti-aliasing turned on.
27 January 2009
GeForce 8800GT vs Radeon HD4870 1GB - FarCry2

Both cards are benchmarked with the exact same settings; the average result is taken over 3 loops of the small ranch run.
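The three-loop average is a plain arithmetic mean; a minimal sketch, using hypothetical per-loop FRAPS averages rather than the measured ones:

```python
def average_fps(per_loop_averages):
    """Mean frame rate across repeated benchmark loops."""
    return sum(per_loop_averages) / len(per_loop_averages)

loops = [52.3, 51.8, 52.6]  # hypothetical per-loop FRAPS averages
print(round(average_fps(loops), 1))
```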
Get ready for some insane frame rates!

8800GT 512MB OC (713/1728/2000)

HD4870 1GB OC (790/3800)
WHAT?! HD4870 1GB is only 1.4fps faster than 8800GT?!
Test System
Processor: Intel Core 2 Quad Q6600 @ 3.2GHz (355x9)
Motherboard: ASUS P5B-E Plus
Memory: 2x1GB Kingston @ DDR2-1066 5-5-5-18
Storage: Hitachi 160GB 7200 SATA2
Power Supply: Coolermaster iGreen 500W
Display: BenQ FP202W 1680x1050 8ms
Graphics:
MSI NX8800GT 512MB OC (T2D512E) @ 713/1728/2000
MSI R4870 1GB OC (T2D1G) @ 790/3800
Windows Vista Ultimate SP1 32-bit DX10
Forceware 180.48 WHQL
Catalyst 8.12
FarCry 2 v1.0
After installing Catalyst 8.12 Hotfix...

the frame rates shoot through the roof!!!
Problem solved.
This graph gives a clearer picture.

Data table of the test results...
Reviews from other sites on FarCry 2 graphics performance:
Guru3D
Techspot
08 November 2008
Radeon HD3850 X2 + GeForce 8600GT
PhysX is the "in" thing in 3D PC gaming these days. Nvidia added PhysX support to all GeForce 8 series & above cards, while ATI has had trouble coming up with a similar feature in the same field. So what can ATI fans do for now? One option is to add a mid-range GeForce card to an existing CrossFire platform, which is what we will cover today. The following is an extract from the original article.

The test will be based on the Colorful C.A790GX X3. As the product name suggests, the board has 3 PCI-E 16x slots.

The PCI Express slots.
The blue one is a fully compatible PCI-E 2.0 x16 slot. On closer observation, you can see that the black slots are missing half of their pins; that is because they are wired for x8 and x4 speeds respectively.

The jumpers need to be set to the 2-3 configuration in order to enable dual-card CrossFire mode. In CrossFire mode, the first two PCI-E slots (blue and black) run at x8 + x8.
Test System Specifications

CPU: AMD Phenom X4 9600 2.3GHz
Mainboard: Colorful C.A790GX X3 (AMD 790GX + SB750)
Memory: OCZ DDR2-1150 2x1GB
Storage: Seagate 7200.10 160GB
Graphics:
Colorful Radeon HD3850-GD3 (x2)
Colorful GeForce 8600GT-GD3
Power: Thermaltake Toughpower 1200W
O/S: Windows XP Pro SP3
DirectX 9.0c 2008-3
AMD Chipset 8.9
ATI Catalyst 8.10
ForceWare 178.24 WHQL
Baseline Benchmark:

8660 in 3DMark06.
No CrossFire-SLI miracle here: the score matches that of a pair of HD3850s running at x8 + x8. As the score suggests, the benchmark gained no advantage from the additional 8600GT.

MKZ without PhysX acceleration

MKZ with PhysX acceleration from the 8600GT. The average framerate shows a significant improvement of around 67%.

Nurien, without PhysX acceleration.

Nurien, with PhysX acceleration, receives a big boost with the help of the 8600GT.
27 October 2008
GeForce GTX260 vs GTX260+ vs Radeon HD4870

Initially, the GT200-based GeForce GTX280 and GTX260 were priced so high that the price/performance ratio made no sense: a huge premium for a small performance gain over the Radeon HD4800 series. To boost sales, Nvidia cut the GeForce GTX280 price by 62% and the GTX260 by 33%, placing the latter at the same price point as the Radeon HD4870.
Things got interesting when Nvidia launched a new GTX260 version known as the GeForce GTX 260+ Core 216, which carries 216 shader processors compared to the original GTX260's 192 SP. Let's not waste any more time on technical details; time to watch the triple-threat match of GeForce GTX260 192SP vs GTX260+ 216SP vs Radeon HD 4870!
Test Platform

| Component | Spec |
|---|---|
| CPU | Intel Core 2 Extreme QX6850 |
| Motherboard | GIGABYTE GA-X48T-DQ6 |
| Memory | A-DATA DDR3-1066 Extreme CL7 2x1GB |
| Graphics | NVIDIA GeForce GTX 260 192SP 896MB |
| | Inno3D GeForce GTX 260 GOLD 216SP 896MB |
| | ATI Radeon HD 4870 512MB |
| Power | GIGABYTE ODIN GT 1200W |
| OS | Windows Vista SP1 |
| Driver | Forceware 177.43 |
| | ATI Catalyst 8.8 |
Benchmark Result

| Test | GeForce GTX 260+ 216SP 896MB | Radeon HD 4870 800SP 512MB | GeForce GTX 260 192SP 896MB |
|---|---|---|---|
| 3DMark 05 | | | |
| 16 x 10 | 17487 | 18840 | 17335 |
| 19 x 12 | 16975 | 18367 | 16712 |
| 16 x 10 8AA | 15415 | 17328 | 15281 |
| 19 x 12 8AA | 14299 | 16238 | 14025 |
| 3DMark 06 | | | |
| 16 x 10 | 14260 | 14125 | 13948 |
| 19 x 12 | 13246 | 13183 | 12844 |
| 16 x 10 8AA | 9448 | 10606 | 9339 |
| 19 x 12 8AA | 8755 | 9704 | 8607 |
| 3DMark Vantage | | | |
| High (GPU) | 5726 | 5133 | 5321 |
| Extreme (GPU) | 4030 | 3604 | 3712 |
| Company of Heroes (DX10, High) | | | |
| 16 x 10 | 57.1 | 56.1 | 55.9 |
| 19 x 12 | 54.5 | 52.9 | 52.2 |
| 16 x 10 8AA | 45.3 | 52.4 | 43.1 |
| 19 x 12 8AA | 38.9 | 48.5 | 36.1 |
| Crysis (DX10, High) | | | |
| 16 x 10 | 41.46 | 39.46 | 39.21 |
| 19 x 12 | 35.06 | 33.02 | 32.81 |
| 16 x 10 8AA | 30.54 | 35.54 | 29.25 |
| 19 x 12 8AA | 24.33 | 29.38 | 23.33 |
| PT Boat (DX10, High) | | | |
| 16 x 10 | 56.9 | 46.2 | 55.6 |
| 19 x 12 | 54.3 | 43.9 | 52.8 |
| 16 x 10 8AA | 40.3 | 23.7 * | 39.1 |
| 19 x 12 8AA | 34.4 | 16.5 * | 33.2 |
| Lost Planet (DX10, High) | | | |
| 16 x 10 | 87.5 | 51.4 | 82.2 |
| 19 x 12 | 71.1 | 42.8 | 68.1 |
| 16 x 10 8AA | 61.4 | 48.7 | 58.7 |
| 19 x 12 8AA | 49.7 | 40.9 | 48.1 |
On average, the new GTX260 with 216SP outperforms its 192SP sibling by around 5%.
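One way to arrive at an "around 5%" average is the geometric mean of the per-test ratios; a sketch using three no-AA 16 x 10 rows from the table above (GTX 260+ vs GTX 260):

```python
import math

# (GTX 260+ 216SP, GTX 260 192SP) pairs from the 16 x 10 rows above
pairs = [(57.1, 55.9),    # Company of Heroes
         (41.46, 39.21),  # Crysis
         (87.5, 82.2)]    # Lost Planet

ratios = [a / b for a, b in pairs]
geomean = math.prod(ratios) ** (1 / len(ratios))  # Python 3.8+
print(f"average advantage: {(geomean - 1) * 100:.1f}%")
```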
Source: http://game.ali213.net/thread-2275669-1-1.html
26 October 2008
Radeon HD 4830 512MB Review Part4
CoH, WIC, FEAR, UT3

Maximum quality; shader mode set to DX10.
The HD4830 struggles to keep up at 1440x900; otherwise the cards are pretty much on par at the other resolutions.

Very high settings on DX10 mode, tested using the in-game benchmark tool.

Maximum quality, again tested with in-game benchmark tool.
Clear victory for the RV770LE based card.

Maximum quality with V-sync disabled.
The result looks rather inconclusive. But overall, taking 20 of the 29 benchmarks, the winner in terms of raw performance is without doubt the Radeon HD4830.
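A tally like "20 of 29 benchmarks" can be reproduced mechanically from the per-test results; a minimal sketch, with hypothetical scores rather than the measured data:

```python
def tally_wins(results):
    """Count the benchmarks where card A outscores card B.

    results: list of (score_a, score_b) pairs, one per benchmark run.
    """
    return sum(1 for a, b in results if a > b)

# Hypothetical (card_a, card_b) scores, not the measured data:
scores = [(52.0, 49.5), (61.3, 63.0), (44.1, 40.0)]
print(tally_wins(scores), "of", len(scores))
```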
Part1: Introduction & Specifications
Part2: Test System, 3Dmark06 & Vantage
Part3: CoD4, Assassin's Creed, HL2:EP2
Part4: CoH, WIC, FEAR, UT3
Source: http://diy.pconline.com.cn/graphics/reviews/0810/1452670.html
Radeon HD 4830 512MB Review Part3
CoD4, Assassin's Creed, HL2:EP2

Maximum quality, V-sync off.
Close fight between HD4830 & 9800GT.

A clear win for the Radeons. The gap is more obvious at low resolution.

Again, everything at maximum quality.
This time round the gap is wider at higher resolutions, an interesting phenomenon. On average, the HD4830 is 5.2% faster than the 9800GT here.
Radeon HD 4830 512MB Review Part2
Test System, 3Dmark06 & Vantage
Test Platform
CPU: Intel Core 2 Extreme QX9770 @ 3.6GHz
Mainboard: Asus X48
Memory: Aeneon DDR2-1066 2x1GB @ 5-5-5-15
Storage: Seagate 7200.10 500GB SATA
Graphics:
Radeon HD4850 512MB (625/1986)
Sapphire Radeon HD4830 512MB (575/1800)
GeForce 9800GT 512MB (600/1500/1800)
OS: Windows Vista Ultimate SP1 DX10
Drivers: Catalyst 8.54 RC1/ Forceware 178.13 Vista WHQL

ATI cards are traditionally weak in 3Dmark06, so no big surprise here.

3Dmark Vantage tells a very different story, so it all goes down to the real world game tests.
Radeon HD 4830 512MB Review Part1
Just another card from AMD to fill the price-point vacuum between the Radeon HD 4670 & HD 4850, as well as to battle Nvidia's GeForce 9800 GT. This card is based on the RV770LE core, which has one cluster of shader processors (usually defective ones) disabled, bringing the magic number to 640, whereas the fully functional RV770 (HD4850/4870) has all 800.
Interesting to note: AMD has confirmed that disabling the cluster is a hardware cut, so chances are it is impossible for anyone to "revive" the card with a modded BIOS. Leaving aside the controversy surrounding its "missing shaders", let's take a look at this extracted review:
| Graphics card | HD 4830 512MB (RV770LE) | HD 4850 512MB (RV770) | HD 3870 512MB (RV670) | 9800 GT 512MB (G92) |
|---|---|---|---|---|
| Core clock | 575MHz | 625MHz | 775MHz | 600MHz |
| Shader clock | 575MHz | 625MHz | 775MHz | 1,500MHz |
| Memory clock | 1,800MHz | 2,000MHz | 2,250MHz | 1,800MHz |
| Bus width, size, type | 256-bit, 512MB, GDDR3 | 256-bit, 512MB, GDDR3 | 256-bit, 512MB, GDDR4 | 256-bit, 512MB, GDDR3 |
| Memory bandwidth | 57.6GB/s | 64GB/s | 72.8GB/s | 57.6GB/s |
| Process | 55nm | 55nm | 55nm | 65nm/55nm |
| Transistor count | 956mil | 956mil | 666mil | 754mil |
| Die size | 260mm² | 260mm² | 192mm² | 296mm²/230mm² |
| Peak GFLOPS | 736 | 1,000 | 496 | 504 |
| ROPs | 12 | 16 | 16 | 16 |
| TDP | 110W | 110W | 105W | 105W |
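The derived rows of the table (memory bandwidth, peak GFLOPS) follow from the clock and width columns by standard formulas; a minimal sketch, assuming 2 FLOPs per shader per clock for the Radeons and 3 for the G92 (whose 112 shaders are not listed above):

```python
def bandwidth_gbps(effective_mem_mhz, bus_width_bits):
    """Memory bandwidth in GB/s: effective clock (MHz) x bus width (bytes)."""
    return effective_mem_mhz * (bus_width_bits // 8) / 1000.0

def peak_gflops(shader_count, shader_mhz, ops_per_clock):
    """Peak single-precision GFLOPS: shaders x shader clock x FLOPs/clock."""
    return shader_count * shader_mhz * ops_per_clock / 1000.0

# Checks against the table above:
print(bandwidth_gbps(1800, 256))  # HD 4830: 57.6 GB/s
print(peak_gflops(640, 575, 2))   # HD 4830: 736.0 GFLOPS
print(peak_gflops(112, 1500, 3))  # 9800 GT: 504.0 GFLOPS
```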