Digital-Daily.com

nVidia GeForce FX5200 (NV34) Video Card Review

Date: 29/05/2003


The NV30 was announced quite a while ago, but video cards based on the nVidia GeForce FX 5800 (NV30) chip are priced beyond the reach of many, or the user is simply unwilling to pay extra for speed that is not badly needed. Usually, after the release of a flagship model, a cut-down version follows, which lets the manufacturer cover all market sectors.

As of May 2003, nVidia produces three chips based on the FX architecture; to be more precise, FX 5600 video cards will hit mass retail only next month - for now, stocks are still being built up. To be absolutely honest, and as we predicted, the NV30 will never reach the mainstream market: it is aimed solely at the professional Quadro product line, where price does not play the decisive part it does on the consumer market. In the roadmaps of all video card manufacturers, the NV30 has been replaced with the NV35. Nevertheless, we did include the NV30 in our comparison table... Much water will have flowed by the time the NV35 reaches real-life retail.

Comparison data for the FX family video cards:

Sector                      High-End              Middle                Low-End
Chip                        NV30 / NV30 Ultra     NV31 / NV31 Ultra     NV34 / NV34 Ultra
Name                        FX 5800               FX 5600               FX 5200
Process technology, micron  0.13                  0.13                  0.15
Transistors, mln            125                   75                    45
Pixel pipelines             4                     2                     2
Texture units               8                     4                     4
Core speed, MHz             400 / 500             300 / 350             250 / 325
Memory speed (DDR), MHz     800 / 1000            500 / 700             400 / 650
Memory bus, bit             128 (DDR II)          128 (DDR)             128 (DDR)
RAMDAC, MHz                 2 x 400               2 x 400               2 x 400
Additional power            Mandatory             Advisable             Optional
Notes: Recommended speeds of the core and memory for NV31 and NV34 chips have been revised several times.

Today we are presenting benchmarks of nVidia GeForce FX 5200 (NV34) video cards; in the near future we will follow up with an article on the mid-range GeForce FX 5600 (NV31) chip.

Distinctions of the NV34 Chip

To date, nVidia releases a line of three GPUs of the FX family, one for each of the three market sectors, with each chip produced in two variants, regular and Ultra:

  • 5800 (Ultra) - NV30. High-End, a replacement for Ti4800
  • 5600 (Ultra) - NV31. Middle class, a replacement for Ti4200
  • 5200 (Ultra) - NV34. Low-end, a replacement for MX440

As we can see, nVidia has given up using the MX label for its low-end products. You can now forget about the MX...

Making the chip and video card cheaper comes at a price. Here is a list of the main distinctions between the high-end NV30 and the NV34, i.e. what is missing in the NV34:

  • 0.13-micron process technology - allows placing a greater number of transistors per chip and raising the clock speed of the core. The FX 5200 series uses the 0.15-micron process.
  • Intellisample technology - a new FSAA technology that reduces "jagged" image outlines at least 50% better than before. It also allows tuning the color spectrum to account for differences in light and color perception and the way they are reproduced on the monitor. Besides, this technology uses improved anisotropic filtering, which reduces texture distortion by introducing dynamic corrections to the image. The FX 5200 does not offer the Z-compression or color compression inherent in this technology - the chip is simply not powerful enough to implement them.
  • 8 pixel pipelines - allow outputting up to 8 pixels per clock. In our case (the 5200) there are only 4.
  • 400 MHz RAMDAC - in the 5200 series, the video DAC runs at 350 MHz.
  • DDR II memory - instead of cutting-edge DDR II, the FX 5200 uses regular DDR.
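The pixel-pipeline cut translates directly into theoretical fill rate, which is simply pipelines multiplied by core clock. A minimal sketch, using the pixels-per-clock and clock figures quoted above (note that the NV34's real pipeline arrangement is disputed later in this article, so treat these as nominal upper bounds, not measured values):

```python
# Theoretical peak fill rate: pixels output per clock x core clock.
# Figures below are the ones quoted in this article; they are nominal,
# not measured.

def fill_rate_mpix(pixels_per_clock: int, core_mhz: int) -> int:
    """Peak fill rate in megapixels per second."""
    return pixels_per_clock * core_mhz

cards = {
    "FX 5800 (NV30), 8 pixels/clock @ 400 MHz": (8, 400),
    "FX 5200 (NV34), 4 pixels/clock @ 250 MHz": (4, 250),
}
for name, (ppc, mhz) in cards.items():
    print(f"{name}: {fill_rate_mpix(ppc, mhz)} Mpix/s")
```

By this naive estimate the FX 5800 has over three times the FX 5200's fill rate before memory bandwidth is even considered.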

The nVidia GeForce FX 5200 Chip

The chip contains 45 million transistors and is built on the 0.15-micron process technology.


The GeForce FX 5200 offers 2 pixel pipelines and 4 texture units. But all this is relative, because these days you can no longer judge a video card simply by its number of pipelines of one type or the other: the driver configures the chip's operation for each particular scene of a computer game.

All in all, the 3D capabilities of the NV34 do not differ from those of the NV30/31. The NV34 supports the DirectX 9 API, and thus shaders 2.0 and 2.0+. There are differences, of course: the GeForce FX 5200 chip lacks the IntelliSample optimizations.

The memory interface in the GeForce FX 5200 also differs from that of its higher-end sibling, the NV30. The chip uses a standard DDR memory controller, which in theory results in an essential performance drop, especially with anisotropic filtering and FSAA enabled.
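The gap is easy to quantify: peak memory bandwidth is bus width (in bytes) times the effective DDR transfer rate. A small sketch using the 128-bit bus and effective memory speeds from the comparison table above:

```python
# Peak memory bandwidth = (bus width in bytes) x effective transfer rate.
# bus_bits / 8 gives bytes per transfer; effective_mhz is the DDR
# (double data rate) figure, i.e. MT/s; dividing by 1000 converts
# MB/s to GB/s.

def bandwidth_gb_s(bus_bits: int, effective_mhz: int) -> float:
    return bus_bits / 8 * effective_mhz / 1000

print(bandwidth_gb_s(128, 800))  # FX 5800: 128-bit DDR II at 800 MHz effective
print(bandwidth_gb_s(128, 400))  # FX 5200: 128-bit DDR at 400 MHz effective
```

That works out to 12.8 GB/s for the FX 5800 versus 6.4 GB/s for the FX 5200, half the bandwidth to feed the FSAA and anisotropic filtering workloads mentioned above.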

GeForce FX 5200 chips do not support HDTV, but they do boast an integrated TV encoder, a TMDS transmitter and two integrated 350 MHz RAMDACs. These days, however, that is unlikely to surprise anyone.

Over its long development period and short life, the GeForce FX 5200's "recommended" core and memory clock speeds have changed at least 10 times. The problem is that the first tests of pre-production samples demonstrated performance so astonishingly low that launching video cards in such condition onto the market was out of the question and would even have been fatal. We will refrain from quoting those first raw results - that simply would not be fair to nVidia - but I can assure you they were much lower than those of the MX440. The cause lay primarily in buggy Detonator drivers not yet adapted to the entirely new chip architecture, and in the really low clock speeds of the early chips.

Gradually things evened out: the standard recommended clock speeds were raised, every new driver revision streamlined the cards' performance, and the yield ratio bit by bit approached reasonable norms. In the comparison table at the beginning of this article we quoted the "original" core and memory clock speeds, but they may prove different in reality, as you can see for yourself by buying a card in the shop closest to you. Video card manufacturers are not shy about varying these values within wide ranges...

The new chips' direct competitors are the Radeon 9000 and 9000 Pro, which will soon be replaced with the Radeon 9200 and 9200 PRO.

Daytona GEF FX5200

Video cards from this manufacturer have always stood out for their low price and middling build quality.



The video card has an AGP 2x/4x/8x interface. The layout is nonstandard, which is not strange at all, since Daytona video cards are known for their specific layout and design. The cooling is standard and passive: a needle-shaped mid-sized radiator.


The efficiency of such cooling is rather disputable, since the video card heats up immensely as it is; for overclocking, it is desirable to replace the passive radiator with a more suitable active cooler. The memory chips are not covered with radiators.


The video card is equipped with 128 MB of memory with 6 ns access time. The memory, made by PMI, is marked HP58C2128164SAT-6. That explains the missing memory cooling: slow 6 ns memory does not heat up much.

The core and memory clock speeds of the Daytona GEF FX5200 are 250 MHz and 150 (300 DDR) MHz, respectively. Note that the memory frequency on this board is lower than nVidia's latest recommendation of 200 (400 DDR) MHz.
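That underclocked memory costs the board a fixed share of its bandwidth. A quick sketch with the same bus-width-times-effective-rate formula, using the figures above (128-bit bus assumed per the comparison table):

```python
# Peak bandwidth = (bus width in bytes) x effective DDR rate, in GB/s.
def bandwidth_gb_s(bus_bits: int, effective_mhz: int) -> float:
    return bus_bits / 8 * effective_mhz / 1000

daytona   = bandwidth_gb_s(128, 300)  # board as shipped: 150 (300 DDR) MHz
reference = bandwidth_gb_s(128, 400)  # nVidia recommendation: 200 (400 DDR) MHz
deficit   = 1 - daytona / reference   # fraction of bandwidth given up

print(f"{daytona} GB/s vs {reference} GB/s: {deficit:.0%} less bandwidth")
```

So the Daytona board gives up a quarter of the chip's already modest memory bandwidth out of the box, which is worth keeping in mind when reading the benchmark results below.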


The card has a standard set of outputs: analog (D-Sub), digital (DVI) and TV-Out. The TV-Out is implemented by the GeForce FX 5200 chip itself, since the TV-Out features are already integrated in it.


The card sometimes ships in a box, but the bundle is the same as the OEM version: the Daytona GEF FX5200 card plus a driver CD.

Benchmarking Results

Unreal Tournament 2003



The FX 5200 lags well behind both its direct competitor, the ATI Radeon 9000, and its predecessor, the GeForce 4 MX480. The ATI Radeon 9000 and nVidia GeForce 4 MX480 demonstrated about the same results.



In the botmatch test the picture is the same, but the gap between the nVidia GeForce FX and the ATI Radeon 9000 / nVidia GeForce 4 MX480 has shrunk, although it is still not small: 4-8 fps on average.


Serious Sam The Second Encounter



The nVidia GeForce FX lags behind the ATI Radeon 9000 and is well behind the nVidia GeForce 4 MX480. In this test the nVidia GeForce 4 MX480 wins, taking the lead over its competitor, the ATI Radeon 9000, by a fairly wide margin.


3D Mark 2001 SE



nVidia GeForce FX lags behind ATI Radeon 9000 and loses to nVidia GeForce 4 MX480.


3D Mark 2001 Detailed Results

Gaming test 4 - Nature


The benchmark results for the nVidia GeForce FX are practically the same as those for the ATI Radeon 9000. The nVidia GeForce 4 MX480 failed to pass the test since it does not support pixel shaders.

Fill rate



In the fillrate tests the nVidia GeForce FX 5200 is the evident loser, which made us doubt whether FX 5200 video cards are really built on a 2x2 pixel pipeline design.

High polygon rate - 8 illumination sources



Again the results are not in favor of nVidia GeForce FX 5200.

Vertex shader speed


Here the nVidia GeForce 4 MX480 is again the leader; the ATI Radeon 9000 comes second, and the nVidia GeForce FX 5200 third.

Pixel shader speed



In pixel shaders, the nVidia GeForce FX 5200 does much better.

Advanced pixel shader speed


The results of this test are messy: the nVidia GeForce FX 5200 lags well behind its competitor, the ATI Radeon 9000. The thing is, video cards optimized for DirectX 9 support pixel shaders v1.4, and if a card does not support v1.4, the test falls back to pixel shaders v1.1, which require more passes. It is still unclear whether the drivers or the 3DMark package itself caused such results.


Codecreatures



This benchmark uses DirectX 8.1-generation pixel shaders. The nVidia GeForce FX shows better DirectX 8.1 performance than the ATI Radeon 9000 does. The gap is not dramatic, but it is there. The nVidia GeForce 4 MX480 does not pass this test at all, since it does not support pixel shaders.


Codecreatures - average polygon rate



The test results show the average number of polygons handled per second.


Image quality





The nVidia GeForce FX 5200 slightly lags behind the nVidia GeForce 4 MX480; things would look much better if the GeForce FX 5200 did not lack the IntelliSample engine. We were unable to obtain results for the ATI Radeon 9000 because of problems between ATI's drivers and Unreal Tournament 2003.


Findings

The GeForce FX 5200 left a mixed impression. On the one hand, its performance is about the same as the GeForce4 MX, and sometimes much lower than that of the competition and the predecessor.

On the other hand, there is its low price of about $90 (though the price deserves to be cut further, since cards based on this graphics chip can compete only with the GeForce4 MX and RADEON 9000), good DirectX 9 support and decent pixel shader handling, albeit not without flaws: the Advanced Pixel Shader results point to that, but those are more likely driver problems and will soon be fixed.

All in all, the FX 5200 is essentially the old MX with improved functionality, namely DX9 support. But who will buy a weak video card for still non-existent demanding DX9 games, only to watch a slide show in the end? Gamers definitely won't want it - they need more serious solutions - while everyone else feels no real need for DX9...

Copyright © 2005 Digital-Daily. All Rights Reserved.