Xabre 400 Roundup - August 2002
Date: 14/08/2002
Introduction

Watching the video card industry, we tend to notice only the stars of the first magnitude, and that shapes the consumer's mental map of graphics chip makers. Nobody would argue that the GeForce4 Ti4600 and the Radeon 8500 are the top performers from nVidia and ATi; these names have become fixed in gamers' minds as standards of speed, reliability and trouble-free hardware. But by partitioning the video card market into nVidia and ATi domains, we sometimes forget other manufacturers, less famous but no less ambitious than the two industry giants. Trident and SiS clearly cannot compete with nVidia and ATi in high-performance, high-end cards, so they target the low-end market, supplying OEM components for new office computers or cards for buyers on a tight budget who care little about gaming performance. Where price/performance is the key factor, any company with enough experience in video card manufacturing can press ATi and nVidia and carve out a niche. How deep that niche turns out to be depends largely on the product that represents the company.

Traditionally, video cards based on SiS chipsets have earned little credit. SiS was late in switching to 2D/3D accelerators with a strong 3D bias, but the company is determined to regain its position and will not step back. While the SiS 305 could not match its competitors in any parameter, the SiS 315 ultimately proved on par with low-end nVidia GeForce2 MX200 cards; had card makers not overlooked it at the time, it would not have passed unnoticed. Today SiS is producing a new line of graphics chips under a new name, Xabre, and intends to promote it in all three segments of the video card market: Value, Mainstream and Performance.
The performance-class Xabre400 will be released first; SiS will then announce a slower "mainstream" Xabre200 and the Xabre80 for low-end cards. There are also plans to launch the Xabre600 (an overclocked version of the Xabre400) and Xabre II, a brand new graphics chip. The main competitors for the Xabre400 are the GeForce4 MX440/460, GeForce3, GeForce4 Ti4200 and Radeon 8500LE. Let's take a look at how the Xabre family ranks among these rivals.
On paper, the Xabre400 is quite powerful and even superior to its competitors in some respects. The chip lacks many of the flaws typical of the GeForce4 MX line: pixel shaders and Environment Mapped Bump Mapping are supported, and although neither Xabre nor GeForce4 MX has hardware vertex shaders, they can be emulated on the CPU. Moreover, Xabre is the first chip to support AGP 8x, a standard that is gradually taking over. Frankly, the speed difference between AGP 4x and AGP 8x is practically invisible to the naked eye, and very few applications today can take advantage of the new standard. Worse, on VIA motherboards the Xabre can run at AGP 2x only, so it remains to be seen which is better: AGP 8x with dubious AGP 4x support, or true AGP 4x on any motherboard.
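The AGP modes mentioned above differ only in peak bandwidth: a 66 MHz bus clock moving 4 bytes per transfer, multiplied by the 1x/2x/4x/8x signalling mode. A back-of-the-envelope sketch of that arithmetic (plain Python, nothing card-specific):

```python
def agp_bandwidth_mb_s(multiplier):
    """Peak AGP throughput: a 66.66 MHz base clock times 4 bytes per
    transfer times the signalling multiplier (1x/2x/4x/8x)."""
    return 66.66 * 4 * multiplier  # MB/s, taking 1 MB = 10^6 bytes

for mult in (1, 2, 4, 8):
    print(f"AGP {mult}x: ~{agp_bandwidth_mb_s(mult):.0f} MB/s")
```

So AGP 8x doubles the roughly 1 GB/s peak of AGP 4x to about 2.1 GB/s; as noted above, hardly any application in 2002 comes close to saturating either.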
Interestingly, from the SiS 315 the Xabre inherited joint operation with the SiS301 companion chip, which handles TV-out and output to a second monitor. The SiS301 has a very weak RAMDAC, clocked at just 135 MHz. That was acceptable in GeForce2 MX times, but hard to justify today, when RAMDACs of 350 MHz and higher are integrated into graphics chips. On the other hand, the various SiS301 variants let manufacturers build cards with a DVI interface, a 3D-glasses output or a TV-in socket. We will compare the various Xabre400-based cards later; first, let's look into the chip itself.

Drivers

Drivers have always been a bottleneck for video card manufacturers. It is hard to count how many excellent features implemented in chips have gone unused because of poor drivers, or how many chips have failed to reveal their potential because programmers could not write quality software for them. While ATi and nVidia, after years of hard work, have finally produced solid, nearly complaint-free drivers, other manufacturers such as SiS, STM, Trident and even Matrox still shudder at the thought of driver development. Let's see how SiS handled the drivers for Xabre. We used the most recent version, 3.03.51.

Compared to nVidia's Detonator or ATi's Catalyst, the Xabre drivers look poor. They lack basic 3D settings such as filtering, vsync and anti-aliasing. Using the TV-out is somewhat confusing, since the corresponding menu item is enabled only after connecting the cable and rebooting the PC.

[Image: Color calibration menu]
[Image: Video player menu]
[Image: Monitor settings]

The drivers do ship with the 3DWizard utility, which makes it possible to tweak anti-aliasing, overclock the card, render a stereoscopic image, and even cheat by making walls in a game transparent. Incidentally, the SiS 315 also allowed looking through walls, and no backlash ever came of it.
SiS-based video cards are not widespread enough to cause mass cheating in Quake III matches over the Internet, so you are very unlikely to run into a player with a Xabre card; the SiS 315 is simply not powerful enough, and we shall see what comes of the Xabre.

TV-Out and DualHead

Xabre400 cards use the SiS301 codec chip, which has been around for some time in notebooks. It includes an NTSC/PAL TV encoder with Macrovision 7.1.L1, a TMDS transmitter with bilinear scaling for LCD monitors, and an RGB port for conventional VGA monitors. The SiS301 supports 24-bit, 18-bit and 12-bit LCD panels at a maximum resolution of 1280x1024@60 Hz. Some companies, such as Triplex, ship their cards with DVI-to-VGA adapters so that a second CRT monitor can be connected.

The slow 135 MHz RAMDAC substantially limits the maximum resolution and refresh rate on the second output: the ceiling is 1280x1024@75 Hz. Although the image at 1280x1024 is well defined, some faint ghosting lines are visible, and the maximum comfortable resolution for the second monitor is 1024x768 at 32-bit color. So in terms of signal quality the second output leaves much to be desired, and using a Xabre to drive two monitors is unlikely to appeal to you.

To activate the TV-out, plug the cable into the card; after a reboot the drivers will detect it. Supported TV-out resolutions are 800x600 and 1024x768. Quality-wise, there are pros and cons. First, the cons: with the TV-out active, the primary monitor must run at the same resolution as the TV-out, so you cannot use the TV to extend the desktop. Besides, fonts on the VGA monitor turn bold; SiS needs to spend some time fixing that. One more mystery: at 1024x768 we could not select 32-bit color, although there was no such problem at 800x600. Now, the pros: the TV-out offers numerous image settings. You can move and scale the picture along both the vertical and horizontal axes.
The image can be scaled to 20 preset positions, so you can fit it seamlessly to your screen, and the driver also offers an overscan mode. Even nVidia's specialists would find plenty to be surprised at in the SiS drivers.

3D Quality

The SiS Xabre400 appears to have problems rendering 3D images, and the drivers are the most likely culprit. As shown above, they do not allow 3D settings to be changed, and the defaults are not set for best quality. That is understandable: SiS needs to advertise the Xabre400's speed, even at the expense of image quality. Take Quake III Arena: with the Xabre, the game looks poor. What immediately catches the eye is the awful sky, reminiscent of the days when 16-bit color and texture compression were the norm.

It is not only the sky that spoils the impression; the texture filtering does too. As you can see, the filtering is bilinear, a last-century way of doing graphics, and however hard we tried to enable trilinear filtering, we failed. Seeing that sky and that filtering, I realized I had encountered both flaws before. Guess where? Right: in the SiS 315. Nothing has improved since that chip appeared, and the Xabre has inherited all the worst traits of its predecessor.

Quality in 3DMark2001SE

We hoped image quality in Direct3D would be better, but those hopes did not quite come true. The Direct3D image looks blurred and indistinct. The filtering settings in the drivers apparently make the first MIP level start too close to the viewer, so we see filtered textures nearer than they should be. Have a look at the first test of 3DMark2001SE: on the Xabre400, the textures are heavily blurred and dimmed compared to the reference. We observed the same in other games.

Full Scene Anti-Aliasing

The most interesting findings concern full-scene anti-aliasing.
The Xabre has evolved beyond the SiS 315 and now supports two anti-aliasing methods: supersampling and "Blur". Supersampling needs no introduction; it is the good old technique of rendering at a higher resolution and then downscaling. "Blur", however, is something new. At first glance, 2x + Blur seems to give the same anti-aliasing quality as 4x FSAA, but that is not quite so: the image is genuinely anti-aliased only by the 2x supersampling, while "Blur" simply smears the whole scene. Labels became unreadable and the picture looked shabby.

[Image: SiS Xabre, 2xAA + Blur]
[Image: SiS Xabre, no AA]

The upper image shows a car with "Blur" applied, the lower one without it. "Blur" makes the image noticeably more blurred than the reference, and playing at such quality is no fun. Now let's see how enabling anti-aliasing affects speed. As you can see, the performance hit from full-scene anti-aliasing is excessive: 2xAA halves the frame rate, while 3xAA and 4xAA cut it by three and four times, respectively. Since 2xAA cannot be enabled without the soapy "Blur", the minimum setting a quality-minded gamer can use is 3xAA, with its threefold drop in speed.

Xabre400 Reference Video Card

The first SiS Xabre card in our review is the reference board that came directly from SiS. Card manufacturers are already designing their own products based on it. It is a green board carrying the Xabre400 chip and 64 MB of DDR SDRAM, standard-sized, with a massive heatsink on top. Let's take a closer look. The card's bracket carries a D-Sub connector, an S-Video TV-out and a DVI output for connecting a flat-panel monitor.
Both the DVI output and the TV-out are driven by a special version of the SiS301 chip, the SiS301MV, which adds DVI support. This chip contains a second RAMDAC, a TV encoder and a TMDS transmitter. Right above it are jumper pads for toggling between the TV-out standards, NTSC or PAL, but the reference board we tested has no jumpers installed, so the TV-out defaults to NTSC.

Most of the front side of the card is covered by the cooler's heatsink. As with the first GeForce3 Ti500 samples, SiS fitted a massive cooler to give the card a menacing look. Heatsinks this large are normally meant to cool both the graphics chip and the memory, but on the SiS Xabre400 the heatsink does not touch the memory chips, so only the core is cooled; besides, the memory is soldered on both sides of the board, so proper cooling would require heatsinks on both sides, and without them the big heatsink is largely for show. Anyway, congratulations to SiS on the menacing look of their card. The fan is connected to the board with three wires, yet nowhere in the drivers is there an option to read its rotation speed; it is almost certain that Xabre400 cards cannot monitor the fan at all.

Under the massive cooler sits the GPU itself, a Xabre400 of revision A1. The inner circle of the die is mirror-polished. It looks fantastic, but we are too busy to admire it. The onboard memory consists of eight DDR SDRAM chips from EtronTech with a 3.3 ns cycle time, the same as on the GeForce3. That cycle time corresponds to a rated clock of about 606 MHz (DDR), but on Xabre400 cards the memory runs at 500 MHz, below its rating, so there is good reason to expect decent overclocking results.

Overclocking

By default, the card runs at 250/500 MHz. The video chip and memory can be clocked asynchronously, i.e. at different frequencies.
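The rated memory speeds quoted throughout this roundup follow directly from the chips' cycle time: the clock is the reciprocal of the cycle time, and DDR transfers data twice per clock. A minimal sketch of the arithmetic:

```python
def ddr_rated_mhz(cycle_time_ns):
    """Rated DDR 'effective' speed: the reciprocal of the cycle time
    (ns -> MHz), doubled because DDR transfers data twice per clock."""
    return 2 * 1000.0 / cycle_time_ns

print(round(ddr_rated_mhz(3.3)))  # EtronTech parts on the reference board
print(round(ddr_rated_mhz(4.0)))  # the 4 ns parts found on other boards
```

That is why 3.3 ns memory is rated near 606 MHz yet ships clocked at 500 MHz, leaving headroom, while 4 ns memory already sits at its 500 MHz ceiling.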
The card initially ran at 280/540 MHz without extra cooling, but in operation it turned out that the Xabre400 core could not cope with that frequency, so we had to back the overclock down to 270 MHz. In the end the card overclocked by exactly 20 MHz on each clock, and the final maximum stable speed was 270/270 (540) MHz.

Elitegroup ECS AG400

Elitegroup was among the few companies that produced SiS 315-based cards, and it was the first to build a Xabre400-based board. The new ECS AG400 promises to bring you into a magic world. The card carries 64 MB of DDR SDRAM and offers only a TV-out and a 3D-glasses connector. It ships with a setup manual and a driver CD. Strangely, no cables for connecting the card to a TV were included. Quite unusual, isn't it? Manufacturers of TV-out cards normally include cables and adapters; do they really add that much to the price?

The ECS AG400 uses a layout that differs from the reference design, and the differences are plain to see, starting with the bracket: there is no DVI output. From left to right (or top to bottom), the bracket carries a 3D-glasses connector, a composite TV-out, an S-Video TV-out and a D-Sub connector for an analog monitor. Such a configuration suits a cheap card that will hardly ever meet a DVI monitor, and the two types of TV-out connector fit well here. It is a pity that the 3D glasses have to be purchased separately.

The ECS AG400 carries a different SiS301 variant, the SiS301DHMV, which is why there is no DVI output onboard, only the 3D-glasses port. Below the SiS301 are jumpers for toggling between the TV-out standards, NTSC or PAL, and for selecting the AGP interface mode: 1x/2x/4x/8x. By default, the board is set to AGP 4x/8x and NTSC.
To the right of the SiS301 are jumpers responsible for the output format, switching between RGB, YUV, SCART TV and normal TV, as well as between PAL and NTSC. The ECS AG400 uses the same 3.3 ns EtronTech DDR SDRAM, soldered on both sides of the board just as on the reference card, but here it is not covered by the cooler, which is much smaller than the reference one. The fan connects to the card with two wires, and at a different point, because the ECS board lacks the power supply circuit found on the SiS reference board. The Xabre400 does not consume enough power to justify a dedicated supply circuit; a couple of transistors handle the board's power just fine, as Elitegroup has demonstrated.

The Xabre400 chip is exactly the same as on the reference board, revision A1, though produced a bit later. Its surface is just as mirror-like, and it seems to me that chips with mirror finishes, not just metallic ones, will soon come into fashion.

Overclocking

Like the reference board, the ECS AG400 runs at 250/250 (500) MHz by default. Its design differs slightly from the reference, so there was hope the overclocking would turn out better. Unfortunately, the memory would not overclock by more than 20 MHz: the maximum was 270 MHz, at which it ran without artifacts or hangs. That appears to be its limit. The core, on the contrary, set clock records more easily and ran trouble-free even at 300 MHz. So the maximum stable configuration we achieved was 300/270 (540) MHz.

Gigabyte SP64D-H

The SiS Xabre chip has attracted the attention of eminent brands, among them the reputable Gigabyte, which presented its SP64D-H card.
Here at 3DNews we received an OEM version of this card, which arrived in a Radeon 7500 box turned inside out, so we cannot yet say anything definite about the final retail package of Gigabyte's Xabre. Although the card looks similar to the reference, there are numerous differences. But first, the general features: like the other boards in this review, the Gigabyte SP64D-H carries 64 MB of DDR SDRAM, a DVI output and a TV-out.

The card is built on a red PCB. On closer inspection it becomes clear that Gigabyte's engineers took real care over their first card on the new SiS chip, which is best seen on the bracket: the DVI output is the same as on the SiS reference card, but the S-Video connector has seven pins instead of four and accepts both a standard S-Video cable and an S-Video-to-composite adapter. The DVI output required the MV version of the SiS301 chip, as on the reference board.

This Gigabyte card has no jumpers, so switching between AGP modes and TV output formats happens either automatically or through the drivers. Since the Xabre drivers are still rather rough, the user might be unable to force AGP 8x mode or output a signal in the needed format when necessary.

A remarkable feature of the Gigabyte card is its two fan headers, mounted at the edge of the board, close to the power circuitry. Both have three pins and are labeled AN1 and AN2. Why would the card need two fans? And if the fan is connected with three wires, why don't the drivers provide a way to read its speed? Of course, only one cooler is actually installed on the Xabre400 chip: a plain aluminium unit, small in size, with thick fins.
It sits on a thermal pad, so the cooling is not as good as it would be with thermal paste. The cooler has an integrated tachometer, but its readings are accessible only through drivers modified by Gigabyte; the reference driver has no tab for that. A Xabre400 graphics chip hides under the cooler, the same revision A1 as on the other cards, but produced later, so its surface is matte rather than mirror-like. Frankly, the finish hardly matters; what matters is the overclocking result.

The video memory consists of eight DDR SDRAM chips mounted on both sides of the board. Instead of the standard 3.3 ns EtronTech chips used on most Xabre400 cards, Gigabyte used 4 ns Hynix parts. I would not venture to say which memory is cheaper or better, but the 4 ns memory on the SP64D-H already runs at its rated frequency of 500 MHz, so overclocking may be problematic. On the other hand, Hynix is a reputable brand. Let's see.

Overclocking

The rated clocks of the Gigabyte SP64D-H are the usual 250/250 (500) MHz. Our first interest was memory overclocking. Both previous cards, with their 3.3 ns EtronTech memory, reached 270 (540) MHz, so from this card we expected a different result, higher or lower, but different. To our great surprise, the memory would not overclock beyond exactly 270 MHz! The same figure again, like a threshold none of the tested cards could cross.
This suggests that SiS may have intentionally limited the memory controller's clock speed, but we will not assert what is not yet known for certain. The core initially ran at 300 MHz, but after a while the board hung from overheating, so we dropped the clock to 290 MHz; the Xabre400 held that for 30 minutes and then failed again. Inadequate cooling was clearly to blame, and the maximum stable core speed we achieved was 285 MHz. As a result, the board overclocked to 285/270 MHz, an average result in our test.

Vinix VX-3340

The Vinix VX-3340 is a good example of a card whose design departs from the reference. The Taiwan-based company Vinix developed its own Xabre400 board with an enhanced power system (Advanced V-Power System 1.0) that requires a 12 V PC-Plug connector from the power supply. The VX-3340 ships in a retail box containing, besides the card, a driver CD and a one-page instruction sheet, and that is all; even a PC-Plug splitter is missing, which means installing the board costs you one power supply connector.

The board differs from the reference even in appearance: it comes in a very uncommon dark raspberry color, though it carries the same 64 MB of DDR SDRAM, DVI and TV-out as before. Vinix departed from convention and fitted the VX-3340 with a black D-Sub connector. As we found out later during testing, that gave no advantage, and the signal quality, covered in detail below, turned out to be poor.

The board has pads for three jumpers: one toggles the TV-out between NTSC and PAL, another triggers INTA (its purpose unknown), and the third switches the 64K memory address. The jumpers themselves are absent, so by default the card runs in NTSC mode with INTA disabled and the 64K memory address set to Down.
Oddly, the SiS301 chip on the Vinix VX-3340 lacks the "MV" suffix found on other Xabre boards. Frankly, that hardly matters, since this SiS301 performs the same functions, and here it supports both DVI and TV-out. Nobody will mind my saying that the cooler on the VX-3340 looks better than on any other Xabre board we tested. It is not as big as the reference one, but it is nicer and closely resembles the coolers used on the GeForce4 Ti4600.

The top is closed with a lid that keeps stray case wires out and guides the airflow toward the upper and rear parts of the card. The lid is easy to remove to inspect the fins, which are nice: thick aluminium fins bent so that the airflow is split in two. The cooler is glued to the chip with double-sided tape and is not easy to remove, so we could not take a look at the Xabre400 die itself. The fan, by the way, connects with two wires, which rules out hardware monitoring; it seems the Xabre400 has no onboard circuitry for reading a tachometer at all.

The card's power circuitry sits at the rear. Admittedly, the Vinix engineers did an honest job of it: there are more capacitors here than on other Xabre400 boards, one reason being the need to convert the incoming 12 V to the voltage the chipset requires. A PC-Plug connector sits in the middle of the capacitors. To be entirely honest, the whole Advanced V-Power System is little more than an advertising trick: the card runs fine without any external power, drawing everything it needs from the AGP slot. And why shouldn't it? The Xabre400 was conceived as a low-power solution for mobile and low-power desktop computers.
The video memory consists of eight DDR SDRAM chips mounted on both sides of the board. Instead of the standard 3.3 ns EtronTech memory used on most Xabre400 cards, Vinix used the same 4 ns Hynix chips as the Gigabyte SP64D-H. The 4 ns memory already runs at its rated 500 MHz by default, so overclocking might be problematic; on the other hand, Hynix is a reputable brand, so it remains to be seen.

Overclocking

The stock clocks of the Vinix VX-3340 are the familiar 250/250 (500) MHz. After three cards in a row failed to push the memory past 270 MHz, we set that value immediately and checked how the card ran. We were all but certain the VX-3340 would reach 540 MHz effective, and it did; not a single megahertz more. That leaves little doubt that SiS has somehow capped the memory clock at 270 MHz in hardware. Thanks to the large cooler, the core managed 300 MHz: set to that frequency from the very start, the board ran on without failures. I should note that we overclocked the VX-3340 both with the extra power connected and without it; it affected neither the memory nor the core results. Until now, only the ECS card had withstood a 300 MHz core speed. Quite promising.

PowerColor EvilXabre400 128MB

PowerColor produces a line of video cards under the "Evil" brand. EvilXabre is a line of gaming boards based on SiS Xabre chips, comprising Xabre200 and Xabre400 models. In fact, PowerColor once promised to build cards on nVidia chips only, but since the output of C.P. Technology (the owner of the PowerColor brand) is too varied to fit within nVidia's lineup, PowerColor resorted to selling ATi- and SiS-based cards under other brands, e.g. Club3D. Today nobody remembers that promise, and PowerColor is no longer ashamed to mass-produce gaming cards based on Radeons, GeForces and Xabres alike. In our lab we tested one of the company's latest Xabre400-based cards.

We received the EvilXabre400 in retail packaging, so we could see with our own eyes that PowerColor is improving and is no longer the brand the public used to shun. First, the package contains a large manual in English, German, French and Chinese. Second, there are two CDs: drivers and the PowerDVD player. In addition, PowerColor supplied the EvilXabre400 (which has a TV-out) with an S-Video-to-composite adapter and a composite cable. Although hardly anyone ever reads the manual, and the drivers and DVD player can be downloaded from the Internet, PowerColor included the complete set, which shows an improving attitude toward the customer.

PowerColor did not reinvent the wheel; it simply improved it. The board follows the reference design and is practically identical to the reference Xabre400 card. The PCB is yellow and has no frills, but it carries a full 128 MB of DDR SDRAM, a DVI output and a TV-out. Looking at the back of the card, you can see the labels XP400A, XP200A, 32MB, 64MB and 128MB. Does that mean PowerColor plans to produce Xabre200- and Xabre400-based cards in various memory configurations on one and the same PCB? It will be interesting to see whether such oddities as a Xabre400 with 32 MB or a Xabre200 with 128 MB ever appear; then again, Xabre is hardly the chip for which soldering all six possible variants makes sense. The S-Video TV-out on the PowerColor EvilXabre supports both standards, PAL and NTSC.
Although three jumper positions are marked on the back of the card, the board has only one jumper, which toggles between the TV-out standards; PAL is enabled by default. Since the board carries a DVI output for flat-panel monitors, a SiS301MV is wired next to it. By the way, there is no DVI-to-D-Sub adapter in the package. The reason, it seems, is that nobody regards the Xabre400, with its second 135 MHz RAMDAC, as a multi-monitor card. But why not? At low resolutions, if a card has poor 2D quality, connecting the monitor through the DVI port stands a good chance of producing a better image. We will get to 2D quality later, though.

The PowerColor EvilXabre400 carries a quite ordinary cooler; the fan is a bit wider than usual but connects with only two wires. The cooler is attached to the chip with a drop of silicone thermal paste and is easy to remove to reveal the Xabre400 GPU. The "1986" marking is not a production year but the chip's serial number; we can see it was produced somewhat later than the chip on the Gigabyte board. I wonder how that will affect overclocking. Judging by externals alone, this Xabre is not mirror-polished at all, unlike the chips on the reference and ECS boards.

Although PowerColor installed 128 MB of memory, it is still distributed across eight chips on both sides of the board. The DDR SDRAM here is made by Samsung, with a 4.0 ns cycle time. Again the memory is slower than on the reference board, though it runs at its rated clock of 250 (500) MHz. Having gained some Xabre400 overclocking experience, I would bet it will not exceed 270 MHz. But again, why does a video card need so much memory?
Especially a budget card aimed at a market segment where every extra dollar matters. Then again, the days are over when manufacturers could stamp out identical cards and consumers bought them on brand trust alone; today every manufacturer strives to make its products stand out from the competition, as our review shows, and PowerColor followed suit with a 128 MB version of the Xabre400. Besides, the Xabre400 is a new chip, and how it behaves with extra memory is still unknown, isn't it?

Overclocking

The rated clocks of the PowerColor EvilXabre400 are no different from the rest: 250/250 (500) MHz. Nevertheless, 128 MB is out of the ordinary, so there was hope of breaking the Xabre400's 270 MHz memory barrier. And, wonder of wonders, the memory did run at 275 (550) MHz, the best overclocking result among the boards tested! We had high hopes for the core as well, especially as the PowerStrip utility managed to run the GPU at 305 MHz, albeit briefly and unstably. Assuming we were dealing with the most overclockable Xabre-based card yet, we set the core to 300 MHz and were disappointed by how briefly the EvilXabre lasted in that mode. The disappointment was mixed with joy when the board ran steadily at a core speed of 295 MHz. We gained 5 MHz on the memory but lost a little on the core; all in all, we got a 128 MB Xabre400 card running at 295/275 MHz, ready to challenge all comers.

Now that we have a better idea of our guinea pigs, it is high time to test them and find out their potential. Before moving on to the results, let me take a few minutes to explain what and how we tested.
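Summing up the overclocking sections above, the headroom each card reached can be expressed as a percentage over the 250/250 MHz stock clocks; a short recap script (the figures are the stable results reported in this roundup):

```python
def gain_pct(stock_mhz, oc_mhz):
    """Overclocking headroom as a percentage over the stock clock."""
    return 100.0 * (oc_mhz - stock_mhz) / stock_mhz

# (core MHz, memory MHz) stable results from this roundup; stock is 250/250
results = {
    "SiS reference":     (270, 270),
    "ECS AG400":         (300, 270),
    "Gigabyte SP64D-H":  (285, 270),
    "Vinix VX-3340":     (300, 270),
    "PowerColor 128 MB": (295, 275),
}
for card, (core, mem) in results.items():
    print(f"{card:18s} core +{gain_pct(250, core):.0f}%  memory +{gain_pct(250, mem):.0f}%")
```

The memory column makes the apparent 270 MHz (8%) cap easy to see; only the PowerColor board edged past it.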
For comparison, we used all five Xabre400 video cards received for testing: the reference card, one from Elitegroup, one from Vinix, one from PowerColor and one from Gigabyte. All cards were tested at both stock and overclocked modes. In the near future a new chip, the Xabre200, will come into play, differing from the Xabre400 in clock speeds only: it will run at 200/166 (333) MHz. So, to get an idea of its performance, we pushed the clock speeds of the reference Xabre400 board down to those of the Xabre200 and produced a 64 Mb analog of a still non-existent video card. Since we did our tests on a motherboard based on the VIA KT333 chipset, all the Xabre video cards switched into AGP2x mode automatically. Frankly, the performance difference between AGP2x, 4x and 8x for these video cards will be negligible, considering that each of the cards had 64 Mb onboard. Since the flaw comes from SiS engineers, and VIA chipsets sell like hot cakes, we won't dwell on that fact - mind you, the tests were performed in real-life conditions, on a real computer, and Xabre video cards have got to be prepared to face such conditions like any other kind of hardware. As far as installation is concerned, most of the video cards installed under Windows XP without trouble. Only the PowerColor EvilXabre400 was reluctant to install and wouldn't work in any mode other than VGA. Perhaps that was caused by the extra 64 Mb of memory, or perhaps the right drivers are not ready yet, but I would call the drivers that finally worked almost a miracle, and I wouldn't want to replace them with newer ones on that video card. For comparison, we also used two nVidia-based video cards from competitors: one based on the GeForce4 MX440 and one on the GeForce4 Ti4200. The first had 64 Mb DDR SGRAM and ran at 270/400 MHz; the second had 128 Mb DDR SDRAM and ran at 250/444 MHz. Well, let's start.

Tests
In the first test we analyzed the 2D image quality and gauged the 2D performance. The issue of 3D quality has already been covered, and unfortunately nothing good could be said about it - but in 2D the situation is the reverse. The 2D image quality of the SiS reference board and the Gigabyte SP64D-H is simply fantastic: no blurring was observed even at high resolutions, beyond comparison with the competitors. SiS did a really great job there - that's what a 400 MHz RAMDAC means! The layout of those boards is excellent. But Elitegroup spoiled everything with the awful design of the AG400, as a result of which the video card failed to run at resolutions higher than 1024x768 at 85 Hz, and blurring spoils the picture as well. The same was observed for the Vinix board. Visually, the SiS reference board was the best performer, with the Gigabyte counterpart following quite close; then come the GeForce4-based video cards (at virtually the same level), then the PowerColor EvilXabre at a noticeable distance, whereas the ECS AG400 and Vinix VX-3340 lagged far behind with the worst quality. Even so, the Vinix VX-3340 has one advantage over the ECS AG400: a monitor can be plugged into it via a DVI-VGA adapter, which improves the quality slightly, though it is then restricted to 1280x1024 at 32-bit color depth. And there is one problem: such an adapter is not shipped in the package, so you would have to look for it elsewhere. We tested the 2D speed using the PCMark2002 benchmark. The 2D speed is fairly high and enough for most users, so we tested the video cards at their rated speeds only. Here are the results, with the best highlighted:
As you can see, Xabre loses in some tests and leads in others. Of course, it's not quite fair to compare the 2D core of the Xabre400 with the powerful GeForce4, but that's the reality. Frankly, we expected things to be even worse, but Xabre is beginning to live up to the hopes pinned on it - although it is still too early to draw conclusions; the most interesting part is yet to come.

Direct3D

In Direct3D there is no better tool than the 3DMark2001SE benchmark for measuring video card performance. We also wanted to test Xabre in CodeCreatures Benchmark Pro, where even the GeForce4 Ti4600 struggles, but the benchmark failed to run on the SiS hardware. If such problems arise, Xabre is evidently not fully compatible with DirectX 8.1. Well, let's make an allowance for the raw state of the drivers and get around to analyzing the 3DMark2001 results. We start with the official benchmark scores for these video cards.
As is seen from the results, Xabre performed no worse than the GeForce4 MX440 in the theoretical tests, and in the fill rate tests with multitexturing it scores even better than the GeForce4 Ti4200. In the gaming benchmarks, Xabre is superior to the MX440, but we'll talk about that below. In the final scores, Xabre easily wins over the MX440, although it is still far behind the Ti4200. Now let's move on to the gaming performance tests, in which the video cards ran at both rated and overclocked modes. ![]() It's remarkable that the Xabre video cards were almost on par with the GeForce4 MX440 - and in the end, Xabre wins. By the way, Xabre has a specific trait: when overclocked, the video cards sometimes exhibit lower speeds than at the rated frequencies. This was seen with the Gigabyte SP64D-H at high resolutions. No mistake here - the card really did show lower speeds than at the rated frequencies, and only at high resolutions. It's also seen from the very first tests that the 128 Mb give the PowerColor card an advantage at high resolutions. ![]() Game 2 is an easier test, and here we see how hard it is for the competitors to reach the scores achieved by the GeForce4 Ti4200 core. If we disregard the 1600x1200 resolution, Xabre defeats the GeForce4 MX440 at all other resolutions, and the 128 Mb version of the Xabre400 kept up even at high resolutions! ![]() The trend continues: while the GeForce4 Ti4200 is far ahead of the pack, the Xabre cards press on the GeForce4 MX440. The advantage is not large, but it is clear and holds at all resolutions except 1280x1024, where the low-end GeForce is second only to the overclocked Xabre cards. In this test the Vinix board made quite a good showing, although it loses to the PowerColor card. ![]() The GeForce4 MX fails badly in Game 4, while Xabre stays alive, which shows its advantage. Still, there is no point comparing a video card with software-driven vertex shaders to the advanced GeForce4 Ti core with its two shader units running in parallel. 
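To make the phrase "software-driven vertex shaders" concrete: it means the driver runs the per-vertex math on the CPU instead of on dedicated GPU units. A toy sketch of that CPU-side work (purely illustrative, not SiS's actual driver code):

```python
# Toy illustration of CPU-side ("software") vertex shading: the driver, not
# dedicated GPU hardware, multiplies every vertex by the transform matrix
# each frame. Illustrative only; not SiS driver code.

def transform_vertex(m, v):
    """Apply a 4x4 row-major matrix to a homogeneous vertex (x, y, z, w)."""
    return tuple(sum(m[row][col] * v[col] for col in range(4)) for row in range(4))

# A uniform scale-by-2 matrix standing in for the "shader program"
scale2 = [[2, 0, 0, 0],
          [0, 2, 0, 0],
          [0, 0, 2, 0],
          [0, 0, 0, 1]]

mesh = [(1, 0, 0, 1), (0, 1, 0, 1), (0, 0, 1, 1)]
print([transform_vertex(scale2, v) for v in mesh])
# -> [(2, 0, 0, 1), (0, 2, 0, 1), (0, 0, 2, 1)]
```

With hundreds of thousands of vertices per frame, this loop lands entirely on the CPU, which is why shader-heavy tests like Game 4 punish software implementations so badly.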
In this test, PowerColor acts on behalf of the whole Xabre line and takes all the hard work upon itself. This is especially visible at 1600x1200. I ran that test twice on the EvilXabre400 - no mistake here: there is a three-fold speed increase compared to the 64 Mb video cards.

Dungeon Siege

Since the CodeCreatures benchmark failed to run on Xabre, we tested with the game Dungeon Siege instead. The procedure for gauging the speed was quite simple: you switch on the FPS counter at the same scene in the game and register its average value. Of course, that is not a timedemo, but since the conditions were equal for all cards, the results can be regarded as valid. More importantly, Dungeon Siege is a real application, not a synthetic benchmark. The resolution used was 1024x768 at 32-bit color depth, with all the image settings set to maximum quality. ![]() The GeForce4 Ti4200 leads the test as usual, but this time Xabre failed to catch up with the low-end GeForce4 MX440. The Xabre cards scored close to one another, ceding the in-family performance crown to the PowerColor card. Surprisingly, overclocking did little for the ECS and Gigabyte boards.

OpenGL

To test the OpenGL performance, we used Quake III Arena. All the settings were set to maximum quality, and the test was run at 32-bit color depth. ![]() Really high speeds start from several hundred FPS here. The gap to the GeForce4 Ti4200 is not as wide as before, and the MX440 wins at the lowest resolution only. Otherwise the trend persists: a sure lead for the reference card at lower resolutions, the Gigabyte card at middle resolutions, and PowerColor taking the lead at high resolutions.

Villagemark

The Xabre specifications lack any information on video memory access optimization. So the only way to find out how Xabre copes with high overdraw rates is to run STMicro's VillageMark benchmark. 
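Overdraw - shading the same screen pixel more than once because surfaces overlap - is exactly what VillageMark stresses. A toy sketch of the general idea (our illustration of depth-tested rendering, not Xabre's actual hardware scheme): drawing opaque surfaces front-to-back lets the depth test reject hidden pixels before they are shaded.

```python
# Toy overdraw counter: paint overlapping opaque "surfaces" into a small
# depth buffer and count how many pixels actually get shaded. With a depth
# test and front-to-back order, hidden pixels are rejected before shading.
# Illustrative only; real chips use schemes like early-Z or tile culling.

def shaded_pixels(surfaces, width=8):
    """surfaces: list of (depth, start, end) spans, drawn in list order."""
    zbuf = [float("inf")] * width
    shaded = 0
    for depth, start, end in surfaces:
        for x in range(start, end):
            if depth < zbuf[x]:      # depth test rejects hidden pixels
                zbuf[x] = depth
                shaded += 1
    return shaded

layers = [(3, 0, 8), (2, 0, 8), (1, 0, 8)]   # three full-width layers
print(shaded_pixels(layers))                 # back-to-front: 24 pixels shaded
print(shaded_pixels(sorted(layers)))         # front-to-back: only 8 shaded
```

A card that shades all 24 pixels wastes two-thirds of its fill rate on invisible surfaces; VillageMark shows which cards avoid that waste.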
If a video card renders all hidden surfaces, it won't hold up in the speed tests in either Direct3D or OpenGL. ![]() As is seen from the results, the Xabre400 is optimized for handling such scenes and applies some technique for culling invisible surfaces. In Direct3D it does that even better than the GeForce4 Ti4200, while the MX440 fails completely!

Conclusions

That video card manufacturers have drawn attention to the Xabre400 is definitely a good sign. We see that it's a really promising graphics chip. Despite the flaws in 3D image quality, Xabre400 video cards can and must compete with the GeForce4 MX and Radeon 7500 lines. But the fate of the Xabre400 is not our concern here. If you intend to buy a mid-range video card and are still hesitating, you've got to have a clear idea of the Xabre core and the video cards based on it.![]()
If we consider each card separately, the reference board performed best. SiS proved in practice that the Xabre400 is no SiS315, and not even a GeForce4 MX. We wish video card manufacturers wouldn't change anything in the Xabre reference design; in that case they would get the highest scores and excellent 2D quality. The Gigabyte SP64D-H is only slightly inferior to the reference. Its 2D graphics quality is almost as high - it's very hard to tell the difference when comparing two ideal images. The board is fast, and if Gigabyte took more care with the cooler, it could overclock the core up to 300 MHz. There is a DVI-out and a TV-out. On the whole, it's a nice product on a new graphics chip. The ECS AG400 spoils the impression with its low 2D image quality. Because of that, I wouldn't recommend this video card to anyone. Elitegroup must have saved more on the video card than they should have. There is a TV-out, as required, but no cables shipped. There is a 3D-glasses connector, but no glasses in the package, and there is no point looking for suitable ones elsewhere. We wish they had furnished the card with a DVI-out. On the other hand, this card overclocked best of all. Let me repeat: such low 2D quality in the 21st century is inexcusable. The Vinix VX-3340 is god knows what. On the one hand, the 2D image is awful, just as on the ECS AG400; on the other hand, the performance and overclocking potential are not bad. In fact, we saw no benefit from the improved power system, so it makes no sense to talk about its advantages (if any). Since its 2D output is unusable, I hope not to see it in retail stores. That is likely to come true - who needs an unknown Vinix while reputable brands like Gigabyte, PowerColor, Inno3D and others are easily available? SiS can be proud of PowerColor, who produced the EvilXabre400 with 128 Mb on board. 
This board demonstrates excellent speeds, the best overclocking potential and good 2D quality, and it has a cord and a TV-out adapter in the shipping package. A remarkable product with only one flaw: it is not easy to install under Windows XP (under Win98 there used to be no problems with that). If that annoying issue is fixed in future driver releases, we can call the EvilXabre the leader among Xabre400-based video cards. It's also possible that 128 Mb Xabre-based boards from other manufacturers will start hitting the stores soon. The tests showed that the extra memory was to the benefit of this GPU, so why not make use of it? Summing up our review, I wouldn't give a single flat verdict on either the Xabre400 chip or the video cards derived from it. Like anything that doesn't fit the usual mold, it has its own pros and cons. Compared to what we saw in the SiS315, the new Xabre400 is a big leap forward. Moving at this pace, SiS might end up catching up with ATi and nVidia. What matters is moving in the right direction. |