Nvidia GeForce 8800 GTS video cards


It is well known that flagship graphics adapters in the highest price bracket are, first of all, a public demonstration of the developer's technological achievements. Although these solutions are deservedly popular among gaming enthusiasts, they never account for the bulk of sales. Not everyone is able or willing to pay $600, an amount comparable to the cost of the most expensive modern gaming console, for a graphics card alone, so the main contribution to the income of AMD/ATI and Nvidia comes from less expensive but far more widespread cards.

On November 8, 2006, Nvidia announced the first consumer GPU with a unified architecture and DirectX 10 support. The new product was described in detail in our Directly Unified: Nvidia GeForce 8800 Architecture Review. Initially, the new GPU formed the basis of two graphics cards - the GeForce 8800 GTX and the GeForce 8800 GTS. The older model performed well in games and can rightly be considered the choice of an enthusiast untroubled by price, while the younger model took its rightful place in its price category: less than $500, but more than $350.

$449 is not an excessive price for a new-generation product with full DirectX 10 support that offers a serious level of performance in modern games. However, Nvidia decided not to stop there, and on February 12, 2007, it presented a more affordable GeForce 8800 GTS 320MB model with an official price of $299, seriously strengthening its position in this sector. These two graphics cards are the subject of today's review. Along the way, we will find out how critical the amount of video memory is for the GeForce 8 family.

GeForce 8800 GTS: technical specifications

To evaluate the qualities and capabilities of both GeForce 8800 GTS models, we should remind our readers of the characteristics of the GeForce 8800 family.


All three GeForce 8800 models use the same G80 graphics core, consisting of 681 million transistors, plus an additional NVIO chip containing the TMDS transmitters, RAMDAC, and so on. Using such a complex chip across several models of graphics adapters belonging to different price categories is not the best option in terms of final product cost, but it cannot be called unsuccessful either: Nvidia gets the opportunity to sell off salvaged GeForce 8800 GTX chips (those that failed frequency screening and/or have defective blocks), while for video cards selling at over $250 the chip cost is hardly critical. This approach is actively used by both Nvidia and its arch-competitor ATI; just remember the history of the G71 graphics processor, which can be found both in the inexpensive mass-market GeForce 7900 GS video adapter and in the powerful dual-chip monster GeForce 7950 GX2.

The GeForce 8800 GTS was created in the same way. As the table shows, this video adapter differs significantly from its older brother in technical characteristics: not only are its clock frequencies lower and some of its stream processors disabled, but the amount of video memory is also reduced, the memory bus is trimmed, and some of the TMUs and raster operation units are inactive.

In total, the GeForce 8800 GTS has 6 groups of stream processors, each with 16 ALUs, giving a total of 96 ALUs. This card's main rival, the AMD Radeon X1950 XTX, features 48 pixel processors, each of which, in turn, consists of 2 vector and 2 scalar ALUs for a total of 192 ALUs.

It would seem that the GeForce 8800 GTS should be seriously inferior to the Radeon X1950 XTX in raw computing power, but several nuances make such an assumption not entirely legitimate. The first is that the GeForce 8800 GTS stream processors, like the ALUs in Intel's NetBurst architecture, operate at a significantly higher frequency than the rest of the core - 1200 MHz versus 500 MHz - which in itself means a very substantial performance boost. The second nuance follows from the architecture of the R580 GPU. In theory, each of its 48 pixel shader execution units can execute 4 instructions per clock cycle, not counting branch instructions. However, only 2 of them can be of the ADD/MUL/MADD type, while the remaining two are always ADD instructions with a modifier. Accordingly, the efficiency of the R580 pixel processors will not be maximal in all cases. The G80 stream processors, on the other hand, have a fully scalar architecture, and each of them can execute two scalar operations per clock cycle, for example, MAD+MUL. Although we still lack exact data on the architecture of Nvidia's stream processors, in this article we will look at how the new unified architecture of the GeForce 8800 compares to the Radeon X1900 architecture and how this affects speed in games.

As for the texturing and rasterization subsystems, the GeForce 8800 GTS has, judging by the specifications, more texture units (24) and raster operation units (20) than the Radeon X1950 XTX (16 TMUs, 16 ROPs); however, their clock frequency (500 MHz) is significantly lower than that of the ATI product (650 MHz). Thus, neither side has a decisive advantage, which means that gaming performance will be determined mainly by how successful each micro-architecture is, not by the numerical advantage in execution units.

It is noteworthy that the GeForce 8800 GTS and the Radeon X1950 XTX have the same memory bandwidth - 64 GB/sec - achieved in different ways: the GeForce 8800 GTS uses a 320-bit memory bus with GDDR3 memory operating at 1600 MHz, while the Radeon X1950 XTX pairs a 256-bit bus with 2 GHz GDDR4 memory. Given ATI's claims of a more advanced ring-topology memory controller in the R580 compared to a typical Nvidia controller, it will be interesting to see whether ATI's Radeon solution gains an advantage at high resolutions with full-screen anti-aliasing enabled against its next-generation competitor, as was the case with the GeForce 7.
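Where this parity comes from is easy to check: peak memory bandwidth is just the bus width in bytes multiplied by the effective data rate. Below is a minimal Python sketch of this arithmetic, using the figures quoted in this review (GB is taken as 10^9 bytes, as in vendor specifications):

    def memory_bandwidth_gb_s(bus_width_bits, effective_rate_mhz):
        # bytes per transfer x effective transfers per second
        return (bus_width_bits / 8) * (effective_rate_mhz * 1e6) / 1e9

    # GeForce 8800 GTS: 320-bit bus, GDDR3 at 800 (1600) MHz effective
    print(memory_bandwidth_gb_s(320, 1600))   # 64.0 GB/sec
    # Radeon X1950 XTX: 256-bit bus, GDDR4 at 2000 MHz effective
    print(memory_bandwidth_gb_s(256, 2000))   # 64.0 GB/sec
    # MSI NX8800GTS OC Edition: memory pre-overclocked to 850 (1700) MHz
    print(memory_bandwidth_gb_s(320, 1700))   # 68.0 GB/sec

The same formula yields the 72 GB/sec figure mentioned later for memory overclocked to 900 (1800) MHz.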

A less expensive version of the GeForce 8800 GTS with 320 MB of memory, announced on February 12, 2007 and designed to replace the GeForce 7950 GT in the performance-mainstream segment, differs from the regular model only in the amount of video memory. In fact, to create this card, Nvidia only needed to replace the 512 Mbit memory chips with 256 Mbit ones: ten 512 Mbit chips give 640 MB, while ten 256 Mbit chips give 320 MB. A simple and technologically straightforward move, it allowed Nvidia to assert its superiority in the popular $299 price category. Further on, we will find out how much this affected the performance of the new product and whether a potential buyer should pay an extra $150 for the model with 640 MB of video memory.

In today's review, the GeForce 8800 GTS 640MB will be represented by the MSI NX8800GTS-T2D640E-HD-OC video adapter. Let's take a closer look at this product.

MSI NX8800GTS-T2D640E-HD-OC: packaging and accessories

The video adapter arrived at our laboratory in its retail version - packed in a colorful box along with all the accompanying accessories. The box turned out to be relatively small, especially compared to the box of the MSI NX6800 GT, which at one time could rival Asustek Computer packaging in size. Despite its modest dimensions, MSI's packaging is traditionally equipped with a convenient carrying handle.


The box design is done in calm white and blue tones and is easy on the eyes; the front side is decorated with an image of a pretty red-haired angel girl, so there is no trace of the aggressive motifs so popular among video card manufacturers. Three stickers inform the buyer that the card is pre-overclocked by the manufacturer, supports HDCP, and comes with the full version of the game Company of Heroes. On the back of the box you can find information about Nvidia SLI and MSI D.O.T. Express technologies. The latter is a dynamic overclocking technology which, according to MSI, can increase the video adapter's performance by 2-10%, depending on the overclocking profile used.

Having opened the box, in addition to the video adapter itself, we found the following set of accessories:


Quick Installation Guide
Quick User Guide
DVI-I -> D-Sub adapter
YPbPr/S-Video/RCA splitter
S-Video cable
2xMolex -> 6-pin PCI Express power adapter
CD with MSI drivers and utilities
Two-disc edition of the game Company of Heroes

Both guides are designed as posters; in our opinion, they are too simple and contain only the most basic information. The pursuit of language count - the short user manual covers 26 of them - means that nothing particularly useful can be gleaned from it beyond basic information on installing the card in the system. We think the manuals could have been somewhat more detailed, which would have been of some benefit to inexperienced users.

The driver disc contains an outdated version of Nvidia ForceWare, 97.29, as well as a number of proprietary utilities, among which MSI DualCoreCenter and MSI Live Update 3 deserve special mention. The first is a unified control center that allows you to overclock both the video card and the central processor; however, for full functionality the program requires an MSI motherboard equipped with a CoreCell chip, and it is therefore of little use to owners of boards from other manufacturers. The MSI Live Update 3 utility is designed to track driver and BIOS updates and install them conveniently via the Internet. This is a rather convenient method, especially for those who do not want to deal with the intricacies of manually updating a video adapter's BIOS.

The inclusion of the full version of the popular tactical RTS Company of Heroes deserves special praise from MSI. This is truly a game of the highest caliber, with excellent graphics and thoroughly polished gameplay; many players call it the best game in the genre, which is confirmed by numerous awards, including the title of Best Strategy Game of E3 2006. As we have already noted, despite belonging to the real-time strategy genre, Company of Heroes boasts modern graphics on the level of a good first-person shooter, so the game is perfect for demonstrating the capabilities of the GeForce 8800 GTS. In addition to Company of Heroes, the discs contain a demo version of Warhammer 40,000: Dawn of War - Dark Crusade.

We can confidently call the MSI NX8800GTS-T2D640E-HD-OC package good thanks to the presence of the full version of the very popular tactical RTS Company of Heroes and MSI's convenient software.

MSI NX8800GTS-T2D640E-HD-OC: PCB design

For the GeForce 8800 GTS model, Nvidia developed a separate, more compact printed circuit board than the one used for the GeForce 8800 GTX. Since all GeForce 8800 cards are supplied to Nvidia's partners ready-made, almost everything said below applies not only to the MSI NX8800GTS but to any other GeForce 8800 GTS, be it the version with 640 or 320 MB of video memory.


The GeForce 8800 GTS PCB is significantly shorter than that of the GeForce 8800 GTX: 22.8 centimeters versus almost 28 centimeters for the GeForce 8 flagship. In fact, the dimensions of the GeForce 8800 GTS are the same as those of the Radeon X1950 XTX, or even slightly smaller, since the cooler does not protrude beyond the PCB.

Our MSI NX8800GTS sample uses a board covered with a green solder mask, although on the company's website the product is pictured with a PCB in the more familiar black color. Currently, both "black" and "green" GeForce 8800 GTX and GTS cards are on sale. Despite numerous rumors circulating on the Internet, there is no difference between these cards other than the PCB color itself, as the official Nvidia website confirms. What is the reason for this "return to the roots"?

There are many conflicting rumors on this matter. According to some, the composition of the black coating is more toxic than the traditional green one; others believe the black coating is harder to apply or more expensive. In practice this is most likely not the case: as a rule, solder masks of different colors cost the same, which rules out extra cost or processing problems for particular colors. The most likely explanation is the simplest and most logical one: cards of different colors are produced by different contract manufacturers - Foxconn and Flextronics. Moreover, Foxconn apparently uses coatings of both colors, since we have seen both "black" and "green" cards from this manufacturer.


The power circuitry of the GeForce 8800 GTS is almost as complex as that of the GeForce 8800 GTX and even contains more electrolytic capacitors, but it has a denser layout and only one external power connector, thanks to which the printed circuit board could be made much shorter. GPU power is managed by the same digital PWM controller as on the GeForce 8800 GTX, a Primarion PX3540. Memory power is handled by a second controller, an Intersil ISL6549, which, incidentally, is absent on the GeForce 8800 GTX, where the memory power supply scheme is different.

The left part of the PCB, where the main components of the GeForce 8800 GTS - the GPU, NVIO and memory - are located, is almost identical to the corresponding section of the GeForce 8800 GTX PCB, which is not surprising: developing the entire board from scratch would have required considerable financial and time expenditures. Besides, it would most likely have been impossible to simplify the board significantly by designing it from scratch, given the need to use the same G80-plus-NVIO tandem as on the flagship model. The only visible difference from the GeForce 8800 GTX is the absence of the second MIO (SLI) interface "comb"; in its place there is an unsoldered spot for a technological connector with latches, possibly serving the same function. Even the 384-bit memory bus layout is preserved, and the bus itself is trimmed to the required width in the simplest way: instead of 12 GDDR3 chips, only 10 are installed. Since each chip has a 32-bit bus, 10 chips together provide the required 320 bits. Theoretically, nothing prevents the creation of a GeForce 8800 GTS with a 384-bit memory bus, but such a card is extremely unlikely to appear in practice, since it would effectively be a full-fledged GeForce 8800 GTX with lower clock frequencies.


The MSI NX8800GTS-T2D640E-HD-OC is equipped with 10 GDDR3 Samsung K4J52324QE-BC12 chips with a capacity of 512 Mbit each, operating at a supply voltage of 1.8 V and rated for 800 (1600) MHz. According to the official Nvidia specifications for the GeForce 8800 GTS, the memory of this video adapter should run at exactly this frequency. But the MSI NX8800GTS version we are examining carries the letters "OC" in its name for a reason: it is pre-overclocked, so its memory runs at a slightly higher frequency of 850 (1700) MHz, raising bandwidth from 64 GB/sec to 68 GB/sec.

Since the only difference between the GeForce 8800 GTS 320MB and the regular model is the video memory volume reduced by half, memory chips with a capacity of 256 Mbit, for example, Samsung K4J55323QC/QI series or Hynix HY5RS573225AFP, are simply installed on this card. Otherwise, the two GeForce 8800 GTS models are identical to each other down to the smallest detail.

The NX8800GTS GPU marking differs slightly from that of the GeForce 8800 GTX processor and reads "G80-100-K0-A2", whereas the chip on the reference flagship card is marked "G80-300-A2". We know that GeForce 8800 GTS production may use G80 chips with defective functional blocks and/or chips that failed frequency screening; perhaps the marking reflects exactly these features.

The 8800 GTS processor has 96 of 128 stream processors active, 24 of 32 TMUs, and 20 of 24 ROPs. For the standard GeForce 8800 GTS, the base GPU frequency is 500 MHz (513 MHz actual) and the shader processor frequency is 1200 MHz (1188 MHz actual), but for the MSI NX8800GTS-T2D640E-HD-OC these parameters are 576 and 1350 MHz, matching the frequencies of the GeForce 8800 GTX. How much this affects the performance of the MSI product, we will find out later, in the section devoted to gaming test results.

The configuration of the NX8800GTS output connectors is standard: two DVI-I connectors capable of operating in dual-link mode and a universal seven-pin mini-DIN connector, allowing both the connection of HDTV devices via the analog YPbPr interface, and SDTV devices using the S-Video or Composite interface. In the MSI product, both DVI connectors are carefully covered with rubber protective caps - a rather meaningless, but pleasant detail.

MSI NX8800GTS-T2D640E-HD-OC: Cooling System Design

The cooling system installed on the MSI NX8800GTS, as well as on the vast majority of GeForce 8800 GTS from other graphics card vendors, is a shortened version of the GeForce 8800 GTX cooling system described in the corresponding review.


The radiator and the heat pipe that carries heat from the copper base contacting the GPU heat spreader have been shortened, and the flat U-shaped heat pipe pressed into the base, responsible for distributing the heat flow evenly, is positioned differently. The aluminum frame on which all the cooler's parts are mounted has numerous protrusions at the points of contact with the memory chips, the power transistors of the voltage regulator, and the NVIO die. Reliable thermal contact is ensured by traditional inorganic-fiber pads soaked in white thermal paste. For the GPU, a different but also familiar thick, dark gray thermal paste is used.

Since the cooling system contains relatively few copper elements, its mass is low, and its mounting does not require special backplates to prevent fatal bending of the PCB; eight ordinary spring-loaded bolts securing the cooler directly to the board are quite enough. Damage to the graphics processor is virtually ruled out, since it is equipped with a heat spreader cap and surrounded by a wide metal frame that protects the chip from possible skewing of the cooling system and the board from excessive bending.

A radial fan with an impeller diameter of approximately 75 millimeters is responsible for blowing air through the radiator; it has the same electrical parameters as in the GeForce 8800 GTX cooling system - 0.48 A at 12 V - and connects to the board via a four-pin connector. The system is enclosed in a translucent plastic casing so that hot air is exhausted through the slots in the mounting bracket.

The design of the GeForce 8800 GTX and 8800 GTS coolers is well thought out, reliable, time-tested, virtually silent in operation and provides high cooling efficiency, so there is no point in changing it to anything else. MSI only replaced the Nvidia sticker on the casing with its own, repeating the design on the box, and provided the fan with another sticker with its own logo.

MSI NX8800GTS-T2D640E-HD-OC: noise and power consumption

To evaluate the noise level generated by the MSI NX8800GTS cooling system, a Velleman DVM1326 digital sound level meter with a resolution of 0.1 dB was used. Measurements were made using the A-weighted curve. At the time of measurements, the background noise level in the laboratory was 36 dBA, and the noise level at a distance of one meter from a working stand equipped with a passively cooled graphics card was 40 dBA.

In terms of noise, the cooling system of the NX8800GTS (and of any other GeForce 8800 GTS) behaves exactly like the one installed on the GeForce 8800 GTX. The noise level is very low in all modes; in this respect, Nvidia's new design surpasses even the excellent GeForce 7900 GTX cooler, previously considered the best in its class with good reason. The only way to achieve complete silence without losing cooling efficiency is to install a water-cooling system, especially if serious overclocking is planned.

As our readers know, reference samples of the GeForce 8800 GTX from the first batches refused to work on our stand equipped for measuring the power consumption of video cards. However, most newer cards of the GeForce 8800 family, among them the MSI NX8800GTS-T2D640E-HD-OC, worked on this system without problems. Its configuration is as follows:

Processor Intel Pentium 4 560 (3.60GHz, 1MB L2);
Intel Desktop Board D925XCV (i925X);
Memory PC-4300 DDR2 SDRAM (2x512MB);
Hard drive Samsung SpinPoint SP1213C (120 GB, Serial ATA-150, 8MB buffer);
Microsoft Windows XP Pro SP2, DirectX 9.0c.

As we have reported, the motherboard at the heart of the measuring platform has been specially modified: measuring shunts equipped with connectors for hooking up measuring equipment are inserted into the power supply circuits of the PCI Express x16 slot. The 2xMolex -> 6-pin PCI Express power adapter is equipped with the same kind of shunt. A Velleman DVM850BL multimeter with a measurement error of no more than 0.5% is used as the measuring tool.
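For reference, the arithmetic behind such shunt measurements is plain Ohm's law: the current equals the voltage drop across the shunt divided by its resistance, and the power is that current multiplied by the rail voltage. The Python sketch below is purely illustrative - the shunt resistance and the multimeter reading are hypothetical values, not the actual parameters of our modified board:

    def rail_power_w(shunt_drop_mv, shunt_mohm, rail_voltage_v):
        # I = V_shunt / R_shunt; P = I x V_rail
        current_a = (shunt_drop_mv / 1000.0) / (shunt_mohm / 1000.0)
        return current_a * rail_voltage_v

    # Hypothetical reading: 10 mV across a 5 mOhm shunt on the 12 V rail
    print(rail_power_w(10.0, 5.0, 12.0))   # 2 A x 12 V = 24.0 W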

To load the video adapter in 3D mode, we run the first SM3.0/HDR graphics test from the Futuremark 3DMark06 suite in an endless loop at 1600x1200 with 16x anisotropic filtering forced. Peak 2D mode is emulated by running the 2D Transparent Windows benchmark from the Futuremark PCMark05 suite.

Thus, after carrying out a standard measurement procedure, we were able to obtain reliable data on the level of power consumption not only of the MSI NX8800GTS-T2D640E-HD-OC, but also of the entire Nvidia GeForce 8800 family.

The GeForce 8800 GTX does indeed surpass the previous "leader", the Radeon X1950 XTX, in power consumption, but only by 7 W. Considering the enormous complexity of the G80, 131.5 W in 3D mode can be considered a good result. Both additional power connectors of the GeForce 8800 GTX carry approximately the same load, which does not exceed 45 W even in the heaviest mode. Although the GeForce 8800 GTX PCB design provides for an eight-pin power connector in place of one of the six-pin ones, it is unlikely to be needed even if GPU and memory clock speeds rise significantly. In idle mode, the efficiency of Nvidia's flagship leaves much to be desired, but this is the price of 681 million transistors and a shader processor frequency that is enormous by GPU standards. The high idle power consumption is also partly due to the fact that the GeForce 8800 family does not lower its clock speeds in this mode.

Both versions of the GeForce 8800 GTS have noticeably more modest appetites, although they cannot boast the efficiency of Nvidia cards based on the previous-generation G71 core. The single power connector of these cards bears a much more serious load, which in some cases can reach 70 W or more. The power consumption of the GeForce 8800 GTS versions with 640 and 320 MB of video memory differs only slightly, which is not surprising, since this parameter is the only thing distinguishing the two cards. The MSI product, operating at higher frequencies, consumes more than the standard GeForce 8800 GTS - about 116 W under 3D load - which is still less than the corresponding figure for the Radeon X1950 XTX. The AMD card is, of course, much more economical in 2D mode; however, video adapters of this class are bought specifically for 3D use, so this parameter is not as critical as power consumption in games and 3D applications.

MSI NX8800GTS-T2D640E-HD-OC: overclocking features

Overclocking members of the Nvidia GeForce 8800 family involves a number of peculiarities that we consider necessary to tell our readers about. As you probably remember, the first representatives of the seventh GeForce generation, based on the 0.11-μm G70 core, could only raise the frequencies of their raster operation units and pixel processors in 27 MHz increments, and if the overclock was smaller than this step, there was practically no performance gain. Later, in G71-based cards, Nvidia returned to the usual overclocking scheme with 1 MHz granularity, but in the eighth GeForce generation, discrete clock frequency changes have appeared again.

The scheme for distributing and changing clock frequencies in the GeForce 8800 is rather non-trivial, owing to the fact that the shader processor blocks in the G80 run at a much higher frequency than the rest of the GPU - the ratio is roughly 2.3 to 1. Although the main frequency of the graphics core can change in steps smaller than 27 MHz, the frequency of the shader processors always changes in 54 MHz (2x27 MHz) steps, which creates additional difficulties during overclocking, because all utilities manipulate the main frequency, not the frequency of the shader "domain". There is, however, a simple formula that lets you determine the frequency of the GeForce 8800 stream processors after overclocking with sufficient accuracy:

OC shader clk = Default shader clk / Default core clk * OC core clk


where OC shader clk is the desired (approximate) frequency, Default shader clk is the initial shader processor frequency, Default core clk is the initial core frequency, and OC core clk is the frequency of the overclocked core.
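The same estimate can be written as a short Python function that also snaps the result to the 54 MHz grid in which the shader domain actually changes. Rounding to the nearest step is our assumption; we chose it because it matches the examples given in this section:

    def oc_shader_clk_mhz(default_shader, default_core, oc_core, step=54.0):
        # scale the shader clock by the core overclock ratio...
        estimate = default_shader / default_core * oc_core
        # ...then snap to the nearest 54 MHz step (assumed rounding mode)
        return round(estimate / step) * step

    # 576/1350 MHz are the clocks of the GeForce 8800 GTX and the MSI card
    print(oc_shader_clk_mhz(1350, 576, 621))   # 1458.0 - see the example below
    print(oc_shader_clk_mhz(1350, 576, 675))   # 1566.0 - our overclocking result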

Let's look at the behavior of the MSI NX8800GTS-T2D640E-HD-OC during overclocking using the RivaTuner2 FR utility, which can monitor the real frequencies of the various areas, or "domains" as they are also called, of the G80 GPU. Since the MSI product has the same GPU clocks (576/1350) as the GeForce 8800 GTX, the information below is also relevant for Nvidia's flagship graphics card. We increased the main GPU frequency in 5 MHz steps: a fairly small step that is deliberately not a multiple of 27 MHz.


Empirical testing confirmed that the main frequency of the graphics core does indeed change in variable steps of 9, 18 or 27 MHz, and we were unable to discern a pattern in these changes. The shader processor frequency in all cases changed in 54 MHz steps. Because of this, some frequencies of the G80's main "domain" turn out to be practically useless for overclocking, and using them will only lead to extra GPU heating. For example, there is no point in raising the main core frequency to 621 MHz - the shader domain frequency will still be 1458 MHz. Thus, the GeForce 8800 should be overclocked carefully, using the above formula and checking the monitoring data of RivaTuner or another utility with similar functionality.

It would be illogical to expect serious overclocking results from a version of the NX8800GTS already overclocked by the manufacturer; nevertheless, the card unexpectedly showed quite good potential, at least on the GPU side. We managed to raise its frequencies from the factory 576/1350 MHz to 675/1566 MHz, and the NX8800GTS steadily passed several 3DMark06 runs in a row without any additional cooling. The processor temperature, according to RivaTuner, did not exceed 70 degrees.

The memory overclocked much worse, since the NX8800GTS OC Edition is equipped with chips rated for 800 (1600) MHz that already operate above their nominal frequency, at 850 (1700) MHz. As a result, we had to stop at 900 (1800) MHz, as further attempts to raise the memory frequency invariably led to freezes or driver crashes.

Thus, the card showed good overclocking potential, but only for the GPU: the relatively slow memory chips did not allow a significant frequency increase. For them, reaching the GeForce 8800 GTX memory frequency should be considered a good achievement, and a 320-bit bus at this frequency already provides a noticeable bandwidth advantage over the Radeon X1950 XTX: 72 GB/sec versus 64 GB/sec. Of course, overclocking results may vary from one MSI NX8800GTS OC Edition sample to another, and may improve with additional measures such as modifying the card's power circuitry or installing water cooling.

Test platform configuration and testing methods

A comparative study of the performance of the GeForce 8800 GTS was carried out on platforms with the following configuration:

AMD Athlon 64 FX-60 processor (2 x 2.60GHz, 2 x 1MB L2)
Abit AN8 32X motherboard (nForce4 SLI X16) for Nvidia GeForce cards
Asus A8R32-MVP Deluxe motherboard (ATI CrossFire Xpress 3200) for ATI Radeon cards
Memory OCZ PC-3200 Platinum EL DDR SDRAM (2x1GB, CL2-3-2-5)
Hard drive Maxtor MaXLine III 7B250S0 (Serial ATA-150, 16MB buffer)
Creative SoundBlaster Audigy 2 sound card
Power supply Enermax Liberty 620W (ELT620AWT, rated power 620W)
Dell 3007WFP monitor (30", maximum resolution 2560x1600)
Microsoft Windows XP Pro SP2, DirectX 9.0c
AMD Catalyst 7.2
Nvidia ForceWare 97.92

Since we consider the use of trilinear and anisotropic filtering optimizations unjustified, the drivers were configured in a standard way, implying the highest possible quality of texture filtering:

AMD Catalyst:

Catalyst A.I.: Standard
Mipmap Detail Level: High Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
High Quality AF: On

Nvidia ForceWare:

Texture Filtering: High Quality
Vertical sync: Off
Trilinear optimization: Off
Anisotropic optimization: Off
Anisotropic sample optimization: Off
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

Each game was set to the highest possible level of graphics quality, and the game configuration files were not modified. To capture performance data, either the game's built-in capabilities or, in their absence, the Fraps utility were used. Where possible, minimum performance data was recorded.

Testing was carried out in three standard resolutions for our methodology: 1280x1024, 1600x1200 and 1920x1200. One of the goals of this review is to evaluate the impact of the GeForce 8800 GTS video memory on performance. In addition, the technical characteristics and cost of both versions of this video adapter allow us to count on a fairly high level of performance in modern games when using FSAA 4x, so we tried to use the “eye candy” mode wherever possible.

FSAA and anisotropic filtering were activated using game tools; in the absence of such, they were forced using the appropriate settings of the ATI Catalyst and Nvidia ForceWare drivers. Testing without full-screen antialiasing was used only for games that do not support FSAA for technical reasons, or when using FP HDR while testing representatives of the GeForce 7 family that do not support simultaneous operation of these capabilities.

Since our task included comparing the performance of graphics cards differing only in the amount of video memory, the MSI NX8800GTS-T2D640E-HD-OC was tested twice: at its factory frequencies, and at frequencies reduced to the GeForce 8800 GTS reference values: 513/1188/800 (1600) MHz. Besides the MSI product and the reference Nvidia GeForce 8800 GTS 320MB, the following video adapters took part in the testing:

Nvidia GeForce 8800 GTX (G80, 576/1350/1900MHz, 128sp, 32tmu, 24rop, 384-bit, 768MB)
Nvidia GeForce 7950 GX2 (2xG71, 500/1200MHz, 48pp, 16vp, 48tmu, 32rop, 256-bit, 512MB)
AMD Radeon X1950 XTX (R580+, 650/2000MHz, 48pp, 8vp, 16tmu, 16rop, 256-bit, 512MB)

The following set of games and applications was used as test software:

3D first-person shooters:

Battlefield 2142
Call of Juarez
Far Cry
F.E.A.R. Extraction Point
Tom Clancy's Ghost Recon Advanced Warfighter
Half-Life 2: Episode One
Prey
Serious Sam 2
S.T.A.L.K.E.R.: Shadow of Chernobyl


3D shooters with third person view:

Hitman: Blood Money
Tomb Raider: Legend


RPG:

Gothic 3
Neverwinter Nights 2
The Elder Scrolls IV: Oblivion


Simulators:

X3: Reunion

Strategy games:

Command & Conquer 3: Tiberium Wars
Company of Heroes
Supreme Commander


Synthetic gaming tests:

Futuremark 3DMark05
Futuremark 3DMark06

Game tests: Battlefield 2142


There is no significant difference between the two GeForce 8800 GTS versions with different amounts of onboard video memory at lower resolutions, although at 1600x1200 the younger model trails the older one by about 4-5 frames per second, with performance quite comfortable for both. The 1920x1440 resolution, however, is a turning point: the GeForce 8800 GTS 320MB drops sharply out of the game, lagging more than 1.5 times behind in average fps and twice in minimum fps. Moreover, it also loses to previous-generation cards. Either video memory is running short, or there is a problem with how the GeForce 8800 family manages it.

The MSI NX8800GTS OC Edition is noticeably ahead of the reference model starting from 1600x1200, but it naturally cannot catch the GeForce 8800 GTX, although at 1920x1440 the gap between these cards becomes impressively narrow. Evidently, the difference in memory bus width between the GeForce 8800 GTS and GTX plays little role here.

Game tests: Call of Juarez


Both GeForce 8800 GTS models show the same performance at all resolutions, including 1920x1200, which is quite logical given that the testing was done with HDR enabled but FSAA disabled. At their nominal frequencies, the cards fall behind the GeForce 7950 GX2.

The overclocked MSI version achieves parity at high resolutions, though using them in this game is impractical even with a GeForce 8800 GTX in your system. For example, at 1600x1200 the average performance of Nvidia's flagship is only 40 fps, with dips to 21 fps in graphically intensive scenes. For a first-person shooter, such figures can hardly be called truly comfortable.

Game tests: Far Cry


The game is far from young and poorly suited for testing modern high-end video adapters. Despite the use of anti-aliasing, noticeable differences in their behavior appear only at 1920x1200. The GeForce 8800 GTS 320MB runs short of video memory here and therefore loses about 12% to the model equipped with 640 MB. However, given Far Cry's modest requirements by today's standards, the player is in no danger of losing comfort.

The MSI NX8800GTS OC Edition is almost on par with the GeForce 8800 GTX: the latter's power is clearly unclaimed in Far Cry.


Due to the nature of the scene recorded at the Research level, the readings are more varied; here, differences in the performance of the various GeForce 8800 family members are visible already at 1600x1200. Moreover, the lag of the 320 MB version is already evident, even though the action takes place in the confined space of an underground cave. The performance gap between the MSI product and the GeForce 8800 GTX at 1920x1200 is noticeably larger than in the previous case, since shader processor performance plays a bigger role at this level.

In FP HDR mode, the GeForce 8800 GTS 320MB no longer has problems with video memory capacity and is in no way inferior to its older brother, providing a decent performance level at all resolutions. The version offered by MSI adds another 15% of speed, but even the variant running at standard clocks is fast enough for 1920x1200, while the GeForce 8800 GTX will undoubtedly keep the player comfortable even at 2560x1600.

Game tests: F.E.A.R. Extraction Point


The visual richness of F.E.A.R. demands corresponding resources from the video adapter, and the GeForce 8800 GTS 320MB's 5% lag is visible already at 1280x1024; at the next resolution, 1600x1200, it abruptly grows to 40%.

The benefits of overclocking the GeForce 8800 GTS are not obvious: both the overclocked and the standard version allow you to play equally well at 1600x1200. At the next resolution, the speed gain from overclocking is simply insufficient to reach a level comfortable for first-person shooters. Only the GeForce 8800 GTX, with its 128 active shader processors and 384-bit memory subsystem, manages that.

Game tests: Tom Clancy's Ghost Recon Advanced Warfighter

Due to the use of deferred rendering, using FSAA in GRAW is technically impossible, therefore, the data is shown only for the anisotropic filtering mode.


The advantage of the MSI NX8800GTS OC Edition over the regular reference card increases as the resolution increases, and at 1920x1200 resolution it reaches 19%. In this case, it is this 19% that allows us to achieve an average performance of 55 fps, which is quite comfortable for the player.

As for comparing two GeForce 8800 GTS models with different amounts of video memory, there is no difference in their performance.

Game Tests: Half-Life 2: Episode One


At 1280x1024 the central processor of our test system is the limiting factor: all the cards show the same result. At 1600x1200 differences emerge, but they are not fundamental, at least for the three GeForce 8800 GTS variants: all three deliver a very comfortable performance level. The same applies to 1920x1200. Despite its high-quality graphics, the game is undemanding of video memory, and the GeForce 8800 GTS 320MB loses only about 5% to the older and much more expensive model with 640 MB of memory onboard. The overclocked GeForce 8800 GTS version offered by MSI confidently takes second place after the GeForce 8800 GTX.

Although the GeForce 7950 GX2 outperforms the GeForce 8800 GTS at 1600x1200, one should not forget the problems that can arise with a card that is essentially an SLI tandem, nor the noticeably lower texture filtering quality of the GeForce 7 family. Nvidia's new solution certainly still has driver problems, but it has promising capabilities and, unlike the GeForce 7950 GX2, every chance of being cured of its "childhood diseases" in the shortest possible time.

Game tests: Prey


The GeForce 8800 GTS 640MB shows not the slightest advantage over the GeForce 8800 GTS 320MB, perhaps because the game uses a modified Doom III engine and displays no particular appetite for video memory. As with GRAW, the increased performance of the NX8800GTS OC Edition lets owners of this video adapter count on fairly comfortable gaming at 1920x1200; for comparison, the regular GeForce 8800 GTS shows the same figures at 1600x1200. The flagship of the line, the GeForce 8800 GTX, is beyond competition.

Game tests: Serious Sam 2


The brainchild of the Croatian developers at Croteam has always strictly demanded 512 MB of video memory, punishing anything less with a monstrous drop in performance. The capacity of the inexpensive GeForce 8800 GTS version is not enough to satisfy the game's appetites; as a result, it managed only 30 fps at 1280x1024, while the version with 640 MB onboard proved more than twice as fast.

For some unknown reason, the minimum performance of all GeForce 8800 cards in Serious Sam 2 is extremely low, which may be due both to architectural features of the family, which, as is known, uses a unified architecture without separate pixel and vertex shaders, and to shortcomings in the ForceWare drivers. For this reason, GeForce 8800 owners cannot yet achieve complete comfort in this game.

Game tests: S.T.A.L.K.E.R.: Shadow of Chernobyl

Eagerly awaited by many players, the GSC Game World project finally saw the light of day after many years of development, some 6 or 7 years after its announcement. The game turned out ambiguous, yet multifaceted enough that it cannot be summed up in a few phrases. We will only note that compared to one of the first builds, the engine has been significantly improved. The game received support for a number of modern technologies, including Shader Model 3.0, HDR and parallax mapping, but retained the ability to work in a simplified mode with a static lighting model, providing excellent performance on weaker systems.

Since we focus on the highest image quality, we tested the game in full dynamic lighting mode with maximum detail. This mode, which among other things involves HDR, does not support FSAA, at least in the current version of S.T.A.L.K.E.R. And since the game loses much of its appeal with the static lighting model and DirectX 8 effects, we limited ourselves to anisotropic filtering.


The game's appetites are far from modest: at maximum detail, even the GeForce 8800 GTX cannot maintain 60 fps at 1280x1024. Note, though, that at low resolutions the main limiting factor is CPU performance, since the spread between the cards is small and their average results are quite close.

Nevertheless, a lag of the GeForce 8800 GTS 320MB behind its older brother is already visible here, and it only worsens as the resolution grows; at 1920x1200 the youngest member of the GeForce 8800 family simply runs out of video memory, which is unsurprising given the scale of the game scenes and the abundance of special effects in them.

Overall, the GeForce 8800 GTX provides no serious advantage over the GeForce 8800 GTS in S.T.A.L.K.E.R., and the Radeon X1950 XTX looks just as good as the GeForce 8800 GTS 320MB. AMD's solution even somewhat outdoes Nvidia's in that it still runs at 1920x1200; however, using this mode makes little practical sense given the average performance of 30-35 fps. The same applies to the GeForce 7950 GX2, which, incidentally, is somewhat ahead of both its direct competitor and the younger model of the new generation.

Game tests: Hitman: Blood Money


We noted earlier that 512 MB of video memory gives a video adapter some gains in Hitman: Blood Money at high resolutions. Apparently 320 MB is also enough: the GeForce 8800 GTS 320MB performs almost on par with the regular GeForce 8800 GTS regardless of the resolution used; the difference does not exceed 5%.

Both cards, as well as the overclocked GeForce 8800 GTS offered by MSI, allow you to successfully play in all resolutions, and the GeForce 8800 GTX even allows the use of higher-quality FSAA modes than the usual MSAA 4x, since it has the necessary performance margin for this.

Game Tests: Tomb Raider: Legend


Despite settings providing maximum graphics quality, the GeForce 8800 GTS 320MB handles the game just as well as the regular GeForce 8800 GTS; both cards make 1920x1200 available to the player in "eye candy" mode. The MSI NX8800GTS OC Edition slightly outdoes both reference cards, but only in average fps - the minimum stays the same, and no higher than that of the GeForce 8800 GTX, which suggests this figure is determined by some peculiarity of the game engine.

Game tests: Gothic 3

The current version of Gothic 3 does not support FSAA, so testing was carried out using only anisotropic filtering.


Despite the lack of full-screen anti-aliasing, the GeForce 8800 GTS 320MB falls seriously behind not only the regular GeForce 8800 GTS but also the Radeon X1950 XTX, staying only slightly ahead of the GeForce 7950 GX2. With 26-27 fps at 1280x1024, this card is not the best choice for Gothic 3.

Note that the GeForce 8800 GTX outperforms the GeForce 8800 GTS by at best 20%. Apparently, the game cannot use all the resources of Nvidia's flagship model, as the slight difference between the regular and overclocked GeForce 8800 GTS versions also suggests.

Game tests: Neverwinter Nights 2

As of version 1.04, the game allows the use of FSAA, but HDR support is still a work in progress, so we tested NWN 2 in "eye candy" mode.


As already mentioned, the playability threshold for Neverwinter Nights 2 is 15 frames per second, and the GeForce 8800 GTS 320MB balances on this edge already at 1600x1200, whereas for the version with 640 MB of memory, 15 fps is the floor below which its performance never drops.

Game Tests: The Elder Scrolls IV: Oblivion

Without HDR the game loses much of its appeal, and although players differ on this point, we tested TES IV with FP HDR enabled.


The performance of the GeForce 8800 GTS 320MB depends directly on the resolution: at 1280x1024 the new product competes with the most powerful cards of the previous generation, but at 1600x1200 and especially 1920x1200 it falls behind, losing up to 10% to the Radeon X1950 XTX and up to 25% to the GeForce 7950 GX2. Still, this is a very good result for a solution with an official price of just $299.

The regular GeForce 8800 GTS and its overclocked version offered by MSI feel more confident and provide comfortable performance at the level of first-person shooters in all resolutions.


When we examined two versions of the GeForce 7950 GT differing in the amount of video memory, we recorded no serious performance differences in TES IV; with the two versions of the GeForce 8800 GTS, the picture is completely different. While they behave identically at 1280x1024, at 1600x1200 the 320 MB version is more than twice as slow as the one equipped with 640 MB, and at 1920x1200 its performance drops to the level of the Radeon X1650 XT. Quite obviously, the issue is not the amount of video memory as such, but how the driver allocates it. The problem can probably be fixed by a ForceWare update, and we will check this as new Nvidia driver versions are released.

As for the GeForce 8800 GTS and the MSI NX8800GTS OC Edition, even in the open expanses of the Oblivion world they provide a high level of comfort at all resolutions, though of course not the 60 fps seen in enclosed spaces. The most powerful solutions of the previous generation simply cannot compete with them.

Game tests: X3: Reunion


The average performance of all GeForce 8800 family members is quite high, but the minimum is still low, which means the drivers need further work. The results of the GeForce 8800 GTS 320MB match those of the GeForce 8800 GTS 640MB.

Game Tests: Command & Conquer 3: Tiberium Wars

The Command & Conquer real-time strategy series is probably familiar to anyone with even a passing interest in computer games. The continuation recently released by Electronic Arts takes the player back to the familiar confrontation between GDI and the Brotherhood of Nod, joined this time by a third faction of alien invaders. The game engine is modern and uses advanced special effects; it also has one peculiarity - an fps limiter fixed at 30 frames per second, probably there to cap the AI's speed and avoid giving it an unfair advantage over the player. Since the limiter cannot be disabled by standard means, we tested the game with it enabled, paying attention primarily to minimum fps.


Almost all test participants can deliver 30 fps at all resolutions, except the GeForce 7950 GX2, which has problems with SLI operation. Most likely the driver simply lacks the appropriate support, since the official Windows XP Nvidia ForceWare driver for the GeForce 7 family was last updated more than six months ago.

As for the two GeForce 8800 GTS models, they demonstrate the same minimum fps and therefore provide the same level of comfort for the player. Although the model with 320 MB of video memory falls behind the older model at 1920x1200, 2 frames per second is hardly a critical difference, and with identical minimum performance it has no effect on gameplay. Completely smooth gameplay, free of any jerkiness, is delivered only by the GeForce 8800 GTX, whose minimum fps never falls below 25 frames per second.

Game tests: Company of Heroes

Due to issues with FSAA activation in this game, we decided not to use the "eye candy" mode and tested it in pure performance mode with anisotropic filtering enabled.


Here is another game in which the GeForce 8800 GTS 320MB loses to the previous generation with its non-unified architecture. Nvidia's $299 solution is effectively suitable for resolutions no higher than 1280x1024 even with anti-aliasing disabled, while the $449 model, differing only in the amount of video memory, allows successful play even at 1920x1200 - as does the AMD Radeon X1950 XTX.

Game Tests: Supreme Commander


Supreme Commander, unlike Company of Heroes, makes no strict demands on video memory size: in this game, the GeForce 8800 GTS 320MB and GeForce 8800 GTS show equally high results. Some additional gain can be had through overclocking, as the MSI product demonstrates, but even that does not reach GeForce 8800 GTX level. Still, the available performance is enough for all resolutions, including 1920x1200, especially since its fluctuations are small and the minimum fps is only slightly below the average.

Synthetic tests: Futuremark 3DMark05


Since by default 3DMark05 uses a resolution of 1024x768 and does not use full-screen anti-aliasing, the GeForce 8800 GTS 320MB naturally demonstrates the same result as the regular version with 640 MB of video memory. The overclocked version of the GeForce 8800 GTS, supplied to the market by Micro-Star International, boasts a nice even result of 13,800 points.

Unlike the overall score, which is obtained in the default mode, we get the individual test results by running them in "eye candy" mode. In this case, that did not affect the GeForce 8800 GTS 320MB in any way: no noticeable lag behind the GeForce 8800 GTS was recorded even in the third, most resource-intensive test. The MSI NX8800GTS OC Edition took a stable second place after the GeForce 8800 GTX in all cases, confirming the overall standings.

Synthetic tests: Futuremark 3DMark06


Both GeForce 8800 GTS variants behave the same as in the previous case. However, 3DMark06 uses more complex graphics, which, combined with FSAA 4x in some tests, may paint a different picture. Let's take a look.

The results of the individual test groups are consistent with the totals. The SM3.0/HDR group uses a larger number of more complex shaders, so the advantage of the GeForce 8800 GTX is more pronounced there than in the SM2.0 group. The AMD Radeon X1950 XTX also looks better where Shader Model 3.0 and HDR are actively used, while the GeForce 7950 GX2, on the contrary, shines in the SM2.0 tests.

With FSAA enabled, the GeForce 8800 GTS 320MB really does start losing to the GeForce 8800 GTS 640MB at 1600x1200, and at 1920x1200 Nvidia's new solution cannot complete the tests at all for lack of video memory. The loss is close to twofold in both the first and second SM2.0 tests, even though they differ greatly in scene construction.

In the first SM3.0/HDR test, the effect of video memory on performance is clearly visible already at 1280x1024: the younger GeForce 8800 GTS model trails the older one by about 33%, and at 1600x1200 the gap grows to nearly 50%. The second test, with its much less complex and large-scale scene, is less demanding of video memory; here the lag is 5% and about 20%, respectively.

Conclusion

Time to take stock. We tested two Nvidia GeForce 8800 GTS models, one of which is a direct competitor to the AMD Radeon X1950 XTX, while the other is aimed at the $299 performance-mainstream sector. What can we say in light of the gaming test results?

The older model, with an official price of $449, showed itself well in terms of performance. In most tests the GeForce 8800 GTS outperformed the AMD Radeon X1950 XTX; only in some cases did it merely match the AMD solution or fall behind the dual-GPU GeForce 7950 GX2 tandem. However, given the exceptionally high performance of the GeForce 8800 GTS 640MB, we would not put it on the same level as previous-generation products: they lack DirectX 10 support, while the GeForce 7950 GX2 additionally suffers from noticeably worse anisotropic filtering quality and from potential problems caused by the incompatibility of certain games with Nvidia SLI technology.

The GeForce 8800 GTS 640MB can confidently be called the best solution in the $449-$499 price range. It is worth noting, though, that the new generation of Nvidia products has not yet been cured of its childhood diseases: flickering shadows are still visible in Call of Juarez, and Splinter Cell: Double Agent, though it works, requires a special launch procedure with drivers version 97.94. At least until cards based on AMD's new-generation graphics processor appear on the market, the GeForce 8800 GTS has every chance of holding its rightful place as the "best $449 accelerator". Still, before buying a GeForce 8800 GTS we would recommend checking the new Nvidia family's compatibility with your favorite games.

The new $299 GeForce 8800 GTS 320MB is also a very good buy for the money: DirectX 10 support, high-quality anisotropic filtering and a decent performance level at typical resolutions are just some of its advantages. So if you plan to play at 1280x1024 or 1600x1200, the GeForce 8800 GTS 320MB is an excellent choice.

Unfortunately, this technically very promising card, which differs from the more expensive version only in the amount of video memory, sometimes falls seriously behind the GeForce 8800 GTS 640MB - not only in games with high video memory requirements, such as Serious Sam 2, but also in games where no performance difference between 512 MB and 256 MB cards had previously been recorded. These include, in particular, TES IV: Oblivion, Neverwinter Nights 2, F.E.A.R. Extraction Point and some others. Considering that 320 MB of video memory is clearly more than 256 MB, the problem evidently lies in inefficient memory allocation, though unfortunately we do not know whether it stems from driver flaws or something else. Even with these shortcomings, however, the GeForce 8800 GTS 320MB looks much more attractive than the GeForce 7950 GT and Radeon X1950 XT, although the latter will inevitably drop in price with this video adapter's arrival.

As for the MSI NX8800GTS-T2D640E-HD-OC, we have a well-equipped product that differs from the Nvidia reference card in more than just packaging, accessories and a sticker on the cooler. The video adapter is overclocked by the manufacturer and in most games provides a noticeable performance gain over the standard GeForce 8800 GTS 640MB. It cannot reach GeForce 8800 GTX level, of course, but extra fps never hurt. These cards are apparently carefully selected for their ability to run at higher frequencies; at least our sample showed quite good overclocking results, and it is possible that most NX8800GTS OC Edition samples can be overclocked well beyond the factory settings.

The inclusion of a two-disc edition of Company of Heroes, considered by many game reviewers to be the best strategy game of the year, deserves special praise. If you are serious about purchasing a GeForce 8800 GTS, then this MSI product has every chance of becoming your choice.

MSI NX8800GTS-T2D640E-HD-OC: advantages and disadvantages

Advantages:

Increased performance level compared to the reference GeForce 8800 GTS
High level of performance at high resolutions using FSAA
Low noise level
Good overclocking potential
Good equipment

Flaws:

Insufficiently debugged drivers

GeForce 8800 GTS 320MB: advantages and disadvantages

Advantages:

High level of performance in its class
Support for new modes and anti-aliasing methods
Excellent anisotropic filtering quality
Unified architecture with 96 shader processors
Future proof: support for DirectX 10 and Shader Model 4.0
Efficient cooling system
Low noise level

Flaws:

Insufficiently debugged drivers (problem with video memory allocation, poor performance in some games and/or modes)
High energy consumption

Comparative testing of four video cards: GeForce 8800GTS 512 and 8800GT

Let's take a look at several GeForce 8800GTS 512 boards and compare them with the cheaper GeForce 8800GT and the veteran GeForce 8800GTX. Along the way we break in a new test bench and catalogue DX10 driver flaws.

With the release of the new GeForce 8800GTS 512 series, NVIDIA has significantly strengthened its position. The new product replaced the more expensive, hotter and bulkier GeForce 8800GTX; its only cuts relative to its predecessor are a narrower 256-bit memory bus (versus 384-bit on the GeForce 8800GTX) and a smaller memory size of 512 MB (versus 768 MB). The new product has seen not only reductions, though, but also improvements: the number of texture units has been raised from 32 to 64, which undoubtedly goes some way toward compensating for the cuts, as do the higher clock frequencies. Furthermore, the video memory is easily expanded to 1 GB simply by fitting higher-capacity chips, which some manufacturers have already begun to do.

But although the GeForce 8800GTS 512 replaced the GeForce 8800GTX, its main competitor is not its predecessor but its closest relative, the GeForce 8800GT, and the whole point is the latter's lower price. The two cards differ little: the GeForce 8800GT is a cut-down version of the GeForce 8800GTS 512 that, oddly enough, reached the market before the full-fledged version. Both are equipped with 512 MB of video memory and, as today's tests showed, identical memory chips. The main differences lie in the graphics processor - specifically, some of its functional blocks are disabled in the GT version. See the table below for details:

As you can see, the GeForce 8800GT differs from its older sister in having its universal processors cut to 112 and its texture units cut to 56. The cards also nominally differ in clock speeds, though that matters little for today's review, since almost all the cards tested have been factory overclocked. Let's find out how the on-paper differences translate into reality.

Leadtek 8800GTS 512

The designers from Leadtek chose a bright orange color to attract attention to their video card, and they were absolutely right: the new product will not go unnoticed.
The face of the new product was an image of a scene from a fictional shooter, under which were the technical characteristics of the video card and a note about the bonus - the full version of the game Neverwinter Nights 2.
The back of the box carries the video card's specifications, a list of the package contents and standard information from NVIDIA. The package includes:
  • S-video splitter > S-video + component out;
  • DVI > D-sub adapter;
  • CD with drivers;
  • CD with Power DVD 7 program;

The Leadtek 8800GTS 512 video card is based on the reference design familiar from GeForce 8800GT boards. Externally, the new product is distinguished by a “two-story” cooling system which, unlike its predecessor's, exhausts hot air outside the computer case. The advantages of such a solution are obvious, and the reason for the improved cooler is most likely not that the “new” chip runs hotter, but that a buyer paying more has every right to a better product; frankly, the GeForce 8800GT reference cooler does not cope with its duties especially well.
The reverse sides of the GeForce 8800GTS 512 and GeForce 8800GT look almost identical and differ in that the 8800GTS 512 version has all the elements mounted. However, we will be able to see the differences later on the example of the Leadtek 8800GT video card, but for now let’s get under the hood of the new product.
By removing the cooling system, we can again verify that the boards are identical. However, pay attention to the right side of the board, where the power subsystem is located. Where the GeForce 8800GT is empty and has only seats, the Leadtek 8800GTS 512 has a space densely populated with radio elements. It turns out that the GeForce 8800GTS 512 has a more complex power subsystem than the GeForce 8800GT. In principle, this is not surprising, since the GeForce 8800GTS 512 has higher operating frequencies, and, consequently, more stringent requirements for power quality.
There are no external differences between the G92 chip in the Leadtek 8800GTS 512 and the G92 chip in the GeForce 8800GT video cards.
The new video card uses the same Qimonda chips with a 1.0 ns access time as the GeForce 8800GT. A set of eight chips forms 512 MB of video memory. The nominal frequency for such chips is 2000 MHz DDR, but the actual frequency set in the video card is slightly lower.
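
The rated speed follows directly from the access time: a 1.0 ns cycle corresponds to a 1000 MHz command clock, which DDR signaling doubles. A minimal sketch of the arithmetic, using the figures quoted above:

    # Rated effective (DDR) frequency of a memory chip from its access time.
    access_time_ns = 1.0
    real_clock_mhz = 1000 / access_time_ns   # 1.0 ns cycle -> 1000 MHz command clock
    effective_mhz = 2 * real_clock_mhz       # DDR transfers data twice per clock
    print(effective_mhz)                     # 2000.0 MHz, the chips' rated speed
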
The cooling system for the video card is aluminum with a copper plate. This combination of two materials has been used for a long time and allows one to achieve the required efficiency with less weight and a lower price.
The processing of the copper “core” is at a satisfactory level, but no more.
After removing the casing from the cooling system, we are presented with an amazing picture: as many as three heat pipes are engaged in removing heat from the copper base, which go to different parts of the radiator made of aluminum plates. This scheme serves to distribute heat evenly, and the large dimensions of the radiator should have the best effect on the quality of cooling, which cannot be said about the reference cooling system of the GeForce 8800GT. There are also three heat pipes, but their dimensions are noticeably smaller, as are the dimensions of the radiator itself.

Differences, overclocking and cooling system efficiency


The differences from the GeForce 8800GT lie in the number of universal processors increased from 112 to 128, as well as the operating frequencies of the entire GPU.
The Leadtek 8800GTS 512 frequencies correspond to the recommended ones and are equal to 650/1625 MHz for the graphics processor and 1944 MHz for video memory.

Now - about the heating of the video card, which we will check using the Oblivion game with maximum settings.


The Leadtek 8800GTS 512 video card heated up from 55 degrees at idle to 71 degrees under load, while the fan remained practically inaudible. This was not enough for overclocking, however, so using RivaTuner we raised the fan speed to 50% of its maximum.
After that, the GPU temperature did not rise above 64 degrees, while the noise level remained low. The Leadtek 8800GTS 512 overclocked to 756/1890 MHz for the GPU and 2100 MHz for the video memory - frequencies that were out of reach for the GeForce 8800GT, apparently because of its simplified power subsystem.
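
For perspective, here is how much headroom that overclock represents relative to the stock clocks; a small sketch using the frequencies named above:

    # Overclocking headroom of the Leadtek 8800GTS 512, relative to stock clocks.
    stock = {"core": 650, "shader": 1625, "memory": 1944}       # MHz
    overclocked = {"core": 756, "shader": 1890, "memory": 2100}
    for domain, base in stock.items():
        print(f"{domain}: +{100 * (overclocked[domain] / base - 1):.1f}%")
    # core: +16.3%, shader: +16.3%, memory: +8.0%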

Well, let's get acquainted with the next participant in our testing today - the ASUS EN8800GTS TOP video card.

ASUS EN8800GTS TOP


Looking at the packaging of powerful ASUS video cards, you might think it holds not a video card but, say, a motherboard. It is all about the sheer size: in our case the box is noticeably larger than that of the first participant in today's testing. The large front panel accommodates a big image of the brand's archer girl and a sizeable chart showing a 7% speed increase over the “regular” GeForce 8800GTS 512. The abbreviation “TOP” in the card's name indicates factory overclocking. One downside of the packaging is that it is not obvious the card belongs to the GeForce 8800GTS 512 series, but by and large these are minor things. At first the scarcity of information on the box is surprising, but the truth reveals itself later - quite literally.
As soon as you pick the box up by its handle, it opens like a book. The information under the cover is devoted entirely to proprietary ASUS utilities: ASUS Gamer OSD, which can not only change brightness/contrast/color in real time but also display the FPS value, record video and take screenshots; and Smart Doctor, which monitors the card's supply voltages and frequencies and also allows overclocking. Notably, the ASUS utility can change both GPU clock domains, the core and the shader unit, which brings it very close to the well-known RivaTuner utility.
The back of the box contains a little bit of everything, in particular, a brief description of the Video Security utility, designed to use your computer as a “smart” online video surveillance system.
The bundle is assembled on a “nothing superfluous” principle:
  • adapter for powering PCI-express cards;
  • adapter S-video > component out;
  • DVI > D-sub adapter;
  • bag for 16 discs;
  • CD with drivers;
  • CD with documentation;
  • Brief instructions for installing a video card.

Externally, the video card is an almost exact copy of the Leadtek 8800GTS 512, and this is not surprising: both cards are based on the reference design and, most likely, were produced at the same factory by order of NVIDIA itself, and only then sent to Leadtek and ASUS. Simply put, today, a card from Leadtek could very well become a card from ASUS, and vice versa.
It is clear that the reverse side of the video card is also no different from that of the Leadtek 8800GTS 512, except that they have different branded stickers.
There is also nothing unusual under the cooling system. The power supply system on the right side of the board is fully assembled, in the center is the G92 graphics processor with 128 active stream processors and eight memory chips, totaling 512 MB.
The memory chips are manufactured by Qimonda and have an access time of 1.0 ns, which corresponds to a frequency of 2000 MHz.
The appearance of the GPU does not reveal its noble origin, just like the Leadtek 8800GTS 512.
The cooling system of the ASUS EN8800GTS TOP video card is exactly the same as that of the Leadtek 8800GTS 512 video card: a copper “core” is built into the aluminum radiator to remove heat from the GPU.
The polishing quality of the copper core is satisfactory, like its predecessor.
The heat from the copper core is distributed across the aluminum fins using three copper heat pipes. We have already seen the effectiveness of this solution using the example of the first card.

Rated frequencies and overclocking

As we have already said, the TOP prefix after the name of the video card indicates its factory overclocking. The standard frequencies of the new product are 740/1780 MHz for the GPU (versus 650/1625 MHz for Leadtek) and 2072 MHz for video memory (versus 1944 MHz for Leadtek). Note that for memory chips with 1.0 ns access time, the nominal clock speed is 2000 MHz.

We managed to overclock the card to the same frequencies as the Leadtek 8800GTS 512: 756/1890 MHz for the GPU and 2100 MHz for the video memory at a fan speed of 50% of the maximum.

Well, now let's go down a notch and get acquainted with two video cards of the GeForce 8800GT class.

Leadtek 8800GT

The Leadtek 8800GT video card is a typical representative of the GeForce 8800GT series and, in truth, differs little from the majority. The point is simply that GeForce 8800GT cards are cheaper than the “advanced” GeForce 8800GTS 512, which makes them no less interesting.
The box of the Leadtek 8800GT is almost the same as that of the more expensive 8800GTS 512. The differences are in the smaller thickness, the absence of a carrying handle and, of course, in the name of the video card. The inscription “extreme” after the name of the video card indicates its factory overclocking.
The back of the box contains brief information about the video card, its advantages and a list of accessories. By the way, in our case, the game Neverwinter Nights 2 and instructions for installing the video card were missing.
The new product package includes:
  • adapter for powering PCI-express cards;
  • S-video splitter > S-video + component out;
  • DVI > D-sub adapter;
  • CD with drivers;
  • CD with Power DVD 7 program;
  • CD with the full version of the game Neverwinter Nights 2;
  • Brief instructions for installing a video card.

The Leadtek 8800GT video card is manufactured according to the reference design and differs in appearance only by the sticker on the cooling system casing.
The reverse side of the video card also does not stand out in any way, however, after getting acquainted with the GeForce 8800GTS 512 video card, the missing row of chip capacitors on the left of the board attracts attention.
The cooling system is manufactured according to a reference design and is well known to us from previous reviews.
When examining the printed circuit board, one notices the absence of elements on the right side of the card, which, as we have already seen, are mounted in the 8800GTS 512 version. Otherwise, it is a completely ordinary board with a G92 graphics processor “cut” to 112 stream processors and eight memory chips, in total forming 512 MB.
Like the previous participants in today's testing, the memory chips of the Leadtek 8800GT are manufactured by Qimonda and have an access time of 1.0 ns, which corresponds to 2000 MHz.

Rated frequencies and overclocking

As already mentioned, the Leadtek 8800GT comes factory overclocked, with nominal frequencies of 678/1700 MHz for the graphics processor and 2000 MHz for the video memory. Despite this considerable factory overclock, however, the card did not post the best result under manual overclocking: only 713/1782 MHz for the graphics processor and 2100 MHz for the video memory. Recall that the participants in previous reviews reached 740/1800 MHz for the GPU and 2000-2100 MHz for the video memory. Note also that we achieved this result at the cooling fan's maximum speed since, as we have said, the GeForce 8800GT reference cooler does not cope with its duties especially well.

Now let's move on to the next participant in today's testing.

Palit 8800GT sonic


The face of the Palit 8800GT sonic is a combat frog in a spectacular outfit. Silly, but very funny! However, life is full of nonsense, and it does not hurt to be reminded of that now and then. Moving from fun to business, pay attention to the lower right corner, where a sticker lists the card's frequencies and other characteristics. The new product's frequencies are almost those of the GeForce 8800GTS 512: 650/1625 MHz for the graphics processor and 1900 MHz for the video memory, only 44 MHz short of the 8800GTS 512.
The back side of the box does not contain anything remarkable, because everything interesting is located on the front side.
The new product package includes:
  • adapter for powering PCI-express cards;
  • adapter S-video > component out;
  • S-video > composite (RCA) adapter;
  • DVI > D-sub adapter;
  • DVI > HDMI adapter;
  • CD with drivers;
  • CD with the full version of the game Tomb Raider: Legend;
  • Brief instructions for installing a video card.
It should be noted that this is the first GeForce 8800GT-class video card to visit our test laboratory with a DVI > HDMI adapter; previously, only some video cards of the AMD Radeon family came with such an adapter.
Here comes the first surprise! The Palit 8800GT sonic video card is based on a printed circuit board of its own design and is equipped with a proprietary cooling system.
The reverse side of the video card also has differences, but it is still difficult for us to judge the pros and cons of the new design. But we can fully judge the installation of video card components and its quality.
Since the standoffs between the GPU heatsink and the board are shorter than the gap they bridge, and the heatsink is secured with screws without any damping spacers, both the board itself and the graphics chip substrate are noticeably bowed. Unfortunately, this can damage them; the weak point is not the strength of the PCB but the traces, which can crack under tension. It is by no means certain this will happen, but the manufacturer should pay more attention to how cooling systems are mounted on its video cards.
The cooling system is made of painted aluminum and consists of three parts - for the graphics processor, video memory and power subsystem. The base of the heatsink for the GPU does not boast any special treatment, and a solid gray mass is used as a heat-conducting interface.
Changes in the design of the printed circuit board affected the power subsystem; small elements were replaced with larger ones, and their layout changed. Otherwise, we have before us the well-known GeForce 8800GT with a G92 graphics processor and eight video memory chips, totaling 512 MB.
Like the rest of today's test participants, the memory chips are manufactured by Qimonda and have an access time of 1.0 ns.

Cooling efficiency and overclocking

We will test the effectiveness of the proprietary cooling system used in the Palit 8800GT sonic using the Oblivion game with maximum settings, as always.


The video card warmed up from 51 to 61 degrees, which is, on the whole, a very good result. However, the fan speed increased noticeably, and the already-not-quiet cooling system became clearly audible against the general background. It is therefore hard to recommend the Palit card to lovers of silence.

Despite changes in the power subsystem and improved cooling, the Palit 8800GT sonic video card was overclocked to the usual frequencies of 734/1782 MHz for the graphics processor and 2000 MHz for video memory.

Now we have finished getting acquainted with the participants of today’s testing, and therefore let’s move on to reviewing the test results.

Testing and conclusions

Today's testing is notable not only because we compare four video cards head to head, but also because it was performed on a new test bench, different from the one you are familiar with.

The change of test platform is explained by the fact that we initially planned to test the Leadtek 8800GTS 512 and ASUS EN8800GTS TOP in SLI mode; unfortunately, the ASUS card did not survive our abuse to the end of the tests, and the idea fell through. We have therefore decided to move SLI testing into a separate article as soon as the necessary hardware is in our hands, and for now we limit ourselves to tests of single video cards. In total we compare seven video cards, one of which is a GeForce 8800GTS 512 overclocked to 756/1890/2100 MHz. For comparison we added GeForce 8800GT and GeForce 8800GTX cards operating at the frequencies recommended by NVIDIA. To make it easier to navigate, we present a table with the clock frequencies of all the participants:

Video card name GPU frequency, core/shader unit, MHz Effective video memory frequency, MHz
Leadtek 8800GTS 512 650 / 1625 1944
ASUS EN8800GTS TOP 740 / 1780 2072
Leadtek 8800GT 678 / 1674 2000
Palit 8800GT 650 / 1625 1900
Overclocked GeForce 8800GTS 512 (in the diagram 8800GTS 512 756/1890/2100) 756 / 1890 2100
GeForce 8800GT (8800GT in the diagram) 600 / 1500 1800
GeForce 8800GTX (8800GTX in the diagram) 575 / 1350 1800

We used ForceWare 169.21 and ForceWare 169.25 drivers for Windows XP and Windows Vista, respectively. Let's traditionally start getting acquainted with the test results with the 3DMark tests:
Based on the 3DMark test results, of course, you can see who is stronger and who is weaker, but the difference is so small that there are no obvious leaders. But still, it is worth noting the fact that the most expensive of the participants - the GeForce 8800GTX video card - took last place. To complete the picture, it is necessary to familiarize yourself with the results of game tests, which, as before, we performed with 4x anti-aliasing and 16x anisotropic filtering.
In Call of Duty 4, it is striking that the Leadtek 8800GT is almost on a par with the Leadtek 8800GTS 512, while the ASUS EN8800GTS TOP barely trails the overclocked GeForce 8800GTS 512. The Palit 8800GT took the penultimate place, slightly ahead of the reference GeForce 8800GT. The winner was the GeForce 8800GTX, apparently thanks to its memory bus, which is wider than that of the other participants.
In Call of Juarez under Windows XP, the Leadtek 8800GTS 512 is almost on a par with the GeForce 8800GTX, which its wider memory bus no longer saves. Note that the Leadtek 8800GT does not lag behind them and at 1024x768 even outperforms them, thanks to higher frequencies than the other two cards. The leaders are the ASUS card and the overclocked GeForce 8800GTS 512; in penultimate place, again, is the Palit card, just ahead of the GeForce 8800GT.
Under Windows Vista, Call of Juarez had problems at 1600x1200: large swings in speed and, in places, very severe stuttering. We assume the problem is a shortage of video memory in such a heavy mode; whether this is true or not we will check in the next review using an ASUS 8800GT with 1 GB of video memory. Note right away that the GeForce 8800GTX had no problems. Judging by the results at the two lower resolutions, the balance of power is virtually unchanged from Windows XP, except that the GeForce 8800GTX remembered its noble origins, though it did not become the leader.
In Crysis under Windows XP the balance of power shifted a little, but in essence everything remains the same: the Leadtek 8800GTS 512 and Leadtek 8800GT sit at roughly the same level, the leaders are the ASUS EN8800GTS TOP and the overclocked GeForce 8800GTS 512, and last place goes to the GeForce 8800GT. Note also that as the resolution grows, the gap between the overclocked GeForce 8800GTS 512 and the GeForce 8800GTX narrows thanks to the latter's wider memory bus. High clock frequencies still win out, though, and yesterday's champion is left out of work.
The Windows Vista problem at 1600x1200 did not bypass Crysis either, sparing only the GeForce 8800GTX. As in Call of Juarez, there were speed swings and sometimes very severe drops in performance, occasionally below one frame per second. Judging by the results at the two lower resolutions, this time the Leadtek 8800GTS 512 outdid its younger sister and took third place. The first places went to the ASUS EN8800GTS TOP, the overclocked GeForce 8800GTS 512 and the GeForce 8800GTX, which finally took the lead at 1280x1024.
In Need for Speed Pro Street Racing the GeForce 8800GTX leads, and at 1024x768 by a large margin. It is followed by the Leadtek 8800GTS 512, then the ASUS EN8800GTS TOP and the overclocked GeForce 8800GTS 512, with the GeForce 8800GT and Palit 8800GT sonic cards in the last places. Since the GeForce 8800GTX came out on top, we can conclude that the game depends heavily on video memory bandwidth. That would also explain why the overclocked GeForce 8800GTS 512 versions ended up slower than the stock one: apparently, memory latencies grew along with the memory clock.
In Need for Speed Carbon we see a familiar picture: the Leadtek 8800GTS 512 and Leadtek 8800GT are roughly on a par, the first places go to the overclocked GeForce 8800GTS 512 and the ASUS EN8800GTS TOP, and last place goes to the GeForce 8800GT. The GeForce 8800GTX looks good, but nothing more.
In Oblivion, it is striking that at 1024x768 the overclocked GeForce 8800GTS 512 and the ASUS EN8800GTS TOP took last place. We assumed the problem was memory latencies that had grown along with the frequency, and we were right: after lowering the memory frequency of the overclocked GeForce 8800GTS 512 to its nominal value, the card scored over 100 frames per second. As the resolution grows things return to normal, and the former outsiders become the leaders. It is also noteworthy that the Leadtek 8800GT outperforms the Leadtek 8800GTS 512, apparently thanks to its higher shader-unit frequency.
The Prey game turned out to be undemanding for all video cards, and they were arranged according to their clock frequencies. Except that the GeForce 8800GTX behaved a little differently, but this is understandable, because it has a wider memory bus, and the game is highly dependent on its bandwidth.

Conclusions

The purpose of today's testing was to find out how much these video cards differ from one another and whether the premium for the “advanced” GeForce 8800GTS 512 is justified. The results show the cards to be very close to each other, even though the GeForce 8800GTS 512 is superior to the GeForce 8800GT on paper, including in the number of active functional units inside the GPU. The clear advantages of the new GeForce 8800GTS 512 cards are a high-quality, quiet cooling system and greater overclocking potential than the GeForce 8800GT. The card from ASUS deserves particular mention: thanks to its factory overclock it occupies a leading position. Of course, you can overclock a card yourself, and most likely any GeForce 8800GTS 512 will reach the frequencies of the ASUS card. Overall, we note once again that the new family of video cards based on the G92 graphics chip has turned out very successful and may well replace the recent leader, the GeForce 8800GTX.

Pros and cons of individual video cards:

Leadtek 8800GTS 512

Pros:
  • good overclocking potential;
  • good equipment;
  • bright and convenient packaging.
Cons:
  • none noted.

ASUS EN8800GTS TOP

Pros:
  • factory overclocking;
  • high-quality cooling system;
  • good overclocking potential.
Cons:
  • the packaging is too large and unwieldy.

Leadtek 8800GT

Pros:
  • factory overclocking;
  • good equipment.
Cons:
  • none noted.

Palit 8800GT sonic

Pros:
  • factory overclocking;
  • alternative cooling system;
  • good equipment.
Cons:
  • board noticeably bowed in the GPU area;
  • noticeable fan noise.

To begin with, NVIDIA installed the G80 in two video cards: the GeForce 8800 GTX and the GeForce 8800 GTS.

Specifications of GeForce 8800 series video cards
GeForce 8800 GTX GeForce 8800 GTS
Number of transistors 681 million 681 million
Core frequency (including allocator, texture units, ROP units) 575 MHz 500 MHz
Shader (stream processor) frequency 1350 MHz 1200 MHz
Number of shaders (stream processors) 128 96
Memory frequency 900 MHz (1.8 GHz effective) 800 MHz (1.6 GHz effective)
Memory interface 384 bit 320 bit
Memory Bandwidth (GB/s) 86.4 GB/s 64 GB/s
Number of ROP blocks 24 20
Memory 768 MB 640 MB

As you can see, the GeForce 8800 GTX and 8800 GTS have the same transistor count because they use exactly the same G80 GPU. As already mentioned, the main difference between these GPU variants is two disabled banks of stream processors, 32 shaders in total, which reduces the number of working shader units from 128 in the GeForce 8800 GTX to 96 in the GeForce 8800 GTS. NVIDIA also disabled one ROP partition, cutting the rasterization units from 24 to 20.

The core and memory frequencies of these video cards are also slightly different: the core frequency of the GeForce 8800 GTX is 575 MHz, and that of the GeForce 8800 GTS is 500 MHz. GTX shader units operate at 1350 MHz, GTS – 1200 MHz. With the GeForce 8800 GTS, NVIDIA also uses a narrower 320-bit memory interface and 640 MB of slower memory that runs at 800 MHz. GeForce 8800 GTX has a 384-bit memory interface, 768 MB memory / 900 MHz. And, of course, a completely different price.
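
The bandwidth figures above follow directly from the memory clocks and bus widths; a quick sketch of the arithmetic:

    # Peak memory bandwidth = effective transfer rate x bus width in bytes.
    def bandwidth_gb_s(effective_mhz, bus_bits):
        return effective_mhz * 1e6 * (bus_bits // 8) / 1e9

    print(bandwidth_gb_s(1800, 384))  # 86.4 GB/s -> GeForce 8800 GTX
    print(bandwidth_gb_s(1600, 320))  # 64.0 GB/s -> GeForce 8800 GTS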

The video cards themselves are very different:


As you can see in these photos, the GeForce 8800 reference boards are black (a first for NVIDIA). With a cooling module, the GeForce 8800 GTX and 8800 GTS are dual-slot. The GeForce 8800 GTX is slightly longer than the GeForce 8800 GTS: its length is 267 mm, versus 229 mm for the GeForce 8800 GTS, and, as stated earlier, the GeForce 8800 GTX has 2 PCIe power connectors. Why 2? Maximum power consumption of GeForce 8800 GTX is 177 W. However, NVIDIA says that this can only be as a last resort, when all functional units of the GPU are maximally loaded, and in normal games during testing, the video card consumed an average of 116 - 120 W, with a maximum of 145 W.

Each external PCIe power connector is rated for a maximum of 75 W, and the PCIe slot likewise supplies at most 75 W, so the slot plus a single connector (150 W in total) would not cover a 177 W peak; hence the second external PCIe power connector, which also gives the 8800 GTX a solid power reserve. The maximum power consumption of the 8800 GTS, by the way, is 147 W, so it gets by with one PCIe power connector.
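
The budgeting is simple to check; a back-of-the-envelope sketch using the figures from the text:

    # PCIe power budget for the 8800 GTX: 75 W slot + 75 W per 6-pin connector.
    SLOT_W, CONNECTOR_W, PEAK_DRAW_W = 75, 75, 177
    for connectors in (1, 2):
        budget = SLOT_W + connectors * CONNECTOR_W
        print(connectors, budget, budget >= PEAK_DRAW_W)
    # 1 150 False -> the slot plus one connector falls short of 177 W
    # 2 225 True  -> two connectors leave roughly 48 W of headroom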

Another feature added to the design of the GeForce 8800 GTX reference board is a second SLI connector - a first for an NVIDIA GPU. NVIDIA has not officially announced anything about the purpose of the second SLI connector, but journalists were able to obtain the following information from the developers: “The second SLI connector on the GeForce 8800 GTX is designed for hardware support for possible expansion of the SLI configuration. With current drivers, only one SLI connector is used. Users can connect an SLI bridge to both the first and second contact groups.”

Based on this, and the fact that nForce 680i SLI motherboards come with three PCI Express (PEG) slots, we can conclude that NVIDIA plans to support three SLI video cards in the near future. Another option could be to increase the power for SLI physics, but this does not explain why the GeForce 8800 GTS does not have a second SLI connector.

It can be assumed that NVIDIA is reserving its GX2 “Quad SLI” technology for the less powerful GeForce 8800 GTS, while the more powerful GeForce 8800 GTX will operate in a triple SLI configuration.

If you remember, the original Quad SLI video cards from NVIDIA are closer in characteristics to the GeForce 7900 GT than to the GeForce 7900 GTX, since the 7900 GT video cards have lower power consumption/heat dissipation. It is quite natural to assume that NVIDIA will follow the same path in the case of the GeForce 8800. Gamers with motherboards with three PEG slots will be able to increase the speed of the graphics subsystem by assembling a triple SLI 8800 GTX configuration, which in some cases will give them better performance than Quad SLI system, judging by the characteristics of the 8800 GTS.

Again, this is just a guess.

The cooling unit of the GeForce 8800 GTS and 8800 GTX is a dual-slot, ducted design that exhausts hot air from the GPU outside the computer case. It consists of a large aluminum heatsink, copper and aluminum heat pipes, and a copper plate pressed against the GPU. The whole assembly is blown through by a large radial fan that looks a little intimidating but is actually quite quiet. The 8800 GTX cooler is similar to that of the 8800 GTS, only with a slightly longer heatsink.


Overall, the new cooler handles the GPU quite well while staying almost silent, on the level of the GeForce 7900 GTX and 7800 GTX 512MB video cards, though the GeForce 8800 GTS and 8800 GTX are slightly more audible. In some cases you will need to listen closely to hear the fan at all.

Production

All GeForce 8800 GTX and 8800 GTS production is carried out under contract to NVIDIA. This means that whether you buy a card from ASUS, EVGA, PNY, XFX or any other manufacturer, they are all made by the same company. NVIDIA does not even allow manufacturers to overclock the first batches of GeForce 8800 GTX and GTS cards: they all go on sale at the same clock speeds regardless of brand. Manufacturers are, however, allowed to install their own cooling systems.

For example, EVGA has already released its version of the e-GeForce 8800 GTX ACS3 Edition with its unique ACS3 cooler. The ACS3 graphics card is hidden in a single large aluminum cocoon. It bears the letters E-V-G-A. For additional cooling, EVGA placed an additional heatsink on the back of the graphics card, directly opposite the GPU G80.

In addition to cooling, manufacturers of the first GeForce 8800 video cards can only customize their products with warranties and packaging - games and accessories. For example, EVGA bundles its video cards with the Dark Messiah game, and the GeForce 8800 GTS BFG video card is sold with a BFG T-shirt and a mouse pad.

It will be interesting to see what happens next - many NVIDIA partners believe that for subsequent releases of GeForce 8800 video cards, NVIDIA restrictions will not be so strict, and they will be able to compete in overclocking.

Since all the cards come off the same assembly line, every GeForce 8800 supports two dual-link DVI connectors with HDCP. In addition, it has become known that NVIDIA does not plan to change the memory sizes of the GeForce 8800 GTX and GTS (for example, a 256 MB GeForce 8800 GTS or a 512 MB 8800 GTX). At least for now, the standard configuration is 768 MB for the GeForce 8800 GTX and 640 MB for the GeForce 8800 GTS. NVIDIA also has no plans for an AGP version of the GeForce 8800 GTX/GTS.

Driver for 8800

NVIDIA has made several changes to the GeForce 8800 driver that definitely deserve mention. First of all, the traditional overclocking utility Coolbits has been removed and replaced by NVIDIA nTune: if you want to overclock a GeForce 8800, you will need to download the nTune utility. This is arguably good for owners of nForce-chipset motherboards, since nTune can be used not only to overclock the video card but also to configure the system. Everyone else - those, say, who upgraded to Core 2 on a 975X or P965 board - will have to download a 30 MB application just to overclock a video card.

Another change in the new driver that we noticed is that there is no option to go to the classic NVIDIA Control Panel. I would like to believe that NVIDIA will return this feature to its video driver, since many people liked it, unlike the new NVIDIA control panel interface.

The 8800 GTX was a landmark event in the history of 3D graphics. It was the first card to support DirectX 10 and its associated unified shader model, which greatly improved image quality over previous generations, and it remained unrivaled in terms of performance for a long time. Unfortunately, all this power came at a cost. With expected competition from ATI and the release of lower-priced mid-range models based on the same technology, the GTX was considered a card aimed only at those enthusiasts who wanted to be at the forefront of modern advances in graphics processing.

Model history

To correct this situation, nVidia released a card of the same line GTS 640MB a month later, and a couple of months later the GTS 320MB. Both offered similar performance to the GTX, but at a much more reasonable price. However, at around $300-$350, they were still too expensive for gamers on a budget - they were not mid-range, but high-end models. In hindsight, the GTS were worth every penny, as what followed for the rest of 2007 was one disappointment after another.

First up were the supposed mid-range 8600 GTS and GT cards, which were heavily stripped-down versions of the 8800 series. They were smaller and quieter and had new HD video processing capabilities, but their performance was below expected levels. Purchasing them was impractical, although they were relatively inexpensive. The alternative ATI Radeon HD 2900 XT matched the GTS 640MB in terms of performance, but consumed a huge amount of power under load and was too expensive to be considered mid-range. Finally, ATI attempted to release the DX10 series in the form of the HD 2600 XT and Pro, which had even better multimedia capabilities than the nVidia 8600, but lacked the power to be worth the attention of gamers who had already purchased previous generation graphics cards such as the X1950 Pro or 7900 GS.

And then, a year after the 8800 GTX went on sale, the release of the 8800 GT brought the first real refresh of the DirectX 10 line-up. It took a long time, but the nVidia GeForce 8800 GT offered essentially the same characteristics as the GTS model at a cost of $200-250, finally reaching the mid-range price bracket everyone had been waiting for. But what made the card so special?

More is not better

As technology advances and transistor counts in CPUs and GPUs grow, there is a natural need to shrink the devices. Smaller process geometry means lower power consumption, which in turn means less heat. More chips fit on a single silicon wafer, which reduces their cost and, in theory, lowers the floor price of the hardware built from them. However, changing production processes carries high business risk, so it is customary to launch a completely new architecture on an existing, proven process, as was the case with the 8800 GTX and HD 2900 XT; once the architecture has matured, it is moved to a less power-hungry process, and the next design is built on that.

The 8800 series followed this path: the G80 cores of the GTX and GTS were produced on 90 nm technology, while the nVidia GeForce 8800 GT is based on the G92 chip, already made on a 65 nm process. The change may not sound like much, but it equates to roughly a 34% reduction in die size or, equivalently, correspondingly more chips per silicon wafer. Electronic components become smaller, cheaper and more efficient, an extremely positive change. However, the G92 core is not just smaller; there is something else.
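
The quoted reduction is easy to cross-check against the die areas mentioned later in this article (330 mm² for the G92 versus 484 mm² for the G80); the result lands close to the cited figure:

    # Die-area comparison implied by the sizes quoted in this article.
    g80_mm2, g92_mm2 = 484, 330
    print(100 * (1 - g92_mm2 / g80_mm2))  # ~31.8% smaller, near the cited ~34%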

First of all, the VP2 video processing engine that was used in the 8600 series has now appeared in the GeForce 8800 GT 512MB. So now you can enjoy high-definition video without system slowdown. The final display engine, which is controlled by a separate chip on the 8800 GTX, is also integrated into the G92. The result is 73 million more transistors on-chip than the 8800 GTX (754 million versus 681 million), although the number of stream processors, texture processing and ROP power is less than that of the more powerful model.

A new version of nVidia's transparency anti-aliasing algorithm, added with the GeForce 8800 GT, is designed to noticeably improve image quality while maintaining high performance. Beyond that, the new processor adds no new graphics capabilities.

The manufacturing company apparently thought for a long time about which functionality of the previous 8800 series cards was not fully used and could be reduced, and which should be left. The result was a GPU design that, in terms of performance, fell somewhere between GTX and GTS, but with GTS functionality. As a result, the 8800 GTS card became completely redundant. The 8800 Ultra and GTX still provide more graphics power, but with fewer features, a much higher price, and higher power consumption. Against this background, the GeForce 8800 GT 512 MB card really took a strong position.

GPU architecture

The GeForce 8800 GT uses the same unified architecture Nvidia introduced with the G80. The G92 consists of 754 million transistors and is manufactured on TSMC's 65 nm process. The die measures about 330 mm², and although this is noticeably smaller than the G80, it is still far from a small piece of silicon. There are 112 scalar thread cores in total, running at 1500 MHz in the standard configuration. They are grouped into 7 clusters, each containing 16 stream processors that share 8 texture address units, 8 texture filtering units and their own independent cache. This is the same cluster-level configuration Nvidia used in the G84 and G86 chips, but the G92 is a much more complex GPU than either of them.

Each shader processor can issue a MADD and a co-issued MUL per clock cycle; the units, combined into a single structure, can process any shader operations and calculations, both integer and floating-point. Interestingly, although the stream processors match the G80's capabilities (apart from their number and frequency), Nvidia rates the chip at up to 336 GFLOPS, whereas counting both the MADD and the MUL yields 504 GFLOPS. As it turns out, the company took a conservative approach to quoting computing power and did not count the MUL in overall performance. At briefings and roundtables, Nvidia representatives said that architectural improvements should allow the chip to approach its theoretical maximum throughput; in particular, the task scheduler that distributes and balances data through the pipeline has been improved. Nvidia has announced double-precision support in future GPUs; this chip only emulates it, owing to the need to follow IEEE standards.
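
The two GFLOPS figures are simply the per-clock operation count multiplied out; a worked sketch using the values above:

    # Theoretical shader throughput of the G92 in the 8800 GT.
    sps = 112         # scalar stream processors
    shader_ghz = 1.5  # 1500 MHz shader clock
    madd, mul = 2, 1  # flops per clock: MADD counts as 2, the co-issued MUL as 1
    print(sps * shader_ghz * madd)          # 336.0 GFLOPS, Nvidia's conservative figure
    print(sps * shader_ghz * (madd + mul))  # 504.0 GFLOPS with the MUL counted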

ROP architecture

The ROP structure of the G92 is similar to that of any other GeForce 8-series GPU: each partition has its own L2 cache and is tied to a 64-bit memory channel, for a total of 4 ROP partitions and a 256-bit memory interface. Each partition can process 4 pixels per clock if all four components (RGB color and Z) are specified; if only the Z component is present, it can process 32 pixels per clock.

ROPs support all common anti-aliasing formats used in previous GeForce 8-series GPUs. Since the chip has a 256-bit GDDR interface, Nvidia decided to make some improvements to ROP compression efficiency to reduce bandwidth and graphics memory usage when anti-aliasing is enabled at 1600x1200 and 1920x1200 resolutions.

As a derivative of the original G80 architecture, the texture address and filtering units, as well as the ROP partitions, run at a different clock from the stream processors; Nvidia calls this the base clock. For the GeForce 8800 GT that base clock is 600 MHz, which theoretically gives a fill rate of 9.6 gigapixels per second (Gp/s) and a bilinear texture sampling rate of 33.6 gigatexels per second (Gt/s). Some users consider the clock frequency very low, and a larger transistor count does not guarantee that functionality was added or even preserved: when the company moved from 110 nm to 90 nm, optimization cut the transistor count by 10%. It would therefore not be surprising if at least 16 more stream processors are present on the chip but disabled in this product.
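
Both theoretical rates come straight from the base clock and unit counts; a minimal sketch:

    # Theoretical fill and texturing rates of the 8800 GT at the 600 MHz base clock.
    base_mhz = 600
    rops = 16          # pixels written per clock
    filter_units = 56  # bilinear texels filtered per clock
    print(rops * base_mhz / 1000)          # 9.6 Gpixels/s
    print(filter_units * base_mhz / 1000)  # 33.6 Gtexels/s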

Design

The reference design clocks the core, shader unit and memory at 600 MHz, 1500 MHz and 1800 MHz, respectively. The 8800 GT features a single-slot cooling system, and the glossy black metal casing almost completely hides its front side. The 50 mm fan matches the design of the radial coolers on top models and performs its duties very quietly in all operating modes: whether the computer is idling at the Windows desktop or running your favorite game, it is practically inaudible against the other noise sources in the PC case. Be warned, though, that the first time you power on a computer with the new card, the fan briefly spins up to full speed and howls, quieting down before the desktop appears.

The metal front panel attracts fingerprints, but this is of little concern, since once installed it will be impossible to see them. According to user reviews, the cover helps prevent accidental damage to components such as capacitors on the front of the card. The green printed circuit board combined with the black heatsink bezel gives the 8800 GT a distinctive look. The model is marked with the GeForce logo along the top edge of the front panel. Mark Rein, the company's vice president, told reporters that this entailed additional costs, but was necessary to help users figure out which graphics card is the heart of the system at LAN parties.

Under the heatsink are eight 512-megabit graphics memory chips, giving a total of 512 MB of storage capacity. This is GDDR3 DRAM with an effective frequency of up to 2000 MHz. The GPU supports both GDDR3 and GDDR4, but this feature was never used in this series.
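
From these figures the card's capacity and reference bandwidth follow directly; a short sketch using the values from the text:

    # 8800 GT memory: eight 512-Mbit GDDR3 chips, 256-bit bus, 1800 MHz effective.
    chips, chip_mbit = 8, 512
    capacity_mb = chips * chip_mbit // 8        # 512 MB in total
    bandwidth_gb_s = 1800e6 * (256 // 8) / 1e9  # transfer rate x bus width in bytes
    print(capacity_mb, bandwidth_gb_s)          # 512 MB, 57.6 GB/s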

Heating and power consumption

The nVidia GeForce 8800 GT video card is very sexy. Its design is simply very pleasing to the eye and, given the internal changes to the G92, it exudes a sophisticated design feel.

More important than aesthetics, however, is the fact that the manufacturer managed to pack all this power into a single-slot device. This is not just a welcome change; it is a pleasant surprise, since from the GeForce 8800 GT's specifications you would expect a two-slot cooler. Nvidia got away with such a thin design thanks to the process change, which cut heat output to a level a low-profile fan can handle. Temperatures have dropped so much that the relatively small cooler does not even have to spin very fast, leaving the card virtually silent even in demanding games. The board temperature does rise significantly, though, and decent airflow is still needed to prevent overheating. Thanks to the process shrink, the GeForce 8800 GT 512 MB consumes only 105 W even under full load, so a single six-pin power connector suffices - another nice change.

The card was also the first to support PCIe 2.0, which would allow it to draw up to 150 W from the slot. However, for backward compatibility the company chose to limit slot power to 75 W: regardless of whether the card sits in a PCIe 1.1 or PCIe 2.0 motherboard, only 75 W comes through the slot, with the rest supplied through the auxiliary connector.

Processor VP2

Speaking about the possibility of transmitting HDCP signals, it is worth touching on the new generation video processor that nVidia incorporated into the G92. The VP2 is a single programmable SIMD processor whose flexibility allows it to be expanded in the future. It enables very intensive processing of H.264 encoded video, shifting the load from the CPU to the GPU. In addition to VP2, there is also an H.264 stream processor and an AES128 decoder. The first of these is specifically designed to accelerate CAVLC and CABAC encoding schemes - tasks that are very CPU intensive in a pure software environment. AES128 enables faster processing of the encryption protocol required by video content security schemes such as AACS and Media Foundation. Both of these schemes require encoding of video data (both compressed and uncompressed) when transferred over buses like PCI-Express.

Improving Image Quality

Nvidia has been trying hard to improve the transparency anti-aliasing technique that first appeared in the GeForce 7 series. Transparency multisampling costs little performance, but in most cases it is not very effective; transparency supersampling, on the other hand, delivers much better and more stable image quality, but at the cost of speed - it is an incredibly resource-intensive anti-aliasing method.

The drivers that come with the video card contain a new multisampling algorithm. The differences are quite significant, but the final decision is made by the user. The good news is that since this is a driver-level change, any hardware that supports transparent antialiasing can use the new algorithm, including cards released after the GeForce 7800 GTX. To activate the new mode, you just need to download the latest updates from the manufacturer's website.

According to user reviews, updating the driver for the GeForce 8800 GT is not difficult. Although the video card's web page only links to files for Windows Vista and XP, a search from the main page finds what you need; for the nVidia GeForce 8800 GT under Windows 7-10, drivers are installed via the 292 MB GeForce 342.01 driver package.

Connectivity

The output connectors of the nVidia GeForce 8800 GT are quite standard: two dual-link DVI-I ports with HDCP support, suitable for both analog and digital monitor and TV interfaces, and a 7-pin analog video port providing conventional composite and component output. The DVI connectors can be used with DVI-VGA and DVI-HDMI adapters, so any connection option is possible. However, Nvidia still leaves audio over HDMI as an option for third-party manufacturers: there is no audio processor inside the VP2, so audio has to be routed through the on-board S/PDIF connector. This is disappointing, since the thin and quiet card is otherwise ideal for gaming home theaters.

The GeForce 8800 GT is the first graphics card compatible with PCI Express 2.0, meaning it can exchange data with memory at up to 16 GB/s, twice the previous standard. While this may be useful for workstations and intensive computing, it will matter little to the average gamer. In any case, the standard is fully backward compatible with all previous versions of PCIe, so there is nothing to worry about.
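
The 16 GB/s figure is the aggregate of both directions of a 16-lane Gen 2 link; a sketch of the arithmetic, assuming PCIe 2.0's standard 8b/10b line coding:

    # PCIe 2.0 x16 bandwidth: 5 GT/s per lane with 8b/10b coding.
    lanes, gtps, coding = 16, 5.0, 8 / 10
    per_lane_gb_s = gtps * coding / 8      # 0.5 GB/s per lane, per direction
    per_direction = lanes * per_lane_gb_s  # 8.0 GB/s each way
    print(2 * per_direction)               # 16.0 GB/s aggregate, double PCIe 1.x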

nVidia's partner companies offer overclocked versions of the GeForce 8800 GT, as well as game packages.

BioShock from 2K Games

BioShock was one of the best games that existed at the time the video card was released. It's a "genetically modified" first-person shooter set in the underwater city of Rapture, created on the floor of the Atlantic Ocean by a man named Andrew Ryan as part of the realization of his 1930s art deco dream. 2K Boston and 2K Australia have licensed and used Epic Games' Unreal Engine 3 to best effect, and also leveraged some DirectX 10 capabilities. All of this is controlled through an option in the game's graphics control panel.

The BioShock setting forced the developers to use a lot of water shaders. DirectX 10 technology helped improve the ripples when characters move through water, and pixel shaders were used en masse to create wet objects and surfaces. Additionally, the DX10 version of the game uses a depth buffer to create "soft" particle effects where they interact with their surroundings and look more realistic.

The nVidia GeForce 8800 GT, whose characteristics allow it to show its strengths in the BioShock game, is only slightly inferior to the GTX at a resolution of 1680x1050. As this parameter increases, the gap between the cards increases, but not by a large margin. The reason for this is likely due to the fact that the game did not support transparent anti-aliasing, making the 8800 GTX's massive memory bandwidth advantage moot.

According to user reviews, the 8800 GT also works quite well with SLI enabled. Although its capabilities are not close to those of the GTX, it competes with the Radeon HD 2900 XT graphics card with 512 MB of memory in the CrossFire configuration. Perhaps even more interesting is the fact that at 1920x1200 the 8800 GT is almost as fast as the 640MB GTS!

Crysis Single Player Demo from Electronic Arts

This game will literally make your video card cry! Its graphics were the big surprise, surpassing everything seen in computer games before it. Note that the built-in GPU benchmark runs much faster than actual gameplay; about 25 fps in the performance test is enough for an acceptable frame rate in practice. Unlike other games, a low frame rate here still looks fairly smooth.

The nVidia GeForce 8800 GT, whose characteristics in Crysis allow it to reach sufficient frame rates at 1680x1050 with high detail under DirectX 10, is not as fast as the GTX but is noticeably more productive than the Radeon HD 2900 XT and the 8800 GTS 640MB. The GTS 320MB struggles with Crysis and needs most settings dropped to medium to keep frame rates above 25 fps even at 1280x1024.

Performance

As you'd expect, the 8800 GTX remains unbeatable, but overall the GeForce 8800 GT is ahead of the GTS in most tests. At the highest resolutions and anti-aliasing settings, the GT's reduced memory bandwidth lets it down and the GTS occasionally pulls ahead; considering the price difference and other advantages, however, the 8800 GT is the better buy in any case. Conversely, comparing the GeForce 8800 GTX with the 8800 GT confirms every time why the former is so expensive: while other models slow down significantly as pixel counts grow and transparency anti-aliasing and anisotropic filtering are enabled, the 8800 GTX keeps delivering excellent results. In particular, Team Fortress 2 at 1920x1200 with 8xAA and 16xAF runs twice as fast on the 8800 GTX as on the GT. For the most part, though, the GeForce 8800 GT performs well - unless, of course, you count the incredibly low frame rate in Crysis.

Conclusion

While the GeForce 8800 GT doesn't match the specs of the 8800 GTX series leader, it offers similar performance at a fraction of the price and includes many additional features. And if you add small size and quiet operation, the model will seem simply phenomenal.

In the year and more since video cards based on the NVIDIA GeForce 8800 line of chips were released, the graphics accelerator market has become extremely unfavorable for the end buyer. An overclocker prepared to pay a tidy sum for a top-end video card simply had no alternative. The competitor from ATI (AMD) appeared later and ultimately could not compete with the GeForce 8800 GTX, let alone the subsequent Ultra version of the NVIDIA GeForce 8800. NVIDIA's marketers therefore easily concluded that, with no competition, there was no need to cut prices on top-end video cards at all. Throughout this entire period, prices for the GeForce 8800 GTX and Ultra stayed at the same very high level, and only a few could afford such cards.

However, the upper price segment has never been the decisive priority for makers of graphics chips and video cards. Leadership in this class is certainly prestigious, but economically the middle price range is the most profitable. Yet, as recent tests of the AMD Radeon HD 3850 and 3870, which claim supremacy in the mid-range, have shown, the performance of such cards is unsatisfactory for modern games and essentially unacceptable in high-quality modes. The NVIDIA GeForce 8800 GT is faster than this pair but also falls short of comfortable DirectX 10 gaming. What comes next for those able to pay extra? Until yesterday, virtually nothing: in price terms there was simply a gap between the GT and the GTX.

But technical progress does not stand still - the appearance of the new NVIDIA G92 chip, produced using 65 nm technology, allowed the company not only to attract overclockers with the quite successful GeForce 8800 GT video card, but also yesterday, December 11 at 17:00 Moscow time, to announce a new product - GeForce 8800 GTS 512 MB. Despite the quite simple name of the video card, the new graphics accelerator has a number of significant differences from the regular version of the GeForce 8800 GTS. In today's material, we will get acquainted with one of the first GeForce 8800 GTS 512 MB video cards appearing on the Russian market, check its temperature conditions and overclocking potential, and, of course, study the performance of the new product.

1. Technical characteristics of video cards participating in testing

The technical characteristics of the new product are presented to your attention in the table below in comparison with NVIDIA video cards of the GeForce 8800 family:

Technical characteristic                8800 GT             8800 GTS            8800 GTS 512 MB     8800 GTX / Ultra
GPU                                     G92 (TSMC)          G80 (TSMC)          G92 (TSMC)          G80 (TSMC)
Process, nm                             65 (low-k)          90 (low-k)          65 (low-k)          90 (low-k)
Core area, sq. mm                       330                 484                 330                 484
Number of transistors, million          754                 681                 754                 681
GPU frequency, MHz (shader domain)      600 (1512)          513 (1188)          650 (1625)          575 / 612 (1350 / 1500)
Effective memory frequency, MHz         1800                1584                1940                1800 / 2160
Memory size, MB                         256 / 512           320 / 640           512                 768
Memory type                             GDDR3               GDDR3               GDDR3               GDDR3
Memory bus width, bits                  256 (4 x 64)        320                 256 (4 x 64)        384
Interface                               PCI-E x16 (v2.0)    PCI-E x16 (v1.x)    PCI-E x16 (v2.0)    PCI-E x16 (v1.x)
Unified shader processors, pcs.         112                 96                  128                 128
Texture units, pcs.                     56 (28)             24                  64 (32)             32
Rasterization units (ROPs), pcs.        16                  20                  16                  24
Pixel / Vertex Shader support           4.0 / 4.0           4.0 / 4.0           4.0 / 4.0           4.0 / 4.0
Memory bandwidth, GB/s                  ~57.6               ~61.9               ~62.1               ~86.4 / ~103.7
Theoretical fill rate, Gpixel/s         ~9.6                ~10.3               ~10.4               ~13.8 / ~14.7
Theoretical texture rate, Gtex/s        ~33.6               ~24.0               ~41.6               ~36.8 / ~39.2
Peak 3D power consumption, W            ~106                n/a                 n/a                 ~180
Recommended PSU, W                      ~400                ~400                ~400                ~450 / ~550
Reference dimensions, mm (L x H x T)    220 x 100 x 15      228 x 100 x 39      220 x 100 x 32      270 x 100 x 38
Outputs                                 2 x DVI-I (Dual Link), TV-Out and HDTV-Out on all four; HDCP on the 8800 GT and 8800 GTS 512 MB
Additionally                            SLI support
Recommended price, USD                  199 / 249           349 ~ 399           299 ~ 349           499 ~ 599 / 699
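
The derived rows of the table can be sanity-checked from the raw clocks and unit counts; for example, for the new GeForce 8800 GTS 512 MB:

    # Derived figures for the GeForce 8800 GTS 512 from the raw table values.
    mem_mhz, bus_bits = 1940, 256
    core_mhz, rops, tmus = 650, 16, 64
    print(mem_mhz * 1e6 * (bus_bits // 8) / 1e9)  # ~62.1 GB/s memory bandwidth
    print(rops * core_mhz / 1000)                 # ~10.4 Gpixel/s fill rate
    print(tmus * core_mhz / 1000)                 # 41.6 Gtex/s texture sampling rate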

2. Review of BFG GeForce 8800 GTS 512 MB OC (BFGR88512GTSE)

The newest video card from a company well known to overclockers comes in a very compact box, decorated in dark colors.