

On August 12, 2004, at the QuakeCon conference in Texas, NVIDIA Corporation staged another high-profile event for an IT industry that had hardly been idle in recent months: it introduced two new solutions in the GeForce 6 Series line, the GeForce 6600 and GeForce 6600GT. After the presentation of the 6800 and its first test results, surprising the public again seemed an impossible task. NVIDIA nevertheless managed it, and this time the middle-end sector of the video accelerator market served as the proving ground.

Among the innovations are the transition to a 0.11-micron process (versus 0.13 for the 6800 Series), a 128-bit memory bus, a reduced number of pixel and vertex pipelines (8 and 3, respectively), a transistor count cut to 146 million, and a PCI-Express x16 interface. Notably, these are the first NVIDIA cards with native PCI-Express support; AGP 8X versions will require an HSI bridge on the board, which finally puts that development to full use. As for technologies, the new line supports everything the GeForce 6800 Series offers - Intellisample 3.0, UltraShadow II, and pixel and vertex shaders version 3.0; the full table is given below.


For now, let's look at the situation within the line itself. The 6600GT differs from the 6600 chiefly in the memory standard (the 6600GT carries GDDR3), the standard GPU frequency (500 MHz versus 300 MHz for the 6600), and support for NVIDIA SLI, which is present only on the 6600GT. The latter is a particularly attractive side of the new solution: SLI is now available outside the high-end sector, so buyers targeting the middle market can taste all the delights of this technology.

As for price - always a top concern for any consumer in the mainstream sector - everything here looks very rosy: the recommended price is 150 USD for the GeForce 6600 and 200 USD for the GT version.

In other words, we get excellent performance for a modest investment - a real paradise for gamers (it is no coincidence that the presentation was held under the motto "Play DOOM3 The Way It's Meant To Be Played"; how true this is, we will find out during our testing) and for computer enthusiasts!

Today we will look at the flagship of the new line - GeForce 6600GT.

Card: NVIDIA GeForce 6600GT
Chip codename: NV43
Process technology: 0.11 µm
Transistor count: 146 million
Memory bus: 128-bit (GDDR3), 16 GB/s
Interface: PCI-Express x16
Memory size: 128 MB
Chip frequency: 500 MHz
Memory frequency: 500 MHz (1000 MHz DDR)
Vertex pipelines: 3
Pixel pipelines: 8
Vertex shader version: 3.0
Pixel shader version: 3.0
DirectX generation: 9.0+
SLI support: yes
Display outputs: 2
General features: 32-bit floating-point precision; 64-bit texture filtering and blending; MPEG4/WMV decoding; adaptive deinterlacing; built-in video processor; support for CineFX 3.0, Intellisample 3.0, UltraShadow II, NVIDIA nView, DVC 3.0, NVIDIA HPDR, VMR and MRT
RAMDAC: 2 x 400 MHz

Among the PCI-E solutions already announced in the middle-end sector of the video processor market, the 6600 Series' competitors include the PCX5900/5750 and the RX600XT/PRO from ATI Technologies. However, these video cards should be considered only a "first pancake" with PCI-E, on which the companies tried out the new technology. How lumpy that pancake turned out, we will see during testing.

"For company," the RX300 SE - a "hello from the low-end sector" - was also included in the list of tested video cards. In other words, we included absolutely all announced PCI-E cards in the testing.

Let us remind you of the brief characteristics of the above-mentioned video cards:

Cards: ATI RX600XT / ATI RX600Pro / NVIDIA GeForce PCX5750 / NVIDIA GeForce PCX5900
Code name: RV380 / RV380 / NV36 / NV35
Chip technology: 256 bit (all four)
Process technology: 0.13 µm low-k / 0.13 µm low-k / 0.13 µm / 0.13 µm
Memory bus: 128 bit (DDR) / 128 bit (DDR) / 128 bit (DDR) / 256 bit (DDR)
Bus interface: PCI-Express x16 (all four)
Memory size: 128/256 MB
Chip frequency: 500 MHz / 400 MHz / 425 MHz / 350 MHz
Memory frequency: 330 MHz (660 DDR) / 300 MHz (600 DDR) / 250 MHz (500 DDR) / 275 MHz (550 DDR)
Pixel pipelines: 4 / 4 / 4 / 4(8)
Textures per pipeline: 1 / 1 / 1 / 2(1)
Textures per texture unit: 16
Vertex shader version: 2.0 / 2.0 / 2.0+ / 2.0+
Pixel shader version: 2.0 / 2.0 / 2.0+ / 2.0+
DirectX generation: 9.0 (all four)
Display outputs: 2 (all four)

Now it’s time to conduct a detailed inspection of the new product.

Discrete video cards in laptops are nothing new, but they solve many problems. A user may want not only to work on a laptop but also to play computer games; that requires a discrete card, since integrated graphics processors lack the resources to cope with such loads.

AMD Radeon 6600M and 6700M series: characteristics (general)

The "M" suffix at the end of each card's name means "mobile": these parts are used only in laptops. Both video cards are built on a 40-nanometer process around a core named Turks, and connect via PCI Express 2.1 x16. Both GPUs can drive up to six monitors, with no special software or drivers required - it depends only on hardware support. A 128-bit bus provides high throughput.

In terms of memory, both support GDDR5, and DDR3 is also used. DDR3 runs at 900 MHz, for a data rate of about 2 Gbit/s per pin, while GDDR5 operates at up to 800 MHz; together with the 128-bit bus this raises memory bandwidth to 57.6 GB/s. The AMD Radeon 6600M and 6700M series, whose specifications allow an effective video memory data rate of up to 3.6 Gbit/s per pin, can deliver high-quality images only when paired with suitably fast memory.
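As a sanity check on these figures, here is a minimal sketch (purely illustrative; the function name is ours) showing how the peak bandwidth follows from bus width and per-pin data rate:

```python
# Back-of-the-envelope check of the bandwidth figure above. Peak bandwidth is
# bus width (bits) times the per-pin data rate, divided by 8 bits per byte.
# The 3.6 Gbit/s rate is the one quoted in the text; the function name is ours.

def memory_bandwidth_gb_s(bus_width_bits: int, per_pin_gbit_s: float) -> float:
    """Peak memory bandwidth in gigabytes per second."""
    return bus_width_bits * per_pin_gbit_s / 8

print(memory_bandwidth_gb_s(128, 3.6))  # -> 57.6 GB/s, matching the text
```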

Differences in characteristics

The AMD Radeon 6600M and 6700M series differ in graphics core frequency: 500 MHz for the 6600M versus a notably higher 725 MHz for the 6700M.

Math block

As for the math unit, both cards are built around 480 stream processors, 24 texture units, and 8 color rasterization (ROP) units.
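To give a sense of what such a configuration means in raw arithmetic, here is a rough estimate of peak shader throughput, under stated assumptions (480 stream processors and two FLOPs per processor per clock - our assumptions, not vendor figures):

```python
# Rough, illustrative estimate of peak single-precision throughput for these
# GPUs. Assumes 480 stream processors (the Turks core) and the usual 2 FLOPs
# (a multiply-add) per processor per clock; these are our assumptions, not
# figures from any vendor datasheet.

def peak_gflops(stream_processors: int, core_clock_mhz: float) -> float:
    flops_per_clock = 2  # one multiply-add counts as two floating-point ops
    return stream_processors * flops_per_clock * core_clock_mhz / 1000

print(peak_gflops(480, 500))  # 6600M-class clock -> 480.0 GFLOPS
print(peak_gflops(480, 725))  # 6700M-class clock -> 696.0 GFLOPS
```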

Graphics capabilities

AMD Radeon 6600M and 6700M series: features supported under DirectX 11 include the new Shader Model 5 and DirectCompute 11. There is also support for programmable hardware tessellation, which yields noticeably more realistic images, plus accelerated multithreaded rendering, improved texture processing, and order-independent transparency.

The AMD Radeon 6600M and 6700M series also support OpenGL 4.1, which further improves image quality, including anti-aliasing of objects at up to 24x and refined texture processing.

AMD Eyefinity technology on the AMD Radeon 6600M and 6700M series supports up to six monitors sharing the same resolution, frame rate, color gamut, and video overlay. The monitors can be arranged into a group that behaves as a single screen: the image is spread across all six panels, so the group looks like one large display.
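A toy sketch of the arithmetic behind such a monitor group, assuming a grid of identical panels (the function below is hypothetical, not an actual AMD API):

```python
# Illustrative sketch of what an Eyefinity-style monitor group exposes to
# applications: one large virtual display whose resolution is the grid of
# identical panels. The function is hypothetical, not part of any AMD API.

def group_resolution(cols: int, rows: int, panel_w: int, panel_h: int) -> tuple[int, int]:
    """Total resolution of a cols x rows group of identical panels."""
    return cols * panel_w, rows * panel_h

# A 3x2 group of 1920x1080 panels appears as a single 5760x2160 display.
print(group_resolution(3, 2, 1920, 1080))  # -> (5760, 2160)
```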

The AMD Radeon 6600M and 6700M series also feature CrossFireX technology, which allows several graphics processors to be used together - in a word, an analogue of NVIDIA's SLI. There is also OpenCL 1.1 support.

AMD Radeon 6600M and 6700M series: description of supported technologies

The first technology that is new to this manufacturer's series of video cards is accelerated video encoding, transcoding and scaling.

Both GPUs play video in all modern formats and Adobe Flash Player version 9.

Image quality enhancement technologies include advanced post-processing and upscaling, color correction, and automatic contrast adjustment; increased white brightness and an expanded blue range; a dedicated technology for smooth, gradual transitions from one tone to another; and continuous automatic adjustment of the video range. Two 1080p playback streams are supported simultaneously.

The next technology, AMD HD3D, provides full support for stereo glasses and stereoscopic images on the monitor. It also supports 3D games and can work with third-party stereo software and 3D systems.

Also worth noting is AMD PowerPlay, an automatic power management technology that saves energy by scaling power back at idle and distributes the load properly when several cards work together.

The last technology is the built-in HD Audio controller. Over HDMI you can enjoy protected 7.1-channel surround sound with no additional cables, and the controller reproduces audio in all common high-quality formats.

Like desktop video cards, mobile ones are subject to wear and contamination. For this reason it is worth renewing the thermal paste periodically, since it affects how hot the graphics chip runs, and cleaning the video card's cooler at least once every six months, because accumulated dust raises the chip's temperature. Drivers for the card are best installed from the bundled disc or downloaded from the manufacturer's website.

The beginning of autumn... Just as five years ago, when the first product with hardware T&L support, the GeForce256, was announced, today we meet new products from NVIDIA. This is not High-End, but that makes the announcement no less interesting. Strictly speaking, the announcement took place back in August, but only the cards' specifications were presented then; now we, like many other media outlets, can show what the latest products for the middle sector of 3D accelerators are capable of.

Until recently, ATI products held the lead in this market segment: the RADEON 9600/PRO/XT and X600PRO/XT outperformed rivals such as the NVIDIA GeForce FX 5700/Ultra and PCX5750/5900 in modern games that make active use of shader technologies. Only the FX 5900XT, launched into this segment "from above," managed to become popular and challenge the hegemony of the Canadian products. And now...

"Nalu is already on her way to claim the main prize... Ruby will have to hold the line..."

Yes, it is no coincidence that our heroines are the mermaid and the brave girl from the NVIDIA and ATI demo programs showcasing the new technologies (SM3.0 for NVIDIA, 3Dc/SM2.0b for ATI). The new products from the Californian company that we study today fully support version 3.0 shaders, just like their older brothers.

Will Ruby surrender her royal diamond to the pursuing Nalu? After all, ATI will soon announce its own new products in the same video card sector. What will the outcome of the battle be? We do not know yet. The material on the RV410 should prove no less interesting and exciting, but for now we will set that aside and consider the NV43 (GeForce 6600GT/6600) as if these cards were already on sale. Accordingly, their competitors will be the accelerators currently popular in the 150 to 200 US dollar price segment - and, of course, the video cards that the new products replace.

Looking ahead, we note that the NV43 has built-in support for the PCI-Express interface (hereinafter PCX), so AGP products are impossible without an HSI bridge in the kit. Consequently, AGP versions will be more expensive to produce and will appear later than their PCX counterparts - if they appear at all, for everything will depend on demand. Today this is a significant disadvantage: the PCX sector is only beginning to develop, and demand for such platforms is still minimal. So however wonderful the new product may be, it is doomed at first to relatively low demand in retail, since upgrading from an AGP platform to PCX is still of dubious benefit. On the other hand, the OEM market and PC assemblers, especially foreign ones, will not fail to launch models with PCX solutions that are not as expensive as the top-end ones yet fully meet modern DirectX requirements.

Besides, who knows - the release of video cards attractive in price-to-speed terms may spur interest in PCX as a whole. Time will tell. And let us not forget that ATI's RV410 will also come out with native PCX support only, and the Canadian company has no two-way AGP/PCX bridge of its own, so implementing the new products on the AGP bus will be almost impossible for it. That sector, however, is already crowded with solutions of similar performance, previously released or still shipping.

It was very interesting for us to compare not only cards on the same interface but also AGP and PCX options against each other. That is difficult, of course, since the platforms differ significantly. But remember that we are in the Middle-End sector, where modern processors are quite capable of loading the accelerator 100%, and beyond a certain resolution performance does not depend much on the platform. Below you will see what came of our cross-platform research.

Now let's return to the objects of today's analysis: the NVIDIA NV43, or GeForce 6600GT/6600 (the line currently consists of two cards that differ only in frequencies and memory).

Official specifications GeForce 6600GT/6600 (NV43)

  1. Chip codename NV43
  2. 110 nm technology (TSMC) (!)
  3. 146 million transistors
  4. FC package (flip chip, without a metal cover)
  5. 128-bit dual-channel memory interface (!)
  6. Up to 256 megabytes of DDR/GDDR2/GDDR3 memory
  7. PCI Express 16x bus interface built into the chip
  8. Ability to convert the interface to AGP 8x using a two-way PCI Express/AGP HSI bridge
  9. 8 pixel processors, each with one texture unit supporting arbitrary filtering of integer and floating-point textures (anisotropy up to 16x inclusive)
  10. 3 vertex processors, each with one texture unit, without filtering of fetched values (discrete sampling)
  11. Calculation, blending and writing of up to 8 full pixels (color, depth, stencil) per clock (experiments show up to 4)
  12. Calculation and writing of up to 16 depth and stencil values per clock, if no color operations are performed (experiments show up to 8)
  13. Support for a double-sided stencil buffer
  14. Support for special geometry rendering optimizations that speed up stencil-based shadow algorithms (the so-called UltraShadow II technology), widely used in the Doom III engine in particular
  15. Everything needed to support pixel and vertex shaders 3.0, including dynamic branching in pixel and vertex processors, texture fetching from vertex processors, etc.
  16. Texture filtering in floating-point formats
  17. Floating-point framebuffer support (including blending operations)
  18. 2 RAMDACs, 400 MHz
  19. 2 DVI interfaces (interface chips required)
  20. TV-Out and TV-In interfaces (interface chips required)
  21. Programmable streaming video processor (for video compression, decompression and post-processing tasks)
  22. 2D accelerator supporting all GDI+ functions
  23. Built-in temperature and power consumption monitoring

GeForce 6600 GT reference card specifications

  1. Core frequency 500 MHz
  2. Effective memory frequency 1 GHz (2*500 MHz)
  3. Memory bus 128 bit
  4. Memory type GDDR3
  5. Memory capacity 128 megabytes
  6. Memory bandwidth 16 gigabytes per second.
  7. Theoretical fill rate is 4 gigapixels per second.
  8. Theoretical texture sampling speed is 4 gigatexels per second.
  9. One VGA (D-Sub) and one DVI-I connector
  10. TV-Out
  11. Consumes up to 70 Watts (so a PCI-Express card needs no connector for additional power; a power supply with a total capacity of 300 Watts or more is recommended)
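The theoretical figures in the list above follow directly from the card's configuration; a minimal, purely illustrative sketch reproducing them:

```python
# Deriving the reference card's theoretical figures from its configuration
# (all input values are taken from the specification list above).

core_mhz = 500        # core frequency
pixel_pipes = 8       # pixel processors, one texture unit each
tmus_per_pipe = 1     # one texture unit per pixel processor
mem_eff_mhz = 1000    # effective memory frequency (2 x 500 MHz GDDR3)
bus_bits = 128        # memory bus width

fill_rate_gpix = core_mhz * pixel_pipes / 1000                   # 4.0 Gpix/s
texel_rate_gtex = core_mhz * pixel_pipes * tmus_per_pipe / 1000  # 4.0 Gtex/s
bandwidth_gb_s = mem_eff_mhz * bus_bits / 8 / 1000               # 16.0 GB/s

print(fill_rate_gpix, texel_rate_gtex, bandwidth_gb_s)  # 4.0 4.0 16.0
```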

List of cards currently released based on NV43:

  • GeForce 6600GT: 500/500 (1000) MHz, 128MB GDDR3, PCI-Express x16, 8 pixel and 3 vertex pipelines ($199) - competitor to the NVIDIA GeForce PCX5900, ATI RADEON X600 XT(?), and future ATI solutions (RV410);
  • GeForce 6600: 300/250-300 (500-600) MHz, 128/256MB DDR, PCI-Express x16, 8 pixel and 3 vertex pipelines ($149) - competitor to the NVIDIA GeForce PCX5750, ATI RADEON X600 PRO (X600 XT?).

General circuit diagram of the chip

No special architectural differences from the NV40 are noticeable, which is not surprising: the NV43 is a scaled solution based on the NV40 architecture (the number of vertex and pixel processors and of memory controller channels has been reduced). The differences are quantitative (shown in bold in the diagram), not qualitative - architecturally the chip is virtually unchanged.

So, there are now 3 vertex processors (there were 6) and two independent pixel processors (there were four), each working on one quad (a 2x2 pixel fragment). Interestingly, PCI Express has this time become the native (i.e., implemented on the chip) bus interface, and AGP 8x boards will carry an additional two-way PCI Express/AGP bridge (shown dotted), which we have already described in detail earlier.
In addition, note a very important limiting factor: the dual-channel controller and the 128-bit memory bus. We will discuss and study this fact in detail below.

The architecture of vertex and pixel processors and the video processor remained the same - these elements were described in detail in our review of the GeForce 6800 Ultra (link). Now, let's talk about potential quantitative and qualitative changes in relation to the NV40:

Theoretical considerations about what was cut and how

In general, at the moment, we get the following line of solutions based on the NV4X and R4XX architectures:

[Table: the current line of NV4X- and R4XX-based solutions compared by pixel/vertex configuration, memory bandwidth, fill rate (MPix) and core frequency. The recoverable memory configurations: on the NVIDIA side, 256-bit (4x64) boards with GDDR3 1100, GDDR3 1000 and DDR 700, plus 128-bit (2x64) boards with GDDR3 1000 (6600 GT) and DDR 500-600-700 (6600); on the ATI side, 256-bit (4x64) boards with GDDR3 1000, GDDR3 1100, DDR 700 and DDR (?), plus the X700 PRO/SE* at 128 bit (2x64) with memory type still unknown. One entry is marked as based on the previous generation architecture.]

*) The data is based on unverified rumors (Beyond3D forum and other unofficial online sources); these products will be officially announced soon.

If the 6800 Ultra, GT and plain 6800 look like fairly balanced solutions in the ratio of memory bandwidth to fill rate, the 6800LE will more often run into insufficient fill rate - it has a surplus of memory bandwidth - while both 6600 models will suffer primarily from a lack of memory bandwidth. The 6600 GT's peak fill rate is nearly 2/3 that of the 6800 Ultra, while its memory bandwidth is less than half, even before accounting for potentially reduced caches and the dual-channel memory controller.
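A quick check of this balance claim against the cards' published theoretical numbers (the 6800 Ultra assumed at 16 pipelines, 400 MHz core, 256-bit GDDR3 at 1100 MHz effective):

```python
# Checking the balance claim above against published theoretical numbers.
# Assumed configurations: 6800 Ultra - 16 pipelines at 400 MHz, 256-bit
# GDDR3 at 1100 MHz effective; 6600 GT - 8 pipelines at 500 MHz, 128-bit
# GDDR3 at 1000 MHz effective.

def gpix_per_s(pipes: int, core_mhz: float) -> float:
    return pipes * core_mhz / 1000

def gb_per_s(bus_bits: int, eff_mhz: float) -> float:
    return bus_bits * eff_mhz / 8 / 1000

fill_ratio = gpix_per_s(8, 500) / gpix_per_s(16, 400)  # 4.0 / 6.4
bw_ratio = gb_per_s(128, 1000) / gb_per_s(256, 1100)   # 16.0 / 35.2
print(round(fill_ratio, 3), round(bw_ratio, 3))        # 0.625 0.455
```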

Thus, we can predict that the weak point of the 6600 family will be high resolutions and modes with full-screen anti-aliasing, especially in simple applications, and the strong point will be programs with long, complex shaders and anisotropic filtering without simultaneous MSAA. We will check this assumption below with game and synthetic tests.

It is hard to judge now how justified the move to a 128-bit memory bus was. On the one hand, it makes the chip package cheaper and reduces the number of defective chips; on the other, the price difference between a 256-bit and a 128-bit PCB is small, and is more than offset by the price difference between regular DDR and still-expensive high-speed GDDR3. From the card makers' point of view a 256-bit solution would probably be more convenient, at least if they had the choice; from the point of view of NVIDIA, which makes the chips and often sells memory with them, the 128-bit solution bundled with GDDR3 is more profitable. The other question is how this will affect speed - the chip's excellent capabilities (8 pipelines at 500 MHz core, and that is not the limit) are potentially constrained by the significantly reduced memory bandwidth:

DDR 700 MHz x 256 bits = 22.4 GB/s versus GDDR3 1000 MHz x 128 bits = 16 GB/s.

This fact is especially troubling given the rumors that the older X700 model will be equipped with 256-bit memory.

Note, however, that NVIDIA has reserved the Ultra suffix for now. Given the good overclocking potential of the 110 nm process, we can expect a card with a core frequency around 600 MHz or slightly less, 1100 or even 1200 MHz memory (in the future), and the name 6600 Ultra. What its price will be is another question. In the longer term we may see an updated 256-bit version of the mainstream solution - call it NV46 - optimized for performance, with 8 or even 12 pipelines and a 256-bit bus.

The vertex and pixel processors in the NV43 apparently remained unchanged, though the internal caches could have been cut in proportion to the number of pipelines. The transistor count, however, gives little cause for concern: given the modest cache sizes, it would have been more reasonable to keep them the same as in the NV40, compensating for the noticeable lack of memory bandwidth. It is also quite possible that the array of ALUs (large in transistor terms) that performs post-processing, checks, Z generation and pixel blending when writing results to the framebuffer was reduced on each pipeline compared to the NV40. In any case, the reduced memory bandwidth will not allow writing a full 4 gigapixels per second, and the shading potential (8 pipelines at 500 MHz) will only be fully used on more or less complex shaders, with more than two textures and accompanying shader calculations.

We will check all these assumptions during subsequent synthetic and game tests.

Before studying the card itself, here is a list of articles devoted to the previous new products, the NV40 and R420 - after all, it is already obvious that the NV43 architecture is a direct successor of the NV40 technologies, with the chip's resources halved.

Theoretical and analytical materials and video card reviews discussing the functional features of the ATI RADEON X800 (R420) and NVIDIA GeForce 6800 (NV40) GPUs

  • NVIDIA GeForce 6800 Ultra (NV40). Part 1 - architectural features and synthetic tests in D3D RightMark (single-page version)
  • NVIDIA GeForce 6800 Ultra (NV40). Part 1 - architectural features and synthetic tests in D3D RightMark (version split into pages)
  • NVIDIA GeForce 6800 Ultra (NV40). Part 2 - a study of performance and quality in gaming applications (single-page version)
  • NVIDIA GeForce 6800 Ultra (NV40). Part 2 - a study of performance and quality in gaming applications (version split into pages)
  • Borodino battle between ATI RADEON X800 XT and NVIDIA GeForce 6800 Ultra - Picture Two: 450 MHz for the second and new tests for both cards (single-page version)
  • Borodino battle between ATI RADEON X800 XT and NVIDIA GeForce 6800 Ultra - Picture Two: 450 MHz for the second and new tests for both cards (version split into pages)
  • Borodino battle between RADEON X800 and GeForce 6800: Picture Three - trilinear filtering (synthetic examples)
  • Borodino battle between RADEON X800 and GeForce 6800: Picture Four - filtering tests based on RightMark (single-page version)
  • Borodino battle between RADEON X800 and GeForce 6800: Picture Four - filtering tests based on RightMark (version split into pages)
  • Borodino battle between ATI RADEON X800 and NVIDIA GeForce 6800 - Picture Five: filtering tests based on games (single-page version)
  • Borodino battle between ATI RADEON X800 and NVIDIA GeForce 6800 - Picture Five: filtering tests based on games (version split into pages)
  • Review of the PowerColor RADEON X800 PRO Limited Edition and hardware conversion of the X800 PRO into an X800 XT Platinum Edition (single-page version)
  • Review of the PowerColor RADEON X800 PRO Limited Edition and hardware conversion of the X800 PRO into an X800 XT Platinum Edition (version split into pages)
  • Review of the Leadtek WinFast A400 TDH and Leadtek WinFast A400 Ultra TDH based on NVIDIA GeForce 6800/6800 Ultra (single-page version)
  • Review of the Leadtek WinFast A400 TDH and Leadtek WinFast A400 Ultra TDH based on NVIDIA GeForce 6800/6800 Ultra (version split into pages)
  • Borodino battle between ATI RADEON X800 and NVIDIA GeForce 6800 - Scene Six: filtering in games (continued) (single-page version)
  • Borodino battle between ATI RADEON X800 and NVIDIA GeForce 6800 - Scene Six: filtering in games (continued) (version split into pages)
  • A brief report on testing FarCry v.1.2 and the first real-world implementation of Shader 3.0
  • A brief report on operational testing of modern 3D cards in DOOM III (X800PRO/XT, GF6800/GT/Ultra, 9800XT/5950U)
  • Chaintech Apogee GeForce 6800 Ultra based on NVIDIA GeForce 6800 Ultra - testing in DOOM III with "optimizations"
Let me emphasize once again that this is only Part 1, dedicated to the performance of the new products. The quality components (3D quality and video playback) will be covered later in Part 2.

Now let's talk about the card. Why are there two cards in the title, but we are actually examining one? The 6600GT and 6600 differ only in operating frequencies, so we can in all likelihood emulate a GF 6600 by lowering the frequencies of the 6600GT - which is what we did. Bear in mind that the production GeForce 6600 will carry plain DDR rather than GDDR3 memory (the timings differ as well as the frequencies), and that NVIDIA does not strictly specify the memory frequency of such cards, so memory clocks from 250 to 300 MHz may occur. We therefore cannot claim a 100% match between our results and those of a final GeForce 6600, but we can make a useful estimate. Our charts accordingly also show a GeForce 6600 at 300/300 (600) MHz - the limiting case. Clearly, a real 6600 will perform NO HIGHER than what our diagrams show, and we can roughly estimate the range it will fall into.

So, the GeForce 6600GT reference card.

Clearly, the design of the GF 6600GT is unique, unlike any previous one. First of all, the card itself is smaller, which the absence of a 256-bit bus makes possible (bus width still affects PCB size). A greatly simplified power unit also helped shrink the PCB: PCX cards consuming less than 75 W no longer require external power, which simplifies the design. Our sample consumes less than 75 W even at maximum load, so no direct connection to the power supply is required.

Despite frequencies that are very high for an 8-pipeline chip, the cooler is quite primitive.

We can assume that manufacturers of such cards will experiment with coolers of their own, or will reuse designs previously made for the GeForce4 Ti (GeForce FX 5600/5700).

The GPU itself has a relatively small die (it has only a 128-bit bus, after all) and overall looks very similar to the GeForce FX 5700; the die dimensions are almost identical. But where the NV36 fit only 4 pixel and 3 vertex pipelines into this area, here there are twice as many pixel pipelines. Such is 0.11 micron...

The video card has an important feature aimed at the future: SLI support (that is, as in the Voodoo2 days, total 3D performance can be increased by adding a second, similar accelerator). For this, the top of the board carries a corresponding connector, joining the two video cards with a special cable (or bridge connector) to synchronize their operation:

Finishing our examination of the card itself, we note its VIVO support, implemented through the Philips 7115 (we have not yet encountered this encoder, so our permanent researcher of the multimedia features of video cards, Alexey Samsonov, is already waiting impatiently to test the new product).

Now let's talk about overclocking. Thanks to the diligence of RivaTuner's author, Alexey Nikolaychuk, this utility can already work with the NV43.

The utility correctly detects the card's number of pipelines (both pixel and vertex). In the second screenshot we can see that the card has two quads (eight pixel pipelines) working.

So, the board operated stably at 590/590 (1180) MHz! Unprecedented potential! I would even suggest that after the release of the ATI RV410, NVIDIA will roll out a GeForce 6600 Ultra (it is no accident that the older model carries only the GT suffix for now).

The card operated at these frequencies while being blown by an external fan. Here are the temperatures we saw:

Yes, core heating sometimes reached 88 degrees, but as we know this is clearly not the limit for such chips (they can heat up to 100 degrees). Interestingly, the external fan effectively cooled only the memory: removing it did not raise the core temperature at all.

This article focuses on one of the best video cards, whose popularity did not decline for several years - the GeForce 6600. In just a few years, gaming technology has advanced so far that most users no longer even remember devices of this class. However, there is a category of people who prefer to use a computer until its first failure, without upgrading it. This article is intended for exactly that audience.

The reader will learn why this video adapter is so popular among users, get acquainted with the market's GeForce 6600-based offerings, and receive full performance-testing information and overclocking data for this video card.

Historical background

The thing is that the 6600 video adapter was produced just as component manufacturers were switching from one technology to another (the AGP video bus was giving way to the PCI Express connector). The manufacturer accordingly tried to sit on two chairs: to make the fastest device for the old AGP standard while, not yet knowing the trends of the new market, packing every technology available at the time into devices for the new bus. It is generally accepted that the epic overclocking of graphics core and memory bus began with the GeForce 6600 video adapter.

It cannot be said that the adapter still occupies top positions years later; on the contrary, its performance is negligible for modern games. But we can say with confidence that the 6600 is the best solution for owners of old computers.

If we talk about the chipset

The chipset is codenamed NV43 and exists in several modifications. The first division is by platform: PCI Express and AGP. The second concerns the technical characteristics directly: the manufacturer created an inexpensive solution called the GeForce 6600 LE with weak characteristics, released many base modifications carrying no letters in their markings, and made the top GT line with high performance. That performance is achieved by raising the memory and graphics core frequencies - ordinary overclocking, only done by the manufacturer's own hands.

Many IT specialists state in the media that this division of video adapters into performance modifications is, in fact, ordinary binning. If a chip runs stably in hard mode, it is labeled GeForce 6600 GT; if it is less stable, it gets no suffix at all; and if it heats up badly under overclocking, it becomes the inexpensive LE modification.

Specifications

The NV43 chip is built on a 256-bit internal architecture, uses a 0.11-micron process, and contains 143 million transistors. Based on the Nvidia GeForce 6600, manufacturers offered modifications with GDDR and GDDR3 memory. Naturally, this difference showed in performance: GDDR3 throughput is about 1.5 times higher. The typical memory capacity of the device is 256 MB, although instances with 128 MB on board are sometimes found.
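A rough check of that 1.5x figure, assuming a 128-bit bus for both variants and typical effective clocks (about 1000 MHz for GDDR3 versus about 667 MHz for plain DDR - our assumption):

```python
# Rough check of the "1.5 times" bandwidth gap between the GDDR3 and plain DDR
# variants. Assumes a 128-bit bus for both and typical effective clocks of
# about 1000 MHz for GDDR3 and about 667 MHz for DDR (our assumption).

def bandwidth_gb_s(bus_bits: int, eff_mhz: float) -> float:
    return bus_bits * eff_mhz / 8 / 1000

gddr3 = bandwidth_gb_s(128, 1000)  # 16.0 GB/s
ddr = bandwidth_gb_s(128, 667)     # ~10.7 GB/s
print(round(gddr3 / ddr, 2))       # -> 1.5
```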

The real operating frequency of the improved modification is 500 MHz, while cheap chips are limited to 350 MHz. All devices support DirectX 9.0c and 10 bits per color channel, and carry a built-in TV encoder that draws on the video processor. And remember that it was NVIDIA's sixth generation that began the epic of building special functions into its hardware: adaptive and true trilinear filtering, UltraShadow and similar additions.

Direct competitor of the chipset

Thanks to Nvidia having a market competitor in ATI, all users can purchase computer components at affordable prices. That applies to the video card in this review, the GeForce 6600, whose characteristics are as close as possible to those of the ATI Radeon X700 Pro. The competitor's video adapter has a lower graphics core frequency (425 MHz) and a lower memory frequency (800 MHz), but that does not stop it from showing high performance.

It is all about the technologies ATI adapters support. Having long ago abandoned the race for higher frequencies, the manufacturer began building its own developments into its products, which let it demonstrate excellent performance potential. In the media, owners do note weaker driver support than the GeForce 6600 enjoys: Windows 7, on detecting a discrete ATI Radeon X700 Pro, assigns it a generic built-in driver.

Leader in the video card market

The most productive video adapter based on the Nvidia GeForce 6600 GT is considered to be the product from Leadtek. It owes its power to a high graphics core frequency (550 MHz) and memory bus frequency (1120 MHz). High-speed video data transfer is provided by Samsung memory modules with a response time of 1.6 nanoseconds.

This absolute performance leader also has several shortcomings. First, the cooling system: it supplies cold air only to the graphics chip, leaving the memory modules and other important components unattended. Second, there is no video input, which all its competitors implement, though the video output does support HDTV - whether that is an advantage or a disadvantage is honestly unclear. Incidentally, many potential buyers literally chased the Leadtek adapter because its package included licensed games: Splinter Cell: Pandora Tomorrow and Prince of Persia.

Gold series

When it comes to devices built to last, many owners immediately recall the GeForce 6600 GT video adapter from Gainward. The card belongs to the gold series, which promises fault tolerance alongside high performance. The manufacturer made an interesting marketing move: with base frequencies of 500 MHz for the video core and 1000 MHz for the memory bus set in the card's BIOS, it gave the device a lifetime warranty - on condition that the user does not overclock it himself. Yet when the proprietary software from the bundled disc is installed, the video driver itself overclocks the graphics core to 540 MHz and the bus to 1150 MHz.

Against its competitors, the Gainward adapter's performance is very high. Hardware support for video input and HDTV video output is a fine addition to a worthy purchase. The future owner will also appreciate a system that effectively cools every module installed on the board.

No noise and dust

If the user thinks a passively cooled 6600 GT cannot surprise fans of dynamic games, he is definitely mistaken: Gigabyte would never allow its products to be considered second-rate. A copper heatsink wraps around the video card on both sides, covering all the chips installed on it, and a fairly wide copper heatpipe runs between the two radiators to even out the temperature.

And yes, it does get warm - though not enough to trigger the protection and send the computer into a reboot. Experts recommend that future owners provide a spacious system case with decent ventilation. The adapter's high performance is ensured by the raised memory frequency of 1120 MHz and 1.6 ns Samsung modules. As a result, this particular product can be recommended to users who put high performance, quiet operation and gaming first - for the NV43 chip, Gigabyte has no competitors here, and none are expected.

Diamond Series

The legendary 6600 AGP from MSI was remembered by everyone for its advertising: it was thanks to MSI that fans of dynamic games learned about the fine shooter "The Chronicles of Riddick." The film's hero was pictured on the box, and the game itself was bundled free with the video adapter. The device's power was indicated by its Diamond marking. This AGP card had serious potential: 400 MHz on the core and 800 MHz on the memory. It is a pity the MSI developers did not go one step further and give the user at least some tool for raising the graphics core and bus frequencies; that is somewhat lacking in testing.

The manufacturer also took care of cooling: a cooler blows across the graphics controller, and the memory chips carry heatsinks. Judging by reviews, users are puzzled by the absence of a copper plate on the back of the card, since dissipated power heats that part of the board in particular. Overall the attitude toward the product is positive: the card is handsome, sensibly made and, for last-century technology, copes with its tasks very well.

Strange extreme

ASUS products have always held top positions in all tests, but with the GeForce 6600 128MB Extreme video card the market leader embarrassed itself. The memory and core frequencies set at the factory cannot be changed. What ASUS hoped to achieve remains a mystery: having removed the video input and fitted slow memory modules with a 2.5 ns response time, the famous Taiwanese brand did not even bother to lower the price of its product.

In the media, many owners assure others that ASUS is the only company in the world to have added the licensed game "Joint Operations" to the package. There is little logic in such claims: the game can be bought separately, and everyone would rather see a powerful video card in their computer than mass-market filler.

Last Hero

Albatron's product, the Trinity GeForce 6600 AGP, is considered a reference point by many, for it was this brand that set the tone for the whole components market at the dawn of the computer era. The appearance of Albatron devices - video card, motherboard or tuner - has always been distinguished from competitors by its deliberate strictness and conservatism.

This market veteran chose not to join the race for raw performance, concentrating instead on the bundle and the software supplied with the video card. It is a pity that so many manufacturers now pay so little attention to bundled software, because that is exactly where users' attachment to a brand is won.

Testing video adapters

It is unlikely that a GeForce 6600 PCI-E device built for DirectX 9.0 could be forced to run a more advanced game, so period-appropriate applications (The Chronicles of Riddick, Half-Life 2 and Doom 3) are used to compare the video cards. Naturally, there is no question of FullHD; the standard resolution of 1024x768 pixels with 32-bit color is used.

As expected, the leading positions went to the devices with the highest graphics core and memory bus frequencies: Leadtek, Gainward and Gigabyte. One correct conclusion follows: in games, the video card with the highest factory overclock wins. Appearance, beauty and bundles merely distract potential buyers from the high-performance adapters.

Pitfalls for future owners

Manufacturers always hide their products' shortcomings from prying eyes. So it was with the NV43 chip, which managed to stand out from its competitors with the fragility of its die: careless handling leads to disastrous results. Having raised the GeForce 6600's performance with 143 million transistors on the chip, the manufacturer gave no thought to how easily the card could be damaged in careless hands.

The trouble is that the chip die stands proud of the card's surface, and the cooler installed on top has no recess for the graphics chip. Press on one edge of the heatsink, and the graphics die can crack. From there it is a lottery: either just a corner of the chip breaks off, or the whole GPU splits in two. In the first case the video adapter will most likely still work; in the second, it is off to the store for a new purchase.

Problems with installing video cards

Many owners of older computers have noticed the odd placement of memory slots on their motherboards. It feels as if the manufacturer was playing a joke on users, putting the video card slot in close proximity to the latches that secure the memory modules. Every GeForce 6600 owner thus has a chance of snapping a couple of capacitors off the board during installation. To avoid this, installation must be carried out with maximum care and in good lighting.

IT specialists in the media recommend installing the memory modules first and only then inserting the video card into its slot. While carefully lowering it all the way down, use the fingers of the other hand to guide the edge of the board away from the RAM retention tabs. If the end of the adapter's PCB rests against a capacitor on the motherboard for some reason, the user can easily fix the problem himself by gently bending the offending part aside.

About efficient overclocking

An inquisitive user will inevitably conclude that the adapter's performance can be raised further by overclocking, given suitable software and excellent cooling. Judging by several years of owner reports in the media, no time was wasted overclocking the GeForce 6600: owners achieved figures that were records for the time - a graphics core frequency of 615 MHz and a bus frequency of 1340 MHz.

The user's first task is to give the video card decent cooling, and there are many options. The stock cooler can be replaced with a professional solution from Shuttle or Zalman, but the price of such cooling systems raises doubts - it is easier to replace the motherboard and processor than to pay a hefty sum for a professional cooler. The second option is to mount an additional 120 mm fan with an airflow of 90-140 CFM above the stock cooler. For the overclocking itself, it is recommended to use the official software that comes with the Nvidia driver.

Finally

The GeForce 6600 video card is the last link connecting two eras of personal computers capable of running dynamic games. The transitional models will leave the market, and the time of old PCs will end with them. If a potential buyer faces a choice between buying a GeForce 6600 and upgrading the whole system, the latter is worth serious thought, because nothing in our world lasts forever; sooner or later, old technologies are lost for good. But if the task is simply to choose among these video adapters for an existing system, then by the price-to-quality criterion the buyer will find nothing worthier than a card on the GeForce 6600 chip.






