Program for accelerating 3D games. Causes of errors related to hardware acceleration


GeForce4 Ti 4200

A video card (also video adapter, graphics adapter, graphics card, or graphics accelerator) is a device that converts a graphic image, stored as the contents of the computer's memory (or of the adapter's own memory), into a form suitable for display on a monitor. The first monitors, built on cathode-ray tubes, worked on the television principle of scanning the screen with an electron beam and required a video signal generated by the video card.

However, this basic function, while still necessary and in demand, has receded into the background and no longer determines a card's capabilities: the quality of the video signal (image clarity) has very little to do with the price or technical level of a modern video card. Today a graphics adapter is understood first of all as a device with a graphics processor - a graphics accelerator responsible for generating the image itself. Modern video cards are not limited to simple image output; their built-in GPU performs additional processing, offloading this work from the computer's central processor. For example, all modern Nvidia and AMD (ATI) graphics cards execute the OpenGL and DirectX graphics pipelines in hardware. Recently there has also been a trend toward using the computing power of the GPU for non-graphics tasks.

Typically a video card is made as a printed circuit board (expansion card) inserted into a universal or specialized expansion slot (AGP, PCI Express). Video cards built into (integrated with) the motherboard are also widespread, either as a separate chip or as part of the chipset's northbridge or the CPU; in that case the device, strictly speaking, cannot be called a video card.

History of creation

One of the first graphics adapters for the IBM PC was the MDA (Monochrome Display Adapter), released in 1981. It worked only in text mode with a resolution of 80×25 characters (physically 720×350 pixels) and supported five text attributes: normal, bright, inverse, underlined, and blinking. It could not transmit any color or graphic information; the color of the letters was determined by the monitor model and was usually white, amber, or green on a black background. In 1982 Hercules released a further development of the MDA, the HGC (Hercules Graphics Controller), which offered a graphics resolution of 720×348 pixels and supported two graphics pages, but it still did not allow working with color.

The first color video card was the CGA (Color Graphics Adapter), released by IBM; it became the basis for subsequent video card standards. It could work either in text mode with resolutions of 40×25 or 80×25 characters (with an 8×8 character matrix), or in graphics mode at 320×200 or 640×200 pixels. In text modes 256 character attributes were available: 16 foreground colors and 16 background colors (or 8 background colors plus a blink attribute). In the 320×200 graphics mode four palettes of four colors each were available, while the 640×200 high-resolution mode was monochrome. Its successor, the EGA (Enhanced Graphics Adapter), expanded the palette to 64 colors and added an intermediate buffer. The resolution was raised to 640×350, which enabled an 80×43 text mode with an 8×8 character matrix; the 80×25 mode used a larger 8×14 matrix. Up to 16 colors could be displayed simultaneously, chosen from the 64-color palette; the 640×350 graphics mode likewise allowed 16 colors from a palette of 64. EGA was compatible with CGA and MDA.
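The CGA text attribute scheme described above maps neatly onto one byte per character cell. The following sketch (purely illustrative; the function name is mine, not from any API) decodes the standard layout: bits 0-3 hold the foreground color, bits 4-6 the background, and bit 7 either blink or a fourth background bit, depending on the adapter's configuration.

```python
# Decode a CGA text-mode attribute byte.
# Layout (as described in the text): bits 0-3 = foreground (16 colors),
# bits 4-6 = background (8 colors), bit 7 = blink - or, when blinking is
# disabled, bit 7 becomes a 4th background bit (16 background colors).
def decode_attribute(attr, blink_enabled=True):
    fg = attr & 0x0F                 # 16 foreground colors
    if blink_enabled:
        bg = (attr >> 4) & 0x07      # 8 background colors
        blink = bool(attr & 0x80)
    else:
        bg = (attr >> 4) & 0x0F      # 16 background colors, no blink
        blink = False
    return fg, bg, blink

# 0x9E = 1001 1110b: blinking yellow (14) on blue (1)
print(decode_attribute(0x9E))                      # (14, 1, True)
print(decode_attribute(0x9E, blink_enabled=False)) # (14, 9, False)
```

Either way the byte encodes 256 distinct attribute values, matching the "256 character attributes" figure above.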

It is worth noting that the monitor interfaces of all these video adapters were digital. MDA and HGC transmitted only whether a dot was lit or not, plus an additional brightness signal for the "bright" text attribute. Similarly, CGA transmitted the main video signal over three channels (red, green, blue) and could additionally transmit an intensity signal, giving 16 colors in total. EGA had two transmission lines for each primary color, so each primary could be displayed at full, 2/3, or 1/3 brightness (or off), giving a maximum of 64 colors.
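The color counts above follow directly from counting signal lines: each digital line contributes one bit, and the palette size is two to the power of the total bit count. A small helper (illustrative, not from the source) makes the arithmetic explicit:

```python
# Palette size of a digital monitor interface: 2 ** (total signal bits).
def palette_size(bits_per_channel, channels=3, extra_bits=0):
    return 2 ** (bits_per_channel * channels + extra_bits)

# CGA (RGBI): 1 bit each for R, G, B plus one shared intensity bit
print(palette_size(1, extra_bits=1))  # 16 colors

# EGA: 2 lines per primary -> 4 levels (off, 1/3, 2/3, full) per channel
print(palette_size(2))                # 64 colors
```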

In the early IBM PS/2 models a new graphics adapter appeared: the MCGA (Multicolor Graphics Adapter). The text resolution was raised to 640×400, which made possible an 80×50 mode with an 8×8 matrix and gave the 80×25 mode an 8×16 matrix. The number of colors was increased to 262,144 (64 brightness levels for each primary color). For compatibility with EGA in text modes, a color table was introduced through which the 64-color EGA space was mapped into the MCGA color space. A 320×200×256 mode appeared in which each pixel on the screen was encoded by a corresponding byte in video memory; there were no bit planes, so compatibility with EGA remained only in text modes, while compatibility with CGA was complete. Because of the large number of brightness levels for the primary colors, an analog color signal became necessary; the horizontal scan frequency rose to 31.5 kHz.
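The MCGA figures in the paragraph above can be checked with two lines of arithmetic: the color space is 64 levels cubed, and the byte-per-pixel 320×200 mode needs exactly 64,000 bytes of framebuffer (an illustrative calculation, not from the source):

```python
# MCGA color space: 64 brightness levels per primary color, three primaries
levels_per_channel = 64
print(levels_per_channel ** 3)           # 262144 possible colors

# The 320x200x256 mode: one byte per pixel indexes a 256-entry palette
width, height, bytes_per_pixel = 320, 200, 1
print(width * height * bytes_per_pixel)  # 64000 bytes of framebuffer
```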

IBM then went further and created VGA (Video Graphics Array), an extension of MCGA compatible with EGA, introduced in mid-range PS/2 models. VGA has been the de facto video adapter standard since the late 1980s. It added a 720×400 text resolution for MDA emulation and a 640×480 graphics mode with bit-plane access. The 640×480 mode is notable for using a square pixel: the ratio of horizontal to vertical pixels matches the standard 4:3 screen aspect ratio. Then came the IBM 8514/A with 640×480×256 and 1024×768×256 modes, and the IBM XGA with a 132×25 text mode (1056×400) and increased color depth (640×480×65K).

Device

A modern video card consists of the following parts:

  • GPU
  • cooling system

  • Video memory - used for temporary storage of the image data itself and of other graphics-subsystem data and code: textures, shaders, vertex buffers, the Z-buffer (used for hidden-surface removal in 3D graphics), and the like (with the exception, for the most part, of Video BIOS data and the GPU's internal memory).
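The Z-buffer mentioned above performs hidden-surface removal with a simple per-pixel depth comparison: a fragment is written only if it is closer to the viewer than whatever is already stored at that pixel. A minimal sketch (tied to no particular API; names and values are illustrative):

```python
# Minimal Z-buffer test: keep a depth value per pixel and reject any
# fragment that is farther away than what was already drawn there.
def draw_pixel(color_buf, z_buf, x, y, color, depth):
    if depth < z_buf[y][x]:        # smaller depth = closer to the viewer
        z_buf[y][x] = depth
        color_buf[y][x] = color

W, H = 4, 3
color = [[0] * W for _ in range(H)]
zbuf = [[float("inf")] * W for _ in range(H)]  # start "infinitely far"

draw_pixel(color, zbuf, 1, 1, color=0xFF0000, depth=0.8)  # far red fragment
draw_pixel(color, zbuf, 1, 1, color=0x00FF00, depth=0.3)  # nearer green wins
draw_pixel(color, zbuf, 1, 1, color=0x0000FF, depth=0.5)  # behind green: rejected
print(hex(color[1][1]))  # 0xff00 (green survived)
```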

Characteristics of video cards

  • Memory bus width, measured in bits - the number of bits of information transmitted per clock cycle; an important factor in card performance.
  • Video memory size, measured in megabytes - the amount of the video card's own RAM. A larger volume does not always mean higher performance.

Video cards integrated into the motherboard's system logic (chipset) or into the CPU usually have no video memory of their own and use part of the computer's RAM (UMA - Unified Memory Access).

  • Core and memory frequencies, measured in megahertz - the higher they are, the faster the video card processes information.
  • Texture and pixel fill rate, measured in millions of pixels per second - shows the amount of information output per unit of time.
  • Other important technical characteristics of a video card include the built-in cooling system, if one is present, and the data-transfer interface connectors.
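The first two bullet points combine into a single figure of merit: peak memory bandwidth is simply bytes per transfer (bus width / 8) times the effective memory clock. A quick sketch with purely hypothetical card figures:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective memory clock.
# The example numbers below are hypothetical, chosen only to illustrate
# the arithmetic, not taken from any real card's datasheet.
def peak_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

# A 128-bit bus at an effective 500 MHz moves 8 GB/s at peak
print(peak_bandwidth_gb_s(128, 500))   # 8.0
# Doubling the bus width doubles the peak bandwidth
print(peak_bandwidth_gb_s(256, 500))   # 16.0
```

This is why bus width matters as much as the raw memory size: a large but slow memory pool can still starve the GPU.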

3D accelerators

The term 3D accelerator formally means a separate expansion card that performs the auxiliary function of accelerating the generation of three-dimensional graphics; displaying the result as a 2D image and sending it to the monitor is not the 3D accelerator's task. In the modern sense, 3D accelerators practically never exist as separate devices: almost any modern video card except highly specialized ones, including integrated graphics adapters in processors and chipsets, performs hardware acceleration of both two-dimensional and three-dimensional graphics.

Hardware acceleration of graphics was part of the design of many early personal computers, although the first IBM PC model had only text modes and could not display graphics at all. Still, video cards with hardware 2D and 3D acceleration for IBM PC-compatible computers appeared quite early: back in 1984, IBM began producing and selling video cards of the PGC standard. The PGC was created for professional use, performed hardware acceleration of 2D and 3D primitives, and was aimed primarily at CAD applications. However, the IBM PGC was extremely expensive - the card cost much more than the computer itself - so such solutions never gained wide distribution. To be fair, there were also video cards and 3D accelerators from other manufacturers on the professional market.

Affordable 3D accelerators for IBM PC-compatible computers began to spread in 1994. The development of graphical user interfaces, above all operating systems with GUIs, affected the evolution of video cards in general: they now had to render quickly and cleanly at high resolutions with greater color depth. In addition, to shorten the response to user actions and relieve the CPU of processing large amounts of graphics data, 2D acceleration functions began to appear in some video cards. With the growing popularity of Microsoft Windows, some graphics adapters implemented hardware cursor display, hardware filling of screen areas, hardware copying and moving of screen areas (including hardware scrolling), and hardware drawing of 2D primitives. The next step in this direction was hardware drawing of 3D primitives. The first video card to support hardware acceleration of 3D graphics was the Matrox Impression Plus, released in 1994 (based on the Matrox Athena chip). Later that year Matrox introduced a new chip, the Matrox Storm, and a video card based on it, the Matrox Millennium. The 1994 Matrox Millennium was the first card in the highly successful Millennium series, which was produced until the mid-2000s.

By 1995 several companies were releasing new graphics chips with support for hardware 3D acceleration: Matrox released the MGA-2064W, Number Nine Visual Technology the Imagine 128-II, Yamaha the YGV611 and YGV612, 3DLabs the Glint 300SX, and Nvidia the NV1 (also produced under an agreement with SGS-THOMSON under the name STG2000). In the same year a large number of video cards with 3D acceleration support appeared from various manufacturers based on these chips.

The real breakthrough in the market for 3D accelerators and video cards with hardware 3D acceleration came in 1996, the year of mass adoption and popularization of hardware 3D graphics on IBM PC-compatible computers. New graphics solutions appeared from 3DLabs, Matrox, ATI Technologies, Rendition, Chromatic Research, Number Nine Visual Technology, Trident, and PowerVR. And although many 3D accelerators and full-fledged video cards with 3D acceleration were released that year based on these GPUs, the main event was the release of 3D accelerators based on the 3Dfx Voodoo Graphics chipset. 3dfx Interactive, which had previously produced specialized 3D accelerators for arcade machines, presented a chipset for the IBM PC-compatible market. The speed and rendering quality of Voodoo Graphics cards were on the level of contemporary arcade machines; most video card manufacturers began releasing 3D accelerators based on the Voodoo Graphics chipset, and soon most game developers supported Voodoo Graphics and released new games for IBM PC-compatible computers with an entirely new level of 3D graphics. Interest in 3D games, and accordingly in 3D accelerators, exploded.

Gaming video accelerators

Gaming video accelerators are video cards aimed at accelerating 3D graphics in games.

Since 1998, starting with SLI (Scan Line Interleave) technology from 3dfx (the Voodoo2 card), it has been possible to use the power of several interconnected video cards for rendering a three-dimensional image. See NVIDIA SLI and ATI CrossFire.

Professional video accelerators

Professional graphics cards are video cards aimed at graphics workstations and at mathematical and graphical 2D and 3D modeling packages, which place a significant load on the card when computing and rendering models of designed objects.

The cores of professional video accelerators from the major manufacturers, AMD and NVIDIA, differ little internally from their gaming counterparts. The two companies long ago unified their GPUs and use them across different segments; it was precisely this move that allowed them to push out of the market the companies developing and promoting specialized graphics chips for professional applications.

Particular attention is paid to the video memory subsystem, since it is an especially important component of professional accelerators and bears the main load when working with gigantic models. In particular, besides noticeably larger memory sizes than comparably performing gaming cards, video cards in the professional segment can use ECC memory.

Separately, there are products from Matrox, whose highly specialized accelerators as of 2017 were used for video encoding, TV signal processing and work with complex 2D graphics.

Types of Graphic Cards

Discrete video cards

The highest-performance class of graphics adapters. Discrete cards typically connect to the high-speed PCI Express bus. Previously there were video cards connected via AGP (a specialized bus for connecting video cards only), PCI, VESA Local Bus, and ISA. Modern video cards connect only via PCI Express; all other connection types are obsolete. On computers with architectures other than IBM-compatible, other types of video card connection also existed.

A discrete card is not necessarily removable (in laptops, for example, the discrete card is often soldered to the motherboard). It is called discrete because it is implemented as a separate chip (or chipset) rather than as part of other components (unlike graphics solutions built into the motherboard's system-logic chips or directly into the central processor). Most discrete video cards have their own memory (VRAM), which often has a higher access speed or a faster bus than the computer's main RAM. Although there used to be video cards that fully or partially used main RAM for storing and processing graphics data, almost all modern video cards use their own video memory. Occasionally (though quite rarely) there are video cards whose memory is not installed as separate chips but is part of the graphics chip itself (as separate dies, or on the same die as the GPU).

Built as a separate chipset rather than as part of other chips, discrete video cards can be quite complex and much more powerful than integrated graphics. In addition, since they have their own video memory, discrete cards do not need to share RAM with other components (primarily the central processor). Dedicated memory avoids spending main RAM on data that the CPU and other components do not need, and the video processor does not have to queue for access to system RAM alongside the CPU and everything else. All of this has a positive effect on the performance of discrete video cards compared to integrated graphics.
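A rough back-of-the-envelope calculation shows why dedicated video memory matters: even the bare framebuffers for double-buffered 32-bit rendering with a depth buffer add up quickly, before a single texture is loaded. The figures below are purely illustrative assumptions, not taken from any particular card:

```python
# Rough framebuffer memory estimate for double-buffered 32-bit color
# rendering plus a 32-bit Z-buffer. All parameters are illustrative.
def framebuffer_bytes(width, height, color_bytes=4, z_bytes=4, buffers=2):
    color = width * height * color_bytes * buffers  # front + back buffer
    depth = width * height * z_bytes                # one Z-buffer
    return color + depth

total = framebuffer_bytes(1920, 1080)
print(total)                       # 24883200 bytes
print(round(total / 2**20, 1))     # ~23.7 MiB before any textures
```

If this memory lived in system RAM, every one of those bytes would compete with the CPU for the same memory bus, which is exactly the contention the paragraph above describes.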

DirectX

The DirectX package is a set of libraries that help create visual and sound effects for applications and games. It is distributed free of charge, so users can download Direct3D for Windows 7 64-bit, 32-bit, and other systems at no cost.

DirectX is considered a core part of the operating system. Thanks to this package, the quality and processing level of films and games that include 3D animation, color graphics, full stereo sound, and other multimedia elements can be significantly improved. In addition, DirectX can increase system performance and security.

DirectX consists of the following parts:

  • Direct3D is the component responsible for rendering three-dimensional graphics. Much here depends on the video card installed in the computer: the more powerful it is, the better 3D elements will look.
  • DirectDraw is the part of the package responsible for displaying a two-dimensional image.
  • DirectSound is an element that processes sound effects for movies, games and applications. This component is also used for hardware audio acceleration and mixing.

Required System Requirements

For DirectX to work correctly, the following conditions are necessary:

  1. A Windows operating system with x64 or x32 (x86) architecture.
  2. A stable Internet connection for downloading additional modules or components if needed.

What can the DirectX suite do?

The main features of the component are as follows:

  • Regular updates to the operating system's security packages.
  • Full compatibility with NVIDIA (GeForce) and ATI video drivers.
  • Improved video quality.
  • Work with sound files in WAV format.
  • Encoding and decoding of music tracks.
  • Playback of audio files in complex formats.
  • Display of animation on web pages.
  • Correct handling of 3D and 2D animation and graphics.
  • Reduced load on the computer's processor.
  • Increased rendering speed of visual components.
  • Three-dimensional display of elements.

What's new?

DirectX is constantly updated: the latest package has added the following features:

  • Significantly increased quality and efficiency when running multimedia applications.
  • The Direct3D element has been completely updated.
  • Thanks to support for new technologies, the display of shadows and textures has been improved, giving a more realistic effect when viewing animation and graphics in games and videos.
  • The package has become more stable.

How is DirectX different from other drivers?

In fact, DirectX does have a competitor, the OpenGL API, but the two differ in a number of ways, and in terms of its feature set DirectX compares favorably.

Lesnik 61 29-01-2009 20:56

Friends, my son is asking for the latest version of a "3D game graphics accelerator" program. I'm lost here, like a blind man in the forest.
Best regards, Lesnik61

ober 29-01-2009 21:14

uh, don't you mean a new video card? Like an Nvidia Quadro FX 5800? Or maybe you need DirectX 11?

Lesnik 61 29-01-2009 21:33

quote: uh, don't you mean a new video card? Like an Nvidia Quadro FX 5800? Or maybe you need DirectX 11?

The son claims that DirectX 11 is still in development, and that DirectX 10.1 is only available for AMD video cards.
Sorry, I'm passing the keyboard to my son.
Hello. The accelerator works to speed up the video card (as a second helper). For example, Counter-Strike: Source on a 20-player server is slow; we launch the program and everything runs fine. The problem is that I have an outdated version, which can overclock the video card by 1% to 18%; in this version you can only set it to medium (the default), while the new one allows maximum - online that's just the thing. I have DirectX 10, I downloaded it recently. Help me please.
Best regards, Gleb.

seysen 29-01-2009 21:39

This is the first time I've heard of this...

person 29-01-2009 21:48



the problem is that I have an outdated version


... what? When the name of the program is known, searching for a new version is a little easier.
RivaTuner, or what?

ober 29-01-2009 22:01

what video card?

Lesnik 61 29-01-2009 22:11

quote: what video card?

INVIDIA
Best regards, Gleb.

seysen 29-01-2009 22:21

quote: Originally posted by Lesnik 61:

INVIDIA


maybe nVidia?
If so, then something must go further....

Ostwind 30-01-2009 02:09

So really, RivaTuner is about the only thing that will speed it up a bit; otherwise you'll have to buy a new video card.

Mihoshi 30-01-2009 15:22

He means a physics accelerator. It is no longer produced as a separate board; the company was bought by NVIDIA, which now supports the technology with its own drivers on new chips. I'll warn you right away: it's expensive, and the benefit is modest.
GeForce GTX 295
GeForce GTX 285

Choose one of these

Ostwind 30-01-2009 15:34

I strongly doubt it's a physics accelerator - what would the 1-18% mean then?

Mihoshi 30-01-2009 15:52

They exist. It's genuinely working technology - in Crysis, for example, the difference is striking. But games that support it are few and far between.

Ostwind 30-01-2009 17:31

I know they exist, but as I understand it they haven't really caught on, and I doubt that CS:S supports it.

Mihoshi 30-01-2009 18:16

They have only just entered the market. And CS really does support it. They simply process commands from Havok.

Walenok 30-01-2009 22:00

Counter-Strike: Source is slowing down, apparently, because of something else - it ran with a bang on my GeForce 6800. Physics has nothing to do with it, and processor power should be sufficient.
Next: any video card overclocking tool can burn the card. A more or less significant performance increase is possible only on cut-down versions with lowered frequencies and disabled pipelines (the 5000, 6000, 7000 series) or stream processors (everything from the 8000 series onward) - I'm talking about Nvidia here - and only if the disabled units are actually functional.
In the Nvidia Control Panel you can shift the settings toward performance:
Anisotropic filtering - Application-controlled; if your card can handle it, maybe 2x or 4x.
Antialiasing (all types) - Application-controlled.
In the game you can reduce texture and model detail, and simplify physics if the option exists (it's barely noticeable anyway). Resolution has little effect on modern video cards (within reasonable limits).
The latest Nvidia driver has also been released, which includes hardware physics acceleration on the video card (8000 series and up).

Walenok 30-01-2009 22:06

The GeForce 180.48 driver is available for XP and Vista.
Two or three video cards make sense only if the monitor is 24", or better yet 32" - or if you have money to burn. Take one top-end video card and live in peace for two years.

Mozgun 31-01-2009 09:15

The guy apparently means G3dAccel. I haven't used it, but it would be interesting to see the statistics of serious testing, because I have little faith in programs like this, although there are positive reviews.
The trick is that this program replaces the DirectX libraries with its own written in assembler, so everything that runs on DirectX is supposed to run faster.
Anyone want to try? There are plenty of links on Google.

PS: I downloaded the first version I came across; I'll run 3DMark 2001 with and without it and post the results here. If it accelerates anything, it will show.

PPS: I tested it - on my ancient Radeon 9000 there is no gain; on the contrary, performance drops by about a percent.

zim 31-01-2009 14:03

quote: The GeForce 180.48 driver is available for XP and Vista.

Quite right. True, I have a laptop rather than a desktop, but still.
On the NVIDIA website this driver is positioned as a "new driver for laptops" with the built-in engine. After installing it, the card in my laptop heated up to 90 degrees in games. Whereas on the year-old driver 177.51 from 03.03.2008, without the built-in engine, the temperature in games stays at a stable 84-86 degrees, and in the old game IGI 2: Covert Strike it stays at 65-67 degrees. I didn't notice any difference in physics.

Lesnik 61 31-01-2009 16:12

What programs can be used for overclocking? Demanding games are coming soon, and I don't want to replace my computer, so I need to prepare in advance - otherwise they'll release something and it won't run, and by the time it does run, the game is no longer popular. Just thinking ahead ))).
RivaTuner, the "graphics accelerator for 3D games". What else interesting can be found?
Who has worked with RivaTuner - what are your opinions of this program, is it worth downloading or not?
With the same respect, Gleb.

Mihoshi 31-01-2009 16:54

Expecting a new driver to speed up physics calculations is as naive as hoping that blue LED trim and a spoiler will make a Lada "Nine" faster and more comfortable. The driver's physics acceleration works only on GeForce GTX 295 and GeForce GTX 285 series cards; on others only in SLI mode, with the second video card dedicated to physics. Also, this is only compatible with the Havok physics engine, and only if the game supports physics accelerators. In plain terms: Crysis and about five other games released in 2008. Counter-Strike too, thanks to constant patching.

Lesnik 61 31-01-2009 19:34

What is patching??? Is this a patch or something like that?

Walenok 31-01-2009 21:15

Well, yes.
And forget about this physics. Apart from eye candy it doesn't affect anything - not even graphics, really. With RivaTuner you can squeeze out a little, but you can burn the card. It's not worth it.

Mihoshi 01-02-2009 03:59

If you're buying one of the cards above, you could also get a 280/290 instead - they just run hotter - and then turn everything all the way up in the control panel.

person 01-02-2009 13:07

quote: Originally posted by Walenok:

With the help of Riva you can squeeze out a little, but the video can burn. It's not worth it.


If the cooling is normal, case airflow included, then 10-15% is quite safe. Although of course cards vary...
quote: Originally posted by Mihoshi:

unscrew everything all the way in the control panel


Now you can change the mode (bus overclocking) in the BIOS.

Mozgun 01-02-2009 15:20

You can also try a voltmod if you're so inclined. I've sinned that way before, and I've modified video card BIOSes, but it's all just tinkering, a workout for the brain. Moreover, it's a risky business: you need a pile of hardware you don't mind burning if something goes wrong.
If you have a separately purchased cooling system like a Zalman, you can quite safely do software overclocking - you just can't squeeze out much. The site http://www.overclockers.ru/ has everything for beginner overclockers, and there's a good forum there too. And what if your card was made "at the end of the month" with faster memory chips than necessary, or the chip on the card is from the next generation with locked pipelines? There have been such cases.

About five years ago few could have suspected that it would soon be possible to run demanding 3D games and watch FullHD movies on almost any mobile device, however cheap. Yet it happened, and the thanks go to whoever decided to put full-fledged 3D accelerators into smartphones and tablets.

It is thanks to these modules that we can now play games on our Android devices.

Why do you need a 3D accelerator?

You may know that in a regular computer the role of the graphics engine is performed by the video card - a special device responsible for processing data, mainly graphics.

In mobile devices, 3D accelerators are essentially the same thing as video cards, only smaller and more energy-efficient.

First of all, graphics accelerators are used by games that work in 3D mode, that is, with three-dimensional images. Without such a device, even the most powerful devices would slow down in the simplest games.

The second thing the 3D accelerator is responsible for is processing two-dimensional images: drawing the interface, displaying pictures, and playing video. The latter in particular requires quite a lot of power, which is why, without video accelerators, playing 1080p or even 720p video would be impossible, or at least a very difficult task.

Why is there no way without an accelerator?

You probably have a question: why is a graphics accelerator so important for video and graphics processing? Is the central processor really unable to cope with these calculations?

The answer: graphics processing is a rather specific task. Fast processing requires a special chip structure as well as dedicated additional modules. Beyond that, using a 3D accelerator is simply more rational: the CPU is not loaded with graphics work, and the relevant data is handled exclusively by the graphics chip, which in turn offers higher performance on these specific tasks.
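The "specific task" the paragraph describes is data parallelism: the same small operation applied independently to every pixel. A CPU walks through the pixels in a loop; a GPU runs thousands of such per-pixel programs simultaneously. A minimal sketch of the shape of this work (illustrative; the function is mine, not any real API):

```python
# The same tiny operation applied independently to each pixel - exactly
# the embarrassingly parallel workload a graphics chip is built for.
# A CPU iterates; a GPU would run one instance per pixel in parallel.
def brighten(pixels, factor):
    return [min(255, int(p * factor)) for p in pixels]

# Each output value depends only on its own input value,
# so all pixels could be computed at the same time.
print(brighten([10, 100, 200], 1.5))  # [15, 150, 255]
```

Because no pixel depends on any other, the hardware can trade the CPU's few fast cores for many simple ones, which is where the accelerator's advantage on this workload comes from.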

The year 1997 in the annals of the PC is inextricably linked with the name of the 3dfx company and its revolutionary product - the Voodoo Graphics board, which was the ultimate dream of every gamer of that time and launched the triumphant march of 3D graphics in the world of personal computers (in fact, Voodoo was introduced in December 1996, but it was the next year that all the fun began). To fully appreciate the historical significance of Voodoo at a time so distant from the events described, it is worth talking about the state of 3D rendering technology at that time. To do this, you will have to delve into history a little further and remember about other, sometimes exotic to the modern eye, devices that were born even before the debut of Voodoo Graphics.

Real-time three-dimensional images were being produced with special-purpose chips even before 3dfx, but for quite a long time such technology was the privilege of workstations, arcade machines, and home game consoles. Today, PC players often blame consoles for always lagging behind the power of discrete video cards - this or that project looks great as a beta, then loses some of its visual luxury on the way to release in order to equalize the capabilities of several platforms. In the early to mid-1990s everything was the other way around. Real-time rendering on personal computers was dominated by software methods, and the press always reported whether a particular game supported 3D acceleration - in most cases the feature was optional. Even Quake, released in 1996 - the first shooter in "honest" 3D, and already a thoroughly modern game in its basic principles - shipped without support for any acceleration and only later received compatibility with the OpenGL API.

Hardware 3D graphics first became available to the masses in arcade machines in 1992. Owners of fifth-generation home consoles (Sega Saturn, Sony PlayStation, and Nintendo 64) also enjoyed hardware-rendered three-dimensional images, however rough by modern standards. Even the early stages of the rendering pipeline - polygon transformation and lighting (T&L) - were implemented in the silicon of arcade machines and consoles (the Nintendo 64 distinguished itself here) years before gaming video cards reached this milestone (the NVIDIA GeForce 256 and ATI Radeon 7000 appeared only in 1999-2000).

After the 3D graphics revolution rocked the consoles, computer hardware manufacturers tried to bring to market hardware that could offer gamers a comparable level of performance and features requiring processing power unavailable with software rendering on the CPUs of the time - such as 16-bit color representation and bilinear and trilinear filtering. Contrary to Voodoo Graphics' reputation as the very first 3D accelerator for the PC, by the time it reached the consumer market there were already several devices combining hardware acceleration of the Windows GUI with a 3D graphics pipeline. Listing all the early accelerators is beyond the scope of this article - quite a few were produced - so we will limit ourselves to the best-known devices.

Thus, one of the first mass-produced 3D accelerators was released by S3, at that time a recognized and respected manufacturer of video cards with raster (2D) graphics acceleration, and now a subsidiary of the Taiwanese HTC. However, the S3 ViRGE family of video cards, released in 1995, were not 3D graphics accelerators in the literal sense of the word, owing to mediocre performance in real-world use. The best results were achieved by Matrox (with the Matrox Mystique video card) and by the then-small company ATI (the 3D Rage family).

But the most promising platform for gaming 3D rendering at the time was considered to be the Vérité V1000 chip from Rendition. It was its proprietary API that the creators of Quake initially targeted, even if in the end the game's 3D acceleration arrived in the form of universal OpenGL. Until the advent of Voodoo, this accelerator offered the highest level of speed and functionality: the device operated with 16-bit color and supported bilinear texture filtering, MIP mapping and edge anti-aliasing. Unlike the full-screen anti-aliasing methods common today, edge AA works by drawing smoothed lines in the screen plane over the visible boundaries of polygons. Interestingly, early implementations of edge anti-aliasing in game engines gave way to the full-screen SSAA and MSAA methods, yet today smoothing polygon edges by post-processing the final image is again an important part of modern high-performance algorithms.
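To make the contrast with edge AA concrete, here is a toy model of full-screen supersampling (SSAA): the scene is rendered at double resolution and every 2x2 block is averaged down to one output pixel, softening all edges at once rather than just polygon boundaries. The code is a simplified illustration, not any accelerator's actual implementation.

```python
# Toy SSAA downsampling: average 2x2 blocks of a grayscale image
# (dimensions assumed even) into single output pixels.

def downsample_2x(image):
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

# A hard black/white edge rendered at 2x resolution...
hi_res = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 1, 1, 1],
]
# ...comes out with an intermediate gray pixel where the edge was jagged.
print(downsample_2x(hi_res))  # [[0.0, 1.0], [0.5, 1.0]]
```

The cost is obvious: SSAA shades four times as many pixels, which is why edge-only approaches like the V1000's were attractive on mid-90s hardware.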

The Vérité V1000 also stood out architecturally: unlike all competing solutions, the device was a programmable RISC processor combined with a number of fixed-function blocks, rather than a pure ASIC (Application-Specific Integrated Circuit). It is a pity that game developers did not appreciate the flexibility such an architecture offers. You may have already noticed a trend: many technologies that are an integral attribute of GPUs today (such as, in this case, the programmable rendering pipeline) appeared in one form or another in the early days of hardware-accelerated rendering, but only became in demand many years later.

Sierra Screamin' 3D video card based on the Rendition Vérité V1000 chip

The debut product of the now almighty NVIDIA deserves special mention: the STG2000 accelerator on the NV1 chip, launched in 1995 under the Diamond Edge 3D brand. But before we explain why this video card was so remarkable, let's make a brief digression. If 3dfx Voodoo wasn't the first consumer 3D accelerator, which device does that honor belong to? Giving an exact answer is harder than it seems, because even the simplest rendering pipeline was not fully implemented in hardware right away. For example, Matrox's 2D video cards Impression Plus and Millennium could process three-dimensional images back in 1994; they only lacked the ability to map textures onto polygons, without which modern graphics are unthinkable. So the first commercially available device for home PCs with support for hardware texturing was the Diamond Edge 3D accelerator based on the NV1 processor. But the features of NVIDIA's firstborn don't end there.

The NV1 is the only 3D gaming accelerator ever released for the PC that renders using quad primitives. Triangular primitives are the standard for modern APIs, but in that era the main line of development for real-time rendering had not yet been settled. Constructing models from quadrangles has the advantage that such a primitive does not have to be flat: moving one vertex out of the plane produces a figure with a curved surface. Moreover, in addition to the basic four vertices of the primitive, the NV1 allowed five additional vertices to be specified to form more detailed geometry. The downside of this approach is that non-flat surfaces do not play well with standard texture mapping methods: each unique shape in a game often needs its own texture to avoid distortion. This is not a problem for CAD applications, where quadrilateral primitives are still in use, but, as NVIDIA learned, it overly complicates the development and porting of games written with triangle rendering in mind.
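The claim that a quad need not be flat can be seen with a few lines of arithmetic. Below is a simplified sketch of a bilinear patch over four corner vertices (the NV1 went further, with the additional control vertices described above; this four-corner version only illustrates the basic idea, and all names are hypothetical).

```python
# Bilinear patch: interpolate a 3D point from four corner vertices.
# Lifting one corner out of the plane makes the whole surface curved.

def bilinear_patch(p00, p10, p01, p11, s, t):
    """Point on the patch at parameters s, t in [0, 1]."""
    def lerp(a, b, f):
        return tuple(a[i] + (b[i] - a[i]) * f for i in range(3))
    bottom = lerp(p00, p10, s)  # interpolate along the bottom edge
    top = lerp(p01, p11, s)     # interpolate along the top edge
    return lerp(bottom, top, t)

# Three corners in the z = 0 plane, one lifted to z = 1:
p00, p10, p01, p11 = (0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 1)
center = bilinear_patch(p00, p10, p01, p11, 0.5, 0.5)
print(center)  # (0.5, 0.5, 0.25) -- the surface bulges; it is not planar
```

A triangle, by contrast, always lies in a single plane, which is precisely what makes it the simpler primitive for texture mapping and rasterization.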

A promising feature of the NV1 seemed to be porting games from the Sega Saturn platform, which used a similar rendering method; the STG2000 itself was effectively a game console in the form factor of an expansion board. In addition to accelerating 3D graphics, the card accelerated 2D (raster) graphics, handled sound output and even had connectors for Sega controllers. Alas, the Saturn was not successful in the console market, and after Microsoft opted for triangles in the Direct3D API, NVIDIA had to accept the rules of the game and cancel development of the NV2 chip in favor of NV3, on the basis of which the company later released the Riva 128 accelerator.

One of the factors that delayed the arrival of mass-market 3D graphics accelerators on the PC (no, not GPUs; that term was only introduced by NVIDIA in 1999) was economics. But as soon as the selling price of EDO DRAM memory dropped enough that producing the rather expensive gaming expansion cards became a profitable enterprise, 3dfx burst into the computer graphics market, and then several companies clashed in the fight for the newly opened niche.

We know very well who ultimately emerged victorious from this race: NVIDIA and ATI are still alive today (the latter as the Radeon Technologies Group, a division of AMD), while 3dfx, after a number of critical management mistakes, went bankrupt, and most of its assets passed into the possession of that same NVIDIA. Yet it was the Voodoo Graphics brand that for a long time was almost synonymous with the very concept of a "3D accelerator", thanks to its unsurpassed performance and broad support from game developers.

The Voodoo Graphics card was an expensive proposition for gamers of the time. Today no one is surprised by prices over $700 for a top-end video card, but back then not every gamer could afford a 3D accelerator for $299 (the original price of Voodoo Graphics with 4 MB of EDO DRAM), especially in Russia in the 90s. Beyond unprecedented performance, 3dfx helped justify this price with the bold and correct decision to release Voodoo as a separate expansion card that worked in tandem with the existing video card to output images. In the 2D accelerator category, products from S3 and Matrox, highly regarded for their image quality, ruled the roost. Owners of expensive video cards appreciated the partial upgrade path that 3dfx provided, unlike the combined 2D/3D accelerators of previous years, which often forced a compromise on raster image clarity and speed.

A variant that combined 3dfx logic with a 2D chip on a single board (Voodoo Rush) fell victim to an unsuccessful architecture, and in 1998, with the release of Voodoo 2, the manufacturer again relied on a discrete 3D accelerator, which for the first time offered the ability to combine two boards in SLI (Scan-Line Interleave) mode. This concept, under the same acronym but with a different meaning (Scalable Link Interface), was resurrected many years later thanks to the advent of the PCI Express bus.
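The original meaning of SLI is easy to picture: the two Voodoo 2 boards split the frame by scanline parity, one rendering the even lines and the other the odd ones, and the lines are interleaved for output. The toy model below uses plain functions standing in for the two cards; everything here is illustrative, not actual driver behavior.

```python
# Toy model of Scan-Line Interleave: two "boards" each render only
# their own scanlines; merging by line number restores the full frame.

def render_frame_sli(height, width, render_line):
    board0 = {y: render_line(y, width) for y in range(0, height, 2)}  # even lines
    board1 = {y: render_line(y, width) for y in range(1, height, 2)}  # odd lines
    merged = {}
    merged.update(board0)
    merged.update(board1)
    return [merged[y] for y in range(height)]

# A trivial "shader": line y is filled with the value y.
frame = render_frame_sli(4, 3, lambda y, w: [y] * w)
print(frame)  # [[0, 0, 0], [1, 1, 1], [2, 2, 2], [3, 3, 3]]
```

Because each board shades only half the scanlines, fill-rate work is split roughly in two, which is why the scheme scaled so well for the fill-rate-bound games of the era.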

Another strength of 3dfx products was the company's own API, called Glide. Since Glide implemented only the functionality that 3dfx chips actually had, and code written for it sat closer to the hardware than in the universal but then-immature Direct3D and OpenGL APIs, all the best games of the 90s (Quake 2, Half-Life and Unreal among them) supported Glide. In this, 3dfx anticipated another modern trend: the introduction of the low-level programming interfaces Mantle, Vulkan and Metal.

Contemporaries of the first-generation Voodoo accelerators and close rivals in performance were NVIDIA's RIVA 128 video card and ATI's 3D Rage Pro. The RIVA 128 was NVIDIA's first success after the extremely interesting but commercially unsuccessful NV1 chip. In addition to impressive performance, the device was distinguished by support for higher screen resolutions than Voodoo and a high-quality integrated 2D core. It was also one of the first video cards to support the AGP bus that appeared in motherboards for the Pentium II. As for the 3D Rage Pro, ATI likewise used a new AGP-compatible chip in this board, improved over the first-generation Rage in both performance and rendering features, including support for the OpenGL interface, which the first-generation 3D Rage lacked. At the same time, Rendition released the second and final iteration of its accelerators based on a programmable RISC architecture, the Vérité V2000. In performance, Rendition again failed to meet the high bar set by Voodoo Graphics, but it was one of the rare 3D chips of the time capable of displaying 32-bit color.

Despite the aura of legend surrounding the names 3dfx and Voodoo, we admit that the 3D graphics boom in home PCs would certainly have happened even if this company had never existed; other names, also well known to us, would have thundered in its place. And yet it was 3dfx that played a key role in shaping computer technology and computer games into the gigantic industry they are today, and it holds a special place in the history of our site. Let us reveal a secret to readers who joined us in recent years: the site was originally called nothing other than 3dfx-ru.com.






