AMD Radeon HD 6000 series

Sales of AMD Radeon HD 6000 graphics cards started on October 22, 2010 with the release of the HD 6850 and HD 6870 models. The Radeon HD 6000 series is based on the Northern Islands family of 40-nm video processors, although it was originally announced that a 32-nm process might be used. The video adapters support DirectX 11, and Eyefinity technology has been developed further, allowing up to six monitors to be connected to one graphics card simultaneously. Also new to the Radeon 6000 series is stereoscopic 3D support. Moreover, when using 3D stereo you are free to choose the monitor and glasses: AMD declined to follow nVidia's policy of imposing exclusively its own 3D technology on users.

Radeon 6400

The youngest line of the 6000 series is represented by a single video card, the AMD Radeon HD 6450. This graphics solution is not intended for computer games, so its price is quite affordable. If you are not a fan of virtual entertainment but need features such as multi-monitor support, stereoscopic 3D and high-quality video playback, the Radeon HD 6450 will be a sound choice; there is simply no point in buying a more expensive model. The HD 6450 is based on the budget Caicos GPU and comes in two flavors. The first has a 750 MHz core and 512 MB of video memory clocked at 3600 MHz; the second has a 625 MHz core with DDR3 or GDDR5 memory clocked at 1600 or 3200 MHz, respectively.

Radeon 6500

The AMD Radeon 6500 line also has only one representative, the HD 6570. The card belongs to one of the lower price categories and therefore does not deliver particularly high performance, losing out to more expensive models in this regard. The HD 6570's Turks GPU is clocked at 650 MHz. There are models with GDDR3 and GDDR5 memory: the first option carries 512, 1024 or 2048 MB at 1800 MHz, the second 512 or 1024 MB at 4000 MHz.

Radeon 6600

Another line in the AMD Radeon 6000 family, represented by a single model: the HD 6670. The graphics adapter is based on the same video processor as the Radeon HD 6570, but its capabilities are somewhat expanded. The Radeon HD 6670 deserves to be called a gaming graphics card: it will handle most games at 1680x1050, although you should not expect maximum graphics settings, and stuttering will certainly make itself felt. Even that problem can be avoided by simply lowering the resolution. The GPU of the Radeon HD 6670 is clocked at 800 MHz, the GDDR5 memory is 512 or 1024 MB clocked at 4000 MHz, and the bus width is 128 bits.

Radeon 6700

The Radeon HD 6700 line consists of three video cards: the HD 6750, HD 6770 and HD 6790. Notably, the first two are based on the Juniper GPU used in the Radeon HD 5700 series. Nevertheless, in terms of characteristics they are significantly ahead of the similar models of the previous generation: the Radeon HD 6750 performs much like the HD 5850, and the Radeon HD 6770 is ahead of that card, which had been the most powerful single-chip video card of the 5000 series. The Radeon HD 6750 is equipped with a 700 MHz video processor and 1024 MB of GDDR5 memory with an effective frequency of 4600 MHz. The Radeon HD 6770 runs its processor at 800 MHz and carries 1024 MB of GDDR5 at 4400 MHz. The Radeon HD 6790, based on the newer Barts GPU, delivers excellent performance and outperforms its main competitors from nVidia. It has an 840 MHz core and a 1024 MB GDDR5 video buffer at 4200 MHz on a 256-bit bus.

Radeon 6800

At the core of both graphics adapters of the Radeon HD 6800 family is the Barts video processor. Economical and powerful, it makes the HD 6850 and HD 6870 among the market leaders and dangerous competitors to nVidia's GTX 460. At first glance Barts is inferior to the Cypress GPU used in the Radeon HD 5800 series, but only at first glance: the transistor count and die size have indeed shrunk somewhat, yet the reworked architecture gives it undeniable advantages over its predecessor. Besides, unlike Cypress, the processor of the Radeon HD 6850 and HD 6870 is not the flagship chip of its series. Both cards use GDDR5 memory, 1024 or 2048 MB in volume, on a 256-bit bus. The AMD Radeon 6870 GPU is clocked at 900 MHz; the AMD Radeon 6850 processor is less powerful at 775 MHz.

Radeon 6900

And finally, the flagship line of the 6000 series: the Radeon HD 6900. There are three boards, the HD 6950, HD 6970 and HD 6990. While the first two each carry a single high-tech Cayman graphics processor, the Radeon HD 6990 boasts two cores at once. These are the most gaming-oriented adapters of the family, a gamer's dream that is attainable, if not cheap. The Radeon HD 6950's characteristics are as follows: an 800 MHz GPU and 1024 or 2048 MB of GDDR5 at 5000 MHz. The Radeon HD 6970 has an 880 MHz core and a 2048 MB GDDR5 video buffer at 5500 MHz. If after the release of the Radeon HD 6950 and HD 6970 the dual-core HD 5970 remained the performance leader among video cards, the release of the Radeon HD 6990 demonstrated the superiority of the newcomer: it turned out to be 30% faster than its predecessor.

This article focuses on one of the best video cards of its time, whose popularity has not declined for several years: the GeForce 6600. In just a few years, gaming technology has gone so far ahead that most users no longer remember devices of this class. However, there is a category of people who prefer to use a computer until its first breakdown without upgrading it. It is for this audience that this article is intended.

The reader will find out why this video adapter is so popular among users, get acquainted with market representatives built on the basis of GeForce 6600, and receive complete information on performance testing and data on overclocking of this video card.

Historical reference

The thing is that the 6600 video adapter was produced at the time when computer component manufacturers were transitioning from one technology to another (the AGP video bus was being replaced by PCI Express). Accordingly, the manufacturer tried to sit on two chairs at once: to make the fastest device for the old AGP technology and, not yet knowing the trends of the new market, to implement every technology available at the time in devices with the new PCI Express bus. It is generally accepted that the epic of overclocking the graphics core and memory bus began with the GeForce 6600 video adapter.

It cannot be said that, years later, the video adapter still occupies top positions; rather the contrary, its performance is negligible for modern games. But we can say with confidence that the 6600 is the best solution for owners of old computers.

About the chipset

The chipset is codenamed NV43 and comes in several modifications. The first division is by platform: PCI Express and AGP. The second concerns the technical characteristics directly. The manufacturer created an inexpensive solution called the GeForce 6600 LE with deliberately weak characteristics, released many basic modifications that carry no letters in their marking, and made a top-end GT line endowed with high performance. Moreover, that performance is achieved by raising the memory and graphics core frequencies: ordinary overclocking, only performed by the manufacturer's own hands.

There are many reviews by IT specialists in the media asserting that the division of these video adapters into performance modifications is, in fact, ordinary chip binning. If a chip is able to work stably in hard mode, it is labeled GeForce 6600 GT. If the chip is less stable, it carries no letters at all, and if it heats up badly under overclocking, it becomes the inexpensive LE modification.

Specifications

The NV43 chip complies with the 256-bit standard, uses a 0.11-micron manufacturing process and contains 143 million transistors. Manufacturers based their Nvidia GeForce 6600 cards on modifications with either GDDR or GDDR3 memory. Naturally, this difference is reflected in the performance of the video adapters (in fact, GDDR3 offers roughly 1.5 times higher bandwidth). The standard memory capacity of the device is 256 MB, although copies with 128 MB on board are sometimes encountered.

The actual operating frequency of the improved modification is 500 MHz, while for the cheap chips the limit is set at 350 MHz. All devices support DirectX 9.0c and 10 bits per color channel, and are equipped with a built-in TV encoder that draws on the video processor for its operation. Do not forget that it was with the sixth generation that Nvidia began the epic of introducing special functions into its hardware: adaptive filtering, true trilinear filtering, UltraShadow and similar additions.

Direct competitor of the chipset

Because Nvidia has a market competitor in ATI, all users have the opportunity to purchase computer components at affordable prices. This also applies to the video card under review: the GeForce 6600, whose characteristics are as close as possible to those of the ATI Radeon X700 Pro. The competitor's video adapter has a lower graphics core frequency (425 MHz) and a lower memory frequency (800 MHz), but that does not prevent it from showing high performance at all.

It's all about the technologies that ATI adapters support. Having long abandoned the race for ever higher frequencies, the manufacturer began introducing its own developments, which let its cards demonstrate excellent potential. At the same time, owners note in the media that driver support is weaker than what is provided for the GeForce 6600: Windows 7, on detecting a discrete ATI Radeon X700 Pro adapter, assigns it a generic built-in driver.

Leader in the video card market

The most powerful video adapter based on the Nvidia GeForce 6600 GT is considered to be the product from Leadtek. It owes its performance to the high frequency of the graphics core (550 MHz) and the memory bus (1120 MHz). High video data transfer speed is provided by Samsung memory modules with a response time of 1.6 nanoseconds.

Yet the absolute performance leader on the market has several drawbacks. First, the cooling system: it supplies a stream of cool air only to the graphics chip, leaving the memory modules and other important components of the device unattended. Secondly, there is no video input, which all competitors implement. The video output, however, supports HDTV; frankly, it is not clear whether that is a virtue or a drawback. Incidentally, many potential buyers literally chased after the Leadtek video adapter because it came with three licensed games: Splinter Cell, Prince Of Persia and Pandora Tomorrow.

Gold Series

When it comes to devices built to last, many owners immediately think of the GeForce 6600 GT video adapter from Gainward. The video card belongs to the gold series, which means that along with high performance it promises fault tolerance in operation. The manufacturer made an interesting marketing move: having set the base frequencies of the video core (500 MHz) and the memory bus (1000 MHz) in the card's BIOS, it gave the device a lifetime warranty, on the condition that the user does not overclock it. However, when the proprietary software supplied on the disk is installed, the video driver itself overclocks the graphics core to 540 MHz and the bus to 1150 MHz.

Compared to the competition, the performance of the Gainward video adapter is very high. Hardware support for video input and video output from HDTVs is a great addition to a worthy purchase. The future owner will also be pleased with the system that effectively cools all modules installed on the board.

No noise and dust

If the user thinks that a 6600 GT with passive cooling is unlikely to surprise fans of dynamic games, he is certainly mistaken: Gigabyte would never allow anyone to consider its products second-rate. A copper heatsink wraps around the video card from both sides, covering all the chips installed on it, and a wide copper heat pipe equalizes the temperature between the two radiators.

And yet the card does get warm, though not enough for the thermal protection to trip and send the computer into a reboot. Experts recommend that future owners think about a spacious system case with decent ventilation. The adapter's high performance is ensured by the increased memory frequency of 1120 MHz and 1.6 ns Samsung modules. As a result, this particular product is recommended to users for whom high performance and silence in work and games are a priority: on the NV43 chip, Gigabyte has no competitors, nor are any expected.

Diamond Series

The legendary 6600 AGP from MSI will be remembered by all users for its advertising, thanks to which every fan of dynamic games learned about the excellent shooter The Chronicles of Riddick. The film's hero was depicted on the box, and the game itself was bundled free of charge with the video adapter. The device carried the Diamond branding, and the AGP card indeed had considerable potential: 400 MHz for the core and 800 MHz for the memory. It is a pity that MSI's developers did not take one more step forward and give the user at least some tool for raising the frequencies of the graphics core and bus; in tests, that is sorely missed.

The manufacturer also took care of cooling: a cooler blows on the graphics controller, and the memory chips are fitted with heatsinks. Judging by the reviews, users are puzzled by the absence of a copper plate on the back side of the video card, because it is precisely that part of the board that dissipates the heat. Overall, the attitude toward the product is positive: the video card is handsome, intelligently made and, for last-century technology, copes with its tasks very well.

Strange extreme

ASUS products have always occupied top positions in all tests; however, the leader of the computer market stumbled with the GeForce 6600 128Mb Extreme video card. The factory-set memory and core clock speeds are left at reference values. What ASUS wanted to achieve on the market remains a mystery: having removed the video input and installed slow 2.5 ns memory modules, the famous Taiwanese brand did not even bother to lower the price of its product.

In the media, many owners assure everyone that ASUS is the only company in the world to include the licensed game Joint Operations in the bundle. There is little logic in such claims: the game can be bought separately, and everyone wants to see a productive video card in their computer, not mass-market consumer goods.

Last Hero

Albatron's Trinity GeForce 6600 AGP is considered a benchmark by many, because this brand set the direction of the component market at the dawn of the computer era. The appearance of Albatron devices, be it a video card, a motherboard or a tuner, has always been distinguished from competitors by its emphatic rigor and conservatism.

The market leader decided not to take part in the race for high performance, directing all its potential into the packaging and the software bundled with the video card. It is a pity that many manufacturers nowadays pay so little attention to the software supplied with a video adapter, because it is precisely such extras that build users' attachment to a brand.

Testing video adapters

It is hardly possible to force a GeForce 6600 PCI Express device built for DirectX 9.0 to run a more advanced game, so the corresponding titles (The Chronicles of Riddick, Half-Life 2 and Doom 3) will be used to compare the video cards. Naturally, there can be no question of anything like Full HD; a standard resolution of 1024x768 pixels with 32-bit color is used.

As expected, the leading positions were taken by the devices with the highest graphics core and memory frequencies: Leadtek, Gainward and Gigabyte. From this, a single correct conclusion follows: in games, the video card with the greatest overclock wins. Appearance, looks and bundles serve only to distract potential buyers from high-performance video adapters.

Pitfalls for future owners

Manufacturers always hide their products' flaws from prying eyes. So it was with the NV43 chip, which managed to stand apart from its competitors with the extreme fragility of its die: careless handling leads to disastrous results. Having raised the performance characteristics of the GeForce 6600 to 143 million transistors on the chip, the manufacturer gave no thought to the safety margin, which careless handling of the video card quickly exposes.

The trouble is that the chip does not sit flush with the surface of the video card, and the cooler installed on top has no recess for the graphics die. Accordingly, if you press on one of the edges of the heatsink, the crystal of the graphics core cracks. From there it is a lottery: either only a corner of the chip breaks off, or the entire GPU splits in two. In the first case the video adapter will most likely still work; in the second, you will have to go shopping for a new one.

Problems with mounting video cards

Many owners of older computers have noticed the strange placement of the memory slots on motherboards of that era. It seems the manufacturer was playing a joke on users, placing the video card slot in the immediate vicinity of the latches that secure the memory modules. Every owner of a GeForce 6600 video card stands a chance of snapping off a couple of capacitors on the board during installation. To prevent this, the card must be installed with the utmost care and in good lighting.

In the media, IT professionals recommend installing the memory modules first and only then inserting the video card into its slot. While gently lowering it until it stops, use the fingers of your other hand to ease the edge of the board away from the latches that secure the RAM modules. If the end of the video adapter's printed circuit board rests against a capacitor on the motherboard for some reason, the user can easily fix the problem himself by carefully bending the interfering part aside.

Effective overclocking

A curious user will sooner or later conclude on his own that the adapter's performance can be raised by overclocking, provided suitable software is at hand and the video card has excellent cooling. Judging by several years' worth of reviews in the media, a great deal of time has been devoted to overclocking the GeForce 6600. Owners managed to achieve figures that were records for the time: a graphics core frequency of 615 MHz and a bus frequency of 1340 MHz.

First of all, the user's task is to provide the video card with decent cooling, and there are many options. You can replace the stock cooler with a professional solution from Shuttle or Zalman, but the price of such cooling systems raises doubts: it is easier to replace the motherboard along with the processor than to pay a large sum for a professional cooler. The second option is to mount an additional 120 mm fan with an airflow of 90-140 CFM above the standard cooler. For the overclocking itself, it is recommended to use the official software that comes with the Nvidia driver.

Finally

The GeForce 6600 video card is the last link that still connects two eras of personal computers capable of running dynamic games. Transitional models of video cards will leave the market, and the time of old PCs will end with them. If a potential buyer faces the choice of buying a GeForce 6600 or upgrading the entire system, the latter option is worth serious thought, because nothing in our world lasts forever; sooner or later old technologies are lost to humanity for good. But if the choice is only among video adapters for an existing system, the buyer will certainly not find anything that beats the GeForce 6600 chip on the price-to-quality criterion.

Discrete graphics in laptops are nothing new, but they solve many problems. For example, a user may want not only to work on a laptop but also to play various computer games. A discrete card is required for this, since integrated graphics processors do not have the resources to cope with such loads.

AMD Radeon 6600m and 6700m series Specifications (General)

The suffix "M" at the end of each video card's name stands for "mobile," meaning it can only be used in laptops. Both video cards are built on a 40-nm process, and their core is called Turks. The connection interface is PCI Express 2.1 x16. Both GPUs are capable of driving up to six monitors without any special software or drivers; it all depends on hardware support. The 128-bit bus keeps the bandwidth high.

By memory type, both support GDDR5 for video memory, while the system RAM is DDR3. The RAM frequency is 900 MHz, giving a speed of about two gigabits per second, while the video memory frequency reaches 800 MHz, and the 128-bit bus raises the memory bandwidth to 57.6 gigabytes per second. The AMD Radeon 6600m and 6700m series, whose specifications allow video memory speeds up to 3.6 gigabits per second per pin, can only show a high-quality image when paired with powerful RAM.
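The bandwidth figure above follows from a standard back-of-the-envelope calculation: bus width times per-pin data rate, divided by eight to convert bits to bytes. A minimal sketch (the function name is illustrative, the figures are taken from the text):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbit_s: float) -> float:
    """Peak memory bandwidth in gigabytes per second."""
    return bus_width_bits * data_rate_gbit_s / 8  # 8 bits per byte

# A 128-bit bus at 3.6 Gbit/s per pin gives the 57.6 GB/s quoted above.
print(round(peak_bandwidth_gb_s(128, 3.6), 1))
```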

Differences in characteristics

The ATI Radeon 6600m and 6700m series differ in graphics core frequency: in the 6600m it is 500 MHz, while in the 6700m it is somewhat higher at 725 MHz.

Math block

AMD Radeon 6600m and 6700m series, as far as the math block is concerned: both cards feature about 800 stream processors, 40 texture units and 16 color rasterization units.

Graphics capabilities

Among the DirectX 11 features supported by the AMD Radeon 6600m and 6700m series are the new Shader Model 5 and DirectCompute 11. There is also support for programmable hardware tessellation, which makes images noticeably more realistic. Accelerated multithreading, improved texture processing and order-independent transparency are available as well.

Thanks to OpenGL 4.1 support, the AMD Radeon 6600m and 6700m series also offer enhanced image quality, including up to 24x anti-aliasing and refined texture filtering.

AMD Eyefinity technology on the AMD Radeon 6600m and 6700m series can drive up to six monitors with a combined resolution, frame rate, color gamut and video overlay. The monitors can be configured to act as a single screen: the image is then spread across all six, so the group of monitors looks like one large display.
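The combined resolution of such a monitor group is simply the per-monitor resolution multiplied by the grid layout. A small illustrative sketch; the 3x2 grid of 1920x1080 panels is an assumed example, not a configuration named in the text:

```python
def group_resolution(cols: int, rows: int, width: int, height: int) -> tuple:
    """Combined resolution of a grid of identical monitors acting as one display."""
    return (cols * width, rows * height)

# Six 1920x1080 monitors in a 3x2 grid behave like one large display:
print(group_resolution(3, 2, 1920, 1080))  # -> (5760, 2160)
```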

The AMD Radeon 6600m and 6700m series also feature CrossFireX technology, which allows several GPUs to be used together; in essence, it is an analogue of NVIDIA's SLI. There is also support for OpenCL 1.1.

AMD Radeon 6600m and 6700m series Supported Technologies Overview

The first technology that is new to this manufacturer's series of video cards is accelerated encoding, transcoding and scaling of video.

Both GPUs play video in all modern formats and accelerate Adobe Flash Player 9 content.

Advanced image quality technologies include improved post-processing and scaling, color correction and automatic contrast adjustment, increased white brightness and an extended blue range, smooth tone-to-tone gradient rendering, and continuous automatic adjustment of the video range. Two simultaneous 1080p playback streams are supported.

The next technology, called AMD 3D, provides full support for stereo glasses and three-dimensional images on the monitor. It also supports 3D games and can work with third-party stereo software and 3D systems.

Also worth noting is AMD PowerPlay automatic power management, which saves energy by redistributing power at idle and intelligently balances the load when several cards work together.

Finally, there is the built-in HD audio controller. Over HDMI you can enjoy protected 7.1-channel surround sound without any additional cables; the controller plays audio in all widely used high-quality formats.

Like desktop video cards, mobile ones are subject to wear. For this reason it is worth replacing the thermal paste every few months, since it affects how hot the graphics chip runs. The video card's cooler should also be cleaned at least once every six months, because accumulated dust can negatively affect the chip's temperature. Drivers for these video cards are best installed from the supplied disk or downloaded from the manufacturer's website.

About half a year ago, the site published the article "Staying Alive: ATI Radeon X550 vs. NVIDIA GeForce 6600 LE," in which two video cards of the lower price segment were reviewed and tested. Since then, two new "players" have appeared in the sub-$100 range: video cards based on the ATI Radeon X1300 (RV515) and NVIDIA GeForce 7300 GS (G72) chips. The ATI Radeon X1300 today costs no more than $85, and the NVIDIA GeForce 7300 GS even less, about $70-75. For roughly the same money you can buy GeForce 6200 and 6600 LE video cards, as well as something from the Radeon X600 Pro or even X700 range, depending on the amount of installed memory. The relevance of the last two cards drops with the release of each new game supporting Shader Model 3.0, whereas such "oldies" as the GeForce 6600 (LE) do support version 3.0 pixel shaders and have meanwhile fallen significantly in price. In addition, it would be interesting to see the difference in speed between the "proper" GeForce 6600 with 8 pixel and 3 vertex pipelines and the GeForce 6600 LE running a 4/2 pipeline scheme but at frequencies well above the GeForce 6600's nominal ones. Thus, four video cards of the lower price segment take part in today's tests: the NVIDIA GeForce 6600, 6600 LE and 7300 GS, and the ATI Radeon X1300.

1. Technical characteristics of video cards NVIDIA GeForce 6600 (LE), 7300 GS and ATI Radeon X1300

Let's look at the technical characteristics of the video cards tested today in one summary table:

Characteristic | NVIDIA GeForce 6600 (LE) | ATI Radeon X1300 | NVIDIA GeForce 7300 GS
Graphics chip | NV43 (NV43V) | RV515 | G72
Process, microns | 0.11 | 0.09 (TSMC) | 0.09
Transistors, mln. | ~143 | ~105 | n/a
Die area, mm² | 150 | 100 | n/a
Operating frequencies, MHz (chip / memory) | 300 (350, 425) / 500 (700, 1000) | 450 / 500 | 550 / 700
Memory bus width | 128 bit | 128/64/32 bit | 64 bit
Pixel pipelines, pcs. | 8 (4) | 4 | 4
TMUs per pipeline, pcs. | 1 | 1 | 1
Vertex processors, pcs. | 3 (2) | 2 | 3
Pixel Shaders version | 3 | 3 | 3
Vertex Shaders version | 3 | 3 | 3
Interface | PCI-Express x16 | PCI-Express x16 | PCI-Express x16
DirectX version support | 9.0c | 9.0c | 9.0c
Additionally | DVI, TV-Out, VIVO (optional) | DVI, TV-Out, VIVO, HDTV | DVI, TV-Out, VIVO, HDTV

2. Review of the video cards participating in testing

All video cards were provided for testing by F-Center, so prices are given according to the company's price list.

Leadtek WinFast PX6600 TD 128 Mb

The first participant in today's tests - the Leadtek WinFast PX6600 TD 128 Mb video card - comes in a large box made of thick cardboard, which is already well known to our readers in style and design:

Leadtek included the following components with its graphics card:

  • splitter adapter;
  • CD with WinFast 3D Graphics drivers;
  • CD with PowerDVD 6 and other programs;
  • 2 DVDs with Splinter Cell 3: Chaos Theory and Prince of Persia: Warrior Within;
  • user manual in several languages (including Russian).

The user will be pleased that even with such an inexpensive video card there are two games included, albeit not the newest ones.

The video card is located in the central compartment of the box, designed in such a way as to exclude the possibility of damage to the device during transportation. On the front side of the green Leadtek WinFast PX6600 TD 128 Mb PCB there are eight memory chips in a TSOP package, an aluminum cooler and power system elements:

The video card is equipped with analog, digital (DVI) and TV-out. There is nothing to pay attention to on the reverse side of the board:

Without a cooling system Leadtek WinFast PX6600 TD 128 Mb looks like this:

The all-aluminum cooler is secured to the board with two spring-loaded plastic pins. Around the perimeter of the chip there is a soft gasket that protects the die from accidental chipping:

The GPU was made in Taiwan in week 45 of last year; the chip is revision A4:

The nominal frequency of the GPU complies with the specifications and is equal to 300 MHz. The number of pixel pipelines is 8, and the number of vertex pipelines is 3.

The 128 MB of DDR video memory is made up of eight Samsung chips with a nominal access time of 3.6 ns:

The memory is marked K4D261638F-LC36. The card's memory bus is 128 bits wide, and the nominal video memory frequency is 500 MHz. Overall, the video card fully complies with NVIDIA's specifications for the GeForce 6600.

Before moving on to the card's overclocking potential, it should be said that all four participants in today's article were tested for overclocking, and then tested further, with a Zalman VF700-Cu cooler installed and its fan at maximum speed (about 2750 RPM). I understand perfectly well that users who buy $75-100 video cards will not replace the cooling with a cooler that costs almost a third as much as the card itself. However, the replacement of the standard cooling systems was driven not only by the desire to squeeze the maximum out of the tested cards without volt-mods, but also by the desire to put all the cards on an equal footing in terms of cooling.

After replacing the stock cooler with the Zalman VF700-Cu, the Leadtek WinFast PX6600 TD 128 Mb video card was overclocked from the nominal frequencies of 300/500 MHz to 435/614 MHz:

The video card demonstrated good overclocking potential: the gain was 45.0% on the GPU and 22.8% on the video memory. Before the cooling system was replaced, the graphics processor could operate stably at 390 MHz while warming up to 67 degrees Celsius. Replacing the cooler did not affect memory overclocking in any way.
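The gain percentages quoted here follow from simple arithmetic: the ratio of the overclocked frequency to the nominal one, minus one. A minimal sketch using the Leadtek card's figures from the text (the function name is illustrative):

```python
def gain_percent(nominal_mhz: float, overclocked_mhz: float) -> float:
    """Overclocking gain as a percentage over the nominal frequency."""
    return (overclocked_mhz / nominal_mhz - 1) * 100

# Leadtek WinFast PX6600 TD: 300/500 MHz nominal -> 435/614 MHz overclocked
print(round(gain_percent(300, 435), 1))  # GPU gain: 45.0
print(round(gain_percent(500, 614), 1))  # memory gain: 22.8
```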

To conclude the review of the Leadtek WinFast PX6600 TD 128 Mb, here is a link to the video card's BIOS (41.4 KB, WinRAR archive). The cost of the video card at the time of publication is $101.

Palit GeForce 6600Very + 128 Mb (6600 LE)

The Palit GeForce 6600Very+ 128 Mb (6600 LE) video card was provided for testing in OEM packaging. In addition to the video card, the antistatic bag contains a CD with drivers, a 15-pin DVI-to-D-Sub adapter, an S-Video cable, and a splitter adapter for HDTV and VIVO:

Before us is a small board made on a red PCB with a burgundy tint:

Like the Leadtek card above, the Palit GeForce 6600Very+ 128 Mb (6600 LE) is equipped with analog, digital (DVI) and TV outputs. The all-aluminum heatsink of the cooling system, cooled by a small blower, contacts only the graphics chip, through a thin layer of white thermal paste with a "sour cream" consistency.

All memory chips are located on the front side of the board, so on the back side the only thing worth noting is a kind of backplate with a wire fastening and a soft gasket:

We are already familiar with this method of mounting the cooling system from the Palit Radeon X1600 Pro 128 Mb tests. This simple and convenient mounting method generates sufficient clamping force for efficient cooling.

With the cooling system removed, we can count as many as eight holes around the chip, which, it seems to me, should allow installing almost any cooling system:

Let's look at the GPU: the country of origin is Taiwan, and the release date is week 7 of 2005, that is, the chip was made over a year ago.

Note that, unlike the previous "full-fledged" GeForce 6600, the Palit GeForce 6600Very+ carries the chip name on the substrate. The nominal GPU frequency of the card is 350 MHz, which is 50 MHz higher than that of the Leadtek GeForce 6600.

Alas, the GeForce 6600Very+ 128 Mb was unlucky with its graphics processor: in this case overclockers will have to be content with only four pixel and two vertex pipelines and hope for significant chip overclocking:

However, there is no need to be upset about the four-pipeline chip for long. The main highlight of the Palit GeForce 6600Very+ is that it is equipped with 128 Mb of GDDR-3 memory with a nominal access time of 2.2 ns:

The full memory capacity is made up of four Infineon chips marked HYB18T256321F-22. The nominal video memory frequency is 700 MHz, which is 200 (!) MHz higher than that of the ordinary GeForce 6600. The maximum theoretical frequency of this memory is 900 MHz, which gave hope for good overclocking, and the hope was justified. Both before and after replacing the stock cooler with the Zalman VF700-Cu, the memory ran stably at 1000 MHz, equal to the nominal memory frequency of the GeForce 6600 GT, a card that costs almost twice as much:

The graphics processor kept pace, raising its frequency to 481 MHz, or by 37.4% (460 MHz with the stock cooler). Thus, in terms of its frequency formula, the ordinary GeForce 6600 LE turned into almost a GeForce 6600 GT, albeit with only four pixel and two vertex pipelines.

A similar video card is already known from the article "Troubled times on the video card market: beware of the GeForce 6600!", but there it was considered a "wrong" GeForce 6600. I suggest looking at the Palit GeForce 6600Very+ 128 Mb in a slightly different way, because today this card costs about $85, comparable to the GeForce 6200 and $15-20 below the "correct" GeForce 6600. At the same time, compared with the GeForce 6200, the GeForce 6600Very+ lacks only one vertex pipeline, which is compensated by (I am not afraid of the word) significantly higher frequencies. Whether such high chip and memory frequencies will help the Palit GeForce 6600Very+ 128 Mb compete successfully with the "regular" GeForce 6600, as well as with the other participants in today's article, the tests will show.

It remains to look at the temperatures of the video card, which turned out to be very modest with the Zalman VF700-Cu:

I will add that the original cooler of the video card works quietly, although it is slightly audible against the background of a quiet system unit.

You can download BIOS Palit GeForce 6600Very + 128 Mb (41.4 Kb, WinRAR archive).

Sapphire Radeon X1300 256 Mb

A video card from Sapphire also arrived as an OEM package, which, in addition to the card itself, included:

  • S-Video cable;
  • adapter from S-Video to coaxial cable;
  • two 15-pin DVI / D-Sub adapters;
  • CD with drivers.

The video card surprised me with a large needle-shaped passive heatsink covering almost the entire front side of the board:

The heatsink contacts the GPU through a thin layer of very thick gray thermal paste. In turn, with memory chips located on the front side of the board, the contact of the heatsink is provided using thick thermal pads:

Without the heatsink, the Sapphire Radeon X1300 256 Mb looks like this:

Directly next to the board's outputs - analog, digital (DVI) and TV - there is an empty space for an unsoldered ATI Rage Theater chip. Eight memory chips in TSOP packages are located on both sides of the board, but the chips on the back side have no heatsinks.

The country of manufacture of the RV515 GPU is Taiwan:

The chip is dated week 47 of last year, has 4 pixel and 2 vertex pipelines, and runs at a nominal frequency of 450 MHz. The video memory, however, can almost be called an antique today:

Its nominal access time is 5.0 ns, which corresponds to a frequency of 400 MHz, and that is indeed the frequency at which the memory operates. Unfortunately, this is 100 MHz below the ATI Radeon X1300 specification. The chips are manufactured by Samsung and marked K4D551638F-TC50.
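The correspondence between access time and rated frequency used here, and for the 2.2 ns GDDR-3 earlier, is the usual rule of thumb: the real clock is 1000 divided by the access time in nanoseconds, doubled for the effective (DDR) rate. A minimal sketch (the helper name is ours):

```python
def rated_ddr_frequency(access_time_ns: float) -> float:
    """Effective (DDR) frequency in MHz implied by a chip's access time:
    one cycle per access, two data transfers per cycle."""
    real_clock_mhz = 1000.0 / access_time_ns
    return 2.0 * real_clock_mhz

print(rated_ddr_frequency(5.0))  # 400.0 -> the 5.0 ns Samsung chips here
print(rated_ddr_frequency(2.2))  # ~909 -> the 2.2 ns GDDR-3 rated at 900 MHz
```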

As for overclocking, even at nominal frequencies the card's passive heatsink warmed up to the point where a finger could not be held on it for even 2-3 seconds, so replacing it with the Zalman VF700-Cu came in handy. After that, the graphics processor was overclocked to 600 MHz, or +33% over the nominal 450 MHz:

The video memory overclocked only to 440 MHz; however, its high nominal access time left no hope for more anyway ...

Alas, the card has no built-in monitoring, so it is difficult to judge its temperatures after the cooler replacement and overclocking. If anyone needs the BIOS of the Sapphire Radeon X1300 256 Mb, the download is small (34.1 Kb, WinRAR archive). At the time of this posting, the card can be purchased for $86.

Palit GeForce 7300 GS 256 Mb

Another video card produced by Palit - GeForce 7300 GS 256 Mb - came to us for testing in a small cardboard box:

Do not be confused by the sticker mentioning 512 Mb of video memory: it refers only to the GeForce 7300 GS's support for that memory size. The tested card actually carries 256 Mb of video memory.

In addition to the video card, the following components were found in the box:

The box's contents are, frankly, Spartan:

  • splitter adapter for HDTV and VIVO;
  • one 15-pin DVI / D-Sub adapter;
  • CD with PowerDVD 5;
  • CD with video card drivers.

The Palit GeForce 7300 GS 256 Mb video card is somewhat surprising with its small dimensions:

The height of the card is quite standard, but it is so short that it protrudes only 1.5 cm beyond the edge of the PCI-Express slot. On the front side there is a small cooler with an aluminum heatsink and a holographic sticker in the center, as well as four video memory chips arranged vertically in a row. There are no memory chips on the back of the board:

Palit GeForce 7300 GS 256 Mb is equipped with analog, digital (DVI) and TV-out, and the graphics processor, in comparison with the above-mentioned video cards, looks just tiny:

The chip revision is A3, and the release date is week 52 of last year:

Four silicone pads, whose edges are visible at the four corners of the chip substrate (as well as in the previous photo), are meant to protect the core from chipping and damage; however, as you can see, the core did not escape entirely unscathed. The nominal GPU frequency is 550 MHz, which is 250 MHz higher than that of the GeForce 6600.

The GDDR-2 memory chips were manufactured by Elixir in week 30 of 2005:

The chips are marked N2TU51216AG-XP, and their nominal operating frequency is 700 MHz. With a total memory capacity of 256 Mb, the bus linking it to the graphics chip is only 64 bits wide, and this, in my opinion, is the main drawback of the GeForce 7300 GS line.

Speaking about the overclocking potential of the Palit GeForce 7300 GS 256 Mb, we must admit that it turned out to be quite average: the graphics processor worked stably at 640 MHz (+16.4%), and the video memory at 876 MHz (+25.1%):

The latest version of Everest available at the time of this article's preparation refused to report the GeForce 7300 GS video memory frequency correctly. Unfortunately, the developers of this diagnostic utility are often late in adding new devices; for example, the program still does not recognize the GeForce 7600 GT.

The Palit GeForce 7300 GS 256 Mb does not have any means of monitoring the temperature regime, so it remains to add that the small fan of the standard cooling system works quite quietly.

At the end of the video card review, I post a link to the BIOS Palit GeForce 7300 GS 256 Mb (46.6 Kb, WinRAR archive). The cost of this video card is currently 69 US dollars.

3. Testbed configuration, operating system, drivers, benchmarks and games

Testing of video cards was carried out in a closed case of the system unit with the following configuration:

  • Motherboard: ABIT AN8 SLI (nForce 4 SLI), Socket 939, BIOS v.2.0;
  • Processor: AMD Athlon 64 3200+ (2000 MHz), 1.40 V, 512 Kb, Cool & Quiet disabled (Venice, E6);
  • Processor cooling system: Thermaltake Big Typhoon "Hands Edition", ~ 1100-2000 RPM, ~ 16-21 dBA;
  • Thermal interface: Coollaboratory Liquid Pro;
  • RAM: 2 x 512 Mb PC3200 Corsair TWINXP1024-3200C2 (SPD: 400 MHz, 2-2-2-5_1T), @ 467 MHz 2-3-4-8_1T;
  • Disk subsystem: SATA-II 160 Gb Seagate Barracuda 7200.9 (ST3160812AS 2AAA) 7200 RPM, 8 Mb;
  • Case: ATX ASUS ASCOT 6AR2-B Black & Silver + blowing Coolink SWiF 120 mm (~ 1200 RPM, ~ 24 dBA) + blowing 120 mm Sharkoon Luminous Blue LED (~ 1000 RPM, ~ 21 dBA);
  • Power supply: MGE Magnum 500 (500 W) + 80mm GlacialTech SilentBlade fan (~ 1700 RPM, 19 dBA);
  • Monitor: LCD DELL 1800 / 1FP UltraSharp (1280x1024, DVI, 60 Hz).

The operating system was Windows XP Professional Service Pack 2, installed on the first 8 Gb hard disk partition. All benchmarks and games were installed on the third partition, 65 Gb in size. All services except the minimum necessary ones were disabled. Testing used NVIDIA nForce system drivers 6.82 and the DirectX 9.0c libraries (December 2005 release). ForceWare 83.40 served as the driver for the NVIDIA-based cards, and for the Radeon X1300, ATI Catalyst 6.3, the latest version available at the time of preparation.

Performance was tested at two resolutions, 800x600 and 1024x768, with the drivers set to "Quality" mode only. Anisotropic filtering and full-screen anti-aliasing were not activated. Since the performance of today's video cards is low, all optimizations in the Catalyst and ForceWare drivers were enabled, and Catalyst A.I. was set to "High". In addition, the in-game settings were chosen to provide a comfortable number of frames per second (at least as a minimum). Thus, the following set of synthetic and semi-synthetic benchmarks and games, with these settings, was used for testing:

  • 3DMark 2005 - build 1.2.0, default settings;
  • 3DMark 2006 - build 1.0.2, only resolution 1280 x 1024, default settings;
  • Half-life 2: Lost Coast (Direct3D) - demo recording "d2" (647.2 Kb), maximum graphics settings are set in the game itself;
  • The Chronicles Of Riddick: Escape From Butcher Bay (OpenGL) - game version 1.0.0.1, maximum graphics quality, Shader 2.0 , demo "ducche";
  • Call of duty 2 (Direct3D) - game version 1.01, texture settings set to "Medium", demo recording "d1" (614.8 Kb);
  • Serious sam 2 (Direct3D) - game version 2.066, standard demo "GREENDALE", "Medium" graphics settings in the game, HDR Off;
  • Quake 4 (OpenGL) - game version 1.0.4.0 build 2147, demo at the very beginning of the "Hangar Perimeter" level, graphics detail in the game - "Medium Quality", triple demo recording to minimize the dependence of the results on hard disk speed and caching;
  • F.E.A.R. (Direct3D) - game version 1.02, built-in benchmark, all graphics settings set to "Medium" during testing, Soft Shadows = Off.

In the diagrams, a "cold" blue palette is used for the results of cards running at nominal frequencies, and a "warm" orange-red one for overclocked cards. Where a minimum FPS was recorded, it is indicated on the diagrams.

4. Results of testing video cards and their analysis

First, let's see how video cards perform in popular synthetic benchmarks: 3DMark 2005, as well as the new 3DMark 2006.

3DMark 2005

3DMark 2006

If in 3DMark 2005 we have only one outsider, the GeForce 6600 LE, then in 3DMark 2006 it is joined by the GeForce 7300 GS, which in 3DMark 2005 at nominal frequencies was among the leaders. Meanwhile, the GeForce 6600, modest in frequency (even after overclocking) but full-fledged in its number of pipelines, confidently outpaces the Radeon X1300 behind it. However, the balance of power in both synthetic 3DMark benchmarks is often far from the results in real games, which we will examine next.

Half-life 2: Lost Coast

In Half-life 2: Lost Coast, the cards tested today allow playing at maximum graphics quality settings, albeit only at the resolutions selected for the tests and without HDR. As you can see, even the high post-overclock frequencies of the GeForce 6600 LE do not help it climb out of last place at 1024x768. The GeForce 6600 leads again, followed by the Radeon X1300, with the new NVIDIA GeForce 7300 GS in third place, trailing the Radeon X1300 by a minimal margin.

The Chronicles Of Riddick: Escape From Butcher Bay

The Chronicles Of Riddick: Escape From Butcher Bay, though it has practically left the gaming scene, is capable of loading not only lower and mid-range video cards but also some top graphics accelerators. To be fair, that requires Shader Model 2.0++ and high resolutions. Meanwhile, the flexibility of the game's settings allows, at maximum quality with Shader Model 2.0 at 800x600 and 1024x768, an acceptable FPS even on cards as weak as today's. The leaders are the GeForce 6600 and 6600 LE; at 800x600 the latter even outperforms the eight-pipeline GeForce 6600, but as the resolution rises, the high frequencies of the GeForce 6600 LE no longer compensate for its lack of pipelines. The GeForce 7300 GS is slightly ahead of the Radeon X1300 in this game.

Call of duty 2

In one of the "heaviest" games graphics-wise, Call of Duty 2, even at medium settings the cards tested today can only manage 800x600, and that is a stretch. Using the DirectX 8 profile or even more modest settings is, in my opinion, unacceptable: better not to play at all than to "play" this way. It produces a completely distorted impression of the game and its gameplay, which then generates sharply negative reviews; there are plenty of examples in our conference. As for the test results, the Radeon X1300 quite unexpectedly turned out to be the leader, with the GeForce 6600 and 7300 GS slightly behind it. The high-frequency but cut-down GeForce 6600 LE trails dismally at the back.

Serious sam 2

But Serious Sam 2, also at medium graphics quality settings, demonstrates a more comfortable number of frames per second than Call Of Duty 2. But even here we see nothing new in the alignment of forces in relation to the test results in previous games.

Quake 4

The leadership of video cards based on NVIDIA chips in Quake 4 is far from a novelty for us. Even the 64-bit GeForce 7300 GS beats the Radeon X1300. The advantage of the GeForce 6600 is undeniable in both resolutions, both in the nominal operating mode of video cards and during overclocking.

F.E.A.R.

Of all the cards tested today, only the overclocked GeForce 6600 demonstrates acceptable speed at 1024x768 with medium graphics settings (the minimum FPS does not drop below 20). The other cards are unlikely to allow comfortable play in F.E.A.R.; you will have to sacrifice resolution or reduce the graphics settings further.

Conclusion

First of all, owners of a GeForce 6600 with 8 pixel and 3 vertex pipelines need not worry: it is difficult to find a competitor for this card in the price range up to $100. Considering that the GeForce 6600 is nevertheless somewhat more expensive than the other test participants (and an extra $15-20, in my opinion, matters for cards at this price), it is more correct to choose between the Palit GeForce 6600Very+ ($85) and the Sapphire Radeon X1300 256 Mb ($86). Their speed is about the same: in half of the games (Half-life 2, Call Of Duty 2 and Serious Sam 2) the Radeon X1300 is ahead, in the other half (The Chronicles Of Riddick, Quake 4 and F.E.A.R.) the advantage lies with the GeForce 6600 LE. If you "play" benchmarks, though, the Radeon X1300 wins in both 3DMark 05 and 06.

If we compare the new products, the GeForce 7300 GS and the Radeon X1300, the latter wins only in 3DMark 2006, Call Of Duty 2 and F.E.A.R. One should keep in mind, however, that our sample was tested with a video memory frequency 100 MHz below specification, so a Radeon X1300 meeting spec should score somewhat higher. Still, the Palit GeForce 7300 GS 256 Mb, at a lower price ($69), provides rough parity in all the other games and a win in Quake 4. Note that you may come across cheap Radeon X1300 variants with a 64-bit or even 32-bit bus on sale! I think there is no need to explain that you are unlikely to be able to play on such a "video card".

Thus, if we take the entire range up to $100, the GeForce 6600 looks best; just make sure the card's characteristics meet the specifications. If every $15 counts, then there are two leaders: the full-fledged GeForce 6600 as the fastest but relatively expensive option, and the GeForce 7300 GS as the cheapest but not all that slow one. Between these two lie many other options, given the huge variety of non-standard cards, deviating from ATI and NVIDIA recommendations, that board makers produce.

Getting acquainted with the range of video cards under $100 and reviewing the corresponding discussion threads at the conference revealed one rather interesting fact: in the low-end segment, manufacturers sometimes show excessive independence, not following ATI and NVIDIA specifications at all. I do not object when someone equips a GeForce 6600 with fast GDDR-3 memory and sells it at a price no higher than its counterparts with conventional DDR memory. But what if, at the same time, instead of the eight pixel and three vertex pipelines set by the GeForce 6600 specification, the user gets a 4/2 scheme, which is even worse than the GeForce 6200's 4/3? The test results show that the race for high clock frequencies at the expense of the number of pipelines, and even more so of memory bus width, does not pay off at all. This was confirmed earlier by tests of the Radeon X1300 Pro and Radeon X1600 XT (Pro); today it is proven by the confident victory of the full-fledged GeForce 6600 in the overwhelming majority of tests and the complete defeat of the GeForce 6600 LE, high-frequency though it is.

Another typical example is equipping the Radeon X1300 with 512 (!) Mb of video memory running at a frequency below specification. I do not presume to judge exactly how ATI and NVIDIA control this process (or whether they control it at all), but once again I would like to warn you: know the main characteristics and be very careful when buying video cards in this price range. I hope today's article has helped you make your choice and clarified the balance of power.

Speaking of the general performance level in this price segment, note that owners have to sacrifice not only image quality but also resolution. Forget about anisotropic filtering, let alone full-screen anti-aliasing. A comfortable FPS is still achievable, but only at the expense of picture quality, so the decision on whether such cards are worth buying is yours.

The beginning of autumn ... Just as 5 years ago, when the first product with hardware T&L support, the GeForce256, was announced, today we meet new products from NVIDIA. Of course, this is not High-End, but that makes the announcement no less interesting. Actually, the announcement took place back in August, but then only the cards' characteristics were presented; now we, like many other media, can show what the newest products for the Middle sector of 3D accelerators are capable of.

Until recently, ATI's products held the lead in this market segment: the RADEON 9600 / PRO / XT and X600 PRO / XT outpaced their NVIDIA rivals, the GeForce FX 5700 / Ultra and PCX5750 / 5900, in modern games that make active use of shader technologies. Only the FX 5900XT, launched into this segment "from above", managed to become popular and shake the hegemony of the Canadian products. And now ...

"Nalu is on her way for the main prize ... Ruby will have to hold the line ..."

Yes, it is no coincidence that the mermaid and the brave girl from the corresponding demo programs of NVIDIA and ATI, showcasing the new technologies (SM3.0 from NVIDIA and 3Dc / SM2.0b from ATI), were chosen as heroines. The new chips from the Californian company, which we study today, support Shader Model 3.0 in full, like their older brothers.

Will Ruby hand her royal diamond to the pursuing Nalu? After all, ATI will soon announce new products in the same video card sector. What will the outcome of the battle be? We do not know yet, and I think the material on the RV410 will be no less interesting and exciting. But for now we will set that aside and consider the NV43 (GeForce 6600GT / 6600) as if these cards were already on sale. Accordingly, they will compete with the accelerators currently popular in the $150-200 price segment, and, of course, with the cards the new products replace.

Looking ahead, we note that the NV43 has built-in support for the PCI-Express interface (hereinafter PCX), so AGP products are impossible without an HSI bridge in the package. Consequently, AGP versions will be more expensive to manufacture and will appear later than their PCX counterparts (if they appear at all; everything will depend on demand). This is a significant disadvantage today, since the PCX sector is only beginning its development and demand for such platforms is still minimal. Therefore, however wonderful the new product may be, it is doomed from the start to relatively small demand in the retail market, since upgrading from AGP to PCX is still of dubious benefit. On the other hand, the OEM market and PC builders, especially foreign ones, will readily pick up PCX models that are not as expensive as the top solutions but still fully meet modern DirectX requirements.

Besides, who knows: the release of video cards that are attractive in price-to-speed terms may spur interest in PCX in general. Time will tell. And let's not forget that ATI's RV410 will also come with native PCX support only, while the Canadian company has no two-way AGP-PCX bridge of its own, so it will be almost impossible for it to bring the new chips to the AGP bus. However, that sector is already crowded with many solutions of similar performance, released earlier or today.

It was very interesting for us to compare not only cards with the same interface, but also the AGP and PCX versions against each other. This is, of course, difficult to do, since the platforms differ greatly. However, remember that we are in the Middle-End sector, where modern processors are quite capable of loading the accelerator fully, and beyond a certain resolution performance does not depend much on the platform. What came of our cross-platform research, find out below.

Now let's get back to the objects of today's analysis: the NVIDIA NV43, or GeForce 6600GT / 6600 (so far the line consists of two cards, differing in frequencies and memory).

Official specifications GeForce 6600GT / 6600 (NV43)

  1. Chip codename NV43
  2. 110nm Technology (TSMC) (!)
  3. 146 million transistors
  4. FC case (inverted chip, no metal cover)
  5. 128 bit dual channel memory interface (!)
  6. Up to 256 MB DDR / GDDR2 / GDDR3 memory
  7. Bus interface on-chip PCI Express16x
  8. Ability to translate the interface to AGP 8x using a two-way PCI Express-AGP HSI bridge
  9. 8 pixel processors, one texture unit on each, with arbitrary filtering of integer and floating-point textures (anisotropy up to 16x inclusive).
  10. 3 vertex processors, one texture unit on each, without filtering of the fetched values (point sampling)
  11. Calculation, blending and writing of up to 8 full pixels (color, depth, stencil buffer) per clock (experiment shows: up to 4)
  12. Calculation and writing of up to 16 depth and stencil buffer values per clock (if no color operations are performed) (experiment shows: up to 8)
  13. Two-way stencil buffer support
  14. Support for special geometry rendering optimizations to accelerate stencil buffer based shadow algorithms (the so-called Ultra Shadow II technology), in particular, widely used in the Doom III engine
  15. Everything needed to support Pixel and Vertex Shaders 3.0, including dynamic branching in pixel and vertex processors, fetching texture values from vertex processors, etc.
  16. Filtering textures in floating format
  17. Floating frame buffer supported (including blending operations)
  18. 2 RAMDAC 400 MHz
  19. 2 DVI interfaces (interface chips required)
  20. TV-Out and TV-In interface (interface chips required)
  21. Programmable streaming video processor (for video compression, decompression and post-processing tasks)
  22. 2D accelerator with support for all GDI + functions
  23. Built-in temperature and power consumption monitoring

GeForce 6600 GT Reference Card Specifications

  1. Core frequency 500 MHz
  2. Effective memory frequency 1 GHz (2 * 500 MHz)
  3. Memory bus 128 bit
  4. GDDR3 memory type
  5. Memory capacity 128 megabytes
  6. Memory bandwidth 16 gigabytes per second.
  7. Theoretical fill rate 4 gigapixels per second.
  8. Theoretical texture sampling rate 4 gigatexels per second.
  9. One VGA (D-Sub) and one DVI-I connector
  10. TV-Out
  11. Consumes up to 70 watts of power (i.e., on a PCI-Express card, a connector for additional power is not needed, a power supply with a total capacity of 300 watts or more is recommended)

List of cards currently released on the NV43 base:

  • GeForce 6600GT: 500/500 (1000) MHz, 128MB GDDR3, PCI-Express x16, 8 pixel and 3 vertex pipelines ($199) - competitor to the NVIDIA GeForce PCX5900, ATI RADEON X600 XT (?), as well as future ATI solutions (RV410);
  • GeForce 6600: 300 / 250-300 (500-600) MHz, 128 / 256MB DDR, PCI-Express x16, 8 pixel and 3 vertex pipelines ($ 149) - competitor to NVIDIA GeForce PCX5750, ATI RADEON X600 PRO (X600 XT?).

General circuit of the chip

There are no special architectural differences from NV40, which, however, is not surprising - NV43 is a scaled (by reducing the number of vertex and pixel processors and memory controller channels) solution based on the NV40 architecture. The differences are quantitative (shown in bold in the diagram), not qualitative - from the architectural point of view, the chip has practically not changed.

So, there are 3 vertex processors (there were 6) and two independent pixel processors (there were four), each working with one quad (a 2x2 pixel fragment). Interestingly, PCI Express has this time become the native (that is, implemented on-chip) bus interface, and AGP 8x boards will carry an additional two-way PCI-E-AGP bridge (shown by a dotted line), which we described in detail earlier.
In addition, note a very important limiting factor: the two-channel controller and 128-bit memory bus. We will discuss and investigate this in detail below.

The architecture of vertex and pixel processors and video processor remains the same - we described these elements in detail in our review of the GeForce 6800 Ultra (link). Now, let's talk about potential quantitative and qualitative changes in relation to the NV40:

Theoretical considerations about what was cut and how

In general, at the moment, we are receiving the following line of solutions based on NV4X and R4XX architectures:

[Table: the NV4X- and R4XX-based lineup compared by pixel/vertex pipelines, memory bus, fill rate (Mpix) and core frequency. Only the memory column is legible: 256-bit (4 x 64) GDDR3 1100 / 1000 and DDR 700 configurations for the senior cards; 128-bit (2 x 64) GDDR3 1000 and DDR 500-600-700 for the 6600 line; a 128-bit (2 x 64) bus for the X700 PRO / SE*, based on the previous-generation architecture.]

*) data based on unverified rumors (beyond3d forum and other unofficial online sources), soon these products will be officially announced.

While the 6800 Ultra, GT and plain 6800 look fairly balanced in terms of memory bandwidth versus fill rate, the 6800 LE will more often run into insufficient fill rate (it has a surplus of memory bandwidth), whereas both 6600 models will primarily suffer from a lack of memory bandwidth. The 6600 GT has a peak fill rate of nearly 2/3 of the 6800 Ultra's, while its memory bandwidth is less than half, not counting the potentially smaller caches and the two-channel memory controller.

Thus, we can predict that the weak point of the 6600 family will be high resolutions and modes with full-screen anti-aliasing, especially in simple applications, while applications with long, complex shaders and anisotropic filtering without simultaneous MSAA will be its strong suit. Next, we will check this assumption with game and synthetic tests.

It is hard to judge now how justified the move to a 128-bit memory bus was. On the one hand, it makes the chip package cheaper and reduces the number of defective chips; on the other hand, the price difference between a 256-bit and a 128-bit printed circuit board is not large and is more than offset by the price difference between regular DDR and the still expensive high-speed GDDR3 memory. Probably, from the card makers' point of view a 256-bit solution would be more convenient, at least if they had the choice, while from the point of view of NVIDIA, which makes the chips and often sells memory bundled with them, the 128-bit solution with GDDR3 is preferable. Another matter is how this affects speed: the chip's excellent capabilities (8 pipelines, a 500 MHz core frequency, and that is not the limit) are potentially held back by the significantly reduced memory bandwidth:

DDR 700 MHz x 256 bits = 22.4 GB/s versus GDDR3 1000 MHz x 128 bits = 16 GB/s.
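This comparison is just effective memory frequency times bus width in bytes; a minimal sketch (the helper name is ours):

```python
def bandwidth_gb_s(effective_mhz: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s: transfers per second times bus width in bytes."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

print(bandwidth_gb_s(700, 256))   # 22.4 GB/s -- a 256-bit bus with DDR 700
print(bandwidth_gb_s(1000, 128))  # 16.0 GB/s -- the 6600 GT's 128-bit GDDR3
```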

This fact is especially worrisome against the background of rumors about the older model X700, which will be equipped with 256-bit memory.

Note, however, that NVIDIA has kept the Ultra suffix in reserve for now. Given the good overclocking potential of the 110 nm process, we can expect a card with a core frequency around 600 MHz or slightly less, 1100 or even 1200 MHz (eventually) memory, and the name 6600 Ultra. But what will it cost? In the longer term we can predict an updated 256-bit version of the mainstream solution, call it NV46, optimized for performance, with 8 or even 12 pipelines and a 256-bit bus.

To all appearances, the vertex and pixel processors in NV43 remained unchanged, but the internal caches may have been reduced in proportion to the number of pipelines. However, the transistor count does not give much cause for concern: given the modest cache sizes, it would have been more reasonable to keep them the same as in NV40, thereby compensating for the noticeable lack of memory bandwidth. It is quite possible that the array of ALUs (quite large in transistor terms) that performs post-processing, verification, Z generation, and pixel blending when writing results to the frame buffer has also been reduced on each pipeline compared to NV40. In any case, the reduced memory bandwidth will not allow writing a full 4 gigapixels per second, and the fill-rate potential (8 pipelines at 500 MHz) will only be fully exploited on more or less complex shaders, with more than two textures and accompanying shader calculations.
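The fill-rate claim can be checked with the same back-of-the-envelope arithmetic. A hedged sketch (Python; assumes 32-bit color writes only, ignoring Z traffic and any compression, so the real pressure on the bus is even higher):

```python
pipelines = 8
core_mhz = 500
fill_gpix = pipelines * core_mhz * 1e6 / 1e9  # theoretical fill rate, Gpix/s -> 4.0

bytes_per_pixel = 4                 # 32-bit color only, no Z, no blending reads
write_need_gb = fill_gpix * bytes_per_pixel  # GB/s needed just for color writes
bandwidth_gb = 16.0                 # 6600GT: GDDR3 1000 MHz x 128-bit

print(f"{fill_gpix} Gpix/s needs {write_need_gb} GB/s for color alone, "
      f"vs {bandwidth_gb} GB/s available")
```

Color writes alone would already saturate the entire 16 GB/s bus, which is why the chip can only realize its fill-rate potential when shader work, not memory writes, dominates each pixel.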

We will check all these assumptions during the subsequent synthetic and gaming tests.

Before examining the card itself, here is a list of articles devoted to the previous new products, NV40/R420. After all, it is already obvious that the NV43 architecture is a direct heir of NV40 technologies (essentially, the chip's capabilities cut in half).

Theoretical and analytical materials and video card reviews examining the functional features of the ATI RADEON X800 (R420) and NVIDIA GeForce 6800 (NV40) GPUs:

  • NVIDIA GeForce 6800 Ultra (NV40). Part 1 - architecture features and synthetic tests in D3D RightMark (one-page version)
  • NVIDIA GeForce 6800 Ultra (NV40). Part 1 - architecture features and synthetic tests in D3D RightMark (paginated version)
  • NVIDIA GeForce 6800 Ultra (NV40). Part 2 - examining performance and quality in gaming applications (one-page version)
  • NVIDIA GeForce 6800 Ultra (NV40). Part 2 - examining performance and quality in gaming applications (paginated version)
  • Battle of Borodino between ATI RADEON X800 XT and NVIDIA GeForce 6800 Ultra - Scene Two: 450 MHz for the second card and new tests for both cards (one-page version)
  • Battle of Borodino between ATI RADEON X800 XT and NVIDIA GeForce 6800 Ultra - Scene Two: 450 MHz for the second card and new tests for both cards (paginated version)
  • Battle of Borodino between RADEON X800 and GeForce 6800: Scene Three - trilinear filtering (synthetic examples)
  • Battle of Borodino between RADEON X800 and GeForce 6800: Scene Four - filtering tests based on RightMark (one-page version)
  • Battle of Borodino between RADEON X800 and GeForce 6800: Scene Four - filtering tests based on RightMark (paginated version)
  • Battle of Borodino between ATI RADEON X800 and NVIDIA GeForce 6800 - Scene Five: game-based filtering tests (one-page version)
  • Battle of Borodino between ATI RADEON X800 and NVIDIA GeForce 6800 - Scene Five: game-based filtering tests (paginated version)
  • PowerColor RADEON X800 PRO Limited Edition review, hardware rework of the X800 PRO into an X800 XT Platinum Edition (one-page version)
  • PowerColor RADEON X800 PRO Limited Edition review, hardware rework of the X800 PRO into an X800 XT Platinum Edition (paginated version)
  • Review of the Leadtek WinFast A400 TDH and Leadtek WinFast A400 Ultra TDH based on NVIDIA GeForce 6800/6800 Ultra (one-page version)
  • Review of the Leadtek WinFast A400 TDH and Leadtek WinFast A400 Ultra TDH based on NVIDIA GeForce 6800/6800 Ultra (paginated version)
  • Battle of Borodino between ATI RADEON X800 and NVIDIA GeForce 6800 - Scene Six: filtering in games, continued (one-page version)
  • Battle of Borodino between ATI RADEON X800 and NVIDIA GeForce 6800 - Scene Six: filtering in games, continued (paginated version)
  • Brief report on testing FarCry v.1.2 and the first real-world incarnation of Shader 3.0
  • Brief report on quick testing of modern 3D cards in DOOM III (X800PRO/XT, GF6800/GT/Ultra, 9800XT/5950U)
  • Chaintech Apogee GeForce 6800 Ultra based on NVIDIA GeForce 6800 Ultra - tests in DOOM III with "optimizations"

Let me emphasize once again that today's first part is devoted only to the performance of the new products. We will examine the quality aspects (3D quality and video playback) later, in the second part.

Now let's talk about the card. Why are there two cards in the title when we are actually examining only one? The fact is that the 6600GT and 6600 differ only in operating frequencies, so we can most likely emulate a GeForce 6600 by lowering the frequencies of the 6600GT, which is what we did. Of course, the production GeForce 6600 will carry not GDDR3 but plain DDR memory (which differs in timings as well as frequency), and NVIDIA does not rigidly specify memory frequencies for such cards, allowing anything from 250 to 300 MHz; therefore we cannot claim that our results will coincide 100% with those of the final GeForce 6600. But an estimate is possible, and even useful. Our charts therefore show a GeForce 6600 at 300/300 (600) MHz, the limiting case: a real 6600 will perform NO BETTER than what we show in the diagrams, so you can roughly estimate the range within which it will fall.
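The "limiting case" logic above can be sketched numerically. A hypothetical frequency-scaling bound (Python; the clocks are those of our test setup, and the assumption that performance scales no better than clock frequency is ours):

```python
# 6600GT reference clocks vs our simulated 6600 (MHz; memory is effective rate).
gt_core, gt_mem = 500, 1000
sim_core, sim_mem = 300, 600   # the best case NVIDIA's spec allows for a 6600

core_ratio = sim_core / gt_core  # 0.6
mem_ratio = sim_mem / gt_mem     # 0.6

# Performance cannot scale better than the slower of the two ratios,
# so a real 6600 should land at or below ~60% of 6600GT results.
upper_bound = min(core_ratio, mem_ratio)
print(f"Simulated 6600 upper bound: {upper_bound:.0%} of 6600GT performance")
```

A retail 6600 with 250 MHz DDR and looser timings would sit somewhat below this bound, which is exactly why the article treats the 300/300 configuration as a ceiling rather than a prediction.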

So, here is the reference GeForce 6600GT card.

Obviously, the design of the GF 6600GT is unique and unlike any previous one. The most noticeable change is the reduced size of the card itself, made possible by the absence of a 256-bit bus, which strongly affects PCB dimensions. A greatly simplified power circuit also helped shrink the PCB area: PCI Express cards consuming less than 75 W no longer require an external power connector. Our card draws less than 75 W at maximum load, so no direct connection to the power supply is needed.

Despite frequencies that are huge for an 8-pipeline chip, the cooler is rather primitive.

It can be assumed that the manufacturers of such cards will experiment with their own coolers, or use the developments that were made earlier for the GeForce4 Ti (GeForce FX 5600/5700).

The GPU package itself is relatively small (thanks, of course, to the 128-bit bus), and overall it looks very similar to the GeForce FX 5700; the die dimensions are almost the same. But where NV36 fit only 4 pixel and 3 vertex pipelines into that area, here there are twice as many pixel pipelines. Such is 0.11-micron technology...

The video card has an important feature for the future: SLI support (that is, as in the days of Voodoo2, total 3D performance can be increased by adding a second similar accelerator). For this purpose, the upper part of the board carries a connector for a special bridge that links two video cards and synchronizes their operation:

Finishing our examination of the card itself, we note that it has VIVO support implemented via a Philips 7115 chip (we have not encountered this one before, so our permanent researcher of the multimedia features of video cards, Alexey Samsonov, is already waiting impatiently to test the new product).

Now let's talk about overclocking. Thanks to the promptness of RivaTuner's author, Alexei Nikolaychuk, this utility can already work with NV43.

The utility correctly detects the card's number of pipelines (both pixel and vertex). In the second screenshot, we can see that the card has two quads (groups of four pixel pipelines).

So, the board worked stably at 590/590 (1180) MHz! Unprecedented potential! I would even venture that after the release of ATI's RV410, NVIDIA will release a GeForce 6600 Ultra (it is no accident that the top model currently carries only the GT suffix).

The card ran at these frequencies while cooled by an additional external fan. Here are the temperatures we observed:

Yes, sometimes the core temperature reached 88 degrees, but, as you know, for such chips this is clearly not the limit (they can heat up to 100 degrees). It is interesting to note that the external fan practically cooled only the memory, because removing it did not lead to any increase in the core temperature.