All objects by nVidia

A Californian company founded in 1993.

nVidia started with the NV1 back in 1995, sold at retail as the Diamond Edge 3D. The NV1 featured 2D, 3D, a Sound Blaster compatible audio system, 15-pin joystick ports and a Sega Saturn compatible joypad port. Despite its many features, not everything worked as stably and flawlessly as nVidia had hoped.

More success came with the launch of the NV3 chip. The new chip was used in the RIVA 128 cards and was Direct3D 5.0 compatible. It was a big step forward compared to the NV1, but it was not until the release of the NV4 that nVidia had a worthy rival to the competition. The NV4 is better known as the Riva TNT: a 32-bit Direct3D 6.0 graphics card.

The Riva TNT did not sell as well as the incredibly popular Voodoo 2. The NV5 (Riva TNT 2) changed this: better drivers, better performance, and better support, as the Direct3D API was now widely used for games. The TNT 2 was sold in many variants to serve every segment of the market.

The real breakthrough came with the launch of the GeForce 256 DDR (NV10). A popular card was the ASUS V6600 /32M(TVR), which performed very well. The NV10 is DirectX 7.0 compliant and features a hardware transform and lighting (T&L) unit. The latter was not supported by rival 3dfx, which made the GeForce 256 run notably faster in systems with a slower CPU.

In the years 2000 to 2002 nVidia held the performance crown. 3dfx was bought by nVidia, and ATi needed time to get the Radeon running correctly. The Radeon 256 performed quite well, but compatibility-wise nVidia was a step ahead. nVidia launched the GeForce 2 in many forms (from slow MX versions to fast Ti and Ultra versions), launched the GeForce 3 in 2001, and brought us the GeForce 4 Ti in 2002. The GeForce 4 Ti was the card to go for if you wanted performance and quality in 2002.

In 2003, however, the green ship slowly started to turn. ATi had its Radeon 9500 Pro and 9700 Pro ready and was about to release the 9800 Pro. Drivers were getting better and many gamers favoured the Radeon over the GeForce, especially after the release of the GeForce FX in 2003. The GeForce FX was DirectX 9 compatible (just like the Radeon 9500 Pro already was) but fell behind in terms of performance and image quality. The GeForce FX also ran much hotter, which demanded better cooling systems and better power supplies. nVidia's reference design for the GeForce FX 5800 Ultra is actually very noisy and was often used as an argument in Radeon vs. GeForce discussions. Eventually it turned out that Pixel Shader 2.0 routines (which were used in the new games back then) were very slow on the GeForce FX, which hurt nVidia's image even more. nVidia later introduced newer versions of the GeForce FX (i.e. the FX 5700 and FX 5900 series) which fixed some of the problems.

In 2004 nVidia launched the GeForce 6 series, a lineup that solved the troubles of the GeForce FX. From this period onwards, both ATi and nVidia released performance crown winners as well as disappointments (the ATi Radeon 2900 XT: big and slow; the GeForce 8800 GTX: solder ball failures after a few years), and they still do to this very day.


Click on the names of the objects to see more details.


The list will show every object in my database that has been made by nVidia.

Vendor | Name | RAM   | Interface | ES? | Date | Core Clock | RAM Clock
nVidia |      | 64MB  | AGP       | No  | 0239 | 250MHz     | 311MHz
nVidia |      | 128MB | AGP       | Yes | 0000 | 500MHz     | 500MHz
nVidia |      | 128MB | AGP       | Yes | 0000 | 500MHz     | 500MHz
nVidia |      | 64MB  | AGP       | No  | 0315 | 275MHz     | 200MHz
nVidia |      | 256MB | AGP       | No  | 0337 | 475MHz     | 450MHz
nVidia |      | 256MB | AGP       | Yes | 0412 | 600MHz?    | 400MHz?
nVidia |      | 256MB | PCI-e     | Yes | 0536 | 500MHz     | 400MHz
nVidia |      | 64MB  | PCI-e     | No  | 0531 | 275MHz     | 250MHz
nVidia |      | 128MB | PCI-e     | No  | 0528 | 350MHz     | 300MHz