Nvidia

From Wikipedia, the free encyclopedia

Nvidia (NASDAQ: NVDA, pronounced /ɛnˈvɪ.di.ə/) is a multinational corporation which specializes in the development of graphics processing units and chipset technologies for workstations, personal computers, and mobile devices. Based in Santa Clara, California, the company has become a major supplier of integrated circuits (ICs), designing graphics processing units (GPUs) and chipsets used in graphics cards, in personal-computer motherboards, and in video game consoles.
Notable Nvidia product lines include the GeForce series of consumer graphics processors, the Quadro series for professional workstations, the nForce motherboard chipsets, the Tesla line for GPU computing, and the Tegra system-on-a-chip family for mobile devices.

Nvidia Corporation
Type: Public (NASDAQ: NVDA)
Founded: 1993
Founders: Jen-Hsun Huang, Chris Malachowsky, Curtis Priem
Headquarters: 2701 San Tomas Expressway, Santa Clara, California, USA
Area served: Worldwide
Key people: Jen-Hsun Huang, Co-founder, President and CEO
Chris Malachowsky, Co-founder, Nvidia Fellow, Senior Vice President, Engineering and Operations
Jonah M. Alben, Vice President, GPU Engineering
Debora Shoquist, Senior Vice President, Operations
Dr. Ranga Jayaraman, CIO
Industry: Semiconductors (specialized)
Products: Graphics processing units, chipsets
Revenue: US$3.424 billion (2009)
Operating income: US$70.70 million (2009)
Net income: US$30.04 million (2009)
Total assets: US$3.350 billion (2009)
Total equity: US$2.394 billion (2009)
Employees: over 4,985 (as of June 2008)
Website: Nvidia.com

 

 

Company history

Three people co-founded Nvidia in 1993: Jen-Hsun Huang, Chris Malachowsky, and Curtis Priem. The founders secured venture-capital funding from Sequoia Capital.
In 2000, Nvidia acquired the intellectual assets of its one-time rival 3dfx, one of the biggest graphics companies of the mid- to late-1990s.
On December 14, 2005, Nvidia acquired ULI Electronics, which at the time supplied third-party southbridge parts for chipsets to ATI, Nvidia's competitor.
In March 2006, Nvidia acquired Hybrid Graphics.
In December 2006, Nvidia, along with its main rival in the graphics industry AMD (which had acquired ATI), received subpoenas from the Justice Department regarding possible antitrust violations in the graphics card industry.
Forbes magazine named Nvidia its Company of the Year for 2007, citing its accomplishments during that year as well as over the previous five years.
On January 5, 2007, Nvidia announced that it had completed the acquisition of PortalPlayer, Inc.
In February 2008, Nvidia acquired Ageia Technologies for an undisclosed sum. "The purchase reflects both companies' shared goal of creating the most amazing and captivating game experiences," said Jen-Hsun Huang, president and CEO of Nvidia. "By combining the teams that created the world's most pervasive GPU and physics engine brands, we can now bring GeForce-accelerated PhysX to twelve million gamers around the world." (The press release made no mention of the acquisition cost or of future plans for specific products.)

Branding

The company's name combines an initial n (a letter usable as a pronumeral in mathematical statements) and the root of video (from Latin videre, "to see"), thus implying "the best visual experience" or perhaps "immeasurable display." The sound of the name Nvidia suggests "envy" (Spanish: envidia; Latin, Italian, or Romanian: invidia); and Nvidia's GeForce 8 series product (manufactured 2006-2008) used the slogan "Green with envy."
The company name appears entirely in upper-case ("NVIDIA") in the company's technical documentation. The mixed-case form ("nVIDIA," with a full-height, lower-case "n") appears only in the corporate logo.

Products


Nvidia headquarters in Santa Clara

A graphics processing unit on an Nvidia GeForce 6600 GT
Nvidia's product portfolio includes graphics processors, wireless communications processors, PC platform (motherboard core logic) chipsets, and digital media player software. The community of computer users arguably has come to know Nvidia best for its "GeForce" product line, which consists of both a complete line of discrete graphics chips found in AIB (add-in board) video cards and core graphics technology used in nForce motherboards, the Microsoft Xbox game console, and Sony's PlayStation 3 game console.
In many respects Nvidia resembles its competitor ATI. Both companies began with a focus on the PC market and later expanded their activities into chips for non-PC applications. As part of their operations, both ATI and Nvidia create reference designs (circuit board schematics) and provide manufacturing samples to their board partners. However, unlike ATI, Nvidia does not sell graphics boards into the retail market, instead focusing on the development of GPU chips. As a fabless semiconductor company, Nvidia contracts out the manufacture of their chips to Taiwan Semiconductor Manufacturing Company, Ltd. (TSMC). Manufacturers of Nvidia video cards include BFG, EVGA, Foxconn, and PNY. The manufacturers ASUS, ECS, Gigabyte Technology, MSI, Palit, and XFX produce both ATI and Nvidia cards.
December 2004 saw the announcement that Nvidia would assist Sony with the design of the graphics processor (RSX) in the PlayStation 3 game console. In March 2006 it emerged that Nvidia would deliver RSX to Sony as an IP core, and that Sony alone would organize the manufacture of the RSX. Under the agreement, Nvidia will provide ongoing support to port the RSX to Sony's fabs of choice (Sony and Toshiba), as well as die shrinks to 65 nm. This practice contrasts with Nvidia's business arrangement with Microsoft, in which Nvidia managed production and delivery of the Xbox GPU through Nvidia's usual third-party foundry contracts. Meanwhile, Microsoft chose to license a design by ATI and to make its own manufacturing arrangements for the Xbox 360 graphics hardware, as has Nintendo for the Wii console (which succeeds the ATI-based Nintendo GameCube).
On February 4, 2008, Nvidia announced plans to acquire physics-software producer Ageia, whose PhysX physics engine formed part of hundreds of games shipping or in development for PlayStation 3, Xbox 360, Wii, and gaming PCs. The transaction was completed on February 13, 2008, and efforts began to integrate PhysX into the GeForce 8800's CUDA system.
On June 2, 2008 Nvidia officially announced its new Tegra product line. The Tegra, a system-on-a-chip (SoC), integrates an ARM CPU, GPU, northbridge and southbridge onto a single chip. Commentators opine that Nvidia will target this product at the smartphone and mobile Internet device markets.

Graphics chipsets

Desktop motherboard chipsets

  • nForce series
    • nForce: AMD Athlon/Athlon XP/Duron K7 CPUs (System Platform Processor (SPP) and Media and Communications Processor (MCP) or GeForce2 MX-class Integrated Graphics Processor (IGP) and MCP, SoundStorm available)
    • nForce2: AMD Athlon/Athlon XP/Duron/Sempron K7 CPUs (SPP + MCP or GeForce4 MX-class IGP + MCP, SoundStorm available)
    • nForce3: AMD Athlon 64/Athlon 64 X2/Athlon 64 FX/Opteron/Sempron K8 CPUs (unified MCP only)
    • nForce4
      • AMD: Athlon 64/Athlon 64 X2/Athlon 64 FX/Opteron/Sempron K8 CPUs (unified MCP, SPP + MCP, or MCP paired with GeForce 6100 series/Quadro NVS 210S IGP)
      • Intel: Pentium 4/Pentium 4 Extreme Edition/Pentium D/Pentium Extreme Edition/Celeron/Celeron D NetBurst CPUs (SPP + MCP only)
    • nForce 500
      • AMD: Athlon 64/Athlon 64 X2/Athlon 64 FX/Opteron/Sempron K8 CPUs (unified MCP or SPP + MCP)
      • Intel Pentium 4/Pentium 4 Extreme Edition/Pentium D/Pentium Extreme Edition/Pentium Dual-Core/Core 2 Duo/Core 2 Extreme/Celeron/Celeron D NetBurst and Core 2 CPUs (SPP + MCP only)
    • nForce 600
      • AMD: Athlon 64/Athlon 64 X2/Athlon 64 FX/Opteron/Sempron K8 CPUs, Quad FX-capable (unified MCP or MCP paired with GeForce 7000 series/GeForce 7100 series IGP)
      • Intel: Pentium 4/Pentium 4 Extreme Edition/Pentium D/Pentium Extreme Edition/Pentium Dual-Core/Core 2 Duo/Core 2 Extreme/Core 2 Quad/Celeron/Celeron D NetBurst and Core 2 CPUs (SPP + MCP or MCP paired with GeForce 7000 series/GeForce 7100 series IGP)
    • nForce 700
      • AMD: Athlon 64/Athlon 64 X2/Athlon 64 FX/Athlon X2/Opteron/Phenom X3/Phenom X4/Sempron K8 and K10 CPUs
      • Intel: Pentium 4/Pentium 4 Extreme Edition/Pentium D/Pentium Extreme Edition/Pentium Dual-Core/Core 2 Duo/Core 2 Extreme/Core 2 Quad/Celeron/Celeron D NetBurst and Core 2 CPUs
    • nForce 900: AMD Athlon 64/Athlon 64 X2/Athlon 64 FX/Athlon X2/Athlon II X2/Athlon II X3/Athlon II X4/Opteron/Phenom X3/Phenom X4/Phenom II X2/Phenom II X3/Phenom II X4/Sempron K8 and K10 CPUs

Documentation and drivers

Nvidia does not publish the documentation for its hardware, meaning that programmers cannot write appropriate and effective open-source drivers for Nvidia's products (compare Graphics hardware and FOSS). Instead, Nvidia provides its own binary GeForce graphics drivers for X.Org and a thin open-source library that interfaces with the Linux, FreeBSD or Solaris kernels and the proprietary graphics software. Nvidia also supports an obfuscated open-source driver that only supports two-dimensional hardware acceleration and ships with the X.Org distribution. Nvidia's Linux support has promoted mutual adoption in the entertainment, scientific visualization, defense and simulation/training industries, traditionally dominated by SGI, Evans & Sutherland, and other relatively costly vendors.
The proprietary nature of Nvidia's drivers has generated dissatisfaction within free-software communities. Some Linux and BSD users insist on using only open-source drivers, and regard Nvidia's insistence on providing nothing more than a binary-only driver as wholly inadequate, given that competing manufacturers (like Intel) offer support and documentation for open-source developers, and that others (like ATI) release partial documentation. Because of the closed nature of the drivers, Nvidia video cards do not deliver adequate features on several platforms and architectures, such as FreeBSD on the x86-64 architecture and the other BSD operating systems on any architecture. Support for three-dimensional graphics acceleration in Linux on the PowerPC does not exist; nor does support for Linux on the hypervisor-restricted PlayStation 3 console. While some users accept the Nvidia-supported drivers, many users of open-source software would prefer better out-of-the-box performance if given the choice. However, the performance and functionality of the binary Nvidia video card drivers surpass those of open-source alternatives following VESA standards.
X.Org Foundation and Freedesktop.org have started the Nouveau project, which aims to develop free-software drivers for Nvidia graphics cards by reverse engineering Nvidia's current proprietary drivers for Linux.

Market share

According to a survey conducted in the third quarter of 2007 by market-watch firm Jon Peddie Research, Nvidia occupied the top slot in the desktop graphics device market with a 37.8% share. However, in the mobile space, it remained third with 22.8% of the market. Overall, Nvidia has maintained its position as the second-largest supplier of PC graphics shipments, including both integrated and discrete GPUs, with a 33.9% market share, its highest in many years, placing it just behind Intel (38%).
According to the Steam hardware survey conducted by the game developer Valve, Nvidia had 64.64% of the PC video card market share as of 1 December 2008, while ATI had 27.12%. But this could relate to Valve releasing trial versions of The Orange Box compilation to Nvidia graphics card users: the compilation provided a link to the survey. However, free copies of The Orange Box were also released to purchasers of ATI cards, notably to those who purchased the Radeon HD 2900 XT.

Market history

Before DirectX


An Nvidia RIVA 128 AGP video card
Nvidia released its first graphics card, the NV1, in 1995. Its design used quadratic surfaces, with an integrated playback-only sound card and ports for Sega Saturn gamepads. Because the Saturn also used forward-rendered quadratics, programmers ported several Saturn games to play on a PC with NV1, such as Panzer Dragoon and Virtua Fighter Remix. However, the NV1 struggled in a marketplace full of several competing proprietary standards.
Market interest in the product ended when Microsoft announced the DirectX specifications, based on polygons. Subsequently NV1 development continued internally as the NV2 project, funded by several millions of dollars of investment from Sega. Sega hoped that an integrated chip with both graphics and sound capabilities would cut the manufacturing cost of the next Sega console. However, Sega eventually realized the flaws in implementing quadratic surfaces, and the NV2 project never resulted in a finished product.
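To make the contrast with polygon rendering concrete, the sketch below evaluates a point on a generic biquadratic Bezier patch, the kind of curved primitive a quadratic-surface renderer works with directly instead of triangles. The 3x3 control-point layout and the evaluation scheme are standard textbook choices, not NV1's actual hardware method.

    /* Illustrative only: evaluate a biquadratic (3x3 control point) Bezier patch
     * at parameters (u, v). Curved primitives like this can be rendered directly,
     * in contrast to triangle-based pipelines. Generic textbook formulation,
     * not NV1's actual scheme. */
    #include <stdio.h>

    typedef struct { float x, y, z; } Vec3;

    /* Degree-2 Bernstein basis: B0 = (1-t)^2, B1 = 2t(1-t), B2 = t^2 */
    static void bernstein2(float t, float b[3]) {
        float s = 1.0f - t;
        b[0] = s * s;
        b[1] = 2.0f * s * t;
        b[2] = t * t;
    }

    /* Tensor-product evaluation over a 3x3 grid of control points. */
    static Vec3 eval_patch(const Vec3 cp[3][3], float u, float v) {
        float bu[3], bv[3];
        Vec3 p = {0.0f, 0.0f, 0.0f};
        bernstein2(u, bu);
        bernstein2(v, bv);
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j) {
                float w = bu[i] * bv[j];
                p.x += w * cp[i][j].x;
                p.y += w * cp[i][j].y;
                p.z += w * cp[i][j].z;
            }
        return p;
    }

    int main(void) {
        /* A gently curved patch: a flat 3x3 grid with a raised centre point. */
        Vec3 cp[3][3];
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j) {
                cp[i][j].x = (float)i; cp[i][j].y = (float)j; cp[i][j].z = 0.0f;
            }
        cp[1][1].z = 1.0f;
        Vec3 p = eval_patch(cp, 0.5f, 0.5f);
        printf("patch(0.5, 0.5) = (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);
        return 0;
    }

A DirectX-style pipeline would instead approximate such a surface with many small triangles, which is the mismatch that ultimately doomed the quadratic approach.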

Transition to DirectX

Nvidia's CEO Jen-Hsun Huang realized at this point that after two failed products, something had to change for the company to survive. He hired David Kirk as Chief Scientist from software developer Crystal Dynamics. Kirk would combine Nvidia's experience in 3D hardware with an intimate understanding of practical implementations of rendering.
As part of the corporate transformation, Nvidia sought to provide full support for DirectX, and dropped multimedia functionality in order to reduce manufacturing costs. Nvidia also adopted the goal of an internal six-month product cycle, based on the expectation that it could mitigate a failure of any one product by having a replacement moving through the development pipeline.
However, since the Sega NV2 contract remained secret, and since Nvidia had recently laid off employees, it appeared to many industry observers that Nvidia had ceased active research and development. So when Nvidia first announced the RIVA 128 in 1997, the market found the specifications hard to believe: performance superior to market-leader 3dfx Voodoo Graphics, and a fully hardware-based triangle setup engine. The RIVA 128 shipped in volume, and the combination of its low cost and high performance made it a popular choice for OEMs.

Ascendancy: RIVA TNT

Having finally developed and shipped in volume a market-leading integrated graphics chipset, Nvidia set itself the goal of doubling the number of pixel pipelines in its chip, in order to realize a substantial performance gain. The TwiN Texel (RIVA TNT) engine which Nvidia subsequently developed could either apply two textures to a single pixel, or process two pixels per clock cycle. The former case allowed for improved visual quality, the latter for doubling the maximum fillrate.
New features included a 24-bit Z-buffer with 8-bit stencil support, anisotropic filtering, and per-pixel MIP mapping. In certain respects (such as transistor count) the TNT had begun to rival Intel's Pentium processors for complexity. However, while the TNT offered an impressive set of integrated features, it failed to displace the market leader, 3dfx's Voodoo2, because its actual clock rate ended up at only 90 MHz, about 35% lower than expected.
Nvidia followed with a refresh part: a die shrink for the TNT architecture from 350 nm to 250 nm. A stock TNT2 now ran at 125 MHz, a TNT2 Ultra at 150 MHz. Though the Voodoo3 beat Nvidia to the market, 3dfx's offering proved disappointing; it did not run much faster and lacked features that were becoming standard, such as 32-bit color and textures of resolution greater than 256 x 256 pixels.
The RIVA TNT2 marked a major turning point for Nvidia. The company had finally delivered a product competitive with the fastest on the market, with a superior feature set, strong 2D functionality, and good yields, all integrated onto a single die that ramped to impressive clock rates. Nvidia's six-month refresh cycle took the competition by surprise, giving it the initiative in rolling out new products.

Market leadership: GeForce


A GeForce4 MX 64 MB card. Nvidia produced such cards from 2002 to 2003
The northern-hemisphere autumn of 1999 saw the release of the GeForce 256 (NV10), most notably introducing on-board transformation and lighting (T&L) to consumer-level 3D hardware. Running at 120 MHz and featuring four pixel pipelines, it implemented advanced video acceleration, motion compensation, and hardware sub-picture alpha blending. The GeForce outperformed existing products - including the ATI Rage 128, 3dfx Voodoo3, Matrox G400 MAX, and RIVA TNT2 - by a wide margin.
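Hardware transform and lighting moved per-vertex work that the CPU had previously performed onto the graphics chip. The sketch below shows the kind of fixed-function computation involved, a 4x4 matrix transform followed by a single diffuse lighting term; it is a generic textbook formulation under assumed conventions (column vectors, one directional light), not a description of the GeForce 256's internal pipeline.

    /* Illustrative fixed-function "transform and lighting" per-vertex work:
     * the kind of computation hardware T&L offloaded from the CPU.
     * Generic textbook math, not the GeForce 256's actual internal pipeline. */
    #include <stdio.h>

    typedef struct { float v[4]; }    Vec4;
    typedef struct { float m[4][4]; } Mat4;

    /* Transform a vertex by a 4x4 matrix (column-vector convention). */
    static Vec4 transform(const Mat4 *m, Vec4 p) {
        Vec4 r;
        for (int i = 0; i < 4; ++i) {
            r.v[i] = 0.0f;
            for (int j = 0; j < 4; ++j)
                r.v[i] += m->m[i][j] * p.v[j];
        }
        return r;
    }

    /* Simple diffuse term for one directional light: max(0, N . L). */
    static float diffuse(const float n[3], const float l[3]) {
        float d = n[0]*l[0] + n[1]*l[1] + n[2]*l[2];
        return d > 0.0f ? d : 0.0f;
    }

    int main(void) {
        Mat4 identity = {{{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}}};
        Vec4 vertex   = {{1.0f, 2.0f, 3.0f, 1.0f}};
        float normal[3] = {0.0f, 0.0f, 1.0f};
        float light[3]  = {0.0f, 0.0f, 1.0f};   /* light pointing along +z */

        Vec4 clip = transform(&identity, vertex);
        printf("transformed: (%.1f, %.1f, %.1f, %.1f), diffuse = %.2f\n",
               clip.v[0], clip.v[1], clip.v[2], clip.v[3],
               diffuse(normal, light));
        return 0;
    }

Doing this for every vertex of every frame is what previously consumed CPU time; the GeForce 256 performed it in dedicated hardware.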
Due to the success of its products, Nvidia won the contract to develop the graphics hardware for Microsoft's Xbox game console, which earned Nvidia a $200 million advance. However, the project drew the time of many of Nvidia's best engineers away from other projects. In the short term this did not matter, and the GeForce2 GTS shipped in the summer of 2000.
The GTS benefited from the fact that Nvidia had by this time acquired extensive manufacturing experience with its highly integrated cores, and as a result it succeeded in optimizing the core for higher clock rates. The volume of chips produced by Nvidia also allowed the segregation of parts: Nvidia could pick out the highest-quality cores from the same batch as regular parts for its premium range. As a result, the GTS shipped at 200 MHz. Compared with the GeForce 256, pixel fillrate nearly doubled, and texel fillrate nearly quadrupled because multi-texturing was added to each pixel pipeline. New features included S3TC compression, FSAA, and improved MPEG-2 motion compensation.
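The fillrate arithmetic behind that comparison is simple: pixel fillrate is roughly the core clock multiplied by the number of pixel pipelines, and texel fillrate multiplies that again by the texture units per pipeline. The sketch below works through the figures under commonly cited configurations (GeForce 256: 120 MHz, four pipelines, one texture unit each; GeForce2 GTS: 200 MHz, four pipelines, two texture units each); the pipeline and texture-unit counts are assumptions taken from contemporary spec sheets, not from this article.

    /* Back-of-the-envelope fillrate arithmetic. The pipeline/TMU counts are
     * assumptions based on commonly cited spec sheets, not stated in the article. */
    #include <stdio.h>

    typedef struct {
        const char *name;
        double clock_mhz;    /* core clock */
        int pipelines;       /* pixel pipelines */
        int tmus_per_pipe;   /* texture units per pipeline */
    } Gpu;

    int main(void) {
        Gpu gpus[] = {
            {"GeForce 256",  120.0, 4, 1},
            {"GeForce2 GTS", 200.0, 4, 2},
        };
        for (int i = 0; i < 2; ++i) {
            double mpixels = gpus[i].clock_mhz * gpus[i].pipelines;
            double mtexels = mpixels * gpus[i].tmus_per_pipe;
            printf("%-13s ~%6.0f Mpixel/s, ~%6.0f Mtexel/s\n",
                   gpus[i].name, mpixels, mtexels);
        }
        /* GeForce 256:  ~480 Mpixel/s, ~480 Mtexel/s
           GeForce2 GTS: ~800 Mpixel/s, ~1600 Mtexel/s */
        return 0;
    }

Under those assumptions the GTS works out to roughly 800 Mpixel/s and 1600 Mtexel/s against roughly 480 Mpixel/s and 480 Mtexel/s for the GeForce 256, consistent with the "nearly doubled" and "nearly quadrupled" description above.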
In 2000 Nvidia shipped the GeForce2 MX, intended for the budget and OEM market. It had two fewer pixel pipelines and ran at 165 MHz (later at 250 MHz). Offering strong performance at a mid-range price, the GeForce2 MX became one of the most successful graphics chipsets. Nvidia also shipped a mobile derivative called the GeForce2 Go at the end of 2000.
Nvidia's success proved too much for 3dfx to recover its past market share. The long-delayed Voodoo 5, the successor to the Voodoo3, did not compare favorably with the GeForce2 in either price or performance, and failed to generate the sales needed to keep the company afloat. With 3dfx on the verge of bankruptcy near the end of 2000, Nvidia purchased most of 3dfx's intellectual property (in dispute at the time). Nvidia acquired anti-aliasing expertise and about 100 engineers, but not the company itself, which filed for bankruptcy in 2002.
Nvidia developed the GeForce3, which pioneered DirectX 8 vertex and pixel shaders, and eventually refined it with the GeForce4 Ti line. Nvidia announced the GeForce4 Ti, MX, and Go in January 2002, one of the largest releases in Nvidia's history. The chips in the Ti and Go series differed only in chip and memory clock rates. The MX series lacked the pixel and vertex shader functionalities; it derived from GeForce2 level hardware and assumed the GeForce2 MX's position in the value segment.

Stumbles with the FX series

At this point Nvidia dominated the GPU market. However, ATI Technologies remained competitive due to its new Radeon product, which had performance comparable to the GeForce2 GTS. Though ATI's answer to the GeForce3, the Radeon 8500, came later to market and initially suffered from issues with drivers, the 8500 proved a superior competitor due to its lower price. Nvidia countered ATI's offering with the GeForce4 Ti line. ATI concentrated efforts on its next-generation Radeon 9700 rather than on directly challenging the GeForce4 Ti.
During the development of the next-generation GeForce FX chips, many Nvidia engineers focused on the Xbox contract. Nvidia also had a contractual obligation to develop newer and more hack-resistant NV2A chips, and this requirement left even fewer engineers to work on the FX project. Since the Xbox contract did not anticipate or encompass falling manufacturing costs, Microsoft sought to re-negotiate the terms of the contract, and relations between Nvidia and Microsoft deteriorated as a result. The two companies later settled the dispute through arbitration without releasing the terms of the settlement to the public.
Following their dispute, Microsoft did not consult Nvidia during the development of the DirectX 9 specification, allowing ATI to establish much of the specification themselves. During this time, ATI limited rendering color support to 24-bit floating point, and emphasized shader performance. Microsoft also built the shader compiler using the Radeon 9700 as the base card. In contrast, Nvidia's cards offered 16- and 32-bit floating-point modes, offering either lower visual quality (as compared to the competition), or slower performance. The 32-bit support made them much more expensive to manufacture, requiring a higher transistor count. Shader performance often remained at half or less of the speed provided by ATI's competing products. Having made its reputation by designing easy-to-manufacture DirectX-compatible parts, Nvidia had misjudged Microsoft's next standard and paid a heavy price: As more and more games started to rely on DirectX 9 features, the poor shader performance of the GeForce FX series became more obvious. With the exception of the FX 5700 series (a late revision), the FX series did not compete well against ATI cards.
Nvidia released an "FX only" demo called "Dawn," but a hacked wrapper enabled it to run on a Radeon 9700, where it ran faster despite translation overhead. Nvidia began to use application detection to optimize its drivers. Hardware review sites published articles showing that Nvidia's driver auto-detected benchmarks and that it produced artificially inflated scores that did not relate to real-world performance. Often tips from ATI's driver development team lay behind these articles. While Nvidia did partially close the performance gap with new instruction-reordering capabilities introduced in later drivers, shader performance remained weak and over-sensitive to hardware-specific code compilation. Nvidia worked with Microsoft to release an updated DirectX compiler that generated code optimized for the GeForce FX.
Furthermore, GeForce FX devices also ran hot, because they drew as much as double the amount of power as equivalent parts from ATI. The GeForce FX 5800 Ultra became notorious for its fan noise, and acquired the nicknames "dustbuster" and "leafblower." Nvidia jokingly acknowledged these accusations with a video in which the marketing team compares the cards to a Harley-Davidson motorcycle. Although the quieter 5900 replaced the 5800 without fanfare, the FX chips still needed large and expensive fans, placing Nvidia's partners at a manufacturing cost disadvantage compared to ATI.
Seemingly as a culmination of these corporate-level events and the consequent weaknesses of the FX series, Nvidia ceded its market-leadership position to ATI.

GeForce 6 series and later


A former Nvidia logo, in use until 2006
With the GeForce 6 series Nvidia moved beyond the DX9 performance problems that had plagued the previous generation. The GeForce 6 series not only performed competitively against ATI's Direct3D parts, but also supported DirectX Shader Model 3.0, while ATI's competing X800 series chips only supported the previous 2.0 specification. This proved an insignificant advantage, mainly because games of that period did not employ Shader Model 3.0 extensions, but it demonstrated Nvidia's willingness to design and deliver the newest features on a specific timeframe. What became more apparent during this time was that the products of the two firms, ATI and Nvidia, offered equivalent performance. The two traded the performance lead in specific titles and specific criteria (resolution, image quality, anisotropic filtering/anti-aliasing), but the differences were becoming more abstract, and price/performance ratio became the reigning concern in comparisons. The mid-range offerings of the two firms demonstrated consumer appetite for affordable, high-performance graphics cards, and this price segment came to determine much of each firm's profitability.

The GeForce 6 series emerged at an interesting period: the game Doom 3 had just been released, and ATI's Radeon 9700 was found to struggle with the game's OpenGL performance. In 2004 the GeForce 6800 performed excellently, while the GeForce 6600GT became as important to Nvidia as the GeForce2 MX had been a few years previously: it enabled users to play Doom 3 at very high resolutions and graphical settings, which had been thought highly unlikely at its selling price. The GeForce 6 series also introduced SLI, similar in concept to the scan-line interleave technology that 3dfx had employed with the Voodoo2. A combination of SLI and other hardware performance gains returned Nvidia to market leadership.

Badge displayed on products certified by Nvidia to utilize SLI technology
The GeForce 7 series represented a heavily beefed-up extension of the reliable 6 series. The introduction of the PCI Express bus standard allowed Nvidia to release SLI (Scalable Link Interface), a solution that employs two similar cards to share the workload in rendering. While these solutions do not equate to double the performance, and require more electricity (two cards vis-à-vis one), they can make a huge difference as higher resolutions and settings are enabled and, more importantly, offer more upgrade flexibility. ATI responded with the X1000 series, and with a dual-rendering solution called "ATI CrossFire". Sony selected Nvidia to develop the "RSX" chip (a modified version of the 7800 GPU) used in the PlayStation 3.
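Conceptually, a multi-GPU scheme like SLI splits rendering work between the cards, for example by alternating whole frames between them or by dividing each frame into regions. The toy sketch below shows an alternate-frame split; it illustrates the general load-sharing idea only and is not Nvidia's driver logic.

    /* Toy illustration of alternate-frame load splitting between two GPUs.
     * This is the general idea only, not Nvidia's actual SLI driver logic. */
    #include <stdio.h>

    int main(void) {
        const int num_gpus = 2;
        for (int frame = 0; frame < 8; ++frame) {
            int gpu = frame % num_gpus;   /* even frames -> GPU 0, odd -> GPU 1 */
            printf("frame %d rendered on GPU %d\n", frame, gpu);
        }
        return 0;
    }

In practice neither alternate-frame nor split-frame schemes double performance, since work such as geometry processing and driver overhead is not perfectly divisible.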
Nvidia released a GeForce 8 series chip towards the end of 2006, making the 8 series the first to support Microsoft's next-generation DirectX 10 specification. The 8 series GPUs also featured the revolutionary Unified Shader Architecture, and Nvidia leveraged this to provide better support for General Purpose Computing on GPU (GPGPU). A new product line of "compute only" devices called Nvidia Tesla emerged from the G80 architecture, and subsequently Nvidia also became the market leader of this new field by introducing the world's first C programming language API for GPGPU, CUDA.
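CUDA exposes the unified shader hardware through C with a small set of extensions. The sketch below is a minimal vector-addition example written against the public CUDA runtime API; it is a generic introductory sample rather than Nvidia code, and error handling is omitted for brevity.

    /* Minimal CUDA C example in the style of the public CUDA runtime API:
     * add two vectors on the GPU. Generic introductory code, not Nvidia's. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <cuda_runtime.h>

    __global__ void vector_add(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   /* one thread per element */
        if (i < n)
            c[i] = a[i] + b[i];
    }

    int main(void) {
        const int n = 1 << 20;
        size_t bytes = n * sizeof(float);
        float *a = (float *)malloc(bytes), *b = (float *)malloc(bytes),
              *c = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

        float *da, *db, *dc;
        cudaMalloc((void **)&da, bytes);
        cudaMalloc((void **)&db, bytes);
        cudaMalloc((void **)&dc, bytes);
        cudaMemcpy(da, a, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, b, bytes, cudaMemcpyHostToDevice);

        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        vector_add<<<blocks, threads>>>(da, db, dc, n);
        cudaMemcpy(c, dc, bytes, cudaMemcpyDeviceToHost);

        printf("c[0] = %.1f\n", c[0]);   /* expect 3.0 */
        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(a); free(b); free(c);
        return 0;
    }

The same C-with-extensions style is what allowed the Tesla "compute only" products to be programmed without any graphics API at all.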
Nvidia released two models of the high-end 8 series (8800) chip: the 8800GTS (640 MB and 320 MB) and the 8800GTX (768 MB). Later, Nvidia released the 8800 Ultra (essentially an 8800GTX with a different cooler and higher clocks). All three of these cards derive from the 90 nm G80 core (with 681 million transistors). The GTS model had 96 stream processors and 20 ROPs, and the GTX/Ultra had 128 stream processors and 24 ROPs.
In early 2007 Nvidia released the 8800GTS 320 MB. This card resembles an 8800GTS 640 MB, but with 32 MB memory chips instead of 64 MB (the cards contained 10 memory chips).
In October 2007 Nvidia released the 8800GT. The 8800GT used the new 65 nm G92 GPU and had 112 stream processors. It contained 512 MB of VRAM and operated on a 256-bit bus. It had several fixes and new features that the previous 8800s lacked.
Later, in December 2007, Nvidia released the 8800GTS G92. It was effectively an 8800GT with higher clocks and all 128 stream processors of the G92 unlocked. Both the 8800GTS G92 and the 8800GT have full PCI Express 2.0 support.
In February 2008 Nvidia released a GeForce 9 series chip, which supports Microsoft's DirectX 10 specification, in response to ATI's release of the Radeon HD 3800 series. In March 2008, Nvidia released the GeForce 9800 GX2, which effectively packaged two GeForce 8800 GTS G92s working in an internal SLI configuration on a single card.
In June 2008 Nvidia released its new flagship GPUs: the GTX 280 and GTX 260. The cards use the same basic unified architecture deployed in the previous 8 and 9 series cards, but with an upgrade in power. Both cards are based on the GT200 GPU, which contains 1.4 billion transistors and is fabricated on a 65 nm process. The GTX 280 has 240 shaders (stream processors) and the GTX 260 has 192. The GTX 280 has 1 GB of GDDR3 VRAM on a 512-bit memory bus, while the GTX 260 has 896 MB of GDDR3 VRAM on a 448-bit memory bus (revised in September 2008 to include 216 shaders). The GTX 280 reportedly provides approximately 933 GFLOPS of floating-point throughput.
In January 2009 Nvidia released a 55 nm die shrink of GT200 called GT200b. Cards using this chip include an update to the GTX 280 (called the GTX 285, which reportedly provides 1062.72 GFLOPS of floating-point throughput), an update to the GTX 260 (still called the GTX 260) with 216 shaders, and a multi-chip card (the GTX 295) featuring two GT200b chips. Distinctively, each of the GTX 295's GPUs features the full 240 stream processors but only a 448-bit memory bus. The GTX 295 has 1.75 GB (1792 MB, 896 MB per GPU) of GDDR3 VRAM and reportedly provides approximately 1788.48 GFLOPS of floating-point throughput.
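Those theoretical figures follow from a simple product: shader count times shader clock times floating-point operations issued per shader per clock (three for this generation, counting a dual-issued multiply-add plus multiply). The sketch below reproduces the arithmetic; the shader clock speeds used (1296, 1476 and 1242 MHz) are assumptions taken from commonly published specifications rather than from this article.

    /* Theoretical single-precision GFLOPS = shaders x shader clock x FLOPs/clock.
     * Shader clocks are assumed from commonly published specs (not in the article);
     * 3 FLOPs/clock assumes a dual-issued multiply-add plus multiply. */
    #include <stdio.h>

    int main(void) {
        struct { const char *name; int shaders; double shader_clock_ghz; } cards[] = {
            {"GTX 280", 240, 1.296},
            {"GTX 285", 240, 1.476},
            {"GTX 295", 480, 1.242},   /* two GT200b GPUs, 240 shaders each */
        };
        const double flops_per_clock = 3.0;
        for (int i = 0; i < 3; ++i) {
            double gflops = cards[i].shaders * cards[i].shader_clock_ghz * flops_per_clock;
            printf("%s: ~%.2f GFLOPS\n", cards[i].name, gflops);
        }
        /* ~933.12, ~1062.72, ~1788.48 -- matching the figures quoted above. */
        return 0;
    }

Under those assumed clocks the arithmetic reproduces the quoted 933, 1062.72 and 1788.48 GFLOPS figures; real-world throughput is of course lower.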
March 2009 saw the release of the GTS 250 mainstream chip, based on a 55 nm die shrink of G92, called G92b. The GTS 250 derives from the 9800GTX+ (some cards consist of rebranded 9800GTX+s) and has 128 shaders (stream processors) with a 256-bit memory bus and 512 MB or 1 GB of GDDR3 VRAM.
On May 12, 2009, Nvidia released images of a revised edition of the GTX 295. This design, while similar to ATI's HD 4870 X2, differs from the original: the first production run of the GTX 295 consisted of two separate graphics boards sandwiched in the same casing and connected by an SLI ribbon cable, whereas the new design places both GPUs on a single PCB. The card retains the same specifications as the first production run, although speculation suggests it will sell for less due to the lower manufacturing costs of the more compact design.
Nvidia plans to launch the GeForce 300 series, based on the Fermi architecture; the launch has been postponed to March 2010.

Defective mobile video adapters

In July 2008, Nvidia noted increased rates of failure in certain mobile video adapters. In response, Dell and HP released BIOS updates for all affected notebook computers that turn on the cooling fan at lower temperatures than previously configured, in an effort to keep the defective video adapters from reaching high temperatures. Leigh Stark of APC Magazine has suggested that this may lead to premature failure of the cooling fan, and the workaround may only delay component failure until after the warranty expires.
At the end of August 2008, Nvidia reportedly issued a product change notification announcing plans to update the bump material of GeForce 8 and 9 series chips "to increase supply and enhance package robustness." In response to the possibility of defects in some Nvidia mobile video adapters, some notebook manufacturers have reportedly turned to ATI to provide graphics options on their new Montevina-platform notebook computers.
On August 18, 2008, according to the direct2dell.com blog, Dell began to offer a 12-month limited warranty "enhancement" specific to this issue on affected notebook computers worldwide.
On September 8, 2008, Nvidia made a deal with large OEMs, including Dell and HP, that Nvidia would pay $200 per affected notebook to manufacturers as compensation for the defects.
On October 9, 2008, Apple Inc. announced on a support page that some MacBook Pro notebook computers had exhibited faulty Nvidia GeForce 8600M GT graphics adapters. The manufacture of affected computers took place between approximately May 2007 and September 2008. Apple also stated that it would repair affected MacBook Pros within three years of the original purchase date free of charge and also offered refunds to customers who had paid for repairs related to this issue.



Source: Wikipedia
