There's a saying, "Ferrari red". The logo itself is red, a typical Ferrari is red just like ATI, and the colour symbolises and stands for quality, speed, reliability.
Meanwhile, all those East German junkers built from the 50s onwards came in a typically greenish colour, a shade much like the one in Skaj's signature. A symbol of socialism, of decay,...
So you could divide it by what you drive: ATI = Ferrari, nVidia = a socialist car from East Germany, Romania,... etc.
Just kidding.
Fudzilla reported that a dual-GPU version of Nvidia's upcoming GF100 should arrive in April, roughly one month after the single-GPU GF100 launch, if all goes well and A3 is the final stepping. But is such a mammoth product, a dual-GF100, actually feasible?
Earlier rumours pointed to a TDP of 180-200W for the single-GPU GF100, but the recent demonstration at CES 2010 showed an 8-pin + 6-pin PCI-e power configuration, suggesting a TDP of somewhere between 225W and 300W. Since then, SemiAccurate has reported a rumoured actual TDP of 280W.
We must keep in mind that the silicon demonstrated at CES 2010 is almost certainly not the final stepping. Even so, barring a major miracle, the power draw of GF100 is likely to remain north of 225W. This kind of power draw is no surprise and is exactly what we would expect from a 3 billion transistor, 500+ mm2 monster GPU. Unfortunately, it means that a dual-GPU GF100 will never fit under a 300W TDP, no matter how aggressive the downclocking and binning. Technically, Nvidia can break the 300W barrier, but that would mean violating PCI-e regulations and forgoing official PCI-e compliance.
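Just to make the arithmetic behind that 300W ceiling explicit, here is a minimal sketch. The per-connector wattages (75W from the slot, 75W from a 6-pin, 150W from an 8-pin) come from the PCI-e specification; the 280W figure is only the rumour quoted above, and the dual-GPU split is purely illustrative.

```python
# Minimal sketch of the PCI-e power-budget reasoning above.
# Connector limits are from the PCI-e spec; TDP numbers are just the rumours quoted in the text.

SLOT_W = 75        # power available from the PCI-e x16 slot itself
SIX_PIN_W = 75     # 6-pin auxiliary connector
EIGHT_PIN_W = 150  # 8-pin auxiliary connector

def board_limit(aux_connectors):
    """Maximum compliant board power for a given set of auxiliary connectors."""
    return SLOT_W + sum(aux_connectors)

# Single GF100 as shown at CES 2010: 6-pin + 8-pin -> 300W ceiling,
# consistent with a TDP rumoured between 225W and 280W.
single_limit = board_limit([SIX_PIN_W, EIGHT_PIN_W])

# A hypothetical dual-GF100 board would have to squeeze two GPUs into the
# same 300W envelope, i.e. well under 150W per GPU after downclocking and
# binning -- far below the ~280W rumoured for a single chip.
rumoured_single_tdp = 280
print(f"Compliant board limit:        {single_limit} W")
print(f"Two GPUs at rumoured TDP:     {2 * rumoured_single_tdp} W (non-compliant)")
```

Two full-TDP chips would land at roughly 560W, so even a drastic downclock would struggle to halve that while staying competitive, which is the article's point.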
In essence, this is GT200 all over again. It was the last Nvidia single-GPU card to carry a 6-pin + 8-pin configuration, and as history shows, Nvidia was unable to make a dual-GPU iteration until the GT200b die shrink, simply because the single GPU ran too hot to begin with.
What this means is that GF100's direct competitor is likely to be the ATI Radeon HD 5970, not the single-GPU Radeon HD 5870, despite rumours suggesting the existence of a dual-GF100 product. This once again recalls the GTX 280 vs. HD 4870 X2 battle. However, GF100, albeit delayed, looks more promising than the GTX 280 did. The question is, can it beat two of ATI's Cypress chips?
We've learned that the six-month delay of the Geforce version of GF100 Fermi chips and cards won't actually hinder the introduction of other, slower versions of the Fermi card.
The whole point of making the insanely big Fermi is to win the hearts of journalists and enthusiasts and to bring this architecture to the performance, mainstream and, finally, entry-level markets.
Entry-level is where the money is, even though you need to sell millions of these cards to make a decent profit. Mainstream Fermi is a project that Nvidia is pushing side by side with high-end Fermi, and we've learned that mainstream Fermi should be on time.
You should roughly expect it one quarter after the first Fermi, and if all goes well June should be a good month to launch. Since Nvidia changed quite a lot of plans in the last few months, don’t be surprised if it launches a bit later.
It looks like the TDP of a single Fermi will be between 225 and 300 W. The Fudzilla guys are asking how it is even technically possible to build a dual Fermi (an 8-pin + 6-pin PCI-e power configuration suggests a TDP of somewhere between 225W and 300W). And it looks like the market launch won't happen until June. Nvidia has really screwed up with these delays.