Author Topic: Gigabyte's single-card SLI  (Read 2602 times)

0 Members and 1 Guest are viewing this topic.

Offline Darius Greywind

  • Sr. Member
  • ****
  • Male
  • Posts: 456
    • www.svwdhq.us
Gigabyte's single-card SLI
« on: February 13, 2005, 12:08:17 am »
Gigabyte has released a single PCI-Express video card with two GeForce 6600 GPUs in an SLI configuration. So far, it's only available as a bundle with their SLI motherboard, apparently because the card won't function at all in non-SLI-capable boards, even as a single GPU.

So far, the benchmarks are mixed. Games that can take good advantage of SLI run much better with dual 6600 cores than with a single, more powerful one (GeForce 6800, Radeon X800). Games that don't, well, run about as well as on a single-core 6600.

The question is, will nVidia improve their poorly written drivers so this card can be used on boards without explicit SLI support? (SLI boards are significantly more expensive than non-SLI boards.)

Should be interesting to see where this technology goes, if anywhere.

Offline Cesarin

  • Hero Member
  • Famous Huggable & Artistic Dragon
  • *****
  • Male
  • Posts: 1852
    • http://www.cesarin.furtopia.org
Gigabyte's single-card SLI
« Reply #1 on: February 13, 2005, 12:20:45 am »
I think they'll trash the dual-core 6600 until they improve their engine.
If you look at the moment, they're imitating all the technology 3dfx used:
first the SLI mode, which linked two Voodoo 1s or Voodoo Banshees together...

The Voodoo 2 was a new card; then came the Voodoo 2 Banshee, which was similar to the dual-core 6600 (the Voodoo 2 Banshee had 2x Voodoo 2 VPUs).
The Voodoo 2 Banshee wasn't very good, because it didn't have 2D acceleration and it didn't sell very well, but they improved the technology in the Voodoo 3 (which, if I remember correctly, had 2x Voodoo 2 VPUs and a 2D acceleration chip, and it was pretty popular).

My recommendation is to wait until they fix their errors. Nvidia lately has been making so many "technology jumps" that they can't see their own errors (like the slow card and monster VPU in the 5500 series...).


Myself, I've been an ATI fan since I trashed my GeForce 2 MX 400, which gave me more bugs than good playing... and I'm happily sticking with a Radeon 9800 Pro.

---------
And if I remember correctly, Nvidia has nothing to do with the dual-core project at the moment, since it's Gigabyte's product (i.e., until Nvidia officially releases a dual-core video card of their own, I don't think they'll make good drivers for it), so you have to use Gigabyte's drivers.
That's my 2 cents.
End of edit.
----------




-= Cesarin The White DraGoN=-
My Webpage: http://www.cesarin.furtopia.org

Offline Darius Greywind

  • Sr. Member
  • ****
  • Male
  • Posts: 456
    • www.svwdhq.us
Gigabyte's single-card SLI
« Reply #2 on: February 13, 2005, 12:30:40 am »
nVidia doesn't have anything to do with putting both cores on one card, true. However, they're the reason SLI works at all. The sort of SLI you remember from the Voodoo days isn't at all what I'm talking about here.

nVidia's SLI is designed to dynamically load-balance alternate frames or partial frames between two rendering engines, meaning that games can take advantage of it even if it's not a 100% improvement. nVidia's drivers are what Gigabyte has to use for their single-card implementation. The advantage is, you don't need to shell out for two GF6600s, nor do you have to deal with the heat they generate or the space they take up.
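To make the load-balancing idea concrete, here's a toy sketch of the alternate-frame mode in Python. The names are mine, purely for illustration; they have nothing to do with nVidia's actual driver internals.

```python
# Toy model of alternate-frame rendering (AFR), one of SLI's
# load-balancing modes: the driver hands whole frames to the
# GPUs in turn. Purely illustrative, not nVidia's real API.
def dispatch_frames(frame_indices, gpu_count=2):
    """Map each frame index to a GPU, round-robin (AFR style)."""
    return {frame: frame % gpu_count for frame in frame_indices}

schedule = dispatch_frames(range(6))
# GPU 0 gets frames 0, 2, 4; GPU 1 gets frames 1, 3, 5
```

The split-frame mode works similarly, except each GPU renders a portion of every frame instead of alternating whole frames.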

The problem is, nVidia's drivers are hardcoded to only allow SLI on SLI-designed boards with two x16 PCI-Express slots.

Still, like I said, some games won't take advantage of SLI at all, so it's a benefit only if you know which games you intend to play on the setup.

Offline Cesarin

  • Hero Member
  • Famous Huggable & Artistic Dragon
  • *****
  • Male
  • Posts: 1852
    • http://www.cesarin.furtopia.org
Gigabyte's single-card SLI
« Reply #3 on: February 13, 2005, 01:57:06 am »
Quote (Darius Greywind @ Feb. 12 2005, 11:30 pm)
nVidia's SLI is designed to dynamically load-balance alternate frames or partial frames between two rendering engines, meaning that games can take advantage of it even if it's not a 100% improvement. nVidia's drivers are what Gigabyte has to use for their single-card implementation. The advantage is, you don't need to shell out for two GF6600s, nor do you have to deal with the heat they generate or the space they take up.

The Voodoo Banshee had an SLI design for plugging two PCI cards together; that's what I'm trying to say. What they're doing at the moment was already designed by 3dfx.

Gigabyte is just using the Voodoo 2 Banshee technology to put two cores on the same card
(that's what I was trying to explain),
while Nvidia is using the Voodoo Banshee SLI tech for their SLI versions (which use a connector to join two cards instead of putting two cores on the same card).

And I'm not sure what you mean about Nvidia having to make Gigabyte's drivers.
All manufacturers usually design their own, though most of the time they're slower to appear than the VPU company's official drivers.

Like Hercules: the Hercules 9800 and 9700 had a different memory structure, so they needed a special driver to work; the first Catalyst release didn't work on those Hercules versions.
So I don't have a clue why Gigabyte has to wait until Nvidia delivers drivers to them...
-= Cesarin The White DraGoN=-
My Webpage: http://www.cesarin.furtopia.org

Offline Darius Greywind

  • Sr. Member
  • ****
  • Male
  • Posts: 456
    • www.svwdhq.us
Gigabyte's single-card SLI
« Reply #4 on: February 13, 2005, 02:05:21 am »
nVidia develops all the drivers for their chips now; Gigabyte can only use the ones they supply. They're paranoid about giving out any sort of detail on how their chips work these days. This is annoying if you're using Linux, since nVidia's drivers don't work properly with most Linux installs.

There's no actual difference in putting both cores on one card, as far as the system is concerned. The link is exactly the same as using their cable to connect two regular 6600s.

Offline Cesarin

  • Hero Member
  • Famous Huggable & Artistic Dragon
  • *****
  • Male
  • Posts: 1852
    • http://www.cesarin.furtopia.org
Gigabyte's single-card SLI
« Reply #5 on: February 13, 2005, 01:27:00 pm »
Quote (Darius Greywind @ Feb. 13 2005, 1:05 am)
nVidia develops all the drivers for their chips now; Gigabyte can only use the ones they supply. They're paranoid about giving out any sort of detail on how their chips work these days. This is annoying if you're using Linux, since nVidia's drivers don't work properly with most Linux installs.

There's no actual difference in putting both cores on one card, as far as the system is concerned. The link is exactly the same as using their cable to connect two regular 6600s.

Not at all.
If you have two 6600 cards together, they have, let's say, 256 MB each,
so using both you have 512 MB,

while it's harder to fit 512 MB on a single card.
Also the distances are different...

The voltage is different too:
one uses a single PCI Express slot to deliver the data to the CPU,
while the other uses two PCI Express slots.
Same with power: if it has two VPUs on the same card, that means it needs a monster power supply cable (three cables at least? Remember, 6800s need two cables each).

Not to mention how hot the dual-VPU card will get... o_O

And I seriously agree with you on the Nvidia thing...
Keeping the drivers hidden won't help them; they're seriously making the same mistakes as 3dfx.
Who needs a monster chip that eats an excessive amount of power, is noisy, is HUGE, takes up a lot of space in your computer, and has at most 15% more horsepower than the competitor's flagship (if not way less)?

I prefer efficiency over top-end "cute alien tech".
-= Cesarin The White DraGoN=-
My Webpage: http://www.cesarin.furtopia.org

Offline Darius Greywind

  • Sr. Member
  • ****
  • Male
  • Posts: 456
    • www.svwdhq.us
Gigabyte's single-card SLI
« Reply #6 on: February 13, 2005, 08:09:53 pm »
Actually...

nVidia's SLI only uses the frame buffer of one card, so two 128 MB 6600s means only 128 MB of frame buffer. And that doesn't change with Gigabyte's card either. Yes, wasteful, isn't it?
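The mirroring is easy to put in numbers; a quick sketch of my own (not any real tool):

```python
# In SLI, each GPU holds its own full copy of the frame data, so the
# usable memory is one card's worth, not the sum. Illustration only.
def sli_memory_mb(per_card_mb, card_count=2):
    installed = per_card_mb * card_count  # what you paid for
    usable = per_card_mb                  # what games actually see
    return installed, usable

installed, usable = sli_memory_mb(128)
# installed == 256, usable == 128: half the memory mirrors the other half
```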

And two cards in SLI use one x8 PCI-E channel each, while Gigabyte's card uses x16 from one slot in total. PCI-E can split up its lanes like that, which is cool in some ways.
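For rough numbers: PCIe 1.x carries about 250 MB/s per lane per direction (the commonly quoted figure), so the total lane bandwidth works out the same either way:

```python
# Approximate PCIe 1.x bandwidth per lane, per direction, in MB/s.
LANE_MBPS = 250

def link_bandwidth_mbps(lanes):
    """Peak one-direction bandwidth of a PCI-E link with this many lanes."""
    return lanes * LANE_MBPS

one_slot_x16 = link_bandwidth_mbps(16)     # single dual-core card
two_slots_x8 = 2 * link_bandwidth_mbps(8)  # two cards in SLI, x8 each
# Both come to 4000 MB/s of total lane bandwidth
```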

Offline Cesarin

  • Hero Member
  • Famous Huggable & Artistic Dragon
  • *****
  • Male
  • Posts: 1852
    • http://www.cesarin.furtopia.org
Gigabyte's single-card SLI
« Reply #7 on: February 13, 2005, 11:12:25 pm »
Quote (Darius Greywind @ Feb. 13 2005, 7:09 pm)
Actually...

nVidia's SLI only uses the frame buffer of one card, so two 128 MB 6600s means only 128 MB of frame buffer. And that doesn't change with Gigabyte's card either. Yes, wasteful, isn't it?

And two cards in SLI use one x8 PCI-E channel each, while Gigabyte's card uses x16 from one slot in total. PCI-E can split up its lanes like that, which is cool in some ways.

Boo, then they're not fully exploiting the total power of both cards or both VPUs... o_O

Now I know why two 6800 cards in SLI didn't give almost 2x performance in games, but only about 80% more horsepower.
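That "80% more" figure corresponds to roughly 90% scaling efficiency per GPU; a quick sketch of the arithmetic, using hypothetical frame rates:

```python
# Speedup and per-GPU scaling efficiency for a multi-GPU setup.
def sli_scaling(fps_single, fps_sli, gpu_count=2):
    speedup = fps_sli / fps_single
    efficiency = speedup / gpu_count
    return speedup, efficiency

speedup, efficiency = sli_scaling(50.0, 90.0)  # hypothetical numbers
# speedup == 1.8 (the "80% more horsepower" case), efficiency == 0.9
```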
-= Cesarin The White DraGoN=-
My Webpage: http://www.cesarin.furtopia.org