A box arrived a few days ago at work.
The box contained a little supercomputer comprising 8 GeForce Titan GPUs in a Tyan FT77A platform.
The system fits nicely in our server room.
And the 8 GPUs light up too!
It would be great if you could post some rendering benchmarks on V-Ray or anything that utilizes multiple GPUs. Thanks.
Hey, this machine looks awesome. Any stability issues when running all 8 GPUs at the same time?
Also, is there any room for additional PCIe cards when you have 8 GPUs installed? It looks like the other PCIe slots are obstructed by the GPUs…
I don’t have any performance measurements that I can share right now. Compared to an 8 GPU system with PCIe 2.0 and GTX 580 it is significantly faster. No quantum leap but faster.
If 8 GPUs are installed there is no room for an additional PCIe card. You can remove one or two cards and put your PCIe card (Infiniband?) in there.
Hi, impressive setup! I’ve been looking for a GPU expander that uses Gen3 with 8 double-width GPU slots. Did you install the Titans yourself, or did the company you purchased it from install them? Can you make a recommendation?
Companies usually have to be convinced to sell Titans for such systems; they will give you numerous reasons why it is a bad idea. They will cite warranty, ECC, double precision, etc. But the fact of the matter is, consumer-grade cards are faster due to their less conservative clock settings. And they are cheaper.
I would not recommend them for the 24/7 simulation workloads common in HPC, but for applications with real-time constraints they are quite useful.
Shoot me an e-mail if you want a recommendation about where to buy them (in Germany).
Just curious, what OS are you using on it? I have the same chassis with 8x GTX 780 cards, and I can only get 7 to work with Windows 7 or 8.1 – the 8th card always comes up with error 43.
I’ve heard that before. On our system all 8 GPUs are working. We’re running Ubuntu 12.04, newest drivers, CUDA 5.5.
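A quick way to confirm that all eight devices are actually visible to CUDA (and not just to the driver) is a minimal device-enumeration program. This is just a sketch against the standard CUDA runtime API, compiled with nvcc; it is not the tool we used, but any CUDA 5.5-era toolkit should build it:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        fprintf(stderr, "cudaGetDeviceCount failed: %s\n",
                cudaGetErrorString(err));
        return 1;
    }
    // On this system we expect count == 8.
    printf("CUDA sees %d device(s)\n", count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("  %d: %s, %zu MiB, PCI bus %02x device %02x\n",
               i, prop.name, prop.totalGlobalMem >> 20,
               prop.pciBusID, prop.pciDeviceID);
    }
    return 0;
}
```

Checking the PCI bus IDs in the output is also a cheap way to verify which CPU's root complex each card hangs off in a dual-socket box like the FT77A.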
Awesome rig you have there, dude! Very envious.
Your email isn’t obviously available on your site/blog (intentional, probably) – would you mind emailing me where you purchased this unit from?
@Boris, if you ever come across a solution to your problem please share it; I can see myself running into this hurdle too.
I sent you an e-mail. You can reach me at firstname.lastname@example.org.
Hi Boris, did you solve your problems? I suggest you replace one of your GTX780s with a Titan or GTX780 Ti and see if it does the trick.
BTW, why not use Linux? Linux is able to handle more GPUs. Recently I have built a 16-GPU rig with NVIDIA cards, and it just works fine under Linux.
A 16-card rig sounds intriguing. Can you go into some detail about that system?
I agree about Linux. It is well supported and all the tools are readily available.
Hi Sebastian Schaetz,
Yes, and I think building a rig with such a large number of GPUs is very challenging and interesting.
Note that my rig has only 11 cards, not 16. But five of them are dual-GPU cards, so the total number of GPUs is exactly 16.
The motherboard I used is a Supermicro X9DRX+-F, which has 11 PCIe x8 slots (but no x16 slots). PCIe extenders are required to connect the cards to the board properly.
The reason I used only 5 dual-GPU cards is simply that I don’t have more of them at hand. Two more dual-GPU cards were ordered today, and I’m curious to see whether 17 or 18 GPUs on a single system will work.
To save money, I only used some cheap cards to build the rig. The 11 cards are 6x GTX 660 Ti and 5x GTX 295. deviceQuery and nvidia-smi information for the rig can be found here:
Thanks for the detailed info about your setup. Sounds awesome! Looking forward to hearing more about your experiences with even more cards! Might I ask what you’re doing with your system? Ours is used for medical imaging – we reconstruct images with the 8-GPU rig in (almost) real-time.
Hi, my server is used for CUDA software development, but putting so many GPU cards in it was mainly just for fun. I was curious whether there really is a limit of 16 GPUs.
And now, the answer is out there. My rig has been upgraded to 18 GPUs and it just works fine!
For more detailed info, see:
Impressive zzz1000! To keep in the spirit of this post, can you share some pictures maybe?
Great blog/post. Would you guys tell me something about the power consumption and stability of such a system? Could it be used in a cluster, up 24/7? How long do the cards survive?
We have a cluster. The vendor installed some GPUs, and it regularly presented problems. We had to call the vendor several times to fix the installation. So I guess a proper, robust installation of GPUs is not that simple.
The power consumption of this system is not higher than 2kW when all 8 GPUs are running. We had some stability problems in the beginning, but they seem to be mostly gone at the moment. Like you, we had some intensive interaction with the vendor over replacements, software updates (Nvidia driver + BIOS) and configuration settings. All in all not a nice experience. But in my experience this stabilizes after some time, and the systems then run reliably for about 2 years or more.
I can, however, not attest to how stable they are in a 24/7 setting. Our systems run 24/7, but computation on the GPUs is only done in small bursts throughout the day. There are, however, some computing centers that run multi-GPU systems (not 8 but 2 or 3 per node) with consumer-grade cards quite successfully. Consumer cards probably cause more problems than professional cards, but the professional cards are also a lot more expensive.
What kind of system are you using? And what for?
I am having a very similar issue with 8 x GTX 980 Tis. Did you ever get it to work with 8 cards in Windows 7 x64?
It would be very interesting to know if you got it working and how?
PS: I also sent a PM to Sebastian.
I understand it may be late to ask, but I’ll try anyway :-)
Could you please explain how you deal with the embedded graphics? Do you disable it, or, on the contrary, keep it enabled and use it as a normal card a display can be connected to, thus leaving all 8 GPUs solely for computation?
TYAN vs SuperMicro. So this TYAN system has a dual PCIe root complex, i.e. four GPUs under CPU 1 and four GPUs under CPU 2. How do you like the TYAN platform compared with something like a SuperMicro 4208GR-TRT2? At Lambda Labs, we use the SuperMicro platform but are considering switching. It’s a bit unfortunate that the TYAN doesn’t have a single root complex. (Accomplished with two PEX 8696 PLX Switches.)