Monster GPU Cruncher Build

Profile zoom3+1=4
Volunteer tester
Avatar

Send message
Joined: 30 Nov 03
Posts: 65867
Credit: 55,293,173
RAC: 49
United States
Message 1671360 - Posted: 29 Apr 2015, 5:48:15 UTC
Last modified: 29 Apr 2015, 5:53:14 UTC

How is this? It's the RIF6 and it's for sale on Amazon. The cord is 3' (91.44cm) long.

The T1 Trust, PRR T1 Class 4-4-4-4 #5550, 1 of America's First HST's
ID: 1671360 · Report as offensive
woohoo
Volunteer tester

Send message
Joined: 30 Oct 13
Posts: 972
Credit: 165,671,404
RAC: 5
United States
Message 1671371 - Posted: 29 Apr 2015, 6:11:41 UTC

Wow, that's something I didn't think of, and it's cheap. The powered riser that would piggyback wouldn't need to be long, because the 50cm cable takes care of that; I would only be using the powered riser to make up for the power deficit. My EVGA 1600 T2 has a bunch of unused jacks on it, and I have a good number of extra Molex cables of decent length.

So right now the case I'm using is an open-air Aerocool Strike-X Air. The box was so big it looked like I was buying a lawn mower, and I had a rough time getting it into my car. It takes up a lot of desk space. I'm not using the optical drive cage; I have power supply cables coming through that hole, and I have a cheap hard drive on the metal plate. I took out the hard drive cage because the power supply is so deep that the cables would hit the back of the cage, so I have power supply cables coming out of the hole where the hard drive cage was supposed to go. I'm not using the top fan because I want to be able to see the interior. The hoses for the 295x2 radiator aren't very long, so I mounted the radiator on the back rail. The rail only has a few holes drilled in it and I'm not about to drill any more, so there's nowhere to mount the Corsair H100i; it's just sitting on the desk. I suppose the easiest thing to do would be to throw a fourth GPU where the optical drive rack would have been.
ID: 1671371 · Report as offensive
woohoo
Volunteer tester

Send message
Joined: 30 Oct 13
Posts: 972
Credit: 165,671,404
RAC: 5
United States
Message 1671373 - Posted: 29 Apr 2015, 6:20:03 UTC

That x1 riser is similar to what I already have but I'm pretty certain there would be a significant performance hit on some projects if I used that.

I would rather use my existing 3M 50cm 164-pin ribbon cable assembly (not powered) along with a cheap, short x16-to-x16 powered riser attached to it. I don't know if it would work, but it doesn't cost an arm and a leg.
ID: 1671373 · Report as offensive
Profile zoom3+1=4
Volunteer tester
Avatar

Send message
Joined: 30 Nov 03
Posts: 65867
Credit: 55,293,173
RAC: 49
United States
Message 1671375 - Posted: 29 Apr 2015, 6:23:23 UTC
Last modified: 29 Apr 2015, 6:24:51 UTC

The RIF6 looks like it uses a USB 3.0 cable; a shorter one could be bought if desired. At $3.99 or a bit higher on eBay, the RIF6 doesn't sound like much of a gamble.
The T1 Trust, PRR T1 Class 4-4-4-4 #5550, 1 of America's First HST's
ID: 1671375 · Report as offensive
woohoo
Volunteer tester

Send message
Joined: 30 Oct 13
Posts: 972
Credit: 165,671,404
RAC: 5
United States
Message 1671376 - Posted: 29 Apr 2015, 6:29:16 UTC

The length is not the issue. The problem is that what you have pictured is an x1 to x16 cable and x1 bandwidth would be a bottleneck on some projects. For the same money from the same vendor I could get an x16 to x16 powered riser.
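
For a rough sense of the numbers, here is a back-of-the-envelope sketch using the nominal per-lane PCIe rates; real-world throughput is lower because of protocol overhead, and the exact impact on any given project is not something this shows.

# Nominal per-direction PCIe bandwidth per lane, after line coding
# (PCIe 1.x/2.0 use 8b/10b encoding, PCIe 3.0 uses 128b/130b).
PER_LANE_GBS = {"1.1": 0.25, "2.0": 0.5, "3.0": 0.985}  # GB/s per lane

for gen, per_lane in PER_LANE_GBS.items():
    for lanes in (1, 4, 8, 16):
        print(f"PCIe {gen} x{lanes}: ~{per_lane * lanes:.2f} GB/s per direction")

So an x1 link at PCIe 2.0 tops out around 0.5 GB/s versus roughly 8 GB/s for x16, which is why projects that move a lot of data over the bus can feel the difference while others don't.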
ID: 1671376 · Report as offensive
Profile zoom3+1=4
Volunteer tester
Avatar

Send message
Joined: 30 Nov 03
Posts: 65867
Credit: 55,293,173
RAC: 49
United States
Message 1671378 - Posted: 29 Apr 2015, 6:31:02 UTC
Last modified: 29 Apr 2015, 6:34:07 UTC

Then there is this one. Here the cable is supposed to be 2' (60cm) long and costs $5.49.

The T1 Trust, PRR T1 Class 4-4-4-4 #5550, 1 of America's First HST's
ID: 1671378 · Report as offensive
Profile zoom3+1=4
Volunteer tester
Avatar

Send message
Joined: 30 Nov 03
Posts: 65867
Credit: 55,293,173
RAC: 49
United States
Message 1671380 - Posted: 29 Apr 2015, 6:42:40 UTC - in response to Message 1671376.  

The length is not the issue. The problem is that what you have pictured is an x1 to x16 cable and x1 bandwidth would be a bottleneck on some projects. For the same money from the same vendor I could get an x16 to x16 powered riser.

It was only a suggestion; I just do SETI@home.
The T1 Trust, PRR T1 Class 4-4-4-4 #5550, 1 of America's First HST's
ID: 1671380 · Report as offensive
Profile Wiggo
Avatar

Send message
Joined: 24 Jan 00
Posts: 35200
Credit: 261,360,520
RAC: 489
Australia
Message 1671381 - Posted: 29 Apr 2015, 6:51:21 UTC - in response to Message 1671376.  

The length is not the issue. The problem is that what you have pictured is an x1 to x16 cable and x1 bandwidth would be a bottleneck on some projects. For the same money from the same vendor I could get an x16 to x16 powered riser.

Here at SETI@home I've found no difference between x4, x8 or x16 (at PCIe 2.0 speeds, let alone 3.0), but yes, some projects do take a hit at narrower bandwidths (though those are only backup projects for me, so of no real consequence in the long run). ;-)

Cheers.
ID: 1671381 · Report as offensive
KLiK
Volunteer tester

Send message
Joined: 31 Mar 14
Posts: 1304
Credit: 22,994,597
RAC: 60
Croatia
Message 1671414 - Posted: 29 Apr 2015, 9:51:59 UTC

If the GPUs end up only 1mm apart, why not go this way:

water cooling! ;)

Also, you can cool the CPU with water...


About dual PSUs... I once ran a dual-PSU setup on an old BP6 by messing with the wires. The old 2x300W just wasn't enough for 7x HDD, a DVD-RAM drive, a motherboard with 2x CPU, maxed-out RAM, a Kyro2 GPU with 2x Voodoo2, an SB32, a network card, a USB extender, etc., and I couldn't find anything stronger on the market!
Today, you have these:

;)


non-profit org. Play4Life in Zagreb, Croatia, EU
ID: 1671414 · Report as offensive
Profile Brent Norman Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester

Send message
Joined: 1 Dec 99
Posts: 2786
Credit: 685,657,289
RAC: 835
Canada
Message 1671417 - Posted: 29 Apr 2015, 10:03:18 UTC

I notice one thing in those pics which may interest Zalster ... Notice the spacer blocks between the cards ...
ID: 1671417 · Report as offensive
woohoo
Volunteer tester

Send message
Joined: 30 Oct 13
Posts: 972
Credit: 165,671,404
RAC: 5
United States
Message 1671424 - Posted: 29 Apr 2015, 10:16:07 UTC

I would prefer the trivial cost of making a mess of my desk with USB-powered risers versus the time and money required to create a custom water loop, where I would need to convert five video cards to use water blocks, plus the additional pumps, fittings, reservoirs, tubing, radiators, etc.
ID: 1671424 · Report as offensive
woohoo
Volunteer tester

Send message
Joined: 30 Oct 13
Posts: 972
Credit: 165,671,404
RAC: 5
United States
Message 1671425 - Posted: 29 Apr 2015, 10:17:12 UTC

Those spacer blocks are actually water cooling fittings.
ID: 1671425 · Report as offensive
Profile zoom3+1=4
Volunteer tester
Avatar

Send message
Joined: 30 Nov 03
Posts: 65867
Credit: 55,293,173
RAC: 49
United States
Message 1671465 - Posted: 29 Apr 2015, 13:10:00 UTC - in response to Message 1671414.  

Nice pics Klik, I have this to connect My Corsair 950w psu and My Tt 650w video card psu together.

The T1 Trust, PRR T1 Class 4-4-4-4 #5550, 1 of America's First HST's
ID: 1671465 · Report as offensive
KLiK
Volunteer tester

Send message
Joined: 31 Mar 14
Posts: 1304
Credit: 22,994,597
RAC: 60
Croatia
Message 1671482 - Posted: 29 Apr 2015, 14:38:25 UTC - in response to Message 1671465.  
Last modified: 29 Apr 2015, 14:39:06 UTC

Nice pics Klik, I have this to connect My Corsair 950w psu and My Tt 650w video card psu together.
http://i110.photobucket.com/albums/n107/JokerCPoC/Asus%20Rampage%20III%20Extreme/HAF-X2ndPsu%20001_zpsv5w7s1pv.jpg

Well, that cable I put in the last pic only "turns the PSU on and off"...

You connect your 2nd PSU to your card with the appropriate cables... especially if you have a "modular PSU" - then you can combine!
Or, if you don't have the appropriate cables on the PSU, there are cables like:



non-profit org. Play4Life in Zagreb, Croatia, EU
ID: 1671482 · Report as offensive
Profile zoom3+1=4
Volunteer tester
Avatar

Send message
Joined: 30 Nov 03
Posts: 65867
Credit: 55,293,173
RAC: 49
United States
Message 1671492 - Posted: 29 Apr 2015, 14:53:39 UTC - in response to Message 1671482.  
Last modified: 29 Apr 2015, 14:58:54 UTC

Nice pics Klik, I have this to connect My Corsair 950w psu and My Tt 650w video card psu together.
http://i110.photobucket.com/albums/n107/JokerCPoC/Asus%20Rampage%20III%20Extreme/HAF-X2ndPsu%20001_zpsv5w7s1pv.jpg

Well, that cable I put in the last pic only "turns the PSU on and off"...

You connect your 2nd PSU to your card with the appropriate cables... especially if you have a "modular PSU" - then you can combine!
Or, if you don't have the appropriate cables on the PSU, there are cables like:
http://www.jmt.bg/images/products/24871_55043.jpg

This Thermaltake W0158RU PSU (see pic) only uses the green and black wires (the ATX PS_ON signal and ground), so my solution keeps me from having to put the main 24-pin cable onto another adapter cable sitting between the PSU and the motherboard.

I've had a PSU fry its main cable on two pins, and the cable needs repairs before the PSU will be usable again. Since the Enermax Revolution85+ 1050W PSU is not fully modular, I can't just swap in a new cable; if it were fully modular, I'd replace it. The lines that got damaged are the two 12V lines.

The T1 Trust, PRR T1 Class 4-4-4-4 #5550, 1 of America's First HST's
ID: 1671492 · Report as offensive
Profile Sutaru Tsureku
Volunteer tester

Send message
Joined: 6 Apr 07
Posts: 7105
Credit: 147,663,825
RAC: 5
Germany
Message 1671596 - Posted: 29 Apr 2015, 19:24:32 UTC
Last modified: 29 Apr 2015, 19:46:55 UTC

Please don't use BBCode's [img] tag to show pictures.

There are also dial-up (56k) users around.
Loading the forum/threads would take them days. ;-)

Not all members know that they can edit their 'com prefs' and uncheck 'Show images as links'.

Please use BBCode's [url] or [url=http://google.com/*]link to website[/url*] instead, so that everyone can decide whether to look or not, and so that 56k users who reload a thread (after closing the browser) don't need to download already-seen pictures again.
[I used * to disable the tag]

It's just a friendly note. :-)

Thanks.

BTW, I have only DSL 2000 RAM (more isn't possible here where I live) and sometimes loading the forum/threads takes very long. ;-)
ID: 1671596 · Report as offensive
Profile Sutaru Tsureku
Volunteer tester

Send message
Joined: 6 Apr 07
Posts: 7105
Credit: 147,663,825
RAC: 5
Germany
Message 1671601 - Posted: 29 Apr 2015, 19:45:56 UTC
Last modified: 29 Apr 2015, 19:55:50 UTC

Thanks for the water cooling suggestions ...
I had 'forgotten' about it. ;-)

Yes, this would also work; I'll think about whether it would be possible with my motherboard and GPU cards.
IIRC, it's a little bit tricky if you have 4 GPU cards.
The connection from the water tubing to the GPU card's heatsink ...
The 90° water fitting on a GPU card's heatsink used to touch (push against) the next GPU card - at least that's how it was in the past, as far as I could find out.
Maybe these 90° fittings are 'smaller' today, so that they don't touch the next GPU card?
I need to look into it. ;-)

KLiK posted pictures here.
In the first one the cards are not very close; I guess there are at least 1 or 2 PCIe slots between them.
But in the second picture it looks like the heatsinks are directly connected to each other?
If so, side-by-side GPU cards in adjacent PCIe slots would be possible this way (also with dual-GPU cards? That's a lot of heat that needs to be removed) ...
ID: 1671601 · Report as offensive
rob smith Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer moderator
Volunteer tester

Send message
Joined: 7 Mar 03
Posts: 22286
Credit: 416,307,556
RAC: 380
United Kingdom
Message 1671607 - Posted: 29 Apr 2015, 20:06:44 UTC

Some time back I looked at water cooling (indeed, until I had a domestic burst I had a pair of water-cooled GTX 690s running very well). One manufacturer did the bits to directly couple GPUs together, so you could get a much higher card density than normal - if the GPU was thin enough you could use every slot on a single-spaced board. Sadly it didn't work with reference-design GTX 690s, as they are just too thick to pair up into single-slot spacing, but a lot of lesser GPUs had the hardware available. Sorry, I can't recall who it was.
Bob Smith
Member of Seti PIPPS (Pluto is a Planet Protest Society)
Somewhere in the (un)known Universe?
ID: 1671607 · Report as offensive
KLiK
Volunteer tester

Send message
Joined: 31 Mar 14
Posts: 1304
Credit: 22,994,597
RAC: 60
Croatia
Message 1671856 - Posted: 30 Apr 2015, 6:05:40 UTC - in response to Message 1671601.  

Thanks for the water cooling suggestions ...
I had 'forgotten' about it. ;-)

Yes, this would also work; I'll think about whether it would be possible with my motherboard and GPU cards.
IIRC, it's a little bit tricky if you have 4 GPU cards.
The connection from the water tubing to the GPU card's heatsink ...
The 90° water fitting on a GPU card's heatsink used to touch (push against) the next GPU card - at least that's how it was in the past, as far as I could find out.
Maybe these 90° fittings are 'smaller' today, so that they don't touch the next GPU card?
I need to look into it. ;-)

KLiK posted pictures here.
In the first one the cards are not very close; I guess there are at least 1 or 2 PCIe slots between them.
But in the second picture it looks like the heatsinks are directly connected to each other?
If so, side-by-side GPU cards in adjacent PCIe slots would be possible this way (also with dual-GPU cards? That's a lot of heat that needs to be removed) ...

Just don't forget to put a fan on the case... so you can cool off the motherboard chipset, which doesn't get any airflow when you go with water cooling... ;)


non-profit org. Play4Life in Zagreb, Croatia, EU
ID: 1671856 · Report as offensive
Profile Sutaru Tsureku
Volunteer tester

Send message
Joined: 6 Apr 07
Posts: 7105
Credit: 147,663,825
RAC: 5
Germany
Message 1672585 - Posted: 1 May 2015, 15:11:06 UTC
Last modified: 1 May 2015, 15:23:06 UTC

Already have:
2x Intel Xeon E5-2630v2 (CPU 80W TDP)
1x ASUS Z9PE-D8 WS (Motherboard)
4x AMD Radeon HD7990 (dual GPU card, out of stock)

Ordered:
1x 2000 Watt Super Flower Leadex 80 Plus Platinum 8Pack Edt. PSU - SF-2000F14HP(BK)
2x Corsair Vengeance Low Profile 16GB Kit DDR3 PC3-12800 CL8 - CML16GX3M4X1600C8 (4x4GB Kit. So 16GB/CPU, 32GB/system.)
2x Intel Thermal Solution - BXTS13A (up to 140W TDP)
1x Thermaltake Core X9 - CA-1D8-00F1WN-00 (PC Case)
1x Microsoft Windows 8.1 Pro 64 Bit German OEM - FQC-06942 (OS)
Update #03:
Instead of: 250GB WD VelociRaptor 64MB 3.5" SATA 6Gb/s - WD2500HHTZ (HDD) (2.5" drive in a 3.5" 'heatsink')
I ordered: 1TB WD VelociRaptor 64MB 3.5" SATA 6Gb/s - WD1000DHTZ (HDD) (2.5" drive in a 3.5" 'heatsink')
I ordered: 1024MB ZOTAC GeForce GT 730 LP Passiv PCIe 2.0 x1 - ZT-71107-10L (for my J1900 CPU PC ;-)
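
A quick sanity check on the power budget for the main build above (a rough sketch: the TDP figures are the vendor maximums from the parts list, and the allowance for the rest of the system is an assumption, not a measurement):

# Rough worst-case power budget for the parts listed above.
# TDPs are vendor maximums, not measured crunching draw;
# the "rest of system" figure is an assumed allowance.
loads_w = {
    "4x AMD HD7990 (375 W each)":     4 * 375,  # 1500 W
    "2x Xeon E5-2630 v2 (80 W each)": 2 * 80,   #  160 W
    "motherboard, RAM, HDD, fans":    100,      # assumed allowance
}
psu_w = 2000  # Super Flower Leadex SF-2000F14HP

total_w = sum(loads_w.values())
for name, watts in loads_w.items():
    print(f"{name}: {watts} W")
print(f"Total: ~{total_w} W on a {psu_w} W PSU (~{100 * total_w / psu_w:.0f}% load)")

That works out to roughly 1,760 W worst case on the 2,000 W unit, i.e. close to 90% load if everything ran flat out at TDP at the same time.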


I read tests and tests and tests ... ;-)
I decided to go with 1TB instead of 250GB, because I read 'more disks -> more read/write heads -> faster something' - I can't remember exactly after reading so much. But faster is better. ;-)

I'll see if the HDD LED is continuously on; then I'll decide whether to use a RAM disk or not.


Hm, the AMD R9 295X2 (e.g. Sapphire's, in AMD's reference design) comes with water cooling.
It looks like a radiator with a single 120mm fan. For 500W! (If someone has such a card, are the temps OK when SETI/AP runs on it 24/7?)

AFAIK, if you want to cool a 130W TDP CPU about as well as a big tower heatsink does, you go with a 1x120mm fan/radiator (e.g. Corsair Hydro Series H55).
If you want to cool it further, then a 2x120mm fan/radiator (e.g. Enermax Liqmax II 240) is needed.

The AMD R9 295X2 has a 500W TDP. Can this small 1x120mm fan/radiator really cool that much?

If I went with water cooling - each HD7990 (375W) with a water block and each with a 4x120mm fan/radiator (to get down to tolerable 24/7 crunching temps) - I would need 16x120mm of radiator in total. ;-)
I guess in that case it would be better to use 4 separate circulation loops, because otherwise only the 1st GPU gets 'cool' water and the 8th GPU already gets 'warmed' water for cooling. I guess that isn't good.
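
A back-of-the-envelope check on that radiator estimate, assuming the common rule of thumb of roughly 100 W of heat per 120 mm radiator section for quiet 24/7 temps (an assumption, not a measured figure):

# Rough radiator sizing for 4x HD7990 under constant crunching load.
# ASSUMPTION: ~100 W of heat per 120 mm radiator section for quiet,
# tolerable 24/7 temps - a common rule of thumb, not a measurement.
watts_per_section = 100
cards = 4
tdp_per_card_w = 375  # HD7990 board power

total_heat_w = cards * tdp_per_card_w             # 1500 W
sections = -(-total_heat_w // watts_per_section)  # ceiling division
print(f"Total GPU heat: {total_heat_w} W")
print(f"~{sections} x 120 mm radiator sections, e.g. one 4x120 mm radiator per card/loop")

That lands at 15-16 sections of 120 mm, which matches the 16x120mm (4x 4x120mm) figure above.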
Currently I have no idea whether to do this or that. I'm a water cooling beginner. ;-)
I searched the web for a shop which could fit the water cooling blocks to the GPU cards, but so far I haven't found one.
So far I also have no idea whether the GPU heatsinks are attached with paste or with glued pads. I don't want to destroy the chip, the VRAM or the voltage regulators.

Finally, water cooling for my Monster would be very 'costly' - too 'costly' (€1,000+ in total?)?
Or maybe I wouldn't need a '16x120mm surface (or 4x 4x120mm)' to get down to tolerable 24/7 crunching temps (maybe like running the stock fans/heatsinks at 100%?)?

Hints and tips are very welcome.

Thanks.
ID: 1672585 · Report as offensive