Building a 32 thread xeon system doesn't need to cost a lot

Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13742
Credit: 208,696,464
RAC: 304
Australia
Message 1777346 - Posted: 8 Apr 2016, 23:53:09 UTC - in response to Message 1777332.  

Actually it's a hard limit of 100 CPU tasks. Not per CPU socket.

Damn.

So even with a 4-socket system you still only get 100 WUs. Given CPU crunch times, that works out (very roughly) to 16 CPU threads being equivalent to 1 GTX 750 Ti running 2 WUs at a time.
So running much more than 50 or so CPU tasks at a time on a single system would mean running out of work even with very minor Seti server outages.

So running the system mentioned in the opening post would be OK (a similar cache to a couple of GTX 750 Tis), but there's not much point in running something with many more cores or sockets.
:-/
ID: 1777346
AMDave
Volunteer tester

Joined: 9 Mar 01
Posts: 234
Credit: 11,671,730
RAC: 0
United States
Message 1777350 - Posted: 8 Apr 2016, 23:58:26 UTC - in response to Message 1777344.  

If you look at the Application details on one of your hosts you can see the number of tasks it has completed for each app in a given day:

Number of tasks today: 31

That seems very low. What is the time basis? Is it UTC, PST, ...?
ID: 1777350
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13742
Credit: 208,696,464
RAC: 304
Australia
Message 1777351 - Posted: 8 Apr 2016, 23:59:17 UTC - in response to Message 1777333.  
Last modified: 9 Apr 2016, 0:04:18 UTC

Not quite following you. Does this mean with 4 cards installed, the limit would be 400 (GPU) WUs/day?

The server limit is 100 WUs per GPU in your cache at a time. So 1 GPU = 100 WUs, 5 GPUs = 500 WUs.
You can set the cache as large as you like, but the most you will get is 100 WUs per GPU due to the server limits.

The per-day limit (Max tasks per day under your computer's Application Details) is a result of invalids/errors reducing the number of WUs you can get per day, offset by the number of valid results returned. The more valid work you return, the more work you are able to download in a 24 hour period.

For my C2D I could (if it were able to process that much work) download 18,775 WUs each day. And each time I get a valid result, that number goes up. Return an error or invalid, it goes down.
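
To illustrate, here is a toy sketch of that behaviour in Python. It is not the actual BOINC server code, and the step sizes (add one per valid result, halve on an error/invalid) are my own assumption; it only mirrors the up-on-valid, down-on-error rule described above.

    # Toy model of the "Max tasks per day" behaviour (NOT the real server logic).
    def update_daily_quota(quota, result_valid, floor=1):
        if result_valid:
            return quota + 1           # each valid result earns a little more headroom
        return max(quota // 2, floor)  # an error/invalid knocks the quota back hard

    quota = 100
    for ok in [True] * 60 + [False] * 2 + [True] * 5:
        quota = update_daily_quota(quota, ok)
    print(quota)  # 100 -> 160 on valid work, halved twice to 40, back up to 45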
ID: 1777351
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13742
Credit: 208,696,464
RAC: 304
Australia
Message 1777354 - Posted: 9 Apr 2016, 0:09:15 UTC

I believe BOINC only detects the number of cores/threads present in the system.

It would be nice if they could make it 100 WUs for every 4 cores.
That would keep most systems at their current levels, but allow more work for systems with more sockets so they wouldn't be as adversely impacted by Seti server outages.
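
To put numbers on that idea (just a quick sketch of my suggestion, not anything the servers actually do):

    # 100 CPU WUs per 4 cores/threads instead of a flat 100 per host.
    def suggested_cpu_limit(threads):
        return 100 * max(1, threads // 4)

    print(suggested_cpu_limit(4))   # 100 - a typical quad core stays where it is now
    print(suggested_cpu_limit(32))  # 800 - the 32 thread Xeon box from the opening post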
ID: 1777354
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1777358 - Posted: 9 Apr 2016, 0:14:15 UTC - in response to Message 1777346.  

Actually it's a hard limit of 100 CPU tasks. Not per CPU socket.

Damn.

So even with a 4-socket system you still only get 100 WUs. Given CPU crunch times, that works out (very roughly) to 16 CPU threads being equivalent to 1 GTX 750 Ti running 2 WUs at a time.
So running much more than 50 or so CPU tasks at a time on a single system would mean running out of work even with very minor Seti server outages.

So running the system mentioned in the opening post would be OK (a similar cache to a couple of GTX 750 Tis), but there's not much point in running something with many more cores or sockets.
:-/

As I mentioned previously, my dual Xeon E5645 server, with only 12c/24t, would often run out of SETI@home work. As normal AR tasks would run around 2.5 hours, a queue of 100 tasks only lasted about 10 hours. A queue of shorties wouldn't even last through a normal Tuesday maintenance.
So running multiple instances of BOINC on a system like that is really the way to go.
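
The arithmetic behind that estimate, as a quick Python back-of-the-envelope (figures taken from the numbers above):

    tasks_in_cache = 100   # server-side CPU task limit per host
    threads = 24           # dual Xeon E5645: 12 cores / 24 threads
    hours_per_task = 2.5   # typical normal-AR CPU run time

    print(round(tasks_in_cache / threads * hours_per_task, 1))  # ~10.4 hours of work on hand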

Does anyone know a good source for SSI EEB chassis? I need to find some with mounting holes for CPU standoffs. E-ATX chassis (also 12"x13") tend not to have the required standoff holes.
ID: 1777358
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1777359 - Posted: 9 Apr 2016, 0:16:26 UTC - in response to Message 1777354.  
Last modified: 9 Apr 2016, 0:17:44 UTC

I believe BOINC only detects the number of cores/threads present in the system.

It would be nice if they could make it 100 WUs for every 4 cores.
That would keep most systems at their current levels, but allow more work for systems with more sockets so they wouldn't be as adversely impacted by Seti server outages.

When they had a limit of 50 per core it was nice. A limit of 25 per core would probably work on most systems.

However, it would also be REALLY easy to bypass the per-core limits by setting the number of CPUs in a cc_config.xml and then using an app_config.xml to limit the number of running tasks.
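
For anyone unfamiliar with those files, here is a minimal sketch of what that could look like. <ncpus> and <project_max_concurrent> are standard BOINC client options; the values 64 and 24 are placeholders rather than recommendations.

cc_config.xml (in the BOINC data directory), to make the client report more CPUs than the host really has:

    <cc_config>
      <options>
        <ncpus>64</ncpus>
      </options>
    </cc_config>

app_config.xml (in the project's directory), to cap how many tasks actually run at once:

    <app_config>
      <project_max_concurrent>24</project_max_concurrent>
    </app_config>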
ID: 1777359
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13742
Credit: 208,696,464
RAC: 304
Australia
Message 1777363 - Posted: 9 Apr 2016, 0:26:51 UTC - in response to Message 1777358.  

Does anyone know a good source for SSI EEB chassis? I need to find some with mounting holes for CPU standoffs. E-ATX chassis (also 12"x13") tend not to have the required standoff holes.


Newegg returns a few cases that are meant to be EEB compliant, but if you're after a rack-unit type chassis, RackmountMart might be the way to go.
ID: 1777363
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1777375 - Posted: 9 Apr 2016, 0:47:24 UTC - in response to Message 1777363.  

Does anyone know a good source for SSI EEB chassis? I need to find some with mounting holes for CPU standoffs. E-ATX chassis (also 12"x13") tend not to have the required standoff holes.


Newegg returns a few cases that are meant to be EEB compliant, but if you're after a rack-unit type chassis, RackmountMart might be the way to go.

Unfortunately all of those Phanteks chassis have large openings in the MB tray where I need to put standoffs. For the dual Xeon system I'm working on right now, the heatsinks get bolted directly to the MB tray rather than to the MB.
I was hoping not to mess around with drilling a bunch of holes in a MB tray or spend $200-300 on a bare chassis.
ID: 1777375
Cruncher-American · Crowdfunding Project Donor · Special Project $75 donor · Special Project $250 donor

Joined: 25 Mar 02
Posts: 1513
Credit: 370,893,186
RAC: 340
United States
Message 1777533 - Posted: 9 Apr 2016, 15:08:11 UTC
Last modified: 9 Apr 2016, 15:14:33 UTC

Anyone wanting to build one of these (I am drooling at the prospect myself) may want to consider that ECC Registered RAM is MUCH cheaper than desktop RAM. For example, I got 12x4GB 1333 MHz for < $100 for a Z800 motherboard. Check prices on eBay before buying desktop RAM...

Just saying...

EDIT: Just checked when looking for something else on eBay, and ran into 4x8GB 1333 ECC Reg. for $62. Pretty darned cheap, and I wasn't even trying.
ID: 1777533
Gamboleer

Joined: 3 Jun 06
Posts: 29
Credit: 12,391,598
RAC: 0
United States
Message 1777549 - Posted: 9 Apr 2016, 17:42:03 UTC - in response to Message 1777375.  

Here's a horizontal SSI-EEB for less than $150 USD:

http://www.amazon.com/Silverstone-GD08B-Aluminum-Extended-compatible/dp/B007X8TQYI/ref=sr_1_1?ie=UTF8&qid=1460223599&sr=8-1&keywords=ssi-eeb

ID: 1777549
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1777554 - Posted: 9 Apr 2016, 18:06:42 UTC - in response to Message 1777549.  

Here's a horizontal SSI-EEB for less than $150 USD:

http://www.amazon.com/Silverstone-GD08B-Aluminum-Extended-compatible/dp/B007X8TQYI/ref=sr_1_1?ie=UTF8&qid=1460223599&sr=8-1&keywords=ssi-eeb

Sadly some manufacturers, like Silverstone, think SSI EEB and EATX are the same thing. They both share the same 13"x12" physical dimensions, but SSI EEB has other specifications, like the mounting positions for heatsinks that I require.

The MB tray should resemble Swiss cheese, like this Chenbro chassis.


The Chenbro is $85 at Newegg, so it is the cheapest one I've found so far, but it doesn't have an ideal fan configuration.
ID: 1777554
Cosmic_Ocean
Joined: 23 Dec 00
Posts: 3027
Credit: 13,516,867
RAC: 13
United States
Message 1777575 - Posted: 9 Apr 2016, 19:37:19 UTC - in response to Message 1777350.  

If you look at the Application details on one of your hosts you can see the number of tasks it has completed for each app in a given day:

Number of tasks today: 31

That seems very low. What is the time basis? Is it UTC, PST, ...?

UTC...ish. It's often pretty close to midnight UTC, but sometimes it can be delayed an hour or so, probably depending on the server's mood. :p Kidding.. there's probably an actual explanation behind that.
ID: 1777575
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13742
Credit: 208,696,464
RAC: 304
Australia
Message 1777604 - Posted: 9 Apr 2016, 21:20:59 UTC - in response to Message 1777554.  

The Chenbro is $85 at Newegg, so it is the cheapest one I've found so far, but it doesn't have an ideal fan configuration.

I thought the whole point of the EEB chassis was that the CPU fan(s) were in specific fixed locations in order to maximise airflow?
ID: 1777604
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1777626 - Posted: 9 Apr 2016, 22:17:19 UTC - in response to Message 1777604.  

The Chenbro is $85 at Newegg, so it is the cheapest one I've found so far, but it doesn't have an ideal fan configuration.

I thought the whole point of the EEB chassis was that the CPU fan(s) were in specific fixed locations in order to maximise airflow?

Indeed. That is why I find the Chenbro one less than ideal & likely why it is <$100.
ID: 1777626
Gamboleer

Joined: 3 Jun 06
Posts: 29
Credit: 12,391,598
RAC: 0
United States
Message 1777740 - Posted: 10 Apr 2016, 6:47:57 UTC - in response to Message 1777554.  

Here's a horizontal SSI-EEB for less than $150 USD:

http://www.amazon.com/Silverstone-GD08B-Aluminum-Extended-compatible/dp/B007X8TQYI/ref=sr_1_1?ie=UTF8&qid=1460223599&sr=8-1&keywords=ssi-eeb

Sadly some manufacturers, like Silverstone, think SSI EEB and EATX are the same thing


Well, that explains the hole in the wrong place on my Silverstone case (not the one I showed you). I'm running short one peg at the moment, but I plan to pull the motherboard and tap a new hole at some point.
ID: 1777740
Grant (SSSF)
Volunteer tester

Send message
Joined: 19 Aug 99
Posts: 13742
Credit: 208,696,464
RAC: 304
Australia
Message 1777745 - Posted: 10 Apr 2016, 7:36:34 UTC - in response to Message 1777740.  

Here's a horizontal SSI-EEB for less than $150 USD:

http://www.amazon.com/Silverstone-GD08B-Aluminum-Extended-compatible/dp/B007X8TQYI/ref=sr_1_1?ie=UTF8&qid=1460223599&sr=8-1&keywords=ssi-eeb

Sadly some manufacturers, like Silverstone, think SSI EEB and EATX are the same thing


Well, that explains the hole in the wrong place on my Silverstone case (not the one I showed you). I'm running short one peg at the moment, but I plan to pull the motherboard and tap a new hole at some point.


Looking at the form factor specs on Wikipedia it looks like some manufacturers do get them mixed up.

ATX is 12" x 9.6".
EATX is 12" x 13".

SSI CEB is 12" x 10.5" and has the same mounting hole positions as an ATX board.
SSI EEB is 12" x 13", but the mounting hole positions are not the same as an ATX board.
ID: 1777745
Cruncher-American · Crowdfunding Project Donor · Special Project $75 donor · Special Project $250 donor

Joined: 25 Mar 02
Posts: 1513
Credit: 370,893,186
RAC: 340
United States
Message 1777762 - Posted: 10 Apr 2016, 8:58:48 UTC
Last modified: 10 Apr 2016, 9:04:16 UTC

ASUS Z9PA-D8 is a similar board to the ASROCK, but it is an ATX board. Would it be suitable for a cruncher? It does have 2 PCIe x16 slots (and 3 x8).


EDIT: I see upthread I missed a ref to this MB. My bad! However, it does have the square ILM pattern for the sockets, not the narrow one. But the sockets are close together, so you might still have to be picky about your coolers...
ID: 1777762
Gamboleer

Joined: 3 Jun 06
Posts: 29
Credit: 12,391,598
RAC: 0
United States
Message 1777867 - Posted: 10 Apr 2016, 18:03:18 UTC - in response to Message 1777762.  
Last modified: 10 Apr 2016, 18:21:59 UTC

ASUS Z9PA-D8 is a similar board to the ASROCK, but it is an ATX board. Would it be suitable for a cruncher? It does have 2 PCIe x16 slots (and 3 x8).


EDIT: I see upthread I missed a ref to this MB. My bad! However, it does have the square ILM pattern for the sockets, not the narrow one. But the sockets are close together, so you might still have to be picky about your coolers...


I'm using this ATX board on one system for E@H CPU work, with dual Cooler Master Hyper 212 EVO coolers on E5-2670s with 115 W TDP. They are slim enough to run in parallel, with enough space for a third fan between them. This does mean that CPU1 sucks CPU2's exhaust and runs a little warmer, but in my case this means 51°C vs 54°C using only two CPU fans, at 100% usage. No biggie.

Regarding PCI-E, I have one GPU at the moment, and will probably be upgrading it to two of the next gen GPUs that come out later this year. The board fits two, no problem.
ID: 1777867
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1777871 - Posted: 10 Apr 2016, 18:14:43 UTC - in response to Message 1777867.  
Last modified: 10 Apr 2016, 18:15:06 UTC

ASUS Z9PA-D8 is a similar board to the ASROCK, but it is an ATX board. Would it be suitable for a cruncher? It does have 2 PCIe x16 slots (and 3 x8).


EDIT: I see upthread I missed a ref to this MB. My bad! However, it does have the square ILM pattern for the sockets, not the narrow one. But the sockets are close together, so you might still have to be picky about your coolers...


I'm using this ATX board on one system for E@H CPU work, with dual Cooler Master Hyper 212 EVO coolers on E5-2670s with 115 W TDP. They are slim enough to run in parallel, with enough space for a third fan between them. This does mean that CPU1 sucks CPU2's exhaust and runs a little warmer, but in my case this means 51°C vs 54°C using only two CPU fans, at 100% usage. No biggie.

I like the cooling setup AgentB is using. That seems like the best way to go.
ID: 1777871
Gamboleer

Joined: 3 Jun 06
Posts: 29
Credit: 12,391,598
RAC: 0
United States
Message 1777876 - Posted: 10 Apr 2016, 18:26:18 UTC - in response to Message 1777871.  
Last modified: 10 Apr 2016, 18:27:49 UTC

I like the cooling setup AgentB is using. That seems like the best way to go.


That is nice, though when (if) he adds GPUs, that setup will have the CPUs pulling from GPU exhaust. He did mention venting heat from the top as well, though.
ID: 1777876