GPU Wars 2016: GTX 1050 Ti & GTX 1050: October 25th


Previous · 1 . . . 15 · 16 · 17 · 18 · 19 · Next

Profile shizaru
Volunteer tester
Joined: 14 Jun 04
Posts: 1130
Credit: 1,967,904
RAC: 0
Greece
Message 1801622 - Posted: 8 Jul 2016, 20:53:29 UTC

ID: 1801622
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13755
Credit: 208,696,464
RAC: 304
Australia
Message 1801635 - Posted: 8 Jul 2016, 22:29:33 UTC - in response to Message 1801622.  
Last modified: 8 Jul 2016, 22:31:08 UTC

https://www.techpowerup.com/223981/amd-releases-pci-express-power-draw-fix-we-tested-confirmed-works

New driver appears to be as promised :)

And even in compatibility mode (which reduces the aux power connector load to within spec), the drop in performance is only up to 1 frame per second (approx. 3%) at worst, depending on the game; the margin of error for the benchmarks is around ±2%. So while the effect is measurable, it won't be noticeable to users.

So for people running the reference cards at stock speed, there is no issue. If they want to overclock them, then they could have problems; better to wait for the partner cards, which will most likely have a single 8-pin connector or dual 6-pin connectors, so overclocking won't cause any load issues.

Let's just hope they can keep up with demand; that will help put pressure on NVidia's pricing for their cards.
Grant
Darwin NT
ID: 1801635
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13755
Credit: 208,696,464
RAC: 304
Australia
Message 1804264 - Posted: 22 Jul 2016, 21:30:34 UTC - in response to Message 1801635.  

More money than you know what to do with?
Maybe the new Titan X will take care of those excess funds.

                       GTX 1080   Titan X

CUDA Cores                 2560      3584
Memory (GDDR5X, GB)           8        12
Memory Bandwidth (GB/s)     320       480
Base clock speed (MHz)     1607      1530
FP32 compute (TFLOPS)         9        11
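For what it's worth, the quoted FP32 figures check out against the usual back-of-envelope formula of 2 FLOPs per CUDA core per clock (one fused multiply-add); at the base clocks in the table it comes out a touch lower, so the round numbers presumably assume boost clocks:

```python
# Theoretical single-precision throughput, assuming the standard
# 2 FLOPs per CUDA core per clock (one fused multiply-add).
def theoretical_tflops(cuda_cores, clock_mhz):
    return 2 * cuda_cores * clock_mhz * 1e6 / 1e12

gtx_1080 = theoretical_tflops(2560, 1607)  # ~8.2 TFLOPS at base clock
titan_x = theoretical_tflops(3584, 1530)   # ~11.0 TFLOPS at base clock
print(f"GTX 1080: {gtx_1080:.1f} TFLOPS, Titan X: {titan_x:.1f} TFLOPS")
```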



Very limited availability from Aug 2.
Grant
Darwin NT
ID: 1804264
Al (Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor)
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1804311 - Posted: 22 Jul 2016, 23:34:39 UTC - in response to Message 1804264.  

I thought it was supposedly going to have HBM memory, which was missing from the 10x0 series due to the cost? Is this the Big Pascal that some people have been getting all lathered up over?

ID: 1804311
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13755
Credit: 208,696,464
RAC: 304
Australia
Message 1804317 - Posted: 22 Jul 2016, 23:48:58 UTC - in response to Message 1804311.  
Last modified: 22 Jul 2016, 23:51:05 UTC

Is this the Big Pascal that some people have been getting all lathered up over?

Given that it's using GDDR5X, I'd say no. Although it does look like it might be another variation on the existing GPUs, so maybe?
I guess it depends on what people consider Big Pascal to be. For me, that's Tesla/HBM2; for others, no idea.

From the Wiki,
          Code name
GTX 1060  GP106-400
GTX 1070  GP104-200
GTX 1080  GP104-400
Titan X   GP102-400*
Tesla     GP100



*Speculated at this time.
Grant
Darwin NT
ID: 1804317
Al (Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor)
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1804318 - Posted: 22 Jul 2016, 23:53:42 UTC - in response to Message 1804317.  

I believe the Big Kahuna is the GP100, which I'd read speculation the 10x0-series Titan was going to be based on, but reading some of the comments in the linked article, it appears it will possibly be either the 2nd rev of this Titan card or the next gen of their GPUs. Of course, it's all just speculation at this point, but I suppose that's half the fun, right? ;-)

ID: 1804318
AMDave
Volunteer tester

Joined: 9 Mar 01
Posts: 234
Credit: 11,671,730
RAC: 0
United States
Message 1806657 - Posted: 2 Aug 2016, 15:06:03 UTC - in response to Message 1797906.  

ID: 1806657
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13755
Credit: 208,696,464
RAC: 304
Australia
Message 1806742 - Posted: 3 Aug 2016, 6:25:27 UTC - in response to Message 1806657.  
Last modified: 3 Aug 2016, 7:10:56 UTC

Pascal Titan X benchmarks up on Tom's Hardware.

EDIT- review link.
Grant
Darwin NT
ID: 1806742
Profile Zalster Special Project $250 donor
Volunteer tester
Joined: 27 May 99
Posts: 5517
Credit: 528,817,460
RAC: 242
United States
Message 1806746 - Posted: 3 Aug 2016, 6:57:08 UTC - in response to Message 1806742.  

That Titan X would definitely benefit from a hybrid kit, or some sort of cooling.

The review was interesting to read. I bet with the right cooling, you would get even better scores.

Looking at the tear down, I would want to get a better look at the mounting holes.

After seeing what they did to the 1060 to prevent hybrid kits from being used, I would be cautious, but I think it would be fun to try. If and when I find the $$$, lol...
ID: 1806746
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13755
Credit: 208,696,464
RAC: 304
Australia
Message 1806750 - Posted: 3 Aug 2016, 7:12:24 UTC - in response to Message 1806746.  

That Titan X would definitely benefit from a hybrid kit, or some sort of cooling.

Definitely; it puts out a lot more heat than the GTX 1080, and look at how much they are benefiting from non-reference coolers.
Grant
Darwin NT
ID: 1806750
Profile Shaggie76
Joined: 9 Oct 09
Posts: 282
Credit: 271,858,118
RAC: 196
Canada
Message 1806781 - Posted: 3 Aug 2016, 12:39:07 UTC

I've been obsessing about how I want to upgrade my main rig to SLI (I can rationalize this for something I want to do for work and I know the hardware will get put to good use when I'm not working). I've been measuring, running tests and spending ridiculous time researching this purchase. I woke up early this morning because I couldn't sleep and so I ran an experiment that yielded some interesting results.

Here's how I have my system configured:



I've been operating on the assumption that the waste heat from the GPU was bad for my CPU radiator; I'd once observed a correlation between GPU temps and CPU temps that I assumed was an indication of the radiator operating less efficiently. I'm now convinced that in my Corsair 750D Airflow Edition with Noctua fans this isn't a problem at all.

Because my fans are so powerful I generally set the fan controller very conservatively -- under max load the fans are around 50-60%. My rig had been crunching all night and both processors were relatively hot, so I manually jacked my three 140mm case fans to 100% and let it ride for 10 minutes. I plotted CPU, GPU, and all the motherboard temps and watched and waited: nothing happened.

Looking at my overnight cooking data I can also see evidence that discredits the theory that my GPU waste heat affects my CPU radiator; the two temperature plots aren't correlated at all in the long term. I doubt they're correlated when starting from cold either, because it takes quite a while for the water in the loop to warm up.
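(A quick way to quantify that, rather than eyeballing the plots, is a correlation coefficient over the logged temperatures. This is just a sketch -- the CSV layout and the `cpu_temp`/`gpu_temp` column names are hypothetical and depend on whatever your monitoring tool writes out.)

```python
import csv
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys))

def temp_correlation(csv_path):
    # Column names are hypothetical; adjust to your logger's output.
    cpu, gpu = [], []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            cpu.append(float(row["cpu_temp"]))
            gpu.append(float(row["gpu_temp"]))
    return pearson(cpu, gpu)  # near 0 = uncorrelated, near 1 = correlated
```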

I was also worried about thermal throttling after cooking the GPU for 12hrs at a time; again, as Bill Nye would have us remember: science rules. I ran a test where I set the GPU fan to maximum for a while -- the GPU cooled off but didn't run any faster or take any more load. It seems like, in my case at least, the ideal temperature it aims for is about 60C -- when running full-blast it sometimes goes into the high 60s, but then the fans on the GPU spin up to about 50% and the temperature stabilizes.

I was originally convinced that I needed cards with blowers to vent the heat out of the case, but I was concerned about the noise. I was then sure that a hybrid solution would be the most cost-effective, but I don't think the hoses on the MSI Seahawk are long enough to reach my exhaust grills. Now I'm thinking I can just get a card with twin 100mm fans and a giant heat-sink, because my PCIe x16 slots are 4 apart and there'll be room for the top card to breathe.

So now the only question is: 1070's or 1080's? My stats aren't showing a dramatic difference and I can use the money saved by getting 1070's to put new cards in other machines. I read one benchmark interpretation that suggested that the reduced core-count on the 1070's meant that it had more bandwidth-per-core even with the slightly slower memory.
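That bandwidth-per-core interpretation is easy to sanity-check from the published specs (1920 cores and 256 GB/s for the 1070, 2560 cores and 320 GB/s for the 1080 -- figures not from this thread, so double-check them):

```python
# Memory bandwidth available per CUDA core, using published base specs.
cards = {
    "GTX 1070": (1920, 256),  # (CUDA cores, bandwidth in GB/s)
    "GTX 1080": (2560, 320),
}
for name, (cores, bw_gbs) in cards.items():
    print(f"{name}: {bw_gbs / cores * 1000:.0f} MB/s per core")
# The 1070 works out to ~133 MB/s per core vs ~125 for the 1080,
# despite the 1070's slightly slower GDDR5 memory.
```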

What would really help is some stock SoG OpenCL benchmarks from a 1070 and 1080 with and without multitasking -- are the extra cores on the 1080 just idle because SoG isn't going wide-enough?
ID: 1806781
AMDave
Volunteer tester

Joined: 9 Mar 01
Posts: 234
Credit: 11,671,730
RAC: 0
United States
Message 1806788 - Posted: 3 Aug 2016, 13:53:43 UTC

Given:

GTX 1060, 120W . . . $ 249
GTX 1070, 150W . . . $ 449
GTX 1080, 180W . . . $ 599

Has anyone performed a cost-benefit analysis of using two 1060s vs one 1070 or one 1080?  For example, two 1060s would cost about 17% less and use 33.33% more watts than one 1080.  How would the performance compare, and would it be worth the extra expenditure on electricity?
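The running-cost side of that analysis is simple to sketch from the prices and TDPs above; note the $0.12/kWh rate and 24/7 crunching at full TDP are assumptions, and the performance side would have to come from real SETI@home run times:

```python
# Up-front price vs yearly electricity cost per option, assuming
# 24/7 crunching at full TDP and $0.12/kWh (both assumptions).
def yearly_power_cost(watts, usd_per_kwh=0.12):
    return watts / 1000 * 24 * 365 * usd_per_kwh

options = {
    "2x GTX 1060": (2 * 249, 2 * 120),  # (price in $, total W)
    "1x GTX 1070": (449, 150),
    "1x GTX 1080": (599, 180),
}
for name, (price, watts) in options.items():
    print(f"{name}: ${price} up front, ~${yearly_power_cost(watts):.0f}/yr in power")
```

At those assumed rates the two 1060s cost roughly $60 a year more to run than the single 1080, so the up-front saving dominates unless the cards stay in service for several years.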
ID: 1806788
Profile Zalster Special Project $250 donor
Volunteer tester
Joined: 27 May 99
Posts: 5517
Credit: 528,817,460
RAC: 242
United States
Message 1806791 - Posted: 3 Aug 2016, 14:10:02 UTC - in response to Message 1806788.  

I'm always hesitant about comparisons that use video games (which almost all of these tests are).

Running a GPU for scientific purposes vs gaming are two different things.

The Tom's Hardware review talks about how the new Titan X, like the Tesla, is aimed at the scientific community more than the GTXs are.

I think you would need someone running SETI on two 1060s, then compare that to a 1080 running SETI.
ID: 1806791
Profile Shaggie76
Joined: 9 Oct 09
Posts: 282
Credit: 271,858,118
RAC: 196
Canada
Message 1806798 - Posted: 3 Aug 2016, 14:37:26 UTC

My stats for the default single-GPU-task case seem to suggest that the 1080 is barely faster. I'm looking forward to seeing 1060s in the stats.
ID: 1806798
Al (Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor)
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1806799 - Posted: 3 Aug 2016, 14:42:20 UTC - in response to Message 1806791.  

Well, I might be able to do that. I got a ping from EVGA on Friday about a 'back in stock reminder' I had set when the 1060s were out of stock. I grabbed a couple of them, so I'll find one of the X79 boards I have lying around, drop them in there when I have the time, and let them run for a while. What would you think would be the best version of the client to run? Should be a fairly easy build: Win 7 64-bit, latest vid drivers, though I'll need to get the latest stuff for the MB; they've been lying around for 5 years or so. More fun! :-D

ID: 1806799
AMDave
Volunteer tester

Joined: 9 Mar 01
Posts: 234
Credit: 11,671,730
RAC: 0
United States
Message 1806802 - Posted: 3 Aug 2016, 15:06:16 UTC - in response to Message 1806791.  

I'm always hesitant about comparisons that use video games (which almost all of these tests are).

Running a GPU for scientific purposes vs gaming are two different things.

Agreed.  I couldn't care less about the gaming aspect.  Just give me the raw performance/capability so I can determine if it meets my needs.  For example, with auto reviews you are given hwy/city mpg, hp, torque, etc.  The reviews don't confine themselves to hyper-specific examples like: how much hp is used in an uphill climb that curves to the left, or how much torque is used on gravel roads which curve right in a white-out.

The Tom's Hardware review talks about how the new Titan X, like the Tesla, is aimed at the scientific community more than the GTXs are.

I didn't include that card because, well, how many units of it would be sold compared to the others?  What's the MSRP - $1200?

I think you would need someone running SETI on two 1060s, then compare that to a 1080 running SETI.

Makes sense.  Looking forward to it.  I haven't seen anything published yet, but I'm curious as to whether nVidia will release any Ti versions of Pascal.
ID: 1806802
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13755
Credit: 208,696,464
RAC: 304
Australia
Message 1806966 - Posted: 4 Aug 2016, 6:40:11 UTC - in response to Message 1806781.  
Last modified: 4 Aug 2016, 7:38:45 UTC

What would really help is some stock SoG OpenCL benchmarks from a 1070 and 1080 with and without multitasking -- are the extra cores on the 1080 just idle because SoG isn't going wide-enough?


This system may be of use.
It has a GTX 750 Ti and a just-added GTX 1070. Both cards are running 3 WUs at a time, stock applications on Beta, and almost all the applications used to date have been SoG.

EDIT- during the weekly outage I switch back to Main, where I'm running 3 WUs at a time, but using CUDA50.
Grant
Darwin NT
ID: 1806966
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13755
Credit: 208,696,464
RAC: 304
Australia
Message 1806967 - Posted: 4 Aug 2016, 6:45:21 UTC - in response to Message 1806781.  

So now the only question is: 1070's or 1080's?

GTX 1070 IMHO.
You'll get more work from the GTX 1080, but you'll pay a lot more money and use a lot more power for only slightly more work compared to the 1070. I'd say the 1070 will be the cost/power/performance sweet spot.
I suspect the GTX 1060 won't really be in the running; however, if they release a GTX 1060 Ti it could end up being the price/power/performance king.
Grant
Darwin NT
ID: 1806967
Profile Shaggie76
Joined: 9 Oct 09
Posts: 282
Credit: 271,858,118
RAC: 196
Canada
Message 1807033 - Posted: 4 Aug 2016, 13:35:19 UTC - in response to Message 1806967.  

So now the only question is: 1070's or 1080's?

GTX 1070 IMHO.
You'll get more work from the GTX 1080, but you'll pay a lot more money and use a lot more power for only slightly more work compared to the 1070. I'd say the 1070 will be the cost/power/performance sweet spot.
I suspect the GTX 1060 won't really be in the running; however, if they release a GTX 1060 Ti it could end up being the price/power/performance king.

I'm thinking the same thing -- but to be sure I'm working on adding auto-detection of concurrent tasks to my "benchmark" scripts.
ID: 1807033
Al (Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor)
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1817680 - Posted: 16 Sep 2016, 22:31:49 UTC

Interesting little tidbit. I was looking at the crunching results on two of my machines, ID: 8064025 (X58-DualGTX1060) and ID: 7990258 (CrunchMonster), and noticed that even though one of them has 24 HT procs and the other has 12 running at about the same GHz, and one is running 4 GTX 950s while the other is running 2 GTX 1060s, they are putting out almost exactly the same RAC. Just thought it was interesting.
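That parity is at least plausible on paper. Assuming the published base-clock specs (GTX 950: 768 cores at 1024 MHz; GTX 1060: 1280 cores at 1506 MHz -- numbers not from this thread, so verify) and the usual 2 FLOPs per core per clock:

```python
# Theoretical FP32 throughput of each GPU set at base clocks,
# assuming 2 FLOPs per CUDA core per clock.
def tflops(cores, clock_mhz):
    return 2 * cores * clock_mhz / 1e6

four_950s = 4 * tflops(768, 1024)   # ~6.3 TFLOPS
two_1060s = 2 * tflops(1280, 1506)  # ~7.7 TFLOPS
print(f"4x GTX 950: {four_950s:.1f} TFLOPS, 2x GTX 1060: {two_1060s:.1f} TFLOPS")
```

Same ballpark, so nearly equal RAC isn't that surprising, especially since crunching throughput rarely scales perfectly with raw TFLOPS.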

ID: 1817680


 
©2024 University of California
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.