NVIDIA Jetson TK1

W-K 666
Message 1566121 - Posted: 2 Sep 2014, 10:36:57 UTC

I've just been informed that the Zotac NVIDIA Jetson TK1 is now available from Maplin for UK customers; the price is £199.99.

Zotac NVIDIA Jetson TK1 Developer Kit

(The previous thread on low-power computers has been closed.)
Dr Grey
Message 1566602 - Posted: 3 Sep 2014, 19:52:39 UTC - in response to Message 1566121.  

14 W for 300 GFLOP/s sounds amazingly efficient. Would this work? A review of the board is here:
http://www.itpro.co.uk/desktop-hardware/22731/nvidia-jetson-tk1-review
HAL9000
Message 1566604 - Posted: 3 Sep 2014, 20:12:14 UTC - in response to Message 1566602.  
Last modified: 3 Sep 2014, 20:25:21 UTC

14 W for 300 GFLOP/s sounds amazingly efficient. Would this work? A review of the board is here:
http://www.itpro.co.uk/desktop-hardware/22731/nvidia-jetson-tk1-review

Here is one actually running.
http://setiathome.berkeley.edu/show_host_detail.php?hostid=7354060
As I recall, they had to compile the app for the hardware, & there was quite a bit of effort to get it producing valid results.

For the money I am sticking with Bay Trail-D. The ASRock Q1900-ITX I bought recently only cost around $80 US, or about £50 (more with VAT, I imagine), & looks to be running around 3200-3500 RAC. My system total is around 25 W, but once optimized I am hoping to get closer to 15 W at full load.
SETI@home classic workunits: 93,865 · CPU time: 863,447 hours
Join the BP6/VP6 User Group: http://tinyurl.com/8y46zvu
Dr Grey
Message 1566622 - Posted: 3 Sep 2014, 20:49:00 UTC - in response to Message 1566604.  

I'm not sure whether to trust the GFLOPS figures quoted for our rigs, but the 17 tasks the TK1 completed today compare pretty favourably with the 37 done by my i7 once you weigh 14 W against 70 W -- that's roughly 1.2 tasks per watt-day versus 0.53.
HAL9000
Message 1566648 - Posted: 3 Sep 2014, 21:21:40 UTC - in response to Message 1566622.  

I'm not sure whether to trust the GFLOPS figures quoted for our rigs, but the 17 tasks the TK1 completed today compare pretty favourably with the 37 done by my i7 once you weigh 14 W against 70 W -- that's roughly 1.2 tasks per watt-day versus 0.53.

The NVIDIA GK20A on the TK1 lists:
Average processing rate: 15.82 GFLOPS
An old NVIDIA 8500 GT I have overclocked lists:
Average processing rate: 15.12 GFLOPS

So the GK20A is a bit faster than my 8500, and the app GFLOPS figures seem usable at least for comparison's sake. (Bear in mind these are measured application throughput, well below the ~300 GFLOPS theoretical peak quoted for the board.)

I think the 14 W listed is for the whole SoC, so the GPU alone should draw less -- maybe 7-8 W? Looking at the tasks on your i7, they take ~2 hours for a normal AR vs 2.5-2.75 hours on that GPU. However, you are probably running 4-7 in parallel on your 70 W CPU? So the performance per watt could be somewhat close; see the rough check below.
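A quick back-of-envelope version of that check (a sketch only -- the 7.5 W GPU draw, the ~2.6 h task time, and the i7 figures are the assumed numbers from this thread, not measurements):

#include <cstdio>

int main() {
    // Assumed: TK1 GPU alone ~7.5 W, one task per ~2.6 h;
    // i7 at ~70 W running 7 tasks in parallel at ~2 h each.
    double tk1 = (1.0 / 2.6) / 7.5;   // tasks per hour per watt, ~0.051
    double i7  = (7.0 / 2.0) / 70.0;  // tasks per hour per watt, ~0.050
    printf("TK1 GPU: %.3f tasks/h/W\n", tk1);
    printf("i7     : %.3f tasks/h/W\n", i7);
    return 0;
}

On those assumptions the two come out almost identical per watt.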
Dr Grey
Message 1566684 - Posted: 3 Sep 2014, 22:30:20 UTC - in response to Message 1566648.  

Yes, 7 on the i7. So it looks like the TK1 isn't quite the supercomputer they're saying it is -- but it's still a neat unit.
HAL9000
Message 1566706 - Posted: 3 Sep 2014, 23:08:20 UTC - in response to Message 1566684.  

Yes, 7 on the i7. So it looks like the TK1 isn't quite the supercomputer they're saying it is -- but it's still a neat unit.

It has a really good GPU for an ARM processor. The NVIDIA Shield Tablet, which uses this same SoC, doesn't cost much more than a Google Nexus 7 & has a much better GPU, so that's another option for those looking at this hardware.
http://shield.nvidia.com/gaming-tablet/
or
http://shield.nvidia.co.uk/gaming-tablet/
ivan
Message 1566730 - Posted: 4 Sep 2014, 0:18:01 UTC - in response to Message 1566604.  

14 W for 300 GFLOP/s sounds amazingly efficient. Would this work? A review of the board is here:
http://www.itpro.co.uk/desktop-hardware/22731/nvidia-jetson-tk1-review
Here is one actually running.
http://setiathome.berkeley.edu/show_host_detail.php?hostid=7354060
As I recall, they had to compile the app for the hardware, & there was quite a bit of effort to get it producing valid results.
Yeah, that's me. I had to add some ifdefs to the machine-code routines (someone else worked that out...) to get the floating-point working right -- see the sketch below for the general idea. Note that the results it's producing are purely from the GPU. I've compiled the CPU code and it seems to run fine standalone, but under BOINC itself it throws an exception; I haven't had time to debug this.
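For the curious, the general shape of those guards is something like this (a made-up illustration with hypothetical function names, not the actual S@H routines -- the point is just that x86-only intrinsics get fenced off so ARM builds fall back to the portable path):

#include <cstdio>

// Portable fallback, used on ARM (e.g. the Tegra K1's Cortex-A15 cores).
static float sum_generic(const float *v, int n) {
    float s = 0.0f;
    for (int i = 0; i < n; ++i) s += v[i];
    return s;
}

#if defined(__x86_64__) || defined(__i386__)
#include <xmmintrin.h>  // SSE intrinsics only exist on x86
static float sum_fast(const float *v, int n) {
    __m128 acc = _mm_setzero_ps();
    int i = 0;
    for (; i + 4 <= n; i += 4) acc = _mm_add_ps(acc, _mm_loadu_ps(v + i));
    float tmp[4];
    _mm_storeu_ps(tmp, acc);
    float s = tmp[0] + tmp[1] + tmp[2] + tmp[3];
    return s + sum_generic(v + i, n - i);  // scalar tail
}
#else
#define sum_fast sum_generic  // no x86 machine code on ARM builds
#endif

int main() {
    float v[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    printf("%f\n", sum_fast(v, 8));  // 36.000000 on either path
    return 0;
}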
Also note that some CUDA code I have for reconstructing digital holograms compiled and ran without any code changes on the system -- I just had to change the location of the includes and libraries in the Makefile. It runs at about 1/10th the speed of my GTX 750 Ti, but takes perhaps 1/15th the power. I've got my wattmeter at work now, so I'll try to get some power figures soon (I got the chance to grab it when I had to shut down my home machine last night to remove the GT 640 which had developed a faulty fan -- the third Gigabyte GPU that's done that to me now! :-( ).

For the money I am sticking with Bay Trail-D. The ASRock Q1900-ITX I bought recently only cost around $80 US, or about £50 (more with VAT, I imagine), & looks to be running around 3200-3500 RAC. My system total is around 25 W, but once optimized I am hoping to get closer to 15 W at full load.
Mine (Acer Aspire XC-603) is running around 1250, but that's on Linux and without using the GPU. Cost £130.
ivan
Message 1567589 - Posted: 5 Sep 2014, 15:02:39 UTC - in response to Message 1566730.  

Yeah, that's me. ... It runs at about 1/10th the speed of my GTX 750 Ti, but takes perhaps 1/15th the power. I've got my wattmeter at work now, so I'll try to get some power figures soon (I got the chance to grab it when I had to shut down my home machine last night to remove the GT 640 which had developed a faulty fan -- the third Gigabyte GPU that's done that to me now! :-( ).

OK, some figures. At idle, the wattmeter was showing 9 W at the socket (it was 4 W with the machine shut down, so that's roughly the overhead of the power supply). Running two CUDA jobs simultaneously shows 16 W, with occasional drops to 14 W and sometimes 11 W -- so the two jobs add about 7 W over idle, in line with the 7-8 W guessed earlier in the thread.
jason_gee
Message 1567654 - Posted: 5 Sep 2014, 17:35:32 UTC - in response to Message 1567589.  

Any patches or recommendations for X-branch? It's idle while I address some complex BOINC issues, but I never stop taking issues for the to-do list :)
"Living by the wisdom of computer science doesn't sound so bad after all. And unlike most advice, it's backed up by proofs." -- Algorithms to live by: The computer science of human decisions.
ivan
Message 1567691 - Posted: 5 Sep 2014, 19:00:54 UTC - in response to Message 1567654.  

Any patches or recommendations for X-branch? It's idle while I address some complex BOINC issues, but I never stop taking issues for the to-do list :)

No, IIRC it just compiled as usual on the Jetson.
No real problems with the S@H build, just the missing include I reported last January, and I had to edit the config file to remove the old compute capabilities that nvcc didn't like and put in 3.2 for the Tegra.
I had to add
#include <unistd.h>
to client/analyzeFuncs.h to bring in sleep() on my Mint system.
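For reference, the GK20A in the Tegra K1 is compute capability 3.2, so that config edit boils down to an nvcc option along these lines (assuming a CUDA 6.x toolchain; the exact config-file syntax may differ):

-gencode arch=compute_32,code=sm_32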

ivan
Message 1567807 - Posted: 5 Sep 2014, 22:26:30 UTC - in response to Message 1566730.  
Last modified: 5 Sep 2014, 22:35:05 UTC

GT 640 which had developed a faulty fan -- the third Gigabyte GPU that's done that to me now! :-( ).
I've just ordered a replacement fan from China on eBay for about £1.50, so I might be back in action by the end of the month. Otherwise I'll get the Asus GT640-1GD5-L as a replacement -- it has to fit in a single-width slot with only about 5 mm of extra clearance.
