Message boards : Number crunching : Decided to Bail Out on SETI for a Short While
Author | Message |
---|---|
kittyman Send message Joined: 9 Jul 00 Posts: 51475 Credit: 1,018,363,574 RAC: 1,004 |
Let's hope that the scientists - that means astronomers, mathematicians, and sociologists with experience of distributed computing - are allowed to get on with the prototyping at their own speed and with their own goals in mind, then. Thank you for saying that, Richard. I am too much of a bandwagon banner-flying kinda guy to tell the truth sometimes. I had some communication with Eric about the release of v8 before the apps were ready. Kinda angry about it, actually. Damned angry about it, actually. Eric was angry about it as well, not just me. And, since I am one of the higher contributors to the project, he took my anger very seriously. I have dedicated years of my life and countless dollars to this search, and to have that slap in the face was rather rude. "Time is simply the mechanism that keeps everything from happening all at once." |
Richard Haselgrove Send message Joined: 4 Jul 99 Posts: 14669 Credit: 200,643,578 RAC: 874 |
Let's all go on a nice relaxing cruise for the duration, shall we? |
Zalster Send message Joined: 27 May 99 Posts: 5517 Credit: 528,817,460 RAC: 242 |
Until they can keep people from catching norovirus, I prefer to stay off cruise ships, no matter how short the trip, lol... But vacations do sound like a good idea, just land-based ones. |
Raistmer Send message Joined: 16 Jun 01 Posts: 6325 Credit: 106,370,077 RAC: 121 |
What de-dispersion? It's gravity, not electromagnetics. Einstein@Home just doesn't look at such frequencies or at transient signals at all. |
kittyman Send message Joined: 9 Jul 00 Posts: 51475 Credit: 1,018,363,574 RAC: 1,004 |
And Kittes@home just look at mices........... My kitties just keep on looking at the stars. "Time is simply the mechanism that keeps everything from happening all at once." |
Richard Haselgrove Send message Joined: 4 Jul 99 Posts: 14669 Credit: 200,643,578 RAC: 874 |
So, can you explain better what a 6-month GBT task would look like (download data size, perhaps), what mathematics would be performed, and why it can't be broken down into smaller sub-sections? That's asked out of genuine curiosity. |
Raistmer Send message Joined: 16 Jun 01 Posts: 6325 Credit: 106,370,077 RAC: 121 |
Let's hope that the scientists - that means astronomers, mathematicians, and sociologists with experience of distributed computing - are allowed to get on with the prototyping at their own speed and with their own goals in mind, then. The timetable was fulfilled perfectly, because by the time of the data release we had a working app on main and fully functional apps ready for release on beta. Having data without apps would be much worse. Nothing was actually done "in a hurry" for 12 April; it was all done and tested well. The single side effect is that v9 will be needed to introduce the new processing types. But even that is actually a good thing: it's not wise to introduce two big changes simultaneously. So we have the new data sources first, and will have the new processing algorithms later. |
kittyman Send message Joined: 9 Jul 00 Posts: 51475 Credit: 1,018,363,574 RAC: 1,004 |
Yikes...v9??? The kitties have not even recovered from v8 yet...................... LOL...bring it. Meow. "Time is simply the mechanism that keeps everything from happening all at once." |
Raistmer Send message Joined: 16 Jun 01 Posts: 6325 Credit: 106,370,077 RAC: 121 |
So, can you explain better what a 6-month GBT task would look like (download data size, perhaps), what mathematics would be performed, and why it can't be broken down into smaller sub-sections? That's asked out of genuine curiosity. Were I the project scientist, perhaps. I will know the design decisions once they are formed. From common sense: to get an improvement over the Arecibo data, longer time periods can be used for signal accumulation, hence longer initial arrays for the FFA - something like a large FFA, or bigger. A targeted search in general allows a longer time domain for signal accumulation, which results in a better signal-to-noise ratio, and hence increased sensitivity on the same hardware. |
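Raistmer's point about signal accumulation follows the standard radiometer rule of thumb: for noise-dominated data, accumulating signal over a span T improves the signal-to-noise ratio roughly as the square root of T. A minimal sketch of that scaling (the function name and the durations are illustrative, not taken from any SETI@home code):

```python
import math

def snr_gain(t_long_s: float, t_short_s: float) -> float:
    """Radiometer rule of thumb: SNR grows ~ sqrt(accumulation time)."""
    return math.sqrt(t_long_s / t_short_s)

# Folding ~100x more observation time into one FFA search buys
# roughly a 10x sensitivity gain on the same hardware.
print(snr_gain(10_000, 100))  # 10.0
```

This is why a targeted search with a longer time domain can beat shorter Arecibo-style tasks in sensitivity without any hardware upgrade, at the price of longer initial arrays for the folding algorithm.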
kittyman Send message Joined: 9 Jul 00 Posts: 51475 Credit: 1,018,363,574 RAC: 1,004 |
So, can you explain better what a 6-month GBT task would look like (download data size, perhaps), what mathematics would be performed, and why it can't be broken down into smaller sub-sections? That's asked out of genuine curiosity. If longer tasks increase our chance of finding our needle in the cosmic haystack, I am all for it. I do not care if the apparent result is me crunching one WU per day. If that one WU is more productive than tossing ten a minute out the window, so be it. I said long ago that I luv my credits, but that is NOT why I am here. I am here for the freaking SCIENCE of it; that is what attracted me to the project in the first place, and that is what keeps me here. And the fact that things are progressing here, rather than the same old... makes me even more determined to participate in this amazing project. Meow! "Time is simply the mechanism that keeps everything from happening all at once." |
jason_gee Send message Joined: 24 Nov 06 Posts: 7489 Credit: 91,093,184 RAC: 0 |
So, can you explain better what a 6-month GBT task would look like (download data size, perhaps), what mathematics would be performed, and why it can't be broken down into smaller sub-sections? That's asked out of genuine curiosity. Anything can be broken up into sections. The first thing I'd do if such an animal materialised would be to thread it. The only reason I haven't made Cuda MB properly threaded is that the tasks have been too 'small' to get good scaling (so we run multiples instead). That's quite likely to change amidst our various plans to tackle Guppis from different development directions. It's been a while since I looked at, or even thought about, AP code, but you can assume that larger dedispersion on a given (same-sized) dataset increases the effective range. Before adding any observation length/number of points, or extra searches, there's an inverse square law in there somewhere (iirc), so there is non-linear growth in processing demand. In the case of increasing the chunk sizes, with the same observation lengths (to end up with more resolution), the number of coadds, off the top of my head, would multiply. At as much as 2M-length transforms, the FFT-based convolution operations, each of roughly 4n log n complexity, become pretty costly compared to a regular 32k transform, and probably not many caches would handle that nicely (especially single-threaded). The typical communications-complexity cost of thrashing the cache (as power-of-2 transforms naturally do as they grow) increases by an order of magnitude for each cache level filled. Short version: taking things to extremes, 10 hours becoming 10,000 hours (~416 days) doesn't seem out of the realm of possibility, with no increase in payload. "Living by the wisdom of computer science doesn't sound so bad after all. And unlike most advice, it's backed up by proofs." -- Algorithms to Live By: The Computer Science of Human Decisions. |
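Jason's cost estimate can be sanity-checked with a back-of-envelope calculation. This sketch uses the approximate 4n·log2(n) operation count he quotes for an FFT-based convolution pass and deliberately ignores cache effects (the function name and sizes are illustrative only):

```python
import math

def fft_conv_cost(n: int) -> float:
    """Approximate operation count for one FFT-based convolution pass,
    using the ~4 * n * log2(n) figure quoted in the thread."""
    return 4 * n * math.log2(n)

small = 32 * 1024        # a 'regular' 32k-point transform
large = 2 * 1024 * 1024  # a 2M-point transform

ratio = fft_conv_cost(large) / fft_conv_cost(small)
print(round(ratio, 1))  # 89.6
```

So each 2M-point pass already costs roughly 90x a 32k-point pass in raw arithmetic. Stacking the order-of-magnitude penalty per thrashed cache level on top of that is how a 10-hour task plausibly stretches toward 10,000 hours with no increase in payload.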
kittyman Send message Joined: 9 Jul 00 Posts: 51475 Credit: 1,018,363,574 RAC: 1,004 |
Might I be so bold as to ask in public, what do you do for a living, Jason? Reply by PM if you would rather not say here, but your understanding and statements regarding mathematical things astound me. So, do you just drive a truck to let things stew? That would certainly explain your understanding of 'chunk sizes'. "Time is simply the mechanism that keeps everything from happening all at once." |
jason_gee Send message Joined: 24 Nov 06 Posts: 7489 Credit: 91,093,184 RAC: 0 |
Might I be so bold as to ask in public, what do you do for a living, Jason? That's complicated, will PM in a bit, lol "Living by the wisdom of computer science doesn't sound so bad after all. And unlike most advice, it's backed up by proofs." -- Algorithms to live by: The computer science of human decisions. |
HAL9000 Send message Joined: 11 Sep 99 Posts: 6534 Credit: 196,805,888 RAC: 57 |
And Kittes@home just look at mices........... My fuzzy minions waste their time watching movies all day. http://i.imgur.com/IvWLL9K.jpg So, can you explain better what a 6-month GBT task would look like (download data size, perhaps), what mathematics would be performed, and why it can't be broken down into smaller sub-sections? That's asked out of genuine curiosity. I believe one of the things the project tries to balance is allowing as many hosts to participate as possible. Offering only much larger, or more complex, workunits could raise the minimum machine specifications. It might even leave a significant portion of hosts unable to contribute at all. Offering both types of work would allow a large number of hosts to participate and still have the more complex data analyzed by the hosts that are capable of it. However, that is more work on the back end. SETI@home classic workunits: 93,865 CPU time: 863,447 hours Join the [url=http://tinyurl.com/8y46zvu]BP6/VP6 User Group[/url] |
Al Send message Joined: 3 Apr 99 Posts: 1682 Credit: 477,343,364 RAC: 482 |
HAL, question for you. Since our 'benefactor' made his (very large) contribution to this science endeavor, why hasn't (at least as far as my limited knowledge of it all goes) any of those resources migrated toward where they can do the most good long term for everyone and the project as a whole - the back end (meaning the software that we all use to process all the work that the donation has provided)? I'd think that with the kind of dollars I have heard mentioned previously, there would be a percentage, let's say maybe 3-5%, that could be strictly dedicated to making the basic program the best it can be. That might include hiring some of the people who have so generously volunteered their time to this project (if they are interested, of course), or otherwise hiring full-time, highly qualified programming staff (maybe poach a programmer or two from Nvidia and ATI? Bwahaha!). This would enable us to efficiently run the latest work units that said contribution has provided in such large quantities. From that would flow much good science, as well as long-term stability (relative, of course, as hardware and OSes change over time), along with the ability to work with said vendors (ATI/Nvidia) in a serious way, so we might plan our path somewhat and not be blindsided by changes and have to scramble madly to get the product to work properly. Maybe this is all pie in the sky, but with the proper resources, directed the proper way, I think it should be doable? |
HAL9000 Send message Joined: 11 Sep 99 Posts: 6534 Credit: 196,805,888 RAC: 57 |
HAL, question for you. Since our 'benefactor' made his (very large) contribution to this science endeavor, why hasn't (at least as far as my limited knowledge of it all goes) any of those resources migrated toward where they can do the most good long term for everyone and the project as a whole - the back end (meaning the software that we all use to process all the work that the donation has provided)? I haven't really been following everything related to this large funding contribution. However, I know there is a lot more going on behind the scenes than we observe directly or get notified about. The amount that may be directed to SETI@home specifically is likely VERY small. I'm not sure the SETI Institute itself even got that large a chunk. I think most of the money was said to be going to dedicated telescope time globally. There may also be limitations on how the money can be allocated. SETI@home classic workunits: 93,865 CPU time: 863,447 hours Join the [url=http://tinyurl.com/8y46zvu]BP6/VP6 User Group[/url] |
Richard Haselgrove Send message Joined: 4 Jul 99 Posts: 14669 Credit: 200,643,578 RAC: 874 |
If you read between the lines of Matt's Technical News post a couple of weeks ago, I'd say that the benefactor's contribution to SETI@Home's general funding so far has been negative - and substantially so. Much extra work, but no extra bodies or brains to do it. To my mind, that would violate the UK concept of Full Cost Recovery (FCR) - but that's a local funding standard, not a global one. |
HAL9000 Send message Joined: 11 Sep 99 Posts: 6534 Credit: 196,805,888 RAC: 57 |
If you read between the lines of Matt's Technical News post a couple of weeks ago, I'd say that the benefactor's contribution to SETI@Home's general funding so far has been negative - and substantially so. Much extra work, but no extra bodies or brains to do it. I read that mostly as "Breakthrough Listen is funding Matt's paycheck so we haven't lost him" at the time. SETI@home classic workunits: 93,865 CPU time: 863,447 hours Join the [url=http://tinyurl.com/8y46zvu]BP6/VP6 User Group[/url] |
kittyman Send message Joined: 9 Jul 00 Posts: 51475 Credit: 1,018,363,574 RAC: 1,004 |
I got things to say/ I ain't done yet/. "Time is simply the mechanism that keeps everything from happening all at once." |
©2024 University of California
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.