Doing Seti Research by ourselves

Message boards : Number crunching : Doing Seti Research by ourselves

Profile Tom M
Volunteer tester

Joined: 28 Nov 02
Posts: 5124
Credit: 276,046,078
RAC: 462
Message 2035660 - Posted: 4 Mar 2020, 12:59:16 UTC
Last modified: 4 Mar 2020, 13:03:12 UTC

I believe I read somewhere that it is possible to get at the "raw" data directly?

1) Download and "clean" a piece of data. Pick a particular "part" of the sky and remember it.
2) Analyze it taking however long it takes.
3) Save an abstract of the results.
4) Repeat until you have processed "enough" of the data from that specific point that you can compare "all" the repeated observations.
5) Do you "see" any patterns?
6) Pick another "location".
7) Wash/rinse/dry, repeat.
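The steps above could be sketched roughly like this. Every function name here is a hypothetical placeholder, not an existing tool, and the "data" is just synthesized noise standing in for real telescope samples:

```python
import random

def fetch_chunk(location, n=4096):
    """Stand-in for step 1: download raw samples for one sky location.
    Here we just synthesize Gaussian noise; a real tool would pull telescope data."""
    random.seed(hash(location) % (2**32))
    return [random.gauss(0.0, 1.0) for _ in range(n)]

def analyze(samples):
    """Steps 2-3: reduce a chunk to a small 'abstract' of summary statistics."""
    n = len(samples)
    mean = sum(samples) / n
    power = sum(s * s for s in samples) / n
    peak = max(abs(s) for s in samples)
    return {"n": n, "mean": mean, "power": power, "peak": peak}

def survey(location, passes=3):
    """Steps 4-5: repeat observations of one location and collect the abstracts
    so they can be compared for recurring patterns."""
    return [analyze(fetch_chunk(f"{location}/pass{i}")) for i in range(passes)]

abstracts = survey("RA=19h Dec=+34d")
print(len(abstracts), all(a["n"] == 4096 for a in abstracts))  # → 3 True
```

The point of the abstracts is that they are tiny compared to the raw data, so step 5 (comparing repeated observations) can run on one modest machine.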

Assuming we have a "common format" for the abstracted results, we could band together and have one machine, group of machines, or bunch of GPUs work on #5 while the rest of us decide which location to crunch data from next.
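That "common format" could be as simple as an agreed record shape serialized as one JSON object per line. A hypothetical sketch, with all field names purely illustrative:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ResultAbstract:
    # All field names are illustrative, not an agreed standard.
    location: str        # sky coordinates, e.g. "RA 19h25m / Dec +34d"
    observed_utc: str    # ISO 8601 timestamp of the observation
    freq_mhz: float      # center frequency of the band analyzed
    peak_snr: float      # strongest candidate's signal-to-noise ratio
    drift_hz_s: float    # Doppler drift rate of that candidate

record = ResultAbstract("RA 19h25m / Dec +34d", "2020-03-04T12:59:16Z",
                        1420.4, 8.7, -0.31)
line = json.dumps(asdict(record))   # one JSON object per line, easy to pool
restored = ResultAbstract(**json.loads(line))
print(restored == record)  # → True
```

Plain JSON lines would let anyone's results be pooled with a simple file concatenation, whatever platform produced them.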

There are several projects out there that have multiple weeks/months per task, so I suspect the issue is not the rate of processing but having a suite of tools that lets one or a few machines/GPUs take a single sky location all the way from raw data to a final search for a signal.

I know this is all way above my paygrade in terms of actually being able to design/program a system.

I would, however, be willing to alpha test, beta test, and process things that take weeks/months to get from one end to the other. I assume regular checkpointing. And I presume we couldn't use the Tbar/Petri GPU processing because it restarts so badly.
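Regular checkpointing for a weeks-long task needs nothing fancier than periodically writing the loop state to disk and resuming from it. A generic sketch (not how any existing SETI app actually does it):

```python
import os
import pickle
import tempfile

CKPT = os.path.join(tempfile.gettempdir(), "seti_demo.ckpt")
if os.path.exists(CKPT):
    os.remove(CKPT)  # start this demo from scratch

def load_state():
    """Resume from the last checkpoint, or start fresh."""
    if os.path.exists(CKPT):
        with open(CKPT, "rb") as f:
            return pickle.load(f)
    return {"next_chunk": 0, "partial_sum": 0.0}

def save_state(state):
    """Write atomically so a crash mid-write can't corrupt the checkpoint."""
    tmp = CKPT + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump(state, f)
    os.replace(tmp, CKPT)

state = load_state()
for chunk in range(state["next_chunk"], 10):
    state["partial_sum"] += chunk          # stand-in for real analysis work
    state["next_chunk"] = chunk + 1
    if chunk % 3 == 0:                     # checkpoint every few chunks
        save_state(state)
save_state(state)
print(state["partial_sum"])  # → 45.0
```

The write-then-rename step is the important part: killing the process at any moment leaves either the old checkpoint or the new one, never a half-written file.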

Tom
A proud member of the OFA (Old Farts Association).
ID: 2035660 · Report as offensive
rob smith
Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor
Volunteer moderator
Volunteer tester

Joined: 7 Mar 03
Posts: 22222
Credit: 416,307,556
RAC: 380
United Kingdom
Message 2035676 - Posted: 4 Mar 2020, 13:35:27 UTC

That's not a bad summary of the process.
There are a couple of hurdles to overcome. First, actually getting the data in a usable form, which shouldn't be too hard: someone posted a link around Christmas, so it's a "simple" case of getting permission to do a bulk download, popping the straw in and sucking hard.
Second, getting a couple of folks on board with the required computing skills: weak-signal analysis, spread over various platforms.
There would also have to be a few decisions made, such as which set of frequencies to look at within the "DC to light" range. Much of S@H's work was done on the water hole because, well, there are some good reasons, not the least of which is that it is quite well characterized and is kept almost clear of terrestrial rubbish (note the word "almost"....). Another consideration would be which telescope to select data from: continue with a northern-sky one, or be adventurous and look further south?
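For reference, the "water hole" is the quiet band between the neutral hydrogen line near 1420 MHz and the hydroxyl lines; the exact band edges vary by author, and the values below are one common choice. A trivial filter for it:

```python
# Approximate band edges in MHz; the exact definition varies by author.
H_LINE_MHZ = 1420.406    # 21 cm neutral hydrogen line (lower edge)
OH_LINE_MHZ = 1720.0     # highest of the hydroxyl lines (upper edge)

def in_water_hole(freq_mhz):
    """True if a frequency falls inside the classic SETI 'water hole' band."""
    return H_LINE_MHZ <= freq_mhz <= OH_LINE_MHZ

print(in_water_hole(1420.406), in_water_hole(1500.0), in_water_hole(2400.0))
# → True True False
```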
Bob Smith
Member of Seti PIPPS (Pluto is a Planet Protest Society)
Somewhere in the (un)known Universe?
ID: 2035676 · Report as offensive
juan BFP
Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor
Volunteer tester

Joined: 16 Mar 07
Posts: 9786
Credit: 572,710,851
RAC: 3,799
Panama
Message 2035677 - Posted: 4 Mar 2020, 13:36:31 UTC

The idea is good, but I believe we do not have the resources to do that.

BOINC needs a central server and someone who has the skills to take care of it, high-speed communication links, etc. That costs a lot of $, especially if it is US/EU based.

If that could be solved, then we need some way to get the data to be crunched (from Arecibo, GBT, etc.) and a group of scientists who could analyze the crunched data (Nebula?).

In short, we need another S@H structure to do that.

If you can find a solution to that, count me in.
ID: 2035677 · Report as offensive
rob smith
Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor
Volunteer moderator
Volunteer tester

Joined: 7 Mar 03
Posts: 22222
Credit: 416,307,556
RAC: 380
United Kingdom
Message 2035678 - Posted: 4 Mar 2020, 13:40:35 UTC

Add - I've found the link to the page where the data can be selected - https://breakthroughinitiatives.org/opendatasearch
To get at the data one needs to have a guess at a location, observation time and a couple of other things.....
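I don't know the site's actual query parameters, but locally the same kind of selection (by target and observation time window) is just filtering metadata records. A hypothetical sketch with made-up records and field names:

```python
from datetime import datetime

# Made-up metadata records; the real archive's fields will differ.
observations = [
    {"target": "HIP 54035", "utc": "2019-12-21T03:15:00", "telescope": "GBT"},
    {"target": "HIP 54035", "utc": "2020-01-04T22:40:00", "telescope": "GBT"},
    {"target": "Voyager 1", "utc": "2015-12-30T15:00:00", "telescope": "GBT"},
]

def select(obs, target, start, end):
    """Keep observations of one target inside a UTC time window."""
    lo, hi = datetime.fromisoformat(start), datetime.fromisoformat(end)
    return [o for o in obs
            if o["target"] == target
            and lo <= datetime.fromisoformat(o["utc"]) <= hi]

hits = select(observations, "HIP 54035", "2019-12-01T00:00:00", "2020-02-01T00:00:00")
print(len(hits))  # → 2
```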
Bob Smith
Member of Seti PIPPS (Pluto is a Planet Protest Society)
Somewhere in the (un)known Universe?
ID: 2035678 · Report as offensive
Profile Tom M
Volunteer tester

Joined: 28 Nov 02
Posts: 5124
Credit: 276,046,078
RAC: 462
Message 2035683 - Posted: 4 Mar 2020, 13:49:30 UTC - in response to Message 2035677.  


In short, we need another S@H structure to do that.

If you can find a solution to that, count me in.


Basically, what I am proposing is more decentralized processing. It wouldn't necessarily require a BOINC structure. It would require a new suite of programs that would let a small group or a single user process data until they have something that could be shared.

That could be either an intermediate stopping point, similar to what we do when we upload back to the Seti@Home server, or a "final" destination.

And that is an interesting question. Do we have big enough/fast enough computers that we could set up a replication of the first steps of Seti@Home running on a server with multiple virtual machines?

If yes, what would we narrow its focus down to, to let it handle a small enough batch of raw data for processing?

And then there is the question of the data analysis step that Nebula (for instance) represents.

All in all, we would need all the skills/talents that Rob Smith listed to build a new(er) suite of tools.

Tom
A proud member of the OFA (Old Farts Association).
ID: 2035683 · Report as offensive
Profile Raistmer
Volunteer developer
Volunteer tester

Joined: 16 Jun 01
Posts: 6325
Credit: 106,370,077
RAC: 121
Russia
Message 2036724 - Posted: 8 Mar 2020, 11:46:18 UTC

Well, to do something useful there should be a new approach to the search. New patterns. New frequency ranges. Maybe even a whole new spectrum band (like optical SETI).
And a skilled professional to define such patterns. That's the hard part. And then some new form of distributed processing could definitely be formed.
As I see it, the main issue with the current SETI@home search is that it's hugely undermanned. Just devastatingly so. The people who need to deal exclusively with the "hard part" mentioned above spend a lot of time on plain system administration.
SETI apps news
We're not gonna fight them. We're gonna transcend them.
ID: 2036724 · Report as offensive

©2024 University of California

SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.