“When a star, or another object such as a galaxy, is observed by a telescope or instrument, a researcher can be given an option on that star. The researcher then has temporary exclusive rights to publish results and conclusions from that data. The only legal issue the project has is one of intellectual property rights, in particular that you should never make somebody else’s proposal public. Stars are not very sensitive about IP or privacy.”
“If you go back thirty years, astronomers worked very differently. They would go to the telescope, take the data, bring the data back to their office, use them, and leave them in a drawer until they became unusable. There was no structured method of keeping them.
I think the reason that everything changed was space-based astronomy. In the 1970s the European Space Agency (ESA) launched the first European space-based observatory, COS-B, to observe gamma-rays. A gamma-ray detector must be positioned outside the atmosphere, and is consequently extremely expensive. Five years after the mission finished, the project team decided they should not keep this unique data to themselves, and made them public. They literally just sent out tapes to everyone who wanted them. These were very small datasets: you could get the whole dataset on one tape.
Consequently, ESA developed a policy that they would make all the data available in an archive that they would maintain forever. ESA, like NASA, both builds satellites and governs the process by which it decides which satellites will be built. This has now evolved into a really sophisticated archive, together with the clear statement: all of our data will be in that archive.
In the first part of the history of space astronomy, most missions were so-called Principal Investigator (PI) missions. The PI and the team (Co-PIs) that built the instrument had the data rights. In time, though, observatory-class missions became the norm. In these missions, observing time is allocated to a list of targets, each associated with an astronomer: a sort of option on a star, a galaxy or a quasar.
Typically, for an observatory mission there will be an announcement of opportunity every year. In response astronomers will write proposals saying: I would like to look at this, and this is the science I will do with it. Each proposal is assessed by a Time Allocation Committee and you are either granted time, or not. Each approved observation has a priority, which is used to optimize the schedule. In the next run, hopefully, your observations will be scheduled and you will have the rights to the data, typically for one year. The data go into the archive immediately, and after one year somebody flicks a switch and everybody can access them. So you have one year to do your science. The process of evaluating rival proposals for observatory-class missions basically takes the practice from ground-based research to the sky.
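The proprietary-period mechanism described above — data enter the archive immediately but only become public after the exclusivity window ends — can be sketched as a simple check. This is a minimal illustration, not any agency's actual implementation; the function name and the exact 365-day figure are assumptions based on the "typically one year" mentioned in the interview:

```python
from datetime import date, timedelta

# Hypothetical length of the proprietary period; the interview
# says the observer's exclusive rights last "typically one year".
PROPRIETARY_PERIOD = timedelta(days=365)

def is_public(archived_on: date, today: date) -> bool:
    """An observation enters the archive immediately, but only
    becomes publicly accessible once the proprietary period ends."""
    return today - archived_on >= PROPRIETARY_PERIOD

# Data archived on 1 March 2014:
print(is_public(date(2014, 3, 1), date(2014, 9, 1)))  # still proprietary: False
print(is_public(date(2014, 3, 1), date(2015, 3, 1)))  # switch flicked: True
```

The archive never withholds the data physically; only the access flag changes when the period expires.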
While I was working for the RUG I continued to write a few proposals. This interview happened because of my excitement when I was awarded a so-called target of opportunity. I work on active galactic nuclei, which are galaxies with a big black hole in the middle. These objects exhibit significant variability, and flares can be seen over various timescales. If you actually see them when they are flaring, you can do different science from when they are just sitting still. But you can’t say, please just look at it now, because you don’t know when it flares. So I wrote a proposal in which I said: if one of my listed galaxies is shown to flare by an X-ray monitor, I want to look at it with the INTEGRAL gamma-ray telescope. I had a list of 30 Seyfert 2 galaxies. Now, I had put in this particular proposal every year for the last five years and nothing ever flared (Seyfert 2 galaxies are less variable than many other types). And then, suddenly, one of them flared and I got very excited.
Yes, earlier my colleagues were starting to make jokes that these galaxies never flared. They can see the abstract of my proposal and a list of trigger criteria, so they knew what I was waiting for. They cannot read the detailed scientific proposal, otherwise they could see my ideas and copy them. Once you get data, you are not obliged to publish about it. But it is a very bad idea not to publish: the panel will hold it against you when you apply for an observation later on.
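The public trigger criteria of such a target-of-opportunity proposal amount to a watch list with thresholds, checked against the X-ray monitor's readings. A rough sketch, with the caveat that the source names and flux thresholds below are invented for illustration and are not the actual proposal's criteria:

```python
# Hypothetical target-of-opportunity trigger list: each watched
# Seyfert 2 galaxy has a flaring threshold (arbitrary flux units).
# Names and thresholds are illustrative only.
WATCH_LIST = {
    "NGC 1068": 2.5,
    "NGC 4945": 3.0,
    "Circinus": 1.8,
}

def check_trigger(monitor_readings: dict) -> list:
    """Return the watched sources whose measured flux meets or
    exceeds the threshold in the proposal's trigger criteria."""
    return [name for name, flux in monitor_readings.items()
            if name in WATCH_LIST and flux >= WATCH_LIST[name]]

readings = {"NGC 1068": 1.2, "Circinus": 2.4, "M31": 9.9}
print(check_trigger(readings))  # → ['Circinus']
```

Sources not on the list (here "M31") never trigger, no matter how bright — which is exactly why rivals who can read the trigger criteria know what the proposer is waiting for.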
There have been lots of discussions on how you can safeguard data, but I have never heard of a real incident. For instance, for the INTEGRAL gamma-ray mission, the data centre wanted to make some data immediately accessible to everybody, since it was useful for understanding the instrument. They put in very elaborate mechanisms to make this data available without enabling anybody to steal somebody else’s bit of sky. This was very elaborate, and INTEGRAL probably spent a lot of money on developing the mechanism.
Technically, however, the security mechanisms developed did not work for the later stage of the mission. This is because they decided to look at some areas of sky for several months (a deep survey) and, to maximize the scientific return, a number of astronomers were given data rights to one observation. Previously only one astronomer had rights to one observation. So they gave everyone data rights, with physical access to all the data, and they were clearly told which objects they could analyse and which they could not. If anyone had analysed somebody else’s object, when it was clearly not allowed, their reputation would be gone. Nobody, of course, ever did this. So, in my opinion there is no point in building in stringent safety measures.
The community has learned. One of the projects we are working on now, which goes back a bit to the PI mission concept, is EUCLID. The data go back to the scientists who worked to develop the mission. This is necessary because EUCLID is a survey mission: it surveys the entire sky, and to achieve its main scientific aim you need all the data. So the entire consortium has everything. In that particular case there is a board of the consortium that decides whether you can work on a given aspect and publish on it. There will be a small number of really important papers; if it succeeds it will hopefully produce some Nobel-prize-winning cosmology papers. In my case I’ll work on the less spectacular stuff. There will be a huge set of data, since it will be the highest resolution all-sky survey ever made, and many hundreds of papers will be written using it. I plan to continue to work on active galactic nuclei.
If you have so much data, you might wonder why we save them all. The reason we don’t destroy data is that an observation of a given star gives you information about that particular time. Stars vary with time, so if you destroy that data, you destroy something unique. The instruments we have now are much better, yet the data of thirty years ago are still useful. All the data are archived. There is a site in Madrid where ESA archives all of its data. Meanwhile, the NASA HEASARC laboratory archives copies of all the X-ray and gamma-ray data they can get.
Everybody can download data: anybody can connect to the servers via the WWW. But it is of little interest to most people. There are, however, tools available which allow the general public to access astronomical data. Two spring to mind. One, developed by Microsoft, is called the World Wide Telescope. The other is the Digital Universe of the Hayden Planetarium. Both of them have collections of data already in them. You can download them and look at them on your computer. These tools are relatively simple, but they allow you to access the collections that are already in them.
As part of the Target outreach programme we wrote an interface to allow the Digital Universe viewer to access some of our local data. There is a plan to adapt the viewer so it can access data from the Virtual Observatory. That is a standard interface that all these archives have. So if you say you are interested in a specific star, you can find the star in all the archives. It allows astronomers to interrogate multiple data centres in a seamless and transparent way. This is made possible by standardization of data and metadata, by standardization of data exchange methods, and by the use of a registry, which lists available services and what can be done with them. The Virtual Observatory is very nice. We have the idea to seamlessly link the Digital Universe viewer into this professional observatory. However, we have not done this yet. Note that doing so would also allow Infoversum to access the Virtual Observatory, since it uses the same file format as the Digital Universe collection.
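The "find this star in all the archives" query that the Virtual Observatory standardizes is essentially a cone search: return every catalogued object within some angular radius of a sky position. Below is a toy, self-contained version; the catalogue entries are invented, and a real query would go through a registered VO service over the network rather than a local list:

```python
import math

# Toy catalogue standing in for one remote archive; positions are
# illustrative (right ascension / declination in degrees).
CATALOGUE = [
    {"name": "Star A", "ra": 10.68, "dec": 41.27},
    {"name": "Star B", "ra": 10.70, "dec": 41.26},
    {"name": "Star C", "ra": 150.00, "dec": -30.00},
]

def angular_separation(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (spherical law of cosines)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(dec1) * math.sin(dec2)
               + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    # Clamp against floating-point overshoot before acos.
    return math.degrees(math.acos(min(1.0, max(-1.0, cos_sep))))

def cone_search(ra, dec, radius):
    """Return catalogue entries within `radius` degrees of (ra, dec),
    mimicking the shape of a VO cone-search query."""
    return [entry for entry in CATALOGUE
            if angular_separation(ra, dec, entry["ra"], entry["dec"]) <= radius]

print([e["name"] for e in cone_search(10.69, 41.27, 0.1)])  # → ['Star A', 'Star B']
```

In the real Virtual Observatory, the registry tells a client which archives offer this kind of service, and the same query is then sent to each of them in a uniform way.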
It is important to note that all these missions have differences, but for all of them the data become publicly available within a short time. This is because of the costs. The cost of the EUCLID ground system alone is 100 million euro, so you have to justify it. There are also fewer missions now, so you have to maximize the return. And because these missions are so expensive, you want the whole community to work on the data. You have to make the most of it.”