Sunday, 23 November 2014

UK's role in world's largest radio telescope revealed

The UK is to design a computer system for the world’s largest radio telescope capable of handling data at more than 60 times the rate of the entire internet.

British engineers will also design the systems to control and connect the quarter of a million antennas, spread across two continents, that will form part of the Square Kilometre Array (SKA), it was announced today.

This technology, effectively the nervous system of the SKA, will create the most sensitive radio telescope ever, enabling scientists to study signals from the first billion years after the Big Bang and help explain how stars and galaxies formed.

‘The only really good way of doing that is to essentially increase the physical size of the antennas and the collecting area that you have,’ said Peter Dewdney, the SKA’s architect based at the organisation’s headquarters in Manchester, from where the entire project will be overseen.

‘We have a situation where we will have antennas scattered in an array configuration covering an area hundreds of kilometres in size. We have to connect all those together and the antennas are each gathering quite incredible amounts of information that has to be transported to a central location to be processed.’

[Image: low-frequency aperture antennas. Source: SKA. The SKA will include 250,000 and eventually millions of low-frequency aperture antennas.]

By the time the SKA starts operation in 2020, it will be producing raw data at an estimated rate of 20,000 petabytes a day. By comparison, the entire internet currently transports about 300 petabytes of data a day. This could rise by a factor of 10 when the final phase of the SKA is finished in 2028.
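A quick back-of-envelope check of those figures, using only the numbers quoted above, shows where the ‘more than 60 times the internet’ comparison comes from; the arithmetic below is illustrative only:

```python
# Rough comparison of the SKA's raw data rate with global internet traffic,
# using only the figures quoted in the article (illustrative arithmetic).
ska_phase1_pb_per_day = 20_000          # estimated raw data rate at 2020 switch-on
internet_pb_per_day = 300               # approximate daily internet traffic
ska_phase2_pb_per_day = 10 * ska_phase1_pb_per_day   # ~10x rise by 2028 (article estimate)

print(f"Phase 1: {ska_phase1_pb_per_day / internet_pb_per_day:.0f}x the internet")  # ~67x
print(f"Phase 2: {ska_phase2_pb_per_day / internet_pb_per_day:.0f}x the internet")  # ~667x
```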

‘The SKA is one of the exemplar projects for big data challenges in science and is going to be so over the next decade,’ said Prof Paul Alexander of Cambridge University, who will be leading development of the SKA’s data processing equipment.

Under a three-year project, the Cambridge team will work with industry to design the hardware and software capable of handling such large amounts of data. They will use cutting-edge chip technology to create vast data centres that process 100 petabytes of data each day – rising to 10,000 petabytes by 2028 – so that the data can be used for scientific research.
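To give a sense of scale, 100 petabytes a day corresponds to a sustained throughput of roughly nine terabits per second. A minimal sketch of that conversion, assuming decimal (SI) petabytes:

```python
# Convert the quoted daily processing volumes into sustained throughput
# (assumes decimal petabytes: 1 PB = 1e15 bytes).
SECONDS_PER_DAY = 86_400

def pb_per_day_to_tbit_per_s(pb_per_day: float) -> float:
    """Petabytes per day -> sustained terabits per second."""
    bits_per_second = pb_per_day * 1e15 * 8 / SECONDS_PER_DAY
    return bits_per_second / 1e12

print(f"100 PB/day    ~ {pb_per_day_to_tbit_per_s(100):.1f} Tbit/s")     # ~9.3 Tbit/s
print(f"10,000 PB/day ~ {pb_per_day_to_tbit_per_s(10_000):.0f} Tbit/s")  # ~926 Tbit/s
```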

A team at Manchester University, meanwhile, will lead development of the backbone communication system that will link the hundreds of radio telescope dishes and 250,000 low-frequency antennas spread across South Africa and Australia to the SKA’s central and regional data centres.

The second stage of development will expand the SKA to include an estimated 3,000 dishes, millions of low-frequency antennas and 250 mid-frequency array stations.

[Image: SKA dishes. Source: SKA. The size of the SKA will enable it to detect radio signals from the first billion years of the universe.]

British engineers from Oxford University will also be involved in developing the system that digitises and combines signals from the low-frequency antennas, effectively forming steerable beams that allow the telescope to point in multiple directions at once.
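The article does not spell out the beamforming method, but the standard idea behind an aperture array ‘pointing’ without moving parts is delay-and-sum beamforming: each antenna’s digitised signal is shifted according to the antenna’s position and the desired look direction, then the shifted signals are summed, and different delay sets applied to the same data give different simultaneous beams. A minimal illustrative sketch (the function name and the whole-sample delay approximation are ours, not the SKA design):

```python
import numpy as np

def delay_and_sum(signals, positions, look_dir, sample_rate, c=3.0e8):
    """Form one beam by aligning and summing digitised antenna signals.

    signals   : (n_antennas, n_samples) array of time samples
    positions : (n_antennas, 3) antenna positions in metres
    look_dir  : unit vector from the array towards the desired sky direction
    """
    # A plane wave from look_dir reaches antennas lying further along that
    # direction earlier; compensate by delaying each signal accordingly.
    delays = positions @ look_dir / c                    # seconds
    shifts = np.round(delays * sample_rate).astype(int)  # nearest-sample approximation
    aligned = np.stack([np.roll(sig, k) for sig, k in zip(signals, shifts)])
    return aligned.mean(axis=0)  # coherent sum: signal from look_dir adds in phase

# Applying several different look_dir values to the same digitised data yields
# several simultaneous beams, i.e. the array "points" in multiple directions at once.
```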

The total collecting area of all the telescope’s components will be equal to around one square kilometre. The resulting sensitivity will enable the SKA to detect radio signals produced by the hydrogen gas that is thought to have comprised most of the matter in the early universe and that gradually came together to form stars and galaxies.
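For a rough sense of how the components add up to that figure: the 3,000 dishes mentioned above would contribute only around half a square kilometre if each were about 15m across (an assumed diameter, not stated in the article), with the aperture arrays making up the rest. A hedged sketch of that arithmetic:

```python
import math

def dish_area_km2(n_dishes: int, diameter_m: float) -> float:
    """Total geometric collecting area of n circular dishes, in km^2."""
    return n_dishes * math.pi * (diameter_m / 2) ** 2 / 1e6

# 3,000 dishes (article figure) at an assumed ~15 m diameter:
print(f"{dish_area_km2(3_000, 15.0):.2f} km^2")  # ~0.53 km^2; aperture arrays supply the rest
```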

‘Radio telescopes can observe actually the hydrogen gas at this early stage and the things being built out of this gas,’ said Dewdney. ‘The telescopes can see the holes and the way the gas is being eaten away and structure of it as this happens.’

The SKA will also allow scientists to gain a greater understanding of gravity and of the so-called dark energy thought to be driving the expansion of the universe, and to look for signs of the complex molecules that could indicate the presence of life outside our solar system.


Readers' comments (2)

  • OK. So based on the figures above, at turn-on each day's data production will generate a 200-day backlog of data processing (20,000 PB produced vs 100 PB processed). Granted this will reduce over time, but given the figures, even if the produced data rate were to stay the same, the processing would eventually only get down to one day of backlog per day of produced data (20,000 produced vs 10,000 processed). But as produced data is predicted to increase 10-fold over the same period, this is optimistic at best!

    This begs the question, if the data processing will never catch up with the data production why produce so much data in the first place?

    Don't get me wrong, I'm not saying scrap the project; it's just: is it worthwhile to spend so much money on collecting data that may never get processed?

    It would surely be better to match data production and data processing, increasing both almost simultaneously.

    If you only search half the haystack you may miss the needle!


  • It could be in the wording of the article. Are they making vast data centers capable of processing 100 petabytes collectively, or can each data center process 100 petabytes a day, so that multiple data centers could potentially process the data coming in with no backlog? Even so, that is a huge number of data centers, and where does the balance between common sense and expense occur? Is it worth 300 billion+ to obtain signatures from the creation of time and space, or would that money be better spent building a better future for our planet, and a secure future for our existence?

