Einstein@Home

Einstein@Home is a distributed computing project hosted by the University of Wisconsin–Milwaukee and running on the Berkeley Open Infrastructure for Network Computing (BOINC) software platform. It searches through data from the LIGO experiment for evidence of gravitational waves from continuous wave sources, which may include pulsars.


Introduction

Einstein@Home is designed to search data collected by the Laser Interferometer Gravitational-Wave Observatory (LIGO) and GEO 600 for gravitational waves. The project was officially launched on 19 February 2005 as part of the American Physical Society's contribution to the World Year of Physics 2005.[1] It uses volunteer-driven distributed computing to solve the computationally intensive problem of analyzing a large volume of data. This approach was pioneered by the SETI@home project, which looks for signs of extraterrestrial life by analyzing radio wave data. Einstein@Home runs on the same software platform as SETI@home, the Berkeley Open Infrastructure for Network Computing (BOINC).

As of January 2010, over 248,000 volunteers in 214 countries have participated in the project, making it the third most popular BOINC project.[2] About 39,000 active users contribute about 220 teraFLOPS of computational power, which would rank Einstein@Home among the top 20 on the TOP500 list of supercomputers.

Scientific objectives

The Einstein@Home project searches for continuous wave sources of gravitational radiation via an "all-sky search".[3] These sources may include gravitational radiation from pulsars. Einstein@Home may result in the first confirmed direct detection of a gravitational wave. A successful detection of gravitational waves would constitute a significant milestone in physics, as it would be the first detection of a previously unknown astronomical object by means of gravitational radiation alone.

Data analysis

The Einstein@Home program processes data from the LIGO and GEO instruments using Fast Fourier Transforms. The resulting signals are then analyzed using a method called matched filtering. This method involves the computation of hypothetical signals that might result if there were a physically plausible source of gravitational waves in the part of the sky being examined. The measured signal is then compared to the hypothetical signal. A matching signal is a candidate for further examination by more sophisticated analysis.[4]
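To illustrate the core idea of matched filtering (the actual Einstein@Home search is far more elaborate, working with long coherent stretches of frequency-domain data and templates parameterized by sky position, frequency and spin-down), here is a minimal sketch in Python/NumPy. It correlates a noisy time series against a hypothetical template waveform and reports the best-matching offset; the waveform, noise level, sample rate and injection point are invented purely for illustration.

```python
import numpy as np

def matched_filter(data, template):
    """Correlate data against a template via the FFT (circular correlation),
    normalized by the template energy. Returns one value per time offset."""
    n = len(data)
    # Zero-pad the template to the data length so both FFTs have the same size.
    padded = np.zeros(n)
    padded[:len(template)] = template
    # Time-domain correlation = product with the conjugate spectrum in the frequency domain.
    corr = np.fft.irfft(np.fft.rfft(data) * np.conj(np.fft.rfft(padded)), n)
    return corr / np.sum(template ** 2)

# Illustrative (made-up) example: a weak sinusoid buried in Gaussian noise.
rng = np.random.default_rng(0)
fs = 1024                                        # sample rate in Hz (arbitrary here)
t = np.arange(4 * fs) / fs                       # 4 seconds of "detector" data
template = np.sin(2 * np.pi * 200.0 * t[:fs])    # 1 s hypothetical signal waveform

data = 0.5 * rng.standard_normal(len(t))         # simulated noise
start = 2 * fs
data[start:start + fs] += 0.2 * template         # inject the weak signal

match = matched_filter(data, template)
print("best match at sample", np.argmax(match), "(signal injected at", start, ")")
```

The point of the sketch is only the correlate-and-threshold step: a candidate is flagged when the correlation against a physically plausible template rises well above the level expected from noise alone, which is then followed up with more sophisticated analysis.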

Einstein@Home analyzes data from the LIGO S3, S4 and S5 data sets, each of which represents an improvement in accuracy over the previous data set. Processing of the S3 data set was conducted between 22 February 2005 and 2 August 2005. Work on the S4 data set began interlaced with the S3 calculations and finished in July 2006. Processing of the S5 data set, which should reach design sensitivity for the first time, began on 15 June 2006.[5]

On March 24, 2009, it was announced that the Einstein@Home project would begin analyzing data taken by the PALFA Consortium at the Arecibo Observatory in Puerto Rico.[6]

On November 26, 2009, a CUDA-optimized application for the Arecibo Binary Pulsar Search was announced on the official Einstein@Home web pages. This application uses both a regular CPU and an NVIDIA GPU to perform the analysis faster (in some cases up to 50% faster).[7]


Optimized data analysis

Einstein@Home gained considerable attention from the world's distributed computing community when an optimized application for the S4 data set analysis was developed and released in March 2006 by project volunteer Akos Fekete, a Hungarian programmer.[8] Fekete improved the official S4 application by introducing SSE, 3DNow! and SSE3 optimizations into the code, improving performance by up to 800%.[citation needed] Fekete was recognized for his efforts and was afterwards officially involved with the Einstein@Home team in the development of the new S5 application.[9] By late July 2006 this new official application had become widely distributed among Einstein@Home users, creating a large surge in the project's total performance and productivity, best measured by floating-point speed (FLOPS), which increased by approximately 50% compared to the non-optimized S4 application.[10]

