LHC Data to be Made Public Via Open Access Initiative


The Compact Muon Solenoid (CMS) experiment is one of the two general-purpose experiments constructed to search for new physics by detecting the wide range of particles and phenomena produced in high-energy proton-proton and heavy-ion collisions. The CMS detector is located at the Large Hadron Collider (LHC) at CERN, the European Laboratory for Particle Physics (officially the European Organization for Nuclear Research; the name CERN derives from the acronym of the French Conseil Européen pour la Recherche Nucléaire).

Scientists at CMS are striving to answer the most fundamental questions about the Universe, such as “What is the Universe really made of, and what forces act within it?” and “What gives everything substance?”

LHC CMS data are exotic. They are complicated. They are big. At peak performance, about one billion proton collisions take place every second inside the CMS detector at the LHC. Around 64 petabytes of analyzable data have been collected from these collisions so far.
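To give a rough sense of these scales, the short Python sketch below does some back-of-envelope arithmetic using only the figures quoted above (one billion collisions per second at peak, roughly 64 petabytes collected). It is purely illustrative: the actual recorded data volume depends on the CMS trigger system, which keeps only a small fraction of collisions.

```python
# Illustrative arithmetic based on the figures quoted in the article.
# Not an official CMS calculation; the trigger system records only a
# tiny fraction of the collisions that occur.

PEAK_COLLISIONS_PER_SECOND = 1e9   # "about one billion proton collisions ... every second"
DATA_COLLECTED_PETABYTES = 64      # "around 64 petabytes of analyzable data"

seconds_per_day = 24 * 60 * 60
collisions_per_day_at_peak = PEAK_COLLISIONS_PER_SECOND * seconds_per_day

bytes_collected = DATA_COLLECTED_PETABYTES * 1e15  # using 1 PB = 10^15 bytes (SI)

print(f"Collisions per day at peak rate: {collisions_per_day_at_peak:.2e}")
print(f"Analyzable data collected so far: {bytes_collected:.2e} bytes "
      f"(~{bytes_collected / 1e12:.0f} terabytes)")
```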

Kati Lassila-Perini, head of the CMS Data Preservation and Open Access project at the Helsinki Institute of Physics, has said, “We must make sure that we preserve not only the data but also the information on how to use them.” As part of this effort, the Open Access project intends to make available the CMS data that are no longer under active analysis. The first set of these data will be released in the second half of 2014 and will comprise a portion of the data collected by CMS in 2010.

Additional information about CMS, the Data Preservation and Open Access project, the LHC and CERN can be found via the following links: