Taking a closer look at LHC


LHC DATA


At its design parameters, the LHC produces over 600 million proton-proton collisions per second (~10⁹ collisions/s) in the ATLAS or CMS detectors. The amount of data collected for each event is around 1 MB (1 megabyte).

10⁹ collisions/s  ×  1 MB/collision  =  10¹⁵ bytes/s  =  1 PB/s (1 petabyte per second)

Since 1 DVD holds about 5 GB, this would fill roughly 200,000 DVDs per second, or about 6,000 iPods (the 160 GB model) per second!
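A minimal back-of-the-envelope check of these numbers in Python; the rates and storage sizes are just the approximate figures quoted above, not exact detector specifications:

```python
# Back-of-the-envelope check of the raw LHC data rate quoted above.
collisions_per_second = 1e9      # ~10^9 proton-proton collisions/s at design parameters
bytes_per_event = 1e6            # ~1 MB of data per event

raw_rate_bytes = collisions_per_second * bytes_per_event
print(f"Raw data rate: {raw_rate_bytes:.0e} bytes/s  (~1 PB/s)")

# Rough storage equivalents
dvd_bytes = 5e9                  # ~5 GB per DVD
ipod_bytes = 160e9               # 160 GB iPod model

print(f"DVDs filled per second:  {raw_rate_bytes / dvd_bytes:,.0f}")
print(f"iPods filled per second: {raw_rate_bytes / ipod_bytes:,.0f}")
```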

This is several orders of magnitude greater than what any detector data acquisition system can handle.

A trigger is designed to reject the uninteresting events and keep the interesting ones (more information about the trigger can be found on this page).

For example, the ATLAS trigger system is designed to collect about 200 events per second.

200 events/s  ×  1 MB/event  =  200 MB/s (200 megabytes per second)

Taking two shifts of ten hours per day, and about 300 days per year:

200 MB/s  ×  2 shifts  ×  10 h  ×  3600 s/h  ×  300 days  ≈  4·10¹⁵ bytes/year  =  4 PB/year
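The same yearly estimate can be reproduced with a few lines of Python, using the trigger rate and running schedule quoted above:

```python
# Yearly data volume recorded by ATLAS after the trigger, using the figures above.
events_per_second = 200          # events kept by the trigger
bytes_per_event = 1e6            # ~1 MB per event

shifts_per_day = 2
hours_per_shift = 10
seconds_per_hour = 3600
running_days_per_year = 300

seconds_per_year = shifts_per_day * hours_per_shift * seconds_per_hour * running_days_per_year
bytes_per_year = events_per_second * bytes_per_event * seconds_per_year

print(f"Recorded rate: {events_per_second * bytes_per_event / 1e6:.0f} MB/s")
print(f"Yearly volume: {bytes_per_year:.1e} bytes  (~4 PB/year)")
```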

Collectively, the LHC experiments produce about 15 petabytes of raw data each year that must be stored, processed, and analyzed.

A three-level trigger is used to select events that show signs of interesting physics processes (a schematic sketch follows the list below).

  • The first level is a hardware-based trigger that selects events with large energy deposits in the calorimeters or hits in the muon chambers.
  • The level-2 trigger is software based and selects events using a rudimentary analysis of the regions of interest identified at level 1.
  • The level-3 trigger performs a preliminary reconstruction of the entire event; events selected at this level are stored for offline analysis.
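A highly simplified sketch of how such a cascade reduces the event rate; the event fields, selection functions and thresholds here are invented placeholders for illustration, not the actual ATLAS trigger logic:

```python
# Toy model of a three-level trigger cascade: each level applies a cheap test
# and only events that pass are handed to the next, more expensive level.
# Event fields and thresholds are invented for illustration only.
import random

def level1(event):
    # Hardware-like test: coarse calorimeter energy deposit or a muon-chamber hit
    return event["calo_energy"] > 50.0 or event["muon_hits"] > 0

def level2(event):
    # Software test on a region of interest flagged by level 1
    return event["roi_quality"] > 0.8

def level3(event):
    # Preliminary full-event reconstruction; keep only well-reconstructed events
    return event["reco_score"] > 0.9

def make_event():
    return {
        "calo_energy": random.expovariate(1 / 20.0),             # GeV, toy distribution
        "muon_hits": random.choices([0, 1], weights=[0.99, 0.01])[0],
        "roi_quality": random.random(),
        "reco_score": random.random(),
    }

events = [make_event() for _ in range(100_000)]
after_l1 = [e for e in events if level1(e)]
after_l2 = [e for e in after_l1 if level2(e)]
after_l3 = [e for e in after_l2 if level3(e)]

print(f"Input: {len(events)}  L1: {len(after_l1)}  L2: {len(after_l2)}  L3 (stored): {len(after_l3)}")
```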

To achieve the physics goals, sophisticated algorithms are applied successively to the raw data collected by each experiment in order to extract physical quantities and observables of interest that can be compared to theoretical predictions.
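As a rough illustration of this idea of successive processing stages, the following sketch chains two toy stages together; the stage names and the quantities they compute are schematic placeholders, not the reconstruction software of any experiment:

```python
# Schematic chain of processing stages, from raw detector data to observables.
# Every stage name and quantity here is an illustrative placeholder.

def reconstruct(raw_event):
    # Turn raw detector signals into physics objects (tracks, clusters, ...)
    return {"tracks": raw_event["hits"] // 3, "clusters": raw_event["cells"] // 10}

def derive_observables(reco_event):
    # Derive higher-level quantities that can be compared with theory
    return {"multiplicity": reco_event["tracks"] + reco_event["clusters"]}

stages = [reconstruct, derive_observables]

data = {"hits": 120, "cells": 500}   # toy raw data
for stage in stages:
    data = stage(data)               # each stage consumes the previous stage's output

print(data)                          # {'multiplicity': 90}
```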

The next figure (taken from the book The Large Hadron Collider: A Marvel of Technology, edited by Lyndon Evans) shows a high-level view of the data flow and the principal processing stages involved in this process.


For more information see: EVANS, L. (Ed.), The Large Hadron Collider: A Marvel of Technology, CERN and EPFL Press (2009), Chapter 5.6.



