CMS-HIGH ENERGY PHYSICS
The CMS Research Program at the LHC
The CMS (Compact Muon Solenoid) experiment is one of the major collider detectors at the Large Hadron Collider (LHC), located at the CERN international laboratory in Geneva, Switzerland. The LHC will open a new regime in particle physics up to and beyond the TeV energy scale, made accessible for the first time by a combination of high luminosity (number of collisions per second) and the world's highest center-of-mass energy of 14 TeV. The unprecedented energy range and sensitivity of this new particle accelerator, combined with the special capabilities of the CMS experiment for particle detection and measurement, are expected to lead to discoveries of new elementary particles and novel behaviors of the fundamental forces. Such discoveries at the smallest distance scales could have revolutionary effects on our understanding of the unification of forces, the origin and stability of matter, the ultimate underpinnings of the observable universe, and the nature of spacetime itself. The LHC, now being constructed by a global collaboration of thousands of physicists and engineers, will commence operation in 2007 with a scientific program that will continue for decades.
The search for new particles and physics processes beyond the so-called "Standard Model" of particle physics is the main motivation of the CMS scientific program. Chief among the sought-after particles is the Higgs boson, thought to be responsible for particle masses and a key (though as yet unobserved) element of the Standard Model. The Standard Model is known to be incomplete, and more fundamental theories predict new families of supersymmetric particles, extra dimensions, leptoquarks, and sub-constituents of the leptons and quarks that are currently thought to be the most basic components of matter. The National Science Foundation, in partnership with the US Department of Energy, has invested heavily in the LHC program, including detector construction, software, and computing. In Summer 2002, the NSF began funding the LHC research program, which includes support for operating and maintaining the experiment as well as for data analysis, through LHC startup and beyond.
The CMS Detector
CMS is a multipurpose discovery detector with very precise electromagnetic calorimetry and excellent coverage and momentum resolution for muons (see ). Although the word "Compact" appears in its name, CMS is compact only relative to ATLAS, a competing LHC experiment with a lower magnetic field and a larger volume. In fact, the detector is 20 meters long and 14 meters in diameter, and has a mass of 12,000 tons. The CMS detector design is optimized, in combination with the high energy and beam intensity (or "luminosity") of the LHC, to provide opportunities for new physics discoveries with the greatest achievable sensitivity, based on precision measurements of muons, electrons, and photons over a wide energy range. Several detector components, namely the 223 m² silicon vertex detector and tracker, the 80,000-crystal lead tungstate electromagnetic calorimeter, and the hadron calorimeter, are located inside a 4 Tesla superconducting solenoid. Outside the solenoid is the muon system, embedded in the iron return yoke. The forward calorimeter is also outside the solenoid. In addition there are the electronic readouts, controls, trigger system, and data acquisition system. Construction of the detector started in 1996 and is now more than 50% complete.
Implementation of modern object-oriented software and database systems was initiated in 1997; these systems are now in their third of four development cycles ("fully functional software") leading to production software, data handling systems, and worldwide networked Grid systems for data processing and analysis, to be ready in time for the start of LHC operations. The Software and Computing task is a global collaborative effort, with the US holding overall leadership roles and supporting approximately 25% of the cost. The detector is being constructed by the collaborating institutions and is by now approximately 60% completed. The total cost of the detector and computing facilities is approximately one billion dollars, using US accounting, with the US providing approximately 20% of the total.
CMS and US CMS Collaboration
At the present time the worldwide CMS collaboration includes approximately 1900 scientists and engineers from 150 institutions in 35 countries, including 420 from 38 US institutions. The CMS management is headquartered at CERN, including the Spokesman, Dr. Michel della Negra. In the US, Dr. Dan Green of Fermilab heads the CMS research program, which oversees the US CMS Construction, Software and Computing, and Maintenance and Operations projects. The US CMS collaboration is represented by its elected Collaboration Board Chair (H. Newman of Caltech) and Deputy Chair (V. Hagopian of Florida State), who are both Co-PIs of this proposal. The Univ. of Florida, Florida State Univ., and the California Institute of Technology are members of the CMS collaboration with major hardware and software responsibilities. FIU has officially applied to become a full member of CMS.
At the present time the CMS collaboration has no institutions from South and Central America, but institutions from Brazil (led by UERJ in Rio) and Mexico have applied to join the collaboration.
CMS Computing Challenges
CMS’ worldwide reach, and the unprecedented challenges it presents in several fields of Information Technology, make it a fertile and highly motivating foundation for an educational outreach program. This is driven by several aspects of the CMS program, including (1) collaborative data analysis by large and small sub-communities, distributed regionally, nationally, and in many cases globally; (2) the tremendous scale and complexity of the software infrastructures and networked systems required, and the artificial intelligence needed to deal with that complexity; (3) the scale of the processing and data handling systems, requiring a coordinated global effort to cover all the possible discovery scenarios, and to effectively separate the rare “signals” in the data that point the way to discoveries from the large, complex, and potentially overwhelming backgrounds; and (4) the multi-decade timescale (covering the design, construction, and operation phases of the experiment), with a consequent need for great flexibility as well as robust and innovative engineering in the computing, software, data, and network management systems that are now being developed.