Scientific computing for particle physics, astroparticle and cosmology
The Grid is a computing paradigm in which distributed resources (computing power, storage) are shared among members of communities called Virtual Organizations (VOs).
For a few years now, high-energy physics experiments have been using Grid technologies in the framework of the Worldwide LHC Computing Grid (WLCG) project. Within WLCG, the computing infrastructure required for the simulation, processing and analysis of the data of CERN's Large Hadron Collider (LHC) experiments was built and is now operated and continuously enhanced. The WLCG is essential for the fruitful exploitation of the LHC data and was instrumental in the work that led to the discovery of the Higgs boson in 2012.
CIEMAT's Scientific Computing group participates actively in the WLCG project. On the infrastructure side, the group operates a Tier-2 resource center for the CMS experiment at CIEMAT (the CIEMAT-LCG2 site) and participates in the Tier-1 resource center for all the LHC experiments at the Port d'Informació Científica (the PIC site). In addition, the group contributes to software developments for the Grid and for the experiments' applications, and provides support to Grid users.
Other Grid Computing Projects and Communities
Besides WLCG, the group has participated in several other Grid-related European projects over the years. Such projects include the DataGrid project, Enabling Grids for E-sciencE (EGEE), and, more recently, the European Grid Infrastructure (EGI) project.
All these projects have shared the goal of providing researchers with access to a geographically distributed Grid computing infrastructure, available 24 hours a day. They focus on developing and maintaining the middleware for the different Grid services, and on operating the large infrastructure for the benefit of a vast and diverse research community.
Besides the LHC experiments, both the PIC and CIEMAT-LCG2 Grid sites offer their Grid resources to other scientific communities. These include experiments in which local researchers participate (such as the Cherenkov Telescope Array (CTA), CALICE, fusion, and EGI's general physics VO), but also others (e.g., biomed).
New Computing Architectures
Existing and emerging large-scale research projects are producing increasingly large volumes of data at faster and faster rates. The success of existing and future scientific and experimental programmes depends, among other factors, on the efficient exploitation of recent and future advances in computing technology. For this reason, we are exploring and exploiting emerging computational paradigms and platforms.
Specialised platforms like graphics processing units (GPUs) have introduced a shift in how computing systems and software must be designed to keep improving performance and efficiency. Many large scientific research projects require increasing computational power which cannot be achieved by simply adding more and ever-faster CPUs, both from an overall-cost and a power-consumption point of view. It is important to research data-processing algorithms that can make good use of the data-parallel processing features of GPUs. The algorithms can be implemented using open standards like OpenCL or OpenMP, to remain CPU- and GPU-independent, or in a vendor-specific language if that brings clear performance benefits.
The group has worked on porting some cosmological correlation functions to GPUs, achieving large performance improvements that have made it possible to process large datasets within reasonable execution times. The group is also involved in performance-evaluation studies of the online (real-time) filtering (triggering) of large quantities of high-rate data using GPUs.
The Worldwide LHC Computing Grid (WLCG) provides global computing resources to store, distribute and analyse the tens of petabytes of data generated annually by the Large Hadron Collider (LHC) at CERN.
The Port d'Informació Científica (PIC), founded in 2003 and maintained through a collaboration agreement between CIEMAT and IFAE, is a data center of excellence for scientific-data processing, supporting scientific groups working on projects that require substantial computing resources for the analysis of massive sets of distributed data. The Spanish Tier-1 center for the LHC is hosted at PIC.