DIRAC: a community grid solution
A Tsaregorodtsev, M Bargiotti, N Brook… - Journal of Physics …, 2008 - iopscience.iop.org
The DIRAC system was developed in order to provide a complete solution for using the
distributed computing resources of the LHCb experiment at CERN for data production and …
GridPP: the UK grid for particle physics
D Britton, AJ Cass, PEL Clarke… - … of the Royal …, 2009 - royalsocietypublishing.org
The start-up of the Large Hadron Collider (LHC) at CERN, Geneva, presents a huge
challenge in processing and analysing the vast amounts of scientific data that will be …
The CMS dataset bookkeeping service
A Afaq, A Dolgert, Y Guo, C Jones… - Journal of Physics …, 2008 - iopscience.iop.org
Abstract The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all
CMS event data from Monte Carlo and Detector sources. It provides the ability to identify MC …
Data transfer infrastructure for CMS data taking
R Egeland, T Wildish, S Metson - XII Advanced Computing and …, 2008 - academia.edu
In order to meet the data distribution requirements [1, 2, 3] of the CMS [4] experiment at the
LHC, the Physics Experiment Data Export (PhEDEx)[5, 6] project was designed to facilitate …
A classification of file placement and replication methods on grids
J Ma, W Liu, T Glatard - Future Generation Computer Systems, 2013 - Elsevier
This paper presents a classification of file placement and replication methods on grids. The
study is motivated by file transfer issues encountered in the Virtual Imaging Platform …
CRAB3: Establishing a new generation of services for distributed analysis at CMS
M Cinquilli, D Spiga, C Grandi… - Journal of Physics …, 2012 - iopscience.iop.org
In CMS Computing the highest priorities for analysis tools are the improvement of the end
users' ability to produce and publish reliable samples and analysis results as well as a …
CRAB: a CMS application for distributed analysis
G Codispoti, C Mattia, A Fanfani… - … on Nuclear Science, 2009 - ieeexplore.ieee.org
Beginning in 2009, the CMS experiment will produce several petabytes of data each year,
which will be distributed over many computing centres located in different …
The CMS data aggregation system
V Kuznetsov, D Evans, S Metson - Procedia Computer Science, 2010 - Elsevier
Meta-data plays a significant role in large modern enterprises, research experiments and
digital libraries where it comes from many different sources and is distributed in a variety of …
Distributed analysis in CMS
A Fanfani, A Afaq, JA Sanches, J Andreeva… - Journal of Grid …, 2010 - Springer
The CMS experiment expects to manage several Pbytes of data each year during the LHC
programme, distributing them over many computing sites around the world and enabling …
DIRAC: reliable data management for LHCb
AC Smith, A Tsaregorodtsev - Journal of Physics: Conference …, 2008 - iopscience.iop.org
Abstract DIRAC, LHCb's Grid Workload and Data Management System, utilizes WLCG
resources and middleware components to perform distributed computing tasks satisfying …
resources and middleware components to perform distributed computing tasks satisfying …