HEP Grid in Russia (kryukov 20041004)


Published on October 12, 2007

Author: Arkwright26

Source: authorstream.com

HEP Grid in Russia
A. Kryukov, Skobeltsyn Institute of Nuclear Physics, Lomonosov Moscow State University (kryukov@theory.sinp.msu.ru)
DESY, 4 October 2004

Outline
- Introduction
- LHC experiments: the challenge in computing
- Tier0/Tier1/Tier2 centers
- LCG: the LHC Computing Grid project
- Middleware
- The EGEE project
- Russia in the EGEE project
- Russian CIC/ROC
- LHC Data Challenges in Russia
- Conclusions

Introduction
What is a Grid?
- "Flexible, secure, coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations." (I. Foster, "What is the Grid? A Three Point Checklist", GRIDToday, July 20, 2002)
Why Grid?
- Moore's law -> highly functional end systems
- Increased use of the Internet -> highly universal connectivity
- Changing modes of working and problem solving: computation addresses collaboration
- Cost savings

Why does Russia take part in Grid projects?
- To gain experience in developing and deploying modern information (Grid) technology
- Integration into the European information infrastructure
- Russia makes a large contribution to the CERN experiments (ALICE, ATLAS, CMS, LHCb)
- Russia is a geographically "distributed" country by nature

Why is Moscow State University interested in Grid?
- A very large site: more than 25,000 staff and about 30,000 students
- Many platforms: PCs (both Linux and MS Windows), Unix workstations, supercomputers, parallel clusters
- A wide-area site: many departments and institutes, plus geographically separated sites (a biological station, nuclear and HEP labs)
- Many scientific contacts and collaborations; in HEP: DESY, CERN, FNAL, ...

HEP challenge in computing
(diagram slide)

Tier1 centers
- Keep certain portions of the RAW, ESD and simulated ESD data, and full copies of the AOD, TAG and calibration data
- Data processing and further reprocessing passes
- Official physics-group large-scale data analysis (collaboration-endorsed massive processing)
- For ALICE and LHCb: contribution to simulations

Tier2 centers
- Keep certain portions of the AOD and full copies of the TAG for both real and simulated data (LHCb stores only simulated data at Tier2s)
- Keep small selected samples of ESD
- Produce simulated data
- General end-user analysis

Russian Tier2 Cluster
- A cluster of institutional computing centers with Tier2 functionality and summary resources at the 50-70% level of the canonical Tier1 center for each experiment (ALICE, ATLAS, CMS, LHCb); a sizing sketch follows below
- Basic functions: analysis, simulations, user data support
- Participating institutes:
  - Moscow: ITEP, SINP MSU, RRC KI, LPI, MEPhI, ...
  - Moscow region: JINR, IHEP, INR RAS
  - St. Petersburg: PNPI RAS, St-PSU
  - Novosibirsk: BINP SB RAS
- The Russian Tier2 Cluster is planned to be connected to the CERN Tier1 center

Physics-channel analysis of the requirements on computing resources
(table slide)

Russian Tier2 Cluster (resources)
(table slide)
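The 50-70% sizing rule above translates directly into per-experiment resource targets. A minimal sketch of that arithmetic in Python, assuming purely illustrative canonical Tier1 capacities (placeholders, not the figures from the resource tables, which did not survive extraction):

```python
# Sketch: resource targets for the Russian Tier2 Cluster, sized at
# 50-70% of a canonical Tier1 center per experiment. The Tier1
# capacities below are illustrative placeholders only.

CANONICAL_TIER1 = {
    "ALICE": {"cpu_ksi2k": 1000, "disk_tb": 400},
    "ATLAS": {"cpu_ksi2k": 1500, "disk_tb": 600},
    "CMS":   {"cpu_ksi2k": 1500, "disk_tb": 600},
    "LHCb":  {"cpu_ksi2k": 600,  "disk_tb": 250},
}

def tier2_target_band(tier1, low=0.5, high=0.7):
    """Return the (low, high) target band for every resource of every experiment."""
    return {
        exp: {res: (low * cap, high * cap) for res, cap in resources.items()}
        for exp, resources in tier1.items()
    }

for exp, band in tier2_target_band(CANONICAL_TIER1).items():
    cpu = band["cpu_ksi2k"]
    disk = band["disk_tb"]
    print(f"{exp}: {cpu[0]:.0f}-{cpu[1]:.0f} kSI2k CPU, {disk[0]:.0f}-{disk[1]:.0f} TB disk")
```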
Russian Tier2 Cluster planning
(table slide)

LCG: the LHC Computing Grid project
- Mission: to prepare, deploy and operate the computing environment for the experiments to analyze the data from the LHC detectors
- Phase 1 (2002-2005): build a prototype based on existing grid middleware; deploy and run a production service; produce the Technical Design Report for the final system
- Phase 2 (2006-2008): build and commission the initial LHC computing environment
- LCG is NOT a middleware development project, but problem fixing is permitted (even when writing code is required)
- LCG-2 is the first production service for EGEE

LCG release history
- January 2003: the GDB agreed to take VDT and EDG components
- March 2003: LCG-0, built from existing middleware while waiting for the EDG-2 release
- September 2003: LCG-1, three months late and therefore with reduced functionality
- December 2003: LCG-2, with the full set of functionality for the Data Challenges and the first MSS integration; deployed in January to 8 core sites; the Data Challenges started in February, i.e. testing in production
- May 2004 onward: monthly incremental releases (not all of them distributed to external sites); services, functionality, stability and packaging improved step by step; timely response to experience from the Data Challenges

LCG middleware (a job-submission sketch follows below)
- Globus (www.globus.org), via VDT
- EDG
- Condor (www.cs.wisc.edu/condor)
- PBS/LSF

MW components of LCG/EGEE
(diagram slide)
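To make the stack concrete: on an LCG-2 User Interface, a job is described in the EDG Job Description Language (JDL) and handed to the Resource Broker, which matches it against the information system. A minimal sketch, assuming a User Interface with the standard edg-job-submit command installed and a valid grid proxy; the VO name "cms" and the Requirements expression are placeholder choices:

```python
# Sketch: submitting a job through the LCG-2 workload management system.
# Assumes an LCG-2 User Interface with edg-job-submit on the PATH and a
# valid proxy certificate; the VO and Requirements are placeholders.
import subprocess
import tempfile

JDL = """\
Executable          = "/bin/hostname";
StdOutput           = "std.out";
StdError            = "std.err";
OutputSandbox       = {"std.out", "std.err"};
VirtualOrganisation = "cms";
Requirements        = other.GlueCEPolicyMaxCPUTime >= 60;
"""

with tempfile.NamedTemporaryFile("w", suffix=".jdl", delete=False) as f:
    f.write(JDL)
    jdl_path = f.name

# The Resource Broker matches the Requirements expression against GLUE
# attributes published by the sites and picks a suitable Computing Element.
result = subprocess.run(["edg-job-submit", "--vo", "cms", jdl_path],
                        capture_output=True, text=True)
print(result.stdout)  # on success, contains the https://... job identifier
```

The job is then tracked and collected with the matching edg-job-status and edg-job-get-output commands.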
The EGEE project
- EGEE brings together experts from 70 organisations and 27 countries with the common aim of developing a service grid infrastructure in Europe that is available to scientists 24 hours a day
- The project aims to provide researchers in academia and industry with access to major computing resources, independent of their geographic location
- Two pilot application domains have been selected to guide the implementation and certify the performance and functionality of the evolving infrastructure: the LHC Computing Grid and biomedical grids
- With funding of over 30 million euro from the European Commission, the project is one of the largest of its kind; EGEE is a two-year project conceived as part of a four-year programme

Grid deployment in e-infrastructures
- Goal: create a Europe-wide grid production infrastructure on top of the present and future EU research-network infrastructure
- Build on: major investments by the EU and the EU member states
- Scope: operations services, networking, pilots
- Middleware: hardening and re-engineering of existing middleware functionality
- Approach: leverage current and planned national and regional grid programmes
- Layered picture: applications, on top of the grid infrastructure, on top of the GEANT network

GEANT: the European network for science
(map slide)

LCG/EGEE
- 78 sites, ~9400 CPUs, ~6.5 PB of storage

Russian participation in LCG/EGEE
Eight institutes:
- IHEP, Protvino
- ITEP, Moscow
- JINR, Dubna
- KIAM RAS, Moscow
- PNPI, St. Petersburg
- RRC KI, Moscow
- SINP MSU, Moscow
- IMPB RAS, Pushchino

Russian participation in LCG/EGEE (resources)
Site             | CPUs | Disk    | Middleware
IHEP, Protvino   | 93   | 2.14 TB | LCG-2.2.0
ITEP, Moscow     | 40   | 0.62 TB | LCG-2.2.0
JINR, Dubna      | 20   | 1.7 TB  | LCG-2.2.0
SINP MSU, Moscow | 24   | 9.3 TB  | LCG-2.1.1

RDIG: management structure
(diagram slide)

RDIG: RU-CIC/ROC structure
(diagram slide)

LCG-2 grid segment in Russia
- User Interfaces at JINR, IHEP, ITEP, SINP and KIAM
- Resource Broker at SINP
- Replica service

SINP MSU LCG-2 cluster
- Total number of CPUs: 30

LHC computing at ITEP
- Main ITEP LHC computing farm: servers and network, 1 Gbit/s Internet
- 64 dual-Pentium IV PC modules (as of 01.01.2004)
- A. Selivanov (ITEP-ALICE) heads the ITEP LHC farm

Regional connectivity
- Moscow: 1 Gbps (ITEP, KI, SINP, ...)
- IHEP: 100 Mbps fiber optic (since September 2004!)
- JINR: 45 Mbps now, 1 Gbps in Q4 2004 - Q1 2005
- INR RAS: 2 Mbps + 2x4 Mbps (microwave)
- BINP: 1 Mbps now, 45 Mbps (2004?)
- PNPI: 2 Mbps

Connectivity with CERN
- International links for Russian science are still provided by RBNet: 4 STM-1 (4x155 Mbps) links Moscow-Stockholm
- Three of these 155 Mbps links operate today: one to StarLight in Chicago, one for commodity Internet, one to GEANT
- RunNet Moscow - St. Petersburg - Helsinki (NorduNet): 2.4 Gbps since June 2004
- From October 2004 the RBNet links will go within the RunNet channel

CIC-RU: supported services
- VO management (SINP MSU)
- Certification Authority (SINP MSU -> RRC KI)
- Website/portal (JINR)
- Basic monitoring services (JINR)
- Grid resource accounting (JINR)
- Middleware validation (KIAM RAS)

CIC-RU: core grid services
- Resource Broker (SINP MSU)
- BDII information index (SINP MSU)
- MyProxy (SINP MSU)
- Replica Location Service (SINP MSU)
- Logging & Bookkeeping

ROC-RU: supported services (ROC: Regional Operations Centre)
- Middleware deployment (IHEP)
- Site certification (SINP MSU -> IHEP)
- User support (ITEP)
- Monitoring and accounting (JINR)
- Resource Center support (IHEP)
- Middleware repository (IHEP)
- Deployment and induction of new users

LCG-2 problems
- Hardware: robustness; HDD: SCSI vs. SATA
- Software: software-hardware compatibility (e.g. a network card requiring a kernel rebuild); installation and updates
- Security: user and job accounting, security incident reports, VO management (VOMS)

LCG-2 problems: application software
- ALICE: a specific problem in the AliEn <-> LCG-2 interaction
- ATLAS: defines more than 1500 environment variables
- CMS: a lot of installation problems
- LHCb: NO PROBLEMS!
- Batch system tuning (PBS): resource utilization and resource monopolization (see the sketch below)
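The monopolization issue is typically addressed with per-queue and per-user limits on the PBS server. A minimal sketch, assuming an OpenPBS/Torque installation; the queue name "lcg" and the limit values are illustrative site policy, not settings quoted in the talk:

```python
# Sketch: curbing single-user monopolization of a PBS queue via qmgr.
# Assumes an OpenPBS/Torque server; queue name and limits are placeholders.
import subprocess

QMGR_COMMANDS = [
    "set queue lcg max_running = 30",                   # total running jobs in the queue
    "set queue lcg max_user_run = 10",                  # running jobs per user
    "set queue lcg resources_max.walltime = 48:00:00",  # wall-clock limit per job
]

for cmd in QMGR_COMMANDS:
    # qmgr must run on the PBS server host with manager privileges
    subprocess.run(["qmgr", "-c", cmd], check=True)
```

Sites of that era also commonly layered a scheduler such as Maui on top of the PBS server to enforce fair-share policies between grid and local users.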
LCG-2 problems: operations
- Personnel: 24x7 service, required levels of skill
- Financial: we have about 10% of the necessary resources
- Specific problems: sharing resources between LCG and local (non-grid) users; access to leased resources; proprietary batch systems

LHCb DC04: Phase 1 completed
- DIRAC alone, then LCG in action: 1.8 million events/day
- LCG paused, then restarted: 3-5 million events/day
- Phase 1 completed with 186 million produced events

LHCb DC04: Phase 1 results
- (chart slide; the Russian share is 4.85%)

gLite: the next generation of grid middleware
- Lightweight (existing) services: easily and quickly deployable
- Interoperability: allow for multiple implementations
- Performance/scalability and resilience/fault tolerance
- Portable: being built on Scientific Linux and Windows
- Co-existence with the deployed infrastructure: reduced requirements on participating sites, flexible service deployment; co-existence with LCG-2 and OSG (US) is essential for the EGEE grid service
- Service-oriented approach: follow WSRF standardization; no mature WSRF implementations exist to date, so start with plain web services
- A special group from Russia takes part in the gLite activity (testing, deployment); we will install gLite middleware at SINP MSU for stress tests of gLite components

Conclusions
- Russia is taking part in the leading grid project in Europe: LCG/EGEE
- All Russian sites provide computing and data-storage resources for the four LHC experiments
- Together, the Russian sites form a distributed Tier2 center with about 70% of the resources of a Tier1 site
- Russian sites provide about 5% of the MC simulation for the LHC

Thank you!
Questions?
More details: http://www.egee-rdig.ru/
