Einstein Telescope Observatory Workshop Summary


The summary covers the computing and data requirements for the Einstein Telescope Gravitational Wave Observatory, including discussions of computing needs, scientific cases, hardware evolution, and person-power requirements. It outlines the baseline data flow, the requirements placed on analysis pipelines, and the need for modular, jointly developed code for searching the data and estimating the parameters of trigger signal waveforms.

  • Einstein Telescope
  • Gravitational Waves
  • Data Requirements
  • Computing Evolution
  • Workshop Summary




Presentation Transcript


  1. Preparatory Phase for the Einstein Telescope Gravitational Wave Observatory. Deliverable 8.1: Computing and Data Requirements Workshop Summary

  2. Workshop
     High-density content; GW community.
     • Introduction
     • Baseline: IGWN (Virgo)
     • ISB: instrument computing requirements
     • OSB: data analysis requirements
     • Discussion:
       - Pre-merger alert (premature)
       - Computing needs
       - Scientific cases: continuous waves, CBC
       - Hardware evolution: from CPUs to GPUs, FPGAs
       - Computing evolution: from HTC to HPC
       - Person-power needs

  3. Baseline - Data flow
     • CBC is the dominant ET science case (~1M sources per year); 60% is production.
     • There will be one pipeline for each science group, with multiple sub-pipelines.
     • Population studies and CW searches can use computing power when nothing else is running.
     • Extrapolation: the current computing needs of the entire GW network are roughly
       O(10%) of an LHC experiment of today. Total: < 100 × O5+.
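
     A back-of-envelope sketch of that extrapolation, in Python. The only inputs
     taken from the slide are the two ratios (today's GW network at roughly 10% of
     an LHC experiment, and an ET total below 100 × O5+); treating the O5+ scale as
     comparable to today's GW-network scale is an added assumption, and the
     absolute unit is arbitrary.

       # Back-of-envelope extrapolation; absolute numbers are arbitrary units.
       lhc_experiment_today = 1.0                      # one "LHC experiment of today"
       gw_network_today = 0.10 * lhc_experiment_today  # slide: roughly O(10%) of an LHC experiment
       o5_plus = gw_network_today                      # assumption: O5+ ~ today's GW-network scale
       et_upper_bound = 100 * o5_plus                  # slide: total < 100 x O5+
       print(f"ET total: < ~{et_upper_bound / lhc_experiment_today:.0f}x "
             f"an LHC experiment of today, under these assumptions")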

  4. Baseline - Data flow
     • Raw data → noise subtraction and calibration → data h(t)
     • Numerical relativity → bank generation → templates (500k)
     • h(t) + templates → SEARCH → ET triggers (signal waveforms) → parameter estimation

  5. Baseline - Data flow: requirements
     • Pipeline flexibility and modular code
     • Collaboration on algorithms and joint development
     • Common catalog
     The slide-4 flow gains annotations: bank generation adds eccentricity
     (ET = multiply by 10?), the SEARCH adds non-GR templates to test general
     relativity (ET will need a much better method than now), and parameter
     estimation feeds posteriors and population studies.
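
     To make the modularity requirement concrete, here is a minimal sketch, in
     plain numpy, of the slides' flow wired as independent, swappable stages.
     Every function name and body is a hypothetical placeholder (the real IGWN
     pipelines and their 500k-template banks are far more elaborate); the point
     is only the modular structure the slide asks for.

       import numpy as np

       def noise_subtraction_and_calibration(raw):
           # Placeholder calibration: remove the mean; real pipelines do far more.
           return raw - raw.mean()

       def generate_template_bank(n_templates, n_samples):
           # Placeholder "bank generation": crude chirping sinusoids standing in
           # for numerical-relativity-informed templates (slide quotes 500k).
           t = np.linspace(0.0, 1.0, n_samples)
           f0 = np.linspace(30.0, 300.0, n_templates)[:, None]  # hypothetical start frequencies
           return np.sin(2.0 * np.pi * f0 * t * (1.0 + t))      # frequency rises with time

       def search(h, bank, threshold):
           # Placeholder SEARCH: normalized correlation of h(t) against each template.
           snr = bank @ h / (np.linalg.norm(bank, axis=1) * np.linalg.norm(h))
           return np.flatnonzero(np.abs(snr) > threshold)

       def parameter_estimation(h, bank, triggers):
           # Placeholder PE: report the raw overlap for each triggered template.
           return {int(i): float(bank[i] @ h) for i in triggers}

       # Wire the stages together, mirroring the arrows on slides 4-5.
       rng = np.random.default_rng(0)
       raw = rng.normal(size=4096)
       h = noise_subtraction_and_calibration(raw)
       bank = generate_template_bank(n_templates=500, n_samples=4096)
       triggers = search(h, bank, threshold=0.05)
       posteriors = parameter_estimation(h, bank, triggers)
       print(f"{len(triggers)} trigger(s) above threshold")

     Because every stage exposes a plain-array interface, a sub-pipeline that
     adds eccentric or non-GR templates would only need to swap out
     generate_template_bank or search, which is the kind of joint, modular
     development the slide calls for.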

  6. Requirements Document
     Short term: the reason why we are here.
     Document breakdown structure: summary tables collecting first estimates on:
     1. Online (DAQ, environmental monitoring, online data preparation)
     2. Low-latency alert infrastructure and pipelines
     3. Offline computing (data management, offline analysis)
     Collection method: a requirements table with the fields Name, Description,
     Motivation, Required value, Preferred value, Risk assessment, and PBS ID.
     Deadline for distributing internally: Dec 2023.
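
     As one possible concrete form of the collection method, here is a minimal
     sketch of a requirements-table row as a Python dataclass. The field names
     follow the slide; the class name, the types, and the example values are
     invented for illustration.

       from dataclasses import dataclass

       @dataclass
       class Requirement:
           name: str
           description: str
           motivation: str
           required_value: str
           preferred_value: str
           risk_assessment: str
           pbs_id: str  # identifier in the project breakdown structure (assumption)

       # Example entry; all values are hypothetical.
       example = Requirement(
           name="Low-latency alert latency",
           description="Maximum time from data acquisition to alert distribution",
           motivation="Enable electromagnetic follow-up of CBC events",
           required_value="< 60 s",
           preferred_value="< 10 s",
           risk_assessment="Medium",
           pbs_id="ET-OSB-LL-001",
       )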
