Compute integration update
This update includes information on the LOFAR use case, specific infrastructure requirements, collaborative workspace needs, and the ESAP integration progress. It also covers the parameter space definition and plans for tighter integration in the future.





Presentation Transcript


  1. Compute integration update Yan Grange

  2. LOFAR use case
  • Placed on the wiki: https://wiki.escape2020.de/index.php/Use_cases:LOFAR. Since most use cases touch several work packages, this page is not per se WP2-only, so we put it at the highest level. It can also serve as input to the proposed cross-WP task force.
  • The observation (target + calibrator) is in storage space at SURFsara, but is not visible in Rucio yet (AFAIK); a sketch of how to check this is given below.
  • This observation is the input for the prefactor pipeline (described on the wiki).
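  A minimal sketch of how visibility in Rucio could be checked with the standard Rucio Python client. The scope name "lofar" is an assumption for illustration; the actual ESCAPE data lake scope may differ.

    # Sketch: check whether the LOFAR observation is visible in Rucio.
    # Assumes a configured Rucio client (rucio.cfg plus credentials) and
    # a hypothetical scope name "lofar"; the real ESCAPE scope may differ.
    from rucio.client import Client

    client = Client()
    scope = "lofar"  # assumed scope name, for illustration only

    # List all dataset DIDs in the scope; an empty result means the
    # observation has not been registered in Rucio yet.
    names = list(client.list_dids(scope, filters={"name": "*"},
                                  did_type="dataset"))
    if not names:
        print(f"No datasets visible in scope '{scope}' yet.")
    for name in names:
        print(f"Found dataset: {scope}:{name}")
        # Show on which storage elements (RSEs) replicas live.
        for replica in client.list_replicas([{"scope": scope, "name": name}]):
            print("  replicas at:", sorted(replica.get("rses", {})))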

  3. LOFAR use case
  Questions on required infrastructure prepared (a template for recording the answers is sketched after this list):
  • What type of hardware does the use case run on (how many CPUs, what minimum amount of memory, do you need or use GPUs or other accelerators, ...)?
  • What interaction with the system is necessary (think: batch, Jupyter notebook, command-line shell, ...)? If applicable, please also specify what type of batch system is needed (Slurm, DIRAC, ...).
  • What other resource requirements do you have (also think about HPC, HTC, ...)?
  • How long do the jobs take to run, and/or are there any other time constraints on the jobs?
  • Do you need any dependencies on the system (think of things like containers, OS versions, software, shared/local file systems)?
  • Do you have any requirements on collaborative workspaces or data sharing?
  • What are the typical numbers of files and their typical sizes?
  • How many jobs do you expect to run in parallel?
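  So that answers stay comparable across use cases, they could be recorded in a structured form. A minimal sketch; every field value below is a placeholder for illustration, not an actual LOFAR requirement.

    # Sketch: a structured template mirroring the questionnaire above.
    # All values are placeholders, not actual LOFAR answers.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class UseCaseRequirements:
        name: str
        cpus: int                    # cores per job
        memory_gb: float             # minimum memory per job
        accelerators: Optional[str]  # e.g. "GPU", or None
        interaction: List[str]       # e.g. ["batch", "jupyter", "shell"]
        batch_system: Optional[str]  # e.g. "slurm", "dirac"
        resource_class: str          # "HTC" or "HPC"
        max_runtime_hours: float     # time constraint per job
        dependencies: List[str]      # containers, OS versions, software
        shared_filesystem: bool
        typical_file_count: int
        typical_file_size_gb: float
        parallel_jobs: int

    # Placeholder instance; the real numbers come from the questionnaire.
    prefactor = UseCaseRequirements(
        name="LOFAR prefactor", cpus=16, memory_gb=64.0, accelerators=None,
        interaction=["batch"], batch_system="slurm", resource_class="HTC",
        max_runtime_hours=24.0, dependencies=["container"],
        shared_filesystem=True, typical_file_count=244,
        typical_file_size_gb=16.0, parallel_jobs=50,
    )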

  4. Parameter space definition Created this mostly to steer the discussion between SKA and LOFAR a bit, but the ideas may be relevant to others too, especially when discussing future use cases (the SDC work for SKA may be significantly different from what we do now for LOFAR). Of course: work in progress, feedback welcome. The focus is on access patterns and storage access, and specifically not on the compute itself, since that is out of scope for ESCAPE.

  5. Parameter space definition

  6. ESAP integration Part of the busy month of ESAP (WP5). The goal is to have a demonstrator in September. Currently Zheng Meyer has a version of the interface that links directly to the Rucio web UI (https://escape-dios-dl.cern.ch/ui). (I would love to show you a demo, but it seems it is not committed to the repo, and Zheng is on a well-deserved holiday.) The next step is to connect to the Rucio API from within the ESAP interface; this is our target for the focus month (a sketch of what that could look like is given below). At a later stage we aim for a tighter integration (a container with the client, to be executed from within ESAP).
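  A minimal sketch of what connecting to the Rucio API from within a web service could look like, assuming the server-side Rucio Python client is used; Flask, the route name, and the scope handling are illustrative assumptions, not the actual ESAP design.

    # Sketch: exposing a Rucio query through a small web endpoint, as a
    # stand-in for the planned ESAP-to-Rucio API connection. Flask and
    # the route are assumptions; ESAP's actual stack may differ.
    from flask import Flask, jsonify
    from rucio.client import Client

    app = Flask(__name__)
    rucio = Client()  # reads rucio.cfg for server URL and credentials

    @app.route("/api/rucio/datasets/<scope>")
    def datasets(scope: str):
        """List dataset DIDs in a scope so the UI can render them."""
        names = list(rucio.list_dids(scope, filters={"name": "*"},
                                     did_type="dataset"))
        return jsonify({"scope": scope, "datasets": names})

    if __name__ == "__main__":
        app.run(port=5000)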

  7. ESAP integration CERN has worked on an integration of Rucio in a Jupyter notebook. For now I am the only one able to do anything there, so the only outstanding task is that I try to get it to run (a sketch of the kind of call such an integration wraps is given below). We will have weekly meetings (probably Mondays at 15:00). We have a (public!) Rocket.Chat channel: #esap-rucio. This is primarily a WP5 activity, but if you are interested, feel free to contact me or just join the channel on Rocket.Chat!
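  For context, a minimal sketch of fetching a data lake file from inside a notebook with the plain Rucio Python client; the DID below is a made-up example, and the CERN integration wraps this kind of call in a notebook UI.

    # Sketch: downloading a file inside a Jupyter notebook cell with the
    # plain Rucio client. The DID is hypothetical, for illustration only.
    from rucio.client.downloadclient import DownloadClient

    download_client = DownloadClient()
    download_client.download_dids([{
        "did": "lofar:L123456_SB000.MS",  # hypothetical scope:name
        "base_dir": "./data",             # put files next to the notebook
    }])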
