
Welcome to our third newsletter. Regretfully, we are approaching a two-month release schedule rather than the initially envisioned one-month plan, as much work is going on in parallel.

Hopefully, you will still find this newsletter helpful and informative. The tools section has been updated, and the IT news is up to date.

As always, we encourage you to comment openly on any subject or to raise new topics that you think we do not pay enough attention to in the context of SDC. Feedback is always welcome and helps us deliver a better product.




📰 Editorial

🔒 What goes here?  

Peter Caligari


    Project Status


    SDC Project Status 03-2021 (06.07.2021)


    Petri Kehusmaa

    [Image: Solution Development and Integration Phases]

    Solution Development and Integration

    The project has now shifted into a phase where we are building the actual SDC platform and creating or acquiring all necessary components. These components include in-house developed software for instrument pipelines and analysis; compute, network, and storage hardware; middleware (RUCIO, Kubernetes, Docker, etc.); and governance/management/documentation software such as Jira Service Management and Confluence.

    There is still some work to be done to find all suitable solution components and thus shape the final scope of SDC. We aim to build SDC as a service platform for the solar community with a continuous focus on users and platform development.

    📋 Summary

    Current project health: YELLOW

    Current project status: Finalizing some tasks for solution design and creating solution components.

    Project constraints:

    • Governance model not finalized and implementation not started yet.

    • Resources and their availability.

    • Technology POCs taking more time than predicted.

     📊 Project status


    Accomplishments

    • High-level solution design

    • Some software components created (GRIS Viewer)

    • The hardware acquisition process started

    • RUCIO test environment established

     

    Next steps

    • Continue selecting and creating solution components

     


    Risks & project issues

    • Lack of resources

    • Resource availability

    • Multiple process implementations at the same time

    • No agreed governance model



    👩‍⚖️ Policies, Frameworks & Governance


    • The ITIL v4 process model will be partially adopted for service management purposes

    • Data policies definition started

    • SDC governance model and scope to be decided

    📜Aperio

    Traditionally, data processing in solar physics is done on files. While this option will prevail in SDC, it is not the best way to deal with large data sets, where computations need to be done where the data resides and not vice versa. To interface with data in SDC programmatically, APIs are needed for the most common programming languages like Python and IDL.

    We are pleased to have won Aperio Software to develop a Python API for SDC. Aperio Software is heavily involved in the development of SunPy and Astropy, community efforts to develop Python packages for solar physics and astronomy. Drew Leonard, one of the founders of Aperio, developed the prototype for the VTF pipeline. The contract is divided into a design phase and an implementation phase. During the former, Drew will clarify what is expected from the future API and what requirements it must meet through workshops and one-on-one meetings. Expect the first version by mid-2022.



    🛠 SDC Products & Tools


    Standardized GRIS Pipeline

    Carl Schaffer

    The GRIS reduction pipeline was merged into a common version in collaboration with M. Collados (IAC, GRIS PI). The versions running at OT and in Freiburg now both produce data that is compatible with downstream SDC tools. The latest version of the pipeline can always be found on the KIS GitLab server. The current OT version will be synced to the ulises branch and merged into the main production branch periodically.

    SDC GRIS VFISV-Inversion pipeline

    Vigeesh Gangadharan

    A pipeline code for performing Milne-Eddington inversions of GRIS spectropolarimetric data is now available at:

    https://gitlab.leibniz-kis.de/sdc/grisinv

    The pipeline uses the Very Fast Inversion of the Stokes Vector code (VFISV, Borrero et al. 2011), v5.0 (mode for spectrograph data), as the main backend to carry out a Milne-Eddington Stokes inversion for individual spectral lines.
    The current implementation of the pipeline is a Python MPI wrapper around the VFISV code that makes it easy to work with GRIS data. The inversion for the desired spectral line is performed by VFISV, and the buffer with the inversion results is communicated to the Python module. The Python module propagates the keywords from level 1 (L1), packages the inversion results, and outputs a FITS file (when used as a command-line interface) or returns an NDarray (when called within a Python script).
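    As a rough illustration of the keyword-propagation and packaging step described above (the function name and the choice of keywords are hypothetical stand-ins, not the pipeline's actual API):

```python
# Illustrative sketch only: plain dicts stand in for FITS headers, and the
# names below are hypothetical -- the real pipeline uses astropy/MPI machinery.

def propagate_l1_keywords(l1_header, inversion_results):
    """Carry selected level-1 metadata into the level-2 inversion product."""
    keep = ("DATE-OBS", "TELESCOP", "INSTRUME", "WAVELNTH")  # assumed subset
    l2_header = {k: l1_header[k] for k in keep if k in l1_header}
    l2_header["LEVEL"] = "L2"  # mark the product as a level-2 result
    l2_header["HISTORY"] = "VFISV Milne-Eddington inversion"
    return {"header": l2_header, "data": inversion_results}

# A toy L1 header and a toy 2x2 inversion result buffer:
l1 = {"DATE-OBS": "2021-07-06", "INSTRUME": "GRIS", "EXPTIME": 30.0}
product = propagate_l1_keywords(l1, [[0.1, 0.2], [0.3, 0.4]])
```

    In the real pipeline, such a product is written out as a FITS file via the command-line interface or returned as an NDarray inside a Python script.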

    For more information on installing and using the pipeline, check the above GitLab repository.

    Please report any issues with the code using the link below:

    https://gitlab.leibniz-kis.de/sdc/grisinv/-/issues/new?issue

    SDC data archive

    https://sdc.leibniz-kis.de/

    Get access to data from GRIS/GREGOR and LARS/VTT instruments and the ChroTel full-disc telescope at OT.

    Updates as of July 2021

    • The detail pages for observations have been reworked; see an example here:

      • Added dynamic carousel of preview data products

      • Added flexible selection for downloading associated data

    • VFISV inversion results have been added for most of the GRIS observations. The website now includes information on line-of-sight velocity and magnetic field strength.

    • The development process has been streamlined:

      • Automated test deployments for quicker iterations and fixes

      • Changes to the UI will occur in regular sprints. We’re currently collecting ideas here

    • Added historic ChroTel data for 2013; thanks to Andrea Diercke from AIP for contacting us and providing this supplemental archive.

    GRISView

    Taras Yakobchuk

    GRISView is a new visualization and analysis tool for working with GRIS/GREGOR calibrated datasets as distributed by the SDC website. It is written in Python, with a GUI built on the cross-platform Qt framework.

    Currently implemented features include:

    • Quick panning and zooming of map images and spectra with the mouse

    • Multiple POIs (points of interest) and ROIs (rectangles of interest) for easy inspection of spectral changes across the map

    • Distance measurement between multiple map points, given in different units

    • Intensity profile plots along a given line segment, with linking of several profiles for checking radial profiles

    • Interactive color bars to view the histogram, adjust image contrast, and select and modify the viewing color scheme

    • Generating contours for map images, with easy adjustment of levels and colors

    • Browsing spectra by moving the cursor with keyboard and mouse shortcuts, and quick navigation using the marker list

    • A relative scale for quick evaluation of wavelength differences at the cursor position

    • Viewing observation FITS file headers

    • Support for both individual observations and time series

    Next, we plan to add the following:

    • Exporting current spectra and map plots as images and data files

    • Visualization of derived quantities, e.g. Q/I, V/I, DOLP (degree of linear polarization)

    • Various normalizations of spectra, e.g. to a selected signal level, the local continuum, or the quiet Sun

    • Spectral line fitting and determination of line parameters

    • Saving and restoring working sessions
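    As background for the derived quantities listed above, the degree of linear polarization follows from the Stokes parameters as sqrt(Q^2 + U^2)/I; a minimal sketch:

```python
import math

def dolp(stokes_i, stokes_q, stokes_u):
    """Degree of linear polarization from the Stokes parameters I, Q, U."""
    if stokes_i == 0:
        raise ValueError("Stokes I must be non-zero")
    return math.sqrt(stokes_q ** 2 + stokes_u ** 2) / stokes_i

# e.g. dolp(1.0, 0.3, 0.4) -> 0.5
```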


    Feedback welcome

    We strongly encourage all colleagues to try out this new tool and provide feedback. Instructions for installing and using the program can be found on the tool's GitLab page:

    https://gitlab.leibniz-kis.de/sdc/gris/grisview

    Please report any issues and bugs on the program's GitLab page or using the direct link:

    https://gitlab.leibniz-kis.de/sdc/gris/grisview/-/issues/new?issue


    📊 Conferences & Workshops

    Nazaret Bello Gonzalez

    Forthcoming Conferences/Workshops of Interest 2021

    Every second Thursday, 12:30-13:30 CET (currently on summer break)

    PUNCH Lunch Seminar (see SDC calendar invitation for Zoom links)

    • 11 Feb 2021: PUNCH4NFDI and ESCAPE - towards data lakes

    • 25 Feb 2021: PUNCH Curriculum Workshop

    KIS internal Typo3 Editors' training

    July 13 & 14, 2021, 10:00-12:00 CEST, registration needed!

    SDC participation in Conferences & Workshops

    Nov. 26, 2020:

    2nd SOLARNET Forum Meeting for Telescopes and Databases

    Talk: Big Data Storage -- The KIS SDC case, Nazaret Bello Gonzalez, Peter Caligari & Petri Kehusmaa, 2nd SOLARNET Forum (Nov 26)




    🤲 SDC Collaborations


     Nazaret Bello Gonzalez

    SOLARNET https://solarnet-project.eu

    KIS coordinates the SOLARNET H2020 project, which brings together European solar research institutions and companies to provide access to the large European solar observatories, supercomputing power, and data. KIS SDC actively participates in WP5 and WP2, coordinating and developing data curation and archiving tools in collaboration with European colleagues.
    Contact on KIS SDC activities in SOLARNET: Nazaret Bello Gonzalez nbello@leibniz-kis.de

     ESCAPE https://projectescape.eu/

    KIS is a member of the European Science Cluster of Astronomy & Particle Physics ESFRI Research Infrastructures (ESCAPE H2020, 2019-2022) project, which aims to bring together people and services to build the European Open Science Cloud. KIS SDC participates in WP4 and WP5 to bring ground-based solar data into the broader astronomical VO and to develop tools for handling large solar data sets.

    Contact on KIS SDC activities in ESCAPE: Nazaret Bello Gonzalez nbello@leibniz-kis.de

     

    EST https://www.est-east.eu/

    KIS is one of the European institutes strongly supporting the European Solar Telescope project. KIS SDC represents the EST data centre development activities in a number of international projects like ESCAPE and the Group of European Data Experts (GEDE-RDA).

    Contact on KIS SDC as EST data centre representative: Nazaret Bello Gonzalez nbello@leibniz-kis.de

     

    PUNCH4NFDI https://www.punch4nfdi.de

    KIS is a participant (not a member) of the PUNCH4NFDI Consortium. PUNCH4NFDI is the NFDI (National Research Data Infrastructure) consortium of particle, astro-, astroparticle, hadron, and nuclear physics, representing about 9,000 scientists with a Ph.D. in Germany, from universities, the Max Planck Society, the Leibniz Association, and the Helmholtz Association. PUNCH4NFDI is setting up a federated and "FAIR" science data platform, offering the infrastructures and interfaces necessary for access to and use of the data and computing resources of the involved communities and beyond. PUNCH4NFDI has been granted funds and will officially start its activities on October 1, 2021. KIS SDC aims to become a full member of PUNCH and to federate our efforts on ground-based solar data dissemination to the broad particle and astroparticle communities.

    Contact on KIS SDC as PUNCH4NFDI participant: Nazaret Bello Gonzalez nbello@leibniz-kis.de & Peter Caligari cale@leibniz-kis.de



    🖥 IT news


    Peter Caligari

    Ongoing & Future developments

    Webpage

    [KIS]
    The design of the new website is essentially complete. We are currently making some final technical adjustments to the webserver and Typo3. The website is already running on the deployment (VMware) server at KIS and is publicly available at:

    https://newwww.leibniz-kis.de

    After the content has been moved, the server will be renamed to www.leibniz-kis.de, and the old site will be shut down.

    One of the reasons for the relaunch was to improve support for the particular browsers used by people with disabilities. This requires specific fields in the back-end to be filled in so that the page content can be classified appropriately. We will hold a training course on handling the Typo3 back-end in general, focusing on the above points, on

    July 13 & 14, 2021, 10:00 CEST (Editors' training)

    We currently plan to avoid any user login on the front end. This would allow us to dispense with cookies entirely, rendering the annoying GDPR popups obsolete. However, it also means we might not have any restricted areas on the website at all (including an Intranet)! This is a radical approach, and we might not be able to follow through with it stringently (see below). In that case, the Intranet on the website will be limited to purely informational pages; any documents now downloadable on the old website should be migrated to the cloud (wolke7). In any case, Typo3 allows hosting multiple websites under a single installation, sharing the basic design and resources. Therefore, any websites requiring user registration and login (like the Intranet or a possible OT webpage) might be built as separate websites, keeping the publicly accessible website login-free.

    Network

    Status of the dedicated 10 Gbit line between KIS & OT

    [KIS] [OT]
    The missing network equipment for the KIS end will be installed in the second week of July. We will then try to establish the link remotely from Freiburg with the help of personnel at the telescopes.

    Test of (application) firewalls at KIS

    [KIS] [OT]
    Firewall testing at KIS (see https://leibniz-kis.atlassian.net/l/c/rF8kmXjv ) has concluded. Two manufacturers are still being considered, and a final choice will be made as soon as possible.

    We (IT) still strongly advocate high-availability setups for both KIS and OT (managed in Freiburg): for KIS because it will host a significant part of SDC, and for OT because there is no trained personnel on-site and replacement shipments to the Canary Islands take time.

    Storage

    [KIS] [SDC]
    We are currently setting up one Dell R740XD2 as a (simulated) dCache cluster running two redundant dCache pools on VMware. This host serves as a testbed for simulating hardware and network failures in the future dCache cluster while providing a (hopefully) failure-tolerant net capacity of about 100 TB to KIS, alleviating the currently pressing storage shortage.

    Starting in July, six more comparable hosts will be purchased through a public tender. These will have a similar setup and will form the Tier 1 (near-line) storage of SDC at KIS. We expect the hosts to arrive in late September.

    We use ZFS on virtualized Debian servers as the basis for the individual dCache nodes. ZFS uses copy-on-write, checksums all blocks on disk, and provides auto-healing. Zpools will most probably use RAIDZ or RAIDZ2, and every file will reside on at least two different servers. At the time of this writing, the only other file system offering similar features is Btrfs, but support for Btrfs was recently pulled from some major distributions (e.g. CentOS, the distribution mainly used at KIS so far).
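    The kind of pool layout described above might look as follows (pool, dataset, and device names are placeholders; the final layout is still to be decided):

```shell
# Hypothetical RAIDZ2 pool over six disks; all names are placeholders.
zpool create tank raidz2 /dev/sda /dev/sdb /dev/sdc /dev/sdd /dev/sde /dev/sdf

# Block checksums are on by default; a periodic scrub verifies them and
# lets ZFS self-heal damaged blocks from the pool's redundancy.
zfs create tank/dcache-pool1      # dataset backing one dCache pool
zpool scrub tank
```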

    [SDC]
    The 100 TB of space on Microsoft Azure for cold data still needs some configuration. As of today, the third-party software responsible for moving files between our Isilon storage cluster (mars) and the cloud has problems doing so satisfactorily from Linux clients. The manufacturer of the software is working on the issue.

    Current Resources

    Compute nodes

    hostname              location      # of CPUs & total cores                           RAM [GB]
    patty, legs & louie   KIS           2 x AMD EPYC 7742, 128 cores                      1024
    itchy & selma         KIS           4 x Intel Xeon E5-4657L v2 @ 2.40 GHz, 48 cores   512
    scratchy              KIS
    quake & halo          KIS/seismo
    hathi                 OT            4 x Intel Xeon E5-4650L @ 2.60 GHz, 32 cores      512

    Central storage space

    Total available disk space for /home (KIS, OT), /dat (KIS, OT), /archive (KIS), /instruments (OT)

    name    location     total [TB, gross]   free [TB, gross]
    mars    KIS          758                 39
    quake   KIS/seismo   61                  0
    halo    KIS/seismo   145                 44.5
    jane    OT           130 (-> 198)        23

