It is a pleasure to present our third newsletter. We aim for a release cycle of about one month, never exceeding two, balancing being informative with not being too chatty.
Apart from the regular project progress and IT news, there are several sections on policies that will affect how observations are carried out and what will be required to access data in the future. There is also a section on the licensing of data that is no longer under embargo.
Taras Yakobchuk introduces the new tool he is developing for visualizing and analyzing calibrated GRIS/GREGOR data. The tool is intended not only to help experts analyze data offered by SDC, but also to make the data accessible to users who are not experienced with this type of data.
We would like to encourage you to comment openly on any part. Feedback is always welcome and helps us deliver a better product.
📰 Editorial
SDC Project Status 03-2021
Solution Analysis and Design Phase steps.
The SDC project team has been working hard to find the best possible hardware and software components for a robust platform for the solar community. The project has now entered the detailed solution design phase: many of the technical pieces that will go into the final version of SDC have already been identified, and the team is now working out how best to integrate them. This involves extensive investigation of technical details and testing of different scenarios.
📋 Summary
Current project health | Current project status | Project constraints
---|---|---
GREEN | "Create Detailed Solution Design" phase in progress. | Resources and their availability. Technology POCs taking more time than predicted.
📊 Project status
Accomplishments
High-level solution design
Started collecting data policies
Clarified embargo policies
Listed essential use cases for SDC
Next steps
Continue selecting solution components and planning component integrations
Risks & project issues
Lack of resources
Resource availability
Multiple process implementations at the same time
👩⚖️ Policies, Frameworks & Governance
🛠 SDC Products & Tools
Standardized GRIS Pipeline
The GRIS reduction pipeline was merged into a common version in collaboration with M. Collados. The versions running at OT and in Freiburg now both produce data compatible with downstream SDC tools. The latest version of the pipeline can always be found on the KIS GitLab server. The current OT version will be synced to the ulises branch and merged into the main production branch periodically.
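A minimal sketch of such a periodic sync, demonstrated on a throwaway local repository (the branch name ulises comes from the text above; the production branch name "production", the commit contents, and the file name are assumptions for illustration, and the real repository lives on the KIS GitLab server):

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q -b production .                     # stand-in for the real pipeline repo
git -c user.email=ot@example -c user.name=OT \
    commit -q --allow-empty -m "initial"
git checkout -q -b ulises                       # branch tracking the current OT version
echo "OT pipeline update" > gris.pro            # hypothetical change from OT
git add gris.pro
git -c user.email=ot@example -c user.name=OT \
    commit -q -m "sync OT version"
git checkout -q production                      # main production branch (name assumed)
git merge -q ulises                             # the periodic merge of OT changes
git log --oneline | head -1
```

Because ulises is branched from the production branch here, the merge fast-forwards; in the real repository, conflicts between the OT and Freiburg versions would be resolved at this step.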
SDC data archive
Get access to data from GRIS/GREGOR and LARS/VTT instruments and the ChroTel full-disc telescope at OT.
Updates as of July 2021
The detail pages for observations have been reworked; see an example here:
Added dynamic carousel of preview data products
Added flexible selection for downloading associated data
VFISV inversion results have been added for most GRIS observations. The website now includes information on line-of-sight velocity and magnetic field strength.
The development process has been streamlined:
automated test deployments for quicker iterations and fixes
Changes to the UI will occur in regular sprints. We’re currently collecting ideas here
Added historic ChroTel data for 2013. Thanks to Andrea Diercke from AIP for contacting us and providing this supplemental archive.
📊 Conferences & Workshops
Forthcoming Conferences/Workshops of Interest 2021
Every second Thursday, 12:30-13:30 CET
PUNCH Lunch Seminar (see SDC calendar invitation for zoom links)
11 Feb 2021: PUNCH4NFDI and ESCAPE - towards data lakes
25 Feb 2021: PUNCH Curriculum Workshop
Week of April 12-16 (3 days, TBD)
ESCAPE WP4 Technology Forum
June 01-02 (16:00 - 17:30)
15th International dCache Workshop
June 10-11
13th International Workshop on Science Gateways | IWSG 2021
Topics:
Architectures, frameworks and technologies for science gateways
Science gateways sustaining productive collaborative communities
Support for scalability and data-driven methods in science gateways
Improving the reproducibility of science in science gateways
Science gateway usability, portals, workflows and tools
Software engineering approaches for scientific work
Aspects of science gateways, such as security and stability
June 28, 2021:
Data-intensive radio astronomy: bringing astrophysics to the exabyte era
Topics:
Data-intensive radio astronomy, current facilities and challenges
Data science and the exascale era: technical solutions within astronomy
Data science and the exascale era: applications and challenges outside astronomy
SDC participation in Conferences & Workshops
Nov. 26, 2020:
2nd SOLARNET Forum Meeting for Telescopes and Databases
Talk: "Big Data Storage -- The KIS SDC case", 2nd SOLARNET Forum (Nov 26)
Nazaret Bello Gonzalez, Petri Kehusmaa, Peter Caligari
🤲 SDC Collaborations
SOLARNET https://solarnet-project.eu
KIS coordinates the SOLARNET H2020 project, which brings together European solar research institutions and companies to provide access to the large European solar observatories, supercomputing power and data. KIS SDC actively participates in WP5 and WP2, coordinating and developing data curation and archiving tools in collaboration with European colleagues.
Contact on KIS SDC activities in SOLARNET: Nazaret Bello Gonzalez nbello@leibniz-kis.de
ESCAPE https://projectescape.eu/
KIS is a member of the European Science Cluster of Astronomy & Particle Physics ESFRI Research Infrastructures (ESCAPE H2020, 2019-2022) project, which aims to bring together people and services to build the European Open Science Cloud. KIS SDC participates in WP4 and WP5 to bring ground-based solar data into the broader astronomical VO and to develop tools for handling large solar data sets.
Contact on KIS SDC activities in ESCAPE: Nazaret Bello Gonzalez nbello@leibniz-kis.de
KIS is one of the European institutes strongly supporting the European Solar Telescope project. KIS SDC represents the EST data centre development activities in a number of international projects like ESCAPE and the Group of European Data Experts (GEDE-RDA).
Contact on KIS SDC as EST data centre representative: Nazaret Bello Gonzalez nbello@leibniz-kis.de
PUNCH4NFDI https://www.punch4nfdi.de
KIS is a participant (not a member) of the PUNCH4NFDI Consortium. PUNCH4NFDI is the NFDI (National Research Data Infrastructure) consortium of particle, astro-, astroparticle, hadron and nuclear physics, representing about 9,000 scientists with a Ph.D. in Germany, from universities, the Max Planck Society, the Leibniz Association, and the Helmholtz Association. PUNCH4NFDI aims to set up a federated and "FAIR" science data platform, offering the infrastructures and interfaces necessary for access to and use of the data and computing resources of the involved communities and beyond. PUNCH4NFDI is currently competing with other consortia for DFG funding (final response expected in spring 2021). KIS SDC aims to become a full member of PUNCH and to federate our efforts on ground-based solar data dissemination to the broad particle and astroparticle communities.
Contact on KIS SDC as PUNCH4NFDI participant: Nazaret Bello Gonzalez nbello@leibniz-kis.de & Peter Caligari cale@leibniz-kis.de
🖥 IT news
Ongoing & Future developments
Webpage
KIS The design of the new website is essentially complete. Final technical adjustments are currently being made to Typo3. The website is already running on the final (VMware) server at KIS and is available at the web address:
After the content has been moved, the server will be renamed to http://www.leibniz-kis.de , and the old site will be shut down.
One of the reasons for the relaunch was to present our content in a form better adapted to the particular browsers used by people with disabilities. This requires specific fields to be filled in in the back-end so that the page content can be classified appropriately. We will hold an editor training on 13 and 14 July 2021 covering these points and the general handling of the back-end.
Network
Status of the dedicated 10 Gbit line between KIS & OT
KIS OT The missing network equipment for the end at KIS will be installed in the second week of July. We will then try to establish the link remotely from Freiburg with the help of personnel at the telescopes.
Test of (application) firewalls at KIS
KIS OT Firewall testing at KIS has been completed. We chose between three
Storage
KIS SDC We are currently in the process of ordering the first storage node for SDC. The node is a DELL R740XD2 (2 CPUs, 24 x 16 TB disks, 10 Gbit Ethernet). The price (including VAT) is of the order of 160 €/TB of usable storage.
We will use this machine as a test-bed for the technology envisioned for SDC; all raw data from 2021 observations, as well as the large simulation files accumulating on mars, are already being stored there.
SDC will consist of at least four similar nodes, of which this is the first. As soon as the remaining hosts are set up, we will move any data still on this first host to the new SDC cluster and then join the host to the cluster as well.
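As a back-of-the-envelope check, the node economics can be sketched as follows. Only the disk count (24 x 16 TB) and the ~160 €/TB figure come from this newsletter; the redundancy layout and parity overhead are assumptions for illustration:

```python
# Rough cost sketch for the DELL R740XD2 storage node.
# Known from this newsletter: 24 disks x 16 TB, ~160 EUR per usable TB (incl. VAT).
# The parity overhead below is an ASSUMPTION (e.g. two 12-disk groups with
# double parity each), not the actual SDC configuration.

RAW_DISKS = 24
DISK_TB = 16
PARITY_DISKS = 4           # assumed redundancy overhead
EUR_PER_USABLE_TB = 160    # order-of-magnitude figure from the newsletter

raw_tb = RAW_DISKS * DISK_TB                        # 384 TB raw capacity
usable_tb = (RAW_DISKS - PARITY_DISKS) * DISK_TB    # 320 TB under the assumed layout
approx_node_price = usable_tb * EUR_PER_USABLE_TB   # ~51,200 EUR incl. VAT

print(raw_tb, usable_tb, approx_node_price)
```

Varying the assumed parity overhead shifts the usable capacity, but the order of magnitude of the node price stays in the same range.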
SDC In parallel, we are looking into outsourcing rarely accessed files to the public cloud. Within the framework of SDC, the cloud is planned mainly to flexibly cover short-term peaks in demand.
The costs per TB of storage space in the cloud are strongly dependent on capacity and, above all, the access pattern. They vary between approx. 60-200 €/TB/a. Access-independent models, in which only a fixed fee is charged per stored GB, but no fees for downloading or uploading, are at the upper end of this scale. At the lower end are public providers such as Amazon, Google and Microsoft, which charge a relatively high fee for each type of data access in addition to the (relatively cheap) price of simple storage.
In addition, licence fees of a similar magnitude are required for the software that moves files between the cloud and the local storage at KIS.
We are currently obtaining concrete offers to outsource 100 TB for 1 year to a public cloud. The pricing models are so complicated that we can determine the resulting costs only through a limited real-world test.
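The difference between the two pricing models can be illustrated with a toy calculation. All rates below are invented for illustration (they merely fall inside the 60-200 €/TB/a range quoted above) and are not quotes from any provider:

```python
# Toy comparison of the two cloud pricing models described above.
# Rates are ILLUSTRATIVE ASSUMPTIONS, not real offers.

def flat_fee_cost(stored_tb, eur_per_tb_year=200):
    """Access-independent model: fixed fee per stored TB, no egress charges."""
    return stored_tb * eur_per_tb_year

def access_based_cost(stored_tb, downloaded_tb,
                      eur_per_tb_year=60, egress_eur_per_tb=50):
    """Cheap base storage, but every download (egress) is billed separately."""
    return stored_tb * eur_per_tb_year + downloaded_tb * egress_eur_per_tb

stored = 100  # TB, the size of the planned real-world test

print(flat_fee_cost(stored))          # 20000 EUR/a, regardless of access
print(access_based_cost(stored, 0))   # 6000 EUR/a if the data is never read
print(access_based_cost(stored, 280)) # 20000 EUR/a: break-even at 280 TB egress
```

Under these assumed rates, the access-based model is cheaper as long as less than 2.8x the stored volume is downloaded per year, which is why the real access pattern dominates the cost and can only be pinned down by the planned test.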
We will intentionally design the integration so that it is apparent to all users which files are in the cloud and which are not. Although this is cumbersome (and artificially imposed), we consider this awareness essential, at least initially, while we have no experience of the potential costs involved. The exact model is still to be worked out, and we will report on it in due course.
OT The two new nodes for jane have arrived at OT. The installation will be carried out as soon as either Peter Caligari can travel there or we can get a DELL technician up to the telescopes. Due to Covid-19, the time scale for this installation remains unclear. We will keep you informed.
Current Resources
Compute nodes
hostname | # of CPUs & total cores | RAM [GB] |
---|---|---|
patty KIS | 2 x AMD EPYC 7742, 128 cores | 1024 |
itchy & selma KIS | 4 x Xeon(R) CPU E5-4657L v2 @ 2.40GHz, 48 cores | 512 |
scratchy KIS hathi OT | 4 x Intel(R) Xeon(R) CPU E5-4650L @ 2.60GHz, 32 cores | 512 |
Central storage space
Total available disk space for /home (KIS OT), /dat (KIS OT), /archive (KIS), /instruments (OT)
name | total [TB, gross] | free [TB, gross] |
---|---|---|
mars KIS | 758 | 39 |
quake KIS/SEISMO | 61 | 0 |
halo KIS/SEISMO | 145 | 44.5 |
jane OT | 130 (-> 198) | 23 |
📎 References
Products & Tools
SDC data archive: https://sdc.leibniz-kis.de/
Speckle reconstruction: https://gitlab.leibniz-kis.de/sdc/speckle-cookbook
Forthcoming Conferences/Workshops
June 01-02 (16:00 - 17:30): 15th International dCache Workshop
June 10-11, 2021: 13th International Workshop on Science Gateways | IWSG 2021
June 28, 2021: Data-intensive radio astronomy: bringing astrophysics to the exabyte era
Collaborations
SOLARNET: https://solarnet-project.eu
ESCAPE: https://projectescape.eu/
PUNCH4NFDI: https://www.punch4nfdi.de
Quick links
Computer load: http://ganglia.leibniz-kis.de
Drafts of web-page relaunch: https://wolke7.leibniz-kis.de/s/wEkPRsA5xKRgYbB