This Software Repository includes the full list of software being developed within Task 3.4, including versions released in Deliverable 3.4 (2016 Release) and Deliverable 3.15 (2018 Release). Users can filter the software by release and by developer using the right-hand column.
The D-ANA task is still in its development phase and is expected to be completed in April 2019. In light of this, the following should be noted:
The software described in this contribution is developed as part of subtask 1 of the D-ANA task within the Obelics work package. However, since the software is made available as a Docker container and an instance is run as a JupyterHub web service, it is also listed in this document.
CORELib (Cosmic Ray Event Library) is a collection of simulated events of cosmic ray showers. The production is currently based on CORSIKA and features a common set of physical parameters in order to achieve a general-purpose, high-statistics production. Cosmic rays are a source of background for many astroparticle and astronomy experiments, but at the same time they provide a useful benchmarking tool for assessing detector performance.
The INAF CTA Authentication and Authorization Infrastructure protects CTA resources and digital assets by means of role-based authorization, combining federated authentication (based on the eduGAIN inter-federation) with centralized authorization managed at consortium level. The infrastructure also provides an environment for enforcing accountability, allowing logs of relevant events to be maintained and audited.
The INAF CTA science gateway aims at providing a web instrument for high-energy astrophysics. It leverages open-source technologies to give web access to a set of tools and software widely used by the CTA community. An extended (though not exhaustive) list of tools provided through this technology includes the XANADU software package, GammaLib & ctools, the Fermi Science Tools, Aladin and IRAF. The gateway is based on the Liferay platform and provides a Workflow Management System (WMS) with a customizable graphical web user interface and a web-desktop environment.
The LOFAR Default Pre-Processing Pipeline (DPPP) software reads and writes radio-interferometric data in the form of Measurement Sets. It iterates over visibilities in time order and provides standard operations such as averaging, phase shifting and flagging of bad stations. Between the steps in a pipeline the data are not written to disk, making this tool suitable for operations where I/O would otherwise dominate.
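As an illustration of the step-chaining described above, DPPP runs are typically driven by a plain-text parameter file ("parset"). The fragment below is a sketch only; the exact key names and step types should be checked against the LOFAR documentation before use.

```
# Hypothetical DPPP parset: read a Measurement Set, flag, then average.
msin = input.ms
msout = averaged.ms
steps = [preflag, avg]        # executed in order; data stay in memory between steps
preflag.type = preflagger
avg.type = averager
avg.timestep = 4              # average 4 time slots together
avg.freqstep = 4              # average 4 frequency channels together
```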
ctapipe.flow is a Python implementation of the flow-based programming paradigm for the ctapipe framework. In flow-based programming, applications are defined as networks of black-box components that exchange data across predefined connections; these components can be reconnected to form different applications. ctapipe-flow executes ctapipe processing modules in a sequential or multiprocess environment, and users implement the steps as Python classes.
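The flow-based idea can be sketched in a few lines of generic Python. Note that the class and function names below are illustrative placeholders, not the actual ctapipe.flow API: steps are black boxes that can be rearranged to form different applications.

```python
# Minimal sketch of flow-based programming (generic Python, not the
# ctapipe.flow API): black-box steps exchange data along a fixed chain.

class Step:
    """A black-box component: consumes one item, produces one item."""
    def run(self, item):
        raise NotImplementedError

class Doubler(Step):
    def run(self, item):
        return item * 2

class Offset(Step):
    def __init__(self, offset):
        self.offset = offset
    def run(self, item):
        return item + self.offset

def flow(steps, source):
    """Push every item from the source through the chain of steps."""
    for item in source:
        for step in steps:
            item = step.run(item)
        yield item

# Reconnecting the same components yields a different application.
pipeline = [Doubler(), Offset(1)]
print(list(flow(pipeline, [1, 2, 3])))  # [3, 5, 7]
```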
HPC programming techniques are applied in many fields. New-generation research projects often require HPC programming for efficient processing of large data volumes, and in this scope the optimisation of the data model and data format is critical. An HPC data format is designed to help CPU data pre-fetching: data have to be contiguous and cache-friendly, and data locality has to be preserved. Tables of data also have to be aligned on the vector registers with respect to their types, so that no peel loop is needed before a vectorized loop.
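The point about peel loops can be illustrated with a toy sketch (plain Python standing in for SIMD code): when a table is padded so that its length is a multiple of the vector width, the main loop covers all elements and no peel or remainder loop is needed. The vector width and function names below are illustrative assumptions.

```python
# Sketch: why aligned, padded tables need no peel/remainder loop.
# A vector unit processes VEC elements at a time; if the table length is
# padded to a multiple of VEC, the main loop alone covers everything.

VEC = 4  # pretend vector width (e.g. 4 floats in a 128-bit register)

def padded(data, pad_value=0.0):
    """Pad the table so its length is a multiple of the vector width."""
    remainder = len(data) % VEC
    if remainder:
        data = data + [pad_value] * (VEC - remainder)
    return data

def vector_sum(data):
    """Main loop only: each iteration consumes one full 'vector' of VEC values."""
    total = 0.0
    for i in range(0, len(data), VEC):
        total += sum(data[i:i + VEC])  # stands in for one SIMD add
    return total

values = padded([1.0, 2.0, 3.0, 4.0, 5.0])  # length 5 -> padded to 8
print(vector_sum(values))  # 15.0, with no peel loop for the trailing element
```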
The INAF Cloud Science Platform explores a possible technological solution for large projects to implement a new, integrated approach to data access, manipulation and sharing, in particular for worldwide distributed collaborations that need to share distributed infrastructures. It is a hybrid cloud built of three main blocks: the CANFAR e-infrastructure, the EGI Federated Cloud and a cloud gateway site at INAF-OATs.
The instrument response builder (IRB) was written to assess the performance of an astrophysics observatory for a given set of signal and background events. The output distributions range from angular and energy resolutions to effective areas and background rates.
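One of the quantities mentioned above, the effective area, follows a standard definition: the fraction of simulated events that survive detection, scaled by the simulated area, per energy bin. The sketch below illustrates this; the function name, bin edges and event lists are hypothetical and do not reflect the IRB interface.

```python
# Sketch of one instrument-response quantity: effective area per energy bin,
# A_eff = (n_detected / n_simulated) * A_simulated.

def effective_area(sim_energies, det_energies, bin_edges, sim_area_m2):
    """Return A_eff (m^2) for each energy bin defined by bin_edges."""
    def counts(energies):
        c = [0] * (len(bin_edges) - 1)
        for e in energies:
            for i in range(len(bin_edges) - 1):
                if bin_edges[i] <= e < bin_edges[i + 1]:
                    c[i] += 1
                    break
        return c

    n_sim = counts(sim_energies)
    n_det = counts(det_energies)
    return [sim_area_m2 * d / s if s else 0.0 for d, s in zip(n_det, n_sim)]

# 4 simulated events in [1, 10), 2 detected -> A_eff = 0.5 * 1e6 m^2
aeff = effective_area([1, 2, 3, 5], [2, 5], [1, 10], 1e6)
print(aeff)  # [500000.0]
```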
Jpp is a Java-inspired collection of C++ classes and applications for PDF creation, multidimensional interpolation, function minimisation and plotting. Derived from the much broader KM3NeT software framework developed by Maarten de Jong, it is released here in a more generic, experiment-independent format. The package uses a flexible, templated class structure that avoids the use of global variables, such as those in Minuit.
The following subpackages are included:
KM3Pipe is a framework for KM3NeT-related tasks, including Monte Carlo and data files, live access to detectors and databases, parsers for different file formats, and an easy-to-use framework for batch processing.
The main Git repository, where issues and merge requests are managed, can be found at http://git.km3net.de
Gravitational wave observations are limited by a background of transient signals from instrumental and environmental origin.
This package includes a set of machine learning tools for classifying those transient signals, in order to better characterize their large population, give hints about their source, and provide new ways of mitigating this background. The algorithms take a labelled set of transients, extract features from the time series, and learn a classifier (a neural network) using standard machine learning libraries.
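The described workflow (labelled transients, feature extraction, classifier training) can be sketched end to end in plain Python. This is a minimal illustration under loose assumptions: the features are simple summary statistics, and a nearest-centroid rule stands in for the neural network used by the actual package.

```python
# Sketch: extract features from labelled time series and learn a classifier.
# A nearest-centroid rule replaces the package's neural network here.

import math

def features(ts):
    """Summary features of one time series: mean, spread, peak amplitude."""
    n = len(ts)
    mean = sum(ts) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in ts) / n)
    peak = max(abs(x) for x in ts)
    return (mean, std, peak)

def train(labelled):
    """labelled: list of (time_series, label). Returns per-class centroids."""
    sums, counts = {}, {}
    for ts, label in labelled:
        f = features(ts)
        acc = sums.setdefault(label, [0.0] * len(f))
        for i, v in enumerate(f):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: tuple(v / counts[lab] for v in acc) for lab, acc in sums.items()}

def classify(centroids, ts):
    """Assign the class whose feature centroid is closest."""
    f = features(ts)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2 for a, b in zip(f, centroids[lab])))

data = [([0.0, 5.0, 0.0], "glitch"), ([0.1, 0.1, 0.1], "noise")]
model = train(data)
print(classify(model, [0.0, 4.0, 0.0]))  # glitch
```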
The objective of the OMGsim software is to provide an easy-to-use simulation of the optical modules containing PMTs, in particular the multi-PMT optical modules of KM3NeT. Particular attention has been given to the simulation of the photocathode, using a dedicated thin layer and a complex refractive index to determine photon absorption, reflection and transmission.
High Performance Computing software libraries (Intel) for data reduction, barycenter calculation, and first- and second-moment calculation.
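The quantities named above have standard definitions that can be illustrated for a one-dimensional weighted signal; the function names below are hypothetical and not part of the library's interface.

```python
# Sketch of the named quantities for a 1-D weighted signal:
# barycenter (first moment) and second moment (spread) of intensities.

def barycenter(positions, weights):
    """Intensity-weighted mean position (first moment)."""
    total = sum(weights)
    return sum(p * w for p, w in zip(positions, weights)) / total

def second_moment(positions, weights):
    """Intensity-weighted variance around the barycenter."""
    total = sum(weights)
    b = barycenter(positions, weights)
    return sum(w * (p - b) ** 2 for p, w in zip(positions, weights)) / total

pos = [0.0, 1.0, 2.0]
wts = [1.0, 2.0, 1.0]
print(barycenter(pos, wts))     # 1.0
print(second_moment(pos, wts))  # 0.5
```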
The LAPP contribution to this deliverable is composed of two parts: a High Performance Computing library and the programs described below.
PYWI is an image filtering library aimed at removing additive background noise from raster images. The filter relies on multiresolution analysis methods (wavelet transforms) that remove certain scales (frequencies) locally in space. These methods are particularly efficient when signal and noise are located at different scales (or frequencies).
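The principle can be illustrated with a one-dimensional analogue (PYWI itself works on 2-D images, and the code below is not its API): a one-level Haar transform separates a signal into a coarse scale and a detail scale, and zeroing the detail scale removes the high-frequency noise.

```python
# Illustration of scale-based filtering (1-D Haar, not the PYWI API):
# zeroing the finest scale removes high-frequency additive noise.

def haar_forward(signal):
    """One Haar level: (approximation, detail) of an even-length signal."""
    approx = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Rebuild the signal from one Haar level."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

noisy = [1.0, 1.2, 1.0, 0.8, 3.0, 3.2, 3.0, 2.8]
approx, detail = haar_forward(noisy)
denoised = haar_inverse(approx, [0.0] * len(detail))  # drop the finest scale
print(denoised)  # [1.1, 1.1, 0.9, 0.9, 3.1, 3.1, 2.9, 2.9]
```

Real wavelet filters threshold the detail coefficients rather than discarding them outright, which is what makes the removal local in space.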
The ROOT analysis framework is one of the most widely used software packages for data analysis and is indeed the de facto standard in high-energy physics. The goal of the ROAst (ROot extensions for ASTronomy) library is to extend ROOT's capabilities by adding packages and tools for astrophysical research.
ROAst comes with three feature sets:
STOA (Script Tracking for Observational Astronomy) is a workflow management system with both command-line and web interfaces. It permits the efficient handling of large, heterogeneous data sets and provides a fast run-test-rerun work cycle for these situations.
SWIFTCASA is a task-based parallelisation of the standard radio astronomy data reduction package CASA. It allows easy distribution of processing on conventional HPC clusters through the use of a shared filesystem, and is able to scale CASA up to the limits of storage throughput relatively easily.
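The task-based idea can be sketched generically (this is not the SWIFTCASA implementation, and the function names are placeholders): split the data set into independent chunks, run one task per chunk, and collect the results. In the real system the chunks would live on the shared filesystem; here they are in-memory lists and a thread pool stands in for the cluster scheduler.

```python
# Generic sketch of task-based parallelisation: independent chunks of a
# data set are processed as concurrent tasks and the results collected.

from concurrent.futures import ThreadPoolExecutor

def reduce_chunk(chunk):
    """Stand-in for a CASA reduction step on one chunk of visibilities."""
    return sum(chunk) / len(chunk)  # e.g. average the chunk

def run_tasks(data, n_chunks):
    """Split data into n_chunks tasks and run them on a worker pool."""
    size = (len(data) + n_chunks - 1) // n_chunks
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(reduce_chunk, chunks))  # order is preserved

print(run_tasks([1.0, 2.0, 3.0, 4.0], 2))  # [1.5, 3.5]
```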
The software and technologies described in this section have been used in the IA2 Data Center, an Italian astrophysical research infrastructure service of INAF that manages astronomical data archives, mainly for ground-based telescopes (LBT, TNG, etc.), and also allows data computing through workflow applications.
Cloud computing and minimal recomputation for CASA.
The components are: the CASA data processing package, widely used in radio astronomy; Jupyter, a system for hosting science data processing on the Web; and Recipe (Recipe Tarball), developed during the RadioNet Hilado project.
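The "minimal recomputation" idea can be sketched generically: a processing step is re-run only when its inputs have changed, detected here by fingerprinting them. This is an illustration of the concept, not the Recipe implementation, and all names below are hypothetical.

```python
# Sketch of minimal recomputation: re-run a step only if its inputs changed,
# detected by hashing them (a make-like dependency check).

import hashlib

class Recomputer:
    def __init__(self):
        self.cache = {}  # step name -> (input fingerprint, cached result)

    def run(self, name, func, inputs):
        digest = hashlib.sha256(repr(inputs).encode()).hexdigest()
        cached = self.cache.get(name)
        if cached and cached[0] == digest:
            return cached[1]          # inputs unchanged: reuse old result
        result = func(inputs)         # inputs changed: recompute
        self.cache[name] = (digest, result)
        return result

calls = []
def calibrate(data):
    calls.append(1)                   # count how often the step really runs
    return [x * 2 for x in data]

rc = Recomputer()
rc.run("calibrate", calibrate, [1, 2])
rc.run("calibrate", calibrate, [1, 2])     # cache hit: step not re-run
rc.run("calibrate", calibrate, [1, 2, 3])  # inputs changed: re-run
print(len(calls))  # 2
```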
Yabi is a 3-tier application stack that provides users with an intuitive, easy-to-use abstraction of compute and data environments. Developed at the Centre for Comparative Genomics, Yabi has been deployed across a diverse set of scientific disciplines and high-performance computing environments.
Yabi is deployed at the IA2 Data Center to allow accredited users to run the HARPS-N and GIANO-B data reduction pipelines on private and public data from the TNG archive.