JU physicist to work on a project awarded by CHIST-ERA

The CHIST-ERA consortium has announced the results of the Open and Re-usable Research Data and Software (ORD) call, directed at international research teams carrying out projects related to open science from multidisciplinary perspectives. Among the nine winning teams, five feature Polish scientists, including one whose members include Dr hab. Andrzej Siódmok, Prof. UJ, from the JU Faculty of Physics, Astronomy and Applied Computer Science.

The CHIST-ERA ORD call was open to international consortia made up of at least three research teams from at least three different participating countries. The PI of each Polish team had to hold at least a PhD degree. The purpose of CHIST-ERA is to support research on information and communication technologies. To date, the network has launched 13 international calls for proposals, with the next one scheduled to start in October.

The project OpenMAPP: Open meta-analysis in particle physics, which features Prof. Andrzej Siódmok from the Institute of Applied Computer Science of the JU Faculty of Physics, Astronomy and Applied Computer Science, will be carried out by an international consortium with members from France (the project leader), South Korea, Turkey, and Great Britain.

The abstract of the project reads as follows:

The Large Hadron Collider (LHC), and other major particle-physics experiments past, present, and future, are vast public investments in fundamental science. However, while the data-analysis and publication mechanisms of such experiments are sufficient for well-defined targets such as the discovery of the Higgs boson in 2012 (and the W, Z, and gluon bosons before), they limit the power of the experimental data to explore more subtle phenomena.

In the 10 years since the Higgs-boson discovery, the LHC has published many analyses testing the limits of the Standard Model (SM) — the established, but suspected-incomplete central paradigm of particle physics. Each direct-search paper has statistically disproven some simplified models of physics beyond the SM, but such models are no more a priori likely than more complex ones: the latter feature a mixture of the simplified ones’ new phenomena, but at lower intensity, rather than concentrated into a single characteristic. To study such “dispersed signal” models requires a change in how LHC results are interpreted: the emphasis must shift to combining measurements of many different event types and characteristics into holistic meta-analyses. Only such a global, maximum-information approach can optimally exploit the LHC results.
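The combination the abstract calls for can be illustrated with a minimal sketch: assuming statistically independent analyses, their per-channel Poisson log-likelihoods simply add, so a "dispersed" signal that adds a little to many channels can still be tested against the Standard Model globally. All analysis names and event counts below are illustrative toy values, not results from the project.

```python
import math

def poisson_loglike(observed, expected):
    """Log of the Poisson probability of `observed` events given `expected`."""
    return observed * math.log(expected) - expected - math.lgamma(observed + 1)

def combined_loglike(analyses, signal_strength):
    """Sum per-analysis log-likelihoods, assuming statistical independence."""
    return sum(
        poisson_loglike(obs, bkg + signal_strength * sig)
        for obs, bkg, sig in analyses
    )

# Three toy analyses: (observed count, SM background, model signal yield).
# A dispersed signal contributes a small excess to each channel rather than
# one large excess in a single channel.
toy_analyses = [(105, 100.0, 4.0), (52, 50.0, 2.0), (23, 20.0, 1.5)]

ll_sm = combined_loglike(toy_analyses, signal_strength=0.0)   # SM only
ll_bsm = combined_loglike(toy_analyses, signal_strength=1.0)  # SM + signal
print(f"2*Delta(lnL) = {2 * (ll_bsm - ll_sm):.2f}")
```

No single channel here is individually significant, but the summed log-likelihood still prefers the signal hypothesis, which is the essence of the global, maximum-information approach described above.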

This project will provide a step towards building the infrastructure needed to make this change. It will make it easier for experiments to provide fast, re-runnable versions of data-analysis logic through enhancements of a domain-specific language and event-analysis toolkits. It will join up the network of such toolkits with the public repositories of research data and metadata. It will provide common interfaces for controlling preserved analyses in the multiple toolkits, and for statistically combining the thousands of measurements and assessing which combinations can provide the most powerful scientific statement about any beyond-SM theory. At the start of the third major data-taking run of the LHC, the time is now ripe to put this machinery and culture in place, so that the LHC legacy is publicly preserved for all to reuse.
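The "common interface" idea can be sketched in a few lines: if every preserved analysis, whatever toolkit it lives in, exposes the same small interface, a meta-analysis driver can re-run them all and merge their measurements. The class and method names below are illustrative assumptions, not the project's actual API.

```python
from abc import ABC, abstractmethod

class PreservedAnalysis(ABC):
    """Uniform handle on a preserved, re-runnable data analysis."""

    @abstractmethod
    def run(self, events) -> dict:
        """Re-run the preserved analysis logic and return named observables."""

class CountingAnalysis(PreservedAnalysis):
    """Toy implementation: counts events passing a simple threshold cut."""

    def __init__(self, name: str, threshold: float):
        self.name = name
        self.threshold = threshold

    def run(self, events) -> dict:
        passed = sum(1 for e in events if e >= self.threshold)
        return {self.name: passed}

def combine(analyses, events) -> dict:
    """A driver sees only the common interface, so results from analyses
    preserved in different toolkits can be merged into one measurement set."""
    results = {}
    for analysis in analyses:
        results.update(analysis.run(events))
    return results

toy_events = [10.0, 35.0, 50.0, 80.0]
print(combine([CountingAnalysis("low_pt", 20.0),
               CountingAnalysis("high_pt", 60.0)], toy_events))
# → {'low_pt': 3, 'high_pt': 1}
```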

The project specifically aims to enhance the extent to which public analysis data from particle-physics experiments (in a general sense, but particularly summary results such as those used in publication plots and statistical inference, rather than raw collider events) can be combined and re-used to test theories of new physics. These tests, pursued by theorists and experimentalists alike, can also go beyond particle physics, connecting to astrophysics, cosmology, nuclear physics, and direct searches for dark matter. The value of combining information from different individual analyses was made clear early in the LHC programme, as early experimental data proved crucial for improving models of SM physics. The huge scientific volume, greater model complexity, and increased precision of the full LHC programme require pursuing this approach in a more systematic and scalable manner, open to the whole community and including the use of reference datasets to ensure validity into the far future.

The time is right for this step, as the key technologies (DOI minting and tracking, RESTful Web APIs, version-control hosting with continuous integration, containerisation) have become mature in the last five or so years. Particle physics already has established open data and publication repositories, but the crucial link of connecting those to scalable preservations of the analysis logic needs to be made, as does normalising the culture of providing such preservations and engaging with the FAIR principles for open science data. Individual physicists are generally enthusiastic about such ideals, as evidenced by the uptake of open data policies at particle-physics labs and the preservation of full collider software workflows. But an explicit, funded effort is required to eliminate the technical barriers and make these desirable behaviours more accessible and rewarded.
