
Jagiellonian University

11 Projects
  • Funder: French National Research Agency (ANR) Project Code: ANR-19-EBI3-0003
    Funder Contribution: 248,508 EUR

    Access to a diverse spectrum of food resources ensures appropriate nutrition and is thus crucial for animal health and fitness. Bees obtain nearly all nutrients from flowers. Their population dynamics are therefore largely determined by the availability, composition and diversity of flowering plants. Alarmingly, many bee populations are in decline in contemporary landscapes, likely due to the loss of floral resource diversity and abundance and to a decrease in the nutritional quality of floral resources. However, the actual link between floral diversity and composition, the nutritional composition of floral resources and bee health is still unclear, particularly in wild bees, which are considered even less resilient to environmental changes than honeybees. In NutriB2, eleven scientists from seven countries will combine their expertise in taxonomy, nutritional and chemical ecology, physiology, behavior, epidemiology, biostatistics and modeling in a synergistic effort to clarify the link between floral biodiversity, nutrition and bee health. We will further reveal critical nutrients and/or nutrient ratios, and thus the key plant species and plant-species compositions, that cover the nutritional needs and support the health of a large fraction of bee species. This knowledge is crucial for our understanding of how floral composition and diversity structure bee communities through nutritionally mediated health effects. It is also essential for designing, identifying and restoring habitats that support wild bee populations. Our results will be shared and discussed with different stakeholders (i.e. seed companies, beekeeping and farmers’ organizations, regional to international conservation groups, schools with programs that actively promote bee diversity, other business representatives and policymakers) through already established contacts and networks.
Our aim is to determine feasible ways of a) restoring and/or maintaining semi-natural habitats with nutritionally highly valuable plant species and b) designing and implementing nutritionally balanced floral seed mixes. Nutritionally appropriate and diverse floral communities will benefit not only diverse bee species, but also other animals that depend on plants, as well as higher trophic levels. NutriB2 will therefore not only shed light on the mechanisms underlying the known positive correlation between floral biodiversity and bee health, but also enable us to design better strategies for conserving or restoring floral diversity for bees, and thus mitigate the ongoing decline of wild bees and biodiversity.

  • Funder: French National Research Agency (ANR) Project Code: ANR-23-NEUG-0001
    Funder Contribution: 56,750 EUR
  • Funder: French National Research Agency (ANR) Project Code: ANR-21-CHR4-0003
    Funder Contribution: 136,744 EUR

    The XPM project aims to integrate explanations into Artificial Intelligence (AI) solutions within the area of Predictive Maintenance (PM). Real-world applications of PM are increasingly complex, with intricate interactions of many components. AI techniques are very popular in this domain; in particular, black-box models based on deep learning show very promising results in terms of predictive accuracy and capability to model complex systems. However, the decisions made by these black-box models are often difficult for human experts to understand, and therefore to act upon. The complete repair plan and the maintenance actions that must be performed based on the detected symptoms of damage and wear often require complex reasoning and planning processes, involving many actors and balancing different priorities. It is not realistic to expect this complete solution to be created automatically; there is too much context that needs to be taken into account. Therefore, operators, technicians and managers require insights to understand what is happening, why it is happening, and how to react. Today’s mostly black-box AI provides neither these insights nor support for experts making maintenance decisions based on the deviations it detects. The effectiveness of a PM system depends much less on the accuracy of the alarms the AI raises than on the relevancy of the actions operators perform based on these alarms. In the XPM project, we will develop several different types of explanations (from visual analytics through prototypical examples to deductive argumentative systems) and demonstrate their usefulness in four selected case studies: electric vehicles, metro trains, a steel plant and wind farms.
In each of them, we will demonstrate how the right explanations of decisions made by AI systems lead to better results across several dimensions: identifying the component or part of the process where the problem has occurred; understanding the severity and future consequences of detected deviations; choosing the optimal repair and maintenance plan from several alternatives created based on different priorities; and understanding why the problem occurred in the first place, as a way to improve system design for the future.

  • Funder: French National Research Agency (ANR) Project Code: ANR-23-CHRO-0006
    Funder Contribution: 97,498.8 EUR

    The Large Hadron Collider (LHC), and other major particle-physics experiments past, present, and future, are vast public investments in fundamental science. However, while the data-analysis and publication mechanisms of such experiments are sufficient for well-defined targets such as the discovery of the Higgs boson in 2012 (and the W, Z, and gluon bosons before it), they limit the power of the experimental data to explore more subtle phenomena. In the 10 years since the Higgs-boson discovery, the LHC has published many analyses testing the limits of the Standard Model (SM), the established but likely incomplete central paradigm of particle physics. Each direct-search paper has statistically disproven some simplified models of physics beyond the SM, but such models are no more likely a priori than more complex ones: the latter feature a mixture of the simplified models’ new phenomena, at lower intensity, rather than concentrated into a single characteristic. Studying such “dispersed signal” models requires a change in how LHC results are interpreted: the emphasis must shift to combining measurements of many different event types and characteristics into holistic meta-analyses. Only such a global, maximum-information approach can optimally exploit the LHC results. This project will provide a step towards building the infrastructure needed to make this change. It will enable experiments to provide fast, re-runnable versions of their data-analysis logic through enhancements of a domain-specific language and event-analysis toolkits. It will connect the network of such toolkits with the public repositories of research data and metadata. It will provide common interfaces for controlling preserved analyses in the multiple toolkits, for statistically combining the thousands of measurements, and for assessing which combinations provide the most powerful scientific statement about any beyond-SM theory.
At the start of the third major data-taking run of the LHC, the time is ripe to put this machinery and culture in place, so that the LHC legacy is publicly preserved for all to reuse. The project specifically aims to enhance the extent to which public analysis data from particle-physics experiments (in a general sense, but particularly summary results such as those used in publication plots and statistical inference, rather than raw collider events) can be combined and reused to test theories of new physics. These tests, pursued by theorists and experimentalists alike, can also reach beyond particle physics, connecting to astrophysics, cosmology, nuclear physics and direct searches for dark matter. The value of combining information from different individual analyses was made clear early in the LHC programme, as early experimental data proved crucial for improving models of SM physics. The huge scientific volume, greater model complexity, and increased precision of the full LHC programme require pursuing this approach in a more systematic and scalable manner, open to the whole community and including the use of reference datasets to ensure validity into the far future. The time is right for this step, as the key technologies (DOI minting and tracking, RESTful web APIs, version-control hosting with continuous integration, containerisation) have matured over the last five or so years. Particle physics already has established open-data and publication repositories, but the crucial link connecting those to scalable preservations of the analysis logic still needs to be made, as does normalising the culture of providing such preservations and engaging with the FAIR principles for open science data. Individual physicists are generally enthusiastic about such ideals, as evidenced by the uptake of open-data policies at particle-physics labs and the preservation of full collider software workflows. But an explicit, funded effort is required to eliminate the technical barriers and make these desirable behaviours more accessible and better rewarded.

  • Funder: French National Research Agency (ANR) Project Code: ANR-18-QUAN-0002
    Funder Contribution: 232,799 EUR

    Recent progress in various areas of physics has demonstrated our ability to control quantum effects in customized systems and materials, thus paving the way for a promising future for quantum technologies. The emergence of such quantum devices, however, requires understanding fundamental problems in non-equilibrium statistical physics, which can pave the way towards full control of quantum systems, enabling new applications and providing innovative perspectives. This project is dedicated to the study and control of the out-of-equilibrium properties of quantum many-body systems driven across phase transitions. Among several approaches, it will mainly focus on slow quenches and draw on the understanding delivered by the Kibble-Zurek (KZ) mechanism. This rather simple paradigm connects equilibrium with out-of-equilibrium properties and constitutes a benchmark for scaling hypotheses. It could pave the way towards tackling relevant open questions which lie at the heart of our understanding of out-of-equilibrium dynamics and are key issues for operating any quantum simulator robustly. Starting from this motivation, we will test the limits of validity of KZ dynamics by analyzing its predictions, thus clarifying its predictive power, and extend this paradigm to quantum critical systems with long-range interactions and to topological phase transitions. We will combine innovative theoretical ideas from condensed-matter physics, quantum optics, statistical physics and quantum information with advanced experiments on ultracold atomic quantum gases. Quantum gases are a unique platform, providing model systems with the level of flexibility and control necessary for our ambitious goal.
Their cleanness and their robustness to decoherence will greatly enhance the interplay between theory and experiment, and provide a platform of studies whose outcomes are expected to have a strong scientific impact over a wide range of disciplines. On short time scales, we will exploit this knowledge to develop viable protocols for quantum simulators. Our proposed work lies squarely within the “Quantum Technologies” theme. More specifically, by providing a deeper understanding and direct control of out-of-equilibrium phenomena in quantum many-body systems, we will make impactful contributions to the areas of “Quantum simulation” and “Quantum metrology, sensing and imaging”. Firstly, we will make significant advances in initializing a quantum system in a well-controlled initial state (the ground state, without defects) and in optimizing the adiabatic control of its time evolution towards an “interesting” target state, both of which are crucial for adiabatic quantum computing. The initial state and target state may be separated by a phase transition, which brings to the fore the question of the time evolution of a quantum system near a critical point. The response of a system driven across critical points is further relevant for developing atomtronic devices and quantum sensors operating at the limit, which may find applications in the detection of extremely weak signals, including in fields as diverse as the search for dark matter. 
Our consortium is composed of world-leading scientists with pioneering contributions to the non-equilibrium dynamics of ultracold atomic systems, and possesses a unique combination of the expertise and tools needed for the successful completion of the proposed research. We expect that the results of this project will lay the ground for the development of the next generation of quantum devices and simulators.
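    As background, the KZ scaling argument the project builds on can be sketched as follows (this derivation is standard textbook material, not part of the project abstract; the symbols are the usual ones: quench time τ_Q, correlation-length exponent ν, dynamical exponent z, spatial dimension d):

    ```latex
    % Kibble-Zurek scaling for a linear quench across a continuous transition.
    % The system falls out of equilibrium at the freeze-out time \hat{t},
    % when the equilibrium correlation length has grown to \hat{\xi},
    % leaving behind a density of defects n.
    \begin{align}
      \hat{t}   &\sim \tau_Q^{\,z\nu/(1+z\nu)}
                && \text{(freeze-out time)} \\
      \hat{\xi} &\sim \tau_Q^{\,\nu/(1+z\nu)}
                && \text{(frozen correlation length)} \\
      n         &\sim \hat{\xi}^{-d} \sim \tau_Q^{-d\nu/(1+z\nu)}
                && \text{(defect density)}
    \end{align}
    ```

    Testing where this power law holds, and how it is modified for long-range interactions or topological transitions, is precisely the kind of prediction the project proposes to probe.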


