
Institut national de recherche en informatique et en automatique

Country: France

9 Projects, page 1 of 2
  • Funder: French National Research Agency (ANR) Project Code: ANR-12-JS01-0004
    Funder Contribution: 149,960 EUR

    Incompressible fluid-structure interaction problems, i.e., mathematical models that describe the interaction of a deformable structure with an internal or surrounding incompressible fluid flow, are among the most widespread multi-physics problems. Their numerical simulation is of major interest in practically all engineering fields, from the aeroelasticity of bridge decks and parachutes to naval hydrodynamics and the biomechanics of blood and airflow. The separate simulation of either an elastic structure in large displacements or an incompressible flow in a fixed domain is rather well established. Yet, making both models interact via efficient numerical methods is a permanent challenge in scientific computing and numerical analysis. In fact, besides the increasing complexity of the models (contacting structures, active mechanics, porous media, etc.), there is also an emerging interest in addressing inverse problems (e.g., to improve clinical diagnosis via personalized fluid-structure models), which definitely calls for efficient numerical methods. Nowadays, numerical simulations of these multi-physics systems are obtained at the expense of efficiency: they are much more computationally onerous than solving two independent fluid and structure problems. This indicates that, in terms of efficiency, the coupling scheme does not fully exploit the maturity of the numerical methods for each of the sub-systems, which is a major obstacle. The basic principle of this proposal is that, to guarantee efficiency, the coupling scheme must allow a decoupled time-marching of the fluid and the structure. This is a particularly challenging problem in numerical analysis, since fluid incompressibility generally makes standard decoupling schemes unstable. The scientific objective is thus to marry decoupling with stability and accuracy.
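
    The stability issue mentioned above can be illustrated on a toy problem. The sketch below is an illustration of this summary, not the project's method: it uses a single-degree-of-freedom structure loaded by a fluid "added mass", a standard caricature of incompressible FSI, and compares a monolithic solve of the coupled system with a naive staggered scheme that lags the fluid load by one step. All parameter values are assumed.

```python
# Toy added-mass model of incompressible FSI (illustrative only; all values assumed).
# Compares a monolithic solve of the coupled system with a loosely coupled
# (staggered) scheme that lags the fluid load by one time step.
import numpy as np

m_s, m_a, k = 1.0, 2.0, 10.0     # structure mass, fluid added mass, stiffness
dt, n_steps = 0.01, 200

def simulate(staggered: bool) -> np.ndarray:
    d, v, a_prev = 1.0, 0.0, 0.0  # displacement, velocity, previous acceleration
    history = []
    for _ in range(n_steps):
        if staggered:
            # Fluid load -m_a * a is evaluated with the *previous* acceleration:
            # cheap, but in this toy model unstable whenever m_a > m_s
            # (the added-mass effect of incompressible flow).
            a = (-k * d - m_a * a_prev) / m_s
        else:
            # Monolithic coupling: fluid and structure are solved together.
            a = -k * d / (m_s + m_a)
        v += dt * a               # symplectic Euler update
        d += dt * v
        a_prev = a
        history.append(d)
    return np.array(history)

print("monolithic max |d|:", np.abs(simulate(False)).max())  # stays near 1
print("staggered  max |d|:", np.abs(simulate(True)).max())   # grows without bound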

  • Funder: French National Research Agency (ANR) Project Code: ANR-20-CE48-0017
    Funder Contribution: 195,920 EUR

    In the last decades, we have seen a large deployment of smart devices and contactless smart cards, with applications to the Internet of Things and smart cities. These devices have strong security requirements, as they communicate sensitive data, but they have very limited resources available: constrained computing capabilities and a limited energy budget. This led to security disasters with the use of weak homemade cryptography such as KeeLoq or MIFARE. More recently, the academic cryptography community has come up with dedicated lightweight designs such as PRESENT or SKINNY, and NIST is currently organizing a competition to select the next worldwide standards. The goal of this project is to perform a wide security evaluation of the designs submitted to, and chosen by, the NIST competition, and of lightweight cryptographic algorithms in general. We will use the latest cryptanalysis advances but also propose new attacks, and we will study classical attacks as well as physical ones, which are very powerful in such scenarios.
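
    To give a flavour of what such lightweight designs look like, the toy round below follows the substitution-permutation network structure used by ciphers like PRESENT. The 4-bit S-box and the bit permutation are placeholders chosen for this sketch only, not the tables of any real cipher.

```python
# Toy 16-bit substitution-permutation network round, only to illustrate the
# structure shared by lightweight ciphers such as PRESENT or SKINNY.
# The S-box and permutation are arbitrary placeholders, not real cipher tables.
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]
PERM = [(4 * i) % 15 for i in range(15)] + [15]   # bit i of the state moves to PERM[i]

def spn_round(state: int, round_key: int) -> int:
    state ^= round_key                            # round-key addition
    # Substitution layer: the 4-bit S-box acts on each nibble independently.
    state = sum(SBOX[(state >> (4 * i)) & 0xF] << (4 * i) for i in range(4))
    # Permutation layer: relocate every bit according to PERM.
    out = 0
    for i in range(16):
        out |= ((state >> i) & 1) << PERM[i]
    return out

# Example: three rounds under hypothetical round keys.
state = 0x1234
for rk in (0x0F0F, 0x3C3C, 0xA5A5):
    state = spn_round(state, rk)
print(hex(state))
```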

  • Funder: French National Research Agency (ANR) Project Code: ANR-18-CE25-0010
    Funder Contribution: 408,959 EUR

    General-purpose Operating Systems (OSes), such as Linux, are increasingly used in the safety-critical embedded systems industry, with usage in automotive, medical and cyber-physical systems. However, it is well known that general-purpose OSes suffer from bugs. Recently, some major advances have been made in verifying OS kernels, mostly employing interactive theorem-proving techniques. Nevertheless, these approaches incur the cost of a huge user-assisted proving effort, not only during the design, implementation and proving phases but also when maintaining the OSes or extending them with new features. Therefore, scaling such techniques to verify all OS services remains a challenge. An alternative that has not previously been investigated in the context of OS services is the use of automated static analysis. Automated static analysis tools compute program invariants and semantic properties without human intervention. These tools rely on abstractions that describe a range of program invariants and properties, and on algorithms to compute invariants that hold true for all program executions. Still, while sound, static analyzers are incomplete, meaning that they may fail to prove a valid program correct, instead conservatively reporting it as "possibly incorrect". The complex data structures, low-level operations, and concurrency intrinsic to OS code further raise the difficulty of applying automated static analysis tools in an OS setting. The VeriAMOS project brings together experts in static analysis, operating systems, and programming language design, at Inria, Sorbonne University, and the University of Grenoble, to propose a new approach to producing verified OS services, based on the use of Domain-Specific Languages (DSLs) and automated static analysis. A DSL is a programming language that is designed around abstractions specific to a family of programs. A DSL lets programmers implement a specific class of algorithms at a higher level than a conventional programming language, and provides a compilation chain to produce target code. By limiting the language expressiveness, a DSL also constrains programmers and prevents the misuse of language features. The key insight behind the VeriAMOS project is that by raising the level of abstraction and restricting the program design space, DSLs can make it possible to fully automate strong verifications on implementations of real OS services. As a showcase for the proposed approach, the VeriAMOS project will focus on I/O schedulers. Recent years have seen the introduction of a number of new, complex I/O schedulers to meet the challenges raised by SSDs, which provide much greater performance than traditional rotating disks and are attractive in the context of embedded systems due to their shock resistance. VeriAMOS will target properties such as no loss or corruption of requests and freedom from starvation, which are critical in an SSD I/O scheduler implementation. Challenges include the design, automation, and deployment of relevant abstractions to support the static analysis, and the definition of an expressive, high-level programming framework that ensures the generation of efficient and provably correct scheduling policies.
    To overcome these challenges, synergy is needed between the domains of static analysis, operating systems, and programming language design, to ensure that the static analysis is able to reason about the chosen programming constructs and that the chosen programming constructs are sufficient to implement real scheduling policies. While the VeriAMOS project will focus on the domain of I/O schedulers, we strongly believe that this work will pave the way towards the automatic verification of other OS services. As a result, creating fully verified critical services will no longer be limited to a few proof experts, but will be accessible to a wide spectrum of systems code developers.
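
    Since the abstract does not specify the DSL itself, the following is only a hypothetical sketch of the idea: if a policy is restricted by construction to reordering the pending request queue, a simple automated check can establish that no policy expressed in the language loses, duplicates, or corrupts requests.

```python
# Hypothetical sketch only (not the actual VeriAMOS DSL): scheduling policies
# are restricted to *reordering* the pending queue, so an automated check can
# show that no policy loses, duplicates, or corrupts requests.
from collections import Counter
from dataclasses import dataclass
from typing import Callable, List

@dataclass(frozen=True)
class Request:
    sector: int       # target sector on the device
    deadline: int     # tick by which the request should be dispatched

Policy = Callable[[List[Request]], List[Request]]   # reorder-only interface

def fifo(queue: List[Request]) -> List[Request]:
    return list(queue)

def earliest_deadline_first(queue: List[Request]) -> List[Request]:
    return sorted(queue, key=lambda r: r.deadline)

def check_no_loss(policy: Policy, queue: List[Request]) -> None:
    # The dispatched sequence must be a permutation of the submitted requests.
    assert Counter(policy(queue)) == Counter(queue), "requests lost or corrupted"

pending = [Request(10, 5), Request(3, 2), Request(7, 9)]
for policy in (fifo, earliest_deadline_first):
    check_no_loss(policy, pending)
print("both example policies preserve the request set")
```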

  • Funder: French National Research Agency (ANR) Project Code: ANR-13-TECS-0006
    Funder Contribution: 943,588 EUR

    The liver has a remarkable capability to regenerate: a healthy human liver can regenerate more than 2/3 of its mass. However, patients undergoing liver surgery often suffer from liver diseases accompanied by a significant reduction in liver function and regenerative capacity. Post-operative liver failure due to insufficient functional liver mass is the main cause of short-term mortality after hepatectomy (around 5% at 3 months after surgery; 5000 hepatectomies per year in France). Today, the limits of liver surgery and partial liver transplantation are based on an empirical minimal acceptable liver volume that is defined preoperatively on a volumetric reconstruction from CT scans. These limits rely on a priori ratios between liver volume and liver function that depend on the quality of the liver parenchyma and on the therapeutic situation. However, an evaluation based on these threshold values fails if the a priori ratios do not apply to a given patient, or if the partial liver transplantation or major hepatectomy leads to a drastic modification of the hepatic hemodynamics. The aim of this project is twofold. First, the relationships between liver function, blood flows, architecture and liver volume after major hepatectomy will be carefully analyzed by combining experiments and modeling at multiple levels and scales. These results will be used to develop an alternative evaluation procedure based on innovative tools, particularly near-infrared fluorescence imaging (which permits intraoperative evaluation of liver function), in order to improve the identification of possible or likely post-operative liver failure as early as possible. In this way, a critical decrease in liver function can be duly treated. Preliminary studies in pigs indicate that patients in whom liver failure is likely to occur may significantly benefit from an implantable surgical device capable of modulating the diameter of the portal vein and hence controlling the portal venous pressure in the liver. The second goal of this project is to explore the conditions for the optimal use of this device for such patients. For both goals, mathematical models at the organ, lobular and cellular levels will be built within an iterative process of model refinement and experimental and clinical data acquisition. The emerging multiscale model framework, linking models at various spatial scales, will thus provide a predictive tool to better understand how data measured in different parts of the blood circulation, bile and tissue reflect liver functional changes depending on the surgical procedure (including transplantation) as well as on the subsequent liver regeneration process. This highly multidisciplinary project involves clinicians and researchers specialized in liver surgery and liver transplantation (DHU Hepatinov - Hôpital Paul Brousse - Villejuif), mathematicians specialized in the modeling and computer simulation of biological flows, tissue growth and remodeling, and their biomedical applications (INRIA), an SME involved in intraoperative fluorescence imaging (Fluoptics™) that will industrialize the imaging system and market it after the project, and an SME that has developed an adjustable vascular ring (MID-AVR™) allowing postoperative modulation of liver blood perfusion.
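
    As a purely illustrative aside (not the project's multiscale model), a Poiseuille-type lumped resistance already suggests why an adjustable ring can be effective: hydraulic resistance scales as 1/r^4, so modestly narrowing the portal vein changes the pressure drop markedly for a given inflow. The viscosity, segment length, and flow values below are assumed, order-of-magnitude figures.

```python
# Illustrative lumped-parameter sketch (not the project's multiscale model).
# Poiseuille resistance R = 8*mu*L / (pi*r^4): the r^4 dependence means that
# tightening a ring around the portal vein strongly changes the pressure drop
# for a fixed inflow. All numeric values below are assumed.
import math

MU = 3.5e-3        # blood viscosity [Pa.s] (assumed typical value)
LENGTH = 0.08      # length of the constricted segment [m] (assumed)
FLOW = 1.2e-5      # portal inflow [m^3/s], roughly 0.7 L/min (assumed)

def poiseuille_resistance(radius: float) -> float:
    """Hydraulic resistance of a cylindrical segment of radius `radius` [m]."""
    return 8 * MU * LENGTH / (math.pi * radius ** 4)

for radius_mm in (5.0, 3.0, 2.0):                  # ring progressively tightened
    dp = poiseuille_resistance(radius_mm * 1e-3) * FLOW
    print(f"radius {radius_mm:.0f} mm -> pressure drop {dp / 133.3:.2f} mmHg")
```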

  • Funder: French National Research Agency (ANR) Project Code: ANR-13-MONU-0001
    Funder Contribution: 414,800 EUR

    This project aims to quantify the uncertainties of the pollutant concentrations computed by an operational urban air quality model. The uncertainties refer to the range of values that the errors (i.e., the discrepancies between the model outputs and the true values) can take. These errors are usually modeled as a random vector, whose probability density function is the complete description of the uncertainties. Our strategy to approximate this probability density function is to generate an ensemble of simulations that properly samples the errors. The application is air quality simulation across Clermont-Ferrand, using a dynamic traffic model to compute traffic emissions and an atmospheric chemistry-transport model that explicitly represents the streets of the city. Based on the emission data, meteorological conditions and background pollutant concentrations, the air quality model computes every hour the concentration fields (across the whole city) of several air pollutants, especially nitrogen dioxide and particulate matter. As a result of the complexity of atmospheric phenomena and the limited observations, the simulations can show high uncertainties, which need to be estimated. Our objective is to propose a tractable approach that provides uncertainty estimates along with any urban simulation. The approach should apply to short-term forecasts as well as to long-term simulations (e.g., for impact studies). One major uncertainty source lies in the traffic emissions. We will carefully estimate the uncertainties of the traffic assignments in the streets and of the associated pollutant emissions. Using multiple simulations of a state-of-the-art dynamic traffic model, an ensemble of traffic assignments will be generated. This ensemble will be calibrated with traffic observations so that it is representative of the uncertainties of the traffic model. The associated ensemble of pollutant emissions will provide inputs to the air quality model. An ensemble of air quality simulations will then be generated, using the different traffic emissions, perturbed input data (Monte Carlo approach) and possibly a multimodel approach. This ensemble will also be calibrated, using observations of pollutant concentrations in the air. The air quality model is a high-dimensional model with a high computational cost. In order to generate an ensemble of simulations, it is necessary to reduce the computational costs; consequently, part of the project deals with the reduction of the air quality model. This project is proposed in a context of increasing use of numerical air quality models at the urban scale. The models are used for daily forecasts, for the assessment of the long-term exposure of populations to pollution, for the evaluation of the impact of new regulations, and so on. We will propose methods that can be applied in an operational context to the core modeling chain, from traffic assignment to atmospheric dispersion. The scientific results of the project will be integrated in an operational modeling system that is currently used for many cities in France and abroad.
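
    The ensemble strategy can be sketched in a few lines. The toy example below is not the project's modelling chain (which couples a traffic model to a street-resolved chemistry-transport model); all values are assumed. It perturbs an uncertain emission input, propagates each member through a stand-in dispersion relation, and reads the uncertainty estimate off the ensemble spread.

```python
# Illustrative sketch of the ensemble strategy (not the project's modelling
# chain): perturb uncertain traffic emissions, propagate each member through
# a cheap stand-in "dispersion model", and summarise the concentration spread.
import numpy as np

rng = np.random.default_rng(0)
N_MEMBERS = 200

base_emission = 50.0         # NO2 street emission, arbitrary units (assumed)
emission_uncertainty = 0.30  # lognormal spread on emissions (assumed)
background = 20.0            # background NO2 concentration [ug/m3] (assumed)
dilution = 0.4               # stand-in transfer coefficient emission -> concentration

# 1. Sample an ensemble of perturbed emissions (lognormal keeps them positive).
emissions = base_emission * rng.lognormal(mean=0.0, sigma=emission_uncertainty,
                                          size=N_MEMBERS)

# 2. Propagate each member through the (here trivial) dispersion model.
concentrations = background + dilution * emissions

# 3. The ensemble spread is the uncertainty estimate attached to the forecast.
print(f"mean concentration : {concentrations.mean():6.1f} ug/m3")
print(f"std  (uncertainty) : {concentrations.std():6.1f} ug/m3")
print(f"95% interval       : [{np.percentile(concentrations, 2.5):.1f}, "
      f"{np.percentile(concentrations, 97.5):.1f}] ug/m3")
```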
