Laboratoire d'Informatique, Signaux et Systèmes de Sophia Antipolis
10 Projects, page 1 of 2
Project (from 2022)
Partners: Laboratoire d'Informatique, Signaux et Systèmes de Sophia Antipolis
Funder: French National Research Agency (ANR) | Project Code: ANR-22-PAUK-0061 | Funder Contribution: 35,000 EUR

By 2021, cloud IP traffic will account for the largest share of an Internet traffic that grows ever more complex with increasing device diversity and traffic dynamicity. A paradigm proposed for the cloud to face this situation is Knowledge-Defined Networking (KDN), where Machine Learning (ML) and Artificial Intelligence (AI) are combined with SDN/NFV and network monitoring to collect data, transform them into knowledge (e.g. models) via ML, and take decisions based on this knowledge. Under this paradigm, we aim to design a unified AI-based framework able to learn new, efficient cloud network control algorithms. This framework will seamlessly integrate data-driven control (based on ML tools) and model-driven control (based on optimization models), addressing the scalability and optimality issues of cloud control. To do so, we intend to apply two promising AI tools: Deep Learning (DL) and Reinforcement Learning (RL). In the project, a deep Artificial Neural Network (ANN) will be used to transform the original input data representation (in our case, the cloud network state) into a low-dimensional space where the structural information and properties of the network are maximally preserved, and to use that representation to solve the optimal control problem in a more tractable way. RL will be applied to learn the optimal control by interacting with the environment (in our case, the cloud network). These interactions can be used to guide the learning of the weights of the deep ANN. As a result, the RL algorithm (acting as the control loop) will solve the control problem more easily, using as input the more compact, lower-dimensional representation found by the deep neural network.
The main novelty of our approach is the claim that, for network control problems, the deep ANN should not be built from the deep layer architectures used in computer vision (the so-called convolutional layers), but from a different kind, the novel graph embedding architectures, which are better fitted to the graph nature of network problems. We therefore propose to use graph embedding layers as the deep layers for solving cloud network control problems, namely the dynamic allocation of service chains composed of network virtualised functions. Starting from the case where the network service is unicast, we will later move to the multicast case, since video delivery, the classical multicast service, is the Internet killer application. Finally, we will implement a KDN proof-of-concept testbed in which our Deep Reinforcement Learning control sends its control decisions via the northbound interface to an SDN controller, which, in turn, instructs an emulated SDN network.
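The embedding-then-control pipeline described above can be sketched in a few lines. This is a minimal illustration only: the toy topology, the embedding dimension, the two rounds of message passing, and all variable names are assumptions for the sketch, not the project's actual architecture.

```python
# Minimal sketch of a graph-embedding front end for an RL control loop.
# A message-passing layer (structure2vec-style) turns the network state
# into per-node embeddings; an agent would then score placement actions
# from those embeddings. Everything here is illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy 4-node network: adjacency matrix plus one feature per node
# (e.g. residual capacity), standing in for the cloud network state.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
x = np.array([[0.9], [0.2], [0.5], [0.7]])   # node features, shape (4, 1)

d = 8                                        # embedding dimension (assumption)
W_self = rng.normal(size=(1, d))             # untrained weights for the sketch
W_nbr = rng.normal(size=(d, d))

# Two rounds of neighbourhood aggregation:
#   h_v <- relu(x_v W_self + sum_{u in N(v)} h_u W_nbr)
h = np.zeros((4, d))
for _ in range(2):
    h = np.maximum(0.0, x @ W_self + A @ h @ W_nbr)

# An RL agent would place a virtualised function on the node whose
# embedding maximises a (learned) value head; here the head is random.
w_value = rng.normal(size=(d,))
best_node = int(np.argmax(h @ w_value))
print(h.shape, best_node)
```

In a real deep RL setup the weights `W_self`, `W_nbr`, and `w_value` would be trained from the reward signal of the control interactions rather than drawn at random.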
Project (from 2025)
Partners: Laboratoire d'Informatique, Signaux et Systèmes de Sophia Antipolis; Institut de Chimie Radicalaire UMR 7273
Funder: French National Research Agency (ANR) | Project Code: ANR-24-CE48-7504 | Funder Contribution: 646,973 EUR

Automata networks are general models of interacting entities that exhibit "complex" collective behaviours. Relating the local rules followed by the entities, the architecture of their interactions, and the global dynamics is a central motivation of the community. ALARICE aims at understanding these relations through the formal framework of computational complexity theory. This renewed point of view explains why some structural bounds from the literature are still loose despite considerable effort: the inferences at stake involve algorithmically complex decisions. Our project relies on promising initial results obtained by members of the consortium, and is threefold. 1- Unveil metatheorems of the form "any non-trivial question of type X expressed in logic Y is Z-hard", for various X such as "given the local rules, a question on the graph of the dynamics", Y in {FO, MSO, ...} and Z in {P, NP, ...}. We have a first "à la Rice" result of this flavour, presented at STACS'2021 (Y=FO, Z=NP), with a proof technique based on abstract pumping, using tools from finite model theory to construct (polytime) metareductions. 2- Characterize the formal complexity of new natural problems, in particular those related to the network architecture. Getting a mature understanding of the constructions is a preliminary step towards metastatements. 3- Transfer this knowledge to other models of computation, via strict simulations acting as reductions. Our consortium is built around a core expertise in automata networks, complexity and simulation.
It aims to enroll strong additional expertise from related fields (finite model theory, lattice dualization, graph parameterization), in order to nourish the fruitful connections recently discovered.
Project (from 2015)
Partners: INS2I; CNRS PARIS A; LINA; École Polytechnique; Université Pierre et Marie Curie; University of Nantes; LIX; INRIA; CNRS; Laboratoire d'Informatique, Signaux et Systèmes de Sophia Antipolis; Laboratoire d'Informatique de l'École Polytechnique
Funder: French National Research Agency (ANR) | Project Code: ANR-15-CE25-0002 | Funder Contribution: 874,079 EUR

Verifying the correctness and robustness of programs and systems is a major challenge in a society that relies more and more on safety-critical systems controlled by embedded software. This issue is even more critical when the computations involve floating-point arithmetic, an arithmetic known for its quite unusual behaviours and increasingly used in embedded software. Consider, for example, the "catastrophic cancellation" phenomenon, in which most of the significant digits of a result are cancelled, or numerical sequences whose limit is very different over the real numbers and over the floating-point numbers. An even more important problem arises when we want to analyse the relationship between floating-point computations and an "idealized" computation that would be carried out with real numbers, the reference in the design of the program. The point is that for some input values, the control flow over the real numbers can go through one conditional branch while it goes through another over the floating-point numbers. Certifying that a program, despite some control flow divergences, computes what it is actually expected to compute with minimal error is the subject of robustness, or continuity, analysis. Providing a set of techniques and tools for verifying the accuracy, correctness and robustness of critical embedded software is a major challenge.
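Both floating-point pitfalls mentioned above are easy to reproduce. The following sketch (standard IEEE 754 binary64 doubles; the second part uses Muller's well-known recurrence as a stand-in example, not a benchmark from the project) illustrates them:

```python
# Catastrophic cancellation: subtracting nearly equal numbers wipes out
# most of the significant digits of the exact difference.
a = 1.0 + 1e-15
b = 1.0
diff = a - b
print(diff)  # close to, but not exactly, 1e-15: low-order digits are rounding noise

# A sequence whose limit differs over the reals and over floats:
# Muller's recurrence u_{n+1} = 111 - 1130/u_n + 3000/(u_n * u_{n-1}),
# with u_0 = 2, u_1 = -4, converges to 6 over the reals, but in binary64
# rounding errors push it to the other attracting fixed point, 100.
u, v = 2.0, -4.0
for _ in range(30):
    u, v = v, 111.0 - 1130.0 / v + 3000.0 / (v * u)
print(v)  # ~100.0, far from the real-valued limit 6
```

Such examples show why a branch guarded by, say, `diff > 0` can take different paths over the reals and over the floats, which is precisely the divergence that robustness analysis must certify against.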
The aim of this project is to address this challenge by exploring new methods based on a tight collaboration between abstract interpretation (AI) and constraint programming (CP). In other words, the goal is to push the limits of these two techniques to improve accuracy analysis, to enable a more complete verification of programs using floating-point computations, and thus to make critical decisions more robust. The cornerstone of the project is the combination of the two approaches: increasing the accuracy of robustness proofs by using CP techniques and, where appropriate, generating non-robust test cases. The goal is to benefit from the strengths of both techniques: CP provides powerful but computationally expensive algorithms to reduce domains to an arbitrary given precision, whereas AI does not provide fine control over domain precision but has developed many abstract domains that quickly capture program invariants of various forms. Incorporating some CP mechanisms (search tree, heuristics) into abstract domains would make it possible, in the presence of false alarms, to refine the abstract domain with better precision. The first problem to solve is to set the theoretical foundations of an analyser based on two substantially different paradigms. Once the interactions between CP and AI are well formalized, the next issue is to handle constraints of general forms and potentially non-linear abstract domains. Last but not least, an important issue concerns the robustness analysis of systems more general than programs, such as the hybrid systems that model control command programs. Research results will be evaluated on realistic benchmarks coming from industrial companies, in order to determine their benefits and relevance.
For the explored approaches, using realistic examples is a key point, since the proposed techniques often behave acceptably only on given subclasses of problems (in terms of worst-case computational complexity, all these problems are intractable). That is why many solutions are closely tailored to the target problems.
Project (from 2014)
Partners: Institut de recherche en communications et cybernétique de Nantes; IBV; Nice Sophia Antipolis University; UCA; INSERM; INSB; Rythmes Biologiques et Cancer; CNRS; Laboratoire d'Informatique, Signaux et Systèmes de Sophia Antipolis; Institut National de Recherche en Informatique et en Automatique
Funder: French National Research Agency (ANR) | Project Code: ANR-14-CE09-0011 | Funder Contribution: 551,997 EUR

The mammalian circadian timing system rhythmically controls most aspects of behaviour and physiology over the 24-hour day. The basic underlying component of this system is a molecular clock, present in virtually every cell and controlled by a genetic network. The self-sustained oscillations of these circadian clocks are synchronised by external or internal time cues and in turn coordinate key cellular processes such as signalling, the cell cycle, and metabolism. While the molecular makeup of the circadian clock is relatively well known, we are still far from fully understanding how the clock mechanism is integrated with other important processes to ensure optimal temporal coordination at the molecular, cellular and physiological levels. This is a critical issue because circadian misalignment or disruption, as observed for example in workers exposed to rotating shift work, compromises health and wellbeing. Indeed, experimental and clinical evidence increasingly supports the hypothesis that poor circadian coordination is a risk factor for major pathologies such as cancer and cardiovascular, metabolic, inflammatory and sleep disorders. In addition, the tolerability and efficacy of treatments are strongly influenced by the time of administration, because the circadian system also controls key pharmacological determinants of drugs.
Accordingly, the concept of chronotherapy aims at integrating circadian timing and pharmacology in order to improve the therapeutic index of drugs through appropriately timed delivery. The networked and dynamic nature of the circadian clock mechanism makes it difficult to investigate and understand its behaviour when coupled with input and output pathways or with other genetic or biochemical networks using experimental approaches alone. In two previous projects, we successfully combined experimental and mathematical modelling approaches to (i) provide the proof of principle that modelling based on circadian data can predict the optimal timing of irinotecan delivery, leading to improved tolerability in a preclinical setting, and (ii) demonstrate the consequences of the coupling between the clock and the cell cycle on the dynamical behaviour of the system in proliferating cells. Although such a systems biology approach is potentially powerful, current modelling approaches still suffer from several limitations, because they were not initially developed for genetic networks involving chronometric time, and parameter estimation remains challenging. The HyClock project gathers a multidisciplinary team of experts in computer science, mathematical modelling, chronobiology and chronopharmacology to develop novel formal methods and hybrid modelling frameworks and to apply them to the analysis and understanding of circadian clock function in mammals. This novel modelling strategy will first be used to predict and analyse how the coupled circadian clock-cell cycle network responds to physiological synchronisation in healthy cells, with consequences on proliferation.
Second, we will investigate in vivo, using experimental designs guided by these hybrid modelling approaches, how to reinforce the circadian coordination of the host through synchronisation, in order to improve the tolerability of treatments, taking the widely used targeted anticancer agent everolimus (an mTOR inhibitor) and the cytostatic chemotherapeutic agent irinotecan (a Top1 inhibitor) as model drugs. HyClock is expected to provide a general and innovative approach, as well as invaluable tools and information, both for the modelling of biological time and for the forthcoming personalization of chronotherapy.
Project (from 2014)
Partners: Université Nice Sophia Antipolis/Transporteurs en Imagerie et Radiothérapie en Oncologie; Laboratoire d'Informatique, Signaux et Systèmes de Sophia Antipolis; Institut National de Recherche en Informatique et en Automatique; PHASICS
Funder: French National Research Agency (ANR) | Project Code: ANR-14-CE17-0017 | Funder Contribution: 491,922 EUR

High-content imaging is a rapidly evolving technique that enables the quantitative analysis of multiple events in a population of cultured cells, as well as of their evolution over time. A new generation of fully automated systems (high-throughput microscopes) acquires images of entire cultured cell populations with multimodal quantitative information, but relies on conventional methods such as fluorescence microscopy; the multispectral data are combined thanks to advanced data analysis algorithms. The vast majority of biological studies are based on fluorescence imaging. This implies labelling the cells, which biases their behaviour. In addition, long-term time-lapse imaging (over hours) is hard to implement because fluorophores withstand excitation light poorly: they rapidly photobleach. To carry out unbiased long-term studies, label-free techniques are to be promoted. The industrial partner Phasics has developed a quantitative phase imaging camera that answers this need. This new label-free technique achieves high-contrast, low-noise images of tissues and cells. In addition, the images contain quantitative information on the cell components, which can serve as supplementary parameters for multispectral studies.
The PhaseQuantHD project aims at developing a high-content imaging system using quadriwave lateral shearing interferometry as a quantitative phase imaging modality. Automated analysis methods will be developed and optimized for this modality. Finally, an open biological question will be addressed with the system. The proposed multidisciplinary consortium combines the Phasics company and three academic partners: mathematicians and biologists. Phasics will integrate quantitative phase imaging into high-throughput systems already used by the academic partners, and will carry out specific technological developments, including optimization of the acquisition and processing time of quantitative phase images. In parallel, innovative automated analysis tools will be developed by the mathematicians along two methodological paths. The first will be based on supervised classification working directly on the gradients produced by the camera, in order to make the most of the information obtained from the samples. The second will exploit the ability of quantitative phase images to automatically delineate the space occupied by each cell, as well as its biomass, and then to monitor each cell longitudinally. This project will promote the development of innovative tools for biologists, which will be evaluated directly by the consortium. The most innovative image processing methods will be patented. The project will also promote quantitative phase microscopy for high-content imaging and give Phasics access to this fast-growing market.
