Aix-Marseille University
5 Projects
Project (2014 - 2019)
Partners: University of Birmingham, ASTRAZENECA UK LIMITED, AstraZeneca plc, UF, Florida State University, Birmingham Childrens Hospital NHS FT, Aix-Marseille University, Thermo Fisher, Thermo Electron Corporation, Texas A&M University, Waters UK, Owlstone Limited, Advion Ltd, National Physical Laboratory (NPL), UT System
Funder: UK Research and Innovation
Project Code: EP/L023490/1
Funder Contribution: 1,484,530 GBP

The aim of the research is to develop novel approaches for the analysis of biomolecules, and in particular proteins, directly from their natural (or actual) environment, i.e., to develop approaches for in situ biomolecular analysis. Proteins are the workhorses of the cell and perform all the functions required for life. They also find uses as therapeutics and in consumer products. To gain insight into the various and specific roles of proteins in life processes, to determine the therapeutic efficacy of protein drugs, or to establish the environmental fate of protein additives in consumer products, it is necessary to be able to analyse proteins at a molecular level. Mass spectrometry, in which ionised molecules are characterised according to their mass-to-charge ratio, is ideally suited to this challenge, offering high sensitivity, broad specificity (all molecules have a mass) and the capability for chemical structure elucidation.

The ultimate goal is to link molecular analysis directly to molecular environment. Much like a forensics officer tasked with determining the presence of an illicit substance, much greater reliability and credibility are afforded to an analysis performed at the scene of the crime than to one performed after the sample has been removed to a separate location and alternative surroundings. Growing evidence suggests that in situ protein analysis has groundbreaking roles to play in biomarker discovery, diagnosis and early detection of disease, targeting of therapeutics (personalised medicine) and assessment of therapeutic efficacy. The benefits of in situ protein analysis can be illustrated by considering a thin tissue section through a drug-treated tumour. In principle, in situ analysis would inform on drug-target interactions (i.e., is the drug binding to the correct protein?). Moreover, with in situ protein analysis the capacity for artefact introduction as a result of sample preparation (e.g., application of a matrix) or sample damage is eliminated.

Nevertheless, a number of challenges exist. Proteins are large molecules associated with a vast array of chemical modifications, and they form loosely bound complexes with themselves, other proteins and other molecule types. It is not only their chemical structure but also their overall 3-D structure which dictates their function. Other molecular classes that are hugely important in biological processes also have an intricate relationship with proteins. Any in situ mass spectrometry approach needs to be able to meet these analyte-driven challenges, i.e., it must be capable of (a) measuring proteins and characterising any modifications, (b) detecting protein complexes and determining their constituents, (c) providing information on 3-D structure, and (d) detecting other relevant molecular classes.
Moreover, there are technique-driven challenges for in situ analysis, including inherently high sample complexity, wide-ranging concentrations and opportunities for quantitation. The research will meet these challenges by developing a newly emerging in situ approach, liquid extraction surface analysis mass spectrometry, in combination with two complementary types of ion mobility spectrometry (which can either provide information on 3-D structure, or separate ionised molecules in the mass spectrometer on the basis of their 3-D shape) and a structural elucidation strategy known as electron-mediated dissociation mass spectrometry. The research will be undertaken primarily at the University of Birmingham, in the Advanced Mass Spectrometry Facility in the School of Biosciences and the School of Chemistry mass spectrometry facility. The programme involves a number of academic and industrial collaborators, and additional research will be carried out during scientific visits to the National Physical Laboratory (NPL), Thermo Fisher Scientific, Waters, Owlstone, Florida State University, Texas A&M University and Université d'Aix-Marseille.
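As a rough illustration of the mass-to-charge measurement at the heart of the approach described above, the sketch below (my own hedged example, not part of the project) computes the m/z values expected for the multiply protonated charge states of an intact protein in an electrospray mass spectrum; the protein mass used is an approximate literature value and is purely illustrative.

```python
# Illustrative sketch (not from the project): m/z values for the charge-state
# series [M + zH]z+ of an intact protein observed by electrospray ionisation.
PROTON_MASS_DA = 1.007276  # mass of a proton in daltons

def charge_state_series(neutral_mass_da, charges):
    """Return (charge, m/z) pairs for a protein of the given neutral mass."""
    return [(z, (neutral_mass_da + z * PROTON_MASS_DA) / z) for z in charges]

if __name__ == "__main__":
    # Ubiquitin, a small, well-studied protein of roughly 8560 Da (approximate).
    for z, mz in charge_state_series(8560.6, charges=range(5, 14)):
        print(f"[M + {z}H]{z}+  ->  m/z {mz:.2f}")
```

Under these assumptions, higher charge states appear at lower m/z, which is why large proteins become accessible to instruments with a limited m/z range.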
Project (2011 - 2016)
Partners: Gold Standard Simulations (GSS), Aix-Marseille University, Institute of Material Sciences Barcelona, Swiss Federal Institute of Technology (EPFZ), Swansea University, Universites d'Aix-Marseille Paul Cezanne, University of Southampton, UCL
Funder: UK Research and Innovation
Project Code: EP/I004084/2
Funder Contribution: 640,570 GBP

Computers and electronic gadgets, such as the iPhone, have transformed modern life. The silicon transistor is at the core of this revolution, having been continuously made faster and smaller over the last forty years. In a chip, millions of them are squeezed into an area the size of a pinhead, switching a billion times in one second. Transistor size has now reached nanometre dimensions; one nanometre is only ten times larger than an atom. Moore's law, which dictates that transistor size halves every two years and is the driving force behind the success of the electronics industry, has come to a halt. The happy and easy days of transistor scaling are now gone. Quantum mechanical laws conspire against transistor function, making devices leak when switched off and degrading electrical control. In addition, our inability to control the precise atomic structure of interfaces and the chemical composition during fabrication makes transistors less predictable. Hence semiconductor companies are searching for alternative, non-planar (multigate) transistor architectures and novel devices such as nanowires, nanotubes, graphene and molecular transistors, which will ultimately break through the nano-size barrier and usher in a completely new era of miniaturization.

There is a significant gap between our ability to fabricate transistors and our ability to predict their behaviour. The simulation and prediction of the silicon transistor has therefore become a vital mission. The current planar transistor architecture presents serious scalability problems regarding leakage and controllability. Transistors of nanometre dimensions are more vulnerable to the atomic nature of matter than their earlier cousins of micrometre dimensions. Furthermore, at the nanoscale heat transfer becomes a source of heat death for novel transistor applications because of the decrease in thermal conductivity. Within this context I propose to develop a quantum device simulator with atomic resolution that will enable the accurate prediction of present and future transistor performance. The simulator will deploy a quantum wave description of electron propagation, treating the interaction of electrons with crystal lattice vibrations (heat) at a fully quantum mechanical level. It will be capable of describing the interactions of electrons with the roughness of the semiconductor/dielectric interface, and with each other, under the effect of a high electric field. Devices will be properly tested and optimised with respect to materials, chemical composition and geometry without the high costs implicit in fabrication. A wide range of transistors will be explored, from planar and non-planar architectures to novel devices. This is timely, as existing computer design tools lack predictive capabilities at the nanoscale and the industrial build-and-test approach has become prohibitively costly.
Efficient quantum models, algorithms, methodologies and tools will be developed. These are dynamic times, as device dimensions move closer to the realm of atoms, which are inherently uncontrollable. In this regime two streams collide, the classical and quantum worlds, making the search for new regularities and patterns vital as we strive to conquer nature at this scale. This offers exciting opportunities to merge an engineering top-down approach with a physics bottom-up approach. As 21st-century environmental concerns rise, the need for greener technology is increasing. My proposal addresses the lowering of power consumption, reductions in raw materials while delivering more functionality, and the provision of a cheaper way to assess new design technologies. Collectively, these will help companies to provide a greener alternative to consumers.
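To make the leakage problem mentioned above concrete, here is a hedged, self-contained sketch (my own illustration, not the project's simulator) that estimates the quantum-mechanical tunnelling probability of an electron through a thin rectangular barrier using the textbook approximation T ≈ exp(-2κL), with κ = sqrt(2m(V - E))/ħ. As the barrier, standing in for a few nanometres of gate dielectric, gets thinner, the transmission, and hence the off-state leakage, grows by many orders of magnitude.

```python
# Illustrative sketch (assumption, not the project's quantum device simulator):
# electron tunnelling through a rectangular barrier, T ~ exp(-2*kappa*L),
# showing why nanometre-scale transistors leak when switched off.
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron rest mass, kg
EV = 1.602176634e-19     # one electronvolt in joules

def tunnelling_probability(barrier_ev, energy_ev, width_nm):
    """Approximate transmission of an electron with energy E through a
    rectangular barrier of height V > E and the given width."""
    kappa = math.sqrt(2.0 * M_E * (barrier_ev - energy_ev) * EV) / HBAR  # 1/m
    return math.exp(-2.0 * kappa * width_nm * 1e-9)

if __name__ == "__main__":
    # A 3 eV barrier (roughly oxide-like) seen by a 1 eV electron:
    for width in (5.0, 3.0, 2.0, 1.0):
        t = tunnelling_probability(barrier_ev=3.0, energy_ev=1.0, width_nm=width)
        print(f"barrier width {width:.1f} nm -> transmission ~ {t:.2e}")
```

The exponential sensitivity to barrier width is exactly why classical design tools, which ignore tunnelling, stop being predictive at these dimensions.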
Project (2010 - 2011)
Partners: Aix-Marseille University, Universites d'Aix-Marseille Paul Cezanne, Swiss Federal Institute of Technology (EPFZ), University of Glasgow, University of Southampton, Gold Standard Simulations (GSS), UCL, Institute of Material Sciences Barcelona
Funder: UK Research and Innovation
Project Code: EP/I004084/1
Funder Contribution: 712,368 GBP

Abstract: identical to EP/I004084/2 above.
Project (2020 - 2023)
Partners: Bordeaux INP; City, University of London; University of London; Aix-Marseille University
Funder: UK Research and Innovation
Project Code: EP/T018313/1
Funder Contribution: 249,526 GBP

The proposed research lies at the interface of verification and machine learning, whose interactions are currently attracting a lot of attention and hold potentially huge benefits for both sides. Verification is the domain of computer science that aims to check and certify computer systems. Computer systems are increasingly used at all levels of society and of people's lives, and it is paramount to verify that they behave the way they are designed to and the way we expect (crucial examples, among many others, are embedded software for aircraft autopilots or self-driving cars). Unfortunately, the verification of complex systems encounters limits: there is no universal, fully automated way to verify every system, and one needs to find a good trade-off between the constraints of time, memory space and accuracy, which are often difficult to overcome. Machine learning has been studied since the 1950s and has recently regained much attention with breakthroughs in speech recognition, image processing and game playing. The development of neural networks (studied since the 1960s) earned Hinton, LeCun and Bengio the Turing Award in 2019, and, using deep learning, the British firm DeepMind developed its successful AlphaGo and AlphaGo Zero, impressive steps forward that reaffirmed the potential of machine learning. This project proposes to apply learning techniques in verification, both to improve the efficiency of some of the algorithms that certify computer systems and to compute fast, accurate models of real-life systems.

Automata are one of the mathematical tools used in verification to model computer or real-life systems. Certifying these systems often boils down to running algorithms on the corresponding automata, and the efficiency of such algorithms usually depends on the size of the automaton considered. Minimising automata is thus a paramount problem in verification, as a way to verify large computer or real-life systems faster. This proposal aims to study the minimisation of some streaming models of quantitative automata using machine learning techniques. The automata we will focus on are streaming models, in the sense that the input is not stored but received as a stream of data and processed on the fly, which makes them particularly suitable for the treatment of big data. They are also well suited to optimisation problems such as minimising the resource consumption of a system or computing the worst-case running time of a program. Minimising these kinds of automata is highly challenging and is linked to the long-standing open problem of the determinisation of max-plus automata. This proposal sets out several directions of research, such as using learning methods to tackle it.
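As a hedged illustration of the kind of quantitative, streaming automata discussed above (my own toy example, not the project's models or results), the sketch below evaluates a max-plus automaton on an input word processed symbol by symbol: transition weights are summed along a run and the value of the word is the maximum over all runs, so only one weight per state needs to be kept in memory at any time.

```python
# Illustrative sketch (not from the project): streaming evaluation of a
# max-plus automaton. Weights are added along a run; the value of a word is
# the maximum over all runs from an initial to a final state.
NEG_INF = float("-inf")

class MaxPlusAutomaton:
    def __init__(self, n_states, initial, final, transitions):
        # transitions[symbol][p][q] is the weight of the edge p -> q reading
        # symbol (NEG_INF means "no such transition").
        self.n = n_states
        self.initial = initial
        self.final = final
        self.transitions = transitions

    def value(self, word):
        """Process the word as a stream, keeping one weight per state."""
        v = list(self.initial)
        for symbol in word:
            m = self.transitions[symbol]
            v = [max(v[p] + m[p][q] for p in range(self.n))
                 for q in range(self.n)]
        return max(v[q] + self.final[q] for q in range(self.n))

# Toy automaton over {'a', 'b'} computing the length of the longest block of
# consecutive 'a's: state 0 = before the chosen block, 1 = inside it,
# 2 = after it.
A = MaxPlusAutomaton(
    n_states=3,
    initial=[0, NEG_INF, NEG_INF],
    final=[0, 0, 0],
    transitions={
        'a': [[0, 1, NEG_INF], [NEG_INF, 1, NEG_INF], [NEG_INF, NEG_INF, 0]],
        'b': [[0, NEG_INF, NEG_INF], [NEG_INF, NEG_INF, 0], [NEG_INF, NEG_INF, 0]],
    },
)
print(A.value("aabaaab"))  # -> 3 (the longest run of 'a's has length 3)
```

The update rule is just a matrix-vector product in the (max, +) semiring, which is what makes these models attractive for on-the-fly analysis of large data streams; minimising the number of states directly shrinks the work done per input symbol.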
Project (2020 - 2024)
Partners: University of Sheffield, Aix-Marseille University
Funder: UK Research and Innovation
Project Code: EP/T02450X/1
Funder Contribution: 446,163 GBP

This project will develop computational models, linguistically motivated and cognitively inspired by human processing data, with the ability to recognize and accurately process idiomatic (non-literal) language. Equipping models with the ability to process idiomatic expressions is particularly important for obtaining more accurate representations, as these can lead to gains in downstream tasks such as machine translation and text simplification. The originality of this work is in integrating linguistic and cognitive clues about human idiomatic language processing into the construction of models for word and phrase representations, and in integrating those models into downstream tasks. The main objectives and research challenges of this project are:
1. To explore cognitive and linguistic clues linked to idiomaticity that can be used to guide models for word and phrase representations.
2. To investigate idiomatically-aware models.
3. To explore alternative forms of integrating these models in applications and to develop a framework for idiomaticity evaluation in word and phrase representation models.
4. To release software implementations of the proposed models to facilitate reproducibility and wider adoption by the research community.
This proposal targets a crucial limitation of standard NLP models, as idiomaticity is everywhere in human communication, with potential benefits to various applications that include natural language interfaces, such as conversational agents, computer-assisted language learning platforms, question answering and information retrieval systems. As a consequence, we anticipate the proposal will have a wide academic impact in the community. Moreover, enabling more precise language understanding and generation also has the potential to enhance accessibility and digital inclusion by promoting more natural and accurate communication between humans and machines. We intend to demonstrate the additional potential benefits of these models through interactions with our external collaborators, including by means of an advisory board. The board will include other academics and industrial partners working on related topics, such as Dr. Fabio Kepler (Unbabel, Portugal, for machine translation), Prof. Mathieu Constant (Université de Lorraine, France, for parsing and idiomaticity), Prof. Lucia Specia (Imperial College, UK, for text simplification) and Dr. Afsaneh Fazly (Samsung, Canada, for idiomaticity).
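As a toy, hedged illustration of one way idiomaticity clues can be operationalised in word and phrase representation models (my own sketch with made-up vectors, not the project's models or data), the snippet below scores a phrase by comparing its observed embedding with the composition of its word embeddings: low similarity suggests non-compositional, i.e. idiomatic, usage.

```python
# Illustrative sketch with toy vectors (not the project's models): a simple
# compositionality proxy for idiomaticity. If a phrase's corpus-derived
# embedding is far from the average of its word embeddings, the phrase is
# likely being used non-compositionally (idiomatically).
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def compositionality_score(phrase_vec, word_vecs):
    """Similarity between the phrase embedding and its compositional estimate."""
    composed = np.mean(word_vecs, axis=0)
    return cosine(phrase_vec, composed)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim = 50
    kick, bucket = rng.normal(size=dim), rng.normal(size=dim)

    # Hypothetical phrase embeddings "learned" from corpus usage:
    literal_phrase = 0.5 * (kick + bucket) + 0.1 * rng.normal(size=dim)
    idiomatic_phrase = rng.normal(size=dim)  # unrelated to its parts

    print("literal-like phrase:  ", round(compositionality_score(literal_phrase, [kick, bucket]), 3))
    print("idiomatic-like phrase:", round(compositionality_score(idiomatic_phrase, [kick, bucket]), 3))
```

In the project's terms, such a score would be only one of several possible clues; cognitive signals (e.g. human processing data) and linguistic features would complement rather than replace it.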
