Royal Institute of Technology KTH Sweden
16 Projects, page 1 of 4
Project 2024 - 2029
Partners: THALES UK LIMITED, Cambridge Consultants Ltd, Royal Institute of Technology KTH Sweden, EnCORE, Swiss Federal Inst of Technology (EPFL), DIMACS, DeepMind, Meta, Toshiba Europe Limited, University of Bristol, Roke Manor Research Ltd, Centre for Science of Information, Center for Networked Intelligence, Mind Foundry Ltd, Nokia Bell Labs, Nu Quantum, Institute of Network Coding, Georgia Institute of Technology, University of California, San Diego
Funder: UK Research and Innovation
Project Code: EP/Y028732/1
Funder Contribution: 7,691,560 GBP

Artificial intelligence (AI) is on the verge of widespread deployment in ways that will impact our everyday lives. It might do so in the form of self-driving cars or of navigation systems optimising routes on the basis of real-time traffic information. It might do so through smart homes, in which usage of high-power devices is timed intelligently based on real-time forecasts of renewable generation. It might do so by automatically coordinating emergency vehicles in the event of a major incident, natural or man-made, or by coordinating swarms of small robots collectively engaged in some task, such as search-and-rescue. Much of the research on AI to date has focused on optimising the performance of a single agent carrying out a single well-specified task. There has been little work so far on the emergent properties of systems in which large numbers of such agents are deployed, and on the resulting interactions. Such interactions could end up disturbing the environments for which the agents have been optimised. For instance, if a large number of self-driving cars simultaneously choose the same route based on real-time information, it could overload roads on that route.
If a large number of smart homes simultaneously switch devices on in response to an increase in wind energy generation, it could destabilise the power grid. If a large number of stock-trading algorithmic agents respond similarly to new information, it could destabilise financial markets. Thus, the emergent effects of interactions between autonomous agents inevitably modify their operating environment, raising significant concerns about the predictability and robustness of critical infrastructure networks. At the same time, they offer the prospect of optimising distributed AI systems to take advantage of cooperation, information sharing, and collective learning. The key future challenge is therefore to design distributed systems of interacting AIs that can exploit synergies in collective behaviour, while being resilient to unwanted emergent effects. Biological evolution has addressed many such challenges, with social insects such as ants and bees being an example of highly complex and well-adapted responses emerging at the colony level from the actions of very simple individual agents. The goal of this project is to develop the mathematical foundations for understanding and exploiting the emergent features of complex systems composed of relatively simple agents. While there has already been considerable research on such problems, the novelty of this project is in the use of information theory to study fundamental mathematical limits on learning and optimisation in such systems. Information theory is a branch of mathematics that is ideally suited to address such questions. Insights from this study will be used to inform the development of new algorithms for artificial agents operating in environments composed of large numbers of interacting agents. The project will bring together mathematicians working in information theory, network science and complex systems with engineers and computer scientists working on machine learning, AI and robotics.
The goal is to translate theoretical insights into algorithms that are deployed in real systems; lessons learned from deploying and testing those algorithms in interacting systems will be used to refine models and algorithms in a virtuous circle.
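The congestion scenario described above can be made concrete with a toy simulation. This is an illustrative sketch only (the model, numbers and 90% switching rate are invented, not taken from the project): many agents greedily reacting to the same shared real-time signal flood whichever route currently looks faster, so the system oscillates instead of settling.

```python
# Toy illustration (hypothetical model, not from the project): N self-driving
# cars choose between two routes using the same shared travel-time signal.
# Because every agent reacts to identical information, the "fast" route is
# flooded and becomes the slow one, and the loads oscillate.
load = [500, 500]          # cars currently on routes A and B

def travel_time(cars):
    """Travel time grows linearly with congestion (simple assumed model)."""
    return 10 + 0.05 * cars

for step in range(5):
    times = [travel_time(c) for c in load]
    best = times.index(min(times))        # every agent sees the same signal
    movers = int(0.9 * load[1 - best])    # 90% greedily switch routes
    load[1 - best] -= movers
    load[best] += movers
    print(step, load)
```

Running this shows the load swinging between the two routes each step, a minimal version of the emergent instability the project aims to understand and mitigate.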
Project 2014 - 2020
Partners: Durham University, TU Delft, Ohio State University (OSU), University of Cologne (Universität Köln), Royal Institute of Technology KTH Sweden (KTH)
Funder: UK Research and Innovation
Project Code: NE/K003674/1
Funder Contribution: 341,023 GBP

This proposal aims to improve estimates of Antarctica's contribution to sea level. Sea level is currently rising at approximately 3 mm/yr. If we are to understand why it is rising, and how future sea-level rise will continue - perhaps accelerate - and lead to a wide range of societal impacts, then we need to understand the different contributions to sea level. Some of the largest contributions come from the great ice sheets in Antarctica and Greenland, but the amount of ice being lost from Antarctica is particularly difficult to establish. There are three main ways to measure the amount of ice being lost or gained from Antarctica - its 'mass balance'. These are (i) satellite altimetry (measuring very precisely how the ice sheet surface is going up or down through time); (ii) the input-output method (calculating the difference between estimates of how much snow falls on Antarctica, and how much ice breaks off at the coast or is lost by melting); (iii) satellite gravimetry (measuring minute changes in Earth's gravitational field caused by loss or gain of ice in Antarctica through time). Ideally, these three techniques would provide similar answers, but they currently do not. All the techniques have problems or drawbacks and all are the subject of ongoing research. In this proposal we focus on the satellite gravimetry method. Mass balance from gravimetry is particularly tricky to calculate because the changes to the gravitational field are not only affected by ice loss/gain but also by mass moving around beneath the Earth's crust.
At the end of the last ice age, a large thickness of ice in Antarctica melted, and the rocks deep within the Earth are still responding to this change thousands of years later. The consequence of this response - which scientists call glacial-isostatic adjustment, or 'GIA' - is that the satellite measurements have to be corrected by a very large amount that accounts for movements of the rocky material, in order to provide the 'real' figure for ice mass loss/gain. Getting this correction right has been so problematic because it requires us to know the history of the ice sheet (including past snow accumulation) for over 10,000 years, and also to know the structure of the Earth underneath Antarctica. Recent projects, including a previous one by our group funded by NERC, have made substantial improvements in determining this correction, but our recently published work has shown very clearly that we still lack the data to pin down the GIA correction tightly enough in parts of East Antarctica. In other words, there is still an unacceptable level of uncertainty in East Antarctica, which leads directly to uncertainty in the sea-level contribution. In this proposal we have identified a region called Coats Land, in East Antarctica, which accounts for the greatest remaining uncertainty in the GIA correction but where we have identified suitable sites at which we can obtain the necessary ice history information, new seismic measurements of crustal structure, and GPS measurements of crustal uplift (a key part of testing GIA models). By visiting these sites and undertaking world-leading modelling using our field data and a synthesis of existing snow accumulation data, we will provide a new and much improved GIA correction for Antarctica. Whilst our data collection will focus on Coats Land, our subsequent modelling effort will encompass all of Antarctica. The data will be used to develop an improved model of GIA in Antarctica in order to correct the GRACE dataset.
We conservatively estimate that, with the measurements and modelling we propose to carry out, we can at least halve the total uncertainty in satellite gravimetry measurements of Antarctic mass balance, and probably do substantially better than this. This proposal therefore raises the prospect of a much improved estimate of the Antarctic contribution to present-day global sea-level rise.
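The way a GIA-correction improvement propagates into the mass-balance estimate can be sketched with a standard error-propagation calculation. The numbers below are hypothetical, purely for illustration; the point is that when independent error sources combine in quadrature and the GIA term dominates, shrinking it sharply reduces the total uncertainty.

```python
import math

# Illustrative sketch with hypothetical numbers (not from the proposal):
# the gravimetry-derived mass balance carries a measurement/processing
# uncertainty plus a GIA-correction uncertainty, combined in quadrature.
sigma_grace = 30.0   # Gt/yr, hypothetical measurement uncertainty
sigma_gia   = 60.0   # Gt/yr, hypothetical GIA-correction uncertainty

def total_uncertainty(s_meas, s_gia):
    """Combine independent error sources in quadrature."""
    return math.hypot(s_meas, s_gia)

before = total_uncertainty(sigma_grace, sigma_gia)
after  = total_uncertainty(sigma_grace, sigma_gia / 2)   # GIA error halved
print(round(before, 1), round(after, 1))
```

With these assumed figures the total uncertainty falls from about 67 to about 42 Gt/yr, showing why tightening the GIA correction is the lever the proposal targets.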
Project 2010 - 2014
Partners: University of Southampton, [no title available], South West Research Institute (SwRI), BC, Royal Institute of Technology KTH Sweden (KTH), University of Alaska - Fairbanks (UAF), UiT Arctic University of Norway (Tromso)
Funder: UK Research and Innovation
Project Code: NE/H024433/1
Funder Contribution: 390,031 GBP

The subject of our study is the aurora borealis, or northern lights, which is an amazing natural lightshow in the sky, seen regularly at high latitudes such as northern Scandinavia, but rarely at the latitudes of the UK. We use the aurora as a diagnostic to find out many things about the environment around the Earth, mainly in the region of the upper atmosphere called the ionosphere. That environment is made up of 'plasma' (ionised gas), often called the fourth state of matter, which makes up over 95% of the directly observable material in the cosmos. Yet it is strangely difficult to maintain and study within Earth's biosphere. The upper atmosphere provides an ideal natural laboratory for its study, since there is no need to consider collisions of the plasma with container walls. The story of the aurora begins at the Sun, which is a continuous but very variable energy source, in the form of a plasma stream (the 'solar wind') which impacts on the Earth. We are interested in understanding the smallest-scale auroral structures, and how the energy changes within them influence the large-scale environment. To study the aurora, we use a special instrument which has three cameras looking at different 'colours' simultaneously. The proposed research is for studies of very dynamic and structured aurora at the highest possible resolution. The instrument is named ASK, for Auroral Structure and Kinetics.
It was designed to measure a small circle of 3 degrees in the 'magnetic zenith', i.e. straight up along the Earth's magnetic field. Particles from the Sun spiral along these imaginary magnetic field lines, and lose energy when they collide with atmospheric oxygen and nitrogen. The exact colour (or wavelength of the light) depends on how much energy the incoming particle started with, and what molecule or atom it hits. The ASK cameras help to unravel this complicated process by making very precise measurements in space and time of three emissions which have different physical origins. We will combine these optical measurements with measurements from special radar experiments, which are designed to use a technique known as interferometry to measure structures smaller than the beam width, with better accuracy of position and height than has been possible to date. The radar imaging technology is new in the field of incoherent scatter radar and will be one of the cornerstones of a future project called EISCAT_3D. The technology employed is Aperture Synthesis Imaging Radar (ASIR). It is very similar to the technology used by radio astronomers (VLBI, Very Long Baseline Interferometry) to image stellar objects, and also has some similarity with the SAR (Synthetic Aperture Radar) technique used on board aircraft and satellites to map the Earth's surface and other planetary surfaces. In the radio astronomy case, the source itself spontaneously emits radiation that is collected by a number of passive antennas. In ASIR, the radar transmitter acts like a camera flash to illuminate the target (the ionosphere or atmosphere), and a number of antennas collect the scattered radiation exactly as in the radio astronomy case (or like the lens of a camera). From this point on, the two cases are essentially identical. To construct the image of the target, the cross-correlation between the signals is calculated for all different pairs of receivers.
By applying this radar imaging technique, we will pioneer its use in Europe.
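The pairwise cross-correlation step described above can be sketched in a few lines. This is a minimal illustration of the aperture-synthesis principle only (the signal model, phase offsets and noise level are invented, not the ASIR processing chain): each distinct receiver pair yields one complex cross-correlation ("visibility") whose phase reflects the geometry of the scattering target.

```python
import numpy as np

# Minimal sketch of aperture-synthesis cross-correlation (hypothetical
# signal model, for illustration only): several receivers sample the same
# scattered field, each with its own phase offset, plus independent noise.
rng = np.random.default_rng(0)
n_receivers, n_samples = 4, 1024

# Common scattered component (complex baseband), shared by all receivers.
common = rng.standard_normal(n_samples) + 1j * rng.standard_normal(n_samples)
signals = np.array([common * np.exp(1j * 0.3 * k)    # assumed phase per receiver
                    + 0.1 * (rng.standard_normal(n_samples)
                             + 1j * rng.standard_normal(n_samples))
                    for k in range(n_receivers)])

# One visibility per distinct receiver pair (i < j): <s_i * conj(s_j)>.
visibilities = {(i, j): np.mean(signals[i] * np.conj(signals[j]))
                for i in range(n_receivers)
                for j in range(i + 1, n_receivers)}
for pair, v in visibilities.items():
    print(pair, round(abs(v), 2), round(float(np.angle(v)), 2))
```

The recovered phase of each pair matches the assumed per-receiver offset (here about -0.3 rad per unit baseline), which is exactly the information an imaging algorithm would then invert to reconstruct the target.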
Project 2024 - 2028
Partners: Mahavir Cancer Sansthan, Independent, Heidelberg University, The University of Manchester, Royal Institute of Technology KTH Sweden, Mbarara University of Science and Technology, University of Melbourne
Funder: UK Research and Innovation
Project Code: MR/Y016327/1
Funder Contribution: 1,775,530 GBP

Access to safe drinking water is centrally linked to public health, well-being and economic prosperity. Although water quality is strongly linked to many of the UN Sustainable Development Goals (SDG 6: Clean Water & Sanitation, 3: Good Health & Well-Being, 5: Gender Equality, and 2: Zero Hunger), there is still a long way to go to achieve equitable access to safe drinking water, particularly in the Global South. To accelerate progress, we need new interdisciplinary approaches to tackle complex water quality challenges, especially with increasing stressors like rapid urbanisation and climate change impacting groundwater resources widely used for drinking. The aim of my FLF is to create a roadmap towards improved groundwater quality management in the context of the Global South by bringing together systematic approaches to improve the understanding of dominant groundwater processes and to support evidence-based decision-making for effective groundwater remediation. We will develop and demonstrate this approach in relation to two selected contrasting locations in South Asia (e.g. Bihar, India) and East Africa (e.g. Uganda) and for selected priority groundwater contaminants relevant to those locations. The roadmap approach developed here could then be applied to different scenarios in the future. We will bring together expertise in groundwater pollution (e.g.
chemical, microbial, emerging contaminants, antimicrobial resistance), (bio)geochemical processes, remediation technologies, machine learning, decision science (e.g. agent-based modelling, multi-criteria decision analysis) and social science to address local water quality and remediation challenges in these two areas. We will co-design decision tools, iteratively integrating scientific data with modelled predictions, to enable informed, locally relevant decision-making for effective groundwater remediation. We will address an integrated set of key objectives and hypotheses (see objectives) through a series of Workpackages (WP) implemented as: (i) WP 1: Field-based Investigations, comprising WP 1.1 Multipollutant & Process Investigation and WP 1.2 Community Science; (ii) WP 2: Lab-based Investigations, comprising WP 2.1 Water & Sediment Characterisation and WP 2.2 Remediation Evaluation; (iii) WP 3: Predictive Modelling, comprising WP 3.1 Machine Learning, WP 3.2 Agent Based Modelling and WP 3.3 Multi-Criteria Decision Analysis; and (iv) WP 4: Synthesis & Communication, comprising WP 4.1 Stakeholder Engagement and WP 4.2 Open Resource Bank Development. Our project team brings together highly complementary expertise and skillsets. I am an environmental engineer with expertise in groundwater pollution and remediation, with substantial experience managing and implementing complex, multi-partner research projects in South/Southeast Asia, Africa and South America. I am joined by Co-Investigators from The University of Manchester, British Geological Survey, University of Birmingham and University of Bath, along with international Project Partners from University of Melbourne (Australia), KTH Royal Institute of Technology (Sweden), Mahavir Cancer Sansthan (India), University of Heidelberg (Germany), Mbarara University of Science and Technology (Uganda) and independent affiliates from India and Malaysia.
Collectively, we bring together decades of interdisciplinary expertise in water science, remediation, water management, water and health, biotechnology, decision science, social science, participatory science and stakeholder engagement, along with extensive local knowledge in India and East Africa. The results and tools generated will improve the understanding of the complex natural and anthropogenic processes impacting groundwater quality in the selected locations and will better enable evidence-based decision-making for effective groundwater remediation; the roadmap generated will be applicable to other scenarios in the future.
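The multi-criteria decision analysis mentioned in WP 3.3 can be illustrated with its simplest form, a weighted sum over normalised criteria. Everything here is hypothetical (the options, criteria, weights and scores are invented for illustration); a real analysis would elicit criteria and weights from stakeholders.

```python
# Hypothetical weighted-sum MCDA sketch (options, weights and scores are
# invented, not from the project). Scores are normalised to [0, 1] with
# higher = better; the cost criterion is assumed already inverted.
criteria = ["removal_efficiency", "cost", "community_acceptance"]
weights  = {"removal_efficiency": 0.5, "cost": 0.3, "community_acceptance": 0.2}

options = {
    "in-situ bioremediation": {"removal_efficiency": 0.7, "cost": 0.8, "community_acceptance": 0.9},
    "pump-and-treat":         {"removal_efficiency": 0.9, "cost": 0.3, "community_acceptance": 0.6},
    "point-of-use filters":   {"removal_efficiency": 0.6, "cost": 0.9, "community_acceptance": 0.8},
}

def mcda_score(scores):
    """Weighted sum of normalised criterion scores."""
    return sum(weights[c] * scores[c] for c in criteria)

ranking = sorted(options, key=lambda o: mcda_score(options[o]), reverse=True)
print(ranking)
```

Changing the weights reorders the ranking, which is why eliciting weights from local stakeholders, as the project's co-design approach does, matters as much as the technical scores themselves.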
Project 2017 - 2021
Partners: KTH, University of Leeds, Royal Institute of Technology KTH Sweden
Funder: UK Research and Innovation
Project Code: EP/P024688/1
Funder Contribution: 291,436 GBP

When certain solid materials (for example, tin) are cooled down to very low temperatures, the electrons they contain start to behave not as individual, independent particles but as a collective, collaborative entity, a kind of gas of electron pairs. This allows them to move without friction, so that electrical currents can pass through the material with absolutely no energy loss. This phenomenon, called superconductivity, has immense technological potential, already partially exploited (most medical MRI scanners use superconducting magnets nowadays, for example). A major barrier to further exploitation is the very low temperature at which superconductivity typically occurs (around -270 degrees C), which requires refrigeration with liquid helium. Since mid-2001, complex materials have been engineered which exhibit superconductivity at relatively high temperatures and have several different inter-pervading collaborative electron "gases". Whereas the underlying mechanism of conventional low-temperature superconductivity is well understood, the basis of superconductivity in these newer multiband materials is, so far, relatively mysterious. The aim of this project is to make a thorough mathematical study of a class of models of multiband superconductors called multicomponent Ginzburg-Landau models. The precise mathematical structure of the model is determined by underlying assumptions about the electron pairing mechanisms which lead to superconductivity. These models possess mathematically interesting solutions called "topological solitons": smooth, spatially localised lumps of energy which cannot be dissipated by any continuous deformation of the system.
The idea is to determine how the properties of these solitons (the most important being existence and stability) depend on the mathematical structure of the model. The absence, presence and characteristics of these solitons in real superconductors can then be used to infer information about the electron pairing mechanisms underlying superconductivity in these materials.
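For context, a standard textbook form of the two-component Ginzburg-Landau free energy (given here as a representative example; the project studies a broader class, and the precise terms depend on the assumed pairing mechanisms) is

```latex
\[
F[\psi_1,\psi_2,\mathbf{A}] \;=\; \int \Big[\, \sum_{a=1,2} \Big( \tfrac{1}{2}\big|(\nabla - i e \mathbf{A})\psi_a\big|^2 + \alpha_a |\psi_a|^2 + \tfrac{\beta_a}{2}|\psi_a|^4 \Big) \;+\; \eta\, \mathrm{Re}\big(\psi_1 \bar{\psi}_2\big) \;+\; \tfrac{1}{2}\,(\nabla \times \mathbf{A})^2 \,\Big]\, \mathrm{d}^3x ,
\]
```

where \(\psi_1, \psi_2\) are the order parameters of the two electron condensates, \(\mathbf{A}\) is the magnetic vector potential, and \(\eta\) is an interband (Josephson-type) coupling. Topological solitons such as vortices are energy minimisers within a fixed topological class, which is why they cannot be removed by any continuous deformation.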
