NCAS
12 Projects, page 1 of 3
Project 2017 - 2022
Partners: Sorbonne University (Paris IV & UPMC), UH, University of Oxford, NERC National Centre for Atmospheric Science (NCAS), Met Office
Funder: UK Research and Innovation. Project Code: NE/R000999/1. Funder Contribution: 461,686 GBP

The ocean circulation is dominated by an energetic mesoscale eddy field on spatial scales of 10-100 km, analogous to weather systems in the atmosphere. These eddies are unresolved, or at best inadequately resolved, in the ocean models used for long-range climate projections. It is therefore necessary to parameterise the impacts of the missing mesoscale eddies on the large-scale circulation. The vast majority of numerical ocean circulation models employ the Gent and McWilliams "eddy parameterisation", which acts to flatten density surfaces, mimicking the release of potential energy that fuels the growth of mesoscale eddies. A key parameter in this scheme is the "eddy diffusivity", which plays a leading-order role in setting global ocean circulation, stratification and heat content, the adjustment time scale of the global circulation, and potentially atmospheric CO2. In this project, we will implement a new closure for the Gent and McWilliams eddy diffusivity, derived from first principles, which depends only on the ocean stratification, the eddy energy, and a non-dimensional parameter that is less than or equal to 1. If the eddy energy is known, there is no freedom to specify explicitly any additional dimensional parameters, such as an eddy diffusivity. For this reason, we argue that existing approaches to parameterising eddies in ocean climate models are fundamentally flawed. Our new approach requires solving an equation for the depth-integrated eddy energy.
This is a significant challenge and will form a major component of the present project. However, we believe that solving for the eddy energy is tractable, as we have some understanding of the key physical ingredients: the generation of eddy energy through instability of the large-scale flow, westward propagation of eddies, diffusion of eddy energy, dissipation of eddy energy in western boundary "eddy graveyards", and dissipation of eddy energy through bottom drag and lee wave generation. Once a consistent eddy energy budget is incorporated, our new eddy parameterisation leads to three highly desirable results, which serve as important proofs of concept:

1. It reproduces the correct dimensional growth rate for eddies in a simple model of instability of atmospheric and oceanic flows for which there is an exact mathematical solution.
2. Assuming perfect knowledge of the eddy energy, it reproduces the eddy diffusivity diagnosed from high-resolution computer simulations of fully turbulent instabilities.
3. It predicts and explains the physics of "eddy saturation", the remarkable insensitivity of the size of the Antarctic Circumpolar Current to surface wind forcing, a long-standing challenge and known deficiency of current eddy parameterisations.

The work plan consists of four inter-related work packages:

1. Implementation and validation of the new eddy parameterisation framework in the NEMO ocean model, used by NERC and the UK Met Office along with other European partners.
2. Development and refinement of the parameterised eddy energy budget.
3. Quantification of the impact of the new parameterisation on the oceanic uptake of heat and passive tracers in the UK Earth System Model, used for the UK contribution to the Intergovernmental Panel on Climate Change (IPCC) climate projections.
4. Project management, to ensure that the work is delivered fully and in a timely manner.
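The ideas above can be sketched numerically. The sketch below assumes one plausible, dimensionally consistent form of the closure, kappa = alpha * E * N / M^2 (eddy energy E per unit mass, buoyancy frequency N, horizontal buoyancy gradient M^2, non-dimensional alpha <= 1), together with a toy one-dimensional depth-integrated eddy energy budget containing the listed ingredients (source, linear dissipation, westward propagation, lateral diffusion). Both the exact functional form and all parameter values are assumptions for illustration, not the project's published closure.

```python
import numpy as np

def eddy_diffusivity(E, N, M2, alpha=0.05):
    """One dimensionally consistent eddy diffusivity (m^2/s):
    kappa = alpha * E * N / M2, with eddy energy E (m^2/s^2), buoyancy
    frequency N (1/s), horizontal buoyancy gradient M2 (1/s^2), alpha <= 1.
    This particular form is an assumption made for this sketch."""
    return alpha * E * N / M2

def step_eddy_energy(E, source, c_west, lam, kappa_E, dx, dt):
    """Advance a 1-D depth-integrated eddy energy field one time step:
    dE/dt = source - lam*E - c_west*dE/dx + kappa_E*d2E/dx2,
    where c_west encodes zonal (westward) propagation, lam is a linear
    dissipation rate (bottom drag, lee waves, "eddy graveyards"),
    and kappa_E diffuses eddy energy laterally."""
    dEdx = np.gradient(E, dx)
    d2Edx2 = np.gradient(dEdx, dx)
    return E + dt * (source - lam * E - c_west * dEdx + kappa_E * d2Edx2)
```

With, say, E = 0.01 m^2/s^2, N = 1e-3 s^-1 and M2 = 1e-8 s^-2, the sketch gives kappa = 50 m^2/s, which is within the range of values typically prescribed in coarse-resolution models.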
Project 2016 - 2020
Partners: National Center for Atmospheric Research (NCAR), UCAR, Frontier Research Centre for Global Change, Collaboration for Australian Weather and Climate Research (CAWCR), Science and Technology Facilities Council (STFC Laboratories), UKCEH (NERC CEH up to 30.11.2019), Institute of Atmospheric Physics, University of Oxford, Japan Agency for Marine-Earth Science and Technology, Potsdam Institute for Climate Impact Research, NERC National Centre for Atmospheric Science (NCAS), Pierre Simon Laplace Institute (IPSL), Met Office, EnviroSim (Canada), Finnish Meteorological Institute (FMI), Danish Meteorological Institute (DMI), Environment and Climate Change Canada, Indian Institute of Tropical Meteorology, LSCE-Orme
Funder: UK Research and Innovation. Project Code: NE/P006779/1. Funder Contribution: 408,100 GBP

GOTHAM represents an ambitious research programme to gain robust, relevant and transferable knowledge of past and present-day patterns and trends of regional climate extremes and variability in vulnerable areas identified by the IPCC, including the tropics and high latitudes. It will achieve this by identifying the influence of remote drivers, or teleconnections, on regional climate variability and assessing their relative impact. It will also assess the potential for improved seasonal-to-decadal prediction using a combination of contemporary climate models, citizen-science computing, and advanced statistical analysis tools.
GOTHAM has the direct backing of many international weather and climate research centres and will support the improved development of seasonal-to-decadal forecasts at the regional level. The improved knowledge and understanding of the dynamical factors that influence regional weather and climate in the tropics/sub-tropics and polar regions will feed directly through to weather and climate forecast services, helping them decide which priority areas of model development to target in order to improve forecast skill. For example, GOTHAM will advise whether a model is missing or misrepresenting important global teleconnections that significantly influence regional climate in identified vulnerable regions. These impacts will be achieved through regular meetings with GOTHAM investigator groups and their extended collaborative networks, and through extensive involvement in wider science and science-policy programmes with co-aligned strategies, such as the core projects within the WCRP. Improved seasonal-to-decadal forecasts will improve predictions of extreme events and natural hazard risks, such as flooding, that can have devastating impacts on society. There is real potential for project results to feed through to impacts-related research, such as hydrological and flood forecast modelling, and this will be explored in liaison with identified partners in Asia and Europe.
Project 2021 - 2025
Partners: University of Edinburgh, NERC National Centre for Atmospheric Science (NCAS), MeteoSwiss, European Centre for Medium-Range Weather Forecasts (ECMWF)
Funder: UK Research and Innovation. Project Code: EP/W007940/1. Funder Contribution: 577,148 GBP

Developing scientific software, for example for climate modelling or medical research, is a highly challenging task. Domain scientists are often deeply involved in low-level programming details just to make their code run sufficiently fast. These tedious but important optimization steps significantly reduce the productivity of scientists. Domain-specific languages (DSLs) revolutionize the productivity of domain scientists by enabling them to focus on scientific questions rather than on making their code run fast. Sophisticated DSL compilers automatically generate high-performance code from domain-specific, high-level problem descriptions. While there are individual successes, the existing landscape of DSLs is scattered, and the reuse of software components in DSL compiler implementations is limited, as DSL compilers are traditionally built in isolation. This results in high development costs for new DSLs and prevents many DSLs from ever achieving a level of maturity and sustainability that enables uptake by the scientific community. This project revolutionizes the design of DSL compiler implementations by leveraging the breadth and cross-industry support of the MLIR compiler and Python ecosystems. Python is the tool of choice for application developers in many domains, such as machine learning and data science, and, we believe, an important component of the future of High Performance Computing software.
This project establishes MLIR as a common representation for code at multiple levels of abstraction in DSL compiler development. DSLs embedded in various host languages, including Python and Fortran, will be easily built on top of MLIR. Instead of building DSL compilers as isolated monolithic towers, our research will build a toolbox that enables developers to build DSLs using a rich ecosystem of shared intermediate representations (IRs) and optimizations. This project evaluates, drives, and demonstrates the DSL design toolbox by building the next generation of DSLs for seismic and climate modelling as well as medical imaging. These will share common software components and make them available for other DSLs. An extensive evaluation will show the scalability of DSL software towards exascale. Finally, this project investigates how future disruptors, including artificial intelligence, data science, and on-demand HPC-as-a-service, will shape and influence the next generations of high-performance software. This project will work towards deeply integrating modern interactive data analytics and machine learning methods from the Python ecosystem with high-performance scientific code.
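The "toolbox of shared IRs and optimizations" idea can be illustrated with a deliberately tiny sketch. This is not the actual MLIR API: the op names merely borrow the spelling of MLIR's arith dialect, and the three-field `Op` structure is hypothetical. It shows the shape of a reusable rewrite pass (here, constant folding) that any DSL lowering to the same shared ops would get for free.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Op:
    """A toy IR operation: a name, operand value ids, a result id, attributes."""
    name: str            # e.g. "arith.addf" (spelling borrowed from MLIR's arith dialect)
    operands: tuple      # ids of the SSA values this op consumes
    result: str          # id of the SSA value this op produces
    attrs: dict = field(default_factory=dict)

def constant_fold(ops):
    """A shared rewrite pass: replace arith.addf of two known constants
    with a single arith.constant. Any DSL lowering to these ops benefits."""
    known, out = {}, []
    for op in ops:
        if op.name == "arith.constant":
            known[op.result] = op.attrs["value"]
            out.append(op)
        elif op.name == "arith.addf" and all(o in known for o in op.operands):
            v = sum(known[o] for o in op.operands)
            known[op.result] = v
            out.append(Op("arith.constant", (), op.result, {"value": v}))
        else:
            out.append(op)  # unknown ops pass through untouched
    return out
```

The design point mirrored here is that the pass is written once against the shared ops, not once per DSL; a seismic DSL and a climate DSL that both lower through the same dialect reuse it unchanged.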
Project 2016 - 2019
Partners: University of Exeter, NERC National Centre for Atmospheric Science (NCAS), Met Office
Funder: UK Research and Innovation. Project Code: EP/N030141/1. Funder Contribution: 235,429 GBP

If CO2 emissions continue to rise, climate change will adversely affect global food and water availability, ecosystems, cities, and coastal communities. While reducing fossil fuel use will be an essential step towards reducing atmospheric CO2, Negative Emission Technologies (NETs) can help meet emission targets. During combustion, CO2 can be captured, transported, and stored in geologic repositories; this is the process of Carbon Capture and Storage (CCS). Combining bioenergy with CCS (BECCS) could result in negative emissions of CO2. BECCS is attractive since it results in a net removal of CO2 from the atmosphere while also providing a renewable source of energy. However, BECCS requires a large commitment of land and will have impacts on food and water availability. This work focuses on BECCS and addresses the challenges of planning a global and nationwide distribution of bioenergy crops. The vast majority of IPCC scenarios that remain below 2 degrees C make use of NETs in the 21st century. Although bioenergy crops and BECCS are an essential component of these scenarios (produced by Integrated Assessment Models, or IAMs), the crops in even the most sophisticated IAMs respond only to mean changes in climate. This results in an inconsistency in the modelling framework: the IAMs can assume bioenergy crops are effective at drawing down CO2 and producing energy in a region where climate change will actually reduce their effectiveness.
Earth System Models (ESMs) represent the dynamics of the atmosphere, oceans, sea ice, and land surface. They can account for biophysical (i.e. changes to albedo and latent heat fluxes) and biogeochemical (i.e. uptake or release of greenhouse gases) feedbacks due to land use change. They are the only tool available to investigate future impacts of spatial and temporal variability in climate on the food, energy, and water nexus. However, the ESMs used in the last IPCC report accounted for a generic crop type at best, without differentiating between bioenergy and food crops. Without an explicit representation of bioenergy crops in ESMs, the effects of climate change do not feed back to affect the food, energy, and water resources assumed in the IAMs. There is an urgent need to predict the productivity of bioenergy crops in a coupled climate simulation, to see the impact of a range of climate change scenarios on that productivity, and the associated impacts on food crop productivity, energy production, and water availability. In this project, I will include representations of first- and second-generation bioenergy crops in the UK ESM, and investigate the impacts of climate change on their productivity at the global and regional (UK) levels. This work will assess the viability of negative emissions of CO2 through bioenergy crops as an effective climate mitigation strategy under a changing climate, and provide data to support decisions that will minimize the impacts of both climate change and climate change mitigation on bioenergy production, food, and water availability. The outcomes of this project will enhance the resilience of the food/water/energy nexus to climate change and climate variability through better planning, and by providing socially responsible recommendations for balancing the challenges of reducing climate change with feeding our growing global population.
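The net-removal logic behind BECCS described above reduces to simple carbon accounting: growth withdraws carbon from the atmosphere, combustion returns the uncaptured fraction, and the supply chain adds some back. The sketch below is illustrative only; the sign convention and every number in the example are assumptions for the sketch, not project results.

```python
def net_beccs_emissions(biomass_carbon, capture_efficiency, supply_chain_carbon):
    """Net CO2 flux to the atmosphere (all arguments in the same carbon units).
    Crop growth removes biomass_carbon from the atmosphere; combustion
    returns it, except for the captured-and-stored fraction; cultivation,
    transport and processing add supply_chain_carbon back.
    A negative result means net removal, the goal of BECCS."""
    returned_by_combustion = biomass_carbon * (1.0 - capture_efficiency)
    return returned_by_combustion + supply_chain_carbon - biomass_carbon
```

For 100 units of biomass carbon, a hypothetical 90% capture rate, and 10 units of supply-chain emissions, the net flux is -80 units: a net removal, which shrinks quickly as capture efficiency falls or the supply chain lengthens.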
Project 2015 - 2018
Partners: University of Cambridge, NERC National Centre for Atmospheric Science (NCAS), Cambridge Integrated Knowledge Centre, Cambridge Econometrics (CE), Galois, Inc, Baincore Limited, Polyhedron Software Ltd, Naked Science Limited
Funder: UK Research and Innovation. Project Code: EP/M026124/1. Funder Contribution: 542,082 GBP

Scientific models play a vital role in science and policy making. Many models are now expressed as complex computer programs, often the result of decades of research and development, possibly involving multiple researchers or teams. This has led to significant investment in maintaining these models and evolving them to use modern programming approaches or to work efficiently on new hardware platforms (such as cloud computing resources). However, the complexity of these models makes maintenance and evolution difficult. In particular, changing a complex model's code whilst ensuring it produces the same results is hard; maintenance and evolution of complex models is often error-prone. The complexity of a piece of software can be classified as either intrinsic or accidental. Intrinsic complexity is an essential reflection of the complexity inherent in the problem and solution at hand. Accidental complexity, by contrast, arises from the particular programming language, design, or tools used to implement the solution. Many of the research contributions of programming language design and software engineering have been aimed at reducing the accidental complexity of software. However, many of these approaches have not been targeted at scientific computing. There is now a need to develop these contributions so that they meet the needs of scientists.
Addressing these needs will provide huge benefits to science and policy through increased productivity and trust in models. Our collaborations with leading research groups in science have highlighted the huge existing investments in established models. We are therefore aiming to support the evolution, rather than replacement, of existing code and working practices. Our goal is to apply cutting-edge programming language and software engineering research to help develop "sustainable" software, which maintains its value over generations of researchers. Our focus is on models developed in the Fortran language, as this remains a dominant programming language in scientific computing, owing in part to its longevity. We will provide practical tools that scientists can use to reduce the accidental complexity of models by evolving a code base, as well as tools for automatically verifying that any maintenance or evolution activities preserve the model's behaviour. We will develop new mechanisms for program comprehension and transformation in order to bring effective techniques from programming language design and software engineering across the chasm to scientific computing. Ultimately, reducing the effort needed to maintain and evolve code will free up scientists to focus on the core aspects of the science, and will lead to models that are more easily communicated, disseminated, and reused between researchers, supporting core ideals of science.
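One baseline for "verifying that evolution preserves behaviour" is simply to compare the old and new code on many generated inputs. The sketch below shows that testing-based baseline on a hypothetical refactoring (both mean functions are invented for the example); the project's tools aim at stronger, static guarantees for Fortran code, which a sampling check like this does not provide.

```python
import random

def original_mean(xs):
    """The 'legacy' version: an explicit accumulation loop."""
    s = 0.0
    for x in xs:
        s = s + x
    return s / len(xs)

def refactored_mean(xs):
    """The 'evolved' version: same intent, idiomatic built-in."""
    return sum(xs) / len(xs)

def behaviour_preserved(f, g, gen, trials=1000, tol=1e-12):
    """Check that f and g agree (within tol) on randomly generated inputs.
    A failing trial is evidence the refactoring changed behaviour;
    passing trials build confidence but are not a proof."""
    for _ in range(trials):
        xs = gen()
        if abs(f(xs) - g(xs)) > tol:
            return False
    return True
```

The limitation is exactly why static verification matters: a sampling check can only ever falsify equivalence, while a program analysis can establish it for all inputs.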
