
Towers Watson

14 Projects, page 1 of 3
  • Funder: UK Research and Innovation
    Project Code: NE/R014329/1
    Funder Contribution: 282,756 GBP

    Coastal ecosystems, such as coral reefs, mangroves and seagrass beds, can provide protection against tropical cyclones, and they also provide other benefits such as fisheries and tourism. However, extreme storm surges can damage these ecosystems, reducing their capacity to protect coastal communities from storms that can cause deaths and damage property, and their ability to play a multi-functional role in human well-being, e.g. providing food through fisheries or controlling erosion. It is therefore important to manage these ecosystems well and to assist their recovery when they are damaged, so that coastal communities can recover quickly from extreme events.

    Financial investment in active restoration of these ecosystems is feasible and can be cost-effective over small areas. When large areas of habitat are damaged, setting aside areas of the ecosystem as reserves may be a more cost-effective approach, but people who depend on the ecosystem for their livelihoods, such as fishermen, need to be compensated financially for the loss of short-term earnings. Insurance of these ecosystems could allow immediate funds to be made available, whether for restoration or as compensation to fishermen. Such an insurance product would enable rehabilitation of a reef or seagrass bed following an extreme storm surge, restoring its protective and livelihood-related functions. There has been relatively little work on the development of insurance products to protect public assets such as ecosystems and their services, and this has left a large protection gap. The CERFF project therefore represents an important and timely innovation in the design and development of disaster risk financing instruments.

    Developing an effective ecosystem-based insurance product requires a quantified understanding of the extent to which coastal ecosystems reduce risk and damages from storm surges in different situations, and how this varies with potentially confounding factors. These include: (i) the geographical distribution of key habitats; (ii) their functional capacity; (iii) the damage they can suffer; (iv) an assessment of their condition as a baseline; (v) an understanding of how this baseline may change over time in relation to other stressors such as ocean acidification, pollution or overfishing; and (vi) an understanding of the time period and uncertainties around restoration or regeneration.

    The CERFF project brings together applied scientists and academics from the natural and social sciences and the humanities with a leading global broking and solutions company. The CERFF team will develop an insurance product that builds disaster mitigation capacity through a market-based mechanism, one that can facilitate rapid responses to extreme events and also incentivise preparedness behaviour. It builds on extensive marine data that Cefas has recently acquired through the Commonwealth Marine Economies programme. Process-based hydrodynamic modelling at Cefas will be combined with species distribution modelling and economic assessment of ecosystem service values to parameterise models of expected and avoided damages. The outputs of these models, combined with scientific understanding of monitoring capability, will inform the development by Willis Towers Watson of a suitable insurance product and an associated parametric index trigger for payment.

    Financial modelling led by York, informed by an understanding of the socio-economic, cultural and political-economic factors potentially affecting demand for the insurance product, will allow us to evaluate the flexibility and transferability of the product and its incentive and investment options. This will include the adaptability of the product to different locations, and the potential role of government and external donors in promoting insurance penetration to align private-public incentives, create behavioural change, and spread the costs of risk prevention within society.
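    The parametric index trigger mentioned above pays out on a measured hazard index rather than on assessed losses. A minimal Python sketch of how such a trigger could be structured (the surge index, attachment and exhaustion points, and limit below are invented for illustration and are not the project's actual design):

        # Hypothetical parametric trigger: payout rises linearly between an
        # attachment point (where payments start) and an exhaustion point
        # (where the full limit is paid). All figures are illustrative.

        def parametric_payout(surge_index_m: float,
                              attachment: float = 2.0,    # metres; payout starts here
                              exhaustion: float = 4.0,    # metres; full payout here
                              limit: float = 1_000_000.0  # maximum payout, GBP
                              ) -> float:
            """Payout implied by a measured storm-surge index (metres)."""
            if surge_index_m <= attachment:
                return 0.0
            if surge_index_m >= exhaustion:
                return limit
            fraction = (surge_index_m - attachment) / (exhaustion - attachment)
            return fraction * limit

        # A 3.0 m surge sits halfway between the two points: half the limit is paid.
        print(parametric_payout(3.0))  # 500000.0

    Because the payout depends only on the measured index, funds can be released as soon as the trigger level is observed, which is what makes the rapid restoration or compensation described above feasible.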

  • Funder: UK Research and Innovation
    Project Code: NE/V004166/1
    Funder Contribution: 353,956 GBP

    Climate change is arguably the biggest challenge facing people this century, and changes to the intensity and frequency of climatic and hydrologic extremes will have large impacts on our communities. The 2017 Climate Change Risk Assessment identified floods and windstorms as likely to have a strong impact on key UK infrastructure sectors under climate change. Extreme rainfall is becoming more intense with warming, and short-duration bursts within storms appear to be increasing at a higher rate. However, we still don't understand how changes in large-scale atmospheric patterns, the storm track, the release of energy from evaporation and other factors will influence the temporal profile of storms, as well as their frequency and duration. This is partly because most scientific studies have concentrated on 'peak intensity' changes over fixed durations, e.g. daily, multi-day or hourly. Alongside this, most studies look at the likely range of change, even though the most important risks rarely lie within this range; instead, they are often associated with the 'plausible worst case' scenario.

    In STORMY-WEATHER we are producing a new methodology based on different 'storm' types to understand the drivers behind these changes and to produce a set of physically plausible, high-impact storm hazard storylines and metrics that people can use to plan for the future. These will use the latest climate projections. Climate models tell us what future weather will be like; these computer models are based on fundamental physical laws and complicated mathematical equations which necessarily simplify real processes. One of the simplifications that really seems to matter is that of deep convection (imagine the type of processes that cause a thunderstorm). However, computers are now so powerful that we can run models on smaller and smaller scales, and recently we have developed "convection-permitting" models in which these simplifications of deep convection are no longer needed. These models are not necessarily better at simulating mean rainfall or rainfall occurrence, but they are much better at simulating the intense rainfall over short periods (less than one day) that causes flooding, in particular flash-flood events. They are also better at simulating the observed increase in heavy rainfall with temperature rise; we are therefore more confident in their projections of changes in heavy rainfall for the future.

    We will use these new models, as well as the global climate models more commonly used, to assess the uncertainty in our projections of the future. We will consider changing temperatures as the potential driver of change to storm hazards, including precipitation and wind as joint hazards. Our storm-type approach will help clarify the hazard from different rainfall mechanisms and their scaling rates with temperature, alongside the combined wind and rain hazard from storms and its changing nature with warming; these characteristics are vital for planning for impacts (e.g. flooding, infrastructure failure, transport and energy systems). The focus on storm properties is balanced against the need to understand the impact of potential changes to large-scale circulation patterns on storm hazards, e.g. changes in frequency or persistence and, in particular, the possibility of circulation-driven changes to the dominant event type across regions.

    Ultimately, we need better information on how extreme weather events might change in the future on which to base adaptation decisions. STORMY-WEATHER intends to provide this important advance, alongside translating the information into useful tools and metrics for climate change adaptation.
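    The 'scaling rates with temperature' above refer to how rainfall intensity changes per degree of warming. The thermodynamic (Clausius-Clapeyron) expectation is roughly 7% per kelvin, and sub-daily convective extremes have been reported to scale faster. A minimal Python sketch, with an invented baseline intensity and an assumed 14%/K 'super-CC' rate for comparison:

        # Illustrative exponential scaling of rainfall intensity with warming.
        # ~7 %/K is the standard Clausius-Clapeyron rate; ~14 %/K stands in
        # for the faster "super-CC" scaling sometimes reported for sub-daily
        # convective extremes. Both applications here are illustrative only.

        def scaled_intensity(baseline_mm_per_h: float,
                             warming_K: float,
                             rate_per_K: float = 0.07) -> float:
            """Rainfall intensity after warming_K degrees of warming."""
            return baseline_mm_per_h * (1.0 + rate_per_K) ** warming_K

        baseline = 20.0  # mm/h, an invented hourly extreme
        for dT in (1.0, 2.0, 4.0):
            cc = scaled_intensity(baseline, dT)
            super_cc = scaled_intensity(baseline, dT, rate_per_K=0.14)
            print(f"+{dT:.0f} K: CC {cc:.1f} mm/h, super-CC {super_cc:.1f} mm/h")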

  • Funder: UK Research and Innovation
    Project Code: NE/S009922/1
    Funder Contribution: 375,422 GBP

    Climate change, because of its enormous social, economic and political consequences, reigns as the leading scientific problem of our times. The primary scientific tool in the study of future climate is the coupled numerical model, in which the various components of the climate system interact, producing an estimate of a future climate state. The resulting projections receive global exposure and influence global policy. Of primary importance in the prediction of climate on interannual to centennial time scales is the ocean.

    Dynamically sound ocean models are integral to reliable climate forecasting, yet due to gaps in our scientific understanding of ocean function, they suffer from a fundamental weakness. Most of the kinetic energy in the ocean resides in the so-called 'mesoscale', a term referring to ocean phenomena with time scales of days to months and length scales of tens to hundreds of kilometres. The mesoscale, through its large-scale feedbacks, has been shown to be a major factor determining intrinsic ocean variability on interannual to decadal timescales, which covers a significant fraction of the temporal spectrum over which the ocean contributes importantly to climate. It has been estimated that up to 80% of ocean variability is due to such intrinsic processes. This presents climate forecasting with a practical problem: the mesoscale consists of features that are small (50 km) compared to the basin scale (6000 km), and their direct numerical resolution over the entire globe for the times required in climate simulations is far beyond current computer resources. Present computational resources for climate projection allow for modest but incomplete mesoscale resolution (25 km), necessitating the parametrisation of the remaining sub-grid-scale dynamics.

    Reliable ocean models will employ parametrisations based on dynamics; this is not current modelling practice. Current practice models sub-grid-scale dynamics using viscous and mixing representations and tunes the related parameters to match output to present observations, justifying this by arguing that large-scale, low-frequency winds drive the basin-scale circulation, which subsequently develops the mesoscale through instabilities. The amplitude of the large-scale circulation is then set by balancing the energy flow into the large scale against the energy flow out of the large scale into the mesoscale. The mechanisms by which the mesoscale loses energy are not addressed directly and are not as well understood. Models are tuned to representative mesoscale energy levels so that they exhibit reasonable decadal-scale variability and reproduce essential elements of the ocean circulation, such as accurate separation of the Gulf Stream from the east coast of the US. These parametrisations have, however, no basis in the dynamics of the flow. The nonlinearity of the climate system means there is no assurance that a parametrisation tuned to present conditions will perform well when modelling a changing climate. The same difficulty arises in the modelling of palaeoclimates, where the underlying flow structure is far from present-day observations. A dynamically based parametrisation, especially one that addresses mesoscale dissipation, is needed.

    We argue that there is a gap between the very high spatial and temporal resolution required in global ocean models to accurately resolve the flow near ocean boundaries and the lower resolution required to resolve the motion of the ocean interior. A dynamical boundary model of the form proposed here can exploit this gap, allowing more accurate simulations at lower computational cost while simultaneously increasing our knowledge of boundary mixing processes. This addresses directly the NERC priority of "studies of water circulation in seas and oceans on a variety of temporal and spatial scales based on modelling". This project will test a key hypothesis that, if true, will change the modelling of ocean circulation.
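    To see why direct global resolution of the mesoscale is out of reach, a back-of-envelope Python sketch of how grid-cell counts grow as spacing is refined (the square-basin geometry and depth-level count are illustrative assumptions, not project figures):

        # Rough cost scaling for resolving the mesoscale: halving the grid
        # spacing quadruples the horizontal cell count, and the stable time
        # step shrinks too, so total cost grows faster still.

        def grid_cells(basin_km: float, spacing_km: float,
                       depth_levels: int = 75) -> float:
            """Approximate 3-D cell count for a square basin."""
            per_side = basin_km / spacing_km
            return per_side ** 2 * depth_levels

        basin = 6000.0  # basin scale quoted above, km
        for spacing in (25.0, 10.0, 5.0):
            print(f"{spacing:>4.0f} km grid: ~{grid_cells(basin, spacing):.1e} cells")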

  • Funder: UK Research and Innovation
    Project Code: NE/P017436/1
    Funder Contribution: 1,530,230 GBP

    Windstorms can cause great damage to property and infrastructure. The windstorm footprint (a map of maximum wind gust speed over three days) is an important summary of the hazard, of great relevance to the insurance industry and to infrastructure providers. Windstorm footprints are conventionally estimated from meteorological data and numerical weather model analyses. However, several less structured data sources could contribute to the estimation of windstorm footprints and, more importantly, raise the spatial resolution of our estimates. This matters because important small-scale meteorological phenomena, such as sting jets, are not well resolved by current methods.

    We propose to exploit three additional sources of data (and possibly others during the course of the project). The three sources identified so far are amateur observations available through the Met Office Weather Observations Website (WOW), comments made on social media, and video recorded on social media or CCTV. Amateur meteorological observations are currently collected by the Met Office but not used in producing footprint estimates. We will investigate whether they can be used in estimating the storm footprint; a useful by-product will be uncertainty estimates for each WOW station. Social media, such as Twitter or Instagram, often contains comments on windstorms, ranging from remarks on how windy it is to reports of storm damage. In some cases the geographical location of a message is provided by the device; in others it has to be inferred. Very large numbers of messages are posted on social media every day, and it should be possible to use them for more detailed modelling of footprints. In addition to text, social media also records images and video, and video is recorded extensively in the form of CCTV. Video recordings of trees, say, blowing in the wind contain information on the strength of the windstorm, and we will analyse such recordings to derive wind velocity and gust velocity.

    Bringing together large quantities of diverse data is a complex procedure. We will develop, test, and compare two approaches from modern data science: statistical process modelling and machine learning. Both will aim to synthesise all the data into an estimate of the windstorm footprint (and its associated uncertainty). The former will concentrate on producing a map more like the current estimates, based on the maximum gust speed, while the latter, data-based methods will concentrate more on mapping the damage caused by the storm. Once we have estimates of the windstorm footprint from both social media and the modelling, we will compare these with the standard products and, in consultation with stakeholders, establish any improvements.
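    As a toy illustration of the data-fusion step, the Python sketch below merges scattered gust observations (of the kind WOW stations or geolocated reports might supply) into a gridded footprint by inverse-distance weighting. This is a simple stand-in for the statistical process models and machine learning the project actually proposes, and all coordinates and gust values are invented:

        # Inverse-distance weighting of point gust observations onto a grid.
        # Coordinates are in km; gusts in m/s. All values are invented.

        import numpy as np

        def idw_footprint(obs_xy: np.ndarray, obs_gust: np.ndarray,
                          grid_x: np.ndarray, grid_y: np.ndarray,
                          power: float = 2.0) -> np.ndarray:
            """Interpolate maximum-gust observations onto a regular grid."""
            gx, gy = np.meshgrid(grid_x, grid_y)
            num = np.zeros_like(gx)
            den = np.zeros_like(gx)
            for (x, y), gust in zip(obs_xy, obs_gust):
                w = 1.0 / (np.hypot(gx - x, gy - y) + 1e-6) ** power
                num += w * gust
                den += w
            return num / den

        obs = np.array([[10.0, 20.0], [50.0, 50.0], [80.0, 30.0]])  # station x, y
        gusts = np.array([28.0, 35.0, 31.0])                        # observed gusts
        grid = np.arange(0.0, 101.0, 10.0)                          # 10 km spacing
        print(idw_footprint(obs, gusts, grid, grid).round(1))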

  • Funder: UK Research and Innovation
    Project Code: EP/P034489/1
    Funder Contribution: 344,231 GBP

    Predicting high-impact extreme events, such as severe climatic and economic events, is a major societal challenge. Using innovative mathematical techniques, this proposal determines phenomenological mechanisms that lead to the occurrence of extremes and develops a theory that can be used to predict when such events occur in physical modelling applications. Using dynamical systems theory, the proposed research will exploit geometrical features of the underlying mathematical models to determine future extreme behaviour. This goes beyond traditional approaches such as monitoring output time series data alone.

    The study of successive maxima (or minima) of stochastic processes is called Extreme Value Theory (EVT). This theory is extensively used in risk analysis to estimate the probabilities of rare events and extremes, e.g. high river levels, hurricanes and market crashes. For physical systems modelled by deterministic dynamical systems, especially chaotic ones, a corresponding theory of extremes is yet to be fully understood. These systems are highly sensitive, and the time series of observations can be highly correlated. A key question we address is when the theory for independent, identically distributed random variables must be modified to understand extremes of deterministic systems; conversely, when are certain probabilistic limit laws (such as Poisson laws) a good description of the extreme phenomenon?

    Ergodic theory approaches have been very successful in understanding the long-term evolution of these systems. Recent approaches have focused on time series observations which have a unique maximum at a distinguished point in phase space, and whose level-set geometries coincide with balls in the ambient (usually Euclidean) metric. However, extremes of other physically relevant functions (with geometries beyond nested balls) are also important in applications. These include energy-like functions or wind-speed functionals, which play a role in measuring the destructiveness of storms. We therefore go beyond existing methodologies and develop a theory of extremes for physically relevant observable functions. We then apply this theory to explicit dynamical systems (both discrete and continuous) motivated by real-world mathematical models, such as those for weather and climate.
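    For orientation, a minimal Python sketch of the classical i.i.d. setting that the proposal contrasts with deterministic dynamics: fit a Generalised Extreme Value (GEV) distribution to block maxima and read off a return level. The Gumbel-distributed 'daily' series below is synthetic, standing in for a real observable:

        # Classical EVT workflow: block maxima -> GEV fit -> return level.
        # The synthetic data is illustrative, not a dynamical observable.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(0)
        daily = rng.gumbel(loc=10.0, scale=2.0, size=(50, 365))  # 50 "years"
        annual_maxima = daily.max(axis=1)

        shape, loc, scale = genextreme.fit(annual_maxima)
        # 100-year return level: the quantile exceeded with probability 1/100 per block.
        level_100 = genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)
        print(f"GEV fit: shape={shape:.3f}, loc={loc:.2f}, scale={scale:.2f}")
        print(f"100-year return level: {level_100:.2f}")

    For chaotic deterministic systems, the correlations in the observations and the geometry of the observable's level sets are exactly what can break the assumptions behind this fit, which is the gap the project addresses.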
