Powered by OpenAIRE graph

AWE

Country: United Kingdom
91 Projects, page 1 of 19
  • Funder: UK Research and Innovation Project Code: EP/R026084/1
    Funder Contribution: 12,807,900 GBP

    The nuclear industry has some of the most extreme environments in the world, with radiation levels and other hazards frequently restricting human access to facilities. Even when human entry is possible, the risks can be significant and productivity very low. To date, robotic systems have had limited impact on the nuclear industry, but it is clear that they offer considerable opportunities for improved productivity and significantly reduced human risk. The nuclear industry has a vast array of highly complex and diverse challenges that span the entire industry: decommissioning and waste management, Plant Life Extension (PLEX), Nuclear New Build (NNB), small modular reactors (SMRs) and fusion. Whilst the challenges across the nuclear industry are varied, they share many similarities that relate to the extreme conditions present. Vitally, these similarities also translate across into other environments, such as space, oil and gas, and mining, all of which, for example, have challenges associated with radiation (high-energy cosmic rays in space and naturally occurring radioactive materials (NORM) in mining and oil and gas). Major hazards associated with the nuclear industry include radiation; storage media (for example water, air, vacuum); lack of utilities (such as lighting, power or communications); restricted access; and unstructured environments. These hazards mean that some challenges are currently intractable in the absence of solutions that will rely on future capabilities in Robotics and Artificial Intelligence (RAI).

    Reliable robotic systems are not just essential for future operations in the nuclear industry; they also offer the potential to transform the industry globally. In decommissioning, robots will be required to characterise facilities (e.g. map dose rates, generate topographical maps and identify materials), inspect vessels and infrastructure, move, manipulate, cut, sort and segregate waste, and assist operations staff. To support the life extension of existing nuclear power plants, robotic systems will be required to inspect and assess the integrity and condition of equipment and facilities, and might even be used to implement urgent repairs in hard-to-reach areas of the plant. Similar systems will be required in NNB, fusion reactors and SMRs. Furthermore, it is essential that past mistakes in the design of nuclear facilities, which make the deployment of robotic systems highly challenging, do not perpetuate into future builds. Even a newly constructed facility such as CERN, which now has many areas that are inaccessible to humans because of high radioactive dose rates, has been designed for human, rather than robotic, intervention.

    Another major challenge that RAIN will grapple with is the use of digital technologies within the nuclear sector. Virtual and Augmented Reality, AI and machine learning have arrived, but the nuclear sector is poorly positioned to understand and use these rapidly emerging technologies. RAIN will deliver the necessary step changes in fundamental robotics science and establish the pathways to impact that will enable the creation of a research and innovation ecosystem with the capability to lead the world in nuclear robotics. While our centre of gravity is around nuclear, we have a keen focus on applications and exploitation in a much wider range of challenging environments.

  • Funder: UK Research and Innovation Project Code: EP/R018537/1
    Funder Contribution: 2,557,650 GBP

    Bayesian inference is a process which allows us to extract information from data. The process uses prior knowledge articulated as statistical models for the data. We are focused on developing a transformational solution to Data Science problems that can be posed as such Bayesian inference tasks. An existing family of algorithms, Markov chain Monte Carlo (MCMC), offers impressive accuracy but demands significant computational load. For a significant subset of the users of Data Science that we interact with, while the accuracy offered by MCMC is recognised as potentially transformational, the computational load is just too great for MCMC to be a practical alternative to existing approaches. These users include academics working in science (e.g., Physics, Chemistry, Biology and the social sciences) as well as government and industry (e.g., the pharmaceutical, defence and manufacturing sectors). The problem is then how to make the accuracy offered by MCMC accessible at a fraction of the computational cost.

    The solution we propose is based on replacing MCMC with a more recently developed family of algorithms, Sequential Monte Carlo (SMC) samplers. While MCMC, at its heart, manipulates a single sampling process, SMC samplers are inherently population-based algorithms that manipulate a population of samples. This makes SMC samplers well suited to implementations that exploit parallel computational resources. It is therefore possible to use emerging hardware (e.g., Graphics Processing Units (GPUs), Field Programmable Gate Arrays (FPGAs) and Intel's Xeon Phis, as well as High Performance Computing (HPC) clusters) to make SMC samplers run faster. Indeed, our recent work (which first had to remove some algorithmic bottlenecks) has shown that SMC samplers can offer accuracy similar to MCMC with implementations that are better suited to such emerging hardware.

    The benefits of using an SMC sampler in place of MCMC go beyond those made possible by simply posing a (tough) parallel computing challenge. The parameters of an MCMC algorithm necessarily differ from those of an SMC sampler. These differences offer opportunities to develop SMC samplers in directions that are not possible with MCMC. For example, SMC samplers, in contrast to MCMC algorithms, can be configured to exploit a memory of their historic behaviour and can be designed to smoothly transition between problems. It seems likely that, by exploiting such opportunities, we will generate SMC samplers that outperform MCMC by even more than is possible with parallelised implementations alone.

    Our interactions with users, our experience of parallelising SMC samplers and the preliminary results we have obtained when comparing SMC samplers and MCMC make us excited about the potential that SMC samplers offer as a "New Approach for Data Science". Our current work has only begun to explore this potential. We perceive that significant benefit could result from a larger programme of work that helps us understand the extent to which users will benefit from replacing MCMC with SMC samplers. We propose a programme of work that combines a focus on users' problems with a systematic investigation into the opportunities offered by SMC samplers. Our strategy for achieving impact comprises multiple tactics. Specifically, we will: use identified users to act as "evangelists" in each of their domains; work with our hardware-oriented partners to produce high-performance reference implementations; engage with the developer team for Stan (the most widely used generic MCMC implementation); and work with the Industrial Mathematics Knowledge Transfer Network and the Alan Turing Institute to engage with both users and other algorithmic developers.
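    The reweight / resample / move cycle that makes SMC samplers population-based (and hence parallelisable) can be sketched in a few lines. This is a minimal illustrative toy, not the project's implementation: the 1-D Gaussian target, the linear tempering schedule and all parameter names are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D example: unnormalised log-posterior centred at 3.
def log_target(x):
    return -0.5 * (x - 3.0) ** 2

def log_prior(x):  # broad Gaussian prior: the beta = 0 distribution
    return -0.5 * (x / 10.0) ** 2

def smc_sampler(n_particles=2000, n_steps=20, step_size=1.0):
    # Initialise the population from the prior.
    x = rng.normal(0.0, 10.0, n_particles)
    logw = np.zeros(n_particles)
    betas = np.linspace(0.0, 1.0, n_steps + 1)  # tempering schedule
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Reweight: incremental importance weight for the new temperature.
        logw += (b - b_prev) * (log_target(x) - log_prior(x))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Resample if the effective sample size has collapsed.
        if 1.0 / np.sum(w ** 2) < n_particles / 2:
            idx = rng.choice(n_particles, n_particles, p=w)
            x, logw = x[idx], np.zeros(n_particles)
        # Move: one Metropolis step per particle, targeting the tempered density.
        log_pi = lambda y: b * log_target(y) + (1.0 - b) * log_prior(y)
        prop = x + step_size * rng.normal(size=n_particles)
        accept = np.log(rng.uniform(size=n_particles)) < log_pi(prop) - log_pi(x)
        x = np.where(accept, prop, x)
    w = np.exp(logw - logw.max())
    return x, w / w.sum()

x, w = smc_sampler()
print(np.sum(w * x))  # weighted posterior mean, close to 3
```

    Note that the reweight and move steps act on every particle independently, so the inner operations vectorise naturally; this per-particle independence is what makes SMC samplers a good fit for GPUs, FPGAs and HPC clusters, in contrast to the inherently sequential single chain of MCMC.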

  • Funder: UK Research and Innovation Project Code: EP/L015552/1
    Funder Contribution: 4,544,990 GBP

    Moore's Law states that the number of active components on a microchip doubles every 18 months. Variants of this law can be applied to many measures of computer performance, such as memory and hard disk capacity, and to reductions in the cost of computations. Remarkably, Moore's Law has held for over 50 years, during which time computer speeds have increased by a factor of more than 1 billion! This remarkable rise of computational power has affected all of our lives in profound ways, through the widespread usage of computers, the internet and portable electronic devices, such as smartphones and tablets. Unfortunately, Moore's Law is not a fundamental law of nature, and sustaining this extraordinary rate of progress requires continuous hard work and investment in new technologies, most of which relate to advances in our understanding of, and ability to control, the properties of materials.

    Computer software plays an important role in enhancing computational performance, and in many cases it has been found that for every factor of 10 increase in computational performance achieved by faster hardware, improved software has further increased computational performance by a factor of 100. Furthermore, improved software is also essential for extending the range of physical properties and processes which can be studied computationally. Our EPSRC Centre for Doctoral Training in Computational Methods for Materials Science aims to provide training in numerical methods and modern software development techniques, so that the students in the CDT are capable of developing innovative new software which can be used, for instance, to help design new materials and understand the complex processes that occur in materials. The UK, and in particular Cambridge, has been a pioneer in both software and hardware since the earliest programmable computers, and through this strategic investment we aim to ensure that this lead is sustained well into the future.
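    The billion-fold figure follows directly from the 18-month doubling assumption; a quick back-of-the-envelope check:

```python
# 50 years of doubling every 18 months: how large is the cumulative factor?
months = 50 * 12
doublings = months / 18            # about 33 doublings
factor = 2.0 ** doublings
print(f"{doublings:.1f} doublings -> factor of {factor:.2e}")
```

    Roughly 33 doublings gives a factor of about 10^10, comfortably over a billion.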

  • Funder: UK Research and Innovation Project Code: EP/I033335/2
    Funder Contribution: 5,618,010 GBP

    The EPSRC Centre for Innovative Manufacturing in Additive Manufacturing will create a sustainable and multidisciplinary body of expertise that will act as a UK and international focus - the 'go to' place for additive manufacturing and its applications. The Centre will undertake a user-defined and user-driven programme of innovative research that underpins Additive Manufacturing as a sustainable and value-adding manufacturing process across multiple industry sectors. Additive Manufacturing (AM) is the direct production of end-use component parts made using additive layer manufacturing technologies. It enables the manufacture of geometrically complex, low-to-medium-volume production components in a range of materials, with little, if any, fixed tooling or manual intervention beyond the initial product design. AM enables a number of value chain configurations, such as personalised component part manufacture, but also economic low-volume production within high-cost-base economies. This innovative approach to manufacturing is now being embraced globally across industry sectors, from high-value aerospace / automotive manufacture to the creative and digital industries.

    To date, AM research has almost exclusively focused upon the production of single-material, homogeneous structures (in polymers, metals and ceramics). The Centre will move away from single-material, 'passive' AM processes and applications that exhibit conventional levels of functionality, toward the challenges of investigating next-generation, multi-material, active additive manufacturing processes, materials and design systems. This transformative approach is required for the production of the new generation of high-value, multi-functional products demanded by industry. The Centre will initially explore two themes as the centrepieces of a wider research portfolio, supported by a range of platform activities.

    The first theme takes on the challenge of how to design, integrate and effectively implement multi-material, multi-functional manufacturing systems capable of matching the requirements of industrial end-users for 'ready-assembled' multifunctional devices and structures. Working at the macro level, this will involve the convergence of several approaches to increase the value embedded in the product during the manufacturing stage by the direct printing / deposition of electronic / optical tracks, potentially on a voxel-by-voxel basis; the processing and bonding of dissimilar materials that ordinarily require processing at varying temperatures and conditions will be particularly challenging. The second theme will explore the potential for 'scaling down' AM for small, complex components, extending single-material AM to the printing of optical / electronic pathways within micro-level products, with a vision to directly print electronics integrally. The platform activities will provide the opportunity to undertake both fundamental and industry-driven pilot studies that both feed into and derive from the theme-based research, and will grow the capacity and capability of the Centre, creating a truly national UK Centre and Network that keeps the UK at the forefront of international research and industrial exploitation in Additive Manufacturing.

  • Funder: UK Research and Innovation Project Code: EP/H017461/1
    Funder Contribution: 1,029,470 GBP

    The use of computers and computer programs is pervasive nowadays, but every computer user knows that programs go wrong. While it is merely annoying when our favourite text editor loses a bit of our work, the consequences are potentially much more serious when a computer program that, for instance, controls parts of an airplane goes wrong. Software validation and verification are central to the development of this sort of application; in fact, the software industry in general spends a very large amount of money on these activities. One of the measures taken to promote correctness of programs is the use of a restricted set of the features available in programming languages. This usually means that most of the more recent advances in software engineering are left out. In this project, we propose to provide development, validation, and verification facilities that allow object-orientation and a modern real-time computational model to be used for the programming of safety-critical systems. In particular, we will work with one of the most popular programming languages: Java, or more specifically, its profiles for high-integrity engineering proposed by the Open Group. As our main case study, we will verify parts of the controller of the first Java Powered Industrial Robot, developed by Sun. One of our collaborators, a senior engineer at Sun, says in an interview that distributed real-time systems are really hard to build and that the engineering community does not really know how to build them in a coherent, repeatable way (java.dzone.com/articles). Real-Time Java is entering the industrial automation and automotive markets. Lawyers did not allow the Java Robot to get anywhere near a human, even in a JavaOne conference demo. To proceed in that kind of market, better support is needed.

    Programming is just one aspect of the development of a modern system; typically, a large number of extra artefacts are produced to guide and justify its design. Just as several models of a large building are produced before bricks and mortar are put together, several specification and design models of a program are developed and used before programs are written. These models assist in the validation and verification of the program. To take our civil engineering metaphor one step further, we observe that, just as there can be various models of a building that reflect different points of view - electricity cabling, plumbing, and floor plans, for example - we also have several models of a system. Different modelling and design notations concentrate on different aspects of the program: data models, concurrent and reactive behaviour, timing, and so on. No single notation or technique covers all aspects of the problem, and a combination of them needs to be employed in the development of large complex systems.

    In this project, we propose to investigate a novel integrated approach to validation and verification. Our aim is to provide a sound and practical technique that covers data modelling, concurrency, distribution, and timing. For that, we plan to investigate the extension and combined use of validation and verification techniques that have been successfully applied in industry. We do not seek an ad hoc combination of notations and tools, but a justified approach that provides a reliable foundation for the use of practical techniques. We will have succeeded if we verify a substantial part of the robot controller: using a model written in our notation, we will apply our techniques to verify parts of the existing implementation and execute it using our verified implementation of Safety-Critical Java. Measures of success will be provided by our industrial partners and by the influence of our results on their practice or business plans.

