
Dassault Systèmes (United Kingdom)

29 Projects
  • Funder: UK Research and Innovation Project Code: EP/V028839/1
    Funder Contribution: 809,674 GBP

    Models of complex chemical processes such as combustion or atmospheric chemistry assume that the molecules taking part are thermalized, that is, that their energy is characterized by the temperature of the system. Chemical activation (CA) occurs when the energy released by a reaction is channelled into the products, so that they have an energy greater than would be thermally predicted. How does the reactivity of these activated species compare with that of their thermalized equivalents? What is the significance of CA? How can CA be incorporated into chemical models of complex systems? These are the questions at the heart of our project: Complex Chemistry and Chemical Activation (C3A). Aspects of CA have been known for more than 100 years; indeed, 2022 marks the centenary of the Lindemann Mechanism, the first theory proposed to explain the pressure dependence of some chemical reactions. Models of CA have grown in sophistication, yet uncertainties in key processes (energy transfer, calculation of densities of states) limit the accuracy of kinetic and thermodynamic predictions from such systems. Addressing the uncertainties in these aspects of current models through new experimental data and developments in fundamental models is one strand of C3A. More recently, work in this group and elsewhere has shown that systems which were thought to be adequately modelled by thermalized reagents, such as abstraction reactions (e.g. OH + HCHO), do need to be considered in the context of chemical activation. In a 2018 review, Klippenstein states: 'These studies ultimately led us to the realization that at combustion temperatures, the foundational assumption of thermalization prior to reaction is not always valid, and further that its breakdown significantly affects key combustion properties' (Proceedings of the Combustion Institute, 36, p77). These phenomena are not limited to combustion; plasma chemistry and the atmospheric chemistry of Earth and other planets provide other important examples of applications. C3A is a collaboration between leading groups from Leeds and Oxford, both with interests in experiments and theory. C3A will generate a wealth of new experimental data, which, in combination with theoretical interpretation, will allow us to assess the significance of CA in real systems and provide the tools to allow CA to be accurately incorporated into chemical models of these processes. The impact of C3A on industry will be facilitated by collaborations with Shell, Dassault Systèmes and AirLabs. Such models are essential tools for understanding important questions, from current highly practical issues (how can combustion systems be optimized to minimize CO2 emissions and improve air quality?) to future questions (biofuels for aviation, novel methods of renewable energy storage such as ammonia generation and combustion) to important fundamental questions such as modelling the atmospheres of hot-Jupiter exoplanets or the interstellar medium. The accurate assessment and incorporation of CA into such models will significantly enhance their reliability and predictive value.
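
    For context, the pressure dependence that the Lindemann Mechanism was proposed to explain can be summarised by the standard textbook scheme (this sketch is general background, not part of the proposal): a reactant A is collisionally activated by a bath-gas molecule M, and the energised species A* then either reacts or is collisionally deactivated.

```latex
% Standard Lindemann scheme (textbook sketch, not the project's own model):
%   A + M  -> A* + M   (activation,   k_1)
%   A* + M -> A  + M   (deactivation, k_{-1})
%   A*     -> products (unimolecular step, k_2)
% Applying the steady-state approximation to [A*] gives the effective
% first-order rate coefficient
\[
  k_{\mathrm{uni}} \;=\; \frac{k_1 k_2 [\mathrm{M}]}{k_{-1}[\mathrm{M}] + k_2},
\]
% which tends to k_\infty = k_1 k_2 / k_{-1} at high pressure (large [M])
% and to k_1[\mathrm{M}] at low pressure, reproducing the observed fall-off
% of unimolecular rate coefficients as pressure decreases.
```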

  • Funder: UK Research and Innovation Project Code: EP/P021573/1
    Funder Contribution: 383,551 GBP

    By 2020, the advanced composite market is predicted to be worth around £17 billion, with automotive the second largest growth sector (after wind energy) but still falling far short of its enormous growth potential; the high cost of production for advanced composite products is still a major obstacle to their wider exploitation. Government legislation on the reduction of emissions is an important driver across the transport sector, and one way to achieve prescribed targets is through the substitution of relatively heavy metallic components with highly optimised light-weight advanced polymer composite parts. Consequently, there is an urgent need to address the economic viability of manufacturing with advanced polymer composites and realise their full weight and fuel saving potential. The proposed project aims to contribute to this overarching goal by introducing an ambitious low-cost route to manufacturing highly optimised advanced composite structures. The ability to produce 'steered-fibre laminates' containing non-linear fibre paths creates a step change in the design space for advanced composite structures. The designer is able to reposition stress concentrations away from holes and inserts, to improve a laminate's resistance to buckling and failure, and to enhance a laminate's dynamic response to vibrations. Ultimately this can lead to lighter, more optimised structures for use in the aerospace and automotive sectors, enhancing fuel efficiency and contributing to the broader goals of reduced cost and lower emissions across the transport sector. The aim of the proposed project is to implement and demonstrate a novel and disruptive manufacturing process that can produce low-cost, high-quality steered-fibre laminates without the use of expensive, capital-intensive automated fibre placement machines (the current solution). The new process is best described as 2-D forming; in order to support this novel manufacturing process, a custom-designed suite of computer-aided design and manufacture software will be developed. Computational tools for digital manufacturing are essential if 2-D forming is to be successfully achieved without inducing severe wrinkling and buckling of the deforming biaxial sheet. Reducing cost will effectively bring fibre-steering technology to a broader range of applications, increasing its economic impact and bringing new manufacturing capabilities to a wider industrial base, with the UK leading the way in this important area of manufacturing.

  • Funder: UK Research and Innovation Project Code: EP/N002288/1
    Funder Contribution: 346,710 GBP

    Two of the most critical global challenges currently being faced are energy security and climate change. In the UK, more than £100 bn of investment in new UK power stations and grid infrastructure is projected within the next decade, both to replace ageing plant and to allow for the incorporation of renewable sources. Such changes will involve a paradigm shift in the ways in which we generate and transmit electricity. Since a central element of all items of power plant is electrical insulation, meeting our future challenges through the deployment of new, innovative plant will require the development and exploitation of new high-performance insulation material systems. Polymer nanocomposites have demonstrated clear potential, but the lack of detailed understanding of the underlying physics and chemistry is a major impediment to the technological realisation of this potential. In certain laboratory studies, nanodielectric materials have out-performed unfilled and traditional micro-composite insulating materials. However, entirely contrary results have also been reported elsewhere. Undoubtedly, this variability in macroscopic behaviour comes about as a consequence of our inability to define and control the key factors that dictate the dielectric behaviour of nanocomposites. The overarching aim of this project is to resolve this issue such that the potential of dielectric nanocomposites - nanodielectrics - can be fully exploited. As such, the project is totally aligned with the EPSRC Materials for Energy theme, in which it is accepted that "in the field of advanced materials it will be necessary to strengthen approaches to the rational design and characterisation of advanced materials and their integration into structures and systems". It also aligns with the Advanced Materials theme of the "Eight Great Technologies", in which it is accepted that "these materials are essential to 21st century manufacturing in a UK market worth £170 billion per annum and representing 15 per cent of GDP". Our research hypothesis is that the macroscopic properties of nanodielectrics cannot be reliably controlled without understanding the processes that occur at the interfaces between the matrix material and the nanoparticles, because these regions directly affect two critical issues. First, interfacial interactions will affect the nanoparticle dispersion, which has a major bearing on many physical properties; second, the nature of the interface determines the local density of states in the system, and thereby the material's overall electrical characteristics. Understanding such local processes is challenging, and we propose to do this through a combination of computational simulation and experiment, with both aspects closely aligned, thereby allowing the simulation to direct experiment and the experimental results to refine the simulation. The work programme has been divided into three distinct themes, which will progressively move the work from fundamentals to exploitation. Theme 1 will concentrate on model systems, where simulation and experiment can be most closely aligned. Theme 2 will then seek to deploy the key messages in the development of technologically relevant systems and processes. Throughout, Theme 3 will engage with a range of stakeholders, from key industry players (equipment manufacturers, energy utilities, standards bodies) to the general public, to maximise the reach and significance of the project's ultimate impact (economic, environmental, societal). We see the involvement of our Industrial Users Group as being particularly important, both in helping to guide the project and in terms of ensuring acceptance of the technologies that will ultimately arise.

  • Funder: UK Research and Innovation Project Code: EP/G055882/1
    Funder Contribution: 309,547 GBP

    Quantum mechanics has had a profound and pervasive influence on science and technology. Phenomena that are intrinsically quantum mechanical, such as magnetism, electron transport in semiconductors, and the effect of impurity atoms in materials, lie at the heart of almost every branch of industry. Quantum mechanical calculations of properties and processes from 'first principles' are capable of making accurate quantitative predictions, but require solving the Schrödinger equation, which is extremely difficult and can only be done using powerful computers. In contrast, empirical modelling approaches are relatively cheap but lack the predictive power of first-principles methods (which are parameter-free and take as input only the atomic numbers of the constituent atoms). This predictive capability is essential in order to make rapid progress on new and challenging problems where there is insufficient experimental data, to generate useful empirical approaches, and to check the reliability of empirical approaches where they exist. Within the class of first-principles methods, one approach that has been outstandingly successful is Density Functional Theory (DFT), as it combines high accuracy with moderate computational cost. Nevertheless, the computational effort of performing calculations with conventional DFT approaches increases as the cube of the number of atoms, making them unable to tackle problems with more than a few hundred atoms even on modern supercomputers. Since the pioneering work of the Nobel laureate Walter Kohn, it has been known that it is possible to reformulate DFT so that it scales linearly, which would in principle allow calculations with many thousands or even millions of atoms. The practical realisation of this, however, in a method which is as robust and accurate as conventional cubic-scaling DFT approaches, has been extremely difficult. The ONETEP approach developed over many years by the applicants of this proposal has achieved just that. ONETEP is at the cutting edge of developments in first-principles calculations. However, while the fundamental difficulties of performing accurate first-principles calculations with linear-scaling cost have been solved, only a small core of functionality is currently available in ONETEP, which prevents its wide application. In this collaborative project between three universities, the original developers of ONETEP will lead an ambitious workplan whereby the functionality of the code will be rapidly and significantly enriched. The code development ethic of ONETEP, namely that software is robust, user-friendly, modular, portable and highly efficient on current and future HPC technologies, will be of fundamental importance and will be further strengthened by rigorous cross-checking between the three institutions of this proposal. The developments are also challenging from a theoretical point of view, as they need to be within the linear-scaling framework of ONETEP, using its highly non-trivial formulation of DFT in terms of in situ optimised localised functions. The programme of work provides much added value, as the few fundamental enabling technologies that will be developed in its first stages will then underpin many of the functional capabilities that will follow. The result will be a tool capable of a whole new level of materials simulation at the nanoscale with unprecedented accuracy. It will find immediate application in simulations in molecular biology, nanostructures and materials, which underpin solutions to urgent current problems such as energy, environment and health. Through the increasing number of commercial and academic users and developers of ONETEP, the worldwide dissemination and wide use of this novel tool will be rapid; finally, the expanding ONETEP Developers' Group will coordinate the best strategies for the future maintenance and development of the software.
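
    To put the scaling statement above in concrete terms (illustrative arithmetic only, not figures from the proposal): if the cost of a conventional DFT calculation grows as the cube of the number of atoms N, a tenfold increase in system size costs a thousand times more computer time, whereas for a linear-scaling method it costs only ten times more.

```latex
% Illustrative arithmetic only: cost ratios for a tenfold increase in atom count N.
\[
  \frac{t_{\mathrm{cubic}}(10N)}{t_{\mathrm{cubic}}(N)} = 10^{3} = 1000,
  \qquad
  \frac{t_{\mathrm{linear}}(10N)}{t_{\mathrm{linear}}(N)} = 10 .
\]
```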

  • Funder: UK Research and Innovation Project Code: EP/G007489/1
    Funder Contribution: 1,360,330 GBP

    The discovery that matter is made up of atoms ranks as one of mankind's greatest achievements. Twenty-first century science is dominated by a quest for the mastery (both in terms of control and understanding) of our environment at the atomic level. In biology, understanding life (preserving it, or even attempting to create it) revolves around large, complex molecules -- RNA, DNA, and proteins. Global warming is dictated by the particular way atoms are arranged to make small greenhouse gas molecules, carbon dioxide and so on. The drive for faster, more efficient, cheaper computer chips forces nanotechnology upon us. As the transistors that make up the microscopic circuits are packed ever closer together, electronic engineers must understand where the atoms are placed, or misplaced, in the semiconducting and insulating materials. Astronomers are currently, daily, discovering new planets outside our solar system, orbiting alien stars. The largest are the easiest to spot, and many are far larger than Jupiter. The more massive the planet, the higher the pressures endured by the matter that makes up its bulk. How can we hope to determine the structure of matter at these conditions? The atomic theory of matter leads to quantum mechanics -- a mechanics of the very small. In principle, to understand and predict the behaviour of matter at the atomic scale simply requires the solution of the quantum mechanical Schrödinger equation. This is a challenge in itself, but in an approximate way it is now possible to quickly compute the energies and properties of fairly large collections of atoms. But is it possible to predict how those atoms will be arranged in Nature - ex nihilo, from nothing but our understanding of physics? Some have referred to it as a scandal that the physical sciences cannot routinely predict the structure of even simple crystals -- but most have assumed it to be a very difficult problem. A minimum energy must be found in a many-dimensional space of all the possible structures. Those researchers brave enough to tackle this challenge have done so by reaching for complex algorithms -- such as genetic algorithms, which appeal to evolution to breed ever better structures (with better taken to mean more stable). However, I have discovered, to my surprise and to others', that the very simplest algorithm -- throw the collection of atoms into a box, and move the atoms downhill on the energy landscape -- is remarkably effective if it is repeated many times. This approach needs no prior knowledge of chemistry. Indeed the scientist is taught chemistry by its results -- this is critical if the method is to be used to predict the behaviour of matter under extreme conditions, where learned intuition will typically fail. I have used this approach, which I call random structure searching, to predict the structure of crystals ex nihilo. My first application of it has been to silane at very high pressures, and the structure I predicted has recently been seen in experiments. But probably the most impressive application so far has been to predicting the structure of hydrogen at the huge pressures found in the gas giant planets, where it may be a room-temperature superconductor. In the course of my fellowship I will extend this work to try to anticipate the structure of matter in the newly discovered exoplanets, to try to discover and design materials with extreme (and hopefully extremely useful) properties, and to help pharmaceutical researchers understand the many forms that their drug molecules adopt when they crystallise.
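
    The "simplest algorithm" described above (random placement followed by downhill relaxation, repeated many times) can be illustrated with a toy sketch. The Python snippet below is not the author's search code: it uses a small Lennard-Jones cluster as a stand-in for a first-principles energy landscape, and the function and parameter names are purely illustrative, but it shows the structure of the loop.

```python
import numpy as np
from scipy.optimize import minimize


def lj_energy(flat_coords, n_atoms):
    """Lennard-Jones cluster energy (reduced units); a cheap stand-in for a DFT energy."""
    pos = flat_coords.reshape(n_atoms, 3)
    energy = 0.0
    for i in range(n_atoms - 1):
        d = np.linalg.norm(pos[i + 1:] - pos[i], axis=1)
        energy += np.sum(4.0 * (d ** -12 - d ** -6))
    return energy


def random_positions(n_atoms, box, rng, min_sep=0.7):
    """'Throw the atoms into a box': random positions, rejecting overlapping starts."""
    while True:
        pos = rng.uniform(0.0, box, size=(n_atoms, 3))
        dists = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
        if np.all(dists[np.triu_indices(n_atoms, k=1)] > min_sep):
            return pos.ravel()


def random_structure_search(n_atoms=7, n_trials=100, box=3.0, seed=0):
    """Repeat: random start, relax downhill, keep the lowest-energy structure found."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_trials):
        x0 = random_positions(n_atoms, box, rng)
        result = minimize(lj_energy, x0, args=(n_atoms,), method="L-BFGS-B")
        if best is None or result.fun < best.fun:
            best = result
    return best.fun, best.x.reshape(n_atoms, 3)


if __name__ == "__main__":
    e_min, structure = random_structure_search()
    print(f"Lowest energy found: {e_min:.4f} (LJ reduced units)")
```

    In the fellowship work itself, the downhill step would be a first-principles (DFT) relaxation of a periodic cell, typically under an applied pressure, rather than a toy pair potential; the loop structure of repeated random starts and local relaxations is the same.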
