Tel Aviv University
10 Projects
Project 2024 - 2025
Partners: Lancaster University, Tel Aviv University
Funder: UK Research and Innovation | Project Code: EP/Y003152/1 | Funder Contribution: 99,771 GBP

This collaborative project brings together international collaborators who are experts in fabricating superconducting junction devices using graphene with UK-based expertise in performing low-noise electronic measurements at ultra-low temperatures, in order to develop superconducting amplifiers that will improve the performance of superconducting quantum circuits. Quantum systems are generally very sensitive to noise. In quantum computing, for example, qubits are characterised by coherence, a property which describes how long a qubit can remain in a given quantum state. Noise causes decoherence, which limits the lifetime of these delicate systems. One source of noise is the amplifiers used to amplify the very low-power signals that control these quantum systems. In superconducting circuits we can use parametric amplifiers that employ Josephson junctions to operate at the lowest possible noise levels, limited only by quantum mechanics. Most commonly, these amplifiers use Josephson junctions formed by connecting two superconductors with an insulator, a so-called SIS junction. The operating frequency of parametric amplifiers employing SIS junctions can be tuned using magnetic flux; however, because magnetic fields are long-ranged, this flux can interfere with delicate quantum devices or create cross-talk between multiple amplifiers, and screening it creates additional challenges. Superconducting junctions can also be formed by connecting two superconductors with graphene, a so-called SgS junction. Unlike parametric amplifiers using SIS junctions, parametric amplifiers using SgS junctions can instead be tuned electrostatically.
This is achieved by applying a voltage to a gate electrode near the graphene. In this way, we can avoid the interference caused by stray magnetic flux. However, the development of SgS junctions has overwhelmingly focused on graphene exfoliated from high-quality graphite crystals. Although this produces the highest-quality graphene, the flakes are only a few micrometres in size, which limits the number of junctions that can be fabricated from a single flake and makes a practical realisation of devices using these junctions very challenging. Graphene can also be produced in large areas, enough to cover a 6-inch-diameter silicon wafer, and can be readily purchased. This form of graphene is generally of lower quality; however, junctions using large-area graphene have been demonstrated. Unfortunately, there has been almost no development of devices which exploit these junctions. This project seeks to develop a robust method for fabricating superconducting junctions from large-area graphene and then to develop parametric amplifiers using these junctions. Through this proof-of-concept work we aim to demonstrate that scalable, electrostatically tuned parametric amplifiers are possible. These amplifiers could be used for the readout of superconducting qubits and, by operating with very low noise, could help reduce decoherence in these systems. As they are less sensitive to magnetic fields, these amplifiers would also be more robust against such interference. More widely, superconducting microwave amplifiers are used in experiments searching for axions, a dark-matter candidate, and in radio astronomy. The low heat capacity of graphene will also allow the fabrication of very sensitive bolometers with low-noise parametric amplifiers. Success in this project would bring the expertise of a world-leading research group to the UK and deliver a key enabling technology that would strengthen the UK's position as a world leader in quantum technologies.
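The tuning mechanism described above can be illustrated with a back-of-the-envelope estimate. The Python sketch below assumes a simple lumped-element model of a parametric amplifier, in which the junction's Josephson inductance resonates with a shunt capacitance; the 1 uA critical current and 1 pF capacitance are illustrative values only, not parameters from this project.

```python
import math

PHI0 = 2.067833848e-15  # magnetic flux quantum, Wb

def josephson_inductance(i_c):
    """Josephson inductance L_J = Phi0 / (2*pi*I_c) for critical current i_c in amperes."""
    return PHI0 / (2 * math.pi * i_c)

def resonance_freq(i_c, c_shunt):
    """Resonance frequency (Hz) of the junction inductance against a shunt capacitance (F)."""
    l_j = josephson_inductance(i_c)
    return 1.0 / (2 * math.pi * math.sqrt(l_j * c_shunt))

def squid_ic(flux, i_c0=1e-6):
    """Flux-tuned critical current of a symmetric two-junction SQUID: the SIS tuning knob."""
    return i_c0 * abs(math.cos(math.pi * flux / PHI0))

# With these illustrative values the resonance lands in the microwave band (~9 GHz).
f0 = resonance_freq(squid_ic(0.0), 1e-12)
```

In this model, an SIS-based amplifier shifts its resonance by threading flux through the SQUID loop (`squid_ic`), whereas a gate-tuned SgS junction would instead make the critical current a function of gate voltage, removing the magnetic flux, and its long-range interference, from the picture entirely.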
For further information contact us at helpdesk@openaire.eu
Project 2007 - 2009
Partners: TAU, Max Planck Institute of Quantum Optics, Tel Aviv University, Imperial College London, Max-Planck-Gymnasium
Funder: UK Research and Innovation | Project Code: EP/E045049/1 | Funder Contribution: 245,651 GBP

In the early seventies it was realized that the notion of particles depends on the specific details of the quantum measurement process used to detect them, and that the state of motion of the measuring device can determine whether or not particles are observed. This discovery created a new viewpoint, prompted by the work of Fulling, Unruh and Hawking demonstrating that the number of particles found in a region depends on the acceleration of the measuring device. For example, the vacuum, i.e. a region that contains no particles at all, would be seen by an accelerated observer as a region with particles. The number of particles and their energy would increase with increasing acceleration. This effect is known as the Unruh effect. Since, by general relativity, acceleration and gravitation are equivalent, an analogous effect is black-hole radiation.

Einstein, Podolsky and Rosen introduced a Gedanken experiment in a 1935 paper to argue that quantum mechanics is not a complete physical theory; it is sometimes referred to as the EPR paradox. This thought experiment exposes paradoxical features of quantum mechanics, demonstrating strange correlations sometimes referred to as "spooky action at a distance". These correlations can be quantified, and various quantifications have been suggested, referred to as measures of entanglement.

I propose to study entanglement using the viewpoint introduced in the first paragraph. I am interested in studying the behaviour of entanglement when it is probed by different observers.
In particular, I would like to explore the experimental realization of these ideas. Since the Unruh effect has never been measured, due to experimental difficulties, I will study the realization of this effect in a Bose-Einstein Condensate (BEC). A BEC is a macroscopic collection of atoms which all occupy the same state. A BEC can be thought of as a macroscopic number of particles located at the same point, but this point, due to the uncertainty relations of quantum mechanics, can be quite large. It was found that this strange state is in some ways very similar to the vacuum of light: if we think of the vacuum as a kind of ether which lets light propagate through it, the BEC is a background in which information propagates.

In this proposal I want to study the feasibility of the experimental realization of these effects in a BEC. First I will study a scheme to measure the Unruh effect: I will propose a scheme in which an accelerated observer finds particles inside the vacuum, not the real vacuum but its analogue, the BEC at very low temperature. Then I will propose experiments in which two observers accelerating next to the vacuum would become entangled, i.e. would show EPR correlations. The experimental feasibility of this is important not only as a proof of a physical theory which is believed to be true, but also as a means to study a scheme which cannot be calculated. The creation of entanglement by acceleration is a problem which cannot be solved analytically; the realization of this experiment would provide a numerical solution to it. It is important to note that this problem, in addition to being analytically unsolvable, cannot be checked numerically on a classical computer either. A quantum computer could check this result, but unfortunately such a computer does not yet exist. Experimentally modelling problems that could otherwise be checked only by a quantum computer is precisely the idea behind the quantum simulator. The advance of this technology would serve as a major stepping stone towards the creation of a quantum computer.
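The experimental difficulty mentioned above can be made concrete by evaluating the Unruh temperature, T = hbar * a / (2 * pi * c * k_B). The Python sketch below uses CODATA constants; the sample acceleration of 10^20 m/s^2 is an illustrative value, not one taken from the proposal, and even that enormous acceleration yields a temperature below half a kelvin.

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
K_B = 1.380649e-23      # Boltzmann constant, J/K

def unruh_temperature(a):
    """Unruh temperature (K) seen by an observer with proper acceleration a (m/s^2):
    T = hbar * a / (2 * pi * c * k_B)."""
    return HBAR * a / (2 * math.pi * C * K_B)

# Even an extreme acceleration of 1e20 m/s^2 gives a temperature of only ~0.4 K,
# which is why an analogue system such as a cold BEC is an attractive testbed.
t_extreme = unruh_temperature(1e20)
```

The linear scaling of T with a means everyday accelerations produce temperatures some twenty orders of magnitude below anything measurable, motivating the BEC analogue proposed here.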
Project 2015 - 2018
Partners: Mediterranean Institute of Oceanology, Tel Aviv University, Keele University, Mediterranean Institute of Oceanography
Funder: UK Research and Innovation | Project Code: NE/M016269/1 | Funder Contribution: 351,386 GBP

Wind waves in seas are inherently random. Despite the progress of engineering, unpredicted extreme waves in the ocean remain a serious danger for ships and offshore structures. In recent years there have been a number of accidents involving large ships, resulting in loss of life and pollution of large sea and coastal areas. The UK, as an island trading nation, increasingly depends on ever-expanding shipping and offshore activities. The loss of life, or disruption (even temporary) of supply lines or of offshore energy production, has become totally unacceptable, both morally and economically. To address these challenges a thorough understanding of random sea waves is needed; above all, knowledge of how their probability distribution depends on the waves' interaction with the atmosphere. With changing weather patterns, the required knowledge of, say, a "100-year wave" for a particular place cannot be obtained from past experimental records, and a comprehensive theoretical model deduced from first principles is needed. A radical improvement on the present state of affairs has now become possible; this is the aim of the proposed project. At present all wave forecasting and modelling, which is part of routine meteorological forecasting, is based on numerical integration of the kinetic (Hasselmann) equation.
This equation, derived from first principles, takes into account wind input, dissipation and interaction between waves of different scales and directions, and describes the slow evolution of wind-wave energy spectra in time and space. A good understanding of spectral evolution has been accumulated from modelling and observations. The weakest link is in translating this knowledge of energy spectra into predictions of the probability distribution of wave heights. The major shortcomings of the prevailing approach are: (i) it relies on the very restrictive assumption of narrow spectra, while most observed spectra are broad from the viewpoint of nonlinear interactions; (ii) it does not properly take into account nonlinear wave interactions; (iii) it assumes stationarity of the process. Very recently the PI and RCoI found a way to evaluate numerically the higher moments of the probability distribution (skewness and kurtosis) within the established framework of wave turbulence, without these restrictions. Since the procedure is numerically expensive, we propose to parametrize it over all combinations of wave spectra and thus obtain simple parametrizations of the probability distributions. This will allow us to deduce from first principles a parametrization of probability distributions that is easy to use in operational forecasting across the full variety of sea states. The sea states predicted by existing models, or obtained from direct measurements, describe averaged ("normal") sea states. There also exist short-lived transient states, caused by sharp changes of wind, which are filtered out by such averaging. We argue that these ephemeral sea states might be responsible for a disproportionate share of anomalously high waves. Such transient sea states have never been studied in this context: the time resolution of wind forecasts was far too low, and there were no conceptual or numerical tools.
Crucially for this project, the situation has improved radically: the time resolution of wind forecasts is improving dramatically, while the PI and RCoI have derived a generalized kinetic equation able to describe the fast evolution of the spectra, and have developed and tested a numerical code able to tackle this equation. Combining this with the authors' specially designed direct numerical simulation algorithm, we propose a clear path for examining the probability distributions of wave heights in transient events linked to rapid changes of atmospheric forcing. On this basis the project aims to revolutionise the modelling of random wind waves and freak-wave forecasting.
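The higher moments the project targets can be illustrated with a toy computation. The Python sketch below assumes a purely linear random sea, in which the surface elevation is a Gaussian process and both skewness and excess kurtosis vanish; nonlinear wave interactions, which the project evaluates within wave turbulence theory, would show up as departures of these moments from zero.

```python
import random

def moments(samples):
    """Sample skewness and excess kurtosis of a sequence of surface elevations."""
    n = len(samples)
    mean = sum(samples) / n
    m2 = sum((x - mean) ** 2 for x in samples) / n
    m3 = sum((x - mean) ** 3 for x in samples) / n
    m4 = sum((x - mean) ** 4 for x in samples) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2 - 3.0  # excess kurtosis: zero for a Gaussian (linear) sea
    return skew, kurt

# A linear random sea: Gaussian surface elevation, so both moments are near zero.
random.seed(1)
eta = [random.gauss(0.0, 1.0) for _ in range(100_000)]
skew, kurt = moments(eta)
```

Positive excess kurtosis in real seas signals "heavy tails", i.e. anomalously high waves occurring more often than a Gaussian model predicts, which is exactly what a parametrization of these moments over wave spectra would feed into operational forecasting.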
Project 2023 - 2026
Partners: Tel Aviv University, Cornell University, Max Planck Institutes, University of Colorado Boulder, ARM Ltd, ARM (United Kingdom), Nvidia (United States), University of Surrey
Funder: UK Research and Innovation | Project Code: EP/X037142/1 | Funder Contribution: 466,479 GBP

Modern society depends on accessing and transferring vast quantities of data, at ever-increasing speeds. Technology giants such as Google invest billions of dollars every year in data centres across the world. Replicated data systems form the backbone of all cloud services; improving their reliability and performance impacts all cloud and big-data services. The UK is currently the largest cloud market in Europe, with over £17 billion in cloud investment in 2020. To meet our ever-growing need for rapid data transfer, RDMA (remote direct memory access) technologies enable next-generation infrastructures by allowing a machine to access (read/write) the memory of another machine directly across a network. Unlike traditional network protocol stacks such as TCP/IP, RDMA-enabled network interface cards (NICs) can 'bypass' the operating system kernel, and are thus capable of wire-speed data transmission. RDMA technology has been available in supercomputing clusters since the mid 2000s, but until recently remained an experimental feature in consumer and enterprise systems due to its cost. However, this changed with the availability of affordable NICs (e.g. those developed by our partner NVIDIA), and it is now possible to build distributed applications that challenge conventional design paradigms. For instance, one can leverage the additional throughput of RDMA to support concurrent front-end applications, surpassing the sequential state-machine replication services used today.
There is already a shift towards consumer-grade devices through the development of wireless RDMA and RDMA over 6G. To unlock the potential of RDMA in enterprise and consumer systems, we must enable programmers to write safe and secure programs. However, RDMA behaviour is currently documented only through informal plain-text manuals [22] and examples, making its semantics vague and ambiguous. The programming models for RDMA are poorly understood, and there is no support for formal verification. This carries significant risks, since RDMA enables writing directly into the memory of a remote machine. Moreover, RDMA programming is inherently challenging as it requires an understanding of the interaction between remote communication and local computation, i.e. how remote memory writes interact with the local memory writes of a given machine, potentially leading to data races and increasing the risk of safety and security bugs.
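The interaction just described can be sketched with a thread-based analogy; this is a hypothetical model for illustration, not the actual RDMA verbs API. A one-sided RDMA WRITE deposits bytes into local memory with no local software on the data path, so the local CPU must synchronise on a completion notification before reading the buffer, precisely the kind of ordering obligation a formal semantics would pin down.

```python
import threading
import time

buffer = {"data": None}
completion = threading.Event()  # stands in for polling the RDMA completion queue

def remote_writer():
    """Models a one-sided RDMA WRITE: the remote NIC deposits bytes directly
    into local memory, with no local software on the data path."""
    time.sleep(0.01)             # network latency
    buffer["data"] = b"payload"  # direct write into local memory
    completion.set()             # completion notification arrives

def local_reader():
    """Reading buffer['data'] without waiting would race with the incoming
    write; blocking on the completion event makes the read well-defined."""
    completion.wait()
    return buffer["data"]

t = threading.Thread(target=remote_writer)
t.start()
result = local_reader()
t.join()
```

Removing the `completion.wait()` call turns the read into a data race with the incoming write, which may observe stale or partial data; real RDMA programs face the same hazard whenever local computation touches memory regions that remote machines can write.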
Project 2025 - 2031
Partners: Batavia Biosciences, ASTRAZENECA UK LIMITED, KCL, AskBio, Syncona, Institute of Clinical Physiology (CBS), University of Pennsylvania, Tel Aviv University, CELL THERAPY CATAPULT LIMITED, University of Bristol
Funder: UK Research and Innovation | Project Code: MR/Z504658/1 | Funder Contribution: 22,345,200 GBP

Cardiovascular disease remains the number one cause of death worldwide, and its biggest contributor is coronary artery disease leading to heart attack and ultimately heart failure. There is no known cure for heart failure other than organ transplantation, which is limited by the shortage of donor hearts and the requirement for lifelong toxic drugs to prevent organ rejection. All current drug-based treatments serve only to assist the surviving heart muscle and blood vessels after a heart attack; they do not reverse disease progression. Thus, there is an urgent unmet need for novel therapeutic approaches to regenerate the injured heart and reverse established heart failure. Here we propose to establish a new MRC-BHF Centre (called RECREATE) to develop advanced therapies for heart attack and heart failure. We will bring together world-leading scientists and clinicians from King's College London and the Universities of Oxford and Edinburgh, together with other academic partners and biotech and pharmaceutical industry representatives, to develop new medicines and deliver them to the injured or failing heart. We will define patients for treatment based on current clinical guidelines, and will develop improved diagnostic measures to refine treatment groups across what is known to be a mixed population of heart attack and heart failure patients. The drugs will be made up of nucleic acids, such as the modified RNA used in COVID-19 vaccines.
Combined with suitable delivery vehicles, these will specifically target the heart, both in models of disease and in patients, thereby maximising the impact of our therapies. Our approach will restore lost heart muscle, increase the number of functioning blood vessels, reduce inflammation, and reduce tissue scarring to maintain or enhance heart function. By leveraging additional support, we will progress therapies to patient clinical trials within the lifetime of the Centre. We will develop delivery methods that are non-invasive, combined with cost-effective treatments, to ensure that our therapies can be used in all primary health care systems, including those in developing countries. The Centre will widely disseminate its progress and key findings to the public and patients alike, and is firmly committed to training, career development and the promotion of early career researchers. We will establish a fully inclusive and diverse research culture, all working towards the common goal of developing novel advanced therapies to treat heart attack and heart failure patients.