Laboratoire de l'accélérateur linéaire

7 Projects, page 1 of 2
  • Funder: French National Research Agency (ANR) Project Code: ANR-17-CE31-0023
    Funder Contribution: 474,829 EUR

    The discovery of the Higgs boson during LHC Run 1 completes the experimental validation of the Standard Model (SM) of high-energy particle physics. Its particle spectrum is fully established, and definite predictions are available for all interactions. At the quantum level, the SM relates the masses of the heaviest particles: the W and Z gauge bosons, the Higgs boson, and the top quark. The Z boson mass has been precisely known since LEP1, and the precision of the top quark mass measurement has vastly improved at the Tevatron and the LHC. In 2014, the ATLAS and CMS collaborations produced a precise measurement of the Higgs boson mass based on the full 7 and 8 TeV datasets; in 2016, ATLAS completed a first measurement of mW, using 7 TeV data only, that matches the precision of the best previous results. The present proposal aims at further improving the measurements of mW, mZ and the weak mixing angle with ATLAS, exploiting all data available at 8 and 13 TeV. Leptonic final states play a particular role, and improving the measurement of electrons and muons is our main focus on the experimental side. A set of dedicated measurements is foreseen to bring our understanding of strong-interaction effects to the required level. Finally, a global analysis of the electroweak parameters is proposed, accounting for correlations of QCD uncertainties across the different measurements and extending traditional electroweak fits. The scientists and institutes involved have recognized expertise and achievements in the fields spanned by this project. The present call provides a unique opportunity to strengthen our collaboration over a timescale matching the needs of our ambition.
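    As an illustration of how the SM ties these masses together, here is a minimal sketch, assuming the textbook tree-level relation in the on-shell scheme (the radiative correction Δr, through which the top and Higgs masses enter, is set to zero; the inputs are standard reference values, not taken from this project):

        import math

        # Tree-level SM relation: mW^2 * (1 - mW^2/mZ^2) = pi*alpha / (sqrt(2)*G_F).
        # Loop corrections would multiply the right-hand side by (1 + Delta_r).
        alpha = 1.0 / 137.035999   # fine-structure constant at q^2 = 0
        G_F   = 1.1663787e-5       # Fermi constant, GeV^-2
        m_Z   = 91.1876            # Z boson mass in GeV, known precisely since LEP1

        A   = math.pi * alpha / (math.sqrt(2.0) * G_F)   # GeV^2
        m_W = math.sqrt(0.5 * m_Z**2 * (1.0 + math.sqrt(1.0 - 4.0 * A / m_Z**2)))
        print(f"tree-level mW = {m_W:.2f} GeV")  # ~80.94 GeV; loops pull it to ~80.4 GeV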

  • Funder: French National Research Agency (ANR) Project Code: ANR-19-CE19-0009
    Funder Contribution: 638,520 EUR

    The aim of the ClearMind project is to develop a monolithic gamma-ray detector (0.5 MeV to a few MeV) with a large surface area (≈ 25 cm²), high efficiency, high spatial accuracy (< 4 mm³ FWHM) and high timing accuracy (< 20 ps FWHM, excluding contributions from the collection and amplification of photoelectrons). Our motivation is to improve the performance of Positron Emission Tomography (PET) scanners. We propose to develop a position-sensitive detector consisting of a scintillating crystal on which a photoelectric layer of refractive index greater than that of the crystal is directly deposited. This "scintronic" crystal, which combines scintillation and photoelectron generation, optimizes the transmission of scintillation photons and Cherenkov photons to the photoelectric layer. We expect a factor-4 gain in the probability of optical photon transmission between the crystal and the photoelectric layer, compared to conventional assemblies using optical contact gels. The crystal will be encapsulated with a micro-channel plate multiplier tube (MCP-MT) in order to amplify the signal and optimize the transit time of the photoelectrons towards the plane of (densely pixelated) detection anodes, and thus the temporal and spatial resolutions of the detection chain. The originality of our detector lies in:
    - Improving the efficiency of light collection in a high-density, high-effective-atomic-number crystal by depositing a photoelectric layer directly on the scintillating crystal.
    - Using the Cherenkov light emission for detection. The gain in optical coupling optimizes the time measurement based on Cherenkov photons, which are inherently very fast.
    - Using the map of photoelectrons produced at the surface of the crystal to reconstruct the properties of the gamma interactions by means of robust statistical estimators and information processing using machine learning algorithms. The scintillation photons provide the statistics for a measurement of the energy deposited in the crystal, modest but compatible with use on a PET imager, and a precise measurement of the coordinates of the gamma-ray interaction position.
    - Fast acquisition of signal shapes (SAMPIC technology), which facilitates the optimization of the detector.
    - An effort to reduce the number of electronic channels (and associated constraints) while keeping optimal performance.
    We propose to develop the ClearMind prototypes in two phases. Phase 1 consists in producing a "thin" detector, ~10 mm, instrumented on one side. The objective is a proof of principle of the technology, the characterization of the performance of this prototype, and its comparison with a Monte Carlo model using the GATE simulation tool. This should allow us to set up all the technologies and gain a concrete understanding of the issues involved. Deadline: 18 months. Phase 2 involves the production of a ~20 mm thick detector, instrumented on both sides. The objective will then be to produce a detector module with optimized efficiency and spatial and temporal resolutions, close to what would be used in future PET machines. Deadline: 30 months. The GATE Monte Carlo simulation will then allow us to assess the potential of the technology to design an enhanced cerebral Time-Of-Flight PET imager (and, alternatively, whole-body TOF-PET). Our discussions with the manufacturers involved in the development of the prototypes resulted in quotations and delivery times compatible with the schedule and budget presented in this project.
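    The expected factor-4 transmission gain can be illustrated with a simple escape-cone estimate. This is a sketch under assumed refractive indices (crystal n ≈ 2.2, coupling gel n ≈ 1.5, photoelectric layer n ≈ 2.3), not the project's optical model:

        import math

        # Fraction of isotropically emitted photons that reach one face within the
        # total-internal-reflection escape cone: f = (1 - cos(theta_c)) / 2,
        # with theta_c = arcsin(n_out / n_crystal) when n_out < n_crystal.
        def escape_fraction(n_crystal, n_out):
            if n_out >= n_crystal:          # no total internal reflection at all
                return 0.5                  # full hemisphere (Fresnel losses ignored)
            theta_c = math.asin(n_out / n_crystal)
            return (1.0 - math.cos(theta_c)) / 2.0

        f_gel   = escape_fraction(2.2, 1.5)   # conventional optical-gel coupling
        f_layer = escape_fraction(2.2, 2.3)   # higher-index photoelectric layer
        print(f"gain ~ {f_layer / f_gel:.1f}")  # ~3.7, of the order of the quoted factor 4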

  • Funder: French National Research Agency (ANR) Project Code: ANR-19-CE31-0025
    Funder Contribution: 248,400 EUR

    Laser-driven plasma accelerators are capable of delivering beams of several hundred MeV over acceleration distances of only a few centimetres. However, despite these impressive results, the shot-to-shot reproducibility of the particle beam produced by this technique remains low compared with that of conventional accelerators. For some of the possible applications of laser-driven plasma accelerators, this can be a severe limitation. The limited reproducibility may come from the regime in which the electrons have been produced in most experiments so far: in the so-called "bubble regime", the plasma excitation is very strong, leading to wave breaking in which some electrons from the plasma are accelerated in a very high gradient. This regime has been very successful in producing and accelerating electrons to high energies (gradients of more than 100 MeV/mm), but it is intrinsically non-linear, and the accelerated electrons come from the plasma itself in a partially uncontrolled way. This may explain the difficulty of obtaining reproducible results from one shot to the next, in terms of electron energy but also energy spread, total beam charge and beam pointing. In this project, we propose to operate a laser-driven plasma accelerator in a different regime where the plasma excitation is weaker. In this regime, the forces in the plasma remain linear or quasi-linear, and consequently the acceleration should be less sensitive to any kind of fluctuation. In addition, there is no wave breaking in this regime, so the electrons have to be injected from an external source such as a photoinjector and an RF accelerator. Using RF accelerators coupled with high-power lasers, both in France and in Germany, we will investigate this scheme with the aim of demonstrating highly stable electron beams accelerated with a high gradient.
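    The gradients at stake follow from the cold wave-breaking field E_wb = m_e·c·ω_p/e of a plasma wave. A minimal sketch for an assumed electron density of 10^18 cm^-3 (the density is illustrative, not taken from the proposal):

        import math

        # Plasma frequency and cold, non-relativistic wave-breaking field:
        # omega_p = sqrt(n_e e^2 / (eps0 m_e)),  E_wb = m_e c omega_p / e.
        e    = 1.602176634e-19     # elementary charge, C
        m_e  = 9.1093837015e-31    # electron mass, kg
        eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
        c    = 2.99792458e8        # speed of light, m/s

        n_e = 1e18 * 1e6           # 1e18 cm^-3 converted to m^-3 (assumed density)
        omega_p = math.sqrt(n_e * e**2 / (eps0 * m_e))
        E_wb = m_e * c * omega_p / e
        print(f"E_wb = {E_wb / 1e9:.0f} GV/m")  # ~96 GV/m, i.e. ~100 MeV gained per mm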

  • Funder: French National Research Agency (ANR) Project Code: ANR-18-CE92-0012
    Funder Contribution: 396,020 EUR

    The aim of the BOLD-PET project is to develop a fast-response, high-efficiency gamma detector with fine-grained spatial resolution for positron emission tomography (PET), based on recent developments using liquid trimethylbismuth (TMBi). This liquid, with a bismuth mass fraction of more than 80%, allows very efficient and accurate detection of the 511 keV photons originating from positron annihilation. Bismuth has the highest nuclear charge (Z = 83) and thus the largest photoelectric cross section of all stable isotopes. The gamma energy of 511 keV is transferred to electrons in this material in nearly 50% of cases through the photoelectric effect. Both the Cherenkov light emitted by the resulting relativistic photoelectron and the secondary charge carriers produced during multiple scattering interactions are detected in a liquid ionisation chamber, supplemented by photodetectors. Based on previous studies of liquid TMBi, we intend to develop and evaluate a novel PET detector with simultaneous detection of Cherenkov light and ionisation in a common effort of four research partners. Excellent imaging resolution is anticipated with the proposed detector by cancelling out depth-of-interaction effects while allowing placement of the detector close to the body, which increases the detector's solid angle. TMBi's coincidence photoelectric efficiency is the highest available, twice that of LSO/LYSO crystals. The new detector should be able to use accurate time-of-flight (TOF) information through Cherenkov light detection in order to improve the contrast of the reconstructed image. To achieve a breakthrough in this challenging project, the expertise of the existing French CaLIPSO group (CEA-IRFU, CNRS-LAL) will be supplemented by expertise in high-resolution PET imaging and detector development (WWU-EIMI group) and in ultra-purification and light and charge detector readout (WWU-PHYSICS group), both from the University of Münster. The main objective of this collaborative project is to develop a novel detector system for PET imaging (e.g. human brain PET, small animal PET) with a projected efficiency of 30%, high spatial precision of 1 mm³, and high time-of-flight resolution of 100 ps (FWHM). In order to achieve these objectives, the project will focus on the following work areas: (1) ultra-purification of TMBi and further characterisation of TMBi for gamma radiation detection, (2) development of an ionisation detector prototype, (3) study of Cherenkov photon detection in liquid TMBi, (4) Monte Carlo simulation and image reconstruction of a full PET scanner employing the new technology, and (5) evaluation of a final PET detection demonstrator merging charge and optical readout.
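    The value of the 100 ps timing target is easy to make concrete: a coincidence timing resolution Δt localizes the annihilation point along the line of response to Δx = c·Δt/2. A minimal sketch of this standard TOF-PET relation:

        c = 2.99792458e8  # speed of light, m/s

        def tof_localization(delta_t_fwhm):
            """Position uncertainty (m, FWHM) along the line of response
            for a coincidence timing resolution delta_t_fwhm (s, FWHM)."""
            return c * delta_t_fwhm / 2.0

        print(f"{tof_localization(100e-12) * 100:.1f} cm")  # 100 ps -> 1.5 cm FWHM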

  • Funder: French National Research Agency (ANR) Project Code: ANR-17-CE23-0018
    Funder Contribution: 498,563 EUR

    Machine learning has inspired new markets and applications by extracting new insights from complex and noisy data. However, to perform such analyses, the most costly step is often to prepare the data. This entails correcting errors and inconsistencies as well as transforming the data into a single matrix-shaped table that comprises all interesting descriptors for all observations to study. Indeed, the data often results from merging multiple sources of information with different conventions. Different data tables may come without names on the columns, with missing data, or with input errors such as typos. As a result, the data cannot be automatically shaped into a matrix for statistical analysis. This proposal aims to drastically reduce the cost of data preparation by integrating it directly into the statistical analysis. Our key insight is that machine learning itself deals well with noise and errors. Hence, we aim to develop the methodology to do statistical analysis directly on the original dirty data. For this, the operations currently performed to clean data before the analysis must be adapted to a statistical framework that captures errors and inconsistencies. Our research agenda is inspired by the data-integration state of the art in database research, combined with statistical modeling and regularization from machine learning. Data integration and cleaning are traditionally performed in databases by finding fuzzy matches or overlaps and applying transformation rules and joins. To incorporate them into the statistical analysis, and thus propagate uncertainties, we want to revisit those logical and set operations with statistical-learning tools. A challenge is to turn the entities present in the data into representations well suited for statistical learning that are robust to potential errors but do not wash out uncertainty. Prior art developed in databases is mostly based on first-order logic and sets. Our project strives to capture errors in the input entries; hence we formulate operations in terms of similarities, as sketched below. We address typing entries, deduplication (finding different forms of the same entity), building joins across dirty tables, and correcting errors and missing data. Our goal is for these steps to be generic enough to digest dirty data directly, without user-defined rules. Indeed, they never try to build a fully clean view of the data, which is very hard, but rather include the errors and ambiguities of the data in the statistical analysis. The methods developed will be empirically evaluated on a variety of datasets, including the French public-data repository, data.gouv.fr. The consortium comprises a company specialized in data integration, Data Publica, that guides business strategies by cross-analyzing public data with market-specific data.
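    As an illustration of similarity-based operations, here is a minimal sketch of deduplication using character 3-gram Jaccard similarity; the entries and the 0.5 threshold are invented for the example and are not the project's actual method:

        def ngrams(s, n=3):
            """Set of character n-grams of a lower-cased string."""
            s = s.lower()
            return {s[i:i + n] for i in range(len(s) - n + 1)}

        def jaccard(a, b, n=3):
            """Jaccard similarity between the n-gram sets of two strings."""
            ga, gb = ngrams(a, n), ngrams(b, n)
            return len(ga & gb) / len(ga | gb) if ga | gb else 0.0

        entries = ["accelerator laboratory", "acceleratr laboratory", "chemistry dept."]
        for i, a in enumerate(entries):
            for b in entries[i + 1:]:
                sim = jaccard(a, b)
                if sim > 0.5:  # illustrative threshold
                    print(f"possible duplicate: {a!r} ~ {b!r} (sim={sim:.2f})")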
