Durham University
5 Projects
Project (from 2024)
Partners: Laboratoire de Physique Subatomique et de Cosmologie, University of Glasgow, UCL, Jagiellonian University, Bogazici University, Laboratoire de Physique Théorique et Hautes Énergies, Durham University, Kyungpook National University
Funder: French National Research Agency (ANR)
Project Code: ANR-23-CHRO-0006
Funder Contribution: 97,498.8 EUR

The Large Hadron Collider (LHC), and other major particle-physics experiments past, present, and future, are vast public investments in fundamental science. However, while the data-analysis and publication mechanisms of such experiments are sufficient for well-defined targets such as the discovery of the Higgs boson in 2012 (and of the W, Z, and gluon bosons before it), they limit the power of the experimental data to explore more subtle phenomena. In the ten years since the Higgs-boson discovery, the LHC has published many analyses testing the limits of the Standard Model (SM), the established but suspected-incomplete central paradigm of particle physics. Each direct-search paper has statistically disproven some simplified models of physics beyond the SM, but such models are no more a priori likely than more complex ones: the latter feature a mixture of the simplified models' new phenomena, each at lower intensity rather than concentrated into a single characteristic. Studying such “dispersed signal” models requires a change in how LHC results are interpreted: the emphasis must shift to combining measurements of many different event types and characteristics into holistic meta-analyses. Only such a global, maximum-information approach can optimally exploit the LHC results. This project will provide a step towards building the infrastructure needed to make this change. It will enable experiments to provide fast, re-runnable versions of their data-analysis logic through enhancements to a domain-specific language and to event-analysis toolkits. It will join up the network of such toolkits with the public repositories of research data and metadata. It will provide common interfaces for controlling preserved analyses across the various toolkits, for statistically combining the thousands of measurements, and for assessing which combinations provide the most powerful scientific statement about any beyond-SM theory. At the start of the LHC's third major data-taking run, the time is ripe to put this machinery and culture in place, so that the LHC legacy is publicly preserved for all to reuse. The project specifically aims to enhance the extent to which public analysis data from particle-physics experiments (in a general sense, but particularly summary results such as those used in publication plots and statistical inference, rather than raw collider events) can be combined and re-used to test theories of new physics. These tests, pursued by theorists and experimentalists alike, also reach beyond particle physics, connecting to astrophysics, cosmology, and nuclear-physics direct searches for dark matter. The value of combining information from different individual analyses was made clear early in the LHC programme, as early experimental data proved crucial for improving models of SM physics.
The huge scientific volume, greater model complexity, and increased precision of the full LHC programme require pursuing this approach in a more systematic and scalable manner, open to the whole community and including the use of reference datasets to ensure validity into the far future. The time is right for this step, as the key technologies (DOI minting and tracking, RESTful web APIs, version-control hosting with continuous integration, containerisation) have matured over the last five or so years. Particle physics already has established open-data and publication repositories, but the crucial link connecting those to scalable preservations of the analysis logic still needs to be made, as does normalising the culture of providing such preservations and engaging with the FAIR principles for open science data. Individual physicists are generally enthusiastic about such ideals, as evidenced by the uptake of open-data policies at particle-physics labs and the preservation of full collider software workflows. But an explicit, funded effort is required to eliminate the technical barriers and make these desirable behaviours more accessible and rewarded.
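The statistical-combination step the abstract describes can be illustrated with a minimal sketch, assuming independent Gaussian-approximated measurements; the observable names, values, and model predictions below are hypothetical, and a real combination would use the full published likelihoods and their correlations rather than this naive chi-square sum.

import math

# Hypothetical published measurements: (observed value, total uncertainty),
# each compared against a theory's prediction for the same observable.
measurements = {
    "cross_section_A": (1.05, 0.08),
    "asymmetry_B":     (0.021, 0.010),
    "branching_C":     (0.66, 0.05),
}

def combined_chi2(predictions):
    """Sum independent chi-square contributions over all measurements.

    Assumes Gaussian uncertainties and no correlations between analyses;
    both assumptions are simplifications for illustration only.
    """
    chi2 = 0.0
    for name, (obs, err) in measurements.items():
        pred = predictions[name]
        chi2 += ((obs - pred) / err) ** 2
    return chi2

# A hypothetical "dispersed signal" model shifts many observables slightly,
# rather than one observable strongly, relative to the SM baseline.
bsm_prediction = {"cross_section_A": 1.10, "asymmetry_B": 0.025, "branching_C": 0.63}
sm_prediction  = {"cross_section_A": 1.00, "asymmetry_B": 0.020, "branching_C": 0.65}

print("chi2(SM)  =", round(combined_chi2(sm_prediction), 2))
print("chi2(BSM) =", round(combined_chi2(bsm_prediction), 2))

The sketch makes the argument concrete: because a dispersed-signal model perturbs many observables only slightly, no single analysis discriminates it from the SM, and only the test statistic summed over all measurements does.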
Project (from 2018)
Partners: FNSP, University of Copenhagen, AU, University of Turku, LG, University of Bamberg, LIfBi, Durham University
Funder: French National Research Agency (ANR)
Project Code: ANR-17-DIAL-0003
Funder Contribution: 279,656 EUR

Project (from 2023)
Partners: Kadir Has University, Institut d'electronique de microélectronique et de nanotechnologie, CNRS, Durham University, USTL, ISEN, EPFZ, UVHC, ENSCL, University of Glasgow, LABORATOIRE D'INTEGRATION DU MATERIAU AU SYSTEME, INSA Hauts-de-France, INSIS
Funder: French National Research Agency (ANR)
Project Code: ANR-22-CHR4-0003
Funder Contribution: 353,057 EUR

Motivation: Health and fitness wearables offer mobile ICT solutions for public wellbeing by providing personal remote monitoring and clinical intervention through telemedicine networks. Thanks to their noninvasive, continuous vital-sign monitoring, wearables have been incorporated into several studies on identifying the onset and progression of Coronavirus infection, and institutions have deployed patient-surveillance networks based on them. However, today's consumer wearables rely on sensing technologies vulnerable to motion artefacts, owing to discontinuous skin contact or insufficient artefact-reduction mechanisms, which prevents them from being a reliable source of vital signs.

Objective: The SNOW project aims to heterogeneously integrate the best options from different disciplines into a complete ICT solution based on a Nano-Opto-Electro-Mechanical System (NOEMS) that is mechanically flexible and energy-efficient. Combining optical and mechano-acoustic sensors in a single platform, and manipulating the light signal via mechanical input and integrated electronics, allows accurate heart-rate and respiration-rate extraction. By combining material and flexible-electronics technologies, the project aims to provide a wearable ICT solution contributing to a decent level of personal and public health. Drawing on the proven expertise of the interdisciplinary consortium, we propose to realise next-generation wearable devices that continuously monitor the user's personal health parameters and provide instant feedback on them.

Implementation: Our hybrid approach compensates for artefacts by using the heart-rate signal from both the optical and the mechano-acoustic sensor. Integrating these sensors with a neuromorphic processor yields strict control over the actively extracted data and creates instant feedback in the case of abnormalities. The energy and data-communication requirements of the proposed mobile sensing unit will be met by a dedicated wireless link providing efficient capacitive coupling to operate the sensors and circuitry, bypassing the need for an additional battery and bulky readout systems. Capacitive coupling with a smartwatch module will also transmit the processed signal back to smart devices such as smartphones, laptops, and the smartwatch itself. The final system-integration work package will employ a heterogeneous-integration methodology to pack these technologies into a wearable form factor suitable for user experience and validation.
Systematic validation of the final wearable prototypes will be carried out to ensure reliable device deployment, and active user experience will be investigated to improve the design and the measurement methodologies.
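The artefact-compensation idea can be sketched minimally as quality-weighted fusion of the two heart-rate channels; the function name, weighting scheme, and numbers below are hypothetical illustrations, not the SNOW design, which integrates the sensors with a neuromorphic processor.

# Hypothetical fusion of heart-rate estimates from an optical channel and a
# mechano-acoustic channel. Each estimate carries a quality index in [0, 1],
# e.g. derived from the signal-to-noise ratio over the same time window.

def fuse_heart_rate(hr_optical, q_optical, hr_mech, q_mech, eps=1e-6):
    """Quality-weighted average of two heart-rate estimates (bpm).

    When motion corrupts one channel its quality index drops, so the
    fused value leans on the cleaner channel.
    """
    w_total = q_optical + q_mech + eps
    return (q_optical * hr_optical + q_mech * hr_mech) / w_total

# During motion the optical channel's quality drops, so the fused estimate
# stays close to the mechano-acoustic reading:
print(fuse_heart_rate(hr_optical=96.0, q_optical=0.2,
                      hr_mech=72.0, q_mech=0.9))  # about 76.4 bpm

The design point this illustrates is redundancy: two sensing modalities with uncorrelated failure modes let the system down-weight whichever channel is momentarily unreliable instead of reporting a corrupted vital sign.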
Project (from 2016)
Partners: University of Toulouse I Capitole, UT, Utrecht University, IRS Leibnitz Institute, Durham University
Funder: French National Research Agency (ANR)
Project Code: ANR-15-ORAR-0007
Funder Contribution: 119,617 EUR

Project (from 2015)
Partners: Roma Tre University, UNISA, LINA/Université de Nantes, MICA/Université Bordeaux 3, P2P Foundation, UCB, ULB, University of Waterloo - Critical Media Lab, CEA, Institute of Networked Cultures, IRI, EPFL, UCSC, Leuphana University, GOLDSMITHS', Technological University Dublin, City University of Hong Kong, Durham University
Funder: French National Research Agency (ANR)
Project Code: ANR-15-MRSE-0022
Funder Contribution: 29,999.8 EUR

The field of the Digital Humanities has developed both in France and globally over the last decade, in line with the penetration of digital technologies into all levels of society and the academic sphere. Coined by Unsworth and Siemens in 2004, the term ‘digital humanities’ was originally intended to signal a shift away from the first phase of humanities computing, which largely involved the digitisation of texts through the Text Encoding Initiative (TEI). Doing ‘digital humanities’ has since come to comprise both the application of computational methods to the study of texts and the application of the critical theoretical methodologies of the contemporary humanities to digital objects. It thus overlaps with media studies, which examines the relationship between technology and the kinds of thought and cultural expression to which different technologies give rise. It has also begun to converge with the emerging discipline of Science and Technology Studies (STS), which analyses the relationship between technological cultures (scientific instruments, institutional practices) and the formation of scientific knowledge. Just as the digital humanities subsumed the earlier field of the computational humanities, we now argue that the time has come for the digital humanities to be superseded and subsumed within the broader field of digital studies. This recognises the way that all fields of knowledge, including not just the humanities but the social and ‘hard’ sciences, and all aspects of social organisation, are reinvented with every change in the technical systems that constitute culture. With the advent of the digital, this reinvention has been creative and traumatic in equal measure. Science has been revolutionised by new techniques that permit the unprecedented sharing of data (the Human Genome Project, the compilation of climate data), but has stumbled over intellectual copyright and the privatisation of knowledge, not to mention the profound ‘flattening’ of scientific expertise that comes with the proliferation of media and the prospect of undifferentiated access to all manner of opinion (Bruno Latour). Similar issues of access are transforming politics and the economy (the replacement of ‘professional’ classes with unpaid, free content: from HuffPo to ‘Uberisation’).
In each of these cases, we are seeing the realignment of existing social structures around an economy of contribution, in which knowledge is produced not by private, proprietorial users who buy and sell information, but by collaborative participants and amateurs who make their data open to, and modifiable by, all. Digital Studies is the field of research that takes this emerging ‘economy of contribution’ as one of its objects and systematically investigates the epistemic and epistemological stakes of this new state of affairs in the field of knowledge. The purpose of this bid is to bring together researchers across institutions in Europe, Asia, and North and South America to build an international Digital Studies Network, focusing on the ways in which society and its institutions are being transformed by digital culture, and by different technologies more generally. In addition to these theoretical and epistemological dimensions, the network will develop open-source technologies to foster the growth of the economy of contribution, as well as new instruments for contributive research. On this basis we aim to develop an ambitious European and international research programme (FET-Exchange) on technologies for contributive categorization, annotation, certification, and editorialization, with the goal of developing a hermeneutic and negentropic conception of the world wide web.
