
Mayor's Office for Policing and Crime

3 Projects
  • Funder: UK Research and Innovation; Project Code: EP/S023356/1
    Funder Contribution: 6,898,910 GBP

    The UK is world-leading in Artificial Intelligence (AI), and a 2017 government report estimated that AI technologies could add £630 billion to the UK economy by 2035. However, there is increasing concern about the potential dangers of AI, and global recognition of the need for safe and trusted AI systems. Indeed, the latest UK Industrial Strategy recognises a shortage of highly skilled individuals in the workforce who can harness AI technologies and realise the full potential of AI. The UKRI Centre for Doctoral Training (CDT) on Safe and Trusted AI will train a new generation of scientists and engineers who are experts in model-based AI approaches and their use in developing AI systems that are safe (meaning we can provide guarantees about their behaviour) and trusted (meaning we can have confidence in the decisions they make and their reasons for making them).

Techniques in AI can be broadly divided into data-driven and model-based. While data-driven techniques (such as machine learning) use data to learn patterns or behaviours, or to make predictions, model-based approaches use explicit models to represent and reason about knowledge. Model-based AI is thus particularly well suited to ensuring safety and trust: models provide a shared vocabulary on which to base understanding; models can be verified, and solutions based on models can be guaranteed to be correct and safe; models can enhance decision-making transparency by providing human-understandable explanations; and models allow users to collaborate and interact with AI systems. In sophisticated applications, the outputs of data-driven AI may be input to further model-based reasoning; for example, a self-driving car might use data-driven techniques to identify a busy roundabout, and then use an explicit model of how people behave on the road to reason about the actions it should take.
While much current attention is focussed on recent advances in data-driven AI, such as those from deep learning, it is crucial that we also develop the UK skills base in complementary model-based approaches, which are needed for the development of safe and trusted AI systems. The scientists and engineers trained by the CDT will be experts in a range of model-based AI techniques, the synergies between them, their use in ensuring safe and trusted AI, and their integration with data-driven approaches. Importantly, because AI is increasingly pervasive in all spheres of human activity, and may increasingly be tied to regulation and legislation, the next generation of AI researchers must not only be experts in core AI technologies but must also be able to consider the wider implications of AI for society, its impact on industry, and the relevance of safe and trusted AI to legislation and regulation. Core technical training will therefore be complemented with the skills and knowledge needed to appreciate the implications of AI (including Social Science, Law and Philosophy) and with exposure to diverse application domains (such as Telecommunications and Security). Students will be trained in responsible research and innovation methods, and will engage with the public throughout their training, to help ensure the societal relevance of their research. Entrepreneurship training will help them maximise the impact of their work, and the CDT will work with a range of industrial partners, from both the private and public sectors, to ensure relevance to industry and application domains and to expose students to multiple perspectives, techniques, applications and challenges.

This CDT is ideally equipped to deliver this vision. King's and Imperial are each renowned for their expertise in model-based AI and together provide one of the largest groupings of model-based AI researchers in the UK, including some of the world's leaders in this area. This is complemented by expertise in related technical areas and in the applications and implications of AI.
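The hybrid pattern the abstract describes, with a data-driven component feeding an explicit, inspectable behaviour model, can be sketched as a toy example. This is purely illustrative and not taken from the CDT's work: the `Perception` class, the `RULES` table and the confidence threshold are all invented for this sketch.

```python
from dataclasses import dataclass

@dataclass
class Perception:
    """Output of a (hypothetical) data-driven classifier."""
    label: str
    confidence: float

# Explicit behaviour model: because the rules are written down, we can
# inspect them, verify properties (e.g. "never proceed below the confidence
# threshold") and explain every decision in human-understandable terms.
RULES = {
    "busy_roundabout": "yield_and_wait",
    "clear_road": "proceed",
}

def decide(p: Perception, min_confidence: float = 0.8) -> tuple[str, str]:
    """Return (action, explanation) by reasoning over the explicit rules."""
    if p.confidence < min_confidence:
        # Defaulting to a cautious action is itself an explicit, checkable rule.
        return ("slow_down", f"confidence {p.confidence:.2f} below threshold")
    action = RULES.get(p.label, "slow_down")
    return (action, f"rule for '{p.label}' selected '{action}'")

action, why = decide(Perception("busy_roundabout", 0.95))
print(action, "-", why)
# prints: yield_and_wait - rule for 'busy_roundabout' selected 'yield_and_wait'
```

The point of the sketch is the division of labour: the learned component only produces a labelled perception, while every action is chosen by rules that can be audited and explained.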

  • Funder: UK Research and Innovation; Project Code: ES/W002248/1
    Funder Contribution: 7,976,110 GBP

    Policing is undergoing rapid transformation. As societies face new and more complex challenges, police workloads increasingly focus on managing risks of harm to vulnerable people. At the same time, public debate voicing concerns about police priorities is rising, driven by questions about what the police do and about legitimacy in the face of discriminatory practices. Dramatic increases in complex cases, coupled with cuts to public services, have resulted in the police frequently acting as 'the service of first resort', at the frontline of responding to urgent social problems such as mental illness, homelessness and exploitation. The presence of such vulnerabilities draws the police into responses alongside other service providers (such as health, social care and housing), often with little clarity of roles, boundaries or shared purpose. Simultaneously, the transformation of data and its uses is beginning to reshape how public services operate, raising new questions about how to work with data in ethical ways to understand and respond to vulnerability. These shifts in police-work are mirrored around the world and pose significant challenges to how policing is undertaken, how the police interact with other public services, and how policing affects vulnerable people who come into contact with services.

The Vulnerability and Policing Futures Research Centre aims to understand how vulnerabilities shape demand for policing and how partner organisations can prevent future harm and vulnerability through integrated public service partnerships. Rooted in rich local data collection and deep dives into specific problems, the Centre will build a knowledge base with applications and implications across the UK and beyond.
It will have significant reach through collaborative work with a range of regional, national and international partners, shaping policy and practice through networks, practitioner exchanges and comparative research, and through training the next generation of scholars to take forward new approaches to vulnerabilities research and co-production with service providers, service receivers and the public. The Centre will be an international focal point for research, policy, practice and public debate. Jointly led by York and Leeds, with expertise from Durham, Lancaster, Liverpool, Manchester, Sheffield, UCL, Monash and Temple universities and the Police Foundation, and working with a network of 38 partners, it will explore fundamental questions regarding the role police and their partners should play in modern society. While focusing policing effort on the most vulnerable holds promise for a fairer society, targeting specific groups raises questions about who counts as vulnerable and has the potential to stigmatise and increase intervention in the lives of marginalised citizens. At a critical time of change for policing, the Centre will ensure that research, including evidence drawing on public opinion and the voices of vulnerable people, is at the heart of these debates. The Centre will undertake three interconnected strands of research. The first focuses on how vulnerability develops in urban areas, drawing together diverse public sector datasets (police, health, social services and education) to understand interactions between agencies and the potential to prevent vulnerabilities. The second explores how police and partners can best collaborate in response to specific vulnerabilities, including exploitation by County Lines drug networks, online child sexual exploitation, domestic abuse, modern slavery, mental illness and homelessness. 
The third will combine research into public opinion with a programme to embed research evidence in policy, practice and public debate. Together, these strands will create a new understanding of vulnerability and transform the capability to prevent harm and future vulnerabilities through integrated partnership working, reshaping the future of policing as a public service.
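The first research strand's idea of drawing together datasets from several services can be illustrated with a minimal sketch. The abstract does not specify the Centre's actual linkage method; the service names, pseudonymous IDs and record fields below are all invented for illustration, assuming records have already been pseudonymised under a shared identifier.

```python
from collections import defaultdict

# Hypothetical per-service records, each keyed by a shared pseudonymous ID.
police = {"p01": {"incidents": 2}, "p02": {"incidents": 1}, "p03": {"incidents": 1}}
health = {"p01": {"mental_health_referral": True}}
housing = {"p02": {"homelessness_application": True}}

def link(datasets: dict[str, dict]) -> dict[str, dict[str, dict]]:
    """Group each person's records by the services that hold them."""
    linked: defaultdict = defaultdict(dict)
    for service, records in datasets.items():
        for person_id, record in records.items():
            linked[person_id][service] = record
    return dict(linked)

linked = link({"police": police, "health": health, "housing": housing})

# People known to more than one service surface possible cross-agency need,
# the kind of interaction between agencies the strand aims to understand.
multi_agency = sorted(pid for pid, services in linked.items() if len(services) > 1)
```

In practice such linkage raises exactly the ethical and stigmatisation questions the abstract highlights, which is why the sketch assumes pseudonymised identifiers rather than names.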

  • Funder: UK Research and Innovation; Project Code: EP/V00784X/1
    Funder Contribution: 14,069,700 GBP

    Public opinion on complex scientific topics can have dramatic effects on industrial sectors (e.g. GM crops, fracking, global warming). To realise the industrial and societal benefits of Autonomous Systems, they must be trustworthy by design and by default, judged both through objective processes of systematic assurance and certification and through the more subjective lens of users, industry and the public. To address this and deliver it across the Trustworthy Autonomous Systems (TAS) programme, the UK Research Hub for TAS (TAS-UK) assembles a team that is world-renowned for research into the socially embedded nature of technologies. TAS-UK will establish a collaborative platform for the UK to deliver world-leading best practices for the design, regulation and operation of 'socially beneficial' autonomous systems which are both trustworthy in principle and trusted in practice by individuals, society and government. TAS-UK will bring together those within the broader landscape of TAS research, including the TAS nodes, to deliver the fundamental scientific principles that underpin TAS; it will provide a focal point for market- and society-led research into TAS; and it will provide a visible and open door for engaging a broad range of end-users, international collaborators and investors. TAS-UK will do this through three key programmes: the Research Programme, the Advocacy & Engagement Programme, and the Skills Programme. The core of the Research Programme is to amplify and shape TAS research and innovation in the UK, building on existing programmes and linking with the seven TAS nodes to deliver a coherent programme that ensures coverage of the fundamental research issues.
The Advocacy & Engagement Programme will create mechanisms for engagement and co-creation with the public, public sector actors, government, the third sector and industry, to help define best practices and assurance processes and to formulate policy. It will also broker cross-sector connections between industry and partners across the nodes. The Skills Programme will create a structured pipeline for future leaders in TAS research and innovation, with new training programmes and openly available resources for broader upskilling and reskilling across the TAS industry.

