Blue Bear (United Kingdom)
9 Projects, page 1 of 2
Project (2017-2019)
Partners: University of Exeter, University of Bristol, Blue Bear Systems Research Ltd
Funder: UK Research and Innovation
Project Code: EP/P015409/1
Funder Contribution: 100,982 GBP

Recent reports by the EU and the House of Lords on the civilian use of unmanned aerial vehicles (UAVs) have highlighted examples of UAV applications for civil and commercial purposes, including search and rescue, inspection and filming. Recent media coverage has also highlighted prominent companies such as Amazon, DHL and Google, which are seriously considering, and testing, small UAVs for delivery services. Despite the huge potential impact of small UAVs on civil life and commercial practice, an increase in reported UAV-related incidents (e.g. a collision with a bridge, near collisions with large passenger aircraft, and injury to an athlete) has been a major concern for regulators worldwide, and the commercial use of small UAVs is currently heavily restricted and regulated in the UK. Some incidents were due to operator shortcomings, but they raise serious concerns about the safety of UAVs, especially when one considers large numbers of civil and commercial UAVs flying autonomously over heavily populated areas. There is therefore an urgent need to develop control technologies that compensate for faults and failures and enable the safe operation of autonomous UAVs. Indeed, it is envisaged that small UAVs will adopt advanced, state-of-the-art fault tolerant control schemes before their manned counterparts, partly because of their versatility and the fundamental need to ensure safety in civil and commercial applications.

This project will therefore: (1) help improve the safety, resilience and survivability of small multirotor unmanned aerial vehicles in the event of in-flight faults and failures, and (2) bridge the gap between the theory and application of sliding mode control, encouraging its adoption in industry, particularly aerospace. Flight control systems will be developed for a small, highly redundant UAV for commercial and civil applications. A resilient control system, typically known as fault tolerant control (FTC), will be built on sliding mode control (SMC) schemes. The fault tolerant schemes will first be simulated with realistic faults/failures using a simulation tool developed in the project, followed by hardware implementation and rigorous evaluation on a highly redundant UAV. An important aspect of this proposal is the partnership with Bristol Robotics Laboratory (BRL) and Blue Bear Systems Research Ltd (BBSR). Driven by industrial challenges and supported by BRL and BBSR, a rigorous assessment and evaluation campaign will demonstrate the maturation of the control schemes developed during the project. By demonstrating an increase in technology readiness level (TRL), the project will promote the adoption of these technologies in industry.
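As a rough illustration of the sliding-mode idea behind the project's fault tolerant schemes, the sketch below (a minimal example, not the project's controller) closes an altitude loop on a one-axis multirotor model and rides through a sudden loss of thrust effectiveness; the mass, gains, fault size and thrust limits are all assumptions chosen for the demonstration.

```python
import numpy as np

# Minimal sliding mode altitude controller for a multirotor (illustrative).
# The switching term keeps the state near the sliding surface even when
# actuator effectiveness drops, which is the essence of its fault tolerance.

m, g, dt = 1.2, 9.81, 0.002        # mass [kg], gravity [m/s^2], step [s]
lam, k, eta = 3.0, 2.0, 8.0        # surface slope, reaching gain, switching gain
z, vz, z_ref = 0.0, 0.0, 1.0       # state and step reference [m]

zs = []
for i in range(5000):              # 10 s of simulated flight
    t = i * dt
    e, de = z - z_ref, vz          # tracking error and its rate
    s = de + lam * e               # sliding surface: s = e_dot + lambda * e
    # Equivalent control plus a smoothed switching term (tanh limits chattering)
    u = m * (g - lam * de) - k * s - eta * np.tanh(s / 0.05)
    u = np.clip(u, 0.0, 25.0)      # thrust limits [N]
    eff = 1.0 if t < 5.0 else 0.6  # 40% loss of thrust effectiveness at t = 5 s
    vz += (eff * u / m - g) * dt   # vertical dynamics
    z += vz * dt
    zs.append(z)

print(f"altitude just before fault: {zs[2499]:.2f} m, at end: {zs[-1]:.2f} m")
```

Plain SMC of this kind leaves a small steady offset inside the tanh boundary layer once the fault hits; a full FTC design would add fault estimation and control allocation across the redundant rotors, which this sketch deliberately omits.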
Project (2015-2021)
Partners: The University of Manchester, University of Salford, BAE Systems (United Kingdom), BAE Systems (Sweden), Blue Bear Systems Research Ltd, TRW Conekt (TRW Automotive), Defence Science and Technology Laboratory (Dstl)
Funder: UK Research and Innovation
Project Code: EP/M019284/1
Funder Contribution: 858,323 GBP

Autonomous robots, capable of independent and intelligent navigation through unknown environments, have the potential to significantly increase human safety and security. They could replace people in potentially hazardous tasks, for instance search and rescue operations in disaster zones, or surveys of nuclear/chemical installations. Vision is one of the primary senses that can enable this capability; however, visual information processing is notoriously difficult, especially at the speeds required for fast-moving robots, and in particular where low weight, power dissipation and system cost are of concern. Conventional hardware and algorithms are not up to the task. The proposal here is to tightly integrate novel sensing and processing hardware with vision, navigation and control algorithms, to enable the next generation of autonomous robots.

At the heart of the system will be a device known as a 'vision chip'. This bespoke integrated circuit differs from a conventional image sensor in that it includes a processor within each pixel, offering unprecedented performance. The massively parallel processor array will be programmed to pre-process images, passing higher-level feature information upstream to vision tracking algorithms and the control system. Feature extraction at pixel level results in extremely efficient, high-speed throughput of information. The new vision chip will also measure 'time of flight' data in each pixel, allowing the distance to a feature to be extracted and combined with the image-plane data for vision tracking, simplifying and speeding up real-time state estimation and mapping. Vision algorithms will be developed to make optimal use of this novel hardware technology.

This project will not only develop a unique vision processing system but also tightly integrate the control system design. Vision and control systems have traditionally been developed independently, with a downstream flow of information from sensor through to motor control. In our system, information flow will be bidirectional: control system parameters will be passed to the image sensor itself, guiding computational effort and reducing processing overheads. For example, a rotational demand passed into the control system will result not only in control actuation for vehicle movement but also in optical tracking along the same path. A key component of the project will therefore be the management and control of information across all three layers: sensing, visual perception and control. Information sharing will occur at multiple rates and may be either scheduled or requested. Shared information and distributed computation will provide a breakthrough in control capabilities for highly agile robotic systems.
Whilst applicable to a very wide range of disciplines, our system will be tested in the demanding field of autonomous aerial robotics. We will integrate the new vision sensors onboard an unmanned air vehicle (UAV), developing a control system that fully exploits the new tracking capabilities. This will serve as a demonstration platform for the complete vision system, incorporating nonlinear algorithms to control the vehicle through agile manoeuvres and rapidly changing trajectories. Although specific vision tracking and control algorithms will be used for the project, the hardware and system architecture will be applicable to a very wide range of tasks: any application currently limited by tracking capabilities, particularly when combined with a rapid, demanding control challenge, would benefit from this work. We will demonstrate a step change in agile, vision-based control of UAVs for exploration, and in doing so develop an architecture with benefits in fields as diverse as medical robotics and industrial production.
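To make the pixel-parallel idea concrete, the sketch below emulates the two features of the proposed vision chip described above: every pixel applies the same local operation simultaneously, and only sparse features, each tagged with a time-of-flight depth, leave the sensor. NumPy array operations stand in for the processor-per-pixel array; the synthetic scene, threshold and numbers are assumptions, not the device's actual programming model.

```python
import numpy as np

# Emulation of pixel-parallel preprocessing: each "pixel processor" computes
# a local gradient from its 4 neighbours, and only pixels above threshold
# report off-chip as (row, col, depth) features, cutting readout bandwidth.

rng = np.random.default_rng(0)
H, W = 64, 64
intensity = rng.normal(0.2, 0.05, (H, W))      # noisy background
intensity[20:40, 20:40] += 0.8                 # a bright square "object"
depth = np.full((H, W), 5.0)                   # per-pixel time-of-flight channel
depth[20:40, 20:40] = 1.5                      # object is 1.5 m away

# Central-difference gradients: each pixel only talks to its neighbours,
# mirroring the local connectivity of a processor-per-pixel array.
gx = np.zeros_like(intensity)
gy = np.zeros_like(intensity)
gx[:, 1:-1] = intensity[:, 2:] - intensity[:, :-2]
gy[1:-1, :] = intensity[2:, :] - intensity[:-2, :]
grad = np.hypot(gx, gy)

# Sparse feature list instead of a full frame readout.
ys, xs = np.nonzero(grad > 0.5)
features = np.stack([ys, xs, depth[ys, xs]], axis=1)
print(f"{features.shape[0]} features from {H*W} pixels "
      f"({100 * features.shape[0] / (H * W):.1f}% of the frame)")
```

Only a few percent of the frame leaves the "chip" in this toy case, which is the bandwidth argument for doing feature extraction at pixel level before the tracking and control layers ever see the data.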
Project (2022-2024)
Partners: University of Bristol, University of Salford, Virginia Polytechnic Institute and State University (Virginia Tech), The Noise Abatement Society, Nesta, DronePrep, Blue Bear Systems Research Ltd
Funder: UK Research and Innovation
Project Code: EP/V031848/1
Funder Contribution: 285,465 GBP

There is currently a risk that drones are taking to the air without sufficient consideration of their noise impact on public health and wellbeing. Government and industry agree that drone technologies will lead to a significant business opportunity, and drones are expected to support the efficient provision of public services, delivering substantial societal benefits. But there is a very real barrier to making this a reality: none of it can happen if noise issues are not addressed at the design, operation and policy levels.

The pandemic crisis has served to propel the use of drones to deliver food and medicines. It is now more certain that drone technologies will be widely adopted in the near future for a range of applications, from parcel delivery to transport of people. These applications are set to grow thanks to the EC U-space and UK Future Flight initiatives, which are creating a clear framework for a market in drone services. However, the noise of hundreds of drones flying around will certainly lead to conflicts with communities. To date, there is no comprehensive understanding of how drone noise is perceived, or of what can be done to operate drones without affecting public health and wellbeing. Noise is already a serious issue: as reported by the European Environment Agency, environmental noise causes approximately 16,600 cases of premature death in Europe each year, with almost 32 million adults suffering annoyance and over 13 million adults suffering sleep disturbance.

Assessing noise perception of drones, and developing actions to mitigate their impact on communities, is challenging due to their unconventional sound signatures and operating procedures. Standard measures of sound power (proposed in EU Regulation 2019/945) are inadequate to characterise this. But it is also an opportunity to innovate in the way transportation noise issues are dealt with. In this project, I will develop models to predict human response to drone noise. Integrated into the design cycle, these noise perception models will allow noise issues to be anticipated early in the design process. This approach will avoid costly and inefficient ad hoc corrections at later stages, and will therefore go beyond the traditional approach to aircraft noise assessment. I will investigate how context influences drone noise perception: people will not perceive a drone delivering a parcel to their neighbours in the same way as a drone providing medical supplies. Furthermore, I will investigate noise annoyance and audibility for a comprehensive set of drone operating conditions, to define acceptable noise characteristics for drone operations. The outcomes of my project will inform how and where to fly drones to minimise impact on existing soundscapes.
The work in my project will be connected to industry design, policy making and organisations lobbying for noise abatement, through a steering group with the main drone stakeholders. I will develop a toolkit to help manufacturers reduce the noise impact of their vehicles. Developing quiet technologies will give the UK drone industry, which has over 700 entities, an edge in a highly competitive market, both domestic and overseas. I will also write a policy brief to inform regulations for operating drones with less impact on people's health and wellbeing; regulations for quiet drone operations would allow greater usage for the benefit of people in the UK. The outcomes of my project are planned to have direct impact in the small-to-medium-size drone market, and to set the foundations for potential future impact in drones for transport of people. In summary, my work will address the noise issues related to the design and operation of drones, to help drone stakeholders ensure community acceptance and to contribute to the sustainable expansion of the sector. This will help maintain the UK's world-leading position in drone research and development.
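As a small worked example of why a single sound-power figure can mislead for drones, the sketch below computes the overall A-weighted level of a synthetic rotor spectrum dominated by blade-passing-frequency tones. The A-weighting curve is the standard IEC 61672 definition; the rotor parameters and tone levels are invented for illustration, and A-weighted level is itself widely reported to understate annoyance from strongly tonal sources, which is exactly the gap perception models aim to fill.

```python
import numpy as np

# Overall A-weighted level of a synthetic "drone-like" tonal spectrum.

def a_weighting_db(f):
    """A-weighting gain in dB at frequency f (Hz), per IEC 61672."""
    f = np.asarray(f, dtype=float)
    ra = (12194.0**2 * f**4) / (
        (f**2 + 20.6**2)
        * np.sqrt((f**2 + 107.7**2) * (f**2 + 737.9**2))
        * (f**2 + 12194.0**2)
    )
    return 20.0 * np.log10(ra) + 2.0   # +2.0 dB normalises to 0 dB at 1 kHz

# Blade-passing frequency for an assumed 5400 rpm, 2-blade rotor:
# 5400 / 60 * 2 = 180 Hz, plus four harmonics.
bpf = 180.0
tones_hz = bpf * np.arange(1, 6)
tones_spl = np.array([70.0, 66.0, 62.0, 58.0, 54.0])   # assumed unweighted SPLs [dB]

# Weight each tone, then sum energetically for the overall level.
weighted = tones_spl + a_weighting_db(tones_hz)
overall_dba = 10.0 * np.log10(np.sum(10.0 ** (weighted / 10.0)))
print(f"overall level: {overall_dba:.1f} dB(A)")
```

Two spectra can share this single number yet sound very different: the tonal, amplitude-modulated character of rotor noise is precisely what such an aggregate metric discards.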
Project (2018)
Partners: University of Liverpool, Brunel University London, Bristol Robotics Laboratory, Federal University of Pernambuco, Adelard, Intel Corporation, D-RisQ Ltd, Liverpool Data Research Associates (LDRA), ESC (Engineering Safety Consultants Ltd), Verified Systems International GmbH, Blue Bear Systems Research Ltd
Funder: UK Research and Innovation
Project Code: EP/R025134/1
Funder Contribution: 610,059 GBP

Mobile and autonomous robots have an increasingly important role in industry and wider society; from driverless vehicles to home assistance, potential applications are numerous. The UK government has identified robotics as a key technology that will lead us to future economic growth (tinyurl.com/q8bhcy7). It has recognised, however, that autonomous robots are complex and typically operate in ever-changing environments (tinyurl.com/o2u2ts7). How can we be confident that they perform useful functions, as required, and are safe?

It is standard practice to use testing to check correctness and safety. Software-development practice for robotics typically includes testing within simulations, before robots are built, followed by testing of the actual robots. Simulations have several benefits: we can test early, and test execution is cheaper and faster; simulation does not, for example, require a robot to move physically. Testing with the real robots is still needed, however, since we cannot be sure that a simulation captures all the important aspects of the hardware and environment.

At present, test generation is typically manual, which makes testing expensive and unreliable, and introduces delays. Manual test generation is error-prone and can lead to tests that produce the wrong verdict. If a test incorrectly states that the robot has a failure, developers have to investigate, at extra cost and time; if a test incorrectly states that the robot behaves as expected, a faulty system may be released. Without a systematic approach, tests may also describe infeasible environments, and such tests cannot be used with the real robot. To make matters worse, manual test generation limits the number of tests produced. All this affects the cost and quality of robot software, and is in contrast with current practice in other safety-critical areas, like the highly regulated transport industry. Translation of technology, however, is not trivial. For example, the lack of a driver to correct mistakes or respond to unforeseen circumstances leads to a much larger set of working conditions for an autonomous vehicle. Another example is provided by probabilistic algorithms, which make robot behaviour nondeterministic, and so difficult to repeat in testing and harder to characterise as correct or not.

We will address all these issues with novel automated test-generation techniques for mobile and autonomous robots. To use our techniques, a RoboTest tester constructs a model of the robot using a familiar notation already employed in the design of simulations and implementations.
After that, instead of spending time designing simulation scenarios, the RoboTest tester generates tests at the push of a button. With RoboTest, testing is cheaper, since it takes less time, and more effective, because many more tests can be used, especially in simulation. To execute the tests, the tester can choose from several simulators employing a variety of programming approaches; execution also follows the push of a button, and yet another button translates simulation tests to deployment tests. The tester can therefore trace results from the deployment tests back to the simulation and the original model, and so is in a strong position to understand the reality gap between the simulation and the real world. The tester knows that the verdicts for the tests are correct and understands what the testing achieves; for example, it can be guaranteed to find faults of an identified class. The tester can thus answer the very difficult question: have we tested enough? In conclusion, RoboTest will move the testing of mobile and autonomous robots onto a sound footing, making testing more efficient and effective in terms of person effort, and so achieving longer-term reduced costs.
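The push-button workflow described above is model-based testing: the model acts as the test oracle, and generation is mechanical. As a toy sketch of the idea (the docking model and API are illustrative assumptions, not RoboTest's actual notation or algorithms), the code below derives one test per transition of a small state-machine model and takes verdicts from the model's expected outputs.

```python
# Model-based test generation at toy scale: tests and verdicts both come
# from a state-machine model, never written by hand.

MODEL = {  # (state, input) -> (next_state, expected_output)
    ("idle",     "start"):   ("search",   "motors_on"),
    ("search",   "found"):   ("approach", "align"),
    ("search",   "abort"):   ("idle",     "motors_off"),
    ("approach", "contact"): ("docked",   "latch"),
    ("approach", "abort"):   ("idle",     "motors_off"),
}

def generate_tests(model, initial="idle"):
    """One test per transition: drive to its source state, then fire it."""
    # Shortest input sequence reaching each state (BFS over the model).
    reach, frontier = {initial: []}, [initial]
    while frontier:
        s = frontier.pop(0)
        for (src, inp), (dst, _) in model.items():
            if src == s and dst not in reach:
                reach[dst] = reach[s] + [inp]
                frontier.append(dst)
    return [reach[src] + [inp] for (src, inp) in model]

def run_test(model, impl, inputs, initial="idle"):
    """Apply inputs to the implementation; verdict comes from the model."""
    state = initial
    for inp in inputs:
        state, expected = model[(state, inp)]
        if impl(inp) != expected:
            return "FAIL"
    return "PASS"

# An implementation is any callable input -> output; replaying the model
# itself gives a correct implementation, so every generated test passes.
class Impl:
    def __init__(self): self.state = "idle"
    def __call__(self, inp):
        self.state, out = MODEL[(self.state, inp)]
        return out

for seq in generate_tests(MODEL):
    print(seq, run_test(MODEL, Impl(), seq))
```

Scaled up, the same principle lets coverage be stated precisely (here, every transition is exercised), which is what underpins claims such as being guaranteed to find faults of an identified class.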
Project (2018-2024)
Partners: University of Sheffield, University of Liverpool, Bristol Robotics Laboratory, Federal University of Pernambuco, Adelard, Intel Corporation, D-RisQ Ltd, Liverpool Data Research Associates (LDRA), ESC (Engineering Safety Consultants Ltd), Verified Systems International GmbH, Blue Bear Systems Research Ltd
Funder: UK Research and Innovation
Project Code: EP/R025134/2
Funder Contribution: 575,876 GBP

The project summary for this record is identical to that of EP/R025134/1 above.