Institut des Sciences du Mouvement
10 Projects
Project, from 2016. Partners: Institut des Sciences du Mouvement. Funder: French National Research Agency (ANR). Project Code: ANR-16-CE10-0003. Funder Contribution: 288,777 EUR.
The sense of touch captures the mechanical interaction with our surroundings. When we come into contact with an object, the mechanical deformation of our skin and the resistance opposed to our limb inform the central nervous system about the contact conditions. From there, information such as the weight, center of gravity, texture, and slipperiness of the object is extracted. These tactile percepts guide the planning of motor commands, enabling the mesmerizing dexterity of the human hand. Despite the obvious benefit of relying on touch, current commercial robotic systems are rarely equipped with tactile sensors that capture relevant information about the tactile scene; instead, they mostly depend on vision systems to perform tasks. Several reasons motivate this technological preference. First, unlike cameras, robotic fingers and artificial skins with sufficient resolution and robustness are not yet broadly available to researchers and industry. Second, even the best artificial sensing systems lack a framework for processing and recognizing the tactile scene. The few studies that have tackled these issues are often rooted in image processing and tend to neglect frictional and adhesion properties, which are essential for swift control of robotic hands and for surface texture characterization. This research program aims to overcome the current limitations of soft-sensor design and computational touch in order to bring the sense of touch to a wide variety of robotic applications. The sensor will leverage recent advances in soft-material construction to build an artificial fingertip that matches the perceptual capability and mechanical strength of its human counterpart. Data provided by the sensors will be used to infer the state of contact via a physically motivated computational framework based on tribology and contact mechanics. Research on touch is still in its infancy compared to visual and auditory perception; building a physically grounded framework around artificial touch has the same innovative potential as computer vision had 30 years ago.
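The contact-inference step the abstract describes can be illustrated with a classical result from contact mechanics. The following sketch (hypothetical function name, thresholds, and friction coefficient; not the project's actual framework) flags incipient slip when the measured tangential load approaches the Coulomb friction cone boundary |F_t| = mu * F_n:

```python
import numpy as np

def detect_slip(f_normal, f_tangential, mu_static=0.8):
    """Flag incipient slip using the Coulomb friction cone.

    A contact is on the verge of slipping when the tangential load
    approaches the cone boundary |f_t| = mu * f_n. `mu_static` is a
    hypothetical static friction coefficient.
    """
    f_normal = np.asarray(f_normal, dtype=float)
    f_tangential = np.asarray(f_tangential, dtype=float)
    # Ratio of tangential load to the maximum sustainable tangential
    # force; values close to 1 indicate incipient slip.
    with np.errstate(divide="ignore", invalid="ignore"):
        margin = np.abs(f_tangential) / (mu_static * f_normal)
    return margin >= 1.0, margin

# Example: a grip where the tangential load grows while the normal
# force stays constant (e.g., the held object getting heavier).
f_n = np.array([2.0, 2.0, 2.0, 2.0])   # newtons
f_t = np.array([0.5, 1.0, 1.5, 1.8])   # newtons
slipping, margin = detect_slip(f_n, f_t)
print(slipping)  # [False False False  True]
print(margin)    # [0.3125 0.625  0.9375 1.125 ]
```

In a real controller, such a margin would feed a grip-force regulator that increases the normal force before the contact actually slips.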
Project, from 2024. Partners: Joint Robotics Laboratory, Institut des Sciences du Mouvement. Funder: French National Research Agency (ANR). Project Code: ANR-24-CE33-0218. Funder Contribution: 449,088 EUR.
The aim of this project is to develop a new strategy for the automatic return to base of a humanoid robot. This strategy will be based on low-resolution panoramic vision rather than on GNSS localization or resource-hungry algorithms such as SLAM. We are currently developing new neural models inspired by ant navigation. In addition to using low-resolution images, these models are particularly frugal, since they compress on the fly the visual memory of images taken continuously along the way. These models are already robust enough to guide a mobile robot along a learned path. The aim is to show that they can also guide humanoid robots, without GNSS or SLAM, through a navigation task in a complex environment, such as climbing stairs over a relatively long distance, i.e. a circuit of at least 20 meters.
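As an illustration of snapshot-based visual homing of the kind ant-navigation models build on, here is the classic rotational image difference method (a generic textbook technique, not the project's neural model; all names and sizes are hypothetical):

```python
import numpy as np

def best_heading(snapshot, current):
    """Estimate a steering direction by rotational image matching.

    Both inputs are low-resolution panoramic images of shape
    (rows, columns), with columns spanning 360 degrees of azimuth.
    The current view is rotated column by column; the rotation that
    minimizes the pixel-wise difference to the stored snapshot gives
    the heading to steer toward.
    """
    n_cols = snapshot.shape[1]
    diffs = np.empty(n_cols)
    for shift in range(n_cols):
        rotated = np.roll(current, shift, axis=1)
        diffs[shift] = np.mean((rotated - snapshot) ** 2)
    best_shift = int(np.argmin(diffs))
    # Convert the column shift to an azimuth in degrees.
    return best_shift * 360.0 / n_cols, diffs

# Toy example: the current view is the snapshot rotated by 4 columns.
rng = np.random.default_rng(0)
snap = rng.random((8, 36))        # 8x36 panorama, 10 degrees/column
view = np.roll(snap, -4, axis=1)  # the agent has turned by 40 degrees
heading, _ = best_heading(snap, view)
print(heading)                    # 40.0
```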
Project, from 2014. Partners: Institut National de Recherche en Informatique et en Automatique (Inria), Institut des Sciences du Mouvement. Funder: French National Research Agency (ANR). Project Code: ANR-14-CE24-0009. Funder Contribution: 455,664 EUR.
Touch-based interactions with computing systems are greatly affected by two interrelated factors: the transfer functions applied to finger movements, and latency. Little is actually known about these functions, and latency has only recently received attention in this context. This project aims to transform the design of touch transfer functions from black art to science in order to support high-performance interactions. We will precisely characterize the functions used and the latency observed in current touch systems. We will develop a testbed environment to support multidisciplinary research on touch transfer functions, and use it to design latency reduction and compensation techniques as well as new transfer functions. Three partners with expertise in Human-Computer Interaction, Control Theory, and Human Movement Science will collaborate on this project: the MINT and NON-A teams from Inria Lille, and the "Perceptual-motor behavior group" from the Institute of Movement Sciences.
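A touch transfer function of the kind the project studies maps finger displacement to pointer displacement, typically with a velocity-dependent gain. A minimal sketch, with purely illustrative constants (these are not the functions the project characterizes):

```python
def transfer_gain(speed_mm_s, g_min=1.0, g_max=3.0,
                  v_low=20.0, v_high=200.0):
    """Hypothetical velocity-dependent gain for a touch transfer function.

    Slow finger motion gets a low gain (precision); fast motion gets a
    high gain (reach). The gain is interpolated linearly between the
    two speed thresholds.
    """
    if speed_mm_s <= v_low:
        return g_min
    if speed_mm_s >= v_high:
        return g_max
    t = (speed_mm_s - v_low) / (v_high - v_low)
    return g_min + t * (g_max - g_min)

def apply_transfer(dx_mm, dy_mm, dt_s):
    """Map a finger displacement sampled over dt_s to a pointer displacement."""
    speed = (dx_mm ** 2 + dy_mm ** 2) ** 0.5 / dt_s
    g = transfer_gain(speed)
    return g * dx_mm, g * dy_mm

# A 1 mm movement in 8 ms (125 mm/s) falls between the two thresholds.
print(apply_transfer(1.0, 0.0, 0.008))  # (~2.17, 0.0)
```

Latency matters here because the displacement samples the function acts on are already stale; compensation techniques typically extrapolate the finger's trajectory over the measured delay.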
Project, from 2016. Partners: LIP6, Institut des Sciences du Mouvement. Funder: French National Research Agency (ANR). Project Code: ANR-16-CE33-0002. Funder Contribution: 287,787 EUR.
The IOTA project aims to establish a novel scientific instrument for biomedical applications that will extend an operator's physical interaction abilities into a Petri dish, in direct contact with living cells and similar samples. The system combines laser manipulation techniques and a tactile force-feedback user interface into a complete robotic system that will allow the operator to directly probe and manipulate microscale biochemical samples, ranging from large molecules and cells to tissues. Such a system addresses the needs of fields such as biology, chemistry, and the medical sciences. Its design therefore aims to hide the complexity of the robotic parts from the operator and to provide an easy-to-use, intuitive, and familiar interface based on the widespread touchscreen, augmented with tactile feedback. However, the amount of information required to transmit such interactions to the operator is well beyond the capabilities of current devices. The system proposed here relies on two novel developments: the first concerns the closed-loop force and position control of engineered micro-structures in liquid solution by means of optical traps, to be used as tools and probes; the second concerns the user interface, and will provide the means to render co-located visual cues and high-fidelity haptic information through the modulation of friction forces on a glass plate.
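The first development, closed-loop position control through an optical trap, can be sketched by modeling the trap as a linear spring acting on an overdamped particle. Everything below (stiffness, drag, gain, time step) is a hypothetical illustration, not the project's controller:

```python
# Minimal sketch: proportional position control of an optically
# trapped microbead. The trap exerts a spring-like force
# F = -k * (x_bead - x_trap) on an overdamped particle.
k_trap = 1e-6   # trap stiffness, N/m (hypothetical)
gamma = 1e-8    # viscous drag coefficient, N*s/m (hypothetical)
dt = 1e-4       # control period, s
kp = 2.0        # proportional gain on the trap offset

x_bead = 0.0    # measured bead position, m
target = 2e-6   # desired bead position: 2 micrometers away

for _ in range(2000):
    error = target - x_bead
    # Proportional control: offset the laser focus beyond the bead
    # in proportion to the remaining error.
    x_trap = x_bead + kp * error
    # Overdamped dynamics: velocity = trap force / drag.
    force = -k_trap * (x_bead - x_trap)
    x_bead += (force / gamma) * dt

print(f"bead settled at {x_bead * 1e6:.3f} um (target 2.000 um)")
```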
Project, from 2014. Partners: École Polytechnique, Laboratoire de Mécanique et d'Acoustique, Institut des Sciences du Mouvement, PSA. Funder: French National Research Agency (ANR). Project Code: ANR-14-CE24-0018. Funder Contribution: 798,969 EUR.
Musicians adapt their gestures according to the sounds they produce on their instrument, but would they be able to learn such expert gestures without this coherent auditory feedback? This question, which illustrates the relation between movements and sounds, reveals the potential of sonification as a tool to learn and guide silent gestures. Indeed, coupling sound to gesture should enable precise and efficient control of previously unreferenced dynamic gestures, thanks to the high temporal and spatial precision that sounds can offer.
The SoniMove project focuses on the contribution of calibrated sounds to the control of functional gestures attached to new Human-Machine Interfaces (HMIs) for cars and of expert gestures specific to sport and music. It builds on the natural tendency of our cognitive system to identify the sources that produce the sounds we hear and to precisely identify the dynamic relations (in time and space) between sounds and movements. In addition, the project aims to reduce the cognitive load on the visual modality by providing information through sounds. This is of particular interest for new in-car HMIs, since the visual modality should in that case be devoted to driving for safety reasons. The project thus has the ambition to deliver important industrial and societal innovations by focusing on the dynamic and interactive information that only sounds can offer, making it possible to radically reconsider the interaction between digital audio technologies, new HMIs, and expert human movements (sport and music).
Although geared toward industrial and societal applications, the SoniMove project also raises a number of fundamental questions about the influence of sounds on human beings. More precisely, it asks how a fine manipulation of sounds, based on invariant sound morphologies that transmit specific information, can not only inform but also guide or modify human motor behavior in a given cognitive context. To answer these fundamental questions and adapt them to industrial applications, we will adopt a theoretical viewpoint in line with recent paradigms in cognitive neuroscience based on the properties of the perception-action loop, which we will evaluate in interactive protocols (enactive loop) that take multimodal integration processes (sound, vision, movement) into account.
The SoniMove project is organized along three intimately linked tasks, addressing respectively: (1) fundamental questions linked to sound morphologies and 3D auditory immersion; (2) fundamental questions linked to sound/movement relations and the adjustment of motor behavior; and (3) their industrial and societal applications, geared to the sonification of new Human-Machine Interfaces and to the learning of expert gestures for sport and music through sounds.
To address all these questions, the SoniMove project unites the necessary expertise in the analysis and synthesis of sounds, sonification, sound perception, cognitive science, movement science, and human-machine interfaces within an interdisciplinary environment that favors the development of enactive and multimodal approaches. It is founded on a collaboration between three partners: two academic laboratories with expert knowledge in acoustics (Laboratory of Mechanics and Acoustics, LMA, Marseille) and movement science (Institute of Movement Science, ISM, Marseille), and an industrial partner strongly involved in innovative research and development (Peugeot-Citroën Automobiles, PSA).
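The core sonification idea, mapping a gesture parameter onto a sound parameter in real time, can be sketched as a generic parameter-mapping sonification (illustrative mapping ranges; not SoniMove's calibrated sound morphologies):

```python
import numpy as np

def sonify_speed(speed_profile, frame_rate=100, sample_rate=44100,
                 f_low=220.0, f_high=880.0):
    """Parameter-mapping sonification: map gesture speed to pitch.

    `speed_profile` holds normalized speeds in [0, 1], one per motion
    frame. Each frame is rendered as a short sine segment whose
    frequency rises with speed; the phase is carried across frames so
    the tone stays continuous.
    """
    samples_per_frame = sample_rate // frame_rate
    phase = 0.0
    out = []
    for s in np.clip(speed_profile, 0.0, 1.0):
        freq = f_low + s * (f_high - f_low)  # linear pitch mapping
        t = np.arange(samples_per_frame) / sample_rate
        out.append(np.sin(phase + 2 * np.pi * freq * t))
        phase += 2 * np.pi * freq * samples_per_frame / sample_rate
    return np.concatenate(out)

# A gesture that accelerates then decelerates over one second.
speeds = np.sin(np.linspace(0, np.pi, 100))  # 100 motion frames
audio = sonify_speed(speeds)                 # one second of audio
print(audio.shape)                           # (44100,)
```

Hearing the pitch rise and fall with movement speed gives the performer an immediate, eyes-free cue about the gesture's dynamics, which is the feedback loop the project seeks to exploit.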