AGH University of Science and Technology
8 Projects, page 1 of 2
Project (2019 - 2021)
Partners: AGH University of Science and Technology; University of Science and Technology AGH - Faculty of Applied Mathematics
Funder: Austrian Science Fund (FWF)
Project Code: J 4276
Funder Contribution: 153,280 EUR
Project (from 2024)
Partners: NTNU, AGH University of Science and Technology, XLIM, University of Cambridge
Funder: French National Research Agency (ANR)
Project Code: ANR-24-MRS1-0005
Funder Contribution: 35,998.8 EUR

In a digital world saturated with images and videos, QualitEye aims to push the boundaries of visual quality assessment by creating a European doctoral network dedicated to this challenge. With a consortium of renowned partners, QualitEye is committed to training a new generation of highly qualified, interdisciplinary researchers capable of tackling the complex problems involved in evaluating the quality of visual content. The network will provide a unique platform for early-stage researchers, fostering international and intersectoral exchanges. By collaborating with experts from various countries and sectors, QualitEye's doctoral students will benefit from a wealth of approaches and expertise that enriches their research. International mobility and secondments in industrial environments will be encouraged, allowing doctoral students to spend time at other partner institutions or companies across Europe.

In a world where visual content is ubiquitous, the applications of QualitEye are vast and varied. The project aims to stimulate innovation in areas such as healthcare, education, entertainment, and virtual reality. By developing advanced methodologies to quantify and control the quality of visual content, QualitEye seeks to enhance user experiences, offer new ways of learning and entertainment, and open new perspectives in key areas such as healthcare, culture, and education. On the technical side, QualitEye will focus on creating precise models and metrics, implementing sophisticated perceptual models, and exploring deep learning for finer-grained evaluation, with the goal of ensuring quality throughout the lifecycle of visual content while keeping energy consumption under control. By also integrating industrial partners, the project is committed to addressing the needs and challenges of the professional world: doctoral students will work on concrete projects in collaboration with companies, facilitating the transfer of knowledge between academia and industry.
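As a rough illustration of what a full-reference quality metric looks like, the sketch below computes PSNR, the simplest objective metric; the perceptual and learning-based metrics QualitEye targets would go well beyond this baseline. The code is a minimal, self-contained example and not part of the project.

```python
# Minimal sketch: PSNR as a baseline full-reference quality metric.
# QualitEye targets perceptual and learning-based metrics beyond this.
import numpy as np

def psnr(reference: np.ndarray, distorted: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio between a reference image and a distorted copy."""
    mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

# Quality drops as noise is added to a synthetic 8-bit image.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noisy = np.clip(ref.astype(np.float64) + rng.normal(0, 10, ref.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(ref, noisy):.2f} dB")
```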
Project (from 2024)
Partners: OUH, L2S, CS, AGH University of Science and Technology, UPF
Funder: French National Research Agency (ANR)
Project Code: ANR-23-CHR4-0005
Funder Contribution: 195,844 EUR

1. Project summary
In the coming years we will see the advent of many new applications and use-cases such as the metaverse, the adoption of XR/VR, holographic telepresence, the Internet of the Senses, and the consolidation of the Internet of Things, with autonomous robots, fully automated industries and manufacturing plants, as well as smart infrastructures and environments, to mention just a few. To satisfy their strict requirements in terms of throughput, latency, reliability, connectivity, and power consumption, wireless networks, and their radio interface in particular, are becoming exceedingly complex, with a plethora of advanced communication features, protocols and parameters, usually with nonlinear dependencies between them. To deal with such complexity, Artificial Intelligence and Machine Learning (AI/ML) techniques, with their general ability to handle complexity, are the necessary performance enabler for next-generation wireless networks. In this project we aim to build a new, clean-slate AI/ML-Driven Radio (MLDR) interface. The MLDR interface will learn to communicate by selecting and configuring the set of communication protocols and functionalities that best suits each particular use-case and scenario, thereby satisfying the aforementioned hard performance requirements and efficiently using the available spectrum resources. While the project proposal is groundbreaking in terms of focus and goals, we will follow a standard research approach to reach the stated objectives: we will move from use-cases, concepts/specifications and design to implementation, evaluation and analysis. The consortium includes four partners, all working at the intersection of wireless networks and AI/ML, with complementary expertise. During the MLDR design and evaluation process we will generate new knowledge in the form of new ideas, theories, practical solutions, ML algorithms, and disruptive communication functions. We expect the results of this project to guide the design of future AI/ML-driven wireless communications and networks, becoming a reference to follow and compare against.

2. Relevance to the call
The proposal meets the topic addressed in the call and is fully in line with the following five target outcomes: (i) design of AI-enhanced techniques for resource optimisation in RANs (MLDR will make decisions directly related to resource optimisation); (ii) implementation of ML in physical-layer signal processing (a key component of MLDR); (iii) development of AI-enhanced techniques for MIMO processing and beamforming (MLDR will embrace any available hardware functionality, such as multiple antennas, to provide performance gains); (iv) design of improved Cognitive Radio Networks (MLDR is itself a self-learning, cognitive interface); and (v) design of use-cases that take advantage of these technologies (the definition of appropriate use-cases is crucial to justify deploying MLDR). Additionally, we are partially aligned with the following two outcomes: (i) development of hardware and/or software techniques to improve energy efficiency in wireless networks (energy efficiency is not a direct goal of this project, but resource optimisation will lead to energy efficiency); and (ii) generation and assurance of reliable training data for ML (we will openly provide our training data, but its generation is not our direct goal).
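The project summary does not commit to a specific learning algorithm. Purely as an illustration of the core idea of a radio interface that learns to select its own configuration, the sketch below uses an epsilon-greedy bandit over a few hypothetical modulation-and-coding choices with a simulated throughput reward; all names and numbers are assumptions for illustration only, not the project's actual design.

```python
# Minimal sketch (not the project's design): an epsilon-greedy bandit that
# learns which radio configuration yields the best observed throughput.
# The configuration list and the reward simulation are illustrative assumptions.
import random

CONFIGS = ["QPSK_r1/2", "16QAM_r3/4", "64QAM_r5/6"]  # hypothetical MCS choices

def simulated_throughput(config: str) -> float:
    """Stand-in for a channel measurement; a real system would observe this."""
    base = {"QPSK_r1/2": 10.0, "16QAM_r3/4": 22.0, "64QAM_r5/6": 18.0}[config]
    return random.gauss(base, 3.0)

def select_config(rounds: int = 1000, epsilon: float = 0.1) -> str:
    """Learn the best configuration by balancing exploration and exploitation."""
    totals = {c: 0.0 for c in CONFIGS}
    counts = {c: 0 for c in CONFIGS}
    for _ in range(rounds):
        if random.random() < epsilon or not all(counts.values()):
            choice = random.choice(CONFIGS)  # explore
        else:
            choice = max(CONFIGS, key=lambda c: totals[c] / counts[c])  # exploit
        totals[choice] += simulated_throughput(choice)
        counts[choice] += 1
    return max(CONFIGS, key=lambda c: totals[c] / max(counts[c], 1))

print("Learned best configuration:", select_config())
```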
Project (from 2015)
Partners: Université d'Avignon et des Pays de Vaucluse, AGH University of Science and Technology, LIA, DEUSTO, LORIA, University of Avignon
Funder: French National Research Agency (ANR)
Project Code: ANR-15-CHR2-0001
Funder Contribution: 341,786 EUR

With the growth of information in different media, such as TV programs or the internet, a new issue arises: how can a user access information expressed in a foreign language? The idea of the project is to develop a multilingual comprehension-support system that requires no human intervention. We would like to help people understand broadcast news presented in a foreign language and compare it to the corresponding coverage available in their mother tongue. Understanding is approached in this project as giving access to any information regardless of the language in which it is presented. With the development of the internet and satellite TV, tens of thousands of shows and news broadcasts are available in different languages, yet even highly educated people rarely speak more than two or three languages, while the majority speak only one, which makes this huge amount of information inaccessible. Consequently, the majority of TV and radio programs, as well as information on the internet, are inaccessible to most people. And yet one would like to listen to the news in one's own language and compare it to what has been said on the same topic in another language. For instance, how is the topic of AIDS presented in Saudi Arabia and in the USA? What is the opinion of The Jerusalem Post about Yasser Arafat, and how is he presented in Al-Quds? To give access to varied, and sometimes opposing, information, we propose to develop AMIS (Access to Multilingual Information and Opinions). AMIS will thus make it possible to see another side of the story of an event. The understanding process is taken here to be the comprehension of the main ideas of a video; the best way to achieve this is to summarize the video, giving access to the essential information. AMIS will therefore focus on the most relevant information by summarizing it and, if necessary, translating it for the user. Another aspect of AMIS is to compare two summaries produced by the system from two languages on the same topic, whatever their medium (video, audio or text), and to present the differences between their contents in terms of information, sentiments, opinions, etc. Furthermore, coverage on the web and social media will be exploited in order to strengthen or weaken the retrieved opinions. AMIS could be incorporated into a TV remote control or provided as software associated with any internet browser. In conclusion, AMIS will address the following research points:
• Text, audio and video summarization
• Automatic Speech Recognition (ASR)
• Machine Translation
• Cross-lingual sentiment analysis
• Achieving successful synergy between the previous research topics
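The research points above suggest a processing chain of the form ASR, then summarization, then translation, then cross-lingual comparison. The sketch below is a hypothetical skeleton of such a chain, not AMIS's actual implementation: the ASR and MT components are placeholder stubs, and only the naive word-frequency extractive summarizer is functional.

```python
# Hypothetical sketch of an AMIS-style chain: transcribe -> summarize -> translate -> compare.
# ASR and MT are placeholder stubs; only the naive extractive summarizer works.
import re
from collections import Counter

def transcribe(audio_path: str) -> str:
    """Placeholder for an ASR component (hypothetical stub)."""
    raise NotImplementedError("plug in a speech recognizer here")

def translate(text: str, target_lang: str) -> str:
    """Placeholder for a machine-translation component (hypothetical stub)."""
    raise NotImplementedError("plug in a translation system here")

def summarize(text: str, n_sentences: int = 3) -> str:
    """Naive extractive summary: keep the sentences richest in frequent words."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"\w+", text.lower()))
    scored = sorted(sentences,
                    key=lambda s: -sum(freq[w] for w in re.findall(r"\w+", s.lower())))
    top = set(scored[:n_sentences])
    return " ".join(s for s in sentences if s in top)

# Usage sketch: summarize two articles on the same topic, translate one, then
# compare the summaries for differences in facts and opinions.
# summary_en = summarize(article_en)
# summary_ar = translate(summarize(article_ar), target_lang="en")
```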
Project (from 2012)
Partners: COMMISSARIAT A L'ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES - CENTRE D'ETUDES NUCLEAIRES SACLAY, AGH University of Science and Technology, LETI, University Hospital Würzburg, University of Bonn, KTB Tumorforschungs GmbH, Research Division ProQinase
Funder: French National Research Agency (ANR)
Project Code: ANR-11-ERNA-0004
Funder Contribution: 188,858 EUR