Activity detection utilizing multi-modal data
Summary of the entity:
The Centre for Research and Technology Hellas (CERTH) is one of the largest research centres in Greece and the largest in northern Greece. Its mission is to promote the triplet Research – Development – Innovation by conducting high-quality research and developing innovative products and services, while building strong partnerships with industry and strategic collaborations with academia and other research and technology organisations in Greece and abroad.
More than 800 people work at CERTH, the majority of whom are scientists. CERTH has received numerous awards and distinctions and is listed among the top 20 EU research centres with the highest participation in H2020 competitive research grants.
It is active in a large number of application sectors (energy, buildings and construction, health, manufacturing, robotics, (cyber)security, transport, smart cities, space, agri-food, marine and blue growth, water, etc.) and technology areas such as data and visual analytics, data mining, machine and deep learning, virtual and augmented reality, image processing, computer and cognitive vision, human computer interaction, IoT and communication technologies, navigation technologies, cloud and computing technologies, distributed ledger technologies (blockchain), (semantic) interoperability, system integration, mobile and web applications, hardware design and development, smart grid technologies and solutions and social media analysis.
Summary of the challenge:
The goal of this challenge is to detect activities in indoor environments from multi-modal data, such as audio recordings and motion sensor information.
Stakeholder: Company internal stakeholders (employees) and in particular hardware engineers – DATA ANALYSIS
The major challenge for activity monitoring systems is missing or incomplete data collected from deployed hardware. This can result from connectivity issues or even hardware failures, and can lead to incorrect estimations or false alarms.
Data imputation is the obvious course of action for addressing this challenge, although in many cases it can introduce even bigger discrepancies when assessing daily behavior.
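To make the trade-off concrete, the following is a minimal sketch (not part of the challenge specification) of a cautious imputation policy for a binary motion stream: short gaps are forward-filled with the last observed value, while longer gaps are explicitly marked as missing so that downstream analysis does not invent behavior. The function name, the minute-level resolution, and the 5-minute threshold are illustrative assumptions.

```python
from datetime import datetime, timedelta

def impute_motion(readings, max_gap=timedelta(minutes=5)):
    """Forward-fill motion readings across short gaps only.

    readings: list of (timestamp, value) pairs sorted by timestamp,
    where value is 1 (motion) or 0 (no motion). Gaps longer than
    max_gap are left unfilled (a single None marker) so downstream
    logic can treat them as genuinely missing data.
    """
    filled = []
    for prev, cur in zip(readings, readings[1:]):
        filled.append(prev)
        gap = cur[0] - prev[0]
        if timedelta(minutes=1) < gap <= max_gap:
            # Short gap: repeat the last observed value minute by minute.
            t = prev[0] + timedelta(minutes=1)
            while t < cur[0]:
                filled.append((t, prev[1]))
                t += timedelta(minutes=1)
        elif gap > max_gap:
            # Long gap: mark it as missing instead of guessing.
            filled.append((prev[0] + timedelta(minutes=1), None))
    filled.append(readings[-1])
    return filled
```

The deliberate None marker for long outages is what keeps imputation from creating the "bigger discrepancies" mentioned above: an activity classifier can then abstain rather than report fabricated inactivity.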
In recent years, there has been increasing interest in Deep Learning architectures. The main reason is that these architectures can achieve high classification performance directly from raw data, without the need for human-engineered features.
However, when dealing with sensitive data such as audio, one must consider deploying these computationally expensive algorithms on single-board computers. This requires optimization that keeps a fair trade-off between computational cost and classification accuracy, as well as storing the data on the device. In the case of a cloud-based system, one must ensure that Blockchain-as-a-Service is used.
A fused system that could combine data from multiple sources would provide a more accurate way to deal with data loss and false alarms in behavioral monitoring and detection of abnormalities.
In this context, a solution leveraging data from motion sensors together with audio information collected from microphones could prove to be the most precise way of monitoring everyday activity.
Analyzing data from heterogeneous sources with machine learning and advanced analytics techniques can efficiently identify patterns, trends, and abnormalities, increasing the accuracy and quality of recommendations while providing meaningful insights and, ultimately, better decisions for the end-user.
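One common way to realise such a fused system is late fusion: each modality produces its own per-class probabilities, and the final decision is a weighted combination. The sketch below assumes this approach; the function name, the class labels, and the 0.6/0.4 weighting are illustrative, and falling back to a single modality is one simple strategy for handling the data-loss scenario described above.

```python
def fuse_predictions(audio_probs, motion_probs, w_audio=0.6, w_motion=0.4):
    """Weighted late fusion of per-class probability dictionaries.

    Either argument may be None when that modality's data is missing,
    in which case the decision falls back to the available source.
    Returns the most likely activity label.
    """
    if audio_probs is None and motion_probs is None:
        raise ValueError("no modality available")
    if audio_probs is None:
        fused = motion_probs
    elif motion_probs is None:
        fused = audio_probs
    else:
        # Combine scores class by class across both modalities.
        fused = {
            c: w_audio * audio_probs.get(c, 0.0) + w_motion * motion_probs.get(c, 0.0)
            for c in set(audio_probs) | set(motion_probs)
        }
    return max(fused, key=fused.get)
```

Because fusion degrades gracefully to whichever sensor is still reporting, a dropped microphone feed or a failed motion node lowers confidence rather than producing a false alarm.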
REACH Data Providers:
- Audio data (WAV files)
- Motion data:
  - Motion / no motion (1/0) and timestamp
  - Temperature (°C) and timestamp
  - Luminance (lux) and timestamp
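As a hypothetical illustration of ingesting one of these streams, the parser below reads motion records as `<ISO timestamp>,<0|1>` lines. The exact file layout is an assumption on our part; the challenge only states that each motion reading carries a 1/0 flag plus a timestamp.

```python
import csv
import io
from datetime import datetime

def parse_motion_stream(text):
    """Parse '<ISO timestamp>,<0|1>' lines into (datetime, int) pairs.

    Blank lines are skipped. The CSV layout is an illustrative
    assumption, not the challenge's published format.
    """
    records = []
    for row in csv.reader(io.StringIO(text)):
        if not row:
            continue
        records.append((datetime.fromisoformat(row[0]), int(row[1])))
    return records
```

The temperature and luminance streams would follow the same pattern, with a `float` value in place of the binary flag.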
The aim is to create a data value chain that allows:
- Developing an accurate algorithm for recognizing human activities;
- Recognizing more than 90% of the activity types inside the building.
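One way to verify the target above is the sketch below, which reads the 90% figure as per-sample classification accuracy over a labelled test set; that reading, along with the function names, is our assumption (the call could equally mean coverage of distinct activity classes).

```python
def activity_accuracy(predicted, actual):
    """Fraction of predicted activity labels matching the ground truth."""
    if len(predicted) != len(actual) or not actual:
        raise ValueError("need two aligned, non-empty label sequences")
    return sum(p == a for p, a in zip(predicted, actual)) / len(actual)

def meets_target(predicted, actual, target=0.90):
    """True when accuracy strictly exceeds the challenge's 90% goal."""
    return activity_accuracy(predicted, actual) > target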
How do we apply?
Read the Guidelines for Applicants
Doubts or questions? Read more about REACH on the About Us page,
have a look at our FAQ section, or drop us an email at firstname.lastname@example.org.