M3 Interoperability for Remote Rehabilitation with Kinect
Natalia Díaz Rodríguez, Stefan Grönroos, Franck Wickström, Petteri Karvinen, Anders Berg, Shohreh Hosseinzadeh, Marion Karppi, Johan Lilius, M3 Interoperability for Remote Rehabilitation with Kinect. In: Juha-Pekka Soininen, Sergey Balandin, Johan Lilius, Petri Liuha, Tullio Salmon Cinotti (Eds.), Open International M3 Semantic Interoperability Workshop, 21, 153–163, TUCS Lecture Notes, 2013.
Abstract:
The ageing of the population and the technological innovations surrounding it are common aims of many current projects in eHealth and Ambient Assisted Living (AAL). The role of Åbo Akademi in the Health and Wellbeing action line of the EIT ICT Labs project consists of developing an architectural component for the Active Healthy Ageing (AHA) Platform, as well as technologies for the detection of in-home activities. As a demonstrator, we feed the platform data from rehabilitation exercises captured with a Kinect sensor, which can be used for rehabilitation or working out at home. The central component of the AHA Platform is the Personal Health Labs (PHL) data store provided by Philips, together with other vertical activities that supply information on, among others, stress, heart rate and sleep quality. Using an M3 semantic storage box, we provide an RDF store for ontology-based knowledge representation and publish/subscribe-based rules, running on a (low-power) Atom board. The platform provides the SSAP (Smart Space Application Protocol) and SPARQL protocols, and a RESTful interface is currently under development [1] to ease integration with the PHL relational store.
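The publish/subscribe mechanism described above can be illustrated with a minimal, pure-Python sketch of a triple store that notifies subscribers when matching triples are inserted, in the spirit of an M3 semantic information broker. All class and method names here are illustrative, not the real SSAP or SIB API.

```python
# Minimal sketch of an RDF-style triple store with publish/subscribe,
# mimicking how an M3 broker could notify subscribers when a matching
# triple is inserted. Names are illustrative, not the real SSAP API.
class TripleStore:
    def __init__(self):
        self.triples = set()
        self.subscriptions = []  # list of (pattern, callback) pairs

    @staticmethod
    def _matches(pattern, triple):
        # None in the pattern acts as a wildcard, like a SPARQL variable.
        return all(p is None or p == t for p, t in zip(pattern, triple))

    def subscribe(self, pattern, callback):
        self.subscriptions.append((pattern, callback))

    def insert(self, triple):
        self.triples.add(triple)
        for pattern, callback in self.subscriptions:
            if self._matches(pattern, triple):
                callback(triple)

    def query(self, pattern):
        return [t for t in self.triples if self._matches(pattern, t)]


store = TripleStore()
notified = []
# Subscribe to any triple describing an exercise in a hypothetical session.
store.subscribe(("session1", "aha:exercise", None), notified.append)
store.insert(("session1", "aha:exercise", "sit-stand"))
print(notified)  # [('session1', 'aha:exercise', 'sit-stand')]
```

A real M3 client would express the subscription pattern in SPARQL or as an SSAP triple template; the wildcard matching above is the same idea in miniature.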
To provide the AHA Platform with sensor data interoperability, we created an AHA platform ontology, as well as a Security and Privacy ontology (to be completed together with Philips, Fraunhofer and INRIA Grenoble within the AmiQoLT Innovation Factory and the DFKI UI toolbox). A Kinect-based application for remote rehabilitation monitoring and activity recognition was developed using the Kinect for Windows SDK (C#). The Rehab@Home application covers two aspects of health care and well-being, activity monitoring and activity feedback, integrated into the everyday lives of (possibly but not exclusively) senior citizens living independently. The application monitors exercise sessions for patients in rehabilitation after shoulder, hip or knee surgery, or a simple sit-stand exercise [2]. The final aim is to let patients perform the sessions at home while giving feedback on the quality and frequency of the exercise, remotely and in real time, both to the patients themselves and to the physiotherapist. The current software allows recording new movement patterns from different users performing the exercises, so that the system learns to recognize them (see Fig. 1, center).
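The kind of geometric quantity such monitoring relies on can be sketched as follows. This is not the actual Rehab@Home code (which uses the Kinect for Windows SDK in C#); it is a hypothetical Python illustration of computing a joint angle from three 3-D skeleton joint positions, the basic measurement behind checking whether a knee or shoulder exercise is performed through its intended range of motion.

```python
# Illustrative sketch: estimate the angle at a skeleton joint from the
# 3-D positions of three joints, as a Kinect skeleton stream provides.
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) between segments b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# Hypothetical hip, knee and ankle positions mid sit-stand exercise.
hip, knee, ankle = (0.0, 1.0, 0.0), (0.0, 0.5, 0.1), (0.0, 0.0, 0.0)
print(round(joint_angle(hip, knee, ankle)))  # 157
```

Comparing such angles over time against a recorded reference pattern is one simple way a system can judge repetition count and exercise quality.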
In the future we plan to integrate sensor data from the PHL and M3 stores to capture context-aware, long-term evolution and changes. We are also extending the platform to allow fuzzy rules to tackle imprecision, vagueness and uncertainty in knowledge representation.
BibTeX entry:
@INPROCEEDINGS{inpDxGrWiKaBeHoKaLi13a,
title = {M3 Interoperability for Remote Rehabilitation with Kinect},
booktitle = {Open International M3 Semantic Interoperability Workshop},
author = {Díaz Rodríguez, Natalia and Grönroos, Stefan and Wickström, Franck and Karvinen, Petteri and Berg, Anders and Hosseinzadeh, Shohreh and Karppi, Marion and Lilius, Johan},
volume = {21},
editor = {Soininen, Juha-Pekka and Balandin, Sergey and Lilius, Johan and Liuha, Petri and Cinotti, Tullio Salmon},
publisher = {TUCS Lecture Notes},
pages = {153--163},
year = {2013},
keywords = {interoperability,Kinect,rehabilitation,physical exercise},
}
Belongs to TUCS Research Unit(s): Embedded Systems Laboratory (ESLAB)