An assistive environment for improving human safety utilizing advanced sound and motion data classification
This paper presents the concept and an initial implementation of an assistive awareness system for interpreting human activity and recognizing emergencies, such as falls of elderly people or patients and spoken expressions of distress, thereby improving their safety. Awareness is achieved by collecting, analyzing, and classifying motion and sound data. These data are collected by body-worn sensors equipped with accelerometers and microphones, which wirelessly transmit movement and sound data to a monitoring unit. The detection of fall incidents is shown to be feasible by applying the Short-Time Fourier Transform (STFT) and spectrogram analysis to the recorded sounds. The sound and movement data are classified using a variety of advanced classification techniques. Evaluation results compare the performance of the evaluated classifiers and indicate the high accuracy and effectiveness of the proposed implementation. The system architecture is open and can easily be extended to include patient awareness based on additional context (e.g., physiological data).
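The STFT and spectrogram analysis mentioned above can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; it assumes a hypothetical synthetic audio clip and standard SciPy routines to show how a spectrogram (the squared magnitude of the STFT) is computed from a sound signal, which could then serve as input to a sound classifier.

```python
import numpy as np
from scipy.signal import stft

# Hypothetical 1-second audio clip at 8 kHz: a decaying 440 Hz tone
# standing in for a recorded sound event (assumption for illustration).
fs = 8000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 440 * t) * np.exp(-5 * t)

# Short-Time Fourier Transform: slide a window over the signal and
# take an FFT of each frame (256-sample windows, 50% overlap here).
freqs, frame_times, Z = stft(signal, fs=fs, nperseg=256, noverlap=128)

# Spectrogram = squared magnitude of the STFT coefficients.
spectrogram = np.abs(Z) ** 2

# Result is a (frequency bins x time frames) matrix; with nperseg=256
# there are 129 frequency bins.
print(spectrogram.shape)
```

Features extracted from such a time-frequency matrix (e.g., energy per band over time) are a common basis for distinguishing impact sounds, such as falls, from speech or background noise.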