Human activity recognition with inertial sensors using a deep learning approach

Tahmina Zebin, Patricia J. Scully, Krikor B. Ozanyan

Research output: Chapter in Book or Conference Publication/Proceeding › Conference Publication › peer-review

128 Citations (Scopus)

Abstract

Our focus in this research is on the use of deep learning approaches for human activity recognition (HAR), in which the inputs are multichannel time series signals acquired from a set of body-worn inertial sensors and the outputs are predefined human activities. Here, we present a feature learning method that deploys convolutional neural networks (CNNs) to automate feature learning from the raw inputs in a systematic way. The influence of important hyper-parameters, such as the number of convolutional layers and the kernel size, on the performance of the CNN was monitored. Experimental results indicate that the CNN achieved a significant speed-up in computing and deciding the final class, and a marginal improvement in overall classification accuracy, compared to baseline models such as Support Vector Machines and multi-layer perceptron networks.
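The paper itself does not include code; as an illustrative sketch of the core operation the abstract describes (convolutional filters applied directly to raw multichannel inertial signals to learn features), the following minimal NumPy implementation of a valid 1D convolution over a multichannel window may help. The channel count (6: tri-axial accelerometer plus gyroscope), window length (128 samples), filter count, and kernel size are illustrative assumptions, not values from the paper.

```python
import numpy as np

def conv1d_multichannel(x, kernels, stride=1):
    """Valid 1D convolution over a multichannel time-series window.

    x:       (channels, timesteps) raw sensor window
    kernels: (filters, channels, kernel_size) filter bank
    returns: (filters, output_timesteps) feature map
    """
    n_filters, n_channels, k = kernels.shape
    _, t = x.shape
    out_t = (t - k) // stride + 1
    out = np.zeros((n_filters, out_t))
    for f in range(n_filters):
        for i in range(out_t):
            start = i * stride
            # each filter spans all input channels at once
            out[f, i] = np.sum(x[:, start:start + k] * kernels[f])
    return out

# Hypothetical 6-channel window (3-axis accel + 3-axis gyro), 128 samples
rng = np.random.default_rng(0)
x = rng.standard_normal((6, 128))
kernels = rng.standard_normal((8, 6, 5))  # 8 filters, kernel size 5
features = conv1d_multichannel(x, kernels)
print(features.shape)  # (8, 124)
```

In a trained CNN the kernels would be learned by backpropagation and the feature maps passed through a nonlinearity and pooling before classification; this sketch only shows why the kernel size and the number of such convolutional layers are the hyper-parameters the paper varies.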

Original language: English
Title of host publication: IEEE Sensors, SENSORS 2016 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781479982875
DOIs
Publication status: Published - 5 Jan 2016
Externally published: Yes
Event: 15th IEEE Sensors Conference, SENSORS 2016 - Orlando, United States
Duration: 30 Oct 2016 - 2 Nov 2016

Publication series

Name: Proceedings of IEEE Sensors
Volume: 0
ISSN (Print): 1930-0395
ISSN (Electronic): 2168-9229

Conference

Conference: 15th IEEE Sensors Conference, SENSORS 2016
Country/Territory: United States
City: Orlando
Period: 30/10/16 - 2/11/16

Keywords

  • Convolution
  • Convolutional Neural Networks (CNN)
  • Feature Extraction
  • Human activity recognition (HAR)
  • Signal Processing
