Show simple item record

dc.contributor.author: Stampfler, Tristan
dc.contributor.author: Elgendi, Mohamed
dc.contributor.author: Fletcher, Richard Ribon
dc.contributor.author: Menon, Carlo
dc.date.accessioned: 2023-04-03T12:16:43Z
dc.date.available: 2023-04-03T03:18:10Z
dc.date.available: 2023-04-03T12:16:43Z
dc.date.issued: 2023-02-28
dc.identifier.issn: 2296-2565
dc.identifier.other: 10.3389/fpubh.2023.1086671
dc.identifier.uri: http://hdl.handle.net/20.500.11850/606179
dc.identifier.doi: 10.3929/ethz-b-000606179
dc.description.abstract: The emerging field of digital phenotyping leverages the numerous sensors embedded in a smartphone to better understand the user's current psychological state and behavior, enabling improved health-support systems for patients. A common task in this work is to use the smartphone accelerometer to automatically recognize or classify the user's behavior, known as human activity recognition (HAR). In this article, we present a unified deep learning approach based on a ResNet architecture, evaluated on the popular UniMiB-SHAR public dataset, which contains 11,771 measurement segments from 30 users aged 18 to 60 years. Our approach consistently exceeds the state-of-the-art accuracy and F1-score across all classification tasks and evaluation methods reported in the literature. The most notable gain concerns the leave-one-subject-out evaluation, regarded as the most rigorous evaluation method, where we raise the state-of-the-art accuracy from 78.24% to 80.09% and the F1-score from 78.40% to 79.36%. These results rely on deep learning techniques such as hyper-parameter tuning, label smoothing, and dropout, which regularized the ResNet training and reduced overfitting. We discuss how our approach could be adapted to perform HAR in real time and outline future research directions.
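Two of the ideas named in the abstract, label smoothing and leave-one-subject-out (LOSO) evaluation, can be illustrated with a minimal pure-Python sketch. The function names `smooth_labels` and `loso_splits` are illustrative choices, not code from the paper:

```python
def smooth_labels(one_hot, epsilon=0.1):
    # Label smoothing: soften hard 0/1 targets toward a uniform distribution.
    # Each target becomes y*(1-epsilon) + epsilon/K for K classes, which
    # discourages over-confident predictions and regularizes training.
    k = len(one_hot)
    return [y * (1.0 - epsilon) + epsilon / k for y in one_hot]

def loso_splits(subject_ids):
    # Leave-one-subject-out: each unique subject is held out as the test set
    # exactly once; segments from all remaining subjects form the training
    # set. This prevents segments from the same person appearing in both
    # splits, which is why LOSO is considered the most rigorous evaluation.
    for held_out in sorted(set(subject_ids)):
        train = [i for i, s in enumerate(subject_ids) if s != held_out]
        test = [i for i, s in enumerate(subject_ids) if s == held_out]
        yield held_out, train, test

# Usage: smoothed targets for a 4-class problem, and LOSO folds over
# per-segment subject labels (hypothetical toy data).
targets = smooth_labels([1, 0, 0, 0], epsilon=0.1)
folds = list(loso_splits([1, 1, 2, 3]))
```

With 30 subjects, as in UniMiB-SHAR, `loso_splits` would yield 30 train/test folds, and the reported accuracy is typically the average over all folds.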
dc.format: application/pdf
dc.language.iso: en
dc.publisher: Frontiers Media
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: digital health
dc.subject: deep learning
dc.subject: data science
dc.subject: public health
dc.subject: smartphone
dc.subject: activity recognition
dc.subject: physical activity
dc.subject: wearable technology
dc.title: The use of deep learning for smartphone-based human activity recognition
dc.type: Journal Article
dc.rights.license: Creative Commons Attribution 4.0 International
ethz.journal.title: Frontiers in Public Health
ethz.journal.volume: 11
ethz.journal.abbreviated: Front. Public Health
ethz.pages.start: 1086671
ethz.size: 11 p.
ethz.version.deposit: publishedVersion
ethz.identifier.wos:
ethz.publication.place: Lausanne
ethz.publication.status: published
ethz.leitzahl: ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02070 - Dep. Gesundheitswiss. und Technologie / Dep. of Health Sciences and Technology::09715 - Menon, Carlo / Menon, Carlo
ethz.leitzahl.certified: ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02070 - Dep. Gesundheitswiss. und Technologie / Dep. of Health Sciences and Technology::09715 - Menon, Carlo / Menon, Carlo
ethz.date.deposited: 2023-04-03T03:18:18Z
ethz.source: WOS
ethz.eth: yes
ethz.availability: Open access
ethz.rosetta.installDate: 2023-04-03T12:16:45Z
ethz.rosetta.lastUpdated: 2024-02-02T21:29:44Z
ethz.rosetta.versionExported: true
ethz.COinS: ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=The%20use%20of%20deep%20learning%20for%20smartphone-based%20human%20activity%20recognition&rft.jtitle=Frontiers%20in%20Public%20Health&rft.date=2023-02-28&rft.volume=11&rft.spage=1086671&rft.issn=2296-2565&rft.au=Stampfler,%20Tristan&Elgendi,%20Mohamed&Fletcher,%20Richard%20Ribon&Menon,%20Carlo&rft.genre=article&rft_id=info:doi/10.3389/fpubh.2023.1086671&