Automated areas of interest analysis for usability studies of tangible screen-based user interfaces using mobile eye tracking
Field | Value | Language |
---|---|---|
dc.contributor.author | Batliner, Martin | |
dc.contributor.author | Hess, Stephan | |
dc.contributor.author | Ehrlich-Adám, C. | |
dc.contributor.author | Lohmeyer, Quentin | |
dc.contributor.author | Meboldt, Mirko | |
dc.date.accessioned | 2020-12-22T13:43:23Z | |
dc.date.available | 2020-10-20T07:50:41Z | |
dc.date.available | 2020-10-20T11:44:14Z | |
dc.date.available | 2020-12-22T13:43:23Z | |
dc.date.issued | 2020-11 | |
dc.identifier.issn | 0890-0604 | |
dc.identifier.issn | 1469-1760 | |
dc.identifier.other | 10.1017/s0890060420000372 | en_US |
dc.identifier.uri | http://hdl.handle.net/20.500.11850/446758 | |
dc.description.abstract | The user's gaze can provide important information for human–machine interaction, but the manual analysis of gaze data is extremely time-consuming, inhibiting wide adoption in usability studies. Existing methods for automated areas of interest (AOI) analysis cannot be applied to tangible products with a screen-based user interface (UI), which have become ubiquitous in everyday life. The objective of this paper is to present and evaluate a method to automatically map the user's gaze to dynamic AOIs on tangible screen-based UIs based on computer vision and deep learning. This paper presents an algorithm for automated Dynamic AOI Mapping (aDAM), which allows the automated mapping of gaze data recorded with mobile eye tracking to predefined AOIs on tangible screen-based UIs. The evaluation of the algorithm is performed using two medical devices, which represent two extreme examples of tangible screen-based UIs. The different elements of aDAM are examined for accuracy and robustness, as well as for the time saved compared to manual mapping. The break-even point for an analyst's effort with aDAM compared to manual analysis is found to be 8.9 min of gaze data. The accuracy and robustness of both the automated gaze mapping and the screen matching indicate that aDAM can be applied to a wide range of products. aDAM allows, for the first time, automated AOI analysis of tangible screen-based UIs with AOIs that dynamically change over time. The algorithm requires some additional initial input for setup and training, but thereafter the analysis effort is determined only by computation time and requires no further manual work. The efficiency of the approach has the potential to enable broader adoption of mobile eye tracking in usability testing for the development of new products and may contribute to a more data-driven usability engineering process in the future. | en_US |
dc.language.iso | en | en_US |
dc.publisher | Cambridge University Press | en_US |
dc.subject | Computer vision | en_US |
dc.subject | convolutional neural networks | en_US |
dc.subject | mobile eye tracking | en_US |
dc.subject | usability testing | en_US |
dc.title | Automated areas of interest analysis for usability studies of tangible screen-based user interfaces using mobile eye tracking | en_US |
dc.type | Journal Article | |
dc.date.published | 2020-09-11 | |
ethz.journal.title | Artificial intelligence for engineering design, analysis and manufacturing | |
ethz.journal.volume | 34 | en_US |
ethz.journal.issue | 4 | en_US |
ethz.pages.start | 505 | en_US |
ethz.pages.end | 514 | en_US |
ethz.identifier.wos | | |
ethz.identifier.scopus | | |
ethz.publication.place | Cambridge | en_US |
ethz.publication.status | published | en_US |
ethz.leitzahl | ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02130 - Dep. Maschinenbau und Verfahrenstechnik / Dep. of Mechanical and Process Eng.::02665 - Inst. f. Design, Mat. und Fabrikation::03943 - Meboldt, Mirko / Meboldt, Mirko | en_US |
ethz.leitzahl.certified | ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02130 - Dep. Maschinenbau und Verfahrenstechnik / Dep. of Mechanical and Process Eng.::02665 - Inst. f. Design, Mat. und Fabrikation::03943 - Meboldt, Mirko / Meboldt, Mirko | en_US |
ethz.date.deposited | 2020-10-20T07:50:51Z | |
ethz.source | FORM | |
ethz.eth | yes | en_US |
ethz.availability | Metadata only | en_US |
ethz.rosetta.installDate | 2020-12-22T13:43:32Z | |
ethz.rosetta.lastUpdated | 2021-02-15T22:48:50Z | |
ethz.rosetta.versionExported | true | |
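The abstract describes aDAM as a computer-vision and deep-learning pipeline that recognizes the device screen seen by the mobile eye tracker's scene camera and maps each gaze sample onto predefined, dynamically changing AOIs. As a rough illustration of that general idea only, the sketch below maps a single gaze point from a scene-camera frame onto a reference screenshot of one UI screen using classical ORB feature matching and a homography, then tests it against AOI rectangles. This is not the authors' implementation (the paper's screen matching involves deep learning, which is not shown here), and all function and AOI names are illustrative assumptions.

```python
# Minimal sketch, not the aDAM implementation: project a gaze point from the
# eye tracker's scene-camera frame into the coordinate frame of a reference
# UI screenshot and look up which predefined AOI it falls into.
import cv2
import numpy as np

# Hypothetical AOI definitions for one UI screen: name -> (x, y, w, h) in
# reference-screenshot pixel coordinates.
AOIS = {
    "start_button": (40, 300, 120, 60),
    "status_display": (40, 60, 240, 120),
}

orb = cv2.ORB_create(nfeatures=2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def map_gaze_to_aoi(scene_frame, gaze_xy, reference_screenshot):
    """Return the name of the AOI hit by the gaze point, or None."""
    # Work on grayscale images for feature detection.
    if scene_frame.ndim == 3:
        scene_frame = cv2.cvtColor(scene_frame, cv2.COLOR_BGR2GRAY)
    if reference_screenshot.ndim == 3:
        reference_screenshot = cv2.cvtColor(reference_screenshot, cv2.COLOR_BGR2GRAY)

    # 1) Detect and match local features between scene frame and reference image.
    kp_s, des_s = orb.detectAndCompute(scene_frame, None)
    kp_r, des_r = orb.detectAndCompute(reference_screenshot, None)
    if des_s is None or des_r is None:
        return None
    matches = matcher.match(des_s, des_r)
    if len(matches) < 10:
        return None  # not enough evidence that this screen is visible

    # 2) Estimate a homography from scene-frame to reference-screenshot coordinates.
    src = np.float32([kp_s[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_r[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    # 3) Project the gaze point and test it against the AOI rectangles.
    gx, gy = cv2.perspectiveTransform(np.float32([[gaze_xy]]), H)[0, 0]
    for name, (x, y, w, h) in AOIS.items():
        if x <= gx <= x + w and y <= gy <= y + h:
            return name
    return None
```

In a full dynamic-AOI pipeline, the correct reference screenshot and its AOI set would additionally have to be selected for every video frame (the screen-matching step), and the mapping would be run over all gaze samples of the recording; the sketch deliberately omits those steps.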
Files in this item

There are no files associated with this item.
Publication type

Journal Article