Demonstrating TapType for mobile ten-finger text entry anywhere


METADATA ONLY

Date

2022-04-27

Publication Type

Other Conference Item

ETH Bibliography

yes

Abstract

We demonstrate a mobile text entry system that brings full-size ten-finger typing to everyday surfaces, allowing users to type anywhere. Our wearable wristband TapType integrates accelerometers that sense the vibrations arising from finger taps on a passive surface, from which our Bayesian neural network estimates a probability distribution over the fingers of the hand. Given a predefined key-finger mapping, our text entry decoder fuses these predictions with the character priors of an n-gram language model to decode the text entered by the user. TapType combines high portability with sustained, rapid bimanual input across the full typing space, which we demonstrate by supplementing text input on mobile touch devices, in eyes-free scenarios using audio feedback, and in a situated Mixed Reality scenario that enables typing outside visual control with passive haptic feedback.
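
To make the decoding idea concrete, the sketch below illustrates, under assumptions not taken from the paper, how per-tap finger probabilities from a classifier could be fused with character bigram priors through a simple beam search. The key-finger mapping, function names, and beam-search formulation are hypothetical; the paper's actual decoder and n-gram model may differ.

```python
import math

# Hypothetical key-finger mapping: each finger covers one column of a QWERTY
# layout (thumbs, which would map to the space bar, are omitted for brevity).
FINGER_TO_CHARS = {
    "L_pinky": "qaz", "L_ring": "wsx", "L_middle": "edc", "L_index": "rfvtgb",
    "R_index": "yhnujm", "R_middle": "ik", "R_ring": "ol", "R_pinky": "p",
}

def decode(finger_probs, bigram_logprob, beam_width=8):
    """Fuse per-tap finger probabilities with character bigram priors.

    finger_probs: one dict per tap, mapping finger name -> probability
                  (standing in for the Bayesian neural network's output).
    bigram_logprob: function (prev_char, char) -> log P(char | prev_char),
                    standing in for the n-gram language model.
    Returns the most probable character string under the fused score.
    """
    beam = [("", 0.0)]  # (decoded prefix, accumulated log probability)
    for tap in finger_probs:
        candidates = []
        for prefix, score in beam:
            prev = prefix[-1] if prefix else " "
            for finger, p_finger in tap.items():
                for ch in FINGER_TO_CHARS.get(finger, ""):
                    s = (score
                         + math.log(max(p_finger, 1e-12))
                         + bigram_logprob(prev, ch))
                    candidates.append((prefix + ch, s))
        # Keep only the highest-scoring hypotheses for the next tap.
        beam = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beam[0][0]

# Toy usage: two taps, each a distribution over fingers, decoded with a
# uniform bigram model (a real language model would break ties toward
# likely English character sequences).
uniform = lambda prev, ch: math.log(1.0 / 26)
taps = [{"R_index": 0.7, "R_middle": 0.3},
        {"L_middle": 0.8, "L_index": 0.2}]
print(decode(taps, uniform))
```

In this formulation, the language model resolves the ambiguity left by the key-finger mapping: each tap only narrows the choice to the characters assigned to the most likely fingers, and the character priors select among them.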

Publication status

published

Book title

CHI EA '22: CHI Conference on Human Factors in Computing Systems Extended Abstracts

Pages / Article No.

195

Publisher

Association for Computing Machinery

Event

2022 Conference on Human Factors in Computing Systems (CHI 2022)

Subject

Mobile text entry; Invisible interfaces; Bayesian inference; Bayesian neural network; N-gram language model; Virtual reality

Organisational unit

09649 - Holz, Christian / Holz, Christian

Notes

Extended abstract.
