A Robust User Interface for IoT using Context-aware Bayesian Fusion (Conference Paper)

abstract

  • As the Internet of Things (IoT) continues to expand into our daily lives, consumers face a growing catalogue of smart devices for boosting the intelligence of their homes. Currently, each device ships with its own proprietary user interface (UI), creating a cumbersome app environment. Clearly, a single UI that can control all of these devices would be preferable. Such an interface should be accessible through forms of communication that feel natural, such as speech, body language, and facial expressions. In this paper, we propose a framework for a multimodal UI built on a flexible, slotted command ontology and decision-level Bayesian fusion. Our case study explores command recognition for device control with a wearable system accessed via speech and gestures, using a wrist-mounted inertial measurement unit (IMU) for hand gesture recognition. We achieve an accuracy of 94.82% on a set of 17 commands.
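
  The decision-level Bayesian fusion mentioned in the abstract can be illustrated with a short sketch. This is not the authors' implementation; the function name bayesian_fusion and the example posteriors below are hypothetical. Assuming each modality classifier emits a posterior distribution over the command set, a naive-Bayes combination multiplies the per-modality posteriors, divides out the redundant class prior, and renormalizes:

    import numpy as np

    def bayesian_fusion(posteriors, prior=None):
        # posteriors: (n_modalities, n_classes) array of per-modality
        # class posteriors p(c | x_m); each row should sum to 1.
        posteriors = np.asarray(posteriors, dtype=float)
        n_mod, n_classes = posteriors.shape
        if prior is None:
            prior = np.full(n_classes, 1.0 / n_classes)  # uniform class prior
        # Under conditional independence across modalities:
        # p(c | x_1..x_M) is proportional to p(c)^(1-M) * prod_m p(c | x_m)
        fused = np.prod(posteriors, axis=0) * prior ** (1 - n_mod)
        return fused / fused.sum()

    # Example: two modalities (speech, gesture) scoring three commands.
    speech_post  = [0.70, 0.20, 0.10]
    gesture_post = [0.55, 0.35, 0.10]
    print(bayesian_fusion([speech_post, gesture_post]))
    # -> [0.828, 0.151, 0.022]: the command favored by both
    #    modalities dominates after fusion

  One appeal of fusing at the decision level, as opposed to the feature level, is that each modality's classifier can be trained and replaced independently, with fusion touching only the output posteriors.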

name of conference

  • 2018 IEEE 15th International Conference on Wearable and Implantable Body Sensor Networks (BSN)

published proceedings

  • 2018 IEEE 15th International Conference on Biomedical and Health Informatics (BHI) and the Wearable and Implantable Body Sensor Networks (BSN)

author list (cited authors)

  • Wu, J., Grimsley, R., & Jafari, R.

citation count

  • 3

complete list of authors

  • Wu, Jian; Grimsley, Reese; Jafari, Roozbeh

publication date

  • March 2018