A dynamic modelling framework for human hand gesture task recognition

Sara Masoud, Bijoy Chowdhury, Young-Jun Son, Chieri Kubota, Russell E Tronstad

Research output: Contribution to conference › Paper

Abstract

Gesture recognition and hand motion tracking are important tasks in advanced gesture-based interaction systems. In this paper, we propose applying a sliding window filtering approach to sample the incoming streams of data from data gloves and a decision tree model to recognize gestures in real time for a manual grafting operation in a vegetable seedling propagation facility. The sequence of recognized gestures defines the task being performed, which helps to evaluate individuals' performance and to identify bottlenecks in real time. In this work, two pairs of data gloves are used, which report the locations of the fingers, hands, and wrists wirelessly (via Bluetooth). To evaluate the performance of the proposed framework, a preliminary experiment was conducted in multiple lab settings of tomato grafting operations, where multiple subjects wore the data gloves while performing different tasks. The results show an average real-time gesture recognition accuracy of 91% with the proposed framework.
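The abstract describes a two-stage pipeline: a sliding window samples the incoming glove data stream, and a decision tree classifies each window as a gesture. The sketch below illustrates that idea under stated assumptions only; the window length, step size, summary features, and the use of scikit-learn's DecisionTreeClassifier are illustrative choices, not details reported in the paper.

```python
# Illustrative sketch (not the authors' code): sliding-window gesture
# classification over a stream of data-glove readings, assuming each
# frame is a fixed-length vector of finger/hand/wrist coordinates.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

WINDOW = 30   # frames per window (assumed value)
STEP = 10     # slide step between consecutive windows (assumed value)

def windows(stream, window=WINDOW, step=STEP):
    """Yield overlapping windows from a (frames x channels) array."""
    for start in range(0, len(stream) - window + 1, step):
        yield stream[start:start + window]

def features(win):
    """Simple per-window summary features: mean and std per channel."""
    return np.concatenate([win.mean(axis=0), win.std(axis=0)])

def train(labeled_recordings):
    """Fit a decision tree from (stream, gesture_label) pairs."""
    X = [features(w) for stream, _ in labeled_recordings for w in windows(stream)]
    y = [label for stream, label in labeled_recordings for _ in windows(stream)]
    clf = DecisionTreeClassifier(max_depth=8, random_state=0)
    return clf.fit(np.asarray(X), np.asarray(y))

def recognize(clf, stream):
    """Classify each window of an incoming stream as a gesture label."""
    return [clf.predict(features(w).reshape(1, -1))[0] for w in windows(stream)]
```

In the framework described above, the resulting sequence of gesture labels could then be matched against the gesture orderings that define each grafting task, yielding the real-time task recognition the abstract refers to.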

Original language: English (US)
Pages: 563-568
Number of pages: 6
State: Published - Jan 1 2018
Event: 2018 Institute of Industrial and Systems Engineers Annual Conference and Expo, IISE 2018 - Orlando, United States
Duration: May 19 2018 - May 22 2018

Other

Other: 2018 Institute of Industrial and Systems Engineers Annual Conference and Expo, IISE 2018
Country: United States
City: Orlando
Period: 5/19/18 - 5/22/18

Keywords

  • Decision tree
  • Hand gesture
  • K-means
  • Task recognition

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Industrial and Manufacturing Engineering

Cite this

Masoud, S., Chowdhury, B., Son, Y-J., Kubota, C., & Tronstad, R. E. (2018). A dynamic modelling framework for human hand gesture task recognition. 563-568. Paper presented at 2018 Institute of Industrial and Systems Engineers Annual Conference and Expo, IISE 2018, Orlando, United States.
