TY - GEN
T1 - Motion estimation approach for UAV controls using bidirectional two-layer LSTMs
AU - Guo, Haitao
AU - Sung, Yunsick
AU - Kang, Jungho
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/7
Y1 - 2019/7
N2 - With the widespread use of unmanned aerial vehicles (UAVs), there is an increasing demand for the development of their control technology. The key interaction technology between humans and UAVs needs to focus on human body language, which carries rich interactive information, as it is the most natural, intuitive, and easily mastered means of interpersonal communication. Therefore, research on human motion estimation for UAV control is of considerable practical significance. Recently, deep learning has achieved breakthroughs in speech recognition, image recognition, and other fields, and has outperformed traditional methods in many of them. However, in the field of human motion estimation, progress with deep learning has been slow. To overcome the limitations of traditional methods and explore the application of deep learning to motion estimation, this study proposes a method to estimate human arm motion using deep learning networks. We propose a bidirectional two-layer LSTM fusion network to estimate forearm motion from the hand position measured by an HTC Vive. The performance was verified on a real data set: the average Euclidean distance similarity reached 56%. In comparison with traditional methods, the proposed method demonstrated wider applicability and better performance.
AB - With the widespread use of unmanned aerial vehicles (UAVs), there is an increasing demand for the development of their control technology. The key interaction technology between humans and UAVs needs to focus on human body language, which carries rich interactive information, as it is the most natural, intuitive, and easily mastered means of interpersonal communication. Therefore, research on human motion estimation for UAV control is of considerable practical significance. Recently, deep learning has achieved breakthroughs in speech recognition, image recognition, and other fields, and has outperformed traditional methods in many of them. However, in the field of human motion estimation, progress with deep learning has been slow. To overcome the limitations of traditional methods and explore the application of deep learning to motion estimation, this study proposes a method to estimate human arm motion using deep learning networks. We propose a bidirectional two-layer LSTM fusion network to estimate forearm motion from the hand position measured by an HTC Vive. The performance was verified on a real data set: the average Euclidean distance similarity reached 56%. In comparison with traditional methods, the proposed method demonstrated wider applicability and better performance.
KW - Deep learning
KW - HTC Vive
KW - Motion estimation
KW - UAV control
UR - http://www.scopus.com/inward/record.url?scp=85074831517&partnerID=8YFLogxK
U2 - 10.1109/iThings/GreenCom/CPSCom/SmartData.2019.00083
DO - 10.1109/iThings/GreenCom/CPSCom/SmartData.2019.00083
M3 - Conference contribution
AN - SCOPUS:85074831517
T3 - Proceedings - 2019 IEEE International Congress on Cybermatics: 12th IEEE International Conference on Internet of Things, 15th IEEE International Conference on Green Computing and Communications, 12th IEEE International Conference on Cyber, Physical and Social Computing and 5th IEEE International Conference on Smart Data, iThings/GreenCom/CPSCom/SmartData 2019
SP - 381
EP - 384
BT - Proceedings - 2019 IEEE International Congress on Cybermatics
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 12th IEEE International Conference on Internet of Things, 15th IEEE International Conference on Green Computing and Communications, 12th IEEE International Conference on Cyber, Physical and Social Computing and 5th IEEE International Conference on Smart Data, iThings/GreenCom/CPSCom/SmartData 2019
Y2 - 14 July 2019 through 17 July 2019
ER -