Development of an Infrared-Based Sensor for Finger Movement Detection

  • Agbotiname Lucky Imoize
  • Aanuoluwapo Eberechukwu Babajide

Department of Computer Engineering, University of Lagos, Akoka-Lagos, Nigeria.
Keywords: Finger movement detection; Haar-Like features; Deep Learning; Z-axis filtering; Accuracy; Precision; Recall; F1-score; Weighted average; 3-D Images; Depth Sensor; Infrared (IR) Images; Convolutional Neural Network (CNN).


With the increasing interest in smart devices and convenient remote control, the need for accurate wireless means of control has become imperative, driving growing research interest in gesture and finger movement detection. In this paper, a design exploring techniques for hand and finger movement detection, using the depth-sensing infrared camera embedded in the Xbox Kinect module, is presented. First, 3-D images are generated and filtered along the z-axis; then two distinct techniques, Haar-Like features and deep learning using a Convolutional Neural Network (CNN), are applied to the images to detect hand movements. To evaluate the robustness of the proposed techniques, the metrics Precision, Recall, F1-score, and Accuracy are used. Results show that the deep learning model is the more accurate, achieving a weighted accuracy of 1.0 (attributable to the absence of noise in the images), against a weighted accuracy of 0.97 for the Haar-Like features. However, the Haar-Like features technique runs faster owing to its static nature, whereas the deep learning model is considerably slower in terms of running time. Overall, these findings suggest that the deep learning model is the better technique for detecting hand movements despite its longer running time.
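The z-axis filtering step described above can be illustrated with a minimal sketch: given a per-pixel depth map such as the Kinect's depth stream, pixels whose depth falls outside a near-field band are suppressed, isolating a foreground object such as a hand before detection. The band limits (`z_min`, `z_max`) and the function name are illustrative assumptions, not the paper's exact parameters.

```python
import numpy as np

def z_filter(depth_frame, z_min=0.4, z_max=0.9):
    """Keep only pixels whose depth (z) lies inside [z_min, z_max] metres.

    depth_frame: 2-D array of per-pixel depth values (e.g. from a Kinect
    depth stream). Pixels outside the band are zeroed, so only objects in
    the chosen near-field slice (such as a hand) remain for detection.
    """
    mask = (depth_frame >= z_min) & (depth_frame <= z_max)
    return np.where(mask, depth_frame, 0.0)

# Example: a 2x2 depth frame; only values inside [0.4, 0.9] m survive.
frame = np.array([[0.3, 0.5],
                  [0.8, 1.2]])
filtered = z_filter(frame)
```

The filtered frame can then be passed to either detector (a Haar cascade or a CNN), which now sees the hand against an empty background.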


How to Cite
Imoize, A. L., & Babajide, A. E. (2019). Development of an Infrared-Based Sensor for Finger Movement Detection. Journal of Biomedical Engineering and Medical Imaging, 6(4), 29-44.