eISSN: 2574-9838

International Physical Medicine & Rehabilitation Journal

Literature Review Volume 4 Issue 5

Mapping study of computer vision tools applied to aid in wheelchair control

Flávia Gonçalves Fernandes,1 Eder Manoel de Santana,2 Eduardo Lázaro Martins Naves2

1Department of Informatics, Federal Institute of Education, Science and Technology Goiano - Campos Belos Campus, Brazil
2Assistive Technology Center, Federal University of Uberlandia, Brazil

Correspondence: Flávia Gonçalves Fernandes, Department of Informatics, Federal Institute of Education, Science and Technology Goiano - Campos Belos Campus, Brazil, Tel +55(64)981142956

Received: September 02, 2019 | Published: October 15, 2019

Citation: Fernandes FG, Santana EMD, Naves ELM. Mapping study of computer vision tools applied to aid in wheelchair control. Int Phys Med Rehab J. 2019;4(5):249–253. DOI: 10.15406/ipmrj.2019.04.00211


Abstract

Driving a motorized wheelchair seems, at first glance, a simple activity. In reality, however, driving a wheelchair independently requires specific motor, visual and cognitive skills, and attempting to do so without proper preparation and knowledge may pose a risk, not only to the user in question, but also to the individuals around him. Computer vision can assist the wheelchair control process through various tools. From this perspective, this paper presents a systematic review of references aimed at the application of computer vision to wheelchair driving. The research sources were the following indexed databases: IEEE Xplore, Science Direct and PubMed. The systematic mapping used in the elaboration of the review allowed us to identify the main gaps for the development of new research, in addition to pointing to the main publications related to the study. Finally, the results show that this is a constantly expanding area with great potential for development and applications.

Keywords: wheelchair, systematic mapping, computer vision

Introduction

In modern clinical practice, a wide range of information from the human body can be read routinely. Through the use of appropriate monitoring equipment, engineers and health professionals have access to many bioelectrical phenomena with relative ease. Among these phenomena are the electrocardiogram (ECG), the electrooculogram (EOG), the electroencephalogram (EEG) and the electromyogram (EMG), among other biopotentials.1

In addition, computer vision has been widely used to assist the wheelchair control process, which, in turn, is very useful for temporary or permanent rehabilitation treatments. To make these treatments more attractive and effective for patients, it is common to use virtual environments for training, in which a virtual environment is used to create a model consistent with reality for driving a virtual wheelchair. Such models (called simulators) have as their main function to allow interventions in scenarios that could not be carried out in a real situation without posing some kind of risk or other adverse characteristic. Thus, the use of simulators can provide a safe environment for training, learning and adaptation, until the user is ready to deal with the real situation properly.2

Conceptually, an environment involves a certain space and an enclosed situation, and includes all the components they contain, such as the set of objects and conditions likely to be perceived and with which one can interact. In this line of reasoning, a virtual environment is an interactive environment generated by a computer and made available through a virtual reality system.3 According to Vince,4 the following features should be considered in the development of a virtual environment:

Synthetic: the environment must be generated in real time, rather than being a pre-recording, as in multimedia systems;

Dimensional: the environment surrounding the user is represented in three dimensions (3D), giving the user the impression of depth;

Multisensory: it uses more than one channel to represent the environment, such as vision, hearing, spatial perception (depth), user reaction to the environment, among others;

Immersive: refers to the impression of being inside the computationally produced environment. Normally, an immersive system is obtained with the use of visualization helmets (head-mounted displays), but other stimuli, such as sound and reactive controls, can also contribute to immersion;

Interactive: the ability to detect user input and instantly modify the virtual world and the actions performed in it;

Realistic: involves the precision with which the virtual environment reproduces real objects, interactions with users and the environment of the model itself.

In this context, the objective of this work is to provide an overview of related research on the application of computer vision to wheelchair control, presenting a systematic study of what has been published on the subject.

The purpose of this systematic mapping is to analyze the references of published works involving computer vision tools applied to wheelchair control and, with that, to assess the recent growth of this research.

This work is organized as follows: Section 2 discusses the methodology and the development of the systematic review, detailing the research carried out; Section 3 presents the results obtained by means of graphs and tables, which are analyzed and commented on; finally, we present the conclusions of the systematic research on computer vision tools applied to aid in wheelchair control.

Materials and methods

In order to build the literature review established in this work, a systematic mapping (mapping study) was performed according to the methodology proposed by Bailey et al.5 and Petersen et al.,6 consisting of a search for studies registered in databases by means of logical operators applied to selected keywords or expressions.

The databases considered for this study were IEEE Xplore,7 Science Direct8 and PubMed,9 which are available at the Federal University of Uberlândia. It should be emphasized that only peer-reviewed journal articles were analyzed.

The logical expressions used to search the databases were "computer vision" and "wheelchair". These strings were chosen to find works involving computer vision applied to wheelchair control. Filters were then applied to reduce the scope of the search, such as language (English and Portuguese), type of publication (peer-reviewed journal article) and year of publication (from 2013 to 2018, representing six years of search space). After applying the filters in each database, the titles were read in order to select those in accordance with the selected logical expression. Possible duplicates between databases were also checked, and articles meeting the inclusion criteria had any duplicates removed.

Finally, the last stage of selection consisted of focusing on applications involving computer vision for wheelchair control. This was done by reading and analyzing titles and abstracts in order to exclude works that did not relate directly to the subject under study, and the literature review was developed from this result. In this sense, studies that addressed other topics, such as computer vision applications oriented to other areas, were discarded.
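
To make the selection procedure more concrete, the sketch below illustrates, in Python, how the filters and the deduplication step described above could be applied to a list of exported search records. It is only a minimal illustration under assumptions: the record fields and the keyword test used as a proxy for relevance are hypothetical and do not correspond to any specific export format of IEEE Xplore, Science Direct or PubMed; in this study the title and abstract screening was performed manually.

    # Minimal sketch of the selection procedure described above.
    # Field names ("title", "abstract", "year", "language") are hypothetical.
    from typing import Dict, List

    QUERY_TERMS = ("computer vision", "wheelchair")  # logical AND expression
    YEARS = range(2013, 2019)                        # 2013-2018 inclusive
    LANGUAGES = {"en", "pt"}                         # English and Portuguese

    def screen(records: List[Dict]) -> List[Dict]:
        """Filter exported records and remove duplicates across databases."""
        selected: List[Dict] = []
        seen_titles = set()
        for rec in records:
            title = rec.get("title", "").strip().lower()
            text = title + " " + rec.get("abstract", "").lower()
            if rec.get("year") not in YEARS:
                continue  # outside the 2013-2018 search space
            if rec.get("language", "en") not in LANGUAGES:
                continue  # not English or Portuguese
            if not all(term in text for term in QUERY_TERMS):
                continue  # proxy for the manual title/abstract screening
            if title in seen_titles:
                continue  # duplicate found in another database
            seen_titles.add(title)
            selected.append(rec)
        return selected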

Results and discussion

After completion of the searches in the databases, the results were organized in tables and charts in order to present them in a more practical manner. Table 1 shows the total results of the mapping, given the keywords cited and the cut-off date of December 2018.

Database          Initial articles    Articles with adherent title
PubMed            5                   4
IEEE Xplore       53                  27
Science Direct    89                  5
Total             147                 35

Table 1 Summary of results obtained in the research

Thus, through the systematic mapping, a database of 35 (thirty-five) references was created for the combination of strings "computer vision" AND "wheelchair".

The evolution of the annual publication of the selected papers in the international search can be seen in Figure 1, where it is possible to note that articles on the subject addressed in this work have been published in recent years. It is thus observed that the subject is recent and offers a wide field of possibilities to explore. The 35 (thirty-five) selected works are included in the references of this study.10–44 However, they are not all mentioned individually in the text due to their number.

Figure 1 Number of articles per year of publication.

Figure 2 shows the number of journal articles published by country. The published works are from different countries, such as Argentina, Australia, Bangladesh, Brazil, Chile, China, South Korea, Spain, USA, France, India, Indonesia, Italy, Japan, New Zealand and Romania. Thus, it appears that most of the works found that involve the use of computer vision applications for wheelchairs come from developed countries; only one study addressing this issue was found in Brazil.

Figure 2 Number of articles by country of publication.

Figure 3 shows the number of articles found according to the type of rehabilitation addressed, namely: muscular dystrophy, upper limb limitation, cerebral palsy and tetraplegia. Unfortunately, however, most of the works selected in the systematic mapping did not report the type of rehabilitation for which computer vision was used to control wheelchairs.

Figure 3 Number of articles by type of rehabilitation.

Figure 4 shows the number of articles selected according to the technology used in the application of computer vision to control wheelchairs. The technologies were varied, among which we mention: AdaBoost face detection algorithm; customized algorithms; blob detection; color detection with OpenCV; detection of gestures and objects; face detection and face angle; fuzzy controller and virtual reality; HOG (Histogram of Oriented Gradients) with OpenCV; kinematic model and iris movement; monocular visual odometry; OpenCV, OpenNI and gesture recognition; PCL (Point Cloud Library); VR (virtual reality) and FOV (field of view); vision-based SLAM (Simultaneous Localization and Mapping); TensorFlow; egocentric computer vision; stereo vision; monocular vision; and detection and mapping of obstacles.

Figure 4 Number of articles per technology.
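
As a concrete illustration of one of the technologies listed above, the sketch below shows a minimal use of HOG (Histogram of Oriented Gradients) person detection with OpenCV, the kind of component that surveyed works combine with obstacle detection for wheelchair navigation. It is a generic sketch, not the implementation of any of the selected papers; the camera index and detection parameters are assumptions.

    # Generic sketch of HOG-based person detection with OpenCV, one of the
    # techniques listed above. Camera index and parameters are assumptions.
    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture(0)  # e.g. a camera mounted on the wheelchair
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Each detected bounding box could feed an obstacle-avoidance or
        # navigation module of the wheelchair controller.
        boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8), scale=1.05)
        for (x, y, w, h) in boxes:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("HOG person detection", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()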

Figure 5 shows the number of articles found according to the type of equipment used in the works on computer vision for wheelchair control, namely: camera; binocular camera and Emotiv EPOC; stereo vision camera; camera and robotic arm; camera and line laser; camera and infrared LED; camera and Raspberry Pi; camera and IMU sensor; camera and LIDAR-Lite; camera, Raspberry Pi and line laser; camera and tablet; infrared and ultrasound; Kinect and Tobii PCEye; simulator and camera; ultrasound; and Xtion PRO Live. Thus, it can be noticed that the camera was the equipment most used in the works selected on the theme researched in this systematic mapping.

Figure 5 Number of articles per equipment.

Figure 6 shows the number of articles found according to the purpose of the application, namely: monitoring/evaluation, accident prevention, rehabilitation and training. In this sense, it can be observed that most of the works seek to prevent accidents, since this is very important for people who use wheelchairs, such as the elderly and people with disabilities, in order to avoid further damage to their health.

Figure 6 Number of articles per purpose of application.

Figure 7 shows the percentage of articles that conducted experiments with participants. It is observed that most of the works (69%) carried out this procedure. This demonstrates the need for testing to validate the results and effectiveness of the applications, and to help in making improvements and maintaining the systems.

Figure 7 Percentage of articles that conducted experiments with participants.

Conclusion

Through the mapping carried out, we found that there has been an increase in studies on the application of computer vision to wheelchair control, since it is a recent and effective technique.

Thus, it is worth noting that the application of systematic mapping in the development of a literature review identifies the main gaps for the development of new research. In addition, it points to the main publications related to the study.

From what was presented in this paper, it appears that there is growing interest in researching and publishing in this area of wheelchair control using computer vision, seeking to help people with various conditions in rehabilitation, treatment, training or accident prevention.

Therefore, we see the need to promote this research area in order to offer this audience with limited dexterity access to computer vision techniques for treatment, acquisition of knowledge, motivation, entertainment or even inclusion. In this way, greater maturity of the results can be obtained, thus promoting the systematic use of computer vision to help promote the well-being of these people.

Acknowledgements

None.

Conflicts of interest

The authors declare that there is no conflict of interest.

References

  1. Sanchez J, Sue Cobb, Paul Sharkey, et al. Virtual reality and assistive technologies for people with disabilities. International Journal on Disability and Human Development. 2011;10(4):275–276.
  2. Wolpaw JR, Birbaumer N, McFarland DJ, et al. Brain computer interfaces for communication and control. Clinical Neurophysiology. 2002;113(6):767–791.
  3. Stuart R. The design of virtual environments. Fairfield:McGraw-Hill. 2006:274.
  4. Vince J. Virtual reality systems. Boston: Addison-Wesley. 2005:388.
  5. Bailey J, David B, Mark T, et al. Evidence relating to object-oriented software design: a survey. First International Symposium on Empirical Software Engineering and Measurement. Computer Society. 2007.
  6. Petersen K, et al. Systematic mapping studies in software engineering. School of Engineering, Blekinge Institute of Technology. 2008;26–27.
  7. https://ieeexplore.ieee.org/Xplore/home.jsp
  8. https://www.sciencedirect.com/
  9. https://www.ncbi.nlm.nih.gov/pubmed
  10. Alshaer A, Simon H, Holger R, et al. Influence of peripheral and stereoscopic vision on driving performance in a power wheelchair simulator system. IEEE. 2013.
  11. Bankar RT, Salankar SS. Head gesture recognition system using adaboost algorithm with obstacle detection. 7th international conference on emerging trends in engineering & technology. IEEE. 2015.
  12. Bastos VB, Alan FPT, Cesar HCQ, et al. Monocular visual odometry for robotic wheelchair in a virtual environment. IEEE. 2018.
  13. Chen Naijian, Han Xiangdong, Wang Yantao, et al. Coordination control strategy between human vision and wheelchair manipulator based on BCI. IEEE. 2016.
  14. Grewal HS, Jayaprakash, Matthews, et al. Autonomous wheelchair navigation in unmapped indoor environments. IEEE. 2018.
  15. Hapsani AG, Dahnial S, Fitri U, et al. Onward movement detection and distance estimation of object using disparity map on stereo vision. In: 5th international symposium on computational intelligence and business. IEEE. 2017.
  16. Hoareau F, Murakami T. A step passage strategy using environment recognition for a two-wheel electric wheelchair. IEEE. 2016.
  17. Jiang H, Zhang T, Wachs PJ, et al. Enhanced control of the wheelchair-mounted robotic manipulator using 3-D vision and multimodal interaction. Computer Vision and Image Understanding. 2016;1–11.
  18. Karuppiah P, Hem M, Kiran G, et al. Automation of the wheelchair mounted robotic arm using computer vision interface. IEEE. 2018.
  19. Katyal DK. Harmonie: a multimodal control framework for human assistive smart wheelchairs. In: International Conference on Advanced Robotics and Intelligent Systems. Taiwan; 2014.
  20. Kawarazaki N, Diaz AIB. Gesture recognition system for wheelchair control using the depth sensor. IEEE. 2013.
  21. Kim EY. Wheelchair navigation system for disabled and elderly people. Sensors. 2016;16:1806.
  22. Ktena SI, William A, Aldo Faisal, et al. A virtual reality platform for safe evaluation and training of natural gaze-based wheelchair driving. In: 7th Annual International IEEE EMBS Conference on Neural Engineering, Montpellier, France. 2015.
  23. Li H, M Kutbi, X Li, et al. An egocentric computer vision based co-robot wheelchair. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Daejeon, Korea; 2016.
  24. Li X, Xu C, Li S, et al. Velocity measurement of intelligent wheelchair based on restoration of optical flow field. In: Proceedings of the 2016 IEEE International Conference on Mechatronics and Automation, Harbin, China; 2016.
  25. Li Z, Suna Z, J Duan, et al. Human cooperative wheelchair with brain machine interaction based on shared control strategy. Transactions on Mechatronics. 2016.
  26. Li Z, Xiong Y, Zhou L, et al. ROS-based indoor autonomous exploration and navigation wheelchair. In: 10th International Symposium on Computational Intelligence and Design. 2017.
  27. Narayanan VK, François P, Maud M, et al. Vision-based adaptive assistance and haptic guidance for safe wheelchair corridor following. Computer Vision and Image Understanding. 2016;149:171–185.
  28. Nguyen JS, Steven W, Hung TN. Experimental study on the smart wheelchair system using a combination of stereoscopic and spherical vision. In: 35th Annual International Conference of the IEEE EMBS. Osaka, Japan; 2013.
  29. Nguyen VT, Chandimal J, Iman A. A navigation model for side-by-side robotic wheelchairs for optimizing social comfort in crossing situations. Robotics and Autonomous Systems. 2018;100:27–40.
  30. Palla A, Luca S, Luca F, et al. Embedded implementation of an eye-in-hand control for visual servoing a wheelchair mounted robotic arm. In: IEEE Workshop on ICT solutions for e-Health. 2016.
  31. Pasteau F, Vishnu KN, Marie Babel, et al. A visual servoing approach for autonomous corridor following and doorway passing in a wheelchair. Robotics and Autonomous Systems. 2014.
  32. Patel SN, V Prakash. Autonomous camera-based eye-controlled wheelchair system using Raspberry Pi. In: Sponsored 2nd IEEE International Conference on Embedded Innovations in Information and Communication Systems ICIIECS'15. 2015.
  33. Pathirage I, Karan K, Elijah K, et al. A vision-based P300 brain computer interface for grasping using a wheelchair-mounted robotic arm. In: IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM). Wollongong, Australia; 2013.
  34. Perez E, Natalia L, Eugenio O, et al. Robust human machine interface based on head movements applied to assistive robotics. The Scientific World Journal. 2013:589636.
  35. Quintero CP, Oscar R, Martin J, et al. VIBI: assistive vision-based interface for robot manipulation. In: IEEE International Conference on Robotics and Automation (ICRA), Washington. 2015.
  36. Rajesh A, Mantur M. Eyeball gesture controlled automatic wheelchair using deep learning. In: IEEE Region 10 Humanitarian Technology Conference (HTC R10), 21-23 Jan, Dhaka, Bangladesh; 2017.
  37. Rampinelli M, Vinicius PM, Raquel F, et al. Use of computer vision for localization of a robotic wheelchair in an intelligent space. In: ISSNIP Biosignals and Biorobotics Conference: Biosignals and Robotics for Better and Safer Living (BRC). Rio de Janeiro. 2013.
  38. Solea R, Adrian F, Adriana F, et al. Wheelchair control and navigation based on kinematic model and iris movement. 2015.
  39. Utaminingrum F, Tri Astoto Kurniawan, M Ali Fauzi, et al. Laser-vision based obstacle detection and distance estimation for smart wheelchair navigation. In: IEEE International Conference on Signal and Image Processing. 2016.
  40. Utaminingrum F, M Ali Fauzi, Randy CW, et al. Development of computer vision based obstacle detection and human tracking on smart wheelchair for disabled patient. In: 5th International Symposium on Computational Intelligence and Business. 2017.
  41. Vidal EG, Zarricueta EF, Cheein FA, et al. Human-inspired sound environment recognition system for assistive vehicles. J Neural Eng. 2015;12:016012.
  42. Yu H, Jiann JC, Chung HK, et al. The human-environment interface design with vision robotics assistance module. In: 6th Annual International IEEE EMBS Conference on Neural Engineering. San Diego, California; 2013.
  43. Zal F, Ting SC, Shou WC, et al. Fuzzy controller based subsumption behavior architecture for autonomous robotic wheelchair. In: International Conference on Advanced Robotics and Intelligent Systems, Taiwan; 2013.
  44. Zondervan DK, Secoli R, Darling AM, et al. Design and evaluation of the kinect-controlled wheelchair interface (kwic) smart wheelchair for pediatric powered mobility training. Assistive Technology. 2015;27:183–192.
Creative Commons Attribution License

©2019 Fernandes, et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and building upon the work non-commercially.