Fatma Ben Taher, Nader Ben Amor, and Mohamed Jallouli


[1] K. Arai and R. Mardiyanto, A prototype of electric wheelchair controlled by eye-only for paralyzed user, Journal of Robotics and Mechatronics, 23, 2011, 66–74.
[2] X. Xu, Y. Zhang, Y. Luo, and D. Chen, Robust bio-signal-based control of an intelligent wheelchair, Robotics, 2, 2013, 187–197.
[3] N. Mani, A. Sebastian, A.M. Paul, A. Chacko, and A. Ragunath, Eye controlled electric wheel chair, International Journal of Advanced Research in Electrical, Electronics and Instrumentation Engineering, 4, 2015, 2494–2497.
[4] T. Kaufman, A. Herweg, and A. Kübler, Toward brain–computer interface based wheelchair control utilizing tactually-evoked event-related potentials, Journal of NeuroEngineering and Rehabilitation, 11, 2014, 1–17.
[5] J. Philips, et al., Adaptive shared control of a brain-actuated simulated wheelchair, Proc. 10th IEEE Int. Conf. on Rehabilitation Robotics, Noordwijk, The Netherlands, 2007, 408–414.
[6] H. Tran, et al., An EEG-controlled wheelchair using eye movements, Proc. 5th Int. Conf. on Biomedical Engineering, Vietnam, 2015, 470–473.
[7] T. Carlson and J. del R. Millán, Brain-controlled wheelchairs: A robotic architecture, IEEE Robotics & Automation Magazine, 20, 2013, 65–73.
[8] G. Massimo, et al., Towards a brain-activated and eye-controlled wheelchair, International Journal of Bioelectromagnetism, 13(1), 2011, 44–45.
[9] E.C. Lee, J.C. Woo, J.H. Kim, M. Whang, and K.R. Park, A brain–computer interface method combined with eye tracking for 3D interaction, Journal of Neuroscience Methods, 190(2), 2010, 289–298.
[10] A. Jain, C.W. de Silva, and Q.M.J. Wu, Intelligent fusion of sensor data for product quality assessment in a fish-cutting machine, Control and Intelligent Systems, 32, 2004.
[11] D. Izadi, J.H. Abawajy, S. Ghanavati, and T. Herawan, A data fusion method in wireless sensor networks, Sensors (Basel), 15, 2015, 2964–2979.
[12] S. Koelstra and I. Patras, Fusion of facial expressions and EEG for implicit affective tagging, Image and Vision Computing, 31, 2013, 164–174.
[13] X. Li, A. Dick, C. Shen, Z. Zhang, A. van den Hengel, and H. Wang, Visual tracking with spatio-temporal Dempster–Shafer information fusion, IEEE Transactions on Image Processing, 22, 2013, 3028–3040.
[14] W. Zheng, B. Dong, and B. Lu, Multimodal emotion recognition using EEG and eye tracking data, Proc. 36th Annual Int. Conf. of the IEEE Engineering in Medicine and Biology Society (EMBC), Chicago, IL, 2014, 5040–5043.
[15] Emotiv EPOC software development kit, http://www.emotiv.com/.
[16] P. Viola and M. Jones, Robust real-time face detection, International Journal of Computer Vision, 57(2), 2004, 137–154.
[17] P.K. Allen, A. Timcenko, B. Yoshimi, and P. Michelman, Automated tracking and grasping of a moving object with a robotic hand-eye system, IEEE Transactions on Robotics and Automation, 9(2), 1993, 152–165.
[18] J. Rada-Vilela, fuzzylite: A fuzzy logic control library written in C++, 2013.
[19] OpenCV tutorial: http://www.geckogeek.fr/tutorial-opencv-isoler-et-traquer-une-couleur.html.
[20] A. Urken, Voting theory, data fusion, and explanations of social behaviour, AAAI Spring Symposium Series, North America, 2011.
[21] L.I. Kuncheva, A theoretical study on six classifier fusion strategies, IEEE Transactions on Pattern Analysis and Machine Intelligence, 24(2), 2002, 281–286.
[22] K.S. Ahmed, Wheelchair movement control via human eye blinks, American Journal of Biomedical Engineering, 1(1), 2011, 55–58.
[23] H. Yamada and T. Muto, Using virtual reality to assess factors affecting shipboard accessibility for wheelchair users, Control and Intelligent Systems, 32, 2004.
