ROBOT GRASPING AND MANIPULATION COMBINING VISION AND TOUCH, 181–194.

Zihao Ding, Guodong Chen, Zhenhua Wang, and Lining Sun

References

[1] Y. Liu, Z. Li, H. Liu, Z. Kan, and B. Xu, Bioinspired embodiment for intelligent sensing and dexterity in fine manipulation: A survey, IEEE Transactions on Industrial Informatics, 16(7), 2020, 4308–4321.
[2] V. Vellaiyan, S. Subramaniam, and V. Arunachalam, Bend angle and contact force on soft pneumatic gripper for grasping cylindrical-shaped different-sized objects, International Journal of Robotics and Automation, 37(5), 2022, 391–399.
[3] K. Nottensteiner, A. Sachtler, and A. Albu-Schäffer, Towards autonomous robotic assembly: Using combined visual and tactile sensing for adaptive task execution, Journal of Intelligent & Robotic Systems, 101(3), 2021.
[4] I. Lenz, H. Lee, and A. Saxena, Deep learning for detecting robotic grasps, The International Journal of Robotics Research, 34(4–5), 2015, 705–724.
[5] L. Pinto and A. Gupta, Supersizing self-supervision: Learning to grasp from 50K tries and 700 robot hours, Proc. 2016 IEEE International Conf. on Robotics and Automation (ICRA), Stockholm, 2016, 3406–3413.
[6] E. Lyu, X. Yang, W. Liu, J. Wang, S. Song, and M.Q.-H. Meng, An autonomous eye-in-hand robotic system for picking objects in a supermarket environment with non-holonomic constraint, International Journal of Robotics and Automation, 37(4), 2022, 352–361.
[7] L. Chen, P. Huang, and Z. Meng, Convolutional multi-grasp detection using grasp path for RGBD images, Robotics and Autonomous Systems, 113, 2019, 94–103.
[8] A. Garcia-Garcia, S. Orts-Escolano, S. Oprea, J. Garcia-Rodriguez, J. Azorin-Lopez, M. Saval-Calvo, and M. Cazorla, Multi-sensor 3D object dataset for object recognition with full pose estimation, Neural Computing and Applications, 28(5), 2017, 941–952.
[9] R. Calandra, A. Owens, M. Upadhyaya, W. Yuan, J. Lin, E.H. Adelson, and S. Levine, The feeling of success: Does touch sensing help predict grasp outcomes?, 2017, arXiv:1710.05512.
[10] R. Calandra, A. Owens, D. Jayaraman, J. Lin, W. Yuan, J. Malik, E.H. Adelson, and S. Levine, More than a feeling: Learning to grasp and regrasp using vision and touch, IEEE Robotics and Automation Letters, 3(4), 2018, 3300–3307.
[11] R. Li and E.H. Adelson, Sensing and recognizing surface textures using a GelSight sensor, Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition, Portland, OR, 2013, 1241–1247.
[12] H. Hai, D.-P. Yang, C.-Y. Sun, N. Li, Y.-J. Pang, L. Jiang, and H. Liu, Surface EMG for multi-pattern recognition with sensory feedback controller of hand prosthesis system, International Journal of Robotics and Automation, 28(1), 2013.
[13] Y. Chebotar, K. Hausman, Z. Su, G.S. Sukhatme, and S. Schaal, Self-supervised regrasping using spatio-temporal tactile features and reinforcement learning, Proc. 2016 IEEE/RSJ International Conf. on Intelligent Robots and Systems (IROS), Daejeon, 2016, 1960–1966.
[14] Z. Yi, T. Xu, W. Shang, W. Li, and X. Wu, Genetic algorithm-based ensemble hybrid sparse ELM for grasp stability recognition with multimodal tactile signals, IEEE Transactions on Industrial Electronics, 70(3), 2022, 2790–2799.
[15] Z. Deng, Y. Jonetzko, L. Zhang, and J. Zhang, Grasping force control of multi-fingered robotic hands through tactile sensing for object stabilization, Sensors, 20(4), 2020, 1050.
[16] Q. Li, O. Kroemer, Z. Su, F.F. Veiga, M. Kaboli, and H.J. Ritter, A review of tactile information: Perception and action through touch, IEEE Transactions on Robotics, 36(6), 2020, 1619–1634.
[17] C. De Farias, N. Marturi, R. Stolkin, and Y. Bekiroglu, Simultaneous tactile exploration and grasp refinement for unknown objects, IEEE Robotics and Automation Letters, 6(2), 2021, 3349–3356.
[18] F. Veiga, H.V. Hoof, J. Peters, and T. Hermans, Stabilizing novel objects by learning to predict tactile slip, Proc. 2015 IEEE/RSJ International Conf. on Intelligent Robots and Systems (IROS), Hamburg, 2015, 5065–5072.
[19] D. Han, H. Nie, J. Chen, M. Chen, Z. Deng, and J. Zhang, Multi-modal haptic image recognition based on deep learning, Sensor Review, 38(4), 2018, 486–493.
[20] T. Li, X. Sun, X. Shu, C. Wang, Y. Wang, G. Chen, and N. Xue, Robot grasping system and grasp stability prediction based on flexible tactile sensor array, Machines, 9(6), 2021, 119.
[21] J.W. James and N.F. Lepora, Slip detection for grasp stabilization with a multifingered tactile robot hand, IEEE Transactions on Robotics, 37(2), 2020, 506–519.
[22] S.J. Lederman and R.L. Klatzky, Multisensory texture perception, in J. Kaiser and M. Naumer (eds.), Handbook of multisensory processes (New York, NY: Springer, 2004), 107–122.
[23] S. Lacey, C. Campbell, and K. Sathian, Vision and touch: Multiple or multisensory representations of objects?, Perception, 36(10), 2007, 1513–1521.
[24] F. Wallhoff, J. Blume, A. Bannat, W. Rösel, C. Lenz, and A. Knoll, A skill-based approach towards hybrid assembly, Advanced Engineering Informatics, 24(3), 2010, 329–339.
[25] S. Wang, J. Wu, X. Sun, W. Yuan, W.T. Freeman, J.B. Tenenbaum, and E.H. Adelson, 3D shape perception from monocular vision, touch, and shape priors, Proc. 2018 IEEE/RSJ International Conf. on Intelligent Robots and Systems (IROS), Madrid, 2018, 1606–1613.
[26] D. Guo, F. Sun, H. Liu, T. Kong, B. Fang, and N. Xi, A hybrid deep architecture for robotic grasp detection, Proc. 2017 IEEE International Conf. on Robotics and Automation (ICRA), Singapore, 2017, 1609–1614.
[27] S. Cui, R. Wang, J. Wei, J. Hu, and S. Wang, Self-attention based visual-tactile fusion learning for predicting grasp outcomes, IEEE Robotics and Automation Letters, 5(4), 2020, 5827–5834.
[28] A. Caporali, K. Galassi, G. Laudante, G. Palli, and S. Pirozzi, Combining vision and tactile data for cable grasping, Proc. 2021 IEEE/ASME International Conf. on Advanced Intelligent Mechatronics (AIM), Delft, 2021, 436–441.
[29] Y. Han, K. Yu, R. Batra, N. Boyd, C. Mehta, T. Zhao, Y. She, S. Hutchinson, and Y. Zhao, Learning generalizable vision-tactile robotic grasping strategy for deformable objects via transformer, 2021, arXiv:2112.06374.
[30] S. Kanitkar, H. Jiang, and W. Yuan, PoseIt: A visual-tactile dataset of holding poses for grasp stability analysis, Proc. 2022 IEEE/RSJ International Conf. on Intelligent Robots and Systems (IROS), Kyoto, 2022, 71–78.
[31] M. Matak and T. Hermans, Planning visual-tactile precision grasps via complementary use of vision and touch, IEEE Robotics and Automation Letters, 8(2), 2022, 768–775.
[32] S. Hochreiter and J. Schmidhuber, Long short-term memory, Neural Computation, 9(8), 1997, 1735–1780.
[33] A. Graves and J. Schmidhuber, Framewise phoneme classification with bidirectional LSTM and other neural network architectures, Neural Networks, 18(5–6), 2005, 602–610.
[34] K. Greff, R.K. Srivastava, J. Koutník, B.R. Steunebrink, and J. Schmidhuber, LSTM: A search space odyssey, IEEE Transactions on Neural Networks and Learning Systems, 28(10), 2016, 2222–2232.
[35] Y. Jiang, S. Moseson, and A. Saxena, Efficient grasping from RGBD images: Learning using a new rectangle representation, Proc. 2011 IEEE International Conf. on Robotics and Automation, Shanghai, 2011, 3304–3311.
[36] J. Redmon and A. Angelova, Real-time grasp detection using convolutional neural networks, Proc. 2015 IEEE International Conf. on Robotics and Automation (ICRA), Seattle, WA, 2015, 1316–1322.
[37] S. Kumra and C. Kanan, Robotic grasp detection using deep convolutional neural networks, Proc. 2017 IEEE/RSJ International Conf. on Intelligent Robots and Systems (IROS), Vancouver, BC, 2017, 769–776.
[38] D. Morrison, P. Corke, and J. Leitner, Learning robust, real-time, reactive robotic grasping, The International Journal of Robotics Research, 39(2–3), 2020, 183–201.
[39] H. Karaoguz and P. Jensfelt, Object detection approach for robot grasp detection, Proc. 2019 International Conf. on Robotics and Automation (ICRA), Montreal, QC, 2019, 4953–4959.
