{"id":231,"date":"2025-02-17T05:12:27","date_gmt":"2025-02-17T05:12:27","guid":{"rendered":"https:\/\/chinaservicerobots.com\/?p=231"},"modified":"2025-02-17T05:12:27","modified_gmt":"2025-02-17T05:12:27","slug":"robotic-grasping-technology-based-on-shape-analysis-and-probabilistic-reasoning","status":"publish","type":"post","link":"https:\/\/chinaservicerobots.com\/ar\/robotic-grasping-technology-based-on-shape-analysis-and-probabilistic-reasoning\/","title":{"rendered":"Robotic Grasping Technology Based on Shape Analysis and Probabilistic Reasoning"},"content":{"rendered":"<div class=\"article-left\">\n<div id=\"articleEnMeta\" class=\"articleEn loaded\">\n<h2><span style=\"font-family: arial, helvetica, sans-serif; font-size: 12pt;\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium\" src=\"https:\/\/robot.sia.cn\/fileJQR\/journal\/article\/jqr\/2025\/1\/6e3281d8-f7f0-4d27-a86b-6498b9f6270b.jpg\" width=\"1180\" height=\"885\" \/><\/span><\/h2>\n<h2><span style=\"font-family: arial, helvetica, sans-serif; font-size: 12pt;\">School of Computer Science and Engineering, South China University of Technology, Guangzhou 510006, China<\/span><\/h2>\n<div class=\"com-author-info article-fundPrjs\"><\/div>\n<p><span style=\"font-family: arial, helvetica, sans-serif;\"><a class=\"togglebtn mainColor\"><i class=\"articleFont icon-jia\"><\/i>More Information<\/a><\/span><\/div>\n<\/div>\n<ul class=\"article-tab-box tab-content article-box-content\">\n<li style=\"list-style-type: none;\">\n<ul class=\"article-tab-box tab-content article-box-content\">\n<li id=\"GraphicalAbstract\" class=\"articleListBox loaded\"><\/li>\n<li id=\"Abstract\" class=\"articleListBox loaded\">\n<h3 id=\"Abstract-list\" class=\"navTitle visible-lg\"><span style=\"font-family: arial, helvetica, sans-serif;\">Abstract<\/span><\/h3>\n<div id=\"014d4de1-b8be-4e37-bc5e-f759ade7bbce_abs_div_0\" class=\"article-abstract \"><span style=\"font-family: arial, helvetica, sans-serif;\">In the task of grasping irregular objects, the transported objects may shake and fall off due to their complex and diverse shapes and structures. For these issues, a robotic grasping technology based on shape analysis and probabilistic reasoning is proposed. Firstly, the dispersivity and flatness of the object&#8217;s point cloud are analyzed to generate a set of candidate grasping poses. Then, the factors influencing the shaking and falling off of the object are qualitatively analyzed in the simulation scenario, and the number of successful grasping and rotation-translation experiments is statistically counted in the simulation. The stability of the grasp pose is quantitatively analyzed using the conditional expectation method, and a PointNet discriminator is trained to evaluate and rank the candidate grasp poses. The grasping is ultimately completed with the optimal grasp pose. The experimental results indicate that the proposed method can solve the issue of shaking and falling off of irregular objects during the grasping and transporting process. Compared with the benchmark method, the average grasping success rate is improved to 89.2%, an increase of 2.6%, and the average transportation stability is enhanced to 84.2%, an increase of 22.7%. 
The proposed method enables intelligent grasping of objects in multi-object stacking scenarios, ensuring stability during the grasping and transporting process, and establishing a logical sequence for grasping.\u00a0<span id=\"icon_014d4de1-b8be-4e37-bc5e-f759ade7bbce_abs_div_0\" class=\"translate-icon\" title=\"Translate this paragraph\"><\/span><\/span><\/div>\n<p><span style=\"font-family: arial, helvetica, sans-serif;\"><b>Keywords:<\/b><\/span><\/p>\n<ul class=\"article-keyword article-info-en\">\n<li><span style=\"font-family: arial, helvetica, sans-serif;\"><a class=\"underHigh mainColor\">intelligent grasping<\/a>,\u00a0<\/span><\/li>\n<li><span style=\"font-family: arial, helvetica, sans-serif;\"><a class=\"underHigh mainColor\">irregular object<\/a>,\u00a0<\/span><\/li>\n<li><span style=\"font-family: arial, helvetica, sans-serif;\"><a class=\"underHigh mainColor\">shape characteristics<\/a>,\u00a0<\/span><\/li>\n<li><span style=\"font-family: arial, helvetica, sans-serif;\"><a class=\"underHigh mainColor\">grasping pose evaluation<\/a>,\u00a0<\/span><\/li>\n<li><span style=\"font-family: arial, helvetica, sans-serif;\"><a class=\"underHigh mainColor\">6D grasping pose<\/a><\/span><\/li>\n<\/ul>\n<\/li>\n<li id=\"FullText\" class=\"articleListBox FullText-all html-text\">\n<div class=\"appendix-html\"><\/div>\n<div class=\"acks-html\"><\/div>\n<\/li>\n<li id=\"References\" class=\"articleListBox loaded\">\n<h3 id=\"References-list\" class=\"navTitle\"><span style=\"font-family: arial, helvetica, sans-serif;\">References<\/span><\/h3>\n<div class=\"References-wrap\">\n<table class=\"reference-tab\">\n<tbody>\n<tr id=\"b1\" class=\"document-box\">\n<td class=\"td1\" valign=\"top\"><span style=\"font-family: arial, helvetica, sans-serif;\">[1]<\/span><\/td>\n<td class=\"td2\">\n<div class=\"reference-cn\"><span style=\"font-family: arial, helvetica, sans-serif;\">\u9648\u6dd1\u5a77. \u4e2d\u56fd\u5de5\u4e1a\u673a\u5668\u4eba\u4ea7\u4e1a\u521b\u65b0\u7f51\u7edc\u6f14\u5316\u7814\u7a76[D]. \u5e7f\u5dde: \u5e7f\u5dde\u5927\u5b66, 2022.\u00a0doi:\u00a0<a class=\"mainColor ref-doi \" href=\"https:\/\/dx.doi.org\/10.27040\/d.cnki.ggzdu.2022.000652\" target=\"_blank\" rel=\"noopener\">10.27040\/d.cnki.ggzdu.2022.000652<\/a><\/span><\/div>\n<p class=\"mar6\">\n<div class=\"reference-en\"><span style=\"font-family: arial, helvetica, sans-serif;\">CHEN S T. The research on innovation network evolution of Chinese industrial robot industry[D]. Guangzhou: Guangzhou University, 2022.\u00a0doi:\u00a0<a class=\"mainColor ref-doi \" href=\"https:\/\/dx.doi.org\/10.27040\/d.cnki.ggzdu.2022.000652\" target=\"_blank\" rel=\"noopener\">10.27040\/d.cnki.ggzdu.2022.000652<\/a><\/span><\/div>\n<\/td>\n<\/tr>\n<tr id=\"b2\" class=\"document-box\">\n<td class=\"td1\" valign=\"top\"><span style=\"font-family: arial, helvetica, sans-serif;\">[2]<\/span><\/td>\n<td class=\"td2\">\n<div class=\"reference-cn\"><span style=\"font-family: arial, helvetica, sans-serif;\">\u6f58\u9759\u6960. \u4eba\u53e3\u5e74\u9f84\u7ed3\u6784\u8001\u5316\u3001\u52b3\u52a8\u529b\u6d41\u52a8\u4e0e\u673a\u5668\u6362\u4eba[D]. \u676d\u5dde: \u6d59\u6c5f\u5927\u5b66, 2022.\u00a0doi:\u00a0<a class=\"mainColor ref-doi \" href=\"https:\/\/dx.doi.org\/10.27461\/d.cnki.gzjdx.2022.000490\" target=\"_blank\" rel=\"noopener\">10.27461\/d.cnki.gzjdx.2022.000490<\/a><\/span><\/div>\n<p class=\"mar6\">\n<div class=\"reference-en\"><span style=\"font-family: arial, helvetica, sans-serif;\">PAN J N. 
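The abstract does not give the formulas behind "dispersivity" and "flatness", but eigenvalue features of the point cloud's covariance matrix are the standard way to measure such properties. The sketch below is a minimal, illustrative version of that idea (not the authors' exact formulation): both scores are derived from the sorted PCA eigenvalues of the cloud.

```python
# A hedged sketch of PCA-based shape analysis for candidate-grasp generation.
# The exact dispersivity/flatness definitions are assumptions, not the paper's.
import numpy as np

def shape_features(points: np.ndarray) -> tuple[float, float]:
    """points: (N, 3) object point cloud. Returns (dispersivity, flatness)."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    # Eigenvalues of the covariance, sorted descending: l1 >= l2 >= l3 >= 0.
    l1, l2, l3 = np.sort(np.linalg.eigvalsh(cov))[::-1]
    dispersivity = l3 / (l1 + l2 + l3)  # isotropic spread: ~1/3 for a ball, ~0 for a plane
    flatness = (l2 - l3) / l1           # planarity of the dominant surface
    return float(dispersivity), float(flatness)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # A thin, plate-like cloud should score low on dispersivity, high on flatness.
    plate = rng.normal(size=(1000, 3)) * np.array([1.0, 0.5, 0.01])
    print(shape_features(plate))
```

Candidate poses would then be sampled preferentially on regions whose scores indicate graspable geometry, e.g. flat, parallel-surface patches for a parallel-jaw gripper.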
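The "conditional expectation" stability score is described only at a high level: repeated simulation trials per candidate pose count grasp successes and how many of those survive rotation-translation tests. One plausible reading, shown here purely as an assumption, is the product of the empirical success probability and the empirical probability of stable transport given success.

```python
# Illustrative conditional-expectation stability label from simulation counts.
# The trial counts and the weighting are hypothetical, not taken from the paper.
from dataclasses import dataclass

@dataclass
class TrialCounts:
    grasps: int     # total grasp attempts in simulation
    successes: int  # attempts where the object was lifted
    stable: int     # successes that survived the rotation-translation tests

def stability_score(c: TrialCounts) -> float:
    if c.grasps == 0 or c.successes == 0:
        return 0.0
    p_success = c.successes / c.grasps               # P(grasp succeeds)
    p_stable_given_success = c.stable / c.successes  # E[stable | success]
    return p_success * p_stable_given_success        # P(stable transport overall)

print(stability_score(TrialCounts(grasps=20, successes=18, stable=15)))  # 0.75
```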
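Finally, a PointNet discriminator scores and ranks the candidates. The paper's network details are not reproduced in the abstract, so the following is a minimal PointNet-style classifier sketch in PyTorch, assuming the input is the point cloud inside the gripper's closing region for each candidate pose (as in PointNetGPD-type pipelines); architecture sizes and the ranking loop are illustrative.

```python
# Minimal PointNet-style grasp discriminator sketch (PyTorch); illustrative only.
import torch
import torch.nn as nn

class GraspPointNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared per-point MLP implemented as 1x1 convolutions,
        # followed by a symmetric max pool over points (the PointNet idea).
        self.mlp = nn.Sequential(
            nn.Conv1d(3, 64, 1), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.ReLU(),
            nn.Conv1d(128, 1024, 1), nn.ReLU(),
        )
        self.head = nn.Sequential(
            nn.Linear(1024, 256), nn.ReLU(),
            nn.Linear(256, 1),  # stability logit for one candidate pose
        )

    def forward(self, pts: torch.Tensor) -> torch.Tensor:
        # pts: (B, 3, N) points inside the gripper closing region.
        feat = self.mlp(pts).max(dim=2).values  # (B, 1024) global feature
        return self.head(feat).squeeze(-1)      # (B,) candidate scores

# Rank candidate poses by predicted stability and pick the best one.
model = GraspPointNet().eval()
candidates = torch.randn(8, 3, 512)  # 8 hypothetical candidate closing regions
with torch.no_grad():
    order = model(candidates).argsort(descending=True)
print("best candidate index:", order[0].item())
```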