

Center for Intelligent Robotics (智能机器人研究中心)

Key research focus

The Center for Intelligent Robotics aims to advance the science and engineering of robotics, with an emphasis on intelligent robot systems. By developing core expertise in three pillars, namely (A) Robot Control & Automation, (B) Computational Robotics, and (C) Robotic Systems & Devices, the center engages actively in frontier research directed towards cutting-edge technology for intelligent robot systems. At the intersections of these pillars lie subjects including Mechatronics, Cybernetics, and Cyber-Physical Systems, which in turn converge on our focus on Intelligent Robotics. The envisioned areas include, but are not limited to, (i) robot planning, control & perception; (ii) collaborative robots & physical human-robot interaction (pHRI); (iii) human-machine interfaces; (iv) smart sensing devices & intelligent imaging technology for robotic systems; and (v) probabilistic, knowledge-based, data-driven & cloud robotics. Through a multidisciplinary approach encompassing robotics, computational intelligence, and engineering design, the center hopes to foster strong cross-disciplinary research collaboration and to synergize faculty expertise in the area of Intelligent Robotics unique to ZJUI.



Research projects: Representative research projects include, but are not limited to, the following (examples, to be consolidated and streamlined towards the center vision):

Robotics in Human-Supervised (Cyber-Physical) Systems for Agriculture [1-2]

•    Mobile Robotics Platform for Poultry Production

•    Mobile Robotics Platform for Plant Factories

•    Autonomous Navigation under Plant Canopy
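As a hedged illustration of the kind of vision-guided row following involved in under-canopy navigation: a detector (e.g., a YOLO-style model, as in [2]) returns bounding boxes of plant stems, and a steering command keeps the robot centered between the two rows. The box format, function names, and gain below are illustrative assumptions, not the method of [2].

```python
# Hypothetical sketch: steering from detected plant-stem bounding boxes.
# Assumes a detector returns boxes as (x_min, y_min, x_max, y_max) in
# image pixels; all names and the gain value are illustrative.

def row_center_error(boxes, image_width):
    """Signed pixel offset of the lane center (midpoint between the
    nearest stem on each side) from the image center."""
    centers = [(x0 + x1) / 2.0 for x0, y0, x1, y1 in boxes]
    left = [c for c in centers if c < image_width / 2]
    right = [c for c in centers if c >= image_width / 2]
    if not left or not right:
        return 0.0  # a row is not visible on one side: hold heading
    lane_center = (max(left) + min(right)) / 2.0
    return lane_center - image_width / 2

def steering_command(error_px, gain=0.002):
    """Proportional steering angle (rad) from the pixel error."""
    return -gain * error_px

# One detected stem per row in a 640-px-wide image:
boxes = [(100, 200, 140, 380), (520, 210, 560, 390)]
err = row_center_error(boxes, image_width=640)
print(err, steering_command(err))
```

In practice the perception side (occlusion under the canopy, leaf clutter) is the hard part; the control law above is deliberately the simplest possible closing of the loop.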


Robotic Image-Guided Planning, Control & Perception

•    Image-Guided Surgical Planning and Robotic Interventional Procedures [3]

•    Intelligent Intraoperative Imaging [4-7]

•    Robot Vision for Micromanipulator Control [8-12]


Data-Driven Robot-Assisted Surgical Procedures and Training

•    Surgical Video-Data-Driven Trajectory Analysis [13]

•    Kinesthetic Training System for Robotic Surgery [14-15]

•    Human-Robot Interaction and Collaborative Control [16]
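The collaborative-control work in [16] uses variable admittance control for physical human-robot interaction. As a hedged, one-dimensional sketch only (the mass, damping schedule, and force values are assumptions, not parameters from [16]): the robot renders the dynamics M·dv/dt + B·v = F_ext, with damping lowered when the operator pushes harder so the robot feels more compliant under load.

```python
# Hypothetical sketch of a discrete-time admittance controller for pHRI:
# M * dv/dt + B * v = F_ext, integrated with explicit Euler. The
# variable-damping schedule and all constants are illustrative.

def variable_damping(f_ext, b_min=5.0, b_max=25.0, f_scale=20.0):
    """Interpolate damping: stiff near zero force, compliant under load."""
    ratio = min(abs(f_ext) / f_scale, 1.0)
    return b_max - (b_max - b_min) * ratio

def admittance_step(v, f_ext, mass=2.0, dt=0.01):
    """One Euler step of M*dv + B*v = F_ext -> next commanded velocity."""
    b = variable_damping(f_ext)
    dv = (f_ext - b * v) / mass
    return v + dt * dv

v = 0.0
for _ in range(200):          # 2 s of a constant 10 N push
    v = admittance_step(v, 10.0)
print(round(v, 3))            # settles near F/B = 10/15 m/s
```

The commanded velocity converges to F_ext/B, so lowering B under sustained force lets the human move the robot faster with the same effort; stability of such schedules when the environment also moves is precisely the question addressed in [16].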


Prospective Scope

Adaptive Cyber-Physical Systems

•    Communicating robots in production, e.g. welding robots in car manufacturing

•    Cloud-controlled robots in health care at home, in hospitals and rest-homes

•    Interaction with non-robotic cyber-physical systems

Field Robotics

•    Mobile Robots: Airborne, Underwater, Ground, Wheeled, Legged

•    Adaptive Robots: Reconfigurable Robots, Soft Robots, Swarm Robotics

The New-Generation Robotics

•    Bio-robotics, Cyborg Robotics, Molecular Robotics, Quantum Robotics

Key members of the center

Hongwei Wang; Huan Hu; Jiahuan Cui; Klaus-Dieter Schewe; Kuan-Chong Ting; Liangjing Yang; Mark Butala; Shurun Tan; Wee-Liat Ong; Yu Lin


College of Control Science & Engineering, ZJU: Chao Xu

School of Mechanical Engineering, ZJU: Tao Liu, Yong Lei

College of Biosystems Engineering and Food Science, ZJU: Tao Lin, Yibin Ying

UIUC: Elizabeth Hsiao-Wecksler, Thenkurussi “Kesh” Kesavadas, Girish Chowdhary

Postdoctoral Researcher: Santosh Kumar Singh


1. G. Ren, T. Lin, Y. Ying, G. Chowdhary, K.-C. Ting, "Agricultural robotics research applicable to poultry production: A review," Computers and Electronics in Agriculture, vol. 169, 105216, 2020.

2. G. Ren, E. B. Velasquez, T. Wu, T. Lin, Y. Ying, G. Chowdhary, K.-C. Ting, "Autonomous navigation under plant canopy using an improved YOLO-based model," in preparation.

3. L. Yang, R. Wen, J. Qin, C.-K. Chui, K.-B. Lim, and S. K. Y. Chang, "A Robotic System for Overlapping Radiofrequency Ablation in Large Tumor Treatment," IEEE/ASME Transactions on Mechatronics, vol. 15, pp. 887-897, 2010.

4. L. Yang, J. Wang, T. Ando, et al., "Self-contained image mapping of placental vasculature in 3D ultrasound-guided fetoscopy," Surgical Endoscopy, vol. 12, pp. 4136-4149, 2015.

5. L. Yang, J. Wang, T. Ando, et al., "Towards scene adaptive image correspondence for placental vasculature mosaic in computer assisted fetoscopic procedures," International Journal of Medical Robotics and Computer Assisted Surgery, vol. 12, pp. 375-386, 2016.

6. L. Yang, J. Wang, E. Kobayashi, et al., "Image mapping of untracked free-hand endoscopic views to an ultrasound image-constructed 3D placenta model," International Journal of Medical Robotics and Computer Assisted Surgery, vol. 11, no. 2, pp. 223-234, 2015.

7. L. Yang, J. Wang, T. Ando, et al., "Vision-based endoscope tracking in 3D ultrasound image-guided fetoscopic procedures," Computerized Medical Imaging and Graphics, vol. 40, pp. 205-216, 2014.

8. L. Yang, K. Youcef-Toumi, U. Tan, "Towards Automatic Robot-Assisted Microscopy: An Uncalibrated Approach for Robotic Vision-Guided Micromanipulation," in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 2016.

9. L. Yang, I. Paranawithana, et al., "Automatic Vision-Guided Micromanipulation for Versatile Deployment and Portable Setup," IEEE Transactions on Automation Science and Engineering, vol. 15, pp. 1609-1615, 2018 (DOI: 10.1109/TASE.2017.2754517).

10. L. Yang, K. Youcef-Toumi, U. Tan, "Detect-Focus-Track-Servo (DFTS): A Vision-Based Workflow Algorithm for Robotic Image-Guided Micromanipulation," in IEEE International Conference on Robotics and Automation (ICRA), Singapore, 2017.

11. L. Yang, I. Paranawithana, K. Youcef-Toumi, U. Tan, "Self-initialization and recovery for uninterrupted tracking in vision-guided micromanipulation," in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, Canada, 2017.

12. L. Yang, I. Paranawithana, K. Youcef-Toumi, U. Tan, "Confidence-Based Hybrid Tracking to Overcome Visual Tracking Failures in Calibration-less Vision-Guided Micromanipulation," IEEE Transactions on Automation Science and Engineering, vol. 17, pp. 524-536, Jan. 2020.

13. J. Huang, X. Li, T. Kesavadas, L. Yang, "Feature Extraction of Video Data for Automatic Visual Tool Tracking in Robot Assisted Surgery," in Proceedings of the 4th International Conference on Robotics, Control and Automation, pp. 121-127, Jul. 2019.

14. T. Yang, L. Yang, J. Liu, C.-K. Chui, et al., "Robotic Surgical Trainer for Image-guided Robot-Assisted Surgical Training," US 20130224710 A1, published 2013.

15. K. Hong, Y. Sun, J. He, Y. Lei, L. Yang, "Preliminary design of a robotic system for kinaesthetic training of surgical tasks," The Journal of Engineering, vol. 14, pp. 490-494, 2019.

16. H.-Y. Li, I. Paranawithana, L. Yang, et al., "Stable and Compliant Motion of Physical Human-Robot Interaction Coupled with a Moving Environment using Variable Admittance and Adaptive Control," IEEE Robotics and Automation Letters, vol. 3, pp. 2493-2500, July 2018.