
[Siemens] Coding-Free Robot Control: Rapid Learning for Automated Processes

October 28, 2019

Show it once, and leave the rest to the robot

Team: Zhe Wang (ME), Gordon Li (ME), Tsegab Mekonnen (ME)
Advisors: Chengtao Wen (Siemens), Gabriel Gomes (ME)

Industrial robot arms are widely used across many fields, including manufacturing automation. Whenever one of these robots must adapt to a new production task, its controller has to be reprogrammed. Our goal is to make this transition more user-friendly, cost-effective, and adaptive by enabling a robot arm to mimic human motions without explicit programming. To achieve this, we adopted computer vision object-tracking techniques to automatically record, analyze, and extract useful information from a human demonstration, and then executed the desired trajectory with Model Predictive Control.

Solution

Our goal is to enable a robot arm to learn new tasks without explicit programming. To achieve this, we adopted computer vision object-tracking techniques to automatically extract coordinate information from human motion, then used Model Predictive Control to follow the resulting trajectory.

Benefits

  • User-friendly
  • Adaptive
  • Cost-effective

Process

First, OpenCV is used to extract the spatial coordinates of a moving object from a video, either by filtering on a specific color (A) or by selecting a region of interest (B). A graphical user interface (C) was then developed to conveniently record and extract this information.
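The tracking code itself is not published in this post; below is a minimal sketch of the color-based variant (A) using OpenCV 4.x, where the HSV bounds and video path are illustrative placeholders:

```python
import cv2
import numpy as np

def track_colored_object(video_path, hsv_low, hsv_high):
    """Return a list of (frame_index, x, y) pixel coordinates of the
    largest blob matching an HSV color range in each frame."""
    cap = cv2.VideoCapture(video_path)
    trajectory, frame_idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, hsv_low, hsv_high)
        # Remove speckle noise, then keep the largest contour as the tracked object.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            largest = max(contours, key=cv2.contourArea)
            m = cv2.moments(largest)
            if m["m00"] > 0:
                # Centroid of the blob in pixel coordinates.
                trajectory.append((frame_idx, m["m10"] / m["m00"], m["m01"] / m["m00"]))
        frame_idx += 1
    cap.release()
    return trajectory

# Illustrative usage: track a blue marker (HSV bounds depend on the scene and lighting).
path = track_colored_object("demo.mp4", np.array([100, 150, 50]), np.array([130, 255, 255]))
```

For the region-of-interest variant (B), one of OpenCV's built-in object trackers (for example cv2.TrackerCSRT_create) can be initialized on a user-selected bounding box instead of a color filter.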

Model Predictive Control then follows the planned trajectory as closely as possible: at each step it receives the current robot joint-angle measurements from the simulator and calculates the optimal next-step torque.
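The exact MPC formulation used in the project is not given here; the sketch below illustrates the receding-horizon idea on a generic linear tracking problem solved with cvxpy, using a double integrator as a stand-in for a single joint (the real UR5 dynamics are nonlinear and multi-joint):

```python
import numpy as np
import cvxpy as cp

def mpc_step(A, B, x0, x_ref, horizon=10, u_max=5.0):
    """Solve a finite-horizon tracking problem and return only the first
    control input; re-solving at every step gives model predictive control."""
    nx, nu = B.shape
    x = cp.Variable((nx, horizon + 1))
    u = cp.Variable((nu, horizon))
    cost, constraints = 0, [x[:, 0] == x0]
    for k in range(horizon):
        # Penalize deviation from the reference trajectory plus control effort.
        cost += cp.sum_squares(x[:, k + 1] - x_ref[:, k + 1]) + 0.01 * cp.sum_squares(u[:, k])
        constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                        cp.abs(u[:, k]) <= u_max]
    cp.Problem(cp.Minimize(cost), constraints).solve()
    return u[:, 0].value  # apply this torque, then re-plan at the next time step

# Toy single-joint model: state = [angle, angular velocity], input = torque-like command.
dt = 0.05
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
x_ref = np.tile(np.array([[1.0], [0.0]]), (1, 11))  # hold the joint at 1 rad
tau = mpc_step(A, B, np.zeros(2), x_ref)
```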

Finally, the robot simulator demonstrates the effect of the MPC and PID controllers on a virtual UR5 robot arm. The simulator and controller form a closed-loop feedback system, and VREP's built-in proportional-integral-derivative (PID) controller serves as the baseline for comparison with MPC.
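The project uses VREP's built-in PID; conceptually, a per-joint PID loop like the sketch below (gains and time step are illustrative, and the simulator read/write calls are omitted) is what MPC is being compared against:

```python
class PID:
    """Simple per-joint PID controller, the kind of baseline MPC is compared to."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured):
        # Standard PID law on the joint-angle error.
        error = target - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Closed-loop usage each simulation cycle: read the joint angle from the
# simulator, compute a command, and send it back.
pid = PID(kp=50.0, ki=0.5, kd=5.0, dt=0.05)
# command = pid.update(target_angle, measured_angle)
```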

Results

Compared with the PID baseline, the MPC controller:

  • Has global awareness of the trajectory (it optimizes over a prediction horizon rather than reacting only to the instantaneous error)
  • Handles multiple inputs and outputs
  • Produces smaller tracking errors
Project brief

