Automation plays an important role in increasing productivity and maintaining quality in industrial environments. Industrial robots make throughput scalable and deliver a consistency of quality that is very difficult to achieve with manual labor.
Large automobile companies outsource the production of smaller parts to small and medium-sized (SME) industries. These SMEs invest in industrial robots for the manufacturing and finishing of auto parts.
During commissioning, ABB programs and configures each robot for a particular job. Whenever the job or workpiece changes, the SME must return to ABB to have the robot reconfigured.
How could such industries adapt to frequent job changes without depending on ABB to reconfigure the robot?
The project explored direct manipulation techniques for configuring the robot. Instead of a joystick or push-buttons, human gestures were used to communicate with the robot. The project had two main parts: configuring the robot and defining the paint paths. A Microsoft Kinect camera was used to track points on the operator's body.
The movement of the hand joints was mapped onto the axes of the robot: the angle through which a particular hand joint moved was signalled to the robot, which moved correspondingly. In this way the robot could be configured to a large extent, with fine tuning done through the Teach Pendant Unit (TPU).
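The joint-to-axis mapping can be sketched in a few lines. This is a minimal illustration, not the project's actual implementation: the helper names `joint_angle` and `map_to_axis`, the choice of a simple linear mapping, and the example axis range are all assumptions made here for clarity.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by 3D points a-b-c
    (e.g. shoulder-elbow-wrist from the Kinect skeleton)."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def map_to_axis(angle, human_range=(0.0, 180.0), axis_range=(-90.0, 90.0)):
    """Linearly map a human joint angle onto a (hypothetical) robot
    axis range, clamping so the robot never leaves its limits."""
    t = (angle - human_range[0]) / (human_range[1] - human_range[0])
    t = min(max(t, 0.0), 1.0)
    return axis_range[0] + t * (axis_range[1] - axis_range[0])
```

For example, an elbow bent at a right angle (`joint_angle((0, 0, 0), (1, 0, 0), (1, 1, 0))` gives 90°) maps to the midpoint of the assumed axis range. Clamping is the important design choice: raw Kinect data is noisy, so the gesture should only ever command angles the axis can safely reach.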
Motion tracking was used to capture the paint path followed by an expert painter. The 3D points that constitute the path were fed to the robot, which replicated the expert painter's strokes.
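Capturing a usable path from a tracked hand requires thinning the raw samples, since the sensor emits many near-duplicate points as the hand jitters in place. A minimal sketch of that recording step, assuming a hypothetical `record_path` helper and a distance threshold chosen here for illustration:

```python
import math

def record_path(samples, min_step=0.02):
    """Thin a stream of tracked 3D points (metres) into a paint path:
    keep a point only if it is at least min_step away from the last
    kept point, discarding sensor jitter before replay on the robot."""
    path = []
    for p in samples:
        if not path or math.dist(p, path[-1]) >= min_step:
            path.append(p)
    return path
```

The surviving waypoints would then be converted to robot move targets on the controller side; the threshold trades path fidelity against the number of targets the robot must execute.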
In this way, paint-efficient and energy-efficient paths could be generated. The project explored how a robot could learn through natural user interfaces such as human gestures.