Machine Vision


Pick and place using LabVIEW and NI Vision Acquisition


Hi everyone,

 

I am doing a student project on vision-guided pick and place with an industrial robot (ABB). I would like to know the steps involved in creating the block diagram.

I have to locate the object and get its coordinates through a webcam, then do pattern matching and send the coordinates to a microcontroller, then from the microcontroller to the robot controller. The industrial robot should then pick the object and place it in a predefined place.

I would be extremely grateful if you could help me, since I am new to LabVIEW.

 

 

Thanks,

Pradeep.M

( impradeepm@gmail.com)

Message 1 of 6
Solution
Accepted by ImPradeepm

What you are describing is fairly involved, but here are some tips. The key is to correlate the robot's coordinate system with the camera's coordinate system. I assume the camera is statically mounted above the pick-up area? I would move the robot, at its vertical pick height, to each corner of the camera frame and note the robot's position at each of those locations. These four points in space will be correlated to the X,Y pixel coordinates of the camera frame. You basically need to write a sub-VI whose inputs are the pixel X and Y coordinates and whose output is the robot coordinates; a rough sketch of that mapping is shown below.
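Since the sub-VI itself is graphical, here is a text-language sketch (Python) of the same pixel-to-robot mapping, assuming an affine fit over the four corner correspondences. All point values below are placeholders for your own measurements.

import numpy as np

# (pixel_x, pixel_y) -> (robot_x, robot_y), measured by jogging the robot
# to each corner of the camera frame at pick height.
pixel_pts = np.array([[0, 0], [639, 0], [639, 479], [0, 479]], dtype=float)
robot_pts = np.array([[200.0, -150.0], [200.0, 150.0],
                      [500.0, 150.0], [500.0, -150.0]])  # example values in mm

# Solve [px py 1] @ M = [rx ry] in the least-squares sense (affine fit).
A = np.hstack([pixel_pts, np.ones((4, 1))])
M, *_ = np.linalg.lstsq(A, robot_pts, rcond=None)

def pixel_to_robot(px, py):
    """Equivalent of the sub-VI: pixel X,Y in, robot X,Y out."""
    rx, ry = np.array([px, py, 1.0]) @ M
    return rx, ry

print(pixel_to_robot(320, 240))  # roughly the centre of the work area

If the camera is not mounted perfectly perpendicular to the table, a full perspective (homography) fit would be more accurate than the affine fit above.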

 

Write a test application that tells the robot to go to any X,Y pixel location in the frame to verify your sub-VI. Once that is working, you need to set up a pattern match. You will likely want to do a geometric pattern match. Have a look at this example:  http://zone.ni.com/devzone/cda/epd/p/id/5555
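As a rough stand-in for that step, here is what the idea looks like with OpenCV template matching (the NI geometric matcher additionally handles rotation and scale); the file names and score threshold are assumptions:

import cv2

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)       # webcam frame
template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)  # taught object image

result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
_, score, _, top_left = cv2.minMaxLoc(result)

if score > 0.8:  # "minimum score" threshold, tune for your setup
    h, w = template.shape
    cx, cy = top_left[0] + w / 2, top_left[1] + h / 2  # object centre in pixels
    print(f"match at ({cx:.0f}, {cy:.0f}) with score {score:.2f}")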

 

You will need your pattern-match algorithm to return both the coordinates for your robot and the orientation of the tool required to properly pick up the object (if the pick-and-place tool must be in a specific orientation). So it's basically up to you to convert the object's X, Y, and rotation angle in the frame, which you get from the pattern match, to whatever coordinate system the robot uses; a sketch of that conversion follows.
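Building on the pixel_to_robot() sketch above, the conversion might look like this; the camera-to-robot angle offset is an assumed calibration value you would measure for your own setup:

# Convert a pattern-match result (pixel X, Y, rotation angle) into a robot pose.
CAMERA_TO_ROBOT_ANGLE_OFFSET = 90.0  # degrees, measured for your setup

def match_to_robot_pose(px, py, match_angle_deg):
    rx, ry = pixel_to_robot(px, py)
    # Tool rotation needed so the gripper lines up with the object.
    tool_angle = (match_angle_deg + CAMERA_TO_ROBOT_ANGLE_OFFSET) % 360.0
    return rx, ry, tool_angle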

 

The placing algorithm could just be an orientation adjustment to put the object into its placement orientation, and then the placement positions could be an array of robot coordinates that you iterate through after each pick (see the small sketch below).
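For example, stepping through a fixed array of placement poses, one per pick (all coordinates are placeholders):

place_positions = [
    (450.0, -100.0, 0.0),   # (robot_x, robot_y, tool_angle)
    (450.0,    0.0, 0.0),
    (450.0,  100.0, 0.0),
]
place_index = 0

def next_place_pose():
    """Return the next placement pose, wrapping around when the array is used up."""
    global place_index
    pose = place_positions[place_index % len(place_positions)]
    place_index += 1
    return pose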

 

Make sure to implement some safety mechanisms in your algorithms so that the robot can never move somewhere outside a safe range of coordinates.
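One simple mechanism is a software workspace limit that refuses any target outside a safe bounding box; the limit values below are placeholders for your cell:

X_LIMITS = (150.0, 550.0)   # mm, safe range in robot coordinates
Y_LIMITS = (-200.0, 200.0)  # mm

def is_within_safe_range(rx, ry):
    return X_LIMITS[0] <= rx <= X_LIMITS[1] and Y_LIMITS[0] <= ry <= Y_LIMITS[1]

def checked_target(rx, ry):
    """Raise instead of ever sending an out-of-range target to the robot."""
    if not is_within_safe_range(rx, ry):
        raise ValueError(f"target ({rx:.1f}, {ry:.1f}) is outside the safe workspace")
    return rx, ry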

Message 2 of 6

What have you got so far?

From looking at your workflow, you have four key sections:

1) Object detection

2) Object recognition

3) Serial? Communication

4) Microcontroller stuff

 

1 and 2 are linked together.

First you need to use something like IMAQ (not sure whether it comes with Vision Development or Vision Acquisition) to detect objects within the field of view of the webcam, then perform some kind of filtering to keep only the one(s) you want (for instance, you could remove all objects which are too large or too small). The location of the object is then fairly easy to get (in pixel coordinates).
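A rough analogue of that detect-and-filter step, assuming OpenCV in place of IMAQ particle analysis (threshold the frame, find blobs, keep only those within an expected area range, report pixel centroids); the file name and area limits are assumptions:

import cv2

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(frame, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

MIN_AREA, MAX_AREA = 500, 20000  # pixels^2, tune for your objects

for c in contours:
    area = cv2.contourArea(c)
    if MIN_AREA <= area <= MAX_AREA:
        m = cv2.moments(c)
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # centroid in pixels
        print(f"candidate object at ({cx:.0f}, {cy:.0f}), area {area:.0f}")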

 

Before you do 3, you need to work out what it is you want to tell the microcontroller. Just the current location of the object? The current and the new location? And so on. Depending on how the microcontroller is programmed, you may have to convert the object location(s) into something useful for it. For example, does the industrial robot operate on raw X-Y-Z coordinates, or does it use angles for its joints?
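Whatever you decide, the LabVIEW side reduces to formatting and sending that message. Here is a sketch of one possible serial format using pyserial; the port name, baud rate, and the "PICK x y angle" framing are all assumptions to be matched to whatever the microcontroller firmware expects:

import serial

def send_pick_command(port, rx, ry, tool_angle):
    """Send one pick request and wait for a one-line reply (e.g. 'OK' or 'ERR')."""
    with serial.Serial(port, baudrate=115200, timeout=2) as link:
        link.write(f"PICK {rx:.1f} {ry:.1f} {tool_angle:.1f}\n".encode("ascii"))
        reply = link.readline().decode("ascii").strip()
        return reply

# send_pick_command("COM3", 350.0, -20.0, 45.0)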

 

Microcontroller stuff - You need to work out how it communicates with the robot and program it appropriately. Keep in mind you'll need to program it to receive communication from LabVIEW.

 

LabVIEW

Detect objects -> Filter objects -> Determine location of object to "pick" -> Generate commands to send to microcontroller ->send to microcontroller

 

microcontroller

receive from LabVIEW -> generate "pick" command -> send "pick" command to robot -> wait (for it to pick) -> generate "place" command -> send "place command" -> wait (for it to place)

 

Then repeat as necessary; a host-side sketch of that loop is below.
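Tying the pieces together, a host-side loop matching the flow above might look like this. detect_objects() is a hypothetical stand-in for the IMAQ/OpenCV detection step, the other helpers are the sketches from earlier in this thread, and the "OK" reply convention is an assumption:

import time

def run_cell(port="COM3"):
    while True:
        detections = detect_objects()           # [(px, py, angle_deg), ...]
        if not detections:
            time.sleep(0.5)                     # nothing in view, poll again
            continue
        px, py, angle = detections[0]
        rx, ry, tool_angle = match_to_robot_pose(px, py, angle)
        checked_target(rx, ry)                  # software workspace limit
        reply = send_pick_command(port, rx, ry, tool_angle)
        if reply != "OK":
            break                               # stop on any error from the MCU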

Message 3 of 6

Next, wire the pattern-search function block to the search pattern (template) extracted in the previous step; parameters such as rotation angle and minimum score go into the SETTINGS control. Then take the outputs of the search function to get the position values into indicators that will be displayed on the front panel.

 

(Attached image: 5.png)

Message 4 of 6

Hi, I know it was a long time ago, but I'm also doing a student project on a robotic-arm pick and place. If you could help me out, that would be great. Thank you!

Message 5 of 6

Hello Sir, 

I am doing a student project "LabVIEW Programming and Implementation to pick and place chips using DOBOT mg400 robot arm".

So far I have become familiar with the robot arm using the Scratch/Blockly software. Now it is time to control the Dobot to pick and place using LabVIEW.

 

The problem is I do not know which one to buy. LabVIEW has three different editions: Base, Full, and Professional. I am confused about which one to buy for this purpose.

 

Can someone help me find the best one to purchase?

Message 6 of 6