Joined: 24 Oct 2011
|Posted: Mon Oct 24, 2011 4:38 am Post subject: Delta Robot with vision system- Graduation project
|I'm still working on the documentation for the final presentation
so I can graduate, but the project is already "finished" (there are
a few bugs, but nothing I really need to fix before graduating).
First, a quick overview of how it works.
We used LabVIEW for the main program; it holds all the configuration
of the robot.
In a normal cycle it scans the image from a webcam mounted under the
big base that holds the motors, and if something matches the template
we are looking for, it calculates the inverse kinematics to start the
movement (yes, it has a few problems locating objects under the
robot's arms). With the kinematics calculated, it sends a value in
steps to the microcontrollers so they can compute an acceleration
ramp and send it to the driver.
The end effector is a vacuum cup fed by a small pump.
After the first cycle of movements to pick up the object, it repeats
the process of scanning the webcam stream to find a location to
deposit the object.
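For readers curious what that inverse-kinematics step looks like, here is a sketch of the usual closed-form solution for a delta robot, in Python rather than LabVIEW. All four dimensions are made-up placeholders; the post doesn't give the real ones.

```python
import math

# Geometry (made-up placeholder dimensions, in mm -- the post gives none)
F  = 200.0   # base equilateral-triangle side
E  = 60.0    # end-effector triangle side
RF = 150.0   # upper arm length (the arm on the motor)
RE = 300.0   # lower arm length (the carbon-fiber rods)

def _angle_yz(x0, y0, z0):
    """Shoulder angle in degrees for one arm, solved in that arm's YZ plane."""
    t = math.tan(math.radians(30.0))
    y1 = -0.5 * t * F                 # shoulder joint offset from center
    y0 -= 0.5 * t * E                 # shift target from effector center to its joint
    a = (x0 * x0 + y0 * y0 + z0 * z0 + RF * RF - RE * RE - y1 * y1) / (2.0 * z0)
    b = (y1 - y0) / z0
    d = -(a + b * y1) ** 2 + RF * (b * b * RF + RF)   # discriminant
    if d < 0.0:
        raise ValueError("target out of reach")
    yj = (y1 - a * b - math.sqrt(d)) / (b * b + 1.0)  # elbow joint position
    zj = a + b * yj
    theta = math.degrees(math.atan(-zj / (y1 - yj)))
    return theta + (180.0 if yj > y1 else 0.0)

def inverse_kinematics(x, y, z):
    """Three shoulder angles for a target (x, y, z); z is negative below the base."""
    c120, s120 = -0.5, math.sqrt(3.0) / 2.0
    return (_angle_yz(x, y, z),
            _angle_yz(x * c120 + y * s120, y * c120 - x * s120, z),  # arm at +120 deg
            _angle_yz(x * c120 - y * s120, y * c120 + x * s120, z))  # arm at -120 deg
```

Each shoulder angle is found by rotating the target into that arm's plane (the arms sit at 0° and ±120°) and intersecting two circles there.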
Video of the robot at the fair
We (me and two friends) wanted a project to test our knowledge,
something different from the usual grad projects... so we started
searching for a good topic. We had a few ideas, and the Delta Robot
was the best shot.
The vision system was the first part we developed (it was the hardest
part for us at the time... we had almost no knowledge in that area
when we started).
We did some coding in C++, but it was going too slowly, so we
switched to LabVIEW. Learning how to program in LabVIEW took some
time, but it made all the programming very easy.
With the "hard part" complete, we ran a few tests with a simple
mechanical arm to figure out what we would need to build it. A few
wooden sticks with joints made of bent copper wire made it very hard
to see how it would move, but it was good for finding a problem we
would have with the motors' movement...
We chose stepper motors because they were easier to implement and not
too expensive, but the minimum step size of a simple driver was a
problem.
Doing the math:
half-stepping a 200 step/revolution motor would give us 0.9° of resolution...
The main arm coupled to each motor amplifies the movement, and we
needed more resolution for smooth motion.
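Spelling that calculation out (a throwaway Python snippet, just restating the arithmetic above):

```python
FULL_STEPS_PER_REV = 200      # a 1.8-degree-per-full-step motor
HALF_STEPPING = 2             # simple drivers can do half steps

degrees_per_half_step = 360.0 / (FULL_STEPS_PER_REV * HALF_STEPPING)
print(degrees_per_half_step)  # 0.9
```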
So we jumped to the next step and started ordering a few parts from
eBay and local stores:
-RC car ball joints to link the arms
-Carbon fiber tubes for the "long arms"
-SLA7078 stepper motor drivers (capable of 1/16 microstepping)
-24 V switching power supply
-15 kgf·cm stepper motors
The mechanics of the robot were very simple at the start: no worrying
too much about machining precision, and we used extruded aluminum
profile for the structure (easier to modify if needed).
The electronics changed a bit from the initial idea after one burned
SLA7078 chip and the fear of burning a stepper motor... we replaced
the SLA drivers with complete stepper motor drivers that cost more
but were safer, with microstepping up to 1/40.
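For comparison with the 0.9° figure above (same arithmetic, new driver):

```python
FULL_STEPS_PER_REV = 200
MICROSTEPPING = 40             # the replacement driver's finest setting

degrees_per_microstep = 360.0 / (FULL_STEPS_PER_REV * MICROSTEPPING)
print(degrees_per_microstep)   # 0.045, i.e. 20x finer than half-stepping
```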
With the mechanics done and the electronics all hooked up, we could
start programming. The main idea was to create a real-time interface
that could execute a straight-line trajectory.
At the start of the programming we had no idea which microcontroller
would be needed, but we chose the PIC family because it was what we
could program at the time. The LabVIEW kinematics was already done
(with some help from projects in the LabVIEW database), but the
straight trajectory started to complicate the programming: we
couldn't make it move in a straight line and generate an
acceleration/deceleration profile using PICs.
Google made our lives easier when we found some information about
acceleration profiles computed on a PIC in real time... so we stopped
worrying about straight lines and used the help from this pdf.
We used three PIC18F2550 microcontrollers, all hooked up to the RX
line of the RS232, and one PIC also had TX connected (one master and
two slaves). Every data packet has a checksum at the end so the
microcontrollers can verify that the data is OK. Using the idea of
"flags" in the communication between the PICs, we could make sure all
the actions were executed in parallel even if there was an error on
only one of them.
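A minimal sketch of the packet check (Python for illustration; the post doesn't say which checksum was used, so a plain 8-bit sum is assumed here):

```python
def make_packet(payload: bytes) -> bytes:
    """Master appends a 1-byte checksum: sum of the payload, modulo 256."""
    return payload + bytes([sum(payload) & 0xFF])

def packet_ok(packet: bytes) -> bool:
    """Each PIC recomputes the sum and compares it to the last byte."""
    return len(packet) >= 2 and (sum(packet[:-1]) & 0xFF) == packet[-1]
```

On the wire this just means the master's frame ends in one extra byte; a slave raises its "data OK" flag only when its recomputed sum matches, which is what lets all three PICs start the move together.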
For the end effector there is a vacuum pump feeding the pneumatic
line, and a solenoid-driven valve that makes the vacuum cup grip the
object.
About the problems we had:
A few problems with the webcam calibration... we tried to mount the
webcam at an angle so it could see all the objects regardless of the
robot's position, but we went back to the perpendicular position with
no calibration.
At the start the communication was a little tricky; there were too
many errors, so we implemented the checksum (we had problems with
...)
The arm mechanics still have a few problems: the lengths are not very
precise, which affects the final position of the robot. Testing
movement close to the center, where there is less error, we got about
1 mm of error over a 100 mm movement.
There are a few changes we wanted to make, but they may be on hold
for some time... there is no main application for the robot yet;
maybe something will come up.
Thank you for your attention, and sorry about my English, which isn't
my first language.
Henrique Ribeiro de Oliveira
Joined: 29 Jan 2006
Location: Winnipeg, MB
|Posted: Mon Oct 24, 2011 5:19 am Post subject:
|Wow that looks great!