Auto Arm Report

The robotic arm kit was modified to meet length requirements by 3D printing enlarged midsection links. An aluminum chassis base and lightweight acrylic claw end effector were added. The arm runs on 6 volts for smooth motion and adequate torque to lift objects. Mechanical calculations were performed to ensure each servo could handle loads. In V-REP, a simulated arm mimics the real arm's motion using Reflexxes library functions to smoothly move joints to target positions. Originally, the arm was programmed in Simulink but issues arose, so a new software approach was developed.


Robotic Arm

Brian McGinnis, Jennifer Nakajima, Sarah Mosser, Octavio Anaya, Calvin Lu


ETE 4751/L
Instructors: Professor Boskovich and Professor Khoshnoud

Electronics and Computer Engineering Technology


California State Polytechnic University, Pomona
Mechanical Design:

Body
As a team, we unanimously decided that purchasing a kit and modifying it to meet the given specifications would be the easiest and most efficient route. The kit we purchased had an overall reach of about 12”, which did not meet the length requirement for the events, so our next step was to either cut the midsection and extend it or replace parts of the body with longer pieces. We chose to fabricate larger pieces, because splitting the existing pieces in the middle and splicing in extra length would only weaken the arm. To obtain the larger pieces, we had them 3D printed: we first modeled the desired parts in SolidWorks and converted the files so they could be printed properly on a 3D printer. We enlarged the midsection link by a scale of 1.5x while keeping the same hole diameters, so the same bearings, nuts, and screws could be reused. We also enlarged the smaller link that works in conjunction with the midsection link; altogether the parts were lengthened by 5” and thickened by about ¼”. PLA plastic was used for the 3D-printed parts because it is lightweight and durable. We initially considered extending both the midsection and the link leading to the claw, but decided against extending the claw link because we would have had to either cut its PVC pipe in the middle and add an extra section of PVC or replace it with a longer piece. We concluded that the midsection link was the only part we would enlarge; with this change, the total reach of the robot came out to about 16-17”. The link leading to the claw and the first link connecting to the base were kept the same.

Base and End Effector:


Since our purchased kit did not come with a base or end effector, these components had to be purchased or modeled to fit the existing parts of the robot. For the base, a 6” x 10” x 3½” aluminum chassis was purchased to house all electrical components while giving the arm stability and support.

As mechanical lead, my first thought was to model an end effector in SolidWorks with bucket-shaped jaws to improve its ability to pick up pipe-shaped objects. Although this might have worked well, we opted against it because it would not be ideal for picking up 1” wooden cubes. I then modeled a basic claw built around a simple gearing system. After 3D printing this model, I assembled each component with its corresponding nuts and bolts. Once it was put together, we realized this claw would not be suitable for attachment to the arm: it was too heavy for the servos to handle. My next step was to find and purchase an extremely lightweight end effector, because modeling and printing a new design was not possible within our time constraints. Fortunately, we found a claw that worked the same way as my modeled end effector and was light enough for the wrist servos to lift. This new end effector was made of acrylic plates, weighed only a couple of ounces, and had jaws that opened to 2”. For extra grip, silicone sealant was added to the inside of the jaws to improve their hold on objects.

The final design is pictured below.

Hardware:
The kit we bought came with two different types of servos: the MG996 (the larger black servos) and the SG90 (the smaller blue servos). According to the datasheets, the two servos have different specs. The MG996 has a stall torque of 9.4 kgf·cm at 4.8 V and 11 kgf·cm at 6 V. The SG90 has a stall torque of 2.5 kgf·cm; its datasheet does not give a figure at 6 V. The MG996 has an operating voltage of 4.8-7.2 V, while the SG90 operates from 4.8-6 V. Since the overlapping operating range for both servos is 4.8-6 V, we took the safe route on our first test and kept the supply at 4.8 V to make sure we did not burn anything out. We found that at 4.8 V the arm was able to move, but when it picked up a block it did not have enough torque to return to the starting position. We therefore bumped the voltage up to about 5.2 V, still keeping it relatively low to avoid damage, and found that the arm could return to the starting position but moved very slowly and with a lot of jerk. Finally, we settled on 6 V, because the arm has the least jerk at this voltage and it is the upper cutoff for the SG90s.

Mechanical Calculations:
To ensure the robot would be able to pick up and carry the required load, mechanical calculations were done to determine the maximum torque exerted at each joint. This was done using the equation T = L·P + ½·L·W, where L is the link length [cm], P is the load (the actuator plus the object being lifted) [kg], W is the weight of the link [kg], and T is the resulting torque [kgf·cm]. To do this, I measured each link length and found the weight of every actuator, the weight of each link, and the weight of each object that needed to be lifted. There are four torque locations on the robotic arm, and since the arm has multiple joints, the torque contributions must be accumulated joint by joint going from the wrist to the base. Since torque grows in proportion to length and the load is heaviest at the base, three large servos are used at that location; another large servo is used at the elbow, and three smaller ones at the wrist and end effector. The servos are subjected to the highest torque when the arm is stretched out horizontally (the longest lever arm), so the calculations were done for that pose to build in a safety factor. Although the robot may never encounter this exact scenario, it is best to calculate for the worst case so all possible situations are accounted for. After calculating the torque at each joint, I compared those values against the stall torques given on each servo's spec sheet and verified that the stall torques were still above my calculated values. From there, I could see that the robot could handle all of the given loads and that no actuator would fail.
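The joint-by-joint accumulation described above can be sketched in Python. The link lengths and weights below are placeholders for illustration, not our measured values:

```python
def joint_torques(links, payload):
    """Cumulative holding torque at each joint of a horizontal arm.

    links:   (length_cm, link_weight_kg, joint_actuator_weight_kg)
             tuples, ordered from the wrist inward to the base.
    payload: weight held in the gripper, including its actuator [kg].
    Returns torques [kgf*cm] from wrist to base; each joint adds
    L*P + 0.5*L*W on top of the moment carried from the joint before.
    """
    torques = []
    carried = payload   # total weight hanging beyond the current joint
    moment = 0.0        # torque accumulated from joints farther out
    for length, link_w, act_w in links:
        moment += length * carried + 0.5 * length * link_w
        torques.append(moment)
        # this joint's actuator becomes part of the load seen farther in
        carried += link_w + act_w
    return torques
```

For example, joint_torques([(10, 0.1, 0.05), (20, 0.2, 0.0)], 0.2) returns [2.5, 11.5], and each value would then be checked against the corresponding servo's stall torque.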

V-REP Design:

Virtual Robot Experimentation Platform

The general purpose of V-REP is to simulate robots inside an integrated development environment. For our demonstration we built a scene with a robot arm and moved every joint built into it. We found an arm model related to our experimental arm and used it for demonstration and testing: the PhantomX Pincher, which has 4 degrees of freedom. To manipulate the arm's joints in V-REP we use the API function sim.rmlMoveToJointPositions, which actuates several joints at the same time using the Reflexxes Motion Library. The function is called from a threaded child script that belongs to the parent arm, in this case the PhantomX Pincher. A child script is a simulation script, a collection of routines written in the Lua language, that handles a particular function in a simulation. Child scripts are attached to a scene object and can easily be found next to that object in the scene hierarchy. To build our child script we began with initialization code that grabs handles for all the joints available in the arm:

    jointHandles = {-1, -1, -1, -1}
    for i = 1, 4, 1 do
        jointHandles[i] = sim.getObjectHandle('<name of the arm>_joint'..i)
    end

Then the velocity, acceleration, and jerk vectors (maxVel, maxAccel, maxJerk, and targetVel) were customized to our liking and assigned in the script. The middle part of the script is where the commanded motion executes: we command a target position through the Reflexxes call sim.rmlMoveToJointPositions. Lastly, we told the script to open the gripper after it executes the last commanded target position.
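As a rough intuition for what sim.rmlMoveToJointPositions does with the maxVel, maxAccel, and maxJerk settings, the sketch below is a purely velocity-limited stepping rule in Python. This is not the Reflexxes library (which also bounds acceleration and jerk), just the simplest stand-in for an on-line trajectory generator:

```python
def move_to_joint_positions(current, target, max_vel, dt=0.05):
    """One velocity-limited step of every joint toward its target.

    current, target: joint positions [rad]; max_vel: per-joint speed
    limits [rad/s]; dt: control period [s]. Each joint moves at most
    max_vel[i]*dt per call and never overshoots its target, so calling
    this repeatedly converges all joints to the commanded pose.
    """
    step_limit = [v * dt for v in max_vel]
    return [
        q + max(-lim, min(lim, qt - q))
        for q, qt, lim in zip(current, target, step_limit)
    ]
```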

Reflexxes is a motion trajectory generation library that is integrated into V-REP; Reflexxes has since been acquired by Google. The Reflexxes Motion Libraries provide instantaneous trajectory generation capabilities for motion control systems: a set of on-line trajectory generation (OTG) algorithms designed to control robots and mechanical systems. These strengths make Reflexxes applicable in fields such as robotics, CNC machine tools, and servo drive systems. With the development of computer technology, numerical control has evolved from hard-wired systems composed of digital logic circuits to computer numerical control (CNC) systems. Compared with hard-wired NC, the control functions of a CNC system are realized mainly in software, which can handle complex information that is difficult to process in logic circuits, giving higher flexibility and higher performance.

Software Design:
Originally the robot was programmed in Simulink through the Nucleo board. For the first programming check-in, sine-wave inputs were used to control the servo angles and maintain consistent movement along the x and y axes, and switches were used during the demonstration to reset the start position. However, we ultimately did not continue with this approach because of the problems that began to arise: the cosine needed to reverse the arm's movement was not available, and there was no effective way to control the speed of the movements. Due to these unforeseen problems, we decided to switch microcontroller and software to a Raspberry Pi programmed in Python.

After solving the hardware and software problems came the code for each event. The approach was roughly the same for every event: first set a goal and brainstorm multiple "pathways" to reach it; then, once a pathway is envisioned, work out the different functions needed to progress toward that goal. To complete all the tasks, our robot uses a camera to recognize both the type of object and its color. Using a Pixy camera and the Pixy2 source code, the Python script "Get_blocks_cpp_demo.py" recognizes each block's signature, x and y coordinates, width, and height, and updates them in real time. Adding a Pi HAT onto the Raspberry Pi then lets that source code drive the servos and work together with the color detection.
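Once the script is streaming signature, position, and size for each detected block, most event code just needs to pick one block to act on. A small helper along these lines could do that; the dict field names here are assumptions mirroring the demo's reported fields, not the exact Pixy2 API:

```python
def largest_block(blocks, signature):
    """Pick the biggest detected block matching a color signature.

    blocks: dicts with 'signature', 'x', 'y', 'width', and 'height'
    keys, mirroring what the Pixy2 get-blocks demo reports per block.
    Returns the matching block with the largest area, or None if no
    block of that signature is in view.
    """
    matches = [b for b in blocks if b['signature'] == signature]
    return max(matches, key=lambda b: b['width'] * b['height'], default=None)
```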

Event code concepts:


● Event 1:
Using the Pixy's recognition capabilities and real-time communication between the camera and the servos provides a clear pathway for this event. Loops increment each servo angle by 1, from the start angle to the end angle, every 10 milliseconds, which effectively controls the speed of the servos. A "for-loop" controls which servo is moving by changing the channel value passed to the "pwm.set_pwm" function. The call "count = pixy.ccc_get_blocks" fetches up to 100 detected blocks, and this info is passed into a for-loop that prints the values. Tracking the angle of movement this way gives an updated signature of the servos while in motion.

● Event 2:
Setting x-y coordinates designates the "home square," giving the arm a place to pick up or put down blocks. "For-loops," as described in Event 1, then move each block to the center of the field. When stacking, if the camera does not sense a block at the drop point, the arm drops the block it is carrying and loops back to the "home square." On the next pass the sensor does detect the placed block, and PWM channel 2 is incremented vertically until the gripper is above the previously placed block.
● Event 3:
This event may be manually controlled, so instead of sensors, push buttons are used to control the servos. The push buttons feed into the GPIO port for signal detection, and a set of "if-statements" handles each push-button pin on the Raspberry Pi.
● Event 4:
Similar to the first event, but a specific color is selected: each color has a corresponding push button that chooses the color the arm will pick up. Once a button is pushed, the arm scans the field for that color, picks up the matching block, returns it to the home square, and repeats.
● Event 5:
Using code similar to the previous "home square" logic gives the arm somewhere to leave the playing field. To start the event, the arm positions itself at the center of the playing field, which lets it better scan for the blue coupling. The script "color_blocks.py" is used to detect the location of the coupling placed there by the opponent. Then, using the location array, a new position is generated from random x and y integers to place the piece down for the opponent's next move.
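The incremental angle loops used across the events (stepping a servo one degree every 10 ms, as in Event 1) all share the same shape. A hedged sketch of that pattern, where set_pulse is a placeholder for whatever wrapper around pwm.set_pwm actually drives the servo HAT:

```python
import time

def sweep(set_pulse, channel, start, end, step=1, delay=0.010):
    """Step one servo channel from start to end angle, one step per
    delay period, so the loop (not the servo) sets the speed.

    set_pulse(channel, angle) stands in for the real PWM write,
    e.g. a thin wrapper around pwm.set_pwm on the servo HAT.
    Works in either direction and returns the commanded angles.
    """
    direction = 1 if end >= start else -1
    angles = list(range(start, end + direction, direction * step))
    for angle in angles:
        set_pulse(channel, angle)
        time.sleep(delay)
    return angles
```

Passing a different channel value selects which servo moves, and shrinking or growing delay changes the arm's speed without touching the rest of the event code.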
