Auto Arm Report
Body:
As a team, we unanimously decided that purchasing a kit and modifying it to meet the given specifications would be the easiest and most efficient route. The kit we decided to purchase had an overall reach of about 12”, which unfortunately did not meet the length requirements needed to compete in the events, so our next step was to either cut the midsection and extend it, or replace parts of the body with longer pieces. We decided to go with obtaining larger pieces because cutting the existing pieces in the middle and splicing in an extra length would only make our arm weaker. To get the larger pieces, we agreed to have them 3D printed.
This was done by first modeling our desired parts in SolidWorks and converting the files so they could be properly printed on a 3D printer. We had to enlarge the midsection link by a scale of 1.5x, while keeping the same hole diameters to ensure that the same bearings, nuts, and screws could still be used. We also enlarged the smaller link that works in conjunction with the midsection link; altogether the parts were lengthened by 5” and thickened by about ¼”. PLA plastic was used for the 3D-printed parts because of its light weight and durability. We initially thought about extending both the midsection and the link leading to the claw. We decided against extending the link leading to the claw because we would either have to cut the PVC pipe in the middle and add an extra section of PVC, or remove the PVC altogether and install a longer piece.
We concluded that the midsection link was the only part we were going to enlarge, so with this
change, the total reach of our robot came out to about 16-17”. The link leading to the claw and
the first link connecting to the base were kept the same.
Hardware:
The kit we bought came with two different types of servos: the MG996 (the larger black servos) and the SG90 (the smaller blue servos). According to the datasheets, the two servos have different specs. The MG996 has a stall torque of 9.4 kgf·cm at 4.8 volts and 11 kgf·cm at 6 volts. The SG90 has a stall torque of 2.5 kgf·cm; its datasheet does not list a rating at 6 volts. The MG996 has an operating voltage of 4.8-7.2 volts and the SG90 operates from 4.8-6 volts. Since the operating range common to both servos is 4.8-6 volts, we took the safe route for our first test and kept the supply at 4.8 volts to make sure we did not burn anything up. We found that at 4.8 volts the arm was able to move, but when it went to pick up a block it did not have enough torque to return to the starting position. Because of this we bumped the voltage up to about 5.2 volts, again keeping it relatively low so as not to burn anything up, and found that the arm could return to the starting position but was very slow and had a lot of jerk. Finally, we settled on 6 volts because the arm has the least amount of jerk at this voltage and it is the upper limit for the SG90s.
Mechanical Calculations:
To ensure our robot would be able to pick up and carry the load presented, mechanical calculations were done to determine the maximum torque exerted at each joint. This was done using the equation T = L·P + ½·L·W, where L = link length [cm], P = load at the end of the link (actuator plus the object being lifted) [kg], W = weight of the link [kg], and T = torque [kgf·cm]. To do this, I had to measure each link length and find the weight of every actuator used, the weight of each link, and the weight of each object that needed to be lifted. There are four torque locations on the robotic arm, and since the arm has multiple joints, the torques from each consecutive joint must be added as you move from the wrist to the base. Since torque increases proportionally with length, and the load is heaviest at the base, three large servos are used at that location. Another large servo is used at the elbow, and three smaller ones are used at the wrist and end effector. Since the servos are subjected to the highest torque when the arm is stretched horizontally (the moment arm is longest in this position), the calculations were done for this configuration so that a safety factor was included. Although the robot may never encounter this exact scenario, it is best to be conservative when doing the calculations so that all possible situations are accounted for. After calculating the torque required at each joint, I compared those values to the stall torques given on the spec sheet for each servo and made sure the rated values were still above my calculated values. From there, I was able to confirm that our robot could handle all of the given loads and that no actuator would fail.
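A short Python sketch of this calculation is shown below. It applies T = L·P + ½·L·W to each link with the arm held horizontal and accumulates the torques from the wrist toward the base, as described above, then compares the results with the datasheet stall torques. The link lengths, weights, and loads in the sketch are placeholder values, not our actual measurements.

```python
# Sketch of the joint-torque check described above. All lengths and weights
# here are placeholder values, NOT the measurements used in the report.

# Links ordered from the wrist (nearest the end effector) toward the base.
# L = link length [cm], W = weight of the link [kg],
# P = load carried at the far end of the link [kg]
#     (actuator weight plus, for the wrist link, the object being lifted).
links = [
    {"name": "wrist link",      "L": 8.0,  "W": 0.05, "P": 0.06},
    {"name": "midsection link", "L": 20.0, "W": 0.12, "P": 0.07},
    {"name": "base link",       "L": 10.0, "W": 0.08, "P": 0.12},
]

# Stall torques [kgf*cm] at 4.8 V, taken from the servo datasheets.
STALL = {"wrist link": 2.5, "midsection link": 9.4, "base link": 9.4}


def joint_torques(links):
    """Torque at each joint [kgf*cm] with the arm held horizontal.

    Each joint supports T = L*P + 0.5*L*W for its own link plus the torque
    already required at the joint outboard of it, so the values accumulate
    as we move from the wrist toward the base, as described above.
    """
    torques = []
    carried = 0.0
    for link in links:
        t = link["L"] * link["P"] + 0.5 * link["L"] * link["W"] + carried
        torques.append(t)
        carried = t
    return torques


for link, t in zip(links, joint_torques(links)):
    limit = STALL[link["name"]]
    status = "OK" if t < limit else "exceeds stall torque"
    print(f'{link["name"]}: {t:.2f} kgf*cm (stall {limit} kgf*cm) -> {status}')
```

With the real measurements substituted in, this reproduces the comparison against the MG996 and SG90 stall torques described above.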
V-REP Design:
The Reflexxes library is a motion trajectory generation library that is integrated into V-REP; Reflexxes has since been acquired by Google. The Reflexxes Motion Libraries provide instantaneous trajectory generation capabilities for motion control systems. The library contains a set of on-line trajectory generation (OTG) algorithms designed to control robots and mechanical systems. Its many advantages make it applicable in the fields of robotics, CNC machine tools, and servo drive systems. With the development of computer technology, numerical control systems have evolved from hard-wired systems composed of digital logic circuits to computer-based computer numerical control (CNC) systems. Compared with a hard-wired system, the control functions of a CNC system are realized mainly in software, which can handle complex information that is difficult to process in logic circuits, giving it higher flexibility and higher performance.
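To make the idea of on-line trajectory generation more concrete, the Python sketch below generates a simple trapezoidal velocity profile for a single joint. This is only a conceptual illustration of the kind of bounded-velocity, bounded-acceleration profile an OTG algorithm computes; it is not the Reflexxes or V-REP API, and all of the names and parameter values are our own.

```python
import math

def trapezoidal_profile(q0, q1, v_max, a_max, dt=0.01):
    """Yield joint positions from q0 to q1 with bounded velocity and acceleration.

    A simplified stand-in for what an on-line trajectory generator produces:
    accelerate, cruise at v_max, then decelerate, falling back to a triangular
    profile when the move is too short to reach cruise speed.
    """
    d = abs(q1 - q0)
    if d == 0:
        yield q1
        return
    sign = 1.0 if q1 >= q0 else -1.0
    t_acc = v_max / a_max               # time to reach cruise speed
    d_acc = 0.5 * a_max * t_acc ** 2    # distance covered while accelerating
    if 2 * d_acc > d:                   # short move: triangular profile
        t_acc = math.sqrt(d / a_max)
        v_peak = a_max * t_acc
        t_total = 2 * t_acc
    else:
        v_peak = v_max
        t_total = 2 * t_acc + (d - 2 * d_acc) / v_max
    t = 0.0
    while t < t_total:
        if t < t_acc:                               # accelerating
            s = 0.5 * a_max * t ** 2
        elif t < t_total - t_acc:                   # cruising
            s = d_acc + v_peak * (t - t_acc)
        else:                                       # decelerating
            s = d - 0.5 * a_max * (t_total - t) ** 2
        yield q0 + sign * s
        t += dt
    yield q1

# Example: sweep one joint from 0 to 90 degrees at up to 60 deg/s and 120 deg/s^2.
for angle in trapezoidal_profile(0.0, 90.0, v_max=60.0, a_max=120.0):
    pass  # each angle would be sent to the joint at dt intervals
```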
Software Design:
Originally the robot was programmed in Simulink through the Nucleo board. For the first programming check-in, sine wave inputs were used to control the servo motor angles and maintain consistent movement along the x and y axes. During the demonstration, switches were used to reset the start position. However, we ultimately did not continue with this approach because of the problems that began to arise. One problem was that a cosine input to reverse the movement of the arm was not available. Additionally, there was no effective way to control the speed of the movements. Because of these unforeseen problems, we decided to switch microcontrollers and software to a Raspberry Pi running Python.
After solving the hardware and software problems, we started on the code for each event. The approach for each event was roughly the same: first set a goal and brainstorm multiple “pathways” to reach it, then, once a pathway is envisioned, work out the different functions needed to make progress toward that goal. To complete all of the tasks, our robot uses a camera to recognize both the type of object and its color. Using a Pixy cam and the Pixy2 source code, a Python script, “Get_blocks_cpp_demo.py”, recognizes the block signature, x and y coordinates, width, and height, and updates them in real time. Adding a Pi HAT onto the Raspberry Pi then allows the source code to drive the servos and work together with the color detection.
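The sketch below shows one way the block data described above could feed a pick-up decision. Here read_blocks() is a hypothetical placeholder standing in for the camera query that “Get_blocks_cpp_demo.py” performs, and the signature number and polling rate are assumptions, not values from our setup.

```python
import time
from collections import namedtuple

# Fields match the information the demo script reports for each block.
Block = namedtuple("Block", "signature x y width height")

def read_blocks():
    """Hypothetical placeholder for the camera query in Get_blocks_cpp_demo.py.

    Replace the body with the actual camera read; it should return a list of
    Block tuples (signature, x, y, width, height).
    """
    return []

def largest_block(blocks, signature):
    """Pick the largest detected block that matches a trained color signature."""
    matches = [b for b in blocks if b.signature == signature]
    return max(matches, key=lambda b: b.width * b.height, default=None)

while True:
    target = largest_block(read_blocks(), signature=1)  # signature 1 = first trained color (assumption)
    if target is not None:
        print(f"block at ({target.x}, {target.y}), size {target.width}x{target.height}")
        # The x/y coordinates would be converted to servo positions here and
        # written out through the servo HAT.
    time.sleep(0.05)  # poll the camera a few times per second
```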
● Event 2:
Setting x-y coordinates designates the “home square” so that the arm has a place
to pick up or put down the blocks. “For-loops”, as mentioned before in Event 1,
then move the block to the center of the field. When stacking the blocks, if the
camera does not sense a block, the arm drops the block it is carrying and loops
back to the “home square”. The next time around, however, the sensor detects the
block and increments PWM 2 vertically until the carried block is above the
previously placed one.
● Event 3:
This event can be controlled manually, so instead of sensors, push-buttons are
used to control the servos. The push-buttons are wired to the GPIO pins for
signal detection, and a set of “if-statements” handles each push-button pin on
the Raspberry Pi (a sketch of this button-polling loop is shown after the event
list).
● Event 4:
This event is similar to the first, but a specific color is selected. Each
color has a corresponding push-button that selects the color the arm will pick
up. Once a button is pushed, the arm scans the field for that specific color,
picks up the matching block, returns it to the home square, and repeats.
● Event 5:
Using code similar to the previous “home square” logic gives the arm somewhere
to leave the playing field. To start the event, the arm positions itself at the
center of the playing field, which lets it better scan the field for the blue
coupling. The script “color_blocls.py” is used to detect the location of the
coupling placed there by the opponent. Then, using the location array, a new
position is generated from random x and y integers to place the piece down for
the opponent's next move.
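As mentioned under Event 3, a minimal sketch of the push-button polling loop is shown below. It assumes the RPi.GPIO library with buttons wired between the chosen GPIO pins and ground; the pin numbers and the servo actions are placeholders rather than our actual wiring.

```python
import time
import RPi.GPIO as GPIO

# Placeholder pin assignments; use the GPIO pins the buttons are actually wired to.
BUTTONS = {"joint_up": 17, "joint_down": 27, "claw_toggle": 22}

GPIO.setmode(GPIO.BCM)
for pin in BUTTONS.values():
    # Buttons wired to ground, so enable the internal pull-up resistors;
    # a pressed button then reads LOW.
    GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP)

try:
    while True:
        if GPIO.input(BUTTONS["joint_up"]) == GPIO.LOW:
            pass   # increment the selected servo's PWM value here
        elif GPIO.input(BUTTONS["joint_down"]) == GPIO.LOW:
            pass   # decrement the selected servo's PWM value here
        if GPIO.input(BUTTONS["claw_toggle"]) == GPIO.LOW:
            pass   # open or close the end effector here
        time.sleep(0.02)   # simple debounce / loop pacing
finally:
    GPIO.cleanup()
```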