BACKGROUND // Design Concepts was a sponsor of the 2016 SF Design Week, and as part of that we hosted an open house/studio tour at our office. A couple of months prior, Swope Design Solutions (our office partner) had purchased a six-axis robot arm for running overnight CNC milling jobs, but it had mostly sat underused up to that point, so I thought this would be a great opportunity to get it up and running and put together a cool demo for the open house.
OBJECTIVE // After convincing Brett to trust me with his robot (a difficult feat unto itself), I hatched a plan to have the robotic arm draw our company logo onto a sugar cookie with icing. Once that was accomplished, the second objective became making the demo more engaging for guests, so they weren't just standing there watching me run the machine. This eventually meant writing a program that prompted the user to place a cookie on the plate and detected when one was ready to be decorated.
……………(1) Universal Robots UR10 6 Axis Robot
……………(2) Java (Processing)
……………(3) Nordson Ultimus I Electronic Fluid Dispenser & Syringes
……………(4) Sugar cookies, powdered sugar, vanilla, water
……………(5) Fused Deposition Modeler
The majority of the work in this project went into figuring out how to interact with the robot beyond just manually jogging the tool position. This started with a quick skim of the manual before hitting the internet to see what I could find. Brett pointed me to a GUI called RoboDK that lets you visualize robot motion and create tool paths, but the interface wasn't particularly intuitive and I ran into issues whenever I imported vector files to CAM. I mostly abandoned it, but it was useful for inspecting the formatted code the program transmitted to the robot, which I could then replicate myself.
Surprisingly, the resources available were pretty slim, but after crashing the machine at a terrifying speed and a couple of close calls, I was eventually able to feed the robot commands from a Python script via a TCP/IP socket. Unfortunately, there didn't seem to be any internal command buffer on the robot itself: if you sent multiple commands, each one would override the previous, which meant you needed a time delay on the client side to pace the communication instead of some sort of handshake (at least with my initial Python script).
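The raw-socket approach can be sketched as follows. The IP address is hypothetical, port 30002 is the UR controller's secondary interface that accepts URScript, and the sleep-based pacing mirrors what my initial script did:

```python
import socket
import time

ROBOT_IP = "192.168.1.100"   # hypothetical address -- use your robot's actual IP
PORT = 30002                 # UR "secondary" interface, which accepts URScript

def movel_command(pose, a=0.1, v=0.1):
    """Format a URScript linear move for a pose (x, y, z in meters; rx, ry, rz in radians)."""
    x, y, z, rx, ry, rz = pose
    return f"movel(p[{x},{y},{z},{rx},{ry},{rz}], a={a}, v={v})\n"

def send_commands(commands, delay=2.0):
    """Stream commands one at a time, sleeping between them -- the controller
    keeps no queue, so a new command overrides the one in progress."""
    with socket.create_connection((ROBOT_IP, PORT)) as s:
        for cmd in commands:
            s.sendall(cmd.encode("ascii"))
            time.sleep(delay)  # crude client-side pacing instead of a handshake

# send_commands([movel_command((0.4, -0.2, 0.3, 0.0, 3.14, 0.0))])
```

The fixed delay has to be longer than the slowest move in the path, which is wasteful; it's the main reason a real flow-control layer is worth having.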
I eventually came across a Python module that formatted commands and handled the robot communication and flow control. This was incredibly helpful and saved me the time and effort of building a robust command buffer and communication process myself. There were two main downsides to the module/firmware combination: 1) it didn't provide access to arc commands, so everything had to be line segments, and 2) it was intended for pick-and-place operations, so rather than blending smooth paths between line segments it treated every command discretely, which made for quick, choppy motion.
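With only straight-line moves available, any arc in the artwork had to be chorded into segments. A minimal sketch of that decomposition (function name and error tolerance are my own, not from the module):

```python
import math

def arc_to_segments(cx, cy, r, start, end, max_err=0.0002):
    """Approximate a circular arc with line segments whose chord error
    stays below max_err (units in meters, angles in radians).
    The sagitta of a chord spanning half-angle t is r * (1 - cos(t)),
    so we solve for the largest half-angle that keeps it under max_err."""
    half = math.acos(max(1.0 - max_err / r, -1.0))
    n = max(1, math.ceil(abs(end - start) / (2 * half)))
    return [(cx + r * math.cos(start + (end - start) * i / n),
             cy + r * math.sin(start + (end - start) * i / n))
            for i in range(n + 1)]
```

A 50 mm-radius quarter circle at 0.2 mm tolerance comes out to about ten waypoints, which is few enough to stream but still noticeably choppy with discrete moves.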
Once the robot was running, I started working with the Electronic Fluid Dispenser (EFD) to tune in the extrusion process. I FDM'd a mount for the syringe and whipped up some icing to start decorating cookies. This may have been the trickiest part: between the variability of the icing's viscosity, avoiding bubbles in the syringe, the impact of nozzle height on drawing quality, and variation in pressure, it took a fair amount of process development to get things just right.
Once I was extruding properly and the robot was drawing on cookies, my attention quickly shifted toward guest interaction. I still had a day before the event, so I figured I could throw together a vision system to recognize when there was a cookie on the plate and kick off the Python script. I've done a fair bit of this in the past, so I was able to look through some of my old code for a jump start. Writing the code was fairly quick, but refining the recognition algorithm took some work: it was heavily affected by lighting conditions, and the threshold values that determined whether a cookie was present kept drifting. Once I had the final setup and lighting locked down, I was able to make the algorithm relatively robust.
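The core of the detection was simple thresholding: a cookie reads darker than the bright plate, so counting dark pixels in a region of interest is enough. A stripped-down Python sketch of the idea (the function, thresholds, and ROI are illustrative, not my actual values):

```python
def cookie_present(frame, roi, dark_thresh=100, min_fraction=0.5):
    """Decide whether a cookie sits on the plate by counting pixels in the
    region of interest that are darker than the (bright) plate surface.
    frame: 2-D list of grayscale values 0-255; roi: (x0, y0, x1, y1).
    Both thresholds had to be retuned whenever the lighting changed."""
    x0, y0, x1, y1 = roi
    pixels = [frame[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    dark = sum(1 for p in pixels if p < dark_thresh)
    return dark / len(pixels) >= min_fraction
```

Requiring a majority of dark pixels (rather than a single one) is what gives the check some robustness to shadows and specks on the plate.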
NEXT STEPS // Creating the tool paths is currently quite tedious, so it would be nice to have a post-processor that converts GCode into an executable Python script to format and send all the commands to the robot. I don't expect this would be too difficult as long as the tool-head orientation stays the same and the motion isn't so large that I have to worry about portions of the arm colliding with each other. I also need to figure out how to include arcs, since I'd prefer not to be restricted to straight lines.
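A minimal sketch of what such a post-processor could look like, handling only a G0/G1 subset with a fixed tool orientation (the orientation, drawing height, and speeds below are hypothetical placeholders):

```python
import re

def gcode_to_urscript(gcode, z_draw=0.005, feed_v=0.05):
    """Convert a tiny G0/G1-only GCode subset into URScript movel strings.
    Coordinates are assumed to be millimeters in the robot's base frame,
    with a fixed downward-pointing tool orientation."""
    rx, ry, rz = 0.0, 3.14, 0.0   # fixed tool pose (hypothetical)
    x = y = 0.0
    lines = []
    for raw in gcode.splitlines():
        m = re.match(r"G([01])\b", raw)
        if not m:
            continue  # skip comments and unsupported words
        for axis, val in re.findall(r"([XY])(-?\d+\.?\d*)", raw):
            if axis == "X":
                x = float(val) / 1000.0   # mm -> m
            else:
                y = float(val) / 1000.0
        v = 0.25 if m.group(1) == "0" else feed_v   # rapid vs. feed speed
        lines.append(f"movel(p[{x},{y},{z_draw},{rx},{ry},{rz}], a=0.1, v={v})")
    return lines
```

Extending this to G2/G3 arcs would mean either finding the robot's arc command or chording the arcs into line segments in the post-processor itself.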
LESSONS LEARNED //
- The tolerance on cookie height is horrible. The tip of the extruder needs to sit just above the surface so that the icing immediately sticks; otherwise you get rounded corners, loops in the icing, etc. But cookie height varies so much that quite often during the night the nozzle would catch the cookie and drag it around the plate.
- The UR10 uses absolute coordinates in meters. If you put in values in the hundreds of meters, the machine will crash very quickly, and it will be scary.
- Six axes definitely add complexity compared to a typical XYZ gantry. Surprisingly, the robot seems willing to let two of its own segments collide when you send it a command it can't physically accomplish. Similarly, when you jog the tool position along the x/y/z axes instead of rotating individual joints, the speed along the straight line is held constant, which means that depending on the pose at any given moment and the direction of travel, the kinematics can force some crazy joint accelerations to preserve that constant speed. This makes sense, but it isn't something I'd had to consider before, when motor speed was linearly related to head movement.
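The coordinate scare suggests a cheap guard: sanity-check every target against a workspace envelope before it ever reaches the robot. A sketch (the limits are hypothetical; the UR10's reach is roughly 1.3 m):

```python
# Hypothetical workspace envelope in meters -- tune to your actual setup.
WORKSPACE = {"x": (-1.0, 1.0), "y": (-1.0, 1.0), "z": (0.0, 1.2)}

def check_pose(x, y, z):
    """Refuse to send a target outside the workspace envelope -- a mistyped
    value in the hundreds of meters is otherwise a perfectly valid command."""
    for name, val in (("x", x), ("y", y), ("z", z)):
        lo, hi = WORKSPACE[name]
        if not lo <= val <= hi:
            raise ValueError(f"{name}={val} m is outside [{lo}, {hi}]")
    return True
```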