Implementing an Automated Robotic System into an AI-Guided Laboratory

Hi! My name is Sam Woerdeman, and I am a rising senior at Northwestern University pursuing a bachelor’s degree in mechanical engineering. Through the quarantine summer of 2020, I have been working with researchers Jie Xu and Dr. Young Soo Park on implementing an automated robotic system into a nanomaterial solution-processing platform. Although I have been working remotely from home, with the help of my mentors I have still been able to conduct research, including simulating, programming, and controlling a robot.

Significance of Automating Nanomaterial Production

Before I dive into the robotic system itself, I want to acknowledge the significance of automating nanomaterial production with an AI-guided robotic platform. For one, we would be able to understand the multi-dimensional relationships among the numerous properties that result from producing nanomaterials and thin films. We could also quickly identify and improve upon the workflow for producing nanomaterials. Finally, this system would reduce the human error that comes from relying on intuition and from reproducing the material by hand, run after run.

Differentiation from Available Autonomous Platforms

There are a number of key elements that separate my research on implementing the robotic system from past, competing projects. Primarily, it is uncommon to mimic an entire nanomaterial laboratory autonomously; usually, small parts of the workflow are automated and then assembled, or larger-scale materials are produced autonomously. Distinctively, I am working on a modular robotic system rather than a fixed workflow. This is essential because it allows researchers to easily adjust the workflow program rather than having to construct a unique platform for each individual solution-processing experiment, which means the programming module can be used on projects for years to come.

Approach to Implementing the Robotic System

The goal of the project is to integrate an automated robotic system into the entire solution-processing platform, and I approached it from three angles. First, the robot has to be modeled and simulated. Using CAD parts, I assembled a virtual laboratory workspace that resembles the setup in the Argonne laboratory. The completed CAD files were imported into a simulator called CoppeliaSim, which provides a platform for adding joints and other features that allow the robot to move and interact with its environment. Creating a simulation is important because it lets us experiment with different commands before risking the expensive hardware of the actual robotic system, saving both time and money.

Second, I programmed modules for the workflow using Python as the primary programming language. To connect the code directly to the simulator, I used a remote API connection. An API, or application programming interface, defines how separate pieces of software interact; in my case, it lets me control the robot simulation in CoppeliaSim from Python code. By importing the CoppeliaSim remote API library, I can call its functions to control the simulator and build new functions that carry out more complex commands. The kinematics of the robot are demonstrated by manipulating the motion of its joints, inputting Cartesian coordinates for the arm to follow, and controlling the speed and acceleration of the robotic arm.
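As a minimal sketch of how this connection works, the snippet below connects to a running CoppeliaSim scene through the legacy remote API and commands a single joint to a target angle. It assumes the remote API server is enabled on the default port 19999, and the joint name 'arm_joint1' is a placeholder rather than the actual name in our scene.

```python
import math
import sim  # CoppeliaSim legacy remote API client (sim.py, simConst.py, and the platform library must be on the path)

# Connect to the remote API server running inside CoppeliaSim
# (127.0.0.1:19999 is the default continuous-service configuration).
client_id = sim.simxStart('127.0.0.1', 19999, True, True, 5000, 5)
if client_id == -1:
    raise RuntimeError('Could not connect to the CoppeliaSim remote API server')


def move_joint(joint_name, angle_deg):
    """Command a single revolute joint to a target angle given in degrees."""
    # Look up the joint's handle in the scene by name.
    err, joint_handle = sim.simxGetObjectHandle(
        client_id, joint_name, sim.simx_opmode_blocking)
    if err != sim.simx_return_ok:
        raise RuntimeError(f'Joint "{joint_name}" not found in the scene')
    # Ask the joint's controller to drive toward the target position.
    sim.simxSetJointTargetPosition(
        client_id, joint_handle, math.radians(angle_deg),
        sim.simx_opmode_oneshot)


# 'arm_joint1' is a hypothetical joint name used for illustration only.
move_joint('arm_joint1', 45.0)

sim.simxFinish(client_id)  # close the connection when done
```

In the actual programming modules, low-level helpers like this hypothetical move_joint are composed into higher-level functions, so a workflow script reads as a short list of laboratory operations rather than a sequence of raw API calls.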

Example of how the joint motion function operates

Finally, I look to control the robot itself using my Python programming modules and the simulator. Although I am completing my internship remotely, the show must go on. My mentors and I were able to remotely control the robot, which is located at our vendor’s laboratory, by video chatting and connecting to the vendor’s computer with Chrome Remote Desktop. We entered Python code, watched the simulation, and observed the robot completing the corresponding commands, all from our homes. Even though we did not intend for the project to be remote, this gave us confidence that we can control these systems from anywhere in the world without wasting valuable time.

Remotely controlling the robot with Python code and the simulator running simultaneously

Overall, I am amazed at what I have been able to accomplish in eight weeks thus far in the NAISE program, especially considering that I am a mechanical engineering major who came into the project expecting to manually run trials with the robotic system. None of this would have been possible without the assistance of my mentors, Jie Xu and Young Soo Park, as well as Dr. Jennifer Dunn and Britanni Williams, who have kept the NAISE program running in these strenuous times. In the coming weeks, I look to finish the library of functions for the simulator, demonstrate a basic workflow on the simulator and robot, and ultimately merge my robotic system with the artificial intelligence aspect of the project.
