Our project aims to deliver a surgical training method that establishes muscle memory through the integration of robotics and motion capture. Our device incorporates a robotic hand guide that leads the user through specific movements. These movements are recordings of the hand motions of experienced senior neurosurgery residents, stored as vectors, which will be generously gifted to us by a local research lab in St. Louis (Dr. Jack Engsberg). We will process the motion capture data in MATLAB to translate the recorded movements into robotic movement. Our device's movement is tool-centered: the user grasps the tool attachment on the actuator to feel the movements. We chose a 3D printer to execute the motions for the inputted vector data. 3D printers use G-code, a simple programming language into which vectors are easily translated, so nearly any movement can be created. Specifically, we focused on two components: movement in the YZ plane (parallel to the wall) and the translation of motion capture data into a form the 3D printer can accept. To demonstrate the teaching of a motor movement, we attempted to improve the drawing abilities of our subjects. To show that we can successfully translate motion capture data into device movement, we tracked a point on the right wrist, translated that motion into G-code, and input the code into the 3D printer to confirm that the resulting path was similar to the path plotted in MATLAB.
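The vector-to-G-code translation described above can be sketched as follows. The project's actual pipeline is written in MATLAB; this Python sketch only illustrates the idea, and the point coordinates, feed rate, and function name are invented for illustration. Each captured (Y, Z) position becomes a G1 linear move on the YZ plane.

```python
# Hypothetical sketch of translating motion-capture positions into
# G-code for a 3D printer. The real pipeline used MATLAB; values
# (points, feed rate) here are illustrative, not project data.

def points_to_gcode(points, feed_rate=600):
    """Translate a list of (y, z) positions in mm into G1 linear
    moves on the YZ plane, matching the tool-centered motion
    described in the text."""
    lines = ["G90"]  # absolute positioning
    for y, z in points:
        lines.append(f"G1 Y{y:.3f} Z{z:.3f} F{feed_rate}")
    return "\n".join(lines)

# Example: a short path standing in for a captured wrist trajectory
path = [(0.0, 0.0), (5.0, 2.5), (10.0, 2.5), (15.0, 0.0)]
print(points_to_gcode(path))
```

Because each captured sample maps to one G1 move, the printer retraces the recorded trajectory point by point, which is what makes comparing the printed path with the MATLAB plot straightforward.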