SINERGI, June 2021: 217-226
http://publikasi.id/index.php/sinergi
http://doi.org/10.22441/sinergi.

MOTION CONTROL ANALYSIS OF TWO COLLABORATIVE ARM ROBOTS IN A FRUIT PACKAGING SYSTEM

Tresna Dewi1*, Citra Anggraini1, Pola Risma1, Yurni Oktarina1, Muslikhin2,3
1 Department of Electrical Engineering, Politeknik Negeri Sriwijaya, Indonesia
2 Department of Electrical Engineering, Southern Taiwan University of Science and Technology, Taiwan
3 Department of Electrical Engineering Education, Universitas Negeri Yogyakarta, Indonesia

Abstract
As the use of robots increases in every sector of human life, the demand for cheap and efficient robots has also grown. The use of two or more simple robots is often preferable to the use of one sophisticated robot. The agriculture industry can benefit from installing robots at every stage, from seeding to the packaging of the product. A precise analysis is required for the installation of two collaborative robots. This paper discusses the motion control analysis of two collaborative arm robots in a fruit packaging system. The study begins with the relative motion analysis between the two robots, covering kinematics modeling, image processing for object detection, and the design of a Fuzzy Logic Controller that relates the robots' inputs and outputs. The analysis is carried out using Scilab, open-source software for numerical engineering computation. This paper is intended as an initial analysis of the feasibility of the real experimental system.

Keywords: Agriculture Robot, Arm Robot Manipulator, Collaborative Robot, Relative Motion

Article History: Received: July 19, 2020; Revised: September 24, 2020; Accepted: October 4, 2020; Published: February 20, 2021

Corresponding Author: Tresna Dewi, Department of Electrical Engineering, Politeknik Negeri Sriwijaya, Indonesia. Email: tresna_dewi@polsri.
This is an open access article under the CC BY-NC license.

INTRODUCTION
The first robot introduced into human life was the arm robot manipulator, used in industries to replace human labor in dull, dangerous, and dirty working environments [2, 3]. This type of robot is so well-recognized and well-researched that it makes a perfect worker for Industry 4.0. Automation has penetrated every aspect of human life, whether to support our lifestyle or to spoil us [6, 7, 8, 9, 10, 11, 12, 13, 14], in forms such as service robots, worker robots, and agriculture robots [13, 14, 15, 16]. The new trend in robotics is to take up the concept of community robotics, or multi-robots, mimicking the lives of social animals such as ants, flocks of birds, swarms of bees, or schools of fish. The use of many simple robots is more profitable than using a single robot with high technical sophistication. Although most of the multi-robots found today are mobile robots [8], the idea can also be applied to robot arms [3]. If one arm robot can help people with repetitive and time-consuming tasks, then using more than one arm robot will increase the task's speed without requiring human intervention [2, 3].

The use of more than one robot, or a multi-robot, in the fruit and goods packaging process is more beneficial in terms of time and efficiency than the use of a single robot. One robot can be equipped with a camera to distinguish the colors and shapes of the objects or fruits to be packaged. The other robot can be equipped with only a proximity sensor to ensure that the end of its arm does not collide with surrounding objects or the objects to be packaged. An example of this approach is a position-based visual multi-arm robotic cell using multiple cameras. The problem with numerous cameras, however, is the excessive computational time. Image processing is a key point for the visual cue of a robot used wherever it is necessary to recognize an object, such as in an agricultural robot
[9, 10, 11, 12, 13, 14, 15, 16]. Image processing is helpful in many agricultural tasks, such as seeding, maintenance, yield forecasting, harvesting, and packaging [22, 23, 24, 25, 26]. The use of robots in agriculture could ease the farmer's work burden and increase the harvest yield. Robot movement becomes more efficient and smoother by implementing artificial intelligence in the robot arm. The three types of Artificial Intelligence (AI) used in robotics are the Fuzzy Logic Controller, the Neural Network, and the Genetic Algorithm. Each of these can be implemented individually or in combination. However, simplicity of implementation is an issue; therefore, the most widely used of them, the Fuzzy Logic Controller, is proposed in this study [31].

This paper discusses the design and analysis of the motion control of two collaborative arm robots used in a fruit packaging system. The control design and analysis combine the concepts of kinematics and relative motion to show how two robots work together to create a collaborative environment between robots. This paper is intended as an initial feasibility analysis of the real experimental system, in which two robots work collaboratively instead of a human and a robot co-working. The study also presents the image processing that serves as the visual cue for the robot to pick and place the orange objects assigned to it. The robots assigned to this study are robot 1 and robot 2. Robot 1 is equipped with a proximity sensor only, and robot 2 is equipped with a proximity sensor and a camera. The system is connected to a scale system in which a weight sensor is installed. The mathematical analysis and Fuzzy Logic Controller design are presented to demonstrate the feasibility of the proposed method. This research is a continuation of our study in [14].
METHOD
This paper analyzes a series of relative motions and visual cues to produce robot motion resulting from image processing. The motion control analysis of two arm robot manipulators that work together in a fruit packing system is presented. Figure 1 shows the robot arms considered in this study.

Robot Design
The design of the robots and their working environment is shown in Figure 1. Robot 1 is equipped with a proximity sensor to sense the distance between the robot's end-effector and the object when picking up the object, and to sense the distance between the end-effector and the weight scale.

Figure 1. The collaborative arm robots and their working environment

The object considered in this study is an orange. Robot 2 picks the orange and places it in the box according to the size of the orange. The weight scale in Figure 1 is connected to the arm robot system. The weight sensor detects load pressure or weight and is commonly used as the main element of a digital weighing device. It works by converting the received forces, such as pressure, tension, and compression, into an electrical signal. The concept of the Wheatstone bridge is used to analyze this system, as shown in Figure 2, where $V_{Ex}$ is the excitation voltage, $V_0$ is the output voltage, $R_1$ and $R_3$ are tension strain gauges, and $R_2$ and $R_4$ are compression strain gauges. If $R_1$ to $R_4$ in Figure 2 are balanced, $R_1/R_2 = R_3/R_4$, then $V_0$ is zero. Hence, changes in $R_1$ to $R_4$ result in a change in $V_0$, related to the resistances through Ohm's law, $V = IR$. Therefore, based on the Wheatstone bridge configuration in Figure 2, the output voltage is given by

$$V_0 = \left( \frac{R_3}{R_3 + R_4} - \frac{R_1}{R_1 + R_2} \right) V_{Ex} \qquad (1)$$

Figure 2. Wheatstone bridge circuit

The electrical design of the arm robot manipulator is presented in Figure 3.
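The bridge relation above is straightforward to evaluate numerically. The following is a minimal Python sketch, not the paper's Scilab code, with illustrative resistor values (350 Ω is a common strain-gauge nominal, assumed here):

```python
def bridge_output(r1, r2, r3, r4, v_ex):
    """Wheatstone bridge output: V0 = (R3/(R3+R4) - R1/(R1+R2)) * VEx."""
    return (r3 / (r3 + r4) - r1 / (r1 + r2)) * v_ex

# Balanced bridge (R1/R2 = R3/R4): the output voltage is zero.
balanced = bridge_output(350.0, 350.0, 350.0, 350.0, 5.0)

# One arm strained (R3 in tension, R4 in compression): a small nonzero output
# proportional to the resistance change appears across the bridge.
loaded = bridge_output(350.0, 350.0, 351.0, 349.0, 5.0)
```

A load cell's readout electronics amplify this millivolt-level difference before digitization.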
The arm robot is equipped with a Pi camera to capture the image, which is then processed by an image processing method to separate the orange from its background. Hence, the coordinate of the orange in the image coordinate frame is known to the end-effector. Image processing is conducted on a Raspberry Pi, and the output becomes the visual cue for the ATmega 2560 controller to move the arms. The robot is also equipped with four servo motors, one for each joint and one on the end-effector to grab the object.

Figure 3. Electrical design of the robot

Relative Motion Analysis
The first analysis is based on a relative motion analysis, where each joint is considered to be moving relative to the other joints. The relative motion analysis assumes general planar motion, where the robot bodies translate and rotate simultaneously. The two arm robots shown in Figure 1 move relative to each other, given by

$$v_{R1/R2} = v_{R2} - v_{R1} \qquad (2)$$

where $v_{R1/R2}$ is the velocity of robot 1 ($v_{R1}$) relative to robot 2 ($v_{R2}$). The two arm robots are identical to each other; therefore, the relative motion analysis is conducted for one robot. For the sake of simplicity, the robot is considered as the three-link planar arm shown in Figure 4, where $a_1$ is the length from the base (joint A) to B, $a_2$ is the length from B to C, and $a_3$ is the length of the third link. The relative motion analysis breaks the robot into four points: A, B, C, and P, where P is the robot's end-effector. Joint A is the only fixed point, and the rotations are the motions from A to B and then from B to C. The end-effector of the robot is at the tip, point P. Therefore, the relative velocity of the joints of the robot in Figure 1 is

$$v_{P/A} = v_{B/A} + v_{C/B} + v_{P/C} \qquad (3)$$

where $v_{P/A}$ is the relative velocity of point P (the end-effector) with respect to point A, $v_{B/A}$ is the relative velocity of point B with respect to A, $v_{C/B}$ is the relative velocity of point C with respect to B, and $v_{P/C}$ is that of the end-effector with respect to point C.
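The relative-velocity relations above are plain vector sums and differences. A small sketch, with planar velocity vectors represented as tuples and purely hypothetical values:

```python
def v_add(a, b):
    """Component-wise sum of two planar velocity vectors."""
    return (a[0] + b[0], a[1] + b[1])

def v_sub(a, b):
    """Component-wise difference of two planar velocity vectors."""
    return (a[0] - b[0], a[1] - b[1])

def chain_velocity(v_b_a, v_c_b, v_p_c):
    """v_P/A = v_B/A + v_C/B + v_P/C: end-effector velocity from joint terms."""
    return v_add(v_add(v_b_a, v_c_b), v_p_c)

def robot_relative_velocity(v_r1, v_r2):
    """v_R1/R2 = v_R2 - v_R1, as stated in the paper's relative-motion relation."""
    return v_sub(v_r2, v_r1)
```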
Figure 4. Assigned coordinate frames of the three-link planar arm

Table 1. DH parameters of the arm robot in Figure 4
i | $a_i$   | $\alpha_i$ | $d_i$ | $\theta_i$
1 | $a_1$ | 0          | 0     | $\theta_1$
2 | $a_2$ | 0          | 0     | $\theta_2$
3 | $a_3$ | 0          | 0     | $\theta_3$

The velocity of each joint can be analyzed by applying kinematics modeling to the robot in Figure 4. The analysis starts by assigning coordinate frames to the robot shown in Figure 1 and finding the robot velocity and each joint's velocity. Consider the three-link planar arm in Figure 4, where all the revolute axes are parallel and the direction of $x_0$ is arbitrary. All the link offsets along the previous z-axis to the common normal ($d_i$) are null, since all the joints are revolute. The angles $\alpha_i$ between the axes $z_i$ are likewise zero. The Denavit-Hartenberg (DH) parameters used to analyze the robot's kinematics modeling are presented in Table 1. Therefore, based on the DH parameters in Table 1, the direct kinematics function is given by

$$T_3^0(q) = T_1^0 \, T_2^1 \, T_3^2 = \begin{bmatrix} c_{123} & -s_{123} & 0 & a_1 c_1 + a_2 c_{12} + a_3 c_{123} \\ s_{123} & c_{123} & 0 & a_1 s_1 + a_2 s_{12} + a_3 s_{123} \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

where $q = [\theta_1 \ \theta_2 \ \theta_3]^T$, $c_1$ is $\cos\theta_1$, $s_1$ is $\sin\theta_1$, $c_{12}$ is $\cos(\theta_1 + \theta_2)$, $s_{12}$ is $\sin(\theta_1 + \theta_2)$, $c_{123}$ is $\cos(\theta_1 + \theta_2 + \theta_3)$, and $s_{123}$ is $\sin(\theta_1 + \theta_2 + \theta_3)$.

It is also necessary to find the joint variables $\theta_1$, $\theta_2$, and $\theta_3$ corresponding to a given end-effector position and orientation. The end-effector position and orientation are $p_x$, $p_y$, and $\phi$, where

$$\phi = \theta_1 + \theta_2 + \theta_3$$

The position of point C, the origin of Frame 2 in which the angles $\theta_1$ and $\theta_2$ are resolved, is

$$p_{Wx} = p_x - a_3 c_\phi = a_1 c_1 + a_2 c_{12}$$

The angle $\theta_1$ is then obtained as $\theta_1 = \mathrm{Atan2}(s_1, c_1)$, with $s_1$ and $c_1$ derived from the wrist-point coordinates. If $s_2 = 0$, then $\theta_2 = 0$; if the kinematics reach the singularity, then $\theta_2 = \pi$. The angle $\theta_1$ can be determined uniquely, except when $p_{Wx} = p_{Wy} = 0$, which can occur only when $a_1 = a_2$. Hence, $\theta_3 = \phi - \theta_1 - \theta_2$.
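The direct kinematics function reduces, for position and orientation, to three scalar relations read off the last column of $T_3^0$ and the angle sum. A minimal Python sketch (link lengths are illustrative placeholders, not the paper's dimensions):

```python
import math

def forward_kinematics(t1, t2, t3, a1, a2, a3):
    """End-effector pose (px, py, phi) of the three-link planar arm,
    read off the position column and angle sum of T_3^0."""
    phi = t1 + t2 + t3
    px = a1 * math.cos(t1) + a2 * math.cos(t1 + t2) + a3 * math.cos(phi)
    py = a1 * math.sin(t1) + a2 * math.sin(t1 + t2) + a3 * math.sin(phi)
    return px, py, phi
```

With all joints at zero the arm lies stretched along $x_0$, so the end-effector sits at $p_x = a_1 + a_2 + a_3$.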
The velocity of the center of gravity (CG) of each link, obtained from the kinematic analysis, is

$$v_{G1}^0 = \begin{bmatrix} -0.5\, a_1 s_1 \dot{\theta}_1 \\ 0.5\, a_1 c_1 \dot{\theta}_1 \end{bmatrix}$$

$$v_{G2}^0 = \begin{bmatrix} -a_1 s_1 \dot{\theta}_1 - 0.5\, a_2 s_{12} (\dot{\theta}_1 + \dot{\theta}_2) \\ a_1 c_1 \dot{\theta}_1 + 0.5\, a_2 c_{12} (\dot{\theta}_1 + \dot{\theta}_2) \end{bmatrix}$$

$$v_{G3}^0 = \begin{bmatrix} -a_1 s_1 \dot{\theta}_1 - a_2 s_{12} (\dot{\theta}_1 + \dot{\theta}_2) - 0.5\, a_3 s_{123} (\dot{\theta}_1 + \dot{\theta}_2 + \dot{\theta}_3) \\ a_1 c_1 \dot{\theta}_1 + a_2 c_{12} (\dot{\theta}_1 + \dot{\theta}_2) + 0.5\, a_3 c_{123} (\dot{\theta}_1 + \dot{\theta}_2 + \dot{\theta}_3) \end{bmatrix}$$

Similarly to $p_{Wx}$, the wrist-point ordinate is

$$p_{Wy} = p_y - a_3 s_\phi = a_1 s_1 + a_2 s_{12}$$

Squaring and summing $p_{Wx}$ and $p_{Wy}$ yields $p_{Wx}^2 + p_{Wy}^2 = a_1^2 + a_2^2 + 2 a_1 a_2 c_2$. Hence,

$$c_2 = \frac{p_{Wx}^2 + p_{Wy}^2 - a_1^2 - a_2^2}{2 a_1 a_2}$$

Then, by requiring $-1 \le c_2 \le 1$ to ensure that the given point lies inside the reachable arm workspace,

$$s_2 = \pm\sqrt{1 - c_2^2}$$

where the positive sign corresponds to the elbow-down posture and the negative sign to the elbow-up posture. Therefore, $\theta_2$ is calculated as $\theta_2 = \mathrm{Atan2}(s_2, c_2)$. Substituting $\theta_2$ back yields the algebraic equations for the unknowns $s_1$ and $c_1$:

$$s_1 = \frac{(a_1 + a_2 c_2)\, p_{Wy} - a_2 s_2\, p_{Wx}}{p_{Wx}^2 + p_{Wy}^2}, \qquad c_1 = \frac{(a_1 + a_2 c_2)\, p_{Wx} + a_2 s_2\, p_{Wy}}{p_{Wx}^2 + p_{Wy}^2}$$

The end-effector velocity is

$$v_P^0 = \begin{bmatrix} -a_1 s_1 \dot{\theta}_1 - a_2 s_{12} (\dot{\theta}_1 + \dot{\theta}_2) - a_3 s_{123} (\dot{\theta}_1 + \dot{\theta}_2 + \dot{\theta}_3) \\ a_1 c_1 \dot{\theta}_1 + a_2 c_{12} (\dot{\theta}_1 + \dot{\theta}_2) + a_3 c_{123} (\dot{\theta}_1 + \dot{\theta}_2 + \dot{\theta}_3) \end{bmatrix}$$

By substituting these velocities into the relative-velocity relation, the relative motion of the robot in Figure 4 can be obtained; and since the two robots are identical, the robot-to-robot relative motion is obtained as well.

Image Processing
Image processing requires computational resources; the more complicated the image processing, the more resources are required. Therefore, the image processing method should be simplified without sacrificing its effectiveness. The basic image processing operations conducted in this study are as follows.

Grayscale conversion
The first step of image processing is converting the image to grayscale, where each pixel represents the amount of light it carries.
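The closed-form inverse kinematics derived above can be checked numerically against the forward map. A sketch under the same symbols, not the paper's Scilab implementation, with the elbow posture selectable by the sign of $s_2$:

```python
import math

def inverse_kinematics(px, py, phi, a1, a2, a3, elbow_down=True):
    """Joint angles (t1, t2, t3) for a target pose (px, py, phi)
    of the three-link planar arm."""
    # Wrist point: subtract the last link along the orientation phi.
    pwx = px - a3 * math.cos(phi)
    pwy = py - a3 * math.sin(phi)
    # Cosine of theta2 from squaring and summing the wrist coordinates.
    c2 = (pwx ** 2 + pwy ** 2 - a1 ** 2 - a2 ** 2) / (2.0 * a1 * a2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target outside the reachable workspace")
    # Positive root: elbow-down posture; negative root: elbow-up.
    s2 = math.sqrt(1.0 - c2 ** 2) * (1.0 if elbow_down else -1.0)
    t2 = math.atan2(s2, c2)
    den = pwx ** 2 + pwy ** 2
    s1 = ((a1 + a2 * c2) * pwy - a2 * s2 * pwx) / den
    c1 = ((a1 + a2 * c2) * pwx + a2 * s2 * pwy) / den
    t1 = math.atan2(s1, c1)
    t3 = phi - t1 - t2
    return t1, t2, t3
```

A useful sanity check is the round trip: feeding the pose computed from a known joint set back through `inverse_kinematics` should recover the same angles for the matching elbow posture.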
This limits the information inside the image to intensity only; therefore, it is easier to proceed to the next image processing steps.

Thresholding
Thresholding is conducted by comparing the grey level of each pixel to a threshold. A pixel with a grey level equal to or higher than the threshold is marked as true, and a pixel with a grey level lower than the threshold is marked as false. The result of this operation is called a logical image.

Filtering
Filtering places a mask on each pixel to compute a new grey or logical value. The object of interest can be emphasized, and irrelevant objects can be removed.

Blob analysis
Blob analysis continues the process by connecting the true pixels. All true pixels are marked by a number greater than zero, while the false pixels are marked as zero. By connecting the true pixels, characteristic properties such as the centroid can be computed.

RESULTS AND DISCUSSION
The proposed method's feasibility is simulated using Scilab, an open-source engineering simulation package for modeling and simulating mathematical problems in engineering. The simulation results consist of the image processing results, serving as the visual cue, and the fuzzy logic controller resulting from the kinematics analysis.

The Detected Oranges
Image processing is necessary as the visual cue to move the robot. The robot only moves when the camera detects the assigned object. The object considered in this study is an orange. The original image is shown in Figure 5.

Figure 5. The original image of oranges

The raw image of the oranges in Figure 5 is processed so that the robot recognizes the shapes of the oranges and takes the rightmost one as the first object to be picked and placed, since the rightmost orange is the closest to the robot. The final detected image is shown in Figure 6, where the boxes mark the detected oranges. Figure 7 shows the steps of the image processing considered in this study.
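The grayscale, thresholding, and blob-analysis steps described above can be sketched end-to-end in plain Python. This is an illustrative reimplementation, not the paper's Scilab code: nested lists stand in for image arrays, and a fixed threshold replaces Otsu's adaptive one for brevity:

```python
def to_gray(rgb):
    """Luminance grayscale of an RGB image given as rows of (r, g, b) tuples."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in rgb]

def threshold_image(gray, t):
    """Logical image: True where the grey level is equal to or above t."""
    return [[px >= t for px in row] for row in gray]

def blobs(mask):
    """Connect true pixels (4-connectivity) and return one centroid per blob."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                # Flood-fill one connected component of true pixels.
                stack, pts = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pts.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                centroids.append((sum(p[0] for p in pts) / len(pts),
                                  sum(p[1] for p in pts) / len(pts)))
    return centroids
```

The blob centroids are what the controller needs: they give the image-frame coordinates of each detected orange, from which the rightmost one can be selected.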
The first step in processing the captured image is to convert the original image to grayscale. The grayscale image allows the image to be manipulated and processed to detect the assigned object. The thresholding process starts by inverting the image. The Otsu threshold algorithm is then applied to find objects in the image by identifying the pixels significantly brighter or darker than the background. The next stage is blob analysis, where the true pixels in the logical image are connected to form shape and color regions. The final detected objects are shown in Figure 6, where bounding boxes are determined and drawn around each detected object.

Figure 6. The final detected image

Figure 7. Image processing steps considered in this study

Fuzzy Logic Controller
Robot 1 takes as inputs the orange detection from the camera and the distance between the end-effector and the oranges from the proximity sensor. Robot 2 takes the weight sensor's input to sort the oranges and place them accordingly in a box. Robot 2 also receives the distance approximation between the end-effector and the oranges from the proximity sensor. The input membership functions for robots 1 and 2 are shown in Figure 8 and Figure 9.

Figure 8. Visual cue and proximity sensor inputs to the system of robot 1

Figure 9. Weight sensor membership function for robot 2

Figure 10. The output membership function resulting from the application of the input membership functions

The membership outputs of robots 1 and 2 are the same: the angles $\theta_1$, $\theta_2$, $\theta_3$, and $\phi$, which correspond to the angles made by the servo motors that move the joints. The membership output is shown in Figure 10.
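The paper's actual membership functions are those plotted in Figures 8-10 and are not reproduced here. The sketch below uses hypothetical triangular sets and a weighted-average defuzzification to illustrate how such a controller maps a proximity reading to a servo angle; all set boundaries and output values are assumptions:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peak of 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical input sets for the proximity sensor (distance in cm).
def near(d):
    return tri(d, -0.1, 0.0, 6.0)

def far(d):
    return tri(d, 4.0, 10.0, 16.0)

# Hypothetical output singletons for the end-effector angle phi (degrees).
PHI_CLOSE, PHI_OPEN = 30.0, 90.0

def gripper_angle(d):
    """Two-rule inference with weighted-average defuzzification:
    IF near THEN phi = PHI_CLOSE; IF far THEN phi = PHI_OPEN."""
    rules = [(near(d), PHI_CLOSE), (far(d), PHI_OPEN)]
    num = sum(w * y for w, y in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

Between the two sets, the output interpolates smoothly, which is what produces the gradual control surfaces of the kind shown in Figures 11-14.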
Figure 11 shows the relationship between the camera detection, the proximity sensor on the end-effector, and $\phi$, the end-effector servo angle, as robot 1 picks and places the oranges. Figure 12 shows the base motion ($\theta_1$) when the camera detects the orange and moves the robot.

Figure 11. The relation between camera detection, proximity sensor, and $\phi$ for robot 1

Figure 12. The relation between camera detection, proximity sensor, and $\theta_1$ for robot 1

Figure 13. The relation between weight sensor, proximity sensor, and $\phi$ for robot 2

Figure 14. The relation between weight sensor, proximity sensor, and $\theta_1$ for robot 2

Figure 13 illustrates the relationship between the weight sensor, the proximity sensor, and the end-effector angle of robot 2 while picking and placing the orange. Figure 14 shows the relationship between the weight sensor, the proximity sensor, and the robot-base angle when robot 2 sorts the orange and places it in the box according to the detected weight. The simulation results show the feasibility of designing two collaborative robots for picking and sorting oranges in an agricultural product packing system.

CONCLUSION
A robot can be assigned to replace or assist human beings in many sectors of life, such as agriculture, from seeding to the packaging of agricultural products. In doing so, one of the essential factors is image processing, as the robot must be able to recognize the object to be manipulated. Image processing should be kept simple to accommodate limited resources, without sacrificing effective detection. Applying more than one robot can increase productivity compared to a single robot; therefore, the robots' relative motion must be considered. The kinematics analysis and fuzzy logic controller design show the feasibility of designing two collaborative robots applied in an agricultural product packing system.

REFERENCES