eISSN: 2574-8092

International Robotics & Automation Journal

Mini Review Volume 9 Issue 1

Computer controlled image based object classification by using a robot manipulator

Erol UYAR, Mücahid CANDAN

Ege University - Faculty of Engineering, Department of Electric-Electronic Engineering, Turkey

Correspondence: Erol UYAR, Ege University- Faculty of Engineering Department of Electric-Electronic Engineering, 35100 - BORNOVA, IZMIR/TÜRKIYE, Tel +90 232 38831 38, Fax +90 232 388 78 68

Received: January 16, 2023 | Published: February 15, 2023

Citation: Uyar E, Candan M. Computer controlled image based object classification by using a robot manipulator. Int Rob Auto J. 2023;9(1):21-24. DOI: 10.15406/iratj.2023.09.00257


Abstract

In this paper, the design, implementation and application of a computer-controlled vision system with a robot manipulator for the classification of various fruits is introduced. The system, which consists of an RGB camera, a PC and a five-degree-of-freedom (DOF) robot manipulator, is specifically set up for real-time applications.

In the first stage of the study, the RGB image of the objects viewed by the camera is captured and stored on the computer via a USB link as digital image data for a subsequent segmentation process that recognizes the objects. A dedicated neural network is designed to classify the objects exactly.

The moving objects on a belt, driven by a servo-stepper motor, are recognized in size and property via a vertically mounted camera. The recognized objects are then sorted by a computer-controlled robot manipulator with five DOF and a gripper at the end-effector, which transports the objects along a predefined trajectory to designated locations.

In the second stage, the kinematic link table of the robot manipulator is determined and inverse kinematics is applied, so that the link parameters (joint angles) can be calculated from the given waypoints of a particular trajectory. The joint angles are used as reference values for a joint-based control algorithm programmed in C++. The joints are equipped with geared DC motors with position feedback.

A dedicated control program is also used to control the motion of the servo-stepper belt motor and the pneumatically actuated gripper valve of the manipulator. Successful results are obtained both in sorting fruit by size and property and in the synchronous operation of the robot arm in connection with image processing.

Keywords: DOF, image processing, robot arm, control, joint, computer, robot manipulator, RGB camera, neural network

Introduction

In modern production technologies and automation systems, the use of robot manipulators has become indispensable, and various kinds of manipulators for various tasks are being developed at an increasing rate. The use of various sensor technologies with robot manipulators has helped to create convenient and effective production systems.

Machine vision has been a key sensor technology for robots and has the potential to be used in many industrial processes. Machine vision currently has a lot of uses in inspection, but it is anticipated that robotics will use vision technology more and more in the future. The system for object recognition is a good illustration of machine vision. There are several uses for object recognition, including identifying precise locations, separating objects based on size and property, and inspecting a work area.

A robot manipulator should be considered as more than just a set of mechanical linkages. The mechanical arm is only one part of a larger robotics system, which also includes an external power supply, end-of-arm tooling, external and internal sensors, actuators, a computer interface, and a control unit. Since the robot's programming and control can have a significant impact on its performance and later applications, even the software should be viewed as a crucial component of the total system.

By using different sensing systems with robot manipulators, simultaneous operations can be realized exactly under variable conditions, which positively affects the quality of production.

In this study, a classification system made up of a robot manipulator, a digital RGB camera, and a speed-controlled moving belt (rodless cylinder) is illustrated (Figure 1). The robot manipulator moves the recognized object to the desired location after image capture with the camera and segmentation. The approach is specifically applied and evaluated for categorizing different fruits, where some fruits, such as oranges, lemons and apples, are quite similar.

Successful applications are realized to investigate the vision capability and positioning accuracy of the whole system, particularly for convenience in industrial tasks such as conveying and positioning operations and object sorting according to size and property.

Configuration and implementation of the system

The system can be considered in two main parts: the first is the manipulator and its trajectory plan, the second is image processing.

Five DOF-Robot manipulator

This part involves two main tasks: the construction of the manipulator and the decisions concerning its kinematics. Here, a manipulator designed and constructed in the Electric-Electronic Engineering Department of Ege University is used for trajectory planning and transport of the objects classified by machine vision (Figure 1).

Figure 1 Five DOF Robot manipulator.

As a first step, the kinematic analysis and “link table” of the revolute-jointed manipulator with five DOF are investigated. Furthermore, a path-following and control algorithm for the joint motors is developed to move the manipulator over the chessboard.

The body and the joints of the manipulator are all made from aluminum because of the low weight and toughness of this material. In order to reduce the speed and increase the motor torque to the desired level, gear reduction mechanisms are used for the joint actuators. Harmonic drives are a type of gear mechanism that is common in robots due to their low backlash, high torque transmission and compact size; however, they introduce undesirable friction and flexibility at the joints (Figure 2).

Figure 2 Drawing of robot manipulator.

The workspace of a manipulator, as characterized in Table 1, is the total volume swept out by the end-effector while the manipulator performs all possible motions. The extent, number, type, and sequence of the basic axes, as well as the control software, influence the size and shape of the robot workspace.

The workspace is restricted by the mechanical limitations of the joints as well as by the manipulator's geometrical design. For instance, the range of motion of a revolute joint may be less than a full revolution. The production tasks may impose additional restrictions as well.

The manipulator is formed by links numbered from 0 (the base) to 5. There are five joints, which are the points in space formed by the connection of the manipulator links with each other. The joints are denoted by qi, where qi is the rotation angle of joint i. Next, a coordinate frame is attached rigidly to each link. To be specific, an inertial frame is attached to the base and called frame 0. Then frames 1 to 5 are chosen such that frame i is rigidly attached to link i. This means that, whatever motion the robot executes, the coordinates of each point on link i are constant when expressed in the ith coordinate frame. Table 1 illustrates the idea of attaching frames rigidly to links in the case of an elbow manipulator. Assume that Ti is the homogeneous matrix that transforms the coordinates of a point from frame i to frame i-1. The matrix Ti is not constant, but varies as the configuration of the robot changes. However, the assumption that all joints are revolute implies that Ti is a function of only a single joint variable, namely qi. The second step in forward kinematics is to compose a table showing all link parameters.1-3 (Table 2)

DOF 1: 0° ≤ θ1 ≤ 150°
DOF 2: 0° ≤ θ2 ≤ 120°
DOF 3: 0° ≤ θ3 ≤ 150°
DOF 4: 0° ≤ θ4 ≤ 180°
DOF 5: 0° ≤ θ5 ≤ 360°

Table 1 Working capacity of the manipulator arms, in terms of the rotation angles of the joints
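A trajectory point is usable only if every joint angle produced by the inverse kinematics lies inside the ranges of Table 1. The following C++ sketch shows one simple way such a check might be coded; the limits are copied from the table, while the example angles are arbitrary.

```cpp
#include <cstdio>

// Joint ranges from Table 1, in degrees.
const double jointMinDeg[5] = {0, 0, 0, 0, 0};
const double jointMaxDeg[5] = {150, 120, 150, 180, 360};

// Return true if all five joint angles lie inside the mechanical limits.
bool withinWorkspace(const double qDeg[5]) {
    for (int i = 0; i < 5; ++i)
        if (qDeg[i] < jointMinDeg[i] || qDeg[i] > jointMaxDeg[i])
            return false;
    return true;
}

int main() {
    double q[5] = {45, 60, 90, 30, 180};   // candidate solution (example values)
    std::printf("reachable: %s\n", withinWorkspace(q) ? "yes" : "no");
}
```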

i    αi-1 (deg)    ai-1 (mm)    di (mm)    θi
1    0             0            210        θ1
2    +90           0            0          θ2
3    0             202          0          θ3
4    0             202          0          θ4
5    -90           360          0          θ5

Table 2 Link parameters

T_5^0 = T_1^0 · T_2^1 · T_3^2 · T_4^3 · T_5^4    (1)
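As an illustration of how equation (1) can be evaluated numerically from the link table, the following C++ sketch composes the five transforms using Craig's modified Denavit-Hartenberg convention, which matches the αi-1, ai-1, di, θi column layout of Table 2. The joint angles in main() are arbitrary example values, and the exact matrix convention used by the authors is an assumption.

```cpp
#include <array>
#include <cmath>
#include <cstdio>

using Mat4 = std::array<double, 16>;   // 4x4 homogeneous matrix, row-major
const double PI = std::acos(-1.0);

// Modified DH transform from frame i-1 to frame i
// (parameters: alpha(i-1), a(i-1), d(i), theta(i)).
Mat4 dh(double alpha, double a, double d, double theta) {
    double ct = std::cos(theta), st = std::sin(theta);
    double ca = std::cos(alpha), sa = std::sin(alpha);
    return { ct,      -st,       0,   a,
             st * ca,  ct * ca, -sa, -sa * d,
             st * sa,  ct * sa,  ca,  ca * d,
             0,        0,        0,   1 };
}

Mat4 mul(const Mat4& A, const Mat4& B) {
    Mat4 C{};
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
            for (int k = 0; k < 4; ++k)
                C[4 * r + c] += A[4 * r + k] * B[4 * k + c];
    return C;
}

int main() {
    const double deg = PI / 180.0;
    double q[5] = {30 * deg, 45 * deg, -30 * deg, 20 * deg, 0};  // example angles

    // Parameters from Table 2: alpha(i-1) [deg], a(i-1) [mm], d(i) [mm].
    Mat4 T =   dh(  0 * deg,   0, 210, q[0]);    // 0T1
    T = mul(T, dh(+90 * deg,   0,   0, q[1]));   // 1T2
    T = mul(T, dh(  0 * deg, 202,   0, q[2]));   // 2T3
    T = mul(T, dh(  0 * deg, 202,   0, q[3]));   // 3T4
    T = mul(T, dh(-90 * deg, 360,   0, q[4]));   // 4T5

    std::printf("end-effector position: x=%.1f  y=%.1f  z=%.1f mm\n",
                T[3], T[7], T[11]);
}
```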

It is always desirable to be able to determine the joint positions required for the manipulator to reach a desired point in Cartesian coordinates. For given X, Y and Z in Cartesian coordinates, the forward kinematic equations must be inverted to obtain the joint angles. In other words, the objective is the calculation of the successive values of the robot variables required to maintain a particular trajectory and orientation of an object. Here, this trajectory and orientation problem is considered; it is more difficult than the forward kinematics problem. The general inverse kinematics problem is presented below: given the 4×4 homogeneous transformation matrix of the end-effector, find the equivalent theta angles of the five links.

Given the position components Px, Py and Pz, only three unknown joint parameters remain, and the remaining angles of the manipulator can be solved, since three equations are available for the three unknowns.

x = Px = [360*cos(Q2+Q3+Q4) + 202*cos(Q2+Q3) + 202*cos(Q2)] / [(Px^2 + Py^2)/Px^2]^(1/2)    (2)

y = Py = [360*Py*cos(Q2+Q3+Q4) + 202*Py*cos(Q2+Q3) + 202*Py*cos(Q2)] / Px / [(Px^2 + Py^2)/Px^2]^(1/2)    (3)

z = Pz = 360*sin(Q2+Q3+Q4) + 210 + 202*sin(Q2+Q3) + 202*sin(Q2)    (4)

As described above, these equations are nonlinear, and one method of solving them is the Gauss elimination method, in which the solution depends on the chosen initial conditions and differential interval. A program has been written in the C++ language to find solutions for all essential conditions. After the solution matrices are obtained, the program selects the best solution for manipulator control. The selection criterion is the easiest and shortest stroke of the manipulator.4 (Figure 3)
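The authors' solver is not listed, so the following is only a minimal sketch of one way such a numerical solution could be organized: a Newton iteration on the position equations, in which the Jacobian is approximated with a finite "differential interval" and each linearized step is solved by Gauss elimination. To keep the sketch small, the radial distance r = (Px^2 + Py^2)^(1/2) is used together with equation (4), Q1 is taken as atan2(Py, Px), and the tool pitch Q2+Q3+Q4 is held at an assumed fixed value; none of these simplifications are claimed to be the authors' exact formulation.

```cpp
#include <cmath>
#include <cstdio>

// Link constants (mm) taken from Table 2 / equations (2)-(4).
const double D1 = 210.0, A2 = 202.0, A3 = 202.0, A4 = 360.0;

// Residuals of the radial/vertical position equations for joint angles
// q2, q3 when the tool pitch phi = q2+q3+q4 is held fixed.
void residual(double q2, double q3, double phi,
              double rTarget, double zTarget, double f[2]) {
    f[0] = A4 * std::cos(phi) + A3 * std::cos(q2 + q3) + A2 * std::cos(q2) - rTarget;
    f[1] = A4 * std::sin(phi) + D1 + A3 * std::sin(q2 + q3) + A2 * std::sin(q2) - zTarget;
}

// Newton iteration: finite-difference Jacobian, 2x2 Gauss elimination.
bool solveIK(double x, double y, double z, double phi,
             double& q1, double& q2, double& q3, double& q4) {
    q1 = std::atan2(y, x);
    double r = std::sqrt(x * x + y * y);
    q2 = 0.5; q3 = 0.5;                 // initial guess (rad)
    const double h = 1e-6;              // differential interval
    for (int it = 0; it < 50; ++it) {
        double f[2], fa[2], fb[2];
        residual(q2, q3, phi, r, z, f);
        if (std::fabs(f[0]) < 1e-6 && std::fabs(f[1]) < 1e-6) {
            q4 = phi - q2 - q3;
            return true;
        }
        residual(q2 + h, q3, phi, r, z, fa);
        residual(q2, q3 + h, phi, r, z, fb);
        // Jacobian columns by forward differences.
        double J[2][2] = {{(fa[0] - f[0]) / h, (fb[0] - f[0]) / h},
                          {(fa[1] - f[1]) / h, (fb[1] - f[1]) / h}};
        // Gauss elimination on the 2x2 system J * dq = -f.
        double rhs[2] = {-f[0], -f[1]};
        double m = J[1][0] / J[0][0];
        J[1][1] -= m * J[0][1];
        rhs[1]  -= m * rhs[0];
        double dq3 = rhs[1] / J[1][1];
        double dq2 = (rhs[0] - J[0][1] * dq3) / J[0][0];
        q2 += dq2; q3 += dq3;
    }
    return false;
}

int main() {
    double q1, q2, q3, q4;
    if (solveIK(500.0, 0.0, 400.0, 0.0, q1, q2, q3, q4))
        std::printf("q1=%.3f q2=%.3f q3=%.3f q4=%.3f rad\n", q1, q2, q3, q4);
    else
        std::printf("no convergence\n");
}
```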

Figure 3 Unit step response of the links.

Before the control applications, the position-tracking capability of the manipulator is tested for each arm. First of all, reconsidering the mechanical design and workspace, the feasible motion capabilities of the manipulator are investigated. Despite its simplicity, joint-based control is used for each joint by running additional control algorithms. The responses of the joints to different control algorithms are tested.
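The joint-based control algorithm itself is not reproduced in the paper; the following C++ sketch shows one plausible form of it, a discrete PID position loop for a single geared DC motor joint, closed around a very rough simulated joint (the gains, velocity gain and sampling time are assumed example values, not the authors').

```cpp
#include <algorithm>
#include <cstdio>

// Discrete PID position controller for one joint (geared DC motor
// with position feedback, as described in the text).
struct JointPID {
    double kp, ki, kd;
    double integral = 0.0, prevError = 0.0;

    double update(double refDeg, double measDeg, double dt) {
        double e = refDeg - measDeg;
        integral += e * dt;
        double deriv = (e - prevError) / dt;
        prevError = e;
        return std::clamp(kp * e + ki * integral + kd * deriv, -1.0, 1.0);
    }
};

int main() {
    // Very rough joint model for the sketch: the motor command sets the
    // joint velocity (deg/s) through an assumed gain.
    const double velocityGain = 90.0;   // deg/s at full command (assumed)
    const double dt = 0.01;             // 100 Hz control loop (assumed)

    JointPID pid{0.05, 0.01, 0.005};    // example gains
    double angle = 0.0, ref = 60.0;     // step reference of 60 degrees

    for (int k = 0; k < 500; ++k) {
        double u = pid.update(ref, angle, dt);
        angle += velocityGain * u * dt; // simulated joint response
        if (k % 100 == 0)
            std::printf("t=%.2fs  angle=%.1f deg\n", k * dt, angle);
    }
}
```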

Vision system and segmentation algorithm

In this part of the study, a digital RGB camera and an image processing algorithm are used to capture the image upon the activation signal of an object-detector sensor. The RGB camera is mounted over the rodless cylinder, which is moved by a servo-stepper motor and drive unit.5-9 The image processing program and the control of the stepper-motor drive unit are executed by the computer. The conveying operation is actualized through the gripper at the end-effector. The operating scheme of the process is given in Figure 4.

Figure 4 Operating scheme of process.
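The operating scheme of Figure 4 can be read as a simple sense-classify-act cycle. The C++ sketch below is a hypothetical top-level structure under that reading; the detector, camera, arm and gripper functions are placeholder stubs standing in for interfaces the paper does not specify.

```cpp
#include <cstdio>

// Placeholder stubs: the real sensor, camera, manipulator and
// pneumatic-gripper interfaces are not described in the paper.
bool objectDetected()       { return true; }                        // detector sensor
int  captureAndClassify()   { std::puts("capture + segment + NN"); return 1; }
void moveArmToPick()        { std::puts("arm -> pick pose"); }
void moveArmToBin(int cls)  { std::printf("arm -> bin %d\n", cls); }
void setGripper(bool close) { std::puts(close ? "gripper close" : "gripper open"); }

// One sense-classify-act cycle of the sorting process (Figure 4).
void sortOneObject() {
    if (!objectDetected()) return;           // wait for the detector trigger
    int fruitClass = captureAndClassify();   // RGB capture, segmentation, NN
    moveArmToPick();                         // approach along the planned trajectory
    setGripper(true);                        // pneumatically actuated gripper closes
    moveArmToBin(fruitClass);                // carry to the designated location
    setGripper(false);                       // release the object
}

int main() { sortOneObject(); }
```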

The computer needs to be programmed to process the digital image. Digital image processing is a significant task in industrial applications because of the vast volume of data that needs to be evaluated.

An industrial vision frame with 640 pixels per line and 486 lines contains 640 × 486 = 311,040 picture elements in total. Since each pixel carries three RGB values, a total of 3 × 311,040 = 933,120 values must be handled every 1/30 s. Figure 4 shows the flowchart of the software created and used for object recognition and selection.

If any object is sensed, it is captured and separated from the background by an algorithm based on the Red, Green and Blue values, which are read from a look-up color table that holds the proper color range of each class, so that the color class of the objects can be determined in one test. In other words, the system recognizes the object that enters its field of view, captures it, and then uses an algorithm based on appropriately chosen color threshold density values to distinguish the object from the background. The three primary color channels (Red, Green and Blue) of digital image processing software such as MATLAB are utilized for this purpose (Figure 5).

Figure 5 Captured images and relevant histogram.
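The thresholding itself is done with MATLAB in the study; purely as an illustration of the idea, the C++ sketch below applies one look-up-table colour range to an interleaved 8-bit RGB buffer and produces a binary object mask. The frame size is taken from the text, while the colour range values are invented examples.

```cpp
#include <cstdint>
#include <vector>

// One entry of a colour look-up table: the acceptable R, G, B range
// for a fruit class (values here are illustrative, not the paper's).
struct ColorRange {
    uint8_t rMin, rMax, gMin, gMax, bMin, bMax;
};

// Produce a binary mask (1 = object pixel) from an interleaved
// 8-bit RGB image by thresholding against one colour range.
std::vector<uint8_t> segment(const std::vector<uint8_t>& rgb,
                             int width, int height, const ColorRange& c) {
    std::vector<uint8_t> mask(width * height, 0);
    for (int i = 0; i < width * height; ++i) {
        uint8_t r = rgb[3 * i], g = rgb[3 * i + 1], b = rgb[3 * i + 2];
        bool inRange = r >= c.rMin && r <= c.rMax &&
                       g >= c.gMin && g <= c.gMax &&
                       b >= c.bMin && b <= c.bMax;
        mask[i] = inRange ? 1 : 0;
    }
    return mask;
}

int main() {
    const int w = 640, h = 486;                       // frame size from the text
    std::vector<uint8_t> frame(3 * w * h, 0);         // dummy black frame
    ColorRange orangeLike{200, 255, 100, 200, 0, 90}; // assumed colour range
    std::vector<uint8_t> mask = segment(frame, w, h, orangeLike);
    (void)mask;
}
```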

Three distinct characteristic images of the original image are assessed after separating the obtained real digital image into these three channels. The image processing algorithm in Figure 6 is used to remove the impacts of varying brightness and lighting as well as to generate the ideal characteristic threshold density values for the color picture separation and segmentation procedure.

Figure 6 Image processing algorithm.

After the segmentation process, feature vectors that give the eigen-characteristics of the various fruits, such as the coordinates of the center of gravity, the principal axis, the perimeter, etc., are calculated by the computer in order to build a neural network that differentiates the various fruits exactly. The feature vectors are then used for further classification of the object via a trained neural network (Figure 7). In addition, the calculated feature vectors are used to obtain rotation-, translation- and scale-free recognition of the fruits.

Figure 7 Neural network for classification.
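As a concrete illustration of such a feature vector, the sketch below computes the centre of gravity, the principal-axis orientation (from the central second moments), the area and a rough perimeter from a binary mask; the exact feature set and normalisation used by the authors are not specified, so this is only an assumed minimal version.

```cpp
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

struct Features {
    double cx, cy;        // centre of gravity
    double principalAxis; // orientation of the principal axis (rad)
    double area;          // number of object pixels
    double perimeter;     // rough boundary-pixel count
};

// Compute simple shape features from a binary mask (1 = object).
Features extract(const std::vector<uint8_t>& mask, int w, int h) {
    double m00 = 0, m10 = 0, m01 = 0;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            if (mask[y * w + x]) { m00++; m10 += x; m01 += y; }
    Features f{};
    if (m00 == 0) return f;
    f.area = m00;
    f.cx = m10 / m00;
    f.cy = m01 / m00;

    // Central second moments give the principal-axis orientation.
    double mu20 = 0, mu02 = 0, mu11 = 0;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            if (mask[y * w + x]) {
                double dx = x - f.cx, dy = y - f.cy;
                mu20 += dx * dx; mu02 += dy * dy; mu11 += dx * dy;
            }
    f.principalAxis = 0.5 * std::atan2(2 * mu11, mu20 - mu02);

    // Perimeter estimate: object pixels with at least one background 4-neighbour.
    for (int y = 1; y < h - 1; ++y)
        for (int x = 1; x < w - 1; ++x)
            if (mask[y * w + x] &&
                (!mask[y * w + x - 1] || !mask[y * w + x + 1] ||
                 !mask[(y - 1) * w + x] || !mask[(y + 1) * w + x]))
                f.perimeter++;
    return f;
}

int main() {
    const int w = 16, h = 16;
    std::vector<uint8_t> mask(w * h, 0);
    for (int y = 4; y < 12; ++y)
        for (int x = 4; x < 12; ++x) mask[y * w + x] = 1;   // 8x8 test square
    Features f = extract(mask, w, h);
    std::printf("area=%.0f centre=(%.1f,%.1f) perimeter=%.0f\n",
                f.area, f.cx, f.cy, f.perimeter);
}
```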

The neural network is trained with the feature vectors, before the pattern classification process, using the back-propagation method. The trained weights are then used in the forward phase of the network to classify the fruits.
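The network architecture is not detailed in the paper; the sketch below therefore shows only a generic forward phase for a single-hidden-layer network with sigmoid units, with hypothetical layer sizes and placeholder weights where the back-propagation-trained values would be loaded.

```cpp
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

static double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// Forward phase of a single-hidden-layer network: the weights would
// come from the back-propagation training described in the text.
std::vector<double> forward(const std::vector<double>& features,
                            const std::vector<std::vector<double>>& w1,
                            const std::vector<std::vector<double>>& w2) {
    std::vector<double> hidden(w1.size());
    for (size_t j = 0; j < w1.size(); ++j) {
        double s = w1[j].back();                      // bias is the last weight
        for (size_t i = 0; i < features.size(); ++i) s += w1[j][i] * features[i];
        hidden[j] = sigmoid(s);
    }
    std::vector<double> out(w2.size());
    for (size_t k = 0; k < w2.size(); ++k) {
        double s = w2[k].back();
        for (size_t j = 0; j < hidden.size(); ++j) s += w2[k][j] * hidden[j];
        out[k] = sigmoid(s);
    }
    return out;                                       // one score per fruit class
}

int main() {
    // Hypothetical trained weights: 3 features -> 4 hidden -> 3 classes
    // (each row ends with a bias term).
    std::vector<std::vector<double>> w1(4, std::vector<double>(4, 0.1));
    std::vector<std::vector<double>> w2(3, std::vector<double>(5, 0.1));
    std::vector<double> feat = {0.6, 0.2, 0.8};       // normalised feature vector
    std::vector<double> scores = forward(feat, w1, w2);
    size_t best = 0;
    for (size_t k = 1; k < scores.size(); ++k)
        if (scores[k] > scores[best]) best = k;
    std::printf("predicted class index: %zu\n", best);
}
```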

Experimental results and conclusions

In order to evaluate the segmentation capability of the imaging system, a variety of fruit classes are used, each with a unique geometric shape and color composition. For further object recognition and comparison, several algorithms are developed that rely on MATLAB's color analysis capabilities as well as on some mechanical descriptors for feature characterization. Very successful results are obtained in classifying fruits that cannot easily be separated.

On the other hand, the camera motion, object recognition and transport operations are precisely timed and synchronous. The implemented setup can be applied without restriction to tasks related to industrial classification.

To investigate the mechanical motion capability and precision of the manipulator, and the synchronous operation of the robot arm in connection with image processing, various tests are made on the chessboard and successful results are obtained. The aimed and acquired results are shown in Table 3.

Classification    Training acc (%)    Test acc (%)    Path following acc, max (mm)
Aimed             100                 100             5
Acquired          99.5                98.8            10

Table 3 Aimed and acquired results of study

It is clear that the feature vector selection algorithm of the neural network and the post-processing algorithm contribute to the test results and their accuracy. It is also clear that better mechanical properties are needed in order to improve the path-following accuracy.

Acknowledgments

None.

Conflicts of interest

The authors declare that there is no conflict of interest.

References

Creative Commons Attribution License

©2023 Uyar, et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and building upon your work non-commercially.