
See discussions, stats, and author profiles for this publication at: https://www.researchgate.net/publication/339328539

The 9 Points Calibration Using SCARA Robot
Conference Paper · December 2019
DOI: 10.1109/RI2C48728.2019.8999901

Authors:
Chanin Joochim, King Mongkut's University of Technology North Bangkok
Supod Kaewkorn, King Mongkut's University of Technology North Bangkok
Alisa Kunapinun, Florida Atlantic University

All content following this page was uploaded by Alisa Kunapinun on 08 July 2020.


The 9 Points Calibration Using SCARA Robot

Chanin Joochim, King Mongkut's University of Technology North Bangkok, Bangkok, Thailand, chanin.j@cit.kmutnb.ac.th
Supod Kaewkorn, King Mongkut's University of Technology North Bangkok, Bangkok, Thailand, supod.k@cit.kmutnb.ac.th
Alisa Kunapinun, Asian Institute of Technology, Pathumthani, Thailand, st121133@ait.ac.th

Abstract—Vision systems are widely applied in industry, and many applications integrate them with industrial robots, for example to detect positions and match objects. It is therefore necessary to transform positions from camera pixel coordinates to robot world coordinates. After calibration, the robot knows the position and orientation of the objects detected by the vision system. Basic camera-to-robot calibration needs at least three corresponding camera and robot positions. However, the accuracy of that algorithm is degraded by human error, by internal hardware characteristics such as the intrinsic and extrinsic camera parameters, and by installation error (e.g., tilt). The conventional calibration process therefore has three steps: intrinsic camera calibration, extrinsic camera calibration, and camera-to-robot-base calibration. Moreover, basic calibration cannot determine the TCP offset (tool center point offset). If a tool is installed on the robot, the robot must shift its final position from the MIF (mechanical interface) to the TIF (tool interface), so the user must compute the TCP offset before the camera calibration, which adds yet another procedure. This paper presents the 9-points calibration algorithm, adapted from other vision-system applications, and explains step by step how to solve its equations and how to apply it to a SCARA robot. The paper covers not only the camera calibration but also the SCARA robot and the TCP offset calculation.

Keywords—Calibration, Camera, SCARA Robot, Vision system, TCP offset

I. INTRODUCTION

Vision systems are widely used in industry, and many applications integrate industrial robots with vision, for example to detect positions and match objects. It is therefore necessary to convert pixel coordinates from the vision system into the world coordinates of the robot. Camera calibration is thus an essential step that any vision-integrated robot system must complete before an application can start.

Basic calibration of the camera to the SCARA robot coordinate system needs at least three points linking camera positions to robot positions. Although this calibration process is very simple, its accuracy is low because of human error, hardware error, and installation error. The calibration must therefore include additional steps to increase accuracy; for example, before the camera is connected to the robot, its intrinsic and extrinsic parameters must be calibrated. These procedures are complicated, which makes the topic difficult for users to apply, and users who do not understand the process may perform the steps incorrectly. The resulting accuracy is then too low for applications that demand high precision from the vision system.

Several problems must be considered when using camera calibration. First, users must understand the calibration process; otherwise the accuracy will be low. Many industrial applications, such as electronics assembly and screwing, need very high accuracy, but under some calibration conditions the accuracy may not meet the target and the yield rate drops. Second, basic camera calibration is not accurate enough: many internal parameters introduce error, and users who do not understand where a problem originates cannot correct the parameters that come from the camera. Third, camera calibration must be repeated frequently in an industrial system because the precision can drift; if the calibration is too complicated, the maintenance time becomes long and the product output falls below the target.

This paper explains an alternative concept: the 9-points calibration applied to a SCARA robot. The algorithm is adapted from other vision-system applications; it is highly accurate, widely used in industry, and not complicated, reducing the calibration procedure to a single step.

With the 9-points calibration, the whole procedure can be summarized as a set of equations. Solving them yields not only the SCARA robot base position but also the TCP tool point. The accuracy can then be compared between the basic calibration and the 9-points calibration. Because the only task left to the user is teaching points, the process is easy to apply, and the calibration equations carry over to other robots (SCARA, XY-table, 6-axis, etc.) and other vision systems such as 3D vision.

The paper is organized as follows. Section II presents the fundamental calibration. Section III explains the homography matrix. Section IV describes the 9-points calibration algorithm and how to apply it to robots. Section V presents the accuracy test process, Section VI the calculation results, and Section VII a brief conclusion.

II. THE FUNDAMENTAL CALIBRATION

A SCARA robot is a 4-axis robot designed to pick and place objects from the top view. The objects to be picked are therefore usually placed on a flat tray or pallet tray. Given this arrangement, if the robot system needs vision to detect problems such as missing alignment or faulty objects, the camera can be mounted above or below the working area. Moreover, the heights of the objects relative to the robot are usually the same, so the vision system can operate in 2D. The SCARA robot's gripper is installed at the end effector of the robot, called the mechanical interface (M/IF), but the picking position may not coincide with the M/IF: there is an X and Y offset from the M/IF called the tool center


point (TCP). The TCP must be added to the robot position so that objects are picked accurately.

Fig. 1. SCARA robot and camera coordinate system

Fig. 1 shows the relative coordinate systems of the robot and the camera. From this system, the following equation can be written:

T^{base}_{MIF} T^{MIF}_{obj} = H^{base}_{cam} P^{cam}_{obj}   (1)

where T^{base}_{MIF} is the M/IF position matrix of the SCARA robot relative to the robot base, T^{MIF}_{obj} is the tool position matrix attached to the M/IF position, H^{base}_{cam} is the transformation matrix relating the robot base position to the camera position (this matrix contains many intrinsic and extrinsic parameters), and P^{cam}_{obj} is the object position matrix measured by the camera, so its scale is in pixels.

When the robot points at the object, the end of the TCP is at the same position as the object. The object position seen by the camera is therefore the robot position including the tool position, so T^{base}_{MIF} and P^{cam}_{obj} are the known matrices.

The unknown matrices, H^{base}_{cam} and T^{MIF}_{obj}, are obtained by calculation. Conventionally, calibration procedures that involve robots have many steps, covering not only camera calibration but also robot calibration. For example, camera calibration involves two groups of parameters, intrinsic and extrinsic, so there are at least two camera calibration steps. Moreover, the TCP is another parameter that needs to be calibrated, so a TCP calibration step must be added to the system.

A. Camera calibration: intrinsic parameters

Let C = [u v 1]^T be the 2D position matrix in pixel coordinates and W = [x y z 1]^T the 3D position matrix in world coordinates. The camera matrix denotes a projective mapping as in equation (2):

C = K [R T] W   (2)

where K holds the intrinsic camera parameters and [R T] the extrinsic camera parameters.

The intrinsic parameters can be represented as:

K = \begin{bmatrix} \alpha & s & u_0 \\ 0 & \beta & v_0 \\ 0 & 0 & 1 \end{bmatrix}   (3)

With f the focal length and m_x, m_y the scale factors in pixels, the intrinsic parameters are \alpha = f m_x and \beta = f m_y; s is the skew coefficient between x and y, and (u_0, v_0) is the principal point.

A camera also exhibits radial distortion from the lens, especially a wide-angle lens. The radial distortion is defined by:

\hat{x} = x (1 + k_1 r^2 + k_2 r^4)   (4-1)
\hat{y} = y (1 + k_1 r^2 + k_2 r^4)   (4-2)

where r^2 = x^2 + y^2 and k_1, k_2 are the radial distortion parameters.

The intrinsic camera parameters can be obtained from the camera and lens data released by the manufacturers. However, these parameters may be incorrect under some conditions, such as when options are added, so users should calibrate to find the parameters themselves.

B. Camera calibration: extrinsic parameters

The extrinsic parameters form the transformation matrix that transfers the world coordinate system into the camera coordinate system. In the matrix [R T] of equation (2), R is the rotation matrix and T is the position of the origin of the world coordinate system.

The extrinsic parameters can be found with the 3-points algorithm, using these steps:
1. Use the camera to detect an object or a point and record the position data as c_1.
2. Point the robot end effector at the object or the point and record the position data as p_1.
3. Change the position in X and Y and repeat steps 1 and 2 until three pairs are collected.

At the end of the process, the position, orientation, and scale can be calculated and assembled into the transformation matrix [R T].

The extrinsic camera calculation is not complex, but it has disadvantages. First, the position and orientation may be in error because of the hardware installation; a tilted camera changes the scale of each pixel. Second, error comes from mistaken intrinsic parameters, which distort the image and corrupt the position parameters. Third, error comes from human measurement.

Finally, once all camera parameters have been calculated, combining the intrinsic and extrinsic parameters yields the H^{base}_{cam} matrix.

C. Tool calibration

Not only the camera calibration but also the TCP tool point needs to be calculated. The TCP can be calculated with the following steps:
1. Assign a point that the robot can reach, called the reference point.
2. Bring the end effector of the TCP point of the robot arm onto the reference point.
3. Change the robot arm position and orientation while keeping the end effector of the TCP point locked on the reference point. Do this at least three times.

Fig. 2. Tool calibration process

Denote the recorded robot positions as (t_1, t_2, t_3) with t_i = (x_i, y_i). The TCP tool position can then be calculated from the circle equation:

(x_i - a)^2 + (y_i - b)^2 = r^2   (5)
2 a x_i + 2 b y_i - (a^2 + b^2 - r^2) = x_i^2 + y_i^2   (5-1)

where (a, b) is the reference point and r is the TCP length from the M/IF to the reference point.

Substituting A_i = x_i^2 + y_i^2 and B = a^2 + b^2 - r^2 into Eq. (5-1) gives:

2 a x_1 + 2 b y_1 - B = A_1   (6-1)
2 a x_2 + 2 b y_2 - B = A_2   (6-2)
2 a x_3 + 2 b y_3 - B = A_3   (6-3)

Solving these equations yields the parameters a, b, and r, and the TCP point follows as T^{MIF}_{obj} = (x_i - a, y_i - b).

III. THE HOMOGRAPHY MATRIX

For extrinsic camera calibration there is another way to calculate the calibration; the resulting matrix is called the homography matrix. The homography matrix is described by:

\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = H \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}   (7)

From equation (7), the homography matrix H can be specialized to a Euclidean, affine, or projective matrix:

Euclidean = \begin{bmatrix} \cos\theta & -\sin\theta & t_x \\ \sin\theta & \cos\theta & t_y \\ 0 & 0 & 1 \end{bmatrix}   (8-1)

Affine = \begin{bmatrix} a_{11} & a_{12} & t_x \\ a_{21} & a_{22} & t_y \\ 0 & 0 & 1 \end{bmatrix}   (8-2)

Projective = \begin{bmatrix} p_{11} & p_{12} & p_{13} \\ p_{21} & p_{22} & p_{23} \\ p_{31} & p_{32} & 1 \end{bmatrix}   (8-3)

Basic robot and camera calibration for a SCARA robot uses only the Euclidean or affine transformation matrix. However, from the projective mapping in equation (2), when all matrices are combined, the result looks more like a projective transformation matrix than a Euclidean or affine one. The full transformation can therefore be written as:

\begin{bmatrix} c_x \\ c_y \\ 1 \end{bmatrix} = \begin{bmatrix} \alpha & s & u_0 \\ 0 & \beta & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} p_{11} & p_{12} & p_{13} \\ p_{21} & p_{22} & p_{23} \\ p_{31} & p_{32} & 1 \end{bmatrix} \begin{bmatrix} w_x \\ w_y \\ 1 \end{bmatrix}   (9)

\begin{bmatrix} c_x \\ c_y \\ 1 \end{bmatrix} = \begin{bmatrix} p_{11}' & p_{12}' & p_{13}' \\ p_{21}' & p_{22}' & p_{23}' \\ p_{31}' & p_{32}' & 1 \end{bmatrix} \begin{bmatrix} w_x \\ w_y \\ 1 \end{bmatrix}   (10)

Hence,

C = P' W   (11)
W = P'^{-1} C = H C   (11-1)

where

H = \begin{bmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & 1 \end{bmatrix}   (12)

A. Result from the homography matrix

From equation (11-1), the world coordinates can be calculated from the camera coordinates and the projective matrix H. The result is:

w_x = (h_{11} c_x + h_{12} c_y + h_{13}) / (h_{31} c_x + h_{32} c_y + 1)   (13-1)
w_y = (h_{21} c_x + h_{22} c_y + h_{23}) / (h_{31} c_x + h_{32} c_y + 1)   (13-2)

IV. THE 9-POINTS CALIBRATION

A. Process of 9-points calibration

The 9-points calibration is widely used in industry because of its accuracy. It follows the same reasoning as the calculation of the projective matrix H. The projective matrix has 8 degrees of freedom, so teaching requires at least 4 points. In industrial systems, however, at least 9 points are used, for several reasons:
1. More measured points reduce the error in the result.
2. Some systems need additional parameters, such as depth.
3. The calibration equations follow the same concept in all applications.
4. Because the equations share one concept, the calibration program is easy for users to understand.

The process of the 9-points calibration is the same as that of the extrinsic camera calibration, but it needs 9 point pairs linking robot positions and camera positions, as in Table 1.
Table 1. Example group data for calibration

Number | Camera Position          | Robot Position
1      | c_{x1}, c_{y1}, c_{θ1}   | r_{x1}, r_{y1}, r_{θ1}
2      | c_{x2}, c_{y2}, c_{θ2}   | r_{x2}, r_{y2}, r_{θ2}
⋮      | ⋮                        | ⋮
9      | c_{x9}, c_{y9}, c_{θ9}   | r_{x9}, r_{y9}, r_{θ9}

B. Robot and camera calibration (without tool position)

According to Fig. 1, the relation T^{base}_{MIF} T^{MIF}_{obj} = H^{base}_{cam} P^{cam}_{obj} is used to find the missing parameters. It is complicated, however, because the TCP position is included, so removing the TCP point to simplify the equation may be necessary. If the robot picks an object at the M/IF, the system looks as shown below.

Fig. 3. SCARA robot and camera coordinate system (without tool)

The equation then reduces to:

T^{base}_{MIF} = H^{base}_{cam} P^{cam}_{obj}   (14)

Equation (14) can be transferred into a Jacobian matrix:

J = \partial f / \partial p = (1/D) \begin{bmatrix} c_x & c_y & 1 & 0 & 0 & 0 & -r_x c_x & -r_x c_y \\ 0 & 0 & 0 & c_x & c_y & 1 & -r_y c_x & -r_y c_y \end{bmatrix}   (15)

where D = h_{31} c_x + h_{32} c_y + 1, which depends on the current parameter settings.

An initial guess for the eight unknown projective parameters {h_{11}, h_{12}, ..., h_{32}} can be obtained by multiplying both sides of the equation behind the Jacobian matrix (15) by the denominator. The equation then takes the linear form of eq. (16):

\begin{bmatrix} r_x \\ r_y \end{bmatrix}_n = \begin{bmatrix} c_x & c_y & 1 & 0 & 0 & 0 & -r_x c_x & -r_x c_y \\ 0 & 0 & 0 & c_x & c_y & 1 & -r_y c_x & -r_y c_y \end{bmatrix}_n \begin{bmatrix} h_{11} \\ \vdots \\ h_{32} \end{bmatrix}   (16)

P_{MIF} = J H   (17)

Using the least-mean-squares concept, equation (16) combines all data points into one matrix (here, 9 data points are entered):

\begin{bmatrix} r_{x1} \\ r_{y1} \\ \vdots \\ r_{x9} \\ r_{y9} \end{bmatrix} = \begin{bmatrix} c_{x1} & c_{y1} & 1 & 0 & 0 & 0 & -r_{x1} c_{x1} & -r_{x1} c_{y1} \\ 0 & 0 & 0 & c_{x1} & c_{y1} & 1 & -r_{y1} c_{x1} & -r_{y1} c_{y1} \\ & & & \vdots & & & \\ c_{x9} & c_{y9} & 1 & 0 & 0 & 0 & -r_{x9} c_{x9} & -r_{x9} c_{y9} \\ 0 & 0 & 0 & c_{x9} & c_{y9} & 1 & -r_{y9} c_{x9} & -r_{y9} c_{y9} \end{bmatrix} \begin{bmatrix} h_{11} \\ \vdots \\ h_{32} \end{bmatrix}   (18)

By assuming H = [h_{11}, ..., h_{32}]^T, A the stacked coefficient matrix of equation (18), and B = [r_{x1}, r_{y1}, ..., r_{x9}, r_{y9}]^T the stacked robot coordinates, the solution is

H = A^* B   (19)

where A^* is the pseudo-inverse of A, calculated as

A^* = (A^T A)^{-1} A^T   (20)

Therefore,

H = (A^T A)^{-1} A^T B   (21)

Finally, the output of the equations above yields T^{base}_{MIF} = [x, y, v]^T, and the robot coordinates are:

r_x = x / v,   r_y = y / v   (22)

C. Robot and camera calibration (with tool position)

Once the camera calibration without the TCP offset is done, the TCP offset point can be added to the system.

Fig. 4. SCARA robot and camera coordinate system (with TCP position)
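Before the tool offset is added, the without-tool solve of Section IV.B (eqs. (16)-(21)) together with the mapping of eq. (13) can be sketched numerically. This is a minimal illustration, not the authors' program: the helper names (`estimate_homography`, `cam_to_robot`) and the point values are hypothetical, and NumPy's least-squares routine stands in for the explicit pseudo-inverse of eq. (20).

```python
import numpy as np

def estimate_homography(cam_pts, rob_pts):
    """Stack two rows per point pair as in eq. (18) and solve the
    least-squares system H = (A^T A)^-1 A^T B of eq. (21)."""
    rows, rhs = [], []
    for (cx, cy), (rx, ry) in zip(cam_pts, rob_pts):
        rows.append([cx, cy, 1, 0, 0, 0, -rx * cx, -rx * cy])
        rows.append([0, 0, 0, cx, cy, 1, -ry * cx, -ry * cy])
        rhs += [rx, ry]
    A, B = np.array(rows, float), np.array(rhs, float)
    h, *_ = np.linalg.lstsq(A, B, rcond=None)  # pseudo-inverse solve
    return np.append(h, 1.0).reshape(3, 3)     # h33 fixed to 1

def cam_to_robot(H, cx, cy):
    """Eq. (13): dehomogenize H [cx, cy, 1]^T to robot coordinates."""
    wx, wy, w = H @ np.array([cx, cy, 1.0])
    return wx / w, wy / w

# Hypothetical ground-truth mapping and a 3x3 grid of teaching points
H_true = np.array([[0.33, 0.01, 5.0], [-0.01, 0.33, 2.0], [1e-4, 2e-4, 1.0]])
cam = [(x, y) for x in (0.0, 100.0, 200.0) for y in (0.0, 100.0, 200.0)]
rob = [cam_to_robot(H_true, cx, cy) for cx, cy in cam]
H_est = estimate_homography(cam, rob)
```

With exact synthetic data the nine pairs over-determine the eight unknowns consistently, so the estimate reproduces the generating matrix; with real teaching points the extra pairs average out measurement noise, which is the paper's first reason for using nine points instead of four.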
Referring to equation (1), the parameters inside the equation can be assigned as:

T^{base}_{MIF} = \begin{bmatrix} \cos\theta & -\sin\theta & r_x \\ \sin\theta & \cos\theta & r_y \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} R_r & T_r \\ 0 & 1 \end{bmatrix}   (23)

T^{MIF}_{obj} = \begin{bmatrix} t_x \\ t_y \\ 1 \end{bmatrix}, and thus \hat{T} = \begin{bmatrix} t_x \\ t_y \end{bmatrix}   (24)

Therefore,

\hat{P}_{obj} = \begin{bmatrix} p_x \\ p_y \end{bmatrix} = R_r \hat{T} + T_r   (25)

From equation (1), the relation can be derived by modifying the Jacobian matrix. Thus,

[0] = H^{base}_{cam} P^{cam}_{obj} - T^{base}_{MIF} T^{MIF}_{obj}
[0] = J H - (R_r \hat{T} + T_r)

Therefore,

T_r = J H - R_r \hat{T}

\begin{bmatrix} r_x \\ r_y \end{bmatrix} = \begin{bmatrix} c_x & c_y & 1 & 0 & 0 & 0 & -r_x c_x & -r_x c_y \\ 0 & 0 & 0 & c_x & c_y & 1 & -r_y c_x & -r_y c_y \end{bmatrix} \begin{bmatrix} h_{11} \\ \vdots \\ h_{32} \end{bmatrix} - \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} t_x \\ t_y \end{bmatrix}

Moving the tool term into the coefficient matrix appends the columns of -R_r to the Jacobian rows:

\begin{bmatrix} r_x \\ r_y \end{bmatrix} = \begin{bmatrix} c_x & c_y & 1 & 0 & 0 & 0 & -r_x c_x & -r_x c_y & -\cos\theta & \sin\theta \\ 0 & 0 & 0 & c_x & c_y & 1 & -r_y c_x & -r_y c_y & -\sin\theta & -\cos\theta \end{bmatrix} \begin{bmatrix} h_{11} \\ \vdots \\ h_{32} \\ t_x \\ t_y \end{bmatrix}

Then, all 9 data points are combined into one matrix:

\begin{bmatrix} r_{x1} \\ r_{y1} \\ \vdots \\ r_{x9} \\ r_{y9} \end{bmatrix} = \begin{bmatrix} c_{x1} & c_{y1} & 1 & 0 & 0 & 0 & -r_{x1} c_{x1} & -r_{x1} c_{y1} & -\cos\theta_1 & \sin\theta_1 \\ 0 & 0 & 0 & c_{x1} & c_{y1} & 1 & -r_{y1} c_{x1} & -r_{y1} c_{y1} & -\sin\theta_1 & -\cos\theta_1 \\ & & & & \vdots & & & \\ c_{x9} & c_{y9} & 1 & 0 & 0 & 0 & -r_{x9} c_{x9} & -r_{x9} c_{y9} & -\cos\theta_9 & \sin\theta_9 \\ 0 & 0 & 0 & c_{x9} & c_{y9} & 1 & -r_{y9} c_{x9} & -r_{y9} c_{y9} & -\sin\theta_9 & -\cos\theta_9 \end{bmatrix} \begin{bmatrix} h_{11} \\ \vdots \\ h_{32} \\ t_x \\ t_y \end{bmatrix}

Finally, this equation is solved by the same process as equation (21).

V. ACCURACY TEST PROCESS

The 9-points calibration can be compared with the original calibration. The comparison results are reported in terms of working steps and accuracy.

For the working-step test, we created a program that carries out the calibration process, let testers enter the parameters with a program helper, and asked them to report how complicated the original calibration and the 9-points calibration were. Ten testers tried the program; they reported that the 9-points calibration is easier.

The accuracy test has two stages. First, the program checks and recalculates the object position from the camera. Second, a Toshiba Machine THL600 robot is used to test the accuracy: after the program calculates the object position, a user commands the robot to pick the object and place it at a target position, and the error between the object position and the real target position at the placing point is measured.

The differences between the original calibration and the 9-points calibration are summarized in Table 2 below.

Table 2. Comparison between original calibration and 9-points calibration

No. | Original calibration | 9-points calibration
1 | Complicated steps and equations to calculate | 1 step and 1 equation to calculate
2 | Tool calibration must be calculated in a separate equation | Tool calibration can be calculated in the same equation
3 | Low accuracy with basic calibration; high accuracy with the homography equation | Medium-high accuracy
4 | Can be applied to other applications | Can be applied to other applications

VI. RESULT OF CALCULATION

Fig. 5. Calibration test program

The 9-points calibration was tested by implementing the calibration process in a software program and collecting data with a Toshiba Machine SCARA robot, model THL600, and a vision system using a Basler camera and the OpenCV library. The precision of the THL600 is 0.02 mm; the camera resolution is 1.3 MP with an 8 mm lens, installed with 1200 mm between the camera and the objects, so the image scale is 3.03 pixels/mm. The error of the 9-points calibration with TCP position, measured over 10 runs of the program, is given in the table below:

Table 3. Average calibration test result using the program

               | Average error | Maximum error
Position (mm)  | 0.434         | 1.494
Angle (degree) | 0.239         | 1.261

In another test, the robot picks an object and moves it to a desired position; the position and angle errors were recorded over 150 trials. The error of the 9-points calibration with TCP position is given below:

Table 4. Average calibration test result using the robot

               | Average error | Maximum error
Position (mm)  | 0.657         | 1.745
Angle (degree) | 0.452         | 1.831
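The figures in Tables 3 and 4 are averages and maxima of per-trial placement errors. The bookkeeping behind them can be sketched as below; the trial values are made up for illustration, not the paper's measurements.

```python
import numpy as np

# Hypothetical trials: commanded target (x, y) vs. measured placement (x, y), in mm
trials = np.array([
    [100.0, 50.0, 100.3, 49.8],
    [150.0, 80.0, 149.6, 80.5],
    [200.0, 40.0, 200.2, 39.7],
])
# Euclidean position error of each trial at the placing point
errors = np.linalg.norm(trials[:, 2:] - trials[:, :2], axis=1)
avg_error, max_error = errors.mean(), errors.max()
```

The same reduction applied to the recorded angle differences would yield the second row of each table.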
VII. CONCLUSION

Using the 9-points algorithm to calculate the camera-to-robot-base transformation reduces a multi-step process to a single step. Testing the error of the algorithm shows a result of roughly twice the resolution error. The error probably comes from human error when measuring objects and pointing the robot at the picking position. Moreover, the lens selected for this project is deliberately distorted so that the full calibration, including image distortion, is exercised; the error therefore increases when objects are placed near the border of the image. In addition, the matching library usually does not support tilted or skewed objects, so the matching result may already contain error before it enters the calibration system. The best practice is to rectify the raw images before further processing such as matching or calibration.

This project can be applied to other robot types, and future work will address how to apply it in the industrial world.
