Dominik Sierociuk, Maciej Czeredys

Warsaw University of Technology
Identification of mobile robot kinematics using a computer vision system

Abstract. This paper presents the parametric identification of a caterpillar mobile robot. The robot's position data are collected by a computer vision system consisting of a camera mounted above the robot's workspace and a PC with a frame grabber. The positioning algorithm is also described. The identification results are simulated and compared with the real ones.

Streszczenie. Artykuł prezentuje identyfikację parametryczną gąsienicowego robota mobilnego. Dane o pozycji robota są uzyskiwane przy pomocy komputerowego systemu wizyjnego złożonego z kamery zamontowanej ponad miejscem przemieszczania się robota i komputerem klasy PC z frame grabber’em. Algorytm tego pozycjonowania jest także opisany. Rezultaty identyfikacji są zasymulowane i porównane z rzeczywistymi.

Key words: mobile robot, identification, positioning, image processing.

Słowa kluczowe: robot mobilny, identyfikacja, pozycjonowanie, przetwarzanie obrazów.


Positioning of a mobile robot is a general problem in the identification of its kinematics or dynamics. Knowing the robot's real position at each time instant, we can analyze its reaction to any input and then try to identify the parameters of a mathematical model or train a neural network.

Robot description

The robot was built by the authors, based on a caterpillar-driven chassis. It is equipped with:

  • two „Speed 280 RACE” motors with gearboxes,

  • an ATmega128L microcontroller: up to 8 MHz, 4 timers, 8×PWM, 8×ADC, 128 kB Flash, 4 kB RAM, 2×USART, ISP and others [4],

  • two impulse encoders, 5 imp/rev.

Fig. 1 Camera distortion model.

The ATmega128L microcontroller has two 8-bit timers and two 16-bit timers, which can generate four PWM signals. These PWM signals are used to control the motors through two MOSFET H-bridges. The timers can also be configured as counters and, connected to the encoders, can measure the angular velocity of the motors. ISP (In-System Programming) makes the programming process faster and easier (without the need to remove the microcontroller from the controller board and place it in a chip programmer).

The microcontroller is programmed in the C language using the CodeVisionAVR C compiler. The RS-232 interface is used to transfer data from the microcontroller directly to Matlab. Two PI controllers were implemented in the microcontroller (one for each engine) in order to track the reference speed.
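The PI speed loop can be sketched as follows; this is a plain-Python illustration of the control law only (the gains, the 8-bit PWM range and the update period are illustrative assumptions, not the values used in the AVR firmware):

```python
def make_pi(kp, ki, dt, out_min=0.0, out_max=255.0):
    """Discrete PI controller with integral clamping (anti-windup).

    kp, ki  -- illustrative gains, not the firmware's values
    dt      -- controller update period in seconds
    out_*   -- 8-bit PWM duty-cycle limits
    """
    state = {"i": 0.0}

    def step(reference, measured):
        error = reference - measured
        state["i"] += ki * error * dt
        # clamp the integral so it cannot wind up while the PWM saturates
        state["i"] = min(out_max, max(out_min, state["i"]))
        pwm = kp * error + state["i"]
        return min(out_max, max(out_min, pwm))

    return step
```

One such controller instance would be created per motor, each driving its own PWM channel.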

Vision system setup

The vision system was built using a PIEPER K-4612C camera and a PC with a TV card (Conexant BT878 chipset). The camera is mounted vertically at a height of 2 m above the robot's workspace. The Camera Calibration Toolbox is used to calibrate the camera. The results of the calibration process are shown in figure 1. It is clear that without calibration the camera introduces unacceptable errors in length and angular position, which is why the images must be undistorted before further use. An undistortion function is also included in the previously mentioned Toolbox. All frames were captured at a resolution of 640x480 with 24-bit colour and a frame rate of 25 frames/s.
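Fig. 1 illustrates the lens distortion that the calibration compensates for. The Camera Calibration Toolbox uses a radial/tangential distortion model; the following numpy sketch applies that model to normalized image coordinates (the coefficient values used in any example call are placeholders, not the calibrated ones):

```python
import numpy as np

def distort(points, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) lens distortion
    to an (N, 2) array of normalized image coordinates."""
    x, y = points[:, 0], points[:, 1]
    r2 = x ** 2 + y ** 2
    radial = 1 + k1 * r2 + k2 * r2 ** 2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x ** 2)
    yd = y * radial + p1 * (r2 + 2 * y ** 2) + 2 * p2 * x * y
    return np.stack([xd, yd], axis=1)
```

Undistortion inverts this mapping, which is usually done iteratively since the model has no closed-form inverse.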
The positioning algorithm

The robot is marked with a blue mark (at the front) and a yellow mark (at the back) on a red background. The robot's surroundings are not defined, so the main goal of the algorithm is to find the two marks of the desired colours (yellow and blue) on the red background. The surroundings may therefore contain blue or yellow objects (noise), but must not contain blue or yellow objects on a red background.

Firstly, the image is binarized with a threshold chosen to detect the red colour (the robot's background). In order to reduce noise and close possible gaps, a dilation of the robot background is performed.
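These two operations can be sketched in numpy as follows (the RGB thresholds are hypothetical and would have to be tuned to the actual lighting):

```python
import numpy as np

def red_mask(rgb, r_min=150, gb_max=100):
    """Binarize an RGB image so that only red-ish pixels are True.
    r_min and gb_max are illustrative thresholds."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (r > r_min) & (g < gb_max) & (b < gb_max)

def dilate(mask, iterations=1):
    """3x3 cross-shaped binary dilation built from shifted copies."""
    for _ in range(iterations):
        p = np.pad(mask, 1)
        mask = (p[1:-1, 1:-1] | p[:-2, 1:-1] | p[2:, 1:-1]
                | p[1:-1, :-2] | p[1:-1, 2:])
    return mask
```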

Secondly, holes are filled. (A hole is a set of background pixels that cannot be reached by filling in the background from the edge of the image.) Figure 2 shows the image after binarization, dilation and hole filling.
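A hole-filling step of this kind can be implemented with a flood fill from the image border; below is a pure numpy/Python sketch (a morphological reconstruction, as discussed in [3], would be the more efficient choice):

```python
import numpy as np

def fill_holes(mask):
    """Set to True every background pixel that cannot be reached
    from the image border (4-connected flood fill)."""
    h, w = mask.shape
    reachable = np.zeros_like(mask)
    # seed the fill with every background pixel on the border
    stack = [(i, j) for i in range(h) for j in (0, w - 1) if not mask[i, j]]
    stack += [(i, j) for j in range(w) for i in (0, h - 1) if not mask[i, j]]
    for p in stack:
        reachable[p] = True
    while stack:
        i, j = stack.pop()
        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if 0 <= ni < h and 0 <= nj < w and not mask[ni, nj] and not reachable[ni, nj]:
                reachable[ni, nj] = True
                stack.append((ni, nj))
    return mask | ~reachable
```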

Thirdly, two further binarizations are performed, to detect the yellow and blue colours. In order to obtain the positions of the yellow/blue pixels lying on the red background, we apply a logical AND operation to the binary image of the robot background and each of the marker images (Fig. 3). This detects only the yellow and blue objects located on the red background. Finally, the position of the robot is computed.
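The marker masking and pose computation can be sketched as follows (the mask names and the front/back marker convention follow the description above; taking the midpoint of the two marker centroids as the robot position is an assumption):

```python
import numpy as np

def centroid(mask):
    """Centroid (x, y) of the True pixels of a binary mask."""
    ys, xs = np.nonzero(mask)
    return np.array([xs.mean(), ys.mean()])

def robot_pose(background, blue, yellow):
    """Position = midpoint of the two marker centroids,
    heading = direction from the yellow (back) to the blue (front) mark."""
    blue_on_bg = background & blue        # logical AND with the red background
    yellow_on_bg = background & yellow
    front = centroid(blue_on_bg)
    back = centroid(yellow_on_bg)
    position = (front + back) / 2
    heading = np.arctan2(front[1] - back[1], front[0] - back[0])
    return position, heading
```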

Fig. 2 The robot’s background after binarization, dilation and hole filling.


Fig. 2 Binarized image for blue colour detection.


The model of the robot's kinematics can be taken as the equations [1]:

dx/dt = (a/2)·(ωL + ωR)·cos(φ)
dy/dt = (a/2)·(ωL + ωR)·sin(φ)
dφ/dt = b·(ωR − ωL)

where:
x, y are the robot position,
φ is the robot orientation angle,
ωL, ωR are the angular velocities of the motors,
a, b are the parameters of the model.

Let's define the matrices Y (the stacked increments of x, y and φ) and Θ (the regressor matrix built from the measured angular velocities and the orientation), so that Y = Θ·p, where p = [a, b]ᵀ. By solving the least-squares equation p = (ΘᵀΘ)⁻¹·ΘᵀY, we get the robot parameters [2].
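With the model discretized by a forward Euler step, the least-squares estimate can be sketched in numpy as follows (the sampling period dt and the data layout are assumptions, not the paper's exact procedure):

```python
import numpy as np

def identify(x, y, phi, wl, wr, dt):
    """Estimate p = [a, b] from sampled poses (x, y, phi) and motor
    angular velocities (wl, wr) by ordinary least squares."""
    s = 0.5 * (wl[:-1] + wr[:-1]) * dt      # translational regressor
    d = (wr[:-1] - wl[:-1]) * dt            # rotational regressor
    z = np.zeros_like(s)
    Theta = np.concatenate([
        np.stack([s * np.cos(phi[:-1]), z], axis=1),   # rows for dx
        np.stack([s * np.sin(phi[:-1]), z], axis=1),   # rows for dy
        np.stack([z, d], axis=1),                      # rows for dphi
    ])
    Y = np.concatenate([np.diff(x), np.diff(y), np.diff(phi)])
    p, *_ = np.linalg.lstsq(Theta, Y, rcond=None)
    return p
```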


Fig. 4 Real and simulated trajectory (first group)

Fig. 3 Both found markers (blue and yellow).

The mathematical model used for identification is very simple; it does not include important effects such as friction or slipping. That is why the robot's parameters were estimated for selected groups of robot movement sequences. The groups were divided according to the angular velocities of the engines (their average values and the absolute value of their difference). Each group of data thus contains two trajectories (one with the robot turning left and one turning right). Identification results are presented here for only two groups, but it was carried out for over ten.

Figures 4 and 5 present the results for the first group of trajectories, in which the reference engine speeds are 3000 rpm for the left engine and 6000 rpm for the right. The estimated parameters are:

a = 2.661, b = 0.023198
The simulation results fit the real trajectory very well.

Figures 6 and 7 present an example from the second group of trajectories, with engine speeds equal to 9000 rpm for the left engine and 6000 rpm for the right. The estimated parameters are:

a = 2.7843, b = 0.020464

Fig. 6 Real and simulated trajectory (second group)


The simulated trajectory fits the real one nearly perfectly.

Fig. 7 Measured data of engines speed and estimation error (second group).


Conclusions

The positioning vision system was set up. The algorithm was developed and tested on dozens of sequences. The identification algorithm was also tested; however, the results show that a more suitable, more complex mathematical model is needed. The encoder measurement of the engines' angular speed should also be improved (5 imp/rev is not enough for the PI controllers). Another open problem is time optimization of the algorithm, in order to set up a real-time positioning vision system.

Fig. 5 Measured data of engines speed and estimation error (first group).


References

[1] Chodkowski A.W., Konstrukcja i obliczanie szybkobieżnych pojazdów gąsienicowych, Wydawnictwa Komunikacji i Łączności, 1990.

[2] Söderström T., Stoica P., Identyfikacja systemów, PWN, 1997.

[3] Iwanowski M., An Application of Mathematical Morphology to Filtering and Feature Extraction for Pattern Recognition, Przegląd Elektrotechniczny, vol. 80, No. 4, 2004.

[4] ATMEL, ATmega128(L) Datasheet, updated 12/03.


mgr inż. Dominik Sierociuk, Warsaw University of Technology, Faculty of Electrical Engineering, Institute of Control and Industrial Electronics, Control Division, ul. Koszykowa 75, 00-662 Warszawa. E-mail:

mgr inż. Maciej Czeredys, Warsaw University of Technology, Faculty of Electrical Engineering, Institute of Control and Industrial Electronics, Control Division, ul. Koszykowa 75, 00-662 Warszawa. E-mail:
