Find ‘Tube’ and measure its position and orientation.
D = sqrt(x²+y²+(z+BF)²)
[Plots: distances after BF adaptation (x-axis 20 … 160); tube orientation dz/dx (y-axis −1 … 1); object found; indication of "free floor".]
Tube length = 397.9
[ x z A ] = -19 1266 18.8
Find ‘Tube’ and measure its position and orientation. Reach the tube as accurately as possible (± 1 cm). (Master thesis 2012-2013: Chris Beringhs.)
Calculate path:
Circular Path (1) / S-shaped Path (2)
[Figure: path geometry with target point P(xP,yP), point A(xA,yA), point M and angles β, γ.]
RM = (xM² + zM²)^(1/2)
x0 − xM = zM·tg(β) ; sin(ε)/x0 = sin(90° − β)/RM (sine rule).
yA + L = RM·cos(ε) ; xA = RM·sin(ε) ; tg(γ) = (j − J/2)/f (image information).
1. Track correction over an angle (γ·sign(ε) + δ), with |AP| = yP (circular path!).
   Sine rule: |AP|/sin(ε) = (yP + L)/sin(δ)   [1]
              yP/sin(ε) = RM/sin(ε + δ)       [2]
   From which: |sin(δ)| = (yP + L)·sin(|ε|) / (RM² − 2·RM·(yP + L)·cos(ε) + (yP + L)²)^(1/2)
2. Drive with a radius Rx over an angle α: α = |ε| + |δ| ; |Rx| = yP·tg(α/2) (AGV radii LUT).
So, yP must be searched for …
[Figure: circular-path geometry with track at (0,0), offsets x0 and xM, length L, radii RM and Rx, angles β, γ, δ, ε.]
Precision navigation: Circular Path.
Here: β, ε, γ < 0 !
3. After track correction, |AP| must equal yP !
   |AP|² = xA² + (yA − yP)² = yP²  →  xA² + yA² = 2·yA·yP
   yP = (xA² + yA²)/(2·yA)
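As a minimal numeric check of the yP relation above, a Python sketch (the function name is mine; the point values are illustrative):

```python
import math

def circular_path_yp(xA: float, yA: float) -> float:
    """Radius condition |AP| = yP: from xA² + (yA − yP)² = yP²
    it follows that yP = (xA² + yA²)/(2·yA)."""
    return (xA**2 + yA**2) / (2.0 * yA)

# Sanity check: the distance from A to P(0, yP) must equal yP.
yP = circular_path_yp(xA=-19.0, yA=1266.0)  # illustrative point [mm]
```

The assertion below simply re-states the defining condition |AP| = yP.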
[Figure: circular path (2) with labels δ, L, zM, Rx, track.]
Precision navigation: Circular Path (2).
RM = (xM² + zM²)^(1/2)
x0 − xM = zM·tg(β) ; sin(ε)/x0 = sin(90° − β)/RM (sine rule).
tg(γ) = (j − J/2)/f (image information).
|ε| = |ε0| + |γ|
Sine rule: m/sin(γ) = Z/sin(ε) = RM/sin(ε0)
Cosine rule: m² = Z² + RM² − 2·Z·RM·cos(γ)
From which: m = Z·sin(γ)/sin(ε) ; sin(ε0) = sin(ε)·Z/RM
Z = RM·sin(ε)·sin(ε + γ) / [sin²(ε) − sin²(γ)]
Angle α and radius Rk are bound by:
cos(α + ε0) = 1 − [Z·sin(ε0) + Rk·(1 − cos(ε0))] / (2·Rk)
d3 = Rk·sin(α + ε0) ; t3 = (α + ε0)/ωk(Rk)
d2 = d3 − Rk·sin(ε0) ; t2 = α/ωk(Rk)
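The segment lengths d2, d3 and times t2, t3 of the correction arc can be sketched directly from these formulas (function name and numeric inputs are mine; ωk would normally come from the AGV radii LUT):

```python
import math

def s_path_segments(Rk, alpha, eps0, omega_k):
    """Arc segments for the S-shaped correction, per the slide:
    d3 = Rk·sin(α+ε0), t3 = (α+ε0)/ωk ; d2 = d3 − Rk·sin(ε0), t2 = α/ωk."""
    d3 = Rk * math.sin(alpha + eps0)
    t3 = (alpha + eps0) / omega_k
    d2 = d3 - Rk * math.sin(eps0)
    t2 = alpha / omega_k
    return d2, d3, t2, t3

# Illustrative values: Rk in mm, angles in rad, ωk in rad/s.
d2, d3, t2, t3 = s_path_segments(Rk=1000.0, alpha=0.3, eps0=0.2, omega_k=0.5)
```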
[Figure: S-shaped path geometry with track at (0,0), offsets x0 and xM, lengths L, m, Z, distances d2 and d3, radii RM, Rk, −Rk, angles α, β, δ, ε, ε0.]
Precision navigation: S-shaped Path.
Here: β, ε, γ < 0 !
So, Rk must be searched for … (LUT)
RMS error:
d1 = sqrt(u1² + v1² + f²) ; x1 = u1·D1/d1 ; y1 = v1·D1/d1 ; z1 = f·D1/d1
D2² = D1² + δ² − 2·δ·D1·cos(A)
D2² = D1² + δ² − 2·δ·z1
u2/f = x2/(z1 − δ) (with x2 = x1 !) ; v2/f = y1/(z1 − δ) . Measurement at (v2,u2) → DP .
[Figure: a random world point P on the floor, seen at (v1,u1) before and at (v2,u2) after a translation δ ; distances D1 and D2 ; angle A ; D2 = DP ?]
Every random point P can be used to estimate the value δ. More points make a statistical approach possible. RANSAC is advised.
For every δ there is a D2 . For one single δ the error D2 − DP is minimal.
Instant Eigen Motion: translation.
Distance correspondence during a pure translation with a horizontal camera.
ToF-guided navigation of AGVs: random points P ! (In contrast, 'stereo vision' must find edges, so texture is presupposed.)
D2 = DP ?
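The δ-estimation idea above can be sketched as a grid search that minimises the RMS of D2(δ) − DP over random points (function names, the focal length and the search grid are my assumptions; a RANSAC wrapper would discard outlier points first):

```python
import math

def predict_D2(D1, z1, delta):
    """Slide relation D2² = D1² + δ² − 2·δ·z1 for a pure translation δ."""
    return math.sqrt(D1 * D1 + delta * delta - 2.0 * delta * z1)

def estimate_delta(points, f=200.0, deltas=None):
    """points: list of (u1, v1, D1, DP). Returns the δ (and its RMS) that
    minimises the RMS of D2(δ) − DP over all points."""
    if deltas is None:
        deltas = [i * 0.5 for i in range(1, 401)]   # search 0.5 … 200 mm
    best = None
    for delta in deltas:
        sq = 0.0
        for u1, v1, D1, DP in points:
            d1 = math.sqrt(u1 * u1 + v1 * v1 + f * f)
            z1 = f * D1 / d1                        # z1 = f·D1/d1
            sq += (predict_D2(D1, z1, delta) - DP) ** 2
        rms = math.sqrt(sq / len(points))
        if best is None or rms < best[1]:
            best = (delta, rms)
    return best
```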
Image data: tg(β1) = u1/f ; tg(β2) = u2/f . Task: find the correspondence β1 → β2 .
Procedure (with 0 < |α| < α0 and |x| < x0):
xs = |x·sin(α)| ; xc = |x·cos(α)| ; R = − x·cos(α)/[1 − cos(α)] ; δ1 = |R·sin(α)| .
Projection rules for a random point P :
xP = D2·sin(β2) = D1·sin(α + β1) + xc
zP = D2·cos(β2) = D1·cos(α + β1) + xs − δ1
tg(β2) = [D1·sin(α + β1) + xc] / [D1·cos(α + β1) + xs − δ1]
D2² = D1² + x² + δ1² + 2·D1·x·sin(β1) − 2·δ1·[D1·cos(α − β1) + xs]
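A sketch of the projection rules above as a prediction function (the function name and the sample numbers are mine; signs follow the slide convention x < 0, α < 0, R > 0):

```python
import math

def rotate_predict(D1, beta1, alpha, x):
    """Predict where a point seen at angle β1 and distance D1 reappears
    after a planar rotation α on a turn with lateral wheel offset x."""
    xs = abs(x * math.sin(alpha))
    xc = abs(x * math.cos(alpha))
    R = -x * math.cos(alpha) / (1.0 - math.cos(alpha))   # turn radius
    delta1 = abs(R * math.sin(alpha))
    xP = D1 * math.sin(alpha + beta1) + xc
    zP = D1 * math.cos(alpha + beta1) + xs - delta1
    return math.atan2(xP, zP), math.hypot(xP, zP)        # (β2, D2)

beta2, D2 = rotate_predict(D1=2000.0, beta1=-0.1, alpha=-0.24, x=-100.0)
```

Comparing the predicted D2 against the measured DP for many random points, as in the translation case, yields R and α.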
[Figure: previous and next sensor position on a turn with radius R ; angles α, β1, β2 ; distances D1, D2, z1, z2.]
Instant Eigen Motion: planar rotation.
Here: x < 0 , α < 0 , R > 0 .
Parallel processing possible, RANSAC driven !
ToF-guided navigation of AGVs. D2 = DP ?
[Plot: RMS error as a function of angle A. CPUtime = 0.088781 ; R = 1051 , alfa = 0.24 ; minimum RMS = 1.8103 .]
e.g. Make use of 50 random points at the horizon.
RANSAC methods are advised. RANSAC = RANdom SAmple Consensus.
Previous image
White spots are randomly selected points.
Next image
Minimum RMS = 1.3178
Result = radius R & angle α
ToF-guided navigation of AGVs. Instant Eigen Motion: planar rotation. D2 = DP ?
Flemish Institute for Mobility: project ‘Sensovo’.
1. Make a choice of an adapted camera and its set-up (benchmark the cameras).
2. Data capture: ~ 1 GByte/km (width = 1.5 m per camera).
3. Identification of the road sections (section # + GPS start/stop + image # + speed).
4. Parallel computed image analysis: ‘edges’ on a level of 2, 4 or 6 cm (Prewitt filter + small contour closings if necessary).
5. Transversal z-plus/z-min sequences (e.g. track formation, convex road parts).
6. Road cracks (e.g. concrete fractures, transversal and longitudinal stitches …).
7. Contour interpretation and classification.
8. Reports about the road damage at the level of kilometre, hectometre, decametre and metre (list with the heaviest strokes on the scale of 1, 10, 100 and 1000 m).
9. Dealing with road paintings, mark signs (arrows …), ‘zebra strokes’ …
VLAAMS INSTITUUT VOOR MOBILITEIT, Wetenschapspark 13, B-3590 DIEPENBEEK
T +32 11 24 60 00, www.vim.be/projecten/sensovo
Road inspections: MESA RS4500 (wide FoV).
1. Camera: FOV (69°, 56°) ; resolution (176, 144 pix) ; I = 172 ; J = 140 .
2. Height z0 can be chosen (e.g. z0 = 1.50 m):
   x0 = 2·z0·tg(56°/2) · (J − 6)/J ;
   y0 = 2·z0·tg(69°/2) · (I − 6)/I ; (overlap)
3. Trigger period: T = 1000·y0/v [msec] ; freq = v/y0 [frames/sec] = 5 Hz .
4. Data generation:
   # images/km = 1000/y0 (± 500)
   # doubles/image = 2 · 24000 = ± 50000
   # Mbytes/km = 200/y0 (± 100 Mb/km)
z0 = 1.50 m ; y0 = 2.00 m ; x0 = 1.50 m . About ~ 15 × 15 mm²/pixel .
v = 10 m/s = 36 km/h ; T = 10 msec (shutter).
One trip = 200 km = 5 GB per 1.5 m width (speed independent).
z0 = 1.50 m ; y0 = 2.00 m ; x0 = 1.50 m . About ~ 15 × 15 mm²/pixel .
v = 10 m/s = 36 km/h ; T = 1.5 msec (shutter).
One trip = 200 km = N × 16 GB per 1.4 m (speed independent).
One lateral stroke of 1.4 m is present multiple times in consecutive images.
Search for the damage in the centre of the images.
Steps:
- Raw images should be deblurred.
- Row shifts are a function of speed: #i/I = v·T/y0 → #i = floor(I·v·T/y0) ; redundancy number N = floor(I/#i) .
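The row-shift and redundancy formulas above can be sketched as follows (function name is mine; the 60 fps value is taken from the Fotonic maximum mentioned on the next slide, the rest from this one):

```python
import math

def redundancy(I=172, v=10.0, fps=60.0, y0=2.0):
    """Row shift per frame (#i) and redundancy number N: how many
    consecutive images contain the same lateral road stroke."""
    T = 1.0 / fps                        # frame period [s]
    di = math.floor(I * v * T / y0)      # #i = floor(I·v·T/y0) rows
    N = math.floor(I / di)               # N = floor(I/#i) images
    return di, N

di, N = redundancy()   # v = 10 m/s, 60 fps: 14 rows per frame, N = 12
```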
Road inspections: FOTONIC E-Series.
Maximum fps = 60 !! Accuracy 0.01 m/m .
k = [ −3 −2 −1 0 1 2 3 ] ; here K = 7 .
Disparity law: (u0 − uk)/f = k·tk/zP ; −(du/dt)/f = (ds/dt)/zP ,
with tk = V·∆t [mm] ; ∆t = 1/fps [sec] ; vP = speed [pix/sec] ; vP/V = f/zP [-] .
uk = u0 − k·(vP·∆t) = u0 − k·vimage , with vimage [pix/image] .
[Figure: a world point P at depth zP seen at columns u−3 … u−1, u0, u1, u3 in consecutive frames ; camera speed V = ds/dt ; frame spacing tk, total span 3·tk ; focal length f.]
Averaging the distance to a fixed world point for a moving ToF camera:
Z = Σk Z(uk, v)/K
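The averaging step can be sketched directly (function name and the frame container are my assumptions; `frames[k]` holds the distance image of frame k as rows × columns):

```python
def averaged_distance(frames, u0, v, v_image, K=7):
    """Average the distance to one fixed world point over K frames;
    in frame k the point sits at column uk = u0 − k·v_image,
    so Z = Σk Z(uk, v)/K as on the slide."""
    ks = range(-(K // 2), K // 2 + 1)            # k = −3 … 3 for K = 7
    return sum(frames[k][v][u0 - k * v_image] for k in ks) / K
```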
Data shuffling: reordering in such a way that road information present in different images can be averaged over dimension K .
At this moment, ‘reordering’ is especially compatible with Melexis ToF cameras, for which the data A1, A2, A3 and A4 are fully available. A kind of sawtooth modulation is used instead of a sine-wave modulation. Collect the frames every 90°. If the speed is known (or measured), the pixel shift can be found and used to find the associated mean amplitudes, from which the phase angles φ can be derived.
mlxGetDistances(rawFrameData,speed)
[Figure: alternative modulation ; amplitudes A0, A1, A2, A3, A4 ; phase φ = f(X,Y).]
Melexis: EVK75301 80x60 pix.
- Deblurring is structurally incompatible with noise (cf. Wiener filters) !
- ‘Pulsed ToF cameras’ react totally differently from ‘phase-shift ToF cameras’ !
- The better we can handle ‘blur’ in real time, the faster objects may flow ‘through’ the ToF images (VIM project).
Which pixels have ‘seen’ a specific dice eye, and for what integration time ?
Let’s organize a ‘MARBLE experiment’ and benchmark different ToF camera types (see next slide).
Motion deblur for ‘colour cameras’ and for ‘ToF cameras’ asks for a different approach.
Marble experiments: a white ball rolling over a white floor … (… conveyor belts …), a coloured ball over a (contrasting) coloured floor.
Point spread functions (PSF-depth / PSF-brightness).
[Plot: noised & ToF-blurred wave.]
Further research needed !
Question: … will white mice running in the snow give ToF - Motion Blur ?
Spinning wheel experiments
Use a bicycle wheel + a flat disk (PUR). Mount a pie-shaped element (thickness 40 mm, angle 30°). Mount a flat second material (thickness 3 mm, angle 180°).
Let it spin and strange measurement phenomena will occur …
Material edges
Depth edges
[Plot: measured angle and distance of the spinning wheel. Angle = 0.73892 (correct angle = 0.71558) ; distance D = 294 (correct distance = 285).]
2·D = λmod·[φ/2π]
φ = atan[(A1 − A3)/(A2 − A4)]
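The four-phase distance computation above can be sketched as follows (function name and the 15 m modulation wavelength are my assumptions; `atan2` is used instead of `atan` to keep the correct quadrant):

```python
import math

def tof_distance(A1, A2, A3, A4, lambda_mod=15000.0):
    """Four-phase ToF: φ = atan[(A1 − A3)/(A2 − A4)], and 2·D = λmod·φ/(2π),
    i.e. D = λmod·φ/(4π). lambda_mod in mm (15 m here, an assumption)."""
    phi = math.atan2(A1 - A3, A2 - A4)
    if phi < 0.0:
        phi += 2.0 * math.pi            # unwrap phase to [0, 2π)
    return lambda_mod * phi / (4.0 * math.pi)
```

With inconsistent amplitudes collected during motion, the computed φ (and hence D) fluctuates chaotically, which is exactly the effect described next.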
Moving ToF (robotics, AGVs, conveyor belts, … ‘road damage inspection’):
During motion, inconsistent brightness values are collected, giving rise to chaotic fluctuations in the measured phase angle φ .
Big deviations will occur !
[Figure: the four sampling windows 1-4 of the modulated signal yield amplitudes A1 … A4 and the phase φ ; texture-dependent mean intensity values.]
Remember: ‘pulsed time of flight cameras’ don’t follow this mechanism !
Spinning wheel experiment.
Heavily noised brightness image ; blurred and noised data. Smallest eigenvalue = 0.3 ; sigma noise = 0.5 .
[Plots (y-axis −15 … 15, x-axis 50 … 150):
Sk-deblurred data: 100·CPUtime = 0.2184 ; ssqLuc = 3.988 ; CPUtime Luc = 0.002184 ; K = 5 ; jj = 160 .
Wiener-deblurred: 100·CPUtime = 1.6068 ; ssqWNR = 5.802 .]
STATIC: Z-values last image: frames per second = 9.8398 ; out-of-controls # = 0 . Luminances last image: shutter time = 4 ms ; ROI = 118 × 158 pix .
DYNAMIC: Z-values last image: frames per second = 9.8721 ; out-of-controls # = 0 . Luminances last image: shutter time = 4 ms ; ROI = 118 × 158 pix .
[Distance images of a spinning wheel (Fotonic Z40, sample Z #8, Int = 4, z0 = 1044): dark is closer, bright is further away. Distance ~ 1000 mm ; wheel diameter = 600 mm. Measured values: ~ 960 mm, ~ 997 mm, ~ 1000 mm.]
www.TriDiCam.net
Line sensor: the first line sensor with a resolution of 64 × 2 pixels, developed by Fraunhofer IMS.
15 kHz scan speed. 3 mm × 10000 Hz → 30 m/s = 108 km/h (one 3 mm line per acquisition at 10000 Hz).
Pixel shape.
‘Motion blur reduced’ ToF-cameras based on pulsed ToF.
www.odos-imaging.com
Real.iZ
March 2013, Image Sensors

Type of sensor                 JAG                          PAN
Type of sensor                 CMOS                         CCD
Maximum frame rate             25 fps                       60 fps
- Total capture time           > 20 ms¹                     6.7 ms
Pixel array size               160 (h) x 120 (v)            160 (h) x 120 (v)²
Multi-camera option            Yes, no limitations          No
Performance pros/cons          High distance accuracy       Handles motion better
                               Lower price                  Better in sunlight
Number of dead pixels          <= 5                         <= 5
Dead pixel cluster (2 or
more direct neighbours)        No                           No
Optics                         70°          40°             70°          40°
Field of view (h) x (v)        64° x 48°    44° x 34°       70° x 53°    40° x 30°
Measurement range              0.1 - 5 m    0.1 - 7 m       0.1 - 10 m   0.1 - 12 m
Fotonic E70 / E-Pan: frames per second = 60 !! Strongly reduced blur. (JAG = Jaguar, Texas Instruments ; PAN = Panasonic.)
Fotonic Z70 / E-Pan: frames per second = 60 !! Reduced blur.
Sheet of Light (SoL). Current camera types can deal with 75 000 line acquisitions a second.
The resolution depends on the FOV (e.g. FOV = 2000 mm) and on the number of image columns (e.g. # cols = 2048):
∆x = FOV / # cols (e.g. ∆x ~ 1 mm/pixel)
∆z = ∆x / sin(a) (e.g. ∆z ~ 2 mm/pixel , a = 30°)
Row-wise, the sub-pixel ‘centre of gravity’ (CoG) can be searched for in 2^k accuracy steps. In that case we get:
∆zCoG = ∆x / (2^k·sin(a)) (e.g. ∆zCoG ~ 0.25 mm/pixel) .
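A minimal sketch of the row-wise sub-pixel CoG (function name is mine; the intensity profile stands in for one image row of the laser line, quantised to 2^k steps as above):

```python
def cog_subpixel(row, k=3):
    """Row-wise centre of gravity of a laser-line intensity profile,
    quantised to 2^k sub-pixel steps (1/8 pixel for k = 3)."""
    total = sum(row)
    cog = sum(i * w for i, w in enumerate(row)) / total
    step = 1.0 / (1 << k)                 # 2^k accuracy steps per pixel
    return round(cog / step) * step
```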
[Figure: camera model. Image plane I × J with pixel (i, j), principal point (i0, j0), scaled coordinates u·ku and v·kv, focal point (0, 0, f), world axes x, y, z, ray lengths r, d and D, angle φ ; f = 1 .]
u = j − j0 ; uk = ku·u
v = i − i0 ; vk = kv·v
tg(φ) = uk/f ; tg(ψ) = vk/f
r = √(uk² + f²)
d = √(uk² + vk² + f²)
D/d = x/uk = y/vk = z/f
ToF VISION: world-to-image and image-to-world conversion.
[Figure: image of size I × J with principal point (i0, j0) at (J/2, I/2), horizon, surface normal N, row/column indices kr, kc, angles A and ψ.]
Every world point generates 12 important coordinates: x, y, z, Nx, Ny, Nz, N, kr, kc, R, G, B .
P = [ tg(φ) tg(ψ) 1 d/D ] , P’ = [ tg(φ) tg(ψ) 1 ] . (f is chosen to be the unit.)
RGBd-cameras.
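The image-to-world conversion D/d = x/uk = y/vk = z/f can be sketched as follows (function name and the default principal point / pixel-scale values are my assumptions, not from the slides):

```python
import math

def pixel_to_world(i, j, D, f=1.0, i0=72.0, j0=88.0, ku=1.0, kv=1.0):
    """Image-to-world conversion per the slide: u = j − j0, v = i − i0,
    d = √(uk² + vk² + f²), then (x, y, z) = D·(uk, vk, f)/d."""
    uk = ku * (j - j0)
    vk = kv * (i - i0)
    d = math.sqrt(uk * uk + vk * vk + f * f)
    return (D * uk / d, D * vk / d, D * f / d)
```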
STEREO VISION: ‘disparity / distance’ conversion
• Image disparity [pixel]: d = c2 − c1
• Set-up disparity [m]: xs = sccd·(c2 − c1)
• Object distance [m]: z = b·f / xs
Error estimation: ∆z = z²·∆xs / (b·f)
[Figure: rectified stereo set-up with cameras Cr1 and Cr2, optical centres O1 and O2, baseline b, focal length f, image columns c1 and c2, world point Pw projected to P1 and P2, axes xr, yr, zr, normal N.]
Rectified images.
Stereo correspondence calculations are time consuming.
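A sketch of the disparity-to-distance conversion (the function name and the pixel-pitch / focal-length / baseline defaults are my assumptions; the formulas are the standard rectified-stereo relations the garbled slide appears to state):

```python
def stereo_depth(c1, c2, s_ccd=10e-6, f=0.012, b=0.3):
    """Rectified stereo: disparity d = c2 − c1 [pixel],
    xs = s_ccd·d [m], z = b·f/xs [m]; ∆z = z²·∆xs/(b·f) per pixel of error."""
    d = c2 - c1                       # image disparity [pixel]
    xs = s_ccd * d                    # set-up disparity [m]
    z = b * f / xs                    # object distance [m]
    dz = z * z * s_ccd / (b * f)      # depth error for ±1 pixel disparity error
    return z, dz
```

Note how ∆z grows with z²: the distance error explodes at long range, one reason ToF is attractive for AGVs.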
Matlab programs ( Mesa / Fotonic ) ( easily extendable for Melexis, ODOS and other 3D-cameras)
VR_fot_Wheel_Benchmark_zL / VR_Mesa_Wheel_Benchmark_DL
VR_fot_Marble_Z / VR_MESA_Marble_Dzyx
VR_EigenMotionRotation / VR_EigenMotionTranslation
VR_Fot_Chris_ImageAnalysis / VR_Fot_Chris_AGV ( Image analysis & AGV-control ).
OK_VR_Mesa_AcquireImageFast_DL_OoC
VL (world) , vL (code) , Δv (right wheel relative to the left wheel, measured in digits).
R = B·(2·vL/∆v − 1) ; ωAGV = K·vL/(R + B) [rad/sec]
VL [mm/s]  vL [#]  ∆v = 0    1     2     3     4    5     6     7         8     9     10
6.25       1       #DIV/0!   180   0     -60   -90  -108  -120  -128.571  -135  -140  -144
12.5       2       #DIV/0!   540   180   60    0    -36   -60   -77.143   -90   -100  -108
18.75      3       #DIV/0!   900   360   180   90   36    0     -25.714   -45   -60   -72
25         4       #DIV/0!   1260  540   300   180  108   60    25.714    0     -20   -36
31.25      5       #DIV/0!   1620  720   420   270  180   120   77.143    45    20    0
37.5       6       #DIV/0!   1980  900   540   360  252   180   128.571   90    60    36
43.75      7       #DIV/0!   2340  1080  660   450  324   240   180       135   100   72
50         8       #DIV/0!   2700  1260  780   540  396   300   231.429   180   140   108
56.25      9       #DIV/0!   3060  1440  900   630  468   360   282.857   225   180   144
62.5       10      #DIV/0!   3420  1620  1020  720  540   420   334.286   270   220   180
68.75      11      #DIV/0!   3780  1800  1140  810  612   480   385.714   315   260   216
75         12      #DIV/0!   4140  1980  1260  900  684   540   437.143   360   300   252
This table gives the radius of curvature in millimetres as a function of vL and ∆v .
AGV properties:
B = 180 mm
Rw = 62 mm
K = 6.25
K = 1000/160 = 6.25 [(mm/sec) / code unit]
Formulary: VL [m/s] = K·vL [digits] ; K must be calibrated. VR [m/s] = K·vR ; ∆V = K·∆v = VL − VR .
Clockwise movements (VL > VR): VL = VR + ∆V = ωAGV·(R + B) → ωAGV·(R − B) + ∆V = ωAGV·(R + B) → −ωAGV·B + ∆V = ωAGV·B
ωAGV = ∆V/(2B) ; R = B·(VL + VR)/(VL − VR)
ωAGV = K·∆v/(2B) ; R = B·(2·vL/∆v − 1)
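The formulary above reduces to a small differential-drive helper (function name is mine; B and K are the AGV properties listed above):

```python
def agv_motion(vL, vR, B=180.0, K=6.25):
    """Differential-drive AGV: wheel speed codes vL, vR [digits],
    K [(mm/s) per digit], half-track B [mm].
    Returns ωAGV = K·∆v/(2B) [rad/s] and R = B·(2·vL/∆v − 1) [mm]."""
    dv = vL - vR
    if dv == 0:
        return 0.0, float('inf')      # straight line: infinite radius
    omega = K * dv / (2.0 * B)
    R = B * (2.0 * vL / dv - 1.0)
    return omega, R
```

The returned radii reproduce the curvature table above, e.g. vL = 3 with ∆v = 4 gives R = 90 mm.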
Experiment: vL = vR = 30 → 2 m / 10.87 sec, or V = 0.184 m/s → K = 0.184/30 (about 1/160). K [m/sec / code unit]
            vL = vR = 50 → 4 m / 12.78 sec, or V = 0.313 m/s → K = 0.313/50 (about 1/160)
Experiment: vL = vR = 2 → 1 m / 80 sec, or V = 0.0125 m/s → K = 0.0125/2 (about 1/160).