KinectFusion: Real-Time Dense Surface Mapping and Tracking
Transcript of KinectFusion: Real-Time Dense Surface Mapping and Tracking
IEEE International Symposium on Mixed and Augmented Reality (ISMAR) 2011, Science and Technology Proceedings (Best Paper Award)
Target
(Slide figures: normal maps, greyscale renderings, noisy input data)
Outline
• Introduction
• Motivation
• Background
• System diagram
• Experiment results
• Conclusion
Introduction
• Passive camera
• Simultaneous localization and mapping (SLAM)
• Structure from motion (SfM)
– MonoSLAM [8] (ICCV 2003)
– Parallel Tracking and Mapping [17] (ISMAR 2007)
• Disparity
– Depth model [26] (2010)
• Camera pose from depth models [20] (ICCV 2011)
Motivation
• Active camera: Kinect sensor
• Pose estimation from depth information
• Real-time mapping
– GPU
Background – Camera sensor
• Kinect sensor
– Infra-red light
• Input information
– RGB image (1)
– Raw depth data
– Calibrated depth image (2)
Background – Pose estimation
• Depth maps from two views
• Iterative closest points (ICP) [7]
• Point-plane metric [5]
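The point-plane metric [5] replaces point-to-point distances with the distance from each transformed source point to the tangent plane at its matched destination point, which tends to converge faster on locally planar scenes. A minimal sketch of the error being minimized (function and variable names are ours, not from the paper):

```python
import numpy as np

def point_plane_error(src_pts, dst_pts, dst_normals, R, t):
    """Point-plane metric [5]: sum of squared distances from each
    transformed source point to the tangent plane at its matched
    destination point (illustrative sketch)."""
    transformed = src_pts @ R.T + t                      # apply rigid transform
    residuals = np.einsum('ij,ij->i', dst_normals, transformed - dst_pts)
    return float(np.sum(residuals ** 2))

# identity transform on identical clouds gives zero error
pts = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 2.0]])
nrm = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
print(point_plane_error(pts, pts, nrm, np.eye(3), np.zeros(3)))  # 0.0
```

ICP then alternates between finding correspondences and minimizing this error over the rigid-body pose.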
Background – Pose estimation
• Projective data association algorithm [4]
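Projective data association [4] avoids an expensive nearest-neighbour search: a 3D vertex is projected into the other view's image plane, and the pixel it lands on indexes the candidate correspondence. A toy sketch of the projection step (the intrinsics values are assumed, not taken from the paper):

```python
import numpy as np

def project_to_pixel(v_cam, K):
    """Core step of projective data association [4] (sketch): a 3D
    point in camera coordinates is projected through the intrinsics K;
    the pixel it lands on indexes the candidate corresponding vertex,
    so no nearest-neighbour search is needed."""
    u_h = K @ v_cam                      # homogeneous image coordinates
    return np.array([u_h[0] / u_h[2], u_h[1] / u_h[2]])

K = np.array([[525.0,   0.0, 320.0],    # assumed Kinect-like intrinsics
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])
print(project_to_pixel(np.array([0.0, 0.0, 2.0]), K))  # → [320. 240.]
```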
Background – Scene Representation
• Volume of space
• Signed distance function [7]
System Diagram
Pre-defined Parameters
• Pose estimation with the sensor camera
• Raw depth map Rk
• Calibrated depth image Rk(u)
(Slide figure: raw data → intrinsics K → Rk → Rk(u))
Surface Measurement
• Reduce noise
• Bilateral filter
(Figure: depth map with vs. without bilateral filtering)
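The bilateral filter smooths depth noise while keeping depth discontinuities sharp, because the range term down-weights neighbours at a very different depth. A direct, unoptimised sketch; the parameter values are illustrative, not from the paper:

```python
import numpy as np

def bilateral_filter(depth, size=2, sigma_s=2.0, sigma_r=0.1):
    """Bilateral filter sketch for a depth map: each output pixel is a
    weighted average of its neighbours, with weights falling off with
    both spatial distance (sigma_s) and depth difference (sigma_r), so
    edges are preserved while noise is smoothed."""
    h, w = depth.shape
    out = np.zeros_like(depth)
    for y in range(h):
        for x in range(w):
            acc, norm = 0.0, 0.0
            for dy in range(-size, size + 1):
                for dx in range(-size, size + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        w_s = np.exp(-(dx * dx + dy * dy) / (2 * sigma_s ** 2))
                        w_r = np.exp(-(depth[ny, nx] - depth[y, x]) ** 2
                                     / (2 * sigma_r ** 2))
                        acc += w_s * w_r * depth[ny, nx]
                        norm += w_s * w_r
            out[y, x] = acc / norm
    return out
```

A constant depth map passes through unchanged, since every weighted average of a constant is that constant.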
Surface Measurement
• Vertex map
• Normal vector
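The two maps above can be sketched as follows: the vertex map back-projects each depth pixel through the inverse intrinsics, and per-pixel normals come from cross products of neighbouring vertices (a minimal sketch; function names are ours):

```python
import numpy as np

def vertex_map(depth, K_inv):
    """Back-project each depth pixel into a 3D vertex:
    V(u) = D(u) * K^-1 * [u, 1]^T."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).astype(float)
    return depth[..., None] * (pix @ K_inv.T)

def normal_map(V):
    """Normals from cross products of neighbouring vertices,
    N(u) ∝ (V(x+1,y) − V(x,y)) × (V(x,y+1) − V(x,y)), normalised."""
    n = np.cross(V[:-1, 1:] - V[:-1, :-1], V[1:, :-1] - V[:-1, :-1])
    return n / np.linalg.norm(n, axis=-1, keepdims=True)
```

For a flat, constant-depth surface every normal points straight back at the camera, which is a quick sanity check.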
Define camera pose
The camera frame at time k is transformed into the global frame.
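Concretely, with a 4×4 rigid-body pose matrix (the paper writes it as T_{g,k} = [R_{g,k} | t_{g,k}]), a camera-frame vertex maps to the global frame by one homogeneous multiply:

```python
import numpy as np

def to_global(T_gk, v_k):
    """Transfer a camera-frame vertex into the global frame:
    v_g = T_{g,k} · [v_k, 1]^T."""
    return (T_gk @ np.append(v_k, 1.0))[:3]

# pure translation of +1 m along z (illustrative pose)
T = np.eye(4)
T[2, 3] = 1.0
print(to_global(T, np.array([0.0, 0.0, 2.0])))  # → [0. 0. 3.]
```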
Surface Reconstruction: Operating Environment
An L × L × L volume of space, reconstructed as L³ voxels.
Surface Reconstruction
• Signed distance function
Truncated Signed Distance Function
(Figure: the TSDF value Fk(p) plotted along a ray (axis x) from the sensor: positive up to +v in front of the surface, zero at the surface, negative down to −v behind it.)
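The truncation in the figure can be sketched as clamping the scaled signed distance to ±1 inside a band of width μ around the surface (the ±1 normalisation is ours; the paper also discards measurements far behind the surface):

```python
import numpy as np

def truncated_sdf(eta, mu):
    """Truncated signed distance: raw distances eta are scaled by the
    truncation distance mu and clamped, so only a band of +/-mu around
    the surface carries graded values (cf. +v/-v in the slide figure)."""
    return np.clip(eta / mu, -1.0, 1.0)

vals = truncated_sdf(np.array([-0.2, -0.05, 0.0, 0.05, 0.2]), 0.1)
```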
• Weighted running average
• Handles dynamic object motion
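The weighted running average fuses each new truncated-distance measurement into a voxel: F_k = (W_{k−1} F_{k−1} + w F_new) / (W_{k−1} + w), with W_k = W_{k−1} + w. Capping W (the cap value here is assumed) bounds how much history a voxel remembers, which is what lets the map adapt to dynamic object motion. A per-voxel sketch:

```python
def tsdf_update(F_prev, W_prev, f_new, w_new=1.0, w_max=64.0):
    """Weighted running average:
    F_k = (W_{k-1} F_{k-1} + w f) / (W_{k-1} + w), W_k = W_{k-1} + w,
    with W capped at w_max (cap value assumed) so old measurements can
    eventually be overwritten by new ones."""
    F = (W_prev * F_prev + w_new * f_new) / (W_prev + w_new)
    W = min(W_prev + w_new, w_max)
    return F, W

F, W = 0.0, 0.0
for f in [1.0, 1.0, 0.4]:      # fuse three measurements into one voxel
    F, W = tsdf_update(F, W, f)
```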
Surface Prediction from Ray Casting
• Store the TSDF volume
• Ray casting marches from +v to the zero-crossing
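Per ray, the march looks for the sign change of the sampled TSDF from positive (+v, free space) to non-positive, then refines the crossing by linear interpolation between the two samples. A 1-D sketch with a toy distance field:

```python
def raycast_zero_crossing(sample_tsdf, t0, t1, step):
    """March a ray from the sensor: the predicted surface lies where
    the TSDF changes sign from positive (free space) to non-positive;
    the crossing is refined by linear interpolation."""
    t = t0
    f_prev = sample_tsdf(t)
    while t + step <= t1:
        f = sample_tsdf(t + step)
        if f_prev > 0.0 and f <= 0.0:              # zero crossing found
            return t + step * f_prev / (f_prev - f)
        f_prev = f
        t += step
    return None                                     # ray hits no surface

# toy 1-D field with the surface at depth 2.0
print(raycast_zero_crossing(lambda t: 2.0 - t, 0.0, 4.0, 0.5))  # → 2.0
```

The slide's speed-ups fit naturally here: outside the truncation band the step can be enlarged toward μ (ray skipping) without missing a crossing.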
Surface Prediction from Ray Casting
• Speed-ups
– Ray skipping
– Truncation distance
Sensor Pose Estimation
• Previous frame
• Current frame
• Assume small motion between frames
• Fast projective data association algorithm
– Initialized with the previous frame's pose
• Vertex correspondences
• Point-plane energy
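Under the small-motion assumption, the point-plane energy is linearised in six parameters (three rotation angles, three translations): each correspondence contributes one linear equation with row [p × n, n] and residual n · (q − p). A minimal solver sketch (no outlier rejection; correspondences are given directly rather than found by projective association):

```python
import numpy as np

def solve_point_plane(src, dst, normals):
    """Least-squares solve of the linearised point-plane energy: each
    row is [p x n, n], each residual n . (q - p); the solution
    x = (alpha, beta, gamma, tx, ty, tz) is the incremental motion."""
    A = np.hstack([np.cross(src, normals), normals])
    b = np.einsum('ij,ij->i', normals, dst - src)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# pure translation: the solver should recover it with zero rotation
src = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                [0, 1, 0], [0, 0, 1], [1, 0, 0]], float)
nrm = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                [0, 0, 1], [1, 0, 0], [0, 1, 0]], float)
dst = src + np.array([0.1, -0.05, 0.2])
x = solve_point_plane(src, dst, nrm)
```

In the full system this solve runs once per ICP iteration, and the recovered increment is composed onto the previous pose estimate.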
• For z > 0
• Modified equation
Experiment Results
• Reconstruction resolution: 256³
• Camera-pose test: the Kinect camera rotates on a turntable, capturing 560 frames over 19 seconds
Experiment Results
• Using every 8th frame
Experiment Results: Processing Time
Stages timed: pre-processing raw data; data association; pose optimisation; raycasting the surface prediction; surface measurement integration.
Demo
Conclusion
• Robust tracking of the camera pose by aligning all depth points
• Parallel algorithms for both tracking and mapping
References
[8] A. J. Davison. Real-time simultaneous localization and mapping with a single camera. In Proceedings of the International Conference on Computer Vision (ICCV), 2003.
[17] G. Klein and D. W. Murray. Parallel tracking and mapping for small AR workspaces. In Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR), 2007.
[26] J. Stuehmer, S. Gumhold, and D. Cremers. Real-time dense geometry from a handheld camera. In Proceedings of the DAGM Symposium on Pattern Recognition, 2010.
[20] R. A. Newcombe, S. J. Lovegrove, and A. J. Davison. DTAM: Dense tracking and mapping in real-time. In Proceedings of the International Conference on Computer Vision (ICCV), 2011.
[7] B. Curless and M. Levoy. A volumetric method for building complex models from range images. In ACM Transactions on Graphics (SIGGRAPH), 1996.
[5] Y. Chen and G. Medioni. Object modeling by registration of multiple range images. Image and Vision Computing (IVC), 10(3):145–155, 1992.
[4] G. Blais and M. D. Levine. Registering multiview range data to create 3D computer objects. IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI), 17(8):820–824, 1995.