3D Shape Measurement of Moving Objects for Industrial Applications

dc.contributor.advisor Cree, Michael J.
dc.contributor.author Charleston, Sean Andrew
dc.date.accessioned 2015-11-24T20:56:42Z
dc.date.available 2021-02-16T21:28:37Z
dc.date.issued 2015
dc.date.updated 2015-07-24T00:07:08Z
dc.description.abstract Three dimensional (3D) cameras provide distance measurements to objects, allowing computers and instruments to interact with their environment. The applications are wide-ranging, from human gesture control to industrial processing. Time-of-flight cameras measure the distance to the scene by measuring the flight time of a modulated light source. Sequential captures are required to produce the depth map, hence time-of-flight cameras are vulnerable to depth errors from motion blur in dynamic scenes. This is a major hindrance for industrial applications, where accurate results are required when reconstructing objects. The fruit grading industry is of particular interest for this work, where significant advancements can be made using 3D cameras. The produce moves at a constant velocity, providing an ideal case for initial work into industrial motion correction. The SR4000 from Mesa Imaging is an industrial-grade time-of-flight camera with a high quality factory calibration, and is used throughout this work. When applying custom algorithms (such as motion correction), the camera is run in ‘raw mode’ where the sequential captures can be individually manipulated; however, the factory calibration set is lost. The first part of this work investigates calibrations in time-of-flight cameras, where the factory calibration set in the SR4000 is extracted from the camera to be used on the ‘raw mode’ data in custom algorithms. The factory calibrated data is compared to both the ‘raw mode’ data and data acquired using the extracted calibration set. The key results show a root mean squared error (RMSE) of 62.4 mm for ‘raw mode’ data, while using the extracted calibrations gives an RMSE of 6.1 mm. The effects of motion blur on time-of-flight cameras are then investigated. The technique from Hussmann et al. (2011) provides a good first attempt at motion correction, but fails to implement a number of calibrations. The improvements presented in this thesis to the motion correction technique manipulate the demodulation of time-of-flight cameras so that these additional calibrations are incorporated, resulting in a more robust motion correction algorithm. To test these improvements, a controlled experiment is set up to image a moving spherical object, and a stationary reference image of the same object is captured for comparison. Without motion correction the RMSE is 75.9 mm. Using the naive correction technique from Hussmann et al. (2011) gives an RMSE of 58.7 mm, and finally applying the suggested improvements reduces the RMSE to 4.3 mm.
dc.format.mimetype application/pdf
dc.identifier.citation Charleston, S. A. (2015). 3D Shape Measurement of Moving Objects for Industrial Applications (Thesis, Master of Engineering (ME)). University of Waikato, Hamilton, New Zealand. Retrieved from http://hdl.handle.net/10289/9777
dc.identifier.uri https://hdl.handle.net/10289/9777
dc.language.iso en
dc.publisher University of Waikato
dc.rights All items in Research Commons are provided for private study and research purposes and are protected by copyright with all rights reserved unless otherwise indicated.
dc.subject Time-of-flight
dc.subject Motion Correction
dc.title 3D Shape Measurement of Moving Objects for Industrial Applications
dc.type Thesis
pubs.place-of-publication Hamilton, New Zealand
thesis.degree.grantor University of Waikato
thesis.degree.level Masters
thesis.degree.name Master of Engineering (ME)
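
The abstract above describes how a time-of-flight camera recovers depth from sequential raw captures and how reconstructions are evaluated by RMSE against a stationary reference. The Python sketch below illustrates the standard four-phase (four-bucket) demodulation and an RMSE comparison; it is a minimal, general illustration, not the thesis's own algorithm or the SR4000's factory-calibrated pipeline, and the modulation frequency, phase-sign convention, and function names are assumptions chosen for the example.

import numpy as np

C = 299_792_458.0   # speed of light (m/s)
F_MOD = 30e6        # assumed modulation frequency (Hz); the SR4000's actual setting may differ

def demodulate(a0, a1, a2, a3):
    """Standard four-phase time-of-flight demodulation.

    a0..a3 are four sequential raw captures taken at 0, 90, 180 and 270
    degrees of the modulation period. Because they are captured one after
    another, object motion between captures corrupts the recovered phase,
    which is the motion-blur problem the thesis addresses.
    """
    a0, a1, a2, a3 = (np.asarray(a, dtype=float) for a in (a0, a1, a2, a3))
    phase = np.arctan2(a3 - a1, a0 - a2) % (2 * np.pi)  # phase shift of the returned light
    amplitude = 0.5 * np.hypot(a3 - a1, a0 - a2)        # strength of the modulated signal
    offset = 0.25 * (a0 + a1 + a2 + a3)                 # background (DC) component
    distance = C * phase / (4 * np.pi * F_MOD)          # phase to metres (accounts for round trip)
    return distance, amplitude, offset

def rmse(measured, reference):
    """Root mean squared error between a depth map and a stationary reference."""
    diff = np.asarray(measured, dtype=float) - np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

Given four raw frames of equal shape, demodulate returns a per-pixel distance map in metres; comparing a moving capture against a stationary capture of the same object with rmse mirrors, in outline, the style of evaluation reported in the abstract.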
Files
Original bundle
Name: thesis.pdf
Size: 23.46 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 2.07 KB
Format: Item-specific license agreed to upon submission