Abstract
Despite the rapid development of reverse engineering techniques, 3D laser scanners remain the principal modern technology for digitizing 3D objects; however, the scanning process can be degraded by environmental noise and by the limitations of the scanner used. In this paper, a data pre-processing algorithm is therefore proposed to extract the necessary geometric features and mathematical representation of a scanned object from its point cloud, acquired with a 3D laser scanner (Matter and Form), by isolating the noisy points. The proposed algorithm is based on the continuous calculation of the chord angle between each adjacent pair of points in the point cloud. A MATLAB program was built to implement the proposed algorithm, and it was applied to suggested case studies with cylindrical and dome shapes. The point clouds obtained after applying the proposed algorithm, together with the surface-fitting results for the case studies, demonstrate the proficiency of the proposed chord-angle algorithm in pre-processing the data points and cleaning the point cloud: the proportion of data rejected as noisy points by the algorithm reached 81.52% and 75.01% of the total number of points in the point cloud for the first and second case studies, respectively.
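As an illustration only, the following is a minimal MATLAB sketch of chord-angle-based noise filtering as the abstract describes it. The abstract does not give the authors' exact decision rule, so the function name chordAngleFilter, the ordering assumption on the point cloud, and the angle threshold thetaMax (in degrees) are assumptions introduced here, not the authors' implementation.

% Minimal sketch: filter an ordered N-by-3 point cloud P by the chord
% angle formed at each point with its two neighbours. Points where the
% chord angle falls below the (hypothetical) threshold thetaMax are
% treated as noise and removed.
function cleanP = chordAngleFilter(P, thetaMax)
    N = size(P, 1);
    keep = true(N, 1);
    for i = 2:N-1
        % Chord vectors from the current point to its two neighbours
        v1 = P(i-1, :) - P(i, :);
        v2 = P(i+1, :) - P(i, :);
        % Chord angle at point i, in degrees (close to 180 on a smooth run)
        theta = atan2d(norm(cross(v1, v2)), dot(v1, v2));
        % Flag the point as noise if the chord angle is too sharp
        keep(i) = theta >= thetaMax;
    end
    cleanP = P(keep, :);
end

For example, cleanP = chordAngleFilter(P, 150) would discard points at which the chords to the neighbouring points bend by more than about 30 degrees; the appropriate threshold in the actual study would depend on the scan density and the geometry of the part.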
(Received 6 March 2020; accepted 5 July 2020)