
Idea for new functionality

Would it be possible to enhance processing for repetitive work, like mining supervision? Currently you need to mark GCPs in every new project of the same mining site. But I think it should be possible to process the data from a new flight using the data from the last processed flight, because only a small part of a mining site changes in a short amount of time. So it should be possible to compare the whole site from flight 1 and flight 2, find the places which have not changed, and geolocate the data from flight 2 using the data from flight 1. We can create point groups in flight 1, so using the data from flight 1 it should be possible to group points from flight 2 automatically. And maybe it would also increase the accuracy of flight 2?

Hello Ginuti,

Pix4D uses images as input. The software analyzes the image content and generates the outputs. Pix4D finds thousands of common points between images. Each characteristic point found in an image is called a keypoint. When 2 keypoints on 2 different images are found to be the same, they are matched keypoints. Each group of correctly matched keypoints generates one 3D point. Therefore, in order to generate the point cloud, orthomosaic, etc., all processing steps have to be repeated for each new set of images.
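To illustrate the matching idea, here is a minimal toy sketch. It is not Pix4D's actual algorithm (the real detector and descriptors are proprietary and far more sophisticated): keypoints are hypothetical `(x, y, descriptor)` tuples, and two keypoints are considered matched when their descriptors are mutual nearest neighbors.

```python
import math

def distance(d1, d2):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(d1, d2)))

def match_keypoints(kps1, kps2):
    """Return index pairs (i, j) where kps1[i] and kps2[j] are
    mutual nearest neighbors in descriptor space."""
    matches = []
    for i, (_, _, d1) in enumerate(kps1):
        # nearest keypoint in image 2 for this descriptor
        j = min(range(len(kps2)), key=lambda k: distance(d1, kps2[k][2]))
        # mutual check: kps1[i] must also be nearest to kps2[j]
        back = min(range(len(kps1)), key=lambda k: distance(kps2[j][2], kps1[k][2]))
        if back == i:
            matches.append((i, j))
    return matches

# Two hypothetical images; each keypoint is (x, y, descriptor).
image1 = [(10, 20, (0.1, 0.9)), (40, 55, (0.8, 0.2))]
image2 = [(12, 22, (0.12, 0.88)), (300, 10, (0.5, 0.5))]
print(match_keypoints(image1, image2))  # → [(0, 0), (1, 1)]
```

Each such pair of matched keypoints contributes to one triangulated 3D point, which is why the matching step cannot simply be copied from a previous flight: the keypoints themselves are tied to the new images.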

Furthermore, GCPs are marked in images, and the marks are registered in the image coordinate system. This means that when a new set of images is acquired, the GCPs have to be marked again in these images. Even if the mapped area is the same, images from different flights have different image names, geotags, and lighting conditions.

It is recommended to have 5 well-distributed GCPs in the project: https://support.pix4d.com/hc/en-us/articles/202557489. When marking a GCP in the images, all marks of this GCP are used to compute a new 3D point. At least 2 images need to be marked (we recommend marking each GCP in 5 images) in order to compute the estimated 3D position of the GCP. This estimated 3D point is then reprojected into all the images where it might be visible: https://support.pix4d.com/hc/en-us/articles/202560769.
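The reprojection step can be sketched with a simplified pinhole camera model. This is a hedged illustration, not Pix4D's implementation: real photogrammetry also applies the camera's rotation and lens distortion, which are omitted here, and all numbers below are hypothetical. The camera sits at `camera_pos` looking along the +Z axis.

```python
def reproject(point3d, camera_pos, focal, cx, cy):
    """Project a 3D point (X, Y, Z) into pixel coordinates (u, v)
    using a distortion-free pinhole model with no rotation."""
    # express the point in the camera's coordinate frame
    X = point3d[0] - camera_pos[0]
    Y = point3d[1] - camera_pos[1]
    Z = point3d[2] - camera_pos[2]
    # pinhole projection: scale by focal length / depth, shift to
    # the principal point (cx, cy)
    u = focal * X / Z + cx
    v = focal * Y / Z + cy
    return (u, v)

# Hypothetical numbers: a GCP 100 m in front of the camera,
# offset 10 m in X; focal length and principal point in pixels.
gcp = (10.0, 0.0, 100.0)
camera = (0.0, 0.0, 0.0)
print(reproject(gcp, camera, focal=2000.0, cx=2000.0, cy=1500.0))
# → (2200.0, 1500.0)
```

Once the estimated 3D position of a GCP exists, this kind of projection is what lets the software suggest where the GCP should appear in the remaining images, so you only verify the marks instead of searching each image from scratch.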

Best regards,