Hello
I am facing repeated failures when trying to generate dense point clouds on Pix4Dcloud using large image sets (400+ images, high resolution).
The initial steps (upload, calibration) complete successfully, but the processing stops during the point cloud generation phase with a vague error message.
This doesn’t happen when running the same project locally using Pix4Dmapper. I have checked for image overlap and quality issues, and everything looks fine.
Is there a known image count or resolution threshold for Pix4Dcloud that triggers processing failures? Or perhaps limits on RAM or processing time per project?
I would also like to know if anyone has successfully split large datasets for cloud processing, or whether there’s a smarter way to batch upload tiles without manual intervention. Cloud processing should ideally scale, but right now I’m hitting a wall.
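In case it helps the discussion, here is a minimal sketch of how I imagine splitting a flight into overlapping spatial tiles before upload. It assumes you can export image positions to a CSV (columns: image, lat, lon) from the flight log or from Pix4Dmapper's image properties; the file name, tile size, overlap margin, and the 350-images-per-tile cap are my own illustrative values, not documented Pix4Dcloud limits.

```python
#!/usr/bin/env python3
"""Split a large geotagged image set into overlapping spatial tiles.

Assumes a CSV named image_positions.csv with columns: image, lat, lon.
Tile size, overlap, and per-tile cap below are illustrative guesses.
"""
import csv
import math
from collections import defaultdict
from pathlib import Path

TILE_DEG = 0.002            # tile edge in degrees (~220 m of latitude)
OVERLAP_DEG = 0.0004        # margin so neighbouring tiles share images
MAX_IMAGES_PER_TILE = 350   # assumed safe upper bound per cloud project


def load_positions(csv_path):
    """Read image -> (lat, lon) from a simple CSV export."""
    positions = {}
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            positions[row["image"]] = (float(row["lat"]), float(row["lon"]))
    return positions


def assign_to_tiles(positions):
    """Bucket images into grid tiles; images near a border land in several tiles."""
    tiles = defaultdict(set)
    for name, (lat, lon) in positions.items():
        base_r = math.floor(lat / TILE_DEG)
        base_c = math.floor(lon / TILE_DEG)
        # Check the image's own tile plus neighbours whose overlap margin
        # still covers this position, so tile edges keep shared tie points.
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                r, c = base_r + dr, base_c + dc
                lat0, lon0 = r * TILE_DEG, c * TILE_DEG
                if (lat0 - OVERLAP_DEG <= lat < lat0 + TILE_DEG + OVERLAP_DEG and
                        lon0 - OVERLAP_DEG <= lon < lon0 + TILE_DEG + OVERLAP_DEG):
                    tiles[(r, c)].add(name)
    return tiles


def write_manifests(tiles, out_dir="tiles"):
    """Write one image list per tile; each list drives one sub-project upload."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for (r, c), images in sorted(tiles.items()):
        if len(images) > MAX_IMAGES_PER_TILE:
            print(f"tile {r},{c}: {len(images)} images, consider a smaller TILE_DEG")
        (out / f"tile_{r}_{c}.txt").write_text("\n".join(sorted(images)) + "\n")


if __name__ == "__main__":
    write_manifests(assign_to_tiles(load_positions("image_positions.csv")))
```

The overlap margin is there so adjacent sub-projects share images and the resulting point clouds can be merged later; I'd be glad to hear whether this kind of pre-tiling actually avoids the failures, or whether Pix4Dcloud handles the splitting internally.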
I have already gone through the article "What are the processing options - PIX4Dcloud" and found it helpful, but it doesn't answer this.
While debugging this, someone on our team asked what a cloud architect actually does, and it made me realize that cloud infrastructure and workload orchestration are crucial even in photogrammetry platforms like Pix4Dcloud.
Any tips or workflow adjustments from others who deal with large-volume aerial imagery would be really helpful.
Thank you!