CRASH - Exception found trying to compute descriptors: bad allocation

Hi Team,

We have a powerful workstation with the specs listed below; however, we get the following crash report after 3 days of processing 55,000 images:

[94%RAM][84%CPU][Error]: Exception found trying to compute descriptors: bad allocation

Does anyone know what could be wrong?

G.SKILL 64GB (4 x 16GB) TridentZ Series DDR4 PC4-27200 3400MHz for Intel Z170 platform, 288-pin desktop memory, model F4-3400C16Q-64GTZ

AMD Ryzen Threadripper 2950X Processor (YD295XA8AFWOF)

CORSAIR Hydro Series H115i AIO Liquid CPU Cooler, 280mm Radiator, Dual 140mm SP Series PWM Fans, Advanced RGB Lighting and Fan Software Control

ASRock X399 Taichi motherboard

Seasonic Prime Ultra 1300W 80 Plus Gold modular power supply

Samsung 960 EVO, 2TB, M.2 MZ-V6E500BW

Hi Alan,

In most cases, this error appears when there is not enough RAM for processing. However, please send us the .log file so that we can look more closely at this problem. You can use our OneDrive to store the file.

Thanks in advance

Dear Beata,

We have been trying for a week now with no success; it keeps crashing after 4-7 hours.

We have 64 GB of RAM, and we have set the maximum usage to 62 GB.

I have uploaded the log files to the XXXXXXX Global file in your OneDrive.

I look forward to hearing from you with your comments and recommendations.

Regards,

Alan C

Hi Alan. Given that the project is over 50,000 images at 3840x2160, the hardware resources you have described are definitely insufficient for such a massive dataset. A dataset this size would require server-grade hardware to process in its entirety. You will likely need to split it up, generate the outputs as smaller pieces, and then merge them together in another application.

For example, if you are interested in an ortho, you can generate sub-projects (~3,000-5,000 image chunks) and then merge the resulting orthos in a GIS, as sketched below. If you want a mesh, you can merge the point clouds and generate it in a free application like CloudCompare.
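For the GIS merge step, a free library like GDAL can stitch the per-chunk orthos into one raster. A minimal sketch, assuming the osgeo Python bindings are installed; the file paths are hypothetical placeholders for your sub-project outputs:

```python
import glob
from osgeo import gdal

tiles = sorted(glob.glob("subproject_*/ortho.tif"))  # hypothetical layout

# Build a virtual mosaic first (cheap: no pixels are copied yet)...
vrt = gdal.BuildVRT("mosaic.vrt", tiles)
vrt = None  # close the dataset so the .vrt is flushed to disk

# ...then materialize it as a single compressed GeoTIFF.
gdal.Translate("ortho_merged.tif", "mosaic.vrt",
               creationOptions=["COMPRESS=DEFLATE", "BIGTIFF=YES"])
```

Going through a VRT keeps peak memory low, which matters on a machine that is already RAM-constrained.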

Beata,

We had an engineer review our workstation configuration, and he found that there was an issue with how the RAM was installed.

The process has now been running for 4 days without any problems; it is slow but advancing.

We are in Phase 1, step 2 of 6. 

Do you think we should cancel and break it up as you recommended, or let it continue as it is?  

Pix4D may be able to get through steps 1 and 2. Step 3 will likely not succeed, because the point cloud and DSM need to be loaded into memory in their entirety, and that will either exceed your available RAM or leave you with too little room for generating the raster outputs.
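To see why, here is a rough back-of-envelope in Python; the point count and per-point size are assumptions for illustration, not Pix4D's actual internal representation:

```python
# RAM needed just to hold a dense point cloud in memory.
bytes_per_point = 3 * 8 + 3       # XYZ as 64-bit floats + 3 color bytes = 27 B
points = 2_000_000_000            # assume ~2 billion points from 55,000 4K images

ram_gib = points * bytes_per_point / 2**30
print(f"~{ram_gib:.0f} GiB for the point cloud alone")  # ~50 GiB
```

Even under these conservative assumptions, the point cloud alone would consume most of a 64 GB machine before any raster output is generated.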

Thank you. So how much additional RAM do we need? Also, what happens if our license expires during this process?

Hi Alan. The maximum amount of RAM that I am aware of for workstation-grade computers is 256 GB, but 128 GB is more common. I'm not sure that even with that much you will be able to successfully process step 3 for all your images; it will depend on the processing options as well as the number of images. More likely you will still need to split into sub-projects, but the benefit of more RAM would be that the sub-projects could be larger and thus would require less manual work. Exactly how big will vary project by project. You will need to test what your system is capable of processing, given your image dataset and the processing options required by your project needs.
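If it helps with planning, a trivial Python sketch of splitting an image list into sub-project chunks; the 4,000-image chunk size is only a hypothetical starting point to tune against what your machine can handle:

```python
def chunks(items, size=4000):
    """Yield fixed-size slices of a list; the last one may be smaller."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

images = [f"IMG_{n:05d}.JPG" for n in range(55_000)]  # hypothetical names
print(sum(1 for _ in chunks(images)))  # -> 14 sub-projects of <=4,000 images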

Your project should be able to complete processing even if your license expires while it is running.