Has anyone ever tried using high-performance cluster computing to improve imagery processing times? If so, was it simple to set up and reliable? Thanks
I just spent the month of May trying nearly every Amazon and Microsoft cloud server combination with a Pix4D trial license (note that running on Windows Server requires the Enterprise version, not Desktop), and I found no configuration that could match my desktop. I processed about 40 billion pixels, so it was no small job; bigger jobs may need more RAM to run efficiently.
CPU: Intel(R) Core™ i7-5960X CPU @ 3.00GHz
RAM: 64GB
GPU: NVIDIA GeForce GTX 1060 6GB (Driver: 22.21.13.8205)
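For a rough sense of why RAM matters at this scale, here is a back-of-envelope sizing sketch in Python. The bytes-per-pixel value and the working-set fraction are illustrative assumptions on my part, not Pix4D figures, so treat the output as an order-of-magnitude estimate only:

```python
# Back-of-envelope sizing for a ~40-gigapixel photogrammetry job.
# The bytes-per-pixel and working-set fraction below are illustrative
# assumptions, NOT numbers published by Pix4D.

total_pixels = 40e9           # ~40 billion pixels, as in the job above
bytes_per_pixel = 3           # assumed 8-bit RGB source imagery
raw_bytes = total_pixels * bytes_per_pixel

working_set_fraction = 0.10   # assumed share of data resident in RAM at once
ram_estimate_gb = raw_bytes * working_set_fraction / 1e9

print(f"Raw imagery: {raw_bytes / 1e9:.0f} GB")            # ~120 GB
print(f"Illustrative RAM working set: {ram_estimate_gb:.0f} GB")  # ~12 GB
```

Even with these conservative assumptions, the working set grows linearly with pixel count, which is consistent with larger jobs needing more RAM to stay efficient.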
I can go into the iterations in more depth if you want to discuss via email: Adam.Jordan@nhiae.com