I’m still thinking Optane over a traditional M.2 SSD might be a bigger gain than a better graphics card. I also notice that my graphics card is never at 100% in HWMonitor when I’m processing in Pix4D.
From my understanding the Titan V only shows gains when you can utilize the AI components it includes. Given that Pix4D can only use the CUDA cores, the Titan V probably isn’t a big or worthwhile step up from a Titan Xp. I’d love to see a 2080 soon though, lol.
Titan series isn’t working…from what others have said and tested.
@Brian Young I wonder if your PSU is just a bit undersized for running that much OC with a 1080 Ti. I know I had issues with a 700 W EVGA G2 Series PSU when I tried to overclock a Ryzen 1600 and my 1070 at the same time. While the math might work out fine, I find you need a lot more headroom for longer-duration loads.
Adam, how is the Titan series not working? Is it like the Quadro, where some settings have to be changed, or is there just not much of a jump past the 1080 Ti? The Titan Xp is a heck of a card; I’m wondering why it wouldn’t cream a 1080 Ti, since they’re based on the exact same technology.
I highly recommend people check out these results! This gives a lot of information that Pix4D has not provided to us in terms of comparing newer hardware.
https://www.pugetsystems.com/labs/articles/Pix4D-CPU-Comparison-Coffee-Lake-vs-Skylake-X-vs-Threadripper-1084/
https://www.pugetsystems.com/labs/articles/Pix4D-GPU-Comparison-GeForce-Titan-and-Quadro-1085/
I found these because @Bill George posted them on another thread. It’s crazy: there really isn’t much benefit in going to the higher-end cards. I was ready to spend on the Titan series, but now I’ll just wait for the GTX 2000 series, hold onto my TR until TR2 comes out (clock speeds are supposed to go up nearly 20%, per AMD at CES), and spend my money on more/better memory!
I ran the data from ary sanjaya’s lembongan.zip on my computer and got 434 seconds per the Quality Report in Pix4D v4.1.23.
CPU - Threadripper 1950X
CPU Cooler - Enermax liquid cooler
GPU - GeForce GTX 970
RAM Amount - 64 GB (8 × 8 GB)
RAM Speed - DDR4-3600
Hard Drive or SSD - 960 M.2
Motherboard Model - ASUS
Operating System - Windows 10 Pro
This was a test swapping the two Titan XPs out for a single GTX 970. Very interesting!
Default options
Has anyone ever tried the Tesla cards? They can be had cheap on eBay right now. I’m not sure which models line up with the GeForce cards; just curious whether they might really speed things up.
Thanks Sterling. I was speaking of the Tesla cards; do they perform the same? Those benchmarks cover Quadro cards, and spec-wise the Tesla cards are much beefier.
Marcus, simply compare CUDA cores vs. price on virtually any Nvidia video card. I suppose if you can find a $10,000 Tesla card for $500 then you might have something better than a GeForce card…might…
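That cores-vs-price rule of thumb is easy to turn into a quick script. A minimal sketch: the CUDA core counts below are the published specs for those cards, but the prices are assumptions I’ve plugged in purely for illustration, so swap in whatever you can actually find them for:

```python
# Rough CUDA-cores-per-dollar comparison. Core counts are the published
# specs; the prices are illustrative assumptions, not current street prices.
CARDS = {
    # name: (CUDA cores, assumed price in USD)
    "GTX 1070":    (1920, 450),
    "GTX 1080 Ti": (3584, 750),
    "Titan Xp":    (3840, 1200),
    "Tesla V100":  (5120, 9000),
}

def cores_per_dollar(cards):
    """Return cards sorted by CUDA cores per dollar, best value first."""
    return sorted(
        ((name, cores / price) for name, (cores, price) in cards.items()),
        key=lambda item: item[1],
        reverse=True,
    )

if __name__ == "__main__":
    for name, value in cores_per_dollar(CARDS):
        print(f"{name:12s} {value:.2f} cores/$")
```

With those assumed prices, the GeForce cards come out way ahead of the Tesla on cores per dollar, which is the point: unless that $10,000 card really is $500, the math rarely favors it.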
Unfortunately the crypto-mining community has doubled the price of GTX cards, if you can even find any, so there honestly aren’t any good options. :(
So I’ve been reading everyone’s scores and wondering why mine is so slow. Granted, I do have the Titan XP installed, but unchecked for processing. I removed the Titan XP from the system and I’m seeing about a 10x increase in speed. It’s incredible: what was taking days now takes an hour.
From this article: https://www.pugetsystems.com/labs/articles/Pix4D-GPU-Comparison-GeForce-Titan-and-Quadro-1085/#TitanandQuadroResults
“After searching online to see if anyone else had run into this, we found a thread on the Pix4D support pages about the very same thing. There, a solution had been given for Quadro cards: go into the NVIDIA Control Panel, to the Manage 3D Settings section, and select “3D App - Game Development” from the drop-down menu. Once that change is applied, Quadro cards suddenly work great. We tested this with both the Quadro GP100 and P6000, with these results (presented in the same format as the previous GP100 vs GTX 1080 Ti chart)”
Try changing that setting.
Sterling, there is no such setting for the Titan XP in the Nvidia Control Panel. I had read that article previously and assumed unchecking the Titan XP in the processing options would fix the issue, but it doesn’t. The card has to be physically removed from the system or you will see massive performance penalties. I re-ran an old project and it completed in an hour rather than the usual 9–10 hours. There really should be a bulletin warning users to remove any Titan cards from their system. I have two 1080 Ti cards, and having the Titan in the system with them slowed things down to the point that it was as if I had no 1080 Ti cards at all.
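If physically pulling the card ever becomes impractical, one thing that might be worth trying is hiding the Titan from the CUDA runtime entirely with the standard `CUDA_VISIBLE_DEVICES` environment variable before launching Pix4D. Unchecking a card in the processing options still leaves it enumerable; this hides it from CUDA altogether. I haven’t tested this with Pix4D, so treat it strictly as a sketch; the device index and install path below are assumptions (check yours with `nvidia-smi -L`):

```python
# Sketch: launch Pix4D with one GPU hidden from the CUDA runtime.
# Untested with Pix4D; device indices and the install path are assumptions.
import os
import subprocess

def env_without_gpu(hidden_index, total_gpus):
    """Build an environment exposing every CUDA device except one."""
    visible = [str(i) for i in range(total_gpus) if i != hidden_index]
    env = dict(os.environ)
    env["CUDA_VISIBLE_DEVICES"] = ",".join(visible)
    return env

if __name__ == "__main__":
    # e.g. three cards installed, Titan enumerated as device 0 (assumption)
    env = env_without_gpu(hidden_index=0, total_gpus=3)
    print(env["CUDA_VISIBLE_DEVICES"])  # -> "1,2"
    # Hypothetical install path -- adjust to your machine:
    # subprocess.run([r"C:\Program Files\Pix4Dmapper\pix4dmapper.exe"], env=env)
```

Whether hiding the device actually avoids the slowdown depends on where the penalty comes from, so physical removal remains the only confirmed fix.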
Interesting. I thought the solution worked for both cards; I wasn’t aware that fix only works for the Quadro cards. Also, the fact that one Titan card could cause that issue is disturbing.
From what I see, Pix4D still isn’t really optimized across many systems. The lack of OpenGL support and this issue really show that whoever is building Pix4D is focused on GTX-series cards with Intel processors and little else. The program seems to be much more of a work in progress than some of the other HEDT software I use from companies like Adobe.
I recently built a Ryzen 7 2700X PC:
CPU - Ryzen 7 2700X
CPU Cooler - Noctua
GPU - GeForce GTX 1060
RAM Amount - 32 GB (2 × 16 GB)
RAM Speed - DDR4-3000
Hard Drive or SSD - 960 M.2
Motherboard Model - ASUS ROG STRIX
Operating System - Windows 10 Pro
AG Timber = 21:46
Lembongan = 10:43
Recently purchased a new machine:
HP Workstation Z640 - MT - 4U
1 × Xeon E5-2650 v4 / 2.2 GHz
RAM 16 GB - SSD 256 GB - HP Z Turbo Drive
NVIDIA Quadro M4000 - GigE - Win 7 Pro 64-bit
I ran the lembongan island project using “Standard 3D maps” with all standard settings. Total time: 652 seconds.
Also completed the Timber AG project using the Standard AG Multispectral. Total time: 1687 seconds.
Finally, the Timber AG RGB using the advanced “Ag RGB” template. Total time: 5464 seconds.
Xeon and Quadro are very slow…
Thanks Adam. Yes, it does not seem as fast as I would have liked, but I am only comparing my results to the laptop I was previously using, and it is a lot faster than that.
Our IT guy in the company purchased this machine for processing based on what he read here and Pix4D’s recommendations, but he has a couple of questions which I have no idea about. If anyone can help:
- The log file lists the load percentage for RAM and CPU. Given the program seems to be GPU-intensive, why does Pix4D not have another log column showing GPU load?
- The Z640 workstation has the ability to run two CPUs. Is Pix4D capable of utilising a second processor?
- The Z640 has approximately 20 options for high-end GPUs, and some of those options allow for two. Is Pix4D capable of utilising a second Quadro M4000? Two of these can be installed in this particular model.
Appreciate any feedback.
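On the missing GPU-load column: until Pix4D logs it, you can capture GPU utilization yourself during a run by polling `nvidia-smi` (the `--query-gpu` flags below are standard nvidia-smi options). A minimal sketch; the sample output in the comments is made up for illustration:

```python
# Sketch: sample per-GPU utilization by polling `nvidia-smi`, so it can be
# lined up against Pix4D's own RAM/CPU log. Requires nvidia-smi on PATH.
import subprocess

def parse_utilization(csv_text):
    """Parse `nvidia-smi --query-gpu=index,utilization.gpu
    --format=csv,noheader,nounits` output (e.g. "0, 87\n1, 12")
    into a list of (gpu_index, utilization_percent) tuples."""
    rows = []
    for line in csv_text.strip().splitlines():
        index, util = (field.strip() for field in line.split(","))
        rows.append((int(index), int(util)))
    return rows

def poll_once():
    """Query nvidia-smi once and return parsed utilization per GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=index,utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_utilization(out)

# Usage while a Pix4D job is running, e.g. every few seconds in a loop:
#   for gpu, util in poll_once():
#       print(f"GPU {gpu}: {util}%")
```

Timestamping each sample and writing it to a CSV alongside the Pix4D log would answer the "is the GPU actually loaded?" question step by step.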
Well, for the money, that is the wrong machine to dedicate to Pix4D. But it will work if you run GeForce instead of Quadro. Two CPUs might help, but I doubt it, as the most intensive CPU work is single-threaded.
You need raw CPU clock speed and more CUDA cores in the GPU… multiple GPUs are a waste unless you’re doing 100 MP pictures.
Also, be careful of too much RAM, though its use can be restricted in the Pix4D settings for smaller projects. Hardware differences like RAM and GPU will affect the results.
Totally agree with Adam. I’m confused why Pix4D doesn’t utilize multiple cores and threads more often: most steps in Pix4D use just a single thread, which wastes resources! Other apps use resources much more efficiently, especially CPU cores; they can even utilize AMD GPUs, and the results are much faster… I think Pix4D’s R&D must make big improvements to resource management in a short time, or they will lose out to the competition…
https://www.youtube.com/watch?v=Z1M5wZSgaGE