Quadro vs GeForce

I have done some tests with a Quadro K5200 and a GeForce Titan X (Pascal) in my dual Xeon HP Z840 workstation.

So it seems that because the NVIDIA Control Panel allows you to set ‘3D Game Development’ mode for Quadro but not for GeForce, processing times are better with the Quadro despite it being a lower-spec card.

When using a GeForce card, there is no option in the NVIDIA Control Panel to select a preset mode. When using a Quadro, there is. And it’s this mode that Pix4D recommends using with Quadro.

Now I read somewhere that GeForce and Quadro are virtually the same hardware; it’s the drivers that switch capabilities on and off.

Does anyone have any similar experiences?

So if I were to build a new machine based on the Intel i9 Skylake CPU, Quadro would be the way to go…

Any thoughts?


I just posted to your query in the other thread in favor of GeForce, but your comments here are interesting.  I’m going to have to do some more digging on my end.

We saw big improvements going from a K4000 to a GTX 1080, but that was without knowing anything about this ‘game development mode’.

Thanks for the post.

There must be something else going on because the Quadro cards are inferior to the GeForce cards…it is all about the CUDA cores for Pix4D.

The Quadro cards have preset profiles under 3D Settings. Pix4D officially announced that selecting the ‘3D App - Game Development’ setting solved a lot of the slowdowns people were experiencing.

It’s all about the drivers…

Take a look at Step 1 times in this thread: https://community.pix4d.com/t/3575-Desktop-vs-Cloud-Comparison

 @Philip, Thomas, and Adam, we are investigating what options are available to mitigate the extended processing time for Step 3 with the NVIDIA GeForce GTX Titan X Pascal. Please stay tuned for an update.


I have been doing a lot of testing with Pix4D on different hardware (various CPUs and GPUs) and have found very odd performance on Titan and Quadro cards. I came here to search for similar reports, and found this thread, so I wanted to post some info from my tests.

All the GeForce cards I have tested - so far, the GTX 1060, 1070, and 1080 Ti - perform very similarly, despite vast hardware / spec differences. 

On the other hand, the Titan Xp, Titan V, and Quadro GP100 also perform about the same as each other… but FAR, FAR worse than the GeForce cards. This is not limited to Step 3 either: Steps 1, 2, and 3 are all between 4 and 20 times slower on these cards compared to the GeForce models. And these are not low-end / low-spec Quadro cards either - they should all be at least as fast as the GTX 1080 Ti in terms of the number of CUDA cores, the amount of onboard VRAM, etc.

I can provide even more details if needed, and I still plan to test a couple more GPUs (dual 1080 Ti and a Quadro P6000).

Pretty similar to my tests, except I saw a decent improvement in processing time going from a 1060 to a 1080 Ti. Pix4D is simply optimized for GTX cards and desktops… bummer for all those people spending $20k to $40k on computers when $6k gets you max speed.

How much variance did you see between the 1060 and 1080 Ti, if I may ask? I’m seeing about a 10-15% reduction in Step 1 processing time, no impact on Step 2, and a very slight (1-5%) reduction in Step 3.

As for the Titan and Quadro cards not working well, I could certainly see them not showing any improvement over a card like the GTX 1080 Ti… but to be so much slower? That smacks of something awry within Pix4D. The Titan Xp is nearly a GeForce card itself - previous Titan cards were marketed under that name, and it shares the same drivers and architecture. In almost all other applications it matches or slightly outperforms the GTX 1080 Ti, but in Pix4D it is suddenly many times slower. I’m 99% sure that something is wrong, but I am not sure what.

Oh, I should add something that I noted: with the GeForce cards, I can hear the fan on the card spin up and down depending on how heavily it is working. However, with the Titan cards and the Quadro GP100 the fan *never* increased in speed (enough for me to hear, at least). This indicates to me that those cards are never being stressed in the same way as the GeForce models, which could explain the poor performance. I was able to confirm within Pix4D (both the settings and the log files) that the Titan and Quadro cards were detected as CUDA devices and supposedly being used.
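For anyone who wants to check this more directly than by listening for the fan, here is a rough little Python wrapper around nvidia-smi that polls GPU load, memory use, and fan speed while a step runs. It is only a sketch - it assumes nvidia-smi is on the PATH (it is installed with the standard driver), and fan.speed can come back as [N/A] on some boards.

```python
import subprocess
import time

# nvidia-smi query fields: GPU name, load, memory in use, and fan speed.
QUERY = "name,utilization.gpu,memory.used,fan.speed"

def poll(interval_s: float = 1.0) -> None:
    """Print one CSV line per installed GPU, once per interval."""
    while True:
        result = subprocess.run(
            ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        print(result.stdout.strip())
        time.sleep(interval_s)

if __name__ == "__main__":
    poll()  # leave this running while Pix4D processes Step 1, 2, or 3
```

If a card really is being used for CUDA work, utilization.gpu should climb well above idle during the GPU-accelerated parts of each step.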

I have been notified that NVIDIA made a subtle change to the GeForce drivers from 15 Nov 2017. This could explain a massive deterioration in processing time. The matter is still under investigation.

From my side, my 1080 Ti works very well in my i9 7900X Skylake X configuration. I have put the Quadro back in the HP Z840 and configured it for 3D Game App settings, which works well. The Titan X (Pascal) is on the shelf gathering dust… anyone want it?


Bill, I noticed about the same thing and thankfully I never paid the premium to try a Titan.

Philip, may I ask what Quadro card you have in the Z840? I’ll have to try out that 3D Game App setting the next time I test on a Quadro. I am planning to run the P6000 through Pix4D this weekend, so I am curious to see if it behaves differently depending on how that configuration option is set.

I could also go back and try GeForce / Titan drivers from before the 15th of November. Thankfully, NVIDIA does allow access to older driver revisions. :)
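Before rolling back, it is handy to record which driver is currently installed. Another minimal sketch, again assuming nvidia-smi is on the PATH:

```python
import subprocess

# Print the installed NVIDIA driver version and the name of each detected GPU.
result = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version,name", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # one "driver_version, name" line per GPU
```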

Bill - it’s an old K5200. The difference when using 3D Game App is like night and day.

I read somewhere that most of these high-end NVIDIA cards are pretty much the same hardware/chip components, whether Quadro or GeForce. It’s the drivers that allow NVIDIA to maintain separate market streams with considerably different price tags.

What puzzles me is how the Titan and GeForce can behave so differently when they both run on the Pascal architecture!

Yes, Philip, under the hood the GeForce, Titan, and Quadro cards are all based on the same chips. That is why I am extremely confused about the GTX 1080 Ti and Titan Xp being so different, as both use the GP102 chip inside. Moreover, they even use the same drivers! The Quadro P6000 I will be testing this weekend also uses GP102, but different (Quadro series) drivers.

What you can do is actually disable the card for Step 3. In Device Manager, you can right-click on the display adapter and turn it off.

When you run Step 3, you will notice Load Point Cloud runs normally at the correct speed.

This was suggested to me by support whilst they continue the investigation, and it worked when using the Titan.
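If anyone wants to script that toggle instead of clicking through Device Manager each time, something along the lines of the sketch below should work. It is only a sketch: it assumes a recent Windows 10/11 build where pnputil supports /enum-devices, /disable-device, and /enable-device, it has to be run from an elevated (administrator) prompt, and the device instance ID shown is a placeholder you would replace with the one pnputil reports for your Titan/Quadro.

```python
import subprocess

# List display adapters so you can copy the instance ID of the card to disable.
subprocess.run(["pnputil", "/enum-devices", "/class", "Display"], check=True)

# Placeholder instance ID - replace with the one reported above for your card.
DEVICE_ID = r"PCI\VEN_10DE&DEV_XXXX\..."

def set_gpu_enabled(enabled: bool) -> None:
    """Disable the adapter before Step 3 and re-enable it afterwards."""
    switch = "/enable-device" if enabled else "/disable-device"
    subprocess.run(["pnputil", switch, DEVICE_ID], check=True)

# Example:
# set_gpu_enabled(False)   # turn the card off
# ... run Step 3 in Pix4D ...
# set_gpu_enabled(True)    # turn it back on
```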


NVIDIA changes the way these chips run from card to card for different specializations. As has been noted, by adjusting the 3D gaming setting the Quadro can be configured to fit Pix4D better than NVIDIA’s default setup.

While the base hardware is the same, they change other things so they can market to different industry segments. The best option for Pix4D is a single GTX 1080 Ti… SLI helps a bit but is certainly not worth the money.

I really wish a Xeon configuration would perform at least equal to a Core i9 setup as my current work needs more than 128GB of RAM…13 clusters just doesn’t make a good point cloud.

Okay, I have confirmed on a Quadro P6000 that at default settings its performance is horrible… just like the Titan Xp & V, as well as the Quadro GP100. However, per Philip’s suggestion, when you go into the NVIDIA Control Panel, to the Manage 3D Settings section, and select “3D App - Game Development” from the drop-down menu… then performance is great! It isn’t better than a GTX 1080 Ti, mind you, but it is within a percent or two of that card. Given how much more expensive the P6000 is, I wouldn’t go out of my way to get one for Pix4D - but if you need a Quadro for other applications, at least this workaround will avoid it causing horribly slow performance.

Additionally, in response to your comment Adam, there are a couple of options you might want to consider:

  1. Some X299 motherboards allow use of registered memory (RDIMMs) alongside the Core i9 processors, allowing for 256 or potentially even 512GB of memory to be installed. Getting that much memory isn’t cheap, but if you need more than 128GB it is an option in the right configurations. Check out this link to see what I mean: http://puget.systems/go/147485

  2. There are also some newer Xeon processors, the Xeon W series, which are based on the same tech as the current generation of Core i9 processors. They are a lot more expensive than the Core i9 chips, for effectively the same performance (or very, very close to the same) - but if you really want to stick with a Xeon, that could be something to look into. The model that is the equivalent of the Core i9 7980XE (the highest core count i9) is the Xeon W 2195: https://ark.intel.com/products/126793/Intel-Xeon-W-2195-Processor-24_75M-Cache-2_30-GHz

Awesome, thanks Bill, I will check out the high-end X299 boards.  I have no desire to go Xeon but the Gold 6144 is pretty darn close to the Core i9 in speed.

A Xeon and Quadro setup can get close to a Core i9 system, but it is twice the price, which is why I tell everybody on this forum not to waste their money. And doing this full-time usually means the system(s) are dedicated to Pix4D only.