The GetDPI Photography Forum


Dual Graphics Cards

sc_john

Active member
Is there a benefit to dual graphics cards in the following scenario:

- MacBook Pro, 2.5GHz (or 2.8GHz) quad-core Intel Core i7 processor, 16GB RAM, 512GB SSD
- Files processed: medium format (60MP)
- Some stitching... probably 100MP max
- Photo editing only... no movies, no gaming
- Editing done in Capture One (V9.x) and PSCC
- Max file size in PSCC w/ layers ~ 1GB

I am posting in Medium Format given the files processed. If it belongs in Gear Garage or another area, moderator please move as appropriate.

Thanks in advance,

John
 

kdphotography

Well-known member
If your video card is up to date (decent chipset) and has plenty of memory, you should be fine. I run an additional video card only because I am running additional monitors.

Ken
 

JeRuFo

Active member
I don't see a benefit here. The most important factor in the scenario you are sketching is probably sheer processor speed (both CPU and GPU); more cores and processors are mostly useful in batch processing. Read/write speeds can also be a bottleneck, but Apple is quite good at that. Most modern graphics cards Apple offers are probably up to the tasks photo editing software gives them, though a lot of applications do use the GPU for more and more tasks.
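For intuition on why batch work is where extra cores pay off, here is a minimal Swift sketch; `processRAW` and `fileURLs` are hypothetical placeholders, not anything from Capture One or Photoshop:

```swift
import Foundation

// Minimal sketch: batch exports parallelize well because each
// file is independent of the others.
func processRAW(_ url: URL) {
    // ...decode the raw file, apply adjustments, write the output...
}

let fileURLs: [URL] = []  // e.g. the contents of a shoot folder

// concurrentPerform spreads the iterations across the available cores,
// so wall-clock time drops with core count until disk I/O saturates.
DispatchQueue.concurrentPerform(iterations: fileURLs.count) { i in
    processRAW(fileURLs[i])
}
```

Interactive editing, by contrast, is one adjustment at a time, which is why a fast single core matters more there.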
 

Boinger

Active member
None of our photo programs benefit from dual graphics cards. No production environments really do unless you are doing lots of math, and I mean pure number crunching.

SLI only has a direct benefit in gaming, and even then only if the game is coded for it. As it stands right now, I would actually like our graphics programs to use more graphics processing power than they currently do. At the moment they barely scratch the surface.

For example, I have a Xeon 2697 v3 (12-core, 24-thread chip) and an NVIDIA Titan X with 64GB of RAM and SSD scratch disks, which I allow to use around 100GB of cache.

On my system, when I am editing large-megapixel pictures, I still find all photo programs to be too laggy. Photo programs simply aren't built to take advantage of all that power.
 

Wayne Fox

Workshop Member
While the MacBook Pro has two graphics cards, I don't believe an application can leverage both at the same time. The integrated card, which isn't nearly as powerful, is used by applications that don't require the power, to save on battery (it's possible its GPU isn't even available to running applications). The discrete card is used by applications that either require GPU power or request it (I'm not sure how it is determined). This seems to be decided on an application-by-application basis; you can tell which application is using which card in the Energy pane of Activity Monitor.
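For the curious, recent versions of OS X do expose both cards programmatically through Apple's Metal API; a minimal sketch, assuming a Metal-capable dual-GPU machine:

```swift
import Metal

// macOS exposes each GPU as a Metal device. On a dual-GPU MacBook Pro
// this prints both cards; `isLowPower` is true for the integrated one.
for device in MTLCopyAllDevices() {
    print(device.name, device.isLowPower ? "(integrated)" : "(discrete)")
}
```

Whether an application actually submits work to both devices is up to the application, which matches the per-application behavior visible in Activity Monitor.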
 

Christopher

Active member
Boinger said: "None of our photo programs benefit from dual graphics cards. ... Photo programs simply aren't built to take advantage of all that power."

I could be wrong, but as far as I understand, C1 does use ALL graphics cards available. If one uses two identical fast cards, it's a big speed bump (not double, but close). It certainly requires that the whole system is snappy.

Perhaps somebody can chip in with more experience, but that is what I have learned in recent weeks on the Phase One forum.

It is correct that Adobe's photo programs don't really use modern hardware well... or let's say it could be much better.
 
Generally speaking, you want the fastest single GPU before considering two of them, because while a lot of programs these days are crossing over to GPU acceleration, very few can use more than one effectively.
If you were working with some form of 3D graphics or 4K video there would be a clear benefit, but as it is, it's not a good return on whatever you'd end up spending.
Same deal on the CPU side of things: the fastest quad-core is generally a better investment than slower chips with more cores or even dual CPUs. You can overclock some 6-core chips to run just as fast, though.
 

Boinger

Active member
Christopher said: "I could be wrong, but as far as I understand, C1 does use ALL graphics cards available. ..."

I can't see any technical info from C1 on how they use graphics cards, but it's pretty safe to assume dual GPUs are not supported.

Let me put it this way: we have had access to dual- and multi-core CPUs for ages now, yet most photo editing programs can't handle multi-core support very well and generally prefer a single fast core for most actions. Utilizing two GPUs would be even more complicated for them.

In a sense they will use both GPUs if they "share" the graphical load. But you have to look at the efficiency per dollar of dual GPUs versus a single fast GPU, and 90% of the time the single fast GPU wins. In my own system I went for a single Titan X. Sure, I could slap in another Titan X, but I would not gain much power for double the cost; maybe a 20%-30% bump. That also assumes the software is coded to utilize dual GPUs well, which I suspect C1 is not.

Dual-GPU setups were also primarily designed for 3D gaming, for faster rendering and frame rates. Yet not even most games can really make good use of them.
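To put rough numbers on that efficiency-per-dollar argument (using Boinger's own 20%-30% estimate, so this is illustrative arithmetic, not a benchmark): if one card costs C and delivers performance 1.0, then

\text{single: } \frac{1.0}{C} \qquad \text{dual: } \approx \frac{1.25}{2C} \approx 0.63 \cdot \frac{1.0}{C}

so the second card cuts performance per dollar by roughly 40%, before even asking whether the software scales at all.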
 

Christopher

Active member
C1 has their own GPU benchmark, and it shows pretty well how fast a GPU is in THEIR program. It also shows it can use more than one GPU, as long as the rest of the system makes sense. It's also no secret that AMD GPUs work much better for C1 than NVIDIA GPUs, especially considering how affordable they are.

I'm still working on my new workstation and it won't be finished for a few weeks. At that time I might be able to test this in more detail.

For the time being, look at the Phase One forum. There are a few topics discussing it.

Keep in mind I am ONLY talking about C1. Dual GPUs won't do anything for Adobe photo products.
 

Paul2660

Well-known member
+1 on C1 and its problems with NVIDIA GPUs. I have given up on a fix for the "error when processing a file" issue with the GTX 970/980 lineup.

Paul Caldwell
 

Wayne Fox

Workshop Member
The discussion on graphics cards is enlightening and informative.

However, the original question related to one specific configuration, a MacBook Pro. While a MacBook Pro does have two graphics cards, they are not two in the sense of a desktop computer. Each application on the MacBook Pro accesses one of the two cards. The OS prefers the integrated card, which isn't very powerful, so applications that can benefit from the GPU, such as C1, Lr, and Ps, are allowed to use the discrete card, which has a much more powerful GPU. You can also force the MacBook Pro to always use the higher-powered card in the Energy Saver panel, but the machine seems to use sensible enough logic that it's better to leave it alone; Activity Monitor shows it allocating each application to one of the two cards based on logical and effective criteria.

I do not believe a single application can access and leverage both GPUs at the same time on a MacBook Pro, and even if it could, the integrated GPU probably isn't powerful enough to be a significant benefit.
 

Boinger

Active member
Wayne Fox said: "While a MacBook Pro does have two graphics cards, they are not two in the sense of a desktop computer. ... I do not believe a single application can access and leverage both GPUs at the same time on a MacBook Pro."

Oh wow, I totally missed the part about the laptop. LOL

As soon as I read "dual GPU" I immediately thought desktop, but yes, this changes the discussion entirely.

Laptops simply do not have dual GPUs; the "integrated GPU" is not really a GPU, it is more just a display card. Prior to the days of integrated GPUs you had to put in some kind of display transport card (not necessarily a 3D GPU, but some kind of display card) to send the signal out to your monitor. This was done off-CPU with a separate card. These days, integrating the display output on the CPU lets the system be more compact, with one less component to add. If you had neither the integrated GPU nor a 3D GPU, you would have no picture.

So even though the integrated GPU can do some 3D processing, I would not count it as a 3D processor at all.

You would definitely want to get the MacBook Pro with the discrete GPU. But don't mistake it for a dual-GPU setup, as it will not function like that.
 
Ah, I've also missed the laptop bit.
Wayne Fox said: "I do not believe a single application can access and leverage both GPUs at the same time on a MacBook Pro, and even if it could, the integrated GPU probably isn't powerful enough to be a significant benefit."

You're right. This is a fairly new concept that's only starting to be exploited with the latest graphics APIs, and even then, it is said that if the performance differential between the two pieces of hardware is great enough, using both can have a negative or negligible effect on performance. It's probably best left as it is today: have the integrated solution render the system-level graphics, and do all the heavy lifting with the discrete GPU. In a certain sense you're already getting better performance from the GPU, since it's not having to render 5MP worth of system graphics on top of whatever else it's doing.
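A back-of-the-envelope way to see the performance-differential point (the symbols are illustrative, not from any vendor documentation): if the discrete and integrated GPUs process work at throughputs t_d and t_i, then with an ideal split of independent work the best-case speedup over the discrete card alone is

S = \frac{t_d + t_i}{t_d} = 1 + \frac{t_i}{t_d}

so an integrated chip one fifth as fast caps the upside at 1.2x, and any synchronization or scheduling overhead eats into that 20%.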
 

Boinger

Active member
Unless I am mistaken and there has been some new development I am unaware of since my days of system building, you cannot use both graphics cards at the same time.

It is one or the other. So if your laptop has a discrete GPU, it doesn't really ever turn off; either you use the integrated or the discrete, not both.

When people talk about dual GPUs they mean SLI or CrossFire, and for that to work the cards have to be of identical spec, not different cards.
 

lkuhlmann

Member
Wayne Fox said: "While a MacBook Pro does have two graphics cards, they are not two in the sense of a desktop computer. ... I do not believe a single application can access and leverage both GPUs at the same time on a MacBook Pro."

I happen to know how C1 works internally, and I can confirm that both the Intel GPU and the NVIDIA GPU (or, on the newer models, the AMD GPU) are used for interaction and for processing of images. This is true for all newer MacBook Pros, and plenty of other computers.

For batch processing, the CPU is also utilized fully at the same time.
In fact, we tested a PC where we had 4 Tesla GPU cards running in parallel and got linear performance gains.
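As a hypothetical sketch of that "use every GPU" idea (NOT Capture One's actual implementation, and using Metal where the Tesla test would have used OpenCL or CUDA): give each device its own command queue and round-robin independent images across them.

```swift
import Metal
import Foundation

// Hypothetical sketch only: one command queue per GPU, with
// independent images distributed across the queues.
// `encodeWork(for:on:)` stands in for the real per-image kernels.
let devices = MTLCopyAllDevices()
let queues = devices.compactMap { $0.makeCommandQueue() }
precondition(!queues.isEmpty, "no Metal-capable GPU found")

func encodeWork(for imageIndex: Int, on queue: MTLCommandQueue) {
    // ...create a command buffer, encode kernels for this image, commit...
}

let imageCount = 64
DispatchQueue.concurrentPerform(iterations: imageCount) { i in
    // Independent images plus per-GPU local memory are what make
    // near-linear scaling plausible here.
    encodeWork(for: i, on: queues[i % queues.count])
}
```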

Look here for Mac performance tests (Capture One 8 was used for the test):

https://macperformanceguide.com/blog/2014/20140421_1-MacPro2013-PhaseOne-CaptureOnePro.html

-Lionel
 

Boinger

Active member
lkuhlmann said: "I can confirm that both the Intel GPU and the NVIDIA GPU ... are used for interaction and for processing of images. ... In fact, we tested a PC where we had 4 Tesla GPU cards running in parallel and got linear performance gains."

Could you link the test with the 4 GPUs? If the gains are linear, that is great, but that really doesn't seem to follow the laws of parallel computing. Going from 1 core to 2 cores to 3 cores doesn't give you linear gains, so I am curious to see graphics cards do so.
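For reference, the scaling law being alluded to is presumably Amdahl's: if a fraction p of the work parallelizes across n processors, the speedup is

S(n) = \frac{1}{(1 - p) + p/n}

With p = 0.95 and n = 4 this gives S(4) \approx 3.5, about 87% efficiency; gains only look linear as p approaches 1, which it nearly does for independent per-image processing. (These numbers are illustrative, not from Phase One's tests.)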
 

lkuhlmann

Member
Boinger said: "Could you link the test with the 4 GPUs? ... I am curious to see graphics cards do so."

Unfortunately I can't share the data anymore. But in general, GPU processing scales extremely well for general image processing. Obviously it does not scale 100%, but I remember a number close to 95%, and this was for a combined number of GPU processing units on the order of 6,000. It all comes down to using only local memory access.

-Lionel
 