The GetDPI Photography Forum


Monitor recommendations?

erlingmm

Active member
I have had the LG 31MU 4K for a couple of months, driven by a Mac Pro. Great monitor: good calibration, Adobe RGB, decent price. Eizo is way too expensive in comparison, and my last Eizo broke down after 7 years, which I think is too short.
 

Pradeep

Member
The reviews of the LG suggest that calibration is a problem: either it does not stay put or it is very difficult to do. Otherwise it seemed the best option in terms of value for money.

I think it is still a bit early in the 4K-5K world.

I might just have to wait it out a bit longer with my NECs; there's no way I'm spending $6K for the Eizo. I would need two of those since I am so used to a dual-monitor setup. That is a crazy amount of money for monitors!
 

erlingmm

Active member
Do you have a link or two describing the calibration problems on the LG?
 

docmoore

Subscriber and Workshop Member
Having used the 5K iMac myself... the reduction in performance kept me from paying attention to the image quality; when it takes a couple of seconds just to zoom into a 645Z image in Lightroom, it kind of takes you out of the experience. The display concept is good - it's simply paired to an underpowered mobile chip, and it can't be connected to any other computer if you do want more power.
The iMac truly is not strong enough for this audience ... great for home casual use.

The dual D700s are almost mandatory for the size of files in MF photography or video.

The Nvidia GTX Titan X with 12GB VRAM opens and zooms almost instantaneously with my Leica S files in Photoshop ... I assume the dual D700s with 6GB VRAM each would offer a similar level of performance.

Command + is limiting but good enough for text ... at my age it helps.

Bob
 
For CUDA-supported apps, the Titan X will run circles around the D700s. In fact, AFAIK, most apps do not use both D700s, only one. I have a GTX 980 in one of the cMPs and will install a Titan X in the other, as I believe there is nothing more powerful for CUDA-supported Mac apps. The new GTX 980 Ti (6GB) is a great lower-cost alternative to the Titan X. For those who have iMacs, I believe you can actually install the Titan X (and GTX 980s) as an eGPU.
 

Pradeep

Member
In the past week I've read so many reviews and user comments that my head is spinning :loco:

However, I do recall one user who was complaining bitterly about the inability to calibrate the LG. Will try to look for the site. It may just be anecdotal but gives you a bit of pause. Amazon has users waxing eloquent and complaining a lot, but then that's quite typical of everything on Amazon.
 
From what I understand, the D700 cards have better dual-gpu support than the D300 and D500 variants, although you still won't see the full performance of those things in anything outside of Final Cut. The bin-style Mac Pro is basically a Final Cut machine; it's also kind of outdated and overpriced. Whether you want something compact and powerful or big and powerful, desktop PCs are where it's at right now.

The only people I could recommend a Titan X to are 3D graphics artists and animators, since their huge and high-resolution scenes need to fit entirely within RAM, so having 12GB could literally mean twice as much content. In terms of actual speed, the 980 Ti is often faster despite the fewer cores, since less power is wasted on the extra RAM chips and goes into higher clock speeds.
 

Pradeep

Member
There was a time when I used to build my own PCs from scratch. I moved to the Mac about 8 years ago and have lived with that choice. I must say I've been much happier overall since then, but I won't get into the PC vs. Mac argument here.

For me the Mac Pro serves a useful purpose, and if it cannot drive two 5K monitors, so be it. I am not a gamer, and the only application where I need CUDA-capable cards is DVDFab for Blu-ray ripping; my system takes about 30 minutes for a full rip while CUDA-enabled systems take 10-12, as per reports. However, this is hardly a mainstream use for me.

I think I am going to wait a little longer, perhaps the field will get better by the end of the year or early 2016.
 
I believe FCP is the only app that takes advantage of dual D700s; however, AFAIK SLI/CrossFire-style linking is not supported with the D700s. Not only is the nMP (trashcan) outdated and overpriced at the moment, upgrading the GPU is not possible without going the eGPU route. I do love the design, though (along with how quiet it is). I was able to get two 12-core cMPs, with change to spare for badass GPUs that will outgun dual D700s, for less than the price of a 12-core nMP. My machines serve many purposes outside photo editing. The Titan X will go into the cMP that runs DaVinci Resolve.

At the end of the day, I prefer working in OS X and tolerate Windows. Just a personal preference, as I work on both all day, almost every day. Although I must add that Windows 10 has been giving me that warm fuzzy feeling, LOL. When I can no longer upgrade my cMPs to my satisfaction, I will go the desktop PC route if Apple hasn't given me a viable option with the nMP2.

Back on topic (sorta), I found this to be a good resource for those deciding between AMD and nVidia for their GPU needs:

http://create.pro/blog/open-cl-vs-c...upport-gpgpugpu-acceleration-real-world-face/

At the bottom of the article, there is a list of apps that support CUDA/OpenCL. I hope this serves as a good reference for those deciding which way to go.

Alvin
 
I like OS X as much as anyone else on here, but my hardware is getting long in the tooth and nothing in the new Macs strikes my fancy. I'd be willing to go back to Windows if it meant getting 50% faster hardware for 50% less money. If there weren't such a glaring hardware disparity, I wouldn't bring this up; when I originally bought my mid-2010 iMac, it was actually fairly comparable in performance to PCs of the time.

I'm sure December is when Apple will announce the next crop of Mac Pros, but is it really worth waiting for? It'll just be un-upgradable again, with needless use of Xeon CPUs and ECC RAM jacking up the price, and it will be limited by thermal and TDP constraints from being stuffed into such a tiny enclosure. I bet they'll use the new Samsung XP951 SSDs and have a slide about how the storage is 2x faster than the old one, even though it's something you can already buy today.
 

douglaspboyd

New member
Consider Vizio 43" 4K TV for $500 at Costco for photo editing

Hello all,

The time has come for me to get a new monitor - which I have not done for years. I am keen to get something really good that will last me a long time, though am hoping that it will fall in the 'expensive but not quite ridiculous' category.

My criteria are:
Size (27 inches or more)
Resolution (ppi): as much as possible, of course - I believe 109 or more is available?


Hi,

Someone recommended a Dell 27" 4K IPS monitor for $500, and I thought about it. But 27" is too small for 4K: the menus in Lightroom would be too small to read. Yesterday I noticed Costco was selling a Vizio 43" 4K TV for $500. 43" is just large enough to see details on a desktop at 4K resolution, provided you are within 18" or so of the screen. So I picked it up.
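For reference, the pixel-density argument behind "27 inches is too small for 4K" can be worked out directly. A quick sketch (the `ppi` helper and the panel sizes below are just illustrative):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a panel of the given resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

# A 27" 2560x1440 panel is the classic ~109 ppi desktop density.
print(round(ppi(2560, 1440, 27)))  # 109
# The same 27" at 4K is half again as dense, which is why UI text shrinks.
print(round(ppi(3840, 2160, 27)))  # 163
# A 43" 4K TV lands back near ~102 ppi, so menus render at a familiar size.
print(round(ppi(3840, 2160, 43)))  # 102
```

So the 43" TV gives roughly the same on-screen menu size as a conventional 109 ppi desktop monitor, just with four times the working area.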

I thought I would need a GeForce GTX 960 or higher video card for this, but I checked and my old video card supported UHD output over HDMI with custom settings, so I just used that. I had to dial in a custom resolution of 3840x2160 and it worked. You may want to select a lower refresh rate in case your cable or card are not good at 60Hz UHD frequencies. I selected 30Hz on the card.
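The 30Hz choice comes down to link bandwidth. A rough sketch of the arithmetic, using the standard CTA-861 UHD raster of 4400x2250 total pixels (active pixels plus blanking) and the nominal TMDS clock ceilings of HDMI 1.4 and 2.0; the names below are my own:

```python
# Pixel clock needed to drive UHD at a given refresh rate, versus what the
# HDMI link can carry. Totals include horizontal and vertical blanking.
H_TOTAL, V_TOTAL = 4400, 2250          # CTA-861 timing for 3840x2160 active
HDMI_1_4_MAX_HZ = 340_000_000          # nominal max TMDS clock, HDMI 1.4
HDMI_2_0_MAX_HZ = 600_000_000          # nominal max TMDS clock, HDMI 2.0

def pixel_clock(refresh_hz: int) -> int:
    """Required pixel clock in Hz for the full raster at refresh_hz."""
    return H_TOTAL * V_TOTAL * refresh_hz

print(pixel_clock(60))  # 594000000 -> beyond HDMI 1.4, needs HDMI 2.0
print(pixel_clock(30))  # 297000000 -> fits within HDMI 1.4
```

So on an older card or cable that only supports HDMI 1.4 rates, UHD at 30Hz is the highest refresh that fits, which matches the setting Doug dialed in.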

Lightroom looks great on this monitor; the Develop panel now fits on the right side of the screen with no scrolling required. The image is close to the final print size (I generally print at 20x30). This is a real upgrade from my old monitor and a real time saver when processing in Lightroom or Photoshop - far less scrolling around in menus and the picture.

Now you might ask about color, since this is not IPS. The monitor has several color modes, including a Calibrated mode, which looks fine. Of course, fine adjustments can be made at the video card if you want to get into color calibration. But for my purposes I doubt it will be necessary - we'll see.

Anyway, my advice is to consider a really big 4K TV as a monitor when working with 42MP files from the A7RII. The cost is no higher than the tiny 27" monitors that others are recommending.

I suppose this will also be good for 4K video editing, but haven't tried it yet.

==Doug
 
My understanding was that the second card in the Mac Pro was used mainly to enable parallel GPU acceleration at the OS level.
I believe that is true for OpenCL-supported apps (IIRC OpenGL uses only the first card - CrossFire required to use the second). As OpenCL becomes more prevalent (relatively slowly so far, IMO) this will change. There is no CUDA acceleration on the AMD D700s, so neither card applies in this case. The Metal API (and subsequent software support) on El Capitan may be somewhat of a game changer. Let's hope it is.
 