XCVG
ModRetro Legend
ttsgeb said: I'd like to add that I was running 2.5 MP off Intel integrated graphics and still being able to game. Because of this, I have a hard time believing you would need a dual 780 Ti setup to run less than 4x the pixels (8 MP). The logic just doesn't work. Even if you claim an exponential relationship, it doesn't add up.
I really question this, unless you're either okay with low framerates or low settings. But my only experience with Intel graphics in recent memory is the horrific HD 3000 that I briefly used when my GPU died.
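For reference, the pixel-count comparison in the quote above can be checked with quick arithmetic. This is a back-of-envelope sketch; the "2.5mp" figure is assumed to correspond to roughly 2048x1152, since the exact resolution wasn't stated:

```python
# Back-of-envelope check of the pixel-count comparison quoted above.
# Assumption: "2.5mp" is taken as roughly 2048x1152; 4K UHD is 3840x2160.
low_res = 2048 * 1152          # about 2.36 megapixels
uhd = 3840 * 2160              # about 8.29 megapixels

print(uhd / 1e6)               # megapixels at 4K UHD
print(uhd / low_res)           # ratio is about 3.5x, i.e. "less than 4x"
```

So the "less than 4x the pixels" claim holds for that assumed starting resolution, though rendering cost doesn't always scale linearly with pixel count.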
samjc3 said:
XCVG said: Even a single 4K monitor requires two Titan/780 Ti/290X to drive effectively.
I don't believe people who say this. I do a solid 80% of my gaming at 3500x1920 on a single 770 2GB. I can certainly see where I'd want more power and RAM to go up to full 4K, but it seems a single Titan or two 770s would handle it just fine. Of course, I don't spend much time worrying about max settings and high AA; for most of the games I play I run high or better settings with no AA, and Source games maxed all the way up with 16x AA.
As far as people complaining about bezels, I don't really have an issue with them. Certainly they're less than ideal, and my new build will be designed to minimize them (as opposed to the solid 1" panel gap I currently have), but with proper bezel correction they're really not a big deal. I find the biggest issue with surround gaming is the number of games that simply won't run, or have issues running, at such a resolution. There are additionally some system stability issues, which is part of why I want to build a dedicated surround rig and make this one a general-purpose computer for the internet and for games that need to run on a single display.
If you've got a 4K display, you're going to want to max the quality settings. If you're willing to drop the quality to make it playable, then why bother with a 4K display? I know that if I go from 1080p to 4K and have to drop from ultra to high or even medium (which is what it takes to make some games playable), I consider that a loss. Likewise, if you step up in resolution and your experience goes from buttery-smooth (preferably 60 FPS, though I've heard 40+ is good enough) to much less smooth (30 FPS is generally considered the minimum playable; some can deal with 20+, but I can't), I consider that a loss.
I am quite sensitive to flicker and such, though, so low framerates might bother me more than others.
If you want to see numbers, HardOCP reduces settings until the game is playable, while HEXUS turns the settings up, but not all the way up. This Tom's Hardware article is kind of old, and I think they ran at high but not maxed settings.
I tend to find benchmarks somewhat pessimistic: you can generally push a little more out of a card than the numbers say and still remain subjectively playable.
I'm also running a 770, and it can barely push Metro 2033 on Ultra at 1920x1080.
Bezels, no, I know they're not that much of an issue in general, they just bother me personally. Call it a kind of OCD quirk.
samjc3 said: All that said, what really needs to happen is AMD or Nvidia needs to release a modern card with the port layout from the Eyefinity 6 cards, that is, six Mini DisplayPorts per card. Because then you could run 4-way CFX/SLI with 24 iPad displays, three tall by eight wide. A 16384x4608 wraparound display? Yes please.
I think they should too, but for other reasons. Six DP1.2 outputs with MST would be very useful for digital signage and presentations. 24 iPad displays is awesome, but very impractical for most people, even most enthusiasts.
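For what it's worth, the wraparound figure in the quote checks out. A quick sketch of the arithmetic, assuming each panel is an iPad 3rd-gen display at 2048x1536:

```python
# Arithmetic behind the hypothetical 24-display wraparound wall quoted above.
# Assumes each panel is an iPad 3rd-gen display at 2048x1536.
panel_w, panel_h = 2048, 1536
cols, rows = 8, 3                    # eight wide, three tall (24 panels)

total_w = cols * panel_w             # 16384
total_h = rows * panel_h             # 4608
print(f"{total_w}x{total_h}")        # prints 16384x4608, matching the post
print(total_w * total_h / 1e6)       # roughly 75.5 megapixels total
```

That's around nine times the pixel count of a single 4K display, which is why it's impractical for anything short of signage or a dedicated showpiece rig.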
I got distracted and this post turned into a rambly mess, sorry.