Postby CMDR Clypsis » Sun Jan 03, 2016 7:55 am
Here's my breakdown:
I am an enthusiast-level gamer and have:
Core i7 920 (yes, a 7-year-old proc!)
24GB of DDR3 (max for my mobo)
4x SSDs in RAID 0
2x GTX 760 2GB in SLI
Samsung U28D590D 28" 4K (3840x2160) monitor
Corsair RM1000i PSU
Crammed into Corsair's Carbide Air 540
When I made the jump from 1080p to 2160p/4K, the biggest performance hit I noticed was with system RAM. I only had 6GB at the time, and with the massive textures loaded into memory at 2160p/4K, 6GB wasn't cutting it. I use MSI's versatile Afterburner program to keep after temps and run mild overclocks on my video cards, but it also provides a neat on-screen display (OSD) for video card and system temps, plus memory loads.
Running E:D at 2160p/4K while floating in deep space, both cards sit at about 1.5GB of VRAM used and my system memory load runs 10GB-12GB! Planetside, in combat, or in ring systems, VRAM usage maxes out on both cards and my system memory hits 16GB-17GB!! (If you want to log those numbers yourself, see the sketch below.)
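For anyone who wants to pull the same readings without Afterburner's OSD, here's a minimal sketch of one way you could log them yourself. This is my own illustration, not anything from E:D or Afterburner, and it assumes Python 3 with the psutil package installed and NVIDIA's nvidia-smi tool on your PATH:

[code]
# Minimal sketch: log system RAM and per-GPU VRAM usage once per second.
# Assumes Python 3, the psutil package, and NVIDIA's nvidia-smi on PATH.
import subprocess
import time

import psutil

def vram_used_mib():
    """Return VRAM used (MiB) for each GPU, as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return [int(line) for line in out.strip().splitlines()]

while True:
    ram = psutil.virtual_memory()
    print(f"SYS RAM: {ram.used / 2**30:.1f} / {ram.total / 2**30:.1f} GiB")
    for gpu, mib in enumerate(vram_used_mib()):
        print(f"GPU {gpu} VRAM: {mib / 1024:.2f} GiB")
    time.sleep(1)
[/code]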
Up until Horizons came out, I could run E:D at 2160p/4K with all the graphical eye candy at Ultra, getting 40-110 FPS depending on what was on screen. Now with Horizons at those settings, as soon as a landable planet popped in, my FPS would tank to under 24; it was stop-frame animation. Super scary plummeting towards a planet at over 120 Mm/s with an altitude of only 7km. The CUDA cores of the 760s just can't handle that resolution and run the compute calculations required for the procedural generation of the planet surfaces at the same time.
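To give a rough feel for what that compute work is, here's a toy sketch of fractal value noise, the general family of techniques behind procedural heightmaps. Frontier's actual terrain code isn't public, so every name and number here is made up purely for illustration; it just shows that every point of a planet surface costs real math:

[code]
# Toy sketch of a fractal value-noise heightmap, the general idea behind
# procedural terrain. Purely illustrative; not Frontier's actual method.
import numpy as np

def value_noise(size, cells, rng):
    """Bilinearly interpolate a coarse grid of random values to size x size."""
    grid = rng.random((cells + 1, cells + 1))
    t = np.linspace(0, cells, size, endpoint=False)
    i = t.astype(int)          # coarse-cell index for each output pixel
    f = t - i                  # fractional position inside that cell
    g00 = grid[np.ix_(i, i)]           # the four corner values per pixel
    g01 = grid[np.ix_(i, i + 1)]
    g10 = grid[np.ix_(i + 1, i)]
    g11 = grid[np.ix_(i + 1, i + 1)]
    fx, fy = f[None, :], f[:, None]
    top = g00 * (1 - fx) + g01 * fx    # interpolate along x...
    bot = g10 * (1 - fx) + g11 * fx
    return top * (1 - fy) + bot * fy   # ...then along y

def heightmap(size=512, octaves=5, seed=42):
    """Sum progressively finer, fainter noise layers into one height field."""
    rng = np.random.default_rng(seed)
    height, amp, cells = np.zeros((size, size)), 1.0, 4
    for _ in range(octaves):
        height += amp * value_noise(size, cells, rng)
        amp, cells = amp * 0.5, cells * 2
    return height

print(heightmap().shape)  # (512, 512) height samples for one small patch
[/code]

Stack several octaves of that over millions of surface points, recomputed as you descend through detail levels, and you can see where the GPU time goes.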
Now for some personal insight on the in-game image quality enhancements (IQE) like supersampling, anti-aliasing, and such. Those IQEs were introduced at a time when screen resolutions were low and aliasing (AKA the stair-step effect) was obvious. At 1024x768 on a 19" screen, the individual pixels (the tiny boxes of RGB cells that make up the image you're using to read this post) were roughly 0.38mm across... massive by computing standards, which can cause a "screen door effect" that makes the image you're looking at appear as though you are viewing it through a screen door. Not good.
Which leads to DPI, or pixel density. The pixel density of a 28" UHD screen (like mine) is 157.35 pixels per inch (PPI), compared to 108.79 PPI for a 27" 2560x1440 model, 81.59 PPI for a 27" 1920x1080 monitor, and 67.37 PPI for a 19" 1024x768 monitor (the quick calculation below shows where those numbers, and that 0.38mm pixel size, come from). So the higher the DPI/PPI of a screen, the more tiny pixels are crammed into a small space, and the more all those tiny individual pixels disappear into one smooth, uninterrupted viewing surface.
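If you want to check those PPI figures, the math is just the diagonal resolution in pixels divided by the diagonal size in inches, and the physical pixel pitch follows from there. A quick sketch in Python (my own, just for verification):

[code]
# PPI = diagonal resolution in pixels / diagonal size in inches.
# Physical pixel pitch in mm = 25.4 / PPI.
import math

monitors = [
    ("28in 3840x2160", 3840, 2160, 28),
    ("27in 2560x1440", 2560, 1440, 27),
    ("27in 1920x1080", 1920, 1080, 27),
    ("19in 1024x768",  1024,  768, 19),
]

for name, w, h, diag in monitors:
    ppi = math.hypot(w, h) / diag
    print(f"{name}: {ppi:6.2f} PPI, pixel pitch {25.4 / ppi:.3f} mm")
[/code]

That prints 157.35, 108.79, 81.59, and 67.37 PPI respectively, with the 19" monitor's pixels coming out at about 0.38mm.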
Now with so many pixels in such a small space, aliasing is all but non-existent, meaning the stair-step effect is still present but so small it's imperceptible. That's the whole purpose of anti-aliasing: it's a software/hardware processing effect that breaks up a large stair-step by introducing more, smaller steps along the edge, blending in intermediate colors to smooth out the jaggedness. Since the system does extra processing on the image, it demands more resources from the video card, which is why high levels of anti-aliasing and supersampling come with such a performance hit (there's a bare-bones sketch of supersampling below).
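For the curious, here's what supersampling boils down to: render the scene at a multiple of the output resolution, then average each block of samples down to one pixel. It also makes plain why it's expensive, since a factor of 2 means rendering 4x the pixels. The render_scene function is a hypothetical stand-in for a real renderer, not anything from E:D, and this assumes Python with numpy:

[code]
# Bare-bones supersampling (SSAA): render at factor x the target resolution,
# then average each factor-by-factor block down to one output pixel.
# render_scene is a hypothetical stand-in for a real renderer.
import numpy as np

def render_scene(height, width):
    """Stand-in renderer: a hard diagonal edge, 1.0 above it, 0.0 below."""
    y, x = np.mgrid[0:height, 0:width]
    return (y * width > x * height).astype(float)

def supersample(height, width, factor=2):
    # factor=2 means rendering 4x the pixels, hence the performance cost
    hires = render_scene(height * factor, width * factor)
    return hires.reshape(height, factor, width, factor).mean(axis=(1, 3))

print(np.unique(render_scene(8, 8)))           # [0. 1.]  hard stair-steps
print(np.unique(supersample(8, 8, factor=4)))  # in-between greys appear
[/code]

The aliased render only ever outputs pure black or white, so edges are staircases; the supersampled one produces intermediate grey values along the edge, which is exactly the smoothing you pay for.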
So if you have a high enough resolution on a screen with sufficient DPI/PPI, anti-aliasing/supersampling is grossly redundant and unnecessary, and it causes additional performance loss on top of the already high requirements of running a game at such a resolution to begin with.