I have a 1080p monitor, and when using DSR I usually run at 1440p. This does clear up the image, but it is causing a flickering/stuttering issue. When running games at native 1080p resolution I get normal behaviour: anything over 25fps is pretty solid and good, and anything over 30fps is great. With my GTX 970, most games usually cap at 60fps with V-sync and all is good.

Running DSR at 1440p, many games still run at 60fps and all is fine. It's when the frame rate starts to drop from 60 in graphically demanding areas that bad stuff happens. When it gets to about 45fps the image starts to flicker and shudder, and if the frame rate drops to, say, 35 or 30fps the flickering/stuttering is even more pronounced. It affects different games differently: Alien: Isolation and Skyrim are bad with it; Tomb Raider 2013 not so much, but it still happens.

I am better off using native resolution with heavy AA, except Nvidia have removed the heavy levels of AA in their recent drivers. I want them to put 16xAA and 32xAA back in the Nvidia Control Panel. I have noticed other users are seeing this, but is there a solution? I even upgraded my PSU recently to try to fix it - a total waste of cash.

Using Skyrim as an example, because I am playing it at the moment: I originally had the anti-aliasing set at 8x in the game options. Oddly, changing it has not improved the problem, though I wouldn't expect it to solve it. I did once speculate that the problem appeared in certain areas because those places had more light on the scene - in Alien: Isolation, for instance, there are plenty of places where the lighting particularly stood out - whereas there are other areas in A:I where the frame rate will drop but the problem happens much less. I would expect the areas it was hitting problems in to be easier for the GPU to run.

As for taking off the V-sync: that affects different games differently, but doesn't cure the problem, and it also makes games look worse when you get tearing. What it does change is the frame-rate behaviour, meaning a slight dip in performance from 60fps doesn't result in the frame rate dropping to a pre-set 45fps or 30fps. Some games seem to manage V-sync but still allow the frame rate to run at any rate under 60fps (I think that's triple buffering, but I'm not sure). Either way, the stuttering still happens with V-sync off when the frame rate naturally drops into the range it affects.

I have seen a CPU bottleneck games before, and it looks just like it does when the GPU bottlenecks: the effect is that when the game runs at less than 25fps, it starts showing. I really have thought about every aspect of the hardware. The card has 4GB of RAM, and I think the most Skyrim pulls is less than 1.5GB. It's not about the frame rate suddenly dropping and recovering either - if I'm in an area of the game where I get some frame-rate reduction, I can stop moving and it still does it.

I think it's an issue with DSR struggling with something in particular, or it's something to do with the 970 chip, or some 970s. As I said earlier, Alien: Isolation can be solid at 40fps in some scenes but stutter in others. Usually the problem occurs when there is certain lighting, maybe with a dark background. It's very noticeable too - it's not a light, faint micro-stutter. With Skyrim it's generally outdoor daylight scenes I have seen do it, though not all of them, and quite rarely - that will be due to simpler graphics though.

I think the only thing to do is return the card, though I am really worried about that because it's the second one I have had to send back. It has to be this card though, because the other one didn't make the PC games flicker, and neither did my previous GPU, a Kepler GTX 650 Ti Boost.

I think the reason there was really no change in frame rate when changing the AA rate is this: as I am running at 2560x1440 on a 1080p monitor, there is no aliasing. There is a fractionally small change in frame rate anyway.
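As a side note on why DSR at 1440p costs frame rate on a 1080p panel: 2560x1440 is roughly 1.78 times the pixels of 1920x1080, so the GPU shades almost twice as many pixels before downscaling. The arithmetic, as a trivial Python check:

```python
# Pixel counts for the two resolutions mentioned in the post.
native = 1920 * 1080  # the monitor's native resolution
dsr = 2560 * 1440     # the DSR render resolution

ratio = dsr / native
print(f"DSR renders {ratio:.2f}x the pixels of native 1080p")  # ~1.78x
```

That extra shading load is also why downsampled output looks anti-aliased without any explicit AA setting.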
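The V-sync behaviour described in the post (a slight dip from 60fps snapping down to a fixed 30fps in some games, while others let the rate fall anywhere under 60, which the author suspects is triple buffering) can be sketched as a toy model. This is an illustration of the general double-vs-triple buffering mechanism, not a description of any particular game's swap chain:

```python
import math

REFRESH_HZ = 60
VBLANK_MS = 1000 / REFRESH_HZ  # ~16.7 ms between monitor refreshes

def displayed_fps(render_ms, triple_buffered=False):
    """Toy model of displayed frame rate under V-sync at 60 Hz.

    Double buffering: a frame that misses a refresh waits for the
    next one, so the frame interval snaps to whole multiples of
    16.7 ms and the rate quantises to 60, 30, 20 fps, and so on.
    Triple buffering: the GPU keeps rendering into a spare buffer,
    so the average rate can sit anywhere under 60 fps.
    """
    if triple_buffered:
        return 1000 / max(render_ms, VBLANK_MS)
    intervals = max(math.ceil(render_ms / VBLANK_MS), 1)
    return REFRESH_HZ / intervals

# A frame taking 18 ms (a "slight dip" from 60 fps) halves the rate
# under double buffering, but only trims it under triple buffering.
print(displayed_fps(18))                  # -> 30.0
print(round(displayed_fps(18, True), 1))  # -> 55.6
```

This matches the complaint: with plain double-buffered V-sync, a small performance dip does not give 55fps, it gives 30fps.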
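Since the complaint is visible stutter rather than low average fps, per-frame times are more informative than an fps counter. Tools such as FRAPS can log per-frame times; here is a minimal sketch of flagging spike frames in such a log (the 2x-median threshold is an arbitrary illustrative choice, and the trace data is invented):

```python
def stutter_frames(frame_times_ms, factor=2.0):
    """Return indices of frames whose frame time spikes above
    `factor` times the median - a crude proxy for visible stutter."""
    ordered = sorted(frame_times_ms)
    n = len(ordered)
    median = (ordered[n // 2] + ordered[(n - 1) // 2]) / 2
    return [i for i, t in enumerate(frame_times_ms) if t > factor * median]

# A steady ~60 fps trace with two spikes: frames 3 and 7 stand out
# even though the average frame rate still looks close to 60.
trace = [16.7, 16.9, 16.6, 50.1, 16.8, 16.7, 16.5, 48.3, 16.6, 16.7]
print(stutter_frames(trace))  # -> [3, 7]
```

A log like this would also distinguish a genuine frame-rate drop (all frame times rise) from the kind of stutter described here (isolated spikes while standing still).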
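On the VRAM point: the claim that Skyrim pulls under 1.5GB of the card's 4GB can be checked while the game runs with `nvidia-smi --query-gpu=memory.used,memory.total --format=csv`. A small parser for that CSV output, shown here against a captured sample string (the figures in the sample are invented for illustration):

```python
def parse_meminfo(csv_text):
    """Parse `nvidia-smi --query-gpu=memory.used,memory.total --format=csv`
    output into (used_mib, total_mib) integers."""
    lines = [line.strip() for line in csv_text.strip().splitlines()]
    # Skip the header row; the data row looks like "1432 MiB, 4096 MiB".
    used, total = [int(field.split()[0]) for field in lines[1].split(",")]
    return used, total

# Sample nvidia-smi CSV output (figures invented for illustration).
sample = """memory.used [MiB], memory.total [MiB]
1432 MiB, 4096 MiB"""
used, total = parse_meminfo(sample)
print(f"{used} MiB of {total} MiB in use")  # 1432 MiB of 4096 MiB in use
```

If usage while stuttering stayed well under 3.5GB, that would also rule out the 970's well-known slow upper memory segment as the cause.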