Seriously, sitting a reasonable distance from your monitor or television, it's barely even noticeable on a 1440p screen 32" or smaller or a 1080p screen 24" or smaller. Is this quest for something most people won't even notice worth the frames you give up for it? And even if 4K/60 looked clearer to you, once you move that camera at 60fps, 1440p at 120fps is going to look way sharper.
Speak for yourself. I sit about 5 feet from my 55" OLED and the difference between 4K and sub-4K is night and day. I will always pick 4K/30 or 4K/60 over 3K/60 or 3K/120
I'm waiting for LG to make 44" 4K OLEDs (this year they released the 48" CX, but that's still too big). Then I might go back to PC gaming and use that 44" on my desk, sitting less than 3 feet from it. From that distance 1440p won't look as good as 4K, period.
For everyone else, who sits far away or has a smaller screen, then yes, 4K is pointless.
I have friends who sit 15 feet from their 4K TVs and say "I can't wait until 8K comes out!"
I love my 3440x1440 Dell ultrawide monitor. GTA 5 looks beautiful at high resolution. I don't care about frames; I'm a resolution guy.
I go as high as I can on textures and res; if I drop to 40-50 frame rates, so be it. I love driving around the grassy areas and looking at the high-definition foliage and trees. Who cares about frames when the game looks that good?
4K is a marketing buzzword and bragging rights, something they push whether you want it or not. I really don't care about raw graphics; all I care about is better gameplay performance.
@Mozelleple112 said:
Speak for yourself. I sit about 5 feet from my 55" OLED and the difference between 4K and sub-4K is night and day. I will always pick 4K/30 or 4K/60 over 3K/60 or 3K/120
I just ordered a 4K 55" OLED. I'm so looking forward to it!
@eoten: Yup, my 4K TV mostly gets used in 1080p 120Hz mode on my PC. 4K is nice, but not so nice that I'm willing to tolerate poor frame rates. Not to mention, I'd guarantee a good chunk of the people who diss 1080p have never seen it with a good anti-aliasing solution in place.
@appariti0n: The THX recommended viewing distance for a 55" is about 5.5 feet.
5 feet is perfectly fine for your eyes. I know a couple of psychopaths who sit 6 feet from a 110" (that's four times the area of a 55") and 8 feet from a 150" display.
Marketing wanks trying to vertically integrate with their TV departments.
4K is nice, but 30fps feels like garbage. And because these are cheap consoles, it ends up being 20-30fps. I hope next gen does better, since I'm still gonna be stuck on a PS5 for at least a few games.
I would like to see your source, as that conflicts with everything I've seen on viewing distances. This indicates 7.7 feet is the optimal viewing distance for a 55".
This is a much better source than Rtings as to how far you should sit from your display.
6.1 feet is what THX recommends for a 36-degree viewing angle from a 55" display. But is a 36-degree viewing angle 'optimal'? Some recommend a 40-degree viewing angle, which, if my math is correct, translates to closer to 5.5 feet from a 55".
7.5 feet is the MAXIMUM distance you should be sitting from your 55". 5.5 feet makes it much more immersive, and with 4K movies and games you still won't see any pixels, artefacts or other types of noise in the image.
While THX still contends that the optimum viewing distance is a position where the display occupies a 40-degree view angle for the viewer, they too provide a range recommendation. The minimum viewing distance is set to approximate a 40-degree view angle, and the maximum viewing distance is set to approximate 28 degrees.
And also:
THX recommends that the “best seat-to-screen distance” is one where the view angle approximates 40 degrees,[23] (the actual angle is 40.04 degrees).[24] Their recommendation was originally presented at the 2006 CES show, and was stated as being the theoretical maximum horizontal view angle, based on average human vision.[25] In the opinion of THX, the location where the display is viewed at a 40-degree view angle provides the most “immersive cinematic experience”,[23] all else being equal.
But what does science say? How wide of a viewing angle can you have before you start seeing pixels?
Assuming 20/20 vision (about one arcminute per pixel): for 1080p it's 31 degrees, for 4K it's a WHOPPING 58 degrees, and for 8K it's 96 degrees.
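If you want to check those numbers yourself, here's a quick Python sketch (mine, not THX's or anyone's official formula; it assumes a flat 16:9 screen and 20/20 vision, i.e. roughly one arcminute per pixel):

```python
import math

ARCMIN = math.radians(1 / 60)  # 20/20 vision resolves roughly 1 arcminute

def screen_width(diagonal_in: float) -> float:
    """Width of a 16:9 screen, in inches, from its diagonal."""
    return diagonal_in * 16 / math.hypot(16, 9)

def distance_for_angle(diagonal_in: float, view_angle_deg: float) -> float:
    """Seating distance (inches) at which the screen fills the given view angle."""
    half_width = screen_width(diagonal_in) / 2
    return half_width / math.tan(math.radians(view_angle_deg) / 2)

def max_sharp_angle(horizontal_pixels: int) -> float:
    """Widest view angle you can fill before single pixels become resolvable,
    i.e. sitting just far enough that each pixel subtends one arcminute."""
    distance = 1 / math.tan(ARCMIN)      # viewing distance in pixel-pitch units
    half_width = horizontal_pixels / 2   # screen half-width in pixel-pitch units
    return math.degrees(2 * math.atan(half_width / distance))

for angle in (40, 36, 28):
    print(f'55" at {angle} deg: {distance_for_angle(55, angle) / 12:.1f} ft')

for name, px in (("1080p", 1920), ("4K", 3840), ("8K", 7680)):
    print(f"{name}: pixels visible past a {max_sharp_angle(px):.0f} deg view angle")
```

That prints 5.5 / 6.1 / 8.0 feet for the 40 / 36 / 28 degree angles on a 55", and 31 / 58 / 96 degrees for 1080p / 4K / 8K, matching the figures above.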
It's ridiculous, seriously. I own a pretty big 4K TV, and anything above 1440p looks more or less the same to me. Glad devs are using dynamic res more and more to hit 60fps, which is what really matters.
I love my 3440x1440 Dell ultrawide monitor. GTA 5 looks beautiful at high resolution. I don't care about frames; I'm a resolution guy.
I go as high as I can on textures and res; if I drop to 40-50 frame rates, so be it.
What is the highest frame rate monitor you've actually had?
60 so far
You should really try gaming on a screen that can do at least 120Hz. Way more satisfying than an unnoticeable boost to resolution at a reasonable viewing distance. After that, 60fps will give you a headache. Not only that, but 120fps feels smoother and more responsive, and in my testing on games like Doom 2016, Quake Champions, etc., I noticed an instant and significant improvement in gameplay performance because of that extra smoothness and responsiveness. 120fps is legit. 4K is not.
This is a much better source than Rtings as to how far you should sit from your display.
6.1 feet is what THX recommends for a 36-degree viewing angle from a 55" display. But is a 36-degree viewing angle 'optimal'? Some recommend a 40-degree viewing angle, which, if my math is correct, translates to closer to 5.5 feet from a 55".
7.5 feet is the MAXIMUM distance you should be sitting from your 55". 5.5 feet makes it much more immersive, and with 4K movies and games you still won't see any pixels, artefacts or other types of noise in the image.
According to everything I have seen on screen size and resolution at various viewing distances, with 20/20 vision at 7.5 feet there's little difference between 1080p, 1440p, and 4K. At 2.5 feet, which I measured as my distance from the monitor at my computer desk, there's very little difference between 1080p and 4K on a 24" screen. And on the same charts I can go up to 32" with 1440p and still see little difference between it and 4K. So I would have to sit 2.5 feet from a screen larger than 32" to get any noticeable benefit from 4K.
So for desktop monitors there's never going to be a valid argument for 4K unless they become easy enough to drive and replace current offerings at the same price. As you move back, the same thing obviously applies to bigger screens. It seems like 1440p HDTVs would still look sharp AF while providing a significant boost to frame rates, but I've never seen a 1440p HDTV. Living room gamers kind of get screwed.
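Those charts drop straight out of the same one-arcminute rule. A rough sketch (same assumptions as the snippet above, numbers mine) of the distance beyond which a given panel's pixels can no longer be resolved, so a higher resolution buys you nothing:

```python
import math

ARCMIN = math.radians(1 / 60)  # ~20/20 visual acuity

def pixel_limit_distance_ft(diagonal_in: float, horizontal_pixels: int) -> float:
    """Distance (feet) beyond which individual pixels of a flat 16:9 panel
    subtend less than one arcminute and can't be resolved."""
    width = diagonal_in * 16 / math.hypot(16, 9)
    pitch = width / horizontal_pixels   # pixel pitch in inches
    return pitch / math.tan(ARCMIN) / 12

for diag, px, name in ((24, 1920, "1080p"), (32, 2560, "1440p"), (55, 1920, "1080p")):
    d = pixel_limit_distance_ft(diag, px)
    print(f'{diag}" {name}: indistinguishable from higher res beyond ~{d:.1f} ft')
```

A 24" 1080p and a 32" 1440p panel work out to nearly the same ~92 PPI, so both max out around 3.1 feet, and a 55" 1080p panel maxes out around 7.2 feet, which lines up with the chart numbers above.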
I sit 93 inches (7 and 3/4 feet) from my 65 inch 4K TV and wear glasses. It's a significant boost. Rarely use that res for games, though. Too demanding, and I prefer most PC games at a table.
Also it should be noted there's a difference between 4K video and gaming.
One has an industry standard and the latter has zero rules.
The real problem here is that in most games today you have a dynamic camera controlled with a thumbstick. When you move that camera, the background, details, textures, everything gets blurry. The solution to that is higher refresh rates. So unless you're in a game with a fairly stationary camera angle, even 1080P at 120hz is going to look clearer and sharper than 4K at 60. This is irrelevant in television and movies that 4K screens are designed for, but it'll continue to be an issue until hardware can drive 4K at 120fps consistently and stably. The problem is, as developers add more content, better lighting, better shaders, more shadows and reflections, 4K is likely to remain 60fps as games improve with the hardware.
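To put a rough number on that blur: on a sample-and-hold display (which is basically every modern LCD and OLED), each frame stays frozen on screen for the whole frame time while your eye keeps tracking the motion, so the perceived smear is roughly the pan speed divided by the frame rate. A back-of-the-envelope sketch, assuming a camera pan that crosses the full screen width in one second (numbers and function name are mine):

```python
# Sample-and-hold smear: the eye tracks the motion while each frame
# sits still for a full frame time, so perceived blur is roughly
# (pan speed in pixels/second) / (frames per second).

def smear_px(pan_speed_px_per_s: float, fps: float) -> float:
    return pan_speed_px_per_s / fps

# A pan crossing the full screen width in one second:
for width, fps, label in ((3840, 60, "4K @ 60fps"), (1920, 120, "1080p @ 120fps")):
    blur = smear_px(width, fps)   # smear in native pixels
    pct = blur / width * 100      # as a percentage of screen width
    print(f"{label}: ~{blur:.0f}px of smear (~{pct:.2f}% of the screen)")
```

During that pan the 4K/60 image smears across twice the fraction of the screen that the 1080p/120 image does (about 1.7% vs 0.8% of the width), so the extra pixels get blurred away; motion resolution, not pixel count, wins once the camera moves.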
@dxmcat: Tell me, what standards are used in 4K gaming?
It's just as loose as video. I can't even count the number of "4K Blu-rays" that are upscaled from 2K/2.5K masters, or that have effects finished only in 2K while the rest may or may not be 4K. It all just gets upscaled.
@eoten said:
@firedrakes said:
Also it should be noted there's a difference between 4K video and gaming.
One has an industry standard and the latter has zero rules.
The real problem here is that in most games today you have a dynamic camera controlled with a thumbstick. When you move that camera, the background, details, textures, everything gets blurry. The solution to that is higher refresh rates. So unless you're in a game with a fairly stationary camera angle, even 1080P at 120hz is going to look clearer and sharper than 4K at 60. This is irrelevant in television and movies that 4K screens are designed for, but it'll continue to be an issue until hardware can drive 4K at 120fps consistently and stably. The problem is, as developers add more content, better lighting, better shaders, more shadows and reflections, 4K is likely to remain 60fps as games improve with the hardware.
I think it depends on distance from screen. Sitting on a couch 5+ feet away from a screen, you can certainly tell the difference between 1080 and 4K.
Up close on a monitor, not so much. I was pretty happy even at 1080, but I made the jump to 1440 and I'm happy with it; a noticeable upgrade, perfect for being 1-2 feet from the screen.
I do wonder why we mostly bypassed 1440p on TVs, though. Seems like 1440p is a really good compromise, and it would cut streaming data for movies and such as well. 4K movies are HUUUUUUUGE.
@eoten said:
@firedrakes said:
Also it should be noted there's a difference between 4K video and gaming.
One has an industry standard and the latter has zero rules.
The real problem here is that in most games today you have a dynamic camera controlled with a thumbstick. When you move that camera, the background, details, textures, everything gets blurry. The solution to that is higher refresh rates. So unless you're in a game with a fairly stationary camera angle, even 1080P at 120hz is going to look clearer and sharper than 4K at 60. This is irrelevant in television and movies that 4K screens are designed for, but it'll continue to be an issue until hardware can drive 4K at 120fps consistently and stably. The problem is, as developers add more content, better lighting, better shaders, more shadows and reflections, 4K is likely to remain 60fps as games improve with the hardware.
Drivel. I have a 32" 4K monitor and it was a huge damn upgrade when I moved to it from a 32" 1080p monitor. Seriously, maybe you need an eye exam.
@Mozelleple112: Agreed. I'm also on a 55 inch, 5 feet away. Native 1080p with Sony Bravia upscaling isn't bad, but it's definitely soft and/or jagged. Native 4K is night and day.
Because it looks very sharp and it's much easier to market.
When I'm on a crappy display, native res gives a massive boost to image quality, but on a much bigger mediocre display, like some TVs, I don't care much about native res unless it's below 900p on a 4K screen.
For consoles, I'd rather have them push lower res with better everything else as they set the base standards for development in general.
I can see the difference between 4K and 1440p on my 55" Samsung QLED, but the difference isn't enough to make 1440p look like crap the way 1080p now does to me. Even with anti-aliasing I can see how much blurrier and less detailed 1080p is in comparison. I'll play at 4K when I can, but I mostly stay at 1440p since that's what my GTX 1080 can handle in the most demanding games, and for multiplayer titles the 120fps comes in handy. If a game is old I'll also use 1440p to get 120fps, since the graphics are too dated to make any real difference at 4K.