@Mozelleple112: agreed. I'm also on a 55-inch, 5 feet away; native 1080p with Sony Bravia upscaling isn't bad, but it's definitely soft and/or jagged. Native 4K is night and day.
8K, however, is a joke.
Nope, not when it's a 75" screen 4 feet away.
I think it depends on distance from the screen. Sitting on a couch 5+ feet away, you can certainly tell the difference between 1080p and 4K.
Up close on a monitor, not so much. I was still pretty happy at 1080p, but I made the jump to 1440p and it's a noticeable upgrade, perfect for sitting 1-2 feet from the screen.
I do wonder why we mostly bypassed 1440p on TVs, though. It seems like a really good compromise, and it would cut down streaming data for movies and stuff. 4K movies are HUUUUUUUGE.
Because TVs are far bigger than most monitors. Because 2160 is a clean multiple of 1080, meaning 1920x1080 movies will look better than they would on a 1440p panel. Because photos are huge. Because you're not offering 1920x1080 users as much of an upgrade, or at least not one that can be advertised as easily.
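If anyone wants to see the scaling argument in actual numbers, here's a quick Python sketch (just the standard panel heights, nothing fancy):

# How cleanly does 1920x1080 content map onto common panel heights?
# An integer factor (2.0x) maps each source pixel to an exact block of
# panel pixels; a fractional factor (1.33x) forces interpolation and blur.
for panel_height in (1440, 2160):
    factor = panel_height / 1080
    clean = factor.is_integer()
    print(f"1080 -> {panel_height}: {factor:.2f}x per axis, integer scale: {clean}")

1080 -> 1440 comes out to 1.33x (no clean mapping), while 1080 -> 2160 is exactly 2x per axis.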
I don't feel like we even bypassed it. More like PC monitors exist for totally different purposes.
Tell me, with sources, that this applies directly to gaming and isn't just taken from video standards.
In my 35 years of life I've never seen gaming follow a video standard properly.
@firedrakes: A typical 1440p 27" monitor gives you about 108 ppi.
8K at 75 inches gives you about 117 ppi (quick math below).
So no, you don't need a 20 ft screen.
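For anyone who wants to check those figures, here's the back-of-the-envelope calculation in Python (standard diagonal-based ppi):

import math

# ppi = diagonal resolution in pixels / diagonal screen size in inches
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27)))   # ~109, the ~108 ppi figure for a 27" 1440p monitor
print(round(ppi(7680, 4320, 75)))   # ~117 ppi for a 75" 8K TV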
lol, now you're going to cell phone displays... we're talking about video. Look how good a proper 4K movie looks in theaters; that's my reference.
There's a difference between video and gaming.
For consoles it's important because pretty much all TVs are 4K now, and it's very noticeable when you're not playing at the native resolution of the screen.
True, but around half of all 4K HDR sets are cheap. They hit the bare minimum of the 4K UHD standard or outright lie about it. Only recently, with 2020 models or tail-end 2019 ones, did they even hit the middle ground of the standard.
To put it bluntly, read up on how 4K is done in video and what tech is needed for it.
Most people don't shoot or edit in it; instead they read random gaming stuff.
It's still very costly to film/shoot 4K video, and don't get me started on what's needed to edit it, software-wise. That's been a mess.
As a general rule, the human eye cannot separate individual pixels in a computer display with a pixel density of 150 ppi or higher at a normal viewing distance of 2-3 feet. Because they are held closer to the eye, to achieve the same effect on a smartphone, the screen needs to have a pixel density of at least 300 ppi.
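Those rule-of-thumb numbers roughly fall out of the usual ~1 arcminute visual-acuity assumption. A rough Python sketch (treat the acuity figure as a ballpark, not a spec):

import math

# Minimum ppi at which adjacent pixels blend together, assuming the eye
# resolves about 1 arcminute of angle at the given viewing distance.
def min_ppi(distance_in):
    return 1 / (distance_in * math.tan(math.radians(1 / 60)))

print(round(min_ppi(24)))   # ~143 ppi at 2 ft  -> roughly the 150 ppi desktop figure
print(round(min_ppi(12)))   # ~286 ppi at 1 ft  -> roughly the 300 ppi phone figure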
Distance to the object.
Again, we're talking about, wait for it, the damn standard for 4K content.
Please, once again, show me what standard the game industry follows for 4K or 8K with proper HDR...
What you did instead of answering was deflect the question.
I'm very likely getting the Sony X900H 75" 4K 120 Hz TV that they were pushing. It's been time to upgrade from 1080p for a while now.
Speak for yourself. I sit about 5 feet from my 55" OLED and the difference between 4K and sub-4K is night and day. I will always pick 4K/30 or 4K/60 over 3K/60 or 3K/120
I just ordered a 4k 55 OLED. I'm so looking forward to it!
As someone who sits 5-6 feet away from a 65" OLED... Don't listen to the guy above you, you can't tell the difference between 1440 and 4K.
I have to sit 4 feet away from a 65" to even notice a small difference.
It's ridiculous, seriously. I own a pretty big 4K TV, and anything above 1440p looks more or less the same to me. Glad devs are using dynamic res more and more to get 60fps, which is what really matters.
Not to mention that DLSS has pretty much put the nail in the coffin for native resolutions.
Resolution is the most overrated part of an image.
I would rather game on an OLED TV at 1080p with HDR enabled than play a 4K game on an LCD TV with no HDR... There's more to image quality and the overall experience than just resolution.
Someone with an XSS can have a better visual experience than someone with an XSX or PS5 if their TV is drastically better in contrast, colour reproduction and HDR capabilities.
@Grey_Eyed_Elf: Lol, imagine thinking that because you have shitty eyesight no one else can see the difference.
The difference between 1440p and 4K is night and day on my 55 inch. If I move behind the sofa and toward the opposite wall, then yeah, I can't see the difference anymore.
Then again, I have military-tested perfect vision, the best there is.
For consoles it's important because pretty much all TVs are 4K now, and it's very noticeable when you're not playing at the native resolution of the screen.
So what you're saying is gamers are getting shafted out of the best gaming experience possible because consoles are designed to play on TVs that frankly aren't designed or built for gaming? And given how many console titles are hard-coded to a 60 fps frame rate or less, we could see a situation where lowering the resolution changes nothing. Basically, consoles are going to be worse to play games on going forward, even if they had the same CPU and GPU performance as a PC.
@Grey_Eyed_Elf: Lol, imagine thinking that because you have shitty eyesight no one else can see the difference.
The difference between 1440p and 4K is night and day on my 55 inch. If I move behind the sofa and toward the opposite wall, then yeah, I can't see the difference anymore.
Then again, I have military-tested perfect vision, the best there is.
You have a 55 inch 1440p screen to compare it to? If not, then you're not seeing real 1440p, for the simple reason that there's no integer you can multiply 1440 pixels by to fit a 2160 vertical resolution without stretching or eliminating some pixels. 8K could be 3x scaled (where 1 pixel of the 1440p image gets represented by a 3x3 grid of pixels on the 8K panel). So yeah, unless you have a 55 inch 1440p native resolution screen, there's no real comparison.
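To put the integer-scaling point in numbers, a quick Python sketch (heights only, since width scales the same way):

# Which panel heights can show a 1440p image with a whole-number scale factor?
for panel_height in (2160, 4320):
    factor = panel_height / 1440
    fit = "clean integer scale" if factor.is_integer() else "needs interpolation"
    print(f"1440 -> {panel_height}: {factor:.2f}x ({fit})")

1440 -> 2160 comes out to 1.5x, so the image has to be interpolated, while 1440 -> 4320 (8K) is an exact 3x, one source pixel per 3x3 block.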
@eoten: I have a 4K TV, and I can easily differentiate X1X games running in 3200x1800p or native 4K vs PS4 Pro usually running 2560x1440p or 2880x1620p.
Same with a PC matched to PS4 Pro settings but running in native 4K instead of 1440p.
Playing a game at a lower resolution than the native resolution of your monitor will always give noticeably poorer image quality than playing at that same resolution on a display whose native resolution matches.
I can notice 4K against lower resolutions no problem. I'm just saying it's not that important compared to everything else:
Talking about moving pictures here, folks. Not tiny text where every pixel matters. Pixel quality > pixel quantity.
Obviously on a PC you can scale up your hardware and do whatever you want, but we're talking about consoles. They don't have the horsepower to do it all. Upscale that shit. Or at least respect your customers enough to give them a graphics settings menu.
Dynamic resolution scaling can work very well. Look at Forza Horizon 4. Solid frame rates, still looks great. 1080p is perfectly viable on a 4K display. Perfect scaling up to 4K with 1/4 the pixels to compute. Render the UI independently if you want smaller text. There are lots of options.
Native 4K just isn't that important for gaming. Frankly, it's the wrong way to use the limited resources on a console in most cases.
@eoten: Okay, exactly. Then why would I want anything less than 4K on my 4K TV?
1080p on a 70 inch 4K TV should be exactly like 1080p on a 70 inch 1080p TV, no? The pixel of the video is simply placed over four display pixels instead of one. It scales perfectly.
4K is very noticeable on a big enough TV; 50 inches and up makes it look very clear and amazing.
A 48" 4K screen has the same pixel density as a 24" 1080P screen. PPI vs distance is the important thing. From 6 feet away, a 50 inch 1080P screen looks as sharp as the 24" screen from 2 1/2 feet away. A 1080P screen 50" diagonal has a pixel density of about 45. 4K would provide benefits in screens larger than this from that distance. So you may notice an improvement when you start looking at 60, 72 inch screens, etc, but only because 1440P doesn't really exist in televisions because TVs are made for movies and televisions where they're only displaying pre-rendered information, and do not have to be driven by a GPU.
But let's assume 1440P was available in a living room television, and you had that same 6 foot viewing distance. From 6 feet a 1440P resolution would be sharp as 4K, with every pixel indistinguishable at screen sizes up to 65 inches, with 4K only yielding a benefit above 65" at that distance.
And again, all those improvements to sharpness that do exist over lower resolutions is lost the instant you introduce movement because of the lower frame rate you will get from higher resolutions. Take it from someone who has seen both 4K60 and 1440P120, 1440P looks WAY better for gaming.
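The rough math behind that 6-foot / 65-inch figure, assuming the same ~1 arcminute acuity rule of thumb (a ballpark sketch, not gospel):

import math

# Distance (in feet) beyond which individual pixels blend together,
# assuming ~1 arcminute of visual acuity.
def blend_distance_ft(width_px, height_px, diagonal_in):
    ppi = math.hypot(width_px, height_px) / diagonal_in
    return 1 / (ppi * math.tan(math.radians(1 / 60))) / 12

print(round(blend_distance_ft(2560, 1440, 65), 1))  # ~6.3 ft for a 65" 1440p panel
print(round(blend_distance_ft(1920, 1080, 50), 1))  # ~6.5 ft for a 50" 1080p panel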
@eoten: Okay, exactly. Then why would I want anything less than 4K on my 4K TV?
1080p on a 70 inch 4K TV should be exactly like 1080p on a 70 inch 1080p TV, no? The pixel of the video is simply placed over four display pixels instead of one. It scales perfectly.
Yeah, on a pixel-for-pixel basis 1080p should fit into a 4K screen with a perfect 2x upscale. Similarly, 720p fits perfectly into a 1440p resolution with the same 2x scaling, where both horizontal and vertical pixel counts are doubled. But at 70 inches you'd have to be something like 8-9 feet from the screen for pixels to be indistinguishable. Also, some TVs don't have very good upscalers at all, so it still may not look as good as a native 1080p screen would. Most TVs are also going to be 60 Hz, which means any frame rate above 60 gives no benefit at all to the visual experience.
If you're making a gaming console for the living room television, 4K60 makes more sense from that standpoint, because it's what most people buying consoles can be expected to have, but it's certainly not going to be the best setup for gaming. And since most console games are going to be hard-coded to a locked 60fps, there's not a big market for manufacturing televisions that do more than that. There are a few out there, but they're not the easiest to find, and I'm not sure you'd gain much because it seems like Sony and M$ are set on 4K60.
It seems 120fps and 1440p resolutions will always be a perk of PC gaming. When 8K becomes cheaper, though, maybe we'll see televisions that can do 8K30 for TV and movies and 1440p120 for gaming combined into one screen.
@Grey_Eyed_Elf: Lol, imagine thinking that because you have shitty eyesight no one else can see the difference.
The difference between 1440p and 4K is night and day on my 55 inch. If I move behind the sofa and toward the opposite wall, then yeah, I can't see the difference anymore.
Then again, I have military-tested perfect vision, the best there is.
Depends on the display size and ppi. On 27" and 34" 1440p monitors vs 27" and 32" 4K, I just can't tell a huge difference in games. But I can tell the huge FPS difference, and I can pump the other settings to max.
Text, however, is a lot better.
I like both 1440 and 4K but 4K provides double the fidelity. I have a 65" Samsung 4K UHD and 4K is clearer than 1440p but I wouldn't mind playing something on the level of the Epic tech demo at 1440p if the poly count is higher than current gen games. It's all about what you want and how pretty you want it to be I guess.
@Skarwolf:
The original Xbox One has garbage graphics, lol. It literally hurts to look at it.
Your eyesight must be really, really poor, or you're sitting like 15 feet away.
@Skarwolf: And it's 10x sharper. The X1 feels like a PS3.5; it's blurry and low-res as heck at 1600x900.
The X1X, however, looks amazing in almost any game.
@Mozelleple112: I was standing there. All 4K does is add slightly more detail to textures, or a few more leaves and grass. Nothing that affects gameplay.
And what it does add is barely noticeable unless you're sitting really close to the screen, and it's lost the second you introduce movement when compared to a 120 Hz screen.