Not really.
The only difference between I and P is that P refreshes the entire screen every time, while I refreshes every other line.
You almost need cyborg eyes to see the difference.
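The interlaced-vs-progressive point above can be sketched in a few lines of Python. This is a toy model (a six-line "screen"), not how any real display works:

```python
LINES = 6  # pretend the screen is six scan lines tall

def progressive_refresh():
    # A progressive (p) display redraws every line on each refresh.
    return list(range(LINES))

def interlaced_fields():
    # An interlaced (i) display alternates between two half-height
    # "fields": the even-numbered lines on one pass, the odd on the next.
    even = list(range(0, LINES, 2))
    odd = list(range(1, LINES, 2))
    return even, odd

even, odd = interlaced_fields()
print(progressive_refresh())  # [0, 1, 2, 3, 4, 5]
print(even, odd)              # [0, 2, 4] [1, 3, 5]
# Together the two fields cover the full frame, but only half of the
# lines are refreshed on each pass.
assert sorted(even + odd) == progressive_refresh()
```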
[QUOTE="rogerjak"][QUOTE="GlassDominion"]Wrong. You need cyborg eyes to see the difference.[/QUOTE]Not really. The only difference between I and P is that P refreshes the entire screen every time, while I refreshes every other line. You almost need cyborg eyes to see the difference.[/QUOTE]
:P
I completely disagree, and I have the hardware to voice this opinion strongly. People who say they cannot see the difference generally do not have a 1080p unit. I will agree that a 1080i picture on a 1080p TV will show no difference, especially if the set is deinterlacing the picture correctly, because it is then actually showing a 1080p picture. Sharps do this.
[QUOTE="Taijiquan"]:P I completely disagree, and I have the hardware to voice this opinion strongly. [...][/QUOTE]
I have a 1080p HDTV, and I have seen 1080i working, and I have not seen any difference.
[QUOTE="Taijiquan"]:P I completely disagree, and I have the hardware to voice this opinion strongly. [...][/QUOTE]
I have a 1080p screen too, and seriously, I think this whole screen resolution thing is blown way out of proportion. It's like people **** when a 30fps game dips to 29 frames per second.
[QUOTE="GlassDominion"]I have a 1080p screen too, and seriously, I think this whole screen resolution thing is blown way out of proportion. [...][/QUOTE]
If you had a TV with a native resolution of, say, 1366 x 768 and you ran 1080i content on it, you would most certainly see a difference compared with running 1080i on a 1920 x 1080 native set. The resolution thing is NOT exaggerated.
[QUOTE="pablito3"]I doubt GlassDominion has a 1080p TV. Or maybe it's too small and he can't see the difference.[/QUOTE]
You're right, I didn't spend $1,000+ on a 52" 1080p screen. The biggest difference is between 720p and 1080p; the changes between 1080i and 1080p are minimal, which is what this thread is about.
Just getting P isn't worth buying a whole new TV over.
[QUOTE="paul52181"]Anyone who buys a 1080p set that's under 55-60 inches is just wasting their money. You can't even tell the difference between 720p/1080i and 1080p unless it's a huge screen and you're sitting really close to it, so don't fall for the scam. If you're looking to get a 32"-42" for gaming (I have a Sharp 32" Aquos 720p), just go with a 720p. You'll save tons of cash.[/QUOTE]
So what you are saying is that a Sony Bravia 40" Full HD 1080p will have the quality of a 32" 720p?
Please answer that. I was saving money to get the Sony (mentioned above), but if I won't see the difference between 720p and 1080p on a 40", I would rather have a smaller 32" 720p. Right now I have a 32" 480i, and it suc**.
[QUOTE="GlassDominion"]Sorry Taijiquan, you must be some sort of super cyborg, or so damn picky that you want 2160p resolution.[/QUOTE]
It is very simple why there is a difference, and a large difference. My eyesight is 20/15, which is very good, but that has nothing to do with what I am saying.
For those who have read crap on the internet, let me explain: YES, SOMETIMES IT IS POSSIBLE TO READ THINGS ONLINE THAT ARE NOT TRUE.
Here is how these TVs work. To spare myself explaining things to those who simply have no idea what they are talking about, I will start by saying I am talking about high-end TVs: Sharp Aquos, Samsung 65/71/81 series, Sony XBR, Panasonic 700/77U series, Pioneer Elite/Kuro/Pro, etc. Not Vizio and other junk people think is good because they cannot afford a better panel to compare it with. Keep in mind, the units would need to be properly calibrated, and I am not talking about just eyeballing it.
You have a TV with a NATIVE resolution, generally 1024 x 768 (popular with plasmas), 1366 x 768, 1280 x 720 (rarer than you think), 1920 x 1080, or even 1024 x 1080.
No matter what resolution you feed your TV, it is only going to display its native resolution. If it is a 1080i signal and the TV's native resolution is anything other than 1920 x 540 (say, 1366 x 768), then guess what? You are getting 1366 x 768. The problem is that in most cases the TV will not do a very good job of scaling a picture sent at X resolution and converted by the TV to Y resolution. That is why, even on an amazing TV, the picture will still look like garbage if the source material is junk. Crap in, crap out. 480i content on an XBR4 looks like junk. Feeding your television its native resolution will give the best overall picture.
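The scaling point above can be made concrete with a little arithmetic. The helper below is a sketch, not any TV's actual scaler; real scalers also filter, handle overscan, and so on. The key observation is that non-integer resampling ratios (like 1920 down to 1366) are part of why scaled pictures look soft:

```python
def scale_factors(source, native):
    # Ratio by which the incoming signal must be resampled to fit
    # the panel's native pixel grid, per axis.
    (sw, sh), (nw, nh) = source, native
    return nw / sw, nh / sh

# A 1920 x 1080 signal into a 1366 x 768 panel: every output pixel
# blends fractional input pixels, because the ratios are not whole.
fx, fy = scale_factors((1920, 1080), (1366, 768))
print(round(fx, 3), round(fy, 3))  # 0.711 0.711

# Feed the panel its native resolution: 1:1 mapping, no resampling.
fx, fy = scale_factors((1366, 768), (1366, 768))
print(fx, fy)  # 1.0 1.0
```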
Will a 720p source picture look as good as a 1080p source picture if the same brand used the same panel, the same processor, the same contrast, the same color accuracy, the same blacks? NO. 1280 x 720 = 921,600 pixels; 1920 x 1080 = 2,073,600 pixels. If there weren't a real difference, why would companies run the risk of ADDING 1,152,000 pixels? That just invites dead pixels, stuck pixels, etc., and those kill the screen.
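The pixel counts quoted above check out; here is the arithmetic:

```python
# Pixel-count comparison between 720p and 1080p frames.
p720 = 1280 * 720
p1080 = 1920 * 1080
print(p720)           # 921600
print(p1080)          # 2073600
print(p1080 - p720)   # 1152000 extra pixels in a 1080p frame
print(p1080 / p720)   # 2.25 -- 1080p carries 2.25x the pixels of 720p
```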
Here is where people get confused.
Is it true that the difference between 1080i and 1080p is very little? That depends; some TVs deinterlace better than others. For example, my Sharp Aquos is one of the few LCDs that will do this. So am I watching 1080i? NO, I am watching 1080p, because the TV deinterlaced it. There would be no way to tell a difference that I know of. If you have a 720p set that you are feeding 1080i, you had better believe you will see a difference on it compared with a 1080p set being fed 1080i. How can you get 1920 x 540 out of 1280 x 720? The TV is going to use all 921,600 pixels.
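A "weave" deinterlacer, the simplest way two 1080i fields can be rebuilt into one 1080p frame, can be sketched like this. Real TVs use motion-adaptive methods; plain weave only works cleanly on static content, and the function below is made up for the example:

```python
def weave(even_field, odd_field):
    """Interleave two half-height fields into one full-height frame."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

# Two 3-line fields (540 lines each in real 1080i) weave into a
# full 6-line frame (1080 lines in the real thing).
even = ["line0", "line2", "line4"]
odd = ["line1", "line3", "line5"]
print(weave(even, odd))
# ['line0', 'line1', 'line2', 'line3', 'line4', 'line5']
```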
On a native 720p TV, meaning 1280 x 720, if the source material is 720p the picture will look great. It will not look as detailed or sharp as a 1080p set being fed its native resolution. There are downsides to higher-resolution sets, but I am not getting into that.
Keep in mind that if a TV has a better panel, contrast, color accuracy, blacks, etc., it may be a 720p/768 model and produce a better picture than a 1080p unit, but the 1080p unit will definitely show that it has the higher resolution.
For the record, TVs I have owned and/or still own:
Vizio 32-inch 768p LCD, Samsung 23-inch 720p LCD, Sharp Aquos 1080p 42d64u LCD, Westinghouse 1080p 37WSE LCD, Samsung 768p 5054 plasma, Samsung 1080p 5084 plasma, Hitachi 51F500 rear-projection 720p, Toshiba 42-inch rear-projection 720p. I am sure I may be forgetting some.
Take the test: go to your PC's LCD monitor and change the resolution to something lower than its native resolution. Look how HUGE the difference is.
Remember how everyone said people cannot tell the difference between 30 FPS and 60 FPS? I thought science had already proved this: http://files.filefront.com/fpstestzip/;8280441;/fileinfo.html I uploaded that a long time ago; you be the judge. Not everything you read on the internet is true. Only when you have experimented with the items being discussed can you make an intelligent argument, not just claim someone has cyborg eyes. I do want 2160p, though.
Wow, this is a very interesting question, since I would also like to know the answer. I have a Samsung 32" LCD TV without the HD tuner built in, and it goes up to 720p or 1080i.
I'm considering selling it along with my DVD player to my sister, paying an extra $1200, and getting a Samsung 40" LCD with the HD tuner built in and full 1080p, and then buying a PS3 some time next November so I can watch Blu-ray discs and play PS3 games.
So I would like to know if it is worth it.
Also, what is the difference between watching a normal DVD and watching a Blu-ray disc in a Blu-ray player that has DVD upscaling to 1080p?
[QUOTE="Taijiquan"]It is very simple why there is a difference, and a large difference. [...][/QUOTE]
I'm going to have to agree with you. My Magnavox does what your TV does, deinterlaces, and all three resolutions look pretty much the same.
[QUOTE="BOZ_10"]What is the difference between watching a normal DVD and watching a Blu-ray disc in a Blu-ray player that has DVD upscaling to 1080p?[/QUOTE]
Hi-def movies are made at the right pixel count; DVDs are only made for standard 480i. All upscaling does is stretch the picture to the proper screen size (if it's a widescreen movie), and then the TV has to do some other processing to display it better. Blu-ray / hi-def content is made to be 1080p, so it is clearer, because the upscaler/TV does less work and the disc is read with the correct laser.
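The upscaling point can be illustrated with a toy nearest-neighbour line scaler (a hypothetical helper; real players interpolate and filter): the output has 1080 lines, but still only 480 lines' worth of actual detail, because upscaling cannot add information that was never on the disc.

```python
def upscale_lines(lines, target_count):
    # Nearest-neighbour vertical upscale: each output line is copied
    # from the closest source line. No new detail is created.
    src = len(lines)
    return [lines[i * src // target_count] for i in range(target_count)]

dvd = [f"line{i}" for i in range(480)]   # 480-line DVD picture
hd = upscale_lines(dvd, 1080)            # stretched to a 1080-line panel
print(len(hd))       # 1080
print(len(set(hd)))  # 480 -- still only 480 distinct source lines
```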
[QUOTE="Camble"]Don't games look best in 720p anyway?[/QUOTE]
No, they look better in 1080p: higher resolution, and you see all the lines at once, so a sharper, clearer picture.
720p vs 1080i depends on your display. 1080i has the higher resolution, which to me looks better; if your TV deinterlaces well enough, I would go with 1080i over 720p, but it's personal taste.
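The 720p-vs-1080i trade-off can be put in rough numbers. At 60 Hz, 720p sends 60 full frames per second while 1080i sends 60 half-height fields, so 1080i carries more pixels per second while 720p updates the entire picture more often (illustrative broadcast figures; actual rates vary by region):

```python
# Pixels transmitted per second at 60 Hz for each format.
p720_per_sec = 1280 * 720 * 60            # 60 full progressive frames
i1080_per_sec = 1920 * (1080 // 2) * 60   # 60 interlaced 540-line fields
print(p720_per_sec)   # 55296000
print(i1080_per_sec)  # 62208000
```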
[QUOTE="shroom76"]Hi-def movies are made at the right pixel count; DVDs are only made for standard 480i. [...][/QUOTE]
Wow, thanks for this info.