Weird distortion problem at 1920 x 1080

This topic is locked from further discussion.

#1 the_mitch28
Member since 2005 • 4684 Posts

So I just finished my media center computer today and excitedly hooked it up to my TV, an Acer 42" 1080i.

Once I hooked it all up and fired it up, I set the desktop to 1920 x 1080 and everything is distorted. I don't know how to explain it other than it looks like everything has tiny lines dividing it up, making words impossible to read and image quality extremely poor. I played around with the nVidia control panel for a good while to no avail.

Specs:
CPU: Intel Pentium Dual Core @ 2.8GHz
GPU: XFX nVidia 8800GTS 320mb
RAM: 2 x 1GB DDR2
Mobo: Gigabyte 965P-DQ6 2.0
OS: Windows XP Media Center Edition (SP3) 32-bit

It's obviously capable enough to run the desktop at 1920 x 1080. While testing before I started building, I hooked my old AGP computer up to the TV (an ATi Radeon 9600 PRO 256MB, 512MB of DDR400 RAM, a motherboard with an 800MHz front side bus, and XP SP2), and it still managed to spit out a clear desktop at 1920 x 1080 (although not in the most speedy manner).

#2 lvgaming
Member since 2006 • 739 Posts

Have you tried 1366x768 or a similar 1080i resolution?

1920x1080 is a 1080p resolution. I don't know if this is going to matter, but your Acer might be having a scaling problem with the higher resolution, or I could just be blowing smoke, I guess.

#3 the_mitch28
Member since 2005 • 4684 Posts

No, I don't think the TV is the problem; like I said, it displayed 1920 x 1080 fine from my old, crappy AGP computer. I should probably mention I'm using a DVI connection to the TV.

When I connected the old computer, it was similarly distorted at one point, but a drop-down menu saying something about "ATi plug and play" just above the resolution slider fixed that. That option is greyed out when using nVidia though.

Hmmm, I might give it a shot with my desktop computer, which I run at 1920 x 1080 on a 24" 1080p monitor, but I'm positive the problem isn't the TV. Decoding method aside, it should still be able to show a 1920 x 1080 resolution.

#4 lvgaming
Member since 2006 • 739 Posts

Alright, I was trying to brainstorm some troubleshooting. I didn't mean to say the TV is the "problem", but maybe the setup etc. could be the issue.

#5 the_mitch28
Member since 2005 • 4684 Posts

Oh, in answer to your question, there is no distortion at any other resolution, but I have lots of HD movies so I need 1920 x 1080. My 360 works fine at 1080i :S This is frustrating.

When the computer starts up, it says 30 hertz in the corner, which seems a little low to me (the TV is capable of higher), but it changes to this automatically. I've tried turning it up, but it would seem my TV has a mind of its own.

#6 gigatrainer
Member since 2006 • 2029 Posts
I'm not sure, but maybe you're supplying too many frames?

Have you tried 1366x768 or a similar 1080i resolution?

1920x1080 is a 1080p resolution. I don't know if this is going to matter, but your Acer might be having a scaling problem with the higher resolution, or I could just be blowing smoke, I guess.

lvgaming
1080i is also 1920 x 1080. The i stands for interlaced and the p for progressive; it's the way the image is processed/delivered.

#7 -GeordiLaForge-
Member since 2006 • 7167 Posts

Oh, in answer to your question, there is no distortion at any other resolution, but I have lots of HD movies so I need 1920 x 1080. My 360 works fine at 1080i :S This is frustrating.

When the computer starts up, it says 30 hertz in the corner, which seems a little low to me (the TV is capable of higher), but it changes to this automatically. I've tried turning it up, but it would seem my TV has a mind of its own.

the_mitch28
At 60Hz, an interlaced image displays at 30Hz. A 1080i signal consists of two 1920 x 540 fields: the odd horizontal lines are sent in one field, then the even lines in the next. When the two fields are combined into one image, the result is 30 full images per second (30Hz).
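That field weaving can be sketched in a few lines of Python. This is just an illustration of the idea, not anything from the driver: each field is represented as a plain list of scanline rows, and the function name is made up for the example.

```python
FIELD_LINES = 540   # scanlines per 1080i field
WIDTH = 1920        # pixels per scanline

def weave_fields(even_field, odd_field):
    """Interleave two 540-line fields into one 1080-line frame.

    even_field carries scanlines 0, 2, 4, ...; odd_field carries
    scanlines 1, 3, 5, ... At 60 fields per second this produces
    30 full frames per second, which matches the 30Hz the TV reports.
    """
    frame = []
    for even_row, odd_row in zip(even_field, odd_field):
        frame.append(even_row)  # even scanline (0, 2, 4, ...)
        frame.append(odd_row)   # odd scanline (1, 3, 5, ...)
    return frame
```

If the two fields get misaligned or scaled before being woven back together, you get exactly the kind of fine horizontal-line artifacts described above.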

#8 -GeordiLaForge-
Member since 2006 • 7167 Posts
I'm not sure, but maybe you're supplying too many frames?
gigatrainer
That's a good idea. Try forcing "VSYNC" (Vertical Synchronization) in the nVidia Control Panel.

#9 the_mitch28
Member since 2005 • 4684 Posts

[QUOTE="gigatrainer"]I'm not sure, but maybe you're supplying too many frames?
-GeordiLaForge-
That's a good idea. Try forcing "VSYNC" (Vertical Synchronization) in the nVidia Control Panel.

Alrighty, I'll give that a shot now. Which would be better to use in this situation, an HDMI or a DVI cable?

#10 -GeordiLaForge-
Member since 2006 • 7167 Posts

[QUOTE="-GeordiLaForge-"][QUOTE="gigatrainer"]I'm not sure, but maybe you're supplying too many frames?
the_mitch28

That's a good idea. Try forcing "VSYNC" (Vertical Synchronization) in the nVidia Control Panel.

Alrighty, I'll give that a shot now. Which would be better to use in this situation, an HDMI or a DVI cable?

They use the exact same signal; HDMI just includes audio. I would use DVI, especially since the audio pass-through on your video card doesn't support surround sound.

#11 lvgaming
Member since 2006 • 739 Posts

I guess I was blowing smoke. :P I've always had my PC connected to a monitor, but it's always good to learn.

#12 the_mitch28
Member since 2005 • 4684 Posts

[QUOTE="the_mitch28"]

[QUOTE="-GeordiLaForge-"]That's a good idea. Try forcing "VSYNC" (Vertical Synchronization) in the nVidia Control Panel.
-GeordiLaForge-

Alrighty, I'll give that a shot now. Which would be better to use in this situation, an HDMI or a DVI cable?

They use the exact same signal; HDMI just includes audio. I would use DVI, especially since the audio pass-through on your video card doesn't support surround sound.

Alright, I'll stick with DVI then.

I forced VSYNC on, but that didn't do the trick. I'm trying a BIOS update now.

#13 the_mitch28
Member since 2005 • 4684 Posts

Well, I'm out of ideas. I've tried going through all the nVidia settings and all my TV settings... nothing is helping. At the moment I'm using 720p just so I don't have to guess-read everything.

In 1080i, everything is just blurred and distorted, even video and images.