This topic is locked from further discussion.

#51 firefox59
Member since 2005 • 4530 Posts

[QUOTE="Lehman"]JUST PUT AA ON, PROBLEM SOLVED!!

God damn.

Also, movies use motion blur to make things look smoother (not really anything to do with AA, but it is to do with FPS). Apparently movies are recorded at 24-30 FPS and yet they are REALLY smooth (most of the time) because they use motion blur. So here's my question: why can't games use heaps of motion blur (well, not HEAPS, but just a little to make it look better)?

I know Crysis does, and it also looks pretty cool as well.[/QUOTE]

I know why video games can't use motion blur the way movies can, but I'm not sure that I fully understand the tech behind it. Movies have a constant framerate throughout, not varying at all, while a video game's FPS can vary extremely at times, such as when entering new environments or new rooms, and it fluctuates a little even during normal gameplay. Because of this, motion blur can't be used in video games the way it is in movies.

I guess it's because the constantly changing framerate would make the blurring uneven and possibly even worse than aliasing. Some games like Crysis, as you said, can use minor blurring effects, but until we can find a way to keep the FPS constant, movie-style motion blur in video games isn't an option.
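A rough Python sketch of how a game could compensate for that, at least in principle: scale the blur contribution by how long the last frame took, so the trail length stays constant in wall-clock time even when the framerate varies. This is just an illustration with made-up names, not how any particular engine does it.

[code]
# Hypothetical frame-blend motion blur with a frame-time-scaled weight,
# so an uneven framerate doesn't produce uneven blur.
import numpy as np

def blend_frame(accum, frame, dt, blur_time=0.05):
    """Blend the new frame into the accumulation buffer.

    accum, frame : float arrays of shape (H, W, 3), values in [0, 1]
    dt           : seconds the last frame took
    blur_time    : how long (in seconds) a trail should persist
    """
    # Exponential decay: a fast frame (small dt) contributes only a little,
    # a slow frame contributes more, keeping the trail length constant in
    # wall-clock time rather than in frames.
    alpha = 1.0 - np.exp(-dt / blur_time)
    return (1.0 - alpha) * accum + alpha * frame

# The same scene at 60 FPS and at 20 FPS ends up with the same trail length:
accum = np.zeros((4, 4, 3))
frame = np.ones((4, 4, 3))
print(blend_frame(accum, frame, dt=1/60)[0, 0, 0])  # small contribution
print(blend_frame(accum, frame, dt=1/20)[0, 0, 0])  # larger contribution
[/code]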

#52 Baselerd
Member since 2003 • 5104 Posts

To answer the OP's question:

Aliasing is an artifact of the raster rendering techniques used in computer graphics. Basically, when the computed image is scaled to fit a certain resolution (a grid), the grid is unable to represent the image with full precision. A straight line defined as a vector cannot be represented exactly in an integer Cartesian field, so it comes out jagged.

That is to say, your graphics card doesn't calculate 3D geometry at a certain resolution. It calculates the geometry, then divides it into a discrete resolution. The higher the resolution, the more accurately it represents the actual geometry. But it is only an approximation, and we will never be able to draw a perfectly smooth diagonal line on a TFT LCD screen. If there is ever going to be a technology that can do this, I doubt it will be during our lifetimes.
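To see the staircase effect concretely, here is a toy Python rasteriser (purely illustrative, no real graphics API) that snaps an ideal line to an integer grid:

[code]
# Toy line rasteriser: the ideal line has a fractional y at each column,
# but the grid forces a whole-number choice -- hence the jaggies.
def rasterize_line(x0, y0, x1, y1, width, height):
    grid = [[' '] * width for _ in range(height)]
    for x in range(x0, x1 + 1):
        t = (x - x0) / (x1 - x0)
        y = round(y0 + t * (y1 - y0))   # rounding is the source of the stairs
        grid[y][x] = '#'
    return '\n'.join(''.join(row) for row in grid)

# A shallow diagonal shows the staircase clearly:
print(rasterize_line(0, 0, 15, 4, 16, 5))
[/code]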

If you are familiar with these examples, think of it like vector art programs versus traditional raster programs (like Paint or Photoshop), or like converting an analog signal to a digital one.

Anti-aliasing by supersampling renders the image at a higher resolution and then scales it down, averaging the samples that fall within each pixel of the smaller resolution. So 4x supersampling renders your game with four times the number of pixels (twice in each dimension) and then averages them to get the color/brightness of each final pixel.
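A minimal sketch of that downscale-and-average step, assuming a numpy array stands in for the "rendered" image (this is the resolve step only, not a renderer):

[code]
# Average every 2x2 block of high-res samples into one output pixel
# (4 samples per final pixel, i.e. "4x" supersampling).
import numpy as np

def downsample_4x(hi_res):
    h, w = hi_res.shape[:2]
    blocks = hi_res.reshape(h // 2, 2, w // 2, 2, -1)
    return blocks.mean(axis=(1, 3))   # average over each 2x2 block

# A hard black/white diagonal edge at high res...
hi = np.zeros((8, 8, 1))
for y in range(8):
    hi[y, y:] = 1.0
print(downsample_4x(hi)[..., 0])  # ...comes out with in-between greys: smoothed
[/code]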

Obviously there are smarter AA methods now, like multisampling and edge-detect filters, which incur a far smaller performance hit, albeit at some image-quality loss (generally unnoticeable unless scrutinized).

#53 Lehman
Member since 2005 • 2512 Posts

[QUOTE="Lehman"]JUST PUT AA ON, PROBLEM SOLVED!!

god damn

and also movies use motion blur to make things smoother (not really any thign to do with AA but still its do do with FPS)
apparently movies are recorded at 24- 30 FPS and tey they are REALLY smooth (most of the time) cause they use motion blur, but heres my question, why cant games use heaps of motion blur (well not HEAPS but just a little to make it look better?)

i know Crysis does, and it also looks pretty cool aswellfirefox59

I know why video games can't use motion blur like movies can but I'm not sure that I fully understand the tech behind it. Movies have a constant framerate throughout, not varying at all. While videogames FPS can vary extrememly at times such as entering new environments or new rooms, but even little FPS changes throughout normal gameplay. Because of this motion blue can't be used in video games like in movies.

I guess it's because the constantly changing framerate would make the bluring uneven and possibly even worse than Aliasing. Some games like Crysis, as you said, can use minor bluring effects, but until we can find a way to keep the FPS constant, motion blur in video games isn't an option.

Ah, I get what you're saying. But if we can max a game and get heaps good FPS, and we then put V-Sync on, can't we use motion blur? I know the FPS drops a little sometimes, but still not as much as leaving it without V-Sync.

And if you don't get what I'm saying, it's basically this:

Say we get between 70-130 FPS in a game. If we put V-Sync on then, as you know, the FPS is limited to your refresh rate (60, 70, 75, and I believe also 80). But most are 60, like mine, so the FPS can't go past 60 (which is also a good FPS rate).

So if we can limit it, couldn't we use motion blur?

It could dip a little, but only for a split second, and it wouldn't be that great of a dip in FPS.

It's just a thought.
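For what it's worth, the limiting idea by itself is easy to sketch in Python (a software frame cap, not true V-Sync, which the driver implements by waiting on the monitor's refresh):

[code]
# Hold each frame until 1/60 s has elapsed, so the framerate never
# exceeds 60 FPS no matter how fast the renderer is.
import time

TARGET = 1.0 / 60.0  # 60 Hz refresh, as in the post above

def game_loop(render_frame, num_frames=5):
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < TARGET:              # frame finished early:
            time.sleep(TARGET - elapsed)  # wait out the rest of the interval

game_loop(lambda: None)  # even a trivially fast "renderer" runs at ~60 FPS
[/code]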

#54 Collin_85
Member since 2003 • 2694 Posts

[QUOTE="zipozal"]My digicam is brand new, sporting 10 megapixels, and cost a few hundred dollars, so yeah, it's probably better than what you're sporting...[/QUOTE]

Erm. You realize megapixels often mean jack, right? Small point-and-shoots nowadays are reaching resolutions in excess of 12-14 MP. My 8 MP EOS 30D eats them for breakfast, given its vastly superior APS-C sized sensor (and, of course, the better selection of interchangeable lenses of a dSLR system).

It's always hilarious to read blind consumers talking up their cameras by proclaiming they have the most megapixels. So how does yours perform at ISO 1600? 'Nuff said.

#55 Collin_85
Member since 2003 • 2694 Posts
[QUOTE="zipozal"][QUOTE="karasill"]

If you don't think cameras implement some form of AA, then you're just a lost cause. How else do you think movies and pics don't have aliasing issues? Is it magic? It must be, because the idea that cameras/videocameras don't implement some form of AA is a stupid idea. I mean why do cameras even have video processors in them? For kicks?

You're done man, there is no point in arguing with you. When I take a picture of someone in real life, it's going to look jaggy unless you blend the pixels together.... Sorry, if you can't understand that then I question your intelligence.

karasill

lmao maybe because real life doesn't have jaggies? Honestly do you even know what your saying? You saying that digital cameras blur the edges :lol: why would the edges need blurred? Theirs no jaggy's in Real life, but if their were THEY WOULD SHOW UP in a freaking photo and yes by default my camera is set to 10 megapixels and thats what I take photos at, most people that by high res camera do so to take high res photos shock.

You don't have a 800 dollar SLR camera, if you do prove me wrong...

Why would you take photos at that res? Are you professional photographer that sells his pics to the national geographic? And no camera ever sets it's default resolution to it's highest out of the box, you're plain lying. Yes real life doesn't have jaggies, but you know what? Any line displayed in a computer monitor or camera monitor is susceptible to jaggies, real life or not. Do you know why? Because that image is now comprised of pixels, we don't view real life in pixels. Notice this plane http://i214.photobucket.com/albums/cc26/cheshire03/untitled.jpg

The camera that took this didn't do a good job of blending the pixels together. Even though it's real life, the camera takes the image and converts it into pixels, and without blending them together you can have some aliasing issues. I can't believe someone who has a "nice " camera doesn't even know the basics of how it works.... Sad.

I'd have to correct you here. The majority of cameras sold these days have default set at their highest resolution. Remember, more megapixels doesn't correlate to 'professional', so one shouldn't correlate the need to shoot at higher resolutions to the prerequisite of being a professional.

#57 Collin_85
Member since 2003 • 2694 Posts
[QUOTE="karasill"]

If you don't think cameras implement some form of AA, then you're just a lost cause. How else do you think movies and pics don't have aliasing issues? Is it magic? It must be, because the idea that cameras/videocameras don't implement some form of AA is a stupid idea. I mean why do cameras even have video processors in them? For kicks?

You're done man, there is no point in arguing with you. When I take a picture of someone in real life, it's going to look jaggy unless you blend the pixels together.... Sorry, if you can't understand that then I question your intelligence.

zipozal

lmao maybe because real life doesn't have jaggies? Honestly do you even know what your saying? You saying that digital cameras blur the edges :lol: why would the edges need blurred? Theirs no jaggy's in Real life, but if their were THEY WOULD SHOW UP in a freaking photo and yes by default my camera is set to 10 megapixels and thats what I take photos at, most people that by high res camera do so to take high res photos shock.

You don't have a 800 dollar SLR camera, if you do prove me wrong...

Actually, many digital cameras (including the vast majority of digital SLRs) employ a low-pass anti-aliasing filter just in front of the sensor, which is part of the reason why SLR photography consists of not just taking the picture but also post-processing. A degree of USM (unsharp mask) in Photoshop is almost routine in a typical professional digital workflow.

Oh, and I've got over $10,000 worth of photography gear, but that doesn't prove anything - so no idea what the $800 comment is about.
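That USM step is easy to reproduce outside Photoshop too. A small sketch using Pillow's built-in unsharp mask filter (the file names are just placeholders):

[code]
# Unsharp mask: boost local contrast at edges to counteract the softness
# introduced by the sensor's low-pass (anti-aliasing) filter.
from PIL import Image, ImageFilter

img = Image.open("raw_photo.jpg")  # hypothetical shot straight off the camera
sharpened = img.filter(
    ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))
sharpened.save("processed_photo.jpg")
[/code]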

#58 com2006
Member since 2006 • 902 Posts

[QUOTE="muirplayer"]Common sense.

Curves occur naturally in the real world. We, or better yet, matter, is not made of pixels.

I'm pretty sure, though maybe not, that aliasing occurs because what you're looking at is made of pixels (i.e., SMALL SQUARES), which have flat sides, straight edges, and corners, no curves whatsoever. Natural curves do not exist in graphics; they are only imitated by large amounts of pixels.[/QUOTE]

You've got to remember that when you take a photo of a real-world object with a digital camera, it is transformed into pixels. Also, CG movies such as Toy Story don't suffer from huge amounts of jaggies either, because they are rendered offline with heavy anti-aliasing.

#59 zipozal
Member since 2007 • 1809 Posts

[QUOTE="Baselerd"]To answer the OP's question:

Aliasing is an artifact of the raster rendering techniques used in computer graphics. Basically, when the computed image is scaled to fit a certain resolution (a grid), the grid is unable to represent the image with full precision. A straight line defined as a vector cannot be represented exactly in an integer Cartesian field, so it comes out jagged.[/QUOTE]

Alright, now we're getting somewhere. A previous poster said it's basically down to geometry and the fact that when the edge of a poly tries to occupy two different lines of the screen at once we get this artifact. That implies that if you threw enough polygons at it, the problem would disappear? Of course game engines are pushing other areas much harder than polygons nowadays, but hypothetically?

[QUOTE="Baselerd"]That is to say, your graphics card doesn't calculate 3D geometry at a certain resolution. It calculates the geometry, then divides it into a discrete resolution. The higher the resolution, the more accurately it represents the actual geometry. But it is only an approximation, and we will never be able to draw a perfectly smooth diagonal line on a TFT LCD screen. If there is ever going to be a technology that can do this, I doubt it will be during our lifetimes.

If you are familiar with these examples, think of it like vector art programs versus traditional raster programs (like Paint or Photoshop), or like converting an analog signal to a digital one.[/QUOTE]

Don't sell mankind's technical progress short. Scientists right now are working very hard to stop the process of aging entirely, as it has been well known for a while that aging is literally encoded into our DNA; in other words, it's not something that has to happen. In some lesser species they have already managed to triple lifespans, and it is expected that in the next 20 to 60 years they will be able to stop aging and eventually even reverse it.

[QUOTE="Baselerd"]Anti-aliasing by supersampling renders the image at a higher resolution and then scales it down, averaging the samples that fall within each pixel of the smaller resolution. So 4x supersampling renders your game with four times the number of pixels (twice in each dimension) and then averages them to get the color/brightness of each final pixel.[/QUOTE]

Really, is that what supersampling does? So what you're telling me is, say you run a game at a resolution of 2560 x 1600, which equals 4.1 megapixels, and you apply 4x supersampling: you're telling me you're then running the game at 5120 x 3200, which equals 16.4 megapixels, and it's being downscaled to fit your screen?

Are you sure that's exactly what's happening in its entirety? Because, you see, I find that hard to believe. If you've seen some of the downscaled Crysis images, they hit a point where aliasing literally doesn't exist, and not only does running supersampling on my games not do that, I'd say it doesn't even get close, and my monitor sports the same 100 PPI that those 30-inchers are carrying, so...

Then again, you have to quadruple the megapixel count just to double the resolution in each dimension, so that alone obviously wouldn't be enough to do it. But I don't buy that it works as well as actually running the game at 5120 x 3200 would. I mean, hypothetically, if I took an image of Crysis at 2560 x 1600 with 4x SS, and I took one at 5120 x 3200 with no SS and downscaled that image to 2560 x 1600, would the quality really be the same?
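The arithmetic in the question does check out, for what it's worth. A quick sanity check (plain math, nothing vendor-specific):

[code]
# "4x" supersampling = 4 samples per final pixel = 2x resolution per axis.
base_w, base_h = 2560, 1600
samples = 4
scale = samples ** 0.5                 # 2x in each dimension
ss_w, ss_h = int(base_w * scale), int(base_h * scale)
print(base_w * base_h / 1e6)           # 4.096 megapixels on screen
print(ss_w, 'x', ss_h)                 # 5120 x 3200 rendered internally
print(ss_w * ss_h / 1e6)               # 16.384 megapixels rendered
[/code]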

Another question on SS. I'm not sure how it works on NVIDIA cards, but in the Catalyst Control Center on my computer, under 3D, I have an Anti-Aliasing subsection and an Adaptive Anti-Aliasing subsection.

Under the AA one I have the option to force AA from 4x to 16x on the standard box filter; then there are some other filter types and a Temporal AA checkbox, but there is no mention of SS in it.

In the Adaptive AA section I have the option to enable Adaptive AA and choose its method, which can be either Super-Sampling or Multi-Sampling; under Super-Sampling I have two setting choices, either Performance or Quality.

So what I'm wondering is, where the hell does 2x, 4x or whatever fit into this? Does my Super-Sampling subsection coincide with my AA subsection's setting, meaning if I have 16x AA selected in the AA subsection then I'm running the game at 16x SS? Or are they two separate things, and when my sampling options give me Performance and Quality, is that just another way of saying 2x and 4x?

[QUOTE="Baselerd"]Obviously there are smarter AA methods now, like multisampling and edge-detect filters, which incur a far smaller performance hit, albeit at some image-quality loss (generally unnoticeable unless scrutinized).[/QUOTE]

I have an edge-detect filter option under my AA subsection, and even if I put it up to 8x AA (24 samples), not only does it give a huge performance hit, but it looks like trash, at least in TF2, the only game I've tried it on...
#60 Baselerd
Member since 2003 • 5104 Posts

Well, pretty much nobody uses supersampling. It is the most primitive type of AA. Multisampling is a different type of AA filter with a far smaller performance hit. NVIDIA recently introduced a new AA algorithm called Coverage Sampling Anti-Aliasing (CSAA). AMD also has an edge-detect filter AA, which scans each frame and only applies anti-aliasing at regions where it detects an edge in the geometry that needs to be smoothed.

And I think you are getting your terminology mixed up in some of your questioning... so I'm having a hard time addressing some of your questions.

Also, we are making huge technological advances, but as I said, a TFT (thin-film transistor) LCD screen will never be able to represent geometry exactly, because it will always be limited by its resolution. Obviously, if we had ridiculously high resolutions (tens or hundreds of billions of pixels) on our screens, we would not notice aliasing much, or at all. But as I said, with our current display technology we will not be able to render geometry exactly as it should be. To do this would require a new type of display with infinite precision/resolution.
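A toy Python version of the edge-detect idea described above (to be clear, this is not AMD's actual filter, just the general principle): find the pixels where brightness changes sharply and average only those with their neighbours, leaving flat regions untouched.

[code]
# Selective AA: blur only where an edge is detected.
import numpy as np

def edge_detect_aa(img, threshold=0.25):
    """img: float greyscale array in [0, 1]; returns a selectively blurred copy."""
    gy, gx = np.gradient(img)                   # brightness gradients
    edges = np.sqrt(gx**2 + gy**2) > threshold  # where an edge lives
    # 3x3 box blur of the whole image (a cheap stand-in for a resolve filter).
    padded = np.pad(img, 1, mode='edge')
    blurred = sum(padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                  for dy in range(3) for dx in range(3)) / 9.0
    return np.where(edges, blurred, img)        # blur edges, keep the rest

# Hard vertical edge: only the boundary columns get smoothed.
img = np.zeros((5, 6)); img[:, 3:] = 1.0
print(edge_detect_aa(img))
[/code]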

#61 Musacircuit_2
Member since 2008 • 570 Posts

Honestly, all of these new AA modes are "downgrades". Supersampling, depending on the game, looks quite superior to multisampling. And I am personally not all that impressed with NVIDIA's CSAA; the performance is good, but it certainly looks inferior to plain MSAA.

Games like Crysis and Oblivion really benefit from supersampling, but the performance hit is really bad and ultimately not worth it.

#62 firefox59
Member since 2005 • 4530 Posts
[QUOTE="firefox59"]

[QUOTE="Lehman"]JUST PUT AA ON, PROBLEM SOLVED!!

god damn

and also movies use motion blur to make things smoother (not really any thign to do with AA but still its do do with FPS)
apparently movies are recorded at 24- 30 FPS and tey they are REALLY smooth (most of the time) cause they use motion blur, but heres my question, why cant games use heaps of motion blur (well not HEAPS but just a little to make it look better?)

i know Crysis does, and it also looks pretty cool aswellLehman

I know why video games can't use motion blur like movies can but I'm not sure that I fully understand the tech behind it. Movies have a constant framerate throughout, not varying at all. While videogames FPS can vary extrememly at times such as entering new environments or new rooms, but even little FPS changes throughout normal gameplay. Because of this motion blue can't be used in video games like in movies.

I guess it's because the constantly changing framerate would make the bluring uneven and possibly even worse than Aliasing. Some games like Crysis, as you said, can use minor bluring effects, but until we can find a way to keep the FPS constant, motion blur in video games isn't an option.

ah, i get what your saying
but if we can max a game, and we get HEAPS good FPS, but we then put V Sync on cant we use motion blur
i know the FPS drops a little sometimes, but still not as much as leaving it without V Sync

and if you dont what im saying, its bascally

say we get between 70-130 FPS in a game
but if we put V Sync on, as you know, the FPS is limited to your refresh rate (60, 70, 75, adn i believe also 80)
but most are 60, like mine and so the FPS cant go past 60, (its also a good FPS rate)

so if we can limit it couldnt we use motion blur??

i could dip a little but only for a little second and it wouldnt be that great of a dip in FPS

it just a thought

You do have a good point there, and I know what you're saying, but I think the problem is this... V-Sync caps your FPS at your refresh rate, as you said, but because of how V-Sync works it would be difficult unless your FPS never dropped below the refresh rate. That would be the only way to ensure your FPS never went below a certain value, and only that consistency would make motion blur possible.

The problem is that V-Sync works by making the GPU work on the monitor's schedule: it can't swap in the next frame until the monitor is ready to display the current one, which is why, even with double buffering, your FPS can be cut in half by V-Sync. This can be mitigated in some games by triple buffering, which I'm sure you already know about.

Like I said before, I think the only way motion blur would be possible with current tech is if the FPS never dropped below the refresh rate with V-Sync enabled. But the problem is that with any recent game and its increasingly demanding graphics, that is almost impossible. And with older games where it is possible, motion blur isn't really necessary.

Lol, kinda confusing, but I understood it. Hope you get what I'm trying to say.
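Some worked numbers for that "cut in half" effect, under the simplified assumption that a double-buffered V-Synced frame can only be shown on a refresh boundary:

[code]
# A frame that misses one 60 Hz refresh interval waits for the next one,
# so 18 ms of work costs a full 33 ms: 60 FPS drops straight to 30.
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ        # ~16.7 ms per refresh

def vsync_fps(frame_ms):
    intervals = math.ceil(frame_ms / INTERVAL_MS)
    return 1000 / (intervals * INTERVAL_MS)

print(vsync_fps(15))   # fits in one interval -> 60 FPS
print(vsync_fps(18))   # just misses          -> 30 FPS (halved)
print(vsync_fps(35))   # misses two           -> 20 FPS
[/code]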

#63 Baselerd
Member since 2003 • 5104 Posts

[QUOTE="Musacircuit_2"]Honestly, all of these new AA modes are "downgrades". Supersampling, depending on the game, looks quite superior to multisampling. And I am personally not all that impressed with NVIDIA's CSAA; the performance is good, but it certainly looks inferior to plain MSAA.

Games like Crysis and Oblivion really benefit from supersampling, but the performance hit is really bad and ultimately not worth it.[/QUOTE]

Multisampling AA is not the same as supersampling AA. It's very similar, but multisampling only fully supersamples the z-buffer; that is, only the geometry edges are sampled at the higher rate and then averaged. The textures on the inside of polygons are still left aliased.

And yes, you are correct: anything that is not supersampling does not look as good. But the trick lies in balancing image quality and performance. Games use full-scene MSAA by default nowadays, since it looks almost as good and is still playable, whereas supersampling overextends even modern cards' texture bandwidth and fillrate.
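A one-pixel toy illustration of that distinction, assuming made-up inside/shade callbacks: geometry coverage is tested at four sample points per pixel, but the "texture" is shaded only once at the pixel centre, which is why MSAA smooths polygon edges and not interior texture detail.

[code]
# MSAA in miniature: 4 coverage samples per pixel, 1 shading sample.
def msaa_pixel(inside, shade, px, py):
    """inside(x, y) -> bool geometry test; shade(x, y) -> colour value."""
    offsets = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
    coverage = sum(inside(px + ox, py + oy) for ox, oy in offsets) / 4.0
    colour = shade(px + 0.5, py + 0.5)  # texture sampled once, at the centre
    return coverage * colour            # blended against a black background

# A polygon edge at x = 0.5 cuts through the pixel: half the samples hit.
print(msaa_pixel(lambda x, y: x < 0.5, lambda x, y: 1.0, 0, 0))  # 0.5
[/code]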

#64 Musacircuit_2
Member since 2008 • 570 Posts
[QUOTE="Musacircuit_2"]

Honeslty all of these new AA modes are "downgrades".Super sampling depending on the game looks quite superior to multisampling.And i am personally not all that impressed with nvidia's CSAA.The performance is good but it certainly looks inferior to MSAA.

Games like crysis and oblivion really benefit with supersampling but the performance hit is really bad and ultimately not worth it.

Baselerd

Multisampling AA is not the same as Supersampling AA. It's very similar, but multisampling only fully supersamples the z-buffer. That is, only the geometry is scaled up and then averaged. The inside textures of polygons and such are still left aliased.

And yes you are correct, anything that is not Supersampling does not look as good. But the trick lies in balancing image quality and performance. All games use FS MSAA nowadays by default, since it looks almost as good and is still playable. Supersampling overextends even modern cards' texture bandwidths and fillrates.

Actually, I am perfectly happy as long as I am able to put 4x MSAA on. :)

However, games like Crysis with their massive draw distances leave a little more to be desired. I also really like how the trees look in motion with supersampling. I can actually get away with supersampling in Oblivion at a slightly lower res, but for Crysis the settings would probably need to be medium/low for acceptable performance with supersampling, which obviously makes it not worth it.