I've been thinking about this. Since religion, for the most part, reflects the general ideas and views of a society, will religion eventually die out? It could be said that science (no, not Scientology) is our religion. We have all these scientific theories of how we came to be, how life works, etc. These theories and facts are being taught in a good majority of our schools (not alongside the beliefs of any organized religion, might I add). We, for the most part, no longer believe in a sun god, or any other gods that represent different natural phenomena.
American society, compared to how it was hundreds of years ago, has become more lax when it comes to religion. In most parts of the country, I don't have to worry about being labeled a heretic and abused for believing something contrary to Christianity.
Will religion die out eventually? If so, will some wacky new belief system rise up to take its place? Is science our religion?
Discuss.