When I say "we", I am referring to the U.S.A., not any other nation, although other nations can certainly be brought up in this discussion.
Are we a Christian nation? Most would say "yes", we are. But why? Is it because the majority of Americans claim to be Christians? If that's the reasoning, why not also call America a white nation, or a female nation? Does the majority define a country? I'd hate to think so. Discuss. Thanks for your time.