[QUOTE="MadVybz"][QUOTE="RiseAgainst12"]
World War 1 was one of the major reasons that WW2 started, actually... so why in the name of god is that not taught?
RiseAgainst12
Because America basically is the world. Every economy in the world is pretty much controlled by the American economy. Even though the States aren't imperialistic in the territorial sense, they are economically imperialistic. If the American economy collapses, so will everyone else's. America abuses this power and ships out history books based on its own history, not that of the country it is shipping to. This means that since the US has a few dirty little secrets behind WW1, they won't publish them in a history book.
I know, but if you're being taught about WW2, I can't imagine why they wouldn't bring up WW1 and how it led to a lot of the things that brought about WW2. Hitler getting into power, for example, stemmed from WW1.

I've actually read one American history book on WW2, and it doesn't even explain Hitler's origins. The fact that he was Austrian (NOT German), that he fought in WW1, and the consequences of Germany losing the war are what shaped his extreme motives. That, and the fact that he got rejected from art school by Jews. (That's why he hated them.) The American textbooks merely describe him as a madman that needed to be stopped. (Well, the textbook I read, anyway.)