Americans are taught that white people did everything, but that is changing. American history, and our dealings with other cultures, have always involved a struggle over how the past is understood.