You have to realize that up until about 1959, Africa was dominated by the colonial powers. And because the colonial powers of Europe had complete control over Africa, they always projected Africa in a negative light - jungles, savages, cannibals, nothing civilized.