Hollywood has always been political. The industry considers it its right and duty to tell us what is politically good and right.