I think Hollywood's gotten more reactionary and conservative over the years, because there's no longer art in Hollywood. Art suffers in Hollywood.