White actors still get far more money in Hollywood. It's been that way for a very long time. I hope it'll change, but it's a matter of forcing that change.