Hollywood is one of the MAJOR reasons why America is as screwed up as it is. Hollywood is on a mission to relentlessly promote white male supremacy at the expense of everyone else. This country is getting more diverse, but you would never know it if you only watched movies out of Hollywood.
The entertainment industry in a country is supposed to entertain that country's population. How is a country ever going to be stable when all the media and movies are interested only in promoting white men, who make up only about 31% of the USA?
White producers used to say consumers prefer only white leads. This is bullshit to the extreme. Data has shown that movies with diverse casts actually do better. Hollywood is running out of excuses. Will they change? I doubt it, since old white men are only interested in promoting white male supremacy, even at the expense of money.