Westerners—especially white people from the United States—are always the good guys. At least, that’s what Hollywood wants you to think. Something that people in this country either take for…