Seriously? That's the line spoken when a white billionaire is trying to "help" a community in order to get his own way. Why would the producers of this show, all of whom happen to be white by the way, and the white main characters keep going along with the cultural sabotage that comes with vilifying your own? What is wrong with them? Can you think of any nation that hates itself as much as Americans seem to? I can't.