Hollywood has been the epicenter of much of what many call “wokeness.” I’m starting to lean away from that term, mostly because people have begun applying it to just about anything they dislike, but we all still have a good idea of what it means.
And Hollywood is a big part of the problem. They’ve shoved that particular agenda down people’s throats, chalked up their financial losses to hatred rather than to bad movies, and pretended all the while that they’re the moral ones.
But it seems we’ve got it all wrong. Why? Because they weren’t tripping over themselves to accommodate writers.