Waking up from woke
Once upon a time, Disney was synonymous with good, wholesome family entertainment. You could trust a Disney movie on the name alone, and as a parent, there's something great about knowing there's a studio you can trust like that.
But then Disney went woke. The studio kept pushing an agenda in its movies, driving a lot of people to turn their backs on it. Its films reflected the broader narrative suggesting the world had gone nuts.
Now, though, people know better, and it seems Disney does as well.

