Denzel Washington just OWNED woke culture in front of everyone!

10 hours ago
Denzel Washington just delivered some truth bombs that many didn't see coming. On multiple occasions he has called out woke Hollywood on a range of topics, from cultural and racial differences to the abandonment of genuine storytelling for the sake of profit.
