Questions to the culture: Is Hollywood finally showing black films and the black narrative some love?


In the past two months I’ve seen Hidden Figures, Moonlight, Get Out and Fences. All in real cinemas, by the way, not some random independent ‘theatres’ in the back roads of Shoreditch, but in the Odeons, the Cineworlds and the Vues. No, I’m not bragging if that’s what you’re thinking, I’m just trying to emphasise my point. Four different movies with black leads, four movies showing completely different black narratives, four movies that were extremely successful; all out of Hollywood! That’s odd, right? I don’t think I saw one slave or heard any type of depressing humming. Are we seeing a change in direction for Hollywood? Part of me doesn’t trust it. My brain is working hard to find reasons for it. Why now? Why 2017? Don’t get me wrong, I’m extremely happy about it; these four movies are amazing. Each one brought out emotions in me that the film industry hasn’t made me feel in a while. I was sad seeing Chiron get bullied for being different, angry when Denzel dropped the “I’m gonna be a daddy” line to his wife’s confusion, annoyed every time Taraji’s excellence was overlooked, and smiling and yelping psychotically when Chris finally got his revenge and did what we all wanted him to do. If these references don’t make sense to you, then I’m afraid you have a lot of catching up to do at the cinema.

I personally just found it odd that all these movies were coming out one after the other, especially after the big Oscars mishap with them announcing the wrong movie for the Best Picture award, which should be basically impossible since there’s only one movie on the card, but let me not get started on that. Is Hollywood just trying to get back in our good books after last year’s #OscarsSoWhite catastrophe and all the backlash they received? Makes you think, right? Maybe I’m just fishing for conspiracy theories or looking way too hard into it. All I can say is, whatever the reasons for Hollywood’s decision to finally show black movies some love, I’m happy they’re doing it; especially as a young black writer, it’s just really nice to see. Hopefully the success of these movies will convince Hollywood to carry on making movies like these, showing narratives that we’re not used to seeing on TV, Moonlight especially. When I was watching Moonlight it felt strange to me at first, which sounds quite bad, but I’ve just never seen that side of black masculinity explored on screen, and when it is, it’s usually through a side character only there for comic relief. We need more of these types of narratives because they’re the narratives that reflect real life; that’s why, in my opinion, they leave a stronger impact: because they’re real.

So anyway, questions to the culture: Hollywood, just what are you up to?


Words by Michael Ukaegbu.