Matt Archbold graduated from Saint Joseph’s University in 1995. He is a former journalist who left the newspaper business to raise his five children. He writes for the Creative Minority Report.
Harvey Weinstein is a problem. But he's not THE problem. It goes far deeper. We all know that Harvey Weinstein is a symptom of a diseased culture.
I enjoy going to the movies, but I've noticed something disturbing in recent years. It's hit me a few times while watching a movie that the people who made the film don't actually like people. You can see it in the way they portray suburban life as if it's something merely to be mocked. You can see it in the way the movies depict businessmen and women as if they're always up to something nefarious. You can see it in the casual way pedestrians are crushed or killed by an explosion while the caped protagonist battles the supervillain. Sometimes those deaths are even played for laughs, as if death were retribution for being an extra, a normal person.
It hit me that Hollywood is angry. I think I'm understanding why.
The men in charge in Hollywood use and abuse women. The men view the women as objects and the women see men as oppressors and potential assaulters. And to at least some extent, these are the people creating our cultural conversation.
Is it any wonder that people in Hollywood believe that all businesses require a drastic increase in oversight and regulation? Is it a shock that many in Hollywood talk about the #waronwomen or #rapeculture? Is it a surprise that they talk about the women's movement as if it's still the 1940s? Maybe that's because in Hollywood, nothing much has changed since then. They thought they were describing the country, but in reality it was a call for help from Hollywood.
Maybe it's time that Hollywood caught up to the rest of the country. Maybe these cultural elites should listen to us normals for a little while. I know this would upset the worldview of many, but maybe the normals have something to teach them.
Hollywood has declared itself to be an oasis of progressivism in a country of Neanderthals. But the light shining upon it reveals the ugly truth. Hollywood is different from the rest of the country.
And yes, I'm going to say it, a large part of that difference is godlessness.
I remember a number of years ago, Ricky Gervais signed off from his stint hosting the Golden Globes or some other self-congratulatory awards show by thanking God for making him an atheist. I think it was supposed to be a clever way to mock all the award winners who thanked God for their wins. But there was something he didn't notice: nobody had thanked God. At least I don't recall anyone who did.
Without God, we are all just creatures vying for our own pleasure, comfort, or advantage over others until the moment of our meaningless demise. In a godless universe, there is no miracle of life. There is only happenstance. And we all know that the moment you don't treasure all life as sacred, others become a commodity. It is a necessary logical conclusion. Once another is not priceless, a worth is affixed that can be measured and weighed against the worth of other things. And when I say "things" I mean it because that is what people become. Things.
I was watching a television show the other day because it was funny and seemed to be about redemption and helping others. One seemingly wise character told another that God doesn't care about what two consenting adults do with each other. Well, all I could think was, if God doesn't care about that, what does He care about? It seems to me that's exactly what God cares about. The Bible does away with that silly notion rather quickly, when the only two consenting adults on the planet make a pretty bad decision right away. I don't remember Adam and Eve yelling back at God that they were two consenting adults who ate the apple, so He shouldn't have a problem with it. (They also didn't claim to be going into therapy for their problems.)
We are not called by God to obtain consent from others to do with them what we please. We are called to love them. Love is a godly obligation. In the absence of that, mere consent becomes a legal duty. There's a great deal of space between those two visions of reality. Christians are asked to see God in others, not rub up against others for our own pleasure.
Even Harvey Weinstein, sometimes at least, sought consent. That didn't make his requests any less an abomination. And let's say he received a reluctant consent from someone frightened that he would destroy her career. His thinking might very well have been "Who did I hurt?" by showering in front of women or massaging them. Two consenting adults, right? That sort of morality boggles the brain with its malice. Seeking consent to use others is not love.
To think that for years he paraded his monstrousness without fear of being exposed is a horror. The silence from the Hollywood community was a form of consent. And I don't mean the silence of his victims; I'm talking about those who knew and remained silent. I'm pleased that many women are now speaking out about Weinstein and others. My suggestion would be to name names and press charges. Now is the time.
I think I read somewhere that "if I speak without love, I am no more than a gong booming or a cymbal clashing." And I think that aptly describes what's been coming out of Hollywood for many years.
The heart of the issue is a lack of love. The soul of the problem is a lack of God. I know it sounds simple. But that's the beautiful thing about it. Evil and lies are twisted, gnarled and complex. Love is simple. I pray we see more of it coming out of Hollywood in years to come.