In the aftermath of last week’s Vanity Fair story on Tinder and the end of dating there was no shortage of hand-wringing by many readers who were, rightly, appalled at what they found in the story. But upon reflection it seems odd that it would be this particular story that elicits such strong reactions from readers. In many ways the story being told is not new. We have had dating apocalypse stories for far longer than we’ve had Tinder, after all. And when you shift from the anecdotal approach used by Nancy Jo Sales, the author of the original piece, and toward more comprehensive data sets the resulting picture is much more complex than Sales’s story would suggest.
‘A Truth Universally Acknowledged…’
In a 1997 article on communal judgment in Pride and Prejudice, William Deresiewicz observed that Pride and Prejudice is, at first glance, an apparent exception to Austen’s practice of opening her novels by introducing a central character.(1) Indeed, Elizabeth Bennet’s character doesn’t truly come to the foreground until around the sixth chapter. Closer examination, however, reveals that there is a central character introduced at the beginning of the novel: the community, with its values, expectations, conventions, and practices. The opening sentence of the book—‘It is a truth universally acknowledged, that a single man in possession of a good fortune, must be in want of a wife’—is a ‘mock aphorism’, which is swiftly exposed to be nothing but a judgment that is ‘well fixed in the minds of the surrounding families’ of the neighbourhood. The earlier episodes of the story focus upon the neighbourhood of Meryton and its collective consciousness, which emerges as Mr Bingley and his friends move to Netherfield and become known to the community of the local gentry, most particularly in the opening ball. Deresiewicz remarks: ‘Elizabeth cannot appear until well into this initial story because it is that story—the story of how a community thinks, talks, exerts influence—that produces her plot, that produces her’ (504).
If the conventional blog wasn’t dead before The Dish’s demise, the shuttering of Andrew Sullivan’s iconic internet publishing venture surely signaled the end of traditional blogging. Once an intriguing new publishing form that shunned the norms of traditional journalism for a more personal and—wretched word—“edgy” tone, the blog has now all but died, with only a few odd examples holding on. These days many classic traits we associate with blogs are simply normal parts of more conventional online publishing.
Let me get this out of the way, so no one else has to say it: “Farewell, Matthew Lee Anderson.” Effective immediately, I am stepping down as Lead Writer of Mere Orthodoxy and handing full control of the site over to Jake Meador. He will assume responsibility for all aspects of the site. If he makes me “Emeritus Writer,” well, I won’t turn him down. I am also indefinitely departing from Twitter, though I will be carrying on with Mere Fidelity. Whenever we get off our summer holiday, that is (which should be next week).
Eleven years ago, a friend and advisor told me that I should begin a ‘blog,’ a new medium that was democratizing discourse and opening up career paths for people who knew nothing about the traditional means of rising through the ranks in publishing. I gathered a few close friends, took my inspiration from C.S. Lewis and G.K. Chesterton, and Mere Orthodoxy was born. It’s impossible for me to sum up everything this site has meant to my life since that day: we have never been famous or had a large audience. But our small size was one of our greatest strengths, especially in those early years. I was so young, and a barely adequate writer and thinker then, but somehow a small and extremely intelligent community formed and we argued and argued and argued together. Those years were crucial for my formation as a writer and as a person. And now that I am a decade older and still a barely adequate but much more verbose writer, I still don’t have the skills to say how much this ‘place’ means to me. Deciding to step down was the single most difficult decision I have made in a long time.
I have a roundup on Amazon’s latest innovation over at Mere O Notes so if you’re wanting to learn more about Kindle Unlimited, start there.
I. Our Technocratic Libertarianism
While Mark Lilla is basically correct in saying that we live in a libertarian era, that term is not without its problems. (Ross Douthat made this point quite well in a recent blog post.) Despite our libertarian tendencies, we are still creatures bearing the image of God and living in a world made by that God. So both the essence of our humanity and the nature of our creaturely existence constrain our ability to function as completely autonomous beings. But when a society is dedicated to such stark libertarianism at the cost of all non-coercive forms of community, only the coercive forces of big business and big government remain as coherent social bodies able to shape communal life.
Thus we have services like Netflix and now Kindle Unlimited, both of which are premised on giving the user a seemingly infinite amount of choice, yet all of the choices available are defined by the business providing the service. So our experience of the service might seem libertarian because there are so many choices and there’s nothing stopping us from choosing anything on offer.
Yet the choices available to our libertarian will are themselves defined and handed down by the only viable social bodies left to us. We just don’t notice them as much these days because Amazon and Netflix have so completely blended into the fabric of our lives that we seldom look beyond them when looking for a movie or book. This is particularly troubling with Amazon given their current spat with Hachette and their history of questionable behavior regarding Kindle books.
Sometimes it seems like our minds race to keep up with the pace of technology, that the flood of information overwhelms us. The reality, argues Tom Vanderbilt, is the reverse: technology is actually racing to keep up with us.
Our senses are voracious, taking in and processing the world at a rapid clip. It takes only 25 milliseconds for a flash of recognition to light up our brains and a quarter-second to understand what we’ve seen. That is the pace at which we experience life. Recent studies show that we enjoy running at the speed of mind. When the information we receive through our senses and the tools that deliver it are keeping pace with our brain, we experience a certain degree of pleasure. We’re in a groove.
So when, say, movies speed up their delivery of visual stimuli, we seem to quite like it, which translates into greater demand. And our wish is Hollywood’s command. Movies have steadily and relentlessly offered up quicker scenes, moving from a ten-second average in film’s mid-century “golden era” to today’s five-second scene (or the 1.7 second bludgeoning of Quantum of Solace). That is why action films like The Bourne Ultimatum seem to have a more visceral quality; their frenetic pace is moving more in step with our minds.
Yet for this we pay a price. Our brains are less able to weave these strings of rapid-fire stimuli into sustained experiences that linger in our memory. We then beg for more technologies that let us enjoy experiences at our rapid pace. Hence, Instagram. Here’s Vanderbilt:
The “technical” acceleration of being able to send and receive more emails, at any time, to anyone in the world, is matched by a “social” acceleration in which people are expected to be able to send and receive emails at any time, in any place. The desire to keep up with this acceleration in the pace of life thus begets a call for faster technologies to stem the tide. And faced with a scarcity of time (either real or perceived), we react with a “compression of episodes of action”—doing more things, faster, or multitasking. This increasingly dense collection of smaller, decontextualized events bump up against each other, but lack overall connection or meaning. What is the temporal experience of reading several hundred Tweets versus one article, and what is remembered afterwards?
Vanderbilt’s essay isn’t about answers; instead it offers the sort of clarity that invites further questions. It seems undeniably good that Google is able to offer search results at precisely the speed with which our brains demand it—less than 300 milliseconds. Or that our desire for communication and connection is no longer frustrated by the tools we’ve created. We can refresh our Twitter feed with a long drag and a “pop” of release. Like an itch being instantly scratched. It feels good. We want more. Now.
Perhaps this is also why we sense withdrawal when we’ve been away from technology’s instant gratification for too long, or feel frustrated when other devices (or people) in our lives don’t offer the same immediacy.
Do we need to carve out time to refresh and reboot ourselves? Do we go cold turkey or slap on a patch to satiate our desire for speed?
Today’s speed is useful, no doubt. Our brain enjoys it and longs for it. Yet we must remain mindful of what may be lost: the deep remembrance that our soul desires.
Two of my favorite films of recent months, Gravity and All is Lost, have more than a few things in common. Both are basically one-man or one-woman shows about individuals trying to survive in an incomprehensibly vast wilderness. Gravity finds Sandra Bullock desperately attempting to return to terra firma after being stranded in space. All is Lost shows Robert Redford (in a mostly silent, yet tour de force performance) lost in the Indian Ocean after his solo yacht venture goes awry. Both films are very much about the visceral, unnerving feeling of alone-ness; both are about the frailty and contingency of man in an often-hostile universe, but also man’s ingenuity, adaptability and cleverness in survival mode. Both are very good films that you should see before they leave theaters.
Another survival-story of sorts: Blockbuster video. The once-dominant video store chain survived the digital revolution (and transition to cloud-based media consumption) longer than many expected. Yet as we knew it would eventually, Blockbuster announced this week that it will soon be closing its final 300 stores and ending its DVD-by-mail service.
The death of Blockbuster, following the death of the record store and the local bookstore (Barnes and Noble will surely not survive much longer), marks the ongoing transition to a new era in which cultural commerce unfolds no longer in any sort of common, physical Third Place, but in a digital diaspora wherein individuals personally access streams and store (I wouldn’t say “collect”) media for their convenient consumption. And while this iMedia world has its advantages (the ability to access millions of songs and movies on one’s phone with just a few clicks and swipes), it also has severe drawbacks.
Such as: Are we losing a sense of common culture? Perhaps that is an outdated question. To the extent that it ever existed (in America for instance) “common culture” has been rapidly dissipating since at least the 1960s. Still, I wonder if the post-Blockbuster world of cloud-based media consumption is making it ever more unlikely that “culture” or “the arts” or “media” will be something that in the future pulls people together in unifying experiences, discussions and debates. After all, we don’t have to talk to anyone anymore (not even a person behind a counter!) when we purchase a movie, an album, a book. From start to finish, our entire experience of pop culture can happen through one little screen and/or one pair of headphones, wholly unique to us and totally tailored to our tastes, preferences and whims.
With everyone becoming their own self-styled curator, commentator, and à la carte consumer, and with the Internet exponentially subdividing niches, genres, and micro-communities for any of a billion interests, it seems implausible that “common” anything will survive the 21st century. As much as the Internet has gotten mileage out of the “connectedness” metaphor, it seems to be more adept at making us isolated consumers with the power to curate consumer pathways and narrative webs entirely on our own timetables and at our own discretion. We are subject to no one and nothing but our “instant” whims and desires; the curatorial power of “gatekeepers” has been diminished; metanarratives have been long deconstructed. We’re on our own, lost in the vast wilderness of the consumptive “cloud.”
Perhaps this is why the theme of “isolation” seems ever more ubiquitous in our cultural narratives. The solo shows of Gravity and All is Lost are not (of course) overt commentaries on 21st century media consumption trends. But I do think the subtext is there. We are alone, navigating our way in a free-for-all space. There’s a freedom in that. But also a terror. It’s a reverse claustrophobia: a fear of too many choices, too many open roads, too few guides and too little guidance.
One sees the isolation elsewhere. Mad Men’s Don Draper and Breaking Bad’s Walter White are quintessentially American anti-heroes: stubbornly independent, allergic to attachment and subsequently desperately alone. Walter White’s Whitman-esque “Song of Myself” in Breaking Bad–often played out in the vast, unforgiving landscapes of the desert Southwest–illustrates the sobering reality that utter independence often leads to wayward isolation. To a lesser extent, Noah Baumbach’s Frances Ha conveys a similar, albeit more humorous, sense of freedom as isolation in the character of Frances (Greta Gerwig), a twentysomething hipster whose freewheeling decisions to go to Paris on a whim, for example, or to literally dance in the streets of Manhattan, only deepen her directionless despair. It’s perhaps noteworthy that the free-spirit dancing pose of the Frances Ha poster resembles the iconic iPod ads featuring silhouetted bodies solo dancing against a bright neon background.
Are these movie and TV narratives reflecting the unforeseen isolation of the iPod age? As we further individualize our mediated and cultured lives and embrace the freedom to dance to whatever cultural beat we like, are we simply left spinning and dizzy? That’s certainly the way I felt after watching Gravity and, to a lesser extent, All is Lost: dizzy, unsteady, destabilized, sea-sick. I was left feeling hungry for ballast, for anchors, for solidity; for something outside of myself to offer orientation.
Because going to Blockbuster on a Friday night used to be overwhelming enough. But at least the options were finite. These days the sheer ubiquity of all that is available, all that is recommended, all that is buzzed about in ceaseless streams of 140-character bursts, leaves me with a bit of vertigo: spinning like Sandra Bullock in Gravity, pulled in a million directions at the mercy of vacuity, untethered and uncertain which way is up.
What and how we consume says a lot about what we value. And what and how we consume has never been more public.
Thanks to the broadcasting devices in our pockets and the social network audiences always just a few finger taps away, our interactions vis-à-vis culture are increasingly the means by which people make assumptions about who we are and what we worship.
One of the premises of my new book, Gray Matters, is that in this consumerism-as-social-media-identity world, it is all the more imperative that Christians be intentional, thoughtful and critical in their consumer choices. People are watching. We are observed, processed, known through our consumptive habits. What message are we sending?
The new paradigm of digital/mediated/consumer “identity” is on disturbing display in Sofia Coppola’s new film, The Bling Ring, which depicts the true-life drama of a group of L.A. teens who robbed the Hollywood Hills mansions of celebrities in the late 2000s. The film’s opening is interspersed with snapshots of partying teens’ photos on Facebook and Instagram, and the plot turns on the way that social media makes one’s cultural consumption public, enviable, and (in this case) vulnerable to property theft. But what is most striking is the sheer proliferation of “selfies”: characters holding out their arms with phone cameras to document (and immediately publish to the world) all manner of pursed-lip posing, stolen-cash flaunting, booze-imbibing and other such glamorization of vice.
There’s an unsettling ambience of directionless vacuity in these youngsters’ lives. Where is their sense of purpose (moral or otherwise)? All that seems to animate their reckless behavior is the possibility that it will play well on social media or get picked up by TMZ.
Bling’s teen bandits are obsessed, first and foremost, with celebrity. But it’s not that they are fans of the films or television shows which made people celebrities in the first place. Nor is it that they are particularly interested in the celebrities as people, with unique personalities and stories. Rather, what interests these Millennials most about celebrities is simply the celebrity-ness of them: their paparazzi aura, nightclub exploits, tabloid scandals and–above all–haute fashion. In short: their conspicuous consumption. As Richard Brody observes in his New Yorker review of the film,
Nobody here cares very much about movies or television shows. Nobody talks about stories, and certainly nobody is reading anything other than magazines. They know the actors whom series and movies have turned into celebrities but have little interest in the shows themselves.
This sort of fetishizing of celebrity at its most superficial (the Louboutin heels, Rolex watches, Birkin bags and Herve Leger dresses they wear), isolated from any broader narrative of who they are and why they are famous, helps explain the existence of famous-for-being-rich people like Kim Kardashian and Paris Hilton. But it also reveals a larger cultural problem, which Brody pinpoints as “narrative deprivation.”
Today’s youth, reared in the Google age of on-demand, isolated bits of information and the real-time feeds of a million little “snapshots” (tweets, Vines, rabble-rousing blog posts, etc.), have no patience for narratives that give context or make connections. It doesn’t matter who Kim Kardashian is or how she became famous. What matters is that she gets to wear Lanvin dresses while on red carpets with Kanye West, while paparazzi take note of the slightest details of her Judith Leiber clutch. And these kids want that too. Brody continues:
In their selfies and their videos, the teens broadcast themselves living out crude fantasies of what, as one of them says, “everyone” aspires to be. What isn’t shared is the way they actually live: the teens don’t depict themselves breaking into houses and cars, stealing, selling stolen goods, or driving drunk. They don’t talk about their own lives in terms of stories. Rather, they live in a world that detaches effect from cause, and they depict only the outcomes.
Hence the sheer ubiquity of selfies. For them, earning jail time for thievery is a small price to pay for the opportunity to broadcast images of themselves wearing Prada sunglasses and guzzling Cristal at Lindsay Lohan’s favorite nightclub. It doesn’t matter what they had to do to get there (steal) or what will happen later (jail). The “now” of social media glory–however fleeting it may be–is what matters.
This “narrative deprivation” is symptomatic of (or perhaps another name for) “narrative collapse,” a phenomenon discussed at length in Douglas Rushkoff’s Present Shock. Rushkoff suggests that today’s world is defined by presentist, fragmented media consumption and an “entropic, static hum of everybody trying to capture the slipping moment.”
Narrativity and goals are surrendered to a skewed notion of the real and the immediate; the Tweet; the status update. What we are doing at any given moment becomes all-important–which is behavioristically doomed. For this desperate approach to time is at once flawed and narcissistic. Which “now” is important: the now I just lived or the now I’m in right now?
Social media’s “what are you doing now?” invitation to pose, pontificate and consume conspicuously only amplifies the narcissistic presentism of the generation depicted in The Bling Ring. It makes it easier than ever to tell the world exactly what you want them to know about you. Through a carefully cropped and color-corrected selfie, depicting whatever glamorized “now” we think paints us in the best light, we can construct a public persona as we see fit.
But it’s a double deception. The projections of our self that we put on social media blast are more often than not deceptive in the way they skew, ignore or amplify realities that constitute our true identity. But it’s also a self-deception. That social media conflates our identity with what we consume leads us to the erroneous conclusion that “who I am” can be easily summed up in the ingredient-listing “profiles” of the bands, brands, books and causes we “like,” the restaurants at which we “check-in,” or the songs we let everyone know we are currently enjoying.
Social media exacerbates our ever-growing tendency to approach cultural consumption as more of a public, performative act than an enjoyable, enriching experience. It becomes less about the thing we consume and more about how our consuming of it fits our preferred image. Bling’s high school burglars steal thousands of dollars worth of jewelry, clothes, and shoes not because they find those things inherently interesting, beautiful or pleasurable; but because they hope the accoutrements of celebrity will rub off on them. The things themselves are merely a means to an end.
For anyone who loves culture and recognizes the inherent beauty and value in, say, an expertly crafted table or an exceptionally roasted coffee bean, it is regrettable to see such things reduced to status symbol or fodder for social media selfie-deception. Making cultural items mere props in our social media performance is just another way of “using” culture to meet our needs rather than “receiving” it and letting it “work on us,” to borrow from C.S. Lewis’ An Experiment in Criticism.
For Christians, resisting the temptation to use culture rather than value it for its inherent goodness is a worthy endeavor, but it’s not enough. Using culture for self-worship is bad, but worshipping culture for its own sake is too. The “goodness” of culture, while certainly a thing to be celebrated, comes not from what it can do for us or even what it is in itself, but rather what it reflects about God and how it points humanity toward Him.
Every piece of culture we consume is an opportunity to glorify and give thanks to the Creator. We of all people should not cheapen culture by reducing it to something that mostly serves our narcissism. We of all people should not strip a cultural thing of its God-given goodness by focusing on its potential to aid in our strategic social media identity construction.
For Christians, culture should never be a tool in service of selfie-deception or self-worship. Rather, it should be something that brings us to a posture of gratitude and confronts us with who we really are, laying our deceptions bare and focusing us away from ourselves. And if our consumption of culture communicates anything to the world, it should be a testimony not to our own greatness, style, or Valencia-filtered taste, but to the grandeur and glory of God.
This is the second in a series of posts on contemporary Christianity’s relationship to culture, based on ideas from my soon-to-be released book, Gray Matters: Navigating the Space Between Legalism and Liberty (Baker Books).
That’s the subject of my latest essay over at The Gospel Coalition. Here are my concluding paragraphs:
Yet the more interesting cases come closer to us. Consider the interrelationship between caffeine and marijuana. On the one hand, many of us rely on caffeine to fuel our work obsessions. Caffeine abuses reveal an overworked, exhausted culture that refuses to rest. A cup of tea is a wonderful gift. Five cups a day may signify unhealthy dependency.
On the other hand, recreational marijuana use can engender something resembling sloth. Proper relaxation is a sort of satisfaction—”a job well done”—not a form of escape. Cannabis use may undercut this rest, or at least short-circuit it.
Sloth and overwork are symptoms of the same diseased understanding of how we labor. Some people will strap themselves to and die on the wheel of performance, while others escape their troubles by medicating themselves. In that sense, drugs are (ab)used to therapeutically fill a gap that is felt without being articulated.
Drug use of various kinds highlights our culture’s fundamental commitments and raises questions about how we interact with those commitments as Christians. Just how far does the therapeutic mentality infiltrate our churches? The fastest-growing segment of drug use seems to be painkillers and prescription medicines. Such “white collar” abuses reveal the same sort of escapist mentality that marijuana may foster in different social contexts.
Expanding the framework for evaluating marijuana implicates us all. But the gospel of Jesus Christ creates churches where we carry one another’s burdens. We admonish one another by observing the ways we have failed in our discipleship because we idolize performance and success. Then we begin the process of repenting for our own sins and ensuring that a gospel-centered judgment about whether to use marijuana will actually sound like good news.
I approached the piece as something of an exercise in moral reasoning. It’s underdeveloped in a lot of ways, but I am attempting to expand some of my earlier thoughts on the body into new areas. Make of all of it what you will.