When I am between projects, I often read poetry and dip into surrealism in order to spark something new. I was reading a book of surrealist games. The surrealist movement was an answer to the tyranny of the rational, a quest to unleash the power of the unconscious in art making. The exercises in the book were designed to throw a random element into writing and to create odd juxtapositions. It occurred to me that what is different now from the 1920s and 1930s is that we do not need to go out of our way to encounter bizarre juxtapositions. We encounter them every day in our social media feeds. We are awash in unrelations. Basketball has nothing to do with genocide, and yet there they are beside one another in the Twitter feed, and that is normal, and you scroll on, movie reviews and calls for papers, and a clipping from a 1910 newspaper and a picture of a kitten…
It is all surreal. This is our reading culture. Everything is Lobster Telephone.
But the juxtapositions of TikTok or Twitter do not invite contemplation. Dali’s Lobster Telephone asks you to stop and engage because it’s odd. Twitter asks you never to stop looking. There is nothing surprising or startling about unrelated things coming together because the environment serves this up constantly, unrelentingly. It is all distraction and no anchor. There is no separate lobster and telephone to begin with.
To me, social media feels like a slot machine where you keep pulling the handle waiting for a prize to come up. The prize, if you can find it, is something that you can comment upon, because the platform is fundamentally about expressing yourself, commenting. The deluge keeps coming, the trends go by faster and faster. To be part of the conversation you need to post before the moment is gone.
Entertainment and news blend. Human beings become metaphors. It is all part of the show. Back in 2019, I wrote a post asking what we should call this era in our artistic culture. “How does the self-conscious audience and the self-conscious creator–aware of how the work might be star-rated and dissected–shape the current art movement?” I didn’t come up with a name for this era. Today I was wondering what the art that reacts against this would look like. When everything is lobster telephone, what is the artistic corrective?
That would be the opposite of this present moment.
You would have to leave your phone outside and sign a non-disclosure. You would promise that you would not speak to anyone about what you saw inside. No social media posts. No reviews. You would have to experience it without comment of any kind.
How would it feel to experience art that you knew you would not tell anyone about? How would the artist approach it having no element of “platform building” or “branding” or “exposure?”
Ars gratia artis (the motto of MGM, the little art film outlet behind Terminator and Indiana Jones).
It’s hard to imagine.
Remember to like and subscribe and leave a comment below…
Sotomayor believes that partisanship in Congress started to grow when cameras were allowed in. Since then, senators have been standing on the empty Senate floor speaking “to the camera not to each other… Many senators told me that they felt much of the collegiality died when they stopped getting together in that room and were forced to listen to each other and were forced to sit next to each other and talk to each other.”
Bloomberg today shared a clip of presidential candidate Pete Buttigieg, who has invited a press pool to travel with him on his bus. This, they note, is something that has not been done since John McCain’s “Straight Talk Express” in 2000. The candidate pointed out that this means that campaigning hasn’t been done this way “since the social media era began.”
Why have a bunch of journalists, who might not present you as you’d like, follow you around when you can reach the public directly with a tweet?
In both cases–the senators speaking directly to the camera, the presidential candidates tweeting directly to the public–you’re bypassing confrontation and pushback, and also bypassing the natural empathy that tends to come with face-to-face conversation. It is much easier to caricature someone’s position and use it to your own ends when they’re not sitting beside you.
It’s not just the politicians though. Voters play to the cameras too. Buttigieg observed that instead of shaking hands, the people who meet him want a photo with him.
A handshake is just between two people. It cannot be shared with a wider audience. We’re more focused on having something to show the people not in the room than on the quality of the interaction in the room. The unobserved moment may as well not exist.
It struck me that in the social media age we’ve all become adept at playing to the cameras. We have more outlets than ever to share our views with the world, and at the same time, our desire to have two-way conversations has dwindled.
You see the result in a lot of disputes where people do not even think to talk to a person directly before going to an authority or the public to complain. “So and so said/did this and it made me uncomfortable and he/she should be fired…” And in response, the problem the organization seeks to solve is often the PR disaster rather than the interpersonal conflict between these individuals. They, too, play to the cameras.
We have an entire attention-based economy, where we try to “build platforms” and get clicks and likes. (Chris Hayes had one of the best observations about social media; he called it, if I remember rightly, “weaponizing our human need for affirmation.”)
As I previously mentioned here, Pew Research Center findings show that social media actually stifles discussion of important issues. That is probably not surprising. What is of greater concern is that the researchers found that social media users were less likely to share their opinions even in face-to-face discussions. We get used to framing things in the least controversial manner in order to avoid being unfriended or unfollowed.
Because there is nothing everyone agrees on, we form little safe zones online where we assume most people will look at things as we do. Within these tribes the range of discussion and thought becomes homogenized.
Is it possible that the pendulum has swung as far as it can in the direction of broadcasting ourselves and that we’re due for a shift back to a culture that values community and face-to-face interaction over being known to a large impersonal audience? Have we all used our proverbial 15 minutes and gotten sick of it? Time and Twitter will tell.
The 2016 film Christine is based on the true story of Sarasota local news personality Christine Chubbuck. I did not know her story when I selected the film under the category “critically acclaimed dramas” on my streaming service. The blurb described the movie this way: “In a film based on true events, an awkward but ambitious TV reporter struggles to adapt when she’s ordered to focus on violent and salacious stories.” Journalism movies are a genre I often like, so I chose it. It was not at all what I had been expecting based on the description.
In retrospect, I believe I had read about Chubbuck when I was studying broadcasting in college, but I didn’t connect it to the film I was watching. The filmmakers undoubtedly assumed that the people who bought tickets would know how the story ends. It is not a spoiler to say that what is best known about Chubbuck is how her life ended. One morning on live TV, before her regular segment, she read the following: “In keeping with Channel 40’s policy of bringing you the latest in blood and guts and in living color, you are going to see another first: an attempted suicide.” And then she shot herself on live television.
Had I watched the trailer before selecting the film, I would have had more of a sense of its tone. This is one case where I feel knowing the ending in advance would have made the experience of watching the film better. It would have added a tension and urgency to what was unfolding on screen. Instead, I spent most of the film wondering why I was watching this woman struggle with mental illness. What was the purpose, the point of view, of this story?
It is, however, a film that has stayed with me, and in retrospect, what seemed to be its weaknesses while I was watching are its strengths. It is a film in which easy answers and clear villains are absent. She has co-workers and family who are patient with her mood swings and who want to help. Chubbuck’s frustration with the shift towards sensationalism for ratings is present, but it is not a bogeyman, just one of many problems that Chubbuck is ill-equipped to deal with. She is not seen as worthy of promotion by the powers that be, and the sexism of the time is present, but even if there had been a level playing field, it is not clear that Chubbuck had what it took to succeed in her field. Her erratic behavior and outspoken insubordination would have gotten her fired in most workplaces. She was stiff on camera. The obstacles she faced were real, but her internal struggle was bigger than anything external.
It is rare to have a film in which a woman who is difficult to understand and to like is the viewpoint character. That alone makes the film interesting. Rebecca Hall, who played Chubbuck, said she was drawn to the film for just this reason. “There are a lot of films about the coolness of being a misfit,” she said. “I don’t know how many films there are, certainly about women, where it shows how painful it is to feel that you don’t fit in and that you are different…”
In this era, where we are sensitive to the idea of appropriation, something that comes up quite a bit in articles about the film is the fact that the writer and director are both men. Should a man have been the one to tell this woman’s story? Is this just exploiting Chubbuck again?
Each of us has many facets to our identity. Yet we consider some identity categories to be more fundamental than others. I am firmly of the opinion that the best person to tell a story is the one who is taken with it and can’t let it go. Craig Shilowich, the writer of Christine, was drawn to the story because he had experienced depression himself. In the lead-up to her dramatic last act, he saw a vehicle to explore mental illness. I would argue that the most important aspect of Chubbuck in this story is not her femininity but her mental illness.
Shilowich refuses to turn Chubbuck into a symbol of a greater cultural message. It might have served the drama better if he had, but he was right to resist the easy sensationalism that Chubbuck’s final statement seems to critique. In the end, I was left with a visceral sense of the frustrations of trying to reach someone who is depressed and who makes herself unreachable. Most of us have experienced–if not clinical depression–at least periods of feeling like an outcast, feeling misunderstood or unable to connect to others.
I was not left with an answer to the perhaps more compelling question of why Chubbuck chose to act in such a public manner. Why did she choose to make her final act a violent rebuke? It was a death that was engineered not only to end her own pain, but to inflict trauma on others who were forced to witness it. We can understand and empathize with the person who finds it too difficult to go on living, but the person who wants to force other people–strangers, society at large–to suffer with her?
I find a line from the Boomtown Rats song “I Don’t Like Mondays” repeating in my head: “They could see no reasons ‘cos there are no reasons.” It is fortunate that most of us find this incomprehensible and can’t truly empathize.
The film succeeds, then, in what it attempts to do. It is a think piece, a take on a sensational, tabloid-esque story that is consciously anti-sensational and humanizing. It is at the same time disturbing and, for a film that is framed around an ending, strangely unresolved.
There was a line in a Rolling Stone review of the film that struck me. It was, wrote Sam Adams, “a time when things could happen without being recorded.” This led me to a whole series of reflections on how the dictates of what constitutes a good story, and a proper ending, affect our day-to-day lives and how we see ourselves. This article is already too long, so I will leave those thoughts for another day.
Seems like it’s been a rough week for you. I’ve been reading your mea culpas, and I am pleased to see your soul searching about the effect of the economy on the working class, the amount of coverage you give to rural issues and labor issues. I hope that these post-election realizations lead to real action on your part. And I’m glad to see the issue of fake news circulating on Facebook coming to the fore. It turns out all those “media elite” gatekeepers do perform a needed service, helping us to know what is fiction and what is news. I’m sure you take some comfort in the idea that it is the delivery system and not the coverage that is broken.
Before these narratives get too locked in, I would like to ask you to do a bit of soul searching about another kind of media bias– the bias for drama and suspense. I will admit that by addressing this to “the media” I am being overly broad. What I am responding to mostly is television coverage of this election. While more people may see stories by passing them around social media, television still sets the stage for water cooler talk, and gives certain stories prominence by covering them or not. What did the major news outlets cover? Not policy issues.
In watching TV news coverage of the campaigns, which I did a lot of, I saw two things: controversy and pundits’ reactions to it, and predictions of who would win the horse race based on demographic stereotypes of different regions. (I’m a woman from Michigan and I’m kind of tired of being seen as a rust belt, suburban, female, college educated…blah, blah, blah.) This is all exciting, and perhaps it succeeds in getting clicks, in the case of newspapers, and stealing viewers from American Idol, in the case of TV, but it doesn’t help voters make informed decisions.
I am suddenly seeing lots of coverage of potential conflicts of interest with Trump’s businesses. I recall one news cycle and one well-publicized story about that in the twelve years or so (at least that is how long I think it was) leading up to the election. It is late to start focusing on that now, isn’t it? Was the fact that Trump’s organization did business with an Iranian bank linked to terrorism out there before the election? Because I don’t recall seeing any stories about it, and I watched the news every day.
Perhaps the lack of this scrutiny was due to your original sin of not taking seriously the possibility that Trump could win. If you had believed that, I have to believe, you would have given more thought to the conflicts and issues that would arise if Trump were elected and brought them more to the fore. Wouldn’t you? God, I hope so. Or were they just too boring and not tied enough to the Red/Blue culture wars to generate clicks, likes, shares and viewers?
There must have been some time you could have taken away from the big board speculations to ferret out some of these issues.
Now, I have to say that I am a writer myself and I’ve worked as a journalist and I am writing to you because I respect you so much and value what you do. The “media” is made up of a lot of individuals who are doing great work– many of you agree with all I am saying. Keep fighting the good fight.
Before I let you go, there is another thing I’d like to mention. Election turnout was down this year, contrary to predictions. Democratic turnout especially was down, and this more than anything sealed Clinton’s fate. I know you see your job as explaining the results and creating a narrative. What I am hearing is a lot of analysis of how Clinton failed to speak to voters. But is it possible you might yourselves have played a hand in this? What impact might it have had when, a month or so before the election, as the pussygate bus tape came out, you declared the election over and said Clinton had a 90% chance of winning? If you like Clinton, but you have a couple of kids you have to get to school, and you work the kind of job where you don’t get paid if you take time off to vote, and you have been told that there is a 90% chance that the candidate you like is going to win anyway, that there is really no chance the other guy can win–how motivated will you be to get to the polls? How much will you believe your individual vote matters?
So yeah, you missed some things. Try to do better next time, won’t you? Because it’s kind of important.
“Never play to the gallery,” says David Bowie in the clip above.
I discovered something interesting when I looked at the logs for my blog. (My blog logs.) Conventional wisdom is that writers need to blog in order to build “an author platform.” The way to build such a platform is to have a consistent, recognizable topic or area of expertise.
A funny thing happened. I started this blog when I branched out into fiction as a way to distinguish my fiction writing persona from my non-fiction writing persona. Initially I wrote largely on subjects that touched on the theme of my first novel.
Eventually, however, I lost interest in those constraints as I moved on to other projects. I started to post on whatever topic caught my interest on a given day, whenever I felt as though I had something worth sharing.
A number of years ago I started reading a great deal about Oscar Wilde and his circle. This had nothing to do with any book I was writing at the time (although it has come full circle as I have sold a book on this topic and am working on it now). From an “author platform” perspective, it made no sense to post about Oscar Wilde, Lord Alfred Douglas and the like. It had nothing at all to do with my second novel, which is about personal identity, rock stars and online impersonation. If I was trying to create a Laura Lee brand the Wilde posts only muddled things.
Yet those posts are consistently popular. Now, I can’t say that this means that all of the people who googled “Give a man a mask and he’ll tell you the truth” and landed on my page can be claimed as “my audience.” They came for Wilde, not Lee. I get that. But they do come, which is more than they were doing before. Maybe some read what else I’ve written and find some through-line that persuades them to stay. Now that I am actually writing a Wilde-related book it has come full circle, the “platform” was built without conscious thought or effort because I wrote about what was interesting to me.
Do what you love, the audience will follow. Or maybe they won’t. In any case, it is a more pleasant way to spend your life than doing what you don’t love.
Ollie switched on the television. There had been another one of those mass shootings. A guy went barmy and shot a dozen people in the office building where he worked. To Ollie it meant that regardless of what channels the hotel subscribed to, there would be something interesting to watch on TV. He flipped through the channels until he found CNN and then threw the remote into the center of the bed. He listened to the report as he opened his suitcase and pulled out a small bottle of vodka and the manila envelope that held his final divorce papers. He’d been carrying them around for about a week, and hadn’t found time to look at them until now.
The television coverage of real-life mass shootings was like genre fiction, a cookie-cutter mystery, except that in this case you know whodunit, but you keep flipping pages to find out why. This story followed the same basic formula as other mass shootings; only a few characters and details were switched. There were the politicians praying for the families, the candlelight vigils, the photos of people in tears surrounded by police in bulletproof vests; there was the slow unwinding of the killer’s biography, which revealed everything but his motives. There was the debate about guns and mental health, destined to go nowhere. “Heroes” who jumped in front of bullets and newsmen drawing optimism from the fact that people came together to help the victims…
On the TV screen was the last tweet of one of the victims. She was looking forward to going to a concert with a friend that weekend. She’d just bought the tickets. She didn’t know that would be her last public utterance. Her first, really, for a national audience.
“I shouldn’t know this woman’s name,” Ollie thought.
This excerpt from the novel Identity Theft came to mind this evening as I read an article in The Wrap, a Hollywood business publication. The article speculated on why mass shootings are so rare in Hollywood movies and television programs.
To me, the answer seems obvious. We already have the drama of mass shootings on television. If we’re not sufficiently engrossed in the latest permutation (I’d forgotten about the Oregon shootings entirely until a TV segment mentioned it today, and I still don’t remember the details), we can tune in in a week or so and another one will be shown.
Did you see that crazy episode earlier today when the landlord decided to let reporters into the alleged shooters’ apartment and they fought to get shots of family photos and children’s toys, like the Black Friday fisticuffs they show us to fill the airwaves each Thanksgiving weekend?
I happened to notice today on Facebook that a particular story was trending, a story about a reporter who was suspended over something she tweeted.
It is precisely the type of story that tends to trend in social media (as I mentioned in my last post). It gives sharers the opportunity to make an identity statement– agreeing with the original tweet: “House passes bill that could limit Syrian refugees. Statue of Liberty bows head in anguish.” Or arguing vociferously against it. It allows people to express outrage either at her suspension or at her opinion.
Washington Post media critic Erik Wemple was the one who called the tweet out as an example of partisan bias; this is what set the whole thing in motion. He said it was out of character for CNN, which positions itself as the nonpartisan news channel. I agree. It was editorializing, and it did not sit well with how CNN wants to present itself. CNN wants to be the unbiased network. This is how it distinguishes itself from its competitors, Fox and MSNBC.
There is something that troubles me in this, however, and it is a bit difficult to articulate. It is the whole question of what is “partisan.” There is something disconcerting in how we assume people will respond to particular issues and that they will have clear political or ideological poles. This comes from our social media use of news as a vehicle for self-expression and our fear of expressing points of view that differ from those of our peers.
It bothers me that responses are predictable enough that expressing an opinion on certain kinds of stories will inevitably identify you–rightly or wrongly–with a particular “team.” So a person might not express a point of view out of fear of assumption creep: if I express an opinion that you associate with a particular political pole, you will assume that I am saying everything else that people in that camp are also saying. The fear of offending team A or team B accepts and reinforces existing polarities. We accept, en masse, that certain topics are by nature fodder for partisan confrontation. By making something “partisan,” you can avoid dissent from anyone but people who are assumed to be your enemies, and they can simply be written off.
The Huffington Post ran a story comparing another opinionated tweet by the same reporter, one criticizing President Obama, that did not result in a suspension. I do not believe the difference was a political bias on the part of management. It was simply that a well-known media critic called out one of the tweets and not the other, which could simply be a product of when he happened to log on to Twitter on a particular day.
Thanks to CNN suspending the reporter, her statement got far more exposure than the tweet ever would have. (I happen to agree with her assessment, but that is not really the point for my current purposes.)
What really struck me in the commentary on this story was a description of CNN’s editorial policy from media critic Wemple in New York Magazine.
“CNN strives for a tricky balance in its news programming. It wants spicy, watchable coverage enlivened by perspectives and opinions — but no partisan biases from its corps of reporters and anchors.”
“Spicy, watchable coverage” is perhaps the best– and also the most worrisome–summation of the “entertainment” bias in television news I was describing in a previous post.
I couldn’t really put my finger on what I found so troubling in the notion of “spicy coverage” until later in the day when I happened to turn on MSNBC, where I saw a reporter talking about the latest ISIS propaganda video: a slick, well-produced clip showing a Hollywood-quality special effect of the Eiffel Tower being downed.
The talking head tried to downplay the threat in the video by saying that it was created as propaganda. “They are designed to grab attention and to get the media to show them,” she said, and then, with seemingly no self-consciousness whatsoever, she played the video, and it ran on a continuous loop on a split screen as she interviewed an expert on the other end of the screen. Incidentally, studies show that news viewers react more strongly to the images on television than to the verbal content. It didn’t matter much what the talking head on the other side of the screen had to say. What people saw and internalized was a vision of ISIS taking down a beloved landmark in a way that conjured memories of the destruction of the Twin Towers.
Let me repeat this point: She said “ISIS created this video so the media will show it” and then went on to carry out ISIS’s wishes as if the network had no say in the matter. We have to put it on, it’s really dramatic, and if we don’t, people will tune into CNN or FOX to see it…
Modern war of the ISIS variety is made up of a series of television friendly events. Mass shootings are media events. They are performed by angry, violent young men who feel powerless and ignored and they want attention.
I don’t care much that Elise Labott thinks that the House vote to make it more difficult for refugees to come to America is contrary to our values. Nor do I much care that the same reporter thought Obama was “whining” at the G-20 summit instead of proposing real solutions.
None of that has the kind of real-world implications that the automatic nature of our reporting on the visually exciting, dramatic and cinematic does. ISIS sent us a video, and it is really scary. Now that is spicy. Let’s get it on the air fast!
Conflict and fear are dramatic. Stoking them is good for ratings. It is entertaining television. It does not make for good public discourse.
As Glenn Greenwald wrote in The Intercept, “In the wake of Paris, an already-ugly and quite dangerous anti-Muslim climate has exploded. The leading GOP presidential candidate is speaking openly of forcing Muslims to register in databases, closing mosques, and requiring Muslims to carry special ID cards. Another, Rand Paul, just introduced a bill to ban refugees almost exclusively from predominantly Muslim and/or Arab countries. Others are advocating exclusion of Muslim refugees (Cruz) and religious tests to allow in only ‘proven Christians’ (Bush). That, by any measure, is a crisis of authoritarianism. And journalists have historically not only been permitted, but required, to raise their voice against such dangers. Indeed, that is one of the primary roles of journalism: to serve as a check on extremism when stoked by political demagogues.”
There is a French saying, “qui ne dit mot consent.” He who says nothing consents. To put a camera on someone as he plays to fears and to say nothing is to normalize it. To say nothing is to consent. It puts it within the realm of acceptable and reasonable discourse.
In the future, what will we say about this time?
“We will not walk in fear, one of another,” Edward R. Murrow said. “We will not be driven by fear into an age of unreason, if we dig deep in our history and our doctrine; and remember that we are not descended from fearful men. Not from men who feared to write, to speak, to associate, and to defend causes that were for the moment unpopular.”
The other day I was watching the news with my mother and she pointed out that all of the commercials on the cable news channels were for the AARP and for drugs. “Don’t young people watch the news?”
Indeed, I believe that they don’t. Young people are more apt to get their news from the internet and from social media feeds.
I should mention, by the way, that this doesn’t mean–as one friend of mine lamented–that young people only read about Justin Bieber. (Who, I am reliably informed, is avoiding Charlie Sheen.)
This observation got me to thinking about how the medium affects the type of news that gets broadcast and received.
Television news has a finite number of broadcast hours, and it can only point its camera in one direction at a time. So viewers are at the mercy of news producers to determine what is newsworthy. On the internet, people can choose for themselves what stories to follow and they can, therefore, find out about everything that is happening in the world. (And yet generally they don’t.)
In both environments certain types of stories get attention. The television news channels have a bias. It is not a right or left bias, as people on either side of the political divide sometimes claim. It is an entertainment bias. (I hate it when Fox News pundits complain about the “mainstream media” when they are the most watched TV network in America. Isn’t that the standard definition of “mainstream?”)
As an advertiser-supported medium, news channels owe their existence to capturing an audience that could be watching the Kardashians and keeping its attention. That means that the stories that lead have an element of drama. You are not likely to hear “Our lead story tonight, an analysis of the proposed budget.” (Snooze.) The news will lead with a bang–literally–or a courtroom drama, or a downed airliner, a celebrity scandal or the disappearance of a woman who looks like a model, and the networks will do their best to figure out what interests and entertains us and deliver it. Television news is the perfect environment to foster a reality television star’s presidential campaign.
The types of stories that trend on social media are slightly different. People post links to stories on their Facebook feeds and on Twitter as a means of self-expression. Each story shared is in some way a reflection of the person who posted. The types of stories that thrive in this environment are those that lend themselves to some kind of identity building. For example, people post political stories that identify them as being like or unlike the Tea Party, or the religious, or the liberals. “I am a person who stands for…” A story about Kim Davis who wouldn’t issue marriage licenses to same sex couples is the perfect story for this kind of news environment because it gives people an opportunity to post their commentary and present themselves as an upstanding fundamentalist or as the type of person who favors gay rights.
The red Starbucks coffee cups are a social media driven story. There is almost no content to the story at all. It is just a vehicle for people to showcase their opinions and their sense of humor. “I am offended by secularization” or “I believe in diversity” or “People are so superficial, and I am deep enough that I can point it out.”
So what are the ramifications of news as identity building? It must surely be a factor in the increasing political polarization we see and the rise of the “no compromise” style of governing. But there must be other, less obvious, consequences of how we spread the news.
This realization struck me a couple of weeks ago when I was speaking to a sponsorship agent. I was trying to line up a sponsor for our coast-to-coast ballet master class tours. As we talked about "markets" and "reach," I thought about all of the television commercials and the stadiums and theaters with brand names on them, and I began to imagine an Uber for attention. Instead of paying networks or stadiums to carry messages that consumers might or might not see, why not monetize attention directly? Create an app where a company can pay individuals directly for a bit of their undivided attention. Cut out the middleman.
Increasingly artists of all kinds are told they need to work for free in order to gain “exposure.” The Huffington Post pays writers in exposure. American Idol pays its entertainers, with the exception of the winner, with “exposure.” I think it is about time we develop actual units of “exposure” so that artists can pay their landlords with it. Maybe we could call it “FameCoin.”
Young people, especially, seem to feel that this free artistic labor is worth it because exposure is so valuable. But is it really? Professor Barrie Gunter of the University of Leicester studied the question and found that "The idea that being on a television talent contest is a guaranteed route to fame and fortune is not supported. While this can happen, it applies to only a minority of contestants." Gunter points out that few winners of The Apprentice lasted beyond the first year of employment with Donald Trump, and few went on to develop their own businesses.
“Project Runway” returns…with yet another Emmy nomination for best reality television show, respectable ratings and a modest list of upcoming celebrity guest judges. What it does not have are bragging rights to a dazzling designer success story. There is no true-life example of the wondrous fairy tale that has been at the heart of the show’s premise since its premiere in 2004…
“Project Runway” hasn’t told a story of triumph as much as it has, over time, offered a nuanced tale about what success means in today’s fashion industry, why it is so difficult and why it mostly has nothing to do with having one’s name up in lights — or on the New York Stock Exchange.
In its particular failure to produce another Michael Kors, the show has brilliantly illuminated the realities of fashion for the public to see.
A study by Adam Lankford identifies the importance Americans place on fame as one of the ingredients behind our high rates of mass shootings compared to the rest of the world. So we respond with a "don't say the killer's name" policy. For those who would do violence in order to earn some notoriety, here's some sobering news: it doesn't work. Most mass shootings do not even make the national news these days. As Shane Ryan wrote in The Daily News:
Without the audiovisual and social media elements, this would barely register as a blip on America’s overburdened radar. In an incredible piece of data-based journalism, Vox’s German Lopez showed that there have been 885 mass shootings (with at least four victims) in the U.S. since the Sandy Hook massacre in late 2012, and we’re averaging about one per day in 2015. The Roanoke killings stand out because many of us actually saw the killings take place, but aside from the strange amount of documentation, nothing about it was exceptional. It was ordinary. In fact, it barely even qualified as a “mass shooting” by Vox standards, and would have fallen short of that metric if Flanagan hadn’t turned the gun on himself.
So to the angry guy who is building up his arsenal right now with a “this will show the world” drive– don’t do it.
In the literary world authors are constantly told to get out there and blog, blog, blog. The key to success as a writer is to build up a huge social media presence. But all of this is quite at odds with the traditional role of the writer as a silent observer of life.
“It’s very important for a writer to be unnoticed,” Edith Pearlman told The Boston Globe in 2012, when she was 75. “As quiet and unnoticed as possible.”
This is, of course, the opposite of what we are told we need to do in order to have any chance of having a writing career. So we turn to social media in an attempt to earn some FameCoin. This desire to be noticed and followed has an impact on the type of work we create.
A Pew Research Center study shows that social media actually stifles discussion of important issues. That is probably not surprising. What is of greater concern is that the researchers found that social media users were less likely to share their opinions even in face-to-face discussions. We get used to framing things in the least controversial manner in order to avoid being unfriended or unfollowed. It is reasonable to assume, then, that writers who are frequent social media users will also get in the habit of thinking and writing in more conventional, less challenging ways.
A Cornell study makes the case that social rejection is actually good for the creative process. Being rejected can liberate creative people from the need to fit in and allow them to pursue their own interests. Barry Staw, a researcher at the University of California–Berkeley business school who specializes in creativity, told Salon that a successful creative person is someone "who can survive conformity pressures and be impervious to social pressure."
I propose that it is time to rethink some of our assumptions about the value of attention and exposure. We are dealing in a currency that buys very little.
I've been reading a lot of articles of late on the subject of shaming. A new book is out by Jon Ronson called So You've Been Publicly Shamed. Ronson spent the past three years traveling around the country and meeting with the targets of high-profile shamings. As the description says, "The shamed are people like us – people who, say, made a joke on social media that came out badly, or made a mistake at work. Once their transgression is revealed, collective outrage circles with the force of a hurricane and the next thing they know they're being torn apart by an angry mob, jeered at, demonized, sometimes even fired from their job."
Today I read an article on the TED blog about Monica Lewinsky’s re-emergence as a spokesperson for those who are shamed online. Nadia Goodman wrote:
As TED’s social media editor, I have seen a lot of nasty comments. I’ve seen grown men and women deride a 14-year-old girl for her choice of dress. I’ve seen them say they’re revolted by a beautiful transgender woman. On every talk about race, I’ve seen a slew of racist comments. But none have ever been as bad as the comments we got when we published Monica Lewinsky’s TED Talk, The Price of Shame. At least at first.
Most of the articles I read about trolling, media shaming, and viral online shaming campaigns make the same assumption, an assumption I believe is mistaken. People generally assume that we shame those who transgress in order to bring them back into line and compel them to behave in socially agreeable ways, much as the Puritans did when they put people in the stocks.
I don’t think this is actually what is happening. I came to this realization today while reading an old article I’d stored in my “to read” program. (I have about 180 pages of articles there and I thought it might be time to clear some out.)
Research shows that conversations between people seeking common ground can influence which ideas and people gain cultural prominence. The best baseball players don’t always get elected All-Stars. And the Nobel Prize doesn’t always go to the most deserving member of the scientific community. This, according to a pair of recent studies, is because such recognition can depend upon how well known an individual is rather than on merit alone. Moreover, because it’s human nature for people to try to find common ground when talking to others, simple everyday conversations could have the unfortunate side effect of blocking many of the best and most innovative ideas from the collective social consciousness…the more people are talked about, the larger a role they play in society — and the more they will subsequently get talked about. This creates a self-reinforcing ramping up of social prominence that is not necessarily deserved.
The researchers in the study referenced in this article found that when people were given the choice to talk with strangers about either well-known baseball players who were having mediocre seasons or lesser-known players who were having very good seasons, they invariably talked about the more famous players, because those players served as a common point of reference.
Well-known people and their scandals serve as common conversational currency. We no longer read the same books. We do not share the same religious beliefs or the stories handed down through those traditions. We do not have a common store of mythological characters to use as shared frames of reference for our ethical discussions. In fact, it often seems that discussions of ethics and values only take place in a context of political polarization, a left/right team sport. So the fraternity brothers with their racist song become fictional characters that we can all use to discuss what we will stand for, what we want to be associated with, and what behavior is appropriate.
We are using these episodes, not to control the behavior of the perpetrators, but to define who we are either in support or opposition to the figure being shamed. Their “fat chick” tweets or extramarital affairs or offensive videos give us an opportunity to blog, to present ourselves on Facebook, to tweet our reactions and to generally exclaim what type of people we are. (In much the same way that a woman felt compelled to tell me at a book signing that she did not approve of the subject matter of my book. She didn’t say this to persuade me of anything but to define herself as the type of moral person who would not read such a book.) We care very little about the people we shame. They are not people we know, but stories we are told. We aren’t going to live with them, and their behavior will generally not affect us directly at all.
If you need proof of this hypothesis, watch this clip of Jon Ronson being interviewed on The Daily Show. In it, Ronson notes that most people give little thought to the people who have been shamed once the firestorm has passed. If you do not want to watch the entire interview, go forward to about the 6:50 mark. Ronson says that when he asks people, years later, how the victim of a public shaming is doing now, they say "Oh, I'm sure she's fine." Often that is not true.
In this clip Monica Lewinsky makes a call for a cultural shift. I think a lot of people share her concern that our media culture seems to thrive on these types of vicarious morality tales with little regard for the consequences to the individuals involved. If your particular brand of bad behavior seems to strike a chord with the passions of the moment, you may become good copy.
Lewinsky talks about changing the narrative, her personal narrative. But perhaps we need more fictional narratives, more characters, folk tales, modern myths that we can hold in common and discuss and debate. We need common stories.