
Sexual Harassment and the Single Story

Sexual harassment allegations continue to dominate the news. I applaud the social movement to change our culture on this issue, but there is something in our national discourse that has been troubling me.

The individual tales of bad behavior are being merged into one story. There is no distinction between transgressions: whether they are isolated or part of a pattern, whether they involve adults or minors, whether they occurred in a social setting or at work, whether a rebuff was followed by retaliation or not, whether the behavior was decades ago or ongoing, whether the accusation has been carefully vetted or is just something someone posted on social media with a MeToo hashtag. All transgressions are equal, none can be examined deeply without accusations of victim blaming, and the only remedy on offer is firing the perpetrator and permanent ostracism.

The noted scholar Mary Beard wrote in the Times Literary Supplement that she is “conflicted” on the issue of public shamings.

When I say ‘conflicted’ I mean exactly that. Part of me feels that the majority of the allegations that have followed since the Harvey Weinstein cases are probably true, and — in the absence of any real likelihood of criminal prosecutions  (even in cases where that would be a technical possibility) — a bit of public naming and shaming might be the best way of changing the culture on this (and, as I said before, changing the culture in ordinary workplaces as much as in celebrity culture).

But another part of me feels that some of these allegations are probably not true (or at least there is another side to them) — and that no newspaper account is ever going to let us judge which those (albeit minority) cases are. And those innocents have no way  of putting their side of it (at least a legal trial allows you to do that).

In a recent article in Jezebel, Stassa Edwards argues against appeals to due process or any talk of redemption for the accused. She makes the case that such talk is an attempt to sweep the problem under the rug and return to a comfortable status quo. Certainly such arguments can serve that purpose, but they do not by definition, and we should not be so quick to dismiss the idea of giving the accused a fair hearing. We need to be especially careful precisely in cases where emotions and stakes are high.

Edwards argues against a New Yorker piece by Masha Gessen, whom she quotes here:

“The affirmative-consent and preponderance-of-the-evidence regimes shift the burden of proof from the accuser to the accused, eliminating the presumption of innocence,” she writes, never pausing to consider that jail, suspension or expulsion from school, or job loss are hardly synonymous, or that their long-term repercussions are the same.

Indeed, jail and losing a job are not the same. But we should not be too quick to minimize the impact of social shaming, loss of career and personal identity.

Jon Ronson, who studied people who have been publicly shamed, found that years later the shamers had gone on with their lives and assumed the forgotten targets of their public shamings had too. They’d just lost a job; what’s the big deal? But, he reported, “…we want to think they’re fine, but they’re not fine. The people I met were mangled.”

So “only a job” is not a good excuse to abandon the presumption of innocence. If you were accused of something, you would want an opportunity to respond and be heard, whether in court or in the court of public opinion, whether the stakes were jail or losing your job or simply a loss of face, wouldn’t you?

Are we not sophisticated enough to hold these two thoughts at once: that these offenses represent a serious, far-reaching, systemic problem and that we need to be fair to the people who are accused as well as the accusers?

Those of us who have, at some point in our lives, experienced unwanted sexual advances and who want change should be the most concerned with giving the accused a fair shake. Exaggerating and conflating undermine our own efforts by making us easy to dismiss. Every example of over-zealousness provides an excuse for someone to say the problem doesn’t really exist.

We are a culture that uses celebrities as symbols in our shared mythology, much as we once told tales of the gods. Politicians and film stars are a common point of reference to talk about our dreams, aspirations and values. So the celebrity who transgresses is shunned in order to demonstrate our cultural values. Symbolically, if Louis CK’s actions are forgivable, then so are your wretched boss’s, and therefore we cannot yield.

Nor do we welcome much nuance if it disturbs the important process of myth-making. If individual cases do not quite fit the pattern, they are sometimes made to. Let me give you an example. I believe Anthony Rapp’s accusation against Kevin Spacey. Spacey did not deny it. What was outrageous in that case was Rapp’s age: 14 at the time Spacey allegedly made a move on him.

Since then, many additional accounts of bad behavior have been levied against Spacey, but they have mostly come from adults, although you would be forgiven for not noticing that. To be clear, I am not dismissing any of the accusations against Spacey or arguing that they are not truthful or serious. I simply wish to make a point about how the various cases have been synthesized in the reporting to create a seamless narrative.

Consider this passage in a USA Today article on another Spacey accuser. I have edited it to remove the name and some identifying information of the accuser:

It was July in New York and [he] was just 27, in his first major job out of college [at a theater where] he was running the fledgling film program. He was in his office one day, phone in hand, when Spacey walked in and sat down at an empty desk.

 [He] knew who [Spacey] was. Then 22, Spacey was an up-and-coming actor, playing a minor role in Henry IV Part 1, according to records.

The narrator goes on to report that Spacey groped him and became angry when he was rebuffed.

The article goes on: “…he was shocked, then freaked out. Would Spacey get him fired?”

I removed the accuser’s name because I do not want to make this about him or to make it appear I am trying to minimize his experience or call his story into question. That is not my point. Rather, I have some questions on how USA Today chose to relate his story.

If you scanned the article quickly, you’d be forgiven for not noticing a few things. The victim is described as being “just 27.” The word “just” emphasizes his youth, although 27 is an adult in anyone’s book. Spacey’s age does not earn a “just,” even though, take note, he was five years younger than the other man. Note also that Spacey is described as an “up-and-coming” actor, which makes him sound notable. This is in contrast to the language used to describe the 27-year-old’s job: his first out of college, at a fledgling program.

Other language could have been used to describe an actor who was not-yet-famous and who had only managed to land a “minor role” in a Shakespeare production. You might go so far as to call him a “struggling actor.”

It is not clear whether the victim’s concerns about being fired were his own; they were not presented in the form of a direct quotation. Was this 27-year-old, who ran the film program at the theater, really worried that a 22-year-old, then-unknown actor in a minor (easy to recast) role would get him fired? Was that what was on his mind? Or did he simply describe behavior that he found notably aggressive, and the reporter speculated about his feelings? Perhaps the writer decided that a story of an awkward and unpleasant sexual advance between two co-workers (in which the person who made the advance arguably had lower status) did not fit the growing narrative of male abuses of power well enough.

These stories get reported under headlines announcing that “a new accuser” has appeared. Six out of ten people share news stories after reading only the headline, which means most people will naturally assume that the stories that follow are more of the same even when there are important differences. To people who see headlines flashed across their newsfeeds, they are all Anthony Rapps.

A person does not have to be innocent to be a scapegoat. A scapegoat is someone who is made to carry the sins of others, to take on the burden of punishment to absolve an entire group. We use our celebrities this way, as symbols. We have always used them this way. They deserve it, we feel, because they courted fame in the first place. They get to be treated as small gods, and when they fall, they take on the sins of all who shared their transgressions.

But celebrities are just people. They should be held accountable for their actions in proportion to their severity, not in proportion to the severity of the social problem as a whole. Each accuser should be listened to and judged on the basis of her own story, not as a representative of the collective sufferings of women.

Edwards writes “what’s at issue here is civil rights—freedom from discrimination in the form of harassment because of gender or sex.”

She is right. Civil rights is the issue.

We can’t be champions of civil rights without having a concern for fair treatment of both the accused and the accuser.


What Does a Writer Look Like?

Today GQ posted a feature on “How to Dress Like a Writer.” My answer: stay in your pajamas all day. You are an introvert with a home office. GQ took a more dapper approach. Now, GQ is a men’s fashion magazine, so it would be unfair of me to point out that the well-dressed writers they featured were all men. I came to the story through a side door, and so I was struck by the absence of women before I realized what the publication was. But this led me to wonder: when the average person hears the word “writer,” what comes to mind?

I have written about gender and trends in publishing here in the past, so I won’t look up and link all the articles again, but research has shown that women read more than men, women make up the vast majority of publishing professionals, and this has been true for ages. In the Victorian era, female writers outsold their male counterparts by a comfortable margin.

Given all of this, you might expect the image that comes to mind when you say “writer” to be a woman. I’m guessing, however, that it is not. Your picture was probably more Ernest Hemingway or Stephen King than Jane Austen or J.K. Rowling.

For even though women do more reading, and undoubtedly more writing, research shows that books by male writers find a clearer path to publication, and that books seen as appealing to male readers are more likely to be published, to be taken seriously as literature, and to be reviewed. And even though female writers were more popular than male writers in the Victorian era, we have little historical memory of them. The serious writers studied in literature courses have overwhelmingly been male.

I did a little unscientific test to see what images the word “writer” evokes when not in the pages of a men’s fashion magazine. I typed “writer” into Google image search. Pictures of typewriters and fountain pens are the most common images associated with the term. More often than not, if there is a person in the picture, it is a man who is using the tool.

Writer at work

But the male images are not as overwhelmingly dominant as you might expect. At a quick glance my impression is that it is perhaps a 60/40 split of men to women. There was also one dog:

Boxer dog making note

What struck me more than the number of male images vs. female was the way male and female writers seem to be depicted. Here are three of the first images of women writers that came up in my search:

The women are in pastoral settings, getting inspiration from nature. Men are more likely to be shown in a professional setting, struggling over words at a typewriter in a book-filled office.

The overall impression I get from looking at these pictures is that writing is serious business for men, they labor and struggle over their text, whereas women write for pleasure and self-expression.

How does a writer dress? If he is a man, he dresses for the office and is correspondingly taken seriously as a professional. If she is a woman, she dresses for the beach or the forest, and probably carries a diary.

 

For more on initial assumptions about identity categories see my 2015 post What is an Identity?

Quote of the Day: Space for Imagination to Play Out

We endure in a society where the mainstream orthodoxy would like us to accept that ‘there is no alternative’. One of the last great taboos is money and the associated economic system. If you consider our mono-currency as a societal tool imposed from the top down, it shapes and informs how we behave and the values we are expected to live by. In a way, it is like DNA; if we can change the DNA of our economy we could create new exchanges, values and social relations. We have become so used to this abstract construct that it is the water we swim in and the box we need to think out of. In order for people to start thinking that another world is possible we need to open up a space for imagination to play out. Art, games and play are some of the few remaining arenas available to engage in speculation about the future.

-Neil Farnan from an interview in Furtherfield on Utopoly, a version of the board game Monopoly that encourages players to imagine society based on values beyond the economic monoculture.

Is Inequality Necessary?

In 1492, two cultures collided. In my school we were taught to call this Columbus’s discovery of America. Of course, there were already people living here, and they equally discovered the Spanish. There are no written records of how the locals perceived these strange new arrivals. Columbus, on the other hand, left a diary, which made it quite clear that he did not understand the local customs at all, nor did he believe he had any reason to.

Reading Tzvetan Todorov’s The Conquest of America, I was often reminded of Undiscovery Day in Ocean Shores, Washington. Each year on the last Saturday in April the residents of Ocean Shores commemorate the time George Vancouver sailed right by their town without discovering it. They go to the shore and shout “Hey George!” (And then presumably head to the bar for drinks.)

Todorov’s thesis is that Columbus managed to encounter the people of America without ever really discovering them.

When Columbus first met the people he called Indians he found them to be generous and a bit foolish. He could not understand why they would trade gold for worthless things like bits of glass.

“No more than in the case of languages does Columbus understand that values are conventional, that gold is not more precious than glass in itself, but only in the European system of exchange,” Todorov wrote, “…a different system of exchange is for him equivalent to the absence of a system from which he infers the bestial character of the Indians.”

The people he encountered did not possess private property. They had an egalitarian society.  “I seemed to discern that all owned a share of what one of them owned and particularly with regard to victuals.”

Another member of the crew confirmed that they owned everything as common property and would “make use of whatever they pleased; the owners gave no sign of displeasure.” The Spaniards seemed to admire this, until the locals extended the same principle to the Spaniards’ property, at which point, in Spanish eyes, they went from generous to thieving even though their behavior had not actually changed.

Before we get too smug about Columbus’s blind spots, we should admit that we are really no better. Can you imagine a society without private property? Our system of organizing society is so ingrained that we are largely unaware that there could be any other way to do it. A few years ago I wrote about what economic anthropologist David Graeber calls “the founding myth” of economics: the idea that money evolved out of a system of barter. In fact, the opposite is true. The idea that objects and services have a comparable value that can be quantified and exchanged developed with money. In an interview posted on the blog Naked Capitalism, Graeber explained:

Obviously what would really happen, and this is what anthropologists observe when neighbors do engage in something like exchange with each other, if you want your neighbor’s cow, you’d say, “wow, nice cow” and he’d say “you like it? Take it!” – and now you owe him one. Quite often people don’t even engage in exchange at all – if they were real Iroquois or other Native Americans, for example, all such things would probably be allocated by women’s councils.

So the real question is not how does barter generate some sort of medium of exchange, that then becomes money, but rather, how does that broad sense of ‘I owe you one’ turn into a precise system of measurement – that is: money as a unit of account?

James Buchan’s book Frozen Desire says that in ancient times there was “a contest between the moneyless and moneyed forms of social organizations…Money is normative. So pervasive is its influence on our lives that it makes less moneyed ages incomprehensible, consigning them to barbarism or folklore. Yet history is not inevitable: antiquity did not aspire to our present condition and might have generated a quite different present.”

After the fall of the Roman Empire, Buchan says, Britain for a time shifted to a non-monetary economy.  That means that in the time of Jesus and his contemporaries, the money model was not yet set in stone. We read accounts of Jesus telling his followers to take nothing with them, not to use money, and to rely on the kindness of others.  This is the old relationship model of commerce. Money was of Caesar. The Kingdom of God was to operate on an egalitarian system.

Yesterday I read an article on Big Think reporting on a study published in the journal Nature which argued that human sacrifice was not merely a religious ritual, but a means of social control.

Two-thirds of highly stratified societies once took part in the grisly act, while only a quarter of egalitarian cultures did. The groups who at one time practiced human sacrifice had more rigid castes, titles that were inherited, and less social mobility. Researchers concluded that “ritual killings helped humans transition from the small egalitarian groups of our ancestors to the large, stratified societies we live in today.” Though sociologists have posited such a hypothesis before, this is the first time it’s been scientifically studied.

Among many today, religion is thought to be the standard bearer of morality. Yet, this study, as Watts said, “…shows how religion can be exploited by social elites to their own benefit.” Since these societies prospered, it proved an effective method of social control. “The terror and spectacle [of the act] was maximized,” in order to achieve the desired effect, Watts told Science. Moreover, ritualized killings would’ve given pause to rivals considering a power play for the throne, foreign ministers mulling over war, and bands among the populace grumbling for rebellion.

Yet, Watts and colleagues posit that social cohesion and stratification was necessary to give humans the ability to develop large-scale agriculture, build cities, erect monumental architecture and public works projects, and to allow for greater capacities for science, art, and learning. Though these findings are thought provoking and significant, some experts wonder if the phylogenetic analysis proves a causal relationship, or merely hints at one.

One of the things that interested me was the researchers’ conclusion that stratification was necessary to have modern culture. There is a double assumption here: not only that we need a division of labor to achieve large tasks, but that some of the people must receive a smaller share of the rewards for a division of labor to work. In other words, Watts cannot imagine a division of labor without a corresponding class system.

As with gold and glass beads, values are conventional. There is no objective reason that the manual laborer must receive smaller compensation than the manager. One could imagine, rather, that a job like working overnight to clean the machines at the slaughterhouse, a job that is both unpleasant and dangerous, might be compensated more than a job like management, which has non-monetary rewards like status and a clean working environment. Just because we cannot imagine a large-scale system with a division of labor that operates on an egalitarian basis doesn’t mean that such a thing could not exist. (See also my article on the Western notion of History as a Straight Line.)

Yet the human sacrifice theory makes sense to me. In the shift from the “I owe you one” economy to the monetary economy, imagine how radical this idea must have been: that I am entitled to a smaller share of the pie because my job is different from yours. Creating a stratified society required more than just differentiating jobs. It meant convincing people that not only should they take the unpleasant slaughterhouse job, but that the work is not worthy of as much reward as the job of the manager. To get people to agree to that, you need force and maybe the voice of a god.

 

The Others

This was the darker side of community. For a group to have a sense of cohesion, a sense of being “us,” it had to define what was outside of the group. It had to define a “them”— the excluded. Who “they” are changes over time and from society to society, but the process never changes. It is part of the nature of community life. To have an inside, a tribe must have an outer boundary. For most of the members of Paul’s community, young men dancing in gay clubs, people like Andy, were not “us” but “them.” Judging by his own reactions, Paul had to admit with some shame that he felt the same way. “I am not like him.”

I find that I have been thinking about this passage from the novel “Angel” quite a bit lately.

Something has happened this election cycle. It seems as though an epidemic of “othering” has descended upon us. To some extent this has always happened in election years. People dig in their heels, politicians try to differentiate between their views and those of their opponents. Republicans and Democrats try to set the stakes high and make it seem as though the people in the other party want to harm the country and only they can save it.

Then there are the pundits, covering the horse race and predicting how blocs of people will vote based on demographic categories and stereotypes about them. “This area is rural and those will be big Ted Cruz voters…” “This area has a lot of students so they will vote for Bernie Sanders.” “Secretary Clinton expects to do well in South Carolina because of its large African-American population.”

The Los Angeles Times ran a story today by Liana Aghajanian in which she expressed her disillusionment with this kind of stereotyping.

After Bernie Sanders won Michigan, the media and its pundits were whipped into a frenzy, touting shock and confusion of how Arab and Muslim Americans — who constitute a healthy portion of the population in metro Detroit — could have supported a candidate who is Jewish.

The only way it felt appropriate to respond was to ask: Why wouldn’t they? Why do we so easily fall into these polarizing traps set up by mainstream media that paint and pit two communities against each other and then accept the idea as truth?

To assume anti-Semitism on behalf of an entire, very large population is not just irresponsible, but as the International Business Times wrote, “Reveals how much reporting on American Muslims is still rooted in an unsophisticated naiveté about what motivates them.”

Every four years we’re treated to this superficial analysis and asked to see our fellow countrymen as representatives of different groups.

“I can’t help feeling wary when I hear anything said about the masses,” the English writer J.B. Priestley once said. “First you take their faces away from ’em by calling ’em the masses and then you accuse them of not having any faces.”

All of this is depressingly par for the course in elections.

Now we have Donald Trump, a candidate who elicits cheers and sighs of relief for saying “we’re too politically correct,” implying, of course, that those of us who do not agree that Muslims should all be treated as suspected terrorists or that illegal immigrants should be thought of as rapists do not actually believe what we are saying and are simply being polite.

There is room for polite disagreement on immigration policy. This is not about that. I am concerned that it is becoming increasingly acceptable to other and dehumanize groups of people. This is not a political problem but a cultural one and, as photographer Brandon Stanton put it in his viral open letter to Trump, a moral one. (If you want any more proof of this, and you have a strong stomach, you can scan the comments on his open letter for the phrase “you people.”)

To pillory “political correctness” is to overlook the fact that language does matter. There is a difference when you say that an immigrant “pops out a baby” or that she “has a child.” In the first case, you are speaking of her as something less than fully human.

“Is that why they pop out babies? To make them U.S. citizens? Is that why you popped out yours?”

What is the result of constant exposure to the idea that a group is not only “other” but “less than”? A racial empathy gap. As Lisa Wade wrote in Sociological Images:

Psychologists continue to document what is now called a racial empathy gap: both blacks and whites show lesser empathy when they see darker-skinned people experiencing physical or emotional pain. When white people are reminded that black people are disproportionately imprisoned, for example, it increases their support for tougher policing and harsher sentencing. Black prisoners receive presidential pardons at much lower rates than whites. And we think that black people have a higher physical pain threshold than whites.

This bears repeating: Somewhere in the uncritical parts of our minds, we actually believe that dark skinned people feel less physical pain than we do.

Talking about the civil rights movement, Martin Luther King, Jr. once said, “Instinctively we struck out for dignity first because personal degradation as an inferior human being was even more keenly felt than material privation.”

The only moral thing to do is to stand up for the dignity of other human beings, whether they are our fellow citizens or not, whether they share our religion or not, whether they speak the same language or not.

By the way, when Marco Rubio sent out a tweet in Spanish, he immediately received a predictable response.

[Screenshot of a reply insisting that in America “we” speak English]

This is, of course, demonstrably untrue if “we” are taken to be all U.S. citizens.  More than 300 languages are spoken in the U.S. according to the U.S. Census Bureau. America has the world’s second largest population of Spanish speakers, more even than Spain. We have a growing population of Vietnamese, Russian and Chinese speakers. There are native speakers of Pennsylvania Dutch, Navajo, and Hawaiian. (In the latter two cases, they were here first.) There are even 1,000 speakers of the Pacific island language Samoan in Alaska. The only way to make this statement true is to define “we” as people who live in America and speak English. In that case it is true, but it is a meaningless tautology. (“We who speak English and live in America, speak English.”)

The strange thing is that illegal immigration has become such a hot-button issue now, when the number of Mexican immigrants leaving America is actually greater than the number coming in.

But clearly the scope of the problem is much less important than the political value of having someone from the outside to blame for our ills.

Recently I questioned a Facebook friend who supported Trump and wrote about Mexicans “popping out babies” and getting free stuff in America.  In defending her views, she pointed to her own family history and contrasted it with the baby poppers of Mexico. Her grandfather fled Russia when the communists took over, and was forced to leave all of his possessions behind.

What fascinated me about this response is that being the descendant of a refugee did not produce empathy for other refugees, assuming that she agrees with Trump’s proposed Muslim ban. (I did not ask.) When her grandfather came to the U.S. he was fortunate that we distinguished between him and the people he was fleeing and did not keep him out because he and the communists were both Russian.

We can debate immigration policy. We can disagree. We can do it with respect.  But we cannot, as a moral nation, accept the notion that empathy is weakness. There is a way to take a hard line on immigration, and do it without dehumanizing people in the process. It is important.

In fact, empathy is hard. You have to work at it. You have to examine your own comfortable blind spots.  You have to be willing to adapt to others and not only assume they will adapt to you. It matters when we dehumanize people. Language matters.

Film Jobs are Jobs

I read an article this morning in the Christian Science Monitor with the title “Should Innovation Be Tax Deductible?”

The issue is whether Congress should amend the tax code to allow companies engaged in research to pay lower taxes on the profits from such activities. Supporters of the idea believe it will increase innovation at home and keep well-paying research jobs from going overseas.

I am not going to offer an opinion on the proposed legislation itself, beyond saying that the argument presented against the idea in the article seemed to me to be less than persuasive. It consisted of the notion that corporations would abuse the benefit by re-classifying various activities in order to qualify for the incentives.

This is what corporations do. Saying that there should not be any incentives in the tax code because corporations will work around them is like saying we should not have speed limits because people will drive faster than them anyway.

But that is not what I am here to talk about. One particular paragraph caught my attention:

For example, a 2004 effort by Congress to lower tax rates for US manufacturers expanded far beyond lawmakers’ original definition of “manufacturing,” Mr. Gardner notes. “When the dust settled, the final law expanded the concept of ‘manufacturing’ to include roasting beans for coffee (an early example of the lobbying clout of Starbucks) and film and television production. When policymakers initially began discussing the manufacturing tax break, few would have imagined that the Walt Disney Company would reap more than $200 million a year in tax breaks for ‘manufacturing’ animated films,” he wrote…

What bothers me here is the mocking of the notion that television and film production deserve to be classified as “manufacturing.”

Manufacturing is making something, as contrasted with agriculture (growing something) or services. Now, I will be the first to admit that the Walt Disney Company is far from being a struggling entity in need of government assistance. That is not the point.

I assume that the reason Congress wanted to lower these tax rates was to keep jobs in the United States. Well, television and film employ a lot of people, real people who buy houses and cars and go shopping and raise families. According to the Bureau of Labor Statistics, motion pictures and broadcasting show an annual average of 20,869 employees and total annual wages of $1.55 billion.

They make a product that American consumers value so much we spend an average of five hours a day watching it.

When you think of the movie industry, you probably imagine actors and directors. But it takes a lot of people to make a movie or a television show, from caterers and hair-stylists to construction workers and lighting technicians. USA Today recently cited the film industry as an area of growth for blue-collar workers in an otherwise fairly stagnant economy.

Atlanta needs construction workers, lighting experts and others to work in its fast-growing film industry. Skill is required, but not necessarily film experience for the 77,000 film workers (average pay $84,000) and support personnel in 2012, who turned out movies such as The Fast and the Furious and The Hunger Games franchises, according to the Motion Picture Association of America.

The good news is that film production is still less outsourced than some other industries. About 65 percent of the big, profitable “Hollywood” productions are still made here, although tax incentives from other states have taken a lot of those jobs away from California.

Mr. Gardner, quoted above, particularly mocks the idea that tax breaks were intended to benefit the making of animated films. So it might be worthwhile to know that animated film-making is starting to move overseas in a big way. According to the Hollywood Reporter:

Extremely generous subsidies in Vancouver, British Columbia enticed Pixar Animation and Sony Pictures Imageworks to open satellite locations in the Province in 2010…Sony Imageworks decided to double the size of its studio space in the city and grow its Vancouver workforce from 100 people to more than 250. In January 2014, Sony Imageworks announced layoffs at its Southern California facility and that it was shifting more positions to Vancouver. As the workforce in British Columbia grows, it shrinks in California.  In 2012, DreamWorks Animation announced plans to open a studio in Shanghai, China… Dreamworks CEO Jeffrey Katzenberg said the size of the studio in China could eventually surpass DreamWorks’ headquarters in Glendale, California, which employs more than 2,000 people. It appears job growth is happening in the animation world, but it’s happening in places like China, not California.

Are those 2,000 jobs not American jobs? Are they less worth keeping here than jobs making mechanical devices?

The reason, I think, that it is easy to mock making cartoons as an example of manufacturing (in a way that I doubt one would mock the notion of, say, software as manufacturing) is that it falls into that broad category of “the arts.”

It is the same mindset that says giving a rich person an incentive to build a sports stadium is an investment in economic growth, whereas giving funds to build a fine arts theater is supposed to be philanthropy and charity. Making music, dance, or theater is art, not commerce. It should be done for love, not money.

I don’t know whether we need to give tax incentives to large corporations to keep them from moving overseas. If we do, though, we should care about the jobs making film sets as much as we care about the jobs making automobiles. Most of us spend more time each day using the film-makers’ products than the car-makers’.

 

 

 

History as a Straight Line

“Americans see history as a straight line and themselves standing at the cutting edge as representatives for all mankind.” -Frances Fitzgerald, American Myth, American Reality

Early in my college career, perhaps in my freshman year, I took a course on American Culture that used James Oliver Robertson’s American Myth, American Reality as a textbook. I recorded the quote above in a journal of quotations I had just started collecting.

I thought of the quote again today when reading an article on revised AP U.S. history standards that will emphasize American exceptionalism.  The revisions were championed by conservative educators and politicians who felt that the previously released standards presented too negative a view of the country.  As Newsweek reported:

The Jefferson County school district in Colorado convened a board committee to review the curriculum, stating that all materials should promote “patriotism” and “respect for authority,” and “should not encourage or condone civil disorder.” The district stopped pursuing the review after hundreds of students walked out of classes in protest. The issue made it to the Republican National Committee, which passed a resolution accusing the AP U.S. framework of promoting “a radically revisionist view of American history that emphasizes negative aspects of our nation’s history while omitting or minimizing positive aspects,” and recommending that Congress withhold federal funding to the College Board pending a rewrite.

The squeaky wheel got the grease and the standards were revised again to try to make everyone on any side of the culture wars happy. One of the teachers who helped craft the redesign told Newsweek that their goal was to remove value judgments from the framework, and let facts speak for themselves.

Of course, history is not made up of “facts” the way mathematics is. History is made up of things that happened in the past between people of different cultures, ideologies, mindsets, and goals trying to survive cold winters, get enough to eat, and live in society with one another. In the process they trade with one another, come up with economic systems, work, raise children, invent things, create art, fight over resources, practice religions, question their religions and prevailing philosophies, consider different elements of society part of the in-group or the out-group, invent governing systems, and sometimes become migrants or go to war. No nation was ever made up of people of a single mindset. Lots of things happened. Lots of people had lives that impacted other lives. Lots of people had perspectives. Out of the almost infinite pool of “things that happened,” a historian must select certain things on which to focus.

For this reason the idea of history being “revisionist” is problematic. Rarely do our educators try to “revise” history by completely changing what happened, for example, saying the first president of the United States was not George Washington but Hiram Rodriguez. “Revisionist” histories are histories that focus on different aspects of the past.

The histories that we read in the good old days never did include all that happened to my ancestors and your ancestors in all its messy and wondrous complexity. Historians have to leave out of their stories all manner of events and people. Early history text writers in the United States chose a patriotic narrative about an America whose ancestry is European, not Native American, Latino, or Black. They chose to tell a story that focused on military and economic success, with heroes from those realms. The past was already revised by these historians not to include the history of the card game whist, basket weaving, the story of some guy named Oziah who worked hard and followed the rules and then died, changes in the way people have conceptualized love, slavery from the perspective of the enslaved, the War of 1812 from the Native American perspective, or the biographies of all the people who ran for President and failed. Nor did they choose to frame the account of the history of commerce and politics as background to a central narrative on the important business of creating art and culture or raising children, or to begin the story of America in the mid-1800s with the first major wave of Jewish immigration. These are all stories that could have been told.

These days when people start fighting about how history should be taught to children, they largely argue about whose perspective should be included and who should be considered part of “us.” Is focusing on Civil Disobedience saying that America is bad and authority should be resisted or is it saying that African-Americans and working class laborers who staged sit ins are part of the American “us” and therefore events that were significant to African-Americans and the working poor are significant to us as Americans?

What rarely gets challenged, however, is the straight line narrative of American history. This can be summed up in the popular political poll question “Do you think the country is headed in the right direction?” The assumption is that history is a journey from something to something else. People on the left are more apt to see social change as progress (hence the label progressive) whereas conservatives are more apt to worry that social change is the beginning of a slippery downward slope to a chaotic society. What they have in common is that they see history as heading in a direction.

One of the sticking points in the AP framework debate was the interpretation of “manifest destiny.” Should it be presented in a positive light? What was a gain for the European settlers was a loss for the Native Americans. In either case, the underlying notion that there was something inevitable about this change is essentially intact. This is not the only way to view history. Richard Nisbett wrote in The Geography of Thought:

Japanese teachers begin with setting the context of a given set of events in some detail. They then proceed through the important events in chronological order, linking each event to its successor. Teachers encourage their students to imagine the mental and emotional states of historical figures by thinking about the analogy between their situations and situations of the students’ everyday lives. The actions are then explained in terms of these feelings. Emphasis is put on the “initial” event that serves as the impetus to subsequent events. Students are regarded as having good ability to think historically when they show empathy with the historical figures, including those who were Japan’s enemies. “How” questions are asked frequently— about twice as often as in American classrooms. American teachers spend less time setting the context than Japanese teachers do. They begin with the outcome, rather than with the initial event or catalyst. The chronological order of events is destroyed in presentation. Instead, the presentation is dictated by discussion of the causal factors assumed to be important (“ The Ottoman empire collapsed for three major reasons”). Students are considered to have good ability to reason historically when they are capable of adducing evidence to fit their causal model of the outcome.

What happened is the only thing that could have happened, and our job is to recognize the road that got us there.

Thus, Nisbett writes “The fall of the Roman Empire, the rise of the Third Reich, and the American success in reaching the moon before the Russians, not to mention less momentous events, are routinely seen as inevitable by commentators, who, one strongly suspects, could not have predicted them.”

A month or so ago, you may recall, I ran a guest post by author Juliet Greenwood about her World War I novel “We That Are Left.” The article focused on the largely forgotten role of women on the battlefield. The introduction to the post also pointed out that female writers outsold their male counterparts in the Victorian era and that women owned a large number of businesses in Colonial America. Why do these facts come as a surprise? I suspect it is because they interfere with a nice seamless narrative about linear progress.

It is much easier to tell the dramatic story of increasing freedom for women (a straight line from corsets and arranged marriages to women’s suffrage, 1970s women’s lib, and then Margaret Thatcher, Hillary Clinton, and female CEOs) if you leave out the women of previous ages who did the things we imagine they only later gained the right to do.