Tuesday, May 29, 2018
(This is the fourth crumb in a series. Start from the beginning of the series.)
The clear light of the New Testament shines on the persons who inhabit it. The shadows of the Old Testament lie thick on the persons who inhabit it. Already in the gospels, this effect is visible: Herod is bad, Judas is very bad; the other eleven apostles are good. Sometimes they show weakness, as when Peter, fearing the soldiers and the mob, denies that he knows Jesus, but this does not indicate that Peter has fallen, only that, as in the garden of Gethsemane, the spirit is willing but the flesh is weak.
A person may change in this new world: he may transform (like Judas) from good to bad or (like Paul) from bad to good—indeed, with the arrival of Paul, conversion and transformation will become central themes—but in the final tally, he must fall on one side or the other, amongst the saved or the damned.
The Old Testament is full of degrees of goodness. Moses is the greatest of the prophets, but he makes mistakes, and his life ends with the punishment for these mistakes: he must die in the wilderness, never to enter the promised land. King David is beloved of God but, at the height of his glory, he schemes to have a man killed to cover up an illicit affair with the man’s wife, and for this a curse is laid on him and all his house. Saul is eaten up by murderous jealousy for David, but we see unmistakably a better nature within him and a real love for David, warring against his destructive impulses. Solomon is wise and blessed but he grows proud, is led astray, worships false gods, and as punishment for his sins, the kingdoms of Israel and Judah are rent apart, never again to be united. But Solomon is not ultimately fallen or redeemed. He lives, does some good, some bad, and then he dies and “sleeps with his fathers.”
If we tend to think of Solomon as wise and forget his faults or of Saul as wicked and forget his virtues, this is because we are engaging not with the bible but with subsequent cultural-religious traditions that have resolved the inherent ambiguity of the text into a more straightforward morality.
No such misreading is required in the New Testament. The text makes clear on which side each character falls. There is a ruthlessness to this, and a terrible simplicity. There is also a great romanticism: life, under these narrative rules, becomes a dire struggle, attains a drama that perhaps was impossible when each step was only a step, each turn only a turn.
As Josipovici has pointed out, sin and redemption in the Old Testament are not definitive and final. They refer to individual actions, they do not define entire lives: “If you sin, then, like David, you may repent, or, like Ahab, you may not. Even if you do you may, like Jonah, sin again. … When someone’s eyes are opened in the Hebrew bible it is not so that they may suddenly understand the falsity of their whole past life but only the immediate error under which they have been laboring… But for Paul, the important act is not repentance but awakening, an act of faith which totally transforms life” (Gabriel Josipovici, The Book of God, pg. 242).
More on this theme >>
Thursday, May 17, 2018
Poetry and Narrative (3 of 7)
Everything I said yesterday about the uncertainty and openness of meaning in the Old Testament refers specifically to the narrative sections of that document. The so-called "Books of Poetic Wisdom," especially Psalms and Proverbs, usually make their meaning much plainer and more definite.
This runs counter to our modern notions about poetry, but it makes sense from another angle: in narrative, we are in the realm of events, a realm where first things happen, then we try to understand them. But in lyric poetry (the mode of Psalms) and wisdom poetry (the mode of Proverbs) we are in a realm of emotional interpretation and judgment: here, interpretation comes first and other things follow. If there are events, they come out of emotion and recollection (in lyric) or in the service of a particular lesson (in wisdom poetry). In these forms, then, details emerge under the aegis of a guiding thought; they are born already under the light of a reason that does not fit them into a pattern after they emerge, but rather draws them up out of darkness in the very process of pattern-making.
It is interesting to remember that narrative has this inherent capacity: not to interpret the world but to give us the world prior to interpretation.
But, of course, a narrative is also inevitably a pattern.
More on this theme >>
Wednesday, May 16, 2018
Acts (2 of 7)
In my eagerness to write about St. Paul, I skipped over Acts. In its own way, Acts evinces a radical break with the mindset of the Old Testament, but the break is more hidden. There is, in this book, no personal narrator and little of the internal or the subjective. And there is still a sparseness to the storytelling, a shortage of detail. Yet this sparseness does not produce the openness, the ambiguity, the strangeness that characterizes the Old Testament.
Reading the OT, we may understand the physical events but we rarely can be sure what they mean. Samson takes a liking to a Philistine woman, and he goes down to a place called Timnath to see her:
…and, behold, a young lion roared against him. And the Spirit of the LORD came mightily upon him, and he rent him as he would have rent a kid, and he had nothing in his hand: but he told not his father or his mother what he had done. And he went down, and talked with the woman; and she pleased Samson well. And after a time he returned to take her, and he turned aside to see the carcase of the lion: and, behold, there was a swarm of bees and honey in the carcase of the lion. And he took thereof in his hands, and went on eating, and came to his father and mother, and he gave them, and they did eat: but he told not them that he had taken the honey out of the carcase of the lion.
(Judges 14:5-9)

What on earth is going on here? Why was there honey in the lion's carcass? Did Samson do wrong in eating it and giving it to his parents? Why didn’t he tell them where he got it? Did all of this have something to do with Samson's intention to wed a non-Israelite? The text never answers these questions. We are left to make our own meaning, a meaning which can only appear definite if we insist on burying the ambiguity inherent in the text beneath a rigid religious program.
I have picked one out of hundreds of examples. Nearly all the narrative sections of the OT, and even some of the legalistic sections, have this opacity, this refusal to reveal any clear, fixed system of meaning. Even God’s will often does not supply a fixed point. Sometimes, in a passion, he purposes destruction, only to be talked out of it by a mortal. Often he claims to be on the point of abandoning his previous plans. After the flood, he strikes what sounds very much like a note of regret.
In the OT, we find a vast sequence of events from which we must make our own meaning, and this meaning therefore comes after the events: first they occur, then we look at them and interpret.(*) In this context, sparseness—the absence of interstitial details, of interior thoughts, of mood, of atmosphere—constitutes a shortage of clues, an openness to many different readings. The text is like a skeleton without flesh or ligaments: it can bend into all sorts of shapes.
But in Acts, the events seem to come already embedded in a definite system of meaning. For a new plan has entered the world, a plan that sweeps up everything and determines in advance the pattern of events and the significance of each one; a plan in which there is not room for negotiation or revision. So the sparseness of Acts becomes not ambiguity but simply brevity.
Here again we find that the gospels do not fit nicely into either category. The four evangelists seem to know the meaning of their story and to present it in order to send a particular message; yet at the heart of that story is a character who refuses to conform his words or his actions to any straightforward system of meaning.
More on this theme >>
Tuesday, May 15, 2018
St. Paul (1 of 7)
A couple years ago, I went looking for secondary sources on the bible. I tried Jewish, Christian and secular commentaries but found them all thoroughly unilluminating. Sometime later, a woman (with whom I subsequently made plans to conduct a certain ritual under a canopy), gave me a book by one Gabriel Josipovici called The Book of God. The first thing this book did was show me that I had been looking for the wrong sort of help from secondary sources; the second thing it did was teach me to read better; the third was to lay out a web of thoughts about biblical narrative, its treatment of character, its relation to other kinds of narrative, and the implications of all this for our relationship to our lives and ourselves.
Anything I now say about the bible is grounded in Josipovici’s ideas. What follows, especially the point regarding Paul’s subjectivity, owes a great deal to him.
* * *
The vividness and distinctiveness of the language of the King James Bible, which I imagine must have left its mark on all subsequent translations, exerts a powerful unifying force.* The various books of the bible may vary in content, authorship, and genre, but they are all unmistakably biblical-sounding. Because the bible is such an opaque document, because we are so often hard-pressed to make sense of what is being said, style rises to the surface. Modern, secular people encounter this text largely in disconnected fragments—a line quoted here or there, taken from disparate books—and so our sense of the bible is primarily our sense of a certain linguistic style. As we all know, that linguistic style is not in fact native to the text.
[* Since writing this crumb, I've read parts of Tyndale's bible and discovered that the King James follows it very closely, so it seems the KJ is not the originator of the linguistic style I'm describing. - 7/2/2023]
Beneath this unity of prose style, one finds a radical divide—in mentality, in content, in narrative form—between the Old and New Testaments. The bridge between them is the Gospels. The Gospels still retain much of the narrative style of the Old Testament: its sparseness, its tendency to leave things unsaid, its emphasis on action and speech, its shortage of internality. But the story of Jesus's life feels very different from the stories of the OT. The nature of God, the relationship between God and his creations, the conception of human history—all have undergone a transformation when we turn the final page of the OT and find ourselves in the opening chapter of Matthew.
With the epistles of Saint Paul (the great early organizer of Christian churches whose letters make up 13 of the 27 books of the New Testament), we enter a new realm entirely. Gone is the sparseness and opacity that forced us to surmise, to guess, to interpret, that left us, sometimes, utterly at a loss. Paul wants to make his meaning crystal clear. He has none of Jesus’s penchant for ambiguous parables nor the OT’s penchant for reporting bizarre events without explanation. Paul has a message of burning import, and he wishes to tell it. Here at last we have a first-person narrator of the kind we moderns are familiar with: a narrator who tells us what he thinks and feels, who reports his emotions, who speaks of his own hopes and faults. Paul exhorts from a point of view that we can recognize as personal in a modern sense.
The OT narrators are very different. They report what was said and done, but they never tell us how they felt. We do not know whether Daniel was afraid in the lion’s den. Indeed, the narrative does not stay with Daniel amongst the lions but follows the King who has imprisoned him there to his royal chambers and, with the King, discovers Daniel alive the next morning. Even in Jonah, the angstiest of the prophets, we never hear directly about Jonah’s emotions. In the belly of the whale, it is only in the words of Jonah's prayer that we learn anything of his feelings. When I think of Jonah's story, I think of how he feared to take up the burden of prophecy, how, like Moses, he doubted his ability, but when I turn to the text, I discover that I have inferred these emotions; in the text there is only action:
Now the word of the LORD came unto Jonah the son of Amittai, saying, Arise, go to Nineveh, that great city, and cry against it; for their wickedness is come up before me. But Jonah rose up to flee unto Tarshish from the presence of the LORD, and went down to Joppa; and he found a ship going to Tarshish: so he paid the fare thereof, and went down into it, to go with them unto Tarshish from the presence of the LORD.
(Jonah 1:1-3, KJV)

The books of the Old Testament prophets move with strange fluidity between first and third person. The book of Daniel, for example, is entirely third person for the first six chapters. In chapter 7, we get first person, but only in quotation: “Daniel spake and said, I saw in my vision by night, and, behold…” (Daniel 7:2). His vision goes on and on, the first person persisting. We seem to be in an extended quotation (the bible does not use quotation marks), but gradually we become unsure. Chapter 8 begins “In the third year of the reign of king Belshazzar a vision appeared unto me, even unto me Daniel…” (8:1). The third person has vanished—and it will not reappear. Daniel’s speech has become the book.
When we do find the first person in the prophets, it is sometimes unclear whether the “I” refers to the prophet or to God, whose words he is repeating. It seems to slip back and forth without anything to mark the change. There is something strangely unfixed about the identities of these writers. As characters in their own stories, they are clear, but as narrators, as speakers, they blur and dissolve into a perspective beyond the individual. They do not emerge as individual subjects, as speakers, as personal narrators.
Paul is the first such narrator in the bible.
More on this theme >>
Tuesday, May 8, 2018
Seeing Ourselves (2)
But, of course, we also see ourselves as characters in film and television.
Film often evinces a novelistic conception of character: it develops through experience, it finds its value in specificity. In the sit-com, on the other hand, and to a large extent in the episodic drama as well, character is essentially static, and type is primary; individual variations are not wholly extraneous as they are in the statistical study, but they are secondary elements, add-ons to the core type. The new serial show naturally tends towards a more novelistic conception, but the handful of these that I’ve watched often seem haunted by the static, type-based conception native to their medium. And then, of course, there is the model of character found in television advertising.
At any rate, the specific conception of character adopted by film and television is less important than the nature of the moving image itself. The key feature of these media is that we experience the people in them first and foremost as sound and image. No depiction of character in prose can be as external as even the most perspectival film character. The force of the literal reproduction of face, body, gesture, tone overwhelms us. To see ourselves as these characters is to see ourselves from the outside.
What we imbibe from film is not a sense of ourselves as mind and spirit, but as look, gesture, and tone. To relate in this way to ourselves is disturbing. It generates a self-consciousness of a new type: a material self-consciousness, which applies itself not to our conduct or character but to our bodies, our voices, our clothes.
* * *
Perhaps this helps to account for the odd discrepancy between the faces of people fifty and a hundred years ago and the faces of people today—a phenomenon that I may or may not be imagining.
In the posed photographic portraits of the late 19th and early 20th century, the subjects are usually stiff, frequently odd looking; they look at the camera warily, as one looks at a strange, unknown device; at the same time, a wealth of personality seems to spill out of them. In candid shots of that time—I am thinking primarily of Walker Evans’s subway photos—the faces are without that wary stiffness and strangely expressive. I say strangely, because it is not the expressions that are so expressive but the faces themselves. Character seems graven into them, through some gradual erosion-like process. There is a graceful unselfconsciousness to even the ugliest of these faces that one rarely finds in contemporary faces. Or look at the faces of early film. In The Passion of Joan of Arc we meet with faces the likes of which no casting agent could summon up today. Even the faces of young lovers of the silent era have a strange soulfulness, a weight of character that is largely missing in today’s faces.
It would be going much too far to say that contemporary faces are all alike. They contain much variation, yet when I compare them to the faces of the past, this variation appears somehow flat. I begin to feel (almost inexplicably) that the faces around me are all in some way the faces of children: precocious, thoughtful, anxious, weathered, but still in some mysterious way child-like, as if their owners have not yet tasted too much of life. And when they become active, these faces evince the uncanny over-expressiveness of bad actors or even of masks.
In 1952, Satyajit Ray cast non-actors in many of the minor roles of Pather Panchali and in both lead roles. On camera, these non-actors appear to be just what they are: the residents of a small village in rural Bengal. The same is true in many of the Italian neo-realist films: in The Bicycle Thieves, for example, Maggiorani, who plays the lead, was a factory worker; the boy who plays Maggiorani’s son was a flower-seller’s son whom De Sica spotted on the street in Rome. And the result is just what we would naively expect: these two ordinary, working-class people appear on camera to be just what they in fact are. This would never work today. Contemporary people are not capable of appearing to be what they are. Put on camera, they invariably begin to perform in the most generic and hackneyed manner. In the silent film era, an actor was someone who could appear strange, monstrous, full of wickedness or religious passion. Today an actor is simply someone who can seem not to be acting.
Of course, like so much else, this may all be a morbid fantasy of mine. I would like to try an experiment. I would like to dress up a bunch of modern people in outfits of the early 20th century, groom them accordingly, and then mix them up with actual photos of people from that time. I think I could guess which was which, nine out of ten times. I think most of us could.
I cannot help it if I appear to be a madman, but lest I appear a lone madman, I wish to point out that I am not the first to make an observation along these lines and to see something tragic in it. The following excerpts are from a 1973 article by Pasolini:
[In 1959], a provocateur among us [i.e. the Italian communists] would have been nearly inconceivable (unless he was an amazingly good actor) – in fact, his subculture would have marked him, even physically, as distinct from our culture. We would have known him for what he was from his eyes, from his nose, from his hair! …Now … [n]o one on earth could distinguish a revolutionary from a provocateur physically. Physically, Left and Right have melted together.

…

I feel an immense and sincere unhappiness in saying so (or rather, a literal feeling of despair) – but in the present moment thousands and hundreds of thousands of faces of young Italians resemble more and more the face of Merlin. Their freedom to have their hair look the way they want it to is no longer defensible, because it is no longer freedom. The moment has come, instead, to say to young people that their style is horrible, because it is servile and vulgar. Or better, the moment has come for they themselves to realize it, and for them to free themselves from this guilty anxiety to remain in synchronicity with the degrading order of the horde.

Of course these changes depend on political and economic developments as well. I'm not saying they're due entirely to the consumption of moving images. One of the convictions behind these "crumbs" is that effects usually have many causes. When someone says "Here's why," we should always reply (not-rhetorically) "And why else?"
Monday, May 7, 2018
Footnote to "Seeing Ourselves (1)"
Freud’s narrative acumen and his acute interest in the details of memory and thought have been largely forgotten; his legacy, in the popular mind, consists largely in this: that he taught us to mistrust the individual’s knowledge of herself. That insight was not wrong, but it opened the door to something altogether bad: the subordination, at the level of our own psyches, of individual experience to scientific knowledge.
If I am right that the statistical study has supplanted the novel as the implicit model of character, it is not because people now read studies instead of novels (although lately this may in fact be the case*), but because the study has taken on the truth-revealing power that formerly belonged to lived and imagined experience.
(*In that, so far as I can tell, most educated adults nowadays read articles rather than books and, if they read books, read mostly non-fiction. Clearly, novels almost never cite statistical studies, whereas articles and non-fiction books do quite often.)
Sunday, May 6, 2018
Seeing Ourselves (1)
In the age of the novel, people saw themselves as characters in a novel. This is why Freud’s early case histories read like short stories: because the implicit psychological model of his time was inextricably bound up in narrative—in its emphasis on specificity and in its gradual unfolding. Character in those days was seen as fundamentally something that developed, and that developed in large part through hardship, through error—that is, through the recognition of error and the overcoming of hardship. Error and misfortune, then, were not to be studiously avoided. One had to meet with these things, to know them, to survive them, in order to be a fully developed person.
One result of this kind of development is peculiarity. The characters in novels—the best ones anyway—are like twisted old trees. Their interest, their value, comes from their specificity. As readers and writers, we are not interested in how a character is like others of their type (quiddity) but how they differ from their type (haecceity). Their haecceity is what is important about them; their quiddity is only a sort of background.
Freud was anxious about the degree to which his case histories read like stories (see for example his comments on this theme in the critical analysis section of “Elizabeth Von R.”). After all, his stated project was to develop a science of psychoanalysis. He wanted to identify configurations of symptoms, to discover the underlying structure of hysteria, of obsessive-compulsive disorder—structures that would be common to all hysterics, all obsessive-compulsives. For such a project, what mattered was not what made a patient unique but what made him a member of a category—not his haecceity but his quiddity.
People no longer see themselves as characters in a novel. The basic model of the psyche today comes from the psychological study—not Freud’s case histories, which were still four-fifths narrative, but the modern statistical study: the sort that purports to tell us about ourselves, to reveal our hidden biases, to pierce through our false self-conceptions.
In the statistical study, everything that was essential to a character in a novel becomes noise. Haecceity—the way that an individual deviates from type—is precisely what the whole field of statistics is designed to eliminate, in order to reveal the underlying “laws” of human behavior. In this new model, what is essential is our quiddity. Our haecceity is an extraneous element. No wonder then that we have such a horror of error and misfortune, that we try to guard desperately against them—lest they drive us away from the mean (the “true” value), towards those ever-narrowing tails of the bell curve.
At the same time, we are encouraged to express ourselves, we are told we are special, we are hypnotized with visions of beauty and genius—of outliers. As with so many things, these contradictory elements exist side by side without conflict. The desire for specialness, the fetishization of self-expression, the striving for beauty and brilliance have themselves become not merely common but nearly universal qualities.
Let us learn to see ourselves once more as characters in a novel. Then our failures and frustrations will again take on the rich hues of an unfinished struggle with life. Then we can again begin to find something beautiful and worthwhile even in loneliness and disappointment. Then we can let ourselves grow strange. It will not be easy to go back to this conception of the self. Again and again, the conception that is native to the statistical study will seek to reestablish itself, and we will realize we are again seeing our lives and our psyches through its flattening gaze. But gradually, with persistence, we can regain the novelistic conception. It will help, in this process, to read novels, especially old novels, novels written when the novelistic conception of character was still dominant.
Footnote to this post >>
More on Seeing Ourselves >>
Friday, May 4, 2018
A Weird, Out-of-the-Way Place
I find myself thinking about a book I read many years ago, about a man driving an RV through the backwaters of the United States in 1978. The writer took this trip after his wife left him and he lost his job, but the book isn't about his wife or his job or his life—these things are mentioned only in passing. The book is about a side of America that, in 1978, was already rapidly disappearing: weird little out-of-the-way towns, each unlike all the others, lonely roadhouse diners, people long settled into the land where they lived, stamped with the spirit of the hill or the bayou, drenched in particularity. I wonder what it was like to drive around this country 20 or 30 or 50 years before that, to see a world still largely unconnected by telephone and television, internet and interstate, when the little highways, winding over and under hill, passed through every town. To arrive in one of these towns was to find oneself somewhere specific and distinct, where speech and manners and beliefs were inflected with an irreducible local element.
Little is left of that world, and what there is seems to be largely preserved for the sake of tourism. It is a tired point: chain stores and superhighways and indistinguishable suburbs and so on. Never mind.
Here is what you can do: make of yourself a weird old place. Find a hillock or a wood of the mind and build your soul a house there. Build it oddly, unreasonably, with rooms and corridors to your liking. So that, when people meet you, they feel they have come to the threshold of a strange, out-of-the-way place, unlike the places they are used to, somewhere that will take time to get to know. There is no reason to be inhospitable—you may as well do your best to welcome them. (People in weird old towns are often much friendlier than people in modern, anonymous places.) Show the visitor into your strange, dark sitting room, offer them tea. But keep yourself some locked rooms where guests are not allowed.
Is this not part of what it might mean to be an adult—if we discard all those external class-based markers (career, house, towels) and look for qualities that are not grounded in a system that has lost its capacity to impart meaning to almost anything?
Thursday, May 3, 2018
The Vampire
Alright, so there is a confusion of categories: adolescent-rebel, adult-bourgeois, maturity-respectability. But why, as a child of the bourgeoisie, should I care? Why not just bite the bullet and become bourgeois?
As mentioned earlier, I have a mysterious incapacity to take this step; maybe if I could I would. But there is also something about bourgeoisdom that troubles me. It seems to me that there is no escape from it, that nowhere, or almost nowhere, could I find a group of people that does not hold its values and aspirations, that would not judge me a failure because I have failed to be bourgeois. It is almost as if the bourgeoisie is no longer merely a class but something more abstract.
Several years ago, a friend sent me an excerpt he had translated from a 1968 essay by Pier Paolo Pasolini, in which Pasolini discusses the bourgeoisie in a manner that introduced a whole new category into my thinking, a category which has since come to seem essential to understanding the world I live in. Here is the excerpt:
I will often speak violently against the borghesia [the bourgeoisie], in fact, this will be the central theme of my weekly columns. And I understand very well that the reader will be “taken aback” by this fury of mine; well, what is going on here will become clear once I specify that by borghesia I do not mean a social class so much as a disease in every sense of the word. A very contagious disease, so much so that it has infected nearly everyone who fights against it – from the Northern workers, to the workers who have emigrated from the South, to the borghesi opposing the government, to the ‘isolated individuals’ (like me). The borghese – let’s describe the phenomenon playfully – is a vampire, who doesn’t find peace until he bites his victim on the neck out of pure, simple, and natural enjoyment at seeing the victim turn pallid, unhappy, ugly, devitalized, contorted, corrupt, restless, full of a sense of guilt, calculating, aggressive, terroristic, just like him.
How many workers, how many intellectuals, how many students have been bitten by night by this vampire, and without knowing it, are becoming vampires themselves!
[…]
From my solitude as a citizen, I will therefore attempt to analyze this ‘borghesia’ as an evil wherever it is found – meaning, by now, nearly everywhere (this is a more ‘lively’ way of stating that the borghese ‘system’ is capable of absorbing every contradiction – in fact, it creates the contradictions itself, as Lukács says, in order to survive by overcoming them).
Wednesday, May 2, 2018
Hollywood Rebels (2)
But, as we have seen, the political element reenters, or seems to reenter, in the form of class: the Hollywood rebel's rebellion is always both against the adult world and against the bourgeoisie, in such a way that the two are almost indistinguishable. (Well, almost always: there is also the rebellion against the “bigoted” working-class parent, but this is much rarer and has an overall different tone and quality; I actually can’t think of a specific movie that depicts it. The target of the Hollywood rebellion is nearly always wealthy and “respectable.”)
Adolescent rebellion bears a peculiar relation to class: the adolescent (in the paradigmatic case) feels herself to be rebelling against everything her parents stand for; but in fact her rebellion is merely a developmental stage, a step on the way to an adulthood which will, in all likelihood, look much like that of her parents. So this sort of “rebellion” is actually a stage in becoming a bourgeois adult.
And this should not be surprising. A wholesale rejection of a value system usually dooms one to recreate precisely those features one thought one was rejecting. It is only when we remember the power and significance of the value system we wish to leave behind, when we in some sense hold onto it, that we can actually find a way away from it—just as it is only when we remember the confusion of not understanding something that we can be said to actually understand it.
* * *
All of this comes into much sharper relief when we consider a film that does not fall into the usual pattern. In Douglas Sirk’s All That Heaven Allows (1955), Cary Scott, a middle-aged widow, falls for the younger man who prunes the trees in her yard, a free-thinker with a distinctly bohemian lifestyle and philosophy. The match is opposed by Cary’s bourgeois social circle and her preppy college-student children, and Cary must submit to this pressure or rebel against it. She chooses to rebel, but her rebellion is wonderfully quiet. There is no drunkenness, no drugs, no hint of the young or the hip. She does not act out, she does not make any scenes. It is others in her community, in fact, who, resentful of her bid for freedom from the rules that constrain them or hungry for salacious gossip, make scenes.
What drives Cary’s rebellion is not the frustration of a teenager but the real dread of a lonely widowed housewife who can find nothing to look forward to but the company of the fancy new television her son buys her for a Christmas present. What Cary is trapped in is much more complicated and much scarier than what the adolescent is trapped in, and she is bound to it by very real ties. She has no wish to embarrass or hurt her friends and children, nor to renounce her responsibilities, and the choice to reject the value system she lives in is therefore difficult and painful—as it ought to be for a real adult. The gleeful, slightly mean-spirited pleasure that we get from watching respectability shocked and embarrassed is not available in All That Heaven Allows. But the joy of liberation is, and in this way we discover what the standard Hollywood narrative has obscured: that that pleasure and this joy are very distinct sensations.
Tuesday, May 1, 2018
Hollywood Rebels (1)
I’ve been thinking further on the endless figurative rebellions of mainstream Hollywood. In many films, rebellion is an explicit and central theme—American Beauty, Office Space, The Breakfast Club, etc.— but even when it is not central, it is almost always present, at a level so deep that I am almost tempted to say that rebellion is the fundamental aesthetic position of Hollywood. It is through his or her rebellion, however trivial— the flouting of uptight manners, of teacher, of parent, of boss, of commanding officer, of social norm— that we recognize the Hollywood hero.
But it is a strange sort of rebellion that Hollywood shows us. For one thing, these rebellions always, as it were, dress up in the jewelry of political resistance, even though in most cases nothing is overturned, no structure dismantled or even threatened. Some middle-aged dad smokes pot and starts working out, some heiress falls for a poor artist but then he drowns, some teenagers write a manifesto and then feel better about themselves. We must ask how it is that these entirely apolitical narratives manage to achieve the edgy aura of political resistance and evoke the righteous joy of justice being done.
What they lack in real political content these rebellions make up for in performance: the rebels do not actually do much, but they strike excellent poses, flout norms with heroic coolness: they embarrass parents, do shocking things at garden parties, leave uptight people apoplectic. It is not the breaking of rules that makes audiences cheer gleefully but the flouting of them.
Flouting rules is exciting, because it suggests a lack of fear of consequences. But such a fearlessness is possible, for the Hollywood rebel, precisely because the rules that she is breaking do not actually have much hold on her. Real rebellion, whether it takes a political or a personal form, is always against constraints in which one is truly caught and enmeshed, against ties that have a real hold on one. To break such rules is terrifying. One expends one’s energy in breaking them; one has none left to flout them and strike poses.
But the type of the Hollywood rebel is not the political rebel but the adolescent. The adolescent’s aim is precisely to perform her independence. She may, in the process, break rules and weaken authority relations, but these actions are incidental: childhood always involves rules and authority relations, these are necessary and good, and their specific form is mostly a matter of culture; the adolescent’s psychological need is not to modify the parenting or pedagogical culture around her but to assert her independence. There is nothing wrong with this move (though we should remember that the very phenomenon of adolescence is a peculiar feature of modern industrialized culture), but we should not confuse it with political resistance.
More on this theme >>
Friday, April 27, 2018
Adulthood (5)
When we think of adulthood, what do we think of? A respectable career, home ownership, spare towels in the linen closet, etc. There are of course other, more internal qualities that we associate with adulthood, but those are harder to perceive and to define; it is by the external markers of job, house, etc. that we most readily identify the adult and, by their absence, the one who has failed to achieve adulthood.
Let me then make the obvious observation: these are not in fact the markings of maturity but of membership in a particular class. But, because the class in question is the one that (nearly) everyone aspires to, these markers have taken on a magnified significance: they are not merely the markers of adulthood for the bourgeoisie; they are the markers of the adulthood that everyone is striving for. From the American dream and the fetish of “upward mobility” comes a confusion of concepts: success, maturity, class membership—which is which? There is a suggestion, almost, that the only true adult is the bourgeois adult.
Adulthood, then, takes on an aspirational quality: it is not something that everyone achieves by dint of surviving so many years on earth, nor even by dint of learning something from those years. It is something that one amasses. It comes from particular choices, choices that depend not upon moral strength or courage but on material prudence and conformity. This reorientation runs in strange parallel to the reorientation of class itself, from a fixed category that one is born into, to a kind of ladder that one must climb and climb— or fall!
And yet, at the same time, we seem to find something repugnant about the bourgeoisie, and what we find repugnant is almost precisely their (as we depict it) uptight, stodgy, comfortable, self-satisfied grown-up-ness. Perhaps no boogeyman looms as large as this one in our popular culture. All of rock and roll, all of punk rock rages against it. It is the target of all of mainstream Hollywood’s figurative rebellions: the young Victorian heiress bridling under her family’s uptight manners, the 1980s high-schooler turning the tables on repressive teachers, the suburban dad rediscovering pot and rock & roll—are all figures in revolt against adulthood-as-bourgeois-culture. It is impossible to know where one ends and the other begins.
Thus, at the same time that so many of us doggedly pursue bourgeois adulthood (or wonder what is wrong with ourselves that we fail to pursue it), we also imbibe a steady diet of fictional narrative glorifying the rejection of that form of adulthood and all the aspirations that come with it. As with so many contradictions that exist comfortably side-by-side within our cultural mythology, these fictional narratives do nothing to reduce the force of the command to grow up and become bourgeois. In fact, for me at least, they make that command much more powerful, because they seem to turn my resistance to it into something ridiculous, a playing-out of puerile Hollywood fantasies.
More on puerile Hollywood fantasies >>
More on the bourgeoisie >>
More on adulthood >>
Tuesday, April 24, 2018
Adulthood (4)
Midway through the “Twixters” article, we find this paragraph:
Twixters expect a lot more from a job than a paycheck. Maybe it's a reaction to the greed-is-good 1980s or to the whatever-is-whatever apathy of the early 1990s. More likely, it's the way they were raised, by parents who came of age in the 1960s as the first generation determined to follow its bliss, who want their children to change the world the way they did. Maybe it has to do with advances in medicine. Twixters can reasonably expect to live into their 80s and beyond, so their working lives will be extended accordingly and when they choose a career, they know they'll be there for a while. But whatever the cause, twixters are looking for a sense of purpose and importance in their work, something that will add meaning to their lives, and many don't want to rest until they find it. "They're not just looking for a job," Arnett [a developmental psychologist] says. "They want something that's more like a calling, that's going to be an expression of their identity." Hedonistic nomads, the twixters may seem, but there's a serious core of idealism in them.

In 2005, this set my teeth on edge. The discovery that my own dreams and ambitions were merely the predictable loop-the-loops of a member of a cohort of naive (and, note, self-aggrandizing) idealists was of course humiliating, but Grossman’s babying efforts to pat us on the back gave it all a more vicious bite. (His tone here reminds me of people who say things like, “Kids these days are geniuses! My three-year-old niece already knows her way around an iPad better than I do.” I don’t believe these people are as naive as they pretend to be. On some level, they must realize these children are warped and addicted, but this awareness runs up against an incontrovertible command to tell kids they’re wonderful and brilliant, and the result is a sort of curdled positivity. Grossman’s self-deception is subtler, but it’s of the same general type.)
In the years since, I’ve heard Grossman’s assessment repeated in various forms, usually (thank heavens) without the babying cheeriness. “Millennials expect too much from a job,” baby-boomers tell me. It is clearer than ever by now that I am one of the millennials they’re talking about, yet I no longer feel embarrassed or condemned. I have grown bolder in my failure. Once it shamed me, but now I stand behind it with gloomy pride.
Here is what I have to say to these baby-boomers. When you entered the labor market, whatever job you took, you had at least this shred of dignity and purpose: that you were “a productive member of society.” It had not yet become clear that more productivity was just what society did not need. You did not seem to be a superfluous being whose “contribution” could only ever be a contribution to a vast, intractable problem. You did not look out on the world and seem to hear a voice, muffled but urgent, whispering:
There are already too many lawyers, we do not need more of them. There are too many books, please don't write any more of them. Too many articles fighting for attention. Too many non-profits fighting for funding. Too many people, please don't deliver any more of them. Too many lives, please stop saving them. Too many buildings, please stop building them. Too many objects, please stop manufacturing them.
I’m well aware how self-justifying this all sounds. I don’t deny the charge. Naturally I want to justify myself, just as much as any investment banker or insurance salesman. I’m no better or worse than they—my shortcomings are just of a different type.
And, I won't deny, I long to go back: I long to believe again in the goals that once seemed clearly laid out for me. More than that, I long to achieve those goals. I wish success and prestige would suddenly fall upon me. I wonder, if they did, if I would forget all these galloping thoughts, these insatiable crumbs, and live comfortably and without criticism. Maybe I would. But they don’t fall upon me, and so I must keep nervously crumbling this loaf of thought and sprinkling it on the internet.
More on adulthood >>
Monday, April 23, 2018
Adulthood (3)
In 2005, the year I graduated college, Time Magazine published an article by Lev Grossman about people my age and a few years older. The headline and subhead ran:
Grow Up? Not So Fast
Meet the Twixters. They're not kids anymore, but they're not adults either. Why a new breed of young people won't — or can't — settle down
This was the first of a wave of such articles discussing the strange Peter-Panish quality of my generation. (Do not ask me for other examples, I don’t remember any. Maybe I imagined them, maybe there were dozens.)
These articles irritated and unsettled me, but this was clearly not because the question they were asking was a bad one. There was something weird going on with my generation. We all knew it. We wandered the streets of large coastal cities like hungry wraiths looking for something solid to cling to. We wallowed in nostalgia for the pop-culture of our childhoods, which was of course a proxy for something else—a sense of security, maybe, but not material security—that we remembered having in our childhoods. Clearly, a good deal of what these articles were saying about us was, in some straightforward sense, accurate. This was why they were unsettling. But why then were they so irritating?
The clearest answer I can give to this question, and it is not a very clear one, is that these articles were like think-pieces written by a murderer, asking what is wrong with her hand that it keeps picking up blunt objects and banging them against round, hard things until those things crack open and their wet insides spill out. They are like think-pieces written by a drunk asking what is the matter with his belly that it keeps being full of alcohol. My point is not that the solution is obvious: an alcoholic may well ask himself why he keeps drinking, and the answer may be deep and complex; same with the murderer. My point is that (a) the problem is not isolated to the place where it is immediately manifest; and (b) it is perfectly clear why the writer is so intent on misunderstanding the situation. It is, after all, much more appealing to prod a sickly organ than to explore the dire state of the whole organism.
More on this theme >>
Sunday, April 22, 2018
Adulthood (2)
I grew up in a comfortable home. I went to private school, I did well in many subjects, I was considered promising. I went to a prestigious college, took time off, returned to school, dabbled in a wide array of disciplines, took more time off, returned a second time. I graduated, I took a full-time job, I worked for two years, I gave notice. I went abroad for a year, returned, took a new full-time job, worked a year, switched to a part-time position, began private tutoring. Another year passed, I left the part-time position, I started a blog, produced a play in my living-room, survived on freelance tutoring. I began working on a protest movement, abandoned my blog, stopped taking tutoring clients, gave away money, departed the protest movement disillusioned.
At this point, it occurred to me (not for the first time, but more forcefully than ever before) that I was too old to consider these my youthful wanderings. Looking over my life, I saw that the trajectory was not towards a settled career but away from it. This was, of course, very scary. I decided to buckle down, to make myself settle on something. I couldn't do it. I applied to jobs, to graduate school, but my applications were all delayed by indecision, begun too late, finished at the last minute. I received only rejections. I had gotten off of the train and could not find my way back on. At my parents’ synagogue on the High Holidays, I would see kids I’d grown up with, now transformed into lawyers, doctors, professors.
I had the intellectual capacity to do a variety of things, but I lacked some temperamental ingredient. I’m not lazy: give me any job, and I work hard at it. I’m not apathetic or dull. On the contrary, I’m passionate and curious and enjoy solving problems. I have no difficulty getting along with other people. No, it’s something else. What is it?
Then a strange thing happened: I accepted my mysterious inability—not as something good or pleasant but as something I could not run from. Maybe it was a curse, but it was not a moral failing. It was not merely the accidental result of a series of careless decisions or listless dissipation. It was something essential either to my nature or to my relationship to the world.
In the meantime, I had begun to notice that a lot of the people I was close to showed similar symptoms. And by all reports, statistically speaking, my friends and I were no anomaly.
In fact, I had long been aware of these statistical trends amongst young people, but previously I had seen my participation in them only as a source of further humiliation. Now it seemed to hint at something more generally frightening but less personally demeaning. Maybe failure was in fact a reasonable response to the world. It was not the only reasonable response, perhaps, but it was no less reasonable, and, I began to think, in its peculiar way, no less honorable, than the alternatives.
More on adulthood >>
Friday, April 20, 2018
Adulthood (1)
“I have four clean towels in the closet at all times: I’m a grown-up,” said the girl at the next table. “I’ve got a hair-dryer too, and an iron and an ironing board."
“I can’t believe it!” said her friend.
“It’ll happen to you one day too,” said the first girl. “I didn’t see it coming. The other day, I looked around, and there it all was, the towels, the iron, and I thought, my god, it’s happened, I’ve grown up.”
I could put up with no more of this. I turned around in my chair. “Listen,” I said, “what you’re talking about—that’s not being grown up, that’s being bourgeois. If you want to be bourgeois, that’s fine, that’s great, but don’t confuse it with adulthood.”
They looked at me, offended, naturally. “Some people never grow up,” murmured the second girl.
“You’re right!” I said. “In fact, hardly anyone does, no matter how many accouterments they manage to stock their closets with.”
“Why don’t you mind your own business?” said the one with the four towels. “No one was talking to you.”
“Oh, I suppose it was a private chat? At high volume, on a self-congratulatory topic, in a crowded café garden? I suppose you had no intention that the world should hear of your graduation into adulthood, you mealy-mouthed little braggart!” I cried, striking the table and overturning my coffee mug, which poured its contents neatly into my lap, as though there were some natural waterway leading in that direction.
I leaped up as the two girls retreated, pale but giggling, into the shadowy interior of the café.
* * *
The preceding is of course highly fictionalized. I did in fact overhear a conversation more or less like the one described between two girls in a café a couple years ago, but I had the— was it good sense? cowardice? apathy?— not to intervene. I present this work of partial fiction here by way of introducing a theme I would like to devote some time to.
More on adulthood >>
Thursday, April 19, 2018
Thoughts
A thought is like a drug trip. In the midst of it, everything seems so meaningful and interrelated. Later, one awakes with a headache and remembers, with a confused sense of nostalgia and disillusionment, the excitement of the thought, the impression it gave of largeness, of drawing together many different themes. The worst thing to do at a moment like this is to cling to the thought’s past glory and clarity, to take, as it were, the hair of the dog. In this way, one can soon become thoroughly addicted. Some thoughts are so delightful that people can’t get enough. Soon an epidemic gets underway: everyone I meet is high as a kite on some thought, knowing smiles frosted on their lips.
That’s why I call these things crumbs: so I don’t forget how far they all are from being a loaf.
Still, there’s no good reason to throw away a thought in disgust the moment one recovers from it. One only does this out of embarrassment, so as not to be reminded what a little, incomplete thing one was enthralled by. Much better to keep old thoughts around as reminders. And because, after all, each is a little flake of the truth.
Wednesday, April 18, 2018
A Pragmatic Approach
What will you do, and how will you live, when you know that there is nothing after or beyond this life? Will you live life more fully, more wholeheartedly because your eyes are not fixed on the pie in the sky? Or rather: will you dread that incomprehensible void of non-existence more than any hell that religious imagery could conjure? Will you live more carefully, guard your life more jealously, because death to you is not a passage but an end so complete that you dare not look upon it, lest it take hold of your mind, and all possibility of significance vanish before the yawning vastness of its irrevocable negation? Will you become obsessed with safety, pile precaution upon precaution, try to leave no chink where the cold wind of void may enter? But all in vain, of course. Will you live then in a world of dross, a world where nothing is valued so high as the extension, however incremental, of this life? Will you devote yourself to the material, to diet and nutrition, to fitness and health, to sterilization of surfaces and utensils, to food storage regulations? Will you live in terror of hazardous chemicals, of hidden poisons and sudden accidents? Will you discover then that every moment is precious, not in that it is full of secret ecstatic meaning, but in that you dread to lose it more than you dread any humiliation or decrepitude and will pay whatever price for a few more of the same?
And if, on the other hand, you believe that the soul survives the boundaries of this life, that it sheds the body like a snake its skin, and slithers on to unknown realms, will you give up all joy and lust for life in patient waiting for future rewards? Will your vision be dim to earthly beauty, earthly pain, earthly joy, because your mind is fixed on that other world? Or rather: will you live here more fully for living here more lightly? In caring less for flesh and more for spirit, will you perhaps live more truly in the world, and find yourself closer to its real ecstasies and tragedies? If you do not think that the closing curtain of death renders everything flat and equal, voids all pain and joy, settles all accounts to zero—will these pains and joys not then seem far more significant? And if you happen to believe that pain here in this world corresponds to joy in that other and joy here to pain there, will this reversal of meanings laid atop the immediate visceral meanings not rather enrich than impoverish them?
For this world is already half spirit.
I make no metaphysical claims. By spirit I mean only those substances that are perceived by the heart and the intellect: we are sensible to them, we encounter them, feel them, cannot avoid them. To say that those substances are not “real” in the way that other substances are real is to draw an artificial distinction. What do we mean by "real" if not that which we cannot avoid?
Friday, April 13, 2018
The Arts
All ideology provokes opposition; powerful then is the ideology that contains the figure of its own opposition comfortably within itself.
Thus to the rigor of science is opposed the freedom of art; to the vapidity of consumerist culture, the edgy brilliance of art; to the conformity of the corporate office, the revolutionary exuberance of art; to the prudent uptightness of the nerd, the sexy recklessness of the artist. It does not matter whether corporate offices are in fact conformist or instead partake of artistic exuberance, or both; it does not matter whether the nerd is a lab technician or a banker or an insurance salesman; it does not matter whether it appears to be scientism or economics or mean kids at school or parents that oppress us. There is a wellspring of dissatisfaction, an abiding sense of being trapped in something; art stands as the figure of opposition to this feeling. In the myriad versions of the myth (in books, movies, etc.), the repressive force is dressed up in every possible disguise, but it draws its psychic power, its seeming reality from this wellspring, this sense of being trapped. The source of this feeling is certainly not science. It is not even reductive materialism, though that is surely a part of it. It is something larger than all these. My purpose here is not to name it. Perhaps it is unnameable. I am speaking only of the role played by art; these digressions are only to ensure that I do not seem to be simplifying and confusing things any more than I actually am.
Art is the spiritual outlet for the prisoner of rationalist ideology, the oxygen mask that drops down from the ceiling to alleviate the sense of gradual suffocation that might otherwise drive him to desperate action. This explains why it is precisely in the liberal, educated urban centers and amongst the most educated and rational classes that art is most devoutly fetishized.
At the same time, or rather at other moments, as it were under different lighting, science takes on precisely the virtues of art: thus the scientist as free thinker, as rebel against conservative belief systems, as champion of individualism, trusting in the evidence of his senses rather than the ossified wisdom of the past. I raise this point here only in order that the reader should not become confused by the double-image. Both depictions exist simultaneously and without real conflict. We see one, the lighting shifts and we see the other; the mythology is not at all destabilized.
Two important points must be made about this art which stands for all that we long for.
(1) Prior to about three hundred years ago, art had a very different role. There were, on the one hand, elaborate religious and mythological systems that sacralized and structured the world and imbued it with meaning; on the other, there was painting, sculpture, poetry, music, etc. that spoke and signified within the systems of meaning that religion and mythology generated, that drew its imagery and allusions and often its passion from them, that was frequently devotional and in many cases actually created for ritual purposes. Now, the religious-mythological structures have been wiped away, and art is left to carry the entire burden: to not only depict sacred events but to be itself the sacred event; not only to express the meaning that is understood to lie in the world but to be the very source and form of that meaning. This is a burden it cannot bear.
It is entirely consistent with its role as the new source of the sacred that art has played such a central role in disassembling—or rather, flouting, tearing down, making a mockery of—older forms and objects of sacredness and devotion. The new king must, first of all, get rid of the old one.
But it is perhaps in its capacity as engine of desacralization that we see most clearly art’s allegiance with reductive materialism. Science itself, real science, can do very little to dissolve systems of meaning, morality, sacrality, etc. (Claims to the contrary are made with stunning confidence by people like Adam Gopnik, but it seems to me these claims are all nonsense—a discussion for another time.) The dissolution of these systems requires something that can act on the cultural level, that will not only soberly suggest that a literal reading of their cosmological foundations is untenable, but will actually flout their mores, make a laughing-stock of their sacred objects, undermine them at what we might call the ground-level of their psychological power.
I am not necessarily advocating a return to any particular religion or in fact to religion in general. I am only describing the way in which art and radical materialism work together.
(2) Over the past hundred years, and especially over the past fifty, all that which goes by the name art has undergone a division. At one time, this division was described as being between “pop” (or “low”) and “high” art. By now the division is further advanced, and the two resulting strands appear with a new, bleaker clarity. On one side is the work of a highly consolidated media industry; what distinguishes this work, first and foremost, is not its popularity nor its low-brow-ness nor any other internal quality, but rather the economics of its production. It is mass-produced for a mass audience, carefully tailored to market segments, and disseminated through highly efficient distribution systems. On the other side stand the Real Artists—simultaneously heroic pioneers of the new and heroic bastions of High Culture—maintained by a rickety system of patronage from grant-making organizations, wealthy parents, and Kickstarter. The most important point about this latter group is that their work is viewed by an increasingly tiny segment of the population, and this segment consists largely of the artists themselves and their close friends.
The preceding paragraph sounds curmudgeonly and obnoxious, I know. But is there any other way to view the situation? Please write in! Tell me!
Thus to the rigor of science is opposed the freedom of art; to the vapidity of consumerist culture, the edgy brilliance of art; to the conformity of the corporate office, the revolutionary exuberance of art; to the prudent uptightness of the nerd, the sexy recklessness of the artist. It does not matter whether corporate offices are in fact conformist or instead partake of artistic exuberance, or both; it does not matter whether the nerd is a lab technician or a banker or an insurance salesman; it does not matter whether it appears to be scientism or economics or mean kids at school or parents that oppress us. There is a wellspring of dissatisfaction, an abiding sense of being trapped in something; art stands as the figure of opposition to this feeling. In the myriad versions of the myth (in books, movies, etc.), the repressive force is dressed up in every possible disguise, but it draws its psychic power, its seeming reality from this wellspring, this sense of being trapped. The source of this feeling is certainly not science. It is not even reductive materialism, though that is surely a part of it. It is something larger than all these. My purpose here is not to name it. Perhaps it is unnameable. I am speaking only of the role played by art; these digressions are only to ensure that I do not seem to be simplifying and confusing things any more than I actually am.
Art is the spiritual outlet for the prisoner of rationalist ideology, the oxygen mask that drops down from the ceiling to alleviate the sense of gradual suffocation that might otherwise drive him to desperate action. This explains why it is precisely in the liberal, educated urban centers and amongst the most educated and rational classes that art is most devoutly fetishized.
At the same time, or rather at other moments, as it were under different lighting, science takes on precisely the virtues of art: thus the scientist as free thinker, as rebel against conservative belief systems, as champion of individualism, trusting in the evidence of his senses rather than the ossified wisdom of the past. I raise this point here only in order that the reader should not become confused by the double-image. Both depictions exist simultaneously and without real conflict. We see one; the lighting shifts, and we see the other; the mythology is not at all destabilized.
Two important points must be made about this art which stands for all that we long for.
(1) Prior to about three hundred years ago, art had a very different role. There were, on the one hand, elaborate religious and mythological systems that sacralized and structured the world and imbued it with meaning; on the other, there was painting, sculpture, poetry, music, etc. that spoke and signified within the systems of meaning that religion and mythology generated, that drew its imagery and allusions and often its passion from them, that was frequently devotional and in many cases actually created for ritual purposes. Now, the religious-mythological structures have been wiped away, and art is left to carry the entire burden: not only to depict sacred events but to be itself the sacred event; not only to express the meaning that is understood to lie in the world but to be the very source and form of that meaning. This is a burden it cannot bear.
It is entirely consistent with its role as the new source of the sacred that art has played such a central role in disassembling—or rather, flouting, tearing down, making a mockery of—older forms and objects of sacredness and devotion. The new king must, first of all, get rid of the old one.
But it is perhaps in its capacity as engine of desacralization that we see most clearly art’s allegiance with reductive materialism. Science itself, real science, can do very little to dissolve systems of meaning, morality, sacrality, etc. (Claims to the contrary are made with stunning confidence by people like Adam Gopnik, but it seems to me these claims are all nonsense—a discussion for another time.) The dissolution of these systems requires something that can act on the cultural level, that will not only soberly suggest that a literal reading of their cosmological foundations is untenable, but will actually flout their mores, make a laughing-stock of their sacred objects, undermine them at what we might call the ground-level of their psychological power.
I am not necessarily advocating a return to any particular religion or in fact to religion in general. I am only describing the way in which art and radical materialism work together.
(2) Over the past hundred years, and especially over the past fifty, all that which goes by the name art has undergone a division. At one time, this division was described as being between “pop” (or “low”) and “high” art. By now the division is further advanced, and the two resulting strands appear with a new, bleaker clarity. On one side is the work of a highly consolidated media industry; what distinguishes this work, first and foremost, is not its popularity nor its lowbrowness nor any other internal quality, but rather the economics of its production. It is mass-produced for a mass audience, carefully tailored to market segments, and disseminated through highly efficient distribution systems. On the other side stand the Real Artists—simultaneously heroic pioneers of the new and heroic bastions of High Culture—maintained by a rickety system of patronage from grant-making organizations, wealthy parents, and Kickstarter. The most important point about this latter group is that their work is viewed by an increasingly tiny segment of the population, and this segment consists largely of the artists themselves and their close friends.
The preceding paragraph sounds curmudgeonly and obnoxious, I know. But is there any other way to view the situation? Please write in! Tell me!
Wednesday, April 11, 2018
Self-Flagellation (2)
A commenter suggests “another reason why [scientism]’s so appealing: elitism”:
I don't see it so much as self-flagellation… as the opportunity to push oneself out of the class of “menial service and stunted opportunity”, which is to be accomplished by pushing others into it. It's the old "I'm smarter than you because I can detect the world is awful" rag… All self-flagellation is at least in part about purification. And all purification is at least in part about elitism.
But as with so many things, it’s hard to know which is the cause and which the effect: do we sneer at the superstitious in bitter envy, because secretly we long to believe in the sorts of things they believe? Or do we suffer the misery of a crushing world-view simply so that we may sneer at others?
What I want to suggest is that the latter description (at least taken on its own) is in some sense grounded in the same bleak world-view that we're critiquing here. It understands human beings in essentially predatory, competitive terms. It finds our deepest motives in some reductive-Darwinist compulsion to climb to the top. True, I am tortured by envy and competitive greed. But this is not all I am, nor will I believe that it’s what I am at heart. I envy not because I am built to envy but because I am discontent in myself, because of some sadness, some fear.
Let us not doubt that economic motivations, class warfare, insensate gobbling greed are at work in all of us, in all of our mistakes. But let us continue to insist that there are other motives at work as well, motives that are irreducibly personal and human. Let us view ourselves and our adversaries as characters in a novel, not as subjects in a statistical study. The more we do this, the more we will in fact experience our lives more like a novel and less like a statistical study.
Tuesday, April 10, 2018
Self-Flagellation (1)
But we are not satisfied with this mortal kind of knowing. We want to know as God knows: definitely, perfectly. Which is why we keep raising up these systems of knowledge… So I wrote yesterday, but today I’m not so sure. True, there is a longing for superhuman certainty; and the progressivist pleasure in overcoming the past, in marching forward into the future, what Milan Kundera called the Kitsch of the Great March; and beneath all this an innate human greed, a wyrmish longing to sit on a hoard of treasure.
But, on the other hand, there is something viscerally unpleasant in the feeling that we have the world all figured out. It is a crushing thought, and not from any elevated intellectual standpoint but from a very simple human one. It drains the very color out of life. So the question is not answered and must be asked more plainly: why do we put up with these systems of knowledge with their pretenses of absoluteness—and not only put up with them but swallow them greedily down and beg for more? When the decree comes down from on high that there’s no such thing as God or magic, that art and emotion are only chemical processes, that nothing matters, we do not merely groan and sadly accept our doom. No, we gleefully repeat it to everyone we meet. We heroically devote ourselves to it, we laugh at anyone dreamy-eyed enough to doubt it and revile their ignorance and delusion. Yes, we lash ourselves and one another for all the world like people who like to be whipped.
Until an honest observer must begin to doubt that there ever really was a decree from on high. Until an honest observer begins to think that the decrees are really generated down here below, and the monolith up on the hill (the cathedral, the university) is only a sort of show-tyrant, a figure onto which we project our collective compulsion to tyrannize and flagellate ourselves.
Progressivism (3)
If you want to pile something up, it has to be the sort of thing that stays put. But what sort of thing stays put? I’ll tell you what sort of thing: a thing that’s not alive.
Living things— living virtues, living understandings— wriggle away, shed their skin, become unrecognizable. The understanding that I have called wisdom is living understanding. It lives in the mind. To make it stay put, you have to kill it, stuff it, taxidermy it. It looks the same at first, but it takes only a moment to see that something essential has been removed. It’s not an animal anymore. It’s a wall-hanging.
Monday, April 9, 2018
Moderate Skepticism
There is nothing so dangerous as a moderate skepticism. A radical skepticism is at least logical: nothing is known, nothing is knowable, the chain of logic has no starting point. Such an idea has reached its terminal form. It has discovered the one thing that is definitely true—that nothing is definitely true—and is ready to reveal itself for what it really is: not an insight into the uncertainty of knowledge but a simple misunderstanding of what it is to know. It is then only one easy final step to abandon the fantasy of absolute knowledge and plunge back, open-eyed and unburdened, into the world.
What is dangerous is a partial skepticism, a skepticism that says: all those other ways of knowing are uncertain, but here is the One Way that is certain, here is the path to positive absolute knowledge.
At different times, in different places, different paths are given this idolatrous privilege. In some circles, it is a certain book; in others, a set of research methodologies. It is not that the bible is not full of wisdom. It is not that statistical studies will not generate knowledge. But this wisdom and this knowledge are subject to the same fundamental uncertainty that characterizes all human knowing. To know still means what it has always meant: that we have hold of a thought that has so far proved useful. And what we find, when we consider our experience in the world (like good empiricists), is that many of the thoughts that prove most useful are of the most indefinite sort: hazy, contingent, un-anointed by any magical source of transcendent truth.
But we are not satisfied with this mortal kind of knowing. We want to know as God knows: definitely, perfectly. Which is why we keep raising up these systems of knowledge, which are precisely Towers of Babel: projects of lifting ourselves up as high as God, to see as far as God. We do this even when we no longer believe in God— and why not? If there’s not already someone up there, that just leaves more room for us!
Sunday, April 8, 2018
Progressivism (2)
Clearly, certain kinds of questions are best answered by the accumulative type of knowledge: how to build a house, how much weight a bridge can hold, how to treat a Streptococcus infection, etc. These are important questions to which we all sometimes need answers, but they are not the kinds of questions that most of us spend most of our time worrying about. Rather, we spend our time worrying about how to navigate relationships—with other people, with the various parts of ourselves, and with various Other Things that may or may not be part of ourselves. This is not irrational or neurotic of us: it is in these relationships that we in fact find nearly all of the satisfaction and joy and misery and terror that are our lot in life.
Over the past 150 years and especially in the past 50, a lot of effort has gone into producing scientific answers to questions about these types of relationships. Approaches have varied. At one time surveying “test-groups” and “control-groups” was in vogue. Now I hear neural imaging’s all the rage. And there’s a lot of popular reporting, in magazines and on websites, on the supposed insights provided by these methods. However, no one I know has actually found any of it useful in answering questions like, “Should I marry the person I’m dating?” or “Why does my job seem so meaningless?” or even “How do I talk to my roommate about the dishes issue?” and “Should I go home for Thanksgiving?”
Will accumulative knowledge someday be found that bears on these sorts of questions? You may choose to keep an open mind and say that it might, but, if you are not entirely fanatical, you will have to admit that it also might not. That is, the questions to which we are most eager to find answers may be matters not of knowledge, but of wisdom. If they are, this is no threat to science, so long as science is content to speak only on those subjects about which it can speak. So long as it is content to be only one little system, not the whole System of Everything.
* * *
In the same way, we should contemplate the limits of progressivism. No doubt certain types of societal virtues are of the sort that we can pile up like scientific knowledge, but others may prove slippery, and the slippery ones may turn out to be the most important. Maybe, even as we are working to make everything more just and more kind and more fair, these slippery virtues are slipping away. Maybe every incision we make in the fabric of culture to insert something good is a hole through which something else that we do not even have a name for is draining away.
What would these slippery virtues be? They would be things that are easy to sense but difficult to define. If we could define them, we could hang onto them. Whatever can be definitely described can be definitely remembered and definitely preserved. But, again, some things leach out of the very words of which they seem to be made.
The progressive believes that changes made for the good of society will in fact increase the good of society; therefore, the progressive seems to be in favor of change and in favor of justice. The opposing position is not (a) a desire for things to stay the same or (b) a preference for injustice. The opposing position is a kind of caution with regard to change, a sense that there is something delicate and precious mixed in with all the ugliness of the world, and that these delicate precious things are the easiest to lose sight of and thus the easiest to lose altogether.
A person who holds this opposing position is interested in conserving something, and so we might call her a conservative, but she need not hold any of the particular positions that are today associated with that term. Likewise, the particular political positions that we associate with the term "progressive" are only a small subset of the issues to which a progressive outlook can be and frequently is applied.