1. Figuring out what is or isn’t art is like pondering what is or isn’t “authentic” Vietnamese cuisine – a hobby of pedants and thought police that usually just gets in the way of a pleasurable experience.
2. Conflating art with aesthetics is like conflating French cooking with the entire culinary universe, or maybe even haute cuisine with the totality of what constitutes food.
3. Molecular gastronomy might be the cooking equivalent of contemporary art, not only because of its rarefied nature, elevated ambition, and intellectual bent, but also because it is elitist, full of gimmicks, faddish, and dying a well-deserved death.
4. Art is a cancerous cell in the body of aesthetic practices, attempting to replicate itself at the expense of the larger body, crowding the diverse, multi-cellular ecosystem with its one-dimensional excesses.
5. Eliminate all art departments and replace them with aesthetics departments (but let’s eventually dismantle them too).
6. Art departments have actually become Art Department Studies, mistaking the problems of art students, professors, and the educational edifice for the problems of art. They also forget that their professionalizing practices (the critique, baptism by theory, the artist statement, etc.) do not serve art, but serve only to beg for disciplinary approval from the corporate university.
7. Art, then, needs audacious cooks, some of whom have gone to school but many of whom have not, who are not cooking to impress their instructors but to make tasty food. Art needs the audacity of participation, led not by art world facilitators but by upstart food truck ventures, by home cooks, by all the people who are bold enough to believe that they are already participating, if the so-called experts would just get out of the way.
…A novelist who made a cult of laziness, he had no qualms about taking it easy when it came to literary invention—“The same idea is in all my books; I shape it differently,” he once said…
Cossery’s heroes are usually dandies and thieves, unfettered by possessions or obligations; impoverished but aristocratic idlers who can suck the marrow of joy from the meager bones life tosses their way. They are the descendants of Baudelaire’s flâneur, of the Surrealists with their rejection of the sacrosanct work ethic, of the Situationists and their street-theater shenanigans, not to mention the peripatetic Beats or the countercultural “dropouts” of the 1960s. Henry Miller, who raised dolce far niente to an art form, praised Cossery’s writing as “rare, exotic, haunting, unique.” Whether Cossery’s merry pranksters wish merely to have a good time or, as in The Jokers, to wage an all-out campaign of raillery against the powers that be, there is one belief they all share: the only true recourse against a world governed by “scoundrels” is an utter disregard for convention, including the convention of taking anything seriously.
…The proud beggars in this story are Gohar, who has abandoned a professorship to live on the fringe as a street philosopher and bookkeeper in a brothel; Gohar’s protégé, the poet and drug dealer Yeghen, who tries to live his life as if it were itself a poem; and El Kordi, a revolutionary sympathizer chafing against his dead-end job as a government clerk.
The Egyptian-French novelist Albert Cossery was a philosophical and aesthetic dandy who loathed the idea of work, celebrated underground movements and ideas, and absolutely detested power. He was the dandy as a political subversive—an idea that must be resurrected.
Cossery, in a sense, is something of the offspring of the Surrealist Jacques Vaché, a self-described “umourist” who revelled in doing nothing at all: an artist who decided not to create art, a poet who decided not to write poetry, all in an effort to prove that the creation of works is antithetical to the true artist, who must live the art and not leave evidence or relics as proof of genius.
Governments are, in fact, quite terrified of this sort of philosophical dandyism—of the aggregate of individuals who subvert by gleefully doing nothing.
And so it is the politically subversive dandy—the transcendent dandy—who is best equipped to lead a new politically subversive movement, where a panoply of ideas merge like a kaleidoscope. The dandy understands the absurdity of power and the various ways to subvert, ignore and transcend it, without resorting to violent means.
Dandyism, at its core, is political subversion, and Albert Cossery was nothing if not a dandy. And it was the dandies, the forgotten and the ignored, whom Cossery celebrated in his novels.
…Characters opt to withdraw from any idea of a career. To recognize the absurdity of joining power in its game (government) and staying as far away from it as possible. To know that love—for friends, fuck buddies, boyfriends, girlfriends—was all and that it was untouchable, transcendent.
We need a new era of dandyism, of subversives. We need a new counter-culture.
The dandy as imagined by Cossery has time to think and enjoy life. Idleness is not only a virtue for Cossery and his characters, it is elevated to the natural state of being—a rejection of the unnatural tethers which are fixed to our bodies as soon as we escape the womb: the classroom, the cubicle, the wage, the dollar, rent, and so forth.
The Egyptian lived radically lazily on the Left Bank, challenging social norms with books devoid of materialism and ambition.
All his life, Cossery sided with those he felt God had forgotten: petty thieves, pretty prostitutes, exploited workers and hungry vagrants. He despised materialism and eschewed the rat race. In Proud Beggars (1955), usually considered his masterpiece, a university professor finds peace of mind by becoming a bum, proving that beggars can be choosers…
For the author and his lovable rogues’ gallery, sleep, daydreams and hashish-induced reverie are endowed with mystical qualities. Idleness is more than a way of life. It offers the greatest luxury of all: time to think and therefore the chance to be fully alive, “minute by minute”. The overt message of these people whom God has forgotten (but who themselves have not forgotten God) is that paradise is not lost, but most of us are too busy to bask in “the Edenic simplicity of the world”.
“If we take eternity to mean not infinite temporal duration but timelessness, then eternal life belongs to those who live in the present.” ― Ludwig Wittgenstein
…Why do we assume our own temperaments and habits are at fault — and feel bad about them — rather than question our culture’s canonization of productivity?
…Whatever you’re doing, aren’t you by nature procrastinating from doing something else? Seen in this light, procrastination begins to look a lot like just plain existing. But then along come its foot soldiers — guilt, self-loathing, blame.
Though Expeditus’s pesky crow may be ageless, procrastination as epidemic — and the constant guilt that goes with it — is peculiar to the modern era. The 21st-century capitalist world, in its never-ending drive for expansion, consecrates an always-on productivity for the sake of the greater fiscal health.
…the voice — societal or psychological — urging us away from sloth to the pure, virtuous heights of productivity has become a sort of birdlike shriek as more individuals work from home and set their own schedules, and as the devices we use for work become alluring sirens to our own distraction. We are now able to accomplish tasks at nearly every moment, even if we prefer not to.
Still, humans will never stop procrastinating, and it might do us good to remember that the guilt and shame of the do-it-tomorrow cycle are not necessarily inescapable. The French philosopher Michel Foucault wrote about mental illness that it acquires its reality as an illness “only within a culture that recognizes it as such.” Why not view procrastination not as a defect, an illness or a sin, but as an act of resistance against the strictures of time and productivity imposed by higher powers? To start, we might replace Expeditus with a new saint.
At the conference, I was invited to speak about the Egyptian-born novelist Albert Cossery, a true icon of the right to remain lazy. In the mid-1940s, Cossery wrote a novel in French, “Laziness in the Fertile Valley,” about a family in the Nile Delta that sleeps all day. Their somnolence is a form of protest against a world forever ruled by tyrants winding the clock. Born in 1913 in Cairo, Cossery grew up in a place that still retained cultural memories of the introduction of Western notions of time, a once foreign concept. It had arrived along with British military forces in the late 19th century. To turn Egypt into a lucrative colony, the country needed to run on a synchronized, efficient schedule. The British replaced the Islamic lunar calendar with the Gregorian, preached the values of punctuality, and spread the gospel that time equaled money.
Firm in his belief that time is not as natural or apolitical as we might think, Cossery, in his writings and in his life, strove to reject the very system in which procrastination could have any meaning at all. Until his death in 2008, the elegant novelist, living in Paris, maintained a strict schedule of idleness. He slept late, rising in the afternoons for a walk to the Café de Flore, and wrote fiction only when he felt like it. “So much beauty in the world, so few eyes to see it,” Cossery would say. He was the archetypal flâneur, in the footsteps of Walter Benjamin and Charles Baudelaire, whose verses Cossery would steal for his own poetry when he was a teenager. Rather than charge through the day, storming the gates of tomorrow, his stylized repose was a perch from which to observe, reflect and question whether the world really needs all those things we feel we ought to get done — like a few more pyramids at Giza. And it was idleness that led Cossery to true creativity, dare I say it, in his masterfully unprolific work.
I don’t want art to ask any questions, unless it is “what would you like for dinner?” I want art to be predictable, like a romantic comedy that leaves you crying on the couch even though you knew they would end up together. I’d like it to sit in your lap and purr. Art should be like a mailbox – mostly junk, filled with ads, scams, bills, and the occasional birthday card. I don’t want art to teach me anything, unless it is how to make compost or how to organize my closet. I want art to be happy with what it has, I don’t want it to try to get ahead. Art ought to be gossip magazines in the waiting room. It should be a doily on your grandmother’s dresser. Art ought to be a cup holder in your car or the wrappers in the backseat. Or maybe art could be an armrest or bath mat. Art should be like a GAP ad. It should be paint peeling from a barn. I think art should quit being art, should change its name, go into the witness protection program. Art should be your new neighbor and wave to you from their driveway. I want art to be normal. You never have to know the crimes, the dirty deeds, or its sordid past. Art should be an unmade bed that sometimes gets fresh sheets when you’re having a party.
“…passion as the basis of a life well lived.” – The last curator strangled with the guts of the last gallerist? – Against the moderate Enlightenment
But things aren’t quite as easy. First of all, Diderot has been denied this honour multiple times, most recently in 1913. He was, and still is, thought of as an intellectual troublemaker, someone all too fond of Eros and erotic passion, an implacable enemy of the Church, an incorrigible skeptic when it comes to power and the right of individuals to decide over others. These difficulties could perhaps be overcome in our tolerant and republican age. After all, Voltaire, who preceded him in the sacred site of French national memory, was also no friend of the Church.
Condemned by the Church and hated by the Court, d’Holbach and Diderot were beacons of free thinking and directly inspired America’s Founding Fathers. Franklin is likely to have participated in the dinners and ensuing discussions; Jefferson, whose personal library still testifies to his interests, read and admired Diderot, d’Holbach, Helvétius, and Raynal, as well as their intellectual predecessors. For the Declaration of Independence, he transformed the Lockean formulation of the pursuit of life, health, liberty, and property into the more properly Epicurean and Diderotean “pursuit of Happiness.”
Diderot saw the truest and the highest goal of human nature not in reason, but in lust. Humanity’s existence is driven by Eros, by the search for pleasure. This sensualist approach had an important metaphysical consequence: in a world without sin, a world in which no wrathful God has condemned all lust and demands suffering from his creatures on this earth in order to soften the blow of eternal punishment, the goal of life becomes the best possible realization of pleasure, the education of desire in accordance with natural laws. In a society without transcendental interference, this chance, the chance of the pursuit of happiness, must be given to all.
These views were anathema to just about everyone who sought to maintain or gain power, from the aristocracy to Revolutionary dictators such as Maximilien de Robespierre and Napoleon, all the way up to the Catholic Restoration that followed. “Men will not be free until the last king is strangled with the guts of the last priest,” wrote Diderot—not the kind of message to appeal to the nineteenth-century bourgeoisie. Laissez-faire capitalism allowed a self-proclaimed Christian middle class to profit from the misery of the working poor, at home and in the colonies. They could not argue for their position with Diderot, who became ever more scathing about the justification of power, privilege, slavery, and colonial expansion. It was Voltaire’s moderate Enlightenment that offered them the necessary vocabulary and allowed them to see themselves as the guardians of civilization, Enlightenment and religious values, put above the ignorant masses by divine providence.
The radical Enlighteners had understood and condemned this emergent power structure as a ‘conspiracy of priests and magistrates’. Their thought was evolutionist long before Charles Darwin; they defended the rights of slaves before William Wilberforce and of women before Mary Wollstonecraft. They wanted to see individual fulfillment in a morality built around lust, and social justice in a society built on pleasure and free choice, not on pain and oppression. Their potent ideas were unsurprisingly intoxicating discoveries for latter-day revolutionaries; among d’Holbach and Diderot’s ardent readers were Karl Marx, Friedrich Nietzsche, and Sigmund Freud.
The grand narrative of a rationalist Enlightenment that freed humanity from superstition only to subjugate it once again, this time to the dictates of reason and rationalization, suited the interests and self-image of an economy driven by entrepreneurship and fuelled by a cult of efficiency and cheap labor. Our own time, dominated by the fiction of a market that must be obeyed and placated like an ancient deity, and a society optimized for maximum consumption, are a direct consequence of the Enlightenment cult of reason without the Diderotian emphasis on empathy. We have inherited the truncated history and repeated it to one another, tacitly encouraging a narrowness of thought that bears little resemblance to the freewheeling exchanges over candlelight at d’Holbach’s salon. The existence of a second, more radical Enlightenment tradition is not denied completely, but two centuries of historical bias have done their work, slowly but surely.
But while his role in this magnificent mammoth work was important for the eighteenth century, it is Diderot the philosopher who can still speak to us today. His advocacy of a passionate life, of social solidarity and empathy as the foundations of morality, his interest in science as the basis of all knowledge and in art and Eros as ways of creating meaning have lost nothing of their freshness, or their necessity. The real potential of the Enlightenment, he says time and again, is not the absolute rule of reason, but the rehabilitation of passion as the basis of a life well lived.
“Philosophy should be conversation, not dogma – face-to-face talk about our place in the cosmos and how we should live”
Western philosophy has its origins in conversation, in face-to-face discussions about reality, our place in the cosmos, and how we should live. It began with a sense of mystery, wonder, and confusion, and the powerful desire to get beyond mere appearances to find truth or, if not that, at least some kind of wisdom or balance.
Socrates started the conversation about philosophical conversation. This shabby eccentric who wandered the marketplace in fifth-century Athens accosting passersby and cross-questioning them in his celebrated style set the pattern for philosophical discussion and teaching. His pupil Plato crafted eloquent Socratic dialogues that, we assume, capture something of what it was like to be harangued and goaded by his mentor, though perhaps they’re more of a ventriloquist act. Socrates himself, if we believe Plato’s dialogue Phaedrus, had no great respect for the written word. He argued that it was inferior to the spoken. A page of writing might seem intelligent, but whatever question you ask of it, it responds in precisely the same way each time you read it — as this sentence will, no matter how many times you return to it.
Besides, why would a thinker cast seeds on barren soil? Surely it is better to sow them where they’re likely to grow, to share your ideas in the way most suited to the audience, to adapt what you say to whoever is in front of you. Wittgenstein made a similar point in his notebooks when he wrote: ‘Telling someone something he will not understand is pointless, even if you add he will not understand it.’ The inflections of speech allowed Socrates to exercise his famous irony, to lay emphasis, to tease, cajole, and play, all of which is liable to be misunderstood on the page. A philosopher might jot down a few notes as a reminder of passing thoughts, Socrates suggested, but, for philosophical communication, conversation was king.
New technology is changing the landscape in which philosophical conversations — and arguably all conversations — take place. It has allowed contemporary philosophers to reach global audiences with their ideas, and to take philosophy beyond the lecture halls. But there is more to this ‘spoken philosophy’ than simply the words uttered, and the ideas discussed. Audible non-verbal aspects of the interaction, such as hearing the smile in someone’s voice, a moment of impatience, a pause (of doubt perhaps?), or insight — these factors humanise philosophy. They make it impossible to think of it as just a mechanical application of rigorous logic, and reveal something about the thinker as well as the position taken. Enthusiasm expressed through the voice can be contagious and inspirational.
However, it was John Stuart Mill who crystallised the importance of having your ideas challenged through engagement with others who disagree with you. In the second chapter of On Liberty (1859), he argued for the immense value of dissenting voices. It is the dissenters who force us to think, who challenge received opinion, who nudge us away from dead dogma to beliefs that have survived critical challenge, the best that we can hope for. Dissenters are of great value even when they are largely or even totally mistaken in their beliefs. As he put it: ‘Both teachers and learners go to sleep at their post, as soon as there is no enemy in the field.’
Whenever philosophical education lapses into learning facts about history and texts, regurgitating an instructor’s views, or learning from a textbook, it moves away from its Socratic roots in conversation. Then it becomes so much the worse for philosophy and for the students on the receiving end of what the radical educationalist Paulo Freire referred to pejoratively in Pedagogy of the Oppressed (1970) as the ‘banking’ of knowledge. The point of philosophy is not to have a range of facts at your disposal, though that might be useful, nor to become a walking Wikipedia or ambulant data bank: rather, it is to develop the skills and sensitivity to be able to argue about some of the most significant questions we can ask ourselves, questions about reality and appearance, life and death, god and society. As Plato’s Socrates tells us, ‘These are not trivial questions we are discussing here, we are discussing how to live.’
Giacomo Leopardi (1798-1837) has been remembered as a poet who produced delicate verse inspired by a melancholy version of Romanticism, along with some sharp epigrams on the discontents that go with civilisation. This was always a crude view of the early-19th-century Italian writer. Leopardi’s subtle sensibility eludes conventional intellectual categories and the true achievement of this subversive genius has been little recognised.
With astonishing prescience, he diagnosed the sickness of our time: a dangerous intoxication with the knowledge and power given by science, mixed with an inability to accept the humanly meaningless world that science has revealed. Faced with emptiness, modern humanity has taken refuge in schemes of world improvement, which all too often – as in the savage revolutions of the 20th century and the no less savage humanitarian warfare of the 21st – involve mass slaughter. The irrationalities of earlier times have been replaced by what Leopardi calls “the barbarism of reason”.
…An anthropologist of modernity, Leopardi stood outside the beliefs of the modern age. He could never take seriously the faith in progress: the notion that civilisation gradually improves over time. He knew that civilisations come and go and that some are better than others – but they are not stations on a long march to a better world. “Modern civilisation must not be considered simply as a continuation of ancient civilisation, as its progression . . . These two civilisations, which are essentially different, are and must be considered as two separate civilisations.”
His sympathies lay with the ancients, whose way of life he believed was more conducive to human happiness. A product of the increase of knowledge, the modern world is driven by the pursuit of truth; yet this passion for truth, Leopardi suggests, is a by-product of Christianity. Before Christianity disrupted and destroyed the ancient pagan cults with its universal claims, human beings were able to rest content with their local practices and illusions. “Mankind was happier before Christianity than after it,” he writes.
What fascinated Schopenhauer, along with many later writers, was Leopardi’s insistence that illusion is necessary to human happiness. Matthew Arnold, A E Housman, Herman Melville, Thomas Hardy, Fernando Pessoa (who wrote a poem about the Italian poet) and Samuel Beckett were all stirred by his suggestion that human fulfilment requires a tolerance of illusion that is at odds with both Christianity and modern science. A version of the same thought informs the work of Wallace Stevens, perhaps the greatest 20th-century English-language poet, who saw the task of poetry as being the creation of fictions by which human beings can live.
Leopardi was emphatic in affirming the constancy of human nature and the existence of goods and evils that are universally human. He was far from being a moral relativist. What he rejected was the modern conceit that aims to turn these often conflicting values into a system of universal principles – a project that fails to comprehend the irresolvable contradictions of human needs. “No one understands the human heart at all,” he wrote, “who does not understand how vast is its capacity for illusions, even when these are contrary to its interests, or how often it loves the very thing that is obviously harmful to it.” Modern rationalists imagine they do not succumb to this quintessentially human need for illusion, but in reality they display it to the full.
…The barbarism of reason is the attempt to order the world on a more rational model. However, evangelists for reason are more driven by faith than they know and the result of attempting to impose their simpleminded designs on the world has been to add greatly to the evils to which human life is naturally prone.
Some will find Leopardi unsatisfying because he proposes no remedy for modern ills, but for me a part of his charm comes from how he has no gospel to sell. The Romantic movement turned to visions of natural harmony as an escape from the flaws of civilisation. With his more penetrating intelligence, Leopardi understood that because human beings are spawned by natural processes, their civilisations share the ramshackle disorder of the natural world. Brought up by his father to be a good Catholic, he became a resolute atheist who admired ancient pagan religion; but because it was not possible to return to the more benign faiths of ancient times, he was friendly to Christianity in his own day, seeing it as the lesser of many evils: “Religion (far more favoured and approved by nature than by reason) is all we have to shore up the wretched and tottering edifice of present-day human life.”
Realising that the human mind can decay even as human knowledge advances, Leopardi would not have been surprised by the stupefying banality and shallowness of current debates on belief and unbelief. He accepted that there is no remedy for the ignorance of those who imagine themselves to be embodiments of reason. Today’s evangelical rationalists lag far behind the understanding of the human world that he achieved in the early decades of the 19th century.
“It seems to me that daily practice—small choices, lives well lived, mindfully and attentively lived—is the only way a just society can sustain itself.” – A world made of stories
Bill: I think you and I would both say that a traditional experience of wilderness—the kind where you’re living outdoors for an extended period, in a landscape far away from ordinary comforts—is wholly a creature of civilization. It’s an expression of certain cultural values, but it’s still a real experience. It’s still something we can use to take our compass bearings. We can still look to those values for our sense of self in these places.
Michael: I think science can give us a measure, too. When you study how nutrients cycle in a natural environment, for example, you can learn something about how to nourish soil. The study of ecosystems in their untrammeled state can teach us ways to mimic them, and that’s a really important resource for things like sustainable agriculture.
Leaving “wilderness” aside, I do think there’s this wild other — I don’t know what exactly we should call it — that has an enormous amount to teach us. I think the encounters we have with plants and animals are really useful. We learn important things about what it means to be human and what it means not to be human. There is that quality of wildness that’s essential as something to learn from, to reflect on, to measure ourselves against.
Michael: Politics come in as soon as we attempt to define “sustainability.” I think we’re contesting it right now. There are meetings going on now between environmentalists and corporate leaders about how to define the sustainability labels put on products, and that’s a fiercely political argument.
Bill: That’s right. The other trouble with sustainability is that it tends to point toward a future in which the good system is a stable system. But that’s not how history works. History is unstable. Perhaps that’s why the word resilience now gets invoked. Resilience and sustainability together are the territory in which our political and theoretical work needs to be done. We need unstable systems that nonetheless operate within a band of sustainability.
Michael: The idea of resilience—there’s an example of drawing from what we understand about natural systems.
Michael: And there is some role for science in describing those systems and explaining how they work. I find the word useful.
Bill: I do, too.
Michael: The word is useful in many different contexts, because it links nature to qualities we like in ourselves, in our children, and in the social realm, so I think it’s very productive. But where does it come from?
Bill: Out of ecology and climate science. It emerged as more and more scientists began to believe that the effects of climate change are such that we are going to lose ecosystems that we hoped could be saved. As the larger system migrates toward its limits, the question of which systems are going to survive has become more and more compelling.
Michael: But the word also comes out of biodiversity studies, right? The idea that the more species there are in a unit of land, the more it can deal with fire, with changes in temperature, and so on? It’s an interesting measure to apply to certain things. I mean, we need words that constitute value judgments, right?
Bill: We do—so we can tell stories about them. Environmentalism at its best has been good at telling stories about the connections we don’t ordinarily see in our lives. How what we buy in a grocery store has consequences for the earth, for people, for animals. Taking responsibility for the choices we make in our daily lives: that’s one of the things environmentalism has been teaching all along.
I’d contrast it with the illusion of a transcendent leap, that if we can just embrace the cosmic good, we can have a revolutionary moment in which all is transformed. But the older I get, the more I mistrust the notion of a revolutionary leap. It seems to me that daily practice—small choices, lives well lived, mindfully and attentively lived—is the only way a just society can sustain itself. We have to make daily choices. We can’t imagine one big apocalyptic change.
Michael: Wendell Berry has this great line about distrusting people who love humanity. You can’t love an abstraction, he says. You can’t love a statistic. You can love the person near you, and your community, and your neighbors.
Bill: Use abstractions as metaphors for humanity, but stay close to people.
Michael: I think that’s true. Another very important lesson I’ve learned from Wendell Berry is about the danger of specialization, the fact that we’re now good at producing one thing and consuming everything else. The sense of dependence that follows from the division of labor makes us despair of ever changing the way we live; it encourages us to feel that change can only come from outside—from government, from disaster—because we can no longer do very much for ourselves. That partly explains the power of gardening, which offers a reminder that, in a pinch, we can provide for ourselves. That’s not a trivial thing. It makes us more receptive to imagining change.
Bill: For me, the moral lesson of the garden—and I’m agreeing with you—is that being attentive to the work of the garden leads to greater appreciation for the work that makes life possible, which involves the work of others.
Bill: Right. Ecology, storytelling, history—they all render connections visible. We make that which is invisible visible through story, and thereby reveal people’s relationships to other living things.
Michael: Stories establish canons of beauty, too. There is a role for art in changing cultural norms about what’s worth valuing. One hundred fifty years ago, certain people looked at a farm and saw what you might see if you look today at a nuclear power plant or some other degraded landscape. Part of the reason we tell stories is to create fresh value for certain landscapes, certain relationships.
Bill: And stories make possible acts of moral recognition that we might not otherwise experience. They help us see our own complicity in things we don’t ordinarily see as connected to ourselves.
Michael: Yes, exactly. That recognition can help remove the condescension in so much environmental writing by showing us that, look, these things we abhor are done in our name, and we are complicit in them, and we need to take account of them. It was Wendell Berry’s idea that the environmental crisis is a crisis of character. The big problem is the result of all the little problems in our everyday lives. That can be a guilt trip, but it doesn’t have to be. You can tell that story in ways that empower people.
Bill: Messy stories invite us into politics. They also invite us to laugh at ourselves. And those things together—the ability to laugh, to experience hope, to be inspired toward action at the personal and political levels—these strike me as the work of engaged storytelling in a world we’re trying to change for the better.
Bill: Maybe that’s a good note for us to end on, don’t you think? The poet Muriel Rukeyser once said that “the world is made of stories, not of atoms.” When we lose track of the narratives that human beings need to suffuse their lives and the world with meaning, we forget what makes the world worth saving. Telling stories is how we remember.
“They made our lives in the library seem adventurous and superior.” – 40 years of Theory talking to itself
Tradition, history and art were subordinated to a collection of thinkers and arguments that went under the name of “theory”. They provided abstract rules and explanations for how human events unfolded and artistic creation happened. Theory had all the attractions of conjecture – clean, clever, overarching – but it squeezed the vitality and unpredictability from human achievement like juice from a lemon. Instead of reading classic poems and novels, scholars mastered theories of literature. Instead of learning the details of a historical record, they acquired a theory of historical change. Forty years on, the results are in. Learning has declined and the humanities are an impoverished field. The outcome could have been foreseen, for what is the theory of history and art, or of love, gardening or health, for that matter, compared with present and past realities? But the enthusiasms of the moment were too strong.
It didn’t take long to realise that other idols ruled the graduate programmes. Yes, we read Shakespeare, Hume, Austen and Lovejoy, but what we did with them depended on an entirely different group: the theorists.
Derrida, Foucault, Lacan, Adorno, Rorty, Paul de Man… they set a powerful agenda for humanistic study. Their work was complex and diverse, but what made all of them theorists was a focus on method. Instead of studying directly the contents of history and thought, they said we should examine the tools of study – terms, evidence, values and practices. Biographers, for instance, aim to record a human life; theorists step back and ponder the narrative structure of that life (or any life), the nature of historical evidence and so on. The sceptical tenor was spreading in schools from Aberdeen to Berkeley to Sydney, and those of us who enlisted out of inspiration had to change our attitude. Theory was hyper-analytical and against common sense, leaving no ordinary enjoyment untouched. The beauty of Keats’s verse, the truth of Nietzsche on herd morality, the heroism of Lincoln… well, you couldn’t esteem such things any more. By their own declaration, the theorists probed the basic elements of language, culture and ego, and to affirm something as conventional as beauty was to be pitiably square and naive.
Some were turned off, but many were intoxicated by the approach. Indeed, it is hard for non-academics to grasp how heady those conversations in the seminars and the student lounge could be. The classic writers were still essential, but the theorists were daring and radical, and the mention of them made the energy level in class discussions jump. If a fellow student spoke about Gulliver’s Travels by borrowing from Swift’s life, one could cite Derrida on how outside materials don’t reveal the meaning of a work but close off multiple meanings to “privilege” just one. Or one could steer the talk towards Lacan on aggression, then apply it to the world of the Houyhnhnms and Swift’s portrayal of mankind as yahoos. Or one could take a postcolonialist tack and recount Gulliver’s efforts to “go native” (ridiculous, to be sure, but one heard much worse). At that point, the colloquy would turn theoretical, with people taking sides.
Usually, the theorists would win. Traditional scholars fell back on custom and textual evidence, while theorists and their disciples enjoyed the thrill of roguish poses and weighty topics – Derrida on Western thought, Foucault on madness and civilisation, de Man on irony and death. They made our lives in the library seem adventurous and superior. Think, for instance, how Foucault flattered the student ego. In a series of books, he argued that the freedoms we cherish in bourgeois society, along with the liberal reforms of the Enlightenment, were in fact subtle forms of social control working through heightened surveillance and low-intensity coercions. The compliment this outlook paid to weary junior scholars struggling to find a place in the world was hard to withstand. While the rest of society accepted modern life and muddled through, the clear-eyed minds we fancied ourselves to be understood what was really going on.
When theory was a minority endeavour, it functioned as a gadfly, obnoxious sometimes, but useful for testing assumptions. When theory became a dominant habit, it lost its rationale. With nobody around to defend untheoretical positions, it had nothing more to say, no more bunk to debunk. As the numbers of old-fashioned scholars dwindled, theoretical interventions became pointless and predictable. A recent book by another president of the MLA spent pages blaming the poor reading habits of students on the New Critics, figures whose influence waned back in the 1960s. The antiquated target shows how empty theory’s victory was. How many times could you “call into question” a basic assumption or “problematise” a term without sounding like a cliché?
The test of time was undeniable. The simple truth was that the accomplishments of theory mocked its claims. Derrida and the rest spoke in grandiose terms about the implications of theoretical acumen, and their votaries echoed the tone in portentous statements. When de Man declared that “the linguistics of literariness is a powerful and indispensable tool in the unmasking of ideological aberrations”, his followers repeated it as if it marked a leap in the course of human intelligence. But nobody appeared to benefit from the insight except its practitioners. Theory infiltrated the humanities, theorists found jobs and changed the curriculum, new journals and programmes were founded. But the effect beyond the campus was negligible. A few psychiatrists remained Lacanians, and some architects practised a version of deconstruction, but the influences were scattered. To proclaim theory’s social impact was nothing more than a pretence.
Still, theory’s influence in the university has been enormous. Even among people who’ve pulled away, certain axioms remain a matter of principle – for instance, the notion that sexual identity is a social construct with no biological determinants. In the absence of support from the outside world, theory has become an insider activity. And with the anti-theorists routed long since, all theory can do is rehearse the arguments made 40 years ago, the same interpretations and same conclusions. The pretexts change – Milton yesterday, Buffy the Vampire Slayer today – but the outlook doesn’t.
Professors have profited from theory for a long time, and they’re too comfortable and invested to have second thoughts.
The arrogance was self-defeating, of course. Theory couldn’t sustain the humanities by itself, and the exhilaration that brought us into the habit struck outsiders as a self-congratulatory joy carried out in an affected tongue. With the public estranged from our practice and with younger scholars not replenishing the reserves of knowledge, the humanities are a guild imploding. Theory is dead, but it has taken something much more valuable with it: higher learning.
So yes, Alex’s hatred was most certainly pure. But somehow, for me, that doesn’t really get at what made his writing so wonderful. Because it was a joyful hate that Alex nurtured. An inspiring hate.
For all the talk of his sharp tongue and even sharper pen, we are, after all, talking about a man who once confessed to weeping on an airplane as he watched 1993’s Homeward Bound: The Incredible Journey, a film about two talking dogs and a sassy cat trying to make their way back home.
I always thought of this as the Cockburn version of Kafka’s famous dictum: “there is hope but not for us.” Which strikes me as wonderfully optimistic.
So Alex’s hate, ever pure, is just the twin of — and sorry to sound like a total hippy here — his love. His love of America’s lost interior. His love of freaks and weirdos, the dispossessed, the losers and the forgotten.
And the truth was that despite my supposed socialism, it made me a snob. Alex however, despite a healthy love for folks like Marx, Engels and even the dreaded Lenin, never became a snob. He never turned his nose up like I did at the Red States. Whenever I’d read him talking about his encounters bumping along the ex-Confederate hinterland, I’d find myself saying “goddamnit, Alex. Don’t you get it? These people are racist, theocratic, quasi-fascist bastards. If you weren’t from Ireland, you’d totally get this.”
And it’s in this sense that Alex played what I think was his most valuable role for the left, though as a staunch anti-militarist, he’d probably hate the metaphor: he was like our drill sergeant. He hurled abuse at us — but beautifully stated and almost always hilarious abuse — from every possible direction. “Oh, maybe if Hillary — SLAP!” “Oh, maybe if I buy organi — SLAP!” “Oh, if only the Democrats — SLAP!” “The Kennedys were the last true — SLAP!” But why was he doing it? Because he was mean? No. Because he wanted us to survive. He wanted us to win.
And honestly, we needed it.
Criticality as rearguard defense of capital – The “purity” of critique is the metaphysics of irrelevance
The Art-Architecture Complex is a book concerned with contemporary architecture and design, a subject I am vastly underqualified to critically pursue. How I could venture into this task without the requisite specialization is best explained by my conviction that marginality with respect to such specialization is sometimes preferable to expertise. And it may well be that both art and architecture are fields too important to be left to their professional defenders. And anyway, if Foster’s observations are accurate, architecture has itself been dissolved, our ways of building and dwelling transformed into cinematic encounters under consumer media’s management.
With this title, The Art-Architecture Complex, Foster invokes that sense of capitalist conspiracy first expressed in the 1960s phrase, “military-industrial complex.” The book is massively informative but characterized by the author’s trademark polemic with regard to the pluralism that is postmodernity in general. Foster has long made clear his preference for purity over plurality…
This metaphysics, the foundation for Foster’s criticism in general, is oppositional in form. He typically opposes resistance and transgression to complicity, outside to inside, the real to the illusory, and the virtual to the actual. This marks a limit to his analyses, and for some would render his conclusions helplessly conservative, even when his objections to “capital” might seem necessary. This is the crux of his situation; critical for his historical consciousness, conservative for the same, his oppositionality leaving him without traction with regard to a historicity of experience now re-composed by way of electronic “abstraction.” In this new situation, Foster refuses to acknowledge how antiquated his use of “the image” and “spectacle” has become, clinging as this does to some notion of an objective foundation, a reality that would offer an external standpoint from which critique proceeds. His fervent conviction that there is an “outside” from which criticism can orient itself and from which critical attacks may be mounted, that distance is definitive of criticality, fails to account for and integrate the pluralizing impact of electronic communications media with which the postmodern is to be identified. Even a likely sympathizer such as Bruno Latour asks, “Are we not like those mechanical toys that endlessly make the same gesture when everything else has changed around them?”
…Baudrillard, informed by the media theory of Marshall McLuhan, developed a more “performative” vision of architecture’s relationship with new media. Although the literary character of his thought has often been anathema to traditionalists such as Foster, his observations are acute, if expressed in apocalyptic language. His willingness to embrace the media as environment means he spends less time spinning his wheels on a positioning of critique no longer available as it was in the 1980s.
Perhaps what is needed, following Foster’s denunciations of design as mere consumerist manipulation in the service of greater efficiencies for capitalism, is recognition of a more general outline – one that locates the root of the problem more deeply, in a description of the rationalist prejudices that dominate our thinking and being. For the style of critique demonstrated by Foster and his colleagues this would be bad news, leaving them revealed as a part of the problem in so far as their project is itself inextricably dedicated to the founding of criticality in a modernity already itself a practice of instrumental rationality. In this sense, the critique mounted from Foster’s “leftist” optimism has become a rearguard defence inevitably and finally supporting and requiring those elements of purification and linearity so essential to the drive of technics (including capital) for ever greater efficiency.
“Nowadays the abstractions of aesthetic and intellectual criteria matter much less to me than people’s efforts to console themselves, to free themselves, to escape from themselves, by sitting down and making something.” – Generous criticism for perilous times
Criticism was socializing by other means. And since socializing in those narrow literary and intellectual precincts consisted of egos battling for position, status, friendship, and love, it was inevitable that the criticism embody and sometimes exemplify what Delmore Schwartz—no mean takedown artist himself—once called “the scrimmage of appetite.” Pulverizing reviews were not taboo because the victims could always make their retort at the next social gathering, or on the pillow, or in one of the journals that served as kitchen tables for the extended family of writers who published there.
In the popular imagination, the intellectuals—especially the so-called New York intellectuals—were hugely influential, but this is a distortion. Hannah Arendt’s “Reflections on Little Rock,” published in Dissent in 1959, caused few ripples outside her own circle, even though Arendt’s argument that the United States was wrong to pursue integration at that moment was incendiary. By contrast, her article “Eichmann in Jerusalem” caught on in the larger culture because she published it in The New Yorker, one of the dreaded “middlebrow” magazines that was then in the process of making intellectuals like James Baldwin, Dwight Macdonald, Elizabeth Hardwick, Mary McCarthy, Alfred Kazin, and Harold Rosenberg prominent national figures.
The insular, hothouse atmosphere of postwar intellectual combat is where, about twenty years after it disappeared, I schooled myself in the dark art of the takedown. I can now see the irony of my situation—or, as those bygone critics would have said, my “position.” My awareness of my own ineffectuality in the world also led me to seek out the power conferred by words. But the world had changed. I was not practicing a shared style. I was cultivating an idiosyncrasy: I was one of the few critics who carried a hatchet.
What had once been nicely divided into highbrow and middlebrow culture—even at the time a crude formulation of a complex reality—had become a wildly eclectic place where “high” and “low” and everything in between existed side by side. A critic I admired as I was starting out was Robert Hughes, who reviewed art for Time, ruminated more deeply on culture in the New York Review and in the back of the New Republic, published serious books on serious subjects, and introduced people to art in television series on PBS. His style was gripping, elegant, and, above all, popular. Though I didn’t share his fierce aversion to much contemporary art, I thought that the way his career blurred the distinctions between “high” and “low” was exemplary.
Authority is a slippery thing, and its nature is going through yet another permutation in literary life. There are plenty of young, gifted critics writing fiercely and argumentatively in relatively obscure Web publications. But they are keenly aware that, along with the target of their scrutiny, the source of their own authority is also an object of examination. Macdonald simply took for granted the fact that membership in a community conferred on him a certain accredited brilliance. This is what, for me, makes reading him now an incomplete experience, because the group that certified his judgments has disappeared. Literary criticism on the Web, on the other hand, draws whatever authority it has by renouncing any claims to authority. The Web critic relies on his or her readers for attentiveness and approval. A social style is gradually replacing an idiosyncratic one—whether it’s n + 1’s collective tone and worldly references to literary coteries and cliques or Choire Sicha’s slyly self-deprecatory exclamation marks.
Twenty years ago, Robert Hughes might have taken up the subject of Andy Warhol with a demon barber’s gleam in his eye. Instead, this article by Nick Faust in the online journal The New Inquiry, though its occasion is a reflection on Andy Warhol, captures the new spirit of criticism. It is a positive critique of the genre’s social possibilities, and it is expressed in a social tone:
Likewise, art writing must attempt to draw new connections, weaving in unpublished, hushed talk that always gets spoken but generally not on the record. The documentation of the piece, the Facebook posts, tweets, and vines that surround such work, the gossip about the work in the bathrooms of the gallery and outside during the smoke breaks and back in the patios and bars after the opening, the press releases both in unchecked email and listserv format, and the 10,000 art-opening invites that networked artists receive each day on social media, the write-up of the work, the studio visits, the sketching out of the ideas, the conversations that influence and sustain the practices—all these are rich and evocative and can provide tremendous energy and meaning to a work and extend its life out beyond.
In other words, the critics will leak the total secret context surrounding a work of art—Edmund Wilson meets Edward Snowden! The French Annalistes wrote history in such a way, characterizing a moment in time by reconstructing its finest, most mundane details. Why not a criticism that draws from the same energy? I have no idea whether it would be successful, but I love the possibility of it.
Applying old standards to a time when everyone is throwing everything they can at the proverbial wall to see what sticks is like printing out a tweet, putting it in an envelope, and sending it to someone through the mail. The very fact that reading and writing are in jeopardy, or simply evolving, means that to try to put the brakes of old criteria on a changing situation is going to be either obstructive or boring. In our critical age of almost manic invention, the most effective criticism of what, in the critic’s eyes, is a bad book would be to simply ignore it, while nudging better books toward the fulfillment of what the critic understands to be each book’s particular creative aim. The very largeness and diversity of present-day audiences make less and less relevant the type of review that never gets beyond the book under review. It’s the critic’s job nowadays not just to try to survive and flourish amid ever-shifting modes of cognition and transmission, but to define new standards that might offer clarity and illumination amid all the change. Quite simply, the book review is dead, and the long review essay centered on a specific book or books is staggering toward extinction. The future lies in a synthetic approach. Instead of books, art, theatre, and music being consigned to specialized niches, we might have a criticism that better reflects the eclecticism of our time, a criticism that takes in various arts all at once. You might have, say, a review of a novel by Rachel Kushner that is also a reflection on “Girls,” the art of Marina Abramović, the acting style of Jessica Chastain, and the commercial, theatrical, existential provocations of Lady Gaga and Miley Cyrus. Or not. In any case, it’s worth a try.
New demands for new times are the big-picture reasons I’ve lost the taste for doing negative reviews. I have smaller, personal ones, too. Having become an author of books myself, I now find that the shoe is most definitely on the other foot. I once dismissed as maudlin the protest that one shouldn’t harshly disparage a book because the author poured the deepest part of herself into it. What, I replied, has that got to do with defending civilization against bad art and sloppy thinking? Nowadays the abstractions of aesthetic and intellectual criteria matter much less to me than people’s efforts to console themselves, to free themselves, to escape from themselves, by sitting down and making something. In my present way of thinking, mortality seems a greater enemy than mediocrity. You can ignore mediocrity. But attention must be paid to the countless ways people cope with their mortality. In the large and varied scheme of things, in the face of experiences before which even the most poetic words fail and fall mute, writing even an inferior book might well be a superior way of living.
Academics cloning themselves – On being critical of everything except the system that grants you prestige (because *that* is the one non-corrupt product of the system)
It is easy to see how the modern academic discipline reproduces all the salient features of the professionalized occupation. It is a self-governing and largely closed community of practitioners who have an almost absolute power to determine the standards for entry, promotion, and dismissal in their fields. The discipline relies on the principle of disinterestedness, according to which the production of new knowledge is regulated by measuring it against existing scholarship through a process of peer review, rather than by the extent to which it meets the needs of interests external to the field. The history department does not ask the mayor or the alumni or the physics department who is qualified to be a history professor. The academic credential is non-transferable (as every Ph.D. looking for work outside the academy quickly learns). And disciplines encourage—in fact, they more or less require—a high degree of specialization. The return to the disciplines for this method of organizing themselves is social authority: the product is guaranteed by the expertise the system is designed to create. Incompetent practitioners are not admitted to practice, and incompetent scholarship is not disseminated.
Since it is the system that ratifies the product—ipso facto, no one outside the community of experts is qualified to rate the value of the work produced within it—the most important function of the system is not the production of knowledge. It is the reproduction of the system. To put it another way, the most important function of the system, both for purposes of its continued survival and for purposes of controlling the market for its products, is the production of the producers. The academic disciplines effectively monopolize (or attempt to monopolize) the production of knowledge in their fields, and they monopolize the production of knowledge producers as well…
Disciplines are self-regulating in this way for good reasons of academic freedom. The system of credentialing and specialization maintains quality and protects people within the field from being interfered with by external forces. The system has enormous benefits, but only for the professionals. The weakest professional, because he or she is backed by the collective authority of the group, has an almost unassailable advantage over the strongest non-professional (the so-called independent scholar) operating alone, since the non-professional must build a reputation by his or her own toil, while the professional’s credibility is given by the institution. That is one of the reasons that people are willing to pay the enormous price in time and income forgone it takes to get the degree: the credential gives them access to the resources of scholarship and to the networks of scholars that circulate their work around the world. The non-academic writer or scholar is largely deprived of those things. This double motive—ensuring quality by restricting access—is reflected in the argument all professions offer as their justification: in order to serve the needs of others properly, professions must be accountable only to themselves.
The hinge whereby things swung into their present alignment, the ledge of the cliff, is located somewhere around 1970. That is when a shift in the nature of the Ph.D. occurred. The shift was the consequence of a bad synchronicity, one of those historical pincer effects where one trend intersects with its opposite, when an upward curve meets a downward curve. One arm of the pincer has to do with the increased professionalization of academic work, the conversion of the professoriate into a group of people who were more likely to identify with their disciplines than with their campuses. This had two, contradictory effects on the Ph.D.: it raised and lowered the value of the degree at the same time. The value was raised because when institutions began prizing research above teaching and service, the dissertation changed from a kind of final term paper into the first draft of a scholarly monograph. The dissertation became more difficult to write because more hung on its success, and the increased pressure to produce an ultimately publishable work increased, in turn, the time to degree. That was a change from the faculty point of view. It enhanced the selectivity of the profession.
The change from the institutional point of view, though, had the opposite effect. In order to raise the prominence of research in their institutional profile, schools began adding doctoral programs. Between 1945 and 1975, the number of American undergraduates increased 500 percent, but the number of graduate students increased by nearly 900 percent. On the one hand, a doctorate was harder to get; on the other, it became less valuable because the market began to be flooded with Ph.D.s.
…What is clear is that students who spend eight or nine years in graduate school are being seriously over-trained for the jobs that are available. The argument that they need the training to be qualified to teach undergraduates is belied by the fact that they are already teaching undergraduates. Undergraduate teaching is part of doctoral education; at many institutions, graduate students begin teaching classes the year they arrive. And the idea that the doctoral thesis is a rigorous requirement is belied by the quality of most doctoral theses. If every graduate student were required to publish a single peer-reviewed article instead of writing a thesis, the net result would probably be a plus for scholarship.
But the main reason for academics to be concerned about the time it takes to get a degree has to do with the barrier this represents to admission to the profession. The obstacles to entering the academic profession are now so well known that the students who brave them are already self-sorted before they apply to graduate school. A college student who has some interest in further education, but who is unsure whether she wants a career as a professor, is not going to risk investing eight or more years finding out. The result is a narrowing of the intellectual range and diversity of those entering the field, and a widening of the philosophical and attitudinal gap that separates academic from non-academic intellectuals. Students who go to graduate school already talk the talk, and they learn to walk the walk as well. There is less ferment from the bottom than is healthy in a field of intellectual inquiry. Liberalism needs conservatism, and orthodoxy needs heterodoxy, if only in order to keep on its toes.
And the obstacles at the other end of the process, the anxieties over placement and tenure, do not encourage iconoclasm either. The academic profession in some areas is not reproducing itself so much as cloning itself. If it were easier and cheaper to get in and out of the doctoral motel, the disciplines would have a chance to get oxygenated by people who are much less invested in their paradigms. And the gap between inside and outside academia, which is partly created by the self-sorting, increases the hostility of the non-academic world toward what goes on in university departments, especially in the humanities. The hostility makes some disciplines less attractive to college students, and the cycle continues.
…And the academic world would be livelier if it conceived of its purpose as something larger and more various than professional reproduction—and also if it had to deal with students who were not so neurotically invested in the academic intellectual status quo. If Ph.D. programs were determinate in length—if getting a Ph.D. were like getting a law degree—then graduate education might acquire additional focus and efficiency. It might also attract more of the many students who, after completing college, yearn for deeper immersion in academic inquiry, but who cannot envision spending six years or more struggling through a graduate program and then finding themselves virtually disqualified for anything but a teaching career that they cannot count on having.
It is unlikely that the opinions of the professoriate will ever be a true reflection of the opinions of the public; and, in any case, that would be in itself an unworthy goal. Fostering a greater diversity of views within the professoriate is a worthy goal, however. The evidence suggests that American higher education is going in the opposite direction. Professors tend increasingly to think alike because the profession is increasingly self-selected. The university may not explicitly require conformity on more than scholarly matters, but the existing system implicitly demands and constructs it.
You acknowledge that professional wrestling is often seen as anti-intellectual. What do you say to that?
Pro wrestling, at its heart, is like Greek melodrama. It has a very rich culture. I mean, some form of professional wrestling has existed since the late 1800s. It’s existed in its current scripted state since the 1930s. So it’s got a really long cultural heritage.
Because people engage in it in a very theater-like manner, it actually requires understanding the genre. So there’s a lot of discourse-specific language that goes with it. One of my participants had to explain to me: Oh, a “face” is a good guy, from the term baby-face. And a “heel” is a bad guy. And then there’s a “tweener” – in gaming terms, we call it a “chaotic neutral” – which means you don’t know whether they’re good or bad, and what they do in certain situations is relatively unpredictable. Watching people work through argumentation about different points and share resources and information shows me that this is a very intellectually active community.
If we’re talking about pro wrestling fandom as a learning community, who are the teachers? Are there ways to “graduate” to higher levels?
The teachers—it’s totally peer-to-peer. So everybody’s putting in their expertise where they have it. People who don’t happen to be as experienced in wrestling but are better at giving grammatical or genre-based writing critique get to put expertise there. Others put expertise when it comes to how wrestling storylines are developed, what elements go into that.
That’s another thing about wrestling–a lot of people think it’s very American-centric. It’s very international. The fans in my study were from the Philippines, India, all across Europe, South America, the US. It’s a really wide fan base. So a lot of people who participate on these boards are English-as-a-second-language speakers. And so they get feedback on improving their written English. My participant from South America said he was able to not only get a community around his interest in wrestling, but they helped him to improve his English skills, which he took back and used in school. So the teaching kind of goes in multiple directions. Everybody’s a teacher and learner, as the situation comes up.
One of my participants, she’s 17 years old. She’s in the Philippines. And she came to the community because when she told her local friends she was interested in wrestling, it was very socially stigmatizing for her. They started making fun of her for being a tomboy. So instead of giving up wrestling, she just stopped talking to them about it, and she found this community online. The fantasy wrestling federation part of the community really drew her in, and she got hooked. And then she started writing for the school newspaper as well. That led to a medical career, where she’s gonna do a lot of technical writing. So her wrestling interest was an introduction to writing in a way that she found really engaging.
What’s different about learning inside the pro wrestling online fan community versus, say, learning inside school?
Like most interest-driven communities, it’s a much more low-stakes environment, so people are willing to try things that might not pan out. In a high-stakes learning environment, a lot of times what happens is, people feel so much pressure, they don’t want to try things that they’re not sure will work out exactly right, because they don’t want to suffer consequences of that. So this allows people to role-play different kinds of characters, and a lot are experimenting with making videos about being in character as a wrestler, or best-of videos. They put them online and get feedback on how their video editing’s going, so if it completely fails, they’re like, “Well I tried this, it didn’t work,” and people give them suggestions on how to fix the problem. So they’re willing to experiment in areas they might not be willing to try otherwise.
Self-fashioning is part of the age-old purpose of higher education, particularly in the liberal arts and sciences. The key point is to be aware, sometimes, that this is happening—to deliberately engage in fashioning—not just let events and experiences sweep you along without your conscious participation.
There is another thought-provoking maxim related to but distinct from “Know thyself,” also grounded in the Greek and Roman classics: “Take care of yourself, attend to yourself.” This variant was highlighted by Michel Foucault in a lecture called “Technologies of the Self.” Foucault insists that this “taking care of yourself” is “a real form of activity, not just an attitude.” It’s like taking care of a household or a farm or a kingdom. That’s what we are talking about in discussing “self-fashioning”: paying deliberate attention to your “self,” taking good care of it, tending and developing it, not just taking it for granted.
This all sounds appealing, but like most young people, and most people across history, you are more likely to be self-absorbed than self-abandoning. What we may all need most is reflection on the importance of community, of other selves. I’m going to link the two in this essay because I believe firmly that we fashion our “selves” both in solitude and in society.
For most of us—certainly those of us on a university campus—solitude is a relatively rare experience. If we are to fashion ourselves, we will be doing so in the presence of other people, most of the time. We develop as selves through our interactions with other human beings—through relationships, beginning with the family and then the school and the neighborhood, through art, music, language, culture, ideas. Our selves are never, and cannot be, purely isolated beings: we are the products of our experience and our environment, and we need to understand the self in and through society, not as a stand-alone cardboard cutout.
The warnings of Montaigne and Rousseau about how this experience can deform us, pull us away from our true selves, misshape our selfhood, should be in our minds. But we should also recognize that most of what is best about us comes from our interactions with other individuals.
So the formation of selfhood that depends on having someone else shape you like a work of art falls short of forming a successful human being. And it’s not surprising that theories of education since the eighteenth century rely much more on individual choices and taking a significant responsibility for your own intellectual development.
In college, you have an exceptional amount of freedom to choose from the bewildering variety of great courses listed in the catalog, and the amazing proliferation of extracurricular activities, including both those that are already established and those that you might help organize, as so many Harvard students do. If you sometimes think, as you make these choices, about what kind of self this seminar (or this sport, or this club, or this office) will help you to become, you may find guidance here. Does this activity promise to make you a deeper, fuller, more interesting person? Does it expand your life in new ways, or build on what you have done before in ways that make you stronger? Does it challenge you to develop new mental or emotional muscles, so to speak?
According to these pieces of advice, you should think about society not as a kind of zoo or curiosity shop where you can pick up a persona that suits you, but as the source of inspirational exemplars, diverse possible ways of shaping yourself, fascinating models. This means reading biographies and history, novels and essays, and paying attention to how people you admire handle challenges as they come along.
Yet society is not only a source of inspiring examples: it is even more often, as Rousseau said so well, a source of profound pressures to behave in certain ways. Society will surely shape you: the opinions and preferences and activities of your family, your friends, your classmates and professional colleagues, everyone with whom you spend any considerable amount of time. But too often these pressures are negative and will not help you in your self-fashioning, as all of us know when we reflect on peer culture, websites, TV shows, and movies. For worthwhile self-fashioning, you need a surrounding society that speaks to what is most importantly human, and brings you together with others in rewarding collective activities.
In the fifth chapter of her powerful work of political philosophy, The Human Condition, Hannah Arendt discusses the connections between individuals and political communities. She notes that each human being is “distinguished from any other who is, was, and ever will be”—which is a vivid way of thinking about selfhood. Yet precisely because each of us is a distinct individual, we need speech and action to communicate; I cannot just sense instinctively what someone else is thinking. In speaking and acting, we “disclose ourselves” and thus expose ourselves to possible misunderstanding or exploitation by others, but also to the rich possibilities of communication.
Speech and action, in Arendt’s sense, cannot exist in isolation; they are meaningful only within human relationships. By the same token, “human nature”—as distinct from our more animal qualities—depends precisely on our capacity for speech and action: it is in fact through speech and action that each of us constitutes our self. This is Arendt’s distinctive contribution to our discussion of self-fashioning: the self is created not by each of us as individuals in isolation, but through the activities we share with other human beings—language, creativity, striving, politics. If your goal is to fashion a worthwhile self, you should be mindful of your surroundings and choose companions and activities that will give you opportunities to develop your language, creativity, striving, and politics in more depth.
…But if you are to have a whole, integrated, complete self, you must resist becoming totally immersed in private spheres. You must see it as part of your self-interest and your moral duty to play your part in society, to give something of yourself away to others who are in need, to help sustain the common structures that make up our public life. If you fail to do this, you will become a shrunken and diminished self. Recognition of this fact is what Alexis de Tocqueville called “self-interest rightly understood,” or “enlightened self-interest”: not pure egoism or selfishness, but caring for yourself in the context of acknowledging your responsibilities to others, which brings with it significant moral commitments and deep rewards.
So whether they have too little solitude or too much, women have often had a different experience of solitude and society from men. Men can leave the house and go off on a journey in many societies where women can never travel alone. Women in most cultures have had much less opportunity than men to explore the world, follow their adventurous inclinations. And they have been less likely to have a place or time where they can enjoy solitude. It’s worth keeping this in mind when you read authors who write about self-fashioning. You can sometimes stop and ask: Would this advice have worked for a woman in the society this author is describing? Or are these generalizations accurate only for the men? What, after all, were the women doing in this society?
Let me close with a quote from Ralph Waldo Emerson’s essay on “Self-Reliance”: “It is easy to live in the world after the world’s opinion; it is easy in solitude to live after our own; but the great man is he who in the midst of the crowd keeps with perfect sweetness the independence of solitude.”
This is the image I want to leave you with: developing the ability to maintain “with perfect sweetness” the independence of solitude—the integrity and wholeness of the self—in the midst of the crowd. Your education should give you the capacity to shape and sustain your selfhood. It should both furnish richly the back shop of your mind, and prepare you to be a productive member of whatever society you live in. And at best, it should also give you the ability to retreat into yourself even in the midst of a busy life when you need to get your bearings, refresh your spirit, reaffirm your integrity, and confirm what is most important to your self.
“Not all are called to be artists in the specific sense of the term. Yet, as Genesis has it, all men and women are entrusted with the task of crafting their own life: in a certain sense, they are to make of it a work of art, a masterpiece.”
I want to reflect today on the title chosen for this gathering, “Beauty will Save the World.” That’s quite the assertion, and I don’t know if I can convincingly support it, but I’ll give it a shot. My tentative thesis today is that the best way to cultivate healthy local cultures is to celebrate their beauty. It’s not to pass laws, it’s not to develop rational or economic arguments for their benefits, it’s not to start some new program. All these might be needed subsequently, but if we don’t first bear witness to the beauty of a healthy culture, then other approaches are doomed. It’s in this way, by enabling us to see the truth and goodness of a healthy way of life, that beauty will save the world. So I want to think with you about the beauty of local culture, why that beauty is important, and how to cultivate it. I’ll begin by describing a beautiful, and I think saving, activity that I’ve had the privilege of participating in this past year.
Rather, our hope is that the students and staff and faculty who participate will see and experience how beautiful it can be to grow and eat our own food. This rich, practical connection with our food is what Wendell Berry calls the pleasures of eating. These pleasures are complex, and they are nearly impossible to quantify, but if you’ve ever eaten a sandwich with tomato slices still warm from your garden, you know something of these pleasures. When you plant a seed, water it, weed around the delicate seedling, try to protect it from deer and bugs, watch it blossom and set fruit, and wait for that fruit to ripen, the act of eating the fruit is not merely an input of calories and nutrients. Rather, eating is just one part, perhaps the climax, in a whole narrative that we’ve embodied and lived out, a narrative that connects us to our fellow gardeners and to the place in which we live.
To call something beautiful in this sense is to speak about its material shape or form, and also about the meaning or splendor that emerges from the form and makes it desirable. And as von Balthasar goes on to argue, when we see a vision of the beautiful, when we see the contours of its form, we are enraptured by its splendor, caught up in a desire to participate in the radiance that beauty grants us to see as love-worthy. So to call this narrative of our community garden beautiful means that the whole way of living that the garden enables us to glimpse, in which we work together and share the fruits of this work, is desirable and love-worthy.
And yet oversimplification leading to disease marks nearly every aspect of our fragmented, modern lives. Our corporate medical system does not aim for health, but rather isolates various parts of the body and treats particular abnormalities. Hence our medical establishment has been particularly unhelpful at offering preventive care and treating complex problems such as obesity. Our monoculture agriculture is merely another instance of our propensity to isolate and specialize, and I’m not sure that our biculture of corn and soybeans here in Michigan is much of an improvement. We still don’t have complex polycultures that include animals and a true variety of plants. Such simplification works itself all the way down to our lawns, which we spray with toxic chemicals just to have “beautiful” grass.
In their false simplification, such specialized visions and the ways of life toward which they lead inevitably contribute to disease. These narrowly-focused ways of life become insipid, losing the splendor of beauty, and yet they define much of our lives as we search for quick and easy solutions. Wendell Berry notes the irony in our culture’s stereotypical view of country life as “simple,” noting that in actuality, it is urban, specialized living that is simple:
When I am called, as to my astonishment I sometimes am, a devotee of “simplicity” (since I live supposedly as a “simple farmer”), I am obliged to reply that I gave up the simple life when I left New York City in 1964 and came here. In New York, I lived as a passive consumer, supplying nearly all my needs by purchase, whereas here I supply many of my needs from this place by my work (and pleasure) and am responsible besides for the care of the place. (The Way of Ignorance, “Imagination in Place” 47-48).
My point, then, is that our culture’s tendency toward reductive specialization is intrinsically un-beautiful, that beauty arises only from complex, harmonious forms, that health is beautiful. Currently, our cultural aesthetic is, in Solzhenitsyn’s terms, sickly and pale: we too often confuse the pretty, the mere appearance, for true beauty, hence our acceptance of lush green lawns that cause water pollution. But perhaps beauty can save, or at least salve, our world by giving us a richer imagination of health and thus causing us to desire ways of life that, as von Balthasar might say, carry the splendor of truth and goodness.
How do we actually see such forms whose beauty might inspire us to find more healthy ways of living? I think there are at least two conditions for perceiving such visions of beauty. The first is that we see beauty on a local scale.
We have to be able to see the whole to perceive beauty (again, note the connection between beauty and health). Analysis of the beautiful, if it does not begin with a vision of the whole and keep this vision constantly in mind, quickly devolves into an abstract rummaging through dead parts. It becomes what von Balthasar calls “anatomy,” which “can be practiced only on a dead body, since it is opposed to the movement of life and seeks to pass from the whole to its parts and elements” (Seeing the Form 31). This is the way the “industrial mind,” a term that Berry derives from the Southern Agrarians, sees the world. Such a vision, precisely because it is too narrow and specialized, inevitably leads to disease and deformation. In his essay “Solving for Pattern,” Berry argues that solutions based on this sort of specialized vision always worsen the problem—he gives the example of addressing soil compaction by using bigger tractors, which only compact the soil further, leading to the need for even larger tractors (The Gift of Good Land 136). So while a bad solution “acts destructively upon the larger patterns in which it is contained,” “a good solution is good because it is in harmony with those larger patterns” (137). In order to see the beauty of these larger patterns, and thus perceive what modes of life would harmonize with these patterns, we need to be able to see the whole form. When we try to imagine a beautiful whole on a global or even national scale, the difficulty, if not impossibility, of this task makes the temptation to perform a quantitative analysis of isolated parts almost irresistible. And yet such a fragmented gaze can’t see the living, beautiful whole, which is precisely the form that can give us the vision of health and beauty our imagination needs.
The second condition for perceiving this vision of healing beauty is a personal experience or encounter. We don’t see the whole form of beauty when someone describes it abstractly. I can tell you about the Sistine Chapel and describe its scheme and what the various parts depict, but you won’t really see its beauty unless you stand in it yourself. The same holds true for a Bach fugue. This is so because of the complexity and richness of beauty; there is a qualitative difference between an experience of the beautiful and an abstract description of that experience.
…Every morning the local bakery draws a group of men who drink coffee, eat pastries, and talk about the work that awaits them in the day ahead. Their conversation is punctuated by oblique references to stories they all know and by the habitual phrases of friends absent or dead. The community’s memory lives in such conversation. But it’s hard to quantify and analyze what makes this community a healthy one; merely listing its attributes does not convey the beauty of its form. We perceive its beauty as a whole, when we experience life in such a community.
…So we all need to practice creating beauty. It’s remarkable how counter-cultural this participation might be, since we now live in a society that thinks “beauty” is meant to be produced by professionals from big cities and consumed by the rest of us.
We may not all be gifted artists like Kathleen, but we can still all be involved in creating beauty. As Pope John Paul II wrote in his “Letter to Artists,” “Not all are called to be artists in the specific sense of the term. Yet, as Genesis has it, all men and women are entrusted with the task of crafting their own life: in a certain sense, they are to make of it a work of art, a masterpiece.” We all have an opportunity and a responsibility to participate in this task of culture, and our “sub-creation,” as Tolkien calls it, should be guided by the contours of the beauty we’ve perceived.
I am afraid that what often keeps us from embracing the quotidian work of sustaining the “little platoons” of which we are a part is the sense that this local work can’t affect the national and international problems about which the news media continually obsesses. But while such local work may seem futile in our current political and economic environment, it may actually be the most consistent and effective way to cultivate health, given the farce that national politics has become. This is why Berry believes that “Our environmental problems [as well as the other diseases that afflict our society] are not, at root, political; they are cultural” (What Are People For, “A Few Words in Favor of Edward Abbey” 37). Dreher echoes this sentiment in an essay on Wendell Berry in which he considers him “a latter-day Saint Benedict”: “I am convinced that conservatives have placed far too much stock in political action and far too little in the work of culture” (The Humane Vision of Wendell Berry 281). Dreher hopes that Berry has begun a sort of monastic cultural movement in which, instead of pouring their energy into national politics or the culture wars, individuals work to form healthy, beautiful communities in their homes. These communities might then preserve and sustain culture, providing beacons of hope that stand in stark contrast to the sick society around them.
I do want to qualify this politics/culture distinction. Politics is indeed part of culture and a shaper of culture, but my point is that it shouldn’t be the primary arena in which we try to effect cultural change. Rather, fostering healthy and beautiful cultures will inspire others to participate and cultivate the communities of which they are a part. Representative democracy too often relies on a slim majority forcing everyone else to do the majority’s will, whereas culture relies on beauty to foster a robust conversation about the common good, and then to persuade others that this common good, that health, is desirable.
This distinction provides, perhaps, the clearest insight into the unique power of beauty: whereas political power ultimately relies on force, beauty simply invites others to perceive the splendor within its form. Beauty is an invitation, a gift, and thus it is always vulnerable to rejection. This is its weakness, and this is why beauty is often overlooked as a salve for our contemporary problems. But its weakness is also its strength. In our cynical world, where people are jaded by political posturing over truth and strident demands that some particular way is the only right way to live, beauty simply puts itself on offer. And if its form reveals truth and goodness, then those who behold beauty may find it love-worthy. Once our affections are moved, right action and truthful speech will follow.
The time has indeed come. Amidst the Googlization of everything, it has become clear that there are some things Google cannot and ought not do. Google cannot replace a distracted student’s brain with a curious and attentive one, nor can it enhance such qualities as courage, kindness, and truth-telling.
Given the aims of a humanistic education—the intellectual formation of human beings, whether they live in the digital age or the stone age—the production of ever sleeker and shinier gadgets is, despite proclamations of revolution, largely superfluous. As Coppage suggests,
[P]erhaps purchasing one more avenue for Google Now to anticipate our every need is not educationally value-added. Perhaps, what our education system should be focused on is keeping our minds sharp and disciplined, preserving the powers of self-direction and careful attention.
The powers of self-direction and careful attention are precisely things that are cultivated through the intellectual and moral habits of individuals and their relationships with other human beings, not through the replacement of mental and physical processes with Google products. When we discover that we are serving Google rather than Google serving us, we will find the service very poor indeed.
The poet Mary Oliver once wrote, “To pay attention, this is our endless and proper work.” If this is true, then our proper work may include carefully considering our use of technology and how that use shapes us, and regularly setting aside the speedy tablet to give our attention to the slow language, poetry.
Poetry is a better, though harder, master than Google. Reading poetry is a peculiarly difficult act, for it demands the devotion of body, mind, and heart. And, as Coppage points out, “There are no shortcuts,” and never will be, if poetry remains and we remain human.
A poem truly experienced is a poem that is lived with—memorized and spoken, recalled in the morning and remembered at night, growing more precious and deepening in meaning through days of recitation and reckoning. To appreciate poetry, we need to pay attention, and it may take some time to train ourselves to take the painstaking care needed to read a poem well. But the end of our labor is joy, just as a good meal needs time to simmer slowly before it can at last be savored and celebrated.
Now, what does all this Luddite romanticizing have to do with education?
A whole lot, it turns out, if we’re concerned with educating human beings rather than credentialing digital natives. Language shapes the way we think, and the words we use shape our vision of the world. Poetry renews our language, re-imbuing meaning into words maltreated by sound-bite discourse and Facebook memes.
Poetry demands both precision and imagination; it plumbs the depths of meaning, whereas Google can only optimize our search for information. Google can give us words on the screen, but it is up to us to make them our own.
If, as Plato said, “The object of education is to teach us to love what is beautiful,” then let us teach ourselves to love poetry. For love requires knowledge of the deepest kind—knowledge of an entirely different order from Google analytics—and learning to love requires profoundest attention.
“We must shift from a vision of intelligence, as a basically neutral cognitive ability, to a holistic vision of intelligence as an ability that nurtures the human spirit and enables a person’s full realization. Intelligence and love of life in this vision go hand in hand.” – Ramón Gallegos
“As Dewey says, ‘It is not experience which is experienced, but nature – stones, plants, animals, diseases, health, temperature, electricity, and so on.’ My valuing experience of an act of injustice as wrong is about value that I find in the same world where I also find plants and stones. To dismiss the importance of valuing in inquiry because it is merely subjective or a mere psychological reaction is to assume a dualism or to presuppose the supremacy of the theoretical standpoint in revealing what is real.” – Gregory Pappas
So much can be said about Sue Bell Yank’s post The Constructivist Artwork that it is difficult for me to address everything. Her piece is quite welcome, as it raises many interesting questions. The quotes above hint at the crux of my response. Pragmatism, in many ways, nullifies many of the “problems” posed by Yank. To start, the distinction between idealism and constructivism can be pragmatically useful, but the pragmatist believes that ideas are things, so they are as much a part of the world as ice cream. Pragmatism also preaches meliorism (essentially the belief that life can be improved), so it is not truth in any final sense that is sought, but a truth that “works.” Pragmatism, as William James describes it, is “radical empiricism.” In his pragmatist version of empiricism, contra Locke and Plato, the fact/value distinction (like so many others) dissolves. So if we apply some of these points of view to Yank’s piece, we see that she is correct that “constructivism is inevitable.” But so is idealism, because the two epistemological nodes are part of a continuum.
Addressing this adequately requires a holistic point of view, and it leads to one of the difficulties with the piece. It suffers from a one-dimensional understanding of what knowledge is and mistakes education as being solely concerned with this limited (intellectualist) notion of knowledge. As Gallegos points out above, knowledge and intelligence needn’t be the purely cognitive type of material Yank seems to imply. She says, “But often, experiences that are novel and rich with ideas have an educational “potential” and therefore a position on how we acquire knowledge and what that body of knowledge is.” Note that she describes experiences rich with ideas. This point of view is similar to that of the proponents of academic standards in schools (which function in much the same way as the “museums, art spaces, and funding entities” Yank describes). It mistakes that which can be measured for that which is valuable. So I’m left with two suggestions: one is to expand what counts as knowledge; the other is to advocate for art practices that do more than engage the mind. Holistic educators are a rich source of guidance here (see Nel Noddings, Ron Miller, etc.). Without this adjustment, we’re stuck in the art world academics want, one that cultivates their own specialist skills and interests rather than an art world that cultivates thinking, yes, but also joy, love, and the soul.
“Loyal to our critical principles, we can barely squeak out the slenderest of affirmations. Fearful of living in dreams and falling under the sway of ideologies, we have committed ourselves to disenchantment…What we need, therefore, is to rethink our educational self-image and subordinate the critical moment to a pedagogy that encourages the risks of love’s desire.” – R.R. Reno
artists without artworks – “who have radically chosen non-creation and have assumed the status of artist, the living for one’s self, outside of all artistic production”
These days the disappearance of the work increasingly haunts art. This unique thing to be venerated, to reflect upon, or to contemplate belongs less and less to our artistic practices. However, is this necessarily the same as saying that there is no art left? In a certain way art is done with art, in terms of what we have come to call art. We keep the name—art—yet, fundamentally, its content has changed. We can therefore no longer think through discourses on art, either aesthetic or historical, that unite notions of art and artwork to the extent of making us believe in the necessity of the latter as an absolute creation and of asserting the complete independence of the art field. Art practices themselves have abandoned the notion of artwork and the idea of art that accompanied it. Twentieth-century art has thus unceasingly been haunted by minority-becomings, those of “artists without artworks,” to borrow a phrase coined by Jean-Yves Jouannais, (1) who have radically chosen non-creation and have assumed the status of artist, the living for one’s self, outside of all artistic production (Dadaism is a fitting example). Moreover, in this century other artistic practices have come into being that make the word “work” difficult to use with respect to them, and the term “artwork” even more so: performances, actions, happenings, ephemeral art, certain installations and videos, and so forth. Finally, we know very well that we must not examine art as a series of incontestable objects to be preserved. Art does not deploy itself only as a succession of productions offered to the veneration of the public in museums or in galleries, but equally as an artistic path or trajectory.
It then becomes a matter of abandoning aesthetic discourse as a discipline that follows the artwork and takes an interest in its reception, in the notion of taste, and that analyzes the internal elements that objectively constitute the artwork. Even very recently, with a philosopher such as Nelson Goodman, we still consider the finished or accomplished work through a philosophy of interpretation that studies how to “make works function.” (2)
What is forgotten in all these perspectives is that art, before being an artwork, before it can be understood as a “masterpiece,” also constructs itself by means of the overflow that brings it into being, by experiences that result from a non-linear activity on the part of the artist. The artwork is not necessarily abandoned but reconsidered in the overflow itself. Ensuring that the trajectory is the equivalent of the work, that the work is nothing other than an artistic experience (including the experience of not making), may be the paradoxical signature of the contemporary artist.
However, how can we come to an understanding of art from the perspective of experience? If the artist’s experience actually exists, it always resides in a sort of formalization of experience itself. Art is no longer defined expressly through the creation of a work often associated with the figure of the inspired artist or of one who possesses genius; it is sustained through formalization, by means of a kind of language or art form, an individual experience rooted in the sensible world and in the singular impressions that are retained by the artist. In support of the idea that the work disappears in favour of the setting in place of artistic experiences, I would like to mention an artist such as Allan Kaprow who developed the concept of the happening in 1959 in New York. As a performer, Kaprow thought he could abolish the frontiers between art and life through a formalization of the experience that takes the shape of experimentation in happenings. Art had to return to events and daily objects in order to restore the proximity between artists and their public and between works and actions. Since the 1960s, the production of environments that introduce objects of daily use around which the public moves amounts to manifestations of a definition of art as an experience of the world that surrounds us. This definition of art brings it closer to life. However, it never reaches the point of confusing the two. The artist’s experimentation always amounts to a formalization of lived experience insofar as it stops short of making this experience sacred, which means that art sides with transience. The fact remains that such formalization in experimentation establishes the power of actions and events in art. The disappearance of the artwork resides in the erasure of its autonomy and of the myth of art’s exceptional character. Art becomes experience, experimentation, and intervention. 
Not only does it reflect on ordinary life but also, in the same movement, it affirms its precariousness against all logic of power. Precarious, art no longer recognizes itself in the enshrined edifice of the artwork, but tries tirelessly to reclaim the tangled web of experience with what constitutes its own work, formalization, but a formalization that has become uncertain and relational. In having become fragile or tenuous, the work on forms must always begin anew insofar as it has the tendency to melt into lived experience or into the complexity of the world.
It is specifically the artist’s experience, with his or her doubts and everyday uncertainties, that is formalized in such a way as to turn experience into an expression, and expression into an experience. Art distances itself from a thinking that would bring it back to the enshrined site of the work, better to correspond to a world of diversity, series, networks, and links; thus, the artist can experiment and produce an artwork that from now on stands as a fragile trace of this experimentation. Not only human beings are vulnerable, but also art itself as it bears the burden of vulnerability, far from the all-powerful artwork.
Nevertheless, has the work disappeared? No, it lives through its extinction. It still shines in the evanescence that dissipates it, similar to an ephemeral mark in a delimited space and time. In this dissipation of the artwork that postpones it without cancelling it out, the experience of the work’s absence maps out the conditions of a negative artwork. In contemporary art, the subject is not the folly of the artwork’s absence but a certain regime of experimentation of the covert work: the artwork’s disappearance as an illicit work!
“What could be more normal than artists producing artworks? After all, they’re just doing their job, and there seems to be no stopping them.”
Art’s broken promises
By and large, when artworlders talk about what might be broadly described as art’s ‘use value,’ they’re bluffing. Anyone who believes that art, in any conventional sense of the term, by ‘questioning,’ ‘investigating,’ or otherwise ‘depicting’ some socio-political issue, actually empowers anyone to do anything about it, is actively engaged in self-delusion. Yet art continues to make such promises — using its institutions to lend them not only a largely unchallenged semblance of truth, but all the trustworthiness of convention — only to immediately break them. Why? Is it because art is unable to do away with its romantic underpinnings, except by abandoning itself to all-out cynicism?
…What could be more normal than artists producing artworks? After all, they’re just doing their job, and there seems to be no stopping them. And besides, who would want to stop them? So they go on and on making art — adding to the constantly growing category of objects obeying that description. What is more unusual, and far more interesting, is when artists don’t do art; or, at any rate, when they don’t claim that whatever it is they are doing is, in fact, art. When they recycle their artistic skills, perceptions and habitus back into the general symbolic economy of the real.
There is, of course, a context for this shake-up of the status of art and the artist, bequeathed by the twentieth century: artistic activity itself is developing on a massive scale and in a mind-boggling variety of forms, and the production of meaning, form and knowledge is no longer the exclusive preserve of professionals of expression. One finds artistic skills and competencies at work in a variety of areas far beyond the confines of the symbolic economy of the art world, and the practices which they inform are in many cases never designated and domesticated as art. The fact that this sort of art-related creativity seeks no particular validation from the art world, that it pays scant heed to the values and conventions underpinning it, should by no means inhibit us from charting its genealogy and identifying its inherent rationality. And yet, aesthetic philosophy, persisting as it does in construing art as an enigma to be deciphered, as an object begging interpretation, seems decidedly ill-equipped to theorize art in this expanded sense. Beyond both the well-worn logic of appropriation, which consists of recuperating as art all description of objects and activities not intended as such; and beyond the converse, though symmetrical logic consisting of using artistic practices — those, in other words, initiated and managed by artists — to stake out and claim new territories for art, it seems worth pursuing use-value in this particular direction though on the basis of an extraterritoriality and reciprocity that prefigure an unforeseen future for it.
…Duchamp points to the symbolic potential of recycling art — and artistic tools and competencies — into the general symbolic economy of life (as opposed to the standard readymade, which recycles the real into art). The point, and starting point, of this project is to reactivate this unacknowledged genre of artistic activity.
Art without artists, without artworks, and without an artworld
So what happens when art crops up in the everyday, not to aestheticize it, but to inform it? When art appears not in terms of its specific ends (artworks) but in terms of its specific means (competencies)? Well, for one thing, it has an exceedingly low coefficient of artistic visibility: we see something, but not as art. For without the validating framework of the artworld, art cannot be recognized as such, which is one reason why it is from time to time useful to reterritorialize and assemble it in an art-specific space. In one way or another, all the collectives in this project confront a common operative paradox: though informed by art-related skills, their work suffers from — or, should we say, enjoys — impaired visibility as art. Yet this impaired visibility may well be inversely proportional to the work’s political efficacy: since it is not partitioned off as ‘art,’ that is, as ‘just art,’ it remains free to deploy all its symbolic force in lending enhanced visibility and legibility to social processes of all kinds. It is a form of stealth art, infiltrating spheres of world-making beyond the scope of work operating unambiguously under the banner of art. The art-related practitioners involved in this project have all sought to circumvent the reputation-based economy of the artworld, founded on individual names, and have chosen to engage in collaborative action; they use their skills to generate perception and produce reality-estranging configurations outside the artworld. As the wide range of tools developed by these collectives shows, this has nothing to do with a ban on images; art has no reason to renounce representation, a tool it has done much to forge and to hone over its long history. The question is the use to which such tools are put, in what context, and by whom: tools whose use-value is revealed as they are taken up and put to work.
“The solution to a bad dream isn’t to argue yourself into a better dream, but to wake up and look at the world—then laugh or cry or be bored.”
All this is far from “how to” advice. I think we improvise our way into what becomes a life, and that means listening to the last two notes we played, as well as knowing some basics: Am I any good on the sax? Should I stick to drums? Am I paying attention to what the rest of the ensemble is doing? And there are other questions. How do I discover a leaning, a capability, a pleasure, a calling? John Rawls talks misleadingly of “life plans”—I suppose this is on the model of “investment plans” or “career plans.” My mind doesn’t work that way. I can’t put down general “learning objectives” for my classes. I don’t have a life plan for my life, and don’t know what my long term objectives are (if I have any). If something goes bad, I have something to say. But I don’t start with a plan or desire for specific outcomes—except in the most platitudinous sense: stay healthy, don’t starve, be a mensch. In class, if asked for an overall aim, I’d say “get to love these issues, texts, figures, passages. Praise what you love. Get comfortable sharing your growing interests and loves as you ramble or stumble through the whirl, eye ready for sudden insight, sudden center.”
A recent magazine piece (maybe in the Guardian?) by Wittgenstein’s biographer, Ray Monk, reflects on Wittgenstein’s collection of photographs. There’s a connection between looking at the collected photos and Wittgenstein’s emphasis on looking — rather than explaining. In a parody, we could say that philosophers explain-explain-explain. They can forget to just look at the world, or flow with it, or listen to it (like listening to music). Wittgenstein thinks that philosophy is not a set of theories, one of which may be correct. Nor is it a set of bad theories about to be replaced, thank God, by the good theory I’ve just concocted. Enlightened as I surely am, I hereby stop this proliferation of error by announcing the truth. (It’s nice to fantasize omniscience.)
Wittgenstein thinks philosophies are symptoms of unhappiness, of verbal and intellectual confusion, of anxieties that are nearly inescapable. (Don’t we really, really, need to understand?) But maybe these inescapable worries are rather unreal, like a bad dream—real enough in the moment, and troubling, but forgettable when you awake and can so easily change the subject. The solution to a bad dream isn’t to argue yourself into a better dream, but to wake up and look at the world—then laugh or cry or be bored. Whatever your reaction after fresh contact, you’d no longer worry about whether the world exists, or whether feelings are always dangerous and unreliable, or whether moral relativity is true or false. You’d soak up the morning, act as you act, and solve your daily problems the way most persons do—one by one, with a minimum of ‘theory’ directing them. So…stop explaining. Just look! That’s Wittgenstein’s advice. Acknowledge your confusion, but the aim is to move into life—join the dance!
Wittgenstein had a deep interest in religion, in Tolstoy, Goethe, and Kierkegaard: he wrote, echoing a bit of Kierkegaard, “faith is a passion; wisdom, like cool grey ash.” He carried Tolstoy’s Gospel in Brief to the trenches during WWI, and read from it every day. His Investigations is like a maze or storm at sea or series of unsolvable puzzles, full of almost biblical enigmas. You might say it holds both that human life has no Ground, no big foundation in logic or a rock-solid God, Science or Reason, and that it nevertheless has all the (God-given?) ground it needs—in overlooked aspects of life: the smile of a child, the rise of the sun, the sound of a clarinet, or a call to prayer from a minaret. To feel that, to live from it, would be something like leading a life of faith, being grounded in it. “All theory is grey, my friend, but ah, the glad golden tree of life is green.” Yes, that’s good, but not quite Wittgenstein. For him, theory might be “cool grey ash” but life was too polychromatic, including shades of black, to qualify as golden or green. In any case, it’s not just too much theory that makes for what he called “the darkness of the times”—his and ours. In his 1929 Notebook he writes enigmatically, “What is good is also divine.” He refused ashes. He could imbibe good: “Tell them I’ve had a wonderful life.”
I know that’s not a ringing conclusion, but it needn’t be reason for disappointment or angst. Except in rare instances, philosophy is not a well-plotted research program that culminates in definitive findings, conclusions, and closure. It’s a register of deep wonder and yearning. If that’s right, then philosophy will always be asking, no matter what, and always opening an impoverished agenda, and always improvising its way.
Dean Dettloff: Wow. I feel as though you’re already performing this kind of intimacy-therapy on me in this interview alone! The themes of renewal you trace are neither bound to psychological experience nor public consciousness, though they deal with both. You clearly have a heart for interpersonal relationships and societal healing, which seems to bleed into your philosophy of education and a desire for these kinds of ideas to reach a public audience instead of staying within the academy. Would you discuss the way these sensibilities have shaped your role as an educator, both in the academy and outside of it?
Edward F. Mooney: “Intimacy-therapy” captures something about teaching and learning. Unfortunately, the ideal gets lost in the bustle about stiff “learning objectives,” about generating knowledge for the social-industrial-military complex—the specialized research university as a knowledge-generating machine. In my view (I’m in a decided minority), the best education is paternal, avuncular, maternal, fraternal, “sisterly” — where (Platonic) “care of the soul” is front and center. You and I in this blog can discover (and rediscover) the truths of “intimacy-therapy” in the company of other mentors: Kierkegaard, Berdyaev, Nishitani, and countless others you feature for us.
…But the humanities ought to have care of our souls, so the loss of an articulate expression of this in that section of the university is especially unsettling. I think a certain ideal has been abandoned. I’d hesitate to share my enthusiasm about this ideal of intimacy with my colleagues, say in a department meeting where the dean has put pedagogic practice in the spotlight. This is what I could expect:
“Professor Mooney, what are you saying! That you throw a book out to a class and wait to see what happens? No lectures, no tests on information acquired, no honing of necessary skills? We pay you for encouraging free-form emotional response?”
You have yet another part to your question that’s more difficult to answer. You ask me to “discuss the way these sensibilities have shaped your role as an educator, both in the academy and outside of it?” What makes it difficult is that it assumes I have a grasp of my sensibilities, a grasp of what underlies my love of Mozart or canoeing or Thoreau. But I want to say that I just am certain sensibilities whose provenance is dark or shaded, and whose agency now, in the life I articulate, is also dark or dappled at best.
If a writer knocks on my door, and I only remark on their height or weight, I’ll have missed an essential dimension of their being. I can report on what a philosopher said for an exam, if required. But that would leave the living spirit of the saying out of my response. I want to convey my sense of the living spirit I’ve been excited by. If I adopt “professional distance” as a posture of response, then I’ll be leaving out ever so much. Lyrical philosophers (I can’t think of a better name) deserve lyrical response, especially if there’s a reason they need to be lyrical. So I guess that leads to a question beyond the question of why I write the way I do. It leads to asking why Thoreau and Kierkegaard (for example) write the way they do. Why does anybody need lyrical philosophy?
DD: That, of course, is a question deserving some exploration. Why does anybody need lyrical philosophy?
EFM: Of course, that’s the big question. Let’s say we grant that Kierkegaard or Nietzsche or Plato or Schopenhauer have moments of great lyricism. Let’s assume this isn’t an accident or mere aesthetic flourish but a moment when each feels that, to say what they want to say, lyricism is inescapable. Why should this be?
Well, it’s based in philosophical anthropology, I think. We are calculating logic-wielding creatures and can be marvelous proof machines (and counter-example machines). We can shine at producing persuasive logical argument tending toward definitive conclusions. That’s our stock in trade as philosophers. We are also, at a more primal level, deeply moral creatures, wanting a fair deal, wanting reciprocal trust, needing to promise and to have promises honored. So lots of philosophy deals with understanding these matters of logic, argument, and morals.
We are, at an equally primal level, creatures of dance and singing, theater and narrative. Sometimes—especially when we move out of the corrals of logic and forensic morality—we face wild questions (Why death? Why birth? Why suffering? Why rain? Why love lost? Why love requited? Why injustice? Why beauty?). These can be given “social scientific” answers, but they also resonate deeper than that. At this deep level, they can best be articulated (if not answered) lyrically, artistically, religiously. Dance and singing, theater and narrative, articulate the enigma that we are creatures who in fact agonize over these questions (Why do we bother? What’s the evolutionary advantage? What’s the practical advantage?). And perhaps it’s our essence as humans to be self-reflective this way. We agonize even as answers continue to elude us, and even as we know they will always elude us.
I see lyrical philosophy as approaching poetry and great narrative, myth and song—say in Schopenhauer or Thoreau or parts of Plato—at exactly those moments when these wild questions obtrude. They strike at an angle that tells us that logic and morals and standard arguments fall short. These fail to address them in their depth. And we know just as certainly that we will falter in giving lasting or satisfying answers. But we can’t leave the questions, in all their intensity and passion, unvoiced, suppressed, abandoned by the road. We dance without practical or logical rationale to express what seems to elude our everyday philosophical capacities. We write a hybrid philosophy that melds with the poetic, musical and dance-like.
The philosophical bearing of lyrical philosophy is to express those heartfelt, nagging, inescapably wild questions that surely ought not to be buried or avoided. Are we not, as persons, drenched in love and love-lost, envy and eloquence, new life and old age, iniquity and pain of every sort—and also drenched in great moments of unspeakable serenity and joy? Aren’t these worth philosophical memorialization, praise, and lament?
I shouldn’t forget the quieter hurts that could use quieter healings. There are sufferings that don’t appear in the daily news or in hospital statistics. My student with a blank look on her face; or the other one who drops out, preferring dorm drinking to whatever a poem might offer. There’s the other guy, who freaks at the idea of putting a thought in a sentence; there’s the one whose parents exert devastating pressure on their kid, now a senior, to succeed (translation: “make enough money that our investment in your education won’t have been in vain”); then, the one who has become a smart-aleck cynic. Often the hurt comes from a sense of disconnection from anything that matters—a lost intimacy with others and our shared world.
I think sometimes it’s only when we come across writing that speaks to soul-ache that we can “discover” how much we hurt. We’re given a measure of articulation and depth. We unexpectedly feel recognition of our own pains and joys that we had not yet found words to equal. The discovery of expressiveness is a discovery of what we have to express. At the moment it arrives to us, we become vulnerable and then capable of returning expressiveness in kind. We can find ourselves hurting or singing or carried away in exaltation just as a sentence we’ve encountered bespeaks hurt or song or exaltation.
What I’ve called “lost intimacy” is the loss, I suppose, of participating in occasions of such expressive mutuality. It’s the loss of lyricism in philosophy, or the feel of the poetic in universities and much of cultural life, and the hegemony of an ideal of professional distance and suspicion of what I’ve called the soul. It’s related to the fact that we don’t have companions or mentors with whom we can speak about the joys that course through our lives, or about the emptiness that can cloud our days, or make nightmares of sleep. We have professionals who in therapy “hear our story,” and we sometimes have Rabbis or Gurus, Pastors or Coaches or Priests. But we also need to share intimate matters as equals, not just as client to an expert responder, or priest to parishioner. Attentive aunts, parents, siblings, or lovers might fill the bill. I think complaints about unchecked globalization and technology bespeak a fear that fragile enclaves of intimacy (if they exist) are increasingly at risk.
The artist without works pursues the assertion of an ideology rather than the building of a career – Undeeds, unart, and the undone.
An artwork does not necessarily need an author, an author does not necessarily need an artwork. Since her student days, Dora Garcia has been fascinated by the figure of the artist without works. An absurd figure, often tragic, it started to gain weight and prestige when it crossed with the figure of the dandy. A dandy is an artist who considers the production of things (books, music, or art) to be the dullest of things. Needless to say, to think of a career as a producer of “things” would add vulgarity to dullness. The dandy artist without works evolved later into the conceptual artist, on the one hand, and the counterculture hipster, on the other. The difference is mainly one of cultural environment. But the core is the same: an artist who tries to have an ideology rather than a body of work. The very word “career” would produce a sneer on the face of the artist who stays as far from the institution as possible, who flees from the idea of the “guild” or “profession” as if it were the bubonic plague (Guy Debord, Gil Wolman, Tristan Tzara). Because the artist without works nevertheless enjoys seeing and reading and listening to good artistic products, they will try either to have them made by someone else, or to have them made by nobody: a music that composes itself (Cage), a painting made by a machine (Duchamp), a book that has been found (Borges), a text written by other texts (Burroughs). The artist without works despises the petit bourgeois idea of the “genius” as a virtuoso who excels in their field and is ready to be served as entertainment in the bourgeois salons. The artist without works has nothing to offer to the mainstream public, and fame would make them think that something is definitely wrong.
An Artwork Does Not Necessarily Need An Author; An Author Does Not Necessarily Need An Artwork
As soon as artists start being shaped in art schools, a yearning is imprinted upon them, setting up the source of a lifetime of uneasiness, longing, and want: the yearning for fame and recognition, which must be achieved by an unstoppable production of things. However, to paraphrase Francis Picabia, by fleeing the atrocious destiny of being unknown, artists necessarily land on that other atrocious destiny: failure. Francis Picabia: “Men can be divided into two categories: the failed and the unknown.”
What follows is meant as a personal homage to those artists who have not produced things, who have produced things but have tried to hide it, or who have directed the steps of others to produce things they wanted to see but did not want to make; and who in NOT doing have exerted a vigorous influence on other artists. The artists without works.
We could imagine the artist without works as a tragic figure, paralyzed by the fear of not meeting his own expectations, or of not deserving to be in the same room as those he admires most. Or we could just as well imagine the artist without works as a defeatist figure, the type of artist who questions the sense of doing anything if it will in any case be misunderstood, misused, or, even worse, forgotten.
These figures exist, but they are not what I am talking about. The artist without works I want to pay homage to is not tragic, but joyful. It is the artist who, intersecting with the figure of the dandy, even intersecting with the figure of the hipster and the countercultural hero, prefers NOT WANTING. Note that this is not the elimination of will power but, on the contrary, the glorification of the will of nothing. The artist without works pursues the assertion of an ideology rather than the building of a career, an ideology that would rest not on objects but on deeds, or rather, un-deeds. The artist without works seeks the beauty of not doing, not wanting, not leaving something behind. He chooses the radicalism of refusal: I am not there; I would prefer not to.
Refusal of many things. One, refusal to make sense.
The killing of the author brings to the artist an exhilarating freedom. Freedom, as when we free ourselves by pretending to be someone else.
Two: Refusal of quantity.
No need to write them in full; just write an entry in that imaginary encyclopedia. Going back again to the depiction of the artist-producer as merchant, another typical vendor’s vice is the will to produce much, so as to keep their clientele fulfilled and contented. But Pessoa, another expert in disappearance, says: “Each of us has a small amount of things to say, but there is not much to say about that, and posterity wants us to be brief and precise. Faguet (author of Petite histoire de la littérature française, 1913) says clearly that posterity only loves brief authors.”
When one just has to write novels of three lines, what does one do with the rest of one’s time? Pierre Cabanne says: “Our best artwork is the use of our time.” Seas of time open for the creator of micro-narratives, who thereby becomes the dandy, the amateur, the dilettante. But not only.
First, the artist without works disappoints the audience with his indolence, by not doing any productive work whatsoever. “Nothing can offend them more,” as Guy Debord said. Second, he infuriates them by insulting them, and third, he terrorizes them with his criminal behaviour.
Félix Fénéon, the inventor of the “Nouvelles en trois lignes,” was said to be an anarchist who planted bombs (1894, restaurant Foyot, Paris). The surplus of time turns the artist without works into a hobo, a walker, a demonstrator, a subversive, a striker, a drug-dealer, an outsider, a sexual degenerate, a surrealist, a banalyst, a situationist. The iconoclasm of conceptualism was not so much a sign of linguistic Puritanism as a refusal to produce, a political stance, a sabotage.
It is obvious that if there is one audience the artist without works couldn’t care less about, it is the critics. It is important that those un-made artworks are beyond good and bad; they are, in fact, indifferent to the idea of criticism. As Robert Filliou established with his principle of equivalence: well done = badly done = undone.
Last refusal: Refusal of being here at all.
“Can one make works which are not works of ‘art’? Can one make something that has no function, that performs no work, that is not beholden to a purpose, even that of art? Something not beholden to leisure either?”
[There are a host of problems with this piece -
A sloppy conflation of work, labor, and effort, which leads to an incomplete analysis (isn't the representation of work as work, which Molesworth describes as emerging from Taylorism, part of what she herself engages in with regard to so-called domestic labor/maintenance work? Isn't she ceding too much to the object of her critique by accepting the very basis of its representation?)
A little more fleshing out of laziness, leisure, and idleness would be nice and perhaps adding in slacking which has, to my mind, even more relevance with regard to a cultural point of view.
Duchamp, of course, serves her ends a little too neatly with regard to the impossibility of doing nothing. In order to validate Barthes she preordains failure at a "complete cessation of artistic activity," but while this might have been true for Duchamp, it needn't be true for everyone (this could simply be a misreading of what she is claiming, of course). There are less tidy examples of quitting art if Molesworth cared to look...
And why is being "romantic" so terrible?
Despite some of these problems, this piece is still "useful." Nice "work,"]
These photographs provide us with a context to view the readymades, but one characterized by blurred boundaries. The home, traditionally conceived of as a space of rest, is here crossed with the studio, historically understood as the primary site of artistic work. Adding to this confusion is yet another smudged edge, because work (making art in the studio) and leisure (not working, which takes place at home, or art making as a form of leisure) are brought into extreme proximity. The lack of a hard-and-fast divide between work and leisure is emphasized by these images of functional maintenance objects, objects designed to aid in the cleaning and tidying up of places and people, rendered deliberately dysfunctional. Duchamp’s ambivalence toward work related not only to artistic production; he resisted the labor of housework as well.
In his critique of the everyday, Lefebvre sought not simply “entertainment” or “relaxation” but the articulation of different forms of knowledge, knowledge that could aid in the potential and/or intermittent process of “disalienation.” It is not in leisure as such where a critique of capitalism is to be found. Rather, a critique may emerge in those moments when the relations between elements of the everyday are made evident or challenged. Duchamp’s presentation and arrangement of the readymades exhibit a desire to foil the functionality of these objects, whose usefulness resides in their ability to aid domestic and maintenance labor. Yet in foiling work, the readymades do not offer leisure as work’s simple antithesis (nor do they offer art as pure leisure). Instead, their placement in the home/studio tangles the categories of both work and leisure. This presentation of nonwork and leisure has a social and historical context larger than Duchamp’s studio, for Duchamp’s refusal of work (both maintenance and traditional means of artistic labor) happened alongside one of the most profound shifts in twentieth-century conceptions of work: Taylorism. Just as the photographs of the readymades in Duchamp’s studio have not been adequately theorized, the sociohistorical conditions within which the readymades came into being in New York are absent from much Duchamp literature.8 As Duchamp’s work of this period appears concerned with the terms of work, an examination of the contemporaneous shift in the practice, conception, and representation of work seems necessary.9
In Bodies and Machines, the literary critic Mark Seltzer argues that Taylorism not only altered the work process (by making it more “efficient”) but also invented new forms of work. He contends that “the real innovation of Taylorization becomes visible in the incorporation of the representation of the work process into the work process itself–or, better, the incorporation of the representation of the work process as the work process itself.”20 The representation of labor–graphs, flow charts–became a form of labor in and of itself, with manual laborers represented by their newly established managers. We can see clearly the irony of a Taylorized household, as women were asked to represent, manage, and alter their own manual labor.
… By stymieing the “work” of looking at art, Duchamp transformed the gallery into a version of his mazelike studio, a place where humor and play were encouraged–work discouraged. In both instances Duchamp represented forms of labor (or alternately leisure), be they making or looking at art, but he did so by disallowing such labors and/or leisures to take place.
We have already seen the confusion between the spaces of work and leisure in the photographs of the readymades in Duchamp’s studio. We can also see that the arrangements of the readymades interject an element of play among a set of otherwise fairly banal functional objects. Additionally, the objects blur the boundaries between home and work (typewriter cover, comb, shovel) in that their functions are all bound to the labor of maintenance, a stratum of labor structural both to the space of the home and more traditionally conceived work spaces. Not only has Duchamp blurred the traditional boundaries of work and leisure in the studio, but the readymades are functional objects rendered playful through their humorous appeal to slapstick.
While Marxism offers us the most sophisticated theoretical account of labor, it has also concerned itself with work’s dialectical other, play or leisure. For many Marxist thinkers play has an idealistic, almost utopian dimension, in that it is posited to exist outside the rules and regulations of everyday life. Herbert Marcuse focused more of his philosophical energies on play than did his Marxist contemporaries. He writes that play is a dimension of freedom, a “self-distraction, relaxing oneself, forgetting oneself and recuperating oneself.”27 If for Marcuse play is a dimension of freedom, then he enables play to serve as a critique of society, because of its position outside the conventions of the everyday. One hesitates to instrumentalize play in this way, turning it into a philosophical lever in the service of some utopian vision, but in Duchamp’s slapstick-infused readymades, the idea and the actuality of play offer possibilities for examining the tangled knot of work and leisure in everyday life.
In 1913 Duchamp jotted a note to himself: “Can one make works which are not works of ‘art’?” Can one make something that has no function, that performs no work, that is not beholden to a purpose, even that of art? Something not beholden to leisure either? In such a formulation, art and play exist in an analogously tenuous realm of (im)possibility. Marcuse states it thus: “On the whole play is necessarily related to an Other which is its source and goal, and this Other is already preconceived as labor.”28 But, if play can only be seen in relation to work, and it is seen as the lesser component of this dialectic in that play is enabled or made possible by work (“its source and goal”), then play, in its officially sanctioned role as nonwork, becomes a form of work. (One need only think of the regimentation of “the weekend” or each summer’s obligatory Disney movie.) Lefebvre argues that one ramification of this interdependence between labor and play is that “there can be alienation in leisure just as in work.”29 Duchamp attempted to use play, in the form of slapstick, not as a reprieve from work but as a means to stop work. This is where play’s potential utopian or critical dimension (a utopia free from labor and a critique of capitalism’s dependence on alienated labor for profit) can be seen most fully.
Lefebvre observes that “there is a certain obscurity in the very concept of everyday life.” He asks, “Where is it to be found? In work or in leisure? In family life and in moments ‘lived’ outside of culture?”35 He suggests that family life has become separate from productive life and that leisure has become as fragmented as labor. Ultimately, he concludes that the three constitutive elements of the everyday–work, private life, and leisure–have become discrete, alienated from one another. Yet Duchamp attempts, through humor and slapstick, to hold these three elements together. The readymades show that these categories are not discrete in experience but rather in ideology, for Duchamp’s practice presents domestic or private life as neither outside nor separate from the category of work. He uses leisure, in the form of slapstick and play, to expose domestic space as filled with work (be it maintenance work or art work) and in turn transforms that work into leisure or play. In the end, the readymades propose a space filled with neither work nor leisure; instead, they offer a kind of laziness. Characteristic of the readymades’ complex relation to both work and leisure, laziness operates as a third term, triangulating work and leisure, offering a criticism of both.
Duchamp’s laziness was the subject of many of his contemporaries’ responses to visiting his studio. Robert Lebel described Duchamp’s studio as “a large room with a bathtub in the center which Duchamp used for his frequent ablutions, and a rope an arm’s length away which allowed him to open the door without getting up.”36 Georgia O’Keeffe, reminiscing about meeting Duchamp in his New York studio, recalled one of his domestic work stoppages: “it seems there was a lot of something else in the middle of the room and the dust everywhere was so thick that it was hard to believe. I was so upset over the dusty place that the next day I wanted to go over and clean it up.”37 This refusal to clean was memorialized in Dust Breeding (1920), a section of the Large Glass photographed by Man Ray after it had accumulated several months’ worth of dust. But nowhere is Duchamp’s laziness more evident than in the readymades, where he produced art with the least effort possible–buying it already made. For Taylor, Duchamp’s dabblings with play and laziness–his experiments with not working–had a name: Duchamp was soldiering.
Taylor described soldiering as “underworking, that is, deliberately working slowly so as to avoid doing a full day’s work.”38 For Taylor soldiering had two causes: first, the “natural instinct and tendency of men to take it easy”; and second (considered to be more dangerous), “intricate second thought and reasoning caused by their relations with other men.” Taylor called this “systematic soldiering.”39 Workers have two modes of foiling the factory: laziness exhibited in the form of individual soldiering and organized resistance in the form of strikes. Taylorism proposed to eliminate both. Striking and soldiering are extremely different critiques of work, one organized, systematic, and social; the other a private rebellion (refusing to dust). But in maintenance work in the home there can be no strike. Duchamp’s readymades align more closely with the second form of soldiering; they are not a strike per se, so much as they are a work slowdown. They temporarily stop or stall activities such as cleaning and tidying by turning housework into slapstick. Likewise, the studio as a place where art is made is suffused with a kind of laziness.
Laziness is mostly figured as a parasitical form of work avoidance. It runs the risk of being aristocratic (not working because others work for you) or primitivist (native peoples as unfettered by the work ethic). There are, however, two theoretical accounts of laziness as a philosophical position, and both carry the same utopian dimension noted in the earlier discussion of the function and structure of play. Paul Lafargue and Roland Barthes argue that laziness is an attempt to escape the logic of work completely. They do not offer leisure as the antidote to work, but laziness as the refusal of work.
Lafargue, a Cuban-born ex-medical student, wrote the radical pamphlet “The Right to Be Lazy” in 1880–a tirade against work that infuriated his father-in-law, Karl Marx.40 Originally printed in French, the tract was translated into English and published in the United States in 1917 (the same year Duchamp purchased the urinal that would become Fountain). Lafargue’s polemic against “progress” belongs to the primitivist side of laziness, extolling unindustrialized native peoples who do not toil for a capitalist exploiter. Lafargue writes: “It [the proletariat] must return to its natural instincts, it must proclaim the Rights of Laziness, a thousand times more noble and sacred than the anemic Rights of Man concocted by the metaphysical lawyers of the bourgeois revolution.”41 Lafargue sees the advent of industrial production as enabling time for leisure, as opposed to the increased profits envisioned by Taylorism. But he never posits that “free time” should be used for “productive” or “creative” forms of leisure. Instead, he insists on feasting and sleeping as the “Rights of Man.”42 The most indelible image from the tract remains a quotation that perversely describes Duchamp’s infamous decision to give up art for chess, his relinquishing of a working life as an artist for the life of a game player: “Jehovah, the bearded and angry god, gave his worshippers the supreme example of ideal laziness; after six days of work, he rests for eternity.”43
In fact, Duchamp never stopped making art. He designed magazine covers, made the Boîte-en-valise, and, ultimately, worked for twenty years on Étant Donnés (1946-66). The problem exposed by the “untruth” of the abandonment is how terribly difficult it is not to work. Roland Barthes addresses this point in a short interview entitled “Dare to Be Lazy.” Barthes describes two forms of laziness, one born of the struggle to get something done, laziness as procrastination from work, or “marinating” in order to work. Barthes says: “Obviously, this shameful laziness doesn’t take the form of ‘not doing anything,’ which is the glorious and philosophical form of laziness.”44 The philosophical form is precisely what is at issue. Barthes asks, “Have you ever noticed that everyone always talks about the right to leisure activities but never about a right to idleness? I even wonder if there is such a thing as doing nothing in the modern Western world.”45 Yet Barthes realized the potential nihilism in the concept of doing nothing. For laziness, he notes, is a problem for the subject: “In a situation of idleness the subject is almost dispossessed of his consistency as a subject. He is decentered, unable even to say ‘I.’ That would be true idleness. To be able, at certain moments, to no longer have to say ‘I.’”46
Duchamp came closer to doing nothing than most artists. But he was lucky. Generously supported by his patrons Louise and Walter Arensberg, who paid his rent and living expenses in exchange for artworks, and hired by the wealthy Stettheimer sisters as their French tutor–although since they had been raised in France, all three sisters were completely fluent and obviously needed no tutor–Duchamp largely managed to avoid working.47 He lived an aristocratic leisurely life, his idleness made possible through the wealth of others and a frugal lifestyle. Yet while Duchamp may have courted laziness, and let laziness infuse his art practice, ultimately the complete cessation of artistic activity was impossible. Impossible, as Barthes suggests, for it would mean an abandonment of the first person pronoun.
Duchamp’s readymades are an attempt to think outside the logic of work, a logic in which “the goal of labor is the full reality of human existence.”48 Not to work–to be lazy–is then to deny the full reality of human existence, to deny the category of “I,” at least the form familiar to bourgeois capitalism. Duchamp experimented with this idea by evoking the involuntary laughter within which the “I” is no longer central, and by transforming his studio, a place of work, into a site of play. The studio became a place where he could be, in Bergson’s term, “absentminded” or, in Marcuse’s, “self-distracted.” This questioning of the “I” runs throughout Duchamp’s work. After all, this is an oeuvre marked by a proliferation of aliases; a deliberate use of linguistic shifters; an emphasis on language and the self as both shared and constructed;49 and a dismantling of perspectival vision (with its creation of a fixed subject)–all concerns that point toward a consistent questioning of the category of “I.” Duchamp toyed and played with the possibility of nonwork–the right to laziness–the ability not to say “I.” That this position is impossible (or worse yet, romantic) should not deter serious thinking about laziness. Duchamp, by saying that he abandoned art making without really doing so, was perhaps pretending to be lazy, acting at not working. Lefebvre suggests that when “acting explores what is possible” it adds “something real: the knowledge of a situation, an action, a result to be obtained.”50 If what is to be obtained through such play is knowledge (and disalienation), then what knowledge is potentially garnered through laziness? Is it the suggestion that there can be no alienation in laziness, for there is no “I” to separate from or be identical with?51 Or is laziness a conduit to bring us back to the most fundamental of Marx’s demands, a demand designed to alter the terms of alienated life under capital: “The reduction of the working day is the basic prerequisite.”52
Duchamp’s challenge to the primacy of the category of work largely took the form of a protest against maintenance labor, pointing toward the changing historical conditions of housewifery, domestic space, and work in the early twentieth century.53 Duchamp used the readymades to foil maintenance labor, which resulted in a limited artistic production, for maintenance labor permits all other work. The readymades stymie a subject whose identity would be bound up with, and structured by, the phenomenon of work. Instead, they offer humor and laziness, slapstick and play, modes of experience that gesture toward a different set of possibilities for how we might conceive of the everyday and how we might inhabit it.
Creating art by doing nothing – Félicien Marboeuf and rejecting the productivist approach to culture – “My art is that of living”
More than 20 artists will pay homage to Félicien Marboeuf in an eclectic exhibition opening in Paris next week. Although he’s hardly a household name, Marboeuf (1852-1924) inspired both Gustave Flaubert and Marcel Proust. Having been the model for Frédéric Moreau (Sentimental Education), he resolved to become an author lest he should remain a character all his life. But he went on to write virtually nothing: his correspondence with Proust is all that was ever published – and posthumously at that. Marboeuf, you see, had such a lofty conception of literature that any novels he may have perpetrated would have been pale reflections of an unattainable ideal. In the event, every single page he failed to write achieved perfection, and he became known as the “greatest writer never to have written”. Heard melodies are sweet, but those unheard are sweeter, wrote John Keats.
…The artists he brings together all reject the productivist approach to art, and do not feel compelled to churn out works simply to reaffirm their status as creators. They prefer life to the dead hand of museums and libraries, and are generally more concerned with being (or not being) than doing. Life is their art as much as art is their life – perhaps even more so.
…Jouannais celebrates the skivers of the artistic world, those who can’t be arsed. “If I did anything less it would cease to be art,” Albert M Fine admitted cheekily on one occasion. Duchamp also prided himself on doing as little as possible: should a work of art start taking shape he would let it mature – sometimes for several decades – like a fine wine.
With his bovine-sounding surname, Félicien Marboeuf (1852-1924) seemed destined to cross paths with Flaubert. He was the inspiration for the character of Frédéric Moreau in L’Education sentimentale, which left him feeling like a figment of someone else’s imagination. In order to wrest control of his destiny, he resolved to become an author, but Marboeuf entertained such a lofty idea of literature that his works were to remain imaginary and thus a legend was born. Proust — who compared silent authors à la Marboeuf to dormant volcanoes — gushed that every single page he had chosen not to write was sheer perfection.
Or did he? One of the main reasons why Marboeuf never produced anything is that he never existed. Jean-Yves Jouannais planted this Borgesian prank at the heart of Artistes sans oeuvres when the book was first published in 1997. The character subsequently took on a life of his own, resurfacing as the subject of a recent group exhibition and, more famously, in Bartleby & Co., Enrique Vila-Matas’ exploration of the “literature of the No”. Here the Spanish author repays the debt he owes to Jouannais’s cult essay (which had been out of print until now) by prefacing this new edition.
Marboeuf has come to symbolize all the anonymous “Artists without works” past and present. Through him, Jouannais stigmatizes the careerists who churn out new material simply to reaffirm their status or inflate their egos, as well as the publishers who flood the market with the “little narrative trinkets” they pass off as literature on the three-for-two tables of bookshops. In so doing, he delineates a rival tradition rooted in the opposition to the commodification of the arts that accompanied industrialization. A prime example is provided by the fin-de-siècle dandies who reacted to this phenomenon by producing nothing but gestures. More significantly, Walter Pater’s contention that experience — not “the fruit of experience” — was an end in itself, led to a redefinition of art as the very experience of life. A desire to turn one’s existence into poetry — as exemplified by Arthur Cravan, Jacques Vaché or Neal Cassady — would lie at the heart of all the major twentieth-century avant-gardes. “My art is that of living”, Marcel Duchamp famously declared, “Each second, each breath is a work which is inscribed nowhere, which is neither visual nor cerebral; it’s a sort of constant euphoria.”
It may be ironic that a professor of “language studies” uses the term “amateur” as an insult, since he no doubt knows that for most of its life it meant someone who did something for love (Latin amo) rather than for money, and only acquired its dismissive sneer during the bureaucratic twentieth century. George Orwell, of course, was an “amateur” in the field of analysing political language, and even recommended that more of his regular work, book-reviewing, be done by “amateurs”:
Incidentally, it would be a good thing if more novel reviewing were done by amateurs. A man who is not a practised writer but has just read a book which has deeply impressed him is more likely to tell you what it is about than a competent but bored professional. That is why American reviews, for all their stupidity, are better than English ones; they are more amateurish, that is to say, more serious.
Of course, “amateurs” are not everywhere to be celebrated. I would not like to have root-canal surgery performed on me by an amateur dentist. Not everyone has a valid opinion on medicine. On the other hand, we are all language-users. Very many of the “amateurs” who have attended my talks on Unspeak think in careful and sophisticated ways about language, and their opinions are not to be dismissed simply because they haven’t had the right sort of academic training. My view, indeed, is that the analysis of language in politics is too important to be left to “professionals” who murmur among themselves in the diagrammatic glades of “discourse analysis” and other subdisciplines. Professor Salkie surely knows, to be blunt, that nowadays, “amateur” is most often the kneejerk insult of the salaryman who desires to protect his own turf.