Wednesday, June 30, 2004
A new notion has finally arrived
A little fine writing can go a long, long way
Terry Eagleton
The Times Higher Education Supplement: 25 June 2004
Quintessentially English in form, the essay allows the writer to say a little while suggesting a lot, writes Terry Eagleton. Cynically speaking, an essayist is a writer who has no more than 20 pages' worth of knowledge of any particular topic. An essay, on this view, is a kind of verbal equivalent of a nudge and a wink: it intimates that you could say a good deal more than you actually do. While others drone on discursively, you confine yourself with winning modesty and exquisite tact to a well-wrought aperçu. The essayist is like the experienced drunk who knows that if he allows himself more than the odd pregnant comment the slurring in his speech will be instantly obvious. You can toss off an essay, but you cannot toss off an epic. This, to be sure, hardly accounts for a Bacon, Montaigne or Chesterton. Some of the most innovative philosophical interventions, by scholars such as Quine, Putnam and Davidson, have taken the form of essays. Milton's magnificent defence of free thought, Areopagitica, is hardly a full-blown book, neither are Shelley's Defence of Poetry or John Stuart Mill's On Liberty. The essay is a minor form that has often served major purposes. It is also a form for which the English have a special affection, given their nervousness of systematic thought. No subject is too lowly for it. You can write a whimsical few pages on dormice or pepper pots, as you could not easily write a play or a novel on such topics. This penchant for whimsy is another reason why the English have held the essay in high regard. In the hands of a Charles Lamb or a Virginia Woolf it has the quirkily idiosyncratic quality that the natives of this island particularly admire. It conveys the taste and texture of a uniquely individual mind, and the English love a "character" as much as they love a lord. If Byron had a fan club to outdo Mick Jagger's, it was largely because he was both. The anti-systematic nature of the form also made it popular with maverick Marxists, such as Walter Benjamin and Theodor Adorno, who used it to challenge the traditional theoretical treatise. This kind of essay is a conscious assault on the book, which is regarded as outmoded. Some modernist thinkers did not just produce books about deconstruction; they also sought to deconstruct the book. Benjamin dreamed of writing a book that consisted of nothing but quotations, just as Ludwig Wittgenstein wanted to write one consisting of nothing but jokes. What is often most striking about the essay is not what is said, but the quality of the mind that says it. This is why people will buy volumes of collected essays on a diverse range of subjects, not all of which any individual reader is likely to find engrossing. We buy them because we want a grandstand view of the mind of an Orwell or a Sontag at work, even if some of the topics they deal with interest us hardly at all. The essay is a supremely individualist mode - another reason why it is so popular among the spiritually privatised English. In its looseness of structure, impressionism and open-endedness, it is a quintessentially liberal form. As the word "essay" suggests, it is a tentative, trying-it-on sort of venture, the reverse of doctrinal or didactic. It is a Protestant rather than a Catholic phenomenon, one that flies a kite rather than lays down the law. 
This is why the sceptical, non-committal 16th-century French aristocrat Montaigne is its rightful progenitor, a man who chose the essay form not because he lacked more than several pages' worth of knowledge, but because he lacked more than a few pages' worth of certainty. There are two main branches of the form, which contrast sharply. On the one hand, there is the rambling, conversational, impressionistic piece that the English find most seductive, from Sir Thomas Browne to J. B. Priestly. It is a style of writing that sacrifices structure to detail, the analytical to the elegant. It is prose saturated in the atmosphere of an individual mind, which takes its time in getting to the point with all the genial unbuttonedness of an after-dinner speech. The other kind of essay is very different in tone. It is pithy, pungent and polemical, as in the writing of Swift or Hazlitt. In such hands, the essay becomes an activist mode, a partisan intervention on some pressing issue that cannot be postponed until such time as a book has rolled from the press. In fact, a lot of what we now regard as literature, and thus see as bathed in an aura of timelessness, first emerged in the form of pamphlets, essays or sermons in specific religious or political debates. The essay is a timely form, well adapted to local wrangles and instant ripostes. What really ensured its future, however, was the growth of periodical literature. In 18th-century journals such as Tatler and The Spectator, essays played a key role in the formation of middle-class morals and manners. Later they became the stomping grounds of the Victorian sage, where Carlyle, Arnold, Mill and George Eliot would debate moral and social issues. The periodicals redefined authors as journalists, able to throw off a hasty piece on whatever subject readers desired. The essayist had become the hack. From there, the essay graduated to the specialist journal, in the shape of critical articles and scientific papers. And this, for the most part, is how we encounter it today. The brief, well-focused piece seems peculiarly well adapted to an age in which the traditional gravitas of literature has given way to the instant consumability of newsprint. The aphorisms of the classical essayist live on as the soundbites of the columnist. The essay was always an evanescent form, with no illusions about its own immortality; and this is no doubt one reason why it lends itself to modern audiences. Another reason for its popularity is that the fragmentary, subjective and provisional - all qualities of the classical essay - have perhaps never been as obvious as in these relativist, tentative times. In this sense, the essay is a rather longer version of the recurrent "like" of the American teenager's speech, a word expressing an uncertainty that captures the postmodern sense that nothing can be said for sure. A rough metaphorical approximation is now the closest we can come to truth. (I have heard of a US professor who makes his students put a quarter in a glass for every "like" uttered in class, and who is growing extraordinarily well heeled on the proceeds.) The idiosyncrasy of the essay, a form that usually has "as I see it" as an invisible subtitle, chimes well with the postmodern sense that there are no longer any normative forms of knowledge, just partial, partisan ones. The essay is no longer a leisurely form of letters but a kind of cognitive capsule, to be hastily swallowed by those who take their knowledge, like everything else, on the hoof. 
It is a literary survivor, as epic and pastoral are not, but only because it is so adept at changing its functions to suit fresh needs. The death of the novel has been announced with farcical regularity, but never the demise of the essay. There will always be room for swiftly digestible think-pieces, not least in a society that finds discursive prose increasingly hard to handle. Complex plots and extended narratives are still with us, but nowadays they are known as movies rather than literary fictions. As for myself, writing essays and producing full-length books have never felt like very different kinds of activity. But this, I suspect, is because I enjoy writing as such, regardless of the genre involved. In fact, I enjoy it so much that I am embarrassingly overproductive, a problem about which it is naturally impossible to expect the least breath of sympathy from my colleagues. It is like complaining to the Savoy management that your suite has so many rooms it's impossible to find your way around. Terry Eagleton is professor of cultural theory and John Rylands Fellow at Manchester University. The latest symptom of his overproductivity is The English Novel: An Introduction, published by Basil Blackwell. The Times Higher and Palgrave Macmillan humanities and social sciences essay-writing prize is launched this week. |
Hazem Azmy
Friday, June 25, 2004
Is this what makes us go 'here we go'?
Karen Gold
The Times Higher Education Supplement: 25 June 2004
Natural feeling versus political programme, modern versus ancient, the historians can't reach agreement on what causes nationalism. Karen Gold reports. "More than 1,000 years before the arrival of Slavs, in the 6th century AD, the lands east of the Adriatic were the home of peoples known to the ancient world as Illyrians, the precursors of the present Albanians." So runs the history of Kosovo on the web pages of the Albanian Liberation Peace Movement. The Serbian Ministry of Information's website tells a different story: "The Serbs have been living in the territory of Kosova and Metohija since the 6th century. That territory was the centre of Serbian statehood, an inalienable national treasury, indispensable for the identity of the Serbian people." Nationalist feeling comes as naturally to us as breathing, according to Gottfried von Herder, the German philosopher. At the end of the 18th century, full of Romantic sentiment and liberal politics, he coined the terms Nationalismus and Volk and put forward the argument that nationalism was an organic entity in nations, embodied in language and culture, and existing, consciously or unconsciously, whether anyone wanted it to or not. We have Herder to thank for the Brothers Grimm, whom he inspired to collect fairy tales and folklore, the cultural expression of the Volk. It was really only in the 20th century that historians began to suggest that a belief in nationalism as a natural state was useful to political leaders and/or elites who wanted to persuade people to act in a unified way. Historians have focused on key periods and episodes to argue this case, pointing, in particular, to the French Revolution, whose leaders purveyed notions of La Patrie and a standardised French in the hope of uniting a scattering of peasants speaking different dialects across the newly liberated land. They have traced the way the 19th-century Napoleonic and British empires aroused resentment while, in Europe at least, failing to fill the faith gap vacated by religion and divinely appointed rulers. Resistance leaders in an unbroken line from Italy's Guiseppe Mazzini, through Ireland's Daniel O'Connell, to India's Mahatma Gandhi, this argument goes, all saw nationalism as a way of uniting and inspiring the disaffected masses to reject their oppressors in the name of creating, or recreating, a nation. Some historians, such as Elie Kedourie of the London School of Economics (the LSE has been rich soil for nationalism theories), have argued that the main reason why nationalism spread in these circumstances was because it was an ideology that appeared in the right place at the right time. Alienated peoples were shown it, and they bought it. Marxist and proto-Marxist thinkers, in particular Eric Hobsbawm and Ernest Gellner, pre-eminent nationalism theorists for the past 20 years, were more sceptical. (Marx himself virtually ignored nationalism as a horizontal distraction from vertical class conflict.) The anthropologist Gellner argued that nationalism arose out of pressures created by the Industrial Revolution, when people from different backgrounds, speaking different dialects, converged on the city and had to be welded into a literate and retainable workforce. So the state created a common language, a common past and a common culture for them. This view of nationalism as originating from above was criticised by Hobsbawm as inadequate, even though true. 
The artefact of nationalism cannot be understood without understanding the assumptions, hopes, longings and interests of ordinary people under capitalism, he argued. The myths and histories that nations created, about themselves and each other, spread only because the working class needed to believe in them. Why did they need to? Answers to that question take us into an entirely different realm of explanations. Sociobiologists have argued for a genetic predisposition to nationalism: members of a group who believe they have a claim to their own territory are likely to defend it more successfully than those who doubt it, or even hold high-minded principles about sharing it with others. Cultural primordialists, such as the US anthropologist Clifford Geertz, hold a similar position, arguing that territory and kinship are inescapable cultural givens. Psychologists have also put forward related theories, positing a universal tendency for people to consider other groups less important than their own and to form stereotypes about them. Horror stories that circulate about other nations - from competing cruelties in Kosovo, to unsubstantiated rumours of mass rapes and killings attributed by both sides in the First World War, to the ancient blood libel believed of the Jews - seem to substantiate this argument. But historians have criticised it as problematic. Anthony D. Smith, professor of nationalism and ethnicity at the LSE, says: "The trouble is that psychologists tend to equate nations with groups. But there are many groups in the world, and they are not all necessarily nations... And there are some nations where everyone doesn't even speak the same language, like Switzerland. Ideas about groups really don't get to the specificity of nationalism." Much of the debate about nationalism's longevity, and therefore what causes it, depends on definitions. Should it be defined as a specific political programme or a more cultural movement? Smith believes that nationalism's roots are in culture. That makes him a "perennialist" - someone who believes that nationalism precedes the 18th century, though in less sophisticated forms. He suggests that foreshadowing the modern nation are "ethno-symbols", constructed on language and a vernacular literature, but also on less obvious elements: memory, value, myth, symbolism and landscape. The idea of nationalism may be modern, but its roots are in a distant shared past, he argues. Another perennialist, Adrian Hastings, the late Leeds University theologian, pointed to 14th and 17th-century England as periods when national identity was particularly strong; others argue that the Jews and Armenians sustained powerful national identities over millennia. Mazzini, formulating 19th-century Italian nationalism against Napoleonic France, passionately defended the cultural roots of his movement: "They (Italians) speak the same language, they bear about them the impress of consanguinity, they kneel beside the same tombs, they glory in the same tradition, they demand... to contribute their stone to the great pyramid of history." Ironically, there was virtually no public or academic interest in the roots of nationalism before and after its florid expression in two world wars. Instead, peak times for academic exploration of nationalism's causes have been the 1960s, prompted by African and Asian independence, and the late 1980s and early 1990s, after the break-up of the Soviet Union and Yugoslavia. 
The argument here has been whether nationalism is the cause or the product of the break-up of old states and the creation of new ones. Rogers Brubaker, the US sociologist, for example, has argued that the organisation of the Soviet Union into component parts was what taught people to think of themselves as Lithuanian or Ukrainian. In contrast, Michael Hechter, in Containing Nationalism, and Miroslav Hroch, the Czech political theorist, point to social and linguistic ties and to a memory of a common past as the trigger for claiming nationhood - though still prompted by historical circumstance rather than any organic drive. Recently, sociologists, particularly on the left, have debated to what extent a new nationalism is appearing in Europe, expressed as anti-immigrant feeling, and how far it is driven by identity or by class interests. The fundamental issue is the same as that between Kedourie, Gellner and Hobsbawm: to what extent is this nationalism attributable to a shared ideology, to frustration among individuals who seek an identity having been disappointed by what the 20th-century state offers them, or to pressures from above to stand together and conform. Whatever the case, this is nationalism within identified nations. Nationalism among people who are not yet nations has another modern cause, historians argue, which is that today's world structures will hear people only through the representation of a nation-state. "Once almost the whole world is organised into nation-states then if you want to be recognised as a legitimate entity you have to be a nation-state," says David Bell (see below). "So it is almost inevitable that nationalism will follow." It is almost inevitable but not entirely. The urge to create, or recreate, nations in the dismantled Soviet Union was not uniformly strong, he says. The Russian Federation remains a federation. In central Asia, religious forces seem to be exerting a more powerful pull than nationalist ones. So are we about to see a decline in nationalism across the world, to be replaced by globalisation or religion? The way people see the future of nationalism depends very much on the explanations they credit for its past. Smith says: "If you think nationalism is a given in history, then you will think it is going to be around for a long time. If you think it is a completely modern phenomenon, then you might or might not think it is going to pass more or less quickly away." |
Hazem Azmy
Thursday, June 24, 2004
Lowest morals, highest office (review of American Dynasty)
Hazem Azmy
What not to wear if you want to get on
Caroline Evans
The Times Higher Education Supplement: 18 June 2004
As fashion students put their cutting-edge designs on show, many faculty remain sartorially challenged. How long before academics are told: that blazer is so last century - and so is your job? Caroline Evans ponders the effects of equating dress sense with pedagogic sense. Not long ago in The Times Higher, Valerie Atkinson asked what it might mean for academics to "modernise" according to government initiatives. Might they, she asked, deliver their lectures in rap, dressed in clothes bought in this century? The equation is one that even I can compute: academic + chic = oxymoron. An academic whose dress sense is too sharp may be considered suspect, but the point itself is academic because there are no chic academics. That's the received wisdom, but it's not always true, particularly among American celebrity academics. Andrew Ross of New York University made it into The New York Times in 1991 by wearing a yellow Comme des Garcons jacket that, he said, was a send-up of the academic male convention of wearing yellow polyester. (I have never seen a British academic clad in yellow polyester, but if I did I would argue for his modernisation, if not instant liquidation.) Jane Gallop, professor of English and comparative literature at the University of Wisconsin-Milwaukee, aka "the poststructuralist Mae West", vamps it up on the lecture circuit - she once gave a lecture on psychoanalysis and the phallus dressed in a skirt made of men's ties stitched together. And Wayne Koestenbaum, professor of English at the Graduate Centre of City University New York, has been known to choose his cologne to suit the author whose work he is teaching. Most of us are not such aesthetes, and most academics are a nondescript lot. Elizabeth Wilson, author of the uber-text on fashion and modernity, Adorned in Dreams, once noted that in her university secretaries were smart and fashionable whereas academic staff dressed as if they'd been dragged through a hedge backwards. You may say this is as it should be; our minds should be on higher things and our inattention to dress signals this - shabbiness connotes seriousness whereas high fashion is frivolous. But "dress down", the academic equivalent of "dress for success", can also be a form of inverted snobbery. After all, you don't want to look like management. Academic dress is also a badge of affiliation. Walking once from the Birmingham Centre for Contemporary Cultural Studies into the senior common room for the whole university, Wilson noted that "whereas I and my companions were all dressed in an early 1980s cross between punk and new romantic, the 'normal' academics in the lounge were all wearing blazers and flannels, outfits I hadn't seen since my father died in 1963". This is the sad truth about the academic wardrobe at its worst. It is more Terry Thomas than tweed. Would that it did still consist of the shabby-genteel corduroy, leather elbow patches, pipes and brogues so bitingly satirised by Kingsley Amis in Lucky Jim. Even the dashing braggadocio of Malcolm Bradbury's History Man would have a certain 1970s retro appeal. Instead, we have polyester and power suits, chinos and smart casuals. The dress historian Christopher Breward says that "the mediatisation of academics generally has upped the ante across the academic spectrum. Roy Porter was as notorious for his medallions as Simon Schama is for his leather jacket - image becomes a kind of signature".
Tweed survives only in a few obscure habitats where it has returned to nature; a few feral examples exist in the Institute of Historical Research. Elsewhere, it has modernised unexpectedly, its olde worlde charm a passport to the new world of fetish wear, where it is included in the dress code at "Night of the Cane" evenings, as in "fetish, school uniform, academic, evening dress and military". None of this would matter had postmodernism not made cultural commentators of us all. Styles may shift, but academics' reputation for bad dress sense never changes, and no one is keener on anatomising this than academics themselves. Students take to the internet in droves to dissect their professors' clothing choices. Soon, however, the academics turn their excoriating gaze to themselves. Masochists to a man, they are hardest on their own disciplines. A social scientist says his male peers in ill-fitting blue suits occupy "the very bowels of hell". The professors are at it, too. In a colloquy on academic dress in the US Chronicle of Higher Education in 1998, the advice columnist and author of Ms Mentor's Impeccable Advice for Women in Academia recommended sartorial experimentation for only the eccentric, secure and tenured. For the rest, "dispiriting conformity". Ms Mentor - in real life Emily Toth, an English professor at Louisiana State University - argued: "If you don't know how to dress, what else don't you know? Do you know how to assess students or grade papers? The clothes are part of the judgement of the mind." But when I look at a photo of Toth, witty as her prose is, I'm not sure I like her blouse - sorry, her mind. On the other hand, I do like Jane Gallop's argument that style, far from being superficial, is, as any literary scholar will tell you, often the best way to convey complicated ideas. As Gallop says: "You should use everything you have to make people think." I have never seen her picture, and I think I had better not in case it turns out that I don't like her clothes, either. (Not that I'm shallow or anything.) Should we follow the Americans' lead and modernise, and, if so, why? Isn't the inalienable right to dress badly one of the precious few perks in academia? Of course, we might have no choice but to spruce up if policy-makers heed the French academic Gilles Lipovetsky. He argues that fashion trains us to be adaptable and produces exactly the sort of "kinetic, open personality" that higher education needs, able to reinvent the educational wheel on a weekly basis. If dress sense became a measure of pedagogic sense, the sartorial equivalent of academic league tables would be required. It is in the nature of fashion - and fashionable people - to change constantly, so sartorial quality assurance inspections and research assessment exercises would have to be quite frequent - twice a year, in fact - to keep up with the fashion seasons. The spring and autumn sartorial quality assurance inspections could take place during London Fashion Week in February and September; the altogether classier biannual RAEs could be staggered to correspond with the Paris couture shows in January and July. How would we be assessed? One suggestion is academic reality TV, "a sort of Queer Eye for the Straight Guy" for academics. And why not? We have peer review for our articles, we have quality assurance for our teaching; next stop, reality TV for our dress codes. Trial by Television, the sartorial equivalent of peer review, will maintain dress standards. But who will be the judges?
Trinny and Susannah, or David Starkey? I'm not sure whose victim I'd least like to be. Best not go there. Try something different. In Big Rector, academics are clustered together in a conference venue on the banks of the Bosphorus. There are spats, rivalries, alliances and dalliances. The losers, voted out of the lecture hall one by one for their dress deficiencies, hope for lucrative book deals and media opportunities... not that different from a normal day, then. Caroline Evans is reader in fashion history and theory at Central Saint Martins College of Art and Design (University of the Arts London).
Saturday, June 19, 2004
Authors receive guide to the dark art of shelf-promotion
Phil Baty
The Times Higher Education Supplement: 18 June 2004
One of the world's most prestigious academic publishers has been accused of promoting the use of "dirty tricks" in a document advising authors how to gain a competitive advantage over their rivals.
A document called Ten Sneaky Tips for Marketing your Book, sent out unofficially to some Routledge authors, includes advice on how to manipulate the customer review system of the popular internet bookseller, Amazon.co.uk.
As well as innocuous tips, such as ensuring books are displayed more prominently than those of rivals in shops, the guide advises: "Find your book on Amazon and get friends, colleagues, people that owe you money to put up a glowing review.
"Better still (and if you are of a slightly unethical persuasion), ask them to put a review up on a competing book's page that claims that while this book is very good, xxx (by yours truly) may well be a wiser purchase.
"Amazon will automatically set a link up to your own book from the review, and this will help persuade punters to buy your book instead. Of course, guerrilla tactics like this should be done with discretion and are not for those prone to feelings of guilt!"
One author who received the guide said: "I was quite shocked when I received this guidance among the usual package of marketing material. These kind of dirty tricks seem a further example of the erosion of ethics in the academic sphere."
Routledge is part of the stock-exchange listed Taylor and Francis group, which was founded in 1798. It publishes in the region of 1,800 academic books a year and has a specialist backlist of 20,000 books.
Roger Horton, Routledge managing director, said: "This communication came to my notice only last week. It was produced by one individual and had not been approved by me or any management. It is not a Routledge document and we do not endorse this.
"It was an isolated case and it has now been withdrawn."
A spokeswoman for Amazon told The Times Higher that it had spoken to Routledge and it accepted that the document was not endorsed by the publisher.
She said: "Amazon.co.uk has always believed that customer reviews provide a unique forum for discussion between our various audiences. In general, the honour system on which it operates is observed.
"Amazon.co.uk is a community of customers and, as such, we rely on those customers to let us know if they think that a review or a comment is not appropriate.
"Amazon.co.uk has specific guidelines for all customer reviews posted on our site. We operate the industry standard system of self-regulation known as 'notice and take down' - that is, we will immediately investigate any complaint we receive about comments on our site and, if appropriate, remove them."
It is unclear how widely the guide was distributed.
Thursday, June 10, 2004
Review of Mouse or Rat - Eco's book on translation
A scholar with the gift of tongues who can smell a mouse in a text full of rats
Shusha Guppy
The Times Higher Education Supplement: 11 June 2004
In the confrontation between Hamlet and his mother, Gertrude, (Act III, Scene iv), when a sudden noise indicates that someone is eavesdropping, Hamlet exclaims: "How now? A rat?" and plunges his sword through the arras to kill the intruder, believing him to be his uncle Claudius, his father's murderer. Instead he discovers that he has killed Polonius, his fiancée's father. Umberto Eco takes Hamlet's exclamation as an example of the difficulty in choosing the right equivalent for a seemingly simple word, rat, in various European languages, all derived from the Latin root rattus. The trouble is that these languages also have mouse, from the Latin mus. The obvious difference between the two rodents is size - a rat is larger than a mouse - but the nuances and connotations relating to them vary in different languages. So which one is right? Eco proposes that in Italian, topo (mouse) is correct, for that is what an Italian Hamlet would say: "Cosa c'e? Un topo?" By contrast in Camus' The Plague, when one morning the protagonist, Dr Rieux, finds "a dead rat" on his doorstep, ratto must be used, for mice do not carry the plague, rats do. In both examples, and in translation generally, one does not rely on "definitions provided by dictionaries... but negotiates which portion of the expressed content is strictly pertinent in that given context". Thus all translation is negotiation, a give-and-take "process by virtue of which in order to get something, each party renounces something else". Based on a series of lectures given in Toronto and Oxford, Mouse or Rat? is a collection of essays on the difficulties, pitfalls, joys and rewards of translation. Eco draws on his own experiences as translator and editor, and on working with translators of his own work. As always he dazzles with his vast erudition, charm and humour. He believes that the translator's first "ethical obligation (is) to respect what the author has written... what he means". But what is meaning? Eco's definition is "that which remains unchanged in the process of translation". Thus, to translate an idiom, the translator must find an equivalent in the "destination language", and in so doing he may have to become "literally unfaithful" to remain "truly faithful to the meaning" of the original text. Eco gives examples from his own books that have baffled translators. He admits to sometimes playing linguistic games that are "ostensibly erudite" - a whole page listing the variety of tramps and vagabonds in The Name of the Rose, or allusions to a poem by Leopardi in Foucault's Pendulum, for which the English and German translators had to find equivalents in Keats and Goethe. To illustrate the basic problem of "meaning", Eco submits the first chapter of the Book of Genesis in English to AltaVista, the internet search engine, asking it to translate the text into Spanish and back again into English. The result is hilarious: "the spirit of God moved on the face of the waters", becomes "the alcohol of God moved on the face of the waters", for "AltaVista has merely a list of correspondences, like the Morse Code", and cannot "contextualise" or know that spirit is different used in a church from in a pub. So first there are differences between tongues, then between texts, "a very crucial point". 
Of equal importance to meaning is context: "In order to translate, one must know a lot of things, most of them independent of mere grammatical competence", for "translation does not only concern words and language in general, but also the world, or at least the possible world described by a given text". Sometimes translation changes and enriches the "recipient language", compelling it to "express thoughts and facts that it was not accustomed to express": the translations of American writers - Hemingway, Faulkner, Dos Passos - changed "the narrative styles of Italian and French writers after the War", while Heidegger's "radically changed the French philosophical style". Hence the need for "updating" certain key texts as language subtly changes over time. In the first line of Dante's sonnet to Beatrice - "Tanto gentile e tanto onesta pare..." - the words "gentile" and "onesta" had different meanings in the poet's time from today, and English translators have used various words over the centuries to convey the original intention. Literary devices such as hypotyposis - "the rhetorical effect by which words succeed in rendering a visual scene" - and ekphrasis - when a verbal text describes a visual work of art - are straightforward, except with modern writers who, like Proust in A la Recherche du Temps Perdu, practice "occult ekphrasis": describing a work of art without giving clues. This has given Eco another opportunity "to play frequent games with paintings" - for example by Vermeer in The Island of the Day Before - hoping that "the cultivated reader" would have the thrill of recognising his sources. Verse translation has the added problems of rhyme and rhythm. Eco illustrates the difficulty with English translations of The Divine Comedy: some translators have gone through endless contortions to keep the rhyme and the metre, while others have found the price "disastrously high", and achieved greater fluency without them. When translation is from one medium to another it is called adaptation - as when a novel is turned into a film - or transmutation when a text serves as a basis for a ballet, or a symphonic poem, as in Debussy's Pelléas et Mélisande after Maeterlinck's play, and Prélude pour l'Aprés-midi d'un Faune, based on Mallarmé's poem. At their best "such transmutations serve to help appreciate the inspiring work better". In conclusion, Eco returns to the Pentecostal "gift of tongues", reversing "the defeat of Babel". His own gift of tongues makes the reader of his brilliant book understand the enriching pleasure derived from the knowledge of languages. "When one learns a foreign language, one acquires a new country," Goethe said. Eco has made half a dozen countries his own. Shusha Guppy is London editor, Paris Review. |
Hazem Azmy
Saturday, June 05, 2004
Why don't scholars admit that holy war means war?
Daniel Pipes
The Times Higher Education Supplement: 03 October 2003
Jihad is a fundamentally martial concept, but, Daniel Pipes says, most US specialists on Islam paint it as a struggle for self-improvement and social justice and, in doing so, camouflage the very real armed threat it poses. In June 2002, the faculty of Harvard College selected a graduating senior named Zayed Yasin to deliver a speech at the university's commencement exercises. When the title of the speech - My American Jihad - was announced, it quite naturally aroused questions. Why, it was asked, should Harvard wish to promote the concept of jihad - or "holy war" - just months after thousands of Americans had lost their lives to a jihad carried out by 19 suicide hijackers acting in the name of Islam? Yasin, a past president of the Harvard Islamic Society, had a ready answer. To connect jihad to warfare, he said, was to misunderstand it. Rather: "In the Muslim tradition, jihad represents a struggle to do the right thing." His own purpose, Yasin added, was to "reclaim the word for its true meaning, which is inner struggle". To be sure, Yasin was not a scholar of Islam, and neither was the Harvard dean, Michael Shinagel, who endorsed Yasin's "thoughtful oration" and declared in his own name that jihad is a personal struggle "to promote justice and understanding in ourselves and in our society". But they both accurately reflected the consensus of Islamic specialists at their institution, such as David Mitten, a professor of classical art and archaeology as well as faculty adviser to the Harvard Islamic Society, for whom true jihad is "the constant struggle of Muslims to conquer their inner base instincts, to follow the path to God and to do good in society". Harvard's scholars are not exceptional in this regard. As I discovered through an examination of media statements by university-based specialists, they tend to portray the phenomenon of jihad in a remarkably similar fashion -the portrait, however, happens to be false. A survey of two dozen scholars' contributions to US newspapers and television discussions (and specifically not their scholarly writings) shows that only one speaks candidly about jihad. Hamid Algar of the University of California, Berkeley, scorns the mealy-mouthed post-9/11 apologetics of his colleagues, whereby jihad is "redefined as a form of self-improvement -like kicking tobacco was an act of jihad". Algar is refreshingly honest about his religion: "People keep saying: 'Islam is a religion of peace.' Well, that's true, but not under all circumstances." Of the other professors, just four admit that jihad has some military component and even they, with but one exception, insist that this component is purely defensive in nature. Thus, John Esposito of Georgetown University, Washington DC, perhaps the most visible academic scholar of Islam in the US, holds that "in the struggle to be a good Muslim, there may be times where one will be called upon to defend one's faith and community. Then [jihad] can take on the meaning of armed struggle". To half a dozen scholars in my survey, jihad may indeed include militarily defensive engagements, but this meaning is secondary to lofty notions of moral self-improvement. Charles Kimball, chairman of the department of religion at Wake Forest University, Winston-Salem, North Carolina, puts it succinctly: Jihad "means struggling or striving on behalf of God. The great jihad for most is a struggle against oneself. The lesser jihad is the outward, defensive jihad." 
But an even larger contingent - nine of those professors surveyed - deny that jihad has any military meaning at all. For Farid Eseck, professor of Islamic studies at Auburn Seminary in New York City, jihad is "resisting apartheid or working for women's rights". Finally, there are those academics who focus on jihad in the sense of "self-purification" and universalise it, applying it to non-Muslims as well as Muslims and treating it as something all Americans should admire. Bruce Lawrence, a professor of Islamic studies at Duke University, Durham, North Carolina, holds that non-Muslims should cultivate "a civil virtue known as jihad". This accumulated wisdom of the scholars suggests that Osama bin Laden had no idea what he was saying when he declared jihad on the US several years ago and then repeatedly murdered Americans in Somalia, at the US embassies in East Africa, in the port of Aden, and then in New York and Washington DC on 9/11. It implies that organisations with the word "jihad" in their titles are grossly misnamed. But it is bin Laden, the jihad organisations and jihadists worldwide who define the term, not a covey of academic apologists. More important, the jihadists' understanding of the term is in keeping with its usage over 14 centuries of Islamic history. In pre-modern times, jihad meant mainly one thing among Sunni Muslims, then, as now, the Islamic majority. It meant the legal, compulsory, communal effort to expand the territories ruled by Muslims at the expense of territories ruled by non-Muslims. In this prevailing conception, the purpose of jihad is political, not religious. It aims not so much to spread the Islamic faith as to extend sovereign Muslim power (though the former has often followed the latter). The goal is boldly offensive, and its ultimate intent is nothing less than to achieve Muslim dominion over the entire world. As for the conditions under which jihad might be undertaken - when, by whom, against whom, how war is declared, how it is ended and so on - these are matters that religious scholars worked out in excruciating detail over the centuries. But about the basic meaning of jihad - warfare against unbelievers to extend Muslim domains - there was perfect consensus. Jihad was no abstract obligation through the centuries but a key aspect of Muslim life. According to one calculation, Muhammad himself engaged in 78 battles, of which just one (the Battle of the Ditch) was defensive. Within a century of the prophet's death in 632, Muslim armies had reached as far as India in the east and Spain in the west. Important victories in subsequent centuries included the 17 Indian campaigns of Mahmud of Ghazna (971?-1030), the battle of Manzikert opening Anatolia (1071), the conquest of Constantinople (1453) and the triumphs of Uthman dan Fodio in West Africa (1804-17). In brief, jihad was part of the warp and woof not only of pre-modern Muslim doctrine but of pre-modern Muslim life. That said, jihad also had two variant meanings over the ages, one of them more radical than the standard meaning and one quite pacific. The first, associated mainly with the thinker Ibn Taymiya (1268-1328), holds that born Muslims who fail to live up to the requirements of their faith are themselves to be considered unbelievers, and so legitimate targets of jihad. This tended to come in handy when (as was often the case) one Muslim ruler made war against another; only by portraying the enemy as not Muslim could the war be dignified as a jihad.
The second variant, usually associated with Sufis, or Muslim mystics, was the doctrine customarily translated as "greater jihad" but perhaps more usefully termed "higher jihad". This invokes allegorical modes of interpretation to turn jihad's literal meaning of armed conflict upside-down, calling instead for a withdrawal from the world to struggle against one's baser instincts in pursuit of numinous awareness and spiritual depth. But as Rudolph Peters notes in his authoritative Jihad in Classical and Modern Islam (1995), this interpretation was "hardly touched upon" in pre-modern legal writings on jihad. In the vast majority of pre-modern cases, jihad signified one thing only: armed action against non-Muslims. Things today are somewhat more complicated, as Islam has undergone contradictory changes resulting from its contact with western influences. Muslims having to cope with the West have tended to adopt one of three broad approaches: Islamist, reformist or secularist. Secularists (such as Kemal Atatürk) reject jihad in its entirety. Islamists, besides adhering to the primary conception of jihad as armed warfare against infidels, have also adopted Ibn Taymiya's call to target putatively Muslim rulers who fail to live up to or apply the laws of Islam. Reformists reinterpret Islam to make it compatible with western ways. They have worked to transform the idea of jihad into a purely defensive undertaking compatible with the premises of international law. This approach, characterised in 1965 by the Encyclopedia of Islam as "wholly apologetic", owes far more to western than to Islamic thinking. In our own day, it has evolved into what Martin Kramer has dubbed "a kind of Oriental Quakerism", and it, together with a revival of the Sufi notion of "greater jihad", is what has emboldened some to deny that jihad has any martial component whatsoever, instead redefining the idea into a purely spiritual or social activity. For most Muslims today, it is the classic notion of jihad that resonates. This goes far to explain the immense appeal of a figure such as bin Laden after 9/11. Yasin may have assured his Harvard audience that "jihad is not something that should make someone feel uncomfortable," but this concept has caused and continues to cause not merely discomfort but untold human suffering. Islamists seeking to advance their agenda in western, non-Muslim environments -for example, as lobbyists in Washington -cannot frankly divulge their views and still remain players in the political game. To avoid arousing fears and isolating themselves, these individuals and organisations usually cloak their true outlook in moderate language, at least when addressing the non-Muslim public. When referring to jihad, they adopt the terminology of reformists, presenting warfare as decidedly secondary to the goal of inner struggle and social betterment. Such talk is pure disinformation, reminiscent of the language of Soviet front groups in decades past. An example of it was on offer at the trial of John Walker Lindh, the Californian teenager who went off to wage jihad on behalf of the Taliban regime in Afghanistan. At his sentencing in October 2002, Lindh told the court that, in common with "mainstream Muslims around the world", he understood jihad as a variety of activities ranging "from striving to overcome one's own personal faults, to speaking out for the truth in adverse circumstances, to military action in the defence of justice". 
That a jihadist caught in the act of offensive armed warfare should unashamedly proffer so mealy-mouthed a definition of his actions may seem extraordinary. But it is perfectly in tune with the explaining-away of jihad promoted by academic specialists and Islamist organisations engaging in public relations. For use of the term in its plain meaning, we have to turn to Islamists not so engaged. Such Islamists speak openly of jihad in its proper, martial, sense. Here is bin Laden: Allah "orders us to carry out the holy struggle, jihad, to raise the word of Allah above the words of the unbelievers". And Mullah Muhammad Omar, the former head of the Taliban regime, exhorting Muslim youth: "Head for jihad and have your guns ready." It is an intellectual scandal that, since 9/11, scholars at US universities have repeatedly issued public statements that avoid or whitewash the primary meaning of jihad in Islamic law and Muslim history. It is as if historians of medieval Europe were to deny that the word "crusade" ever had martial overtones, instead pointing to such terms as "crusade on hunger" or "crusade against drugs" to show that the term signifies an effort to improve society. Among today's academic specialists who have undertaken to sanitise this key Islamic concept, many are no doubt acting out of the impulses of political correctness and the multiculturalist urge to protect a non-western civilisation from criticism by making it appear just like our own. As for Islamists among those academics, at least some have a different purpose: they are endeavouring to camouflage a threatening concept by rendering it in terms acceptable within university discourse. Westerners struggling to make sense of the war declared on them in the name of jihad have every reason to be deeply confused as to who their enemy is and what his goals are. Even people who think they know that jihad means holy war are susceptible to the combined efforts of scholars and Islamists brandishing notions such as "resisting apartheid" or "working for women's rights". The result is to cloud reality, obstructing the possibility of achieving a clear, honest understanding of what and whom we are fighting, and why. This is an edited extract from a new chapter in Daniel Pipes' book Militant Islam Reaches America, published by WW Norton on October 14, £9.99. |