Archive for April, 2005

Recently deceased people that I admire (also not about the pope)

I have a friend, who is Jewish and currently spending his year abroad at Hebrew University in Jerusalem, who last year shared with me his belief that James Joyce was the most successful of the literary modernists because he was the only one who found an authentic cultural identity for the artist and, by extension, modern man in society, an identity which, paradoxically, consisted in its very deracination from cultural moorings. In other words, while Yeats, for example, was fruitlessly trying to revive an authentic mythology, a spiritual grounding, Joyce’s evocation of the mythic is more inherently and self-consciously ironic, aware of the naivete of believing in such a project, and casting the absence of the mythic and the secure in the world today into even harsher relief. Regardless of the truth of this idea (I think Kafka far surpassed them all and actually succeeded in seamlessly fusing the primeval and the ultra-modern), the reason I mention this, and the reason my friend’s religion is pertinent, is that this sort of notion of a culture grounded in deracination is even more obviously characteristic of Judaism, and that may in fact have been the inspiration for his idea, consciously or not. I recently read an autobiographical work by Georges Perec entitled W ou le souvenir d’enfance (W or the Memory of Childhood), which is very much in this vein.
Perec was a small child in France during WWII and lost both his parents to it, one in battle and the other in the Holocaust, and his evocation of the time is extremely elliptical, composed, as he says, “d’absences, d’oublis, de doutes, d’hypothèses, d’anecdotes maigres” (“of absences, of forgettings, of doubts, of hypotheses, of meager anecdotes”), and the whole text is interspersed, between every chapter, with an account of an Olympic dystopia, a society devoted entirely to sports competition which slowly crescendos into pure concentration-camp horror and is intended, in my opinion, to show how at that time one’s personal past was being continually interrupted and overwritten by the catastrophes of the war, of history. The main themes are clear: silence, brokenness, the impossibility of any sort of recovery of the past. It’s very finely done, but my objection, as to much Holocaust art and literary modernism, is the barrenness of this sort of spiritual orphanage. This is not to deny the importance of the horrors of the past but simply to suggest that, in the case of both Joyce and Perec, there is something, if not disingenuous, then at least deliberately hopeless about the whole project. It seems to me that the irrevocable annihilation of the past is one of the premises of their work, and that in the end it becomes the exaltation of a dead end. In the linguistic minuteness and general incomprehensibility of both, esp. Finnegans Wake, this is almost literally true, as if the language has gone to the limit (and beyond) of expressiveness in this direction, and this is the end.

Which brings me to the thrust of things. Philip Roth has just published some correspondence with Saul Bellow in the New Yorker in which Bellow describes mainly the genesis of The Adventures of Augie March, his first major success. He was living in Paris when the inspiration came to him; I find many parallels between his situation and my own, and not only on that account. He felt, as I often do here, “that Europe was defying me to do something about it” and was consequently terribly depressed. And then, ah! inspiration: “I discovered that I could write whatever I wished…I did not have to kill myself in the service of art.” It seems that this disburdening is not purely personal, but also has larger cultural overtones, for, as he says, “That ‘Augie March’ happened in dismal postwar Europe (knowledge of the Holocaust was slowly coming to us back then) is evidence of an independent move of the mind, a decision not to surrender to horror. I discovered that I no longer wanted to be put upon by art seriousness.” This moment of inspiration at the end of a downcast sojourn in Paris seems to symbolically indicate, then, a break with an exhausted and morbid European culture, a discovery of an artistic vitality native to America and the English language. And there you have it: Bellow had discovered a new free language and mode of expression which was also implicitly an assertion of freedom, a step forward and away from the onerous legacy that extinguishes everything in Perec’s and so many others’ writings and thoughts.
It would no doubt be a great over-simplification to say that Europe has followed the cultural path of Perec and America that of Bellow, but it is comforting to find a soul from my culture who, in contrast to much of Europe’s (and possibly America’s) cultural élite, understands that Adorno’s belief that poetry after Auschwitz is barbaric is a “surrender to horror” and, more importantly, actually discovered a vital form of expression that could resist it, and resist it sensibly and animatedly, not tragically and despairingly. And this has made him practically the founder of post-war American literature.

Take that, William T. Vollmann!

Movie critic tries to delve into psychology of violent spectacle, doesn’t get very far. It seems to me that there may be a confusion of aesthetic and moral response here. Super-realistic violence is always going to be more fundamentally revolting than Kill Bill-esque carnage, but that doesn’t mean that the reaction is the fruit of some articulate moral principle. Holocaust art is probably where this divergence is most evident–the violence is by its nature atrocious, but people generally feel that the very portrayal of it is itself innately moral. But the issue arises from the fact that extremely realistic violence doesn’t always, or in everyone, induce the sort of generalized aversion that it does in pacifist movie critics. If we were to divide this sort of artistic production according to what sort of effect it tends to elicit, it might break down something like this: 1. realistic violence which induces both aesthetic and ethical recoil; 2. “stylized” violence which is so unrealistic that it may be exciting but at heart doesn’t evoke real violence; 3. realistic violence presented in a rather blasé fashion (i.e. movies with Bruce Willis in them); 4. realistic violence which tends to provoke the crusading spirit–Revenge! Justice! This might at first appear to correspond to an order of acceptability, with #1 having the benevolent effect of discouraging violence, all the way down to #4, which dangerously stirs people up. This is, however, questionable as a universal. The heroic aspect of violence, as facile as it may appear to our jaded culture, should not be underestimated.
I can’t believe that, even in the case of Holocaust art, provoking complete disgust and abhorrence of violence is desirable, any more than I can believe the slogan I saw in a park in London, near a statue of Winston Churchill no less, that said: “There has never been a good war or a bad peace.” Well, actually it might be true descriptively, but not prescriptively, because, as Edward Gibbon recognized, “peace cannot be honourable or secure, if the sovereign betrays a pusillanimous aversion to war” (emphasis mine), and the point could be generalized. The idea is that pacifism, paradoxically, does not even lead to non-violence, because it removes the restraints on the violent–bearing in mind, of course, that this truth can always be over-extended. Nevertheless, if the Holocaust and the related disasters of the war had only inspired a loathing of violence and a resolution not to engage in it, we would still be living in their thrall.

Like, whoa

One subject which is of more or less recurring interest to me is the philosophy of one Sayyid al-Qutb, sometimes identified as the philosopher of al-Qaeda (though he died in 1966), and certainly one of its intellectual forebears. Probably this subject interests me because, and I hope I am not to be misunderstood here, I feel that I can deeply understand its attractions. With my own tendency to romanticize the historical, the mystical and “divine certainties,” reading about him is almost like seeing a grotesque shadow of my own intellectual proclivities in some parallel universe. There’s a good article here, excerpted from a book by Paul Berman, himself a writer of whom I am rather fond, partly because he is one of the very few journalists currently writing on politics and current affairs with a good prose style.

I will simply note a couple of salient points with regard to al-Qutb at present. I think Berman is very definitely correct that al-Qutb’s philosophy ought to be studied and grappled with intellectually, no matter how obscure and even embarrassing it may seem to be arguing about Islamic theological issues after having, throughout our lives, either had no experience with them or been taught that they were nonsense. It is almost like an extremely advanced skier being exiled to the bunny hill to learn snowboarding. But while the initial steps may feel like a pointless indulgence in superstition, no one could deny the rather dangerous coherence of al-Qutb’s views. It seems to me that, rather than simply dismissing them as nonsense, if one cannot find compelling grounds for dissenting from them and for upholding one’s own views, then one’s own views are not worth upholding.

Coherence is of course not necessarily the measure of truth, however. I can well appreciate al-Qutb’s attack on the “fragmented” Western world-view, though I have my doubts that anyone’s point of view is not fractured to some extent. Furthermore, I find something rather naive and self-defeating in those, like Karl Popper, who in fact relish the uncertainties and unsettledness of science and a liberal world-view and then are perpetually surprised when people flee from it for the security of some blanketing dogma (the “escape from freedom”). What both they and al-Qutb seem to fail to apprehend is that, among those who cherish freedom, freedom is not, or ought not to be, the end-goal of human endeavor, but rather the beginning. No one can really cherish the nebulousness of total uncertainty and wavering, even though space should be allowed for dynamism and the change of values and ideas. Paul Nizan, for example, the communist writer I cited some weeks ago, after deploring the aimlessness and pointlessness of total freedom without any personal drive or motivation, writes: “La liberté est un pouvoir réel et une volonté réelle de vouloir être soi. Une puissance pour bâtir, pour inventer, pour agir, pour satisfaire à toutes les ressources humaines dont la dépense donne la joie” (Liberty is a real power and a real will to be oneself. A force to build, to invent, to act, to satisfy all the human capacities whose expenditure gives us joy). In other words, to counter the alleged emptiness of freedom he means to substitute a new notion, something along the lines of Kantian “positive freedom.” But he confuses the two elements: one may exert a positive constructive force in a condition of freedom, but the freedom remains the condition under which the action occurs, not the action itself or the force behind the action.

An openness to the possibility of an active, constructive human will operating in a condition of freedom, of the absence of external restraint, is similarly lacking from al-Qutb’s analysis. He diagnoses the split between mind and body, between church and state, that characterizes Western society as a whole, but does not allow for the possibility that that split may not necessarily be operant in the individual lives of all its constituent members. It seems to me that one might be entirely in agreement with al-Qutb on the need to heal that rift on an individual and even on a societal level and still not want some particular sect like Islam or Christianity to be imposed by the authority of a government or by him and his “vanguard” of true believers. But then again, at that point it all depends on whether you believe the Qur’an is the word of God or not. But that at least does seem to reduce al-Qutb’s considerable intellectual force to a simple matter of a leap of faith, and that is an achievement not unworthy of the effort. The reason I bring up Nizan and, by extension, the Western critics of the notion of freedom is that I fear that some might feel themselves driven, by the compulsive need to oppose a priori communism or Islamism or whatever else, to accept the terms of the debate that these critics impose, to accept al-Qutb’s or Nizan’s view of Western society as schizophrenic or aimless and then to actually defend schizophrenia and pointlessness! It is only when we open ourselves to the real merits of the case that the seeds of a defense are to be found, and if none is to be found then what, I repeat, is the point of the defense?

My favorite superstitions (surprisingly, not about the pope)

Equality as a political philosophy for the so-called “liberal left.” On one level it seems absurd because equality is obviously impossible in practice, but that’s true of most political ideas, and we can admit it provisionally as a regulative if not a normative ideal. But, on the other hand, I think I can condense my thought as to why equality is fundamentally bad and wrong-headed as a political ideal by making one simple point. The reason I make fun of utilitarians is that they treat an assumption or pre-condition of any social or political theory, that the greatest aggregate benefit for the members of the society should be sought, as itself a socio-political theory distinct from all the others. Equality, however, cannot be countenanced as the base for a socio-political philosophy precisely because it does not presuppose this need. In other words, a society where everyone has nothing is just as equal as one in which everyone has everything. The actual good of the members of society does not intrinsically enter the equation. The idea of equality is equally noxious on a moral and even on a selfish level because of the privileged place it gives to envy, but that is not the question here, so I can leave the explanation of that aside for now.

p.s. One could even argue epistemologically, à la Berkeley, about whether the notion of “equality” can even credibly be held to exist between any two things, but I’ll save that discussion for the real die-hard partisans of equality.

p.p.s. It must be admitted that the notion of equality enshrined in American political philosophy from the Declaration of Independence (all men are created equal) is likewise faulty, premised as it is on John Locke’s largely discredited tabula rasa view of the nascent human mind (soul).

Cultural Banach space

A week or so ago, Petya quoted the following definition of “heteronormativity” (March 29th entry):

Heteronormativity means, quite simply, that heterosexuality is the norm– in culture, in society, in politics. Heteronormativity points out the expectation of heterosexuality as it is written into our world. It does not, of course, mean that everyone is straight. More significantly, heteronormativity is not part of a conspiracy theory that would suggest that everyone must become straight or be made so. The importance of the concept is that it centers on the operation of the norm. Heteronormativity emphasizes the extent to which everyone, straight or queer, will be judged, measured, probed, and evaluated from the perspective of the heterosexual norm. It means that everyone and everything is judged from the perspective of straight. [Samuel A. Chambers: The Telepistemology of the Closet; or, The Queer Politics of Six Feet Under. The Journal of American Culture, Volume 26, Number 1, March 2003]

Around 5:00 AM on Sunday, I sent her a short response which she quoted and dissected (April 4th entry). In essence, I had two points, which may or may not be self-contradictory: (1) there’s no qualitative difference between “heteronormativity” and any other cultural norm; (2) “social norms” are basically a bullshit construct to begin with. Of course, given that I wrote the email very early on Sunday morning, I didn’t state either of these points particularly well (or, one might argue at least in the case of the second, at all).

I should say that the above-quoted definition/exposition of heteronormativity is basically correct; in a society in which the majority of people are (or, according to Kinsey et al., merely identify as) straight, it’s inevitable that, e.g., most people will, in the absence of additional information, assume people they’ve never met before are probably straight. This observation verges on the tautological. Of course, to extend the meeting-someone-new example, most people also assume that people they meet aren’t cannibals and do watch a fair amount of television, so it’s not at all clear that there’s anything particularly special about heteronormativity as opposed to non-cannibal-normativity or telenormativity.

At this point, I realize that someone whose brain works differently than mine might think I’m trivializing the whole heteronormativity thing with my examples; after all, cannibals are a damned sight rarer than homosexuals, and it’s probably not a good idea to worry too much about whatever psychic damage the non-cannibal norm is doing to them, and people who don’t watch television are unlikely to suffer anything worse than occasional social awkwardness when someone mistakenly assumes that any sentient American will identify and be amused by the phrase “I’m Rick James, bitch!” In contrast, it’s clear that people whose sexual proclivities/identities differ from the heterosexual norm (not just homosexuals) will, at the very least, have to deal with consistent low-grade and occasional acute psychological trauma, precisely because psychological health, self-perception and identity are so intimately related to sex and sexual preferences. In this context, however, please note my careful use of the word “qualitative” in the opening paragraph. The hetero-norm is no different in kind from other cultural norms, but, because sexual identity tends to be so important to our lives, the deleterious effects of such a norm tend to be felt more intensely than those of many other norms, and hence the difference is quantitative rather than qualitative.

By now, I hope, it’s clear that the statement “[t]he problem with heteronormativity is that it is hardly ever recognized as a norm” is entirely wrong. There are dozens of norms that are hardly ever recognized as norms; I mentioned two above, but other examples include the “bathing regularly” norm, the “education is good” norm, the “democracy is good” norm (intimately tied in with the “voting is good” norm), the “wearing clothes is good” norm, etc. Of course, all of these norms have been pointed out and dissected by various social critics, academics, etc., but, for practical purposes, they’re not “recognized as norms”. No, the problem (or, perhaps, importance) of heteronormativity derives not from its purported lack of recognition as a norm, but from the relatively greater importance people ascribe to their identity as sexual beings than to their identities as television viewers, bathers, students or voters.

In fact, the notion that “[t]he problem with heteronormativity is that it is hardly ever recognized as a norm” seems to be an expression (presumably unintentional) of the rather pernicious but increasingly popular assumption that norms qua norms are bad. Ironically, this is (or at least has the potential to be) itself a norm. Anyway, norms, in and of themselves, are value-neutral. Certainly anyone who finds him/herself on the wrong side of a norm isn’t going to think much of it, but this isn’t the whole story. For example, although there are people who legitimately suffer from the bathing-regularly-is-good norm (people allergic to soap, people without access to bathing facilities, etc.), I think even they would agree that, in general, it’s better if people aren’t going around smelling like mildew and passing along tinea to all their friends. Even more obvious examples of “good” norms are the one that says killing is bad or the one that tells a woman walking alone late at night in a bad part of town that young males in dark alleys should probably be avoided. The point is, norms in and of themselves aren’t necessarily bad; the ones that are bad are so because they do damage to those who violate the norm (and, one might reasonably argue, to everybody else as well, but this is a bit too psychological for me to deal with right now) without having any (or at least not enough) countervailing benefits.

Looking at the clock, I realize that I really should be going to bed and I haven’t even gotten to my second point, the fact that the two points may well be contradictory, the possibility that I may not even believe some of what I’ve written above, and my issues with Petya’s implicit dismissal of me as someone who does “not want to be bothered with conversations about exclusion, oppression, and -isms of all sorts.” All those will have to be left for another day; in the meantime, I’m curious to hear if anybody thinks the above is at all coherent.

Your daily sunshine

Regardless of people’s views on Iraq or whatever else, it is generally taken for granted, everywhere and not just in the U.S., that the finest moment ever in American foreign policy was World War II. Maybe, but I feel that there has been an insufficient amount of sobriety, let alone contrition, among Americans about the incineration of Dresden, Tokyo, Hamburg and the other cities in Germany and Japan that were devastated in 1945, not to speak of the atomic bombs, long after any doubts as to the outcome of the war had been resolved. That’s why it is always good to read articles like this one (by a German). I don’t know why I was so struck by it, since it does not present any general information of which I was previously unaware. But I think it is sentences like this one, which are always chilling, especially when they involve popular figures who are generally considered quite admirable:

“Churchill and Roosevelt unleashed with their 3,000 aeroplanes an “around-the-clock-bombing”, which Basil Liddell Hart, the greatest British military historian of his day, termed “the Mongol devastations”. Two thirds of the bomb tonnage of the five year air war fell in February, March and April of 1945, most of it on militarily insignificant targets. The tiniest part of this tonnage, the precision strikes on the 16 major train routes connecting the Ruhr region with the rest of Germany, had the greatest effect.”

This is not to negate the legitimate reasons which were alleged at the time and since to justify these actions; in the eyes of the author, and mine, the primary, and the most justifiable motivation was the idea that these were really the first military actions of the Cold War rather than the last of WWII, and even that they may have stopped the Russian army dead in its tracks, rather than continuing to advance until it ran into the possibly still-inferior (at that time) Allied armies. They may paradoxically have saved both Germany and Japan from Soviet occupation. But even if that, the most favorable and generous interpretation possible, is correct, it still seems that more Americans should acknowledge that the war was far from being an unmitigated moral triumph, especially considering that the military planners of these bombings seem to have had a near-total disregard for the relative military value of their targets. The simple fact is that between 50,000 and 100,000 died in each one of these raids, almost all civilians. We’ve been arguing for decades about whether the rape of Nanking constitutes genocide, and that involved about 300,000. Of course, it probably doesn’t help that the major documentation of these slaughters in the American cultural consciousness is Slaughterhouse Five. Apparently W.G. Sebald wrote a book about the subject before he died that made quite a stir in Germany, so maybe things will change, but I doubt it, at least until such a book is written by an American. And, a point that the author leaves implicit, if Churchill and Roosevelt were willing to authorize that kind of slaughter, the situation becomes even dicier when lunatics like Curtis LeMay are in leadership positions. And it’s pretty hard to disagree with the conclusion regarding the use of the nuclear bombs:

“Its first deployment went without a hitch. The know-how was there, and there was no alternative. Some people are probably still saying that.”

Oh heavens! the people they are so healthy!

I’m always amazed by Frank Furedi’s Nietzsche-like ability to make perfectly innocuous, even admirable concepts sound like curse words. For example, from this article on our culture’s fetishization of sickness:

“Governments today do two things that I object to in particular. First they encourage introspection, telling us that unless men examine their testicles, unless we keep a check on our cholesterol level, then we are not being responsible citizens. You are letting down yourself, your wife, your kids, everybody. We are encouraged continually to worry about our health. As a consequence, public health initiatives have become, as far as I can tell, a threat to public health. Secondly, governments promote the value of health seeking. We are meant always to be seeking health for this or that condition. The primary effect of this, I believe, is to make us all feel more ill.”

While I’m not sure that I would call examining one’s testicles introspection, it could be justly argued that men over, say, 50 doing so occasionally may prove more useful than the average afternoon brooding session. I don’t deny that the obsession with physical health does tend to induce the “normalization” of illness, nor that governments, especially (a point my brother has made before) in a system of at least partly government-run healthcare, take the prerogative of intruding more and more on our ability to make decisions regarding our health. However, that by no means entirely negates the value of “health-seeking” in general. Isn’t that, after all, what survival is?

But Furedi is one of these professorial Marxists for whom a single moment lost from re-distributing income in society is a total waste. It is just as extremist a view as that of the health nuts. This is a man who a year ago was bewailing the fact that people no longer identify themselves by their political faction, that they were searching for (gasp) “personal meaning.” And as intellectually solid as it may seem to moan about moral concepts of right and wrong, good and evil giving way to healthy and ill, I would say that the process is both overstated and, to the extent that it exists, somewhat of a move to re-ground those terminal abstractions in something concrete. For one thing, it’s not as if ethics was ever entirely divorced from survival concepts: at least two of the first five books of the Bible, for example, while couched in authoritarian moral-directive language, are pretty much pure health-and-lifestyle manuals (circa 2000 B.C.). I think Plato and Kant have exercised a deceptive influence, because considerations of personal and group well-being have been pervasive in ethics, whether it be hellfire in the afterlife for the Revivalist preachers or some sort of statistical well-offness for the utilitarians. So natural is the urge to connect morals and resultant well-being at some level that I was convinced until the age of 17 that Kant’s categorical imperative was founded on some sort of utilitarian principle.

Perhaps I’m one of the “morally illiterate,” but I find very few a priori moral directives very helpful anymore, and they always seem to be pierced by innumerable valid exceptions. In science, rules are assumed to be provisional and subject to revision; in ethics, they tend to override any other consideration. As a result, we wind up with completely deracinated dogmas, and even great minds like Kant deceiving themselves into believing that a utilitarian principle is actually an a priori one. If we actually take the concept of health seriously, but not only in a reductive physical sense, it might restore an element of clarity to our thinking which has been lost in the decline of archaic ethical structures like religions.

And finally, I am quite aware of the role of medicalization in reducing humans to a position of dependency and helplessness. This point has been made, from quite a different extremist angle, by the ultra-libertarian Dr. Thomas Szasz, for example. Then again, people have said much the same thing about, for example, evolutionary psychology. What Furedi, Szasz and Richard Lewontin for that matter don’t seem to realize is that the perceived loss of control over one’s own mind and body (although, as B.F. Skinner pointed out, the results of such a theory, if correct, cannot actually be called either a loss or a gain, since things were that way all along) is counter-balanced by the parallel development of technology that allows the manipulation of our biological systems. In other words, maybe someone’s “melancholy temperament” today would be diagnosed as mild schizophrenia, but on the other hand, hard on its heels may come a treatment to cure it. So just like anything else there are two sides to the issue, but it seems to me that if the premises of neurobiology are correct, humanity in fact has more control over the mind and over mental and emotional states today than ever before.

p.s. One might expect that intellectuals of Furedi’s ideological temperament would be more open to this trend. After all, it seems so recently (well, ok, it was 40 years ago, but I discovered it relatively recently) that Michel Foucault published Madness and Civilization to decry the wealthy and powerful élite shutting off and isolating the proles by labelling them as “insane.” Well, these days the élite are much more likely to identify themselves with the plebes by slugging a few Prozac with them. And if that’s not class solidarity, what is?

p.p.s. As evidence of the rather authoritarian and hierarchical basis of ethical norms, I might point out the near-total lack of development of a moral code regulating the interaction of nations, among which there is no body invested with supreme authority to impose such a thing, and whose behavior as a result, as has often been noted, strongly resembles that of myopic psychopaths. Many people would take this as evidence of the efficacy of ethical norms, at least on a group or general level, which I don’t necessarily disagree with. However, if it is just a matter of my own actions, I generally trust my own instincts more.