24 December 2021

Lasch—To Postpone a Reckoning


Christopher Lasch
The World of Nations (1973)

Ch. XVIII, "Birth, Death, and Technology: The Limits of Cultural Laissez-Faire"
The prevailing image of technological utopia begets the counter-image of technological nightmare—the appalling vision of a scientific totalitarianism, embodied in such anti-utopian novels as Aldous Huxley's Brave New World and George Orwell's 1984. On the one hand we have a greatly exaggerated faith in the ability of science to solve all the material problems of life, and an exagger-
[301]
ated idea of the autonomy of science and technology as determining forces in history; on the other hand, these inflated estimates of the power of science give rise to a hysterical fear of scientific dictatorship. This fear, precisely because it is cast in the form of an anti-utopian vision of the future, serves to postpone a reckoning with science, while the sweeping quality of the scientific control it envisions serves to paralyze our will to act in the present. At the same time it gives the illusion that the destructive possibilities of science are at least being squarely confronted. The anti-utopian and the utopian myths of science have a common root in the assumption that science is an autonomous force, rather than an instrument of the will of the human community, and that its development is inevitable and irresistible.

(pp. 300-301)

23 December 2021

Lasch—On Ellul


Christopher Lasch
The World of Nations (1973)

Ch. XVII, "The Social Thought of Jacques Ellul"
According to The Technological Society, the last chance of revolution disappeared in the nineteenth century, when the revolutionary movement ceased to oppose technology with "spiritual forces" and adopted the materialist perspective as its own, thereby hastening the final triumph of economic man. "Proudhon and Bakunin had placed spiritual forces in rivalry with the economic order. Against them, Marx upheld the bourgeois order of the primacy of the economic. . . ." Unfortunately for this analysis, Marx never propounded any such thing as "dialectical materialism"—that was the contribution of Engels, who sought to establish the scientific credentials of Marxism according to the positivist standard of scientific truth that had come to prevail at the end of the nineteenth century. Marx was not a determinist; he did not deny the element of human will in history; he made no easy assumptions about the inevitability of progress; nor did he equate social progress with technology.

(p. 271 footnote)




[Ellul's] The Presence of the Kingdom is addressed to Christians, but it raises questions that all radical intellectuals have to confront, whether or not they approach them from a Christian perspective. The crisis of the faith is one aspect of the cultural crisis of our time, and Ellul's plea that the church speak directly and critically to social issues springs from the same concerns that have led other intellectuals, working from secular premises, to insist that culture must no longer be regarded as an activity having no relation to politics, that artists and scholars must abandon the pretense of neutrality, and that a new humanism, in short, is likely to take shape only if it makes connection with the struggles of exploited classes to change the world. In the years since The Presence of the Kingdom first appeared, pleas for culture to be "relevant" have once again become common and even fashionable; but as the level of political militancy rises, the advocates of cultural "commitment" have more and more reiterated the position they held in the United States in the thirties and which they have never ceased to hold on the European left—namely, that cultural radicalism means that intellectuals should enlist in the proletarian revolution (now seen as a global uprising of the non-white, colonized peoples). Ellul's work, taken as a whole, constitutes a sustained critique of this position...

(p. 272)


Well, is it necessarily emblematic of the pretense of neutrality to paint abstracts? What about painting abstracts in the epoch of reality tv, deep fakes, VR, and the social media echo chamber?

Reading the artistic surface for explicit political content or commentary ignores the meaning of the larger gesture and all that necessarily attaches to it vis-a-vis process, curation, criticism, distribution, etc. At our peril. A justice issue again, I would say...the quotidian kind of injustice that no one notices until a more total, cosmic variety has engulfed them.
Neither science nor art provide any alternative to the prevailing chaos. On the contrary, science and art contribute to it: science, by divorcing itself from philosophy and becoming
[275]
merely a higher branch of technology; art, by giving up any pretense to make statements about objective reality, thereby dissolving itself in "self-expression." Neither science nor art any longer communicate anything except, in the one case, information required to solve technical problems—and even this is conveyed in symbols accessible only to specialists—and in the other case, inner experiences incommunicable by definition. Modern art, by opposing to technological domination a cult of the irrational, "guides us in the direction of madness." Faced with rampant disorder, men take refuge in the great "explanatory myths" of our time: "the bourgeois myth of the Hand of Moscow, the socialist myth of the Two Hundred Families, the Fascist myth of the Jews, the communist myth of the anti-revolutionary saboteur." These provide the only "means of intellectual coherence" in a world made meaningless by loss of continuity, loss of memory.

(pp. 274-275)


Well, when did art ever have anything more than "pretense" to show for itself when it comes to "mak[ing] statements about objective reality"? Given "inner experiences incommunicable by definition," why bother communicating? And why is "self-expression" the only other option?

At least the younger Lasch here seems more amenable than the elder to the notion that expression too is a mere "pretense."


The turn away from these imperatives has an obvious justification if the imperative itself is shown to be merely "a cult of the irrational."


What is left for art to do, then? To make its statements on the level of the artist's conduct of life rather than on the level of surface content. Surface content is too easily misinterpreted, and even more easily properly interpreted for the purpose of using or abusing its underlying intent. Sontag: "Sometimes a writer will be so uneasy before the naked power of his art that he will install within the work itself—albeit with a little shyness, a touch of the good taste of irony—the clear and explicit interpretation of it." In "life" as opposed to "art" we have other names for such "uneasy" people, names which are even less flattering than specialist or irrational.

Treating art as a form of "speech" in the First Amendment sense entails the fringe benefit of placing it more accurately in the well-worn distinction between word and deed. And therein lies the need for "art" entirely apart from its potential bearing on "politics." The need is in the deed.

If the communicative imperative can be dispensed with, the landscape of rationality and irrationality shifts. Much art may then be "pointless," but not necessarily "irrational." At this particular historical juncture I'm quite at peace wallowing in "pointlessness" rather than being railroaded into making a statement, one way or another, with any type of artwork or "cultural" activity I might undertake. On the other hand, if I do in fact use a tuba where others use an iDevice, is that distinction simply to be disregarded on the grounds that my weird tuba music says nothing explicit about "technological domination"? How much more explicit does one need to be about it?

Endnotes to Ch. XXI. "The "Counter-Culture""
Seeing in these books [of Charles Reich, Theodore Roszak, Philip Slater] symptoms of a much deeper cultural malaise and appalled by the anti-intellectualism often associated with the new left—and more generally by a flood of irrationalism in modern society as a whole—those who still believed that a radical politics without critical reason was a monstrosity attempted, in effect, to construct an ad hoc defense of liberal culture, as in the manifesto on the "cultural crisis," that would still be distinguishable from a defense of liberalism as a political ideology. [sic] What is really required, however, is a more penetrating understanding of the "counter-culture" itself and of its social and cultural antecedents. Does the "new culture" represent merely the culmination of cultural modernism, as some have claimed—a democratization of the avant-garde? Or does it portend a regression to a more primitive consciousness? Increasingly events seem to point to the conclusion that it is precisely the premises of modernism that are being rejected in, say, rock music and street theater. If art traditionally has been an interplay between tension and its resolution, the new art banishes tension and seeks to dissolve all oppositions in direct, unmediated experience, non-verbal states of being, trancelike euphoria. Performers alternately assault their audiences, whipping up moods of subdued violence, and make "love" to them, in both cases hoping to merge the performance with "life" and to put both art and life safely "beyond interpretation."
...and therefore, the countervailing "cultural" tendency would be to land dangerously short of interpretation, no?
The audience is offended or, worse, titillated; it enjoys being verbally assaulted; it imagines itself instantaneously released from "bourgeois inhibitions." Relieved of the need to perform an act of imaginative identification, it is more passive than ever, while its lingering reservations about the new art are silenced by the fear that what is new must
[335]
be necessarily significant. "Great art is always ahead of its time." The rhetoric of the avant-garde is pressed into the service of an esthetic with which it has little else in common, in order to clothe the contemporary artist in an inscrutable authority that he claims to reject but uses in many ways to intimidate his audience and critics.

(pp. 334-335)




Part of the job of criticism today would seem to be to insist on the difference between attempting to give popular themes more lasting form and surrendering to the utter formlessness of the moment.

(p. 335)

22 December 2021

Mumford—Art and Technics (postlude)


Once we have achieved the right form for a type-object, it should keep that form for the next generation, or for the next thousand years. Indeed, we should be ready to accept further variations only when some radical advance in scientific knowledge, or some radical change in the conditions of life has come about... This interpretation of the path of technics, as leading to a series of flat plateaus rather than as a steady climb upward is, I know, a baffling contradiction to the popular one. ...The animus of the last three centuries has been toward improvement, innovation, invention without end; and the chief duty of man, according to the utilitarian catechism, is to adapt himself to such mechanical changes as rapidly as is necessary to make them profitable. But this stale view assumes that we are capable of learning nothing, that we are incapable of mastering the machine we have created and putting it in its place; that we shall not emancipate ourselves from the manias and compulsions that our preoccupation with the machine has brought into existence; that philosophy and religion and art will never again open up to man the vision of a whole human life. ...But once we arrive at a fuller degree of self-understanding, we shall render unto the machine only that which belongs to the machine; and we shall give back to life the things that belong to life: initiative, power of choice, self-government—in short, freedom and creativeness. Because man must grow, we shall be content that the machine, once it has achieved the power and economy of a good type, should stand still—at least until the creator again places himself above the level of his mechanical creature.
(pp. 83-84)
This is Mumford at his most Situationist, making a connection which the Situationists themselves were too self-absorbed and self-important to make. This is the case for Functionalism itself as a weapon against wasteful consumption and alienated expression, for an ascetic rather than a hedonistic resistance to entrenched power. This is how technology can free us from the burdens of mere survival now rather than at some yet-to-be-determined time in the future.

[December 2019. Like most concluding remarks, this was originally written first rather than last.]

21 December 2021

Mumford—Art and Technics (xvi)

[Prefatory note: I struggled mightily with this final installment of the series, so much so that what should have been a centerpiece became an afterthought. It remains both incomplete and overlong. It is at least completed somewhat by recent posts, at the cost of adding verbiage rather than paring it. Such is the content-rich, editor-poor world we live in. Enjoy, if you can.]


Lewis Mumford
Art and Technics (1952)
The general effect of this multiplication of graphic symbols has been to lessen the impact of art itself. ... In order to survive in this image-glutted world, it is necessary for us to devaluate the symbol and to reject every aspect of it but the purely sensational one. For note, the very repetition of the stimulus would make it necessary for us in self-defense to empty it of meaning if the process of repetition did not, quite automatically, produce this result. Then, by a reciprocal twist, the emptier a symbol is of meaning, the more must its user depend upon mere repetition and mere sensationalism to achieve his purpose. This is a vicious circle, if there ever was one. ...people must, to retain any degree of autonomy and self-direction, achieve a certain opacity, a certain insensitiveness, a certain protective thickening of the hide, in order not to be overwhelmed and confused by the multitude of demands that are made upon their attention.
(p. 98)
...we only half-see, half-understand what is going on; for we should be neurotic wrecks if we tried to give all the extraneous mechanical stimuli that impinge upon us anything like our full attention. That habit perhaps protects us from an early nervous breakdown; but it also protects us from the powerful impact of genuine works of art, for such works demand our fullest attention, our fullest participation, our most individualized and re-creative response. What we settle for, since we must close our minds, are the bare sensations; and that is perhaps one of the reasons that the modern artist, defensively, has less and less to say. In order to make sensations seem more important than meanings, he is compelled to use processes of magnification and distortion, similar to the stunts used by the big advertiser to attract attention. So the doctrine of quantification, Faster and Faster, leads to the sensationalism of Louder and Louder; and that in turn, as it affects the meaning of the symbols used by the artist, means Emptier and Emptier. This is a heavy price to pay for mass production and the artist's need to compete with mass production.
(pp. 98-99)




I

"to devaluate the symbol and to reject every aspect of it but the purely sensational one"

"In order to make sensations seem more important than meanings"


The word sensation here could be read two ways, and indeed it should be read both ways to fully understand the argument. Broadly, there is sensation as against intellect, feeling as against thinking, the immediate, aesthetic, intuitive, sensory, visceral part of the artistic experience as against the consciously-considered, reflective, intellectual one. More narrowly, there is the colloquialized sense borne of classic commercialism, sensation as in "sensationalism," "overnight sensation," etc., indicating an appeal that is wider, more immediate, more intense, and (probably) less rational than the mere standard-issue hit. Each cultural "sensation" raises the threshold of human sensation, against which striving imagizers progressively turn up the volume in order to be heard over each other.

The colloquial usage is an especially culture-bound one, to be sure, idiomatic to time and place; and so the confluence of verbiage between this idiom and the more literal one gives a strong indication about the era in which they have become conflated: in a society where marketing, broadly construed, permeates most every public-facing thought and action, it is neither the rational nor the intellectual but the visceral which is the most effective tool for commanding attention. To create a sensation, you must appeal to sensation.

I propose three co-determinative aspects of this:

(1) Reasoned response can be consciously shut down by the subject more easily than visceral response; and indeed, we tend to "plug" our ears-of-reason when the stimulus conflicts with a strong existing belief. Conversely, the Lizard Brain and the autonomic nervous system will respond, in their way, to any stimulus that is strong enough, whether we like it or not. (Or, to borrow Kahneman's verbiage, perhaps System 1 has done its work before System 2 has even had a chance to notice.) What's more, our senses are profoundly unequal in the anatomies and motor skills which are available to inhibit stimuli: the vast superiority of the eyelids to the uninnervated cartilage surrounding the ear and nasal canals shows that the relative "dominance" of the senses is mirrored in their respective mechanisms of inhibition. (Or is it simply that danger is loud and smelly?) Sensation's only kill-switch is inhibition of the stimulus, and the default setting of this switch is "on." It is thus easier to catch the target audience in a state of visceral receptivity than in one of rational receptivity.

(2) Reasoned response is inherently slower than the visceral kind, hence it is at a disadvantage wherever stimuli compete for attention and where response time is significant. The fabrication of urgency is a staple of classical salescraft. Making the sale means preempting reason from joining the customer's inner dialog.

(3) Reasoned response is conditioned by culture, whereas visceral response is of animal origin. If "logic" can usually be abstracted more or less without incident, reasonableness per se may nonetheless differ drastically from person to person. Viscerality, meanwhile, is immediate, often unwilling, and far less susceptible to cultural mediation. Whereas human psychological diversity is the supreme obstacle to mass cultural appeal, human visceral uniformity is as close to universal as any properly human phenomenon can be. At mass scale, culturally-mediated response is intractable whereas visceral response can be managed. Hence mass manipulation must tend toward the visceral in order to be effective.

[Later: Kahneman's book would seem to support (1) and (2) vigorously but it leaves some doubt about (3). Certainly his System 1 is anything but immune to conditioning. At the same time, for Wexler in Brain and Culture, e.g., there are, at least to my layman's sense, plenty of functionally universal aspects of human psychology which could furnish support for (3). To be continued.]




II

For Mumford at his most breezy, formal art essentially issues from a puerile version of the marketing orientation; and so, in such market-oriented terms, modern art is a predictably perverse response to the perverse incentives that mechanical record-keeping has created. There is nonetheless the great hope, in Mumford's view, of civilizing influences acting upon this puerile art-mind, of social and environmental influences by which raw unmediated desire may evolve into the desire (far less acute perhaps) to replace one's parents in the existing social structure. "Like sexual love" itself, however, "this development likewise brings with it the same danger of premature arrest or fixation at one of its infantile or adolescent phases" (29), thereby threatening the existing social structure with dissolution.

Obviously this assessment owes something to the depth psychology of the time, and obviously adult exponents of formal art practices are bound to find that it mistakes superficial resemblance for deep-psychological machinations; perhaps also that the constellation of idiomatic expressions surrounding maturity in fact reveals a strong ambivalence about Growing Up, of which Mumford's stoic paternalism captures only the conservative extreme; perhaps also that truly selfless artists, like nice guys, finish last. I realize that this last bit is indeed quite puerile, even by my standards. What I mean to raise here, specifically, is the possibility that the external social structures and agents to which artists are expected, by Mumford and many others, merely to submit themselves as a settled matter of holding the fabric of society together may, perhaps, in some instances, reasonably be judged by artists' rational faculties, if not also by their visceral ones, to be unworthy of that submission, whether on moral, aesthetic, cultural, or any other valid grounds. This is the issue that drove Freud and Adler apart: when is illness properly seen as a failure to adapt, and when is it evidence of a society that is poorly adapted to people's needs?

One would not know from his venomous attack on modern art here that Mumford was in fact among his century's most learned and eloquent spokespeople on the question of properly adapting the lived environment to human needs. Modern art seems to have threatened his own peace of mind so severely that he couldn't help but lash out at it as part of the problem. He would be neither the first nor the last to feel this way. In a cosmic sense, all can be forgiven in such matters of taste and culture. In this case, though, Mumford takes two tacks which on the earthly level can be called into question: he rationalizes aesthetic judgments in polemical form; and he reads artworks as discursive statements without giving adequate thought to their abstract gestures.

Throughout his writings Mumford returns often to his architect friend's saying that "it takes a great client to make a great building." The important fact that "great clients" don't grow on trees seems to have been lost here among the vitriol. He merely assumes that child-artists who neglect to enter into partnership with such clients must be either stupid or crazy (and perhaps are mere products of their environment in one or both respects). He assumes the Adlerian posture and justifies it with Freudian theory.

For my part, I am now racking my brain to name a single "great client" I have had during my first two decades of professional toil, and I'm having trouble coming up with one. I once asked a group of commiserating coworkers, trolling them mildly, why they never tell any happy stories from their side hustles playing for contracted events, back-up bands, and recording sessions. I already knew the answer: "Because there aren't any" happy stories from these sorts of engagements. I have never lived in a world where Mumford's "selfless" ideal felt achievable to me, not even in the rarefied air of niche culture, and certainly not in the commercial sector. For me, working for "clients" has always involved not gentle compromise but rather total dissolution of the self, of morality, and of identity. Perhaps I am indeed a victim of structure, or perhaps I am a walking failure of agency. In either case, I suspect, though I can't know for sure, that the generation of modernists Mumford so despised did indeed have qualms of precisely this kind about the world they were thrust into. There is copious primary evidence to support this notion. But this alone hardly makes it possible to read such meaning into their works. The epistemological obstacles to this must be dealt with before any sociological or aesthetic analysis can be built upon it.

[Later: Well gee, did I really write the line "total dissolution of the self, of morality, and of identity" two years before reading any Lasch? That is pretty striking.]

It is on the surface somewhat more plausible to think that we can read the general integration or disintegration of a culture through the integration (or lack thereof) of its artists, the freest of free-will agents, into its central conceits; this not based on particular content but rather on general posture. As against the more literal, socio-determinist streak in art criticism such as Mumford evinces here, this seems to me more easily accepted, but only by using vagueness to purchase a modicum of plausibility. There is no doubt that the more radical "modernist" projects were necessarily of an oppositional nature, a fact which is lost in the baffling accusation of their having "less and less to say." As little as may be "said" by splattered paint or silent music, I cannot imagine that the oppositional quality was so lost on too many contemporary observers. One question is whether such modernist aesthetics can (or should) survive if decoupled from the oppositional posture and the historical moment which once accompanied them. The validity and effectiveness of such opposition is, however, a rather different question.

Ironically, for as much disgust as Mumford evinces here, the deterministic nature of his account also "determines" a certain reluctance to assign blame to artists as individuals. He cannot quite credit them with making lemonade from lemons, nor can he blame them for failing to do this. Their fate, it would seem, is sealed by forces beyond their control. Bad societies can produce only bad art. Perhaps at some distant time in the future, with the aid of sufficient historical data and computing power alike, such an assertion as this can be properly evaluated. As it stands, I am not convinced, for one, that it can even be evaluated; that is, I would question whether it is a testable hypothesis in the properly scientific sense, even if some heuristic notion of "good" and "bad" was fixed in advance. Such theories live on the contested borderlands of reason, entering the city only by stealth and being summarily expelled the moment they let their guard down. Unable to know if they are true, nor even which questions they purport to answer, we are left to ponder their mere plausibility. Many intelligent people find them plausible. I do not.

As for those artists (or any citizens) who felt, under such "bad" circumstances, that things were in fact perfectly fine, or even as good as ever (and make no mistake that plenty of such people can always be found): must such mindless optimists not be judged much more harshly than those whose human sensibilities led them to react against their circumstances? The differing moral and ethical ramifications of these two paths necessitate that some notion of "free will" be accepted, if only as a social fiction, to serve as antidote to any "determinism" which would render the distinction between them invisible. And if the artists, as one constituency among many, have for their part had less and less to say as things have gotten worse and worse, we might do well to keep in mind the well-worn saying about actions speaking louder than words.

Would we wish it otherwise? Mumford would not, I don't think, for he has almost as much contempt for the Pollyannas as for the radicals.
The healthy art of our time is either the mediocre production of people too fatuous or complacent to be aware of what has been happening to the world--or it is the work of spiritual recluses,...artists who bathe tranquilly in the quiet springs of traditional life, but who avoid the strong, turbid currents of contemporary existence, which might knock them down or carry them away. These artists no doubt gain in purity and intensity by that seclusion; but by the same token, they lose something in strength and general breadth of appeal.
(p. 147)

Countervailing these tendencies, "modern art" has in fact opened up whole new vistas of exploration vis-a-vis visceral sensation, and it is undeniable that this was quite often done for its own sake and for no better reason. That said, Mumford's axe-grinding ways have clearly gotten the better of him on this particular topic. One may appeal here to an observation he himself revels in making about all kinds of historical developments: it is undeniable that a certain visceral escalation was already detectable prior to the "modern" era. Certainly it is already evident in the trajectory from, say, Mozart to Beethoven to Wagner to Mahler, to cite just one facile, close-at-hand example. A technological-determinist explanation has some plausibility here too; but if we are to live well with technology rather than merely living in fear of it (and a close reading of Mumford shows that he is indeed of the former rather than the latter inclination), then the question is not one of cutting sensation off at the knees but rather of applying a historically-grounded aesthetic sense to the full range of newly available sensations.



III
the symbols that most deeply express the emotions and feelings of our age are a succession of dehumanized nightmares, transposing into esthetic form either the horror and violence or the vacuity and despair of our time
(p. 7)
Mumford appeals to Picasso's Guernica to illustrate this point, which raises a host of problems. Most obviously, "great" works are not necessarily representative simply because they are great. More likely, "greatness" is not representative of an entire milieu or historical period. This is not just a gotcha technicality. The conceit of socio-deterministic reading of art depends entirely upon this notion of representativeness; otherwise it would be possible to cherry-pick evidence in favor of just about any interpretation of the prevailing social conditions, because you would be able to find a few works here and there (probably even a few good works) with which to support your assertion.

For Rudolf Arnheim, meanwhile, in his monograph on Guernica, the fact that "every work of art is symbolic by its very nature" begets a constant temptation towards "the error of reductionism: the belief that the true meaning lies always at the deepest level to which the enquirer can dig." (Ch. II) Predictably, psychoanalytic interpreters come in as the worst offenders, which serves well enough to illustrate the point but leaves aside the broader question of how and why techniques for studying people might be applied to the study of their artworks; the same question, that is, which haunts even the most responsible, "modern" sociological accounts.

For Arnheim, Guernica's "reality level", or "the level of abstraction at which an image represents reality," is very close to the surface. That is, "the "story" of the mural makes obvious sense at the level of the human implications of a military assault." So, whether or not Mumford is correct in his socio-deterministic view of art generally, such a tack is (or ought to be) superfluous regarding this particular work, whose appeal to sensation had, in its immediate moment at least, obvious (and obviously valid) ends.

This line of thinking comports well with the misgivings I have laid out here, but I don't find Arnheim wholly coherent either. In the case of artworks whose demonstrable "reality level" is, we might say, quite low, Arnheim not only permits but in fact "requires the perceiver to look for a more abstract meaning" (my italics). And so he is here willing to accept, conditionally, even the briskest flights of psychoanalytic fancy: they "can neither be proved nor disproved." Yet "it seems evident that the level of interpretation is correctly chosen" if the artwork is just this unnaturalistic. There is a comfortably logical flavor to all of this, always seductive as we float in the sea of aesthetic subjectivity, but there is no concurrent effort whatsoever to flesh out the role and justification for "interpretation" in the first place. The closest we get is that, when a work "makes obvious sense" we are to restrain ourselves; to the extent that another work makes less sense, to that same extent are we required to translate this unreality back into some other ersatz reality; it matters not, seemingly, which one or how real it is itself, just that it has a high "reality level" (as do many psychoanalytic interpretations without, nonetheless, being scientifically testable).

All the arrows here point to a set of psychological assumptions about the viewer which are, I would insist, not to be taken for granted quite so easily. Realism (or realistic-ness) has been made into a point of psychological equilibrium: the further a painting takes us from plausible natural events, to that same extent do we become disequilibrated, and thus to that same extent do we not just want but need to find an interpretation which, regardless of its truth value or explanatory power, makes us feel better about the whole ordeal. In this account, art playfully threatens to Disturb The Comfortable without ever doing so in the end. Any disturbance, so to speak, is reduced to a mere prompt, at first sight of which we are to return ourselves as quickly as possible to the comfort of "reality." But really, this insistence on "reality" against abstraction is every bit as much a protective thickening of the hide as is any more general attenuation of sensation. If there is a higher purpose for such a defensive posture, we are not told what it is.

Most curious of all, I think, is that Mumford could surely have found other "modern" works which better support his views, and that the work he chose to illustrate them comes off as a poor choice. Are there not plenty of modern works which more truly display empty appeals to sensation and/or infantile cries for attention, works which are less famous yet more representative of this particular trend? It is easy enough to imagine that both the familiarity and the topicality of Guernica made it more attractive as material for a "lecture" than it might have been for a scholarly monograph. One can only hope this is the real reason for its choice here. Still, the intellectual and psychological details of this kind of error are never negligible, emblematic as they themselves are of very particular attitudes and viewpoints which, dare I say, are no less distinctive of time and place than any properly artistic statement. In that respect, what we have here is not actually Arnheim's "error of reductionism" but in fact its opposite, a facile reading of surfaces as a succession of dehumanized nightmares or some such thing.



IV

Mumford's treatment of Picasso's Guernica is breezy enough to deal only in the broadest of social changes and conditions. The phrase "the wounds and scars of our time" is, once again, far too broad: the wounds and scars in this case are exceedingly specific ones, as Mumford himself surely knew from his own involvement in bringing the mural to New York.

Even if this type of destruction was truly new and previously unimaginable, it is quite a stretch to think that there previously existed nothing so powerful in its own time and place as to affect artists in this way. It is of course impossible to believe that nothing quite so anguished and anguishing was ever experienced or witnessed just because it was never painted, sculpted, sung, written, or danced, or because such works were simply not recognized as art. It is no affront to history, nor to justice, nor to art to observe that, clearly, things have been plenty bad enough, for plenty of people, plenty of times before, to suggest a statement such as Guernica. For Mumford, then, the mediation between the child-artist's Id and the strictures of civilization is a collegial mediation in times of imagistic scarcity, but it becomes violent in times of imagistic abundance. In the first case, any old artistic productions are inherently rare and exceptional enough to command attention, to soothe the Id, to placate the child; in the latter case, artworks must stoop to progressively baser appeals to animal instincts in direct proportion to the wider propagation of images in society; all of this because the ultimate statement that art makes is not "Look at this," but rather "Look at me."

If all of this is so, then a retreat from representation and narrative might, as I have not yet tired of suggesting, be most welcome. But here Mumford, the anti-idolator who nonetheless insists on the essentiality of the "esthetic symbol," can see only the same tantrum-throwing by the attention-starved child-artist, an appeal to pure novelty without even the redeeming quality of well-honed craft. If nothing else these passages lay bare just how drastically the ground would shift over the ensuing two decades. It was actually the generations who had not quite yet come of age in 1952 who took up explicitly the materials and techniques of advertising and salesmanship and presented them in the context of formal art institutions and practices. (Of course they relied upon these existing institutions and practices far more than most of them were willing to admit.) There is a certain absurdity, I think, in finding sensationalism in the work of a paint splattering abstractionist; or at least that is how it looks to me now with the benefit of much historical and cultural distance from the moment. The old jibe about child's play, misguided as it may be in narrow technical respects, is perhaps useful ammunition against the accusation of empty sensationalism, especially, as in this case, when the source of the two jibes is the same. If anyone can do it, then it is not quite sensational. Really the "sensation" was all in the break with convention and expectation, perhaps also in the general comportment of the principals rather apart from any properly aesthetic or technical considerations. It would be shocking to find that "society" had nothing to do with these developments. But what urgent question, really, is "society" the answer to?

It can be argued that I am paying way too much attention to these particular comments of Mumford's. They are not significant in the wider art-critical discourse, and they are not all that significant within his own oeuvre either. What I think is important about them, rather, is the possibility that they are widely-shared sentiments, and, as such, that in tandem with the more prevailing, learned critiques of modernism they really close the walls in on artists, so much so that I'm not sure who even among this cohort of critics would really want to live in the kind of society they idealize, no matter its technological capacities.

20 December 2021

Lasch—Consulting the Principle of Competence


Christopher Lasch
The World of Nations (1973)

Ch. X, "After the New Left"
The mystique of participation has had a profoundly misleading influence on recent American radicalism. It is a symptom of the general malaise of modern culture that watching a play, reading a poem, or getting an education are defined as passive and spectatorial, inherently inferior in the quality of their emo-
[150]
tional satisfaction to acting in a play, writing a poem, or simply "living." The notion that education and "life," art and "reality," understanding and action are radically opposed derives ultimately from the opaqueness of the structures in which we live and from a despair of understanding them. Official propaganda encourages this belief as assiduously as the so-called counterculture, which in this respect (as in many others) merely reflects prevailing values—or, more accurately, takes them more literally than they are taken by the ruling class. Thus, although the cult of participation encourages among other things a distrust of professionalism, the institutions of American society continue to be operated by professionals. It is only the left which, both in its politics and in its culture, clings to the illusion that competence is equally distributed among people of good intentions and regards any attempt to uphold professional standards as a betrayal of democracy.

... Its distrust of professionalism does not rest merely on a healthy disrespect for "experts" or on an awareness of the ways in which the concept of professionalism has been progressively debased... It reflects an intellectual orientation which, pushed to its furthest extreme, scorns not only professionalism but the "work ethic" itself, on the grounds that spontaneous and sensuous enjoyment of life is the only genuine form of participation in its pleasures, while submission to a discipline is inherently "alienating."

(pp. 149-150)




It is useful to be reminded...that "participatory democracy" in the strict sense works, if it works at all, only in very small communities; and that because the complexity of industrial society makes it impossible for such communities to achieve complete autonomy, those who advocate direct democracy as a general program are advocating, in effect, a return to a simpler stage of social and economic organization. One might add that decentralization, a measure of which is undoubtedly desirable, does not automatically lead to democratic results. Unless it is accompanied by a shift in political power, the decentralization of certain administrative functions may serve merely to reduce friction and to placate dissatisfaction with existing practices. ("The organizing principle of the new model [corporate or academic] institution," writes Michael Miles, "will be centralized control through decentralized structures.")

...

It is obvious that all institutions in American life are not equally democratic. "Private" corporations, academic or industrial, are not even formally democratic in their organization, unlike the state. Before arguing that they should be, according to [Robert A.] Dahl [After the Revolution?], one must consult the "principle of competence," according to which authority should be exercised by those who are best qualified to exercise it and who understand the consequences of their decisions. To insist on democracy in the operating room or on the bridge of an ocean liner would be madness for patients and passengers. The argument for democracy in the state therefore depends on the proposition that "the ordinary man is more
[152]
competent than anyone else to decide when and how much he shall intervene on decisions he feels are important to him." In order for this argument to apply also to the university or the private corporation, it must be shown that these institutions, although in most cases nominally private, actually embody political power, are intertwined with the state, and are public in everything but name.

Dahl rather uneasily skirts the issue of the university. If we were to apply his categories to this particular case, we should have to distinguish at the outset between democratizing the corporate structure of the university, so as to give the entire university community access to corporate decisions..., and democratizing the classroom itself, as many cultural radicals are demanding. In the former case, the principle of competence would favor the institution of democratic procedures; in the latter, their adoption would quickly complete the wreckage of an already debased higher education.

(pp. 151-152)

19 December 2021

Lasch—Student Activism


Christopher Lasch
The World of Nations (1973)

Ch. VIII, "Is Revolution Obsolete?"
The issue of "law and order" has recently become prominent in national and local elections. Instead of seeking to understand its origin, many radicals—along with most of the liberals—interpret the need for order as incipient fascism. They argue that productive workers are so strongly committed to the existing industrial system that they will gladly opt for fascism to preserve it. Not only does this view confuse a commitment to order and economic security with a commitment to capitalism as such, but it seems to imply, when coupled with an analysis that insists on the revolutionary potential of blacks and students, that economic expendability alone can serve as a basis for revolutionary discontent. From this we can only conclude that any revolution likely to occur in advanced countries will be, by definition, a minority revolution imposed on the rest of society by a self-appointed vanguard whose economic superfluity liberates it from "false consciousness."

In the face of such an analysis it is necessary to insist that unless a movement for change enlists the active support of the great majority, it is unlikely to accomplish anything that would be recognizable as democratic socialism. This means, among other things, that the student movement will have to transcend its character as a student movement and forge links with those who work in the main institutions of industrial society. Whether it does this has become one of the most important political questions of our day.

(p. 114)




Ch. X, "After the New Left"
The absence of continuity in American radicalism—in American life generally—made it possible for the radicals of the sixties to discover all over again the existence of oppression and exploitation, the power of the ruling class, and the connection between capitalism and foreign wars. In their excitement, they quickly proceeded from reformist to revolutionary ideas, not only leaving most of their followers behind but glossing over a host of difficulties—both tactical and theoretical—that were inherent in the adoption of revolutionary goals. It should at least have been treated as an open question whether classical conceptions of revolution, deriving from a conjunction of historical circumstances not likely to recur, have any meaning in an advanced industrial society. A major theoretical problem for the new left was precisely to work out a new conception of social reconstruction, in other words to reformulate new ideas about revolution itself instead of being content with unanalyzed images from the past. In the absence of any real analysis of the concept or its applicability to contemporary American life, "revolution" quickly became the emptiest of clichés and was used indiscriminately by radicals, liberals, conservatives, advertising men, and the media, usually to describe changes that were nonexistent.

Useless as the word soon became, it had important effects on those who continued to take it seriously. Consider its influence on the antiwar movement. As soon as the leaders of the movement realized that the Indochina war could not be attributed simply to diplomatic bungling but had roots in the social structure of advanced capitalism (roots which have yet, however, to be fully explained), they began to insist that this recognition be immediately embodied in the movement's practice. This at least seemed to be the intention of the much-publicized transition "from dissent to resistance," announced in 1966-67, although it was not always clear whether this slogan implied an escalation of strategy or merely more militant forms of civil disobedience. (Even in the latter case, however, the almost unavoidable ten-
[127]
dency was to justify new sacrifices by the announcement of revolutionary objectives.) In any case, "from dissent to resistance" was a misleading slogan for a movement that would continue to depend on "dissenters" for much of its effectiveness. Even as a tactic, "resistance" led the antiwar movement into attacks not only against the war but, increasingly, against the entire apparatus of military-corporate domination both at home and abroad, while at the same time the adoption of an "anti-imperialist" perspective unavoidably narrowed the movement's ideological appeal and its base of support. A dangerous dispersion of energies followed from decisions made by the antiwar movement in 1966 and 1967—decisions that arose not so much from calculation of their political consequences as from the need to make an adequate response to the rising militancy of the young, to the agony of the choices confronting men eligible for the draft, and to the atrocity of the war.

The history of the student movement in many ways paralleled that of the antiwar movement, if indeed their histories can be disentangled. After the student left discovered the university's links to the war machine and the corporations, it needed to develop an analysis of higher education that would simultaneously explain why the university had become the center of opposition to the war. An analysis that treated the university simply as an agency of oppression could not explain why so many students had apparently resisted brainwashing and consistently took positions more critical of American society than those taken by other citizens. The problem confronting the student movement was to expose and attack the university's "complicity" in war and exploitation without forgetting that it was precisely the relative independence of the universities (or, more accurately, of the colleges of arts and science), together with the fact that they were at least formally committed to values directly counter to those of industrial capitalism, that made them a good ground on which to fight.

The adoption of revolutionary points of view did nothing to clarify those issues. It encouraged on the one hand a misplaced class analysis of the university itself, in which student "prole-
[128]
tarians" confront a ruling class made up of administators and faculty, and on the other hand a preoccupation with the "real" problems outside academic life, especially those of the working class, which led student activists to abandon the attempt to reform the university and in many cases to leave academic life altogether. These positions, however much they differed from one another, shared an unwillingness to confront the difficulty of explaining the university's relation to society or the relation of students to the class structure as a whole. Were students to be regarded as future members of an oppressive bourgeoisie, whose defection from this class and rejection of "bourgeois life styles" therefore constituted the first stage of the "cultural revolution" called for by Abbie Hoffman? Or were they apprentices to a new kind of technical intelligencia, in which case student rebellion might be considered, in Norman Birnbaum's phrase, as an anticipatory strike of the workforce? These questions concealed an even more fundamental issue: Had the class structure of industrial society changed in such important ways as to render much of traditional Marxism obsolete? The inability of the "Marxist" left to answer these questions helps to explain the rapid growth of a left based on youth culture, on "liberated life styles," which at least takes a clear position in favor of the first of these hypotheses, and which is prepared to interpret even a change of costume as a "revolutionary act"—thereby reducing the complexities of revolutionary action to an absolute minimum.

(pp. 126-128)




Whether the student movement becomes the basis of a new labor movement depends in considerable part, according to [Michael] Miles [The Radical Probe], on whether student radicalism overcomes the influences that have recently crippled it—dogmatic Marxism, infatuation with the traditional working class, terrorism, chic cultural protest. Deploring dogmatism and posturing, Miles nevertheless insists on the "ideological dimension" of student protest—an aspect of the movement that many of its friends have tried to
[138]
minimize. One of the signs of current political exhaustion is a renewed distrust of ideology. ... Many observers would regard the ideological element in student protest as regrettable and unnecessary; Miles regards it as central. In his judgment, the underlying seriousness of the student rebellion reveals itself nowhere more clearly than in the conflict between a technocratic and managerial ideology on the one hand and the ideology of the radical intelligentsia on the other—mutually exclusive views of the world which, indeed, postulate the historical extinction of the adversary. Just as Brzezinski regards the intellectuals as "historical irrelevants," so the left-wing intellectuals hope to eliminate the technocrats as a class. It is just because it has an ideological dimension, in Miles's view, that student rebellion may portend a larger movement, "since there is not the slightest possibility of the left organizing these social forces [the new middle class, new working class, etc.] without a systematic alternative vision which first identifies these progressive social forces in its analysis and then appeals to them in its social content." An ideology in this sense is inseparable from the search for a constituency and serves not to encourage but to check the left's propensity for fantasy.

(pp. 137-138)

18 December 2021

Lasch—on education, from early essays


Christopher Lasch
The World of Nations (1973)

Ch. I, "Origins of the Asylum"
The humanitarian reformers [e.g. Samuel Tuke's York Retreat, founded 1796] consciously or unwittingly took over from the cloister the principle that withdrawal from the world is a necessary condition of spiritual cleansing. The results, however, were unexpected: a principle that served very well the needs of an institution into which entrance was voluntary and residence expected to be permanent took on quite different meanings when transferred to places people were forced to enter for a time in the expectation that they would emerge once again into the world, fully prepared for the burdens of citizenship. Whereas the purpose of the monastery was to teach one to live as a monk, the asylum could not very well claim that its ideal product was the ideal inmate of an asylum, tractable, amenable to arbitrary discipline imposed from above, and unable to function outside systems of total control; yet such was in fact the person asylums tended to create.
(pp. 8-9)


The segregation of children, running in so many ways parallel to the segregation of criminals, madmen, and other outcasts, suggests that the new-style confinement originated not simply in a concern for public safety but in a deeper and more elusive change of sensibility, which expressed itself in part in a new concept of childhood.
(p. 14)


Like other asylums, the school, once it had been divorced from other institutions (like the church) and from other purposes extraneous to its newly defined objective of sheltering the young in loco parentis, developed an administrative bureaucracy more concerned with order and discipline than with education. No other institution, in fact, so clearly illustrates the ambiguity of humanitarian reform.
(p. 15)


Ch. XII, "The "Counter-Culture""
At one point he [Charles Reich] deplores the university's obsession with scholarly "productivity." A more serious critic would proceed to an analysis of the body of scholarship produced by current conditions. He might try to show that the flood of scholarly monographs in no way enriches our understanding and in fact impedes the necessary work of theory and synthesis. He might also try to show that much of this work is ideological in content, serving to legitimate existing social relations. Instead, Reich objects that writing scholarly books is rarely a "creative" experience for the individuals engaged in it. This completely misses the point: it is precisely because the activity does offer genuine pleasure that there is so little disposition to criticize the institution that makes it possible, the modern university. As long as the university allows us to "do our own work," we ask no questions of it. The real problem of academic life is not how to find private satisfaction but how to create a community of scholars. More teaching and less research—Reich's solution—is a trite and hollow formulation that obscures the underlying question of what we are to teach.
(p. 188)


Ch. XVI, "Educational Structures and Cultural Fragmentation"
Before the nineteenth century formal schooling was considered indispensable only for those preparing for careers in law, medicine, or the church, although others of course availed themselves of it. Most people did not go to school at all—a fact that has usually been taken to mean that the masses lived in darkness and that the eventual achievement of universal education came about because the masses learned to demand it as their birthright. Recent scholarship, much of it inspired by a growing disillusionment with the school, calls this assumption into question. It now appears that middle-class reformers played a decisive role in the creation of the modern school system and that they saw the school essentially as an instrument of social control—the "civilizing" effects of education being closely associated, in their minds, with the need to discipline people whom the dislocations of early capitalism threatened to render unruly and rebellious. The coming of universal education did not so much liberate the masses as subject them to bureaucratic custody.
(p. 251)


The ideology of school reform shared another feature with the ideology of the asylum. It contained a built-in, ready-made explanation of its own failures. Once the principle of the common school and the asylum had been generally accepted and the memory of earlier customs had begun to fade, critics of the new system found it difficult to resist the logic of the position put forward by the custodians: that the admitted failures of the new institutions could be attributed to lack of sustained and unequivocal public commitment, particularly in the matter of funds, and that the only remedy for those failures, therefore, lay in bigger and better schools and asylums, better professional training, more centralization, greater powers for the custodians—in short, another dose of the same medicine. Toward the end of the nineteenth century the school, like the asylum, came under heavy public criticism. The schools were inefficient and costly ... too many of the pupils failed. This criticism, however, in no way questioned the underlying premises of universal compulsory education; its upshot was a concerted drive to make the schools more "efficient." ...as in his [Joseph Mayer Rice's] earlier writings he stressed [in Scientific Management in Education (1913)] the need to remove education from political control. The application of this commonly held idea to education had the same consequences as its application to city government in the form of civil service reform, the city manager system, and other devices intended to end "political" influence and promote the introduction of "business methods." It encouraged the growth of an administrative bureaucracy not directly responsible to the public and contributed to the centralization of power.
(pp. 254-255)


When Randolph Bourne visited the Gary schools in 1915, he saw them as the triumph of Deweyite principles. By incorporating industrial activities that had formerly taken place in the family and by running into the summer (though on a voluntary basis), the Gary schools served "as an extension of the home." Well equipped with every kind of facility, the system served as a community center, "a sort of town public university, attended by all classes of the population." Since the plant was maintained by the students themselves, the students "learned from doing," in
[259]
the jargon of the day. "School life in Gary is, therefore, not a mere preparation for life, but a life itself." Conventional discipline could be dispensed with, "for Mr. Wirt finds that when children are busy and interested they do not have time to be mischievous. . . . They move freely about the building with the unconscious air of owning the school themselves." Noting that the academic classes were tracked "according to whether the children give promise of completing the school course in ten, twelve, or fourteen years," Bourne saw in this mechanism only a means of ensuring that "the brighter children are not retarded by the slower ones." He was impressed by the "helper" system, in which the older children helped to teach the younger ones, and by the easy way in which intellectual and practical instruction was combined. It is impossible to read some of his descriptions without sharing his enthusiasm... The following passage shows progressive ideals at their best; but it also suggests, inadvertently, why progressivism failed.
I dropped into physics class, and found a dozen twelve-year-old girls and their nine-year-old "helpers" studying the motor-cycle. With that fine disregard for boundaries which characterizes Gary education, the hour began with a spelling lesson of the names of the parts and processes of the machine. After the words were learned, the mechanism was explained to them as they pored over it, and their memory of vaporization, evaporation, etc., called into play. The motor-cycle was set going, the girls described its action, and the lesson was over, as perfect a piece of teaching as I have ever heard. The intense animation of that little group was all the more piquant for having as a background the astounded disapprobation of three grave superintendents from the East.
As these last words indicate, it was neither the helper system, the relaxation of discipline, nor the hope of training "versatile mechanics" that accounted for most educators' interest in the Gary schools. It is saddening to read, side by side with Bourne's
[260]
account, the study of the Gary system written three years later, under official auspices, by Abraham Flexner and Frank P. Bachman. These authorities urged that the "helper" system be "less freely and more discriminately used." With regard to industrial work, they judged it a defect of the system that "the tasks themselves are determined not by simple educational considerations . . . but by practical daily need in the school system or the home." This criticism struck at one of the best features of the Gary system, the union of pedagogy and practice—a union which, as Bourne noted, gave the students a proprietary interest in the school and allowed a relaxation of discipline. ...

[261]
The fate of progressive education is already foreshadowed in this [Flexner and Bachman's] report. Those aspects of the Gary system that are demonstrably "efficient" will be retained and extended—tracking, lengthening of the school day, fuller use of the plant. The genuinely innovative features of the system, unable to produce satisfactory results according to the latest tests, will be unceremoniously dropped. Only the rhetoric of progressivism will remain—prattle about the need for "practical education" and "the proper development of the entire child."
(pp. 258-261)


The university was especially vulnerable to the charge of impracticality because it still retained the traces of its ecclesiastical origin: the ceremonial use of Latin, the classical curriculum, allegedly pointless ritual, a notable resistance to innovation. Institutions of higher learning were much slower than the rest of the educational system to absorb the lay culture that had always co-existed with the ecclesiastical culture schools in general had originally been designed to preserve. In the seventeenth and eighteenth centuries lay culture was revolutionized by new advances in science, political theory, philosophy, mathematics, and the arts; but almost every important discovery in these fields took place outside the universities and in many cases in open opposition to the universities.
(p. 262)


As eager to identify themselves with modernity as the utilitarians, those who defined the university as a research center had the additional advantage of being well organized into professional organizations. Many of them had been trained in Germany, and they brought to academic discussions the prestige of German scholarship, then much admired. Above all, they spoke in the name of modern science. Their defense of pure research—of what Veblen was to call "idle curiosity"—carried great weight.

Unlike Veblen, however, most of the advocates of this position were unwilling to rest the case for "research" on "curiosity" alone. The defense of pure science itself rested on the material achievements of modern science. ...[For Daniel Coit Gilman, e.g.,] the university's pursuit of [pure science] should be subjected to no narrowly utilitarian test. But Gilman did not leave the argument there; he added that applied mathematics had played an important role in developing steam locomotion, the telegraph, the telephone, photography, and electric lighting. "These wonderful inventions," he pointed out, "are the direct fruit of university studies." What is ominous about this defense of pure research is the assertion that although scientific "progress" is the purpose of the university, the results of that progress cannot be communicated to the public or even from one depart-
[264]
ment of the university to another, except in the form of "wonderful inventions." The whole controversy over pure and applied research presupposed an almost total fragmentation of culture.
(pp. 263-264)


Much of the recent clamor for relevance reflects an awareness that there is an increasingly remote connection between degrees and the training actually required for most jobs. The solution, however, starts not with changing the content of courses but with getting rid of the whole idea that courses—and colleges—are the only means to an education. The whole system of compulsory schooling needs to be reconsidered. Rather than trying to reform and extend the present system, we should be trying to restore the educative content of work, to provide other means of certifying people for jobs, and to hasten the entry of young people into the adult society instead of forcing them to undergo prolonged training—training which, except in the older professions, has no demonstrable bearing on qualifications for work.
(p. 267)


The displacement of apprenticeship by schooling led to a hardening of class lines, as educational advantages accumulated in the upper bourgeoisie and the professional and managerial strata. Another undesirable feature of the change, particularly as it concerned the lower schools, was that the working class abdicated control over apprenticeship—the process of qualification for work—to professional educators. It is significant that trade unions initially resisted the introduction of industrial education into the schools and capitulated only reluctantly, when they became convinced that the advantages of shifting the costs of vocational training to the schools outweighed the loss of their own control. In time not only the unions but vocational groups of all kinds not only acquiesced in but welcomed the assumption of vocational education by the schools and universities. By the 1930s Robert Maynard Hutchins could complain that "every group in the community that is well enough organized and has an audible voice wants the university to spare it the necessity of training its recruits." The effects of this shift on the school were as bad as the effects on society itself, not only because, as Hutchins pointed out, the competitive upgrading of vocations into professions threatened to downgrade the professions into vocations, but because the school system as a whole was called on to serve, in effect, as a source of vicarious experience; indeed as a substitute for experience. The point does not need to be belabored that educators have not followed Hutchins' sensible advice to "leave experience to life and set about our own job of intellectual training." It is more important to note that Hutchins himself refrained from exploring the implications of returning
[269]
vocational training to the vocations. He took it for granted that prolongation of adolescence was inevitable in industrial society, that institutions of some sort must provide for this extended adolescence, that the school was best qualified to do so, and that in short "economic conditions require us to provide some kind of education for the young . . . up to about their twentieth year."
(pp. 268-269)

17 December 2021

Bibliographilia—Lasch's LeWitt

Here is the Sol LeWitt article referred to by Lasch in Chapter IV of The Minimal Self.

Some choice cuts:
When an artist uses a conceptual form of art, it means that all of the planning and decisions are made beforehand and the execution is a perfunctory affair. The idea becomes a machine that makes the art. This kind of art is not theoretical or illustrative of theories; it is intuitive, it is involved with all types of mental processes and it is purposeless. It is usually free from the dependence on the skill of the artist as a craftsman. It is the objective of the artist who is concerned with conceptual art to make his work mentally interesting to the spectator, and therefore usually he would want it to become emotionally dry. There is no reason to suppose, however, that the conceptual artist is out to bore the viewer. It is only the expectation of an emotional kick, to which one conditioned to expressionist art is accustomed, that would deter the viewer from perceiving this art.
It seems to me that there is much in this exposition which Lasch chooses to overlook. This is less a matter of misrepresenting LeWitt's overall statement than of disregarding all that which he (Lasch) would locate beyond the pale.

Though I want to defend and reclaim certain of these aspects, there are a few things that I too would dispense with. I'm not sure, for one thing, just how purposeless an art can be if one objective of the artist is to make his work mentally interesting to the spectator. This is hardly the highest of purposes, but if it is not still a purpose of some kind or other, I would like to know why.

I am perhaps more willing than Lasch to accept, provisionally, the notion of an art that is free from dependence on the skill of the artist as a craftsman, if only as one possibility among many. I find it self-evident that much of what LeWitt and cohort hope to get at this way is indeed worthy, and that there is no other way to get at it than this. I would maintain even so that the craft ethic, as lionized by Lasch, Mumford, Sennett, and others, also has much to recommend it, on which point I have probably said enough for the time being. I will add here only that the emergence of a whole art-industrial complex of sorts, one in which the craft ethic is conspicuously, intentionally, and explicitly absent, has borne out, I think, exactly what more culturally conservative observers were inclined to predict about it from the start.

I do not wish to deny that the craft-oriented and the high-culture-oriented milieux also had their unseemly social ramifications, only to insist that the "conceptual" and "experimental" turns have not betokened any particular progress on this front; indeed, the explicitly anti-progressive Lasch of The True and Only Heaven seems to me far more relevant in this connection than does the Freudian Lasch of The Minimal Self. Where the progress lies, if there has indeed been any, is in an expansion of both aesthetic and conceptual possibilities to better serve the bounded yet irreducible diversity of human needs and desires. Beyond that very general notion, people will continue to behave well and to behave badly; we still need to be able to tell the difference, and there is nothing unique to art or artists that is either necessary or sufficient to help us do that.

Toward the end of the above passage is where one especially starts to wonder if Lasch was coming at this from such a different place (or with such an agenda in mind) as to truly be unable to understand the larger point. To say that he evinces the expectation of an emotional kick is an understatement. In fact he anchors his entire account in just such a view of art, the emotions in question having quite specific sources, targets, and qualities. I am not here to fault him for that alone. What is unbecoming of his prodigious intellect, I think, is the attempt to normalize this expectation, which, here and elsewhere, means conflating the metaphorical and the technical. When he scolds Robert Smithson for speaking of the "expressive fallacy," no effort is made to address the reason (a good one!) why the colloquial understanding of "expression" in art could legitimately be called a "fallacy" rather than merely a conceit or a theory or an objective. Classicists and craftspersons alike should care as deeply about this issue as anyone, because it goes to the heart of their respective projects.
Conceptual art is not necessarily logical. The logic of a piece or series of pieces is a device that is used at times, only to be ruined. Logic may be used to camouflage the real intent of the artist, to lull the viewer into the belief that he understands the work, or to infer a paradoxical situation (such as logic vs. illogic). Some ideas are logical in conception and illogical perceptually. The ideas need not be complex. Most ideas that are successful are ludicrously simple. Successful ideas generally have the appearance of simplicity because they seem inevitable.
Lasch would be neither the first nor the last to find in this something like a "restriction of perspective." I have certainly had that reaction to conceptual works which proceed from ludicrously simple ideas. But ultimately this too is a matter of expectation; the very notion of restriction is a relative one. This technique has its uses. Generally these are not uses toward which I find myself inclined, but I do not deny them altogether.

True believers would be correct to detect in these statements a certain ambivalence toward the conceptualist project generally even while I am concerned here to defend it against a very specific charge. The point is, the restriction is imposed by the artist on themselves, not necessarily on the audience; LeWitt is explicit about that. It is up to the audience whether they would like their own perspectives expanded or narrowed by a conceptual (or frankly, any other) piece. This is, in capsule, one of life's great balancing acts, between the known and the unknown, comfort and growth, expansion and consolidation. It is precisely by virtue of my commitment to the craft ethic, and not in spite of it, that I find conceptual art tends to expand rather than restrict my perspective.
Since the functions of conception and perception are contradictory (one pre-, the other postfact) the artist would mitigate his idea by applying subjective judgment to it.
A central Talebian insight, sort of, adapted for artists. I can only concur.

In the era of artistic plurality, it is not unusual to find the most incisive commentary emanating from outside one's own affinity group rather than circulating within it. Such is the case here, I think, with this observation of LeWitt's that the artist would mitigate his idea by applying subjective judgment to it. This captures the aestheticist project in a nutshell, and it is no less accurate or important for being phrased in an unflattering way.

In place of the artist's idea or "concept," then, is a subjectivity, the subjectivity of the aestheticist artist, which itself is a mediation between what is inside the artist and what is outside of them, for lack of a better way of putting it. This mediation is neither the content nor the meaning of the work. It is merely a process whereby, as Jorn said in a somewhat different context, a primary act of creation is followed up with a secondary act of critique. Because this critique is in some sense a self-critique, it is not so simple, I don't think, to level the charge of self-indulgence; similarly, because the artist's subjective judgment is formed by external as well as internal factors, it is not so easy to say that such artists "write only for themselves" or "only for other artists."

An aestheticist work has quite far to go beyond this in order to prove itself worthy of attention. My interest here is not in lowering or circumventing anyone's standards on that front. I merely wish to address the aforementioned accusations, which have become so obligatory that no one seems to really understand what they mean.
To work with a plan that is preset is one way of avoiding subjectivity. It also obviates the necessity of designing each work in turn. The plan would design the work. Some plans would require millions of variations, and some a limited number, but both are finite. Other plans imply infinity. In each case, however, the artist would select the basic form and rules that would govern the solution of the problem. After that the fewer decisions made in the course of completing the work, the better. This eliminates the arbitrary, the capricious, and the subjective as much as possible. This is the reason for using this method.

When an artist uses a multiple modular method he usually chooses a simple and readily available form. The form itself is of very limited importance; it becomes the grammar for the total work. In fact, it is best that the basic unit be deliberately uninteresting so that it may more easily become an intrinsic part of the entire work. Using complex basic forms only disrupts the unity of the whole. Using a simple form repeatedly narrows the field of the work and concentrates the intensity to the arrangement of the form. This arrangement becomes the end while the form becomes the means.
About this idea that the basic unit should be deliberately uninteresting so that it may more easily become an intrinsic part of the entire work, is this not a Classical concept too? It is quite contrary to, say, the Baroque, the Romantic, and indeed to the Modernist most of all. But I do find that the best conceptual pieces (the "best," that is, in the sense of LeWitt's ideal that the idea becomes a machine that makes the art) have a certain Classical elegance and clarity to them. I tend to be more amenable to this kind of conceptual work; the work of my former CalArts classmate Todd Lerew always struck me as exemplary in just this regard. Much of what travels under the cover of "conceptual art" falls well short of this ideal, however, because it proceeds from too many competing ideas and/or leaves too many opportunities for the subjectivity of the artist to show itself.

The lengths to which one must go to avoid these twin pitfalls might suggest that the enterprise itself is contrived rather than fundamental. But in the best work I do not get the sense that it is the least bit contrived, just as, I hasten to add, what looks from the outside like a lifetime of torturous, arbitrary technique-mongering by the classical or jazz virtuoso suddenly makes all the sense in the world when listening to the best examples. Why not limit oneself to the best examples and forget the rest? Sociological pretensions won't allow this of course, but for any old listener unburdened, presumably, by such pretensions, pleasure is the law.
It doesn't really matter if the viewer understands the concepts of the artist by seeing the art. Once it is out of his hand the artist has no control over the way a viewer will perceive the work. Different people will understand the same thing in a different way.

Recently there has been much written about minimal art, but I have not discovered anyone who admits to doing this kind of thing. There are other art forms around called primary structures, reductive, rejective, cool, and mini-art. No artist I know will own up to any of these either. Therefore I conclude that it is part of a secret language that art critics use when communicating with each other through the medium of art magazines.
...
Architecture and three-dimensional art are of completely opposite natures. The former is concerned with making an area with a specific function. Architecture, whether it is a work of art or not, must be utilitarian or else fail completely. Art is not utilitarian. When three-dimensional art starts to take on some of the characteristics of architecture, such as forming utilitarian areas, it weakens its function as art. When the viewer is dwarfed by the larger size of a piece this domination emphasizes the physical and emotive power of the form at the expense of losing the idea of the piece.
Spot on, I think. Lasch, in these terms, treats art merely as a utilitarian use of society's architecture for psychosocial development.

Bonus coverage:
a few of LeWitt's Sentences on Conceptual Art
1. Conceptual artists are mystics rather than rationalists. They leap to conclusions that logic cannot reach.

2. Rational judgements repeat rational judgements.

3. Irrational judgements lead to new experience.
Well, okay. New isn't always better. And is this even new anymore?
25. The artist may not necessarily understand his own art. His perception is neither better nor worse than that of others.

26. An artist may perceive the art of others better than his own.
Indeed, per Freud and followers, others may know us better than we know ourselves. Why should this be different in art? If art is so "social," then this inherent aspect of human social life must be operative there as well.

At the same time, what is it exactly that we seek to "understand" through or about art? And why? The answers to these questions can only be different for the artist than they are for any and all "others," and so not just the "perceptions" themselves will be different but also their meaning and, I especially want to insist, their relevance.

16 December 2021

Bibliographilia—Lasch's Roth

Here is the article by Philip Roth which launches Chapter IV of The Minimal Self.

Several passages are of interest despite (in some cases because of) not having been referenced by Lasch.
It is hardly news that in best-sellerdom we frequently wind up with the hero coming to terms and settling down in Scarsdale, or wherever, knowing himself. And on Broadway, in the third act, someone says, “Look, why don’t you just love each other?” and the protagonist, throwing his hand to his forehead, cries, “Oh God, why didn’t I think of that!” and before the bulldozing action of love, all else collapses—verisimilitude, truth, and interest. ... If the investigation of our times and the impact of these times upon human personality were to become the sole property of [this tendency (?)]...it would indeed be unfortunate, for it would be somewhat like leaving sex to the pornographers, where again there is more to what is happening than first meets the eye.
Well, sure. But is this a difference of degree or of kind? In this analogy, are we sure Great Books are not just really good, deep, intellectual pornography? Is there not always more to what is happening than literature can do justice to?
It is possible that I have exaggerated both the serious writer’s response to our cultural predicament, and his inability or unwillingness to deal with it imaginatively. There seems to me little, in the end, to be used as proof for an assertion having to do with the psychology of a nation’s writers, outside, that is, of their books themselves. So, with this particular assertion, the argument may appear to be somewhat compromised in that the evidence to be submitted is not so much the books that have been written, but the ones that have been left unwritten and unfinished, and those that have not even been considered worthy of the attempt. Which is not to say that there have not been certain literary signs, certain obsessions and innovations and concerns, to be found in the novels of our best writers, supporting the notion that the world we have been given, the society and the community, has ceased to be as suitable or as manageable a subject for the novelist as it once may have been.
The Silent Evidence gets its hearing! (Sort of.)

To beat the dead horse some more...when it's always, always, always our best writers whom we find supporting the notion du jour, surely it is worth at least noting the possibility of causes and effects having become all mixed up.

I most enjoyed Roth's investigation into the
a-grammaticality
of the more overtly self-asserting writers, raising the question,
Is this language in the service of the narrative, or a kind of literary regression in the service of the ego?
...
What we have here, it seems to me, is not so much stamina or good spirits, but reality taking a backseat to personality—and not the personality of the character described, but of the writer who is doing the describing.
There is, on one hand,
a firm conviction on the part of the writer about the character,
and on the other,
the conviction—or the desire for us to be convinced—of something else: Herbert Gold IS. I am! I am! In short: look at me, I’m writing.
And so,
Because Gold’s work serves my purposes, let me say a word or two more about him. He is surely one of our most productive and most respected novelists, and yet he has begun to seem to me a writer in competition with his own fiction. Which is more interesting—my life or my work? His new book of stories, Love and Like, is not over when we have finished reading the last narrative. Instead we go on to read several more pages in which the author explains why and how he came to write each of the preceding stories. At the end of Therefore Be Bold we are given a long listing of the various cities in which Gold worked on this book, and the dates during which he was living or visiting in them. It is all very interesting if one is involved in tracing lost mail, but the point to be noted here is that how the fiction has come to be written is supposed to be nearly as interesting as what is written. Don’t forget, ladies and gentlemen, that behind each and every story you have read here tonight is—me. For all Gold’s delight with the things of this world—and I think that his prose, at its best, is the expression of that delight—there is also a good deal of delight in the work of his own hand. And, I think, with the hand itself.
So, when the authors themselves want to tell us all about themselves, it’s self-indulgent. But when an author (or a stuffed-shirt critic-scholar) wants to tell us all about another author, that is criticism, outreach, putting their work in its proper context, and so on. We should call it "critical capture," in the vein of the "regulatory capture" of government agencies by the businesses they regulate. The line here between a fellowship and a cartel is a fine one. (Musicians are like this too.)

Incidentally, this is the passage that leads to the line,
not so much an attempt to understand the self, as to assert it,
of which much is made by Lasch, and of which I certainly would make much myself, but at which point I was led to expect something very different from what Lasch ultimately presented. Instead of this
assert[ion]
of self which, by the time of the millennium, had run rampant, Lasch homes in on the experimentalists, conceptualists, and minimalists, whose
deliberate effacement of the artist's personality
ultimately proved
a more accurate forecast than Roth's of the direction art would actually take in the coming years.
Are we sure about that? And are we sure we've gathered all of the evidence?

Roth continues.
I must say that I am not trying to sell selflessness. Rather, I am suggesting that this nervous muscular prose that Swados talks about may perhaps have to do with the unfriendliness between the self of the writer and the realities of the culture. ... Of course the mystery of personality is nothing less than the writer’s ultimate concern; and certainly when the muscular prose is revelatory of character—as in Augie March—then it is to be appreciated; at its worst, however, as a form of literary onanism, it seriously curtails the fictional possibilities, and may perhaps be thought of, and sympathetically so, as a symptom of the writer’s loss of the community as subject.
Well...I am trying to sell at least a certain kind of selflessness here: not the total kind, but a kind which buries the self in technique and abstraction so that selves per se cannot easily become the content or the surface of the work; the self may still in some sense express itself, but it is not expressed to the audience. This orientation is anathema to all things literary. If I spend too much time preoccupied with the literati, it's because I see essentially literary pretensions everywhere I look.

When you're not a literary person, the line between the
revelatory of character
and the
literary onanism
can be hard to see.

Here is the final long paragraph:
And now, alas, what does all of this add up to? It would certainly be to oversimplify the art of fiction, and the complex relationship between a man and his times, to ignore the crucial matters of individual talent, history, and character, to say that Bellow’s book, or Styron’s, or even Herbert Gold’s prose style, arise naturally out of our distressing cultural and political predicament. However, that our communal predicament is a distressing one, is a fact that weighs upon the writer no less, and perhaps even more, than his neighbor—for to the writer the community is, properly, both his subject and his audience. And it may be that when the predicament produces in the writer not only feelings of disgust, rage, and melancholy, but impotence, too, he is apt to lose heart and finally, like his neighbor, turn to other matters, or to other worlds; or to the self, which may, in a variety of ways, become his subject, or even the impulse for his technique. What I have tried to point out is that the sheer fact of self, the vision of self as inviolable, powerful, and nervy, self as the only real thing in an unreal environment, that that vision has given to some writers joy, solace, and muscle. Certainly to have come through a holocaust in one piece, to have survived, is nothing to be made light of, and it is for that reason, say, that Styron’s hero manages to engage our sympathies right down to the end. However, when survival itself becomes one’s raison d’être, when one cannot choose but be ascetic, when the self can only be celebrated as it is excluded from society, or as it is exercised and admired in a fantastic one, we then, I think, do not have much reason to be cheery. Finally there is for me something hollow and unconvincing about Henderson up there on top of the world dancing around that airplane. Consequently, it is not with this image that I should like to conclude, but instead with the image that Ralph Ellison gives to us of his hero at the end of Invisible Man. For here too the hero is left with the simple stark fact of himself. He is as alone as a man can be. Not that he hasn’t gone out into the world; he has gone out into it, and out into it, and out into it—but at the end he chooses to go underground, to live there and to wait. And it does not seem to him a cause for celebration either.

15 December 2021

Tensions, Releases, Unities


Christopher Lasch
The Minimal Self (1984)
The fundamental importance of the distinction between self and not-self—the source of all other distinctions, it has rightly been said—might suggest that it serves as the first principle of mental life, the axiomatic premise without which mental life cannot even begin. In fact, however, it is a distinction that is accepted, in the infancy of life, only with the greatest reluctance, after fierce inner struggles to deny it; and it remains the source of our existential uneasiness, as well as the source of our intellectual mastery of the world around us.
(p. 163)


[it] presents itself, at first, as a painful separation from the surrounding environment, and this original experience of overwhelming loss becomes the basis of all subsequent experiences of alienation, of historical myths of a lost golden age, and of the myth of the primary fall from grace, which finds its way into so many religions. Religion, like art at its best, seeks precisely to restore the original sense of union with the world, but only after first acknowledging the fact of alienation, conceived as original sin, as hubris followed by divine retribution, as existential loneliness and separation, or in the arts (especially in music, which conveys these experiences at their deepest level), as the rhythm of tension and release followed by inner peace.
(p. 164)




I

When even a thinker of Lasch's caliber can succumb to the most threadbare, sentimental cliches about music, surely we are up against something uniquely insidious. What on earth could it be?

The assertion that this whole drama of human development is played out anew in each cycle of musical "tension and release followed by inner peace," through which the artwork "conveys these experiences at their deepest level," raises myriad questions and answers none.

ee.gg.

How close is the correspondence, really, between "tension" and "alienation"?

How, really, might any of this be "convey[ed]"?

And what, really, makes any given level of "convey[ance]" any "deep[er]" (or shallower?) than any given other?



Whatever truth or untruth there is in the psychoanalytic account of primal alienation as against an original sense of union, it seems absurd to claim that this drama is played out anew, even metaphorically, with each ensuing cycle of musical tension and release. Or, if it is not absurd, then I can at least say, speaking only for myself, that the mere implausibility of all of this is only the second most harrowing concern which emerges from the above. The first is that the mere thought sounds harrowing, exhausting, and, ultimately, monotonous. What fresh hell would this be, to pass through "fierce inner struggles," "overwhelming loss," "divine retribution," and "existential loneliness," over and over and over again?



I thought the whole point of the Freudian angle was that, ideally, you pass through exactly once, or, if you're William James or Billy Graham, twice, but not more than that. (And not never.)

Debussy: "Let's leave, he's starting to develop." A terrible way to encounter new music, perhaps, but nonetheless totally understandable once a certain kind of development has become (too) familiar.



II


These days, much music does not fit neatly into either the fact of alienation on the one hand or the original sense of union on the other. Much of it certainly does not fit into some metaphorical journey between the two. And I can't imagine that elevating these concepts to archetypes of musical or artistic experience gets us any closer to understanding anything that is significant or even real about all but a handful of exceptional works. Of course, it depends on how you look at it.

When music became individualistic it became stylistically pluralistic too, and this was the death knell for the universalistic pretensions of such venerable constructs as tension and release, although, as we see above, they have not gone quietly. They do not need to go at all, actually, but we do need to be responsible about any claims made on their behalf. Certainly they may remain valid analyses of the historical eras to which they belong, and they remain available for artists to make use of, part of the postmodern grab bag of styles and aesthetics; all such proof as historians or artists may seek being, of course, in the proverbial pudding.

But the fact remains that today (and for some long time now, it must be said), music may seem to do both or neither of these things depending on how it is thought to work, how it is described phenomenologically, how it is actually/materially made, distributed, and consumed, etc. So if, as it would seem, Lasch's "participatory democracy" cannot afford "double standards" even in the area of society where the very notion of unitary standards is itself most problematic, then either his is a far more reactionary program than it purports to be, or, more likely I think, the institution of art has been treated too much like any other social institution without accounting for what is different about it. Perhaps also the metaphorical level has, as it so often does, become conflated with the material one.

The same flattening effect follows in the wake of Lasch's deterministic tendencies, which show themselves sporadically but consistently throughout his whole writing career. Once classical ideals have been elevated to normative ideals, other kinds of music are not only non-normative; they are not even, strictly speaking, individualistic; rather, they are determined by social conditions. I am not the right person, and this is not the right place, to undertake a broad consideration of the philosophical discourse surrounding determinism and free will. I can approach this topic only with a very narrow focus on the place of art under "participatory democracy." In that department, and with apologies to any more philosophically sophisticated readers, I take it as axiomatic that in the area of art, the social fiction of agency and free will has many benefits and few drawbacks.

There is, coincidentally enough, a much more literal, functional, material aspect to the way this dialectic between determinism and free will is played out in the late modern and "postmodern" art milieux that Lasch considers explicitly in The Minimal Self. The elision of selfhood and of the will which Lasch finds so symptomatic of a desperate "survivalist" mentality has, whatever else one thinks about it, nonetheless proven itself many times over to be an unusually fruitful generative strategy. That many of these artists have also generated eloquent and learned rationalizations for their work might less easily be taken at face value, especially by someone of Lasch's sensibility, on the grounds that vehemence of rationalization can just as easily be evidence for an element of denial as for an element of truth. But the fact remains, I would argue, that not only have the ideas survived the passing of the milieu with which Lasch identifies them, they have also added a new drama to the artist's toolbox in the form of this dialectic between determinacy and indeterminacy, between letting sounds be what they are and wilfully organizing them into higher-order agglomerations.

Here is the raw material for all the existential metaphors any critic could ever hope to read into an artwork; except that in this case, for technical reasons, it is easy enough for the artists to render the dialectic comprehensibly, right on the artistic surface. There is then no need to outsource the overlay of tortured metaphors to a pretentious, self-dealing, parasitic critical establishment, and there is every reason to expect that even people who don't like it cannot fail, nevertheless, to get it. Frankly, I myself don't like it very often, but fortunately enough for me, I came along at a time when these ideas were neither old nor new in art-historical terms but rather somewhere in transition between the two, and I think this might be the ideal location from which to consider, at least for myself, what they may or may not have to offer.

The faux-drama of tension and release, meanwhile, has been enshrined a few times over in abstract schemata of reductive analysis from which, as with Lasch's determinism, the ultimate product is an intensely normative kind of thinking which makes a hash even of what narrow individualism did exist in the common practice era. The virtue of these analytic systems is that they are at least functional, or so I am told, for certain practitioners to hone their craft. The proof of that is also in the pudding. I find that it usually tastes rancid. In any case, Lasch's existentialist schema of tension and release has not even this paltry claim to practical application.



III


To give just one close-at-hand example of a level of analysis which is totally oblique to Lasch's thesis of infantile "alienation" yet ultimately speaks more concretely and usefully, I think, to the issues raised by it, Dean MacCannell has pointed out that
Modernized peoples, released from primary family and ethnic group responsibilities, organize themselves around world views provided by cultural productions. The group does not create the world view, the world view creates the group.
(The Tourist, p. 30)
Concomitantly,
Strangers who have the same cultural grounding can come together in a cultural production, each knowing what to expect next, and feel a closeness or solidarity, even where no empirical closeness exists. Their relationship begins before they meet.
(ibid, p. 32)
Now, it seems to me that Lasch is more interested in recovering (selectively) certain pre-modern social conditions than in working within the postmodern conditions which MacCannell has spent a long and fruitful career wrestling with; and at that point it is merely talking past him to suggest that MacCannell here has pinpointed a much more pervasive and tangible instance of this basic "alienation," perhaps alienation's archetypal postmodern variant. Unfortunately, Lasch's various defenses of religion and of "lower-middle-class morality" are, while eloquent and persuasive, nonetheless wholly impractical, at least in the absence of a truly radical regression to pre-modern technological, political, and social structures. The same goes, I think, for much of what he says about art in The Minimal Self.

By the time that Cagean experimentalism and process-oriented minimalism, to name only two of the more recognizable brand names, had passed from the cutting edge into recent history, the classic regime of tension and release had necessarily passed along with them from aspiring universal to elective stylization. Lasch invokes no such explicitly universalistic or scientistic language to make his case here (nor, as far as I can tell, elsewhere either), but the implication that minimalism, especially, is irrevocably tainted by the social conditions out of which it arose, is very strong. The sad state of the world explains the sad state of art. We have been here before, though rarely with such a fully-developed psychoanalytic account of precisely how these social conditions ramify on the level of artistic form and content.



IV

In Steven Mithen's The Singing Neanderthals there is a chapter entitled "Language Without Music," and within this a subheading "When Singing Sounds Like Shouting." Here Mithen relates the case of a stroke victim, HJ, for whom "music now sounded 'like awful noise'," and that of an unnamed twenty-year-old man who, following the emergency "clipping" of a hematoma, found that "sounds had become 'empty and cold', while singing 'sounds like shouting'." It seems clear from the verbiage chosen that neither gentleman found the "noise" or the "shouting" particularly pleasant, or at least that they found it much less pleasant than they formerly found "music." The question arises, then, of whether music which lacks the linguistic basis subsequently expounded by Mithen (which may simply be music which does or could be said to possess it in its construction but which is heard by a hearer who lacks or has lost the relevant linguistic faculties to make it possible for them to experience "music" as such) can ever be anything but "empty and cold."

Given what Mithen has to say on the hard-science side of things, it does seem unavoidable that even a successful achievement of a nonlinguistic music, whatever we can agree makes for "success" in that arena, would indeed be a music which achieves some sort of severance between the body and the artifact such as Jones thinks he detects in the playing of white jazz musicians and in Euro-American art and music generally. But I wonder if such an achievement, as I am calling it here, would not also lead unavoidably to the conclusion that even something so seemingly rooted in (and exalting of) the common practice era as tension and release is, in the end, largely subjective, or at least more subjective than, say, a "Generative Theory of Tonal Music" or Mithen's "Singing Neanderthals" hypothesis or a Pinkerian appeal to "human nature" would suggest.

Now Lasch:
What distinguishes contemporary art from the art of the past, at least from the art of the nineteenth and early twentieth centuries, is the attempt to restore the illusion of oneness without any acknowledgement of an intervening experience of separation. Instead of trying to overcome this separation and to win through to a hard-earned respite from spiritual struggle, much of the literature and art of the present age, and much of our "advanced" music as well, simply denies the fact of separation. It sees the surrounding world as an extension of the self or the self as something programmed by outside forces. It imagines a world in which everything is interchangeable, in which musical sounds, for example, are experienced as equivalent to any other kind of sound. It abolishes selfhood in favor of anonymity.
(p. 164)
If this passage marks out the opposite pole from linguistic music (what else to call it in this connection?), mustn't there also be an intermediate (or perhaps bastard) case whereby this assumption is operative only on one or the other side of the musical transmission (artist and audience), and/or where the experience of the music itself (defined ultimately as the experience of a specific listener) can/does move in and out of the linguistic mode on a moment-to-moment basis?

Much of the music Lasch specifically mentions isn't likely to succeed very well in the linguistic mode, but nor is it definitionally devoid of any such elements nor of the possibility that a few people here and there may well succumb to their own "linguistic" musical moments in among the "awful noise" (leaving aside of course the unthinkable idea that the "noise" might not be so "awful").

The fact that this possibility is mostly just a possibility is often raised in objection to the arguments I am advancing here. But at that point we are again, I think, thrown back on the related questions of whether this failure to conform has any of the same moral implications in the area of connoisseurship as it might have in politics, commerce, or social control; of whether art can in fact afford a certain "pluralism" here that "democracy" cannot; and of whether it is not precisely in giving aesthetic voice to such exceptions to dominant sensibility that art makes much of its unique contribution to society and to life.

To be sure, much Common Practice Era music can be (dare I say it usually is now) heard merely as noise, i.e. by listeners who lack the conditioning (would even Mithen not admit that there is much conditioning involved?) to hear linguistically. I for one have never ceased dearly missing my former non-linguistic, non-theory-knowing self, the self that existed for the few short years between the first flower of my self-guided listenership and my first exposure to the post-Schenkerian world of roman numerals and figured bass. Suffice it to say that the Common Practice Era hasn't sounded the same since, and in fact that it sounds (it would for once not be wrong to say also/instead that it feels) much, much worse. For me, both as a composer and as a listener, this is precisely what makes it necessary to work against the linguistic conceit.

For a time I had developed and clung to a theory, based on this need to outpace the linguistic part of my brain, that one must constantly be seeking out new music which one cannot quite keep up with "linguistically" in real time. Quite unlike with actual language, where it can be uniquely stressful when a native speaker of a language one knows only patchily speaks too quickly, with music I have often found myself needing to be gently outpaced in order to enjoy it. What a surprise to learn, against the received wisdom of the musicologists and critics, that much music in fact lives in precisely this space not (or not only) by having become ever more complex but simply by wilfully defying the linguistic imperative.

This is the technical side of my defense of an interest in modernist aesthetics, the side which Lasch, presumably a well-intentioned amateur or non-musician, can be forgiven for undertheorizing. For the moment, I too must leave it a bit undertheorized pending further research. Provisionally, I would suggest a connection between these types of observations and Psychology proper, particularly (this is just what comes first to mind) the Big Five traits and the presence among them of something called Openness to Experience. For this tuba player at least, it is quite daunting to consider assembling a responsible academic argument in favor of considering the non-linguistic mode as something which certain among us are bound to be open to, regardless of whether it is "natural," while the desperate clinging to linguistic-ness evinces the contrary tendency. Ditto with the bodily/corporeal aspects of the separation-union issue. But this is what I would tentatively propose, subject to further elaboration. The elegance in this explanation, potentially, is that we can accept both that the "linguistic" mode is somehow "natural," as Mithen and Pinker would have us believe, and that just because something is "natural" does not mean, necessarily, that it is "natural" for everybody, nor that it needs to be, nor that it is necessarily good or bad. (Pinker himself makes precisely this latter point about "human nature" generally, and this seems to have gotten lost in much of the chatter surrounding The Blank Slate.)