07 March 2021

The Genetic Fallacy in Art and Life (i)


(2016, rev. 2020-21)
(Previously: Author's Disclaimer/Preface)
(a part ii may or may not be forthcoming, perhaps before, during or after another massive series)



It is always tempting to assign blame for unsavory social or scholarly trends to partisans of particularly visible brand name thinkers: to the Damned Freudians, the Fucking Marxists, the Obstinate Foucauldians. If not for Fanon, there would be no political violence on the Left! Blame the white male descent into anarcho-capitalism on Milton Friedman and his 10 Quotes to Make Liberal Heads Explode! The strongest appeal of such genetic explanations lies in their parsimony. They are simple and tractable in comparison to the actual complexity of the phenomena for which they purport to account. But as complex as human society actually is, so rarely are such simple, gratifying explanations justified.

Just ask an artist. In artmaking itself as well as in the scholarship surrounding it, those who most noisily fly the flag of a brand name, by name, are to be trusted the least, most of all by their own brand-name standards. How ironic is it that the ever-quotable Friedman himself said something precisely to this effect? Flag-waving is first and foremost an attention-getting maneuver. In the case of books which almost no one actually takes the time to read, the boisterousness of true believers most often belies their minuscule numbers and great vulnerability. Much like the autonomic puffing of furry animals caught in conflict situations, the point is to make themselves appear bigger than they actually are. The attention the rest of us can't help but pay reinforces the overall impression of a discernible ideological paper trail. (We are mammals too, after all.) Perhaps there are indeed a few exceptional individuals and microcommunities where such a trail can be established and followed back to its source; but how trustworthy, really, are their self-reported narratives of the "Aha!" moment?


Breadth and depth of learning are the best medicine. They are highly destructive of certainty rather than constitutive of it. Certainty is the badge of an uncultivated mind.


From the semblance of direct intellectual paths arises the first, best set of questions towards testing genetic hypotheses: How many of us on the Left have actually surveyed a representative sampling of Freud? Marx? Foucault? How many of our Rightist foils can quote Milton Friedman from books rather than from clickbait? How could such bodies of work, in all their complexity and internal contradictoriness, possibly lead to the drawing of common conclusions on a mass scale? Fromm defined ideology as "socially patterned rationalizations," and reading is not much of a social activity, eh? How many of us are both constitutionally and materially equipped to understand this work literally, let alone with any nuance whatsoever? And to pursue a personal synthesis? I would say not too many at all. This factor alone (not to mention the general problems of willfully putting "theory" rather directly into "practice") severely limits the possible impact of scholarship on mass politics, even as it clearly can have momentous impacts on the trajectories of individuals and their achievements, and even as it is customary for such individuals to deflect forthcoming accolades toward payment of these intellectual debts, in word if not always in deed.


"Individual initiative" is never an answer but merely a restatement of the question. It is tautological. It is a fallacy of parsimony.


When it comes to the problematization of individual subjectivity, the death of the author, the dethroning of high art, and so many other canonical art-theory tropes which challenge essentialism's hitherto unchallenged assumptions, it is important to keep in mind that questions of reading and misreading nonetheless pale in comparison to those of non-reading, non-engaging, non-struggling as socially-patterned antiphenomena, the massive cold spots on any given intellectual heat map. I'll bet lunch that the "Marxist" heat map in particular is actually pretty frigid, and that cold is the absence of something rather than the presence of something else. And if I am wrong about this, then the expansion of the epithet "Marxist" to include everything left of center still wins me the bet on a technicality. The actual Marxists, bless their bleeding hearts, need not enter into it at all.

If simply correcting literal misreadings or encouraging creative ones, if establishing baseline empirical facts or reinterrogating the ones we (think we) already had, if any of these were the path to salvation, we would have arrived long ago. In fact we have rightly learned to be suspicious of those flag-wavers who posit just such tasks as Final Solutions rather than the trifling preparatory obligations they are. Another MF periodically reminds us that even the most seemingly radical thinkers are "merely conduits for the zeitgeist," at least in the colloquial sense of cause and effect. To be sure, genetic logic is cause-and-effect logic, i.e. it is uniquely well-suited to the task of mistaking one for the other. Conduit Theory, to the contrary, cannot be fooled by this primitive trap, no matter the social scale. Thought which issues from a particular cultural milieu cannot help but find corroboration therein. There is nothing mystical or prophetic about this dynamic, not even in the hands of mystical or prophetic writers. Further, as the Freudian brand admonishes us, the particular ways in which writers are wrong are also supremely informative. Too bad Freud's method of analyzing "fortuitous actions" has found its most amenable habitat in the GOTCHA culture of the Twitterverse. That is a flesh-eating waste of a promising idea.


It is galling to find so many online user reviews opining that a given book is both worth reading and "outdated." Pick one, motherfuckers.


Psychoanalysis provides useful cover for any schlub who wants to claim, for any reason, that their rhetorical opponent is actually self-loathing, self-deceiving, that they literally don't believe what they themselves are saying...because who else ever could believe it? Psychoanalysis, the field of inquiry taken collectively, indeed furnishes such tools lovingly and in abundance; and yet psychoanalysis, the field of inquiry taken collectively, can only ever be equal parts cause and effect. It represents the crest of a larger wave of skepticism which has since overtaken Western intellectual life and would have done so just as readily even without constructs such as the Freudian unconscious to help it along. GOTCHA culture just happened to wash up on the beach in its wake.

I am of course wary of condemning and exalting vulgar relativism in the same breath; of implying that it doesn't matter what we read, just that we read. On a personal level I must mention that I absolutely am becoming less and not more certain the more I learn. But I would also say that the Author is only Dead if we Leave Him Be; that is, if time or ignorance leaves us utterly lacking in context for His ideas. (Hell, the way things are going we'll be lucky if people can still read well enough to parse Him literally.) Yes, (re)building context under such circumstances can be quite the pain in the ass; and yes, postmodernists, institutional-level med[dl]iation in that process is quite often more ideological than altruistic, more destructive than constructive, and more contrived than inevitable. But if we can humor the notion of individual initiative for just a moment, every one of us must individually be capable of establishing which authors are worth the trouble, for us; and if, for whatever reason, we don't take the initiative to build context that enables us to make some educated guesses as to who these authors might be, then we might as well not bother either with art or with life. Certainly we may not accuse our political opponents of being dupes. To make that accusation, you damn well better have done your homework.

For me at least, there is a middle phase of discovery about a new topic during which writers become interchangeable, but also a later critical mass of understanding beyond which the particular value of individual contributions reemerges into view. The sample rate must nonetheless always remain sufficient to capture the full spectrum of the issue. True consensus belongs only to extremes which are logical impossibilities: total ignorance and total knowledge. In practice we all are doomed to inhabit incommensurable positions across the vast middle ground which lies between these two extremes. It is a region which inexorably defies genetic logic, confluences of chance notwithstanding. No one actually lives at a pole, in a place where genetic logic comports with the intellectual environment. If they say they do then they are lying. I just said certainty is hard to come by, but I am certain about this.

As a result, and also because we are, it must be said, exceedingly weak and simple creatures relative to the social edifices which we unwittingly help to construct, we do seek facile comforts in times of distress and later rationalize them as Parsimonious Solutions, as if to appoint ourselves lead scientists conducting cutting edge research on the culture in which we ourselves are subsumed. Hence we blame Frantz Fanon for Eco-Terrorism and Milton Friedman for Paleo-Conservatism even as the vast majority of Eco-Terrorists and Paleo-Conservatives remain variously but powerfully under-educated, no matter what the wide circulation of reductionist glosses might seem to indicate. After all, causation just feels better than correlation. The thrill of the hunt is most intoxicating when our prey is just elusive enough to keep us entertained but not so elusive as to be unattainable; and so we distort its image in our own heads until our ideational porridge is just the right temperature. This is, incidentally, one of just a few psychic maneuvers that orthodox psychoanalysis, itself an infamous bastion of overreliance upon genetic reasoning, has had pinned down from the start, so blatant and endemic is it to human social life. It is just too unsettling a proposition for too many people to leave the answers to pressing existential questions permanently floating in the ether; and yet the effort at spearing, skinning, cleaning, and curating them is obviously the greater of the two evils.

06 March 2021

The Genetic Fallacy in Art and Life—Author's Disclaimer/Preface

(2020-21)


The bulk of the forthcoming essay was written between four and five years ago. That time already seems more like a past life. I abandoned this project when I realized that I could not (and possibly no one could) bring off its full demands in an intellectually responsible way. Also when I remembered that I have grown to hate reading things like this. Also when I accepted that this was a desperate lunge toward equilibrium born of a living situation which had become unpleasant. It arises most directly from this latter consideration. After CalArts, I rented a room in a North Hollywood apartment for about four years. (Later I would learn that almost everyone who moves to LA rents a North Hollywood apartment for some similar stretch of time.) My apartment-mate was a Valley native almost exactly my age whose backstory and views could not have been more different from mine. Our more intense political discussions are among my most valued as well as my most traumatic memories. Supposedly this was and is exactly what a Divided America needed to be doing more of. Frankly I think we might just have another civil war if we all did intentionally what I did accidentally. I for one have had my fill for a good while. Give me time for about 500 more books before I next confront the specter of an alt-right cohabitant. Politics aside, I grew to deeply respect this guy for bootstrapping himself after being dealt a really terrible hand in life. I also realized that underneath all the bluster he was off-the-charts brilliant. I consider him an intellectual equal and often wondered if he was not in fact my superior. I am not one to confuse education and intelligence. No one who has been to graduate school should need any clearer empirical demonstration that the one does not follow from the other.
To my detriment, it seems that I veritably radiate the contrary impression; either that or there are just certain things anti-academic people like to say about people who finished college, whether or not these things are true. If the latter, then they stand guilty of projection, that most Freudian of thought crimes, and Freud's ghost gets to have a chuckle at their expense while the ghost of Ernest Jones whacks him off. If the former, then maybe I just need to be more mindful of managing impressions, and maybe ghosts don't actually whack each other off. Anyway, about my roommate, curiosity eventually turned into avoidance when I found that subtleties of context and idiom made discussion of anything more than the weather extremely difficult for both of us. In between breakthroughs, we spent way too much time hammering out semantic and historical baselines. While I was making my great leap into books, he did almost all of his reading on the internet (as I formerly did too) and openly questioned my frequent trips to retrieve materials from the library. The library was but a five minute walk away. Susan Sontag used to go there after school to work on her editorials for the North Hollywood High School newspaper. I thought that was cool. He thought it was part of the problem. One time I got him to at least consider the usefulness of public libraries by invoking the specter of a tech company monopolizing the electronic distribution of "books." But by that time I was just bluffing, trying to survive rather than thrive. Needless to say this made the discussions even less constructive than they had already become. Finally, as Trump's 2016 candidacy gained momentum, my cohabitant became enraptured, he seemed to identify personally with the man, and the frankly racist test-balloons which he had previously learned not to float over my airspace gradually reemerged as well-rationalized "racialist" aircraft carriers. Alienation of affection set in. 
It was felt, and it is felt still. The only other people I know who voted for Trump did so with little to no enthusiasm, the same level of enthusiasm with which I voted, in my first one of these obscene spectacles, for Al Gore. One such unenthusiastic maybe-Trumper whom I work with told me in the course of a comparatively tame political conversation that I go "straight out of the liberal playbook." This recapitulated my old roommate's assertion that I would agree with him/them if only I could reject the lies I had been taught in school. That is reason enough to post this, albeit a reason I wish I didn't have. They won't read it and wouldn't understand it. Those are facts and not insults, empirically tested ones no less against which the next countervailing evidence will be the first, and against which offense taken is merely creeping doubt projected. Neither education nor intelligence nor the twain can guarantee understanding; and understanding, though it is a practical necessity, is not a moral quality. Sometimes I too do not understand, literally or otherwise, what these gentlemen are on about and I can't find my way there by any available route that I can see. Their opinion of me, apparently, is that I have not bothered to look, and that my education has consisted of passively-ingested propaganda. The two of them actually are as different from each other as I am from each of them, but they have this opinion of me in common, along with their contempt for the public libraries and used bookstores where I have sought and found many things which they remain ignorant of. The first time I said I was going to the library after work, my co-worker told me "You have a disease." That is an insult. (Technically it's also a microaggression, which I do believe is a real thing, even though I'm skeptical of multiculturalism, the Situationists, government arts funding, the anti-gentrification movement...)
This essay was one attempt to reckon with all of these issues and more, all at once, complicated yet further by the burden of its concurrent therapeutic, equilibrating function, resorted to instinctively after one too many invitations to a debate on the genetic diversity of American Blacks, a debate for which I was and probably will remain ill-prepared, I confess, to take any informed position at all. The exercise here was to explore what such mutual ill-preparedness means without moralizing about it. This is not easy to do. I think it might be impossible. Certainly it is impossible for anyone to think that you have achieved it unless you engage in some serious impression management. All these misgivings and others aside, following an emergent pattern here, another frozen essay is hereby defrosted, heated, and served. Just don't start any civil wars.

23 October 2020

The Mind on Furlough

I am still furloughed and still on the public dole. It has been just over six months, and if it goes on for only another six that will hit the under for most people's bets. I have been tremendously productive in ways which probably don't count for much. The first thing that happened was that my apartment got very clean; now it is extremely messy, even by my standards. I stopped practicing on July 1 and have not yet resumed. Instead, I have torn through books and records, scrabble word lists and annotated games. I have given the life of the mind due regard. I would rather have my job back, but I also could never relate to those who found themselves "bored" or stir-crazy when the first lockdowns hit. Privately I already had enough on my plate for ten lifetimes, and much of it was going to require a lockdown of one sort or another anyway. Be careful what you wish for, I guess.

The immediate future is very uncertain, but I have become more preoccupied with the long-term. Specifically, I cannot fully repress the thought that as I have just begun to find full self-actualization in the cultivation of the mind, the efforts and the thrust of the wider world are all directed toward rendering the human mind obsolete. I am reminded of this by things as varied as: Quackle simulations (a Scrabble computer program which plays quite well but not infallibly, and which almost every serious player now uses to self-evaluate); a podcast about the uses which VR and game engines are finding far outside of their conventional bailiwicks; the need for constant rule tweaks just to keep pro sports entertaining now that minds real and virtual have been unleashed upon them, thus exploiting the existing rules so brashly and effectively that the "product" suffers; and of course, the LAX jetpack stories, which remind us of just how far behind schedule we have gotten in fulfilling bygone pop-technological prophecies.

We may still be a long way off from the singularity, but I don't know that we are all that far off from a world where all of the mind-based abilities I have so enjoyed cultivating are either superfluous, obsolete, or politically retrograde. Life will surely go on, but I will look awfully silly. Silly, and incapable, and certainly unnecessary. Because this relates to closely-held values and to identity, I do care what people think about me in this respect. What the man-without-god question was for my god-oriented forbears, so the man-without-mind question has become for me. Of course we did get some good philosophy and cantatas out of the old paradigm; the bygone prophets of doom would probably be surprised to know just how much mileage we've gotten out of these old things even without an imaginary friend to guide us. Yet these too are mind activities, which just makes the analogy more troubling, makes it harder to imagine that life will indeed just go on, because it has to, just like it had to when god (and the author close behind him) died or were killed.

Aside from a couple of college summers, I have never had so much time to devote to my own work. I have often found myself thinking even so that lack of brain has been a far greater obstacle than lack of time. There are days where I can find 9-letter words through disconnected tiles, and there are days when I can't keep my 3s straight. There are days when I can read for 12 hours and other days when I can barely focus for more than a few pages. As for my former work life, there were days on the ol' Metrolink where I could fully absorb a difficult book chapter and other days where I had to punt and aim for a much-needed nap. I have always been this way, regardless of what else is going on in my life. Where I am almost inhumanly consistent is that I wake up every day, regardless of how much brain I have, with a burning desire to progress, develop, actualize. Like my coworker's old Powerbook G4 which I coaxed into running Lubuntu, my own power module flashes the message "No Kernel Support," which means I eventually overheat and have to rest. This can be demoralizing. The occasional triumphs are gratifying. But they seem increasingly like triumphs which technology will soon render superfluous.

12 October 2020

Facts and Fancy

(from my Goodreads review of Babes in Tomorrowland: Walt Disney and the Making of the American Child, 1930-1960 by Nicholas Sammond)

The overall posture and style of this study are so self-consciously disinterested and relativistic as to read like a caricature of postmodern academic writing. This pastiche has lost not merely its sense of humor but its sense of purpose too. The fear of letting a stray value judgment slip out seems to have stultified the author's analytical capabilities. And yet values per se are largely what the study is about. The superficial irony of this is plain enough, but I think it is more than ironic. It is at least mildly disingenuous. In some respects it is cowardly.

The disinterested empirical scholar is discouraged from bringing their own values into the mix because disinterested empiricism cannot, by its own inner logic, operate that way. This book stumbles its way into a subdiscipline where disinterested empiricism is thought to be especially de rigueur but where it is actually quite inadequate. Sammond repeatedly invokes something like "the dominant presence of members of the white, Protestant, progressive middle class in the study of childhood." (7) He repeatedly names and specifies these agents of institutionalized moralization, repeatedly inviting us to consider them by profession, race, and class. Their work, he tells us, was profoundly shaped by classbound values. The fact of classboundedness and the identity of the classes in question are unequivocally named and reiterated. But Sammond seldom names the values themselves, and when he does name them I found it difficult to conjure much righteous indignation.

I do not wish to suggest that there actually is a universal morality. That is not what I believe. I don't think you have to believe it, though, to trip up on the idea that "truthfulness" and "unselfishness" are "middle-class virtues" (85) which cannot be reasonably expected of other classes. To me that sounds a lot like, say, reading being a White thing. Sammond himself probably believes no such things, but he is not allowed to say so, because this is scholarship and mere opinions aren't worth anything. The hubris of progressive sociologists, on the other hand, is an objective fact which can be presented as such, for if there is no universal morality then all progressivism is just a stillborn moral fallacy. Even "truthfulness" cannot mooch a provisional exemption. Truthfulness!

Naturally, the chickens of relativism roost in the hencoop of hypocrisy. What are the moral implications of accommodating the actions of a dishonest or selfish poor person? Does this help them or hurt them? Is it justified merely by the fact that they are poor and you are rich? By the right to cultural self-determination? Liberty? Consequentialism? Echoing overzealous committees everywhere, Sammond could claim that these properly philosophical questions are beyond the scope of his social-scientific study. I agree that they threaten to explode any such study into an unwieldy interdisciplinary patchwork; but I would strongly disagree that they are, literally, outside his scope. His own methods have made these questions essential to his scope and he makes no effort to acknowledge or address this. Instead, the really important takeaway is that most of the reformers were white, Protestant, progressive, and middle-class, whereas not all of their objects were these same things. As it turns out, this is not quite worth writing a book about.

Reformers of any slant in any area of human endeavor are vulnerable to the charge that they have put forth their own values as universal ones. Without this fundamental arrogation there can be no collective social action of any kind. The mere fact of arrogation is endemic, background radiation to the perceptible heat and light of social and political life. The arrogation of reformers is not an urgent sociological issue. What is urgent, I think, and what could have been pursued more doggedly here, is a compelling chronicle of the dynamic interaction between values and institutions. Strictly speaking, the thesis that "discursive circuits constructed around and through media-effect arguments sell products and build careers" (360) does describe a dynamic process, but it begs a lot of questions too. My sense is that Sammond forbade himself as a matter of methodology from opining, judging or blaming, and that by proscribing these things he railroaded himself into a static account rather than a dynamic one. (When your first order of business is to name the race and religion of the principals, it's hard to say much of anything more without offending.)

I also am not convinced, either by this account or by others, that the interaction between the Disney Studio and the reformers Sammond identifies was truly dynamic until quite late in the period he covers. In amongst all of the imbrication and commodification, I noticed that the dates, types and sources of the documents he reproduces throughout the book support my skepticism. Concerned parents created the market and Disney, eventually, seized on it. But Disney already had an enormous market, and progressives had a lot of ideas which were oblique to Disney and to media generally. Following academic convention, Sammond takes a laser-focus on the tiny area of overlap. It turns out there is not nearly as much for him to write about as the length of the book would imply.

If you don't already know something about the reformers Sammond chronicles, you still won't have much of an idea of what their values actually were after reading his book. He detects that the progressives have unduly assumed at least one non-working, stay-at-home parent, a luxury which many working class and immigrant families didn't enjoy; and he points out that child labor has persisted in agriculture (and disproportionately among children of color) long after progressives had more or less succeeded in abolishing it for white children. These are sobering reminders for white, middle-class readers; they are nonetheless quite underwhelming in the role Sammond has carved out for them here, where the towering monoliths of American Sociology, Enterprise, and Entertainment have collided in a giant orgy of...what exactly?

"Truthfulness" and "unselfishness" arise in the discussion of Disney's Pinocchio. It is the natural film for Sammond to discuss, since its overbearing didactic moralism stands out even in the Disney oeuvre. Yet transparent texts can be difficult to handle, and Sammond breaks everything he touches. With so much threadbare symbolism sitting right on the surface (Stromboli is literally a puppetmaster), Sammond cannot possibly work his way back to "middle-class values" without committing an act of interpretation. He has previously been too vague about values, whereas this film is explicit about them. Sontag warned us about this: "to interpret is to impoverish." Disinterested empiricism has taken him as far as it can, and now it is his turn to recapitulate in reverse the error of media effects crusaders by projecting upon the text the social location of those most eager to consume it. Consumer eagerness now engulfs the text from without, metastasizing into its organs of content and meaning. Suddenly it is not Edward Filene or Walt Disney but Sammond himself who has elevated consumption to a moral value! Buy a film and you become its content! And its content you! It's cheaper than the naming rights to a distant star or atoll! Hence a fleeting indulgence in armchair criticism is the precise moment when things go off the rails for good, whereby "truthfulness" becomes "middle-class," whereby poor people's untruthfulness is locked away in the black box of cultural self-determination, whereby Pinocchio cannot reflect the values of a solitary poor person unless all of the other poor people are also lining up to view it. Not just a filmic text is impoverished this way but also the "virtue" of everyone who is not "middle-class." That is quite an accomplishment.

I'm not a critic or a sociologist, but I feel like there has to be a better way to go about this. Fromm defined ideologies as "socially patterned rationalizations." Say we take those three concepts, pair them into three dyads, and then study each dyadic nexus; each one generates a limited but salient field of material which is relevant to our topic, and also a sprawling field of extradisciplinary connections. Given the organic limits of human cognition and the profusion of published research, each of the outward-facing fields is functionally unbounded; but they are perfectly finite in number (there are three of them), and this makes it possible at least to momentarily stare into each abyss and admire what makes it distinct from the others and from the original topic. Then we return to the inside, reassemble the triad, and look for the triadic nexus. A geometric analogy to planes, dimensions and wormholes suggests itself. This is just silly stuff I think about, but it seems to me that this book has done none of this nor anything remotely resembling it. It is not even a one-dimensional sociology, because it has not even the first prerequisite for the dimensionalization of sociological thought, namely a sentient authorial being. The strict repression of authorial slant in this area of scholarship is quite ironic given one of Sammond's key takeaways from the inconclusiveness of Media Effects research: even children do not simply swallow whole everything they are told or exposed to. I think we can assume this of readers of scholarly publications as well. A profusion of value-oriented scholarship could actually be the best way to achieve the "parallactic" ideal that some postmodernists have put forth, whereby observation from a variety of angles permits a clearer view than any single one of them can alone. The first step towards that ideal is not to give up on fixed moral positions but rather to stake them out.
A moral position can be the second point which defines a line of inquiry. This poses methodological challenges, to be sure, but there is a payoff for surmounting those challenges, a payoff with which studies like Sammond's cannot compete. Fromm and Maccoby made a blind stab in this direction which is simultaneously comical and profound: they constructed numerical scales of psychoanalytically-defined traits by which to measure the Mexican villagers they studied, they took the measurements (basically they made them up), and they performed some conventional statistical analysis of these figures to look for Results. To a self-loathing postmodernist this looks like pure arbitrary slant, the methodological equivalent of intentionally exceeding the speed limit at first sight of a cop. My contention is that if hundreds or thousands of diverse minds were to construct their own numerical scales and take their own "measurements," the aggregated results would be as meaningful as the minds are diverse. (This diversity would need to be more than skin-deep.) Against this backdrop, Sammond's approach looks like another fruitless search for perfect objectivity, distance, disinterest. If the slant is always there anyway, we might as well turn it to our advantage.
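The aggregation contention above is easy to caricature in code. A toy sketch follows, purely illustrative and emphatically not Fromm and Maccoby's actual procedure; the raters, scales, and numbers are all invented. Give each of many raters an arbitrary but internally consistent scale, let each one "measure" the same subjects, and average. The idiosyncratic biases wash out while the underlying ordering survives:

```python
import random

def rate(true_trait, bias, scale, noise=0.5):
    """One rater's idiosyncratic 'measurement' of a trait:
    a biased, rescaled, noisy reading of the underlying value."""
    return bias + scale * true_trait + random.gauss(0, noise)

def aggregate(true_traits, n_raters=1000, seed=7):
    """Average many diverse raters' scores for each subject."""
    random.seed(seed)
    # Each rater gets an arbitrary offset and an arbitrary (positive) scale.
    raters = [(random.uniform(-2, 2), random.uniform(0.5, 2.0))
              for _ in range(n_raters)]
    return [sum(rate(t, b, s) for b, s in raters) / n_raters
            for t in true_traits]

true_traits = [0.1, 0.4, 0.9]   # hypothetical underlying values
scores = aggregate(true_traits)
# No single rater's numbers mean anything in themselves,
# but the aggregate preserves the ordering of the subjects.
assert scores[0] < scores[1] < scores[2]
```

The catch, as noted above, is that the diversity must be real: if every scale points the same wrong way, no amount of averaging will correct it.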

At great semantic and rhetorical pains, Sammond does eventually work his way around to some interesting big-picture theses about commodities and the social construction of childhood. For reformers and parents alike, the erroneous belief in strong media effects
"smoothes over some unpleasant contradictions in the construction of personhood and identity in democratic capitalist society. Quite simply: the child as susceptible to commodities stands in for the child as commodity-in-the-making...[whereby] persons must be simultaneously and impossibly unique individuals and known quantities." (360)
Ay, that's the stuff! But by this time the sins of omission are piled high, reflected in the endnotes by a veritable profusion of beyond-the-scope apologias which I literally lost count of. I'm reasonably sure I have never seen so many in one place, actually, and I think that is a singularly meaningful reflection on the nexus of topic and method here.

09 October 2020

City Living

"By and large working artists seek privacy and anonymity. But they also require exposure to all sides of life. These two benefits are available jointly only in the city. For artists solitude is not a vacuum, empty and meaningless. Isolating oneself in the country is contrary to experiencing and feeling the realities of the human condition, as beautiful as the country might be. Though it sounds contradictory, isolation in the midst of hyperactivity paradoxically means a chance to create one's own beauty, or to react against one's own choice of exposures; a little understood phenomenon among nonartists."
Robert Perine
Chouinard: An Art Vision Betrayed

31 May 2020

Three Views on Competition


-I-

Mid-quarantine sports media has gone nostalgic out of necessity, with the recent Michael Jordan documentary leading the charge, and in the process it has been pointed out that MJ's pathological competitiveness would today run afoul of numerous sensitivities. Leaving aside for now the litigation of those sensitivities and the triteness of the observation, I think it is nonetheless an observation worth dwelling upon and extending: the games are mere escapist entertainment and the Darwinistic element is, unlike the analogous Roman spectacles, more symbolic than real; yet the people are real, and if they are not usually worth truly feeling sorry for, that is not to say that their outrageous salaries somehow void their basic human entitlement to dignity and health. MJ in his more infamous moments undoubtedly created a Hostile Work Environment. Since his playing career ended, American pro sports have seen a handful of high-profile breakdowns, AWOLs, and early retirements which are either partly or wholly attributable to similar behavior by less-revered teammates. The social ground has indeed shifted beneath the feet of bigtime sports, making this human toll seem less collateral and more integral. It is doubly inconvenient, then, to be told that MJ's now-questionable motivational tactics were integral rather than peripheral to his success. I love sports and I also think it is okay to be uneasy with this, i.e. to impose today's standards on yesterday's events. Nothing would ever get better if that type of hindsight were not allowed. The anti-civs can howl all they want about PCness and Revisionist History, but even they know that there is a right and a wrong way to treat people. Some of them may even have had an experience that (gasp!) changed their mind! I certainly have...and somehow I still love watching sports. Love of sports is, paraphrasing a girlfriend-of-a-friend, my only "normal" trait.
I take no offense to the comment; rather, taking it at face value, I choose to strategically deploy this aspect of my public-facing self in those tough social situations where it is crucial to seem normal. But of course in my remaining abnormal moments, it has only gotten more difficult with time to ignore everything about sports that is unseemly.

For the most part, even pro sports locker rooms have at least met the new sensitivities halfway. The greatest countervailing force to this belated enlightenment is not the odd Old School jock, but rather the amount of money at stake. Rule-bending/breaking is itself something of an art form, and exceedingly thin competitive margins in high-stakes endeavors tend to encourage its consolidation and refinement. MJ of course authored the definitive work of contextual rule-bending when he...created separation from Utah's Bryon Russell in the closing seconds of the 1998 NBA Finals (his only signature moment that I vividly remember watching on live TV). It is a "work" without parallel even in MJ's oeuvre, by, of and for that moment and that moment only. Among the commentators I listen to, the ones who not only were there but also have dug deepest into all of this after the fact tend to emphasize a similarly contextual, circumstantial, incentive-driven understanding of MJ's most infamous interpersonal conflicts. Context and circumstance are crucial to the sensitivity question, no matter where you fall on it. It is of course possible for high-stakes athletic competition to unfold with minimal "personal" friction, but it is not possible 100 percent of the time. Similarly, high-level competition might emerge without correspondingly high stakes, but it probably won't. A long-winded way of saying: sports cannot be sanitized much further than they already have without harming their intrinsic appeal to "normal" people (or, as the case may be, to the little tiny "normal" person that lives even inside of weirdos like me).

For now, a basketball game still has a winner and a loser, and the team sports franchise remains nothing less than the contemporary archetype of patriarchal, non-particularistic organization. The Chicago Bulls can turn over personnel a dozen times and still be the Chicago Bulls. If the rarefied air of hyper-thin competitive margins is then thought to be first and foremost hazardous to one's health, and thus unseemly on specifically that basis, this represents a particularistic turn which is anathema to what bigtime sports are. If the human toll is no longer acceptable then I will fall in line with that new reality, but I don't have to enjoy the new sports-like product. I'm not very normal, but I am too normal to enjoy games where no one wins or loses. If we now "root for players, not for teams," as even some of the above-mentioned sports commentators claim to, then it does seem that winning and losing no longer matters. I can't relate to this, but I think I at least understand it. Why we would continue to channel competition-averse desires through sport is, conversely, something I can't even understand.


-II-

The current slogan of the North American Scrabble Players Association is, "Making Words, Building Friendships." What exactly does this imply? (Or not?) Perhaps most basically, a matter-of-fact concurrence with reality: even I have made at least two unusually good/close friends on the Scrabble scene, and the proportion of people I find tolerable is non-negligibly greater than the baseline expectation. This being as it may, if an equivalence is what is implied in the slogan, I think this is wholly illogical and dishonest. Certainly it is not in concurrence with the reality I've observed. There is at least as much animosity as friendship in Scrabble; this much was palpable in the room from my very first expedition to a NASPA-sanctioned club. What has only become clear with experience is that this animosity exceeds the mere social background radiation one expects to find everywhere; it is, rather, intrinsic to the game itself. The disproportionately extreme consequences of seemingly marginal strategic decisions make Scrabble as much about mastering one's own emotions as about cognitive ability or competitive spirit, and no one in this pressure-cooker is a perfect master of their emotions, no matter how friendly they are the rest of the time. If you want to engender friendship, you definitely do not place people into this sort of dynamic interaction with chance; and if you have a friend-target in mind, it's best that this friend-target not embody the opposition in a zero-sum test of the two acquaintances' comparative abilities to manage said dynamic interaction with chance. Given some of the dust-ups I've witnessed, I give myself relatively high marks for civility; but mere civility is not friendship. I confess that I find friendship (d)elusive here, not only in the heat of competition, where it is colored by visceral emotion, but also upon distanced reflection, where it is colored by a wider interest in sport as expounded upon above. 
I grant that the drawing of any analogies between the NBA and the NASPA strains credulity. Nonetheless, both a game of basketball and a game of Scrabble have a winner and a loser; if the analogy can be extended only that far, this is nonetheless quite a significant fact with many significant implications. As such we might add that both are ritualized, sanitized reenactments of base instincts, or some flavor of that old trope. What does this phylogenetically distant basis in primal violence mean? Is it the distance or the violence which is more meaningful? Sensitivity is the obsession of the moment for hard-liners on both sides of that question, but I think the answer really depends more on our intelligence than on our sensitivities. (I also believe the covariability of intelligence and sensitivity to be generally overstated1, though I do need to learn more about this and could be swayed.) For the most part, sentient adults are capable of compartmentalizing ritualized reenactments from so-called real life. As a species we are, I think, quite capable of civility per se in this scenario, if not always of friendship; and as the eminently social species, there is much to be gained if we can achieve this, and certainly also lost if we cannot: the ventilating function of such ritualized, non-destructive competitive outlets, the lexico-cognitive dimension of Scrabble as healthy mental exercise, the greater acuity of such exercise-benefits when they are channeled by competition rather than pursued casually, and so on. Call these the Extrinsic Benefits if you insist, though really they are intrinsic to this uniquely human institution. Of course the institution of friendship matters too. But if friendship is your end, tournament Scrabble is a strange choice of means. If friendship were the ultimate aim, what wouldn't we change about Scrabble? And if winning and losing isn't what really matters, what are we doing playing a game that has a winner and a loser?


-III-

My 2002 summer expedition to the Aebersold workshop in Louisville was rather fruitless from a playing perspective, but the lengthy evening concerts were, as many others have remarked, themselves worth the trip. By now most of the finer details have blurred, but I specifically recall a Don Braden-Eric Alexander tenor battle not for the music (which I'm sure was fine nonetheless) but for Braden's mid-set remark to the assembled newbs. Paraphrasing: it can't help but be a competition when the two of us are up here together, and this is fine as long as it serves the music. I can't help but agree, which leads us seamlessly back into navel-gazing: is competition able to serve a constructive purpose here because there is, metaphors and figures of speech aside, no winner or loser in music? Certainly there is an aesthetic dimension to sport: John Stockton is said to have described the Dream Team scrimmages as "poetry"; and Scrabble played at the highest level certainly has struck many an informed observer as "beautiful." But only in the case of exhibition games can I imagine a convincing argument that aesthetics are essential to sport, even as they are quite essential to my own interest in it. Conversely, as Debussy would have it, "Pleasure is the law" in music. That assertion can be problematized from any number of abstract ethical perspectives, same as can ritualized competition; but the overwhelming thrust of real social practice, rational or not, is on the side of pleasure here. Hence I think the burden is on the ethicizer/moralizer to demonstrate that pleasure and competition alike are entirely about wants and not at all about needs. I do not believe this to be true in either case.


1. Anecdotally, the phenomenon of the pathologically cutthroat pickup basketball player always seemed to me a product of vulnerable class position, not of individual psychology, and certainly not of intelligence. Where individual psychology comes in, I suppose, is in the case of players whose competitive drive stems from perceived vulnerability that is not necessarily real. MJ and Tom Brady are often mentioned in this connection, as is the significance of what I am calling "perceived vulnerability" (as opposed to the real kind) in the realm of politics and demagoguery.

30 May 2020

Conquering Dependence on Necessary Evils

One day as a high-schooler writing music on my Dad's PowerMac, I discovered that ConcertWare had a meter called "Free Time." Thus began an abiding compositional habit of periodically dispensing with barlines. Having now seen much more printed music and made many more forays (not totally successful ones) into hand-written/mind's-ear composition, it is always a bit embarrassing to think back to moments like this, when composing was for me something of a video game. Whether the software thus encouraged that impressionable young person to play fast and loose with convention or whether it merely allowed him to is a question of framing rather than of substance, and one which composers will answer more according to our own orientations than according to reality. Since the reality was in my particular case lost to the sands of time without anyone (including myself) caring nearly as much as composers seem to care about this issue in the abstract, perhaps this is just fine. Admittedly, from the perspective of a more experienced quasi-teacher attending to a hypothetical student, I would not be totally at ease with such a process now. Yet the same hindsight shows that there were at least two undeniably propitious elements in my case: (a) the ease and accessibility of this feature exploded a hitherto unquestioned convention rather than rigidifying it, and (b) ConcertWare undeniably handled unmetered notation far more flexibly than Finale, Sibelius or MuseScore do, even now.

It is true that such departures can be made too easy as well as too difficult, depending on the technical intermediary and the cultural atmosphere. It is also true that frequent interface with printed music outside of one's computing life has a way of diluting the computer's influence over notational decisions. I was fortunate as a tween to at least be seeing printed music in band class, and occasionally tripping over stacks of it at home. I suppose it was only later, when I realized that composers, publishers and conductors I had heard of (or at least a few of them) were open (or at least not irrevocably opposed) to temporarily dispensing with barlines, and when I encountered my first gentle opposition to this practice on the part of other musicians, that my decision thereby became something of an informed decision, taken freely. And when a beloved college wind band conductor habitually referred to barlines as "a necessary evil" in rehearsal, as an idealist I of course heard "evil" more than "necessary," and at that point all barline bets were off.

Unmetered notation remains controversial, even among the most seasoned and fluent musicians. Periodically I have occasion to pause and reflect on this situation, and it occurs to me now that there is a significant connection here to another Style Wars polemic which bubbles up occasionally: the question of learning one's part from notation as against learning it via aural transmission. In addition to asking for unmetered music to sound a certain way, by writing unmetered passages composers are asking the player to do some extra work; perhaps to figure out for themselves, by shedding, where the barlines might be if they had been used; perhaps to become familiar enough with (essentially, to memorize) the passage such that the coordinating function of the barline is superfluous; and perhaps therefore not to concern themselves with what other players' parts might be asking of them, nor with how those other players might handle those demands, including the possibility (within reason) of different grouping/phrasing in different parts. There is more to unmetered passages than the possibility of multiple "correct" meterings or the absence of composerly guidance (not to say intent) on said point: there is, more importantly, a practice, rehearsal, and performance process which is mediated by a notational decision. The result of this now-changed process is what I am seeking with unmetered passages. I am not seeking a "perfect" rendition as if barlines had been deployed and subsequently observed by unusually adept players or by a machine. I am, in a sense, actually going out of my way to avoid this.

Process is the only reason that the performance of unmetered music might, potentially (hopefully?), sound different than if the music were metered; getting music to sound a certain way is the only logical reason to depart from received notational convention; and departing from received notational convention is a good way (if not the only way) to shake up the performance process. This is the kind of procedural perfect circle that composers dream about, and usually only dream about. If the "process" merely consists of the performers staying 5 minutes after the first rehearsal to compare parts and draw in uniform barlines, then we can still say that the notation has mediated the process, and that the music might still sound different than if the composer had provided the same information to them from the outset. But this amounts to normalizing/conventionalizing what was non-normative about the piece in order to make it easier to play. That maneuver is the domain of Jobbing, not of Artistry. Shedding also makes any given piece easier to play, regardless of notation, and invites the reflection which breathes life into Dead Tree composition. It is socially ungraceful to point this out in a world where Everyone Is Busy and there is already plenty of music to listen to. I accept that judgment on a cosmic level. On an earthly level, meanwhile, I see unexplored/neglected aesthetic avenues hiding in plain sight and conjecture that they might be fun to explore. So come fly with me, or whatever.

Reflection tends to be baked into the process of aural transmission, and it tends to be eschewed (usually almost totally) by users of notation. This I do not deny, but I do choose to find fault with the users rather than with the notation. Thus for me the basis for preferring one mode of transmission to another is a matter of what I might want to do with it, not what everyone else thinks everyone else is doing with it. Modes of transmission are mere vehicles for the realization of the abstract concept of a work; it is the concept which indicates favorably or poorly for either process, not the other way around. Notation is all about expedience, and this is both its best and worst quality. Notation allows Eye Players to realize music without reflecting on it, perhaps even, as the figure of speech would have it, without even thinking about it. Owing to innumerable big-picture factors which are best set aside for now, this is normally exactly what happens (or doesn't happen). Certainly no one is more puzzled by or discontent with this situation than I am, and I will not be out-discontented by partisans of Ear traditions who choose to resolve this structure-agency question one-sidedly. It is true that the structure here (the notational system) is what enables users to become passive re-creators, but it is not true that it imposes passive re-creation, nor that the etiology of passive re-creation is entirely or even mostly a matter of the notational system, nor that the notational system has nothing more to offer us than the shortest on-ramp to the path of least resistance. If any given Eye Player chooses to reflect upon their Eye Music, they will find every bit as much to reflect upon as will the ear player upon theirs. If they neglect to take this opportunity where it presents itself, then my heart bleeds for them.

Writing without barlines aims at imposing a process that is intermediate between the rhetorical extremes of the Ear Player who is forced into a reflective outlook by the laboriousness of their process and the Eye Player who habitually tears through piles of written music without any reflection whatsoever because Everyone Is Busy and reflection would slow them down. Writing without barlines aims at imposing selective reflection by omitting small pieces of customary information, while nonetheless providing all the other information that written music customarily provides.

Notation doesn't breed soulless performance; rather, soulless performers give soulless performances. Unfortunately this conclusion has become unavoidable as Ear Playing increasingly carries the day and soullessness remains rampant. Yes, Everyone Is Busy, and so there aren't too many bands around today where everyone really commits to the Mingus process. We're so Busy, actually, that the dwindling repertory has moved decisively away from anything even as structurally specific as Haitian Fight Song. The overdetermination of musical structure by social structure is a material question, not an expressive or metaphysical one. You cannot claim the exquisite-corpse process as an affirmative creative decision when your five band members have moved to five different states! You cannot claim notational or conceptual simplicity as an affirmative creative decision when you know that no one is willing to rehearse! I am not saying that you cannot succeed under these circumstances. What I am saying is that you cannot claim success.

When process is materially circumscribed from the outset, concept can only trail at a distance. It is unideal for process to lead concept in this way because all processes are conceptually limiting. Ideally the creator of the work would have taken account of this from the embryonic stage of creation, identifying a process which best serves their concept while working around the inevitable potholes. That is, ideally the mediation between process and concept takes place through the creative process itself, not in sequence with one consideration leading the other around by the scruff of the neck after the piece is "done." When process dictates to concept, its flaws and slippages are foregrounded anywhere the creator is unwilling to sacrifice concept to expedience. On one hand, this unwillingness is socially maladaptive; on the other hand, it is one leading indicator of the presence of a soul. Hence owing to unconscious self-other identifications that even educated citizens of enlightened post-industrial societies are subject to, this unwillingness to compromise tends to be rewarded by the soulful and punished by the soulless. And that's where we're at!

28 May 2020

Pre-Endgame Strategy

In my current situation I find the long-term rather than the short-term impacts of the quarantine most concerning, and perhaps for this reason I've frequently found myself thinking about one particular long-term concern.

Stay-at-home orders are nearly superfluous in my case, hence the lockdown has, for me, so far been little else than a welcome sabbatical from rat-racing, and a fruitful period of study (both self- and other-). The near-total lack of structure is nonetheless something which I've always found slightly hazardous. And so here is one extrinsic benefit of music education that I'll toast to: as a brass player, I figured out even before the clickbait psychojournalists did that having a routine would be essential not just to parochially musical concerns but to the general preservation of sanity. Thus the tuba hour commences at noon daily. It is really more like 20 minutes and almost never starts before 1pm. I hesitate to call this "discipline," since the timing is too loose and too brief to qualify. If it is "maintenance," then disrepair carries the day. The main objective is not to forget how to play. There are a couple of mild conceptual challenges involved and no technical ones. Part of me laments that this is what it has come to for someone who veritably haunted the practice rooms in college, and who, gun to head, still claims the tuba as the center of his increasingly entropic intellectual and creative universe. All of those misgivings being as they are, I have no doubt that I'm making good on my frequent admonitions to young students that even this amount of practice, when it is logically structured, narrowly focused, and adhered to daily with the devoutness of a sacred ritual, can be productive and worth the trouble.

It never occurred to me to promote this ritual as a prospective lifeline to structure, invocable if the rest of the world seems to have frozen in time. Maybe I'll try that if and when I next return to teaching, since none of my other spiels have ever been the least bit effective in inspiring commitment where it did not previously exist. I am of course reluctant to expose students to the multi-layered ambivalence of the mid-career professional; that sort of radical honesty might be a bit too radical even for me. To take music and, more specifically, a musical instrument as not just a specialty but an identity, to face society as a tooba player, encompasses, as I have probably already written enough about, quite the dizzying array of privileges, struggles, and absurdities. In the present absurd conditions I do feel quite fortunate to have a readymade vehicle of routine, and I do believe the sanity-preserving function to have been borne out by this experience, but all of that merely represses the reality that it has been a decade and a half since I last found rigid adherence to a practice regimen fun and fulfilling for its own sake, and that both the duration of adherence and the intensity of "fun" have steadily diminished with time. This, taken together with the long-term inevitability of physical and mental decline, paints quite the discouraging picture of the aging brass player. Can this downward curve ever be flattened?

For all that I've invested in book learning, I am guided on the endgame question almost exclusively by two fond anecdotes which I've never bothered to investigate. First: a friend is fond of remarking that 50-year-old drivers have the fewest accidents and the lowest insurance premiums. They sit at an optimal point on the x-y graph of accumulated experience (lots) against physical decline (not yet). This seems to me a supremely relevant consideration for brass players as well, i.e. with an eye toward balancing cumulative achievement with quality of life by determining the optimal time to walk away. On which point the second, more morbid anecdote is salient, a nugget of my mother's dime-store-Marxist antisheltering, and a burden which more conventional American parents would never reveal to a pre-adolescent child: when all people do for 50 years is work, they often don't know what to do with themselves upon retiring, even if they thought they would; and when people don't know what to do with themselves in this profound sort of way, even when they thought they would, they often just die.

If the "x-y graphs" and "optimization" of the first anecdote sound too fully rationalized or mathematical to be useful in Real Life, then the urgency of death inspired by the second anecdote ought to be motivation enough to embrace them. Overlaid on all of that, for me at least, is the question of what Erikson called "generativity," essentially the province of culture's 50-year-old drivers, and for me split (not always happily) into generativity that pays the bills and generativity that feeds the soul. While I certainly tend to look forward to a day when I have played my last corporate ice cream social, even I would grant that a withdrawal from that kind of work represents a certain loss of identity in a society where your work defines you. (I think I want to live in a society that is not like that, but this is unlikely to happen.) By the same token, having developed out of tuba playing all kinds of peripheral intellectual and creative interests, the thought of someday making those peripheral interests central, without the tuba there to ground them, has always been both superficially appealing and deeply scary. Be it a privilege or a chore depending on the day, tuba playing is both the initial inspiration and the ultimate outlet for those other pursuits. Hence I fear equally the old-age regret of having stopped playing too soon, leading to a loss of focus in the other areas, and that of hanging on too long, wasting time doing subpar tuba work when that time could be more fruitfully devoted to the other areas. To be sure, both of these prospective regrets seem, literally, deadly. Thus I think it is reasonable to consider such scenarios ahead of time, before moments of choice are upon you. Tweeting about having a "no regrets" outlook regarding the things you can't control is no substitute for seeking foresight and taking initiative regarding things that are very much within your control.

To wit, I would conjecture that the optimization function f(tuba) is bimodal: either (a) give up playing young enough that a new generative identity can form, or (b) hang on to the one you've got til the bitter end, perhaps reinventing your aesthetic as your declining technique dictates. The third, more conventional option, as mutually determined by social and structural norms, is Retirement at the socio-structurally appointed Retirement Age. Many musicians simply aren't able to pursue this the way people with real jobs can, and some who could and should pursue it neglect to do so. The denouement of COVID will have a lot to do with whether or not this course is even available to me. That aside, I think that Retirement is plainly incoherent with not one but both of the above anecdotes; it is incoherent with considerations of identity, aesthetics, and achievement alike; in a word, it is incoherent with psychobiology itself. And so without denying that Retirement represents a privilege of sorts, I think it is my third choice. I view it as a privilege only relative to the fourth option: working myself into the grave. And so as events continue to unfold, I will be focused on playing a good pre-endgame.

11 May 2020

Bananaphone -- Quarantine Edition

10 April 2020

Mumford -- Art and Technics (xv)

"As against a single person who could use a brush passably, there were thousands who could take reasonably good photographs. Here the first effect of the machine process was to deliver people from the specialist and to restore the status and function of the amateur. Thanks to the camera, the eye at least was reeducated, after having been too long committed to the verbal symbols of print. People awoke to the constant miracles of the natural world, like an invalid long secluded in a dark room, able for the first time to breathe fresh air... But though the art of taking pictures is necessarily a selective one, the very spread and progress of that art, not least with the invention of the motion picture, was in the opposite direction; it multiplied the permanent image as images had never been multiplied before, and by sheer superabundance it undermined old habits of careful evaluation and selection. And that very fact, which went along with the achievement of a democratic medium of expression, has raised a whole series of problems that we must wrestle with today, if, here as elsewhere, we are not to starve in the midst of plenty." (94-95)

"What has been the result of the mass production of esthetic symbols that began in the fifteenth century? ... [The good:] By means of our various reproductive devices, a large part of our experience, which once vanished without any sort of record, has been arrested and fixed. Because of the varied processes of reproduction that are now at hand, many important experiences, difficult to transpose into words, are now visible in images; and certain aspects of art, which were once reserved for the privileged, are now an everyday experience to those who make use of the resources of printing and photography." (95-96)

In other words, reproduction is also, in many instances, record-keeping. Oppression and dispossession once inhered in the denial of the right to have a past, a heritage, a discrete culture, and indeed of the very right to collective introspection vis-a-vis these identifications, to have a hard look in the mirror on the cultural level; all of these rights have been progressively democratized by the ever-increasing ease and ubiquity of this "mass production of esthetic symbols."

"To understand the bearings of this change we must realize that it was at once a technical innovation, a social device, a means of popular education, and a way by which the monopoly of art by a small group was broken down. With the invention of graphic reproduction, pictures could go into circulation like any other commodity; they could be sold at markets and fairs so cheaply that all but the poorest classes could afford to own them." (87)

"From the fifteenth century onward, the picture was not merely something that you saw...[Rather,] in the cheap medium of an engraving it could be carried home; and so, in a sense, what it lost in uniqueness it gained in intimacy and variety and wide distribution. ... If they [reproductions] lacked pretentiousness, they gave to the unpretentious moments, the common occupations, the daily scene, the common pastimes, the dignity of being sufficiently memorable to be preserved. That was a victory for democracy, achieved in the arts long before its proposition, that all men are created equal, was put forward in politics." (88)

But here is a supremely pessimistic phylogenetic observation: aesthetics and memory became democratized before many more basic, essential forms of power. And so nowadays the homeless have smartphones but no homes; perhaps this is not an anomaly but in fact reflects a basic reality of Technics-driven civilizations.

"[The bad:] The fact is that in every department of art and thought we are being overwhelmed by our symbol-creating capacity; and our very facility with the mechanical means of multifolding and reproduction has been responsible for a progressive failure in selectivity and therefore in the power of assimilation. We are overwhelmed by the rank fecundity of the machine, operating without any Malthusian checks except periodic financial depressions; and even they, it would now seem, cannot be wholly relied on. Between ourselves and the actual experience and the actual environment there now swells an ever-rising flood of images which come to us in every sort of medium... A picture was once a rare sort of symbol, rare enough to call for attentive concentration. Now it is the actual experience that is rare, and the picture has become ubiquitous." (96)

Indeed, and I would extend this observation of Mumford's to the aforementioned photo-will too. Universal photorepresentational agency is only democratically salutary at a much smaller social scale than the one which currently presents itself; in other words, IRL accountability is a necessary check upon antisocial uses of photorepresentation. At present, meanwhile, the prospective subject-as-object is too likely to remain a mere abstraction to the photographer even (perhaps especially) beyond the curation and transmission stages. This begets alternately anarchistic and fascistic phenomena, here defeating by brute force any conceit to order or reason, there furnishing the proprietors of so many top-down, self-dealing orders with the best tools yet for exploiting anyone less powerful than them.

Photorepresentation was long ago made technically accessible, and some degree of curatorial agency has always been baked into the photorepresentational process; but reception, be it a matter of contemplation or gainfulness or anywhere in between, cannot be (or has not yet been?) Technically enhanced. The individual human being remains the basic unit of reception whether subsumed among ten thousand or ten billion others, and whether subsumed in a real or virtual community. The potential expansion of the capacities of the subject is, as the passage above hints, wildly incommensurate with that of the capacities of the subject-as-object. Vis-a-vis photorepresentation, what power we gain as desiring subjects we cede proportionally as we are threatened with photo-objectification at the hands of others. It is not merely that "progressive failure in selectivity and therefore in the power of assimilation" is an imposed failure at an impossible task, but also that the consequences of failure have changed. Overwhelmingly for the worse, I would say.

That said, I find it highly counterintuitive, actually, if I may be permitted a temporary flight of ivory tower rationalism, that conditions of scarcity would be the ones under which powers of discernment would be sharpened. Insofar as scarcity means taking what one can get, do we not thereby become eminently undiscerning and less picky? Hunger is the finest sauce. A recipe for ascetic inner peace, perhaps, but not for sharpening the powers of discernment. It doesn't make sense that "old habits of careful evaluation and selection" could be superior to new ones when there was previously far less to evaluate and select from.

In fact Mumford does later make a remark more or less to that effect:

"As long as a work of art was an individual product, produced by individual workmen using their own feeble powers with such little extra help as they could get from fire or wind or water, there was a strict limit to the number of works of art that could be produced in a whole lifetime... Under such a system of production there was no problem of quantity; or rather the problem was that of too little, not too much. Natural and organic limitations took the place of rational selectivity. Only those who exercised some special political or economic monopoly were ever even temporarily in a position of being threatened by a surfeit; and so the appetites remained keen, because only rarely could they be sated. Under such conditions, there was little reason to exercise a vigilant control over quantity, for fostering a discipline of restraint and a habit of studious selection; such discrimination as was necessary was that exercised on a basis of quality alone." (106-107)

This seems to me closer to reality, though I'm still not sure that the last line follows from what precedes it. It is not so much that pre-industrial culture begot well-balanced standards of discernment as that industrial modernity leads us to pine for them. In the speculative realm such standards are thus made conspicuous by their absence in present reality; as to whether they were ever part of any bygone reality, that is a rather different question.

Perhaps the pre-industrial epoch which Mumford (and I myself along with him, I confess) is tempted to idealize is in fact worth idealizing only for the happy accident that certain Technical capacities had stabilized at a level which was somewhat in harmony with Human capacities. If it was the aristocrats, then, who were first in human history to be "threatened by a surfeit," this is to say that they were the first to illustrate how easily human beings are seduced by abundance, how easily the conceit to a discerning posture is revealed by circumstance as merely a conceit.

"Expressive art, just in proportion to its value and significance, must be precious, difficult, occasional, in a word aristocratic." (108)

Not that I disagree much with the broader sentiment, but is there anything whatsoever "precious, difficult, occasional" about aristocratic consumption patterns? Or is this merely an ideal which human beings of all classes are hard-pressed to live up to unless it is immutably (i.e. materially) imposed on them? Only by a sensitivity to the finer distinctions among "precious" morsels coupled with what Mumford unabashedly calls a "puritanical" ethic of consumption might an "aristocrat" live up to their station; but we might more profitably label this achievement based on observed behavior rather than caste.

In such small rhetorical inconsistencies lies a crucial underdeveloped theme: perhaps the crediting of aristocrats with blazing the trail of refined taste is a narrative peddled by and for aristocratic interests. In fact the aristocrats' well-known lack of restraint was the first, best warning of what sins of excess would befall the rest of the human race should things like photorealistic generativity, fatty foods and sexual indulgence ever become available to them in abundance. This seems to me (I speak conditionally here as I am rather out of my depth by responsible academic standards) to comport better with the actual historical record, but also to thoroughly undermine any attempt to valorize the standards of those who have the most above those who have less (and certainly, I hasten to add, vice versa). To the extent that humans of every nation, class and epoch have consistently succumbed to such excesses as were available to them, to that same extent the evidence in favor of considering this an absolute human characteristic approaches an incontrovertible preponderance.

To complete the strictly rationalistic line of thought, abundance should be the condition which imposes this "aristocratic" posture by brute force; the condition by which we should be driven by threats to sanity and survival, no less serious than that proverbial marauding lion was to the bodily integrity of the caveman, to evolve on the fly our powers of discernment. If instead of a heightened sensitivity we find a mere numbing effect, if instead of a seasoned palate we find a mere retreat from the stimulus, if all it takes to bring about "a progressive failure in selectivity" is our "being overwhelmed by our symbol-creating capacity," then perhaps "selectivity" per se is a secondary rather than a primary psychological phenomenon. We don't live to discern, we discern to live. And that is to say that we are eminently un-discerning.

"We are rapidly dividing the world into two classes: a minority who act, increasingly, for the benefit of the reproductive process, and a majority whose entire life is spent serving as the passive appreciators or willing victims of this reproductive process. ...an endless succession of images passes before the eye, offered by people who wish to exercise power, either by making us buy something for their benefit or making us agree to something that would promote their economic or political interests..." (97)

"As a result of this whole mechanical process, we cease to live in the multidimensional world of reality, the world that brings into play every aspect of the human personality... We have substituted for this, largely through the mass production of graphic symbols...a secondhand world, a ghost-world, in which everyone lives a second-hand and derivative life." (97-98)

Indeed, the "wish to exercise power" and the acting "for the benefit of the reproductive process" have only grown closer together since these passages were written. The class angle is crucial, reflecting as it must barriers both economic and social. Unlike those more basic concerns, however, photorepresentational class boundaries are increasingly permeable; or, if that is going too far, it is at least increasingly possible even for agents who have very little overall power to nonetheless wield the technics of photorepresentation against those even less powerful than they are; similarly for the utterly powerless to wrest a modicum of power via the increasing accessibility of photorepresentational record-keeping. All of which is to say that photorepresentation is a form of power, one of the few which occasionally begets drastic (if temporary) inversions of seemingly insurmountable power gradients: think undercover cameras in a corporate slaughterhouse, or in an extra-marital dalliance with the chairman.

The Deep Fake phenomenon is the dialectical fissure here: in one respect it threatens to facilitate an ultimately powerful merger of surface photorealism with willful/gainful wholecloth creation; in another respect, as the technics of Deep Fakes gradually become more accessible, we will have no choice but to cease to trust photorealistic documents merely because they are photorealistic, and undoubtedly this would upend most of what we (think we) know about the place of photorepresentation in society and culture. It could, theoretically, mostly undo the aforementioned "democratization" via a total "devaluation" of the image-as-epistemic-claim. At that point, only "the actual experience" will count. A wonderful situation, it may seem, for artists with shows on the books, but a terrible one for epistemic and intellectual life, so terrible in fact as to threaten dignified existence and art along with it.

That these possibilities were latent in the medium of photography from the beginning is no reason to tar the entire field with the brush of disenfranchisement. But I do think that the difference between Mumford's classical-functional conception of photography and photography in a world of fully-subverted Deep Fakery is one of degree rather than kind. The degree in question is that to which morality leads or follows the id; that to which gainfulness leads or follows aesthetics; that to which the subject chooses to represent the object based on unmet psychological/individual needs rather than social/collective ones. In this respect, the fact that you can no longer trust even a timestamped, photorealistic document could ultimately mean the subversion of entrenched power, or it could mean the ultimate triumph of it. Again, responsibility is a human burden no matter the machines we may invent.

08 April 2020

Freud — The Psychopathology of Everyday Life (ii) — Superstition and Suspicion

I would distinguish myself from a superstitious man, therefore, as follows: I do not believe that an event not caused in any way by my own mental life can tell me any hidden facts about the future structure of reality, but I do believe that an unintentional expression of my own mental processes can reveal some hidden factor which itself belongs to my mental life alone. I may believe in outer (real) chance, but not in fortuitous inner (psychic) actions. A superstitious man will see it the other way around: he knows nothing of the motivation of his fortuitous actions and slips, he believes fortuitous psychic factors exist, and he is inclined to ascribe a significance to outside fortuitous events that will make itself felt in reality, and to see chance as a means of expression for something hidden that is outside him. There are two differences between me and the superstitious man: first, he projects a motivation on to something outside him, while I look for it within myself; and second, he interprets chance as some incident that has happened, while I derive it from an idea. However, what seems to him concealed corresponds to the unconscious in me, and we share an urge not to see chance as solely accidental but to place some kind of interpretation on it.

I assume that this conscious ignorance and unconscious understanding of the motivation of psychic fortuitous events is one of the roots of superstition. Because a superstitious person is ignorant of the motivation of his own fortuitous actions, and because that motivation is clamouring to be recognized, he has to accommodate it in the world outside himself by displacement. If there is a connection of this kind it will scarcely be confined to this one case. In fact I believe that a large part of any mythological view of the world, extending a long way even into the most modern forms of religion, is nothing but psychology projected into the outside world. The vague recognition (it might be called endopsychic perception) of psychic factors and circumstances in the unconscious is reflected--it is difficult to put it any other way, so here I must call on the analogy with paranoia--is reflected in the construction of a supernatural reality, which science will transform back into the psychology of the unconscious. The myths of Paradise and the Fall, of God, good and evil, immortality, and so on, could be understood in this way, turning metaphysics into metapsychology. There is less of a gulf between paranoiac and superstitious displacement than may at first glance appear. When human beings first began thinking, as we know, they felt compelled to resolve the outer world, anthropomorphically, into a diversity of personalities in their own image; the chance events that they interpreted in superstitious terms were therefore the actions and expressions of persons. They were just like those paranoiacs who draw conclusions from the trivial signs they observe in other people, and like all those healthy people who, correctly, judge character by the fortuitous and unintentional actions of their fellow men. Superstition seems misplaced only in our modern, scientific but by no means complete view of the world; as the world appeared to pre-scientific ages and peoples, superstition was legitimate and logical.

Relatively speaking, therefore, the Roman who abandoned some important enterprise if he saw birds flying in the wrong formation was right; he was acting logically in line with his assumptions. But if he abstained from the enterprise because he had stumbled on the threshold of his door (un Romain retournerait [a Roman would turn back], as they say), he was definitely superior to us unbelievers, and a better psychologist than we are, despite our current efforts. His stumbling showed him that some doubt existed, something in him was working against his enterprise, and its power could impair his own ability to carry out his intention just as he was on the point of performing it. One can be sure of success only if all mental forces are united in making for the desired aim. ...

Anyone who has had the opportunity of studying the hidden emotions of the human mind by psychoanalytic methods can also contribute some new ideas about the quality of the unconscious motives expressed in superstition. It is particularly easy to see how superstition arises from suppressed hostile and cruel feelings in neurotics, who are often very intelligent but afflicted with compulsive ideas and obsessions. Superstition is to a high degree an expectation of bad luck, and anyone who frequently ill-wishes other people, but has repressed such ideas because he has been brought up to wish them well instead, will be particularly likely to expect bad luck to descend upon him from outside as a punishment for his unconscious ill-will.

The Psychopathology of Everyday Life, trans. Anthea Bell, pp. 244-246

The crucial distinction here is inner- as opposed to outer-directed psychology. Freud here quietly levels a devastating critique of those who project their otherwise healthy skepticism exclusively onto the outside world and not at all back upon themselves. In today's colloquial terms this amounts to worrying about things you can't control, a sure recipe for frustration if not for madness itself, as well as for the peculiar condition, raised earlier in the work, of social actors who know (or seem to know) others better than they know themselves.

I don't know that I myself can make any exceptional claims to self-knowledge, but as an introvert mired in lifelong estrangement from the tyranny of extroversion which seems to run the world I was foisted into at birth, I certainly am apt to posit a privileged position here for my comrades in inner-directedness, and I can certainly conjure my fair share of anecdotes in which excess gregariousness is accompanied by obvious deficits of self-scrutiny. And since introversion and gregariousness are, of course, not mutually exclusive, I would head the list with my own more gregarious moments, which seem not merely to suggest but in fact to require a temporary relaxation of filters. As pertains specifically to public social interaction I indeed identify unapologetically with that ever-trendy neologism, the "ambivert," and as I have slowly learned to negotiate the social world and become more familiar (if not truly more comfortable) with its demands, the compulsive talker has made ever more frequent appearances and the wallflower ever fewer. This has indeed been profitable for both my self-knowledge and my relationship to this external social world; but it has also confirmed for me beyond a reasonable doubt that I am almost sure to regret the things that pop out of my inner extrovert's mouth, and oftentimes profoundly so. I definitely like myself less as the filter has become leakier with age, and I'm afraid that is probably a meaningful observation.

16 March 2020

Mumford -- Art and Technics (xiv)

"As with printing, photography did not altogether do away with the possibilities of human choice; but to justify their productions as art there was some tendency on the part of the early photographers, once they had overcome the technical difficulties of the process, to attempt to ape, by means of the camera, the special forms and symbols that had been handed down traditionally by painting. Accordingly, in the nineties, American photographs became soft and misty and impressionistic, just when impressionism was attempting to dissolve form into atmosphere and light. But the real triumphs of photography depended upon the photographer's respect for his medium, his interest in the object before him, and his ability to single out of the thousands of images that pass before his eye, affected by the time of day, the quality of light, movement, the sensitivity of his plates or film, the contours of his lens, precisely that moment when these factors were in conjunction with his own purpose. At that final moment of choice--which sometimes occurred at the point when a picture was taken, sometimes only after taking and developing a hundred indifferent prints--the human person again became operative; and at that moment, but only at that moment, the machine product becomes a veritable work of art, because it reflects the human spirit." (93)

"in conjunction with his own purpose"
In other words, this is where willfulness and vanity are turned to constructive ends; or at least the artists themselves are bound to think so, since these are their "purposes" and not someone else's. Ideally the audience/recipient also has an active part to play in assuming the same discriminating posture vis-a-vis any transmission they might choose to receive; but the question of direct communication of messages and ideas, of the equal validity of myriad contradictory interpretations of the transmission, looms large here. What if the recipient's purpose is, from the outset, somehow at odds with that of the artist? And even where their purposes are in fact aligned precisely, who is to say that this happy accident could not still get lost in the aesthetic shuffle?

Indeed, the skeptic is wont to intone: If you really need to send a message, write a letter. To decode that saying in terms of the present discussion: a language (one worthy of the name and shared by the parties concerned) is the most functional Technical means of communication (worthy of the name), and photography is not, quite, a language. In Technics, initial design choices determine the use dynamics of a machine, which in turn determine an essential purpose for it. Similarly, the very notion of message or purpose seems to dictate that there is, whether really or merely ideally, an essential standard against which efforts of realization can be judged. All together now: Communication is a branch of Technics, not of Art. Aesthetic productions, meanwhile, say what words alone cannot, which is fine if the substance of your message or purpose is vague or negotiable. If your purpose is in fact deathly important, then you should get Technical and exclude a large part of your human personality from the transmission. If your personality is the message, then it's fine to use Art, but you had better be an unusually interesting and deep person if you expect anyone else to care.

From any position short of full communicative potential, the aesthetic distinction between impressionistic photography and photographic photography (what else to call it?), between machine art which "attempts to ape...the special forms and symbols that had been handed down traditionally by [handicraft]" and that which "depends upon the [machine artist's] respect for his medium," between art which subverts mechanical function and art which affirms it, this distinction comes to seem rather arbitrary. In this passage it is implied, and elsewhere asserted, that art which subverts Technics is less successful across the board, including (perhaps especially) aesthetically, than art which affirms Technics. Mumford puts forward "machine art" as the clearest illustration of this dynamic, but ultimately I think that the distinction between functional art and recreational/contemplative/aestheticist art is most meaningful here. In other words, where there is a clear, objective standard of success or failure against which to measure the artist and their work, a standard which is borne of quotidian matters rather than hedonistic or metaphysical ones, then I would expect a pattern to emerge whereby affirmative machine use begets demonstrably higher functioning products than does subversive machine use. Art for its own sake, meanwhile, is definitionally oblivious to process, cares only about results, and imposes no (or at least many fewer) absolute standards of success or quality. This is why, from the standpoint of an aestheticist artist, Mumford's stricture against subversive machine use seems more like axe-grinding than meaningful analysis. I for one consider there to be no remarkable deviation in quality along the distinction between affirmative and subversive technicians; I do, however, see a certain path dependence as inhering in each way of working, with subversion leading not, as is so often claimed for it, to a broad flowering of untapped possibilities but simply to a more or less equally narrow set of possibilities dictated by the initial design of the machine.

As such, the lesson I would take from the example of nineteenth century impressionistic photography is neither that it is doomed aesthetically nor that it can find no function, but merely that the desire for social acceptance under a very particular rubric is itself quite the arbitrary consideration vis-a-vis Art, arising as it does from neither aesthetic nor functional demands but from social insecurity. It would be totally unsurprising, then, if art issuing from this quite unartistic mindset were to fail at fulfilling roles which it was neither conceived nor designed to fill. In my own bailiwick there is no shortage of analogous examples: there are instrumentalists who turn to extended techniques and avant-garde performance practices simply to draw attention to themselves, to stand out, to be contrarian, or to conceal other deficiencies; and there are those who make their names and careers as earnest, compelling avant-gardists who subsequently choose to cash in on the mere spectacular potential which inheres in a drastic reversal of course. And then there are musicians like Robin Hayward and Vinko Globokar who have built compelling practices on technical subversion and succeeded on most every critical level all while sustaining a sincere posture. That they are exceptional examples is, I think, a function of the overall poor signal-to-noise ratio in the contemplative arts, and not necessarily a function of how contemplative artists use or misuse machines.

If all of this is so, then it would be absurd to claim that the "human person" is less "operative" here than elsewhere. I've known some profoundly deficient, supremely operative human persons, and I think we can all be thankful, actually, that they've gravitated toward the contemplative arts and away from the functional ones.


"that final moment of choice"
Perhaps the photorepresentational will has just recently found its Technical apotheosis in the smartphone and its various space-age cameras, in the "burst" and the "moment," functions which have done for curation what the camera itself did for representation. This seems a near-archetypal instance of an innovation which was technically achievable decades before social conditions led it to be advertised widely on television. Similarly, it is just the latest instance of the problematic, the others, the imperfect rejects, being at minimum more interesting, and often enough also more artistic, than the acceptable, the idealized, the perfect, which it is the contemporary will's social duty to prefer. The proof: these others are so good, in fact, that a recent TV commercial leads with the outtakes rather than with the choice cuts. When the "final moment of choice" is thus instanced as a normative (non-)choice, it becomes undeniable that the outtakes are more interesting than perfection even if they are not necessarily better.

While there are device- and marketing-specific reasons, as well as social ones, that two-factor photo curation has just recently come (back?) into popular consciousness, this practice is incidentally also extremely relevant to the dynamic Mumford outlines here. These technologies themselves now make more transparent than ever before the possibility that this "moment of choice" can just as well come after the properly technical concerns and the gadgets themselves have been powered down and returned to the shelf. Curation is at that point not merely more accessible but, given the wide reach of these devices materially and socially alike, very nearly an essential part of photography, much as music production and post-production are, despite the prodigious recent growth of specialized credentialing therein, more likely than ever before in the recording era to be handled by the performers themselves. In one sense the counterproductive elisions of agency Mumford writes against have been made harder to accomplish; in another sense this has come about via a new regressive disenfranchisement within a formerly "democratized" art form, whereby social stigma and normative thinking pre-determine artistic choices that formerly lay more wholly with the individual. If you don't believe me, try playing raw sessions for an audiophile.

For those of us who wish to present ourselves to the world as artists first and foremost, there are two ways to interpret all of this vis-a-vis the will. Perhaps a compulsory choice is no choice at all; or perhaps this choice was always implied/tacit, and by being made conscious it makes (gently enough) a genuine agent out of the formerly passive recreator. Perhaps production responsibilities are imposed on music performers via an unfortunate confluence of economic, material and cultural forces; or perhaps musicians have thus wrested control of something they can do for themselves as well as anyone else can do it for them, thereby cutting overhead and regaining agency where usury and abdication had previously prevailed. As for photo-representational art, perhaps the social world thus represented is, essentially, a play of wills which is only made stronger by diversity; or perhaps this social world is a war of wills where greater technical power makes possible ever greater mutual destruction.

Presumably photogs still need the skill to account for many of the same variables Mumford lists even if their timing no longer needs to be perfect. There is even the possibility, which I assume has by this time been realized thousands of times over if not necessarily under the auspices of the formal art world, of a firmer division of labor between moment photographer and burst sorter, between Technician and Curator. In such a scenario, neither person can lay whole claim to Mumford's conception of artisthood independent of the partnership, much like a termite colony in which the group demonstrates the characteristics of a complex organism but the individual bug does not. Termites get a lot of work done this way, but a human society committed to any degree of individualism might think twice about the implications of such extreme divisions of labor for the fate of the individual. Is there not a point where lifting the burdens of agency itself becomes oppressive by stunting development? And is this not intrinsically what Technical advances do in spite of their many more salutary aspects?

"his interest in the object before him"
The object which is mechanically reproduced by the photographic image has, as far as I can tell, no sentience or agency in Mumford's account; but in fact this object is quite frequently, perhaps even paradigmatically, another human being, another citizen, social agent, desiring subject; and this means that the advent of photography greatly intensified a conflict of rights between the subject's freedom of expression and the object's freedom from it.

Mumford speaks to the possibility that the moment of choice can occur at two different stages of the process, either in the moment the picture is taken or as it is selected from among many such options. This two-part process of generation followed by curation is hardly unique to the photorepresentational arts, but the unique political dynamics of the representation of one subject by another are multiplied, literally and figuratively, by it. The object-agent can now be violated not just once but twice: first they can have their image captured for purposes over which they have less control or certainty than they are justly entitled to; next, they may see this image reproduced, deployed, distorted in all kinds of ways that may be more specifically violating. As photography becomes faster, more powerful and more precise, capturing the object in an unflattering moment requires far less skill than it once did; you simply need enough time and a fast enough camera. The narrow area into which the expressive personality of the subject is channeled by this technology is coextensive with the area where the object-agent can be violated. Just as machine art has unique and distinctive aesthetic and functional qualities, so it enables unique forms of violation which humanity didn't have to wrestle with back when it was far more difficult and technically inaccessible to hand-draw someone's spitting image quite so well. And so the internet is full of clickbait portals which compete for our attention this way: football game wrap-ups which lead with piles of players in unfortunate positions, political coverage leading with spitting-mad stills of unsympathetic figures who may merely have been speaking a prosaic word that happens to begin with a hard consonant. If representational mediums do not quite lead inexorably to these sorts of outcomes, nonrepresentational mediums do lead inexorably away from them.
It seems to me (still) that this fact has not been adequately considered or elaborated by scholars of art's place in society.