Showing posts with label postmodernism and postmodernists.

09 June 2024

Critical Commonplaces


The Commonplace Raised to a Higher Power
Arthur Danto
interviewed by Hans Maes
(excerpt)




This so-called 'method of indiscernibles'...arises, according to Danto, not only in the field of aesthetics but in all other areas of philosophy. Philosophy is supposed to address its subject matter...by seeking the conditions that make the things under scrutiny the kinds of things they are. The appropriate way of seeking these conditions, Danto suggests, is to examine how the thing...differs from an object or event that is ostensibly indiscernible from it.




Ugh.

Could you forgive some of us, just maybe, for seeing in this merely an exercise?

01 March 2024

Fallacies Intentional and Unintentional


This is my Goodreads review of The War on Music by John Mauceri. It turned into more of a summation of everything I've been churning over for the past several years.



Scattered amongst the howlers is a story that deserves to be told. Two stars for that story, zero for its rendering here.

This review is both too long and too vague. I blame the bullshit asymmetry principle.

+=+=+=+

There are some notes and citations at the end, but really this is a polemical work and not a scholarly one. It is a mad dash on the hamster wheel for Mauceri, who repeatedly stakes out some patch of moral high-ground only to tell on himself later. Even the digression on sour liner notes is recapitulated when, in the acknowledgments, he says, "Many peers have read this manuscript, some of whom were enraged. ... What was hated—and why—taught me a great deal." One can only hope. But for now he has merely doubled down, as any polemicist must.

10 July 2023

Richard Maltby—The Cinema of Disintegration


Richard Maltby
Harmless Entertainment:
Hollywood and the Ideology of Consensus

(1983)



CHAPTER 10

MERE ANARCHY: THE CINEMA OF DISINTEGRATION

[315]

...

PACKAGING CONCEPTS

The fragmentation of production coincided with the merger of the major distribution companies with larger corporate groupings. The period from 1966, when Gulf and Western took over Paramount, to 1969, when Kinney National Services merged with Warner-Seven Arts, saw an upheaval in company ownership more substantial even than that of the early 1930s. The majors diversified, predominantly into other media, or were absorbed into conglomerates attracted by their undervalued stock, their film libraries and their real estate. However, the reorganization of the industry that followed diversification was a less fundamental change than that provoked by the Paramount decrees. By and large, it extended the effects of divorcement. The merger with other media concerns, particularly the record industry, was in a sense only an extrapolation of the majors' post-

[316]

Paramount commitment to a power-base in distribution rather than production, and the growth of independent production completed a process begun in the early 1950s.

Hollywood's acquisition by conglomerates has, to a degree, merely been the swapping of one set of distant masters for another. The new landlords of the Dream Factory, like their predecessors, have pursued the primary motivation of profit; on occasion obtained by slum clearance projects like Kirk Kerkorian's sale of M. G. M. assets to build a Las Vegas hotel, or the urban renewal program of Century City on the back lot of the Fox studios. But if Hollywood has shrunk physically under corporate ownership, with its volume of production declining from 196 features in 1969 to 106 in 1978, its business remains much the same, and in one respect only have the new patterns of ownership made a significant difference to the way it conducts that business. The role of the mogul has been abolished: Hollywood's recent studio executives are men under different influences from those of Warner or Cohn. They share a trait common among corporate management, of frequent mobility of employment. Where Mayer ruled M. G. M. from 1924 to 1951, the studio saw six different studio heads in the years between 1968 and 1979. Only Warner Brothers and Universal had the same management team throughout the decade, while career structures like that of David Picker are increasingly the norm. Picker became President of United Artists in 1969, left to go into independent production in 1973, became head of production at Paramount in 1975, and returned to independent production for Lorimar in 1977. This pattern of short tenure in senior management helped to remove the last vestiges of any identifiable studio styles. By the mid-1970s the post-Paramount attitude of regarding each production as a one-off event had reached a point where none of the majors any longer possessed a recognizable identity either in its personnel or its product.

The corporate acquisitions and the economic crises of the late 1960s occasioned the removal of the old guard. Box-office failures combined with the spectacles of the counter-culture (Haight-Ashbury, Chicago, Woodstock) to offer further evidence that Hollywood's liberal consensus was no longer adequate to the demands of a more youthful and volatile audience. The accepted explanation was that the industry had lost contact with its audience because there were too many old men with too much control over production to encourage the right material. In response,

[317]

Hollywood engaged in an unparalleled wave of parricide. Its most conspicuous victims were the last surviving moguls. Jack Warner sold his interest in the studio in 1967 to embark on a notably unsuccessful career in independent production. Darryl Zanuck lost the last in a series of proxy fights at Fox, and retired in 1971. Between 1966 and 1973 all the majors acquired new, much younger production heads, drawn as often as not from outside the immediate confines of Hollywood. The more public search for the kid genius director concealed a more enduring palace revolution giving power to a younger generation of executives whose previous careers were most likely to have been in television, talent agencies or "creative management."* If the personnel changed, the professional ethos remained the same. Heads of production continued to insist on their ability to gauge an unstable public taste, and to argue that the nature of the industry militated against predictable profit margins.

In other areas of its financial operations, the new Hollywood was more susceptible to corporate influence. The long-term response to the financial crisis of the late 1960s was for the majors to withdraw further from direct involvement in production, concentrate on financing and distribution, and find more ways of hedging their bets over investment. Tax shelter finance became an important source of production funding in the early 1970s when bank capital was more cautious about investment in films, and it probably saved Columbia from collapse. Occasionally two companies would jointly finance a large-scale production, sharing distribution rights. Of greater significance was the practice of pre-selling films to exhibitors by demanding non-refundable guarantees in advance of screenings, passing the loss on unsuccessful blockbusters like A Bridge Too Far (1977) and 1941 (1979) onto the owners of the empty theatres. In mid-decade the majors began to recognize and capitalize on the value of ancillary markets to the point where television sales in particular were commonly negotiated in advance of production, and their revenues taken into account in calculating budgets. Such mechanisms of distributor protection


__________
*e.g., James Aubrey, former head of CBS-TV, who became President of MGM in 1969; Ted Ashley, former agent at William Morris and founder of the Ashley Famous Agency, who took over production at Warner Bros in 1969; David Begelman, co-founder with Freddie Fields of Creative Management Associates, who became Columbia's production chief in 1973.

[318]

meant that, at least for them, a film might show profit without drawing audiences. Their regular distribution fee, of 30 per cent of rentals, guaranteed them healthy windfall profits on "supergrossers," while also delaying the point at which every film was deemed to have broken even, after which the distributor would have to pay the film's producers a percentage of the profits.

Distributors negotiated from a position of strength to ensure their own stability, if necessary at the expense of exhibitors and producers alike. Theatre owners and television companies might have to carry the can for occasional unexpected box-office failures, but producers were more consistently penalized by overhead charges, punitive deductions for going over budget and interest charges while the film was recouping its costs. Although the commonly accepted notional figure for a film breaking even is 2.5 times its negative cost, on occasion distributor manipulation of figures prevented a film declaring profit up to a point well in excess of its notional break-even level. In December 1979 Fox declared that Alien, with a negative cost of $11m, had so far earned $48m in worldwide rentals and was still $2.5m in deficit. The net result of these distributor practices has been a pattern of broadly stable and increasing profitability for all the majors during the decade. By 1980, Ned Tanen, President of Universal Theatrical Pictures, was confident enough in both the certainty of profit and the uncertainty as to how it would be earned to declare,

the business projections we make for each year usually end up correct within one or two percentage points. We end up where we thought we were going to be, but we never, ever get there the way we thought we were going to get there.

Stabilized distribution economics and a mobile corporate bureaucracy are the real legacies of the crisis of the late 1960s, not, whatever Francis Ford Coppola's good intentions, greater freedom for the individual filmmaker.

The dominance of the major distributors suggests that the influence of the smaller production or production-distribution companies has been exaggerated by writers in pursuit of critical genealogies rather than economics. In itself, the Hollywood Renaissance of 1969-71 was an inconsequential event: in search of the profitable youth film and uncertain where to find it, the studios floated independent

[319]

production companies with radical intentions (in particular BBS and Pressman-Williams) by agreeing to distribute their product, and themselves backed a few small-budget first features by young directors. After Easy Rider, these were almost uniformly unsuccessful: the few "anti-Establishment" successes at the turn of the decade were either large-budget productions such as Little Big Man or Carnal Knowledge, or, like Midnight Cowboy and M*A*S*H, were made by older and more established directors.

The illusion of the Hollywood Renaissance has, on the other hand, been of more consequence in formulating the received history of the 1970s, largely because of the allegedly crucial influence of one man, Roger Corman, in sponsoring the first efforts of the majority of directors who attained critical prominence in the rest of the decade. Michael Pye and Linda Myles, in particular, have promoted Corman's centrality to the American cinema of the 1970s, in their book The Movie Brats. His record of success is not to be denied: Bogdanovich, Coppola, Scorsese, Kershner, Nicholson and Wexler all got their breaks via Corman, while his company, New World, was the prototype for Coppola's American Zoetrope, which itself sponsored Lucas. But Corman is (in almost any terms, but particularly economically) a peripheral figure in the film industry. Whatever claims to critical attention he may have, the nature of Corman's low-budget operation inevitably places it outside the orbit of the major companies, on whose omissions and miscalculations it is to a large degree dependent. Like his mentors Sam Arkoff and James H. Nicholson of American International Pictures, Corman's stock-in-trade has been the exploitation of otherwise unrequited demand, whether that be as producer of biker movies or as American distributor of Cries and Whispers. His reasons for employing young talent have equally always been economic. Untried directors, actors and crew eager to make their first film are cheaper than seasoned and unionized professionals. AIP, New World and their imitators have largely taken over the function of B-features as the training-ground for talent the majors will later absorb.

Corman's historical importance stems from his commercial success in the period of the majors' greatest insecurity. But his working procedures were not a solution to Hollywood's economic problems, because they did not provide the majors with substantial enough product. In the early 1970s they were prepared to employ anyone, even Russ

[320]

Meyer, who might provide a clue to audience taste. By mid-decade, they had abandoned their scruples and committed themselves to producing and distributing the kind of overtly sensationalist material they had previously avoided, and independents like Corman could not compete in production values with the likes of The Omen and Carrie (both Fox, 1976). With the decline of low-budget production, Corman's critical cultism and his commercial reputation began to ebb.

It may be that the most significant legacy of the brief rise of the exploitation movie in the Hollywood Renaissance was the majors' adoption of exhibition patterns that independents like AIP had been pioneering earlier in the decade. Saturation booking, the simultaneous release of a film into a large number of theatres at the same time, was a standard practice among exploitation filmmakers, whose economics required the rapid recoupment of investment. The majors began experimenting with it in the late 1960s, shortly before they started to use national television advertising. Strategies of this kind greatly increased distribution costs by expanding publicity budgets and print costs. Where in 1960 a maximum of 350 prints of a film might be made, by the late '70s a movie given blockbuster treatment might require as many as 1000 prints. Expenditure on publicity now regularly exceeds a film's negative costs (Fox spent $10.8 million making Alien, and $15.7 million advertising it). Such marketing mechanisms, available only to a limited number of films at a time, inevitably reinforce the distributors' blockbuster mentality. The new economics revealed themselves clearly enough in 1971, when the year's top-grossing film, Love Story, earned more money in domestic rentals than the next three highest-grossers combined.

As James Monaco has pointed out, what is notable about this economic strategy is that it is an essentially conservative response to a situation of limited audiences. The increased expenditure on publicity, with its tacit acknowledgment that it is possible to sell a film to the public, provides a further mechanism of distributor control. A low-budget production like American Graffiti may produce phenomenal profits when measured by the ratio of rental income to negative cost (in this case of 5000 per cent). But the decision to sell the film vigorously enough to make such earnings possible lies with the distributors, whose preference remains for the reliable investment. American Graffiti's success bought George Lucas a fourteen-fold increase in budget for his next film, Star Wars, the most remunerative movie in Hollywood

[321]

history. By comparison to American Graffiti, it yielded a mere 1855 per cent profit on investment. But it was a product more satisfactorily geared to the logic of a corporate economics seeking market stability than the much less predictable earnings of Lucas' earlier film. Despite the enormous cash-flow figures of individual films, the blockbuster approach to marketing is, like all distributor mechanisms, designed to guarantee commercial stability rather than maximize profits. In this respect, it is in the grand tradition of Hollywood economics, where a superficial extravagance conceals a fundamental conservatism.

In contrast to the calamities of 1969-71, relatively few blockbusters have failed to cover their negative costs in the later 1970s, given the protection for the distributor provided by exhibitors' advance guarantees. On the other hand, blockbuster economics have a peculiar and apparently cyclical habit of getting out of control. At the outset of the cycle, unexpectedly large profits accrue to one or more films, provoking a wave of imitations formulaically repeating the successful film's attractive "elements." Production and marketing budgets expand in the attempt to produce more of the same, to a point where investment in production exceeds any possibility of recoupment, and companies suffer heavy losses as a result of overproduction. Retrenchment, in the form of limitations on budgets and a drop in the number of films produced, follows until the cycle repeats itself with another spectacular financial success provoking imitation. From the crash of Cleopatra in 1963 the cycle has repeated itself twice, reaching its critical stage in 1969-71 and 1980-81. The most recent crisis, involving films such as Hurricane (1979), Raise the Titanic (1980), and most notoriously Heaven's Gate (1981) has not, however, been nearly so severe as the previous decade's, because the major distributors have maintained a firmer grip over expenditure, on occasion simply deciding to write off a $22 million investment in the production of Sorcerer (1977) rather than plough an equivalent amount into its promotion. The losses on individual films in 1980-81 were, in any case, occurring in a broadly buoyant market. The crisis was provoked rather by a degree of laxity in the supervision of a number of substantial projects and the box-office failure of a cycle of disaster movies, rather than the complete breakdown of producers' ability to predict public taste. The conservative blockbuster approach, with its commitment to marketing rather than production, remains fundamentally sound.

[322]

To some extent, the differences between the production methods of exploitation movies and the packaging of blockbusters is merely a question of scale. In 1955 AIP was pioneering a process of commodity packaging by constructing a film around a title and an advertising campaign. The Beast from 10,000 Leagues has mutated into American Gigolo, initially constructed around a title and John Travolta (replaced, with a drastic cut in the budget, by Richard Gere). The essential change has been the mutation of the idea ("You bring me an idea," said Jack Berners. "Things are tight. We can't put a man on salary unless he's got an idea.") into the concept ("That notion of the gigolo as a metaphor for the man who can't receive pleasure hit me and from that moment I had a metaphor that was uniquely representative of that problem.") The heavy emphasis on marketing strategies, combined with the absorption of distribution companies into multimedia conglomerates, has elevated the concept to a central place in contemporary Hollywood construction. Movies no longer exist as autonomous industrial products, but are increasingly manufactured as one item in a multi-media package. Star Wars, with its toys, games and bubble-gum spin-offs, is only the extreme version of the conventional packaging of a concept as film, record, "novelization," and so on. The use of pre-sold source material, in novel or play form, was hardly new in Hollywood, but producer Robert Evans set a precedent when he persuaded the publishers of Love Story to print 25,000 copies of the book by offering $25,000 for its promotion. Integrated and jointly financed promotion campaigns became increasingly the norm in the late 1970s, by which time the hype had become almost an art-form in its own right. The carefully orchestrated publicity campaign for Jaws ensured that the film's release just happened to coincide with widespread reporting of shark sightings around the American coast.
Timing in such complex campaigns could be crucial in other areas, too. The disaster for Star Wars had nothing to do with the film. It was in not having the children's toys in the stores in time for Christmas.

This process of multi-media packaging has effectively substituted for the studio in the placement of an individual film. Instead of being part of a balanced cluster of films produced out of the same studio, it has become one of a group of products occupying different places in the media web. Likely to be the most profitable individual element, the status of the movie has nevertheless been diminished by a need for formal compromise with the demands of other

[323]

products. In its construction, its producers have been obliged to consider the possibilities for its exploitation as a series of linked but separate commodities, and to compile their package accordingly.

As Hollywood terminology the package has a more specific meaning relating to the assembly of a production. Stars, script (or concept), and less frequently a director or producer, are "packaged" by a talent agency or an independent producer, and this package is then offered to one of the majors for financial backing or a distribution deal. Apart from its tendency to de-emphasize narrative, such an assembly procedure is no more novel than the pre-sold source, but it is another function formerly performed by the studios and now dispersed among a more amorphous body. Packages can be initiated by a wide variety of sources, and it is contemporary Hollywood folk wisdom that more time and effort is spent in the arrangement of the packages than in the resulting film, the process being made more complicated than previous systems of production by the competing interest of the various individuals involved. As Joan Didion put it in her essay "In Hollywood,"

... to understand whose picture it is one needs to look not particularly at the script but at the deal memo.

She provides an acute analysis of the aesthetics of the deal:

The action itself is the art form, and is described in aesthetic terms: "A very imaginative deal," they say, or, "He writes the most creative deals in the business." ... The action is everything, ... the picture itself is in many ways only the action's by-product.

The deal mentality is the result of uncertainty; many more films obtain money for development costs than go into production, and each individual, to stay in reasonably frequent work, needs to be involved in several projects at the same time in the expectation that one of them will come to fruition. This is particularly true for independent producers, whose income generally comes from profits rather than project development money, and who must therefore gamble on as many deals as he or she can keep going. Deal psychology has also facilitated--as well as in part being caused

[324]

by--the predominance of agents in contemporary production. The speculative and negotiating skills needed by the producer as deal-maker have much more in common with those of the talent agent than they do with the organizational and financial abilities required by a studio producer. Since the deal was inaugurated by Arthur Krim and Robert Benjamin of United Artists in 1951, the dividing line between agent and producer has become ever thinner, and the occasions on which the agent has become the producer more common. The most grandiose version of this occurred in 1962, when MCA was forced by the Department of Justice to abandon its talent agency activities and took over Universal, but the list of former agents who have become producers or heads of production is almost endless, and it is these figures who supply and maintain the deal mentality, and the insecurity it breeds.

While Didion's recognition of the substantial irrelevance of the final product to the processes of its packaging is further evidence of the New Hollywood's narcissism and incoherence, it should not in itself be seen as evidence of a decline. Packaging is no more detrimental to film production than the modes of organization it has replaced; those, like James Monaco and Pauline Kael, who insist on seeing it as such have essentially failed to recognize that Hollywood never existed to make films, but rather to make people go to the movies. Like the studio system, the goal of packaging is the production of entertainment; like the studio system, packaging functions as an arrangement for reducing emphasis on the role of the content in what is being sold. The logic of media conglomeration has widened the marketplace in which the product is sold. It is now as tangibly on offer in book- and toy-stores as it is in movie theatres. In the process, its nature has changed.

The aesthetics of the deal have combined curiously with the critical enhancement of the director's status to produce, in the work of Spielberg, Lucas, De Palma and Milius, films which at the same time demonstrate a "personal cinema" through their mannerisms and operate the mechanistic structures that James Monaco has aptly identified as those of an "entertainment machine," much less concerned than earlier movies with telling their audience a story. Repeated assertions that the story is seldom a central element in deal-making indicate the extent to which narrative has been dethroned. Steven Spielberg suggests,

[325]

What interests me more than anything else is the idea. If a person can tell me the idea in twenty-five words or less, it's going to make a pretty good movie.

But it is unlikely to be a film in which narrative reaches any great level of complexity, something which is clearly true of all Spielberg's films, which comprise situations allowing for plenty of spectacle but little plot development.

The speed with which narrative declined as a force in the movies in the 1970s may be indicated by looking at the decade's one contribution to Hollywood's repertoire of genres, the disaster film. Disaster movies are contemporary, debased epics, but more importantly they represent the archetypal package vehicle, the instrument the majors found for spending their money on predictably appealing spectacle. As a genre, they share neither an iconographic nor a narrative consistency, but rather an assembly of elements: stars in emotional conflict, sustained in crisis by a physically restricting situation. Airport, the first success of the disaster cycle, established a conventional pattern by which the audience is attached to the narrative by its concern for individual characters. Later variants overtly dislocated the competing elements that Airport successfully held in tension. Airport and its sequels maintain a linear (if circular) narrative: the survival of its characters is attached to the fate of the aircraft. All of them survive or perish together, however big or small their billing. The Poseidon Adventure (1973) is much more selective. Not only does its situation manage to dispose of all the minor characters (they are drowned en masse minutes into the film), but it also permits spectacle to be detached from any plot obligation. Random incident determines the fate of individual characters: Shelley Winters has a heart attack, Stella Stevens falls into a burning oil slick. Since the plot itself cannot develop--either some or all of these characters will survive or they won't--relations between characters are required to fill in the gaps between the film's spectacular occurrences. Because the situation supplies them with so little to sustain dialogue ("How do we get out of here?", "Where do we go next?") beyond the need to make the right choice to stay in the movie, they have to talk about something else.
Hence the amount of time given over to discussing how fat Winters is, and the unprovoked belligerent exchanges between Gene Hackman and Ernest Borgnine.

[326]

The result is an overt and unintegrated application of sentiment, most apparent in Winters when for no good reason she remarks to her husband, Jack Albertson, "Manny, how long is it since we told each other I love you?" At her death she repeats the same function with a more explicitly symbolic purpose, as she gives Hackman the Jewish sign for Life she has brought for her grandson. Separable incidents such as these provide an arbitrary and imposed meaning for the action, which otherwise remains spectacularly independent of significance.

Irwin Allen's next production, The Towering Inferno (1974, Fox and Warner Bros, a package assembled by Creative Management Associates), carries the process further, eliminating narrative altogether and substituting a game pattern of random incident and problem-solving for its characters. The film's introduction establishes a number of potentially complex character relationships with a thematic issue, mainly revolving around the complicity of William Holden and Richard Chamberlain in the breaching of safety codes. These are hastily abandoned once the fire breaks out, and are used instead to confirm characters' positions. Chamberlain becomes the film's bad guy, Holden's moral ambiguity is simply forgotten in the confusion. Where in The Alamo the survivors represent the hope of the future, the best Holden can offer by way of moral summary at the end of The Towering Inferno is, "All I can do is pray to God that I can stop this from ever happening again." The film operates the mechanisms of earlier narrative forms--Jennifer Jones' cat becomes a sentimental object embodying loss when O. J. Simpson gives it to Fred Astaire at the end--but operates them detached from a continuous narrative. The film is a series of disconnected exchanges between characters interrupted by the spectacle of the fire. Its packaging revolves round its situation and its consortium of stars. Characters are paired off in the introduction, offering a multiplicity of separate stories which the film may or may not choose to develop. The quantity on offer permits the film to dispose of some of them at random: Robert Wagner's clandestine affair with Susan Flannery ends abruptly when they become the first victims of the fire; Jennifer Jones arbitrarily falls to her death. Any character or story is available for sacrifice without disrupting the spectacle, and the only guarantee for survival is star status. By the same token, individual

[327]

scenes operate as separate and complete units in themselves, unconnected to the rest of the film. Paul Newman, Jones and two children spend ten minutes negotiating a demolished staircase, an incident quite detached from events occurring elsewhere and getting them, literally, nowhere. Immediately afterwards they discover their route down is barred, and have to climb up again.

The film revolves around creating incidents engineered by an arbitrary chance, such as the cement which blocks the door into the party room. No adequate explanation is offered for its presence, no justification required except that it provides grounds for another scene. Its placement is as fortuitous as that of the wall-light which Newman uses as a foothold to climb up to the pipeshaft in the same scene. Instead of seeking narrative continuity, the film is constructed like a set, with each group of characters isolated in their own area. What provides its coherence is not any sense of continuity or character development (the characters actually get simpler as the film progresses, and moral status is finally reduced to how well each character behaves when he or she stands in line for the bosun's chair), but the performances of its stars. Richard Dyer has commented on the importance of the stable camera and the stars' charisma in making the audience secure as they witness a disaster, but the stars' performances have another function as well. They--particularly Newman and McQueen, but also Holden--are the only sources of coherence in a film whose content is concerned with collapse, destruction, and deconstruction. Against this, the stars' fulfillment of their industrial, commercial function directs the film away from a concern with loss, death, pain and money to a celebration of its performers, whose presence is necessary to justify and explain away everything else in the film. The audience witness performance as they witness spectacle, and since neither proposes causal relationships between consecutive events, they must accept arbitrariness in the film's plot progression.

[328]

The impression of arbitrariness in the reporting of disaster reinforces the arbitrary quality of experience itself, and the absence of continuity in the coverage of events, as today's crisis yields to a new and unrelated crisis tomorrow, adds to the sense of historical discontinuity--the sense of living in a world in which the past holds out no guidance to the present and the future has become completely unpredictable.

Although Christopher Lasch's remarks are primarily directed against the news media, they apply equally to the narrative structures of packaged blockbusters. A variety of psycho-sociological explanations for the disaster movie phenomenon have been offered, and they can readily enough be identified as part of a larger conglomeration of films (including the science fiction packages which replaced them and horror films) which explore the bourgeois American hero's confrontation with the Unknown. This general emphasis seems at first sight almost too easy to identify as a significant cinematic response to the circumstances of the 1970s. Specifying what provokes such heroic insecurity is, however, rather more difficult, particularly in a critical climate dominated by psychosexual interpretations (the shark in Jaws as both phallus and vagina dentata). What has been less frequently pointed out is the aptness of the disaster movie as a metaphor for the film industry's own situation. Faced, at the beginning of the decade, with economic catastrophe and uncertainty about audience demand, Hollywood responded by abandoning the structures of narrative continuity that had previously served it so well, and inaugurating a cycle of speculative investments in disaster in which the only security, for audience and industry alike, came from star performances. The Unknown in these films is not merely contained in their content, but also in the way they are put together out of separable elements. Later variants of the package took the phenomenon to even greater extremes. Close Encounters of the Third Kind makes no attempt to connect its scenes or explain itself. As a narrative it is incomprehensible, as a story it spends two-and-a-half hours getting to the point at which a 1950s science fiction movie would begin. 
The Unknown in the American cinema of the 1970s is, more than anything else, a matter of narrative structure, a question of what commercial cinema should do if it is not to tell stories. Both the initial problem and its apparent solution came from the new instrument of consensus, television.



...

[357] American Graffiti might more conventionally be described as nostalgic, but nostalgia is only a form of fantasy. Nostalgia consists in a particular relation to history, in which objects are displaced from their material context in time and relocated in another framework detached from their original position. American Graffiti is no more set in 1962 than Star Wars is set "In a distant galaxy long long ago and far far away." It is set in 1973, fixed there by the style of its images and performances, and creates a fantastic version of Modesto, California by its nostalgic consumption of objects loosely belonging to the period it claims to represent. Nostalgia collapses into sentiment in the film's last shot, when it arbitrarily attempts to revise itself by entering history with a deterministic account of its characters' subsequent lives. The nature of the film is suddenly and drastically changed. Instead of remaining within the safe space of the fantasy movie, where privileged characters can produce non-causal performances, it suddenly claims that this night has been a formative experience, a dramaturgy which will lead to change in the external world. Curt escapes the closed world which will kill John and stifle Steve (Ron Howard) by going to college and becoming a writer in Canada, presumably to escape the draft. In a vestige of the liberal tradition, Terry the Toad (Charlie Martin Smith) is killed in Vietnam because he is physically inept with a motor scooter.

[358]

Nostalgia has pervaded the American cinema of the 1970s as a leitmotif of narrative uncertainty. In the films of Dick Richards, for example, it seems as if the authenticity of the costumes and the labels on the tin cans is used as a substitute for coherent story development. The Culpepper Cattle Company (1972) resolves itself by a familiar device in films which make some initial attempt to reconsider the presuppositions of their genre. It collapses into generic conventionality, with the bad guys developing consciences and saving the wagon train. The same strategy of collapse can be found in Coma's (1978; dir. Michael Crichton) abandonment of its assertive heroine (Genevieve Bujold) and in the gradual conversion of Alice Doesn't Live Here Anymore (1974; dir. Martin Scorsese) from a film about Ellen Burstyn's independence into a "woman's picture." At the beginning of the film, she and her neighbor fantasize about Robert Redford. At the end, she gets Kris Kristofferson.

Another of Richards' contributions to the decade's generic nostalgia, Farewell My Lovely (1975), offers an alternative response in employing the insecurities of film noir. The investigative narrative and its archetypal heroes, the private eye and the journalist, emerged in 1974 as figures for post-Watergate fictions. Their heroic status was compromised by their inability to bring their narratives to a successful resolution (Chinatown, The Parallax View); instead, the films beguiled their audiences with the notion that the central characters were as confused about the plot as they were. The employment of noir fixtures was a self-conscious justification for narrative confusion. The audience was presented with a recognizable terrain inhabited by objects and lighting codes remembered from earlier films, and this evocation of displaced objects directed attention away from plot to the image and the central performances of bewilderment and uncertainty.

The resort to nostalgic conventions and the unconvinced re-enactment of generic patterns is indicative of the more general collapse of temporal coherence in films of the 1970s. Wherever else in American culture the sense of historical continuity has come under attack, Hollywood has measured its deterioration in the growing failure to construct coherent linear narratives. Temporal connection, the primary tool of narrative causality, has been increasingly abandoned in favor of structures that declare their incoherence. Dog Soldiers (1978; dir. Karel Reisz) is in many respects (its presentation of space for example) notable for the old-fashioned

[359]

conventionality of its construction. But it makes no attempt to place its characters in time, either historically (the film might be set in 1971 like the book on which it is based, or it might not), or in their movements from scene to scene. Instead, there is an assumption of simultaneity: the audience is forced to assume that the disparate events affecting the two principal characters occur at more or less the same time if it is to construct a comprehensible narrative sequence--a task which the film passively declares is not its responsibility. As it progresses, Dog Soldiers degenerates into a chase movie and its central conception of splitting the post-Vietnam American hero into two individually inadequate and mutually dependent characters collapses. By the climax both have become capable of heroic action, the motivation for which remains inaccessible to the audience, since neither character has previously offered a rationale for his actions. Ray (Nick Nolte) declares at one point, "I don't always have to have a reason for the shit I do," and the unmodulated performances of both Nolte and Michael Moriarty provide the spectator with no evidence of their motivations.

Where the American cinema of the consensus developed its mechanisms of construction around a requirement to produce narratives that were rigid in their linear determinism, the cinema of disintegration has commonly abandoned the attempts to tell stories at all, providing rather a sequence of events arbitrarily connected by the fact of their being edited together. From this the audience may construct as much of a story as they feel capable of. This loss of confidence in the ability to construct a sequential narrative time reveals itself most clearly in a reluctance to provide an ending. Star Wars does not just announce that it is not set in the conventionally remote future of science fiction but in the distant past. At its end it declares that it is the fourth episode in a series of nine.

More normally, Hollywood's recent products have refused to provide a sense of resolution in their conclusion, and have abandoned their central protagonists to an ambiguous fate. Gene Hackman seems particularly prone to this discomfiture. In Night Moves (1975; dir. Arthur Penn), he is left wounded in a disabled boat which describes circles in an otherwise empty ocean. The Conversation (1974; dir. Francis Ford Coppola) closes with him playing the saxophone in the apartment he has just demolished. While the conclusion of Penn's film is clearly open to metaphorical interpretation,

[360]

the end of The Conversation is merely ambiguous, available to signify anything. Coppola is notorious for the difficulty he has in ending his films, Apocalypse Now (1979) being merely the most spectacular and extravagant example. But the reluctance of Hollywood's contemporary self-conscious auteurs to provide endings which locate the meaning of their films is remarkably consistent. One might argue that the ambiguity of the final "God Bless America" sequence of Michael Cimino's The Deerhunter (1978) is an economic necessity, since a film which refuses to declare its attitude to American involvement in Vietnam is a safer box-office bet than one which does. One might argue that it allows the audience a choice of interpretation, or that it reflects the ambivalence of American response to the war. What it undoubtedly does do is to leave the film open as a text for an endless critical game-playing over its ideological implications, which may well guarantee Cimino's dubious status as an auteur simply by the weight of paper devoted to him. As part of a more general tendency, the contemporary emphasis on an aesthetics of performance would suggest that, since "Robert De Niro is The Deerhunter," whatever Robert De Niro does has the support of the film.

The privileging of performance which is so consistent a feature of the Hollywood product in itself disrupts the temporal continuity of a causal narrative. In performance structures, what a performer does at the end of his or her routine is no more significant than what he or she has done at any other point. The openness of Altman's (or, to a lesser extent, Coppola's) films to almost infinite restructuring is evidence of this, and endorses the argument that a fixity of meaning simply is not present in these inherently incomplete texts. By not telling a story (but rather offering several incomplete stories for the spectator to choose from), such films cannot be said to occupy narrative time. It is, then, hardly surprising that so little of the American cinema of the 1970s has concerned itself with an investigation of temporal structure, preferring instead to abandon time as a fictive concern either by the resort to nostalgia or by making narrative construction entirely the responsibility of the audience.

One of the few consistent exceptions to this general practice has been Sam Peckinpah's reassessment of the primary cinematic myths of America. Peckinpah's critical neglect during the decade has been curious: dismissed for his apparent political conservatism and misogyny and condemned for his depiction of violence, Peckinpah has

[361]

nevertheless conducted the most complex revision of cinematic temporal structures since Welles (or perhaps Griffith), and provided a functioning solution to the problem of joining inside and outside while operating firmly within the new post-television aesthetic. Peckinpah's films, however pessimistic their thematic conclusions might be, present some of the few coherent discussions of the pervasive phenomenon of incoherence in the contemporary American cinema and, contrary to most critical assumption, reconsider the problematic nature of heroism in a universe where morality can no longer be straightforwardly attached to physical decorum.

His early films (up to The Getaway, 1972) play on the extent to which their central characters exist as heroic outsiders because of their opposition to temporal progress. One advertising slogan for The Wild Bunch (1969) was, "The land had changed. They hadn't." It was equally applicable to the two gunfighters in Ride the High Country, Tyreen in Major Dundee, Cable Hogue and Junior and Ace Bonner. Usually aging men running out of space in which to act because time (progress) has made them redundant, Peckinpah's early heroes engage in some futile, romantic, and usually fatal gesture of rebellion, a sub-Hemingway stance which has clung as firmly to Peckinpah's public persona as it once did to John Huston's.

His later films, however, have questioned the traditional mechanisms of heroism. His central characters lack moral certainty, and they are also deprived of the guarantee of heroic status their performances might bring them elsewhere. Pat Garrett and Billy the Kid (1973) does not concern itself primarily with Billy, whose mythic status is secure before the film begins, and who has nothing to achieve except its confirmation by his death. Instead Peckinpah concentrates attention on Garrett, who falls victim to the moral incompatibility of his desire to survive to be "rich, old and grey" and his need for individual independence. As a mythic force, Billy remains immune from narrative pressure, a situation reinforced by the industrial status of Kristofferson's performance. His physical movement is unaffected by the events of the film, and he relaxes into a separable activity of role-playing which represents both Billy the Kid and Kris Kristofferson, country-rock star. By contrast, James Coburn demonstrates his entrapment within the narrative, and his vulnerability to historical processes by becoming stiffer and more pained in his movements as the film progresses. Garrett's tragedy lies in his gradual discovery that a

[362]

professional commitment to a linear course of action guarantees neither the loyalty and respect of his corporate employers nor the moral endorsement of the film and its spectators.

Peckinpah's subsequent films all assume the moral vacuum Garrett discovers. Bring Me the Head of Alfredo Garcia (1974), The Killer Elite (1975) and Cross of Iron (1977) occupy an anarchic terrain in which betrayal is endemic and heroism is inevitably compromised. Their central characters all function within a framework which assumes that their personal objectives will prove incompatible with those of the larger external forces which have determined the circumstances the film presents. For the protagonists, any action is permissible in the quest for survival, from the mutilation of a corpse to the murder of a child, but such figures can no longer hope for the sympathy of their audience. Nor, increasingly, do they seek it; Steiner (James Coburn) has no attachments to anything outside his platoon, and no rationale for his behavior except survival in what he describes to the Russian boy they take prisoner as No Man's Land. None of the characters in Cross of Iron enact positions which the audience can endorse, since the conventional yardsticks of morality by which they might be judged are not contained within the film. Steiner's brutal laughter, which closes the film over images of dead children, is an acceptance of the arbitrariness of the war the film depicts, and of the film's depiction of it.

Peckinpah's films match their deconstruction of moral certainty with an equally deliberate deconstruction of the spatial and temporal certainties within which such a moral certainty might exist. The films realize the condition of arbitrariness rather than merely depicting it, and force the audience to experience the condition of their characters by paralleling the characters' moral situation with the physical, perceptual situation of the audience. At its broadest, this process is signalled by Garcia's beginning and ending on a frozen frame: cinematic time is displayed as an arbitrary construct, which the film is free to play with as it wishes, and which the audience must simply endure acceptingly. Where, in his earlier films, Peckinpah employed slow motion to render ambiguous the spectator's response to a brutal action by revealing its grace, his later films employ it to reveal the arbitrariness with which the film travels through the gate of the projector. Slow motion ceases to indicate a significant event, as it did in The Wild Bunch, but rather to divert the audience's attention to incidental

[363]

physical trajectories, such as the arc described by the spent shells ejected from a sub-machine gun. Peckinpah repeatedly demonstrates the moral incompatibility of cause and effect; Cross of Iron returns again and again to intercut shots of explosions and artillery shells being ejected. This is a description of process, established by a kind of angle-reverse angle cutting, but one which is only made possible by the recognition that the cinema constructs its space according to unique laws which enforce a relation between two consecutive images.

As juxtaposition constructs significant space, it also enforces temporal progression. The tank battle in Cross of Iron enacts in microcosm the narrative process of Garcia. The sequence begins with a series of static shots of the Russian tanks, cut together in an accelerating montage which animates the tanks themselves into movement. The film constructs not only its own moral landscape, but also its own momentum, which arbitrarily obliges or interrupts the movement of its characters. Peckinpah's aesthetic is constructed around the acknowledgment that the American cinema of the 1970s can place any two shots together and create an arbitrary meaning through the creation of an arbitrary space and time. It is an aesthetic that makes no concessions to the audience, who are offered fewer and fewer positions they may comfortably adopt, either spatially, temporally, or ethically. In Cross of Iron, the spectator becomes a redundant witness to a process completely out of his or her control.

In Peckinpah's films, the audience's only recourse is to a morality external to the film itself. In this deliberate anarchy is the most coherent statement of the endemic incoherence of contemporary American cinema. The collapse of consensual structures has led the American film into an apparently unavoidable oppositional stance to the primary source of consensus, television. The best hope it has offered has been the suggestion that it is possible to survive a disaster movie, but the heroic status of survivors, from Travis Bickle to Rolf Steiner, is uncertain to say the least. Even the most closely argued of these films oblige the audience to keep a distance from the screen which threatens them. The juvenile attempts at consensus via a conservative engagement in fantasy have merely produced a reactionary cinema of escapism that re-enacts Hollywood's simplest generic and heroic archetypes without the context that once gave them meaning. The more complex articulations of Coppola or Altman limit

[364]

themselves by their exclusion of the audience, and their refusal to offer a fixed meaning. The nihilism of this response achieves its most deliberate formulation in the anarchy of Peckinpah's world.









[368]

INTERLUDE


THE MULTIPLE REVISIONIST AND THE DETACHED
NARCISSIST: DON SIEGEL AND CLINT EASTWOOD


The increasingly provisional nature of cinematic structures in contemporary Hollywood in many respects echoes the practices of the filmmakers of Dissent. In interviews, Scorsese is fond of declaring that his tracking shots borrow from Fuller's. But the disintegration of consensus has eliminated the context in which Fuller might register his dissent through his mobile camera, leaving only the empty form for Scorsese to imitate.



07 July 2023

Rank—Art and Artist (iii)—The Psychological Ideology of Art


Otto Rank
Art and Artist
trans. Charles Francis Atkinson
(1932/1989)


[xiii]

AUTHOR'S PREFACE



...

[xiv] On the one hand, the individual urge to create is by no means the only specific quality of the artist; equally, on the other hand, canons of style, evolved from the collective consciousness, can by no means be regarded as the true essence of artistic creation; the one individual factor represents merely the motive-power, while the other, collective, element provides the forms that are suited in the circumstances to its activity and utterance.

...

[xxiii] in The Trauma of Birth I discerned the fact, which I later developed theoretically, that the creative impulse, which leads to the liberation and forming of the individual personality—and likewise determines its artistic creativeness—has something positively antisexual in its yearning for independence of organic conditions. Correspondingly, my conception of repression differed from Freud's; for to him it is the result of outward frustration, while I trace it to an inward necessity, which is no less inherent in the dualistic individual than the satisfying of the impulse itself.

...

[xxiv] if the neurotic type, who fails to synthesize his dualistic conflict, be studied from the therapeutic angle, the impression received is that of individuals who (psychologically speaking) represent the artist-type without ever having produced a work of art. ... In short, it would seem that the creatively disposed and gifted type has to have something in addition

[xxv]

before it can become a really productive artist, while on the other hand the work of the productive individual must also be added to before it can rank as a genuine work of art.

Neither the cultural and scientific history of art nor the aesthetic psychology of the artist has so far provided a satisfactory answer to this central question of the whole problem of art: namely, what constitutes the correlation between artist-type and the art-product; that is to say, the artistic creativeness and the art-form? And although it may seem evident that this common factor in the artist and the art product must be a super-individual, collective element, so obvious a conclusion at once raises a series of questions, the mere meaning of which is enough to show that they but make the real problem more acute. The first among such questions is likely to be: what does this collective factor, both generally and particularly in the creative individual, mean? Following directly upon this comes the next question: what is the characteristic which distinguishes the specific, artistic collectivity—subjective or objective—from others, such as religious, social, or national? In other words, why does the individual, endowed with this mysterious collective force, become now a popular leader, now the founder of a religion, and now an artist?

30 June 2022

John Berger—The Success and Failure of Picasso


John Berger
The Success and Failure of Picasso
(1965)

My note says:
p. 6—"the man, the personality, has put his art in the shade"
p. 9—"For Picasso, what he is is far more important than what he does."
p. 13—"Picasso's historical ambiguity...his fame rests upon his modernity... And yet in his attitude to art...there is a bias which is not in the least modern..."
It could not have been obvious in 1965 just how post-modern this outlook is, though in drawing a connection between the "what he is" outlook and Picasso's great fame JB clearly grasps the underlying mechanism. It is but a short step from the focus on self and the hostility to learning and reason and experimentation to the phenomenon of Famous for being Famous. The Picasso herein described would have made a near ideal Instagram user...and Instagram (the company and the user community) would have loved having him. The nineteenth- and twenty-first-century provenance of this ethos suggests a cyclical rather than linear history.

04 December 2021

Lasch—Of Valor, Chivalry, and Brains

Christopher Lasch
The Revolt of the Elites (1995)
The upper middle class, the heart of the new professional and managerial elites, is defined, apart from its rapidly rising income, not so much by its ideology as by a way of life that distinguishes it, more and more unmistakably, from the rest of the population. Even its feminism—that is, its commitment to the two-career family—is a matter more of practical necessity than of political conviction. Efforts to define a "new class" composed of public administrators and policy makers, relentlessly pushing a program of liberal reforms, ignore the range of political opinions among the professional and managerial elites. These groups constitute a new class only in the sense that their livelihoods rest not so much on the ownership of property as on the manipulation of information and professional expertise. Their investment in education and information, as opposed to property, distinguishes them from the rich bourgeoisie..., and from the old proprietary class—the middle class in the strict sense of the term—that once made up the bulk of the population.

Since they embrace a wide variety of occupations...and since they lack a common political outlook, it is also inappropriate to characterize managerial and professional elites as a new ruling class. Alvin Gouldner...found the unifying element in their "culture of critical discourse," but even though this formulation captures an essential feature..., it exaggerates the intellectual component in the culture of the new elites and their interest in the rationalization of life, just as it minimizes their continuing fascination with the capitalist market and their frenzied search for profits.

A more salient fact is that the market in which the new elites operate is now international in scope. Their fortunes are tied to enterprises that operate across national boundaries. They are more concerned with the smooth functioning of the system as a whole than with any of its parts. Their loyalties—if the term is not itself anachronistic in this context—are international rather than regional, national, or local. They have more in common with their counterparts in Brussels or Hong Kong than with the masses of Americans not yet plugged into the network of global communications.

Robert Reich's category of "symbolic analysts" serves, apart from its syntactical incoherence, as a useful, empirical, and rather unpretentious description of the new class. These are people, as Reich describes them, who live in a world of abstract concepts and symbols, ranging from stock market quotations to the visual images produced by Hollywood and Madison Avenue, and who specialize in the interpretation and deployment of symbolic information.

(pp. 33-35)



So, here is precisely the thing (or one of them) which Lasch misses in his earlier attack on postmodern art in The Minimal Self: much postmodernism (and modernism, and several other scattered radicalisms and avant-gardisms here and there) is in fact a no-holds-barred Counterelite Revolt against precisely this regime of "interpretation and deployment of symbolic information."

Certainly this alone does not gain these artists any extra moral capital, but it does show, I think, a sort of dialectical antithesis arising out of the knowledge economy itself. Unfortunately Lasch, like many others, is so attached to the "symbolic" dimension of art, and takes such joy in "interpret[ing]" it, that monochrome paintings and static music are simply beyond the pale. That seems to me like a pretty severe misjudgment, not necessarily of taste, but certainly of motive and utility.

Incidentally, the beleaguered, embattled, fallen-from-grace sense of "interpretation" bequeathed to us by Sontag's famous essay, and also her likening of interpreters to "leeches," and also the overtones of militarism and conquest inherent in "deployment"—all of these are, I think, very good hints as to some of the reasons artists have staged such a Revolt. And the defense of this Revolt is laid out beautifully by Lasch himself in this final work of his.


A more serious objection than imprecision is Reich's extravagantly flattering portrait of the "symbolic analysts." In his eyes, they represent the best and brightest in American life. Educated at "elite private schools" and "high-quality suburban schools...", they enjoy every advantage their doting parents can provide. ... These privileged young people acquire advanced degrees at the "best [universities] in the world," the superiority of which is proved by their ability to attract foreign students in great numbers. In this cosmopolitan atmosphere they overcome the provincial folkways that impede creative thought... Unlike those who engage in mind-numbing routines, they love their work...

Unlike old-fashioned intellectuals, who tend to work by themselves and to be jealous and possessive about their ideas, the new brain workers...operate best in teams. Their "capacity to collaborate" promotes "system thinking"—the ability to see problems in their totality, to absorb the fruits of collective experimentation, and to "discern larger causes, consequences, and relationships." Since their work depends so heavily on "networking," they settle in "specialized geographical pockets" populated by people like them. ...

(pp. 35-37)

But here the Pomos are very much Collabos too, and this is both symptom and cause of the desperation (often enough material and spiritual desperation alike) with which so many of us now confront the flaming ruins of industrialism. It is in this co-optation of collaboration, its conscious weaponization against the time-honored ways of old-fashioned intellectuals, where I would anchor any broad polemic against various "postmodern" developments in art. By insisting on the symbolic orientation instead, Lasch's "survivalist" dragnet (in The Minimal Self) snares too many artists who properly belong, in fact, to the very craft morality he seeks to recover.


Universal admission to the class of "creative" people would best meet Reich's ideal of a democratic society, but since this goal is clearly unattainable, the next best thing, presumably, is a society composed of "symbolic analysts" and their hangers-on. The latter are themselves consumed with dreams of stardom but are content, in the meantime, to live in the shadow of the stars waiting to be discovered and are symbiotically united with their betters in a continuous search for marketable talent that can be compared, as Reich's imagery makes clear, only with the rites of courtship. One might add the more jaundiced observation that the circles of power—finance, government, art, entertainment—overlap and become increasingly interchangeable. It is significant that Reich turns to Hollywood for a particularly compelling example of the "wondrously resilient" communities that spring up wherever there is a concentration of "creative" people. ...

Only in a world in which words and images bear less and less resemblance to the things they appear to describe would it be possible for a man like Reich to refer to himself, without irony, as secretary of labor or to write so glowingly of a society governed by the best and brightest. The last time the "best and brightest" got control of the country, they dragged it into a protracted, demoralizing war in Southeast Asia, from which the country still has not fully recovered. ...

This arrogance should not be confused with the pride characteristic of aristocratic classes, which rests on the inheritance of an ancient lineage and on the obligation to defend its honor. Neither valor and chivalry nor the code of courtly, romantic love, with which these values are associated, has any place in the worldview of the best and brightest. A meritocracy has no more use for chivalry and valor than a hereditary aristocracy has for brains.

(pp. 37-39)


05 October 2021

Heigh Ho, Pomo

Gerald Graff
"The Myth of Postmodern Breakthrough" (orig. 1979)
in Critical Essays on American Postmodernism (1994)
ed. Stanley Trachtenberg
pp. 69-80
In an essay that asks the question, "What Was Modernism?" Harry Levin identifies the "ultimate quality" pervading the work of the moderns as "its uncompromising intellectuality." The conventions of postmodern art systematically invert this modernist intellectuality by parodying its respect for truth and significance. ... It appears that the term "meaning" itself, as applied not only to art but to more general experience, has joined "truth" and "reality" in the class of words which can no longer be written unless apologized for by inverted commas.

Thus it is tempting to agree with Leslie Fiedler's conclusion that "the Culture Religion of Modernism" is now dead. The most advanced art and criticism of the last twenty years seem to have abandoned the modernist respect for artistic meaning. The religion of art has been "demythologized." A number of considerations, however, render this statement of the case misleading. Examined more closely, both the modernist faith in literary meanings and the postmodern repudiation of these meanings prove to be highly ambivalent attitudes, much closer to one another than may at first appear. The equation of modernism with "uncompromising intellectuality" overlooks how much of this intellectuality devoted itself to calling its own authority into question. . . .

(pp. 70-71)

With no scruples whatsoever about repeating myself, I must say that, following my trip to art school, the ultimate archetype of these "highly ambivalent attitudes" and of the "deliberate avoidance of interpretability ha[ving] moved from the arts into styles of personal behavior" (71) will always be, for me, the radical conceptual art grad student who drives a gas-guzzling motor vehicle and listens exclusively to top-40 radio.

My unconsidered gut reaction to Graff's final sentence above is that "modernist" musicians tended more towards reasserting/recovering/recreating some lost "authority" and were usually not too interested in questioning themselves. Also that the principals of the eventual postmodern backlash are quite comfortable slipping into the tattered robes of "authority" whenever they think they can get away with it. Hence this whole question of exposing shams of undue authority is what inclines me toward a positive self-identification as a "postmodernist." I can't really say so in casual conversation, however, because there are too many other associations with the term which don't fit me at all.

Conspicuous among them: I do believe that rational, just authority exists. It's just that, in music, I am typically most skeptical about its possibility on the level of "meaning"; and yes, those scare-quotes are so totally necessary anytime that warhorse word is trotted out of the stable.

23 April 2021

Parsons on the Romantic and the Methodical


...the dominant character structure of modern Germany had been distinguished by a striking dualism between "A: an emotional, idealistic, active, romantic component which may be constructive or destructive and anti-social," and "B: an orderly, hard-working, hierarchy-preoccupied, methodical, submissive, gregarious, materialistic" component.

In the traditional pre-Nazi German society it is overwhelmingly the B component which has become institutionalized. The A component arises from two principal independent sources: certain features of the socialization process in the German family, and the tensions arising from life in that type of institutional order. It is expressed in romantic, unrealistic emotionalism and yearnings. Under other circumstances the dissociation has historically been radical–the romantic yearning has found an outlet in religion, art, music and other-worldly, particularly a-political, forms. (248)

...

The peculiarity of the Nazi movement is that it has harnessed this romantic dynamism to an aggressive, expansionist, nationalistic political goal–and has utilized and subordinated all the motives behind the B component as well. In both cases the synthesis has been dependent at the same time on certain features of the situation and on a meaningful definition of the situation and system of symbols. The first task of a program of institutional change is to disrupt this synthesis and create a situation in which the romantic element will again find an a-political form of expression. This will not, however, "cure" the basic difficulty but only its most virulent and, to the United Nations, dangerous manifestation. (248-249)

Talcott Parsons
"The Problem of Controlled Institutional Change" (1945)
in Essays in Sociological Theory (1954)
pp. 238-274

Note (4 June, 2016): This resonates strongly with my conception of the aesthetic realm as, at minimum, a "padded cell" for various human impulses to inhabit without being enabled to do real damage (or, it is fair to add, make improvements) to the "real"/outside world. It would, of course, be great if in the first place there were not so much inner destructiveness flowing from human beings out into the world that we needed a special reservoir just to drain it off. I don't know that TP's discussion here anywhere near fully accounts for that. Even so, it is also not to be assumed a priori, as some postmodern Critical Theorists seem eager to do, that pure/absolute aestheticism is so inherently destructive in and of itself. As TP describes it here, the Nazi synthesis of "A" and "B" was an unusual and unlikely achievement, and one that could be disrupted precisely by recreating an apolitical space for romanticism to inhabit. And so, has the American academic left not been working quite diligently since the 1960s at forging and promoting just such a synthesis between industriousness (i.e. activism) and romanticism (i.e. art and aesthetics), accompanied by "a meaningful definition of the situation and system of symbols"? Hate to say it, but I think that description fits almost perfectly. Perhaps the antidote is also the same.

Note (23 April, 2021): I know that you're never, ever supposed to liken anyone to the Nazis. At the same time, such a taboo effectively limits what we are allowed to learn from history. Obviously there are many, many more differences than similarities here. That caveat should be superfluous, but I realize that in the present environment it is not. The point is that here we have one single instance of an influential speculative thinker offering up the speculation that art and politics make for an explosive combination. This seems to me very much worth considering in light of current events all over the political spectrum. Perhaps we cannot learn much here, but surely we can learn something.

18 April 2021

Chasing The Over-a-Hundred Prize


The problem in deciding whether a scientific result or a new innovation is a "breakthrough," that is, the opposite of noise, is that one needs to see all aspects of the idea—and there is always some opacity that time, and only time, can dissipate. ...

Likewise, seemingly uninteresting results that go unnoticed, can, years later, turn out to be breakthroughs.

So time can act as a cleanser of noise by confining to its dustbins all these overhyped works. Some organizations even turn such scientific production into a cheap spectator sport, with ranking of the "ten hottest papers" in, say, rectal oncology or some such sub-sub-specialty.

If we replace scientific results with scientists, we often get the same neomaniac hype. There is a disease to grant a prize for a promising scientist "under forty," a disease that is infecting economics, mathematics, finance, etc. Mathematics is a bit special because the value of its results can be immediately seen—so I skip the criticism. Of the fields I am familiar with, such as literature, finance, and economics, I can pretty much ascertain that the prizes given to those under forty are the best reverse indicator of value... The worst effect of these prizes is penalizing those who don't get them and debasing the field by turning it into an athletic competition.

Should we have a prize, it should be for "over a hundred": it took close to one hundred and forty years to validate the contribution of one Jules Regnault, who discovered optionality and mapped it mathematically—along with what we dubbed the philosopher's stone. His work stayed obscure all this time.

N.N. Taleb
Antifragile (2012)
pp. 329-330

These prizes have their counterparts in the music world, and there too they are well known as strong "reverse indicators of value," at least to everyone outside the immediate social orbit of the committee and the recipients. A certain amount of focus on the under-forties arises from a good-faith response to a good-faith criticism: lifetime achievement awards are obscene when the elderly achiever really could have used that money to stay afloat during their starving-artist years. This, together with the realization that many small grants to individual artists would do more good than a few massive grants to superstars and large organizations, has (re)shaped the landscape somewhat for the better. I suspect there are good intentions behind this; yet NNT's observations here supply the necessary damned-if-you-do caveats. Radical postmodernists get the most attention for rejecting the cleansing effect of time, for junking the Thirty Year Rule that historians formerly observed, etc., and yet functionally the bourgeois mainstream has also rejected these things, much more quietly but with equal thoroughness and equally strident rationalizations. In one respect it is obvious that people need the money more when they are younger. The problem, though, is that it is not so easy to see the future. To attempt to do so is a fragilista maneuver through and through.

In borrowing this Talebism, I am certainly wary of embracing Taleb's peculiar brand of Darwinism. He is not the least bit convincing when he claims, after all else he has written here and prior, to be content with merely passing on his genes and riding off into the Darwinian sunset. Clearly he lives to read, argue, eat and drink, put mice down people's shirts, and so on. (How I wish I'd had this last idea when I was sitting in Dr. Damschroder's theory classes!) I'm not in favor of a full Hunger Games approach to artisthood. The point that Taleb's arguments reveal, one which in my experience is simply not yet acknowledged in artists' circles, is that the fragilizing effects of awards are even worse than the trappings of a pure survival competition. They compound "cumulative advantage," lead to "Matthew effects," and claim to see the future. They "penaliz[e] those who don't get them and debas[e] the field by turning it into an athletic competition." I'm sure we will continue to have them even so. We all might as well apply just in case. But please don't believe anything anyone involved says about the recipients or the process, and please don't believe that competitions and awards are about supporting the next generation's finest practitioners. They cannot be about that. The list of Pulitzer winners speaks for itself here.

Synchronically, lifetime achievement awards are indefensible; diachronically they are the only defensible kind of award in fields that are, for our purposes here, the opposite of math, where the value of results is very rarely immediately seen, indeed where this value is all but guaranteed to change, and where this guarantee in and of itself does not need to be elevated to a risk by our having previously bet against it. Of course we can collectively decide to redefine value as strictly limited to that which can be immediately seen. It may seem like this has already happened; but watch those pomos carefully, especially when they don't know they are being watched, and you will find all the evidence you'll ever need that this presentism is no more a part of who they really are than their mismatched tube socks.

12 October 2020

Facts and Fancy

(from my Goodreads review of Babes in Tomorrowland: Walt Disney and the Making of the American Child, 1930-1960 by Nicholas Sammond)

The overall posture and style of this study are so self-consciously disinterested and relativistic as to read like a caricature of postmodern academic writing. This pastiche has lost not merely its sense of humor but its sense of purpose too. The fear of letting a stray value judgment slip out seems to have stultified the author's analytical capabilities. And yet values per se are largely what the study is about. The superficial irony of this is plain enough, but I think it is more than ironic. It is at least mildly disingenuous. In some respects it is cowardly.

The disinterested empirical scholar is discouraged from bringing their own values into the mix because disinterested empiricism cannot, by its own inner logic, operate that way. This book stumbles its way into a subdiscipline where disinterested empiricism is thought to be especially de rigueur but where it is actually quite inadequate. Sammond repeatedly invokes something like "the dominant presence of members of the white, Protestant, progressive middle class in the study of childhood." (7) He repeatedly names and specifies these agents of institutionalized moralization, repeatedly inviting us to consider them by profession, race, and class. Their work, he tells us, was profoundly shaped by classbound values. The fact of classboundedness and the identity of the classes in question are unequivocally named and reiterated. But Sammond seldom names the values themselves, and when he does name them I found it difficult to conjure much righteous indignation.

I do not wish to suggest that there actually is a universal morality. That is not what I believe. I don't think you have to believe it, though, to trip up on the idea that "truthfulness" and "unselfishness" are "middle-class virtues" (85) which cannot be reasonably expected of other classes. To me that sounds a lot like, say, reading being a White thing. Sammond himself probably believes no such things, but he is not allowed to say so, because this is scholarship and mere opinions aren't worth anything. The hubris of progressive sociologists, on the other hand, is an objective fact which can be presented as such, for if there is no universal morality then all progressivism is just a stillborn moral fallacy. Even "truthfulness" cannot mooch a provisional exemption. Truthfulness!

Naturally, the chickens of relativism roost in the hencoop of hypocrisy. What are the moral implications of accommodating the actions of a dishonest or selfish poor person? Does this help them or hurt them? Is it justified merely by the fact that they are poor and you are rich? By the right to cultural self-determination? Liberty? Consequentialism? Echoing overzealous committees everywhere, Sammond could claim that these properly philosophical questions are beyond the scope of his social-scientific study. I agree that they threaten to explode any such study into an unwieldy interdisciplinary patchwork; but I would strongly disagree that they are, literally, outside his scope. His own methods have made these questions essential to his scope and he makes no effort to acknowledge or address this. Instead, the really important takeaway is that most of the reformers were white, Protestant, progressive, and middle-class, whereas not all of their objects were these same things. As it turns out, this is not quite worth writing a book about.

Reformers of any slant in any area of human endeavor are vulnerable to the charge that they have put forth their own values as universal ones. Without this fundamental arrogation there can be no collective social action of any kind. The mere fact of arrogation is endemic, background radiation to the perceptible heat and light of social and political life. The arrogation of reformers is not an urgent sociological issue. What is urgent, I think, and what could have been pursued more doggedly here, is a compelling chronicle of the dynamic interaction between values and institutions. Strictly speaking, the thesis that "discursive circuits constructed around and through media-effect arguments sell products and build careers" (360) does describe a dynamic process, but it raises a lot of questions too. My sense is that Sammond forbade himself as a matter of methodology from opining, judging or blaming, and that by proscribing these things he railroaded himself into a static account rather than a dynamic one. (When your first order of business is to name the race and religion of the principals, it's hard to say much of anything more without offending.)

I also am not convinced, either by this account or by others, that the interaction between the Disney Studio and the reformers Sammond identifies was truly dynamic until quite late in the period he covers. In amongst all of the imbrication and commodification, I noticed that the dates, types and sources of the documents he reproduces throughout the book support my skepticism. Concerned parents created the market and Disney, eventually, seized on it. But Disney already had an enormous market, and progressives had a lot of ideas which were oblique to Disney and to media generally. Following academic convention, Sammond takes a laser-focus on the tiny area of overlap. It turns out there is not nearly as much for him to write about as the length of the book would imply.

If you don't already know something about the reformers Sammond chronicles, you still won't have much of an idea of what their values actually were after reading his book. He detects that the progressives have unduly assumed at least one non-working, stay-at-home parent, a luxury which many working class and immigrant families didn't enjoy; and he points out that child labor has persisted in agriculture (and disproportionately among children of color) long after progressives had more or less succeeded in abolishing it for white children. These are sobering reminders for white, middle-class readers; they are nonetheless quite underwhelming in the role Sammond has carved out for them here, where the towering monoliths of American Sociology, Enterprise, and Entertainment have collided in a giant orgy of...what exactly?

"Truthfulness" and "unselfishness" arise in the discussion of Disney's Pinocchio. It is the natural film for Sammond to discuss, since its overbearing didactic moralism stands out even in the Disney oeuvre. Yet transparent texts can be difficult to handle, and Sammond breaks everything he touches. With so much threadbare symbolism sitting right on the surface (Stromboli is literally a puppetmaster), Sammond cannot possibly work his way back to "middle-class values" without committing an act of interpretation. He has previously been too vague about values, whereas this film is explicit about them. Sontag warned us about this: "to interpret is to impoverish." Disinterested empiricism has taken him as far as it can, and now it is his turn to recapitulate in reverse the error of media effects crusaders by projecting upon the text the social location of those most eager to consume it. Consumer eagerness now engulfs the text from without, metastasizing into its organs of content and meaning. Suddenly it is not Edward Filene or Walt Disney but Sammond himself who has elevated consumption to a moral value! Buy a film and you become its content! And its content you! It's cheaper than the naming rights to a distant star or atoll! Hence a fleeting indulgence in armchair criticism is the precise moment when things go off the rails for good, whereby "truthfulness" becomes "middle-class," whereby poor people's untruthfulness is locked away in the black box of cultural self-determination, whereby Pinocchio cannot reflect the values of a solitary poor person unless all of the other poor people are also lining up to view it. Not just a filmic text is impoverished this way but also the "virtue" of everyone who is not "middle-class." That is quite an accomplishment.

I'm not a critic or a sociologist, but I feel like there has to be a better way to go about this. Fromm defined ideologies as "socially patterned rationalizations." Say we take those three concepts, pair them into three dyads, and then study each dyadic nexus; each one generates a limited but salient field of material which is relevant to our topic, and also a sprawling field of extradisciplinary connections. Given the organic limits of human cognition and the profusion of published research, each of the outward-facing fields is functionally unbounded; but they are perfectly finite in number (there are three of them), and this makes it possible at least to momentarily stare into each abyss and admire what distinguishes it from the others and from the original topic. Then we return to the inside, reassemble the triad, and look for the triadic nexus. A geometric analogy to planes, dimensions and wormholes suggests itself. This is just silly stuff I think about, but it seems to me that this book has done none of this nor anything remotely resembling it. It is not even a one-dimensional sociology, because it has not even the first prerequisite for the dimensionalization of sociological thought, namely a sentient authorial being. The strict repression of authorial slant in this area of scholarship is quite ironic given one of Sammond's key takeaways from the inconclusiveness of Media Effects research: even children do not simply swallow whole everything they are told or exposed to. I think we can assume this of readers of scholarly publications as well. A profusion of value-oriented scholarship could actually be the best way to achieve the "parallactic" ideal that some postmodernists have put forth, whereby observation from a variety of angles permits a clearer view than any single one of them can alone. The first step towards that ideal is not to give up on fixed moral positions but rather to stake them out.
A moral position can be the second point which defines a line of inquiry. This poses methodological challenges, to be sure, but there is a payoff for surmounting those challenges, a payoff with which studies like Sammond's cannot compete. Fromm and Maccoby made a blind stab in this direction which is simultaneously comical and profound: they constructed numerical scales of psychoanalytically-defined traits by which to measure the Mexican villagers they studied, they took the measurements (basically they made them up), and they performed some conventional statistical analysis of these figures to look for Results. To a self-loathing postmodernist this looks like pure arbitrary slant, the methodological equivalent of intentionally exceeding the speed limit at first sight of a cop. My contention is that if hundreds or thousands of diverse minds were to construct their own numerical scales and take their own "measurements," the aggregated results would be as meaningful as the minds are diverse. (This diversity would need to be more than skin-deep.) Against this backdrop, Sammond's approach looks like another fruitless search for perfect objectivity, distance, disinterest. If the slant is always there anyway, we might as well turn it to our advantage.

At great semantic and rhetorical pains, Sammond does eventually work his way around to some interesting big-picture theses about commodities and the social construction of childhood. For reformers and parents alike, the erroneous belief in strong media effects
"smoothes over some unpleasant contradictions in the construction of personhood and identity in democratic capitalist society. Quite simply: the child as susceptible to commodities stands in for the child as commodity-in-the-making...[whereby] persons must be simultaneously and impossibly unique individuals and known quantities." (360)
Ay, that's the stuff! But by this time the sins of omission are piled high, reflected in the endnotes by a veritable profusion of beyond-the-scope apologias which I literally lost count of. I'm reasonably sure I have never seen so many in one place, actually, and I think that is a singularly meaningful reflection on the nexus of topic and method here.

28 December 2017

Preliminary/Residual Thoughts on Descaling

(1) When even the most specialized of academic specialists cannot hope to keep up with the deluge of publication in their narrow specialty, the result is a new and distinctive kind of social volatility borne of something like information overpopulation. Research findings would then resist synthesis into social action, operating only in fragments scattered far and wide throughout the social system. Many collective advances would remain mere potentialities whose likelihood of manifesting plummets as the system continues to grow in scale. No matter the gross quantity of raw information such gains in scale might beget, the basic unit of social agency (the individual human being) stays pretty much the same. Ditto the system gain from pooling such units into networks (e.g. research teams, political action committees, musical ensembles) which show diminishing returns at scales proportionate to today's information overload. Even the effect of introducing better information into the system is mitigated by diffusion given such vast scale as the current global village (not to mention its Virtual shadow-world) has attained. The tortu(r)ously slow burn of incremental progress seems pleasurable in comparison to the fracturing and anomie which the present situation promises to engender.

(2) The above assumes that an increase in the gross quantity of overall knowledge production begets a corresponding and proportionate increase in the (smaller) gross quantity of competent and constructive knowledge production; this as opposed to merely spreading ever thinner a fixed quantity of collective intellectual potential. This is a very large assumption which may not be warranted; but if not, then we are left with an older, simpler problem: the haystacks grow while the needles and the metal detectors pretty much stay the same. As for the sentient pieces of throbbing flesh wielding the latter device, one can only hope that their dignity is not too closely cherished.

(3) Perhaps then there is something to be said for periodically turning one's back on the great data diffusion and carving out a little extra time to cherrypick the choicest nuggets from the twilight of pre-computerized thought, e.g. in the same vein as Debord but with a dash more childlike curiosity and a tad less puerile obstinacy. Whatever strictly perspectival shortcomings individual thinkers of the recent past might now be understood to have had, at least the economy of ideas within which they were subsumed was of a more just and optimal scale. Even the choicest of today's intellectual nourishment is grown in depleted soil, meanwhile, and thus perspective has become a problem of abundance rather than one of scarcity. If this is not quite a fatal blow to progress, it just as surely has not been adequately accounted for by progressives who merely consider the ostensible quality of information but not the system-level prospects for making any use of it whatsoever. In any event, it promises to be a very long time indeed before ideas are again permitted to circulate in an optimally-scaled intellectual environment; optimally-scaled, that is, not merely for progress but also for dignity.

(4) A recent 30 second junket on Google produces one intriguing and one utterly demoralizing revelation: (a) the term/concept "descaling" has found at least cursory usage in the heavy economics literature; (b) in absence of companion terms to narrow the field, any such Google search is badly confounded by the far more pressing and widely discussed issue of how to clean a coffeemaker.