
22 March 2024

where we've been, where we're going, according to the botverse


kacattac:
summarize the theory of aesthetics presented at fickleears.blogspot.com during March 2024

ChatGPT:
The theory of aesthetics discussed on Fickle Ears blog in March 2024 encompasses a range of topics, including the intricacies of aesthetic interests and qualities, the subjective nature of musical experience, and the impact of environmental factors on artistic performance.

One aspect touched upon is the subjective and deeply personal nature of musical experience, emphasizing that music's interpretation and appreciation can vary widely among individuals. This perspective underscores the importance of open-mindedness and engagement with music on a personal level, suggesting that our connection to music is shaped by individual taste, emotion, and context.

Another discussion revolves around the challenges musicians face in maintaining creativity and excellence over time. It suggests that even in seemingly repetitive and automated performances, there's room for subtle improvements and personal growth. This viewpoint challenges the notion of "automating" performance in the arts, arguing instead for the persistent human element of creativity and adaptation in even the most routine performances.

Together, these discussions present a nuanced view of aesthetics that values personal experience, continuous improvement, and the complex interplay between an individual's internal world and their external artistic expressions.

kacattac:
how has this theory of aesthetics evolved when compared and contrasted with content posted to fickleears.blogspot.com during April 2021 and May 2021?

ChatGPT:
The theory of aesthetics explored on Fickle Ears during March 2024 and earlier in April and May 2021 reflects an evolution in the blog's approach to discussing art, culture, and the role of the critic.

In the earlier posts from 2021, the blog delves into the complexity of aesthetic judgment and the role of the critic. It discusses the challenges in changing deeply ingrained aesthetic perceptions and the limitations of language in conveying aesthetic qualities. The blog suggests that aesthetic appreciation involves more than just intellectual understanding; it requires a deep, often pre-verbal, engagement with the artwork. This period also reflects on the role of critics in helping audiences appreciate both new and traditional aesthetic qualities, despite the limitations of common language, which is seen as potentially obstructive to disinterested perception.

Moving forward to 2024, the discussions on Fickle Ears continue to build upon these themes but with a more pronounced emphasis on the subjective and experiential aspects of music and artistic performance. The theory presented emphasizes the deeply personal nature of musical experience, suggesting that even in seemingly repetitive performances, musicians are constantly adapting and innovating, bringing a human element to their art that defies automation.

Throughout this evolution, Fickle Ears maintains a critical stance on the reductive views of art and performance. The blog critiques the notion of reducing aesthetic experience to a set of standardized metrics or rules, emphasizing instead the nuanced and multifaceted nature of artistic engagement. This shift from a more theoretical and critical discourse towards a more experiential and subjective understanding of aesthetics highlights the blog's dynamic approach to exploring the complexities of art and its appreciation.

These shifts reflect a broader understanding of aesthetics as not just a matter of intellectual discourse but as a deeply felt, personal experience shaped by a multitude of factors, including individual background, societal influences, and the ever-changing dynamics of cultural production and reception.



17 March 2024

Fickle Ears According To ChatGPT


kacattac: Summarize the site fickleears.blogspot.com in 500 words or less

ChatGPT: Fickleears.blogspot.com is a blog dedicated to music reviews and analysis, with a focus on indie, alternative, and experimental genres. The site offers in-depth critiques of albums, EPs, and singles, providing readers with insights into the artistic merits and cultural significance of various musical works.

The blog's author demonstrates a keen understanding of music theory and composition, often delving into the technical aspects of songs while also considering their emotional impact and thematic content. Reviews are written with a thoughtful and subjective approach, incorporating personal anecdotes and reflections to convey the author's connection to the music being discussed.

One of the distinguishing features of fickleears.blogspot.com is its eclectic taste, covering a wide range of musical styles and artists from around the world. While the blog primarily focuses on independent and lesser-known acts, it also occasionally features reviews of mainstream releases, offering readers a diverse array of music to discover and explore.

In addition to album reviews, the blog occasionally publishes interviews with musicians, providing readers with insight into the creative process and inspiration behind their work. These interviews offer a behind-the-scenes look at the music industry and help contextualize the artistry featured on the blog.

Fickleears.blogspot.com also serves as a platform for critical discourse within the music community, encouraging readers to engage with the content through comments and discussions. The blog's author fosters a sense of community among music enthusiasts, facilitating dialogue and exchange of ideas around the music being reviewed.

Overall, fickleears.blogspot.com is a valuable resource for music lovers seeking thoughtful analysis and commentary on a diverse range of musical offerings. Whether readers are looking to discover new artists or deepen their understanding of familiar genres, the blog provides engaging and insightful content that celebrates the rich tapestry of contemporary music.




kacattac: Summarize fickleears.blogspot.com’s theory of aesthetics

ChatGPT: fickleears.blogspot.com's theory of aesthetics centers around the idea that music is a deeply personal and subjective experience, shaped by individual taste, emotion, and context. The blog emphasizes the importance of approaching music with an open mind and a willingness to engage with diverse styles and genres. Rather than adhering to strict criteria of "good" or "bad" music, fickleears.blogspot.com encourages readers to appreciate the artistic merits and cultural significance of each piece on its own terms.

The blog's aesthetic theory also values authenticity and innovation, celebrating artists who push boundaries and challenge conventional norms. While technical proficiency is acknowledged and appreciated, it is not the sole determinant of artistic merit. Instead, fickleears.blogspot.com emphasizes the importance of emotional resonance and creative expression in evaluating music.

Furthermore, the blog's aesthetic philosophy recognizes the interconnectedness of music with other forms of art and human experience. Music is seen as a reflection of society, history, and individual identity, with each song carrying its own narrative and meaning. By contextualizing music within broader cultural and social contexts, fickleears.blogspot.com seeks to deepen readers' appreciation and understanding of the music being reviewed.

Overall, fickleears.blogspot.com's theory of aesthetics promotes a holistic and inclusive approach to music appreciation, valuing diversity, authenticity, and emotional resonance above rigid standards or genre classifications. Through its reviews and analysis, the blog seeks to inspire curiosity, foster dialogue, and celebrate the rich tapestry of musical expression.



14 March 2024

Constant Lambert—Music Ho! (ii)


Constant Lambert
Music Ho! A Study of Music In Decline
(1934)




[201]
(g) The Spirit of Jazz


By jazz, of course, I mean the whole movement roughly designated as such, and not merely that section of it known as Afro-American, or more familiarly as 'Harlem'. The negro once enjoyed a monopoly of jazz, just as England once enjoyed a monopoly of the industrial revolution, but for the negroes to imagine that all jazz is their native province is as if an Englishman were to imagine that all locomotives were built by his compatriots. Even the Harlem section of jazz is by no means so African as might be supposed.

...

[202] ... The European's enthusiasm for so-called negro music is in equal ratio to the negro's appropriation of European devices, and the more the European tries to imagine himself 'down on the Delta' the more the negro tries to imagine himself in an aristocratic salon. In this connection, it is amusing to recall the situation that arose recently when a well-known negro-dance arranger

[203]

was called in to produce a ballet for a highbrow company trained in the classical tradition. While all the Europeans flung aside their carefully won training to indulge in an orgy of pseudo-Charlestons the negro himself was moved to tears, not by his own work but by the classic elegance of Lac des Cygnes.

...

[205] ... The phrase 'barber-shop chord'—which denotes a chord of unusual succulence—dates back to the days when a guitar hung in every negro barber's shop, and a client who was waiting would vamp about on the instrument until at a lucky trouvaille everyone would shout 'Hold that chord'. It need hardly be pointed out that this type of harmonic experiment is as sophisticated in its method as that of the contemporary composers who—deny it hotly though they may—compose 'at the piano'.

...

The superiority of American jazz lies in the fact that the negroes there are in touch not so much with specifically barbaric elements as with sophisticated elements. ...

[206]

The sudden post-war efflorescence of jazz was due largely to the adoption as raw material of the harmonic richness and orchestral subtlety of the Debussy-Delius period of highbrow music. ...

... Though popularly regarded as being a barbaric art, it is to its sophistication that jazz owes its real force. It is the first dance music to bridge the gap between highbrow and lowbrow successfully. The valse has received august patronage from Beethoven onwards, it is true, but the valses of the nineteenth-century composers are either

[207]

definite examples of unbending or definite examples of sophistication—sometimes both. ... In the nineteenth century the split between the classical and popular came between a follower of Liszt, let us say, and a follower of Gungl. Today the split occurs between a composer like Kurt Weill and a composer like Jarnach—both of them pupils of Busoni.

The same rapprochement between highbrow and lowbrow—both meeting in an emotional terrain vague—can be seen in literature. [e.g. Eliot] ...

[208]

The words of jazz songs mark the first popularization of that well-known modern vice—the Inferiority Complex. Until recently a certain exuberant self-confidence has usually formed the spiritual background of a popular tune. ... A general air of physical attractiveness, sexual bounce and financial independence is naturally assumed by the writers of pre-war song hits. ...

In modern songs it is taken for granted that one is poor, unsuccessful, and either sex-starved or unable to hold the affections of such partner as one may have had the luck to pick up. ... For the most part...the heroes and heroines of modern songs meet with the rebuffs they deserve and take refuge in the unmute reproach of 'Ain't misbehavin' ', and 'Mean to Me',...

[208]

... The other side of the medal, the series of crazy words, crazy tune numbers, with their assumed galvanic energy has an equally neurasthenic basis. The so-called 'hot' songs are as depressing as the so-called 'sweet'; they spring from no genuine gaiety such as inspires the marches of Sousa, the sardanas of Bou and the valses of Waldteufel—they are a desperate attempt to hide an underlying boredom and malaise.

Well, one does wonder if the author has not badly misjudged all of this simply out of ignorance of The Blues as a major tributary.

07 March 2024

Sennett—The Craftsman (i)


Richard Sennett
The Craftsman
(2008)


Prologue: Man as His Own Maker




Pandora's Casket
Hannah Arendt and Robert Oppenheimer

Just after the Cuban Missile Crisis,... I ran into my teacher Hannah Arendt on the street. The missile crisis had shaken her, like everyone else, but it had also confirmed her deepest conviction. In The Human Condition, she had argued a few years previously that the engineer, or any maker of material things, is not master of his own house; politics, standing above the physical labor, has to provide the guidance. ... She wanted me to draw the right lesson: people who make things usually don't understand what they are doing.

Arendt's fear of self-destructive material invention traces back in Western culture to the Greek myth of Pandora.

Well,
if a given

fear

well and truly

traces back,

then it is not

her

fear.

And
if it is
"her" fear,
then it does not
"trace back."

i.e. We're already headed for the Genetic Fallacy. Not a great start.

03 March 2024

Neil Postman—Amusing Ourselves To Death


Neil Postman
Amusing Ourselves To Death
(1984)


[9] In studying the Bible as a young man, I found intimations of the idea that forms of media favor particular kinds of content and therefore are capable of taking command of a culture. I refer specifically to the Decalogue, the Second Commandment of which prohibits the Israelites from making concrete images of anything. "Thou shalt not make unto thee any graven image, any likeness of any thing that is in heaven above, or that is in the earth beneath, or that is in the water beneath the earth." I wondered then, as so many others have, as to why the God of these people would have included instructions on how they were to symbolize, or not symbolize, their experience. It is a strange injunction to include as part of an ethical system unless its author assumed a connection between forms of human communication and the quality of a culture. We may hazard a guess that a people who are being asked to embrace an abstract, universal deity would be rendered unfit to do so by the habit of drawing pictures or making statues or depicting their ideas in any concrete, iconographic forms. The God of the Jews was to exist in the Word and through the Word, an unprecedented conception requiring the highest order of abstract thinking. Iconography thus became blasphemy so that a new kind of God could enter a culture. People like ourselves who are in the process of converting their culture from word-centered to image-centered might profit by reflecting on this Mosaic injunction. But even if I am wrong in these conjectures, it is, I believe, a wise and particularly relevant supposition that the media of communication available to a culture are a dominant influence on the formation of the culture's intellectual and social preoccupations.

Speech, of course, is the primal and indispensable medium. It made us human, keeps us human, and in fact defines what human means. This is not to say that if there were no other means of communication all humans would find it equally convenient to speak about the same things in the same way. We know enough about language to understand that variations in the

[10]

structures of languages will result in variations in what may be called "world view." How people think about time and space, and about things and processes, will be greatly influenced by the grammatical features of their language. We dare not suppose therefore that all human minds are unanimous in understanding how the world is put together. But how much more divergence there is in world view among different cultures can be imagined when we consider the great number and variety of tools for conversation that go beyond speech. For although culture is a creation of speech, it is recreated anew by every medium of communication—from painting to hieroglyphs to the alphabet to television. Each medium, like language itself, makes possible a unique mode of discourse by providing a new orientation for thought, for expression, for sensibility. Which, of course, is what McLuhan meant in saying the medium is the message. His aphorism, however, is in need of amendment because, as it stands, it may lead one to confuse a message with a metaphor. A message denotes a specific, concrete statement about the world. But the forms of our media, including the symbols through which they permit conversation, do not make such statements. They are rather like metaphors, working by unobtrusive but powerful implication to enforce their special definitions of reality. Whether we are experiencing the world through the lens of speech or the printed word or the television camera, our media-metaphors classify the world for us, sequence it, frame it, enlarge it, reduce it, color it, argue a case for what the world is like. As Ernst Cassirer remarked:

Physical reality seems to recede in proportion as man's symbolic activity advances. Instead of dealing with the things themselves man is in a sense constantly conversing with himself. He has so enveloped himself in linguistic forms, in artistic images, in mythical symbols or religious rites that he cannot see or know anything except by the interposition of [an] artificial medium.

[11]

What is peculiar about such interpositions of media is that their role in directing what we will see or know is so rarely noticed. A person who reads a book or who watches television or who glances at his watch is not usually interested in how his mind is organized and controlled by these events, still less in what idea of the world is suggested by a book, television, or a watch. But there are men and women who have noticed these things, especially in our own times. Lewis Mumford, for example, has been one of our great noticers. He is not the sort of a man who looks at a clock merely to see what time it is. Not that he lacks interest in the content of clocks, which is of concern to everyone from moment to moment, but he is far more interested in how a clock creates the idea of "moment to moment." He attends to the philosophy of clocks, to clocks as metaphor, about which our education has had little to say and clock makers nothing at all. "The clock," Mumford has concluded, "is a piece of power machinery whose 'product' is seconds and minutes." In manufacturing such a product, the clock has the effect of disassociating time from human events and thus nourishes the belief in an independent world of mathematically measurable sequences. Moment to moment, it turns out, is not God's conception, or nature's. It is man conversing with himself about and through a piece of machinery he created.

25 August 2023

Framed


from "The Frame Problem," Stanford Encyclopedia of Philosophy:

To many philosophers, the AI researchers' frame problem is suggestive of a wider epistemological issue, namely whether it is possible, in principle, to limit the scope of the reasoning required to derive the consequences of an action.
...
Using mathematical logic, how is it possible to write formulae that describe the effects of actions without having to write a large number of accompanying formulae that describe the mundane, obvious non-effects of those actions?
...
What we need, it seems, is some way of declaring the general rule-of-thumb that an action can be assumed not to change a given property of a situation unless there is evidence to the contrary. This default assumption is known as the common sense law of inertia. The (technical) frame problem can be viewed as the task of formalising this law.
...
The puzzle, according to Dennett, is how “a cognitive creature … with many beliefs about the world” can update those beliefs when it performs an act so that they remain “roughly faithful to the world”?
...
the question of how to compute the consequences of an action without the computation having to range over the action's non-effects. ...the “sleeping dog” strategy... not every part of the data structure representing an ongoing situation needs to be examined when it is updated to reflect a change in the world. ...

...the epistemological question is not so much how the computational challenge can be met, but rather how the robot could ever be sure it had sufficiently thought through the consequences of its actions to know that it hadn't missed anything important.

Fodor suggestively likens this to “Hamlet's problem: when to stop thinking” (Fodor 1987, p.140). The frame problem, he claims, is “Hamlet's problem viewed from an engineer's perspective”. But to warrant the award of depth, an epistemological problem must at least resist the most obvious attempts to resolve it. In the case of Hamlet's problem, the obvious appeal is to the notion of relevance. Only certain properties of a situation are relevant in the context of any given action, and consideration of the action's consequences can be conveniently confined to those.

... Fodor's claim is that when it comes to circumscribing the consequences of an action, just as in the business of theory confirmation in science, anything could be relevant (Fodor 1983, p.105). There are no a priori limits to the properties of the ongoing situation that might come into play. Accordingly, in his modularity thesis, Fodor uses the frame problem to bolster the view that the mind's central processes — those that are involved in fixing belief — are “informationally unencapsulated”, meaning that they can draw on information from any source (Fodor 1983; Fodor 2000).
...
solutions to the logical frame problem developed by AI researchers typically appeal to some version of the common sense law of inertia, according to which properties of a situation are assumed by default not to change as the result of an action.
...
According to Fodor, this metaphysical justification is unwarranted. To begin with, some actions change many, many things. ... But a deeper difficulty presents itself when we ask what is meant by “most properties”. What predicates should be included in our ontology for any of these claims about “most properties” to fall out?
...
These questions and the argument leading to them are very reminiscent of Goodman's treatment of induction... Goodman showed that inductive inference only works in the context of the right set of predicates, and Fodor demonstrates much the same point for the common sense law of inertia.
...
An intimate relationship of a different kind between the frame problem and the problem of induction is proposed by Fetzer (1991), who writes that “The problem of induction [is] one of justifying some inferences about the future as opposed to others. The frame problem, likewise, is one of justifying some inferences about the future as opposed to others. The second problem is an instance of the first.” This view of the frame problem is highly controversial, however (Hayes 1991).
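For concreteness, here is one way of picturing the "common sense law of inertia" and the "sleeping dog" strategy the encyclopedia entry mentions. This is a toy sketch of my own, not anything from the SEP article, and the fluents and the action are invented for illustration: an action declares only its effects, and every property it does not mention is left to lie.

```python
# Toy sketch of the common sense law of inertia / "sleeping dog" strategy.
# An action lists only what it changes; every unmentioned fluent is assumed,
# by default, to persist unchanged. (Illustrative names only.)

def apply_action(state, effects):
    """Return the successor state: copy everything (inertia), then write only the declared effects."""
    successor = dict(state)      # the sleeping dogs are left to lie
    successor.update(effects)    # only the action's stated effects are touched
    return successor

state = {"door_open": False, "light_on": True, "cat_fed": True}
open_door = {"door_open": True}

print(apply_action(state, open_door))
# -> {'door_open': True, 'light_on': True, 'cat_fed': True}
```

The epistemological rub described above is exactly that nothing in such a scheme guarantees the effects list is complete: the default is trivial to compute and, per Fodor, impossible to certify in advance.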



From Susan Sontag, "Godard," in Styles of Radical Will:
[164]
The most obvious way Godard segments the forward-moving sequence of narration into tableaux is by explicitly theatricalizing some of his material, once more laying to rest the lively prejudice that there is an essential incompatibility between the means of theatre and those of film. The conventions of the Hollywood musical, with songs and stage performances interrupting the story, supply one precedent for Godard—inspiring the general conception of A Woman Is a Woman, the dance trio in the café in Band of Outsiders, the song sequences and Vietnam protest skit performed outdoors in Pierrot le Fou, the singing telephone call in Weekend. His other model is, of course, the non-realistic and didactic theatre expounded by Brecht. An aspect of Godard Brechtianizing is his distinctive style of constructing political micro-entertainments: in La Chinoise, the home political theatre-piece acting out the American aggression in Vietnam; or the Feiffer dialogue of the two ham radio operators that opens Deux ou Trois Choses.



A lively prejudice? Or a mere statement of the obvious? Who's to say?

At the risk of obscuring rather than clarifying the situation, I would suggest that us arty types pay closer (any) attention to our ontological predicates; at which point there is little left to say about the import and export of means between art forms but quite a lot yet to be said about ends.

In other words:
if it works, it works;
but
it cannot possibly work the same way in cinema
as it does in theater

because
there remain
(despite spirited efforts to the contrary)
irreducible material differences
("ontological predicates," if you insist)
between the two mediums
.

In other words,

some actions change many, many things.

In other words,
means be damned;
the ends can have no precedent.


01 January 2023

Self-Curation


Richard Seymour
The machine always wins: what drives our addiction to social media

A 2015 study looked into the reasons why people who try to quit social media fail. The survey data came from a group of people who had signed up to quit Facebook for just 99 days. Many of these determined quitters couldn’t even make the first few days. And many of those who successfully quit had access to another social networking site, like Twitter, so that they had simply displaced their addiction. Those who stayed away, however, were typically in a happier frame of mind and less interested in controlling how other people thought of them, thus implying that social media addiction is partly a self-medication for depression and partly a way of curating a better self in the eyes of others. Indeed, these two factors may not be unrelated.


[from a screenshot-to-self, 2019]

01 June 2022

McLuhan—The Ground Rules


Marshall McLuhan
Understanding Media (1964)
MIT Press edition (1994)

Stereo sound...is "all-around" or "wrap-around" sound. Previously sound had emanated from a single point in accordance with the bias of visual culture with its fixed point of view. The hi-fi changeover was really for music what cubism had been for painting, and what symbolism had been for literature; namely, the acceptance of multiple facets and planes in a single experience. Another way to put it is to say that stereo is sound in depth, as TV is the visual in depth.

(p. 282)

24 December 2021

Lasch—To Postpone a Reckoning


Christopher Lasch
The World of Nations (1973)

Ch. XVIII, "Birth, Death, and Technology: The Limits of Cultural Laissez-Faire"
The prevailing image of technological utopia begets the counter-image of technological nightmare—the appalling vision of a scientific totalitarianism, embodied in such anti-utopian novels as Aldous Huxley's Brave New World and George Orwell's 1984. On the one hand we have a greatly exaggerated faith in the ability of science to solve all the material problems of life, and an exagger-
[301]
ated idea of the autonomy of science and technology as determining forces in history; on the other hand, these inflated estimates of the power of science give rise to a hysterical fear of scientific dictatorship. This fear, precisely because it is cast in the form of an anti-utopian vision of the future, serves to postpone a reckoning with science, while the sweeping quality of the scientific control it envisions serves to paralyze our will to act in the present. At the same time it gives the illusion that the destructive possibilities of science are at least being squarely confronted. The anti-utopian and the utopian myths of science have a common root in the assumption that science is an autonomous force, rather than an instrument of the will of the human community, and that its development is inevitable and irresistible.

(pp. 300-301)

23 December 2021

Lasch—On Ellul


Christopher Lasch
The World of Nations (1973)

Ch. XVII, "The Social Thought of Jacques Ellul"
According to The Technological Society, the last chance of revolution disappeared in the nineteenth century, when the revolutionary movement ceased to oppose technology with "spiritual forces" and adopted the materialist perspective as its own, thereby hastening the final triumph of economic man. "Proudhon and Bakunin had placed spiritual forces in rivalry with the economic order. Against them, Marx upheld the bourgeois order of the primacy of the economic. . . ." Unfortunately for this analysis, Marx never propounded any such thing as "dialectical materialism"—that was the contribution of Engels, who sought to establish the scientific credentials of Marxism according to the positivist standard of scientific truth that had come to prevail at the end of the nineteenth century. Marx was not a determinist; he did not deny the element of human will in history; he made no easy assumptions about the inevitability of progress; nor did he equate social progress with technology.

(p. 271 footnote)




[Ellul's] The Presence of the Kingdom is addressed to Christians, but it raises questions that all radical intellectuals have to confront, whether or not they approach them from a Christian perspective. The crisis of the faith is one aspect of the cultural crisis of our time, and Ellul's plea that the church speak directly and critically to social issues springs from the same concerns that have led other intellectuals, working from secular premises, to insist that culture must no longer be regarded as an activity having no relation to politics, that artists and scholars must abandon the pretense of neutrality, and that a new humanism, in short, is likely to take shape only if it makes connection with the struggles of exploited classes to change the world. In the years since The Presence of the Kingdom first appeared, pleas for culture to be "relevant" have once again become common and even fashionable; but as the level of political militancy rises, the advocates of cultural "commitment" have more and more reiterated the position they held in the United States in the thirties and which they have never ceased to hold on the European left—namely, that cultural radicalism means that intellectuals should enlist in the proletarian revolution (now seen as a global uprising of the non-white, colonized peoples). Ellul's work, taken as a whole, constitutes a sustained critique of this position...

(p. 272)


Well, is it necessarily emblematic of the pretense of neutrality to paint abstracts? What about painting abstracts in the epoch of reality tv, deep fakes, VR, and the social media echo chamber?

When the artistic surface is read for explicit political content or commentary, the meaning of the larger gesture, and all that necessarily attaches to it vis-a-vis process, curation, criticism, distribution, etc., is ignored. At our peril. A justice issue again, I would say...the quotidian kind of injustice that no one notices until a more total, cosmic variety has engulfed them.
Neither science nor art provide any alternative to the prevailing chaos. On the contrary, science and art contribute to it: science, by divorcing itself from philosophy and becoming
[275]
merely a higher branch of technology; art, by giving up any pretense to make statements about objective reality, thereby dissolving itself in "self-expression." Neither science nor art any longer communicate anything except, in the one case, information required to solve technical problems—and even this is conveyed in symbols accessible only to specialists—and in the other case, inner experiences incommunicable by definition. Modern art, by opposing to technological domination a cult of the irrational, "guides us in the direction of madness." Faced with rampant disorder, men take refuge in the great "explanatory myths" of our time: "the bourgeois myth of the Hand of Moscow, the socialist myth of the Two Hundred Families, the Fascist myth of the Jews, the communist myth of the anti-revolutionary saboteur." These provide the only "means of intellectual coherence" in a world made meaningless by loss of continuity, loss of memory.

(pp. 274-275)


Well, when did art ever have anything more than

pretense
to show for itself when it comes to
mak[ing] statements about objective reality
?
Given
inner experiences incommunicable by definition,

why

bother

communicating

?

And why is
self-expression
the only other option?

At least the younger Lasch here seems more amenable than the elder to the notion that expression too is a mere
pretense
.


The turn away from these imperatives has an obvious justification if the imperative itself is shown to be merely
a cult of the irrational
.


What is left for art to do, then? To make its statements on the level of the artist's conduct of life rather than on the level of surface content. Surface content is too easily misinterpreted, and even more easily properly interpreted for the purpose of using or abusing its underlying intent. Sontag: "Sometimes a writer will be so uneasy before the naked power of his art that he will install within the work itself—albeit with a little shyness, a touch of the good taste of irony—the clear and explicit interpretation of it." In "life" as opposed to "art" we have other names for such "uneasy" people, names which are even less flattering than specialist or irrational.

Treating art as a form of "speech" in the First Amendment sense entails the fringe benefit of placing it more accurately in the well-worn distinction between word and deed. And therein lies the need for "art" entirely apart from its potential bearing on "politics." The need is in the deed.

If the communicative imperative can be dispensed with, the landscape of rationality and irrationality shifts. Much art may then be "pointless," but not necessarily "irrational." At this particular historical juncture I'm quite at peace wallowing in "pointlessness" rather than being railroaded into making a statement, one way or another, with any type of artwork or "cultural" activity I might undertake. On the other hand, if I do in fact use a tuba where others use an iDevice, is that distinction simply to be disregarded on the grounds that my weird tuba music says nothing explicit about "technological domination"? How much more explicit does one need to be about it?

Endnotes to Ch. XXI, "The 'Counter-Culture'"
Seeing in these books [of Charles Reich, Theodore Roszak, Philip Slater] symptoms of a much deeper cultural malaise and appalled by the anti-intellectualism often associated with the new left—and more generally by a flood of irrationalism in modern society as a whole—those who still believed that a radical politics without critical reason was a monstrosity attempted, in effect, to construct an ad hoc defense of liberal culture, as in the manifesto on the "cultural crisis," that would still be distinguishable from a defense of liberalism as a political ideology. [sic] What is really required, however, is a more penetrating understanding of the "counter-culture" itself and of its social and cultural antecedents. Does the "new culture" represent merely the culmination of cultural modernism, as some have claimed—a democratization of the avant-garde? Or does it portend a regression to a more primitive consciousness? Increasingly events seem to point to the conclusion that it is precisely the premises of modernism that are being rejected in, say, rock music and street theater. If art traditionally has been an interplay between tension and its resolution, the new art banishes tension and seeks to dissolve all oppositions in direct, unmediated experience, non-verbal states of being, trancelike euphoria. Performers alternately assault their audiences, whipping up moods of subdued violence, and make "love" to them, in both cases hoping to merge the performance with "life" and to put both art and life safely "beyond interpretation."
...and therefore, the countervailing "cultural" tendency would be to land dangerously short of interpretation, no?
The audience is offended or, worse, titillated; it enjoys being verbally assaulted; it imagines itself instantaneously released from "bourgeois inhibitions." Relieved of the need to perform an act of imaginative identification, it is more passive than ever, while its lingering reservations about the new art are silenced by the fear that what is new must
[335]
be necessarily significant. "Great art is always ahead of its time." The rhetoric of the avant-garde is pressed into the service of an esthetic with which it has little else in common, in order to clothe the contemporary artist in an inscrutable authority that he claims to reject but uses in many ways to intimidate his audience and critics.

(pp. 334-335)




Part of the job of criticism today would seem to be to insist on the difference between attempting to give popular themes more lasting form and surrendering to the utter formlessness of the moment.

(p. 335)

30 April 2021

Karen Kurczynski—Jorn's Distrust of Photography

Kurczynski, The Art and Politics of Asger Jorn
...distrust of the prefabricated image in fact led him [Jorn] at first to dismiss—shortsightedly—the artistic and critical potential of photography. Jorn writes that excessive faith in the progress and objectivity of science leads to the view that photography is the best art form, because it is the most objective and realistic. But there is no objective reality even in science, he argues, since it is always tied to the needs and interests of those who fund it. Rather than make art more objective, Jorn argues for the subjectivity of science. Writing in the late 1940s, he warns of the danger of considering photography a substitute for reality, the equivalent of armchair traveling instead of real experience. Jorn argues that the close connection of photography to reality is precisely what makes it the least artistic. He was unable to see the potential of photography as a creative medium because of its associations with impersonal, mass reproduction. Jorn reductively associated photography with the culture industry... While hopelessly limited for any contemporary understanding of photography, these views were typical of the 1950s. (198)
It seems once again that an opportunity has been missed and an opponent talked past rather than hit where it hurts: is "artistic" or "critical" potential really the issue, or is it rather that both potentials are in fact so shockingly vast in relation to the type of agency required that a certain devolution in the latter respect was now simply inevitable? The vast power of representation had in fact been democratized, and this rather directly and drastically cheapened initiative, intent, vision, subjectivity...it is hard to name an "artistic" or "critical" value that was not cheapened this way; that is, for the abundance of those potentials rather than their lack. Photography, in the moment that it was new, was too easy in proportion to the power inhering in it. This is not a denial that photographic skill exists, but rather that it was now both harder to distinguish from the ordinary and more widely dispersed and rather less scarce. Seems to me that these are material questions susceptible to material validation, whereas I cannot imagine successfully teasing out the objectivity/realism/science question raised here (certainly not only on the broadest of strokes painted here). The "substitute for reality" seems equally absurd on the surface, but I think it ultimately has been materially validated; in this case people can tell the difference but even so don't seem to care about the difference. But even here, the given discussion has suddenly shifted entirely to the consumption side of things; nothing is said about initiative, accessibility, technique, etc. as this pertains to creators, whereas it seems to me that the accounts of Jorn's chosen mediums throughout center around creation rather than reception. Continuing on, the word "impersonal" is used; this also demands that creation and reception both be explicitly accounted for; otherwise the impression is that of reactionary bluster rather than considered critique.
Jorn suggests that abstract art addresses our imagination more directly than the "indirect and superficial" art of photography. He writes that "visual art means first and foremost visual effects, and the most elementary, direct visual art is that which effects our power of imagination by means of colors, forms, and direct visual effects." (198)
Now we're talking! But there is again an unsatisfying, overgeneralized aspect betokening another missed opportunity. Is it photography itself which is "indirect," or is the photograph in fact the intermediary begetting an unduly "indirect" response in the subject? Is the problem in fact that the photograph is so direct (or perhaps simply suggests/imposes this conceit whether or not it is true) that the subject's imagination is subdued not for lack of "direct" stimulation but in fact for (the conceit to/impression of) an overabundance of it, thereby constraining the imagination inside thick walls of information rather than inviting it on an open-ended journey guided only by the occasional signpost? This analysis certainly is available re: representation and reproduction, as against abstraction/nonrepresentation and singularity. We may well read "elementary" as "leaves something to the imagination"; of course it is not just photorealism which fails this prescription precisely where nonrepresentation succeeds, but also language properly construed which fails where non-/pre-/supra-linguistic cognition succeeds. This is indeed a role (dare I moralize and say a Function?!) for abstract art and music; yet that aspiration to utilitarianism hits a snag if the ultimate, final, exalted end product of whatever particular process we are talking about remains representational, photorealistic, linguistic, communicative, etc. Seems that those types of thought are necessary, by definition, for any social intercourse at all, hence serial abstractifying exercises can be only a means, never their own ends, and in fact uniquely vulnerable to the conquering dictates of social ends which are contingent rather than absolute.

[from a notebook, 2018]

23 October 2020

The Mind on Furlough

I am still furloughed and still on the public dole. It has been just over six months, and if it goes on for only another six that will hit the under for most people's bets. I have been tremendously productive in ways which probably don't count for much. The first thing that happened was that my apartment got very clean; now it is extremely messy, even by my standards. I stopped practicing on July 1 and have not yet resumed. Instead, I have torn through books and records, scrabble word lists and annotated games. I have given the life of the mind due regard. I would rather have my job back, but I also could never relate to those who found themselves "bored" or stir-crazy when the first lockdowns hit. Privately I already had enough on my plate for ten lifetimes, and much of it was going to require a lockdown of one sort or another anyway. Be careful what you wish for, I guess.

The immediate future is very uncertain, but I have become more preoccupied with the long-term. Specifically, I cannot fully repress the thought that as I have just begun to find full self-actualization in the cultivation of the mind, the efforts and the thrust of the wider world are all directed toward rendering the human mind obsolete. I am reminded of this by things as varied as: Quackle simulations (a Scrabble computer program which plays quite well but not infallibly, and which almost every serious player now uses to self-evaluate); a podcast about the uses which VR and game engines are finding far outside of their conventional bailiwicks; the need for constant rule tweaks just to keep pro sports entertaining now that minds real and virtual have been unleashed upon them, thus exploiting the existing rules so brashly and effectively that the "product" suffers; and of course, the LAX jetpack stories, which remind us of just how far behind schedule we have gotten in fulfilling bygone pop-technological prophecies.

We may still be a long way off from the singularity, but I don't know that we are all that far off from a world where all of the mind-based abilities I have so enjoyed cultivating are either superfluous, obsolete, or politically retrograde. Life will surely go on, but I will look awfully silly. Silly, and incapable, and certainly unnecessary. Because this relates to closely-held values and to identity, I do care what people think about me in this respect. What the man-without-god question was for my god-oriented forbears, so the man-without-mind question has become for me. Of course we did get some good philosophy and cantatas out of the old paradigm; the bygone prophets of doom would probably be surprised to know just how much mileage we've gotten out of these old things even without an imaginary friend to guide us. Yet these too are mind activities, which just makes the analogy more troubling, makes it harder to imagine that life will indeed just go on, because it has to, just like it had to when god (and the author close behind him) died or were killed.

Aside from a couple of college summers, I have never had so much time to devote to my own work. I have often found myself thinking even so that lack of brain has been a far greater obstacle than lack of time. There are days where I can find 9-letter words through disconnected tiles, and there are days when I can't keep my 3s straight. There are days when I can read for 12 hours and other days when I can barely focus for more than a few pages. As for my former work life, there were days on the ol' Metrolink where I could fully absorb a difficult book chapter and other days where I had to punt and aim for a much-needed nap. I have always been this way, regardless of what else is going on in my life. Where I am almost inhumanly consistent is that I wake up every day, regardless of how much brain I have, with a burning desire to progress, develop, actualize. Like my coworker's old Powerbook G4 which I coaxed into running Lubuntu, my own power module flashes the message "No Kernel Support," which means I eventually overheat and have to rest. This can be demoralizing. The occasional triumphs are gratifying. But they seem increasingly like triumphs which technology will soon render superfluous.

14 March 2020

Mumford -- Art and Technics (xii)

"In the case of photography...there was for long a question as to whether it was or was not art. And the answer to that question is: Is there any leeway for choice and initiative on the part of the photographer? If there is such leeway, there is a possibility of art, that is of success or failure in terms that would have significance to the beholder. Perhaps the best effect of machine art is to make us conscious of the play of the human personality in the small area where it remains free, a differentiation so delicate, so subtle, that a coarse eye would hardly take it in and an insensitive spirit would not know what it meant." (82)

p. 93 -- "As with printing, photography did not altogether do away with the possibilities of human choice; but to justify their productions as art there was some tendency on the part of the early photographers, once they had overcome the technical difficulties of the process, to attempt to ape, by means of the camera, the special forms and symbols that had been handed down traditionally by painting. Accordingly, in the nineties, American photographs became soft and misty and impressionistic, just when impressionism was attempting to dissolve form into atmosphere and light. But the real triumphs of photography depended upon the photographer's respect for his medium, his interest in the object before him, and his ability to single out of the thousands of images that pass before his eye, affected by the time of day, the quality of light, movement, the sensitivity of his plates or film, the contours of his lens, precisely that moment when these factors were in conjunction with his own purpose. At that final moment of choice--which sometimes occurred at the point when a picture was taken, sometimes only after taking and developing a hundred indifferent prints--the human person again became operative; and at that moment, but only at that moment, the machine product becomes a veritable work of art, because it reflects the human spirit."

23 December 2019

Mumford -- Art and Technics (ivb)

If the acronym STEM once found merely prosaic use, it now by its mere existence and invocation encapsulates quite neatly the territory which the objective has pillaged from the non-objective. STEM encompasses the most lucrative professions and growthiest growth industries, and also demands the most rigorous academic and technical skills. It experiences shortages of qualified (home-grown) workers, and it is also too white and too male for its own good. For all of these reasons and more, STEM Matters. If "modern man" has, incidentally, also "patterned himself on the machine" with terrible consequences, if money cannot actually buy happiness (still), if the academic topics thought most rigorous are really just the most objective ones, if nature can be thought to trump nurture even where diversity is valued over homogeneity, that is none of STEM's business. STEM is too busy making money, improving lives, and generally being awesome. If you want in on that action, then STEM is what you do.

Artists as a group are liable to be more unified in their general sense of objection to this situation than they are in the substance of their specific critiques. I tend toward Mumford's outlook, seeing in the current condition of American art and artists several particular manifestations of a more widely observable human dynamic by which the objective, the quantifiable, and the provable run amok. Against this there is the option to play ball, to commence laying out the various strictly objective cases for the value of art and artists: educational, therapeutic, economic, activist, and so on. To me this latter tack has always looked self-defeating. The harder we work at proving our own objective worth, the further we get from actually doing so, and the more ridiculous we look both to STEMers and to each other. Objective value is not what we do. I fear that we let the Technics terrorists win by trying to play their game.

In spite of this division among artists, or perhaps because of it, the A-for-Art eventually wedged its way into the acronym, forming STEAM. It takes an artist to notice a sort of symbolic conflation of chemical and physical transformation in this too-cute linguistic maneuver, and also to appreciate and reclaim the irony. STEM always carried a cultural charge and was bound to be reactive for this reason. How stable, then, is the new molecule? Where, really, are the affinities between STEM and The Arts? There are some good ones, to be sure, but do they supply the necessary energy for a change of state?

Mumford's Functionalist rejection of Industrial Design is a powerful rejoinder to orthodox STEAM rhetoric. In what he classifies as "machine arts," with printing as the paradigmatic example, "we give up a certain subjective freedom in order to better serve a common collective goal." (74) And so,
by very reason of its impersonality and standardization, a machine art, once it has achieved a high level of form, is not subject to endless variations: the main problem is to keep it at its original high level.

Whereas "repetition without variation and re-creation is fatal to the existence of the humane arts,"
This is not so with the arts of the machine. Here the type is the supreme achievement; for the sake of functional economy, for the sake of order and common use, the fewer new demands that are made, the better. The capital danger in the arts of the machine is misplaced creativity, in other words trying to make the machine take over the functions of the person. (73)

I read this as essentially a consequentialist argument. To me it is at least imaginable that pure design considerations could be applied to a more-or-less perfected technology without affecting its functionality. Mumford's unwillingness to abide, say, the painting of typewriters and coffee grinders even after they have been designed, manufactured and purchased (p. 80), seems calculated to project total rhetorical consistency rather than rhetorical grace. This is the intransigence which earned Functionalism its staunchest enemies: what kind of Puritan would want to live in a world where purely decorative touches were so deeply mistrusted as to be veritably quarantined even from the most quotidian of technical pursuits? Ironic, then, that such radical anti-functionalists as Jorn and Debord unequivocally shared Mumford's contempt for "the canons of conspicuous waste, dear to the businessman, and the newly rich" by which "someone is picking your pocket of money you might use for better purposes, under the pretext that he is furnishing you with art." (75)

In other words, where the Situs wanted to liberate desire, Mumford was deeply suspicious of it. Both saw consumerism channeling desire to nefarious ends, but where the Situs saw unmediated desire as the weapon that would topple entrenched power, Mumford saw desire as one side of an essential duality which demands balance, discipline and restraint rather than release.

From the standpoint of effective communication, the handwrought manuscript tended by its very elaboration to lose sight of its essential reason for existence. In this respect, its development was very similar to that we often find in other arts, a tendency on the part of human fantasy, once it is emancipated from the restraint of practical needs, to run riot, to seek to prolong the esthetic moment beyond any reasonable duration. ...Quite evidently this desire to prolong a pleasurable occupation, while it makes for a good life, has its own kind of shortcoming; and in the case of the book, the very esthetic excellence of the illuminators and illustrators served also to retard the process of copying and so limit the circulation of books. (69)

I suspect that printing is a well-cherrypicked example to which there are not as many companions as Mumford would have us believe. That said, the current smartphone/tablet landscape seems as good an illustration as any of how collisions of agendas can disfigure a technology to the extent of interfering with a quite settled functional profile. Yet another CalArts memory comes to mind, that of Barry Schrader asserting that the digital computer is a settled type of machine which has not meaningfully changed in decades, thereby eliciting raucous protest from a gaggle of BFA-1 Music Techbros, who were palpably invested in the idea that they were entering a cutting-edge field. Mumford's Puritanical version of Functionalism (he reluctantly but unapologetically embraces both labels) is a bit hard to relate to personally, even for me; but read consequentialistically (and, as it were, pessimistically) it certainly is not lacking for anecdotal support. The balance between art and technics is an achievement, not a given, because the conflicting impulses at play here (the need for expression and the need for order) are so fundamental and powerful.

I think also of a passage from Cory Doctorow:
I’ve seen sausages made. I’ve seen laws made. Both pale in comparison to the process by which anti-copying technology agreements are made.

This technology, usually called “Digital Rights Management” (DRM), proposes to make your computer worse at copying some of the files on its hard drive or on other media. Since all computer operations involve copying, this is a daunting task — as security expert Bruce Schneier has said, “Making bits harder to copy is like making water that’s less wet.” (27)

Apple in particular, with the i-Devices, not only has made an end run around this "daunting task" but also figured out how to have people literally lining up at midnight to pay steep prices for the privilege of being thus manipulated. STEAMers sure are correct to see the iPhone as a money-machine which fully integrates the initials. It seems to me that this is, however, exactly the kind of mongrel contraption which Mumford theorized, a weaponization of aesthetics in service of endless consumption, resulting in a sleek supercomputer that can't keep a webpage loaded unless you threaten to fight it. (Yes, I own one.)

Incidentally, musical instruments are excellent examples of machines which achieve their visual appeal rather by accident, and where there are, with a few exceptions, not very many things you can do to them to change their appearance that won't at least marginally affect their utility. Instrument building is also a near-perfect STEAM topic, sitting as it does squarely on the intersection of art and technics, of rugged necessity and personal expression, of the machine arts and the performing arts. Yet even among the most accomplished high school instrumentalists, how many play so well that the differences between silver and lacquer, pistons and rotors, funnels and bowls, are viscerally rather than intellectually or rationally known? And what about the vast majority who are less accomplished, or who know these things only as abstract information that they have learned secondhand, or who have never stopped to consider them at all without prompting?

Indeed,
Behind the appearance of printing from moveable types, apparently so sudden, and on superficial analysis just a great mechanical feat, we find a thousand years of self-discipline and esthetic training, which went along with the effort to respect the gifts of the spirit and to deepen the inner life. Some of that training still is important for those who would design typography. You might think that, once printing was achieved, it would be possible to cut loose entirely from these earlier sources; but in fact the continued interdependence of art and technics could not be better illustrated than in this wholly mechanical art. ... As soon as the art of the calligrapher fell into decay, the art of type design became more difficult, for in aiming at mechanical accuracy and finish, the designer often lost the precious touch of the hand itself. Once utilitarian and rational interests predominated over esthetic ones, as they did in the nineteenth century, there followed a series of lapses both in type itself and in the layout of the printed page... (71-72)

Again, there is a consequentialist flavor to this view, but Mumford is hardly the only one to advocate for the continued need for "self-discipline and esthetic training" even as machines continue to take over more and more tasks from humans. If STEM, the brand, is too rigidly one-sided, or perhaps appears that way on the surface to concerned parents, that is a question of overall purview; the extent to which different disciplines are compartmentalized or integrated in the curriculum is an entirely different question. Maximum integration of topics seems to me unnecessary if the goal is simply to develop the whole person, though I fear the actual goal is simply to look that way superficially. Comprehensive education seems like the obvious winner in any case. But of course comprehensive education is, again, essentially politically unviable anywhere fiscal conservatism reaches a certain critical mass, and so this invites some messy workarounds. STEAMers certainly can count on the artists to play along: we're generally happy to trade our street cred for any degree of mainstream acceptance vis-a-vis this vocal minority of true believers in Hire Education. There is liable to be a personal incentive for us to integrate our marginalized disciplines with more prestigious ones regardless of the potential therein. Call it a classic democratic compromise where nobody gets exactly what they want.

24 December 2017

Automation and Autonomy

A friend suggests that live musical performance by human beings could become a species of Vintage Aesthetic when seen against the impending advance of computers and computer-brained robots. No doubt this dynamic has already manifested itself here and there vis-a-vis existing technological leviathans (e.g. "canned" music), and no doubt it will eventually take its place in the canon of anti-aesthetic prescriptivisms alongside moral uplift, the literary imperative, political activism, cultural preservation, and pediatric neural calisthenics. All of which is to say that it promises to have the same chilling effect that such historically contingent prescriptions have always had even as it keeps a lucky few human artists gainfully employed.

To take such a sea change and make it generative rather than prescriptive requires a retreat into the absolute. Indeed, I don't think it is a coincidence that as we perceive the general pace of change to have quickened, artistic autonomy has become ever less fashionable, i.e. that feelings of anomie or "normlessness" would beget various desperate attempts to contrive new norms. The word "perceive" is important here, for do we not also see a strong correlation between the more-is-more phenomenon and a certain constitutional fixation on change as against stasis? Change-in-the-air is the supreme rationalization for scorched-earth modernism, whose manifestations range from puerile self-importance to the burning of libraries; hence there is always an important balancing role here for the attempt to step outside the parochial concerns of the moment. I would certainly not place the aesthetic sphere at or near the center of such concerns, but nor does banishing it to the compost heap of history do it justice. Everything is aesthetic, much as everything is political.

We must take the Vintage issue seriously, and certainly the larger one of automation/computerization as well, but only on our (and our art forms') own terms. That is, it behooves any contemporary musician to face directly the formal, technical, and aesthetic questions that automation raises, or raises in the negative, as it were. Fruitful cross-pollination is inevitable, healthy coexistence is not, and prescriptions are always already constraints, no matter how urgent they might seem to be. Indeed, is it not merely by accidents of history that such urgency ebbs and flows? And is that not a powerful argument for artistic autonomy rather than against it?

08 November 2016

Waiting For The Singularity

A thought on political dysfunction for this singularly disheartening election season:

Owing to immutable material conditions, the impending technocracy is thought to be already quite well prepared to automate most of what we do with our limbs and yet quite far indeed from automating most of what we do with our brains. Wouldn't we be better off if the reverse were true? We human grunts have come to so despise manual labor while taking such pride in governing ourselves; and yet we seem to be quite a bit more competent in the former arena than the latter. Perhaps, then, we ought to put the robots in charge NOW, warts and all. Even their not-quite-human level of intelligence should be adequate to notice how close we are to destroying each other and summarily put us all back to work making widgets. Actually, given that "the social animal" is also the cronyist, nepotist, tribalist animal, perhaps being not-quite-human is actually the robots' strongest qualification for the job.



05 November 2016

Escape From The Walled Garden: Honeymoon

I'm typing this latest cursory missive on an Intel NUC i5 running Debian Linux and bearing the full weight of a lifelong desire for a more "stable" computing life than Apple seems willing to provide. Perhaps that's a strange complaint from someone whose 2006 MacBook still runs and has managed to be instrumental in almost everything I've created for the last decade, including this entire blog. My 2001 iMac also ran forever and I expect it would start right up today if I had it handy. But that's not all those two machines have in common: both were left high and dry through planned obsolescence years before any reasonable consumer would have considered replacing them, and I've now had just enough of that song and dance to make a run for it.

In sentiments that I imagine are not particularly unique or interesting, I must confess that I'm not sure what the point is of having superior hardware if the OS and software are designed to stop working with it every few years, and since I've never been willing or able to pay the ever-steeper price of keeping up with the Jobses, the whole thing has been enormously frustrating for a long time. Ditto the wider ambivalence about long-term environmental impacts of all this "consumption," impacts which I trust will come home to roost in a more immediate way just a tick beyond some yet-to-be-understood point of no return. I hate to toe the line of such holier-than-thou hypocrisy, but here's putting it on the record anyway that I've at least considered this side of the issue, albeit while freely admitting that I don't quite have the courage to wholly refuse to participate.

And so while attempting to remain in control of my callow newbie enthusiasm, I'm going to give Linux and its barely-adequate suite of musical accoutrements a fair shot to prove itself a worthy alternative. Most of that enthusiasm was sapped rather quickly as I attempted to make playdates for my Debian system with various devices, among which the printer and the Zoom H1 have proven amenable while the scanner and bluetooth speaker have not. And yes, gentle composers, I am committing to attempt to survive with MuseScore, at least for a while, and to ponder learning a typesetting program that could handle the thornier jobs. Sibelius has been very much (I'm sure many older folk would say too much) a part of my musical identity since the age of 18, and I'm not sure I've fully grasped what life will look like without it. At the same time, its recent history makes me just as nervous as does Apple's, albeit for rather different reasons; and further, 90% of what I write is "lines and dots" music anyway, even if those lines and dots sometimes come in waves. I will of course be sure to report progress (or lack thereof) as circumstances dictate.
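For anyone curious what "making playdates" looks like at this level, it is mostly unglamorous poking around. A minimal sketch of my own (assuming ALSA on Debian, and assuming the H1 announces itself with "H1" somewhere in its card name):

```python
# Minimal sketch of my own, assuming ALSA on Debian: /proc/asound/cards lists
# the audio hardware the kernel currently recognizes, so checking whether a
# USB device like the Zoom H1 has been picked up is a matter of reading it.

def alsa_cards(proc_path="/proc/asound/cards"):
    with open(proc_path) as f:
        return f.read()

def device_present(name_fragment):
    # Assumption: the device advertises a name containing this fragment.
    return name_fragment.lower() in alsa_cards().lower()

if __name__ == "__main__":
    print(alsa_cards())
    print("Zoom H1 recognized:", device_present("H1"))
```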

22 December 2012

Soundcrowd

Timed comments let your friends and fans give you valuable feedback at specific moments throughout the waveform. Pinpoint exactly what's working well and start a conversation around it.


Really? You mean it's not just for person after person to say "that was dope" as a pretext for making sure a picture of themselves and a link to their page pop up the next time someone listens? Are you sure? Because to this prematurely crotchety old bastard of a blogger, it sure looks like the same endless feedback loop as the rest of the musical interwebz, where the incentive to find something, anything, to say in the absence of any need for that something to be coherent, let alone constructive, merely ensures that the signal-to-noise ratio in our online musical discourse remains poor, good intentions (more like subterfuge) be damned.

12 December 2012

On Giving a Shit

About aestheticism, I just wrote, "fine with me, by the way, if we not try too hard to explain it..." I should say (and this is overdue here) that there's a real danger in not wanting to even attempt explanations of very important issues. I've written similar things here many times, but never, I want reader(s) to know, without some apprehension. I have a few different reasons for this isolated instance of anti-intellectualism: one is the realization that a certain sense of mystery is essential to how I experience my very favorite musical works; another is a deep distrust I've developed of reverse engineering in art, or at least of that particular kind of reverse engineering musicians have so fallen in love with, which works backward, whether philosophically or physiologically, from an emotion.

It's not difficult to imagine, for me at least, that sometime before the Earth falls into the Sun, medical research and technology will enable an understanding of the biological basis of aesthetics which is unfathomable to us today, and that artificial intelligences will be created which achieve a human level of aesthetic synthesis based in whole or part on this new understanding. Many will say that this truly ends art (which I suppose it could if you subscribe to a social theory of art), while others will simply deny that it could ever happen. My feeling, though, is that as long as I still had something nice to listen to, I'd be just as happy, and since we've thus far managed to collectively create way too many nice things to listen to without knowing exactly how much of which neurotransmitter is released when we hear a Hindemithian stretto versus a Lutoslawskian cloud, I would posit that future generations have a very slim chance of going wanting.

I'm less sure that we need to pursue these things than I am that someone will pursue them, and that they are, given enough time, achievable. Mostly, though, I'm ambivalent (may I flatter myself, actually, by saying uncharacteristically ambivalent?), and that's the source of some similarly ambivalent statements that I probably wouldn't make about most any other field of intellectual endeavor. All else being equal, we should always want to know more, but we can't know everything, and thankfully (this is the larger point here) we don't always need to. Indeed, I think there is very little necessity left in our over-saturated art world; that is, the work already exists: all kinds of art for all kinds of people (though access, of course, is another matter).

11 December 2011

Fakery

I'm beginning to wonder if a chart of our collective progress on the technological march towards virtual reality would actually resemble an oscillation (i.e. periods of success followed by periods of regression) rather than an exponential curve (i.e. continuous accelerating progress). I say this not because one could reasonably say that we've truly regressed technologically, but because of some recent experiences with fake reality that seem to me more real but less useful (and, perhaps more to the point for a musician/artist, less aesthetically pleasing) than the older, less real ones.
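If the distinction seems fuzzy, here is a toy model of my own (purely illustrative, not a claim about any real measurement): a pure exponential never declines from one step to the next, whereas an upward trend modulated by an oscillation spends real stretches of time going backwards, which is roughly the shape I have in mind.

```python
# Toy model of my own, purely illustrative: compare a pure exponential with an
# upward trend modulated by an oscillation, counting step-to-step declines.
import math

steps = range(200)
exponential = [math.exp(0.02 * t) for t in steps]
oscillating = [math.exp(0.02 * t) * (1 + 0.3 * math.sin(0.5 * t)) for t in steps]

def declines(series):
    return sum(1 for a, b in zip(series, series[1:]) if b < a)

print("declines, pure exponential:", declines(exponential))   # 0
print("declines, oscillating trend:", declines(oscillating))  # a substantial number
```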

As a kid, I was not allowed to gorge myself on video games to the extent of many of my peers; my mother simply would not allow it. I did gorge myself on television, but still less than some. Even so, when today I happen by some bizarre turn of events to catch a glimpse of a cartoon or video game, I'm typically most surprised at (a) the turn towards realism, and (b) how profoundly aesthetically unsatisfying this is to me. It's an old man's gripe to be sure (I'm not yet 30, but in the technological world, that's middle age), and I've heard enough of them directly from old men to be wary of committing the same fallacies. My gut reaction is nonetheless remarkably consistent. Modernist though I claim to be, perhaps I'm finding for the first time some appreciation for the advice of so many conservative composition teachers that imposing limitations on one's process can be beneficial to the outcome. There are more than a few vinyl hoarders and NES players (and conservative composers) who would agree, no doubt cherrypicking their evidence with the utmost caution and backtracking appropriately when confronted about their iPhones.

The reason I bring this up here is that I'm coming to view notation software playback as one of these areas. When I upgraded to Sibelius 6 in 2009, I had been using version 2 since it first came out (yes, that's kind of a really long time). There was quite a bit to learn, a lot of useful new features, and a few real pissers. (The chord symbols! Barf...) The biggest challenge to this day, though, has been the built-in sounds. They are much more "realistic" sounds than the old general MIDI sounds I had become very accustomed to, by which of course I mean that it would now be much easier, possibly even a foregone conclusion, to identify by ear the instrument they purport to represent. (Forget the specific instrument; with the old sounds, you sometimes wondered which instrumental family was in play.) For whatever reason, though, I find them much more difficult to work with: the timbral whole is still less than the sum of its parts.

In some cases, notably the tuba, this is because they've essentially built mistakes into the samples:

Band Teacher Purgatory Sounds Like This


Yes, there's a better than average chance that your garden-variety community band tuba player will wobble slightly on a low A before the pitch stabilizes, but seriously guys, let's just shoot for the stars next time and pretend that tubists are at least theoretically capable of emitting a steady tone for more than 2 beats at a time.

In other cases, I undoubtedly struggle because I spent an incredibly long time working with the old general MIDI sounds and hence got very accustomed to interpreting them. Space, balance and blend have always been the Achilles' heels of notation software mockups of through-composed, acoustic music, and I don't think the present results are any more accurate despite representing an obvious attempt to improve in just these areas. I'm left to wonder if I am, in fact, just getting old, or if we had not actually stumbled on a semi-optimal degree of reality, unbelievable as that would have been at the time, in comparison to which the next rung of progress actually looks regressive. Perhaps a virtual reality that is obviously fake would be more useful here than one with loftier aspirations and spectacular failures.
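For what it's worth, part of why the old general MIDI sounds were so much less "real" is that a GM playback spec carries almost no timbral information at all: a program number, some note events, and everything else left to whatever synth happens to be listening. A rough sketch of my own using the mido library (not anything Sibelius does internally; the choice of note and the backend are assumptions for illustration):

```python
# Rough sketch of my own (using the mido library; not how Sibelius works
# internally): a general MIDI "tuba" is nothing but program 59 plus note
# events, with every timbral decision deferred to the receiving synth.
import time
import mido  # requires a MIDI backend such as python-rtmidi

out = mido.open_output()                                  # default MIDI output
out.send(mido.Message('program_change', program=58))      # GM program 59 = Tuba (0-indexed here)
out.send(mido.Message('note_on', note=33, velocity=80))   # low A (A1), the note maligned above
time.sleep(2)
out.send(mido.Message('note_off', note=33, velocity=0))
```

The newer sampled sounds, by contrast, bake one particular player's attack, wobble and all, into the data itself, which is precisely the double-edged realism complained about above.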