Just so the discussion generated by our little pow-wow here gets linked up, I asked Brian if we could include a bit of his analysis in this space, and this is what he sent along in response.
[Kelly suggests] an inevitability in technological advances, the
incandescent lightbulb was going to happen even if Edison never
existed. The direction of this inevitable flow of technology, if you
believe in it, could be characterized as what technology 'wants' to
be. Lightbulbs WILL happen because they suit technological needs,
(which in turn emerge from biological needs).

And the heart of the matter (the area I'm most interested in):
...Kelly misses a real opportunity. How do styles of expression
evolve? Are there inevitabilities in schools and styles and genre? Is
art directed? Convergent? Would minimalism have existed if Glass and
Reich hadn't been around? If we're going by an evolutionary model,
would that imply that art designed to be most widely consumed is the
truest expression of what art wants? Or is longevity the metric? What
does ART want?
Read Brian's entire response post here. I hope you'll be able to join us full time for the next round of MTG book club, Brian!
Dang, Molly got to Kelly's whole technology-is-inevitable argument before I did, but that's OK, I'll talk about it anyway, because I think it's a good example of the interesting tension that I felt throughout the entire book. Kelly insists on inevitability to argue against throwing up cautionary roadblocks--what, in chapter 12, he calls the Precautionary Principle. On the one hand, this is, in part, fodder for the six-degrees-of-teleology game that you can play throughout the whole book. (The Precautionary Principle is bad because it slows down innovation. But why is innovation automatically good? See Progress, belief in, pp. 1-359, passim.) But, in casting technological advances as inevitable, he introduces a subtle but interesting disconnect.
Kelly defines inevitability in two ways. Here's the first (p. 176):
[E]very realizable technology is inevitable because sooner or later some mad tinkerer will cobble together almost anything that can be cobbled together.

Add the missing agency--that is, some mad tinkerer will choose to cobble together, &c.--and the inevitable is not quite as inevitable. Extraordinarily, even overwhelmingly likely? Certainly--but not inevitable. That's hairsplitting, to be sure, but that tiny space between likely and inevitable is surprisingly lively philosophical ground.
Kelly's second characterization of inevitability likens it to genetics--"Who you are is determined in part by your genes," he writes (p. 177), and technological development is the same way, an inexorable, if sometimes irregular, unfolding of a predetermined plan. A strong analogy, but when Kelly writes (p. 179)--
In our lives we have no choice about becoming teenagers--as a former moody teenager, I am compelled to point out a flaw in his logic; we do have a choice, even if the opt-out is, to put it mildly, extreme. But it's at those gedanken-ish margins that you can sense how Kelly, even as he sensibly qualifies his argument, nonetheless fairly consistently errs on the side of a view of human nature that I, at least, find too passive.
It's what's at the source of Kelly's repeated post hoc ergo propter hoc assumptions about technological innovation (or, as I called it to my lovely wife, "that thing-following-another-thing fallacy"; she knew the actual term, having paid more attention to The West Wing). Technology begets technology, in Kelly's view; innovations are the cause of further innovations. It's highly suspicious, especially if, like me, you're more inclined to the idea that people cause innovation, not technology itself--the exponential increase in invention could just as well be due to increased population density as some sort of technological Hegelian vector towards the absolute. And it leads Kelly to a bout of untoward mysticism, in the passage that Molly quoted back at the start of the week:
But can you imagine how poor our world would be if Bach had been born 1,000 years before the Flemish invented the technology of the harpsichord? Or if Mozart had preceded the technologies of piano and symphony? How vacant our collective imaginations would be if Vincent van Gogh had arrived 5,000 years before we invented cheap oil paint? What kind of modern world would we have if Edison, Greene, and Dickson had not developed cinematic technology before Hitchcock or Charlie Chaplin grew up?

This manages to be too optimistic and too pessimistic, all at the same time. The optimism is easily seen if you apply the criterion to less morally exalted acts: What kind of modern world would we have if Henry Deringer had not developed pistol technology before John Wilkes Booth grew up? But also, the idea that certain kinds of genius need particular kinds of technology in order to flourish can work against technology as well: if your genius is dependent on obsolete technology, there's going to be far less opportunity for you to practice it, and far less likelihood that your genius will enrich the world.
But that's assuming that the technology is what sparks, or even necessarily enables genius. Do you honestly think an imagination as fertile and prolific as Bach's wouldn't have found some way to express itself, harpsichord or not? The catch, of course, is that, a thousand years earlier, Bach's genius might not have been in the position for its products to be passed down to subsequent generations. Does that mean it isn't genius? Is genius only that which proves useful or beautiful after the creator is gone? And now we've opened the door to aesthetics.
And that's as good a place as any to stop, I think. Because the aesthetics of being a classical musician in the 21st century can sometimes seem to be as much a rebuke to technology as something enabled by it. Practitioners have shown characteristically human ingenuity in leveraging technological advances to make a musical career, at the same time that the actual musical practice--instruments that still demand inefficient courses of mastery, an ideal of live, fallible performance--persistently and joyously occupies a space that, at least from one vantage, exists in spite of the human addiction to technology. Technological advance is neither good nor bad; it's good and bad, the sort of rich gray area that, even in an era of dazzling technological power, artists will most incisively make their playground.
By Molly Sheridan
"Technology is anything invented before you were born."
"Technology is anything that doesn't quite work yet."
--Kelly jokes about how other people define technology
I admit that sometimes Kelly's book made me nervous, like trapped-on-a-speeding-train nervous.
This was not the only pop culture parallel I drew. One of the points he grounds his argument on is the idea that certain developments have been and will continue to be inevitable, in that they have historically been simultaneously and independently invented or evolved as solutions to certain challenges. Our world will not be stopped. Shades of the two almost-identical versions of the world in the sci-fi TV show Fringe, anyone?
Perhaps I found this idea particularly beautiful, as well as musically related, because it gave me repeated flashbacks to one of the most lovely quotes I can recall from a NewMusicBox composer interview. Wendy Carlos was talking about the power of music and she noted:
And an essential part of music is to connect with our shared inner feelings, to recognize the connections and know that you're not alone. We're born alone; we die alone. In between we have music, and a great gift it is, too. It's in there with our social structures: families and friends and loved ones, a shared humanity. I like to think of it as the old metaphor of two ships at sea. We flash our signal lights as we pass one another. It makes life less lonely. It's wired into us. If music were taken away from us, I do believe we would invent it again. In a few generations, we would develop it all over again.
Marc has already pointed out one of the most immediate connections between the proliferation of technology and music-making, that is, the evolution and development of instruments. He's exactly right--there's a deep, sometimes invisible relationship between the musical technology we choose to adopt and the music that results. (There's also Kelly's own what-if-Bach-didn't-have-the-harpsichord counterfactual, which I find mildly ridiculous for reasons I might get into before week's end.) But is there a connection to be made on a more abstract scale? I think there is--and it has to do with the whole idea of the Technium.
Whether you buy it or not, or even treat it as a metaphor or not, the Technium is an othering strategy: it puts technology out there, in its own realm, a realm related to--but not necessarily contained within--the realm of human behavior and responsibility. Kelly is aware of the impulse behind othering; he mentions it in his discussion of the Industrial Revolution (p. 41):
The worst by-products of the industrial age--black smoke, black river waters, blackened short lives working in the mills--were so remote from our cherished self-conception that we wanted to believe the source itself was alien.... When technology appeared among our age-old routines, it was set outside ourselves and treated like an infection.

Kelly's remedy for this illusion, though, still keeps technology outside, only as "action": "No longer a noun, technology was becoming a force--a vital spirit that throws us forward or pushes against us."
Going back to the Industrial Revolution, there's a pair of nice examples (which I'm borrowing from Julie Wosk's fascinating Breaking Frame: Technology and Visual Arts in the Nineteenth Century) hinting at how making technology something other reinforces the sort of intellectual entropy I talked about in my last post, the way the choices we make regarding technology can collapse our thinking into comparatively impoverished channels. The examples have to do with the English town of Coalbrookdale, a kind of advance guard of the Industrial Revolution, a town that became famous (and infamous) in the late 18th and early 19th centuries as a center for iron production. By all accounts, Coalbrookdale was a fairly hellish place--something you nevertheless wouldn't gather from the painter William Williams' 1777 landscape A Morning View of Coalbrookdale:
The town can be identified in the distance by the graceful plumes of smoke rising into the clear sky. This is about as obvious an example of othering as you can get: the farther away you set yourself from technological disruption, the prettier it looks.
Now, here's another painting: Philip James de Loutherbourg's 1801 Coalbrookdale by Night:
This is a much more complicated view. On the one hand, it's a dramatically infernal picture, the vantage designed to maximize both the bright blaze of the smelting fires and the silhouetted darkness surrounding it. But it is also a beautiful artifact, a virtuoso performance, a dazzling play of color, with a kind of thrill-ride you-are-there effect, putting the viewer in the thick of Coalbrookdale's unearthly landscape.
In other words, it's a textbook example of the sublime. Given the terms of this particular blog post, a good way to think about the sublime and its intent is that it's something that inspires us to other ourselves, a dialectic between our everyday experience and an experience that pushes us out of our comfort zone, with the end result an expanded sense of the world. (Last weekend, for example, I was able to take in a live performance of Jean Barraqué's Piano Sonata, a monument of abstract, atonal modernism. The Barraqué Sonata takes the idea of the sublime to an extreme: it ruthlessly others you, the listener.) For a couple of centuries now, the ideal of the sublime has pretty well permeated the way we talk about music, any music. This is a little bit strange, because I think one of the things the course of history teaches us is that human beings, left to their own devices, have a greater propensity for othering people and things outside of themselves than othering themselves.
Why do people do this? Laziness, I think--it's less work to interpret the world in a way that doesn't involve re-making one's own self-identity the way the sublime would have you do. Now, I'm the last person in the world who should be criticizing laziness, and I know the fact that self-othering doesn't feel like work to me (I enjoy Barraqué highly) puts me in a curious minority, not on a higher artistic or moral plane. But one of the reasons I know I'm in a curious minority is because the collective choices we've made about the technological progress of mass media have resulted in an ecology where the more self-othering a bit of content is likely to inspire, the less likely it is to have widespread distribution.
Of course, the technology helps, too--I can go into my laptop hard drive and pull up no fewer than five recordings of the Barraqué Sonata whenever I want. But I'm already inclined to push my own envelope of artistic comfort. The experience of sublimity can be a learned experience--I learned it, somewhere along the line--but the explosion of information technology over the past century has not produced any indication that human beings are developing any more capacity for self-reflection.
The current peak of information technology is the Web, which Kelly loves--rather explicitly, on pp. 322-23. "I am no longer embarrassed to admit that I love the internet," he writes. "It is a steadfast benefactor, always there. I caress it with my fidgety fingers; it yields to my desires, like a lover." It's interesting to read how much more equivocal early predictions about the World Wide Web were, for instance, than Kelly is--and how much those early predictions hinted at the epistemic closure that is the dark side of Kelly's yielded-to desires. Edward Tenner, for example, from 1994:
Any future information network will help unhappy people secede, at least mentally, from institutions they do not like, much as the interstate highway system allowed the affluent to flee the cities for the suburbs and exurbs. Prescribing mobility, whether automotive or electronic, as an antidote to society's fragmentation is like recommending champagne as a hangover remedy.

(My favorite is this anonymous anagram of information superhighway: "New Utopia? Horrifying sham.") The Web has made it easier than ever to experience the self-othering of sublimity--but also has made it easier than ever to avoid the possibility of self-othering at all. Technology wants whatever we want it to want--even what we might not like to admit that we want.
By Molly Sheridan
What Technology Wants was the first book I purchased for my Kindle, and considering I'm a feet-dragger when it comes to new tech tools (the reader was a gift from my husband), the reverse concept was never far from my mind as I read: Yeah, and what about what I want from technology?
What Matthew suspects is correct: Kelly will hold to the idea that progress is always a good thing, no matter what, to the very end of the book. I thought his arguments as to why that was might make for some interesting discussion here. He says, repeatedly, that because new technology increases choice, it is always good, no matter its dueling positives and negatives. As someone who regularly laments our society's consumerist focus and my own (admittedly romanticized) interest in simpler living, this did not resonate for me at first, but later he acknowledged my anxiety:
It is true that too many choices may induce regret, but "no choice" is a far worse option. Civilization is a steady migration away from "no choice." As always, the solution to the problems that technology brings, such as an overwhelming diversity of choices, is better technologies. The solution to ultradiversity will be choice-assist technologies. These better tools will aid humans in making choices among bewildering options. That is what search engines, recommendation systems, tagging, and a lot of social media are all about. Diversity, in fact, will produce tools to handle diversity.
While this did not totally alleviate my issues with his "more is more and therefore always good" stance, I saw his point. And Kelly himself says in several places that he wants everything to be available, but then allows only a small curated list of items into his own life.
As for the anthropomorphic issue, I started to buy into that much more easily once it was clear that this wasn't a case of whether technology wanted chocolate or vanilla, but rather the idea that by understanding certain trends in development to date (trumpets, helmets, light bulbs), we might be better equipped to anticipate new technology and prepare for it. I took that to mean that we could better avoid disruptors like Napster and Pirate Bay in the future. Kelly writes:
The better we can forecast, the better we can be prepared for what comes. If we can discern the large outlines of persistent forces, we can better educate our children in the appropriate skills and literacies needed for thriving in that world. We can shift the defaults in our laws and public institutions to reflect that coming reality.
For example, if technology is going to "want" to mess around with human genes--and we can probably say more confidently that it will with every advance in genetic mapping that comes along--then what do we need to be thinking about ethically, morally, and technologically today before we get there? In a way, it's kind of like adopting the Amish way of integrating new technologies: a way to contemplate their impact on the community before adopting them whole hog, thereby avoiding some negative consequences.
I also came around to some of his discussion in this area once he clarified that he was thinking not of some super robot that we were about to build and endow with extraordinary independent AI, but rather he was anticipating the ways technology was going to make humans smarter/more efficient/etc. through augmentations, whether that means a smarter skillet or a smarter search engine.
What I was really searching for was a way to reconcile the technium's selfish nature, which wants more of itself, with its generous nature, which wants to help us to find more of ourselves....Yes, technology is acquiring its own autonomy and will increasingly maximize its own agenda, but this agenda includes--as its foremost consequence--maximizing possibilities for us.
Ah, the anthropomorphic fallacy. Is there any more comfortable way of avoiding having to deal with the darker impulses of human nature? And like a lot of other optimistic views of technology, What Technology Wants is steeped in it. Kelly even puts it in the title, straight off begging the question: does technology want anything?
Now, I should clarify up front that I don't think anthropomorphization is necessarily a bad thing. It can be a useful way to illustrate ideas, an interesting lever to peel back assumptions, &c.--that is, as long as, on some level, we still acknowledge that it is a fallacy, that it's a metaphor or an allegory we're adopting for rhetorical convenience. Kelly's Technium, his nummulosphere of technology that he proposes as the Seventh Kingdom of Life, is a great metaphor, a neat mind-bend that makes the provocative sweep of his book possible. The problem is, at least so far (I'm only halfway through the book--I had four concerts to review and a rehearsal and two services to play this past weekend, cut me some slack), Kelly doesn't think it's a metaphor. Every time he comes close to acknowledging it, he immediately falls back into it (as early as pp. 15-16, for example). Which means that he makes some equally unacknowledged assumptions--assumptions that consistently push aside the responsibility of human beings to, once in a while, not take the path of least resistance.
The most important one of these assumptions--and one that, idly skipping ahead, it seems he will maintain for the entire book--is that Progress is a Good Thing. Kelly introduces a bunch of metrics--longevity, urbanization, and so forth--that are, indeed, progressive, steadily increasing over time. Kelly then emphatically, if not literally, capitalizes the P in progress. On page 100, he cites the steady increase in life expectancy over the past century, asking, "If this is not an example of progress, then what is it?" What it is is a particular datum that is increasing over time. I think most everybody would be pleased with this increase (that is, when we're not looking at long-term Social Security projections), but that is still opinion. Kelly thinks that it's fact. A page later, he states his creed: "Progress is real." No, it's not. Progress is a belief system.
Which is not to say it's not a useful belief system when it comes to making sense of the world. But, like any belief system, it brings with it the danger of ignoring anything that might disrupt its order. Chapter 4 of What Technology Wants sets up information as a force to balance entropy. Here's how Kelly rather nicely defined entropy:
Each difference... becomes less different very quickly because every action leaks energy down the tilt. Difference within the universe is not free. It has to be maintained against the grain.

The thing is, the same thing happens with thinking vis-à-vis belief systems. Belief systems are the entropy of intellectual activity, shunting thought down more frictionless channels. Which is why, I think, Kelly goes on to talk about information as an impersonal entity. I think you can make a reasonable case that entropy exists outside of human observation, but information? Isn't information pretty much defined as a signal that is useful to us in ordering our sense of the world? It's why we call the other signals noise. But for Kelly, information is a work-around that maintains Progress in the face of the accelerating disorder of entropy.
One more example for today: on page 16, Kelly introduces us to the PR2 robot, programmed to find its own power source:
Before the software was perfected, a few unexpected "wants" emerged. One robot craved plugging in even when its batteries were full, and once a PR2 took off without properly unplugging, dragging its cord behind it, like a forgetful motorist pulling out of the gas station with the pump hose still in the tank. As its behavior becomes more complex, so will its desires.

This is a great anecdote. At first, it seems to confirm the idea that technology is evolving beyond our control in a life-like way--even developing the capacity for behavior analogous to human compulsion and irrationality. (That robot's crazy!) But look closer: who says that such cravings and behavior meant that the PR2's software was imperfect? That's right--we do.
The PR2 and its software reminded me of one of my favorite books, Michel Foucault's Madness and Civilization, in particular Foucault's analysis of the factors that led to the institutional confinement of the insane:
[I]n the history of unreason, it marked a decisive event: the moment when madness was perceived on the social horizon of poverty, of incapacity for work, of inability to integrate with the group; the moment when madness began to rank among the problems of the city. The new meanings assigned to poverty, the importance given to the obligation to work, and all the ethical values that are linked to labor, ultimately determined the experience of madness and inflected its course. [p. 64, emphasis added]

We define madness--whether it be in other people or in the machines we build--in terms of the order, however consciously or unconsciously, we want to maintain: another belief system.
Am I enjoying the book? Yeah, actually--Kelly tells a story that has great scope and cheerful ambition, he makes interesting connections, and pretty consistently sparks deep thinking. I fully admit that I am a glass-half-empty kind of guy, but I also like entering the glass-half-full world, something that Kelly facilitates with straightforward fluency. The Technium, I think, is a good myth, in the sense of being a framework for making increased sense of the world--useful information, in other words. But every time I find myself thinking hey, wait a minute, I have to remind myself: Kelly actually believes it.
With music, and culture more broadly, as the context in which we're reading Kevin Kelly's book, the (entirely hypothetical) evolution-like course of the development of musical instruments is something I've been especially interested in.
I understand that Kelly defines "technology" broadly enough to include written language (which is something that Julian Dibbell, it's worth mentioning, also emphasizes in his introduction to this year's Best Technology Writing collection), and by extension cultural production.
Early on in his What Technology Wants, Kelly discusses two things that play into an understanding of where musical instruments came from, and where they're headed.
First is an anecdote, in which Kelly talks about the paleontologist Niles Eldredge's interest in the trumpet. (I read the book via the Kindle software on my iPod Touch, so I can't give a specific page number for this.)
As a hobby he collects cornets, musical instruments very similar to trumpets. Once Eldredge applied his professional taxonomic methods to his collection of 500 cornets, some dating back to 1825. He selected 17 traits that varied among his instruments--the shape of their horns, the placement of their valves, the length and diameter of their tubes--very similar to the kind of metrics he applies to trilobites.
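For the curious, here's a rough sketch of what "applying taxonomic methods" to a collection might look like in code. To be clear, this is my own toy illustration, not Eldredge's actual procedure: the cornets, the traits, and their values are all invented, and the single-linkage clustering is just one generic way of grouping objects by shared characteristics.

```python
from itertools import combinations

# Hypothetical trait vectors: each instrument is scored 1/0 on a handful
# of invented traits (valve placement, bell shape, etc.). Eldredge used 17
# traits across 500 cornets; four instruments and five traits suffice here.
cornets = {
    "Courtois 1855": [1, 0, 1, 0, 1],
    "Besson 1880":   [1, 1, 1, 0, 0],
    "Conn 1905":     [0, 1, 1, 1, 0],
    "Bach 1930":     [0, 1, 0, 1, 1],
}

def hamming(a, b):
    """Count the traits on which two instruments differ."""
    return sum(x != y for x, y in zip(a, b))

# Pairwise distances between every pair of instruments.
distances = {
    (m, n): hamming(cornets[m], cornets[n])
    for m, n in combinations(cornets, 2)
}

def dist(a, b):
    """Look up a pairwise distance regardless of the order of the names."""
    return distances.get((a, b), distances.get((b, a)))

# Naive single-linkage clustering: repeatedly merge the two closest groups,
# printing each merge so you can see which instruments pair off first.
clusters = [{name} for name in cornets]
while len(clusters) > 1:
    i, j = min(
        combinations(range(len(clusters)), 2),
        key=lambda pair: min(
            dist(a, b) for a in clusters[pair[0]] for b in clusters[pair[1]]
        ),
    )
    print("merge:", sorted(clusters[i]), "+", sorted(clusters[j]))
    clusters[i] |= clusters[j]
    del clusters[j]
```

With 500 instruments and 17 traits you would presumably reach for real phylogenetics software rather than a toy script, but the underlying move is the same: turn each object into a vector of traits, measure distances, and let the groupings fall out.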
Second is Kelly's emphasis on progress as mapped by a move from what might be described as physical to virtual goods--from an industrial economy to a service economy, and hence in instrument terms from physical instruments to software-based ones. Of course, the virtuality of software-based instruments is a hedge, since they're predicated on a physical object, namely some sort of computing device, be it a conventional computer or a mobile device like an iPhone.
In the end, I'd love to know what Eldredge--which is to say, by extension, we--thinks about how virtual instruments will affect the projected development of the trumpet.
I also read the new Kelly book in the context of recent perceived about-faces by one-time technology evangelists, like Jaron Lanier's You Are Not a Gadget and Douglas Rushkoff's Program or Be Programmed, but that's a separate line of inquiry.
This week (November 15-19) members of our friendly music and culture blogger think tank are once again gathering around the computer, this time to reflect on Kevin Kelly's What Technology Wants. This wasn't in the book, but I sure hope technology wants 3-D printers that produce wine and cheese so these gatherings can have sharable snacks in the near future.
In the meantime, to get everyone's appetite whetted, what the book does cover is not just the recent explosion of new technologies and how they are impacting our lives for better and worse, but how their increasing sophistication ties into the really long trajectories of evolutionary development reaching back to the Big Bang. In so doing, Kelly attempts to get a glimpse down the road a pace and see what various examples might indicate about where we and technology are headed. In Kelly's view, trying to put on the brakes is futile and, in fact, an active anticipation and embrace of these "wants" is to be encouraged. Technology is not a neutral force. There are bad and good uses, but new technology increases choice (which for Kelly is always a good), adding just enough weight to the plus side of the equation that over time such progress is always more good than bad.
If you need a concrete musical/cultural example as to why we should be champing at the technological progress bit, consider this one which Kelly offers towards the end of the text:
If the best cathedral builder who ever lived was born now, instead of 1,000 years ago, he would still find a few cathedrals being built to spotlight his glory. Sonnets are still being written and manuscripts still being illuminated. But can you imagine how poor our world would be if Bach had been born 1,000 years before the Flemish invented the technology of the harpsichord? Or if Mozart had preceded the technologies of piano and symphony? How vacant our collective imaginations would be if Vincent van Gogh had arrived 5,000 years before we invented cheap oil paint? What kind of modern world would we have if Edison, Greene, and Dickson had not developed cinematic technology before Hitchcock or Charlie Chaplin grew up?
When this book first hit the shelves, I knew I wanted to read it but that without the chance to dig around in its ideas with colleagues, I'd probably miss out on some of the valuable meat of the exercise. I admit that I had a bit of an ongoing debate with myself as I read, wondering if I had made a bad call in picking this as a selection for what is ostensibly a music and culture blog. In the end, however, Kelly's ideas--whether I agreed with them or not--really scratched my itch for occasionally pulling the musical discussion back far enough that we can see the issues and ideas impacting the much wider human picture. We are then free to apply them as we desire to our own narrower work. I think What Technology Wants offers us that in spades, and I hope everyone enjoys the conversation this week.
Some resources:
Kelly's review of his own book, pointing out what he sees as its major arguments.
Jerry Coyne, professor of ecology and evolution, reviews the book for the NYT Sunday Book Review, pointing out what he sees as some of its major flaws.
An audio interview with Kelly about the book.
Kelly's TED talk on related themes.