Dissonant Notes

Thursday, September 20, 2012

Hating Hipsters: How The Mainstream Hijacked Authenticity And Made Non-Conformity A Joke



Hipster. As I type the word, I feel a sense of inner defeat. Popular culture is still obsessed with hipsters, and particularly with using the word hipster as a byword for moron. I have already written about hipster being used in the pejorative sense (at this point there’s no other kind of usage), yet here we are: a new essay, and still I feel the need to reopen this particular can of worms. Why exactly does the whole world want to distance itself from the term hipster? That’s easy: hipster means trend-follower, somebody who only likes bands that are cool, somebody who ditches bands when they start to become famous (no such person actually exists, but that is what the term is meant to convey). Ultimately, it implies people so image-conscious that they live in fear of being or doing anything remotely uncool or unhip. There’s a reason why so many people are anxious to be thought of as geeks. Geeks are uncool. Geeks just like what they like and don’t care whether it’s cool or not. Geeks are authentic. When we get right down to it, hating hipsters is a way of declaring your authenticity.

Confused? OK, what is the term authentic meant to convey? Something real, something unaffected. I looked up the word authentic in the dictionary and this is what it says:

au·then·tic [aw-then-tik]
adjective
1. not false or copied; genuine; real

Why does hating hipsters make you authentic? Because what people hate most about hipsters is how (supposedly) phony they are, how much they (supposedly) worry about whether the music they listen to is too mainstream. When you’re uncool, you just are. You don’t care. Hell, you don’t even know what’s cool or uncool anymore, right? You stopped caring ages ago. If you keep digging, you get to the truth behind hipster-hating, which is this: people who make a big deal about hating hipsters, or who take the time to mention how unhip they are, genuinely believe that everything they enjoy (books, movies, music, visual arts), they enjoy because, and only because, they like it. They haven’t been influenced in any way by societal trends, or peer pressure, or advertising, or notions of cool and uncool. In other words, these non-hipster people think that they, and they alone, have risen above all worldly influences and reside in a pure state of unaffectedness.

There’s an irony here because being hip was originally a quest for authenticity, a quest that began by rejecting mainstream society. In his book Sincerity and Authenticity, Lionel Trilling observed that at one point, in Western society, sincerity was the most praiseworthy personality trait. Sincerity meant honesty, singularity, and freedom from hypocrisy. It implied an earnest and uncluttered approach to life. With sincerity held in such high esteem, however, society also viewed insincerity as the ultimate sin. People were at pains to present a cohesive, consistent personality. This strain led to the idea that presenting yourself as sincere was in itself a falsehood, a disguise that one had to wear in order to be judged worthy by other members of society. Sincerity became inextricably linked with the falseness of bourgeois society, with social masks and etiquette. With bourgeois morals under attack, authenticity soon overtook sincerity as the most worthy personality trait. Authenticity implied a healthy lack of concern over how others perceived you. “Here I am flaws and all, take it or leave it” was the message that resided at the heart of an authentic life. Authenticity therefore meant rejecting societal norms, exploring other avenues, doing whatever one felt like. If you contradicted yourself, so be it! At least you were being authentic and not worrying about the judgment of others or the petty morals of the day.

The shift in attitude from sincerity to authenticity as the preferred personality trait coincided to a large degree with the rise of popular culture. When popular culture exploded in the ‘50s it ran parallel with notions of rejecting mainstream values and mores in order to live an authentic life. The Beats, Teddy Boys, and hippies all consciously rejected societal norms. They looked and frequently acted ridiculous by everyday standards, but this ridiculousness was an attempt to scrub away years of societal repression. These youth movements sought to rid society of the imposed norms of politeness and decency that up to that point had denied basic aspects of human life, such as sexuality.

Since the ‘50s, popular culture has attempted more and more to align itself with notions of authenticity. This realignment can be observed most dramatically in the world of advertising. Where once advertisers sought to induce brand popularity by implying something was missing from a consumer’s life, more and more we see products being marketed as an obvious extension of a consumer’s way of living. Instead of buying a product to become somebody different, you buy a product because it reflects who you already are. In other words, advertisers are savvy to the fact that most people don’t want to feel like they’re being sold something they don’t want. Consumers would rather feel like they were going to buy that product anyway but just hadn’t heard about it. Popular culture and advertising have done such a good job of appropriating authenticity that anyone attempting to criticise popular culture, or reject mainstream values, is looked upon with suspicion and pity. They are viewed as being inauthentic. Yet at the same time, nobody thinks of themselves as slaves to popular culture and shifting tastes. What exactly is going on?


It appears that we are at a point where normality has hijacked the notion of authenticity so completely that we think of those who differ from us in any way as insincere. Except in place of the word insincere, we now use hipster or snob. How so? To enjoy mainstream music or movies is supposedly to be unaffected by the judgment of others; it is a grand act of rebellion whereby one shakes off the shackles of cool and simply embraces what one truly enjoys. What happens, though, when you encounter somebody who dislikes something you enjoy? You’ve been grooving to the Gotye album for a few months now when suddenly you hear somebody at work talk about how much they hate Gotye. Hate Gotye? What is their problem? Well, you realise it isn’t cool to like Gotye, but you don’t care about things like cool. This Gotye hater must be hung up in some way. They must be a hipster. They’ve foolishly bought into notions of cool and as a result are afraid to enjoy Gotye. You, however, are unafraid. This tendency to attach derogatory labels to those who differ from us, while ignoring the ways in which we resemble the targets of our anger, appears to be one of the main fuels for hipster hatred. It also exposes the staggering falsity behind ideas of authenticity.


In his recent book A Universe From Nothing, physicist Lawrence Krauss describes a strange phenomenon that results from an expanding universe. Because everything is always moving away from everything else, observations made at any point in the universe make it look like you are at the centre of the universe. If you were suddenly transported to the other side of the universe, however, it would look the same. You aren’t the centre of the universe; it just looks that way. When it comes to outside influence on our opinions and tastes, a similar phenomenon seems to be at work. Each person imagines that everyone else is a hipster, that everyone else is being influenced by society, that everyone else is inauthentic. If the music they enjoy happens to be hip, well, that’s neither here nor there. They enjoy it because they like it, not because it’s hip. People even imagine that their close friends and loved ones fall under the sway of outside influence. Trust me, right now your friends think you’re only listening to that one album because of Pitchfork, or because everybody at that one indie record store likes it, or because the cute person you work with likes it too. Everything they like, however, is free from such taints.

Your brain is a wonderful thing. It does tend to play one nasty trick on you, though. It gets influenced by everything around it and then makes you believe that no influence was involved. It rationalises your decisions, making it seem like you have very sensible reasons for your actions when those actions are mostly irrational or based on emotion. Let’s say you’ve started listening to an album that you had previously dismissed, and let’s say that your friend’s suspicions are correct and you really have started listening to it because a cute co-worker says it’s their favourite album. What will be going on in your head? Well, your brain will be telling itself that you’ve been meaning to reinvestigate that album for a long time anyway (which may have an element of truth to it) and so in reality there’s no connection between your secret crush and the fact that you’ve started listening to this album. There’s a term for this in psychology. It’s called the Introspection Illusion.

The Introspection Illusion works like this: we firmly believe that we have access to the mental processes through which we come to decisions, but experiments indicate that we don’t. We therefore make up what seem like completely rational reasons for our actions after the fact. Worse than that, we believe that our introspection is more dependable than the introspection of others and, as a result, consider ourselves to be superior when it comes to self-reflection. We imagine that other people are dishonest and untrustworthy in their own self-reflection but that we are truly self-aware. We do this by believing the rationalisations of our own behaviour and by being suspicious of other people’s rationalisations. In other words, everybody is quietly suspicious of everybody else’s behaviour.

In this light, hatred of hipsters looks like the Introspection Illusion run wild, with a heavy dose of psychological projection thrown in for good measure. If we can externalise and demonise what we believe are weak-minded traits (conformity, trend-hopping), then we can avoid any unpleasant truths about our own behaviour. We can see this mindset play out in every aspect of society, from personal taste to political opinion. Terms like ‘sheeple’ and ‘brainwashed’ appear on a daily basis all over the internet, with millions of people absolutely convinced that almost everyone else is a walking automaton, incapable of true introspection and intellectual honesty.

Beyond the self-delusion that allows somebody to hate hipsters, there is also an even more unpleasant side to this cultural phenomenon. With mainstream and authenticity now seen as being essentially the same, those who flout culturally endorsed gender roles are viewed with suspicion. The hipster trait that seems to draw the most ire is fashion sense. This apparent obsession with surface and image is not only seen as pathetic, it is also viewed as feminine and unbecoming for men. (Understand that when I use terms like feminine and masculine, I’m using them in the sense of how certain behaviours are viewed. Our society equates femininity with being a woman and masculinity with being a man. I am not attributing an obsession with image and fashion to being a woman.) Through the ages the idea of femininity has become synonymous with certain unpleasant characteristics, namely superficiality, passivity, and weakness. Fashion is considered the realm of outward appearances, shallowness, and a willingness to follow trends on a whim, and as such is inextricably linked in the minds of many with femininity. The moment a man steps into the world of fashion he is considered feminine. Indeed, the term “faggy hipster” is almost as popular as “hipster”. Women hipsters generally get an easier time of it than men do, probably because society has no problem with a girl in skinny jeans. Leaving aside big glasses, there’s no overriding trait of hipster women that ruffles society’s feathers. Women just get a harder time of it in a larger sense.1





Contempt for hipsters reveals not only a nasty disdain for the feminine, it also quietly endorses derision for those who veer away from traditional gender roles and for those who differ from the norm in general. Hipsters are often criticised for all looking the same, yet those doing the criticising look identical to most members of society. These critics exist in an approved normality that allows them to rationalise their own conformity while remaining extra-sensitive to the conformity of those who look different. Normal society, when faced with any subgroup (hippies, punks, hip-hop fans, hipsters, etc.), will take great pains to point out how uniform this subgroup is. Thousands upon thousands of people make jokes about the conformity of hipsters, about how hipsters won’t do something because it’s too mainstream, yet never take the time to explore the conformity of the very joke they are making, never mind their own clothes and tastes. Once again we see psychological projection in action as anxiety about our own personal conformity is eased by finger-pointing and laughing at those “others” who conform.2




Seen for what it is, hating hipsters is just another way of society policing itself. From time immemorial, those who reject societal norms, even in some small way, have been punished with social stigma. Modern society demands that we see ourselves as thinking, free-willed individuals who have somehow arrived at the perfect equilibrium. Each of us in our own way imagines that our attitude to life is the right one and that those who disagree with it are simply being unreasonable or nonsensical. When faced with such unreasonableness, it helps if we can attach labels that stigmatise those who think and act differently. If you can label that person who doesn’t like Gotye a hipster and a snob, then your individuality can remain intact.

It appears that we have reached a point where people cannot accept that somebody doesn’t like the music of a band that they happen to love. They imagine that this dislike must be the result of some inherent character flaw, a flaw that thankfully they don’t have. These same people imagine that someone who likes a band they themselves dislike must also possess some inherent character flaw. People who like music that we do not are only feigning enjoyment in order to appear cool, something that we ourselves would never do. God, who has time to worry about such things? Here, however, is the very difficult truth: we are all conformists on some level. Those who truly do not conform are mostly dead, in jail, or outcasts and pariahs. Clearly hipsters conform, but they also reject certain societal norms (which do you reject?). It appears that even these small rejections are enough to set off firestorms of rage and condemnation. The term ‘hipster’ is a handy put-down for all occasions. So here’s another truth: when you call out hipsters, or use ‘hipster’ as a way to stigmatise somebody else, you’re not only projecting your own fears of conformity onto somebody else, you’re also being an uptight moral guardian. You are keeping everyone in line. You are enforcing strict gender roles. You are enforcing strict dress codes. You are enforcing strict attitudes to taste. You are condemning those who veer, even slightly, away from what your idea of reasonable happens to be. Your friends probably agree with your judgments, so it feels right.



We are at a time when it is almost impossible to be truly rebellious in terms of dress or taste. Everything has a niche. Yet the ever-growing hatred of hipsters reveals a deep fear behind this liberal acceptance of most choices. It reveals a fear that we ourselves are merely well behaved consumers who in almost every sense toe the line. When faced with such gnawing insecurity about our own authenticity, if we can point and laugh elsewhere and attribute motives and ideas to complete strangers, then it helps us retain our own sense of individuality. Much like the right-wing tactic for demonising welfare recipients, almost everyone is able to trot out some story about a ‘hipster’ who possessed the most clichéd opinions and then apply this approach to everybody else who looks similar. Even if a person bears only a slight resemblance to our mental picture of a hipster, if they possess opinions about music or movies that differ from ours, then a quick, sneering ‘hipster’ or ‘snob’ remark will ease our troubled minds.

In a modern capitalist society we are constantly bombarded by product, and when confronted with such a bombardment it would seem helpful to have a strong sense of taste, to approach each product with discernment. Yet the anti-hipster movement finds authenticity in uncritical acceptance of all correctly marketed products. It demands unyielding conformity and untroubled passivity toward all cultural artifacts that pass a certain popular threshold. It also imagines that absolute conformity somehow frees a person from the burden of conforming to what non-conformity looks like. In a wonderful twist of logic, non-conformists are the true conformists; you (the uncool, non-hip conformist) are merely being you, which involves looking the same as the majority. You don’t look like that in order to conform, however, you just dress like that because the look appeals to you. In the past, the label non-conformist was itself a pejorative term, and people suffered social ostracism because they didn’t conform perfectly. These days, because people don’t like to think of themselves as conformists, a new word was clearly needed to deride those who make reasonable, everyday people uncomfortable about their choices. The word had to undermine a person’s whole being, destroying their credibility by implying a pathetic motive for their actions. It had to be able to render a person’s entire existence laughable. Well, now we have it. It’s hipster. Wait, don’t tell me. You just use that word because you genuinely hate hipsters though, don’t you? My mistake.



1. In a patriarchal society, masculinity is considered natural, while femininity is considered unnatural. For this reason, any way a woman dresses is viewed with great scrutiny. Women are placed in a bind in regards to their fashion choices, which goes something like this: if a woman accepts a feminine approach to fashion then she is consciously or unconsciously inviting men to look at her. She is asking for attention. She cannot complain about being objectified because she is objectifying herself by playing up to societal notions of femininity and female sexuality. If she rejects a feminine approach to fashion then she is being contrary. She is probably a ‘feminist’ (all negative connotations implied). She is clearly not doing herself any favours. A criticism often made by men about attractive women is “She’s beautiful but she knows it”, as if women should exist in a state of childlike innocence in regards to their looks and sexuality. The moment a woman is aware of the power of her appearance she is conceited and manipulative. Women are scrutinised for signs of hypocrisy, attention-seeking, and superficiality, while the way men dress and act is seen as natural and uncomplicated, unless men make fashion choices that are seen as feminine, at which point they will be viewed with a scrutiny similar to that reserved for women. Western societies were set up to reward masculine traits while suppressing and dominating feminine ones. Berating hipsters is one small but not unimportant aspect of that.


2. It’s interesting to note that the American alternative music scene that emerged in the early ‘80s was one that enforced strict masculine, puritanical guidelines. Since then, any kind of dressing up or gender blurring has been looked on as suspicious and silly. Even though credible artists such as David Bowie, Roxy Music and Funkadelic all looked and sounded out of this world, the post-hardcore American scene seemed to view such antics as distasteful. It was all about the music, such Puritanism being a deliberate stand against the image-conscious pop stars of the ‘80s, pop stars who were for the most part women, African-Americans, or males who seemed to ignore traditional notions of masculinity. The ruling white, Christian, uptight mindset that permeated every aspect of American society appeared to have its strongest supporters in the alternative American music scene. Glam created gender confusion in the ‘70s, and ‘80s pop was the real offspring of glam (pop is viewed as feminine and as such is accused of the same ‘crimes’ as femininity itself, while masculine rock is seen as natural and proper). American alternative music fans felt more comfortable with well-defined gender roles and overwhelmingly masculine musical expression, and the Riot Grrrl movement was, if anything, an all-out attack on the stifling masculinity of this scene. It’s no surprise that Portland (and the Pacific Northwest in general) is seen as the ultimate home of hipsters.

Saturday, August 25, 2012

Brian Wilson Tired of Music Critics Putting Him in a Box



The name Brian Wilson conjures up a host of labels: songwriter, singer, bass player, genius, lunatic, legend. When faced with such a complex personal history and daunting creative output, music critics seem to find it easier to attach one of those labels, put Brian Wilson in a box, and move on. Things turned ugly recently, however, when one music writer attempted to do just that.

According to reports the writer arrived at Wilson's house with a gun in his hand and began screaming expletives and yelling, “Just get in the box”. He had even brought a sturdy wooden box with him.

This is the seventh time in recent memory this has happened, and Wilson is said to be afraid for his life.

David Finch, Wilson’s 72-year-old neighbour, claimed in a recent interview that he literally saved Wilson’s life on one of these occasions.

“I pop in and visit Brian regularly,” said Finch, “He leaves his door unlocked all the time so I just walk in and start chatting. Well, this last time I walked in I heard a strange noise. It alternated between gentle weeping and this kind of guttural moan. I’d never heard anything like it since ‘Nam. Turns out some cold-hearted bastard had gotten Brian in a box and then nailed it shut. It took me almost an hour to get him out.”

Police are continuing to investigate the incidents but remain puzzled as to the motives of the writers in question. Wilson’s house is now under constant supervision by armed guards, and they have been ordered to shoot on sight if intruders are seen to have a box of any kind.

2012 was to have been a big year for Wilson, with a full-blown Beach Boys reunion in the works as well as a host of reissues, but music critics forcing the reclusive genius into a box one too many times may put paid to all those plans.

Al Jardine, Wilson’s band-mate in the Beach Boys, made a heartfelt plea to music critics earlier today.

“Please, for the love of god stop trying to put Brian in a box. He’s not fucking Houdini. Enough with the whole box thing. Seriously.”

Time will tell whether Jardine’s plea is listened to.


(This article originally appeared on Collapse Board)

Tuesday, August 21, 2012

Why Chris Hedges is Wrong About Science and Atheism




I’d like to begin by saying that I have a lot of admiration for Chris Hedges. The impulse for this essay did not come from a general dislike of the man but from a deep sense of frustration. My frustration stems from his repeated attempts to demonise science and rational thought while at the same time framing religion as a force for good. His latest attempt, a recent essay entitled The Science of Genocide, lays out his worldview for all to see. In it, he places the blame for much of the slaughter of the 20th century firmly on scientific thinking, portraying science as a reckless monster unleashed by unthinking humans. Scientists themselves are seen as morally inept servants of the war machine, happy to create bigger and better weapons in order to feed their egos and bank accounts.

Behind Hedges’ jeremiads, one can discern a dangerous romantic appeal to unreason and a kneejerk disgust at philosophical materialism. One can also make out a borderline fanaticism that attempts to draw lines in the sand; woe betide anybody on the wrong side of that line. Take this little nugget from early on in The Science of Genocide:

…few in the sciences look beyond the narrow tasks handed to them by corporations or government. They employ their dark arts, often blind to the consequences, to cement into place systems of security and surveillance, as well as systems of environmental destruction, that will result in collective enslavement and mass extermination. As we veer toward environmental collapse we will have to pit ourselves against many of these experts, scientists and technicians whose loyalty is to institutions that profit from exploitation and death.

Ignoring for now the blustering tone, the above quote is not only inflammatory, it is quite categorically wrong. Peer-reviewed scientific studies on climate change have been in agreement for years about the environmental damage being wrought by humanity. Without these studies, we would be clueless as to the effects of climate change. Why, I wonder, did these studies ever see the light of day, given that scientists apparently exist in a moral vacuum ever ready to appease their corporate or governmental overlords? They saw the light of day because Chris Hedges is wrong. He has painted his overwrought picture with broad strokes that care not a jot for accuracy but instead concentrate on maximum emotional appeal. Scientists are dehumanised and seen as corrupt and evil:

…a world that will protect the ecosystem and build economies that learn to distribute wealth rather than allow a rapacious elite to hoard it, will never be handed to us by the scientists and technicians. Nearly all of them work for the enemy. 

Did you read that? Nearly every scientist works for “the enemy”! Producing no data to back up this outrageous and dangerous claim, Hedges condemns all but a few scientists now alive, labeling them slaves to an enemy whose actions will ultimately destroy life on earth:

They will relentlessly push forward, exploiting and pillaging, perfecting their terrible tools of technology and science, until their creation destroys them and us. They make the nuclear bombs. They extract oil from the tar sands. They turn the Appalachians into a wasteland to extract coal. They serve the evils of globalism and finance. They run the fossil fuel industry. They flood the atmosphere with carbon emissions, doom the seas, melt the polar ice caps, unleash the droughts and floods, the heat waves, the freak storms and hurricanes.

There’s no room for doubt in Hedges’ world. Science is evil and scientists are almost exclusively malevolent. On the other hand, he holds on to the notion that religious thinking can be a force for good. At the beginning of The Science of Genocide, Hedges invokes the bombing of Hiroshima to show how destructive science can be. Yet the fact that the crew of the Enola Gay identified as Christian, as did the President who authorized and celebrated the bombing, is apparently of little interest to Hedges. He chides scientific thinking for being morally neutral, which it is, but champions religious thinking for its moral questioning. Religion is not morally neutral, yet it was helpless to stop the destruction of Hiroshima. Science can make a bomb, but being morally neutral it cannot justify the bomb’s use. Religion can justify its use, which makes it much more dangerous. At this point Hedges would no doubt point out that religion inspires people to feed the hungry and to help the sick, ignoring the fact that science almost certainly helped with food production and transportation and with the creation of life-saving medicines. To note these obvious facts would make grand moral condemnations more difficult. Scientists are almost exclusively servants of the enemy. No moral grey areas. End of story.

In another recent essay, How to Think, Hedges lambasts scientific thinking and urges his readers to take a leap of faith and champion the imagination, at the expense of fact if necessary. Early on, the essay serves as a warning to society:

Human societies see what they want to see. They create national myths of identity out of a composite of historical events and fantasy. They ignore unpleasant facts that intrude on self-glorification.

But it soon turns into an appeal to unreason:

The Shakespearean scholar Harold Goddard wrote: “The imagination is not a faculty for the creation of illusion; it is the faculty by which alone man apprehends reality. The ‘illusion’ turns out to be truth.” “Let faith oust fact,” Starbuck says in “Moby-Dick.” 

“It is only our absurd ‘scientific’ prejudice that reality must be physical and rational that blinds us to the truth,” Goddard warned. There are, as Shakespeare wrote, “things invisible to mortal sight.” But these things are not vocational or factual or empirical. They are not found in national myths of glory and power. They are not attained by force. They do not come through cognition or logical reasoning. They are intangible.


This is where Hedges’ conflicted mode of thought is exposed for all to see. One minute he is lamenting a society that refuses to face the truth in terms of historical and scientific facts, the next he exhorts us to move beyond materialistic facts and to treasure things unseen and unexplainable by reason. He does not make the connection that people ignore historical and scientific facts precisely by clinging to illusory and unreasonable worldviews. Ignoring climate change, racism, homophobia, sexism, anti-Semitism, nationalism: all of these require denial of facts and staunch unreasonableness. Right-wing Christianity is one of the loudest voices denying climate change, as well as evolution. Their faith has ousted fact, and American society (and by extension the entire world) suffers as a result. Hedges’ anti-scientific thinking places him firmly in a strong Christian tradition, as does his championing of unreasonableness in the face of facts.

On the plight of the American LGBT community, he again stealthily endorses religious thinking even as he attacks mainstream religious institutions. His article, entitled The War on Gays, comes accompanied by an image of Jesus walking on water while holding in his arms an LGBT youth. Under the water we see a monster about to attack, a monster that turns out on closer inspection to be a mitre. The image is disingenuous and not a little sickening for many reasons. For one, when it came to homosexuality the Jesus of the New Testament was at best morally neutral. He does not mention homosexuality at all, which many have suggested implies he was just fine with the homophobia of the Old Testament and felt no need to repudiate it. The Bible, New Testament included, is rife with stomach-churning homophobia. Appeals to reason and logic remain at the heart of every modern quest for equal rights, and the LGBT community is no different. Opponents of these rights appeal to unreason, to kneejerk emotional disgust, to fear of the ‘other’. To show an image of a morally neutral, mythological character coming to the rescue of LGBT youth is an outrageous sleight of hand that attempts to position religious thought as the savior of an embattled group of people, when in fact the opposite is true.



The article itself is mainly an extended interview with a pastor who outright states that it is the church waging war on the LGBT community. Hedges’ use of a pastor to deliver these words obviously means that the article is not a flat-out condemnation of religion. Seeing as Hedges has a problem when atheists criticise religious thinking, using a pastor seems like a damage limitation exercise on his part. At one point Hedges states:

As the economy unravels, as hundreds of millions of Americans confront the fact that things will not get better, life for those targeted by this culture of hate will become increasingly difficult. Rational debate will prove useless.

Now he wants rational debate. Yet at times Hedges moves himself to the side of frantic emotional appeals to unreason, as when he condemns thousands of hard-working scientists for merely being scientists. His own unreason surfaced earlier, when he reacted in anger to the rise of New Atheism. Hedges dubbed these new atheists fundamentalists who were the flipside of religious extremists. His evidence? Sam Harris seemed to have a distinctly anti-Muslim streak and Christopher Hitchens supported the war in Iraq. Now, as far as I can see, Sam Harris is effective when listing the dangers of religion but makes for a bad philosopher, as his book The Moral Landscape proves. It contains little science, and its philosophical outlook amounts to rehashed utilitarianism. His views on Muslims do at times appear superficially similar to those of certain right-wing Christians. Hitchens’ support of the war in Iraq was, to my mind, a terrible moral error. It endorsed a western cultural colonialism that Hitchens spent most of his life railing against. Yet for all that, the atheism put forward by Hitchens, Harris, Richard Dawkins and Daniel C. Dennett was not by any means a united front. For one, both Dawkins and Dennett took very public stands against the war in Iraq. Dawkins added his voice to the Not One More Death book, which publicly condemned the Iraq war, and Dennett wrote the following in 2007:

Saddam Hussein was an extraordinarily evil dictator, and the world is well rid of him, but the steps taken by the USA to accomplish this – unilateral, arrogant, and shockingly ignorant about local conditions – have brought shame on the nation.

They have also been stunningly counterproductive. Respect for America has plummeted worldwide, a dangerous development both for us in America and for those around the world whose well-being and security is partially protected by American support for principles of freedom and equality.

Our declarations of good intent are now deservedly regarded with cynicism by our friends and suspicion by those who desperately depend on us.

Hedges’ insistence on comparing New Atheism with religious fundamentalism is gross stupidity. The words of atheists have prompted no violent actions, have persuaded none to fly planes into buildings, or bomb abortion clinics, or wage war, or mutilate women’s bodies, or force women to cover up their bodies lest they be physically or sexually assaulted, or circumcise males, or execute homosexuals, or commit any number of other acts carried out by religious fundamentalists. At heart, Hedges seems to despise the philosophical materialism implicit in atheism and seeks to condemn high-profile atheists for thought-crimes and for brutalities that they have never, as atheists, endorsed or carried out. As a logical or ethical choice, atheism is no more extreme than vegetarianism; indeed, it has fewer practical implications. Right-wing Christians have more often been the most war-hungry of Americans. There is a difference between publicly stating that God is a delusion and endorsing all manner of atrocities and, if Hedges can’t see the difference, then something is terribly wrong. Unfortunately, Hedges is not alone in his thinking.

After the death of novelist David Foster Wallace, a commencement speech he gave became an internet sensation. In this speech he, too, casually lumped together religious fundamentalists and atheists, labeling them both as repulsive. Here is his reasoning:

True, there are plenty of religious people who seem arrogant and certain of their own interpretations, too. They're probably even more repulsive than atheists, at least to most of us. But religious dogmatists' problem is exactly the same as the story's unbeliever: blind certainty, a close-mindedness that amounts to an imprisonment so total that the prisoner doesn't even know he's locked up.

This seems like a common criticism of atheism, one which Chris Hedges and thousands of others (especially in America) uncritically endorse, despite the fact that the “fundamentalism” of atheism, as stated, leads an atheist to no acts of violence. It appears as if simply being an atheist is committing an unforgivable thought-crime. This critique rests on the premise that the answer to all great questions must lie somewhere in the middle, or that one must keep an open mind to both sides of the argument lest one become dogmatic. Let’s take the issue of slavery, though. Before its abolition, there were loud and sometimes violent debates about the merits of slavery. Should the mature mind have taken the middle way, refusing to take an extreme stand? That would have been moral cowardice, an act almost as immoral as supporting slavery. Women’s rights: should women get the vote? Still think the middle way is best? LGBT rights? Need I go on? To suggest that finding some kind of middle ground is always the most sensible and mature option is dangerous thinking. Wallace’s speech also does the opposite of what it supposedly intends to do. He goes out of his way to apply a negative label to someone for simply having a different viewpoint from his own. This atheist viewpoint hurts nobody, yet to some it is as dangerous as murderous religious fanaticism. How did we get to the point where this logic is taken seriously, and by people who are seen as serious and intelligent?

The truth is that atheism gets under people’s skin. The idea of the annihilation of the self after death seems so frightening to many that they will shoot the messenger in order to shut out the idea. Atheists are labeled smug, arrogant, repulsive, and all manner of ad hominem accusations are leveled at them. For reasons that can only be personal, Chris Hedges is one of the voices trying to slander atheism and scientific thinking. He wants them to be considered part of the enemy, and he wants religious thinking to be on the side of good.

There is a phrase connected with Buddhism, “The Stink of Enlightenment”, which is meant to convey the idea of a spiritual fascist who believes that they have risen above the mundane and mediocre and exist only in the heavenly realms of deep religious contemplation. Those who fall victim to the stink look on the ordinary person with scorn. They mock their spiritual emptiness and lament their materialistic existence. With this in mind, consider a Hedges essay from 2010 entitled After Religion Fizzles, We’re Stuck With Nietzsche, which contains his vision of what life without religion will look like, and it stinks. Hedges goes all out to paint a religion-free world as one populated by smug, self-satisfied, mediocre little people who care for nothing more than what’s in front of them. Without religious thought to make us care, we will be contemptible sheep, happy with whatever consumer culture gives us. Taking Nietzsche’s idea of the Last Man to be his vision of a completely secular society, Hedges goes on to say:

The Last Man, Nietzsche feared, would engage in the worst kinds of provincialism, believing he had nothing to learn from history. The Last Man would wallow and revel in his ignorance and quest for personal fulfillment. He would be satisfied with everything that he had done and become, and would seek to become nothing more. He would be intellectually and morally stagnant, incapable of growth, and become part of an easily manipulated herd. The Last Man would mistake cynicism for knowledge.

“The time is coming when man will give birth to no more stars,” Nietzsche wrote about the Last Man in the prologue of “Thus Spoke Zarathustra.” “Alas! The time of the most contemptible man is coming, the man who can no longer despise himself.”

“They are clever and know everything that has ever happened: so there is no end to their mockery.” The Last Men indulge in “their little pleasure for the day, and their little pleasure for the night.”


There are several things wrong with this picture (leaving aside the fact that a secular society was not what Nietzsche feared or was trying to describe). For one, in any open, democratic society individual rights generally increase as religious influence wanes. Hedges refuses to explore the complex relationship between individuality, rights, and capitalism. He credits the great religions with beginning the quest for individual rights (which is probably partly true) but then laments the vulgar individuality he sees as the end result of this quest. Religious influence must wane in an open society, but Hedges imagines a completely secular society to be a worthless, morally bankrupt one. The picture he paints of humanity is a distressing one. His vision of the future is always bleak, while he imagines a past where political dissent was tolerated, or even welcomed, by those in power (as Chomsky points out, this has never been the case). Hedges seems so intent on viewing the secularisation of society as some grand fall from grace that he ignores the fact that never in the history of civilization have so many people had their rights protected and guaranteed.

Ultimately, despite my criticisms, I see Chris Hedges as one of the truly vital voices in modern political journalism. He shines a much-needed light on the crimes of Western states around the world, he details the grotesque brutality of globalization and all the misery it brings, and he was on the front line of the Occupy movement, giving it consistent and supportive exposure. In many ways, his ideals are my own. I too detest the corporatisation of our political system and by extension our society, but unlike Hedges I do not connect atheism or philosophical materialism to the rising evil of corporate rule. The influence of corporations and the ruling elite poses a threat to every decent human being. They would gladly debase any democracy, new or old, and reduce the rights of individuals to those of serfs. They exploit the poverty of the Third World and use it as leverage to reduce the rights and expectations of workers in the First World. Yet my atheism and love of science make me feel like I am the enemy of Chris Hedges. The society he describes, which blindly worships science, is not the one I see in America. Individual rights were won by appeals to reason, and to steer away from that course is incredibly hazardous. Many atheists and scientists are more than willing to be moral voices in the battle against corporate domination, and if Chris Hedges can’t see that then he possesses none of the open-mindedness and wisdom which he insists are essential to living an ethical life.

Wednesday, August 15, 2012

The iPod and the Function of Music




Something strange is happening to the music industry. Modern technology is turning back the clock to the extent that record companies are leaning more toward single-song deals as opposed to album deals. Songs are becoming more important than albums, and any music that wishes to achieve chart success must be a big pop moment rather than a small piece in a bigger picture. Sound familiar? When rock and roll first exploded in the mid ‘50s the album meant nothing. The focus was on the single, and only big-name artists like Elvis expected healthy album sales. When jazz entered its most artistic phase in the late ‘50s it shifted public perceptions and transformed the album into a piece of art. Post-Beatles rock also elevated the album, making it a credible artistic statement and not simply a collection of singles. While writers like Nik Cohn lamented this shift from feverish pop excitement to serious art, most welcomed the change, and performers who did not follow suit became obsolete. Pop movements such as disco have always preferred the song over the album, but the idea of the album as the true medium of the artist has dominated public thinking since the mid ‘60s. Which technology has managed to break the album’s stranglehold? The iPod and digital downloading look like the guilty parties.


What the iPod, and by extension digital downloading, has done is move the focus from the artist to the listener. While artists may want to pepper their albums with more difficult tracks, the listener now has the option to ignore those tracks and simply download whatever combination of songs they wish. The danger here is that listening to music becomes more about instant gratification than long-term enjoyment. Prospective music buyers often listen to small snippets of songs before choosing to buy, and as such the onus is on the song to instantly grab the listener’s attention. The switch from vinyl to CD allowed music fans to skip any unwelcome songs without too much trouble, but now they can simply avoid owning them altogether. In a sense, it makes music listening more democratic, but that spells bad news for artists who take the album seriously. One of the cornerstones of modern artistic thought is that art should challenge those who encounter it. Modern music consumption allows the listener to bypass such notions as ‘being challenged’ and go straight for the stuff with instant appeal.


The other way the iPod has changed music listening is that music is no longer a private experience. In the past music was something you rushed home to listen to, with concerts providing the communal aspect, but the iPod now allows people to listen to music everywhere, be it while jogging, taking the bus, driving, sitting in class, or working on a paper at the coffee shop; music listening becomes ubiquitous. True, a Walkman or a portable CD player allowed for many of these same experiences but, unless the listener made himself or herself a mix CD or tape, they remained confined by the artist’s vision in terms of running order. With the iPod the listener has ultimate control and much more music available at his or her fingertips. In this environment, music takes on a more functional aspect. Its purpose mutates from meaningful art that enriches our lives to boredom-relief and quick-fix stimulation; the musical equivalent of the lifestyle magazine in the dentist’s waiting room.


Before anybody thinks this is another ‘the modern world is turning us all into shallow, gadget-obsessed sociopaths with the attention span of a goldfish’ essay, I’d like to inject some optimism. While ‘50s rock and roll was indeed song-driven, with a high turnover of artists and a heavy emphasis placed on teen trends and heavy profits, from this situation bloomed modern popular culture. Each change in technology brings out the usual doomsayers who decry the lack of humanity and warmth in machines and harken back to a time like the ‘60s when technology was less dominant. Take a look at contemporary articles from the ‘60s, however, and you will find many familiar-sounding essays that paint the picture of a mindless youth obsessed with TV and loud music. Humanity possesses a noble ability to adapt modern technology to meet its creative needs. There will always be those who look for surface-level thrills, and there will always be those who like to dig a little deeper. Many of us will find ourselves doing both. The challenge for modern musicians is how to adapt to this loss of artistic control brought about by the iPod revolution. To those who think that serious music will be dealt a lethal body-blow by the iPod, I present the following example.


At the beginning of the 20th century, the two dominant artistic modes (other than the visual arts) were the novel and poetry. Novelists and poets were the superstars of the age, and their opinions on public matters carried real weight and influence. With the emergence of popular culture, however, a shift in attitudes occurred. While the novel, beaten and bloody, survived, the relevance of the poem all but disappeared. Why would this be? To be blunt, poetry did not have mass popular appeal. While the pulp novel sold to millions, there was no pulp poetry. Intellectual novelists existed on the fringes, yet they at least retained a much higher cultural relevance than poets did after the ‘60s. Novelists still have relevance because the novel retains mass popular appeal, not in spite of it. In other words, unless a particular artistic discipline has massive popular support, it ceases to be culturally relevant. Difficult, challenging, and downright unlistenable music exists in the shadow of popular music, but take away that shadow and the whole endeavor becomes meaningless. Just as jazz seemed to be scaling untold artistic heights, it suffered cultural death because it was all intellect and no pop appeal.


While the iPod and digital downloading may have transformed the cultural landscape, those who bemoan music’s apparent shift away from serious art toward pop pulp should take comfort in the fact that music in general remains relevant. Perhaps this pushes the ‘true artist’ to the sidelines, but that is how it should be. An artist without a fight is a dog without a bark. For the truly independent artist there are many options available, and the cheapness and availability of home recording equipment puts a higher focus on self-containment. The one thing that may truly suffer is collaborative music-making. The future may lie with the solitary creator as opposed to the group. The two most important cultural disciplines to emerge in the 20th century were recorded music and cinema. Each has produced stunning works of artistic greatness, but each wrestles daily with notions of commerciality and mass appeal. This struggle provides these disciplines with much of their energy and relevance. To those who wish for the eradication of vulgar, flashy pop: be careful what you wish for. You may lose more than you think.

Tuesday, August 14, 2012

Why Pop Music Matters (No Matter What Age You Are)





Throughout my life I’ve always been comfortable in the critically approved world of ‘serious’ music and in the societally approved world of pop. Each served a purpose, each was able to fit a particular mood or occasion. At the ripe old age of 23 I co-wrote a fanzine called No Remakes that nobody bought, and in the 1998 yearly round-up I proudly made ‘Stop’ by The Spice Girls one of my singles of the year. I meant it too. Maybe it’s a British thing but I have never viewed pop music as something to fight against. In recent years, however, I’ve found myself drifting away from the world of pop to such a degree that I had trouble thinking of any pop songs in recent years that I truly enjoyed. Was I, gulp, getting too old for pop music? Is there a point at which a person just shouldn’t enjoy pop music anymore? Perhaps pop music just isn’t as good as it once was? I decided to go through every possible scenario in my mind, in an attempt to get to the bottom of my predicament. What conclusions did I reach? Let’s save that till the end.

The first question to, ahem, pop into my mind was “Am I too old for pop music, and is that a bad thing?” Pop is ambient music for the young. It blares out from clubs, bars, mobile phones, car radios and iPods. Its energy matches the restlessness and emotional turmoil of youth. Isn’t it only right and proper to leave pop music to the young? Is holding out for pop thrills past your mid-30s an act of gracelessness and desperation, the cultural equivalent of being cryogenically frozen in order to forestall decay? Should I simply accept that this music is not for me, is not made for me, does not have me in mind? It seems like the easy solution, but if I can enjoy a pop song at age 28 or 32, why not 36? I began to suspect that tiring of pop music meant tiring of life, and that soon enough my drifting away from pop would harden into dislike, which would then transform into open antagonism, and before I knew it I would be blethering on about music not being as good as it was back in the day. My god, had it already happened? Shaking my fist at the sky, I decided to defy the gods and take back my love of pop music. In order to do this I needed to kill the biggest myth of all: the myth that pop music just isn’t as good anymore.



It’s natural to attach greater significance to the music that surrounds you between the ages of 13 and 30. Society drills home just how thrilling, carefree, and full of promise those years are. You fall in and out of love numerous times, you start earning money, and you move out of your parents’ home. You become an adult. During this time, you don’t have to seek out pop music. It finds you. As well as heartbreak, pop music peddles two main slogans to young ears: “Be yourself” and “Do whatever you want to do”. Behind these essentially bland messages lurks pop music’s most meaningful directive: live for the moment. Don’t worry, enjoy this song, dance some more, buy another drink. By doing this, you will be obeying your true instincts, you will be throwing off the shackles placed upon you by an uptight society. The swirling cloud of responsibilities, of morning alarms, of unkept promises and unpaid bills will evaporate in that moment and you will truly exist.

As you get older, opportunities to cut loose become less and less available. As life gets more serious we romanticise our youth because life was, in retrospect, less serious. Eventually everybody has to figure out what they want from life and whether they have any real chance of getting it. Youth allows us to push those issues away, to think about them some other time. Pop music allows us to enjoy the moment so, in a sense, we are not merely enjoying the song but the emotional context of the song. With the emotional context that youth offers (lack of responsibility, endless potential) no longer present, our ability to enjoy pop may suffer as a result. What if, however, there really has been a decline in pop music’s quality level?

Pop music, for many people, began with The Beatles. (It also ended with The Beatles for many people, at least as a credible form of music.) Didn’t you know that The Beatles were simply rock and roll with the rough edges removed? Didn’t you know that rock and roll was just a commercialised bastardisation of blues and country? Didn’t you know that country and blues songs were merely debased folk music and cheapened Christian spirituals with updated lyrics about sex and drinking? It goes on. Every time music changes it has its champions and its critics.

The tragedy of rock music is that it went from cutting-edge rebellion to conservative defender of values in a very short amount of time. Music magazines still run stories about Dylan going electric as a singular moment in rock history, and each person who reads this story shakes their head sadly at the idea that anyone would castigate Dylan, thinking that they, obviously, would have embraced this thrilling new sound. These same people then decry the current state of music and complain loudly at almost every new development, claiming that the current version of pop is some degraded, commercialised bastardisation of what music once was. Despite the obviousness of the historical lessons above, each generation still produces thousands of individuals who imagine that THIS time music really has drifted too far from its roots, that some essential quality is missing, that music has become meaningless.



Ultimately, nobody can prove one way or the other whether ‘music’ was ever good or bad, and to think that anybody can launch a rational argument based around the idea that the entire musical output of a new generation is somehow not meeting some in-built standard is foolish beyond words. When a music fan starts to imagine that the essential spirit of music lies in holding on to an old idea rather than embracing a new one, it’s probably fair to say that they have become something of a musical conservative. I say this without labeling myself the most forward-thinking of listeners. I merely state it as an absolute, unarguable fact. No art form or style has ever held firm amid the onslaught of modernisation and emerged the victor. The only things that have come close to ending innovative thinking and inevitable change thus far are murderous totalitarian governments. Left to their own devices, many artists willfully experiment, and those in the commercial field are no different. This is not to say that pop music is above criticism. If pop music has a problem, however, it is in its process and in its reception. While the music plays on regardless, an intellectual war rages beneath the surface. With charges of frivolity thrown constantly at pop, postmodernism came to its rescue, bringing a brand new set of problems in its wake.

There is something rotten at pop's core. While pop is undoubtedly more welcoming to women and non-white performers than rock, it has a tendency to use and discard those same people at will. Women's looks are under constant scrutiny in the world of pop, to the extent that a little extra weight can undermine a performer's entire career. Once a person's moment under the spotlight is over, hosts of cackling jackals take great delight in declaring that person a non-entity. Pop worships at the altar of youth and beauty, and anyone deemed old or ugly is expected to wander off into the cold and die.

Pop music is also free-market driven. Those who imagine that pop music pushed through important cultural gains, for instance viewing MTV's decision to play Michael Jackson's 'Billie Jean' video as a watershed moment in race relations in America, are actually outing themselves as cheerleaders for neo-liberalism and market-driven change. Postmodernism's embrace of pop as a stick with which to beat academia and serious critics presents a huge contradiction in terms of postmodernism's supposed aims, i.e. the breaking down of accepted cultural norms about Western Civilization. It exposes postmodernism for what it is: an in-house coup by one set of academics at the expense of another. Neo-liberalism is merely the next phase in Western Civilization's obsessive belief that freedom and the free market are inextricably linked. The fact that postmodernism is willing to embrace this belief shows that postmodernism is merely the next link in the chain of Western thought rather than a serious attempt to undermine it. Postmodernism bows down before the power of the market as much as any neo-con and, as such, it props up the single most important and dominant aspect of Western culture, the very one that Western armies and corporations are forcing on the rest of the world as we speak.



This brings me to another highly unpleasant aspect of pop music, namely the cultural reaction to it. In Britain, pop music seemed like a natural thing to enjoy. I, and many of my friends, felt no need to provide an intellectual framework for our appreciation (I have no idea whether this mindset prevails). In America, however, being vocal in your love of pop music seems to take two forms: an overtly intellectualised postmodernist approach that treats pop songs like a blank slate upon which to inscribe your scholarly credentials, or a bloodless ironic appreciation that renders every pop song "awesome". Both are, in their way, the reactions of people embarrassed by the idea of enjoying pop music who, as a result, feel the need to show that they are somehow smarter than the music they are enjoying. The postmodernist approach to pop criticism treats rock critics as the 'establishment', even though pop is clearly the cultural victor despite lacking the same critical credibility. Granted, rock criticism does need a good kicking most of the time, but I'm inclined to believe it doesn't really matter anymore and that kicking it is a pointless exercise. This wasn't always the case, but rock music's rejection of pop led to an "us against them" mentality. The main problem was that the "them" in this equation referred to a very specific demographic, namely women and non-whites.

Rock critics gave up on pop the moment it stopped being straight white guys with guitars. Even though The Bee Gees matched these criteria, most still refused to take them seriously. I mean, they wrote disco songs! Rock criticism quickly fell prey to the lure of the profound, but the problem was that it failed to see profundity in the ecstasy of dancing or in the sensual rhythm of untortured sexual impulse. It spurned novelty and wit, preferring the anguish of 'authentic' emotion. To put it bluntly, rock critics flat-out refused to see worth in almost any post-soul music made by black performers (other than reggae until 1978 and hip-hop between '87 and '97, although even that is debatable), and women were all but denied entry to the rock club altogether (admiring Patti Smith and Kim Gordon just isn't going to cut it).

The truth is that rock music has been playing catch-up since the early 70s in terms of innovative thinking, relying instead on the outdated romantic notion of authentic expression as a way to feel superior to pop. Though pop was seen as frivolous and feminine relative to the masculine profundity of rock, its innovations received approval from the public even as rock critics scorned them. If history has shown us anything, it's that pop songs are not as throwaway as we imagine. Given the safety of distance, disco's stature has only grown over the years. Expect a resurgence of interest in New Jack Swing any minute (deservedly so).

Of course, the tendency to rescue overlooked musical movements brings its own problems. With rock as the 'establishment', championing old music was given a free pass as long as the music wasn't classic rock. Postmodernism gave retro thinking its full approval on the condition that it struck a blow against stodgy rock thinking. With guitar music as the only true enemy, championing anything that had ever angered the rock press became a cause for celebration, even if it was blatantly retro and backwards. Spurious postmodernist thinking has no new enemy to move past and, as such, is as much to blame for our cultural inertia as backward-looking guitar bands. That said, pop continues to move forward as it has always done.



Pop took the artistic inclination to experiment and pumped it full of business-think steroids so as to keep the music in a constant state of revolution. It put those opposed to pop’s agenda in the unenviable position of either championing artistic conservatism or endorsing deliberately unmarketable product as a means to sneer at the novelty-driven desires of the pop music aficionado. In other words, pop outflanked all of its critics by making them extreme traditionalists or anti-populist cranks. The deep, dark secret at the heart of the pop experience is this: pop music doesn’t need an intellectual framework, it doesn’t need postmodernism, and it certainly doesn’t need this essay. It lives, breathes, and devours all in its path regardless of whether you approve or not. It doesn’t care whether you give your endorsement with an ironic smirk or with a heartfelt scream. Pop music is smarter than you are. So why bother paying it any attention?

The way I see it, enjoying pop music has nothing to do with trying to keep up with what the kids are doing. It's more a way of allowing yourself to develop in new directions. Pop is not a genre: rock'n'roll, rock, soul, funk, reggae, disco, new wave, R&B and hip-hop have all at some point been thought of as pop music. Pop is a way of thinking. Genuine enjoyment of pop music shows that the paint hasn't dried yet on the portrait that is you. The slow descent into senility, when your musical tastes shrink in ever-decreasing circles and you only endorse artists who remind you in some way of the music you enjoyed in your mid-20s (without reminding you too much, otherwise you'll accuse them of being rip-off artists), has not yet begun. Which isn't to say that a person must turn off their brain and give approval to every new development that pop takes. On the contrary, picking and choosing is at heart what the pop experience is all about.

We may not like every new development, but to imagine that we can undo all of pop's revolutions, that we can go back to some older time when 'real artistry' had more cultural value, is to place ourselves with Meissonier as opposed to Manet. It would involve some kind of fascistic artistic aristocracy closed to all except those who followed the strictest of rules. The vulgarity of the market is the perfect antidote to such ugly thoughts.



The cruelty at the heart of pop, its tendency to discard the obsolete, can feel cold when you are on the receiving end but exhilarating when you are in the vanguard of a new movement. Loving pop doesn't mean loving only pop, or indeed loving the majority of the Top 40, but the moment you stop loving pop altogether is a sure sign that your brain is settling into a holding pattern. Granted, there are those who have never liked pop and spend their time exploring the weird outer reaches of experimental and challenging music that revels in its non-commercial nature, but this is merely the more serious-minded flipside of the same philosophy that drives pop: the philosophy of reveling in the new and unexplored. If you've ever felt the thrill that only a great pop song can provide, but find yourself forgetting when you last felt that thrill, then I think it's time to throw on the local pop radio station, ignore the fact that 'Moves Like Jagger' is still being played six times a day, and try to find something to love. That's my plan. I feel like something important is at stake.

For those with an ear for such musical endeavours, pop has the power not only to move you (in every sense) but also to stop you from becoming boring and predictable, set in your ways, complete. If you're feeling distant from pop music, throw yourself back into the maelstrom. The minute you begin to wonder whether a song is actually good, with no critical consensus to guide your thoughts, is the moment your brain comes alive again. Only pop provides this shock. I know you're on the internet, so quit messing around and find those pop hits before it's too late.



(This essay originally appeared on Collapse Board)

Wednesday, February 8, 2012

English Blues – The Untold History of the Chunka-Chunka Song


One of the last major questions in the history of music remains unanswered: what was the first Chunka-Chunka song? Wait a second … you don't know what a Chunka-Chunka song is? Let me explain. A Chunka-Chunka song is a song where one can, without too much trouble, sing the phrase "Chunk, chunk, chunka-chunka" over the top of the music. A good example would be 'Dead End Street' by The Kinks. Have a listen:


Do you hear what I mean? Chunk, chunk, chunka-chunka. This sound has forever been associated with English pop, with ribald cockneys gathering round the old Joanna for a right old knees-up. The Chunka-Chunka song, however, has a strange ancestry, one that jumps back and forth across oceans and across time.

In 1920 Mamie Smith released what is generally regarded as the first blues recording, a song called 'Crazy Blues'. It is not the blues as commonly imagined, which tends to mean either Delta or Chicago blues. This was blues mixed with vaudeville and jazz: a good-time, bad-time big-city blues that could easily be thought of as an early jazz recording. 'Crazy Blues' sounds closer in spirit to a song like 'Dead End Street' than anything in the British music hall tradition does.


'The Unfortunate Rake', an old English folk ballad, crossed the Atlantic and became 'The Streets Of Laredo', but somewhere along the way it also transformed itself into 'St. James Infirmary Blues'. In 1928 Louis Armstrong recorded a version of 'St. James Infirmary Blues' which went on to become one of his signature recordings. You can hear the influence of this recording on 'Minnie The Moocher' by fellow jazz artist Cab Calloway. Calloway would even record his own version of 'St. James Infirmary Blues', but it is 'Minnie The Moocher' that stands as the grandfather of the Chunka-Chunka song. Though the influence of music hall on The Kinks was strong, the spirit of 'Minnie The Moocher' feels stronger; 'Dead End Street' sounds more like English vaudeville blues than music hall. Yet even before The Kinks, the Chunka-Chunka song would be awakened from its slumber by another American source: Motown Records.


'Where Did Our Love Go' was close, but it was The Supremes' 'Baby Love' that really brought back that Chunka-Chunka feeling. Perhaps it doesn't qualify as an indisputable Chunka-Chunka song, but it is pure proto-Chunka-Chunka. The race to deliver the first real Chunka-Chunka number was on. 'Whatcha Gonna Do About It' by The Small Faces was just a little too fast, 'Girl' by The Beatles just a little too slow. In a moment of inspired madness, The Beach Boys made a majestic effort to take the prize with 'God Only Knows'. Gorgeous, sublime, and deliciously melodic it may be, but 'God Only Knows' is pure Chunka-Chunka. The only thing it lacked was bite, and perhaps a touch of rinky-dinky piano.


In the end, The Kinks deserve the award for the first authenticated Chunka-Chunka song with 'Sunny Afternoon'. It had everything you could ask for in a Chunka-Chunka number, and with that patented English humour it turned the Chunka-Chunka song into an all-England affair. With the floodgates open, The Beatles produced 'I Want To Tell You' and 'I'm Only Sleeping', as well as the almost-but-not-quite-Chunka number 'Good Day Sunshine'. It was at this point that McCartney (desperate to outdo the Beach Boys) went Chunka-Chunka crazy, churning out 'Penny Lane', 'With A Little Help From My Friends' and perhaps the archetypal Chunka-Chunka number, 'Getting Better'.


The Small Faces helped associate Chunka-Chunka with Englishness by penning 'Lazy Sunday' and 'Happy Days Toy Town', but these songs, along with 'Blackberry Way' by The Move and 'Care Of Cell 44' by The Zombies, represented the final flourish for Chunka-Chunka. Its blues roots washed clean by McCartney and his Beach Boys fixation, by the 70s it was all but gone, and the world would have to wait until Britpop, the UK's ill-fated flirtation with its glorious past, before the Chunka-Chunka song reemerged as a recognisable song form.

Both Blur and Oasis made use of Chunka-Chunka and its associations with the 60s and Englishness, penning 'Sunday Sunday' and 'Digsy's Dinner' respectively. They were poor relations compared to their 60s ancestors, and once more it would take an American to breathe life into the corpse of Chunka-Chunka. After 'Pictures Of Me', Elliott Smith's 'Baby Britain' was perhaps the last hurrah for Chunka-Chunka. Rescuing it from Britpop and little-Englander hell, Smith used the Chunk as a backdrop for one of his hellish portraits of alcoholism. Except it all sounded so jaunty and fun. Then it disappeared once more, this time probably for good.


To some, the Chunka-Chunka song is a horrid creature, a stilted, sexless creation that reeks of nostalgia and misplaced national pride. In truth, however, the Chunka-Chunka no more belongs to England than to any other country, and the US has as many claims to the Chunka-Chunka hall of fame as England does. It's hard to declare any particular song form dead, but it's also hard to see how the Chunka-Chunka could be seriously resurrected. If it should die, think only this: that there's some corner of the pop world that is forever Chunka-Chunka.


(This article originally appeared on Collapse Board)

Tuesday, February 7, 2012

R.I.P. Laura Kennedy



I'm going to assume that you know about Bush Tetras. Maybe you're a fan, or maybe you've just seen their name here and there and always meant to check them out. If you're in the latter category, I highly recommend it. Their small discography belies their influence and importance. In their original incarnation they released a few scorching singles and an EP, and then they were gone. Those original recordings are still discussed, still sought after, and still capable of either reducing people to slack-jawed awe or making them dance like demented robots.


That original line-up included bass player Laura Kennedy. As luck would have it, I managed to both meet and befriend Laura. A little over ten years ago I moved to Minneapolis and, after acquiring a work permit, got a job at Cheapo, a record store in Uptown. Laura was already an employee there, and when we were first introduced I had no idea about her past. She just came across as a welcoming, funny, self-assured human being. After a few weeks somebody told me she had been a member of Bush Tetras. At that point they were only a myth to me, a band I'd read about but never listened to. This would soon change.



I remember thinking that Laura was just so ridiculously … cool. She never once mentioned Bush Tetras to me, but she radiated strength and an aura of not giving a damn. She went out of her way to make me feel welcome, at one point even asking me if I wanted to go with her to see a solo show by Exene Cervenka. I never went to the show, partly fearing that my company would be disappointing outside of a work setting. The next day I asked her how the show was and she replied that it had been great, and that afterwards she and Exene had sat around and had a couple of beers. I instantly regretted my decision not to go (and still do), but at the same time wondered what I could have contributed to a conversation between two such bad-ass women.

After we had both moved on from Cheapo, I discovered to my great pleasure that Laura lived in my neighbourhood. Meeting her on the street became a regular occurrence as we caught up on life and the goings-on of Minneapolis. In 2008 she received a liver transplant and things seemed touch-and-go for a while, yet she came through, and before long we bumped into one another again in Uptown. I saw her on the street less and less, but she was always ready to share an opinion on Facebook. I desperately wanted to interview Laura for Collapse Board, but her Facebook page had been quiet and I didn't want to risk bothering her. I decided to wait until I heard how she was doing.

On 14 November I looked at Laura's Facebook page and saw that her health had taken a turn for the worse. Then suddenly the news was out. Laura had passed away that very day. I wish in no way to insult those who knew and loved Laura by implying that our friendship was anything more than random conversations on Minneapolis streets about the ups and downs of life. Yet I liked Laura, and felt in her warmth that she liked me too. The idea that we would bump into one another always filled me with happiness, and it feels strange and rather terrible to think that it won't happen again.

If you’re reading my blog, I know you’ll love Bush Tetras. It’s your kind of music. Smart, funky, and cooler than ice-cold. Laura was a part of that magic. Goodbye Laura.


Laura Kennedy, co-founder of Bush Tetras, passed away on Monday the 14th of November, 2011.

(This article originally appeared on Collapse Board)