This month’s It’s Lit! dives into the world of graphic novels.
Obviously, I’m a fan of graphic novels. I think that the format provides an interesting and engaging storytelling method. Sometimes I think of graphic novels as a step between novels and movies (storyboards anyone?). Other times I think of them as a great way to pare down a story to its elements. And then there are the times when I don’t think too hard and just enjoy reading graphic novels.
I’ve previously written about how the snobbery of literature is especially pointed when it comes to graphic novels. And it always seems to come back to holding up a very certain kind of novel as “literature” and everything else as “unworthy”. Something I’ve come to call defending Fort Literature from the invading Lesser Works.
Maybe if people just gave graphic novels a chance to entertain them…
In the past few decades, literature has expanded to not only mean the “novel” but “graphic novels” as well. Today we are gonna break down how the graphic novel went from the comic book store to the classroom. Hosted by Lindsay Ellis and Princess Weekes, It’s Lit! is a show about our favorite books, genres, and why we love to read. It’s Lit has been made possible in part by the National Endowment for the Humanities: Exploring the human endeavor.
I’ve previously written about how some literary authors don’t really understand or respect genre fiction. Of course, that doesn’t appear to give them pause before sitting down with their quill and parchment – literary authors exclusively use olde timey equipment: true fact – to knock out a genre novel. Their attempts at writing genre tend to reflect this disdain and ignorance of the form, and they end up doing a poor job of writing it.
Well, at least we know he’s treading on well-worn paths and reinventing all the tropes he’s painfully unaware of with his latest novel. But good on him for flying the ignorance flag so high so we don’t waste our time as readers.
It gets better. I received the monthly recommended review books from Penguin and saw McEwan’s new novel, Machines Like Me, on the list. This was the publisher’s blurb:
Our foremost storyteller returns with an audacious new novel, Machines Like Me.
Britain has lost the Falklands war, Margaret Thatcher battles Tony Benn for power and Alan Turing achieves a breakthrough in artificial intelligence. In a world not quite like this one, two lovers will be tested beyond their understanding.
Machines Like Me occurs in an alternative 1980s London. Charlie, drifting through life and dodging full-time employment, is in love with Miranda, a bright student who lives with a terrible secret. When Charlie comes into money, he buys Adam, one of the first batch of synthetic humans. With Miranda’s assistance, he co-designs Adam’s personality. This near-perfect human is beautiful, strong and clever – a love triangle soon forms. These three beings will confront a profound moral dilemma. Ian McEwan’s subversive and entertaining new novel poses fundamental questions: what makes us human? Our outward deeds or our inner lives? Could a machine understand the human heart? This provocative and thrilling tale warns of the power to invent things beyond our control. Source.
Yes, it even has a love triangle. This is certainly not a bog-standard sci-fi novel at all. No sir. This explores big ideas… This is the cover art…
There are several potential explanations here:
McEwan is one of the arrogant literati who would never stoop to reading such crass material as genre fiction. Of course, when they write it, it is very important literature that you should absolutely buy and praise them for writing it.
McEwan is painfully ignorant to the point that someone really should have taken him aside during the (above quoted) interview and shown him the Wikipedia page for Science Fiction on the magical communication box they carry in their pocket.
McEwan is hoping that his comments will stir controversy that will help sell more copies of his books.
Now, I am a bit late to the internet pile-on that inevitably results from a modern faux pas; I stayed out of it because pile-ons are reactionary and lower the quality of discourse. Definitely not because I got distracted by other things. Anyway, the reason I have come back to this incident is that it ties into a thread I have been commenting on for several years now: literary snobbery, or the Worthiness argument.
But the most interesting argument I have seen defining the difference between literature and genre fiction was around the class divide. The snobbery was literally built into the divide because genre stories were published in cheaper books for the workers and the more literary stories were published in fancier books for the new middle class.*
So it is quite possible that the reason we have comments like McEwan’s is that they are tapping into 150 years of class snobbery that keeps them from reading or appreciating genre fiction. If they do read some, it will be classed as a guilty pleasure, because they can’t be seen actually acknowledging genre as having substance.
Or it could just be about attention seeking to sell some books.
* The argument doesn’t really discuss what rich people read. I assume that the rich people were too busy counting money to be bothered reading either genre or literature.
It is no secret that I like to write book reviews. Except when I’m sleepy. Or if I’m busy. Or if I get distracted by… Where am I?
Anyway, I wrote this blog post a year or so ago addressing an example of people conflating book reviews with book promotion. I’m reposting it because of a recent discussion I had where several people were angry that a Mythcreants article failed to mention the authors of books being used as examples.
The issue that people seemed to have with the names being omitted was that it “wasn’t how reviews are meant to be written”.* Apparently the important details of a book review are listing the Title, Author, Publisher, Publication Date, etc, so that people can find the books easily. Yeah, an article on the internet referencing well-known books isn’t making it easy enough to find the books. Their irony meter must have been lost with the demise of Altavista.
Let me state up front: to my mind, book reviews are about helping people find good stuff to read and promoting authors whose work you’ve enjoyed. Of course, some reviewers use them as an opportunity to break away from the internet commenter stereotype and be jerks to others. This can be frustrating for authors and readers, but it’s about par for the course as far as internet comments go.
Fortunately an author has written a blog post advising people how to properly review books. Because you’re doing it wrong…
As a reader, reviewer, and occasionally sober writer, I don’t think authors should be telling readers what to do or how to do it, so the post doesn’t sit right with me. Actually, a lot of things don’t sit right with me, especially if they aren’t single malt and well aged.
Who are book reviews for?
You might be forgiven for thinking that writing a book review is primarily to flatter the author, or thank the author for writing an enjoyable book.
Book reviews are for prospective readers; to inform those buyers who are browsing the Amazon bookstore, chatting on Goodreads or following on-line bloggers, to decide if they might enjoy the book as much as the reviewer did.
The first major point is one I agree with: book reviews are for readers. But maybe not the readers the author thinks. Reviews are primarily about that reader and their thoughts. Sure, they may be trying to communicate with other readers and make recommendations, but let’s face facts, it is mostly just about sharing an opinion. Or is that shouting into the void? I can never remember the difference.
My philosophy on book reviewing – helping and promoting good stuff – is probably shared by many, or not, I haven’t checked. But I see it as an important aspect of being a reader and writer. Thanks to the wonders of technology we have access to more books than we’ll ever be able to read, and some of them are worth reading. Sharing your opinion of a book can help others find stories that will entertain them. Personally I’m not a fan of sharing reviews of books I haven’t enjoyed, just the ones I think others will enjoy reading, but negative and positive reviews are both helpful.
But, that’s just me. Readers aren’t obligated to say nice things about a book, nor promote it, nor make sure there are links to anything (except references, those are damned important I tells ya!).
The next points:
What to include:
The best single rule to remember is this: Only write about the actual book!
You can include a very brief outline of the story, but remember the book description is already right there, so consider these points:
Was the story believable, did it keep you engaged right to the last page?
Did the structure of the plot work for you?
If it’s a mystery, was there one?
The characters. Did they seem real, multi-dimensional people?
The author’s writing style. How was it for you?
The first point on this list is, frankly, rubbish. A book does not exist in a vacuum. Well, unless it was taken into space, but why would anyone do that? Unless they are an astronaut, but they’d want the book in the ship with them.
Anyway, all art/media is a product of the space it was created in; it has cultural and philosophical underpinnings that are part and parcel. And following on from that, the cultural landscape changes over time, and individuals consuming the art/media are going to evolve as a result. So you can’t just write only about the book in a review, you will always bring baggage with you. Want an example? Try watching the original Ghostbusters movie without thinking Bill Murray’s character is just a tad rape-y by today’s standards.
The next points about the story and how it is written are fair enough advice on things you could include. But you could include all or none of those things and still write a good review. The review has a subject and you only really need to include the stuff relevant to that subject – which means you might never mention the author, publisher, publication date, major or minor characters, etc.
The next part is where this blog post gets juicy:
What not to include:
Your possible relationship to the author, however vague.
If you need to reference the author, then use the surname only or call them the author or include their full name. Never use Christian names as it may compromise the validity of the review and some sites will remove them permanently.
Imagine if you saw this review on the latest Dan Brown: Hello Dan love, fabulous book, Five stars! I expect the vast majority of us would laugh, Dan Brown would most certainly cringe – but most importantly, would this sort of review help you form a decision to buy the book if you’d not read it?
I don’t know about the last point, I’ve read Dan Brown’s Inferno; I’m not sure he knows how to cringe.
There are two points to unpack here: the first is the idea that your relationship with the author doesn’t have any bearing on a review; the second is the idea that you’re trying to help the author sell books. To suggest that there is one proper way to refer to an author, or that you shouldn’t mention conflicts of interest, is wrong. If I like someone, I will naturally be inclined to think their book is better than that of someone I don’t know or like. Similarly, if I already like someone’s writing, their lesser works are likely to be viewed more favourably than a new author’s work.
It also irks me that the blog is implying that the author is off-limits for criticism. That is rubbish. If I know that the book or author is controversial, then that will also colour my review and is worth raising. An example was James Frey’s Lorian Legacies series published under a pseudonym. During my reading of I Am Number Four I noticed several very lazy factual inaccuracies and wanted to know who the author actually was. It was then that I found out that Frey had scammed a bunch of writing students to produce the series. Not only did that colour my (lack of) appreciation of the novel, but it was information I felt other readers should know. Because screw that guy.
This links nicely into the next point about reviews helping to sell books. It is true that book reviews help sell books: who’d have thunk? It is also true that if you want to see more great material from an author one of the things you can do is make sure people know you enjoyed the book. But since when is it the reader/reviewer’s obligation to help sell books for an author? Shouldn’t an author be happy that you bought and enjoyed their book? Well, unless you borrowed the book from a library, friend, or got a freebie. I understand the desire of authors to encourage people to review their book/s, and what it can help do in terms of recognition and thus sales, but a review isn’t about selling a book. The review is about the reader sharing their thoughts on a book they have read. Book store clerks get paid, readers don’t. Worth remembering.
The weather! I’m being tongue-in-cheek here but really, no honestly, there’s no need to mention the weather…
How long the book took to arrive in the post, or that it was damaged. This isn’t the fault of the author – stick to reviewing the book.
Likewise, problems with your Amazon account: It won’t download. This is not the author’s fault and should never form part of a book review.
These next few comments are mainly about Amazon reviews and how people talk about the buying of the book in their review. While it is completely understandable for an author to be frustrated with comments in a review that are unrelated to the story they wrote, this recommendation is not only self-serving drivel, it forgets who the review is for.
If someone orders a book from Amazon and the shipment is not filled quickly, or won’t download, or the pricing is ridiculous, or the bonus “massager” didn’t arrive with the romance novel order, then that is a legitimate gripe. Other readers on the Amazon store will want to know this stuff. But even if this wasn’t on the Amazon store, there is legitimate griping to be done. For example, in Australia several major publishers price their ebooks based upon the currently available paper version’s price.** So if the hardcover is out, you pay hardcover prices for an ebook. Where else but a book review are you going to express your consumer discontent on this? Well, aside from in a blog post like this one. Or if you are talking to one of the publishers at a writers’ festival. Or if you know someone, who knows someone, who is related to a publisher.
I’d agree that there needs to be a distinction made between the store’s service, the publisher’s pricing, and the novel itself in site reviews. Having those categories un-lumped would be something I’d support. Along with free hats. Everyone loves a free hat. But I can’t go along with blaming the reviewers for not making that distinction.
Spoilers: giving away crucial parts of the plot and therefore spoiling it for other readers, e.g. I’m glad Susan was dead by chapter three.
Copying and pasting the entire book description – please dont.
And the worst of all: I haven’t read it yet… so one star. Why on earth do sites allow these ‘reviews’ to remain?
Some people really hate spoilers, others love them, others still are ambivalent, others still still will hunt you down and kill you with your own keyboard for posting them. As such there is an etiquette to posting spoilers in a review. If you are an undefeated Muay Thai fighter with a decent ground game, then you can post whatever spoiler you like. If you are anyone else, you can post spoilers as long as you warn people you are going to do it. Then people can read or skip the spoilers to their heart’s content, assuming the Muay Thai fighter hasn’t ripped their heart out of their chest for posting a spoiler.
I’m not sure I’ve ever seen someone post a book review that includes a copy and paste of the book description. If this actually happens then it is either a compliment to the blurb author for capturing the book perfectly so as to act as an appropriate review, or someone needs to learn how to type their own thoughts.
The final point is about the dismissive one star review. Now some people, such as the blog post author, complain that these sorts of reviews are not legitimate. Nonsense. Books have a cover, a blurb, the reader might have read other books by the author, etc, all of which can be more than enough information for the reviewer. For example, let’s say that someone has written a book – fiction or non-fiction, it doesn’t really matter – that claims climate change is a hoax perpetrated by the Chinese, NWO, Zionist Bankers to something something profit-gravy-train. Do I really need to leave a long and extensive review once I’ve read the book? Or can I just point out that it is nonsense? Is that not legitimate comment?
Now these have all been relatively minor gripes to quibble with. Fun fact: Quibbles are the Earth equivalent of Tribbles. Bonus fact: the most famous Quibble currently sits on the President of the USA’s head.
Quibbles and Tribbles aside, let’s talk about complaining about book reviews not being done the way you want, when you yourself don’t review books. There is that inspiring – or is that insipid? – quote about being the change you want to see which applies here. The author of the blog post has exactly zero book reviews on their site, not counting the promotion of reviews of their own novels. The author’s Goodreads page has no books listed as having been rated or reviewed.*** Do as I say, not as I do. I mean, sure, that isn’t demanding at all.
In summary, it’s best to be thankful for readers writing reviews, even the bad reviews.
*Let’s just ignore that the article in question wasn’t a book review but an article that was using famous books as examples to make a larger point. The poor dears are having enough problems with conflating reviews with promotion.
**Not sure this is still the case in 2017. It was the case for many years though.
*** Update: As of 2020, the author whose post inspired this piece has made a few changes to their Goodreads page. In June 2020, it appears they have added 11 reviews, ~90 ratings, and ~560 books to their read profile. No one star ratings, but plenty of ratings only “reviews”. I wonder if they saw this post and decided to walk the walk a bit more?
There is something about music that we all love. By “music” I mean I’m going to discuss the popular stuff that people love to criticise. By “we all” I mean some people, since not everyone likes music, and even music lovers have tastes that differ from the norm. And by “love” I don’t mean the squishy kind. As a music fan, I feel the need to defend modern music, since I quite like some of it.
Recently, there have been a number of people disparaging modern music.
This isn’t a new argument. Much like the kids these days argument – wave your Zimmer Frames at the sky now – the modern music sucks argument is based around a number of cognitive biases. Survivorship bias is one part, in that we only remember the music that lasts, and we certainly don’t remember the bad stuff. One of the more interesting parts of our biases is how our musical tastes are formed in our teens and early twenties (14-24). In part, this is when our brains are developing and we are creating our identity. Another part is that everything is still new and exciting, so we get a rush from experiences that we won’t later in life. So everything after that short time period seems strange and against the natural order of things.*
Pubertal growth hormones make everything we’re experiencing, including music, seem very important. We’re just reaching a point in our cognitive development when we’re developing our own tastes. And musical tastes become a badge of identity. – Professor Daniel J. Levitin (Source)
But of course, rather than discuss the interesting dynamics at play, the discussion has instead latched onto a study that provides “objective proof” that modern music sucks. Rather than directly citing the study, the vitriolic have found a YouTube video that misrepresents the study to suit their preconceived ideas.
So what does the objective proof study actually say? Well, after a quick search – seriously, how hard is it for these whiners to link to and read the damn study – I found the original study. But rather than providing proof that music has gotten worse since the 1960s, it instead directly states:
Much of the gathered evidence points towards an important degree of conventionalism, in the sense of blockage or no-evolution, in the creation and production of contemporary western popular music. Thus, from a global perspective, popular music would have no clear trends and show no considerable changes in more than fifty years. (Source)
Kinda the opposite of the claim, huh! As a general statement, music hasn’t gotten better or worse, it has pretty much stayed the same over the last 50 years. Nobody has ever noticed that…
Other studies have looked into changes in music over time. A more recent study found that styles of music have changed, often becoming more complex over time. But it isn’t quite that simple: the more popular a style of music becomes, the blander it becomes.
We show that changes in the instrumentational complexity of a style are related to its number of sales and to the number of artists contributing to that style. As a style attracts a growing number of artists, its instrumentational variety usually increases. At the same time the instrumentational uniformity of a style decreases, i.e. a unique stylistic and increasingly complex expression pattern emerges. In contrast, album sales of a given style typically increase with decreasing instrumentational complexity. This can be interpreted as music becoming increasingly formulaic in terms of instrumentation once commercial or mainstream success sets in. (Source)
In other words, music sucks because it tries to be popular. And it works.
So saying that modern music sucks is nonsense. What is bland and generic is popular music. Always has been, probably always will be. There is good music being made all the time, you just aren’t going to find it without looking.
* I’ve come up with a set of rules that describe our reactions to technologies:
1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
2. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
3. Anything invented after you’re thirty-five is against the natural order of things.
– Douglas Adams
When I think of literature I think of an older guy sporting a greying moustache, sipping a sherry, wearing a smoking jacket, seated in a library of leather-bound books in front of a simmering log fire. The guy speaks with an aristocratic English accent and expounds on the greatness of some book that other older men dressed like him, sitting in similar log-fire warmed libraries, also like to read when not shagging the maid.
Now clearly not everyone who reads literature fits this image. Some probably can’t even afford a maid to shag. But it does appear to be an image that people aspire toward, an image that informs what they deem literature, and thus what they deem worthy of reading. Rather than judging any written work based upon its lasting artistic merit – although that definition is so subjective as to be useless and ideal for starting pointless arguments… (cough) – people seem intent on creating boundaries before a work is allowed to be judged. They must defend Fort Literature from the invading Lesser Works.
Normally I’d launch into a whinge about how speculative fiction is unfairly maligned, or how I’ve read crime fiction that has more artistic merit than most literary works. But instead, I’m going to talk about graphic novels. In an article on The Conversation, Catherine Beavis explained how the graphic novel Maus came to be part of the literature curriculum.
Despite this explanation, there was always going to be someone in the comments telling us how a graphic novel can’t be literature. I assume they wrote their comments whilst wearing a smoking jacket and taking a break from shagging the maid.
Well well……..so it’s art as literature.
Why not a more well-known comic (sorry graphic novel).
Not saying this isn’t a worthy addition to any curriculum, but more as social comment rather than literature.
Surely the novels of great Australian writers should be preferable – Winton, Malouf, Carey etc.
Let’s break these points apart one by one. As will be seen from further comments, the argument primarily revolves around the feelpinion that because graphic novels contain pictures they are art and thus not literature. A similar argument could be made for movies being TV shows and thus we could abolish the Oscars… actually, that isn’t a bad idea. Anyway, I guess we’d better break the news to the literature professors that Shakespeare’s plays need to be taken off of the curriculum.
The argument then moves to the “I haven’t heard of it, so it can’t be good” assertion. Maybe because they realise this isn’t a great argument, they immediately distance themselves from it. But we start to see the “worthy” argument being formed. I’ve argued many times that “worthy” is a great subjective argument put forward by people who think they are worthy.
Of course, it wouldn’t be a literary argument if someone didn’t cite some authors they deem worthy. For those unfamiliar with Winton, Malouf, and Carey, they are award-winning Aussie authors who write “interior histories” and about “people rebuilding their lives after catastrophe” and “people who experience death and will never be the same again”. None of those statements could be applied to a graphic novel about someone who survived the Holocaust… No sir.
Their list of worthy authors is as subjective as their comments about graphic novels and Maus. I could similarly ask why The Hitchhiker’s Guide to the Galaxy isn’t on the curriculum. It has a lot to say about society and has entered the lexicon, which is more than can be said for any of the other authors mentioned or the graphic novels being shunned. I could say the same again about Superman or Spider-Man, which have implanted ideals and phrases of morality into society, regardless of whether people have read those graphic novels or not.
*Steps on soapbox*
I personally welcome any work into the classroom that will encourage kids to read, think, and learn. And anyone who derides graphic novels is clearly telling us they don’t, or haven’t, read any.
*Steps down from soapbox*
The commenter responded to criticism of their subjective opinion:
That may be so, but my bigger point was that literature = words.
This is art with captions.
Not disputing that it may be hugely popular or good (even great)…
but literature it ain’t.
I think the appropriate response to this is a head shake. The problem is the black-and-white definition of what literature is, which ignores the fact that graphic novels fit the definition of literature. Pointing out the flaws in these opinions is as easy as saying that graphic novels, with very few exceptions, are composed of words. They also use graphics, but that is often a collaboration between the writer and the artists they work with. Thus, by the definition of “literature = words”, graphic novels are eligible to be classified as literature.
But anything to keep only the “worthy” books in contention as literature. Can’t have that kids stuff being called literary!
So I named three contemporary Australian writers – call me subjective.
I am not knocking the (art) form…just that it (to ME) is not literature.
Your opinion is obviously as valid as mine……don’t get huffy.
The last point here is one that irks me more than irksome irkers on an international irking junket. Opinions are not equally valid. That sort of subjectivism nonsense eats away at reality and suggests we “just don’t know, man”.
The commenter made a subjective list, so I put together some examples that were superior in quantifiable ways (impact on society, entering the lexicon, referenced by society) to show that the subjective claims were more worthless than a $9 note because clearly not much knowledge or thought was put into the claims.
There is also the idea of literary critique and argument, rather than stating feelpinions. I’ve stated an opinion and argued it, offering reasoning. The examples I countered with aren’t necessarily the best choices, but I have justified and quantified my argument, something you learn in high school literature class. Art Spiegelman won a Pulitzer, so clearly, someone in the literati agrees. And surely a Pulitzer prize winner is worthy of being on the curriculum. But of course all opinions are equally valid and “I’m entitled to my opinion”, dammit!
Surely the whole point of literature is that the reader has to imagine the scene described, the way words are spoken, the implications of what is said and much more. It’s all in the mind, which develops through reading.
A graphic novel presents the words and pictures with almost no imagining required. The number of words is hugely reduced to give way to often wasted space. In the example above there are 21 words, which if in normal lowercase type could be written in 10% of the space.
Sorry I’m not convinced graphic novels have any merit for senior students.
Shakespeare’s plays give stage directions and poetry is often deliberately obscure. So how do those examples fit this exclusionary definition of literature? I’m sure some artists would object to the idea that they aren’t conjuring a scene that develops in people’s minds. And is the idea to only allow readers to imagine a scene? Isn’t it about conveying ideas and emotions too? Isn’t this some great mental gymnastics to try to maintain Fort Literature from invasion by the Lesser Works?
The second paragraph is also exemplary of someone who hasn’t read many, if any, graphic novels. So, of course, this commenter wouldn’t be convinced that graphic novels are of any merit. First, they’d have to know something. But that doesn’t hold them back from commenting.
While I’m in the mood for alienating folks, let me also say that this is a good example of dumbing down literature.
Give the kids a picture with limited words and maybe they’ll get the idea.
Don’t kids these days have the attention span to read a novel?
The last graphic novel I read was 480 pages long and took many hours to read. It covered sexual identity, morality, the greater good argument, do evil deeds make us evil, etc, as issues. The last “literature” novel I read was about a woman who manipulated people to get what she wanted. It was ~300 pages long and took many hours to read.*
This argument is typical of people who have a snobbish attitude to something based upon pure ignorance of the topic. Similar statements have been made throughout time, decrying the dumbing down or declining standards of today’s youth. Oddly enough it has been proven false again and again only to be spouted once more.
There is a similar article on The Spectator – a home for uninformed opinion – which argues that if we let graphic novels into literature we have to let in everything. They must defend Fort Literature from the invading Lesser Works. Maybe I’ll address that one at some stage when I’m feeling masochistic, but I’m going to leave it there. The maid has arrived.
*This comparison was true at the time of my original comments on The Conversation. I’ve read many graphic novels since, but no further literature novels.
Update: Nerdwriter made a particularly good video discussing Maus and how it is constructed as a story and piece of art. Every frame, every image, the whole page, has meaning. Kid’s stuff indeed.
Update: A great video essay about how you can’t judge art objectively.
When The New York Times hired Bret Stephens, many supporters of sound science were concerned. Bret has a history as a climate science denier and disinformer, using his clout as a Pulitzer Prize-winning journalist to undermine climate science. With the publication of his inaugural column at The New York Times, those concerns were confirmed.
Bret’s piece attacks climate science by attempting to argue that nothing can be 100 percent certain, so it is only rational to doubt claims of that sort. Except that is nonsense.
Climate science has never claimed 100 percent certainty. The evidence for human influences on climate is overwhelming, but scientists don’t claim to know anything with 100 percent certainty. That isn’t how science works. Climate science is routinely reported with error margins and uncertainties.
This isn’t the only problem with Bret’s article. He makes many other factual errors, as covered by Dana Nuccitelli and others. So Bret’s article is either deliberately deceptive, or naively uninformed.
It is hardly the first time Bret has been a climate disinformer. In his previous role at the Wall Street Journal he wrote similar articles that sought to undermine climate science and disinform his readers. During a January 23rd 2015 appearance on Real Time with Bill Maher, Bret used a splurge of cherry-picked historical events and reports to discredit climate science. He included the much-debunked 1970s cooling argument, and an irrelevant reference to a fisheries management conference, in his argument that the experts are probably wrong. Just ignore all the evidence. And don’t check Bret’s claims too closely. So being deceptive or uninformed is nothing new for Bret.
Charts of misinformation in opinion pieces during Bret’s time at the Wall Street Journal (source: MediaMatters.org)
Writing an opinion column at The New York Times that is either deceptive or uninformed does not speak well of the credibility of either Stephens or his new employer. Why would a respected news outlet like The New York Times publish a column that is deceptive or uninformed?
James Bennett, an editor at The New York Times, defended the original hiring decision in the face of criticism. Bennett said, “The crux of the question is whether his work belongs inside our boundaries for intelligent debate, and I have no doubt that it does. I have no doubt he crosses our bar for intellectual honesty and fairness.”
Yet with his very first column, Bret has shown a lack of intellectual honesty and fairness. So exactly how low is the bar being set?
No credible news outlet could allow one of its opinion columnists to continue writing nonsense for it. Has The New York Times sold its credibility on climate science for conservative clicks? Is it doing this to create sensationalism? Either way, it speaks poorly of The New York Times that it would use an issue as important as climate change to grab attention at the expense of public understanding.
Certainly many scientists have decided that The New York Times no longer deserves their subscription (e.g. 1, 2). The response from The New York Times is hardly consistent with its new slogan, “Truth is more important now than ever”. When you respond to scientists who have cancelled their subscriptions over Bret Stephens’ climate disinformation by arguing there are two sides to the debate, or that the scientists can’t stand differing opinions, you have to wonder whether The New York Times understands what Truth actually means.
If The New York Times values truth then they shouldn’t have hired Bret Stephens to write about climate change. If they care about their credibility now they will sack him. But it seems clear that they have sold their credibility for clicks.
With pork-barrelling season in full swing, we will be seeing plenty of politicians hitching their wagons to prominent sports and sporting teams. The proclamations that sports are True-blue, dinky-di, Aussie will come to win over voters, with a little somethin’ somethin’ in the budget to sweeten the deal. Because sport is king in Australia, right?
Aussies are routinely described as sports mad and sports addicts: we supposedly love watching and playing sports in sporty sports ways. But how many of us actually play sports? How many of us actually watch sports? Given that you could describe weekly football matches as repeats of the same teams doing the same thing for months on end, every year, it is worth taking a look at a few of the assumptions behind those claims.
Let’s start with a look at how many Aussies play “sports”. Why the inverted commas around sports? Because when people say that 60% (11.1 million) of Aussies play sports – down 5% compared to 2 years previous – what they actually mean is that we’re classifying walking, and generally not sitting on the couch watching TV, as sport. Let’s make it fairer on sports and subtract the walkers from the sport participants. And let’s not succumb to temptation and call golf just more walking with intermittent cursing. That means our 11.7 million “sports” participants suddenly become 7.5 million, which is 41.4% of the population (and falling with the ageing population).

That figure sounds impressive until you realise it counts participation at least once in the past year and doesn’t account for the regularity of participation. How regularly someone is involved in sports is a much better indicator of our interest in and love of sports – as opposed to counting that time you went to the gym because of a New Year’s resolution, or because the doctor ordered you to out of concern for being dragged into orbit around you at your next visit. The reality is that less than half of the population engage in regular (3 times per week on average) physical activity, with roughly a third of those people being gym junkies. (NB: young men are more likely to play a sport, and that drops with age without being replaced with other activities, whilst women are more likely to be involved in non-organised sports and to remain doing so.)
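To make that participation arithmetic explicit, here is a quick sketch using the figures quoted above. Note that the total-population denominator is back-calculated from the 41.4% figure, so treat it as an inferred assumption rather than an official ABS number:

```python
# Sketch of the sports participation arithmetic above.
# The ~18.1 million denominator is inferred from the article's own
# numbers (7.5m = 41.4%), not taken from the ABS directly.

participants_incl_walking = 11.7e6  # "sports" participants, walking included
participants_excl_walking = 7.5e6   # after subtracting the walkers

population = participants_excl_walking / 0.414  # implied denominator

share_excl_walking = participants_excl_walking / population * 100
walkers_share = (participants_incl_walking - participants_excl_walking) \
    / participants_incl_walking * 100

print(f"implied population: {population / 1e6:.1f} million")
print(f"participation excl. walking: {share_excl_walking:.1f}%")
print(f"share of 'participants' who were just walking: {walkers_share:.0f}%")
```

Roughly a third of the headline “participants” were only walking, which is exactly the gap being pointed at.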
The Top 20 most popular physical activities are dominated by fitness activities like the already mentioned walking, aerobics/fitness, swimming, cycling, and running. One of the big name sports, AFL, ranks 16th on the list behind yoga. When yoga beats football for popularity it must only be a matter of time before the PM declares it the most exciting sport. For those wondering where rugby is on the list, the rest of Australia says ‘hi’.
Of course, this is only looking at sports. How does sports participation compare to other activities? Well, ABS figures show that we spend roughly 23 minutes a day reading, versus 21 minutes on sports and outdoor activities (NB: this varies between genders and age groups). The US figures show similar results, with more time spent reading than playing sports, but Americans spend less of their day on both activities. So at least we are still better read and fitter than Americans, by those low-bar metrics.
Obviously sports aren’t all about participation, and most Aussies would regard themselves as avid armchair sportspeople. It could be argued that the best way to stay injury-free in sports is to participate from the comfort of the couch in front of the TV at home. The other option is to attend a sporting stadium dressed in a random assortment of gaudy colours to cheer on a team who are wearing similar clothes but are less inebriated. Or would the more appealing option be a movie, a concert, or a theme park? The correct answer is that people would prefer to attend a movie (59%), a concert (40%), or a theme park (34%). Live Comedy (31%) was more popular than Football (30%), Cricket (29%), and Rugby (25%).
Of course, someone is bound to point to spectator numbers for AFL, A-League, and NRL that look very impressive. With average match attendances in the tens of thousands, and millions annually, sports are clearly important.
At a glance the figures look mildly impressive, but much like enhancement pouch underwear, things aren’t nearly as impressive when you look at the attendance figures in the cold light of day.
Even if we disregard the doubling up and totalling of attendance in the stats, it is easy to see that even the most popular sport in Australia would rank behind visiting botanic gardens, zoos and aquariums, and libraries. They aren’t even in the same ballpark as cinema attendance. But we can go deeper on the reading, library, and cinema figures, even getting frequency statistics, so we can tell the difference between people who did something “at least once” in the past year and people who did it regularly. 47.7% of people read a book weekly, 70% of library attendees (mostly women) visited at least 5 times in the past year, 65% of Australians are (computer) gamers, and 65% of Aussies go to the cinema an average of 6-7 times a year. And yet sport has a segment in news broadcasts whilst reading, gaming, and parks and zoos battle to get media coverage. Technically, if we wanted to be fair, the sport segment would be cut to make way for movie news and a live cross to the local library.
What about the economy? How much are households spending on sports? That’s a great question and a great segue into a discussion of how trickle-down economics doesn’t work in sports either. I mean, funding sports that way when it hasn’t worked in the economy must be a no-brainer, right? [Insert low IQ athlete joke here] Or we could stay on topic and discuss the $4.4 billion sports and physical recreation spend by households annually. Let’s not complicate things by talking about the buying of stuff like footwear, swimming pools, and camper vans. Seriously, camping is in the sport spending category? Either way, $4.4 billion sounds like a lot of money, until you realise that gaming is a $3 billion industry, and that households spend $4.1 billion on literature and $4.7 billion on TV and film.
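Lining the quoted household spending figures up side by side makes the point clearer: sport’s $4.4 billion is mid-table, not an outlier. A quick sketch (figures in AU$ billions, as quoted above):

```python
# Annual Australian household spend by entertainment category,
# using the figures quoted in the text (AU$ billions).
spend = {
    "sports & physical recreation": 4.4,
    "gaming": 3.0,
    "literature": 4.1,
    "TV and film": 4.7,
}

# Rank categories from biggest to smallest spend.
for category, billions in sorted(spend.items(), key=lambda kv: -kv[1]):
    print(f"{category:30s} ${billions:.1f}b")
```

TV and film tops the list, with sport second and literature close behind, which sits awkwardly next to sport’s dominance of news airtime.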
We allow governments to spend a lot of money on big sports and big sporting events. Think that hosting the Olympics will encourage people to play sports? Nope. Actually, seriously, nope. One report described this idea as nothing more than a “deeply entrenched storyline”, sort of like a fairy tale handed down from one Minister for Sport to the next. Part of the problem is that we buy this narrative hook, line, and sinker, such that the sports themselves (and surrounding data agencies) never really bother to keep statistics to prove the claims. But they make for great announcements and ribbon cutting events on the election campaign trail, so the myth keeps on keeping on.
Ultimately the argument isn’t that sports are unpopular or bad but rather that we spend an inordinate amount of time pretending we like them far more than the reality. And that is impacting our elected officials more than a chance to wear a high-viz vest at a press conference. Maybe it is time to rethink what media and funding we throw at sports, and perhaps consider a gaming segment on the news.
So this pork-barrelling season look forward to the announcement of a new multi-million dollar yoga stadium in a marginal electorate near you.
Update: Charlie Pickering and The Weekly team cover some similar points for the Grand Prix events in Australia.
In the wake of the shocking Sydney Lindt Hostage situation our brave libertarian Senator Leyonhjelm struck straight to the heart of the real cause of the events and hinted at a foolproof solution. He pointed out that we are a ‘nation of victims’ and need to have access to guns to solve our problems, because it has worked so well in the USA.
His nuanced dissection of the events is a breath of fresh air. This was definitely not an issue of a man with a violent criminal history, nor his lack of treatment for mental health issues, nor about issues surrounding bail in our justice system, nor about racial and religious tensions in Australia. Nope, this was all about not being able to shoot people you have a problem with.
We should be thanking Senator Leyonhjelm and his fellow libertarians with gifts, which is appropriate timing leading into the Festive Season and our desperate need to stimulate the free market. So make Joe Hockey proud and buy some libertarian gifts.
Gift Idea: Ayn Rand’s Atlas Shrugged, with foreword written by Rand whilst on welfare.
Ayn Rand’s Atlas Shrugged is a must have for all libertarians. The all-new edition has a foreword written by Rand in the 1970s explaining her principles and complaints about how small her welfare checks were.
Gift Idea: Smoker’s lungs desk ornament, with bonus lungs of their children who rode in the car with them while they smoked.
This is a great gift for libertarians as it acts as a conversation piece to allow them to discuss how over-taxed smokers are.
Gift Idea: Bushmaster AR15 semi-automatic rifle chambered in .223-caliber.
The Bushmaster is the freedom weapon of choice and a must have for defending your rights. Comes as a box set of the rifle, one thousand rounds of ammunition, and paper targets of school children.
What better way to celebrate one of my favourite weeks than with a quote from John Green about his book, The Fault In Our Stars, being banned:
I guess I am both happy and sad.
I am happy because apparently young people in Riverside, California will never witness or experience mortality since they won’t be reading my book, which is great for them.
But I am also sad because I was really hoping I would be able to introduce the idea that human beings die to the children of Riverside, California and thereby crush their dreams of immortality. (Source)
There are all sorts of weird reasons that books have been banned in the past and present. Last year I covered the topic at length with both the reasons and the recent favourites for the book banning trolls. As another year rolls round, nothing has really changed. Please, won’t somebody think of the children!!
Sometime during 1994 I bought one of my favourite albums of all time: Siamese Dream by The Smashing Pumpkins. Even today (boom-tish) that album sits proudly in my music collection and doesn’t sound dated. I can’t say the same for many other albums I own from the same time period. Superunknown from Soundgarden stands as a classic album, but I find it hard to listen to without having had the death of a pet weighing on my mind. I can only listen to Metallica’s Load if I promise myself I’ll put on one of their better albums straight after. Essentially, for me, the Pumpkins hit on music gold with that album.
I’ve commented before how I’ve essentially stopped being a fan of the Pumpkins, finding their offerings since Adore (which promised so much with the first single, and delivered so little with the remainder of the album) to be more filler than awesome. What I liked about the Pumpkins was not what the Pumpkins have been delivering since.
Your experiences may vary.
Which brings me to a discussion I was having recently on the Pumpkins album Zeitgeist. Despite buying the album, I’ve never bothered adding it to my digital library, because it only has one or two songs on it that hint at what I liked about the Smashing Pumpkins of old. A lot of fans and reviewers agree with me, with Corgan taking a potshot at fans for not even listening to the album (class act), further claiming the fans only wanted to hear the old music (probably). Anyway, the discussion had started because a couple of people were insisting the reason people didn’t like Zeitgeist was because it was too political or had political overtones.
While I’m not trying to imply that no-one was turned off Zeitgeist by its political overtones, it is clearly a long bow to draw to suggest that politics was a significant factor, let alone the main factor, in listeners/fans disliking the album. So why would someone make this claim?
Well, simply, this is another example of people trying to justify their taste. Another guilty pleasure moment. I seem to be raising this point a lot (here on literature, here on genre vs literature, here on good vs popular, and here on guilty pleasures). It is perfectly okay for you to like what you like; there is absolutely no need to try to explain away someone else’s dislike for something you enjoy. Does it really matter if you like something everyone else hates? No. So why bother trying to put it down to political ideology or how terrorists did something… 9/11…
Worthiness, guilty pleasures, justification: all of these things are actually stopping us from just enjoying stuff. I know I’m guilty of it, but I’m trying to get over myself. The great thing about the internet is that it is full of support groups for people who like stuff. So you don’t have to agree with everyone else on what music, books, movies, art, etc, you like. You can find your niche and create memes, gifs and video clips to bombard all your other friends with on Facebook.
As with most things Hank and John Green are involved with, I have become a fan of Mentalfloss. Their recent article on embarrassing things we all do was interesting, but had one point in it that made me think “what the hell is wrong with you people”.
By “you people” I obviously mean it in the pejorative dissociative sense, in that I’m not having a shot at you, or Mentalfloss, just the ubiquitous and ethereal “them” and “you”. Unless of course what I’m about to write does hit home, in which case, stop it now!
One of the items listed as an embarrassing thing that everyone does was claiming to have read books or watched movies you haven’t in order to appear more intelligent. I have previously discussed the list of books people claim to have read, and I’m not ashamed to say I haven’t read certain “classics”. I do have to admit to having claimed to have read a book I haven’t, To Kill a Mockingbird (still on my TBR pile), but that is also why I’m coming out against the practice.
And that is the point I wish to make here, there is no shame in not having read a classic book or watched a classic film. Maybe you don’t like extraordinarily long and self-indulgent wedding scenes in a movie (Deer Hunter). Maybe you don’t like novels with more than 450 main characters (War and Peace has over 500). There isn’t any shame in that. And how many “classics” have gone unread because they were in the wrong language, poorly translated, never got published, or just lucked out (John Green made mention of this recently).
Essentially we are worried about our subjective taste disagreeing with someone else’s subjective taste. The stupidity here is that we are being judged for something we haven’t done, rather than a strong opinion one way or the other on the actual topic. If we came out and said “Well, I hated 1984, it was rubbish” or conversely “Well, I loved 1984, and anyone who says it’s rubbish is a poo-poo head” we’d get into deep arguments about the relative merits of the novel. That is perfectly acceptable. But if we say “I haven’t read that one (yet)” or “Never seen it” then the response is something along the lines of calling us crazy, implying we have lived too sheltered a life, and/or that we have missed out on something great.
They could be right, of course. We may have missed out on the single most impressive book or movie ever. Our lives may be dramatically improved by reading or watching the work in question.
The reality is that it really doesn’t matter. Some people will never have enjoyed a Jack Reacher adventure, or clung to the edge of their seat reading a Matthew Reilly novel, because they have been busy reading all the “great literary works”. Who is to say that their choice of entertainment was superior? Some people prefer to watch sports: are they any less entertained?
I think we have to stop pretending that our subjective opinions are something to be ashamed of. Like what you like, don’t be ashamed to say so either. I’m always amazed at the number of closeted Buffy fans there are, which only shows how damaging this mindset of “worthiness” is.
There is only really one thing I miss about living in the city, and that is going to the cinema. Of course, I’d miss it even more if there were movies worth shelling out this month’s mortgage repayment to see. The idea of paying big bucks to sit in a seat that has probably been used for sex by strangers, eating snacks with a 2000% markup, after forgetting your earplugs and going partially deaf (a blessing during the pre-movie ads), is just not that appealing. Now Australian cinemas have decided they aren’t charging movie-goers enough money, and have picked an easy target to blame to justify their cash grab.
Cinema executives have blamed the recent rises in ticket prices in Australia on piracy. Because of course it is piracy that is to blame, and not the marginalising of the customer base with exorbitant pricing regimes. Nor could it possibly be that people have more alternative entertainment options, including waiting a few months to watch the latest “blockbuster” in their own home cinema. Nor could it be the rubbish that so many movie studios are turning out.
Let’s dissect this nonsense like the original reports in the media should have done. There are many factors at play in the decline of cinema. The first real problem is that there hasn’t really been a change in the proportion of the population that goes to the cinema in 40 years, but the number of times per year they go has been steadily declining since the ’90s.
[Charts: “% been to the cinema in the last 12 months” and “average no. visits per year”]
So rather than being kept entertained regularly, audiences are clearly becoming more occasional customers. Underneath that general trend are some interesting changes in the demographics of cinema attendance. It is no secret that Hollywood movies are made for teenagers. Teens are a huge chunk of the cinema audience. But the biggest change in the repeat attendees is in the teen market, which has been in steady general decline since the ’70s. Which part of the market is going to be most impacted by price rises? Go on: guess!
[Charts (by age group): “% been to the cinema in the last 12 months” and “average no. visits per year”]
Another way to look at this is in the proportion of the population going to cinemas in the age demographics. Below you can see the 14-17 and 18-24 age groups are overrepresented as cinema goers, this starts to even out in the 25-34 group (also known as the settling down and going out less demographic), is at parity in the 35-49 group (also known as the parenthood has stolen my social life demographic), and people over 50 clearly don’t like all the loud noises.
[Chart: age profile of cinema-goers compared to the Australian population over the age of 14, 2012]
So while the proportion of the population going to the cinema each year has been pretty steady, it is the number of times people go that is making the difference, especially in that much-coveted teen “I want to see explosions and car chases” market. (Interesting aside: when you look at the age group breakdowns, the over-25 audience has generally become more likely to attend the cinema since the ’70s, but this has been static for most demographic groups since the mid ’90s.) To put some hard numbers on that difference: in 1974 the 14-24 demographic averaged 16.4 visits to the cinema per year; by 2012 that had dropped to 6.6 visits.
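The teen decline is stark when expressed as a percentage; a one-liner on the quoted figures makes it concrete:

```python
# Decline in average annual cinema visits for the 14-24 demographic,
# using the figures quoted in the text (1974 vs 2012).
visits_1974 = 16.4
visits_2012 = 6.6

decline_pct = (visits_1974 - visits_2012) / visits_1974 * 100
print(f"decline in visit frequency: {decline_pct:.0f}%")  # roughly a 60% drop
```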
Obviously there are a lot of changes in the marketplace that have occurred over this time. TV has expanded, cable TV is a thing now, home rental or ownership of movies is a thing (VHS succeeded by DVD, now being superseded by Streaming, which will probably be superseded by actors coming to your house to perform on demand), computer games have grown in leaps and bounds, the internet, all vying for our attention and wallets. Just look at the change in households with various alternatives to cinema (NB: the game consoles data doesn’t tell the full gaming story, see this for more about that market):
I alluded to this above, but one big change has been the home cinema. Some people will remember a time when some cinema screens were actually not much bigger than the ones installed in many homes now. Sound systems have improved greatly over the crappy little speaker that was the drive-in experience. Now we have high-quality TVs and projection units that rival anything you can get in a cinema complex, and these come with a pause function, easy access to food that doesn’t kill your wallet nor smother your heart with belly flab, and volume settings lower than “jackhammer”. Add in all the other possible entertainment options available, and suddenly the list of movies (not) to see just isn’t as appealing.
The one thing cinemas still have going for them is windowing. For the first few months after opening, there is no other (legal) way to watch the film, you have to watch it in the cinema or wait for the DVD release. Although it seems clear people are more willing to wait, let the dust settle after opening weekend, and figure out what is worth watching, whether that be at the cinema, on DVD, when it makes it to TV, or at all. And now I’m going to contradict myself and say that piracy proves people aren’t willing to wait for those other options, preferring simultaneous releases. Both arguments still point out that people just aren’t as interested in paying big(ger) bucks to see movies in the cinema. Of course movie studios and distributors don’t like that idea, since windowing is great for their bottom line, especially opening weekend.
Now the reason for the price rise could have something to do with this chart, which shows that 21% of the market is in the highest-income households. Cinemas are obviously betting that price elasticity is low and that audiences will take the increase in their stride. What this ignores is the age demographic data above: a sizeable chunk of the audience may come from affluent households, but that doesn’t mean their teenage bank accounts are bulging with cash.
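The elasticity bet can be made concrete with a toy calculation. All numbers here are hypothetical, purely to illustrate the mechanism, not measured elasticities for cinema tickets:

```python
# Toy revenue model under a 10% ticket price rise.
# Elasticity values are invented for illustration only.

def revenue_change(price_rise, elasticity):
    """Approximate % change in revenue for a given fractional price rise
    and a constant (point) price elasticity of demand."""
    quantity_change = elasticity * price_rise          # % change in admissions
    new_revenue = (1 + price_rise) * (1 + quantity_change)
    return (new_revenue - 1) * 100

# If demand is inelastic (e.g. affluent occasional cinema-goers):
print(f"inelastic (-0.5): {revenue_change(0.10, -0.5):+.1f}%")  # revenue rises
# If demand is elastic (e.g. cash-strapped teens):
print(f"elastic (-1.5):   {revenue_change(0.10, -1.5):+.1f}%")  # revenue falls
```

With inelastic demand the price rise pays for itself; with an elastic teen segment it backfires, which is the pattern the attendance data above suggests.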
[Table: cinema-goers by equivalised gross household income quintile: number (’000), share (%), and attendance rate (%)]
So we see that cinema audiences are becoming more occasional consumers: the trip to the movies is a special event, not a regular one. Teens are a big chunk of the cinema market and they aren’t the repeat customers they used to be. This is what happens when you price customers out of the market: you bite the hand that feeds you, and your audience turns to other entertainment mediums. Blaming piracy for what is demonstrably a long-term trend is a pretty big reach. I’d also argue that piracy is a reaction to consumer demand for lower pricing and simultaneous releases, so that audiences can consume movies the way they want to, not the way they are being forced to, at a price commensurate with the utility received (e.g. people pay as much or more for a DVD – less if you consider it a couple or family purchase). If cinemas have anyone to blame it is themselves and their suppliers (distributors and studios). Using piracy to justify a price increase is clearly unfounded.
Of course, what needs to be mentioned is that films are essentially a loss leader for cinemas so that they can make money selling snacks and beverages. This ticket price increase is probably driven through the supply chain rather than by the cinemas themselves. But this also shows how cinemas have to adapt in order to survive. Going out to a movie is an experience. People are more willing to pay for experiences rather than stuff (DVDs). So if cinemas can get serious about screening experience at a fair price, they might get the audience back, or at least stop the decline.
There is a storm brewing. In the latest of the long line of insults by literary fiction against genre fiction, Isabel Allende has taken a pot shot at crime fiction. Now apparently she hates crime fiction because:
It’s too gruesome, too violent, too dark; there’s no redemption there. And the characters are just awful. Bad people.
But that didn’t stop her writing a crime mystery. It also didn’t stop her saying that the book was a joke and ironic. I think the word she was actually looking for was hypocrite.
I’ve never really understood the people who read or write stuff they don’t enjoy. Sure, I read some really boring science journal articles, but that’s because I enjoy knowing stuff. If I’m going to sit down and read a book, I want that 10-20 hours of entertainment to be, well, entertaining. If I’m writing, which is a much longer and more involved process, why would I invest that much time in something I’m not enjoying doing?
So to some extent, I understand why Isabel decided that her mystery had to be a joke and ironic. But that is also the crux of the problem, she doesn’t seem to understand that she is also insulting readers and fans of genre fiction. I think the book store in Houston, Murder by the Book, that had ordered 20 signed copies of her novel, did the right thing in sending them back.
Now, you can write a satirical or ironic take on a particular genre or sub-genre of fiction. But when you do so, it has to come from a love of all those little things you’re taking the piss out of. If you do it out of hate, then you can’t turn around and try to sell it to the audience you’re taking a pot shot at: “I think this stuff is stupid, you’re stupid for reading it, but I still want you to pay me for insulting you.”
I get a little sick of snobbishness toward genre readers and writers. Do genre readers and writers take pot shots at literary authors for their lack of plots, characters who have to own a cat and be suffering, and writing that is there to fill pages with words and not actually tell a story? No. We’re too busy reading something exciting.
It would be great if people just enjoyed what they enjoyed and stopped criticising others for enjoying what they enjoy. Enjoy.
With many of my favourite shows now back on air for 2014, except the ones that were cancelled, I thought it was a good time to recap what kept me entertained on the small screen in 2013.
Many people have noted the rise of decent TV, leaving behind the days of formulaic plots (e.g. CSI whatever), sit coms that lack the comedy (e.g. Two and A Half Men), dramas that lack plot (e.g. Lost), lame reality TV shows (e.g. Duck Dynasty), and the cancellation of a Joss Whedon show before it got a chance to be awesome (e.g. every show he’s ever made). This is at the same time as movies have failed to produce anything particularly memorable or interesting in quite some time.
I actually have a theory (by theory I mean hypothesis) about why there are fewer and fewer decent movies. It comes down to this little figure: let’s leave aside the gross disparity between the highest-paid actor vs. actress, and instead focus on those paychecks. You stick just one of those stars in a movie, just one, and you have a really expensive movie that is going to battle to make its money back at the box office. Movie studios know this, so they spend up big on special effects, production values, promotion, etc., to lure people into the cinema. But in an effort to attract as large an audience as possible to make up for this huge spend, they make the movies as bland as possible in order to accommodate a wide audience from around the world. The reason that movie sucks isn’t just that it’s aimed at 12-year-olds; it’s aimed at 12-year-olds who probably don’t understand idioms due to being in a different country/culture.
And this is why we get a list of gems on the small screen, because the writers, directors, and quite a few actors, have realised that in order to tell good stories, they can’t spend huge dollars (unless it is on prime time crap).
Possibly my favourite show of the past few years. This is not only well written, the entire cast and crew seem to have this knack for creating great TV. Plus, last season featured Patton Oswalt.
I love this show for its wit, humour, modernising of the classic Sherlock Holmes stories, and the casting. Some have accused it of being smug, but I see that as central to Sherlock’s character, and thus welcome in the show.
I read the prequel novel by series writer Neil Cross and it was every bit as good as the TV show. Idris Elba took a break from fighting monsters in giant robot suits in order to make another season of this fantastic crime drama.
When I describe this show to friends, they always come away thinking that I’ve described a violent, b-grade action movie with plenty of nudity. Just another throwback to the pulp novel trash that I also have occasion to read. Well, yes. The problem being? The best new show on TV in 2013, hands down!
Person of Interest
I really enjoyed the first season of Person of Interest. The second season was more of the same, but brought more of the very interesting character portrayed by Amy Acker. Season 3 was off to a good start before the non-ratings break. Now that I’ve raised that point, why do we even have a non-ratings period any more? TV watching habits have changed; the networks had better change with the times or lose out to the internet… oh wait, they are.
I discovered this sci-fi gem by accident. One of the problems I’ve always had with time travel in books, TV and movies is that they don’t deal with the paradox very well. Even in Back to the Future it is almost played for a joke. This series is well written and actually has the paradox central to its story structure. It also helps that Rachel Nichols does a good job of holding the series together.
Another post-apocalyptic story, ho-hum. This series has an interesting take on what would be society’s downfall and what would subsequently happen. There is a lot to like about this show, especially Billy Burke as a bad-ass. Although, after the first season, I didn’t see much point in having a second season and won’t be following it.
This is one of the few mainstream shows I find watchable. It is pretty much down to the fact that they have some good fights, an interesting premise culled from the source material, and that the actors have done the hard yards physically for the show (especially Stephen Amell and Manu Bennett). Makes me want to build a salmon ladder in my backyard.
It’s not often that a web series can attract a big-name director like Bryan Singer (of the decent X-Men movies fame) to make a series of short-form sci-fi. I’d characterise the series as essentially 48 vignettes with overlapping characters and story, as most episodes can stand alone to some extent, despite being part of a larger narrative.
Quite simply, this show is the funniest thing on TV. In the proud tradition of cartoon comedies, it is able to do things that other TV shows and comedies can’t, due to financial, legal or ethical constraints. This series is also one of the few with DVD extras that you would actually want to watch. One of the best is when Archer has an accident and is transformed into a character much more like his voice actor, with ensuing gags around this.
This Aussie comedy-drama has been a consistently witty and interesting tale about a self-destructive Sydney barrister. Normally Aussie humour doesn’t translate well to other parts of the world, but Rake has been adapted for the USA, with Greg Kinnear replacing Richard Roxburgh.
Tried but lost interest:
Almost Human – promising sci-fi that didn’t really capture my attention
The Walking Dead – so sick of that fucking farm!
Marvel’s Agents of Shield – this should have been good, but was meh.
The Booth At The End – interesting premise but didn’t grab me.
The Following – I can honestly say that this series squandered such a great premise with a derivative and clichéd story.
The Blacklist – this was interesting only because of James Spader. Needed more than that.
Vikings – interesting but too slow moving.
Hannibal – this was fantastic. I don’t know why I haven’t watched more, but I just haven’t.
What!?! You don’t watch….
Game of Thrones – after watching the first season I had had enough. You only have to watch this far to see Sean Bean die, so game over.
Breaking Bad – I’ve dropped in and out on this series, watching episodes throughout. I’ve really enjoyed it, but not something I’ve made time to watch all of.
Arrested Development – yeah, I know. I should be a rabid fan.
The Killing – both the US and the Danish Forbrydelsen are slow boil crime shows that I’ve started watching and not continued. No particular reason for stopping, just haven’t gotten to the rest of the episodes yet.
Borgen – have heard great things, but just haven’t gotten to it yet.
There is a general rule in arguments: don’t argue with stupid people, they drag you down to their level and beat you with experience. That is pretty much the problem scientists and experts have when debating anti-science proponents – such as creationists, anti-vaccinators, anti-GM campaigners, climate change deniers, etc. Yet Bill Nye the Science Guy decided that, in the interest of science and education, he would debate a creationist.
The debate started with Bill Nye and Ken Ham each giving a 5-minute opening statement. Then Ken went into his 50-minute argument, which is when my cushion really started to earn its keep protecting my desk from damage.
I really find it hard to fathom how anyone can be credulous of Ham’s statements. In his 50 minutes he used all sorts of logical fallacies, most notably his videos of “creationist scientists” as argument from authority. But it wasn’t this that really got the lump on my forehead rising; it was the use of “evidence” for his argument that simultaneously refuted his own arguments. One example was the phylogenetic tree for dogs. Ham argued that the rise of Canis lupus familiaris from a wolf (yeah, just one, let’s just let that one go through to the keeper) was what you would see from biblical predictions of dogs speciating after the global flood 4,000 years ago. Just one problem. Teeny tiny. The figure showed dogs evolving from a group of wolf ancestors over the course of 14,000–15,000 years.
He didn’t just do this once, he did it repeatedly. Another example arose when he was talking up one of his creationist pals who helped design a satellite (or something, didn’t really care because it was irrelevant). He used the example of how scientists had been debating how old the universe was: they couldn’t agree on the age. The part he left out about that particular debate was that the age of the universe was somewhere around about 13.8 billion years old (± 37 million years), and they had a bunch of data they were trying to make sure they had the errors accounted for. The debate was about the difference in the confidence range (or error margin) between the Planck satellite measures and the Wilkinson Microwave Anisotropy Probe measures. That error margin alone is 6,000 times greater than the age of the Earth that Ham claims. The universe’s age is still over 2 million times Ham’s claimed age, yet he uses this example as if it gives credence to his claims.
Now Nye did his best in his 50 minutes to show that Ham’s claims were flawed, but also how evidence and scientific observation and prediction work. Others have claimed, and I agree to an extent, that Nye’s mistake was to try to cover too much ground. If he was talking to a receptive audience he would have destroyed Ham and had the crowd eating out of his hand. But at a creationist museum, with a bunch of science deniers, it would come across as too much information and too confusing. Although Nye’s last couple of minutes pretty much killed the entire debate: trees, rocks, the size of the universe, the distance from stars against the limit of how fast light can travel, all showing that the Earth and the universe are much, much older.
The first rebuttal saw Ham carrying on about “you weren’t there so you don’t know.” Brian Dunning had a great take on this particular argument:
There is a rumor that Bill Nye @TheScienceGuy debated evolution with Ken Ham. Not true. It did not happen, because you weren’t there.
In this first rebuttal, Ham again used evidence that rebutted his own claims, especially when talking about radio-carbon dating. Showing that measurements have error margins, or can be somewhat imprecise, doesn’t negate the fact that the measurements are still many orders of magnitude outside of the age of the Earth claimed by Ham. Then he moved on to saying that the bible is right and everything else is wrong (let’s just ignore that the bible isn’t even consistent with itself, let alone the fact that it is a translation of a translation, thus literal interpretation isn’t supported by biblical scholars).
Nye then rebutted Ham’s statements. His classic put down was for the claim that every animal and human was vegetarian until they got off the ark: lions’ teeth aren’t really made for broccoli.* Ba-zing!
Next Ham tried to point out that creationism isn’t his model (then he blamed secularists for scientists). This is true; there are other nutters who came up with this crap. But Ham tried to pretend that “scientists” came up with the various creation models (NB: just because a scientist said something doesn’t make it science or scientific). Then he talked about species and kinds and how Nye was confusing what a kind was. Easy to do when the idea of a “kind” is bullshit and unsupported by any actual science.
Nye then tore apart the claims about the rise of species from kinds using the basic math involved. He also called bullshit on the ship building skills of ancient desert people. The main point in this rebuttal was that Ham hadn’t addressed Nye’s point adequately, and that Ham’s claims aren’t supported by the majority of religious people, let alone scientists.
My desk and forehead had had enough by this stage, so I didn’t watch the Q&A section, but it can be viewed here.
The point I wanted to make from this was that Ham had a huge advantage in this discussion. I’m not talking about the home team venue, nor the credulous crowd, I’m talking about the lack of need for evidence. All Ham had to do, and pretty much what he did, was seed doubt in science and then declare “creationism wins” (which might as well be “God did it”). This is the problem with any debate with anti-science: the scientists have to prove their case with evidence and logical reasoning; the anti-science side only has to sow some doubt. And that doubt can vary between legitimate claims through to flat out lies, it doesn’t matter. So Nye shouldn’t have taken the debate.
But Nye was right to take the debate.
Hang-on. Have you hit your head against your desk a few too many times during that debate?
No. Bill Nye is a well known and respected science communicator. He went into the belly of the beast to stand in the echo chamber and sow some doubt (how’s that for a metaphor-fest?). As he stated himself, Nye knows that America (and the world, but let’s allow him his patriotism) needs science and innovation for the future of society. Creationism and other anti-science nonsense undermine this. If no-one challenges the group-think and echo chamber of the creationists (et al.) then they will continue to be misled and misinformed by people like Ken Ham. You can’t have someone reject evolution yet rely on germ theory for modern medicine. You can’t have someone reject radio-carbon dating yet use medical imaging. That is incompatible, that is a rejection of reality, and it leads to stupid stuff happening that curbs the development of new technologies and advancements to society.
Other opinions on who won: Shane proposes that Nye needed to pick a couple of points to hammer home. This feeds into science communication research that shows you can get distracted from the main narrative with too many points.
Update: It is clear that many of Ham’s supporters were not listening to Bill Nye and are wilfully ignorant. This Buzzfeed article (yeah, I know, Buzzfeed) brings up a lot of the points that Nye addressed, explained clearly and simply, showing they didn’t listen to Nye, and slept through school.
Update: This article makes a nice statement that ties into some of my points about why Nye took the debate. To quote:
It brought new attention to YEC (Young Earth Creationism) to exactly the people we need to see it – the large swath of Christian and other religious parents who think of Intelligent Design or Guided Evolution or some other pseudo-scientific concept when they imagine “teaching the controversy”. These people are embarrassed by people like Ken Ham. They know the earth isn’t 6000 years old, and they understand just how impossible it is to square that belief with observable phenomena.
Update: I quoted Brian Dunning above and he wrote an article for Skeptoid about not debating anti-science people. I agree and disagree with his points as you will see from what I’ve written here and what Brian has written in his article. We can’t just preach to the choir, but we can’t provide legitimacy to nonsense either.
Update: The ever awesome Potholer54 just posted a video on one point about evolution and Ken Ham’s rebuttal of his own arguments. Worth watching.
* Okay, not the best point to make, as teeth aren’t definitive of diet, but if the comment is viewed as being representative of animal physiology overall, then it is a very valid putdown of the vegetarian claims.
I am waging a war against poor grammar and spelling. Please tell me I’m not alone. Not in a metaphysical, mystical, praise be to his noodliness kind of ‘not alone’, but the ‘you too support that idea’ kind of way.
Whether it be the borderline illiterate retired football player who is now a TV personality, the weather girl whose qualifications in meteorology are limited to blowing hot air, or the poster on any internet site you frequent, we seem to be surrounded by lazy or solecistic language. Now I’m aware that language evolves over time. If you have read Robinson Crusoe, published in 1719, you will have noticed how laborious it is to read. Compare this to modern authors: not one would ever use such a long title: The Life and Strange Surprizing Adventures of Robinson Crusoe, of York, Mariner: Who lived Eight and Twenty Years, all alone in an un‐inhabited Island on the Coast of America, near the Mouth of the Great River of Oroonoque; Having been cast on Shore by Shipwreck, wherein all the Men perished but himself. With An Account how he was at last as strangely deliver’d by Pirates. So the English language is bound to evolve, becoming more concise, more relevant to the people who use it.
But I think there is a difference between evolving and butchering the language. I don’t think it is ‘cool’ to use poor diction such as “I is”, nor to make the constant newsreader error of using were and was interchangeably. This is just lazy and shows that, rather than communicating clearly, the culprit is more concerned about being heard.
Rebel I say. Fight the war. Death to the deliberately illiterate.
I look forward to having this post edited for hypocrisy.
There was something on the TV last night, I think it was a moth attracted to the bright screen. Fortunately the ‘moth’ flapping around didn’t distract me from someone talking about being your best. Best? There is always the question of ability and effort. No one would doubt that effort is involved in achieving something, but some people have to work really hard just to be average, just visit any shopping mall for confirmation. And what if you are lazy, are you aiming too high in your career? My inner science nerd sent me to Google Scholar to find out what various people can achieve.
IQ and educational attainment:
Elementary school graduates (completed eighth grade): 90
Elementary school dropouts (completed 0–7 years of school): 80–85
Have a 50/50 chance of reaching high school: 75

Average IQ of various occupational groups:
Professional and technical: 112
Managers and administrators: 104
Clerical workers; sales workers; skilled workers, craftsmen, and foremen: 101
Semi-skilled workers (operatives, service workers, including private household; farmers and farm managers): 92
Unskilled workers: 87

Type of work that can be accomplished:
Adults can harvest vegetables, repair furniture: 60
Adults can do domestic work, simple carpentry: 50
Adults can mow lawns, do simple laundry: 40
There is considerable variation within and overlap between these categories. People with high IQs are found at all levels of education and occupational categories. The biggest difference occurs for low IQs with only an occasional college graduate or professional scoring below 90.
So there you go, now you know just how hard you have to work. And just remember, in a democracy the vote of the highly educated is given the same value as those who think mowing a lawn is mentally challenging.
I have fond memories of school. I fondly remember coming home from it each day. I remember English class: ‘I want to write a story about robot spies.’ ‘Well you can’t, we’re writing about the Eureka Stockade.’ Ahh, good times.
So today’s blog is about how schools are designed and operated.
If anyone has any idea how they are actually designed and operated I believe there is a highly underpaid position available for you to take up to improve the situation. Sir Ken Robinson presents an interesting polemic in the video below, essentially saying that creativity is beaten out of kids in order to make it easier to manage them in a class. What are your experiences?
The only part I wholly disagree with is the bit about multitasking. The research on that is fundamentally flawed. That research has usually compared people who multitask all of the time with people who don’t at all. That is, they compared pot-addicted college students who needed the study participation fee to buy beer with people who actually did stuff, usually housewives.
I previously posted about some of the actors who were most likely to ruin a perfectly good book adaptation. The movie of a book is always going to be hard. You take an intricate plot and interesting characters, and throw them out to make room for 90 minutes of mindless violence and teen appeal: not an easy task. So, so as not to be labelled a sexist by men pointing out that there are heaps of untalented female actors, I’m presenting the follow-up list of actresses whom you don’t want in a book adaptation.
Former models, singers or “celebrities”
Was she even a singer?
Yes, this is a generic category rather than a specific actress, but we see it all of the time. Is it too much to ask for there to be more to an actress than looking good? Remember that this is a red-blooded male asking this question; if I’m complaining about these clothes horses in films, it must be bad. Even worse is the Elvis road that singers want to take. To quote Eddie Murphy: “Elvis was so good they put him in movies. Mother@#$%er couldn’t act.” Unfortunately the modern day singers aren’t Elvis and their acting is worse.
All acting sins forgiven!
There is a common marital clause, the freebie. Basically, if you ever happen to be in the position to have sex with someone completely unobtainable, then it is okay. For me it is Jessica Alba, for my wife it is Ryan Reynolds. No offense to this hottie, but she has been acting since she was a child and yet she still manages to only bring her hotness to the screen.
I don’t know what’s worse, her acting or her choice in husbands.
Underwear not included.
Lindsay almost fits under the category of “celebrities” rather than actresses. I am struggling to name a film she has been in, let alone one that she acted in. On the plus side I’m struggling to name a film she has been in.
Hairstyle, no acting included.
Has she done anything other than Friends that was decent? Yes she was in the movie gem Office Space, but you could have replaced her with just about any other actress, she did so little with the role.
At least she isn’t Helen Hunt.
There are two things you can count on with a Katherine Heigl romantic comedy: it won’t be funny and no one will have seen it. Heigl has the honor of starring in a $2 million film that only grossed 20 bucks, one of the biggest flops in film history.
Her talents are showing.
Who needs talent when you’ve got big boobs and you’re willing to show them? Graham has made a career out of displaying her, um, talents on screen. Wide eyed and bland, watching Graham on screen is like watching adorable paint dry.
Jennifer Love Hewitt
She sees dead people.
According to a study of ratings at Rotten Tomatoes, Jennifer is the worst actress of all time. Now this seems a bit hard to swallow given her successful TV career, but you can’t argue with science, even when arbitrarily applied with no proper standardisation of data. Also, at least Bill Murray apologised for making Garfield and has made some good films to make up for his appalling mistake.
She’ll have diva with that.
Washed up at 24 is not exactly something you expect in Hollywood, well, not in the movies that include clothes at least. But when all you have going for you is your looks and you manage to annoy everyone you have worked with and then badmouth everyone, your career tends to be over.
The one on the right.
A proud graduate of the Steven Seagal School of Acting. She has one facial expression for every occasion. I know she was hired to be boring and insipid in Twilight, but that doesn’t mean you get to play Joan Jett that way too.