This month’s It’s Lit! covers everyone’s favourite topic: food.
If it isn’t your favourite topic, just give yourself 48 hours without it and see if that changes your mind.
I’ve always found food scenes in books to fall into two categories: needless exposition, or important showing (Oliver Twist is a great example of the latter). While the video discusses the latter, all too often the former is what we end up reading.
While I was watching the video I was reminded of something I read last year. The discussion of bread in Victor Hugo’s Les Misérables, particularly around the hard bread that needed to be soaked, was something that Karl Marx wrote about in Das Kapital. The bread was hard because of deliberate adulteration: bakers made cheap bread that workers could afford, knowing full well it was bad for them to eat, while employers knew full well that the workers couldn’t afford to eat properly (keeping them hungry so they would work).
A great way to remind us future people of how society used to run.*
Food varies wildly from place to place and from culture to culture; since humans are such sensory creatures, using words to evoke the experience of eating is an excellent way to bring a text to life.
It’s Lit! is part of THE GREAT AMERICAN READ, an eight-part series that explores and celebrates the power of reading. Hosted by Lindsay Ellis.
*Let’s be honest, society would quite happily go back to those conditions, and in some parts of the world it still operates that way.
The birth of the reader must be at the cost of the death of the author… or so says Roland Barthes in his essay The Death of the Author. Are we talking about literally killing authors? No, this is figurative (like most uses of literally). Can Death of the Author include killing the author? Sure, but get a good lawyer first.
Let’s let Lindsay Ellis (and John Green) explain:
My take on Death of the Author is somewhat complicated. I think there is relevant information that the author has that doesn’t make it into the story (think Elvish languages from Tolkien*), but I also think that quite often if it isn’t in the story it doesn’t really exist. I think that stories are really up to the readers to interpret, as viewpoints and interpretations will change over time**, but that doesn’t mean readers always interpret correctly.
This is a hedged way of saying that Death of the Author is probably too simple a way of thinking about how stories should be interpreted. At least, that’s my interpretation of it.
*Let’s not get into how “relevant” I think those languages are, or a lot of that world-building from authors in general is.
**You may remember book reviews here where I’ve discussed how older books haven’t aged well due to changing societal standards. Sexism and racism are obvious changes that have happened in the last 50 years which make formerly acceptable, even progressive, moments in a story seem backward and unacceptable now.
Another thing that can occur is that changes to society change interpretations. E.g. the “Baby, It’s Cold Outside” controversy can be summed up as an old song making references to things we are no longer familiar with, so our interpretation changes. This makes Death of the Author a truly bad thing for any artwork that is “consumed” outside of the social and temporal setting it was made within.
This month’s instalment of What’s the difference? from CineFix looks at the differences between Mike Mignola’s graphic novel and Guillermo del Toro’s Hellboy.
In the interest of full disclosure, I’m not a fan of Hellboy: movie or comic. Yes, I know, how dare you not love del Toro’s amazing artistic vision! I’ve watched both Hellboy movies multiple times and have not loved them (and despite liking the Blade trilogy, Blade 2 isn’t my favourite – but Pan’s Labyrinth was fantastic). As for the comics, I probably didn’t give them a fair chance, as I tried reading one omnibus after not enjoying the first film.
Anyway, the point I wanted to highlight from the video was something I think too many adaptations fail to do. When you are talking about a series of comics or books, there are often prevailing themes, motifs, and imagery that may be less noticeable in any one edition but are important taken as a whole.
Because movies are often only drawing on one book at a time, or drawing on one run (or story arc) of a comic, important aspects may be lost. An example would be the Tim Burton or the Adam West takes on Batman versus the Christopher Nolan version. The latter drew upon more of the Batman comics than the earlier adaptations (not that either of those adaptations was bad*).
So while this doesn’t necessarily result in a direct adaptation, it does result in an adaptation that is faithful to the source material in the elements that matter.
*I’m pretending that the Joel Schumacher adaptations don’t exist. Akiva Goldsman is probably more to blame, given he has a long track record of making everything he is attached to that bit worse.
The first rule of this month’s It’s Lit! is that you don’t talk about the narrator.
Unreliable narrators are an interesting topic. To some extent, I regard all narrators as flawed in some way. Unless you have omniscient narration you always have a limited viewpoint, and it could be argued that even with omniscience you still aren’t pulling away from the main narrative, so it is limited as well. So I would argue that unreliable narrators are less a distinct category and more a question of just how unreliable all narrators are.
Who is the most powerful character in fiction? Villains may doom the world, heroes may save it, but no one has more control over the plot than the narrator – expositing the who, what, where, when and how directly into the reader’s mind. But how can you tell that the person telling you the story is telling you the whole story?
It’s Lit! is part of THE GREAT AMERICAN READ, an eight-part series that explores and celebrates the power of reading.
Industry and educators agree: the world needs creativity. There is interest in the field, and lots of urging, but remarkably little action. Everyone is a bit scared of what to do next. On the question of creativity and imagination, they are mostly uncreative and unimaginative.
Some of the paralysis arises because you can’t easily define creativity. It resists the measurement and strategies that we’re familiar with. Indisposed by the simultaneous vagueness and sublimity of creative processes, educators seek artificial ways to channel imaginative activity into templates that end up compromising the very creativity they celebrate.
For example, creativity is often reduced to problem-solving. To be sure, you need imagination to solve many curly problems and creativity is arguably part of what it takes. But problem-solving is far from the whole of creativity; and if you focus creative thinking uniquely on problems and solutions, you encourage a mechanistic view – all about scoping and then pinpointing the best fit among options.
It might be satisfying to create models for such analytical processes but they distort the natural, wayward flux of imaginative thinking. Often, it is not about solving a problem but seeing a problem that no one else has identified. Often, the point of departure is a personal wish for something to be true or worth arguing or capable of making a poetic splash, whereupon the mind goes into imaginative overdrive to develop a robust theory that has never been proposed before.
For teaching purposes, problems are an anxious place to cultivate creativity. If you think of anyone coming up with an idea — a new song, a witty way of denouncing a politician, a dance step, a joke — it isn’t necessarily about a problem but rather a blissful opportunity for the mind to exercise its autonomy, that magical power to concatenate images freely and to see within them a bristling expression of something intelligent.
That’s the motive behind what scholars now call “Big C Creativity”: i.e. your Bach or Darwin or Freud who comes up with a major original contribution to culture or science. But the same is true of everyday “small C creativity” that isn’t specifically problem-based.
Relishing the independence of the mind is the basis for naturally imaginative activity, like humour, repartee, a gestural impulse or theatrical intuition, a satire that extrapolates someone’s behaviour or produces a poignant character insight.
A dull taming
Our way of democratising creativity is not to see it in inherently imaginative spontaneity but to identify it with instrumental strategising. We tame creativity by making it dull. Our way of honing the faculty is by making it goal-oriented and compliant to a purpose that can be managed and assessed.
Alas, when we make creativity artificially responsible to a goal, we collapse it with prudent decision-making, whereupon it no longer transcends familiar frameworks toward an unknown fertility.
We pin creativity to logical intelligence as opposed to fantasy, that somewhat messy generation of figments out of whose chaos the mind can see a brilliant rhyme, a metaphor, a hilarious skip or roll of the shoulders, an outrageous pun, a thought about why peacocks have such a long tail, a reason why bread goes stale or an astonishing pattern in numbers arising from a formula.
Because creativity, in essence, is somewhat irresponsible, it isn’t easy to locate in a syllabus and impossible to teach in a culture of learning outcomes. Learning outcomes are statements of what the student will gain from the subject or unit that you’re teaching. Internationally and across the tertiary system, they take the form of: “On successful completion of this subject, you will be able to …” Everything that is taught should then support the outcomes and all assessment should allow the students to demonstrate that they have met them.
After a lengthy historical study, I have concluded that our contemporary education systematically trashes creativity and unwittingly punishes students for exercising their imagination. The structural basis for this passive hostility to the imagination is the grid of learning outcomes in alignment with delivery and assessment.
It might always be impossible to teach creativity but the least we can do for our students is make education a safe place for imagination. Our academies are a long way from that haven and I see little encouraging in the apologias for creativity that the literature now spawns.
My contention is that learning outcomes are only good for uncreative study. For education to cultivate creativity and imagination, we need to stop asking students anxiously to follow demonstrable proofs of learning for which imagination is a liability.
Image caption: Biologists are gathering evidence of green algae (pictured here in Kuwait) becoming carbohydrate-rich but less nutritious, due to increased carbon dioxide levels. As science fiction becomes science fact, new forms of storytelling are emerging. Photo: Raed Qutena
I count myself lucky. Weird, I know, in this day and age when all around us the natural and political world is going to hell in a handbasket. But that, in fact, may be part of it.
Back when I started writing, realism had such a stranglehold on publishing that there was little room for speculative writers and readers. (I didn’t know that’s what I was until I read it in a reader’s report for my first novel. And even then I didn’t know what it was, until I realised that it was what I read, and had always been reading; what I wrote, and wanted to write.) Outside of the convention rooms, that is, which were packed with less-literary-leaning science-fiction and fantasy producers and consumers.
Realism was the rule, even for those writing non-realist stories, such as popular crime and commercial romance. Perhaps this dominance was because of a culture heavily influenced by an Anglo-Saxon heritage. Richard Lea has written in The Guardian of “non-fiction” as a construct of English literature, arguing other cultures do not distinguish so obsessively between stories on the basis of whether or not they are “real”.
Regardless of the reason, this conception of literary fiction has been widely accepted – leading self-described “weird fiction” novelist China Miéville to identify the Booker as a genre prize for specifically realist literary fiction; a category he calls “litfic”. The best writers Australia is famous for producing aren’t only a product of this environment, but also role models who perpetuate it: Tim Winton and Helen Garner write similarly realistically, albeit generally fiction for one and non-fiction for the other.
Today, realism remains the most popular literary mode. Our education system trains us to appreciate literatures of verisimilitude; or, rather, literature we identify as “real”, charting interior landscapes and emotional journeys that generally represent a quite particular version of middle-class life. It’s one that may not have much in common these days with many people’s experiences – middle-class, Anglo or otherwise – or even our exterior world(s).
Like other kinds of biases, realism has been normalised, but there is now a growing recognition – a re-evaluation – of different kinds of “un-real” storytelling: “speculative” fiction, so-called for its obviously invented and inventive aspects.
a much larger collective conviction about who’s entitled to tell stories, what stories are worth telling, and who among the storytellers gets taken seriously … not only in terms of race and gender, but in terms of what has long been labelled “genre” fiction.
Rawson’s latest book, From the Wreck, intertwines the story of her ancestor George Hills, who was shipwrecked off the coast of South Australia and survived eight days at sea, with the tale of a shape-shifting alien seeking refuge on Earth. In an Australian first, it was long-listed for the Miles Franklin, our most prestigious literary award, after having won the niche Aurealis Award for Speculative Fiction.
The Aurealis awards were established in 1995 by the publishers of Australia’s longest-running, small-press science-fiction and fantasy magazine of the same name. As well as recognising the achievements of Australian science-fiction, fantasy and horror writers, they were designed to distinguish between those speculative subgenres.
Last year, five of the six finalists for the Aurealis awards were published, promoted and shelved as literary fiction.
A broad church
Perhaps what counts as speculative fiction is also changing. The term is certainly not new; it was first used in an 1889 review, but came into more common usage after genre author Robert Heinlein’s 1947 essay On the Writing of Speculative Fiction.
Whereas science fiction generally engages with technological developments and their potential consequences, speculative fiction is a far broader, vaguer term. It can be seen as an offshoot of the popular science-fiction genre, or a more neutral umbrella category that simply describes all non-realist forms, including fantasy and fairytales – from the epic of Gilgamesh through to The Handmaid’s Tale.
While critic James Wood argues that “everything flows from the real … it is realism that allows surrealism, magic realism, fantasy, dream and so on”, others, such as author Doris Lessing, believe that everything flows from the fantastic; that all fiction has always been speculative. I am not as interested in which came first (or which has more cultural, or commercial, value) as I am in the fact that speculative fiction – “spec-fic” – seems to be gaining literary respectability.
(Next step, surely, mainstream popularity! After all, millions of moviegoers and television viewers have binge-watched the rise of fantastic forms, and audiences are well versed in unreal onscreen worlds.)
One reason for this new interest in an old but evolving form has been well articulated by author and critic James Bradley: climate change. Writers, and publishers, are embracing speculative fiction as an apt form to interrogate what it means to be human, to be humane, in the current climate – and to engage with ideas of posthumanism too.
These are the sorts of existential questions that have historically driven realist literature.
According to the World Wildlife Fund’s 2018 Living Planet Report, the world’s wildlife populations declined by an average of 60% between 1970 and 2012. The year 2016 was declared the hottest on record, echoing the previous year and the one before that. People under 30 have never experienced a month in which average temperatures are below the long-term mean. Hurricanes register on the Richter scale and the Australian Bureau of Meteorology has added a colour to temperature maps as the heat keeps on climbing.
There is an infographic doing the rounds on Facebook that shows sister countries with comparable climates to (warming) regions of Australia. But it doesn’t reflect the real issue. Associate Professor Michael Kearney, Research Fellow in Biosciences at the University of Melbourne, points out that no-one anywhere in the world has any experience of our current CO2 levels. The changed environment is, he says – using a word that is particularly appropriate for my argument – a “novel” situation.
Elsewhere, biologists are gathering evidence of algae that carbon dioxide has made carbohydrate-rich but less nutritious. So the plankton that rely on them to survive might eat more and more and yet still starve.
Fiction focused on the inner lives of a limited cross-section of people no longer seems the best literary form to reflect, or reflect on, our brave new outer world – if, indeed, it ever was.
Whether it’s a creative response to catastrophic climate change, or an empathic, philosophical attempt to express cultural, economic, neurological – or even species – diversification, the recognition works such as Rawson’s are receiving surely shows we have left Modernism behind and entered the era of Anthropocene literature.
And her book is not alone. Other wild titles achieving similar success include Krissy Kneen’s An Uncertain Grace, shortlisted for the Aurealis, the Stella prize and the Norma K. Hemming award – given to mark excellence in the exploration of themes of race, gender, sexuality, class or disability in a speculative fiction work.
Kneen’s book connects five stories spanning a century, navigating themes of sexuality – including erotic explorations of transgression and transmutation – against the backdrop of a changing ocean.
Earlier, more realist but still speculative titles (from 2015) include Mireille Juchau’s The World Without Us and Bradley’s Clade. These novels fit better with Miéville’s description of “litfic”, employing realistic literary techniques that would not be out of place in Winton’s books, but they have been called “cli-fi” for the way they put climate change squarely at the forefront of their stories (though their authors tend to resist such generic categorisation).
Both novels, told across time and from multiple points of view, are concerned with radically changed and catastrophically changing environments, and how the negative consequences of our one-world experiment might well – or, rather, ill – play out.
Catherine McKinnon’s Storyland is a more recent example that similarly has a fantastic aspect. The author describes her different chapters set in different times, culminating – Cloud Atlas–like, in one futuristic episode – as “timeslips” or “time shifts” rather than time travel. Yet it has been received as speculative – and not in a pejorative way, despite how some “high-art” literary authors may feel about “low-brow” genre associations.
Kazuo Ishiguro, for instance, told The New York Times when The Buried Giant was released in 2015 that he was fearful readers would not “follow him” into Arthurian Britain. Ursula K. Le Guin was quick to call him out on his obvious attempt to distance himself from the fantasy category. Michel Faber, around the same time, told a Wheeler Centre audience that his Book of Strange New Things, where a missionary is sent to convert an alien race, was “not about aliens” but alienation. Of course it is the latter, but it is also about the other.
All these more-and-less-speculative fictions – these not-traditionally-realist literatures – analyse the world in a way that it is not usually analysed, to echo Tim Parks’s criterion for the best novels. Interestingly, this sounds suspiciously like science-fiction critic Darko Suvin’s famous conception of the genre as a literature of “cognitive estrangement”, which inspires readers to re-view their own world, think in new ways, and – most importantly – take appropriate action.
A new party
Perhaps better case studies of what local spec-fic is or does – when considering questions of diversity – are Charlotte Wood’s The Natural Way of Things and Claire Coleman’s Terra Nullius.
The first is a distinctly Aussie Handmaid’s Tale for our times, where “girls” guilty by association with some unspecified sexual scenario are drugged, abducted and held captive in a remote outback location.
The latter is another idea whose time has come: an apocalyptic act of colonisation. Not such an imagined scenario for Noongar woman Coleman. It’s a tricky plot to tell without giving away spoilers – the book opens on an alternative history, or is it a futuristic Australia? Again, the story is told through different points of view, which prioritises collective storytelling over the authority of a single voice.
“The entire purpose of writing Terra Nullius,” Coleman has said, “was to provoke empathy in people who had none.”
This connection of reading with empathy is a case Neil Gaiman made in a 2013 lecture when he told of how China’s first party-approved science-fiction and fantasy convention had come about five years earlier.
The Chinese had sent delegates to Apple and Google etc to try to work out why America was inventing the future, he said. And they had discovered that all the programmers, all the entrepreneurs, had read science fiction when they were children.
“Fiction can show you a different world,” said Gaiman. “It can take you somewhere you’ve never been.”
And when you come back, you see things differently. And you might decide to do something about that: you might change the future.
Perhaps the key to why speculative fiction is on the rise is the ways in which it is not “hard” science fiction. Rather than focusing on technology and world-building to the point of potential fetishism, as our “real” world seems to be doing, what we are reading today is a sophisticated literature engaging with contemporary cultural, social and political matters – through the lens of an “un-real” idea, which may be little more than a metaphor or errant speculation.