Why it is (almost) impossible to teach creativity

[Image: Relishing the independence of the mind is the basis for naturally imaginative activity. Shutterstock]

Robert Nelson, Monash University

Industry and educators are agreed: the world needs creativity. There is interest in the field and lots of urging, but remarkably little action. Everyone is a bit scared of what to do next. On the question of creativity and imagination, they are mostly uncreative and unimaginative.

Some of the paralysis arises because you can’t easily define creativity. It resists the measurement and strategies that we’re familiar with. Indisposed by the simultaneous vagueness and sublimity of creative processes, educators seek artificial ways to channel imaginative activity into templates that end up compromising the very creativity they celebrate.

For example, creativity is often reduced to problem-solving. To be sure, you need imagination to solve many curly problems and creativity is arguably part of what it takes. But problem-solving is far from the whole of creativity; and if you focus creative thinking uniquely on problems and solutions, you encourage a mechanistic view – all about scoping and then pinpointing the best fit among options.

It might be satisfying to create models for such analytical processes but they distort the natural, wayward flux of imaginative thinking. Often, it is not about solving a problem but seeing a problem that no one else has identified. Often, the point of departure is a personal wish for something to be true or worth arguing or capable of making a poetic splash, whereupon the mind goes into imaginative overdrive to develop a robust theory that has never been proposed before.

For teaching purposes, problems are an anxious place to cultivate creativity. If you think of anyone coming up with an idea — a new song, a witty way of denouncing a politician, a dance step, a joke — it isn’t necessarily about a problem but rather a blissful opportunity for the mind to exercise its autonomy, that magical power to concatenate images freely and to see within them a bristling expression of something intelligent.

[Image: New ideas are more about a blissful opportunity for the mind to exercise autonomy. Shutterstock]

That’s the motive behind what scholars now call “Big C creativity”: i.e. your Bach or Darwin or Freud who comes up with a major original contribution to culture or science. But the same is true of everyday “small c creativity” that isn’t specifically problem-based.


Read more: Creativity is a human quality that exists in every single one of us

Relishing the independence of the mind is the basis for naturally imaginative activity, like humour, repartee, a gestural impulse or theatrical intuition, a satire that extrapolates someone’s behaviour or produces a poignant character insight.

A dull taming

Our way of democratising creativity is not to see it in inherently imaginative spontaneity but to identify it with instrumental strategising. We tame creativity by making it dull. Our way of honing the faculty is by making it goal-oriented and compliant to a purpose that can be managed and assessed.

Alas, when we make creativity artificially responsible to a goal, we conflate it with prudent decision-making, whereupon it no longer transcends familiar frameworks toward an unknown fertility.

We pin creativity to logical intelligence as opposed to fantasy, that somewhat messy generation of figments out of whose chaos the mind can see a brilliant rhyme, a metaphor, a hilarious skip or roll of the shoulders, an outrageous pun, a thought about why peacocks have such a long tail, a reason why bread goes stale or an astonishing pattern in numbers arising from a formula.

[Image: We pin creativity to logical intelligence as opposed to fantasy. Shutterstock]

Because creativity, in essence, is somewhat irresponsible, it isn’t easy to locate in a syllabus and is impossible to teach in a culture of learning outcomes. Learning outcomes are statements of what the student will gain from the subject or unit that you’re teaching. Internationally and across the tertiary system, they take the form of: “On successful completion of this subject, you will be able to …” Everything that is taught should then support the outcomes and all assessment should allow the students to demonstrate that they have met them.

After a lengthy historical study, I have concluded that our contemporary education systematically trashes creativity and unwittingly punishes students for exercising their imagination. The structural basis for this passive hostility to the imagination is the grid of learning outcomes in alignment with delivery and assessment.

It might always be impossible to teach creativity but the least we can do for our students is make education a safe place for imagination. Our academies are a long way from that haven and I see little encouraging in the apologias for creativity that the literature now spawns.

My contention is that learning outcomes are only good for uncreative study. For education to cultivate creativity and imagination, we need to stop asking students anxiously to follow demonstrable proofs of learning for which imagination is a liability.

Robert Nelson, Associate Director Student Experience, Monash University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Are good books made into bad films?


The short answer is no.

The longer answer is that Berkson’s paradox (also known as Berkson’s fallacy) applies.

The even longer answer is explained in this video from Hannah Fry and Numberphile:

Comparing the book to the movie has been a long-standing blog topic of mine, which made this maths video pretty cool*. I’ve since developed a category list based on what Hannah discussed in the video about which books get made into movies.

  1. It is very unlikely that your novel will be published.
  2. It is very unlikely that your published novel will be optioned to be made into a movie (or TV show).
  3. It is very unlikely that the movie adaptation will actually be made.
  4. Most movies are average, so it is very unlikely that the movie adaptation will be above average.
  5. If the movie is above average, it is very unlikely that the movie will bear any resemblance to the book it was adapted from.
  6. Pointless arguments will ensue from the previous two points.
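
To make Berkson’s paradox concrete, here is a minimal simulation of the pipeline above (my own sketch, not from the video; the quality scores and cut-offs are invented purely for illustration). Book quality and movie quality are generated independently, but we only ever see the adaptations that survive the filters, and within that filtered sample a negative correlation appears out of nowhere:

```python
import random

def pearson(xs, ys):
    """Plain Pearson correlation, no dependencies."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(1)
n = 100_000
book_quality = [random.gauss(0, 1) for _ in range(n)]
movie_quality = [random.gauss(0, 1) for _ in range(n)]  # independent of the book

# The adaptation pipeline: only strong books get optioned, and only
# projects with enough combined appeal actually get made.
adapted = [(b, m) for b, m in zip(book_quality, movie_quality)
           if b > 0.5 and b + m > 1.5]

print(pearson(book_quality, movie_quality))  # ≈ 0: no real relationship
print(pearson(*zip(*adapted)))               # < 0: selection creates one
```

Nothing about the underlying qualities changed; the apparent “good books make bad films” relationship is manufactured entirely by which pairs we get to observe.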

The Metacritic vs Goodreads analysis mentioned in the video is interesting and worth a read.

[Chart: Metacritic vs Goodreads ratings for book-to-film adaptations]

*As always, I’m working from a definition of cool that includes the nerdy stuff I like.**

**Did you know that cool has always been cool?***


*** Well, unless you use Ngram Viewer to check Google Books for word usage over time like some sort of nerd…

[Chart: Google Books Ngram Viewer frequency of “cool” over time]
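
If you want to reproduce the “cool” check yourself, the Ngram Viewer exposes the data behind its chart through an unofficial JSON endpoint. A minimal sketch, assuming the endpoint and parameter names the web UI currently uses (they are undocumented and could change without notice):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Unofficial JSON endpoint behind the Ngram Viewer chart; the URL and
# parameters mirror the web UI and may change without notice.
params = urlencode({
    "content": "cool",
    "year_start": 1800,
    "year_end": 2019,
    "corpus": "en-2019",
    "smoothing": 3,
})

with urlopen("https://books.google.com/ngrams/json?" + params) as resp:
    data = json.load(resp)

# Each entry carries a "timeseries" of relative frequencies, one per year.
series = data[0]["timeseries"]
print(f"1800: {series[0]:.2e}, 2019: {series[-1]:.2e}")
```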

How Is Tech Changing the Way We Read?


With the rise of social media and smartphone use, we are all reading fewer books than we once did. All of us, not just those pesky millennials. Some people are worried about what this means for the future of literature and, well, our brains. But is it true that we are really reading less? And should I care?

Above The Noise recently did a video in which Myles covers some of the research on reading.

I always appreciate it when a YouTuber or journalist manages to discuss a topic without devolving into head-shaking admonishment, especially when it comes to the topic of reading and books. Too often these sorts of videos and articles cite bad research or buy into industry propaganda.

I’ve previously discussed the misrepresentations made about reading ebooks, the overstating of the benefits of reading (though some benefits are well documented), and even the way we write. The Pew Research into reading was one of several references I’ve used in my discussion of Who Reads, something I cover quite a bit here.

And yet, there were still some things in the video that I hadn’t been aware of. So I think it is worth sharing. Enjoy.

From the video:

Reading has been an important part of the human experience for thousands of years, but believe it or not, that’s not a long time on the evolutionary timescale. Before the internet, it made sense to read long texts in a linear fashion, but that’s now changing as people are adapting to skimming shorter texts on their computers or phones. But what does this mean for the future of books?

What is literary reading?

Literary reading is, quite simply, the reading of any literature. This includes novels, short stories, poetry, and plays.

Are we reading less?

The rate at which Americans are reading literature for fun is down around 14% from the early 1980s. This doesn’t necessarily mean we are reading less, however. Many people still have to read for school or work. Then there are all the words, sentences, and messages we read on the internet from emails to texts to tweets. Some people believe that this means we are possibly reading more individual words than ever. It’s just being done in a different way. I’ve also discussed the decline of literature.

And this is changing our brains?

Some neuroscientists believe that scanning shorter texts the way we do on the internet, often jumping from hyperlink to hyperlink, is actually changing the wiring in our brains. We are becoming better at searching for key terms and scanning for information, but this means it can become more difficult to read a longer text all the way through without missing major points.

SOURCES:
Children, Teens, and Reading
The long, steady decline of literary reading
Who doesn’t read books in America?
Serious reading takes a hit from online scanning and skimming, researchers say

Book review: Astrophysics for People in a Hurry by Neil deGrasse Tyson

Astrophysics for People in a Hurry by Neil deGrasse Tyson

My rating: 4 of 5 stars

Oppose the gravitational force with your phalanges if you value science.

Science communicator Neil deGrasse Tyson understands that most people don’t have time to read physics books – plus they are hard work to read. So he decided to package together some of his essays into a book that covers the major aspects of astrophysics in a way anyone could enjoy and learn from.

While reading this book I had a revelation. Could there be an explanation other than Dark Matter and Dark Energy for the gravity and expansion of the universe?

I’m going to propose Pratchett’s Theorem as an alternate hypothesis for the expansion of the universe and gravity. Since the universe is flat and there is unexplained gravity and expansion, I postulate that this flat universe is riding on the backs of four large elephants. This explains the gravity pulling everything down. These elephants are riding on the back of a large turtle who swims through the multiverse. The elephants are slowly moving away from one another – which explains the expansion – and walking down the curved shell of the turtle such that each step is larger than the last – which explains the increasing speed of expansion.

This, of course, raises the questions of whether it was the elephants who were the prime movers behind the “Big Bang”, whether the elephants will keep walking down the shell until they fall off, tearing the universe to shreds, or whether the elephants will eventually decide to walk back toward one another for a reunion. Do they also walk directly away from one another, or do they walk around the shell, such that the universe rotates? Given everything within the universe rotates, it would only make sense that this rotation is caused by the elephants’ motion.

Anyway, NDGT’s book was a good read. It doesn’t dumb things down, nor use too many lay terms, which was refreshing. But as a scientist, albeit in a completely different field, I felt the book was aimed at a more general audience, particularly those who aren’t familiar with many of the topics discussed, which made it a good but not a great read for me.

View all my reviews

We don’t know the world

A few years ago I saw a fantastic talk from Hans Rosling about the world and statistics. Okay, I probably lost a few people by implying statistics are fantastic, and now I’ll lose some more by saying statistics ARE fantastic. Unfortunately, Hans is no longer with us, but his son and daughter-in-law – Ola and Anna – are continuing his work with Gapminder.

Recently they released the results of their 2017 survey of world knowledge. After looking at the results, they decided to call it the Misconception Study.* You’ll see why: on average, people scored worse than random guessing would have.

[Chart: Gapminder Misconception Study 2017 results]

That’s right, less than chance. People really don’t know that much about the world.
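
“Less than chance” is worth unpacking. The survey questions are multiple choice, so a random guesser (a coin-flipping chimp, in Rosling’s framing) should still get about a third of them right. A quick baseline simulation, assuming the 12-question, three-option format of the Gapminder test:

```python
import random

random.seed(0)
QUESTIONS, OPTIONS, TRIALS = 12, 3, 100_000

# A random guesser gets each three-option question right with
# probability 1/3, so the expected score is 12 / 3 = 4 correct.
scores = [sum(random.randrange(OPTIONS) == 0 for _ in range(QUESTIONS))
          for _ in range(TRIALS)]
print(sum(scores) / TRIALS)  # ≈ 4.0 correct out of 12
```

Scoring below that four-out-of-twelve baseline is what makes these misconceptions rather than mere ignorance: guessing at random would have done better.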

Do you think you could do better? Well, find out! Take the 2018 quiz here. Of course, this is the part where I say that I passed the test. Humble-brag. But in fairness, as I’ve already mentioned, I’ve been following Gapminder and I like statistics.

Could you pass the test?

*They probably called it that prior, but I’m making a point here, dammit!

What genre is the 2008 book Outliers in? What are some similar books in that genre?


The book Outliers by Malcolm Gladwell is a popular example of fiction.

Outliers is probably most famous for promoting the 10,000 Hour Rule, the idea that it takes roughly that many hours of practice to get good at something. But like all good fiction, it ignores the reality of how skills are acquired.

Unfortunately, many people have mistakenly assumed that Gladwell’s writing should be classified as non-fiction pop-science, or even worse, factual. This has led many researchers to waste time and resources showing that the 10,000 Hour Rule is nonsense, and that Outliers is pop-science at its worst – i.e. incredibly influential despite being clearly nonsense.

Reviewers of the book have noted the flaws in calling this book non-fiction*:

In an article about the book for The New York Times, Steven Pinker wrote, “The reasoning in ‘Outliers,’ which consists of cherry-picked anecdotes, post-hoc sophistry and false dichotomies, had me gnawing on my Kindle.”[20]

In a review in The New Republic, Isaac Chotiner called the final chapter of Outliers “impervious to all forms of critical thinking”.[21]

And several researchers have debunked many factual claims made in the book*:

Case Western Reserve University’s assistant professor of psychology Brooke N. Macnamara and colleagues have subsequently performed a comprehensive review of 9,331 research papers about practice relating to acquiring skills. They focused specifically on 88 papers that collected and recorded data about practice times. In their paper, they note regarding the 10,000-hour rule that “This view is a frequent topic of popular-science writing” but “we conducted a meta-analysis covering all major domains in which deliberate practice has been investigated. We found that deliberate practice explained 26% of the variance in performance for games, 21% for music, 18% for sports, 4% for education, and less than 1% for professions. We conclude that deliberate practice is important, but not as important as has been argued”.[24]
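
To put those “variance explained” figures in more familiar terms: variance explained is the square of the correlation coefficient, so 26% for games corresponds to a practice-performance correlation of roughly 0.5. A quick conversion of the quoted numbers (my framing, not the paper’s; “less than 1%” for professions is capped at 0.01 here):

```python
# Variance explained (R^2) converted back to a correlation, r = sqrt(R^2),
# using the domain figures quoted from Macnamara et al. above.
for domain, r2 in [("games", 0.26), ("music", 0.21), ("sports", 0.18),
                   ("education", 0.04), ("professions", 0.01)]:
    print(f"{domain:<12} R^2 = {r2:.2f} -> r ~ {r2 ** 0.5:.2f}")
```

Correlations around 0.4–0.5 are substantial by social-science standards, which is the authors’ point: deliberate practice matters, just not to the exclusion of everything else.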

Statistical analyst Jeff Sauro looked at Gladwell’s claim that between 1952 and 1958 was the best time to be born to become a software millionaire. Sauro found that, although the 1952–1958 category held the most births, “[a] software millionaire is more than twice as likely to be born outside the 1952 to 1958 window than within it.” Sauro notes that Gladwell’s claims are used more as a means of getting the reader to think about patterns in general, rather than a pursuit of verifiable fact.[25]

In fact, the 10,000 Hour Rule seems to irk people in the social sciences quite a bit. E.g. “Practice Does Not Make Perfect”, which argues that we are not all created equal where our genes and abilities are concerned.

Are there similar authors and similar books using misleading, cherry-picked, and tenuous research to make broad sweeping pop-science claims that make people feel good? Of course. Plenty of them. It is a minefield in the non-fiction section of bookstores, which I think should be more accurately renamed “Boring Fiction”. So I think it would be negligent of me to recommend more books like Outliers or authors like Malcolm Gladwell.

*Quotes taken from Wikipedia.

This answer originally appeared on Quora.