Is science broken?

With the rebirth of Cosmos on TV, Neil deGrasse Tyson and the team have brought science back into the mainstream. No longer is science confined to the latest puff piece on cancer research that is only in the media because a) cancer and b) the researchers are pressuring the funding bodies to give them money. The terms geek and nerd have stopped being quite the derogatory terms they once were. We even have science memes becoming as popular as Sean Bean “brace yourself” memes.

Sean dies

This attention has also cast a light on the scientific process itself, with many non-scientists and scientists passing comment on the reliability of science. Nature has recently published several articles discussing the reliability of studies’ findings. One article shows why the hard sciences laugh at the soft sciences, discussing statistical errors. I mean, have these “scientists” never heard of selection and sample bias? Yes, there is a nerd pecking order, and it is maintained through pure snobbishness, complicated-looking equations, and how clean the lab-coat remains.

purity

As a science nerd, I feel the need to weigh in on this attack on science. So I’m going to tear apart, limb by limb, a heavy hitting article: Cracked.com’s 6 Shocking Studies That Prove Science Is Totally Broken.

To say that science is broken or somehow unreliable is nonsense. To say that peer review or statistical analysis is unreliable is also nonsense. There are exceptions to this: sometimes entire fields of study are utter crap, sometimes entire journals are just crap, sometimes scientists and reviewers suck at maths/stats. But in most instances these things are not-science, just stuff pretending to be science. Which is why I’m going to discuss this article.

A Shocking Amount of Medical Research Is Complete Bullshit
#6 – Kinda true. There are two problems here: media reporting of medical science and actual medical science. The biggest issue is the media reporting of medical science, hell, science in general. Just look at how the media have messed up the reporting of climate science for the past 40 years.

Of course, most of what is reported as medical research is preliminary work. You know: “we’ve found a cure for cancer, in a petri dish, just need another 20 years of research and development, and a boatload of money, and we might have something worth getting excited about.” The other kind that gets attention isn’t proper medical research but spurious claims by someone trying to peddle a new supplement. So this issue is more about the media being scientifically illiterate than anything.

Another issue is the part of medical science that Ben Goldacre has addressed in his books Bad Science and Bad Pharma. Essentially you have a bias toward positive results being reported. This isn’t good enough. Ben goes into more detail on this topic and it is worth reading his books on this topic and the Nature articles I previously referred to.

Many Scientists Still Don’t Understand Math
#5 – Kinda true. Math is hard. It has all of those funny symbols and not nearly enough pie charts. Mmmm, pie! If a reviewer in the peer review process doesn’t understand the maths, they will often reject papers, calling the results a black box. Other times the reviewers fail to pick up the mistakes made, usually because they aren’t getting paid and that funding application won’t write itself. And that’s just the reviewers. Many researchers don’t do proper trial design and often pass off analysis to specialists who have to try and make the data work despite massive failings. And the harsh reality is that experiments are always a compromise: there is no such thing as the perfect experiment.

Essentially, scientists are fallible human beings like everyone else. Which is why science itself is iterative and includes a methods section so that results are independently confirmed before being accepted.

And They Don’t Understand Statistics, Either
#4 – Kinda true, but misleading. How many people understand the difference between statistically significant and actually significant? Here’s the problem: when you test for something at the 95% confidence level (p < 0.05) you still have a 1 in 20 chance of a false positive, of natural variability showing up in the test. Some “science” has been published that exploits this false positive rate by going on a statistical fishing expedition (e.g. the anti-GM paper). But there is another aspect: with enough samples and enough data, you can get a statistically significant result that isn’t a significant result. An example would be testing new fertiliser X and finding a p-value of 0.05 (i.e. statistically significant) for a grain yield that is 50 kg higher in a 3 tonne per hectare crop. Wow, statistically significant, but at 50 kg/ha, who cares?!
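Both of those problems can be simulated in a few lines. This is a toy sketch with invented numbers (the 3 tonne/ha crop and 50 kg/ha difference from the example above, plus made-up standard deviations), not a real trial analysis:

```python
import random
import statistics

random.seed(42)

def welch_t(a, b):
    """Rough two-sample t statistic (Welch); enough for a demo."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

# 1) The fishing expedition: 20 comparisons of two identical populations.
#    At p < 0.05 we expect about 1 "significant" result from pure noise.
false_positives = 0
for _ in range(20):
    a = [random.gauss(0, 1) for _ in range(30)]
    b = [random.gauss(0, 1) for _ in range(30)]
    if abs(welch_t(a, b)) > 2.0:   # roughly the p < 0.05 cut-off here
        false_positives += 1

# 2) Statistically significant but agronomically trivial:
#    a 3000 kg/ha crop, fertiliser X adds 50 kg/ha, enormous trial.
control = [random.gauss(3000, 150) for _ in range(10000)]
treated = [random.gauss(3050, 150) for _ in range(10000)]
t = welch_t(treated, control)
# t is huge, so the result is "significant" - yet the effect is under 2%.
```

Run it and the fishing trip reliably turns up the occasional false positive, while the fertiliser trial produces an overwhelming t statistic for a difference nobody farming 3 tonne crops would care about.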

But these results will be reported, published, and talked about. It is easy for people who haven’t read and understood the work to get over-excited by these results. It is easy for researchers to get over-excited too: they are only human. But this is why we have the methods and results sections in science papers, so that calmer, more rational heads prevail. Usually after wine. Wine really helps.

Scientists Have Nearly Unlimited Room to Manipulate Data
#3 – True but misleading. Any scientist *could* make up anything that they wanted. They could generate a bunch of numbers to prove that, for an example of bullshit science, the world is only 6000 years old. But because scientists are a skeptical bunch, they’d want some confirming evidence. They’d want that iterative scientific process to come into play. And the bigger that claim, the more evidence they’d want. Hence why scientists generally ignore creationists, or just pat them on the head when they show up at events: aren’t they cute, they’re trying to science!

But there is a serious issue here. The Nature article I referred to was a social sciences study, a field that is rife with sampling and selection bias. Ever wonder why you hear “scientists say X is bad for you” then a year later it is, “scientists say X is good for you”? Well, that is because two groups were sampled and correlated for X, and as much as we’d like it, correlation doesn’t equal causation. I wish someone would tell the media this little fact, especially since organic food causes autism.
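That “correlation doesn’t equal causation” problem is easy to manufacture. Here is a sketch with completely invented numbers: a hidden confounder (household income, say) drives both of two unrelated variables, and correlating the two gives a strong, meaningless relationship:

```python
import random

random.seed(1)

def pearson(x, y):
    """Pearson correlation coefficient, from scratch for the demo."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hidden confounder: household income ($k, invented distribution).
income = [random.gauss(50, 15) for _ in range(1000)]

# Two variables that each depend on income but not on each other.
organic_spend = [0.02 * i + random.gauss(0, 0.2) for i in income]
diagnosis_rate = [0.01 * i + random.gauss(0, 0.1) for i in income]

r = pearson(organic_spend, diagnosis_rate)
# r comes out strongly positive even though neither variable causes
# the other - sample two groups, correlate, publish, panic.
```

Neither variable “does” anything to the other, yet the correlation is strong; this is exactly the trap the “X is bad for you / X is good for you” headlines keep falling into.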

Other fields have other issues. Take a look at health and fitness studies and spot who the participants were: generally, they are university students who need the money to buy tinned beans and beer. Not the most representative group of people and often they are mates with one of the researchers, all 4 of them. Not enough participants and a biased sample: not the way to do science. The harder sciences are better, but that isn’t to say that there aren’t limitations. Again, *this is why we have the methods section so that we can figure out the limitations of the study.*

The Science Community Still Won’t Listen to Women – Update
#2 – When I first wrote this I disagreed, but now I agree: see the video below. As someone with a penis, my experience of this issue is far too limited. That is why it was only when a few prominent people spoke out about it that I realised science is no better than the rest of society. It hurts me to say that.

There is still a heavy bias toward men in senior positions at universities and research institutes, women get paid less, women are assumed to be less competent scientists, and apparently it is okay to ogle female scientists’ boobs… Any of these sound familiar to the rest of society? This is gradually changing, but you have to remember what age those senior people are and what that generation required of women (quit when they got married, etc). That old guard may have influence, but they’ll all be dead or retired soon, and then their influence will be confined to the letters to the editor in the newspaper. After seeing the video below, especially the way the question was asked, I think it is clear that the expectations for women create barriers into and through careers in science (racism creates similar barriers and is one I see as a big issue). So it starts long before people get into science, and then it continues through attrition.


Fast forward to 1:01:31 for the question and NDGT’s* answer (sorry, embed doesn’t allow time codes).

Recently there has been a spate of very public sexist science moments. Whether it is telling female scientists they should find a male co-author to improve their science, or Nobel Laureates who don’t want to be distracted by women in the lab, it is clear that women in science don’t get treated like scientists. Which is why I find the Twitter response to the Tim Hunt debacle, #distractinglysexy, to be exactly the sort of ridicule required. Recent events at least suggest that there are repercussions now.

Scientists are meant to be thinkers, they are meant to be smart, they are meant to follow the evidence. They aren’t meant to behave like some cretin who hangs out on men’s rights movement subreddits. Speaking of which, watch science communicator Emily Graslie discuss the comments section of YouTube.

Here’s another from Thought Cafe and Dr. Renée Hložek.

Update: After the first photo of a black hole was published, women in STEM were back in the headlines, with people wanting to again marginalise women in STEM – not to mention how the media love to promote the “lone genius” when science is a team thing. Vox had a great article on it which included some great graphs from Pew Research.

[Charts: Pew Research Center data on women in STEM]
Source: Pew Research Center

It’s All About the Money
#1 – D’uh, and misleading. Research costs money. *This is why we have the methods section, so that we can figure out the limitations of the study.* Money may bring in bias, but it doesn’t have to, nor does that bias have to be bad or wrong. Remember how I said above that science is an iterative process? Well, there is only so big a house of cards that can be built on a pile of bullshit before it falls down in a stinky mess. Money might fool a few people for a while (e.g. climate change denial) but science will ultimately win.

Ultimately, science is the best tool we have for finding out about our reality, making cool stuff, and blowing things up. Without it we wouldn’t be here, this article wouldn’t be possible, and we wouldn’t know what a Bill Nye smackdown looks like. Sure, there is room for improvement, especially in the peer review process and funding arrangements, and science is flawed because it is done by humans, but science is bringing the awesome every day: we have to remember that fact.

Other rebuttals:


Is Science Broken?

*Wow, who’d have thought including Neil DeGrasse Tyson in this context would age quite so badly!?

I think you’re Mythtaken: Guns #2 – The second armour-piercing round

After a recent discussion about gun myths, I realised that my last blog post hadn’t covered anywhere near enough of the myths that are floating around (this article will mainly be about US guns, but parallels from the resources and science cited can be drawn to other countries). This is obviously because stuff is much easier to make up than to research, just ask Bill “tides go in, tides go out” O’Reilly. One of the big problems with gun research in the US is that the National Rifle Association has effectively lobbied to cut off federal funding for research and to stymie data collection and sharing on gun violence. As a result there is a lack of hard numbers, and research often tends to be limited in scope. Scope: get it? So like a lost rabbit wandering onto a shooting range, or a teenager wearing a hoody, it’s time to play dodge with some of these claims.

Myth: Guns make you safer, just like drinking a bit of alcohol makes you a better driver.

The myth I hear most often is that guns make you safer; just like the death penalty is a great deterrent, surveillance cameras stop crime, and the internet is a good source of medical advice. The problem with this myth is that people like having a safety blanket to snuggle. What they don’t realise is that guns don’t make you safer: a gun is 4.5-5.5 times more likely to be used to do something stupid to someone you know and love than to be used for protection.

I want to be clear here: there’s nothing wrong with going shooting at the range, or hunting vermin. The problem is thinking that you can use a gun for self-defence, when it actually makes the violence problem worse. That gun escalates the violence because people have it there: why not use it? Thus criminals enter an arms race and adopt a shoot-first policy.

Owning a gun has been linked to higher risks of homicide, suicide, and accidental death by gun. For every time a gun is used in self-defence in the home, there are 7 assaults or murders, 11 suicide attempts, and 4 accidents involving guns in or around a home. 43% of homes with guns and kids have at least one unlocked firearm, and in one experiment one third of 8-to-12-year-old boys who found a handgun pulled the trigger, which is just plain unsafe.

As for carrying around a gun for self-defence, well, in 2011, nearly 10 times more people were shot and killed in arguments than by civilians trying to stop a crime. In one survey, nearly 1% of Americans reported using guns to defend themselves or their property. However, a closer look at their claims found that more than 50% involved using guns in an aggressive manner, such as escalating an argument. A Philadelphia study found that the odds of an assault victim being shot were 4.5 times greater if they carried a gun. Their odds of being killed were 4.2 times greater.

It is even worse for women. In 2010, nearly 6 times more women were shot by husbands, boyfriends, and ex-partners than murdered by male strangers. A woman’s chances of being killed by her abuser increase more than 7 times if he has access to a gun, and that access could be the woman keeping one around just in case her attacker needs it. One US study found that women in states with higher gun ownership rates were 4.9 times more likely to be murdered by a gun than women in states with lower gun ownership rates; funny that.

There is also the action hero delusion that often gets trotted out when talking about guns for self-defence. The idea is that everyone is a good guy, so give them a gun and you have a bunch of action heroes ready to fight off the forces of evil. This has worked so well that all governments are thinking of getting rid of the military….

The reality is that the average person is not an action hero and would fail miserably in a high stress situation with actual bad guys. You only have to look at the statistics:

  • Mass shootings stopped by armed civilians in the past 30 years: 0
  • Chances that a shooting at an ER involves guns taken from guards: 1 in 5

I’ve seen several examples cited of “citizens” shooting someone who looked intent on killing everyone they could (with a gun…). But in every instance the “citizen” was actually an off-duty police officer, or a person in law enforcement, or someone in the military. In other words, the people who stop mass shootings or bad-guys with guns, are trained professionals.

There have also been a few studies done that claim X million lawful crime preventions, therefore guns must be good; notably by researchers Lott and Kleck. To say that their research is flawed is like saying Stephen King has sold a few books. Lott’s work has been refuted for extrapolating flawed data. Kleck’s research has similarly been refuted by many peer-reviewed articles.

Myth: Guns don’t kill people, people kill people, quite often with a gun, because punching someone to death is hard work.

If this myth were true we wouldn’t send troops to war with weapons. I get where people are coming from with this myth, because the gun itself is an inanimate object and is only as good or bad as the person using it. Yes, I did just quote the movie Shane: thanks for noticing. But here is the thing: in a society we are more than just a bunch of individuals, we are a great big bell-curve of complexity. So when you actually study the entire population you find that people with more guns tend to kill more people, with guns. In the US, states with the highest gun ownership rates have a gun murder rate 114% higher than those with the lowest gun ownership rates. Gun death rates also tend to be higher in states with higher rates of gun ownership, and generally lower in states with restrictions such as limits on firearm types or safe-storage requirements.

[Chart: state gun ownership rates vs gun death rates]
Sources: Pediatrics; Centers for Disease Control and Prevention

Gun deaths graph: The three states with the highest rate of gun ownership (MT, AK, WY) have a gun death rate of 17.8 per 100,000, over 4 times that of the three lowest-ownership states (HI, NJ, MA; 4.0 gun deaths per 100,000).

The thing is that despite guns being inanimate objects, they affect the user’s psyche. It’s like waking up one morning with a larger penis or bigger boobs: you not only want to show them off, you act differently as a result. Studies confirm this change in behaviour. Drivers who carry guns are 44% more likely than unarmed drivers to make obscene gestures at other motorists, and 77% more likely to follow them aggressively. Among Texans convicted of serious crimes, those with concealed-handgun licenses were sentenced for threatening someone with a firearm 4.8 times more often than those without. In US states with Stand Your Ground and other laws making it easier to shoot in self-defence, those policies have been linked to a 7 to 10% increase in homicides.

Now people also like to red herring the argument against guns by pretending that video games or mental health are the problem. The NRA tried to claim video games were to blame after the Newtown shootings. Of course, if that were true we’d see the relationship by looking at gun ownership versus video game playing, say by comparing the USA to Japan.

                                     United States    Japan
Per capita spending on video games       $44           $55
Civilian firearms per 100 people          88           0.6
Gun homicides in 2008                 11,030            11

Sources: PricewaterhouseCoopers; Small Arms Survey (PDF); UN Office on Drugs and Crime
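A back-of-the-envelope check using only the numbers in the table above (note these homicide figures are raw counts, not population-adjusted, so the per-capita gap is smaller but still enormous):

```python
# Figures taken directly from the table above (US vs Japan).
us = {"game_spend": 44, "guns_per_100": 88, "gun_homicides": 11030}
jp = {"game_spend": 55, "guns_per_100": 0.6, "gun_homicides": 11}

# Japan actually spends MORE per head on video games...
spend_ratio = jp["game_spend"] / us["game_spend"]            # 1.25x

# ...while the US has ~147x the civilian firearms per capita...
gun_ratio = us["guns_per_100"] / jp["guns_per_100"]          # ~146.7x

# ...and ~1000x the gun homicides (raw counts, not per capita).
homicide_ratio = us["gun_homicides"] / jp["gun_homicides"]   # ~1002.7x
```

If video games were the driver, the spending ratio would track the homicide ratio; it points the other way entirely, while gun availability points in exactly the right direction.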

The thing is, gun control has been shown to work, although there are other factors in play and policing is still key. But when gun control has been shown to reduce firearm deaths by 1-6 per 100,000, the case is pretty much closed.

Myth: They’re coming for your guns to stop our freedom and tyranny and democide and Alex Jones said so and aliens made me do it!

As I stated above, the statistics on guns and gun violence are hazy. No one knows the exact number of guns in America, but it’s clear there’s no practical way to round them all up (never mind that no one in Washington is proposing this). Those “freedom” loving gun owners – all 80 million of them – have the evil government out-gunned by a factor of around 79 to 1. If the government were coming for the guns, you’d think it would have done so before being this grossly out-gunned.

[Chart: civilian vs government firearms]
Sources: Congressional Research Service (PDF); Small Arms Survey

Yes, 80 million gun owners is a minority! I find it interesting that from 1989 to 2000 gun ownership declined from 46% to 32%. Since then, ownership has hovered between 34 and 43% for 2000-2011 (the high point in 2007 came after the Virginia Tech shooting, which the NRA campaigned heavily around), which goes some way to explaining why the decline didn’t continue. Now compare those rates of ownership to the recent report from the US Bureau of Justice Statistics, which sums up the rates of gun violence. You can clearly see a decline in gun violence from 1993 to 2000, followed by a plateau that has pretty much held since. This is confirmed by other studies. This is the important take-home point: all the research shows violence, and gun violence, is in decline. The idea that people need a gun for protection is becoming more and more ridiculous, especially given the global decline in violence and the trends seen in countries like Australia (more Aussie stats here). On a side note, the last lot of statistics shows that the more female, educated, non-white, and liberal you are, the less likely you are to own a gun.

So scare campaigns may work to boost sales of guns for a while, but overall, most people don’t want or need a gun. The long term trend has nothing to do with the government coming for the guns and everything to do with people realising they don’t need one and prefer to read a good book, or watch a movie, instead of going to the range.

The simple fact is that more guns in society is the best predictor of death, thus it is time to rethink the reasons for owning a gun, especially if that reason is in case you have to John McClane a situation.

More mythbusting gun articles:

https://www.scientificamerican.com/article/more-guns-do-not-stop-more-crimes-evidence-shows/

http://www.latimes.com/opinion/op-ed/la-oe-hemenway-guns-20150423-story.html

http://thinkprogress.org/gun-debate-guide/#moreguns

http://www.slate.com/articles/health_and_science/medical_examiner/2015/01/good_guy_with_a_gun_myth_guns_increase_the_risk_of_homicide_accidents_suicide.single.html

http://www.vox.com/policy-and-politics/2015/12/11/9891664/daily-show-mass-shootings

https://theconversation.com/six-things-americans-should-know-about-mass-shootings-48934

https://mobile.nytimes.com/2017/11/07/world/americas/mass-shootings-us-international.html

More science:

http://www.amjmed.com/article/S0002-9343(13)00444-0/abstract
http://ajph.aphapublications.org/doi/abs/10.2105/AJPH.2013.301409?journalCode=ajph
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1447364/pdf/0921988.pdf
http://www.hsph.harvard.edu/hicrc/firearms-research/guns-and-death/
http://www.kellogg.northwestern.edu/faculty/dranove/htm/Dranove/coursepages/Mgmt%20469/guns.pdf
http://www.theatlantic.com/national/print/2011/01/the-geography-of-gun-deaths/69354/
http://edition.cnn.com/2012/07/30/opinion/frum-guns-safer/
http://www.crab.rutgers.edu/~goertzel/mythsofmurder.htm
http://islandia.law.yale.edu/ayers/ayres_donohue_article.pdf
http://islandia.law.yale.edu/ayres/Ayres_Donohue_comment.pdf
http://www.motherjones.com/politics/2003/10/double-barreled-double-standards

WhitePaper020514_CaseforGunPolicyReforms.pdf

We think we’re smart

[xkcd comic]
XKCD nails it again.

Within science fiction and wider society there is this idea that we’ll find aliens. I always find it funny when humans talk about discovering “other” intelligent life in the universe. Just a wee bit arrogant to consider ourselves intelligent. Yes, I do realise that I’m arguing that point using technology based on quantum mechanics, probably being read on a device that weighs less than 200g and fits in your pocket, linked by a distributed network, connected by orbital satellites. Science: it works… bitches.

But I would continue my argument by saying that to some people that amazing interface of technology, that is allowing this blog post to be read around the world, might as well be explained as “magic, magic, magic, magic, magic, god did it.” I certainly couldn’t explain how quantum mechanics works, nor how it applies to communications technologies, let alone how it manages to stream all of my favourite porn media to my phone. Thus Arthur C Clarke’s third law – Any sufficiently advanced technology is indistinguishable from magic – holds true for the vast majority of people on this planet.

Now the argument against Clarke’s third law is that technology isn’t magic. In fact, in the entire history of human civilisation, with all the things that have been attributed to magic, all the great mysteries of the universe, once investigated, have turned out to be not magic. But I’m talking about the knowledge gap between the average person and the specialist in the field who develops all this cutting edge stuff that allows other specialists to do cool stuff; like making a hoverboard. We are surrounded by everyday items that most of us would struggle to explain the concept of how they work – magnets, how do they work? – let alone understand the complexities involved – magnets, this is how they work.

Douglas Adams brilliantly satirised this idea in his novel Mostly Harmless. Arthur Dent crash-lands on an alien planet where the local humanoid populace is rather backward in comparison to us humans. Arthur comes from a planet of television, cars, planes, computers: all sorts of neat stuff. But he doesn’t know how any of it works, nor how to go about reverse engineering any of it. So he becomes the sandwich maker.

Essentially, we point to all of our human achievements to show how smart we are, but in reality most of us haven’t the first clue about any of those achievements. We just aren’t as smart as we would like to think.

Now compare this to aliens. Humans are pretty proud of having gone to the moon, cashing in on all 12 of us who have done so, but to be visited by aliens requires interstellar travel. That requires technology we probably haven’t even dreamed of yet (warp drives possibly excepted). An alien race that can do that is so far beyond human achievement and intelligence. Thus, I’m suggesting that even at our best, we would be babbling morons in comparison to an intelligent life-form that has managed interstellar travel.

Sure, the aliens that decide to cross interstellar space may be the Cletus of their species. Their technology may actually have reached the point of sentience and doesn’t require anything of its “makers”. But think of how advanced such a species would be, not to mention how arrogant (rightly or wrongly). There is no reason for them to look upon Earth and see humans as intelligent (e.g. climate change and reality TV). There is also no reason to believe that we’d even notice these aliens. An intelligent life-form travels between star systems, has the technology for that not to have taken billions and billions of years, and some dude with an out-of-focus camera is going to be the only person to see them?

So I think that humans are rather egotistical to think of ourselves as intelligent life in the universe. I also think that it is arrogant to believe that an alien species would regard us as intelligent. I also think that we’d have little chance of encountering intelligent alien life unless they wanted to be encountered. This is just my view, but the main thing is, Neil deGrasse Tyson agrees with me (or is that I agree with him?):

[Image: Sagan on tech]

Book Review: Gotham Central Vol 1 by Ed Brubaker and Greg Rucka

Gotham Central, Vol. 1: In the Line of Duty by Ed Brubaker
My rating: 5 of 5 stars

With the new TV series Gotham currently being cast there has been a bit of buzz around what the storyline is going to be about. Unfortunately it is not going to be based upon this excellent series by Brubaker and Rucka (should also mention the art by Michael Lark). This would actually be a great way to do a non-Batman series, especially as it would be able to use the recent Nolan films as a lead in.

I guess people who read will be the only ones to appreciate a series focussed on Gotham city police trying to work in the shadow of Batman.

View all my reviews

New Ultra Thin Diet


In the tradition of nutritionists – the toothyologists of dietary advice – I have developed a new free diet plan to help people lose weight. Just like all other fad diets, my diet promises to help you lose weight or your money back. And just like other fad diets, I have come up with an overly simple way of losing weight that is guaranteed to not work in the long term.

Introducing the Dodgy Kebab Diet™

The relationship between an alcoholic binge-fest and a stop for a dodgy kebab before heading home has long been known. But have you ever wondered why it is that people don’t gain pudgy spare tires around their middle from their night of drunken debauchery? Well, thanks to not-science and pure speculation, I have discovered that it is the dodgy kebab that keeps people thin and ready for another night of drinking your paycheck.

You see, the dodgy kebab contains a quantum field of dietary entanglement. This means that the dodgy kebab sneaks up on all of that alcohol and redefines its aura, changing it from calories to vomit, which I call the Gastro™ effect.

Now this may work for the overindulgence evenings, but a diet has to be every day for 10 days or 1 dress size, so how can the Dodgy Kebab Diet™ work without the need to get plastered every day? Well, the dodgy kebab’s quantum field of dietary entanglement works just as well on your stomach lining as it does on alcohol.

The diet is very simple: eat one dodgy kebab per day for 10 days and I guarantee you will lose weight.* That’s it! You will feel better ** and look better ***.

Don’t just take my word for it: here is one of my satisfied victims customers.

Shane: I started the Dodgy Kebab Diet™, caught Gastro™ and lost 5kg.

With the Dodgy Kebab Diet™ you pay no money for access to my fully unqualified nutritionists (we’d have to be dieticians to be qualified). You only pay $59.95 per month for access to our extensive database of Dodgy Kebab Diet™ endorsed vendors. No need to spend every Saturday night wandering around to find the one with Gastro™. I’ve done all the research for you, compiling all the dodgy kebab vendors from the Food Safety Authority. All you have to do is send me $59.95 and I will send you the mobile phone app and simple instructions on how to get the most out of your bout of Gastro™.

What if kebabs aren’t my thing?
For an extra $9.95 I can include Chinese, Thai, and all of the least cooked chicken restaurants in your area.

Order now to avoid disappointment at eating well cooked food and gaining weight like a mug.

* We don’t guarantee weight loss on this diet.

** You will only feel better if you make it to the hospital emergency room on time.

*** Looking better is dependent upon surviving the food poisoning and proper application of makeup.

Thanks to Shane Nixon for helping inspire this diet.

Talent, ability and being awesome

born writer

Born to write? Born to be an athlete? Born to be a rocket scientist? People love to talk about “natural” ability or talent as the be all and end all of achievement. Since I actually own a genetics text book – it props up my DVD collection on the shelf – and once watched someone do manual labour, I feel qualified to comment on the talent vs. work debate.

Genetics is a big, complicated topic, so I’m going to provide a facile overview of it. Genetics is that thing that means some people have higher baselines, are higher responders to training/learning, and are likely to achieve more (see this and read this for sports examples). For some the opposite is true: they have low baselines, don’t respond well to training/learning, and are likely to suck no matter what they do. There isn’t much you can do about your genetics, unless you happen to have a time machine and can play matchmaker to get better parents.

But that isn’t to say that you shouldn’t try to get good at stuff. Until you are tested and start training, you don’t really know what your “ability” is. And just because you might continue to suck, you will suck less than you did before, which means you will be better than those around you who didn’t even try. Take an example from sports – because people actually do science on athletes, the arts talk about their feelings too much – athletes tend to live longer than normal because they are more likely to be fitter, which lowers cardiovascular mortality. You don’t get fit sitting on a couch, watching TV, snacking on corn chips, in your underwear: you have to train.

So let’s take this into the writing field. You may have been born with a massive brain, nimble fingers, and an imagination that rivals college students tripping on acid, but that doesn’t mean much if you never learn to read or write, are too poor to have access to writing materials, or lack the persistence to share that writing with the world. All that talent and ability counts for nothing if you don’t do something with it. You have to train. The difference between the talented individual and the untalented individual can often just be a lot of hard work by the untalented. I mean, who has sold more books: James Patterson or any of the Booker Prize winners?*

But let’s not get carried away. We have to acknowledge that any “talent” is a GxE (gene-by-environment) interaction. Genetics, or that innate ability, is still a factor that we can’t dismiss, but so is the environment. So all of that skill development and training will come more easily, more quickly, and possibly progress further for some, but that isn’t an excuse for not doing the hard work.
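As a rough sketch of what a GxE interaction means, here is a toy model with entirely made-up numbers. The `performance` function and every figure in it are my own illustrative assumptions (real genetics studies estimate these terms statistically, not like this):

```python
# Toy illustration of a GxE (gene-by-environment) interaction.
# All numbers are invented for illustration only.

def performance(baseline, responsiveness, training_hours):
    """Phenotype = genetic baseline + (genetic responsiveness x training)."""
    return baseline + responsiveness * training_hours

# Two hypothetical people: a high responder and an average responder.
gifted_untrained = performance(baseline=60, responsiveness=0.8, training_hours=0)
average_trained = performance(baseline=40, responsiveness=0.4, training_hours=100)
gifted_trained = performance(baseline=60, responsiveness=0.8, training_hours=100)

print(gifted_untrained)  # 60.0  -> talent alone, no training
print(average_trained)   # 80.0  -> hard work beats idle talent
print(gifted_trained)    # 140.0 -> talent plus hard work wins outright
```

The point of the sketch is the middle line: the untrained “talent” still loses to the trained “average” person, but training moves everyone, just by different amounts.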


See also: http://emilyjeanroche.blogspot.com.au/2014/02/WritingSkills.html

* Not that I’m insinuating that winning a Booker Prize actually makes you a talented or good writer. I actually use those prize lists to figure out what not to read.

TV shows airing in order

almost human

Recently I wrote about the TV shows that have been keeping me entertained, or at least giving my eyeballs some much-needed exercise. One of the TV shows I’d failed to get into was a little sci-fi show on Fox called Almost Human. It appears that the reason I’d had trouble appreciating this new show is that Fox is up to its old tricks.

That’s right, Fox is airing the episodes of Almost Human out of order. And before you ask, I did check to see if Joss Whedon was in any way involved in the show: apparently not. So Fox can’t use the “we have to dick Joss’ show around” excuse, like they did with Firefly, Dollhouse, etc.

Obviously I’m not a highly paid TV executive, so my opinion on this topic is really inconsequential. Unless, of course, viewers of TV shows – that reason TV shows are made, aside from selling ad-space – are regarded as important in any way. Sure, I don’t have a degree in TV programming, but I would have thought airing a TV show in order would be the sensible thing to do. I’m not sure if the degree at MITV, the TV university located next to MIT, can be done online yet, but I would like to see their syllabus to get some idea of the inner workings of TV networks.

I know when I write a story I always like to start with the fifth chapter, then come back to the second chapter after I’ve written six or so chapters. I especially like to do this in a story which has a lot of new stuff in it, like sci-fi, and where there is any sort of story arc. This way you can really do your best to alienate readers and confuse them.

Not being privy to the inner workings of TV networks, it is hard to say exactly why they would do this, or how often they do it. With some TV shows you just wouldn’t notice. Take a formulaic story capsule like CSI Wherever. There isn’t usually an episode- or season-spanning story line; dead bodies show up, someone puts on glasses after making a pun, someone wears a lab coat near some magic ‘science’ boxes, they get the bad guy to confess during a flashback. So you would never know if they were aired out-of-order – which also raises the question of whether they have an order to begin with. This is the sort of show you could just chop and change around to suit whatever excuse is used for butchering a show. But you can’t do this to a serialised TV show.

This isn’t just about annoying and confusing viewers. This isn’t about the disdain the TV executives are showing toward the show’s fanbase, you know, those people they need to sell stuff to. This is about a lack of respect for the creators of the show, especially the writers. Someone has gone to the trouble of crafting a story, an episodic story that needs to build upon previous instalments in order to continue to attract fans. Almost Human has enough of a “stand-alone” nature to the show to not be damaged too much by the lack of continuity (WTF is ‘the wall’??) but plenty of shows have been damaged or destroyed by these sorts of airing decisions.

Bring back Firefly!

Update: It appears that Fox has cancelled Almost Human, despite renewing The Following which had similar ratings. This shouldn’t be surprising since the network has essentially been trying to cancel the show since they first aired it. Fox didn’t make the show, so there is some chance a network like SyFy might pick it up.

Other articles on this:

http://seriable.com/almost-human-episodes-airing-order/

http://sciencefiction.com/2013/12/13/almost-human-airs-order-sign-cancellation/

Is fiction actually fiction?

There has been an interesting duo of videos from PBS’ Idea Channel. Mike discusses some interesting concepts surrounding fiction, like the fact that fiction is as much real as it is made-up and vice versa. Worth a watch.


The two videos cover a lot of ground, but one of the more important points I’d like to highlight is the idea that we can’t have fiction without reality. We need something to anchor our ideas and make-believe, shared experiences that allow us to understand and accept these fictions. There are plenty of examples of this, but one of the cooler examples is looking at depictions of the future at various stages throughout history. Compare what sci-fi movies of the 50s thought computers would look like now to what they actually look like, and you see a 1950s computer. Our imaginations actually suck a lot more than we think.

But here’s an idea about our inability to imagine the future: what if our imaginations don’t actually suck, but instead we ignore the outlandish imaginings that are actually more likely in favour of stuff we already know? Think about it. Or don’t, I’m not your boss.

Perth Writers’ Festival 2014

My annual pilgrimage to the Perth Writers’ Festival is over for another year. According to reports, I was joined by 38,500 other reading and writing fans, with ticket sales up on last year (can someone confirm that figure, I thought I read it here but I must have been mistaken. Edit: confirmed figure from WritingWA).

Some write-ups have discussed the heat; we are 1.6 degrees hotter than the long-term average for February: thanks climate change! Some write-ups have discussed the wonderful talks from literary authors; can’t be less entertaining than their books. Some write-ups have tried to imply that Perth people gasped when Scott Ludlam used the word crap; yes we clearly are a simple folk over here in the west, not accustomed to swearing and impolite behaviour like taking notes. So I hereby present my write-up.

Friday 21st

I started off my festival adventure with the panel discussion Tinker, Tailor, Soldier, Spy. Susan May chaired a discussion on writing, publishing, and thrilling books with Chris Allen and Joe Ducie. It was an interesting session, although Joe is not what you’d call a gregarious person and he is limited in what he can say without being sent to a black site for breaking the secrets act. This session attracted a lot of teen readers, a first for any writers’ festival I’ve been to, in part due to the young adult theme of Joe’s book and Chris’ campaign to get more boys reading. Also, why is it that the nice and friendly people always seem to write the books with the largest body counts?

My plans for the day were beaten with a cricket bat when the session Fair Go Mate was filled past standing room only. Not being able to gain admittance, I’m going to say the session was clearly for doo-doo heads. Instead, I went and saw The Inner Life of Others. Amanda Curtin discussed building and writing characters with Debra Adelaide, Chris Womersley, and Andrea Goldsmith. I was sitting next to the fan for one of the much-needed air conditioners for this session. So while I was quite cool and sweat free, I couldn’t hear the speakers clearly. I think in future the festival needs monitors for the speakers or better technicians on hand to get the sound levels right.

I had hoped to see the session Boom Town Rats in the afternoon, as David Whish-Wilson was speaking. He wrote my favourite novel of 2013 after all. I had to settle for asking him how things went via Facebook: apparently, it was an interesting discussion session. Instead, I went to Annabel Smith’s workshop on Social Media Marketing. Annabel discussed various aspects of social media and the Hub and Outpost model, with your blog/website being the hub. We had a range of people in the room from social media novices to professionals, and a couple of people who didn’t see the point – I mean, being able to talk and form communities with people on the other side of the planet instantly is so overrated. Annabel did well in catering to such a wide spectrum.

Saturday 22nd

Lee Battersby’s fantasy writing workshop, Universal Law, kicked off my Saturday with a teddy bear explaining humans to aliens (you had to be there). This was a fantastic session and I got a lot out of it. Okay, that could just be confirmation bias talking, because Lee did confirm a lot of my own thoughts on fantasy and fiction writing in general, but I’m just going to pretend we’re both right. Plus, I’ve got the beginnings of a cool little absurdist short story from the session, which may have made the session pay for itself.

Hungry and in need of golden ale refreshments, I headed to the UWA Club. David Marr was holding court with a throng of fans/questioners/listeners after having finished his discussion panel. I was tempted to join the group and ask him when he was going to finally stab Andrew Bolt to death for crimes against journalism, but decided to not ruin his day.

After a leisurely lunch at the UWA Club, I skipped the next beer and went to The Game Changers: What’s In Store? Stephanie “Hex” Bendixsen chaired a fascinating discussion about the games industry and storytelling. Dan Golding, Dan Pinchbeck, and Guy Gadney were all insightful speakers and kept the audience of preteens to curmudgeons entertained. Guy Gadney also showed a quick wit when a young lad couldn’t remember Guy’s name, with the boy ending up on stage answering questions (which he handled quite well).
Hex-and-special-guest-panelist

Although, as if to prove that the games industry has a long way to go, or that men are still dickheads, one of the audience members started his question with “Damn girl, you fine!” when addressing Hex. If only there were some way to breed this behaviour out of the population…

The next session I attended was Hi-Viz Days with author and comedian Xavier “Matty” Toby. As a general rule, I don’t read non-fiction, as it is often more fiction than non-fiction, is often boring, and has far too low a body count to be entertaining for me. But having attended this session and listened to Xavier read out some sections from the book, I would recommend you read his book about his mining experiences. Having lived in rural Australia for a large chunk of my life, a lot of the conversations, the style of speech, and the characters portrayed sounded like the people I’ve met and know. A few award-winning authors should read Xavier’s book to see how rural and regional people actually speak (or at least hand back the awards for capturing the ‘bush lyricism’ in their novels).

Sunday 23rd

My Sunday started rather early. Or rather, my Saturday didn’t really finish until Sunday morning. My little bundle of joy was ill and had trouble sleeping, which meant I did too. It also meant I’ve contracted his illness: parenting is lots of fun.

I’d already missed one of David Whish-Wilson’s sessions on the Friday, but I went the whole hog and missed his Sunday session as well. His Sunday interview about Perth, the city, and his non-crime, non-fiction book apparently went well (full house). David assured me that there were plenty of interviews being done around the festival on this book, so if we check his webpage we could probably track down an interview with David on Perth: the book and the city.

The only event I managed to attend on Sunday was Susan May’s workshop on Standing Out From the Crowd. It turns out that Susan and I had been in the same all day workshop on publishing a few Perth Writers’ Festivals ago. Her takeaway from that event had been to avoid the slush pile and somewhere along the way, after developing industry contacts to help avoid the slush pile, she self-published. I agree with one of the other attendees that Susan’s session was enthusiastic and genuine.

And that concludes my Perth Writers’ Festival adventure for another year. It was good to catch up with friends and other attendees over the three days and I hope others enjoyed the event as much as I did.

Book review: The Tournament by Matthew Reilly

The Tournament by Matthew Reilly
My rating: 4 of 5 stars

Just about everyone has already commented how this novel is a departure for Matthew Reilly. It’s still unmistakably a Matthew Reilly novel, but instead of a thriller, this is a mystery novel.

Whilst this was an enjoyable novel, I can’t rate it as highly as his others. The key to enjoying the change in Reilly’s murder mystery cum chess tournament is to remember this is a mystery and not a thriller. Seriously, some of the reviews I’ve seen sound like they were expecting Scarecrow to time travel back at any moment and start shooting mutant monkeys, and were annoyed when that didn’t happen.

View all my reviews

Top Suspense Hangout video

Today was the start of the Perth Writers’ Festival, the local festival for my fellow pale, short-sighted readers and writers. Once a year we gather together to fulfil our in-person social interaction requirements for the year.

Before I left the house, Libby Hellmann, Lee Goldberg, and Paul Levine had a Top Suspense Google+ Hangout. They discussed a number of issues around writing suspense stories. Funny how the title of the group and hangout gives away the topic. It was a good session and I highly recommend my fellow writing friends have a watch of the embedded video below.

Total Recall: the movie, the movie, or the book?


At the moment there is a lot of talk about Paul Verhoeven’s ‘trilogy’ of sci-fi movies being remade. I think the terms used to discuss the remakes are stupid, banal, and facile. Verhoeven made three fantastic social satires, that were also science fiction action films: Robocop, Total Recall, and Starship Troopers*. Okay, only two were fantastic, Starship Troopers was stupid. They were also all made at a time when you could make a grossly violent film and not be shunned by cinemas and TV in favour of PG13 violence – you know, the violence that is heavy on explosions and pew-pew noises, but light on the consequences of that violence, which raises kids to believe that violence doesn’t hurt anyone.

Robocop: The Reboot has just hit the cinemas, spurring people the internet over to complain about a movie they haven’t seen (new Robocop), a movie that hasn’t been made yet (new Starship Troopers – not to be confused with Super Troopers), and how terrible the recent Total Recall movie was. Anyone would think that Colin Farrell had personally shagged Arnie’s housekeeper the way they talk about the Total Recall remake.

So I did something unthinkable: I rewatched the remake, rewatched the Verhoeven/Schwarzenhamneggnburger version, and read the Philip K Dick short story (or is it a novella?). The reason for doing so? Because these remakes were being derided so heavily. Nothing inspires people to touch wet paint like putting a wet paint sign on it.

Let’s start with the Total Recall remake. It is an action film: good start. It is a sci-fi: in that it doesn’t have a talking dragon in it, thus it can’t be fantasy, despite the lack of ‘science’ in the science fiction, making it closer to fantasy. It has half-decent actors in it: I’d watch just about anything with Kate Beckinsale in it since seeing Shooting Fish, as long as the movie doesn’t have Ben Affleck in it – yes, that one, let us not speak its name. It also appears to have a plot: I could be mistaken.

As a film the Total Recall remake is fine. All the right things explode, all the good guys live, all the bad guys die horribly, and most of the needless violence is against robots so we don’t get caught up in the mass genocide that the hero performs. As an adaptation of the short story, you could be forgiven for thinking the filmmakers only read the first few pages; much like the original movie. Compared to the original Total Recall, it is a pale, facile shadow.

The Arnie version worked as a straight-up action movie but also had a much better secondary plot about whether it was all happening or all in his head. This part is what makes the original movie closer to a Philip K Dick adaptation than the new movie. Although, the original movie being closer to the source material is probably because the screenwriter and Verhoeven had read the dust jacket of the story, whereas Len Wiseman and his screenwriter just took Verhoeven’s word for it that there was an original story to base the movie upon.

Dick’s story actually has a really funny and interesting twist ending, which neither movie used, because the movies and story diverge at about the time when Doug Quaid (Quail in the story) arrives home after visiting Rekall. In fact, We Can Remember It For You Wholesale bears so little resemblance to the movies that you’d sooner call it an inspiration for them than source material. I don’t have a problem with this, as long as they handed Dick a great big cheque, maybe a signed picture of Arnie to go with it, maybe some Planet Hollywood shares as well.

The movies are both good fun, both are entertaining, both are well made, both had dubious understandings of physics. There is nothing wrong with the new movie as a piece of entertainment. But it won’t last the way the original movie has. This comes down to Verhoeven’s handling of the secondary plot, which might as well not exist in the remake. I certainly look forward to the even more facile Total Recall movie that will come out in another 20 years, which will probably not even have a three boobed woman in it.

* I could write an entire essay on how Heinlein’s original novel differed from the movie and how its social comment was far deeper and insightful than the movie.

Sony exits ebook biz

I don’t know if you’ve heard, but there are these things called electronic books now, e-books for short. Now these are brand new (invented 1971, possibly as early as 1949) and understandably the devices to read them are even newer (first e-reader released 1998). So it may come as a shock to many of you that quite a few people read e-books on e-readers now instead of paper books. It will come as even more of a shock to you that the Sony e-reader has become a thing of the past.

That’s right my fellow book lovers – lovers in the adoration sense, not in the brace yourself, oh yeah, uh-huh, uh-huh, chikka bow-wow, sense – it appears that Sony has decided it doesn’t want a dedicated e-reader, in fact it doesn’t even want an e-book store. They have announced that they are pulling out and customers are being transferred to the Kobo store.

Of course, I don’t think anyone is particularly surprised by this decision. Raise your hand if you’ve ever actually seen a Sony e-reader. Now keep it up if you’ve actually owned one. If you can see anyone with their hand still raised, I’d question how you manage to turn people’s web cams on. Sony has been playing at the bottom end of the market for e-readers and e-books for quite a while now. The chart below from Goodreads shows Sony were picking up Kobo’s scraps in the market.

http://www.slideshare.net/GoodreadsPresentations/whats-going-on-with-readers-today-16508449

So what does this mean for us readers? Well, it means the big dedicated e-readers remain, the Kindle and Nook. It also means Kobo could pick up a bit more of the e-reader and e-book market. But that isn’t particularly interesting to me, I’ll discuss why in a moment. What is interesting is the Sony e-reader is probably the victim of the modern device market.

I read an interesting tech article that was discussing mobile phones. They pointed out that the companies making money on phones weren’t actually making money on the phone sales, especially at the mid to lower price points, but instead cashing in on the app stores and downloads. The phone is a loss leader for the software business they run. Nokia and their deal with Microsoft is a classic example of this, with Nokia battling to compete for market share and profits.

Translate that to e-readers and the same thing applies. It was even worse for Sony, as the other competitors were/are selling their Kindle, Nook, Kobo, etc, as a loss leader to get people using their store or affiliates. This meant that the big stores attract the users, who buy the associated tech, which locks them to the stores (to some extent at least), leading to e-book sales profits. Terrific! As long as you don’t think too hard about the slave labour making the devices.
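The loss-leader maths here can be sketched with a toy calculation. Every dollar figure below is a made-up assumption, purely to show the shape of the model, not Sony’s, Amazon’s, or anyone’s actual numbers:

```python
# Toy loss-leader arithmetic with invented numbers: the e-reader is sold
# below cost, and the store's margin on e-book sales recoups the loss.

device_loss = 20.00      # hypothetical loss taken on each device sold
margin_per_ebook = 2.50  # hypothetical store margin per e-book sold

# How many e-books a locked-in customer must buy before the device pays off:
breakeven = device_loss / margin_per_ebook
print(breakeven)  # 8.0

# A customer buying 25 e-books over the device's lifetime:
lifetime_profit = 25 * margin_per_ebook - device_loss
print(lifetime_profit)  # 42.5
```

The catch, and the reason Sony struggled, is that the device loss is guaranteed up front while the e-book margin only arrives if your store actually attracts the buyers.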

The reason I don’t find the market positioning of the e-reader devices of much interest is down to a few things. The first is a little statistic that has been showing up in surveys from Goodreads and Pew Research; namely that 29-37% of people read books on their phone (23% on a tablet). A dedicated reading device is only really in the book space now because the e-reader screen causes less eye fatigue. At the moment! Watch this bubble burst as phones and tablets eat away at the readability technology, such that e-reader screens become redundant. Mobile devices also don’t have to be linked to any one e-book store, so interesting times are on the horizon.

Another view on e-readers future: http://techland.time.com/2013/01/04/dont-call-the-e-reader-doomed/

Isabel Allende’s scorn for genre fiction

science-fiction-vs-proper-literature
Literature vs Genre: jetpack wins!

There is a storm brewing. In the latest of the long line of insults by literary fiction against genre fiction, Isabel Allende has taken a pot shot at crime fiction. Now apparently she hates crime fiction because:

It’s too gruesome, too violent, too dark; there’s no redemption there. And the characters are just awful. Bad people.

But that didn’t stop her writing a crime mystery. It also didn’t stop her saying that the book was a joke and ironic. I think the word she was actually looking for was hypocrite.

I’ve never really understood the people who read or write stuff they don’t enjoy. Sure, I read some really boring science journal articles, but that’s because I enjoy knowing stuff. If I’m going to sit down and read a book, I want that 10-20 hours of entertainment to be, well, entertaining. If I’m writing, which is a much longer and more involved process, why would I invest that much time in something I’m not enjoying doing?

So to some extent, I understand why Isabel decided that her mystery had to be a joke and ironic. But that is also the crux of the problem: she doesn’t seem to understand that she is also insulting readers and fans of genre fiction. I think the book store in Houston, Murder by the Book, which had ordered 20 signed copies of her novel, did the right thing in sending them back.

Now you can write a satirical or ironic take on a particular genre or sub-genre of fiction. But when you do so it has to be because of your love of all those little things you’re taking the piss out of. If you do it out of hate then you can’t turn around and try to sell it to the audience you are taking a pot shot at: I think this stuff is stupid, you’re stupid for reading it, but I still want you to pay me for insulting you.

I get a little sick of snobbishness toward genre readers and writers. Do genre readers and writers take pot shots at literary authors for their lack of plots, characters who have to own a cat and be suffering, and writing that is there to fill pages with words and not actually tell a story? No. We’re too busy reading something exciting.

It would be great if people just enjoyed what they enjoyed and stopped criticising others for enjoying what they enjoy. Enjoy.

See also:
http://www.fictorians.com/2013/03/04/literary-vs-genre-fiction-whats-all-the-fuss-about/

Book Review: The Shining Girls by Lauren Beukes

The Shining Girls by Lauren Beukes
My rating: 5 of 5 stars

I met Lauren two years ago now, when she was running a class on writing (d’uh). This first sentence of the review is essentially a name drop… move along, nothing to see here.

The Shining Girls is such an interesting take on crime novels, with a wibbly-wobbly, timey-wimey* plot and some fascinating storytelling. Lauren has an interesting setup for the serial killer and his victim protagonist, a setup that you hope has a good payoff. Well, it doesn’t have a good payoff; in the final pages, it has an excellent payoff.

The version I ‘read’ was the audiobook, which is worth mentioning because there were multiple narrators to take on the various points of view used in the book. This was a great touch that I wish more audiobooks would do. For a complex novel like The Shining Girls, it is almost necessary. I can say I have stopped listening to at least two audiobooks in the past year that probably would have been improved with multiple narrators to clarify changes in points of view. Or you could just read the novel the old fashioned way, just not whilst driving, or using a table saw, as I was able to with the audio version.

* If you don’t get that reference I pity your TV viewing habits.

View all my reviews

Music that lasts

I was recently having a discussion about Zeitgeist. No, not the concept of a spirit of the age or spirit of the time, I mean the 2007 album from the (not) Smashing Pumpkins. I’ve been a massive fan of the Smashing Pumpkins’ music since about 1994 (wow, 20 years!) but have to say that Zeitgeist was the last of their albums I bought, and I don’t listen to it, Adore (1998), nor Machina (2000). Essentially, I’m no longer a fan of the Smashing Pumpkins, I’m a fan of their early work only.

What amazes me is you can listen to Gish (1991), Siamese Dream (1993), Mellon Collie (1995), even their b-sides album Pisces Iscariot (1994), and they still hold up really well. With the exception of the song Untitled (from their retrospective Rotten Apples, 2001) and maybe Tarantula (from Zeitgeist), the Smashing Pumpkins haven’t released a song or album that compares to any of the material on those early albums. With the more recent material the songs sound unfinished. When old b-sides sound better than your new a-sides, you really have to question what you’re doing.

But this isn’t just about the Smashing Pumpkins: name a Rolling Stones song released in the last three decades (i.e. everything after 1986’s Dirty Work). Can’t, can you!? They’ve released 5 studio albums and countless – well, you can count them, but who cares to – live and collection albums in that time. Fans everywhere dread this announcement at a Rolling Stones concert: “And here’s a song from our new album.”

There are a few factors at play here: the idea of talent and inspiration meeting, the idea that even great artists can’t continue at that elite level indefinitely, and the idea that some art is transitory whilst some is timeless. I’ll leave the first two points for another day, the latter point gives me an opportunity to insult pop music.

Some art, music, TV, movies, books, etc, rise through the charts, become hugely popular, and dominate the media. Then a few years later everyone is embarrassed to talk about those artists and art, digging a deep pit of denial to throw those pieces of crap where they will never be found again. I’ve discussed this before in my article on Good versus Popular, suggesting that popular music/art/things aren’t necessarily good and that time and perspective sort the wheat out from the chaff. Some of the music we enjoy is just because it is played everywhere we go. Some music just filled a hole in the age bracket or life journey, such as Limp Bizkit for all the angry teens, or Placebo with their dark depressing (teen) angst music. A decade on and you’d battle to find anyone who would admit to having bought a Limp Bizkit album, and when I recently relistened to those albums I wondered how I ever listened to that junk.

So what music (or art) lasts? Is it immediately obvious? What lasts isn’t easy to define, because I would never have picked Yellow Submarine to last in the same way that Get Back has. A kid’s song versus a satire of attitudes to immigration in the UK. Would we even listen to Yellow Submarine now if it hadn’t been a Beatles song, or bland and inoffensive enough to be played to us as kids in primary school? I digress. I think the answer to what will last is often, but not always, immediately obvious. And what lasts is rarely categorised by the prefix* pop.

Take for example everyone’s current objects of pop music derision: Justin Bieber (or Miley Cyrus, whichever you prefer to hate more). Bieber’s music is popular, he’s famous as a result, and I don’t think anyone would dispute that his music will be forgotten in 5 years’ time and laughed at in 10, much like The Spice Girls. Remember them? Me neither. We** already know his music won’t last. And how about an example of something that will stand the test of time… Wow, this is the part where I admit I’m a metal fan and haven’t listened to ‘commercial’ music in over a decade. I’d say Daft Punk’s most recent work will last, but they have been around for over a decade now, so it’s hard to call them a new artist.

But I will give you another prediction: Pearl Jam will be my generation’s Rolling Stones. They will still be touring long after everyone has stopped noticing that they record new albums. And people will go to see them live because of those first few albums that everyone loved and still loves.

Essentially I think that lasting comes down to quality. I’m not talking about the recording studio, production values, or hair gel and dance routines. I’m talking about the quality that arises from talent and inspiration meeting. Bob Dylan’s songs had terrible production and his voice sounds like someone gargling gravel whilst strangling a cat as their foot is fed into a wood chipper. Yet he had talent and inspiration, subsequently capturing the zeitgeist and lasting (see what I did there). But that music/art has to find a fanbase, whether immediately or by growing it over time as Led Zeppelin did. Only one question remains: which is better, to last or to grab the headlines for 15 minutes?***

* Yeah, I know, not actually a prefix, more of a noun or adjective dependent upon the context.
** Having not ever heard any of Justin Bieber’s music and only accidentally heard part of a Miley Cyrus song at the gym, I can’t actually judge how good or bad their music is and how long it will last. I’m basing my judgement upon what has happened with previous pop stars.
*** The answer is easy: to last. If everyone forgets your 15 minutes did you even have those 15 minutes?