There is nothing worse than picking up a book, movie, whatever, expecting to be entertained based on the cover. The above example is the movie Far Cry, starring Til Schweiger, in what looks like a cool action flick. The description even makes you look past the fact that this is a video game adaptation, promising a slick actioner:
An ex-special forces soldier turned boatman is hired by a journalist to investigate a top-secret military base on a nearby island.
The problem with this packaging is that this is a film by Uwe Boll. Til Schweiger is a fantastic actor and a major box office drawcard, especially in his home country of Germany. He was also the driver behind one of my favourite films of all time, Knockin’ on Heaven’s Door. Yet not even Til can save us from the worst director of all time.
One of the things that amazes me about Uwe Boll is not so much the fact that he is still making films (petition to stop him making films) but the fact that he is able to attract the money and star power to his movies. You would think that actors would be keen to avoid working with Uwe so that they don’t sign a career death note. But Til, Ron Perlman, Burt Reynolds, Jason Statham, Ray Liotta, Eric Roberts, Christian Slater, Stephen Dorff, Claire Forlani, Leelee Sobieski, John ‘Gimli’ Rhys-Davies, and Ben Kingsley (Kingsley may be an Oscar winner, but he has appeared in some truly awful films) have all lined up to appear in a Uwe Boll production. Why!?! Rhys-Davies has implied that the money is good and Uwe is easy to work for. No mention of exactly how good Uwe is to work with; I’m going to assume running hot and cold hookers and blow.
This speaks to the underlying problem with picking good entertainment. We can be easily misled by a cool blurb, an impressive trailer, a spot at the front of the store, a stand that tackles you to the ground and forces you to buy the movie/book. It is why movie stars are paid big money: they have a brand that audiences recognise, and that brand can guarantee box office sales. In publishing you have name-brand authors like James Patterson occupying the front of the store because they are reliable bestsellers. And Lee Child was recently shown to have the strongest brand in publishing, with fans following him from book to book more than any other author’s, because of his reliably entertaining books. Uwe Boll is the opposite of this brand of success and reliability.
Essentially, media consumers like us are less likely to try a new author, or watch a film by a new director, or one that stars actors we haven’t heard of, because of the Uwe Bolls of this world. We want our entertainment to be entertaining – I know, not much to ask really – and we hate being misled by slick tricks. We see a cool poster or cover, we see a big-name actor attached, or we read a cool blurb, only to be sorely disappointed. So instead of trying something new, we stick with what we know and trust.
I guess that is why I promote books I’ve read and liked on this site. That is why we need people to review books, movies, TV shows and music. That is why we need to find people with similar tastes to make recommendations to us. If we can’t stop Uwe Boll making films, at least we can tell people about the films that are worth watching.
I bought this book for my wife when it first went on sale. When she finished reading it she immediately asked me when the sequel was being released – a year later, of course. So considering that this trilogy has since been finished and the movie has already been released, it shows just how long my TBR list is that I’ve only gotten to this one now (and even then, only as the audiobook).
There is something refreshing about a young author writing young adult novels. And it is enjoyable to have a good mix of action, introspection, character development, and social commentary. Some have criticised the five factions that form the basis of the story’s society as unrealistic… because wars over fuel would never happen in reality either – the same criticism levelled at Mad Max. What I’m saying is that people making this criticism have kinda missed the point being made.
Definitely worth a read, even for non-YA fans.
NB: This cool cover art was the reason I originally bought the book. I knew nothing about it, except that the cover looked cool and the blurb sounding like it would appeal to my wife. Cover art is really important (for me at least).
This latest video from the Ideas Channel raises an interesting point about how there appear to be more complex narratives in TV shows now.
Of course, there are several problems with this idea. The first is perception. For every Breaking Bad and Justified we have CSI Whatever and the banality of reality TV. So without some hard data on the number of shows and relative audiences, it is really hard to say how real that perception is.
The second problem is that TV shows run a continuum from pure episodic shows, where everything is wrapped up in an episode and the next episode has little to no changes evident to the characters or larger show, through to serials, which have more complex plot lines that often take at least a season to develop and resolve with character arcs building over the course of the entire series. The key word is continuum, as most shows have some aspects of the serial and episodic about them. Again, without breaking down each show on this continuum, and then comparing shows now versus the past, we don’t have any idea of what has changed, if anything has changed.
The third problem is the good old sample or selection bias, especially as it relates to our favourite shows and the shows we remember. For example, Survivor has been running since 2000 (or 1997 if you are in the UK), yet without looking that up I’d have had no idea when the show started, let alone whether it is still running. I don’t remember it because I’m not a fan. But I will still complain bitterly about the cancellation of Firefly. My frame of reference is biased, so I’m going to remember some shows more than others and think more favourably of some of the ones I remember than others.
The final problem I see is assigning time-shift technologies and marathon watching as the driver of a change in our demands for more complex narratives. The idea itself is sound, as I can’t think of anything less interesting than watching the same episode with minor changes in a marathon. That would be like watching 9 hours of hobbits walking. The recording, DVD buying, streaming, and subsequent marathon TV show watching would indeed favour shows that have more to them, that more complex narrative that keeps you pressing play on the next episode.
I don’t know that the time shifting, or recording, or DVD buying, or other methods of marathon watching, is driving a demand for more complex narratives. As I said above, I think the more complex shows lend themselves to the marathon more than other shows do. But if we assume there are more of these shows worth grabbing a blanket and a couch dent for, then I still think there are other things at play. I think we’ve seen more avenues for creativity come to the fore, such as YouTube channels, computer games, and the like, that didn’t exist a decade ago in the way they do now. As a result, entertainment such as TV shows needs to engage the audience on a deeper level. So while episodic shows like CSI Whatever are still huge, they don’t attract the same devotion and fan adoration as a good serialised show. Plus, the advantage of the more complex narratives is that they allow for more interesting characters, plot lines, etc, which in turn allows for better acting, direction, writing, etc, which creates a feedback loop that may one day cause fandom to implode due to awesome achieving gravitational singularity. I’m assuming this will happen when Netflix reboots Firefly.
NB: I hate the term binge watching and as such haven’t used it in this article. Binge implies that there is something wrong with what you are doing. There is nothing wrong with watching a TV show or movie series you enjoy, so we should stop implying there is something wrong.
Something a little bit different from Andy McDermott with The Persona Protocol; different in that Nina and Eddie aren’t being shot at in this one. But there is still plenty to enjoy about this techno-spy-thriller, in which Adam and Bianca take over the being-shot-at duties.
Andy again delivers his mix of breakneck pacing and humour that are the reason I enjoy his books so much. I think this departure from the Nina and Eddie series of archaeological adventures (is it still archaeology if they destroy most of the stuff they find?) is every bit as good, and I hope to see more of these departures from Andy.
With the rebirth of Cosmos on TV, Neil deGrasse Tyson and the team have brought science back into the mainstream. No longer is science confined to the latest puff piece on cancer research that is only in the media because a) cancer and b) the researchers are pressuring the funding bodies to give them money. The terms geek and nerd have stopped being quite the derogatory terms they once were. We even have science memes becoming as popular as Sean Bean “brace yourself” memes.
This attention has also cast a light on the scientific process itself, with many non-scientists and scientists passing comment on the reliability of science. Nature has recently published several articles discussing the reliability of studies’ findings. One article shows why the hard sciences laugh at the soft sciences, with the article talking about statistical errors. I mean, have these “scientists” never heard of selection and sample bias? Yes, there is a nerd pecking order, and it is maintained through pure snobbishness, complicated-looking equations, and how clean the lab coat remains.
As a science nerd, I feel the need to weigh in on this attack on science. So I’m going to tear apart, limb by limb, a heavy-hitting article: Cracked.com’s 6 Shocking Studies That Prove Science Is Totally Broken.
To say that science is broken or somehow unreliable is nonsense. To say that peer review or statistical analysis is unreliable is also nonsense. There are exceptions to this: sometimes entire fields of study are utter crap, sometimes entire journals are just crap, sometimes scientists and reviewers suck at maths/stats. But in most instances these things are not-science, just stuff pretending to be science. Which is why I’m going to discuss this article.
A Shocking Amount of Medical Research Is Complete Bullshit
#6 – Kinda true. There are two problems here: media reporting of medical science and actual medical science. The biggest issue is the media reporting of medical science, hell, science in general. Just look at how the media have messed up the reporting of climate science for the past 40 years.
Of course, most of what is reported as medical studies are actually preliminary studies. You know: “we’ve found a cure for cancer, in a petri dish, just need another 20 years of research and development, and a boatload of money, and we might have something worth getting excited about.” The other kind that gets attention aren’t proper medical studies but spurious claims by someone trying to peddle a new supplement. So this issue is more about the media being scientifically illiterate than anything.
Another issue is the part of medical science that Ben Goldacre has addressed in his books Bad Science and Bad Pharma. Essentially you have a bias toward positive results being reported. This isn’t good enough. Ben goes into more detail on this topic and it is worth reading his books on this topic and the Nature articles I previously referred to.
Many Scientists Still Don’t Understand Math
#5 – Kinda true. Math is hard. It has all of those funny symbols and not nearly enough pie charts. Mmmm, pie! If a reviewer in the peer review process doesn’t understand maths, they will often reject papers, calling the results “blackbox“. Other times the reviewers will fail to pick up the mistakes made, usually because they aren’t getting paid and that funding application won’t write itself. And that’s just the reviewers. Many researchers don’t do proper trial design and often pass off analysis to specialists who have to try and make the data work despite massive failings. And the harsh reality is that experiments are always a compromise: there is no such thing as the perfect experiment.
Essentially, scientists are fallible human beings like everyone else. Which is why science itself is iterative and includes a methods section, so that results are independently confirmed before being accepted.
This illustrates that when you test for something at the 95% confidence level you still have a 1 in 20 chance of a false positive, i.e. of natural variability looking like a real effect. Some “science” has been published that exploits this false positive rate by going on a statistical fishing trip (e.g. the anti-GM paper). But there is another aspect: if you get enough samples, and enough data, you can actually get a statistically significant result that is not a practically significant one. An example would be testing new fertiliser X and finding a p value of 0.05 (i.e. statistically significant) for a grain yield 50 kg/ha higher in a 3 tonne per hectare crop. Wow, statistically significant, but at 50 kg/ha, who cares?!
But these results will be reported, published, and talked about. It is easy for people who haven’t read and understood the work to get over excited by these results. It is also easy for researchers to get over excited too, they are only human. But this is why we have the methods and results sections in science papers, so that calmer, more rational heads prevail. Usually after wine. Wine really helps.
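That 1-in-20 false positive rate is easy to demonstrate with a quick simulation. This is a minimal sketch using only the Python standard library; treating |t| > 2 as a stand-in for p < 0.05 is an approximation, and all the numbers are invented for illustration:

```python
import random
import statistics

random.seed(42)

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

# Run 1,000 experiments where BOTH groups are drawn from the SAME
# distribution, so any "significant" result is a false positive.
false_positives = 0
for _ in range(1000):
    a = [random.gauss(0, 1) for _ in range(30)]
    b = [random.gauss(0, 1) for _ in range(30)]
    # |t| > ~2.0 roughly corresponds to p < 0.05 for samples this size
    if abs(welch_t(a, b)) > 2.0:
        false_positives += 1

print(f"False positive rate: {false_positives / 1000:.1%}")  # close to 5%
```

Run enough comparisons and significant-looking results fall out for free, which is exactly what a statistical fishing trip relies on.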
Scientists Have Nearly Unlimited Room to Manipulate Data
#3 – True but misleading. Any scientist *could* make up anything that they wanted. They could generate a bunch of numbers to prove that, for an example of bullshit science, the world is only 6000 years old. But because scientists are a skeptical bunch, they’d want some confirming evidence. They’d want that iterative scientific process to come into play. And the bigger that claim, the more evidence they’d want. Hence why scientists generally ignore creationists, or just pat them on the head when they show up at events: aren’t they cute, they’re trying to science!
But there is a serious issue here. The Nature article I referred to was about a social sciences study, a field that is rife with sampling and selection bias. Ever wonder why you hear “scientists say X is bad for you” then a year later it is “scientists say X is good for you”? Well, that is because two groups were sampled and a correlation with X was found, and as much as we’d like it to, correlation doesn’t equal causation. I wish someone would tell the media this little fact, especially since organic food causes autism.
Other fields have other issues. Take a look at health and fitness studies and spot who the participants were: generally they are university students who need the money to buy tinned beans and beer. Not the most representative group of people, and often they are mates with one of the researchers, all 4 of them. Not enough participants and a biased sample: not the way to do science. The harder sciences are better, but that isn’t to say there aren’t limitations. Again, *this is why we have the methods section, so that we can figure out the limitations of the study.*
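To see how much damage a biased sample can do, here is a toy simulation. This is not-science in code form: the “supplement”, the fitness scale, and all the numbers are made up purely to illustrate selection bias:

```python
import random

random.seed(1)

# Hypothetical population: fitness is UNRELATED to the supplement
# we are "testing" (the true effect is zero).
population = [
    {"takes_supplement": random.random() < 0.5,
     "fitness": random.gauss(50, 10)}
    for _ in range(100_000)
]

def mean_fitness(people):
    vals = [p["fitness"] for p in people]
    return sum(vals) / len(vals)

takers = [p for p in population if p["takes_supplement"]]
others = [p for p in population if not p["takes_supplement"]]
# With a representative sample, the difference is ~0, as it should be.
print(round(mean_fitness(takers) - mean_fitness(others), 2))

# Biased sample: recruit only already-fit volunteers for the
# supplement arm (think gym-going university students).
biased_takers = [p for p in takers if p["fitness"] > 55][:200]
sample_others = others[:200]
print(round(mean_fitness(biased_takers) - mean_fitness(sample_others), 2))
# A large spurious "benefit" appears from recruitment alone.
```

Nothing about the supplement changed between the two comparisons; only who got recruited did. Which is, again, why the methods section matters.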
The Science Community Still Won’t Listen to Women
#2 – When I first wrote this I disagreed, but now I agree; see the video below (I’d still love to hear from someone without a penis on this one). There is still a heavy bias toward men in senior positions at universities and research institutes, just like in all other aspects of society. This is gradually changing, but you have to remember what age those senior people are and what that generation required of women (quit when they got married, etc). That old guard may have influence, but I doubt it is as large as implied, and they’ll all be dead or retired soon, after which their influence will be confined to the letters to the editor in the newspaper.
And I’d question how much this influence has on “listening” to women in science, because if there is a field that values knowledge and evidence over other considerations, science is it. After seeing the video below, especially the way the question was asked, I think it is clear that the expectations placed on women create barriers into and through careers in science (racism creates similar barriers and is one I see as a big issue). So it starts long before people get into science, and then it continues through attrition.
This isn’t to say that there isn’t an issue with equality still to be dealt with. That old guard isn’t dead yet and their influence will hang around like old-people smell for a while to come. But this issue isn’t confined to science, and I think science is better placed than many other fields. I won’t go into the preferred areas of study issue, as maths, engineering, science, social science, humanities, etc, all have massive differences in their sex ratios that would need an entire uninformed rant of their own.
Fast forward to 1:01:31 for the question and NGT’s answer (sorry, embed doesn’t allow time codes).
It’s All About the Money
#1 – D’uh, and misleading. Research costs money. *This is why we have the methods section, so that we can figure out the limitations of the study.* Money may bring in bias, but it doesn’t have to, nor does that bias have to be bad or wrong. Remember how I said above that science is an iterative process? Well, there is only so big a house of cards that can be built on a pile of bullshit before it falls down in a stinky mess. Money might fool a few people for a while (e.g. climate change denial) but science will ultimately win.
Ultimately, science is the best tool we have for finding out about our reality, making cool stuff, and blowing things up. Without it we wouldn’t exist, this article wouldn’t be possible, and we wouldn’t know what a Bill Nye smackdown looks like. Sure, there is room for improvement, especially in the peer review process and funding arrangements, and science is flawed because it is done by humans, but science is bringing the awesome every day: we have to remember that fact.
After a recent discussion about gun myths, I realised that my last blog post hadn’t covered anywhere near enough of the myths that are floating around (this article will mainly be about US guns, but parallels can be drawn from the resources and science cited to other countries). This is obviously because stuff is much easier to make up than to research; just ask Bill “tides go in, tides go out” O’Reilly. One of the big problems with gun research in the US is that the National Rifle Association has effectively lobbied to cut off federal funding for research and to stymie data collection and sharing on gun violence. As a result there is a lack of hard numbers, and research often tends to be limited in scope. Scope: get it? So like a lost rabbit wandering onto a shooting range, or a teenager wearing a hoody, it’s time to play dodge with some of these claims.
Myth: Guns make you safer, just like drinking a bit of alcohol makes you a better driver.
The myth I hear most often is that guns make you safer; just like the death penalty is a great deterrent, surveillance cameras stop crime, and the internet is a good source of medical advice. The problem with this myth is that people like having a safety blanket to snuggle. What they don’t realise is that guns don’t make you safer: they make you 4.5–5.5 times more likely to do something stupid to someone you know and love than to use the gun for protection.
I want to be clear here: there’s nothing wrong with going shooting at the range, or hunting vermin. The problem is thinking that you can use a gun for self-defence, when it actually makes the violence problem worse. That gun escalates the violence because people have it there: why not use it? In response, criminals enter into an arms race and adopt a shoot-first policy.
Owning a gun has been linked to higher risks of homicide, suicide, and accidental death by gun. For every time a gun is used in self-defense in the home, there are 7 assaults or murders, 11 suicide attempts, and 4 accidents involving guns in or around a home. 43% of homes with guns and kids have at least one unlocked firearm, and in one experiment it was found that one third of 8-to-12-year-old boys who found a handgun pulled the trigger, which is just plain unsafe.
As for carrying around a gun for self-defence, well, in 2011, nearly 10 times more people were shot and killed in arguments than by civilians trying to stop a crime. In one survey, nearly 1% of Americans reported using guns to defend themselves or their property. However, a closer look at their claims found that more than 50% involved using guns in an aggressive manner, such as escalating an argument. A Philadelphia study found that the odds of an assault victim being shot were 4.5 times greater if they carried a gun. Their odds of being killed were 4.2 times greater.
It is even worse for women. In 2010, nearly 6 times more women were shot by husbands, boyfriends, and ex-partners than murdered by male strangers. A woman’s chances of being killed by her abuser increase more than 7 times if he has access to a gun, and that access could be the woman keeping one around just in case her attacker needs it. One US study found that women in states with higher gun ownership rates were 4.9 times more likely to be murdered by a gun than women in states with lower gun ownership rates; funny that.
There is also the action hero delusion that often gets trotted out when talking about guns for self-defence. The idea is that everyone is a good guy, so give them a gun and you have a bunch of action heroes ready to fight off the forces of evil. This has worked so well that all governments are thinking of getting rid of the military….
The reality is that the average person is not an action hero and would fail miserably in a high stress situation with actual bad guys. You only have to look at the statistics:
I’ve seen several examples cited of “citizens” shooting someone who looked intent on killing everyone they could (with a gun…). But in every instance the “citizen” was actually an off-duty police officer, or a person in law enforcement, or someone in the military. In other words, the people who stop mass shootings or bad-guys with guns, are trained professionals.
There have also been a few studies done that claim X million lawful crime preventions, therefore guns must be good; notably by researchers Lott and Kleck. To say that their research is flawed is like saying Stephen King has sold a few books. Lott’s work has been refuted for extrapolating flawed data. Kleck’s research has similarly been refuted by many peer reviewed articles:
Myth: Guns don’t kill people, people kill people, quite often with a gun, because punching someone to death is hard work.
If this myth were true we wouldn’t send troops to war with weapons. I get where people are coming from with this myth, because the gun itself is an inanimate object and is only as good or bad as the person using it. Yes, I did just quote the movie Shane: thanks for noticing. But here is the thing, in a society we are more than just a bunch of individuals, we are a great big bell-curve of complexity. So when you actually study the entire population you find that people with more guns tend to kill more people—with guns. In the US, states with the highest gun ownership rates have a gun murder rate 114% higher than those with the lowest gun ownership rates. Also, gun death rates tend to be higher in states with higher rates of gun ownership. Gun death rates are generally lower in states with restrictions such as firearm type restrictions or safe-storage requirements.
The thing is that despite guns being inanimate objects, they affect the user/owner’s psyche. It’s like waking up one morning with a larger penis or bigger boobs: you not only want to show them off, you act differently as a result. Studies confirm this change in behaviour. Drivers who carry guns are 44% more likely than unarmed drivers to make obscene gestures at other motorists, and 77% more likely to follow them aggressively. Among Texans convicted of serious crimes, those with concealed-handgun licenses were sentenced for threatening someone with a firearm 4.8 times more than those without. In US states with Stand Your Ground and other laws making it easier to shoot in self-defence, those policies have been linked to a 7 to 10% increase in homicides.
Now people also like to red-herring the argument against guns by pretending that video games or mental health are the problem. The NRA tried to claim video games were to blame after the Newtown shootings. Of course, we’d be able to see this relationship by looking at gun ownership versus video game playing, such as by comparing the USA to Japan:
| | USA | Japan |
| --- | --- | --- |
| Per capita spending on video games | $44 | $55 |
| Civilian firearms per 100 people | 88 | 0.6 |
| Gun homicides in 2008 | 11,030 | 11 |
Myth: They’re coming for your guns to stop our freedom and tyranny and democide and Alex Jones said so and aliens made me do it!
As I stated above, the statistics on guns and gun violence are hazy. No one knows the exact number of guns in America, but it’s clear there’s no practical way to round them all up (never mind that no one in Washington is proposing this). Those “freedom”-loving gun owners – all 80 million of them – have the evil government out-gunned by a factor of around 79 to 1. If the government were coming for the guns, you’d think they’d have done so before being this grossly out-gunned.
Yes, 80 million gun owners is a minority! I find it interesting that from 1989 to 2000 gun ownership declined from 46% to 32% of households. Ownership then rebounded to hover between 34% and 43% for 2000–2011 (notably, the high point in 2007 came after the Virginia Tech shooting, which the NRA did a lot of campaigning around), which shows why the decline didn’t continue. Now compare those rates of ownership to the recent report from the US Bureau of Justice Statistics, which sums up the rates of gun violence. You can clearly see a decline in gun violence from 1993 to 2000, followed by a plateau that has pretty much held since. This is confirmed by other studies. This is an important take-home point: all the research shows violence and gun violence is on the decline, which makes the idea that people need a gun for protection more and more ridiculous. It is also consistent with the global decline in violence, and with trends seen in countries like Australia (more Aussie stats here). On a side note, in the last lot of statistics you see that the more female, educated, non-white, and liberal you are, the less likely you are to own a gun.
So scare campaigns may work to boost sales of guns for a while, but overall, most people don’t want or need a gun. The long term trend has nothing to do with the government coming for the guns and everything to do with people realising they don’t need one and prefer to read a good book, or watch a movie, instead of going to the range.
The simple fact is that more guns in society is the best predictor of gun deaths, so it is time to rethink the reasons for owning a gun, especially if that reason is in case you have to John McClane a situation:
Within science fiction and wider society there is this idea that we’ll find aliens. I always find it funny when humans talk about discovering “other” intelligent life in the universe. It’s just a wee bit arrogant to consider ourselves intelligent. Yes, I do realise that I’m arguing that point using technology based on quantum mechanics, probably being read on a device that weighs less than 200 g and fits in your pocket, linked by a distributed network, connected by orbital satellites. Science: it works… bitches.
But I would continue my argument by saying that to some people that amazing interface of technology, that is allowing this blog post to be read around the world, might as well be explained as “magic, magic, magic, magic, magic, god did it.” I certainly couldn’t explain how quantum mechanics works, nor how it applies to communications technologies, let alone how it manages to stream all of my favourite ~~porn~~ media to my phone. Thus Arthur C Clarke’s third law – any sufficiently advanced technology is indistinguishable from magic – holds true for the vast majority of people on this planet.
Now the argument against Clarke’s third law is that technology isn’t magic. In fact, in the entire history of human civilisation, with all the things that have been attributed to magic, all the great mysteries of the universe, once investigated, have turned out to be not magic. But I’m talking about the knowledge gap between the average person and the specialist in the field who develops all this cutting edge stuff that allows other specialists to do cool stuff; like making a hoverboard. We are surrounded by everyday items that most of us would struggle to explain the concept of how they work – magnets, how do they work? – let alone understand the complexities involved – magnets, this is how they work.
Douglas Adams brilliantly satirised this idea in his novel Mostly Harmless. Arthur Dent crash-lands on an alien planet where the local humanoid populace are rather backward in comparison to us humans. Arthur comes from a planet of television, cars, planes, computers: all sorts of neat stuff. But he doesn’t know how any of it works, nor how to go about reverse engineering any of it. So he becomes the sandwich maker.
Essentially, we point to all of our human achievements to show how smart we are, but in reality most of us haven’t the first clue about any of those achievements. We just aren’t as smart as we would like to think.
Now compare this to aliens. Humans are pretty proud of having gone to the moon, cashing in on the achievement of all 12 of us who have done so, but to be visited by aliens requires interstellar travel. That requires technology we probably haven’t even dreamed of yet (warp drives possibly excepted). An alien race that can do that is so far beyond human achievement and intelligence that, I’m suggesting, even at our best we would be babbling morons in comparison to an intelligent life-form that has managed interstellar travel.
Sure, the aliens that decide to cross interstellar space may be the Cletus of their species. Their technology may actually have reached the point of sentience and doesn’t require anything of its “makers”. But think of how advanced such a species would be, not to mention how arrogant (rightly or wrongly). There is no reason for them to look upon Earth and see humans as intelligent (e.g. climate change and reality TV). There is also no reason to believe that we’d even notice these aliens. An intelligent life-form travels between star systems, has the technology for that trip not to have taken billions and billions of years, and some dude with an out-of-focus camera is going to be the only person to see them?
So I think that humans are rather egotistical to think of ourselves as intelligent life in the universe. I also think that it is arrogant to believe that an alien species would regard us as intelligent. And I think that we’d have little chance of encountering intelligent alien life unless they wanted to be encountered. This is just my view, but the main thing is, Neil deGrasse Tyson agrees with me (or is that I agree with him?):
With the new TV series Gotham currently being cast, there has been a bit of buzz around what the storyline is going to be. Unfortunately it is not going to be based upon this excellent series by Brubaker and Rucka (I should also mention the art by Michael Lark). That would actually be a great way to do a non-Batman series, especially as it would be able to use the recent Nolan films as a lead-in.
I guess people who read will be the only ones to appreciate a series focussed on Gotham city police trying to work in the shadow of Batman.
In the tradition of nutritionists – the toothyologists of dietary advice – I have developed a new free diet plan to help people lose weight. Just like all other fad diets, my diet promises to help you lose weight or your money back. And just like other fad diets, I have come up with an overly simple way of losing weight that is guaranteed to not work in the long term.
Introducing the Dodgy Kebab Diet™
The relationship between an alcoholic binge-fest and a stop for a dodgy kebab before heading home has long been known. But have you ever wondered why people don’t gain pudgy spare tires around their middles from their nights of drunken debauchery? Well, thanks to not-science and pure speculation, I have discovered that it is the dodgy kebab that keeps people thin and ready for another night of drinking away your paycheck.
You see, the dodgy kebab contains a quantum field of dietary entanglement. This means that the dodgy kebab sneaks up on all of that alcohol and redefines its aura, changing it from calories to vomit, which I call the Gastro™ effect.
Now this may work for evenings of overindulgence, but a diet has to be every day for 10 days or 1 dress size, so how can the Dodgy Kebab Diet™ work without the need to get plastered every day? Well, the dodgy kebab’s quantum field of dietary entanglement works just as well on your stomach lining as it does on alcohol.
The diet is very simple: eat one dodgy kebab per day for 10 days and I guarantee you will lose weight.* That’s it! You will feel better** and look better***.
Don’t just take my word for it: here is one of my satisfied customers:
Shane: I started the Dodgy Kebab Diet™, caught Gastro™ and lost 5kg.
With the Dodgy Kebab Diet™ you pay no money for access to my fully unqualified nutritionists (we’d have to be dieticians to be qualified). You only pay $59.95 per month for access to our extensive database of Dodgy Kebab Diet™ endorsed vendors. No need to spend every Saturday night wandering around to find the one with Gastro™. I’ve done all the research for you, compiling all the dodgy kebab vendors from the Food Safety Authority. All you have to do is send me $59.95 and I will send you the mobile phone app and simple instructions on how to get the most out of your bout of Gastro™.
What if kebabs aren’t my thing?
For an extra $9.95 I can include Chinese, Thai, and all of the least cooked chicken restaurants in your area.
Order now to avoid disappointment at eating well cooked food and gaining weight like a mug.
* We don’t guarantee weight loss on this diet.
** You will only feel better if you make it to the hospital emergency room on time.
*** Looking better is dependent upon surviving the food poisoning and proper application of makeup.
Thanks to Shane Nixon for helping inspire this diet.
Born to write? Born to be an athlete? Born to be a rocket scientist? People love to talk about “natural” ability or talent as the be-all and end-all of achievement. Since I actually own a genetics textbook – it props up my DVD collection on the shelf – and once watched someone do manual labour, I feel qualified to comment on the talent vs. work debate.
Genetics is a big, complicated topic, so I’m going to provide a facile overview of it. Genetics is that thing that means some people have higher baselines, are higher responders to training/learning, and are likely to achieve more (see this and read this for sports examples). For some the opposite is true: they have low baselines, don’t respond well to training/learning, and are likely to suck no matter what they do. There isn’t much you can do about your genetics, unless you happen to have a time machine and can play matchmaker to get yourself better parents.
But that isn’t to say that you shouldn’t try to get good at stuff. Until you are tested and start training, you don’t really know what your “ability” is. And just because you might continue to suck, you will suck less than you did before, which means you will be better than those around you who didn’t even try. Take an example from sports – because people actually do science on athletes, the arts talk about their feelings too much – athletes tend to live longer than normal because they are more likely to be fitter, which lowers cardiovascular mortality. You don’t get fit sitting on a couch, watching TV, snacking on corn chips, in your underwear: you have to train.
So let’s take this into the writing field. You may have been born with a massive brain, nimble fingers, and an imagination that rivals college students tripping on acid, but that doesn’t mean much if you never learn to read or write, are too poor to have access to writing materials, or lack the persistence to share that writing with the world. All that talent and ability counts for nothing if you don’t do something with it. You have to train. The difference between the talented individual and the untalented individual can often just be a lot of hard work by the untalented. I mean, who has sold more books: James Patterson or any of the Booker Prize winners?*
But let’s not get carried away. We have to acknowledge that any “talent” is a GxE interaction (genetics by environment interaction). Genetics, or that innate ability, is still a factor that we can’t dismiss, but so is the environment. So all of that skill development and training will come more easily, more quickly, and possibly progress further for some, but that isn’t an excuse for not doing the hard work.
* Not that I’m insinuating that winning a Booker Prize actually makes you a talented or good writer. I actually use those prize lists to figure out what not to read.
Recently I wrote about the TV shows that have been keeping me entertained, or at least giving my eyeballs some much-needed exercise. One of the TV shows I’d failed to get into was a little sci-fi series on Fox called Almost Human. It appears that the reason I’d had trouble appreciating this new show is that Fox is up to its old tricks.
That’s right, Fox is airing the episodes of Almost Human out of order. And before you ask, I did check to see if Joss Whedon was in any way involved in the show: apparently not. So Fox can’t use the “we have to dick Joss’ show around” excuse, like they did with Firefly, Dollhouse, etc.
Obviously I’m not a highly paid TV executive, so my opinion on this topic is really inconsequential. Unless, of course, viewers of TV shows – the reason TV shows are made, aside from selling ad-space – are regarded as important in any way. Sure, I don’t have a degree in TV programming, but I would have thought airing a TV show in order would be the sensible thing to do. I’m not sure if the degree at MITV, the TV university located next to MIT, can be done online yet, but I would like to see their syllabus to get some idea of the inner workings of TV networks.
I know when I write a story I always like to start with the fifth chapter, then come back to the second chapter after I’ve written six or so chapters. I especially like to do this in a story which has a lot of new stuff in it, like sci-fi, and where there is any sort of story arc. This way you can really do your best to alienate readers and confuse them.
Not being privy to the inner workings of TV networks, it is hard to say exactly why they would do this, or how often they do it. With some TV shows you just wouldn’t notice. Take a formulaic show like CSI Wherever. There isn’t usually an episode- or season-spanning storyline; dead bodies show up, someone puts on glasses after making a pun, someone wears a lab coat near some magic ‘science’ boxes, they get the bad guy to confess during a flashback. So you would never know if the episodes were aired out of order – which also raises the question of whether they had an order to begin with. This is the sort of show you could just chop and change around to suit whatever excuse is used for butchering a show. But you can’t do this to a serialised TV show.
This isn’t just about annoying and confusing viewers. This isn’t about the disdain the TV executives are showing toward the show’s fanbase, you know, those people they need to sell stuff to. This is about a lack of respect for the creators of the show, especially the writers. Someone has gone to the trouble of crafting a story, an episodic story that needs to build upon previous instalments in order to continue to attract fans. Almost Human has enough of a “stand-alone” nature to the show to not be damaged too much by the lack of continuity (WTF is ‘the wall’??) but plenty of shows have been damaged or destroyed by these sorts of airing decisions.
Bring back Firefly!
Other articles on this: