Tell Your Friends

July 3, 2012

[I posted this on July 3, 2012. It’s the last anti-religious post I published, and it nicely sums up why: religious arguments had finally completed their journey from “respectably personal” through “offensively harmful” all the way to “trivially absurd.” After five years of passionate atheism, I ran out of interesting things to say.]

A stranger walks up to you at the shops. “Big news,” they say. “The creator of the universe spoke to some humans thousands of years ago. You can read about it in this book. Make sure you tell everyone you know!”

It probably wasn’t a stranger. More likely it was a parent, a man in a church, or an early schoolteacher. But the point is, someone told you. Someone has to tell you. (And, for obvious reasons, they have to tell you while you’re still a child.) You could pick up the Bible by chance, having never heard of it, and read it from Genesis to Revelation — but without someone there to tell you this stuff is supposedly non-fiction, it would have no more effect on your beliefs regarding the origins of the universe and humankind than Homer’s Odyssey.

What do Allah, Jesus, Athena, Shiva, Jehovah, Quetzalcoatl, and all the rest have in common? Their existence is based solely on word-of-mouth. Take a moment to consider how ludicrous this is. The only way for you to find out that these powerful, supernatural, transcendent beings exist is for other mammals to make noises using vibrating strings in their throats. Mammals who still have chronic lower back problems because they only recently started walking on two legs. They heard it from this other mammal, who heard it from her father, who heard it from some old guy, who heard it from his uncle’s wife, and so on through a game of telephone stretching back millennia to a group of desert nomads. Hopefully they didn’t just make it all up!

Homer’s Odyssey at least has the distinction of being a coherent and moderately entertaining story. The Bible is a disjointed collection of myths and conflicting single-source accounts cobbled together long after any of its events supposedly transpired. Given a very large bookshelf containing a copy of every religious text in history, it would not distinguish itself in any way (except perhaps in the Most Plagiarism From Past Religions category). You’re not struck with an overwhelming sense of awe and peace when you open its pages. No unearthly voices warmly welcome you to the Word of God. It’s just pigments on processed plant matter, like the rest of the books on the shelf. It’s not until an adult hands it to you and tells you how important this otherwise unremarkable object is that it becomes anything more. Bad luck for you if your guiding adult points out the wrong one, or even worse, if the right book has never been printed in your place and time.

“Ah, but that would make it too easy! [Insert god] would basically be forcing people to believe with such undeniable evidence!” tut-tuts the theist. Well, if your god wanted to make sure there was always some plausible deniability (are they on trial?) they certainly succeeded. Next time you speak with them, congratulate them on their choice of completely unremarkable book and the way their utter silence is eerily reminiscent of the millions of other deities that don’t exist.

All those trivialities people argue over become laughable in this light. Try to remember who it was who told you which arbitrary text holds the secrets of the universe. Conveniently, you were probably too young to remember — it probably seems like you always believed, right? Yet imagine how easy it would’ve been for you to “miss out” on your religion. You can probably think of several nice families you know who would’ve told you an entirely different account of existence if they’d raised you instead.

You’re just lucky, I guess. And I know you’ll make sure your children are still fresh out of the mammalian womb when you tell them about it, because it turns out it’s getting harder and harder to believe the stuff in that book, and let’s face it: your god hasn’t put a whole lot of that unlimited power into spreading the truth. They need every advantage they can get.


Musing on Dreams

June 1, 2012

[I posted this on June 1, 2012. I’m still impressed by the ‘Bad Boys’ thing.]

Dreams are bizarre in so many ways, and many of them are subtle. Of course, it’s incredible that you can walk through walls and fly across the countryside, but it’s more incredible that you don’t question your ability to do so at the time. Utilising dreams as simulations of reality isn’t just a magnificent plot device of Inception — you can experience emotional reactions to things as though they’ve truly happened, and this can lead to some heavyweight turmoil in your brain’s more realistic nighttime renderings.

I’ve played around quite a bit with “lucid dreaming” — the act of realising you’re dreaming without immediately terminating the dream — and I’m still not sure what to make of it. It’s certainly an incredibly interesting idea, and the potential in it is unfathomably huge. Just reading about lucid dreaming will make you more excited for sleep than you could’ve believed, like you’re Neo ready to jack into the Matrix. Yet it would seem that being able to lucidly dream with any reliability requires a great deal of effort in waking life, as the art’s core technique involves reversing your assumption of reality — instead of constantly assuming you’re awake, which carries over into sleep and makes lucidity difficult, you start assuming you’re always dreaming and continually perform a routine of “reality checks” that will hopefully transfer into true dreams and jog your “awareness”.

Of course, this isn’t a very clear distinction. Lucid dreaming guides often suggest things like holding your breath, testing light switches, and checking for abnormalities in your hands as “easy” reality checks. Why should these work? If you’re standing in a room you’ve never seen before, in the midst of strange happenings, is a broken light switch really going to “spark” awareness? Finding yourself somewhere with no memories about how you got there — i.e. every dream — should be a reality check in itself. If you can remember how a light switch works or what your hand looks like, why don’t you question how you got into this unusual situation in the first place? I suppose you could make an argument about declarative versus procedural memory, the same way people with amnesia remember what their hand should look like and presumably how light switches operate, but I can recall many examples of seeing basic violations of the laws of physics in dreams without jolting into lucidity.

Enough about lucid dreaming. I want to comment on other aspects of memory seemingly warped by dreams. A few nights ago, I dreamed about the Game of Thrones episode “Blackwater”, which was going to air the next day. My version of the plot involved military bases, helicopters, and cars. As I drifted toward wakefulness, I reflected on the dream I’d just had, and realised that Game of Thrones clearly would not feature most of these things — with the exception of helicopters. Yes, I knew I’d been dreaming and that the “episode” I’d just “watched” wasn’t real, but for some reason I held the notion that the show had indeed used helicopters in real past episodes to somehow make up for other things in the books that couldn’t be translated onto the screen. It was only when I actually got out of bed that I realised how wrong this was. It was a strange half-awake state of mind.

There’s also a lot of confusion about where the contents of dreams come from. Often it seems as though dreams are influenced by the most insignificant events from the day before, and I experienced this recently as well. At one point in this dream, I drove past a park here in western Sydney that I’d often visited in actual childhood. I couldn’t figure out when this park had been on my mind lately. (Hopefully it goes without saying that I do not hold the ridiculous Freudian notion that our subconscious desires are being expressed in dreams.) Then I remembered something. The day before, I had read an article on RottenTomatoes about the best-rated films starring Will Smith. One of them was Bad Boys. I always get the song “Bad Boys” confused with the song “Bad to the Bone.” And I very heavily associate “Bad to the Bone” with this park from my childhood, because my cousin had once sung it while we were there. (Just one of those occurrences you happen to remember forever.) Of course, this could all be overinterpreted rubbish, but the fact that I made the connection quite swiftly demonstrates how strong this association is for me. Remembering the park in my dream led me directly to thinking about the film Bad Boys, at which point I realised I’d been reading about that film the day before. Yet the park appeared in my dream when the film did not.

Finally, some science. There are a few competing theories of memory, but the dominant (and my favourite) model is the parallel distributed processing (PDP) model, also referred to as the connectionist model. In PDP, memories are not represented by neurons themselves, but rather by the pattern of connections that develop between them. Well, that’s not quite true — many neurons are “connected” in the sense that they have the ability to trigger other neurons’ action potentials, but connections that are used more frequently become stronger, making it easier for them to be used again in the future. (As the saying goes, “neurons that fire together, wire together.”) My association of the park with Bad Boys, via the song “Bad to the Bone”, would be an example of overlapping connections. (Check out the Wikipedia articles on connectionism and neural networks.) Dreams strike me as the type of thing we’d expect to happen if we allowed ourselves to just flow through these networks from one node to the next, with more recently reinforced connections being more likely to be used, and somehow this connects with our perceptual systems as well.
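If you’ll forgive a toy model (the node names and connection weights below are entirely my own invention, not a real PDP simulation), the idea of flowing through the network along the strongest connections can be sketched in a few lines of Python:

```python
# Invented associations, weighted by how strongly they've been reinforced.
connections = {
    "park": {"Bad to the Bone": 0.9},             # cousin sang it there
    "Bad to the Bone": {"Bad Boys (song)": 0.7},  # the two songs I confuse
    "Bad Boys (song)": {"Bad Boys (film)": 0.8},
    "Bad Boys (film)": {"Will Smith article": 0.6},  # read the day before
}

def strongest_chain(start, steps):
    """Follow the strongest outgoing connection from each node in turn."""
    chain = [start]
    node = start
    for _ in range(steps):
        links = connections.get(node)
        if not links:
            break
        node = max(links, key=links.get)  # strongest association wins
        chain.append(node)
    return chain

print(strongest_chain("park", 4))
# ['park', 'Bad to the Bone', 'Bad Boys (song)', 'Bad Boys (film)', 'Will Smith article']
```

Strengthen a connection slightly every time it fires and you have a crude version of that “fire together, wire together” rule; my park-to-Bad-Boys chain is just the strongest path through a lattice like this.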

One more thing to think about: how long has it been since you watched Inception?

Circus Acts

March 14, 2012

Death is the fundamental problem of life. To entities like us, defined by structure, the decay of our components’ ability to rigorously maintain that structure will lead to our inevitable submission to entropy. The brain and body are a highly attuned circus: large groups of performers gather to form ‘cells’, and these cells train in a specific type of act. Often, a cell might miss practice, or get food poisoning. They might make a mistake, and learn to do better or else be replaced. But when our cells begin to drop their batons en masse – when everyone is too tired and everything goes wrong at once – the circus collapses. The performers shrug and head off in search of a new one.

Now here’s a peculiar fact. You’re already dead. Decomposed. Ashes scattered on the wind.

Your circus line-up has changed a little since yesterday, when it was different from the day before. Though it has the same name and may be similarly entertaining, your band of performers is almost unrecognisable from the day it started up. The small idealistic young troupe that got things rolling decades ago are already a distant memory to the current line-up. The founders aren’t dead – all performers are immortal – but have long since moved on to new horizons.

And if you’re young, you’ve met at most a tiny fraction of the performers who will comprise your Final Act. They’re currently doing an incredible variety of other shows, all across the planet and maybe even beyond. More than a few will be performing at circuses not too unlike your own. Some may be headlining as undiscovered life forms in the deepest ocean. All undeniably have impressive resumes – be it participating in the brain of Einstein, managing a spiked plate on a stegosaurus, or – for a lucky few – contributing to the planet’s first-ever life forms. All of them were around when our sun was born, and have been in stars and supernovae themselves – the universe’s biggest shows. Soon, they’ll join your circus for a short time, and move on again.

Your dead body is in the same place now as it will be fifty years after your death. Almost all of the atoms that will comprise you on the day you will die are currently scattered around the world. With omniscience and enough patience, you could walk out your door right now and collect them all, and stare at your actual decomposed dead body. But then, of course, your future would change, and it wouldn’t be your dead body anymore.

Never forget the performers. If they had memories, perhaps they’d consider the time they will eventually spend as you to be particularly pleasant.

The Decision Tree

January 5, 2012

Eleven months ago, I authored a post titled ‘Whatism?’, in which I discussed the odd futility of defining layers upon layers of subcategories of ‘belief’ where particular objects are concerned. My focus was largely on the fragile distinction between atheism and agnosticism — a boundary, I argued, that ceases to exist the moment the metaphysical conversation housing it ends. These lyrical waxings aside (I understand we all love them), proponents of either ‘disbelief system’ cannot be distinguished from one another by their actions. You can lead the life of an agnostic or an atheist and no one would know until you explicitly told them. But why should they?

Agnosticism is the belief that the truth/falsity of some premise/proposition cannot be determined. This is often a completely justified belief — at no point will I ever criticise the position of agnosticism. Any entity or event that does not interact with physical matter in any way — the invisible, permeable unicorn in the room, for example — warrants an agnostic position. (Yes, even the elusive consciousness interacts with physical matter, as brain-damaged patients have demonstrated.)

Let’s use the word agnosum to denote anything deserving of agnosticism. The unicorn, or the ethereal deistic god, would be an example of something that is an agnosum by definition — we are told that it can’t be sensed in any way, directly or indirectly, and so there is no conceivable evidence that could sway our belief in its truth one way or the other. (Evidence being another premise/proposition with a truth value that is systematically related to the truth value of our original one.) Fortunately, we don’t need to fret over such an agnosum because — by the same definition — it can never affect our universe or our lives in any way.

Now let’s say that our chosen premise has passed the challenge of agnosticism, and we have a hypothesis about potential evidence that could sway our best estimate of its truth value. We’ve moved into the realm of skepticism (the real definition of which is not a euphemism for doubt or disbelief). We have some idea about how our premise can be taken to trial, but haven’t yet made a judgement. We perform an experiment to test our hypothesis. If our hypothetical evidence is found in our (rigorous and peer-reviewed) experiment, then we have just learned something about the premise’s truth value. We’re never going to be one hundred percent certain about it — and that’s where statistics and Bayes’ Theorem come in — but we’ve certainly made some headway.
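For the curious, here’s a toy sketch of what that Bayesian headway looks like, with likelihoods plucked from thin air purely for demonstration:

```python
# Bayes' Theorem:
#   P(premise | evidence) = P(evidence | premise) * P(premise) / P(evidence)
# The numbers below are invented; only the shape of the update matters.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of the premise, given the evidence."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Start on the fence (skepticism), then fold in two supportive experiments.
belief = 0.5
for _ in range(2):
    belief = bayes_update(belief, p_evidence_if_true=0.9,
                          p_evidence_if_false=0.2)

print(round(belief, 3))  # 0.953: considerably more confident, never certain
```

Each experiment nudges the estimate toward one, but no finite amount of evidence ever takes it all the way there — which is exactly why the leap from skepticism to ‘belief’ is a judgement call.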

At some point — this point is an area actually deserving of debate — we feel we are justified in abandoning our skepticism and finally accepting our premise as a belief. Yet this is the most dangerous part of the process, because we become complacent and drop our guard. The belief stops being strange and novel, and becomes normal and encultured. It becomes difficult to let go. We start ignoring the new evidence, while our fuzzy memories reinforce the strength of the old. There’s no way it could be wrong, you think. This can happen surprisingly quickly, and it’s a fact of the fallible human mind that has happened and will happen to each of us time and time again.

How does this tie back to our original focus on atheism versus agnosticism? As I’ve said before, there is a theoretical distinction between the two stances. It just doesn’t matter. If I start with the fanciful premise that there is a second moon orbiting Earth, orbitally locked to the other side of the planet, it doesn’t take a very sophisticated experiment to gather evidence of the premise’s falsity. Among the many effects we would expect of such a celestial body would be strange tidal fluctuations when the moons line up, and we do not observe these. We can quickly shift from skepticism to belief in the falsity of the second moon (or as it is more commonly put, ‘disbelief in the second moon’).

Yet in what way does our invisible unicorn agnosum differ from our demonstrably non-existent second moon? By definition, the unicorn has precisely the same effect on our universe as that moon — that is, none at all. This is my argument, in its purest form. No sane person bothers with the distinction between agnosticism and disbelief when it comes to Peter Pan, or fairies at the bottom of the garden. Cynically, I might suggest that the reason we do bother with gods (and rarely ghosts and aliens) is that we’re roped into so many dreary discussions about them that it becomes a way to alleviate philosophical boredom — like getting into some fanatical organising at home when there’s nothing else to do.

My opinion? A much more productive pastime would be rewinding the reel and subjecting these beliefs to the agnostic and skeptic filters that were clearly taking an inopportune day off. For obvious reasons, there aren’t any surviving religions that instruct parents to wait until these filters have developed in their children. Cutely-dressed babies are dipped into metaphorical holy water before they can even stand.

If you were one of those children, imagine how you would have reacted to your religious beliefs if those mental floodgates had been allowed to close before the water leaked into your brain.

Take a Step Back

December 28, 2011

As we perused a museum of natural history recently, a friend remarked to me on the travesty of what he termed “premature dinosaur exposure”. Our young, in his view, are educated about these long-extinct reptiles far too early — the result being an implicit association between dinosaurs and children. When they move on to fascinations with dragons and aliens, dinosaurs are pushed back to their eternal status as a thing of the past.

I have to agree with my friend on this. Imagine being twenty years of age and seeing the complete skeleton of Tyrannosaurus rex for the first time. It is doubtful that you would believe it to be genuine. Yet in our reality, a typical young adult would find such a thing bordering on mundane. Fossils are dull, museums are boring, and dinosaurs are cartoon creatures for little boys. Focus, really focus, and you might be able to take a step back from this inexplicable reaction to some of the most incredible creatures to have walked the ground you now stand on. Try to imagine dinosaurs from the perspective of a completely naive visitor to our place and time.

This approach is warranted in many other aspects of our lives. In some cases, like the dinosaurs, we should use it to experience the full, awestruck reactions that certain events deserve. The computer you are reading this on could most likely be used to instantly communicate with a large percentage of the human population, provided you know their name. The blue and green globe you see in pictures from the International Space Station is the home of every non-astronomical event you have ever heard of, and countless more that you never will hear of. The small white planetoid that hangs above the night sky has been walked on by people who are still alive today. If you live in Australia, you are closer to the pitch-black ocean floor of the Atlantic, 12,000 kilometres beneath your feet, than to its surface — and everything you hear about in America happens almost upside-down relative to you.

Taking a step back from these concepts can provide you with a fresh appreciation. But in other cases, we should do so not to experience awe, but confusion. Just as we are ingrained with the false “normality” of walking on the Moon, most of us have been ingrained with the “normality” of the belief that an old book holds the words of a supernatural force. “Miracle”, “religious”, “divine”, “sinner”, “resurrection” — all words we have heard enough times so as to become mundane. Even a non-believer might implicitly accept another’s belief in the story of Noah’s ark — in which water appears from nowhere to flood the entire world, killing everyone except one man and his family — as more “normal” than someone’s belief in the Wall of Fire — in which a race of aliens called the Galactic Confederacy brought billions of humans to Earth, piled them around volcanoes, and destroyed them with hydrogen bombs. Of course, both of these stories are equally ludicrous, and to a completely naive modern-day person, ignorant of the religious and science fiction genres, both would be rejected.

We look to modern fantasy stories for entertainment, and to ancient fantasy stories for holy revelations about the universe. Clearly there is something off about this particular accepted “norm” of our culture. Some of these ancient stories, like the Books of Ezekiel and Revelation, are particularly indistinguishable from science fiction. “Gods”, uniquely among humanity’s rich catalogue of mythical creatures and characters, are considered normal to believe in. This is why I no longer partake in debates with proponents of specific religions. The name or personality of your god, let alone the details of your arbitrary dusty book, are unnecessary embellishments of a concept that is only considered sane because we have grown up in constant exposure to it.

The biggest mistake anyone can make is cherishing their own beliefs. There is no reason to ever become attached to a piece of information you have picked up on your way through life. Never stop looking for reasons to discard it. Make a regular habit of taking a step back, taking stock, and taking measures against information that does not hold up.

Top Games of 2011

December 21, 2011

2011 has been a colossal year for video games. Since it would be even more impossible than usual to order a complete ranking of the year’s top titles, I’ve instead elected to organise the best fifteen games into four tiers of overall quality. If you’re looking for recommendations, aim for the higher tiers, but I’d be reluctant to advocate any entry over another in the same tier. Let’s get started.

Tier Four

The Binding of Isaac — An unusual Flash-based dungeon crawler from the creators of Super Meat Boy, featuring permanent death, randomised maps and items, and disturbing imagery. You control Isaac, who escapes into the basement when his mother attempts to sacrifice him to God. Isaac can move around the Zelda-inspired rooms and shoot tears in the four compass directions, while collecting keys, bombs, and other powerful items that shake up the gameplay. A single playthrough of the game (assuming you aren’t killed) takes around one hour, but you’ll have to finish the game a minimum of ten times to see events unfold all the way through to the end. (A Halloween update also added a brutal extra level called ‘Sheol’, in which you must fight Satan himself.) This was the game I loved to hate during my exam period, as it turned out to be far more addictive than the short break from study I had intended it to be — Steam tells me I’ve put 20 hours into it. Definitely worth the $5, but still nowhere close to the free fun of Super Meat Boy. Danny Baranowsky’s great soundtrack also becomes tragically repetitive, as do the sounds of Isaac’s tears and enemies, so you’ll want to turn off the sound after a few playthroughs.

Bastion — Another independent Steam game with a cheap price tag. Bastion is a colourful isometric shooter, famously narrated all the way through by a gravelly-voiced old man who, amazingly, never gets annoying. You’ll unlock some impressive weapons as you blast your way through the five-hour story, and there are plenty of bonus levels, achievements, and difficult challenge levels to come back for after you’re done. Bastion was a blast the whole way through, but like Isaac, was never intended to compete with the bigger titles. Hard to beat this kind of value for money, though.

Gears of War 3 — Many will tell me that Gears 3 deserves a higher slot than this, and maybe they’re right. I never actually owned Gears 2 and picked this one up mostly on a whim, as there weren’t many other major releases around. While I certainly didn’t have many complaints about the ten-hour campaign as I was playing it through, and it’s still a great game for two players, Gears 3 just didn’t stick in my memory a month later. The game’s look and gameplay have barely changed since the original all the way back in 2006, and the story, while satisfying, can’t compete with the original brutal vision of Sera presented in Gears of War. I don’t consider the Gears series among my top game series, and I’m not interested in the online multiplayer, but the game was easily worth my $50 and deserves the love it receives from many others.

Call of Duty: Modern Warfare 3 — Another series of games that needs no introduction. MW3 is the eighth entry in the long-running acclaimed shooter series, and in contrast to Gears of War 3, had an epic and enjoyable campaign that certainly captures the feel of a third world war, along with providing a satisfying end for the characters we’ve become surprisingly attached to. I haven’t yet touched the online multiplayer or the apparently extensive Spec Ops missions, but I’m sure I’ll get around to it and enjoy it. So why not the higher rating? Two reasons. The first is the ludicrously short length of the campaign, which I finished in four hours on Regular (normal) difficulty. Like I said, it’s very well done, but I’m a single-player gamer and I need more for the price tag. Nevertheless, I’m eager to get back into it on Veteran, and I’m aware it’ll take me a lot longer when I’m being killed every few seconds. The second reason is the soundtrack — I consider the soundtrack to Modern Warfare 2, composed by Hans Zimmer and Lorne Balfe, to be an utter masterpiece. The soundtrack to MW3, by action-film composer Brian Tyler, doesn’t even come close, and for me this is a huge deal. That I consider MW3 to be the worst of the ‘trilogy’ while including it alongside Gears 3 should be taken as reinforcement of the quality of its predecessors.

Tier Three

Bulletstorm — An innovative and raucously fun first-person shooter that doesn’t take itself seriously while providing incredible thrills with fantastic weapons. Bulletstorm is sheer fun from beginning to end, and succeeds largely on its inventive system in which more money is awarded for creative kills, making the outdated visuals and light-hearted story easily forgivable.

L.A. Noire — L.A. Noire is one of the most unusual major titles of the year, straight out of a development studio here in Bondi, Sydney. You play as Cole Phelps, a Los Angeles officer in the 1940s who quickly rises through the ranks as he tackles unusual cases across the City of Angels. The game is extremely convincing and one of the most realistic historical experiences I’ve ever had — the advanced motion-captured visuals are particularly amazing, and the diverging cases and clues are very well done. I’ve never been a huge fan of the Grand Theft Auto series, and for me, L.A. Noire hits the right note of story emphasis that wasn’t there in GTA while also not forcing you to play as a murderous thug. I borrowed the game from a friend and had to return it before I was able to finish the story, but I’ll be getting my hands on it again as soon as possible.

Deus Ex: Human Revolution — Another title where the world is as much a star as the main character. Human Revolution is ambitious in its dark, dystopic view of Earth’s future and features a world where human augmentation is a controversial runaway technology. Players are free to tackle the story and missions in a variety of play styles, from all-out gunfights to stealth — there’s even an achievement for finishing the entire game without taking a single life. It’s rare that I’m interested in examining every detail and object in a game world as large as this one (you visit several wide cities during the game), but I couldn’t get enough of the world of Deus Ex. Sadly, it’s another game I didn’t finish before the next slew of releases, and I aim to return to it to see it through.

Portal 2 — Portal 2 was one of the most acclaimed titles of the year, and it’s certainly deserving of that honour. I doubt there’s anyone left unfamiliar with the premise, but to restate it, you tackle fiendish first-person puzzles armed only with a gun that can create two linked portals on any flat, hard surface. Portal 2 has a fantastic story with hilariously voice-acted evil robots on all sides, as well as a fun set of co-operative puzzles completely separate from the single-player game, but in the end still didn’t feel like a ‘full’ enough game to compete with other titles with more content. A blast while it lasts, though.

Dark Souls — Dark Souls, RPG dungeon crawler and successor to the notoriously hard Demon’s Souls, is the most recent addition to my collection and a title I’m playing at the moment. I can’t say Demon’s Souls was a favourite game of mine, but I appreciated the skill-intensive gameplay, the punishment of excessive risk, and the ghostly online interactions with other players. I like Dark Souls better than Demon’s Souls, and I’m nowhere near finished with it, but the problem remains that I’m never sure if I’m enjoying the game or just enduring it — not many games can make me so frustrated that I accidentally hurt myself while venting on a lounge cushion.

Tier Two

Assassin’s Creed: Revelations — I believe that AC:R has the lowest aggregate Metacritic score of any game on this list; somewhere around 80. The main problems that reviewers have with it seem to be that it is ‘too much of more of the same’, and they’re also critical of Desmond’s puzzle sections and the newly added ‘den defence’ minigame that must be undertaken whenever the Templars attack one of your strongholds. In response to the first, I’d say that it’s a matter of opinion, but I’m a huge fan of the AC series and have not been let down at all by the prospect of ‘more of the same’ so far. In my opinion, the additions of the hook blade, bomb customisation (which is actually good!), challenging stronghold takeovers, and Altair missions are all terrific and well-implemented. I’m also enjoying the story as much as any of Ezio’s previous entries. In response to the complaints about the Desmond and den defence sections, I’ll start by pointing out that both of these are completely optional. You unlock Desmond’s strange Portal-esque, Animus-powered puzzles by collecting some of the 100 hidden ‘data fragments’ in Constantinople (which is the greatest AC city yet, by the way). And the den defences only occur if you let your notoriety reach 100%, and even then, it takes a criminal act to trigger them. In any case, I don’t think either of these two elements is nearly as bad as it’s being made out to be. Revelations is another successful and completely immersive entry in this series, in my books — it’s amazing how much it can suck you in, even as a third-person game.

Dead Space 2 — It’s hard to believe this was also the year we saw Dead Space 2, all the way back in January. I’m not the biggest fan of horror games, but when they’re accompanied by bleak, hard science fiction as Dead Space is, I’m all for it. Shooting the limbs off the monstrous, zombified human inhabitants of ‘the Sprawl’ is even more fun than it was in the original 2008 game, and there are plenty of new enemies and incredible set pieces to boot. The story is bigger, better, and longer, and I still highly recommend playing on the hardest mode you can stomach (though you won’t find me attempting Hardcore mode, which in addition to being brutally difficult only allows you to save three times across the entire game).

The Elder Scrolls V: Skyrim — Despite professing not to be the biggest fan of TES IV: Oblivion from 2006, I still ended up putting a good 80 hours into it. I’m pleased to report that Skyrim fixes essentially every problem I had with Oblivion and includes plenty of new features, along with a game world that is even better to look at than Cyrodiil. First and foremost among these improvements is the levelling system, which no longer requires painful amounts of effort to fully exploit. Attributes have been removed in favour of a system where simply levelling up enough skills pushes you to the next level, at which point you can boost your health, magicka, or stamina. Dragons are impressive, magic is much more satisfying, stealth is no longer a binary state of detected/undetected, and most dungeons have a distinctive feel. So why not Tier One? It’s not because of the glitches, which haven’t bothered me at all, but because I’m struggling to find the motivation to push beyond 60 hours of play time. I’m nowhere near finished with Skyrim, and I can’t explain why I’ve lost a bit of interest, but I do know it’s not quite on par with my Tier One titles.

Uncharted 3: Drake’s Deception — Uncharted 3 was my most anticipated game of the year, and by no means was it a letdown. The story is an incredible adventure from beginning to end, with a few fantastic new characters and a couple of tweaks to the near-perfect gameplay of Uncharted 2. There are even more incredible set pieces (I won’t do them the injustice of spoilerage), multiplayer is still a bit of fun, and there’s a new split-screen cooperative mode that I sadly haven’t had the chance to try out, but I’m sure it’s great as well. If Uncharted 2 didn’t exist, then Uncharted 3 would easily be in the next tier, but it can’t avoid the comparison to its predecessor — and in my opinion, Uncharted 2 has the better single-player mode. The ending of Uncharted 3, as many will tell you, was much too sudden and left an uncomfortable number of unanswered questions. Four of its twenty-two chapters were spent building up to the cruise ship segment from the E3 demo, and the story was essentially put on pause during these tangential chapters, which is a shame.


Well, here we are in Tier One. For me, there were two clear winners in 2011 — two games I enjoyed more than any other. If I were forced to choose a ‘Game of the Year’, I’d probably be able to pick one, but not by a large enough margin to warrant an additional tier. I’ll write about my Tentative Runner-Up to Game of the Year for 2011 first, and then get onto my Tentative Supreme Hallmark of Video Game Excellence for 2011.

RUNNER-UP: Xenoblade Chronicles — Xenoblade Chronicles is a Japanese RPG that still hasn’t been released in the US, and won’t be until April next year. As such, it’s been tragically overlooked in this month’s 2011 awards across the Internet. It’s a masterpiece of an RPG that takes me back to the lengthy, inspired stories, beautiful worlds, fantastic characters, and thrilling combat of the glory days of Final Fantasy X and Final Fantasy XII, two of my all-time favourite games. Of course, I’ve already written extensively about XC in a long post in which I compared it to the disappointing Final Fantasy XIII, so I don’t have much to add. I’ve played a lot more of XC since then — around 60 hours now — and I haven’t lost my motivation for it the way I lost interest in Skyrim. The only reason I put it down back in October was my exams, and I just picked it up again today after exhausting November’s barrage of titles. Of course, XC is about as different an RPG from Skyrim as you could imagine — they showcase the best efforts of Eastern and Western developers, respectively — but Xenoblade Chronicles’ strong focus on what Skyrim would call ‘the main quest’ makes it the clear winner in my eyes.

GAME OF THE YEAR: Batman: Arkham City

I’m a huge fan of Batman, though I’m sure many will consider it strange (if not blasphemous) that my first exposure to the Caped Crusader came from watching Batman Begins on TV in the weeks leading up to the cinematic release of The Dark Knight, in July 2008. I can’t explain why, but watching Batman Begins for the first time — specifically, on-air — was the best film experience I’ve ever had. The score, the atmosphere, the story…it all just worked for me. So when Batman: Arkham Asylum turned out to be a near-masterpiece in 2009, I named it my second-favourite game of the year (behind Uncharted 2: Among Thieves). But this year, with the release of the sequel and complete masterpiece Batman: Arkham City, Batman has edged out Nathan Drake and thirteen other worthy titles for the top spot in my heart.

Developer Rocksteady’s first masterstroke was to limit the size of the ‘open world’ of Arkham City. Instead of bloating it out to the size of Skyrim or Liberty City, they put their efforts into making a moderately-sized sandbox in which every square centimetre is memorable. There are Riddler secrets hidden around every corner, a dozen deep side-missions involving a host of entertaining Batman villains and challenges, extra content playable as Catwoman, and vulnerable thugs patrolling every dank street, just waiting to be snatched up into the air by a man dressed as a giant bat. Combat is also, amazingly, even smoother and more enjoyable than it was in Asylum, thanks to a host of new moves and the improved ability to counter the attacks of multiple enemies at any time. So it’s no surprise that the endless combat and predator challenges available outside the main story mode are also even better — you can also unlock Robin as a playable character, and he doesn’t disappoint.

I started a second, harder playthrough of Arkham City literally as soon as the end credits to the brilliant story had finished rolling, and I can’t say that about many games. Batman: Arkham City is a masterpiece from beginning to end, and both it and Xenoblade Chronicles have become two of my all-time favourite games alongside FFX, FFXII, BioShock, Halo 2, and the Metroid Prime series.


Well, that concludes my miniature awards ceremony. Hopefully I’ve done a decent job of recommending these titles to any fans of games who missed out on them. Of course, there are plenty of games I didn’t actually get to play this year — the biggest miss being Skyward Sword, the new Zelda game, which is already widely acclaimed. I’ll be picking that up as soon as I get the money, and I might add my opinion of it to this list. I’ve also heard good things about Rayman: Origins, Sonic Generations, and Battlefield 3.

That’s all for now. If you read through all 3000 of these words, I probably owe you a copy of Arkham City. Buy it and leave your bank details and password with me, and I’ll reimburse you.

Structural Souls

•December 13, 2011 • Leave a Comment

Particles come and go, through the skyscrapers in our cities and the bones in our bodies. We are built from the same sub-microscopic stuff as the fuel we consume to keep our cogs whirring — notice how almost everything we eat was once biological tissue? — yet it is through this stuff that people often sift, searching for the ‘essence’ of what makes us not only alive, but somehow more alive than the rest of Earth’s biosphere.

Frequently this leads introspectors to the aether-like phenomenon of souls, impermeable ghosts that give rise to consciousness by puppeteering our material bodies. Conveniently, this paves the way for an equally ethereal afterlife. I, and others, have written much on how this dualistic approach is self-refuting and — arguably worse — explains nothing. But why does the concept sprout up all across human space and time? And if it is untrue, then what is going on when we die?

The secret lies in structure, the source of our complexity. Dualists are correct in the belief that the molecules comprising us are not especially important in defining us — rather, it is their precise and evolutionarily-tuned arrangement. As our technology evolves in parallel, we are learning the beauty of using more and more abstract structures to relay information. I can now carry the complete works of Beethoven around in my pocket as a series of ones and zeroes. If I had the patience and memory span, I could even write that number sequence down on a piece of (very long) paper. Not a very useful form for the information to be in, but a form in near-bijection to the original music nonetheless, and the same method that a theoretical teleporter might use to beam our structural information from one point to another at the speed of light.

Our DNA is literally an instruction manual for mindless cellular structures to build every part of us. The Human Genome Project famously read the entirety of this instruction manual, and you might be surprised at how short it turned out to be — yet we understand so little of the organic complexes it is used to construct, especially the brain. When we reproduce, a new instruction manual is written by taking random words from both the mother and father, along with a handful of copying errors (that this process is flawed is a glorious, glorious fact).

It is not particularly important that the mortar used to piece you together and keep you running day after day may have come from the ham sandwich you ate for lunch. Death is simply the result of this maintenance coming to an end, either because of decay over the decades or because of extreme trauma. Some parts of the body — particularly the blood-hungry brain — notice this much quicker than others.

Your cells stop maintaining your structure, and you decompose into dust with the help of Earth’s other creatures.

Hopefully you weren’t expecting a different ending.