In my book Dungeons & Desktops, I wrote in the introduction that I think CRPGs are the greatest learning tools ever designed. To my shame, however, I did not properly defend that statement--at least, not directly. While I think most of us would agree that the basic mechanics of a CRPG teach us valuable transferable skills like resource management, long-term planning, team management, statistical analysis, and so on, what makes them better than other learning tools, including other types of videogames?
First off, I need to offer some background on the scholarship on all this. It's a bit dated now, but I usually like to start discussions of videogames and teaching with James Paul Gee's book What Video Games Have to Teach Us About Learning and Literacy. I've taught this book for a number of years, so I think I can break down what he says fairly succinctly. Gee's biggest problem with our current educational system is that we don't do a good enough job connecting what students do in the classroom to what they'll do when they leave it. The knowledge they gain (or attempt to) is not tied in any obvious way to what Gee calls a "semiotic domain," defined as "any set of practices that recruits one or more modalities to communicate distinctive types of meanings" (18). As examples, he cites cellular biology and literary criticism. The key for Gee is that becoming a member of a semiotic domain like cellular biology isn't just a matter of memorizing a bunch of facts. It's really more a matter of learning the "distinctive social practices" of a cellular biologist: how do they actually go about their work, make and discuss discoveries, and so on? They in fact see the world in a different way than the rest of us, they are part of a specific social group that we aren't eligible to join, and they are prepared to innovate (or deal with innovations and changes) in their domain. In short, cellular biology, just like any academic field, is not just a matter of content, but rather a social group with a very distinctive and refined set of social practices. You can't take the people out of the equation and still have cellular biology. Yet taking the people out and just teaching the content is what most schools do, and it's the least valuable thing you could be doing. Since these fields are constantly changing, it's more important to learn their methods than any specific body of content.
This makes sense on a common-sense level; it's better to learn how to use the scientific method and think like a scientist than to memorize a thousand isolated facts or equations. Indeed, given enough time and resources, the kid who knows the scientific method will discover all of science. The kid who just memorizes a bunch of fancy-sounding words and definitions won't know a legitimate scientific theory from the wildest quackery.
For Gee, videogames are great teaching tools because they teach people to "situate meanings through embodied experiences in a complex semiotic domain and meditate on the process" (26). He goes on: "Video games situate meaning in a multimodal space through embodied experiences to solve problems and reflect on the intricacies of the design of imagined worlds and the design of both real and imagined social relationships and identities in the modern world" (48). Whenever you play a game you haven't played before, you always begin by learning how the gameplay mechanics work: in what ways can you interact with the character? How can he or she interact with the world? What does the game engine, in effect, allow you to do? What strategies will help you defeat the enemies, and what just gets you killed? Gee compares this to the scientific method; we probe the virtual world, form hypotheses, re-probe, and reflect on results. I think Gee is exactly right about this, since that's exactly how I play.
The concept of "embodied" is also important here. There are many philosophers who insist (I think rightly) that true knowledge is always embodied and to some extent tied to our existence in actual bodies. You can't just teach a computer (no matter how sophisticated) to have a conversation, for instance, because our ability to communicate is never just a matter of learning and applying rules. Even if you had fifty billion lines of code filled with rules about what to say when somebody says something else, it wouldn't amount to understanding. Most of us wouldn't be able to explain, for example, why "the four young French girls" is correct and "the young French four girls" is wrong, though hardly anyone gets this wrong in everyday speech. A lot of our ability to communicate comes from our ability to feel when something is right--and feeling, of course, is a rather vague and fuzzy notion to try to program. Even computer designers, though, have to resort to "look and feel" to describe a lot of what they do, and it's vital that we understand technology as made by humans and not autonomous. That's why it doesn't make much sense to use terms like "Technology" and "The Computer," since every computer is marked by different personalities. We all know this, of course, since we've had experiences on different platforms and realize the "feeling" is totally different.
The main thing to keep in mind here is that whatever videogames teach us, they do it in a situated context. For instance, if you're playing X-Com, one thing you need to learn how to do is move strategically, so you don't just have your guys out in the open with no cover. In a classroom, you might learn about this by memorizing some facts presented on a PowerPoint slide. Ineffective. In the game, you learn it by witnessing what happens when your guys are out in the open. When you try moving them to cover in the correct way, you see that it works, and thus you've learned something in a "situated" and "embodied" context.
In his book Persuasive Games, Ian Bogost makes some good points about why games (or at least simulations) can communicate certain types of information better than any other medium. I think his chapter on advertising can elucidate some of these points. He talks, for instance, about how advertisers are wasting their potential (and money) by just paying to have their company logos plastered on virtual billboards in a game. It'd make far more sense to have their products actually in the game, and let players experience their utility or superior functionality. This "demonstrative" type of advertising could be used to make pretty effective arguments. If you're playing a zombie game, for instance, and find that Shotgun X is more effective than Shotgun Y in the game, you might well go out and buy Shotgun X to defend your home. In short, if you want to make an argument in a game, you just make whatever procedure you're trying to sell more effective than another in terms of the gameplay. Bogost has a nifty example of a car with an innovative storage system; in the game, you can see and experience how much more cargo you can fit into that car in a situated, embodied context.
Gee doesn't argue that we need to limit ourselves to educational games. Instead, he goes with the line that, in general, it's less valuable for kids to learn content than the sort of meta-level skills that are actually more valuable in the workplace. If you're playing a variety of games, you're learning a lot of general cognitive skills that will help you get ahead in any job--as long as you recognize that there's value in that kind of application.
Jane McGonigal takes this idea and runs amok with it in her book, Reality is Broken, which I highly recommend. Jane's a much better writer than Gee, and I don't see how anyone can read her stuff without getting excited about the potential of videogames. Instead of talking about how videogames might be designed to better prepare students for the workplace, she flips the table, and says that workplaces and schools need to change to be more like the videogames that students already enjoy. She attempts to abstract out what makes videogames so compelling, and ends up claiming that they make us so happy "because they are hard work that we choose for ourselves, and it turns out that almost nothing makes us happier than good, hard work" (28). By contrast, work that we don't enjoy "doesn't absorb us, doesn't make us optimistic, and doesn't invigorate us" (29).
McGonigal defines games as anything with a goal, rules, a feedback system, and voluntary participation, pointing out that "interactivity, graphics, narrative, rewards, competition, virtual environments, or the idea of 'winning'" are unnecessary. Indeed, whatever else you want to add to the mix is only valuable if it reinforces the four core traits. She uses Tetris as an example; even though you can't win it, the feedback is so intense that it's quite easy to get addicted to it. So, in short, for McGonigal, we'd rather spend six hours grinding in WOW than one hour grinding rocks for "real" money. The best "real" work, apparently, is anything that is as intense and visceral as a game.
I like to think of the difference between being a cook on the show Hell's Kitchen versus being a cook at McDonalds. Some might argue that the burger flipper has it much easier; there's less to know, and he can just let his mind wander as he pushes clearly-labeled buttons and responds to beeps and the like. However, I think he's going to suffer a lot more from boredom. The cooks on Hell's Kitchen are working at a terrific intensity, and the challenge and feedback are immediate and palpable. Of course, it's also something they've chosen for themselves, rather than having to do to make ends meet, and they have a clear goal (winning the competition or any of the smaller challenges) and rules (discovering these rules--essentially what pleases Ramsay--is vital to their success). The McDonalds guy, though, will find that there's little reason to bust his ass; the pay will be the same regardless. To the extent that he's supervised at all, it's just to enforce the bare minimum. It's not hard for gamers like us to think of countless ways the job could be made a lot more fun--and at the same time more efficient and cost-effective. Why pay everyone the same each day, for instance? Shouldn't the person who does the bulk of the work get the bulk of the pay? Why not make it competitive?
I think we could take things one step further, though, and do away with the idea of a separation between the "real world" and the "virtual world" altogether. Work is work, whether you're mining for ore in WOW or mining in a "real" coal mine. Your virtual life is just as "real" as your life as a janitor or teacher or whatever. I'm inclined to agree with Walt Disney, who is reported to have told a visitor that Disney World was in fact the Real World, or at least, how humans are supposed to live. War, inequality, prejudice, and so on are the artificial constructions, not inevitable or just the way things must be. If getting to level 85 in WOW gives you a real, solid feeling of accomplishment--something you could never feel at your job at McDonalds--then why should we value that less?
I realize that some might make the common sense objection: Well, you can't pay your bills by playing WOW all day. Fine, be that as it may, but it's probably the cheapest way to be happy. To the extent that we can survive on as little income as possible--just enough to keep us reasonably healthy and sheltered--we can spend the bulk of our time actually enjoying ourselves and engaging in "blissful productivity" (McGonigal 80).
I think a lot of our prejudice against someone who "plays games all day" is the result of ideology. We have a lot of pressure on ourselves to attempt to get rich, have kids, a big house, a big social network, and so on. The idea is to get yourself invested in so many things that you're working around the clock to hold it all together. You need the better job to support the extra kid and the dog and so on; oops, now you have no time to enjoy yourself playing a game. Well, that's because you're Mr. Gullible. Shame on Mr. Plays Games all Day.
I say, why buy into this? I don't agree that you should leech off others (i.e., live in your parents' basement), but to the extent that you can support yourself and not depend on government handouts or anything like that, by all means, play games all day! Try not to get entangled in unnecessary expenses; if you don't need a car, don't have one, etc.
My contention is that someone who really enjoys playing WOW (or CRPGs for me) is in fact a lot happier than someone else who works him or herself to the bone raising kids, cycling through marriages, buying larger houses, driving a fabulous car, etc. Those things all require a lot more tedium and self-sacrifice than amassing similar joys just playing a game. I think it's the height of arrogance and stupidity for someone with a Great Job and a huge bank account, sailing around in his yacht, to think he's somehow better off than Joe Blow who's having a blast playing Icewind Dale. They've both pursued a goal, but Joe has gotten there a lot more directly and with a lot less hassle or strings attached!
It's like two rats in a maze. One rat takes the direct route to the cheese (30 seconds). The other goes to college, gets an advanced degree, works his way up the corporate ladder, gets married, has lots of rat babies, and then, somewhere towards the end of his life, takes his private jet to the cheese (30 years and a lot of luck).
In short, until employers get around to making work as fun as a videogame, do as little of it as possible. The same goes for schools that can't be bothered to actually teach you useful things in a useful way. Why waste years of your life?
So, back to CRPGs, then. I think that of all the types of games out there, CRPGs are the best at encouraging the kind of learning I like to see. The best ones require a tremendous amount of scientific reasoning and experimentation. You're allowed to do many things in the virtual world and offered many different paths to the ultimate goal. By refusing to dumb everything down, these games actually make you smarter, because you have to really THINK to get ahead in them. They engage not just your hands, but, more importantly, your brain.
I also think CRPGs have advantages over MMORPGs in these areas. The most obvious is that the characters you encounter are put there by the designers, and, if they're any good, are actually better characters than the random idiots or jerkwads you encounter in WOW or other MMOs. Secondly, you can have a lot more impact on the gameworld, since the designer doesn't have to worry about a million other players in the same persistent world mucking things up. Thirdly, the level of difficulty can be matched to your performance, not the ability of other players, which is always a random factor (WOW "solves" this problem by making everything increasingly easier and less sophisticated). Finally, you can introduce many types of puzzles and other forms of gameplay that would be ruined if another player nearby could just give you the answer. That's why a lot of the puzzles in instances in WOW are so lame; someone always knows the answer and just runs you through it--as though the puzzle were just a tedious distraction rather than something to engage the brain.
Some might object that MMORPGs are still more valuable because they teach you social skills. To this I respond, play one, and then tell me if they teach social skills.
So, to sum up, CRPGs are valuable teaching tools because they (a) are designed to challenge you intellectually, (b) promote scientific reasoning and "outside the box" solutions, (c) engage you in rich virtual worlds worth exploring, (d) offer visceral and immediate feedback on a broad range of indicators, (e) have a clear main goal and lots of smaller goals leading up to it--scaffolding, (f) engage us emotionally by getting us to identify with our characters and NPCs, (g) build our pride and self-confidence by presenting us with increasingly difficult challenges, (h) situate and embody our gameplay experience--your characters act in specific situations, and (i) teach us to learn from failure rather than just be upset by it. This last piece is something sadly missing in modern CRPGs.
I believe that one reason why videogames, including "smart" genres like CRPGs and adventures, have been increasingly and depressingly dumbed down is that otherwise too many people might wise up and effect real change. A good CRPG puts you in the position of a manager or leader, and builds up your confidence in making decisions and taking on very large--usually unthinkably ambitious--goals, such as overthrowing the corrupt government (or usurping wizard, etc.). A lot of the best strategies involve working against the designer, seeking out loopholes or exploits, such as nudging forward bit by bit to trigger enemies one at a time rather than a room full. We often feel the most vindicated when we find a way to really exploit these weaknesses, and the best designers actually build them in--they want us to feel that we've outsmarted them! Now, clearly, this is not an attitude most managers want in their employees. Instead, they just want people to be consistent and follow instructions very carefully, which is precisely what other genres (and now, sadly, most CRPGs) teach very diligently. Sadly, most of the most popular games don't even let you have a face, but instead keep your identity hidden behind a mask or helmet. You're not allowed to have an identity. Just do what you're told and follow the linear path to victory.
Do I think developers intentionally want this? Of course not. It's not their fault that so many of us are so deeply indoctrinated in this system that the only games we'll buy are ones that don't challenge us intellectually. But it IS their fault that they're not doing more about it. By giving in to the financial incentive to make dumb games, they're part of the problem, when they could easily be part of the solution. As Bogost writes, "Persuasive games expose the logic of situations in an attempt to draw players' attention to an evental site and encourage them to problematicize the situation" (332). Given the logic of our current situation--with politics reduced to soundbites, school reduced to rote memorization, and science reduced to politics--CRPG designers in particular have a responsibility to make people smarter. Tim Cain certainly gets it, but, shamefully, even his Fallout baby (so gifted as a youth) grew up to be quite dumb in the end.
Wow, there is a lot of stuff in this article. Very thought provoking. I also believe that computer games can be valuable for learning. Yet I must say that I disagree with many of the conclusions you draw along the way.
There are many philosophers who insist (I think rightly) that true knowledge is always embodied and to some extent tied to our existence in actual bodies.
The problem with this argument is that you offer no definition for “knowledge”, or how to set “true knowledge” apart from “false knowledge”. Do you know the concept of the Turing test? If it walks like a duck, quacks like a duck, it must be as intelligent as the duck.
You can't just teach a computer (no matter how sophisticated) to have a conversation
Ever played ELIZA, or any chatbot for that matter? You may object that her chatter is in no way a decent conversation. Yet her ramblings are far more intelligent than the majority of mobile-phone conversations I have (unwillingly) witnessed.
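For anyone who hasn't played with ELIZA: the whole trick is rule-based pattern matching with no model of meaning behind it. Here's a minimal sketch of that mechanism in Python; the rules and responses below are invented for illustration (Weizenbaum's original DOCTOR script was far more elaborate):

```python
import re

# A few ELIZA-style rules: a regex to match the user's utterance,
# and a response template that echoes part of it back.
RULES = [
    (r"\bI need (.*)", "Why do you need {0}?"),
    (r"\bI am (.*)", "How long have you been {0}?"),
    (r"\bbecause (.*)", "Is that the real reason?"),
]

# Swap first/second person so the echoed fragment reads naturally.
REFLECTIONS = {"my": "your", "i": "you", "me": "you", "am": "are"}

def reflect(text: str) -> str:
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in text.split())

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = re.search(pattern, utterance, re.IGNORECASE)
        if match:
            return template.format(reflect(match.group(1)))
    return "Tell me more."  # default when no rule fires

print(respond("I need a vacation"))  # -> "Why do you need a vacation?"
```

The program has no idea what a vacation is; it just shuffles the user's own words around. That it still feels conversational is exactly the point of contention here.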
Even computer designers, though, have to resort to "look and feel" to describe a lot of what they do, and it's vital that we understand technology as made by humans and not autonomous.
Do you really think any single engineer can understand the whole of a complex machine? Let's just concentrate on one component, e.g. the processor. In the old days, an engineer would draw the pattern of transistors by hand, which would later be etched into a silicon wafer.
Now the designers write a specification which is cast by algorithms into a pattern which is checked by another algorithm for logical soundness. A human being would go nuts if he or she had to do this by hand. Now, who has a deeper understanding of the product, man or machine? I think this is a profound philosophical question. And this is just one out of thousands of components in a PC.
That's why it doesn't make much sense to use terms like "Technology" and "The Computer," since every computer is marked by different personalities.
By that logic, it doesn't make sense to talk of birds or chairs--or games, for that matter. They even have a name for this: the problem of universals.
I'm not as well versed in modern philosophy as I'd like, but feel that I have a decent understanding of the objection to objective knowledge. Most of us think that knowledge (or truth) is out in the world, and it's the job of language to describe and communicate it. For example, "one and one make two" is true regardless of what we individually believe, or whether there are any humans around at all. A scientist goes out, learns something true about the world (facts), and puts it down in clear, unambiguous language.
The alternative to this line of thought is that Truth isn't beyond language--or, even if it is, we have no way to deal with it outside of language, so it's irrelevant. If you buy that, then you have to start looking at how language is connected to the things it describes; for example, the word "apple" and the actual fruit. I'm reading a book now by Thomas Kent that argues that there's just no way we could ever have a system that would make that link totally concrete. If I say the word, you need to have some context and some knowledge of language, obviously, but inevitably you just have to guess (hermeneutics). We make pretty good guesses most of the time, since we know a lot about each other. But often enough we misunderstand things; the point is that there's just no way we could ever know for certain that we understand one another. Plus, there's the fact that your experiences will affect how you interpret things. Many examples are cited, such as a line that means one thing read seriously vs. jokingly vs. mockingly, and so on; "Fine weather we're having today, isn't it?" is a joke if it's actually a snowstorm outside--but you'd need to be aware of that context and tone to get it. Just seeing the sentence on a page won't help much.
I think it's likely true that a lot, perhaps most, of this guesswork is subconscious. I think it's one of those things humans are good at. We might think of language as just a bunch of rules that we learn and apply, but it's really more complex than that. I also think about a surgeon, for instance--I doubt very seriously she's thinking, "Okay, now I need to move my hand one centimeter to the left," etc. She just does it without really being aware of it. Indeed, if she did start thinking like that during an operation, she'd probably screw up big time! The point is, very little of what we do is based on deliberate, rational thought. Unless you're sitting down to solve some equations or whatever, I doubt you have sustained moments of focus like that. Most of the time we're almost in a daze. Yet we're trying to work from the top-down with these computers. If we really wanted AI, we'd have to get better at the fuzzy, daze-like behavior, which seems to require a body (or at least a sensitivity to very subtle things, like someone's pitch, incline of their head, etc.). In short, a computer might be great at playing chess, but would never be good at poker.
Personally, I think we could build computers that could have a good conversation (not just ELIZA-like mimicry), but you'd need a neural net, and the computer would have to be able to learn and adapt on the fly. Whether or not the computer could really be said to understand is something else, but I'm convinced it could fool us. John Searle has a great concept called the Chinese room that's always fascinated me in this regard. If you haven't heard of it before, I strongly recommend you look at the page; it's fun to think about.
Edit: I found this review of the Kent book, if you're interested. It looks like a good breakdown.
For example, "one and one make two" is true regardless of what we individually believe
1 + 1 = 0 (George Boole)
You see, we can't even agree on truth in mathematics. ;-)
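To be fair to both sides, the joke works because "1 + 1" has different answers in different (equally consistent) systems: ordinary integer arithmetic versus arithmetic modulo 2 in the two-element Boolean field GF(2), where addition behaves like bitwise XOR. A quick sketch:

```python
# Ordinary integer arithmetic vs. arithmetic in the two-element field
# GF(2), where addition is taken modulo 2 (equivalently, bitwise XOR).
# Both systems are internally consistent; they just answer "1 + 1"
# differently, which is the commenter's point.
print(1 + 1)        # integers: 2
print((1 + 1) % 2)  # GF(2): 0
print(1 ^ 1)        # same result expressed as bitwise XOR: 0
```

So it's less that mathematicians disagree about truth, and more that "true" is always relative to the axioms you picked.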
But often enough we misunderstand things; the point is that there's just no way we could ever know for certain that we understand one another. Plus, there's the fact that your experiences will affect how you interpret things.
Isn't that completely obvious because it happens all the time? No one(?) ever said it would be easy to build an AI, yet to be good at chess, computers have to guess. Not even the mightiest supercomputer can calculate all possible steps. Computers can also take guesses at emotions. If you shout and curse while waiting in the service line of your telco, you will usually be serviced faster. Or they monitor the intentions of crowds. It's fascinating and scary at the same time.
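The point that chess programs "have to guess" can be made concrete: since the full game tree is astronomically large, a program searches only a few moves deep and then estimates (guesses) the value of the position with a heuristic. Here is a toy depth-limited minimax sketch; the "game" (players alternately add or subtract 1-3 from a running total) is invented purely to show the shape of the algorithm:

```python
# Depth-limited minimax: search a fixed number of moves ahead, then
# fall back on a heuristic estimate instead of playing the game out.
def minimax(total: int, depth: int, maximizing: bool) -> int:
    if depth == 0:
        return total  # heuristic "guess" at the position's value
    moves = [1, 2, 3]
    if maximizing:
        return max(minimax(total + m, depth - 1, False) for m in moves)
    return min(minimax(total - m, depth - 1, True) for m in moves)

print(minimax(0, 2, True))  # -> 0 (best gain is cancelled by the reply)
```

Real chess engines do the same thing at a vastly larger scale: the depth cutoff and the evaluation function are exactly where the guessing lives.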
Whether or not the computer could really be said to understand is something else, but I'm convinced it could fool us.
And to me (and Turing) that's all there is to know. "Real understanding" is just a word like "cold flame" or "green sleep".
In my more cynical moments I'd agree with you; if it walks like a human and talks like a human, it's a human, right? I think we can trace a lot of the other side back to metaphysical arguments about souls. As long as you can posit a difference in kind between "Real Humanity" and "X," whether that X be slaves, animals, or robots, people feel justified in being unkind and unfair to that X. As for the science fiction stories in which robots or androids fight for their rights--I tend to think that's a very solid possibility (if not an inevitability).
I really, REALLY wish that we could find intelligent life on another planet and make contact with a completely alien civilization. That would really put an end to so many of these issues about the uniqueness of humanity and language. Would we even be able to recognize that civilization AS a civilization? Would it be possible to communicate at all, given that their "hard wiring" for language might be entirely different? After all, many linguists claim that all human languages can be traced back to a common ancestor. Apparently the question of the origin of language is thought to be the hardest problem in science. Obviously, if we could find an alien civilization and communicate with it--or even if we couldn't--we'd learn a lot about all this.
Just as a thought exercise, I've asked myself what a true alien being would be like. Would I even be able to see it if it were standing in front of me? Already, though, I'm assuming it can stand, or that my eyes could perceive it. I think we can't help but attach human characteristics, or at least characteristics from animals, plants, or other things we're familiar with. I'm intrigued by the idea (no doubt inspired by reading Lovecraft) that our "filters" simply block out all sorts of things that our minds couldn't withstand. What if there were beings among us who were so frightening and damaging to our psyches, that our minds had evolved to simply screen them out? What if they were indeed so terrible that we couldn't even conceptualize their existence?