Friday, December 24, 2010

Lacuna = unfilled space or interval; gap. You're welcome.

Note: for many of the games here you'll need a z-machine interpreter like Frotz or Zoom to play them.

I recently finished Blue Lacuna, a tour-de-force showing from Aaron Reed. This game was the only text adventure to be exhibited at this year's Indiecade conference, if you're excluding the fascinating and innovative Groping In The Dark, as I am. (They could both arguably fall under the category of "interactive novel" – which Reed owns – but Groping doesn't have the same openness of structure that's characteristic of the text adventure, and which is what I'll be highlighting here. Incidentally, it looks like text adventures have been 2 for 2 at Indiecade so far, with last year seeing Jim Munroe's Everybody Dies. TEXTS NOT DED)

What impressed me most about Blue Lacuna is the sheer magnitude of the work. Many of the most influential text adventures of the past decade have been explorations of the short form, a trend driven largely by the guidelines of annual competitions. As such, it's easy to lose sight of what can be done at full length. (Who was it who recently exhorted independent game designers to finish their damn games?)

Reed strikes a fine balance between free-ranging exploration and guided storytelling. You can spend your time in the game however you like, and a wide variety of approaches and actions are anticipated by the author, but there is an overarching narrative into which you will be gently funneled, if necessary. This coaxing always feels natural: the game features a "boredom meter" that can tell when you're just spinning your wheels. When it detects this, it sends an unobtrusive hint your way -- "out of the corner of your eye you glimpse a small animal heading into the bushes" -- and of course the player must now satisfy her curiosity by following. This minimal yet clear signposting is something I always appreciate in a game. In keeping with Csikszentmihalyi's idea of flow (something all game designers should familiarize themselves with), the goal should be for the player to spend as much time as possible skating on the leading edge of their ability, balanced between frustration and boredom. That can be difficult in a genre as self-consciously thinky as the text adventure, but Reed's approach seems a nice way to pull it off.
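The mechanic itself is simple enough to sketch. Here's a minimal, purely hypothetical version of such a boredom meter in Python -- all names and thresholds are my own invention, not Reed's actual implementation (Blue Lacuna is written in Inform 7):

```python
import random

# Hypothetical nudge messages, in the spirit of Blue Lacuna's hints.
NUDGES = [
    "Out of the corner of your eye you glimpse a small animal "
    "heading into the bushes.",
    "A distant sound drifts up from the beach below.",
]

class BoredomMeter:
    """Tracks turns since the player last made narrative progress;
    past a threshold, surfaces an unobtrusive hint toward the plot."""

    def __init__(self, threshold=5):
        self.threshold = threshold
        self.idle_turns = 0

    def record_turn(self, made_progress):
        """Call once per turn; returns a nudge string when the player stalls."""
        if made_progress:
            self.idle_turns = 0
            return None
        self.idle_turns += 1
        if self.idle_turns >= self.threshold:
            self.idle_turns = 0  # reset so the game doesn't nag every turn
            return random.choice(NUDGES)
        return None
```

The key design point is the reset after each nudge: the hint stays rare enough to feel like part of the world rather than a prompt.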

Blue Lacuna is also a game that rests largely on notions of identity. The myriad choices that you make in the game, both large and small, are not arbitrary; they lead to different developments and reactions, and, if you've been playing with a consistent philosophy and outlook (and you haven't been spamming saves to try everything like me), the game will ultimately present some honest questions for you to consider.

On a more straightforward note, the amount of customization available in the game (choosing the gender of both yourself and your romantic interest, for instance) allows some exploration of the dynamics of identity. It seems to be an accepted tenet of game design that the player should identify closely with their character, and this is generally done either by making the character a blank slate or by allowing extensive customization. That customization is often largely cosmetic and doesn't really affect the story or mechanics of the game. In Blue Lacuna it's not quite clear to me yet whether that is the case; it may take a few more playthroughs and some experimentation. Nevertheless, it does seem to have a distinct psychological effect -- the sense of being in the character is great enough that changes in identity are truly felt. For me, though, one of the most convincing games in this respect is Choice of Broadsides, where, despite (or maybe because of) the stripped-down mechanics, finding myself in a star-crossed homosexual affair on the high seas at some point became all too uncomfortable.

You may be noticing that I'm not talking about the plot of Blue Lacuna that much. I have to admit that the fantasy/sci-fi sort of thing is not really my bag anymore, and there's definitely some naked philosophizing scattered throughout. What impressed me most, and inspired me to write about it, is the obvious care and craft that went into it, and the interesting and unique mechanics at play here. I highly recommend you give it a spin.

Wednesday, November 3, 2010

Globe and Mail Article on Learning

Psychologist (and author of The Sexual Paradox: Extreme Men, Gifted Women and the Real Gender Gap) Susan Pinker has a nice article in Toronto's Globe and Mail featuring Bjork Lab research on learning. She tries out a new metaphor for learning: the construction site. I'm not sure it'll catch on.

Friday, September 24, 2010

Seeing with Expert Eyes

I'm very interested in the subject of expert performance. While I tend more towards dilettantism, I'm fascinated by people who can perform complex tasks or make minute discriminations seemingly effortlessly. If we can isolate the necessary skills, mental processes, and information that experts need to perform like experts, then we can get people to start behaving like experts faster.

That's the tactic taken by Robert Jacobs of the University of Rochester. A cognitive scientist, he is looking at something that is critical to how experts function but easy to miss -- eye movements. How people move their eyes over a scene can tell you a lot about what they know. (Some recent research has shown that implicit biases in decision making are betrayed by eye movements: the eyes spend more time on information supporting the initial decision, even when people say they're weighing all the options. Forgot the source, though...) An expert's eyes are likely to focus on different things when scanning a scene than a novice's, and these differences can tell us what information matters most in making an expert judgment, even if the experts themselves can't.

Jacobs and colleagues took advantage of this on a recent field trip to Death Valley in California by having researchers -- both expert and novice -- wear mobile eye-tracking devices. By capturing both the scenes individuals look at and the movements of their eyes over them, they ultimately hope to develop better teaching methods that focus on perceptual skills. By teaching a person to use her senses like an expert, she may more quickly become one.

Since this was just a news brief in SciAm, it's a little light on details and methodology, but I'm interested in following up on this work and seeing how it develops.

Choi, C. (June 2010). Expert education. Scientific American, 17-18.

Thursday, September 23, 2010

A little more than a citizen scientist

While combing through the back issues of Science magazine, I came across the story of Frédérique Darragon, one of those people you thought existed only in movies: a real-life Indiana Jones. She comes from a high-society background filled with globe-trotting playgirl (?) exploits -- boat racing, polo playing, modeling, famous boyfriends, etc. She's also had numerous brushes with death, among them a polo ball to the face that left her with a broken jaw and a near-suffocation in an ice cave.

Lately, though, she's been jaunting mostly around Tibet and Sichuan Province, trying to uncover the origins and purposes of a number of poorly studied but distinctive towers. The towers, some with a unique star-shaped layout, are situated close to the path of the old Silk Road, so it's been speculated that they may have served as signposts for nearby towns. Some have argued that they also relate to the lesser-known "musk road," on which the deer-musk trade ran and which joined up with the Silk Road in this region. It could also be that the towers were built in a game of one-upmanship by wealthy traders. Darragon is undertaking the only intensive scientific investigation -- on her own dime, I should mention -- to try to test these theories.

Stone, R. (May 7, 2010). Unraveling a riddle in plain sight. Science, 328(5979), 685-687.

Wednesday, August 4, 2010

What it Means to Forget in the Internet Age

The New York Times has a thoughtful and comprehensive recent article by Jeffrey Rosen, The Web Means the End of Forgetting. With the shifting of more and more aspects of our lives online, to sites like Facebook, Twitter, Foursquare, etc. (and blogs, of course... *ahem*), comes the realization that we ultimately have less and less control over our permanent social records. Even though an individual has a large amount of control over what information she publishes about herself -- I could choose to just cancel this post right now and kill it in utero -- once something is out there it becomes more and more difficult to take back. If I publish this post, it will be publicly accessible, free to be copied and quoted and referenced. Even if I delete it in the future, there will still be a copy somewhere in Google's archives. (Let's leave aside for the moment the problem of what other people publish about you, shall we?)

So is the internet's long memory a bane? Some have embraced this new zeitgeist: Microsoft, I believe, has a concept in development that would continuously record and archive what you see, what you hear, your conversations, your writing -- essentially your entire life. And Rosen comments towards the end of his article on what it would mean to culturally shift (back, actually) to a holistic public persona. But most of us aren't there yet, and so more people are looking into ways to limit what the internet can remember by building in -- or adding on -- mechanisms of "forgetting."
Google not long ago decided to render all search queries anonymous after nine months (by deleting part of each Internet protocol address), and the upstart search engine Cuil has announced that it won’t keep any personally identifiable information at all, a privacy feature that distinguishes it from Google. And there are already small-scale privacy apps that offer disappearing data. An app called TigerText allows text-message senders to set a time limit from one minute to 30 days after which the text disappears from the company’s servers on which it is stored and therefore from the senders’ and recipients’ phones. (The founder of TigerText, Jeffrey Evans, has said he chose the name before the scandal involving Tiger Woods’s supposed texts to a mistress.)
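TigerText-style expiration is easy to sketch in code. Here's a toy, purely illustrative message store with per-message time limits -- the names and the purge-on-read behavior are my own assumptions, not TigerText's actual implementation:

```python
import time

class ExpiringStore:
    """Toy message store where each message carries a time-to-live;
    once the TTL elapses, reads behave as if the message never existed."""

    def __init__(self, clock=time.time):
        self.clock = clock  # injectable clock makes the behavior testable
        self.messages = {}

    def send(self, msg_id, text, ttl_seconds):
        # Store the text alongside its absolute expiration time.
        self.messages[msg_id] = (text, self.clock() + ttl_seconds)

    def read(self, msg_id):
        entry = self.messages.get(msg_id)
        if entry is None:
            return None
        text, expires = entry
        if self.clock() >= expires:
            del self.messages[msg_id]  # purge expired data on access
            return None
        return text
```

The interesting property is that expiration is enforced by the server holding the data, so it only works as long as you trust that server -- which is exactly the limitation the Vanish researchers set out to remove.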

Expiration dates could be implemented more broadly in various ways. Researchers at the University of Washington, for example, are developing a technology called Vanish that makes electronic data “self-destruct” after a specified period of time. Instead of relying on Google, Facebook or Hotmail to delete the data that is stored “in the cloud” — in other words, on their distributed servers — Vanish encrypts the data and then “shatters” the encryption key. To read the data, your computer has to put the pieces of the key back together, but they “erode” or “rust” as time passes, and after a certain point the document can no longer be read.
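The "shattered key" idea can be illustrated with something much simpler than Vanish itself, which spreads threshold-shared key pieces across a distributed hash table. This sketch uses a plain n-of-n XOR split: every share is needed to rebuild the key, so losing ("eroding") even one piece makes the data permanently unreadable.

```python
import os

def shatter(key, n):
    """Split a key into n XOR shares; all n are required to rebuild it."""
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        # XOR the key with each random share; the leftover is the final share.
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def reassemble(shares):
    """XOR all shares together; yields the key only if every share survives."""
    key = shares[0]
    for s in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key
```

Vanish's refinement is to use a k-of-n threshold scheme instead, so the key survives a little churn in the network but still vanishes as enough pieces drop out over time.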
If you're reading this, you may know a thing or two about how forgetting works in humans, and you may be familiar with this idea of degradation over time. It's one theory, though not necessarily the most likely. Much work in the Bjork lab and elsewhere has supported the contending theory of interference as a principal driver of forgetting -- the passage of time tends to reduce the memorability of an event not through some direct effect of time itself (as in apps like TigerText), but because subsequent experiences intrude.

This leads to the idea of adaptive forgetting: that forgetting is a necessary mechanism for separating information and experiences into "important" and "unimportant," and ultimately for drawing conclusions and making more oblique connections -- an idea I've been exploring in my own research*. When something is experienced or learned, its memorability is determined by how the information is processed at the time. But once established, memories can be altered. For example, if you haven't thought about something in a while but suddenly find yourself searching for that information weeks or months (or years) down the line, whatever you manage to recall will come to mind much faster the next time; it has become a more prominent fixture in your mind. Likewise, the more often some information needs to be accessed -- provided it has passed out of consciousness in between -- the more prominent it will become. Contrariwise, information that is infrequently accessed, or accessed only within a short timeframe, will be more easily forgotten, or less prominent in one's mind.
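To make the contrast with TigerText-style decay concrete, here's a toy model of interference plus retrieval strengthening. The parameters are arbitrary and purely for intuition -- this is not any lab's actual model, just the qualitative shape of the theory: new experiences weaken old traces, while successful retrievals strengthen them.

```python
class Memory:
    """Toy interference model: encoding new items weakens everything
    already stored; successfully retrieving an item strengthens it."""

    def __init__(self, interference=0.9):
        self.interference = interference  # multiplicative cost per new item
        self.strength = {}

    def encode(self, item, initial=1.0):
        # Each new experience intrudes on all prior memories.
        for k in self.strength:
            self.strength[k] *= self.interference
        self.strength[item] = initial

    def retrieve(self, item):
        # A successful retrieval makes the memory more prominent.
        if item in self.strength:
            self.strength[item] += 1.0
            return True
        return False
```

Note that "time" never appears: an item left alone in an otherwise idle system keeps its strength indefinitely. It only fades when other items pile in, which is the interference account in miniature.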

So am I arguing that attempts to create online forgetting systems should more closely resemble humans'? On the contrary, that's how it already works. Search engines, for instance, rely on the most visited and most heavily linked sites to determine page rankings. You could view the unique links and pageviews as analogous to memory queries in the human mind: the more successful "retrievals" -- people finding what they're looking for -- the more prominent the "memory" -- the particular information. That particular aspect of you is so highly viewed because people think it's the most important. If a search comes up empty, however, or if the information is buried so deeply in Google's o's that you're unlikely to uncover it, it remains unmoved.
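The search-engine side of the analogy can be made concrete with a toy power-iteration PageRank -- the textbook algorithm, not Google's actual production ranking. Pages linked to by prominent pages become prominent themselves, much as frequently retrieved memories grow stronger:

```python
def pagerank(links, damping=0.85, iters=50):
    """Tiny power-iteration PageRank.
    `links` maps each page to a list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # Every page gets a baseline "teleport" share...
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:
                # ...dangling pages spread their rank evenly...
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                # ...and the rest pass rank along their outbound links.
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank
```

Run on a graph where two pages cite "a" but nothing cites "c", the rank of "a" climbs while "c" sinks toward the teleport floor -- the unvisited memory fading from neglect.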

And what about pernicious, stubborn information that refuses to go away -- a politician's affair or a workplace incident? Organizations like ReputationDefender promise to help you out by essentially laying out a net of interference, ranking up the positive information about you and ranking down the negatives. This is actually similar to some efforts to treat the persistent memories associated with PTSD.

This analogy hasn't been all that well thought out, so I don't want to place too much weight on it. I've intended mostly to point out that discussions of "memory" and "forgetting" on the internet, and attempts to engineer them, could -- and perhaps should -- take very different forms. Forgetting is not the absence of information.

Finally, just a shout-out to some other psychology studies of note in the article:
  • Browsing the web with an attentive human-like avatar leads people to less self-disclosure, compared to no avatar or an avatar that isn't paying attention.
  • A recent study published in Psych Science (Back et al., 2010) shows that people's social networking profiles tend to be accurate reflections of their personalities.

* Rosen mentions a literary perspective on what it means to be unable to forget:
Jorge Luis Borges, in his short story “Funes, the Memorious,” describes a young man who, as a result of a riding accident, has lost his ability to forget. Funes has a tremendous memory, but he is so lost in the details of everything he knows that he is unable to convert the information into knowledge and unable, as a result, to grow in wisdom.
There are a few published cases of individuals with this problem, but I'd love to read Borges' take on it.

Saturday, July 3, 2010

Recommitting - A Mens Rea Manifesto

It's been quite a while since I last posted. This may well have become a ghost blog -- if digital tumbleweeds existed, they'd be blowing through it. However, I've decided to restart and reboot Mens Rea.

Why now, and in what form? Mens rea is Latin for "guilty mind." I began this blog because of a long-standing interest in forensic psychology, even though it's not my field of specialty; in fact, it was started for a class I was TAing at the time.

However, with this renovation mens rea takes on a new meaning. I've been experiencing a guilty mind myself with regard to my creative and professional output; in short, I should be writing more than I am. To counteract this, I'm returning to the blogosphere, and I'll be writing about more than just forensic psychology: research I'm reading, research I'm conducting, and ideas that cross my mind. I'll also bring in some outside interests: more diverse areas of psychology, games and gameplay, sustainable development, and more. That said, this won't exactly turn into a personal blog, either -- I'll do my best to put a critical, analytical spin on the topics I cover.

Writing is like any other learned activity: it becomes easier the more you do it. So this is my dedication to make writing a habit.