
Tuesday, February 25, 2014

You Ruined My Game

(previously published in Anthropology News)
As the brief, terrifying passion for MOOCs slowly dissipates, your university administrators may be casting around for some other technologically enhanced pedagogy.  Might I suggest gamification?  It’s not a new idea, by any means—people have been applying game-based mechanics to learning for some time, but its latest incarnation focuses on online games, from single player to collaborative, multiplayer experiences.
Of course, there’s a good deal of potential for gamification to follow on other technologically driven changes in university teaching—i.e., toward another wave of expropriation as public universities “partner” with private capital in order to undermine the autonomy of faculty.  But I believe there’s subversive potential here for anthropology.
A screenshot of Manic Digger. Image courtesy of Pierre Rudloff and Wikimedia Commons.
I’ve been thinking a lot about games and subversion recently, mostly because my children have entered their online gaming stage of child development, and are spending inordinate amounts of time either playing Minecraft or watching other people play Minecraft on screencast videos uploaded on YouTube.
Among the innumerable screen captures with stammering, preteen voice-overs, there are other, less innocent uploads that chronicle the efforts of teams of tricksters to trap, harass and prank other players.  This “griefing” runs the gamut from facile to sadistic—and if you play in any multiplayer environment, you’ll certainly have encountered behavior like that.
And while some of this simply looks like cyberbullying, I have begun to think of it in terms of anthropological approaches to gamification.  The true subversion of griefing is not that various pranksters refuse to play by the consensual rules of a multiplayer environment (although there are many ham-fisted examples of this); it’s that the victims of their pranks believe they’re playing one game when in reality they’re caught up in other gaming logics of which they know nothing.  The humor (if it is that) lies in the victim’s realization that the game they thought they were playing is no longer possible and has been overturned.
One of Gregory Bateson’s most interesting contributions was his theory of the “double bind,” the logical and discursive forms that trap victims in a vicious feedback loop where their behavior is castigated no matter what they do.  He initially theorized that double binds would precipitate schizophrenia, but later in his career began to explore their potential to stimulate creativity.  In particular, in Bateson’s theory of learning, “learning how to learn” (what he later calls “Learning II”) can settle into a self-affirming cycle where positive and negative stimuli both serve to bolster a particular representation of the world.  The only way out would be to undermine the contexts of that understanding themselves—to move beyond rewarding or punishing behaviors and actions to calling into question not only what we might mean by reward or punishment but the entire system of thinking upon which that logic rests.
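To make the shape of that trap concrete, here is a toy model in Python; the numeric update rule is my own illustrative assumption, not Bateson's formalism. Within a fixed context, reward and punishment alike strengthen the agent's existing representation; only an upending of the context itself breaks the cycle.

```python
# Toy model of Bateson's "Learning II" trap: within a fixed context,
# reward and punishment both bolster the agent's current representation.
# The numeric update rule is illustrative, not Bateson's own formalism.

def learn(confidence, stimulus, context="fixed"):
    """Return the agent's updated confidence in its worldview."""
    if context == "fixed":
        # Reward: "I was right all along."
        # Punishment: "The world is against me, just as I expected."
        return min(1.0, confidence + 0.1)
    # Only a double bind that upends the context itself forces re-learning:
    # confidence in the old representation collapses.
    return confidence * 0.1

confidence = 0.5
for stimulus in ["reward", "punish", "punish", "reward"]:
    confidence = learn(confidence, stimulus)
print(round(confidence, 2))   # 0.9 -- bolstered either way

print(round(learn(confidence, "reward", context="upended"), 2))   # 0.09
```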
In an anthropological approach to gamification, what might we be trying to subvert?  There are several candidates: that play is competitive, composed of winners and losers; that the environment around us needs to be exploited for personal gain; that winning and losing can be quantified as points.  The games that people play reinforce (and reinforce again) dominant understandings of people and the world, not only in terms of the politics of representation (e.g., depictions of gender in games), but at much deeper levels of game play.
For the last twenty years, people have been developing a whole species of pedagogical games (or serious games) that seek to unmask these dominant assumptions and, by confronting players with a double bind, precipitate a critical understanding of the world.  For example, “Spent” (developed by an ad agency for the Urban Ministries of Durham) challenges players to survive economic hardship.  But there are no winners, really; even if you “win,” you end up with a couple of dollars left over at the end of the month, and the challenge to survive begins again.  The message: if you are un- or underemployed and lack a place to live, you simply do not have the ability in contemporary society to pull yourself up.
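The mechanic is easy to see in miniature.  Here is a minimal sketch of that no-win loop in Python; the income figure, expense categories, and shock probabilities are hypothetical illustrations, not taken from Spent itself.

```python
import random

# Minimal sketch of a "no-win" serious-game loop in the spirit of Spent.
# All figures, categories, and probabilities are hypothetical illustrations.

MONTHLY_INCOME = 1000                      # low-wage take-home pay
EXPENSES = {"rent": 600, "food": 250, "transport": 100}

def play_month(savings):
    """Play one month; return the savings carried into the next."""
    balance = savings + MONTHLY_INCOME - sum(EXPENSES.values())
    # Random shocks (a car repair, a medical bill) erase any cushion.
    if random.random() < 0.5:
        balance -= random.randint(50, 300)
    return balance

savings = 0
for month in range(1, 4):
    savings = play_month(savings)
    if savings < 0:
        print(f"Month {month}: broke -- the challenge begins again.")
        savings = 0
    else:
        # Even a "win" leaves a couple of dollars, and the loop restarts.
        print(f"Month {month}: survived, ${savings} to spare.")
```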
Serious games like this can have a laudable impact on understanding, but anthropology, perhaps, can push critique even further, past the subversion of the game to undermining what we even mean by a game.  Consider something very much like the Situationist dérive, with its subversion of walking and living in the city through acts of randomness and deliberate refusal.  The idea that you might be walking in Paris following the dictates of social class and capitalist accumulation while the people walking alongside you follow the improbable logic of the dérive is profoundly unsettling and defamiliarizing.  That is, the game you thought you were in turns out to be a different game altogether.  Even various guerrilla performances (like the No Pants Subway Ride) still fall short of questioning the city as primarily a growth machine.
Anthropology has long defined itself along a vector of cultural critique, although this has meant various things at various times, from a mild cultural relativism to a more piercing unmasking of the exploitation at the heart of processes of globalization.  With gamification, anthropology has an additional opportunity: to ruin someone’s game.

Wednesday, May 8, 2013

MOOCs, Matrix, Bridge



At the moment I write this, a creeping groupthink has saturated both higher education (The Chronicle of Higher Education) and popular media (the New York Times, the Huffington Post, etc.).  It's that moment when public debate constricts to a terrifying one-dimensionality--when all manner of unwarranted assumptions attain hegemony and become the scaffolding for etiolated prognostications.  And, in this case, when we enter a time-warp and return to the 1980s.

Take, for example, an April 30 article from the front page of the New York Times, "Colleges Adapt Online Courses to Ease Burden."  Here, the president of San Jose State University, Mohammad Qayoumi, discusses his enthusiastic adoption of MOOC modules from MIT:
"Traditional teaching will be disappearing in five to seven years, he predicts, as more professors come to realize that lectures are not the best route to student engagement, and cash-strapped universities continue to seek cheaper instruction" (A1).
Among the early adopters of these STEM pilots, some suggest that the very idea of making one's own curriculum is misguided.
"Our ego always runs ahead of us, making us think that we can do it better than anyone else in the world," Dr. Ghadiri said, "But why should we invent the wheel 10,000 times?  This is M.I.T., No. 1 school in the nation--why would we not want to use their material?" (A3)
In the dull, predictable style of many writings on information technologies, MOOCs have attained that air of inevitability, the point at which public discourse is reduced to legitimating the idea in one way or another (should we just do away with colleges?  or just professors?).  Of course that's their attraction--economies of scale lead to the large-scale de-skilling of the professoriate à la Harry Braverman, money can be expropriated from the public sector to the private, and enormous salaries can be collected by upper management by cutting out the middle strata of professors.  Wait--isn't this what happened to IBM?  Automobile manufacturing?  The 1980s indeed.

But there's another sense in which the current hysterics over MOOCs revive 1980s anachronisms: MOOCs resurrect (zombie-like) the distorted dreams of the past, now projected into a dull future where, pace William Gibson's Neuromancer (1984), the sky is "the color of television, tuned to a dead channel" (3).  There are several reasons to be critical here, quite apart from my desire to preserve my own job or my belief that students at our regional state university would be the likely losers in a MOOC world.  My biggest protest: it could all be so much better, and here we are, stuck in another era's dream (or nightmare) of the future.

William Gibson is credited with the invention of "cyberspace," that "consensual hallucination" that makes its first appearance in "Burning Chrome" (1982) and receives its more robust exposition in Neuromancer (1984).  It's "the matrix" that Gibson's protagonist lusts over that secured Gibson's position vis-à-vis cyberpunk and science fiction: "'a graphic representation of data abstracted from the banks of every computer in the human system.  Unthinkable complexity.  Lines of light ranged in the nonspace of the mind, clusters and constellations of data.  Like city lights, receding . . .'" (51).  The evocations of this "paraspace" and the complex of desires Gibson invests in it made cyberspace a peculiarly fecund metaphor in the 1980s.  "Cyberspace" telescoped 1980s dreams of white middle-class transcendence and segregation; the digital equivalent of gated communities, it was a way of imagining oneself at the center of the world while still maintaining one's separation from it.

And as a creative trope, it was quickly exhausted.  At some point between Neuromancer's Chandler-esque cyberspace (1984) and Neal Stephenson's Pynchon-esque (and Python-esque) Snow Crash (1992), "cyberspace" (and the cyberpunk genre that mass media invented around it) lost much of its potency.  Stephenson's "Metaverse" is already a parody.  But policy makers clung to Gibson's image, and it passed into countless, inane reports on the "information superhighway," some of which I explore in my 2009 book, Library of Walls.  To say that Gibson's narrative "caught on" is to grossly underestimate the power it wielded in the formulation of policy.  As a viral meme, "cyberspace" was especially good at infecting education.  As Al Gore articulated in the 1990s, the goal of the Clinton administration was "to give every child in America access to high quality educational technology by the dawn of the new century."  This was certainly a promising start, but it did not mean addressing entrenched racism and class inequalities; nor did it mean addressing the re-segregation of American schools.  But this was the beauty of evoking cyberspace--invest in a metaphoric space that diverts attention away from increasingly unequal physical spaces, a shell game that continues to this day.

However: "cyberpunk" writers (and Williams Gibson) had already moved on.  Gibson's 1990s Bridge trilogy (Virtual Light, Idoru and All Tomorrow's Parties) developed other interstitial spaces that complicated the easy abstraction of cyberspace (Farnell 1998).  Other writers associated (however fallaciously) with cyberpunk moved to ecology, to steampunk, to contemporary fictions.  This shouldn't be shocking.  When I re-read Neuromancer, I'm struck by the narrative inadequacies of cyberspace.  In the opening pages, Case is an addict impotently yearning for a cyberspace utopia he's been turned away from: "his hands clawed into the bedslab, temperfoam bunched between his fingers, trying to reach the console that wasn't there" (5).  When he is re-united with his virtuality, that cyberspace--however richly imagined--is never enough; 85% of the novel takes place in the "real" work, not the virtual.  If Case is an addict, he is one addicted to the real, not the virtual, and he needs to return to its pleasures and dangers at regular intervals.  Fittingly, Gibson narrates around cyberspace: it's Japan, London, New York.  These are cyberpunk's enduring images, with leaps into cyberspace supported by globetrotting evocations of the urban-noir.  On the other hand, the "real" is never really enough, either; cyberspace fits together disconnected narratives, joining them across geographic distance; it lends "real" space a media-saturated edginess.  "Cyberspace," then, fills in as Gernsback-ian deus ex machina for the inadequacies of one, and the gritty spaces of the post-industrial metropolis for the inadequacies of the other.

The Bridge trilogy, then, can be seen as exploring the asymptotes of both physical and virtual space, with the "bridge" itself (modeled after the Bay Bridge) bringing together virtual and physical into a chaotic bricolage of popular culture, technology, nostalgia and globalization.

"He tried to imagine this place the way it had been before, when it was a regular bridge.  Millions of cars had gone through here, this same space where he walked now.  It had all been open then, just girders and railing and deck; now it was this tunnel, everything patched together out of junk, used lumber, plastic, whatever people could find, all of it lashed up however anybody could get it to stay, it looked like, and somehow it did stay, in spite of the winds he knew must come through here  He'd been back in a bayou once, in Louisiana, and something about the the way it looked here reminded him of that: there was stuff hanging everywhere, tubing and cables and things whose function he couldn't identify, and it was like Spanish moss in a way, everything softened at the outline.  And the light now was dim and sort of underwater-looking, just as these banks of scavenged flourescents slung every twenty feet or so, some of them dead and other flickering" (185). 

The bridge is built from the dross of half-abandoned modernities, and its imaginative power comes precisely from this juxtaposition--a collage of castaway imaginings stitched together with networked hypermedia. 

So what does this trip down sf-memory lane have to do with MOOCs?  For those who look to MOOCs as the future of public education, the premises, and the oppositions they construct, are the same: on the one hand, online, vertiginous living, free from constraint, transcending all manner of (embodied) inequalities--race, ethnicity, social class, nationality.  And on the other, the brick-and-mortar university, here understood as an inertial drag on progress: sucking away at government money while sinking families deep into insurmountable debt.  But MOOCs, like cyberspace, are already bankrupt metaphors the moment they're deployed.  If nearly 40 years of leaden writing on "virtual" life has taught us anything, it's that "virtuality" itself can never stand alone.  The reason?  You can't live in a metaphor, nor can you learn in one.  Scholars working in distance education have developed much more robust experiences for students, but MOOC designers have largely ignored these innovations, because (one imagines) they are labor (and teaching) intensive.  But there's some deliberate ignorance at work, too.  President Qayoumi seems to believe that the MIT professors are "teaching" the students--a bit of magical thinking that is only possible in the prison house of 1980s metaphor.

But what is most galling to me--and takes me back again to Gibson's peregrinations--is the anachronistic imagination at work.  ICTs (information and communication technologies) have not developed in this direction--i.e., toward hermetically sealed online environments that allow us to jettison the "meat" of material life.  In fact, just the opposite.  As Lee Rainie and Barry Wellman point out in Networked, "Internet use does not pull people away from public places, but rather is associated with frequent visits to places such as parks, cafes, and restaurants--the kinds of locales where people are likely to encounter a wider array of people and diverse points of view" (Rainie and Wellman 2012: 119).  Rainie's careful research on networked life for Pew suggests that people utilize ICTs to make connections with people and places, not to transcend them: "ICTs are about society as well as relationships.  They support participation in traditional settings such as neighborhoods, voluntary groups, churches, and public spaces" (128).  In other words, ICTs unfold across urban fabrics--they form a kind of connective tissue linking together the heterogeneous "third spaces" that make up the daily round.  Through "present absence" and "absent presence," ICTs anticipate face-to-face meetings; they proliferate among urban inter-spaces, filling the gaps between meetings, commutes and leisure.  People spend time online not to escape from their offline lives but to enable them.  It is not by accident that social networking apps have exploded over the last 15 years, or that geosocial apps have become popular; people embrace new ICTs because they facilitate meeting in social and physical space.

But this is not what MOOCs have done, for reasons that have everything to do with economies of scale.  Like McDonald's, they're about "serving" billions and billions--replicating that statistics class over and over again--the Ivy League reduced to a Big Mac.  On the other hand, online education could mean more face-to-face communication with faculty, more integration with place and community, more interaction with peers on a variety of context-dependent levels.  Communicating with students through networked media platforms, meeting them in different (physical) places, extending and distributing learning across a bewildering variety of spaces, continuing learning (and presence) before and after formal teaching begins and ends: these are the potentials.  Yes--leaving the classroom behind, not by transcending it, but by extending it into new, fantastic topologies.  Or, alternately, creating "interstitial" spaces that link the university to place and identity in multiple ways.  If technological developments are moving us toward "the bridge" rather than "the matrix," then why not embrace that complex swirl of materialities, socialities and virtualities as they unfold in heterogeneous, unpredictable ways?  Why try to create homogeneous experiences?  Of course: resources, resources, resources.  But there is also a colossal failure of imagination here: being content to inhabit the dreams of earlier generations.




References

Farnell, Ross (1998).  "Posthuman Topologies."  Science Fiction Studies 25(3).

Gibson, William (1984).  Neuromancer.  NY: Ace Books.

--(1999).  All Tomorrow's Parties.  NY: Ace Books.

Lewin, Tamar (2013).  "Colleges Adapt Online Courses to Ease Burden."  New York Times (April 30, 2013): A1+.

Rainie, Lee and Barry Wellman (2012).  Networked.  Cambridge: The MIT Press.  
