Thursday, June 2, 2011

A Korean multiculturalism?

A journalist contacted me about race and racism in South Korea, and I summarized some of my thinking (and prognostications) for him.  You may not believe it, but I think some of the most interesting (and potentially positive) things are happening right now with attempts to address race and multiculturalism in South Korea.

Is there racism in South Korea?  Absolutely, although the real question here is: what is the context for Korean racism?  And how does it differ from racism in other countries?  “Minjok” is a neologism borrowed from the Japanese that refers to a national ethnos.  It’s not the same as US operationalizations of race—nor would it be accurate to simply gloss it as “Japanese”.  Instead, it needs to be contextualized in the colonial past—that is, while Korean minjok makes some of the same historical claims as Japanese minzoku (ancient, homogeneous lineage, glorious destiny), Korean nationalist/ethnic discourse develops first in the crucible of resistance to Japanese imperial ambitions, and then again in the wake of US occupation, partitioning (bundan) and the Korean War.  This is why you might find heavily nationalist rhetoric on both the Left and the Right—there are both conservative and progressive messages there.

But there’s another kind of “race” as well—this one very much the result of occupation by US forces during the USAMGIK period.  This “race” is, perhaps, more familiar to Westerners: the hierarchy of perceived phenotypical differences institutionalized in government, citizenship, employment, media representation, etc.  Koreans adopted this system as well.

Prior to the 1990’s, most people from outside Korea had little opportunity to experience either system—the resident foreign population was negligible.  But as that population has ticked upwards to 2%, so have opportunities for people to define themselves vis-à-vis racial others, and, in particular, guest laborers (who, whatever the complaints of expat American and Canadian English teachers, really bear the brunt of racism in Korea).  People from South Asia or Southeast Asia bear a double racial burden, being defined both as non-Korean and as dark-skinned.

As far as addressing these issues, there are all kinds of things going on right now in Korea, from lots of Korean academics studying multiculturalism, to lots of governmental and non-governmental organizations working to mitigate discrimination and prejudice.  So I absolutely see things changing in South Korea.  But some of the more deep-seated (and hence more serious) problems are probably the same factors that contribute to deep racial inequalities in the US: not the incidence of hate speech itself (which, of course, still proliferates there), but the unequal access to networks of contacts that, in Korea, are invaluable for everything from education and employment to housing and marriage.

This sounds insurmountable—and it certainly is challenging to progressive elements in South Korean society.  But it’s also exciting, because it means that whatever “multiculturalism” emerges in South Korea will be uniquely Korean—not, in other words, a recapitulation of the sometimes shockingly hollow US-style multiculturalism.  That is, it will address not only racial discrimination and differential citizenship, but also the post-colonial relations that reproduce these powerful inequalities.

So, I continue to follow this issue, not just because of my Korean research, but to get some ideas for building a more inclusive society in the US.     

Monday, May 16, 2011

The New Anthropological Science Fiction--A Review of Ekaterina Sedia

Over the past months, I have been trying to decide (if only in my own mind) what anthropological science fiction looks like today.  After all, if you're looking for "fully realized worlds" in the style of 1960's and 1970's fiction, you'll not find it.  Even authors synonymous with "anthropological science fiction" (e.g., Ursula K. Le Guin) have moved away from that style towards something more like what James Clifford has called "partial truths": "Ethnographic truths are thus inherently partial--committed and incomplete" (7).  But, that said, anthropological science fiction still exists, albeit not by that name.  Or, rather, what's produced today is a kind of anthropological science fiction under erasure.

That is, rather than the full (and functionalist) anthropological sf of the 20th century, what seems "anthropological" about sf today are exactly those partial, contradiction-ridden evocations of difference and alterity--no easy way to divide the alien other from the self.  

Ekaterina Sedia's The House of Discarded Dreams (2010) follows the dream-like adventures of Vimbai and her roommates through a perambulating, mutating house rife with variously mischievous spirits. It is absolutely in the tradition of the mysterious house--the genius of place that exerts its (oftentimes baleful) influence over its residents, from Hawthorne's House of Seven Gables to Lovecraft and beyond.

But when I read Sedia's novel, I had a more contemporary text in mind--Richard Grant's View from the Oldest House (1989).  Both, after all, feature disaffected college students discovering GREAT TRUTHS amidst an inexplicable house.  In Grant's novel, it's the tiresome Turner Ashenden, a Stephen Dedalus knock-off who is a spiritless foil for the postmodern jouissance that swirls decadently around him.

Sedia's text is certainly in that bildungsroman tradition, and even takes the same mise-en-scène, but with Vimbai, a New Jersey college student with parents from Zimbabwe.  Her own relationship to Harare is tenuous--some vague memories of pictures, coloring books, and her grandmother--i.e., a very different place than the Zimbabwe her politically active parents fled.

But the New Jersey beach house she moves into is just the place to explore tenuous memories, ambiguities and contradictions.  As the house inexplicably takes to sea, Vimbai and her roommates, Felix and Maya, gradually confront their unresolved conflicts through encounters with various spirits--baleful or beneficent.  For Vimbai, a bestiary of Shona folklore, from the appearance of her grandmother as an ancestral spirit (vadzimu) who, fortunately, can cook for the roommates, to the "man-fish" Njuzu, a "Zimbabwean urban legend" (110).  The house seems to materialize her ambiguous relationship to her family, to the experience of race and racism in the US, to her education (marine biology) and to her sexuality:

Obedient, Vimbai dreamt.  Her dreams were vivid--more vivid, it seemed, than the waking landscapes inside the house.  She dreamt of smells and sounds, of saturated solid planes of color.  She dreamt of Africa as she had half-remembered it from her trip, half-imagined from the coloring books her mother bought her, and then got upset when Vimbai colored children on the pages pink instead of brown. (142)
Their exploration of the house's "pocket universe" brings Vimbai up against these dreams, and up against a life that she only understands in half-articulated kaleidoscopes of memories inflected with her parents' post-colonial critiques of Mugabe's Zimbabwe.

As she confronts zombie horseshoe crabs stricken with soul loss, the cognitive dissonance is too much:

This collision of worldviews--one that allowed for talking horseshoe crabs and one that hinged on graduate school applications--made her breath catch in her throat, bowled her over, brought her to her knees, and she clutched her head in her hands. (86)
But, eventually, she begins to come to terms with herself qua the contradictory networks that run through her life, connecting her to family, to ancestors, to other women.  
With every passing second, the wrinkles on her grandmother's face grew more and more familiar, with the same inevitability as one's face is recognized in the mirror.  Soon, the vadzimu and Vimbai would not be able to tell where one ended and the other began. (242-43)
With her grandmother's animus, it is Vimbai who begins to spin her own kind of magic.  Telling her own contemporary versions of ngano (didactic folk stories), Vimbai is able to make peace with the trickster-figure man-fish and bring some semblance of order to her world.

It is an enigmatic novel--certainly as potentially narcissistic as Grant's View From the Oldest House, but never so self-assured.  Instead, the house stands at the intersection of global networks that bring together places, cultures, identities, social class, race and sexualities.
"That there are forces in the world," Felix answered.  "Forces that run along invisible wires--like phone wires of the spirit, and sometimes you get trapped in them like Peb, and sometimes you stumble in the middle and get caught like a fly in a spider web . . ." (71)
Sailing off into the ocean falls into that "there and back again" cycle, as much Odyssean as Earthsea, but it is also a way of enjoining a world of transnational, sociohistorical connections--a physical movement to mirror the movement of immigrants.

So what kind of anthropological science fiction is this?  Perhaps some would place it more with  fantasy, but I see here a desire to interrogate the world-making that characterized more assured science fiction in the 1960's and 1970's--think Michael Bishop's early work, but more complex and more uncertain.  With Sedia, it is not the experience of culture contact that is at stake, but the open-ended life of a person stretched between different identities, what Lila Abu-Lughod has called a "halfie".  Who is self?  Who is Other?

And this kind of question cascades into the consciousness of multiple connections puncturing holistic visions of identity: nature and culture, local and global.  An animism that takes Sedia's protagonists over and beneath the seas, and one that ultimately undermines our understanding of culture as unified, integrated and autonomously human (in the Cartesian sense).  


Thursday, March 10, 2011

The Future is a Foreign Country: locating tomorrow’s world in the world of the Other

It has been almost thirty years since Johannes Fabian published Time and the Other (1983), a scathing critique of the ways anthropologists have slotted the Other into “other” times—the “savages” or “primitives” said to resemble the West’s history.  In many ways, his critique is still relevant today; the same kinds of discourse are used to explain contemporary politics in the Middle East with reference to supposedly ancient ethnic conflicts.  But there are other temporal machinations at work these days as well.  A fairly typical, recent example: a February 22 New York Times article on South Korea’s ubiquitous computing (“For South Korea, Internet at Blazing Speeds is Still Not Fast Enough”)—years ahead of the United States.  Instead of being slotted into the past, here Korea appears as the future—underscoring US fears of being overtaken by Asian economies.  In this way, US futures are invoked in comparisons with the demographics, educational institutions, health care and environmental concerns of other nations, and there are other axes of comparison as well, with people in South Korea looking to Singapore or Japan (rather than the United States) for clues to their own future.
In an era of globalization, these “future states” proliferate, part of a perpetual state of crisis that constantly compares self to others, agitating for restructuring, free-market reforms, retraining, mobility.  Comparisons and rankings regularly contrast multiple indexes of neo-liberal development.  Conditions at home are critiqued, and the warning is clear: we may be overtaken by global futures that continue without us.  But unlike other forms of allochronism, these future states are multidirectional and stochastic.  While the West represented a privileged modernity at one time, now a diffuse, unsettled capitalism locates “the future” in several places simultaneously, along networked lines of flight that link, for example, Asia and the West together at different points.  In an age of neo-liberal globalization, images of the future travel along flows of capital, migrants and media, generating representations and desires that are at once diffuse and ecumenical, simultaneously critical of and complicit with the present.
Of course, thinking of Iran, South Korea or Singapore as “the future” is no more credible than looking to other places as representative of the past.  Here, we’re just reversing the gaze, while leaving the orientalist architecture in place—fear of “yellow hordes” updated for the age of the smart phone.  But there are more positive possibilities here as well—call it a “cultural arbitrage” that highlights gaps between people’s expectations for modernity and its unequal realities; that gap can open a window onto contradictory experiences and force us to question the course of our futures.  Ultimately, we might question the inevitability of neo-liberal globalization itself.
I’m planning to compare discourses on “future states” in the United States, South Korea and Singapore.  Through anthropological research on state reports, media, future-oriented events and expos, together with interviews with informants (parents, educators, employers, state technocrats), I plan to explore moments when the future is displaced onto the Other, with particular emphasis on technology, education, multicultural policy and health.
Ultimately, I believe my findings will tell us much about how a relentlessly networked globalization works to colonize future imaginaries.  But I also hope they will open up the possibility for alternative futures.  That is, the gap between what is perceived to be the present and a future purportedly located in another place may constitute what Ernst Bloch called a “utopian surplus”: the possibility for a different global future altogether.

Tuesday, February 1, 2011

Can A Place Be the Future?

In a January 26th New York Times op-ed, "25 Years of Digital Vandalism," William Gibson reflects on the Stuxnet attack on Iran's nuclear facilities.  As a genuine futurist, Gibson looks to Stuxnet as a sign of the times--and a bellwether for the future.  He confesses, "I briefly thought that here, finally, was the real thing: a cyberweapon purpose-built by one state actor to strategically interfere with the business of another."  But he's disappointed in the end, to find that Stuxnet is really just another virus--albeit one perhaps appropriated by one government against another.  He is ambivalent about the meaning of this for the future of nuclear security. 

One of Gibson's strengths is his restless, global search for sites of the future.  Here, he looks to Iran, but he is best known for his (highly selective) evocations of Japanese postmodernity.  But this is a never-ending quest--the future proves elusively peripatetic.  As he commented in a 1989 interview, “I think that at one time the world believed that America was the future, but now the future’s gone somewhere else, perhaps to Japan; it’s probably on its way to Singapore soon, but I don’t think we’re it anymore.”

But is this an ultimately pointless quest?  To what extent is it useful to think of the future as another place?  On the one hand, in an era of globalization, there's a certain temporal relativism at work.  One way of thinking of financial arbitrage (and other financial instruments) is precisely that: the exploitation of pricing irregularities that are a function of temporal distance.  After a relatively short time, these differences will disappear in a more homogeneous time of globalized capital.  But those are short, and necessarily fleeting, temporal distortions.   

In a sense, thinking of Iran, Japan or Singapore as "the future" is no more credible than looking to other places as representative of the past, a familiar tactic in 19th century anthropology, and still part of racist, ethnocentric depictions of non-Western peoples as "caught" in the "primitive past."  Here, we're just reversing the gaze--now, because of culture, politics or economy, the other place is thought to exist in an accelerated time horizon; looking at their "present" is said to grant us some insight into our future.

But our more quotidian moments are more obdurately Newtonian--or, perhaps better, "more Taylorist."  That is, after the work of F.W. Taylor, time for us is parsed out according to a unified, commodified form, ultimately synchronized into the monolithic, mechanical timepiece of global capital.

Still, there is a real point to looking past the U.S. or Europe for the future.  And not because it opens up onto some magic window onto the next, big thing.  Call it "cultural arbitrage"--the gap that opens up between global modernity and the kind of hopes and expectations people have for their lives.  Looking somewhere else doesn't mean that our life will become more like their life.   But it does open up the possibility for reflecting on similar conditions in the US.  That is, the "gap" opens up onto our contradictory experiences and expectations and forces us to question the course of our own futures.

We'll be doing this in August of this year with our study abroad course in Seoul, South Korea: "Harmony of Modernity and Tradition."  We'll be reflecting on exactly those tensions that open up between people's lives and the modernity that we all share.  We'll be visiting temples, shrines, factories, shopping meccas, nightclubs.  Along the way to making sense of it all, we'll reflect on what it means for us as well.  Seoul not as a window onto the future, but as a means for thinking about our mutual futures.


References

Gibson, William (1989).  Interview (February) with Terry Gross on "Fresh Air."  Washington, D.C.: National Public Radio.   

Thursday, January 13, 2011

Technologies of Waiting


I've been reading Orvar Löfgren's and Billy Ehn's The Secret World of Doing Nothing (University of California, 2010) in preparation for the Spring semester.  It's the first time I've used a work of ethnology (i.e., a comparison of different cultures) in the classroom, as opposed to the conventional, in-depth monographs that are the bread and butter of US anthropology.

Löfgren and Ehn explore the cultural and social life of non-events, i.e., those parts of our life that we ordinarily "bracket" as irrelevant--the times we wait in line, or idly stare out a window.  There are interesting questions here--especially with regard to methodology.  How do you do anthropology when no one thinks it's even worth talking about?

Not surprisingly, they find that our experiences of these kinds of phenomena are culturally variable, and that "our" (US and European) expectations for non-events are very much conditioned by a modernity which 1) sets up a variety of institutions to organize people into spaces that contain "empty" time: waiting rooms, departure gates; and 2) places a premium on "productive" time while making it immoral to "waste" time.

One of the results is an in-built tension between "using" and "wasting" time--a double bind which places people in situations where they must surrender to the "empty" time of waiting while at the same time craving the productive, commodified time of the protestant work ethic.  If modernity replaced meaningful time (Biblical, moral, mythological) with time as an empty variable, then it is not surprising that people would find this unsettling.  Accordingly, there have been many technologies developed to solve the dilemma of empty time:
The accelerated pace of everyday life in the Western world is often said to have influenced the way people feel about waiting.  A whole industry has been built up around diminishing delays. (28)
One of the major successes, of course, has also been the most Pyrrhic--the automobile has both sped us up and slowed us down--first by raising expectations for speed, then by crushing them with the multiplication of sprawl around the world.  Thanks to this effort to speed transportation (and the concomitant spread of suburbs), commute times are high: the average commute where I live (Maryland, USA) is 31 minutes.  China's average commute: 42 minutes.  Tokyo workers: 60 minutes.  This hasn't stopped the desire for faster transportation at all.  Indeed, based on The Secret World, one would have to prognosticate that the future will mean various other devices to accelerate.

Still, thinking about waiting and technology, I can imagine other desires besides acceleration.  For one thing, many of the information technologies that we utilize have little to do with "saving" time--in fact, they introduce a number of time effects that include different ways of parsing out time, the frisson of sudden time dilation, the rhythm of turn-taking, etc.  This has been a major draw in gaming: the introduction of "game time" (Tychsen and Hitchens 2009).  Other IT introduces different time effects, the point being less that they introduce "more" speed than that they demand that the user enter into a new pace.  Social network technologies aren't about speeding up or slowing down along a linear continuum so much as the introduction of different, temporal rhythms.  Aren't these temporalizations another reason for their popularity?

To take this back to Löfgren's and Ehn's book, the growing blight of "empty time" in the form of commuting and bureaucratization may give rise to various technologies of speed (in Virilio's sense), but will also stimulate the development of technologies that introduce new time effects.  "Empty time" acts as an abstract table upon which variously commodified, variously meaningful time effects can be overlaid--e.g., the rhythm of text messaging and the dialectic of anticipation and expectation produced in the space of that temporalization.  But it's the difference that's important there, not necessarily the speed.

References
Tychsen, Anders and Michael Hitchens (2009).  "Modeling and Analyzing Time in Multiplayer and Massively Multiplayer Games."  Games and Culture 4(2).

Tuesday, December 21, 2010

The Future of Mind

The New York Times has been adding blog content to its online site.  One of the most interesting (and most surprising) additions to the unfortunately named "Opinionator" section has been "The Stone," a forum edited by Simon Critchley, chair of the philosophy department at the New School in New York, that began in May.  It's a philosophy blog--a welcome addition, especially compared to the blogged content on other newspapers' sites (sports, crime, consumer news, entertainment).

Over the past couple of weeks, the columns have turned to critiques of neuroscience--or, should I say, a critique of popular representations of neuroscience, where every culture and behavior has its materialist correlate measured in the release of dopamine, the firing of neurons.  Which, of course, is on one level entirely true--we are biological creatures, after all.  But the results of neuroscience that trickle down into etiolated newspaper articles present the materialist reduction as "explaining" our complex lives--violence, love, etc.--in a way that seems calculated to shut down curiosity in science by suggesting that everything is on the brink of final explanation.

But "mind", like "body," is instead a perpetual work-in-progress, with room for sociological or even (gasp) anthropological speculations on what may emerge next.  In other words, the study of cognition is inherently future-oriented. 

A couple of the most recent columns come from one of the more well-known cognitive scientists out there, Andy Clark.  He's a popularizer, certainly, but one who has always argued for a more complex model of thinking.  In his December 12th column, "Out of Our Brains," he recapitulates the arguments for a "distributed cognition" (somewhat disingenuously described as a "current" movement even though it's been around for decades).

But he extends those arguments to ICTs--information and communication technologies:

If we can repair a cognitive function by the use of non-biological circuitry, then we can extend and alter cognitive functions that way too.  And if a wired interface is acceptable, then, at least in principle, a wire-free interface (such as links in your brain to your notepad, BlackBerry or iPhone) must be acceptable too.  What counts is the flow and alteration of information, not the medium through which it moves. 

This is not exactly a revolutionary idea.  The example James McClelland and his co-authors gave in their seminal 1986 paper was a simple arithmetic problem--multiplying two three-digit numbers.  How many can do it in their head?  And how many need a "tool" (e.g., pencil and paper) to "think" this problem through to a solution?  And if we accept that the boundary of cognition can be drawn to encompass the environment (in this case, the pencil and paper) around us, then there is little reason not to consider the information technologies we use in those processes as well.  Extrapolating on this to the future of cognition, we can safely predict that new tools will bring new, complex forms and configurations of cognition.  As Clark concludes:

At the very least, minds like ours are the products not of neural processing alone but of the complex and iterated interplay between brains, bodies, and the many designer environments in which we increasingly live and work.  

Fine.  Thank you Andy Clark, for the observation!

But where I begin to become more interested is with the idea that the "interplay" may go the other way as well.  We take it as axiomatic that--however extended our cognition is into the cell phones we deploy--"cognition" extends from the "I" outward, a Cartesian intentionality where "I" am the master of my many tools.  But couldn't it happen the other way?  Couldn't we be the "tool" of some machine cognition--a pawn, as it were, in the connectivity of our hand-helds?  We don't, I think, need to stoop to Hollywood science fiction to imagine this--indeed, this is the point of a whole branch of science and technology studies (Actor-Network Theory and its many spin-offs).  Our machines "exert" some of their own priorities onto us, and, rather fittingly, we become more "machine-like" in our thinking.  The moment you've moved outside of a room to get a better cell phone connection is the moment you've done your machine's "bidding"!  But how has this impacted our conversations and relationships with each other?

We can see this in Andy Clark's blog entry itself--"What counts is the flow and alteration of information, not the medium through which it moves."  He already conceives of cognition along the lines of information technologies--as quanta of information sent and received.  He has become (as have all of us) more "computer-like" in his cognition, just as our current development of multiple social networking platforms has made our social life more "network-like", and just as the universality of Graphical User Interfaces has made us capable (or incapable) of "multi-tasking".  That is, not just adding a new word ("multi-tasking"), but enabling people to consider cognitive actions as discrete "applications" that can be simultaneously undertaken, like opening multiple windows on a computer screen.

For the future, these are the interesting, unanswered questions: if we're doing "cell phone" thinking today, what kinds of cognitions will we be embedded in tomorrow?  What machines will we invent to help us think?  And how will those machines "think" with us?

References
McClelland, J.L., D.E. Rumelhart and the PDP Research Group, eds. (1986).  Parallel Distributed Processing.  Cambridge, MA: MIT Press.

Tuesday, November 30, 2010

Parasitic Twittering at the Anthropology Conference

I posted this at www.wfs.org as well . . .


I’m back from the American Anthropological Association Annual Meeting in New Orleans, Louisiana.  As expected, 6,000 of us shuttled between two huge, corporate hotels on Canal Street, soaking up hundreds of panels, poster sessions, round tables and workshops organized according to our association's unique calculus—unpopular panels (like mine) should be held in cavernous banquet halls, while popular topics should be granted a room the size of a bargain berth on a Carnival cruise.
  
But there was also Twitter.  By all accounts, a few thousand tweets from a handful of people before, during, and after our conference.  You can see them all archived under the #aaa2010 hashtag.

There was “Kerim” (as he is known at the anthropology blog, “Savage Minds” [savageminds.org]), alerting anthropologists to the “Twitter Meetup” at a restaurant near the hotel.  “Ethnographic Terminalia” announcing a party at Du Mois Gallery (uptown).  The jazz funeral for Walter Payton, the celebrated New Orleans bassist.  A book signing at an uptown bookstore.  Hints on getting around town; kvetching about the water “boil alert” (from Friday to Sunday).

Not exactly South By Southwest, was it?  It depends on what you were expecting.

Last year, there was an avalanche of blogging about the political power of twitter in Tehran—later (and rather embarrassingly for journalists who ought to have been more skeptical) revealed to be far less of a revolution than originally depicted.  But it’s par for the course for our society, where technologies are regularly accorded tremendous power to effect social and political change.  Malcolm Gladwell critiqued this tendency towards hyperbole in a recent New Yorker article.  He warns,

"It shifts our energies from organizations that promote strategic and disciplined activity and toward those which promote resilience and adaptability.  It makes it easier for activists to express themselves, and harder for that expression to have any impact.  The instruments of social media are well suited to making the existing social order more efficient.  They are not a natural enemy of the status quo.  If you are of the opinion that all the world needs is a little buffing around the edges, this should not trouble you.  But if you think that there are still lunch counters out there that need integrating it ought to give you pause." (Gladwell 2010)

In many ways, Gladwell is spot-on in his critique.  Too many essayists and academics write about Twitter the way people write about iPads or cell phones or whatever—as pivotal, ultimately deterministic technologies that are going to change the world in some beneficial way.   This is where marketing and scholarship meet: sales hype finds its hyperbolic echo in academic scholarship.  When the reality is less than game-changing, you’d think that these kinds of proclamations would become less common.  But the same commentators just move on to the next social media.

Ultimately, this distracts us from considering what social media do, and what they might do in the future.  Looking back at the modest twitter presence at the anthropology meetings, it would be hard to suggest that twitter represented an alternative to the main conference.  Nothing of the sort, really—most of the tweets were commentary, summaries or advertising for papers and presentations at the conference.  But what got retweeted the most were announcements for off-site events: little challenges to the monopoly of the conference site in the form of meet-ups, gallery showings and book signings.  In other words, there was nothing there that represented an actual alternative to the conference (not a new way to conference), just little nudges to conference attendees to consider supplemental events outside.

Here, twitter reminds me of Michel Serres on “parasite logic,” the way that an outside, third party (or medium) intercedes in a dyadic communication and opens the possibility for new meanings or new action.  As Brown (2002:16-17) writes,

“In information terms, the parasite provokes a new form of complexity, it engineers a kind of difference by intercepting relations. All three meanings then coincide to form a ‘parasite logic’–analyze (take but do not give), paralyze (interrupt usual functioning), catalyze (force the host to act differently). This parasite, through its interruption, is a catalyst for complexity. It does this by impelling the parties it parasitizes to act in at least two ways. Either they incorporate the parasite into their midst–and thereby accept the new form of communication the parasite inaugurates–or they act together to expel the parasite and transform their own social practices in the course of doing so.”

Twitter’s power lies in its ability to interrupt, supplement and catalyze different kinds of behavior: a medium that impels people to (briefly) diverge from their expected scripts at the conference and, say, take a trolley uptown.  This is a powerful potential—one that people like Clay Shirky have made a career of extrapolating upon.

But it is, ultimately, a parasite technology, one that requires the presence of more monolithic institutions to function.  That is, it supplements the school, the meeting, the demonstration, rather than moving to replace them.  More than that, its ontology rests on the presence of these more permanent, more powerful structures.  This hardly represents some grand failure on the part of social media—it’s just a reminder to look to the social contexts of media rather than media themselves.

Doing so can also free us to imagine other parasite technologies—cascades of social media that nudge, prod, intrude, implore.  We move to a future where social technologies will consistently fail to be transcendent—will fail to utterly transform the way we exist and communicate. But ultimately, the parasitic itself can prove transformative.

References

Brown, Steven D. (2002).  “Michel Serres.”  Theory, Culture & Society 19(3):1-27.
Gladwell, Malcolm (2010).  “Small Change.”  New Yorker 10.4.2010: 42-49.
