sleeping alone and starting out early

an occasional blog on culture, education, new media, and the social revolution. soon to be moved from

Archive for the ‘Facebook’ Category

on Cory Doctorow on how to say stupid things about social media

Posted by Jenna McWilliams on January 5, 2010

Originally posted at

“There are plenty of things to worry about when it comes to social media,” says writer Cory Doctorow in his fantastic Guardian piece, “How to say stupid things about social media.” Social media environments, he continues,

are Skinner boxes designed to condition us to undervalue our privacy and to disclose personal information. They have opaque governance structures. They are walled gardens that violate the innovative spirit of the internet. But to deride them for being social, experimental and personal is to sound like a total fool.

Yet plenty of perfectly smart people who should know better say exactly the foolish kinds of things Doctorow rightly decries in his post. Mainly, lately, the stupid things have been leveled at Twitter: It’s trivial. It’s banal. It’s too voyeuristic, or it’s a weak imitation of real relationships, or–and this is the one that really gets me–I try to use it in smart, deliberate, consequential ways, even though lots of my followers don’t.

In part, people who take stances like the above fail to see that the majority of the communication on sites like Twitter falls into the category of what Doctorow calls “social grooming.” He writes:

The meaning of the messages isn’t “u look h4wt dude” or “wat up wiv you dawg?” That’s merely the form. The meaning is: “I am thinking of you, I care about you, I hope you are well.”

Doctorow compares the “banality” of conversations on Twitter and Facebook to the conversations we have with coworkers. We ask a coworker if she had a good weekend, he writes, not because we care about how her weekend went but because we care about developing bonds with the people around us.

Yes, though that’s only part of the answer. In choosing to communicate via Twitter, I’m not only saying “I am thinking of you, I care about you, I hope you are well,” but I am also publicly announcing: “I am thinking of him, I care about her, I hope he is well.” These announcements are interspersed with my Twitter interactions with
people who are not close friends or even necessarily acquaintances–people I care about only in the most abstract sense. I follow just under 350 people, after all, and am followed by around the same number–a far higher number than I am equipped to develop deep relationships with. And lots of people follow and are followed by far greater numbers than I.

The creaming together of the personal and the professional, the public and the private, means that ‘trivial’ social interactions in online social networks, however much they seem to replicate those that pepper our physical interactions, actually represent a new social animal whose form we have yet to fully sketch. We’re all kind of blindly feeling our way around the elephant here. We who embrace social media technologies can scoff at the person who says an elephant is like a water spout after feeling only its trunk, or the person who has felt a little more and argues it’s like a moving pillar topped off by a shithole, but we would do well to remember that in this parable, everyone who tries to describe the elephant, no matter how much of it he has touched, can only describe it by comparing it to objects he has previously encountered. Twitter is similar to a lot of things, but in the end it’s its own elephant, identical to nothing else we’ve seen before.

This is why, as Doctorow points out, people rely on personal experience and therefore read Twitter and similar networks as trivial and banal instead of deeply socially meaningful. But it’s also why we need to take care to treat the social meaning as different from that which emerges through other types of (digital or physical) social interactions.

Posted in Facebook, participatory culture, social media, social revolution, Twitter | 2 Comments »

time to smack down the Wall Street Journal

Posted by Jenna McWilliams on October 12, 2009

(Don’t worry; I snuck around the pay wall.)

As my sister Laura put it when she sent me this article on why the Wall Street Journal is five years behind the times (read: why email is no longer the communication tool of choice), “It’s trying so hard to be ‘with it’ and in the flow of the times… But it seems stuffy and like a 40-year-old’s take on new media.”

The piece is called “Why Email No Longer Rules… And what that means for the way we communicate,” and it reiterates points that were interesting to tech folks a handful of years ago. (Here’s a 2007 piece on the decline of email from Gawker; here’s a 2007 piece on the same topic from Slate; here’s a 2006 blogpost on the issue; and so on.)

Among the “insights” of the WSJ article is that email seems painfully slow to us now:

Why wait for a response to an email when you get a quicker answer over instant messaging? Thanks to Facebook, some questions can be answered without asking them. You don’t need to ask a friend whether she has left work, if she has updated her public “status” on the site telling the world so. Email, stuck in the era of attachments, seems boring compared to services like Google Wave, currently in test phase, which allows users to share photos by dragging and dropping them from a desktop into a Wave, and to enter comments in near real time.

There is, reporter Jessica E. Vascellaro explains, a new phenomenon wherein we receive a constant stream of information, both personal and professional. There is a downside, as she points out:

That can make it harder to determine the importance of various messages. When people can more easily fire off all sorts of messages—from updates about their breakfast to questions about the evening’s plans—being able to figure out which messages are truly important, or even which warrant a response, can be difficult. Information overload can lead some people to tune out messages altogether.

Additionally, the speed of communication presents problems:

While making communication more frequent, they can also make it less personal and intimate. Communicating is becoming so easy that the recipient knows how little time and thought was required of the sender. Yes, your half-dozen closest friends can read your vacation updates. But so can your 500 other “friends.” And if you know all these people are reading your updates, you might say a lot less than you would otherwise.

Good lord! she exclaims. We’re surrounded by this constant stream of information! How will we manage it?

It makes sense that we would compare new forms of communication–Twitter, Facebook, text messages–to older forms of written communication like email and, going back more than ten years (!), letters, memos, and personal notes. If we compare newer communication technologies to those previous modes of written communication, then Vascellaro’s points ring true.

But here’s where Laura got it right: Thinking of Twitter as a faster, shorter, and less consequential version of email is an old-school paradigm that ignores the fact that, apart from working primarily with printed text, Twitter (and, more broadly, microblogging in general) is not like email at all. Anyone who approaches these new platforms with an attempt to figure out what’s ‘important’ and what’s ‘trivial,’ what needs to be acted on and what can be ignored, is missing out entirely on the spirit of these spaces.

In fact, Twitter, Facebook, and similar participatory platforms support a convergence of multiple types of communication. Twitter, just as one example, supports a type of identity work that was not previously seen in other communication environments. Through the careful combination of tweets about personal information, ‘trivial’ details, and professional interests, people are painstakingly (sometimes, especially at first, accidentally) crafting and presenting a coherent if fluid and flexible identity, which then informs the identities they present in other spaces, online and off.

What makes Twitter new is the particular combination of people and affordances. Facebook and similar social networks require people to send out friend requests that must then be accepted; it means people can control, fairly strictly, the size of their community. Twitter requires no such permission. Because I can follow almost anyone I want to, and because almost anyone who wants to can follow me, we’re seeing a fascinating intermixture of near and distant connections between people. I follow my best friend, who follows me; I also follow my idol Clay Shirky, who doesn’t follow me (yet); and I follow colleagues who fall all along the friendship continuum. Some of them follow me and some of them don’t (yet).

Vascellaro largely focuses on the professional implications of new communication tools, and agrees that one nice feature of these tools is that information is often available instantaneously–if you need to know whether a colleague has left work yet, you might check her Facebook status. The downside, she explains, is that

a dump of personal data can also turn off the people you are trying to communicate with. If I really just want to know what time the meeting is, I may not care that you have updated your status message to point people to photos of your kids.

In fact, if you’re using Facebook and Twitter just to find out the kinds of information you used to get through email and phone conversations, then the volume of information may feel overwhelming and prohibitive. But if you’re only focusing on how to use these tools to do the work that email used to do, then you’re kind of missing the point: Social media communication tools provide new avenues for doing deep identity work in communities that mix professional and personal relationships.

To be clear, Twitter is not email on steroids. Facebook is not like coffee circles for 500 people at a time. And blogs are not diaries with 100 or 1000 readers. Twitter is Twitter. Facebook is Facebook. And trying to parse these spaces by comparing them to previous one-to-many types of communications (like email) limits one’s ability to see the full range of the affordances of these platforms.

I don’t mean to hammer too hard on Vascellaro, who has written numerous interesting tech-related articles for the WSJ. But a quick look at her homepage reveals her biases (this, by the way, is another interesting aspect of an increasingly participatory culture: a public figure’s digital footprint becomes a matter of public interest). Her site points to her Facebook page, the details of which are locked to the public; the ability to lock down a Facebook page, in my experience, is a feature largely leveraged by people who struggle with the notion of mixing the personal and the professional. But increasingly, the ability to engage with this mixture–even by getting it wrong sometimes–is more valued and valuable than the ability to carefully separate the two.

A look at Vascellaro’s Twitter feed is even more telling. Her first several hundred tweets mimicked the style of Facebook status updates.

Eventually, she switched to a broadcast approach, mainly tweeting about interesting articles or linking to her own writing at the WSJ. These forms of participation are, just to be clear, perfectly legitimate. But other people do it better, and I imagine they get more out of the Twitter experience.

“Better” in this case, means “with deeper engagement in the collective meaning-making process supported by the affordances of Twitter.”

[insert contemplative pause]

Truly, I’ve been sitting here considering whether an examination of Vascellaro’s social networking practices is germane to a critique of her article. She links to her Facebook page (locked to outsiders) and her Twitter feed on her home page, and I believe that there is much to be learned about a technology reporter’s biases through an examination of her use of those technologies. But I wonder how I would feel if someone picked apart my use of Twitter, Facebook, and other social media platforms. At the very least, I might be engaging in an ad hominem attack. But then I think, if a school reporter critiques public schools, we should try to find out where she sends her kids. If a tech reporter smacks down Apple products, we should find out what kind of products she uses at work and at home.

Am I being a crudwad for examining Vascellaro’s digital footprint? Is it relevant to the issues she identifies in her piece? I would love for people to weigh in on this. In fact, I think I’ll try to get Vascellaro herself to weigh in.

Posted in Facebook, journalism, new media, Twitter | 5 Comments »

…and yet I don’t use Facebook much lately

Posted by Jenna McWilliams on September 15, 2009

This blog has been an unofficial Don’t Say Facebook is Over zone. I’m not quite willing to let go of that stance, especially since statistics suggest that Facebook activity continues to increase worldwide.

But you guys, I really don’t use Facebook that much anymore. And I’m not alone: Lots of my friends have drifted away too. Most of us prefer Twitter now, which means that one of the more interesting features of Facebook–the friend newsfeed–is clogged up by lame quiz results and remediated tweets that I’ve already read. All of the interesting stuff is going on over at Twitter now, and Facebook is starting to feel like the social networking version of a print newspaper: I already got all the important news elsewhere, and the rest of what’s there feels like filler.

More significantly, gaining a new Twitter follower feels like a bigger win to me than adding a Facebook friend does.

Now, I don’t want to open myself up to accusations of Virginia Heffernanism. I’m not going to argue that my experience is symptomatic of any larger social networking trends. As far as I can tell, Facebook is far from an “online ghost town.” In fact–and this seems important–as Facebook increasingly becomes the domain of an older and generally less social networking-savvy demographic, it’s shifting to accommodate its new users’ needs and interests. Though it has certainly tried, Facebook just can’t keep up with the dynamic, socially complex Twitterspace; and the more it embraces this fact, the more it attempts to fortify the features it can uniquely offer, the more likely its continued success becomes.

Posted in Facebook, social media, Twitter | 3 Comments »

what is learning (in new media)?

Posted by Jenna McWilliams on September 9, 2009

Alert blogtrollers may have seen multiple posts recently with titles identical to the one accompanying this post–that’s because we’ve been asked by learning scientist and new media researcher Kylie Peppler to address this very concern. The question–what is learning in new media?–is too broad for anyone to address within the context of a single blogpost, but if we all set to work, we might get that turkey stripped down to its bones by the end of the night.

My chunk of the turkey is time.

When I joined Twitter, I lurked for months and months without tweeting a thing. When I finally did join the community as a good, earnest citizen, I started out slowly and picked up speed as I learned to negotiate the community’s norms and embrace the valued practices of the space. Now, a year and a half later, I can communicate fairly clearly the spoken and tacit norms of the Twitterspace.

I did the same thing with Facebook, Wikipedia, and blogging–looking around for months before joining the community. By doing so, by taking the time to consider the space I was entering, I was able to reflect on others’ practices before offering up my own. I read thousands of blogs before starting my own. I worked with friends to learn how to edit Wikipedia. And I was coerced by another friend to join Facebook; the rest was up to me.

I recently spent some time working with Scratch, a simple visual programming language designed for young learners. As the site explains,

Scratch is designed to help young people (ages 8 and up) develop 21st century learning skills. As they create and share Scratch projects, young people learn important mathematical and computational ideas, while also learning to think creatively, reason systematically, and work collaboratively.

I’ve designed exactly two projects in Scratch; the first was about a year ago, when a colleague spent the morning helping me work up a little thing I call Jimmy Eats World.

To play this project, click the green flag in the upper right.

I’m annoyed with myself that I didn’t make the flying hippo actually disappear at the end of the project, and if I wanted to I could open up the program and make it so. Or I could turn the main sprite, the walking cat, into a hammerhead shark announcing my blog’s url.

I could do that if I wanted to, because I am a highly resourceful independent learner who has the passion and the time to devote to projects like this. I find them personally and epistemologically meaningful–I feel enriched, and I feel that the time I devote to these kinds of projects makes me a better, more useful and proficient blogger and educational researcher.

Time, the friend of the highly resourceful learner, is the enemy of teaching. Time: There’s never enough and even if there were, it couldn’t be spent on tinkering. There’s content to cover, and not just in the name of high stakes tests. A teacher’s job–one made ever more challenging by the social revolution–is to equip learners with the knowledge, proficiencies, and dispositions that will suit them well for future learning. There comes a time when the teacher must say, It’s time to stop with Scratch and start on something else.

Which is a deep shame, because it’s the tinkering, the ability to immerse oneself in participatory media or a learning platform, that fosters a real fluency with the space.

This is a key feature of what it means to learn in new media: the choice to engage with certain tools, to join up with certain affinity spaces, beyond the time required by schools. Clay Shirky writes that the days are gone when we could expect to do things only for money; we’re in an era when the greatest innovations emerge not for money but for love.

If learning in new media takes time, passion, and some combination of intrinsic and extrinsic motivations, then on its surface school seems anathema to a new media education. In fact, it may be that engagement with participatory practices is exactly what schools need at a time when they are struggling to remain relevant to the real-world needs, experiences, and expertise into which learners will ultimately emerge.

Posted in academia, academics, blogging, Clay Shirky, creativity, education, Facebook, graduate school, MIT, new media, Ph.D., schools, shark attack, social revolution | 10 Comments »

"Things fall apart"? SRSLY?

Posted by Jenna McWilliams on September 2, 2009

Over at the HASTAC blog, Cathy Davidson has posted a fantastic piece about a so-called “Facebook Exodus.”

Davidson’s post is called “Is Facebook the Technology from Hell?” and it tackles a New York Times article by Virginia Heffernan that suggests that

Things fall apart; the center cannot hold. Facebook, the online social grid, could not command loyalty forever. If you ask around, as I did, you’ll find quitters…. [W]hile people are still joining Facebook and compulsively visiting the site, a small but noticeable group are fleeing — some of them ostentatiously.

Davidson, while acknowledging her affinity for other Heffernan-authored pieces, rightly attacks this article for sloppy research and a bald-faced refusal to interpret data rationally. First, Davidson explains,

The “small but noticeable group” she documents are her friends. Their reasons are the ones that any wise FB user needs to be cautious of. Privacy, mostly. Of course FB is datamining. It’s “free,” right? Well, no. As every Cat in the Stack user knows by now, the “information is free” fantasy has been over for a long, long time. If it is free, they are gathering information that they can sell on the backend. There is no free lunch and no free Internet.

While it’s certainly true, Davidson adds, that Facebook’s popularity is declining among the younger demographic and it likely won’t remain the behemoth it is now for the rest of time, there’s no reason to think it will turn into the “online ghost town” Heffernan believes it’s doomed to become–a ghost town, by the way, “run by zombie users who never update their pages and packs of marketers picking at the corpses of social circles they once hoped to exploit.”

But Davidson’s most important point is this:

methodology, people! We have to hold mainstream media responsible in the same way we hold the Internet bloggers and writers responsible. One’s five friends are not necessarily the best filter on the world.

It is passing peculiar that the journalistic revolution–everybody, Clay Shirky writes, is a potential media outlet–is being covered by journalism’s old guard, the very people whose vocations are threatened by new media platforms like Facebook, Twitter, blogging applications and forums. Humans have consistently proven their ability to see only what they want to see and ignore the rest; print journalists, for all their training in “objectivity” and “fairness,” are really no different.

Of course not all print journalists are focused on studiously ignoring the social revolution, despite the overwhelming likelihood that it will come at the cost of their entire field as we know it today. For proof, just follow any journalist who actively uses Twitter as god intended it (I recommend David Carr, David Pogue, and Rachel Maddow).

Still, the question remains: Given the inherent bias of print media outlets toward print media outlets, how do we decide what to trust? Is it true that Facebook, Twitter and the like are suffering from a decline in popularity, that online reportage is less reliable than print outlets, or, indeed, that print journalism is really in the dire straits it claims to be in?

Posted in Facebook, journalism, new media, social revolution | 1 Comment »

what is new media literacy?

Posted by Jenna McWilliams on August 17, 2009

Until about a month ago, I worked for a research group called Project New Media Literacies. During my tenure there, the group’s Creative Manager, Anna van Someren, produced the following video to describe our work:

I love this video and think it does a fantastic job describing the focus of Project New Media Literacies. What it doesn’t do, however, is answer the question my friend Kathleen asked me the other day: “What is new media literacy?”

Here’s my answer: It’s like print literacy, only different.

A short definition of print literacy

Think about learning how to read. You start by figuring out words and sounding out short sentences. Kids spend a lot of time learning vocabulary, practicing with different kinds of texts, and writing their own texts. The whole point of that is to increase learners’ fluency with the words, symbols, and markers that comprise a language, so that when they encounter an unfamiliar type of text they’ll be able to decipher it in context. By learning how to read this

you can also (theoretically) develop an ability to read this:

and this:

and this:

And even if you can’t exactly decipher everything included in the examples above, most people would likely at least know what kind of text they were looking at and, even if they didn’t know what opah was or whether it tasted good, they would at least know how much it cost to find out.

New Media Literacy
Keep in mind that, though we tend not to think much about this, there are tons of technologies involved in creating and communicating print messages. Word processors are communication technologies, of course, but so are pencils, quill pens, and telegraphs–even language itself is a technology, an invention devised to support a specific kind of communication.

New media literacy starts from the premise that digital technologies like email, Twitter, chatrooms and so on are simply new communication resources that can be clustered in the same category as pens, paper, and the printing press. While they’re the same class of technologies, however, the types of communication these new digital technologies support are significantly different from those supported by print technologies.

One interesting feature of print literacy is that while it’s related to oral literacy–the ability to speak and understand a language–oral literacy and print literacy can, in theory and often in practice, exist independently of each other. This is because what it takes to interpret the symbols that make up a spoken language (deciphering a series of intentionally ordered sounds) is fundamentally different from what it takes to interpret the symbols of a printed language (deciphering a series of intentionally ordered marks). Print literacy, however, is built on the shoulders of oral literacy: While we can easily imagine a powerful public speaker being functionally illiterate, it’s practically impossible to picture someone who is able to read but unable to speak or understand the language she can read.

Print literacy and new media literacy are connected and separate in a similar way. In order to master new media platforms and social communication tools, you have to possess a fluency with print media (in addition, increasingly, to visual and sound-based media formats). On the SAT, here’s how this all would play out:

oral literacy:print literacy::print literacy:new media literacy

In other words, print literacy is necessary but not sufficient, because the conditions surrounding print media in social communication environments are fundamentally different from those surrounding print media in, for example, a textbook.

The goals behind new media literacy education are, however, the same as those surrounding print literacy education: To support learners’ facility with a set of texts and allow them to navigate new media platforms with relative fluency. There’s no point in learning how to edit Wikipedia, for example, if it doesn’t offer us skills that carry over into other collaborative knowledge-building environments. Twitter might (though I doubt it) flounder and fail within five years, but that doesn’t mean learning how to engage with it is pointless. Through Twitter, we can learn how to build and participate in a community that features a largely invisible audience, persistence of information, and tacit but fairly firm rules for engagement. If we can learn how to jump into the world of Twitter, then, we might also learn how to navigate this:

or this:

or this:

All of the above technologies are built on combinations of oral and print literacies, but that doesn’t mean knowing how to speak, read, write and understand are enough. The words may be the same, but the social competencies required to decipher them inside of their context are different.

There: new media literacy. It’s kind of like learning how to read and write and kind of not like that at all.

Posted in education, Facebook, Google, participatory culture | 1 Comment »

blogging burns more calories than watching tv or sleeping

Posted by Jenna McWilliams on August 10, 2009

I recently went nearly completely offline for ten full days as I packed and moved from Boston to Indiana, having the unbelievable good fortune along the way to witness the first minutes in the life of my brand new niece, Morgan Brianna DeGeer. During this time, I only found the time and connectivity to publish two blog entries, post four tweets, and enter five Facebook status updates.

Being offline was hard but not impossible, thank god; I halfway expected to suffer from serious irritability and sudden fits of rage and sadness. What I missed most was my daily morning routine of waking up, reaching over to the passenger side of my bed, and grabbing hold of my laptop. This is a routine I’ll be glad to get back to.

And I’m not alone, according to a recent New York Times article that describes an increasingly typical A.M. routine:

This is morning in America in the Internet age. After six to eight hours of network deprivation — also known as sleep — people are increasingly waking up and lunging for cellphones and laptops, sometimes even before swinging their legs to the floor and tending to more biologically urgent activities.

Some people–including some interviewed for the NYTimes article–may decry this new trend as unnatural, antisocial, or unhealthy. I can’t speak for the experience of others, but for myself, I disagree with this analysis. (And here I risk being part of what another New York Times article calls a potentially problematic anti-print media “drumbeat.” “This drumbeat,” Michael Sokolove writes in the piece about the faltering of Philadelphia’s major newspapers,

a relentless declaration that print is doomed, may be a problem in and of itself, making it easy to cast anyone who wants to save print as a Luddite.)

Perhaps “lunging” for cellphones and laptops before emptying your bladder might be considered unhealthy, but only if you think of the lunging as on par with waking up and reaching for the TV remote. Watching television, after all, is the ultimate passive activity, burning a mere 68 calories an hour (compared to the 46 calories burned per hour of dead sleep). But for a lot of people, opening a laptop is practically the diametric opposite of turning on the tv: Instead of watching something someone else made, they get to make something for themselves and others, to build something new out of nearly endless buckets of clay that get replenished by the day, the hour, the minute.

In a previous career trajectory, I was a newly minted poet freshly emerged from an M.F.A. program. Most mornings, I woke up early, flipped open a notebook, and wrote. That activity seems to me now to be innately self-contained and self-absorbed, existing as it did in an intentional vacuum. I don’t know how many calories blogging burns per hour, but I know it generates both light and heat for me and, I hope, for other people who land here. It’s why I haven’t yet been swayed by accusations that blogging, tweeting, and working with social networks are vain, self-centered, and self-aggrandizing acts: When leveraged in the most interesting ways, these media platforms become not only the materials molded out of clay but the raw materials from which others may build their own designs.

Oh, and here’s a pic of my new niece. I swear she is exactly as gorgeous as this photo suggests.

Posted in blogging, creativity, Facebook, journalism, participatory culture, social media, television, Twitter | Leave a Comment »

it might not be a lot but I feel like I’m making the most

Posted by Jenna McWilliams on July 25, 2009

living and leaving with less

This is my last weekend in Boston. In a few days, I’ll be closing up shop, losing my internet access, piling some items into a truck, and heading to points midwest.

I’m not going to bother using this post to detail the emotional tumult inherent in this kind of move, because that feels lamely self-indulgent, even to someone who spends a huge chunk of her time broadcasting her thoughts on at least three different blog sites (here, here, and here). Besides, you’re probably reading this blog for one of two reasons: You know me and therefore care about my emotional state, but have received private updates; or you don’t know me and don’t particularly care how I’m feeling this morning.

Instead of tearing open my chest and splaying my guts across this post, then, I just want to focus on something interesting I’ve noticed while packing: It’s a whole lot easier to get rid of stuff than it was during my previous moves (of which there have been nearly two dozen in the last 14 years, including three major regional moves and multiple cross-town or cross-state relocations).

For one thing, I no longer need to carry with me certain types of materials. I’ve gotten rid of hundreds of books, including over a dozen dictionaries, thesauruses, and style guides. (I kept the dictionary I won as a spelling bee champion, but only for sentimental purposes.) I shredded and recycled reams of paper documents: tax returns, credit card bills, rental agreements and contracts. I don’t need them. They’re all online.

For another thing, we just don’t generate as much physical stuff as we used to. My friend and former coworker Debora Lui experienced a complete laptop failure–her second in a year–last summer as she was finishing her master’s thesis. While the first failure reduced her to working from “printed pages, (her) memory, or scattered hand-scribbled notes,” the second failure was a much different experience. She writes:

Miraculously – with all my Google Doc usage, emailing out, saving my information on remote sites – I found that I not only had one good copy of my thesis, but several copies, saved and transfered at different points of revision. I found that my other files like photographs and videos (which normally I would have been upset about losing) were also strangely distributed across the web through sites like YouTube and Facebook. While I had previously thought of my life as being contained in one place, it was suddenly shown to me as a vast network for links and uploads.

As Deb explains, we–and young people especially–collect and hold on to more everyday detritus than ever: More photos, more written communications, more logged and archived conversations. Yet because of digital technologies, the space this material takes up is so close to zero that it is, as Chris Anderson writes in Free, “too cheap to meter” and “too cheap to matter.”

Why not take a hundred photos of yourself posing in front of a full-length mirror? Why not save every email you ever received or sent from every single one of your friends? Eventually your Gmail account may hit 5% of its total storage space, but it’s more likely that Google will increase storage capacity before you even hit that number.

My buddy Russell Francis, playing on Dorothy Holland’s notion of history in person, calls this phenomenon “history in laptop.” Summarizing a study he conducted of graduate students’ media habits, he writes that

Over time traces of students’ lives, past and present, become ingrained into students’ personal media environment through a process of inherited, evolved and mindful design. Archives of e-mails, letters, essays written as undergraduates, digitised photographs and digitised music collections also started to accumulate on many students’ laptops. Traces of Jacob’s participation in various environmental groups, traces of Jim’s participation in multiple human rights organisations and traces of Clinton’s long history of avid news reading were evident in the links, shortcuts and contacts designed into their personalised mediascapes. Furthermore, traces of their connections to others accumulated as entries in contacts folders and instant messenger ‘buddy lists’; tools that allowed students to remain in touch with former lives and former practised identities.

The point is well taken, though the term itself seems a bit of a red herring: It implies a history that’s located in a concrete place, albeit one that uses space in a way that’s much different from, for example, books and letters and mementos. In fact, history in laptop may be a more accurate term for how identity was stored as recently (and as long ago) as 3-5 years ago; today, history is stored across a virtual space no longer constrained by such silly contraptions as hard drives and memory cards. If my computer crashes, I’m likely to retrieve nearly all of the data that was stored on it–okay, let’s say somewhere around 80%. Still, that’s an awful lot to retrieve, given that history that resides in the brain is gone as soon as the blood flow is cut off.

Anyway, my point is that I carry around less stuff, and the less will get lesser with every passing year. Interestingly, this makes it easier to drift physically but harder to drift emotionally. We can, and often do, maintain the types of everyday connections with family, friends, and acquaintances that at least approximate the experience of physical proximity. My sister can send me a photo of her wardrobe choice for her first day of law school; we can chat online about which shoes she should wear, where she should buy her books, and how heavy her backpack is. I can follow her blog, her Facebook updates, and her tweets, and she can do the same for me. And, more importantly, all of these things are equally possible for me to do with, for example, the cluster of people I met at a recent conference, whether they live in Boston, Bloomington, or Cape Town.

For now, let’s call it “history at large.”

Posted in blogging, distributed cognition, Facebook, graduate school, participatory culture, Twitter | 1 Comment »