Pew Survey Finds Rising E-Reading, Continued Dominance of Print

A January 2014 survey from the Pew Research Center’s Internet Project demonstrates that while e-reading and e-reader ownership are on the rise, print isn’t going anywhere either. About 70% of U.S. adults read a print book in the last year, and only 4% of readers are “e-book only.”

The typical U.S. adult read 5 books in the past year (the average is 12, pulled up by a small group of avid bookworms), and 50% of Americans now own a handheld device like a tablet or e-reader. More and more Americans are turning to tablets for e-reading, even as the number of adults who own a dedicated reading device like a Kindle, Nook, or Kobo jumped from 24% in September 2013 to 32% in January 2014.
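
That gap between the “typical” reader and the average is just the usual skew of a median versus a mean: a handful of heavy readers pulls the average well above what most people actually read. A minimal sketch in Python, using made-up reading counts rather than Pew’s underlying data, shows the effect:

from statistics import mean, median

# Hypothetical sample, not Pew's data: most people read a few books,
# while two avid bookworms read dozens.
books_read = [0, 1, 2, 3, 5, 5, 6, 8, 40, 50]

print(median(books_read))  # 5.0  -> the "typical" reader
print(mean(books_read))    # 12.0 -> the average, inflated by the bookworms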

At our “Knowledge Systems” book sprint in January, many of our collaborators wrote about the continued vitality of print books, as well as the great degree of cultural capital and nostalgia that has congealed around the printed, bound word. This Pew study finds that while Americans become increasingly comfortable with e-reading, their fundamental relationship with the worlds of words and literature continues to be an analog one.

Read the full results of the survey by Kathryn Zickuhr and Lee Rainie at the Pew Research Internet Project: http://www.pewinternet.org/2014/01/16/e-reading-rises-as-device-ownership-jumps/.

Image courtesy of Pam Lau, used under a Creative Commons license. Thanks Pam!

Features of the Future Digital Textbook


In lieu of writing, we drew things…

Overview

Intro to learning node: in this case a class on Shakespeare

Provides historical context. Features include chat, place for notes, student directed content (student can choose which area(s) to explore)

The play. Features include chat, video, AI scaffolding

One of 3 cognitive engagement activities: Here, students watch the scene they just read.

One of 3 cognitive engagement activities: Here, students discuss the scene with their peers.

One of 3 cognitive engagement activities: Here, students write an essay about the scene they just read.

One of three assessment activities: writing assignment

One of 3 assessment activities: Students do a performance

One of 3 assessment activities: Students do a creative writing activity

Finished product

Because Community

image via knottytotty.tumblr.com

Pretty much all I teach these days are classes on the study of writing in digital communities. For 15 weeks, students in my undergraduate and graduate courses embed themselves in a space of their choosing and investigate how participants write, read, communicate, and think in that digital network. I’ve had the pleasure of reading studies on interesting linguistic constructions like the “because noun” and “I can’t even.” I’ve learned about the ways that language gets debated on the black hole that is Tumblr, and I’ve witnessed countless ragequits and twittercides as they are documented and analyzed by the student scholars in my classes who write with clarity and confidence about the people in the communities they study throughout the semester.

Image from http://the-toast.net/2013/11/20/yes-you-can-even/view-all/

We talk about the difference between image macros and memes (the terms are often taken to mean the same thing, when one is actually a subset of the other). We construct research questions that often boil down to: “Why would anyone waste their time on that?” We then design qualitative (short-term) ethnographic studies that attempt to account for why people spend hours a day buying and selling pixelated items in virtual auction houses, or why it’s not cool to retweet a post from someone’s protected account. Students have taught me the difference between “bro” and “brah,” learned via investigative research into fantasy sports leagues. They’ve explained doge to me in ways I could never have understood without their assistance. Best of all, we have learned together how difference is best appreciated when experienced firsthand. The rest of the world may not understand my obsession with flowcharts, but my fellow Pinterest users sure do. To them, it makes perfect sense why anyone would want to spend hours a day curating their niche collections of taxidermy photos and DIY lip balm recipes.

I’ve always believed that to study language is to study people. To study how people write and value texts and paratexts in their everyday lives is to appreciate perspectives that were perhaps previously misunderstood. From the insides of these communities, we can make and share meaning in ways that feel different and somehow new. Take, for example, the 19-year-old Tumblr user who created a comic about white privilege. The comic itself generated a huge buzz and loads of negative backlash from nasty Tumblr users. But in the end, it’s a teaching moment for those of us who study the ways that people use Internet-based writing spaces to communicate with one another. On the one hand, this communicative form enables hate and ignorance in countless ways. On the other hand, it exposes hate and ignorance in concrete, readable, consumable ways, too. The raw, unedited, unfiltered Internet communities are rich with opportunities to teach students about the power of language and text. I believe strongly in exposing students to both the bloody awful and the radically accepting ways that digital textual communities shape our lives.

Screen grab from http://imgur.com/gallery/l8Rdg

In 2006, I was a co-author on a white paper titled “Confronting the Challenges of Participatory Culture: Media Education for the 21st Century,” primarily written by one of my mentors, Henry Jenkins. In that piece we wrote about something we called “the transparency problem”: the notion that adults (educators, parents, mentors, media makers) often mistakenly assume that because young people are “born digital” as “digital natives” (an idea, by the way, I wholeheartedly disagree with), they must be so rhetorically skilled at interpreting media messages that they don’t need our help “to see clearly the ways that media shape perceptions of the world” (p. 3). While it is definitely true that some people younger than I am are more knowledgeable about digital tools and communities than I am, it is equally true that I still have plenty to teach them about these spaces, too. That’s why we work on understanding these spaces together. Shared understandings of shared languages, artifacts, and activities enable us to become better thinkers and writers, and that, in turn, enables us to share better thinking and writing with other communities, like the folks participating in this Sprint Beyond the Book. Thanks for reading, and feel free to invite me to understand your weirdo niche subreddit or strangely addictive Pinterest board.

Exhuming the Mastodon


“Let us tenderly and kindly cherish, therefore, the means of knowledge. Let us dare to read, think, speak, and write.”

—John Adams

As a cultural historian, and one involved in rethinking graduate education, I find the notion of pathways resonant in obvious ways. We are heirs to a tradition of valuing archives that are arranged synchronically and chronologically (classes, curricula (from the Latin currere, “to run”), and credentials) to effect a set of knowledge outputs and practices—the educated individual, critically forged and capable. That person extends the means and ends. So, John Adams, thanks.

But what happens when those means clot or forestall the impulse to dare and act in language—when the pathways become sclerotic and unnecessarily difficult? I’m thinking, for the moment, of the dissertation as we’ve inherited it from the nineteenth century. It takes the form of a thesis, but is really a book: chaptered, indexed, bound. It must be “defended” in an oral meeting that theoretically works as an opportunity to counter and call bullshit on written material that can cloak error or ambiguity in its formal, officializing guise of print. The defense completes the delivery of new knowledge by the newly “minted” scholar.

We might view it as a kind of curtain lifting, not unlike the iconic Charles Willson Peale in his self-portrait as gatekeeper to the objects of knowledge: “The Artist in His Museum,” 1822.

Peale

Since 1822, the museum of scholarly production has advanced through a few more chambers, but the performance and the architecture are basically the same. Of late, we take the text product, make it a codex via arbitrary formatting, contract with ProQuest to digitize it and make it available on the Internet (not open-access, but close), and then usually provide it to the degree-granting institution’s library to archive. Many humanities students have begun to choose to forego publication at the moment of credentialing, for fear that they might be foreclosing their pathway not into “knowledge” but into the publication systems that market knowledge: academic presses embedded in a shrinking trade in knowledge commodities.

But that access issue is almost the least of the problems with the PATHWAY of doctoral credentialing. It’s the form itself. That culminating experience is the place where the “running” in curriculum hits obstacles, stalls, crashes, burns, evaporates. Perhaps the digital offers ways to dredge the riverbed and make that knowledge system much more fertile.

I’d like to see dissertations that continue the curriculum—that are, as the MLA and AHA are taking preliminary steps toward advocating, process projects. They would arise out of a richer mix of inputs, expanding beyond an advisor and several co-advisors to include communities of intra- and inter-institutional faculty and students. They would break down the wall between institutional knowledge and its publics by inviting widespread access to the project as a work in process. Graduate faculties would be configured to critique and follow real-time progress rather than rely on dangerously episodic check-ins. The archive, too, would not be spatially remote, giving the student little excuse to get “lost.” Indeed, the line between reading and curating would be forever blurred. And the metaphor of “defense” becomes unnecessary, since the discrete, bounded knowledge output, the one we must “suspect” of flaws, would have always already been produced through an engagement with multiple voices and assessments.

So rather than Peale in his museum, we’d have the dissertation as collaborative dig, pulling forth, over time. As in:

Peale

Also by Charles Willson Peale, this is an image of “The Exhumation of the Mastodon, 1805-08.” Note the temporality Peale foregrounds, the wheel in motion, the dating over a three-year period—this is a rendering of process. And it’s a process of manufacturing knowledge collaboratively, over time. It is a lesson from the past about how not to bury things.

The Book (and E-Lit) as Nostalgic Object


Digital fluidity not only facilitates the creation of printed media that have no right to exist physically (that should stay digital and not “waste” paper; the using up of these resources clearly pushes our buttons, out of concern both for conservation and for cultural capital: that gets to be a book?), expanding (or shrinking, depending upon your perspective) authorship; it also raises questions of access: how do we ensure these texts remain available as platforms change? As Michael Simeone notes, digital books are far more brittle than their physical counterparts and decay in a far different fashion. Sally Ball has addressed the way this ephemerality impacts conceptions of authorship: knowing that our works are likely to become dated within a short span of time prevents many writers from experimenting with new media and alternative or app-based publishing forms (many poets won’t even reference the contemporary moment in their work, lest a temporal reference prevent its resonance for subsequent generations).

I myself collaborated on a book of augmented reality poems whose content can change at the drop of a hat, since the text does not appear on the pages but only comes to life when those pages are presented to a webcam, emerging from barcode-like markers on the page’s surface (in fact, the reader herself can now change what appears on-screen, thanks to a web-based tool my collaborator Brad Bouse developed). That very terror of dating oneself can, however, also be seen as liberatory: if we fail, we can erase the evidence, and we can even adapt or update our work to meet a new audience. If Michael Simeone’s doomsday predictions are accurate, then “what, me worry?” about whether my book is accessible a year or two from now? Poets are always accused of fiddling while Rome burns, so to worry about who’s listening only expands our image of writerly narcissism.

To be serious, though, this state serves as a reminder that a book is an event, a performance between reader and page. Artists have known far longer than writers that the best way to save the ephemeral (happenings, performance, some land art) is through documentation.

Art in the Age of Mechanical Reproduction, a Philadelphia purveyor of fascinating goods and spirits.

Though I may be willing to give up on work that can no longer be supported, scholars like Lori Emerson, Dene Grigar, and Stuart Moulthrop are doing wonderful work to build archives of new media writing (from magic lantern slides—which once upon a time, of course, told highly immersive phantasmagoric stories—to HyperCard works and Flash-based texts). In addition to this scholarly interest, what about the resurgence in pop culture of “antiquated,” outdated, even obsolete aesthetics? It’s no coincidence that I picked up letterpress printing in graduate school while studying electronic literature, or that my students are fascinated when I bring a typewriter into the classroom, or that we are so inundated by nostalgic-looking image filters that we need a #nofilter hashtag to assure us that what we are seeing accurately reflects “reality.” Perhaps the electronic literature projects being made today, even those that seem glossy, interactive, and lovely in the best ways (like Aaron Koblin’s interactive music videos and mass collaborative artworks created for Google) will indeed look wonky and wily and willful to future readers (perhaps they will be utterly inaccessible), but it is also possible that, like the resurgence of interest in glitch and animated GIFs, their very stylistic issues will make us treasure them more.

The Future of ________? A Cautionary Tale


What is the future of publishing? How will people read in the future? How will people find new books to read in the future? How will books be produced in the future, and who will produce them? How will books be written and edited in the future? How will the concept of the book evolve in the future? What will the economics of authorship be in the future? In what new ways will authors engage with their readers?

These are all wonderful and engaging questions. But before we begin searching for examples of the future happening today, let’s start with a cautionary tale. The future is a tricky thing….

I Upset a Room Full of Journalists

It was my first time in Oslo, Norway, and I was super excited. My father’s family actually comes from the city. I emailed Dad before I left and asked him to send me a list of the ancestors who had lived there. I arrived on a cold clear day and took a walk around to get my bearings.

I had come to Oslo to talk about the future of entertainment and computing to a group of journalists, business leaders and students. I arrived a day early to prepare for the talk and take in a few sights. The harbor and downtown were lovely. The sun was out and the Norwegians were not shy about soaking up every little bit of it. They lay around like well-dressed seals on the steps and piers of the manicured harbor, sunning themselves and chatting. But the most exciting part of my trip was the cemetery.

Vår Frelsers graveyard is set in the middle of the city. Edvard Munch and Henrik Ibsen are both buried among its rolling hills. I crept around the gravestones with my father’s list of names in my hand, searching for ancestors. It was a bit haphazard. I didn’t really do my prep work, but it was exciting to wind my way through the lovely grounds looking for familiar names. I found a few Johnsons and a few Johansens, but no exact matches.

The next day I rose bright-eyed and ready to meet the Norwegians. On stage I started by reading out the names of the ancestors from my father’s list, asking anyone who knew them to raise their hand. That slayed them. They loved it and laughed the entire time, but no luck – nobody raised their hand.

During the question and answer session, a tall, thin journalist with blond hair asked me, “What happens when the machines get too smart? Do you see a future where we humans might be at risk?”

I smiled and replied, “I’m an optimist. You see….”

“You’re an optimist?” the reporter stopped me. He seemed shocked and began to write furiously.

“Yes,” I said. “The future isn’t an accident. I believe the future is made every day by the actions of people. And if that’s true, then why would we build a future that is bad? How about we build a future that is awesome?”

“But how can you be a futurist and an optimist?” another reporter asked. I had clearly hit a nerve.

“I’m an optimist because I choose to be an optimist,” I answered. “I believe you have to make a decision about your point of view, and I made the decision to be an optimist and to try to build the best future possible.” This turned out to be the most radical statement I’ve ever made as a futurist.

“But what about the rapid advance of technology?” the first journalist asked. “Don’t you think that things are moving so quickly that we can’t possibly keep control of the machines?”

“I don’t think technology is moving that fast,” I explained. “I live my life 10 to 15 years in the future. From that perspective, that rapid progression isn’t so drastic. The dirty little secret about the future is that it’s going to look a lot like today.” The place instantly became a madhouse.

“How can you say that the future is going to look a lot like today?” A third journalist stood up, recorder in hand. “You are a futurist. Do you really mean to say that the future will look like today?”

“That’s exactly what I mean. The look of the future doesn’t change all that much,” I started.

“But…” the third journalist tried to break in.

“The world around us doesn’t change that fast,” I kept going. I knew that I had a perfect example to make my point.

“Look: we are here in your lovely city. There are buildings in this city that are older than my entire country. Of course the future will look like today. And the reason is that people don’t want it to change that fast. If you woke up tomorrow and your entire world had been transformed into a science fiction future, you’d be living a nightmare.”

The room erupted into laughter. Two of the journalists sat down with smiles on their faces. The third still looked a little upset.

The Hardest Thing about Being a Futurist

The hardest thing about being a futurist and doing the serious work of futurecasting is something called metacognition. This is simply thinking about thinking. It’s what many people think makes us individuals, and what makes us human. But the hardest thing about my job is thinking about thinking about the future.

As we begin to think about the future of books, publishing, narrative and how we act and interact with each other, let’s be careful not to Jetson-ize our visions for the future too much. Let’s make sure to embrace the inexorable complexity of people and cultures. Can we hold two different futures in our heads – even if those visions are diametrically opposed to one another? Can we explore the extremes of technological progress while maintaining a rich historical perspective? If we can, then we’ll be able to map to the middle and explore the beauty and the contradictions of the future we will find ourselves inhabiting.

Here’s my caution: the future is going to look a lot like today. Our challenge is to be courageous enough to populate that future with amazing new experiences and stories that none of us could have imagined before today.