The future, death, and recoding.

It’s hard to get rid of an Oxford comma. Books, even more so. Vilém Flusser writes in his harbinger-of-doom for writing:

We are book-worms, being opposed to automated apparatuses and green forests, not out of bibliophilia–which today registers as necrophilia–but out of an engagement with historical freedom. This, our “worm-like feeling,” this sense of nourishing ourselves on corpses (books), explains our horror of dispensing with books.

Does Writing Have a Future? p. 102.

Flusser suggests that books provide freedoms that forests (which he also ties, in the progress of communication, to primitiveness, images, and the magical) do not provide. Nor can writing engage the coming of the digital, which is somehow both images and something completely new. Whereas text could criticize images, it cannot engage with the digital because:

….Digital codes synthesize things that have already been fully criticized, fully calculated….The old criticism, this dismantling of solid things, would be lost in the gaps between intervals, in nothingness….A completely different critical method is required, one that is only approximately named “system analysis.”  For this, alphabetic thinking is completely useless.

Does Writing Have a Future? p. 152.

As a result:

But isn’t the feeling of a knife’s edge exactly what is responsible for what we call freedom?….To those of us who spell things out, the current transition from the alphabet to the new looks like a dangerous step on a ridge between abysses. It may seem like a pleasant stroll to our grandchildren, but we are not our grandchildren, who will learn the new with ease in kindergarten.

Does Writing Have a Future? p. 155.

This was written in 1987, and while the blurb on the back of the translation (U. of Minnesota Press, 2011) would have us believe that Flusser makes a convincing argument that “the art of writing will not so much disappear but rather evolve into new kinds of thought and expression” (back cover), it’s clear that Flusser thinks anything but that. He likens the transition less to dialectical models and more to Kuhnian paradigm shifts (150). The alphabetic will be recoded into an undefined “new” that will obliterate it. Whoever blurbed the book thinks this is a good thing, while Flusser sees it as an apocalyptic inevitability on par with “the singularity.” Such is the solemn deathcult of technologists.

Running through his argument is an undercurrent of coming to grips with great fears, and Flusser’s prognostication is not our own. What is missing from Does Writing Have a Future? is people. Between the forests, the alphabet, and the digital, Flusser writes out all of humanity. Perhaps he’s done with us, but we’re not done yet. We can choose and inhabit any of those worlds (trees, alphabet, the “new”), and often all three at the same time. There’s no freedom in being made to walk on Flusser’s knife-edge. It’s an image that rings out from W. Somerset Maugham and a Bill Murray movie: walking the razor’s edge is treacherous and hurtful, and at the end, it might make everything OK. That’s not freedom. Just because Flusser felt he had no choice doesn’t mean we have to do the same.

Supposed lack of creativity and not seeing what they see.

The confessions of a burnt-out music writer cite over-specialization, in life and in the profession, as a major weakness for the author: what was once a pleasure became a tool to isolate one’s self, and the general freakiness of staring down a less-than-rosy future gave him weak knees. While all kinds of writers hail the coming of the post-human, kids these days are still trying to be human. When some people are trying to explore their world through education, schools are telling them they have to specialize and focus, pick a major that will get them a job and help them get out of crushing debt. Kyung Hee Kim, a creativity researcher at William and Mary who uses standardized tests to measure creative thinking (which seems dubious), argues that in an era of over-testing, creativity gets suppressed. In the midst of all this:

“The compelling, unnerving issue is that the student has nothing to say,” said Howard of the piece that drew so heavily on WebMD. “How could she, since she’s writing a research document from reference materials?” -Skimming the Surface, Dan Barrett

That was from an Inside Higher Ed article on how students cite these days. The paper in question relied on the same WebMD article for 9 of its 17 citations. Some other telling quotes:

That so few citations were classified as summaries — 164 out of the 1,832 — also indicates that many students are alighting on several different sources without spending much time reading them, then cobbling them together into what Howard called “an incomprehensible pastiche.”

“We think we have students working for efficiency and doing efficient writing.”

An unprovable thesis: the call to teach students to find better sources amongst the piles of handy information both inside and outside of the library is a direct result of the decline in the editorial role of the librarian. Information wants to be free, and we keep piling it on. So rather than spend a bunch of time doing a lot of boring work and reading, why not blaze through it and get on to things that actually interest you? Enter the lament of the music writer:

Ever since middle school, I’ve taken peers to task when they claimed to “listen to everything,” because they almost always meant Top 40 with maybe one deep-cut album they heard about from a parent, babysitter, or older sibling…..Now imagine that persnicket growing up, and getting empowered by music-nerd culture’s online blossoming. Then stoop to imagine him at 35, holding a pillow over his face as he regrets not being able to consummate with, much less commit to, a Maggie-Gyllenhaal-esque sweetheart because she had that one Jim Morrison poster up in her room, despite Ian Curtis and Glenn Danzig’s vouches for Morrison, etc.

This is an illustrative tale. The forces of the educated marshaled their strength behind the idea that more information is good. As a result, creative students who would rather do something else besides what we think they ought to be doing with themselves use the cut-and-paste culture they have been armed with to meet those demands “efficiently,” only to be called “uncreative.” We of earlier, more “creative, deeper-thinking generations” get older and judge them for bending and breaking the rules of the games we impose, growing bitter at lost opportunities and their general ability to have a good time despite the things with which we task them. Not every student will find her voice in the system given. In this model of education, voices are given to the experts of the system who judge.

In art, montage and collage are time-tested techniques that can challenge us in ways other techniques can’t (though they’re not an end-all-be-all). The power of sampling and turntablism in hip-hop is just one example of using tools outside of their original purpose to say something new. It’s deadly efficient, too: it makes it easier to use others’ work as a basis for your own when the materials to create are out of reach or unusable. It also allows you to capture the essence of a thing. You can get the idea from summarizing another writer, but you’re not honestly capturing their voice. So when Howard and Jamieson lament that students are using more direct quotes than they are summarizing, it’s worth wondering if they miss this point.

In the study that Barrett writes about, it’s worth noting that these are first-year students, not yet inculcated with “proper” study habits. So they’re doing the best they can. I’ve worked in public libraries on and off for a while. Some of the collections can support high-school research; others cannot. Public and school libraries are under constant budget scrutiny when board members and taxpayers ask, “Isn’t everything on the internet?” And we as librarians keep piling it on, too. The rules of the game given to students have changed, and they are responding creatively to those pressures. It’s not a lack of creativity; it’s a way of thinking we don’t understand.

Gamification and Bad Marketing

Art by Skinny Coder

Gamification is the application of game elements in non-gaming situations, often to motivate or influence behavior…. Gamification offers instructors numerous creative opportunities to enliven their instruction with contests, leader boards, or badges that give students opportunities for recognition and a positive attitude toward their work. -Educause, “7 Things You Should Know About Gamification” (Emphasis in original)

As a librarian and advocate for information literacy, I’m wary of gamification, especially in higher education. I have two big issues with it. First, gamification is primarily a marketing term, developed in the wake of apps like Foursquare that businesses use to promote themselves to customers by offering higher reward levels for interaction with the app and the business. Essentially, it is an interactive third-party advertising platform. By promoting this marketing style in higher education, we promote the perpetuation of those behaviors in our students, as opposed to finding ways to engage them that also allow them to challenge market-driven behavior, or at least be free from it. Broadly conceived, gamification does little to promote critical engagement with information technology or the larger, market-driven economy, and has much greater potential to do the opposite.

The second issue is this: the application of superficial gaming elements to education gives the impression that education is not strong enough to hold interest on its own. We’ve all had bad days in a classroom, but using what any student can see is a blatant bid for their attention sends the wrong message. By fixating on the superficial elements of gaming (points, badges, leaderboards), gamification overlooks the parts of games that make them truly wonderful: characters, narratives, and the balance of challenging yet rewarding gameplay. Education can be rewarding in its own right when we find ways to actively engage students in the subject matter itself, not in a system of superficial rewards:

Game developers and players have critiqued gamification on the grounds that it gets games wrong, mistaking incidental properties like points and levels for primary features like interactions with behavioral complexity. That may be true, but truth doesn’t matter for bullshitters. Indeed, the very point of gamification is to make the sale as easy as possible. – Ian Bogost, “Gamification is Bullshit”


Ian Bogost, who develops games and teaches at Georgia Tech, delivered those lines to a symposium held by the Wharton School of Business on gamification. His understanding of bullshit is derived from Harry Frankfurt’s On Bullshit, whereby bullshit is defined as something used to conceal, impress, or coerce, with no interest in truth or untruth. I believe that gamification is a form of low-level coercion designed to impress students with a superficial knowledge of gaming; as a result, it divorces real understanding of both the subject at hand and games by mixing the two in the name of an easily repeatable model of active engagement.

If we are going to use marketing methods to engage students, it is in our interest to aim higher than merely following the trends. Truly successful marketing is driven by services or products that are either a) attached to someone’s identity or b) something people cannot live without and cannot procure themselves. As the art of using words, images, and sound to connect people to those services and products, marketing rightfully deserves serious consideration. It is in our interest to do so in a truthful and open manner; otherwise, we run the risk of just giving our patrons and students something that does not hold truth in any regard: bullshit.

The confluence of marginalia (how this post came about).

Sarah Werner makes some excellent points as to how limited the imagination is for digitizing objects (why focus only on text?) along with the ideas that most digitization conflates copy and edition, and it allows others to make important decisions for you.

I often look at scanning machines and wonder how much they cost. They seem to cost a lot. Good digital cameras have come down in price, and the know-how of taking good pictures is not outside the reach of most people. Still, as Werner points out, you lose the bespoke elements in digital work. Harder, better, faster, stronger is the motto, which runs counter to the whole view of special collections and archives as places that hold unique items. Considering the difficulty and expense of traditional models of digitization, which require special software, hardware, expertise, and training, how many countless hours and expensive plane tickets could be saved with a Nikon? It’s high time that digitization becomes not just a collections tool, but a reference tool as well.

Why we’re unreasonable.

Psychologists have shown that people have a very, very strong, robust confirmation bias. What this means is that when they have an idea, and they start to reason about that idea, they are going to mostly find arguments for their own idea….

The idea here is that the confirmation bias is not a flaw of reasoning, it’s actually a feature. It is something that is built into reasoning; not because reasoning is flawed or because people are stupid, but because actually people are very good at reasoning — but they’re very good at reasoning for arguing.

Houston, we still have a problem.

This is an extremely thorny coupling of quotes, but I’m hoping that in the tangle, we can start to think critically about how we address and adopt new technologies in information literacy instruction. These are real and ongoing issues that have been around for 20+ years and still are not even close to being out in the open in the IL literature. Searching LISTA with “information literacy” and either “race,” “ethnicity,” or “digital divide,” the most you’ll get is 49 results. Instead, the 826 articles that come up with the “digital divide” search come from information and communication science and research.

There are connections between information literacy and information technology that have real impacts that are wholly unexamined. Despite our best intentions, we’re all a part of it.

1986

“As a first step, we might do well to dispel the notion that information literacy is something that comes neatly packaged with information technology. Not only are there practical limits to the diffusion of technology, there are still greater limitations on the ability of machines, in and of themselves, to inform and instruct.”-William Demo, The Idea of “Information Literacy” in the Age of High-Tech, p.20.

2006

“Toward the end of making the Digital Divide a central issue in curricula and pushing professional organizations to take public stands on technology policy issues, Selfe and Moran call on teachers to find ways to use any and all tools available in the project to expand access now. They suggest that given the expense of cutting edge technologies and the fact that there is always some new cutting edge software package or hardware tool being sold as the next great answer, the job of promoting digital literacies and writing abilities might often be best accomplished with lower end tools.” -Adam J. Banks, Race, Rhetoric and Technology: Searching for Higher Ground, p.19.