Mar 30, 2013

the eye of the (be)holder

still from George Méliès’s Le Voyage dans la lune

This week at OSLab we're thinking about what happens when we realize we're being gazed at (is there a loss of autonomy, a surrender to the other? are we only imagining the extent to which we are seen?) and the impact we have when we gaze (is it a normative gaze? do we construct a person as we view them?). We're jostling things about over here, trying to turn our lenses inward on...our lenses. You know the drill. We decided we can't say it any more clearly than this cheeky cephalopod did when it stole a diver's camera and became the one holding the lens. (You'll want to watch full screen to see the subtitles, and odds are you'll be humming I Love You...Octopus when you're through.)



Mar 24, 2013

Unreliable Narrator

If a parasite sucks the blood out of your tongue until it atrophies, and then becomes your tongue, 
who, then, is doing the talking?

Cymothoa exigua, the tongue-eating louse 


Historically, unreliable narrators have been somewhat less gruesome - more classic and captivating - Holden Caulfield, Humbert Humbert, Huckleberry Finn (...though there is Alex in A Clockwork Orange). The term was coined by Wayne C. Booth in his 1961 book The Rhetoric of Fiction, and its definition includes unreliability due to ignorance, bias, psychological instability, self-deception, lack of worldliness, or deliberate deceit. It is a popular trope in post-modern literature and film, and because of its hipness, reactions range from "it's way over-used and the effect is over-produced" to "it's a true representation of the range of selves within every human."

The idea is intriguing - knowingly spending time with someone who is lying to you, allowing them to tell you their story. The one-sidedness of reading a book or watching a movie provides the right dynamic for this. You, on a comfortable couch, the perfect receptacle. The unreliable narrator, cozily embedded in their medium, the scoundrel you can safely love.

You can bound across a field of flowers into the situation because, dear reader, the unreliable narrator is reliably unreliable. Suspend disbelief, celebrate your fraudulent counterpart, created by an author just for you. It's our favorite kind of OSLab opening. There's an agreed upon space of tension, and maybe a nervous sort of co-creativity. Whatever is happening is real, and not real.

It's all in good fun, so there's nothing to fear.


"I heard all things in the heaven and in the earth. I heard many things in hell. How, then, am I mad? Hearken! and observe how healthily - how calmly I can tell you the whole story." -Edgar Allan Poe, The Tell-Tale Heart


Mar 18, 2013

Proud, Thrilled, Happy

Baby Boomers reminisce about growing up during a time of wonder and exuberant national striving, when all Americans were united around the awesome goal of sending humans into space. But a 1969 opinion poll showed that only 53 percent of American adults wholly supported the exorbitant moon-trip spending. Where, then, did this story the Boomers tell come from? First, it appears in the words they use to describe their experiences as eager, wide-eyed children during the beginnings of space exploration (by 1958, ray guns had replaced six-shooters, and 50 percent of the $1.3 billion U.S. toy market was sci-fi-related).

And second, it materializes in the words of the space enthusiasts, whose records thrive on a tenor of awe-inspiring exploration. "It [the rocket] will free man from his remaining chains, the chains of gravity which still tie him to this planet. It will open to him the gates of heaven." -Wernher von Braun



The English language contains approximately 500,000 words. The average person uses 2,000 of them, and calls upon only 200-300 as regular standbys. (For reference, William Shakespeare used 24,000 words, 5,000 of which he used only once.) The words we pay attention to as a culture are shifting constantly. In 1996 Merriam-Webster's dictionary went online. "And we noticed that we learned a lot about the English language," said Peter Sokolowski, editor-at-large at Merriam-Webster, "about what words are looked up every single day in kind of a perennial way...after 9/11 (there was) an enormous amount of attention paid to the coverage and any news coming out of that story. And so we had this kind of concrete word of disaster, like rubble and triage in the first couple of days. And then words of explanation or of politics, like terrorism and jingoism. And then later, you know, four, five days later, we found that people were turning to the dictionary for what I would call philosophy, words like surreal and succumb..."

As we share verbal reactions, moving from the concrete to the explanatory to the philosophical, we are building a personalized representation of our world. The words we choose reflect our leanings, our deepest drives, and then, with them, we re-invent. The next generation of space explorers is starting its journey as young thinkers, science-lovers, experimenters. Many are already considering their small selves in relation to outer space. Is there a different, or even brand-new, vocabulary we should be adopting to ensure they journey upward and outward with respect, awe, and imagination? Should we take poetic license and urge them to pepper their search terms with metaphor? Stick to the scientific? A stunning combination of both?

Here in the OSLab we think about how to ignite new ways of speaking...thinking...acting.  There is much to consider, and room to create, in the interstices. 




Mar 10, 2013

Nietzsche's typewriter

Friedrich Nietzsche's eyesight was failing. The effort of focusing on the page resulted in excruciating headaches, and he feared he would have to give up writing altogether. He determined he would purchase a typewriter, learn to use the tips of his fingers to press the thoughts onto paper, eyes closed. He ordered the latest gizmo - portable yet imposing, colored ribbon at the ready for dramatic effect. In the mail arrived his new Malling-Hansen Writing Ball.
The ball did help somewhat, but a close friend soon noticed an odd side effect: Nietzsche's prose had changed, "from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style."
"You are right," Nietzsche replied, "our writing equipment takes part in the forming of our thoughts."

"We are not only what we read," claims Maryanne Wolf, professor and director of the Center for Reading and Language Research at Tufts University and author of Proust and the Squid: The Story and Science of the Reading Brain. "We are how we read." In fact, our brains learned to read only a few thousand years ago, and are configured differently now than when we started. The reading of cuneiform script shaped the brain one way, and then at some point soon thereafter (evolutionarily speaking) it was called upon to contend with the alphabet, written cues, a litany of words. Not the same brain at all, the glutinous organ repeatedly celebrated its plasticity by adapting. As writing emerged, our need for memory decreased, and other skills took center stage. Attention, and linguistic practices like contextualizing information, were increasingly engaged. Reading English and reading Chinese utilize different brain systems, so even discerning between languages required somewhat different brains.

There are critical implications here. As the next wave of technology takes hold and our brains are hailed to skim, scan, and multi-task, scholarly articles are aflutter with commentary about which skills might be lost. Intellectual descendants of Marshall McLuhan believe "the medium is the message": a medium's effect on us is the result not only of its content, but of how it habituates us to perceive. When new media technologies are introduced into society, the hierarchy of our senses is reordered. Some are elevated, some filed away. For example, the arrival of the printed word dulled the need for the auditory and celebrated the visual, helping to ferry a more focused, individual, linear society toward Western capitalism. Critics scoff at this "technological determinism"; they'll concede that technological advances may change the way we live, but do not agree that they necessarily become part of our biology or are deeply embedded in social values.

Is it possible to revel in the internet age, become global, habitually connected citizens, but still take a moment, mid-book, to let our minds wander beyond the page as word triggers memory, moves to idea, falls back into the imagination? It would behoove us to foster this opulent, associative dimension to reading. According to Wolf, it is vastly important that we preserve "the profound generativity of the reading brain." 

Here on this page dawns a new day, a brain emergent, a new flurry of posts. And now, if we can muster them, a few reflections all our own.



Mar 4, 2013

Delightable, game-able you

Employees at the World Bank are spending hours of work time playing a new video game called Evoke. Set in the year 2020, the game's story follows the activities of a mysterious network of Africa's best problem-solvers. Players are challenged to determine the inner workings of the Evoke world, form their own innovation networks, and develop radical new solutions to real-world development challenges. Content is relayed to players via an online graphic novel written by Emmy-nominated producer Kiyash Monsef and drawn by Stan Lee motion comic guru Jacob Glaser. This is the game:


And this is the woman the World Bank's top executives hired to create it - futurist, game designer, and radical thinker Jane McGonigal.

McGonigal is one of those people who makes bearing life's usual stress load seem resoundingly ordinary. She suffered a traumatic brain injury. She was spiraling downward, in tremendous pain, and her brain essentially began telling her it was time to die. She crawled her way back to recovery by doing something extraordinary (of course) but simple: she made a game of it.  

She is, at 35, one of the "top 35 innovators changing the world through technology" (MIT Technology Review), the "Top Ten Innovators to Watch" (BusinessWeek), the "20 Most Inspiring Women in the World" (Oprah Magazine), and a "Young Global Leader" (World Economic Forum). Clearly, she is hitting all targets - sociology, technology, economics, and the heartstrings of the populace.

In the introduction to McGonigal's new book, Reality Is Broken: Why Games Make Us Better and How They Can Change the World, Edward Castronova writes, "You can't pull millions of hours out of a society without creating an atmospheric event." McGonigal's goal is to do just that by inserting gaming into museums, education programs, hospitals...every place that might need a shift in thinking about how to be more productive, environmentally sustainable, civic-minded, globally conscious, happy. McGonigal doesn't claim games can solve all problems, and clearly states that when playing, game face doesn't cut it - you need to be invested and want a positive outcome if there is to be any chance of one.

First, she claims, we need to let go of lingering cultural biases about "lone-gaming." Up to 65% of gaming is now social, and research shows that gamers reach out for help more often, and are even more creative as a result of playing. They willingly tackle obstacles we normally try to avoid - like not being given any instructions, or not knowing what obstacles are going to emerge - by collaborating and innovating, and they demonstrate increased computational thinking (problem solving, system designing, understanding of human behavior). Games provide a safe learning environment in which failing is an option, and creativity follows.

Everything in moderation, of course: McGonigal suggests 20 game-playing hours a week for each of us, and a mere 21 billion hours a week for the (technologically connected) world at large. It may be only one piece of the puzzle, but given that the global number of hours played per year is now over 160 billion, it seems worth acknowledging, and at least dipping into. It's collaborative innovation we're after in the OSLab, and the statistics are behind play as a major contributor. Hive mind of the best sort.