I studied American literature in Poland and published my master’s thesis on cyberpunk and postcyberpunk for free under a Creative Commons BY-SA license. It is available online and covers the writers William Gibson (‘Neuromancer’) and Neal Stephenson (‘Snow Crash’, ‘The Diamond Age’) and the theme of innocence in cyberpunk fiction. This theme will be familiar to Boing Boing readers, as it appeared in the works of Mark Dery and John Barlow, among others. The thesis explores topics such as American individualism, escapism, religion and Rapture, ‘the rapture of the nerds’, and AIs. One chapter also covers cyberpunk in general.
This is a detail of page 72 of the July 1982 issue of the magazine ↑Omni. Depicted is the beginning of ↑William Gibson’s short story ‘↑Burning Chrome.’ It is a bit of linguistic history, because here the word ‘cyberspace’ saw print for the very first time.
Fittingly enough, in the same issue, right after the first part of Gibson’s short story, there is an article (Manna 1982) on ‘↑Tron’ (Lisberger 1982) featuring double-page stills, illustrating the subheading ‘A science-fiction film leaps inside a bizarre computer world’:
This picture spreads over pages 82 and 83 of Omni July 1982, showing off computer-generated imagery from ‘Tron.’ The caption reads: ‘Sark’s carrier is blasted back into its wire-frame skeleton’. It almost seems like the screen captures from ‘Tron’ serve as illustrations for ‘Burning Chrome.’
I am able to show you all this because ↑The [Glorious] Internet Archive now carries a freely downloadable ↓complete collection of Omni magazine:
OMNI was a science and science fiction magazine published in the US and the UK. It contained articles on science fact and short works of science fiction. The first issue was published in October 1978, the last in Winter 1995, with an internet version lasting until 1998. [...]
In its early run, OMNI published a number of stories that have become genre classics, such as Orson Scott Card’s “Unaccompanied Sonata”, William Gibson’s “Burning Chrome” and “Johnny Mnemonic”, Harlan Ellison’s novella “Mefisto in Onyx”, and George R. R. Martin’s “Sandkings”. The magazine also published original sf/f by William S. Burroughs, Joyce Carol Oates, Jonathan Carroll, T. Coraghessan Boyle, and other mainstream writers. The magazine excerpted Stephen King’s novel Firestarter, and featured a short story, “The End of the Whole Mess”. OMNI also brought the works of numerous painters to the attention of a large audience, such as H. R. Giger, De Es Schwertberger and Rallé.
To my mind Omni (1978-1995), together with ↑Heavy Metal (1977-), was one of the most important carriers and amplifiers of the ‘cyberpunk discourse’ (↑Wired did not see the light of day until January 1993).
Digital ethnography can be understood as a method for representing real-life cultures through storytelling in digital media. Enabling audiences to go beyond absorbing facts, computer-based storytelling allows for immersion in the experience of another culture. A guide for anyone in the social sciences who seeks to enrich ethnographic techniques, ↑Digital Ethnography offers a groundbreaking approach that utilizes interactive components to simulate cultural narratives.
Integrating insights from cultural anthropology, folklore, digital humanities, and digital heritage studies, this work brims with case studies that provide in-depth discussions of applied projects. Web links to multimedia examples are included as well, including projects, design documents, and other relevant materials related to the planning and execution of digital ethnography projects. In addition, new media tools such as database development and XML coding are explored and explained, bridging the literature on cyber-ethnography with inspiring examples such as blending cultural heritage with computer games.
One of the few books in its field to address the digital divide among researchers, Digital Ethnography guides readers through the extraordinary potential for enrichment offered by technological resources, far from restricting research to quantitative methods usually associated with technology. The authors powerfully remind us that the study of culture is as much about affective traits of feeling and sensing as it is about cognition—an approach facilitated (not hindered) by the digital age.
UNDERBERG, NATALIE M. AND ELAYNE ZORN. 2013. Digital ethnography: Anthropology, narrative, and new media. Austin: University of Texas Press.
His academic “exile,” as he calls it [Graeber meanwhile is a professor at the ↑LSE], has not gone unnoticed. “It is possible to view the fact that Graeber has not secured a permanent academic position in the United States after his controversial departure from Yale University as evidence of U.S. anthropology’s intolerance of political outspokenness,” writes Jeff Maskovsky, an associate professor of anthropology at the Graduate Center of the City University of New York, in the March issue of American Anthropologist.
That charge might seem paradoxical, given anthropology’s reputation as a leftist redoubt, but some of Mr. Graeber’s champions see that leftism as shallower than it might first appear. Anthropology “is radical in the abstract,” says ↑Laura Nader, a professor in the field at the University of California at Berkeley. “You can quote Foucault and Gramsci, but if you tell it like it is,” it’s a different story, she says. [...]
Responding to anthropologists’ frequent claim that they embrace activist scholarship, he [Graeber] echoes Ms. Nader: “They don’t mean it”—at least when it comes to truly radical activism.
“If I were to generalize,” Mr. Graeber says, “I would say that what we see is a university system which mitigates against creativity and any form of daring. It’s incredibly conformist and it represents itself as the opposite, and I think this kind of conformism is a result of the bureaucratization of the university.” [...]
But she [Laura Nader] finds it deplorable that scholars would value superficial clubbability over originality of thought; she decries the “‘harmony ideology’ that has hit the academy.” She also thinks the fact that he “writes in English,” eschewing jargon, hasn’t helped him. [...]
Mr. Graeber, who says he gets along just fine with his colleagues in London—and, indeed, with most of his former colleagues at Yale—has his own take on what scholars mean by “collegiality”: “What collegiality means in practice is: ‘He knows how to operate appropriately within an extremely hierarchical environment.’ You never see anyone accused of lack of collegiality for abusing their inferiors. It means ‘not playing the game in what we say is the proper way.’”
Outspoken anthropologist ↑Sarah Kendzior [<-- have a look at her blog!] has published an opinion piece at Al Jazeera, called ‘↑Academia’s indentured servants.’ Very worthwhile—here are some snippets from the beginning and the end:
On April 8, 2013, the New York Times reported that 76 percent of American university faculty are adjunct professors – an all-time high. Unlike tenured faculty, whose annual salaries can top $160,000, adjunct professors make an average of $2,700 per course and receive no health care or other benefits. [...]
Last week, a corporation proudly announced that it had created a digital textbook that monitors whether students had done the reading. This followed the announcement of the software that grades essays, which followed months of hype over MOOCs – massive online open courses – replacing classroom interaction. Professors who can gauge student engagement through class discussion are unneeded. Professors who can offer thoughtful feedback on student writing are unneeded. Professors who interact with students, who care about students, are unneeded.
We should not be surprised that it has come to this when 76 percent of faculty are treated as dispensable automatons. The contempt for adjuncts reflects a general contempt for learning. The promotion of information has replaced the pursuit of knowledge. But it is not enough to have information – we need insight and understanding, and above all, we need people who can communicate it to others.
Many designers enjoy the interfaces seen in science fiction films and television shows. Freed from the rigorous constraints of designing for real users, sci-fi production designers develop blue-sky interfaces that are inspiring, humorous, and even instructive. By carefully studying these “outsider” user interfaces, designers can derive lessons that make their real-world designs more cutting edge and successful.
Make It So shows:
Sci-fi interfaces have been there (almost) from the beginning
Sci-fi creates a shared design language that sets audience expectations
If an interface works for an audience, there’s something there that will work for users
Bad sci-fi interfaces can sometimes be the most inspiring
There are ten “meta-lessons” spread across hundreds of examples
You can use—and not just enjoy—sci-fi in your design work
Over 150 lessons and 10 “meta” lessons that developers can use to enhance their real-world interfaces
There is a ↑companion blog to the book, carrying additional visual material, interviews, and more. All this is a fine example of the to and fro between fictional and non-fictional technology: mutual influences, each creating and reproducing the other.
CAMERON, JAMES FRANCIS. 1984. The terminator [motion picture]. Los Angeles: Orion Pictures.
SCOTT, RIDLEY. 2012. Prometheus [motion picture]. Century City: 20th Century Fox.
SHEDROFF, NATHAN AND CHRISTOPHER NOESSEL. 2012. Make it so: Interaction design lessons from science fiction. New York: Rosenfeld Media.
The above is the most clear-cut explanation of ‘how a ↑differential gear works’ I’ve ever seen. The video below is complementary—↑Richard Feynman explains why a train car stays on its tracks when ‘going around the corner,’ although no differential gears are involved:
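For readers who want the arithmetic behind Feynman’s point: train wheels need no differential because their treads are slightly conical. On a curve the wheelset shifts outwards, so the outer wheel rolls on a larger radius and the inner wheel on a smaller one, letting both cover their different path lengths without slipping. Here is a minimal sketch of that geometry in Python; the numbers are illustrative assumptions of mine, not taken from the videos:

```python
def equilibrium_shift(r0, gauge, taper, curve_radius):
    """Lateral shift (in metres) at which a conical wheelset rolls
    around a curve without slipping.

    The outer rail (radius R + g/2) is longer than the inner one
    (R - g/2). Shifting the wheelset outwards by y makes the outer
    wheel roll on radius r0 + taper*y and the inner wheel on
    r0 - taper*y. Equating the ratio of rolling radii to the ratio
    of rail radii and solving yields y = r0*g / (2*taper*R).
    """
    return r0 * gauge / (2 * taper * curve_radius)

# Illustrative figures: standard gauge 1.435 m, wheel radius 0.46 m,
# a 1:20 tread taper, and a 500 m curve.
y = equilibrium_shift(r0=0.46, gauge=1.435, taper=1 / 20, curve_radius=500)
print(round(y * 1000, 1))  # lateral shift in millimetres
```

A shift of roughly a centimetre is all it takes on a gentle curve, which is why the flanges normally never touch the rails; they are only a safety backstop, exactly as Feynman says.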
From ↑ITU statistics intac made some interesting infomap posters. The one above shows ↑Internet usage around the globe (click the picture for full size). The lighter a nation state is rendered, the smaller the percentage of its population using the Internet. As you can see, much of Africa drops out completely, rendering the continent as a skeleton. The poster below takes ↑a closer look at Africa and gives both percentages and total figures for Internet users and mobile subscribers.
It’s scheduled for release in late October this year—‘Batman: Arkham Origins’ (Warner Bros. Games Montreal 2013), the prequel to ‘Batman: Arkham Asylum’ and ‘Batman: Arkham City’ (Rocksteady Studios 2009, 2011). Here are the ↑details known so far, and here is a ↑full history of the ‘long line of Batman games stretching back more than 25 years.’
ROCKSTEADY STUDIOS. 2009. Batman: Arkham Asylum [computer game]. London, Burbank, New York: Eidos Interactive, Warner Bros. Interactive Entertainment, DC Entertainment.
ROCKSTEADY STUDIOS. 2011. Batman: Arkham City [computer game]. Burbank, New York: Warner Bros. Interactive Entertainment, DC Entertainment.
WARNER BROS. GAMES MONTREAL. 2013. Batman: Arkham Origins [computer game]. Burbank: Warner Bros. Interactive Entertainment.
The Guardian carries a fine and more than worthwhile ↑essay by ↑Russell Brand. Here’s a snippet:
When I was a kid, ↑Thatcher was the headmistress of our country. Her voice, a bellicose yawn, somehow both boring and boring—I could ignore the content but the intent drilled its way in. She became leader of the Conservatives the year I was born and prime minister when I was four. She remained in power till I was 15. I am, it’s safe to say, one of Thatcher’s children. How then do I feel on the day of this matriarchal mourning?