Friday, September 24, 2010

"The Wow Factor is Gone": Our Limitations

Problems:
The main problems with getting excited about Douglas Engelbart's visions and plans, for those of us reading today, are two-fold:

1) We are not NACA engineers.
2) We take for granted the technology that we've always had.

The combination of these factors leaves us underwhelmed by the descriptions and details of the process by which Engelbart performs what is essentially an amazing feat of combining vision with engineering and programming.

In short, we are not impressed because many of us have lived with this level of technology for 20+ years. Our students and younger colleagues have even greater difficulty imagining the amazement of being able to coordinate data in the ways suggested.

However, I do not want to dwell on why it is difficult to engage with Engelbart. I want to engage with his vision and ideas. As the former winner of the Delaware BASIC team programming challenge in 6th grade, I am fascinated by the level of detail and attention needed to get a machine to coordinate these different categories of information, work that led to the invention of the personal computer (even though Engelbart does not really intend to envision a personal, consumer computer).

Engelbart v. Engelbart
In reading the first Engelbart/vision essay and comparing that with the second (Engelbart/English proposal) and the video of his presentation, I'm really seeing two different ideas.

On one hand, Engelbart/English propose a work terminal system to analyze and support research into ways of analyzing and supporting research; but in the renowned live presentation, Engelbart introduces the potentialities of the technology for word processing and mapping, interestingly combined with the narrative of his wife calling and asking him to do the shopping. Advertising for handheld organizers in the late 1980s, for PDAs like the Newton and Palm Pilot in the mid-to-late 1990s, and for today's smartphones and tablets likewise confronts the consumer with the ability of the hand-held device to help with....gasp...shopping, maps, and relationships.

Palm ads here and here. (Interesting history of Palm Pilot here from the Computer History Museum)

The Case of Newton
Apple's Newton Intro Video here shows that a direct connection exists between Engelbart's vision and this kind of device. Watch the video and be amazed as the Newton users get excited about being able to get their thoughts down and manipulate them on the device in a variety of ways, including drawing/design/handwriting, etc.

But, there's an interesting addition about a minute in. Now, the Newton will do things for the user, not just be an augmentation. Don't know how to put text and images together well? Don't worry. Newton will do it for you. Newton will not only assist you. It knows enough about what you are doing to do it for you.

This sort of personification, extending the technology beyond being a mere extension of its user, is even clearer in this Newton TV ad. Newton is now not just a thing or an assistant, an adjunct. It is a separate being who is friendly and worldly and intelligent.

While this might appear to be a minor shift from helper to friend, made to achieve a marketing goal, it also reflects a vital rhetorical shift with real consequences. Engelbart envisions a device that works on a hierarchy of symbols that he and his team devise and program into the system.

In fact, he envisions that these sorts of machines will always come as the product of a team that analyzes needs and wants and develops a system that allows users to manipulate those symbols in meaningful ways to achieve their goals. In essence, the technology becomes a product of an individual's or small group's needs and wants.

However, as we see the evolution to the PDA world, we realize that devices have been produced on a large scale, and individuals must now be assimilated into the system of symbols developed by the engineers and programmers. In effect, the augmentation has taken over.

It is not just a tool to be used and shaped at will. Jobs, actions, wants, and needs must be shaped to fit into the use of the tool, as shown by the instructional videos on how to use a PDA.

And while the Newton perhaps tried to do too much and cost too much, the iPad and other augmentations of today seem to be facing less resistance. Built on an iPhone UI, the iPad seems "instinctive" and "responsive". No, it's more than that.

Apple wants it to be "magical"

And it is, but what is lost is the knowledge of how it works and the ability to shape the technology to the standards and hierarchies of OUR symbols. The process of learning about how we learn has been sublimated to make invisible what might be better made visible: the construction and sharing of symbolic systems.

Without an understanding of the logic behind the systems of symbols, students and users become consumers of symbols and leave their creation and manipulation to "magic."

The problem, from a cultural studies POV, is that these systems of meanings and symbols are not segregated from reality in an ether of pure entertainment. They are heavily ensconced in the networks of economics/class, culture, language, race, ethnicity, politics, gender, sexuality, and differences of all kinds. The order and preferences that are given to some symbols over others carry with them ideological values and meanings beyond the symbol and its meaning alone. Pure data connection is not possible.

Therefore, shouldn't we be aware of how, why, and when these sorts of decisions get made and by whom?






Friday, September 17, 2010

Bush-y, Bush-y, Bush!

The Good:
Bush's article clearly articulates that technology holds vast potential beyond the ability of engineers and scientists to develop ways to kill more people more quickly or more efficiently. In fact, it can help fellow researchers more readily share information about how past peoples have used technologies to kill more people more quickly and/or more efficiently.

All joking aside, Bush's attempt to look towards a future of connection and relative interaction is fantastic and exciting. What hipster would not want a memex? It's got a cross between a Mad Men and a steampunk aesthetic and does all the work of a mid-90s Palm.

Ok, I wasn't quite done with being sarcastic yet. I'll admit it. I love technology. I love being able to take pictures, like the one below, on the fly and not have to worry about getting them developed.




It is truly amazing to be able to follow my college acquaintance as he drove across America on a Craftsman lawnmower this summer.

The Bad:
One might immediately notice the potential for abuse (or at least not-so-positive use) of tiny cameras that people can take anywhere. A surveillance culture, every success or failure living on indefinitely, pornography, and cats, lots of cats, appear to be the products of the tiny cell-phone camera.

This remains the problem with almost any advance in technology. The technology always precedes the culture's ability to incorporate its possibilities in primarily positive ways, at least from the perspective of the status quo.

And to some degree, that is a good thing. It allows technology to even the field between the oppressor and the oppressed, à la Iranians' use of Twitter to subvert media blackouts and connect to the rest of the world. We love to hear this. It's exciting and hopeful. It turns our attention away from the fears that technologies bring with them and that technologies distribute even more quickly and constantly than before.

However, it is this technology that allows our attention to be diverted so quickly. Recently, I listened to an episode of Fresh Air in which Terry Gross interviewed Matt Richtel, a writer for the New York Times who has spent the past year or so investigating issues of technology, society, and the science of the brain ("Your Brain on Computers" articles: here and here). In short, he has found that the research is beginning to show that our brains cannot take the quantity and diversity of information being presented to them on a near-constant basis. The pleasure potential of a new e-mail coming in keeps us constantly checking the inbox like a rat receiving a random distribution of food from a slot. I find it fascinating that technology is beginning to have the equivalent effect of allowing, nay forcing, us to carry little slot machines with us (Yes, there is an app for that.)

I see it in both myself and my students. I check my e-mail right when I leave my office, and 20 minutes later, when I get home, I feel a strong urge to open up my laptop and check again. I know, intellectually, that nothing of significance has come in during the last 20 minutes. I know, emotionally, that I should sit on the floor and read or play with my son rather than reconnect to the screen, but the "pull" is powerful.

The Dangers:
What's amazing to me is that so many seem willing to plow headlong into more reliance on technology that increasingly proves to be dangerous or detrimental when used on a broad basis. I can't drive anywhere in this city without nearly getting plowed into by someone on their cell phone or texting. At Baylor's campus, I've repeatedly heard of students nearly being hit as they are walking, absorbed in their phones or iPods, and not noticing that they are going into traffic.

Would these people engage in dangerous or distracting behavior of one kind or another anyway? Sure, why not? But the point is not that technology presents the only danger of violence, distraction, or the incipient vanishing of whole levels of intelligence or thought, but rather that technology makes those problems easier to develop and harder to resist. Furthermore, the culture places a negative value on those places or people that choose not to use those technologies.

I think that E.M. Forster's "The Machine Stops" gives a highly astute and prophetic view of the dangers of these kinds of connections of knowledge and thinking to machines. It is not so much the problem of storage that Bush's plan helps to solve. The danger lies in losing the ability and inclination to train the mind and body to work together through diverse media to obtain information and synthesize it.

It's precisely because it IS so easy to "Google it" that the desire for and pursuit of knowledge as a process is relegated to only the tiniest of realms within society, pushed back into the Ivory Dungeon, only to be let out to service the cry that a populace with more "higher" education will fix fundamental problems in the economy.

Forster's story describes technologies and interactions not too far afield of Bush's, only from an oppositional perspective and thirty-six years earlier. Forster describes the use of machines to encode knowledge, making actual research unnecessary and making information accessible at the touch of a button. Forster's dystopia does not end well (do any?), and human beings die in droves in the dark as the machines run down, with no one left to understand their processes after generations of efficiency.

Final Thoughts:
Rather than end on a negative note of complete ruin, I want to propose a solution, or at least a potential solution. In recent years, STEM education has represented a significant push for American education at all levels of schooling. However, in reviewing much of the literature, I find that little is done to pair discussions of technology's abilities with the potential ethical issues of that technology and science. The focus has been on the question, "Can we do X?" not "Should we? How should we? What are the social/cultural costs of X?"

No, it's been left largely to specialists within fields, secondary/tertiary debates at conferences, or, more likely, those crazy "humanities" people who keep saying, "Umm...remember this other time that we did something like that? It didn't turn out well."

If we could institute STEM education that includes the implications of these actions and ideas, from the earliest levels, then we might have a population more prepared to adapt to new technologies in healthier ways, rather than getting bloated on the processed/technological "food" that Bush praises in his article.

A Step Closer to "Public"

Hey, happy, happy-ish news! I'm beginning to live the "public" part of being an intellectual a bit more than in conferences and classrooms (stupid article publishing being so hard and long). I was contacted a while ago by Anna David (I thought it was a joke/spam at first) to talk about my use of reality TV in the classroom, something that I've blogged about at times and been trying to publish here and there.

Well, the article is finally here. I think that it does a fairly good job of giving an overview of the approaches to reality TV including, but not limited to, its study as a genre in media studies. Since I've shifted my teaching and research toward using popular culture to get at other academic points, I've felt a bit of an outcast. This did a fairly good job of bringing these odd threads together. Maybe I'll propose that TWOP needs a "higher ed" column or the Chronicle needs a "Reality TV" column.

Thanks, Anna. I'm sorry that I thought you were porn spam initially.

Thursday, September 16, 2010

Forgive My Ignorance: Thoughts on Murray

Excuse my confusion. I was criticized frequently in graduate seminars for quibbling too much with the linguistics and for buying into the myth of needing to "tear down" or challenge a reading, and I want to avoid that pit.

However, in the course of reading Murray's introduction, I couldn't help but twitch at certain moments that rankled. Let me give an example.

...the term "new media" is a sign of our current confusion about where these efforts are leading and our breathlessness at the pace of change, particularly in the last two decades of the 20th century. How long will it take before we see the gift for what it is--a single medium of representation, the digital medium, formed by the braided interplay of technical invention and cultural expression at the end of the 20th Century?
She asks a question, and I assume that means an invitation to dialogue. How long will it take indeed?

A "Gift"?
Well, let's set aside the large presumptions that this is a "gift," implying both a singular entity and a "giver," along with the supposition of some measure of benevolence.

A "Braid"?
So, assuming that "new media" is "good", "singular", and a "gift", we still have the issue of whether it is formed by this "braided interplay of technical invention and cultural expression". Like the gift metaphor, the "braid" carries a purposefulness with it that I'm not convinced exists in new media today. It seems much more like when one tangles a mess of yarn and then has a few people work to untangle it. Some will take the time to methodically find and work from a loose end, carefully winding it as they move. Others will grab joints and try to loosen/organize these nexuses. Still others, mostly my son and cats, spend the time enjoying themselves. Finally, some might find ends, unravel a bit, knit what they want, and then charge for access.

Whatever the case, new media are not braided together. This is something, but it's not a braid.

A "We"?
Finally, the question assumes that "we" most/all have access to technologies of production and consumption. Of course, Murray could be referring to the "we" of the educated elite, but that appears to reinforce the "us" v. "them" mentality that has been appropriated by critics of higher education. The simple truth is that the widening class divides carry with them significant digital divides in terms of access and literacy. People without training in accessing and understanding these media are left behind a curve that rises term by term, as fundamental knowledge of computers, e-mail, Twitter, Blackboard, and other new media elements becomes required.

But that deals with "selection," which will be a major part of my interest in V. Bush's "As We May Think."

Friday, September 10, 2010

I'm Back!: New Media Faculty Seminar!

After a hiatus of a couple years (including blogging over at Relief for a while; please feel free to go and seek them out), I'm baaaaack!

This is due entirely to signing up for a Faculty Development Seminar on New Media that I'm very excited about. Although my field is not really in "New Media", I feel that I'm quite often pushed into this role within my department by virtue of being 10 to 20 years younger than most people in the department.

In addition, I'm beginning to see a lot of overlap between my research in the ethics in and of genre narrative and the ways that communication is changing and might change.

So, I'm going to be taking the questions of the course, "How do we use new tech?" and "Do we use it as it's intended?", and applying them to my thoughts on movies, popular novels, TV, and video games, particularly those connected to the moral quandary of the individual within a corrupt world with flawed social institutions (what I refer to as "noir").

In the next post, towards the beginning of next week, I want to provide some of my first impressions based on Janet H. Murray's "Inventing the Medium" intro to The New Media Reader.