Friday, November 19, 2010

I Will Survive...

First, I Was Afraid. I Was Petrified...

or something like that...

I must admit that I struggled with the Deschooling reading for today. Something deep and, seemingly, primal screamed out in slo-mo, "NOOOO!!!!" when the author appeared to belittle the hierarchical order and engagement with knowledge.

"Of course one needs to find and engage with knowledge in order. It's obvious." Or, additionally, that the "reorganization" of schools in a deschooled environment, a web of learning, would result in further divisions between the haves and have-nots.

But I Spent So Many Nights Thinking How You Did Me Wrong...

Then, however, as I encountered the author's examples of how resources were used, like the tape recorders and mechanical donkeys, I began to see how they mirrored my experiences in teaching. I teach about half of my day at a relatively underprivileged high school where money seems plentiful for football while computers remain locked in closets. I see textbooks that cost hundreds of dollars, and they cost that much only because they control the flow of information and are driven by the profit motive.

I was also angry about the article's assertion that it is the teachers who hold onto this structure. I would love nothing more than to let students be self-directed. In fact, I've structured the final paper in my freshman writing class to be inspired by a reading of their choice and tried to serve more as guide than as "TEACHER". I encourage them to ask questions that they are interested in and follow the directions that their research sends them. However, they have been so strictly trained against inquiry that this appears to cause the same anguish as ordering them to kill their pet rabbit.

Clearly something is NOT right.

I Grew Strong. I Learned How to Carry On.

The problem that I see in this debate, and the reading itself, is that educators and policy-makers seek answers rather than a dialectic. Just as Ms. Gaynor states in her totemic song, the strength comes from struggle, not the solution. It comes in the realization that the narrator has about what they should have done in the distant and recent past AND in their practical response to what they should do now.

The error of complete reconstructionists is that they can imagine the world, and the relationships in that world, in any way that they want. They have the luxury of a known fantasy. Gloria Gaynor, along with the vast majority of educators today, knows that the what-ifs are as self-interested as the no-goodnik attempting to return.

Perhaps, just maybe, the solution to the education/tech debate rests in the song, as she sings: not to focus just on
all the strength I had not to fall apart kept trying hard to mend the pieces of my broken heart and I spent oh so many nights just feeling sorry for myself

But rather on the direction provided by the example...
Now I hold my head up high and you see me somebody new. I'm not that chained up little person still in love with you

Yes, we could completely recreate ourselves to meet the expectations of one constituency or another, but then we succeed only in demonstrating the lack of value of the very thing that we offer to provide.

- Posted using BlogPress from my iPad

Monday, November 15, 2010

Where Does He Get Those Wonderful Toys?

A bit of background is necessary before I get into my belated discussion of Turkle (which I thought I had put on timed release but did not, I suppose).

First, I love games of all sorts. One of my favorite things to do with my family has always been to play games. My brother and I used to fight over the monthly arrival of the Games magazine. We love crosswords, computer games, arcade games, car games, and any other sort of game known to mankind.

I believe that this game-centric lifestyle has warped me to no end. I constantly play games in my mind at almost all times. One of my favorites is to take things that people say and try to think of songs or quotes from movies/TV that fit or follow from this. "Come on!" almost always elicits an internal completion of "Eileen" and a subconscious break-out into "toora loora toora loo rye aye" etc. (Albeit, my version is the Save Ferris version rather than the Dexy's Midnight Runners').

Second, I did not have a console gaming system for a long, long, long time after all of my friends had one, other than an Atari 800, which was ostensibly a computer. So, Nintendo, Sega, and Playstation systems were always a bit of a fetish item for me. I did eventually buy a PS1 and a PS2 after they went down in price, but by that time, it was not cool.

This means that my gaming came in a series of fits and starts that depended on what my friends had and were playing. It meant that I never got really good at any of the games that other people played and would constantly have to submit to the query, "Do you want to have me help you through this part?"

Combining these, I feel a polarity that I'm not sure I see in Turkle's work, although I might see it reflected. It is that gaming, whether it was a form of social connection and/or mental expansion and challenge, was never, for me at least, a matter of this zen-like connection that Turkle observes.

For me, it was a battle with the movable parts that Turkle claims do not exist in video games, despite my battles with sticky controller buttons, high ping/lag, or a less-than-modern mouse or joystick. It was a battle with not being able to devote the time and energy to claim some social status.

Even in the faculty seminar, despite the fact that I'm more of a gamer than most of my colleagues, I am the noob of all noobs in comparison to those around me. My students toss off their kill ratios and ask me whether I've played the new Fallout (no), and I'm left without a response. A couple of years ago, I taught a mass media intro class and had a day on identity construction in video games. I brought in my PS2 and Guitar Hero II, a wonderful example of mastery and the zen-like meditation that Turkle describes, and I was pwned in ways that I'm sure you can imagine.

Fortunately, now, I can retreat to my farm and enjoy a leisurely time of planting tomatoes.

Or, better yet, I can turn down the lights and watch "Dory" mate.

-Posted using BlogPress from my iPad

Friday, November 05, 2010

Laurel: Where's the Hardy?

Oh, That's Right It's in the Humanities!

It's arguably not the first approach that we've seen from the Humanities; both Nelson and McLuhan have some pretty humanistic undergirding. However, Laurel makes no excuses for her roots AND their usefulness in the realms of the digital.

It's the Story, Stupid!

Consider me biased, but I think that throughout all of the readings to date, a large quantity of great ideas has been offered, and a lot of these ideas have been tied to potential ways of seeing the worlds of work and thinking in new and original ways. What has been missing, and I think Nelson was pointing to this to a degree, has been the ability to analyze and critique these stories of the digital age.

It is here that Laurel brings her thinking, and it is from here, I think, that people like Tom Chatfield and Jane McGonigal draw the ideas for their relatively recent TED talks, found here, where they begin to draw out some interesting potentials for human-computer interaction.

What's It To You?
Well, to me, it's nearly everything in terms of research but also filters down to my teaching to a large degree. The concept that the interaction between agents involves their actions and also the motivations and beliefs behind those characters and actions is a powerful one. It clearly filters into any number of situations: advertising, politics, history, and even science. The structure of the narrative affects its meaning.

In almost every class that I teach, I give at least one example of how looking closely at the form of something can give us an understanding of how it works. This hierarchy, or should I say hierarchies, presents a method of analysis that not only goes beyond the efficacy of something being studied but also can contain and explore the discussion of efficacy itself.

In other words, it gives a process for exploring not just the process itself but also the reasons behind it, something that is not always available to more scientific approaches to phenomena. In this way a researcher can employ a transmedia approach to interactions that could be analyzed as narrative.

An Example?

Really? I'd love to.

Let's say, hypothetically, that you were interested in the changes in characters/agents that one might commonly call "detectives". Let's say that you want to also look at agents that seem to border on the definition of that character based on their actions, language, or motivations.

Well, traditionally, one would need to do literary analysis on the literary examples, apply film theory to the cinematic examples, and use mass media approaches for the televisual sorts. Additionally, techniques might need to be formed for musical, video game, comic, and advertising examples, to name a few.

Applying Aristotelian approaches to narratives, and to things that we agree on as narratives, is not new, but the idea of applying them to non-narrative characters and interactions is very valuable. Now, we can compare the driving of a character in a 1940s noir to the use of a controller in playing Max Payne. We can unpack the agency of the characters involved and compare the different modes of thought and ethical questions behind them in a way that more resembles the ways that individuals use media and engage in narrative.

The modern human agent does not really differentiate between computer time and movie time and TV time and video game time. It's screen time and needs to be studied as such.

- Posted using BlogPress from my iPad

Friday, October 29, 2010

The Fable of the Porcupine and the Car

As I prepped for teaching a course on short stories this term, I struggled a great deal with where to begin. The longevity of the "short" story is well recorded, with myths and folk tales, as is its potential to become ephemera, with "You'll never guess what happened last week." Because of the vast distances that stories both can and cannot travel, it grows difficult to ensure relevance beyond the discipline. Sure, short stories are a form that rose to popularity with the growth of subscription publications like newspapers and magazines in the early 19th century, but they are more than that.

Viola's "Will There Be Condominiums in Data Space?" gives perhaps a perfect example of the relevance of the short story in his use of the fable at the end. While it is told as a personal anecdote, this, combined with the personification of the porcupine and the merging of the "I" with their car, has all the hallmarks of a folktale or fable.

As with many of Aesop's fables and those collected by the Grimms, the location is described but also vague: "Late one night while driving down a narrow mountain highway." Additionally, the players, the porcupine and the man/car, each take on aspects of society or human nature. The porcupine is proud, stubborn, and natural, while the man/car is large, powerful, kind, and technological. The conflict is obvious and reflects the conflicts that Viola traces throughout his writing. It is a call for progress and an acknowledgment of the limits of personal perspective, but the framing as a fable has additional importance.

GK Chesterton writes, in his introduction to a translation of Aesop's Fables,
This is the immortal justification of the Fable: that we could not teach the plainest truths so simply without turning men into chessmen. ... [B]y using animals in this austere and arbitrary style as they are used on the shields of heraldry or the hieroglyphics of the ancients, men have really succeeded in handing down those tremendous truths that are called truisms.

To us, this means that Viola's use of a nearly universal and ancient narrative form communicates and demonstrates the points about tradition and technology that he makes at various places in the chapter: that there is

the importance of turning back towards ourselves...The sacred art of the past has unified form, function, and aesthetics around this single ultimate aim. Today, development of self must precede development of the technology or we will go nowhere

This reminds me of some recent trends in sacred spheres to return to more traditional forms of representation in order to recombine and recreate the now, including the monastic walk/prayer labyrinth:

and the both ironic and non-ironic appreciation of religious icons:

Interestingly enough, a friend of a friend's blog gives a very simple explanation of why icons look the way they do and of its theological importance. Not surprisingly, it has a lot in common with the discussion of space and ideas that comes up in Viola's chapter.

Note: I somehow lost the two posts that I did for last week. I'm going to recreate them from my notes and post them on Monday and Wednesday of next week.

Monday, October 18, 2010

Some Ideas About Tech...

I'm going to take a break from discussing and interacting with the readings until later this week. Don't worry, I have plenty of things to say about Nelson's "Computer Lib/Dream Machines" and Kay/Goldberg's "Personal Dynamic Media", and in some ways, I want this post to bridge between my Nelson-esque rant from last week to a discussion of implications for actual use.

All of this ties into the fact that...
I got an iPad!

This is fascinating to me primarily because I have always had to be supremely self-motivated in my technological direction. Other than my father's devotion to the sadly overlooked and under-appreciated Commodore Amiga, most of the technology in my life has had to be self-selected, vetted, and thoroughly argued for/purchased with my own money.

From my alarm clock to my numerous Walkmen, personal cd players, laptops, desktops, pager, cell phones, iPods, flash drives, home theater system, video consoles (PS, PS2, Wii), Kindle, and anything I might have left out, I have spent hours talking to people, checking out Consumer Reports, surfing the web, all in the service of not purchasing something that I would not get solid use out of.

This iPad gives me the opportunity to interact with a media technology on a different level, a reactive level, which has been quite informative.

I want to give one negative aspect and then a bunch of positive things.

Bad- Difficulties of Output
The iPad's abilities to connect, combine, store, and access a wide variety of media are fabulous, but getting things off of the iPad is difficult. I assume that this will be corrected/simplified as things progress, but I would love a couple of things: higher-quality audio/video output, easier wireless printing, and data/file transfer via Bluetooth/WiFi.

Yes, before you start inputting comments, I know that these all have workarounds that are OK, but for my use, as an educator who goes to different rooms with different set-ups (often of widely varying decades of equipment), I'd like to have one thing that I can carry with me, with my presentations, online encyclopedia, Kindle access, gradebook, streaming audio/video, etc. all in one. Right now, I have to install Silverlight/Kindle on the computers that I use in the classroom (assuming that the priesthood allows such things), carry a selection of flash drives, and keep a connection to Google Docs.

I have to say that it's not bad. I like it much more than making the overheads/copies, tapes, VHS, posterboard, and so on that were the norm when I was learning to make presentations in undergrad, but how nice would it be to walk into a classroom with my iPad, have the projector automatically recognize the iPad, establish a connection (with log-in), and allow me to type, draw notes, show videos, and play audio, all without cords, remotes, or a big console?

Good- Community of Discoverers
One of the most exciting parts of new technologies is the growth of supportive communities around their use, maximization, and enjoyment.

I remember the weekly Amiga BBS/SysOp meetings at the University of Delaware campus that we'd attend. We've all seen the continuance of such communities for longer periods too (motorcycles, HAM radios, classic cars). The iPad seems to have some potential towards these sorts of connections, and I'd like to share a couple:

One is the TWIT network's iPad show, "iPad Today" (if the link is not active, it's because it is blocked by Websense, which is causing some problems). The TWIT network is an interesting podcasting network helmed by Leo Laporte, whom I first saw 10 years ago on Tech TV. More interesting than the weekly show alone is the establishment of live chat communities, wikis, Buzzes, Twitter accounts, blogs, and other outlets that grow up around it.

Second is the "ideaplay" website that a friend in the tech and ed PhD program at Michigan State turned me on to.

These sorts of discussions and communities not only serve to teach one the rules and possibilities of the central subject, but they also test those rules and abilities. We can weigh the costs of "jailbreaking" an iPad without having to put our own at risk (not that there's really a big risk). In other words, they establish boundaries but also push against them, or at least they do in the best of potential worlds.


The potential to move and interact with content is really excellent on the iPad. The screen is clear, sharp, and just begs to be touched. I don't find the keyboard overly difficult to type on for most purposes, although I do wish for a wider shift key and more ready access to number keys. I'm sure that different keyboards will come in time. The sheer portability and design profile of the iPad make it very easy to pop into a bag, even more so than a laptop or netbook.

The use of the iPad is very simple (overly so in some opinions). There are a select number of apps per page, arranged without much variability. Clicking in and out of single applications fits most uses on a daily basis and simplifies a work-thread in a way that might be advantageous for a creature that cannot truly multi-task.

Pure Potential
There is nothing really innovative about the iPad. As many have said, the tablet PC is not new, and others have actually done it better in some ways. What Apple provides is a convergence and synergy that make the iPad a potential and simple locus for almost all connection/access, in a way similar to what some Microsoft people have envisioned for the Xbox 360 with Zune Pass.

I cannot wait to see where things go and test out trails going forward.

Friday, October 01, 2010

The Problem of People: Why I Really Don't Hate Tech

In reading Engelbart's reports laying out the research center, I had no problem engaging with the text. Perhaps it is my love of bureaucracy and reports, but I enjoy seeing a vision/idea laid out in such specific terms that they seem manageable.

In these not-too-lengthy pages, Engelbart lays out a plan that would lead to all sorts of amazing things: the mouse, Cloud computing, YouTube, and We Rule. What's not to like? What's not to admire?

Well, here you go:

Yup, the information for Pandora is blocked. It's exciting to have an iPad and look at ways that I might incorporate it into my teaching. I love audio and would love to find clips of NPR stories or better yet the C-SPAN app to discuss rhetoric and give us specific content to respond to, but as I go to the App Store....

You want to know why? Well, on the campus system and WiFi, iTunes, NPR, C-SPAN Radio, Pandora, etc. are all blocked because Engelbart's dream of a research center has not really progressed into the sort of organized and informed opportunity for self-managed and self-designed computer systems.

As frustrating as it is, it's understandable to a degree. After all, the system is not the closed one of ARC. It is vulnerable. Those vulnerabilities cost money and leave information to be potentially stolen, altered, or destroyed. There are all sorts of reasons why a community college might want to protect their wired and wireless networks, but they all boil down to one thing:


People are the problem. The people that design, the people that manage, the people that use, the people that misuse, and all the rest form a constantly fluctuating mass that is dangerous, powerful, and unwieldy. They are nowhere near the "skilled user" that Engelbart and English keep referring to, one who can "readjust his view to suit immediate needs very quickly and frequently."

Those managing and paying for our contemporary networks want as little "readjust"-ing as possible from the user's perspective. "Readjust"-ing costs money in fixing things when they go wrong. Allowing users, apparently even faculty in new media seminars, to actually use and test their abilities to integrate that technology is a cost without sufficient benefit.

Sure, having all the kids on campus with their phones and computers connected to Pandora constantly would probably eat up some bandwidth. That is a problem in need of a solution. However, this brings us back to Engelbart and English's "A Research Center for Augmenting".

The beauty of this plan is that they PLANNED for it to be manipulated and changed before they let people into it. Our current systems are not designed or planned; they are patched and stretched. It's the same as the difference between a tailored suit and one that's "adjusted" for your rental.

Darn it! I want technology to be tailored, and I want in on the consultation because whoever makes these decisions clearly does not think forward. They think backwards. It is not about using technology to make connections and explore possibilities on the campus now. It's about controlling access.

This is a fundamentally different process that is, sadly, a necessary evil to some at least.

Friday, September 24, 2010

"The Wow Factor is Gone": Our Limitations

The main problems with getting excited about Douglas Engelbart's visions and plans in our reading for today are two-fold:

1) We are not NACA engineers;
2) We have the expectation of the technology that we've always had.

The combination of these factors leaves us underwhelmed by the descriptions and details of the process by which Engelbart performs what is essentially an amazing feat of combining vision with engineering and programming.

In short, we are not impressed because many of us have lived with this level of technology for 20+ years. Our students and younger colleagues have even greater difficulties in imagining the amazement at being able to coordinate data in the ways suggested.

However, I do not want to stay on why it is difficult to engage with Engelbart. I want to engage with his vision and ideas. As the former winner of the Delaware BASIC team programming challenge in 6th grade, I am fascinated by the level of detail and attention needed to get a machine to coordinate these different categories of information, work that led to the invention of the personal computer (even as Engelbart does not really intend to envision a personal, consumer computer).

Engelbart v. Engelbart
In reading the first Engelbart/vision essay and comparing that with the second (Engelbart/English proposal) and the video of his presentation, I'm really seeing two different ideas.

On one hand, Engelbart/English propose a work terminal system to analyze and support research into ways of analyzing and supporting research, but in the renowned live presentation, Engelbart introduces the potentialities of the technology in word processing and mapping, interestingly combined with the narrative of his wife calling and asking him to do the shopping. The development of handheld organizers in the late 1980s, of PDAs in the mid-to-late 1990s with devices like the Newton and Palm Pilot, and of continuing advertising for smartphones and tablets also often confronts the consumer with the ability of this hand-held device to help with lists, maps, and relationships.

Palm ads here and here. (Interesting history of Palm Pilot here from the Computer History Museum)

The Case of Newton
Apple's Newton Intro Video here shows that a direct connection exists between Engelbart's vision and this kind of device. Watch the video and be amazed as the Newton users are excited about being able to get down and manipulate their thoughts on the device in a variety of ways, including drawing/design/handwriting, etc.

But, there's an interesting addition about a minute in. Now, the Newton will do things for the user, not just be an augmentation. Don't know how to put text and images together well? Don't worry. Newton will do it for you. Newton will not only assist you. It knows enough about what you are doing to do it for you.

This sort of personification and extension of technology beyond being an extension is even clearer in this Newton TV ad. Newton is now not just a thing or an assistant, an adjunct. It is a separate being who is friendly and worldly and intelligent.

While this might appear to be a minor shift from helper to friend used to achieve a marketing goal, it also reflects a vital shift rhetorically that has real consequences. Engelbart envisions a device that works on a hierarchy of symbols that he and his team devise and program into the system.

In fact, he envisions that these sorts of machines will always come as the product of a team that analyzes needs and wants and develops a system that allows users to manipulate those symbols in meaningful ways to achieve their goals. In essence, the technology becomes a product of an individual's or small group's needs and wants.

However, as we see the evolution to the PDA world, we realize that what has happened is that devices have been produced on a large scale, where individuals now need to be assimilated into the system of symbols developed by the engineers and programmers. In effect, the augmentation has taken over.

It is not just a tool to be used and shaped at will. Jobs, actions, wants, and needs must be shaped to fit into the use of the tool, as shown by the instructional videos on how to use a PDA.

And while the Newton perhaps tried to do too much and cost too much, the iPad and other augmentations of today seem to be facing less resistance. Built on an iPhone UI, the iPad seems "instinctive" and "responsive". No, it's more than that.

Apple wants it to be "magical".

And it is, but what is lost is the knowledge of how it works and the ability to shape the technology to the standards and hierarchies of OUR symbols. The process of learning about how we learn has been sublimated to make invisible what might be better made visible: the construction and sharing of symbolic systems.

Without an understanding of the logic behind the systems of symbols, students and users become consumers of symbols and leave their creation and manipulation to "magic."

The problem, from a cultural studies POV, is that these systems of meanings and symbols are not segregated from reality in an ether of pure entertainment. They are heavily ensconced in the networks of economics/class, culture, language, race, ethnicity, politics, gender, sexuality, and differences of all kinds. The order and preferences that are given to some symbols over others carry with them ideological values and meanings beyond the symbol and its meaning alone. Pure data connection is not possible.

Therefore, shouldn't we be aware of how, why, and when these sorts of decisions get made and by whom?

Friday, September 17, 2010

Bush-y, Bush-y, Bush!

The Good:
Bush's article clearly articulates that technology holds vast potential beyond the ability of engineers and scientists to develop ways to kill more people more quickly or more efficiently. In fact, it can help fellow researchers more readily share information about how past peoples have used technologies to kill more people more quickly and/or more efficiently.

All joking aside, Bush's attempt to look towards a future of connection and relative interaction is fantastic and exciting. What hipster would not want a memex? It's got an aesthetic somewhere between Mad Men and steampunk and does all the work of a mid-90s Palm.

Ok, I wasn't quite done with being sarcastic yet. I'll admit it. I love technology. I love being able to take pictures, like the one below, on the fly and not have to worry about getting it developed.

It is truly amazing to be able to follow my college acquaintance as he drove across America on a Craftsman lawnmower this summer.

The Bad:
One might immediately notice the potentials for abuse (or at least not-positive use) of tiny cameras that people can take anywhere. A surveillance culture, every success or failure living on indefinitely, pornography, and cats, lots of cats, appear to be the products of the tiny cell-phone camera.

This remains the problem with almost any advance in technology. The technology always precedes the ability of the culture to incorporate the possibilities in primarily positive ways, at least positive to the status quo.

And to some degree, that is a good thing. It allows technology to even the field between the oppressor and the oppressed, à la the use of Twitter by Iranians to subvert media blackouts and connect to the rest of the world. We love to hear this. It's exciting and hopeful. It turns our attentions away from the fears that technologies bring with them and that technologies distribute even more quickly and constantly than before.

However, it is this same technology that allows our attentions to be diverted so quickly. Recently, I listened to a Fresh Air episode where Terry Gross interviewed Matt Richtel, a writer for the New York Times who has been investigating issues of technology, society, and the science of the brain for the past year or so ("Your Brain on Computers" articles: here and here). In short, he has found that the research is beginning to show that our brains cannot take the quantity and diversity of information presented to them on a near-constant basis. The pleasure potential of a new e-mail coming in keeps us constantly checking the inbox like a rat with a random distribution of food from a slot. I find it fascinating that technology is beginning to have the equivalent effect of allowing, nay forcing, us to carry little slot machines with us (yes, there is an app for that).
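That slot-machine dynamic can be sketched in a few lines of code. This is a minimal, hypothetical simulation of a variable-ratio reward schedule; the function name and the 10% payoff rate are my own illustrative assumptions, not anything from Richtel's reporting. Each inbox check independently "pays off" with some small probability, so the gaps between rewards are unpredictable, which is exactly the property that keeps the checking behavior going.

```python
import random

def simulate_checks(n_checks, p_reward, seed=0):
    # Each inbox check independently "pays off" (a new message arrives)
    # with probability p_reward -- a variable-ratio schedule, like the
    # slot machine described above. Names and values are illustrative.
    rng = random.Random(seed)
    hits = [rng.random() < p_reward for _ in range(n_checks)]
    # Record the spacing between rewarded checks: the irregularity of
    # these gaps is what makes the schedule so compelling.
    gaps, last = [], -1
    for i, hit in enumerate(hits):
        if hit:
            gaps.append(i - last)
            last = i
    return sum(hits), gaps

total, gaps = simulate_checks(100, 0.1)
print(f"{total} rewarded checks out of 100; gaps between rewards: {gaps}")
```

The point the toy model makes concrete is that most checks pay nothing and the spacing of the payoffs is irregular, so there is no fixed interval after which a check is "safe" to skip, which is precisely why the 20-minute urge to reopen the laptop feels so strong.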

I see it in both myself and my students. I check my e-mail right when I leave my office, and 20 minutes later, when I get home, I feel a strong urge to open up my laptop and check again. I know, intellectually, that nothing of significance has come in during the last 20 minutes. I know, emotionally, that I should sit on the floor and read or play with my son rather than reconnect to the screen, but the "pull" is powerful.

The Dangers:
What's amazing to me is that so many seem willing to plow headlong into more reliance on technology that increasingly proves to be dangerous or detrimental when used on a broad basis. I can't drive anywhere in this city without nearly getting plowed into by someone on their cell phone or texting. At Baylor's campus, I've repeatedly heard of students nearly being hit as they are walking, absorbed in their phones or iPods, and not noticing that they are going into traffic.

Would these people engage in dangerous or distracting behavior of one kind or another anyway? Sure, why not? But the point is not that technology presents the only danger of violence, distraction, or of whole levels of intelligence or thought vanishing, but rather that technology makes those problems easier to develop and harder to resist. Furthermore, the culture places a negative value on those places or people that choose not to use those technologies.

I think that E.M. Forster's "The Machine Stops" gives a highly astute and prophetic view of the dangers of these kinds of connections of knowledge and thinking to machines. It is not so much the problem of storage that Bush's plan helps to solve. The danger lies in losing the ability and inclination to train the mind and body to work together through diverse media to obtain information and synthesize it.

It's precisely because it IS so easy to "Google It" that the desire for and pursuit of knowledge as a process is reduced to the tiniest of realms within society, pushed back into the Ivory Dungeon, only to be let out to serve the cry that a populace with more "higher" education will fix fundamental problems in the economy.

Forster's story describes technologies and interactions not too far afield of Bush's, only from an oppositional perspective and thirty-six years earlier. Forster describes the use of machines to encode knowledge, making actual research unnecessary and everything accessible at the touch of a button. Forster's dystopia does not end well (do any?), and human beings die in droves in the dark as the machines run down, with no one left to understand their processes after generations of efficiency.

Final Thoughts:
Rather than end on a negative note of complete ruin, I want to propose a potential solution. In recent years, STEM education has represented a significant push in American education at all levels of schooling. However, in reviewing much of the literature, little is done to pair discussions of technology's abilities with the potential ethical issues of that technology and science. The focus has been on the question "Can we do X?" not "Should we? How should we? What are the social/cultural costs of X?"

No, it's been left largely to specialists within fields, secondary/tertiary debates at conferences, or, more likely, those crazy "humanities" people who keep saying, "Umm...remember this other time that we did something like that? It didn't turn out well."

If we could institute STEM education that includes the implications of these actions and ideas, from the earliest levels, then we might have a population more prepared to adapt to new technologies in healthier ways, rather than getting bloated on the processed/technological "food" that Bush praises in his article.

A Step Closer to "Public"

Hey, happy, happy-ish news! I'm beginning to live the "public" part of being an intellectual a bit more than in conferences and classrooms (stupid article publishing being so hard and slow). I was contacted a while ago by Anna David (I thought it was a joke/spam at first) to talk about my use of reality TV in the classroom, something that I've blogged about at times and been trying to publish here and there.

Well, the article is finally here. I think it does a fairly good job of giving an overview of the approaches to reality TV, including, but not limited to, its study as a genre in media studies. Since I've shifted my teaching and research toward using popular culture to get at other academic points, I've felt a bit of an outcast. The article did a fairly good job of bringing these odd threads together. Maybe I'll propose that TWOP needs a "higher ed" column, or that the Chronicle needs a "Reality TV" column.

Thanks, Anna. I'm sorry that I thought you were porn spam initially.

Thursday, September 16, 2010

Forgive My Ignorance: Thoughts on Murray

Excuse my confusion. I was criticized frequently in graduate seminars for quibbling too much with the linguistics and for buying into the myth of needing to "tear down" or challenge a reading, and I want to avoid that pit.

However, in the course of reading Murray's introduction, I couldn't help but twitch at certain moments that rankled. Let me give an example.

...the term "new media" is a sign of our current confusion about where these efforts are leading and our breathlessness at the pace of change, particularly in the last two decades of the 20th century. How long will it take before we see the gift for what it is--a single medium of representation, the digital medium, formed by the braided interplay of technical invention and cultural expression at the end of the 20th Century?
She asks a question, and I assume that means an invitation to dialogue. How long will it take indeed?

A "Gift"?
Well, let's set aside the large presumptions that this is a "gift," implying both a singular entity and a "giver," along with the supposition of some measure of benevolence.

A "Braid"?
So, assuming that "new media" is "good", "singular", and a "gift", then we have the issue of whether it is formed by this "braided interplay of technical invention and cultural expression". Like the gift metaphor, the "braid" carries a purposefulness with it that I'm not convinced exists in new media today. It seems much more like when someone tangles a mess of yarn and then a few people work to untangle it. Some will take the time to methodically find and work from a loose end, carefully winding the yarn as they move. Others will grab knots and try to loosen or organize these nexuses. Still others, mostly my son and cats, spend time enjoying themselves. Finally, some might find ends, unravel a bit, knit what they want, and then charge for access.

Whatever the case, new media are not braided together. This is something, but it's not a braid.

A "We"?
Finally, the question assumes that "we" most/all have access to technologies of production and consumption. Of course, Murray could be referring to the "we" of the educated elite, but that appears to reinforce the "us" vs. "them" mentality that has been appropriated by critics of higher education. The simple truth is that the class divides that are increasing carry with them significant digital divides in terms of access and literacy. People without training in accessing and understanding these media fall behind a curve that rises term by term, as coursework increasingly requires fundamental knowledge of computers, e-mail, Twitter, Blackboard, and other new media elements.

But that deals with "selection", which will be a major part of my interest in V. Bush's "As We May Think".

Friday, September 10, 2010

I'm Back!: New Media Faculty Seminar!

After a hiatus of a couple years (including blogging over at a Relief for a while; please feel free to go and seek them out), I'm baaaaack!

This is due entirely to signing up for a Faculty Development Seminar on New Media that I'm very excited about. Although my field is not really "New Media", I feel that I'm quite often pushed into this role within my department by virtue of being 10 to 20 years younger than most people in it.

In addition, I'm beginning to see a lot of overlap between my research in the ethics in and of genre narrative and the ways that communication is changing and might change.

So, I'm going to take the course's questions, "How do we use new tech?" and "Do we use it as it's intended?", and apply them to my thoughts on movies, popular novels, TV, and video games, particularly those connected to the moral quandary of the individual within a corrupt world with flawed social institutions (what I refer to as "noir").

In my next post, toward the beginning of next week, I want to provide some first impressions of Janet H. Murray's "Inventing the Medium," the introduction to The New Media Reader.