Monthly Archives: September 2009

For Good Measure: Integrating fully

As I alluded to in a previous post, I posit that it might not be long before our ever-deepening use of technology reaches a threshold. Whenever I spout off about this topic, I tend to get funny looks, but I encourage everyone to bear with me.

Think about our relationship with technology in just the past fifty years. If I may be poetic, we’ve progressed from computers that filled buildings yet could run only the most basic operations to carrying around smartphones with processors many orders of magnitude more capable than the Apollo 11 Command Module’s onboard computer. We’ve witnessed the rise of the ARPANET, the Internet, Web 2.0 (a term I still kinda dislike), and real-time photo-realistic graphics rendering. More information streams across global distances faster each day. And per Moore’s Law, computing capacity doubles roughly every eighteen months to two years.
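To put rough numbers on that doubling, here’s a minimal back-of-the-envelope sketch (assuming a fixed 18-month doubling period; the commonly quoted figure ranges from 18 to 24 months):

    # Back-of-the-envelope Moore's Law arithmetic, assuming a fixed
    # 18-month doubling period (commonly quoted as 18-24 months).
    def growth_factor(years, doubling_period_years=1.5):
        """How many times capacity multiplies after `years` of steady doubling."""
        return 2 ** (years / doubling_period_years)

    # Fifty years of doubling every eighteen months:
    print(f"{growth_factor(50):.2g}x")  # ~1e10, i.e., ten orders of magnitude

Which is exactly why the phone in your pocket can dwarf a building-sized machine from the 1960s.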

As far as our behavior goes, we are the most digitally and socially integrated group of humans ever, and that trend shows no sign of abating. We constantly devise new ways to use technology to keep in contact with our circle of friends, monitor our world, and create new digital worlds to escape into. So, I ask – absolutely, 100%, I kid you not – seriously: how long until we become a race of cyborgs?

Just to be absolutely clear: I do NOT mean this.



It’s not a big leap. In fact, with the above trends factored in, what used to be a chasm is rapidly shrinking into a mere crack in the sidewalk. If we continue carrying ever more powerful technology with us, increasing our reliance on digital maintenance of our knowledge base, and pushing nanotechnology as far as it can go, it can’t be long. Our generation might be cautious but convincible, and older generations probably won’t adopt at all, but what about those yet to come? There will come a point when I won’t be content to interact with a device physically when I know a more accurate, more efficient route exists directly through my brain. It started with increasingly portable yet capable laptops and phones. The next step will be wearable but separate tech that makes these functions more permanently accessible. After that will come technology that passively observes our nervous system and translates our will into physical action. (Actually, this device might already do that.) Once we’ve contented ourselves with using trained subconscious eye movement to access and display information on a HUD before our eyes, what else could the next step be?

Integration isn’t going to happen overnight. In fact, a lot of these steps have already been taken, in crude ways, to help those with injuries or disabilities manage some life functions through alternative means. As we continue to refine these technologies and guide them into more effective use, they will pass the mark of parity with the biological and move on to superiority. At that point, elective adoption will increase, and humans will gradually alter themselves for the simple, objective benefits on offer.

I won’t commit the classic folly of assigning a specific timeframe for when I think this will come to pass, but I will offer that I sincerely believe it will happen well within my lifetime. When we reach this threshold, what reason will we have as a culture to ignore it and say we will go no further?

Personality Faceting

In response to Tuesday’s class discussion and Dr. Schirmer’s article, “The Personal as Public: Identity Construction/Fragmentation Online,” I’d like to propose another way of looking at how we present ourselves online: facets.

The analogy works pretty well for all aspects of our interpersonal lives, actually. Let’s use a precious stone like an emerald for our example. The stone starts off kind of unremarkable, really. Gemstones in their natural state aren’t the clear, sparkly bursts of light we associate with the end product. Assuming you have a stone of appropriate clarity and worth, it still needs to be cut and polished. It takes a skilled lapidary to study the stone and see the gem within, and a slow, careful process to whittle away just the right amount in just the right places. The lapidary makes choices based on the application (pendant, ring, earrings) about how each facet should be oriented to make the best use of light. While the facet everyone focuses on – the one displayed most prominently to the world – gets all the attention, the supporting facets throughout the rest of the stone are crucial. Channeling light toward the clearest, most pristine part of the stone and away from its imperfections is made possible by all the supporting facets working as a whole, not by the most prominent facet alone. If the faceting is done incorrectly, the gem is little more than the dull, unremarkable stone it started as.
Now think of the selves we present to the world. I firmly believe that every presentation of ourselves, both digital and physical, is part of a whole. We start life as an uninteresting, wiggly lump and soon begin the faceting process. It starts out externally driven, until we’re old enough to chart these paths for ourselves, but the process leads to the same end. We choose the single best side of ourselves to display most prominently, but rely on the lesser-noticed aspects – sometimes our imperfections – to give support and context to our public face. This extends online just the same. It’s not that we’re different people in different online settings; it’s that we use different facets of our personality in different settings, and it’s always in service of the whole personality.

Digital Natives and our inevitable assimilation

Regarding the Prensky article, “Digital Natives, Digital Immigrants,” I’m going to explore some things I didn’t get to on other classmates’ blogs.

That we have two types of digital denizens is of crucial importance to our digital culture. I think we have an equilibrium that is going to shift steadily over the coming decades as the last of the immigrants dissipate through … let’s just call it attrition. With each year, the number of natives will rapidly increase while the number of immigrants steadily decreases. (Aside: in what other scenario could the immigrants ever be outnumbered?)
I think this will have a steadily freeing effect on our digital communications. This isn’t to say that the immigrants have a chilling effect by intent; rather, their presence means the digital landscape can’t take certain things for granted. As our culture becomes vastly dominated by natives (those who’ve never not had a digital life), the inherent mistrust of things digital will abate. That you have a Facebook account (or whatever our future equivalents will be) or maintain a significant online presence will become a default assumption. Accommodation for the disinclined will wane at first and eventually become rare.
In short order, we’ll be so accustomed to living with a completely integrated digital self that it will be practically like existing in multiple dimensions at once, and we won’t even notice.
And this will be one of several gateways to full cybernetic augmentation, which will kick off another dichotomy: the Integrated Natives and the Integrated Immigrants. But I digress…

For Good Measure: Shaking our fists at the Internet on our gol-derndet lawn!

Note to 513 readers: In an attempt to delineate between what is class-assigned and what I stumble upon and find of interest, I’m going to try to preface blog posts and tweets like this one with “For Good Measure.”
As I tweeted today, I found a Telegraph (the UK paper) article about 50 things that are being killed by the Internet. The article might be better described as “50 things that are being (sometimes mercifully) killed by the digital revolution.” A few entries stick out as fluff, as if they were scraping the bottom of the idea well to round out the 50, but it’s a solid list. Some are tongue-in-cheek, some are irrelevant, and some are a shame.
I really have to single out the Telegraph’s list-topper as the most influential – “Polite disagreement.” With politics as the prime example, the Internet’s double-edged sword has been its promise of diverse viewpoints to challenge our preconceptions and humanize those we disagree with, set against the reality that we’ve only radicalized further. Because the politphile can easily find a niche blog or community that reflects back exactly the point of view they already hold, there is no incentive to seek common ground. With the Internet, we can cast our own dramas with a loyally supportive cast and fantastically one-dimensional Others. One would hope this would be constrained to the digital facet of the world, but sadly it is not. We know we can always escape from those pesky, foolish ultra-_____wingers/_____ists we work with or otherwise have to interact with by returning to our digital compound and circling the wagons with the ones like us, the ones who get it. It cheapens the discourse; it’s a polarizer.
Honorable mentions:
13. Memory – I don’t know. I think it’s actually probable that I’m a little better-rounded for having surfed the Internet and Wikipedia for as long as I have. I can’t count the number of times I’ve gotten lost in the Wikipedia click-spiral.
14. Dead time – As I mentioned in class, I really think I need to go Walden and just disappear into the woods for a few weeks.
16. Hoax debunking – The single greatest triumph over the emails that one crazy, gullible distant family member sends.

What really ISN’T new media?

(CRAP! I had this sitting as a draft since Tuesday.)

My response to Zappen’s “Digital Rhetoric: Toward an Integrated Theory” was lukewarm. Maybe I’ve completely missed the point – and perhaps I can be forgiven, because the point seems so buried – but the takeaway seems to be that new digital forms of expression and rhetoric are only superficially different from the traditional notions. Assuming I’m on target, I’ll proceed.

Well, duh. I don’t really see how this manner of communication could be fundamentally different based on the medium. The point, the essence, is the exchange. Be it 140 characters at a time tapped out on the subway or at length in peer-reviewed journals, discourse is discourse is discourse. The legitimacy bestowed on it may make it more professional or more suitable for a specific application, but it is still discourse, and it is still valuable.

It seems one point being made is that an aspect of so-called “New Media” is the scientific reproduction or affectation of traditional media – a mathematical breakdown of something like a video or image. That strikes me as odd. In reality, EVERYTHING we’ve ever done could potentially fall under this definition if the parameters remain this loose and computing power continues its unabated expansion. Just as a mathematical equation can reproduce the display of an image by orienting millions of small dots of color, so too could the more physical characteristics be included. A painting is not just the colors used and their relative orientation – it’s the way light plays off the inconsistencies in the pigment, or how different canvases interact with different paints. There are mathematical ways to describe all of this. So, given enough time, wouldn’t a) the processing power needed to render, in three dimensions, how all of this data is presented and constructed, and b) continuing advances in physical reproduction (such as 3-D printers) eventually produce a 100% flawless copy of an analog original? No doubt reaching that level of sophistication will take time, but since it couldn’t be achieved by a human hand, wouldn’t this also be New Media?
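As a toy illustration of that “image as math” idea (the formula and grid size here are my own invention, purely for demonstration), a digital image really is just numbers arranged on a grid:

    # Toy sketch: an "image" is nothing but a grid of numbers.
    # Each pixel's (R, G, B) value comes from a simple formula,
    # producing a small color gradient. Purely illustrative.
    WIDTH, HEIGHT = 8, 4

    def pixel(x, y):
        """Compute an (R, G, B) triple for the pixel at (x, y)."""
        r = int(255 * x / (WIDTH - 1))   # red ramps left to right
        g = int(255 * y / (HEIGHT - 1))  # green ramps top to bottom
        b = 128                          # constant blue channel
        return (r, g, b)

    image = [[pixel(x, y) for x in range(WIDTH)] for y in range(HEIGHT)]
    for row in image:
        print(row)

Scale that grid up to millions of pixels and add equations for texture, gloss, and lighting, and the same principle holds.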

Furthermore, there is a reference to Alan Turing, but nothing said of the famous Turing test. For those who may not know, the Turing test proposes that a sufficiently complex, well-programmed computer could carry on a conversation with a human as convincingly as another human could. No machine can claim that honor yet, but this, too, must be only a matter of time. Then media will be capable of expanding into yet another realm it has been shut out of thus far: human-like, human-to-human communication.