Monday, April 29, 2013

Knowledge in a Nutshell

        Throughout this course, we've examined how the Network has shaped the reality of writing, the self, the author, and power. The Future of Thinking: Learning Institutions in a Digital Age by Cathy Davidson and David Goldberg brings us deeper into these themes by examining the relationship between the Network and knowledge, especially in education today. Davidson and Goldberg focus mainly on the idea of a "shift," which they apply to time, technology, and skill, in order to convey the belief that knowledge reflects the Digital Age and that education should as well.
        First, Davidson and Goldberg signify a shift in time. In this excerpt, it's important to note that, along with the shift in time, they point out a shift in the type of knowledge the digital age provokes. Previously, authorship and professional surveillance were necessary because knowledge existed in a discovery age: facts were being found, examined, recorded, and taught. Web 2.0 covers every resource, every record of information, which signals a shift into a collaborative age. Web 2.0 facilitates a new method of obtaining knowledge: looking between the lines and making connections between fragments of information to explore further. To continue reinstating traditional knowledge authorities for expertise and validity in this age is to "assume that credentialed experts have a premium on authoritative knowledge and fail to acknowledge that 'experts' also are open to error, bias... And it belies the productive strength of collaborative Web 2.0 knowledge formation" (59). The Digital Age no longer needs authorship or surveillance because anyone can be refuted or found to be biased. Now, with a broader database, knowledge is extracted from the ability to create or find new underlying truths. Times have changed, and holding onto an archaic method of processing information would only hinder the individual, especially if education holds on to (and continues passing on) this outdated method.
        Along with the shift in time, the shift in technology that Davidson and Goldberg point out works to disprove certain criticisms against integrating technology into the educating process (schools/classrooms) and accepting the new method of obtaining knowledge. They agree that technology cannot completely replace the classroom setting; knowledge must be embodied. However, the complaint that Web 2.0 is a distraction that derails the linear process of turning information into knowledge is no longer plausible. Many professors and complaining colleagues "have called for a complete or partial ban of laptops and other mobile apparatuses from the classroom, which some report that a dramatic increase in the quality of classroom participation happened as a consequence...but it fails to address all the underlying factors pushing students to look elsewhere for sources of engagement" (76). In fact, according to Davidson and Goldberg, Web 2.0 is no different from the newspaper that used to rustle in the back of the classroom just a while ago-- the newspaper is also a technology. Worse, kids getting distracted by Web 2.0 is no different from instructors getting distracted by the book they're buried in, reading aloud stagnantly at the front of the classroom. Although still embodied, knowledge should now be facilitated by the instructor instead of banked into the students, as in Freire's notion of "banking education." If the instructor doesn't mediate the subject matter or content and use the technology in a collaborative, interactive, and engaging manner, he/she has failed to facilitate knowledge and to allow the students to become knowledgeable. Attention must be drawn and maintained. To turn facts into knowledge, there must be embodiment, and it must be the students embodying what they've learned. Spontaneous factual extraction from the web is still just information.
        The shift into a collaborative and technological age supports the contention that knowledge processing has transformed, but Davidson and Goldberg (as well as I) still agree that Web 2.0 remains fragments of information. In addition to collaboration and embodiment, the process of knowledge requires a final shift: the shift to new skills that enhance productivity. Here, the words "skill" and "productivity" are key. The authors state, "the compelling response here is not to insist on authoritarian modes of learning, on top-down assertion by some small class of experts; rather, it is to shift focus from authority claims to assessment of authority claims and the stature of authority itself. The point is not to abandon or restrict contemporary technology but to put it to good use" (77). Knowledge comes with a new skill set in the digital age; it challenges authority by not relying on experts. In fact, this shift in trust, prompted by the Web's falsities, half-truths, and even flat-out lies, means that knowledge is now created by actively refuting claims and forcing ourselves to pay stricter attention in order to discover the truth and determine it for ourselves before personalizing that information (embodying it) as knowledge. This new skill set does not depend upon experts, since correctness is never guaranteed. Students develop knowledge by challenging posts on the Web, digging deeper for meaning, and analyzing components independently through widespread collaboration, both direct and indirect. These new skill sets demonstrate not only a new, productive age but a more collaborative and technology-friendly age as well.
        To conclude, digital learning allows not only for the sharing of facts and information but for questioning and discovering the meaning and roots of information. Instead of the previous system of schooling, where students are taught to regurgitate facts, students who crowdsource and learn digitally are actually learning more. They are learning more than facts: they're learning how to ask deep questions and think critically about tidbits of fact. Ultimately, the most attractive aspect of this essay is its tone: there is no other option; it's time to "get with the program." The shift is here and now. Holding on to archaic modes of interpretation and hiding behind flimsy complaints, like whining about distracted students, is a disappointing characteristic of our society, if not downright sad. The Digital Age means we learn through our peers and through a gigantic database of information. We're collaborative and inquisitive by nature now, especially with the new set of skills that force us to dig deeper for connections and truths without the authority of experts, which is flawed anyway and demands blind faith. Technology is already an integral part of our generation; why not use it to its full advantage? Students and the general public already demonstrate this Digital Age and the new knowledge; I strongly encourage the authorities over the most crucial components of our lives, such as our academic authorities, to be just as flexible.

Tuesday, April 2, 2013

The Author in Cyberspace


    The author has been redefined, reevaluated, and recreated by thinkers since the start of the digital age. Mark Poster, in the fifth chapter ("Authors Analogue and Digital") of his What's the Matter with the Internet?, however, focuses on redefining and reevaluating authorship and the shift in authorship that occurred in the digital generation. From analogue to digital, the shift in authorship represented a historical shift as much as a social one. We went from the linear mentality of the age of guilds and hierarchies to an unorganized, democratic cyberspace of simultaneity. The author argument remains heavily disputed since, as Poster points out, we are still in the birth process of the digital age, and the answer won't even be predictable until the end of the modern digital age.
    There must, however, still be a certain opinion about the new digital author that Poster doesn't explicitly state. I found myself drawn to his arguments and in support of the observations he thoughtfully explained in his work. Although Poster attempts to remain unbiased (mainly to avoid being called a technological determinist), I inferred an underlying sentiment of support for this new transition. As people and times change, so must certain functions in society. It doesn't make sense to hold onto an archaic practice in a contemporary world. Therefore, I contend, and agree, that the authorship shift in our digital age should be fully embraced and taken advantage of so that we can discover humanities specific to our reality. Applying a deconstructionist tint of Postmodernism, I attempted to create a poem by erasing and selecting one word or one phrase at a time until a coherent thought summarizing Poster's opinion emerged.


The Author in Cyberspace
by Susan Lee

    “Digital authors are not simply separated from their words, as they are in the print media, but reconfigured by their relation to the machinic apparatus. Because digital writing may be rewritten with ease, the stability of words on paper is lost, severing the link between author and text that was established with so much difficulty during the first centuries of print, as we have seen. The cultural practice of taking authors of books as trustworthy authorities, as persons of possibly great creativity, is difficult to reproduce in the case of digital texts on the Internet. This, of course, by no means prevents the establishment of a new cultural practice in which authorship as we know it is somehow sustained. But the case of digital texts does indicate a rupture in existing practices and the need for a new invention of authorship.
    When analogue authors were installed in the cultural landscape the modern subject was being articulated in discourse and practiced in daily life. The figure of the analogue author fit well with the emerging sense of the body as private, the self as separate from the world of objects, and the investment in rationality as human essence and consciousness as the source of meaning. It fit well with the practice of distanced relations of the free enterprise market, the theory of representative democracy, and secular education in literacy and mathematics. It fit well, in addition, with the narcissistic arrogance of European superiority and imperialist adventure and with patriarchy in its new articulation in the urban nuclear family. Each of these hallmarks of modernity had its own temporality; by no means was all of this some unified essence, some spirit of the age, or even some revolutionary project of a well-defined group of political agents.
    Digital writing emerges at a very different point of history, which might be characterized as follows: The broadcast media, as many have argued, have done much to diminish or even dissolve the rational, autonomous ego. Global capitalism is reconstructing planetary relations along very different lines from older colonialism. The viability and even legitimacy of leading modern institutions is no longer secure, even though alternatives are by no means obvious. Digital authorship arrives, then, in a specific context and the shape it is given in the decades to come will owe much to that context as well as to its material characteristics. It is my contention that the more beneficent configuration of digital authorship can come only from practices that explore its particular potentials, perhaps with an eye to the best that analogue authorship has offered but by no means with a sense that at best we can only repeat its achievements. This is a great moment to experiment with digital forms of writing and communication, even though these experiments will be resisted by the gatekeepers of authorship-- the watchdogs of copyright, printing establishments, tenure committees, and so many others...
Many of the features of digital authorship, as they affect the conditions of work in the humanities, are in some sense anticipated in the modern period. From the novels of Laurence Sterne to the theoretical practice of Roland Barthes, anticipations of hypertext, for instance, may be gleaned. If the digital imaginary is here foreshadowed, the practice of digital authorship had to await the material inscription of networked computing. Only when this rearrangement of ink into bits, this profound destabilization of the trace, occurred could the regime of the author function be transformed in countless practices of symbolic culture. Only then could the Gutenberg Galaxy become overlaid with a universe of cyberspace" (Poster 97-100).

Monday, February 18, 2013

The New Identity: Law of Association


Click.

Okay, here we go.



Typical Facebook profile photo. The "I'm a cutesy, casually fashioned girl"-type picture, designed not to be provocative, but she knows exactly where you're looking. A whole bunch of NYU-related pages plaster her "Recent Activities." Don't tell me, she's the annoying, must-be-president-of-everything, Student Council girl, isn't she? Music likes for Hillsong United and... Lil Wayne. That's quite a contradiction. A humanitarian, I see, with interests in the national debt, bullying, and the American Cancer Society; however, all those likes for pages like Michael Kors, Sherri Hill, and Britney Spears are a little concerning. No more than a couple of sentences for her biography; no more than a couple of sporadic statuses per day.

I see she's not exactly an artist herself, although she's got an eye for the latest and trendiest tags, posts, or tweets. Of course she's a Chicago-raised girl in NYC; just look at these constant Instagram photos of the West Village, the Brooklyn Bridge, or Manhattan. I get it, it's a city. Big whoop. Jesus, I can't tell if that's Central Park or heaven with all those filters. The joys of Twitter: sexual pick-up lines. She must have a good sense of humor; I'd love to hop on that myelin sheath. Unfortunately, following Steve Martin and Shane Dawson is a mistake, and I don't understand why she's following Forbes and Jim Carrey. Someone is clearly trying too hard.

Unfollow.

No matter how cynically and stereotypically I try to decipher my own online identity, the only concrete labels I can paste to my own being are "Hi, My Name Is: Susan Lee, born June 12th, 1994, student, daughter, and sister." I mean, what does the genre "typical" even constitute in this random, disjointed cyberworld? There is no such thing. We've moved on from the traditional rationality-seeking, thoroughly analyzed identity of the past, starting with the Cartesian ego. Bolter writes, "the notion that writing unifies the mind was shared explicitly by the classicists and historians... the [notion], associated with Descartes, that what makes each of us human is our ability to function as a reasoning agent." Descartes, a world-renowned philosopher, denied his senses, his body, and his physical, secular world with all of its institutions, but he held fast to his mind-- the only real, secure authority he believed in, because his thoughts were his own.

In my world, as deconstructionist as I may seem, I find it difficult to deny that there is much wonder left in the world. Humans are an intelligent species that has recorded every definition, every vision, and every experience to the best of its linguistically descriptive abilities. There are infinite fragments of information out there, somewhere, in text, just waiting to be used. C.S. Peirce, quoted by Bolter, ascribed to the modern man a more modern definition: "People are like words. The man-sign acquires information, and comes to mean more than he did before. But so do words. Does not electricity mean more now than it did in the days of Franklin?... In fact, therefore, men and words reciprocally educate each other; each increase of a man's information involves and is involved by, a corresponding increase of a word's information." Ironically, we know a lot more than we think. As a result, one long rational analysis doesn't do an individual justice anymore. We can now be everything and anything; we just need the right description, the right Facebook page to like, or even the right trending hashtag to follow, because words hold incredibly expansive connotations now, deep enough to justify every piece of ourselves-- the text comes first, and we associate decisively.

In the end, I still sound cynical. Just because who we are is what’s already out there doesn’t mean we can’t still be independent, individualistic. I’m proud knowing that we’ve nearly exhausted our mental resources and we still create masterpieces.

Sunday, February 10, 2013

An Ironic Profession


Author: the composer of a literary work; someone who writes as a profession

At least that's how an author is defined in a dictionary; however, the title of "author" is now being challenged in the midst of our digital era. What was once a noble profession is now a profession for any common man, and print, the preserver of text for the past five centuries, fails to keep up in competition with the Internet. A new kind of revolution arises, challenging the value that readers place on an author's permanence. In a digital world where the essence of writing is changing, being created, revised, and deleted constantly, can distinctions between readers and authors truly exist?

Much like Art, Writing indubitably requires talent; it is an innate skill. Having acknowledged this, it only makes sense that the author's profession should remain exclusive. For example, the ancient Western literary canon has been challenged many times, but who can deny the genius of those books? As the foundation of our culture, the very fabric of our morals and ethics, it's become nearly impossible to extract Plato, Shakespeare, and Homer from today's home, education, and government. Print awards these books a prestigious merit and timeless recognition that, once achieved, completely separates the author and the reader. By being printed, these texts have maintained permanence for generations and endured the test of time. To share authorship with the unqualified naturally seems... wrong.


On the other hand, these authors must have started somewhere. To obtain literacy, you must study literature. So someone who was once a reader is now an author, and anyone who is an author is, has been, and will always be a reader. Also, one must remember that time is both a virtue and a vice. As laudable as it is that Gilgamesh's brilliance has never diminished since about 2700 BC, times have changed, and the fact that it's bound in a book won't affect its value, positively or negatively. While we complain about printing converting to typing, we fail to remember that even Gilgamesh converted from cuneiform clay tablets to translations on pieces of paper. Finally, author Jay David Bolter argues in his book Writing Space: Computers, Hypertext, and the Remediation of Print that "the shift to the computer may make writing more flexible, but it also threatens the definitions of good writing and careful reading that have developed in association with the technique of printing." To that, I must reference my belief in the essence of writing. As long as it conveys thought, emotion, plot, imagination, or even experience, writing has been achieved. The only debate lies with "quality," yet what makes Mr. Bolter so certain that printed text develops reading skills better than typed text? Being on a screen doesn't make the words appear any differently. In fact, by allowing multiple edits and annotations, don't readers prove to be more than competent in close reading? Additionally, with an endless world of information and the ability to follow hypertext with a click of a mouse, the notion of reading in a linear fashion has disappeared-- but hasn't that sort of access to resources created authors with all different kinds of cultural glasses?


Categorization is vital in our process of self-evaluation. We need a category that's original, a perfect embodiment of what makes us unique, confident, and beautiful. Authorship, in my opinion, belongs to everyone. Recognition comes with true talent, which will always be distinguishable. Unfortunately for arguments like Bolter's, a reader cannot be without an author, and an author cannot be without a reader. Writing cannot be bound to a set of conventions, and the Internet has finally proved that. Therefore, instead of focusing on such shallow, narrow forms of organization, we must remember that writing exists without a doubt. All these clay tablets, printing presses, and computers are nothing but mediums. With changing times, all we need to do is change mediums and preserve writing that way. Why can't both be an I     D     E     N     T     I     T     Y  ?