Thursday, July 28, 2011

I AM THE INTERNET



This is a screenshot of my desktop at the very moment I begin this post. This could also be a screenshot of my desktop on any given day, as long as my computer has a power source and a relatively stable internet connection. My browser has at least 5 or 6 tabs open to various websites, which I switch between every few minutes. I constantly check my email accounts and respond to chat messages. You probably could have guessed it after reading my post about how I am suddenly in love with Twitter. I still maintain my blog on Xanga. I was on Facebook when it was in its infancy. Then, it was MySpace. Now I am on Blogger and Google+ (add me to your circles, yo).

Hello, my name is RJ. And I am a Seeker.

Emily Yoffe's article -- hold on, let me Google her real quick -- and the reading on Generation "M" spoke directly to my soul. My podcast is an oral history of my addiction. But then again, is it really an addiction? If we live in a wired world, why not be plugged in?

The arguments I have heard against over-connectedness are not unlike the arguments I have heard against video games, and they boil down to this: seekers and gamers could be using their time to do other, more productive things. I won't argue with that statement. It is true. Instead of being bathed in the cold light radiating from your [insert device here], you could be [insert any other activity here].

Instead, I will offer an alternative hypothesis. Those other activities -- like running and dancing -- are hobbies. They are things that some people like to do during their free time. Some of those people might even attach those activities to their identity. Someone who runs might refer to themselves as a "runner." A person who likes to dance could say they are a "dancer." People associate themselves with others who share their interests or values. It is the very reason why churches and clubs and furries exist. I believe that in order to realize our sense of self, we need to define what that means in relation to others. The internet has made the world a very big place and a very small place at the same time. Therefore, defining yourself is both very difficult and very easy.

The methods of doing this are outlined by Klapperstuck and Kearns (pause while I Google). We use social networking sites to tell the world who we are and to meet others like us. We keep in contact with our friends via text message and chat. We blog in order to make our thoughts and feelings public, hoping to somehow find some empathy (and certainly not because we are being graded on it). Somewhere within all those servers and databases, in between the nooks and crannies of the hashmarks and at-marks, is our identity.

We have become the internet.

P.S. Emily Yoffe definitely has more Facebook friends than I do. Did you know she took a vacation at a nudist colony?

Monday, July 25, 2011

He saw the symbol on the warning sign and understood



I'm beginning to think they purposely limit the number of EDUC 504 classes we have during the summer. It is a conspiracy built upon the laws of supply and demand: when supply is scarce and demand holds steady, value rises. Therefore, every one of our six meetings becomes precious, each like a grandfather's silver pocket watch wrapped in an oilcloth and tucked safely into the corner of a desk drawer.

The value of Friday's class was in its dichotomy of activity. One half of the class was spent learning how to use Aviary's Myna tool to build a PSA, preliminary practice for making the podcast due at the end of the week. Back in the day, I did that sort of thing on a regular basis, so it was exciting to see a modern application for tired, old skills.

The other half of class was a discussion on the video game readings. Or was it? The first part of the discussion dealt with deciphering an excerpt of Xu Bing's "Book from the Ground." A sample of the excerpt is shown in the picture above, except without the convenience of a written translation. We delved into the origins of literacy and language, surmising that human beings have a natural instinct for language structure. Our difficulty in understanding different languages arises from our solidified language schema -- we become comfortable within the structures of our primary tongue and run into obstacles when we encounter languages that do not fit within our established structure.

I found that the first part of the discussion loosely connected to the second part, which was actually about video games. I mentioned that we become comfortable within our primary language, thus making other languages sound like shit. Learning, be it other languages or how to win at a video game, involves a certain amount of risk-taking. We must necessarily depart from our comfort zones in order to make sense of the unknown. In order to form new schema, we cannot always depend upon the supports of the old.


Friday, July 22, 2011

Gee

During the entirety of the Gee reading, all I could really think about was this:



Gee gee gee gee gee

Thursday, July 21, 2011

So I'm glad I got burned, think of all the things we learned for the people who are still alive


About a year ago, film critic Roger Ebert stated on his blog that "video games can never be art." During an exchange with video game designer Kellee Santiago, Ebert said:
"No one in or out of the field has ever been able to cite a game worthy of comparison with the great poets, filmmakers, novelists and poets." To which I could have added painters, composers, and so on, but my point is clear.
Former attorney Jack Thompson, an anti-video game activist, refers to video games as "murder simulators," and blames them for three shooting deaths in Alabama committed by an 18-year-old in 2003.
The video game industry gave him a cranial menu that popped up in the blink of an eye, in that police station. And that menu offered him the split-second decision to kill the officers, shoot them in the head, flee in a police car, just as the game itself trained him to do.
The societal value of video games is still open to interpretation, and Gee's article adds to the debate. The above video clip is from "Portal," a game released by Valve in 2007. As the clip demonstrates, the beginning of the game consists of a series of tests in which the player has to perform tasks in order to move on to the next level. Each test is a lesson that teaches a particular skill or game mechanic and builds upon previous lessons. As the player progresses further into the game, the lessons stop and he/she must apply skills and use knowledge of game mechanics in order to succeed.

According to Gee, "good video games incorporate good learning principles." To see whether Gee is correct, we must verify two things: that "Portal" was indeed good, and that it had good learning principles.

Here is a list of awards "Portal" has won (from Wikipedia):

  • At the 2008 Game Developers Choice Awards, Portal won Game of the Year, along with the Innovation Award and Best Game Design.
  • IGN.com honored Portal with several awards, for Best Puzzle Game for PC and Xbox 360, Most Innovative Design for PC, and Best End Credit Song (for "Still Alive") for Xbox 360, along with overall honors for Best Puzzle Game and Most Innovative Design.
  • In its Best of 2007, GameSpot honored The Orange Box with four awards in recognition of Portal, giving out honors for Best Puzzle Game, Best New Character(s) (for GLaDOS), Funniest Game, and Best Original Game Mechanic (for the portal gun).
  • Portal was awarded Game of the Year (PC), Best Narrative (PC), and Best Innovation (PC and console) honors by 1UP.com in its 2007 editorial awards.
  • GamePro honored the game for Most Memorable Villain (for GLaDOS) in its Editors' Choice 2007 Awards.
  • Portal was awarded the Game of the Year award in 2007 by Joystiq, Good Game, and Shacknews.
  • X-Play gave it the Most Original Game award.
  • In Official Xbox Magazine's 2007 Game of the Year Awards, Portal won Best New Character (for GLaDOS), Best Original Song (for "Still Alive"), and Innovation of the Year.
  • In GameSpy's 2007 Game of the Year awards, Portal was recognized as Best Puzzle Game, Best Character (for GLaDOS), and Best Sidekick (for the Weighted Companion Cube).
  • The A.V. Club called it the Best Game of 2007.
  • The web comic Penny Arcade awarded Portal Best Soundtrack, Best Writing, and Best New Game Mechanic in its satirical 2007 We're Right Awards.
  • Eurogamer gave Portal first place in its Top 50 Games of 2007 rankings.
  • IGN.com also placed GLaDOS (from Portal) at #1 on its list of the Top 100 Video Game Villains.
  • GamesRadar named it the best game of all time.

Wired considered Portal to be one of the most influential games of the first decade of the 21st century, believing it to be the prime example of quality over quantity for video games.


Good? Check.

As for good learning principles, I was originally going to launch into an in-depth discussion and analysis of "Portal" and Gee's learning principles, as well as fit it within the framework of Bloom's Taxonomy. However, while doing research before launching headfirst into what I am sure would have been a fantastic essay, I stumbled upon this:


I will save you the trouble of reading the entire article. The author cites "Portal" as an excellent example of how new media can be used to engage players "in the difficult process of learning new skills and making difficult conceptual leaps" (Schiller 2008). Players are scaffolded as new knowledge is introduced, and supports are removed once proficiency is demonstrated. Eventually, the players become completely independent of the instructional structure and are left to strategize and apply knowledge on their own.

Good learning principles? Check.

Also, the article is totally about librarians, written by a librarian. Knowing my audience is an important pedagogical skill. WHAT UP KRISTIN.

P.S. Before I start getting suspicious sidelong glances during class, below is the ending theme song to "Portal," which is referenced by the title of this post:



My Favorite Superhero

Summer is well underway. The Art Fest is in full effect, basically turning the streets of downtown Ann Arbor into an open-air market. I believe it is a distinctly American habit to make ritual and celebration out of things that are never cause for ritual and celebration any place else. Here, when people sell homemade trinkets from stalls that line crowded streets, we call it a "festival." In a lot of other places in the world, people call it "a living." I feel the same way about apple picking. Listen, I am not going to go on any kind of "hayride" under the working condition of "ALL-U-CAN-PICK" unless I am guaranteed some kind of union representation.

At the same time, I think our culture tends to take for granted things that do indeed warrant celebration. Wireless internet connectivity is a miracle. We have at our disposal relatively tiny machines that literally pluck millions of bits of data out of thin air, yet we feel frustrated when this marvel is not running at peak efficiency. We complain about fluctuating gas prices, but forget to be thankful that we own cars. And we turn to schools to remedy all of society's ills, but tip our teachers with apples. I suppose I could look at the silver lining and be thankful that most of us have some kind of union representation.

Summer also brings us superhero movies. We've had X-Men: First Class, which sounds like a movie about education (it isn't). We saw Thor, who seems to be cheating at the superhero gig (he is a god). Now, it's Captain America, who is just a big guy with a shield. However, my favorite summer superhero is not any of these guys.

It is the librarian.

It just got real.

The librarian is truly great. She is the master of information, which, in my opinion, totally trumps all the other superpowers combined. Like a superhero, the librarian has also evolved into someone superhuman. The X-Men inherited their abilities through genetics. Captain America was injected with a super serum. Librarians used to be those nicely dressed people who sat behind the circulation desk at the library, book scanner in one hand and due-date rubber stamp in the other, head full of thoughts of the Dewey Decimal system. Then, the world changed. Technological breakthroughs have led to a flood of information. We can read books without physically possessing them. We can learn about virtually any subject in a matter of seconds. We can create and publish our thoughts on anything. However, this horde of information is wild, a mass of gibberish that can overload the senses and assail the mind. Who will save us as the world falls under the dark shroud of over-information? Who will help us separate reliable sources from the flim-flam?

It is the librarian. She cast off her cat-eye glasses, pulled the pencil from her hair bun, and emerged from behind the circulation desk. She traded the scanner and rubber stamp for a MacBook Pro and a smartphone.

I never really understood what librarians could do before I encountered Marija and Jan. When I refer to librarians as masters of information, I do not mean to imply their omniscience. Again, information is everywhere. Librarians' mastery of information lies in their mastery of the technology used to retrieve, organize, and create that information. I am amazed almost every week. First, I discovered how RefWorks can produce instant bibliographies. Then, I was shown the multitude of databases within the Internet Public Library (ALL-U-CAN-PICK). I once made the mistake of saying the word "microfiche" while suggesting primary sources for the tsunami lesson plan, and was immediately disintegrated by Jan's eye lasers.

From the perspective of this teacher-candidate, the future looks pretty dire. I must simultaneously be an educator, a researcher, a reformer, a politician, a colleague, a savior, and a scapegoat. When I do a good job, there is no cause for celebration, because compared to art fests and superheroes, I am nothing special. It's a living. But as long as I have a librarian in my corner (to hold me), everything might actually turn out OK.

Monday, July 18, 2011

Twitter-pated





Friend Owl has listed the symptoms: weak knees, head in a whirl, air-walking, knocked loops, and lost heads. The diagnosis? I think I love Twitter.

I know this is bad. My reputation is on the line. Why, it was just in my last post that I publicly declared my love for Facebook and explained how Twitter was deficient in so many different ways. Now, through the medium of HootSuite, my impressions have completely changed. Twitter is no longer the clunky, poor excuse for a social networking site that I always believed it to be. Now, it is quick and fun. I can set up multiple streams that automatically retrieve tweets containing keywords of interest and display them in an easy-to-read format. I can see a list of my tweets that were re-tweeted by others, so I can more or less calculate my net social worth. I have hashmarks and I am not afraid to use them.

Meanwhile, Facebook thought it would be a fantastic idea to change how their chat system works, much to the chagrin of the world. Seriously? When it comes to social networking sites, I like to enjoy a certain level of customization. I want to be able to display information that I either want to display, or has some particular use to me. I like features to be easy to use, without the need to navigate cluttered menus. But above all, once I have become comfortable with the look and feel of a site, I do not appreciate it being changed, especially without notice or consent.

I tend to see connections between relationships and things that normally have nothing to do with relationships. For instance, I once related concept maps to serious relationships: I want them to work and I see their worth, but I don't seem to be any good at doing them.

Applying that same sort of metaphor, I feel as if Facebook and Twitter are fighting for my affections. Facebook and I have been going strong for a long time now, but sometimes I wake up and feel like I don't recognize it anymore. Back in the day, we were exclusive and had a simple relationship. Gradually, things began to change. First, it started seeing a lot of other people, offering itself to high school students and people who worked for different companies. Then it started playing games with me. Now, we can't even talk anymore. I mean, I admit that Google+ and I have been hanging out a lot lately, but honestly I was thinking of you the whole time.

Twitter and I met a couple years ago. We both had similar interests and we knew a lot of the same people, but we didn't really hit it off right away. Looking back, I've noticed that Twitter was present at some really crucial moments in my life. I tweeted when I found out my family in the Philippines was OK after some major flooding. I also randomly tweeted on the very day I decided that I wanted to become a teacher. Class last Friday really helped me see Twitter in a new light. Twitter is accommodating in that it lets me do my own thing, but is always there for me when I need it. Twitter always has something interesting to say. Twitter might be involved with a lot of other people, but it still makes me feel special. Twitter doesn't play games. In terms of my own professional development, Twitter provides me with a community of educators who are available to guide me and support me, cradling me in digital arms and letting me know that despite what we learn in EDUC 649, everything is going to be alright.

Because sometimes, Facebook, I just need to be held.

Friday, July 15, 2011

What did you call me?

So, I guess I'm a "tweacher" now.

Not gonna lie, I'm a little apprehensive about using Twitter and incorporating it into the classroom. Admittedly, I have a Twitter account. The content has no substance; the bulk of my tweets are "re-tweets" revolving around the latest happenings in the world of Korean pop music. Moreover, like a weak and starving baby bird, my tweets are few and far between.

My biggest reason for Twitter-neglect is that I fail to see its utility. Facebook provides nearly the same function, but offers much, much more. I don't have to limit my information to a 140-character post. I can make a note and type to my heart's content. I can start a Facebook group or page. I am admittedly a twit (the non-tweeting kind) when it comes to Twitter, so perhaps I need to truly get intimate with it before I make judgments, but in my opinion, anything Twitter can do, Facebook can do better.



However, I must also recognize that Twitter is rather new in the social networking game and that I am a human being. I am prone to homeostasis. I was extremely angry when Firefly was cancelled. Although change has been a central experience of my adult life, I am averse to it. All this talk of incorporating Twitter into the classroom makes my head hurt, like songs with auto-tune.


To tweet is to make myself vulnerable to the realization that I am growing old.


***UPDATE***
HootSuite is AMAZING. Tweeting is SO FUN.

Monday, July 11, 2011

Praxis and Teach Me How to Dewey

I googled the definition of "praxis." Here's what came up:

prax·is /ˈpraksəs/ Noun

1. Practice, as distinguished from theory: "praxis of Marxism".
2. Accepted practice or custom.

I boil down our entire SMAC program experience to praxis. In the comfort of Room 2229, we familiarize ourselves with educational theories with the intention of putting them into practice. Right now, we bow our backs as we trudge through classes on research, reform, content literacy, records of practice (which conveniently provided this blog with its name), and of course, technology. One of the major benefits of having a sibling complete the program is possessing a realistic sense of the future. Soon, we will be juggling coursework with our student teaching, struggling to complete our assignments while in turn giving our students assignments. Not long after, teaching will become a full-time commitment. We will spend less time sitting inside a classroom, and more time standing in front of one. Although the thought seems strange to me at present, in less than a year our metamorphosis will be complete, and we will finally emerge from throbbing cocoons as legitimate teachers, albeit with our realized wings still moist and glistening.

I suppose I should cease my rambling and discuss more relevant things, like praxis and technology. I am beginning to come to terms with today's academic landscape. During my elementary and middle school years, I remember doing research for speeches and book reports. Back then, success in those assignments meant going to the library's card catalog, looking up your subject, and then hunting down the indicated books. Once said books were captured, you had to open them up and render useful information from their entrails. Research and its trials were a necessary part of the learning process; knowledge was wrested from a savage and barren wasteland.

Today, that is definitely not the case. Anyone who can manage a keyboard and decipher the glowing symbols can use a computer -- which is, I reckon, anyone aged 2 years and up. Information is quite literally at our fingertips. The research beasts of my childhood have all gone extinct. The landscape is green, lush, and alive. Knowledge now hangs low on branches like ripe fruit, fibers strained to near bursting, and the air is permeated with its aroma. The fact that I could google the word "praxis" is evidence of how much has changed.

Which brings me to Dewey. If I were writing in another medium, I would probably launch into a brief biography and then mention his impact on educational reform. But I'm not going to do that. Information is quite literally at your fingertips. Go ahead and google him. Instead, I want to relate his "Pedagogic Creed" to technology. I won't spend very long doing this. In a nutshell, Dewey believed that real, present-day life was central to education. Since real life exists in society, schools should teach students how to interact with society in the here and now. Given that the here and now is increasingly dominated by technology, it is only natural that education should make use of it.

Thus, my paradigms of how and what to teach are also becoming extinct. Book reports, in their old form, can be completed within 15 minutes without having to read the book, since key information such as characters, settings, main plot points, and themes is readily available. Up until a few days ago, I was convinced that there was a major divide between what students "know" and what they can look up. I've decided that there is no difference; they are one and the same. Having knowledge is simply a matter of retrieval, and whether it comes from long-term memory or a computer database is a distinction that is steadily losing importance.

And so, this is what I am going to be focusing on during the course of the program: How can I take theory and turn it into practice? Or in other words: Teach me how to Dewey (teach me, teach me how to Dewey).