
Tag Archives: New York Times

This post originally appeared on Cyborgology – read and comment on the post here.

Discussing the relative strengths and weaknesses of education as it occurs on and offline, in and outside of a classroom, is important. Best pedagogical practices have not yet emerged for courses primarily taught online. What opportunities and pitfalls await both on and offline learning environments? Under ideal circumstances, how might we best integrate face-to-face as well as online tools? In non-ideal teaching situations, how can we make the best of the on/offline arrangement handed to us? All of us teaching, and taking, college courses welcome this discussion. What isn’t helpful is condemning a medium of learning, be it face-to-face or via digital technologies, as less real. Some have begun this conversation by disqualifying interaction mediated by digitality (all interaction is, by the way) as less human, less true and less worthy, obscuring the path forward for the vast majority of future students.

This is exactly the problem with the op-ed in yesterday’s New York Times titled “The Trouble With Online Education.” Read More »


This post originally appeared on Cyborgology – read and comment on the post here.

 This is part of a series of posts highlighting the Theorizing the Web conference, April 14th, 2012 at the University of Maryland (inside the D.C. beltway). See the conference website for information as well as event registration.

Experiencing global events through social media has become increasingly common. For those in the West, the uprisings of the past few years in the Middle East, North Africa and elsewhere were especially striking because social media filled an information void created by the absence of traditional journalists covering the dramatic events. By simply following a hashtag on Twitter, we tuned in to those on the scene, shouting messages of revolution, hope, despair, carnage, persistence, misinformation, debate, sadness, terror, shock, togetherness; text and photos brought us seemingly closer to the events themselves.

But of course the Twitter medium is not neutral. It has shaped what we see and what we do not. Where is the truth in all of this? The intersection of knowledge, power, struggle and the radically new, transformative capacities of social media begs for intense theorizing. How we conceptualize, understand, define and talk about this new reality lays the path toward better uses of social media for journalistic and political purposes.

This is why the keynote for the Theorizing the Web 2012 conference (College Park, MD, April 14th) features Andy Carvin (NPR News) and Zeynep Tufekci (UNC) in conversation. Carvin (@acarvin) has become well known for his innovative use of Twitter as a journalistic tool. Tufekci (@techsoc) has emerged as one of the strongest academic voices on social movements and social media and brings a theoretical lens to help us understand this new reality. Together, they promise insights that reach beyond journalism to all researchers of technology, as well as to those outside academic circles.

Who Is Andy Carvin, and What Do We Call Him?

Carvin does not have a deep background in professional journalism; his actual title at NPR is “Senior Strategist.” However, Read More »

This was originally posted at my blog Cyborgology – click here to view the original post and to read/write comments. 

PJ Rey and I have been following the 2012 presidential campaign on this blog with social media in mind. We watch as President Obama and the Republican contenders try to look social-media-y to garner dollars and votes. However, their social media use has thus far been more astroturfing than grassroots: more photo ops staged to appear tech-savvy than genuine uses of the web to make politics something that grows from the bottom up. Presidential politics remain far more like Britannica than Wikipedia.

But this might all change, at least according to Thomas Friedman yesterday in the New York Times. He describes Americans Elect, a non-profit attempting to build an entire presidential campaign from the ground up. This might be our first glimpse of an open and social presidential web-based campaign. From their website,

Americans Elect is the first-ever open nominating process. We’re using the Internet to give every single voter — Democrat, Republican or independent — the power to nominate a presidential ticket in 2012. The people will choose the issues. The people will choose the candidates. And in a secure, online convention next June, the people will make history by putting their choice on the ballot in every state.

Read More »

David Carr recently wrote a piece in the New York Times in which he states,

Add one more achievement to the digital revolution: It has made it fashionable to be rude.

Has it?

The article is about how people are increasingly gazing into little glowing screens when in physical space. Carr views this as a “mass thumb-wrestling competition” where we are “desperately” staring at devices instead of making “actual” connections. And it is his usage of “actual” here that tips us off to why he has such a negative view of people looking at screens: he, like so many others, suffers from digital dualism. I’ve critiqued Amber Case, Jeff Jarvis and others on this blog for failing to make the conceptual leap that the digital sphere is not a separate space like The Matrix but that reality is instead augmented. I’ve been through the argument enough times on this blog that I’ll just refer you to the links and move ahead.

Carr’s digital dualism begins in his description of people looking at phones while at South By Southwest this past spring, something he then uses as evidence for the larger problem of increasing disconnectedness. He argues, Read More »

The New York Times recently ran an exposé on teen “sexting” as part of a slew of articles on the topic. Unfortunately, this article failed to take into account the fact that teens, especially girls, have sexual desire. A couple of quotes from the article:

“Having a naked picture of your significant other on your cellphone is an advertisement that you’re sexually active to a degree that gives you status,” said Rick Peters, a senior deputy prosecuting attorney for Thurston County.

Perhaps, but what about the fact that the teen might want to enjoy the photo for themselves, too? Inner desire is continuously ignored in the article in favor of the view that teens (again, especially females) engage sexually in order to please others.

“You can’t expect teenagers not to do something they see happening all around them,” said Susannah Stern, an associate professor at the University of San Diego who writes about adolescence and technology. “They’re practicing to be a part of adult culture.”

Teens do not need anyone to tell them to play show-me-yours. More than practicing for when they get older, teens are also attempting to explore and enjoy their sexuality in the present. It is not just adults who have sexual desire. In fairness, the New York Times did run another article that quotes teens on the topic, who are clear that sexting is the result of desire. So, why do most articles dismiss this fact?

I can accept that culture influences sexual behaviors (I am a sociologist, after all), but to not even bring sexual desire into a conversation about sexting is erroneous. Acknowledging teen sexual desire should be at the center of how we deal with the issue of sexting moving forward. We should be promoting sexual agency, not dismissing it. Better than shaming teens would be starting a conversation about how they can best express themselves sexually at their age.

There are consequences to this perspective that views teen sexual behavior not as stemming from desire but only as something taught. Adults too often feel they can simply squash teen sexuality through shaming and even criminalization. A scenario described in the article, and occurring all too often, is teens being escorted from school in handcuffs, locked up and forced to register as sex offenders simply because they shared nude photos with a significant other their own age. This over-reaction demonstrates Michel Foucault’s point: by seemingly ignoring teen sexual desire, we have only succeeded in turning it into an obsession.

The rant that anything digital is inherently shallow, most famously put forth in popular books such as “The Shallows” and “Cult of the Amateur,” has become quite predictable. Even the underlying theme of The Social Network movie was that technology trades the depth of reality for the shallowness of virtuality. I have asserted that claims about what is more “deep” and “real” are claims to truth and thus claims to power. This was true when this New York Times panel discussion on digital books made constant reference to the death of depth, and it is still true in the face of new claims regarding the rise of texting, chatting and messaging using social media.

Just as some lamented a loss of depth in the move from the physical to the digital word, others now claim a loss of depth in the move from email to more instant forms of communication. E-etiquette writer Judith Kallos claims that because the norms surrounding these newer, instant forms of communication do not adhere as strictly to grammatical rules, the writing is inherently “less deep.” She states that

We’re going down a road where we’re losing our skills to communicate with the written word

and elsewhere in the article another concludes that

the art of language, the beauty of language, is being lost.

There is much to critique here. Equating “depth” with grammatical rules grants those with more formal education the satisfaction of also being “deeper.” Depth is not lost in abbreviations, just as it is not contained in spelling or punctuation. Instant streams of communication pinging back and forth have the potential to be rich with deep, meaningful content. Read More »

Life is rough for men wealthy enough to own an iPad: “how to carry it in a manner that is practical and yet, well, masculine.”

This is from a New York Times story that chronicles the danger the iPad poses to a man’s masculinity: specifically, the need for a carrying case that does not look too much like –gasp!– a woman’s purse. The horror of appearing slightly feminine runs so deep that CNET ranks bags with a “humiliation index” (better called a “heteronormativity index”).

The story turns especially dark when we learn that Apple’s neglect has left some men unable to leave the house with their iPads, or, even worse, unwilling to buy one at all for fear of not appearing masculine enough. But there is hope for these rich males: “Scottevest plans to introduce an iPad-compatible blazer in time for Christmas.” See the manvertising here.

The New York Times recently ran a story about how “The Web Means the End of Forgetting.” It describes a digital age in which our careless mass exhibitionism creates digital documents that will live on forever. The article is chock-full of scary stories about how ill-advised status updates can ruin your future life.

These sorts of scare-tactic stories serve a purpose: they provide caution and give pause regarding how we craft our digital personas. Those most vulnerable should be especially careful (e.g., a closeted teen with bigoted parents; a woman with an abusive ex-husband). But after that pause, let’s counter the sensationalism of the Times article by acknowledging that, with some common sense, the risks for most of us are actually quite small.

1-Digital Content Lives Forever in Obscurity

The premise of the article is that what is posted online can potentially live on forever. True, but the reality is that the vast majority of digital content we create will be seen by virtually no one. Sometimes I think these worries stem from a vain fantasy that everything we type will reach the eyes of the whole world for all time. Sorry, but your YouTube video probably isn’t going viral and few people will likely read this post.

What interests me about digital content is that it is on the one hand potentially immortal and on the other exceedingly ephemeral. In fact, it is precisely digital content’s immortality that guarantees the very flood of data that makes any one bit exceedingly ephemeral, washed away in the deluge of user-generated banality. Jean Baudrillard taught us that too much knowledge is actually no knowledge at all because the information becomes unusable in its abundance. This is what millions of people tweeting away amounts to: an inundation of data, most of which will never be read by many and will probably be of little consequence [edit for clarification: I like Twitter].

If anything, one problem with social networking applications like Facebook or Twitter is that they do a poor job of archiving and making searchable specific past content. A quick glance at Facebook reveals that I cannot search my friend’s history of status updates. Looking at my Twitter stream, I cannot even find my oldest tweets. My digital content may live forever, but it does so in relative obscurity.

2-Flaws Are Forgivable, Perfection Is Not

The article draws from a quote about how the immortality of digital content…

“…will forever tether us to all our past actions, making it impossible, in practice, to escape them” […] “without some form of forgetting, forgiving becomes a difficult undertaking.”

I disagree. As we increasingly live our lives online, always indexable, it should be expected that many of us will have some digital dirt on our hands. Instead of the idea that we won’t be able to forgive each other for not being perfect, new realities will change our expectations. I suspect being an imperfect human being will be just as forgivable as it always has been.

In fact, it very well might be the too-perfect profile that is unforgivable. As any politician knows, you cannot look too clean and sterile, or else you come off as phony. A too-polished, too-perfect profile is increasingly a sign that you are not living with technology and making it part of your life -and thus seem a bit technologically illiterate. The overly manicured profile screams that you are not out there using social media tools to their full potential.

In conclusion, use scare-tactic articles like the one being commented on here to remind you that what you say indeed might come back to haunt you. But do not go overboard worrying and cleaning your digital presence. Yes, riding your bike or eating chicken might get you killed (potholes and salmonella scare me more than Googling my name), but we are willing to take these risks because they are exceedingly small. Be smart, don’t post about your boss, but, in any case, the vast majority of people posting status updates about their job today will not get fired tomorrow. ~nathan

As media became truly massive in the middle of the 20th century, many theorists discussed the degree to which individuals are powerless -e.g., McLuhan’s famous “the medium is the message.” In recent decades, the pendulum of dystopian versus utopian thinking about technology has swung far in the other direction. Now, we hear much about the power of the individual, how “information wants to be free” and, opposed to powerful media structures, how the world has become “flat.” The story is that the top-down Internet was “1.0” and now we have a user-generated “Web 2.0.” The numbering suggests a linear march of increasing democratization and decreasing corporate control.

The pendulum has swung too far.

I have tried to argue elsewhere (here and here and here) that Web 1.0 and 2.0 both exist today, sometimes in conflict, other times facilitating each other. On this blog, I have noted that sometimes “information wants to be expensive” and how the iPad marks a return to the top-down as opposed to the bottom up. Zeynep Tufekci and I have a paper under (single blind) review that discusses the iPad as the return of old media and consumer society by way of Apple’s Disney-like closed system.

Steven Johnson recently wrote a powerful op-ed in the New York Times titled “Rethinking the Gospel of the Web” that makes a similar argument. He portrays Apple’s closed system as incredibly innovative, stating that “sometimes, if you get the conditions right, a walled garden can turn into a rain forest.”

Opposed to the current orgy of writing about the powerful agent/consumer, Free, democratization, revolutionary potential, flat worlds and so on, let’s remember how structures and top-down corporate control remain important.

  • access is still unequal
  • how people use the web is unequal, something I’ve discussed as the post-structural digital divide
  • the “revolutions” of Wikipedia or open source amount to knowledge or software once produced by a few white men now being produced by a few more white men (revolutionary this is not)

This world is not flat, and if the success of Apple is any indication, it is not getting any flatter. ~nathan

by nathan jurgenson

I’ve written many posts on this blog about the implosion of the spheres of production and consumption, indicating the rise of prosumption. This trend has exploded online with the rise of user-generated content: we both produce and consume the content on Facebook, MySpace, Wikipedia, YouTube and so on. And it is through this lens that I describe Apple’s latest creation, announced yesterday: the iPad. The observation I want to make is that the iPad is not indicative of prosumption but rather drives a wedge between production and consumption.

From the perspective of the user, the iPad is made for consuming content. While future apps might focus on the production of content, the very construction of the device discourages these activities. Not ideal for typing, and most notably missing a camera, the device limits the ways in which users can create content. Further, the device, much like Apple’s other devices, is far less customizable than the netbooks Apple is attempting to displace (which often run the endlessly customizable Linux OS).

Instead, the iPad is focused on an enhanced, passive consumption experience (and advertised as such, opposed to their earlier focus: can’t resist). Unlike netbooks, the iPad is primarily an entertainment device. Instead of giving users new ways to produce media content, the focus is on making the experience of consuming old media content more spectacular and profitable -music and movies via the iTunes store, books via the new iBookstore and news via Apple’s partnership with the New York Times.

Thus, the story of the iPad’s first 24 hours, for me, is the degree to which the tasks of producing and consuming content have again been split into two camps. The few produce it -flashy, glittering and spectacular- and the many consume it as experience. And, of course, for a price.

Does this serve as a rebuttal to an argument that the trend towards the merging of the spheres of production and consumption into prosumption is inevitable? Or is prosumption indeed the trend for a future Apple seems not to grasp? Or will the applications developed for the device overcome its limitations? ~nathan