
Journalism and blogging: leave it to the machines?


In science and science fiction there’s a moment when it all goes to custard for the human race. It’s the singularity – often defined as the time when machines begin to out-think humans.

We’re not there yet and I’m comfortable with predictions that it might happen 200 years after my demise. But you can never really trust futurist predictions.

We’ve already got smart(ish) bots hurtling around the interWebs chewing up data and spitting it out again in a clickable and commercial form, so I’m not too sanguine about what’s going on in the DARPA labs and other murky salons where “mad” scientists and uber-smart geeks tend to gather.

Anyway, there is evidence of not-so-smart machines out there already aggregating, redacting and posting prose that fills the holes between advertising links on some remote outposts of the blogosphere.

Take, for example, Biginfo, the website with the unbeatable cyber-catchline: “All of your info, on one page”.

Isn’t that the holy grail of the Internet? Isn’t this slogan the absolute bottom-line mission statement for Google?

We won’t need humans any more if Biginfo succeeds.

I know about Biginfo because the site has linked to a post here at Ethical Martini. As you do, I went to check out why the site was linking to me and pushing some traffic my way.

This is what I found:

What is More Ethical Blogs or News Media?

20 October, 2009 (15:10) | News And Society | By: admin

// your advertisement goes here

We are chance more and more that readers conceive the aggregation contained in Blogs is more trusty than the indicant programme media. (I don’t conceive a candid comparability between the electronic media and Blogs makes such sense, so my comparability is direct: cursive touchable vs. cursive material.) While I encounter this agitate in ‘believability’ to be somewhat surprising, I staleness adjudge that I don’t conceive I personally undergo anybody that reads the production without a nagging distrustfulness and a taste of doubt. Even more, I move to be astonished at the ontogeny sort of grouping I undergo that do not modify pain to feature the newspaper.

The long post goes on in this vein at some length. Here’s another of my favourite paras:

I module substance digit appearance on the supply of blogs vs. newspapers. A blogger, aforementioned me, is attractive the instance to indite most an supply that I poverty to indite most and that I see passionately about. Question: so, what most the mortal of ethics? Answer: I do not hit a deadline, I hit no application that is biased, and I modify intend to indite my possess headline!

I am willing to believe that this is a machine translation of something written in another language (possibly Chinese?) by a blogger, and that in its original iteration it makes great sense. If it had been translated by a moderately proficient human it would probably also be readable and cogent.

Are we redundant? Should we retreat and leave the web to dribblejaws who find it a convenient medium to feed their conspiracy theories and ugly prejudice?

I certainly hope not. Continue reading if you’d like to know more about the singularity.

Convergence culture—a moment of singularity?

Welcome to the brave first decade of the twenty-first century, a decade which will destroy more science fiction futures than any ten year span that preceded it.

Charles Stross, Toast

In an introduction to Toast, his collection of short stories, science fiction writer Charles Stross writes that a “fogbank of accelerating change” seems to “swallow our proximate future”. He also writes that the pace of change in the period from the late 1960s to the present has been faster than at any other time in history. He adds that today “if anything, it’s accelerating” and that technological change is “one-way”. There’s no going back. Stross speculates, as sci-fi writers are encouraged to do, that the world may be heading towards what mathematician and computer scientist Vernor Vinge describes as a “singularity”. He writes: “At the singularity, the rate of change of technology becomes infinite; we can’t predict what lies beyond it” (Stross, 2003, p. 13). For cultural studies theorist Henry Jenkins, this singularity—the birth of convergence culture—is defined by the clash of old and new, particularly in terms of media forms and media platforms. “Contradictions, confusions, and multiple perspectives should be anticipated at a moment of transition where one media paradigm is dying and another is being born”.

In his 2004 novel Singularity Sky, Stross uses the character of Burya Rubenstein—revolutionary leader and journalist—to further define the singularity as “a historical cusp at which the rate of change goes exponential…the suddenly molten fabric of a society held too close to the blowtorch of progress”. For Stross, one aspect of a technological singularity is the point at which computer intelligence begins to out-think the human brain. According to computing and robotics professors cited by Stross, that juncture is about 25 years away (2035) and “we cannot possibly know what life will be like” once artificial intelligence (AI) gets beyond our ability to control it. To be honest, I’m not sure if Stross is right; there’s no doubt that scientists are working on the concept of artificial intelligence, but can they produce a super-computer with thinking abilities that would be as incomprehensible to us “as ours are to a dog or a cat”?

We will find out soon enough, but the point here is that I think Stross is right about the rest of the singularity thesis. This disorienting dialectic (the individual against the world) appears to be predicated on technological change that is constant and seems to be speeding up, following Moore’s law of computing power doubling every 18 months. It is also clear that we are no longer analogue; we are digital, and the convergence of computing and communication technologies is almost complete. In that sense perhaps we have already experienced a mini-singularity in terms of technology. Machines are getting smarter, but perhaps not quite in the semi-human AI way. Not yet, anyway.
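
A quick aside on the arithmetic: the popular “doubling every 18 months” reading of Moore’s law compounds very quickly. The little Python sketch below is my own illustration, not anything from Stross or Deuze, and the 18-month doubling period is the commonly quoted figure rather than a precise constant; it simply shows how that rate stacks up to roughly a hundredfold increase per decade.

# Illustrative only: compound growth under an assumed 18-month doubling period.
def moores_law_factor(years, doubling_period_years=1.5):
    # Growth factor after `years`, if capacity doubles every `doubling_period_years`.
    return 2 ** (years / doubling_period_years)

for years in (1.5, 5, 10, 20):
    print(f"After {years} years: roughly {moores_law_factor(years):,.0f}x")

# Output: roughly 2x, 10x, 102x and 10,321x, i.e. the compounding behind the
# “accelerating change” Stross describes.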

We’ve been digital now for long enough for it to have far-reaching and non-reversible effects on our lives, the ways we work and relax and many of our cultural norms. We are already talking on a daily basis about the new social values of digital “cyberculture” in terms that media scholar Mark Deuze (2006, p. 63) describes as “an expression of an increasingly individualized society in a globalized world”. News organisations were among the first to embrace convergence, mainly for reasons of business economics, but as Australian journalism scholar Stephen Quinn (2005) observed, it also established a new dialectic—between the commercial expectations of the news capitalists and the aspirations of working journalists. Only one thing was ever certain as convergence culture grew organically from the digital revolution: as Charles Stross writes in Toast: “The future is not going to be like the past any more—not even the near future.”

This is already evident in popular culture, as media sociologists Shayne Bowman and Chris Willis noted; ideas, products, trends, styles and social mores appear to “accelerate their way from the fringe to the mainstream with increasing speed” (2003, p. 7). Media scholar Naren Chitty describes the impact of globalisation on the news media in similar terms. He says that the global and the local are now intimately intertwined in the global economy and through globalising cultural linkages: “Globalization has made the local explode in the global and the global implode on the local” (Chitty, 2000, p. 14). The impact on the global media has also been profound. The managing director of the Australian Broadcasting Corporation, Mark Scott, noted this in his 2009 La Trobe University annual media studies address. Scott said that the media is in a state of “transition and turmoil” and on a “revolutionary road” that has turned the old certainties of the media industry on their head (Scott, 2009):

Daily, doomsayers beat the drums for newspapers, free-to-air television, regional media and investigative journalism.

Mark Scott, La Trobe speech 2009

Mark Scott also mentioned the global recession as a major factor in the decline of media advertising revenues and bottoming share prices. The global downturn, he argued, is having “profound effects” on the global business of news and media. The shift has been so sudden and the fall so steep that Scott did not appear hopeful of a significant recovery any time soon. “Executives, particularly those in the newspaper business, wonder whether the good times will ever come back”. Perhaps, at one level, it’s too late because cyberculture turns the mediasphere on its head. It seems there’s no escaping what Deuze refers to as “an emerging value system and set of expectations as particularly expressed in the activities of news and information media makers and users online” (2006, p. 63).

In Burya Rubenstein’s terms, we could argue that globalisation and, more importantly, the global financial crisis of 2008-2009 has “ripped up social systems and economies and ways of thought like an artillery barrage”, a “hard take-off singularity” (Stross, 2004, p. 163). When globalisation and economic crisis is combined with the mini-singularity of digital technologies, we experience the resulting state of profound change as chaos and flux. As Stross has another character say to Burya Rubenstein at a critical point in Singularity Sky: “Talk you of tradition in middle of singularity.”

Well, actually: “Yes!” We have to talk of tradition, because the present and the future are products of historical events and forces; the basic elements of convergence culture have been present in our world for most of the twentieth century—the telephone, the television and (in the last 80 years at least) the computer. So we should not be surprised by the strength of convergence culture; it is the culmination of historically situated processes that have percolated through the filter of combined and uneven development for nearly a century. It is an uneven process in that there is not one form of digital culture that sits snugly across all social formations, nor across all individuals and groups within one particular social formation.

We are all familiar with the term digital divide, which signifies those with good access to the Internet and other digital technologies on one side and those with little or no access on the other. According to global statistics for Internet penetration, as of June 2008 less than a quarter of the world’s population had access to the web. The highest penetration rates were in North America (73 per cent) and Europe (51 per cent); the lowest was Africa at 6.7 per cent (Internet World Stats, 2009). As Deuze reminds us, there is no linear progression from analogue to digital, nor is digital culture necessarily an improvement on the analogue past.

American media scholar Robert McChesney also talks in terms that resemble Stross’ “singularity”; he argues that globally the communication industry, journalism and even the very democratic fabric of society have reached a “critical juncture”. This juncture—to some degree a product of the “communication and information revolution”—has a number of possible outcomes. Either, McChesney argues (2007, p. 1), it can be “a glorious new chapter in our history”, or “we may speak of it despondently, measuring what we have lost”, or “we may end up somewhere in between”. For some it’s a bright future of “life-streaming”, the practice of uploading everything you see and do to a social networking site to share with the world; for others it is a bleak future of surveillance and a total lack of privacy. There will also be economic winners and losers, for this is the nature of global capitalism.

We can never know the future with any certainty and according to a former director of The Institute for the Future, Roy Amara, we tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run. This has become known as Amara’s Law and it reflects the dialectic of tensions and contradictions that run through our understanding of digital technologies: what Deuze (2006, p. 66) calls the “scrambled, manipulated, and converged” mediasphere. It is reasonable to assume that we have, perhaps, passed through a technological and economic singularity in the era of digital globalisation.

We are no longer living in an analogue world; though present social systems do retain elements of the analogue. Our digital world is still emergent—it is not yet fully-formed—but we can be certain that the digital dominates and that there’s no turning back the clock. As another utopian visionary wrote 150 years ago:

“All that is solid melts into air, all that is holy is profaned, and man [sic] is at last compelled to face with sober senses his real conditions of life and his relations with his kind” (The Communist Manifesto).


