The Harsh Couch - 2016.04.26 Acupuncture of the Genitals


#1

If I had to take hell, I would use the Australians to take it and the New Zealanders to hold it. - Erwin Rommel


This is a companion discussion topic for the original entry at http://theharshcouch.com/thc/2016-04-26/

#2

I’ve been spamming Wibbly with e-mails about the WNYC On The Media 6 April 2016 podcast interview with an interesting fellow about how a team of journalists across different publications worked to sift and process the information in the Panama Papers dump.

Link: http://www.wnyc.org/story/behind-panama-papers/

This has triggered a gush of thoughts. I’d planned to send them to Wibbly, but then I thought: why should only he suffer? So, here’s what I could get around to committing to record. I suspect I have not proof-read it enough.

For media, there is an interesting emerging content and business model in this. In short:

  • The electronic information age has reached a stage where information leak data dumps like this will become quite common. Examples come from both the public sector (Wikileaks, Snowden) and the private sector (Panama Papers, Ashley Madison).
  • The challenge this has presented to the media is how to digest the large volumes of data both quickly and knowledgeably.
  • The emerging model has been to work across publishing houses that may not (at a headline commercial level) be otherwise connected, at least in terms of the commonly understood networks of commerce.
  • This model has quickly grown to a point where it seems to be profitable, efficient and self-enriching.
  • Self-enriching in the sense that, if this were just a country-based network (eg the Fairfax Press, ignoring their New Zealand banners) or even a banner with a split of semi-local and targeted-global focus (eg NYT, Guardian, Washington Post), I don’t think they would have the resources or focus to pick up leads like the one linking to the now former Icelandic Prime Minister. The model allows the same asset (the data) to be exploited much more, with the benefit of that exploitation rewarding everyone in the network.

I don’t see this as a Big Data issue (at least not in my layman’s understanding of the term). This is not analysing large volumes of data with the power of high-spec data centres to detect the patterns emerging from the aggregate.

The contents of leaks like the Panama Papers, Wikileaks and Snowden were akin to traditional documents sitting in a pile, each needing individual, detailed attention. They just came in a different format and in larger volumes. It’s lawyer document-discovery territory, or business due-diligence territory, rather than the realm of statistics and the other sciences I never paid enough attention to.
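To make that distinction concrete, here’s a toy sketch of what I mean by discovery-style processing (all names and paths hypothetical, and this is my layman’s guess at the shape of the tooling, not what the Panama Papers team actually built): index the dump so a reporter can pull every document mentioning a name, then read each hit with human eyes.

```python
from collections import defaultdict
from pathlib import Path

def build_index(dump_dir):
    """Toy inverted index: each lower-cased word maps to the set
    of documents in the dump that contain it."""
    index = defaultdict(set)
    for doc in Path(dump_dir).glob("**/*.txt"):
        for word in doc.read_text(errors="ignore").lower().split():
            index[word.strip(".,;:()\"'")].add(doc)
    return index

def documents_mentioning(index, name):
    """Every document containing all the words of a name. The index only
    narrows the pile; each hit still needs a knowledgeable human reader."""
    hits = [index.get(word, set()) for word in name.lower().split()]
    return set.intersection(*hits) if hits else set()

# Hypothetical usage: narrow millions of files to the handful a reporter
# with local knowledge actually has to read.
# index = build_index("panama_dump/")
# print(documents_mentioning(index, "gunnlaugsson"))
```

Nothing statistical about it: no aggregates, no pattern detection, just search plus human judgement.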

The question is how do we get the necessary process of individual, knowledgeable attention to detail running in an efficient way. AI seems to be decades away from replicating this (so far human) function. The more informed among us may contradict me, in which case the singularity is closer than I thought and the end is nigher.

From my banker perspective, this is the sort of opportunity that the smarter publishing banners should be grabbing at furiously in this age of a (thus far) dying journalistic model. If this model is as dollar-profitable as it seems (and I must admit I’ve seen no financials to confirm it is), this is the promised land of both quality journalistic comment AND profitability.

I repeat my disclaimer (because my years as a lawyer compel me) - I haven’t seen profitability numbers supporting my assumption that this model is profitable. This assumption reflects my hope that quality journalism can be profitable. My promised land of quality journalistic comment and profitability could just be evangelism on a mistaken presupposition.

Disclaimer aside, an interesting aspect of this model that is unfortunately missing from the podcast interview is how the connections between banners occur.

Possibly each leak project gives birth to a model that is the next stage in evolution from the one before (carried in the genes of the journalists and architects who come across from the previous project).

But we’re not stuck in a biological genetic paradigm where there is only vertical lineage. The opportunities for lateral passage of architecture and processing concepts from conceptually adjacent data systems are something I wish I knew more about.

It’s frustrating to know there is a spawning process underway in the information sphere but I can’t see under the blankets because I chose never to go down the necessary education tangent. I hope to do that one day, but I’ve got another active education project to get through and a large “to do” list before that.

I’d expect there are a few different network models existing between media houses. The obvious connections would be:

  • Standard human networks built on common schooling, employment or collaboration;
  • Content syndication networks; and
  • Common ownership and internal corporate channels, eg News Limited.

From what I understand of the media-buying advertising model, those networks indirectly connect most media houses. But that is unlikely to be the type of network that would make the necessary connections. Advertising space is largely bought up by clearing houses and auctioned off on sterile electronic platforms as a purely commercial and commoditised product. They are run by numbers people.
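For illustration only (a toy model, nothing like any real exchange’s actual API): the mechanism at the core of those platforms is roughly a sealed-bid second-price auction, which is about as sterile as commerce gets.

```python
def second_price_auction(bids):
    """Toy version of the auction mechanism commonly used by programmatic
    ad exchanges: the highest bid wins the impression, but the winner pays
    only the second-highest bid. No editorial judgement in the loop."""
    if len(bids) < 2:
        raise ValueError("need at least two bidders")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]  # winner pays the runner-up's bid
    return winner, price

# Hypothetical usage, bids in dollars per thousand impressions:
# second_price_auction({"agency_a": 2.10, "agency_b": 1.95, "agency_c": 1.40})
# -> ("agency_a", 1.95)
```

Whatever connects media houses for a Panama Papers-style project, it won’t be that.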

I’d guess that the approach would still be standard human networks. While content syndication makes more sense at first glance from a business perspective, I suspect the psychology of journalists, as content-focused professionals, leaves them feeling most comfortable operating on networks where they can screen the content, value and integrity of the people they connect up with for this sort of project. This human network is also one that has benefited from the ready access to information and the human-connectivity aspects of the information age.

Otherwise, the information age has been largely unkind to journalists and the journalistic content models of the 20th century.

And I suspect many journalists would have a distrust of, if not antagonism to, the potential inroads the content-production aspects of the electronic information age could make into their content market. The commoditised, low-quality content this challenger has produced so far has already grabbed eyeballs that the traditional media used to take for granted. Would journalists want to betray their own kind, work with the enemy and help their own extinction evolve?

Perhaps what we’re witnessing is the old human journalism network finally evolving in reaction to the challenges of the internet age. Are we beholding a scrappy, tough human network finally picking itself up off the canvas and launching a flurry of blows against the rise of the machine networks? I think I’ve taken too much programming from Hollywood.

Anyway, another tangent laden with a lot of presupposition on my part - who will suffer? Which networks will only reluctantly join this revolution, should this disruptive model have staying power?

The most obvious candidates seem to be closed shops which incentivise internal networking only and disincentivise external connections with other corporate networks. I mean corporate here in the commercial sense of somewhat contained structures, not in a “corporations are evil” protest banner sense.

In the podcast interview, the NY Times comes up. The interviewee is a polite gentleman, but his comments about the Panama Papers model not integrating well with the NY Times model are still pretty clear.

I don’t know enough about media houses to know if any others would be candidates for this inward network type. I have an under-informed suspicion News Limited may fit this description. But, by my reasoning above, networks across media houses can be more profitable, so a profit-seeking corporate (now in the “corporations are evil” sense) should rationally encourage those networks. Rupert still sounds sane enough to think this way. But this also assumes perfect knowledge. It’s also an oversimplified (and potentially ignorant) model.

And perhaps the (assumed) profitability of the inter-house networking model is only coming to fruition now. That would coincide neatly with my earlier comment about the recent emergence of the data-leaks phenomenon in the data age. But a neat explanation can be deceptively attractive.

In a mildly coherent way, this brings me to a request. I’ve seen the recent, great interview documentary with Snowden during his time in HK. Are there good books that focus on how the Wikileaks or Snowden data leaks were processed? Or on these themes generally?


#3

Lest I fail to give credit where it’s due: full credit to Wikileaks as the first model for efficient content processing that I’m aware of. Not a traditional media house, but a content-producing model challenging the traditional media model’s space.

And credit to the Guardian^ for being, as far as I’m aware, the first traditional media house to devour and absorb that challenging conceptual model, harnessing its power to process the Snowden leaks within a more traditional commercial framework.

^ And if someone corrects me that it was actually one of the others in that syndicate, I’ll correct that.


#4

More data leaks: http://www.theregister.co.uk/2016/05/11/embarassing_data_breach/