Wednesday, 7 May 2008

How and why you should socialise your readers online - e-Publishing Innovation Forum

As digital media adviser to the Guardian Media Group, Tom Turcan's job is to socialise Guardian readers online. This morning he explained to delegates at the e-Publishing Innovation Forum how that is currently being achieved.


The content generated by the Guardian is vast: the site achieves 200 million page impressions a month and publishes 250 news stories and 50 blog posts a day, with 7,000 audio and video files in the organisation's media library to boot.


Turcan explained how the Guardian has been nurturing its readership into something more than just passive readers. They seem to have had some success; reader comments on the Guardian blog are up by 40% on last year, even though the level of content has remained roughly the same.


After dazzling us with such statistics, Turcan took us through some notions of what a community is and what it isn't. For starters, he warned against confusing an audience with a community; they aren't the same thing. Nor does a group of people with similar interests automatically count as a 'community'. Turcan argued that a community should be considered a group of people who form relationships through the platform; those relationships, it seems, are the vital element if you are trying to develop a community of your own.


With this in mind, Turcan took us through a variety of social-networking functionalities the site has adopted over the years, explaining how even something as basic as the Guardian blog, "a very thin platform", has driven the jump in engagement with its user base. By asking people to comment on specific items, or to rate services or content, you engage the reader and can then reward them accordingly, even if only with kudos. It makes for a better service all round.
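
To make those mechanics concrete, here is a minimal sketch of how a comment-and-kudos scheme of the sort Turcan describes might hang together. It is entirely hypothetical: the class, method names and reward values are my assumptions, not the Guardian's platform.

from collections import defaultdict

# Hypothetical comment-and-kudos ledger; not the Guardian's actual code.
class CommunityPlatform:
    def __init__(self):
        self.comments = defaultdict(list)  # article id -> list of (user, text)
        self.ratings = defaultdict(list)   # article id -> list of star ratings
        self.kudos = defaultdict(int)      # user id -> accumulated kudos

    def comment(self, user, article, text):
        """Record a comment and reward the contributor with kudos."""
        self.comments[article].append((user, text))
        self.kudos[user] += 2  # a written contribution earns more

    def rate(self, user, article, stars):
        """Record a one-click rating; a smaller reward than a comment."""
        if not 1 <= stars <= 5:
            raise ValueError("rating must be between 1 and 5 stars")
        self.ratings[article].append(stars)
        self.kudos[user] += 1

platform = CommunityPlatform()
platform.comment("alice", "leader-column", "Strong piece, but check the 2006 figures.")
platform.rate("bob", "leader-column", 4)

The point of the ledger is simply that engagement becomes visible and creditable, which is the kudos Turcan mentions; the reward need not be monetary at all.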


Giving the BBC website as a good example, Turcan explained how, as a user, he is more drawn to what others are reading about on the bottom right of the Beeb's news page than to the stories chosen by the editor. I have to confess to the same habit; it illustrates his point well.


Turcan also stressed the need to recognise the different types of users publishers serve. Some will be casual, new to the site or not using it regularly; others, those who connect more often (commenting, rating and so on), will be more engaged with the site. The trick, it seems, is that by engaging those users you turn them into evangelists for the Guardian site, which in turn brings in more casual users, and so on.


Sounding a word of warning, Turcan reminded the assembled publishers that because the internet frequently allows their advertisers direct contact with customers, there is nothing to say that publishers can't disappear from the equation. The media are traditionally used to not being directly in touch with their readers, and media organisations that don't build that relationship will need to change if they are to avoid this scenario.


It's not just about reaching each new user, said Turcan; it's about engaging each user so they engage more with you and with each other.

Threats and opportunities in the digital age - e-Publishing Innovation Forum

David Worlock, chief research fellow at Outsell, kicked off this morning's e-Publishing Innovation Forum. Fifteen years after the birth of the web, change is endemic in our systems; the only stability we have in our industry is that change will happen. A world of collaboration is vital to us and, if nothing else, delegates needed to get energised, said Worlock.


Following shortly after, the opening keynote was presented by Vin Crosbie from consultancy Digital Deliverance. His presentation, "Thriving in the digital age: threats and opportunities for digital publishing", was a thorough examination of the state of the publishing industry, how it got to where it is and where it's going.


Crosbie highlighted how much has changed since 1908, when there were a million horses in the streets of London. Ask someone from that era whether that would ever change, and they may well have asked you how else everyone would get around. How does publishing in 2008 compare? Information supplied through print has worked fantastically since Gutenberg invented the printing press, after all.


Crosbie argued that traditional media still follows analogue practice, which is largely based on two factors. Information is traditionally supplied based on:


1) Stories of greatest common interest
2) Stories about what the editor thinks people should be hearing about


Traditionally this was the only way to get information out to the masses, and it worked, but only up to a point: not all the information ever got out there. That may sound fairly obvious stuff, but it gave a nice grounding to a key question we may not consider enough: why have 1.3bn people gone online?


That figure is also expected to grow to 3.3bn or thereabouts – half the human population. Crosbie says this is because the internet and web can offer both relevant and personal information to the individual – and this is pertinent for publishing.


You could be forgiven for thinking that the way forward is simply to push all that content online instead. Not so, says Crosbie, who doesn't believe the websites of traditional media will save their companies.


Just think of the myriad genre-specific TV stations that have exploded since the 70s; the 80s saw a similar explosion of specialist magazines, while the 90s heralded the internet. There was, and is, a deluge of information, and that surplus has meant that people want to be far pickier.


If that is the state of things, how should publishers adapt? Mass customisation, apparently, driven directly by the user: 'hunting and gathering', as Crosbie puts it.


There is a huge opportunity for creating a delivery system that is able to provide that highly specialist information. Crosbie cited the Telegraph's experiment last year, which emailed a thousand readers a unique PDF specialised to their interests.
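
As a rough illustration of the kind of mass-customisation pipeline this implies (a sketch under assumptions: the topic tags, reader profile and digest size are mine, and this is not the Telegraph's actual system), each reader's edition could be assembled by matching story tags against a profile of interests:

from typing import NamedTuple

class Story(NamedTuple):
    headline: str
    topics: frozenset  # e.g. frozenset({"sport", "football"})

def build_digest(interests, stories, max_items=10):
    """Rank stories by overlap with the reader's interests and
    keep the best matches for that reader's personal edition."""
    scored = [(len(interests & story.topics), story) for story in stories]
    ranked = [story for score, story in sorted(scored, key=lambda pair: -pair[0]) if score > 0]
    return ranked[:max_items]

stories = [
    Story("Council tax set to rise", frozenset({"politics", "local"})),
    Story("Premier League round-up", frozenset({"sport", "football"})),
    Story("Hands on with the new handsets", frozenset({"technology"})),
]
digest = build_digest(frozenset({"sport", "technology"}), stories)
# 'digest' would then be rendered to PDF and emailed to the reader.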


Citing newspaper circulation figures in the US, Crosbie highlighted that for every lost print subscriber, it takes 20 to 50 online users to make up the lost revenue. The current business model isn't working.
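
To put illustrative numbers on that ratio (these figures are my assumptions, not Crosbie's): if a print subscriber is worth, say, $500 a year in subscription and advertising revenue, while an online reader yields only $10 to $25 a year in advertising, then replacing that one subscriber takes 500/25 = 20 online readers at the optimistic end and 500/10 = 50 at the pessimistic end.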


Crosbie left us with some thoughts to mull over:


1) Information has to be good enough, not perfect, so long as it's out there
2) Brand is no defence; it doesn't matter as much to people any more
3) The siloing of information will end; information wants to be free, and that means collaborating with competitors.


This isn't to say that mass media no longer has a place; people will still want to know about the big stories of the day, and the weather too. However, Crosbie says we have to accept that the reader now has far more control. This shift has meant that the value of content has gone down, but it also means that content is now much more affordable to more people than ever before. And perhaps that is where future opportunities will come from.


More later…

Tuesday, 6 May 2008

e-Publishing Innovation Forum

On Wednesday and Thursday this week, IWR will be blogging from the launch of our new sister show, the e-Publishing Innovation Forum.


Tomorrow's line-up of sessions will play host to speakers from across the publishing spectrum, such as Tom Turcan from The Guardian on the topic of "Working with social network platforms to build online audiences". The afternoon will host Josh Bottomley, managing director of LexisNexis, who will examine the opportunities for getting the most for your money in his presentation "Streamlining processes: getting ROI on your workflow".


Grace Baynes from Nature Publishing Group will ask what the options are for developing niche social networks in an online world dominated by Facebook, with other sessions from Outsell ("Who are your customers? Understanding their needs and behaviours"), Bauer Consumer Media ("How social networks are changing everything") and Vin Crosbie from Digital Deliverance ("Thriving in the digital age: threats and opportunities for digital publishers"). The opening day will be packed with useful nuggets of wisdom and insight.


David Worlock will open the day as conference chairman.


Check back tomorrow for the latest updates.

Archive forgeries deserve an inquiry

A bank holiday weekend is probably as good a time as any to announce less than positive news. With the majority of workers enjoying the time off, bad news is bound not to get noticed as much.


With the government still feeling the aftershock of disastrous local election results and Microsoft deciding to drop its takeover bid for Yahoo, a fascinating article in this weekend's FT could easily have been missed. It detailed how at least 29 forged documents had been slipped into the National Archives' records, and how professional research revealed the documents to be duds. The bad news is that police say the case will stay unsolved.


The FT article by Ben Fenton, an accomplished researcher himself, details how between 1999 and 2005 a series of files were planted in the records. You may be familiar with the case.


The documents claimed that, under the highest orders of the British government, plans were made and carried out to assassinate SS leader Heinrich Himmler.


Fenton's outline details how the evidence for forgery was overwhelming. Apart from factual inaccuracies, there were physical traces of foul play, including pencil marks under signatures and questionable phraseology between correspondents. The letterhead on the documents, meanwhile, had been created on a laser printer.


The documents themselves were cited as evidence in books by the historian Martin Allen. Allen claims to have had no knowledge of the forgeries and says he was 'the victim of a conspiracy'.


The real stink of this is that the 13-month police investigation yielded nothing but a decision to abandon prosecution because it wasn't deemed to be in the public interest.


Trust in the work you use, catalogue and preserve is vital; without it, the efforts of information professionals across the spectrum are undermined. This is very much in the public interest.


If criminal proceedings aren't an option, then perhaps an inquiry is needed to ensure further breaches of trust aren't allowed to happen and the reputation of the National Archives is restored to its rightful place.

Monday, 5 May 2008

Lessons from Infosec

Now that the dust has settled on another year at Infosecurity Europe, the chocolate bourbons digested, the old sandwiches binned and the vast conference hall at Olympia generally made spick and span for some other obscure trade show, it might make sense to revisit the key themes. It's the bane of most tech journalists' lives, but love it or hate it, Infosec has become the undisputed king of the IT security world – if you're a vendor and you're not there, people will assume you have ceased trading.



The high-profile keynote speakers this time around included Howard Schmidt, former White House cyber security advisor; Bruce Schneier, the renowned encryption guru; and Information Commissioner Richard Thomas. Thomas is clearly pretty frustrated about the lack of enforcement powers and the measly funding his organisation receives from government. At one point during his presentation he asked, "what other watchdog has to ask permission before it investigates?", and he's got a point. Until now the government has paid little more than lip service to the notion of data protection; high-profile breaches seem to be just the tip of the iceberg, pointing to a more serious and deep-rooted cultural malaise affecting attitudes to safeguarding citizens' data.



But Gordon Brown has now finally given the go-ahead for the ICO to carry out random spot checks on government departments, with powers to do the same in private enterprises likely to follow. Thomas has noted before that the watchdog is not into witch hunts, and will do its best to help those who are trying to comply; but in equal measure he has made no secret that he will come down hard on those organisations which are flouting data protection laws.



Technology solutions can help in certain respects, controlling the flow of data into, through and out of the organisation, but it ultimately comes down to policy, people and process. With this in mind, organisations must be clear about their data protection policies, which should be rigorously enforced and communicated to all staff. Be warned: the ICO finally looks likely to fulfil its intended role; no longer a passive lapdog but a watchdog with teeth.

Friday, 2 May 2008

Check your values before Green IT

Since global warming has (temporarily?) stopped, we now have the term 'climate change' to keep us alarmed.


It strikes me it's all a bit like religion: you can be a Muslim or a Christian, and the primary thing that sustains you is your belief, reinforced by holy books such as the Bible or the Koran. Generally speaking, those of one faith spend most of their devotional time among their own. Conversion to, or even understanding of, alternative faiths is uncommon.


Such has been the mass of hype around climate change (some of it well-founded, by the way) that it has created a new faith. It's not totally unlike the much-abused 'political correctness' before it, in that to challenge is to be branded a 'denier'. But who can deny that planting crops to harvest as biofuels is fundamentally stupid?


Covering large areas of Britain's land and sea with wind farms is another classic. It would make more sense to capture energy from naturally renewable and continuous sources: tide, geothermal and sun; even nuclear is being seen as increasingly realistic by some of the pros and many of the antis. But they'd better get a move on, and first make sure the energy generated exceeds that consumed over the lifecycle: raw materials, building, operation and eventual shutdown.
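
To put that test in back-of-the-envelope terms (my formulation, not the author's): a generator is only worth building if the energy it delivers over its lifetime, E_out, exceeds E_in, the energy embodied in raw materials, construction, operation and decommissioning; in other words, the ratio E_out / E_in, the energy return on investment, must come out comfortably above 1.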


Nigel Lawson has written a book called Global Warming: An Appeal to Reason. James Lovelock has written The Revenge of Gaia. I read them both together last week. Each makes interesting observations - sometimes agreeing, sometimes not, even when based on the same premises. If you've not read them, they cast different lights on the same science. I feel disinclined to follow either argument with any degree of passion, but they certainly provided plenty of food for thought.


We can choose a pro or anti stance or we can go for a more neutral 'trying to leave the planet in a fit condition for our successors'. With Green IT 08 coming up next week, there's going to be a lot of hype around. If you're going (and the conference looks interesting), I suggest you make sure of where you stand on environmental issues before you leave, and weigh everything that's hurled at you accordingly. In particular, if someone's trying to sell you something, ask them about the nett planetary impact.

Thursday, 1 May 2008

Opening the Door to Open Access

Earlier this week, the Scholarly Publishing and Academic Resource Coalition (SPARC) and Science Commons released their advice on implementing Open Access through their Open Doors and Open Minds white paper.


The paper is billed as a how-to guide to formulating an institutional licence grant and the best way of getting an OA policy up and running at an institution.


I've had a quick read through; it's well written, with some useful advice and things to consider. At a dozen pages in length, it's worth a look if only to familiarise yourself with current OA developments.


The paper has a good stab at the reasons behind Harvard's Faculty of Arts and Sciences adopting its landmark OA policy, and suggests what action in academia can and should be taken. What is needed, the argument goes, is advocates (of OA) and educators (of colleagues and administrations). Librarians, say the authors, will usually be the engine managing resources so that scholarly articles get deposited online.


The paper outlines three case scenarios, each examining a different licence grant for consideration. The press release, to be exact, breaks these down as follows:


Three different licenses, which are granted to the institution by the author, are offered for consideration:


Case 1. Broad license grant - a non-exclusive, perpetual, irrevocable, worldwide license to exercise all of the author's exclusive rights under copyright, including the right to grant sublicenses.


Case 2. Intermediate license grant - involves license restrictions that modify the scope of the license grant in Case 1.


Case 3. Narrow license grant - grants to the university only the right to deposit the article in the institutional repository, and to make it available through the repository Web site.


The paper also recommends mandatory deposit of articles in institutional repositories. Mandatory deposit may be adopted regardless of the licensing policy chosen.


All of this is particularly useful if you are considering taking similar steps in your institution...