Friday 30 May 2008

I'll get me coat...

By now my picture may have disappeared from the sidebar of this blog. It's because, for the past eight months, I've been burning too many candles at too many ends. Something had to give and, while my relationship with the IWR folk is still good, I just can't keep up with the weekly blog. I'm so sorry (although I'll understand perfectly if you're not!). You'll still be getting a column from me once a month.


What's taken over a large chunk of my life is working with Freeform Dynamics, a boutique research and analysis firm which publishes a lot of its stuff freely. My particular responsibilities centre on environmental and social computing, although a lot of business people shrink from the 'social' word, so I tend to use 'collaboration' with them.


My first IWR blog post was just over two years ago (we started the blog in May 2006, despite what it says in the sidebar). I used the Wayback Machine to find it. Just provide a website URL and it will list archived copies of the site's pages going back to the start - a great way to check up on how people change their tune over the years. Or simply to do a bit of research, like I did.


Then, as now, I'd recently been in California. And my first post related to the first law of VC investment: find something that appeals to one of the seven deadly sins. Of course, unless you're a churchgoer, these don't come readily to mind. For the record, they are: greed, envy, sloth, pride, wrath, gluttony and lust. The virtues, by contrast, are: faith, hope, charity, fortitude, justice, temperance and prudence. The last is much beloved of our current (is he still?) Prime Minister.


It's interesting to see how social computing and environmental concerns have tempered many people's outright pursuit of sins with more than a pinch of virtue. Yet, underneath you can't help suspecting that not much has changed. Although, perhaps because of the new circles I move in, I do see a lot more charity these days. People seem more ready to give in the online social world with little expectation of a return. The weird thing about this is that the returns come rapidly as word spreads. I had lunch with four people the other day and every one of us has benefitted financially and spiritually from our online activities, even when that's not what we set out to do. In fact, all five of us only knew each other as a result of social computing related activities.


If you're curious about who we are, one of our number, Euan Semple, posted a picture of the occasion to Flickr. Except it wasn't an occasion. We just all felt like turning online to offline for a change.


When it comes to the green side of my life, it seems that we are, once again, driven by sin on the one hand and virtue on the other. All businesses want to make money (sin or survival - take your pick). And we all want to make sure that our children and grandchildren inherit a world worth living in, which I think falls on the side of virtue, although it could be partially driven by pride. Bosses of organisations might make environmental choices for pragmatic reasons - money, PR or regulation - but they may secure staff commitment for additional, more spiritual, reasons. If the end result works, are we bothered?


So back to the first column. My conclusion was that "information professionals may be helping users address their baser urges". Now I'm not so sure. I actually think you're better than that. I owe you an apology.


See you on the column. Or Google me if you feel so moved.


It's been fun. Thanks for reading the blog posts.

Wednesday 28 May 2008

Web 2.0 Strategies

Here at IWR towers our events team have been keeping themselves busy launching two new conferences this year. The first, the E-Publishing Innovation Forum, we blogged from a couple of weeks ago; this month sees the launch of Web 2.0 Strategies in London. Once again we'll be blogging from there.


The event will examine how Web 2.0 is fundamentally changing the way organisations work, especially where their customers are concerned. The emphasis is on how communication and collaboration factor in.


The speakers lined up to thrash this train of thought out include Jeff Schick, IBM's vice president for Social Computing Software; Artur Bergman, director of engineering at Wikia (described as a 'hacker and technologist at large'); and Julie Meyer, whom you may have heard of as founder of First Tuesday, the iconic networking forum for technology entrepreneurs.


The opening keynote will be delivered by Dion Hinchcliffe, Editor-in-Chief of Social Computing Magazine, who will examine the strategic value of enterprise 2.0, including the trends and innovations happening in this area. The conference will be chaired by Euan Semple (a previous IWR Information Professional of the Year winner).


Other topics up for discussion include the strategic value of enterprise 2.0, how to get web 2.0 adopted by the business, mobile web 2.0, and using web 2.0 to foster advocacy and collaboration amongst employees.


Further details on the full conference line-up can be found here.

Monday 26 May 2008

More government woes for CIOs

As you might have read in the papers, seen on TV etc etc, the government has announced proposals for a massive database to store the details of all phone and VoIP calls, text messages, emails, and internet usage made by UK citizens. The big question is whether the forthcoming draft of the Communications Data Bill will also include corporate emails, calls and other forms of communication, as this could impact your data retention policies and affect security risk management strategies.



If the idea is merely to tap up the telcos and ISPs for their logs on these sorts of communications then it’s probably fair to say that as long as you keep a careful eye on your outbound comms channels – and have the technologies and rules in place to monitor and block any use of consumer tools such as webmail, public IM etc for work purposes – corporates need not worry too much about the logs of any of their communications ending up in the hands of the government.



Speaking to a Home Office spokeswoman though, I was told that the bill could potentially cover all forms of communication. Although she was keen to stress these were still proposals at present, and may not even reach the draft bill stage, it's food for thought for CIOs. Many organisations, especially those not already highly regulated, may not have the technologies and processes in place to store the records of their internal communications and those between the organisation and third-party business partners. And for those that do, how easy will it be to integrate with current disaster recovery and data retention policies? If some of that data is required by industry regulators to be destroyed after a certain time frame, how will that tally with the government hanging on to it for as long as it likes? At the moment, no retention period has been specified, although telcos currently have to hold on to phone call and text details for at least 12 months.



In a way it's a shame we should be so horrified to think this kind of data could end up in the hands of the government. But as recent incidents have shown, for whatever reasons it doesn't always guard data as carefully as perhaps it should. It's probably a case of watch this space with bated breath until more details are released.

Friday 23 May 2008

Government's monumental snooping plan

Here we go again. The government wants to get more data about us. And store it in a monumental database. A Home Office spokesman has said, "Ministers have made no decision on whether a central database will be included in the draft Bill", but that's probably just a way of keeping troublemakers, like the press, at bay until the inevitable decision has been made.


Dig a little deeper into the proposal for a Communications Data Bill and you'll find that a main element is to "Transpose EU Directive 2006/24/EC on the retention of communications data into UK law." This data retention law has already been challenged for breaching human rights. Yet our lot press on regardless. I'm never sure whether our government decides what snooping it wants to do, then finds a handy European directive to cover it, or whether it is just utterly supine.


Anyway, the ball is rolling. If the plans come to pass, our communications and ISP networks will sport the equivalent of street cameras except they will be black boxes snooping on our telephone and internet activities and feeding them in real time to the database. It will know how long we spend online, who we talk to, what web pages we connect to, what emails and IMs we exchange, with whom, and so on.


The government's proposal reassures us that it will "ensure strict safeguards continue to strike the proper balance between privacy and protecting the public." In other words it is using the threat of crime - terrorism specifically - to snoop on us all. If the past is a good guide to the future, it is likely to cost a fortune, be late, cost even more than planned, be insecure, run up huge carbon debts and fail in its primary task.


Seems to me that the government really is biting off more than it can chew with this one.

Wednesday 21 May 2008

Our digital lives

IWR were approached this week by a Research Fellow at UCL. He is looking for some volunteers (academics in particular but professional groups in general) to complete an online questionnaire about the kind of information management behaviours they adopt in their personal lives. The purpose is to ask what this might mean for digital curation practice.


As information professionals, do you follow the same rigorous practices of filing and cataloguing at home? Is the filing of family photos on your hard drive done with the same zeal as you would apply to work material? Is your MP3 music library properly catalogued and referenced, not only by artist, title and date, but also by mood, every category religiously filled in? Do you back up everything both on a spare storage device as well as online – just in case?


The questionnaire project, Digital Lives, is being run in partnership with the British Library and Bristol University. The survey takes about 10 minutes to complete and isn’t too taxing.


Below is the official message and link to the survey.


Digital Lives: Helping People to Capture and Secure their Individual Memories, their Personal Creativity, their Shared Historic Moments


Increasingly, our family memories, our personal achievements, our experiences of historical events, are being facilitated and recorded digitally.


Digital Lives is a pathfinding research project that is setting out to understand how individuals retain and manage their personal collections of computerised information  - everything from digital photographs and videos to favourite podcasts and sentimental email messages - and how these digital collections can best be captured in the first place and preserved in the long term, perhaps for family history, biographical or other purposes.


The project is led by Dr Jeremy Leighton John and colleagues at the British Library who, together with experts from UCL and Bristol University, are researching the challenges that lie ahead as more and more of our memories and documentary witnesses exist in electronic form.


We would like to invite you to take part in our research by completing an online survey.  This should take no more than ten minutes of your time and it will provide us with crucial information that will benefit both individuals such as yourself, in your day to day management and storage of information, and also help the work of the British Library and other archives enormously as we plan for what is fast becoming a largely digital world.


If you would like to take part in the survey, please click here: http://tinyurl.com/5wtwgm


If you would like to enter our Prize Draw and stand a chance of winning £200 in British Library gift vouchers (drawn at random and with no further obligation) you can register your interest at the end of the survey.


Please note that all responses are strictly confidential.  No individuals will be named when we report our findings, and the information collected will only be presented in an aggregated form.  You will not be contacted again as a result of completing this survey.


If you have any questions, or are concerned about the bona fides of this survey, please email the Principal Investigator, Dr Ian Rowlands (UCL School of Library, Archive & Information Studies) at: i.rowlands@ucl.ac.uk (Digital Lives is funded by the Arts & Humanities Research Council:
Grant number BLRC 8669).

Sunday 18 May 2008

Up to speed on search

It occurred to me the other day that technology is probably the only industry in which the lingua franca of the day is so rapidly displaced by a new set of buzz words, which end up confusing buyers and making marketing men lots of filthy lucre. Enterprise Content Management is so broad a term it lets the vendor community brand their products as "ECM", when they may only cover one small part of the full functionality needed to provide a holistic solution. And enterprise search is another. When you hear the term enterprise search it sounds like you're getting a Google-type experience for behind the firewall – but of course it's a little bit more complicated than this, despite what some of the vendors may tell you.



Autonomy chief executive Mike Lynch was particularly keen to tell me the difference between his firm's technology and those of some of the smaller players in this market. In fact, he went so far as to compare certain search technologies to planes, and others to mere bikes: the latter being fine for a trip back from the shops, he said, but probably not a great way to get to the States. Analogies aside though, the difference between the incumbent high-end players and the likes of Sinequa, Exalead, Vivisimo and others at the mid to low end can be significant, despite these less well-known firms offering functionality that will suit many organisations just fine, thank you very much.



Lynch waxed lyrical in particular about implicit query functionality, which could redefine what we understand by search. Gone is the search box; instead, the technology works in the background, automatically reading the information on a user's screen and retrieving and displaying enterprise materials related to that content.
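
To make the idea a bit more concrete, here is a minimal sketch of how an implicit-query loop might work: pull keywords from whatever text is currently on screen and use them to rank documents in an index. The stopword list, the toy in-memory index and the document names are all my own illustrative assumptions; this is not how Autonomy's technology is actually built.

```python
# Minimal sketch of "implicit query": no search box, just rank indexed
# documents against whatever text is currently on the user's screen.
import re
from collections import Counter

STOPWORDS = {"the", "and", "for", "with", "that", "this", "from", "are", "was"}

def extract_keywords(screen_text, top_n=5):
    """Pick the most frequent non-trivial words from the on-screen text."""
    words = re.findall(r"[a-z]{4,}", screen_text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(top_n)]

def related_documents(screen_text, index):
    """Rank indexed documents by how many extracted keywords they share."""
    keywords = set(extract_keywords(screen_text))
    scored = ((len(keywords & terms), title) for title, terms in index.items())
    return [title for score, title in sorted(scored, reverse=True) if score > 0]

# Hypothetical index: document title -> set of terms it contains.
index = {
    "Q3 sales forecast": {"sales", "forecast", "pipeline", "revenue"},
    "Travel expenses policy": {"travel", "expenses", "policy"},
}
print(related_documents("Drafting an email about the revenue forecast...", index))
# -> ['Q3 sales forecast']
```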



Then there is Autonomy's own Automated Query Guidance, which presents the user with queries based on different contexts, so that the user can be sure the most applicable one is returned. Lynch also pointed to audio and video search as two other areas the company is investing resources in.



Search has really grown in sophistication over the last few years. For some these extra bells and whistles won't be important, but increasingly enterprise search is an inadequate term to describe all that's out there. Autonomy itself is moving more and more into the e-discovery space, for example – a canny move given the potential size of the market and legislative drivers at play.


Friday 16 May 2008

eLearning (etc) with Adobe

In the dim and distant past, when Adobe first announced Acrobat and I was a snidy journalist, it gave me huge pleasure to rib the company about the inflationary nature of its software. It would take simple text and inflate it to six or more times the size and say "now it can be read on multiple devices." My view then was that plain text could be viewed on multiple devices anyway.


But, of course, I was a words man and an ex-programmer whose first computers had the equivalent of just 2.4k of memory. We ran things like accounts, payroll and stock control on those things. I was a) not interested in presentation and b) paranoid about wasting computer resources. The idea that the Gettysburg address would require 6.4 times the storage in .pdf compared with .txt appalled me.


Of course, life moves on. Computer equipment has become cheaper and storage more plentiful and Adobe has spent its life delivering what real people want, rather than pandering to minorities like me. And, for what it's worth, I've been a consumer and creator of Acrobat and Flash materials for some years. I know my screencasts and invoices (for example) can be understood by anyone who has a Flash player or Acrobat reader, respectively.


This week, the company previewed some upcoming products and services, in particular Connect Pro 7 and Presenter 7, which are imminent. Adobe also mentioned a consumer-level conferencing system due in June and extended IM interoperability for Connect Pro later in the year. Connect Pro is a suite of web conferencing and eLearning facilities while Presenter is an authoring tool which adds things like quiz compilation, audio and video editing to PowerPoint 2007 and can publish the results to Connect Pro.


The end result is a powerful eLearning environment which runs from authoring, through conferencing, break-out group management, collaboration, quizzing and assessments. Live sessions can be recorded and edited and published for self-paced learning later. There's much more - security, APIs, integration with existing directories and so on. But best to visit the Adobe site for more details if you're interested.


Because Flash and Acrobat are available on multiple operating systems and run in different browsers, just about anyone can participate in these conferences and learning experiences.


It turns out that the concepts I scoffed at all those years ago were actually smart in the extreme.

Thursday 15 May 2008

Information: Who owns it? Who has the skills and who needs them? PI Panel Discussion

When Hamilton Matthews, chair of the Perfect Information panel debate, began the final session of the conference by asking us who we thought was hijacking information from the professionals, you knew it wasn't going to be a dull session.


Were the main culprits IT teams, outsourcing firms, or the role company intranets play? Is the information centre going to be obsolete? he asked.


“Yes,” said fellow panellist Mark Janssen, a senior consultant with Smartlogic. “You have been hearing for a long time from Outsell reports and the like: ‘change or die!’ The information broker will go, yes – but that’s not to say there isn’t a role for information professionals.”


Alison Harman, Executive Director of the Global Info Centre at UBS, added that change needs to be embraced. "The information centre seems out of date," she said, adding that in her organisation the information resources sit alongside the bankers; they work with their so-called threats – the IT department, communications and marketing people – and that should be seen as an opportunity. ‘It’s a collaborative effort,’ Harman said, asking us to consider some points...


1) Don’t bury your head in the sand
2) Don’t be reactive as a researcher
3) Get to the forefront and sell your skills


If that all sounds horribly familiar, it's because the issue of information professionals promoting themselves has never adequately been addressed; it's a challenge that arguably should have been tackled long ago.


Before I get to that, Alun Davies, Head of Knowledge Management at Lovells, explained what he thought was the wider problem, saying “Information professionals are in a difficult position. Are we comfortable with where we are? No – we are seen as being an obstacle to information, the gatekeepers.” He went on to add that for a long time information professionals prided themselves on being there for the user; now users are searching for their own information, even though they don’t appreciate the complexities involved in what you do. Information professionals can be seen as barriers to information with their logins and licences. It seems rather unjust, as these measures are vendor-led.


“They don’t help with heavy restrictions on the information – they use old business models,” added Harman. “The Google generation aren’t necessarily aware of the difference in quality of information; often free will do.”


Davies also criticised professional bodies for not raising the bar in terms of skill sets, with universities also not giving enough room to developing the right information courses.


With this in mind the topic got back to information professionals needing to get better at 'selling' and 'marketing' themselves around the organisation. On this side of the pond those can be seen as dirty words, maybe more so in the information world.


“Why should we!?” demanded one audience member.


I won’t go into the nitty-gritty of that debate; it’s been chasing its tail for so long now. Suffice to say that the general consensus in the end was that promoting the behind-the-scenes work and skills of the information professional was a good thing. The question was how.


The suggestions kicked around by the panel and audience, which I liked a lot, showed no intention of turning info pros into braying, extrovert monsters.


My favourites include:


 Spot opportunities and explain what you will do to show value (ask how this will affect the client).
 Do this even if it is out of your typical remit.
 Bring what you do to the attention of the decision makers; if they know your value they will fight your corner in strategic decision-making.
 Get closer to the financial department; they are part of the inner circle of decision makers but don't have any research skills – can you help them?
 Tell war stories; let people know the trials and tribulations you overcame. They are a valuable way of promoting your role and importance.


Maybe an altered mindset is what is needed? The final comment of the session came from an audience member who pointed out that information professionals consider themselves support staff rather than executive material. Is there any reason that can’t change?

Think clever about competitive intelligence

After some technical glitches preventing a regular update to the blog yesterday, we were all back online at this morning's opening session of Perfect Information Conference, day 2.


The stage was opened up to Trevor Foster-Black, the founder and CEO of Coalition, who had come to tell us how his organisation delivers competitive intelligence to its clients. What this involved was an outline of the general model they use and a few anecdotal pointers that are worth considering if you are tasked with a similar responsibility. As information professionals you may well have access to much of this corporate business information.


Knowing who knows what in your organisation is vital when approaching the task of telling an organisation more about itself. Finding out what its assets are and how it compares to its competitors isn’t necessarily a straightforward task.


By examining your internal information in the form of staff knowledge, and then combining that with what is available publicly, you can start the process of getting your hands on some very valuable data.


Rather than detail the complex metrics and statistical analysis models that Coalition use, I'll ad-lib on the more general advice Foster-Black offered us.


For a start, defining the metrics and language used in the initial brief is a critical first step. Agree with the client on certain meanings so you don't go off and fail to answer their actual question. Consider your framework and what you want to achieve, as well as where you are going to get your information.


Take publicly available information about the organisation, then at the back end fill it out with relevant data and research you don't yet have; the net effect is apparently quite powerful.


Manage expectations about what competitive intelligence can offer the client. Also be prepared that the results you get indicating a client's position in their market might not actually match up with where they think they are; 'no one likes to be told they're not as good as they think they are,' said Foster-Black.


To get around this, he advised getting all stakeholders involved, to provide feedback and ensure any data differences are reconciled or highlighted in advance. Be prepared to show, via your audit trail of information, why you have the results you do – people will disagree with results, so you must be able to show why you have reached the conclusion you have. This means highlighting methodology, error margins and unreconciled issues at the start.


When presenting the results, consider that the executive summary of your research will be the most read part. Few people will read a report from start to finish even if they do use it as a reference manual.


If in your competitive intelligence research you are surveying comments from a variety of people about the company, it's OK to include factual statements alongside opinion and pure gossip, providing you clearly indicate which is which. This makes for a much richer piece of content and provides an alternative insight.


If I took anything away from this, it is that, along with complex data-crunching, competitive intelligence is a combination of a huge, disciplined research effort and a bucket of common sense.

Outsourcing your information? Think about this

Kicking off proceedings at his organisation’s fifth annual conference, Greg Simidian, CEO of Perfect Information, reminded us what a surprising 12 months it has been. Who would have predicted the Thomson Reuters merger would happen or the dramatic collapse of the Bear Stearns bank? With the ongoing credit crunch, ‘these are uncertain times,’ he said.


This was all the more appropriate, given the first presentation examined the tricky topic of outsourcing. As UK information professionals are all too aware, a range of information research roles are now being outsourced; it's a fact of life. Recruits will tend to be based in India, but can also hail from the Philippines and China as well as Eastern bloc countries. But what does that mean for the information profession as a whole?


The idea of the presentation was to advise delegates considering the option of outsourcing, guide them around the pitfalls and explore the impacts it has on the profession.


Rishi Khosla, co-founder and CEO of Copal Partners (a financial analytics outsourcing firm), ran an informal (and rather courageous) session, deciding to ditch his pre-prepared slides for a more conversational presentation with the audience. You don't normally see events opened this way, but it worked well enough, as attendees were grateful to have a platform to voice their concerns.


How do you deal with the problem of staff retention? asked one. Khosla admitted this was a problem: with banks offering big bucks to researchers and poaching staff after a year or so of service, the company actively recruits from second- and third-tier universities. Not ideal really, but the information industry isn't awash with cash.


An audience member was quick to point out a glaring problem facing information outsourcing right now. What concerned her, she said, is that there is no commitment to information research: 'we are being replaced with people who don't care. We have been replaced with cheap labour and this will erode innovation and expertise in the information industry.' Murmurs of assent could be heard around the room.


Of the other voices, it was suggested that because the idea of an information researcher in India didn't exist five or six years ago, it will take 10 to 15 years to build up a cadre who care and will be experts. 'I think we will see people who want to do it, and will build up expertise,' a more positive voice ventured.


Of areas with potential outsourcing growth, New Zealand, Australia and South Africa were all touted. Researchers from there will tend to be highly educated and relatively cheap, with those countries providing good exchange rates. They also may well have experience of UK processes. The downside would be a much, much smaller population to draw from.


Closing comments from the chair suggested that if you are considering outsourcing then don't do it as a cost-saving exercise; rather, think of it as a way of scaling information, getting more for your money.

Tuesday 13 May 2008

IWR looks at the info pros toolkit

As Phil mentioned in his post yesterday, it's inevitable for a business journalist to spend much of their professional life trawling endless exhibition halls and attending facsimile conference venues. Last week I attended the launch of our sister show, the e-Publishing Innovation Forum, which, although you may think I'm bound to be a little biased, I honestly thought was rather good. If you work in the field of publishing and attended, I'm sure you'll agree good material and ideas were circulating among both speakers and fellow delegates. Something to chew over as you start formulating your next big strategies.


So with myself and Peter at the e-Publishing Innovation event, Phil at Internet World, and David out and about discovering the latest technological buzz, we have all been networking mercilessly. It also looks set to continue, as for the next couple of days I will be blogging from the heart of the Warwickshire countryside for the Perfect Information Conference.


Perfect Information are billing their event as the Information Professionals ToolKit. The conference is intended to act as an all-round masterclass for the info pros attending. The list of attendees reads like a who's who of the information professional world, with representatives from the financial, legal, corporate and technological fields dominating. After tomorrow's opening remarks from Merger Markets Director, Hamilton Matthews, the first session of the morning will examine the implications of outsourcing and what impact that could have on information professionals. Advice on developing authentic leadership will follow and should give everyone there something to think about. I'll be blogging on what is being said and how it relates to you. Keep checking back for further updates over the next two days.
 

Monday 12 May 2008

ECM in a mess?

Another week, another trade show. It's a lonely existence, living off dry press room sandwiches and free vendor-branded Gummi bears handed out on the stands by women not wearing very much. Alright, it's not all that bad: events like Internet World are not only valuable for lead generation from a vendor perspective, but the keynote speakers can illuminate some interesting themes.



While most of the crowd gravitated to keynotes by MySpace, Mozilla, Google and other such cool names, I found myself in a darkened corner of the ECM theatre where Alan Pelz-Sharpe of analyst CMS Watch explained why the current enterprise content management space is "a mess". A lot of the blame was laid at the feet of the vendors, and to an extent it should be – although the technology is as mature as it's ever been, many vendors are guilty of over-hype in this space and of creating a siloed view of the market, which leaves many customers confused.



So what's new, you might ask? Well, probably more than most industries, ECM can be a minefield for the uninitiated buyer, partly because there are at least ten components to an ECM solution, any one of which, or any combination, could be, and usually is, touted as a holistic ECM solution. Problem is, they're not. Case in point: SharePoint – it may be a great collaborative tool and has the user community well onside, but it's not a "proper" ECM product really, is it now?



Well, this is all very well and good, but IT buyers have got to take some of the responsibility too. If they take the time to understand where vendors' strengths lie, what kind of customers the vendor has had before and in what sectors, and if they draw up a list of priorities – what they will need the technology to do – then they will be starting out on the right foot. Organisations also need to pay more attention to the relationships they form with their vendors, said Pelz-Sharpe, demanding to know, for example, who they will actually be working with on implementation.



Elsewhere, Ben Richmond, founder of consultancy The Content Group, said end users need to work towards a best-practice guideline for ECM. This would involve defining the term, developing and implementing a strategy and then creating a means to continually measure its effectiveness. Best practices around ECM are few and far between, and given the buyer confusion that leads to most projects failing, they would be a welcome addition.

Friday 9 May 2008

Big Broadband Bournemouth

The lucky citizens of Bournemouth are soon to get 'up to 100Mbps' internet access from their homes and businesses. If the project goes according to plan, it could either make Bournemouth an attractive place to work and live or it could give a kick up the bottom to BT et al to bring high speed broadband to the rest of the country.


Talking of bottoms, I should mention that the cabling is being installed in the city's sewers by the misnamed H20 Networks. Shouldn't that be CH4 Networks? Oh well. At least the Bournemouth project itself goes under the moniker 'fibrecity'.




In terms of speed, impact on the environment, security and cost, this approach beats the digging-up-of-roads method hands-down. A couple of kilometres of broadband can be laid in four hours at a cost of less than a third of conventional approaches.


The backbone capacity is described as 'unlimited', which suggests that 100Mbps is theoretically possible. So what would it mean in everyday life? Video, videoconferencing and IP telephony without hiccups for a start. Fast uploads and downloads of all manner of information, suggesting the possibility of offloading hefty computing activities to the 'cloud'. Remote visual monitoring of people, equipment or property. And, for those so-minded, vastly improved multiplayer games and other virtual experiences.


Without wishing to be a wet blanket, I should point out that not every provider of services to the internet wants to gear up for high speed. It will cost a lot of hard-to-recoup money. Others will see an immediate commercial value - video rentals, online training etc - and will move swiftly. We'll end up with a two-tier internet in the short term and the good citizens of Bournemouth will be watched as closely as laboratory rats.


Oh darn it. I didn't mean to mention rats.

Thursday 8 May 2008

Behaviour models: privacy issues and opportunities - e-Publishing Innovation Forum

Privacy issues, what privacy issues? Hugo Drayton, Chief Executive Officer of Phorm, dismissed the privacy fears that have been levelled against his company even before it launches its behavioural advertising tool, writes Peter Williams…


Phorm – launch is imminent – is just one example of how digital advertising is rapidly changing both the internet and the traditional world of advertising. The last few years have seen tremendous growth in search and, while it will continue to grow its share of advertising, the market remains steady. The other area of advertising which is predicted to take off, and which promises to be equally controversial, is mobile: while there is no doubt it is coming, it is still in its infancy and concerns remain over formats.


Behavioural advertising models, according to Drayton, are much misunderstood. The Phorm model is not sinister. It is an anonymous process which does not hold data. And it is a trend which is here to stay. Proof? Well the big IT companies (Google, Yahoo etc) are buying up specialist behavioural targeting companies so they must think there is something in the technology. And if you are still sceptical, would you have said two to three years ago that online advertising would have overtaken TV advertising? It has happened because of a multitude of factors but particularly because of the availability of the internet.


But while the internet continues to grow, the audience is no longer as homogenous as it was. Instead the audience is fragmenting and becoming harder to reach. So in a big and complex world, the idea of a technology which will send information to users depending on their previous interest sounds like an answer to an advertiser’s prayer. The internet currently has a long tail of small sites which are an unexploited commercial opportunity, but by using emerging technologies those backwaters are capable of bearing commercial fruit.


The Phorm model works by the ISP gathering data which publishers can tag via an exchange. It promises to offer a breadth and targeting opportunity which has not existed before.


Think of it, says Drayton, as a search engine for people.


One comment which may be of particular interest to information professionals is a final thought which Drayton left with the audience: if the Phorm model works for driving targeted and behavioural advertising on the internet, then it is perfectly possible that you can do the same for content. Drayton, an ex-newspaper man, said he stood by the need for the editorial process (i.e. the editor making a judgement on what his or her readers want to see). But there may also be room for content sent to people determined by their previous search behaviour. The world of content and information may never be the same again.

Wednesday 7 May 2008

Implementing digital into the workflow - e-Publishing Innovation Forum

The final session of today's e-Publishing Innovation Forum had a more practical, rather than analytical, slant to it. Josh Bottomley, Managing Director of LexisNexis, explained to delegates how adopting efficient digital initiatives in the workflow can lead to new business opportunities.


Bottomley, who previously worked for Goldman Sachs, McKinsey and the FT before moving to dominant legal information provider LexisNexis, gave an insight not so much into how LexisNexis engages lawyers online as into how the company has evolved the way it works.


At the opening of the session, Chairman David Worlock asked delegates, “How do you move from just online research to the next thing?” It was a question that even the most successful of businesses need to consider. 


Bottomley outlined how LexisNexis moved from providing information in print, to print and online, and now to print, online and knowledge-driven solutions, risk and compliance, client development and practice management. He explained that even though the LexisNexis business model has been very successful for the company, with customers relying on it for speedy information through an online model, the big question, which echoed Worlock's, is how to keep that business growing.


LexisNexis believe this should be done in a number of ways, in particular through knowledge-driven solutions, by quantifying the value of the information provided to their clients. Bottomley compared this to a traditional tech company that would say 'here are the efficiencies we can offer you through our information'.

By offering risk and compliance, client development and practice management, Bottomley says “it’s kind of a combination with merging established content with effective technologies,” rather than a separate useful set of systems and a separate piece of useful content. 


What it takes to get this right is a huge amount of domain expertise, said Bottomley; you have to pick the customers you want to serve and the issues you want to help them with. Choosing this is very difficult to do, especially if you try to do it across too wide a spectrum. Another crucial element is bringing in a concept of behavioural change management, or convincing the customers to buy into a new initiative. Finally, he pointed out that technology and content skills are a must to get this happening. With the clients' buy-in, a clear idea of what you are trying to do and effective technology, clients will be more efficient, more agile and better able to cope with adversity.

How and why you should socialise your readers online - e-Publishing Innovation Forum

As digital media adviser to The Guardian media group, Tom Turcan's job is to socialise Guardian readers online. This morning he explained to delegates at today's e-Publishing Innovation Forum how that is currently being achieved.


The Guardian achieves 200 million page impressions a month, publishing 250 news stories and 50 blog posts a day. The content it generates is vast; there are 7,000 audio and video files in the organisation's media library to boot.


Turcan explained how the Guardian has been nurturing its readership into something more than just passive readers. They seem to have had some success; reader comments on the Guardian blog are up by 40% on last year, even though the level of content has remained roughly the same.


After dazzling us with such statistics, Turcan took us through some notions of what a community is and what it isn't. For starters, he warned against confusing an audience with a community; they aren't the same thing. A group of people with similar interests doesn't automatically count as a 'community' either. Turcan argued a community should be considered as a group of people who form relationships through the platform. A vital element, it seems, if you are trying to develop your own.


With this in mind Turcan took us through a variety of social-networking functionalities the site has adopted over the years, explaining how even something as basic as the Guardian blog, "a very thin platform", has encouraged the jump in engagement among its user base. By asking people to comment on specific items, or to rate services or content for example, you engage with the reader and can then reward them accordingly, even if that is only through kudos. It makes for a better service all round.


Giving the BBC website as a good example, Turcan explained how as a user he is more drawn to what others are reading about, on the bottom right of the Beeb news page, than to the stories chosen by the editor. I have to confess to the same habit; it illustrates his point well.


Turcan also advised publishers of the need to recognise the different types of users they serve. Some will be casual, new to the site or not using it regularly; other users, those that connect more often (commenting, rating etc), will be more engaged with the site. The trick, it seems, is that by engaging those users they then evangelise the Guardian site, which in turn brings in more casual users, and so on.


Sounding a word of warning, Turcan reminded the assembled publishers that the internet frequently allows their advertisers direct contact with customers; there is nothing to say that publishers can't disappear from the equation. The media are traditionally used to not being directly in touch with their readers, and media organisations that stay that way will need to change if they are to avoid this scenario.


It’s not just about reaching each new user, said Turcan; it’s about engaging each user so they engage more with you and each other.

Threats and opportunities in the digital age - e-Publishing Innovation Forum

David Worlock, chief research fellow at Outsell, kicked off this morning's e-Publishing Innovation Forum. Fifteen years after the birth of the web, change is endemic in our systems; the only stability we have in our industry is that change will happen. A world of collaboration is vital to us and, if nothing else, delegates needed to get energised, said Worlock.


Following shortly after, the opening keynote was presented by Vin Crosbie from consultancy Digital Deliverance. His presentation, "Thriving in the digital age: threats and opportunities for digital publishing", was a thorough examination of the state of the publishing industry, how it got to where it is and where it's going.


Crosbie highlighted how much has changed since 1908, when there were a million horses in the streets of London; ask someone from then if that would ever change and they may well have asked you how else everyone would get around. How does publishing in 2008 compare? Information supplied through print has worked fantastically since Gutenberg invented the printing press, after all.


Crosbie argued that traditional media still follow analogue practice, and this is largely based on two factors. Information is traditionally supplied based on:


1) Stories of greatest common interest
2) Stories about what the editor thinks other people should be hearing about


Traditionally this was the only way to get information out to the masses and it worked, but only up to a point: not all the information ever got out there. That may sound fairly obvious stuff, but it gave a nice grounding to a key question we may not consider enough – why have 1.3bn people gone online?


That figure is also expected to grow to 3.3bn or thereabouts – half the human population. Crosbie says this is because the internet and web can offer both relevant and personal information to the individual – and this is pertinent for publishing.


You could be forgiven for thinking that the way forward is simply to push all that content online instead. Not so, says Crosbie, who doesn't believe the websites of traditional media will save their companies.


Just think of the myriad genre-specific TV stations that have exploded since the 70s; the 80s saw a similar explosion of specialist magazines, while the 90s heralded the internet. There was, and is, a deluge of information. That surplus has meant that people want to be far pickier.


If that is the state of things, how should publishers adapt? Mass customisation, apparently, and that is driven directly by the user – hunting and gathering, as Crosbie says.


There is a huge opportunity for creating a delivery system that is able to provide that highly specialist information. Crosbie cited the Telegraph's experiment last year, which emailed a thousand readers a unique PDF specialised to their interests.


Citing newspaper circulation figures in the US, Crosbie highlighted that for every lost print subscriber, it takes 20 to 50 online users to make up the lost revenue. The current business model isn't working.
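
Crosbie didn't spell out the revenue arithmetic behind that ratio, but a back-of-envelope calculation with assumed figures shows how a number in the 20-to-50 range can fall out; the per-subscriber and per-user revenue values below are purely illustrative, not his data.

```python
# Illustrative only: assumed annual revenue figures, not Crosbie's numbers.
print_revenue_per_subscriber = 300.0  # assumed yearly print revenue per subscriber (cover price + print ads)
online_revenue_per_user = 10.0        # assumed yearly online ad revenue per regular reader

users_needed = print_revenue_per_subscriber / online_revenue_per_user
print(f"Online users needed to replace one lost print subscriber: {users_needed:.0f}")
# -> 30, comfortably inside the 20-to-50 range quoted
```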


Crosbie left us with some thoughts to mull over:


1) Information has to be good enough but not perfect, so long as it's out there
2) Brand is no defence; it doesn't matter as much to people anymore
3) The siloing of information will end; information wants to be free, and that means collaborating with competitors.


This isn’t to say that mass media no longer has a place; people will want to know about the big stories of the day, and the weather too. However, Crosbie says we have to accept that the reader now has far more control. This shift has meant that the value of content has gone down, but it also means that content is now much more affordable to more people than it ever was before. And perhaps that is where future opportunities will come from.


More later…

Tuesday 6 May 2008

e-Publishing Innovation Forum

On Wednesday and Thursday this week, IWR will be blogging from the launch of our new sister show, the e-Publishing Innovation Forum.


Tomorrow's line-up of sessions will play host to speakers from across the publishing spectrum, such as Tom Turcan from The Guardian on the topic of "Working with social network platforms to build online audiences". The afternoon will host Josh Bottomley, Managing Director of LexisNexis, who will examine the opportunities available for getting the most for your money in his presentation "Streamlining processes: getting ROI on your workflow".


Grace Baynes from Nature Publishing Group will ask what the options are for developing niche social networks in an online world dominated by Facebook. There will be other sessions from Outsell ("Who are your customers? Understanding their needs and behaviours"), Bauer Consumer Media ("How Social Networks are changing everything") and Vin Crosbie from Digital Deliverance ("Thriving in the digital age: threats and opportunities for digital publishers"). The opening day will be packed with useful nuggets of wisdom and insight.


David Worlock will open the day as conference chairman.


Check back tomorrow for the latest updates.

Archive forgeries deserve an inquiry

A bank holiday weekend is probably as good a time as any to announce less-than-positive news. With the majority of workers enjoying the time off, bad news is bound not to get noticed as much.


With the government still feeling the aftershock of disastrous local election results and Microsoft deciding to drop its takeover bid for Yahoo, there was also a fascinating article in the FT this weekend. It detailed the story of how at least 29 forged documents had been slipped into the National Archives records and how professional research had revealed the documents to be duds. The bad news part of this is that police say the case will stay unsolved.


The FT article by Ben Fenton, an accomplished researcher himself, details how between 1999 and 2005 a series of files were planted in the records. You may be familiar with the case.


The documents said that, under the highest orders of the British Government, plans were made and carried out to assassinate SS leader Heinrich Himmler.


Fenton’s outline details how the evidence for forgery was overwhelming. Apart from factual inaccuracies, there are physical traces of foul play. These included pencil marks under signatures and questionable phraseology used between correspondents. The letterhead on the documents was created on a laser printer.


The documents themselves were cited as evidence in the books by historian Martin Allen. Allen himself claims to have no knowledge of the forged documents and says he was ‘the victim of a conspiracy’.


The real stink of this is that the 13-month police investigation yielded nothing but a decision to abandon prosecution as it wasn't in the public interest.


Trust in the work you use, catalogue and preserve is vital; without it the efforts of information professionals across the spectrum are undermined. This is very much in the public interest.


If criminal proceedings aren't an option then perhaps an inquiry is needed to ensure further breaches of trust aren't allowed to happen and to restore the reputation of the National Archives to its rightful place.

Monday 5 May 2008

Lessons from Infosec

Now that the dust has settled on another year at Infosecurity Europe, the chocolate bourbons digested, the old sandwiches binned and the vast conference hall at Olympia generally made spick and span for some other obscure trade show, it might make sense to revisit the key themes. It's the bane of most tech journalists' lives, but love it or hate it, Infosec has become the undisputed king of the IT security world – if you're a vendor and you're not there, people will assume you have ceased trading.



The high profile keynote speakers this time around included Howard Schmidt, former White House cyber security advisor, Bruce Schneier, the renowned encryption guru, and Information Commissioner Richard Thomas. He's clearly pretty frustrated about the lack of enforcement powers and the measly funding the organisation receives from government. At one point during his presentation, Thomas asked, "What other watchdog has to ask permission before it investigates?", and he's got a point. Until now the government has paid little more than lip service to the notion of data protection; high profile breaches seem to be just the tip of the iceberg, pointing to a more serious and deep-rooted cultural malaise which is affecting attitudes to safeguarding citizens' data.



But Gordon Brown has now finally given the go-ahead for the ICO to carry out random spot checks on government departments, with powers to do the same in private enterprises likely to follow. Thomas has noted before that the watchdog is not into witch hunts, and will do its best to help those who are trying to comply; but in equal measure he has made no secret that he will come down hard on those organisations which are flouting data protection laws.



Technology solutions can help in certain respects, controlling the flow of data into, through and out of the organisation, but it ultimately comes down to policy, people and process. With this in mind, organisations must have clear data protection policies, rigorously enforced and communicated to all staff. Be warned: the ICO finally looks likely to fulfil its intended role; no more a passive lapdog but a watchdog with teeth.

Friday 2 May 2008

Check your values before Green IT

Since global warming has (temporarily?) stopped, we now have the term 'climate change' to keep us alarmed.


It strikes me it's all a bit like religion: you can be a Muslim or a Christian and the primary thing that sustains you is your belief, reinforced by holy books such as the Bible or the Koran. Generally speaking, those of one faith spend most of their devotional time among their own. Conversion to, or even understanding of, alternative faiths is uncommon.


Such has been the mass of hype around climate change (some of it well-founded, by the way) that it has created a new faith. It's not totally unlike the much-abused 'political correctness' before it, in that to challenge is to be branded a 'denier'. But who can deny that planting crops to harvest as biofuels is fundamentally stupid?


Covering large areas of Britain's land and sea with wind farms is another classic. It would make more sense to capture energy from naturally renewable and continuous energy sources - tide, geothermal and sun - and even nuclear is being seen as increasingly realistic by some of the pros and many antis. But they'd better get a move on and first make sure the energy generated exceeds that consumed over the lifecycle of raw materials, building, operation and eventual shutdown.


Nigel Lawson has written a book called Global Warming: An Appeal to Reason. James Lovelock has written The Revenge of Gaia. I read them both together last week. Each makes interesting observations - sometimes agreeing, sometimes not, even when based on the same premises. If you've not read them, they cast different lights on the same science. I feel disinclined to follow either argument with any degree of passion, but they certainly provided plenty of food for thought.


We can choose a pro or anti stance or we can go for a more neutral 'trying to leave the planet in a fit condition for our successors'. With Green IT 08 coming up next week, there's going to be a lot of hype around. If you're going (and the conference looks interesting), I suggest you make sure you know where you stand on environmental issues before you leave, and weigh everything that's hurled at you accordingly. In particular, if someone's trying to sell you something, ask them about the nett planetary impact.

Thursday 1 May 2008

Opening the Door to Open Access

Earlier this week, the Scholarly Publishing and Academic Resource Coalition (SPARC) and Science Commons released their advice on implementing Open Access through their Open Doors and Open Minds white paper.


The paper is billed as a how-to guide on formulating an institutional license grant and the best way of getting an OA policy up and running at an institution.


I’ve had a quick read through; it’s written well with some useful advice and things to consider. At a dozen pages in length, it’s worth a look if only to familiarise yourself with current OA developments.


The paper has a good stab at the reasons behind Harvard's Faculty of Arts and Sciences adopting its landmark OA policy. It suggests what action in academia can and should be taken. Advocates (of OA) and educators (of colleagues and administrations) are what is needed, goes the argument. Librarians will usually be the engine managing resources so that scholarly articles get deposited online, say the authors.


Each of the three case scenarios outlined examines a different license grant for consideration. The paper (the press release, to be exact) breaks these down as follows:


Three different licenses, which are granted to the institution by the author, are offered for consideration:


Case 1. Broad license grant - a non-exclusive, perpetual, irrevocable, worldwide license to exercise all of the author's exclusive rights under copyright, including the right to grant sublicenses.


Case 2. Intermediate license grant - involves license restrictions that modify the scope of the license grant in Case 1.


Case 3. Narrow license grant - grants to the university only the right to deposit the article in the institutional repository, and to make it available through the repository Web site.


The paper also recommends mandatory deposit of articles in institutional repositories. Mandatory deposit may be adopted regardless of the licensing policy chosen.


All of this is particularly useful if you are considering taking similar steps in your institution...