Friday 31 August 2007

The Facebook privacy and productivity puzzle

Nothing worse for a group blogger than to find his editor blogged on the same subject the previous day. Yep, it's pesky Facebook again. For me, for the last time. I hope.


The BBC now has 9,333 Facebook users. That's people who actually possess a valid BBC email address. I'm sure that's over a third of the company. They were turned on to social computing a few years ago, as reported in IWR, and this looks like a strong endorsement of Facebook by a bunch of knowledgeable social networkers.


Yet, other companies are blocking Facebook completely. Sophos gave some examples in a recent report but, because they were all banks, I was concerned that it might have come from a biased source. However, the same company has conducted surveys to reveal corporate attitudes to Facebook.


It discovered that 43 percent of the 600 respondents said that their companies were blocking Facebook, while a further seven percent were allowing employees with a specific business requirement to use it at work. Of the remainder, eight percent (in response to a leading question) said that they feared that "workers would complain" if access were switched off.


In a later survey, Sophos learned that a third of respondents believed that colleagues and employees "were sharing too much information on Facebook". Again, a bit of a leading question, but one that serves Sophos' interests well. It also offers behavioural guidelines on how best to protect individuals and companies from such threats.


Here are the (paraphrased) topline items:


  • Use Facebook options to protect your identity

  • Think carefully about who you accept as a friend

  • Use the cut-down profile for 'friends' who aren't really

  • Disable all the options and re-enable each when you realise you need it

If you don't believe the risks, do take a look at Sophos' Freddi the Frog story. Eighty-two people befriended an imaginary frog and handed over their personal details.


If you want to understand more, there's a very promising event brewing in October. Called The Facebook Debate, it is being run by the British Interactive Media Association. Entry costs £25 (£15 for BIMA members) and, although there's a panel, most of the action is expected to be in the audience.

Thursday 30 August 2007

Union group warns of Facebook sackings

The Trades Union Congress (TUC) has warned Facebook users to be careful, and has urged employers not to sack workers for using the social networking tool during work time.



The TUC has written a guide to acceptable Facebook usage in the workplace, in response to growing unease among organisations that Facebook is reducing productivity. Facebook has become popular for social and professional networking.


Staff across the UK have already faced disciplinary action or dismissal for overusing Facebook at work. Daily newspaper The Guardian reports that employees of Kent County Council have been sacked.



With 3.5 million registered users in the UK, the TUC describes Facebook as an accident waiting to happen. "Simply cracking down on the use of new web tools like Facebook is not a sensible solution to a problem which is only going to get bigger," said Brendan Barber, General Secretary of the TUC. Instead he advises: "It is better to invest a little time in working out sensible conduct guidelines, so that there don't need to be any nasty surprises for staff and employees."



These are good suggestions. So far IWR has not come across any figures showing how much work time and productivity has been lost to Facebook, but we have heard anecdotal evidence of sales being lost and of workers spending considerable amounts of work time within social networks.



Policing social networks will be hard. IWR and many other titles have promoted them as useful tools in the information landscape. Unlike games or even YouTube, it is difficult to see where using Facebook is purely for pleasure and where it is a useful corporate tool. Add in the fact that many managers are using these tools to communicate with their teams about both corporate and social issues, and the line gets even blurrier.



Ultimately, guidance will have to come from the top of the organisation. Managers should be involved with these tools in order to get the best from them, but they should also set the parameters for staff usage and enforce them, before UK Plc grinds to a halt under a winter of disreputable content.





Wednesday 29 August 2007

What’s making SharePoint so hot (and not so hot)?

One of the not-so-secret secrets of Microsoft’s rocket ride of the 1990s was its understanding that success in commercial software almost always goes hand in hand with a nexus of links to ISVs, channel partners, hardware makers, consulting firms and so on.


The firm is at it again with the release of software developer kits for SharePoint Server 2007 and SharePoint Services 3.0, replete with Microsoft’s usual “how to” wizards, templates and demystification guides.


It’s not surprising to see how well SharePoint is doing at the moment. It integrates with other key Microsoft infrastructure, it’s (relatively) cheap and it lets you do most of the things that you would want to do with an ECM system.


What does it lack? Having spoken to a number of experts for an upcoming article in Information World Review on SharePoint, I would say:


  • Attention to bespoke features for vertical markets and associated rules and regulations

  • Clear licence terms and conditions

  • Control tools that can prevent SharePoint growing everywhere in the organisation at the cost of manageability

  • Integration with latest Microsoft developer tools


What else is missing in SharePoint and what doesn’t work? I’d like to hear, so drop in a comment or get in touch by email.


Incidentally, lest anybody in the green-font brigade suggest I’m selling ads for Microsoft, let me link to a good post by Matt Asay on Microsoft’s paid-for research into why SharePoint is better value than open-source ECMs. Asay is right. This sort of paranoia does the company no favours.


And finally, a song about over-zealous SharePoint users. Oh yes, things have got to that point.

Don’t be a sundial in the shade

I may be jumping the gun a bit but, as our woeful summer draws to a close, I'm going to set my sights on the forthcoming Online Information show, in particular the award for IWR Information Professional of the Year 2007.



The notable accolade is given in recognition of the individual who has made an outstanding contribution to the information profession or to the practice of information management. That contribution may have come through exceptional project or collaborative work, either between or inside organisations. Innovation in best practice, or in new services and solutions that contribute to the information profession, will also be considered.



The announcement of the 2007 winner will be made on day one of the Online Information Conference (4th December), so if you have a colleague or peer, or even yourself or a client, that you think deserves the award and recognition, then get in touch by emailing nominations to IWR editor Mark Chillingworth.

Please bear in mind that the deadline for nominations is Friday 9th November. 



In case you need any further persuading, consider what US founding father Benjamin Franklin said about achievement: "Hide not your talents, they for use were made. What's a sundial in the shade?"

Tuesday 28 August 2007

Web TV can be profitable - if broadcasters know how to tune in

It's been long predicted and, in many ways, it's already here if you know the right people: TV is having an iPod moment. Don't take my word for it, though: take Vint Cerf's, writes Ben Tudor.


Cerf's address to the great and good at the Edinburgh festival makes for fascinating reading, and it's a tad recursive - you can watch a clip of his interview afterwards courtesy of Reuters TV.


Basically, video content is now freely available on the web. If a PVR or mobile phone can record it, it can also be distributed freely, without much regard for issues of ownership or copyright. This is pretty much what hit the music industry when the combination of CDs and cheap PCs made copying digital music cheap and easy. Bear in mind the usual caveats: home taping didn't kill music, and VCRs gave movie firms a huge boost to sales of their back catalogues instead of killing off the practice of going to the pictures.


What does this mean? Well, forget the company Vint's on the board of for a moment. Google - and by extension YouTube - are the periscope of a big old submarine. Far more important are sites like UKNova, which allow people to download British and other TV shows using peer-to-peer clients. UKNova (if you can get an invite) is a great resource, and possibly a great alternative to a PVR. Virgin, Setanta, Channel 4, BT and Tiscali are among the firms trying to provide TV on demand, but they have to compete with the free, downloadable TV shows that people have recorded from broadcast themselves and shared. The likes of UKNova may stand on shaky legal ground, but people want content, and they get it. If I want to do a little viewing around the birth of skateboarding, I will buy or rent Dogtown and Z Boys, but I'll also catch a World in Action documentary on skateboarders on Google Video.


I'm expecting TV companies, certainly in the UK, to take a more enlightened approach than the RIAA, which boneheadedly refuses to accept that its business model needs to change. On the other hand, I also expect TV firms to see this as a revenue opportunity. People are used to seeing ads online and on television - both media suit it well. What do you think?


Friday 24 August 2007

The social computing equation

The BBC, as you probably know, has a thriving social software setup. Because of the sheer numbers of staff who participate, it is possible for staff to find help and answers at the drop of a hat.


Smaller businesses - ie those with fewer than thousands of staff - will find it difficult to get the critical mass needed for this sort of thing to happen with any degree of dependability.


The question vexing many is whether to allow access to the wider world of social software. Currently, Facebook is probably exercising them, just as blogs and Skype have in the past.


No doubt you've heard of the law of Karma where the wheel turns and good given comes back in time, often in unexpected ways. This is where individuals and companies can benefit more from social software than it costs them.


Real life example: Last weekend, I was asked to give a short speech in Norway tomorrow. I wrote it in English on Monday and threw it into a machine translator. Even I could see the results were not good.


I checked my Skype contact list and found a Norwegian, Sigurd Rinde, I'd met through our blogging activities. I left him a note. I also joined a Norwegian language group in Facebook and left a note there as well.


Within seventeen minutes of Sig reading my note, I had his translation in my hands. A short while later, a Facebook user offered to help and, minutes later, she'd suggested a couple of improvements. Then Sig's wife improved it still further. Then Sig made an MP3 recording so I could learn how to pronounce the words properly.


These various actions were a matter of minutes for the participants (we were only talking about 150 words). Imagine the hassle trying to achieve the same result through conventional channels.


This morning, a Skype friend wondered if I could help him prepare a speech. I gave him a PowerPoint deck and speaker notes for a similar presentation I'd done recently. That saved him days but cost me minutes.


For organisations with less than a critical mass of intelligent people, external social communities may offer a reasonable substitute as long as the attitude is one of giving as well as taking.




Thursday 23 August 2007

Liberate information and join the Free Our Data Campaign

IWR has been watching, and supporting, the Free Our Data campaign run by the Technology supplement of daily newspaper The Guardian. Now Charles Arthur, editor of the supplement, has asked IWR and its readers to join the campaign.



"The more people and organisations we have on board, the better our chances of success," he said. The campaign seeks to make data which the public has paid for freely available, which will then stimulate new information resources for information professionals and the public alike to use.



The campaign uses a blog as its central repository and communications point, so it's easy for all of us to add more information to the cause. So if you know of examples of government information costing you or your organisation money, despite your having already paid for it once through your taxes, let the campaign know. "If everyone joins, it would be sort of hard for the government to ignore," Arthur adds.



The campaign, which has been running since 2006, has highlighted some interesting disparities in Whitehall's information policies. Ordnance Survey (OS), known for its maps, has been revealed to be charging British citizens repeatedly for geographic information that is already funded by the public purse. Local authorities pay the OS for map information during the planning application process, and planning authorities also pay separately for the same map information. During the campaign it has been revealed that a total of eight separate payments arrive at the OS as part of a planning application.



There are bound to be many more cases like this and it would be great if Information World Review and its readers can be part of a campaign to make the information we already own more easily available.

Wednesday 22 August 2007

Space, man

I know, I know; Google coverage must sound like a broken record. But I defy you not to be impressed by Google Sky. Google Earth was an excellent exercise in taking a set of data and opening it to the public for scrutiny, notation and all kinds of use and abuse. From tracking hurricane damage using NOAA imagery to spotting flying cars and Matt and Andy sunbathing, Google Earth has become more than the sum of its parts: a gigantic corkboard for everyone and anyone to add information, browse or explore.


Now Google is doing the same with the heavens. The best bit, of course, is that as more data is collected and more imagery produced, Google Sky has the potential to become a huge resource, not just for astronomers and physicists but for everyday human beings too.



Tuesday 21 August 2007

Challenging assumptions: 2.0

You may be aware of the great Library Juice blog. I ventured over there recently, and a post last week caught my attention by challenging some of our web/library 2.0 assumptions.


Rory Litwin, who runs the site, gives us the example of two recently published studies of undergraduate research behaviour. The first was conducted on University of Rochester students by anthropologist Nancy Foster. She found the assumption that modern scholars will be in sync with net technology to be mistaken.


The results of the University of Rochester study show that even though the subjects were all scholars, they can be as split in their technical abilities as any other group: some embrace new tech and are adept at using it, while others are not. Litwin mentions that the study even shows a sizeable number of undergraduates are "technologically inept".


The second piece of research on undergraduate guinea pigs comes from Alison Head and is entitled Beyond Google: How Students Conduct Academic Research. This study examined how humanities students conducted their research and found that, rather than relying on Google and questionable academic sources like Wikipedia, students adopted a hybrid method of researching. A tiny minority (Wikipedia 3%, search engines 13%) went straight to the outside web, while the majority used library sources, especially the library's website (23%). Aggregated research resources recommended by the library or their tutors were significant information sources for the students (course reading 40%). You may feel encouraged that the students placed their trust in the professionals over anonymous online sources.


I would however encourage you to have a look at the rest of Litwin’s post. He goes on to quote a stinging response by Mark Rosenzweig, Councilor at Large of the American Library Association (ALA) to Laura Cohen’s very cheery Library 2.0 Manifesto, published in the latest American Libraries.


Both make good reading, both make notable points, clearly both authors care very much about the role and importance of The Library. I do wonder though why the debate on 2.0 has to sometimes be so polarised.

Monday 20 August 2007

Of baby-boomers and bastard operators

There’s not too much new under the sun but Open Text came up with an interesting spin on the need for ECM last week. In a press release, the firm suggested that with many baby-boomers approaching retirement, there comes a parallel increase in risk of loss of business knowledge.



OK, so it’s not an entirely new insight. I’ve heard of the baby-boomer issue applied several times with reference to mainframe and other datacentre stalwart systems that depend on one guru who holds all the keys, knows where skeletons are buried and is the single source of all the know-how to keep the wheels turning.



You could also quibble and say that with changes in attitudes to careers and retirement, the twilight of the baby-boomers probably isn’t a phenomenon that will strike hard outside certain areas such as the public sector where switching between employers is not so common.



But where Open Text is surely right on the money is in suggesting the effects of this brain drain where “a big chunk of knowledge, context and process know-how walks out the door with newly retired staff”.



I had a horrible feeling that Open Text’s answer to this problem would be to buy more of its software but, to be fair, it suggests some practical advice such as:



  • the setting up of mentoring programmes

  • creating groups where retiring records professionals work with HR, IT and corporate legal executives to protect knowledge assets

  • moving away from paper processes that are anathema to the new generation of professionals



To a lot of you out there, this is just common sense but basic knowledge management and succession planning are all too often honoured in the breach.


For me, however, the big issue here is that of the “bastard operator”: the selfish employee who owns a domain and won’t share knowledge because it weakens a personal fiefdom. Any stories of how to handle this type would be worth sharing -- providing you’re willing to share the knowledge, of course…

Sunday 19 August 2007

But what about Johnny Cash?

Bear with me; this may take some time to explain.


A few years ago, around about the time that music companies decided it would be a good idea to sue customers as well as file-sharing sites, various people began warning that record companies were in terminal decline. The internet - particularly the ability to find, download and copy pretty much any piece of pop music - had put paid to a business model that had endured since the first companies started printing sheet music.


Not many people took notice. There are a couple of reasons for this. Firstly, record companies are not the sort of thing that most musicians or music fans have warm feelings for. Search for EMI, and the first page of results will be a collection of EMI's corporate websites. Search for Apple, and the first page includes Slashdot's Apple minisite and the Wikipedia entry. Secondly, record companies, despite what they did to musicians, fans and downloaders, had a point: by downloading an MP3 of a commercial track, people were breaking copyright law.



Friday 17 August 2007

Curiosity doesn't have to kill the cat

A friend of mine is big on creativity. In fact, he's a visiting professor at De Montfort University's Institute of Creative Technologies. He also runs a company that gets paid for coming up with bright ideas.


He has the most brilliant and clearly articulated approach to harnessing creativity profitably, which I'd like to spill the beans on, but can't. Not yet, anyway. At school he was a nightmare because of the way his mind worked. He fluctuated between total hopelessness and utter brilliance.


Sometimes his brilliance was misplaced but, as his subsequent life has shown, he has learned to corral his creativity and put it to good use.


Through his teaching, he is helping future generations harness their creativity. He knows this is what makes the difference between success and failure in today's frantic world.


Continuous innovation is demanded and this can no longer be the exclusive preserve of company founders or a coterie of engineers/inventors/whatever. We're all in this together and facilitating creativity seems to be right up the information professional's street.


The problem, as Jim Magee pointed out recently, is that to be creative, you have to be curious. And curiosity doesn't sit well with the suits that run organisations. I think they fear that if everyone followed their curious instincts, no work would get done.


Creativity rarely comes out of thin air, it usually comes from fresh juxtapositions of existing information. When I invented some software a long time ago (it sold tens of thousands of copies), it was the result of combining elements of Tony Buzan's mind-mapping, IBM's Bill of Materials Processing, Ted Nelson's information on Actor Languages and Vannevar Bush's paper "As We May Think".


Each of us has a different mix of information and experience, so the potential for fresh juxtapositions is limitless. The trick is to ensure that an environment is established where it's okay to dream and definitely okay to share and where curiosity and creativity are a continuous backdrop to real life.

Thursday 16 August 2007

Ordnance Survey pulls the plug on Virtual London

The public will miss out on the opportunity to view an online 3D map of London, and a potential wealth of relational data, following a decision by the Ordnance Survey (OS) to withhold licences for its information.


The Virtual London project was meant to act as a useful visualisation tool; it included three million of the capital’s buildings.


The idea was that London’s citizens would gain a better appreciation of new developments and environmental concerns if they could visualise the impact on their local boroughs. This could be through seeing the implications of land development, or through mashing up environmental impact data such as areas with high levels of air pollution or sections of the capital prone to flooding. As London’s councils are exempt from such licensing restrictions, Haringey Council have used it effectively to show heat loss from each building in their borough.


After six years of development, negotiations over licensing terms and costs broke down last week between the OS and the developers of Virtual London: UCL’s Centre for Advanced Spatial Analysis (CASA) and Google, which was to use the data on its Google Earth platform. The OS, which owns the crown copyright of Virtual London’s spatial data, sells licences to access the information; it wanted Google to pay on a per-user basis, while Google wanted to make a one-off payment. In response to the OS decision, Google is said to have released the one-word statement: “disappointed.”


As today’s Technology Guardian points out, the strange thing is that the office of the mayor of London funded the initiative in order to engage the public in their city in more substantial ways, informing them with a unique and effective tool.

The question everyone is asking is: why can’t the public access information that they have already paid for and helped fund the development of?


The OS say that they were unable to issue the licences for the project the way Google wanted because it wouldn’t fit into their current framework.


The authors of the Digital Urban blog, who are involved with the project, explain the justification the OS gave to them. The OS said: “There is an existing licensing model that works for the original purpose of Virtual London, i.e. the availability to London boroughs etc. What Google wanted to do would take it out of those licensing arrangements.”
 
But expecting Google to pay on a per-user basis seems at best unrealistic and ill-considered, at worst protectionist and out of touch. The OS approach is distinctly unsuited to the digital age: what is required now is quick and easy access to data and the ability to apply that information in new and interesting ways. The current system doesn’t seem to have caught on to that idea yet.


Perhaps Google's one-off fee offer isn’t realistic either; perhaps a yearly subscription is the solution? However, it is the rigidity of the OS position and its business model that suggests something probably needs to change there first of all.


Earlier this month, the Technology Guardian’s Free Our Data blog, ran another article that showed how government departments such as the Ministry of Defence (MoD) and the Department for the Environment, Food and Rural Affairs (Defra) had complained to the Communities and Local Government select committee, saying that the OS licensing rules were restricting Defra in the supplying of information to other European governments and partner organisations.


The MoD had the following to say: “in recent times the boundaries applied to the use of OS’s data for public service and national interest work have become increasingly blurred. MoD has experienced more stringency and complexity being applied to the release of data by OS, which has resulted in uncertainty and lack of flexibility in the use of that data by the MoD.”


If the OS are drawing this kind of considered criticism from fellow government departments, isn’t a review of its commercial operation long overdue?

Tuesday 14 August 2007

Springer survey salutes Springer eBooks!

A survey sponsored by scientific publisher Springer about Springer has, unsurprisingly, come out in favour of the Springer eBook service SpringerLink.


Six international universities were questioned as users of the SpringerLink eBook program. According to Springer, they said the eBooks enhanced user access with greater functionality and more categories of information, and "provided clear advantages over print publications". Information professionals also told Attfield Dykstra and Partners, who carried out the survey, that not having to physically handle books for archiving had cost advantages. Is there anything new here yet?


Further groundbreaking insights included "back-end efficiencies" from the "lack of storage requirements". No mention of the cost of servers.


The universities involved in the survey were the University of Illinois at Urbana Champaign; University of Florida; University Library of Turku, Finland; Centre for Mathematics and Computer Science (CWI) Amsterdam, The Netherlands; University of Muenster, Germany, General and Medical Library; and Victoria University, Australia.

In a statement, Olaf Ernst, Vice President of eProduct Management, said, “This survey is instrumental to Springer in continuing to build our eBook program, and catering to the needs of our subscribers."


He went on to say that “The future is bright for eBooks in the academic realm".

Monday 13 August 2007

ECM left out in cold by SaaS -- for now

A new Gartner report suggests that while the software-as-a-service trend is rapidly adding enterprise applications support, the ECM sector has so far been largely insulated from the trend.


Gartner highlights this disconnect, saying that:


“In ECM and search, SaaS adoption is in the range of one percent to two percent of total software spending. Within e-learning and web conferencing, SaaS accounts for more than 60 percent and 70 percent of total market revenue.”


Of course, there are a few good reasons for this delta in uptake. ECM software is bought in large part to get a grip on documents located behind the corporate firewall. Web conferencing is a blindingly obvious candidate for subscription-based hosted services. Indeed, firms such as WebEx were making their millions before they decided to ride on the bandwagon and describe themselves as SaaS or on-demand players. Well, that’s fashion, folks -- and an optimistic view of how markets might view your stock.


However, I’d be surprised if ECM didn’t get more traction in SaaS. As companies, and regulators, get more comfortable with internal documents being exposed in secure online silos, SaaS becomes a plausible alternative to in-house deployments. It’s probably going to be a phenomenon that applies to smaller outfits initially but it will be worth watching to see who goes on from storing emails in the cloud and takes the leap to business documents.


Another SaaS scenario that might apply to ECM is a hybrid format where software stays behind the firewall but customers pay vendors for services to monitor and backup systems. In this way, firms might gain a better understanding of what's happening to their ECM investments, who is using the tools, who isn't, what issues are causing problems, and so on.


ECM vendors have a bad rap for usability and ease of deployment. If SaaS companies can help firms get started and keep on using content management, they won’t be stuck at one or two percent of software spending for much longer.

Friday 10 August 2007

Tagmash: a boost for library intelligence

Tim Spalding, of LibraryThing fame, has come up with an intriguing twist on his book-tagging operation. He's capturing combinations of tags and turning the results into web pages. 'Tagmash' takes anyone who looks for a tag combination to a URL for that combination.


Since I'm reading Charles Handy's autobiographical "MYSELF and other more important matters", I threw 'Handy, philosophy' into Tagmash. This is what came back:


Tagmash


In fact I lie, because the tags list their soundalikes, which are usually misspellings and punctuated words. 'handy' stood alone while 'philosophy' included 26 alternatives. Anyone searching for 'Handy, philosophy' in future will be taken straight to the URL. New combinations take up to 30 seconds to materialise.


You can apparently skip particular tags by preceding them with a '-'. I tried 'Harry, Potter, -fiction' and got the same 16 results with or without the final argument. But, by using a double minus, I got a single result. As Spalding explains, "A single minus (-fiction) 'discriminates' against items tagged 'fiction'. A double minus (--fiction) disqualifies all books with the fiction tag." Perhaps mine was a foolish choice of tags anyway.


According to Spalding, "Tagmashes work with different things, not a thing and its category." I tried "http://www.librarything.com/tag/florence,michelangelo" and, there on top of the list, was one of my all-time favourite books.
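The URL pattern is simple enough to script, for anyone who wants to jump straight to a tagmash rather than go through the search box. Here is a minimal Python sketch; the `tagmash_url` helper is my own illustration, not a LibraryThing API, and it simply joins tags with commas in the path, preserving Spalding's minus and double-minus prefixes:

```python
from urllib.parse import quote


def tagmash_url(*tags: str) -> str:
    """Build a LibraryThing tagmash URL from a list of tags.

    Per Spalding's description: a tag prefixed '-' discriminates
    against that tag, while '--' disqualifies books carrying it.
    The prefixes are part of the tag itself, so they pass through
    unchanged; other characters are percent-encoded for the URL path.
    """
    # Tags are joined with commas in the URL path, e.g.
    # http://www.librarything.com/tag/florence,michelangelo
    joined = ",".join(quote(tag, safe="-") for tag in tags)
    return f"http://www.librarything.com/tag/{joined}"


print(tagmash_url("florence", "michelangelo"))
# http://www.librarything.com/tag/florence,michelangelo
print(tagmash_url("Harry", "Potter", "--fiction"))
# http://www.librarything.com/tag/Harry,Potter,--fiction
```

Whether LibraryThing's own encoding rules match this exactly is an assumption on my part; the simple comma-joined form is what worked in the examples above.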


Alongside the results are a cloud of related tags, related tagmashes and a list of related subjects.


Like all Web 2.0 stuff, LibraryThing and its offshoots like Tagmash are work in progress. As people get involved and use the system and talk about it, ideas for refinements pop up and get incorporated. And, of course, the more people that use it, the more intelligently the underlying system behaves and the more any dross gets sidelined.

Thursday 9 August 2007

FoI users face poorly trained staff and fear of questions

In the forthcoming September issue of Information World Review we have an exclusive interview with an information professional who is at the coal face of using the Freedom of Information Act (FoI). Rebecca Lush is the Roads and Climate Change campaigner for Transport 2000, an independent body that lobbies for sustainable transport solutions.


One body that has a wealth of information on transport usage in the UK is, unsurprisingly, the Department for Transport. Yet, as this article details, getting access to that information is nigh on impossible. Without access to proper information, a body cannot deliver on its stated aim: to inform the debate and help those keen to adopt more environmentally sustainable transport. You get a small inkling of government conspiracy and double dealing. But Lush and Transport 2000 are above the teeth-gnashing fears of conspiracy theorists. In fact, their experience is of a civil service that is poorly trained in handling FoI requests.


She tells IWR's Alisdair Suttie: ‘Government agency staff at all levels need to be better trained to deal with requests under the Freedom of Information Act. There’s a prevailing attitude that anyone asking questions is automatically bad. Until this way of thinking changes, it will always be a struggle to obtain information as easily and readily as it should be.’


A search for FoI training reveals only data protection courses. Organisations known for information professional training, such as Aslib, Cilip and TFPL, don't seem to have seen this golden opportunity coming.

Tuesday 7 August 2007

The future of libraries

“I have seen the future of libraries: It is to spend the future discussing the future of libraries.”


I had to share this wry comment with you, courtesy of Tim Spalding, founder of LibraryThing.com, posted on its blog, Thingology, last week. I think Tim makes a fair point; too often we wax lyrical about a 2.0-monikered piece of tech and not enough about actually applying it.


That said, I think we are seeing ever more promising and exciting times for libraries really utilising such technology, with credible outcomes. This may be something like the BL's digitisation project, Turning The Pages, or its Sound Archive's online offerings, or, as mentioned last week, the dauntingly ambitious Open Library initiative.


With this in mind I have been keeping an eye out for some practical advice and information on other library 2.0 experiences. I think these may shed a little more light on what is out there and how it is being used.


Choice Reviews Online has some good material this month. You do have to register to access this content, but there is a free two-month trial on offer, all for the price of some demographic information. One weighty essay, The Social Tools of Web 2.0: Opportunities for Academic Libraries, comes from InfoTangle contributor Ellyssa Kroski.


Kroski discusses which kinds of 2.0 tools to use, why an academic library should use them, and how they benefit both it and its users. Topics covered include how content collaboration software such as wikis can be usefully applied in a library environment, for everything from subject guides to a website or intranet. Then there is social bookmarking software like del.icio.us, which easily assembles substantial lists of online material and can be applied to things such as reading lists and subject guides. Bibliographies have benefited from a tool called CiteULike, which has been particularly well used with journal articles to record full citation information that can then be shared with others in the community. And sharing different formats of media, as in the BL digitisation example I gave above, shows how effective information distribution can have many different dimensions. Perhaps most importantly, all of this re-engages the relationship libraries have with their users.


There is also a fairly substantial buyer’s guide available from Choice. It details a stack of online academic resources available. It’s pretty much a software directory, so there is no need to go into much detail about those listed, suffice to say that if you are in the market for some new heavy-weight kit this might be a good resource to refer to.


Finally, for a slightly quicker run-down on what other applications the e-learning community is using, have a look at the Centre for Learning & Performance Technologies site. They have been asking their community members what their favourite online tools are and compiling the results into a list. There are familiars like Google and Firefox – no surprise to most of us – but there are a few applications in here that might be worth checking out. If you have any to recommend yourself, let us know.

Monday 6 August 2007

Barometers aren't always reliable

Open-source ECM firm Alfresco Software recently released what it called its “first-ever global survey of trends in the use of open-source software in the enterprise”.



Based on responses from 10,000 users, the Alfresco Open Source Barometer came to a very interesting conclusion:



“Windows is increasingly a popular evaluation platform for open source software but most enterprises use Linux when they go into production.”



Well, most enterprises that are Alfresco customers, maybe, but I doubt that data stands up quite so well at other shops.



One more point caught my attention:



“The research also showed that the UK lags behind in the adoption of open source, suggesting less government emphasis compared with other European countries such as France, Germany, Spain and Italy."



Maybe that’s true, although Alfresco’s customers probably aren’t a wholly reliable guide, at least when taken in isolation.



And one final one:



“Deployments of Red Hat have grown at a rate twice as fast as Novell SUSE since the controversial November 2006 patents and interoperability agreement announced by Novell and Microsoft.



‘This finding suggests that customers may not like the terms of the deal as more information became public,’ [said an Alfresco spokesman].”



Hmm, based on the best part of two decades attempting to get responses from real-life users on this sort of industry storm in a teacup, I really doubt that many customers give a damn one way or the other.



The Alfresco user barometer is a good starting place for arguments and might mature into a very useful indicator, but this is a company just starting out. I’ll be waiting a little longer before I’m confident that its research can reliably offer broader insights into the minds of the ECM user community.

Friday 3 August 2007

The liberation of public information

Ants have been in the news a lot recently. They fill holes in the road with their bodies so that their colleagues can move swiftly. In an emergency, they slow down and proceed in an orderly fashion, thus speeding the escape of the whole community. Most things that ants do are for the good of the community as a whole because, without the community, they wouldn't thrive.


The government's agonising over public information has been in the news a lot too. And it's interesting to note the differences between its approach to life and the ants'. First of all, if you read any of the public pronouncements on public information, a lot of attention is paid to its 'value'. Yes, 'social value' is chucked in for good measure, but it's clear where the obsession lies.


Imagine yourself in a government department that is custodian of some public information and you are rewarded according to how much you can extract from others for the privilege of sharing it. How would you feel if someone came along and said "hey, the taxpayer has already paid for this information, you should be giving it back at no more than the cost of delivery." Gulp.


It's not an attractive thought, is it? But this is exactly what has been facing those who govern us for several years. I think that opening up public information was first mooted in the year 2000. Only this year, when the clamour from outside has become too loud to ignore, is the government really giving the appearance of trying to address the issue.


But, the people whose departmental funding depends on getting outside revenue aren't going to like it and they will talk endlessly about the commercial value of the IP or the need to provide a return on the public's investment. Or the need to create a 'data mashing' function within government in order to add more commercial value to the stuff we've already paid for.


The idea of liberating the data so that others can do a better job of the 'mashing' does not sit at all well with them.


Watch these people closely. They claim they will have a set of proposals ready for public comment by the end of this year. We should all be poised to scrutinise this and give our feedback. It will be a good test of whether they really want to listen or just try and put off the evil (to them) day when they have to give us back what was ours in the first place.


Unlike ants, one senses that government departments aren't actually interested in the greater good, only in their parochial needs, their bonuses and their career advancement.


Here are a few links you might find useful, should you wish to get involved:


The government's response to the Power of Information report by Ed Mayo and Tom Steinberg.


Yesterday, the Guardian published a piece on Ordnance Survey coming under fire from inside the government.


And here's the transcript of an interview (you can draw your own conclusions) with Michael Wills, minister for information, and representatives from the Guardian Technology's "Free Our Data" campaign.

Thursday 2 August 2007

Some new ways of doing things

Last week I blogged about the validity of using Second Life in further and higher education. From the very informative and reasoned feedback I got from some of you, I’m well on my way to disciple-hood. I think that as a ‘new’ method of sharing information it can excel, and we are going to see some interesting developments in both the technology of SL and how it is used.


This week I have been scouting around for some useful, albeit more conventional, online resources, and there should be something here for most of our readers. The three listed below have proven themselves worth a mention, either because the method of information delivery is ambitious, fresh and effective, or because the information is simply good-quality content.


Instant Atlas


First of all, software mapping company Geowise contacted me about its Instant Atlas product. The statistical and mapping site takes any (preferably large) set of statistical data and uses a variety of graphical illustrations and maps to present that information in new and interesting ways. They aren’t the only company doing this by a long shot - we’ve all seen plenty of the Google Earth, Flash-mapped and MS Live mashups out there - but I like the way this one gives you a decent amount of control and a good bundle of options to play with.


A great example made available in the demo is data for Scottish MPs’ expenses. Some of the higher figures, according to the Instant Atlas demo, come from those MPs who live away from the mainland, in geographically large and remote constituencies; one would therefore presume the money is spent on travel. Interestingly, the largest single expense total came from a geographically tiny inner-city constituency. I highlight this random example because it was so easy to get this overview of detailed information in a couple of clicks.


I think Geowise has done a good job of showing the tool’s broad appeal. Whatever your data, if there is the potential to add geographical context to that information, then it’s surprising how much software like this helps with clarity and therefore understanding. I can imagine there is scope for products like these everywhere from education to research, business, health - you name it. We’ll see if we can get a more detailed review done in due course and really put it through its paces.


Talking with Talis


Second up, I’ve been happily listening to the Talking with Talis online resource in the IWR office this week. I’m sure many of you will be aware of the monthly podcasts from the technology company but, for the uninitiated, it’s basically a compendium of mp3 conversations between Talis and each edition’s guest speaker. The theme revolves around web 2.0 technologies and their relationship with libraries.


With nearly two years’ worth of discussion available on the site, anyone wanting to brush up on the jargon and trends will do well to start here. The interviews can be a touch chatty and informal but, if you skip forward a little, the Talis interviewer is good enough to let the experts hold court and share their views. The commentators make some engaging and interesting points; in particular, I liked tech author Peter Morville’s discussion of ambient findability and how the authority of sources is a significant factor in search, whether the source is traditional, algorithmically ranked like Google, or user-contributed like Wikipedia. He points out, “The issue of authority is a controversial and messy one in library circles these days”, going on to say, “We are in a period of real tension between the traditional notion of expert authority and this celebration of the ‘wisdom of crowds’ and the ‘rise of the amateur’. I tend to come down in the middle: there is a great strength to be tapped from the energy of the millions of people, but at the same time I don’t think that discredits the work of an expert, who has worked and thought about a particular area”.


The Open Library


With this in mind I wonder what Peter would make of The Open Library (TOL) which launched a demo version in mid-July. The incredibly ambitious project (some would argue pipe dream) aims to “build the world’s greatest library, then put it up on the Internet free for all to use and edit”.


This means that founder Aaron Swartz and his TOL team want to use the internet to make every single book ever published, in every language, available online. Furthermore, they want everyone to contribute to this: “Not simply ‘free to the people’, but a product of the people: letting them create and curate its catalogue, contribute to its content, participate in its governance, and have full, free access to its data”, the mission statement grandly announces. All very noble stuff, I’m sure you agree.


The project has support from the Internet Archive and the Open Content Alliance. The British Library, according to the Beeb, seems to be a little more guarded about its prospects, with Stephen Bury, head of European and American Collections, saying: “We have always supported digitisation, and the more the merrier. But there’s some scepticism as to whether one day the Open Library might become a commercial site with adverts and so on.”


In terms of technical principles, and in order to achieve a usable information source, The Open Library team is developing its own cataloguing system built around an Open Library Number (OLN). A schema called futurelib is being developed, which will act in principle like the MARC format. Advice and contributions from everyone in the library community are welcomed, particularly at this stage.


Like the BBC report, I think it’s also worth mentioning that, even a few years ago, the idea of something like Wikipedia ever really taking off the way it has was fantasy. Yet through the millions of voluntary contributions from its user base, and putting aside some notable problems for a moment, Wikipedia has been a tremendous success.


Are the “public” suitable for a similar role with TOL or could this initiative give a platform for the experts to assert their knowledge, expertise and experience a little?


Perhaps the BL’s sceptical stance is more realistic and The Open Library is just a fanciful notion? Or maybe we can hope for a Wikipedia-like success, born through the initial efforts of an interested community? It may just depend on experts like you lending a hand.


As ever, please weigh in and let me know your thoughts…

Wednesday 1 August 2007

What Plaxo did next

A couple of weeks ago, I asked the last person using Plaxo to turn out the lights. I was far too hasty, it appears, to write the calendar and contact synching company off. Within a couple of days, Plaxo's nice PR people had called up, and this afternoon I listened to a presentation from vice president and co-founder Tod Masonis and marketing VP John McCrea. The bunch at Plaxo have been busy. Very busy.


I'll explain what all this is about in a minute, but first off, it's worth saying that quite a lot of what I'm going to talk about isn't going to happen until next week. I have no idea if it will work - the screenshots show it working, but I haven't actually had a chance to try any of this out. So take this with a pinch of salt, but have a try for yourself if you have a Plaxo account.


That said, I think Plaxo has had a great idea. One of the key features of a lot of the new social networking sites is open interfaces. Plaxo has built on that, and on Plaxo 3.0 (the slightly less spammy version of Plaxo), to produce something else entirely. Before we go further, it's worth reading Robert Scoble's take on Plaxo 3.0, which gives a good catch-up for those of you who, like me, forgot all about Plaxo.