Friday 28 September 2007

The Future of the Book

How important is a book? You know, the actual artefact. I am old enough to remember when paperback books were still regarded as sacrilegious. Somehow they seemed cheap and nasty.


Christmas stockings, well pillowslips in our house, bulged with properly bound hardback books. To buy a paperback as a present was to somehow devalue the recipient. Now, in our family, we're viewed as quite mad if we waste our money on a hardback. Unless, of course, we simply can't wait for the paperback edition.


So what of the future? Will we abandon atoms for bits? eBooks proliferate, but few are (yet?) regarded as the equal of a real book. They do, though, have the advantage of being sized to fit the content. Which is nice. So many books, especially non-fiction, are padded out to match the publishers' ideas of a perfect page count. Thus, in some cases, rendering them quite unreadable. IMHO. Shelf space for electronic books is infinite and delivery is more or less instant. So online definitely has its advantages.


Last week I met Bob Stein at the inaugural meeting of the Creative Coffee Club (more on that another time). He was introduced as being from the Institute of the Future of the Book. Had I known that he was a founder of the Voyager Company (a pioneering laserdisc and CD-ROM publisher) and had worked with the legendary Alan Kay (inventor of the Dynabook among many other achievements), I would have enthused a lot more about meeting him.


Still, his organisation intrigued me. He said something to me like, "a book is a means of communicating ideas across time and distance." Printed books have been a rather handy mechanism for doing this for the past few hundred years. But this doesn't mean they will continue to be. I confess to finding the idea of being bookless a bit weird. But I do like the fact that you only need ambient light in order to read one. But life moves on. And if you're interested in knowing where we might end up, especially since this has a direct bearing on your career, you might like to visit the Institute's blog or join its newly formed UK Facebook group.



Thursday 27 September 2007

Common sense prevails among the dreaming spires

Plans to build a state-of-the-art book depository for the Bodleian Library were given the go-ahead yesterday afternoon.


Commenting on the decision, Dr Sarah Thomas, the Bodleian's chief executive, said: “We shall now be able to house all our collections in a secure modern building. We can now also redevelop the New Bodleian into a major research centre for scholars… We are confident that when the depository is completed it will be recognised as a positive contribution to the City.”


The £29 million project has been designed to hold nearly 8 million books and ensure the library has enough shelving space for the next 20 years. An existing extension to the ancient library, the New Bodleian, is reportedly running at 130% capacity.


This story from the Guardian last week reported claims that “some books are shelved under pipes carrying liquids and in the caged sections books are stacked waist high and only three floors have air conditioning”.

No one disputed that a new facility needed to be built. What got the No campaigners vocal was one, if not all three, of the following issues:


The View: It is argued that the new building will ruin the view of Oxford’s trademark “dreaming spires” for tourists and residents alike. Consider, though, that the current site in Osney is also home to an industrial/business park. None of those buildings are going to be winning the Stirling Prize for architecture any time soon. I fail to see how the proposed design could be any worse than what is currently there.


The Design: The Bodleian say that the depository has been “designed to fit in with the local environment and to blend in with the Oxford skyline”. Personally, I find the sweeping roof more in keeping with an out-of-town shopping or leisure centre (sorry, Bodleian). Then again, if it’s good enough for English Heritage and the Environment Agency to back, then it’s good enough for me. Speaking of the Environment Agency…


The Flood Plain: Critics argued that as the site sits on a flood plain, housing a library depository there would be not only an unnecessary risk to its precious contents but also a potential diversion of water to surrounding areas. Understandable, when you consider the ruin flooding brought to many parts of the country this summer. Then consider that argument when looking at where the National Archives and the House of Commons are situated. They may get very soggy should river defences, including the Thames Barrier, ever fail (and Oxford relies on such defences too).

Wednesday 26 September 2007

The new Zune, or the new Windows smartphone?

The New York Times (Thanks for removing the registration requirement, NYT bods!) puts it best: Microsoft is going a fair way outside its core competency by taking on Google at the advertising game.
Google has said it will buy DoubleClick for $3.1bn, regulatory problems allowing; Microsoft has already shelled out $6bn for aQuantive. It's going to be a jolly tight-run thing.


That said, it sounds as if Microsoft is going to take a very different tack. Brian McAndrews, the aQuantive chief now running things at the Microsoft subsidiary, is keen to divorce ads from search, concentrating instead on telling advertisers how their customers get to - and click through - their adverts.



Tuesday 25 September 2007

Money men plan to cut informed British culture

The word 'cuts' has been rearing its ugly head in the information sector with far too much regularity in the last month or two. The latest two organisations to be threatened are the two jewels in the British crown of not only information, but also our culture – the BBC and the British Library.


No organisation can spend willy-nilly and, difficult as they often are to deal with, the money men have their place. But if the focus becomes too narrow, in other words too short-term, the damage can be lasting. Lynne Brindley, chief executive of the British Library, penned an item in this weekend's Observer discussing what will happen if the money men starve our national library of cash. As she rightly points out, "I simply don't want to run a second-rate organisation. Slipping from world leadership to the second tier is not something that can be reversed."


Talk to anyone on the street and they will tell you that Britain is second best at just about everything. We've lost our iron grip on manufacturing (it was only really in place because our Empire was the world market), we are no longer a military superpower and, the brilliant efforts of Lewis Hamilton aside, we are not winning every sporting event going. Yet when you tell people that the UK is the world leader in the information world, they are surprised. But once again the money men could very well cut the costs and accept second place whilst talking of being winners.


If funds are cut, the quality will drop. The question of quality has, of late, been caught up in a debate about information literacy and the egalitarianism born of the Web 2.0 movement. Yet an excellent analysis in the Saturday edition of the Guardian of the role of Radio 4 as it reaches 40 summed up what I've been feeling: "The confusion is the assumption that unstructured demotic chatter is more 'accessible' than a well written talk by someone who really knows about a topic. As sources of information and comment proliferate, the demand for authoritative, well informed programmes increases rather than diminishes." That last sentence sums up what faces the information world at the moment: not a need to ditch our methods in favour of Web 2.0, but to improve our resources to complement Web 2.0.


The same is true of the British Library. According to Brindley, for the cost of a cup of coffee and a muffin (presumably at the BL café), the nation has access to some of the most important cultural, academic and informed works on earth, including the Magna Carta.


If these two bastions of information and quality are reduced to silver-medal holders, then the information industry as a whole will suffer.

Monday 24 September 2007

Online software's steady march

Two stories caught my eye on the excellent Techmeme site this morning: EMC’s acquisition of Mozy, a developer of online storage, and Google’s decision to spruce up its GMail service.


Anybody with a passing acquaintance with software trends will have found it hard to miss the move from disk-based software to web-based programs, which has been going on for over a decade now. As that prolonged timeline suggests, changes in software usage tend to be evolutionary and spread out rather than revolutionary and overnight, but these two news items show how, for many areas of computing at least, the corporate server and even the local hard drive are becoming redundant.


EMC has plenty to lose if the business datacentre server starts to fade. The acquisition of Mozy, one of many startups that offers storage as a service, suggests EMC recognises that if anybody is taking its traditional business away, it might as well be EMC itself.


For many companies, online storage makes a lot of sense. You don’t hire storage admins, you don’t buy a ton of disks and tape, and you don’t have spikiness on the balance sheet.


As for GMail, it started out as a consumer-orientated service but edged into corporate activities because for many it represented a superior way to do business. I still prefer Yahoo Mail, but GMail changed the face of freemail because it was the first to dish out large amounts of storage at no cost.


Now many of us use freemail services in preference to the business standard because we find them easier to access and because they frequently offer better security, performance, usability and resources than our employers are willing to provide.


For content management, the advantage of having files and mail available on the web is that we can use standard web services and tools to manipulate the data. However, nearly all business users still have some dependency on local storage, for reasons such as latency or rules and regulations on the location of data.


Despite this, many startups are now effectively running their whole businesses on tools such as Google Apps and Salesforce.com. I strongly suspect that many other firms haven't moved only because of inertia or the threat to the status quo in IT, procurement and other staffing.

Friday 21 September 2007

Unsentimental targeting of Sentiment Analysis

Fourteen months ago, I wrote an optimistic column about Corpora and its sentiment analysis software. It can take a body of news stories about any particular subject and tell you whether the underlying sentiment is positive or negative. I thought this would be a boon for marketing and PR types for tracking the market's reaction to their activities.
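

The column doesn't describe how the software actually works, but the general idea is easy to sketch. Here is a minimal, hypothetical lexicon-based scorer in Python; the word lists and the scoring rule are invented for illustration, and are certainly not Corpora's method:

```python
# A toy lexicon-based sentiment scorer. Real systems handle negation,
# context and domain vocabulary; this one just counts loaded words.
POSITIVE = {"gain", "growth", "boost", "profit", "success", "win"}
NEGATIVE = {"loss", "drop", "decline", "risk", "lawsuit", "fail"}

def sentiment(story: str) -> float:
    """Score a story from -1 (all negative) to +1 (all positive)."""
    words = [w.strip(".,!?;:").lower() for w in story.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

stories = [
    "Profit growth gives the firm a welcome boost",
    "Shares drop as lawsuit risk deepens the decline",
]
for s in stories:
    print(f"{sentiment(s):+.2f}  {s}")
```

Averaged over a whole body of coverage, even something this crude starts to show whether the press is leaning for or against a subject; the commercial value lies in doing it reliably, at scale.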


At the time, the company hinted at an imminent big win. This turned out to be a partnership with Reuters. Being a media type, I had jumped to media-related conclusions. But the company (now called Infonic) has tapped a much richer, but more mammonistic, seam. The two companies have slipped a Sentiment Analysis layer into the data management layer of the Reuters Market Data System (RMDS) used by banks and hedge funds.


Reuters is the biggest news agency in the world. If we had time to read all its output, then our view of the world would be more or less well rounded. Investors aggregate as much of this information as they can in order to make their buy/sell decisions. But, in the end, all they care about is whether there's a gap between reality and the market's perception. Trading on these gaps is how they make their money.


Governments, too, are very interested in these perceptions. So that's two markets with more or less bottomless pockets. (Update: I forgot to re-mention pharmaceuticals. The average drug has 70,000 trial reports which can be analysed unemotionally for trends.)


Infonic says that it will build Sentiment Analysis into its own search engine but, judging from the company's commercial priorities, I am less optimistic than I was about us getting our mitts on it.

Thursday 20 September 2007

Salesforce.com takes on ECM

Salesforce.com has hardly put a foot wrong in its short history, transforming the CRM and sales force automation space, becoming the toastmaster for on-demand software, and creating the model for transparency. It's fast-growing and even loved by customers. And now it’s heading for enterprise content management.


Salesforce said earlier this week that its next quarterly release will mark its entry into the ECM sector. CEO Marc Benioff pointed specifically to Documentum and Microsoft SharePoint as examples of what he suggested was a failing model of how to do ECM. His plan is to take consumer web technologies and apply them to content management, so that a person searching for a file could view results by criteria such as ‘most comments’, ‘most popular’ or ‘most recently viewed’, for example.
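

In data terms, those consumer-style views are just different sort orders over the same file metadata. A rough sketch in Python, using hypothetical record fields rather than Salesforce's actual schema:

```python
# Hypothetical file metadata; the fields are invented for illustration.
files = [
    {"name": "q3-forecast.xls", "comments": 12, "views": 340, "last_viewed": "2007-09-18"},
    {"name": "pitch-deck.ppt",  "comments": 31, "views": 120, "last_viewed": "2007-09-20"},
    {"name": "contract-v2.doc", "comments": 4,  "views": 990, "last_viewed": "2007-09-12"},
]

# 'Most comments', 'most popular' and 'most recently viewed' are simply
# different sort keys applied to the same records.
most_comments = sorted(files, key=lambda f: f["comments"], reverse=True)
most_popular  = sorted(files, key=lambda f: f["views"], reverse=True)
most_recent   = sorted(files, key=lambda f: f["last_viewed"], reverse=True)

print([f["name"] for f in most_comments])  # pitch-deck.ppt comes first
```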


As Benioff noted, this is straight out of the YouTube playbook and the move has a certain attractive simplicity. However, I’m not overly confident that Salesforce will have the same effect on ECM that it has had on CRM.


Despite Benioff’s characteristic denigration of Microsoft, SharePoint has already shown that you can have an attractive front-end for managing content, while companies like Documentum have a ton of solid experience at the more complex end of the ECM spectrum.


Salesforce's vision of a pure browser-based view of an enterprise’s assets will necessarily be incomplete, even if one were to accept Benioff's implicit belief that document users will be happy to apply ratings the way they do on the latest music videos.


Salesforce’s content push might offer up an attractive alternative to search software, or for managing information culled from sales or service calls, but that will only take the firm part of the way to a true ECM system.

Wednesday 19 September 2007

Social networking: the monetization begins

I've been kicking myself all day today; News Corp's purchase of MySpace is making better sense than before. According to the New York Times, MySpace intends to customise adverts based upon the data that its users freely provide as part of their profiles. Facebook isn't far behind, although why one advertiser thinks that journalists want to work from home (we do) on dull data entry (Oh, hold on, this sounds rather familiar) for £12 an hour (Now steady on), I'm not sure. On second thoughts, that sounds like perfect targeting.
The long and short of it is that News Corp reckons it can tailor ads to its users, using intimate data the users provide - rather than usage data that the site, or a search engine, gathers from the user's behaviour. News Corp says that the ad targeting is based on what users say, what they do and what they say they do - smart work.
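To sketch what that means in practice: ads are matched against what the user declares, not just what the site observes. The profile, the ads and the matching rule below are all invented for illustration:

```python
# A hypothetical sketch of profile-based ad targeting: match ads against
# declared profile data rather than inferred browsing behaviour.
profile = {
    "age": 32,
    "occupation": "journalist",
    "interests": {"working from home", "music"},
}

ads = [
    {"copy": "Data entry from home, £12/hr", "targets": {"working from home"}},
    {"copy": "Gig tickets this weekend",     "targets": {"music", "nightlife"}},
    {"copy": "Pension top-ups",              "targets": {"retirement"}},
]

# Serve only the ads whose targeting overlaps the user's declared interests.
matched = [ad["copy"] for ad in ads if ad["targets"] & profile["interests"]]
print(matched)  # the data-entry and gig ads match; the pension one doesn't
```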
There's a catch here. Most users assume that the old model of web advertising - get a rough idea of the user, slap ads on the page, collect revenue - applies. What they're about to find is that the data they are providing - to be shared with friends - is going to be used to sell stuff to them.
That said, there is a massive shift in the way people use the web. I now realise I'm an old 'un at the age of 32, because I'm going to say the following: youngsters use social networking differently. They are less concerned about freely sharing all kinds of information with others, and about inviting people they don't know, and have never met, to be friends. They're unconcerned that the deal they are involved in - being advertised to in a very targeted manner in exchange for the platform - is not as good for them as it was in the past. It's a toss-up as to whether the new profiling and advertising methods will cause a backlash - we'll only be able to find out by watching how MySpace users react. Assuming, of course, that they react at all.

Tuesday 18 September 2007

Preserving the past and planning for the future

Following my interview with Nathalie Ceeney, CEO of The National Archives (IWR, September issue), in which we discussed how the government is tackling the preservation of its born-digital content, I recently had an equally interesting conversation with Simon Stammers, sales director at Anacomp, about archiving documents using more traditional means.


Anacomp create and archive records onto microfiche and 16mm roll film; apparently they have the world’s largest installed base of film-generating systems (they also scan and convert documents for electronic storage and management). Simon explained that although Anacomp have a digital operation within their business, there is still a necessity for hard-copy material, typically financial and regulatory data. They currently produce one million microfiche a month at their Salford production centre.


It was interesting to see that there is still a place for this kind of preservation of information. OK, it might not be a growth industry, but Anacomp seem to have found a comfortable niche to exist in. Archiving in the future won’t just be digital.


I have also been reading posts from an attendee at the Society of American Archivists (SAA) conference in Chicago last month, where some difficult questions were asked about how and what to preserve from both digitally and physically generated content.


Jeanne Kramer-Smyth, an archives, records and information management student who attended, discussed on her blog the debates and issues raised by the panels as well as by fellow conference-goers.


For example, one session’s panel of experts examined the issues of preserving born-digital records in architecture and engineering. There was mention of archiving materials created with Building Information Modelling (BIM); the technology means certain digital printers can produce physical models, adding one more type of record to be preserved.


Kramer-Smyth says the problem is that there has been “an explosion in the various means of representation. The architecture world is catching up to other industries (such as the auto industry) that have been doing this for 25+ years”.


Other matters the SAA panel thrashed out were efforts to make digital preservation as robust as anything that a physical archive achieves. Kramer-Smyth also senses a growing effort to include archivists earlier on in preservation projects, rather than years after the content’s creation.


That reminded me of something Ceeney said in our interview: “I think you will find leading edge R&D Techies in the world are actually probably in archives because we are the ones who have seen this challenge coming and have been putting resources behind it”.

Monday 17 September 2007

Fair use benefits the economy, so Free Our Data Mr Brown

A report from the Computer and Communications Industry Association (CCIA) in the USA shows that fair use of copyrighted material is beneficial to the national economy. According to the CCIA, industries that can use material under the terms of fair use earned $4.5 trillion, which adds more weight to the arguments of the Free Our Data campaign run by the Guardian newspaper.


Free Our Data wants information held by the government, and therefore paid for by tax payers, to be made freely available so that organisations can use it.


Among the organisations benefiting the US national economy through fair use are media organisations, the education sector and software developers.


Industries bound by copyright control with no fair use aspect contributed just $1.3 trillion to the US economy.


Fair use under US copyright law is described as the use and copying of copyright-protected material to comment upon, criticise or parody. Examples include summaries of and quotes from medical articles for news, the use of media content for teaching, and the use of copyright-protected material as evidence in a court case.


The Guardian's Free Our Data campaign, run by its Technology supplement, argues, rightly, that information collected by the Highways Agency, the UK Hydrographic Office and Ordnance Survey should be made available to organisations in the UK without being encumbered by clunky copyright restrictions. Although designated as trading funds, these three organisations receive almost 50% of their income from the public sector, which means taxpayers pay for the data. Access to it is nonetheless charged for and, as a result, organisations are turning to Google Maps for mapping information rather than using information they have already paid for through their business rates.


IWR supports the Free Our Data campaign because we are passionate about online information, want to see the UK remain a leader in information provision, and want to see British information professionals continuing to manipulate information in innovative ways that benefit their user communities.

Thursday 13 September 2007

Not enough hours for social software

Dave Snowden, 'knowledge management' whizz and a lot more besides, runs an interesting blog called Cognitive Edge. One of his recent posts mentioned Bertrand Russell's observation:

"If a man is offered a fact which goes against his instincts, he will scrutinize it closely, and unless the evidence is overwhelming, he will refuse to believe it. If, on the other hand, he is offered something which affords a reason for acting in accordance to his instincts, he will accept it even on the slightest evidence."

This is exactly what happened with the Quechup social networking invite which I mentioned last week. People were getting invites from their friends, or so they thought, and they signed up without a moment's hesitation. The evidence, in this case, was the fact that the mail appeared to come from a friend, when it didn't.


Here's a more recent example. A story from Peninsular Business Services, which was swallowed whole by the BBC on Tuesday:

"233m hours are lost a month due to workers wasting employers’ time on social networking sites such as Facebook. The problem will cost employers £30.8bn per year and is to escalate."

From a sample probe of one, I learnt that people in the BBC are 'addicted' to Facebook. But whether the time spent on it is 'wasted' or actually time-saving, I have no idea. And I don't suppose Peninsular knows either. But it certainly knows how to produce a scary headline and frighten businesses into shunning social networking.


I tried to do the maths: if 47 percent (Sigurd Rinde came up with that one) of the 60-odd million people in the UK work, if they all used social software and if Peninsular's figures were true, they'd 'waste' just over eight hours a month each. Say two hours per week. Assuming a 36-hour week, anything below a 5.5% penetration would mean that users of social software do nothing else while at work.
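

Spelling that arithmetic out (every input below is one of the assumptions above, not a measured fact):

```python
# Working through the post's back-of-the-envelope maths.
uk_population = 60_000_000
working_share = 0.47                      # Sigurd Rinde's figure
workers = uk_population * working_share   # ~28.2 million

claimed_lost_hours = 233_000_000          # Peninsular's monthly figure

# If every worker used social software:
per_worker = claimed_lost_hours / workers
print(f"{per_worker:.1f} hours a month each")   # ~8.3, about two a week

# A 36-hour week is roughly 156 working hours a month (36 * 52 / 12).
monthly_hours = 36 * 52 / 12

# Break-even penetration: below this share of workers, each user would
# have to 'waste' more than an entire working month.
print(f"{per_worker / monthly_hours:.1%}")      # ~5.3%
```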


Dennis Howlett, who waxed lyrical on this subject, reckons that fewer than one percent of employees use social networking sites at work. This would suggest that Peninsular's figures are very, very wrong: it would mean that those who had access were spending over forty hours per working day being social online.


Yet, returning to Bertrand Russell, I bet people believed the figures and their implications because they fitted their instincts.

Partying like 1999

Earlier this week PaidContent.org launched its UK and European information service at a swanky Scottish bar in, err, London. IWR went along and, once underneath the deer-antler chandelier, it was as if a time-and-space wormhole had opened up and we were transported back to 1999 and the heady dotcom boom.


The zeitgeist was unmistakable: young trendy professionals in Chris Evans glasses, sharp suits, bright shirts and an excitable level of conversation about "content" and "funding". It was uncanny. The headaches from the launch parties of Boo.com, Handbag.com and anything-you-like.com have only just cleared at the IWR Editor's desk, and all of a sudden I get the feeling that it is all about to happen again.


The last web boom rapidly replaced CD-ROM in the professional information space, and those of us commentating on it for the traditional information sector were regularly told our days were numbered and the geeks would inherit the earth. In many ways everything has changed, yet also nothing has changed. Jimmy Wales and Wikipedia are significant changes, but despite falling ad revenues, the stalwarts of information still remain kings of the jungle.


Interestingly, the fund-toting entrepreneurs at this party didn't make the old mistake of predicting the demise of traditional information providers; instead I heard many conversations about partnerships, relationships and hosts. Kewego, just one of the bright (complete with lime-green logo) Web 2.0 start-ups present, talked of the importance of the "content owners" and rattled off the names of respected information providers. The general feeling I left with is that we may be about to start partying again, but this time the new players don't think they have all the answers and will replace our libraries, publishing houses and research departments; instead they see themselves as components and suppliers.


"Widget" is a term used widely in the blog world, and newspaper groups are already adding widgets to their online portfolios. The next information wave appears to be about a wealth of new ("funded" and partying) companies offering to add their widget to your information. For information professionals this means understanding what a widget is and what it offers your users, and negotiating a good deal for all parties involved.

Lies, damn lies and benchmarks

The always-excellent CMS Watch has an interesting article on benchmarking portals.


The author, Janus Boye, makes the valid point that benchmarking has to be seen in the context of what the portal is intended to do. This might seem an obvious point but it underlines how slippery even empirical data can be in the wrong hands.


We’re all familiar with the phrase about lies, damn lies and statistics (although nobody is quite sure who coined it) but I’ve always preferred the less popular one about those who use statistics the way a drunkard uses a lamp post – more for support than illumination.


In many fields of computing, benchmarking is broken. In server microprocessors, companies like Intel, AMD, IBM and Sun can make merry with numbers but they don’t mean much, at least not without a ton of accompanying explanation. The fault is not so much with the design of the simulations themselves or their accuracy, but with the multiple, and often mutually contradictory, nature of the metrics being applied. For example, in servers, some buyers will want raw performance, some energy efficiency, and an increasing number will want the perfect virtualisation host.
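

A toy example makes the point; the figures below are invented. Rank the same three servers by three of those metrics and you get three different ‘winners’:

```python
# Invented figures: the 'best' server depends entirely on the metric chosen.
servers = {
    "A": {"throughput": 950, "watts": 400, "max_vms": 8},
    "B": {"throughput": 700, "watts": 220, "max_vms": 12},
    "C": {"throughput": 820, "watts": 300, "max_vms": 20},
}

best_raw  = max(servers, key=lambda s: servers[s]["throughput"])
best_eff  = max(servers, key=lambda s: servers[s]["throughput"] / servers[s]["watts"])
best_virt = max(servers, key=lambda s: servers[s]["max_vms"])

print(best_raw, best_eff, best_virt)  # A B C: a different winner each time
```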


Too often, benchmarks are cited to impress even though the numbers are meaningless. As my old history teacher used to say, if you can’t blind them with science, baffle them with bullshit. Used properly, however, benchmarks offer a short cut to valid, independent insights.


When benchmarking an ECM, CMS or other system, the temptation will always be to impress the person who is asking. If the FD wants to see ROI, the person who bought the system will try to find a way to show ROI.


One thing that would help buyers is a little assistance from the vendors. Measurement tools based on specific criteria would let buyers offer up proof points not just for ROI but for usability, service improvements and other factors.


So far, that assistance has been limited. I wonder if some vendors are a little coy about providing tools that might produce answers they don’t like.

Wednesday 12 September 2007

Back from the dead... again

There's nowt but a note on Barrons to show for it, but something is stirring at the long-moribund website of The Industry Standard. Could it be that, a mere couple of weeks after Business 2.0 bit the dust, another title tied to the Dot Com boom is coming back from the dead?
The Standard was one of the stand-out magazines of the era, a mag that didn't take itself as seriously as others, and wasn't afraid to puncture a few egos in pursuit of the story. Sub Standard, the unofficial in-house rag, was a hysterically funny read - if you could get hold of it. If you haven't read James Ledbetter's account, you'll have missed out.


But besides making tech hacks a bit misty-eyed, what does it all mean?



Tuesday 11 September 2007

Speed bumps for the patent highway please

Last week the US Patent and Trademark Office (USPTO) and its UK counterpart, the UK Intellectual Property Office (UK-IPO), embarked on a trial run to speed up the patent application process. What this means for applicants is that those who have already had a patent examination report conducted in one country can now request an accelerated examination in the other. It’s intended to speed up the process; critics say it hasn’t been thought through enough.


Commenting on the agreement, Jon Dudas, Under Secretary of Commerce for Intellectual Property and Director of the USPTO, outlined why those working in the patent system will benefit: “Patent users worldwide want offices to cooperate more effectively. Our collective goal is to reduce duplication of work, speed up processing and improve quality.” He went on to say the scheme “contributes to a more rational international patent system”.


The pilot scheme, or Patent Prosecution Highway (PPH), is scheduled to run for 12 months, and if the six-month extension granted to a similar scheme between the US and Japan is anything to go by, the UK venture could well follow suit. The UK-IPO says in its press release that the scheme will measure demand from applicants and “quantify the quality and efficiency gains to be expected”. This also follows recommendations from last year’s Gowers Review that the UK-IPO share work with other international organisations.


Meanwhile, critics say that the pilot scheme could do with further scrutiny. Apart from usefully outlining a breakdown of patent-application eligibility requirements for the PPH, Eugene Quinn, patent attorney and blogger at the Practising Law Institute (PLI), complains that restrictions on continuing patent applications, due from November, will have a knock-on effect. It seems PPH applicants will end up penalising themselves, or as he puts it: “You would likely have to leave subject matter unclaimed in any pending application in order to use the PPH, which seems ridiculous.”


Over at techdirt, president and CEO Mike Masnick asks on his blog: “Since patents are granting a rather complete monopoly, doesn't it make sense that they should be thoroughly reviewed before being granted?” He also questions what happens if there are different levels of leniency between patent offices around the globe. If the scheme is a success, more countries will want to sign up, and Masnick argues that patent offices with a more relaxed reputation will attract companies wanting to push questionable patent applications through. What’s more, those applications will be fast-tracked to boot.

Thursday 6 September 2007

Cut the IT boys some slack

Many otherwise intelligent social networking fans have been taken in by an outfit called Quechup. They receive messages from their pals inviting them to join "the social networking platform sweeping the globe". When my first invite arrived a few days ago from Hugh Macleod, of Gaping Void fame, I was immediately suspicious. I have huge respect for Hugh's talents and we rub along reasonably well when we meet, but we're not what you might call friends.


Instead of signing up, I dropped him a note. His reply was, well, abject. A couple of days later, I got an invite from the equally well-known and respected Euan Semple. Same thing.


Something was amiss. And it was very simple. Quechup asks for the password to your email accounts so that it can, essentially, pretend to be you. It does this by offering to "see what friends are already in Quechup", but it actually takes your entire contact list and emails everyone on it.


This could prove something of a turning point for uncritical enthusiasm about social networking. We're so used to seeing 'good' organisations handle our trust with the respect we expect, we forget that not everyone plays by these unwritten rules. Perhaps the calculation is that greed pays: collecting millions of email addresses is worth brassing off a few hundred thousand people. They have to be wrong.


All this was happening as the San Francisco-based Office 2.0 conference was about to take place. There, some caution was urged by corporate types who reminded delegates that, while social software was all very nice, regulation was very real and, in many companies, a proper audit trail of customer contact is needed. Casual instant messaging conversations are not usually captured for posterity. A good point, and one that was, in all probability, lost on the many gung-ho 'social is good, corporate is bad' members of the audience.


We might moan about corporate IT and its concerns. But it's their job to protect the organisation from unnecessary digital exposure. Quechup is a timely reminder that not everyone in the social networking world is as nice as its advocates would have you believe.

Jimdo and the webset

Thanks to a heads-up from Profy, I have enjoyed getting to know Jimdo, a web-building kit site. Apparently it has undergone a revamp or two of late and, if how easy it is to use is anything to go by, it's mission accomplished.


If, like me, you are a web-building novice, usually relying on internal support for guidance, using a tool like Jimdo is actually quite empowering. Not only is it effortless to customise your own page within Jimdo (point, click, edit: that's pretty much it), it's also free.


There is a pro version available for 5 euros a month. In addition to the basic package, signing up gets you your own domain name (.org, .com, etc.), an email account, 5GB of storage, an ad-free page and an advanced stats package.


Jimdo claim that they are the only website-building platform worldwide that permits users to integrate their own design, and from my trial run it's not a bad suite of tools for doing just that. There is a fair selection of customisable images and stock photography to use; however, Jimdo is the kind of application built for users to customise from the ground up.


You can place the usual coupling of copy and multimedia on your page; whether that's YouTube videos or photos is up to you. A very basic knowledge of HTML will let you add in a whole set of widgets, allowing some great customisation options. If that seems even a little daunting, the FAQ section offers compact step-by-step guides with accompanying screenshots.


The fact that you have a built-in stats package at your disposal (even in the free version) indicates that Jimdo believe their user base will be serious enough to want to monitor usability and traffic results, rather than indulge in MySpace page-hit narcissism.


I think such easy-to-create websites will be a boon for project-based work, or useful as a presentation gateway for peers and colleagues. Most organisations will already have an official website and blog; Jimdo offers the opportunity to carve out a presence somewhere in between the two. I can certainly see myself using something like this the next time I need to share results or progress from project work around the office. The fact that it's so easy to use is what makes Jimdo fly above the rest.

Wednesday 5 September 2007

No to Office Open XML

This week's news that the ISO has rejected Microsoft's OOXML standard should be welcomed. XML is a great way of opening up documents and files to people using whatever platform they choose, and OOXML was an attempt to squeeze proprietary and trademarked technologies into what should be an open standard. There's even been a petition against quick adoption of the standard by ISO.
OOXML doesn't stand for Office Open XML for nothing; it's not about doing stuff in an office, it's about doing stuff *with* Microsoft Office. And if you live somewhere where the unbelievable cost of Microsoft Office is possibly a good deal more than your salary, then kicking out a proprietary format in favour of something anyone can use has to be good news.
OASIS and UN Edifact have worked incredibly hard behind the scenes over the years to introduce XML as a standard; it has real implications and a huge positive impact for the third world, where cheap computing can save lives.
I remember talking to a Sun spokesman six years ago and, bear in mind that Sun is partial, even back then he predicted that Microsoft would attempt to embrace, extend and extinguish XML.
I'm being incredibly partial myself here, but the choice is pretty stark; use open standards and a choice of free or premium software as it suits your pocket, or embrace standards with proprietary elements and find that free software doesn't quite cut it, but a £400 software package will.

Tuesday 4 September 2007

Information professionals guiding you to the best bits of the Blogosphere - James Mullan

From this month onwards, the Blogosphere page in the print title will be dedicated to information professionals who are shaping the blogosphere. Each month, one of your peers will explain why they blog and what benefits it provides, and reveal which bloggers they trust, or just plain enjoy.


Q Who are you?
A James Mullan, 32. It’s hard to have hobbies when you have a two-year-old child, but my ultimate goal is to catch up on sleep. I also enjoy going to the gym; in particular running. I’ve participated in a number of 10k races for charity. I also enjoy reading (not just RSS feeds), blogging (obviously), film and TV, and travelling. I’m currently Information Officer at CMS Cameron McKenna, where I have responsibility for the Library Management System, Internal Portal Pages, the production of statistics and several other applications.


Q Where can we find your blog?
A http://ligissues.blogspot.com


Q Describe your blog?
A It’s a bit of a mish-mash really, but I’m particularly interested in Web 2.0 and the impact it has on law libraries. I have posted about Facebook, Delicious, Second Life, lawyers’ use of online databases, Librarian 2.0, social media and social networking. Where possible I will post about how legal publishers are using Web 2.0, as I see this as a developing area for legal publishers, but one whose benefits are not fully understood.


Q How long have you been blogging?
A Eight months externally, three months internally


Q What started you blogging?
A I’m the chair of a British and Irish Association of Law Librarians (BIALL) committee, which works closely with publishers. One concern that another member and I had was that we weren’t advertising the work we did, and we thought a blog might be the ideal solution, as it would allow members to view posts just on issues relating to particular publishers. As it turned out, the blog became much more personal. To supplement this, in May 2007 we launched the official BIALL Blog, which I administer along with a number of other BIALL committee chairs.


Q Do you comment on other blogs?
A I try to comment wherever possible on posts I find interesting. The problem with commenting is that there are a number of blogging platforms available, some of which ask you to log in while others don’t, so immediately there is a barrier to commenting. Blogs that have comment moderation enabled can also be frustrating because blogging is so immediate you want to see your comment immediately. Unfortunately, because of spam this isn’t possible. Commenting is important because blogging shouldn’t be done in isolation. Building this ‘collaborative community’ is one reason why I try to comment on other blogs. Having said that, I don’t see any value in commenting for the sake of it.


Q How does your organisation benefit from your presence in the blogosphere?
A In several ways. One of the benefits is that I am now considered a ‘subject matter expert’ on blogging, to the extent that I work closely with a number of other groups developing blogs. The other benefit is exposure to information. I ‘unofficially’ monitor blogs for anything that may be of interest to my colleagues; for instance, posts in which CMS Cameron McKenna may have been mentioned.


Q Which blogs do you trust?
A Information Overlord and Binary Law – I consider these the most trustworthy. Trust is an interesting concept within blogging. I guess it comes down to whether you can rely on what the blogger has posted about.


Q How does it benefit your career?
A My blog has exposed me to many individuals and organisations. I’m writing for Information World Review for a start! Blogging can be hugely beneficial in terms of exposure, but it is time-consuming.


Q What good things have come about because of your blogging?
A I’ve met a lot of people I would never have met otherwise. I’ve also spoken at a Knowledge Management Conference on Blogs, Wikis and Social Media, which I would have never imagined doing pre-blogging.


Q Which blogs do you read for fun?
A Police Camera Action – for insight into the work of the police. http://policecamerapaperwork.blogspot.com
Lifehacker – because there is no way you can’t. http://lifehacker.com
The Dilbert Blog – for my daily laugh http://dilbertblog.typepad.com


Q What bloggers do you watch and link to?
A Enquiring minds want to know http://enquiring-minds.blogspot.com
Lore Librarian http://lorelibrarian.blogspot.com
Lo-Fi Librarian www.lo-fi-librarian.co.uk
Library Etc http://neilstewart.wordpress.com
Jennie Law http://jennielaw.blogspot.com
Information Overlord www.informationoverlord.co.uk

Monday 3 September 2007

Open Text comes back to life

After a period in the doldrums, Open Text is looking like a hot company again.


Shares in the Canadian firm shot up 25 percent late last week on the back of outstanding financial results, with licence growth up sharply and overall quarterly revenue up from $105.2m a year ago to $175.2m. That’s a remarkable rise for a sector that some say is slowing, and one often characterised as being full of veterans under threat from Microsoft and other menaces.


Some experts said the stock jump was in part because investors are looking for safe havens in a market that is deemed to be rebounding, unlike the financial services sector, which is taking a hammering. Certainly, IT bellwethers such as Intel, Microsoft and Cisco are also up, although none so much as Open Text.


The real reason Open Text is suddenly so popular is that it is performing. The company said the integration of Hummingbird and work on bolstering its partner channel had gone well. Good for Open Text, a straight-talking company that tends to keep marketing BS down to a minimum and sometimes pays a price for its modesty.


Becoming a $1bn-revenue company still looks a long way off, but the soaring stock price might at last put to bed (for a short while, anyway) the argument that, as the biggest of the last remaining big ECM firms not owned by an IT giant, Open Text will soon be swallowed up by SAP or some other behemoth.


Open Text's performance is probably also a reminder that hard-won domain knowledge still has value and that veterans of the sector won't be displaced without a battle.