So it's that time of year again - the time when we spend most of our days writing endless year-end round-ups and prediction stories for your delectation. Having had my fill of that, and stretched my tiny little brain about as far as it will go, I'm instead going to list some of the main predictions made by analyst firm CMS Watch.
Now the firm has scored a pretty good success rate in years gone by, so with any luck I'm going to look pretty good when the following actually happens. With any luck. First off, the inevitable: you may have had your fill of "economic meltdown" stories already, but the downturn is likely to have a genuine impact on the market next year, and could mean good news for buyers.
As I've mentioned a few times throughout the year, if you're prepared to bargain hard, good deals can be had from ECM vendors on up-front licence costs, and this will be especially true as vendors become desperate to retain existing customers as well as attract new ones. According to the analyst, year-end discounts of 50-70 per cent are already starting to crop up, although as CMS Watch founder Tony Byrne told me, vendors are unlikely to budge on maintenance and consulting fees, where they will hope to claw that money back.
The economic climate is also likely to encourage IT shoppers to look at open source alternatives, although a word of warning here from the experts: unexpectedly high development and integration costs may mean that these offerings are less of a bargain than they first appear.
Another knock-on effect of the downturn could be low valuations for WCM firms, which may spur some M&A activity in the industry. Some big ECM firms without a strong web content management solution may see this as the perfect time to buy in some expertise.
Moving away from the credit crunch, and back (briefly) to SharePoint: the next version of Microsoft's flagship 'content management' solution could be seen in beta by the end of 2009. However, it's unlikely that firms will be seriously considering it, as many have yet to fully digest SharePoint 2007.
And finally, there is perhaps a spot of bad news for Oracle, whose lack of investment in knowledge worker-facing technologies could leave the firm trailing rivals Microsoft, IBM and others. It is fine for heavy-duty document management and back-office work, but in the lighter-weight Web 2.0 applications and web content management space it could be struggling, said Byrne.
And there you have it. Now let's sit back and see what happens.
Monday, 22 December 2008
Sunday, 14 December 2008
PHP and .Net - a third way?
There are certain topics in the world of information technology which always excite intense debate and near-religious fervour. PC versus Mac, open source versus proprietary; whatever stories we push out that come down on one side or the other, you can be sure that feedback from readers will be swift, uncompromising and sometimes abusive.
Now I'm all for debate, and I'm up for a lively exchange of views as much as the next man, but sometimes the level of devotion to one side or the other is so high it's quite bemusing. Wouldn't it make everyone's life a little easier if we all just got along and put our differences on the back burner?
We've already seen promising moves in the content management space to bring the competing ECM vendors together and make their products interoperate with each other. The Content Management Interoperability Services (CMIS) initiative has been hailed by many as a major breakthrough for the industry, as it could finally ease the IT headache of having to manage multi-vendor, multi-repository environments. With Microsoft, IBM, Alfresco, OpenText, Oracle, SAP and Adobe all on board, it stands a great chance of success. Finally the vendors have realised that the industry can't grow unless greater interoperability is achieved.
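To make the promise of interoperability a little more concrete, here is a minimal sketch of what vendor-neutral repository access could look like once CMIS-compliant servers and client libraries arrive. It assumes a Java client such as Apache Chemistry's OpenCMIS, and the endpoint URL and credentials are purely hypothetical; the point is that the same query should run unchanged whichever vendor's repository sits behind the URL.

```java
import org.apache.chemistry.opencmis.client.api.ItemIterable;
import org.apache.chemistry.opencmis.client.api.QueryResult;
import org.apache.chemistry.opencmis.client.api.Repository;
import org.apache.chemistry.opencmis.client.api.Session;
import org.apache.chemistry.opencmis.client.api.SessionFactory;
import org.apache.chemistry.opencmis.client.runtime.SessionFactoryImpl;
import org.apache.chemistry.opencmis.commons.SessionParameter;
import org.apache.chemistry.opencmis.commons.enums.BindingType;

import java.util.HashMap;
import java.util.Map;

public class CmisInteropSketch {
    public static void main(String[] args) {
        // Connection details are hypothetical - point them at any CMIS-compliant repository
        Map<String, String> parameters = new HashMap<String, String>();
        parameters.put(SessionParameter.USER, "demo");
        parameters.put(SessionParameter.PASSWORD, "demo");
        parameters.put(SessionParameter.ATOMPUB_URL, "https://ecm.example.com/cmis/atom");
        parameters.put(SessionParameter.BINDING_TYPE, BindingType.ATOMPUB.value());

        // Open a session against the first repository the server exposes
        SessionFactory factory = SessionFactoryImpl.newInstance();
        Repository repository = factory.getRepositories(parameters).get(0);
        Session session = repository.createSession();

        // The same SQL-like CMIS query works regardless of which vendor sits behind the endpoint
        ItemIterable<QueryResult> results =
                session.query("SELECT cmis:name FROM cmis:document", false);
        for (QueryResult hit : results) {
            System.out.println(hit.getPropertyValueByQueryName("cmis:name"));
        }
    }
}
```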
And now, at a programming level, comes a new initiative which could finally reconcile the great divide between PHP and the .NET platform, for the good of everyone - courtesy of WCM vendor Jadu. Development of the Phalanger PHP compiler was funded by the firm, but it is now releasing it into the open source community. It essentially enables the creation of PHP applications which can run natively on the .NET Framework, allowing firms to make use of PHP apps without needing to rip out existing .NET/Visual Studio environments.
So PHP developers will finally see the lucrative Microsoft customer base open up to them and, as Ovum analyst Tony Baer told me, it provides "one less reason for .NET development shops to oppose allowing PHP into their walled gardens". It seems web developers can finally have their cake and eat it, profiting from the ease of use and effectiveness of the PHP language and the richness of the .NET platform - a benefit which will surely cascade down to end users in the form of more compelling applications. It's good news too for the careers of developers, who will no longer have to go through costly retraining on PHP or .NET if they want to get on in the industry. See, everyone's a winner when we look for ways to work together.
Monday, 8 December 2008
UK data breach notification laws?
After all the recent news about the new powers to be granted to the Information Commissioner, Richard Thomas, another piece of information pushed out by the Ministry of Justice appears to have gone rather unnoticed.
It was a definitive statement saying that the government would accept Thomas's request that there should be no US-style data breach notification laws for private sector organisations in this country. Of course, public sector organisations are already forced to report any significant "actual or potential" data losses to the ICO - so why not private sector firms?
Well, Thomas has argued that the US experience has not been a particularly good one. It's certainly true that mandatory notification laws have the potential to desensitise the public to data losses. If breaches are in the news all the time then the public is less likely to pay any attention - although you could argue that this is pretty much already the case. Then there are problems such as how high to set the notification threshold, and whom firms should be obliged to notify - just their customers, or the relevant authorities too? And on top of this there is the potential problem of phishing attacks. Criminals could well decide to send out mass emails after a large data breach, hoping to strike gold by telling an organisation's customers that there has been a breach and that they should reconfirm their details.
But is the alternative to breach notification laws really the best option? The Information Commissioner, and now the government, seem to want a situation where private sector firms have to report breaches only as a matter of good practice. Although fines will be levied according to the seriousness of the breach, a system of voluntary disclosure hardly seems like the best solution.
The negative impact of a data breach can be so great that it may well tempt some firms to keep quiet in the hope of covering it up. The absence of a breach notification law also means that the government and law enforcers can't get any idea of the true scale of the problem, which is woefully underreported at present, according to most experts.
This could all be a moot point, of course, if the EU has its way. Data protection supervisor Peter Hustinx told me recently that European breach notification laws could come into force for telcos and ISPs as soon as 2011. He also argued that it would be "fair and in line with reality" for them to be extended to all firms. Were this to happen, we could be in that rare situation where European laws actually have a positive outcome.
Thursday, 4 December 2008
Online Information Conference - Day 3; The Digital Company in 2013
How will technology change the way organisations operate and do business with one another?
The task of understanding how we use information, whether as casual browsers, consumers or scholars, has been one of the themes examined rigorously on the last day of the Online Information Conference.
The JISC National Observatory project studied how eBooks are used in academia and what students and librarians alike require from them. Traditionally trusted sources (such as journalists) were no longer seen as being as credible as they once were. "Print is dead," said one speaker. David Nicholas, meanwhile, gave us a stark warning: we need to start understanding how users actually interact with information - not how our traditional assumptions would like to think they do - and adjust our models accordingly; that much is clear.
The irrefutable conclusion was that things are going to change, with demand for information online continuing to increase rapidly.
In attempting to meet that demand, it is essential for content providers, whether publisher or amateur, to understand how users behave when searching.
Dennis McCauley, Director of Global Technology Research at the Economist Intelligence Unit, came to talk to us about what the digital company will look like in five years' time.
McCauley explained that research his organisation conducted recently points to a greater age of collaboration. Why? Because it is much harder for firms to go it alone; organisations will need to "let go", he said.
Technology will remain the greatest driver of business change, and it will also add complexity of its own, changing business models and the nature of demand for a company's products, he added.
As customers continue to become more tech-savvy, online communities will grow greatly in importance. This will be beneficial in that organisations will get to hear what their customers think (whether they like it or not), but it also carries the risk of a mob mentality should you cross them.
This all ties in with the collaborative behaviour McCauley outlined. The expectation is that some of the best ideas will arise from improved interaction with clients.
Social networking applications will be widespread in the enterprise; those that don't like the idea will need to start accepting it, and extend their trust with it, or get left behind.
In summary, McCauley gave us these points to consider:
There will be no big technology jumps in the next five years - although e-books will become more widespread
Technology will be used more effectively, but differently, within organisations
Collaborative technology will be mastered
Obstacles include rigidity in organisations, tight budgets, skills shortages and security concerns
Wednesday, 3 December 2008
UPDATE: What Future for Search?
I was going to wait until tomorrow's keynote on Search before blogging about this, but I understand that the session will now count Andrew Kanter, Chief Operating Officer of Autonomy, as a panellist.
As he is one of the senior figures at Autonomy, it is going to be revealing to hear what he has to say about search and what plans the organisation has for its technology. If you are involved in enterprise search or just interested in semantic developments, this will be a must-see.
With search guru Stephen E Arnold moderating the session I have a feeling that the panel will get a thorough grilling.
We will be blogging from the session, so if you can't make it, check back here around mid-afternoon for a round-up.
Online Information Conference - Day 2: Using Web 2.0 tools in a learning environment
We all hear a lot about how Web 2.0 is applicable to all areas of life, both at work and at play. The education sector is certainly no exception, with the library poised to play a crucial role in what is offered to scholars.
Professor Anne Morris, from Loughborough University, examined the technology surrounding libraries and the services they provide in higher education. What they want to offer students is a richer learning experience.
The key thing with 2.0 tech is that the more people use it the better it gets, as far as libraries are concerned, she said.
Morris gave us a quick run-through of what is on offer and its potential for helping students learn better.
Blogs - encourage the development of communities and facilitate communication among librarians (Stanford University being a very good example).
Wikis - offer an easy way to create lists and tips, as well as an easy way to comment on LIS services. There are, of course, issues with trust and security, but that is true of any wiki.
Instant messaging (IM) - has been used for reference management, training and immediate online assistance.
One example Morris gave us was the offering from OCLC, or QuestionPoint as it is known to its users. It's a good example of libraries spreading the burden of information sharing, and it works well when a group of libraries uses the technology.
Podcasts - make a wide choice of material available to students, whether as a lecture, interview, conference or tutorial. The list is substantial.
Social networking - can be applied to recommendations, listings of popular materials and the opportunity to work in groups. The virtual bookshelf available on Facebook is a nice way to highlight the favourites in your collections and offer recommendations and reviews.
What the Pilkington Library at Loughborough has done is adopt a range of these ideas, such as a podcast introducing the library, a blog, and RSS feeds covering either all new material that comes in or a specific subject area.
What did the students think about all this? Had they even heard of the concept of Library 2.0? The research that Morris and her team conducted on Information Department students showed that less than half of them knew what the term Library 2.0 actually meant. More worryingly, over 70% hadn't used the tools or even known that they existed. However, when asked whether they would find updates tailored to their specific needs useful, over 70% expressed a positive interest.
Morris's general conclusion was that students held mildly positive views of Library 2.0 tech. The most welcome technologies were RSS feeds, podcasting, IM and professional reviews of books. There was little faith that fellow students would contribute much in the way of recommendations of their own. The key thing to consider is that whatever technologies you are thinking of adopting, make sure they are user-centric, specific to students' course needs and, of course, wanted in the first place.
Online Information Conference - Do We Have a Profession? (part 2)
Following Natalie Ceeney's presentation, Gloria Zamora (President Elect, SLA, USA) examined the alternative careers on offer for the intrepid info pro. Taking a similar position to Ceeney, she added: "We are our own worst enemy; we need to blow our own horns more than we do." Zamora believes that the profession sometimes attracts the kind of people who want to help everyone but have little tendency to promote themselves.
One of the things the SLA has taken on to deal with this issue is the Alignment project, which is concerned with finding out what senior decision makers think the librarians, record keepers and specialists in their organisations actually do.
The perception seemed to be that while people understand what a librarian is and does, there is a lack of appreciation of the specialisation that comes with the territory. Similarly, there was uncertainty about what an information professional is. Sound familiar when you try to describe what you do to the layperson?
For Zamora, so long as info professionals get out there and bang their drums, there is a positive future - so much so that she believes we should start by bringing in people from other disciplines such as IT. Of course, that doesn't mean giving up influence over the decision-making process. The fight to make yourself heard will be an ongoing one.