Thursday 28 February 2008

Space travel from your local library

During my first visit to SRI International in 1980, scientists were excitedly showing off pictures radioed back from the Voyager spacecraft as it passed Jupiter. On my next visit, they were showing pictures of Saturn's rings. Getting such close-ups of these other worlds was a stunning achievement for the scientists and their powerful computers, which were able to translate the incoming signals into such beautiful images.


It would be great, of course, to be a passenger on the spacecraft but, with a journey time to Saturn of over three years, and no prospect of return, this was clearly impossible. Until now.


Microsoft, amazingly, was allowed to pimp its upcoming WorldWide Telescope product at the Technology, Entertainment, Design (TED) conference this week. The conference is supposed to comprise 'Inspired talks by the world's greatest thinkers and doers' so I cannot be alone in thinking that this was a rather tacky coup by Microsoft.


Still, the program allows its (XP or Vista) users to immerse themselves in the visible universe, panning and zooming to whatever nooks and crannies take their fancy. Never mind waiting three years to get a close-up of Saturn; it's a matter of seconds now. Or it will be in the 'spring', when it officially launches. The software changes the way we view and absorb the nature of the universe.


The pictures, which are taken from a variety of earth- and space-based telescopes, are stitched together into a totally convincing three-dimensional space using Microsoft's Visual Experience Engine. Different light wavelengths can be selected and a mouse click brings up contextual information about the current view.


Microsoft says that the WorldWide Telescope will be available as 'a free resource to the astronomy and education communities', which is an interesting way of phrasing things. But, presuming that public libraries are part of the 'education communities' bit and presuming they have pretty large screens, they could make the experience far more realistic than the average PC display. Indeed, this sort of thing could act as the hub for space events with astronomers as guest participants.




Fly the flag and fly it high

Earlier this week I ventured over to Westminster to attend the 2008 Network of Government Library and Information Specialists (NGLIS) conference. The theme was “Changing to survive, breaking professional boundaries”. It was a war cry to government information managers: adapt or die.


Essentially the busy event was about equipping government information professionals with the necessary skills to promote themselves and their profession a little better.


The language and advice followed a common-sense, if obvious, approach: “make yourself heard”, “engage”, “talk the language of business”, “learn to network better”.


An evangelical Natalie Ceeney (National Archives boss) was the keynote speaker. She talked about the opportunities for information management being at their greatest in a decade; so were the challenges and threats.


The talk was of a real shift in how government perceives the importance of its information. There is greater ministerial interest in data and information than ever. In part this is down to new technological initiatives being developed, as well as the obvious data-loss crisis that continues to beleaguer Whitehall. The bigwigs in government realise what you knew all along: there is so much more that could be done with the information it holds, both as a way of engaging the public and of doing its job better.


Ceeney’s keynote asked government info pros to consider the following points and how each is reflected in their own role:


 Knowledge Management (KM) is strategic and important; knowledge and information management must be owned at a senior level.
 Departments must ensure information management features prominently on risk registers at a senior level.
 All departments should have a coherent information strategy.
 An information or KM department should be considered just as important to an organisation as Finance, HR and the like.


It was these changing roles and relationships in government that were the biggest threat to information management, Ceeney said. Info professionals must stand up and not let IT take over their turf. The fear now is that a lack of self-promotion by info departments may be to their detriment.


Given that NGLIS spent a large part of the day examining this, there must be cause for concern. It’s almost as if there is an endemic lack of confidence among information workers in the organisation.


If Ceeney and NGLIS have anything to do with it, then government information professionals will need to change. Either that or the tireless efforts and pool of knowledge belonging to info managers could well turn to vapour in the glare of the IT-centric, business-talking profession.


Ceeney rounded off the keynote saying, “We need to see ourselves as part of a bigger whole, or it will end up in IT’s hands”. That would be a shame: information professionals, whatever sector they operate in, are a varied, highly informed and dedicated bunch. Let everyone else know that too.

Wednesday 27 February 2008

Counting the cost of data loss

Data loss has become a running story over the last few months. The question is not so much “Has there been a data breach?” as “Who now?”, writes Peter Williams.


You wouldn’t accept the catalogue of stolen laptops, mislaid CDs, unopened disks and general lack of care and attention from Kevin the teenager, never mind responsible organisations. One of the inevitable follow-up questions is: how much does all of this “now what have I done with that?” misplaced data actually cost?


The answer is further embarrassment for those responsible and more exasperation for the shareholders and taxpayers who finally pick up the tab. According to research issued this week (25 February 2008), the average total cost of a breach was more than £1.4 million. Perhaps more interesting, the 2007 Annual Study: UK Cost of a Data Breach also revealed that the most significant component of data breach costs was the financial impact of lost business due to reduced consumer trust. The study (the first, despite its title) was carried out by the Ponemon Institute, sponsored by PGP Corporation and Symantec Corp, and focused on the cost of activities resulting from actual data loss incidents as well as identifying the most frequent causes. Breaches included in the survey ranged from 2,500 records to more than 125,000 records, drawn from 21 businesses spanning eight different industry sectors.


The average cost per record lost is £47. Lost business accounts for 46 per cent of the total cost of a data breach, as a loss of trust leads to higher churn and higher customer acquisition costs, the study found. The rest of the cost is made up of notification (£1 per record), detection (£15) and ex-post activities (£15), the costs incurred after the event to help victims monitor their credit or to reissue account cards, for example.
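
To make that arithmetic concrete, here is a minimal sketch in Python (my own illustration, not part of the study) that turns the per-record figures above into a total incident cost; the lost-business element is simply taken as whatever remains of the £47 average once the other components are deducted.

# Illustrative only: per-record figures are the article's reported averages, in GBP.
PER_RECORD_TOTAL = 47   # average total cost per lost record
NOTIFICATION = 1        # per record
DETECTION = 15          # per record
EX_POST = 15            # credit monitoring, card reissue etc., per record

# Treat lost business (churn, customer re-acquisition) as the remainder.
LOST_BUSINESS = PER_RECORD_TOTAL - (NOTIFICATION + DETECTION + EX_POST)

def breach_cost(records: int) -> dict:
    """Estimate the cost of a breach involving `records` lost records."""
    return {
        "notification": records * NOTIFICATION,
        "detection": records * DETECTION,
        "ex_post": records * EX_POST,
        "lost_business": records * LOST_BUSINESS,
        "total": records * PER_RECORD_TOTAL,
    }

# The study's breaches ranged from 2,500 to 125,000 records.
for records in (2_500, 30_000, 125_000):
    print(records, breach_cost(records))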


Lost laptops are the most frequent cause of data breaches (as no doubt the NHS and MoD could confirm), accounting for 36%. The use of paper records accounts for 24%, while hackers, malicious insiders and malicious code combined are responsible for 12% of such incidents.


The number of reported data loss incidents has risen sharply in the last few years. Perhaps that indicates that organisations are taking these foul-ups more seriously, but as this study shows, more needs to be done to cut the cost and contain the damage.

Tuesday 26 February 2008

Web 2.0: rubbish name, great idea

Web 2.0 is a crummy name -- patronising, cloying and irritating. It presupposes that there is a clear generation gap between the first set of web technologies and the second set, as if the latter had been distributed like manna from heaven. It attracts get-rich-quick jerks, gimmicky marketeers, serial startup merchants, venture capitalists, conference organisers, insta-pundits. Even, ahem, bloggers.


Whatever you might find to dislike about Web 2.0, though (and that capital ‘W’, as if it equated to God, the Queen or some such, is one of them), it is a very interesting concept indeed. Catch-all term that it is, it has become a useful shorthand for software that relies on the wisdom of crowds, peer review, user-generated content, hyper-interactivity and friendlier user interfaces. It has even transcended the web to become a familiar aspect of client-server software too. It’s like classical music: we might not all have the vocabulary to describe it, but we know it when it’s there and we know what we like.


Web 2.0 has changed the way we use the consumer web and even performed the miraculous job of reinvigorating somnolent software categories such as human resources systems. But its ultimate business home is in enterprise content management systems.


The dirty, dark secret of many ECMs is that they are woefully under-used and, when used, badly. Non-existent or generic tags, systems that wither on the vine, training sessions that failed to get users moving, new silos that replace the old silos... these are common problems and an indictment of the usability, or lack thereof, of many an ECM.


Web 2.0 (that cringe-worthy name again) gets around the problem with technology-light tricks that encourage the user to participate, vote and swap ideas, or at least not lose valuable documents. Little wonder that companies like Vignette, Documentum, Open Text and Alfresco are all sprinkling the fairy dust on their most recent programs.


It’s probably still a little early to know whether Web 2.0 (last time, I swear) is the elixir that fulfils ECM’s huge promise, but it is a phenomenon that fundamentally changes the usability and, to dig out an ancient term, user-friendliness of software. Some elements will fall by the wayside, but supporters can already show that it is rewriting the face of what were unwieldy, ugly programs. Now, if only we could think of a new name...

Friday 22 February 2008

Exploit the gaps, not the silos

Toby Moores is a visiting professor at the Institute of Creative Technology and founder of videogame company Sleepydog. His company is best known for creating Buzz!, the hugely popular quiz for the Sony Playstation. He finds opportunities in the gaps between silos and traditional structures which are not good at communication and collaboration.


Of course, the barriers will fall over time and he will end up looking in new places for his opportunities but, as you might expect from a creative person, he has absolutely no shortage of ideas.


Loosely connected with this was the transcript of an interview with Tim Berners-Lee, done by Talis' Paul Miller. It's scheduled to go online next Wednesday (27th) and you might find it interesting. It gives a glimpse of how information can be liberated from its silos and encouraged to flow to where it's needed and to be aggregated and annotated in new ways.


And also connected: IBM is a company which has unleashed a swathe of communication and collaboration tools on its employees over the past ten years, tools which, once again, can liberate information from its silos to the benefit of those who are able to re-use it.


As you know, everything digital can be easily copied and moved. Hoarding is more likely to diminish a person's reputation or, worse, make them completely invisible. Creativity is easier than ever before. Sure, the average person can't create a Harry Potter film, but they can create interesting YouTube movies, machinima animations or screencasts. And, of course, information mashups get easier by the day.


Abundance is the key here. And the more abundant information (in whatever form) becomes, the less value it has in its own right. What adds value is doing something new with it. The Berners-Lee semantic web stuff can ensure that information carries its provenance around with it. The stuff that Sleepydog produces indirectly promotes the films and singers whose clips are used. Oh and the IBMers who give are more likely to do themselves good than those who simply consume.


Potentially, everyone who originates useful, interesting or entertaining raw material can gain in some way from sharing rather than hoarding. But the real winners are those who are able to innovate through original blending of existing material.


And that's where you and I come in. Every one of us is different. We each have our own take on the world and most of us are multidisciplinary to some degree. We speak one language at home and another at work, one when we're with the engineers and another when we're talking to the chef. It's a natural human thing. If you find yourself able to talk the language of each of two different groups then you are surrounded by opportunity.


It strikes me that operating in the gap between IT and management is a fine starting point.

Thursday 21 February 2008

Undermining Gowers?

Should UK copyright law automatically fall into line with the European Union? One of the major advantages of the EU is the effort to create a single market for goods and services stretching across much of Europe. Of course it sounds a nice idea, but the detail is the killer, writes Peter Williams.


As many readers of this blog will know (probably in some detail) the UK is in the middle of a hefty consultation on copyright as part of the long-standing review of intellectual property (IP). This is a serious business as it has become a familiar refrain that copyright underpins the success of a whole range of culturally and economically important sectors in the UK.


One of these sectors is music. In an earlier round of the copyright review, in 2006, the British government-commissioned review of IP decided that an extension to the 50-year term for performers’ copyright protection was not necessary. The decision at the time provoked understandable disappointment, but the argument was effectively over. Well, it was until this week, when Charlie McCreevy, the European Commission’s internal market chief, said that he was minded to propose an extension to the protection enjoyed by performers by shifting the term from 50 years to 95. This move would bring Europe in line with the US.


Announcing the plan – which of course has numerous hurdles to negotiate before it would become law – the long-standing commissioner put forward two reasons for the move: helping the poor artist with no pension, and the simple idea that the current law isn’t fair.


Let’s forget the details of this particular issue: whether the British government decides to abandon the Gowers review and the 50-year limit in favour of falling into line with what may turn out to be the wider will of Europe on the question of session musicians who were in a recording studio in the 1950s or 1960s. There is a wider principle at stake. Why is the information profession spending so much time and effort responding to consultations on British law if IP needs to be sorted at a Europe-wide level? The McCreevy speech suggests it is time for proper consultation on a thorough review of European IP law.

Microsoft wants it all in ECM

Open Text’s announcement this week that it will work with Microsoft to create services that sit on top of the latter’s SharePoint Server is a classic example of the realpolitik that currently characterises enterprise content management (ECM).


SharePoint has created a seismic shift in the category to the extent that ECM heavyweights realise that it makes more sense to make nice with the software giant than to try and stare it down. Open Text is not alone in realising this, and firms like Documentum, Interwoven, Vignette and others have done much the same. Fair enough, but the catch here is that Microsoft has a long history of turning the tables on its erstwhile partners.


The SharePoint strategy is straight out of the Microsoft playbook.


Step One: Start out with a product that integrates with other Microsoft programs, and looks and feels just like them, even if it is a little rough around the edges.


Step Two: Create licensing that provides the illusion that the program is free or very low-cost.


Step Three: Get channel partners to work with the product in order to create traction.


Step Four: Adopt the marketing message that “this is [insert product category name here] for the rest of us”.


Step Five: When traction has been gained, build up capabilities over subsequent versions to challenge, and usually beat, specialists.


Microsoft has pulled off this trick in server operating systems, systems management and many other areas. The likes of Open Text can argue for a while that SharePoint won’t scale, isn’t appropriate for certain verticals, doesn’t have top-end sophistication and so on. But when Microsoft enters a market it wants it all and software history is strewn with the names of companies that learned that the hard way.

Tuesday 19 February 2008

Getting at their real needs

I've noticed that when looking for something - a house, a car, a solution to a business problem, whatever - a lot of people arm themselves with a checklist of criteria to be satisfied.


Today, I came across an interesting variation on this theme. Someone from BT was explaining how they decide whether someone is seriously in the market for what they offer.


If you know about this, my apologies, I'll be blogging again on Friday. If not, then it's a set of 'sliders', each with an opposing statement at each end. All the user has to do is place the pointer somewhere between the two extremes. At a glance, the sales person (I'm sure they're called business consultants or something) can see whether they're in with a shout or wasting their time.


I could imagine automating this. Have a little box at the bottom of the sliders which tells you "give it a whirl" or "don't bother". It would save so much of everyone's time. The example we were looking at today concerned implementing IBM's Unified Communications and Collaboration wrapped up with BT services. Here are the extremes (or 'discussion points' as BT optimistically calls them):

Sliders
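
It is not hard to imagine what that automated box might look like. Here is a toy sketch in Python, purely my own illustration: assume each slider is normalised so that 0.0 means the left-hand ("not ready") statement and 1.0 means the right-hand ("ready to engage") statement, and that the verdict is just the average position measured against a threshold. The slider names and the threshold are invented.

from statistics import mean

def verdict(slider_values: dict, threshold: float = 0.6) -> str:
    """Return a rough go/no-go signal from a set of slider positions (0.0 to 1.0)."""
    score = mean(slider_values.values())
    return "give it a whirl" if score >= threshold else "don't bother"

# Hypothetical prospect: the slider names are made up for illustration.
prospect = {
    "budget_allocated": 0.8,
    "board_sponsorship": 0.7,
    "existing_collaboration_estate": 0.4,
    "timescale_this_year": 0.9,
}
print(verdict(prospect))   # -> give it a whirl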

When asking the people you serve what they really want, it seems to me that such an approach, using your own criteria, would a) force them to think and b) give you a realistic idea of whether you might succeed.

Monday 18 February 2008

Tackling the plague of plagiarism

Ctrl-C, Ctrl-V is one of the simplest keystroke operations you can perform on a PC. (Readers of the blog who are Mac people will agree that Apple-C, Apple-V is equally straightforward.) And yet that two-step sequence is causing a huge amount of concern in educational establishments across the world, writes Peter Williams.


Plagiarism has become a plague and one which the teaching profession seems powerless to control. One blog by a Canadian teacher asked what could be done to try to curb the way that students appeared to think that writing an essay was no more than an exercise in cut and paste. The answer according to one correspondent was to assign a 0 for the work and hand out a detention. A few detentions, it was claimed, soon brought the class to realise that passing off other people’s work as your own is nothing more than cheating. Of course, the big headache is to distinguish between the deliberate cheat and those who misunderstand or misuse academic conventions.


News from Germany this week added a new twist to the debate over plagiarism. In an apparently carefully worded statement, the editor-in-chief of Proteomics and the publisher Wiley-VCH announced that, following an agreement with the authors, they had decided to retract an article from the online journal. The article will now no longer appear in print. The announcement stated: “The article has been retracted because it contains apparently plagiarised passages from several previously published articles.”
IWR will be covering this news in more detail in its March edition.


It has to be considered good news that this incident has been brought out into the open and it seems that all of the parties concerned have attempted to deal with the issue in an open and transparent way. But what does this incident tell us? Is this a one-off or is it the tip of an iceberg? How should information professionals react to incidents of apparent plagiarism that they encounter? This is a difficult tightrope that must be walked. But some help is at hand. In 2005 JISC produced a briefing paper on how to deter, detect and deal with student plagiarism. All institutions now need policies in place to deal with this problem. Not to have such procedures is just poor management. 
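
To give a flavour of the "detect" part, one common idea behind automated checking is to measure how many word sequences a submission shares with a candidate source. The Python sketch below is a toy illustration of that idea only; it is not the method of the JISC guidance or of any particular detection service.

import re

def ngrams(text, n=5):
    """Split text into lower-case word n-grams."""
    words = re.findall(r"[a-z']+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(submission, source, n=5):
    """Fraction of the submission's n-grams that also appear in the source."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & ngrams(source, n)) / len(sub)

# Invented example texts, for illustration only.
essay = "the quick brown fox jumps over the lazy dog near the river bank"
source = "a quick brown fox jumps over the lazy dog every single day"
print(f"shared 5-gram fraction: {overlap(essay, source):.2f}")   # about 0.44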

Thursday 14 February 2008

Net Neutrality and a Green Paper

Information World Review covered Network Neutrality a fair bit in 2006. It was a time when Congress was thinking of reworking the Communications Opportunity, Promotion and Enhancement (COPE) Act 2006. Over a million Americans wrote to Congress demanding that net neutrality be enshrined in the new laws. That was the last thing that major telco and cable companies wanted. They hankered after a two-speed internet: fast for them and slow for everyone else.


What happened? Nothing. No one won except, of course, the status quo, which was maintained. Oh, and as far as the FTC was concerned, the market could sort itself out.


A couple of days ago senators Ed Markey (Democrat) and Chip Pickering (Republican) reignited the debate by introducing their Internet Freedom Preservation Act 2008. According to Gigi B. Sohn, president and co-founder of Public Knowledge, this will "ensure that open, free and accessible Internet we have known for years will continue to be open to innovation, free from the control of telephone and cable companies and accessible to everyone."


Brace yourself for a bumpy ride as the various special interest groups restart their lobbying activities.


Thus far Congress has not prostrated itself before the communications giants. What would happen, I wonder, if these proposals were put in front of the UK government? We might get a clue soon as the DCMS unveils its green paper on internet piracy. It wants ISPs to monitor traffic and take action against anyone downloading copyright material.


Hmmm. To do that successfully, wouldn't they happen to need exactly the same packet sniffing capabilities that could act as the basis for a multi-tiered internet? As I say, it must be a coincidence. What isn't a coincidence is that the drivers for this initiative are large media and entertainment corporations.


Isn't it interesting that the US bill is for the people and will be attacked by the corporations with vested interests, while in the UK it is exactly the other way round? Quite regardless of this fundamental distinction, it will be interesting to watch the progress of the two proposals through their respective administrations.

Wednesday 13 February 2008

Could an Open Access model transform healthcare?

“Openness is not binary; information or processes are not open or closed. They sit on a broad continuum stretching from closed to open, based on their accessibility and responsiveness.” So concludes the latest report from the Committee for Economic Development (CED).


The not-for-profit and nonpartisan organisation has recently published a report (Harnessing Openness to Transform American Health Care) which examines how utilising an open access (OA) model would benefit healthcare and medical research in both the US and the wider world.


Comprising business and education leaders from a wealth of backgrounds, and with donors ranging from Pfizer and GSK to Yahoo and McKinsey & Company, the CED’s report helpfully defines the criteria for what it suggests constitutes ‘open’ information.


Ultimately, they say, it boils down to two things. The first is that information must be accessible: data should be both available and free from restrictions. The second is responsiveness: the more malleable or redistributable the information is, the more it can be considered ‘open’.


The report is interesting in how its findings suggest that open information will lead to benefits for healthcare systems in general as well as for the wider biomedical research field. Obviously the report has a sharp US skew, but there are global implications in there for researchers and information professionals to consider.


For example, it recommends greater disclosure of biomedical research results and calls for issues such as when and what data is made public to be addressed. I thought the suggestion of aggregating Electronic Health Records (EHRs) offers a wealth of opportunity. “The aggregation of such records could facilitate the achievement of a genuine ‘evidence-based’ medical system. Such records provide far richer data than clinical trials, and could serve as the basis for predictive models similar to those used in other scientific domains”, the report says.


The argument is that there will be an improvement in treatments, preventing the duplication of testing through accessibility to a patient or even a family’s medical history. Reassuringly, the report notes the ‘fundamental issues of privacy and security’ will tend to limit openness, as well they must, considering recent information security fiascos particularly on this side of the pond.


Greater and more responsible data-sharing in the disclosure of research findings and in the process of evaluating drugs and medical devices will improve both care and scientific medical research. For a summary of the report, go here, or for the full version with end-notes, here.
   

Monday 11 February 2008

Here’s hoping Open Text stays Swiss (and Canadian)

I must admit that I was among those who thought that Yahoo would cave in, and sell out, to Microsoft. Having predicted a deal for some time, it’s human nature that I would want this pair to go ahead and combine. Now it looks like Yahoo will rebuff the world’s largest software company – at least for now, anyway.


Another prediction that I have been making for some months (OK, years) is that Open Text would sell out, with SAP quite likely to be the buyer. It hasn’t happened yet and, as another strong Open Text financial quarter goes by, you could argue that it’s becoming less likely to happen.


Let me say that even an eventual vindication of my forecasting powers is not enough for me to wish Open Text to join the rush to consolidate; I admit there are good reasons for it to stay neutral. Some buyers will prefer a Switzerland of ECM that will not automatically try to push its own database, applications or other stack components.


Another reason is focus: Open Text has broadened its reach with deals of its own (Ixos, Hummingbird et al) but it is the one large-ish company that can still fit under the capacious umbrella branded ECM.


And, as long as Open Text stays away from the negotiating table, it can still market itself as a best-of-breed player in a sector dominated by one-stop-shops. A David surrounded by Goliaths, for the romantically minded.


That said, I still don’t buy the argument, well argued though it is in the Big Men On Content blog, that Open Text is not attractive to a buyer because that would mean extensive duplication of assets. Modern business software deal-making, as evinced by Oracle’s capture of PeopleSoft, is largely about being able to monetise a customer base rather than worrying about having too much code that was written to do the same stuff.


So I hope for the sake of competition in the ECM sector that Open Text stays solo, but I still think it more likely that a sale will come.

Friday 8 February 2008

Kevin Kelly: Better Than Free

Kevin Kelly, a Wired co-founder, recently wrote a very interesting piece about the digital economy called Better Than Free. It starts from the postulation that the internet is a giant copying machine. Anything that can be copied and distributed for free becomes worthless (in a financial sense). And, therefore, anything that can't be copied acquires value.


Sounds like common sense, right? But it strikes at the heart of the old order where people were willing to pay for mass-produced copies of stuff. Of course, it is still possible to pay for the convenience of a book, for example. Inexorably, though, online is trumping offline in an increasing number of situations.


But Kelly proposes eight categories of valuable and non-copyable activity. If he's right, and he's thought about these things more than most, then his suggestions provide a series of guiding lights for those of us who are still floundering in the internet economy.


He calls these values 'generatives'. To quote him:

A generative value is a quality or attribute that must be generated, grown, cultivated, nurtured. A generative thing can not be copied, cloned, faked, replicated, counterfeited, or reproduced. It is generated uniquely, in place, over time. In the digital arena, generative qualities add value to free copies, and therefore are something that can be sold.

His categories are: immediacy; personalisation; interpretation; authenticity; accessibility; embodiment; patronage and findability. Most of them probably make intuitive sense to you, but do read his explanations; they deliver real insight and understanding. To clarify a couple of the more obscure ones: 'Embodiment' means an analogue version of the digital entity - a book or a musical recital, for example; and 'Patronage' relates to paying a reasonable amount to the originator.


In this new world, value is being derived from essentially human skills rather than mechanical processes. We are not talking about a new bandwagon for wheeler-dealers to jump on, we're talking about being rewarded for genuine intellectual or physical effort which delivers real value to the buyer. Something which should resonate with most IWR readers.


(Hat tip to Jack Rickards for the tip-off.)

Thursday 7 February 2008

Finding meaning in Microsoft and Yahoo

While the rest of us are worrying about whether we should be worrying about the credit crunch, Microsoft slaps down a cool $44 billion in its bid to buy rival Yahoo, writes Peter Williams.


Timing is all in business (as in much else in life). Of course, part of the reason for the bid now was that stock market nerves on the likelihood of a US (if not world) recession had made Yahoo look, well if not cheap, then certainly better value.


The idea of Microsoft and Yahoo forming some sort of alliance is not new; it was being discussed around Wall Street back in May 2006. The theory back then was that Microsoft was behind with its core search technology and Yahoo was struggling with its paid search service. That may have been the thinking then, but it only works if you are convinced that there is some mutual benefit in coming together. The danger (the point is general, not particular) is that the chances are you just compound the problems rather than solve them. Research from accountancy firms suggests that a majority of mergers destroy shareholder value rather than add to it. And the number one difficulty in getting it right is integrating the cultures of the acquirer and the acquired. Cultural similarities between Yahoo and Microsoft? To be discussed.


In yesterday’s (6 Feb 2008) email to Yahoo staff (sent on for regulatory reasons to the US takeover regulator, the Securities and Exchange Commission (SEC)), “jerry” (that’s Chief Executive Officer and Chief Yahoo Jerry Yang to you and me) said the bid wouldn’t distract from the task of “pursuing our transformation strategy.”


Well, maybe. But while the poor investment banker community is relieved that an M&A deal of substance (i.e. one which will take some time and lots of fees to sort) has come onto the table, the question for information professionals is: what does this mean (if anything)? At this stage it is not an easy question. Perhaps the best answer can be found in the Google generation report which my colleague Dan Griffin has talked about. The research tried to separate myth from reality for the Google generation. One idea posited by the research is that the Google generation thinks everything is on the web and it’s all free. Anecdotally, says the research, this appears to be true, and certainly there is much evidence that young people are unaware of library-sponsored content or are at least reluctant to use it.


One consequence of the battle between the search giants is that in time we may all end up paying more (in one way or another, such as exposure to advertising) to take full advantage of the digital age. While being aware of access issues, if paying can mean an increase in the quality of content and search, then maybe information professionals would be among those who would see that as a good move.

Wednesday 6 February 2008

Information professionals guiding you to the best bits of the blogosphere

Josie Fraser runs the Edublog Awards, develops Web 2.0-based research communities, and reckons that blogging is the perfect cure for procrastinators and perfectionists


Q: Who are you?
A: I'm a social and educational technologist, a role that certainly didn't exist when I was checking out the careers guidance advice at school. Most recently I've been working on a JISC-funded project called Emerge, which is exploring the benefits of Web 2.0 tools to develop research communities. I'm also currently completing guidance on the benefits, risks and opportunities of using social networking services within education.

Q: Where is your blog?
A:
SocialTech (http://fraser.typepad.com/socialtech) is my current professional blog. It's just reached its first birthday! I blog at a few places and I'm at Twitter a lot at the moment, micro-blogging.

Q: Describe your blog and the categories on it
A:
It sticks to information and discussion on social networking, social media and social software. There is a lot of EdTech stuff in there since I work a lot in the education sector, and stuff that relates to the broader issues of digital literacy and social participation. I take a very relaxed approach to my personal blogs. I try to have fun with blogging and enjoy it rather than stress about the length of posts (they're getting longer, I'm afraid) or how often I post.

Q: What started you blogging?
A:
I started working in educational technology and it became clear to me very quickly that there were a lot of amazing people in my sector who were sharing resources, holding open conversations and exploring ideas. It was fantastic to be able to think through issues and make discoveries within a lively, smart and pretty funny international community. I've always been someone who thinks things through too much given half a chance. Blogging is great for whatever you want to call that: perfectionism or procrastinating. It's quick, dirty and iterative. The context is far more conversational. To me, blogging is about working through things, contributing your opinion to the pot. I'm dyslexic too and it's great to have a context where the odd mistake just doesn't matter.

Q: Do you comment on other blogs?
A:
I love to, and wish I spent more time doing it. There is huge value in opening up discussions or just letting people know you are paying attention. Comments are the lifeblood of blogs.

Q: How does your organisation benefit from your blog presence?
A:
I've worked for several employers where blogging has been a part of my job. Any employer is hiring your ability to operate effectively within your networks, and to develop them. My employers and clients have benefited by having someone who is in touch and involved with what people are thinking and discussing, the interesting things people are getting up to, and the innovative ways they are solving problems.
Q: How does it help your career?
A:
Blogging is probably one of my primary sources for continuing professional development. Working and thinking publicly pushes you to try to represent your ideas clearly and to critically examine those ideas. It's a great showcase for your ideas, connections and conversations. Also, in my line of work online presence is an important thing.

Q: What good things have happened to you that could only have happened because of blogging?
A:
I've met people from all over the world and I've made some really great friends. I've also run the Edublog Awards for the past three years; that certainly wouldn't have happened without being actively involved in blogging myself. I'm incredibly proud of what we achieve: every year we get people participating internationally to create a resource for everyone that showcases the breadth of innovative work being done using weblogs and other social software to support learners and educators.

Q: Which blogs do you read for fun?
A:
I like art and tech blogs. I love We-make-money-not-art.com, and graffiti/street art blog Wooster Collective. Like most people, I read Post Secret. I also run a not-work blog, A Girl and a Gun, which is a film diary; it's been very successful and I get invited to a fair few premieres and festivals because of it.

Q: Which bloggers do you watch and link to?
A:
Loads! My aggregator has made grown adults cry! Also, there are all the people I keep track of within trusted community aggregators. I use Twitter, Flickr and Facebook in that way. I'd like to give a heads-up to some of the UK edublogging contingent who are doing some really interesting work:
Graham Attwell
www.pontydysgu.org/blogs/waleswideweb
Frances Bell
http://eduspaces.net/francesbell/weblog 
Helen Keegan
http://eduspaces.net/holla/weblog 
Scott Wilson
http://zope.cetis.ac.uk/members/scott 
Brian Kelly
http://ukwebfocus.wordpress.com 
Stephen Warburton
http://warburton.typepad.com 
Steve Wheeler
http://steve-wheeler.blogspot.com 

Not surprisingly, my relationships with all of them have been either started or augmented via our blogs.

Tuesday 5 February 2008

MS-Yahoo shows that search trumps portal

Briefly, as is its wont, fame beckoned for your blogger last week when he did a tour of duty at the BBC to talk about the likely effect of Microsoft’s bid to buy Yahoo.


There’s no need to go into deep details about that putative transaction here, although, if you’d like to see what I think, you could look here, here, here, here and here. No, what struck me most forcibly was the extent to which search had beaten its old enemy, the portal. At the Beeb, headlines suggested that Microsoft was buying a leading “search engine”. I must admit raising a quizzical eyebrow as it’s a long time since I heard the term used to describe a company.


And yet…


In the mid-1990s I edited a web site that shared office space with Yahoo UK. A manager there berated me for referring to Yahoo as a search engine. No, it was a gateway to the web -- a portal, no less, he insisted.


But the company that differentiated on search quality, Google, has been the winner so far on the internet. Having a string of great properties like Mail, Flickr, Delicious and Finance has helped prop up Yahoo in hits; Microsoft has had the tie-in to a massive brand and OS; but Google’s search has been the most powerful means of making hard cash.


There is a lesson here for all those web sites that don’t deliver the search results you wanted, and even for the enterprise search and content management camps. However many ways there are to search and control data, unless you have an excellent core search capability, the other guy probably has a better product.

Friday 1 February 2008

The boundaries are different with learning 2.0

One of the key issues currently being debated by information professionals is the validity of Web 2.0, writes Peter Williams.


Gurus are suggesting that the new user-driven web environment is spawning a new form of learning: learning 2.0. Learning 2.0 is the antithesis of what those of us who pre-date the Google generation would know as formal learning. Indeed, learning 2.0 even seems to throw away the rule book that learning over the internet (e-learning) and the mixture of internet-based learning and traditional face-to-face learning (blended learning) were prepared to obey.


Learning 2.0 seems to be approaching anarchy: it is open, distributed, networked, complex and fuzzy. Speaking at the recent Learning Technologies 2008 conference, Stephen Downes – one of the leading proponents of learning 2.0 – shared his thinking on what learning 2.0 is and what it might be. Those of you who tut-tut at the old-fashioned idea of a leading guru on Web 2.0 giving a one-to-many presentation will be pleased to know that he did not deign to hit the delights of Olympia in West London, instead staying in his homeland of Canada and joining the audience virtually. Downes argues that learning is moving from centrally controlled provision to a barely controlled group activity. In many spheres the internet is seen to be switching our behaviours from ‘push’ to ‘pull’, and this is one place where it is actually happening.


Individuals are using Web 2.0 technologies to interact and learn with whatever is available whether it is designed as learning content or not (and many times it will be not).


The question for information professionals working across all types of sectors and organisations is what the impact will be on them and their work if learning 2.0, in all its manifestations, does turn out to be something more than marketing hype. One cast-iron reason why it may stay the course is the simple fact that people (and especially younger learners) appear to enjoy the freedom, interconnection and interactivity on offer. A few years ago, one of the ways the internet was fostering learning (especially informal learning and knowledge sharing) was through communities of practice (CoPs). These CoPs are groups of individuals whom technology could connect and bring together so that they could share knowledge and improve both individual and organisational performance through unstructured sharing of experience. Just by belonging to the community, your experience, knowledge and expertise were assumed and accepted. Learning 2.0 can be seen as the young cousin of CoPs: it is social networking transformed for a learning purpose.


Part of the attraction of learning 2.0 is the barriers it breaks down. We are usually defined as learners by the class we are in, or the educational institution or corporate learning activity to which we belong. Forget those sorts of boundaries with learning 2.0: the group is infinite, self-selecting, open and self-defining. The breaking down of these boundaries – a deconstruction which, in learning terms, some see as being as significant as the tearing down of the Berlin Wall – does raise some serious questions. For information professionals the question has to be: what does this do to the use of the library resource and the scholarly approach in the process of learning?


Downes says we should think of learning as something that flows rather than something static; with tools such as Twitter and Facebook, learning is partly about being immersed in a community. Crucially, he says that learning 2.0 is not based on objects and content – the sort of elements we may expect to find stored in a library and which therefore may not be immediately accessible. Instead, learning 2.0 is learning where you need it and when you need it. This is more than just-in-time; it is just-in-time PLUS what you want. It is learner-centred because it is both owned by, and of interest to, the learner. As learners we are moving away from being passive consumers to creating learning as we learn. Who could argue against such an appealing vision of learning? Learning 2.0 may have some of the answers to learning in the digital age, but it does not yet offer the complete answer.