w: www.meantime.co.uk
t: 01539 737 766

Wednesday 24 November 2010

Why I walked out.

It's been a busy couple of months. A new project with a client that I can't yet name, which is the hugely satisfying culmination of six years' hard work, not to mention new lessons learned about the need to project manage clients during large-scale developments with staggered code deployments. I know I've neglected the blog - just the sort of scenario I warn blogging clients about - but I've had two or three topics I want to blog about knocking around my head and I've been looking forward to having the time to put them into writing.

However, tonight I want to use the blog for a different reason, partly by way of explanation of my sudden exit from an event I attended this evening. The event was called 'Beyond Websites - Using Web Technology Creatively'. I must admit I was slightly dubious about the title, which seemed to offer two different topics but, with the advent of HTML5, I had little doubt this would be an interesting couple of hours, with a presentation from Keith Mitchell (@specialized), a Research Fellow at Lancaster University, and then a panel discussion.

The presentation was entertaining but a little disappointing, mostly concerned with watching television over the web. There was some talk of community clouds - effectively caching programmes - which I believe will be rapidly superseded by the rollout of more powerful comms and, as the lady from Business Link pointed out, better compression algorithms anyway. There was also a demo of some software that would enable the user to watch television whilst viewing a clickable television guide and relevant feeds from Twitter and Facebook, which struck me as reactive rather than innovative development. Certainly I wouldn't have described it as a creative use of web technology.

The panel discussion was a further let down. There was some talk about existing websites, particularly Vimeo, some paranoia about Google claiming copyright to any documents placed on Google Docs (take a *closer* look at the T&Cs) and then some discussion about targeting content, with specific mention of 'Googlezon' from "Epic 2015" (2014, in fact).

So, why the hissy fit, albeit in the relatively middle-class form of walking out?

Well, I won't pretend it was because the session didn't live up to its name (although more on this in a bit). I'm as happy as the next (nerdy) guy to spend an evening talking about the web/Internet and guessing at where it's heading. I would have had no problem with that. There are two overlapping elements to evenings such as this one that really get my goat. The first is a rather smug assumption that we're at the forefront of a cultural movement, i.e. that what we're doing today is what everyone else will be experiencing tomorrow. The second is the weak recycling of common 'wisdom' regarding the web, what's cool and where it's "definitely" going to go wrong.

So, just because we use Twitter and blog doesn't mean everyone is going to. Indeed, I'd argue that the very fact that we are entrepreneurs working in design, marketing and new media means we are exactly the kind of self-aggrandising/outgoing folk who will engage in these activities. Does it mean other people won't? Of course not. But just because we *all* do, doesn't mean *everyone* else will.

Similarly, it wasn't true to say that everyone spends more time online than they do watching television. That may be true of teenagers but when I was a teenager I spent more time in my room reading and listening to music than watching TV. Let's not confuse human behaviour with cultural trends. And as for the ridiculous anecdote about the literature professor who can't read War and Peace since he started using the web... That man is in the minority.

I'd recommend these people read Tim Berners-Lee, Clay Shirky and maybe Brian Eno's insightful 1995 essay on targeted marketing.

So, yes, I got impatient and I have a low boredom threshold and I walked out. But what would have made me stay? Two things, I think.

Firstly, we could have enjoyed some talk about how the way in which we use the Internet has changed. The Internet is a load of computers/servers connected together. Over that, the web was laid, pages of content that joined together and, initially, that was how we used the Internet (and for email and IRC, of course). Now we use the Internet to deliver content to the apps on our smartphones or to 'narrowcast' films to our TVs, and the web part of the general connectivity has become a little less important (although it won't go anywhere in the short-term). We could have usefully talked about how users are accessing data and how we adapt what we do to satisfy their demands. And how we keep that fresh, interesting and, yes, creative.

The second thing that could and should have been different this evening was that we should have talked about the Internet as a huge, interconnected resource, to which people contribute. Web 2.0 as it is now called (and was so referenced at the start of the evening). Instead we talked about the web - but, really, the Internet - as a mechanism that delivered content to us. We talked about how the content presented to us could be better refined, more targeted. But that isn't the point. Tim Berners-Lee's original vision for the web's next stage - the semantic web - was about joining up data, in all its forms (including video). One of the panellists did make a good point this evening (and I must apologise that I can't remember who it was), which was that often we start out looking at one thing on the web and, by following links (and, I'd add, our interests), we find our way to all sorts of different sites, pages and information. This is what excites me about the web; the interconnectivity, the wealth of data, the user content, the real-time commentary on life.

The web, the Internet, whatever you want to call it, is - at the risk of sounding clichéd - a fresh, innovative, exciting space. What excites me is not how I might watch TV over the web or how content might be targeted for my consumption. No. I love being in the middle of a strange town with my smartphone telling me where I am and what's around me. I love being able to map the runs I do and share them with my long-suffering friends. I love the fact I can IM my daughter in Berlin *right now* and chat to her. I love the fact that via apps and pages and any and every other means, we can all access and share films and music and data and opinions and all the other things that constitute the culture that we enjoy. That, I think, is worth talking about and, indeed, celebrating.

Wednesday 22 September 2010

What yesterday's problems tell us about Twitter's testing

Those of you who keep an eye on the news - and who don't switch off at the mere mention of Twitter - will probably be aware that Twitter had problems yesterday, as users (with development skills) were able to include code in their posts, leading to the problems described in The Guardian (here) and The Telegraph (here).

Unsurprisingly, the language used to describe the incident is full of the kind of words that IT people use to keep everyone else at a distance and to spray a little nerd glamour on themselves. But for all the talk of malicious code, worms, onmouseover, hacks and loopholes, the truth of the matter is remarkably straightforward.

So, first, a quick description of how browsers work. When you load a web page, your browser requests HTML (the language in which web pages are written) from the web server and, as it receives the code, it uses it to build the page, top down. It's a fundamentally simple process: the browser processes each line in turn. So, when a browser displays a page of 'Tweets', the messages are simply part of the HTML. If some other code is included in the HTML for a message, then the browser just interprets it.
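To make that concrete, here's a minimal sketch (emphatically not Twitter's actual code) of the difference between dropping a message straight into a page and escaping it first:

```typescript
// Naive rendering: the message becomes part of the page's HTML, so any
// markup - or script - it contains will be interpreted by the browser.
function renderTweetUnsafely(message: string): string {
  return `<li class="tweet">${message}</li>`;
}

// Escaping the handful of characters HTML treats as special turns the
// message back into inert text.
function escapeHtml(text: string): string {
  return text
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

function renderTweetSafely(message: string): string {
  return `<li class="tweet">${escapeHtml(message)}</li>`;
}

// renderTweetUnsafely('<a onmouseover="alert(1)">hi</a>') hands the
// browser live markup to interpret; renderTweetSafely() merely displays
// the same characters as text.
```

That onmouseover attribute, incidentally, is exactly the sort of thing behind yesterday's worm.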

I first came across this as an issue ten years ago when I was leading the testing for RBS's Digital Banking software, their first purely web-based Internet Banking service. I discovered during our early testing that on the page where a user would be able to name their accounts, I could enter basic HTML, which would then affect the way the page was subsequently displayed, once it had been saved.

From then on, every entry field in the website had to be coded in such a way that any characters that might be used to insert code were not permitted. And every test script included tests to ensure that if those characters were used, the data would not be saved to the database. We had a few data input boxes and the testing was time consuming but, of course, it was vitally important that no one could introduce code and make the site work in a way that wasn't intended. These scripts were used to test every release of Digital Banking, even if the changes were in a different part of the system from the data entry.
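By way of illustration - the function and field names here are hypothetical, and our scripts at the time were manual rather than automated - the essence of those tests boils down to something like this:

```typescript
// Characters refused in any free-text field, because they can be used
// to introduce markup or script into a page.
const forbiddenCharacters = ["<", ">", "&", '"'];

function isInputAcceptable(value: string): boolean {
  return !forbiddenCharacters.some((ch) => value.includes(ch));
}

// The regression cases: run for every release, even when the change
// under test was nowhere near the data entry screens.
console.assert(isInputAcceptable("Holiday savings account"));
console.assert(!isInputAcceptable("<b>renamed</b>"));
console.assert(!isInputAcceptable('" onmouseover="alert(1)'));
```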

Twitter has one input box. That's it: one. It might be deployed on one or two different pages but it is the same code, the same function.

So, what does this tell us about Twitter's testing? If it tells us one thing, it tells us it isn't as robust as it should be. It doesn't really matter whether the issue is down to a tester who ticked a box without actually doing a test, an automated script that wasn't run, poorly documented test scripts or a missing process that should confirm that all scripts are complete. This was a bad drop by an organisation that has tens of millions of users and a burgeoning usage by business.

For as long as I have been in IT, testing has been the poor cousin to development, and regarded as an unnecessary headache by developers. IT and project managers must never lose sight of the importance of this stage of IT development: customer and client confidence is easily lost and difficult to regain.

Friday 17 September 2010

Keeping content accessible and other things *you* should do for your site

There are a lot of things you might ask or even demand from the company that builds your website. Certainly one of those things is that it should comply with the Disability Discrimination Act but you might also ask for, say, a news page and a calendar of events.

But whilst it's very easy to have ideas at that point in proceedings, you should think about your ongoing commitment. Nothing makes a site look sadder and more neglected than a news box on the front page that hasn't been updated for months or a calendar with nothing on it. It's important to ensure that somebody in your company - or perhaps someone outside it: your copywriter or marketing people - takes responsibility for that content and ensures it is updated regularly and with some care.

However, the main reason for this quick blog is to do with your responsibility for accessibility. Today we are sending out our latest newsletter, which is about accessibility and the DDA. In the newsletter we provide a link to a tool for checking accessibility and, of course, it occurred to us that a sharp client or two might use it to check our sites: our own and the ones we've built.

We were more than a little surprised at first to find some of them failing because we always check our sites for both W3C and DDA compliance once they're finished. However, on closer inspection, we found that it was the user-generated content that was causing the problem and not the code we'd written. That was, of course, a relief but then Louise asked whether we had even spoken to our clients about how to keep their content accessible. Well, that did take the smiles off our faces.
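For the curious, the commonest culprit in user-generated content is images added without alternative text. A rough sketch of the kind of check involved (for illustration only - a proper audit tool does far more, and parses the HTML properly rather than using regexes):

```typescript
// Flag user-supplied <img> tags that carry no alt attribute, one of
// the most frequent accessibility failures in user-generated content.
function findImagesMissingAlt(html: string): string[] {
  const images = html.match(/<img\b[^>]*>/gi) ?? [];
  return images.filter((tag) => !/\balt\s*=/i.test(tag));
}

const userContent = '<p>Our new offices.</p><img src="office.jpg">';
console.log(findImagesMissingAlt(userContent)); // ['<img src="office.jpg">']
```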

So, from next week, we will be briefing our existing clients on how to make sure that the content they put up is accessible and making sure that it's part of the training for our new ones.

Monday 30 August 2010

Why the government needn't stick with IE6

One of the features of the web that I initially found exciting was the concept of platform independence. One could build a page in HTML and anyone using any browser on any machine running any operating system would be able to see that page and use it just the way its author intended.

I will pause here to allow the hollow laughter of web developers everywhere to fade to an echo and so not interrupt my thoughts.

The truth of the matter is that over the last fifteen or so years we have had a multitude of browsers that work differently on various platforms and that has made both the development and testing of web pages and applications far more time consuming and, therefore, expensive than necessary. At Meantime we develop in whatever we consider to be the most standards compliant browser at the time and test primarily in whatever is the most popular, and both of these are moving targets.

As you would expect, Microsoft have a major role in the history of browsers but it is a surprisingly chequered history. Bill Gates was famously dismissive of the Internet at the outset and, if memory serves me correctly, it was not until IE4 was released that Microsoft really found their feet in the marketplace. Their success lasted 18 months until early 1999 when IE5 was released.

The extent of IE5's problems can be inferred from the fact that, on this one occasion, Microsoft released an interim version of the browser, IE5.5. And when IE6 came out, it was widely agreed that unhappy IE5 users would have been better waiting for that than 'upgrading' to IE5.5. (I remember all this vividly; I was responsible for the team testing the Royal Bank of Scotland's Internet banking software at the time.)

Of course, all of this made life incredibly torrid for those people in organisations who were responsible for the software and applications architecture. It was a period beset with problems and costs, coming as it did immediately after the nasty surprise cost of the Millennium Bug. But IE6 was stable and remained Microsoft's browser offering for five years.

Despite that long period of stability - or, arguably, because of it - there did not appear to be a strong appetite for change when IE7 came along or, a couple of years further down the line, IE8. And so we find ourselves in 2010 with many large organisations and particularly government departments still using IE6, which is now nine years old.

The problem is that an awful lot has happened in those nine years and right now is a particularly exciting time with the new HTML5 and CSS3 support in the latest browsers: Chrome already has it, as does Firefox 4, which is out in a mature beta, and so will IE9, whose public beta is released in just a couple of weeks' time.
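The sensible way to live with this mixed landscape, incidentally, is to test for individual features rather than sniff browser versions - a minimal sketch:

```typescript
// Feature detection: ask the browser what it can do, rather than
// guessing from its name and version number.
function supportsCanvas(): boolean {
  const el = document.createElement("canvas");
  return !!(el.getContext && el.getContext("2d"));
}

function supportsNativeVideo(): boolean {
  return typeof document.createElement("video").canPlayType === "function";
}

// IE6 fails both tests, so a page can fall back to older techniques
// instead of breaking outright in the browsers that lag behind.
```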

And in the middle of all this excitement, the government has announced that they won't be upgrading from IE6. Such a move, they say, would be "a very large operation" at "significant potential cost to the taxpayer". They say that it will be "more cost-effective in many cases to continue to use IE6 and rely on other measures, such as firewalls and malware-scanning software, to further protect public sector internet users."

This is such a short-sighted option that I don't feel I need to spell out the various reasons why it is a bad idea. Indeed, the quote above makes it clear there is an attendant security risk, which is a strong argument in itself.

What is perhaps less obvious is that, as the software used by local authorities is increasingly web-based, moving between browsers is not a straightforward transition: there is a lot of regression testing and consequent redevelopment to be done.

But it's not satisfactory for government IT strategists to simply throw their hands up and say it's too difficult or too expensive to change. Yes, I understand this is supposed to be a time of austerity, but that doesn't mean that an economical solution can't be found. In six months' time we will have a clear view of which of the current crop of browsers is the best and, as a first step, this can be deployed across government departments. From there, a two-year plan to migrate applications from IE6 to the new browser will get departments to the point where they can look at their next upgrade. Because this is a rolling process and not a one-off. Government needs to recognise that fact and start to make the cultural change that will enable them to leverage the benefits of what is still a fast-moving (and exciting) technology.

Monday 9 August 2010

Bespoke software is alive and well.

In a recently posted article on the Business Computing World website, Haseet Sanghrajka of ST Consulting writes under the headline "How Application Platforms Are Killing Bespoke Software".

Now, I can only assume that the title comes from a bit of over-enthusiasm for his offering because this is certainly not what Mr Sanghrajka succeeds in demonstrating. What he does argue is that application platforms are the best way forward for companies needing software solutions, as opposed to either package or bespoke solutions.

Let's tackle the package argument first, as it is the easiest. I would certainly agree that more complex business software, when taken in package form, is limiting for any business. However, there should be allowance made for packages such as QuickBooks or MS Office: there are times when a package solution is entirely appropriate.

His argument against bespoke software seems to be related to development time, cost and maintenance. So, let's return to those arguments after we've had a look at what Mr Sanghrajka is proposing, which is an 'application platform'. Although this is not particularly well explained in the article, in his terms it consists of some packages - that is, Microsoft Dynamics, Office and Outlook - plus bespoke development in .NET. (Packages and bespoke development are, of course, precisely what he is campaigning against.)

Those of you who have any experience of Dynamics will be aware of its eye-watering price tag (although MS have now introduced a cheaper version to try and encourage take up in smaller businesses). Microsoft themselves are pleased to tell their resellers that they can anticipate many times the cost of the package in terms of consultancy fees. Readers of Mr Sanghrajka's article won't be surprised when they reach the end to see that his company sells precisely this type of consultancy.

But back to our argument. Dynamics is more commonly known as Dynamics CRM and it is for this purpose - customer relationship management - that the product is most commonly sold. Extending the product out to cover other functions takes us into the territory of square pegs and round holes, and it is for this reason alone that I struggle to see how this particular application platform can compete with bespoke software.

So, to pursue the argument, perhaps the article is only really supposed to be about CRM. One of the touted benefits of Dynamics is just how powerful, configurable and flexible it is. I don't disagree with that. But you can deduce from those features that it is a complex piece of software, which is why companies such as ST Consulting can make a business out of configuring it.

It is this complexity and the cost of the consultants needed to maintain it that is seeing organisations move away from Dynamics (and SAP, as well) to simpler, bespoke solutions that do exactly what their companies need and nothing more. Certainly any bespoke provider worth their salt can provide a CRM for less than the cost of a configured Dynamics solution.

So, the argument was that bespoke software is too expensive, takes too long to develop and needs maintenance. But Dynamics will frequently be more expensive as a whole, takes time to configure and is too complex for organisations to manage themselves, resulting in ongoing consultancy costs. What's more, working with a good software house - with business and systems analysts, experienced developers, and a strong testing and change management process - will lead to cost-effective software being delivered to spec, on time and on budget.

I'd like you to indulge me in a couple of very specific ripostes, too. Quite apart from the fact that the argument has not been raging for decades (two perhaps, which is technically plural, I suppose), Mr Sanghrajka has it plain wrong when he says "Unlike traditional bespoke development that relies on software coders, application platform technology allows organisations to build on the underlying relational database using point and click development tools. Key concepts such as database fields, data relationships and workflow are automatically handled by the underlying application platform layer." This is disingenuous, at best. Building and maintaining relational databases requires good understanding and experience. Whilst a casual user may be able to add an attribute here and there, by the time they are creating entities and connecting them, they are only a short step from having to call those Dynamics consultants back in to sort out the ensuing problems with performance, reporting and data integrity.

Secondly, Mr Sanghrajka states that "it is estimated that an organisation can create a new solution from scratch in the same time frame it takes to deploy a traditional packaged application." Does this need much more than common sense to disprove it? Unless the new solution is, perhaps, a VAT calculation and the package is, say, Microsoft Dynamics.

My argument here is not with Mr Sanghrajka, who has a business to run and who no doubt has a lot of belief in Dynamics. What I dislike is the false argument. By Mr Sanghrajka's own admission, bespoke is the best solution. However, his arguments against it are flawed and actually apply more acutely to Microsoft Dynamics. Or the 'application platform' if you prefer.

Thursday 29 July 2010

Everybody's talking; why your business can't ignore social media.

Perhaps you remember when you first heard about Facebook and Twitter. Maybe you were one of the people who thought they'd misunderstood when these sites were described to them: "what's all the fuss about..?". You might even have tried Twitter and abandoned it after a few 'tweets', finding it every bit as lame as it sounded when it was described to you.

You would then have been further perplexed when you started to hear people saying how important Facebook and Twitter (and numerous other social sites) were to your business. Did you sit and look at that flyer from Business Link inviting you to a workshop on how social media could benefit you, and wonder just what it was that you'd missed?

Now, I'm not saying they can't or won't benefit you - as it happens, we use both at Meantime - but this blog is not an attempt to convince you of the merits of social media or otherwise. What I do want to write about is the broader phenomenon of social media and what it might mean for your business.

First of all, let me give you a couple of examples of the type of impact I want to talk about. The first one came about last year when a protestor died at the G20 protest in London. At first the police denied that they had anything to do with it, stating this as fact at a news conference. In the past, the truth might only have come out if enough eye-witnesses had managed to convince a newspaper to run the story, and one might have believed it or not depending on one's opinion of the police. In this case, however, footage and first-person accounts of the event were promptly posted on websites and picked up by news broadcasters, and the police had to confirm that the man had, in fact, been attacked, unprovoked, from behind by an officer.

The second example comes from earlier this week when Wikileaks posted the Afghan war logs. Regardless of the ethics of this action, the bottom line was that once that information was in the public domain, with even a single member of the public, it could be immediately broadcast to the wider population. Whilst one can be impressed with how President Obama rolled with the horrible revelations contained in the logs, the key point here is that he did not try to refute what was published.

So, how do these stories relate to your business, assuming that you are not running either a security firm or a war overseas?

The point I want to make is that the publication of opinion and information is now unfettered to an extent that was unimaginable even ten years ago. Most of you will have used Amazon and seen the review system used there. Some reviews - particularly those for music and books - are obviously partisan and no more than opinion but when you look at product reviews, it becomes a bit more serious. It was the birthday of one of my daughters recently and she wanted some toys from the Little Cooks range. The reviews on Amazon across a range of their products quickly highlighted the fact that no matter how good they looked on TV, the build quality was poor and they were not a good present.

There are now independent sites - such as the dreadfully named Reevoo - that specialise solely in independent reviews, and consumers are becoming more and more sophisticated in analysing and interpreting the reviews, good and bad. A simple example of this would be when I bought some headphones recently. The pair I fancied had over a hundred reviews, the majority of which were very positive. As one does, I read a higher proportion of the bad reviews and satisfied myself that those reviewers had either been unlucky or had unrealistic expectations. So, I bought the headphones.

I think that in the near future we will see similar sites that are based around services, as well as products.

So, what does this mean for your business? Two things, I think. Firstly, there's no point in trying to deny a problem with your product. Apple have kindly illustrated this point for me over the last few weeks with the issues arising with the aerials on the iPhone 4. No amount of denial is going to convince potential customers, who can see reviews and comment from any number of sources that are presenting the actual facts.

The second, more subtle, issue is germane to those who provide services. The issue here is not around people reviewing a product but instead talking about how you do business and what results you achieve. It doesn't matter how good your staff are, how good your planning is and how good your track record is; you will have projects from time to time that simply seem to be jinxed. In the past, you might have been pleased simply to get the project completed and the client out of the door, and to focus on your successes, but those days are fast disappearing. Now it is even more important to understand why a project hasn't gone according to plan and to make sure your client understands that, too. You can't simply rely on the good news that is out there about your company already; beware the old adage that bad news travels ten times as fast as good.

In conclusion then, whether you sell a product or a service, the good opinion of your customers and clients is no longer something that it would be nice to have, a secondary consideration: it is now core to your sales and success. More than ever, it is important to have happy clients because, in this age of Facebook and Twitter, everybody is talking.

Tuesday 13 July 2010

Paywalls: is this the end of free content on the web?

As of the 2nd of July, The Times has placed its content behind a 'paywall'. The term, a derivative of 'firewall', means that you can only access The Times' content if you are willing to pay for it. This is an important step forward in the debate about content on the web, as Rupert Murdoch puts his money, or his readers' money, where his mouth is.

The emergence of the web has had a dramatic impact on newspapers. Most obviously because, by and large, all of their printed content goes onto their websites, which removes the need to buy the printed version. Whilst newspapers and their associated magazines take a lot of money in advertising revenue, the loss of the newspaper's cover price as an income stream is clearly going to hurt. This also has a nasty by-product in that it makes newspapers more reliant on their advertisers' agenda. (This, incidentally, is why there is no editorially independent women's magazine that can criticise the beauty industry.)

Of course, with a reduced income, the newspapers have less money to employ journalists and so, the argument goes, the quality of journalism will decline. And this is the primary argument for charging, that newspapers are providing a service and that has to be paid for by some means.

But is that really the case and what are the alternatives?

Firstly, let's look at the question of the quality of journalism. I can't see that there is much point in pulling the tabloids into this argument - most of their readers know exactly the quality of journalism that they are buying - so let's concentrate on the broadsheets, who are the main players in this debate (after all, Murdoch hasn't put a paywall around The Sun's website).

Erwin Knoll said that "Everything you read in newspapers is absolutely true, except for that rare story of which you happen to have first-hand knowledge." As it happens, I can vouch for this argument. Earlier this year an incident took place at the school where I am a governor. A young girl from a difficult background brought a small knife to school. She bragged about this on the bus and said she had a list of people to get. Staff at school confiscated the knife and the list without any danger or trouble and then notified the police. Here is that story in the Daily Telegraph: http://www.telegraph.co.uk/news/uknews/crime/7519842/Schoolgirl-found-carrying-a-knife-and-hit-list-of-teachers-and-pupils.html. Quite apart from the obvious discrepancies between the story and what actually happened, it's worth noting the picture of the knife - a library photo, far larger than the knife concerned - and the emphasis on the stab vests, which police actually wear as a matter of course.

If this is the quality of journalism in our broadsheets, what are we in danger of losing? And, whilst anecdotal, the story above is far from an isolated example and there are far more sinister abuses of journalistic power, particularly around elections. (One thinks, of course, of Bush's first election "victory".) Whilst we have a Press Complaints Commission run by the press, specifically newspaper editors, we are not going to see the improvement in journalistic standards and quality that is required to render newspaper journalism a profession worth protecting.

So, the argument for 'saving' journalism is not black and white. However, we clearly want our news from somewhere, which brings me to my second question: what are the alternatives?

As it happens, there are a lot of people out there commenting on the world around us, people who are free from the constraints of editors and newspaper ideology. For example, in the two years leading up to the credit crunch, a number of financial bloggers, some of whom published books, were accurately predicting what was going to happen and when. (In fact, there is a parallel in the Liberal Democrat Vince Cable who, effectively freed from toeing any party line, was able to air his concerns about the impending financial disaster.)

In his book 'Cognitive Surplus', Clay Shirky rightly points out that as publishing becomes easier, so the average quality declines. This is quite true but, to follow what has happened with printed books as an analogy, the fact there is more content being produced actually leads to a raising of the bar at the top of the quality scale. I believe that ways will evolve for people to find that good content and, in the same way that bestseller lists and the community on Amazon help us to find books to read, so we will point one another to the quality blogs and user-published content. Indeed, one could well see a future newspaper website being a portal through to that content, some of which may be commissioned through advertising but probably not.

So, is this the end of free content on the web? The answer, simply, is no. I believe Murdoch has made a mistake here and, indeed, overestimated his own ability to impose his worldview on an evolving environment, one that is moving away from the models he desires. I won't pretend that I know exactly what the future looks like but I do know it will be one in which we pull our news from a variety of sources, channelled to us by portals and subscription services.

Saturday 10 July 2010

NI14 - When government has IT spending right

A long time before this austerity coalition came to power, a policy called NI14 or "avoidable contact" was introduced to local government. Whilst this sounds rather unfriendly - as if our public servants want nothing to do with us - it was, and is, a good policy and one that might be usefully adopted by any company.

By whatever means, it was established that a face-to-face meeting between a council employee and a member of the public costs somewhere around £15 and a 'phone call perhaps £3 or £4, whilst a website visit has an associated cost of only a few pence.

Thus, making information available on a website - anything from swimming pool opening times to a PDF of the application form to become a taxi driver - saved the council money and, crucially, freed employees up to do other work.

This was all very sensible and a good example of IT benefiting both an organisation and its clients or customers.

This policy and its aims have necessarily taken on a darker hue in the wake of the cuts that have been announced since the election and the inevitable job losses to follow. But whether you see the swingeing cuts as evidence of an unleashed Tory ideology or the essential measures required to raise the country out of a financial hole, the goal remains the same: don't tie up council employees' time on jobs that can be done by websites or, by extension, applications for mobile 'phones.

So, I was surprised to see this article in The Telegraph - http://www.telegraph.co.uk/technology/news/7874856/Government-iPhone-app-spending-disclosed.html - criticising government spending on iPhone applications.

Firstly, the costs involved - £10k to £40k - are hardly scandalous for IT development, and that's before they are compared with the millions and billions haemorrhaged on the police and hospital systems.

Secondly, in NI14 terms the expenditure makes absolute sense. The jobseekers' app, for example, has been downloaded 50,000 times. That's 50,000 people using an app that has cost the equivalent of perhaps three public sector salaries. The maths is simple and compelling.
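Roughly, and taking the least favourable figures (the top of the cost range reported, set against the NI14 contact costs above):

```typescript
// Back-of-an-envelope NI14 arithmetic, using the figures quoted above.
const appCost = 40_000;      // £ - worst case from The Telegraph's range
const downloads = 50_000;
const phoneCallCost = 3.5;   // £ - midpoint of the £3-£4 NI14 estimate

console.log(appCost / downloads);     // 0.8 -> 80p per user, one-off
console.log(appCost / phoneCallCost); // ~11,429 avoided calls pay for it

// If each of those 50,000 users avoids even one phone call, the app
// covers its cost more than four times over.
```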

I am all for keeping an eye on government spending, particularly around IT, but it seems crazy to me to criticise spending that not only makes demonstrable financial sense but is also a success story within the terms of one of Whitehall's own policies.

And don't worry, the irony of the BBC investigating this story with a view to criticising an organisation for not budgeting carefully around its new media spend really is not lost on me.

Monday 5 July 2010

Is it time to create a 'top shelf' for the Internet?

It is an unavoidable ethical complication: sometimes good things come out of bad. Medical research, for example, has been assisted by murdering grave robbers and the horrific experiments of wartime scientists.

And so, on the web, it is undeniable that Internet technology has been pushed ahead by the demands of the pornography industry, which has enthusiastically driven the research and development of commonly accepted and valuable web tools such as video streaming.

Like so many elements of the free market - not to mention the banking sector - those making the money are happy to push ahead, while those with rational reservations make comments from the sidelines that are accepted but not acted upon. Thus, we find ourselves with a generation now that has access to pornography that was unthinkable even ten years ago. No one can be sure where this will lead but you will not find many people arguing that it is a good thing.

To be clear, this blog is not about the pros and cons of pornography. However, I think I would be comfortable stating that the industry that promotes and produces it is, by and large, an unpleasant one, driven by high profits and with little regard for the people it uses. And I am consequently comfortable stating that in matters of regulation, I would think twice about consulting them or listening to their arguments about how they keep their content away from young people. (How many 15-year-old boys are going to pay heed to a 'gatekeeper question' asking if they are 18?)

I am writing about this because last week ICANN, the body that manages 'top level domains' such as .com and .org, began the process of introducing a .xxx suffix for use by sites with pornographic content.

This strikes me as a very positive step. In no other area of our lives do we allow hardcore sexual content to be mixed in with the rest of what we see and do. I wouldn't take my children into Waterstones if I thought that Harry Potter would be sat side by side with adult literature. Indeed, I would be far happier with my children having unfettered access to the web if I knew there was a way by which I could simply block all unsuitable content.

More than ten years ago, a friend of mine was thinking of developing a trailer for windsurfing equipment and asked me to see if there was anything equivalent already on the Internet. I naively used AltaVista (an ancient but rather good search engine) to search on 'water sports', with results that took me by surprise. More seriously, a friend recently complained to me that his daughter's biology homework had been to use the Internet to research bodily fluids, one of which was 'sperm'. This was, at best, a naive teacher.

Moving all pornographic content to .xxx domains would avoid this kind of accidental contact and also make it easier for the search engines to classify the data they pick up off the web.

There's no denying that the frontier spirit of the Internet has been and remains a great asset in its development. However, this romantic notion of a new wild west should be tempered by an awareness of these oil barons of cyberspace, whose intentions are not to do with creating a brave new world but instead revolve around exploitation and profit.

It is for all the above reasons that I welcome the new .xxx domain and I would support any and all efforts to oblige sites with adult content to move to using them. After all, for anyone who wants to access that content, nothing changes except a few letters in a URL. The advantages though, to those who don't want pornography and especially those who should be protected from it, are immense.

Friday 2 July 2010

Can you really spend £35 million on a website?

Let's suppose I told you I was going to buy myself a new car. If I also told you what I planned to spend, you would naturally make some informed assumptions about just what I was going to buy.

I might say £6,000 and you'd think maybe a second-hand car or low-spec Skoda. Somewhere between £15,000 and £40,000 and you'd probably assume that I was looking at a new car. And how about if I said £100,000 or £200,000? Then, perhaps, you'd get really interested in what kind of car I was buying (and where I'd got the money from).

But what if I told you I was spending a million pounds? Now you'd be a bit bemused. Can you buy a car for a million pounds? you might ask. What is it, armoured? Jewel encrusted?

Which brings me, perhaps not obviously, to the Business Link website.

Yesterday, the government announced they are scrapping Business Link: http://www.bmmagazine.co.uk/Business-Link-to-be-scrapped-by-Government.870 Now, depending on your experiences with Business Link, you might think that is the loss of a resource that is valuable to small business or you might think it's about time. (Personally, I enjoyed my meetings with my local advisor - he's a nice chap - but I'm not sure I ever got anything out of it apart from a bit of local business gossip.)

But one thing in the article really caught my eye: "the total cost of developing the Business Link website is a mind blowing £35 million". Now, I can honestly say, having been in IT for 22 years and in web development since 1997, that I have no idea how they have managed to spend that much money on that website. Certainly at that price - and notwithstanding the fact it is government funded - I would expect it to be compliant with the Disability Discrimination Act (it isn't) and not have obvious bugs (like its CSS errors).

Even a quick scoot round the site - which you can find at http://www.businesslink.gov.uk/ - will demonstrate that it's feature rich and has plenty of useful resources. But you could speak to any number of web development companies of a certain size and find they have developed sites on a similar scale for a fraction of that cost (the exception, of course, being the company that built the Business Link site, who are presumably maintaining it from their summer offices in the south of France).

Of course, this is ultimately about government's widely acknowledged yet never tackled problems around IT procurement. All too often, the tail wags the dog, with government suppliers telling their clients just what they will deliver and at what cost. Yet the solution is simple: start talking to and dealing with smaller suppliers rather than the handful of large companies who have been fleecing the taxpayer for years.

Thursday 1 July 2010

Building in bad practice - another BBC adventure

A few years ago, I was working as a freelance test manager for JPMorgan Chase in London and at the end of the contract I was invited to work on a problem project in New York. The project in question was having difficulties delivering to an aggressive schedule that required new releases every three months, with each release having a six-month life cycle. This meant that at any given time there would be at least two releases in development.

The team on the project were very good, including a project manager, David Hodges, with whom I had worked before and whom I held in high regard. At the first meeting we sat down and spent a rather dispiriting couple of hours looking at the project's history, where it was now and the future requirements.

Applying a bit of common sense, we decided on an initial release that was limited to correcting some live issues and started a conversation with the business unit to whom we were delivering about what they really needed in the very short term. Understandably, and given the frustrations they had endured, the business team wanted as much as possible in the next release; they had been waiting some time for important functionality.

So, we explained the challenges we had on our side and in the end, with their cooperation, we defined a manageable release. When we sat and looked at it, it didn't look too ambitious and, indeed, it was delivered on time, fully tested and functional. This set the pattern for subsequent releases and, despite their initial misgivings, the business unit ended up very happy with a steady stream of change, delivered every three months as required.

Yesterday my colleague, Steve Parker, sent me the following link: http://www.wired.co.uk/news/archive/2010-06/28/project-canvas-youview

Of course, I was interested in this from a technology point of view and also for the details of the predictable spat over the question of public money distorting the private market, for which I have some sympathy (although little for Virgin and BSkyB, specifically). However, what really caught my eye was the following sentence: "If all that's met, and the project doesn't go more than 20 percent over its budget (which has also been prohibited by the BBC Trust), then Project Canvas is expected to launch in the early part of 2011."

This strikes me as symptomatic of the problems with IT development in the public sector, i.e. the acceptance that IT projects will go over budget. Whether it is conscious or not, the Trust is tacitly approving a budget overrun of 20%. This means those running the project, who should be responsible for the budget, won't begin to worry until that figure of 20% is approached. They certainly won't worry at the point where they are approaching the limit of the official budget.

Furthermore, previous experience tells us that if the budget limit is reached, no one will want to throw away millions of pounds worth of work and more (licence payers') money will be found to shore up the project. We may get the minor satisfaction of seeing a slapped wrist or two.

The point of this blog, as you will probably have guessed by now, is that there is no reason IT can't deliver to a budget, whether it's financial or, as in the case at JPMorgan Chase, time-limited. If the BBC have a budget for this development, then their suppliers, whether they are in-house or external, should learn to work to that budget. To deliver a successful IT project, it is important to be realistic, not ambitious. Software development is both technical and creative and consequently there is plenty of scope for issues to arise and cause slippage. Furthermore, technology is a moving target and there is no doubt that the environment around this development will evolve as it progresses. The BBC should set their project milestones now, so that this project can be cancelled early if it is not going to deliver to the proper budget, and they should remove the 20% safety net immediately.

Monday 21 June 2010

Now that's 21st Century customer service

One of the reasons this blog has been neglected for the past few weeks has been the fact that we've just been flat out with work and, whilst there are a few matters that I want to write about, I had an experience yesterday that has jumped to the head of the queue.

I'm in London today for the 'more exciting than it sounds' Automated Transfers to Local Authorities workshop, which has already scored its programme initiation goal of having a Greek acronym in ATLAS (although Midas remains the IT industry favourite in my experience).

As it's a fairly early start in London, I caught the train down yesterday afternoon. We left in good time for the station and I went to pick up my tickets from the machine at the station, having ordered them online from The Train Line. I popped in my card and I was prompted for my booking reference, which caught me off guard, as it normally just spews the tickets out.

The Train Line website has a nifty feature which enables you to SMS your journey details to your 'phone, so fortunately I had them to hand but the reference wasn't recognised. After two more attempts I gave up and bought fresh tickets from the booking office, just as the train rolled in. It was a mad rush to kiss my wife goodbye and jump on the train with my laptop, suit and overnight bag but I made it to a table and made to stow my luggage, at which point I realised I was still holding the car keys.

In a most un-English way I shouted for someone to hold the door, thus paralysing everyone else on the carriage at a stroke, but I made it in time to hand the keys to my wife, who was determinedly striding the length of the train looking for me (and probably considering the eight mile walk home with two small children).

Eventually, then, I was sat in my seat, luggage put away, book, laptop and iPod out, and I decided to Tweet my success against the odds: "Heading down to London for the DWP meeting, despite TheTrainLine's ticket pickup process letting me down #moredismalsoftware".

Less than thirty seconds later I had a reply from @thetrainline "@fennerpearson Hello, anything I can help you with? Dave"

Needless to say, my irritation with The Train Line flip-flopped in an instant and, fond of the website as I have been in the past, now I was in love. Before the train was at the next station, David Wilkins, the man at the end of the Twitter line, had sorted out my problem and given me instructions to obtain a refund.

There is a huge amount of suspicion and even disdain when it comes to social media and I'm not saying I don't have doubts about it myself from time to time but here is a brilliant example of a company making full use of the tools at its disposal to give a personalised service in a transaction environment that - arguably as one of its benefits - has almost no contact between business and customer.

So, hats off to The Train Line for imaginative use of technology and thereby turning a very disgruntled customer into one who is now delighted to have had a stressful start to his journey, just so he could experience their use of Twitter. As anyone running a business knows, it's easy when everything's going right. It's how you deal with situations when they go wrong that sorts out the average companies from the excellent ones.

Thursday 15 April 2010

Computing article: "Calls for more transparency in ICT procurement"

Subsequent to my post on April 11th, an article appeared in Computing entitled "Calls for more transparency in ICT procurement", which you can read in its entirety here.

The article, which reports on a 'roundtable' debate chaired by Janet Grossman (the ex-chief operating officer for the Department for Work and Pensions), makes the point that, in order for government IT projects to succeed, projects do need to be smaller, which was one of the points in my posting.

As an additional point of interest, the article also makes reference to the fact that government procurement is tied up by a small number of large companies.

It's interesting that both the problems arising from and the solutions to government's IT problems can be expressed in common sense terms, easily understandable by the layman.

Sunday 11 April 2010

Why I believe the government has it wrong on IT

In recent weeks, as the main parties look for somewhere concrete to cut costs in order to locate the billions they need to save, they have alighted on IT as a strong contender for saving money. And who can blame them?

Government's track record on IT is appalling. Huge overspends on systems that fail to deliver, with just a handful of large suppliers carefully tying up the market to their benefit, and decidedly NOT to the advantage of either government or taxpayer.

The Child Support Agency's CS2 system, for example, is riddled with "insurmountable bugs" that mean many cases have to be handled manually: 19,000 in 2006 rising steadily to 75,000 by September 2009.

The problem, here, I believe, is how government buys its IT. It appears to completely lack the expertise to make procurement decisions that will deliver the required outcome. I have met a lot of intelligent people in local government who are quite incredibly committed to delivering good service. They know what they want from IT but seem quite unable to get it from the handful of providers who have found seats on this particular gravy train.

Recently we tendered for some work from a local authority, which had talked about developing an open source system that could be shared with other councils. Yet, when we received the paperwork, it was clear that the current provider had the authority locked into its own CRM solution. Elsewhere, when we have tendered successfully, we have delivered working solutions at a fraction of the price the larger software houses have quoted, in one case coming in at less than a sixth of the cost.

So, as I said, I can see why government has had enough of IT, spending billions on projects that are either abandoned or, once delivered, don't meet the requirements. However, I think it would be a serious mistake to abandon IT solutions. Going back to the CSA for a moment, the National Audit Office has calculated that when a case can be processed through the system, the cost to the taxpayer is £312. When processed manually, the cost roughly trebles, to £967. The 75,000 cases being processed manually last September were therefore costing seventy-two and a half million pounds, instead of the twenty-three and a half million it would have been if CS2 had been able to handle them, a difference of nearly £50M.
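For anyone who wants to check the arithmetic:

```typescript
// The NAO figures above, worked through.
const manualCases = 75_000;
const costViaCS2 = 312;    // £ per case, through the system
const costByHand = 967;    // £ per case, handled manually

console.log(manualCases * costByHand); // 72,525,000 -> ~£72.5M
console.log(manualCases * costViaCS2); // 23,400,000 -> ~£23.4M
console.log(manualCases * (costByHand - costViaCS2)); // 49,125,000 -> ~£49M
```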

I believe, then, that the government does need IT. Furthermore, if civil service numbers are going to be allowed to decline by 40,000 jobs (and probably a lot more), then I would argue that government needs IT more than ever. Civil servants need to be freed up from manual processes and administration, to focus on serving the public. (See my blog earlier this month, "Good business systems are about liberating your staff, not getting rid of them.")

However, the solution to the government's problem is not so difficult. Let's look at the issues for a moment, which primarily arise from the simple fact that large projects are notoriously difficult to handle:
The longer the project, the more legislative changes it will suffer over its development cycle.
Big projects are hard to manage, where large teams and deliverables need to be coordinated.
The huge amount of analysis that is necessary means that detail is skipped, so requirements are misinterpreted or lost and parts of the solution don't integrate.
Vague specifications lead to inaccurate costs, so the supplier ups the price to protect themselves. (And add to this the fact that large companies have high salary overheads, so they want as many of their consultants as possible on any project.)

Projects away from the public sector are far less likely to be run this way, with the client wanting far more clarity about precisely what will be delivered, as well as how and when, before committing any money to the project. Government should learn from this, commissioning comprehensive, precise specifications as a separate part of the project and hiring third party companies to evaluate them. Projects should be broken down into manageable releases, such that everyone involved has a clear grasp of what is going to be delivered, and on what date and at what cost. What I am saying, in effect, is that the solution is to turn these large projects into a number of smaller projects.

You might, quite reasonably, argue that as someone who runs a software house, I have a vested interest in this argument and that may be true, but I believe the figures speak for themselves. Time and again, I have seen software benefiting my clients' companies, including some local authorities. Government must not turn its back on IT; it will be more vital than ever over the next few years. What it does need to do is to learn how to buy IT, including being a bit more discerning about those suppliers from whom it buys.

Tuesday 6 April 2010

Good business systems are about liberating your staff, not getting rid of them.

As you might suppose, I’m a big advocate of computers and computer systems. I think the right bespoke software can make the difference between a company being good and great, especially in the eyes of their clients.

However, I would not for a moment suggest that computers are better than humans. Sure, there are some jobs that a computer can do faster, more accurately and more effectively but, in business, an awful lot of the time those jobs are the ones that humans don't want to do. Repetitive, predictable, frequent, labour intensive tasks.

And, really, no business wants to take people on to do those tasks. Businesses like people who contribute to the company’s success because of the skills and talents for which they were hired. They don't want to take people on whose sole role is to chase around bits of paper, add columns of numbers or keep track of where items are kept in the warehouse.

Most small businesses, as they grow, develop processes. A lot of the small, successful businesses that I go to see have very evolved processes, often involving spreadsheets and order forms and bits of paper being moved from one tray to another one. And I’m not being damning there: these are often very good systems that have helped the company to make its presence felt in its marketplace.

But, as these companies grow, so these systems start to creak. Bits of paper go missing, the person who understands how the formulae in the spreadsheet work is off sick, an order gets lost. The people who have other, important jobs to do, like making sales or sending out invoices, are distracted by the work required to keep the system turning over. At this stage, a small business might start to consider hiring people just to administer that process. And just how do you go about recruiting someone to do a really dull job?

You don't need to have pre-cognitive powers to spot that I would suggest that at this point you get my company in to build you a bespoke software system to replace that process, or rather to emulate it in an application that lets your company run the same way, only better. However, that is not the point of this post. What I want to say is that by introducing a bespoke system to replace the process, you not only save yourself the difficulty of recruiting those lovers of mundane tasks but, crucially, you liberate your staff. You are not replacing anyone; you are benefiting them and your business.

Those people who were beginning to flag, who were complaining that they didn't have time to do their job properly, who had stopped thriving in the workplace, suddenly find that those dull aspects of their job, tasks that, in fact, weren’t really part of their job have gone. They have hours in the day to do the role they were hired to do. Rather than looking over their shoulders for yesterday’s missing order form, they have time to look ahead and think what they could be doing to improve the way they work and the company’s prospects.

Bespoke systems will save you money, not least in saving you from hiring staff who do nothing but administer your business. But they won't replace your staff, rather they will help you to get the best from them.

Tuesday 9 March 2010

PCI Compliance: What? Why? Where? When? Who? And how?

Welcome to my first post of 2010. We've been busy - really busy - since the start of December and I've not had time to get my thoughts in order let alone down on the blog. However, since part of the reason we are so busy is that we are in the process of working on two large e-commerce sites, my thoughts have been concerned with the issues around selling goods on line and that topic will be the subject of the first three posts (unless anything else crops up, of course).

The first topic to discuss is PCI compliance, not least because so few of the people I meet who are trading online, particularly in small and medium-sized companies, seem to understand its implications.

So, the logical first question is "what is PCI compliance?"
The term refers to the Data Security Standard set by the Payment Card Industry. The details can be found here but, in précis, the standard is concerned with ensuring that customer data - particularly credit card data - is held securely. In fact, the standard is so strict that the practical solution is to avoid holding credit card details altogether. Fortunately, many of the better payment gateways have made this a viable option for e-commerce, if not for telesales.

A reasonable question following from this, then, is why is the standard so strict? Well, whatever my minor gripes about the standard (see below), there is no question that there was an increasing number of e-commerce sites around that were not built to any given security standard, and many of these were - and still are - holding unencrypted credit card details. Of course, these same sub-optimal sites were also the ones most vulnerable to hackers. Thus, it is understandable that the payment card industry decided to set a standard.

As far as the where question is concerned, PCI applies wherever you are trading.

So, if you are processing credit cards online, a good question is "when do I have to become PCI compliant?" The answer is that you already should be. Although it appears to be relatively easy to trade online without being PCI compliant and, if you are 'caught', you will usually receive a reasonable period in which to achieve compliance, matters become far more serious if your security is breached and credit card details are stolen. The fines are punitive and your company will then be under constant scrutiny, with expensive top-level compliance required.

The who, as you will have gathered by now, is anyone trading online. The banks have written to their merchant clients telling them of the need to be compliant, so I anticipate that there can only be a relatively small percentage of online traders who are genuinely unaware of the requirements around compliance.

The $64 question, then, is how to achieve compliance. If you are holding credit card details and intend to carry on doing so, then there are some big hurdles to jump. Just download the certification documentation from the PCI site and you will see just how extensive the requirements are. If, however, you use one of the better payment gateways, such as SagePay, there are ways around holding credit card details, even in complex situations involving delayed charging (such as when an order is shipped in multiple parts). Not holding card details makes achieving compliance a lot easier.
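To illustrate what 'ways around holding credit card details' looks like in practice, here is a sketch of a tokenised flow. The gateway interface below is hypothetical - it is not SagePay's actual API - but the shape is representative:

```typescript
// A hypothetical payment gateway interface: the card number is
// exchanged once for an opaque token, and all later charges are made
// against the token. The merchant never stores card data.
interface PaymentGateway {
  tokenise(cardNumber: string, expiry: string): Promise<string>;
  charge(token: string, amountPence: number): Promise<void>;
}

async function takeOrder(gateway: PaymentGateway, card: string, expiry: string) {
  // At order time: hand the card details straight to the gateway and
  // keep only the token it returns.
  const token = await gateway.tokenise(card, expiry);

  // Later, as each part of the order ships, charge against the token.
  // No card details are ever written to our database, which removes
  // most of the PCI standard's requirements from our systems entirely.
  await gateway.charge(token, 1999); // first shipment: £19.99
  await gateway.charge(token, 500);  // second shipment: £5.00

  return token; // stored in place of the card number
}
```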

It seems to me that advertising relating to the standard would be a good way forward. Educating customers to stop them using sites that don't demonstrate certification would encourage e-commerce sites to adopt a proper level of security. It would also avoid naive companies - who, perhaps, have not been properly advised by their web provider - incurring large, unexpected fines when their sites are hacked.

There's little doubt in my mind that the standard is flawed and the fact that the banks appear to have followed the PCI slavishly hasn't helped. Although we, Meantime, do not trade online, we have taken the trouble to achieve PCI compliance as a company, as well as building e-commerce sites that are compliant. This was very difficult, unnecessarily so, and, in places, the requirements were illogical. However, PCI is the standard and that is what we have had to follow. All e-commerce companies need to follow suit.