Wednesday 22 September 2010

What yesterday's problems tell us about Twitter's testing

Those of you who keep an eye on the news - and who don't switch off at the mere mention of Twitter - will probably be aware that Twitter had problems yesterday, when users with development skills were able to include code in their posts, leading to the problems described in The Guardian (here) and The Telegraph (here).

Unsurprisingly, the language used to describe the incident is full of the kind of words that IT people use to keep everyone else at a distance and to spray a little nerd glamour on themselves. But for all the talk of malicious code, worms, onmouseover, hacks and loopholes, the truth of the matter is remarkably straightforward.

So, first, a quick description of how browsers work. When you load a web page, your browser requests HTML (the language in which web pages are written) from the web server and, as it receives the code, uses it to build the page from the top down. It's a fundamentally simple process: the browser processes each line in turn. So, when a browser displays a page of 'Tweets', the messages are simply part of the HTML. If other code is included in the HTML for one of those messages, the browser interprets that too.
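To make that concrete, here's a minimal sketch (in Python, and emphatically not Twitter's actual code) of the difference between dropping a user's message straight into the page and escaping it first. The tweet text and markup are invented for illustration.

```python
# A minimal sketch (not Twitter's code) of why unescaped user text is dangerous
# when it is dropped straight into a page.
import html

tweet = 'Nice day! <a href="#" onmouseover="alert(\'gotcha\')">hover me</a>'

# Naive approach: the browser treats the tweet as markup, so the
# onmouseover handler becomes live code on the page.
unsafe_html = f"<p class='tweet'>{tweet}</p>"

# Escaping first turns characters like < > " into harmless entities,
# so the browser displays the text instead of interpreting it.
safe_html = f"<p class='tweet'>{html.escape(tweet)}</p>"

print(unsafe_html)
print(safe_html)
```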

I first came across this issue ten years ago, when I was leading the testing of RBS's Digital Banking software, their first purely web-based Internet Banking service. During our early testing I discovered that, on the page where a user could name their accounts, I could enter basic HTML, which would then affect how the page was displayed once it had been saved.

From then on, every entry field in the website had to be coded so that any characters that might be used to insert code were not permitted. And every test script included tests to ensure that if those characters were used, the data would not be saved to the database. We had a few data input boxes and the testing was time-consuming but, of course, it was vitally important that no one could introduce code and make the site work in a way that wasn't intended. These scripts were used to test every release of Digital Banking, even if the changes were in a different part of the system from the data entry.
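The checks themselves were not complicated. Something along the lines of the sketch below captures the idea - the validate_account_name function and the list of forbidden characters are hypothetical stand-ins, not the actual Digital Banking code - reject anything that could be used to insert markup, and run the same checks against every release.

```python
# A simplified illustration of the kind of regression check described above.
# validate_account_name is a hypothetical stand-in for the real validation code.
import re

FORBIDDEN = re.compile(r'[<>"&]')  # characters that could be used to inject markup

def validate_account_name(name: str) -> bool:
    """Reject any account name containing characters usable for inserting code."""
    return not FORBIDDEN.search(name)

# Checks like these were run against every release, whatever else had changed.
assert validate_account_name("Holiday savings")
assert not validate_account_name("<b>My account</b>")
assert not validate_account_name('x" onmouseover="alert(1)')
print("input validation checks passed")
```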

Twitter has one input box. That's it: one. It might be deployed on one or two different pages but it is the same code, the same function.

So, what does this tell us about Twitter's testing? If it tells us one thing, it tells us it isn't as robust as it should be. It doesn't really matter whether the issue is down to a tester who ticked a box without actually doing a test, an automated script that wasn't run, poorly documented test scripts or a missing process that should confirm that all scripts are complete. This was a bad drop by an organisation that has tens of millions of users and a burgeoning usage by business.

For as long as I have been in IT, testing has been the poor cousin to development, and regarded as an unnecessary headache by developers. IT and project managers must never lose sight of the importance of this stage of IT development: customer and client confidence is easily lost and difficult to regain.

Friday 17 September 2010

Keeping content accessible and other things *you* should do for your site

There are a lot of things you might ask, or even demand, from the company that builds your web site. Certainly one of those things is that it should comply with the Disability Discrimination Act, but you might also ask for, say, a news page and a calendar of events.

But whilst it's very easy to have ideas at that point in proceedings, you should think about your ongoing commitment. There's nothing that makes a site look sadder and more neglected than a news box on the front page that hasn't been updated for months or a calendar with nothing on it. It's important to ensure that somebody in your company - or perhaps someone outside it: your copywriter or marketing people - takes responsibility for that content and ensures it is updated regularly and with some care.

However, the main reason for this quick blog is to do with your responsibility for accessibility. Today we are sending out our latest newsletter, which is about accessibility and the DDA. In the newsletter we provide a link to a tool for checking accessibility and, of course, it occurred to us that a sharp client or two might use it to check our sites: our own and the ones we've built.

We were more than a little surprised at first to find some of them failing, because we always check our sites for both W3C and DDA compliance once they're finished. However, on closer inspection, we found that it was the user-generated content that was causing the problem and not the code we'd written. That was, of course, a relief, but then Louise asked whether we had even spoken to our clients about how to keep their content accessible. Well, that did take the smiles off our faces.
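For what it's worth, one of the most common culprits in user-generated content is an image uploaded without alternative text. A rough illustration of the kind of check involved (not the tool linked in the newsletter) might look like this:

```python
# An illustrative check for one common accessibility problem in user-generated
# content: images published without alt text. The sample content is invented.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        # Count <img> tags with no alt attribute (or an empty one).
        if tag == "img" and not dict(attrs).get("alt"):
            self.missing += 1

content = '<p>Our new office</p><img src="office.jpg">'
checker = MissingAltChecker()
checker.feed(content)
print(f"{checker.missing} image(s) missing alt text")
```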

So, from next week, we will be briefing our existing clients on how to make sure that the content they put up is accessible, and we'll make it part of the training for our new ones.