Archives for posts with tag: software

A while back I made an analogy about error pages on websites. When you are browsing a nice website and get slammed with a default or ugly error page, it is like going to a really fancy restaurant only to find a terrible bathroom. This may seem like a weird analogy, especially since I have seen some really nice bathrooms before, but ignoring this detail can really disrupt a nice meal, or for web users, really disrupt their experience.

Where did all my pretty colours go? I just wanted to watch cat videos!

Ideally, if someone typed in a URL wrong, or heaven forbid clicked a dead link on your site, I think it is only fair to quickly explain the problem and help them along their way. Remember, arriving at an error page is a frustrating experience. Your user did not mean to get there, and the customer is always right, so it is your job to apologize (if you choose) but more importantly comfort them so they don’t just leave. By maintaining the same look and feel, the user doesn’t suffer the huge context switch they would experience on a default error page served by Apache or IIS. This is how we handled it at http://thoora.com (yes, I’d like to think we are somewhat fancy).
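In Apache this is typically wired up with a single directive (a minimal sketch; the path and filename here are hypothetical, not Thoora's actual setup):

```apache
# Serve a branded error page instead of Apache's default.
# Using a local path keeps the 404 status code intact.
ErrorDocument 404 /errors/404.html
```

Pointing `ErrorDocument` at a full external URL instead of a local path would cause a redirect and lose the 404 status, which defeats the purpose discussed below.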

Maybe not as fancy as the bathroom at the Ritz, but hopefully it shouldn't ruin my customers' meal

Note the link to the main page, which quickly ushers them back to the regular service. More and more you tend to see graceful error handling these days, but I think it is good not to go overboard. If we don’t clearly illustrate that an error has occurred, the user might not even know that something has gone wrong. Arguably one could say that hiding errors from users is good where possible, but if we encounter a 404 and the resource literally doesn’t exist, we should tell them. I have seen error pages that look exactly like other pages, so overdesigned that it is hard to tell what is going on. Similarly, I have seen pages that treat a missing page as an invitation to serve me something completely different (although I might not be completely opposed to a recommendation of similar alternative pages). When something goes wrong, let users know and help them recover gracefully. This is especially true for the other type of visitor on your site: the visitors that probably visit more often than you think, and really care about these dead pages. The robots.

Decorating the bathroom and having a guy hand you a towel is great, but you still have to call it a bathroom. If you don’t, Google (et al.) will constantly direct some of your traffic straight to your bathroom instead of your dining area. The robot needs to know that something bad happened, and Apache (and most web servers) will serve the correct HTTP response code for you. I can’t talk about this without thinking of an old commercial I used to love. I laughed so hard every time I saw it (it’s been years since that campaign ran). Below is a screenshot of Firebug capturing a nice 404 back from a server. This lets your browser, and more importantly the spiders/robots, know that even though the page looks nice, something went wrong (Tim Berners-Lee would be so happy).
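The principle can be sketched in a few lines of Python (a hypothetical illustration, not Thoora's actual code): the human gets a friendly, branded page, but the status line still says 404 so the robots get the message too.

```python
# Hypothetical illustration: lookups return both a status code and a
# body, so a pretty error page never masks the real HTTP status.

BRANDED_404 = (
    "<html><body>"
    "<h1>Oops, that page does not exist.</h1>"
    '<p><a href="/">Back to the main page</a></p>'
    "</body></html>"
)

def handle_request(path, pages):
    """Return (status_code, body) for a requested path.

    Known paths get a 200 with their content; anything else gets the
    branded error page served with a genuine 404 status code, so
    browsers and crawlers alike know the resource is missing.
    """
    if path in pages:
        return 200, pages[path]
    return 404, BRANDED_404
```

The key design choice is that the body and the status code are decided independently: customizing the former never silently changes the latter.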

Bots aren't as smart as (most) humans; they need help from time to time

So if you are going to fail, fail gracefully. And if you are going to customize your errors, make sure you don’t forget about the robots. On a side note, I encourage everyone to use Google Webmaster Tools to see what the spiders are tripping on. Google is great at finding links you didn’t even know existed on your site ;)

Further reading: ISBN-13: 978-0735714106


Why are things always so obvious after a disaster happens? I think they call that hindsight

One of the final talks I attended at CUSEC 2010 in Montreal was probably the most important and, unfortunately, (seemingly) the most underrated. Daniel Berry from University of Waterloo Software Engineering gave a talk entitled “Ambiguous Natural Language in Requirements Engineering”. Unfortunately he was speaking to an Agile-minded audience steeped in a culture that declares the waterfall software process dead, and, more inaccurately, that requirements gathering is a dead science. The talk was not only enlightening but also made me rethink many topics I hadn’t thought about since university.

In summary, for those who don’t want the verbose version: the process of taking an ‘idea’ (as a requirement) and turning it into ‘code’ is a problem that extends well beyond just the ‘code’ part. Many software errors (5-10%) are the result of ambiguous requirements, requirements that neither party in the process even knew were ambiguous. This talk was a branch of software engineering shining at its best, and Berry offered solutions!

First, to address those developers still pretending this problem doesn’t apply to them: not all software projects involve 1-2 developers. Not all projects allow you (the developer) to talk directly to a client. Some projects will have you writing code based on requirements written by someone else in your company. This problem exists whether you want to believe it or not.


This happened to me a while ago, but I recently stumbled upon the screenshots somewhere on my desktop (yes, I take a screenshot whenever a website does something that pisses me off). This will begin my hopefully abundant series of rant posts about otherwise great/successful software applications and websites. I found Twitter was much too short to fully express myself, but I’ll leave my Twitter rant for another occasion.

The target of this post: YOUTUBE.com, yes you.. tube </bad joke>.

YouTube’s recent UI change (in the last quarter or so) added some new modules to the front page: recommended videos, popular videos by category (love this module), videos being watched right now (which I still don’t understand), and featured videos.

Here is what I saw:

Youtube's home page

Look at all the great suggested videos for me to watch!
