A while back I made an analogy about error pages on websites. When you are browsing a nice website and get slammed with a default or ugly error page, it is like going to a really fancy restaurant only to find a terrible bathroom. This may seem like a weird analogy, especially because I have seen some really nice bathrooms before, but ignoring this issue can really disrupt a nice meal, or for web users, can really disrupt their experience.
Where did all my pretty colours go? I just wanted to watch cat videos!
Ideally, if someone types a URL wrong, or heaven forbid clicks a dead link on your site, I think it is only fair to quickly explain the problem and help them on their way. Remember, arriving at an error page is a frustrating experience. Your user did not mean to get there, and the customer is always right, so it is your job to apologize (if you choose) but, more importantly, to comfort them so they don’t just leave. By maintaining the same look and feel, users avoid the huge context switch they would experience on a default error page served by Apache or IIS. This is how we handled it at http://thoora.com (yes, I’d like to think we are somewhat fancy).
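In Apache, swapping the default error page for a branded one takes a single directive. A minimal sketch, assuming your custom page lives at `/errors/404.html` (that path is my example, not a convention):

```apache
# In httpd.conf or a .htaccess file: serve the branded page for 404s.
# Because the target is a local URL path, Apache keeps the real 404
# status code on the response rather than replacing it with a 200.
ErrorDocument 404 /errors/404.html
```

One thing to watch: if the `ErrorDocument` target is a full external URL instead of a local path, Apache issues a redirect and the original 404 status is lost, which matters a great deal to the robots discussed below.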
Maybe not as fancy as the bathroom at the Ritz, but hopefully it shouldn't ruin my customers' meal
Note the link to the main page, which quickly ushers them back to the regular service. You tend to see graceful error handling more and more now, but I think it is good not to go overboard. If we don’t clearly illustrate that an error has occurred, the user might not even know that something has gone wrong. Arguably, hiding errors from users is good when possible, but if we encounter a 404 and the resource literally doesn’t exist, we should tell them. I have seen error pages that look exactly like other pages, so overdesigned it is hard to tell what is going on. Similarly, I have seen pages that treat a missing page as an invitation to serve me something completely different (although I might not be completely opposed to a recommendation of similar alternative pages). When something goes wrong, let users know and help them recover gracefully. This is especially true for the other type of visitor on your site: the kind that probably visits more often than you think, and really cares about these dead pages. The robots.
While decorating a bathroom and having a guy hand you a towel is great, you still have to call it a bathroom. If you don’t, Google (et al.) will constantly direct some of your traffic straight to your bathroom instead of your dining area. The robot needs to know that something bad happened, and Apache (like most web servers) will serve the correct HTTP response code for you. I can’t talk about this without thinking of an old commercial I used to love. I laughed so hard every time I saw it (it’s been years since that campaign ran). Below is a screenshot of Firebug capturing a nice 404 back from a server. This lets your browser, and more importantly the spiders/robots, know that even though it looks nice, something went wrong (Tim Berners-Lee would be so happy).
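The whole trick is that the friendly HTML and the honest status code travel together on the same response. A minimal sketch of that idea, using Python’s standard `http.server` to stand in for a real web server (the page markup and the `Oops!` wording are just examples):

```python
# Demonstrates serving a custom-branded 404 page while still returning
# the real 404 status code, so crawlers don't index it as a normal page.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.error
import urllib.request

CUSTOM_404 = b"<html><body><h1>Oops!</h1><a href='/'>Back home</a></body></html>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/":
            body = b"<html><body>Welcome</body></html>"
            self.send_response(200)
        else:
            # The friendly page goes out with an honest 404 status,
            # not a 200, so browsers and spiders know the truth.
            body = CUSTOM_404
            self.send_response(404)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Spin up the server on a free local port in a background thread.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# urlopen raises HTTPError for 4xx responses; the custom body rides along.
try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/no-such-page")
    code, page = 200, b""
except urllib.error.HTTPError as e:
    code, page = e.code, e.read()

print(code)            # prints 404
print(b"Oops" in page) # prints True
server.shutdown()
```

This is exactly what the Firebug screenshot shows: the status line says 404 even though the body is a nicely styled page. A page that looks like an error but comes back as 200 is the “soft 404” that search engines penalize.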
Bots aren't as smart as (most) humans; they need help from time to time
So if you are going to fail, fail gracefully. And if you are going to customize your errors, make sure you don’t forget about the robots. On a side note, I encourage everyone to use Google Webmaster Tools to see what the spiders are tripping on. Google is great at finding links you didn’t even know existed on your site ;)
Further reading: ISBN-13: 978-0735714106