December 06, 2003 | too hard

URLs were supposed to help us navigate the web with ease. Instead they have become one of the biggest usability problems on it. We have gone from being able to read a URL to someone to having to ask whether they are online so we can just send them the URL. If we have all of these great CMSs running around, written by talented programmers, why can't they come up with a scheme for generating usable URLs? I mean, Movable Type lets users set up human-readable URLs, and that was written by two people for free.

Imagine typing in your address bar:


and actually getting your results. The point is that URLs could be so much easier, and they would be so much more useful. Apple gets it right. When I need info on the iPod, instead of searching the Apple site I just type in:


and I am presented with the iPod page. Amazingly simple. Why does everyone insist on making it so hard?

Making useful URLs does take a little more work up front, because you need a good architecture. But isn't it better to do all the work at the beginning instead of beating yourself over the head a year down the road? The art of making useful URLs is definitely a lost one. Maybe we should bring it back to the forefront of our minds.
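The idea is simple enough to sketch. Here is a minimal, hypothetical example of slug-based routing: a human-readable path like /ipod maps straight onto a page, with no query string to parse. The page table and names below are illustrative, not taken from any real CMS.

```python
# Minimal sketch of slug-based routing -- the page table and names
# here are illustrative, not from any real CMS.
PAGES = {
    "ipod": "iPod product page",
    "forum": "Discussion forum index",
}

def resolve(path):
    """Map a human-readable path like /ipod straight to a page."""
    slug = path.strip("/").lower()
    return PAGES.get(slug, "404 Not Found")
```

With something like this in front of the CMS, resolve("/ipod") finds the page directly, instead of making a reader decipher /page.cgi?id=1234&cat=7.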



I don't think Apple does a much better job than most. Try http://www.apple.com/filevault --> 404

/ipod is just common enough to warrant a "directory." On my sites I might have /forum for the same reason, instead of /forum.php

Human-readable URLs are very important, though, not only for the reader's sake but for indexing, since a lot of pages with query strings won't be archived or indexed by search engines (for good reason). I think equally important, if not more so, are persistent URLs (see purl.org). With so many sites coming and going, so many sites getting "redesigned," and files getting moved around, the organization of pages on the Internet is a disaster. Further, if pages are not being cached/archived/indexed due to "bad" URLs, we lose large numbers of resources/pages...

With all of the dynamic sites and pages these days, it would be hard to convert all URLs to be human readable, and even your Google example might be too difficult for the average web-goer. Hopefully browsers and things like favelets can make these common tasks (e.g. Google searches) more convenient. As an example, the easiest and quickest search method for me is typing the string into a Google search box next to my address bar in Firebird...

When MT makes the human-readable URL, does it actually put a file on the system? If so, this would be infeasible for a site like, say, slashdot.org. Being a web developer I know it isn't necessary to use an actual file; I'm just curious. In fact, I do something like this on a site I just rebuilt [ http://www.radiotakeover.com ]. Granted, more work needs to be done with it ;) But there are a few thousand artists on the site, and instead of searching or remembering the ugly "machine-readable" URL you can use URLs like http://www.radiotakeover.com/earthcrisis to get to the "Earth Crisis" page.
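A sketch of the kind of lookup described here, with no per-artist files on disk. This assumes slugs are made by lowercasing the name and stripping everything but letters and digits (the actual radiotakeover.com scheme may well differ):

```python
import re

def slugify(name):
    # Lowercase and keep only letters and digits, so
    # "Earth Crisis" becomes "earthcrisis".
    return re.sub(r"[^a-z0-9]", "", name.lower())

# Build the lookup table once (in practice, from the database);
# no actual file per artist is needed.
ARTISTS = {slugify(n): n for n in ["Earth Crisis", "Metallica"]}

def artist_page(path):
    """Resolve a request path like /earthcrisis to an artist."""
    return ARTISTS.get(path.strip("/").lower(), "404 Not Found")
```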

Oh, and it allows for clever easter eggs, too... Like attempting to find Metallica that way ;)


Posted by: defHex at December 8, 2003 03:05 AM

The Apache URL rewrite module (mod_rewrite) is great at turning messy URLs into friendlier ones, so there's no excuse not to have nice resource-centric URLs (unless you're not using Apache, of course).
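For instance, a rule along these lines rewrites a clean path onto a query-string URL internally, so the reader never sees the machinery. The script name and parameter here are assumptions for illustration, not any particular site's actual setup:

```
# .htaccess sketch -- artist.php and the "name" parameter are
# illustrative assumptions, not a real site's configuration.
RewriteEngine On
# Internally map /earthcrisis to /artist.php?name=earthcrisis
RewriteRule ^([a-z0-9]+)$ /artist.php?name=$1 [L]
```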


Posted by: Paul at December 8, 2003 11:03 AM

http://google.com/search?q=web+design 0_o

Posted by: Flexer at December 8, 2003 11:43 AM

Haha, touché, Flexer. Almost totally easy, not quite, but almost.

I agree that the URL rewrite module would definitely help a lot of sites out.

Posted by: The Scholar at December 8, 2003 01:47 PM
