We’re back with the final installment of SEO-friendly website architecture.
Dynamic URL structure is a common issue on large websites driven by Content Management Systems (CMS). I have listed a few points to help you construct a good URL structure for your website.
Following are tips for an ideal URL structure:
Tip 1: Describe your page and keep your URLs short
Tip 2: Keep URLs static, without query strings or dynamic parameters (“?”, “=”, “&”) or meaningless alphanumeric strings (e.g., 3IWHD7937q2)
Tip 3: Use keywords in your URLs and keep them lowercase, so that differences in case never create duplicate URLs
Tip 4: Have fewer folders (keep the path fewer than four directories deep)
Tip 5: Use hyphens rather than underscores as word separators, since search engines treat hyphens as word boundaries
You can use the URL-rewriting plug-ins that accompany most Content Management Systems, or make changes at the server end by implementing ISAPI_Rewrite (for Microsoft IIS) or mod_rewrite (for websites built on Apache servers).
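As a sketch of the mod_rewrite approach (assuming an Apache server and a hypothetical `product.php?name=` query-string scheme), a single rule in .htaccess can map a clean, static-looking URL onto the underlying dynamic one:

```apacheconf
# Hypothetical example: serve /products/blue-widget
# from the real dynamic URL /product.php?name=blue-widget
RewriteEngine On
RewriteRule ^products/([a-z0-9-]+)/?$ product.php?name=$1 [L,QSA]
```

Visitors and search engines see only the clean URL; the query string stays behind the scenes.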
Below are good and bad examples of URL structure:
Good: www.example.com/services/seo-consulting
Bad: www.example.com/index.php?cat=7&id=3IWHD7937q2
Server, hosting and IP architecture:
Tip 1: Choose a reliable service provider to ensure that your server is up and running at all times.
Tip 2: If you have a large global audience, you can use the services of a reputable Content Delivery Network (CDN). This will ensure that your website loads quickly from any location.
Tip 3: Ensure that your website sits in a neighborhood of clean IPs and is not part of any group of spammy IPs. A test on DNSstuff will show you whether you are part of any blacklisted IP groups.
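As a rough sketch of how such blacklist checks work, a DNS blacklist (Spamhaus ZEN is used here as an example zone) is queried by reversing the octets of your IPv4 address and appending the blacklist zone; the helper name below is my own:

```python
def dnsbl_query_name(ip: str, zone: str = "zen.spamhaus.org") -> str:
    """Build the DNSBL lookup name: reversed IPv4 octets + blacklist zone.

    If a DNS lookup of this name returns an answer, the IP is listed
    on that blacklist; if the lookup fails (NXDOMAIN), it is clean.
    """
    octets = ip.split(".")
    return ".".join(reversed(octets)) + "." + zone

# For the documentation-range IP 203.0.113.7:
print(dnsbl_query_name("203.0.113.7"))  # 7.113.0.203.zen.spamhaus.org
```

Services like DNSstuff automate exactly this lookup across many blacklist zones at once.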
Tip 4: Ensure that you configure your reverse DNS entry. Having a reverse DNS entry is the equivalent of putting a return address on your mail. Spammy and blacklisted websites usually do not have a reverse DNS entry configured on their servers.
Tip 5: Search engines prefer websites hosted in the same country as their audience.
Website accessibility and error handling:
Tip 1: Ensure that your website is free of any broken links. Google Webmaster Tools and Xenu's Link Sleuth are two good tools for finding error URLs on your website. Fixing broken links not only keeps the visitor on your page, it also ensures that PageRank and keyword ranking value are not lost.
Tip 2: Do not use 302 redirects for pages that have moved permanently, as they signal a temporary move and may not pass link value. Use 301 server-side redirects instead.
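On Apache, for example, a permanent server-side redirect can be declared with one mod_alias directive in .htaccess (the paths here are hypothetical):

```apacheconf
# Permanently redirect a moved page (301) so link value is passed on
Redirect 301 /old-page.html /new-page.html
```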
Tip 3: Use a custom 404 page for your website. This can help reduce bounce rates whenever a user lands on a broken or non-existent URL: it informs them that the page does not exist and offers links to other important sections of the website.
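On Apache, a custom 404 page can be wired up with a single directive (the file path is an assumption for illustration):

```apacheconf
# Serve a custom error page whenever a URL is not found
ErrorDocument 404 /custom-404.html
```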
Tip 4: Most importantly, do not use robots.txt to disallow any important pages you wish to be crawled and indexed. Also, check that these pages do not carry a “noindex” meta tag.
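As an illustration of what to look for, these are the two mechanisms to audit: a Disallow rule in robots.txt that covers the page, and a noindex meta tag in the page's head (the /services/ path is a made-up example):

```text
# In robots.txt — this would block an entire /services/ section from crawling:
User-agent: *
Disallow: /services/

# In the page's <head> — this tag would keep the page out of the index:
<meta name="robots" content="noindex">
```

If either of these applies to a page you want ranked, remove it.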
Tip 5: Validate your HTML code to help ensure that your website renders correctly across all browsers.
Search engines reward websites that contain accessible and unique content with better keyword rankings. By applying most of the tips provided here, your keyword rankings and website indexing could climb toward the top of the search results.
Contributed by Zuheb SM