Spiders, bots, and users – oh, my! Many different kinds of people (and programs) look at your site. Internal links help both human visitors and search engine bots understand what your site is about and where to find more information. But the two navigate in very different ways – users often search for something specific, while bots avoid the search box at all costs. Fortunately, it is possible to build links that satisfy both users and robots!
1. In-context links

In blog posts, on pages, and anywhere else it makes sense, use internal links to encourage your customers to find out more. These in-context links are crucial to Google’s effort to categorize your pages and understand what they are about. Descriptive anchor text is always better than the words “click here” – for internal as well as external links.
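A quick sketch of the difference (the URL and anchor wording here are invented placeholders):

```html
<!-- Vague: the anchor tells search engines nothing about the target page -->
<a href="/guides/espresso-brewing">Click here</a>

<!-- Descriptive: the anchor text itself describes the destination -->
Read <a href="/guides/espresso-brewing">our espresso brewing guide</a> to learn more.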
2. Breadcrumbs

Help people navigate and stay oriented by telling them where they are on your site. A simple way to do this is with breadcrumbs, which let users click up to more general topics. Breadcrumbs help spiders stay oriented as well, and direct more links back to your general category pages.
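A minimal breadcrumb trail might look like this (the section names and paths are placeholders; many sites also layer structured data on top of this basic markup):

```html
<!-- Each level links back up to a broader category;
     only the current page is plain text -->
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/blog/">Blog</a> &gt;
  <a href="/blog/seo/">SEO</a> &gt;
  <span>Internal Linking Tips</span>
</nav>
```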
3. Site structure

Speaking of categories – make sure your site has a structure, and stick to it. Resist the temptation to create stray pages that can only be reached one way. If you are constantly making pages for special sales, create a single page instead and update it as sales come and go. Instead of a stranded page for each announcement, create a News category so that users can browse previous press releases and see how far your business has come.
4. Sitemaps

Once you have everything organized, let users navigate themselves around using a sitemap. Don’t drill down to individual articles, but make it easy to understand how to learn more about any subject they are interested in. Sitemaps are even more important for spiders, making sure no page is left behind.
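For spiders, the sitemap usually takes the form of an XML file submitted to search engines. A minimal sketch, with a placeholder example.com domain and invented dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/news/</loc>
  </url>
</urlset>
```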
5. Recommended reading
After a user has read a blog post or article, use a plug-in to suggest additional reading opportunities on your site. This is an almost effortless way to keep readers engaged longer.
6. Previous and next posts

In the same vein, let readers see the titles of the blog posts published immediately before and after the one they are viewing. This is particularly useful for an ongoing series of posts, but you never know when a title will catch someone’s eye and encourage them to click over and read another article. These links also help bots crawl the whole archive (albeit very slowly, at a large click distance from the homepage).
7. One URL per page

Each page should be reachable at a single URL. This can be particularly tricky for homepages, which can generally be accessed at both url.com and url.com/index.htm. When linking to a page, be consistent and use the simplest, most common form of the URL. Readers might not notice the difference, but inconsistent links split link value between duplicate copies of a page and can cause search engine penalties.
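One common safeguard (sketched here with a placeholder domain) is a canonical tag telling search engines which version of a page’s URL is the authoritative one:

```html
<!-- Placed in the <head> of both url.com/ and url.com/index.htm,
     this points search engines at the single preferred URL -->
<link rel="canonical" href="https://www.example.com/">
```

The tag complements, rather than replaces, linking consistently in the first place.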
8. Most Popular
Widgets displaying popular posts help both readers and bots understand which pages matter most, and can even drive more interaction when people see how many comments a post has already attracted.
9. Common Searches
A bot will not search your site to find all your content. If you have essential information hidden behind a “search wall,” try creating pages or articles summarizing those essential points and linking to related information found elsewhere on your site.
10. Alt text
If you want to use images as links instead of just text, go for it! Just remember to make the image informative for search engines, too: describe it with alt text, and even make the image’s filename match. And make your banner or logo link back to the homepage!
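A rough sketch of an image link done this way (the company name, filename, and paths are invented):

```html
<!-- The logo links home; the alt text describes the image for
     search engines and screen readers; the filename matches too -->
<a href="/">
  <img src="/images/acme-widgets-logo.png" alt="Acme Widgets logo">
</a>
```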
11. Don’t get too carried away
200 or 300 links on a page are too many. Though bots might be able to digest that many, it still dilutes the value passed through each link. Stick to a reasonable number in your navigation and footer, and make sure the on-page links make sense in context and aren’t overwhelming to readers or search engine bots.
Go forth, and happy linking!