Even if you have a kickass site, sometimes you have to wait ages to get pages crawled and indexed on Google. Why does this happen? I’ve heard many people complain that after building a great site and doing the standard chores, waiting so long for results doesn’t make any sense. Certain things just don’t happen as planned: some pages on a site don’t get indexed, and even the ones that do don’t get the right visibility in the search engines. Strange.
I can attribute most of these problems to a handful of SEO factors. Let me try and explain them below.
1. Set a sitemap, with the right priorities
Setting up a proper sitemap often solves most of these problems. But don’t take it lightly just because there are automated plugins to do the job. More often than not, those plugins need some tweaking to ensure they have the right settings for your site. I have written several posts about sitemaps, like this one, check it out.
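For reference, a bare-bones sitemap with priorities looks something like this (the URLs, frequencies, and priority values below are just placeholders; tune them to your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Important, frequently updated page: high priority -->
  <url>
    <loc>http://example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- A regular post: lower priority -->
  <url>
    <loc>http://example.com/some-post/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.6</priority>
  </url>
</urlset>
```

Keep in mind that priority is only relative within your own site, and search engines treat it as a hint, not a command.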
2. Publish articles regularly in a predictable fashion
This is not a way to “control” how often the bots visit you. Google has its own algorithm to decide when it should visit you, but publishing articles on a regular schedule does give Google clues about your posting frequency. There’s nothing wrong with publishing articles at random, but I’d suggest you keep it fairly predictable to make the process easy.
3. Link well contextually and often between posts
Cross-linking between pages contextually is a great way to make sure the bots visit all those pages. Contextual links probably carry more weight than any other link on the site, so make use of them.
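A contextual link is just an ordinary anchor placed inside the body copy, pointing at a related post (the domain and slug below are placeholders):

```html
<p>If your sitemap isn't set up right, pages can go unnoticed; see
<a href="http://example.com/sitemap-priorities/">how to set sitemap priorities</a>
for the details.</p>
```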
4. Keep the directory structure simple
If you have a self-designed website, keep the directory structure simple and not too deep. Having more directories to crawl makes the job harder for the bots. Keep it simple.
5. Block the unnecessary pages with proper SEO techniques
Along with making sure all the important pages are crawled, also make sure the ones you don’t want showing up in the search results are blocked. (Like the TOS page.) You can use the meta noindex tag or the robots.txt exclusion protocol to get this done.
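Both techniques are a one-liner each; here is a minimal sketch, assuming a /tos/ page you want kept out of the results:

```
# robots.txt — tells compliant bots not to crawl anything under /tos/
User-agent: *
Disallow: /tos/
```

```html
<!-- In the <head> of the TOS page: the page is crawled, but kept out of the index -->
<meta name="robots" content="noindex, follow">
```

One caveat: if a page is blocked in robots.txt, the bot never fetches it and so never sees the noindex tag, so pick one approach per page rather than stacking both.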
6. Make the navigation bar simple and accessible
I often find websites with complex, fancy navigation bars annoying. It doesn’t have to be fancy to impress your visitors, right? You can use classy CSS styling to get elegant looks and still impress them. The problem with fancy navigation bars is that they often do more harm than good, like preventing proper crawling of the pages they link to. There are various techniques you can use to make the navigation bar attractive and still allow smooth crawling, so employ them.
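The safe pattern is plain HTML links dressed up with CSS, so the menu looks polished but the bots still see ordinary anchors (the class names and paths below are made up for illustration):

```html
<nav>
  <ul class="main-nav">
    <li><a href="/">Home</a></li>
    <li><a href="/category/seo/">SEO</a></li>
    <li><a href="/about/">About</a></li>
  </ul>
</nav>
<style>
  /* Style the plain links instead of replacing them with scripts or images */
  .main-nav li { display: inline-block; margin-right: 1em; }
  .main-nav a { text-decoration: none; color: #333; }
</style>
```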
7. Practice deep linking
If you have a big, deep directory structure (you shouldn’t have a deep structure in the first place), always practice deep linking. Don’t miss out on linking to any directory or page. Use a clever linking strategy to link all the directories, categories, and pages to each other. If possible, link to all the main category indexes right from the homepage.
8. Make use of proper anchor texts and internal text links
Linking to pages only through images and other elements may result in poor crawling of those pages. Use proper anchor texts to share the contextual signal among your pages and directories.
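To make this concrete, compare an image-only link with a descriptive text link (the URLs and filenames are placeholders); if an image link is unavoidable, at least give the image alt text:

```html
<!-- Weak: nothing tells the bot what the target page is about -->
<a href="/keyword-research/"><img src="/img/arrow.png"></a>

<!-- Better: descriptive anchor text carries the context -->
<a href="/keyword-research/">keyword research guide</a>

<!-- If you must link with an image, the alt text stands in for the anchor text -->
<a href="/keyword-research/"><img src="/img/banner.png" alt="keyword research guide"></a>
```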
9. Keep the URLs simple and easy to remember
Confusing URLs are good for neither Google bots nor visitors. Google bots love simple, meaningful URLs, and visitors love URLs with recall value that are easy to remember. If you don’t want to screw things up, keep the URLs simple and easy to remember (for both bots and humans). P.S – Google does not have any problems with seemingly confusing or meaningless URLs; it can still make out the content from the page. Having meaningful URLs, however, is a definite plus.
10. Use the homepage very well
The homepage, as you know, is likely to gather the most link juice and importance of any page on the site. (This is just a general observation; if an inner page gathers more links, it could end up with more authority and link juice than the homepage.) Use the homepage well and add links to the inner pages from it, so they share the authority.
Essentially, factors like how fast, how often, and how deep to index are decided by the search engines. But to a certain extent you can influence their crawling speed and frequency. The points above are good pointers for this; I hope they helped you.