Okay, let me not even dare to explain what “better indexing” means in general. From an SEO perspective, better indexing to me is getting all the important pages of your site indexed with the right importance and “weight” with the search engines. So, if I have ten pages on my website, I’d call it a success when all ten pages carry enough importance and PageRank that searching for the primary keyword of each page results in a top-ten listing. Makes sense, right?
What goes wrong with the “normal” SEO efforts?
As a matter of fact, many of us work towards the “improvement” of our site’s metrics like PageRank and end up doing just that. All the link juice gathered, all the submissions made, all the onsite optimization done – all of it goes to the homepage, and we do pretty much nothing for the internal pages except linking to them. As a result, not every page on the site gets the chance to show its head on the search engines’ front page.
What happens when your internal pages don’t get the right visibility?
Undisputedly, the major part of search engine traffic comes from the internal pages for an average website with a lot of content. In order to enjoy the traffic from long-tail keywords (which form more than 80% of the entire traffic chunk), you have to make sure that your internal pages get the right importance on the search engines.
Problems that may cause unequal visibility on your site.
- Poor link structure
That’s a general term, I’m afraid, and it means different things to different people. You have to analyze your site’s requirements and find out what works best for you, but generally it’s a good idea to link more frequently to the more important pages while maintaining a single frame of links connecting all the pages equally. If there are a lot of pages, grouping them under categories and linking to each category might be a good idea, yet again it cannot work for everyone.
But problems arise when the link structure is poor and not every “important” page is connected well and gets the right visibility.
- Use of Paginators
I’m all against paginators and would only recommend them when absolutely necessary. As I see it, they create a very bloated link structure that can reduce and cut off the importance given to every connected page. Pagination follows more of a lateral linking structure that runs long, and I’ve seen that it doesn’t always work with search engines, though it may be a good tool for a friendly navigation structure.
- Extensive use of Categories
Categories, as we’ve discussed here and here, sometimes pose a threat of creating duplicate content, but sometimes they are the best strategy to group together a lot of pages. When grouping pages, extreme care has to be taken to handpick them and manually ensure equal visibility for each.
- Extensive use of Tags
Another way to get more duplicate content on your blog is to use more tags. The same posts appear under different tags, more than once (sometimes several times) – not cool!
Most of the time it is not deliberate, but improper use of directives in the robots.txt file will create undetected accidents and indexing issues that could easily be avoided – see the sketch below.
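To make that concrete, here is a minimal robots.txt sketch – the paths are hypothetical, so adjust them to your own site’s structure – that keeps crawlers away from the duplicate-content tag and pagination archives while leaving the real content open:

```
# Hypothetical paths - replace them with your own site's structure.
User-agent: *
# Keep duplicate-content archives (tag pages, paginated listings) out.
Disallow: /tag/
Disallow: /page/
# Point crawlers at the sitemap (assuming it lives at this URL).
Sitemap: https://www.example.com/sitemap.xml
```

One wrong Disallow line here can silently cut off an entire section of your site, which is exactly the kind of undetected accident I mean.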
How to get over the poor indexing problems?
- Get quality, high-frequency backlinks to internal pages – Always preferred.
- Make good use of sitemaps, with the right priority values, excluding the unimportant parts of your website (see the sitemap sketch after this list).
- Content syndication on networks – Works every time.
- Create a carefully crafted page/file structure – Difficult to create and maintain, but works.
- Put the nofollow meta tags to full use – Be careful when you do it (see the sketch after this list).
- Use plugins for WP blogs – Easier stuff; do it if you really know how to.
- Use social media to generate traffic – Put all your best posts on social media and gather some traffic as well as links.
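On the sitemap point, here is a minimal XML sitemap sketch. The URLs and priority values are hypothetical; the idea is simply that an important internal page gets a higher priority hint than a minor one, and truly unimportant pages are left out altogether:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- An important internal page gets a high priority hint. -->
  <url>
    <loc>https://www.example.com/important-article/</loc>
    <priority>0.9</priority>
  </url>
  <!-- A minor page gets a lower hint; unimportant pages
       (tag archives, login pages) are simply not listed. -->
  <url>
    <loc>https://www.example.com/minor-update/</loc>
    <priority>0.3</priority>
  </url>
</urlset>
```

Keep in mind that priority is only a hint to the search engines, not a command.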
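And on the nofollow point, a sketch of the two common forms, with hypothetical URLs. The first is a page-level robots meta tag; the second is a link-level rel attribute that stops link juice flowing through a single unimportant link:

```
<!-- Page-level: tells crawlers not to follow any links on this page. -->
<meta name="robots" content="nofollow">

<!-- Link-level: stops link juice flowing through this one link
     (hypothetical URL - typically a login or other unimportant page). -->
<a href="https://www.example.com/login/" rel="nofollow">Log in</a>
```

Be careful here: a nofollow on the wrong link can cut off an important internal page, which is exactly the problem we are trying to fix.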