Site ranking myths for 2012
What is a ranking myth and what isn’t? What should you do (or not do)?
SEO myths abound and multiply every year. Misinformation is spread to others, who then pass it on, and on it goes.
Newcomers read misinformation on blogs and forums and repeat it, often adding their own ideas along the way and creating further misinformation.
It’s difficult to say exactly what is valid and what is myth, as many people post SEO ideas based on what worked (or didn’t work) for them, and it is easy to assume that if your site’s ranking changed, it was because of an SEO change you made.
The truth is, there are many reasons why a rank may change, and often it has nothing to do with any changes you made to your site. Equally, you can change a number of SEO factors and see no change at all in your ranking.
I have always believed in ‘good practice’ and trying to keep to ‘accepted standards’ as closely as possible. I still use meta keywords and update (most) sites regularly, even though it is now accepted that these two elements do not feature highly in ranking factors.
Whether they feature or not, I believe they should still be included as they form part of good practice. Any dynamic site should be updated regularly anyway and there are still some search engines that use meta keywords in ranking formulas.
Is ranking the same as indexing?
What are indexing and ranking? ‘Indexing’ is the process of listing the pages of your website in a search engine’s databases; a page is ‘indexed’ once it has been listed there. Once indexed, pages can be found in an internet search, and the more effectively they have been optimised, the higher up in the SERPs (search engine results pages) they will appear.
Ranking refers to a totally different aspect but is often confused with indexing. ‘Rank’ is a rating applied to the indexed pages and is what determines which pages show up first in the SERPs.
If a page is found higher up in the SERPs than another, it is said to have a higher ranking. This is not strictly ‘ranking’ in a technical sense, but rather a word describing the level, or quality, of indexing.
Ranking should not be confused with PageRank (PR), which is, basically, a measure of Google’s trust in a webpage.
What is considered to be Ranking Myth?
Based on information gathered from specialist sites, SEO blogs, forums and Google itself, I list the following myths, adding my own views based on the results my own strategies have achieved.
I have to periodically re-submit my site to the search engines.
Nowadays, re-submission is totally unnecessary. Once a site is in a search engine, it’s in it for good (unless it gets banned for using dodgy techniques). There is no reason to keep submitting a site to a search engine. Of course, ‘site dynamics’ does come into it, but this is nothing to do with re-submission.
Having Meta tags will help my rankings.
Meta tags do not affect your rank. Long gone are the days when search engines trusted what webmasters’ meta tags said about their websites!
Trading links with any site which will link to mine is still effective.
Exchanging links with ‘anyone who will swap’ is a pointless exercise. Don’t associate with useless websites, and be wary of ‘why’ people are requesting link exchanges, as there are many scams on the go. I have a link exchange page but probably accept only around ten percent of requests, for one reason or another.
You must submit your URL to as many search engines as possible.
If this refers to physically submitting to thousands of individual sites, then I agree it is a myth. That used to be a necessity (in the ’90s) but is no longer needed. Many engines share submissions with associated engines nowadays, and once a site is live it will eventually be found by the crawling and indexing of all the search engines.
I only ever submit to around 100 or so sites nowadays and do this through a submission broker. It’s a cheap enough paid service and well worth the cost. Forget the ‘We’ll submit to 10,000 sites for £100’ offers; they’re not needed and a waste of money. The exception is ‘speed’: if you want to speed up the indexing process, then practically speaking, the more widely spread your links are, the better.
All you need is about £5 for 100 or so sites to get the initial indexing started and the internet process will take it from there. These 100 or so sites should also include Google, Yahoo and Bing because many other engines stock their databases from these engines.
You must have a high ‘keyword density’ to rank well.
This definitely no longer applies, and having too high a density can have the opposite effect. However, I believe it is still important to include your targeted keywords throughout any article, page or post, in the right places and at the currently accepted density.
Some search engines still apply a ranking factor to this element of optimisation; however, if you saturate your pages with keywords, it can have an adverse effect. Visitors want to read content, not SEO. Lose visitors and you’ll lose ranking!
The correct keyword density for your content depends on the targeted keyword. Some keywords, with variations, lend themselves nicely to a density of 2% to 3%, while others can only be used less than 1% of the time while still looking ‘natural’.
On-page keywords should definitely still be used, but not abused.
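As a rough illustration of the 2% to 3% figure mentioned above, keyword density can be checked with a short script. This is only a sketch under my own assumptions: the exact counting method (how phrases, plurals and variations are weighted) varies between tools, and the sample text is invented.

```python
import re

def keyword_density(text, keyword):
    """Return the percentage of words in `text` accounted for by `keyword`.

    Whole-word matches only; a multi-word keyword counts once per
    occurrence of the complete phrase.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    pattern = r"\b" + re.escape(keyword.lower()) + r"\b"
    occurrences = len(re.findall(pattern, text.lower()))
    keyword_words = len(keyword.split())
    return 100.0 * occurrences * keyword_words / len(words)

sample = ("Site ranking depends on many factors. Ranking is not the same "
          "as indexing, and ranking changes for many reasons.")
print(round(keyword_density(sample, "ranking"), 1))  # prints 15.8
```

Note that the sample deliberately overshoots: at nearly 16%, a real page written like that would read as keyword stuffing, which is exactly the ‘adverse effect’ described above.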
You must have a Google Sitemap.
It definitely helps to have a sitemap, but it’s not a necessity, and it doesn’t directly help in getting a site ranked. I have had sites rank without one, and if a site is ‘crawler-friendly’ you can make do without. However, for the few minutes it takes to build one, I always recommend using one.
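Building a basic sitemap really does take only a few minutes. A minimal sketch using Python’s standard library is below; the URLs are placeholders, and a real sitemap for Google follows the sitemaps.org protocol (which also allows optional elements such as last-modified dates that are omitted here).

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a minimal sitemaps.org-style XML document listing `urls`."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page  # the page's full address
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["http://example.com/", "http://example.com/contact"])
print(sitemap)
```

The resulting string can be saved as `sitemap.xml` in the site root and submitted through Google Webmaster Tools.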
You need to update your site frequently.
Again, this used to be a necessity but nowadays will not increase your ranking. Regular updates will increase the crawl rate, assuming feeds etc. are in place, but they won’t increase your rank.
PPC ads will hurt rankings.
Some people think PPC ads harm rank and some think the opposite. I have had sites carrying PPC ads that ranked as well as sites without them, so from a ranking point of view, I don’t think it matters whether you include them or not.
Pages or Posts must contain a certain number of words.
I remember when this used to be a major factor in the mid-’90s, but as engines have changed, it is no longer such a requirement. Web pages should read how they need to read. How can you make a ‘contact’ page stretch to 400 or 500 words? Littered with keywords, it would be classed as keyword stuffing. Web pages should be visitor-friendly as well as engine-friendly, and ultimately, more traffic is a major factor in ranking! You should therefore use as many (or as few) words as you need to say what you need to say and keep your visitors returning.
H tags must be used to get high rankings.
There is very little evidence to suggest that keywords in H tags actually affect rankings. When I’ve rushed a site together and uploaded it with little effort, I haven’t seen much difference in rank compared with sites I spent ages on, which, in my experience, tells me they are not too important for ranking.
However, because it is one of those ‘good practice’ elements, I always try to include them.
Words in your meta keyword tag have to be used on the page.
This is listed as a myth because many SEO specialist sites list it as such, but my opinion differs on this one. I am ‘big’ on analytics (I advise anyone with websites to be the same) and I have found that indexing and traffic increase when ‘relevant’ meta keywords appear on the page.
Meta keywords on the page don’t affect the rank directly, but as they still attract search results and traffic, rank improves.
The meta keyword tag was originally designed for keywords that were ‘not’ already on the page, so it stands to reason that ranking should not be affected by this element. However, some engines, including Yahoo!, still apply some weight to it, and for this reason alone, I always recommend inclusion at the correct rate.
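Checking whether your meta keywords actually appear in a page’s visible text, as discussed above, is easy to automate. The sketch below uses Python’s standard HTML parser with a deliberately simplified notion of ‘visible text’ (everything outside script and style tags); the sample page and keyword list are invented for illustration.

```python
from html.parser import HTMLParser

class PageScanner(HTMLParser):
    """Collect the meta keywords and visible body text of an HTML page."""
    def __init__(self):
        super().__init__()
        self.keywords = []
        self.text = []
        self._skip = 0  # nesting depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "keywords":
            self.keywords = [k.strip().lower()
                             for k in attrs.get("content", "").split(",")]
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.text.append(data.lower())

def keywords_on_page(html):
    """Map each declared meta keyword to whether it occurs in the body text."""
    scanner = PageScanner()
    scanner.feed(html)
    body = " ".join(scanner.text)
    return {k: k in body for k in scanner.keywords if k}

page = """<html><head>
<meta name="keywords" content="seo, ranking, unicorns">
</head><body><p>Notes on SEO and site ranking.</p></body></html>"""
print(keywords_on_page(page))
# prints {'seo': True, 'ranking': True, 'unicorns': False}
```

A keyword that comes back `False` is one the tag declares but the page never uses, which is worth reviewing if, like me, you find that ‘relevant’ on-page keywords bring extra traffic.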
You’ll get a Google slap if you ignore all Google’s guidelines.
Yes, you probably will (I got a slap this week); however, the only thing Google really opposes is black-hat techniques. ‘Most’ of the ranking guidelines set out by Google refer to good practice and common sense (although I’ve gotta say, all common sense goes straight out the window when you’re talking about Google AdWords, where they turn into a power-crazed despot!).
Same-colour text, cloaked redirects and other similar techniques will always attract a Google downgrade, but then again, they are not genuine techniques and they oppose Google’s policies.
Your site will be banned if you buy links.
While there is some truth in this (where Google is concerned), you have to go to some extreme lengths to get banned for it. The reason is that Google and the other search engines don’t want to count ‘paid’ links as votes for a page, because they are not real votes. If you do buy links, then I recommend restraint. The trick here is to buy few and often! :0)
Should you still bother with all the above?
Even though none of the above directly feature in site ranking and they are classed as myths, if you didn’t apply any of them to your site, do you think it would rank as quickly and effectively?
I believe not, and for this reason I always recommend including the appropriate amount of each element in all your sites. The above are considered myths by ranking experts, and while I agree that some no longer carry the weight they did in the ’90s, when I have literally ‘thrown’ a site up without bothering about these factors (as I occasionally do), I have found those sites take much longer to get indexed and to rank, and some never rank well enough.
A recent site I built for a colleague, Peter Whieldon, proved this. I set the site up, added a sitemap, used meta keywords and keyword-rich descriptions, submitted to the usual 100 or so engines through an indexing site and included a few initial posts which were appropriately keyword-optimised. I also used all the available H tags and every other on-page element available.
The site was indexed and on the first page of Google within eleven days for the primary keyword, proving that you don’t actually need to do too much to get to this stage (if you have a relatively easy keyword to optimise). However, due to other commitments, Peter has not put much work into the site since, and there has been little change in ranking, and especially indexing, for the other keywords and pages. Compared with a similar site started at the same time, which has had regular input and which does have better indexing, this suggests that all the above ‘should’ be used!
So what do you need to do to acquire a good ranking?
Google is spearheading the move towards internet sociality and quality on many levels, and has introduced many algorithms concerning these matters. Many of them focus on user enjoyment, again at many levels, ranging from ease of access right through to site authority and trustworthiness.
Google says that all elements concerning visitor access and enjoyment will feature in ranking factors for 2012. Algorithms covering page loading speed, site trustworthiness, feeds and social media, website design and navigation, and linking structures are already used in rank scoring.
What do the main ranking factors for 2012 include?
They will definitely include visitor-friendly elements. As Google states on its Philosophy page, ‘Focus on the user and all else will follow’.
Image search aptness, sitelinks, soft 404 pages (404s that show an alternative page), mobile-friendliness, appropriate domain extensions (for locale searches), language choices, security and spam all feature highly in the ranking equations for 2012.
Quality for a visitor is paramount and that’s nothing to do with ranking (and ultimately, everything to do with it).
What elements make a site visitor-friendly?
- The site is relevant to the terms being searched for
- The site is considered an authority about its topic
- The site contains quality, original, useful content
- The site has been around for a while
- The site contains lots of associated information
- The site and pages load quickly
- There are no broken links
- The page is not saturated with keywords
There are many more – the above are just the basics!
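The ‘no broken links’ point in the list above is the easiest to automate: collect every link on a page, then request each one and flag anything that doesn’t return a 200 response. A sketch of the collection step using Python’s standard library is below (the fetching step is omitted to keep the example self-contained, and the sample page is invented).

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = ('<p><a href="/contact">Contact</a> and '
        '<a href="http://example.com/">home</a>.</p>')
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # prints ['/contact', 'http://example.com/']
```

Each collected URL can then be checked (for example with `urllib.request`), with relative links first resolved against the page’s own address.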
What do the top search engines consider to be good and bad qualities?
Here are the guidelines from the top two search engines on what you should implement in order to create a ‘high quality’ site/page:
Google (Google’s guidelines page)
- Make pages for users, not for search engines. Don’t deceive your users, or present different content to search engines than you display to users.
- Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
- Don’t participate in link schemes designed to increase your site’s ranking or PageRank. In particular, avoid links to web spammers or “bad neighborhoods” on the web as your own ranking may be affected adversely by those links.
- Don’t use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our terms of service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.
Yahoo! (Yahoo!’s guidelines page)
- Original and unique content of genuine value
- Pages designed primarily for humans, with search engine considerations secondary
- Hyperlinks intended to help people find interesting, related content, when applicable
- Metadata (including title and description) that accurately describes the contents of a web page
- Good web design in general
And here is what the top two search engines say you shouldn’t do or class as poor quality:
- Avoid hidden text or hidden links
- Don’t employ cloaking or sneaky redirects
- Don’t send automated queries to Google
- Don’t load pages with irrelevant words
- Don’t create multiple pages, subdomains, or domains with substantially duplicate content
- Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content
- Pages that harm accuracy, diversity or relevance of search results
- Pages dedicated to directing the user to another page
- Pages that have substantially the same content as other pages
- Sites with numerous, unnecessary virtual hostnames
- Pages in great quantity, automatically generated or of little value
- Pages using methods to artificially inflate search engine ranking
- The use of text that is hidden from the user
- Pages that give the search engine different content than what the end-user sees
- Excessively cross-linking sites to inflate a site’s apparent popularity
- Pages built primarily for the search engines
- Misuse of competitor names
- Multiple sites offering the same content
- Pages that use excessive pop-ups, interfering with user navigation
- Pages that seem deceptive, fraudulent or provide a poor user experience