- Get your site content written by a professional content writer with extensive experience in SEO article writing.
- Avoid duplicate content and duplicate URLs.
- The URLs of your pages should look friendly to the search engines. For example, www.primusinfotech.com/Digital-Marketing-Services-India.html is a nice URL, while www.primusinfotech.com/?p=128 is an ugly one.
- All of your site pages, including your front page, should have a valid meta description and a few meta keywords.
- Meta Description should be as short as possible, but should contain your main target keywords and should be in the form of a sentence.
- Avoid punctuation marks in meta description.
- Avoid placing long keywords in Meta keywords. For example, JOOMLA SERVICES is perfect, but PROFESSIONAL JOOMLA SERVICES is not a good keyword. Search engines hate long keywords.
- The page title of your webpage is another important factor. Always keep it short and to the point, and avoid using signs and symbols inside it.
- You should have a valid sitemap that is readable by the search engine crawlers, because that is where the search engines pick up your website's pages.
- Robots.txt is also essential. Disallow private site folders and unnecessary page links in it, so that clutter does not get indexed by the search engines.
- Submit your website to Google from here, www.google.com/addurl
- Submit and verify your website at Google Webmaster Tools.
- Submit your site's sitemap at Google Webmaster Tools and check back every 24 hours to see whether Google has accepted the sitemap and the links of your website.
- If you are in the US, then also submit your website to Bing and Yahoo and verify it with your website.
- Use Twitter, Facebook, Digg, StumbleUpon, Delicious, etc to get your website noticed by the world.
- Have an official Facebook Page and an official Twitter page as well. Share your website and your content there to drive more traffic. Remember, more traffic means more conversions.
- Validate your site XHTML and CSS with the w3c validator.
- Test your website with YSlow and Google Page Speed and try to apply all the suggestions they give. Aim for as high a YSlow grade and as high a Page Speed score as possible.
- Keep the site fresh with regular new content, so search engines treat it as a living website and not as a dead one.
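A minimal robots.txt along the lines of the tip above might look like the following (the disallowed folder names are hypothetical examples, not from this article):

```
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/
Sitemap: http://www.primusinfotech.com/sitemap.xml
```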
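The "nice URL" tip above can be sketched in code. Here is a hypothetical helper (the function name and title are my own examples) that turns a page title into a clean, keyword-bearing URL slug:

```python
import re

def slugify(title):
    """Turn a page title into a clean, keyword-bearing URL slug."""
    # Keep letters, digits and spaces, then join the words with hyphens.
    cleaned = re.sub(r"[^A-Za-z0-9 ]+", "", title)
    return "-".join(cleaned.split())

print(slugify("Digital Marketing Services India") + ".html")
# → Digital-Marketing-Services-India.html
```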
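The meta description rules above (keep it short, include the target keywords, avoid punctuation) can be sketched as a small checker. The 160-character limit is my own assumption (a common rule of thumb), not a figure from this article, and I allow a trailing full stop since the description should read as a sentence:

```python
def check_meta_description(description, target_keyword):
    """Flag common meta description problems per the guidelines above."""
    problems = []
    if len(description) > 160:  # "as short as possible" -- assumed limit
        problems.append("too long")
    if target_keyword.lower() not in description.lower():
        problems.append("missing target keyword")
    # Punctuation marks (other than a trailing full stop) are discouraged.
    for ch in ",;:!?":
        if ch in description:
            problems.append("contains punctuation")
            break
    return problems

print(check_meta_description(
    "Professional Joomla services and digital marketing from Primus Infotech",
    "Joomla services"))
# → []
```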
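A valid, crawler-readable sitemap like the one recommended above can be generated with a few lines of code; this sketch uses Python's standard XML library, and the page URLs are examples:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap that search engine crawlers can read."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap([
    "http://www.primusinfotech.com/",
    "http://www.primusinfotech.com/Digital-Marketing-Services-India.html",
]))
```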
#1: Doorway Pages (aka Gateway Pages, Leader Pages, etc.)
Definition: Create multiple web pages that are devoid of useful content but heavily optimized for search engine rankings. Each page is optimized for a variation of a keyword phrase or for completely different keyword targets. The essence of this concept was to fool the search engines into thinking that these pages were highly relevant and provide top rankings for them under their targeted phrase. When a surfer stumbled on the page they were often shown a “Click Here to Visit Joe’s Pizza” link that the surfer had to click on to actually arrive at the legitimate website.
Once among the most popular methods of attaining multiple search engine placements, doorway pages were widely used until 2000 by many webmasters. Since then, Doorway pages have become the most obvious form of Spam that a search engine can find and the repercussions are dire if such a tactic is employed. Unfortunately, I have seen many sites still employing this tactic and occasionally we even get calls from potential customers wondering why they have dropped off the search engines for using this technique. We also continue to receive requests to perform this tactic on a daily basis.
#2: Invisible Text
Definition: Invisible text is implemented in a variety of ways in an effort to increase the frequency of keywords in the body text of a web page. Some of the implementation methods are: making text the same color as the background of the web page, hiding text behind layers, placing text at the very bottom of over-sized pages, etc.
This tactic is perilously old and obvious to search engine spiders. It constantly amazes me when a web site utilizes these methods for placement. Invariably, placements are the last thing that a webmaster will get when using this tactic. Invisible text had its heyday from 1995 to 1999. This is not to say that invisible text didn't work after 1999, but the majority of web sites were not using it by this time, as the search engines began implementing automated methods of detection and penalization.
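As a rough sketch of how easily the same-color trick can be detected automatically, here is a naive check for inline styles whose text and background colours match. It only covers hex colours in inline style attributes, which is a simplifying assumption on my part:

```python
import re

def find_invisible_text(html):
    """Naively flag elements styled the same colour as their background,
    one of the invisible-text tricks described above."""
    flagged = []
    # Match inline styles such as style="color:#fff;background-color:#fff"
    pattern = re.compile(
        r'style="[^"]*color:\s*(#[0-9a-fA-F]{3,6})[^"]*'
        r'background(?:-color)?:\s*(#[0-9a-fA-F]{3,6})', re.I)
    for match in pattern.finditer(html):
        if match.group(1).lower() == match.group(2).lower():
            flagged.append(match.group(0))
    return flagged

html = '<p style="color:#fff;background-color:#fff">free pizza free pizza</p>'
print(len(find_invisible_text(html)))  # → 1
```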
#3: Content Misrepresentation
Definition: Misleading search engines into believing your webpage is about topic 'A' when it is in fact about 'B'. This tactic was used primarily for the promotion of adult, gambling, and other extremely competitive search markets. Unfortunately this tactic is still in use; you will likely find one or two examples every time you search! The fact is that this tactic is the simplest for a search engine to identify, and the result will be swift and complete: banishment from the search engine index indefinitely. The worst offense in the realm of the search engines is to try to fool them.
#4: Misleading Redirects
Definition: Redirects have some innocent uses (practical, legal, etc.), but they are also used nefariously to mislead search engines by making them believe that the page they have indexed is about 'A'. When a surfer visits the page, however, they are redirected to an entirely different site about 'B'.
In most cases search engines have advanced enough to see this technique a mile away. In fact, they usually ignore any page with a redirect (assuming, correctly, that the content is useless) while spidering the redirect destination. Redirects, unless blatantly Spam-related, do not directly result in intentional ranking penalties; however, they have no positive effect either.
#5: Heading Tag Duplication
Definition: Heading Tags, by definition, were created to highlight page headings in order of importance; hence the Heading Tags that are available: H1, H2, H3, etc. This duplication technique involves implementing more than one H1 tag in a webpage in order to enhance a particular keyword or phrase. This tactic is still very prevalent and likely still works on some search engines; however, none of the major search engines will respond well to this technique, as it has been identified as a common manipulation.
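Detecting this duplication is trivial; here is a minimal sketch using Python's standard HTML parser to count H1 tags, where more than one suggests heading-tag duplication:

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Count H1 tags; more than one suggests heading-tag duplication."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

def count_h1(html):
    counter = H1Counter()
    counter.feed(html)
    return counter.h1_count

page = "<h1>Pizza</h1><p>...</p><h1>Cheap Pizza</h1><h1>Best Pizza</h1>"
print(count_h1(page))  # → 3
```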
#6: Alt Tag Stuffing
Definition: Alt Tag stuffing is the act of adding unnecessary or repetitive keywords into the Alt Tag (or alternative tag – shown by the words that appear when you hover over an image with your mouse pointer). The Alt Tag is meant to be a textual description of the image it is attached to. There is nothing wrong with tailoring the Alt tag to meet your keyword goals IF the tag is still understandable and if the change still appropriately describes the image. The offense occurs when an Alt tag has obvious keyword repetition/filler which a search engine can key in on as Spam.
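A rough heuristic for spotting a stuffed Alt tag is heavy word repetition; the threshold below is my own assumption for illustration, not a known search engine rule:

```python
def alt_looks_stuffed(alt_text, max_repeats=2):
    """Flag an Alt attribute whose words repeat suspiciously often.
    The max_repeats threshold is an assumed heuristic."""
    words = alt_text.lower().split()
    counts = {}
    for word in words:
        counts[word] = counts.get(word, 0) + 1
    return any(n > max_repeats for n in counts.values())

print(alt_looks_stuffed("pizza cheap pizza best pizza pizza deals"))  # → True
print(alt_looks_stuffed("photo of a margherita pizza"))               # → False
```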
#7: Comment Tag Stuffing
Definition: Comment Tags are used to include useful design comments in the background source code (html) when creating a webpage. These are supposed to be used only for adding technical instructions or reminders; however, in times past these tags were used to artificially increase the keyword count for targeted phrases.
At one time there was some argument that this technique worked, but it has always been a "Black Hat" SEO technique which, even then, could result in placement penalties. Nowadays this technique will not help an SEO campaign; if anything, it will be ignored or produce a negative result.
#8: Over Reliance on Meta Tags
Definition: Meta Tags are a broad term for the descriptive tags that appear in the <Head></Head> of most webpages and are used to provide search engines with a concept of the page topic. The most common tags are the description and keyword tags. At one time, extinct search engines such as Infoseek relied a great deal on Meta Tags, and many took advantage of this to manipulate rankings with relative ease. In today's far more advanced climate the search engines place cautious weight on Meta Tags, and when considering rankings Metas play only a fractional role. Some webmasters still consider Meta Tags the 'be-all and end-all' of ranking producers and forget to optimize the rest of their webpage for the search engines. With this line of thinking they miss that the search engines place far more importance on the body text (or visible text) of the webpage. This is a critical error that will ultimately lead to low or insignificant rankings.
Note: An extremely common example of Meta Tag over-reliance is web sites that have been designed totally graphically and are devoid (or nearly so) of html text that a search engine can read. A webpage such as this will have no body text to index and may provide only a small amount of relevance, which ultimately leads to poor rankings. Over-reliance on Meta Tags does not produce intentional search engine penalties; however, the simple act of ignoring other ranking principles often means a lower ranking.
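A quick way to see whether a page is devoid of indexable text in this sense is to strip the markup and measure what is left; a minimal sketch:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible body text a search engine would index."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data.strip())

def visible_text_length(html):
    extractor = TextExtractor()
    extractor.feed(html)
    return len(" ".join(c for c in extractor.chunks if c))

# A mostly-graphical page: metas present, but almost no indexable body text.
page = ('<head><meta name="description" content="Great pizza deals"></head>'
        '<body><img src="menu.jpg" alt="menu"></body>')
print(visible_text_length(page))  # → 0
```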
#9: Duplicate Content
Definition: This tactic is blatant Spam that is very common today. Essentially, the webmaster will create a web site and then create duplicates of each page, optimizing them differently in order to obtain varying placements. By doing this you saturate the search engine databases with content that essentially eats valuable bandwidth and hard drive space.
Duplicate content is a dangerous game often played by full-time marketers accustomed to trying to attain placements in aggressive markets. Avoid this tactic like the plague unless you are willing to sustain serious ranking damages if you get caught – which you likely will.
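Search engines can catch such duplicates by normalising page text and hashing it, so that near-identical pages collide to the same fingerprint. This sketch is a simplified illustration of the idea, not any engine's actual algorithm:

```python
import hashlib
import re

def content_fingerprint(html_text):
    """Fingerprint a page's text so duplicates collide to the same hash."""
    # Normalise: lowercase, drop punctuation, collapse whitespace.
    text = re.sub(r"[^a-z0-9 ]+", " ", html_text.lower())
    text = " ".join(text.split())
    return hashlib.sha1(text.encode()).hexdigest()

a = "Joe Pizza - the best pizza in town!"
b = "joe pizza   the BEST pizza in town"
print(content_fingerprint(a) == content_fingerprint(b))  # → True
```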
#10: Automatic Submission and Page Creation
– Automatic Submission is the use of automated software to submit a website to the search engines automatically and often repeatedly.
– Automatic Page Creation is using software to create pages ‘on the fly’ using predefined content (body text, keywords, images etc) to create “optimized” webpages to target specific keyword rankings on the search engines.
At StepForth the word 'automated' is an abomination when used in reference to SEO. The fact is that automated SEO campaigns are not as effective as manual (by hand) optimization techniques, AND such techniques often require the use of doorway pages to lead search engine users to polished marketing pages at the true destination. This makes it a double offense, since it employs the banned doorway page technique. My strong prejudices aside, let's take a short, logical look at both tactics noted here:
a) Automatic Submission
Search engines make the majority of their profit from surfers like you viewing their advertising. Do you think that allowing automated submission tools to submit a web site (which bypasses SE advertisements) is in the search engines' best interest? No; in fact, the submission companies have had to upgrade their software repeatedly to try to subvert the search engines' latest efforts to stop their programs. There are also concerns about bandwidth, because automated tools have been known to repeatedly submit sites and sometimes each individual page within a site.
All in all, this leaves the submitter in an unstable position where they may or may not have their submission ignored. This is not even considering the fact that automated tools claim to submit a website once a day, week, or month! The cardinal rule of search engines: submit ONCE, and it may take a while, but the site will get spidered at some point (up to 2 or 3 months later – max). If within a few months a site is not listed, then resubmit. If a search engine is submitted to too often, that is Spam, and frankly the website being submitted will not fare well. As for the major engines like Google… be patient and definitely don't submit more than once if you can help it.
b) Automatic Page Creation
If a page is automatically created, will it have the kind of quality content that search engines require for their index? Also, if the page is automatically created, will it not be using repetitive content? There may be a few instances where there are variable answers to these questions; however, I imagine the answer will be 'No' 99% of the time, which instantly illustrates a poor and dangerous search engine optimization technique.
The PageRank formula these symbols belong to is:

PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

where:

- PR(A) is the PageRank of page A,
- PR(Ti) is the PageRank of pages Ti which link to page A,
- C(Ti) is the number of outbound links on page Ti, and
- d is a damping factor which can be set between 0 and 1.
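With PR(Ti), C(Ti), and d defined as above, PageRank can be computed iteratively; here is a minimal sketch on a hypothetical three-page site, using the commonly cited damping factor d = 0.85 as an assumption:

```python
def pagerank(links, d=0.85, iterations=50):
    """Iteratively apply PR(A) = (1 - d) + d * sum(PR(Ti) / C(Ti))."""
    pages = list(links)
    pr = {page: 1.0 for page in pages}  # initial guess
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # Pages Ti that link to this page.
            inbound = [p for p in pages if page in links[p]]
            new_pr[page] = (1 - d) + d * sum(
                pr[p] / len(links[p]) for p in inbound)
        pr = new_pr
    return pr

# Hypothetical three-page site: A links to B and C; B and C link back to A.
links = {"A": ["B", "C"], "B": ["A"], "C": ["A"]}
ranks = pagerank(links)
print(ranks["A"] > ranks["B"])  # → True: A has more inbound PageRank
```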