Top 20 SEO Mistakes to Avoid

There are many mistakes made by webmasters while optimizing their sites for search engines. Most of them are very simple SEO mistakes, and the site can perform really well if they are fixed at the earliest.

Here I would like to list the top 20 SEO mistakes that should be avoided for better website performance and search engine rankings. This list might miss some other points, and I request everyone to comment back if they find any missing points.
[Read more…]

10 Tips for Making SEO Friendly Content

SEO friendly content is the most influential ingredient of your website when it comes to optimization. There is no point in having a website with a large chunk of content that is not SEO friendly; it is always better to have SEO friendly content on your website for better search engine rankings.

SEO friendly content does not mean that you need to tamper with the content in such a way that it is good for the search engines only. You cannot ignore the user friendliness of the content while making it more SEO friendly. The best way to create SEO friendly content is to keep in mind how search engines view your content, without compromising on user friendliness. The content should always be for the readers and not only for the search engines, but you can follow certain guidelines that help make user friendly content more search engine friendly. Here I would like to explain a few points that will help you make your content more SEO friendly.
[Read more…]

Tips for a Healthy SEO Title Tag

The title tag is the most important ingredient of a website and plays a major role in determining its search engine rankings. So while optimizing a website, make sure that you have created an SEO friendly title tag for better ranking. Many webmasters fail to recognize the importance of the title tag and end up with lower rankings in SERPs. In order to gain better rankings, it is always good to optimize your title tag with your relevant keywords.

Often sites do not perform well even after optimizing the title tag, or they perform initially but underperform at a later stage. This is due to many factors, the major one being improper optimization of title tags. Many webmasters end up using wrong and unethical methods for optimizing the title tags of their websites. As a result, after the initial bubble, they are seen nowhere in the SERPs.
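To give an idea, a clean, keyword-focused title tag could look something like the one below (the keyword and brand name here are made up purely for illustration).

<title>Wooden Garden Chairs - Buy Online | ExampleFurniture</title>

Keep it short (roughly 60-70 characters), place the main keyword towards the beginning, and resist the temptation to stuff every keyword you have into it.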

Here I would like to mention some of the basic things to take care of while optimizing your title tags, which are listed below. [Read more…]

Solving Duplicate Content Issues with Search Engines

Duplicate content has always been an obstacle for webmasters doing search engine optimization. Many webmasters have had their sites penalized by search engines for duplicate content on their websites. The main case of duplicate content arises when multiple URLs on the same domain point to the same content. This issue mainly occurs in dynamic and CMS driven sites, and webmasters really live with the fear of being penalized by the search engines. Consider pages in a CMS driven website such as http://www.yoursite.com/products.php?item=woodchairs&category=chairs and http://www.yoursite.com/products.php?item=woodchairs&trackingid=3254&sessionid=7184, which both point to the same content at http://www.yoursite.com/products.php?item=woodchairs
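One common way to handle this, assuming your CMS lets you edit the page head, is the rel="canonical" link element, which tells the search engines which URL is the preferred version of the page:

<link rel="canonical" href="http://www.yoursite.com/products.php?item=woodchairs" />

This line goes in the <head> section of every URL variant of that product page, so the tracking and session URLs all point the search engines back to one preferred address.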

[Read more…]

Google Not Ranking on Top for “Search Engine” in Google

Another interesting thing that I noticed in Google today was that Google is not even ranked on top for the keywords “search engine” and “search engines”. This is interesting, considering that Google is the king among search engines, yet it does not even have the top ranking for that keyword in its own search engine. It is high time that Google took this into account and at least placed itself in the top results for the above search terms. I am not sure why this is happening; the sites which show on top for those keywords are Dogpile, AltaVista, Ask, etc.


Control the Search Engine Spiders with Robots.txt

Search engine spiders are simple programs which crawl the content of your website and help in indexing it. The spiders and their activities play a major role in determining your search engine rankings. The main tool for controlling search engine spider activity on your site is the robots.txt file. This is a simple .txt file which contains certain directives and just needs to be uploaded to the root of your website.

With the robots.txt file you will find it easier to direct the spiders to the most important pages of your website, so that you can improve your rankings. You can also prevent the search engine spiders from crawling unwanted folders or less important files. For example, a website may contain less important pages such as the privacy policy, terms and conditions, about us, etc. These pages would not help improve your search engine ranking, and there is no use in the spiders crawling them. In this case you can add directives to the robots.txt file to prevent the spiders from crawling them.

You can control the search engine spiders with a properly written robots.txt file, uploaded to the root of your website. In case you have subdomains for your website, you need to create a separate robots.txt for each of them. It is also better to have separate robots.txt files for secure (https) and non-secure (http) web pages.

To create a robots.txt file you just need to save your text file as robots.txt. The most basic robots.txt, which allows all the spiders to crawl everything, is given below.

User-agent: *
Disallow:

The User-agent line names the search engine spider that the rules apply to. The symbol * means the rules apply to all search engine spiders. If you want to address only particular spiders such as Googlebot (Google), Slurp (Yahoo) or msnbot (Microsoft), you can mention those names instead of the symbol *. If you want to keep the spiders out of a folder named “personal” on your website, you can rewrite the code as below.

User-agent: *
Disallow: /personal/

The “Disallow” directive specifies the folder or page on your website that you don’t want the spiders to crawl. If you want to prevent spiders from crawling more than one folder, you have to add multiple Disallow lines to the robots.txt file. For example, if apart from the “personal” folder you also want to keep the spiders out of folders such as “archive”, “temp” and “clients”, you can rewrite the code as below.

User-agent: *
Disallow: /personal/
Disallow: /archive/
Disallow: /temp/
Disallow: /clients/

In case you want all the spiders to ignore the above mentioned folders, but want the Google spider to still crawl your folder named “personal”, the code can be rewritten as follows.

User-agent: *
Disallow: /personal/
Disallow: /archive/
Disallow: /temp/
Disallow: /clients/

User-agent: googlebot
Allow: /personal/
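One more useful line, assuming you have an XML sitemap uploaded to the root of your site, is the Sitemap directive, which tells the spiders where to find your sitemap:

Sitemap: http://www.yoursite.com/sitemap.xml

This line can be placed anywhere in the robots.txt file, and the URL should be the full address of your sitemap file.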

These are just some of the simple robots.txt directives that I would like to share with you. You can learn more about the robots.txt file and its directives from here. Hope it will be useful for all the readers.

Why My Website is Not Ranking Higher in Search Engines?

Recently I met a client for an SEO consultation who was already doing SEO for his site with another vendor. Since we knew each other personally, he asked me for a free SEO consultation and I obliged. His main complaint was that despite doing various off-page optimization he was not ranking for his major keywords, and he was much disappointed with his vendor’s performance. I went through the site, and after a day I found the reasons why the site was not ranking for the major keywords. These reasons apply to the majority of website owners, and as a result they lag behind in the SERPs and miss out on important traffic to the site. Some of the reasons I found, which may explain the non-performance of the site, are given below.

1) The site was not properly optimized for the major keywords
The major issue with its on-page optimization was that none of the pages were optimized for the major keywords. It is always advisable to optimize one of your pages for a particular keyword, and it is better not to optimize any other page for the same keyword. This helps that page get listed on the search engine results page for that particular keyword.

2) No Sitemap
The site had neither an XML sitemap nor a regular sitemap, which made it difficult for the search engine spiders to crawl all the pages of the website. It is always advisable to upload an updated XML sitemap to the root of the website, and also to include a link to the regular sitemap on all pages (preferably in the footer). The sitemaps should be updated whenever you make changes to the website structure.
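For reference, a minimal XML sitemap is just a list of your page URLs in a standard format (the URLs and date below are made up purely for illustration):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
    <lastmod>2009-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.yoursite.com/products.html</loc>
  </url>
</urlset>

Save it as sitemap.xml and upload it to the root of the website.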

3) Bad Navigational Structure
The website had a poor navigational structure, with unwanted functions, JavaScript, etc. in inappropriate places, which made it difficult for the search engine spiders to crawl through the pages. It is better to make your site navigation as simple as possible, to make it spider friendly.
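The most spider friendly navigation is plain HTML text links, for example (the page names here are made up for illustration):

<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/products.html">Products</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>

Spiders can follow these links easily, whereas links buried inside JavaScript menus or Flash may not be crawled at all.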

4) Poorly Written Robots.txt File
The site had a poorly written robots.txt file, which made crawling by the search engine spiders even more miserable. Make sure that your site contains a well written robots.txt file which allows the major search engine spiders to crawl easily, while blocking private pages from being crawled.

5) Less Content
The site had very little content on its keywords, which kept the major keywords from appearing in the content. Good, unique content based on the major keywords is essential for getting better SERPs and forms the backbone of the website.

6) Weak Backlinks
The site had weak backlinks, which dealt the search engine ranking a big blow. Even though the site had a few backlinks, they were of low quality, i.e. links from poor, irrelevant and spammy sites.

After the client paid attention to these factors, the site showed tremendous improvement in the SERPs. This applies to all webmasters, so take care that these points are not missed while doing SEO for your website.

SEO Client Issues and Solutions

SEO is one of the most difficult professions as far as client handling is concerned. Many clients are not aware of what goes on behind SEO work and what is needed for better search engine results. So the sales team or business development team has a tough time tackling some really difficult queries and demands raised by clients. Even SEO professionals have to go through these tough situations. The pressure mounts on the sales and SEO teams because of questions raised by clients due to their unawareness of SEO.

A large number of clients think that SEO is a one-time technology job, similar to software development. They think that once SEO is done, their site is guaranteed top search engine results. They are not aware that an SEO project needs patience and ongoing maintenance work, and that nothing is guaranteed. No company in the world can guarantee search engine rankings in any search engine. The only thing an SEO can do is make the website more SEO friendly, do good keyword research, implement the keywords on the site and then wait for the results. One cannot be sure whether the site is going to get top search engine rankings. If the website doesn’t perform well in the search engines, the SEO work has to be redone. So it goes on and on, and needs ultimate patience and maintenance.

Even if the site starts appearing at the top of the search engines, the work is not done. The SEO work should be continued to maintain the rankings, which is one of the most tedious jobs.

Some of the major issues faced from the client side are:
1) They need guaranteed top listings for their keywords.
2) They are not ready to make design changes to their website to make it SEO friendly.
3) When a keyword drops 1 or 2 positions after a Google update, they start complaining.
4) They want only Flash files on all the pages and still expect better SERPs.
5) They are not ready to enrich the site with content related to their main keywords.
6) They have a small site with 25 keywords, and they need top rankings for all 25 keywords.

The above list contains only a few of them, but there are still more.

After looking at the above list, the one thing that comes to mind is what we can do from our side to sort out these issues. SEO companies can do various things which help remove most of these issues and make the project smooth. Some of the things that I have implemented are listed below.

1) Educate the clients about the basics of SEO.
2) Make them aware of how search engine algorithms work.
3) Encourage them to update and add fresh content.
4) Inform them about the latest search engine updates.
5) Send them monthly reports and explain the reasons behind falls / increases in rankings.
6) Avoid giving away details of everything that we have done in the SEO project.

These things have worked for me to a great extent and I hope that this would work for you also.

The Effects of Over Optimizing a Website

Website optimization has changed very much over the last 2-4 years. In those days you might not have had much competition, and you could come to the top of the search engine results page without doing much work. A little SEO on your site would have taken you to the top of the SERPs. But times have changed, more and more websites have come to the fore, and hence the competition for your keywords has also increased. So these days you try to optimize more when compared to the SEO you used to do 2-3 years back. Certain SEOs try everything possible to bring their site to the top position. But the major thing that I have noticed is that sites which are overly optimized tend to lose the battle and lose their popularity.

So the question is whether over optimization of your site results in any kind of penalty from the search engines. As per Matt Cutts, there is no such thing as an over optimization penalty in Google, but over optimization can make your site look spammy. Google has so far not penalized any website for over optimization, but websites have lost popularity among their users because of it. For example, if you over optimize your site with keywords, such as using them more than usual in title tags, meta tags, alt tags, content, etc., it can end up annoying the users. Even if you rank on top, the users will not be comfortable with this over optimization, and your site will look junky or scammy to them.

In my experience, over optimization of a site has always led to issues and resulted in the site losing its popularity and traffic. If your content is over optimized with keywords, users find it uncomfortable to go through, and the content appears spammy to them. So the best thing to do is to come back to the basics, include keywords in a moderate way and make the site more user friendly. If the search engines consider your website the most appropriate for a certain keyword, no one can stop your site from appearing at the top of the SERPs. Relevancy is the key to better search engine rankings, and if your site is relevant to your major keywords, you don’t have to worry too much.

Is SEO Changing?

The last few weeks have seen so many changes in the search engine world. These include the announcement of Google Wave, the replacement of Microsoft’s Live with Bing, and the end of Yahoo’s GeoCities and Yahoo 360. All these things give a feeling that the search engine optimization field is changing somehow.

Search engines have been around for almost 10 years, and we have seen a lot of changes since then. In the early years we were new to search engines and were amazed at the results they produced. But over time, search engines have evolved to a higher level, with Google leading the way and remaining at the helm. Whenever anyone thinks of searching for something on the net, the first name that comes to mind is ‘Google’.

In the earlier days people used search engines to search for something on the net, or to look for some information. But now things have changed rapidly, and apart from searching for stuff on the net, people are using search engines even for taking decisions and other calculations. Search engine algorithms have become more complex, and the major search engines are trying harder to stay at the top. Until some time back, no search engine was comfortably able to read and crawl the text in Flash files and certain JavaScripts. But Google has somewhat broken that jinx and thrown down more challenges to its competitors.

All these facts have forced Search Engine Optimization (SEO) professionals to work harder and stay up to date with the changing algorithms and tools. But in my opinion, even with these advancements in search engine algorithms, the basics of SEO have not changed. The basics of search engine optimization remain the same, but the style of doing it has changed. So every SEO professional should be well equipped to adapt to these technological advancements.

The basics of SEO are building a search engine friendly website structure and doing good keyword research, so that the website does not perform badly on the search engine results page (SERP). The process of link building is also much the same and has not changed too much. The main thing that differentiates the SEO of today from the SEO techniques implemented a few years back is the introduction of Social Media Optimization (SMO). Social Media Optimization is the process of getting traffic from social media sites that work on the Web 2.0 platform. SMO mainly focuses on user participation and loyalty, and is also a major source of link building.

So, in a nutshell, if you stick to your SEO basics and stay updated with the latest techniques, you will not find too many changes in the process of SEO.