Google Keyword Tool – Keywordize Blog

You have all the content well written, tags are in place, the right images have been used and everything seems to be in order, but you are still wondering why the post is not getting enough exposure on the lord of search engines – Google! This happens when you have not used the right keywords in your post – the ones that Google or any other search engine can catch and show in the search results. The Google Keyword Tool is your next step to keywordize your blog. Continue reading “Google Keyword Tool – Keywordize Blog”

Improve WordPress site PageSpeed in Simple Steps

Site PageSpeed is an important factor for better traffic, not only for SEO but also for a better user engagement experience. Use the following quick and simple steps to improve your WordPress site’s PageSpeed.

Working on the right keywords is also important. Today, quality matters more than anything else, but filling the gaps wherever possible to make your blog perfect will always help. Quality content is built by using the right keywords – the ones people generally use when searching for information.

Install a Caching Plugin

The W3 Total Cache plugin caches and compresses files to reduce page download time. I have seen great improvements in page load time in Google’s PageSpeed tool after installing it.

Modify your site’s .htaccess File

Log in to your site’s file manager and edit the .htaccess file (back up the file first!). Then copy the code below and paste it at the top of the .htaccess file. This improves the Leverage Browser Caching PageSpeed factor.


<IfModule mod_expires.c>
# Enable expirations
ExpiresActive On

# Set default expire time
ExpiresDefault "access plus 2 weeks"

# Specify expire time by file type
ExpiresByType image/jpg "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/pdf "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
ExpiresByType image/x-icon "access plus 1 month"
</IfModule>


Use Asynchronous Code for AdSense

I use WP Insert for inserting ads on pages; it helps me manage the ads easily over time. Ensure that you are using the asynchronous AdSense code.

To get the asynchronous AdSense code, go to your Google AdSense account, open My Ads, then click Get Code for the ad of your choice. In the overlay popup, click the dropdown and select Asynchronous as shown below. The asynchronous code is currently in BETA, but it works well on my website. Copy this code and paste it either into WP Insert (or whatever plugin you use for AdSense) or directly on the page, as per your convenience.

[Image: asynchronous AdSense configuration]
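For reference, a typical asynchronous AdSense unit looks like the snippet below (the data-ad-client and data-ad-slot values are placeholders – use the ones generated for your own account):

```html
<!-- Loads the AdSense library without blocking page rendering -->
<script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script>
<!-- Replace the placeholder publisher and slot IDs with your own -->
<ins class="adsbygoogle"
     style="display:inline-block;width:728px;height:90px"
     data-ad-client="ca-pub-XXXXXXXXXXXXXXXX"
     data-ad-slot="XXXXXXXXXX"></ins>
<script>
(adsbygoogle = window.adsbygoogle || []).push({});
</script>
```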

Use Asynchronous Code for Social Plugins

Social plugins add a lot of JavaScript and CSS overhead, contributing to longer page load times. Using an asynchronous plugin like Async Social Sharing may help you reduce the overall page load time of your WordPress site.

To improve the PageSpeed factor Eliminate render-blocking JavaScript and CSS in above-the-fold content, you need to ensure that JavaScript and CSS do not block the page load, meaning the rendering of the page is not held up while these asset files are loading. The solution is to move the JS and CSS files towards the footer of the page so that they load after the above-the-fold content. Above-the-fold content means the content present in the viewport (not below the visible screen). I have added the Footer Javascript plugin for this.
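If you prefer not to use a plugin, the same effect can be sketched with the HTML defer attribute (theme.js is a placeholder file name); the browser keeps parsing the page and runs the script only after the document is ready:

```html
<!-- defer: download in parallel, execute after HTML parsing completes -->
<script src="theme.js" defer></script>
```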

Compress Image with Smush.It Plugin

Images can be compressed without losing (visible) quality using a plugin like Smush.It. To smush already uploaded images, go to Media >> Bulk Smush.it. It takes a couple of minutes depending on the number of images in your media library, but it compresses the images, contributing to a better PageSpeed.

Lazy Loading of Images

Lazy loading of images can significantly improve the user experience. It loads images as they appear in the viewport, meaning images not visible on the current screen are downloaded only when the user scrolls down to that area of your page. You may use the jQuery Image Lazy Load plugin for this.
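As a rough sketch of the idea, modern browsers also support lazy loading natively through the loading attribute, without any plugin (file name and dimensions here are illustrative):

```html
<!-- The browser defers fetching this image until it nears the viewport -->
<img src="photo.jpg" alt="photo" width="600" height="400" loading="lazy">
```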

Minify CSS / JS

If you have any custom CSS or JS of your own, first minify it using online tools, then upload it to your WordPress site.
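As a small illustration, minification turns readable CSS like the first rule below into the compact second form – same styles, fewer bytes to download (the selector is made up):

```css
/* Before minification: indented and readable */
.site-header {
    color: #ffffff;
    margin: 0px;
}

/* After minification: whitespace stripped, values shortened */
.site-header{color:#fff;margin:0}
```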

That’s all! Now check your site in PageSpeed Insights and let me know whether you see any improvement.

Interesting Links

Optimizing Page Load Time with Free Tools Online

[Image: page load time optimization]
A faster page load is very important for a better user experience. If a page takes a long time to load, there is a high probability that the user will look for another page, going away from your website. Optimizing page load time can be done quickly using various free tools available online. The things that affect page load time, and free online tools to address them, are listed below.

Recommended Read

Optimizing Page Load Time Tips

  1. Large graphic images: Sometimes we forget to optimize images, which causes the website to spend time loading images before it loads completely. Using Yahoo’s online Smush.it tool, you can compress images and get them loading faster on your website.
  2. CSS sprites: Instead of having many small icon-sized images, you can have one single image with all the icons and use a CSS sprite to show them on the page. This reduces the number of requests to the server, thereby improving page load time. Using a CSS Sprite Generator, you can create CSS sprites for free.
  3. Many large JavaScript & CSS files: JavaScript and CSS are written with indents, spaces, etc. for better readability. However, when deploying to a production website, you should consider compressing the JS and CSS files. There are two methods: minification and obfuscation. Minification reduces size without changing the code’s behaviour, arranging it into the minimum bytes required. Obfuscation alters the code to compress it even further by applying code transformations, reducing iterative paths, renaming variables, etc., and is more error-prone. Minify JS is a free online tool to compress JS, while Minify CSS is a free online tool to compress CSS. These online compression tools can also combine multiple files into one.
  4. Many server HTTP requests: Placing the images, CSS, Flash objects, etc. all on the same server results in a lot of requests to that server. Spreading the load across other servers helps. You can use a Content Delivery Network to serve this content, thereby reducing the requests to your server and improving page load time.
  5. Placement of CSS and JS on the page: Put stylesheets at the top and JS at the bottom. Putting stylesheets in the HEAD allows the page to render progressively. JS blocks parallel downloads, so keeping it at the bottom is key to optimizing page load time. Keep JS and CSS files external rather than inline.
  6. Avoid CSS expressions: The problem with expressions is that they are evaluated more frequently than most people expect – not only when the page is rendered and resized, but also when the page is scrolled and even when the user moves the mouse over the page.
  7. Too many plugins: Plugins like social media widgets generate a lot of requests (mostly to the social media servers) and affect the load time. Remove unnecessary plugins.
  8. Lots of comments: The size of a page sometimes increases because of comments; occasionally the comments are larger than the actual page content. Loading a lot of comments on a page can impact load time. You can load the comments after the page has loaded, for example via AJAX.
  9. Code behind: The server-side code also plays a vital role in page load time. Caching may help, but not always – read more about caching and decide whether to enable it.
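The sprite technique from point 2 can be sketched in CSS as below (icons.png, the class names and the 16px offsets are illustrative – adjust them to your own sprite sheet):

```css
/* One sprite sheet holds every icon, so the browser makes a single request */
.icon {
    background-image: url(icons.png);
    width: 16px;
    height: 16px;
    display: inline-block;
}
/* background-position shifts the sheet to show the individual icon */
.icon-home { background-position: 0 0; }
.icon-mail { background-position: -16px 0; }
```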

Analyze the links on your page with Links Analyzer

Further reading

After your page load time optimization, check the page load time with one of the most popular tools available today.

Create backlinks for SEO

SEO is iterative and is very important to make your site reach its customers. There are many techniques by which you can achieve this goal. Apart from search engine optimization on your site or page, you must also create backlinks for SEO by submitting articles to various websites that allow you to do so.

Approach to Create backlinks:

Register and submit quality content on article submission directory sites. This helps you become visible on the world wide web.

You can also submit answers in forums. The best way is to include links in your signature; they then get added to every answer you post. Aim to provide a solution to the questioner rather than aiming to create backlinks while responding to a question. Your answer (and hence the link in your signature) will automatically gain popularity if it really resolves the problem.

So people who are specifically looking for that solution will ultimately read your answer and may also follow the links. In any case, your link becomes visible when users come to your answer, especially through a search engine.

There are tools that allow you to share links to social media including Facebook, Twitter, StumbleUpon, Bibo, Digg and many others. One website that provides this service is onlywire. It submits your articles to various social media sites (52 sites as of today!), saving you the time of visiting each one to post your link. Popularity is the key to SEO, and creating backlinks is the key to popularity.

Caching – The good, the Bad

Not all good things created on earth are good for everyone all the time. For example, milk may suit some people, but for others it may cause skin irritation. The same is true of caching.

Caching – Good or Bad?

Caching is not always a good idea. If site traffic is low, caching is enabled and your website is hosted on shared or multiple servers, users may experience high load times and become impatient. This is because the application pool resets due to inactivity (low traffic), and when a user then opens a page the expired cache has to be rebuilt.
[Image: caching effects on website load time]
So you should enable caching only if the site has a good amount of traffic and you don’t have multiple applications running on a shared basis.

Caching is important for better load time, but at the same time it may not be good for a low-traffic website.

Submit Sitemap to Search Engines

Sitemaps are an easy way for webmasters to inform search engines about the pages on their site that may be crawled by the search engines’ robots. These sitemaps are in XML format.

A typical Sitemap file lists every URL, together with information about when it was last updated, how often it normally changes, and how important it is, relative to other pages in the site. This helps search engines to more intelligently crawl the website.
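A minimal sitemap entry, following the sitemaps.org schema, looks like this (example.com and the date, frequency and priority values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```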

Creating sitemap

First of all, create the sitemap for your website, listing each URL in it.

You may have a look, as an example, at the sitemap of this website that lists the url with other attributes: Sitemap

After you have created the sitemap for your website, it’s time to submit it – to inform the search engines about it. Below are the methods to submit it to various search engines.

Submitting the sitemap to Google

Google developed a schema for sitemaps and has a portal for webmasters, Google Webmaster Central.

First of all, after signing in to your Google account, you have to verify ownership of the website. This is done by uploading an HTML file or updating a page with the meta information that Google provides. The meta tag looks like –

<meta content="unique code provided by google" name="verify-v1" />

Submitting sitemap to MSN

MSN Live search lets webmasters inform it about the sitemaps for various sites. The following URL would directly submit the sitemap to MSN:

Submitting Sitemap to Yahoo

Yahoo provides this service through Yahoo Site Explorer. You can submit the sitemap through the following URL

As with Google, Yahoo too provides a verification HTML file or meta tag to verify ownership of the website. Once verified you can use the services provided by Yahoo Site Explorer. Submit the sitemap as a feed by providing your sitemap url on:

Submitting Sitemap to Ask

Ask follows a similar approach. The following URL submits the sitemap to Ask:

Happy Webmastering!!!

Automatically Submit Sites to Search Engines:

Just provide your site or individual page URL and other details, and submit. The phone number is not mandatory. It’s free – no bluffing at all! I personally use this tool. It is provided by Free Web Submission.

Search Engine Optimisation – SEO

SEO – Search Engine Optimization

Make the site easy enough for users to find the useful information they seek. Try it yourself by navigating through the site’s contents. SEO is not a step-by-step method to follow; it’s an ongoing activity.
[Image: SEO - Search Engine Optimization]

All the pages within your site should be reachable by a static link. However, make sure you don’t just dump hundreds of links on a page for this reason.

Use simple and short keywords. Keep 2-3 phrases for keywords; longer phrases are sometimes ignored by crawlers. Phrases can be made simpler by choosing the common keywords that users would type into the search string. The set of keywords on each page should be unique. DON’T use words like www, com, a, the, an, etc. – these keywords are ignored by crawlers.

Elements like the title and the alt attribute of images should be used appropriately. They should be descriptive and properly phrased. The <title>, for example, summarises the content of your page. Similarly, the alt attribute should carry a proper image description. Put your main keywords in the title, making room in the keywords list for other keywords.

Example –

<img src="search-engines.jpg" alt="search engines" />

Use heading tags to put headings on your page. Crawlers give heading tags more weight than other tags; if a heading is put in div or span tags, the crawler treats it as normal text. Imagine yourself reading a book – the first thing you do is scan the headings rather than reading the text under them.
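For example, the first heading below carries extra weight with crawlers, while the second, even if styled to look identical with CSS, is treated as plain text:

```html
<h2>Search Engine Optimisation</h2>
<div class="heading">Search Engine Optimisation</div>
```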

Clear all broken links from the site. Create a permanent redirect for these broken links. You may use .htaccess to let crawlers understand this.
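A permanent redirect for a broken link can be sketched in .htaccess like this (both paths are placeholders):

```apache
# 301 tells crawlers the page has moved permanently
Redirect 301 /old-page.html http://www.example.com/new-page.html
```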

Make your pages valid. Make sure no tags are missing their end tags – in other words, that all tags are properly closed. Crawlers may skip such pages even if browsers display them properly. Validate your website against W3C standards with the help of online tools.

Not all crawlers crawl dynamic pages. When your page has parameters passed in the URL, it should be rewritten to a proper, sensible URL. This makes the crawler treat it as a static page.
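With Apache’s mod_rewrite, such a rewrite can be sketched as below (article.php and the URL pattern are hypothetical examples):

```apache
RewriteEngine On
# Serve the dynamic page under a clean, static-looking URL
RewriteRule ^article/([0-9]+)/?$ /article.php?id=$1 [L]
```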

Keep the link names consistent throughout the site.

You may use top-level domains for bilingual site content; a Hindi-language website, for example, is easier to find under a country-specific domain.

Use a robots.txt file to help or prevent crawlers from crawling your website. Major search engines use this text file as a guide while crawling the website.


Allow all crawlers to crawl the complete website:

User-agent: *
Disallow:

Do not allow any crawler to crawl the website / keep them all away:

User-agent: * 
Disallow: /

Disallow select directories from your website:

User-agent: *
Disallow: /admin/ 
Disallow: /private/

To help crawlers find the links within your site easily, create a sitemap and submit it to search engines. Have a look at this sitemap:

You may use the following link to generate the sitemap and make your job easy.
Google Sitemap Generator

Submitting your site to as many search engines as possible is a good idea. Alexa is another site you should concentrate on. Check how to increase your Alexa rank.

There are various online sitemap generators, validators, etc. These help ensure that your sitemap is valid and will not be ignored by the search engines. Similarly, a robots.txt file can also be generated and validated. Get more information on how to increase organic traffic.

Avoid redirecting links in your website using JavaScript. Crawlers avoid JavaScript, so they would not properly discover the redirections within your website.

If you are rewriting URLs, make sure the crawler is informed whether the redirect is permanent or temporary; otherwise the URLs will be treated as duplicates and you might lose the page ranking.
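In .htaccess this can be sketched with the R=301 flag (both paths are placeholders), which tells crawlers the move is permanent rather than a duplicate:

```apache
RewriteEngine On
# R=301 sends a permanent-redirect status instead of silently rewriting
RewriteRule ^old-path/(.*)$ /new-path/$1 [R=301,L]
```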

Strictly avoid duplicate content; it confuses both crawlers and users. Instead of making an altogether different page for the printer-friendly version of a page, create a separate CSS that is applied when the user prints the page. Another example: instead of putting the disclaimer text at the bottom of every page, create a separate disclaimer page and link to it from the pages that need it. This helps avoid duplicate content on your site.
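The printer-friendly version can be served from the same page by attaching a print-only stylesheet (the file names are placeholders):

```html
<!-- The screen stylesheet is ignored when printing, and vice versa -->
<link rel="stylesheet" href="style.css" media="screen">
<link rel="stylesheet" href="print.css" media="print">
```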

Hosting companies can also affect your page ranking. If the same hosting server also hosts ‘bad hats’ – spammers, pornographers, etc. – your site’s ranking may suffer or be banned too, because crawlers treat all sites sharing the hosting server’s IP address as the same kind of site.

The IP address of your website should not keep changing continuously; such websites are avoided by search engine crawlers.

Try to make your site interactive. The more interactive the site, the more likely visitors are to return again and again. Dynamic content such as forums and news keeps users coming back. Give your users a reason to revisit the page or website: you check your email expecting new messages or replies, and visitors to your site are the same – they should expect something new, be it replies or answers to their questions or comments.

Google, Netscape, Alexa, AOL, Overture, Ask Jeeves, Excite, Lycos, Fast, LookSmart, Teoma, MSN, Yahoo, AllTheWeb, HotBot, Go, Dmoz, AltaVista

Website Marketing

Market your website and share the link with the world. Website marketing is very important if you are focusing on increasing your business or reach to the people.

Website Marketing

You may have a very good and useful website, but if the website is not visible to its users, you are not really getting its worth.

Whenever you create a website – or, to be specific, a page or section within a website – you are targeting a user group for whom you are creating it. The targeted group can be any set of users: CEOs of manufacturing companies, CEOs of software companies, students from a specific stream (science, commerce, arts, etc.), housewives, and so on.

Now if your website was created for a specific user set, for example housewives, and through the wrong medium it is seen by students of a science background, you have not reached your potential users. The students will simply ignore the website or page that is not relevant to them (in this case).

On the other hand, if the website is visited by your targeted users, for example housewives, they will want to explore it more and more. If they find it interesting, they will also share it with their friends, relatives, etc. through social media like Facebook, Google+, and so on. Hence, your targeted users are multiplied.

The more visible your website is, the greater the chances of it being seen by your targeted users – and if you have a business or an ad on the website or page, the more money you make through this better visibility.

So having your website advertised at many places is a good idea.

Advertising it on sites that are more popular will get your site or pages even better visibility. Sites that have categorized listings are more helpful for reaching your targeted users.

For example, if a website lists everything in one place, it will be confusing for a user, and again the case of a housewife-targeted page being shown to a science student will occur. On the contrary, if the webpage or website is advertised where categories are present, it will help users reach the section they are interested in.

For example, a science student will never enter the section meant for a housewife, and the chances of the site or page reaching its targeted users increase!