Thursday, April 27, 2017

Nofollow Back-links: All you need to know

SEOs and webmasters tend to dislike nofollow links, because they are said not to pass much "juice" to a website. But do you know what a nofollow actually is? It is simply an HTML attribute value used to instruct search engine bots that a hyperlink or back-link should not influence the link target's ranking in the search engine's index. Some nofollow links are still suspected of passing link juice, although not directly; in the end it is just an indication attached to a back-link.
The attribute was announced on January 18, 2005, in collaboration by three major search giants: Google, Yahoo and Microsoft. It was introduced to curb spam such as blog comments and paid links, since back-links are used as a ranking signal. The main intention behind the attribute was to stop credit being passed from a linking website to the linked website, as webmasters and website owners are not always ready to share the authority of their sites.

Here's a code snippet showing what a nofollow back-link looks like:

<a href="" rel="nofollow">ABC</a>

There is one more type as well:

External Nofollow

i.e. <a href="" rel="external nofollow">ABC</a>
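As a quick illustration (a minimal sketch, not taken from any particular library), here is how you might check whether a link's rel attribute marks it nofollow. Since rel can hold several space-separated tokens, as in the "external nofollow" example above, a simple substring check is not enough:

```javascript
// Returns true if a rel attribute value contains the "nofollow" token.
// rel can hold several space-separated tokens (e.g. "external nofollow"),
// so we split on whitespace instead of doing a substring match.
function isNofollow(rel) {
  if (!rel) return false;
  return rel.trim().toLowerCase().split(/\s+/).includes('nofollow');
}

console.log(isNofollow('nofollow'));          // true
console.log(isNofollow('external nofollow')); // true
console.log(isNofollow('external'));          // false
```

In a browser you would read the attribute with `link.getAttribute('rel')` and pass it to this helper.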

Benefits of Nofollow Back-link

Although these links are not prized from an SEO perspective, they still have benefits.

1. Traffic
Traffic is the first and last intent that back-links should serve. Nofollow back-links help increase referral traffic from the websites that link to you, which grows your audience or user base.

When to give a nofollow link:

1. In Code Embedding
When you're embedding code or resources from other sites, you should add a nofollow attribute to that link. It prevents your domain authority from being passed to the linked site.

2. In Blog Comments
If your website has a blog and you allow comments, you should add the nofollow attribute to those comment links.

3. Untrusted Content
If you do not trust the content you are linking to, give the link a nofollow.

4. Crawl Prioritization
In a way, you are also prioritizing how crawling attention is spent on your web-pages. If you give a link a nofollow, web robots will automatically prioritize the other content on your page.

Nofollow also signals to Google and other search engines that you build your links naturally and do not intend to spam. It indicates that you're interested in building natural links, which can increase your user base and traffic. So keep in mind that "nofollow links" are not the end of the world :)

Monday, February 13, 2017

Common Signs of Poor SEO Strategy OR Bad SEO Practices

There are many to name, and they should be avoided to keep a website SEO friendly, such as:

1. Keyword Stuffing/Improper Keyword Density

Stuffing keywords everywhere is the classic example. A keyword that appears over and over again is also bad practice: one keyword repeated too many times makes the content poor to read.

2. Keyword Cannibalization

Targeting one keyword with two or more pages, i.e. focusing all the power of the website on a single keyword across multiple pages, is keyword cannibalization. Avoid it; focus each keyword on just one good-quality landing page.

3. Gateway/Doorway Pages

Creating pages built purely around specific keywords is what gateway/doorway pages are. Avoid this pattern; even when you create a landing page, keep it in mind.

4. Links from Exact Match Keywords

Never acquire back-links anchored on exact-match keywords. Instead, build links with phrases like your business name, website URL, a keyword without GEOs, or miscellaneous anchors like "click here", "here" or "more info".

5. Cloaking

Cloaking is presenting two versions of website content, one for users and another for web robots/bots. It is a clear-cut black hat SEO technique.

6. Too many CTAs or improperly placed CTAs, where they are not required

Calls to action, or CTAs, belong at key areas of web-pages; they should not be scattered across multiple parts of a page, and should not be stuffed into the above-the-fold content. Place them at the end of the page or along the left/right side, just like sharing buttons.

7. Buying Back-links

Google can actually figure out that you're buying links, so beware if you're heading that way. You do not need to buy links: opt for natural ways to earn good-quality back-links and work gradually on building your back-link profile.

8. Thin/Irrelevant/Misleading Website Content

Reading something we are not interested in always disappoints. In the same way, do not put content on your website that misleads or is irrelevant to your website or industry niche. Also avoid thin content: a very small amount of text that does not provide proper or enough information.

So, do not follow these strategies to get ranked on search engine results pages. Instead, follow the search guidelines, build a website that gives quality to users and, most important, have patience. You will get rankings one day.

Happy Optimization!

Friday, December 30, 2016

Why Search Engine Optimization?

For those who wish to improve their site's interaction with users and web robots, SEO is as important as any other form of marketing. All you have to do is follow the best practices in Google's guidelines.

The process of Search Engine Optimization (SEO) is all about improving and modifying your website to make it search friendly. While doing it, the changes may seem minor, but when properly implemented you can see significant improvement in your website's performance.

SEO helps you rank on Google's or any other search engine's results page. Organic rankings come free, without paying a single penny. You have to put in your best to earn them, but being yourself always helps. Make your content original and authentic, and write for your users/website visitors, not for search engines (this is the core part).

There are some key factors, or website components, which Google uses to assign search rankings. They are as follows:

1. Title of the web-page
>>This is the first line a user reads when searching for what they need.

2. Meta description of the web-page
>>The meta description is a little explanation of the web-page shown after the title. It needs to fit within 160 characters, including spaces. Yes, that's the limit.

3. Web-page URLs
>>Uniform Resource Locators (URLs) are the web-page's address, where the page resides. Structure URLs so they contain a keyword and indicate the directory structure.

4. Breadcrumbs
>>Breadcrumbs are short, direct links which help users and web robots understand the page flow.

5. <h1> Tag
>>The <h1> tag contains the heading of the page; it tells what your page is about.

6. <h2> Tag
>>Like <h1>, this is a second-level heading, through which you can add more content under your main heading.

7. Web-page Content
>>Even though everything on the page counts as content, adding explanatory text or captions helps immensely. Its benefit depends on how accurate it is and how deeply you describe your service or product.

8. Image Optimization
>>Name image files sensibly, reduce file size, and give each image a suitable title and ALT text.

9. Internal Linking
>>Linking web-pages to each other where suitable. This improves the internal structure and navigation of the website.

10. Call to Actions
>>Adding calls to action (CTAs) and USPs can help your user make a quick decision about clicking.

11. Hyper-link Optimization
>>Give hyper-links a proper title so they help the user understand the navigational menu.

12. XML Sitemaps
>>An XML (Extensible Markup Language) file helps web robots read all of your URLs. Through a sitemap, we provide all the web-pages in one place to web robots.

13. HTML Sitemaps
>>HTML sitemaps are for humans, i.e. visitors. They help the user navigate through the site.

14. Robots.txt
>>Now this is an interesting file, which you upload to the root folder of your web server. It controls which parts of the site web robots may access.
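To make the XML sitemap point concrete, here is a minimal sitemap sketch (the URL and values are placeholders, not from any real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-12-30</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Each `<url>` entry lists one page; `<lastmod>`, `<changefreq>` and `<priority>` are optional hints for crawlers.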

Apart from all this, you can make use of Google tools like Search Console and Analytics. There are also tons of other tools available.

P.S. You do not need to be a developer to begin with this process; you can take the help of a developer.

Wednesday, November 30, 2016

Something About Robots.txt in a Simple and Easy Manner!

Robots.txt is a directive file that instructs the search engine crawlers, or web robots, on how to crawl the website. The purpose of installing a Robots.txt file is to keep away spammy bots while allowing the good bots that are beneficial to the website from an SEO perspective. Generally, we install it in the root folder of the web server. Any website's Robots.txt file can be checked by entering the site's robots.txt URL in the address bar of the web browser.

You can also check it for errors and test it with the tester in Google Search Console, under Crawl >> robots.txt Tester.

Here's what robots.txt Tester looks like in Google Search Console:

It works under the Robots Exclusion Protocol. In simple terms, it is a table of contents with instructions for web robots/crawlers: whenever a web robot comes to visit the website, it first checks robots.txt before visiting any other web-page of the site.
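For illustration, here is a minimal robots.txt sketch (the paths and sitemap URL are placeholders):

```text
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

`User-agent: *` addresses all robots; `Disallow` blocks a path from crawling, and the optional `Sitemap` line tells crawlers where to find your XML sitemap.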

There are hundreds of thousands of good and bad web robots, and robots.txt is an effective way to control them. You can also use the robots meta tag <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW"> to instruct web robots, but it is not suitable for allowing or disallowing specific robots: it applies to all types of robots. As there are pros, there are cons too. You cannot expect search engine crawlers to fully obey robots.txt instructions, but you cannot ignore it either, as it is a powerful mechanism available to SEOs/webmasters.

That's all! Please comment below if you want to discuss something or have any questions for me :)

Monday, November 28, 2016

SEO Content Creation Tips That Create Engagement!

1. Refer to your keywords to determine how people are searching for information about the chosen topic:

~ The URL structure should include the main keyword, and the city.
~ The title tag should include the main keywords.
~ H1 and H2s (Heading 1 and 2) should include the keyword.

2. You can always look at the current trends in the industry and what competitors are writing about.

3. I would recommend at least 250 words of page content, but don't focus on word count too much – focus on the thoroughness of the coverage of the topic.

4. As you write, don't focus on including keywords a certain number of times. Use them where they make sense from a reader's perspective. This means keywords will be sprinkled throughout the page, but naturally – a couple/few times over 250 words.

5. Make the copy pleasing for people to read. Yes, those keywords are in there, but they are highlighted in a natural, non-robotic manner. Add some additional information about your cities and the best views/attractions for those cities.

6. Have a compelling title/headline for the post, written with a key term in mind.

7. Try to add unique images, additional information and videos.

8. Offer socialization (sharing) options.

9. Add contact details and a contact form at the end of the web-page, or in a way that convinces the user to sign up or fill in the form.

So, these are some general tips which can be used to create unique website/blog content.

Thursday, August 25, 2016

What is Event Tracking Code in Google Analytics and How to Use it

There are four types of goals in Google Analytics: Destination, Duration, Pages/Screens per session and Event. A completed action or process is considered a goal. Event tracking code is implemented to track event goals – one type of measurement used to analyse user engagement.

A user's interaction with any website element is called an event. To track these interactions, we set up event tracking code with the help of Google Analytics. All the action data is reported through the Analytics reporting tab/interface at the top of the tool. With this feature you can gain more insight into how users interact with your website and its content/elements.

There are many types of events. Below are some examples of events, along with pieces of code to install and execute them.

1. Banner Ads
2. Clicks on Social Icons
3. Videos
4. Form Fill-ups(It can be Sign-up Form, Contact Us or Inquiry/Free estimate request Form etc.)
5. Clicks on Emails(Click to Email Option Should Be Present)
6. Call to Action(Call or Add to Cart/Shopping Cart etc.)

Here are the parameters; pass in your values and you are ready to go!

1)Forms Like Free Website Analysis Request(Ideal for )

onClick="_gaq.push(['_trackEvent', 'Analysis', 'Click Link', 'Analysis Request']);"

2)PDF Downloads
onClick="_gaq.push(['_trackEvent', 'Temporary Benefits', 'PDF', 'Temporary_Benefits.pdf']);"
onClick="_gaq.push(['_trackEvent', 'One Page Stratergic Plan', 'PDF', 'OPSP_final.pdf']);"

3)Signup Form: News Letter Signup
onClick="_gaq.push(['_trackEvent', 'Submit', 'Form', 'Contact Us']);"

4)Social Icon Clicks
onClick="_gaq.push(['_trackEvent', 'Subscribe to Blog', 'Newsletter', 'Blog']);"
onClick="_gaq.push(['_trackEvent', 'Follow Us on Twitter', 'Signup', 'Twitter']);"
onClick="_gaq.push(['_trackEvent', 'Be a fan on Facebook', 'Fanpage', 'Facebook']);"
onClick="_gaq.push(['_trackEvent', 'RSS Feed', 'Feeds', 'RSS Feed']);"
onClick="_gaq.push(['_trackEvent', 'Google +', 'Google +1 Page', 'Google']);"

onClick="_gaq.push(['_trackEvent', 'Connect with us on LinkedIn', 'Social Media', 'LinkedIn']);"
onClick="_gaq.push(['_trackEvent', 'Google+', 'Social Media', 'G+']);"
onClick="_gaq.push(['_trackEvent', 'Visit our Facebook page', 'Social Media', 'FB']);"
onClick="_gaq.push(['_trackEvent', 'Follow us on Twitter', 'Social Media', 'Twitter']);"
onClick="_gaq.push(['_trackEvent', 'Visit our Youtube Channel', 'Social Media', 'YouTube']);"
onClick="_gaq.push(['_trackEvent', 'Pinterest', 'Social Media', 'Pinterest']);"

onClick="_gaq.push(['_trackEvent', 'Videos', 'Play', 'Baby\'s First Birthday'])"

Each piece of custom code contains a method plus up to four elements [Category | Action | Label | Value]. Taking the first example above:

_trackEvent – the method

Analysis – the category

Click Link – the action

Analysis Request – the label

Value – an optional element; you pass an integer rather than a string, e.g. _gaq.push(['_trackEvent', 'Videos', 'Video Load Time', 'Gone With the Wind', downloadTime]);

Non-interaction events – an optional boolean flag at the end.
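Since the parameter order is easy to mix up, here is a rough sketch of a small wrapper (the helper name `buildTrackEvent` is made up for illustration, not part of Google's library) that assembles the array classic Google Analytics (ga.js) expects in `_gaq.push()`:

```javascript
// Hypothetical helper: builds the array that classic Google Analytics
// (ga.js) expects in _gaq.push(). Category and action are required
// strings; label is an optional string; value is an optional integer.
function buildTrackEvent(category, action, label, value) {
  if (typeof category !== 'string' || typeof action !== 'string') {
    throw new Error('category and action must be strings');
  }
  var event = ['_trackEvent', category, action];
  if (label !== undefined) event.push(label);
  if (value !== undefined) {
    if (!Number.isInteger(value)) {
      throw new Error('value must be an integer');
    }
    event.push(value);
  }
  return event;
}

// Example: the PDF-download event from above.
console.log(buildTrackEvent('Temporary Benefits', 'PDF', 'Temporary_Benefits.pdf'));
```

On the page you would then call `_gaq.push(buildTrackEvent(...))` in the element's onClick handler.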


If you're not tech savvy or do not have the technical skills, you can take the help of a web publisher.


>>Analytics should already be set up.

>>If you have no tech skills, don't worry: there are many plugins available for WordPress sites, though they also need testing and monitoring after installation.

>>This feature can also be used with Google Tag Manager.

Thursday, July 21, 2016

Schema Tag: Usage & Meaning Simplified!

Schema mark-up is a way of interpreting your data as structured data. We should implement schema tags on the website's information so that search engines can understand it better. Schema is a part of structured data mark-up, and with the help of schema tags we can represent our data in a better format.

There are three major types of structured data mark-ups:

1. JSON-LD (recommended by Google) – JavaScript Object Notation for Linked Data
2. Microdata
3. RDFa (Resource Description Framework in Attributes)

This makes sure that your data mark-up is presented in a well-formed manner and can be processed by Google and other search engines. It is also referred to as a kind of micro-data format used by the majority of SEOs and web developers worldwide. Schema.org is an open community effort to promote standard structured data in a variety of formats, where you can find almost everything about schema tags and their usage.

You can implement data mark-up on the following properties of your website:

1. Internal Search Box
2. Ratings/Reviews
3. Physical Location/Address
4. Products (for e-commerce websites)
5. Future Events
6. Recipes (Ideal for food blog or cooking related websites)
7. Social Profiles (It shows only when your profiles are verified and tested)

The next step is always to make sure your structured data is tested and validated. Google provides a Structured Data Testing Tool for checking your mark-up before and after publishing. Testing in this tool makes your published data more eligible, and it shows errors and suggested corrections, if any.
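As a small illustration, a minimal JSON-LD snippet for a local business with ratings might look like this (the business name, address and rating values are placeholders, not real data):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "ABC Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Anytown"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
</script>
```

The block goes in the page's HTML, and the `@type` values (LocalBusiness, PostalAddress, AggregateRating) come from the schema.org vocabulary.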

In simple words, schema is a way of associating your data with search engine crawlers/robots and other online resources.

P.S. Creating Quality Pages Always Helps!