Monday, February 13, 2017

Common Signs of a Poor SEO Strategy, or Bad SEO Practices

There are many to name, and all of them should be avoided if you want a good, SEO-friendly website. Here are some of the most common:

1. Keyword Stuffing/Improper Keyword Density

Cramming keywords into every available space is the textbook example of keyword stuffing. Repeating the same keyword over and over is equally bad practice; a single keyword used too many times makes the content painful to read.

2. Keyword Cannibalization

Targeting the same keyword with two or more pages, or throwing the entire weight of the website behind a single keyword, is keyword cannibalization. Avoid it: focus each keyword on one good-quality landing page.

3. Gateway/Doorway Pages

Pages created purely to rank for specific keywords and funnel visitors elsewhere are known as gateway or doorway pages. Avoid them. Even when you build a landing page, keep this distinction in mind.

4. Links from Exact Match Keywords

Never acquire back-links whose anchor text is an exact-match keyword. Build links with natural phrases instead: your business name, your website URL, a keyword without GEO modifiers, or miscellaneous phrases such as "click here", "here", or "more info".

5. Cloaking

Cloaking is serving two versions of website content: one to users and another to web robots/bots. It is a clear-cut black-hat SEO technique.

6. Too Many CTAs, or CTAs Placed Where They Are Not Required

Calls to action (CTAs) belong at key areas of a web page. They should not be scattered across multiple parts of the page, and they should not crowd the above-the-fold content. Place them at the end of the page or along the left/right side, much like sharing buttons.

7. Buying Back-links

Google can actually figure out that you're buying links, so beware if you're heading that way. You do not need to buy links; opt for natural ways to earn good-quality back-links and build your back-link profile gradually.

8. Thin/Irrelevant/Misleading Website Content

Reading something we are not interested in is always disappointing. In the same way, do not put content on your website that misleads visitors or is irrelevant to your industry niche. Thin content that does not provide proper or sufficient information should also be avoided.

So, do not follow these tactics to rank on search engine results pages. Instead, follow the search guidelines, build a website that delivers quality to users and, most importantly, have patience. The rankings will come.

Happy Optimization!

Friday, December 30, 2016

Why Search Engine Optimization?



For anyone who wishes to improve their site's interaction with users and web robots, SEO is as important as any other form of marketing. All you have to do is follow the best practices in Google's guidelines.

Search Engine Optimization (SEO) is all about making modifications to your website to make it search friendly. While you are doing it, the changes may seem minor, but properly implemented they can produce a significant improvement in your website's performance.

SEO helps you rank in Google's or any other search engine's results pages. Organic rankings are free; you never pay a penny for them. You do have to put in your best effort, but being yourself always helps: keep your content original and authentic, and write for users and website visitors, not for search engines (this is the core principle).

There are some key factors, or website components, that Google uses to determine search rankings. They are as follows (a minimal HTML sketch tying several of them together appears after the list):

1. Title of the web-page
>>This is the first line a user reads in the search results while looking for what they need.

2. Meta description of the web-page
>>The meta description is the short explanation of the web page shown below the title. Keep it within about 160 characters including spaces; yes, that's the limit.

3. Web-page URLs
>>Uniform Resource Locators (URLs) are the web-page addresses, i.e. where each page resides. Structure URLs so that they contain a keyword and reflect the directory structure.

4. Breadcrumbs
>>Breadcrumbs are short, direct links that help users and web robots understand the page hierarchy.

5. <h1> Tag
>>The <h1> tag contains the main heading of the page; it tells what your page is about.

6. <h2> Tag
>>Like <h1>, it is a secondary heading, through which you can expand on the main heading.

7. Web-page Content
>>Although everything on the page counts as content, adding explanatory text or captions helps immensely. The benefit grows with how accurately and how thoroughly you describe the purpose of your service or product.

8. Image Optimization
>>Name image files meaningfully, reduce their file size, and give each image a suitable title and ALT text.

9. Internal Linking
>>Link web pages to each other wherever it makes sense. This improves the internal structure of the website and its navigation.

10. Call to Actions
>>Adding calls to action (CTAs) and USPs helps users make a quick decision about clicking.

11. Hyper-link Optimization
>>Give hyperlinks proper titles so that they help the user understand the navigation menu.

12. XML Sitemaps
>>An XML (Extensible Markup Language) sitemap helps web robots read all the URLs; it hands the robots every web page of the site in one place.

13. HTML Sitemaps
>>HTML sitemaps are for human visitors; they help the user navigate the site.

14. Robots.txt
>>This is an interesting file that you upload to the root folder of the web server. It controls which web robots may access the site.
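To tie several of these components together, here is a minimal, hypothetical HTML sketch. Every name, URL and piece of text below is a placeholder invented for illustration, not a template taken from a real site:

<html>
<head>
  <!-- Title: the first line a user reads in the results -->
  <title>Emergency Plumber in Springfield | Acme Plumbing</title>
  <!-- Meta description: the short explanation shown below the title, kept within ~160 characters -->
  <meta name="description" content="Acme Plumbing offers 24/7 emergency plumbing repairs in Springfield. Licensed, insured and available on weekends.">
</head>
<body>
  <h1>Emergency Plumbing Services in Springfield</h1>
  <h2>Burst Pipe and Leak Repairs</h2>
  <p>Explanatory copy about the service goes here.</p>
  <!-- Image optimization: descriptive file name, title and ALT text -->
  <img src="/images/burst-pipe-repair.jpg" alt="Plumber repairing a burst pipe" title="Burst pipe repair">
  <!-- Internal link with a proper title attribute -->
  <a href="/services/water-heater-repair/" title="Water heater repair">Read about our water heater repairs</a>
</body>
</html>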

Apart from all this, you can make use of Google tools like Search Console and Analytics. There are also tons of other tools available.

P.S. You do not need to be a developer to begin this process, but you can always take a developer's help.

Wednesday, November 30, 2016

Something About Robots.txt in a Simple and Easy Manner!

Robots.txt is a directive file that instructs search engine crawlers, or web robots, on how to crawl the website. The purpose of installing a robots.txt file is to keep away spammy bots while allowing the good bots that benefit the website from an SEO perspective. Generally, we place it in the root folder of the web server. You can check any website's robots.txt file by simply entering "abc.com/robots.txt" in the address bar of a web browser.

You can also check it for errors and test it in Google Search Console, under the Crawl >> robots.txt Tester section.






[Screenshot: the robots.txt Tester in Google Search Console]


It works under the Robots Exclusion Protocol. You can also download a sample robots.txt file from here. In simple terms, it is a table of contents holding instructions for web robots/crawlers. Whenever a web robot visits the website, it checks 123.com/robots.txt first, before visiting 123.com/home.html or any other web page of the site.

There are thousands upon thousands of good and bad web robots, and robots.txt is an effective way to control them. You can also use the robots meta tag <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW"> to instruct web robots, but it is not suited to allowing or disallowing specific robots; it applies to all robot types. As with everything, there are pros and cons: you cannot expect search engine crawlers to fully obey robots.txt instructions, yet you cannot ignore the file either, because it is one of the most powerful mechanisms available to SEOs/webmasters.
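If the download link above is not handy, here is a minimal, hypothetical robots.txt sketch; the folder paths, bot name and sitemap URL are placeholders, not recommendations for any particular site:

# Allow all robots, but keep them out of two private folders
User-agent: *
Disallow: /admin/
Disallow: /cart/

# Block one specific (hypothetical) bad bot entirely
User-agent: BadBot
Disallow: /

# Point crawlers to the XML sitemap
Sitemap: http://www.example.com/sitemap.xml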

That's all! Please comment below if you want to discuss anything or have questions for me :)

Monday, November 28, 2016

SEO Content Creation Tips That Create Engagement!

1. Refer to keyword research to determine how people search for information about the chosen topic:

~ The URL structure should include the main keyword and the city.
~ The title tag should include the main keywords.
~ The H1 and H2s (Heading 1 and 2) should include the keyword.

2. You can always look at the current trends in the industry and what competitors are writing about.

3. I would recommend at least 250 words of page content. Don't focus on the word count too much – focus on the thoroughness of the coverage of the topic.

4. As you write, don't focus on including keywords a certain number of times. Use them where they make sense from a reader's perspective. Keywords will then be sprinkled throughout the page naturally – a couple of times or so over 250 words.

5. Make the copy pleasing for people to read. Yes, the keywords are in there, but they are worked in a natural, non-robotic manner. Add some additional information about your cities and the best views/attractions in those cities.

6. Write a compelling title/headline for the post, with a key term in mind.

7. Try to add unique images, additional information and videos.

8. Provide social sharing options.

9. Add contact details and a contact form at the end of the web page, or place them in a way that convinces the user to sign up or fill in the form.

So, these are some general tips you can use to create unique website/blog content.

Thursday, August 25, 2016

What is Event Tracking Code in Google Analytics and How to Use it

There are four types of goals in Google Analytics: Destination, Duration, Pages/Screens per session and Event. A completed action or process is considered a goal. Event tracking code is implemented to track event goals - one way of measuring and analysing user engagement.

A user interaction with any website element is called an event. To track these interactions, we set up event tracking code with the help of Google Analytics. The resulting action data is reported through the Analytics reporting tab/interface at the top of the tool. With this feature, you gain more insight into how users interact with your website and its content/elements.

There are many types of events. Below are some examples of events, along with pieces of code to install and execute them.

1. Banner Ads
2. Clicks on Social Icons
3. Videos
4. Form Fill-ups(It can be Sign-up Form, Contact Us or Inquiry/Free estimate request Form etc.)
5. Clicks on Emails(Click to Email Option Should Be Present)
6. Call to Action(Call or Add to Cart/Shopping Cart etc.)


Here are the parameters; pass in your values and you are ready to go!

     
1) Forms like a Free Website Analysis Request

onClick="_gaq.push(['_trackEvent', 'Analysis', 'Click Link', 'Analysis Request']);"

2)PDF Downloads
onClick="_gaq.push(['_trackEvent', 'Temporary Benefits', 'PDF', 'Temporary_Benefits.pdf']);"
onClick="_gaq.push(['_trackEvent', 'One Page Stratergic Plan', 'PDF', 'OPSP_final.pdf']);"

3) Signup Form: Newsletter Signup
onClick="_gaq.push(['_trackEvent', 'Submit', 'Form', 'Contact Us']);"

4)Social Icon Clicks
onClick="_gaq.push(['_trackEvent', 'Subscribe to Blog', 'Newsletter', 'Blog']);"
onClick="_gaq.push(['_trackEvent', 'Follow Us on Twitter', 'Signup', 'Twitter']);"
onClick="_gaq.push(['_trackEvent', 'Be a fan on Facebook', 'Fanpage', 'Facebook']);"
onClick="_gaq.push(['_trackEvent', 'RSS Feed', 'Feeds', 'RSS Feed']);"
onClick="_gaq.push(['_trackEvent', 'Google +', 'Google +1 Page', 'Google']);"

onClick="_gaq.push(['_trackEvent', 'Connect with us on LinkedIn', 'Social Media', 'LinkedIn']);"
onClick="_gaq.push(['_trackEvent', 'Google+', 'Social Media', 'G+']);"
onClick="_gaq.push(['_trackEvent', 'Visit our Facebook page', 'Social Media', 'FB']);"
onClick="_gaq.push(['_trackEvent', 'Follow us on Twitter', 'Social Media', 'Twitter']);"
onClick="_gaq.push(['_trackEvent', 'Visit our Youtube Channel', 'Social Media', 'YouTube']);"
onClick="_gaq.push(['_trackEvent', 'Pinterest', 'Social Media', 'Pinterest']);"

5)Videos
onClick="_gaq.push(['_trackEvent', 'Videos', 'Play', 'Baby\'s First Birthday'])"


Each piece of custom code contains four elements [Category | Action | Label | Value]. Taking the first example above:

_trackEvent - The Method

Analysis - The Category

Click Link - The Action

Analysis Request - The Label

------------------------------------------------------------

Value - An optional integer (not a string), passed as a fifth parameter, e.g. _gaq.push(['_trackEvent', 'Videos', 'Video Load Time', 'Gone With the Wind', downloadTime]);

Non-Interaction - An optional boolean flag; set it to true when the event should not count as an interaction. A rough sketch follows the divider.

------------------------------------------------------------
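Putting the optional value and the non-interaction flag together, a minimal sketch (again assuming the classic _gaq queue used above; the value 30 and the true flag are placeholder choices):

<!-- 30 is the integer value; true marks this as a non-interaction event -->
onClick="_gaq.push(['_trackEvent', 'Videos', 'Play', 'Baby\'s First Birthday', 30, true]);"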

If you're not tech savvy or do not have the technical skills, you can take the help of a web publisher.


P.S.:

>>Analytics should already be set up.

>>If you have no tech skills, don't worry. There are many plugins available for WordPress sites, but they also need testing and monitoring after installation.

>>This feature can also be used with Google Tag Manager.

Thursday, July 21, 2016

Schema Tag: Usage & Meaning Simplified!

Schema translates your data into structured data. We implement schema tags on the website's information so that search engines can understand it better. Schema is part of structured data mark-up; with the help of schema tags, we can represent our data in a better format.

There are three major types of structured data mark-ups:

1. JSON-LD(Recommended by Google) - JavaScript Object Notation for Linked Data
2. Micro-Data
3. RDFa (Resource Description Framework in Attributes)

These formats make sure that your data mark-up is well formed and can be processed by Google and other search engines. Schema is often referred to as a micro-data format, and it is used by the majority of SEOs and web developers worldwide. Schema.org is an open community effort to promote standard structured data across a variety of formats, and there you can find almost everything about schema tags and their usage.

You can implement data mark-ups on the following properties of your website (a rough JSON-LD sketch follows the list):

1. Internal Search Box
2. Ratings/Reviews
3. Physical Location/Address
4. Products (for e-commerce websites)
5. Future Events
6. Recipes (Ideal for food blog or cooking related websites)
7. Social Profiles (It shows only when your profiles are verified and tested)
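As a minimal, hypothetical JSON-LD sketch covering a physical address and a rating (the business name, address and figures are invented placeholders; always test such a block before publishing it in your page's <head>):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Plumbing",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "postalCode": "12345"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
</script>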

The next step is to make sure your structured data is always tested and validated. Google provides a Structured Data Testing Tool for checking your mark-up before you publish it. Testing your data in this tool makes it more eligible for rich results, and the tool also flags any errors or corrections needed.

In simple words, it means making your data intelligible to search engine crawlers/robots and other online resources.


P.S. Creating Quality Pages Always Helps! 

Tuesday, April 12, 2016

Initial SEO Work Activities for 2016 - The Check-list!


Many people are not even aware of SEO, while some beginners struggle with it. The question is: where do you start the process of search engine optimization? Today, I'll walk you through the basics of the initial SEO work you can apply to your new website in order to earn high rankings on search engine results pages. The following list will get you started!

1. Domain Authority
>>You can use a bulk DA checker (http://www.bulkdachecker.com/) to check the current domain authority score of your website. It may increase or decrease over time.

2. Page Authority
>>Just as you checked the domain authority, you can check the page authority. The difference is scope: one covers the entire domain, the other a single web page.

3. Back-links Profile
>>Back-links have always been a constant and important factor. You can analyse your back-link profile with the help of tools like Ahrefs, Google Search Console, etc.

4. Link Saturation
>>Didn't understand the term? Don't worry! It refers to the number of your website's pages that are indexed by search engines.

5. Domain Age / Domain Creation
>>Search engines give preference to older domains that are registered for a long tenure, so domain age also matters to some extent in achieving higher rankings.

6. Domain Expiry
>>Domains can be registered for up to 10 years, so it is preferable that the domain you book is registered for as many years as possible.

7. Google Analytics
>>Google provides a great tool to analyse traffic and its various referral sources. Install the provided code in the <head> section of all your web pages; a rough sketch follows.
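>>As a sketch only: the real snippet, including your own property ID, is generated inside your Analytics account, so treat the lines below and the UA-XXXXX-Y placeholder as illustrative:

<script>
  // Queue commands until analytics.js loads, then create the tracker and record a pageview
  window.ga = window.ga || function(){ (ga.q = ga.q || []).push(arguments); }; ga.l = +new Date;
  ga('create', 'UA-XXXXX-Y', 'auto');
  ga('send', 'pageview');
</script>
<script async src="https://www.google-analytics.com/analytics.js"></script>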

8. Hidden Text / Links
>>Hidden text or hidden links on the website are not recommended and are considered a black-hat technique. To stay on the safe side, make sure no hidden text or links are present on the website or any of its pages.

9. JavaScript / CSS
>>Whether search engines can fully understand or read JavaScript and CSS is a subject of ongoing debate. Either way, it is better to externalize them (separate them from the HTML files) for better crawl-ability.

10. Website Load Time
>>Also known as page speed. A website or web page should render quickly and take minimum time to load in browsers.

11. Mirror Sites
>>Avoid mirror (duplicate) sites; a website with unique appeal and content attracts more visitors.

12. Check Server Status/htaccess

13. Similar Sites on Current Domain

14. Website Content
>>Review the content of all web pages, including the home page and the service pages.

15. Duplicate Content/Plagiarism
>>Unique, fresh content always helps and is a basic factor for ranking in search engines.

16. Custom 404 Error Page
>>A custom 404 page helps users navigate back into the website. When a user lands on a page that does not exist, it tells them so clearly. This custom page returns the 404 status code.

17. 301 Redirect
>>A 301 redirect is a permanent redirection from one URL to another; a rough sketch follows.
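>>A sketch of one common way to do this on an Apache server via the .htaccess file (the old and new URLs are placeholders; other servers use different directives):

# Permanently (301) redirect an old page to its new address
Redirect 301 /old-page.html http://www.example.com/new-page/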

18. Google Webmaster Tools/Search Console Verification
>>Google Webmaster Tools, now known as Google Search Console, is one of the best tools Google provides to webmasters; verify your site in it as an early step.

19. XML Sitemap
>>A site map in XML (Extensible Markup Language) file format, placed in the root directory of the web server for search engine crawlers; a minimal sketch follows.
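>>A minimal, hypothetical sitemap.xml sketch (the URLs and dates are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2016-04-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/services/</loc>
    <lastmod>2016-04-01</lastmod>
  </url>
</urlset>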

20. Simple/HTML Sitemap
>>A site map in HTML (HyperText Markup Language) file format, placed in the root directory of the web server for human visitors.

21. Canonicalization
>>Webmasters use canonicalization to avoid duplication between two versions of the same page, e.g. the home page and the index page; an example follows.
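>>For example, if both example.com/ and example.com/index.html serve the same content, each version can carry this tag in its <head>, pointing to the preferred URL (the domain is a placeholder):

<link rel="canonical" href="http://www.example.com/">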

22. Social Widgets
>>Social widgets are always recommended to enhance the user experience. We can install widgets like the Google +1 button, the Facebook Like button and posts, Twitter tweets, etc., which also help us gain new likes and sometimes traffic too!

23. Social Profile Icons
>>Always link your social profiles to your website; this helps search engines find the correct profiles for your brand.

24. Social Profile Schema
>>Social profile schema informs Google about our official social profiles, which also helps with branding: a branded search can then show the profiles in the Knowledge Graph panel of the results pages. A rough JSON-LD sketch follows.
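>>A minimal, hypothetical sketch using JSON-LD and the sameAs property (the organization name, website and profile URLs are placeholders):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Acme Plumbing",
  "url": "http://www.example.com",
  "sameAs": [
    "https://www.facebook.com/acmeplumbing",
    "https://twitter.com/acmeplumbing",
    "https://plus.google.com/+AcmePlumbing"
  ]
}
</script>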

25. Schema Tag
>>Google supports structured data mark-up in microdata (and other) formats, including schema markup for reviews/testimonials, addresses, etc.

26. Robots.txt File
>>Robots.txt is a directive file that tells search engine robots which parts of the website to crawl or not to crawl.

27. Noindex – Nofollow Tag
>>This is a similar concept to the robots.txt file; the only difference is that one is a file and the other a tag, and both control indexing and link following by crawlers. Below is the tag, which can be included in the <head> section of a page.

<meta name="robots" content="noindex, nofollow">

28. Broken Link Checker/Dead Links
>>A link that points to a page that no longer exists is a broken link. Rectify broken links as soon as possible, because they hurt the structure of the website.

29. Integration of an Internal Blog
>>An internal blog helps increase the website's volume and also drives traffic to the site through fresh content.

30. External/Outgoing Links

31. Dynamic Pages
>>From a search-friendly perspective, static web pages are beneficial. Dynamic pages do not exist as actual pages on the web server, so static web pages are always recommended.

32. Frames/iFrames
>>Robots cannot crawl content inside frames/iFrames on web pages, so it is recommended to externalize or remove them if possible.

33. Javascript Menus
>>As with iFrames, robots find JavaScript menus difficult to crawl, so it is recommended to externalize or remove them if possible.

34. Contact Form and Tracking
>>To analyse the traffic that comes through contact form submissions, implement form tracking in Google Analytics.

35. Duplicate Meta Tags
>>Concise, unique meta tags always help increase CTR, so give each web page its own meta tags that clearly explain the page's intent.

That's all for now ;) You can also use this check-list to ensure that all the basics of your website are covered.