All posts by Yuvaraj

PROBESEVEN - SEO Trends 2015 – 3 Fields of Approach for Every Online Marketer

If you are an online marketer, or simply want to know the new trends in SEO, this read will outline what to expect. Whether you have a smart brain or a well-experienced mind, SEO tactics are dynamic and need to be revisited frequently.

It is quite well known that SEO is indispensable: it adds structure and provides clues that help search engines treat your website with the sophistication it deserves. Without SEO, even a content-rich website can become invisible and obscure. So here is what to add to your thinking for the latest trends in 2015.

Search Engine Algorithms

Search engine algorithms evaluate everything that is done in SEO and reflect it in the SERPs. Every online marketer is aware of search engine algorithms and their implications, yet hardly anyone knows them in full; experienced marketers keep up by making intelligent guesses about how they work. Search engines update their algorithms frequently, which forces online marketers to constantly and promptly refresh their knowledge. Search engines run more than ten such algorithms, and if a website falls foul of any one of them, its ranking in the SERPs automatically drops.

Mobile Online Marketing

Mobile SEO is the new iteration of online marketing. Marketers should work out how to optimize for mobile, because the share of visitors on computers and laptops is shrinking while visits from mobiles and tablets grow day by day. In the near future more than fifty percent of visitors will come from mobile devices. Marketers therefore advise their clients to adopt a responsive design. When a website is responsive, visitors find it user-friendly and the bounce rate stays at a healthy level. A minimal sketch of what this usually involves is shown below.
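
As a minimal sketch only (the 768px breakpoint and the .sidebar class are placeholder choices, not requirements), a responsive design typically starts with a viewport meta tag and CSS media queries:

  <!-- Tell mobile browsers to render at the device width instead of a zoomed-out desktop view -->
  <meta name="viewport" content="width=device-width, initial-scale=1">

  <style>
    /* Illustrative breakpoint: stack the sidebar below the main content on small screens */
    @media (max-width: 768px) {
      .sidebar { float: none; width: 100%; }
    }
  </style>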

On-Page SEO with Schema

On-Page Optimization is a traditional technique, but every SEO marketer still considers it an effective way to rank in the SERPs, because meta tag and content optimization is how you describe your website to search engines and visitors. For visitors this technique may be enough, but for search engines it often is not, since they still struggle to crawl a website and find its exact business information. To help with this, Google, Yahoo and Bing introduced schema markup to convey a website's exact information. When a website carries schema markup it tends to rank better in the SERPs, because schema is another way of describing your website's information to search engines. To know more about schema markup, visit http://schema.org/.
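
As a rough illustration (the organization name, URL and phone number are placeholders, and the properties you actually need depend on your business), schema markup is commonly added as a JSON-LD block in the page head:

  <script type="application/ld+json">
  {
    "@context": "http://schema.org",
    "@type": "Organization",
    "name": "Example Company",
    "url": "http://www.example.com",
    "telephone": "+1-555-000-0000"
  }
  </script>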

I hope you have set your moves to improve the SEO of your website. You will certainly have points of your own to share, and I welcome them to make this more informative.

PROBESEVEN - Google rolling out Penguin 3.0

After more than a year, Google has unleashed Penguin again. On October 17, 2014 Google rolled out and confirmed a Penguin refresh. Many online marketers, website owners and webmasters had been eagerly waiting for this update. This is the sixth Penguin release from Google, so it is being treated as Penguin 3.0, with an impact on roughly 1% of queries. Penguin 3.0 is rolled out across the board, irrespective of industry, and is not limited to any particular type of business.

The Penguin refresh is still rolling out and is expected to continue for a couple of weeks. Penguin is primarily designed to decrease the SERP rankings of websites that violate Google's guidelines. If a website is affected by a Penguin release or update, the owner or webmaster should take immediate action against its spammy links.

Google commonly treats the following backlink methods as spammy:

• Low-quality free directory, bookmarking and article submission websites.
• Reciprocal links, paid links and sponsored links.
• Links from outlawed, penalized and duplicate-content websites.
• Links from irrelevant or unnatural websites.
• Link-scheme methods and guest blog sites.
• Off-page black hat techniques such as keyword stuffing in anchor text.
• Any link building method that violates Google's guidelines.

The website owner or webmaster has to clean up the site's bad backlink profile using the Google disavow tool and wait for the next Penguin release. If the spammy backlinks are cleaned up properly, Google re-evaluates the site and can restore its ranking in the upcoming Penguin release.

Quality link building methods for Penguin:

✔ Niche, high-PR inbound links are highly valuable for any business website; relevant, quality business links send the strongest backlink signals.
✔ Links from related business websites help a site rank more quickly in the search engine result pages.
✔ Your link building should differ from your competitors' link building. While building links, promote the brand name of the business as well as its keywords, because branded links make your brand clear to Google (see the sketch after this list).
✔ When keywords are promoted together with the brand name, the website is far less likely to be affected by a Penguin update or refresh.
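
As a simple illustration (the URL and wording are placeholders), branded anchor text mixes the brand name with the keyword instead of repeating the bare keyword alone:

  <!-- Keyword-only anchor text: risky when overused -->
  <a href="http://www.example.com/">web design services</a>

  <!-- Branded anchor text: brand name plus keyword -->
  <a href="http://www.example.com/">Example Company web design services</a>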

Many people are confused about the difference between an algorithm update and an algorithm refresh from Google. A refresh is a re-run of the same algorithm that was executed previously; nothing changes in its calculations or signals. An update adds new factors and modifies the algorithm's computation and signals. In the case of this Penguin release, Google is refreshing the same signals, so it is a refresh, not an update.

Though it was initially declared that about 1% of English queries were affected by the Penguin 3.0 roll-out, the exact impact is not yet final, as the refresh is still happening now and then. If your website has been penalized in this latest release, address the penalty by cleaning up bad links and building strong, effective backlinks with more of your brand name in the anchor text, so the site can withstand future updates to the search algorithm.

PROBESEVEN - How to use Google Disavow Tool

The disavow tool introduced by Google is used to address problems with a website's inbound links. Inbound links to a website are a well-known signal for moving up in search results and increasing PageRank. If a website builds unnatural backlinks such as paid, reciprocal or spammy links, or uses other link schemes that violate Google's guidelines, the website automatically drops off in the search engine result pages and its PageRank suffers along with it.

Nowadays Google acts aggressively to point out harmful third-party inbound links. When Google identifies low-quality links pointing to your site, it sends a toxic-links notification to the site owner via Webmaster Tools. Once the message is received, the website owner should take immediate action to find and remove the low-quality inbound links Google has flagged. This is the most direct way to deal with toxic links.

The links submitted in the Google Disavow tool are not removed permanently from Google's index, but the link value they pass is no longer counted. Google effectively treats the specified low-quality links as no-follow, and when a link is treated as no-follow, no value passes to the website. Hence the website stays safe from future search engine algorithm releases.

When uploading toxic links in Webmaster Tools, there is no need to include links that are already no-follow.

Rules of Google Disavow Tool Usage

  • The disavow file uploaded in Webmaster Tools should be a plain text file (.txt).
  • Each line of the file must contain a single URL or domain, and every comment line must start with the “#” symbol.
  • To disavow every link from a domain, prefix the entry with domain:, e.g., domain:example.com.
  • To disavow only specific pages of a website, list the individual page URLs, e.g., http://example.com/webpage.html.
  • Do not exceed the maximum upload size limit of 2 MB. (A sample file follows this list.)
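
Putting those rules together, a minimal disavow file (the domains and URLs below are placeholders) might look like this:

  # Paid directory links pointing to our site; owner contacted, no response
  domain:spammy-directory-example.com
  domain:low-quality-bookmarks-example.net

  # Individual spammy pages rather than whole domains
  http://article-farm-example.org/cheap-links-page.html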

Procedure to Disavow low quality Back-links

There is a two-step procedure to disavow backlinks:

1)   Download the list of backlinks from Webmaster Tools.

  • Click your website on the Webmaster Tools home page.
  • On the left of the website dashboard, open the Search Traffic menu and choose “Links to your site”.
  • Click “More” under “Who Links Most” and download the latest links.

2)   Upload the file to the Google Disavow tool.

  • Create a plain text file containing the list of inbound links you want to disavow.
  • While specifying the links, add a comment noting why you want each backlink removed.
  • Log in to your Google Webmaster account, go to the Disavow Tool page and upload the list of links you want to disavow.

If the site owner mistakenly includes some quality websites in the disavow file, Google will crawl those links and stop counting them as well. In that case, the site owner should download the disavow file, remove the quality URLs and re-upload it in Webmaster Tools. Google will then recrawl those URLs, treat them as followed links again and count them as valuable backlinks.

Readers are welcome to share their previous experiences with the Google Disavow tool in the comments section!

PROBESEVEN - How to do On-Page Optimization in a day

The SEO process basically includes on-page and off-page optimization, and the two are instrumental in any SEO project. While off-page work consumes many days, on-page work takes minimal time yet supports the SEO process greatly and gives better output. One important thing to remember is that a web page should not be optimized for more than four or five keywords.

Let's separate on-page work into two agendas.

Before starting on-page optimization, analyze the following points on your website:

  • Check your site in Webmaster Tools. Analyze the crawl errors, security issues and HTML improvements reported for your website.
  • Check for duplicate and thin content on every page, especially the landing pages. Good tools for checking your content are Copyscape and the Small SEO Tools site.
  • Analyze your landing page URLs, which should be search-engine friendly; search engines give priority to static URLs over dynamic ones. If your pages have dynamic URLs, implement 301 redirects and move the pages to static URLs (a sketch of the redirect, canonical tag and robots.txt follows this list).
  • Check the internal links on your website and ensure the pages are interconnected in some way.
  • Check for canonical issues on the site: if the site has duplicate pages, add a canonical tag to the duplicates to tell search engines that each one is just another version of the original page for visitors, not duplicate content.
  • Check for broken links across the entire website. If you encounter any, remove them immediately, since broken links cause problems in the SERP results.
  • Check for a robots.txt file on the site. If one already exists, validate it line by line; if not, create one and clearly mention the pages you want to disallow.
  • Consider the text-to-HTML ratio. The content of the website should never be too low compared to its code; content should account for at least 15% of the page.
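
As a rough sketch under assumed names (example.com, /old-page.php, /new-page/ and the disallowed folders are all placeholders), the 301 redirect, canonical tag and robots.txt mentioned in the list above usually look like this:

  # .htaccess (Apache): one way to 301-redirect an old URL to its static replacement
  Redirect 301 /old-page.php http://www.example.com/new-page/

  <!-- In the <head> of a duplicate page: point search engines at the original version -->
  <link rel="canonical" href="http://www.example.com/new-page/">

  # robots.txt: allow everything except the sections you want kept out of the crawl
  User-agent: *
  Disallow: /admin/
  Disallow: /search-results/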

Now begin the on-page optimization work.

After the website analysis, choose the landing pages you want to rank in the SERPs. For most websites the main landing page is the home page, but it depends on which business page the customer should enter through.

  • Optimize the meta tags on the landing pages. The meta title should be under 70 characters and the meta description around 150 characters, while meta keywords have no limit; Google does not consider meta keywords, though some other search engines still give them weight.
  • Create an XML sitemap, an HTML sitemap and an RSS feed sitemap. The XML sitemap lets search engines easily identify your pages, the HTML sitemap helps visitors use the site, and the feed sitemap tells search engines which pages on your website are important.
  • Place the important keywords in heading tags, ALT attributes and anchor text, because these highlight the important words on your page to the search engine.
  • Optimize the content of the landing pages with keywords used naturally; over-optimization makes your site look spammy, so use keywords in moderation and keep keyword density around 3-5% (roughly 15-25 occurrences in a 500-word page, for example). Concentrate on including long-tail and semantic keywords as well. A markup sketch for these points follows this list.
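
As a minimal markup sketch (the title, description, keyword, file names and dates are placeholders), the meta tags, heading tag, ALT attribute, anchor text and a single XML sitemap entry described above could look like this:

  <head>
    <title>Web Design Services | Example Company</title>
    <!-- Meta description kept under roughly 150 characters -->
    <meta name="description" content="Example Company builds responsive, SEO-friendly websites and landing pages for small businesses.">
  </head>

  <h1>Web Design Services</h1>
  <img src="portfolio-sample.jpg" alt="responsive web design example">
  <a href="http://www.example.com/web-design/">web design services</a>

  <!-- One entry inside the <urlset> of sitemap.xml -->
  <url>
    <loc>http://www.example.com/web-design/</loc>
    <lastmod>2015-01-15</lastmod>
  </url>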

The on-page work specified above is basic but crucial for any SEO project; completing it means you are halfway through the full SEO process that supports keyword rankings in the SERPs. Off-page optimization is the next plan: it popularizes the website and encourages crawlers to visit it frequently. However, laying a strong foundation with the right on-page optimization is mandatory to reap the real benefits of SEO.

PROBESEVEN - Google’s Hummingbird – All you want to know

Hummingbird is a new search algorithm unleashed by Google. Google celebrated its 15th birthday on September 27, 2013, and announced the algorithm's release the same day. The concept behind this update is to process conversational queries, including voice input in the search box, and retrieve results that match the whole query, which makes search feel like an interaction between Google and the user aimed at relevant results. Google built a number of advanced changes into this algorithm to make search results more relevant and convincing, working together with the existing Panda and Penguin updates. Google voice search handles conversational queries better than typed-in keywords, so users can ask full questions rather than enter keywords and still get accurate results, which naturally increases mobile traffic. Hummingbird is one of the biggest revisions Google has made in its history, and after the update Google has become an even more powerful search engine in the market.

Hummingbird Impact on SEO

Whenever Google updates its search algorithm, SEO goes through either a tough or a smooth phase. In recent years, starting from 2010, Google has kept refining its results with Panda and Penguin, and now Hummingbird is set to give webmasters a fresh set of challenges in SEO. The explicit truth of Hummingbird is the heavy weight given to long-tail keywords compared to short keywords; how well a site serves long-tail queries is one of the major factors Google will consider when listing websites. Panda and Penguin were built on the existing algorithm, but Hummingbird is different: it is a completely new algorithm, not a tweak to an existing one. From now on, high-quality content with strong backlinks alone will not move your website to the first position in Google unless you also find and answer the phrases people use in conversational voice search.

How to Face Hummingbird

Previously we optimized a web page around keywords, but the next kind of strategy is explained here, so SEO professionals can get a clear idea of Hummingbird and how conversational search works. The target should be natural questions: find out what questions end users ask Google about a specific topic, business or service, and include the answers to those questions in the website content. Focus on those search terms and provide precise, to-the-point answers on the specific topic to ride out the Hummingbird effect. Don't limit yourself to short keywords and worry only about their placement in Google's SERPs. Web page content is the major ranking factor in SEO, so spend quality time revising and enriching your content to earn rankings in a natural and organic way.

SEO Strategies 2014

In 2013 Google rolled out many significant updates for search engine optimization and online marketing; the major ones included the Hummingbird, Penguin and PageRank updates. In 2014, concentrate on long-tail keywords, Google+ and a more advanced link building phase, which will remain worthwhile for SEO in the coming years, so SEO professionals should make way for these new strategies in their SEO thinking.