Thursday, January 21, 2016

Actionable guide to SEO in 2016

2015 was a relatively calm year for SEOs. But no matter how peaceful the current SEO landscape looks, that doesn't mean you can lean back in your chair and relax!
Penguin and Panda are still out there, along with newer ranking signals (like mobile-friendliness or HTTPS). So, to help you catch the wind and brush up your SEO skills, our team of SEO specialists has prepared a list of recommendations SEOs should focus on right now.

CHAPTER 1 Be findable
The rule is simple — search engines won't rank your site unless they can find it. So, just like before, it is extremely important to make sure search engines are able to discover your site's content — and that they can do that quickly and easily. And here's how.

1. Keep a logical site structure

 Good practice
  • The important pages are reachable from the homepage.
  • Site pages are arranged in a logical tree-like structure.
  • The names of your URLs (pages, categories, etc.) reflect your site's structure.
  • Internal links point to relevant pages.
  • You use breadcrumbs to facilitate navigation.
  • There's a search box on your site to help visitors discover useful content.
  • You use rel="next" and rel="prev" to convert pages with infinite scrolling into paginated series.
 Bad practice
  • Certain important pages can't be reached via navigational or ordinary links.
  • You cram a huge number of pages into one navigation block — an endless drop-down menu or something like this.
  • You try to link to each & every inner page of your site from your homepage.
  • It is difficult for users to move back and forth between site pages without resorting to the browser's Back and Forward buttons.
An example of a clean URL structure:
www.mywebsite.com/product-category-1/product-1
www.mywebsite.com/product-category-2/product-3
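For the pagination point above, a paginated series can be declared with rel="prev" and rel="next" link elements in each page's head. A minimal sketch (the URLs are placeholders in the style of the examples above):

```html
<!-- On page 2 of a paginated category (example URLs are placeholders) -->
<head>
  <link rel="prev" href="http://www.mywebsite.com/product-category-1?page=1">
  <link rel="next" href="http://www.mywebsite.com/product-category-1?page=3">
</head>
```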

2. Make use of the XML sitemap & RSS feeds

The XML sitemap helps search bots discover and index content on your site. This is similar to how a tourist would discover more places in an unfamiliar city if they had a map.
RSS/Atom feeds are a great way to notify search engines about any fresh content you add to the site. In addition, RSS feeds are often used by journalists, content curators and other people interested in getting updates from particular sources.
Google says: "For optimal crawling, we recommend using both XML sitemaps and RSS/Atom feeds. XML sitemaps will give Google information about all of the pages on your site. RSS/Atom feeds will provide all updates on your site, helping Google to keep your content fresher in its index."
 Good practice
  • Your sitemap/feed includes only canonical versions of URLs.
  • While updating your sitemap, you update a page's modification time only if substantial changes have been made to it.
  • If you use multiple sitemaps, you add a new one only when your current sitemaps have reached the URL limit (up to 50,000 URLs per sitemap).
  • Your RSS/Atom feed includes only recently updated items, making it easier for search engines and visitors to find your fresh content.
 Bad practice
  • Your XML sitemap or feed includes URLs that search engine robots are not allowed to index, as specified either in your robots.txt or in robots meta tags.
  • Non-canonical URL duplicates are included in your sitemap or feed.
  • In your sitemap, modification time is missing or is updated just to "persuade" search engines that your pages have been brought up to date, while in fact they haven't.
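A minimal XML sitemap illustrating the points above — canonical URLs only, with an honest modification date (the URL and date are invented for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Canonical URLs only; no noindexed or duplicate pages -->
  <url>
    <loc>http://www.mywebsite.com/product-category-1/product-1</loc>
    <!-- Update lastmod only when the page substantially changes -->
    <lastmod>2016-01-15</lastmod>
  </url>
</urlset>
```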

3. Befriend Schema markup

Schema markup is used to tag entities (people, products, events, etc.) in your pages' content. Although it does not affect your rankings, it helps search engines better interpret your content.
To put it simply, a Schema template is similar to a doorplate — if it says 'CEO Larry Page', you know whom to expect behind the door.
 Good practice
  • You review the list of available Schemas and pick the ones that apply to your site's content.
  • If it is difficult for you to edit the code on your own, you use Google's Structured Data Markup Helper.
  • You test the markup using Google's Structured Data Testing Tool.
 Bad practice
  • You use Schemas to trick search engines into believing your page contains the type of info it doesn't (for example, that it's a review, while it isn't) — such behavior can cause a penalty.
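As a sketch of what Schema markup can look like, here is a hypothetical product page tagged with schema.org vocabulary in the JSON-LD format (the product name and price are invented for illustration):

```html
<!-- Hypothetical product markup; adapt the type and properties to your content -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "description": "A short, honest description of the product.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```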

4. Leverage rich answers

In 2015, we observed growth in the number of rich answers in Google search results. There are various types of rich answers, but basically, a rich answer is a snippet that already contains a brief answer to the search query. It appears above the other organic search results and thus enjoys more exposure.
Any website has a chance to be selected for a rich answer. Here are a few things you can do to increase your chances of getting there:
1) Identify simple questions you might answer on your website;
2) Provide a clear direct answer;
3) Provide additional supporting information (like videos, images, charts, etc.).
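The three steps above can be sketched as a simple on-page pattern — the question in a heading, a concise direct answer right below it, then supporting detail (the copy here is purely illustrative):

```html
<h2>How do I reset my router?</h2>
<!-- A clear, direct answer in the first paragraph -->
<p>Hold the reset button on the back of the router for 10 seconds,
then wait for it to reboot.</p>
<!-- Supporting information: steps, images, charts, video -->
<ol>
  <li>Locate the reset button.</li>
  <li>Press and hold it for 10 seconds.</li>
  <li>Wait about a minute for the router to restart.</li>
</ol>
```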

CHAPTER 2
 Master Panda survival basics
"Panda" is a filter in Google's ranking algorithm that aims to sift out pages with thin, non-authentic, low-quality content. This means getting rid of thin content and duplicate content should be high up on your 2016 to-do list.

1. Improve content quality

 Good practice
  • These days, it's not enough to keep your content unique in the sense that it passes a plagiarism test. You need to create really useful, expert-level content and present it in the most engaging form possible.
  • You block non-unique or unimportant pages (e.g. various policies) from indexing.
 Bad practice
  • Your website relies on "scraped" content (content copied from other sites with no extra value added to it). This puts you at risk of getting hit by Panda.
  • You simply "spin" somebody else's content and repost it to your site.
  • Your website includes too many pages with little textual content.
  • Many of your site's pages have duplicate or very similar content.
  • You base your SEO strategy around a network of "cookie-cutter" websites (websites built quickly with a widely used template).
Keys4Seo tip:
Use WebSite Auditor to check your pages for duplicate content
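To block a non-unique or unimportant page (such as a policy page) from indexing, as suggested above, one common approach is a robots meta tag in the page's head:

```html
<head>
  <!-- Keep the page out of the index while still letting bots follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```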

2. Make sure you get canonicalization right

Canonicalization is a way of telling search engines which page should be treated as the "standardized" version when several URLs return virtually the same content.
The main purpose of this is to avoid internal content duplication on your site. Although not a huge offense, this makes your site look messy — like a wild forest in comparison to a neatly trimmed garden.
 Good practice
  • You mark canonical pages using the rel="canonical" attribute.
  • Your rel="canonical" is inserted in either the <head> section or the HTTP header.
  • The canonical page is live (doesn't return a 404 status code).
  • The canonical page is not restricted from indexing in robots.txt or by other means.
 Bad practice
  • You've got multiple canonical URLs specified for one page.
  • You've got rel="canonical" inserted into the <body> section of the page.
  • Your pages are in an infinite loop of canonical URLs (page A points to page B, page B points to page A). In this case, search engines will be confused by your canonicalization.
Keys4Seo tip:
Use WebSite Auditor to check your pages for duplicate rel="canonical" code
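A minimal rel="canonical" example for the good-practice points above — a single canonical link in the head, pointing at one live, indexable URL (the URLs are placeholders):

```html
<!-- On duplicate URLs such as www.mywebsite.com/product-1?sessionid=123 -->
<head>
  <link rel="canonical" href="http://www.mywebsite.com/product-1">
</head>
```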

CHAPTER 3 Learn to combat Penguin
Google's Penguin filter aims at detecting artificial backlink patterns and penalizing sites that violate its quality guidelines with regard to backlinks. So, keeping your backlink profile looking natural is another key point to focus on in 2016.
 Good practice
  • Your website mostly has editorial links, earned due to others quoting, referring to or sharing your content.
  • Backlink anchor texts are as diverse as reasonably possible.
  • Backlinks are being acquired at a moderate pace.
  • Spammy, low-quality backlinks are either removed or disavowed.
 Bad practice
  • Participating in link networks.
  • Having lots of backlinks from irrelevant pages.
  • Insignificant variation in link anchor texts.
Keys4Seo tip:
Check backlinks' relevancy with SEO SpyGlass
Keys4Seo tip:
Detect spammy links in your profile

CHAPTER 4 Improve user experience
Quite a few UX-related metrics have made their way into Google's ranking algorithm over the past years (site speed, mobile-friendliness, the HTTPS protocol). Hence, striving to improve user experience can be a good way to boost your search engine rankings.

1. Increase site speed

Quite a few factors can affect page loading speed. Statistically, the biggest mistakes site owners make are using huge images and loading large-volume multimedia or other heavy design elements that make the site as slow as a snail.
Use Google's PageSpeed Insights to test your site speed and to get recommendations on particular issues to fix.
Keys4Seo tip:
Optimize your pages' loading time with WebSite Auditor

2. Improve engagement & click-through rates

The Bing and Yahoo! alliance, as well as Yandex, have officially confirmed they consider click-through rates and user behavior in their ranking algorithms. If you are optimizing for any of these search engines, it's worth trying to improve these aspects.
While Google is mostly silent on the subject, striving for greater engagement and higher click-through rates tends to bring better rankings as well as indirect SEO results in the form of attracted links, shares, mentions, etc.

3. Consider moving your site to HTTPS

In August 2014, Google announced that HTTPS usage is treated as a positive ranking signal.
Currently there is not much evidence that HTTPS-enabled sites outrank non-secure ones. The transition to HTTPS is somewhat controversial, because:
a) Most pages on the Web do not involve the transfer of sensitive information;
b) If performed incorrectly, the transition from HTTP to HTTPS may harm your rankings;
c) Most of your site's visitors do not know what HTTPS is, so moving to HTTPS is unlikely to give you any conversion boost.

4. Get prepared for HTTP/2

HTTP/2 is a new network protocol set to replace the aging HTTP/1.1. HTTP/2 is substantially faster than its predecessor, so in terms of SEO, you may gain a ranking boost from the improved website speed.
On November 6, 2015, John Mueller announced in a G+ hangout that Googlebot will soon be able to crawl HTTP/2 websites. At the time of writing, about 70% of web browsers support HTTP/2. You can keep track of browser support for HTTP/2 on "Can I Use".
HTTP/2 is likely to become a "must" soon. Thus, keep an eye on the issue and be ready to implement this feature when required.

CHAPTER 5 Be mobile-friendly
The number of mobile searches may soon exceed the number of desktop searches. With this in mind, search engines in general and Google in particular love mobile-friendly websites.
Mobile-friendliness has become a minor ranking factor for the mobile SERPs. You can test if your website is mobile-friendly using Google's Mobile-Friendly Test.
On October 7, 2015, Google introduced the Accelerated Mobile Pages (AMP) project. As the name implies, it aims to provide a more streamlined experience for mobile users. The technology consists of three elements: special HTML markup, AMP JavaScript, and a content distribution layer (the latter is optional). AMP search is currently available only on mobile devices. You can give it a try at g.co/ampdemo.
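As a rough sketch of what the special AMP markup looks like at the time of writing (the URL is a placeholder, and the required AMP boilerplate style block is omitted here — see ampproject.org for the full template):

```html
<!doctype html>
<html ⚡>
<head>
  <meta charset="utf-8">
  <!-- Every AMP page links back to its regular (canonical) version -->
  <link rel="canonical" href="http://www.mywebsite.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <!-- The required AMP boilerplate <style> block goes here -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
</head>
<body>
  <h1>Hello, AMP</h1>
</body>
</html>
```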
 Good practice
  • Your page's content can be read on a mobile device without zooming.
  • You've got easy-to-tap navigation and links on your website.
 Bad practice
  • You are using non-mobile-friendly technologies like Flash on your webpages.
Keys4Seo tip:
Use mobile-friendly test in WebSite Auditor
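The "readable without zooming" point above usually starts with a responsive viewport declaration in the page's head:

```html
<head>
  <!-- Tell mobile browsers to match the device width instead of a desktop layout -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
```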

CHAPTER 6 Earn social signals — the right way
Search engines favor websites with a strong social presence. Your Google+ posts can make it to your Google connections' organic search results, which is a great opportunity to drive extra traffic. Although the likely effect of Twitter or Facebook links on SEO hasn't been confirmed, Google said it treats social posts (that are open for indexing) just like any other webpages, so the hint here is clear.
 Good practice
  • You attract social links and shares with viral content.
  • You make it easy to share your content: make sure your pages have social buttons, check which image/message is automatically assigned to the post people share.
 Bad practice
  • You are wasting your time and money on purchasing 'Likes', 'Shares' and other sorts of social signals. Both social networks and search engines are able to detect accounts and account networks created for trading social signals.
Keys4Seo tip:
See your site's social signals in SEO PowerSuite

CHAPTER 7 Revise your local SEO plan
In August 2015, Google reduced the number of results in the local pack from 7 to 3 and removed addresses and phone numbers. The search engine made it harder for SEOs to get into the local pack; however, a new map view has been added with up to 20 spots for search results.
What has also changed is that local rankings are now more dependent on the user's IP address. You can read more on how to adjust your local SEO strategy to Google's new update in this guide.
Keys4Seo tip:
Check website authority in SEO PowerSuite

What's coming in SEO in 2016?

Here are the main SEO trends for 2016, as predicted by our in-house SEO team:
SEO remains part of multi-channel marketing
Customers can find your business through social, paid search, offline ads, etc. Organic search is an integral part of a complex path to conversion. Just be aware of these other channels and get savvy in additional spheres, if necessary.
Google now gets searcher intent & context
The keyword is no longer the center of gravity of a ranking. Google now also looks at synonyms, your niche connections, location, etc. to see if you fit the bill (i.e. the query). The good news is that you don't need to get too hung up on specific keywords to prove you're relevant to a query.
The end of search monopoly might be near
According to comScore in 2015, both Yahoo! and Bing continued to steadily increase their search market share.
No quick results, no easy SEO
With its latest iterations of Panda & Penguin and the penalties Google dished out to link networks, the search engine widened its net for spam — and became even better at detecting sites with unengaging content or unnatural link patterns.
Traffic increasingly stays on Google
Google has definitely stepped up its efforts to provide immediate answers to people's searches. And, with the increasing number of rich answers, this tendency to take traffic away from publishers will likely grow.

Paid search expansion
A few years ago, Google changed Google Shopping's organic model to pay-per-click. It is possible that Google will make yet another organic vertical paid. Local Search is the best candidate for the change, since Local Search results are dominated by businesses selling a product or a service, and this vertical is innately commercial.
