A Guide to Google Best Practices

ON-PAGE ACCURACY

Google isn't actually conjuring up new ways to punish people for ridiculous violations. Its algorithm is designed and maintained to rank sites based on quality and accuracy.

The best way to make sure that your site is following Google's Best Practices is to be accurate with your on-page SEO and to publish quality content.

It's really just common sense, with a few added inside tips from Google on how to operate best with its search engine algorithm.

Here are some tips on how to maintain On-Page Accuracy while optimizing your site for Google.

Keywords

Choose a keyword that accurately describes the content on your page; don't just pick one based on search volume and competition.

Titles

Make sure that your page titles include that keyword and accurately represent the content on those pages. If someone is looking for a service that you provide, and you make it clear on your website that you provide that service, they will find you. That's how it works.
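
For illustration, a title tag for a hypothetical plumbing company targeting the keyword "drain cleaning in Denver" (both the business and the keyword are made up) might look like this:

    <head>
      <!-- Title includes the target keyword and describes the actual page content -->
      <title>Drain Cleaning in Denver | Smith Plumbing Co.</title>
    </head>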

Headers

Use <h1> <h2> <h3> and <p> tags to organize your content in a way that will help outline it for your reader. This will also help the search engine crawlers to understand your content and accurately rank it for search engine users.
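
As a rough sketch, a service page for the same hypothetical company could be outlined like this, with one <h1> for the main topic and <h2>/<h3> tags for supporting sections:

    <h1>Drain Cleaning in Denver</h1>
    <p>A short introduction to the service and who it's for.</p>

    <h2>Our Drain Cleaning Process</h2>
    <p>What customers can expect, step by step.</p>

    <h3>Camera Inspection</h3>
    <p>Details about one specific step in the process.</p>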

Meta Descriptions

Use the meta description to inform web users about what the page covers. You want to drive the right consumers to your site, and Google will reward you for accuracy. Place the proper keyword in the meta description, but don't stuff in every keyword you can think of that users "might" search for. You will lose points with Google, and useless keywords won't help drive the right kind of traffic to your site.
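
Continuing the same hypothetical example, the first snippet below is an accurate meta description, while the second shows the kind of keyword stuffing to avoid:

    <!-- Accurate: describes the page and includes the target keyword once -->
    <meta name="description" content="Same-day drain cleaning in Denver from Smith Plumbing Co. Licensed, insured, and upfront pricing.">

    <!-- Stuffed: a pile of keywords that doesn't describe the page -->
    <meta name="description" content="drain cleaning, cheap plumber, best plumber Denver, clogged drain, sewer, pipes, emergency plumber">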

URLs

Simple, easy-to-understand URLs are best, both for users and for search engine crawlers. They should contain your keyword, which, again, should accurately describe the content on your page. Avoid including session IDs and unnecessary parameters; shorter and simpler is usually better.
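
For example (hypothetical domain and paths), compare a clean, descriptive URL with a parameter-heavy one:

    Easy to understand:  https://www.example.com/services/drain-cleaning
    Hard to understand:  https://www.example.com/index.php?id=883&sessionid=A8F2K1&ref=nav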

HUMANS VS ROBOTS

It is important to understand that Google uses search engine crawlers and algorithms to sort through sites, but it is all in the name of helping humanity.

Websites are made for human readers, so in all of your optimizing for search engine robots, don't forget to make your site appealing to humans. It should actually be the priority, and typically, the more optimized a site is for human interaction, the more the robots will reward you with rankings and traffic.

Here are some easy ways to optimize for humans.

Navigation

Keeping your site navigation organized and as simple as possible makes a huge difference in how a web user interacts with your site. It also helps Google's crawlers understand your site better and rank it accurately. Think accurate titles and an organized hierarchy based on what content is most important and what will make sense to the people navigating it.
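
A simple navigation sketch (the labels and paths are hypothetical) that keeps the hierarchy shallow and descriptive:

    <nav>
      <ul>
        <li><a href="/">Home</a></li>
        <li><a href="/services/">Services</a>
          <ul>
            <li><a href="/services/drain-cleaning">Drain Cleaning</a></li>
            <li><a href="/services/water-heaters">Water Heaters</a></li>
          </ul>
        </li>
        <li><a href="/about">About</a></li>
        <li><a href="/contact">Contact</a></li>
      </ul>
    </nav>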

404 Page

Create a custom 404 page that tells users that the link they followed is bad. Make sure it displays a way for them to navigate your site, so that they can track down the right page URL. Make it personalized to fit the theme of your company.
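
A bare-bones sketch of what such a page might contain (the wording and links are placeholders); it also helps if the server returns an actual 404 status code so crawlers know the page doesn't exist:

    <h1>Sorry, we couldn't find that page.</h1>
    <p>The link you followed may be broken, or the page may have moved.</p>
    <p>Try the <a href="/">homepage</a>, browse <a href="/services/">our services</a>, or use the search box below.</p>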

Helpful Blog Content

Use a blog to give consumers and other industry influencers helpful information. Don't use it to plug your services and your brand; that's what the rest of your site is for. Help your customers and others in your field, and they will be more trusting and willing to help you out or purchase your services and products.


MEET THE GOOGLE ZOO

Every search engine has its own algorithm that determines how highly a site ranks in its SERPs (search engine results pages). Most of Google's major algorithm updates are codenamed after animals.

Panda, Penguin, Hummingbird, and Possum are the most important updates to understand in order to follow Google's Best Practices for search engine optimization.

THE PANDA

Google's Panda algorithm first launched in February 2011 and was designed to demote sites with low-quality content. The update hit content farms and content aggregators hardest.

Issues that trigger a Panda penalty:

  • Spun or automated content - posts are duplicated or incredibly similar with slight keyword variations.
  • Duplicate content - where most of a site's content is unoriginal
  • Aggregate content - where the majority of content is pulled from other sources
  • Low-quality or irrelevant content - unhelpful content that people leave quickly
  • Thin content - barely any information is provided

How Panda tracks these issues:

  • High bounce rates from your site and individual pages
  • Low time on your site and individual pages
  • Low amount of return visitors
  • Low click-through rates

THE PENGUIN

Google's Penguin algorithm was first released in 2012 and deals specifically with unnatural backlink patterns. Before 2016, Penguin was only updated about once a year. That meant that if your site received a Penguin penalty for having low-quality or unnatural backlinks, it would be a year before you could recover and know whether or not your changes made a difference.

The fourth update integrated Penguin into Google's core algorithm, which means that Penguin 4.0 now operates in real time. Every time Google crawls and caches a page, the Penguin criteria are applied.

Issues that trigger a Penguin penalty:

  • A large number of low-quality links
  • Sneaky redirects
  • Hidden text or links
  • Doorway pages

THE HUMMINGBIRD

Google's Hummingbird Update was not just an adjustment to one particular part of the Google algorithm; it was a complete overhaul. Aside from the Panda and Penguin updates that were already in place, the entire algorithm was adjusted.

The goal of Hummingbird was to change how search criteria are generated from user queries. For example, if someone searched for "Best place in San Diego for pizza," the Hummingbird algorithm translated "place" and "pizza" as meaning the user was looking for a restaurant, so restaurants would be added to the sorting criteria.

The main reason for this update was to work with voice queries on phones, which became more standard after Siri was developed and released.

There aren't really Hummingbird "penalties," as the purpose of the update wasn't to be punitive toward black hat SEO tactics.

The best way to utilize Hummingbird in your SEO efforts is simply to write content that answers people's commonly searched questions, not just keywords. However, this is already a basic SEO practice that you should be using, so the ramifications are small compared to the Penguin and Panda updates.

THE POSSUM

Google's Possum Update was an update to how the algorithm handles local search results. It replaced Google's last location update, called Pigeon, which was released in 2014.

The revision radically changed how Google treats and filters out duplicate results. If there are multiple business listings across the web with slightly different names or different phone numbers, hours, email addresses, etc., they are filtered out.

Making sure that your business's information is consistent across the web will keep you in good standing with the Possum and can really boost your rankings with Google.
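
One practical piece of that consistency is publishing the exact same name, address, and phone number on your own site that appear in your listings elsewhere. A sketch with made-up details:

    <footer>
      <p>Smith Plumbing Co.</p>
      <p>123 Main St, Denver, CO 80202</p>
      <p>(303) 555-0142</p>
    </footer>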

© 2016 Andrew Lowen
