How to Improve Your PageRank: A Comprehensive Guide to SEO
The Historical Timeline of SEO
Hypertextual Web Search
In 1998, in Brisbane, Australia, Sergey Brin and Lawrence Page presented a paper entitled "The Anatomy of a Large-Scale Hypertextual Web Search Engine" at the Seventh International World-Wide Web Conference (WWW 1998).
If the two people named above sound familiar, they should - Sergey and Lawrence were the two individuals who founded Google, and the paper they presented at the conference was their first public mention of the idea.
The two described Google as a "large-scale search engine which makes heavy use of the structure present in hypertext."
Hypertext is what links the topics you see on your computer screen to related information and graphics - text, photos, icons, and so on.
The idea behind Google's design is to "crawl" and "index" that structure, providing the end-user with better search results than what was offered by previous systems.
You can find the prototype, which included the full text and a hyperlink database comprising at least 24 million pages, at http://google.stanford.edu/.
Google Introduces Quality Content Guideline in the Early 2000s
Sergey and Lawrence's primary objective behind the Google brand was to ensure its users were provided quality search results. To ensure that this happened, Google issued a set of stringent guidelines for publishing quality content.
One of the ways Google measured PageRank was by tracking the number of inbound links a website had. The more inbound links, the better a site's PageRank.
Even today, nearly 19 years later, inbound links are the Holy Grail of webmasters and SEO specialists everywhere. And from what I see, that doesn't look likely to change anytime soon.
Those who have been around for a while (like myself) will remember the toolbar Google provided for Internet Explorer, especially designed so that webmasters could easily check a site's PageRank.
Around the same time, Google introduced AdWords, which allowed you to view paid and organic search results alongside each other.
And though Google counted the number of inbound links going to a website, it did not consider the quality of those incoming links.
I bet you already know what happened next.
Since there were no specific rules covering link quality, webmasters quickly learned that they could post links pointing to their site anywhere and everywhere.
Very few "white-hat" webmasters existed in those wild days. It was like the Wild, Wild West of the Dot Com era. And so Google called in its sheriff to clean things up: an update to the algorithm.
You can see what Google's PageRank code looks like in Python at geeksforgeeks.org. You can even click the "play" button and it'll run the code for you so you can see what it looks like in action.
PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important it is. The underlying assumption is that more important websites are likely to receive more links from other websites. -Google
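The idea in that quote can be sketched as a toy power-iteration implementation. This is a simplified illustration, not Google's actual code - the four-page link graph and the damping factor of 0.85 are just the conventional textbook setup:

```python
# Toy PageRank via power iteration (simplified illustration, not Google's code).
# Graph: each page maps to the list of pages it links to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        # Every page keeps a small baseline, then receives shares
        # of rank from each page that links to it.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

ranks = pagerank(links)
# "C" collects links from A, B, and D, so it ends up with the highest score.
print(sorted(ranks, key=ranks.get, reverse=True))
```

Note how "C" wins not because of anything on the page itself, but purely because three other pages link to it - exactly the "count and quality of links" idea from the quote above.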
2003 - 2004: The Florida Update
If you're a webmaster, content marketer, or a blogger maintaining your blog's SEO, you probably stay updated on SEO matters. If that is the case, then you may have heard about Google's "Update Florida" that took place in November 2003.
The Update Florida rocked the SEO world to its core.
Sites lost their PageRank overnight, and others had to turn their SEO strategy around 180° or face a similar fate. What made the update even worse is that it hit during the holiday season.
Black-hat SEO tricks like keyword stuffing and linking irrelevant pages to high PageRank links no longer worked.
And though the update happened towards the end of 2003, Google had been studying the issue since 2001. That was the year Google released a paper entitled "Who Links to Whom: Mining Linkage Between Web Sites" (downloadable PDF).
Update Florida also took aim at spam sites.
2005 - 2006: The Jagger and Big Daddy Updates
Google designed the Jagger update to help the search engine reduce the number of low-quality link exchanges. The update also cracked down on over-optimized anchor text in backlinks.
Google developed Big Daddy to gain a clearer understanding of the relevance and quality of backlinks, helping searchers find quality content.
Other key features of the two updates included improved search results and greater weight on social signals.
2011: The Panda Update
Google's Panda update targeted poorly written content - mostly keyword stuffing and irrelevant content - produced by content farms. Since I got my start working for content farms, I know firsthand that most of their clients are SEO firms.
These so-called online "copywriting" agencies and the SEO firms they cater to pay writers one to two cents per word, which is not nearly enough to keep good writers.
There were about 15 versions of the Panda update spread between 2011 to 2014.
Many sites posting unauthentic content witnessed the power of Google's algorithm firsthand as their rankings went up in smoke.
Not only did they lose their rank but Google penalized them severely. However, it wasn't all about crushing black-hat SEO practitioners.
Google also provided guidelines, for those interested in following the rules, on what it views as good content and a high-quality website.
Keeping up with all of those updates can be trying at times, especially when you have so many other things to do.
2012: The Penguin Updates
After all the updates, Google noticed websites were still publishing low-quality, spammy links in the hopes of gaining better rankings.
Google dealt with these naughty black-hatters by releasing the Penguin update, putting sites with irrelevant and poor quality links on ice.
The Penguin update took care of many kinds of websites that bought links from spammy directories or blogs created with the sole purpose of selling links.
Below is a brief history of Google's Penguin updates, as well as ways to protect your site from the same fate.
May 2012: Penguin 1.1
Though not necessarily a full algorithm update, this release set out to penalize any spammy sites that the first release had missed. Webmasters who improved their link profiles were rewarded with higher rankings.
October 2012: Penguin 1.2
This version of Penguin again rewarded sites following the rules and punished those who hadn't gotten the memo. This included non-English sites as well.
May 2013: Penguin 2.0
This major algorithm update directed Google's crawlers to start analyzing sites more thoroughly, including suspicious websites' category pages, not just their home pages.
October 2014: Penguin 3.0
Another update similar to Penguin 1.2: it refreshed rankings, penalized sites that had eluded earlier updates, and rewarded sites with improved link profiles.
September 2016: Penguin 4.0
At this point, Penguin became a significant facet of Google's core algorithm. Penguin now updates on its own and in real time.
Those who worked on their link profiles after this point no longer needed to wait for future updates before seeing improvements in their search rankings.
And rather than punishing sites with spammy backlinks, Google now simply discounts those links, treating them like "nofollow" links.
In light of the latest Penguin update, adhere to the following tips:
- Make sure backlinks are relevant to the surrounding content and point to high-quality websites
- Give each backlink diverse anchor text
- Don't use targeted keywords or phrases as anchor text
- Use anchor text that reads naturally
- Don't buy links or use link-building automation
- Create content that is informative, original, and unique, in the hopes that over time other authoritative sites will link to it
A Short List of Other Google Updates to Know About
Google announced its Hummingbird update on September 26, 2013. This algorithm update addressed natural language and conversational search entries.
It was a significant improvement to Google's already innovative search technology, impacting around 90 percent of queries worldwide.
The primary goal behind Hummingbird is to understand natural language in both written and voice search.
That way, Google could provide the answers searchers intended, in a very natural way. Another result of the Hummingbird update was an improvement to local search results.
On April 21, 2015, Google rolled out the first mobile-friendly update called "Mobilegeddon," requiring webmasters to make their sites mobile-friendly.
Today, this update has made mobile-friendliness one of the primary ranking signals.
Mobilegeddon only affected search rankings on mobile devices. It covered all languages worldwide and applied to individual web pages rather than entire sites.
Of course, sites that weren't mobile-friendly saw their rankings fall, but only in mobile searches.
Additional updates to Mobilegeddon include:
- Accelerated mobile pages in February 2016
- Mobilegeddon 2.0 in March 2016
Though it was fully functional by April 2015, RankBrain was officially announced in October 2015. Using artificial intelligence, it works to make better sense of searcher queries.
It applies to all languages and countries globally. Built on natural language processing (NLP), RankBrain is most active when it encounters new, unfamiliar queries.
This AI technology allows it to understand the meaning of searchers' queries much the way humans do, and then do its best to provide the most relevant results.
In September 2016, Google announced Possum, a local algorithm update. It covered local business search results, as well as Google Maps. That's why it knows where you're going before you do.
Now businesses located outside city limits see better rankings, because Possum suggests results based on the user's location.
After this update, studies have shown that the local search filter and organic filter work independently of each other.
Achieving Good Rankings Is Your Responsibility
For those of you who are just learning about SEO or have been learning about it for a while but still have a way to go, the first thing to understand is that SEO is ultimately your responsibility.
Whether you've hired a professional SEO service, rely on a content writer who tells you they know about it, or get advice from various sources, the ball is always in your court. If you hire someone who uses black-hat SEO tactics or you follow bad advice, your site will suffer, not theirs.
If Google penalizes your website for an infraction, it will take months to recover, maybe even longer. And you can't hold anyone else to task for it.
A lot of people running their own blog or business website don't realize that they're essentially webmasters. From my own experience, I suggest you learn what it takes to be a good webmaster and how to do SEO.
I may write an article about being a webmaster at some point in the future. But for now, I'll share this great article that I came across recently.
If you're a business owner, you most likely don't have time to work on such things. However, for those who are professional writers, bloggers, or run a similar online business, learning webmaster and SEO skills should be at the top of your to-do list.
Common Misconceptions People Have About SEO
In this section, I have listed some misconceptions a lot of individuals have about SEO.
Meta Descriptions Are Crucial for Good SEO
I use Yoast SEO on my WordPress site. One of the warnings it will give you is that you haven't "specified" a meta description and that "search engines will display copy from the page instead."
A meta description is that small snippet of text you see under a Google search result. If you don't create a meta description, Google will automatically generate one from your page's content.
In the past, meta description optimization was an essential facet of SEO. It included things like using a certain number of words, using the keyword within the snippet, among other things.
And though it drives me nuts not seeing every light indicator in the Yoast plugin green, it's not important.
However, even though Google's algorithm doesn't care about meta descriptions, they do matter to organic beings - the humans who find your website among the search results.
So take time to write one, but know that it isn't a big deal for your PageRank.
Exact Matching Keywords Are a Must
Exact matching keywords are another old SEO practice. Thanks to a number of significant updates (some of which I mentioned above), Google is sophisticated enough that it looks beyond exact matching keywords.
Google now uses Latent Semantic Indexing, which looks for meaning in blocks of text rather than for exact keywords. This means it can read your content, even though it may not understand it the way a human does.
For example, Google can recognize that "car lot" and "car dealer" are the same.
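The intuition - that phrases appearing in similar contexts tend to mean similar things - can be sketched with a toy cosine-similarity example. This is a simplified distributional illustration, not Google's actual implementation; the context words and co-occurrence counts below are invented purely for demonstration:

```python
import math

# Toy co-occurrence profiles: how often each phrase appears near a few
# context words. (The counts are invented purely for illustration.)
context_words = ["buy", "vehicle", "price", "recipe"]
profiles = {
    "car lot":    [4, 6, 3, 0],
    "car dealer": [5, 5, 4, 0],
    "cook book":  [1, 0, 1, 7],
}

def cosine(a, b):
    """Cosine similarity between two count vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

sim_dealer = cosine(profiles["car lot"], profiles["car dealer"])
sim_book = cosine(profiles["car lot"], profiles["cook book"])
print(sim_dealer)  # close to 1.0 - the phrases share contexts
print(sim_book)    # much lower - almost no shared contexts
```

Because "car lot" and "car dealer" show up near the same words ("buy", "vehicle", "price"), their vectors point in nearly the same direction, so a search engine comparing contexts can treat them as near-synonyms without a dictionary.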
This is good news for us content writers who actually enjoy the creative aspect of writing and are frustrated by the thought of having to find ways of fitting a keyword or phrase into a piece while making it sound natural.
Forget SEO and Only Focus on Awesome Content
This modern-day misconception can be found written in a large number of blogs. If you read this somewhere, then you may as well click the ← button on your browser and search for a better source.
Most pundits agree on three things: Content marketing is king, it's here to stay, and how it's constructed affects SEO.
Google and other search engines get money from ads. Just like in the television industry, the more people viewing equates to more people seeing and clicking on the ads.
And just like in the television industry, shows that get bad ratings either get moved to a different time slot or get canceled altogether.
The same goes for content on Google, though Google is far kinder than the television networks. As long as your content helps answer people's questions, you have nothing to worry about.
However, the fact is, even websites with high rankings and great content don't ignore SEO. This is because SEO is what Google's algorithm uses to know if your site is relevant or not. You can consider them "indicators."
Link Building Isn’t Important for SEO
I can never see this being the case. Link building was one of the first indicators Google used to rank sites - quite possibly the very first. So for anyone to tell you that link building doesn't matter is foolish, to say the least.
Most people who adopt this train of thought are lazy and don't want to spend the time it takes to build high-quality backlinks. Link building is time-consuming but an essential aspect of SEO.
My Homepage Needs to Be Crammed With Content
Some time ago, 500 to 600 words of text per article was the SEO sweet spot. Today, though, the more words an article has, the better Google can grade your content.
I recommend publishing around 2,000 words or more per article. And if you can, keep adding to that content with updates and the like.
If anyone reading this has been blogging for a while, you might remember those one-page themes where you would find numerous short 300 to 500-word posts on a single home page.
Posting your content in that fashion is so not in fashion. And since Google launched Penguin 2.0, its analysis goes well beyond a site's homepage anyway.
That's part of why webmasters used to cram content onto the homepage - before Penguin 2.0, the homepage got most of the scrutiny.
Your homepage is a doorway to your website. A little content on it is good, but cramming it full of photos and text will drive people off, which drives up your site's "bounce rate."
The bounce rate is an important indicator Google uses when grading whether a site is valuable. If people spend only 30 seconds on your site, Google takes this to mean your site is of low quality.
However, if people land on your homepage and navigate from there to other parts of your website, Google understands this as meaning your site has information people are interested in.
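To make the arithmetic concrete, here's a toy sketch of how a bounce rate is computed: single-page sessions divided by total sessions. The session data below is invented for illustration - in practice, analytics tools track this for you:

```python
# Toy bounce-rate calculation: a "bounce" is a session that viewed
# only one page before leaving. (Session data invented for illustration.)
sessions = [
    ["/home"],                       # bounce
    ["/home", "/blog", "/contact"],  # engaged visit
    ["/home", "/about"],             # engaged visit
    ["/blog"],                       # bounce
]

bounces = sum(1 for pages in sessions if len(pages) == 1)
bounce_rate = bounces / len(sessions)
print(f"Bounce rate: {bounce_rate:.0%}")  # Bounce rate: 50%
```

The second and third sessions in this example are exactly the "navigate from the homepage to other parts of the site" behavior that signals your content is worth exploring.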
Lastly, I want to point out that people today have really short attention spans - or maybe it's better to put it this way: if people don't get what they're looking for, or have their attention grabbed, within a few seconds of landing on your site, they're going to push the ← button faster than you can scream "Wait!"
The Best Way to Learn Is to Read Directly From the Source
As you can see, there are countless misconceptions and myths surrounding the subject of SEO.
Many of these misconceptions and myths are spread by SEO "specialists" looking to make a quick buck. Other misconceptions and myths come from the fact that Google is very secretive about certain things involving how they rank sites.
The secretiveness is for a good reason; if the tech company revealed too much about how it ranked websites, black-hat SEO practitioners could find ways to circumvent its tactics.
Some other good reading includes:
- Search Engine Optimization (SEO) Starter Guide
- How Google Search Works
- Webmaster Guidelines
- Content guidelines
- Automatically Generated Content
- Sneaky Redirects
- Link Schemes
- Scraped content
These linked topics are but a few of what's available. Nevertheless, they should give you a great starting point for bettering your SEO game.
© 2018 Alexis Wainwright