When it comes to search engine optimization, the standard view has basically always been that links reign supreme, while content is treated as less important and given a lower priority. With RankBrain and the integration of contextual search results, however, content has become one of the most important ranking factors in Google. That shift is what we will be dealing with in this article.
In many ways, Google has depended on links to expand its search engine. Lately, the focus has shifted towards text and content, and how they can relieve the link system of the job of delivering relevant and accurate results for Google users' search phrases and questions. With the introduction of Hummingbird, the improvements to Latent Semantic Indexing and, more recently, RankBrain, Google has taken huge steps towards turning its search engine into a machine that could theoretically disambiguate content and unclear search phrases and provide specific search results without having to rely on links or specific keywords.
Within the SEO community, links and content have always been considered fundamental to a site's ability to rank on Google. This has also turned content into the scapegoat of Internet marketing. Many content evangelists found themselves getting attacked by those of a different persuasion. The veterans of SEO have always known that links are the single most important factor – even though the proponents of content had declared content king, Darth Vader or a zombie Ayn Rand and left it at that.
The more obscure SEO guys – or should I say the tiny part of the SEO community that understood that links trump content in Google's priorities – focused on creating as much, and as cheap, content as possible, more often than not with the help of Asian freelancers, Google/Bing Translate and spun (rewritten) Wikipedia content. It was a simple concept – volume and scalability equalled better.
Of course, not all of those involved in search engine optimization were using these methods. And for a few years now – or at least since the so-called Panda update – content has been treated with much more respect.
The problem has never been an unwillingness to produce text/content; it's that content, in terms of web traffic, simply hasn't had much of an effect. If the purpose is to increase traffic, why choose the far less cost-effective method?
Also, those involved in search engine optimization are often cynical individuals. They have all, at least once or twice, been fooled by the "content is king" crowd, failed and been forced to find other ways. They know that carefully evaluating a site's inbound links can determine its credibility. Doing the same thing mathematically with text/content is a whole different game.
In the olden days – by which I mean the late '90s and the early days of Google – there was only content and no way of evaluating its quality. It didn't matter how many apples the content crowd put on the teacher's desk. The early search engines were simply engines, or machines. They had no way of evaluating content other than mathematics, which isn't very reliable for judging the value of, and difference between, two texts when one is the Iliad and the other a dime novel.
In 2003, Google bought the little-known Santa Monica-based company Applied Semantics, which dealt with software applications for online marketing – a stroke of genius. This was a huge step for Google towards making its search engine understand contextual information and semantics, giving the content crowd a realistic chance against the link people.
So, what's semantic search? Basically, semantic search is about understanding your audience: their intentions, their needs and the kinds of questions they might ask. You then adjust your content towards answering those questions and meeting those needs, by offering contextual content that might not match the specific search terms but still covers the topic.
Search words, or keywords, are of less importance here. Following the Hummingbird update (which improved Google's semantic ability) in 2013, Google has rolled out several new components for its search algorithm, giving it the ability to evaluate things such as:
... and several other things that matter, which reduces the impact of specific keywords not appearing in the text.
The search engine has gone from focusing on keywords to dealing with the importance of concepts – in certain cases you won't even have to include a keyword in your search phrase; it will be enough to enter the concept you are looking for in order to have your question answered.
How did they achieve this? Hummingbird introduced "true" LSI, Latent Semantic Indexing, a method for processing and analyzing large amounts of data in order to find patterns.
In this case, Google increases its vocabulary by processing text: the search engine analyzes the text and its structure, which words and terms are used in relation to each other, abbreviations, and so on. That way the engine increases its ability to understand semantic context.
Before Hummingbird, LSI was basic and unreliable. Following the update, it became much quicker and better at catching context and semantics.
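The core mechanic of LSI can be sketched in a few lines. This is a toy illustration, not anything Google actually runs – the corpus, vocabulary and number of latent "concepts" below are invented assumptions. It builds a term-document count matrix and reduces it with singular value decomposition, which is the mathematical operation at the heart of LSI:

```python
# Minimal LSI sketch: term-document matrix + truncated SVD.
# The documents and k (number of latent concepts) are illustrative.
import numpy as np

docs = [
    "used volvo for sale",
    "volvo cars for sale",
    "fresh apple pie recipe",
]

# Build the vocabulary and the term-document count matrix
vocab = sorted({w for d in docs for w in d.split()})
A = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)

# Truncated SVD: keep only the k strongest latent "concepts"
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # one row per document, in concept space

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The two car-sale documents cluster on the same concept axis;
# the recipe document lands elsewhere.
print(cos(doc_vecs[0], doc_vecs[1]), cos(doc_vecs[0], doc_vecs[2]))
```

In the reduced space, the two car-related documents end up almost identical even though their keywords differ, while the recipe document is nearly orthogonal to both – which is exactly the kind of pattern-finding the paragraph above describes.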
2013 was a while ago, and you, the reader, might already have noticed increased reliability in Google's ability to match difficult phrases to search results. How about the classic search "fly sheet or newsletter"?
Instead of wading through information entered manually by engineers, the search engine picks up searches, semantically similar words, terms and similar data on its own by scanning texts and documents available on the Internet.
This is important because, before Hummingbird, one could only guess the topic of a text or a site by interpreting which keywords were used within the text itself. Keywords are still important, but they are becoming less and less so. Google has long emphasized that it prefers "conversational search", and the progress of semantics has helped the company reach this goal in a very short time. So, the time is now for us SEO pros to begin focusing on answering questions instead of providing keywords. How do I deal with content marketing? What is search engine optimization? And so on.
Two terms that relate to Google's semantic understanding, and that aren't often discussed, are co-occurrence and co-citation. What these two terms actually mean is discussed more thoroughly in this article.
You could say that both functions work towards creating a kind of natural, text-based linking between two or more independent sources. When a source refers to another source it creates a relation between the two, and so they become linked through citation, which Google understands and takes into consideration.
This means it is an important and useful practice to mention and link to authoritative sites, personalities and so on that deal with the same topic as you do. Through your association with these sources, you can make your site more visible in search results through co-citation.
Co-occurrence happens when two (or more) searches involve words that are so similar that people consider them synonymous, especially if they are used closely together or in a wider context such as a text.
For example, you might search for a phrase such as "Volvo Craigslist" and then try "used Volvo for sale". These two search terms are so similar in language and purpose/intent that Google can safely assume that people want a used Volvo car when they are looking for a Volvo car on Craigslist.
Co-occurrence depends on how people behave and how probable it is that a large number of users will perform identical searches, which Google then uses to validate that the algorithm is working. This strengthens the bond between search terms, even when they are used in a different context and in combination with other search terms.
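At its simplest, co-occurrence can be approximated by counting how often pairs of terms show up in the same search phrase. The toy query log below is invented for illustration – a real system would work over billions of searches – but the counting idea is the same:

```python
# Toy co-occurrence counter: tally term pairs that appear in the
# same query. The query log is made-up example data.
from collections import Counter
from itertools import combinations

queries = [
    "volvo craigslist",
    "used volvo for sale",
    "used volvo craigslist",
    "apple pie recipe",
]

pair_counts = Counter()
for q in queries:
    words = sorted(set(q.split()))       # unique terms, stable pair order
    for a, b in combinations(words, 2):  # every term pair in this query
        pair_counts[(a, b)] += 1

# "volvo" and "craigslist" appear together in two queries
print(pair_counts[("craigslist", "volvo")])  # 2
```

Terms whose pair counts grow large across many users – "volvo" with "craigslist" and "used" here – become bonded as near-synonymous intents, which is the strengthening effect described above.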
Co-citation is what's usually called link building without links. There are basically two ways for two independent sites to be linked to each other in a so-called co-citation relationship:
The purpose of co-citation is to send a signal to Google's search algorithm that the mentioned site is relevant and credible – a sort of assurance in the form of text rather than a link.
The fact that content isn't prioritized is actually somewhat of a problem for those of us working within the field of search engine optimization. After all, content should be important – not the number of links your site has.
Over the years, we've seen it many times – e-retailers literally working 70-hour weeks trying to produce good content WITHOUT it being worth the effort. There haven't been any tools in place to reward good content. But with the constant evolution towards giving search engines a human-like understanding, things have seriously begun to change.
We're slowly moving towards what could be called a "Web v3", and semantic search is a big part of this process. All signs point towards a future where the Internet is based on content and natural, well-written text, and where you won't have to rely on specific keywords, precise anchor text and links in order to be visible to search queries.
Currently, it's possible to keep yourself afloat without using these linking strategies. But link building and traditional SEO are still important and should be integrated into each and every serious Internet venture. What will most likely happen is that even small sites with – and I can't stress this enough – the right content will be able to survive.
Thanks to RankBrain, Google now has a much better understanding of intent and context, which will help sites become more visible in higher ranking positions. The future, although we aren't there yet, seems bright for content marketing and quality content.