The future of SEO is looking more and more uncertain with every passing day. It's not that Google is changing the rules; it's that the underlying technology is going through a transformation akin to a caterpillar turning into a butterfly.
In the past, the world of search was based on keywords. Companies like Google and AltaVista (back in the day) built search algorithms that ranked pages by how many relevant keywords and links they contained. Over time, they realized that this way of ranking pages wasn't airtight: users were often directed to pages that didn't have much to do with their original question, even though those pages contained the right keywords, and so they began tinkering with the algorithm. Today search engines are essentially giant "expert systems" in artificial intelligence parlance, with dozens of layers of code laid on top of one another, attempting to direct users to the right content.
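To see why pure keyword counting wasn't airtight, here is a deliberately naive sketch (not any real search engine's algorithm) of the early approach: score a page simply by how often the query terms appear. The pages and query below are invented for illustration.

```python
def keyword_score(query, page_text):
    """Score a page by counting occurrences of each query term in it."""
    terms = query.lower().split()
    words = page_text.lower().split()
    return sum(words.count(term) for term in terms)

# Two made-up pages: one stuffed with keywords, one genuinely useful.
pages = {
    "page_a": "cheap flights cheap hotels cheap deals cheap cheap",
    "page_b": "a practical guide to finding cheap flights and comparing airlines",
}

query = "cheap flights"
ranked = sorted(pages, key=lambda p: keyword_score(query, pages[p]), reverse=True)
print(ranked)  # the keyword-stuffed page_a outranks the useful page_b
```

Under this scoring, sheer repetition wins, which is exactly the loophole early SEOs exploited and later algorithm layers were built to close.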
Throughout that time, search engines have been doing battle with SEOs. Search engines want rankings to be organic and determined by the quality of the content, but SEOs want certain sites to rank higher than others, because their business clients need first-page visibility to drive traffic. This has led to a kind of arms race, where each faction – the search engines and the SEOs – tries to outsmart the other. For a while, it appeared as if the SEOs were winning. They were able to boost the ranking of their clients' sites simply by increasing the number of times a particular keyword was mentioned or upping the number of external links. Unfortunately for them, the search engines got savvy to this sort of thing and began to take other factors into consideration, besides keywords and links. They also developed strategies for downranking sites that used fake backlinks from disreputable sources by introducing the "authority" system, a signal Google used to judge whether a particular site was trustworthy. Sites with links from authority sites – like Forbes – ranked higher than those with links from other, less reputable sites.
The problem, however, is that this process is very time-consuming and expensive. What's more, many users still don't end up with what they want in the search results, despite all the tinkering done by Google. Though the system is good, it's still brittle and often can't adapt to unusual queries or long-tail keywords.
This is where machine learning comes in. The problem so far is that, despite all the sophistication of the search algorithm, it still relies on keywords to identify the most relevant web pages. Despite what conspiracy theorists might claim, Google still can't understand the meaning of what you type into the search box. If you type in a question, Google will provide you with a list of results where people have asked similar questions, but it's unable to categorically send you to a page with the answer.
With machine learning, search will be different, says martechtoday.com. Machine learning is giving computers the ability to understand the meaning of sentences, just like a person. Machines are beginning to learn the relationships between objects and subjects and are building up "common sense" which allows them to understand the essence of meaning. When they see words written down on a page, they're able to link them to real things in the world, giving them the ability to form concepts.
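One simplified way this "meaning matching" works is with word embeddings: words become vectors, a sentence becomes the average of its word vectors, and two sentences are compared by cosine similarity rather than by shared keywords. The tiny hand-made vectors below are invented purely for illustration; real systems learn embeddings from billions of documents.

```python
import math

# Toy three-dimensional "embeddings", hand-crafted so that related words
# (doctor/physician, nearby/local) point in similar directions.
vectors = {
    "doctor":    [0.90, 0.10, 0.00],
    "physician": [0.85, 0.15, 0.05],
    "banana":    [0.00, 0.10, 0.90],
    "nearby":    [0.20, 0.80, 0.10],
    "local":     [0.25, 0.75, 0.10],
}

def sentence_vector(sentence):
    """Represent a sentence as the average of its known word vectors."""
    vecs = [vectors[w] for w in sentence.lower().split() if w in vectors]
    return [sum(dim) / len(vecs) for dim in zip(*vecs)]

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, near 0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

query = sentence_vector("physician nearby")
print(cosine(query, sentence_vector("local doctor")))  # high: same meaning, zero shared words
print(cosine(query, sentence_vector("banana")))        # low: unrelated meaning
```

Notice that "physician nearby" and "local doctor" share no keywords at all, yet score as near-identical in meaning, which is exactly what a purely keyword-based matcher cannot do.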
For SEOs, this is an exciting and unprecedented development. Essentially what it means is that search engine optimization is about to become a lot more natural. The aim of the game will be to generate content which is as close in meaning as possible to the user’s search query. The days of optimizing H1 tags, meta tags and content length may disappear forever.
Another way to think about machine learning is like this: imagine if Google could somehow hire enough people to trawl the internet and manually rank every site for particular search queries. It would be a time-consuming process, but you wouldn't expect them to go about it the same way as Google's robots, following a strict set of rules. Instead, you'd expect them to use their intuition. They'd be able to tell, for instance, whether the content on a particular page had depth and usefulness for the people reading it, and whether the page deserved to be higher or lower in the search rankings as a result.
For businesses, this is a tremendous opportunity. As seoexpertbrad.com discusses, things like location marketing and Google Maps are becoming more important and will continue to do so. Machine learning search services will always try to pair customers with local businesses. Google already does this to some extent, but AI would be able to do it better by taking into consideration personal habits and other preferences, besides simple reviews.
Businesses will also benefit from the fact that search engines will understand subtle differences in language. For instance, a customer might want a particular product or service, but not quite be able to remember its name. Instead, they might write a short description and hope that the right result comes up. Today, this is hit and miss, but with machine learning, computers will be able to make predictions about what a person meant and provide relevant results. You could type in "the glasses company that has an award-winning website," and Google would give you Warby Parker at the top of the results.
Larry Page, one of the founders of Google, has said that the ultimate search engine is one that understands exactly what you want, even with limited input. He clearly sees a future beyond mere keywords, and so his company has invested heavily in developing machine learning technologies that will improve its service into the future.
What does this mean for SEOs? Well, another change in job description. Businesses will have to focus on relevancy above all else and provide their customers with exciting and useful content they will love. Actually, when you put it like that, it doesn’t sound all that different from today, does it?