SEO 101 Part 3 – Keyword Trends

Note: This is Part 3 of our ongoing SEO 101 series. Previous: Part 1, Part 2

Keywords are the original SEO tool – in the early days of search engines, relatively simplistic crawling and indexing relied almost exclusively on keyword usage and density. In those wild-west times, the way to generate search engine traffic was through heavy keyword usage – to the point of rampant (and often dishonest) keyword-stuffing tactics. And while a lot has changed since those early days, keyword usage, selection, and implementation still play a huge role in any well-rounded SEO strategy.

With the latest algorithm changes and the rise of machine learning and artificial intelligence, it is more important than ever to spend time creating a keyword strategy. The modern search engine optimizer needs to understand how semantic variation, synonyms, concepts, and natural language affect the way Google delivers its results.

Improved Algorithms
The algorithms employed by Google and other search engines are growing more complex with each passing year. And with each update and revision, search engine optimizers scramble to keep up with the latest changes to these all-important formulas.

The past two years have brought huge changes to the way that search engine algorithms process search queries. Advances in machine learning allow search engines such as Google to understand the semantic relationships between search terms; Google can now easily identify common word variations (plurals, tense changes, stem changes, etc.), synonyms (beautiful, attractive, pretty, etc.), and closely related concepts (cat, feline, tiger, etc.).

In the past, a web designer would need to include keywords for the whole range of variations, synonyms, and concepts a user might search for; with these machine learning developments, there is no need to cover every possible variation in keyword usage. This allows for more natural and organic content creation, since unusual keywords no longer have to be shoehorned into otherwise quality material.

Natural Language
Along with an improved ability to detect semantic variations, the rise of machine learning allows Google to interpret natural language searches. While many experienced searchers naturally focus on keywords, the rise of voice search and products like Google Home has expanded the need for search engines to properly parse and understand natural language searches and questions.

If I wanted to know the current weather in Denver and was performing a text search, I would search ‘Denver weather’; but if I’m using a voice search or an A.I.-based personal assistant, the search will most likely be ‘what is the current weather in Denver.’ Google is readily capable of interpreting those two searches and understanding that the intent is the same. While the example given is relatively simple, it illustrates the need for modern search engines to determine intent from long-form searches or questions.
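
To make that intent-matching idea concrete, here is a minimal sketch using an off-the-shelf sentence-embedding model (the open-source sentence-transformers library, chosen purely for illustration; it is not how Google works internally). It simply shows that the two phrasings above score as near-duplicates:

from sentence_transformers import SentenceTransformer, util

# Two phrasings of the same intent, from the example above.
queries = ["Denver weather", "what is the current weather in Denver"]

# 'all-MiniLM-L6-v2' is a small general-purpose embedding model.
model = SentenceTransformer("all-MiniLM-L6-v2")
a, b = model.encode(queries, convert_to_tensor=True)

# A cosine similarity close to 1.0 means the model treats the two
# queries as carrying essentially the same meaning.
print(util.cos_sim(a, b).item())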

As search engines do more of the work of interpreting search queries, keyword research gets more complex. How do we target keywords when the search engine uses AI and machine learning to map a query onto semantically related terms? The days of simply chasing after a few targeted keywords are over – and SEO needs to keep up with the changes.

The 3 G’s
Dr. Peter J. Meyers recently published a great keyword research article over on the Moz blog. He offers an elegant solution to this complex problem by breaking keyword research into three stages: gather, group, and generate.

  • Gather: This step hasn’t changed much and involves putting together a list of keywords that are strategic and feasible – strategic in that they relate to your business and would drive traffic among your target demographic, and feasible in that there is an opening in the market you can exploit. Once you have compiled a list of keywords you want to rank for, you are ready for the next step: grouping.
  • Group: Because search engines no longer require exact matches, keywords should be grouped by semantic relation. You no longer need to tackle ‘Denver weather’ and ‘what is the current weather in Denver’ as two separate keywords – you, Google, and the end user all know they mean the same thing, so there is no need to treat them separately.
  • Generate: Once you have groups of semantically related keywords, you can focus on each group to generate one exemplar to target. To go back to the previous example, ‘Denver weather’ and ‘what is the current weather in Denver’ can be grouped under the single target of ‘current weather in Denver.’ Both Google and your end user will understand the content and find it relevant to their search – which should be the ultimate goal of all SEO work. (A rough sketch of how this grouping and exemplar selection might look in practice follows this list.)
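
Below is a rough sketch of the group and generate steps under some clearly stated assumptions: an off-the-shelf embedding model (sentence-transformers again) stands in for the search engine’s semantic understanding, the keyword list and the 0.6 similarity threshold are made up for illustration, and exemplars are chosen as the keyword closest to each group’s average embedding rather than by search volume.

from sentence_transformers import SentenceTransformer, util

# Hypothetical output of the "gather" step.
keywords = [
    "Denver weather",
    "what is the current weather in Denver",
    "Denver forecast today",
    "best coffee shops in Denver",
    "coffee near downtown Denver",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(keywords, convert_to_tensor=True)

# Group: greedily place each keyword into the first group whose seed
# keyword it resembles closely enough, otherwise start a new group.
SIMILARITY_THRESHOLD = 0.6  # illustrative value; tune on real data
groups = []  # each group is a list of indices into `keywords`
for i in range(len(keywords)):
    for group in groups:
        if util.cos_sim(embeddings[i], embeddings[group[0]]).item() >= SIMILARITY_THRESHOLD:
            group.append(i)
            break
    else:
        groups.append([i])

# Generate: pick one exemplar per group, here the keyword closest to
# the group's average embedding.
for group in groups:
    centroid = embeddings[group].mean(dim=0)
    exemplar = max(group, key=lambda i: util.cos_sim(embeddings[i], centroid).item())
    print([keywords[i] for i in group], "->", keywords[exemplar])

Run against a real gathered keyword list, the printout gives one candidate target per semantic group: a starting point for the generate step, not a replacement for editorial judgment or search-volume data.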

The method of keyword selection and usage may have changed, but the principle remains the same: understand what your target audience is looking for and how Google will interpret and deliver your page. Keyword-based SEO doesn’t need to be a mystery – it can be grounded in quality data and a deep understanding of how search engine algorithms are changing.

Keyword Professionals
Fusion Group USA has the experience and tools necessary to research and target keywords that will drive traffic in the modern world of machine learning and artificial intelligence. Contact us today to see how we can help your business through smart and effective search engine optimization tools.
