Arguing Semantics: Why Keywords Alone Are Not Enough


When ordering business content for article marketing, landing pages, SEO or social media, some business owners don’t stop to think about what they need that content to do. It’s a placeholder, a way to fill the space on the page more efficiently than just leaving an “under construction” message on the site. Content, to them, is just the stuff wrapped around the important parts, such as the keywords and links that draw search engines’ attention. What they don’t realize is that search engines no longer look at links and keywords alone.

Early versions of search engines viewed keywords as isolated entities in a binary universe. Keywords were there or they weren’t; there were no shades of meaning, no nuances, no context. It didn’t take long for black-hat tactics to take advantage of this simplistic setup, stuffing meta tags full of celebrity names and common medical complaints to siphon traffic from some of the most-searched topics. Search engine architects responded with systems that allowed the engines to read more like people, taking contextual cues into account and forming a more nuanced image of what a page contained.

Latent semantic indexing, or LSI, is more than an industry buzzword. It’s a way for search engines to approximate a human reader’s experience of a site more closely by noting how often whole constellations of words appear together. Google can’t read minds, but it can read big data and use it to put together a coherent picture of the sites that most closely match a given search query.

If the concept seems arcane in the abstract, a concrete example should make it clearer. By itself, the word “stage” could have relevance to a host of subjects from theater to space flight to medicine. With LSI, a search engine can recognize a site that mentions the word “stage” with terms such as “actor,” “set,” and “tickets” as highly relevant to a search on local theaters. Diseases and rockets also have stages of a different sort, and a search engine can organize articles with the word “stage” in them accordingly by noting their semantic context.
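The idea behind the “stage” example can be sketched in a few lines of code. The classic LSI technique builds a term-document count matrix and uses a truncated singular value decomposition (SVD) to project documents into a low-rank “concept” space, where documents that share many co-occurring terms end up close together. This is a toy illustration of that general technique, not Google’s actual implementation (which is not public); the sample documents and the choice of two latent dimensions are assumptions made for the demo.

```python
# Toy latent semantic indexing (LSI) sketch using a term-document matrix
# and truncated SVD. Illustrative only; real search engines are far more
# complex, and the documents below are invented for the example.
import numpy as np

docs = [
    "stage actor set tickets theater",          # theater context
    "actor tickets stage curtain theater",      # theater context
    "rocket stage booster launch orbit",        # spaceflight context
    "disease stage diagnosis tumor treatment",  # medical context
]

# Build the vocabulary and a raw term-document count matrix A,
# where A[i, j] counts occurrences of word i in document j.
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}
A = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        A[index[w], j] += 1

# Truncated SVD keeps only the k strongest "concepts".
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # one k-dimensional vector per document

def cos(a, b):
    """Cosine similarity between two document vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The two theater documents land close together in concept space, while a
# theater document and the rocketry document stay far apart, even though
# all four documents contain the word "stage".
theater_pair = cos(doc_vecs[0], doc_vecs[1])   # high similarity
cross_topic = cos(doc_vecs[0], doc_vecs[2])    # low similarity
```

The key point is that the word “stage” alone carries no signal here; it is the surrounding terms (“actor,” “tickets” versus “booster,” “orbit”) that determine which concept cluster a document falls into.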

The concept of latent semantic indexing is a direct response to the keyword-stuffing tactics that cluttered search engine results and frustrated users. An unrelated article can no longer mention Miley Cyrus and toenail fungus in its meta tags or body text and still draw traffic from people searching for those popular topics. Put simply, articles must now be about something to make the most of LSI.

The specifics of LSI are for search engine architects to understand fully, but the concept is still important for anyone who needs content. LSI matters for article marketing and other business content because it’s the foundation on which current search engines seem to be built. While Google’s engineers haven’t specifically said that the search engine uses LSI or in what way it uses it, content that assumes it does consistently outperforms writing that relies solely on individual keywords and keyword density. In other words, while you have no way of knowing if LSI governs Google, you can know that behaving as if it does helps your content’s ranking.

Natural writing – the kind of writing that reads as if a human wrote it with a specific subject and human readers in mind – now has an advantage over keyword-stuffed content because it’s semantically relevant in a way that stuffing alone can’t manage. Keywords are still important, but they’re more important when they’re in context.

© Business Content, Inc. 2013 All Rights Reserved.
