SEO Will Get You Rankings, But SXO Will Get You Results

SEO is becoming less about keywords and backlinks. With the help of AI, it is increasingly about search experience optimization, or SXO, where quality content and UX dominate.

Written by Michel Fortin. Originally published on seoplus.com.

When most people think of optimizing for the search engines, they tend to think of things like keywords, backlinks, rankings, and so on. But this is not what optimizing for the search engines really is. This is optimizing for search engine visibility or, more precisely, for position in search engine results.

SEO today is probably more about optimizing for the search engine results pages (SERPs) than for the search engines themselves. I know this may sound like semantics, but there's a reason I'm making the distinction, and it will become clearer as you continue reading.

Also, this is not to say that SEO doesn’t include keywords or backlinks. It does. These things have been a part of SEO since the beginning and will continue to be a part of the fabric of the web. Google continues to use keywords and backlinks, among many other factors, for its rankings. The issue is that SEO was never intended to be a way to manipulate search engine results.

Since those early days, however, SEO has evolved just as search engines have evolved. Right now, we're seeing another shift taking place. It has been happening for a while, but it's gaining momentum. When you consider the exponential growth in sophisticated search engine algorithms and machine learning, often referred to as artificial intelligence or AI, you will understand why this change is taking place.

More importantly, my goal here is not to change or attempt to revamp the SEO industry. It’s in our agency’s name and our flagship service, after all. My goal is to help you understand what SEO really is, where it’s going, and how to use it to your advantage — or better said, to your market’s advantage.

How SEO got its start

SEO is arguably the first form of digital marketing to hit the web. Some purists may say that email has been around a lot longer, which is true. But that was ARPANET, the pre-Internet era. When the world wide web launched in the early 90s, people scrambled to get their websites listed in the coveted directories of the day. If they weren't in them, no one would know they existed.

In the beginning, around 1990, directories were meant for indexing files. The first of these was Archie, a pet name for "archive." As more servers went online and more files were found, the process became time-consuming. That's when software such as the World Wide Web Wanderer began crawling servers, looking for files it could index.

But at the time, crawling the web was bandwidth-intensive. So in the early 90s, the first publicly accessible web search engine, Aliweb (short for "Archie-Like Indexing for the Web"), offered people the opportunity to submit the location of their files. Moreover, it allowed them to add user-written page descriptions and keywords.

This was the beginning of SEO because it empowered webmasters to submit their pages for inclusion and define the terms under which users could find them. The goal was to be found, not to outrank others. There was no ranking anyway: search results simply listed everything in no particular order of importance, just chronologically or alphabetically.

Eventually, search engines were able to start searching the content inside those files, not just filenames or user-written descriptions. But users would only be able to see the content of those files after downloading them.

What defined the official launch of the web was the inception of the hypertext protocol. This allowed computers to visually display the contents of files using a tool called a browser. More importantly, it allowed pages to reference each other through hyperlinks, making it easier to find other related pages or files.

As directories grew, search results became bloated, overwhelming, and difficult to use. Having to trudge through millions of search results to find what you’re looking for was becoming quite daunting.

Luckily, a young Stanford student had an idea: count the number of hyperlinks to a page as an indicator of its importance. He gave it his own name, Larry Page's ranking algorithm, or PageRank for short. Together with fellow student Sergey Brin, he launched BackRub, a search engine that would use links to rank results in order of importance.

BackRub then became Google.

And the rest is history.

Where SEO can be misleading

Since the days of BackRub, Google's objective has always been to offer users the best search experience possible. They want to give users the best answers to their queries without making them sift through endless irrelevant search results. But for that to happen, they first need to find the right content and understand it.

SEO, at its purest and most fundamental level, is the process of ensuring your website is findable, crawlable, indexable, and rankable. That's it. It's not about stuffing keywords or building backlinks, which is what most people think SEO is. More than anything, those are ways to manipulate search engine results.

In fact, Google’s own definition of SEO is simply this: “Search engine optimization is the process of making your site better for search engines.” If you visit Google’s Page Quality measurement tool at Web.dev, they offer four scoring dimensions that Google pays attention to. These four are Performance, Accessibility, Best Practices, and SEO. With the last one, they describe the metric as follows:

These checks ensure that your page is following basic search engine optimization advice so that search engines can crawl and index your page, and understand your content.

Google’s Webmaster Guidelines explain what you need to do to ensure Google finds, crawls, and parses your pages. You can and should read the entire guidelines, and every SEO consultant worth their salt should read them, too. But for simplicity’s sake, they fall into three key directives:

  1. Help Google find your pages.
  2. Help Google understand your pages.
  3. Help visitors use your pages.

What Google looks for are quality signals. “Page quality” is a common term and sometimes used interchangeably with “page experience.” In addition to their guidelines, they offer tools that measure your Core Web Vitals. These vitals are, according to Google, “Quality signals that are essential to delivering a great user experience on the web.” So if you follow the guidelines and improve your quality signals, you will improve your ability to rank.
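
Those vitals can even be checked programmatically. Here's a minimal sketch, assuming Python with the third-party requests library and Google's public PageSpeed Insights API (the endpoint and response field names follow its public documentation and may change; an API key may be needed for heavier use):

```python
import requests

# Google's PageSpeed Insights v5 endpoint.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_core_web_vitals(url: str, strategy: str = "mobile") -> dict:
    """Fetch real-user Core Web Vitals data for a URL, if Google has field data for it."""
    response = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60)
    response.raise_for_status()
    data = response.json()

    # "loadingExperience" holds field (real-user) metrics; it may be absent for low-traffic pages.
    metrics = data.get("loadingExperience", {}).get("metrics", {})
    return {
        name: {
            "percentile": values.get("percentile"),
            "category": values.get("category"),  # e.g., FAST, AVERAGE, or SLOW
        }
        for name, values in metrics.items()
    }

if __name__ == "__main__":
    for metric, result in fetch_core_web_vitals("https://www.example.com/").items():
        print(metric, result)
```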

What helps you rank is one thing, but what makes you rank above others is another. The latter is what SEO services typically strive for. The original intent of SEO was to help search engines determine your content's importance or, more specifically, its importance as it relates to the user's search. How well your content matches what your users are looking for — and provides them with more value than the rest — is what will increase your rankings.

So how do you outrank your competitors?

Before answering that question, remember that following Google’s Webmaster Guidelines will improve your search engine visibility. Increasing your visibility increases the traffic to your website. But traffic alone is not enough.

You don’t want a flood of low-quality traffic that will never engage with you or buy from you. You want qualified visitors who will consume your content, engage with your website, and hopefully buy from your business. That’s what you really want. The key to attracting higher levels of quality traffic, therefore, is to focus on delivering quality to your users. That’s what Google really wants.

In order to attract quality, you need to first provide quality.

Quality is the quest of modern SEO

We may think we are optimizing for the search engines. But in reality, we are optimizing for the search engine’s users — or more specifically, we are optimizing for our users who happen to use Google.

The best SEO services make it easier for search engines to find, parse, and understand your content. But they should also make it easier for search engines to gauge your content's relevance and value as they relate to your users' questions and needs. Just make sure your content does a better job (i.e., provides more relevance and value) than your competitors'. That's where the quest for quality comes in.

“Quality” is relative. But that’s precisely what makes SEO simple yet so challenging at the same time.

Content may be of high quality, but if it fails to match the user's search intent and meet their needs, it is pointless. Moreover, if it's outperformed by a competitor who matches the user's intent more closely and meets their needs more adequately, then even quality content is meaningless.

One of the objectives of SEO is to understand what those two things are and how to do a better job than others. It’s not something you can fit into a box or paint by numbers. Even Google themselves can’t tell you what they are. They use thousands of algorithms in the hope that they can measure quality as accurately as possible. They want to provide users with the most accurate and helpful results they can.

Google’s mission is to deliver quality content users want while avoiding disinformation. To verify its algorithms and their performance in trying to do so, Google employs human search quality raters whose jobs are to spot-check search results, verify the content they link to, and rate their level of quality.

These human raters follow a 175-page guide called the Search Quality Rater Guidelines, or QRG. Unlike the Webmaster Guidelines, this document doesn't offer specific rules or best practices to follow, nor does it reveal Google's proprietary algorithms. But it does offer many clues into what Google considers "quality," which is an indicator of what Google is looking for.

Human raters are essentially tasked with measuring a search result’s relevance and value. In the QRG, relevance is rated with a metric called “Needs Met.” Value is measured according to “Page Quality.” These ratings are based on a host of factors that are subject to interpretation, which is why “quality content” is subjective and difficult for any algorithm to determine — although they are becoming smarter and more sophisticated by the second.

The QRG essentially asks: “Does this result match what the user is looking for and does it adequately help them?” The more it meets their needs, the higher its relevance is. The more helpful it is, the higher the value it provides.

SEO and the search for quality signals

The Core Web Vitals mentioned earlier are part of Google's page experience guidelines. Google calls them quality signals. But what about content? Does it have quality signals, too? Absolutely.

For content to be relevant and provide value, it must be useful, meaningful, truthful, and valid. Genuinely helpful content adds to the user experience. After all, we never have a good experience when we're being lied to, cheated, or misled. Right? But users may not readily detect, understand, or appreciate what "helpful" is.

That’s where Google comes in. Its algorithms try to look for a number of quality signals with content as they do with page experience. These signals vary not only in how they present themselves but also in intensity. Some signals are positive, others are negative. Some are strong, others are weak. Some are clear, others are hidden.

What are those signals?

Among the clues found in the QRG, we find an acronym mentioned over 120 times. If it’s mentioned that often, then it must be important. It’s E-A-T, which stands for “expertise,” “authoritativeness,” and “trustworthiness.” What Google wants is for its search results to show pages providing content that’s:

  • Credible (it’s written by experts);
  • Believable (it’s authoritative); and,
  • Reliable (it’s trustworthy).

The more credible, believable, and reliable the content is, the more value it brings to the user.

Obviously, these signals apply to different websites in different ways and to varying degrees. For example, credibility is a little less important for a free gaming site, which is only meant to entertain. But entertainment has value and can be helpful to those who wish to be entertained. Plus, if the site sells subscriptions or gaming swag, and asks for your personal information and credit card details, then E-A-T may play some role.

However, websites that deal specifically with your money or your life, or what Google calls YMYL, are held to greater scrutiny because they have the potential to influence the user's experience, be it to enhance it or diminish it. That is, any content that affects or can affect one's health, wealth, or wellness receives extra attention. Therefore, anything of a legal, financial, or medical nature, for instance, must demonstrate strong E-A-T signals.

Quality is in the eyes of the searcher

Now, what are those E-A-T signals? There are many mentioned throughout the 175-page QRG. It's impossible to cover all of them within the limited scope of this article and do them justice. But here are a few examples of the top signals human raters use when assessing the quality of the results Google's algorithms produce.

Among others, pages should:

  • Have a beneficial purpose;
  • Be mobile-friendly and accessible;
  • Have a favourable reputation;
  • Have positive reviews or ratings;
  • Offer a satisfying amount of content;
  • Provide the user with enough value;
  • Answer the user’s questions;
  • Adequately meet their needs;
  • Be free from any obfuscation;
  • Not hinder the user experience;
  • Have a secure connection; and,
  • Be updated regularly.

Among others, content should:

  • Possess original research;
  • Be factually correct or fact-checked;
  • Be linked to or cited by trusted sources;
  • Have proper quotes and sources;
  • Be up-to-date and maintained;
  • Have good grammar and spelling;
  • Not be copied from other sources;
  • Not be objectionable or offensive;
  • Not be harmful or deceptive;
  • Have an identifiable author;
  • List the author’s credentials; and,
  • Be verifiable or peer-reviewed.

For some, covering all the above can be a challenge, particularly "offering a satisfying amount of content" that "provides enough value." To them, that means offering as much information as possible. But that can backfire in some cases and detract from the user experience, which is counterproductive. The need for substantive content can vary from website to website and page to page, but a satisfying amount of content is a signal regardless.

To be clear, SEO has nothing to do with content length (e.g., word count). It has nothing to do with the content’s level of sophistication, either. Above all, it has nothing to do with keywords — although longer content will cover more topics and subtopics naturally, and therefore will have a higher number of keywords. But this is a byproduct of offering more content, not a goal in and of itself.

When most people try to apply some basic SEO, they think it’s about sprinkling popular keywords throughout a piece of content. But since Google’s Hummingbird Update in 2013, SEO is no longer about keywords. Better said, it’s no longer about strings of keywords but about their context. It’s about entities, i.e., topics your audience is interested in and asking questions about.

As Google says, it’s about “things, not strings.”
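
One way to see the "things, not strings" idea in practice is to run your copy through an entity-analysis tool. Here's a rough sketch, assuming Python with the requests library and the analyzeEntities method of Google's Cloud Natural Language API (you would need your own API key, and the response fields follow the API's public documentation). It surfaces the entities, or "things," a passage is actually about:

```python
import requests

# Google Cloud Natural Language API: entity analysis endpoint.
NL_ENDPOINT = "https://language.googleapis.com/v1/documents:analyzeEntities"

def extract_entities(text: str, api_key: str) -> list[dict]:
    """Return the entities Google detects in a piece of text, sorted by salience (topical weight)."""
    payload = {
        "document": {"type": "PLAIN_TEXT", "content": text},
        "encodingType": "UTF8",
    }
    response = requests.post(NL_ENDPOINT, params={"key": api_key}, json=payload, timeout=30)
    response.raise_for_status()
    entities = response.json().get("entities", [])
    return sorted(
        ({"name": e["name"], "type": e["type"], "salience": e["salience"]} for e in entities),
        key=lambda e: e["salience"],
        reverse=True,
    )

if __name__ == "__main__":
    sample = "SEO is evolving into search experience optimization, where quality content and UX dominate."
    for entity in extract_entities(sample, api_key="YOUR_API_KEY"):
        print(entity)
```

If the entities that surface aren't the topics you meant to cover, that's a hint your content is leaning on strings rather than things.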

Speaking of Hummingbird, our office doors have symbols that represent major Google updates, like Panda and Penguin. These updates have both shaken and shaped the world of SEO. This photo is the door to my office.

Office door at seoplus+ representing the Hummingbird SEO update from Google.

This was not intentional, as I inherited this space. But lucky for me, it's perfect: it represents one of my core philosophies around SEO, the shift from keywords to topics, and from what is being searched to why it's being searched. In other words, Hummingbird was the beginning of Google's attempt to understand the searcher's intent in order to provide more relevant results.

“Hummingbird may be seen as bridging the gap between old, spammy practices and modern SEO which seeks to speak the natural language of readers, using their own words.”

Moz.com

SEO boils down to only two things

Ultimately, modern SEO is about offering content that's relevant, helpful, and valuable to your users, and about making that content easy to find, understand, and, above all, use. Turning your attention to these two areas will make a world of difference in your search visibility.

Quite simply, SEO boils down to two essential things:

  1. The quality of your content, and
  2. The quality of your user experience.

Improving these two areas is modern SEO’s ultimate goal. It’s not to find some backdoor or shortcut to leapfrog your way into higher search engine results. Yes, SEO has many layers and facets, and there are many ways to peel a potato. But fundamentally, SEO boils down to quality content and quality UX. That’s it.

The better these two are, and the better they are in relation to other results that come up for the same queries, the better your chances of ranking higher. Moreover, and likely more important, the higher the quality of your content and experience, the higher the quality of your traffic and conversions. No need for keyword cramming, backlink begging, or coding callisthenics.

Simply offer great content your users want and a great user experience in consuming that content. It's really that simple. SEO is not as complex as people think. If it doesn't seem that simple, it's for three reasons:

  • SEO can be seen as complex as there are several hundreds of ranking factors and signals, and knowing and applying each one can be time-consuming and labour-intensive.

  • Some websites can also make it complex, such as applying SEO to a multilingual ecommerce store with thousands of products, which is going to be a lot different than applying SEO to a small blog.

  • Some SEO experts can make it seem more complex because Google’s algorithms are kept secret to prevent cheaters from gaming the system, so these experts give SEO a certain aura of mysticism that, according to them, only a few skilled search-engine soothsayers can decipher.

However, simple doesn’t mean easy. It does take a lot of research and a lot of work. For some, the process may be to listen to their audiences. For others, it may be to outdo their competitors. For most, it’s both. Focusing on a few vital areas will create the greatest impact.

The key to SEO success is to know what to focus on.

SEO and the law of the vital few

Commonly known as the 80/20 rule or the Pareto Principle, the “Law of The Vital Few and The Trivial Many” states that 20% of your efforts will yield 80% of your results. Similarly, 20% of your SEO will yield 80% of your results, while 80% of your SEO efforts will yield only 20%.

What this means is that there’s no need to focus on everything. The goal is to focus on what matters. And what matters in modern SEO is quality. Google is making that quite clear. So the more you focus on improving the quality of your content and user experience, the more you will improve the quality of your results — such as the quality of the visitors you attract and the quality of their interactions.

This doesn’t mean doing less or ignoring other forms of SEO. It may involve a number of traditional SEO strategies and tactics. But every activity will strive for or revolve around improving those quality signals.

For example, do you stop doing keyword research? Of course not. But keyword research is less about knowing what to include in your content and more about learning what your audience is interested in, wants to know more about, and is looking for, and how to give it to them in the best way possible.

Again, this is not an easy process. Improving your quality signals takes work, research, and know-how. Any person with knowledge of search engine optimization can do SEO. But what makes someone an SEO specialist is not her ability to apply SEO but her ability to know where to apply it. For example, she will:

  • Learn about your audience and the questions they’re asking;
  • Conduct topical research around the answers to those questions;
  • Perform competitive analyses to see how others answer them;
  • Identify content gaps and opportunities to deliver better answers;
  • Develop content strategies that will offer those better answers;
  • Apply technical tweaks to ensure Google can find your answers;
  • Structure pages and content so search engines can better understand them;
  • Amplify your answers so that they show their value and relevance;
  • And so much more.

Let’s not forget everything an SEO specialist does outside of her work to make sure her abilities are always up to par, such as staying on top of any new advancements and changes in the search industry; mastering her craft by constantly improving her skills, knowledge, and tools; and flexing her out-of-the-box SEO thinking so she’s always looking for ways to better help her clients and their users.

In short, the SEO specialist is a consummate professional.

As the story goes, a mechanic fixed a car by giving the engine a couple of taps with her hammer. When she gave her client a $1,000 invoice, the client protested: "But you just tapped it with a hammer!" So the mechanic provided a more detailed invoice, which now read: "Tapping engine with a hammer: $1. Knowing where to tap: $999."

SEO has come a long way since its inception, going from writing file descriptions submitted to web archives to help users find them, to writing the best answers to user questions and providing the most delightful experience in retrieving those answers. Eventually, SEO will no longer be about rankings at all, and there will no longer be a need to optimize for them.

As Google continues to evolve and the intelligence of its algorithms keeps growing at an accelerated pace, what we will be witnessing is Google essentially removing itself from the search equation. (We already are, by the way.)

By becoming better at processing and understanding natural language, Google is becoming more human-like. It's also thinking more like its users. As it predicts and meets the needs of its users with near laser-like accuracy, it will no longer be necessary to optimize for the search engine. In fact, Google has been saying for ages to stop focusing on search engines and to focus on your users instead. Give them what they want. The more you do, and the better you are at doing it, the more visible you will be.

Enter search experience optimization

In the same way, search engine optimization is evolving: the emphasis on search engines and optimizing for their results is dwindling, while the emphasis on users and meeting their needs is taking over.

Stated differently, it’s about optimizing for the user’s search experience, from the moment they enter a search query to the moment they get the answers they need. Or as SEO and UX expert Izzi Smith once noted, it’s optimizing for the user’s experience from “SERP to satisfaction.”

That’s why I prefer “search experience optimization” or SXO.

SXO is not a new concept. Creating quality content on a website that delivers a quality user experience has been a topic of discussion among SEO professionals since the early 2010s. But it has grown into a more holistic approach to SEO than just optimizing for search engine inclusion or positions.

SXO goes beyond matching the user's query and meeting their needs. It also tries to meet the user's expectations. To do that, today's SEO professional must ask:

  • What questions are users asking?
  • What answers will best satisfy them?
  • In what format do they want those answers?
  • For what purpose will they use those answers?
  • Are there any subsequent questions they might ask?

Creating content around topics that match the user's query is one thing. Improving how well it matches the user's expectations is another. Exceeding those expectations by preemptively answering all their questions is yet another. That's where SXO comes in. It takes more than understanding what people are searching for. It's also understanding how they search and why they're searching for it.

Google uses a number of different metrics (to what degree, we can't be 100% certain) to learn about the performance of their search results and to refine them. They have publicly said they track behaviours such as on-page interactions and other metrics in their attempts to deliver a better search experience for their users. It would not be surprising if they use these metrics to learn about the quality of their results, too.

For example, they likely consider things like clickthrough rates (i.e., how often a search result gets clicked), dwell times (i.e., how long people stay on a page after clicking a search result), pogosticking (i.e., when users click a search result and return to the search engine because it didn’t satisfy their needs), and a host of other metrics like these. They’re often called “short clicks,” “long clicks,” “last clicks,” “next clicks,” etc.
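
To make those behavioural signals concrete, here's a small illustrative sketch in Python. The log format and the numbers are hypothetical, invented for this example; it simply computes a clickthrough rate, an average dwell time, and a crude pogosticking count for one search result:

```python
from dataclasses import dataclass

@dataclass
class ClickEvent:
    """One click on a search result (hypothetical log format, for illustration only)."""
    query: str
    url: str
    dwell_seconds: float    # time spent on the page before returning to the SERP or leaving
    returned_to_serp: bool  # True if the user bounced back to the results page

def summarize(impressions: int, clicks: list[ClickEvent]) -> dict:
    """Compute illustrative search-experience metrics for one result."""
    ctr = len(clicks) / impressions if impressions else 0.0
    avg_dwell = sum(c.dwell_seconds for c in clicks) / len(clicks) if clicks else 0.0
    # A crude "pogostick": a quick return to the results page after a short visit.
    pogosticks = sum(1 for c in clicks if c.returned_to_serp and c.dwell_seconds < 10)
    return {"ctr": ctr, "avg_dwell_seconds": avg_dwell, "pogosticks": pogosticks}

if __name__ == "__main__":
    clicks = [
        ClickEvent("what is sxo", "https://example.com/sxo-guide", 185.0, False),  # a "long click"
        ClickEvent("what is sxo", "https://example.com/sxo-guide", 6.5, True),     # a "short click"
    ]
    print(summarize(impressions=120, clicks=clicks))
```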

How much Google uses implicit feedback like this is still a matter of debate. But suppose Google doesn't factor implicit feedback into its rankings at all, which is unlikely. Delivering a better search experience will still improve your results, because it will lead to better, stickier traffic, and higher-converting traffic, too.

Optimize for the search experience

So how do you perform SXO?

In many cases, the answer is SEO's hackneyed retort to the same question: "It depends." However, there are some important steps that will lead you in the right direction.

First, a piece of content should adequately match both the user's level of intent and their level of awareness. Both are important and can vary from search query to search query. In other words, ensure that your content satisfies the search by matching the kind of search the user is making, and does so in the way they want.

Level of search intent

Here’s what search intent means:

  • Informational search (“I want to know”)
  • Investigative search (“I want to go over”)
  • Transactional search (“I want to do”)
  • Navigational search (“I want to go”)

An informational search is where the user is only looking for information. They’re asking general questions and they may be looking for more information about their situation, issue, problem, or challenge. They’re only researching and not seeking to do anything specific. For example, “How do I increase traffic to my website?” Or, “What is SEO?”

On the other hand, an investigative search is where the user is looking to do something but needs to go over more information before doing so. It’s also known as “commercial intent” or “commercial investigation intent,” because the searcher is often shopping around and investigating different solutions before they make a decision. For example, “best digital marketing agency” or “SEO vs PPC.”

A transactional search is where the user wants to do something. The search is related to taking that action and has a higher search intent because it’s more specific. For example, it can be to buy, call, download, join, order, register, subscribe, and so on. Typically, these search queries include products or services, industries, or locations, such as “quote for SEO services” or “PPC services near me.”

Finally, navigational search has the highest intent. The user is trying to locate something specific, usually a website, destination, address, or brand name. For example, "seoplus Ottawa" or "Brock Murray LinkedIn."
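
Intent can't be detected perfectly from a query alone, but even a crude heuristic helps when sorting a keyword list by intent. Here's a rough illustrative sketch in Python; the cue words and classification rules are my own simplifications, not anything Google publishes:

```python
# Crude keyword cues for each intent level (illustrative only; real intent is far messier).
INTENT_CUES = {
    "transactional": ["buy", "quote", "order", "subscribe", "download", "near me", "pricing"],
    "investigative": ["best", " vs ", "review", "compare", "top", "alternatives"],
    "informational": ["what is", "how do", "how to", "why", "guide"],
}

def classify_intent(query: str, known_brands: set[str] | None = None) -> str:
    """Guess the intent level of a search query using simple keyword cues."""
    q = f" {query.lower()} "
    # Queries containing a known brand or site name are treated as navigational.
    if any(brand in q for brand in (known_brands or set())):
        return "navigational"
    for intent, cues in INTENT_CUES.items():
        if any(cue in q for cue in cues):
            return intent
    return "informational"  # default when no cue matches

if __name__ == "__main__":
    brands = {"seoplus"}
    for query in ["what is seo", "seo vs ppc", "quote for seo services", "seoplus ottawa"]:
        print(f"{query!r} -> {classify_intent(query, brands)}")
```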

Level of search awareness

Depending on the user’s level of awareness, the kind of answer that will best satisfy their query is one that will mirror their understanding of the situation as closely as possible. The content will not only match what they want but also meet them where they are.

There are five stages of awareness:

  1. Unaware (they don’t know they have a problem, issue, or pain point).
  2. Problem-aware (they know they have a problem but not any solutions).
  3. Solution-aware (they know there are solutions but not anything specific).
  4. Product-aware (they know specific solutions, including yours).
  5. Most aware (they know a lot about your solution).

The first is the lowest awareness stage. It takes education for the user to realize they have a problem, as it may not be readily apparent to them. The second is where the user knows about their problem, but it's either not a priority or they're not aware of any solutions. When they become solution-aware, they have an idea of the different solutions or options available. By the time they become product-aware, they know specific solutions and solution providers. Finally, when they're most aware, they know your solution and know a lot about it.

What does this mean? Simply, it means content should not only satisfy a user's search intent and meet their needs but also consider their journey. It means providing supporting information that answers other related questions they may have, based on their current level of awareness.

Google’s machine-learning algorithms, like the newly introduced MUM (or “multitask unified model”), are becoming so sophisticated to the point where they’re able to understand the various implied meanings of a query. By doing so, they can predict the nuances of a topic and present users with multiple content paths to consider.

Say you offer a piece of content that’s educational and the user does an informational search on the topic covered. But if it only answers the query and doesn’t cover all the bases because it doesn’t take into account the user’s search awareness, chances are the search engines will favour other content pieces that do a better job.

Or let’s say you have a piece of content that’s commercial in nature, and the user does an investigative search that your piece specifically answers. It may do a good job at answering it, but it should answer any additional queries the user might have along their journey. Users want to fully understand the topic to the point where they may need to do less research, which in turn provides them with significantly more value.

For example, if a user searches for the “best digital marketing agency,” a content piece that answers that question is sufficient. But the user is likely doing research and might have more questions. It may seem counterproductive, but if the content piece offers alternatives, provides comparisons, weighs the pros and cons, answers objections, describes the next steps, and so on, then it will offer more value and may even outperform the competition.

In short, SEO should think of and for the user.

The quality of your content and user experience can affect a user's perception of you. For example, the quality of the page experience can influence how people perceive the quality of the content, and even the quality of your products and services. Conversely, poor content may communicate that your business is poor, too.

A negative perception is called the “horn effect” whereas the opposite is called the “halo effect.” This effect can be subtle but it can also be inexorable. If it’s negative, the damage can be difficult to repair. Savvy SEOs understand that this effect starts with the search experience.

For example, a poor search experience will pervade other areas, including the decision to engage with your website, interact with your content, buy from your business, and even recommend you to others. If users bounce back to Google, it may be seen as a sign that your website is not engaging enough or doesn't provide enough additional content to encourage further exploration.

Search experience is more than just having a usable website. It starts the moment users search for, find, and consume your information, and it doesn't stop until they do.

As Cyrus Shepard once noted, “Be first, be long, and be last.” Be the first search result they click on by offering the right topic and description that answers the user’s question. Offer quality content that satisfies their search along with a delightful experience while consuming that content. Finally, provide enough information that covers all the bases so there’s no reason to return to Google.

Optimize for the search experience with the end in mind. Literally.

In the final analysis, the key is to provide the most relevant and helpful content to the user. Since the early days of BackRub, search engines have wanted the same thing: to provide users with the most relevant and helpful results. If a result leads to a page that meets users' needs and the content is in sync with them, the page will become more visible on the search engines — particularly to the users that count.

Because SEO is an industry in which most people focus on keywords and backlinks, it's open to manipulation, such as spamming backlinks and stuffing keywords into content with no value. Fortunately, Google is catching up to these deceptive practices and changing the SEO landscape — a landscape that demands we focus on users rather than on search engines; one that demands quality over quantity.


By Michel Fortin

Michel Fortin is a digital marketing advisor specializing in SEO, communications, and strategy. For over 30 years, he has helped hundreds of thousands of clients increase their visibility and their revenue. He is also the VP of Digital Marketing at Musora, the company behind the award-winning platforms Drumeo and Pianote. He is the author of the More Traffic Memo™ SEO email newsletter.