wickedbananas
SEO WordPress and Marketing Tips For Photographers
689 posts
Search engine optimization can be difficult for photographers to grasp - this Tumblr blog pulls all the best info together into one place
wickedbananas · 7 years ago
Text
SEO Negotiation: How to Ace the Business Side of SEO - Whiteboard Friday
Posted by BritneyMuller
SEO isn't all meta tags and content. A huge part of the success you'll see is tied up in the inevitable business negotiations. In this week's Whiteboard Friday, our resident expert Britney Muller walks us through a bevy of smart tips and considerations that will strengthen your SEO negotiation skills, whether you're a seasoned pro or a newbie to the practice.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Video Transcription
Hey, Moz fans. Welcome to another edition of Whiteboard Friday. So today we are going over all things SEO negotiation, so starting to get into some of the business side of SEO. As most of you know, negotiation is all about leverage.
It's what you have to offer and what the other side is looking to gain and leveraging that throughout the process. So something that you can go in and confidently talk about as SEOs is the fact that SEO has around 20% more opportunity than both mobile and desktop PPC combined.
This is a really, really big deal. It's something that you can showcase. These are the stats to back it up. We will also link to the research on this down below. It's good to have that in your back pocket. Aside from this, you will obviously have your audit. Say you have a potential client and you're looking to get this deal.
Get the most out of the SEO audit
☑ Highlight the opportunities, not the screw-ups
You're going to do an audit, and something that I have always suggested is that instead of highlighting the things that the potential client is doing wrong or has screwed up, you really highlight the opportunities. Start to get them excited about what their site is capable of and what you could help them with. I think that sheds a really positive light and moves you in the right direction.
☑ Explain their competitive advantage
I think this is really interesting in many spaces where you can sort of say, "Okay, your competitors are here, and you're currently here and this is why," and show them proof. That makes them feel as though you have a strong understanding of the landscape and can sort of help them get there.
☑ Emphasize quick wins
I almost didn't put this in here because I think quick wins is sort of a sketchy term. Essentially, you really do want to showcase what it is you can do quickly, but you want to...
☑ Under-promise, over-deliver
You don't want to lose trust or credibility with a potential client by overpromising something that you can't deliver. Get off to the right start. Under-promise, over-deliver.
Smart negotiation tactics
☑ Do your research
Know everything you can about this client. Perhaps look into what deals they've done in the past and what agencies they've worked with. You can gather all sorts of knowledge before going into the negotiation that will really help you.
☑ Prioritize your terms
So all too often, people go into a negotiation thinking me, me, me, me, when really you also need to be thinking about, "Well, what am I willing to lose? What can I give up to reach a point that we can both agree on?" Really important to think about as you go in.
☑ Flinch!
This is a very old, funny negotiation tactic where when the other side counters, you flinch. You do this like flinch, and you go, "Oh, is that the best you can do?" It's super silly. It might be used against you, in which case you can just say, "Nice flinch." But it does tend to help you get better deals.
So take that with a grain of salt. But I look forward to your feedback down below. It's so funny.
☑ Use the words "fair" and "comfortable"
The words "fair" and "comfortable" do really well in negotiations. These words are inarguable. You can't argue with fair. "I want to do what is comfortable for us both. I want us both to reach terms that are fair."
You want to use these terms to put the other side at ease and to also help bridge that gap where you can come out with a win-win situation.
☑ Never be the key decision maker
I see this all too often when people go off on their own, and instantly on their business cards and in their head and email they're the CEO.
They are this. You don't have to be that, and you sort of lose leverage when you are. When I owned my agency for six years, I enjoyed not being CEO. I liked having a board of directors that I could reach out to during a negotiation and not being the sole decision maker. Even if you feel that you are the sole decision maker, I know that there are people that care about you and that are looking out for your business that you could contact as sort of a business mentor, and you could use that in negotiation. You can use that to help you. Something to think about.
Tips for negotiation newbies
So for the newbies, a lot of you are probably like, "I can never go on my own. I can never do these things." I'm from northern Minnesota. I have been super awkward about discussing money my whole life for any sort of business deal. If I could do it, I promise any one of you watching this can do it.
☑ Power pose!
I'm not kidding, promise. One tip that I learned when I had my agency was to power pose before negotiations. There's a great TED talk on this that we can link to down below. I do this before most of my big speaking gigs, thanks to my gramsy, who told me to do this at SMX Advanced like three years ago.
Go ahead and power pose. Feel good. Feel confident. Amp yourself up.
☑ Walk the walk
You've got to walk the walk when it comes to some of these things and just feel comfortable in that space.
☑ Good > perfect
Know that good is better than perfect. A lot of us are perfectionists, and we just have to execute good. Trying to be perfect will kill us all.
☑ Screw imposter syndrome
Many of the speakers that I go on different conference circuits with all struggle with this. It's totally normal, but it's good to acknowledge that it's so silly. So to try to take that silly voice out of your head and start to feel good about the things that you are able to offer.
Take inspiration where you can find it
I highly suggest you check out Brian Tracy's old-school negotiation podcasts. He has some old videos. They're so good. He talks about leverage all the time and has two really great examples that I love so much. One is jade merchants: these merchants would take out pieces of jade one at a time and watch people's reactions as each piece came out.
So they knew which piece interested the person the most, and that piece would command the higher price. It was brilliant. Then, on time constraints, he has an example of people doing business deals in China. When they landed, their hosts would greet them and say, "Oh, can I see your return flight ticket? I just want to know when you're leaving."
They would not make a deal until that last second. The more you know about some of these leverage tactics, the more you can be aware of them if they were to be used against you or if you were to leverage something like that. Super interesting stuff.
Take the time to get to know their business
☑ Tie in ROI
Lastly, just really take the time to get to know someone's business. It just shows that you care, and you're able to prioritize what it is that you can deliver based on where they make the most money off of the products or services that they offer. That helps you tie in the ROI of the things that you can accomplish.
☑ Know the order of products/services that make them the most money
One real quick example was my previous company. We worked with plastic surgeons, and we really worked hard to understand that funnel of how people decide to get any sort of elective procedure. It came down to two things.
It was before and after photos and price. So we knew that we could optimize for those two things and do very well in their space. So showing that you care, going the extra mile, sort of tying all of these things together, I really hope this helps. I look forward to the feedback down below. I know this was a little bit different Whiteboard Friday, but I thought it would be a fun topic to cover.
So thank you so much for joining me on this edition of Whiteboard Friday. I will see you all soon. Bye.
Video transcription by Speechpad.com
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2PxrYSF via IFTTT
wickedbananas · 7 years ago
Text
NEW On-Demand Crawl: Quick Insights for Sales, Prospecting, & Competitive Analysis
Posted by Dr-Pete
In June of 2017, Moz launched our entirely rebuilt Site Crawl, helping you dive deep into crawl issues and technical SEO problems, fix those issues in your Moz Pro Campaigns (tracked websites), and monitor weekly for new issues. Many times, though, you need quick insights outside of a Campaign context, whether you're analyzing a prospect site before a sales call or trying to assess the competition.
For years, Moz had a lab tool called Crawl Test. The bad news is that Crawl Test never made it to prime-time and suffered from some neglect. The good news is that I'm happy to announce the full launch (as of August 2018) of On-Demand Crawl, an entirely new crawl tool built on the engine that powers Site Crawl, but with a UI designed around quick insights for prospecting and competitive analysis.
While you don’t need a Campaign to run a crawl, you do need to be logged into your Moz Pro subscription. If you don’t have a subscription, you can sign up for a free trial and give it a whirl.
How can you put On-Demand Crawl to work? Let's walk through a short example together.
All you need is a domain
Getting started is easy. From the "Moz Pro" menu, find "On-Demand Crawl" under "Research Tools":
Just enter a root domain or subdomain in the box at the top and click the blue button to kick off a crawl. While I don't want to pick on anyone, I've decided to use a real site. Our recent analysis of the August 1st Google update identified some sites that were hit hard, and I've picked one (lilluna.com) from that list.
Please note that Moz is not affiliated with Lil' Luna in any way. For the most part, it seems to be a decent site with reasonably good content. Let's pretend, just for this post, that you're looking to help this site out and determine if they'd be a good fit for your SEO services. You've got a call scheduled and need to spot-check for any major problems so that you can go into that call as informed as possible.
On-Demand Crawls aren't instantaneous (crawling is a big job), but they'll generally finish between a few minutes and an hour. We know these are time-sensitive situations. You'll soon receive an email that looks like this:
The email includes the number of URLs crawled (On-Demand will currently crawl up to 3,000 URLs), the total issues found, and a summary table of crawl issues by category. Click on the [View Report] link to dive into the full crawl data.
Assess critical issues quickly
We've designed On-Demand Crawl to assist your own human intelligence. You'll see some basic stats at the top, but then immediately move into a graph of your top issues by count. The graph only displays issues that occur at least once on your site – you can click "See More" to show all of the issues that On-Demand Crawl tracks (the top two bars have been truncated)...
Issues are also color-coded by category. Some items are warnings, and whether they matter depends a lot on context. Other issues, like "Critical Errors" (in red), almost always demand attention. So, let's check out those 404 errors. Scroll down and you'll see a list of "Pages Crawled" with filters. You're going to select "4xx" in the "Status Codes" dropdown...
You can then pretty easily spot-check these URLs and find out that they do, in fact, seem to be returning 404 errors. Some appear to be legitimate content that has either internal or external links (or both). So, within a few minutes, you've already found something useful.
Let's look at those yellow "Meta Noindex" errors next. This is a tricky one, because you can't easily determine intent. An intentional Meta Noindex may be fine. An unintentional one (or hundreds of unintentional ones) could be blocking crawlers and causing serious harm. Here, you'll filter by issue type...
Like the top graph, issues appear in order of prevalence. You can also filter by all pages that have issues (any issues) or pages that have no issues. Here's a sample of what you get back (the full table also includes status code, issue count, and an option to view all issues)...
Notice the "?s=" common to all of these URLs. Clicking on a few, you can see that these are internal search pages. These URLs have no particular SEO value, and the Meta Noindex is likely intentional. Good technical SEO is also about avoiding false alarms because you lack internal knowledge of a site. On-Demand Crawl helps you semi-automate and summarize insights to put your human intelligence to work quickly.
Dive deeper with exports
Let's go back to those 404s. Ideally, you'd like to know where those URLs are showing up. We can't fit everything into one screen, but if you scroll up to the "All Issues" graph you'll see an "Export CSV" option...
The export will honor any filters set in the page list, so let's re-apply that "4xx" filter and pull the data. Your export should download almost immediately. The full export contains a wealth of information, but I've zeroed in on just what's critical for this particular case...
Now, you know not only what pages are missing, but exactly where they link from internally, and can easily pass along suggested fixes to the customer or prospect. Some of these turn out to be link-heavy pages that could probably benefit from some clean-up or updating (if newer recipes are a good fit).
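If you prefer to work with the export programmatically, a few lines of pandas reproduce the same view. This is a minimal sketch only; the file name and column headers are assumptions for illustration rather than Moz's actual export schema, so adjust them to whatever your CSV contains:

```python
import pandas as pd

# Hypothetical file and column names -- check your actual export headers.
crawl = pd.read_csv("on-demand-crawl-export.csv")

# Keep only client-error pages (the "4xx" filter from the UI).
broken = crawl[crawl["Status Code"].between(400, 499)]

# See which internal pages link to each broken URL.
print(broken[["URL", "Referring URL"]].head(20))
```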
Let's try another one. You've got 8 duplicate content errors. Potentially thin content could fit theories about the August 1st update, so this is worth digging into. If you filter by "Duplicate Content" issues, you'll see the following message...
The 8 duplicate issues actually represent 18 pages, and the table returns all 18 affected pages. In some cases, the duplicates will be obvious from the title and/or URL, but in this case there's a bit of mystery, so let's pull that export file. In this case, there's a column called "Duplicate Content Group," and sorting by it reveals something like the following (there's a lot more data in the original export file)...
I've renamed "Duplicate Content Group" to just "Group" and included the word count ("Words"), which could be useful for verifying true duplicates. Look at group #7 – it turns out that these "Weekly Menu Plan" pages are very image heavy and have a common block of text before any unique text. While not 100% duplicated, these otherwise valuable pages could easily look like thin content to Google and represent a broader problem.
Real insights in real-time
Not counting the time spent writing the blog post, running this crawl and diving in took less than an hour, and even that small amount of time uncovered more potential issues than I could cover in this post. In less than an hour, you can walk into a client meeting or sales call with in-depth knowledge of any domain.
Keep in mind that many of these features also exist in our Site Crawl tool. If you're looking for long-term, campaign insights, use Site Crawl (if you just need to update your data, use our "Recrawl" feature). If you're looking for quick, one-time insights, check out On-Demand Crawl. Standard Pro users currently get 5 On-Demand Crawls per month (with limits increasing at higher tiers).
Your On-Demand Crawls are currently stored for 90 days. When you re-enter the feature, you'll see a table of all of your recent crawls (the image below has been truncated):
Click on any row to go back to see the crawl data for that domain. If you get the sale and decide to move forward, congratulations! You can port that domain directly into a Moz campaign.
We hope you'll try On-Demand Crawl out and let us know what you think. We'd love to hear your case studies, whether it's sales, competitive analysis, or just trying to solve the mysteries of a Google update.
from The Moz Blog https://ift.tt/2OTfW53 via IFTTT
wickedbananas · 7 years ago
Text
Do You Need Local Pages? - Whiteboard Friday
Posted by Tom.Capper
Does it make sense for you to create local-specific pages on your website? Regardless of whether you own or market a local business, it may make sense to compete for space in the organic SERPs using local pages. Please give a warm welcome to our friend Tom Capper as he shares a 4-point process for determining whether local pages are something you should explore in this week's Whiteboard Friday!
Click on the whiteboard image above to open a high-resolution version in a new tab!
Video Transcription
Hello, Moz fans. Welcome to another Whiteboard Friday. I'm Tom Capper. I'm a consultant at Distilled, and today I'm going to be talking to you about whether you need local pages. Just to be clear right off the bat what I'm talking about, I'm not talking about local rankings as we normally think of them, the local map pack results that you see in search results, the Google Maps rankings, that kind of thing.
A 4-step process to deciding whether you need local pages
I'm talking about conventional, 10 blue links rankings but for local pages, and by local pages I mean pages from a national or international business that are location-specific. What are some examples of that? Maybe on Indeed.com they would have a page for jobs in Seattle. Indeed doesn't have a bricks-and-mortar premises in Seattle, but they do have a page that is about jobs in Seattle.
You might get a similar thing with flower delivery. You might get a similar thing with used cars, all sorts of different verticals. I think it can actually be quite a broadly applicable tactic. There's a four-step process I'm going to outline for you. The first step is actually not on the board. It's just doing some keyword research.
1. Know (or discover) your key transactional terms
I haven't done much on that here because hopefully you've already done that. You already know what your key transactional terms are. Whatever happens, you don't want to end up developing location pages for too many different keyword types, because it's going to bloat your site; you probably just need to pick one or two key transactional terms that you're going to create the local variants of. For this purpose, I'm going to talk through an SEO job board as an example.
2. Categorize your keywords as implicit, explicit, or near me and log their search volumes
We might have "SEO jobs" as our core head term. We then want to figure out what the implicit, explicit, and near me versions of that keyword are and what the different volumes are. In this case, the implicit version is probably just "SEO jobs." If you search for "SEO jobs" now, like if you open a new tab in your browser, you're probably going to find that a lot of local orientated results appear because that is an implicitly local term and actually an awful lot of terms are using local data to affect rankings now, which does affect how you should consider your rank tracking, but we'll get on to that later.
SEO jobs, maybe SEO vacancies, that kind of thing, those are all going to be going into your implicitly local terms bucket. The next bucket is your explicitly local terms. That's going to be things like SEO jobs in Seattle, SEO jobs in London, and so on. You're never going to get a complete coverage of different locations. Try to keep it simple.
You're just trying to get a rough idea here. Lastly you've got your near me or nearby terms, and it turns out that for SEO jobs not many people search SEO jobs near me or SEO jobs nearby. This is also going to vary a lot by vertical. I would imagine that if you're in food delivery or something like that, then that would be huge.
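As a rough illustration of this bucketing step, here's a minimal Python sketch. The location list and example keywords are invented for the demo; in practice you'd feed in your own keyword research export:

```python
# Hypothetical location list -- swap in the cities you actually care about.
CITIES = {"seattle", "london", "new york"}

def bucket(keyword: str) -> str:
    """Sort a keyword into the implicit / explicit / near-me groups."""
    kw = keyword.lower()
    if "near me" in kw or "nearby" in kw:
        return "near me"
    if any(city in kw for city in CITIES):
        return "explicit"
    # No location mentioned, but the term may still have local intent.
    return "implicit"

for kw in ["SEO jobs", "SEO jobs in Seattle", "SEO jobs near me"]:
    print(f"{kw} -> {bucket(kw)}")
```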
3. Examine the SERPs to see whether local-specific pages are ranking
Now we've categorized our keywords. We want to figure out what kind of results are going to do well for what kind of keywords, because obviously if local pages are the answer, then we might want to build some.
In this case, I'm looking at the SERP for "SEO jobs." This is imaginary. The rankings don't really look like this. But we've got SEO jobs in Seattle from Indeed. That's an example of a local page, because this is a national business with a location-specific page. Then we've got SEO jobs Glassdoor. That's a national page, because in this case they're not putting anything on this page that makes it location specific.
Then we've got SEO jobs Seattle Times. That's a local business. The Seattle Times only operates in Seattle. It probably has a bricks-and-mortar location. If you're going to be pulling a lot of data of this type, maybe from a rank tracking tool or something like that, obviously tracking from the locations that you're mentioning, then you're probably going to want to categorize these at scale rather than going through one at a time.
I've drawn up a little flowchart here that you could encapsulate in an Excel formula or something like that. If the location is mentioned in the URL and in the domain, then we know we've got a local business. Most of the time, anyway; it's just a rule of thumb. If the location is mentioned in the URL but not in the domain, then we know we've got a local page, and so on.
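If you'd rather encapsulate that flowchart in code than in an Excel formula, a sketch like the following applies the same rule of thumb. The example URLs are made up to mirror the Indeed / Glassdoor / Seattle Times cases above:

```python
from urllib.parse import urlparse

def classify_result(url: str, location: str) -> str:
    """Apply the flowchart: location in the domain -> local business;
    location in the path only -> local page; otherwise national page."""
    loc = location.lower().replace(" ", "")
    parsed = urlparse(url.lower())
    if loc in parsed.netloc:
        return "local business"   # e.g. a local newspaper
    if loc in parsed.path:
        return "local page"       # national site, location-specific page
    return "national page"

# Hypothetical URLs for illustration:
for url in ["https://www.indeed.com/jobs/seo/seattle",
            "https://www.glassdoor.com/seo-jobs",
            "https://www.seattletimes.com/seo-jobs"]:
    print(url, "->", classify_result(url, "Seattle"))
```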
4. Compare & decide where to focus your efforts
You can just sort of categorize at scale all the different result types that we've got. Then we can start to fill out a chart like this using the rankings. What I'd recommend doing is finding a click-through rate curve that you are happy to use. You could go to somewhere like AdvancedWebRanking.com, download some example click-through rate curves.
Again, this doesn't have to be super precise. We're looking to get a proportionate directional indication of what would be useful here. I've got Implicit, Explicit, and Near Me keyword groups. I've got Local Business, Local Page, and National Page result types. Then I'm just figuring out what the visibility share of all these types is. In my particular example, it turns out that for explicit terms, it could be worth building some local pages.
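Here's a hedged sketch of that visibility-share calculation. The CTR curve and the SERP below are invented numbers purely for illustration; you'd substitute the curve you downloaded and your own rank-tracking data:

```python
# Hypothetical CTR curve: estimated share of clicks by ranking position.
CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
       6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.015}

def visibility_share(serp):
    """serp: list of (position, result_type) rows for one keyword group.
    Returns each result type's share of estimated organic clicks."""
    clicks = {}
    for position, result_type in serp:
        clicks[result_type] = clicks.get(result_type, 0) + CTR.get(position, 0)
    total = sum(clicks.values())
    return {t: round(c / total, 2) for t, c in clicks.items()}

# Invented top-5 results for the "explicit" keyword group:
explicit_serp = [(1, "local page"), (2, "national page"),
                 (3, "local business"), (4, "local page"), (5, "local page")]
print(visibility_share(explicit_serp))
```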
That's all. I'd love to hear your thoughts in the comments. Thanks.
Video transcription by Speechpad.com
from The Moz Blog https://ift.tt/2BuztqP via IFTTT
wickedbananas · 7 years ago
Text
Ranking the 6 Most Accurate Keyword Research Tools
Posted by Jeff_Baker
In January of 2018, Brafton began a massive organic keyword targeting campaign, amounting to over 90,000 words of published blog content.
Did it work?
Well, yeah. We doubled the number of total keywords we rank for in less than six months. By using our advanced keyword research and topic writing process published earlier this year, we also increased our organic traffic by 45% and the number of keywords ranking in the top ten results by 130%.
But we got a whole lot more than just traffic.
From planning to execution and performance tracking, we meticulously logged every aspect of the project. I’m talking blog word count, MarketMuse performance scores, on-page SEO scores, days indexed on Google. You name it, we recorded it.
As a byproduct of this nerdery, we were able to draw juicy correlations between our target keyword rankings and variables that can affect and predict those rankings. But specifically for this piece...
How well keyword research tools can predict where you will rank.
A little background
We created a list of keywords we wanted to target in blogs based on optimal combinations of search volume, organic keyword difficulty scores, SERP crowding, and searcher intent.
We then wrote a blog post targeting each individual keyword. We intended for each new piece of blog content to rank for the target keyword on its own.
With our keyword list in hand, my colleague and I manually created content briefs explaining how we would like each blog post written to maximize the likelihood of ranking for the target keyword. Here’s an example of a typical brief we would give to a writer:
This image links to an example of a content brief Brafton delivers to writers.
Between mid-January and late May, we ended up writing 55 blog posts each targeting 55 unique keywords. 50 of those blog posts ended up ranking in the top 100 of Google results.
We then paused and took a snapshot of each URL’s Google ranking position for its target keyword and its corresponding organic difficulty scores from Moz, SEMrush, Ahrefs, SpyFu, and KW Finder. We also took the PPC competition scores from the Keyword Planner Tool.
Our intention was to draw statistical correlations between our keyword rankings and each tool’s organic difficulty score. With this data, we were able to report on how accurately each tool predicted where we would rank.
This study is uniquely scientific, in that each blog had one specific keyword target. We optimized the blog content specifically for that keyword. Therefore every post was created in a similar fashion.
Do keyword research tools actually work?
We use them every day, on faith. But has anyone ever actually asked, or better yet, measured how well keyword research tools report on the organic difficulty of a given keyword?
Today, we are doing just that. So let’s cut through the chit-chat and get to the results...
While Moz wins top-performing keyword research tool, note that any keyword research tool with organic difficulty functionality will give you an advantage over flipping a coin (or using Google Keyword Planner Tool).
As you will see in the following paragraphs, we have run each tool through a battery of statistical tests to ensure that we painted a fair and accurate representation of its performance. I’ll even provide the raw data for you to inspect for yourself.
Let’s dig in!
The Pearson Correlation Coefficient
Yes, statistics! For those of you currently feeling panicked and lobbing obscenities at your screen, don’t worry — we’re going to walk through this together.
In order to understand the relationship between two variables, our first step is to create a scatter plot chart.
Below is the scatter plot for our 50 keyword rankings compared to their corresponding Moz organic difficulty scores.
We start with a visual inspection of the data to determine if there is a linear relationship between the two variables. Ideally for each tool, you would expect to see the X variable (keyword ranking) increase proportionately with the Y variable (organic difficulty). Put simply, if the tool is working, the higher the keyword difficulty, the less likely you will rank in a top position, and vice-versa.
This chart is all fine and dandy, however, it’s not very scientific. This is where the Pearson Correlation Coefficient (PCC) comes into play.
Phew. Still with me?
So each of these scatter plots will have a corresponding PCC score that will tell us how well each tool predicted where we would rank, based on its keyword difficulty score.
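For the curious, computing a PCC is a one-liner with scipy. The rankings and difficulty scores below are toy numbers, not the study's data:

```python
from scipy.stats import pearsonr

# Toy data: each post's ranking position vs. the tool's difficulty score.
rankings   = [3, 7, 12, 18, 25, 31, 44, 52, 60, 75]
difficulty = [28, 31, 35, 40, 38, 47, 52, 55, 61, 66]

r, p = pearsonr(rankings, difficulty)
print(f"PCC = {r:.3f}, p-value = {p:.4f}")  # a strong positive relationship
```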
We will use the following table from statisticshowto.com to interpret the PCC score for each tool:
Coefficient Correlation R Score | Key
.70 or higher | Very strong positive relationship
.40 to +.69 | Strong positive relationship
.30 to +.39 | Moderate positive relationship
.20 to +.29 | Weak positive relationship
.01 to +.19 | No or negligible relationship
0 | No relationship [zero correlation]
-.01 to -.19 | No or negligible relationship
-.20 to -.29 | Weak negative relationship
-.30 to -.39 | Moderate negative relationship
-.40 to -.69 | Strong negative relationship
-.70 or higher | Very strong negative relationship
In order to visually understand what some of these relationships would look like on a scatter plot, check out these sample charts from Laerd Statistics.
And here are some examples of charts with their corresponding PCC scores (r):
The closer the numbers cluster towards the regression line in either a positive or negative slope, the stronger the relationship.
That was the tough part - you still with me? Great, now let’s look at each tool’s results.
Test 1: The Pearson Correlation Coefficient
Now that we've all had our statistics refresher course, we will take a look at the results, in order of performance. We will evaluate each tool’s PCC score, the statistical significance of the data (P-val), the strength of the relationship, and the percentage of keywords the tool was able to find and report keyword difficulty values for.
In order of performance:
#1: Moz
Revisiting Moz’s scatter plot, we observe a tight grouping of results relative to the regression line with few moderate outliers.
Moz Organic Difficulty Predictability
PCC | 0.412
P-val | .003 (P<0.05)
Relationship | Strong
% Keywords Matched | 100.00%
Moz came in first with the highest PCC of .412. As an added bonus, Moz grabs data on keyword difficulty in real time, rather than from a fixed database. This means that you can get any keyword difficulty score for any keyword.
In other words, Moz was able to generate keyword difficulty scores for 100% of the 50 keywords studied.
#2: SpyFu
Visually, SpyFu shows a fairly tight clustering amongst low difficulty keywords, and a couple moderate outliers amongst the higher difficulty keywords.
SpyFu Organic Difficulty Predictability
PCC | 0.405
P-val | .01 (P<0.05)
Relationship | Strong
% Keywords Matched | 80.00%
SpyFu came in right under Moz with 1.7% weaker PCC (.405). However, the tool ran into the largest issue with keyword matching, with only 40 of 50 keywords producing keyword difficulty scores.
#3: SEMrush
SEMrush would certainly benefit from a couple mulligans (a second chance to perform an action). The Correlation Coefficient is very sensitive to outliers, which pushed SEMrush’s score down to third (.364).
SEMrush Organic Difficulty Predictability
PCC | 0.364
P-val | .01 (P<0.05)
Relationship | Moderate
% Keywords Matched | 92.00%
Further complicating the research process, only 46 of 50 keywords had keyword difficulty scores associated with them, and many of those had to be found through SEMrush’s “phrase match” feature individually, rather than through the difficulty tool.
Digging around for that data made the process more laborious.
#4: KW Finder
KW Finder definitely could have benefitted from more than a few mulligans with numerous strong outliers, coming in right behind SEMrush with a score of .360.
KW Finder Organic Difficulty Predictability
PCC | 0.360
P-val | .01 (P<0.05)
Relationship | Moderate
% Keywords Matched | 100.00%
Fortunately, the KW Finder tool had a 100% match rate without any trouble digging around for the data.
#5: Ahrefs
Ahrefs comes in fifth by a large margin at .316, barely passing the “weak relationship” threshold.
Ahrefs Organic Difficulty Predictability
PCC | 0.316
P-val | .03 (P<0.05)
Relationship | Moderate
% Keywords Matched | 100%
On a positive note, the tool seems to be very reliable with low difficulty scores (notice the tight clustering for low difficulty scores), and matched all 50 keywords.
#6: Google Keyword Planner Tool
Before you ask, yes, SEO companies still use the paid competition figures from Google’s Keyword Planner Tool (and other tools) to assess organic ranking potential. As you can see from the scatter plot, there is in fact no linear relationship between the two variables.
Google Keyword Planner Tool Organic Difficulty Predictability
PCC | 0.045
P-val | Statistically insignificant/no linear relationship
Relationship | Negligible/None
% Keywords Matched | 88.00%
SEO agencies still using KPT for organic research (you know who you are!) — let this serve as a warning: You need to evolve.
Test 1 summary
For scoring, we will use a ten-point scale and score every tool relative to the highest-scoring competitor. For example, if the second highest score is 98% of the highest score, the tool will receive a 9.8. As a reminder, here are the results from the PCC test:
And the resulting scores are as follows:
Tool | PCC Test
Moz | 10
SpyFu | 9.8
SEMrush | 8.8
KW Finder | 8.7
Ahrefs | 7.7
KPT | 1.1
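In code, that relative scoring step is tiny. A minimal sketch, plugging in the PCC values reported above, reproduces the table:

```python
def scale_to_ten(pcc_scores):
    """Score each tool relative to the best PCC; the top scorer gets 10."""
    best = max(pcc_scores.values())
    return {tool: round(10 * s / best, 1) for tool, s in pcc_scores.items()}

pcc = {"Moz": 0.412, "SpyFu": 0.405, "SEMrush": 0.364,
       "KW Finder": 0.360, "Ahrefs": 0.316, "KPT": 0.045}
print(scale_to_ten(pcc))
# {'Moz': 10.0, 'SpyFu': 9.8, 'SEMrush': 8.8, 'KW Finder': 8.7,
#  'Ahrefs': 7.7, 'KPT': 1.1}
```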
Moz takes the top position for the first test, followed closely by SpyFu (with an 80% match rate caveat).
Test 2: Adjusted Pearson Correlation Coefficient
Let’s call this the “Mulligan Round.” In this round, assuming sometimes things just go haywire and a tool just flat-out misses, we will remove the three most egregious outliers to each tool’s score.
Here are the adjusted results for the handicap round:
Adjusted Scores (3 Outliers removed)
Tool | PCC | Difference (+/-)
SpyFu | 0.527 | 0.122
SEMrush | 0.515 | 0.150
Moz | 0.514 | 0.101
Ahrefs | 0.478 | 0.162
KW Finder | 0.470 | 0.110
Keyword Planner Tool | 0.189 | 0.144
As noted in the original PCC test, some of these tools really took a big hit with major outliers. Specifically, Ahrefs and SEMrush benefitted the most from their outliers being removed, gaining .162 and .150 respectively to their scores, while Moz benefited the least from the adjustments.
For those of you crying out, “But this is real life, you don’t get mulligans with SEO!”, never fear, we will make adjustments for reliability at the end.
Here are the updated scores at the end of round two:
Tool | PCC Test | Adjusted PCC | Total
SpyFu | 9.8 | 10 | 19.8
Moz | 10 | 9.7 | 19.7
SEMrush | 8.8 | 9.8 | 18.6
KW Finder | 8.7 | 8.9 | 17.6
Ahrefs | 7.7 | 9.1 | 16.8
KPT | 1.1 | 3.6 | 4.7
SpyFu takes the lead! Now let’s jump into the final round of statistical tests.
Test 3: Resampling
Being that there has never been a study performed on keyword research tools at this scale, we wanted to ensure that we explored multiple ways of looking at the data.
Big thanks to Russ Jones, who put together an entirely different model that answers the question: "What is the likelihood that the keyword difficulty of two randomly selected keywords will correctly predict the relative position of rankings?"
He randomly selected 2 keywords from the list and their associated difficulty scores.
Let’s assume one tool says that the difficulties are 30 and 60, respectively. What is the likelihood that the article written for a score of 30 ranks higher than the article written on 60? Then, he performed the same test 1,000 times.
He also threw out examples where the two randomly selected keywords shared the same rankings, or data points were missing. Here was the outcome:
Resampling | % Guessed correctly
Moz | 62.2%
Ahrefs | 61.2%
SEMrush | 60.3%
KW Finder | 58.9%
SpyFu | 54.3%
KPT | 45.9%
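Here's a sketch of that resampling procedure, under the assumption that each tool's data is a list of (difficulty, ranking) pairs with None for unmatched keywords; the example data is invented:

```python
import random

def resample_accuracy(pairs, trials=1000):
    """Estimate how often a tool's difficulty scores put two randomly
    chosen keywords' rankings in the correct relative order."""
    usable = [(d, r) for d, r in pairs if d is not None]
    hits = tries = 0
    for _ in range(trials):
        (d1, r1), (d2, r2) = random.sample(usable, 2)
        if d1 == d2 or r1 == r2:
            continue  # ties thrown out, as in the study
        tries += 1
        # Lower difficulty should mean a better (numerically lower) rank.
        if (d1 < d2) == (r1 < r2):
            hits += 1
    return hits / tries if tries else 0.0

# Invented example data:
tool_data = [(25, 1), (30, 2), (45, 8), (50, 40), (60, 25), (None, 12)]
print(f"{resample_accuracy(tool_data):.1%} guessed correctly")
```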
As you can see from the table above, this test was particularly critical of each of the tools. As we are starting to see, no one tool is a silver bullet, so it is our job to see how much each tool helps us make more educated decisions than guessing.
Most tools stayed pretty consistent with their levels of performance from the previous tests, except SpyFu, which struggled mightily with this test.
In order to score this test, we need to use 50% as the baseline (equivalent of a coin flip, or zero points), and scale each tool relative to how much better it performed over a coin flip, with the top scorer receiving ten points.
For example, Ahrefs scored 11.2% better than flipping a coin, which is 8.2% less (in relative terms) than Moz, which scored 12.2% better than flipping a coin, giving Ahrefs a score of 9.2.
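That scaling step looks like this in code, with 50% as the zero point and the best tool anchoring the top of the scale (reproducing the Ahrefs example above):

```python
def coin_flip_score(pct_correct, best_pct, baseline=50.0):
    """Scale '% guessed correctly' so a coin flip scores 0
    and the best-performing tool scores 10."""
    return round(10 * (pct_correct - baseline) / (best_pct - baseline), 1)

print(coin_flip_score(62.2, best_pct=62.2))  # Moz    -> 10.0
print(coin_flip_score(61.2, best_pct=62.2))  # Ahrefs -> 9.2
```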
The updated scores are as follows:
Tool | PCC Test | Adjusted PCC | Resampling | Total
Moz | 10 | 9.7 | 10 | 29.7
SEMrush | 8.8 | 9.8 | 8.4 | 27
Ahrefs | 7.7 | 9.1 | 9.2 | 26
KW Finder | 8.7 | 8.9 | 7.3 | 24.9
SpyFu | 9.8 | 10 | 3.5 | 23.3
KPT | 1.1 | 3.6 | -4 | 0.7
So after the last statistical accuracy test, we have Moz consistently performing alone in the top tier. SEMrush, Ahrefs, and KW Finder all turn in respectable scores in the second tier, followed by the unique case of SpyFu, which performed outstanding in the first two tests (albeit, only returning results on 80% of the tested keywords), then falling flat on the final test.
Finally, we need to make some usability adjustments.
Usability Adjustment 1: Keyword Matching
A keyword research tool doesn’t do you much good if it can’t provide results for the keywords you are researching. Plain and simple, we can’t treat two tools as equals if they don’t have the same level of practical functionality.
To explain in practical terms, if a tool doesn’t have data on a particular keyword, one of two things will happen:
You have to use another tool to get the data, which devalues the entire point of using the original tool.
You miss an opportunity to rank for a high-value keyword.
Neither scenario is good, therefore we developed a penalty system. For each 10% match rate under 100%, we deducted a single point from the final score, with a maximum deduction of 5 points. For example, if a tool matched 92% of the keywords, we would deduct .8 points from the final score.
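As a quick sketch, that penalty rule translates to:

```python
def match_penalty(match_rate: float) -> float:
    """Deduct 1 point per 10% of keywords missed, capped at -5."""
    return max(-5.0, (match_rate - 100) / 10)

for tool, rate in [("Moz", 100), ("SEMrush", 92), ("SpyFu", 80)]:
    print(tool, match_penalty(rate))  # 0.0, -0.8, -2.0
```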
One may argue that this penalty is actually too lenient considering the significance of the two unideal scenarios outlined above.
The penalties are as follows:
Tool | Match Rate | Penalty
KW Finder | 100% | 0
Ahrefs | 100% | 0
Moz | 100% | 0
SEMrush | 92% | -.8
Keyword Planner Tool | 88% | -1.2
SpyFu | 80% | -2
Please note we gave SEMrush a lot of leniency, in that technically, many of the keywords evaluated were not found in its keyword difficulty tool, but rather through manually digging through the phrase match tool. We will give them a pass, but with a stern warning!
Usability Adjustment 2: Reliability
I told you we would come back to this! Revisiting the second test in which we threw away the three strongest outliers that negatively impacted each tool’s score, we will now make adjustments.
In real life, there are no mulligans. In real life, each of those three blog posts that were thrown out represented a significant monetary and time investment. Therefore, when a tool has a major blunder, the result can be a total waste of time and resources.
For that reason, we will impose a slight penalty on those tools that benefited the most from their handicap.
We will use the level of PCC improvement to evaluate how much a tool benefitted from removing their outliers. In doing so, we will be rewarding the tools that were the most consistently reliable. As a reminder, the amounts each tool benefitted were as follows:
Tool | Difference (+/-)
Ahrefs | 0.162
SEMrush | 0.150
Keyword Planner Tool | 0.144
SpyFu | 0.122
KW Finder | 0.110
Moz | 0.101
In calculating the penalty, we scored each of the tools relative to the top performer, giving the top performer zero penalty and imposing penalties based on how much additional benefit the tools received over the most reliable tool, on a scale of 0–100%, with a maximum deduction of 5 points.
So if a tool received twice the benefit of the top performing tool, it would have had a 100% benefit, receiving the maximum deduction of 5 points. If another tool received a 20% benefit over the most reliable tool, it would get a 1-point deduction. And so on.
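Expressed as code, with Moz's gain as the baseline (the values come from the table above):

```python
def reliability_penalty(gain: float, baseline: float) -> float:
    """Scale each tool's extra outlier benefit over the steadiest tool
    (0-100%) onto a 0 to -5 point deduction."""
    extra = (gain - baseline) / baseline  # relative extra benefit
    penalty = round(min(5.0, extra * 5), 1)
    return -penalty if penalty else 0.0

baseline = 0.101  # Moz benefitted least from dropping outliers
for tool, gain in [("Ahrefs", 0.162), ("SEMrush", 0.150), ("KPT", 0.144),
                   ("SpyFu", 0.122), ("KW Finder", 0.110), ("Moz", 0.101)]:
    print(tool, reliability_penalty(gain, baseline))
```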
Tool | % Benefit | Penalty
Ahrefs | 60% | -3
SEMrush | 48% | -2.4
Keyword Planner Tool | 42% | -2.1
SpyFu | 20% | -1
KW Finder | 8% | -.4
Moz | - | 0
Results
All told, our penalties were fairly mild, with a slight shuffling in the middle tier. The final scores are as follows:
Tool | Total Score | Stars (5 max)
Moz | 29.7 | 4.95
KW Finder | 24.5 | 4.08
SEMrush | 23.8 | 3.97
Ahrefs | 23.0 | 3.83
SpyFu | 20.3 | 3.38
KPT | -2.6 | 0.00
Conclusion
Using any organic keyword difficulty tool will give you an advantage over not doing so. While none of the tools are a crystal ball, providing perfect predictability, they will certainly give you an edge. Further, if you record enough data on your own blogs’ performance, you will get a clearer picture of the keyword difficulty scores you should target in order to rank on the first page.
For example, we know the following about how we should target keywords with each tool:
Tool | Average KD ranking ≤ 10 | Average KD ranking ≥ 11
Moz | 33.3 | 37.0
SpyFu | 47.7 | 50.6
SEMrush | 60.3 | 64.5
KW Finder | 43.3 | 46.5
Ahrefs | 11.9 | 23.6
This is pretty powerful information! It’s either first page or bust, so we now know the threshold for each tool that we should set when selecting keywords.
Stay tuned, because we made a lot more correlations between word count, days live, total keywords ranking, and all kinds of other juicy stuff. Tune in again in early September for updates!
We hope you found this test useful, and feel free to reach out with any questions on our math!
Disclaimer: These results are estimates based on 50 ranking keywords from 50 blog posts and keyword research data pulled from a single moment in time. Search is a shifting landscape, and these results have certainly changed since the data was pulled. In other words, this is about as accurate as we can get from analyzing a moving target.
from The Moz Blog https://ift.tt/2P622gM via IFTTT
wickedbananas · 7 years ago
Text
How to Win Some Local Customers Back from Amazon this Holiday Season
Posted by MiriamEllis
Your local business may not be able to beat Amazon at the volume of their own game of convenient shipping this holiday season, but don’t assume it’s a game you can’t at least get into!
This small revelation took me by surprise last month while I was shopping for a birthday gift for my brother. Like many Americans, I’m feeling growing qualms about the economic and societal impacts of putting my own perceived convenience at the top of a list of larger concerns like ensuring fair business practices, humane working conditions, and sustainable communities.
So, when I found myself on the periphery of an author talk at the local independent bookstore and the book happened to be one I thought my brother would enjoy, I asked myself a new question:
“I wonder if this shop would ship?”
There was no signage indicating such a service, but I asked anyway, and was delighted to discover that they do. Minutes later, the friendly staff was wrapping up a signed copy of the volume in nice paper and popping a card in at no extra charge. Shipping wasn’t free, but I walked away feeling a new kind of happiness in wishing my sibling a “Happy Birthday” this year.
And that single transaction not only opened my eyes to the fact that I don’t have to remain habituated to gift shopping at Amazon or similar online giants for remote loved ones, but it also inspired this article.
Let’s talk about this now, while your local business, large or small, still has time to make plans for the holidays. Let’s examine this opportunity together, with a small study, a checklist, and some inspiration for seasonal success.
What do people buy most at the holidays and who’s shipping?
According to Statista, the categories in the following chart are the most heavily shopped during the holiday season. I selected a large town in California with a population of 60,000+, and phoned every business in these categories that was ranking in the top 10 of Google’s Local Finder view. This comprised both branded chains and independently-owned businesses. I asked each business if I came in and purchased items whether they could ship them to a friend.
Category | % Offer Shipping | Notes
Clothing | 80% | Some employees weren't sure. Outlets of larger store brands couldn't ship. Some offered shipping only if you were a member of their loyalty program. Small independents consistently offered shipping. Larger brands promoted shopping online.
Electronics | 10% | Larger stores all stressed going online. The few smaller stores said they could ship, but made it clear that it was an unusual request.
Games/Toys/Dolls etc. | 25% | Large stores promote online shopping. One said they would ship some items but not all. Independents did not ship.
Food/Liquor | 20% | USPS prohibits shipping alcohol. I surveyed grocery, gourmet, and candy stores. None of the grocery stores shipped and only two candy stores did.
Books | 50% | Only two bookstores in this town, both independent. One gladly ships. The other had never considered it.
Jewelry | 60% | Chains require online shopping. Independents more open to shipping but some didn't offer it.
Health/Beauty | 20% | With a few exceptions, cosmetic and fitness-related stores either had no shipping service or had either limited or full online shopping.
Takeaways from the study
Most of the chains promote online shopping vs. shopping in their stores, which didn’t surprise me, but which strikes me as opportunity being left on the table.
I was pleasantly surprised by the number of independent clothing and jewelry stores that gladly offered to ship gift purchases.
I was concerned by how many employees initially didn’t know whether or not their employer offered shipping, indicating a lack of adequate training.
Finally, I’ll add that I’ve physically visited at least 85% of these businesses in the past few years and have never been told by any staff member about their shipping services, nor have I seen any in-store signage promoting such an offer.
My overarching takeaway from the experiment is that, though all of us are now steeped in the idea that consumers love the convenience of shipping, a dominant percentage of physical businesses are still operating as though this realization hasn't fully sunk in… or as though it can be safely ignored.
To put it another way, if Amazon has taken some of your customers, why not take a page from their playbook and get shipping?
The nitty-gritty of brick-and-mortar shipping
62% of consumers say the reason they’d shop offline is because they want to see, touch, and try out items. – RetailDive
There’s no time like the holidays to experiment with a new campaign. I sat down with a staff member at the bookstore where I bought my brother’s gift and asked her some questions about how they manage shipping. From that conversation, and from some additional research, I came away with the following checklist for implementing a shipping offer at your brick-and-mortar locations:
✔ Determine whether your business category is one that lends itself to holiday gift shopping.
✔ Train core or holiday temp staff to package and ship gifts.
✔ Craft compelling messaging surrounding your shipping offer, perhaps promoting pride in the local community vs. pride in Amazon. Don’t leave it to customers to shop online on autopilot — help them realize there’s a choice.
✔ Cover your store and website with messaging highlighting this offering, at least two months in advance of the holidays.
✔ In October, run an in-store campaign in which cashiers verbally communicate your holiday shipping service to every customer.
✔ Sweeten the offer with a dedication of X% of sales to a most popular local cause/organization/institution.
✔ Promote your shipping service via your social accounts.
✔ Make an effort to earn a mention of your shipping service in local print and radio news.
✔ Set clear dates for when the last purchases can be made to reach their destinations in time for the holidays.
✔ Coordinate with the USPS, FedEx, or UPS to have them pick up packages from your location daily.
✔ Determine the finances of your shipping charges. You may need to experiment with whether free shipping would put too big of a hole in your pocket, or whether it’s necessary to compete with online giants at the holidays.
✔ Track the success of this campaign to discover ROI.
Not every business is a holiday shopping destination, and online shopping may simply have become too dominant in some categories to overcome the Amazon habit. But, if you determine you’ve got an opportunity here, designate 2018 as a year to experiment with shipping with a view towards making refinements in the new year.
You may discover that your customers so appreciate the lightbulb moment of being able to support local businesses when they want something mailed that shipping is a service you’ll want to instate year-round. And not just for gifts… consumers are already signaling at full strength that they like having merchandise shipped to themselves!
Adding the lagniappe: Something extra
For the past couple of years, economists have reported that Americans are spending more on restaurants than on groceries. I see a combination of a desire for experiences and convenience in that, don’t you? It has been joked that someone needs to invent food that takes pictures of itself for social sharing! What can you do to capitalize on this desire for ease and experience in your business?
Cards, carols, and customs are wreathed in the “joy” part of the holidays, but how often do customers genuinely feel the enjoyment when they are shopping these days? True, a run to the store for a box of cereal may not require aesthetic satisfaction, but shouldn’t we be able to expect some pleasure in our purchasing experiences, especially when we are buying gifts that are meant to spread goodwill?
When my great-grandmother got tired from shopping at the Emporium in San Francisco, one of the superabundant sales clerks would direct her to the soft surroundings of the ladies’ lounge to refresh her weary feet on an automatic massager. She could lunch at a variety of nicely appointed in-store restaurants at varied prices. Money was often tight, but she could browse happily in the “bargain basement”. There were holiday roof rides for the kiddies, and holiday window displays beckoning passersby to stop and gaze in wonder. Great-grandmother, an immigrant from Ireland, got quite a bit of enjoyment out of the few dollars in her purse.
It may be that those lavish days of yore are long gone, taking the pleasure of shopping with them, and that we’re doomed to meager choosing between impersonal online shopping or impersonal offline warehouses … but I don’t think so.
The old Emporium was huge, with multiple floors and hundreds of employees … but it wasn’t a “big box store”.
There’s still opportunity for larger brands to differentiate themselves from their warehouse-lookalike competitors. Who says retail has to look like a fast food chain or a mobile phone store?
And as for small, independent businesses? I can’t open my Twitter feed nowadays without encountering a new and encouraging story about the rise of localism and local entrepreneurialism.
It’s a good time to revive the ethos of the lagniappe — the Louisiana custom of giving patrons a little something extra with their purchase, something that will make it worth it to get off the computer and head into town for a fun, seasonal experience. Yesterday’s extra cookie that made up the baker’s dozen could be today’s enjoyable atmosphere, truly expert salesperson, chair to sit down in when weary, free cup of spiced cider on a wintry day… or the highly desirable service of free shipping. Chalk up the knowledge of this need as one great thing Amazon has gifted you.
In 2017, our household chose to buy as many holiday presents as possible from Main Street for our nearby family and friends. We actually enjoyed the experience. In 2018, we plan to see how far our town can take us in terms of shipping gifts to loved ones we won’t have a chance to see. Will your business be ready to serve our newfound need?
from The Moz Blog https://ift.tt/2Ou0yfm via IFTTT
wickedbananas · 7 years ago
Text
What Do Dolphins Eat? Lessons from How Kids Search
Posted by willcritchlow
Kids may search differently than adults, but there are some interesting insights from how they use Google that can help deepen our understanding of searchers in general. Comfort levels with particular search strategies, reading only the bold words, taking search suggestions and related searches as answers — there's a lot to dig into. In this week's slightly different-from-the-norm Whiteboard Friday, we welcome the fantastic Will Critchlow to share lessons from how kids search.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Video Transcription
Hi, everyone. I'm Will Critchlow, founder and CEO of Distilled, and this week's Whiteboard Friday is a little bit different. I want to talk about some surprising and interesting and a few funny facts that I learnt when I was reading some research that Google did about how kids search for information. So this isn't super actionable. This is not about tactics of improving your website particularly. But I think we get some insights — they were studying kids aged 7 to 11 — by looking at how kids interact. We can see some reflections or some ideas about how there might be some misconceptions out there about how adults search as well. So let's dive into it.
What do dolphins eat?
I've got this "What do dolphins eat?" because this was the first question that the researchers gave to the kids: sit down in front of a search box, go. They tell this little anecdote, a little bit soul-destroying, of a child (I think a seven-year-old) who starts typing dolphin, D-O-L-F, and then presses Enter, and sadly there were no dolphins. Hopefully they found him some dolphins. But a lot of the kids succeeded at this task.
Different kinds of searchers
The researchers divided the ways that the kids approached it up into a bunch of different categories. They found that some kids were power searchers. Some are what they called "developing." They classified some as "distracted." But one that I found fascinating was what they called visual searchers. I think they found this more commonly among the younger kids who were perhaps a little bit less confident reading and writing. It turns out that, for almost any question you asked them, these kids would turn first to image search.
So for this particular question, they would go to image search, typically just type "dolphin" and then scroll and go looking for pictures of a dolphin eating something. Then they'd find a dolphin eating a fish, and they'd turn to the researcher and say "Look, dolphins eat fish." Which, when you think about it, I quite like in an era of fake news. This is the kids doing primary research. They're going direct to the primary source. But it's not something that I would have ever really considered, and I don't know if you would. But hopefully this kind of sparks some thought and some insights and discussions at your end. They found that there were some kids who pretty much always, no matter what you asked them, would always go and look for pictures.
Kids who were a bit more developed, a bit more confident in their reading and writing would often fall into one of these camps where they were hopefully focusing on the attention. They found a lot of kids were obviously distracted, and I think as adults this is something that we can relate to. Many of the kids were not really very interested in the task at hand. But this kind of path from distracted to developing to power searcher is an interesting journey that I think totally applies to grown-ups as well.
In practice: [wat do dolfin eat]
So I actually, after I read this paper, went and did some research on my kids. So my kids were in roughly this age range. When I was doing it, my daughter was eight and my son was five and a half. Both of them interestingly typed "wat do dolfin eat" pretty much like this. They both misspelled "what," and they both misspelled "dolphin." Google was fine with that. Obviously, these days this is plenty close enough to get the result you wanted. Both of them successfully answered the question pretty much, but both of them went straight to the OneBox. This is, again, probably unsurprising. You can guess this is probably how most people search.
"Oh, what's a cephalopod?" The path from distracted to developing
So there's a OneBox that comes up, and it's got a picture of a dolphin. So my daughter, a very confident reader, she loves reading, "wat do dolfin eat," she sat and she read the OneBox, and then she turned to me and she said, "It says they eat fish and herring. Oh, what's a cephalopod?" I think this was her going from distracted into developing probably. To start off with, she was just answering this question because I had asked her to. But then she saw a word that she didn't know, and suddenly she was curious. She had to kind of carefully type it because it's a slightly tricky word to spell. But she was off looking up what is a cephalopod, and you could see the engagement shift from "I'm typing this because Dad has asked me to and it's a bit interesting I guess" to "huh, I don't know what a cephalopod is, and now I'm doing my own research for my own reasons." So that was interesting.
"Dolphins eat fish, herring, killer whales": Reading the bold words
My son, as I said, typed something pretty similar. At the point when he was doing this, he was certainly capable of reading, but would generally read out loud and a little bit haltingly. What was fascinating was that he only read the bold words. He read it out loud, and he didn't read the OneBox itself, just the bold words. So he said to me, "Dolphins eat fish, herring, killer whales," because killer whales, for some reason, was bolded. I guess the OneBox was pivoting from talking about what dolphins eat to what killer whales eat, and he didn't read the context. This cracked him up. He thought that was ridiculous, and isn't it funny that Google thinks that dolphins eat killer whales.
That is similar to some of the findings in the original research, where it turns out kids have a bunch of common misconceptions, and I bet a bunch of adults do too. Most adults probably don't think that the bold words in the OneBox are the list of answers, but it does point to the problems with fact-based, truthy queries, where Google is being asked to be the arbiter of truth on some of this stuff. We won't get too deep into that.
Common misconceptions for kids when searching
1. Search suggestions are answers
Among the common misconceptions they found: some kids thought that the search suggestions, the drop-down that appears as you start typing, were the answers, which is a bit problematic. I mean, we've all seen racist or hateful drop-downs in those search queries. But in this particular case, it was mainly just funny. You'd start typing "what do dolphins eat," and one of the search suggestions would be "do dolphins eat cats."
2. Related searches are answers
Similar with related searches, which, as we know, are not answers to the question. These are other questions. But kids in particular — I mean, I think this is true of all users — didn't necessarily read the directions on the page, didn't read that they were related searches, just saw these things that said "dolphin" a lot and started reading out those. So that was interesting.
How kids search complicated questions
The next bit of the research was much more complex. So they started with these easy questions, and they got into much harder kind of questions. One of them that they asked was this one, which is really quite hard. So the question was, "Can you find what day of the week the vice president's birthday will fall on next year?" This is a multifaceted, multipart question.
How do they handle complex, multi-step queries?
Most of the younger kids were pretty stumped by this question. Some did manage it. I think a lot of adults would fail at this. If you just turned to Google, typed this in, or did a voice search, this is the kind of thing that Google is almost on the verge of being able to do. If you said something like, "When is the vice president's birthday," that's a question that Google might just be able to answer. But the three layers here, what day of the week, and next year, make this actually a very hard query. So the kids had to first figure out that this wasn't a single query; they had to do multiple stages of research. When is the vice president's birthday? What day of the week is that date next year? Work through it like that.
I found with my kids, my eight-year-old daughter got stuck halfway through. She realized that she wasn't going to get there in one step, but she couldn't quite structure the multiple levels of research needed to get there, and she started getting a bit distracted again. It was no longer about cephalopods, so she wasn't quite as interested.
Search volume will grow in new areas as Google's capabilities develop
I think this is a whole area where, as Google's capabilities to answer more complex queries develop, and as we start to trust and learn that those kinds of queries can be answered, we'll see increasing, growing search volume in new areas. So I'm going to link to a post I wrote about a presentation I gave about the next trillion searches. This is my hypothesis that, in very broad brush strokes, there are a trillion desktop searches a year, there are a trillion mobile searches a year, and there's another trillion out there in searches that we don't do yet because they can't be answered well. I've got some data to back that up and some arguments why I think it's about that size. But I think this is closely related to watching kids get stuck on these kinds of queries.
Incidentally, I'd encourage you to go and try this. It's quite interesting, because as you work through trying to get the answer, you'll find search results that appear to give the answer. So, for example, I think there was an About.com page that actually purported to give the answer. It said, "What day of the week is the vice president's birthday on?" But it had been written a year before, and there was no date on the page. So actually it was wrong. It said Thursday. That was the answer in 2016 or 2017. So that just, again, points to the difference between primary research, the difference between answering a question and truth. I think there's a lot of kind of philosophical questions baked away in there.
Kids get comfortable with how they search – even if it's wrong
So we're going to wrap up with possibly my favorite anecdote of the user research that these guys did, which was that they said some of these kids, somewhere in this developing stage, get very attached to searching in one particular way. I guess this is kind of related to the visual search thing. They find something that works for them. It works once. They get comfortable with it, they're familiar with it, and they just do that for everything, whether it's appropriate or not. My favorite example was this one child who apparently looked for information about both dolphins and the vice president of the United States on the SpongeBob SquarePants website, which I mean maybe it works for dolphins, but I'm guessing there isn't an awful lot of VP information.
So anyway, I hope you've enjoyed this little adventure into how kids search and maybe some things that we can learn from it. Drop some anecdotes of your own in the comments. I'd love to hear your experiences and some of the funny things that you've learnt along the way. Take care.
Video transcription by Speechpad.com
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2AWn9ix via IFTTT
wickedbananas · 7 years ago
Text
Google's August 1st Core Update: Week 1
Posted by Dr-Pete
On August 1, Google (via Danny Sullivan's @searchliaison account) announced that they released a "broad core algorithm update." Algorithm trackers and webmaster chatter confirmed multiple days of heavy ranking flux, including our own MozCast system:
Temperatures peaked on August 1-2 (both around 114°F), with a 4-day period of sustained rankings flux (purple bars are all over 100°F). While this has settled somewhat, yesterday's data suggests that we may not be done.
August 2nd set a 2018 record for MozCast at 114.4°F. Keep in mind that, while MozCast was originally tuned to an average temperature of 70°F, 2017-2018 average temperatures have been much higher (closer to 90° in 2018).
Temperatures by Vertical
There's been speculation that this algo update targeted so-called YMYL queries (Your Money or Your Life) and disproportionately impacted health and wellness sites. MozCast is broken up into 20 keyword categories (roughly corresponding to Google Ads categories). Here are the August 2nd temperatures by category:
At first glance, the "Health" category does appear to be the most impacted. Keywords in that category had a daily average temperature of 124°F. Note, though, that all categories showed temperatures over 100°F on August 1st – this isn't a situation where one category was blasted and the rest were left untouched. It's also important to note that this pattern shifted during the other three days of heavy flux, with other categories showing higher average temperatures. The multi-day update impacted a wide range of verticals.
Top 30 winners
So, who were the big winners (so far) of this update? I always hesitate to do a winners/losers analysis – while useful, especially for spotting patterns, there are plenty of pitfalls. First and foremost, a site can gain or lose SERP share for many reasons that have nothing to do with algorithm updates. Second, any winners/losers analysis is only a snapshot in time (and often just one day).
Since we know that this update spanned multiple days, I've decided to look at the percentage increase (or decrease) in SERP share between July 31st and August 7th. In this analysis, "Share" is a raw percentage of page-1 rankings in the MozCast 10K data set. I've limited this analysis to only sites that had at least 25 rankings across our data set on July 31 (below that the data gets very noisy). Here are the top 30...
The first column is the percentage increase across the 7 days. The final column is the overall share – this is very low for all but mega-sites (Wikipedia hovers in the colossal 5% range).
Before you over-analyze, note the second column – this is the percent change from the highest July SERP share for that site. What the 7-day share doesn't tell us is whether the site is naturally volatile. Look at Time.com (#27) for a stark example. Time Magazine saw a +19.5% lift over the 7 days, which sounds great, except that they landed on a final share that was down 54.4% from their highest point in July. As a news site, Time's rankings are naturally volatile, and it's unclear whether this has much to do with the algorithm update.
Similarly, LinkedIn, AMC Theaters, OpenTable, World Market, MapQuest, and RE/MAX all show highs in July that were near or above their August 7th peaks. Take their gains with a grain of salt.
Top 30 losers
We can run the same analysis for the sites that lost the most ground. In this case, the "Max %" is calculated against the July low. Again, we want to be mindful of any site where the 7-day drop looks a lot different than the drop from that site's July low-point...
Comparing the first two columns, Verywell Health immediately stands out. While the site ended the 7-day period down 52.3%, it was up just over 200% from July lows. It turns out that this site was sitting very low during the first week of July and then saw a jump in SERP share. Interestingly, Verywell Family and Verywell Fit also appear on our top 30 losers list, suggesting that there's a deeper story here.
Anecdotally, it's easy to spot a pattern of health and wellness sites in this list, including big players like Prevention and LIVESTRONG. Whether this list represents the entire world of sites hit by the algorithm update is impossible to say, but our data certainly seems to echo what others are seeing.
Are you what you E-A-T?
There's been some speculation that this update is connected to Google's recent changes to their Quality Rater Guidelines. While it's very unlikely that manual ratings based on the new guidelines would drive major ranking shifts (especially so quickly), it's entirely plausible that the guideline updates and this algorithm update share a common philosophical view of quality and Google's latest thinking on the subject.
Marie Haynes' post theorizing the YMYL connection also raises the idea that Google may be looking more closely at E-A-T signals (Expertise, Authoritativeness, and Trustworthiness). While certainly an interesting theory, I can't adequately address that question with this data set. Declines in sites like Fortune, IGN, and Android Central pose some interesting questions about authoritativeness and trust outside of the health and wellness vertical, but I hesitate to speculate based only on a handful of outliers.
If your site has been impacted in a material way (including significant traffic gains or drops), I'd love to hear more details in the comments section. If you've taken losses, try to isolate whether those losses are tied to specific keywords, keyword groups, or pages/content. For now, keep in mind that this update could still be rolling out or being tweaked, and we all need to keep our eyes open.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2AY1BSQ via IFTTT
wickedbananas · 7 years ago
Text
Rewriting the Beginner's Guide to SEO, Chapter 4: On-Page Optimization
Posted by BritneyMuller
Chapter Four of the Beginner's Guide to SEO rewrite is chock full of on-page SEO learnings. After all the great feedback you've provided thus far on our outline, Chapter One, Chapter Two, and Chapter Three, we're eager to hear how you feel about Chapter Four. What really works for you? What do you think is missing? Read on, and let us know your thoughts in the comments!
Chapter 4: On-Page Optimization
Use your research to craft your message.
Now that you know how your target market is searching, it’s time to dive into on-page optimization, the practice of crafting web pages that answer searchers’ questions. On-page SEO is multifaceted, and extends beyond content into other things like schema and meta tags, which we’ll discuss more at length in the next chapter on technical optimization. For now, put on your wordsmithing hats — it’s time to create your content!
Creating your content
Applying your keyword research
In the last chapter, we learned methods for discovering how your target audience is searching for your content. Now, it’s time to put that research into practice. Here is a simple outline to follow for applying your keyword research:
Survey your keywords and group those with similar topics and intent. Those groups will be your pages, rather than creating individual pages for every keyword variation.
If you haven’t done so already, evaluate the SERP for each keyword or group of keywords to determine what type and format your content should be. Some characteristics of ranking pages to take note of:
Are they image or video heavy?
Is the content long-form or short and concise?
Is the content formatted in lists, bullets, or paragraphs?
Ask yourself, “What unique value could I offer to make my page better than the pages that are currently ranking for my keyword?”
On-page optimization allows you to turn your research into content your audience will love. Just make sure to avoid falling into the trap of low-value tactics that could hurt more than help!
Low-value tactics to avoid
Your web content should exist to answer searchers’ questions, to guide them through your site, and to help them understand your site’s purpose. Content should not be created for the purpose of ranking highly in search alone. Ranking is a means to an end, the end being to help searchers. If we put the cart before the horse, we risk falling into the trap of low-value content tactics.
Some of these tactics were introduced in Chapter 2, but by way of review, let’s take a deeper dive into some low-value tactics you should avoid when crafting search engine optimized content.
Thin content
While it’s common for a website to have unique pages on different topics, an older content strategy was to create a page for every single iteration of your keywords in order to rank on page 1 for those highly specific queries.
For example, if you were selling bridal dresses, you might have created individual pages for bridal gowns, bridal dresses, wedding gowns, and wedding dresses, even if each page was essentially saying the same thing. A similar tactic for local businesses was to create multiple pages of content for each city or region from which they wanted clients. These “geo pages” often had the same or very similar content, with the location name being the only unique factor.
Tactics like these clearly weren’t helpful for users, so why did publishers do it? Google wasn’t always as good as it is today at understanding the relationships between words and phrases (or semantics). So, if you wanted to rank on page 1 for “bridal gowns” but you only had a page on “wedding dresses,” that may not have cut it.
This practice created tons of thin, low-quality content across the web, which Google addressed specifically with its 2011 update known as Panda. This algorithm update penalized low-quality pages, which resulted in more quality pages taking the top spots of the SERPs. Google continues to iterate on this process of demoting low-quality content and promoting high-quality content today.
Google is clear that you should have a comprehensive page on a topic instead of multiple, weaker pages for each variation of a keyword.
Duplicate content
Like it sounds, “duplicate content” refers to content that is shared between domains or between multiple pages of a single domain. “Scraped” content goes a step further, and entails the blatant and unauthorized use of content from other sites. This can include taking content and republishing as-is, or modifying it slightly before republishing, without adding any original content or value.
There are plenty of legitimate reasons for internal or cross-domain duplicate content, so Google encourages the use of a rel=canonical tag to point to the original version of the web content. While you don’t need to know about this tag just yet, the main thing to note for now is that your content should be unique in word and in value.
Cloaking
A basic tenet of search engine guidelines is to show the same content to the engine's crawlers that you'd show to a human visitor. This means that you should never hide text in the HTML code of your website that a normal visitor can't see.
When this guideline is broken, search engines call it "cloaking" and take action to prevent these pages from ranking in search results. Cloaking can be accomplished in any number of ways and for a variety of reasons, both positive and negative. Below is an example of an instance where Spotify showed different content to users than to Google.
In some cases, Google may let practices that are technically cloaking pass because they contribute to a positive user experience. For more on the subject of cloaking and the levels of risk associated with various tactics, see our article on White Hat Cloaking.
Keyword stuffing
If you’ve ever been told, “You need to include {critical keyword} on this page X times,” you’ve seen the confusion over keyword usage in action. Many people mistakenly think that if you just include a keyword within your page’s content X times, you will automatically rank for it. The truth is, although Google looks for mentions of keywords and related concepts on your site’s pages, the page itself has to add value outside of pure keyword usage. If a page is going to be valuable to users, it won’t sound like it was written by a robot, so incorporate your keywords and phrases naturally in a way that is understandable to your readers.
Below is an example of a keyword-stuffed page of content that also uses another old method: bolding all your targeted keywords. Oy.
Auto-generated content
Arguably one of the most offensive forms of low quality content is the kind that is auto-generated, or created programmatically with the intent of manipulating search rankings and not helping users. You may recognize some auto-generated content by how little it makes sense when read — they are technically words, but strung together by a program rather than a human being.
It is worth noting that advancements in machine learning have contributed to more sophisticated auto-generated content that will only get better over time. This is likely why in Google’s quality guidelines on automatically generated content, Google specifically calls out the brand of auto-generated content that attempts to manipulate search rankings, rather than any-and-all auto-generated content.
What to do instead: 10x it!
There is no “secret sauce” to ranking in search results. Google ranks pages highly because it has determined they are the best answers to the searcher’s questions. In today’s search engine, it’s not enough that your page isn’t duplicate, spamming, or broken. Your page has to provide value to searchers and be better than any other page Google is currently serving as the answer to a particular query. Here’s a simple formula for content creation:
Search the keyword(s) you want your page to rank for
Identify which pages are ranking highly for those keywords
Determine what qualities those pages possess
Create content that’s better than that
We like to call this 10x content. If you create a page on a keyword that is 10x better than the pages being shown in search results (for that keyword), Google will reward you for it, and better yet, you’ll naturally get people linking to it! Creating 10x content is hard work, but will pay dividends in organic traffic.
Just remember, there’s no magic number when it comes to words on a page. What we should be aiming for is whatever sufficiently satisfies user intent. Some queries can be answered thoroughly and accurately in 300 words while others might require 1,000 words!
Pro tip: Don’t reinvent the wheel! If you already have content on your website, save yourself time by evaluating which of those pages are already bringing in good amounts of organic traffic and converting well. Refurbish that content on different platforms to help get more visibility to your site. On the other side of the coin, evaluate what existing content isn’t performing as well and adjust it, rather than starting from square one with all new content.
NAP: A note for local businesses
If you’re a business that makes in-person contact with your customers, be sure to include your business name, address, and phone number (NAP) prominently, accurately, and consistently throughout your site’s content. This information is often displayed in the footer or header of a local business website, as well as on any "contact us" pages. You’ll also want to mark up this information using local business schema. Schema and structured data are discussed more at length in the “Code” section of this chapter.
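For instance, here's a minimal sketch of what that local business markup could look like in JSON-LD, placed in the page's HTML. The business name, address, and phone number below are hypothetical placeholders; swap in your real NAP exactly as it appears on the page:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Flower Shop",
  "telephone": "+1-206-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Seattle",
    "addressRegion": "WA",
    "postalCode": "98101"
  }
}
</script>
Keeping this markup consistent with the NAP displayed on the page helps search engines confidently connect the two.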
If you are a multi-location business, it’s best to build unique, optimized pages for each location. For example, a business that has locations in Seattle, Tacoma, and Bellevue should consider having a page for each:
example.com/seattle
example.com/tacoma
example.com/bellevue
Each page should be uniquely optimized for that location, so the Seattle page would have unique content discussing the Seattle location, list the Seattle NAP, and even testimonials specifically from Seattle customers. If there are dozens, hundreds, or even thousands of locations, a store locator widget could be employed to help you scale.
Hope you still have some energy left after handling the difficult-yet-rewarding task of putting together a page that is 10x better than your competitors’ pages, because there are just a few more things needed before your page is complete! In the next sections, we’ll talk about the other on-page optimizations your pages need, as well as naming and organizing your content.
Beyond content: Other optimizations your pages need
Can I just bump up the font size to create paragraph headings?
How can I control what title and description show up for my page in search results?
After reading this section, you’ll understand other important on-page elements that help search engines understand the 10x content you just created, so let’s dive in!
Header tags
Header tags are an HTML element used to designate headings on your page. The main header tag, called an H1, is typically reserved for the title of the page. It looks like this:
<h1>Page Title</h1>
There are also sub-headings that go from H2 (<h2>) to H6 (<h6>) tags, although using all of these on a page is not required. The hierarchy of header tags goes from H1 to H6 in descending order of importance.
Each page should have a unique H1 that describes the main topic of the page; this is often created automatically from the title of the page. As the main descriptive title of the page, the H1 should contain that page’s primary keyword or phrase. You should avoid using header tags to mark up non-heading elements, such as navigational buttons and phone numbers. Use header tags to introduce what the following content will discuss.
Take this page about touring Copenhagen, for example:
<h1>Copenhagen Travel Guide</h1>
<h2>Copenhagen by the Seasons</h2>
<h3>Visiting in Winter</h3>
<h3>Visiting in Spring</h3>
The main topic of the page is introduced in the main <h1> heading, and each additional heading is used to introduce a new sub-topic. In this example, the <h2> is more specific than the <h1>, and the <h3> tags are more specific than the <h2>. This is just an example of a structure you could use.
Although what you choose to put in your header tags can be used by search engines to evaluate and rank your page, it’s important to avoid inflating their importance. Header tags are one among many on-page SEO factors, and typically would not move the needle like quality backlinks and content would, so focus on your site visitors when crafting your headings.
Internal links
In Chapter 2, we discussed the importance of having a crawlable website. Part of a website’s crawlability lies in its internal linking structure. When you link to other pages on your website, you ensure that search engine crawlers can find all your site’s pages, you pass link equity (ranking power) to other pages on your site, and you help visitors navigate your site.
The importance of internal linking is well established, but there can be confusion over how this looks in practice.
Link accessibility
Links that require a click to view (like those tucked inside a navigation drop-down) are often hidden from search engine crawlers, so if the only links to internal pages on your website are through these types of links, you may have trouble getting those pages indexed. Opt instead for links that are directly accessible on the page.
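To illustrate the difference, here's a rough sketch; the URL and link text are hypothetical:
<!-- Directly accessible: a standard link in the HTML that crawlers can follow -->
<a href="/wedding-bouquets">Wedding bouquets</a>
<!-- Risky: a script-driven "link" that crawlers may never discover -->
<span onclick="window.location.href='/wedding-bouquets'">Wedding bouquets</span>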
Anchor text
Anchor text is the text with which you link to pages. Below, you can see an example of what a hyperlink without anchor text and a hyperlink with anchor text would look like in the HTML.
<a href="http://www.domain.com/"></a> <a href="http://www.domain.com/" title="Keyword Text">Keyword Text</a>
On live view, that would look like this:
http://www.example.com/
Keyword Text
The anchor text sends signals to search engines regarding the content of the destination page. For example, if I link to a page on my site using the anchor text “learn SEO,” that’s a good indicator to search engines that the targeted page is one at which people can learn about SEO. Be careful not to overdo it, though. Too many internal links using the same, keyword-stuffed anchor text can appear to search engines that you’re trying to manipulate a page’s ranking. It’s best to make anchor text natural rather than formulaic.
Link volume
In Google’s General Webmaster Guidelines, they say to “limit the number of links on a page to a reasonable number (a few thousand at most).” This is part of Google’s technical guidelines, rather than the quality guideline section, so having too many internal links isn’t something that on its own is going to get you penalized, but it does affect how Google finds and evaluates your pages.
The more links on a page, the less equity each link can pass to its destination page. A page only has so much equity to go around.
So it’s safe to say that you should only link when you mean it! You can learn more about link equity from our SEO Learning Center.
Aside from passing authority between pages, a link is also a way to help users navigate to other pages on your site. This is a case where doing what’s best for search engines is also doing what’s best for searchers. Too many links not only dilute the authority of each link, but they can also be unhelpful and overwhelming. Consider how a searcher might feel landing on a page that looks like this:
Welcome to our gardening website! We have many articles on gardening, how to garden, and helpful tips on herbs, fruits, vegetables, perennials, and annuals. Learn more about gardening from our gardening blog.
Whew! Not only is that a lot of links to process, but it also reads pretty unnaturally and doesn’t contain much substance (which could be considered “thin content” by Google). Focus on quality and helping your users navigate your site, and you likely won’t have to worry about too many links.
Redirection
Removing and renaming pages is a common practice, but in the event that you do move a page, make sure to update the links to that old URL! At the very least, you should make sure to redirect the URL to its new location, but if possible, update all internal links to that URL at the source so that users and crawlers don’t have to pass through redirects to arrive at the destination page. If you choose to redirect only, be careful to avoid redirect chains that are too long (Google says, “Avoid chaining redirects... keep the number of redirects in the chain low, ideally no more than 3 and fewer than 5.")
Example of a redirect chain:
(original location of content) example.com/location1 >> example.com/location2 >> (current location of content) example.com/location3
Better:
example.com/location1 >> example.com/location3
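If your site runs on Apache, a single-hop redirect like the one above might be set up in an .htaccess file along these lines (a sketch, using the hypothetical paths from the example):
# Send the old URL straight to the content's current home in one hop
Redirect 301 /location1 https://example.com/location3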
Image optimization
Images are the biggest culprits of slow web pages! The best way to solve this is to compress your images. While there is no one-size-fits-all approach to image compression, the way to go is to test various options, like "save for web" settings, image sizing, and compression tools such as Optimizilla or ImageOptim for Mac (or Windows alternatives), and evaluate what works best.
Another way to help optimize your images (and improve your page speed) is by choosing the right image format.
How to choose which image format to use (source: Google’s image optimization guide):
If your image requires animation, use a GIF.
If you don’t need to preserve high image resolution, use JPEG (and test out different compression settings).
If you do need to preserve high image resolution, use PNG.
If your image has a lot of colors, use PNG-24.
If your image doesn’t have a lot of colors, use PNG-8.
There are also ways to improve how fast a semi-slow page feels, such as showing a colored placeholder box or a very blurry, low-resolution version of an image while the full image renders, which helps visitors feel as if things are loading faster. We will discuss these options in more detail in Chapter 5.
Pro tip: Don’t forget about thumbnails! Thumbnails (especially for E-Commerce sites) can be a huge page speed slow down. Optimize thumbnails properly to avoid slow pages and to help retain more qualified visitors.
Alt text
Alt text (alternative text) within images is a principle of web accessibility, and is used to describe images to the visually impaired via screen readers. It’s important to have alt text descriptions so that any visually impaired person can understand what the pictures on your website depict.
Search engine bots also crawl alt text to better understand your images, which gives you the added benefit of providing better image context to search engines. Just ensure that your alt descriptions read naturally for people, and avoid stuffing keywords for search engines.
Bad:
<img src="grumpycat.gif" alt="grumpy cat, cat is grumpy, grumpy cat gif">
Good:
<img src="grumpycat.gif" alt="A black cat looking very grumpy at a big spotted dog">
Submit an image sitemap
To ensure that Google can crawl and index your images, submit an image sitemap in your Google Search Console account. This helps Google discover images they may have otherwise missed.
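As a minimal sketch, an image sitemap entry uses Google's image namespace alongside the standard sitemap markup; the URLs here are hypothetical:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/desserts/chocolate-pie</loc>
    <!-- One image:image block per image that appears on the page above -->
    <image:image>
      <image:loc>https://example.com/images/chocolate-pie.jpg</image:loc>
    </image:image>
  </url>
</urlset>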
Formatting for readability & featured snippets
Your page could contain the best content ever written on a subject, but if it’s formatted improperly, your audience might never read it! While we can never guarantee that visitors will read our content, there are some principles that can promote readability, including:
Text size and color - Avoid fonts that are too tiny. Google recommends 16+px font to minimize the need for “pinching and zooming” on mobile. The text color in relation to the page’s background color should also promote readability. Additional information on text can be found in Google’s web accessibility fundamentals.
Headings - Breaking up your content with helpful headings can help readers navigate the page. This is especially useful on long pages where a reader might be looking only for information from a particular section.
Bullet points - Great for lists, bullet points can help readers skim and more quickly find the information they need.
Paragraph breaks - Avoiding walls of text can help prevent page abandonment and encourage site visitors to read more of your page.
Supporting media - When appropriate, include images, videos, and widgets that would complement your content.
Bold and italics for emphasis - Putting words in bold or italics can add emphasis, so they should be the exception, not the rule. Appropriate use of these formatting options can call out important points you want to communicate.
Formatting can also affect your page’s ability to show up in featured snippets, those “position 0” results that appear above the rest of organic results.
There is no special code that you can add to your page to show up here, nor can you pay for this placement, but taking note of the query intent can help you better structure your content for featured snippets. For example, if you’re trying to rank for “cake vs. pie,” it might make sense to include a table in your content, with the benefits of cake in one column and the benefits of pie in the other. Or if you’re trying to rank for “best restaurants to try in Portland,” that could indicate Google wants a list, so formatting your content in bullets could help.
Title tags
A page’s title tag is a descriptive HTML element that specifies the title of a particular web page. It is nested within the head tag of each page and looks like this:
<head> <title>Example Title</title> </head>
Each page on your website should have a unique, descriptive title tag. What you input into your title tag field is what shows up in search results, although in some cases Google may adjust how your title tag appears.
It can also show up in web browsers…
Or when you share the link to your page on certain external websites…
Your title tag has a big role to play in people’s first impression of your website, and it’s an incredibly effective tool for drawing searchers to your page over any other result on the SERP. The more compelling your title tag, combined with high rankings in search results, the more visitors you’ll attract to your website. This underscores that SEO is not only about search engines, but rather the entire user experience.
What makes an effective title tag?
Keyword usage: Having your target keyword in the title can help both users and search engines understand what your page is about. Also, the closer to the front of the title tag your keywords are, the more likely a user will be to read them (and hopefully click) and the more helpful they can be for ranking.
Length: On average, search engines display the first 50–60 characters (~512 pixels) of a title tag in search results. If your title tag exceeds the characters allowed on that SERP, an ellipsis "..." will appear where the title was cut off. While sticking to 50–60 characters is safe, never sacrifice quality for strict character counts. If you can’t get your title tag down to 60 characters without harming its readability, go longer (within reason).
Branding: At Moz, we love to end our title tags with a brand name mention because it promotes brand awareness and creates a higher click-through rate among people who are familiar with Moz. Sometimes it makes sense to place your brand at the beginning of the title tag, such as on your homepage, but be mindful of what you are trying to rank for and place those words closer toward the beginning of your title tag.
Meta descriptions
Like title tags, meta descriptions are HTML elements that describe the contents of the page that they’re on. They are also nested in the head tag, and look like this:
<head> <meta name="description" content="Description of page here."/> </head>
What you input into the description field will show up in search results:
In many cases though, Google will choose different snippets of text to display in search results, dependent upon the searcher’s query.
For example, if you search “find backlinks,” Google will provide this meta description as it deems it more relevant to the specific search:
While the actual meta description is:
This often improves the description shown for unique searches. However, don’t let this deter you from writing a default page meta description — they're still extremely valuable.
What makes an effective meta description?
The qualities that make an effective title tag also apply to effective meta descriptions. Although Google says that meta descriptions are not a ranking factor, they are, like title tags, incredibly important for click-through rate.
Relevance: Meta descriptions should be highly relevant to the content of your page, so it should summarize your key concept in some form. You should give the searcher enough information to know they've found a page relevant enough to answer their question, without giving away so much information that it eliminates the need to click through to your web page.
Length: Search engines tend to truncate meta descriptions to around 300 characters. It’s best to write meta descriptions between 150–300 characters in length. On some SERPs, you’ll notice that Google gives much more real estate to the descriptions of some pages. This usually happens for web pages ranking right below a featured snippet.
URL structure: Naming and organizing your pages
URL stands for Uniform Resource Locator. URLs are the locations or addresses for individual pieces of content on the web. Like title tags and meta descriptions, search engines display URLs on the SERPs, so URL naming and format can impact click-through rates. Not only do searchers use them to make decisions about which web pages to click on, but URLs are also used by search engines in evaluating and ranking pages.
Clear page naming
Search engines require unique URLs for each page on your website so they can display your pages in search results, but clear URL structure and naming is also helpful for people who are trying to understand what a specific URL is about. For example, which URL is clearer?
example.com/desserts/chocolate-pie
OR
example.com/asdf/453?=recipe-23432-1123
Searchers are more likely to click on URLs that reinforce and clarify what information is contained on that page, and less likely to click on URLs that confuse them.
Page organization
If you discuss multiple topics on your website, you should also make sure to avoid nesting pages under irrelevant folders. For example:
example.com/commercial-litigation/alimony
It would have been better for this fictional multi-practice law firm website to nest alimony under “/family-law/” than to host it under the irrelevant "/commercial-litigation/" section of the website.
The folders in which you locate your content can also send signals about the type, not just the topic, of your content. For example, dated URLs can indicate time-sensitive content. While appropriate for news-based websites, dated URLs for evergreen content can actually turn searchers away because the information seems outdated. For example:
example.com/2015/april/what-is-seo/
vs.
example.com/what-is-seo/
Since the topic “What is SEO?” isn’t confined to a specific date, it’s best to host on a non-dated URL structure or else risk your information appearing stale.
As you can see, what you name your pages, and in what folders you choose to organize your pages, is an important way to clarify the topic of your page to users and search engines.
URL length
While it is not necessary to have a completely flat URL structure, many click-through rate studies indicate that, when given the choice between a longer URL and a shorter URL, searchers often prefer shorter URLs. Like title tags and meta descriptions that are too long, too-long URLs will also be cut off with an ellipsis. Just remember, having a descriptive URL is just as important, so don’t cut down on URL length if it means sacrificing the URL's descriptiveness.
example.com/services/plumbing/plumbing-repair/toilets/leaks/
vs.
example.com/plumbing-repair/toilets/
Minimizing length, both by including fewer words in your page names and removing unnecessary subfolders, makes your URLs easier to copy and paste, as well as more clickable.
Keywords in URL
If your page is targeting a specific term or phrase, make sure to include it in the URL. However, don't go overboard by trying to stuff in multiple keywords for purely SEO purposes. It’s also important to watch out for repeat keywords in different subfolders. For example, you may have naturally incorporated a keyword into a page name, but if located within other folders that are also optimized with that keyword, the URL could begin to appear keyword-stuffed.
Example:
example.com/seattle-dentist/dental-services/dental-crowns/
Keyword overuse in URLs can appear spammy and manipulative. If you aren’t sure whether your keyword usage is too aggressive, just read your URL through the eyes of a searcher and ask, “Does this look natural? Would I click on this?”
Static URLs
The best URLs are those that can easily be read by humans, so you should avoid the overuse of parameters, numbers, and symbols. Using technologies like mod_rewrite for Apache and ISAPI_rewrite for Microsoft, you can easily transform dynamic URLs like this:
http://moz.com/blog?id=123
into a more readable static version like this:
https://moz.com/google-algorithm-change
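As a rough sketch of how that mapping works on Apache with mod_rewrite enabled (the path and ID below are hypothetical), a rule in .htaccess could internally serve the dynamic page at the readable address:
RewriteEngine On
# Requests for the readable URL are quietly answered by the dynamic script
RewriteRule ^google-algorithm-change$ /blog?id=123 [L]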
Hyphens for word separation
Not all web applications accurately interpret separators like underscores (_), plus signs (+), or spaces (%20). Search engines also do not understand how to separate words in URLs when they run together without a separator (example.com/optimizefeaturedsnippets/). Instead, use the hyphen character (-) to separate words in a URL.
Geographic Modifiers in URLs
Some local business owners omit geographic terms that describe their physical location or service area because they believe that search engines can figure this out on their own. On the contrary, it’s vital that local business websites’ content, URLs, and other on-page assets make specific mention of city names, neighborhood names, and other regional descriptors. Let both consumers and search engines know exactly where you are and where you serve, rather than relying on your physical location alone.
Protocols: HTTP vs. HTTPS
A protocol is that “http” or “https” preceding your domain name. Google recommends that all websites have a secure protocol (the “s” in “https” stands for “secure”). To ensure that your URLs are using the https:// protocol instead of http://, you must obtain an SSL (Secure Sockets Layer) certificate. SSL certificates are used to encrypt data. They ensure that any data passed between the web server and browser of the searcher remains private. As of July 2018, Google Chrome displays “not secure” for all HTTP sites, which could cause these sites to appear untrustworthy to visitors and result in them leaving the site.
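Once your SSL certificate is installed, you'll also want every http:// request to land on the https:// version of the page. On Apache, a commonly used rewrite for this looks roughly like the following sketch:
RewriteEngine On
# If the request did not arrive over HTTPS, permanently redirect it
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]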
If you’ve made it this far, congratulations on surpassing the halfway point of the Beginner’s Guide to SEO! So far, we’ve learned how search engines crawl, index, and rank content, how to find keyword opportunities to target, and now, you know the on-page optimization strategies that can help your pages get found. Next, buckle up, because we’ll be diving into the exciting world of technical SEO!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2MuFzsa via IFTTT
wickedbananas · 7 years ago
Text
Take the 2018 Moz Local Search Marketing Industry Survey
Posted by MiriamEllis
Local search marketing is a dynamic and exciting discipline, but like many digital professions, it can be a bit isolating. You may find yourself running into questions that don't have a ready answer, things like...
What sort of benchmarks should I be measuring my daily work by?
Do my clients’ needs align with what my colleagues are seeing?
Am I over/undervaluing the role of Google in my future work?
Here’s a chance to find out what your peers are observing and doing on a day-to-day basis.
The Moz Local Search Marketing Industry Survey will dive into job descriptions, industries served, most effective tactics, tool usage, and the non-stop growth of Google’s local features. We'll even touch on how folks may have been impacted by the recent August 1 algorithm update, if at all. In-house local SEOs, agency local SEOs, and other digital marketers are all welcome! All participants will be entered into a drawing for a $100 Amazon gift card. The winner will be notified on 8/27/18.
Give just 5 minutes of your time and you’ll get insights and quotable statistics back when we publish the survey results. Be sure to participate by 8/24/2018. We sincerely appreciate your contributions!
Take the Local SEO Survey Now
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2ASbf9g via IFTTT
wickedbananas · 7 years ago
Text
Calculated Fields in Google Data Studio - Whiteboard Friday
Posted by DiTomaso
Google Data Studio is a powerful tool to have in your SEO kit. Knowing how to get the most out of its power begins with understanding how to use calculated fields to apply good old-fashioned math to your data. In this week's Whiteboard Friday, we're delighted to welcome guest host Dana DiTomaso as she takes us through how to use calculated fields in Google Data Studio to uncover more value in your data and improve your reports.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Video Transcription
Hi, Moz fans. I'm Dana DiTomaso, President and partner at Kick Point, and we love Google Data Studio at Kick Point. You may not love Google Data Studio yet, but after you watch this I think you probably will.
One of the first things you might think about Google Data Studio is: Why would I use this? It's just charts. It's the same thing I can get in Analytics or a billion other dashboarding tools out there. But one of the things that I really like about Google Data Studio is math. You can do lots of different stuff in Data Studio, and I'm going to go through four of the basic calculation types in Data Studio and then how you can use them to improve your reports, just as you sort of dip your toes into the Google Data Studio pool. What I've done here is I have written out a lot of the formulas that you're going to be using.
The types
These are obviously written-out formulas, but when you get into Data Studio, you should be able to type them in and they'll work. Let's start at the beginning with the types.
Basic math. This is pretty obvious. 1 + 1 = 2. Phone calls plus emails equals this, for example. You can add together different fields.
Transforms. Let's say people are really bad at writing some things upper case and some things lower case. You have a problem with URLs being written a couple of different ways. You can use a transform to transform upper case into lower case. That's pretty nice.
Formulas. Formulas are where you're saying only show this subset of the data, or how often does this happen. That could be things like the Count function, so count how many times this occurs, for example, and present that as a totally separate metric, which can be really useful when you want to count the number of times an event occurs and then compare that against something else. It can just pull out that kind of data.
Logic. This is the more complex one. If X, then Y. If this happens, then that's going to happen. There's a lot of really complex stuff in there. But if you're just getting started, start with this, and then look at the Google Data Studio documentation. You'll find some cooler stuff in there.
1. Basic math
Here are some examples of how we use this in our Google Data Studio dashboards. So basic math, one of the things that a lot of people care about is: Are people getting in touch with me?
This is the basic reason why we do marketing. Are people getting in touch? So, for example, you can do some basic math and say, "All right. I know on our website in Google Tag Manager, we have a trigger that fires whenever somebody taps or clicks a MailTo link on the site." In addition to that, we're tracking how many people submit a form, as you should.
Instead of reporting these separately, really they're kind of the same thing. They're emailing one way or the other. Why don't we just submit them as one metric? So in that case, you can say grab all the mail to form completions and then grab all the form goal completions, and now you have a total email requests or total requests or whatever you might want to call it. You can do the same thing where it's like, well, phone calls and emails, does it really matter if they're in separate buckets?
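As a sketch, the calculated field being described can be as simple as adding the two metrics together; the field names here are hypothetical and would match whatever your data source calls them:
Mailto Clicks + Form Completions
Name the result something like "Total Email Requests" and it behaves like any other metric in your charts.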
Just put them all in one. The same thing with the basic math. Just add it all together and then you've got one total metric you can present to the client. Here's how much money we made for you. Boom. That's a nice one. The next thing — I'm just going to flip over here — is formulas.
2. Formulas
Okay, so formulas. One of the things that I really like doing is looking at your Google Search Console data. This is in Data Studio; you're going to use Search Console as the data source, which is a nice one. We all know Search Console data is not necessarily 100% accurate, but there's always lots of keyword treasure to be found in there, if you can find it easily, which is something the Search Console interface isn't super great at.
So you can make a report in Data Studio and say regex match, and so don't be afraid of regex. I think everyone should learn it. But if you're not super familiar with it, this is a really easy way to do it. Say, okay, every time a keyword contains why, how, can, what, for example, then those are question searches. You may change it to whatever makes sense for you.
But this is just pulling out that subset of data. Then you can see: if these are question searches, do we have content that answers that question? No? Maybe this is something we need to think about. Or we're getting impressions for this. You could filter it and say only show question searches where our average rank is below 20. Maybe if we improve this content, this is a featured snippet opportunity for us, for example. That's a real gold mine of data you can play around with.
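A minimal sketch of that kind of calculated field, assuming your Search Console dimension is called Query (adjust the word list to taste):
CASE
  WHEN REGEXP_MATCH(Query, ".*(why|how|can|what).*") THEN "Question search"
  ELSE "Other"
END
Note that Data Studio's REGEXP_MATCH has to match the entire string, which is why the pattern is wrapped in .* on both sides.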
3. Transforms
The third one is transforms. As I mentioned earlier, this is a really nice way to take Facebook, for example. We had a client who had Facebook in all upper case and Facebook in title case and Facebook in lower case in their sources and mediums, because they were very casual with how they used their UTM codes. We just standardized them all to go to lower, and those are nice text transforms that you can do.
It just makes things look a little bit nicer. I do recommend doing some of this, especially if you have messy data.
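The sketch for this one is a single function; assuming the field is called Source, it's just:
LOWER(Source)
Create it as a new field and use it in your charts in place of the raw, inconsistently cased one.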
4. Logic
Then the big one here. This is logic, and I'm just going to toss over here for a second. Now logic has a lot of different components. What I'm showing you right now is a case when else end transform or logic. We use this to tidy up bad channel data.
So that client that I mentioned, who was just super casual with their UTM tags and they would just put in any old stuff, I think they had retargeting ads as a medium. You can set up channels and whatnot in Google Analytics. But I mean, really, when it comes down to it, not everybody is great at following the rules for UTMs that you've set up. Stuff happens.
It's okay. You can fix it in Data Studio. Especially if you open up Google Analytics and you see that you have this "other" channel. I'm sure when you've inherited an Analytics account, you've taken a look at it, and there's this channel, and it's just a big bag of crap.
You can go in there and turn that into real, useful, actual channel data that matches up with where it should go. What I've got here is a really simple example. This could go on for lines and lines and lines. I've just included two lines because this whiteboard is only so big.
So you start off by saying CASE, as in "it is the case when." The first line here is: when source equals direct and medium equals (not set) or medium equals (none), then Direct.
If the source is direct and the medium is not set or the medium is none, like if I have no data whatsoever, now it's direct traffic. Great, that's basically what Google Analytics does. Nothing fancy is going on here. Now here's the next thing. In this case, I'm saying now I'm combining a regex match, which we talked about up here, with the case, and so now what I'm saying is when regex match medium, and then I've got this here.
Don't be scared of this. I know it's regex and maybe you're not super comfortable with it, but this is pretty elementary stuff, and once you do this, you will feel like a data wizard, I guarantee. The first time I did this I stood up from my computer and said "Yes" the first time it worked. Just play with it. It's going to be awesome. So you've got a little ... what's the thing called? You've got a little up arrow thingy (a caret) there, then your very bad mediums, then a dollar sign.
What this is saying is that if you've got anything in there that's sort of a weird medium, just write out all the crud that people have put in there over the years, all the weird mediums that totally don't make any sense at all. Just put it all in there and then you can toss it in a bucket say called paid social. You can do the same thing with referral traffic. Or, for example, this is really useful if a client is saying, "Well, I want to know how this set of affiliate traffic compares to say this set of affiliate traffic," then you can separate these out into different buckets.
This isn't just for channel data. I've done this, for example, where we were looking at social data and we were comparing NFL teams as an example for another tool, Rival IQ. What I said was, okay, so these teams here are in the AFC East, and these teams are in the AFC West. If I've screwed up and I said AFC East and West, please don't get mad at me in the comments. I promise I play fantasy football. I just don't remember right now.
But you can combine different areas. This is great for things like sales regions, for example. So North America equals Canada plus the USA plus Mexico, if you're feeling generous. This is NAFTA politics. It really depends on what you want to do with those sales regions and what's meaningful for you and your data. That's the most important thing about this: you can change this data to be whatever you need it to be to make that reporting so much easier for you.
As for the Else, I don't know if this might actually output anything here. I haven't tried it myself. If it does, please leave a comment and let me know.
Then you end up with an End. When you're in Data Studio, when you're making these calculated formulas, you'll see right away whether or not it works or not. Just keep trying until you see it happen.
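Putting those pieces together, a rough sketch of the channel cleanup field described above might read like this; the messy medium values inside the regex are hypothetical stand-ins for whatever has accumulated in your data:
CASE
  WHEN Source = "direct" AND (Medium = "(not set)" OR Medium = "(none)") THEN "Direct"
  WHEN REGEXP_MATCH(Medium, "^(retargeting ads|fb-ad|paidsocial)$") THEN "Paid Social"
  ELSE "Other"
END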
One of the great things about Data Studio is that if it's right, you'll see these types of colors, and I've used different color whiteboard markers to indicate how it should look. If you see red where you should be seeing black or green where you should be seeing black, for example, then you know you've typed in something wrong in your formula. For me, typically I find it's a misplaced bracket. Just keep an eye on that.
Have fun with Data Studio. One of the great things too is that you can't mess up your original data when doing calculated fields, so you can go hog wild and it's not going to mess with the original data. I hope you have a great time in Data Studio. Tell me what you've done in the comments, please. Thank you.
Video transcription by Speechpad.com
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2LVYejw via IFTTT
6 notes · View notes
wickedbananas · 7 years ago
Text
Rewriting the Beginner's Guide to SEO, Chapter 3: Keyword Research
Posted by BritneyMuller
Welcome to the draft of Chapter Three of the new and improved Beginner's Guide to SEO! So far you've been generous and energizing with your feedback for our outline, Chapter One, and Chapter Two. We're asking for a little more of your time as we debut our third chapter on keyword research. Please let us know what you think in the comments!
Chapter 3: Keyword Research
Understand what your audience wants to find.
Now that you’ve learned how to show up in search results, let’s determine which strategic keywords to target in your website’s content, and how to craft that content to satisfy both users and search engines.
The power of keyword research lies in better understanding your target market and how they are searching for your content, services, or products.
Keyword research provides you with specific search data that can help you answer questions like:
What are people searching for?
How many people are searching for it?
In what format do they want that information?
In this chapter, you'll get tools and strategies for uncovering that information, as well as learn tactics that'll help you avoid keyword research foibles and build strong content. Once you uncover how your target audience is searching for your content, you begin to uncover a whole new world of strategic SEO!
What terms are people searching for?
You may know what you do, but how do people search for the product, service, or information you provide? Answering this question is a crucial first step in the keyword research process.
Discovering keywords
You likely have a few keywords in mind that you would like to rank for. These will be things like your products, services, or other topics your website addresses, and they are great seed keywords for your research, so start there! You can enter those keywords into a keyword research tool to discover average monthly search volume and similar keywords. We’ll get into search volume in greater depth in the next section, but during the discovery phase, it can help you determine which variations of your keywords are most popular amongst searchers.
Once you enter your seed keywords into a keyword research tool, you will begin to discover other keywords, common questions, and topics for your content that you might have otherwise missed.
Let’s use the example of a florist that specializes in weddings.
Typing “wedding” and “florist” into a keyword research tool, you may discover highly relevant, highly searched for related terms such as:
Wedding bouquets
Bridal flowers
Wedding flower shop
In the process of discovering relevant keywords for your content, you will likely notice that the search volume of those keywords varies greatly. While you definitely want to target terms that your audience is searching for, in some cases, it may be more advantageous to target terms with lower search volume because they're far less competitive.
Since both high- and low-competition keywords can be advantageous for your website, learning more about search volume can help you prioritize keywords and pick the ones that will give your website the biggest strategic advantage.
Pro tip: Diversify! It’s important to note that entire websites don’t rank for keywords, pages do. With big brands, we often see the homepage ranking for many keywords, but for most websites, this isn’t usually the case. Many websites receive more organic traffic to pages other than the homepage, which is why it’s so important to diversify your website’s pages by optimizing each for uniquely valuable keywords.
How often are those terms searched?
Uncovering search volume
The higher the search volume for a given keyword or keyword phrase, the more work is typically required to achieve higher rankings. This is often referred to as keyword difficulty, and it occasionally incorporates SERP features; for example, if many SERP features (like featured snippets, knowledge graph, carousels, etc.) are clogging up a keyword's result page, difficulty will increase. Big brands often take up the top 10 results for high-volume keywords, so if you're just starting out on the web and going after the same keywords, the uphill battle for ranking can take years of effort.
Typically, the higher the search volume, the greater the competition and effort required to achieve organic ranking success. Go too low, though, and you risk not drawing any searchers to your site. In many cases, it may be most advantageous to target highly specific, lower competition search terms. In SEO, we call those long-tail keywords.
Understanding the long tail
It would be great to rank #1 for the keyword "shoes"... or would it?
It's wonderful to deal with keywords that have 50,000 searches a month, or even 5,000 searches a month, but in reality, these popular search terms only make up a fraction of all searches performed on the web. In fact, keywords with very high search volumes may even indicate ambiguous intent, which, if you target these terms, could put you at risk of drawing visitors to your site whose goals don't match the content your page provides.
Does the searcher want to know the nutritional value of pizza? Order a pizza? Find a restaurant to take their family? Google doesn’t know, so they offer these features to help you refine. Targeting “pizza” means that you’re likely casting too wide a net.
These popular "fat head" terms account for only about 25% of searches; the remaining 75% lie in the "chunky middle" and "long tail" of search.
Don’t underestimate these less popular keywords. Long tail keywords with lower search volume often convert better, because searchers are more specific and intentional in their searches. For example, a person searching for "shoes" is probably just browsing. Whereas, someone searching for "best price red womens size 7 running shoe,” practically has their wallet out!
Pro tip: Questions are SEO gold! Discovering what questions people are asking in your space, and adding those questions and their answers to an FAQ page, can yield incredible organic traffic for your website.
Getting strategic with search volume
Now that you’ve discovered relevant search terms for your site and their corresponding search volumes, you can get even more strategic by looking at your competitors and figuring out how searches might differ by season or location.
Keywords by competitor
You’ll likely compile a lot of keywords. How do you know which to tackle first? It could be a good idea to prioritize high-volume keywords that your competitors are not currently ranking for. On the flip side, you could also see which keywords from your list your competitors are already ranking for and prioritize those. The former is great when you want to take advantage of your competitors’ missed opportunities, while the latter is an aggressive strategy that sets you up to compete for keywords your competitors are already performing well for.
Keywords by season
Knowing about seasonal trends can be advantageous in setting a content strategy. For example, if you know that “christmas box” starts to spike in October through December in the United Kingdom, you can prepare content months in advance and give it a big push around those months.
Keywords by region
You can more strategically target a specific location by narrowing down your keyword research to specific towns, counties, or states in the Google Keyword Planner, or evaluate "interest by subregion" in Google Trends. Geo-specific research can help make your content more relevant to your target audience. For example, you might find out that in Texas, the preferred term for a large truck is “big rig,” while in New York, “tractor trailer” is the preferred terminology.
Which format best suits the searcher's intent?
In Chapter 2, we learned about SERP features. That background is going to help us understand how searchers want to consume information for a particular keyword. The format in which Google chooses to display search results depends on intent, and every query has a unique one. While there are thousands of possible search types, there are five major categories to be aware of:
1. Informational queries: The searcher needs information, such as the name of a band or the height of the Empire State Building.
2. Navigational queries: The searcher wants to go to a particular place on the Internet, such as Facebook or the homepage of the NFL.
3. Transactional queries: The searcher wants to do something, such as buy a plane ticket or listen to a song.
4. Commercial investigation: The searcher wants to compare products and find the best one for their specific needs.
5. Local queries: The searcher wants to find something locally, such as a nearby coffee shop, doctor, or music venue.
An important step in the keyword research process is surveying the SERP landscape for the keyword you want to target in order to get a better gauge of searcher intent. If you want to know what type of content your target audience wants, look to the SERPs!
Google has closely evaluated the behavior of trillions of searches in an attempt to provide the most desired content for each specific keyword search.
Take the search “dresses,” for example:
From the shopping carousel, you can infer that Google has determined many people who search for "dresses" want to shop for dresses online.
There is also a Local Pack feature for this keyword, indicating Google’s desire to help searchers who may be looking for local dress retailers.
If the query is ambiguous, Google will also sometimes include the "refine by" feature to help searchers further specify what they're looking for. By doing so, the search engine can provide results that better help the searcher accomplish their task.
Google has a wide array of result types it can serve up depending on the query, so if you’re going to target a keyword, look to the SERP to understand what type of content you need to create.
Tools for determining the value of a keyword
How much value would a keyword add to your website? These tools can help you answer that question, so they’d make great additions to your keyword research arsenal:
Moz Keyword Explorer - Our own Moz Keyword Explorer tool extracts accurate search volume data, keyword difficulty, and keyword opportunity metrics by using live clickstream data. To learn more about how we're producing our keyword data, check out Announcing Keyword Explorer.
Google Keyword Planner - Google's AdWords Keyword Planner has historically been the most common starting point for SEO keyword research. However, Keyword Planner does restrict search volume data by lumping keywords together into large search volume range buckets. To learn more, check out Google Keyword Planner’s Dirty Secrets.
Google Trends - Google’s keyword trend tool is great for finding seasonal keyword fluctuations. For example, “funny halloween costume ideas” will peak in the weeks before Halloween.
AnswerThePublic - This free tool populates commonly searched for questions around a specific keyword. Bonus! You can use this tool in tandem with another free tool, Keywords Everywhere, to prioritize ATP’s suggestions by search volume.
SpyFu Keyword Research Tool - Provides some really neat competitive keyword data.
Download our free keyword research template! Keyword research can yield a ton of data. Stay organized by downloading our free keyword research template. You can customize the template to fit your unique needs (ex: remove the “Seasonal Trends” column), sort keywords by volume, and categorize by Priority Score. Happy keyword researching!
Now that you know how to uncover what your target audience is searching for and how often, it’s time to move onto the next step: crafting pages in a way that users will love and search engines can understand.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2AxE7DT via IFTTT
0 notes
wickedbananas · 7 years ago
Text
Better Than Basics: Custom-Tailoring Your SEO Approach
Posted by Laura.Lippay
Just like people, websites come in all shapes and sizes. They’re different ages, with different backgrounds, histories, motivations, and resources at hand. So when it comes to approaching SEO for a site, one-size-fits-all best practices are typically not the most effective way to go about it (also, you’re better than that).
An analogy might be if you were a fitness coach. You have three clients. One is a 105lb high school kid who wants to beef up a little. One is a 65-year-old librarian who wants better heart health. One is a heavyweight lumberjack who’s working to be the world’s top springboard chopper. Would you consider giving each of them the same diet and workout routine? Probably not. You’re probably going to:
Learn all you can about their current diet, health, and fitness situations.
Come up with the best approach and the best tactics for each situation.
Test your way into it and optimize, as you learn what works and what doesn’t.
In SEO, consider how your priorities might be different if you saw similar symptoms — let’s say problems ranking anything on the first page — for:
New sites vs existing sites
New content vs older content
Enterprise vs small biz
Local vs global
Type of market — for example, a news site, e-commerce site, photo pinning, or a parenting community
A new site might need more sweat equity or have previous domain spam issues, while an older site might have years of technical mess to clean up. New content may need the right promotional touch while old content might just simply be stale. The approach for enterprise is often, at its core, about getting different parts of the organization to work together on things they don’t normally do, while the approach for small biz is usually more scrappy and entrepreneurial.
With the lack of trust in SEO today, people want to know if you can actually help them and how. Getting to know the client or project intimately and proposing custom solutions shows that you took the time to get to know the details and can suggest an effective way forward. And let’s not forget that your SEO game plan isn’t just important for the success of the client — it’s important for building your own successes, trust, and reputation in this niche industry.
How to customize an approach for a proposal
Do: Listen first
Begin by asking questions. Learn as much as you can about the situation at hand, the history, the competition, resources, budget, timeline, etc. Maybe even sleep on it and ask more questions before you provide a proposal for your approach.
Consider the fitness trainer analogy again. Now that you've asked questions, you know that the high school kid is already at the gym on a regular basis and is overeating junk food in his attempt to beef up. The librarian has been on a low-salt paleo diet since her heart attack a few years ago, and knows she needs to exercise but refuses to set foot in a gym. The lumberjack is simply a couch potato.
Now that you know more, you can really tailor a proposed approach that might appeal to your potential client and allow you and the client to see how you might reach some initial successes.
Do: Understand business priorities.
What will fly? What won’t fly? What can we push for and what’s off the table? Even if you feel strongly about particular tactics, if you can’t shape your work within a client’s business priorities you may have no client at all.
Real-world example:
Site A wanted to see how well they could rank against their biggest content-heavy SERP competitors like Wikipedia but wanted to keep a sleek, content-light experience. Big-brand SEO vendors working for Site A pushed general, content-heavy SEO best practices. Because Site A wanted solutions that fit into their current workload along with a sleek, content-light experience, they pushed back.
The vendors couldn’t keep the client because they weren’t willing to get into the clients workload groove and go beyond general best practices. They didn’t listen to and work within the client’s specific business objectives.
Site A hired internal SEO resources and tested into an amount of content that they were comfortable with, in sync with technical optimization and promotional SEO tactics, and saw rankings slowly improve. Wikipedia and the other content-heavy sites are still sometimes outranking Site A, but Site A is now a stronger page one competitor, driving more traffic and leads, and can make the decision from here whether it’s worth it to continue to stay content-light or ramp up even more to get top 3 rankings more often.
The vendors weren’t necessarily incorrect in suggesting going content-heavy for the purpose of competitive ranking, but they weren’t willing to find the middle ground to test into light content first, and they lost a big brand client. At its current state, Site A could ramp up content even more, but gobs of text doesn’t fit the sleek brand image and it’s not proven that it would be worth the engineering maintenance costs for that particular site — a very practical, “not everything in SEO is most important all the time” approach.
Do: Find the momentum
It’s easiest to inject SEO where there’s already momentum into a business running full-speed ahead. Are there any opportunities to latch onto an effort that’s just getting underway? This may be more important than your typical best practice priorities.
Real-world example:
Brand X had 12–20 properties (websites) at any given time, but their small SEO team could only manage about 3 at a time. Therefore the SEO team had to occasionally assess which properties they would be working with. Properties were chosen based on:
Which ones have the biggest need or opportunities?
Which ones have resources that they’re willing to dedicate?
Which ones are company priorities?
#2 was important. Without it, the idea that one of the properties might have the biggest search traffic opportunity didn’t matter if they had no resources to dedicate to implement the SEO team’s recommendations.
Similarly, in the first example above, the vendors weren’t able to go with the client’s workflow and lost the client. Make sure you’re able to identify which wheels are moving that you can take advantage of now, in order to get things done. There may be some tactics that will have higher impact, but if the client isn’t ready or willing to do them right now, you’re pushing a boulder uphill.
Do: Understand the competitive landscape
What is this site up against? What is the realistic chance they can compete? Knowing what the competitive landscape looks like, how will that influence your approach?
Real-world example:
Site B has a section of pages competing against old, strong, well-known, content-heavy, link-rich sites. Since it's a new site section, almost everything needs to be done for Site B — technical optimization, building content, promotion, and generating links. However, the nature of this competitive landscape shows us that being first to publish might be important here. Site B's competitors oftentimes have content out weeks if not months before the actual content brand owner (Site B). How? By staying on top of Site B's press releases. The competitors created landing pages immediately after Site B put out a press release, while Site B didn't have a landing page until the product actually launched.

Once this was realized, being first to publish became an important factor. And because Site B is an enterprise site, and changing that process takes time internally, other technical and content optimization for the page templates happened concurrently, so that there was at least minimal technical optimization and content on these pages by the time the process for first-publishing was shaped.
Site B is now generating product landing pages at the time of press release, with links to the landing pages in those press releases that are picked up by news outlets, giving Site B the first page and the first links, and this is generating more links than their top competitor in the first 7 days 80% of the time.
Site B didn’t audit the site and suggest tactics by simply checking off a list of technical optimizations prioritized by an SEO tool or ranking factors, but instead took a more calculated approach based on what’s happening in the competitive landscape, combined with the top prioritized technical and content optimizations. Optimizing the site itself without understanding the competitive landscape in this case would be leaving the competitors, who also have optimized sites with a lot of content, a leg up because they were cited (linked to) and picked up by Google first.
Do: Ask what has worked and hasn’t worked before
Asking this question can be very informative and help you drill down on areas that might be a more effective use of time. If the site has been around for a while, and especially if they already have an SEO working with them, try to find out what they've already done that has worked and what hasn't, to give you clues about which approaches might be successful for this site.
General example:
Site C has hundreds, sometimes thousands of internal cross-links on their pages, very little unique text content, and doesn’t see as much movement for cross-linking projects as they do when adding unique text.
Site D knows from previous testing that generating more keyword-rich content on their landing pages hasn’t been as effective as implementing better cross-linking, especially since there is very little cross-linking now.
Therefore each of these sites should be prioritizing text and cross-linking tactics differently. Be sure to ask the client or potential client about previous tests or ranking successes and failures in order to learn what tactics may be more relevant for this site before you suggest and prioritize your own.
Do: Make sure you have data
Ask the client what they’re using to monitor performance. If they do not have the basics, suggest setting it up or fold that into your proposal as a first step. Define what data essentials you need to analyze the site by asking the client about their goals, walking through how to measure those goals with them, and then determining the tools and analytics setup you need. Those essentials might be something like:
Webmaster tools set up. I like to have at least Google and Bing, so I can compare across search engines to help determine if a spike or a drop is happening in both search engines, which might indicate that the cause is from something happening with the site, or in just one search engine, which might indicate that the cause is algo-related.
Organic search engine traffic. At the very least, you should be able to see organic search traffic by page type (ex: service pages versus product pages). At best, you can also filter by things like URL structure, country, date, and referrers/source, and be able to run regex queries for granularity (see the sketch after this list).
User testing & focus groups. Optional, but useful if it’s available & can help prioritization. Has the site gathered any insights from users that could be helpful in deciding on and prioritizing SEO tactics? For example, focus groups on one site showed us that people were more likely to convert if they could see a certain type of content that wouldn’t have necessarily been a priority for SEO otherwise. If they’re more likely to convert, they’re less likely to bounce back to search results, so adding that previously lower-priority content could have double advantages for the site: higher conversions and lower bounce rate back to SERPs.
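To illustrate the kind of regex queries mentioned above, here are two patterns of the sort you might run against page URLs in your analytics tool; the URL structures are hypothetical:

^/products/.+ matches only product pages, e.g. /products/red-womens-running-shoe
^/blog/2018/\d{2}/.+ matches only blog posts published in 2018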
Don’t: Make empty promises.
Put simply, please, SEOs, do not blanket promise anything. Hopeful promises lead to SEOs being called snake oil salesmen. This is a real problem for all of us, and you can help turn it around.
Clients and managers will try to squeeze you until you break and give them a number or a promised rank. Don't do it. This is like a new judoka asking the coach to promise they'll make it to the Olympics if they sign up for the program. The level of success depends on what the judoka puts into it, what her competition looks like, and her tenacity: her courage, endurance, competitiveness, resistance... You promise, she signs up, says "Oh, this takes work, so I'm only going to come to practice on Saturdays," and everybody loses.
Goals are great. Promises are trouble. Good contracts are imperative.
Here are some examples:
We will get you to page 1. No matter how successful you may have been in the past, every site, competitive landscape, and team behind the site is a different challenge. A promise of #1 rankings may be a selling point to get clients, but can you live up to it? What will happen to your reputation if you don't? This industry is small enough that word gets around when people are not doing right by their clients.
Rehashing vague stats. I recently watched a well-known agency tell a room full of SEOs: "The search result will provide in-line answers for 47% of your customer queries." Obviously this isn't going to be true for every SEO in the room, since different types of queries have different SERPs, and the SERP UI constantly changes, but how many of the people in that room went back to their companies and their clients and told them that? What happens to those SEOs if that doesn't prove true?
We will increase traffic by n%. Remember, hopeful promises can lead to being called snake oil salesmen. If you can avoid performance promises, especially in the proposal process, by all means please do. Set well-informed goals rather than high-risk promises, and be conservative when you can. It always looks better to over-perform than to not reach a goal.
You will definitely see improvement. Honestly, I wouldn't even promise this unless you would *for real* bet your life on it. You may see plenty of opportunities for optimization, but you can't be sure they'll implement anything, that they'll implement it correctly, that implementations won't get overwritten, that competitors won't step it up or new ones won't emerge, or that the optimization opportunities you see will even work on this site.
Don’t: Use the same proposal for every situation at hand.
If your proposal is so vague that it might actually seem to apply to any site, then you really should consider taking a deeper look at each situation at hand before you propose.
Would you want your doctor to prescribe the same thing for your (not yet known) pregnancy as the next person’s (not yet known) fungal blood infection, when you both just came in complaining of fatigue?
Do: Cover yourself in your contract
As a side note for consultants, this is a clause I include in my contract with clients for protection against being sued if clients aren’t happy with their results. It’s especially helpful for stubborn clients who don’t want to do the work and expect you to perform magic. Feel free to use it:
“Consultant makes no warranty, express, implied or statutory, with respect to the services provided hereunder, including without limitation any implied warranty of reliability, usefulness, merchantability, fitness for a particular purpose, noninfringement, or those arising from the course of performance, dealing, usage or trade. By signing this agreement, you acknowledge that Consultant neither owns nor governs the actions of any search engine or the Customer’s full implementations of recommendations provided by Consultant. You also acknowledge that due to non-responsibility over full implementations, fluctuations in the relative competitiveness of some search terms, recurring changes in search engine algorithms and other competitive factors, it is impossible to guarantee number one rankings or consistent top ten rankings, or any other specific search engines rankings, traffic or performance.”
Go get 'em!
The way you approach a new SEO client or project is critical to setting yourself up for success. And I believe we can all learn from each other’s experiences. Have you thought outside the SEO standards box to find success with any of your clients or projects? Please share in the comments!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2OxcAVV via IFTTT
1 note · View note
wickedbananas · 7 years ago
Text
Using the Flowchart Method for Diagnosing Ranking Drops - Whiteboard Friday
Posted by KameronJenkins
Being able to pinpoint the reason for a ranking drop is one of our most perennial and potentially frustrating tasks as SEOs. There are an unknowable number of factors that go into ranking these days, but luckily the methodology for diagnosing those fluctuations is readily at hand. In today's Whiteboard Friday, we welcome the wonderful Kameron Jenkins to show us a structured way to diagnose ranking drops using a flowchart method and critical thinking.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Video Transcription
Hey, everyone. Welcome to this week's edition of Whiteboard Friday. My name is Kameron Jenkins. I am the new SEO Wordsmith here at Moz, and I'm so excited to be here. Before this, I worked at an agency for about six and a half years. I worked in the SEO department, and really a common thing we encountered was a client's rankings dropped. What do we do?
This flowchart was kind of built out of that mentality of we need a logical workflow to be able to diagnose exactly what happened so we can make really pointed recommendations for how to fix it, how to get our client's rankings back. So let's dive right in. It's going to be a flowchart, so it's a little nonlinear, but hopefully this makes sense and helps you work smarter rather than harder.
Was it a major ranking drop?: No
The first question I'd want to ask is: Was their ranking drop major? By major, I would say that's something like page 1 to page 5 overnight. Minor would be something like it just fell a couple positions, like position 3 to position 5.
We're going to take this path first. It was minor.
Has there been a pattern of decline lasting about a month or greater?
That's not a magic number. A month is something that you can use as a benchmark. But if there's been a steady decline and it's been one week it's position 3 and then it's position 5 and then position 7, and it just keeps dropping over time, I would consider that a pattern of decline.
So if no, I would actually say wait.
Volatility is normal, especially if you're at the bottom of page 1, maybe page 2 plus. There's going to be a lot more shifting of the search results in those positions. So volatility is normal.
Keep your eyes on it, though. It's really good to just take note of it like, "Hey, we dropped. Okay, I'm going to check that again next week and see if it continues to drop, then maybe we'll take action."
Wait it out. At this point, I would just caution against making big website updates if it hasn't really been warranted yet. So volatility is normal. Expect that. Keep your finger on the pulse, but just wait it out at this point.
If there has been a pattern of decline though, I'm going to have you jump to the algorithm update section. We're going to get there in a second. But for now, we're going to go take the major rankings drop path.
Was it a major ranking drop?: Yes
The first question on this path that I'd want to ask is:
Was there a rank tracking issue?
Now, some of these are going seem pretty basic, like how would that ever happen, but believe me it happens every once in a while. So just before we make major updates to the website, I'd want to check the rank tracking.
I. The wrong domain or URL.
That can be something that happens a lot. Maybe you changed domains, or maybe you moved a page, and the old page or the old domain is still listed in your rank tracker. If that's the case, the rank tracking tool doesn't know which URL to judge the rankings off of. So it's going to look like you dropped to position 10 overnight from position 1, and that's like, whoa, that's a huge drop. But actually you just have the wrong URL in there. So check that. If there's been a page update or a domain update, make sure that you've updated your rank tracker.
II. Glitches.
So it's software, it can break. There are things that could cause it to be off for whatever reason. I don't know how common that is. It probably is totally dependent on which kind of software you use. But glitches do happen, so I would manually check your rankings.
III. Manually check rankings.
One way I would do that is...
Go to incognito in Google and make sure you're logged out so it's not personalized. I would search the term that you're wanting to rank for and see where you're actually ranking.
Google's Ad Preview tool. That one is really good too if you want to search where you're ranking locally so you can set your geolocation. You could do mobile versus desktop rankings. So it could be really good for things like that.
Crosscheck with another tool, like Moz's tool for rank tracking. You can pop in your URLs, see where you're ranking, and cross-check that with your own tool.
So back to this. Rank tracking issues. Yes, you found your problem. If it was just a rank tracking tool issue, that's actually great, because it means you don't have to make a lot of changes. Your rankings actually haven't dropped. But if that's not the issue, if there is no rank tracking issue that you can pinpoint, then I would move on to Google Search Console.
Problems in Google Search Console?
So Google Search Console is really helpful for checking site health matters. One of the main things I would want to check in there, if you experience a major drop especially, is...
I. Manual actions.
If you navigate to Manual Actions, you could see notes in there like unnatural links pointing to your site. Or maybe you have thin or low-quality content on your site. If those things are present in your Manual Actions, then you have a reference point. You have something to go off of. There's a lot of work involved in lifting a manual penalty that we can't get into here unfortunately. Some things that you can do to focus on manual penalty lifting...
Moz's Link Explorer. You can check your inbound links and see their spam score. You could look at things like anchor text to see if maybe the links pointing to your site are keyword stuffed. So you can use tools like that.
There are a lot of good articles too, in the industry, just on getting penalties lifted. Marie Haynes especially has some really good ones. So I would check that out.
But you have found your problem if there's a manual action in there. So focus on getting that penalty lifted.
II. Indexation issues.
Before you move out of Search Console, though, I would check indexation issues as well. Maybe you don't have a manual penalty. Go to your index coverage report and see if anything you submitted in your sitemap is experiencing issues. Maybe it's blocked by robots.txt, or maybe you accidentally noindexed it. You can probably see that in the index coverage report. So that's Search Console. If yes, you found your problem. If no, you're going to move on to algorithm updates.
Algorithm updates
Algorithm updates happen all the time. Google says that maybe one to two happen per day. Not all of those are going to be major. The major ones, though, are listed. They're documented in multiple different places. Moz has a really good list of algorithm updates over time. You can for sure reference that. There are going to be a lot of good ones. You can navigate to the exact year and month that your site experienced a rankings drop and see if it maybe correlates with any algorithm update.
For example, say your site lost rankings in about January 2017. That's about the time that Google released its Intrusive Interstitials Update, and so I would look on my site, if that was the issue, and say, "Do I have intrusive interstitials? Is this something that's affecting my website?"
If you can match up an algorithm update with the time that your rankings started to drop, you have direction. You found an issue. If you can't match it up to any algorithm updates, it's finally time to move on to site updates.
Site updates
What changes happened to your website recently? There are a lot of different things that could have happened to your website. Just keep in mind too that maybe you're not the only one who has access to your website. You're the SEO, but maybe tech support has access. Maybe even your paid ad manager has access. There are a lot of different people who could be making changes to the website. So just keep that in mind when you're looking into it. It's not just the changes that you made, but changes that anyone made could affect the website's ranking. Just look into all possible factors.
Other factors that can impact rankings
A lot of different things, like I said, can influence your site's rankings. A lot of things can inadvertently happen that you can pinpoint and say, "Oh, that's definitely the cause."
Some examples of things that I've personally experienced on my clients' websites...
I. Renaming pages and letting them 404 without updating with a 301 redirect.
There was one situation where a client had a blog. They had hundreds of really good blog posts. They were all ranking for nice, long tail terms. A client emailed into tech support to change the name of the blog. Unfortunately, all of the posts lived under the blog, and when he did that, he didn't update it with a 301 redirect, so all of those pages, that were ranking really nicely, they started to fall out of the index. The rankings went with it. There's your problem. It was unfortunate, but at least we were able to diagnose what happened.
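For reference, this kind of rename is usually handled with 301 redirects at the server level. Here is a minimal sketch for Apache's mod_alias, assuming a hypothetical blog that moved from /blog/ to /news/:

# Redirect one renamed post permanently
Redirect 301 /blog/old-post-name /news/old-post-name

# Or catch every post under the old path with a single pattern
RedirectMatch 301 ^/blog/(.*)$ /news/$1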
II. Content cutting.
Maybe you're working with a UX team, a design team, someone who is looking at the website from a visual, a user experience perspective. A lot of times in these situations they might take a page that's full of really good, valuable content and they might say, "Oh, this is too clunky. It's too bulky. It has too many words. So we're going to replace it with an image, or we're going to take some of the content out."
When this happens, if the content was the thing that was making your page rank and you cut that, that's probably something that's going to affect your rankings negatively. By the way, if that's happening to you, Rand has a really good Whiteboard Friday on kind of how to marry user experience and SEO. You should definitely check that out if that's an issue for you.
III. Valuable backlinks lost.
Another situation: I was diagnosing a drop for a client, and one of their backlinks had disappeared. It just so happened to be the only thing that changed over that period of time. It was a really valuable backlink, and we found out that the linking site just dropped it for whatever reason, and the client's rankings started to decline after that. With tools like Moz's Link Explorer, you can go in and see gained and lost backlinks over time, so I would check that out if this might be an issue for you.
IV. Accidental no index.
Depending on what type of CMS you work with, it might be really, really easy to accidentally check No Index on a page. If you no index a really important page, Google takes it out of its index. That could happen. Your rankings could drop.

So those are just some examples of things that can happen. Like I said, hundreds and hundreds of things could have been changed on your site, but it's just really important to try to pinpoint exactly what those changes were and if they coincided with when your rankings started to drop.
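By the way, if you suspect an accidental noindex, a quick check is to view the page source and search for "noindex"; the culprit usually looks something like this in the page's head:

<meta name="robots" content="noindex">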
SERP landscape
So we got all the way to the bottom. If you're at the point where you've looked at all of the site updates and you still haven't found anything that would have caused a rankings drop, I would say finally look at the SERP landscape.
What I mean by that is just Google your keyword that you want to rank for or your group of keywords that you want to rank for and see which websites are ranking on page 1. I would get a lay of the land and just see:
What are these pages doing?
How many backlinks do they have?
How much content do they have?
Do they load fast?
What's the experience?
Then make content better than that. So many people think that ranking is just about avoiding spammy tactics and not having things broken on your site. But that's not SEO. That's really just what allows you to compete. You have to have content that's the best answer to searchers' questions, and that's what is going to get you ranking.
I hope that was helpful. This is a really good way to just kind of work through a ranking drop diagnosis. If you have methods, by the way, that work for you, I'd love to hear from you and see what worked for you in the past. Let me know, drop it in the comments below.
Thanks, everyone. Come back next week for another edition of Whiteboard Friday.
Video transcription by Speechpad.com
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2mNGI2S via IFTTT
1 note · View note
wickedbananas · 7 years ago
Text
Rewriting the Beginner's Guide to SEO, Chapter 2: Crawling, Indexing, and Ranking
Posted by BritneyMuller
It's been a few months since our last share of our work-in-progress rewrite of the Beginner's Guide to SEO, but after a brief hiatus, we're back to share our draft of Chapter Two with you! This wouldn’t have been possible without the help of Kameron Jenkins, who has thoughtfully contributed her great talent for wordsmithing throughout this piece.
This is your resource, the guide that likely kicked off your interest in and knowledge of SEO, and we want to do right by you. You left amazingly helpful commentary on our outline and draft of Chapter One, and we'd be honored if you would take the time to let us know what you think of Chapter Two in the comments below.
Chapter 2: How Search Engines Work – Crawling, Indexing, and Ranking
First, show up.
As we mentioned in Chapter 1, search engines are answer machines. They exist to discover, understand, and organize the internet's content in order to offer the most relevant results to the questions searchers are asking.
In order to show up in search results, your content needs to first be visible to search engines. It's arguably the most important piece of the SEO puzzle: If your site can't be found, there's no way you'll ever show up in the SERPs (Search Engine Results Page).
How do search engines work?
Search engines have three primary functions:
Crawl: Scour the Internet for content, looking over the code/content for each URL they find.
Index: Store and organize the content found during the crawling process. Once a page is in the index, it’s in the running to be displayed as a result to relevant queries.
Rank: Provide the pieces of content that will best answer a searcher's query. Order the search results by the most helpful to a particular query.
What is search engine crawling?
Crawling is the discovery process in which search engines send out a team of robots (known as crawlers or spiders) to find new and updated content. Content can vary — it could be a webpage, an image, a video, a PDF, etc. — but regardless of the format, content is discovered by links.
The bot starts out by fetching a few web pages, and then follows the links on those webpages to find new URLs. By hopping along this path of links, crawlers are able to find new content and add it to their index — a massive database of discovered URLs — to later be retrieved when a searcher is seeking information that the content on that URL is a good match for.
What is a search engine index?
Search engines process and store information they find in an index, a huge database of all the content they’ve discovered and deem good enough to serve up to searchers.
Search engine ranking
When someone performs a search, search engines scour their index for highly relevant content and then order that content in the hopes of solving the searcher's query. This ordering of search results by relevance is known as ranking. In general, you can assume that the higher a website is ranked, the more relevant the search engine believes that site is to the query.
It’s possible to block search engine crawlers from part or all of your site, or instruct search engines to avoid storing certain pages in their index. While there can be reasons for doing this, if you want your content found by searchers, you have to first make sure it’s accessible to crawlers and is indexable. Otherwise, it’s as good as invisible.
By the end of this chapter, you’ll have the context you need to work with the search engine, rather than against it!
Note: In SEO, not all search engines are equal
Many beginners wonder about the relative importance of particular search engines. Most people know that Google has the largest market share, but how important is it to optimize for Bing, Yahoo, and others? The truth is that despite the existence of more than 30 major web search engines, the SEO community really only pays attention to Google. Why? The short answer is that Google is where the vast majority of people search the web. If we include Google Images, Google Maps, and YouTube (a Google property), more than 90% of web searches happen on Google — that's nearly 20 times Bing and Yahoo combined.
Crawling: Can search engines find your site?
As you've just learned, making sure your site gets crawled and indexed is a prerequisite for showing up in the SERPs. First things first: You can check to see how many and which pages of your website have been indexed by Google using "site:yourdomain.com", an advanced search operator.
Head to Google and type "site:yourdomain.com" into the search bar. This will return results Google has in its index for the site specified:
The number of results Google displays (see “About __ results” above) isn't exact, but it does give you a solid idea of which pages are indexed on your site and how they are currently showing up in search results.
For more accurate results, monitor and use the Index Coverage report in Google Search Console. You can sign up for a free Google Search Console account if you don't currently have one. With this tool, you can submit sitemaps for your site and monitor how many submitted pages have actually been added to Google's index, among other things.
If you're not showing up anywhere in the search results, there are a few possible reasons why:
Your site is brand new and hasn't been crawled yet.
Your site isn't linked to from any external websites.
Your site's navigation makes it hard for a robot to crawl it effectively.
Your site contains some basic code called crawler directives that is blocking search engines.
Your site has been penalized by Google for spammy tactics.
If your site doesn't have any other sites linking to it, you still might be able to get it indexed by submitting your XML sitemap in Google Search Console or manually submitting individual URLs to Google. There's no guarantee they'll include a submitted URL in their index, but it's worth a try!
Can search engines see your whole site?
Sometimes a search engine will be able to find parts of your site by crawling, but other pages or sections might be obscured for one reason or another. It's important to make sure that search engines are able to discover all the content you want indexed, and not just your homepage.
Ask yourself this: Can the bot crawl through your website, and not just to it?
Is your content hidden behind login forms?
If you require users to log in, fill out forms, or answer surveys before accessing certain content, search engines won't see those protected pages. A crawler is definitely not going to log in.
Are you relying on search forms?
Robots cannot use search forms. Some individuals believe that if they place a search box on their site, search engines will be able to find everything that their visitors search for.
Is text hidden within non-text content?
Non-text media forms (images, video, GIFs, etc.) should not be used to display text that you wish to be indexed. While search engines are getting better at recognizing images, there's no guarantee they will be able to read and understand them just yet. It's always best to add text within the <HTML> markup of your webpage.
Can search engines follow your site navigation?
Just as a crawler needs to discover your site via links from other sites, it needs a path of links on your own site to guide it from page to page. If you’ve got a page you want search engines to find but it isn’t linked to from any other pages, it’s as good as invisible. Many sites make the critical mistake of structuring their navigation in ways that are inaccessible to search engines, hindering their ability to get listed in search results.
Common navigation mistakes that can keep crawlers from seeing all of your site:
Having a mobile navigation that shows different results than your desktop navigation
Any type of navigation where the menu items are not in the HTML, such as JavaScript-enabled navigations. Google has gotten much better at crawling and understanding JavaScript, but it's still not a perfect process. The more surefire way to ensure something gets found, understood, and indexed by Google is by putting it in the HTML.
Personalization, or showing unique navigation to a specific type of visitor versus others, could appear to be cloaking to a search engine crawler
Forgetting to link to a primary page on your website through your navigation — remember, links are the paths crawlers follow to new pages!
This is why it's essential that your website has a clear navigation and helpful URL folder structures.
Information architecture
Information architecture is the practice of organizing and labeling content on a website to improve efficiency and findability for users. The best information architecture is intuitive, meaning that users shouldn't have to think very hard to flow through your website or to find something.
Your site should also have a useful 404 (page not found) page for when a visitor clicks on a dead link or mistypes a URL. The best 404 pages allow users to click back into your site so they don’t bounce off just because they tried to access a nonexistent link.
Tell search engines how to crawl your site
In addition to making sure crawlers can reach your most important pages, it’s also pertinent to note that you’ll have pages on your site you don’t want them to find. These might include things like old URLs that have thin content, duplicate URLs (such as sort-and-filter parameters for e-commerce), special promo code pages, staging or test pages, and so on.
Blocking pages from search engines can also help crawlers prioritize your most important pages and maximize your crawl budget (the average number of pages a search engine bot will crawl on your site).
Crawler directives allow you to control what you want Googlebot to crawl and index using a robots.txt file, meta tag, sitemap.xml file, or Google Search Console.
Robots.txt
Robots.txt files are located in the root directory of websites (ex. yourdomain.com/robots.txt) and suggest which parts of your site search engines should and shouldn't crawl via specific robots.txt directives. This is a great solution when trying to block search engines from non-private pages on your site.
You wouldn't want to block private/sensitive pages from being crawled here because the file is easily accessible by users and bots.
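As a simple sketch (the paths here are hypothetical), a robots.txt file that lets all crawlers in but keeps them out of staging pages and internal search results might look like this:

User-agent: *
Disallow: /staging/
Disallow: /search/

Sitemap: https://yourdomain.com/sitemap.xml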
Pro tip:
If Googlebot can't find a robots.txt file for a site (40X HTTP status code), it proceeds to crawl the site.
If Googlebot finds a robots.txt file for a site (20X HTTP status code), it will usually abide by the suggestions and proceed to crawl the site.
If Googlebot finds neither a 20X nor a 40X HTTP status code (ex. a 501 server error), it can't determine whether you have a robots.txt file and won't crawl your site.
Meta directives
The two types of meta directives are the meta robots tag (more commonly used) and the x-robots-tag. Each provides crawlers with stronger instructions on how to crawl and index a URL's content.
The x-robots tag provides more flexibility and functionality if you want to block search engines at scale because you can use regular expressions, block non-HTML files, and apply sitewide noindex tags.
These are the best options for blocking more sensitive*/private URLs from search engines.
*For very sensitive URLs, it is best practice to remove them from or require a secure login to view the pages.
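For illustration, the meta robots tag lives in a page's head, while the x-robots-tag is sent as an HTTP header. The Apache snippet below is a hedged sketch that assumes the mod_headers module is enabled:

<!-- Meta robots tag, placed in the page's <head> -->
<meta name="robots" content="noindex, nofollow">

# X-Robots-Tag via Apache config, e.g. to noindex every PDF on the site
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>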
WordPress Tip: In Dashboard > Settings > Reading, make sure the "Search Engine Visibility" box is not checked. This blocks search engines from coming to your site via your robots.txt file!
Avoid these common pitfalls, and you'll have clean, crawlable content that will allow bots easy access to your pages.
Sitemaps
A sitemap is just what it sounds like: a list of URLs on your site that crawlers can use to discover and index your content. One of the easiest ways to ensure Google is finding your highest priority pages is to create a file that meets Google's standards and submit it through Google Search Console. While submitting a sitemap doesn’t replace the need for good site navigation, it can certainly help crawlers follow a path to all of your important pages.
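For reference, a bare-bones sitemap following the sitemaps.org protocol looks like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/high-priority-page/</loc>
    <lastmod>2018-08-01</lastmod>
  </url>
</urlset>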
Google Search Console
Some sites (most common with e-commerce) make the same content available on multiple different URLs by appending certain parameters to URLs. If you’ve ever shopped online, you’ve likely narrowed down your search via filters. For example, you may search for “shoes” on Amazon, and then refine your search by size, color, and style. Each time you refine, the URL changes slightly. How does Google know which version of the URL to serve to searchers? Google does a pretty good job at figuring out the representative URL on its own, but you can use the URL Parameters feature in Google Search Console to tell Google exactly how you want them to treat your pages.
Indexing: How do search engines understand and remember your site?
Once you’ve ensured your site has been crawled, the next order of business is to make sure it can be indexed. That’s right — just because your site can be discovered and crawled by a search engine doesn’t necessarily mean that it will be stored in their index. In the previous section on crawling, we discussed how search engines discover your web pages. The index is where your discovered pages are stored. After a crawler finds a page, the search engine renders it just like a browser would. In the process of doing so, the search engine analyzes that page's contents. All of that information is stored in its index.
Read on to learn about how indexing works and how you can make sure your site makes it into this all-important database.
Can I see how a Googlebot crawler sees my pages?
Yes, the cached version of your page will reflect a snapshot of the last time Googlebot crawled it.
Google crawls and caches web pages at different frequencies. More established, well-known sites that post frequently like https://www.nytimes.com will be crawled more frequently than the much-less-famous website for Roger the Mozbot’s side hustle, http://www.rogerlovescupcakes.com (if only it were real…)
You can view what your cached version of a page looks like by clicking the drop-down arrow next to the URL in the SERP and choosing "Cached":
You can also view the text-only version of your site to determine if your important content is being crawled and cached effectively.
Are pages ever removed from the index?
Yes, pages can be removed from the index! Some of the main reasons why a URL might be removed include:
The URL is returning a "not found" error (4XX) or server error (5XX) – This could be accidental (the page was moved and a 301 redirect was not set up) or intentional (the page was deleted and 404ed in order to get it removed from the index)
The URL had a noindex meta tag added – This tag can be added by site owners to instruct the search engine to omit the page from its index (see the example tags after this list).
The URL has been manually penalized for violating the search engine’s Webmaster Guidelines and, as a result, was removed from the index.
The URL has been blocked from crawling by requiring a password before visitors can access the page.
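For reference, the noindex meta tag mentioned in the list above follows the standard robots meta tag syntax and lives in a page's <head>. A minimal sketch (whether you pair noindex with follow depends on your goals):

```html
<!-- Tells search engines to omit this page from their index -->
<meta name="robots" content="noindex" />

<!-- Common variant: stay out of the index, but still let crawlers
     follow the links on this page -->
<meta name="robots" content="noindex, follow" />
```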
If you believe that a page on your website that was previously in Google’s index is no longer showing up, you can manually submit the URL to Google by navigating to the “Submit URL” tool in Search Console.
Ranking: How do search engines rank URLs?
How do search engines ensure that when someone types a query into the search bar, they get relevant results in return? That process is known as ranking, or the ordering of search results by most relevant to least relevant to a particular query.
To determine relevance, search engines use algorithms: processes or formulas by which stored information is retrieved and ordered in meaningful ways. These algorithms have gone through many changes over the years in order to improve the quality of search results. Google, for example, makes algorithm adjustments every day — some of these updates are minor quality tweaks, whereas others are core/broad algorithm updates deployed to tackle a specific issue, like Penguin to tackle link spam. Check out our Google Algorithm Change History for a list of both confirmed and unconfirmed Google updates going back to the year 2000.
Why does the algorithm change so often? Is Google just trying to keep us on our toes? While Google doesn’t always reveal specifics as to why they do what they do, we do know that Google’s aim when making algorithm adjustments is to improve overall search quality. That’s why, in response to algorithm update questions, Google will answer with something along the lines of: “We’re making quality updates all the time.” This indicates that, if your site suffered after an algorithm adjustment, you should compare it against Google’s Quality Guidelines or Search Quality Rater Guidelines; both are very telling in terms of what search engines want.
What do search engines want?
Search engines have always wanted the same thing: to provide useful answers to searchers’ questions in the most helpful formats. If that’s true, then why does it appear that SEO is different now than in years past?
Think about it in terms of someone learning a new language.
At first, their understanding of the language is very rudimentary — “See Spot Run.” Over time, their understanding starts to deepen, and they learn semantics — the meaning behind language and the relationship between words and phrases. Eventually, with enough practice, the student knows the language well enough to even understand nuance, and is able to provide answers to even vague or incomplete questions.
When search engines were just beginning to learn our language, it was much easier to game the system by using tricks and tactics that actually go against quality guidelines. Take keyword stuffing, for example. If you wanted to rank for a particular keyword like “funny jokes,” you might add the words “funny jokes” a bunch of times onto your page, and make it bold, in hopes of boosting your ranking for that term:
Welcome to funny jokes! We tell the funniest jokes in the world. Funny jokes are fun and crazy. Your funny joke awaits. Sit back and read funny jokes because funny jokes can make you happy and funnier. Some funny favorite funny jokes.
This tactic made for terrible user experiences, and instead of laughing at funny jokes, people were bombarded by annoying, hard-to-read text. It may have worked in the past, but this is never what search engines wanted.
The role links play in SEO
When we talk about links, we could mean two things. Backlinks or "inbound links" are links from other websites that point to your website, while internal links are links on your own site that point to your other pages (on the same site).
Links have historically played a big role in SEO. Very early on, search engines needed help figuring out which URLs were more trustworthy than others to help them determine how to rank search results. Calculating the number of links pointing to any given site helped them do this.
Backlinks work very similarly to real-life word-of-mouth (WOM) referrals. Let’s take a hypothetical coffee shop, Jenny’s Coffee, as an example:
Referrals from others = a good sign of authority. Example: Many different people have all told you that Jenny’s Coffee is the best in town.
Referrals from yourself = biased, so not a good sign of authority. Example: Jenny claims that Jenny’s Coffee is the best in town.
Referrals from irrelevant or low-quality sources = not a good sign of authority, and could even get you flagged for spam. Example: Jenny paid to have people who have never visited her coffee shop tell others how good it is.
No referrals = unclear authority. Example: Jenny’s Coffee might be good, but you’ve been unable to find anyone who has an opinion, so you can’t be sure.
This is why PageRank was created. PageRank (part of Google's core algorithm) is a link analysis algorithm named after one of Google's founders, Larry Page. PageRank estimates the importance of a web page by measuring the quality and quantity of links pointing to it. The assumption is that the more relevant, important, and trustworthy a web page is, the more links it will have earned.
The more natural backlinks you have from high-authority (trusted) websites, the better your odds are to rank higher within search results.
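For the mathematically curious, the simplified formula from the original PageRank paper captures the idea (Google's production systems have long since moved beyond this version, so treat it as a conceptual sketch). Here d is a damping factor, classically set around 0.85, the v's are the pages linking to page u, and L(v) is the number of outbound links on page v:

PR(u) = (1 − d) + d × [ PR(v₁)/L(v₁) + PR(v₂)/L(v₂) + … + PR(vₙ)/L(vₙ) ]

In plain terms: a link from a page that is itself well linked-to counts for more, and each page's "vote" is split evenly across everything it links out to.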
The role content plays in SEO
There would be no point to links if they didn’t direct searchers to something. That something is content! Content is more than just words; it’s anything meant to be consumed by searchers — there’s video content, image content, and of course, text. If search engines are answer machines, content is the means by which the engines deliver those answers.
Any time someone performs a search, there are thousands of possible results, so how do search engines decide which pages the searcher is going to find valuable? A big part of determining where your page will rank for a given query is how well the content on your page matches the query’s intent. In other words, does this page match the words that were searched and help fulfill the task the searcher was trying to accomplish?
Because of this focus on user satisfaction and task accomplishment, there are no strict benchmarks for how long your content should be, how many times it should contain a keyword, or what you put in your header tags. All of those can play a role in how well a page performs in search, but the focus should be on the users who will be reading the content.
Today, with hundreds or even thousands of ranking signals, the top three have stayed fairly consistent: links to your website (which serve as third-party credibility signals), on-page content (quality content that fulfills a searcher’s intent), and RankBrain.
What is RankBrain?
RankBrain is the machine learning component of Google’s core algorithm. Machine learning is a computer program that continues to improve its predictions over time through new observations and training data. In other words, it’s always learning, and because it’s always learning, search results should be constantly improving.
For example, if RankBrain notices a lower-ranking URL providing a better result to users than the higher-ranking URLs, you can bet that RankBrain will adjust those results, moving the more relevant result higher and demoting the less relevant pages as a byproduct.
Like most things with the search engine, we don’t know exactly what comprises RankBrain, but apparently, neither do the folks at Google.
What does this mean for SEOs?
Because Google will continue leveraging RankBrain to promote the most relevant, helpful content, we need to focus on fulfilling searcher intent more than ever before. Provide the best possible information and experience for searchers who might land on your page, and you’ve taken a big first step to performing well in a RankBrain world.
Engagement metrics: correlation, causation, or both?
With Google rankings, engagement metrics are most likely part correlation and part causation.
When we say engagement metrics, we mean data that represents how searchers interact with your site from search results. This includes things like:
Clicks (visits from search)
Time on page (amount of time the visitor spent on a page before leaving it)
Bounce rate (the percentage of all website sessions where users viewed only one page; a quick worked example follows this list)
Pogo-sticking (clicking on an organic result and then quickly returning to the SERP to choose another result)
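As a quick worked example of the bounce rate definition above (the numbers are invented): if a site records 1,000 sessions in a month and visitors view only a single page in 400 of them, the bounce rate is 400 ÷ 1,000 = 40%.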
Many tests, including Moz’s own ranking factor survey, have indicated that engagement metrics correlate with higher ranking, but causation has been hotly debated. Are good engagement metrics just indicative of highly ranked sites? Or are sites ranked highly because they possess good engagement metrics?
What Google has said
While they’ve never used the term “direct ranking signal,” Google has been clear that they absolutely use click data to modify the SERP for particular queries.
According to Google’s former Chief of Search Quality, Udi Manber:
“The ranking itself is affected by the click data. If we discover that, for a particular query, 80% of people click on #2 and only 10% click on #1, after a while we figure out probably #2 is the one people want, so we’ll switch it.”
Another comment from former Google engineer Edmond Lau corroborates this:
“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results. The actual mechanics of how click data is used is often proprietary, but Google makes it obvious that it uses click data with its patents on systems like rank-adjusted content items.”
Because Google needs to maintain and improve search quality, it seems inevitable that engagement metrics are more than correlation, but it would appear that Google falls short of calling engagement metrics a “ranking signal” because those metrics are used to improve search quality, and the rank of individual URLs is just a byproduct of that.
What tests have confirmed
Various tests have confirmed that Google will adjust SERP order in response to searcher engagement:
Rand Fishkin’s 2014 test resulted in a #7 result moving up to the #1 spot after getting around 200 people to click on the URL from the SERP. Interestingly, ranking improvement seemed to be isolated to the location of the people who visited the link. The rank position spiked in the US, where many participants were located, whereas it remained lower on the page in Google Canada, Google Australia, etc.
Larry Kim’s comparison of top pages and their average dwell time pre- and post-RankBrain seemed to indicate that the machine-learning component of Google’s algorithm demotes the rank position of pages that people don’t spend as much time on.
Darren Shaw’s testing has shown user behavior’s impact on local search and map pack results as well.
Since user engagement metrics are clearly used to adjust the SERPs for quality, and rank position changes as a byproduct, it’s safe to say that SEOs should optimize for engagement. Engagement doesn’t change the objective quality of your web page, but rather your value to searchers relative to other results for that query. That’s why, after no changes to your page or its backlinks, it could decline in rankings if searchers’ behavior indicates they like other pages better.
In terms of ranking web pages, engagement metrics act like a fact-checker. Objective factors such as links and content first rank the page, then engagement metrics help Google adjust if they didn’t get it right.
The evolution of search results
Back when search engines lacked a lot of the sophistication they have today, the term “10 blue links” was coined to describe the flat structure of the SERP. Any time a search was performed, Google would return a page with 10 organic results, each in the same format.
In this search landscape, holding the #1 spot was the holy grail of SEO. But then something happened. Google began adding results in new formats on their search result pages, called SERP features. Some of these SERP features include:
Paid advertisements
Featured snippets
People Also Ask boxes
Local (map) pack
Knowledge panel
Sitelinks
And Google is adding new ones all the time. It even experimented with “zero-result SERPs,” a phenomenon where only one result from the Knowledge Graph was displayed on the SERP with no results below it except for an option to “view more results.”
The addition of these features caused some initial panic for two main reasons. For one, many of these features caused organic results to be pushed down further on the SERP. Another byproduct is that fewer searchers are clicking on the organic results since more queries are being answered on the SERP itself.
So why would Google do this? It all goes back to the search experience. User behavior indicates that some queries are better satisfied by different content formats. Notice how the different types of SERP features match the different types of query intents.
Query Intent → Possible SERP Feature Triggered
Informational → Featured Snippet
Informational with one answer → Knowledge Graph / Instant Answer
Local → Map Pack
Transactional → Shopping
We’ll talk more about intent in Chapter 3, but for now, it’s important to know that answers can be delivered to searchers in a wide array of formats, and how you structure your content can impact the format in which it appears in search.
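One widely used way to influence that format is structured data markup. The guide doesn't prescribe a specific mechanism here, so treat the following as a general illustration; every value in the sketch is invented. A schema.org Recipe block like this helps a recipe page qualify for rich results in search:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Grandma's Apple Pie",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "prepTime": "PT30M",
  "cookTime": "PT1H",
  "recipeIngredient": ["6 apples", "1 pie crust", "1/2 cup sugar"]
}
</script>
```

The same idea extends to other content types (events, products, reviews, and so on), each with its own schema.org type.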
Localized search
A search engine like Google has its own proprietary index of local business listings, from which it creates local search results.
If you are performing local SEO work for a business that has a physical location customers can visit (ex: dentist) or for a business that travels to visit their customers (ex: plumber), make sure that you claim, verify, and optimize a free Google My Business Listing.
When it comes to localized search results, Google uses three main factors to determine ranking:
Relevance
Distance
Prominence
Relevance
Relevance is how well a local business matches what the searcher is looking for. To ensure that the business is doing everything it can to be relevant to searchers, make sure the business’ information is thoroughly and accurately filled out.
Distance
Google uses your geo-location to serve you better local results. Local search results are extremely sensitive to proximity, which refers to the location of the searcher and/or the location specified in the query (if the searcher included one).
Organic search results are sensitive to a searcher's location, though seldom as pronounced as in local pack results.
Prominence
With prominence as a factor, Google is looking to reward businesses that are well-known in the real world. In addition to a business’ offline prominence, Google also looks to some online factors to determine local ranking, such as:
Reviews
The number of Google reviews a local business receives, and the sentiment of those reviews, have a notable impact on its ability to rank in local results.
Citations
A "business citation" or "business listing" is a web-based reference to a local business' "NAP" (name, address, phone number) on a localized platform (Yelp, Acxiom, YP, Infogroup, Localeze, etc.).
Local rankings are influenced by the number and consistency of local business citations. Google pulls data from a wide variety of sources as it continuously builds its local business index. When Google finds multiple consistent references to a business's name, location, and phone number, it strengthens Google's "trust" in the validity of that data. This then leads to Google being able to show the business with a higher degree of confidence. Google also uses information from other sources on the web, such as links and articles.
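On your own website, one way to state your NAP data unambiguously for search engines is schema.org LocalBusiness markup. A hedged sketch, reusing the hypothetical Jenny's Coffee from earlier; all of the business details below are invented:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Jenny's Coffee",
  "telephone": "+1-555-010-0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "WA",
    "postalCode": "98100",
    "addressCountry": "US"
  }
}
</script>
```

Markup like this doesn't replace consistent citations across the web, but it gives Google one more consistent, machine-readable reference to the same name, address, and phone number.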
Check a local business' citation accuracy here.
Organic ranking
SEO best practices also apply to local SEO, since Google also considers a website’s position in organic search results when determining local ranking.
In the next chapter, you’ll learn on-page best practices that will help Google and users better understand your content.
[Bonus!] Local engagement
Although Google doesn't list engagement as a local ranking determiner, the role of engagement is only going to increase as time goes on. Google continues to enrich local results by incorporating real-world data like popular times to visit and average length of visits...
...and even provides searchers with the ability to ask the business questions!
Undoubtedly, local results are now being influenced by real-world data more than ever before. This interactivity (how searchers actually interact with and respond to local businesses) matters more than purely static, game-able information like links and citations.
Since Google wants to deliver the best, most relevant local businesses to searchers, it makes perfect sense for them to use real-time engagement metrics to determine quality and relevance.
You don’t have to know the ins and outs of Google’s algorithm (that remains a mystery!), but by now you should have a great baseline knowledge of how the search engine finds, interprets, stores, and ranks content. Armed with that knowledge, let’s learn about choosing the keywords your content will target!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2LBxlRP via IFTTT
4 notes · View notes
wickedbananas · 7 years ago
Link
New blog post on PhotoProSEO.com https://ift.tt/2NxcsVe
2 notes · View notes
wickedbananas · 7 years ago
Text
Reputation Management SEO: How to Own Your Branded Keywords in Google - Whiteboard Friday
Posted by randfish
A searcher's first experience with your brand happens on Google's SERPs — not your website. Having the ability to influence their organic first impression can go a long way toward improving both customer perception of your brand and conversion rates. In today's Whiteboard Friday, Rand takes us through the inherent challenges of reputation management SEO and tactics for doing it effectively.
Click on the whiteboard image above to open a high-resolution version in a new tab!
Video Transcription
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we are chatting about reputation management SEO.
So it turns out I've been having a number of conversations with many of you in the Moz community and many friends of mine in the startup and entrepreneurship worlds about this problem that happens pretty consistently, which is essentially that folks who are searching for your brand in Google experience their first touch before they ever get to your site; their first experience with your brand is through Google's search result page. This SERP, controlling what appears here, what it says, how it says it, who is ranking, where they're ranking, all of those kinds of things, can have a strong influence on a bunch of things.
The challenge
We know that the search results' content can impact...
Your conversion rate. People see that the reviews are generally poor or the wording is confusing or it creates questions in their mind that your content doesn't answer. That can hurt your conversion rate.
It can hurt amplification. People who see you in here, who think that there is something bad or negative about you, might be less likely to link to you or share or talk about you.
It can impact customer satisfaction. Customers who are going to buy from you but see something negative in the search results might be more likely to complain about it. Or if they see that you have a lower review or ranking or whatnot, they may be more likely to contribute a negative one than if they had seen that you had stellar ones. Their expectations are being biased by what's in these search results. A lot of times it is totally unfair.
So many of the conversations I've been having, for example with folks in the startup space, are like, "Hey, people are reviewing my product. We barely exist yet. We don't have these people as customers. We feel like maybe we're getting astroturfed by competitors, or someone is just jumping in here and trying to profit off the fact that we have a bunch of brand search now." So pretty frustrating.
How can we influence this page to maximize positive impact for our brand?
There are, however, some ways to address it. In order to change these results, make them better, Minted, for example, of which I should mention I used to be on Minted's Board of Directors, and so I believe my wife and I still have some stock in that company. So full disclosure there. But Minted, they're selling holiday cards. The holiday card market is about to heat up before November and December here in the United States, which is the Christmas holiday season, and that's when they sell a lot of these cards. So we can do a few things.
I. Change who ranks. So potentially remove some and add some new ones in here, give Google some different options. We could change the ranking order. So we could say, "Hey, we prefer this be lower down and this other one be higher up." We can change that through SEO.
II. Change the content of the ranking pages. If you have poor reviews or if someone has written about you in a particular way and you wish to change that, there are ways to influence that as well.
III. Change the SERP features. So we may be able to get images, for example, of Minted's cards up top, which would maybe make people more likely to purchase them, especially if they're exceptionally beautiful.
IV. Add in top stories. If Minted has some great press about them, we could try and nudge Google to use stuff from Google News in here. Maybe we could change what's in related searches, those types of things.
V. Shift search demand. So if it's the case that you're finding that people start typing "Minted" and then maybe are search suggested "Minted versus competitor X" or "Minted card problems" or whatever it is, I don't think either of those are actually in the suggest, but there are plenty of companies who do have that issue. When that's the case, you can also shift the search demand.
Reputation management tactics
Here are a number of tactics that I actually worked on with the help of Moz's Head of SEO, Britney Muller. Britney and I came up with a bunch of tactics, so many that they won't entirely fit on here, but we can describe a few more for you in the comments.
A. Directing links to URLs off your site (Helps with 1 & 2). First off, links are still a big influencer of a lot of the content that you see here. So it is the case that because Yelp is a powerful domain and they have lots of links, potentially even have lots of links to this page about Minted, it's the case that changing up those links, redirecting some of them, adding new links to places, linking out from your own site, linking from articles you contribute to, linking from, for example, the CEO's bio or a prominent influencer on the team's bio when they go and speak at events or contribute to sources, or when Minted makes donations, or when they support public causes, or when they're written about in the press, changing those links and where they point to can have a positive impact.
One of the problems that we see is that a lot of brands think, "All my links about my brand should always go to my homepage." That's not actually the case. It could be the case that you actually want to find, hey, maybe we would like our Facebook page to rank higher. Or hey, we wrote a great piece on Medium about our engineering practices or our diversity practices or how we give back to our community. Let's see if we can point some of our links to that.
B. Pitching journalists or bloggers or editors or content creators on the web (Helps with 1, 4, a little 3), of any kind, to write about you and your products with brand-titled pieces. This is one of the biggest elements that gets missed. For example, a journalist for the San Francisco Chronicle might write a piece about Minted and say something like, "At this startup, it's not unusual to find blah, blah, blah." What you want to do is go, "Come on, man, just put the word 'Minted' in the title of the piece." If they do, you've got a much better shot of having that piece potentially rank in here. So that's something that whoever you're working with on that content creation side, and maybe a reporter at the Chronicle would be much more difficult to do this, but a blogger who's writing about you or a reviewer, someone who's friendly to you, that type of a pitch would be much more likely to have some opportunity in there. It can get into the top stories SERP feature as well.
C. Crafting your own content (Helps with 1, a little 3). If they're not going to do it for you, you can craft your own content. You can do this in two kinds of ways. One is for open platforms like Medium.com or Huffington Post or Forbes or Inc. or LinkedIn, these places that accept those, or guest accepting publications that are much pickier, that are much more rarely taking input, but that rank well in your field. You don't have to think about this exclusively from a link building perspective. In fact, you don't care if the links are nofollow. You don't care if they give you no links at all. What you're trying to do is get your name, your title, your keywords into the title element of the post that's being put up.
D. You can influence reviews (Helps with 3 & 5). Depending on the site, it's different from site to site. So I'm putting TOS acceptable, terms of service acceptable nudges to your happy customers and prompt diligent support to the unhappy ones. So Yelp, for example, says, "Don't directly solicit reviews, but you are allowed to say, 'Our business is featured on Yelp.'" For someone like Minted, Yelp is mostly physical places, and while Minted technically has a location in San Francisco, their offices, it's kind of odd that this is what's ranking here. In fact, I wouldn't expect this to be. I think this is a strange result to have for an online-focused company, to have their physical location in there. So certainly by nudging folks who are using Minted to rather than contribute to their Facebook reviews or their Google reviews to actually say, "Hey, we're also on Yelp. If you've been happy with us, you can check us out there." Not go leave us a review there, but we have a presence.
E. Filing trademark violations (Helps with 1 & 3). So this is a legal path and legal angle, but it works in a couple of different ways. You can do a letter or an email from your attorney's office, and oftentimes that will shut things down. In fact, brief story, a friend of mine, who has a company, found that their product was featured on Amazon's website. They don't sell on Amazon. No one is reselling on Amazon. In fact, the product mostly hasn't even shipped yet. When they looked at the reviews, because they haven't sold very many of their product, it's an expensive product, none of the people who had left reviews were actually their customers. So they went, "What is going on here?" Well, it turns out Amazon, in order to list your product, needs your trademark permission. So they can send an attorney's note to Amazon saying, "Hey, you are using our product, our trademark, our brand name, our visuals, our photos without permission. You need to take that down."
The other way you can go about this is the Digital Millennium Copyright Act (DMCA) protocols. You can do this directly through Google, where you file and say basically, "Hey, they've taken copyrighted content from us and they're using it on their website, and that's illegal." Google will actually remove them from the search results. This is not necessarily a legal angle, but I bet you didn't know this. A few years ago I had an article on Wikipedia about me, Rand Fishkin. There was like a Wikipedia piece. I don't like that. Wikipedia, it's uncontrollable. Because I'm in the SEO world, I don't have a very good relationship with Wikipedia's editors. So I actually lobbied them, on the talk page of the article about me, to have it removed. There are a number of conditions that Wikipedia has where a page can be removed. I believe I got mine removed under the not notable enough category, which I think probably still applies. That was very successful. So wonderfully, now, Wikipedia doesn't rank for my name anymore, which means I can control the SERPs much more easily. So a potential there too.
F. Using brand advertising and/or influencer marketing to nudge searchers towards different phrases (Helps with 5). So what you call your products, how you market yourself is often how people will search for you. If Minted wanted to change this from Minted cards to minted photo cards, and they really like the results from minted photo cards and those had better conversion rates, they could start branding that through their advertising and their influencer marketing.
G. Surrounding your brand name, in a similar way, with common text, anchor phrases, and links to help create or reinforce an association that Google builds around language (Helps with 4 & 5). In that example I said before, having Minted plus a link to their photo cards page or Minted photo cards appearing on the web, not only their own website but everywhere else out there more commonly than Minted cards will bias related searches and search suggest. We've tested this. You can actually use anchor text and surrounding text to sort of bias, in addition to how people search, how Google shows it.
H. Leverage some platforms that rank well and influence SERP features (Helps with 2 & 4). So rather than just trying to get into the normal organic results, we might say, "Hey, I want some images here. Aha, Pinterest is doing phenomenal work at image SEO. If I put up a bunch of pictures from Minted, of Minted's cards or photo cards on Pinterest, I have a much better shot at ranking in and triggering the image results." You can do the same thing with YouTube for videos. You can do the same thing with new sites and for what's called the top stories feature. The same thing with local and local review sites for the maps and local results feature. So all kinds of ways to do that.
More...
Four final topics before we wrap up.
Registering and using separate domains? Should I register and use a separate domain, like MintedCardReviews, that's owned by Minted? Generally not. It's not impossible to do reputation management SEO through that, but it can be difficult. I'm not saying you might not want to give it a spin now and then, but generally that's sort of like creating your own reviews, your own site. Google often recognizes those and looks behind the domain registration wall, and potentially you have very little opportunity to rank for those, plus you're doing a ton of link building and that kind of stuff. Better to leverage someone's platform, who can already rank, usually.
Negative SEO attacks. You might remember the story from a couple weeks ago, in Fast Company, where Casper, the mattress brand, was basically accused of and found mostly to be generally guilty of going after and buying negative links to a review site that was giving them poor reviews, giving their mattresses poor reviews, and to minimal effect. I think, especially nowadays, this is much less effective than it was a few years ago following Google's last Penguin update. But certainly I would not recommend it. If you get found out for it, you can be sued too.
What about buying reviewers and review sites? This is what Casper ended up doing. So that site they were buying negative links against, they ended up just making an offer and buying out the person who owned it. Certainly it is a way to go. I don't know if it's the most ethical or honest thing to do, but it is a possibility.
Monitoring brand and rankings. Finally, I would urge you to, if you're not experiencing these today, but you're worried about them, definitely monitor your brand. You could use something like a Fresh Web Explorer or Mention.com or Talkwalker. And your rankings too. You want to be tracking your rankings so that you can see who's popping in there and who's not. Obviously, there are lots of SEO tools to do that.
All right, everyone, thanks for joining us, and we'll see again next week for another edition of Whiteboard Friday. Take care.
Video transcription by Speechpad.com
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2O2njaJ via IFTTT
27 notes · View notes
wickedbananas · 7 years ago
Text
The Local SEO’s Guide to the Buy Local Phenomenon: A Competitive Advantage for Clients
Posted by MiriamEllis
Photo credit: Michelle Shirley
What if a single conversation with one of your small local business clients could spark activity that would lead to an increase in their YOY sales of more than 7%, as opposed to only 4% if you don’t have the conversation? What if this chat could triple the amount of spending that stays in their town, reduce pollution in their community, improve their neighbors’ health, and strengthen democracy?
What if the brass ring of content dev, link opportunities, consumer sentiment and realtime local inventory is just waiting for you to grab it, on a ride we just haven’t taken yet, in a setting we’re just not talking about?
Let’s travel a different road today, one that parallels our industry’s typical conversation about citations, reviews, markup, and Google My Business. As a 15-year sailor on the Local SEO ship, I love all this stuff, but, like you, I’m experiencing a merging of online goals with offline realities, a heightened awareness of how in-store is where local business successes are born and bred, before they become mirrored on the web.
At Moz, our SaaS tools serve businesses of every kind: Digital, bricks-and-mortar, SABs, enterprises, mid-market agencies, big brands, and bootstrappers. But today, I’m going to go as small and as local as possible, speaking directly to independently-owned local businesses and their marketers about the buy local/shop local/go local movement and what I’ve learned about its potential to deliver meaningful and far-reaching successes. Frankly, I think you’ll be as amazed as I’ve been.
At the very least, I hope reading this article will inspire you to have a conversation with your local business clients about what this growing phenomenon could do for them and for their communities. Successful clients, after all, are the very best kind to have.
What is the Buy Local movement all about?
What’s the big idea?
You’re familiar with the concept of there being power in numbers. A single independent business lacks the resources and clout to determine the local decisions and policies that affect it. Should Walmart or Target be invited to set up shop in town? Should the crumbling building on Main St. be renovated or demolished? Which safety and cultural services should be supported with funding? The family running the small grocery store has little say, but if they join together with the folks running the bakery, the community credit union, the animal shelter, and the bookstore ... then they begin to have a stronger voice.
Who does this?
Buy Local programs formalize the process of independently-owned businesses joining together to educate their communities about the considerable benefits to nearly everyone of living in a thriving local economy. These efforts can be initiated by merchants, Chambers of Commerce, grassroots citizen groups, or others. They can be assisted and supported by non-profit organizations like the American Independent Business Alliance (AMIBA) and the Institute for Local Self-Reliance (ILSR).
What are the goals?
Through signage, educational events, media promotions, and other forms of marketing, most Buy Local campaigns share some or all of these goals:
Increase local wealth that recirculates within the community
Preserve local character
Build community
Create good jobs
Have a say in policy-making
Decrease environmental impacts
Support entrepreneurship
Improve diversity/variety
Compete with big businesses
Do Buy Local campaigns actually work?
Yes - research indicates that, if managed correctly, these programs yield a variety of benefits to both merchants and residents. Consider these findings:
1) Healthy YOY sales advantages
ILSR conducted a national survey of independent businesses to gauge YOY sales patterns. Respondents to the 2016 survey reported a good increase in sales across the board, but with a significant difference, which AMIBA sums up:
“Businesses in communities with a sustained grassroots “buy independent/buy local” campaign reported a strong 7.4% sales increase, nearly doubling the 4.2% gain for those in areas without such an alliance.”
2) Keeping spending local
The analysts at Civic Economics conducted surveys of 10 cities to gauge the local financial impacts of independents vs. chain retailers, yielding a series of graphics comparing the two.
While statistics vary from community to community, the overall pattern is one of significantly greater local recirculation of wealth in the independent vs. chain environment. These patterns can be put to good use by Buy Local campaigns with the goal of increasing community-sustaining wealth.
3) Keeping communities employed and safe
Few communities can safely afford the loss of jobs and tax revenue documented in a second Civic Economics study, which details the impacts of Americans’ Amazon habit, state by state and across the nation.
While the recent Supreme Court ruling allowing states to tax e-commerce models could improve some of these dire numbers, towns and cities with Buy Local alliances can speak plainly: Lack of tax revenue that leads to lack of funding for emergency services like fire departments is simply unsafe and unsustainable. A study done a few years back found that ⅔ of volunteer firefighters in the US report that their departments are underfunded, with 86% of these heroic workers having to dip into their own pockets to buy supplies to keep their stations going. As I jot these statistics down, there is a runaway 10,000-acre wildfire burning a couple of hours north of me…
Meanwhile, Inc.com is pointing out,
“According to the Bureau of Labor Statistics, since the end of the Great Recession, small businesses have created 62 percent of all net new private-sector jobs. Among those jobs, 66 percent were created by existing businesses, while 34 percent were generated through new establishments (adjusted for establishment closings and job losses)”.
When communities have Go Local-style business alliances, they are capitalizing on the ability to create jobs, increase sales, and build up tax revenue that could make a serious difference not just to local unemployment rates, but to local safety.
4) Shaping policy
In terms of empowering communities to shape policy, there are many anecdotes to choose from, but one of the most celebrated surrounds a landmark study conducted by the Austin Independent Business Alliance which documented community impacts of spending at the local book and music stores vs. a proposed Borders. Their findings were compelling enough to convince the city not to give a $2.1 million subsidy to the now-defunct corporation.
5) Improving the local environment
A single statistic here is incredibly eye-opening. According to the US Department of Transportation, shopping-related driving per household more than tripled between 1969 and 2009.
All you have to do is picture to yourself the centralized location of mainstreet businesses vs. big boxes on the outskirts of town to imagine how city planning has contributed to this stunning rise in time spent on the road. When residents can walk or bike to make daily purchases, the positive environmental impacts are obvious.
6) Improving residents’ health and well-being
A recent Cigna survey of 20,000 Americans found that nearly half of them always or sometimes feel lonely, lacking in significant face-to-face interactions with others. Why does this matter? Because the American Psychological Association finds that you have a 50% lower chance of dying prematurely if you have quality social interactions.
There’s a reason author Jan Karon’s “Mitford” series about life in a small town in North Carolina has been a string of NY Times Best Sellers; readers and reviewers continuously state that they yearn to live someplace like this fictitious community with the slogan “Mitford takes care of its own”. In the novels, the lives of residents, independent merchants, and “outsiders” interweave, in good times and bad, creating a support network many Americans envy.
This societal setup must be a winner, as well as a bestseller, because the Cambridge Journal of Regions published a paper in which they propose that the concentration of small businesses in a given community can be equated with levels of public health.
Beyond the theory that eating fresh and local is good for you, it turns out that knowing your farmer, your banker, your grocer could help you live longer.
7) Realizing big-picture goals
Speaking of memorable stories, this video from ILSR does a good job of detailing one view of the ultimate impacts independent business alliances can have on shaping community futures:
https://www.youtube.com/watch?time_continue=150&=&v=kDw4dZLSDXg
I interviewed author and AMIBA co-founder, Jeff Milchen, about the good things that can happen when independents join hands. He summed it up,
“The results really speak for themselves when you look at what the impact of public education for local alliances has been in terms of shifting culture. It’s a great investment for independent businesses to partner with other independents, to do things they can’t do individually. Forming these partnerships can help them compete with the online giants.”
Getting going with a Go Local campaign, the right way
If sharing some of the above with clients has made them receptive to further exploration of what involvement in an independent business alliance might do for them, here are the next steps to take:
First, find out if a Go Local/Shop Local/Buy Local/Stay Local campaign already exists in the business’ community. If so, the client can join up.
If not, contact AMIBA. The good folks there will know if other local business owners in the client’s community have already expressed interest in creating an alliance. They can help connect the interested parties up.
I highly, highly recommend reading through AMIBA’s nice, free primer covering just about everything you need to know about Go Local campaigns.
Encourage the client to publicize their intent to create an alliance if none exists in their community. Do an op ed in the local print news, put it on social media sites, talk to neighbors. This can prompt outreach from potential allies in the effort.
A given group can determine to go it alone, but it may be better to rely on the past experience of others who have already created successful campaigns. AMIBA offers a variety of paid community training modules, including expert speakers, workshops, and on-site consultations. Each community can write in to request a quote for a training plan that will work best for them. The organization also offers a wealth of free educational materials on their website.
According to AMIBA’s Jeff Milchen, a typical Buy Local campaign takes about 3-4 months to get going.
It’s important to know that Go Local campaigns can fail, due to poor execution. Here is a roundup of practices all alliances should focus on to avoid the most common pitfalls:
Codify the definition of a “local” business as being independently-owned-and-run, or else big chain inclusion will anger some members and cause them to leave.
Emphasize all forms of local patronage; campaigns that stick too closely to words like “buy” or “shop” overlook the small banks, service area businesses, and other models that are an integral part of the independent local economy.
Ensure diversity in leadership; an alliance that fails to reflect the resources of age, race, gender/identity, political views, economics and other factors may wind up perishing from narrow viewpoints. On a related note, AMIBA has been particularly active in advocating for business communities to rid themselves of bigotry. Strong communities welcome everyone.
Do the math of what success looks like; education is a major contributing factor to forging a strong alliance, based on projected numbers of what campaigns can yield in concrete benefits for both merchants and residents.
Differentiate inventory and offerings so that independently-owned businesses offer something of added value which patrons can’t easily replicate online; this could be specialty local products, face-to-face time with expert staff, or other benefits.
Take the high road in inspiring the community to increase local spending; campaigns should not rely on vilifying big and online businesses or asking for patronage out of pity. In other words, guilt-tripping locals because they do some of their shopping at Walmart or Amazon isn’t a good strategy. Even a 10% shift towards local spending can have positive impacts for a community!
Clearly assess community resources; not every town, city, or district hosts the necessary mix of independent businesses to create a strong campaign. For example, approximately 2.2% of the US population live in “food deserts”, many miles from a grocery store. These areas may lack other local businesses, as well, and their communities may need to create grassroots campaigns surrounding neighborhood gardens, mobile markets, private investors and other creative solutions.
In sum, success significantly depends on having clear definitions, clear goals, diverse participants and a proud identity as independents, devoid of shaming tactics.
Circling back to the Web — our native heath!
So, let’s say that your incoming client is now participating in a Buy Local program. Awesome! Now, where do we go from here?
In speaking with Jeff Milchen, I asked what he has seen in terms of digital marketing being used to promote the businesses involved in Buy Local campaigns. He said that, while some alliances have workshops, it’s a work in progress and something he hopes to see grow in the future.
As a Local SEO, that future is now for you and your fortunate clients. Here are some ways I see this working out beautifully:
Basic data distribution and consistency
Small local businesses can sometimes be unaware of inconsistent or absent local business listings, because the owners are just so busy. The quickest way I know to demo this scenario is to plug the company name and zip into the free Moz Check Listing tool to show them how they’re doing on the majors. Correct data errors and fill in the blanks, either manually, or, using affordable software like Moz Local. You’ll also want to be sure the client has a presence on any geo or industry-specific directories and platforms. It’s something your agency can really help with!
A hyperlocalized content powerhouse
Build proud content around the company’s involvement in the Buy Local program.
Write about all of the economic, environmental, and societal benefits residents can support by patronizing the business.
Motivated independents take time to know their customers. There are stories in this. Write about the customers and their needs. I’ve even seen independent restaurants naming menu items after beloved patrons. Get personal. Build community.
Don’t forget that even small towns can be powerful points of interest for tourists. Create a warm welcome for travelers, and for new neighbors, too!
Link building opportunities of a lifetime
Local business alliances form strong B2B bonds.
Find relationships with related businesses that can sprout links. For example, the caterer knows the wedding cake baker, who knows the professional seamstress, who knows the minister, who knows the DJ, who knows the florist.
Dive deep into opportunities for sponsoring local organizations, teams and events, hosting and participating in workshops and conferences, offering scholarships and special deals.
Make fast friends with local media. Be newsworthy.
A wellspring of sentiment
Independents form strong business-to-community bonds.
When a business really knows its customers, asking for online reviews is so much easier. In some communities, it may be necessary to teach customers how to leave reviews, but once you get a strategy going for this, the rest is gravy.
It’s also a natural fit for asking for written and video testimonials to be published on the company website.
Don’t forget the power of Word of Mouth Marketing, while you’re at it. Loyal patrons are an incredible asset.
The one drawback could be if your business model is one of a sensitive nature. Tight-knit communities can be ones in which residents may be more desirous of protecting their privacy.
Digitize inventory easily
30% of consumers say they’d buy from a local store instead of online if they knew the store was nearby (Google). Over half of consumers prefer to shop in-store to interact with products (Local Search Association). Over 63% of consumers would rather buy from a company they consider to be authentic over the competition (Bright Local).
It all adds up to the need for highly-authentic independently-owned businesses to have an online presence that signals to Internet users that they stock desired products. For many small, local brands, going full e-commerce on their website is simply too big of an implementation and management task. It’s a problem that’s dogged this particular business sector for years. And it’s why I got excited when the folks at AMIBA told me to check out Pointy.
Pointy offers a physical device that small business owners can attach to their barcode scanner to have their products ported to a Pointy-controlled webpage. But, that’s not all. Pointy integrates with the “See What’s In Store” inventory function of Google My Business Knowledge Panels. Check out Talbot’s Toyland in San Mateo, CA for a live example.
Pointy is a startup, but one that is exciting enough to have received angel investing from the founder of WordPress and the co-founder of Google Maps. Looks like a real winner to me, and it could provide a genuine answer for brick-and-mortar independents who have found their sales staggering in the wake of Amazon and other big digital brands.
Local SEOs have an important part to play
Satisfaction in work is a thing to be cherished. If the independent business movement speaks to you, bringing your local search marketing skills to these alliances and small brands could make more of your work days really good days.
The scenario could be an especially good fit for agencies that have specialized in city or state marketing. For example, one of our Moz Community members confines his projects to South Carolina. Imagine him taking it on the road a bit, hosting and attending workshops for towns across the state that are ready to revitalize main street. An energetic client roster could certainly result if someone like him could show local banks, grocery stores, retail shops and restaurants how to use the power of the local web!
Reading America
Our industry is living and working in complex times.
The bad news is, a current Bush-Biden poll finds that 8/10 US residents are “somewhat” or “very” concerned about the state of democracy in our nation.
The not-so-bad news is that citizen ingenuity for discovering solutions and opportunities is still going strong. We need only look as far as the runaway success of the TV show “Fixer Upper”, which drew 5.21 million viewers in its fourth season as the second-largest telecast of Q2 of that year. The show surrounded the revitalization of dilapidated homes and businesses in and around Waco, Texas, and has turned the entire town into a major tourist destination, pulling in millions of annual visitors and landing book deals, a magazine, and the Magnolia Home furnishing line for its entrepreneurial hosts.
While not every town can (or would want to) experience what is being called the “Magnolia effect”, channels like HGTV and the DIY network are heavily capitalizing on the rebirth of American communities, and private citizens are taking matters into their own hands.
There’s the family who moved from Washington D.C. to Water Valley, Mississippi, bought part of the decaying main street and began to refurbish it. I found the video story of this completely riveting, and look at the Yelp reviews of the amazing grocery store and lunch counter these folks are operating now. The market carries local products, including hoop cheese and milk from the first dairy anyone had opened in 50 years in the state.
There are the half-dozen millennials who are helping turn New Providence, Iowa into a place young families can live and work again. There’s Corning, NY, Greensburg, KS, Colorado Springs, CO, and so many more places where people are eagerly looking to strengthen community sufficiency and sustainability.
Some marketing firms are visionary forerunners in this phenomenon, like Deluxe, which has sponsored the Small Business Revolution show, doing mainstreet makeovers that are bringing towns back to life. There could be a place out there somewhere on the map of the country, just waiting for your agency to fill it.
The best news is that change is possible. A recent study in Science magazine states that the tipping point for a minority group to change a majority viewpoint is 25% of the population. This is welcome news at a time when 80% of citizens are feeling doubtful about the state of our democracy. There are 28 million small businesses in the United States - an astonishing potential educational force - if communities can be taught what a vote with their dollar can do in terms of giving them a voice. As Jeff Milchen told me:
“One of the most inspiring things is when we see local organizations helping residents to be more engaged in the future of their community. Most communities feel somewhat powerless. When you see towns realize they have the ability to shift public policy to support their own community, that’s empowering.”
Sometimes, the extremes of our industry can make our society and our democracy hard to read. On the one hand, the largest brands developing AI, checkout-less shopping, driverless cars, same-day delivery via robotics, and the gig economy win applause at conferences.
On the other hand, the public is increasingly hearing the stories of employees at these same companies who are protesting Microsoft developing face recognition for ICE, Google’s development of AI drone footage analysis for the Pentagon, working conditions at Amazon warehouses that allegedly preclude bathroom breaks and have put people in the hospital, and the various outcomes of the “Walmart Effect”.
The Buy Local movement is poised in time at this interesting moment, in which our democracy gets to choose. Gigs or unions? Know your robot or know your farmer? Convenience or compassion? Is it either/or? Can it be both?
Both big and small brands have a major role to play in answering these timely questions and shaping the ethics of our economy. Big brands, after all, have tremendous resources for raising the bar for ethical business practices. Your agency likely wants to serve both types of clients, but it’s all to the good if all business sectors remember that the real choosers are the “consumers”, the everyday folks voting with their dollars.
I know that it can be hard to find good news sometimes. But I’m hoping what you’ve read today gifts you with a feeling of optimism that you can take to the office, take to your independently-owned local business clients, and maybe even help take to their communities. Spark a conversation today and you may stumble upon a meaningful competitive advantage for your agency and its most local customers.
Every year, local SEOs are delving deeper and deeper into the offline realities of the brands they serve, large and small. We’re learning so much, together. It’s sometimes a heartbreaker, but always an honor, being part of this local journey.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from The Moz Blog https://ift.tt/2JucgDC via IFTTT
1 note · View note