Google Ranking Factors That Matter in 2018
The most important goal of search engine optimization (SEO) is, obviously, rankings. Of late, digital marketers have been trying to downplay the importance of rankings in favor of more branded signals to Google’s search algorithms, in an attempt to conform to the search behemoth’s quality guidelines.
However, your site’s rankings for your chosen keywords will always be important as long as search results are displayed in order of 1, 2, 3… No surprise, then, that in-house SEO experts and agency marketers alike are always on the lookout for every little technical tweak that can take them one step closer to the elusive No. 1 position on Google.
SEMrush, the SEO and digital marketing tool suite (and the de facto industry leader), has conducted one of the most in-depth studies ever into the ranking factors that take a website to the top of Google.
Their machine-learning-based analysis of 600,000 search results from 100 million real internet users, searching across Google’s Indian, US, UK and other versions, throws up some interesting facts about how Google ranks websites!
For a high-level analysis, read on.
Here’s a graphic that shows the 17 most crucial signals for ranking on Google:
Image Credits: https://www.semrush.com/ranking-factors/
There are some clear takeaways that emerge from the study…
Direct website traffic tops all other factors.
This is the single most significant finding of this study. Direct traffic is when the user comes to your website straight from the browser (by entering your URL in the address bar or clicking a bookmark) or a saved document (such as PDF), instead of clicking through from the search results or social media shares.
Numerous other studies on ranking factors have neglected to consider this critical factor, because of the possibility of correlation – i.e. Google ranks the best sites, but naturally, the best sites have more traffic.
However, the SEMrush study highlights Google’s increasing focus on the “brand authority” of a website. When direct traffic to a site increases, it is a strong indication that the brand is growing stronger. This means businesses need to redouble their PR efforts to build brand awareness and brand affinity for their products, inducing customers to proactively search for their brand terms.
You may also like: How to Grow Traffic When You Know Nothing About SEO
Eventually, Google will pick up on these “entity searches” and pass on the credit to your website.
User experience is paramount to success.
Google has repeatedly advised digital marketers to chase users instead of the algorithm. And they plainly mean it. Direct traffic is followed by three metrics closely related to user behavior on the ranking factors chart:
Time on site: The amount of time the average user spends on the site per visit
Pages per session: The number of pages viewed by a user per visit
Bounce rate: The percentage of visitors who leave your site after viewing a single page instead of sticking around for more
Taken together, these three metrics tell Google whether or not users really find what they’re looking for on your website, and how engaging it is to your audience. In marketing parlance, they reveal the “relevance” of your content to the searcher’s “intent.”
These user-specific ranking factors underscore the value of your content. Google has time and again emphasized the importance of creating quality content through their webmaster guidelines and hangouts. If your site delivers what users are looking for, Google is sure to reward you for it.
Links do matter.
Backlinks are the building blocks of Google’s original PageRank algorithm, which still forms the core of its ranking system. This is reflected in the fact that elements 5 through 8 on the chart are link-specific metrics. The quantity and quality of backlinks, the variety of domains they come from, as well as the spread of their IP addresses, all play a role in pushing up your rankings.
While Google discourages link building solely for SEO purposes, the fact remains that great content garners the maximum amount of links, along with closely relevant anchor text. This, in turn, plays a big role in enhancing brand authority, eventually bringing more referral as well as direct traffic (the biggie, remember?) to the website.
Ultimately, it’s important not to focus narrowly on any single aspect of link building, such as IPs or “followed” links. Marketers should focus on developing a diverse backlink profile, built on the back of multi-channel PR and content marketing.
You can start small with low-volume keywords, and eventually move on to targeting the more competitive ones, as your content base increases and your existing links start to bring in traffic.
Google is on a mission to make the web safer with HTTPS.
Google is going all out to persuade webmasters across the world to move to HTTPS in their quest to make the web a secure place. HTTPS maintains the integrity of your website data, prevents intruders from tapping into your communication with users, and protects their privacy.
Google is doing everything it can to coax website owners into adopting HTTPS – and they’ve found a simple, but effective way: ranking HTTPS sites above their HTTP counterparts, all else being equal.
If you’re still biding your time, the time to make the switch has come. The higher the volume of keywords you’re targeting and the bigger your website, the more important HTTPS adoption becomes. Without a secure site, you’ll soon be relegated to the sidelines and watch your rivals pass you by.
Keywords are losing their shine.
SEO practitioners have been terribly obsessed with keywords for a long, long time. This has frequently led them to cross the line into spamming, by stuffing HTML tags and content with keywords, giving the SEO industry a bad name and lessening its importance.
With updates like Hummingbird and RankBrain, Google has got its act together when it comes to semantics and understanding content. Peppering your headlines, copy and meta tags with keywords will no longer give you even a slight advantage.
SEMrush found that more than 35% of sites that ranked for high-volume keywords didn’t even have the keyword in the title. Talk about meeting users’ intent by speaking their language!
Over to You
SEO forms the core of your digital marketing efforts. When it comes to building an effective SEO strategy that works for your business, every little bit helps. Engaging content, brand authority, links from the right places and a secure website, all play their part. Analyzing Google’s algorithm is a complex and demanding process, and it pays to stay updated on the factors that influence rankings. Good luck!
How to Grow Traffic When You Know Nothing About SEO
People who are just getting started with a new internet business often get the feeling that search engine optimization is too technical and that the competition in the SERPs is enormous, and conclude that SEO generally isn’t worth the effort.
While it’s true that there are some niches where it’s hard to succeed, that’s not the case for many other industries. Quite often, when you look at a particular vertical, you’ll find that it’s not as saturated as you think, and competitors have a very limited understanding of SEO. So you only have to get the basics right to have immediate success.
Part of the reason is that many business owners don’t have enough time to spend on creating and implementing a good SEO strategy. So they decide to hire someone to do it for them. The inherent risk with this course of action is that when you don’t understand the basics of the work that needs to be done, you can easily get ripped off.
So, no matter whether you’re just getting started with DIY SEO or planning on hiring someone to do it for you, this guide will give you everything you need to know to succeed with generating organic traffic for your website.
We’ll go through the following vital SEO basics:
How to figure out what your customers are searching for
How to optimize your pages so they rank for your keywords
How to make sure your website is accessible and convenient to both search engines and human visitors
And finally, how to get other websites to link to your site
Keep reading to learn the best practices for each of these aspects of SEO and how to apply them to your website.
Step 1: Figure out what your target customers are searching for
While many people think of SEO as an inherently technical discipline — no matter whether you’re researching keywords, optimizing your site, or analyzing competitors to understand what they’re doing — the reality is that SEO is predominantly about understanding who your customers are and what they care about.
That doesn’t necessarily need to be done using any SEO tactics; in fact, the best way to get started is by talking to your clients and listening to their feedback. Here are some of the obvious places where you can start:
Comments on your website
Emails from customers
Your phone/chatline
Events where you meet with customers
Keep an eye not just on what they’re saying, but also what language they’re using. Start building a list of topics your audience is interested in and the terms they use.
To get a better idea of how that would look in practice, we’ll use an example.
Let’s say we’re working on SEO for a new hotel/bed&breakfast in Dublin, Ireland. We don’t have customers yet, so we use common sense to come up with several ways people may search online to find us:
‘hotels in Dublin’
‘hotel in Dublin’
‘place to stay in Dublin’
‘accommodation Dublin’
There are quite a few terms we can come up with, and while that’s always helpful, your primary concern at this point is not to create a super-extensive list.
Get a general idea of how people discuss your topic
My next step is to pick one of the terms I came up with and run it through Keywords Explorer in Ahrefs. Ahrefs gives me a whole bunch of useful information on that keyword: I see that ‘hotel in Dublin’ gets a good amount of searches in the US, it’s popular in Ireland and the UK, and (most importantly) it is part of a larger topic, ‘Dublin hotels’, that gets a lot more searches.
I also get numerous suggestions for related keywords that Ahrefs presents using several keyword generation methods.
In just a few minutes with Ahrefs’ Keywords Explorer, I’ve managed to identify a high-volume keyword that our potential customers can use to find our website. Besides, I’ve got a whole bunch of other keywords that can generate organic traffic to the website.
A free tool like Google Trends can also provide insight into how people are searching around a particular topic, but that information will be nowhere near what an advanced tool will give you. Still, our guide on using Google Trends for keyword research will definitely come in handy.
Check out the search suggestions (aka autocomplete)
Start typing a query in the search box, but don’t hit Enter. Google will immediately suggest some additional search terms that people have used. You can repeat this by typing each letter of the alphabet after your query. Ahrefs will save hours of your time on this kind of research: with Keywords Explorer you’ll get an extensive list based on Google’s search suggestions, and our keyword tool will automatically populate it with valuable metrics such as the number of searches each term gets as well as its ranking difficulty estimation (we’ll go into more detail on this in a minute).
Get some hints from the related searches
Finally, look at the related searches section, which you will find at the bottom of the search results page. This is an important place to look because it will give you some extremely relevant keywords you should target. For example, when we look at the related searches for ‘Dublin hotels’, we can see that people search for things like ‘cheap hotels in Dublin city center’ and ‘bed and breakfast dublin ireland’, which weren’t in our original list.
Based on the type of hotel we’re working on and the type of customers it will be targeting, some of these can be a great way to reach our intended audience.
How to expand your keyword list
Search suggestions and related searches will give you a pretty limited number of keyword ideas. This is the part where you can really benefit from using one of the premium keyword research tools on the market.
Ahrefs is built on a database of over 5 billion (yes, with a B) keywords. A simple check with Keywords Explorer using some of the terms I found during my initial research and brainstorming gives me over 4k(!) suggestions to work with. When you have access to this calibre of data-backed insight, coming up with keyword ideas is no longer a challenge; you just have to choose which opportunities to focus on and pursue.
Understand how your target customers are talking about your topic
Google search results can give you a lot to get started with SEO, but it’s in no way enough.
To be successful with SEO, you need to understand how people are talking about the niche you’re operating in, what problems they’re facing, what language they’re using, and so on. Use every opportunity to talk to customers and take notice of the language they’re using.
Doing it in person is great, but also very time-consuming. So here are some places where you can find the words people use while talking about your topic of interest:
Ahrefs Content Explorer
Content Explorer provides one of the quickest and most reliable methods to understand what topics are the most popular and engaging in a certain field.
Here’s what I got when I searched for ‘Ireland travel’. From the results, I can see that travel guides are very popular. That gives me the idea that we can publish a travel guide on the site and attract visitors and social shares through it.
Social Media, Forums, and Communities
Websites with a lot of user-generated content are a great place to see how people are talking about your chosen topic. Some of the places you can visit to perform this kind of analysis include:
Forums: there are quite a few communities where people discuss traveling. A simple search led me to the TripAdvisor Dublin Travel Forum for example.
Quora/Reddit: Quora is probably not the first place that will come to mind when you’re looking for travel advice, but you’d be surprised by the amount of information you can find there even on this topic. When it comes to Reddit, the old adage that there’s a subreddit for everything holds true.
Facebook/LinkedIn Groups: Make sure you’re looking in the most relevant place — I won’t be using LinkedIn for my research, but there are plenty of groups on Facebook where I can find a lot of information.
We have a really extensive resource on how to use these communities to generate keyword ideas.
Literally anywhere else
Any website or social network with a large number of visitors and user-generated content can serve as a source of inspiration and keyword ideas.
Podcasts are hugely popular, which means the people who produce them spend considerable time researching topics. On Amazon, you can search for your field of interest and see what books and products are selling well — this serves as validation that people are interested in this particular topic. Neil Patel has a great idea on how to get even more information out of this: check out the table of contents of a book, and especially the chapter titles, for more keyword ideas.
Combining all these methods should give you an extensive list of search terms (or keywords) that you can target with the content on your website.
However, some keywords (especially in a field that’s as competitive as travel) get a lot of searches but are very hard to rank well for. Others would be easy to rank for but hardly get any searches. Understanding which keywords to target with your pages and your content is the essential step in achieving SEO success.
Understand the metrics
In keyword research, there are two metrics of the utmost importance:
search volume (representing how many times people search for a certain term) and
keyword difficulty (showing how easy or hard it would be to appear on the first page of search results for that term).
I see way too many SEO experts who still proclaim Google Keyword Planner as a viable tool for predicting keyword traffic. The reality is — it’s NOT a dependable place for this kind of data.
That’s why it’s important to use an advanced tool, which can give you reliable data on keyword search volumes. The same applies to keyword difficulty: Keyword Planner uses a metric aimed at paid advertising, i.e. how much competition there is for the paid positions for a given term. Keyword Difficulty in Ahrefs makes it extremely simple to understand your chances of ranking in the top 10 search results by estimating how many domains you need linking to a page with your content (more on the topic of backlinking further in this post).
Understand the intent of each keyword you’re targeting
When you start delving deeper into SEO and keyword research, you’ll realize there are 3 types of searches people make:
Navigational: They’re looking for a specific website, e.g. ‘Dublin airport’
Informational: They’re looking to learn more on a specific topic, e.g. ‘things to do in Dublin’
Transactional: They’re looking to purchase a specific product/service, e.g. ‘book hotel in Dublin’
Naturally, searches with a high level of commercial interest are more valuable from a business point of view, since the people doing them are much closer to the point of purchase and thus more likely to spend money if they land on your site.
There’s a difference in the intent even between transactional terms. Here’s an example. Which search query is closer to conversion: ‘last minute hotel deals Dublin’ or ‘Dublin hotels’?
To understand the intent behind the search term, make sure you thoroughly review what pages rank for it. Since Google’s main preoccupation is to satisfy its users, the algorithm keeps a close eye on the behavior of people searching for a particular term and tries to offer results that will satisfy the intent of their search. Thus, a search for ‘Dublin guide’ will show a bunch of informational results from Lonely Planet and TripAdvisor. But once I type ‘Dublin accommodation’ in the search box, it’s a whole different story — the results page is dominated by offers for hotels and b&b’s.
Learn professional keyword research
Keyword research is a huge area of SEO and it can be very intimidating to someone who’s just getting started. However, understanding the basic concepts and metrics is a must for every website owner who wants to drive organic traffic.
To dive deeper into professional keyword research, check out our extensive guide on the topic.
After you figure out the keyword(s) you want to rank for and what searchers are expecting to see when they type them into Google, you also need to designate what page on your site is going to rank for each keyword. Create a map (in a spreadsheet or another file) that ties each keyword in your plan to a page on your site.
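For illustration, such a map can be as simple as two columns; the keywords come from the research above, while the target URLs below are purely hypothetical:
Keyword: dublin travel guide -> Page: /dublin-guide
Keyword: dublin sights -> Page: /dublin-sights
Keyword: dublin restaurants -> Page: /dublin-restaurants
Keyword: last minute hotel deals dublin -> Page: /deals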
Next, you have to make sure this page is structured in a way that search engines understand what topic (i.e. search term) it is targeted at.
Step 2: Creating pages optimized for search
Keyword research is just the first step towards attracting search traffic to your website.
However, you also need to make sure your pages are structured well in order to rank for the keywords you selected and satisfy those who’re searching.
Perform basic on-page optimization
On-page optimization is the next essential step in your basic SEO strategy. Even if you find the most profitable keywords and have the best content for them, your effort would be wasted if your pages are not optimized for search engines.
Before I get into the details of on-page optimization, let’s clear up what tools you need to perform it. There are many ways to implement the features I discuss in the following sections, but if you’re using WordPress for your website, my recommendation is to go with the Yoast SEO plugin. It’s free and simple to use, and it’s perfect for someone who’s just starting with on-page SEO.
And now let’s get to the nitty-gritty. Setting up your pages for success with search engines comes down to optimizing the following elements:
Content
I’m putting it first because it’s the most important factor for the success of your pages. SEO is becoming extremely competitive and there’s no way to succeed with it (especially if you’re popularizing a new website) without producing extremely high-quality content.
Many people wrongly think that good content = long content. There is an equation that works, but it goes like this:
Good content = content that is useful to the person consuming it.
To put it in more concrete terms, if you want to rank a travel guide on Dublin, what is going to be more useful to your readers — 2000 words explaining the main areas of the city or a map that quickly shows them the most popular attractions and the best areas to go to for food, drink, and entertainment? Word count does not even start to cover what matters when we’re talking about quality.
URL
The web address of your page sends a rather strong signal to search engines about its topic. It’s important that you get it right the first time because you should avoid changing it.
Try to make the URL as short as possible and include the main keyword you want that page to rank for. For example, I would put my Dublin travel guide under domain.com/dublin-guide.
Remember that Google recommends keeping URLs simple.
Meta properties
Web pages have two specific features that search engines use when building up search results:
Title
Contrary to what the name suggests, the meta title tag does not appear anywhere on your page. It sets the name of the browser tab displaying your page and is used by Google and other search engines when the page is featured in search results.
The title tag is a great opportunity to write a headline that both
a) includes the keyword you want this page to rank for and b) is compelling enough to make searchers click on it and visit your site.
Because mobile is becoming increasingly important and there are many different devices we use to browse and search the internet, there isn’t a hard rule on how long titles should be anymore. The most recent research suggests you should still aim to keep yours under 60 characters to make sure they’re fully displayed in search results.
Description
Although Google won’t always show your page description in a search result snippet, it will quite often. So don’t forget to include the keywords you want to rank for in the description section. Notice how Google highlights the search term in each result in the screenshot below. Make sure your descriptions are under 135 characters long and that they compel searchers to follow the link to your website.
You can learn more about writing meta descriptions from this guide.
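To make this more concrete, here is a minimal sketch of how the title and description of the hypothetical Dublin guide page could look in its HTML head (the wording is purely illustrative, not a template):
<head>
  <!-- Shown in the browser tab and used as the headline in search results -->
  <title>Dublin Travel Guide: Sights, Hotels, Food and Drink</title>
  <!-- Often shown as the snippet under the headline in search results -->
  <meta name="description" content="Planning a trip to Dublin? This guide covers the best sights, hotels, restaurants and bars for first-time visitors.">
</head>
Note that the example title stays under 60 characters and the description under 135, in line with the advice above.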
Headers and Subheaders
Use the standard HTML format for headers (H1 to H6) to make it easy for search engines to understand the structure of your page and the importance of each section.
Header 1 should be reserved for the on-page title of your content and should include the main keyword that page is targeting. Make sure you only have one H1 header per page.
Header 2 can be used for the titles of the main sections on your page. They should also include the main keyword you’re targeting (whenever possible) and are also a good place to include additional (longer-tail) keywords you want to rank for with this piece of content.
Every time you’re going a step further in sections, just use the next type of header, e.g. Header 3 for subheadings within an H2 section, and so on. Here’s what a well-structured piece of content should look like when headers are used appropriately:
H1: The Complete First-Time Traveller’s Guide to Dublin
H2: Sights & Attractions
H3: Trinity College
H3: The Guinness Storehouse
H3: The Temple Bar Area
H2: Accommodation
H3: Hotel 1
H3: Hotel 2
H3: Hotel 3
H2: Restaurants
H3: Upscale restaurant
H3: Gastropub
H3: Another hip place
H2: Bars
H3: Bar with live music
H3: Bar with great cocktails
H3: Very touristy bar
Following a clear and exhaustive structure not only makes it easy for search engines to categorize your content, but also helps human readers make sense of your text. They will reward you by spending more time on your page and coming back to it when they want to learn more about the topic.
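In HTML terms, the top of that outline would be marked up roughly like this (a sketch only; the exact wording is up to you):
<h1>The Complete First-Time Traveller's Guide to Dublin</h1>
<h2>Sights &amp; Attractions</h2>
<h3>Trinity College</h3>
<h3>The Guinness Storehouse</h3>
<h2>Accommodation</h2>
<h3>Hotel 1</h3>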
Internal linking
Strategically linking between the various pages on your website is a great way to improve the speed at which search engines crawl it and to signal which of your pages are the most important.
For example, you can use the hub-and-spoke strategy to rank for highly competitive keywords:
In our example, we can create a page that targets ‘Dublin guide’ and have it link to separate pages that cover ‘Dublin sights’, ‘Dublin restaurants’, and so on.
Images
While visual material is a great aid for humans, search engine crawlers are not so great (yet) at making sense of images. To help them, you should use the alt tag to explain what each image is about (and ideally include the keyword you’re targeting with this particular piece of content).
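A manually added image tag with alt text might look like the sketch below (the file name and wording are hypothetical):
<!-- The alt text describes the image for search engine crawlers and screen readers -->
<img src="/images/temple-bar-dublin.jpg" alt="The Temple Bar pub in Dublin at night">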
In WordPress, this can be done easily by using the Alternative Text field in the image editor; if you’re not using WordPress, you can add the tag manually, as in the sketch above.
Apart from optimizing each piece of content you publish, you also have to make sure the basic setup of your website is done in a way that doesn’t hurt your chances to rank.
Step 3: Making sure your website is well accessible to search engines and visitors
One of the important things to keep in mind when doing SEO is that you’re essentially working for two separate customers — your human readers and the bots search engines use to index your website.
While Google and other search engines have been making strides in developing a human-like understanding for their crawlers, many differences still exist between the two. Therefore, your goal should be to create a positive experience for both human and robot visitors on your website.
If you’re anything like me (i.e. very non-technical), understanding how to work with human customers is the easy(-ish) part. It’s the robots that I find more challenging to appease. That’s why, while it’s definitely important to have a strong understanding of the features we discuss in the following paragraphs, I would strongly encourage you to delegate them to a professional (preferably a developer) who knows how to implement them for your website.
These are the technical elements you need to keep an eye on.
Optimize your website’s loading speed
Both humans and search engine bots prioritize the loading speed of websites. Studies suggest that up to 40% of people leave websites that take longer than 3 seconds to load.
Using Google’s PageSpeed Insights tool or GTMetrix not only can help you find out how quickly your site loads, but will also give you actionable advice on how to further improve the speed of your pages:
Create a sitemap and a robots.txt file
These two items are aimed solely at helping search engines make sense of your website.
Sitemaps
A sitemap is a file published in a special format called XML, which allows search engines to find all the pages that exist on your website and understand how they’re connected (i.e. see the overall structure of your website).
Sitemaps do not affect rankings directly, but they allow search engines to find and index new pages on your website faster.
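For reference, a bare-bones XML sitemap could look like the sketch below (the domain, URLs and dates are made up for illustration):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/dublin-guide</loc>
    <lastmod>2018-06-01</lastmod>
  </url>
</urlset>
Many CMS platforms and SEO plugins (Yoast SEO, mentioned above, among them) can generate this file for you automatically.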
Robots.txt
While the sitemap lays out the full structure of your website, the Robots.txt file gives specific instructions to the search engine crawlers on which parts of the website they should and shouldn’t index.
It is important to have a Robots.txt file for your website because search engines allocate a crawl budget to their bots — a number of pages they’re allowed to crawl with each visit.
All major search engine crawlers and other “good” bots recognize and obey the Robots.txt format.
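A very simple robots.txt might look like the sketch below: it lets every crawler in, keeps them out of a (hypothetical) admin area, and points them to the sitemap.
User-agent: *
Disallow: /wp-admin/
Sitemap: https://www.example.com/sitemap.xml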
Website architecture matters
The structure of the data on your website also plays a major role in successful SEO performance.
Google takes into account factors such as how long people stay on your website, how many pages they view per visit, how high the bounce rate (single page visits) is, etc. when making decisions on how high to rank your website in search results. Therefore, having a clear structure and navigation not only helps visitors find their way around with ease but will also signal to Google that your site is worth ranking.
In general, setting these features in the right way involves a fair amount of technical knowledge, so you might want to consider hiring a professional to do them for you.
Step 4: Building backlinks from other websites
As a website owner, you’ve got full control of the keywords to target, on-page optimization, and site structure.
However, there’s a part of SEO, which you can’t manage directly, but is essential for your performance — how many websites link back to your page.
Backlinks are among the most important ranking factors
There’s clear evidence showing that external links are one of the factors with the strongest influence on Google’s ranking algorithm. Our study of 2 million keywords discovered that the number and authority of the pages and domains linking back to a website are the strongest predictor of ranking success.
Even Google has admitted that backlinks are among the top 3 factors that affect how SERPs are built.
Here are the most important things you need to know about building backlinks.
Not all links are created equal
In the past, many SEO experts were heavily focused on getting as many links as possible without considering whether the pages were logically related. This gave rise to various forms of abuse such as link farming, buying links, etc.
Over the last few years Google has introduced a number of changes to penalize such practices, so now the quality of the backlinks is much more important than the quantity. Link quality comes down to a number of factors:
Authority of the page and the website
The authority of the site and the page that link to your website has an effect on how valuable that link is. Getting a mention and a link from TripAdvisor’s blog would count for more than a review on a small travel blog. That is because search engines know that TripAdvisor is an authoritative website on travel since there are thousands of other websites that link to it.
In the same way, the page where your link is published also carries its own authority. For example, getting a link from TripAdvisor’s main page for Dublin — which gets thousands of links and visitors itself — is better than having tons of mentions of your hotel on page 66 in forum threads.
At Ahrefs, we use the Domain Rating and URL Rating metrics to help users understand how valuable each backlink opportunity is. You can run any page through Site Explorer to review its stats. Here’s what we get for TripAdvisor’s main page on Dublin:
Dofollow vs. Nofollow links
When placing a link on your website, you can instruct search engine bots whether they should treat it as an endorsement of the page you’re linking to or not. That’s controlled by the optional rel=nofollow attribute on the a tag:
<a href="http://www.example.com/" rel="nofollow">Link text</a>
By default, a hyperlink is “dofollow” (i.e. you do not need to add rel=dofollow to your links as this attribute doesn’t even exist). This signals that you endorse the link and want to pass “link juice” to it. Adding the nofollow attribute instructs search engine crawlers that they shouldn’t follow the link.
Websites sometimes use the rel=nofollow tag as a way to prevent abuse. For example, you’ll often find pages that automatically add rel=nofollow to links placed in the comments section of the website.
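To see the two forms side by side (a sketch using the same placeholder URL):
<!-- Default: treated as an endorsement and passes "link juice" -->
<a href="http://www.example.com/">Link text</a>
<!-- Nofollow: crawlers are asked not to count it as an endorsement -->
<a href="http://www.example.com/" rel="nofollow">Link text</a>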
Obviously, you’d always want to get a dofollow link for your website, when you’re doing link building.
How to build links
Link building is critical for the success of your SEO strategy, so if you’re prepared to spend resources (time, money, etc.) on producing content, you should also be prepared to commit at least as much time promoting and generating links to your content.
There are many tactics you can use to get other websites to link back to your page. Some are more legitimate than others. However, before you start cherry picking “link building hacks” to try, take the time to review and analyze your competition.
Each niche is different and tactics that work great in one might not be so effective in yours. Therefore, the best thing you can do is to analyze how your competitors are building links and look for patterns.
When you’re done with this, consider some of the following popular tactics.
Identifying opportunities + Email outreach
The first step in your approach should be to find authoritative websites that would be a good fit for linking back to your content (essentially, a combination of high authority and relevance).
Let’s say we decide to go ahead with the idea to publish a Dublin travel guide. We produce a high-quality piece of content and want to build links to it. First, we’ll check out what pages already rank for the keyword we’re going to target. Here’s a page offering a bit of information, but I would not call it spectacular. We can use Ahrefs to check what websites link to this page. As I check the results, I quickly find a good candidate I’d love to get a backlink from. The name and email of the website owner are available in plain sight, so now I can reach out to them, using a good email script, and suggest that they check out our high-quality piece of content.
Guest blogging
Many people think guest blogging is dead because this year Google published a “warning” about guest posting for link building.
But read it carefully:
“Google does not discourage these types of articles in the cases when they inform users, educate another site’s audience or bring awareness to your cause or company. However, what does violate Google’s guidelines on link schemes is when the main intent is to build links in a large-scale way back to the author’s site.”
And one of the violations is:
“Using or hiring article writers that aren’t knowledgeable about the topics they’re writing on.”
So as long as your guest posts are helpful, informative and high-quality, you don’t have to worry.
Guest blogging works for link building when 3 basic rules are followed:
You publish on an authoritative website with a large relevant audience.
You create a high-quality piece of content, which is helpful to the audience of the website where it will be published.
You add link(s) to relevant resources on your website that would further help the audience expand their knowledge of the topic.
When these 3 are combined, guest blogging can be a great tool for brand-building, generating referral traffic, and improving rankings.
These tactics are just the tip of the iceberg. There are many additional ways to build links. To learn more about link building check out our Noob Friendly Guide to Link Building.
Everything you need to know to get started with SEO
Although search engine optimization has gotten very competitive in the last few years, it is still by far the most effective way to drive sustainable traffic to your website.
Moreover, the efforts you put in optimizing your website for search add up over time, helping you get even more traffic as long as you’re consistent with SEO.
To achieve this, remember to follow the 4 steps of good basic SEO:
Find keywords to rank for with the search intent, search volume and keyword difficulty in mind.
Create pages that are optimized to rank for the keyword(s) they’re targeting.
Make sure your website loads fast and that it is structured so that it’s readable to both search engines and human readers.
Build links from other high-quality websites to the page you want to rank.
Over time this process will help you build up the authority of your website and you will be able to rank for more competitive keywords with high search volumes.
If you think I missed some other important search engine optimization basics, please let me know in comments!
SEO Tools: The Best Position Trackers, With Results
Let’s not beat around the bush here. Irrespective of what some professionals may claim, search engine rankings are still the best indicators of SEO progress.
Sure, on their own, they’re nothing more than a means to an end – increasing online visibility, getting more sales or achieving any other outcome you want for your business.
And none of that is possible if your rankings aren’t up there.
It is crucial that you can properly track and monitor your domain’s positions for relevant keywords.
The thing is, given the plethora of choices, how do you choose the best keyword ranking tool?
Well, that’s what I intended to find out. I decided to compare some of the popular rank trackers in the market – Moz, RankRanger, SEMrush, SE Ranking, and Ahrefs – to establish which offers the best value.
Without any further ado, here’s what I discovered.
Comparison criteria:
Here’s the functionality I took into consideration when comparing rank trackers.
How many search engines and devices does each tool track
How many keywords could I monitor
Can I track rankings by location (and how many locations could I specify)
Can I track rankings for my business name in local listings
Can I compare my rankings with the competition
Could the tool suggest additional competitors for me
Can I see which keywords trigger a featured snippet
How often do they update the data
Can a tool alert me of unusual ranking fluctuations
What sort of reports can I generate
I know there’s a lot to cover so let’s dive right in.
Comparison Results
#1. Search Engines and Devices
Search Engines
It goes without saying that we are all eager to know how our sites rank on Google.
But depending on various factors, like your location or business type, for example, you might want to track the rankings on other search engines as well.
And as it turns out, rank trackers differ greatly when it comes to search engines they target.
For example, Moz allows you to track rankings on Google, Google Mobile, Bing and Yahoo.
SEMrush and Ahrefs, however, track only Google (and Google mobile) rankings.
SE Ranking, on the other hand, also tracks rankings in Yandex, Yandex mobile and YouTube.
And finally, RankRanger offers an impressive list of search engines, including Google, Google mobile, Yahoo, Yandex, Bing, YouTube, Google Maps, Google Play, iTunes, Google Jobs, Haosou, Sogou, Baidu, Seznam, Atlas, and others.
However, the problem is that RankRanger allows you to pick only two search engines from the list. And so, although you have a plethora of choices, you can’t take advantage of them in full.
Verdict: If you look at the list of supported engines, RankRanger is miles ahead of the rest. Unfortunately, its restriction to two sources lands it in line with other tools.
Devices
All rank trackers allow monitoring rankings for desktop and mobile devices separately. However, SEMrush is the only one that splits mobile devices between smartphones and tablets, offering a more in-depth look at your mobile rankings.
Verdict: SEMrush for offering a more detailed analysis of mobile rankings.
#2. Keywords
Depending on the size of your site, and the business, you might want to track anything from a handful of keywords to thousands or more.
And particularly, if you’re in the latter category, it is crucial that the rank tracker you choose will not force you to drop some of the data.
Here’s how my selected rank trackers stack up regarding the number of keywords they allow you to track:
(Note: I’m quoting official information from those companies here, covering various price plans these products offer)
SEMrush: from 500 to 5,000
Moz: from 300 to 7,500
Ahrefs: from 300 to 10,000
SE Ranking: from 50 to 20,000
RankRanger: from 350 to 2,000
Verdict: A tricky choice as the best tool in this case largely depends on your business requirements.
So, instead of offering a strict verdict, I thought I’d compare the cost per keyword for each tool (using their smallest paid plan).
Moz: $99 per month, 300 keywords, $0.33 per keyword
Ahrefs: $99 per month, 300 keywords, $0.33 per keyword
SEMrush: $99 per month, 500 keywords, $0.20 per keyword
SE Ranking: $29 per month, 250 keywords, $0.12 per keyword
RankRanger: $69 per month, 350 keywords, $0.20 per keyword
#3. Location Tracking
If you run a local business, it is crucial that you know how your keywords perform in your immediate locale. And although it’s not a functionality I’d personally be interested in, I compared whether rank trackers from my list can deliver local rankings.
And I was pleasantly surprised to find out that every tool on the list allows tracking rankings by location.
SEMrush: Yes
Moz: Yes
Ahrefs: Yes
SE Ranking: Yes
RankRanger: Yes
However, they all offer a different number of location targets you could specify and track. What’s more, in some of the tools, the actual number also depends on a price plan you’ve selected.
Here’s the breakdown of how many locations per website you can specify:
SEMrush: 10
Moz: 4
Ahrefs: from 1 to unlimited, depending on the price plan
SE Ranking: 5
RankRanger: 2
Verdict: Ahrefs beats everyone else with their unlimited package.
#4. Tracking Business Name in Local Listings
Similarly, if you run (or help promote) a local business, you should know how it ranks for its business name.
Therefore, a rank tracker you choose should give you the ability to monitor rankings for the business name.
From the ones I’ve compared, SEMrush, SE Ranking and RankRanger offer this functionality.
RankRanger
SE Ranking
SEMrush
Moz doesn’t include it in its PRO package. However, you can get it if you sign up for their Moz Local tool, which requires a separate license.
Ahrefs, however, doesn’t allow tracking a business name.
Verdict: A tie between SEMrush, SE Ranking and RankRanger.
#5. Competitor Rankings
Similarly, I analyzed which rank tracker allows me to compare my rankings with the competition and build an image of my competitive landscape.
Ahrefs, SE Ranking and RankRanger allow you to compare up to 5 competitors.
SE Ranking
Ahrefs
Rank Ranger
SEMrush allows adding up to 20 competitors. You can also add specific competitors for each keyword, making the tool’s competitive report highly relevant.
Finally, with Moz, you can track rankings of up to 3 competitors. However, you can also add them from the organic report.
Verdict: SEMrush. If you need to monitor ranking changes for a large group of competitors, the tool delivers the greatest opportunity.
#6. Additional Competitor Research
I believe most will agree with me on this:
The companies you consider your immediate competitors aren’t necessarily the ones you battle with for the searcher’s attention.
Additional websites might be ranking for your keywords, trying to steal your traffic and revenue.
And so, it is crucial that you understand who your online competitors are (not just the market ones).
And as it turns out, there are only two tools on the list that offer this functionality:
SEMrush
SEMrush’s competitor report includes all domains that target the same keywords as you, not just the ones that battle for the same organic traffic.
Plus, it provides a wealth of additional information about each domain that you could use to plan strategies to overcome them in rankings.
SE Ranking
SE Ranking offers a list of domains targeting keywords similar to yours to choose from.
Verdict: SEMrush and SE Ranking because of their holistic approach of looking at other sources than just organic to get the full picture of the competitive landscape.
#7. Featured Snippets Report
Featured snippets are one heck of a deal.
After all, as some SEOs suggested, the featured snippet is now a rank zero, the ultimate ranking position you could strive for.
(Hell, you could rank on top of organic search, and the Answer Box will still nick your traffic!)
Unfortunately, getting it isn’t easy.
Or is it?
Because, for one, knowing which keywords already trigger the answer box could help you improve your relevant pages, and take over the feature box.
And so, as part of the comparison, I checked which rank trackers report on featured snippets in rankings.
And this is where things get really tricky, because all the tools show a featured snippet icon denoting that a particular keyword fires off the Answer Box. Some of them then allow you to dig deeper and filter results by this rank type.
So to take it from the top: Moz, Ahrefs, SEMrush and RankRanger all denote featured snippets in their ranking reports, allowing you to identify keywords that trigger the answer box.
The cool thing about SEMrush’s featured snippet report is that it is unique in the market. It significantly speeds up your workflow for getting into featured snippets in two ways: result analysis and opportunity search.
First, based on your keyword list, you get a list of all your pages that earned a featured snippet for any location or device that you target.
At the same time, in the other tab, you get all the keywords for which you did not gain a featured snippet rank, but your competitors did. In the table, you can see at once the keywords AND the specific URL from the snippet. This allows you to quickly analyze why a particular page gained the featured snippet ranking and, therefore, how to improve your own page. Also, by prioritizing your pages that reached the top 20 SERP positions, it’s easy to plan your work strategically.
RankRanger provides a dedicated featured snippet report as well, including the information about keywords, domains, and landing pages that trigger the answer box.
Verdict: A tie between RankRanger and SEMrush.
#8. Data Update Frequency
Given how fast rankings can change, it is crucial that you have access to the most current data.
Spotting an unusual ranking fluctuation early might help you identify and overcome an issue that could grow into a bigger problem if left unattended.
For that reason alone, I prefer an instant data update, rather than a delayed one. And so, as part of the process, I checked the intervals at which I could have my data delivered by the rank trackers.
But to my surprise, there are huge discrepancies with data update frequency between those products.
SE Ranking, SEMrush and RankRanger provide daily rankings.
RankRanger
Moz reports on your search positions only once a week.
SE Ranking, on the other hand, allows you to specify your desired frequency, and choose to receive the data every day, once in 3 days or weekly. Note that the frequency you choose will affect your price, with daily reports costing you more, and weekly slashing your cost by as much as 40%.
Finally, Ahrefs ties frequency to price plans. The higher plan you choose, the faster you’ll receive your data.
Verdict: RankRanger provides the best value for the money here. But it’s worth mentioning that SE Ranking allows you to select the frequency on all plans and customize it to your needs.
#9. Alerts
If something unusual happens to your rankings, you’d want to know the cause behind it, identify the problem in detail, and take adequate measures to fix it.
Unfortunately, the only rank tracker that offers the ability to set up alerts, and receive emails when rankings change is SEMrush.
That said, I admit that this is hands down an incredible feature, allowing SEOs to become more proactive, rather than reactive to changes and potential issues with rankings.
Verdict: SEMrush, undoubtedly.
Note: This is a different functionality from weekly (or daily, monthly, etc.) ranking update reports that most tools offer.
#10. Reports
It doesn’t matter if you’re an in-house SEO or work for an agency, at some point you’ll have to issue reports to prove your work. Although agency folk might do it more often, it’s a feature we all need.
Most of the rank trackers I compared offer the ability to create PDF reports. The only exception in the list is Ahrefs. It does not offer any reporting functionality.
With others, however, there are some differences regarding the type of reports you can create.
SEMrush allows you to create a standard PDF report that you can schedule to receive regularly.
You can also customize the look and content of the PDF, adding data from other SEMrush tools with a simple drag and drop interface.
Many of those elements allow you to bring in custom data and reports created specifically for this report, making it an invaluable tool for marketers and agencies that need to provide more context for the ranking report.
Moz offers custom reports too, and allows you to schedule and annotate them.
SE Ranking includes a template builder, allowing you to create various templates for your reports, depending on your requirements.
Finally, RankRanger offers the option to create white-labelled reports, customize the cover page, and adapt predefined PDF templates.
Closing Thoughts
As SEOs, we all have different objectives: increasing overall rankings, showing up in the Answer Box, or boosting local presence, among others. However, rankings, like I said, are the best indicators of SEO progress.
And we need to use a tool offering functionality that can help us achieve them.
I hope this detailed comparison helps you identify the most effective tool to reach your own SEO goals.
Disclaimers:
First, I signed up for and used demo trials of each tool. As a result, there is a possibility that certain functionality I may highlight as lacking in a particular app was simply blocked in the trial version.
Second, the results of this comparison are entirely subjective and based on my expectations for a rank tracker.
SEO Ranking Factors & Correlation: What Does It Mean When a Metric Is Correlated with Google Rankings? - Whiteboard Friday
In an industry where knowing exactly how to get ranked on Google is murky at best, SEO ranking factors studies can be incredibly alluring. But there’s danger in believing every correlation you read, and wisdom in looking at it with a critical eye. In this Whiteboard Friday, Rand covers the myths and realities of correlations, then shares a few smart ways to use and understand the data at hand.
Video Transcription
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we are chatting about SEO ranking factors and the challenge around understanding correlation, what correlation means when it comes to SEO factors.
So you have likely seen, over the course of your career in the SEO world, lots of studies like this. They’re usually called something like ranking factors or ranking elements study or the 2017 ranking factors, and a number of companies put them out. Years ago, Moz started to do this work with correlation stuff, and now many, many companies put these out. So people from Searchmetrics and I think Ahrefs puts something out, and SEMrush puts one out, and of course Moz has one.
These usually follow a pretty similar format, which is they take a large number of search results from Google, from a specific country or sometimes from multiple countries, and they’ll say, “We analyzed 100,000 or 50,000 Google search results, and in our set of results, we looked at the following ranking factors to see how well correlated they were with higher rankings.” That is to say how much they predicted that, on average, a page with this factor would outrank a page without the factor, or a page with more of this factor would outrank a page with less of this factor.
Correlation in SEO studies like these usually mean:
So, basically, in an SEO study, they usually mean something like this. They do like a scatter plot. They don’t have to specifically do a scatter plot, but visualization of the results. Then they’ll say, “Okay, linking root domains had better correlation or correlation with higher organic rankings than the 10 blue link-style results to the degree of 0.39.” They’ll usually use either Spearman or Pearson correlation. We won’t get into that here. It doesn’t matter too much.
Across this many searches, the metric predicted higher or lower rankings with this level of consistency. 1.0, by the way, would be perfect correlation. So, for example, if you were looking at days that end in Y and days that follow each other, well, there’s a perfect correlation because every day’s name ends in Y, at least in English.
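To make the mechanics concrete, here is a minimal sketch of the kind of calculation these studies run, written in Python with SciPy. Every number below is invented purely for illustration; it is not data from Moz, SEMrush or anyone else.

```python
# Minimal illustration of a ranking-factor correlation; all numbers are made up.
from scipy.stats import spearmanr

# One SERP: positions 1-10 and each result's count of linking root domains.
positions = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
linking_root_domains = [420, 380, 150, 310, 90, 60, 75, 20, 35, 10]

# Position 1 is "best", so a helpful metric correlates *negatively* with the
# position number; studies flip the sign when reporting the figure.
rho, p_value = spearmanr(positions, linking_root_domains)
print(f"Spearman correlation with higher rankings: {-rho:.2f} (p = {p_value:.3f})")
```

A real study averages this figure across many thousands of SERPs, where the relationship is far noisier, which is why published values tend to sit closer to 0.1 to 0.4 than to the tidy number this toy data produces.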
So search visits, let’s walk down this path just a little bit. So search visits, saying that that 0.47 correlated with higher rankings, if that sounds misleading to you, it sounds misleading to me too. The problem here is that’s not necessarily a ranking factor. At least I don’t think it is. I don’t think that the more visits you get from search from Google, the higher Google ranks you. I think it’s probably that the correlation runs the other way around — the higher you rank in search results, the more visits on average you get from Google search.
So these ranking factors, I’ll run through a bunch of these myths, but these ranking factors may not be factors at all. They’re just metrics or elements where the study has looked at the correlation and is trying to show you the relationship on average. But you have to understand and intuit this information properly, otherwise you can be very misled.
Myths and realities of correlation in SEO
So let’s walk through a few of these.
1. Correlation doesn’t tell us which way the connection runs.
So it does not say whether factor X influences the rankings or whether higher rankings influence factor X. Let’s take another example: number of Facebook shares. Could it be the case that search results that rank higher in Google oftentimes get people sharing them more on Facebook because they’ve been seen by more people who searched for them? I think that’s totally possible. I don’t know whether it’s the case. We can’t prove it right here and now, but we can certainly say, “You know what? This number does not necessarily mean that Facebook shares influence Google results.” It could be the case that Google results influence Facebook shares. It could be the case that there’s a third factor that’s causing both of them. Or it could be the case that there’s, in fact, no relationship and this is merely a coincidental result, probably unlikely given that there is some relationship there, but possible.
2. Correlation does not imply causation.
This is a famous quote, but let’s continue with the famous quote. But it sure is a hint. It sure is a hint. That’s exactly what we like to use correlation for is as a hint of things we might investigate further. We’ll talk about that in a second.
3. In an algorithm like Google’s, with thousands of potential ranking inputs, if you see any single metric at 0.1 or higher, I tend to think that, in general, that is an interesting result.
It doesn’t prove something, and it doesn’t mean that there’s a direct connection; it’s just interesting. It’s worthy of further exploration. It’s worthy of understanding. It’s worthy of forming hypotheses and then trying to prove those wrong. It is interesting.
4. Correlation does tell us what more successful pages and sites do that less successful sites and pages don’t do.
Sometimes, in my opinion, that is just as interesting as what is actually causing rankings in Google. So you might say, “Oh, this doesn’t prove anything.” What it proves to me is pages that are getting more Facebook shares tend to do a good bit better than pages that are not getting as many Facebook shares.
I don’t really care, to be honest, whether that is a direct Google ranking factor or whether that’s just something that’s happening. If it’s happening in my space, if it’s happening in the world of SERPs that I care about, that is useful information for me to know and information that I should be applying, because it suggests that my competitors are doing this and that if I don’t do it, I probably won’t be as successful, or I may not be as successful as the ones who are. Certainly, I want to understand how they’re doing it and why they’re doing it.
5. None of these studies that I have ever seen so far have looked specifically at SERP features.
So one of the things that you have to remember, when you’re looking at these, is think organic, 10 blue link-style results. We’re not talking about AdWords, the paid results. We’re not talking about Knowledge Graph or featured snippets or image results or video results or any of these other, the news boxes, the Twitter results, anything else that goes in there. So this is kind of old-school, classic organic SEO.
6. Correlation is not a best practice.
So it does not mean that because this list descends and goes down in this order that those are the things you should do in that particular order. Don’t use this as a roadmap.
7. Low correlation does not mean that a metric or a tactic doesn’t work
Example, a high percent of sites using a page or a tactic will result in a very low correlation. So, for example, when we first did this study in I think it was 2005 that Moz ran its first one of these, maybe it was ’07, we saw that keyword use in the title element was strongly correlated. I think it was probably around 0.2, 0.15, something like that. Then over time, it’s gone way, way down. Now, it’s something like 0.03, extremely small, infinitesimally small.
What does that mean? Well, it could mean one of two things. It could mean Google is using it less as a ranking factor. It could mean that it was never connected, and it’s just total speculation, total coincidence. Or three, it could mean that a lot more people who rank in the top 20 or 30 results, which is what these studies usually look at, top 10 to top 50 sometimes, a lot more of them are putting the keyword in the title, and therefore, there’s just no difference between result number 31 and result number 1, because they both have them in the title. So you’re seeing a much lower correlation between pages that don’t have them and do have them and higher rankings. So be careful about how you intuit that.
Oh, one final note. I did put -0.02 here. A negative correlation means that as you see less of this thing, you tend to see higher rankings. Again, unless there is a strong negative correlation, I tend to watch out for these, or I tend to not pay too much attention. For example, the keyword in the meta description, it could just be that, well, it turns out pretty much everyone has the keyword in the meta description now, so this is just not a big differentiating factor.
What is correlation good for?
All right. What’s correlation actually good for? We talked about a bunch of myths, ways not to use it.
A. IDing the elements that more successful pages tend to have
So if I look across a correlation and I see that lots of pages are twice as likely to have X and rank highly as the ones that don’t rank highly, well, that is a good piece of data for me.
B. Watching elements over time to see if they rise or lower in correlation.
For example, we watch links very closely over time to see if they rise or lower so that we can say: “Gosh, does it look like links are getting more or less influential in Google’s rankings? Are they more or less correlated than they were last year or two years ago?” And if we see that drop dramatically, we might intuit, “Hey, we should test the power of links again. Time for another experiment to see if links still move the needle, or if they’re becoming less powerful, or if it’s merely that the correlation is dropping.”
C. Comparing sets of search results against one another we can identify unique attributes that might be true
So, for example, in a vertical like news, we might see that domain authority is much more important than it is in fitness, where smaller sites potentially have much more opportunity or dominate. Or we might see that something like https is not a great way to stand out in news, because everybody has it, but in fitness, it is a way to stand out and, in fact, the folks who do have it tend to do much better. Maybe they’ve invested more in their sites.
D. Judging metrics as a predictive ranking ability
Essentially, when I’m looking at a metric like domain authority, how good is that at telling me on average how much better one domain will rank in Google versus another? I can see that this number is a good indication of that. If that number goes down, domain authority is less predictive, less sort of useful for me. If it goes up, it’s more useful. I did this a couple years ago with Alexa Rank and SimilarWeb, looking at traffic metrics and which ones are best correlated with actual traffic, and found Alexa Rank is awful and SimilarWeb is quite excellent. So there you go.
E. Finding elements to test
So if I see that large images embedded on a page that’s already ranking on page 1 of search results has a 0.61 correlation with the image from that page ranking in the image results in the first few, wow, that’s really interesting. You know what? I’m going to go test that and take big images and embed them on my pages that are ranking and see if I can get the image results that I care about. That’s great information for testing.
This is all stuff that correlation is useful for. Correlation in SEO, especially when it comes to ranking factors or ranking elements, can be very misleading. I hope that this will help you to better understand how to use and not use that data.
Thanks. We’ll see you again next week for another edition of Whiteboard Friday.
Video transcription by Speechpad.com
The image used to promote this post was adapted with gratitude from the hilarious webcomic, xkcd.
seotopbox-blog · 6 years
Photo
Tumblr media
Google to roll out new Search Console features in coming weeks
Google announced today that it will be making the new Google Search Console available to everyone in the coming weeks. Specifically, verified users in Google Search Console will be able to access the new Search Performance, Index Coverage, AMP status and Job posting reports.
Google said the new Search Console reports provide “more transparency into Google’s indexing, stateful two-way communications between Google and website owners to help resolve issues faster, and a responsive user-interface.”
Search Engine Land was the first to break the news around the beta Search Console features back in July. Google then shared additional details about this beta. With this launch, the search performance report will have a year-plus of data, which has long been a priority request from the industry.
Both versions of Search Console will continue to be live for all users and can be used side by side. Google said it will continue to add classic Search Console features into the new Search Console, so until that process is complete, webmasters will be able to access both versions via a link in the navigation bar.
Also noteworthy in the new Search Console is the ability to share critical issues that surface in (most) reports with multiple team members in an organization, to more efficiently distribute the information and have problems addressed more expediently. Revoking access is likewise easy and intuitive. More on this below.
Search Performance report:
Google’s new Search Performance report is similar to the Search Analytics report, but it gives 16 months of data. The report lets you see clicks, impressions, CTR and average position. It also lets you filter by web, image or video search results and segment by query, page, country or device type. Google indicates this data will also soon be available via the Search Console API.
Search Performance Report
For more on this report, see the help documentation.
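Since the article notes this data is also coming to the Search Console API, here is a hedged sketch of what such a query could look like using the Google API Python client (the Search Analytics endpoint has historically been exposed as the "webmasters" v3 service). The `creds` object is assumed to be an already-authorized OAuth2 credential for a verified property, and the site URL and dates are placeholders; check the current API documentation before relying on any of this.

```python
from googleapiclient.discovery import build

def fetch_search_performance(creds, site_url="https://www.example.com/"):
    # Search Console's Search Analytics endpoint, as exposed by the Python client.
    service = build("webmasters", "v3", credentials=creds)
    body = {
        "startDate": "2018-01-01",
        "endDate": "2018-01-31",
        "dimensions": ["query", "device"],  # segment by query and device type
        "rowLimit": 100,
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    # Each row carries clicks, impressions, CTR and average position.
    for row in response.get("rows", []):
        query, device = row["keys"]
        print(query, device, row["clicks"], row["impressions"],
              round(row["ctr"] * 100, 2), round(row["position"], 1))
    return response
```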
Index Coverage report:
The Index Coverage report is similar to the Index Status report: It shows webmasters how well Google is indexing their site. It lists properly indexed URLs and gives warnings about indexing issues — along with explanations for why Google is not indexing some URLs. The report shows changes over time and improvements to site indexing over that time frame. Clicking on any error URL will bring up links to diagnostics tools, as well as a way to resubmit the URL for indexing. The data can also be exported for deeper analysis. Google notes that this report “works best for sites that submit sitemap files.”
Index Coverage report
From the Google announcement (the items below correspond to the numbered callouts in the image above):
When you drill down into a specific issue, you will see a sample of URLs from your site. Clicking on error URLs brings up the page details with links to diagnostic tools that help you understand what is the source of the problem.
Fixing Search issues often involves multiple teams within a company. Giving the right people access to information about the current status, or about issues that have come up there, is critical to improving an implementation quickly. Now, within most reports in the new Search Console, you can do that with the share button on top of the report which will create a shareable link to the report. Once things are resolved, you can disable sharing just as easily.
The new Search Console can also help you confirm that you’ve resolved an issue, and help us to update our index accordingly. To do this, select a flagged issue, and click validate fix. Google will then crawl and reprocess the affected URLs with a higher priority, helping your site to get back on track faster than ever.
The Index Coverage report works best for sites that submit sitemap files. Sitemap files are a great way to let search engines know about new and updated URLs. Once you’ve submitted a sitemap file, you can now use the sitemap filter over the Index Coverage data, so that you’re able to focus on an exact list of URLs.
For more on this report, see the help documentation.
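Because the Index Coverage report works best when a sitemap has been submitted, here is a minimal, generic sketch of generating one with Python's standard library. The URLs, dates and output path are placeholders, and this is my own illustration rather than anything Google ships.

```python
from xml.etree import ElementTree as ET

def write_sitemap(urls, path="sitemap.xml"):
    # Build a <urlset> document in the standard sitemaps.org namespace.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = loc
        ET.SubElement(url_el, "lastmod").text = lastmod  # helps Google spot updated pages
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

write_sitemap([
    ("https://www.example.com/", "2018-01-10"),
    ("https://www.example.com/blog/new-post", "2018-01-09"),
])
```

Once a file like this is submitted in Search Console, the sitemap filter over the Index Coverage data can be applied to exactly that list of URLs.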
AMP status report:
The AMP status report gives errors and warnings around your AMP URLs. It shows which URLs have issues, gives a diagnosis of the issue itself and lets you fix the issue and then test to make sure the AMP URL is now valid. Google said that “you can request that Search Console validates the fix across multiple pages.” Once the issue is fixed, Google said it will “crawl and reprocess the affected URLs with a higher priority.” In addition, the AMP reports can be shared with external teams.
AMP Status report
For more on this report, see the help documentation.
Job Posting report:
Those who post job listings on their websites can now check out the new Job Posting report. This report will show statistics around your job listing results, any indexing issues and a way to resolve those issues.
Job Posting report
For more on this report, see the help documentation.
Google is looking for ongoing feedback on the new Search Console and encourages webmasters to continue sending this as adoption of the new version rolls out. See the full blog post from Google here.
seotopbox-blog · 7 years
Photo
Tumblr media
Keyword Research for SEO: Complete Guide 2017
Keywords represent all the phrases you type into the Google search box when surfing the net. Having this in mind, you can quickly establish that SEO is a user-oriented profession.
In fact, an expert’s proficiency can be measured by their ability to discover trending keywords and rank for them; in other words, by the SEO’s ability to perform keyword research.
As with any commercial product, there are two main things that should concern us – the strength of our competition and the demand for a certain keyword.
By using SEO terminology, we can say that two main factors of keyword research are:
Search Volume (number of monthly searches)
Keyword Difficulty (competitiveness of a keyword)
Unfortunately, unlike a classic economy where everything is quantifiable, things get a bit troublesome in the world of SEO. We usually rely on stats provided by the Google Keyword Planner tool, which is based on PPC (pay-per-click), or paid, search. It is really hard to establish the real state of things, and it usually comes down to approximation.
But, we will discuss that later on in the article so stay tuned. For now, let’s start with the basics.
What type of keywords should I pursue?
There are two types of keywords that you should consider during your keyword research:
Those that can bring you profit (so called “money” or commercial keywords)
Those that can bring you traffic and links (also known as informative keywords)
Most websites exist to make a profit. In the majority of cases, the products are directly sold through the website and shipped all over the country or world. This is why it is necessary to rank for keywords that will lure potential customers to your website and increase your sales.
Whenever you create some content, you have to consider your potential clients. What kind of keyword will they use when searching for a product? These phrases will usually include descriptive words such as buy, cheap, affordable, etc. They will help your customer pinpoint just the thing they need.
Unfortunately, as keywords become more commercially oriented, they will also become more competitive. For example, phrases with “buy” and “cheap” in them are among the hardest ones to rank for on the Internet.
Nevertheless, you still have to try and rank for them as they are the best way for you to remain profitable. On the other hand, you can search for keywords that will attract additional traffic.
Why would I do that, you might ask?
Simply put, unless you have an enormous amount of money to spend on an aggressive pay-per-click marketing campaign (such as the one performed by Amazon), you will have to build your website up from the ground, which will ultimately bring a lot of organic traffic.
You will require more links and shares to get to that point, and the best way to get them is by writing about things that will interest a larger audience. Here, I am not only referring to potential clients but also to news websites, popular blogs within your niche and industry experts.
Let’s use this example. You are selling tractors. One of the first articles which you posted on your blog is about different types of tractors. Naturally, you are trying to promote your own tractors by linking to your product pages. If the piece is awesome, you might get several links and a nice bunch of shares.
As an alternative, you can create an awesome article about new agricultural measures. It may elaborate on something that everybody is talking about and, ultimately, it will give your website a lot of buzz.
The drawback of this second method is that your website won’t be making any profit. Yes, there will be a lot of visitors on your website, but this will not be a commercial audience. When it comes to selling your tractors, the conversion rate will be minimal. However, this is a good initial step towards building your brand and online presence.
For short-term goals, money keywords should be prioritized. For long-term, you need both types. Bear in mind, no matter what you do, you will have to use commercial keywords as a way to keep your website afloat.
Structure of a keyword
The structure and length of a keyword are among the crucial things that are directly correlated with its difficulty and volume.
As I previously mentioned, there are certain types of keywords that are significantly more difficult to rank for. On the other hand, there are those that constantly have high or low volume, or may even fluctuate. A good example is “Summer Olympics”.
Length of a keyword is another factor that is important for volume and difficulty. As you can presume, volume becomes lower for longer keywords and vice versa. Based on their length, we can differentiate three types of keywords:
Short-tail keywords (1 to 2 words)
Medium-tail keywords (3 to 4 words)
Long-tail keywords (longer than 4)
When it comes to structure, we can differentiate:
Head (main word or a phrase which is the centerpiece of the search)
Modifier (a word which can be substituted for other words in order to change a single aspect of the keyword’s meaning)
Tail (all other words used to describe or explain our query)
Short-tail keywords are the simplest formation. They only have a head word. Generally speaking, it is nearly impossible to rank for such a phrase due to extensive competition. However, they do bring enormous traffic.
Medium-tail keywords are just the thing we are looking for. As you can presume, 3-to-4-word phrases are extremely sought after. They definitely do not have the same volume as short-tail keywords, but with them, you stand a chance of ranking.
Long-tail keywords are phrases longer than four words. Even though they are really easy to rank for, they are often neglected due to their low volume. However, long tails can also be quite powerful when you rank for a lot of them at the same time.
Basically, when you perform research, you should focus on phrases that have medium volume and low or medium difficulty (thus medium-tail keywords). But, there is a catch. Keyword research doesn’t stop when you find such a phrase. Instead, you need to focus on those medium-tail keywords that are performing better than the rest.
If a keyword has lower volume, it needs to compensate by being easier to rank for. On the flipside, if it has medium difficulty, it needs to have higher search volume to justify the effort.
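To make that trade-off concrete, here is a rough sketch of how you might score a candidate list. The weighting and every number below are my own assumptions for illustration, not a formula from any keyword tool.

```python
def keyword_score(volume, difficulty):
    # Reward demand, penalize competitiveness; tune the weighting to taste.
    return volume / (difficulty + 1)

# (monthly volume, difficulty on a 0-100 scale) - invented figures
candidates = {
    "tractor": (74000, 85),                            # short tail: huge volume, brutal difficulty
    "compact tractor reviews": (2900, 38),             # medium tail: the sweet spot
    "best compact tractor for small farm": (320, 18),  # long tail: easy but tiny
}

for phrase, (volume, difficulty) in sorted(
    candidates.items(), key=lambda kv: keyword_score(*kv[1]), reverse=True
):
    print(f"{phrase}: score {keyword_score(volume, difficulty):.0f}")
```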
Finding keyword ideas
In order to do keyword research properly, we need a lot of initial ideas to guide us through the process. Based on the previous chapters, you should have a fair understanding of what is required of you. Now, let’s find a way to detect all those phrases that can have a positive impact on our sales.
It is usually recommended that you start from the main product or service you are offering. Commercial keyword research is much more limited: you already know what you have to focus on, and you will do everything to optimize around that phrase. On the other hand, if you wish to boost your website’s stats, you are able to create different content.
Always keep in mind that besides your own industry, you can also tap into niche markets. They include all the topics that are somewhat related to you but are not exactly what you are offering. We can call them shoulder niches.
How to find keywords
Let’s review all the tools and approaches you can use to get keyword ideas:
1) Google auto-suggest and searches related to
Google itself is a keyword suggestion tool. For example, when you start typing in a phrase, the search engine will start completing your sentences, giving you 10 suggestions as you go.
At the same time, on the bottom of every page there will be “Searches related to” section. Here, Google will give you eight additional suggestions that are closely related to your topic.
However, due to its limitations, the biggest search engine can only be used as a way to get a basic understanding of the topic. Nevertheless, it is a solid starting point.
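If you want to collect these suggestions in bulk, they can also be fetched programmatically. The sketch below relies on an unofficial, undocumented suggest endpoint that Google may change or block at any time, so treat it purely as an illustration.

```python
import requests

def google_suggestions(seed):
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed},
        timeout=10,
    )
    resp.raise_for_status()
    # The response is a JSON array: [original query, [suggestion, suggestion, ...]]
    return resp.json()[1]

print(google_suggestions("organic food"))
```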
2) Wikipedia 
Oftentimes when we look for something on the internet, we turn to Wikipedia as a source of extensive knowledge. Even though there are better sources for particular topics, this website is still considered the best and most comprehensive encyclopedia.
By entering your main keyword in its search engine, you will get a page with a description. Here, in the table of contents, you can find other relevant topics and sub-categories.
Most of these sub-categories are really extensive, and they can be used as a source for additional research. We refer to them as shoulder niches, or niches that are in some way related to our own niche.
3) Quora, Yahoo! Answers and forums
For some time now, Quora and Yahoo! Answers have been the two best places for finding answers to all your questions. Nevertheless, every industry has its own forums that are recognized as a good source of information.
Now, here is the general idea. If someone has already looked for something on forums, there is a high chance they will use the same (or a similar) phrase in the Google search bar. By using these platforms, you can easily learn what the trending topics are, what people are interested in and, subsequently, what is going to bring the most traffic to your website.
BoardReader.com is a forum search engine that can be extremely useful when it comes to collecting keyword ideas from forums and online boards. Simply enter your keyword in its search box and you’ll be given lots of keyword ideas directly from forums which you won’t be able to find anywhere else.
4) Google Trends and Google Correlate 
As I mentioned, search volume for keywords is not static. It fluctuates. If you are an SEO expert, you should recognize rising and falling trends and act accordingly. This is why many experts like to use Google Trends as the initial point of their research.
If the number of searches for a keyword is rising, this means that we have a chance of creating awesome content before the topic becomes too popular and hard to rank for. Furthermore, Google Trends can show you where the majority of the traffic is coming from and give you some additional keyword ideas.
Google Correlate is part of Google Trends. It uses a scale of 1 to -1 to show you the level of correlation between your starting keyword and all the other phrases. To rephrase, it shows the search patterns where some keywords are likely to rise or fall together with your main keyword.
5) Google Keyword Planner
Google Keyword Planner is one of the most commonly mentioned tools when it comes to keyword suggestions, and there is a good reason for it. This tool is based on the AdWords system, where the search engine is able to calculate volume, competitiveness and price for each keyword.
Unfortunately, when it comes to volume and competition, it is based on paid search, not the organic one. Of late, Google has focused on using it primarily for PPC. So, unless you invest some money in a paid campaign, you cannot get good results.
Nevertheless, it is still a good tool for getting keyword ideas. First, you need to access “Search for new keyword and ad group ideas” option.
Then, you are interested in two things:
Ad group ideas (suggested keywords are categorized into potential ad groups)
Keyword ideas (a list with keywords that are closely related to your main keyword)
Although you can only use keyword ideas, I strongly recommend that you also use Ad group ideas. It will widen the scope of your search a lot.
For example, if you use the “cat food” keyword, you will instantly get several suggested phrases consisting of both “cat” and “food”. They will have volume, suggested bid, competition and other stats. However, if you use Ad group ideas before Keyword ideas, you can get a list with all the other related groups of keywords, such as “cat toys” and “pet food”.
As you can see, with Ad group ideas you not only get suggestions based on your main keyword but also semantically related words and phrases that your competition might not even know exist.
6) Keyword Shitter
This rather simplistic keyword suggestion tool is considered one of the most comprehensive of its kind on the internet. Besides the fact that it is easy to use, it provides amazing results.
All you have to do is type in a keyword and this program will give you a huge list of suggestions. To refine search, you can use positive and negative filters that will include or exclude certain word or phrase.
This tool needs some time to retrieve all the results, but it is more than worth it. A word of caution – after a while, it will start giving unrelated results. Because of this, you will have to be careful when assessing them.
After your list is complete, just head back to Google Keyword Planner, copy-paste the list of results Keyword Shitter got you, and add the most lucrative keywords to your list.
7) Keywordtool.io
Another great free SEO tool on our list, Keywordtool.io is pretty solid when it comes to extracting keywords from various sources. You can use Google, Bing, YouTube, Amazon and other search engines and add suggestions to your list.
For example, you can add several suggestions from Google and then start browsing Bing and add several suggestions from Bing. What makes this tool so special is the fact that it doesn’t only give you words to add before and after your main keyword, but it does this for every letter in the alphabet.
For example, if your main phrase is “organic food”, Keywordtool.io will give you ideas like “best organic food” or “organic food delivery” for every letter from A to Z.
At the end of the process, you can export all these results to use later on in Google Keyword Planner or some other tool of your choice.
8) SEMRush
When it comes to reverse engineering your competitors’ SEO, SEMrush is definitely the tool which you always have to have in your arsenal. Its unique advantage lies in its ability to show very accurate organic and PPC data for almost every website.
This great program can be used to spy on your competition and check out their keywords (among other things). Just enter the URL of your main competitors in SEMRush and see exactly where their organic traffic is coming from.
It will give you pretty accurate data allowing you to copy the strategy of your competition. SEMRush shows all the ranking keywords of a website and their current positions in Google together with the percentage of traffic they bring and many other useful stats. It is a great way to get some fresh keyword ideas that no other tool can show you.
Assessing the keywords and your competition
You probably have an extensive list of results in front of you by now. That’s great!
Now, you need to examine all of them and find just the right keyword with the greatest potential.
If I had written this article a couple of months ago, I would definitely have suggested using Google Keyword Planner. Due to the significant changes that Keyword Planner underwent, it is no longer an option that’s free for everyone. Google wanted to place emphasis on PPC users that spend money on AdWords. As an organic user, you won’t get the full scope of things.
But, there are other tools which can be used. During this part of research, you have to determine search volume and keyword difficulty of a keyword. You also need to see how much work it will take to reach top rankings in Google which can be done by analyzing the links of your competitors as well as the strength of their websites.
How to use your keyword research tools
Some things can be done quickly and painlessly without having to invest a cent. But, during this stage, it is recommended to invest some money in tools.
Keep in mind that you are able to perform the entire research without spending any money. But, for optimal results, you might consider getting some of the keyword research tools from the list.
Now, let’s see what kind of programs you need.
1) Keywordtool.io – assessing keyword volume
Maybe Google Keyword Planner has changed, but Keywordtool.io hasn’t. The tool is based on the same data which can be found in Google Keyword Planner. In fact, it extracts all the info from it. So, even though Keyword Planner is no longer an option, you have a suitable replacement.
It provides three basic types of data:
Volume (total number of monthly searches in Google)
CPC (cost-per-click or the amount of money that bidders pay for that particular keyword)
AdWords Competition (number of people bidding for that keyword)
Although this data is based on PPC, it does show us how competitive and popular a keyword is. Be cautious though because this is only an approximation. It doesn’t show the real state of organic traffic.
Similarly to Keyword Shitter, you have a positive and negative filter which allows you to include or exclude certain words to your liking. On top of that, you can further filter your search by looking for data either in Google, YouTube, Bing, Amazon or App Store. As you can presume, Keywordtool.io is also good for getting new keyword ideas.
2) Moz Keyword Explorer – assessing keyword difficulty
The next step of the process is determining the difficulty of your keyword. Although Keywordtool.io is great at assessing the volume, it doesn’t evaluate keyword difficulty. Instead, you should use Moz Keyword Explorer. This tool is by far the best way to assess how difficult a keyword is.
The three basic stats that this tool provides are:
Difficulty (how difficult it is to rank higher than the articles which are already ranking)
Opportunity (estimated organic click-through-rate)
Potential (combination of previous scores)
Together with the previously mentioned tool, Keyword Explorer can help us understand what to expect from a keyword. Its algorithms that assess difficulty are quite precise, and I would wholeheartedly recommend them.
The fact that this is a freemium tool makes it that much better. Just register an account with Moz and you’ll get five free searches per day.
Besides this basic data, it also shows you other keyword suggestions, SERP analysis and keyword mentions. It is a very practical way to analyze the first page of results and check your competition.
  3) MozBar – Assessing domain authority
MozBar is something that every blogger should have regardless of whether they are performing keyword research or not.
This nifty extension is completely free and you can get it through Google Web Store. It shows you the PA and DA (page and domain authority of websites).
Whenever you search for a keyword in Google, you will get a list of all the top competitors. With this extension, you can see page and domain authority score of each one of the top 10 ranking websites without having to click on every page individually.
With that, we come to our next point.
4) Manually checking the first page
People tend to forget that the process of keyword research isn’t exclusively based on tool usage. The human factor also plays a role, as you go to the first page of results and check all the competitors with your own eyes (also known as eyeballing).
No matter what you do, I always suggest that you start by checking the keyword’s volume and difficulty. It is a necessity that will save you a lot of time later on. But the numbers can only tell you so much. You need to eyeball each result on the first page and check all the competitors yourself.
Are there too many authoritative websites on page one? Do these results have extensive, high quality articles? If so, there is a slim chance of ranking.
On the other hand, if you notice a lot of sites with low PA and DA scores, forum results, pages on free blog platforms like wordpress.com or blogspot, it may indicate that the search is lacking quality sources.
By creating your own high-quality, long article, you can easily beat the competition and rank on page one. Don’t forget to build links as you go too, which takes us to the last tool.
5) Ahrefs – Assessing links’ power and quantity
For now, everything seems OK. You have assessed the stats and your competition doesn’t look too stiff. Awesome! But there is one final step along the way. You need to check the top 10 competitors’ backlinks.
Links still remain the most impactful ranking factor. That being said, you always have to check the links of other pages and see if you can beat that score. I would recommend using Ahrefs as the best tool for this particular purpose.
Ahrefs is pretty quick to notice new backlinks. On top of that, it is rather precise when doing so. The biggest issue with this tool is the price. But, if you are serious about keyword research, it is better if you get it.
Without assessing the links of your main competitors, you can never know whether you can actually rank for a keyword. Getting links can be especially problematic for brand new websites. As a result, all your efforts may be in vain.
There are two things that need to be considered:
Quantity of the links
Quality of the links
When it comes to backlinks, more is not always better. One link from a highly authoritative website can easily trump dozens coming from weaker blogs.
Again, it’s all based on a rough assessment. If a website has a certain number of links, that doesn’t necessarily mean that we need the same number to overtake it. There are numerous additional ranking factors that have some impact. But if the first few results have around 100 links each, with an average DA over 50, it can tell us where we stand and whether the keyword is too difficult to penetrate.
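Here is a back-of-the-envelope sketch of that judgement call. The inputs are whatever your backlink tool reports for the current top results, and the thresholds are my own rough assumptions rather than an Ahrefs formula.

```python
def link_gap(competitors, my_links, my_da):
    # Average the link counts and domain authority of the current top results.
    avg_links = sum(c["links"] for c in competitors) / len(competitors)
    avg_da = sum(c["da"] for c in competitors) / len(competitors)
    print(f"Top results average {avg_links:.0f} links at DA {avg_da:.0f}; "
          f"you have {my_links} links at DA {my_da}.")
    if avg_links > my_links * 3 and avg_da > my_da + 20:
        return "Probably out of reach for now - target an easier keyword."
    return "Within reach - a handful of quality links could close the gap."

# Invented figures for the top three results of a hypothetical keyword.
top_results = [
    {"links": 110, "da": 56},
    {"links": 95, "da": 61},
    {"links": 130, "da": 48},
]
print(link_gap(top_results, my_links=25, my_da=32))
```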
Conclusion
SEO is not an exact science. It has never been. At best, it can be called a profession of educated guesses. The same goes for keyword research. But, similarly to other professions that are rather intangible (such as stock trading), we need a starting point which can reduce the risk of failure. In the end, there is no point in randomly selecting keywords, right?
Keyword research is a process that can be costly. At the same time, if you know the tools, you will be able to perform it at a much lower cost. With this detailed guide, I hope you’ve got some basic understanding of what can work and what’s a complete waste of time. Let me know in the comments below.
seotopbox-blog · 7 years
Photo
Tumblr media
Key Benefits Of Social Bookmarking and Content Curation Sites
If you expect your content to go viral or at least get the traffic you want without doing anything apart from pressing the ‘Publish’ button, you’re wrong. There are many ways to help people find your content – social media is the preferred method, but bookmarking is still useful, if used correctly.
When it comes to content promotion, not only must you use a variety of channels, you need to have a strategy. As part of your strategy, as well as your promotional arsenal, social bookmarking sites should not be overlooked. For sure, you still need to engage on social media, as well as have a rigorous outreach strategy, but social bookmarking sites are still an effective way of putting your content in front of the people who are interested in your industry, products and services. Therefore, these should be implemented into your SEO and content promotion strategy.
Key Benefits Of Social Bookmarking and Content Curation Sites
Whilst times have changed and bookmarking is no longer considered a link building practice on a massive scale, there are still a lot of benefits this can generate for your website. These can include:
More traffic – Many of the top social bookmarking sites can be a great referral source; for example, StumbleUpon often appears as one of the top three referral sources
Qualified visitors – If you operate in a specific industry, the use of highly targeted/niche bookmarking sites can help you gain qualified visitors
More social signals – The more social signals your content receives across multiple sites the better, as it is believed this can have a positive effect on your rankings
Faster indexing – Social bookmarking platforms are crawled almost constantly by search engines which can often result in your content being indexed much faster
However, given Google’s recent algorithm updates I believe you need to have a strategy when it comes to social bookmarking to avoid any pitfalls, or even worse, receiving a penalty from Google.
Bookmarking The Right Way
According to Google’s latest quality guidelines, links coming from low-quality directories and bookmarking sites can be seen as unnatural and therefore could have a negative effect on your rankings.
Years ago, social bookmarking was pretty much about getting your content bookmarked on every site out there, no matter the quality, in order to generate as many links as possible. Well, these days are gone, so if you’re still doing it, STOP now!
The correct way to use social bookmarking and content curation sites is to use them on a regular basis. You need to be active, grow your followers (if possible) and most importantly engage with the audience, otherwise you won’t get the results you want and you can actually put your website at greater risk.
I would highly recommend applying the Pareto 80/20 rule where 80% of the content you bookmark/curate/submit is not from your website, but still relevant to your industry. The rest (20%) of submitted or shared articles should be from your website or blog.
6 Rules for Social Bookmarking In the Post-Penguin World
Find quality bookmarking sites that are relevant to your industry, products and target audience
Create complete profiles when setting up your profile/account as you would with Facebook, Twitter or Google+
Quality over quantity – avoid bookmarking your content on as many sites as possible, relevancy is the key. Stop using automated Social Bookmarking tools
Use the 80/20 rule – don’t just promote your own content
Build a portfolio of good bookmarks that will help other users. This can help you build trust; in other words people will be more likely to share, like or vote up your bookmarked content
Be active – re-share other’s content, engage with people, ask questions and post comments
Top Social Bookmarking and Content Curation Sites
It’s time to take your content beyond the usual social networks such as Twitter, Facebook, LinkedIn, and Pinterest. I’ve created a list of top social bookmarking sites which I have a good experience with and use on a regular basis.
1. Pearltrees
Pearltrees is one of my favourite bookmarking sites as it’s a very powerful and easy-to-use platform. I use it as my number one personal bookmarking site and have already seen good results in terms of traffic, as well as likes within the platform itself. Below you can see how I structure the content I share.
(Screenshot: an example of how the shared content is organized into Pearltrees collections)
Pros – very easy to set up and use, simple yet powerful, paid options available
Cons – you need to build up a good portfolio as sporadic activity won’t help you
2. StumbleUpon
StumbleUpon is a must-use site not only for those who want to share their articles, but also for people who want to find interesting articles, hence the nickname ‘discovery engine’. It can be a powerful traffic driver for many businesses, big as well as small. With a wide range of categories to choose from, you’ll be able to show your content to a relevant audience.
Pros – great for finding interesting content, many topics/interests/categories to which you can submit content
Cons – Again, you need to be active first; make sure you submit other articles and have friends on it before you submit your own content.
3. Delicious
Delicious is a very effective bookmarking site that enables you to easily bookmark sites you like and create personal collections based on tags/keywords. As a bonus, all of your bookmarks can be shown in a feed allowing others to follow/subscribe to your bookmarks.
Pros –  Extremely popular bookmarking site with a large base of users, it can be a powerful traffic driver
Cons –  A good and active account is required in order to see results
4. Digg
Digg, together with Delicious, is among the top bookmarking sites with high authority. If your website or content is shared and voted up and makes the Digg home page, you can expect a lot of traffic.
The main pros and cons are similar to Delicious, so don’t expect much from your first submission.
5. Flipboard
Flipboard claims to be a social magazine that collects news and other content and presents it in a magazine-style format. With Flipboard you can create your own magazine on any topic you like, which means you can build visually appealing magazines on topics relevant to your industry and audience.
This SteamFeed article will show you how to use Flipboard to promote your business.
Pros – stunning visual design, great way to showcase your best content or even image-based content, e.g. an online story book
Cons – you need to sign up via phone (a web-based platform launched recently), and being active is a must as no one will look at a three-page magazine
6. BizSugar
BizSugar is one of my favourite curation sites and one I use on a regular basis. As the name suggests, BizSugar is a social bookmarking site made for small and medium-sized businesses to share useful content. If your content is good, users can reward it with ‘sugars’ (votes); the more you get, the better. Plus, your submitted content can become ‘hot’ and make the first page. Below is an example of one of my SEO posts that became hot, resulting in a traffic increase as well as additional comments:
[Screenshot: one of my SEO posts that went ‘hot’ on BizSugar]
Pros – easy-to-use platform, you can write your own meta descriptions to make submissions more engaging for readers, instant user interactions (votes, comments)
Cons – a fully completed profile and community engagement are a must, otherwise you face limits on the number of posts you can submit in a day; also bear in mind the site accepts business-related news only
7. Sharebloc
Sharebloc is very similar to BizSugar in that you can curate useful content and get votes. It's quite a new platform, but it has great potential to grow.
Pros – simple and easy-to-use platform; as it's still new, you have a great opportunity to establish yourself as an expert early
Cons – fairly new platform with a smaller user base than BizSugar
8. Bundlr
Bundlr is another site that's fairly new on the block. I've only explored it recently; however, it's a very simple, easy-to-use platform that lets you collect content on your favourite topics – photos, videos and more.
If done correctly, social bookmarking can still be a very effective tactic for driving traffic in today's post-Penguin world. However, don't forget that relevancy and regular activity are key.
How about you?
What experience do you have with social bookmarking? What other social bookmarking and content curation sites do you find useful and effective in the post-Penguin era? I would love to hear your suggestions and thoughts.
SEO Guide for Beginners
You hear the term all the time, but how do you actually rank higher in the search engines? I know when I first heard the term, it sounded like some voodoo magic that only a few people understood how to use.
The reality is, SEO isn’t rocket science. Some gurus would have you believe it takes years of dedicated study to understand it, but I don’t think that’s true. Sure, mastering the subtle nuances takes time, but the truth is that you can learn the fundamentals in just a few minutes.
So, I got to thinking, “Why don’t I lay out the basics, all in one post?”
It’s a long one, to be sure, but after years of studying SEO and working behind the scenes to help companies get first page rankings, I’m convinced this is all you need to know. If you are looking to boost your traffic so that you can increase your sales, just follow these basic guidelines.
The Traffic Trap (and How SEO Really Works)
Lots of marketers make the mistake of seeing SEO only as a source of free traffic. It’s true, free traffic is the end result, but it’s not how SEO works.
The real purpose of SEO is to help people who are looking for you find you. To do that, you have to match the content on your website to what people are trying to find.
For example:
Mary sells custom knitted sweaters. On her blog, she shows how she makes the sweaters by hand, often talking about the different yarns she uses. There’s not much competition for keywords relating to yarn, and Mary is publishing lots of great content about it, so before long, she has front page rankings for several different types of yarn.
Do you see the potential problem?
The people searching for yarn most likely knit themselves, and it’s unlikely they’ll be interested in purchasing Mary’s sweaters. She’ll get lots of traffic, sure, but none of the traffic will convert, because the visitors have completely different goals.
The lesson here: if you want SEO to work for you, you need to make sure your goals match the goals of your visitors. It’s not about traffic. It’s about figuring out what you want, and then optimizing for keywords that bring in visitors who want the same things.
How do you discover what those keywords are?
Simple: research.
Research: How to Find the Right Keywords
Sure, research is a little tedious, but it’s an indispensable part of finding the right keywords. You want to uncover keywords that:
Have a high search volume (people are looking for the keywords)
Have low competition (fewer competing results mean a better chance of ranking higher)
Are supported by your content (the keywords are relevant to your site).
There are lots of tools to aid you in finding the right keywords, the most popular being Google’s Search-Based Keyword Tool. It provides results based on actual Google searches, and if you are logged into an AdWords account, it will also give you a list of keyword ideas customized to the site on the account.
Before you get too far though, let’s discuss an important concept for deciding how broad or narrow you want your keywords to be. It’s called, “The Long Tail.”
The Long Tail
Popularized by Chris Anderson, the Long Tail describes a phenomenon where lots of low traffic keywords can collectively send you more visitors than a few high-traffic keywords.
For example, although Amazon may get thousands of visits from the keyword “DVD,” they get millions of visits from all of the individual DVD titles (i.e., Dark Knight, Toy Story, etc.). Individually, none of those titles get anywhere close to the traffic of a term like, “DVD,” but collectively, their volume is a lot larger than any one keyword.
How does the long tail apply to you?
When you combine them all, your long tail (unpopular) keywords should make up roughly 80% of your traffic. So, when you’re researching keywords, don’t just focus on the ones getting massive amounts of traffic. Take note of some of the less popular ones too, and then incorporate them into your overall strategy.
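To make that concrete with purely hypothetical numbers: a single head term such as “knitted sweaters” might send you 1,000 visits a month, while 200 long-tail phrases like “chunky merino cardigan pattern” might send only 20 visits each – yet together those long-tail phrases add up to 4,000 visits, four times the traffic of the big keyword.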
Crafting Your Content
After you pick the right keywords, it’s important to start crafting your content.
Search engines have bots that automatically crawl your website, “reading” it to find out what it’s about and then deciding which keywords each of your pages should rank for. You can influence their “decisions” by strategically optimizing your content for certain keywords.
This is especially true if you're creating content bots can't read. It's easy for bots to interpret text, but they aren't advanced enough yet to watch videos, look at images, or listen to audio. You'll need to describe them so the bots can understand and rank your pages for the appropriate keywords.
One quick word of warning, though.
Writing solely for search engines usually makes your content boring, and typically, that won't help convert your visitors into customers. It's far better to focus on people first, making your content as easy to read as possible, and then optimize for search engine bots where you can, without sacrificing the persuasiveness of your content.
Pay attention to:
Titles – Create eye-catching titles that raise the reader’s interest. You only have one chance to make a great first impression.
Keywords – Pick keywords that will help bring people to your site and are relevant.
Links – Link to quality sites that complement what your website is about. It'll encourage sites in your niche to link to you as well.
Quality – Try to publish unique and quality content. This prompts users to come to your site because they cannot easily find the content elsewhere.
Freshness – If you are publishing content that does not age or become outdated, that’s great, but you also need to add new content on a regular basis. If you don’t have the time to add content to your website, consider adding a question and answer section or a blog to your website.
And most importantly, do not publish someone else’s content on your site. This creates duplicate content, and search engines can penalize you for it.
Optimizing Your Code
Search engine bots don’t just read your website’s text. They also read your website’s code.
With that in mind, there are eight different sections of your code you need to optimize. To help demonstrate these points, I am going to use examples from zeldman.com and stuffandnonsense.co.uk, two popular web designers that take different approaches in their site markup.
Title Tags
Title tags encase the title of your site. To demonstrate, this is the code from zeldman.com:
<title> Jeffrey Zeldman Presents The Daily Report</title>
Here, Zeldman puts the emphasis on his name and the name of the site. If you wanted to find it in the search engines, you would probably search for, “Jeffrey Zeldman” or “the Daily Report.”
Let’s take a look at the other site:
<title>Fantastic web site design in Flintshire, North Wales from Stuff and Nonsense</title>
Stuffandnonsense.co.uk took a different approach. By putting the site name at the end, they emphasize what the website is about. You'd most likely find them by searching for, “web design in Flintshire, North Wales,” or a variation thereof.
The bottom line: when coding your title tags, make sure keywords are in the title. To further maximize search engine results, each page should have a unique title tag.
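As a quick, made-up illustration (borrowing Mary's knitting shop from earlier – these are not real pages), two pages on the same site could use titles like:
<title>Hand-Knitted Merino Sweaters | Mary's Knits</title>
<title>How to Choose the Right Yarn Weight | Mary's Knits</title>
Each page keeps the brand name but leads with its own keywords, so no two pages compete for the same title.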
Meta Tags
The main meta tag you should be concerned with is the “meta description” tag. It doesn't have much of an impact on your search engine ranking, but it tells visitors what your site is about, so it can have a big impact on whether they decide to click through or not.
Let’s take a look at some examples:
<meta name="description" content="Web design insights since 1995. Personal site of Jeffrey Zeldman, publisher of A List Apart Magazine, founder of Happy Cog Studios, co-founder of The Web Standards Project, co-founder of the Event Apart design conference, author of Designing With Web Standards." />
<meta name="description" content="Looking for fantastic web site design in North Wales? Stuff and Nonsense are world renowned web designers based in North Wales." />
Can you spot the keywords Zeldman.com and stuffandnonsense.co.uk emphasize?
Zeldman was very thorough in mentioning his other projects. If you do a Google search for “Zeldman,” zeldman.com comes up first. Happy Cog and A List Apart also show up. If you have multiple online interests, you might want to take Zeldman's approach and work them into your description meta tag.
Stuff and Nonsense emphasizes the type of visitor who should visit their site. By asking the question, “Looking for fantastic website design in North Wales?” they make it crystal clear that it’s a site built for people looking for web design. If you’re one of those people, it would probably stand out to you.
When creating meta tag descriptions, make sure your keywords are in your description, using full sentences. Don’t make the description too long, though, or it might get cut off. If possible, also try to make each page have a unique meta description.
Headings
Headings work much like headings in a book, but they come in a specific order: H1, H2, H3, H4, and so on, with H1 serving as the page's main heading. The remaining heading tags mark progressively lower-level headings on the page.
For example:
<h1>How to Optimize Your Business for Search Engines</h1>
<h2>The ABCs of SEO</h2>
<h3>Research</h3>
Note the pattern. The more specific your content becomes, the higher the number of the heading.
Generally, there should only be one H1 tag on each page, and you can have as many H2s, H3s, and H4s as needed. Also, make sure your headings contain keywords and are relevant to the content on your website.
Sitemaps
Sitemaps are like a roadmap for search engines. They give bots directions to all of the different pages on your website, making sure they find everything.
There are two types of sitemaps you can create: HTML sitemaps and XML sitemaps. The main difference is that XML sitemaps are coded specifically for search engines to read, while HTML sitemaps are easy for people to read too. You can link to an HTML sitemap from your site, giving visitors an overview of everywhere they can go.
If you have fewer than a few hundred pages, you should place a link to each page in your HTML sitemap. If your web site has a few thousand pages or more, just link to the most important pages.
XML sitemaps, on the other hand, contain every page of your web site, even if your web site has a million pages. You can use tools like the XML Sitemap Creator to automatically create a sitemap for you. Once your XML sitemap is created, you then want to submit it to Google Webmaster Central and Bing so that the major search engines can crawl and index your web site.
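For reference, here is a minimal sketch of an XML sitemap in the standard sitemaps.org format – the example.com URLs and dates are placeholders, and a generator tool will produce the equivalent for your own pages:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/seo-guide-for-beginners/</loc>
  </url>
</urlset>
Each <loc> entry is just the full URL of a page you want crawled; <lastmod> is optional but helps search engines spot updated pages.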
Domain Name
Domain names that contain your keywords tend to rank better for those terms than domains without them, and exact-match domains have historically done even better.
But there's a cost: exact-match domains aren't very unique. The reason you see so many companies use made-up words for their domain name is that you can build a brand around it instead of fighting an existing meaning.
Which is better?
It depends.
If your traffic comes purely from search engines, then using an exact-match domain name may be a smart decision for you. For example, Diamonds.com and Hotels.com tend to rank well for "diamonds" and "hotels" because their domain names are keyword rich.
If SEO is only a small part of your strategy, however, go with something more unique. A decade ago, no one was searching for “Google,” but now it’s a huge brand. The same goes for sites like Zappos and Zillow.
URL Structure
URLs are another important but often overlooked part of SEO.
If your URLs are messy, search engines will have a hard time crawling them; pages that can't be crawled can't be indexed, and pages that aren't indexed can't rank in the search engines.
Keep these factors in mind to make your URLs more search engine friendly (a quick before-and-after example follows this list):
URLs should not contain extraneous characters ( $ @ ! * % = ? )
Shorter URLs typically rank better than longer ones
Stick to letters and numbers in URLs.
Do not use underscores. Search engines prefer dashes.
Sub-domains can sometimes rank better than sub-directories.
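To illustrate with a made-up example.com address, compare:
https://www.example.com/index.php?id=378&cat=knitting_patterns
https://www.example.com/knitting-patterns/merino-sweaters/
The second version is shorter, readable, swaps underscores for dashes and contains the keywords the page is actually about.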
Site Structure
The way you link web pages together will make a big impact on your rankings. Here are some tips when cross-linking your web site:
Links within your content tend to carry more weight than links within a sidebar or footer.
Try to keep the number of links on each page under 100.
No-follow outgoing links that are not relevant or do not point to quality content – for example, links to a FeedBurner page.
Other SEOs also talk about no-following internal links, such as links to their terms of service, but PageRank sculpting does not work anymore. If you want to block pages such as your terms of service, the best way to do it is to exclude them in your robots.txt file (see the sketch just after this list).
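As a minimal sketch (the URLs and paths here are placeholders, not real pages), a no-followed outgoing link and a robots.txt rule blocking a terms-of-service page would look like this:
<a href="https://feeds.example.com/mysite" rel="nofollow">Subscribe to my feed</a>
User-agent: *
Disallow: /terms-of-service/
The rel="nofollow" attribute tells search engines not to pass authority through that particular link, while the Disallow line in robots.txt keeps crawlers away from the page entirely.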
Alt Tags
For search engine bots to properly index images, an alt tag with a brief description needs to be added to each image. For example, if there was an image of a “blue widget”, I would tell the search engine that the image is a blue widget by using an alt tag. It would look something like this:
<img src="https://seotopbox.com/images/bluewidget.jpg" alt="blue widget" />
In addition, make sure your image names are relevant to the image. The picture of the blue widget would be called bluewidget.jpg instead of image3.jpg.
Links
Links are perhaps the most important part of SEO. The more web sites that link to your web site, the higher your web pages will rank.
The reason links carry so much weight in SEO is that it is easy for anyone to do research, modify their content, or create content, but it is hard to convince hundreds or thousands of web sites to link to you. In the eyes of a search engine, the more trustworthy, non-spammy sites link to you, the more authority you must have on the topic.
Before we get into how to build links, here are some things you need to know. In general:
Links within content are more effective than links in a sidebar or footer
Links from related sites are better than links from non-relevant sites
Anchor text plays the most important role in link building. If you want to rank for “blue widget”, then you want the anchor text of the link to be “blue widget” (see the example below).
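For example, reusing the blue widget page from the alt tag section above (the URL is a placeholder), the ideal link would look like this:
<a href="https://www.example.com/blue-widgets/">blue widget</a>
The words inside the link – the anchor text – tell search engines which term the linked page should rank for.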
Here are some things to avoid:
Links from spammy or irrelevant sites.
Site-wide links can hurt more than they help.
If all of your links use keyword-rich anchor text, it can hurt you.
Reciprocal links (I link to you and you link to me) are not too effective.
If you buy text links and get caught, you can get banned from a search engine.
Here are a few ways you can increase your link count:
Social media – getting featured on sites like Digg or StumbleUpon doesn't just drive a ton of traffic. The increase in visibility also improves your chances of getting linked to.
Directories – There are many directories on the web. Take the time to submit your web site to the ones that complement your content.
The top 100 – If you want to rank for a specific keyword, the best links you can get are from sites that already rank in the top 100 search results for that keyword. Granted, some of those sites are your competitors and will not link to you, but others are not competing with you, and you can always send them a friendly email asking for a link.
Forums – Many forums allow you to create signatures, in which you can link back to your web site. As long as those links are not no-followed, they will help with your rankings.
Competition – The easiest way to get links is to see who links to your competition and write them an email explaining the benefits of your web site compared to your competitor's. Roughly 5% of the web sites you email will add your link as well.
Dead links – There are billions of links on the web, so expect a good portion of them to die over time. Web sites go down, yet many of the links pointing to them stay live. If you email those linking sites to point out the dead link and mention that your content is similar, there is a good chance they will replace the dead link with one pointing to your website.
Conclusion
If you implement all of the advice here, your traffic from search engines will increase.
Just be patient. It takes time for search engines to update their records, as they have to crawl billions of websites.
Also, note that it will take time to figure out what works for your site. What works for site A might not work for site B. There aren’t any shortcuts. If you do anything shady to speed things up, eventually you will get caught and punished. It’s never worth it.
A better approach?
1. Figure out what people are looking for
2. Create a site that gives it to them
3. Optimize for search engines, so they help people find you
It’s not just smart SEO. It’s what search engines want you to do.
Ultimately, their goal is to have the best websites for every given topic show up at the top. So if you work hard to create the best website, and then promote it effectively, eventually they will catch up.
Just keep the above points in mind to help guide you. It takes time, and it’s a lot of hard work, but if you stick with it, it does pay off.