bambiguertinus
Bambi Guertin
Hi, I am Bambi Guertin, 33 years old, from New Jersey, NJ, USA, working as an SEO and website designing expert. Here I am sharing tips about it.
bambiguertinus · 6 years ago
SEO writing guide: From keyword to content brief
If content is queen, and SEO plays the critical role of bridging content and search to drive growth, then there’s no question as to whether keyword research is important.
However, connecting the dots to create content that ranks well can be difficult. What makes it so difficult? How do you go from a target keyword phrase to an article that is unique, comprehensive, covers all the major on-page SEO elements, touches the reader, and isn’t structured like the “oh-so-familiar” generic SEO template?
There’s no one-size-fits-all approach! However, there is a simple way to support any member of your editorial, creative writing, or content team in shaping up what they need in order to write SEO-friendly content, and that’s an SEO content brief.
Key benefits of a content brief:
Productivity and efficiency – A content brief clearly outlines expectations for the writer, resulting in fewer revisions
Alignment – Writers understand the intent and goals of the content
Quality – Reduces garbage in, garbage out.
So the rest of this article will cover how we actually get there & we’ll use this very article as an example:
Keyword research
Topical expansion
Content/SERP (search engine results page) analysis
Content brief development
Template and tools
Any good editor will tell you great content comes from having a solid content calendar with topics planned in advance for review and release at a regular cadence. To support topical analysis and themes as SEOs we need to start with keyword research.
Start with keyword research: Topic, audience, and objectives
The purpose of this guide isn’t to teach you how to do keyword research. It’s to set you up for success in taking the step beyond that and developing it into a content brief. Your primary keywords serve as your topic themes, but they are also the beginning makings of your content brief, so try to ensure you:
Spend time understanding your target audience and aligning their goals to your keywords. Many call this keyword intent mapping. Rohan Ayyar provides an excellent guide to matching keywords to intent in his article, ‘How to move from keyword research to intent research’.
Do the keyword research in advance; it gives writers and editors the freedom to move things around and line topics up with what’s trending.
How does all this help in supporting a content brief?
You and your team can get answers to the key questions mentioned below.
What will they write about? Primary keywords serve as the topic in your content brief.
Who is the intended audience? Keyword intent helps unearth what problem the user is trying to solve, helping us understand who they are, and what they need.
Now with keywords as our guide to overall topical themes, we can focus on the next step, topical expansion.
Topical expansion: Define key points and gather questions
Writers need more than keywords. They require insight into the pain points of the reader, the key areas of the topic to address, and most of all, what questions the content should answer. This too will go into your content brief.
We’re in luck as SEOs because there is no shortage of tools that allow us to gather this information around a topic.
For example, let’s say this article focuses on “SEO writing”. There are a number of ways to expand on this topic.
Using a tool like SEMrush’s topic research tool, you can take your primary keyword (topic) and get expanded/related topics, a SERP snapshot, and questions in a single view. I like this because it covers what many other tools do separately. Ultimately it supports both content expansion & SERP analysis at the same time.
Use keyword suggestion tools like KeywordTool.io or Ubersuggest to expand the terms combined with Google search results to quickly view potential topics.
Use Answerthepublic.com to get expanded terms and inspirational visuals.
Alternatively, Ann Smarty offers up four additional tools, for related keywords to support topical expansion.
You’ve taken note of what to write about, and how to cover the topic fully. But how do we begin to determine what type of content and how in-depth it should be?
Content and SERP analysis: Specifying content type and format
Okay, so we’re almost done. We can’t tell writers to write unique content if we can’t specify what makes it unique. Reviewing the competition and what’s being displayed consistently in the SERP is a quick way to assess what’s likely to work. You’ll want to look at the top ten results for your primary topic and collect the following:
Content type – Are the results skewed towards a specific type of content? (For example, in-depth articles, infographics, videos, or blog posts)
Format – Is the information formatted as a guide? A how-to? Maybe a list?
Differentiation points – What stands out about the top three results compared to the rest?
Content brief development: Let’s make beautiful content together
Now you’re ready to prepare your SEO content brief which should include the following:
Topic and objective – Your topic is your primary keyword phrase. Your objective is what this content is supposed to accomplish.
Audience and intent – Based on your keyword intent mapping, describe who the article is meant to reach.
Topical coverage – Top three related keyword phrases from your topical expansion.
Questions to answer – Top three to five from topical expansion findings. Ensure they support your related keyword phrases as well.
Voice, style, tone – Use an existing content/brand style guide.
Content type and format – Based on your SERP analysis.
Content length – Based on SERP Analysis. Ensure you’re meeting the average across the top three results based on content type.
Deadline – This is only pertinent if you are working solo; otherwise, consult/lean on your creative team lead.
[Note: If/when using this internally, consider making it part of the content request process, or a template for the editorial staff. When using it externally, be sure to include where the content will be displayed, the format/output, and any specialty editorial guidance.]
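If you handle a lot of brief requests, it can also help to capture the same fields as structured data so briefs can be generated, validated, or exported programmatically. Here is a minimal sketch in Python; the field names and sample values are illustrative only and are not taken from the downloadable template:

```python
# Illustrative only: the brief fields described above captured as a plain dict.
content_brief = {
    "topic": "SEO writing",  # primary keyword phrase
    "objective": "Show how to turn keyword research into a content brief",
    "audience": "In-house SEOs and content editors",
    "topical_coverage": ["seo content brief", "keyword intent mapping", "serp analysis"],
    "questions_to_answer": [
        "What goes into an SEO content brief?",
        "How do you pick the questions the content should answer?",
        "How long should the content be?",
    ],
    "voice_style_tone": "Existing brand/content style guide",
    "content_type_and_format": "In-depth how-to article",
    "content_length": "Match the average of the top three ranking results",
    "deadline": "2019-05-01",
}
```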
Template and tools
Want to take a shortcut? Feel free to download and copy my SEO content brief template; it’s a Google doc.
Other content brief templates/resources:
https://izea.com/2018/05/02/content-marketing-brief/
https://contentmarketinginstitute.com/2016/12/content-strategy-kit-marketers/content-brief-example/
If you want to streamline the process as a whole, MarketMuse provides a platform that handles the keyword research and topic expansion, provides the questions, and manages the entire workflow. It even allows you to request a brief, all in one place.
I only suggest this for larger organizations looking to scale, as there is an investment involved. You’d likely also have to do some work to integrate it into your existing processes.
Jori Ford is Sr. Director of Content & SEO at G2Crowd. She can also be found on Twitter @chicagoseopro.
The post SEO writing guide: From keyword to content brief appeared first on Search Engine Watch.
from Digital Marketing News https://searchenginewatch.com/2019/04/16/seo-writing-guide-from-keyword-to-content-brief/
bambiguertinus · 6 years ago
Top advanced YouTube SEO tips to boost your video performance
YouTube is not just a social media platform. It’s a powerful search engine for video content. Here’s how to make the most of its SEO potential.
More than 1.9 billion people use YouTube every month, and they spend over a billion hours watching videos every day. This means that there is a big opportunity for brands, publishers, and video creators to expand their reach.
Search optimization is not just for your site’s content. YouTube has its own SEO best practices, and it’s good to keep up with the most important ones that can improve your rankings.
How can you improve your SEO on YouTube? We’ve organized our advanced YouTube SEO tactics into three key areas:
Keyword research
Content optimization
Engagement
Advanced YouTube SEO tips to drive more traffic and improved rankings
Keyword research
Creating the right content isn’t enough if new viewers never actually watch it. Keywords help you connect your video with the best words to describe it.
They make it easier for viewers to discover your content, and they also help search engines match your content to relevant search queries.
Video keyword research can help you discover new content opportunities while also improving your SEO.
A quick way to find popular keywords for the content you have in mind is to start searching on YouTube’s search bar. The auto-complete feature will highlight the most popular keywords around your topic. You can also perform a similar search in Google to come up with more suggestions for the best keywords.
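If you want to script this step, one common, if unofficial, approach is to query the public suggest endpoint that powers the autocomplete box. Here is a minimal Python sketch; note that this endpoint is undocumented, so it may change or be rate limited at any time:

```python
import requests

def youtube_suggestions(seed_keyword):
    """Return YouTube autocomplete suggestions for a seed keyword."""
    # Unofficial, undocumented endpoint behind the autocomplete box; may change without notice.
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "ds": "yt", "q": seed_keyword},
        timeout=10,
    )
    resp.raise_for_status()
    # The response is a JSON array: [query, [suggestion1, suggestion2, ...]]
    return resp.json()[1]

print(youtube_suggestions("youtube seo"))
```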
If you’re serious about keyword research and need to find new ideas, you can use additional online tools that will provide you with a list of keywords to consider.
When it comes to picking the best keywords, you don’t need to aim for the most obvious choice. You can start with the keywords that are low in competition and aim to rank for them.
Moreover, it’s good to keep in mind that YouTube transcribes all your videos. If you want to establish your focus keywords, include them in the video itself by mentioning them as you speak. This way you’re helping YouTube understand the contextual relevance of your content along with your keywords.
Recap
Use the auto-complete search function to find popular keywords
Perform a Google search for more keyword ideas
You can even use SEO tools for additional keyword ideas
Say your keywords as part of your videos
Content optimization
There are many ways to improve the optimization of your content and here are some key tips to keep in mind:
1. Description
Your description should facilitate the search for relevant content. A long description helps you provide additional context to your video. It can also serve as an introduction to what you’re going to talk about. As with blog posts, a longer description can grant you the space to expand your thoughts. Start treating your videos as the starting point and add further details about them in the description. If your viewers are genuinely interested in your videos then they will actually search for additional details in your description.
2. Timestamp
More creators are adding timestamps to their descriptions. This is a great way to improve user experience and engagement. You are helping your viewers find exactly what they are looking for, which increases the chances that they’ll keep coming back.
3. Title and keywords
Keywords are now clickable in titles. This means that you can increase the chances of boosting your SEO by being more creative with your titles. Be careful not to create content just for search engines, though; always start by creating content that your viewers would enjoy.
4. Location
If you want to tap into local SEO, it’s a good idea to include your location in your video’s copy. For videos targeting local viewers, this is a great starting point for your SEO strategy.
5. Video transcripts
Video transcripts make your videos more accessible. They also make it easier for search engines to understand what the video is about. Think of the transcript as the process that makes the crawling of your content easier. There are many online options to create your video transcripts so it shouldn’t be a complicated process to add them to your videos.
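If you need a starting point for transcripts, third-party tooling can pull YouTube’s auto-generated captions for you to edit. Here is a minimal sketch using the community youtube-transcript-api package (a third-party library, not an official YouTube API; install it with pip install youtube-transcript-api):

```python
from youtube_transcript_api import YouTubeTranscriptApi

def fetch_transcript_text(video_id):
    # get_transcript returns a list of {"text", "start", "duration"} segments for the video's captions.
    segments = YouTubeTranscriptApi.get_transcript(video_id)
    return " ".join(segment["text"] for segment in segments)

# Any public video ID with captions works here; this ID is just an example.
print(fetch_transcript_text("dQw4w9WgXcQ"))
```

The output still needs a human edit, but it gives you something to review and publish rather than transcribing from scratch.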
Engagement
Engagement keeps gaining ground when it comes to YouTube SEO. It’s not enough to count the number of views if your viewers are not engaging with your content. User behavior helps search engines understand whether your content is useful or interesting to your viewers, so they can rank it accordingly.
Thus, it’s important to pay attention to these metrics:
Watch time: The time that your viewers are spending on your video is a good indicator of its appeal and relevance to them.
Likes, comments, and shares: The starting point of measuring engagement is to track the number of likes, comments, and shares on your videos. They are no longer the only engagement metrics, but they can still serve as a good indication of what counts as popular content. Likes may be easier to achieve, but comments and, most importantly, shares can skyrocket the engagement and views of your videos. It’s not a bad idea to encourage your viewers to support your work; it is actually a common tactic. However, make sure that you’re not trying too hard, as this is not appreciated. Every call-to-action needs to feel natural in your videos.
Subscribers after watching a video: The number of subscribers serves as an indication of your channel’s popularity. People who actually subscribe to your channel after watching a video are a very good indication of your content’s engagement.
CTR: The click-through rate (CTR) is the number of clicks your video receives divided by its impressions, the number of times it’s shown. For example, 50 clicks on 1,000 impressions is a 5% CTR. If you optimize your content to show up high in rankings but it still doesn’t get many clicks, it means that viewers don’t find it appealing enough to click on. This may not be related to the quality of your content, but to the first impression it gives. You can improve the CTR by paying attention to your title and your thumbnail. Bear in mind that YouTube does not encourage you to clickbait your viewers, so you shouldn’t create misleading titles or thumbnails if you want to aim for higher rankings in the longer term.
Learning from the best
A good tip for understanding YouTube SEO is to learn from the best by looking at the current most popular videos. You can also search for topics that are relevant to your channel to spot how your competitors optimize their titles and keywords, and how thumbnails and descriptions can make it easier to click on one video over another.
Have any queries or tips to add to these? Share them in the comments.
The post Top advanced YouTube SEO tips to boost your video performance appeared first on Search Engine Watch.
from Digital Marketing News https://searchenginewatch.com/2019/04/22/top-advanced-youtube-seo-tips-to-boost-your-video-performance/
bambiguertinus · 6 years ago
Using Python to recover SEO site traffic (Part three)
When you incorporate machine learning techniques to speed up SEO recovery, the results can be amazing.
This is the third and last installment of our series on using Python to speed up SEO traffic recovery. In part one, I explained how our unique approach, which we call “winners vs losers”, helps us quickly narrow down the pages losing traffic to find the main reason for the drop. In part two, we improved on our initial approach by manually grouping pages using regular expressions, which is very useful when you have sites with thousands or millions of pages, as is typically the case with ecommerce sites. In part three, we will learn something really exciting. We will learn to automatically group pages using machine learning.
As mentioned before, you can find the code used in part one, two and three in this Google Colab notebook.
Let’s get started.
URL matching vs content matching
When we grouped pages manually in part two, we benefited from the fact that the URL groups had clear patterns (collections, products, and so on), but it is often the case that there are no patterns in the URL. For example, Yahoo Stores’ sites use a flat URL structure with no directory paths. Our manual approach wouldn’t work in this case.
Fortunately, it is possible to group pages by their contents, because most page templates have different content structures. They serve different user needs, so their structures need to differ.
How can we organize pages by their content? We can use DOM element selectors for this. We will specifically use XPaths.
For example, I can use the presence of a big product image to know the page is a product detail page. I can grab the product image address in the document (its XPath) by right-clicking on it in Chrome and choosing “Inspect,” then right-clicking to copy the XPath.
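As a rough illustration of that idea (not the article’s exact code), here is a minimal Python sketch that tests whether a page matches an XPath copied from Chrome; the selector string is a hypothetical example, not from a real site:

```python
import requests
from lxml import html

PRODUCT_IMAGE_XPATH = '//*[@id="main"]/div[1]/div[2]/img'  # hypothetical selector copied via "Inspect"

def looks_like_product_page(url):
    """Heuristic: treat the page as a product detail page if the product-image XPath matches."""
    response = requests.get(url, timeout=10)
    tree = html.fromstring(response.content)
    return len(tree.xpath(PRODUCT_IMAGE_XPATH)) > 0
```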
We can identify other page groups by finding page elements that are unique to them. However, note that while this would allow us to group Yahoo Store-type sites, it would still be a manual process to create the groups.
A scientist’s bottom-up approach
In order to group pages automatically, we need to use a statistical approach. In other words, we need to find patterns in the data that we can use to cluster similar pages together because they share similar statistics. This is a perfect problem for machine learning algorithms.
BloomReach, a digital experience platform vendor, shared their machine learning solution to this problem. To summarize it, they first manually selected clean features from the HTML tags, like class IDs, CSS stylesheet names, and so on. Then, they automatically grouped pages based on the presence and variability of these features. In their tests, they achieved around 90% accuracy, which is pretty good.
When you give problems like this to scientists and engineers with no domain expertise, they will generally come up with complicated, bottom-up solutions. The scientist will say, “Here is the data I have, let me try different computer science ideas I know until I find a good solution.”
One of the reasons I advocate practitioners learn programming is that you can start solving problems using your domain expertise and find shortcuts like the one I will share next.
Hamlet’s observation and a simpler solution
For most ecommerce sites, most page templates include images (and input elements), and those generally change in quantity and size.
I decided to test the quantity and size of images, plus the number of input elements, as my feature set. We were able to achieve 97.5% accuracy in our tests. This is a much simpler and more effective approach for this specific problem. All of this is possible because I didn’t start with the data I could access, but with a simpler domain-level observation.
I am not trying to say my approach is superior, as they have tested theirs in millions of pages and I’ve only tested this on a few thousand. My point is that as a practitioner you should learn this stuff so you can contribute your own expertise and creativity.
Now let’s get to the fun part and write some machine learning code in Python!
Collecting training data
We need training data to build a model. This training data needs to come pre-labeled with “correct” answers so that the model can learn from the correct answers and make its own predictions on unseen data.
In our case, as discussed above, we’ll use our intuition that most product pages have one or more large images on the page, and most category type pages have many smaller images on the page.
What’s more, product pages typically have more form elements than category pages (for filling in quantity, color, and more).
Unfortunately, crawling a web page for this data requires knowledge of web browser automation, and image manipulation, which are outside the scope of this post. Feel free to study this GitHub gist we put together to learn more.
Here we load the raw data already collected.
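The raw data itself comes from the crawling gist linked above. Here is a minimal sketch of the loading step, assuming the crawl output was saved to two CSV files (the file and column names below are placeholders, not the article’s actual ones):

```python
import pandas as pd

# Placeholder file names; the real data is produced by the crawling gist.
form_counts = pd.read_csv("form_counts.csv")  # one row per URL: counts of form and input elements
img_counts = pd.read_csv("img_counts.csv")    # one row per image: URL, file size, height, width

print(form_counts.head())
print(img_counts.head())
```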
Feature engineering
Each row of the form_counts data frame above corresponds to a single URL and provides a count of both form elements, and input elements contained on that page.
Meanwhile, in the img_counts data frame, each row corresponds to a single image from a particular page. Each image has an associated file size, height, and width. Pages are more than likely to have multiple images on each page, and so there are many rows corresponding to each URL.
It is often the case that HTML documents don’t include explicit image dimensions. We are using a little trick to compensate for this. We are capturing the size of the image files, which would be proportional to the multiplication of the width and the length of the images.
We want our image counts and image file sizes to be treated as categorical features, not numerical ones. When a numerical feature, say new visitors, increases, it generally implies improvement, but we don’t want bigger images to imply improvement. A common technique to do this is called one-hot encoding.
Most site pages can have an arbitrary number of images. We are going to further process our dataset by bucketing images into 50 groups. This technique is called “binning”.
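A minimal sketch of both steps with pandas, assuming placeholder column names (url, img_size_bytes) rather than the article’s actual ones:

```python
import pandas as pd

# Binning: bucket image file sizes into 50 groups.
img_counts["size_bin"] = pd.cut(img_counts["img_size_bytes"], bins=50)

# One-hot encoding: turn each bucket into its own categorical column.
size_dummies = pd.get_dummies(img_counts["size_bin"], prefix="size")

# Aggregate back to one row per URL: how many of the page's images fall into each size bucket.
img_features = pd.concat([img_counts[["url"]], size_dummies], axis=1).groupby("url").sum()
```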
Here is what our processed data set looks like.
Adding ground truth labels
As we already have correct labels from our manual regex approach, we can use them to create the correct labels to feed the model.
We also need to split our dataset randomly into a training set and a test set. This allows us to train the machine learning model on one set of data and test it on another set that it’s never seen before. We do this to prevent our model from simply “memorizing” the training data and doing terribly on new, unseen data. You can check out the full code in the Colab notebook linked above.
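A minimal sketch of the split with scikit-learn, assuming X holds the engineered features per URL and y holds the page-group labels from the part-two regex approach (both variable names are assumptions for illustration):

```python
from sklearn.model_selection import train_test_split

# Hold out 30% of the pages so the model is evaluated on URLs it has never seen.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y
)
```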
Model training and grid search
Finally, the good stuff!
All the steps above, the data collection and preparation, are generally the hardest part to code. The machine learning code is generally quite simple.
We’re using the well-known scikit-learn Python library to train a number of popular models using a bunch of standard hyperparameters (settings for fine-tuning a model). Scikit-learn will run through all of them to find the best one. We simply need to feed the X variables (our feature engineering parameters above) and the Y variables (the correct labels) to each model, call the .fit() function, and voila!
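Here is a minimal sketch of that loop, not the notebook’s exact code, trying two of the models mentioned in the results below with a small hyperparameter grid:

```python
from sklearn.model_selection import GridSearchCV
from sklearn.svm import LinearSVC
from sklearn.linear_model import LogisticRegression

candidates = {
    "linear_svm": (LinearSVC(), {"C": [0.1, 1, 10]}),
    "logistic_regression": (LogisticRegression(max_iter=1000), {"C": [0.1, 1, 10]}),
}

best = {}
for name, (model, params) in candidates.items():
    search = GridSearchCV(model, params, cv=5)  # try every hyperparameter combination with 5-fold CV
    search.fit(X_train, y_train)                # feed in the X and Y variables and call .fit()
    best[name] = (search.best_score_, search.best_estimator_)

print({name: score for name, (score, _) in best.items()})
```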
Evaluating performance
After running the grid search, we find our winning model to be the linear SVM (0.974), with logistic regression (0.968) coming in a close second. Even with such high accuracy, a machine learning model will make mistakes. If it doesn’t make any mistakes, then there is definitely something wrong with the code.
In order to understand where the model performs best and worst, we will use another useful machine learning tool, the confusion matrix.
When looking at a confusion matrix, focus on the diagonal squares. The counts there are correct predictions and the counts outside are failures. In the confusion matrix above we can quickly see that the model does really well labeling products, but terribly on pages that are neither products nor categories. Intuitively, we can assume that such pages do not have consistent image usage.
The code to put together the confusion matrix and to plot the model evaluation is included in the Colab notebook linked above.
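As a quick illustration (again, not the notebook’s exact code), a confusion matrix and its plot can be produced with scikit-learn and matplotlib, continuing the grid-search sketch above:

```python
import matplotlib.pyplot as plt
from sklearn.metrics import confusion_matrix, ConfusionMatrixDisplay

# Predict page groups for the held-out test set with the winning model.
y_pred = best["linear_svm"][1].predict(X_test)
cm = confusion_matrix(y_test, y_pred)

# Diagonal cells are correct predictions; everything off the diagonal is a failure.
ConfusionMatrixDisplay(confusion_matrix=cm).plot()
plt.title("Page-group classification: confusion matrix")
plt.show()
```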
Resources to learn more
You might be thinking that this is a lot of work just to tell page groups apart, and you are right!
Mirko Obkircher commented in my article for part two that there is a much simpler approach, which is to have your client set up a Google Analytics data layer with the page group type. Very smart recommendation, Mirko!
I am using this example for illustration purposes. What if the issue requires a deeper exploratory investigation? If you already started the analysis using Python, your creativity and knowledge are the only limits.
If you want to jump onto the machine learning bandwagon, here are some resources I recommend to learn more:
Attend a PyData event – I got motivated to learn data science after attending the event they host in New York.
Hands-On Introduction To Scikit-learn (sklearn)
Scikit Learn Cheat Sheet
Efficiently Searching Optimal Tuning Parameters
If you are starting from scratch and want to learn fast, I’ve heard good things about Data Camp.
Got any tips or queries? Share them in the comments.
Hamlet Batista is the CEO and founder of RankSense, an agile SEO platform for online retailers and manufacturers. He can be found on Twitter @hamletbatista.
The post Using Python to recover SEO site traffic (Part three) appeared first on Search Engine Watch.
from Digital Marketing News https://searchenginewatch.com/2019/04/17/using-python-to-recover-seo-site-traffic-part-three/
bambiguertinus · 6 years ago
Study: How ready are businesses for voice search?
“So… most businesses know about voice search. But has this knowledge helped them optimize for it?”
An interesting report recently released by Uberall sought to address that exact question. For as much as we talk about the importance of voice search, and even how to optimize for it — are people actually doing it?
In this report, researchers analyzed 73,000 business locations (using the Boston Metro area as their sample set), across 37 different voice search directories, as well as across SMBs, mid-market, and enterprise.
They looked at a number of factors including accuracy of address, business hours, phone number, name, website, and zip code, as well as accuracy across various voice search directories.
The report weights the importance of a listing’s information in a set order, and lists “the 37 most important voice search directories” that it accounted for.
Uberall analysts did note, however, that Google (search + maps), Yelp, and Bing together represent about 90% of the score’s weight.
How ready are businesses for voice search?
The ultimate question. Here, we’ll dive into a few key findings from this report.
1. Over 96% of all business locations fail to list their business information correctly
When looking just at the three primary listings locations (Google, Yelp, Bing), Uberall found that only 3.82% of business locations had no critical errors.
In other words, more than 96% of all business locations failed to list their business information correctly.
Breaking down those 3.82% of perfect business location listings, they were somewhat evenly split across enterprise, mid-market, and SMB, with enterprise having the largest share as one might expect.
2. The four most common types of listing errors
In their analysis, here’s the breakdown of most common types of missing or incorrect information:
Opening hours: 978,305 errors (almost half of all listings)
Website: 710,113 errors (almost one-third of all listings)
Location name: 510,010 errors (almost one-quarter of all listings)
Street: 421,048 errors (almost one-fifth of all listings)
3. Which types of businesses are most likely to be optimized for voice search?
Industries that were found to be most voice search ready included:
Dentists
Health food
Home improvement
Criminal attorneys
Dollar stores
Industries that were found to be least voice search ready included:
Consumer protection organizations
Congressional representatives
Business attorneys
Art galleries
Wedding services
Not much surprise that the most-prepared industries rely heavily on people being able to find their physical locations. Perhaps a bit impressive that criminal attorneys landed so high on the list. Surprising that art galleries ranked second to last, but perhaps this helps explain their decline in traffic of late.
And as ever, we can be expectedly disappointed by the technological savvy of congressional representatives.
What’s the cost of businesses not being optimized for voice search?
The next question, of course, is: how much should we care? Uberall spent a nice bit of their report discussing statistics about the history of voice search, how much it’s used, and its predicted growth.
Interestingly, they also take a moment to fact check the popular “voice will be 50% of all search by 2020” statistic. Apparently, this was taken from an interview with Andrew Ng (co-founder of Coursera, formerly lead at both Google Brain and Baidu) and was originally referring to the growth of a combined voice and image search, specifically via Baidu in China.
1. On average, adults spend 10x more hours on their phones than they did in 2008
This data was compiled from a number of charts from eMarketer, showing overall increase in digital media use from 2008 to 2017 (and we can imagine is even higher now). Specifically, we see how most all of the growth is driven just from mobile.
The connection here, of course, is that mobile devices are one of the most popular devices for voice search, second only perhaps to smart home devices.
2. About 21% of respondents were using voice search every week
According to this study, 21% of respondents were using voice search every week. 57% of respondents said they never used voice search. And about 14% seem to have tried it once or twice and not looked back.
In general, it seems people are a bit polarized — either it’s a habit or it’s not.
Regardless, 21% is a sizable number of consumers (though we don’t have information about how many of those searches convert to purchases).
And it seems the number is on the rise: the recent report from voicebot.ai showed that smart speaker ownership grew by nearly 40% from 2018 to 2019, among US adults.
Overall, the cost of not being optimized for voice search may not be sky high yet. But at the same time, it’s probably never too soon to get your location listings in order and provide accurate information to consumers.
You might also like:
Voice search optimization guide: Six steps for 2019
Yext Brain and the future of conversational AI: Q&A with CMO Jeff Rohrs (ClickZ)
What can we learn from voice search in 2018?
How to optimize your local business for voice search
Voice search and local SEO: How to get started?
The comprehensive guide to voice search keyword research
The post Study: How ready are businesses for voice search? appeared first on Search Engine Watch.
from Digital Marketing News https://searchenginewatch.com/2019/04/18/voice-search-study-uberall/
bambiguertinus · 6 years ago
The Fractured Web
Anyone can argue about the intent of a particular action & the outcome that is derived by it. But when the outcome is known, at some point the intent is inferred if the outcome is derived from a source of power & the outcome doesn't change.
Or, put another way, if a powerful entity (government, corporation, other organization) disliked an outcome which appeared to benefit them in the short term at great lasting cost to others, they could spend resources to adjust the system.
If they don't spend those resources (or, rather, spend them on lobbying rather than improving the ecosystem) then there is no desired change. The outcome is as desired. Change is unwanted.
Engagement is a toxic metric. Products which optimize for it become worse. People who optimize for it become less happy. It also seems to generate runaway feedback loops where most engagable people have a) worst individual experiences and then b) end up driving the product bus.— Patrick McKenzie (@patio11) April 9, 2019
News is a stock vs flow market where the flow of recent events drives most of the traffic to articles. News that is more than a couple days old is no longer news. A news site which stops publishing news stops becoming a habit & quickly loses relevancy. Algorithmically an abandoned archive of old news articles doesn't look much different than eHow, in spite of having a much higher cost structure.
According to SEMrush's traffic rank, ampproject.org gets more monthly visits than Yahoo.com.
That actually understates the prevalence of AMP because AMP is generally designed for mobile AND not all AMP-formatted content is displayed on ampproject.org.
Part of how AMP was able to get widespread adoption was because in the news vertical the organic search result set was displaced by an AMP block. If you were a news site either you were so differentiated that readers would scroll past the AMP block in the search results to look for you specifically, or you adopted AMP, or you were doomed.
Some news organizations like The Guardian have a team of about a dozen people reformatting their content to the duplicative & proprietary AMP format. That's wasteful, but necessary "In theory, adoption of AMP is voluntary. In reality, publishers that don’t want to see their search traffic evaporate have little choice. New data from publisher analytics firm Chartbeat shows just how much leverage Google has over publishers thanks to its dominant search engine."
It seems more than a bit backward that low margin publishers are doing duplicative work to distance themselves from their own readers while improving the profit margins of monopolies. But it is what it is. And that no doubt drew the ire of many publishers across the EU.
And now there are AMP Stories to eat up even more visual real estate.
If you spent a bunch of money to create a highly differentiated piece of content, why would you prefer that high-spend flagship content appear on a third party website rather than your own?
Google & Facebook have done such a fantastic job of eating the entire pie that some are celebrating Amazon as a prospective savior to the publishing industry. That view - IMHO - is rather suspect.
Where any of the tech monopolies dominate they cram down on partners. The New York Times acquired The Wirecutter in Q4 of 2016. In Q1 of 2017 Amazon adjusted their affiliate fee schedule.
Amazon generally treats consumers well, but they have been much harder on business partners with tough pricing negotiations, counterfeit protections, forced ad buying to have a high enough product rank to be able to rank organically, ad displacement of their organic search results below the fold (even for branded search queries), learning suppliers & cutting out the partners, private label products patterned after top sellers, in some cases running pop over ads for the private label products on product level pages where brands already spent money to drive traffic to the page, etc.
They've made things tougher for their partners in a way that mirrors the impact Facebook & Google have had on online publishers:
"Boyce’s experience on Amazon largely echoed what happens in the offline world: competitors entered the market, pushing down prices and making it harder to make a profit. So Boyce adapted. He stopped selling basketball hoops and developed his own line of foosball tables, air hockey tables, bocce ball sets and exercise equipment. The best way to make a decent profit on Amazon was to sell something no one else had and create your own brand. ... Amazon also started selling bocce ball sets that cost $15 less than Boyce’s. He says his products are higher quality, but Amazon gives prominent page space to its generic version and wins the cost-conscious shopper."
Google claims they have no idea how content publishers feel about the trade-off between themselves & the search engine, but every quarter Alphabet publishes the share of ad spend occurring on owned & operated sites versus the share spent across the broader publisher network. And in almost every quarter for over a decade straight that ratio has grown worse for publishers.
When Google tells industry about how much $ it funnels to rest of ecosystem, just show them this chart. It's good to be the "revenue regulator" (note: G went public in 2004). pic.twitter.com/HCbCNgbzKc— Jason Kint (@jason_kint) February 5, 2019
The aggregate numbers for news publishers are worse than shown above as Google is ramping up ads in video games quite hard. They've partnered with Unity & promptly took away the ability to block ads from appearing in video games using googleadsenseformobileapps.com exclusion (hello flat thumb misclicks, my name is budget & I am gone!)
They will also track video game player behavior & alter game play to maximize revenues based on machine learning tied to surveillance of the user's account: "We’re bringing a new approach to monetization that combines ads and in-app purchases in one automated solution. Available today, new smart segmentation features in Google AdMob use machine learning to segment your players based on their likelihood to spend on in-app purchases. Ad units with smart segmentation will show ads only to users who are predicted not to spend on in-app purchases. Players who are predicted to spend will see no ads, and can simply continue playing."
And how does the growth of ampproject.org square against the following wisdom?
If you do use a CDN, I'd recommend using a domain name of your own (eg, https://t.co/fWMc6CFPZ0), so you can move to other CDNs if you feel the need to over time, without having to do any redirects.— John (@JohnMu) April 15, 2019
Literally only yesterday did Google begin supporting instant loading of self-hosted AMP pages.
China has a different set of tech leaders than the United States. Baidu, Alibaba, Tencent (BAT) instead of Facebook, Amazon, Apple, Netflix, Google (FANG). China tech companies may have won their domestic markets in part based on superior technology or better knowledge of the local culture, though those same companies have largely went nowhere fast in most foreign markets. A big part of winning was governmental assistance in putting a foot on the scales.
Part of the US-China trade war is about who controls the virtual "seas" upon which value flows:
it can easily be argued that the last 60 years were above all the era of the container-ship (with container-ships getting ever bigger). But will the coming decades still be the age of the container-ship? Possibly not, for the simple reason that things that have value increasingly no longer travel by ship, but instead by fiberoptic cables! ... you could almost argue that ZTE and Huawei have been the “East India Company” of the current imperial cycle. Unsurprisingly, it is these very companies, charged with laying out the “new roads” along which “tomorrow’s value” will flow, that find themselves at the center of the US backlash. ... if the symbol of British domination was the steamship, and the symbol of American strength was the Boeing 747, it seems increasingly clear that the question of the future will be whether tomorrow’s telecom switches and routers are produced by Huawei or Cisco. ... US attempts to take down Huawei and ZTE can be seen as the existing empire’s attempt to prevent the ascent of a new imperial power. With this in mind, I could go a step further and suggest that perhaps the Huawei crisis is this century’s version of Suez crisis. No wonder markets have been falling ever since the arrest of the Huawei CFO. In time, the Suez Crisis was brought to a halt by US threats to destroy the value of sterling. Could we now witness the same for the US dollar?
China maintains Huawei is an employee-owned company. But that proposition is suspect. Broadly stealing technology is vital to the growth of the Chinese economy & they have no incentive to stop unless their leading companies pay a direct cost. Meanwhile, China is investigating Ericsson over licensing technology.
India has taken notice of the success of Chinese tech companies & thus began to promote "national champion" company policies. That, in turn, has also meant some of the Chinese-styled laws requiring localized data, antitrust inquiries, foreign ownership restrictions, requirements for platforms to not sell their own goods, promoting limits on data encryption, etc.
The secretary of India’s Telecommunications Department, Aruna Sundararajan, last week told a gathering of Indian startups in a closed-door meeting in the tech hub of Bangalore that the government will introduce a “national champion” policy “very soon” to encourage the rise of Indian companies, according to a person familiar with the matter. She said Indian policy makers had noted the success of China’s internet giants, Alibaba Group Holding Ltd. and Tencent Holdings Ltd. ... Tensions began rising last year, when New Delhi decided to create a clearer set of rules for e-commerce and convened a group of local players to solicit suggestions. Amazon and Flipkart, even though they make up more than half the market, weren’t invited, according to people familiar with the matter.
Amazon vowed to invest $5 billion in India & they have done some remarkable work on logistics there. Walmart acquired Flipkart for $16 billion.
Other emerging markets also have many local ecommerce leaders like Jumia, MercadoLibre, OLX, Gumtree, Takealot, Konga, Kilimall, BidOrBuy, Tokopedia, Bukalapak, Shoppee, Lazada. If you live in the US you may have never heard of *any* of those companies. And if you live in an emerging market you may have never interacted with Amazon or eBay.
It makes sense that ecommerce leadership would be more localized since it requires moving things in the physical economy, dealing with local currencies, managing inventory, shipping goods, etc. whereas information flows are just bits floating on a fiber optic cable.
If the Internet is primarily seen as a communications platform it is easy for people in some emerging markets to think Facebook is the Internet. Free communication with friends and family members is a compelling offer & as the cost of data drops web usage increases.
At the same time, the web is incredibly deflationary. Every free form of entertainment which consumes time is time that is not spent consuming something else.
Add the technological disruption to the wealth polarization that happened in the wake of the great recession, then combine that with algorithms that promote extremist views & it is clearly causing increasing conflict.
If you are a parent and you think your child has no shot at a brighter future than your own life, it is easy to be full of rage.
Empathy can radicalize otherwise normal people by giving them a more polarized view of the world:
Starting around 2000, the line starts to slide. More students say it's not their problem to help people in trouble, not their job to see the world from someone else's perspective. By 2009, on all the standard measures, Konrath found, young people on average measure 40 percent less empathetic than my own generation ... The new rule for empathy seems to be: reserve it, not for your "enemies," but for the people you believe are hurt, or you have decided need it the most. Empathy, but just for your own team. And empathizing with the other team? That's practically a taboo.
A complete lack of empathy could allow a psychopath (hi Chris!) to commit extreme crimes while feeling no guilt, shame or remorse. Extreme empathy can have the same sort of outcome:
"Sometimes we commit atrocities not out of a failure of empathy but rather as a direct consequence of successful, even overly successful, empathy. ... They emphasized that students would learn both sides, and the atrocities committed by one side or the other were always put into context. Students learned this curriculum, but follow-up studies showed that this new generation was more polarized than the one before. ... [Empathy] can be good when it leads to good action, but it can have downsides. For example, if you want the victims to say 'thank you.' You may even want to keep the people you help in that position of inferior victim because it can sustain your feeling of being a hero." - Fritz Breithaupt
News feeds will be read. Villages will be razed. Lynch mobs will become commonplace.
Many people will end up murdered by algorithmically generated empathy.
As technology increases absentee ownership & financial leverage, a society led by morally agnostic algorithms is not going to become more egalitarian.
The more I think about and discuss it, the more I think WhatsApp is simultaneously the future of Facebook, and the most potentially dangerous digital tool yet created. We haven't even begun to see the real impact yet of ubiquitous, unfettered and un-moderatable human telepathy.— Antonio García Martínez (@antoniogm) April 15, 2019
When politicians throw fuel on the fire it only gets worse:
It’s particularly odd that the government is demanding “accountability and responsibility” from a phone app when some ruling party politicians are busy spreading divisive fake news. How can the government ask WhatsApp to control mobs when those convicted of lynching Muslims have been greeted, garlanded and fed sweets by some of the most progressive and cosmopolitan members of Modi’s council of ministers?
Mark Zuckerberg won't get caught downstream from platform blowback as he spends $20 million a year on his security.
The web is a mirror. Engagement-based algorithms reinforcing our perceptions & identities.
And every important story has at least 2 sides!
The Rohingya asylum seekers are victims of their own violent Jihadist leadership that formed a militia to kill Buddhists and Hindus. Hindus are being massacred, where’s the outrage for them!? https://t.co/P3m6w4B1Po— Imam Tawhidi (@Imamofpeace) May 23, 2018
Some may "learn" vaccines don't work. Others may learn the vaccines their own children took did not work, as it failed to protect them from the antivax content spread by Facebook & Google, absorbed by people spreading measles & Medieval diseases.
Passion drives engagement, which drives algorithmic distribution: "There’s an asymmetry of passion at work. Which is to say, there’s very little counter-content to surface because it simply doesn’t occur to regular people (or, in this case, actual medical experts) that there’s a need to produce counter-content."
As the costs of "free" become harder to hide, social media companies which currently sell emerging markets as their next big growth area will end up having embedded regulatory compliance costs which will end up exceeding any sort of prospective revenue they could hope to generate.
The Pinterest S1 shows almost all their growth is in emerging markets, yet almost all their revenue is inside the United States.
As governments around the world see the real-world cost of the foreign tech companies & view some of them as piggy banks, eventually the likes of Facebook or Google will pull out of a variety of markets they no longer feel worth serving. It will be like Google did in mainland China with search after discovering pervasive hacking of activist Gmail accounts.
Just tried signing into Gmail from a new device. Unless I provide a phone number, there is no way to sign in and no one to call about it. Oh, and why do they say they need my phone? If you guessed "for my protection," you would be correct. Talk about Big Brother...— Simon Mikhailovich (@S_Mikhailovich) April 16, 2019
Lower friction & lower cost information markets will face more junk fees, hurdles & even some legitimate regulations. Information markets will start to behave more like physical goods markets.
The tech companies presume they will be able to use satellites, drones & balloons to beam in Internet while avoiding messy local issues tied to real world infrastructure, but when a local wealthy player is betting against them they'll probably end up losing those markets: "One of the biggest cheerleaders for the new rules was Reliance Jio, a fast-growing mobile phone company controlled by Mukesh Ambani, India’s richest industrialist. Mr. Ambani, an ally of Mr. Modi, has made no secret of his plans to turn Reliance Jio into an all-purpose information service that offers streaming video and music, messaging, money transfer, online shopping, and home broadband services."
Publishers do not have "their mojo back" because the tech companies have been so good to them, but rather because the tech companies have been so aggressive that they've earned so much blowback which will in turn lead publishers to opting out of future deals, which will eventually lead more people back to the trusted brands of yesterday.
Publishers feeling guilty about taking advertorial money from the tech companies to spread their propaganda will offset its publication with opinion pieces pointing in the other direction: "This is a lobbying campaign in which buying the good opinion of news brands is clearly important. If it was about reaching a target audience, there are plenty of metrics to suggest his words would reach further – at no cost – on Facebook. Similarly, Google is upping its presence in a less obvious manner via assorted media initiatives on both sides of the Atlantic. Its more direct approach to funding journalism seems to have the desired effect of making all media organisations (and indeed many academic institutions) touched by its money slightly less questioning and critical of its motives."
When Facebook goes down direct visits to leading news brand sites go up.
When Google penalizes a no-name me-too site almost nobody realizes it is missing. But if a big publisher opts out of the ecosystem people will notice.
The reliance on the tech platforms is largely a mirage. If enough key players were to opt out at the same time people would quickly reorient their information consumption habits.
If the platforms can change their focus overnight then why can't publishers band together & choose to dump them?
CEO Jack Dorsey said Twitter is looking to change the focus from following specific individuals to topics of interest, acknowledging that what's incentivized today on the platform is at odds with the goal of healthy dialoguehttps://t.co/31FYslbePA— Axios (@axios) April 16, 2019
In Europe there is GDPR, which aimed to protect user privacy, but ultimately acted as a tax on innovation by local startups while being a subsidy to the big online ad networks. They also have Article 11 & Article 13, which passed in spite of Google's best efforts on the scaremongering anti-SERP tests, lobbying & propaganda fronts: "Google has sparked criticism by encouraging news publishers participating in its Digital News Initiative to lobby against proposed changes to EU copyright law at a time when the beleaguered sector is increasingly turning to the search giant for help."
Remember the Eric Schmidt comment about how brands are how you sort out (the non-YouTube portion of) the cesspool? As it turns out, he was allegedly wrong as Google claims they have been fighting for the little guy the whole time:
Article 11 could change that principle and require online services to strike commercial deals with publishers to show hyperlinks and short snippets of news. This means that search engines, news aggregators, apps, and platforms would have to put commercial licences in place, and make decisions about which content to include on the basis of those licensing agreements and which to leave out. Effectively, companies like Google will be put in the position of picking winners and losers. ... Why are large influential companies constraining how new and small publishers operate? ... The proposed rules will undoubtedly hurt diversity of voices, with large publishers setting business models for the whole industry. This will not benefit all equally. ... We believe the information we show should be based on quality, not on payment.
Facebook claims there is a local news problem: "Facebook Inc. has been looking to boost its local-news offerings since a 2017 survey showed most of its users were clamoring for more. It has run into a problem: There simply isn’t enough local news in vast swaths of the country. ... more than one in five newspapers have closed in the past decade and a half, leaving half the counties in the nation with just one newspaper, and 200 counties with no newspaper at all."
Google is so for the little guy that for their local news experiments they've partnered with a private equity backed newspaper roll up firm & another newspaper chain which did overpriced acquisitions & is trying to act like a PE firm (trying to not get eaten by the PE firm).
Does the above stock chart look in any way healthy?
Does it give off the scent of a firm that understood the impact of digital & rode it to new heights?
If you want good market-based outcomes, why not partner with journalists directly versus operating through PE chop shops?
If Patch is profitable & Google were a neutral ranking system based on quality, couldn't Google partner with journalists directly?
Throwing a few dollars at a PE firm in some nebulous partnership sure beats the sort of regulations coming out of the EU. And the EU's regulations (and prior link tax attempts) are in addition to the three multi billion Euro fines the European Union has levied against Alphabet for shopping search, Android & AdSense.
Google was also fined in Russia over Android bundling. The fine was tiny, but after consumers gained a search engine choice screen (much like Google pushed for in Europe on Microsoft years ago) Yandex's share of mobile search grew quickly.
The UK recently published a white paper on online harms. In some ways it is a regulation just like the tech companies might offer to participants in their ecosystems:
Companies will have to fulfil their new legal duties or face the consequences and “will still need to be compliant with the overarching duty of care even where a specific code does not exist, for example assessing and responding to the risk associated with emerging harms or technology”.
If web publishers should monitor inbound links to look for anything suspicious then the big platforms sure as hell have the resources & profit margins to monitor behavior on their own websites.
Australia passed the Sharing of Abhorrent Violent Material bill which requires platforms to expeditiously remove violent videos & notify the Australian police about them.
There are other layers of fracturing going on in the web as well.
Programmatic advertising shifted revenue from publishers to adtech companies & the largest ad sellers. Ad blockers further lower the ad revenues of many publishers. If you routinely use an ad blocker, try surfing the web for a while without one & you will notice layover welcome AdSense ads on sites as you browse the web - the very type of ad they were allegedly against when promoting AMP.
There has been much more press in the past week about ad blocking as Google's influence is being questioned as it rolls out ad blocking as a feature built into Google's dominant Chrome web browser. https://t.co/LQmvJu9MYB— Jason Kint (@jason_kint) February 19, 2018
Tracking protection in browsers & ad blocking features built directly into browsers leave publishers more uncertain. And who even knows who visited an AMP page hosted on a third party server, particularly when things like GDPR are mixed in? Those who lack first party data may end up having to make large acquisitions to stay relevant.
Voice search & personal assistants are now ad channels.
Google Assistant Now Showing Sponsored Link Ads for Some Travel Related Queries "Similar results are delivered through both Google Home and Google Home Hub without the sponsored links." https://t.co/jSVKKI2AYT via @bretkinsella pic.twitter.com/0sjAswy14M— Glenn Gabe (@glenngabe) April 15, 2019
App stores are removing VPNs in China, removing Tiktok in India, and keeping female tracking apps in Saudi Arabia. App stores are centralized chokepoints for governments. Every centralized service is at risk of censorship. Web browsers from key state-connected players can also censor messages spread by developers on platforms like GitHub.
Microsoft's newest Edge web browser is based on Chromium, the source of Google Chrome. While Mozilla Firefox gets most of their revenue from a search deal with Google, Google has still gone out of its way to use its services to both promote Chrome with pop overs AND break in competing web browsers:
"All of this is stuff you're allowed to do to compete, of course. But we were still a search partner, so we'd say 'hey what gives?' And every time, they'd say, 'oops. That was accidental. We'll fix it in the next push in 2 weeks.' Over and over. Oops. Another accident. We'll fix it soon. We want the same things. We're on the same team. There were dozens of oopses. Hundreds maybe?" - former Firefox VP Jonathan Nightingale
This is how it spreads. Google normalizes “web apps” that are really just Chrome apps. Then others follow. We’ve been here before, y’all. Remember IE? Browser hegemony is not a happy place. https://t.co/b29EvIty1H— DHH (@dhh) April 1, 2019
In fact, it’s alarming how much of Microsoft’s cut-off-the-air-supply playbook on browser dominance that Google is emulating. From browser-specific apps to embrace-n-extend AMP “standards”. It’s sad, but sadder still is when others follow suit.— DHH (@dhh) April 1, 2019
YouTube page load is 5x slower in Firefox and Edge than in Chrome because YouTube's Polymer redesign relies on the deprecated Shadow DOM v0 API only implemented in Chrome. You can restore YouTube's faster pre-Polymer design with this Firefox extension: https://t.co/F5uEn3iMLR— Chris Peterson (@cpeterso) July 24, 2018
As phone sales fall & app downloads stall a hardware company like Apple is pushing hard into services while quietly raking in utterly fantastic ad revenues from search & ads in their app store.
Part of the reason people are downloading fewer apps is so many apps require registration as soon as they are opened, or only let a user engage with them for seconds before pushing aggressive upsells. And then many apps which were formerly one-off purchases are becoming subscription plays. As traffic acquisition costs have jumped, many apps must engage in sleight of hand behaviors (free but not really, we are collecting data totally unrelated to the purpose of our app & oops we sold your data, etc.) in order to get the numbers to back out. This in turn causes app stores to slow down app reviews.
Apple acquired the news subscription service Texture & turned it into Apple News Plus. Not only is Apple keeping half the subscription revenues, but soon the service will only work for people using Apple devices, leaving nearly 100,000 other subscribers out in the cold: "if you’re part of the 30% who used Texture to get your favorite magazines digitally on Android or Windows devices, you will soon be out of luck. Only Apple iOS devices will be able to access the 300 magazines available from publishers. At the time of the sale in March 2018 to Apple, Texture had about 240,000 subscribers."
Apple is also going to spend over a half-billion dollars exclusively licensing independently developed games:
Several people involved in the project’s development say Apple is spending several million dollars each on most of the more than 100 games that have been selected to launch on Arcade, with its total budget likely to exceed $500m. The games service is expected to launch later this year. ... Apple is offering developers an extra incentive if they agree for their game to only be available on Arcade, withholding their release on Google’s Play app store for Android smartphones or other subscription gaming bundles such as Microsoft’s Xbox game pass.
Verizon wants to launch a video game streaming service. It will probably be almost as successful as their Go90 OTT service was. Microsoft is pushing to make Xbox games work on Android devices. Amazon is developing a game streaming service to complement Twitch.
The hosts on Twitch, some of whom sign up exclusively with the platform in order to gain access to its moneymaking tools, are rewarded for their ability to make a connection with viewers as much as they are for their gaming prowess. Viewers who pay $4.99 a month for a basic subscription — the money is split evenly between the streamers and Twitch — are looking for immediacy and intimacy. While some hosts at YouTube Gaming offer a similar experience, they have struggled to build audiences as large, and as dedicated, as those on Twitch. ... While YouTube has made millionaires out of the creators of popular videos through its advertising program, Twitch’s hosts make money primarily from subscribers and one-off donations or tips. YouTube Gaming has made it possible for viewers to support hosts this way, but paying audiences haven’t materialized at the scale they have on Twitch.
Google, having a bit of Twitch envy, is also launching a video game streaming service which will be deeply integrated into YouTube: "With Stadia, YouTube watchers can press “Play now” at the end of a video, and be brought into the game within 5 seconds. The service provides “instant access” via button or link, just like any other piece of content on the web."
Google will also launch their own game studio making exclusive games for their platform.
When consoles don't use discs or cartridges, so that platforms can sell subscription access to their software libraries, it is hard to be a game retailer! GameStop's stock has been performing like an ICO. And these sorts of announcements from the tech companies have been hitting stock prices for companies like Nintendo & Sony: “There is no doubt this service makes life even more difficult for established platforms,” Amir Anvarzadeh, a market strategist at Asymmetric Advisors Pte, said in a note to clients. “Google will help further fragment the gaming market which is already coming under pressure by big games which have adopted the mobile gaming business model of giving the titles away for free in hope of generating in-game content sales.”
The big tech companies which promoted everything in adjacent markets being free are now erecting paywalls for themselves, balkanizing the web by paying for exclusives to drive their bundled subscriptions.
How many paid movie streaming services will the web have by the end of next year? 20? 50? Does anybody know?
Disney alone will operate Disney+ and ESPN+, as well as Hulu.
And then the tech companies are not only licensing exclusives to drive their subscription-based services, but we're going to see more exclusionary policies like YouTube not working on Amazon Echo, Netflix dumping support for Apple's Airplay, or Amazon refusing to sell devices like Chromecast or Apple TV.
The good news in a fractured web is a broader publishing industry that contains many micro markets will have many opportunities embedded in it. A Facebook pivot away from games toward news, or a pivot away from news toward video won't kill third party publishers who have a more diverse traffic profile and more direct revenues. And a regional law blocking porn or gambling websites might lead to an increase in demand for VPNs or free to play points-based games with paid upgrades. Even the rise of metered paywalls will lead to people using more web browsers & more VPNs. Each fracture (good or bad) will create more market edges & ultimately more opportunities. Chinese enforcement of their gambling laws created a real estate boom in Manila.
So long as there are 4 or 5 game stores, 4 or 5 movie streaming sites, etc. ... they have to compete on merit or use money to try to buy exclusives. Either way is better than the old monopoly strategy of take it or leave it ultimatums.
The publisher wins because there is a competitive bid. There won't be an arbitrary 30% tax on everything. So long as there is competition from the open web there will be means to bypass the junk fees & the most successful companies that do so might create their own stores with a lower rate: "Mr. Schachter estimates that Apple and Google could see a hit of about 14% to pretax earnings if they reduced their own app commissions to match Epic’s take."
As the big media companies & big tech companies race to create subscription products they'll spend many billions on exclusives. And they will be training consumers that there's nothing wrong with paying for content. This will eventually lead to hundreds of thousands or even millions of successful niche publications which have incentives better aligned than all the issues the ad supported web has faced.
Categories: 
publishing & media
from Digtal Marketing News http://www.seobook.com/fractured
0 notes
bambiguertinus · 6 years ago
Text
Three fundamental factors in the production of link-building content
One of the most overused phrases in content marketing is how it is an ever-changing landscape, forcing agencies and marketers to adapt and improve their existing processes.
In a short space of time, a topic can go from being newsworthy to negligible, all while certain types of content become tedious to the press and its readers.
A vast amount of the work we do — at Kaizen and many other similar agencies — is to create content with the sole purpose of building high-authority links, making it all the more imperative that we are conscious of the changes and trends outlined above.
If we were to split the creative process into three sections — content, design, and outreach strategy — how are we able to engineer our own successes and failures to provide us with a framework for future campaigns?
Three important factors for producing link-worthy content
Over the past month, I’ve analyzed over 120 pieces of content across 16 industries to locate and define the common threads between campaigns that exceed or fall short of their expectations. From the amount of data used and visualized to the importance of effective headline storytelling, the insight is a way of both rationalizing and reshaping our approach to content production.
1. Not too much data — our study showed an average of just over five metrics
Behind every great piece of content is (usually) a unique or noteworthy set of data. Both static and interactive content enables us to display limitless amounts of research which provide the origins of the stories we try to communicate. However many figures or metrics you choose to visualize, there is always a point where a journalist or reader switches off.
This glass ceiling is difficult to pinpoint and depends on the type of content, and the industry or readership you’re looking to appeal to, but a more granular study of good and poor performing campaigns that I performed suggested some benefits of refining data sets.
Observations
A starting point for any piece of research is the individual metrics, whether it is cost, type, or essentially anything worth measuring and comparing. In my research, content campaigns that exceeded our typical KPI used an average of just over five metrics per piece, compared with almost double that in campaigns with normal or below-satisfactory performance. The graph below shows the correlation between a lower number of metrics and a higher link performance.
An example of these findings in practice can be found in an infographic study completed for online travel retailer Lastminute.com that sought to find the world’s most chilled out countries. Following a comprehensive study of 36 countries across 10 metrics, the task was to refine these figures in a way that can be translated well through its design. The number of countries was whittled down to just the top 15, and the metrics were condensed to have four indexes which the rankings were based on. The decision to not showcase the data in its entirety proved fruitful, securing over 50 links, covered by the Mail Online and Lonely Planet.
As an individual who very much enjoys partaking in the research process, it can be extremely difficult to sacrifice any element of your work, but it is that level of tact in the production of content that distinguishes one piece from another.
2. Simple, powerful data visualizations — our analysis showed highest achievers had just one visualization
Regardless of how saturated the content marketing industry becomes, we are graced every year with new and innovative ways of visualizing data. The balancing act between originality in your design and an unnecessarily complex data-visualization is often the point on which success and failure can pivot. As is the case with data, overloading a piece of content with an amass of multi-faceted graphs and charts is a surefire way of alienating your users, leaving them either bored or confused.
Observations
For my study, I decided to look at the content that contained data visualizations that failed to hit the mark and see whether quality is as much of a problem as quantity in terms of design. As I carried out the analysis, I noted two approaches: either one visual incorporated most or all of the study, or the same illustration was replicated several times for each country, region, or sector. For instance, this study, from medical travel insurance provider Get Going, on reliable airlines condenses all the key information into one single data-visualization. Conversely, this piece from The Guardian on the gender pay gap shows how it can be effective to use one visual several times to present your data.
Unsurprisingly, many of the low scorers in my research averaged around eight different forms of data visualization, while high achievers contained just one. The graph below showcases how many data-visualizations are used on average by high and low performing pieces, both static and interactive. Low-performing static examples contained an average of just over six, with less than one for their higher-scoring counterparts. For interactive content, the optimum is just over one, with poor-performing content containing almost nine per piece.
In examples where the same type of graph or chart was used repeatedly, poor performers had approximately 33 per piece, with their more favorable counterparts using just three.
It is important to note that ranking-based pieces often require the repetition of a visual in order to tell a story, but once again this is part of the balancing act for creatives in terms of what type and how many data-visualizations one utilizes.
A fine example of an effective illustration of the data study contained in one visual comes from a 2017 piece by Federica Fragapane for Italian publication La Lettura, showcasing the most violent cities in the world. The chart depicts each city as a shape sized by its homicide rate, with other small indicators defined in the legend to the right of the graphic. The aesthetic qualities of the graph give a campaign, fairly morbid in the topic, an extended appeal beyond the subject of just global crime. While the term “design-led” is so-often thrown around, this example proves how effective it can be to integrate visuals effectively through your data. The piece, produced originally for print, proved hugely successful in the design space, with 18 referring domains from sites such as Visme.co.
3. Pandering to the press — over a third of our published links used the same headline as our pitch email subject line
Kaizen produces hundreds of campaigns on a yearly basis across a range of industries, meaning the task of looking inward is as necessary today as it ever has been. Competition means that press contacts are looking for something extra special to warrant your content’s publication. While ingenuity is required in every area of content marketing, it’s equally important to recognize the importance of getting the basics right.
The task of outreach can be won and lost in several ways, but your subject line is, and will always be, the most significant component of your pitch. Whether you encapsulate your content in a single sentence or highlight your most attention-worthy finding, an email headline is a laborious but crucial task. My task through my research was to find how vital it is in terms of the end result of achieving coverage.
Observations
As part of my analysis, I recorded the backlinks of a sample of our high and average content and recorded the headlines used in the coverage for each campaign. I found in better-performing examples, over a third of links used the same headlines used in our pitch emails, emphasizing the importance of effective storytelling in every area of your PR process. Below is an illustration in the SERPs of how far an effective headline can take you, with example coverage from one of our most successful pieces for TotallyMoney on work/life balance in Europe.
Another area I was keen to investigate, given the time and effort that goes into it, is how press releases are used across the coverage we get. Using scraping software, I was able to pull out the copy from each article where a follow link was achieved and compare it to the press releases we have produced. It was pleasing to see that one in five links contained at least a paragraph of copy used in our press materials. In contrast, just seven percent of the coverage within the lower performing campaigns contained a reference to our press releases, and an even lower four percent using headlines from our email subject lines.
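If you want to replicate this kind of analysis, the comparison can be approximated with a short script rather than dedicated scraping software. The sketch below is only a rough illustration of the idea, assuming Python with requests and BeautifulSoup installed; the coverage URLs, the press_release.txt file, and the 0.8 similarity threshold are placeholders rather than the actual tooling used in the study.

# Rough sketch: flag paragraphs in coverage articles that closely match the press release.
# URLs, file name, and the 0.8 threshold are illustrative placeholders.
import requests
from bs4 import BeautifulSoup
from difflib import SequenceMatcher

PRESS_RELEASE = open("press_release.txt", encoding="utf-8").read()
RELEASE_PARAS = [p.strip() for p in PRESS_RELEASE.splitlines() if len(p.strip()) > 80]

COVERAGE_URLS = [
    "https://example-news-site.com/work-life-balance-study",
    "https://another-outlet.com/europe-work-life-rankings",
]

def article_paragraphs(url):
    # Fetch a page and return the text of its <p> tags.
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [p.get_text(" ", strip=True) for p in soup.find_all("p")]

def reused_from_release(paragraph, threshold=0.8):
    # True if the paragraph closely matches any paragraph of the press release.
    return any(
        SequenceMatcher(None, paragraph.lower(), rp.lower()).ratio() >= threshold
        for rp in RELEASE_PARAS
    )

for url in COVERAGE_URLS:
    reused = [p for p in article_paragraphs(url) if len(p) > 80 and reused_from_release(p)]
    print(f"{url}: {len(reused)} paragraph(s) appear to be lifted from the press release")

The same pattern works for comparing published headlines against your pitch subject lines.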
Final thoughts
These correlations, similar to the ones discussed previously, suggest not only how vital the execution of basic processes are, but serve as a reminder that a campaign can do well or fall down at so many different points of production. For marketers, analysis of this nature indicates that a refinement of creative operations is a more secure route for your content and its coverage. Don’t think of it as “less is more” but a case of picking the right tools for the job at hand.
Nathan Abbott is Content Manager at Kaizen.
The post Three fundamental factors in the production of link-building content appeared first on Search Engine Watch.
from Digtal Marketing News https://searchenginewatch.com/2019/04/15/three-fundamental-factors-in-the-production-of-link-building-content/
0 notes
bambiguertinus · 6 years ago
Text
Top 19 Instagram marketing tools to use for success
Instagram is a phenomenon of our time. Of the world's roughly 7.7 billion people, one billion use Instagram every month (and counting) and 500 million use the platform every day.
Its engagement is also 10 times higher than that of Facebook, 54 times higher than Pinterest's, and 84 times higher than Twitter's.
All kinds of businesses ranging from your teen neighbor making earrings to huge corporations and media are on Instagram. And for a good reason — 80% of Instagram accounts follow at least one business.
[Screenshot taken from the Instagram Business homepage]
At times when Facebook is becoming more and more Messenger-based and Twitter revolves around politics and social issues, Instagram stands to be the platform for friends, strangers, and brands alike.
It’s no surprise we’re so serious about Instagram marketing and the tools that help us with it.
Below is the list of such tools which covers everything from filters to analytics.
19 top Instagram marketing tools
1. Grum
Grum is a scheduling tool that lets you publish content (both photos and videos) on Instagram. You can publish from multiple accounts at the same time and tag the users. You can do that right from your desktop.
Price: Starts at $9.9/month. Offers a free trial for 3 days.
2. Awario
Awario is a social media monitoring tool that finds mentions of your brand (or any other keyword) across the web, news/blogs, and social media platforms, including Instagram. By analyzing mentions of your brand on the platform, it tells you who your brand advocates and industry influencers are, what the sentiment behind your brand is (positive, negative, or neutral), as well as the languages and locations of your audience. It also analyzes the growth and reach of your mentions, and tells you how you compare to your competitors.
Price: Starts at $29/month. Offers a free trial for 14 days.
3. Buffer
Buffer is another scheduling tool. However, it includes Instagram among other social networks rather than focusing on Instagram alone. With Buffer, you can schedule content to be published across Instagram, Facebook, Twitter, Pinterest, and LinkedIn. You can publish the same or different messages across different platforms. You can also review how your posts are performing in terms of engagement, impressions, and clicks.
The tool can be used by up to 25 team members, and you can assign them the appropriate access levels.
Price: Starts at $15/month. Offers free 7-day or 14-day trials depending on the plan.
4. Hashtags for likes
Hashtags for likes is a simple tool that suggests the most relevant trending hashtags. Knowing the most popular hashtags in real time helps brands keep up with trends, jump on the news, and ultimately grow their followings.
Price: $9.99/month.
5. Iconosquare
Iconosquare is a social media analytics tool that works for Instagram and Facebook. It shows you the metrics on content performance and engagement as well as on your followers. You’ll discover the best times to post and understand your followers better. The tool also analyzes Instagram Stories.
Besides analytics, you can schedule posts, monitor tags and comments about your brand.
Price: Starts at $39/month. A free 14-day trial is available.
6. Canva
Canva is a design tool that is a great fit for marketers and companies that don’t have an in-house designer. Among other things, Canva helps create perfect Instagram stories. Stylish templates and easy design tools ensure that your Story stands out, which, again, isn’t easy in the world of Instagram.
Price: Free
7. Shortstack
Shortstack is a tool to run Instagram contests. Contests are huge on this platform: they cause loads of buzz, increase brand awareness, and attract new followers, which is why marketers love them.
ShortStack gathers all user-generated content, such as images that have been posted on your content hashtag, and displays them. It also keeps track of your campaign’s performance, showing your traffic, engagement, and other valuable data.
Price: Free up to 100 entries. Paid plans start at $29/month.
8. Soldsie
Soldsie is a handy tool that helps you to sell on Instagram and Facebook using comments. All you have to do is upload a product picture with relevant product information. Users who are registered with Soldsie can simply comment on the photo, and Soldsie will turn that into a transaction.
More expensive Soldsie plans are also integrated with Shopify.
Price: Starts at $49/monthly and 5.9% transaction fee.
9. Social Rank
Social Rank is a tool that identifies and analyzes your audience. You can identify influencers among your followers, see who engages with your brand and with what frequency. You can sort your followers in lists that are easy to work with (for example: most valuable, most engaged, and others).
You can also filter your audience by bio keyword, word/hashtag, and geographic location.
Price: Available on request.
10. Plann
Plann is an Instagram social media management tool. It allows you to design, edit, schedule, and analyze your posts. For example, you can edit the Instagram grid to look just as you wish. You can rearrange, organize, crop, and schedule your Instagram Stories. All exciting stats, from best times to post and best-performing hashtags to your best-performing color schemes are available. And you can also collaborate with other marketers to run your Instagram account together.
Price: Free, paid plans start from $6/month.
11. Social Insights
Social Insights is another platform that offers many important Instagram marketing features, such as scheduling and posting from your computer, identifying and organizing your followers, and analyzing followers’ growth, interactions, and engagement. You can add other team members without sharing your Instagram login.
Price: Starts at $29/month. A free 14-day trial is available.
12. Instagram Ads by Mailchimp
If you're already using MailChimp, its Instagram Ads feature might come in handy. The tool lets you use MailChimp contact lists to create Instagram campaigns. The whole process (creating, buying, and tracking the results of your ads) therefore happens in a familiar place and is powered by data.
Price: No extra fees if you’re using MailChimp.
13. Unfold – Story Creator
Unfold – Story Creator is an iOS app that makes lifestyle, fashion, and travel content more professional-looking. The app offers stylish templates, advanced fonts and text tools, and exports your stories in high resolution so that you can share them to other platforms besides Instagram.
Price: Free
14. Picodash
Picodash is an Instagram tool that finds target audiences and influencers on the platform. It lets you export your and your competitors’ Instagram followers and following lists, users that have used a specific hashtag, posted at a specific location or venue, commented or liked a specific post, as well as tagged users. You can also download any account stories or highlighted stories.
Price: Starts from $10 for a Followers/Hashtag Posts export. You can also request a sample of 100 for free before you order a full export report.
15. Wyng
Wyng is an enterprise-level platform that finds user-generated content with a specific mention or hashtag, exports it, and gets the rights to this content. This is very helpful for running contests. Instagram is, however, a tiny fraction of what the tool covers.
Price: Available on request. A free 14-day trial is available.
16. Afterlight
Afterlight is the iOS/Android image editing app that makes your content look more professional and refined. It offers plenty of unique filters, natural effects, and frames.
Price: $2.99
17. Sendible
Sendible is a popular social media management platform that lets you run accounts on different social media platforms, including Instagram. It’s integrated with some other tools that are useful for Instagram, such as Canva. The tool does scheduling, monitors mentions, and tracks the performance of your Instagram posts. You can also team up with other marketers and work together on your Instagram marketing (and other) goals.
Price: Starts at $29/month. A free 14-day trial is available.
18. Olapic
Olapic is an advanced visual commerce platform. It collects user-generated video content in real time, publishes it to your social media channels (including Instagram), makes it shoppable, and measures and predicts which content will perform best. It goes far beyond Instagram and even social media. What is more, it obtains rights for the content for you so that you're able to use it across your advertising, email, and offline channels.
Price: Available on request.
19. Pablo
Pablo (made by Buffer) is a platform that lets you easily create beautiful images for your Instagram marketing purposes. You can choose photos from Pablo’s own library which includes more than 500,000 images, add text (25+ stylish fonts are available) and format. The resizing option for various social platforms, including Instagram, will ensure your image fits perfectly.
Price: Free
Conclusion
As you can see, there are plenty of tools to choose from. Check them out, spot the ones that you need, and take your Instagram marketing to a whole new level.
Aleh is the Founder and CMO at SEO PowerSuite and Awario. He can be found on Twitter at @ab80.
Read next:
Tips to maximize ROI on paid social: Facebook + Instagram
How to optimize your Instagram account for search engines
Social media: How does it affect SEO?
Top social media trends for 2019
The post Top 19 Instagram marketing tools to use for success appeared first on Search Engine Watch.
from Digtal Marketing News https://searchenginewatch.com/2019/04/15/top-nineteen-instagram-marketing-tools-to-use-for-success/
0 notes
bambiguertinus · 6 years ago
Text
How to optimize paid search ads for phone calls
There have been an abundance of hand-wringing articles published that wonder if the era of the phone call is over, not to mention speculation that millennials would give up the option to make a phone call altogether if it meant unlimited data.
But actually, the rise of direct dialing through voice assistants and click to call buttons for mobile search means that calls are now totally intertwined with online activity.
Calling versus buying online is no longer an either/or proposition. When it comes to complicated purchases like insurance, healthcare, and mortgages, the need for human help is even more pronounced. Over half of consumers prefer to talk to an agent on the phone in these high-stakes situations.
In fact, 70% of consumers have used a click to call button. And three times as many people prefer speaking with a live human over a tedious web form. And calls aren’t just great for consumers either. A recent study by Invoca found that calls actually convert at ten times the rate of clicks.
However, if you’re finding that your business line isn’t ringing quite as often as you’d like it to, here are some surefire ways to optimize your search ads to drive more high-value phone calls.  
Content produced in collaboration with Invoca.
Four ways to optimize your paid search ads for more phone calls
Let your audience know you’re ready to take their call — and that a real person will answer
If you’re waiting for the phone to ring, make sure your audiences know that you’re ready to take their call. In the days of landlines, if customers wanted a service, they simply took out the yellow pages and thumbed through the business listings until they found the service they were looking for. These days, your audience is much more likely to find you online, either through search engines or social media. But that doesn’t mean they aren’t looking for a human to answer their questions.
If you’re hoping to drive more calls, make sure your ads are getting that idea across clearly and directly. For example, if your business offers free estimates, make sure that message is prominent in the ad with impossible-to-miss text reading, “For a free estimate, call now,” with easy access to your number.
And to make sure customers stay on the line, let them know their call will be answered by a human rather than a robot reciting an endless list of options.
Cater to the more than half of users that will likely be on mobile
If your customer found your landing page via search, there's a better than even chance they're on a mobile device.
While mobile accounted for just 27% of organic search engine visits in Q3 of 2013, its share increased to 57% as of Q4 2018.
That's great news for businesses looking to boost calls, since mobile users obviously already have their phone in hand. However, forcing users to dig up a pen to write down your business number, only to type it back into their phone, adds an unnecessary extra step that could make some users think twice about calling.
Instead, make sure mobile landing pages offer a click to call button that lists your number in big, bold text. Usually, the best place for a click to call button is in the header of the page, near your form, but it’s best practice to A/B test button location and page layouts a few different ways in order to make sure your click to call button can’t be overlooked.
Use location-specific targeting
Since 2014, local search queries from mobile have skyrocketed in volume as compared to desktop.
In 2014, there were 66.5 billion search queries from mobile and 65.6 billion search queries from desktop.
Now in 2019, desktop has decreased slightly to 62.3 billion — while mobile has shot up to 141.9 billion — more than doubling in five years.
Mobile search is by nature local, and vice versa. If your customer is searching for businesses hoping to make a call and speak to a representative, chances are, they need some sort of local services. For example, if your car breaks down, you’ll probably search for local auto shops, click a few ads, and make a couple of calls. It would be incredibly frustrating if each of those calls ended up being to a business in another state.
Targeting your audience by region can ensure that you offer customers the most relevant information possible.
If your business only serves customers in Kansas, you definitely don’t want to waste perfectly good ad spend drumming up calls from California.
If you’re using Google Ads, make sure you set the location you want to target. That way, you can then modify your bids to make sure your call-focused ads appear in those regions.  
Track calls made from ads and landing pages
Keeping up with where your calls are coming from in the physical world is important, but tracking where they’re coming from on the web is just as critical. Understanding which of your calls are coming from ads as well as which are coming from landing pages is an important part of optimizing paid search. Using a call tracking and analytics solution alongside Google Ads can help give a more complete picture of your call data.
And the more information you can track, the better. At a minimum, you should make sure your analytics solution captures data around the keyword, campaign/ad group, and the landing page that led to the call. But solutions like Invoca also allow you to capture demographic details, previous engagement history, and the call outcome to offer a total picture of not just your audience, but your ad performance.
For more information on how to use paid search to drive calls, check out Invoca’s white paper, “11 Paid Search Tactics That Drive Quality Inbound Calls.”
The post How to optimize paid search ads for phone calls appeared first on Search Engine Watch.
from Digtal Marketing News https://searchenginewatch.com/2019/04/16/optimize-paid-search-phone-calls/
0 notes
bambiguertinus · 6 years ago
Text
How to conduct a branded search audit
Search queries for your brand name, called “brand searches,” are among the most important keywords in a keyword portfolio. Even so, marketers are not often paying as much attention to these types of queries as they should.
While juicy high-volume non-branded queries are exciting, providing your audience and customers with helpful brand information is an equally thrilling prospect. The truth is that for many brands, big and small, users are commonly underserved by the branded search results they find.
In this post, we’ll show you exactly how to conduct a branded search audit, identify failing results, and implement improvements. This audit is one that we perform for our clients at Stella Rising. Now you can do the same for your clients or website.
The first part of the audit is about setting the stage. Do you know what ratio of your traffic is the result of non-brand queries vs. brand queries? You should. In this section of the audit, you’ll set the stage to discuss the importance of what you identify.
Why are branded queries so important?
Branded queries are among the most important keywords you can optimize as they represent a brand-aware audience that is more likely to convert. In fact, many of the people searching for your brand are already customers looking for information or looking to purchase again.
It’s easy (generally)
Unlike most things in SEO, Google wants you to rank well for your own brand terms. Whenever we see branded searches that are failing users, it’s usually easy to fix. Often, it’s as simple as creating a new page or changing a meta tag. Other times it can be more challenging, such as when brands have significant PR and/or brand reputation issues. That said, in most cases, branded search queries are among the easiest to rank for. Don’t overlook them.
Correlation to rankings and personalization
While the search volume of a domain name is not a confirmed ranking factor, Google does hold a patent that may indicate the more searches a brand receives, the more likely that brand is to be seen as high quality by Google. This, in turn, may help it rank for associated non-brand terms. What does that mean? Essentially, if tens of thousands of people search your brand name + couch, you may be more likely to rank for "sofas." A number of 2014 Wayfair commercials brilliantly capitalized on this opportunity. The commercials literally told people to Google "Wayfair my sofa" or "Wayfair my kitchen," thus tying signals around their entity to other non-brand entities.
How Wayfair brilliantly linked its advertising with its branded search terms
[Embedded YouTube video: Wayfair commercial]
Branded searches can also impact autocomplete, which in turn can drive more branded searches, feeding into the connection described above. When users click one of these autocompleted search suggestions, they execute a "branded search," which then signals to Google that the entities are related. For example, pairing "Amazon," "Ralph Lauren," or "Macy's" with "men's shirts."
Branded search audit part one: Setting the stage
In part one of this audit, you will provide branded search landscape insights. Like any good show, you need to set the stage for the information you are about to present. This will help to win buy-in, prioritize your efforts, and keep you strategically on track.
What is your ratio of branded search?
For this part, you’ll need to head to the Google Search Console and open up your performance report for the last three months. Start by setting up a filter for your brand name. If you have a brand name that people commonly misspell, then you will want to take that into account.*
Click the “+ New” button and then click “Query.” Filter by “Queries Containing” and not “Query is Exactly.”
*Advanced Tip: The above instructions only account for your brand name, not product names that are proprietary to your brand. Consider using the API to pull down your GSC report and do filtering for those as well.
You’ll then see the total number of branded clicks and impressions your site receives.
Now change your filter to “Queries not containing,” this will give you roughly the number of non-branded clicks the website has received from Google search.
Take this data and bring it into Excel. From there, create a pie chart to visually demonstrate the ratio of branded to non-branded clicks the website receives.
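If you would rather script this step than build the pie chart by hand, a few lines of pandas can compute the same ratio from an exported queries report. This is only a sketch under assumptions: the queries.csv file name, the "Top queries"/"Clicks" column names, and the brand spellings in the regex are placeholders you would swap for your own export. The two numbers it prints feed directly into the chart described above.

# Minimal sketch: branded vs. non-branded click split from an exported GSC queries report.
# File name, column names, and the brand/misspelling regex are placeholder assumptions.
import re
import pandas as pd

QUERY_COL, CLICKS_COL = "Top queries", "Clicks"   # adjust to match your export's headers
BRAND_PATTERN = re.compile(r"acme|acmee|ackme")   # brand name plus common misspellings

df = pd.read_csv("queries.csv")
df["branded"] = df[QUERY_COL].str.lower().str.contains(BRAND_PATTERN)

clicks = df.groupby("branded")[CLICKS_COL].sum()
total = clicks.sum()
print(f"Branded clicks:     {clicks.get(True, 0):>8} ({clicks.get(True, 0) / total:.1%})")
print(f"Non-branded clicks: {clicks.get(False, 0):>8} ({clicks.get(False, 0) / total:.1%})")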
What are the top branded queries driving traffic?
The next and perhaps one of the most critical questions for this analysis is, “What are the top branded queries?” Understanding this is important because the next step in this audit is to manually search each of the top ten (or more) queries. Then you will understand which queries will better serve users.
While this analysis is simple, we found that creating a simple visual with the data makes for a better story in your presentation. To do this, download the data by clicking the down arrow at the top of the queries table.
Once you have the data in Excel, you can create a cool visual using the 2D bar chart graph.
What are the top pages receiving branded clicks?
Similar to the above analysis, you will need to download the top pages from the “pages” tab in your performance report.
Where do branded searches come from?
For some brands, it is worth considering where branded searches come from geographically. To find this information, set your filter to include brand queries and then click the “countries” tab in the Google Search Console.
Now we could just put that image into a report, but what fun would that be? Instead, download the data and bring it into Excel to create a visual.
You can do this a number of ways, but we recommend either using a map or a tree-map which generates a cool way of looking at the data. A pie chart would do but is not as visually appealing. Open Excel and highlight the data you downloaded. Click “Insert”, then click either on “Maps” or the second box for a treemap.
Tree-map visualization of country click data
How has branded search trended over time?
The last and perhaps the most essential question, outside of which queries get clicks, is, "How has branded search trended over time?" This is a hugely important question for any brand that has invested or is currently investing in brand-building efforts, media, PR, or even non-branded paid search campaigns.
During this section of the audit, we have seen brands that previously invested millions in traditional print flatten out for years. Alternatively, we've also seen DTC brands' growth hit a wall. Knowing where brand interest stands is a data point that is vital to all brands and their performance marketing strategies. Whether you are conducting a branded search audit or not, tracking branded search volume is something that should be on the KPI list for marketers across a multitude of disciplines.
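One lightweight way to keep an eye on that trend is to pull brand interest programmatically. The sketch below uses the unofficial pytrends wrapper for Google Trends, which is an assumption on my part rather than part of the audit described here, and "acme widgets" stands in for your own brand term.

# Sketch: pull five years of search interest for a brand term via the unofficial pytrends library.
# "acme widgets" and the US geo filter are placeholders; install with `pip install pytrends`.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(["acme widgets"], timeframe="today 5-y", geo="US")

interest = pytrends.interest_over_time()   # weekly interest indexed 0-100
if not interest.empty:
    interest = interest.drop(columns=["isPartial"])
    print(interest.tail(12))               # roughly the last quarter
    print("Peak week:", interest["acme widgets"].idxmax().date())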
Branded search audit part two: Identifying issues in branded search
Do you have any brand image issues?
Sorry, no SEO magic here. If your brand, founder or employee made headlines (and not the good ones you send to mom) your only strategy is to do your best to rectify the situation.
Take, for example, a brand we came across which was at one point dealing with first-page Google results full of nasty headlines. The headlines covered how the center had allegedly abandoned more than 40 research animals on an island.
When I first heard this story and considered the best plan forward I thought, “Can they just decide not to abandon the animals and find them homes?”
In fact, that’s exactly what they did. As a result, the negative stories were replaced over time with positive ones about how the center reached an agreement to find a sanctuary for the animals. The first page of results for their brand name is now squeaky clean. The moral of this story is, never abandon animals on an island.
But, if you do, and get dragged through the mud for it, no amount of SEO will save you. Are there any abandoned research animals in your organization? If so, get them off the island.
In other less metaphorical terms, part of auditing brand search is brand reputation.
As we all now know, "E-A-T" and reputation are hugely important to Google. Deal with business practice issues head-on and find the best resolution possible. John Mueller has reminded us that Google has a really good memory and is not apt to forget anything about your brand history. Stains on your brand reputation can really take a toll and have lasting power. The best offense here is a good defense.
Searching the top ten queries
Now for the part of the audit where we find the broken stuff.
Take your list of top terms and manually execute the search in a private browsing window for each term. Record your results and a screenshot of the SERP.
What are we looking for?
The number of positions on the page we own – With branded searches for things like “brand name backpacks,” you can own 15+ of the top positions.
Others owning our conversation – Look for other brands ranking for your branded terms. Are they authorized to sell or talk about your products? Is the information they provide correct or accurate? Could a better job be done?
Misalignment of the query and title/meta – Does your title tag and meta description clearly speak to the query searched and align to the searcher’s intent? If not, make some changes to bring this into alignment. For example, if your shipping info is on a page labeled “FAQ” that’s unclear for users.
Cross-channel insights – Are all of your social properties listed on the first page where appropriate? Are others bidding on brand terms where you are not? Look for cross-channel synergies that could be leveraged for more aggregate traffic.
Broken pages – For big and small brands we have found soft “404” messages, outdated or broken pages ranking for highly searched branded terms. Click around the top ten and make sure everything is working as expected.
Out of stock products – Sometimes the products that rank for branded searches on ecommerce sites are out of stock. Make sure you have a protocol in place for managing inventory so that searchers are best served by the landing page.
Searching other navigational queries
Other areas to check out are navigational or known brand queries that help users navigate and interact with your brand. For example:
Brand Name Opening Hours
Brand Name Location
Brand Name Address
Brand Name Telephone Number
Brand Name Customer Service
Brand Name FAQ
Brand Name Return Policy
Brand Name Shipping
Brand Name Refund Policy
Brand Name Size Chart/Guide
Brand Name Sale
Manually execute a search in a private browsing window for these terms. You’re looking for title tags and meta descriptions that do not speak to the query, missing pages where content would be better-served broken-out, and other websites owing the conversation around your terms. In some of our audits, we have seen third-party sellers ranking on position one for things like “brand name return policy”.
Furthermore, the information presented was not even correct. When it came to our site, we were not making that information easy to find, and as such, we got beat out. Simple adjustments to meta information and where content lived helped to improve our visibility and win back the featured snippet.
Track and measure
As with any good SEO effort, you should carefully track and measure the success of your recommendations. Consider setting up a monthly tracking sheet for your branded search volume. Also, consider tracking some of the branded queries identified as “needing work” in your rank tracker.
Conclusion
The branded search audit is a deliverable which does not take a great deal of time but can result in tremendous impact for your brand or client. By focusing on branded queries first, you are serving those most likely to convert on your site while simultaneously addressing your lowest hanging fruit.
Have questions about conducting your own audit? Let us know in the comments section below.
John Morabito is the Director of SEO at Women’s Marketing / Flying Point Digital. He can be found on Twitter @JohnMorabitoSEO.
The post How to conduct a branded search audit appeared first on Search Engine Watch.
from Digtal Marketing News https://searchenginewatch.com/2019/04/15/how-to-branded-search-audit/
0 notes
bambiguertinus · 6 years ago
Text
How to write SEO-friendly alt text for your images
One of the biggest problems digital marketers face is the nuance of crafting high-quality, SEO-rich content.
A great area of opportunity for marketers is the SEO alt text for their images. We've all been to websites where an image is replaced by a red "X" or is simply a blank box. Wouldn't it be great if you could benefit from that image box through an increased search engine ranking?
That’s where alt text comes in.
Alt text is just a way to describe what is going on in the image while actively increasing your ranking through smart, thoughtful placement of SEO keywords. We are going to look at ways you can improve your image alt text while keeping your content search engine friendly.
Research keywords before you start
It's important that you look carefully into which keywords you're going to use before you start creating content, including your alt text. Google's Keyword Planner tool can help you make educated decisions about which words are best suited for your website, depending on your niche.
When you’re researching keywords, the best practice is to look for words that feature high search volume but low competition. The reason for this thought process is simple.
High volume, high competition keywords result in an uphill battle that you may not win. If there are plenty of people searching for the words you pick, but a bunch of reputable websites who have a high domain authority, you’re going to have a much harder time reaching the top of the search engine results.
At the same time, low competition, low search keywords mean your website probably will not get the traffic you need to thrive. The happy medium is words that are popular, but not dominated by highly authoritative sites. The success of your keywords is going to reflect not just in your content or title, but in your alt text, making this an important starting point.
Supplement your alt text with primary keywords
It’s worth pointing out that alt text is important, but it should never take priority over your researched and currently implemented SEO. You would never want to rearrange your pre-arranged keywords to make the alt text keywords fit.
Instead, try to find images that complement the keywords you've already selected. When you work backward from your alt text images, you could end up with a page that is more focused on the images than on the content throughout.
The only exception to this rule is if your content is image-heavy. Companies that implement slideshows, photo galleries, and the like may benefit more from working backward from their images instead of the other way around.
Connect the content to the image text
Another common mistake that SEO marketers make is they don’t directly link the alt text to the content they create. Alt text, as mentioned, is just text that describes what’s going on in the image. If you want to make a strong connection with your audience and the search engine results, make sure you make a connection between the text in your content, the image, and the alt text.
For example, if your piece of content is about website design, the piece should include text that explains the image. In this fictional piece, let's say your keyword is "expert web design": you're going to need to include an image that emphasizes your point, explain that image within the content, and include the keyword in the alt text.
Keep it short
Since the main purpose of alt text is to inform the reader of what the image shows if they can’t view it, your alt-text should never drag on. Simply explain what the image shows using your keywords as the primary descriptor and additional text as needed.
The recommended alt-text length is about 125 characters. Some browsers display only a single line of alt text, constrained to the space the image occupies. The result of an overly long alt text line is not just search engine confusion, but also reader confusion when the text is cut off by the browser before they can finish reading it.
If you find that your alt text is always longer than 125 characters, your point is probably better off posted in the actual content of the article instead of the alt image text.
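To see how an existing page measures up against that advice, a short script can list images with missing or overly long alt text. This is a rough sketch under assumptions: the example.com URL is a placeholder, and it presumes Python with requests and BeautifulSoup installed.

# Rough sketch: audit a page for missing or overly long image alt text.
# The URL is a placeholder; the 125-character limit follows the guidance above.
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/some-page"
MAX_ALT_LENGTH = 125

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for img in soup.find_all("img"):
    src = img.get("src", "(no src)")
    alt = (img.get("alt") or "").strip()
    if not alt:
        print(f"MISSING ALT: {src}")
    elif len(alt) > MAX_ALT_LENGTH:
        print(f"TOO LONG ({len(alt)} chars): {src} -> {alt[:60]}...")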
Examples of SEO-friendly alt text
First, let’s take a look at the source code:
<img src="Image.gif" alt="alt-text-goes-here">
In this example, "Image.gif" is the image that is displayed to those who can properly see the image. Those who can't see the image will instead see the text you include where it says "alt-text-goes-here".
Here are some better examples to give you an idea of what a good SEO-friendly piece of alt text looks like.
Example one
You own a pet shop and your display picture is a kitten in a basket at your pet shop. Your source code should look something like this:
<img src="FluffyCat.png" alt="Pet Shop Kitten Snuggling in Basket">
The goal is to make your alt-text clean, concise, and friendly to the keywords you decided to target in your piece.
Example two
Now let’s say you have an online car accessory shop. You sell things like seat covers, floor liners, and air fresheners. On your air freshener page your alt text will look like this:
<img src="AirFreshner.png" alt="Air Freshener Pack and New Car Accessories">
In the example above, you’re targeting air fresheners, new cars, cars in general, and car accessories.
Example three
Finally, you have a membership site that sells marketing tips to your audience. You have an infographic of marketing statistics everyone should know in 2019. How will your alt-text look in this situation? Since you obviously can’t fit every stat in your alt-text, you might say:
<img src="MarketingStatsInfo.png" alt="New Marketing Statistics for 2019">
Piecing it together
There’s no doubt that alt text plays a crucial role in an online world consumed by the importance of keywords. If you want to make the most of your alt text, keep these tips in mind and remember that the online world is constantly evolving.
As your website grows in size and authority you may have to make changes to your SEO keywords for future articles, and therefore for your alt text. The good news is, this allows you to pull off some interesting split tests to see which keywords are ranking well for you, and which ones are pulling in lackluster results.
One thing is clear, don’t underestimate the power of alt text as it relates to your readers and your search ranking. It may not be the most important factor, but correctly creating optimized images and alt text is an important piece of the puzzle.
Syed Balkhi is an entrepreneur, marketer, and CEO of Awesome Motive. He’s also the founder of WPBeginner, OptinMonster, WPForms, and MonsterInsights. Syed can be found on Twitter @syedbalkhi.
Read next:
10 on-page SEO essentials: Crafting the perfect piece of content
A quick guide to SEO in 2019
How to move from keyword research to intent research
The best alternative keyword research tools
4 Ahrefs and SEMrush alternatives that bring innovation to competitive analysis
The post How to write SEO-friendly alt text for your images appeared first on Search Engine Watch.
from Digtal Marketing News https://searchenginewatch.com/2019/04/12/seo-friendly-image-alt-text/
0 notes
bambiguertinus · 6 years ago
Text
Survey: Less than 10% of marketers to focus on Digital PR in 2019
Zazzle Media has released their annual State of Content Marketing 2019 survey, which found that less than one in ten marketers (9%) will be focusing on Digital PR in 2019.
Despite this, over three quarters (76%) state that brand awareness is a key performance indicator for them.
Not only this, but 25% of content marketers will stop participating in offline PR activity, as it has been perceived as an ineffective channel for them over recent years.
It seems there is a disconnect between marketers' desired goals and the tactics they need to carry out to achieve them.
So why are marketers seemingly less concerned about off-page distribution, and why should you make a case for Digital PR to hold a key position in your marketing activities?
Brand awareness
Whilst the creation of written blog content will appeal to people on the site, we need a mechanism that is going to drive these people towards the site first.
Digital PR can help users find your site in a more organic way rather than in a targeted advertorial manner.
The survey found that a quarter of marketers want to target new audiences through content distribution, but without Digital PR this will prove to be a difficult task.
Brand protection
PR allows you to control narratives and get involved with industry conversations which you would otherwise be unable to participate in. The digital aspect also allows you to receive real-time alerts for coverage that mentions your brand name and to put out an immediate response, stemming negative feedback or amplifying positive feedback. Protecting your brand, especially in the SERPs, is a powerful tool for PRs.
Read next: Organic reputation management & brand protection
Link building
A major perk of creating Digital PR campaigns is that they usually come with linkable assets that have a chance of being cited within media coverage.
Link building is an activity with a reputation for relying on black-hat tactics: paying for links, directories, and the like. Digital PR allows you to avoid all these techniques and the risks associated with them, and build some legitimate links from high-authority publications.
Read next: Five proven content formats to maximize link acquisition with digital PR.
Managing director of Zazzle Media, Simon Penson, commented on the statistics:
“Brand awareness has appeared as one of 2019’s core focuses when it comes to brands content marketing efforts. Whilst this can be achieved through a number of marketing techniques, Digital PR is one of the strongest means of getting your name out there to new audiences.
2019 is shaping up to be an exciting year for content marketing, and Digital PR could be the key to giving your brand new audiences and visibility.”
What do you think of these findings? Let us know your thoughts on the results in the comments.
Kirsty Daniel is a Digital Marketing Executive at Zazzle Media. She can be found on Twitter @kirsty_daniel.
The post Survey: Less than 10% of marketers to focus on Digital PR in 2019 appeared first on Search Engine Watch.
from Digtal Marketing News https://searchenginewatch.com/2019/04/12/digital-pr-2019/
0 notes
bambiguertinus · 6 years ago
Text
How to perfectly balance affiliate marketing and SEO
In all my years as an SEO consultant, I can’t begin to count the number of times I saw clients who were struggling to make both SEO and affiliate marketing work for them.
When their site rankings dropped, they immediately started blaming it on the affiliate links. Yet what they really needed to do was review their search marketing efforts and make them align with their affiliate marketing efforts.
Both SEO and affiliate marketing have the same goal of driving relevant, high-quality traffic to a site so that those visits eventually turn into sales. So there’s absolutely no reason for them to compete against each other. Instead, they should work together in perfect balance so that the site generates more revenue. SEO done right can prove to be the biggest boon for your affiliate marketing efforts.
It’s crucial that you take a strategic approach to align these two efforts.
Four ways to balance your affiliate marketing and SEO efforts
1. Find a niche that’s profitable for you
One of the reasons affiliate marketing may clash with SEO is that you're trying to sell too many different things from different product categories. That makes it extremely challenging to align your SEO efforts with your affiliate marketing because everything is all over the place.
This means that you’ll have a harder time driving a targeted audience to your website. While your search rankings may be high for a certain product keyword, you may be struggling to attract visitors and customers for other products.
Instead of trying to promote everything and anything, pick one or two profitable niches to focus on. This is where it gets tricky. While you may naturally want to focus on niches in which you have a high level of interest and knowledge, they may not always be profitable. So I suggest you conduct some research about the profitability of potential niches.
To conduct research, you can check resources that list the most profitable affiliate programs. You can also use platforms like ClickBank to conduct this research. While you can use other affiliate platforms for your research, this is a great place to start. First, click on the “Affiliate Marketplace” button at the top of the ClickBank homepage.
You’ll see a page that gives you the option to search for products. On your left, you can see the various affiliate product categories available. Click on any of the categories that pique your interest.
On the search results page, you’ll see some of the affiliate marketing programs available on the platform. The page also displays various details about the program including the average earning per sale.
Then filter the search results by “Gravity,” which is a metric that measures how well a product sells in that niche.
You should ideally look for products with a Gravity score of 50 or higher. Compare the top Gravity scores of each category to see which is the most profitable. You can additionally compare the average earnings per sale for products in different categories.
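If you record candidate programs in a spreadsheet as you browse, a few lines of Python can apply that same Gravity threshold for you. Here is a minimal sketch; it assumes a hypothetical CSV export ("programs.csv") with column names made up for illustration, not a ClickBank export format or API:

```python
# Minimal sketch: shortlist affiliate programs recorded in a hypothetical CSV
# ("programs.csv" with columns: category, product, gravity, avg_earnings_per_sale).
# The file layout and column names are assumptions, not a ClickBank export format.
import csv

MIN_GRAVITY = 50  # threshold suggested above

with open("programs.csv", newline="") as f:
    programs = list(csv.DictReader(f))

# Keep only products that clear the Gravity threshold
shortlist = [p for p in programs if float(p["gravity"]) >= MIN_GRAVITY]

# Sort by Gravity, then by average earnings per sale, highest first
shortlist.sort(key=lambda p: (float(p["gravity"]), float(p["avg_earnings_per_sale"])),
               reverse=True)

for p in shortlist:
    print(f'{p["category"]:<20} {p["product"]:<30} '
          f'gravity={p["gravity"]:>6} avg=${p["avg_earnings_per_sale"]}')
```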
2. Revise your keyword strategy
Since you’re already familiar with search marketing, I don’t need to tell you about the importance of keyword planning. That being said, I would recommend that you revise your existing keyword strategy after you’ve decided on a niche to focus on and the products you want to sell.
The same keyword selection rules apply even in this process. You would want to work with keywords that have a significant search volume yet aren’t too competitive. And you will need to focus on long-tail keywords for more accuracy. While you should still use the Google Keyword Planner, I suggest you try out other tools as well for fresh keyword ideas.
Among the free tools, Google Trends is an excellent option. It gives you a clear look at the changes in interest for your chosen search term. You can filter the result by category, time frame, and region. It also gives you a breakdown of how the interest changes according to the sub-region.
The best part about this tool is that if you scroll down, you can also see some of the related queries. This will give you insights into some of the other terms related to your original search term with rising popularity. So you can get some quick ideas for trending and relevant keywords to target.
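If you would rather pull this Trends data programmatically than through the UI, the unofficial pytrends library is one option. A minimal sketch follows; note that pytrends is a third-party wrapper rather than an official Google API, so method names and rate limits can change, and "standing desk" is just a placeholder keyword:

```python
# Minimal sketch using the unofficial pytrends library (pip install pytrends).
# This is not an official Google API; its behavior can change between versions.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(["standing desk"], timeframe="today 12-m", geo="US")

# Interest over time: how search interest in the term has changed
interest = pytrends.interest_over_time()
print(interest.tail())

# Related queries: "top" and "rising" terms, useful for long-tail keyword ideas
related = pytrends.related_queries()
print(related["standing desk"]["rising"])  # may be None for low-volume terms
```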
AnswerThePublic is another great tool for discovering long-tail keyword ideas. This tool gives you insights into some of the popular search queries related to your search term. So you’ll be able to come up with ideas for keywords to target as well as topic ideas for fresh content.
3. Optimize your website content
High-quality content is the essence of a successful SEO strategy. It also serves the purpose of educating and converting visitors for affiliate websites. So it’s only natural that you will need to optimize the content on your website. You can either create fresh content or update your existing content, or you can do both.
Use your shortlisted keywords to come up with content ideas. These keywords have a high search volume, so you know people are searching for content related to them. When you create content optimized with those keywords, you'll gain visibility in their search results, and since you're providing the content they need, you will drive them to your site.
You can also update your existing content with new and relevant keywords. Perhaps to add more value, you can even include new information such as tips, stats, updates, and more. Whatever you decide to do, make sure the content is useful for your visitors. It shouldn’t be too promotional but instead, it needs to be informative.
4. Build links to boost site authority and attract high-quality traffic
You already know that building high-quality backlinks can improve the authority of your site and therefore, your search rankings. So try to align your link-building efforts with your affiliate marketing by earning backlinks from sites that are relevant to the products you’re promoting.
Of course, you can generate more social signals by trying to drive more content shares. But those efforts aren't always enough, especially if you want to drive more revenue.
I suggest you try out guest posting, as it can help you tap into the established audience of a relevant, authoritative site. This helps you drive high-quality traffic to your site. It also boosts your page and domain authority since you’re getting a link back from a high authority site.
Although Matt Cutts said in 2014 that guest posting for SEO is dead, that’s not true if you plan your approach. The problem is when you try to submit guest posts just for the sake of getting backlinks. Most reputable sites don’t allow that anymore.
To get guest posting right, you need to make sure that you’re creating content that has value. So it needs to be relevant to the audience of your target site, and it should be helpful to them somehow. Your guest posts should be of exceptional quality in terms of writing, readability, and information.
Not only does this improve your chances of getting accepted, but it also helps you gain authority in the niche. Plus, you will get to reach an engaged and relevant audience and later direct them to your site depending on how compelling your post is.
Bottom line
SEO and affiliate marketing can work in perfect alignment if you strategically balance your efforts. These tips should help you get started with aligning the two aspects of your business. You will need some practice and experimentation before you can perfectly balance them. You can further explore more options and evolve your strategy as you get better at the essentials.
Shane Barker is a Digital Strategist, Brand and Influencer Consultant. He can be found on Twitter @shane_barker.
The post How to perfectly balance affiliate marketing and SEO appeared first on Search Engine Watch.
from Digital Marketing News https://searchenginewatch.com/2019/04/11/how-to-perfectly-balance-affiliate-marketing-and-seo/
bambiguertinus · 6 years ago
Text
How to get international insights from Google Analytics
Your international marketing campaigns hinge on one crucial element: how well you have understood your audience.
As with all marketing, insight into the user behavior, preferences and needs of your market is a must. However, if you do not have feet on the ground in these markets, you may be struggling to understand why your campaigns are not hitting the mark.
Thankfully you have a goldmine of data about your customers’ interests, behavior, and demographics already at your fingertips. Wherever your international markets are, Google Analytics should be your first destination for drawing out actionable insights.
Setting up Google Analytics for international insight
Google Analytics is a powerful tool but the sheer volume of data available through it can make finding usable insights tough. The first step for getting the most out of Google Analytics is ensuring it has been set up in the most effective way. This needs to encompass the following:
Also read: An SEO’s guide to Google Analytics terms
1. Setting up views for geographic regions
Depending on your current Google Analytics set-up you may already have more than one profile and view for your website data. What insight you want to get from your data will influence how you set up this first stage of filtering. If you want to understand how the French pages are being accessed and interacted with, then you may wish to create a filter based on the folder structure of your site, such as the "/fr-fr/" sub-folder.
However, this will show you information on visitors who arrive on these pages from any geographic location. If your hreflang tags aren’t correct and Google is serving your French pages to a Canadian audience, then you will be seeing Canadian visitors’ data under this filter too.
If you are interested in only seeing how French visitors interact with the website, no matter where on the site they end up, then a geographic filter is better. Here’s an example.
2. Setting up segments per target area
Another way of being able to identify how users from different locations are responding to your website and digital marketing is by setting up segments within Google Analytics based on user demographics. Segments enable you to see a subset of your data and, unlike filters, they don't permanently alter the data you are viewing. Segments will allow you to narrow down your user data based on a variety of demographics, such as which campaign led them to the website, the language in which they are viewing the content, and their age. To set up a segment in Google Analytics click on "All Users" at the top of the screen. This will bring up all of the segments currently available in your account.
To create a new segment click “New Segment” and configure the fields to include or exclude the relevant visitors from your data. For instance, to get a better idea of how French-Canadian visitors interact with your website you might create a segment that only includes French-speaking Canadians. To do this you can set your demographics to include “fr-fr” in the “Language” field and “Canada” in the “Location” field.
Use the demographic fields to tailor your segment to include visitors from certain locations speaking specific languages.
The segment “Summary” will give you an indication of what proportion of your visitors would be included in this segment which will help you sense-check if you have set it up correctly. Once you have saved your new segment it will be available for you to overlay onto your data from any time period, even from before you set up the segment. This is unlike filters, which will only apply to data recorded after the filter was created.
Also read: A guide to the standard reports in Google Analytics – Audience reports 
3. Ensuring your channels are recording correctly
A commonly missed step when setting up your international targeting in Google Analytics is ensuring that the entry points for visitors onto your site are tracked correctly.
For instance, there are a variety of international search engines that Google Analytics counts as “referral” sources rather than organic traffic sources unless a filter is added to change this.
The best way to identify this is to review the websites listed as having driven traffic to your website; follow the path Acquisition > All Traffic > Referrals. If you identify search engines among this list then there are a couple of solutions available to make sure credit for your marketing success is being assigned correctly.
First, visit the “Organic Search Sources” section in Google Analytics which can be found under Admin > Property > Organic Search Sources.
From here, you can simply add the referring domain of the search engine that is being recorded as a “referral” to the form. Google Analytics should start tracking traffic from that source as organic. Simple. Unfortunately, it doesn’t always work for every search engine.
If you find the “Organic Search Sources” solution isn’t working, filters are a fool-proof solution but be warned, this will alter all your data in Google Analytics from the point the filter is put in place. Unless you have a separate unfiltered view available (which is highly recommended) then the data will not be recoverable and you may struggle to get an accurate comparison with data prior to the filter implementation. To set up a view without a filter you simply need to navigate to “Admin” and under “View” click “Create View”.
Name your unfiltered view “Raw data” or similar that will remind you that this view needs to remain free of filters.
To add a filter to the Google Analytics view that you want to have more accurate data in, go to “Filters” under the “View” that you want the data to be corrected for.
Click “Add Filter” and select the “Custom” option. To change traffic from referral to organic, copy the below settings:
Filter Type: Advanced
Field A – Extract A: Referral (enter the domain of the website you want to reclassify traffic from)
Field B – Extract B: Campaign Medium – referral
Output To – Constructor: Campaign Medium – organic
Then ensure the “Field A Required”, “Field B Required”, and “Override Output Field” options are selected.
You may also notice the social media websites are listed among the referral sources. The same filter process applies to them.  Just enter “social” rather than “organic” under the “Output To” field.
4. Setting goals per user group
Once you have a better idea of how users from different locations use your website you may want to set up some independent goals specific to those users in Google Analytics. This could be, for example, a measure of how many visitors download a PDF in Chinese. This goal might not be pertinent to your French visitors’ view, but it is a very important measure of how well your website content is performing for your Chinese audience.
The goals are simple to create in Google Analytics, just navigate to “Admin”, and under the view that you want to add the goal to click “Goals”.  This will bring up a screen that displays any current goals set up in your view and, if you have edit level permissions in Google Analytics, you can create a new one by clicking “New Goal”.
Once you have selected "New Goal" you will be given the option of setting up a goal from a template or creating a custom one. It is likely that you will need to configure a custom goal in order to track specific actions based on events or page destinations. For example, if you are measuring how many people download a PDF you may track the "Download" button click events, or you may create a goal based on visitors going to the "Thank you" page that is displayed once a PDF is downloaded.
Most goals will need to be custom ones that allow you to track visitors completing specific events or navigating to destination pages.
Since the number of goals you can set up under each view is limited to 20, your goals will likely differ between views in order to drive the most relevant insight.
5. Filtering tables by location
An easy way to determine location-specific user behavior is using the geographic dimensions to further drill-down into the data that you are viewing.
For instance, if you run an experiential marketing campaign in Paris to promote awareness of your products, then viewing the traffic that went to the French product pages of your website that day compared to a previous day could give you an indicator of success. However, what would be even more useful would be to see if interest in the website spiked for visitors from Paris.
By applying "City" as a secondary dimension to the table of data you are looking at, you can get a more specific overview of how well the campaign performed in that region.
Dimensions available include "Continent", "Sub-continent", "Country", "Region", and "City", as well as the option to split the data by "Language".
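If you prefer to pull these splits programmatically, the Analytics Reporting API v4 exposes the same geographic and language dimensions. Below is a minimal sketch; the service account key file ("key.json") and VIEW_ID are placeholders for your own credentials and view, and error handling and paging are omitted:

```python
# Minimal sketch: sessions split by country and language via the Analytics
# Reporting API v4 (pip install google-api-python-client google-auth).
# "key.json" and "VIEW_ID" are placeholders for your own credentials and view.
from googleapiclient.discovery import build
from google.oauth2 import service_account

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
creds = service_account.Credentials.from_service_account_file("key.json", scopes=SCOPES)
analytics = build("analyticsreporting", "v4", credentials=creds)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "VIEW_ID",
        "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [{"name": "ga:country"}, {"name": "ga:language"}],
        "orderBys": [{"fieldName": "ga:sessions", "sortOrder": "DESCENDING"}],
    }]
}).execute()

for row in response["reports"][0]["data"].get("rows", []):
    country, language = row["dimensions"]
    sessions = row["metrics"][0]["values"][0]
    print(f"{country:<20} {language:<8} {sessions}")
```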
Also read: How to integrate SEO into the translation process to maximize global success
Drawing intelligence from your data
Once you have your goals set up correctly you will be able to drill much further down into the data Google Analytics is presenting you with. An overview of how international users navigate your site, how they interact with content, and where their pain points lie is valuable in determining how to better optimize your website and marketing campaigns for conversion.
1. Creating personas
Many organizations will have created user personas at one stage or another, but it is valuable to review them periodically to ensure they are still relevant in the light of changes to your organization or the digital landscape. It is imperative that your geographic targeting has been set up correctly in Google Analytics to ensure your personas drive insight into your international marketing campaigns.
Creating personas using Google Analytics ensures they are based on real visitors who land on your website. This article from my agency, Avenue Digital, gives you step by step guidance on how to use your Google Analytics data to create personas, and how to use them for SEO.
2. Successful advertising mediums
One tip for maximizing the data in Google Analytics is discerning what the most profitable advertising medium is for that demographic.
If you notice that a lot of your French visitors are coming to the website as a result of a PPC campaign advertising your products, but the traffic that converts the most is actually from Twitter, then you can focus on expanding your social media reach in that region.
This may not be the same for your UK visitors who might arrive on the site and convert most from organic search results. With the geographic targeting set up correctly in Google Analytics, you will be able to focus your time and budgets more effectively for each of your target regions, rather than employing a blanket approach based on unfiltered data.
3. Language
Determining the best language in which to deliver your marketing campaigns and website may not be as simple as identifying the primary language of each country you are targeting. For example, Belgium has three official languages – Dutch, German, and French. Google Analytics can help you narrow down which of these languages is primarily used by the demographic that interacts with you the most online.
If you notice that there are a lot of visitors from French-speaking countries landing on your website, but it is only serving content in English, then this forms a good base for diversifying the content on your site.
4. Checking the correctness of your online international targeting
An intricate and easy-to-get-wrong aspect of international marketing is signaling to search engines which content you want available to searchers in different regions.
Google Analytics allows you to audit how well your international targeting has been understood and respected by the search engines. If you have filtered your data by a geographic section of your website, like /en-gb/, but a high proportion of the organic traffic landing on this section is from countries that have their own dedicated pages on the site, then this suggests your hreflang tags may need checking.
5. Identifying emerging markets
Google Analytics could also help identify markets that are not currently served by your products, website, or marketing campaigns but that could prove very fruitful if tapped into.
If through your analysis you notice that there is a large volume of visitors from a country you don’t currently serve then you can begin investigations into the viability of expanding into those markets.
Conclusion
As complex as Google Analytics may seem, once you have set it up correctly you can expect real clarity over your data, as it makes drilling down into detail for each of your markets an easy job. The insight you gain into your markets can be the difference between your digital marketing efforts soaring or falling flat.
Helen Pollitt is the Head of SEO at Avenue Digital. She can be found on Twitter @HelenPollitt1.
The post How to get international insights from Google Analytics appeared first on Search Engine Watch.
from Digital Marketing News https://searchenginewatch.com/2019/04/10/how-to-get-international-insights-from-google-analytics/
bambiguertinus · 6 years ago
Text
A survival kit for SEO-friendly JavaScript websites
JavaScript-powered websites are here to stay. As JavaScript in its many frameworks becomes an ever more popular resource for modern websites, SEOs must be able to guarantee their technical implementation is search engine-friendly.
In this article, we will focus on how to optimize JS-websites for Google (although Bing also recommends the same solution, dynamic rendering).
The content of this article includes:
1. JavaScript challenges for SEO
2. Client-side and server-side rendering
3. How Google crawls websites
4. How to detect client-side rendered content
5. The solutions: Hybrid rendering and dynamic rendering
1. JavaScript challenges for SEO
React, Vue, Angular, Node, and Polymer. If at least one of these fancy names rings a bell, then most likely you are already dealing with a JavaScript-powered website.
All these JavaScript frameworks provide great flexibility and power to modern websites.
They open a large range of possibilities in terms of client-side rendering (like allowing the page to be rendered by the browser instead of the server), page load capabilities, dynamic content, user interaction, and extended functionalities.
If we only look at what has an impact on SEO, JavaScript frameworks can do the following for a website:
Load content dynamically based on users’ interactions
Externalize the loading of visible content (see client-side rendering below)
Externalize the loading of meta-content or code (for example, structured data)
Unfortunately, if implemented without using a pair of SEO lenses, JavaScript frameworks can pose serious challenges to the page performance, ranging from speed deficiencies to render-blocking issues, or even hindering crawlability of content and links.
There are many aspects that SEOs must look after when auditing a JavaScript-powered web page, which can be summarized as follows:
Is the content visible to Googlebot? Remember the bot doesn’t interact with the page (images, tabs, and more).
Are links crawlable, hence followed? Always use the anchor tag (<a>) with the href attribute, even in conjunction with "onclick" events (see the audit sketch below).
Is the rendering fast enough?
How does it affect crawl efficiency and crawl budget?
A lot of questions to answer. So where should an SEO start?
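On the crawlable-links question at least, a quick script can flag anchors in the static HTML that bots are unlikely to follow. A minimal audit sketch using requests and BeautifulSoup follows; the URL is a placeholder, and only the server-delivered HTML is inspected, so links injected by JavaScript will not appear here:

```python
# Minimal sketch: flag links in the static HTML that bots are unlikely to follow.
# Only the server-delivered HTML is inspected; JS-injected links won't show up here.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"  # placeholder URL
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for a in soup.find_all("a"):
    href = a.get("href")
    if not href or href == "#" or href.startswith("javascript:"):
        # Anchor relies on JS (e.g. an onclick handler) or has no real destination
        print("Not crawlable:", a.get_text(strip=True) or a)
    else:
        print("Crawlable:", href)
```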
Below are key guidelines to the optimization of JS-websites, to enable the usage of these frameworks while keeping the search engine bots happy.
2. Client-side and server-side rendering: The best “frenemies”
Probably the most important piece of knowledge SEOs need when coping with JS-powered websites is the concept of client-side versus server-side rendering.
Understanding the differences, benefits, and disadvantages of both is critical to deploying the right SEO strategy and not getting lost when speaking with software engineers (who ultimately are the ones in charge of implementing that strategy).
Let’s look at how Googlebot crawls and indexes pages, putting it as a very basic sequential process:
1. The client (web browser) places several requests to the server, in order to download all the necessary information that will eventually display the page. Usually, the very first request concerns the static HTML document.
2. The CSS and JS files, referred to by the HTML document, are then downloaded: these are the styles, scripts and services that contribute to generating the page.
3. The Website Rendering Service (WRS) parses and executes the JavaScript (which can manage all or part of the content or just a simple functionality). This JavaScript can be served to the bot in two different ways:
Client-side: all the work is basically "outsourced" to the WRS, which is now in charge of loading all the scripts and necessary libraries to render that content. The advantage for the server is that when a real user requests the page, it saves a lot of resources, as the execution of the scripts happens on the browser side.
Server-side: everything is pre-cooked (aka rendered) by the server, and the final result is sent to the bot, ready for crawling and indexing. The disadvantage here is that all the work is carried out internally by the server, and not externalized to the client, which can lead to additional delays in the rendering of further requests.
4. Caffeine (Google’s indexer) indexes the content found
5. New links are discovered within the content for further crawling
This is the theory, but in the real world, Google doesn’t have infinite resources and has to do some prioritization in the crawling.
3. How Google actually crawls websites
Google is a very smart search engine with very smart crawlers.
However, it usually adopts a reactive approach when it comes to new technologies applied to web development. This means that it is Google and its bots that need to adapt to the new frameworks as they become more and more popular (which is the case with JavaScript).
For this reason, the way Google crawls JS-powered websites is still far from perfect, with blind spots that SEOs and software engineers need to mitigate somehow.
This is in a nutshell how Google actually crawls these sites:
The above graph was shared by Tom Greenaway at the Google I/O 2018 conference, and what it basically says is: if you have a site that relies heavily on JavaScript, you'd better load the JS content very quickly, otherwise Google will not be able to render it (hence index it) during the first wave, and it will be postponed to a second wave, which no one knows when may occur.
Therefore, your client-side rendered content based on JavaScript will probably be rendered by the bots in the second wave, because during the first wave they will load your server-side content, which should be fast enough. But they don’t want to spend too many resources and take on too many tasks.
In Tom Greenaway’s words:
“The rendering of JavaScript powered websites in Google Search is deferred until Googlebot has resources available to process that content.”
The implications for SEO are huge: your content may not be discovered until one, two, or even five weeks later, and in the meantime only your content-less page would be assessed and ranked by the algorithm.
What an SEO should be most worried about at this point is this simple equation:
No content is found = Content is (probably) hardly indexable
And how would a content-less page rank? Easy to guess for any SEO.
So far so good. The next step is learning if the content is rendered client-side or server-side (without asking software engineers).
4. How to detect client-side rendered content
Option one: The Document Object Model (DOM)
There are several ways to know it, and for this, we need to introduce the concept of DOM.
The Document Object Model defines the structure of an HTML (or an XML) document, and how such documents can be accessed and manipulated.
In SEO and software engineering we usually refer to the DOM as the final HTML document rendered by the browser, as opposed to the original static HTML document that lives in the server.
You can think of the HTML as the trunk of a tree. You can add branches, leaves, flowers, and fruits to it (that is the DOM).
What JavaScript does is manipulate the HTML and create an enriched DOM that adds up functionalities and content.
In practice, you can check the static HTML by pressing “Ctrl+U” on any page you are looking at, and the DOM by “Inspecting” the page once it’s fully loaded.
Most of the time, for modern websites, you will see that the two documents are quite different.
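You can script a crude version of this check as well: fetch the static HTML and test whether a phrase you can see on the rendered page is present in it. A minimal sketch follows, with the URL and phrase as placeholders; if the phrase is missing from the raw HTML, that content is probably rendered client-side:

```python
# Minimal sketch: check whether a known piece of on-page content exists in the
# static HTML. If it is missing, it is probably injected client-side by JavaScript.
import requests

url = "https://example.com/product-page"    # placeholder URL
phrase = "Free shipping on all orders"      # text visible on the rendered page

static_html = requests.get(url, timeout=10).text

if phrase.lower() in static_html.lower():
    print("Phrase found in static HTML: likely rendered server-side.")
else:
    print("Phrase missing from static HTML: likely rendered client-side.")
```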
Option two: JS-free Chrome profile 
Create a new profile in Chrome and disallow JavaScript through the content settings (access them directly at chrome://settings/content).
Any URL you browse with this profile will not load any JS content. Therefore, any blank spot in your page identifies a piece of content that is served client-side.
Option three: Fetch as Google in Google Search Console
Provided that your website is registered in Google Search Console (I can’t think of any good reason why it wouldn’t be), use the “Fetch as Google” tool in the old version of the console. This will return a rendering of how Googlebot sees the page and a rendering of how a normal user sees it. Many differences there?
Option four: Run Chrome version 41 in headless mode (Chromium) 
Google officially stated in early 2018 that they use an older version of Chrome (specifically version 41, which anyone can download from here) in headless mode to render websites. The main implication is that a page that doesn’t render well in that version of Chrome can be subject to some crawling-oriented problems.
Option five: Crawl the page on Screaming Frog using Googlebot
Do this with the JavaScript rendering option disabled, and check whether the content and meta-content are rendered correctly for the bot.
After all these checks, still ask your software engineers, because you don't want to leave any loose ends.
5. The solutions: Hybrid rendering and dynamic rendering
Asking a software engineer to roll back a piece of great development work because it hurts SEO can be a difficult task.
It happens frequently that SEOs are not involved in the development process, and they are called in only when the whole infrastructure is in place.
We SEOs should all work on improving our relationship with software engineers and make them aware of the huge implications that any innovation can have on SEO.
This is how a problem like content-less pages can be avoided from the get-go. The solution resides on two approaches.
Hybrid rendering
Also known as Isomorphic JavaScript, this approach aims to minimize the need for client-side rendering, and it doesn’t differentiate between bots and real users.
Hybrid rendering suggests the following:
On one hand, all the non-interactive code (including all the JavaScript) is executed server-side in order to render static pages. All the content is visible to both crawlers and users when they access the page.
On the other hand, only user-interactive resources are then run by the client (the browser). This benefits the page load speed as less client-side rendering is needed.
Dynamic rendering
This approach aims to detect requests placed by a bot vs the ones placed by a user and serves the page accordingly.
If the request comes from a user: The server delivers the static HTML and makes use of client-side rendering to build the DOM and render the page.
If the request comes from a bot: The server pre-renders the JavaScript through an internal renderer (such as Puppeteer), and delivers the new static HTML (the DOM, manipulated by the JavaScript) to the bot.
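Conceptually, that routing logic is just a user-agent check in front of your rendering pipeline. Below is a minimal Flask sketch of the idea; the bot list, the pre-render cache, and the app shell markup are illustrative placeholders, and in production the cache would be filled by a real renderer such as Puppeteer or a rendering service:

```python
# Minimal sketch of dynamic rendering: pre-rendered HTML for bots, the normal
# client-side app shell for everyone else. The bot list, cache, and app shell
# below are illustrative placeholders, not any specific product's API.
from flask import Flask, request

app = Flask(__name__)

BOT_SIGNATURES = ("googlebot", "bingbot", "baiduspider", "yandex")
PRERENDER_CACHE = {}  # path -> pre-rendered HTML, filled by your render pipeline

APP_SHELL = "<html><body><div id='root'></div><script src='/bundle.js'></script></body></html>"


def is_bot(user_agent: str) -> bool:
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in BOT_SIGNATURES)


@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def serve(path):
    if is_bot(request.headers.get("User-Agent")):
        # Bots get the fully rendered HTML (the DOM after JavaScript has run)
        prerendered = PRERENDER_CACHE.get(path)
        if prerendered:
            return prerendered
    # Real users get the app shell and render client-side as usual
    return APP_SHELL
```

Keep in mind that the pre-rendered HTML served to bots should match what users eventually see; serving materially different content would amount to cloaking.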
The best of both worlds
Combining the two solutions can also provide great benefit to both users and bots.
Use hybrid rendering if the request comes from a user
Use server-side rendering if the request comes from a bot
Conclusion
As the use of JavaScript in modern websites grows every day through many light and easy frameworks, asking software engineers to rely solely on HTML to please search engine bots is neither realistic nor feasible.
However, the SEO issues raised by client-side rendering solutions can be successfully tackled in different ways using hybrid rendering and dynamic rendering.
Knowing the technology available, your website infrastructure, your engineers, and the solutions at hand can help guarantee the success of your SEO strategy, even in complicated environments such as JavaScript-powered websites.
Giorgio Franco is a Senior Technical SEO Specialist at Vistaprint.
The post A survival kit for SEO-friendly JavaScript websites appeared first on Search Engine Watch.
from Digital Marketing News https://searchenginewatch.com/2019/04/09/a-survival-kit-for-seo-friendly-javascript-websites/
bambiguertinus · 6 years ago
Text
Keyword Not Provided, But it Just Clicks
When SEO Was Easy
When I got started on the web over 15 years ago I created an overly broad & shallow website that had little chance of making money because it was utterly undifferentiated and crappy. In spite of my best (worst?) efforts while being a complete newbie, sometimes I would go to the mailbox and see a check for a couple hundred or a couple thousand dollars come in. My old roommate & I went to Coachella & when the trip was over I returned to a bunch of mail to catch up on & realized I had made way more while not working than what I spent on that trip.
What was the secret to a total newbie making decent income by accident?
Horrible spelling.
Back then search engines were not as sophisticated with their spelling correction features & I was one of 3 or 4 people in the search index that misspelled the name of an online casino the same way many searchers did.
The high-minded excuse for why I did not scale that would be claiming I knew it was a temporary trick that was somehow beneath me. The more accurate reason would be thinking in part it was a lucky fluke rather than thinking in systems. If I were clever at the time I would have created the misspeller's guide to online gambling, though I think I was just so excited to make anything from the web that I perhaps lacked the ambition & foresight to scale things back then.
In the decade that followed I had a number of other lucky breaks like that. One time one of the original internet bubble companies that managed to stay around put up a sitewide footer link targeting the concept that one of my sites made decent money from. This was just before the great recession, before Panda existed. The concept they targeted had 3 or 4 ways to describe it. 2 of them were very profitable & if they targeted either of the most profitable versions with that page the targeting would have sort of carried over to both. They would have outranked me if they targeted the correct version, but they didn't so their mistargeting was a huge win for me.
Search Gets Complex
Search today is much more complex. In the years since those easy-n-cheesy wins, Google has rolled out many updates which aim to feature sought-after destination sites while diminishing the sites which rely on "one simple trick" to rank.
Arguably the quality of the search results has improved significantly as search has become more powerful, more feature rich & has layered in more relevancy signals.
Many quality small web publishers have gone away due to some combination of increased competition, algorithmic shifts & uncertainty, and reduced monetization as more ad spend was redirected toward Google & Facebook. But the impact as felt by any given publisher is not the impact as felt by the ecosystem as a whole. Many terrible websites have also gone away, while some formerly obscure though higher-quality sites rose to prominence.
There was the Vince update in 2009, which boosted the rankings of many branded websites.
Then in 2011 there was Panda as an extension of Vince, which tanked the rankings of many sites that published hundreds of thousands or millions of thin content pages while boosting the rankings of trusted branded destinations.
Then there was Penguin, which was a penalty that hit many websites which had heavily manipulated or otherwise aggressive appearing link profiles. Google felt there was a lot of noise in the link graph, which was their justification for the Penguin.
There were updates which lowered the rankings of many exact match domains. And then increased ad load in the search results along with the other above ranking shifts further lowered the ability to rank keyword-driven domain names. If your domain is generically descriptive then there is a limit to how differentiated & memorable you can make it if you are targeting the core market the keywords are aligned with.
There is a reason eBay is more popular than auction.com, Google is more popular than search.com, Yahoo is more popular than portal.com & Amazon is more popular than store.com or shop.com. When that winner-take-most dynamic in many online markets is coupled with the move away from using classic relevancy signals, the economics shift to where it makes a lot more sense to carry the heavy overhead of establishing a strong brand.
Branded and navigational search queries could be used in the relevancy algorithm stack to confirm the quality of a site & verify (or dispute) the veracity of other signals.
Historically relevant algo shortcuts become less appealing as they become less relevant to the current ecosystem & even less aligned with the future trends of the market. Add in negative incentives for pushing on a string (penalties on top of wasting the capital outlay) and a more holistic approach certainly makes sense.
Modeling Web Users & Modeling Language
PageRank was an attempt to model the random surfer.
When Google is pervasively monitoring most users across the web they can shift to directly measuring their behaviors instead of using indirect signals.
Years ago Bill Slawski wrote about the long click in which he opened by quoting Steven Levy's In the Plex: How Google Thinks, Works, and Shapes our Lives
"On the most basic level, Google could see how satisfied users were. To paraphrase Tolstoy, happy users were all the same. The best sign of their happiness was the "Long Click" — This occurred when someone went to a search result, ideally the top one, and did not return. That meant Google has successfully fulfilled the query."
Of course, there's a patent for that. In Modifying search result ranking based on implicit user feedback they state:
user reactions to particular search results or search result lists may be gauged, so that results on which users often click will receive a higher ranking. The general assumption under such an approach is that searching users are often the best judges of relevance, so that if they select a particular search result, it is likely to be relevant, or at least more relevant than the presented alternatives.
If you are a known brand you are more likely to get clicked on than a random unknown entity in the same market.
And if you are something people are specifically seeking out, they are likely to stay on your website for an extended period of time.
One aspect of the subject matter described in this specification can be embodied in a computer-implemented method that includes determining a measure of relevance for a document result within a context of a search query for which the document result is returned, the determining being based on a first number in relation to a second number, the first number corresponding to longer views of the document result, and the second number corresponding to at least shorter views of the document result; and outputting the measure of relevance to a ranking engine for ranking of search results, including the document result, for a new search corresponding to the search query. The first number can include a number of the longer views of the document result, the second number can include a total number of views of the document result, and the determining can include dividing the number of longer views by the total number of views.
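Stripped of the patent language, the measure described there is a simple ratio per query-URL pair: longer views divided by total views. A toy sketch of the arithmetic follows; the 30-second threshold for a "long" view and the click-log format are illustrative assumptions, not values from the patent:

```python
# Toy sketch of the "longer views / total views" measure quoted above. The
# 30-second threshold and the click-log structure are illustrative assumptions.
from collections import defaultdict

LONG_VIEW_SECONDS = 30

clicks = [
    # (query, url, dwell_seconds)
    ("tennis court hire", "example.com/booking-guide", 95),
    ("tennis court hire", "example.com/booking-guide", 4),
    ("tennis court hire", "example.com/thin-page", 3),
]

views = defaultdict(lambda: {"long": 0, "total": 0})
for query, url, dwell in clicks:
    views[(query, url)]["total"] += 1
    if dwell >= LONG_VIEW_SECONDS:
        views[(query, url)]["long"] += 1

for (query, url), counts in views.items():
    measure = counts["long"] / counts["total"]
    print(f"{query!r} -> {url}: long-click ratio {measure:.2f}")
```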
Attempts to manipulate such data may not work.
safeguards against spammers (users who generate fraudulent clicks in an attempt to boost certain search results) can be taken to help ensure that the user selection data is meaningful, even when very little data is available for a given (rare) query. These safeguards can include employing a user model that describes how a user should behave over time, and if a user doesn't conform to this model, their click data can be disregarded. The safeguards can be designed to accomplish two main objectives: (1) ensure democracy in the votes (e.g., one single vote per cookie and/or IP for a given query-URL pair), and (2) entirely remove the information coming from cookies or IP addresses that do not look natural in their browsing behavior (e.g., abnormal distribution of click positions, click durations, clicks_per_minute/hour/day, etc.). Suspicious clicks can be removed, and the click signals for queries that appear to be spammed need not be used (e.g., queries for which the clicks feature a distribution of user agents, cookie ages, etc. that do not look normal).
And just like Google can make a matrix of documents & queries, they could also choose to put more weight on search accounts associated with topical expert users based on their historical click patterns.
Moreover, the weighting can be adjusted based on the determined type of the user both in terms of how click duration is translated into good clicks versus not-so-good clicks, and in terms of how much weight to give to the good clicks from a particular user group versus another user group. Some user's implicit feedback may be more valuable than other users due to the details of a user's review process. For example, a user that almost always clicks on the highest ranked result can have his good clicks assigned lower weights than a user who more often clicks results lower in the ranking first (since the second user is likely more discriminating in his assessment of what constitutes a good result). In addition, a user can be classified based on his or her query stream. Users that issue many queries on (or related to) a given topic T (e.g., queries related to law) can be presumed to have a high degree of expertise with respect to the given topic T, and their click data can be weighted accordingly for other queries by them on (or related to) the given topic T.
Google was using click data to drive their search rankings as far back as 2009. David Naylor was perhaps the first person who publicly spotted this. Google was ranking Australian websites for [tennis court hire] in the UK & Ireland, in part because that is where most of the click signal came from. That phrase was most widely searched for in Australia. In the years since Google has done a better job of geographically isolating clicks to prevent things like the problem David Naylor noticed, where almost all search results in one geographic region came from a different country.
Whenever SEOs mention using click data to search engineers, the search engineers quickly respond about how they might consider any signal but clicks would be a noisy signal. But if a signal has noise an engineer would work around the noise by finding ways to filter the noise out or combine multiple signals. To this day Google states they are still working to filter noise from the link graph: "We continued to protect the value of authoritative and relevant links as an important ranking signal for Search."
The site with millions of inbound links, few intentional visits & those who do visit quickly click the back button (due to a heavy ad load, poor user experience, low quality content, shallow content, outdated content, or some other bait-n-switch approach)...that's an outlier. Preventing those sorts of sites from ranking well would be another way of protecting the value of authoritative & relevant links.
Best Practices Vary Across Time & By Market + Category
Along the way, concurrent with the above sorts of updates, Google also improved their spelling auto-correct features, auto-completed search queries for many years through a feature called Google Instant (though they later undid forced query auto-completion while retaining automated search suggestions), and then they rolled out a few other algorithms that further allowed them to model language & user behavior.
Today it would be much harder to get paid above median wages explicitly for sucking at basic spelling or scaling some other individual shortcut to the moon, like pouring millions of low quality articles into a (formerly!) trusted domain.
Nearly a decade after Panda, eHow's rankings still haven't recovered.
Back when I got started with SEO the phrase Indian SEO company was associated with cut-rate work where people were buying exclusively based on price. Sort of like a "I got a $500 budget for link building, but can not under any circumstance invest more than $5 in any individual link." Part of how my wife met me was she hired a hack SEO from San Diego who outsourced all the work to India and marked the price up about 100-fold while claiming it was all done in the United States. He created reciprocal links pages that got her site penalized & it didn't rank until after she took her reciprocal links page down.
With that sort of behavior widespread (hack US firm teaching people working in an emerging market poor practices), it likely meant many SEO "best practices" which were learned in an emerging market (particularly where the web was also underdeveloped) would be more inclined to being spammy. Considering how far ahead many Western markets were on the early Internet & how India has so many languages & how most web usage in India is based on mobile devices where it is hard for users to create links, it only makes sense that Google would want to place more weight on end user data in such a market.
If you set your computer location to India Bing's search box lists 9 different languages to choose from.
The above is not to state anything derogatory about any emerging market, but rather that various signals are stronger in some markets than others. And competition is stronger in some markets than others.
Search engines can only rank what exists.
"In a lot of Eastern European - but not just Eastern European markets - I think it is an issue for the majority of the [bream? muffled] countries, for the Arabic-speaking world, there just isn't enough content as compared to the percentage of the Internet population that those regions represent. I don't have up to date data, I know that a couple years ago we looked at Arabic for example and then the disparity was enormous. so if I'm not mistaken the Arabic speaking population of the world is maybe 5 to 6%, maybe more, correct me if I am wrong. But very definitely the amount of Arabic content in our index is several orders below that. So that means we do not have enough Arabic content to give to our Arabic users even if we wanted to. And you can exploit that amazingly easily and if you create a bit of content in Arabic, whatever it looks like we're gonna go you know we don't have anything else to serve this and it ends up being horrible. and people will say you know this works. I keyword stuffed the hell out of this page, bought some links, and there it is number one. There is nothing else to show, so yeah you're number one. the moment somebody actually goes out and creates high quality content that's there for the long haul, you'll be out and that there will be one." - Andrey Lipattsev – Search Quality Senior Strategist at Google Ireland, on Mar 23, 2016
Impacting the Economics of Publishing
Now search engines can certainly influence the economics of various types of media. At one point some otherwise credible media outlets were pitching the Demand Media IPO narrative that Demand Media was the publisher of the future & what other media outlets would look like. Years later, after heavily squeezing the partner network & promoting programmatic advertising that reduces CPMs by the day, Google is funding partnerships with multiple news publishers like McClatchy & Gatehouse to try to revive the news dead zones even Facebook is struggling with.
"Facebook Inc. has been looking to boost its local-news offerings since a 2017 survey showed most of its users were clamoring for more. It has run into a problem: There simply isn’t enough local news in vast swaths of the country. ... more than one in five newspapers have closed in the past decade and a half, leaving half the counties in the nation with just one newspaper, and 200 counties with no newspaper at all."
As mainstream newspapers continue laying off journalists, Facebook's news efforts are likely to continue failing unless they include direct economic incentives, as Google's programmatic ad push broke the banner ad:
"Thanks to the convoluted machinery of Internet advertising, the advertising world went from being about content publishers and advertising context—The Times unilaterally declaring, via its ‘rate card’, that ads in the Times Style section cost $30 per thousand impressions—to the users themselves and the data that targets them—Zappo’s saying it wants to show this specific shoe ad to this specific user (or type of user), regardless of publisher context. Flipping the script from a historically publisher-controlled mediascape to an advertiser (and advertiser intermediary) controlled one was really Google’s doing. Facebook merely rode the now-cresting wave, borrowing outside media’s content via its own users’ sharing, while undermining media’s ability to monetize via Facebook’s own user-data-centric advertising machinery. Conventional media lost both distribution and monetization at once, a mortal blow."
Google is offering news publishers audience development & business development tools.
Heavy Investment in Emerging Markets Quickly Evolves the Markets
As the web grows rapidly in India, they'll have a thousand flowers bloom. In 5 years the competition in India & other emerging markets will be much tougher as those markets continue to grow rapidly. Media is much cheaper to produce in India than it is in the United States. Labor costs are lower & they never had the economic albatross that is the ACA adversely impact their economy. At some point the level of investment & increased competition will mean early techniques stop having as much efficacy. Chinese companies are aggressively investing in India.
“If you break India into a pyramid, the top 100 million (urban) consumers who think and behave more like Americans are well-served,” says Amit Jangir, who leads India investments at 01VC, a Chinese venture capital firm based in Shanghai. The early stage venture firm has invested in micro-lending firms FlashCash and SmartCoin based in India. The new target is the next 200 million to 600 million consumers, who do not have a go-to entertainment, payment or ecommerce platform yet— and there is gonna be a unicorn in each of these verticals, says Jangir, adding that it will be not be as easy for a player to win this market considering the diversity and low ticket sizes.
RankBrain
RankBrain appears to be based on using user clickpaths on head keywords to help bleed rankings across into related searches which are searched less frequently. A Googler didn't state this specifically, but it is how they would be able to use models of searcher behavior to refine search results for keywords which are rarely searched for.
In a recent interview in Scientific American a Google engineer stated: "By design, search engines have learned to associate short queries with the targets of those searches by tracking pages that are visited as a result of the query, making the results returned both faster and more accurate than they otherwise would have been."
Now a person might go out and try to search for something a bunch of times or pay other people to search for a topic and click a specific listing, but some of the related Google patents on using click data (which keep getting updated) mentioned how they can discount or turn off the signal if there is an unnatural spike of traffic on a specific keyword, or if there is an unnatural spike of traffic heading to a particular website or web page.
And, since Google is tracking the behavior of end users on their own website, anomalous behavior is easier to track than it is tracking something across the broader web where signals are more indirect. Google can take advantage of their wide distribution of Chrome & Android where users are regularly logged into Google & pervasively tracked to place more weight on users where they had credit card data, a long account history with regular normal search behavior, heavy Gmail users, etc.
Plus there is a huge gap between the cost of traffic & the ability to monetize it. You might have to pay someone a dime or a quarter to search for something & there is no guarantee it will work on a sustainable basis even if you paid hundreds or thousands of people to do it. Any of those experimental searchers will have no lasting value unless they influence rank, but even if they do influence rankings it might only last temporarily. If you bought a bunch of traffic into something genuine Google searchers didn't like then even if it started to rank better temporarily the rankings would quickly fall back if the real end user searchers disliked the site relative to other sites which already rank.
This is part of the reason why so many SEO blogs mention brand, brand, brand. If people are specifically looking for you in volume & Google can see that thousands or millions of people specifically want to access your site then that can impact how you rank elsewhere.
Even looking at something inside the search results for a while (dwell time) or quickly skipping over it to have a deeper scroll depth can be a ranking signal. Some Google patents mention how they can use mouse pointer location on desktop or scroll data from the viewport on mobile devices as a quality signal.
Neural Matching
Last year Danny Sullivan mentioned how Google rolled out neural matching to better understand the intent behind a search query.
This is a look back at a big change in search but which continues to be important: understanding synonyms. How people search is often different from information that people write solutions about. pic.twitter.com/sBcR4tR4eT— Danny Sullivan (@dannysullivan) September 24, 2018
Last few months, Google has been using neural matching, --AI method to better connect words to concepts. Super synonyms, in a way, and impacting 30% of queries. Don't know what "soapopera effect" is to search for it? We can better figure it out. pic.twitter.com/Qrwp5hKFNz— Danny Sullivan (@dannysullivan) September 24, 2018
The above Tweets capture what the neural matching technology intends to do. Google also stated:
we’ve now reached the point where neural networks can help us take a major leap forward from understanding words to understanding concepts. Neural embeddings, an approach developed in the field of neural networks, allow us to transform words to fuzzier representations of the underlying concepts, and then match the concepts in the query with the concepts in the document. We call this technique neural matching.
To help people understand the difference between neural matching & RankBrain, Google told SEL: "RankBrain helps Google better relate pages to concepts. Neural matching helps Google better relate words to searches."
There are a couple research papers on neural matching.
The first one was titled A Deep Relevance Matching Model for Ad-hoc Retrieval. It mentioned using Word2vec & here are a few quotes from the research paper
"Successful relevance matching requires proper handling of the exact matching signals, query term importance, and diverse matching requirements."
"the interaction-focused model, which first builds local level interactions (i.e., local matching signals) between two pieces of text, and then uses deep neural networks to learn hierarchical interaction patterns for matching."
"according to the diverse matching requirement, relevance matching is not position related since it could happen in any position in a long document."
"Most NLP tasks concern semantic matching, i.e., identifying the semantic meaning and infer"ring the semantic relations between two pieces of text, while the ad-hoc retrieval task is mainly about relevance matching, i.e., identifying whether a document is relevant to a given query."
"Since the ad-hoc retrieval task is fundamentally a ranking problem, we employ a pairwise ranking loss such as hinge loss to train our deep relevance matching model."
The paper mentions how semantic matching falls down when compared against relevancy matching because:
semantic matching relies on similarity matching signals (some words or phrases with the same meaning might be semantically distant), compositional meanings (matching sentences more than meaning) & a global matching requirement (comparing things in their entirety instead of looking at the best matching part of a longer document); whereas,
relevance matching can put significant weight on exact matching signals (weighting an exact match higher than a near match), adjust weighting on query term importance (one word might or phrase in a search query might have a far higher discrimination value & might deserve far more weight than the next) & leverage diverse matching requirements (allowing relevancy matching to happen in any part of a longer document)
Here are a couple images from the above research paper
And then the second research paper is Deep Relevancy Ranking Using Enhanced Document-Query Interactions, which states:
"interaction-based models are less efficient, since one cannot index a document representation independently of the query. This is less important, though, when relevancy ranking methods rerank the top documents returned by a conventional IR engine, which is the scenario we consider here."
That same sort of re-ranking concept is being better understood across the industry. There are ranking signals that earn some base level ranking, and then results get re-ranked based on other factors like how well a result matches the user intent.
Here are a couple images from the above research paper.
For those who hate the idea of reading research papers or patent applications, Martinibuster also wrote about the technology here. About the only part of his post I would debate is this one:
"Does this mean publishers should use more synonyms? Adding synonyms has always seemed to me to be a variation of keyword spamming. I have always considered it a naive suggestion. The purpose of Google understanding synonyms is simply to understand the context and meaning of a page. Communicating clearly and consistently is, in my opinion, more important than spamming a page with keywords and synonyms."
I think one should always consider user experience over other factors; however, a person could still use variations throughout the copy & pick up a bit more traffic without coming across as spammy. Danny Sullivan mentioned the super synonym concept was impacting 30% of search queries, so there are still many queries which may only be accessible to those who use a specific phrase on their page.
Martinibuster also wrote another blog post tying more research papers & patents to the above. You could probably spend a month reading all the related patents & research papers.
The above sort of language modeling & end user click feedback complement links-based ranking signals in a way that makes it much harder to luck one's way into any form of success by being a terrible speller or just bombing away at link manipulation without much concern for any other aspect of the user experience or the market you operate in.
Pre-penalized Shortcuts
Google was even issued a patent for predicting site quality based upon the N-grams used on the site & comparing those against the N-grams used on other established sites where quality has already been scored via other methods: "The phrase model can be used to predict a site quality score for a new site; in particular, this can be done in the absence of other information. The goal is to predict a score that is comparable to the baseline site quality scores of the previously-scored sites."
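As a rough sketch of how that could work (my own illustration, not Google's implementation), you could extract N-grams from a new site and predict its quality from the scores of already-rated sites whose phrase usage it most resembles:

```python
from collections import Counter

def ngrams(text, n=2):
    words = text.lower().split()
    return Counter(zip(*[words[i:] for i in range(n)]))

def similarity(a, b):
    # Jaccard overlap between two n-gram frequency profiles.
    union = sum((a | b).values())
    return sum((a & b).values()) / union if union else 0.0

def predict_quality(new_site_text, scored_sites):
    """scored_sites: list of (site_text, known_quality_score) pairs."""
    profile = ngrams(new_site_text)
    weighted, total_weight = 0.0, 0.0
    for text, score in scored_sites:
        w = similarity(profile, ngrams(text))
        weighted += w * score
        total_weight += w
    return weighted / total_weight if total_weight else None

# A brand new site whose phrasing closely resembles known low-quality sites
# inherits a low predicted score before any other signal about it exists.
```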
Have you considered using a PLR package to generate the shell of your site's content? Good luck with that, as some sites trying that shortcut might be pre-penalized from birth.
Navigating the Maze
When I started in SEO, one of my friends had a dad who was vastly smarter than I am. He advised me that Google engineers were smarter, had more capital, more exposure, more data, and so on ... and thus SEO was ultimately going to be a malinvestment.
Back then he was at least partially wrong because influencing search was so easy.
But in the current market, 16 years later, we are near the inflection point where he would finally be right.
At some point the shortcuts stop working & it makes sense to try a different approach.
The flip side of all the above changes is that as the algorithms have become more complex, they have gone from being a headwind to people ignorant about SEO to being a tailwind to those who do not focus excessively on SEO in isolation.
If one is a dominant voice in a particular market, if they break industry news, if they have key exclusives, if they spot & name the industry trends, if their site becomes a must-read & is what amounts to a habit ... then they perhaps become viewed as an entity. Entity-related signals then help them, and the signals working against people who might have merely lucked into a bit of success become a tailwind rather than a headwind.
If your work defines your industry, then any efforts to model entities, user behavior or the language of your industry are going to boost your work on a relative basis.
This requires sites to publish frequently enough to be a habit, or publish highly differentiated content which is strong enough that it is worth the wait.
Those which publish frequently without being particularly differentiated are almost guaranteed to eventually walk into a penalty of some sort. And each additional person who reads marginal, undifferentiated content (particularly if it has an ad-heavy layout) brings that site one visitor closer to eventually getting whacked. Success becomes self-regulating. Any short-term success becomes self-defeating if one has a highly opportunistic short-term focus.
Those who write content that only they could write are more likely to have sustained success.
A mistake people often make is to look at someone successful, then try to do what they are doing, assuming it will lead to similar success. This is backward. Find something you enjoy doing & are curious about. Get obsessed, & become one of the best at it. It will monetize itself. — Neil Strauss (@neilstrauss) March 30, 2019
from Digital Marketing News http://www.seobook.com/keyword-not-provided-it-just-clicks
0 notes
bambiguertinus · 6 years ago
Text
Five tools for audience research on a tiny budget
When starting out a digital marketing program, you might not yet have a lot of internal data that helps you understand your target consumer. You might also have smaller budgets that do not allow for a large amount of audience research.
So do you start throwing darts with your marketing? No way.
It is critical to understand your target consumer to expand your audiences and segment them intelligently to engage them with effective messaging and creatives. Even at a limited budget, you have a few tools that can help you understand your target audience and the audience that you want to reach. We will walk through a few of these tools in further detail below.
Five tools for audience research on a budget
Tool #1 – In-platform insights (LinkedIn)
If you already have a LinkedIn Ads account, you have a great place to gain insights on your target consumer, especially if you are a B2B lead generation business.
In order to pull data on your target market, you must place the LinkedIn Insight Tag on your site.
Once the tag has been placed, you will be able to start pulling audience data, which can be found on the website demographics tab. The insights provided include location, country, job function, job title, company, company industry, job seniority, and company size. You can look at the website as a whole or view specific pages on the site by creating website audiences. You can also compare the different audiences that you have created.
Tool #2 – In-platform insights (Facebook)
Facebook’s Audience Insights tool allows you to gain more information about the audience interacting with your page. It also shows you the people interested in your competitors’ pages.
You can see a range of information about people currently interacting with your page by selecting “People connected to your page.”
To find out information about the users interacting with competitor pages, select “Interests” and type the competitor page or pages. The information that you can view includes age and gender, relationship status, education level, job title, page likes, location (cities, countries, and languages), and device used.
Tool #3 – In-platform insights (Google Customer Match)
Google Customer Match is a great way to get insights on your customers if you have not yet run paid search or social campaigns.
You can load in a customer email list and see data on your customers, including details like gender, age, parental status, location, and relevant Google Audiences (in-market audiences and affinity audiences). These are great options to layer onto your campaigns to gain more data and potentially bid up on these users, or to target and bid in a separate campaign to stay competitive on broader terms that might otherwise be too expensive.
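If you do build a Customer Match list, Google's documented upload format expects email addresses to be normalized and hashed with SHA-256. A minimal preprocessing sketch, assuming a hypothetical CSV export with an "email" column:

```python
import csv
import hashlib

def normalize_and_hash(email):
    # Trim whitespace, lowercase, then SHA-256 hash, per the Customer Match upload format.
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# "customers.csv" is a hypothetical CRM export with an "email" column.
with open("customers.csv", newline="") as src, open("hashed_upload.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    writer.writerow(["Email"])
    for row in csv.DictReader(src):
        writer.writerow([normalize_and_hash(row["email"])])
```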
Tool #4 – External insights (competitor research)
There are a few tools that help you conduct competitor research in paid search and paid social outside of the engines and internal data sources.
SEMrush and SpyFu are great for understanding which search queries you are showing up for organically. These tools also allow you to do some competitive research to see what keywords competitors are bidding on, what their ad copy looks like, and which queries they rank for organically.
All of these will help you understand how your target consumer is interacting with your brand on the SERP.
MOAT and AdEspresso are great tools to gain insights into how your competition portrays their brand on the Google Display Network (GDN) and Facebook. These tools will show you the ads that are currently running on GDN and Facebook, allowing you to further understand messaging and offers that are being used.
Tool #5 – Internal data sources
There might not be a large amount of data in your CRM system, but you can still glean customer insights.
Consider breaking down your data into different segments, including top customers, disqualified leads, highest AOV customers, and highest lifetime value customers. Once you define those segments, you can identify your most-desirable and least-desirable customer groups and bid/target accordingly.
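A minimal sketch of that kind of segmentation, assuming a hypothetical CRM export with one row per order (the file and column names here are illustrative):

```python
import pandas as pd

# Hypothetical CRM export: one row per order, with customer_id and order_value columns.
orders = pd.read_csv("crm_orders.csv")

per_customer = orders.groupby("customer_id")["order_value"].agg(
    lifetime_value="sum",
    avg_order_value="mean",
    order_count="count",
)

# Flag the top quartile by average order value and by lifetime value as
# separate segments you can export as audience lists and bid up on.
per_customer["high_aov"] = per_customer["avg_order_value"] >= per_customer["avg_order_value"].quantile(0.75)
per_customer["high_ltv"] = per_customer["lifetime_value"] >= per_customer["lifetime_value"].quantile(0.75)

print(per_customer.sort_values("lifetime_value", ascending=False).head())
```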
Conclusion
Whether you’re just starting a digital marketing program or want to take a step back to understand your target audience without the benefit of a big budget, you have options. Dig into the areas defined in this post, and make sure that however you’re segmenting your audiences, you’re creating ads and messaging that most precisely speak to those segments.
Lauren Crain is a Client Services Lead in 3Q Digital’s SMB division, 3Q Incubate.
The post Five tools for audience research on a tiny budget appeared first on Search Engine Watch.
from Digital Marketing News https://searchenginewatch.com/2019/04/08/five-tools-for-audience-research-on-a-tiny-budget/
0 notes
bambiguertinus · 6 years ago
Text
How The Internet Happened: From Netscape to the iPhone
Brian McCullough, who runs Internet History Podcast, also wrote a book named How The Internet Happened: From Netscape to the iPhone which did a fantastic job of capturing the ethos of the early web and telling the backstory of so many people & projects behind its evolution.
I think the quote which best captures the magic of the early web is:
Jim Clark came from the world of machines and hardware, where development schedules were measured in years—even decades—and where “doing a startup” meant factories, manufacturing, inventory, shipping schedules and the like. But the Mosaic team had stumbled upon something simpler. They had discovered that you could dream up a product, code it, release it to the ether and change the world overnight. Thanks to the Internet, users could download your product, give you feedback on it, and you could release an update, all in the same day. In the web world, development schedules could be measured in weeks.
The part I bolded in the above quote from the book really captures the magic of the Internet & what pulled so many people toward the early web.
The current web - dominated by never-ending feeds & a variety of closed silos - is a big shift from the early days of web comics & other underground cool stuff people created & shared because they thought it was neat.
Many established players missed the actual direction of the web by trying to create something more akin to the web of today before the infrastructure could support it. Many of the "big things" driving web adoption relied heavily on luck - combined with a lot of hard work & a willingness to be responsive to feedback & data.
Even when Marc Andreessen moved to the valley he thought he was late and he had "missed the whole thing," but he saw the relentless growth of the web & decided making another web browser was the play that made sense at the time.
Tim Berners-Lee was dismayed when Andreessen's web browser enabled embedded image support in web documents.
Early Amazon review features were originally for editorial content from Amazon itself. Bezos originally wanted to launch a broad-based Amazon like it is today, but realized it would be too capital-intensive, & focused on books at the start so he could sell a known commodity with a long tail. Amazon was initially built off leveraging 2 book distributors (Ingram and Baker & Taylor) & R. R. Bowker's Books In Print catalog. They also used clever hacks to meet distributor minimum-order requirements, like padding orders with out-of-stock books, so they effectively only had to order what customers had actually purchased.
eBay began as an /aw/ subfolder on the eBay domain name, which was hosted on a residential internet connection. Pierre Omidyar coded the auction service over Labor Day weekend in 1995. The domain had other sections focused on topics like ebola. It was switched from AuctionWeb to a standalone site only after the ISP started charging for a business line. It had no formal PayPal integration or anything like that; rather, when listings started to charge a commission, merchants would mail physical checks in to pay for the platform's share of their sales. Beanie Babies also helped skyrocket platform usage.
The reason AOL carpet bombed the United States with CDs - at their peak half of all CDs produced were AOL CDs - was their initial response rate was around 10%, a crazy number for untargeted direct mail.
Priceline was lucky to have survived the bubble as their idea was to spread broadly across other categories beyond travel & they were losing about $30 per airline ticket sold.
The broader web bubble left behind valuable infrastructure like unused fiber to fuel continued growth long after the bubble popped. The dot com bubble was possible in part because there was a secular bull market in bonds stemming back to the early 1980s & falling debt service payments increased financial leverage and company valuations.
TED members hissed at Bill Gross when he unveiled GoTo.com, which ranked "search" results based on advertiser bids.
Excite turned down offering the Google founders $1.6 million for the PageRank technology in part because, as Excite CEO George Bell recalled, Larry Page insisted: "If we come to work for Excite, you need to rip out all the Excite technology and replace it with [our] search." "And, ultimately, that's—in my recollection—where the deal fell apart," Bell said.
Steve Jobs initially disliked the multi-touch technology that mobile would rely on, one of the early iPhone prototypes had the iPod clickwheel, and Apple was against offering an app store in any form. Steve Jobs so loathed his interactions with the record labels that he did not want to build a phone & first licensed iTunes to Motorola, where they made the horrible ROKR phone. He only ended up building a phone after Cingular / AT&T begged him to.
Wikipedia was originally launched as a backup feeder site intended to feed content into Nupedia.
Even after Facebook had strong traction, Mark Zuckerberg kept working on other projects like a file sharing service. Facebook's news feed was publicly hated based on the complaints, but it almost instantly led to a doubling of usage of the site, so they never dumped it. After spreading from college to college, Facebook struggled to expand into other markets, & opening registration up to everyone was a Hail Mary move to see if it would rekindle growth instead of selling to Yahoo! for a billion dollars.
The book offers a lot of color to many important web related companies.
And many companies which were only briefly mentioned also ran into the same sort of lucky breaks the above companies did. PayPal was heavily reliant on eBay for initial distribution, but even that was something they initially tried to block until it became so obvious they stopped fighting it:
“At some point I sort of quit trying to stop the EBay users and mostly focused on figuring out how to not lose money,” Levchin recalls. ... In the late 2000s, almost a decade after it first went public, PayPal was drifting toward obsolescence and consistently alienating the small businesses that paid it to handle their online checkout. Much of the company’s code was being written offshore to cut costs, and the best programmers and designers had fled the company. ... PayPal’s conversion rate is lights-out: Eighty-nine percent of the time a customer gets to its checkout page, he makes the purchase. For other online credit and debit card transactions, that number sits at about 50 percent.
Here is a podcast interview of Brian McCullough by Chris Dixon.
How The Internet Happened: From Netscape to the iPhone is a great book well worth a read for anyone interested in the web.
Categories: 
book reviews
from Digital Marketing News http://www.seobook.com/how-internet-happened-netscape-iphone
0 notes