SEOs on Google e-commerce category recommendation: ‘I’ll stop doing it when it stops working.’ https://ift.tt/2SWtjTq
E-commerce SEOs often place relevant content about a category in the footer of their e-commerce category pages, hoping it will help the page rank in search. But Google’s John Mueller recently suggested in a webmaster hangout that this isn’t a good idea. He said “I’d try to stick to really informative content and put that in places where you think that users will be able to see it,” rather than throwing that content on the footer of the page.
John’s advice. John said this at the 7:18 mark of the video documenting the hangout. In short, he said it is best to have your content integrated into the category pages where the content is useful to the website visitor. “Things you could do here,” he said, “is kind of make sure that those pages are well-integrated with your website so that we have a clear context of how those pages should belong [to] the website and what those pages are about.”
He added, “another thing you can do is when you have that listing of products — make sure that there’s some information on those listings so that we can understand what the page is about.”
But do not just throw content in the footer of the page under the listings for the products.
But it works. Despite what Mueller said, SEOs are unlikely to be dissuaded from this practice. After I wrote about this on my personal blog, I heard from many SEOs who said that even if it’s not a recommended practice, it still works. In other words, they’ve observed that placing content on the footer of category pages on e-commerce sites does help those category pages rank for the keywords they are targeting.
One such SEO said, “I’ll stop doing it when it stops working.” That was a typical response we saw on social media in response to John Mueller’s recommendation.
Why it matters. SEO isn’t an exact science. What might work on one site might not work on another site. Also, the advice Google gives you might not end up working as expected on your site. The key here is to try different things, stay within Google’s webmaster guidelines and also make sure to build a site that your web site visitors will appreciate and convert on.
The video and transcript. Here is the video of John Mueller saying this, it starts at 7:18:
Here is the transcript: Many e-commerce websites optimize their categories by adding a big chunk of text under the product listings. Nothing except an h1 heading above the fold. I don’t consider this good usability since users need to scroll all the way down to read this. Does Google treat this content the same as any other or would you, for improving rankings, recommend putting the category text above the fold? So this is something that comes up fairly regularly.
One of the reasons why websites initially started kind of doing this kind of workaround is that it was really hard sometimes for us to rank category pages on ecommerce sites if there’s no useful information on that page, if there’s no context on that page. So as a workaround people started stuffing whole Wikipedia articles below the fold using a small font, sometimes using a link that says, like, “more information” that pops out this giant article of text.
And from our point of view that’s essentially keyword stuffing. So that’s something which I would try to avoid.
I’d try to stick to really informative content and put that in places where you think that users will be able to see it. Especially if it is content that you want to provide for users. And more than that, I would think about what you can do to make those pages rank well without having to put a giant paragraph of content below the page.
So things you could do here is kind of make sure that those pages are well integrated with your website so that we have clear context of how those pages belong to the website and what those pages are about. And another thing you can do is, when you have that listing of products, make sure that there’s some information on those listings so that we can understand what this page is about.
So instead of just listing I don’t know 40 photos of your product, put some text on there. Make sure that you have alt text for the images, that you have captions below the images. So that when we look at this page we understand oh there’s this big heading on top that’s telling us like this is the type of product that you have on your website and there’s lots of product information in those listings and we can follow those listings to find even more information so that you don’t need to put this giant block of text on the bottom. Obviously having some amount of text makes sense. So maybe shifting that giant block of text into maybe one or two sentences that you place above the fold below the heading is a good approach here because it also gives users a little bit more information about what they should expect on this page. So that’s kind of the direction I would head there.
I’d really try to avoid the situation where you’re kind of fudging a page by putting tons of text on the bottom of the page just because the rest of the page is suboptimal, and instead try to find ways to improve the page overall so that you don’t have to use this workaround.
Google drops mobile app properties from GSC, reminds site owners to remove G+ integrations https://ift.tt/2T5SQOb
This week, Google sent out notifications to webmasters, SEOs and developers through Google Search Console telling them that mobile app properties and Google+ web integrations will no longer be supported.
Mobile app properties.
In 2015, Google added the ability to include your mobile apps, specifically Android apps, in Google Search Console. Starting in 2017, Google began moving that functionality from Google Search Console to Firebase. And, as expected, several GSC features are shutting down in March, and one of those features is mobile app property support.
Google sent an email to those who have app properties in Search Console, telling them that the properties will no longer be supported as of late March.
The mobile app email notification.
The email says “You are the owner of one or more mobile app properties in Search Console. At the end of March, 2019, Search Console will end support for app properties.”
Google+ web integrations. 
Google+ is closing down soon, at least the consumer version of it, and, with that, the Google+ social sharing buttons will stop working. Google sent notifications to publishers that have Google+ buttons or any web integrations with Google+, telling them that they should remove those integrations by March 7, 2019.
The Google+ notification email.
 Here is a copy of the email that says “Google has announced that Google+ for consumers will be sunset. As part of the sunset, all Google+ web integrations will stop serving on March 7, 2019. This has the potential to affect website layouts and/or functionality if no action is taken by website owners.”
Why it matters. 
If you have any mobile app data within Google Search Console, you will want to export as much of it as you can before it gets shut down. In addition, you may want to consider integrating your apps with Google’s Firebase platform to get more functionality and data than you were able to get within Google Search Console.
In addition, if you are using Google+ on your web sites, you will want to remove those dependencies by March 7.
How reputation became a major ranking factor in SEO https://ift.tt/2T5ychd
Over 10 years ago, I predicted that Google could use quality scoring for organic rankings, and I also proposed a number of ways they could quantify the quality of websites and specific factors that could be vital for this. The core algorithm updates and Medic Update of the past year, along with the publication of the Quality Rating Guidelines, largely indicate that a business’s reputation is also key. If you’re interested in how this all may function, read on.
I first predicted Google might apply a Quality Score to organic search in 2007, and in subsequent years I highlighted the need for: About Us pages so people can know who is behind a website; good Contact Us pages to instill trust; good usability and user-experience; copyright statements; and good spelling and grammar. I have predicted enough of these factors to still see the direction of where the algorithm evolution has been heading.
The most recent edition of Google’s “Quality Evaluators Guidelines”, also called “Quality Rating Guidelines” (“QRG”) is nearly a script of my past recommendations involving quality factors.
Google Quality Rating guidelines seem relevant to recent updates
It seems clear that quality factors, such as those listed in the QRG, have become more influential in Google search rankings.
You may already be saying, “oh, no – he is barking up the wrong tree about the Quality Guidelines,” since others have written about them, too, ever since they were released back in July (as well as about earlier “leaked” releases), and Googlers have stated that the raters’ scores are not used directly in websites’ rankings. Notably, Jennifer Slegg reported that Danny Sullivan confirmed that the ratings of the human evaluators are not used in machine learning for the algorithms, replying on Twitter, “We don’t use it that way.” He also noted that rater data is used the way restaurants use feedback cards, so that they know whether their search “recipes” are working:
Others have pointed out specific things mentioned in the Guidelines as likely signals, particularly involving Expertise, Authority and Trust (Google cites those as “E-A-T”). For instance, Marie Haynes called out elements mentioned in the Quality Raters Guidelines that she felt might be influential ranking factors, such as a business’s BBB rating and authors’ reputations. Others criticized her for citing the BBB rating and author reputations as direct ranking factors, and subsequently John Mueller at Google essentially stated that Google was not researching authors’ reputations nor using proprietary rating scores like the BBB rating grade.
At the same time, Googlers have increasingly been advising webmasters to “just focus on quality,” and even recommending that webmasters read the QRG in offering the best content possible — this is what Danny Sullivan did when he officially commented upon last year’s core algorithm updates in October:
And, Ben Gomes, Google’s VP of Search, Assistant and News, in an interview last year stated:
“You can view the rater guidelines as where we want the search algorithm to go. They don’t tell you how the algorithm is ranking results, but they fundamentally show what the algorithm should do.”
So, it rather begs the question – how exactly is Google determining Quality algorithmically, involving somewhat subjective-seeming concepts of expertise, authoritativeness, trustworthiness and reputation? Google’s algorithms must translate these concepts into quantifiable criteria that can be measured and compared between competing sites/pages.
I think that some of Google’s past algorithmic development likely points towards the process, and I think that there has been some degree of disconnect between what people have asked, how various Googlers have answered and how people have interpreted those answers. Much of the conjecture has seemed far too reductionist, focusing on relatively naive, theoretical factors. Factors that Googlers have directly denied, in some instances.
If Google instructs its human raters to assess E-A-T of a site but does not incorporate the resulting ratings, what is the algorithm using? Simply stating that the algo uses a collection of things like a BBB rating, user reviews, or link-trust analysis seems far too limited. Likewise, stating that Google comes up with a quality assessment based purely upon link trust analysis and query analysis seems too limited – Google is obviously taking into account some factors beyond just more advanced link/query analysis, although that is also some part of the mix.
Google’s website quality patent
One of Google’s past patents seems to me to describe what they could be using, or something quite similar, and it relies upon machine learning. It’s “Website Quality Signal Generation,” and it was noted by Bill Slawski in a brief synopsis and commentary in 2013 when the patent was granted. It describes how humans could be used to rate the quality of websites, and then an analysis algorithm could associate those ratings with website signals – likely automatically identifying relationships between quantified signals and the human rating values – and thus generate models from the characteristic signals. These signal models could then be used to compare against other unrated websites to apply a quality score to them. The wording is quite fascinating:
“Raters (e.g., people) connect to the Internet to view websites and rate the quality of each of the websites. The raters can submit website quality ratings to the quality analysis server through the rating input device. The quality analysis server receives website quality ratings from the rating input devices and stores the website quality ratings in the signal store. The website quality ratings are associated with a uniform resource locator and other website signals corresponding to the rated website. The quality analysis server identifies relationships between the website quality ratings and website signals and creates a model representing the relationships, as described below.
Further, the quality analysis server searches the signal store for unrated websites (e.g., sites lacking a signal indicating a quality rating). The quality analysis server determines whether the unrated websites have website signals that are related to quality ratings and applies the model to the unrated websites. Application of the model results in a calculated quality rating. The quality analysis server assigns the calculated quality rating to the corresponding website. The calculated quality rating is an additional website signal that is stored in the signal store for use by other applications.”
“The search engine can use the website signals stored in the signal store to filter and/or order the search results based on the stored quality ratings of the websites. For example, a threshold can be used to filter websites that have a stored quality rating below the threshold. Additionally, the websites returned by the search device can be ordered according to their stored quality rating such that the websites having higher stored quality ratings are listed prior to websites having lower stored quality ratings.”
The patent also provides some examples of the sorts of things that could be used as quality factors:
“…quality factors can be used such as the correctness of the grammar and spelling of the text on the web pages, whether obscene or otherwise inappropriate material is presented, whether the websites have blank or incomplete pages, as well as other factors that would affect the quality of the website.”
You will note that these are some of the very things mentioned in the QRG. But, that is not the reason why the patent is so compelling – it is because it provides a very logical framework in which to develop methods for assessing websites’ and webpages’ quality, and for generating a Quality Score that can be used in ranking determinations. The methods involved indicate that relatively small sample sets of test pages could be used to create models that could work well across all other pages of similar types.
Imagine that you have identified a type of page that you would like to create a Quality Score for — such as an information article about a health topic. (This is a “Your Money or Your Life” type of page, or “YMYL” page as explained in the QRG.) Google could take signals they have mentioned before that they can measure, including: amount of content, page layout, number of ads and ad placement on the page (are they above the fold or interstitial — do they obscure or interfere with accessing the page’s main content), reviews about the site, reviews and links about the content creator if different from the site, links out from the page (perhaps indicating identity of content creator and/or citing info reference sources), links to the site and page (PageRank), clickthrough rates to the page from its prime keyword searches, correspondence of the visible headline/title at top of the page with the page’s content and search snippet title and meta description, and factors that indicate the site is on the up-and-up.
Through testing of numbers of similar types of pages, Google could have developed a model – a pattern – of the combination of quality signals and developed a score value associated with it. Essentially, when a health topic article page has a certain range of the criteria signals like PageRank, plus content layout and quantity of similar type, plus certain inlink PageRank, plus CTR, plus user reviews – then Google could apply the computed Quality Score to the page without having any human manually review it. It seems likely that Google has developed some such models for many different types of pages.
The reason to think that Google could be incorporating machine learning in this sequence is that it could come up with quality rating relationships in the data that would not be simple toggles that they manually set with different weightings. Instead of clumsily setting a weight for a particular signal (like, saying that a health topic article of a certain quality rating level should have minimum PageRank of X, and the minimum number of links, etc), with machine learning the system could identify more complex relationships (like, assign a certain quality score if PageRank is of N to NN, CTR is X to XX, paired with a specific page layout type).
The patent describes this possibility:
“In some implementations, the model can be derived from the website signals and the website quality ratings using a machine learning subsystem that implements support vector regression.”
Essentially, this produces a machine learning classifier. The model can then be used to identify all pages of a common class, calculate a quality score, and then apply the same or similar rating to other members of the same class of pages.
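To make that concrete, here is a minimal sketch of the training-and-scoring loop the patent describes, assuming a handful of hand-rated pages and a few numeric page signals. The signal names, values and threshold below are purely illustrative; nothing here is a confirmed Google signal or weighting.

```python
# Sketch of "website quality signal generation": learn a model from human
# quality ratings, then score unrated pages and filter/order by the result.
import numpy as np
from sklearn.svm import SVR

# Each row is one human-rated page:
# [pagerank, word_count, ad_ratio, spelling_error_rate, ctr]  (illustrative signals)
rated_signals = np.array([
    [0.8, 1200, 0.10, 0.001, 0.12],
    [0.3,  250, 0.45, 0.020, 0.03],
    [0.6,  900, 0.20, 0.004, 0.08],
])
human_ratings = np.array([0.9, 0.2, 0.7])  # scores assigned by quality raters

# Support vector regression, the technique the patent mentions
model = SVR(kernel="rbf")
model.fit(rated_signals, human_ratings)

# Apply the learned model to an unrated page to get a calculated quality rating
unrated_page = np.array([[0.5, 700, 0.30, 0.010, 0.05]])
score = model.predict(unrated_page)[0]

THRESHOLD = 0.4  # illustrative cutoff, echoing the patent's filtering example
verdict = "eligible (order by score)" if score >= THRESHOLD else "filtered out"
print(f"calculated quality rating: {score:.2f} -> {verdict}")
```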
Whether Google inputs the data directly from the pages and their quality evaluators’ ratings of them into a machine learning system or not does not really matter so much – they could still use a number of “models” of types of pages, check them against how the quality raters assess them, and then manually tweak weightings. But, with the sort of massive firepower Google has at their disposal, I tend to think this does not make sense anymore. Indeed, quite a few other SEO analysts have also concluded that Google is incorporating machine learning into content rankings (such as Eric Enge and Mark Traphagen), and not just in query interpretation, which Google has publicly disclosed.
This really explains some of the vagueness that has come from Google when asked what to do to address ranking drops from the core algorithm changes. Support vector machines, or neural networks, will come up with scores very holistically, so pointing to any one signal or even a handful of signals as influencing the outcome most in any given instance probably would not really be possible — it would be buried in the complex interior of the models, which might be fairly abstracted.
Once the computed models are applied to search rankings, Google then has their evaluators rate the search results again. Doing this iteratively helps the results get better and better over time.
The various possible signals they are likely incorporating into a Quality Score also have complexity:
PageRank – and/or, some evolved signal that may also involve the quality/trust of the links;
User Reviews Sentiment – Google’s QRG suggests that there would be some threshold number of reviews, so if a business/website had relatively few total reviews, they might opt to ignore the measure as too volatile/unrepresentative. Note that Google has long done research on sentiment analysis and has a patent or two for conducting it. They have also actually performed it for businesses and displayed it with local listings in the past. Now, they have somewhat waffled back and forth about whether they use it for rankings. But, they essentially declared that they would be incorporating sentiment, following the DecorMyEyes brouhaha. To put it conversely, if Google is incorporating reputation, how else could they adjust rankings if not by performing some type of sentiment analysis? Among all the somewhat nebulous quality assessments, sentiment is relatively straightforward to analyze and use (see the short sketch after this list).
Mentions Sentiment – Do people mention your product in social media and emails to each other? Social media “buzz” is a popularity measure and the sentiment can be a quality measure as well.
Click-Through-Rate (CTR) – (and related bounce rate) there has been a lot of debate about this potential factor over time, but, much like Quality Score itself, if it is used by their ads’ Quality Score, then why not for organic rankings as well? In a 2012 freedom-of-information-act disclosure that was mistakenly made public by the Wall Street Journal, Google’s former chief of search quality, Udi Manber, is quoted saying:
“The ranking itself is affected by the click data. If we discover that, for a particular query, hypothetically, 80 percent of people click on Result No. 2 and only 10 percent click on Result No. 1, after a while we figure out, well, probably Result 2 is the one people want. So we’ll switch it.”
Some research has supported that CTR is indeed influential upon rankings and some has found that it is not. If it is just part of a website’s Quality Score, then the CTR is not a direct ranking factor and this could explain the discrepancies in research findings.
Percent / Intrusiveness of Ad Placements – How much of the total page space is comprised of ads? Google may have calculated a good/bad threshold. Also, how many ads, and do they break up the main content of the page too much? Are there interstitials or overlays that cannot be closed? Do ads follow you as you scroll the page? The degree of intrusiveness of ads could be calculated.
Sites Missing Identification Information – Sites ideally should have “About” and “Contact Us” pages. About sections should explain who the company is and, even better, list top staff members along with photos. The Contact Us page should ideally have as much contact info as possible, including street addresses, phone numbers and contact submission forms. It likely helps if staff pages link to employees’ social media and LinkedIn profiles and vice-versa. There is nothing worse than visiting a site and being unable to see who is behind it!
Critical E-Comm & Financial Site Factors – HTTPS is a given — but, I have seen that Google is also going further than just checking-off if you have HTTPS since not all encryption and configurations are the same — I have seen Google Chrome throw up warnings for deprecated/faulty SSL; E-Comm and financial info sites must also have clear Terms & Conditions, Privacy Policy and Customer Service contact info. Also, include Help sections and Return Policies, for good measure, if applicable. I have also highlighted using Trust Badges in the past — although Google does not use a Trust Badge itself as a quality factor most likely, it is still good for reassuring online consumers and can enhance sales by increasing trust.
Site Speed – this nearly goes without saying, but this was a factor in the PPC Quality Score way back at the beginning, and it is still a type of quality measure in organic search today.
Reviewed & Updated Regularly – The QRG cites some types of articles/information as needing to be reviewed and updated if necessary on a regular basis. Medical, legal, financial and tax advice are types of information that they mention. If your site contains articles of this sort, check it regularly and if changes are made, I recommend adding a visible “updated” date on the article page. Likewise, failure to update sitewide Copyright dates in footers is also a potential measure of quality, and if not updated might contribute to an evaluation that the site is abandoned or not up-to-date. Other aspects of this could include pages with malfunctioning widgets/apps; pages with broken images and broken links; and, pages that purport to list current information but have clearly outdated content.
No-Content & Worthless Content Pages – Google can see the percentages of pages encountered by their spider which are error pages, blank pages and pages with unsatisfactorily brief content. Sites with large numbers of errors indicate a lack of updating and poor maintenance. Even if one page on such a site has a great article, the chances increase that a user might click to another page on the site and encounter unsatisfying content.
Porn / Malicious / Objectionable / Deceptive Content – I very nearly did not bother mentioning this, but such types of content sharply increase the likelihood that the page and site will receive the lowest possible Quality Score. Want an instant decrease in your Quality Score? Add these things. Concerningly, a website is also responsible for such stuff appearing in the ad content, pulled in from other servers, so one must be a little vigilant in knowing what sorts of things are getting displayed on one’s site.
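To give a feel for the sentiment idea mentioned in the list above, here is a toy, lexicon-based sketch of scoring review sentiment. Real systems use far richer language models, and this implies nothing about how Google actually computes or uses sentiment.

```python
# Toy lexicon-based review sentiment scoring; purely illustrative.
POSITIVE = {"great", "excellent", "fast", "helpful", "love"}
NEGATIVE = {"terrible", "slow", "broken", "scam", "rude"}

def review_sentiment(text):
    words = text.lower().replace(",", " ").split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

reviews = ["Great product, fast shipping", "Terrible service, total scam"]
average = sum(review_sentiment(r) for r in reviews) / len(reviews)
print(f"average review sentiment: {average:+.1f}")
```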
PageRank Score calculation methodology could be in use
One other interesting aspect of how Quality Score could work is that a page’s Quality Score could also incorporate the Quality Scores of the other pages linking to it or its overall domain. Quality Score could partly be computed by iterating over the entire link graph a number of times, using a method similar to the original PageRank algorithm. This would produce a more weighted scale of Quality Score values, helping to ensure that the highest quality content is truly up at the top and utterly burying the lowest quality content to the point where searchers virtually never encounter it in typical search interactions.
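For illustration only, here is a toy sketch of how a quality score could be propagated over a link graph with a PageRank-style iteration, blending each page’s intrinsic score with quality flowing in from the pages that link to it. This is conjecture about how such a calculation could work, not a documented algorithm.

```python
# Toy PageRank-style propagation of quality scores over a link graph.
def propagate_quality(base_quality, links, damping=0.85, iterations=20):
    scores = dict(base_quality)
    for _ in range(iterations):
        incoming = {page: 0.0 for page in base_quality}
        for source, targets in links.items():
            if not targets:
                continue
            share = scores[source] / len(targets)  # each link passes an equal share
            for target in targets:
                incoming[target] += share
        # Blend intrinsic quality with propagated quality
        scores = {
            page: (1 - damping) * base_quality[page] + damping * incoming[page]
            for page in base_quality
        }
    return scores

base = {"page-a": 0.9, "page-b": 0.5, "page-c": 0.2}  # intrinsic quality estimates
links = {"page-a": ["page-b"], "page-b": ["page-c"], "page-c": ["page-a"]}
print(propagate_quality(base, links))
```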
Observations on some sites affected by Medic Update
Barry Schwartz published a handful of some of the sites impacted in the Medic Update, and I did a cursory review of some and saw them lacking in much of the criteria that I outlined as potential quality factors and which the QRG specifically mentions:
Prevention.com – Had no About page. Infinite scrolling – could the algorithm see the site’s privacy and Terms & Conditions links? The Contact Us page was on a different domain. Article writers not necessarily experts on medical topics. Very large ads and side content badges that are grotesque and distracting.
Gcflearnfree.org – It redirects, cartoon images on the staff page, and no text, no phone/address. Contact form only appears after a human validation click.
Minted.com – Customer service contact phone number hard to locate and linked-to on a subdomain.
Buoyhealth.com – About page redirects to the homepage! No phone/address/personnel. Also, weird logo graphic code.
VeryWellHealth.com – surprisingly, it is a participant in HONcode and has doctor writers – so, why did it get dinged? Award-winning. Journalistic integrity with an editorial doctor board. Perhaps it needs a more prominent About link at the top? No phone/address for contact. Too many ads on article pages, perhaps, or the accordion-opening ads at the tops of pages may be too irritating to users. Also, a very weird logo linking system. Path-class image map possibly hard for typical machine interpretation of links.
GoodRX.com – About page says who they’re NOT, and reads more like a public service warning. Doesn’t tell who they ARE. Site does a whole lot of things right.
Spine-health.com – uses HONcode, which is good. Phone and address not present. Contact Us on different domain. Different name and link in the page footer.
NaturalFoodSeries.com – An almost infinite scrolling homepage? No About page, Contact Us page, no phone, no address, etc.
Some of these sites may have changed since I looked at them weeks ago, and some may have recovered rankings. The nearly cosmetic issues I noted about them are likely not all of the quality signals they are deficient with — simply adding an About page is unlikely to fix the overall combination of issues.
Mixed messages over ranking signals
Some of the potential signals I have cited here and in the past have been quite controversial, and I do not doubt that some will disagree over some of my assertions and conjectures herein. But, I think my explanations come close to explaining why there has been a disconnect over what some SEO professionals believe are ranking factors and what Googlers are claiming are not.
I think Googlers have sometimes played a game of semantics. They can say that presence of one or more of the controversial factors (CTR, sentiment analysis, etc.) are not direct ranking factors — because those factors may or may not roll up into a complex evaluation resulting in a Quality Score. For instance, if my conjectures on Quality Score are close to how it functions, then two different pages on the same topic could be very similarly relevant and both have a high CTR, but one’s Quality Score could be significantly higher due to a combination of other things. In yet other cases, a higher CTR could shift a page up into being classified at a higher rung of quality.
A “model”/pattern of a good-quality page sounds much like a “search recipe,” as Danny Sullivan referred to it. There are almost certainly different combos and weightings of factors used to determine Quality Scores of different types of pages.
Use of support vector regression or a neural network would make the resultant Quality Scores nigh impossible to thoroughly reverse-engineer because the result is holistic. The Quality Score is a gestalt. This is why Googlers have been recommending vague-sounding changes for those affected, like “work on improving your overall quality” and “focus on making the best content possible.” Aside from technical SEO issues, one will rarely be able to change one or two things sitewide to address low Quality Scores. As Glenn Gabe has observed with sites affected by these changes, “Site owners shouldn’t just look for a single smoking gun, since there’s almost never just one.”
How to improve your site’s quality score
So, how does knowing that the Quality Score is now a complex gestalt help you to improve your rankings?
The good news is that most SEO guidance is all still very apt. Exercise generally good SEO “hygiene” in technical SEO construction, keeping error pages and no-content pages to a minimum. Eliminate unnecessary, thin content pages. Avoid prohibited link building practices.
SEO itself is a holistic practice – do all the things that are appropriate and reasonably feasible in optimizing your site. Ensure that you have the elements that reassure consumers that your site is on the up-and-up: About pages, Contact pages, Privacy Policy and Terms and Conditions pages, Customer Service contact options and update content as appropriate.
Beyond technical SEO stuff, you need to be somewhat obsessed with good usability, good user experience, good customer service and creating great content. You need highly critical people to assess your site and give you feedback on what may not be working well and how to fix it. You need to work to view the site as though you are a consumer of your site’s content and try to fulfill your site users’ needs thoroughly. You need to respond to negative reviews and try to elicit online reviews from your pleased users/customers. If your service frequently creates friction with clients, eliminate the points of conflict and improve your customer service practices. You need to produce the best content and get the best content creators when building your content.
Finally, practice proactive reputation management by working to positively engage with your community, both online and offline, consistently over time. Engage with social media and develop your presence via those channels. Provide good expert advice online for free as a means of reputation-building. Become a member of your industry’s professional groups and community business groups. Respond professionally and do not be provoked online.
These practices will generate the signals for a good Quality Score over time and you will reap the benefits.
Pulwama attack: the soldiers whose sacrifice has angered the nation will not be called ‘martyrs’ in government records http://bit.ly/2SF0rUz
In Pulwama, Jammu and Kashmir, terrorists carried out a cowardly act, launching a major attack on a convoy of security forces. Thirty-seven CRPF personnel were killed in the attack, while several others were injured. The terrorist organization Jaish-e-Mohammed has claimed responsibility. The attacker has been named as Adil Ahmad Dar, a resident of Kakapora in Pulwama district. It is reported that Adil joined most-wanted terrorist Zakir Musa’s Ghazwat-ul-Hind in February last year and joined Jaish only a few months ago.
While the country sheds tears over the loss of 37 soldiers, the politics around the incident is at its peak. The opposition is attacking the Modi government fiercely, and the government claims the attack will be answered decisively. Prime Minister Narendra Modi himself has said that the soldiers’ sacrifice will not go in vain. Politicians keep politicking over the soldiers’ deaths, but the truth is that no one has yet taken any real step for them. We raise this issue here because, although we call the personnel killed in this attack martyrs, the government does not grant them the status of ‘martyr.’ In fact, if personnel of the CRPF, BSF, ITBP or any similar force, collectively known as the paramilitary forces, are killed in the line of duty, they do not receive martyr status.
On the other hand, if personnel of the Army, Navy or Air Force die in the line of duty, they are granted martyr status. The Army, Navy and Air Force work under the Ministry of Defence, while the paramilitary forces work under the Ministry of Home Affairs.
The paramilitary forces do not get these benefits
Whether it is the discrimination over martyr status or over pension, medical treatment and canteen facilities, the benefits that Army soldiers receive are not extended to the paramilitary. If an Army soldier takes a bullet at the border, a BSF jawan takes one too; he loses his life as well. The Army protects the country from external threats, while the CRPF is responsible for internal security. If a paramilitary jawan is killed in a terrorist or Naxal attack, it is treated as just a death; he does not receive martyr status.
The family of a martyred soldier gets a quota in state government jobs, and seats are reserved for their children in educational institutions. Paramilitary personnel do not get such benefits. They do not even get pensions: ever since pensions for government employees were discontinued, pensions for the CRPF and BSF were stopped as well. The armed forces are not covered by that change.
So it is clear that, whether it is the opposition or the government, both keep blaming each other. The Congress has been in power, and now the BJP is in power. Both governments have certainly made big statements about the soldiers, but in reality neither has taken any major step for these soldiers of the country.
How to make your content more accessible to the visually impaired http://bit.ly/2FHFFvM
Globally, it’s estimated that approximately 1.3 billion people live with some form of distance or near vision impairment. In the past, vision impairment may have hampered their online screen experience, but thanks to the tech advancements of today, virtually anyone can jump online and search up the latest news, new restaurant reviews, or their next vacation destination.
Making sure businesses and marketers develop online content that is accessible to anyone and everyone is the big idea behind inclusive marketing. This form of marketing takes into account factors such as gender, race, language, income, sexuality, age, religion, ethnicity and ability, recognizing that marketers can no longer forge ahead assuming that one brand is designed for customers from all walks of life. Rather, marketers need to intelligently engage with individuals, taking into account their personalities, eccentricities and necessary accommodations.
Part of inclusive marketing is making your online media more accessible for your clients and customers with visual impairments. By maximizing the accessibility of your organic search presence, you’re making your products and services available to an otherwise untapped market of potential consumers. And c’mon, it’s just the right thing to do.
Online search engines don’t wave a magic wand and make your images and videos accessible, but there are a few things you can easily incorporate into your content development and online advertising routine to make sure everyone understands what’s on their screen. You can also utilize features such as an accessibility checker to make all of your marketing materials as accessible as possible.
Optimize your images using strong alt text descriptions
Alternative text (alt text) provides a textual alternative to non-text content online, such as images, graphics, infographics and the like. Complete alt text descriptions increase the accessibility of the internet to those with vision impairments. As a screen reader encounters images on a web page, it reads the alt text provided aloud, allowing the content and/or function of the image to be understood by the user.
Beyond accessibility, alt text also gives your SEO ranking a good boost by providing search engines such as Bing and Google with more information about what’s on specific web pages. The more info their web crawlers can scan and understand, the better chances you have to relevantly rank in SERPs (search engine result pages).
After all, web crawlers (and screen readers) can’t analyze an image and determine its value, they can only understand text. So, that text had better accurately describe the image or media. Otherwise, it’s like it doesn’t exist at all.
Here are a few tips to writing a good alt text description:
Be accurate and present the content and function of the image.
Be concise. Generally no more than a few words are needed.
Avoid redundancies; do not provide information already present in the surrounding text.
Do not use the phrases “image of …” or “graphic of…” in your alt text description.
When the image is only text, the text within the image can serve as the alt text.
If the image is functional, for example, the image is a link to something else, include that in the alt text.
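If you want to audit an existing site against these tips, a small script can flag images with missing or formulaic alt text. This is a minimal sketch that assumes the requests and beautifulsoup4 packages are installed; the URL is a placeholder.

```python
# Flag <img> tags with missing or formulaic alt text on a single page.
import requests
from bs4 import BeautifulSoup

def audit_alt_text(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        src = img.get("src", "(no src)")
        if not alt:
            print(f"MISSING alt text: {src}")
        elif alt.lower().startswith(("image of", "graphic of")):
            print(f"WEAK alt text ('{alt}'): {src}")

audit_alt_text("https://www.example.com/")
```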
Optimize and create friendly URLs, image titles and file names
Your file name will help search engines and screen readers understand what the image is and if it’s relevant. Before you upload the image to your CMS, make sure the file name is simple and describes the subject matter of the media, and use it as an opportunity to include target keywords if appropriate.
Here are two examples of file names, which one is more understandable?
orange-rainboots-children-toddlers.jpg
IMG22402A.JPG
I rest my case.
It’s the same idea with URLs and image titles. Take the time to not only include them but write good ones that make sense and properly describe the image. It can only help!
Use schema markup data for images/media
Schema Markup data is used by Bing, Google, etc. to provide better search results. A type of HTML coding or structured data markup, it provides additional context to the search engines and will improve the knowledge pane, which can be read aloud as the featured snippet.
Schema can be used to mark up just about anything and is used by Bing and other leading search engines. By employing structured data markups, search engines can better read the contents on a webpage, changing how they may display the search results.
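As a simple illustration, here is how structured data for an image might be emitted as JSON-LD using schema.org’s ImageObject type. The values are placeholders for your own image details.

```python
# Build illustrative JSON-LD markup for an image (schema.org ImageObject).
import json

image_markup = {
    "@context": "https://schema.org",
    "@type": "ImageObject",
    "contentUrl": "https://www.example.com/images/orange-rainboots-children-toddlers.jpg",
    "name": "Orange rain boots for toddlers",
    "description": "Bright orange children's rain boots photographed on a white background.",
}

# The output would be embedded in a <script type="application/ld+json"> tag on the page.
print(json.dumps(image_markup, indent=2))
```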
Carry accessibility principles over to videos, PowerPoints and PDFs
As the use of video marketing continues to rise, consider these accessibility tips to make them more available to the visually impaired:
Create and provide accurate video transcripts on the page.
Increase engagement by using both open and closed captions for video content. Note: the text file attached for closed captions is readable by search engines.
For PowerPoints and PDF documents:
As with images, create search-friendly file names and optimize your titles with keywords.
Add alt-tags for images and charts within the document or PowerPoint.
Complete the description field – this will serve as the meta description within search results.
Include your company name in the author field.
In Adobe Acrobat, there are additional metadata fields; be sure to complete them.
Write-protect your documents to make it hard for others to edit them and add their links to your content.
Link to the document internally and include backlinks with your target keywords.
Modern marketing is accessible marketing
Inclusive marketing is all about creating information and content that is more representative of everyone, including the visually impaired. Following the measures described above will help you make your content more universally accessible and improve not only the quality of the content but the experience for the user.
How are your YouTube campaigns performing on TV screens? http://bit.ly/2sbYi3f
Only a few months ago, Google announced TV screens as a device type advertisers could target for their video campaigns in Google Ads. And this month Ginny Marvin explained that “The TV screens device type targets YouTube channel inventory on smart TVs, set top boxes, gaming consoles and streaming devices such as Apple TV, Chromecast and Roku.” While TV screens may not be the dominant device type for your video campaigns, you should at least have it on your radar to monitor device growth. Let’s check out a few ways you can keep an eye on how your YouTube videos perform on TV screens.
Use the device segmentation in Google Ads
We have two options when viewing our device data for video campaigns within Google Ads. The first option is to select the Devices report when you’re in your campaign or ad group.
My biggest gripe with this route is Google Ads doesn’t give me all of the proper YouTube columns I like (view percent, earned metrics, others) when I’m in the Device view. We pretty much only have view metrics. In this example, I can see TV screens might only make up 1.3 percent of views, but they have the best view rate. I’ve then gone on and added an increase bid modifier to try and get as many views as I can from TV screens. The second option we can use in Google Ads, and my preferred option, is to look at the campaign view and then segment by device.
Because I’m in the campaign view, Google Ads gives me the columns I prefer to check for my video campaigns. I can now see how long people watch on TV screens. I can also see how many earned actions users take after interacting with a paid video ad. This information is going to be a lot better for me to review to make proper bid adjustments.
Wait. I haven’t run ads yet. Can I still see this data for my YouTube videos?
That’s a great question. You might be new to video campaigns and want to get an understanding of how users consume your content before launching any campaigns. If this is the case, and assuming you already have videos live on your channel, the first place you should check is your YouTube Analytics. Once you’re in your YouTube Analytics (which should default to the YouTube Studio BETA), click on the “Analytics” option in the left-hand menu to pull up some basic video stats. Next, click anywhere in the first graph you see in the Analytics view.
After you click on the graph, you’ll be taken to a different screen giving you some basic stats on how each of your videos is performing depending on the date range. The top menu for this page gives you several options to segment your user data, but for this post, we will want to select the “Device Type” option.
We will then see a screen similar to this image…
Now we can see device stats for all of our videos during the date range we have chosen. Again, this is showing all video performance, not just your video ad campaigns. You can see there is a separate report for “Traffic Sources” in the navigation. One thing also included in the image is the menu of columns you can select from after clicking on the blue plus button. There are several column options you can use to dig into how TV screens are performing, depending on what the goals of your video are. Looking at the same image, we can see TV screens make up only 5.0 percent of total watch time. But curious me is going to add other columns to see if TV screens get better subscribers, or maybe even a better rate of likes or shares. If I find that is the case, I might consider proactively setting my bid adjustments before launching anything new. Or I might be comfortable structuring my campaigns differently to try and capitalize on better performance as early as possible. I can’t give you a concrete answer on the best option because your data will have to be the guide.
YouTube usage on TV screens will only grow
In the link to the Google blog I referenced in the first sentence of this post, you can find the stat that users watch over 180 million hours of YouTube on TV screens every day. Now add to the mix how cutting the cord with cable TV is growing as users switch to alternative services like Amazon Prime Video, Hulu and yes…YouTube TV. I am not saying you must stop everything you are doing to push your videos on TV screens, but you should at least be monitoring the performance and making the proper adjustments when necessary. The TV screen device type will be growing in the years to come.
Search Engine Land was mistakenly removed from the Google Index https://ift.tt/2QrKJug
Search Engine Land was misidentified by Google’s system as being hacked and was removed from Google’s search results early Friday morning. Google has resolved the issue and Search Engine Land should return to the Google index shortly.
What happened? 
A Google spokesperson said “This was a case where our system misidentified the site as being hacked.” Since it was very early in the morning, and I personally do not have access to Google Search Console for Search Engine Land, I was unable to find out the cause of the issue. In its response, Google told us, “[A] message informing the site owner that we believed the site was hacked would have been made available within Search Console, as well as an option to request a review. … However, as we discovered the error before a review request was made, we reversed the action.” The normal process would be that someone with verified access to Google Search Console would see the hacked message and request a review. Then Google would return the site to Google’s index if there is no hack. In this case, Google spotted the misclassification before the review was requested and processed the reversal sooner.
The good news.
 A Google spokesperson told us “the site should return to the index soon.” Our tech team on the west coast is going to wake up to a good story.
Why you should care.
This highlights why having access to Google Search Console is so important. Without that, in many cases, many would continue to speculate why the site was not indexed in Google. And they might spend countless hours trying to debug a technical SEO issue or off-page site issue that was not there. Now we know it was a Google error, where they misclassified the site with a hacking issue and Google reversed the classification.
SEO for the Holidays: 2019 Edition https://ift.tt/2DIHUPw
For anyone who’s been involved in SEO for longer than a few months, you’ll know that you’re dealing with a rapidly changing environment.
Not only are we dealing with algorithms that change daily but it’s compounded with SERP layout changes and often dramatic shifts in device usage.
You’re hopefully already well underway with your holiday SEO efforts for 2018, but SEO in 2019 and beyond is going to be very different from the Google you’re optimizing for today.
As we mentioned above, there are three core areas of change we need to keep our eyes on:
Algorithms
SERP layouts
Devices
So, let’s look at them one-by-one and what you need to be thinking of as you prep for the holidays, which are just 15 short months away.
Algorithms In 2019
If there is a constant in SEO, it’s change.
Constant change.
Google updates their algorithms on an almost daily basis and with machine learning, this is sure to accelerate with the barrier to change and testing dropped to near-zero.
Without a doubt, the focus of algorithmic change as we head toward the holidays will be on fulfilling user intent.
What does this mean for us?
The primary signals will continue their shift away from those we traditionally consider critical, like having your keywords in the title tag, backlinks, content length, etc.
This is not to say they will be unimportant. Just less important.
Areas to keep your eyes on in your prep for the 2019 holiday season are:
Post-Click Time & Subsequent Action(s)
Whether a user spends a day at your page or 5 seconds is irrelevant for determining if their intent was met. What matters is whether the next action they took at Google implied their intent being fulfilled or not.
If they return to Google in 5 seconds and click the next result, they are likely unsatisfied; if they return and change their query, they likely found what they wanted or determined the query was incorrect.
What happens post-click is (according to Google) not currently a signal for an individual site but is used to determine overall search result quality.
I can all but guarantee that as we approach the holidays in 2019 and beyond we’ll be discussing this very differently.
I feel confident in this assertion based on the increasing prevalence of machine learning in the algorithm and its ability to generate its own signals.
I don’t believe that presently many, if any, search algorithms are left to define and weight their own signals, but this will change, and using post-click information just makes sense.
In fact, if I were the machine learning algorithm, I’d complain enormously if you tied my hands on this set of possible metrics while directing me to create the best set of search results.
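To illustrate the heuristic, here is a toy classification of post-click behavior. The thresholds and action labels are invented for the example; they are not known Google signals.

```python
# Toy post-click classification: a quick return plus another click suggests unmet
# intent; a refined query or no return suggests the intent was met (or corrected).
def classify_post_click(seconds_on_page, next_action):
    if next_action == "clicked_another_result" and seconds_on_page < 10:
        return "likely unsatisfied"
    if next_action in ("refined_query", "no_return"):
        return "likely satisfied (or query corrected)"
    return "ambiguous"

print(classify_post_click(5, "clicked_another_result"))  # likely unsatisfied
print(classify_post_click(40, "refined_query"))          # likely satisfied (or query corrected)
```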
What you can do about it is ensure that you are accurately predicting the user’s likely intent:
Review the phrases that are driving people to specific key pages within your site.
Consider what the user should want when they get there based on the variety of phrases.
Then look at your analytics to determine if the signals match.
For example, we have a page on our site where we give away Google Ads Promo codes each month.
The queries driving traffic to the page are:
And here’s what the traffic is doing:
On most other pages on our site, having a bounce rate of 80.85 percent and 1.31 average pages per session would be a horrible signal.
On this page, however, people are supposed to come, copy a code or see that it doesn’t apply to them, and leave.
The signals for this page are exactly what they should be.
While our rankings fluctuate a bit by region, they have held steady and traffic has improved over time as we capture more terms.
In fact, if I saw a high number of pages per session for this term I would have to consider what I’ve been doing wrong or what unexpected next action people are taking.
Essentially, what I’m considering is whether my analytics match what I’d expect if the user’s intent was met for the terms driving them to the page. This will allow me to assess what the next likely action they have taken is.
In the case above, they would likely adjust their term to include their region if it is not one we give our codes for or head over to Google Ads to use it and move on.
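One way to make this check systematic is to write down the behavior you would expect for each page type and compare it against an export of your analytics data. The page types, thresholds and numbers below are hypothetical, for illustration only.

```python
# Compare observed landing-page behavior against what "intent met" should look like.
observed = {
    "promo-code-page": {"pages_per_session": 1.31, "bounce_rate": 0.81},
    "buying-guide":    {"pages_per_session": 1.10, "bounce_rate": 0.85},
}

expectations = {
    "promo-code-page": lambda s: s["pages_per_session"] <= 1.5,  # grab a code and leave
    "buying-guide":    lambda s: s["pages_per_session"] >= 2.0,  # should browse further
}

for page, stats in observed.items():
    status = "matches expected behavior" if expectations[page](stats) else "worth investigating"
    print(f"{page}: {status}")
```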
Probability of Content to Fulfill User Intent
Post-click signals are great to let you get a feel for how your currently ranking content is doing – but what about content that isn’t ranking or is in development?
Google’s goal is to satisfy its users’ intent, and as such it wants to maximize the probability that a piece of content will do that.
With a term like “buy blue widgets” it’s fairly straightforward. The searcher clearly wants to purchase a product. The goal then is to provide all the information that they may need to make the decision to buy it from you.
It gets more complicated when you’re striving to rank for terms like “blue widgets”.
The searcher may want to purchase some, may want to know the history of the blue widget manufacturing industry, or may want to know why blue widgets are constantly used as an example in SEO circles.
Different users may also desire different content formats and the larger the variety of phrases a page may capture, the more variables there are.
The goal is not to destroy the user experience by trying to fulfill every possible intent for every possible type of user, but rather to satisfy the likely intent for a larger group of searchers than your competitors.
As you ready your site and develop your content strategy for the holidays you want to make sure that you’re maximizing the likelihood of satisfying not just your purchasers but other searchers who would use the same terms.
Reputation & Relatedness
If you think links are important you’re just getting warmed up.
As I discussed in my article on machine learning in search, entities are the future of search.
Everything you can think of (including that thought) is an entity.
Entities will someday replace links (though links will assist in defining and strengthening relationships between them). What I’m referring to here as reputation could also be referred to as strength.
With entities, we pull away from the idea of PageRank yielding weight and instead would consider relatedness combined with reputation.
The more references that tie one entity to another (like tying the author “dave davies” to the topic “seo”) and the stronger the references, the more likely “dave davies” will be considered an authority on the topic of “seo”.
In this paragraph I have, without links, strengthened that relationship, and the fact that this reference is on Search Engine Journal, a site that also has a strong relatedness to the entity “seo” and a strong reputation, makes it even more valuable.
There is a link in the bio that will also reinforce the relatedness, but only of the entity “dave davies” to the site “Search Engine Journal,” assisting my reputation.
The relatedness factors are a lot more complex and involve the relatedness of page, content, and entities being connected.
So, as you prepare for the holidays, yes, you want to build some links, but more than that, you need to consider how you are tying the entity that is your site to the entity that is your core offering, whatever that may be.
Often this will involve mentions within relevant content, links from relevant sites and even connections with other relevant authors and entities.
This might sound a lot like link building, but the strategies will be tweaked, and some will be completely new or pursued for completely different reasons.
Why would you have someone write for your site?
If the entity “loren baker” wrote a post on the entity that is “beanstalk’s website,” I strongly suspect that, in optimizing for the algorithms of the 2019 holiday season, and provided I can reinforce that the piece was indeed written by this Loren Baker, the content would rank higher than if I wrote the same piece, because the entity that is him is more strongly associated with the terms that would rank.
Additionally, you will want the entity that is your company to be tied to related entities as the holidays approach.
If you sell blue widgets you want to also be associated with their various uses, history, etc. In this case, it doesn’t have to be the entity that is “you” but simply one associated with the entity that is “your company”.
Each staff member may specialize in an area related to blue widgets, and all staff will be associated with your company, thus passing that overall weight to your company. However, if one wrote about the topic of another, the relevance passed would likely not be as strong – like me writing about the entity that is “email marketing.”
So, think of creative ways to get your entities associated correctly. Invite other entities to your playground and try to get invited to theirs.
It won’t just be about the link, so don’t get preoccupied with links; it will be about the topics you get associated with and how strongly.
SERP Layouts
This is the “how much wood could a woodchuck chuck” of puzzles.
It’s tough to predict what the SERPs will look like in 15 months.
What we know with 100 percent certainty is that they will be very different and that some sectors will change more than others.
If you’re in travel, for example, you can be sure that Google will have completely changed how they lay out the results and how users search.
Just a couple weeks ago, SEJ’s Matt Southern reported on the addition of pricing insights for hotels and flight searches.
A couple weeks before that, it was adding location scores.
And so on.
Basically, they’ve been launching major layout changes every week or two, so one can readily assume that the SERP layout heading into the holidays of 2019 will be very different for travel sites.
And that’s just the tip of the iceberg. In just the past few months we’ve seen:
Comparison carousels added.
New hotel search layouts (a few of them).
Q&A, FAQ and how-tos placed in the SERPs.
Sticky headers.
New dataset Schema reflected in the SERPs.
This is literally just scratching the surface of what’s happened over the past couple of months.
As we ready ourselves for the holidays, it is going to be critically important for website owners to pay very close attention to changes in these layouts, consider how they will impact their sites, and work out how to take advantage of them.
On top of that, even if you don’t invest in Google or Bing Ads you need to pay attention to changes there. Different layouts and structures in the paid space often lead to changes in the organic layouts as well.
If there are new headlines and increased click-throughs in paid, that’s bad news for your organic side.
But if you want to be optimistic – consider how their structure appears on a page and that they’ve put considerable time and money into testing which layouts get clicked the most often. Perhaps take some inspiration from what they’re doing.
Regardless of what changes you see you need to stop and think. Don’t just think about what they’re doing and don’t even just think about how you can take advantage of it – think about why.
Think about what is being lost or gained and consider whether they might push it further and start preparing for that.
There will be hundreds of SERP layout changes between now and the holidays. No matter how small they seem, be sure to pay attention, and as things start to stack up you’ll have a very good idea of the direction things are heading and what to do about it.
Changing Devices
How people are accessing data is changing rapidly.
To give you an idea, when you were preparing for the holidays in 2016 you were dealing with an environment where only about 25 percent of people would use voice search in an office if people were around.
By the holidays of 2017, that number rose to over 30 percent.
With that increase came staggering growth in the number of voice-first devices which climbed from roughly 7.9 million to more than 23 million.
By the holidays of 2018, there will be an estimated 50 million, and something to keep in mind as you prepare for the holidays of 2019: by then there will be an estimated 89 million.
On top of that, the way people interact with their devices has changed dramatically, with “near me” replacing location-specific queries. These user-driven changes, combined with Google tweaking mobile layouts and how and what data is presented, mean that the process by which users find and purchase their products will be significantly different.
As with changing SERP layouts, the goal in this area is mainly reactive.
You should make sure that key content on your site is formatted to provide voice results to common user questions when they ask “what is the world record for eating blue widgets?”
So look to Schema and content structures that capture featured snippets.
Loren Baker’s advice in 5 Voice Engine Optimization Strategies to Get Ahead is a good starting point.
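To make the idea concrete, here is a minimal sketch (in Python, purely for illustration) of FAQ-style structured data built around a question-and-answer pair like the blue widget example above. The question, answer, URL and the eligibility of this exact markup for voice or rich results are assumptions on my part, so check Google’s current structured data documentation before relying on it.

```python
import json

# Hypothetical example: a question-and-answer pair marked up as schema.org
# FAQPage JSON-LD. The question and answer text are placeholders; verify
# current eligibility rules in Google's structured data documentation.
faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        "name": "What is the world record for eating blue widgets?",
        "acceptedAnswer": {
            "@type": "Answer",
            "text": "A concise, self-contained answer of roughly 40 to 50 words, "
                    "written so it can be read aloud, tends to work best here."
        }
    }]
}

# Embed the output in a <script type="application/ld+json"> tag on the page.
print(json.dumps(faq_markup, indent=2))
```

The key point is less the exact markup and more that your answers are short, self-contained, and sit right next to the question they answer.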
In preparation for the holidays, you’ll need to read constantly, keep up on the changing landscape, and watch how searchers are changing with it.
Read the studies by Stone Temple and others like them, which reveal, for example, that higher-income people use voice search to a greater degree in most environments, and which lend insight into the environments where people tend to use voice search, helping us predict how we can inject our content into their lives.
Run a grocery store with fast delivery? People use voice search when their hands are dirty, so make it convenient for them to quickly order a product for delivery through your app or mobile site while they’re up to their elbows in a recipe for Thanksgiving dinner and realize they forgot an ingredient.
It’s Time to Start Now
The 2019 holidays are just 15 short months away.
Many of the tasks you need to accomplish will take time.
You’ll need to:
Adjust your link strategy to an entity strategy.
Review (and potentially rework) your content for voice search.
Keep up with SERP layout changes and the opportunities they will provide.
This is going to be exhausting – and incredibly rewarding.
Get started now and you won’t just stand a chance, but you’ll be ahead of many.
And THAT will make for some very happy holidays.
SEO for holiday shoppers https://ift.tt/2Dt8JGx
Here we are at the end of October and you’re realizing your SEO is not in shape for the holidays. Whether that’s because you are just now understanding that your SEO strategy isn’t going to yield the results you want before the holidays, or if you’ve just procrastinated – you’re looking for some techniques that will generate short-term wins.
Fortunately, they exist.
A couple of years ago I wrote a similar piece on last-minute SEO tips for the holidays. That left readers with about three weeks to make use of them.
This year we’re getting a slightly earlier start, so let’s dive in with 5 things you can do right now to get started on making more money during this peak time of year.
Titles and descriptions in the SERPs
I’m going to start with the only tip that I’ll be repeating from my previous article and that’s titles and descriptions. I’m repeating it for two reasons:
It’s easily the most straight-forward thing you can do with tremendous impact.
There have been changes in how to approach this.
Let’s consider a parent is out looking for a video game for his kids and encounters two titles and descriptions in the SERPs:
Title: Gamer-Rated Top 10 Video Games For Christmas 2018 | GamerEmpire.info
Description: Gamer Empire enlists top video game enthusiasts to rate and rank this year’s top video games to make your Christmas gift buying easier.
OR
Title: Best Video Games | Last Guardian, Titanfall 2, Pokemon Sun & Moon, Battlefield
Description: Best video games for Christmas including Last Guardian, Titanfall 2, Pokemon Sun & Moon, Battlefield 1, Call Of Duty: Infinite Warfare, Skyrim, PS4, xBox One
Which one am I likely to click? One tells me that I’m going to find what I’ve likely queried, the other is showing me a list of things I probably don’t recognize.
Look through the new Search Console and find the terms that your pages are ranking for and focus your titles and descriptions on improving the clickthroughs for those terms. Remember, you are not just optimizing for the person who wants what’s offered on your site, you’re optimizing for the people who would purchase it for them.
This year, we can very quickly test titles (and thankfully we have time) with Google Ads. With the expanded text ads now allowing three sets of 30 characters rather than two (and in the fall of 2016 it had just been increased from a single 25-character headline) and descriptions now increased from 80 to 90 characters, we can test versions a lot closer to what we would deploy organically.
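If you want a quick sanity check before pushing variations into Ads, a few lines of script will flag copy that blows past those limits. This is a rough sketch only; the limits are hard-coded from the figures above and the sample copy is made up.

```python
# Minimal sketch: flag candidate ad headlines/descriptions that exceed the
# character limits discussed above (30 characters per headline, 90 per
# description). The candidate copy below is invented for illustration.
HEADLINE_LIMIT = 30
DESCRIPTION_LIMIT = 90

headlines = [
    "Gamer-Rated Top 10 Video Games",
    "Christmas 2018 Buying Guide",
    "Picked By Real Gamers",
]
descriptions = [
    "Gamer Empire enlists top video game enthusiasts to rate and rank this "
    "year's top video games to make your Christmas gift buying easier.",
]

for text in headlines:
    status = "OK" if len(text) <= HEADLINE_LIMIT else "TOO LONG"
    print(f"[headline   {len(text):>3} chars] {status}: {text}")

for text in descriptions:
    status = "OK" if len(text) <= DESCRIPTION_LIMIT else "TOO LONG"
    print(f"[description {len(text):>3} chars] {status}: {text}")
```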
Importance of featured snippets
Featured snippets give you the opportunity to jump the queue and launch yourself into the coveted position zero for a lot of the types of queries that holiday shoppers would use (remember: these purchasers often don’t know what they want, so many of the queries will be exploratory).
Here’s what Tech Radar pulled off for one such phrase:
In my opinion, that featured snippet is more valuable than any #1.
If we consider some of the data regarding the growth in voice search, that will be a strong influencer as well. The folks at Stone Temple Consulting (now Perficient Digital) outlined the year-over-year data in voice search just after the holidays last year in this study. At its core, it revealed a much stronger willingness of people to use voice search, especially in public (read: on their phones).
Featured snippets essentially drive voice search, but they are a bit different, so I recommend reading this piece by Brian Ussery. There have been a few changes since it was written a year ago; however, the information and process are still valid.
Updating evergreen URL with new content
This advice pretty much works anytime, but never more than when you’re scrambling for rankings to attract visitors who may not necessarily be buying for themselves.
Top lists of popular games/toys/books/etc. are always a winner. Staff Picks. Reviews and ratings. Guides.
Think not about what you sell or what the people who want to buy it would search for, think about who shops for that demographic, what questions they would have, how they would ask it and target that in your content.
If you do this annually, I’d recommend creating a URL something like:
/guides/10-best-video-games/
Next year, when you update it, take the content from that location (if you want to archive it) and move it to something like:
/guides/10-best-video-games/2018/
And put your 2019 content at the old URL. You’re effectively creating an evergreen URL but keeping your archive. This will keep any link weight passing to the primary URL headed to your most current content.
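If your site is built from static files, the yearly swap can even be scripted. This is only a sketch under that assumption, with hypothetical paths; a CMS would handle the same rotation through its own archiving features.

```python
from pathlib import Path

# Rough sketch of the yearly rotation described above, assuming a static-site
# layout where each URL maps to a directory containing an index.html.
# All paths are hypothetical.
SITE_ROOT = Path("public")
EVERGREEN = SITE_ROOT / "guides" / "10-best-video-games"

def archive_and_refresh(old_year: int, new_content: str) -> None:
    archive_dir = EVERGREEN / str(old_year)
    archive_dir.mkdir(parents=True, exist_ok=True)
    current = EVERGREEN / "index.html"
    if current.exists():
        # Move last year's guide to /guides/10-best-video-games/<year>/
        current.rename(archive_dir / "index.html")
    # Publish the new guide at the evergreen URL so link weight keeps
    # flowing to the most current content.
    current.write_text(new_content, encoding="utf-8")

archive_and_refresh(2018, "<html><!-- 2019 guide goes here --></html>")
```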
Rank elsewhere and format correctly
If you want to rank for terms that are too competitive for your current site strength, find strong resources that can rank that accept guest articles. But be careful to review Google’s reminder about large-scale article campaigns.
Make a shortlist of 3 or 4 sites that rank well within your niche and accept guest articles.
Research exactly what type of content resonated with THEIR audience. Use your favorite backlink or social measurement tool to figure out what content on their site gets shared the most.
Create content ideas and outlines around what will appeal to their audience that overlaps with what you want to rank for and your knowledge base.
Read their guidelines incredibly carefully and pitch in the EXACT format requested.
In your pitch, be concise. Editors have a tough, time-consuming job making people like me look good. Respect that and keep your pitch to the point but thorough and showing your knowledge of both the subject matter and their audience.
Rank during and post-holiday queries
We have several clients in travel and one of their biggest buying seasons is not before the holidays but rather, during them. It’s when family and friends get together and our analytics tells us how it plays out in many households.
Rather than searching for a “vacation rental Portland” they’re looking for “family reunion portland” or “8 bedroom vacation rental Portland.”
The searchers are looking not for a general type of place but are searching based on the end criteria (i.e., we need x bedrooms, or we want to host y event, etc.).
Couple this with the excitement and convenience of everyone being together, place a low barrier-to-entry on the site (a low non-refundable deposit in one case) and you’re set up to win.
The reason this ties to SEO is that the terms you’ll be targeting are often less competitive. Everyone wants to rank their “vacation rental portland,” but there is far less competition around the bedrooms, amenities, etc.
At the same time, you’ve got a bunch of folks with newly received gift cards and their searches will be very specific.
Where the parent might have looked for “best video games 2018,” the gift card holder will be searching queries like “black ops 4 price” or “black ops 4 ps4 cheap.”
The search volume isn’t what you’d see for just “black ops 4,” but the terms are far easier to target, and the strategy works just as well if you already rank for the core terms and are simply expanding to capture the during- and post-holiday traffic you might have been missing out on.
Focus on maximizing your strategy
The holidays are a time to pull up your socks and focus on the things you can do that will impact your results and maximize your earning from holiday shoppers and post-holiday spenders. Next year, promise yourself you’ll get an earlier start with SEO for the 2019 holidays.
How to boost search rankings using only your internal linking strategy https://ift.tt/2BdC5ab
Links, even within a website, show relationships between content. They transfer value and importance between pages. Even more importantly, internal links define your website’s structure. They help users to navigate and search engines to understand your website. A healthy internal linking strategy makes it easier for your pages to be indexed and to rank, as well as increasing click-through rates and conversion rates for visitors who can find what they are looking for. Before we start looking at three ways to boost your SERP rankings, let’s take a look at some of the basic principles of internal linking:
The more links a page receives, the more value Google gives it. Just like external links indicate an overall value with regard to other pages on the web, internal links help Google determine the importance of a page with regard to the other pages on your website.
Internal links are so important that Google now considers that 1000 is a “reasonable number” of links per page. Don’t forget that this includes every single link in the header, footer, menu and sidebar.
Linking is so powerful that it can (unintentionally) give unparalleled boosts to relatively low-value pages that receive a large number of internal links, like the home, about and contact pages.
Links from fresh content pass fresh value, and can, therefore, signal new content to Google, helping new pages get crawled.
Armed with that information, here are three strategies to use internal links to improve your SEO.
Boost rankings by reducing page depth
In June this year, John Mueller indicated in a Google Webmaster Central Hangout that click depth — that is, the number of clicks required to reach a page when starting from the website’s homepage  — matters in SEO. Pages that are closer to the homepage receive more positive attention from Google. This is good news, as it can mean quick wins for websites with complex and deep structures. OnCrawl has helped websites in e-commerce, classifieds, and job listings to increase ranking positions and improve UX by reducing the overall depth of pages.
Based on statistics for each website, we have found that page depth can be reduced in a number of ways:
Increasing the number of recommended or similar pages that appear on product pages or job listings. This is a means of increasing internal links within a category, or between related categories, bringing some of the deeper pages to a shallower level.
Increasing the number of high-level categories. By doubling the number of categories listed on the homepage, more pages within those categories were placed closer to the homepage.
Reducing the number of browsable pages in a category’s listing. This can be achieved by increasing the number of items per page, but can also be addressed in part by the increased number of high-level categories, which results in fewer items in each category. If a category’s seven pages are reduced to five, the URLs listed on the sixth and seventh pages all move up at least two clicks in depth.
Thanks to these types of modifications, we’ve found that it is fairly easy to reduce the overall depth of a website from over 50 to under 15 levels of depth.
We’ve also observed that, following improvements that reduce depth, the product catalog of e-commerce websites is usually crawled in shorter periods of time. As a result, new products are discovered more efficiently by Google. This allows sales, promotions, and other commercial campaigns to be more effective and more profitable.
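If you want to measure depth yourself rather than rely on a crawler’s report, a breadth-first pass over your internal link graph is enough. The sketch below assumes you already have an edge list (exported from a crawl, for example) and uses made-up URLs.

```python
from collections import deque

# Minimal sketch: compute click depth (clicks from the homepage) with a
# breadth-first search over an internal link graph. The edge list is made up;
# in practice you would export it from your crawler of choice.
links = {
    "/": ["/category-a/", "/category-b/"],
    "/category-a/": ["/category-a/page-2/", "/product-1/"],
    "/category-a/page-2/": ["/product-2/"],
    "/category-b/": ["/product-3/"],
}

def click_depths(start: str = "/") -> dict:
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first time this URL is reached
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

for url, depth in sorted(click_depths().items(), key=lambda item: item[1]):
    print(f"depth {depth}: {url}")
```

Running the same measurement before and after a restructuring is a simple way to confirm that the changes actually brought deep pages closer to the homepage.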
Boost rankings by creating content hubs
For websites that use content marketing to grow their site and increase their authority on a selection of topics, internal linking will help push content to perform better. By strengthening key subjects and grouping related content around pillar pages, you can improve the authority of all pages on a topic and gain positions in the SERPs.
Using OnCrawl to segment your website based on your content hubs will help you visualize the linking structure within a hub and between thematic groups of content. This will allow you to track key indicators for each content hub, and to confirm increases in metrics such as organic visits, rankings, Fresh Rank, and crawl frequency. A content hub:
Consists of multiple pages of content on your website on closely related topics or keywords.
Is structured around a pillar page, an authoritative page on your website that covers a subject your site would like to be seen as an expert on, and that links out to other pages in the content hub on your website.
Uses internal links between its pages to reinforce the relationships between the pages. These links should, when reasonable, use words related to the content hub’s theme in their anchor text.
Typically, your pillar page will rank well. The internal link structure from this page to the pages in the hub will distribute the authority gained by this page to all of the related pages. This transfer of authority is an example of the theory of page importance put into practice. Using internal links to build thematic content hubs can also increase Domain Authority and improve click-through rates for your website. The more the pages in a thematic content group link to one another, the better the group can place in the SERPs.
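A simple way to keep a hub honest is to check, from a crawl export, that every page in the hub links to the pillar and that the pillar links back. The sketch below is illustrative only; the link graph and URLs are placeholders.

```python
# Minimal sketch: verify that a content hub is interlinked the way the
# strategy above describes. The link graph and URLs are placeholders; in
# practice you would build the graph from a crawl export.
links = {
    "/hub/blue-widgets/": ["/hub/blue-widget-history/", "/hub/blue-widget-uses/"],
    "/hub/blue-widget-history/": ["/hub/blue-widgets/"],
    "/hub/blue-widget-uses/": [],  # missing a link back to the pillar
}

pillar = "/hub/blue-widgets/"
hub_pages = ["/hub/blue-widget-history/", "/hub/blue-widget-uses/"]

for page in hub_pages:
    if pillar not in links.get(page, []):
        print(f"{page} does not link back to the pillar page")
    if page not in links.get(pillar, []):
        print(f"pillar does not link out to {page}")
```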
Boost rankings on a selected group of pages
Working with websites that run seasonal or event-related campaigns means that OnCrawl has seen over and over again how internal linking strategies can benefit pages that are extremely profitable during a narrow window of time. For example, holiday specials, sales, and ticket sales for industry events should all be promoted in the time before the event. To promote an event-related group of pages:
Create a landing page optimized for search queries related to the event. This page should contain enough text to rank correctly for the intended search terms, and should be tightly centered around the event theme. It is easier to promote a rich landing page on a relevant topic than multiple product pages.
Place the date and a countdown on the landing page. The countdown can help establish changes in the content and increase crawl frequency.
Create a content hub of related content and link to this content from the landing page. Content, in this case, will likely be existing pages on your website, and may include event pages, related product pages, related articles, related categories, and so on. This will help distribute popularity from the ranking landing page to the pages it links to.
Use internal linking to promote the landing page on your site. Link to it from the homepage using relevant anchor text and place it in high-value positions in your menu, making it accessible from a large number of pages on your site.
In OnCrawl, monitor metrics that help confirm the success of your strategy:
Crawl frequency: during the period leading up to the event, confirm an increase in Google’s crawl behavior on your site.
Impressions and SEO visits: check that your landing page appears in the SERPs and that organic visits to your landing page increase.
On-page factors: make sure that the on-page SEO for your landing page is optimized. Check the title, meta description, H1 and H2, and word count for the landing page.
Remember to remove or demote the landing page and the links to it when the promotion period has passed.
Takeaways for internal linking strategies
Internal linking strategies can increase site authority on a subject, improve click-through rate, and boost rankings through direct influence on ranking factors like click depth and page importance. Internal links can be used to produce quantifiable results in how Google prioritizes your pages, how and when your pages rank, and the number of organic visits they receive. Using internal links to reduce page depth, create content hubs, and promote specific groups of pages over others will provide an extra edge to your SEO strategy.
Drive SEO results with artificial intelligence https://ift.tt/2RYII5D
One of the more powerful subsets of Artificial Intelligence (AI) being used these days in SEO is machine learning, which deals specifically with the training of algorithms, or understanding how and why algorithms work.
But machine learning is only as good as the data it is operating on. Part of what makes a great use case for machine learning is having large amounts of very precise data. Without this precision, however, model guidance gets murky and often leads to erroneous (and costly) conclusions.
Are you using the right data? You can’t just take Google ranking data, which is effectively the output of a black box, feed it into some sort of machine learning, and automatically get positive results.
This white paper from Market Brew shows you how to drive better SEO results with AI and precision data. Visit Digital Marketing Depot to download “Implementing Artificial Intelligence to Attack SEO.”
Link building tool roundup: Site crawlers https://ift.tt/2FugvlX
If a search engine’s crawler can’t find your content to index, it’s not going to rank. It’s also not a good signal, but, most importantly, if a search engine can’t find something, a user may not be able to either. That’s why tools that mimic the actions of a search engine’s crawler can be very useful.
You can find all sorts of problems using these crawlers, problems that can drastically impact how well your site performs in search engines. They can also help you, as a link builder, to determine which sites deserve your attention.
Link building is never a magic bullet. Links take a lot of hard work, and it can be useless to build links to a site that suffers from terrible SEO.
For this article, I’ve looked at four crawlers: two are web-based and two are desktop versions. I’ve done a very light analysis of each in order to show you how they can be used to help when you’re building links, but they have many, many more uses.
I’ll go through uses for link outreach and also for making sure your own site is in good shape for building or attracting links.
For the record, here are the basics about each tool I’ll be mentioning:
Sitebulb: offers 14-day free trial. Desktop web crawler.
DeepCrawl: offers 7-day free trial. Web-based crawler.
Screaming Frog: free download for light use or buy a license for additional features. Desktop web crawler.
OnCrawl: offers 14-day free trial. Web-based crawler.
Note: these crawlers are constantly being updated so screenshots taken at the time of publication may not match the current view.
Evaluate a link-target site
Using a crawler tool can help you maximize your link building efficiency and effectiveness.
Do a sample site audit.
Before you reach out to a site on which you want a link, conduct an audit of the site so you have an “in” by pointing out any errors that you find.
The beauty of a sample audit is the small amount of time it takes. I have seen some crawlers take ages to do a full crawl, so a sample audit, in my opinion, is genius!
In the example report below, you can look at just a few of the hints and easily see that there are some duplication issues, which is a great lead-in for outreach.
Run a report using custom settings to see if a link is worth pursuing. If tons of the site’s content is inaccessible and there are errors all over the site, it may not be a good idea to invest a lot of time and effort in trying to get a link there.
Find the best pages for links.
Sitebulb has a Link Equity score that is similar in idea to internal PageRank. The higher the link equity score, the more likely the page is to rank. A link from a page with a high Link Equity score should, theoretically, with all other things being equal, be more likely to help you rank than one from a page with a much lower Link Equity score.
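Sitebulb doesn’t publish the exact formula, but internal PageRank over your crawl graph is a common stand-in if you want a comparable number yourself. The sketch below assumes the third-party networkx library and a made-up edge list exported from a crawl.

```python
import networkx as nx  # third-party: pip install networkx

# Rough approximation of a "link equity"-style score: internal PageRank
# computed over the crawl graph. The edge list is made up; export your own
# from a crawl of the target site.
edges = [
    ("/", "/category-a/"), ("/", "/category-b/"),
    ("/category-a/", "/product-1/"), ("/category-b/", "/product-1/"),
    ("/category-b/", "/product-2/"), ("/product-1/", "/category-a/"),
]

graph = nx.DiGraph()
graph.add_edges_from(edges)
scores = nx.pagerank(graph, alpha=0.85)

for url, score in sorted(scores.items(), key=lambda item: item[1], reverse=True):
    print(f"{score:.3f}  {url}")
```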
Run a report to find broken pages with backlinks.
DeepCrawl has an easy way to view these pages. Great for broken link building obviously…but even if you’re not doing broken link building, it’s a good “in” with a webmaster.
Who doesn’t want to know that they have links pointing to pages that can’t be found?
Make your own (or client’s) site more link-worthy
You can run the same report on your own site to see what content is inaccessible there. Always remember that there may be cases where you want some of your content to be inaccessible, but, if you want it to rank, it needs to be accessible. You don’t want to seek a link for content that’s inaccessible if you want to get any value out of it.
Do I have duplicate content?
Sitebulb has a handy Duplicate Content tab you can click on. Duplicate content can impact your rankings in some cases so it’s best to avoid or handle it properly. (For more on duplicate content see Dealing with Duplicate Content.)
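If you want a rough second opinion outside the tool, comparing word shingles between two pages will surface obvious near-duplicates. This is a crude sketch with invented sample text; dedicated crawlers use far more robust methods.

```python
# Crude sketch of near-duplicate detection using word shingles and Jaccard
# similarity. The sample page texts are invented for illustration.
def shingles(text: str, size: int = 3) -> set:
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(a: str, b: str) -> float:
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "Blue widgets are the best gift for the holidays this year"
page_b = "Blue widgets are the best gift for the holidays in 2018"

score = similarity(page_a, page_b)
print(f"similarity: {score:.2f}")
if score > 0.5:  # threshold is arbitrary; tune it for your own content
    print("likely duplicate content - consider consolidating or canonicalizing")
```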
Are my redirects set up correctly?
As a link builder, my main concern with redirects involves making sure that if I move or remove a page with a lot of good links, the change is handled properly with a redirect. There are a lot of arguments for and against redirecting pages for things like products you no longer carry or information that is no longer relevant to your site, as much of that has to do with usability.
I just hate to get great links for a page that doesn’t get properly redirected, as the loss of links feels like such a waste of time.
Am I seeing the correct error codes?
DeepCrawl has a section on Non-200 Pages which is very helpful. You can click on and view a graphical representation of these pages.
Generally speaking, you’d expect to see most pages returning a 200 code. You’d expect to see 301 and 302 redirects. You don’t want to see over 50% of your site returning 404 codes though. Screaming Frog has a tab where you can easily view response codes for all pages crawled.
I would say that you need to make sure you understand which codes should be returning from various pages though, as there may be good reasons for something to return a certain code.
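For a quick spot check outside a full crawler, you can tally response codes for a known list of URLs yourself. The sketch below uses the third-party requests library and placeholder URLs; a real crawler discovers URLs by following links rather than working from a fixed list.

```python
from collections import Counter
import requests  # third-party: pip install requests

# Minimal sketch: check response codes for a known list of URLs and tally
# them, similar in spirit to a crawler's response-code report.
urls = [
    "https://www.example.com/",
    "https://www.example.com/category-a/",
    "https://www.example.com/old-page/",
]

codes = Counter()
for url in urls:
    try:
        response = requests.get(url, timeout=10, allow_redirects=False)
        codes[response.status_code] += 1
        print(response.status_code, url)
    except requests.RequestException as error:
        print("error", url, error)

print(dict(codes))
```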
Is my load time ok?
Some people are much more patient than I am. If a page doesn’t load almost immediately, I bounce. If you’re trying to get links to a page and it takes 10 seconds to load, you’re going to have a disappointing conversion rate. You want to make sure that your critical pages load as quickly as possible.
Here’s a report from OnCrawl that can help you zero in on any pages that are loading too slowly:
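For a rough check of your own, you can time the server response of your critical pages. Note this measures time-to-first-response only, not full render time, and the URLs and threshold below are placeholders.

```python
import requests  # third-party: pip install requests

# Rough sketch: flag critical pages whose server response is slow. This only
# measures time-to-response, not the full page load a user experiences.
SLOW_SECONDS = 2.0
critical_pages = [
    "https://www.example.com/",
    "https://www.example.com/guides/10-best-video-games/",
]

for url in critical_pages:
    response = requests.get(url, timeout=30)
    seconds = response.elapsed.total_seconds()
    flag = "SLOW" if seconds > SLOW_SECONDS else "ok"
    print(f"{seconds:5.2f}s  {flag}  {url}")
```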
How is my internal link structure?
You don’t want to have orphaned pages or see a lot of internal broken links. If pages can’t be found and crawled, they won’t get indexed. Good site architecture is also very important from a user’s perspective. If you have critical content that can’t be found unless it’s searched for, or you have to click on ten different links to get to it, that’s not good.
Screaming Frog has an Inlinks column (accessed by clicking on the Internal tab) that tells you how many internal links are pointing to each page. You want to see the highest number of internal links pointing to your most critical pages.
In the image below, I’ve sorted my own website by highest to lowest Inlinks, making sure that the most important pages have the most Inlinks.
Do I have pages that are too “thin”?
Considering that you can receive a manual action for thin content, it’s best not to have any. Thin content isn’t good for search engines or for users. Thin content won’t generally attract a lot of great links. In fact, if you do have links to thin content, you run the risk of having those links replaced by link builders working with sites with better resources.
OnCrawl has a good view of thin content which is very helpful.
As I said earlier, there are so many ways you can use these crawlers. Here’s a big warning, though! Some crawlers can use a ton of resources. I once crawled a site and the client’s hosting company called him to say they’d blocked it because it was making too many requests at once. If you’re going to run anything heavy duty, make sure the proper people know about it and can whitelist the relevant IP addresses.
Happy crawling!
0 notes
mypradip33pansuriya · 6 years
Photo
Tumblr media
Google Image Search updates guidelines, adding structured data, speed and more https://ift.tt/2CIovfZ
Google updated the Google Image Publishing Guidelines document today, adding more details around structured data, page speed, title management and user experience details. The old documentation can be viewed in this screen shot I captured previously.
The new guidelines have updated content around creating a better user experience with your images, including adding good context, optimizing placement, embedding tips, device-friendly sites and good URL structure for your images. In addition, Google has explained how the image titles work since the change.
Google also added sections for adding structured data for product, video and recipe markup. There is a new section for speed, including information about their PageSpeed Insights tool, AMP and responsive image techniques.
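As an illustration of the kind of product markup the guidelines now point to, here is a hypothetical example generated with Python. The product name, price, and URLs are placeholders, and the required properties may change, so check Google’s structured data documentation.

```python
import json

# Hypothetical Product markup with an image, of the kind the updated image
# guidelines point to. Names, prices, and URLs are placeholders.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Blue Widget",
    "image": ["https://www.example.com/images/blue-widget.jpg"],
    "description": "A classic blue widget, the perennial SEO example.",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the output in a <script type="application/ld+json"> tag on the page.
print(json.dumps(product_markup, indent=2))
```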
You can check out the new guidelines over here and compare them to the old guidelines over here.
Google Lens feature now within Google Image Search results https://ift.tt/2CGghoE
Google announced it is now rolling out the Google Lens button within Google Image search results. Google Lens, which was introduced in May 2017, is a visual search tool to help people learn more about images and what they are visually looking at.
Adding this tool to image search allows users to learn more about the images they clicked on within Google Image search. Google said “Starting today, when you see something in an image that you want to know more about, like a landmark in a travel photo or wallpaper in a stylish room, you can use Lens to explore within the image.”
Here is a GIF of it in action within image search:
Google Lens has been a part of Google Assistant and Google Photos for some time and is a great feature within the Pixel phone as well.
How do I see Google Lens in image search? After you view an image, you should see a square Google Lens icon on the image, as illustrated above in the GIF. Personally, I do not yet see it, so it may still be rolling out.
What if I do not see it yet? Don’t worry, I don’t see it yet either and I am in New York. Google said “Lens in Images is now live on the mobile web for people in the U.S. searching in English, and will soon be rolled out to other countries, languages and Google Images locations.”
Why is this important? This can lead to more ways for searchers to find more information about objects within images that they want to learn more about.
Do SEOs need to consider anything? There doesn’t appear to be any special schema or markup that SEOs need to add to take advantage of ranking better for Google Lens. Google Lens is just a new search feature that was added to Google Image search. One would assume that typical Google Image search SEO tips would suffice here.
Google Speed Update is now being released to all users https://ift.tt/2PsOWgR
Google has updated its original blog post around the Speed Update, saying it is now “rolling out for all users.” Google this morning began incorporating the new Speed Update algorithm into the mobile search results as a search ranking factor.

We posted detailed FAQs on the new Google Speed Update a while back explaining some of the more common questions around it. And as we clarified last week, this update only impacts the slowest of sites on the internet. Google’s original announcement said this will “only affect a small percentage of queries.”

One of the more common questions we hear is how we know if Google considers our web pages to be fast or not. Google won’t give you a specific metric, but the search giant does say it uses a number of ways to measure page speed and suggested webmasters look at the metrics from the Chrome User Experience Report, the Lighthouse tool and the PageSpeed Insights tool.

Again, Google quietly updated the blog post this morning to say the Speed Update began rolling out July 9, 2018.
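If you want a programmatic look at how Google scores a page, the PageSpeed Insights API can be queried directly. The sketch below reflects the v5 API as I understand it and may differ from what was current when this was written; the URL being tested is a placeholder, and an API key may be needed for regular use.

```python
import requests  # third-party: pip install requests

# Sketch of querying the PageSpeed Insights API for a performance score.
# The endpoint and response fields reflect the v5 API and may change;
# verify against Google's current API reference.
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(API, params=params, timeout=60).json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"mobile performance score: {score * 100:.0f}/100")
```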
Google Search Console is sending notices for slow loading pages https://ift.tt/2Phh32n
Google is sending out a new type of notification to those who have verified properties in Google Search Console. The new messages inform site owners about really slow pages that take too long to load. Oliver H.G. Mason posted a screenshot of this message on Twitter:
Google has a Speed Update which is aimed at reducing the search rankings of really slow mobile pages. Google also had a speed factor in 2010. This notice references how slow specific pages on this specific website are, and how to fix the issue.