I have come to a few conclusions after spending nearly the last three hours trying to use Tumblr’s mass post editor to add unique tags to all my fic posts so I can maybe start to tag things a little easier (both for myself and other users):
I have way too many posts
This blue hellsite seems intentionally designed to make the user experience as utterly frustrating as possible no matter what you’re doing
I wish I had started using unique tags way back when, but in my hubris thought that would never be necessary (and didn’t understand their purpose on top of that)
I will never not be bitter that they took away my <hr> tag for reasons that were never documented or explained.
I’ll just... start the new decade by tagging things better from here on out.
*sobbing at my lost time*
#like I could literally just open up each individual post and change it there #...if editing the post to add a tag didn’t obliterate my horizontal line scene separators #and then if adding those back in via HTML didn’t remove the read mores #(which are kind of important on fic) #it’s like a catch-22 #don't mind me #like you have no idea how difficult it is to find specific posts in mass post editor #you can't use it to search tagged stuff to narrow down your search #so you literally have to look at every post and select it manually #but then at some point it refused to add tags #and i'd spent like fifteen minutes culling through june-august 2018 posts and it refused to actually add the tag #i'm just gonna... leave it for now #and maybe bite the bullet later on my grand organization plans #and choose between hr tags and read mores #at least they're old posts #so it shouldn't mess up anyone's dash #but still
Top Rated Text To Speech Software For Mac?
The best free text-to-speech app downloads for Mac include Voice, TextSpeech Pro Elements, Toau, SpeechMirror, Speechissimo, and Listen Later. There are many free text to speech programs available. Most of them work in much the same way, converting text to speech; they differ in the types of documents they support and in how easy the conversion is. If your OS of choice is Mac OS X, and all you need is basic but extremely solid text to speech (TTS) functionality, you don’t even have to bother with a third party software application, as the OS’ native Text to Speech feature has you covered.
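For instance, here is a minimal sketch of driving that built-in macOS speech engine from a script, using the bundled `say` command (macOS only; the voice name and speaking rate below are just illustrative values, not requirements):

```python
import subprocess

def speak(text, voice="Samantha", rate_wpm=180):
    """Read text aloud with macOS's built-in `say` command."""
    subprocess.run(["say", "-v", voice, "-r", str(rate_wpm), text], check=True)

speak("Text to speech is built right into the operating system.")
```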
In years gone by, text to speech software was rather expensive, but these days there are excellent text to speech tools available free of charge. We're here to help you find the very best tools that will make converting written documents to audio files as easy as possible.
Text to speech software can be enormously helpful for anyone who's visually impaired, or has a condition like dyslexia that makes reading on screens tricky. It can also help overcome language barriers for people who read a language but don't speak it, or are in the process of learning.
Text to speech software is also ideal if you want to listen to a document while doing something else, if you find it easier to retain information you've heard, or if you want to sense-check something you've written.
Here's our pick of the best free text to speech software for reading either individual paragraphs or whole documents aloud.
1. Balabolka
Save text as a spoken audio file, with customizable voices
Lots of voices to choose from
There are a couple of ways to use Balabolka's free text to speech software: you can either copy and paste text into the program, or you can open a number of supported file formats (including DOC, PDF, and HTML) in the program directly. In terms of output you can use SAPI 4 complete with eight different voices to choose from, SAPI 5 with two, or the Microsoft Speech Platform if you download and install the necessary files. Whichever route you choose, you can adjust the speed, pitch and volume of playback to create a custom voice.
In addition to reading words aloud, this free text to speech software can also save narrations as audio files in a range of formats including MP3 and WAV. For lengthy documents you can create bookmarks to make it easy to jump back to a specific location and there are excellent tools on hand to help you to customize the pronunciation of words to your liking.
With all these features to make life easier when reading text on a screen isn't an option, Balabolka is the best free text to speech software around.
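If you want a similar adjust-and-save workflow from a script rather than a GUI app, here is a minimal sketch using the third-party pyttsx3 Python package (unrelated to Balabolka; the rate and volume values are arbitrary examples, and the saved file's format depends on the platform's speech engine):

```python
import pyttsx3  # pip install pyttsx3

engine = pyttsx3.init()            # SAPI5 on Windows, NSSpeechSynthesizer on macOS, eSpeak on Linux
engine.setProperty("rate", 160)    # speaking rate in words per minute (example value)
engine.setProperty("volume", 0.9)  # 0.0 to 1.0

text = "For lengthy documents, saving the narration as an audio file can be handy."
engine.say(text)                            # queue reading the text aloud
engine.save_to_file(text, "narration.wav")  # also queue saving it to a file
engine.runAndWait()                         # run both queued commands
```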
2. Natural Reader
Free text to speech software with its own web browser
Choice of interfaces
Natural Reader is a free text to speech tool that can be used in a couple of ways. The first option is to load documents into its library and have them read aloud from there. This is a neat way to manage multiple files, and the number of supported file types is impressive, including ebook formats. There's also OCR, which enables you to load up a photo or scan of text, and have it read to you.
The second option takes the form of a floating toolbar. In this mode, you can highlight text in any application and use the toolbar controls to start and customize text to speech. This means you can very easily use the feature in your web browser, word processor and a range of other programs. There's also a built-in browser to convert web content to speech more easily.
3. Panopreter Basic
Easy text to speech conversion, with WAV and MP3 output
Exports in WAV and MP3 formats
As the name suggests, Panopreter Basic delivers free text to speech conversion without frills. It accepts plain and rich text files, web pages and Microsoft Word documents as input, and exports the resulting sound in both WAV and MP3 format (the two files are saved in the same location, with the same name).
The default settings work well for quick tasks, but spend a little time exploring Panopreter Basic's Settings menu and you'll find options to change the language, destination of saved audio files, and set custom interface colors. The software can even play a piece of music once it's finished reading – a nice touch you won't find in other free text-to-speech software.
If you need something more advanced, a premium version of Panopreter is available for US$29.95 (about £20, AU$40). This edition offers several additional features including toolbars for Microsoft Word and Internet Explorer, the ability to highlight the section of text currently being read, and extra voices.
4. WordTalk
An extension that adds text to speech to your word processor
Customizable voices
Developed by the University of Edinburgh, WordTalk is a toolbar add-on for Word that brings customizable text to speech to Microsoft Word. It works with all editions of Word and is accessible via the toolbar or ribbon, depending on which version you're using.
The toolbar itself is certainly not the most attractive you'll ever see, appearing to have been designed by a child. Nor are all of the buttons' functions very clear, but thankfully there's a help file on hand.
There's no getting away from the fact that WordTalk is fairly basic, but it does support SAPI 4 and SAPI 5 voices, and these can be tweaked to your liking. The ability to just read aloud individual words, sentences or paragraphs is a particularly nice touch. You also have the option of saving narrations, and there are a number of keyboard shortcuts that allow for quick and easy access to frequently used options.
5. Zabaware Text-to-Speech Reader
A great choice for converting text from websites to speech
Good file format support
Despite its basic looks, Zabaware Text-to-Speech Reader has more to offer than you might first think. You can open numerous file formats directly in the program, or just copy and paste text.
Alternatively, as long as you have the program running and the relevant option enabled, Zabaware Text-to-Speech Reader can read aloud any text you copy to the clipboard – great if you want to convert words from websites to speech – as well as dialog boxes that pop up. Zabaware Text-to-Speech Reader can also convert text files to WAV format.
Unfortunately the selection of voices is limited, and the only settings you can customize are volume and speed unless you burrow deep into settings to fiddle with pronunciations. Additional voices are available for a US$25 fee (about £20, AU$30), which seems rather steep, holding it back from a higher place in our list.
Update (July 2018): Please refer to our full article on free speech-to-text software, The Best (Free) Speech-to-Text Software for Windows, which compares Dragon Naturally Speaking with free alternatives from Google and Microsoft.
I’m writing a lot, and frequently getting arm ache. Are there any good free speech to text programs available to download? I just want to open up Notepad and start talking, and have my voice translated into text and typed into Notepad.
What is the best stt software then?
good
Windows 7 has speech recognition which is good.
Great Question. I am looking for the same thing but I don't have windows 7 or vista. Or any money.
just found this, i hope it works as well as they say, [Broken Link Removed]
I'm no big Windoze fan but the speech to text in Windows is every bit as good as Dragon, as I use them both. In ANY speech to text you have to be aware of mic positioning and extraneous noise. Work on those two things and the text will take care of itself.
i found the speech recognition software on my computer, but i need speech to text and now i can only find text to speech! can someone help?
Dragon naturally speaking software is the best one.
I'm a special ed teacher who needs a Speech to text software (hopefully free) for 12 students with great ideas but few or no writing skills (K-1).
I haven't read where anyone has mentioned Talk It Type It yet. It is very economical. I bought the basic software about 6 years ago. I paid approx $20.00 for it. Much cheaper than Dragon. TITI does have higher priced editions but I only needed the basic. I had to train it to recognize my voice, but you will have to do that with any of them. Google them to check them out. I haven't checked recently to see if the co. is still in business. I say that because I haven't heard any ads about the software like they had a few years back. It could be worth checking them out? I like mine.
Just started using the Windows Speech Recognition and it seems to work well, but needs a lot of patience in training the computer to recognise your voice. Wanted a free option to start with and didn't know I had this on my computer all this while.. Anyone know how I can access the dictionary so as to add a few words? Or does it work with the standard windows dictionary, so I edit my words there. Still getting used to it..thanks to all who recommended this!
Of that I'm not certain, but it may be a great question to ask on MUO Answers..
Fortunately, this is MakeUseOf Answers. :)
What Ryan means though is, please ask a new question! The above is many weeks old and it will take a while to get an answer. If you post a new question, however, you will receive an answer within hours.
Beware Dragon, works fine but when you upgrade from say XP to Vista or Vista to 7, the version of Dragon no longer works, and they want you to buy it again.
If you have a reasonably fast computer running Windows 7, the speech recognition which comes with windows works pretty well.
The latest stable version of Google Chrome 11 has been released sporting the new flat icon with improved security and with the speech-to-text support through HTML speech input API. The first official Google service to make use of this service is Google Translate.
After downloading and installing Chrome 11, you can head over to the Google Translate page to check out speech-to-text translation. Right now Google supports only English to other languages. If activated, you will see a microphone icon turn blue when you hover over it and the Speak Now speech bubble appear. When you have finished speaking and Chrome 11's speech input API has successfully converted voice to text, the Google Translate service steps in and translates the language. Hit the listen button to hear the translated word.
I'm in Australia and they didn't have the icon you described. Maybe this feature is only enabled in particular countries? Just a thought.
Make sure that you have the source language set to English.
Replying to Bill in reply to Robert Aussies have to be careful to recognize that Strine is not English, which is not spoken in Aussie except by English-Speaking visitors -- and that they tune to International (IE: American) English. <]:^)-<
and you, mr. or ms. anonymous, should be careful as well, since strine is as 'english' as 'american english', both derivatives of the TRUE original British english, which when you come to think about it is closer to strine than it is to american english.
Thanks, it works..
hi, can you help me? I need to translate a voice recording on my iPhone into text in Word. Is that somehow possible? And it will be perfect if it is free. Thank you so much. I am not talented with technology, so I do not know how to do it.
What about any speech to text for Windows XP?
Marylou above recommended Dragon Naturally Speaking. Did you try that one, yet?
It's a bit pricey and I haven't tried it yet
Try using a bit torrent site with peerblock installed and running.
stop going on about dragon
does not work very well on Windows 7 Home Premium, that's what I have, and it didn't get one word right
that is for windows 7
Dictating text: When you speak into the microphone, Windows Speech Recognition converts your spoken words into text that appears on your screen. To dictate text: Open Speech Recognition by clicking the Start button, clicking All Programs, clicking Accessories, clicking Ease of Access, and then clicking Windows Speech Recognition. Say 'start listening' or click the Microphone button to start the listening mode. Open the program you want to use or select the text box you want to dictate text into. Then say the text that you want to dictate.
Might try [Broken Link Removed]. There is a zip file installer available at [Broken Link Removed]. The trial is fully functional, and I tried this on Windows 7. It seemed to act as a front end for MS Speech Recognition, but I am not sure. Worth a shot, though.
free good speech to text software programs available to download http://www.tazti.com/
Thanks for this info, although the saying 'if it sounds too good to be true, it usually is' comes to mind. I checked out the website in your comment and found that it is free, but for only 15 days. After that, it's $29.95, which is a good price, but I'm afraid the saying 'you get what you pay for' may apply.
So Harry, you went on to check some free software, it wasn't free, and now you're complaining that it's too cheap. Did I sum it up correctly?
I use the Microsoft inbuilt version and it works fine. The trick is to slowly train the program to understand your voice and practice until it does. Accuracy for me is now about 85 to 95%. It's no good expecting speech to text software to work out of the box, although Dragon is faster than the others to do that, as I have used both. I will upgrade to Dragon 11 later, but for now I am using Microsoft's version and I write articles with it, so it does work.
Andy
People just don't seem to understand the English language anymore do they??? HUGE difference between 'text to speech' and 'speech to text' .. But if you're reading carefully and not just jumping in because you think you know what you're talking about, it's pretty easy to catch..
Did u guys just use google to find this website : hope it will help u ppl ;) http://www.naturalreaders.com/index.htm from : softlogik
you did not read carefully. they are looking for speech to text NOT text to speech. BIG difference.
i'm in a fix guys. i don't have a card yet and can't buy any. isn't there any freeware?
I have just switched on Windows Seven speech recognition and am trying it out for the first time. With a bit of juggling it seems to be going quite reasonably, but I can see that there is a pretty steep learning curve, especially as I have a quite pronounced lisp (and wasn't that fun to have to spell out).
Still, for a first try it's not going too badly and I can see me having some fun playing with this to see if I can get anywhere near my not very impressive typing speed. One interesting thing that I have noticed in my short acquaintance with this program is that less common words seem to be recognized more easily, an unsurprising result all things considered. One thing: I am using the microphone built into my web cam; perhaps with a better quality microphone there would be fewer errors, although I'm not sure if a better microphone would be more susceptible to ambient sound. A secondary issue, and one that might not bother others, is that I like to have music playing in the background whilst on my computer, either from my sound system or the computer itself, and that would have to go if I were to use speech recognition as more than an occasional thing. tempersfugue
I use a MacMice Microphone with Vista and it's great. I also use it with my favorite MacBook Pro and one of the newer versions of MacSpeech. The mike is a goosenecked usb item that works well up to 2 feet from my mouth. I can use headphones if I don't want music to interfere as ambient sound. Works with PC or Mac. I've been trying and using speech programs for years. The Vista one trains in about 7 minutes. What has to be done though is corrections, otherwise if it practices mistakes, it gets better at them. My son also uses Dragon Naturally Speaking on his XP and just likes it better every time he uses it. Mike is plug and play, look here: [Broken Link Removed]
I use Google Voice, a free service, when I want speech to text. I use it with my Android cell phone and call my own phone number to leave a message. Google does a good job of transcribing my voice message to text and emailing me the text to my Gmail account.
I'm pretty sure Google Voice works with any phone. You don't need to own an Android phone to use it. The service is now out of beta and is open for anyone to use.
Hope this helps.
The built in speech recognition works reasonably well IF you have the right mic and sound card. Wrong mic or sound card and you won't get good results no matter what software you use.
Dragon is better than the built in software, particularly Dragon 10 & 11. I use it all the time. I blogged about it here: [Broken Link Removed]
Wade Hatler
if you have broadband, use [Broken Link Removed], just copy/paste your text and hit the play button. (it was mentioned in makeuseof directory)
the question is about free 'speech to text' software. not 'text to speech'. that's a whole different question. but a useful piece of software nevertheless.
SPEECH TO TEXT!!!!!!!!!!!!!!!!!!!!!! SPEECH TO TEXT!!!!!!!!!!!!!!!!! NOT TEXT TO SPEECH!!!!!!!!!!!!!!!!!!!!
yo y u wearin a rag, n wat color is dat.. looks lik dark brown, i wanna say blacc.. u folk?
Irrelevant, Jonathan P.
Hi Massey Speech Project [Broken Links Removed]
possibly it can do what you want; mostly freeware will not be good, and shareware like ViaVoice and Dragon is a little expensive
It also looks like Dragon is for 32 bit computers. mine's 64bit.
If I ever get a copy, i will be sure to post here and give my verdict, but still think the price is too steep.
this topic is about the best FREE SPEECH TO TEXT. Dragon is not free, and all the other stuff is not speech to text like what they're looking for: not just for commands, but to write with
Mango - I'm in the same boat as you, I've been searching for not only a good speech to text program, but also decent API to use in some of my programming. But it appears that speech technology is one of those things that's a bit too advanced to get for free. I'm leaning toward Dragon as well, especially now that a couple people here say that it performs well. My own fear was buying it and then seeing that it doesn't perform any better than the free ones!
If you do buy a copy, let us know how you like it!
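(As an aside for anyone hunting a free, scriptable speech-to-text route today: one commonly used option is the third-party SpeechRecognition package for Python, which can send microphone audio to Google's free web speech endpoint. This is only an illustrative sketch, not something mentioned by the commenters above, and it assumes a working microphone plus the PyAudio package.)

```python
import speech_recognition as sr  # pip install SpeechRecognition pyaudio

recognizer = sr.Recognizer()

with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)  # compensate for background noise
    print("Speak now...")
    audio = recognizer.listen(source)

try:
    # Uses Google's free web speech API; fine for light, personal use
    print("You said:", recognizer.recognize_google(audio))
except sr.UnknownValueError:
    print("Sorry, could not understand the audio.")
except sr.RequestError as err:
    print("Could not reach the recognition service:", err)
```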
Tried Dragon and it did not work well at all
Dragon is great! Here is ver10 at $59.00 @ [Broken Link Removed]. It is the previous version but performs very well. Hope it helps.
@Eduardo I just found the inbuilt voice recognition software in windows 7, and tested it out. it is very poor quality, and even with a microphone, it's unable to get sentences right, so I'm afraid I'll have to pass on that one.
@ha14 I don't have microsoft office installed. i use notepad++ or openoffice.
@Aibek, dragon looks ok, but far too expensive, I was hoping for something completely free. have you had any experience using dragon?
I tried it for a few days about 5 years ago. Back then I was mainly looking for a program that would let me use my PC using voice commands. Dragon did fairly well but required the user to train it first. Because I wanted something quick, I uninstalled it :-)
I heard lots of positive feedback about Dragon Naturally Speaking, http://www.nuance.com/dragon/index.htm
Unfortunately it's a bit pricey.
Hi
[Broken Link Removed]
If you are using Windows Vista or 7, you may have access to the built-in voice recognition program. Look for it in the Start menu. Note: I think you have to be running Home Premium or superior to use this feature, though I'm not completely sure.
isn't this just for voice commands to move around on your computer???? we are looking for something that takes your voice and types it into text!!!!!!!!!
Windows 7 speech recognition does both - controls the computer, and takes dictation. Like most speech to text programs there is some learning to be done on your and the computer's part. If you want punctuation you'll need to say that (period/stop, comma, etc.) If you have a decent microphone then you are all set.
I guess almost all Vistas, Windows 7 and higher specification XPs have speech recognition. In Vista, go to Control Panel, then Ease of Access, then Speech Recognition, and you are set. The tutorial is easy and the best advice is do not get a cheap microphone.
more punctuation does not the answer change -_-
dear Eduardo, a lot of merit to you. I was looking for voice recognition software without knowing it was already installed on my computer. thank you
nanda
Btw y has no one noticed that there is no such thing as Windows superior
read better, he meant windows home premium or better!
Ha, that was funny.
I think he just meant home premium or better
there is actually windows superior. its just not a 'legal' copy of windows 7, the product has been modified to suit the user and the pc in usability and response times. my supervisor was talking about it in work when we were discussing upgrading the OS's in the office computers.
Hi, I have experience using the Microsoft speech-to-text software built into Office XP/2002. My first computer was a Compaq AP200, PII 400, 512MB PC100 SDRAM. The headset with boom mike turned out to be the problem. Changed it to a Logitech USB set. Ran nicely on USB 1.1 at its rate. Worked amazingly better on the new OptiPlex 745 with its Pentium D dual core and 2 GB of specified RAM (533 MHz). End of buzz and fuzz, beginning of virtually perfect translation of speech to text, paragraph after paragraph. The customer agreed to invest her time and effort in training with her 19" LCD monitor. Good luck, dc
Thanks for sharing your experience, Dick!
Today I learned just how difficult it is to install a TV wall mount when you don’t own a drill.
The securing bolts are about 4″ long.
I went out and bought a stud finder (it beeped as soon as I picked it up).
But before I picked that up, I missed the bit about needing a drill.
You know how hard it is to use an awl to make a pilot hole in a wooden stud? It literally took me an hour and a half to get those four bolts into the wall. And hours later, my arms are still sore from the pressure I had to put on them as I was using a not-very-good socket wrench to get them to start gripping into the wood.
But...I got the job done.
The hard part was then mounting the TV to the bracket (it’s above my dresser, and it was REALLY hard to (a) get a good angle to even see the brackets on the TV that I needed to match up to the wall bracket, (b) reach it in the first place (over a four-foot-tall, three-foot-deep dresser that is FAR too heavy to move by myself), (c) not drop the TV behind the dresser as I climbed on top of it, holding the TV (which is not as light as I thought it would be), to get a decent angle, and (d) hook up the power cord, which was almost impossible, even though my bracket lets me move the TV about 6″ from the wall).
And it’s a good thing I bought the wall mount as an afterthought. If I didn’t have it, I’d still be watching my old TV. If I had to rely on the legs that were included with the TV, well...those would be 45″ apart. And my dresser is 42″ wide. It wouldn’t have worked.
But now I have a gigantic 55″ TV on the wall of my room, and I’ve angled/tilted it so that I see the actual picture, and not the reflection of my bedroom ceiling lighting fixture.
[insert Read More here because I rambled about both the new and old TV]
Turned out that, even using a laser level, I mounted the wall bracket a little crooked -- when I was done, the right side of the TV was about 1″ lower than the left. But the manufacturer accounted for that possibility and included a way to adjust the TV brackets in order to raise each side of the TV separately without disassembling and starting over (and putting more large holes in the wall).
It was a LOT of work (if I’d known what I was in for, I’d have just had Best Buy do the installation -- free installation was included, but I ordered my wall mount a day later and wasn’t sure it would have been here, and the 5.1 sound bar still isn’t here yet, and I’m too macho in my head for my own good and say “I can install it myself!” -- and there are things that could have been done better than I did them with the right tools), but...it’s up, and it looks fantastic. Any larger would have been too big (not that I wouldn’t want a full-wall TV, but...with the furniture and the wall decorations, this BARELY fit without blocking anything or being blocked by anything). And don’t get me wrong...if it was just hooking up cables to the TV, I would be golden.
But a wall mount? That was VERY new to me. And I screwed it up.
But since I ordered it a day after the TV, I said no to the installation, thinking they’d come to install the TV to my wall only to find that I had no wall mount.
It all worked out (so far -- after about 12 hours, my TV hasn’t fallen off the wall, so I’m thinking I probably did it right (despite the fact that one of the bolts went in at about 30º upper-left instead of going straight into the wood stud) even though the TV is FAR from centered on the mount -- but MOSTLY centered where I need it, and angled to where the reflection of my bedroom light isn’t glaring back at me via the TV screen -- because of where the studs in my wall are).
But for wood stud installations, the manufacturer says that positioning the TV at ANY POINT along the wall bracket works, so I think I’ll be okay.
I leveled it. I raised it on both sides (I was about 2″ short in where I installed the mount, but the TV portion of the mount had a work-around that I used).
And unlike my old TV (which I have to figure out how to legally get rid of -- see below for how I’ve decided to try to get rid of it without paying recycling fees; if someone offers me $5, I’m gonna take it), the new one hasn’t restarted itself once when I didn’t want it to so far (it did restart when it was first turned on and downloading software updates, but that was just part of the update process).
I had thought about waiting until tomorrow to install the new TV, but when I turned on my old TV earlier, the sound cut out -- and it literally took 15 minutes for the TV to reboot with sound. That was when I decided that today was the day. And I was without a TV in my room for HOURS while I installed the new one.
The only thing -- I went into my parents’ room to get a step stool which I knew was there to help me reach where I needed to install the wall mount (and, later, climb on top of my dresser, which I’m REALLY glad I didn’t end up tipping over as I carefully shifted my weight onto it -- there was one pretty close call). Their bedroom door was closed. They usually leave it open when they’re out of town.
And right by their dresser, there’s a 40″ flat screen TV, brand new in the box.
When my mom asked about birthday gifts, my reply included a new TV. They ended up getting me the Disneyland Spirit Award pin (which I mentioned in the same email), but...they may have decided to save that TV for a Christmas gift (and a very nice one, don’t get me wrong).
So tomorrow I’d better text to say Happy Thanksgiving and also mention that I bought myself something nice for my birthday. That way they can (hopefully) return the TV when they get home (if it is for me -- I don’t know that for sure), and if it will have been too long since they purchased it, maybe use it as an upgrade from their own smaller 32″ TV in their room.
Now I’m just waiting for my 5.1-Channel Sound Bar (with wireless sub-woofer and rear speakers -- which now costs $20 more on Best Buy’s website than it did when I bought it in their “pre-Black Friday sale”).
I ordered it in the same transaction as the wall mount, but while the wall mount arrived a day earlier than promised, the sound bar -- instead of being shipped from California to here in a couple days -- was sent from California to Utah for some reason. Even at 2:00 Wednesday morning, the tracking info from UPS said “on time delivery by the end of the day Tuesday the 21st.” Now it’s expected to arrive on the 24th, while I’m at work (hopefully it won’t get stolen from my front porch). I see on the tracking that it made it earlier tonight to the place in Nevada where it will be put on a truck to my house, but UPS also says that they don’t do deliveries on Thanksgiving (which is fine...just hoping that nobody steals this $170 sound bar as it sits on my porch all day until I get home from work). Also, I decided to sell my old TV on Craigslist rather than try to find a legal way to dispose of it (which might cost me money -- Best Buy wanted $15 to haul it away).
But fear not, my integrity is intact. The following is the ad I submitted (which can be seen at https://reno.craigslist.org/ele/d/flat-screen-smart-tv-works-as/6397825672.html):
I just replaced a 38½" 1080P Insignia flat screen Smart TV and it's yours for a low price. The original box (not included) said 39" Class but official diagonal measurement is 38½" (which my tape measure agrees with). For full product details, see https://www.insigniaproducts.com/pdp/NS-39DR510NA17/4863802 (this is the exact model I am selling -- they are still selling it new for $180, so you can save some money here if you can accept that it's a TV bought in July of 2016 and used daily since then).

A couple of caveats, because I want to run an honest ad: The TV has a history of restarting itself at random times. You might be watching a favorite show, and the TV will reset, go to the Insignia/ROKU logo, and start from scratch, so you may miss a part of what you are watching. Sometimes the sound gives out, and a reboot to fix it (large square button on the back left of the TV -- hold it down until it starts the reboot process) can take up to 10-15 minutes before you're back to watching TV. The basic restarts happen, on average, once a day (some days it happens twice, other days it doesn't happen at all), and those generally take a minute or two before you're back to watching TV or playing your video game (I would NOT recommend this TV for a gamer who can't press "pause" when this happens, like if you are playing a multiplayer online game). The 10-15 minute reboots, maybe once every two weeks.

If you can handle that, the TV has very good picture quality, and fit very well on top of a tall dresser that is 42" wide. No wall mount included. Legs can be easily removed if you have a wall mount already. Usually these restarts happen within 10-15 minutes of powering it on, but once in a while, you can be watching TV for hours, be really into a sports event, and...suddenly, there's an Insignia logo in place of the great play you were about to see, and now will have to look up on YouTube. This is why I replaced the TV after just 16 months -- it was frustrating to me. But if you're okay with this, or know how to fix it, or have a friend that you like a *little bit* that you feel obligated to get a gift for...this could be the TV for you.

Cable box and compact DVD/Blu-Ray player fit comfortably in the space between the TV and the surface -- your experience may vary based on the size of your cable/satellite box and/or DVD/Blu-Ray player (actual under-TV clearance: 2" on the left and right of center, 1½" at the center where the LED light and remote control sensor are located). Has original remote, and the legs are still attached. All settings have been restored to factory default. For inputs, see images. To clarify dimensions: TV itself is 3½" deep (9½" if you include the legs), 34½" wide (if you have limited space on your flat surface, a surface 27" wide would accommodate the legs with the TV hanging just over the sides), and 22" high (with legs -- 20 5/8" high without legs).

As far as when this sale (or barter, if you've got something you think I might want -- I'm certainly open to negotiations) can happen -- I work in a local casino, and cannot be seen on camera accepting cash when I'm at work since I'm a manager who is ineligible for tips, and a transaction like this could be seen as taking a tip if it happened while I am at work. I work from 3pm-11pm Friday through Tuesday. I would be available after 11:00pm any of those days (before work, I sleep until it's time to get ready for work -- if you work out a 12pm-1pm meeting in advance, I can adjust my sleep schedule for that day).
Wednesdays and Thursdays, I am free whenever works for you (this includes today, Thanksgiving). Given the problems that the TV has with restarting, I will accept offers. The $50 list price is not firm. I will say, though, that if I tell you we have a deal, we have a deal. If I get a higher offer while you're on your way, I'm going to honor our deal. If I accept your offer, that acceptance is firm and not going to change, even if I get a higher offer after the fact. I absolutely guarantee that you won't get a call on the way to my house telling you that the deal is off if I agreed to it. That's on me, not you. But if my phone is ringing off the hook (I know, wishful thinking) and people are outbidding each other, I'll keep you all informed as to the most recent deal. Again...this is Craigslist, and I live in a relatively small community, so...what I actually expect is for one person to call (if I'm lucky), offer a price, and then for me to accept that price and you take the TV away and give it to someone you sort of like, but don't like as much as you like [insert favorite food/music/movie/vacation destination here]. And again, I'm not against a barter if you have something cool that you no longer want. Either way, it beats me having to pay to have the TV recycled...especially when it works (most of the time) and has just about as many bells and whistles as the TV I just bought.
If someone actually pays the $50 asking price, I’d be shocked (though the TV does, honestly, work as well as it did out of the box -- it just sucked out of the box, but I disclosed that so I wouldn’t feel any guilt), but...the fact that I said “If I get a higher offer while you’re on your way, I’m going to honor our deal” (and not be greedy and go for a higher offer from someone who may call 20 minutes later) might get me some takers. I’ve never sold on Craigslist before (although I helped my parents sell a lawnmower, so I’m not a TOTAL newb), but...the promise of “I know I said we had a deal, but someone else just offered $10 more, do you want to match/beat it?” seems like such a slimeball move that I don’t want to go there.
I’m not a used car salesman. I’m just looking to pawn a mostly-working (but not without problems) TV off on someone else, after being honest about the TV’s pluses and minuses, so I don’t have to pay to have it recycled. Let someone else have my lemon TV -- as long as they have had the chance to see that it’s a lemon before they drop some money on it. Honestly, I’d be THRILLED if I got $20 of my $50 asking price...I don’t know Craigslist well, but even getting $20 would be worth it after saying no when Best Buy offered to haul my old TV away -- which they sold me -- if I paid them $15.
The newly opened space at Monticello, Thomas Jefferson’s palatial mountaintop plantation, is presented as the living quarters of Sally Hemings, an enslaved woman who bore the founding father’s children. The life it represents was anything but.
But it is more than an exhibit.
It’s the culmination of a 25-year effort to grapple with the reality of slavery in the home of one of liberty’s most eloquent champions. The Sally Hemings room opens to the public Saturday, alongside a room dedicated to the oral histories of the descendants of slaves at Monticello, and the earliest kitchen at the house, where Hemings’ brother cooked.
The public opening deals a final blow to two centuries of ignoring, playing down or covering up what amounted to an open secret during Jefferson’s life: his relationship with a slave that spanned nearly four decades, from his time abroad in Paris to his death.
To make the exhibit possible, curators had to wrestle with a host of thorny questions. How to accurately portray a woman for whom no photograph exists? (The solution: casting a shadow on a wall.) How to handle the skepticism of those who remain unpersuaded by the mounting evidence that Jefferson was indeed the father of Hemings’ children? (The solution: tell the story entirely in quotes from her son Madison.)
And, thorniest of all, in an era of Black Lives Matter and #MeToo: How to describe the decadeslong sexual relationship between Jefferson and Hemings? Should it be described as rape?
“We really can’t know what the dynamic was,” said Leslie Greene Bowman, president of the Thomas Jefferson Foundation. “Was it rape? Was there affection? We felt we had to present a range of views, including the most painful one.”
After a DNA test in 1998, the nonprofit foundation, which owns Monticello, determined that there was a “high probability” that Jefferson fathered at least one of Hemings’ children, and that he likely fathered them all. The new exhibit asserts Jefferson’s paternity as a fact.
The “Life of Sally Hemings” exhibit is perhaps the most striking example of the sea change that has taken place at Monticello, as the foundation has increasingly focused on highlighting the stories of Monticello’s slaves. The foundation has embarked on a multiyear, $35 million project aimed at restoring Monticello to the way it looked when Jefferson was alive. It rebuilt a slave cabin and workshops where slaves labored, and has hosted reunions there for the descendants of the enslaved population, including sleepovers. It removed a public bathroom installed in the 1940s atop slave quarters.
And it is phasing out the popular “house tour” of the mansion, which made only minimal mention of slavery alongside Jefferson’s accomplishments, radically changing what is experienced by the more than 400,000 tourists who visit Monticello annually.
Thanks to a short description given by one of Jefferson’s grandsons, historians believe that Hemings lived in the slave quarters in the South Wing. But they aren’t sure which room. Curators decided to tell Hemings’ story in one of the rooms. Instead of making it a period room with objects that she might have possessed, they left it empty, projecting the words of her son Madison on the wall to tell her story.
The 1995 movie “Jefferson in Paris” imagined that Hemings and Jefferson loved each other. But no one knows how they really felt. Their sexual relationship is believed to have started in France, where slavery was outlawed. Hemings wanted to remain in Paris, where she could have been granted freedom, but she eventually returned to Virginia with Jefferson after he offered her extraordinary privileges and freedom for any children she might have, according to an account by Madison Hemings. Her children, who were all fair-skinned and named after Jefferson’s friends, were freed when they reached adulthood.
No portrait or photograph exists of Hemings. Even her skin tone remains a mystery, and a source of controversy. Cartoons in the 18th century, which aimed to derail Jefferson’s political career, portrayed her as dark-skinned. But her father was a white plantation owner and her mother, an enslaved woman, was of mixed race. One account described Hemings as “mighty near white.” Curators at Monticello opted not to recreate a physical image of her. Instead, they will project a woman’s shadow on a wall.
At a time when sexual abuses by powerful men have dominated the news, curators struggled for months over how to describe the relationship between Hemings and Jefferson — and in particular whether to use the word “rape” in the exhibit. The foundation held conference calls and meetings with historians, board members and descendants to discuss the question.
“There are a lot of people who believe rape is too polarizing a word,” said Niya Bates, a public historian at Monticello. “But it was a conversation that we knew we could not avoid. It’s a conversation the public is already having.”
In the end, historians opted to use the word “rape” with a question mark, knowing that some would criticize them for including the word, while others would have criticized them for leaving it out.
The question is asked on a plaque on the wall outside the Hemings exhibit titled “Sex, Power and Ownership.” It spells out the power dynamic between the two: Under Virginia law, Hemings was Jefferson’s property.
Curators acknowledged that the question could be difficult for some visitors to digest, especially schoolchildren.
“We’re still having a little heartburn” about the placement of the plaque, Bates said.
Lucia “Cinder” Stanton, a retired historian who spent 25 years collecting oral history from the descendants of slaves at Monticello, said it remains to be seen how the public will react at a time when political views have become so extreme.
“The words ‘rape’ and ‘rapist,’ what it conjures up is not a nuanced situation,” she said. “There were other relationships like theirs which were clearly love matches.”
Some couples moved to Ohio, where slavery was outlawed, she said, adding: “Jefferson wasn’t that. But he wasn’t violently accosting Sally Hemings every day for 30 years.”
At reunions of the descendants of Monticello’s slaves, the question of whether Jefferson is guilty of rape has sparked heated arguments.
“I really don’t think slaves had a choice,” said Rosemary Medley Ghoston, a retired hairdresser in Ohio who discovered in the 1980s, through genealogical research, that she was a descendant of Madison Hemings. “Maybe if it was not rape, it was a duty that she had to fulfill.”
But her distant cousin, Julius “Calvin” Jefferson, whom she met at a descendants’ event, feels differently.
“I think it was a love story,” he said, noting that Hemings was the half sister of Jefferson’s late wife, Martha, whose death had devastated him. “Did she look like Martha? I think she did.”
The exhibit has divided the white descendants of Jefferson’s acknowledged family, and stoked outrage among a small but determined group of Jefferson enthusiasts who insist that he didn’t father Hemings’ children.
“The charge is an extremely serious charge against him,” said Mary Kelley, a sculptor from Chevy Chase, Maryland, who took a tour of Monticello in 2013 and was shocked by what she considered to be the guide’s negative tone about a man she has always idolized.
Afterward, she joined the Thomas Jefferson Heritage Society, a group that was formed to dispute the growing historical consensus that Jefferson fathered Hemings’ children.
Now Kelley hunts down clues about who else could have fathered Hemings’ children and writes articles criticizing the plans for the Sally Hemings exhibit. She even created an artistically rendered drawing of the DNA used in the 1998 paternity test, and plans to attend a coming conference in Charlottesville, where heritage society members will share papers they have written.
“Some nights I just curl up in the semidark and just read his letters,” she says of Jefferson. “He just doesn’t seem to be a person who would do this.”
John H. Works Jr., a descendant of Jefferson’s who is among the founding members of the Thomas Jefferson Heritage Society, accuses the nonprofit organization that runs Monticello of bowing to political correctness, and insists that the entire premise of the exhibit is flawed.
But his brother, David Works, who has embraced the descendants of slaves at Monticello as “cousins,” attended a special viewing Friday to celebrate.
“They are actually showing it as it was,” he said.
Annette Gordon-Reed, a history professor at Harvard University whose book, “Thomas Jefferson and Sally Hemings: An American Controversy,” helped bolster Monticello’s transformation, said that it would take time for people to accept the changes.
“Some people come here and say, ‘I didn’t come here, to a slave plantation, to hear about slavery,'” she said. “There’s nothing to do but keep pushing back.”
This article originally appeared in The New York Times.
FARAH STOCKMAN and GABRIELLA DEMCZUK © 2018 The New York Times
How Long Should Your Meta Description Be? (2018 Edition)
Posted by Dr-Pete
Summary: The end of November saw a spike in the average length of SERP snippets. Across 10K keywords (90K results), we found a definite increase but many oddities, such as video snippets. Our data suggests that many snippets are exceeding 300 characters, and going into 2018 we recommend a new meta description limit of 300 characters.
Back in spring of 2015, we reported that Google search snippets seemed to be breaking the 155-character limit, but our data suggested that these cases were fairly rare. At the end of November, RankRanger's tools reported a sizable jump in the average search snippet length (to around 230 characters). Anecdotally, we're seeing many long snippets in the wild, such as this 386-character one on a search for "non compete agreement":
Search Engine Land was able to get confirmation from Google of a change to how they handle search snippets, although we don't have specifics or official numbers. Is it time to revisit our guidelines on meta descriptions limits heading into 2018? We dug into our daily 10,000-keyword tracking data to find out...
The trouble with averages
In our 10K tracking data for December 15th, which consisted of 89,909 page-one organic results, the average display snippet (stripped of HTML, of course) was 215 characters long, slightly below RankRanger's numbers, but well above historical trends.
This number is certainly interesting, but it leaves out quite a bit. First of all, the median character length is 186, suggesting that some big numbers are potentially skewing the average. On the other hand, some snippets are very short because their meta descriptions are very short. Take this snippet for Vail.com:
Sure enough, this is Vail.com's meta description tag (I'm not gonna ask):
Do we really care that a lot of people just write ridiculously short meta descriptions? No, what we really want to know is at what point Google is cutting off long descriptions. So, let's just look at the snippets that were cut (determined by the " ..." at the end). In our data set, this leaves just about 3.6% (3,213), so we can already see that the vast majority of descriptions aren't getting cut off.
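As a rough sketch of that filtering step (plus the 25-character binning used for the distribution discussed next), assuming `snippets` is a list of display-snippet strings already stripped of HTML:

```python
from collections import Counter

def cut_snippet_stats(snippets, bin_size=25):
    """Return the share of snippets Google truncated and a histogram of their lengths."""
    cut = [s for s in snippets if s.endswith(" ...")]
    share = len(cut) / len(snippets) if snippets else 0.0
    bins = Counter((len(s) // bin_size) * bin_size for s in cut)
    return share, dict(sorted(bins.items()))

# Placeholder data, just to show the shape of the output:
snippets = [
    "A short snippet that was left alone.",
    "A much longer snippet that Google decided to truncate at some point ...",
]
share, bins = cut_snippet_stats(snippets)
print(f"{share:.1%} of snippets were cut; lengths by bin start: {bins}")
```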
Coincidentally, the average is still 215, but let's look at the frequency distribution of the lengths of just the cut snippets. The graph below shows cut-snippet lengths in bins of 25 (0-25, 25-50, etc.):
If we're trying to pin down a maximum length for meta descriptions, this is where things get a bit weird (and frustrating). There seems to be a chunk of snippets cut off at the 100–125 character range and another chunk at the 275–300 range. Digging in deeper, we discovered that two things were going on here...
Oddity #1: Video snippets
Spot-checking some of the descriptions cut off in the 100–125 character range, we realized that a number of them were video snippets, which seem to have shorter limits:
These snippets seem to generally max out at two lines, and they're further restricted by the space the video thumbnail occupies. In our data set, a full 88% of video snippets were cut off (ended in " ..."). Separating out video, only 2.1% of organic snippets were cut off.
Oddity #2: Pre-cut metas
A second oddity was that some meta description tags seem to be pre-truncated (possibly by CMS systems). So, the "..." in those cases is an unreliable indicator. Take this snippet, for example:
This clocks in at 150 characters, right around the old limit. Now, let's look at the meta description:
This Goodreads snippet is being pre-truncated. This was true for almost all of the Goodreads meta descriptions in our data set, and may be a CMS setting or a conscious choice by their SEO team. Either way, it's not very useful for our current analysis.
So, we attempted to gather all of the original meta description tags to check for pre-truncated data. We were unable to gather data from all sites, and some sites don't use meta description tags at all, but we were still able to remove some of the noise.
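A minimal sketch of that collection step, assuming the requests and beautifulsoup4 packages (real crawling would also need rate limiting and broader error handling, and pages rendered by JavaScript are not covered):

```python
import requests
from bs4 import BeautifulSoup

def get_meta_description(url):
    """Fetch a page and return its meta description content, or None if it has none."""
    resp = requests.get(url, timeout=10, headers={"User-Agent": "snippet-research-bot"})
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    return tag["content"].strip() if tag and tag.get("content") else None

desc = get_meta_description("https://example.com/")
if desc is not None and desc.endswith("..."):
    print("Meta description is pre-truncated by the site itself; treat its '...' as unreliable.")
```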
Let's try this again (...)
So, let's pull out all of the cut snippets with video thumbnails and the ones where we know the meta description ended in "...". This cuts us down to 1,722 snippets (pretty deep dive from the original 89,909). Here's what the frequency distribution of lengths looks like now:
Now, we're getting somewhere. There are still a few data points down in the 150–175 range, but once I hand-checked them, they appear to be sites that had meta description tags ending in "..." that we failed to crawl properly.
The bulk of these snippets are being cut off in the 275–325 character range. In this smaller, but more normal-looking distribution, we've got a mean of 299 characters and a median of 288 characters. While we've had to discard a fair amount of data along the way, I'm much more comfortable with these numbers.
What about the snippets over 350 characters? It's hard to see from this graph, but they maxed out at 375 characters. In some cases, Google is appending their own information:
While the entire snippet is 375 characters, the "Jump..." link is added by Google. The rest of the snippet is 315 characters long. Google also adds result counts and dates to the front of some snippets. These characters don't seem to count against the limit, but it's a bit hard to tell, because we don't have a lot of data points.
Do metas even matter?
Before we reveal the new limit, here's an uncomfortable question: when it seems like Google is rewriting so many snippets, is it worth having meta description tags at all? Across the data set, we were able to successfully capture 70,059 original meta description tags (in many of the remaining cases, the sites simply didn't define one). Of those, just over one-third (35.9%) were used as-is for display snippets.
Keep in mind, though, that Google truncates some of these and appends extra data to some. In 15.4% of cases, Google used the original meta description tag, but added some text. This number may seem high, but most of these cases were simply Google adding a period to the end of the snippet. Apparently, Google is a stickler for complete sentences. So, now we're up to 51.3% of cases where either the display snippet perfectly matched the meta description tag or fully contained it.
What about cases where the display snippet used a truncated version of the meta description tag? Just 3.2% of snippets matched this scenario. Putting it all together, we're up to almost 55% of cases where Google is using all or part of the original meta description tag. This number is probably low, as we're not counting cases where Google used part of the original meta description but modified it in some way.
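In code terms, the comparison behind those percentages might look something like this sketch (pure string matching, so it will undercount cases where Google lightly rewrote the tag):

```python
def classify_snippet(display_snippet, meta_description):
    """Roughly classify how a display snippet relates to a page's meta description tag."""
    if not meta_description:
        return "no meta description defined"
    snippet = display_snippet.rstrip(" .")   # ignore a trailing ' ...' or an added period
    meta = meta_description.strip()
    if snippet == meta.rstrip(" ."):
        return "used as-is"
    if meta in display_snippet:
        return "meta description plus appended text"
    if meta.startswith(snippet):
        return "truncated version of the meta description"
    return "rewritten (at least in part) by Google"

print(classify_snippet("A page about widgets ...", "A page about widgets and how to buy them."))
# -> "truncated version of the meta description"
```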
It's interesting to note that, in some cases, Google rewrote a meta description because the original description was too short or not descriptive enough. Take this result, for example:
Now, let's check out the original meta description tag...
In this case, the original meta description was actually too short for Google's tastes. Also note that, even though Google created the snippet themselves, they still cut it off with a "...". This strongly suggests that cutting off a snippet isn't a sign that Google thinks your description is low quality.
On the flip side, I should note that some very large sites don't use meta description tags at all, and they seem to fare perfectly well in search results. One notable example is Wikipedia, a site for which defining meta descriptions would be nearly impossible without automation, and any automation would probably fall short of Google's own capabilities.
I think you should be very careful using Wikipedia as an example of what to do (or what not do), when it comes to technical SEO, but it seems clear from the data that, in the absence of a meta description tag, Google is perfectly capable of ranking sites and writing their own snippets.
At the end of the day, I think it comes down to control. For critical pages, writing a good meta description is like writing ad copy — there's real value in crafting that copy to drive interest and clicks. There's no guarantee Google will use that copy, and that fact can be frustrating, but the odds are still in your favor.
Is the 155 limit dead?
Unless something changes, and given the partial (although lacking in details) confirmation from Google, I think it's safe to experiment with longer meta description tags. Looking at the clean distribution, and just to give it a nice even number, I think 300 characters is a pretty safe bet. Some snippets that length may get cut off, but the potential gain of getting in more information offsets that relatively small risk.
That's not to say you should pad out your meta descriptions just to cash in on more characters. Snippets should be useful and encourage clicks. In part, that means not giving so much away that there's nothing left to drive the click. If you're artificially limiting your meta descriptions, though, or if you think more text would be beneficial to search visitors and create interest, then I would definitely experiment with expanding.
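If you do experiment with longer descriptions, a quick audit can flag pages whose tags already run past the new guideline. Below is a minimal sketch, assuming the requests and beautifulsoup4 packages are installed; the 300-character threshold mirrors the recommendation above, and the URLs are purely illustrative.

import requests
from bs4 import BeautifulSoup

LIMIT = 300  # character guideline discussed above

def meta_description_length(url):
    """Return the length of a page's meta description, or None if no tag is defined."""
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("meta", attrs={"name": "description"})
    if tag is None or not tag.get("content"):
        return None
    return len(tag["content"].strip())

for url in ["https://example.com/", "https://example.com/pricing"]:  # illustrative URLs
    length = meta_description_length(url)
    if length is None:
        print(url, "- no meta description tag")
    elif length > LIMIT:
        print(url, "-", length, "chars - may be truncated in the SERP")
    else:
        print(url, "-", length, "chars - within the", LIMIT, "character guideline")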
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from Blogger http://ift.tt/2D89Ys7 via IFTTT
0 notes
Text
How to Beat Google’s Mobile Page Speed Benchmarks – Search Engine Journal
Brad Smith
Google recently unveiled mobile page speed industry benchmarks and analyzed customer behavior to figure out how the two lined up.
Unfortunately, they didn’t.
Meaning:
Most mobile websites are slooooooooooow.
Consumers won’t wait longer than a few seconds.
That’s a problem. It means the vast majority of mobile websites are losing money, practically forcing customers to bounce and go somewhere else.
Here’s why that’s happening and what you should do about it.
Slow Page Load Speed Sabotages Your Revenue
The probability of someone bouncing from your site increases by 113 percent if it takes seven seconds to load, according to Google’s mobile page speed industry benchmarks, which were released in February.
The problem?
The average time it takes to fully load a mobile landing page is 22 seconds, according to the same report.
That’s not good. In fact, it’s awful, because that trickle-down effect hits your bottom line, too. Slower sites cause more bounces, which then lower conversions:
“Similarly, as the number of elements—text, titles, images—on a page goes from 400 to 6,000, the probability of conversion drops 95 percent.”
This is nothing new. Slow page speeds have been public enemy number one for years. Over a decade ago, then-Googler Marissa Mayer confirmed that Google themselves saw a 20 percent drop in traffic with just a 0.5-second delay.
Mobile-first indexation is coming, and speed is mobile SEO’s Achilles’ heel. E-commerce brands lose half of their traffic if pages take three seconds or longer, which has motivated some to get up and running in less than a second.
The primary reason for slow loads? In a word: bloat.
Too much. The way you feel after a Thanksgiving feast.
Google’s latest industry benchmark report analyzed more than 900,000 mobile ads from 126 countries (so sample size apparently ain’t a problem). Seventy percent of pages were over 1MB and “1.49MB takes 7 seconds to load using a fast 3G connection” (which brings things back to seven seconds and the 113 percent increase in likelihood to bounce).
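That seven-second figure also lines up with simple arithmetic. As a rough sanity check, assume a fast 3G connection delivers roughly 1.6 Mbps of downlink throughput (the figure used by common browser throttling presets; Google's report doesn't state the exact number, so treat it as an assumption):

page_weight_mb = 1.49          # page weight cited in Google's benchmark report
throughput_mbps = 1.6          # assumed "fast 3G" downlink

seconds = (page_weight_mb * 8) / throughput_mbps   # 1 megabyte = 8 megabits
print(round(seconds, 1))                            # ~7.5 seconds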
The solution isn’t easy. You’re not gonna like it.
In fact, you might be tempted by a shortcut. It might seem easier initially to use a mobile-friendly alternative like AMP or Facebook Instant Articles.
But that would be a mistake.
Here’s why.
The Problem With AMP & Instant Articles
The Accelerated Mobile Pages Project (AMP) is a self-described “open-source initiative” with the lofty ideal to make the web faster.
Companies who use their technology can see mobile pages load “nearly instantaneously.” It does that by minimizing the amount of resources required through optimizing and compressing notoriously ginormous files like your images.
The ideals are lofty and ambitious. And the results are admittedly good.
Wired Magazine is just one of many huge publishers to reveal glowing highlights, with a 25 percent increase in click-through rates from search results. Gizmodo’s AMP traffic is 80 percent new visitors (presumably coming via search).
Why does AMP perform so well? You don’t need Benedict Cumberbatch for that one. It’s a Google-backed project. So AMP pages tend to get, how should we say, prime mobile SERP placement.
That’s a good thing. But there are a few drawbacks.
AMP is technically more difficult to implement, for starters. Jan Dawson argues that it’s effectively making it harder to publish on the web, writing:
“Technically, these formats use standards-based elements — for example, AMP is a combination of custom HTML, custom JavaScript and caching. But the point here is the outputs from traditional online publishing platforms aren’t compatible with any of these three formats. And in order to publish to these formats directly, you need to know a lot more code than I ever did back in the mid-1990s before the first round of WYSIWYG tools for the web emerged.”
Fortunately, things are slightly easier for WordPress sites. Here’s a three-step guide to setting up AMP on a WordPress site.
There are other problems, though. Losing your branding on AMP pages is one thing. Not good but not a deal killer necessarily. Losing your mobile traffic to Google is quite another, and it’s also the crux of the issue.
AMP content isn’t technically yours anymore. This can impact things like ad revenue, where results are mixed, as seen in the following tweets from Marie Haynes that caught my eye a few months back:
Facebook’s Instant Articles work largely the same way as AMP. Similar pros and cons, too.
Pages load on super speed on the plus side, reportedly up to 10x quicker. Early results from Facebook Partners also showed a 70 percent decrease in Instant Article abandonment (with a 20 percent CTR to boot).
But the same proprietary infrastructure problems have caused many media conglomerates to hit the Pause button. According to analysis from NewsWhip and Digiday, several notable companies have pulled back on Facebook Instant Articles in the last year or so:
Boston Globe went from an incredible 100 percent to 0 percent
Business Insider posted 10 percent and now barely posts 2 percent
The New York Times has dropped to 10 percent
The Atlantic went from posting 85 percent to now only around 10 percent
Other early adopters like the BBC News, National Geographic, and The Wall Street Journal are now “barely using the platform”
Now, this isn’t a Chicken Little, “sky is falling” kinda thing. But it is a cause for concern.
Mobile-friendly platforms offer a tremendous shortcut in boosting mobile page speed. However, there are very serious drawbacks, too, like band-aids on broken arms.
A more prudent approach is to roll up your sleeves, take the long view, and fix your site from the ground-up.
Here’s how to do it.
How to Diagnose Slow Mobile Page Speed
Test My Site is the new version of Google’s old PageSpeed tool (complete with the latest and greatest, 2017 OC Housewives-style facelift).
So start there.
Point and click. This page should pop up next.
Just plug in your URL and hit Test Now.
First, you’ll see the Mobile Friendliness score. Then in the middle is the mobile speed score in question.
Ruh roh.
Let’s scroll down a bit to find out more details on that near-failing grade.
Click on that little box to bring up a detailed assessment of where your site is doing well, along with those areas that aren’t doing so well.
Google mercifully goes into the details of which individual elements are causing you the biggest problems.
Here’s another view of this mobile page speed assessment on a mobile device. Because… why not? Everyone loves a good meta joke.
OK. So the result ain’t pretty. That’s fine. Because now we know what to fix.
The next step is to dive into some of these new mobile page speed industry benchmarks and figure out how to increase them.
Buckle up. It’s about to get geeky.
How to Beat 3 Google Mobile Page Speed Benchmarks
1. Reduce Your Average Request Count
Google’s Best Practice: Fewer Than 50
Requests are literal. Someone tries to visit your website and their browser requests information from your server. The data is compiled and sent back.
The more requests, the longer it takes. Reduce the number of requests that need to be sent back-and-forth and you can greatly reduce average page loading times across the board.
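As a rough way to see where a page stands against the fewer-than-50 benchmark, you can count the script, stylesheet, and image references in its HTML. The sketch below is only an approximation (it misses requests triggered later by JavaScript or CSS) and assumes the requests and beautifulsoup4 packages are installed; the URL is illustrative.

import requests
from bs4 import BeautifulSoup

def rough_request_count(url):
    """Approximate per-type request counts from a page's static HTML."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    counts = {
        "scripts": len(soup.find_all("script", src=True)),
        "stylesheets": len(soup.find_all("link", rel="stylesheet")),
        "images": len(soup.find_all("img", src=True)),
    }
    counts["total"] = sum(counts.values()) + 1   # +1 for the HTML document itself
    return counts

print(rough_request_count("https://example.com/"))  # illustrative URL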
First, reduce the number of files that need to be sent. Yoast cites JavaScript, CSS, and images as your three primary problems.
Minifying JavaScript and CSS kills two birds with one stone. It reduces the number of files that need to be sent back-and-forth. It reduces the overall file size, too.
The GIDNetwork will help you run a compression audit.
Gzip will turn website files into zip files for easier transfers.
WP Super Minify is a WordPress plugin that will do a lot of heavy lifting for you.
Otherwise, Yahoo’s YUI Compressor can help tackle both CSS and JavaScript compression.
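To get a feel for what the Gzip step above buys you before touching server config, here is a minimal sketch using Python's standard-library gzip module. The file name is illustrative, and in production the compression is usually applied on the fly by the web server rather than ahead of time.

import gzip
from pathlib import Path

css = Path("styles.css").read_bytes()    # illustrative file
compressed = gzip.compress(css)

saved = 100 * (1 - len(compressed) / len(css))
print(len(css), "bytes ->", len(compressed), "bytes,", round(saved), "% smaller")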
Contemporary web design is 90 percent image-driven. I just made up that stat. But you get the point. Today’s websites look like hollow shells if you remove the beautiful, retina-ready images that stretch across your screen.
The problem is that images (if not handled properly) will kill loading times. Once again, Yoast recommends using CSS sprites to combine multiple images into one. SpriteMe, for example, will take background images and combine them to decrease the total number of individual images.
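Tools like SpriteMe handle this for you, but the underlying idea is just pasting many small images onto one canvas and then pointing CSS background-position offsets at the right slice. A minimal Pillow sketch, where the icon file names and the 32-pixel grid are illustrative assumptions:

from PIL import Image

icons = ["facebook.png", "twitter.png", "rss.png"]   # illustrative 32x32 icons
sprite = Image.new("RGBA", (32 * len(icons), 32))

for i, name in enumerate(icons):
    sprite.paste(Image.open(name).resize((32, 32)), (i * 32, 0))

sprite.save("sprite.png")   # one request instead of three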
Content Delivery Networks (CDNs) can also help you recoup bandwidth and cut down on website requests. They host large image files for you and distribute them across their own global network of servers. MaxCDN and CloudFlare are among the most popular.
Last but certainly not least, reduce the number of redirects you use wherever possible. Redirects create additional requests, so proceed with caution.
2. Decrease Average Page Weight
Google’s Best Practice: Less Than 500KB
Seventy-eight percent of shoppers want more product images, according to the Omni Channel Retail report from BigCommerce.
The problem, as we just discussed, is that images can cripple page loads. They create more requests for servers. But they put your average page weight on a bulking plan that would make those meathead bodybuilders at your gym rage with envy.
Page size should be less than 500KB according to Google. And yet a single, unoptimized, high-res image already clocks in at around 1 or 2 MB.
You could start by simply cropping your images so each one is the exact width and height of the space where it’s used. Except, of course, nobody ever does that. Manually. Every single time they upload an image.
So instead, let’s start by compressing the image file itself with something like WP Smush.it. A non-WordPress tool like Compressor.io can also reduce an image by up to 73 percent.
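If you would rather script the compression than run files through a web tool one by one, the same idea can be batch-applied with Pillow. A minimal sketch; the folder name, target width, and JPEG quality are purely illustrative.

from pathlib import Path
from PIL import Image

MAX_WIDTH = 1200   # illustrative cap for full-width images
QUALITY = 70       # illustrative JPEG quality

for path in Path("images").glob("*.jpg"):
    img = Image.open(path)
    if img.width > MAX_WIDTH:
        img = img.resize((MAX_WIDTH, round(img.height * MAX_WIDTH / img.width)))
    out = path.with_name(path.stem + "-optimized.jpg")
    img.save(out, "JPEG", quality=QUALITY, optimize=True)
    print(path.name, path.stat().st_size, "->", out.stat().st_size, "bytes")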
Let’s run a quick scenario:
Average e-commerce website conversions hover around 1-3 percent.
That number can rise as high as 5 percent. (One example, Natomounts, sees 5 percent conversion rates with ~85 percent from mobile!)
We just discovered that shoppers want more product images.
And yet, according to Radware, 45 percent of the top 100 e-commerce sites don’t compress images!
3. Decrease Average Time to First Byte
Google’s Best Practice: Under 1.3 seconds
Time to first byte (TTFB) is a measurement that shows how long a browser has to wait before receiving its first byte of data from the server.
It’s essentially a three-step process:
A visitor sends an HTTP request to your server.
Your server has to figure out how to respond. This includes gathering the data required and organizing it to be sent back.
Assuming all goes well, the request is sent back to the visitor.
TTFB is the time it takes for that complete cycle to finish.
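You can get a rough read on your own TTFB by timing how long the first byte of a response takes to arrive. A minimal sketch using only the Python standard library; the URL is illustrative, and the number includes DNS, connection, and TLS time, so treat it as an approximation of what dedicated monitoring tools report.

import time
import urllib.request

def time_to_first_byte(url):
    """Seconds from starting the request until the first byte of the body is read."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read(1)   # block until the first byte arrives
    return time.perf_counter() - start

ttfb = time_to_first_byte("https://example.com/")   # illustrative URL
print(round(ttfb, 2), "seconds (Google's benchmark: under 1.3)")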
We’ve already covered a few potential roadblocks during this journey. Too many requests, too many redirects, too many junky WordPress plugins, etc. all take their toll. A website visitor’s own network connection and speed also make an impact.
The aforementioned CDNs also help by reducing your server’s workload. They take over the burden of delivering large files so your own server can focus on delivering the rest of your site’s files and content. The best CDNs even go the extra mile. For example, reducing the physical distance between the person requesting a file and the server sending it can have a huge impact.
Caching reduces TTFB by helping web browsers store your website data. Best of all, it only takes a simple plugin (like W3 Total Cache) or using a premium web host that will set up caching for you at the server-level (so no additional tools or plugins are needed).
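If you control the application yourself rather than relying on a plugin, the browser-caching half of this comes down to sending long-lived cache headers on static responses. A minimal Flask sketch, purely illustrative of the header involved (the /assets route and 30-day lifetime are assumptions, not a W3 Total Cache feature):

from flask import Flask, send_from_directory

app = Flask(__name__)

@app.route("/assets/<path:filename>")
def cached_asset(filename):
    # Tell browsers (and CDNs) they may reuse this file for 30 days without re-requesting it.
    response = send_from_directory("assets", filename)
    response.headers["Cache-Control"] = "public, max-age=2592000"
    return response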
A web host is like your server’s foundation. You can optimize images all you’d like. Use the best CDN on the market. But if you’re using slow shared hosting that splits resources, your site is going to be slow no matter how many tricks or tips or hacks you use.
Last but not least, a little sleight of hand.
Technically, removing JavaScript files from the head section and relocating them lower on an HTML document won’t reduce the overall number of requests or reduce file sizes. But it will help the important stuff — like the words on each page — to load a little quicker.
JavaScript is selfish. It wants to load all of its code before allowing anything else on the page to have a turn. Pushing it further down forces it to wait its turn until after a few images and basic content can pop up first.
Lazy loading is another common technique that won’t load (or display) an image until it’s within view. That way, page content can be loaded first. That’s helpful on long pages with tons of images (like this blog post). WPMU has a list of six lazy-loading WordPress plugins to try out.
Conclusion
Google has helpfully provided a few mobile page speed benchmarks to shoot for based on their in-depth analysis of what customers want. Unfortunately, the vast majority of websites are nowhere close to them.
Slow mobile page speed has been shown to cause users to bounce, which affects where you show up in search results, and ultimately what your website is able to generate in revenue.
Start by reducing the number of requests that happen each time someone visits your site. Then reduce file sizes along with average time to first byte.
It’ll take some heavy lifting. Definitely some dev help. But it’s your only shot at rescuing sub-par performance that’s sabotaging your bottom line.
Image Credits
Featured Image: Templune/Pixabay.com
In-Post Photo: Google.com
In-Post Photo: Facebook.com
Screenshots by Brad Smith. April 2017.
https://www.searchenginejournal.com/mobile-page-speed-benchmarks/194511/
On – 24 Apr, 2017 By Brad Smith
source https://andlocal.org/how-to-beat-googles-mobile-page-speed-benchmarks-search-engine-journal/ from ANDLOCAL http://andlocal.blogspot.com/2017/05/how-to-beat-googles-mobile-page-speed.html
0 notes
Text
How to Beat Google’s Mobile Page Speed Benchmarks – Search Engine Journal
Brad Smith
Google recently unveiled mobile page speed industry benchmarks and analyzed customer behavior to figure out how the two lined up.
Unfortunately, they didn’t.
Meaning:
Most mobile websites are slooooooooooow.
Consumers won’t wait longer than a few seconds.
That’s a problem. It means the vast majority of mobile websites are losing money, practically forcing customers to bounce and go somewhere else.
Here’s why that’s happening and what you should do about it.
Slow Page Load Speed Sabotages Your Revenue
The probability of someone bouncing from your site increases by 113 percent if it takes seven seconds to load, according to Google’s mobile page speed industry benchmarks, which were released in February.
The problem?
The average time it takes to fully load a mobile landing page is 22 seconds, according to the same report.
That’s not good. In fact, it’s awful because that trickle down effect hits your bottom line, too. Slower sites cause more bounces which then lowers conversions:
“Similarly, as the number of elements—text, titles, images—on a page goes from 400 to 6,000, the probability of conversion drops 95 percent.”
This is nothing new. Slow page speeds have long been public enemy number one for years. Over a decade ago, then-Googler Marissa Mayer confirmed that Google themselves saw a 20 percent drop in traffic with just a 0.5-second delay.
Mobile-first indexation is coming, and speed is the mobile SEO Achilles Heel. E-commerce brands lose half of their traffic if pages take three seconds or longer, which has motivated some to get up-and-running in less than a second.
The primary reason for slow loads? In a word: bloat.
Too much. The way you feel after a Thanksgiving feast.
Google’s latest industry benchmark report analyzed more than 900,000 mobile ads from 126 countries (so sample size apparently ain’t a problem). Seventy percent of pages were over 1MB and “1.49MB takes 7 seconds to load using a fast 3G connection” (which brings things back to seven seconds and the 113 percent increase in likelihood to bounce).
The solution isn’t easy. You’re not gonna like it.
In fact, you might be tempted by a shortcut. It might seem easier initially to use a mobile-friendly alternative like AMP or Facebook Instant Articles.
But that would be a mistake.
Here’s why.
The Problem With AMP & Instant Articles
The Accelerated Mobile Pages Project (AMP) is a self-described “open-source initiative” with the lofty ideal to make the web faster.
Companies who use their technology can see mobile pages load “nearly instantaneously.” It does that by minimizing the amount of resources required through optimizing and compressing notoriously ginormous files like your images.
The ideals are lofty and ambitious. And the results are admittedly good.
Wired Magazine is just one of many huge publishers to reveal glowing highlights, with a 25 percent increase in click-through rates from search results. Gizmodo’s AMP traffic is 80 percent new visitors (presumably coming via search).
Why does AMP perform so well? You don’t need Benedict Cumberbatch for that one. It’s a Google-backed project. So AMP pages tend to get, how should we say, prime mobile SERP placement.
That’s a good thing. But there are a few drawbacks.
AMP is technically more difficult to implement, for starters. Jan Dawson argues that it’s effectively making it harder to publish on the web, writing:
“Technically, these formats use standards-based elements — for example, AMP is a combination of custom HTML, custom JavaScript and caching. But the point here is the outputs from traditional online publishing platforms aren’t compatible with any of these three formats. And in order to publish to these formats directly, you need to know a lot more code than I ever did back in the mid-1990s before the first round of WYSIWYG tools for the web emerged.”
Fortunately, things are slightly easier for WordPress sites. Here’s a three-step guide to setting up AMP on a WordPress site.
There are other problems, though. Losing your branding on AMP pages is one thing. Not good but not a deal killer necessarily. Losing your mobile traffic to Google is quite another, and it’s also the crux of the issue.
AMP content isn’t technically yours anymore. This can impact things like ad revenue, where results are mixed, as seen in the following tweets from Marie Haynes that caught my eye a few months back:
Facebook’s Instant Articles work largely the same way as AMP. Similar pros and cons, too.
Pages load on super speed on the plus side, reportedly up to 10x quicker. Early results from Facebook Partners also showed a 70 percent decrease in Instant Article abandonment (with a 20 percent CTR to boot).
But the same proprietary infrastructure problems have caused many media conglomerates to hit the Pause button. According to analysis from NewsWhip and Digiday, several notable companies have pulled back on Facebook Instant Articles in the last year or so:
Boston Globe went from an incredible 100 percent to 0 percent
Business Insider posted 10 percent and now barely posts 2 percent
The New York Times has dropped to 10 percent
The Atlantic went from posting 85 percent to now only around 10 percent
Other early adopters like the BBC News, National Geographic, and The Wall Street Journal are now “barely using the platform”
Now, this isn’t a Chicken Little, “sky is falling” kinda thing. But it is a cause for concern.
Mobile-friendly platforms offer a tremendous shortcut in boosting mobile page speed. However, there are very serious drawbacks, too, like band-aids on broken arms.
A more prudent approach is to roll up your sleeves, take the long view, and fix your site from the ground-up.
Here’s how to do it.
How to Diagnose Slow Mobile Page Speed
Test My Site is the new version of Google’s old PageSpeed tool (complete with the latest and greatest, 2017 OC Housewives-style facelift).
So start there.
Point and click. This page should pop up next.
Just plug in your URL and hit Test Now.
First, you’ll see the Mobile Friendliness score. Then in the middle is the mobile speed score in question.
Ruh roh.
Let’s scroll down a bit to find out more details on that near-failing grade.
Click on that little box to bring up a detailed assessment of where your site is doing well, along with those areas that aren’t doing so well.
Google mercifully goes into the details of which individual elements are causing you the biggest problems.
Here’s another view of this mobile page speed assessment on a mobile device. Because… why not? Everyone loves a good meta joke.
OK. So the result ain’t pretty. That’s fine. Because now we know what to fix.
The next step is to dive into some of these new mobile page speed industry benchmarks and figure out how to increase them.
Buckle up. It’s about to get geeky.
How to Beat 3 Google Mobile Page Speed Benchmarks
1. Reduce Your Average Request Count
Google’s Best Practice: Fewer Than 50
Requests are literal. Someone tries to visit your website and their browser requests information from your server. The data is compiled and sent back.
The more requests, the longer it takes. Reduce the number of requests that need to be sent back-and-forth and you can greatly reduce average page loading times across the board.
First, reduce the number of files that need to be sent. Yoast cites JavaScript, CSS, and images as your three primary problems.
Minifying JavaScript and CSS kills two birds with one stone. It reduces the number of files that need to be sent back-and-forth. It reduces the overall file size, too.
The GIDNetwork will help you run a compression audit.
Gzip will turn website files into zip files for easier transfers.
WP Super Minify is a WordPress plugin that will do a lot of heavy lifting for you.
Otherwise, Yahoo’s YUI Compressor can help tackle both CSS and JavaScript compression.
Contemporary web design is 90 percent image-driven. I just made up that stat. But you get the point. Today’s websites look like hollow shells if you remove the beautiful, retina-ready images that stretch across your screen.
The problem is that images (if not handled properly) will kill loading times. Once again, Yoast recommends using CSS sprites to combine multiple images into one. SpriteMe, for example, will take background images and combine them to decrease the total number of individual images.
Content Delivery Networks (CDNs) can also help you recoup bandwidth and cut down on website requests. They host large image files for you and distribute them across their own global network of servers. MaxCDN and CloudFlare are among the most popular.
Last but certainly not least, reduce redirects you use if possible. Redirects create additional requests. So proceed with caution.
2. Decrease Average Page Weight
Google’s Best Practice: Less Than 500KB
Seventy-eight percent of shoppers want more product images, according to the Omni Channel Retail report from BigCommerce.
The problem, as we just discussed, is that images can cripple page loads. They create more requests for servers. But they put your average page weight on a bulking plan that would make those meathead bodybuilders at your gym rage with envy.
Page size should be less than 500KB according to Google. And yet a single, unoptimized, high-res image already clocks in at around 1 or 2 MB.
You could start by simply cropping the sizes of your images so each is the exact width and height for the space it’s being used. Except, of course, nobody ever does that. Manually. Every single time they upload an image.
So instead, let’s start by compressing the image file itself with something like WP Smush.it. A non-WordPress tool like Compressor.io can also reduce an image by up to 73 percent.
Let’s run a quick scenario:
Average e-commerce website conversions hover around 1-3 percent.
That number can rise as high as 5 percent. (One example, Natomounts, sees 5 percent conversion rates with ~85 percent from mobile!)
We just discovered that shoppers want more product images.
And yet, according to Radware, 45 percent of the top 100 e-commerce sites don’t compress images!
3. Decrease Average Time to First Byte
Google’s Best Practice: Under 1.3 seconds
Time to first byte (TTFB) is a measurement that shows how long a browser has to wait before receiving its first byte of data from the server.
It’s essentially a three-step process:
A visitor sends an HTTP request to your server.
Your server has to figure out how to respond. This includes gathering the data required and organizing it to be sent back.
Assuming all goes well, the request is sent back to the visitor.
TTFB is the time it takes for that complete cycle to finish.
We’ve already covered a few potential roadblocks during this journey. Too many requests, too many redirects, too many junky WordPress plugins, etc. all take its toll. A website visitor’s own network connection and speed also make an impact.
The aforementioned CDNs also help by reducing your server’s workload. They take over the burden of delivering large files so your own server can focus on delivering the rest of your site’s files and content. The best CDNs even go the extra mile. For example, reducing the physical location between the person requesting a file and the server sending it can have a huge impact.
Caching reduces TTFB by helping web browsers store your website data. Best of all, it only takes a simple plugin (like W3 Total Cache) or using a premium web host that will set up caching for you at the server-level (so no additional tools or plugins are needed).
A web host is like your server’s foundation. You can optimize images all you’d like. Use the best CDN on the market. But if you’re using slow shared hosting that splits resources, your site is going to be slow no matter how many tricks or tips or hacks you use.
Last but not least, a little sleight of hand.
Technically, removing JavaScript files from the head section and relocating them lower on an HTML document won’t reduce the overall number of requests or reduce file sizes. But it will help the important stuff — like the words on each page — to load a little quicker.
JavaScript is selfish. It wants to load all of its code before allowing anything else on the page to have a turn. Pushing it further down forces it to wait its turn until after a few images and basic content can pop up first.
Lazy loading is another common technique that won’t load (or display) an image until it’s within view. That way, page content can be loaded first. That’s helpful on long pages with tons of images (like this blog post). WPMU has a list of six lazy-loading WordPress plugins to try out.
Conclusion
Google has helpfully provided a few mobile page speed benchmarks to shoot for based on their in-depth analysis of what customers want. Unfortunately, the vast majority of websites are nowhere close to them.
Slow mobile page speed has been shown to cause users to bounce, which affects where you show up in search results, and ultimately what your website is able to generate in revenue.
Start by reducing the number of requests that happen each time someone visits your site. Then reduce file sizes along with average time to first byte.
It’ll take some heavy lifting. Definitely some dev help. But it’s your only shot at rescuing sub-par performance that’s sabotaging your bottom line.
Image Credits
Featured Image: Templune/Pixabay.com
In-Post Photo: Google.com
In-Post Photo: Facebook.com
Screenshots by Brad Smith. April 2017.
https://www.searchenginejournal.com/mobile-page-speed-benchmarks/194511/
On – 24 Apr, 2017 By Brad Smith
from ANDLOCAL SEO Services https://andlocal.org/how-to-beat-googles-mobile-page-speed-benchmarks-search-engine-journal/
0 notes
Text
How Long Should Your Meta Description Be? (2018 Edition)
Posted by Dr-Pete
Summary: The end of November saw a spike in the average length of SERP snippets. Across 10K keywords (90K results), we found a definite increase but many oddities, such as video snippets. Our data suggests that many snippets are exceeding 300 characters, and going into 2018 we recommend a new meta description limit of 300 characters.
Back in spring of 2015, we reported that Google search snippets seemed to be breaking the 155-character limit, but our data suggested that these cases were fairly rare. At the end of November, RankRanger's tools reported a sizable jump in the average search snippet length (to around 230 characters). Anecdotally, we're seeing many long snippets in the wild, such as this 386-character one on a search for "non compete agreement":
Search Engine Land was able to get confirmation from Google of a change to how they handle search snippets, although we don't have specifics or official numbers. Is it time to revisit our guidelines on meta description limits heading into 2018? We dug into our daily 10,000-keyword tracking data to find out...
The trouble with averages
In our 10K tracking data for December 15th, which consisted of 89,909 page-one organic results, the average display snippet (stripped of HTML, of course) was 215 characters long, slightly below RankRanger's numbers, but well above historical trends.
This number is certainly interesting, but it leaves out quite a bit. First of all, the median character length is 186, suggesting that some big numbers are potentially skewing the average. On the other hand, some snippets are very short because their meta descriptions are very short. Take this snippet for Vail.com:
Sure enough, this is Vail.com's meta description tag (I'm not gonna ask):
Do we really care that a lot of people just write ridiculously short meta descriptions? No, what we really want to know is at what point Google is cutting off long descriptions. So, let's just look at the snippets that were cut (determined by the " ..." at the end). In our data set, this leaves just about 3.6% (3,213), so we can already see that the vast majority of descriptions aren't getting cut off.
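As a toy illustration of that filtering step (the snippet strings below are invented stand-ins; the real analysis ran across the full ~90K-result data set):

```js
// Keep only the snippets Google truncated (they end in " ..."), then summarize their lengths.
const snippets = [
  'Short description.',
  'A much longer description that Google decided to truncate somewhere around here ...',
  'Another long, cut-off description that keeps going and going until it gets chopped ...',
];

const cutLengths = snippets
  .filter((s) => s.endsWith(' ...'))
  .map((s) => s.length)
  .sort((a, b) => a - b);

const mean = cutLengths.reduce((sum, n) => sum + n, 0) / cutLengths.length;
const mid = Math.floor(cutLengths.length / 2);
const median = cutLengths.length % 2 ? cutLengths[mid] : (cutLengths[mid - 1] + cutLengths[mid]) / 2;

console.log(`${cutLengths.length} of ${snippets.length} snippets were cut`);
console.log(`mean: ${mean.toFixed(0)} characters, median: ${median} characters`);
```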
Coincidentally, the average is still 215, but let's look at the frequency distribution of the lengths of just the cut snippets. The graph below shows cut-snippet lengths in bins of 25 (0-25, 25-50, etc.):
If we're trying to pin down a maximum length for meta descriptions, this is where things get a bit weird (and frustrating). There seems to be a chunk of snippets cut off at the 100–125 character range and another chunk at the 275–300 range. Digging in deeper, we discovered that two things were going on here...
Oddity #1: Video snippets
Spot-checking some of the descriptions cut off in the 100–125 character range, we realized that a number of them were video snippets, which seem to have shorter limits:
These snippets seem to generally max out at two lines, and they're further restricted by the space the video thumbnail occupies. In our data set, a full 88% of video snippets were cut off (ended in " ..."). Separating out video, only 2.1% of organic snippets were cut off.
Oddity #2: Pre-cut metas
A second oddity was that some meta description tags seem to be pre-truncated (possibly by CMS systems). So, the "..." in those cases is an unreliable indicator. Take this snippet, for example:
This clocks in at 150 characters, right around the old limit. Now, let's look at the meta description:
This Goodreads snippet is being pre-truncated. This was true for almost all of the Goodreads meta descriptions in our data set, and may be a CMS setting or a conscious choice by their SEO team. Either way, it's not very useful for our current analysis.
So, we attempted to gather all of the original meta description tags to check for pre-truncated data. We were unable to gather data from all sites, and some sites don't use meta description tags at all, but we were still able to remove some of the noise.
Let's try this again (...)
So, let's pull out all of the cut snippets with video thumbnails and the ones where we know the meta description ended in "...". This cuts us down to 1,722 snippets (pretty deep dive from the original 89,909). Here's what the frequency distribution of lengths looks like now:
Now, we're getting somewhere. There are still a few data points down in the 150–175 range, but once I hand-checked them, they appear to be sites that had meta description tags ending in "..." that we failed to crawl properly.
The bulk of these snippets are being cut off in the 275–325 character range. In this smaller, but more normal-looking distribution, we've got a mean of 299 characters and a median of 288 characters. While we've had to discard a fair amount of data along the way, I'm much more comfortable with these numbers.
What about the snippets over 350 characters? It's hard to see from this graph, but they maxed out at 375 characters. In some cases, Google is appending their own information:
While the entire snippet is 375 characters, the "Jump..." link is added by Google. The rest of the snippet is 315 characters long. Google also adds result counts and dates to the front of some snippets. These characters don't seem to count against the limit, but it's a bit hard to tell, because we don't have a lot of data points.
Do metas even matter?
Before we reveal the new limit, here's an uncomfortable question — when it seems like Google is rewriting so many snippets, is it worth having meta description tags at all? Across the data set, we were able to successfully capture 70,059 original meta description tags (in many of the remaining cases, the sites simply didn't define one). Of those, just over one-third (35.9%) were used as-is for display snippets.
Keep in mind, though, that Google truncates some of these and appends extra data to some. In 15.4% of cases, Google used the original meta description tag, but added some text. This number may seem high, but most of these cases were simply Google adding a period to the end of the snippet. Apparently, Google is a stickler for complete sentences. So, now we're up to 51.3% of cases where either the display snippet perfectly matched the meta description tag or fully contained it.
What about cases where the display snippet used a truncated version of the meta description tag? Just 3.2% of snippets matched this scenario. Putting it all together, we're up to almost 55% of cases where Google is using all or part of the original meta description tag. This number is probably low, as we're not counting cases where Google used part of the original meta description but modified it in some way.
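That bookkeeping is easy enough to sketch. A toy classifier (the function and the example pair are hypothetical; the real comparison ran over the ~70K captured tags):

```js
// Toy version of the comparison: how does the display snippet relate to the original meta description?
function classifySnippet(metaTag, displaySnippet) {
  if (!metaTag) return 'no meta description defined';
  if (displaySnippet === metaTag) return 'used as-is';
  if (displaySnippet.includes(metaTag)) return 'used with extra text appended';
  const trimmed = displaySnippet.replace(/ \.\.\.$/, '');
  if (metaTag.startsWith(trimmed)) return 'truncated version of the meta tag';
  return 'rewritten (at least partially) by Google';
}

// Hypothetical pair, just to show the output.
console.log(classifySnippet(
  'A plain-English guide to non-compete agreements.',
  'A plain-English guide to non-compete agreements.'
)); // -> "used as-is"
```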
It's interesting to note that, in some cases, Google rewrote a meta description because the original description was too short or not descriptive enough. Take this result, for example:
Now, let's check out the original meta description tag...
In this case, the original meta description was actually too short for Google's tastes. Also note that, even though Google created the snippet themselves, they still cut it off with a "...". This strongly suggests that cutting off a snippet isn't a sign that Google thinks your description is low quality.
On the flip side, I should note that some very large sites don't use meta description tags at all, and they seem to fare perfectly well in search results. One notable example is Wikipedia, a site for which defining meta descriptions would be nearly impossible without automation, and any automation would probably fall short of Google's own capabilities.
I think you should be very careful using Wikipedia as an example of what to do (or what not to do) when it comes to technical SEO, but it seems clear from the data that, in the absence of a meta description tag, Google is perfectly capable of ranking sites and writing its own snippets.
At the end of the day, I think it comes down to control. For critical pages, writing a good meta description is like writing ad copy — there's real value in crafting that copy to drive interest and clicks. There's no guarantee Google will use that copy, and that fact can be frustrating, but the odds are still in your favor.
Is the 155 limit dead?
Unless something changes, and given the partial (although lacking in details) confirmation from Google, I think it's safe to experiment with longer meta description tags. Looking at the clean distribution, and just to give it a nice even number, I think 300 characters is a pretty safe bet. Some snippets that length may get cut off, but the potential gain of getting in more information offsets that relatively small risk.
That's not to say you should pad out your meta descriptions just to cash in on more characters. Snippets should be useful and encourage clicks. In part, that means not giving so much away that there's nothing left to drive the click. If you're artificially limiting your meta descriptions, though, or if you think more text would be beneficial to search visitors and create interest, then I would definitely experiment with expanding.
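For a quick audit of your own pages, the browser console is enough. A small sketch (the 300-character ceiling is the rough limit suggested above and the 70-character floor is just an arbitrary marker for "suspiciously short"; neither is an official Google number):

```js
// Paste into the console on any page to check its meta description length.
const tag = document.querySelector('meta[name="description"]');

if (!tag) {
  console.log('No meta description tag found; Google will write its own snippet.');
} else {
  const text = tag.getAttribute('content') || '';
  console.log(`${text.length} characters:`, text);
  if (text.length > 300) console.warn('Likely to be truncated in the SERP.');
  if (text.length < 70) console.warn('Fairly short; Google may rewrite or extend it.');
}
```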
from Blogger http://ift.tt/2CCEbyj via IFTTT
0 notes