Pick His Brain!
I’d like to introduce one of our members, Patrick Stox, for our next ‘Pick His Brain’ session, and I want to thank him for participating.
Patrick is the technical SEO advisor for Ahrefs and former technical SEO at IBM.
He’s written for many SEO industry blogs, spoken at many SEO conferences, runs 2 SEO meetups and an SEO conference, and moderates /r/TechSEO on Reddit.
If you have any questions related to technical SEO, please feel free to pick his brain.
Here are the rules.
1) I’ll let the thread go on until he asks me to stop. Theoretically, this thread can continue until the Facebook stock value goes to zero.
2) Please, no snarky remarks. I will not tolerate any intentional negativity. We are here to learn from each other’s successes and strategies.
3) Please do not PM him and bother him. If you have a private question, ask for his permission on this thread when appropriate.
Please subscribe to our new Twitter account for the latest updates.
What’s the first thing you look for when assessing SEO strength of a site?
Traffic would probably be first, but also Traffic Value and a link metric like Ahrefs DR score.
Combined, they give a sense of whether people are creating valuable content, and they can easily be compared against competitor metrics to get an overall idea of how a site is doing in its niche.
What’s the most common mistake amateur technical SEOs make?
Great question Steven Kang! Wanting to fix everything.
There are plenty of issues you can spend a lot of time on that won’t have much, if any, impact.
For example, adding pages to a sitemap that are already indexed, or fixing a link that redirects. Prioritization is hard.
I recently shared my preferred method for prioritization on the Ahrefs blog (https://ahrefs.com/blog/enterprise-seo/): an impact/effort matrix that helps estimate the impact of a fix and the time it will take. You still have to be able to estimate those things properly, though, which can be difficult.
What’s your take on the future of maps? Do you think it will stay around or become 100% pay to play? Do you think the value of local SEO has diminished since maps took over (it seems all my calls come from the maps)?
That’s an interesting question. I have a feeling this will be a mixed model of ads and organic for a while.
Even with things like shopping they went from free (Froogle) to fully paid, and now back to a mixed model.
I’m sure if they felt like they could do it without backlash they’d go all paid.
As for the value of local diminishing, I don’t think so. I think your website and your business fall under “prominence” in their ranking factors for maps, so things like links, mentions, and whether people know you will all be impacted by your website and content.
What’s the best way to stand out in a highly competitive niche market? Is it possible to combine Google Ads to aid SEO, or do the two work independently of each other? What’s the one thing to absolutely get right when launching a new site?
To stand out you need hard work, creativity, and expertise. You really have to be better or be different in some way and there’s not just one way to do that.
Google Ads won’t directly impact your traffic (it’s not a ranking factor), but similar to social media (also not a ranking factor) it’s good for promotion and reach.
It’s possible that someone sees your content in an ad or on Facebook and then writes an article and links to you.
For the one thing to get right: with a brand new site, it’s making sure the site is indexable. If it’s a new site you’re moving to, like a migration, the number one issue is usually failing to do redirects properly.
I wouldn’t bother doing this.
In general with the way PageRank flows according to the reasonable surfer model (a patent from Google), it’s not likely that much value would be passing to links that are less likely to be clicked on.
So things in the footer that link to privacy policies for instance aren’t likely to get clicked on or have much value passed to them anyway.
Sometimes there is a big discrepancy in user/visitor data between Ahrefs and GSC or other visitor-tracking software. What are the possible reasons for such wide variations?
It depends on the sources you’re looking at. For Ahrefs to GSC, we’re never likely to match their data.
We basically estimate traffic by modeling clicks for each keyword and summing those up to estimate page traffic.
In GSC, if you add up the clicks on keywords, it’s not going to add up to the same number as the page clicks.
They actually have a blank row in the API with clicks for keywords they don’t show, which usually contains 20–80% of the overall data.
Unless we added our own blank row with some variable, we’re not likely to show the traffic number you’d see in GSC.
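To make that concrete, here’s a toy Python sketch (all numbers are hypothetical, purely for illustration) of why summing the keyword rows from the GSC API won’t match the page-level click total:

```python
# Hypothetical GSC-style query report for one page (illustrative numbers only).
# The API returns rows per keyword, but a large share of clicks belongs to
# anonymized queries that never appear as keyword rows.
visible_keyword_rows = [
    {"query": "keyword a", "clicks": 120},
    {"query": "keyword b", "clicks": 45},
    {"query": "keyword c", "clicks": 15},
]
page_total_clicks = 450  # what the page-level report shows

visible_clicks = sum(row["clicks"] for row in visible_keyword_rows)
hidden_clicks = page_total_clicks - visible_clicks  # the "blank row"

print(f"Visible keyword clicks: {visible_clicks}")    # 180
print(f"Hidden (anonymized) clicks: {hidden_clicks}") # 270
print(f"Hidden share: {hidden_clicks / page_total_clicks:.0%}")  # 60%
```

In this toy example, 60% of the page’s clicks would be invisible at the keyword level, which is why keyword-based traffic models can’t be expected to match GSC’s page totals.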
If it’s GSC to GA, then GSC is going to be more reliable for clicks to the page, since it’s simply recording the number of clicks when a result was shown in search results.
With GA, there may be issues with your tag firing, people blocking tracking, or even people leaving your site before your tag fires.
What are the best free tools for keyword research and competitor analysis for a startup SEO agency?
I’ll pitch our own first, https://ahrefs.com/keyword-generator
But there are a lot of free tools: Ubersuggest is popular, Google Keyword Planner (within Google Ads), and Google Trends.
I’m sure someone probably has a list of free tools if you search for them.
How can I fix “Leverage browser caching” warnings and minimize redirects on my site (not a WordPress site)?
For browser caching, you’ll need to set cache controls on your server or CDN for whatever resources aren’t being cached.
You can search for instructions for this for whatever system you use like Apache or nginx.
For the redirects, look at what is redirecting and update the references if you can.
You may not be able to update all of them as some may be controlled by 3rd parties.
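As a sketch of the “update the references” step — assuming hypothetical URLs and a redirect map you might export from a crawler — you can resolve each chain to its final destination and flag internal links that need updating:

```python
# Hypothetical redirect map (old URL -> new URL), e.g. exported from a crawl.
redirects = {
    "https://example.com/old-page": "https://example.com/new-page",
    "https://example.com/new-page": "https://example.com/final-page",
}

def resolve(url, redirects, max_hops=10):
    """Follow the redirect map to the final URL, guarding against loops."""
    seen = set()
    while url in redirects and url not in seen and len(seen) < max_hops:
        seen.add(url)
        url = redirects[url]
    return url

internal_links = [
    "https://example.com/old-page",    # 2-hop chain -> should be updated
    "https://example.com/final-page",  # already final -> leave alone
]

for link in internal_links:
    final = resolve(link, redirects)
    if final != link:
        print(f"Update link: {link} -> {final}")
```

The same idea works for the third-party links you can’t edit: point the redirect straight at the final URL so at least the chain collapses to a single hop.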
What’s it like working for Ahrefs?
It’s awesome! I love the tool, the people are great, and everyone just wants to build cool things.
I couldn’t be happier.
If you were a business owner hiring for SEO services, what are the top 5 things you would ask from an SEO service provider, and how often would you like them presented to you?
For a business owner, probably all that matters is $.
How many people contacted me, how they contacted me, how they found me, how much money did that drive me.
The company may not be able to answer some of these without the business owner’s input, so instead you sometimes get things like traffic value, traffic, or rankings which are meant to show gains.
But in the end, it’s all about how much value they’re driving to the business so whatever most closely aligns to that.
For frequency, real-time dashboards are a thing now, but if you’re looking at revenue then there’s a lag between the service/sale and reporting.
Whatever makes sense. Monthly is fairly common just so a lot of time isn’t taken up by reporting.
What are the recommended tools for doing a technical SEO audit?
I don’t think there’s ever just one tool. For general audits you have Ahrefs Site Audit, Screaming Frog, Sitebulb, DeepCrawl, Botify, OnCrawl, SEMrush, Ryte, and many others.
For page speed, PageSpeed Insights or WebPageTest are my preferred tools.
A lot of technical SEO is also more about a process to check certain things, which can be difficult for tools.
I guess it just depends on what you’re working on at the time.
What if you had to start with a domain that has a starter link profile? How would you rank for at least some of those keywords (not over 10k searches/mo, for example) on a budget?
On a budget that’s going to be hard, but you have to start somewhere. It’s probably going to be a hard road.
A lot of work goes into the Ahrefs blog, for instance, with 2–3 people working on each post, and at times even we don’t rank on the first try.
We go back and update posts all the time, add our own insights, or even try different formats to see what works.
How can you differentiate your content? If you can add insights no one else has on these topics then people will find this content and share it.
It might be a slow path, but over time it’s possible.
Just one example of differentiating, I started a Twitter series where I tell people things they might not know about topics.
My second one was about titles and meta descriptions, something that’s been covered a million times by a million different people. People still responded, because I found a unique angle by sharing lesser-known information.
Please tell us about your outreach methods. I have sent lots of outreach mail and also offered a publishing fee, but after sending 1,000 emails I only got 5 links.
Oh man, I’ve been there. It was always a numbers game where most people didn’t respond and most who did wanted to be paid.
Looking back to when I used to do more of this type of work, it was really hard and I wasn’t good at it.
I tried a bunch of tricks: using a female persona, complimenting people, even running ads targeted at their email addresses or at times toward reporters/writers.
Following up always helped with the numbers a bit.
I think part of what I struggled with was that the content I wanted people to link to wasn’t really all that good.
It was written by an SEO and not an expert, and probably came off as a shameless SEO article to those I was reaching out to.
Somewhere along the line that changed. Maybe because I’m no longer writing for plumbers and dentists and am writing about SEO now.
Now I know a lot of the people who share my content and link to me and in most cases they do it without me asking.
I fully recognize the privilege of this situation and I know it’s not comparable.
Now when I ask, people usually want to support me or Ahrefs because they know the content and/or the tool is good, and I can just message people on Slack, FB, Twitter, etc. and ask. If you can figure out how to become a person or company that people want to support, who they know makes killer content, then it’s like being on easy mode.
I think I got there from just continuing to write, present, and growing my network. I’m sure it won’t be an easy path, but just keep at it.
Ahrefs and SEMrush obviously have different data points and strengths. How would adding Ahrefs alongside SEMrush help, and how would you separate and implement the findings from the data?
I’m not sure I can be diplomatic in an answer for this.
I used them both at IBM and even before that when I was at a local SEO agency but I think no matter what my answer will come off as biased so I’m sorry in advance.
1. Click data – as far as I know, Ahrefs is the only tool that exposes this data. Each keyword has its own model where there’s enough data, and you can actually see the click distribution across the different ranking sites.
2. Historical keyword rankings – I actually loved this feature in SEMrush at IBM, because the company acquired a bunch of other companies (and domains) and killed off a bunch of content, so going back and seeing what changed helped me find what went wrong.
Ahrefs recently released this data in Organic keywords 2.0, and you can do the comparison in the tool rather than exporting to Excel and comparing there, which saves time.
3. Internal link data – again, the only tool with this. It was great for finding generic anchor-text links like “learn more,” “read more,” and “click here.”
4. Internal link opportunities – a report in Site Audit that suggests internal links to add.
5. Best by links, filtered to 404 – a list of pages with links that 404, or basically pages you might want to redirect.
6. Content Explorer – pretty unique, with a lot of use cases.
These are some of my top use cases anyway, but there’s a blog with more https://ahrefs.com/blog/unique-features-ahrefs/
Beyond proper schema, do you know how ranking signals differ for the knowledge graph and rich snippets? Also, have you seen examples of Google misunderstanding a query, and do you have any insight into changing it? Currently Google is showing a definition rich snippet for a custom product.
Rich snippets, I think, always come from data on your page.
Knowledge graphs have additional sources of data like Wikipedia, Wikidata, and GMB.
Is the product the same name as something else? There are many documented examples where Google got things wrong over the years.
One of the most popular was the logo for Stone Temple Consulting. I’m not sure I can make specific recommendations without seeing what’s wrong and where that data is coming from.
What’s the most common SEO mistake we newbies make, and how do we fix it?
Probably depends on the generation of SEO. I haven’t worked with new SEOs in a while so it’s hard to say.
One thing that was hard to break people of for a while was keyword density myths, where they believed they had to use a word a certain number of times in an article.
Which one should I do first Technical SEO or On- Page SEO?
They’re mostly the same thing.
When you do competitor website analysis, what do you look for?
Their top pages, to see what content is working for them, are usually a good place to start.
How many schema types can I use on my website? Is there a penalty for using more than one?
You can use many. No penalty unless you’re breaking the guidelines https://developers.google.com/search/docs/guides/sd-policies
I just started a fitness blog and a tech blog. The fitness blog does not have any affiliate links, but the tech blog has affiliate links to Amazon products. My question is: how do I use backlinks to drive traffic to both sites? Does having a site with affiliate links affect this process?
Having an affiliate site doesn’t change the process.
You probably want to look more at your content than links for driving traffic to the site.
What did you study to become a technical SEO specialist? Any study resources you recommend?
Economics and Business Administration.
Most of the rest just came from curiosity into how things work and how to build things.
Google has some pretty good documentation and really any topic you’re interested in just search it and read some of the articles.
You can learn a lot from others, but you also learn a lot by doing.
From your experience, how do Bing ranking factors differ from Google?
Probably mostly similar.
I can’t say I’ve ever looked hard into the difference as I never worked on a site where Bing sent much of the traffic.
In order to optimize for crawl budget, Should I noindex low-quality pages or permanently delete those pages?
On most sites you probably don’t have to worry about crawl budget.
Noindexing pages won’t help you much with crawl budget, since those pages will still be crawled periodically.
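If you want to confirm this on your own site, server logs are where you’d see it. Here’s a rough sketch, assuming a simplified log format and hypothetical noindexed URL paths (real logs need proper parsing and Googlebot IP verification):

```python
import re

# Hypothetical access-log lines; in practice you'd read your server's logs.
log_lines = [
    '66.249.66.1 - - [10/Oct/2020] "GET /landing HTTP/1.1" 200 "Googlebot"',
    '66.249.66.1 - - [10/Oct/2020] "GET /internal-search?q=a HTTP/1.1" 200 "Googlebot"',
    '66.249.66.1 - - [11/Oct/2020] "GET /internal-search?q=b HTTP/1.1" 200 "Googlebot"',
]

# URL prefixes for pages you noindexed (hypothetical).
noindexed_prefixes = ("/internal-search",)

# Count Googlebot requests that still land on noindexed URLs.
pattern = re.compile(r'"GET (\S+) HTTP')
wasted = sum(
    1
    for line in log_lines
    if "Googlebot" in line
    and (m := pattern.search(line))
    and m.group(1).startswith(noindexed_prefixes)
)
print(f"Googlebot hits on noindexed URLs: {wasted}")  # 2 of the 3 requests
```

If the count stays high long after you’ve added noindex, that’s the point above in action: noindex removes pages from the index, not from the crawl.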
After a complete site audit using SEO tools, what are the major technical aspects of SEO that need a quick fix and can have a greater impact on ranking?
I usually look if pages are indexed and how pages are indexed first.
There are usually things indexed that probably shouldn’t be (security reasons usually) and things that aren’t indexed that should be.
Can’t rank a page if it’s not indexed.
What is the quickest / best “technical SEO” fix for WordPress sites? How would you define “technical SEO”? Do you have a technical SEO audit / process checklist we could follow? Should links to noindexed pages also be nofollowed? If you were stranded on a desert island and had to SEO yourself to safety and couldn’t choose Ahrefs, what ONE tool would you choose to save your a$$?
WordPress is pretty solid out of the box. You probably don’t have much to do here.
Definition of technical SEO: “any sufficiently technical action undertaken with the intent to improve search results.”
I definitely did not steal this from Russ Jones.
I don’t have a process or checklist but you can find many online.
I have processes depending on what I’m looking for or what I suspect is the problem and some of those are documented in articles like this https://ahrefs.com/blog/remove-urls-from-google/
I wouldn’t bother nofollowing links to noindexed pages.
Desert island SEO tool that isn’t Ahrefs, SEMrush.
What are the top 3 most overlooked aspects of Technical SEO ?
1. Redirects. No one likes to do them or spend the time mapping them.
2. Internal links. This somehow fell to technical SEOs because of the scraping and checking process involved. It might not be considered technical SEO in the future, since we built a link-opportunities report in Site Audit.
3. Future-proofing. A lot of people want to change things now, change them again later, and again. Each time they’re doing the “right” thing, but chasing what’s right sometimes means taking on a lot of unnecessary risk; for example, changing URL structures means redirects that may not be maintained.
What’s the top 3 most influencing factors for Local SEO?
Probably Relevance, Distance, Prominence since Google lists them here https://support.google.com/business/answer/7091?hl=en
Do you think core web vitals are a factor yet?
Google said they wouldn’t be until next year.
What do you prefer, Google Universal Analytics or Google Analytics 4?
New one sounds pretty good to me. Auto-tagging events is great.
I’d be hesitant to trust some of the data filled in with machine learning.
Not to say it will be bad, but it might be.
Is it true that we should not create separate pages for similar keywords (keyword variations)? Rather, should we target them on a single page?
It depends on if you have enough content about that variation where you think making a page makes sense or not.
For “SEO” vs. “what is SEO,” you could probably make a separate page or keep them together.
A t-shirt in red, white, yellow, black, and purple, in sizes XL, L, M, and S, is probably all better as one page.
What does your day to day technical SEO advising look like?
Looking for bugs, thinking of new opportunities, making proof of concepts for tools, making feature roadmaps, website fixes, answering some questions in our FB group or from support, etc.
What are your thoughts on the recent sudden disappearance of ranking pages from the SERP? Can anything be done by site owners to fix this issue?
I think Google is in the process of fixing these.
They had some bugs on their end.
We have a website which is still incomplete (tables to show products were missing from the website), and while we were building it Google was able to rank the website and send a lot of traffic to it. But in May the traffic tanked, and I feel we are facing some kind of manual penalty. We are upgrading the website, and it will take us 3–4 months to update all the content – 300 pages to be updated. We have around 1,100 pages on the website. Should I noindex pages or update them slowly? We haven’t yet started link building for money pages either. Should I do that along with updating the pages?
I wouldn’t noindex them, no. Just keep updating them as you can.
If it was a manual penalty it would be shown in GSC, but likely from the timing it’s an algorithmic adjustment based on the quality of the site.
Just keep improving and you’ll likely recover.
Do you recommend using review schema for a single “editor’s review” of an affiliate product?
Nope. The guidelines say reviews must be sourced from users.
Giving yourself a review is a bad idea.
An ecommerce store has stock in both GB and DE with pages in need of mutual hreflang tags. However, some stock is only available in one or the other and these pages don’t need hreflangs. Crucially, this isn’t fixed – it can change as stock becomes available or sells out in either country, which happens regularly. Is there a good way to implement this dynamically with a trigger that checks that both pages exist first, then only adds hreflang tags to those that do? Perhaps in Tag Manager? (Assume the same URL structure except for country codes)
A good way, probably not.
You’re going to need something custom to check stock and output hreflang either in the sitemap or on the page.
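A minimal sketch of that custom logic in Python (hypothetical product slugs, stock data, and URL scheme): build the locale set from current stock on each render or sitemap build, and emit hreflang tags only when an alternate actually exists:

```python
# Hypothetical stock data keyed by hreflang locale; in practice this would
# come from your inventory system at render or sitemap-build time.
stock = {
    "en-GB": {"widget", "gadget"},
    "de-DE": {"widget"},
}

def country(locale):
    """Extract the lowercase country code used in the (assumed) URL scheme."""
    return locale.split("-")[1].lower()

def hreflang_tags(slug, base="https://example.com"):
    """Return hreflang link tags for a product, but only for locales where
    the page exists; skip hreflang entirely if there is no alternate."""
    locales = [loc for loc, items in stock.items() if slug in items]
    if len(locales) < 2:
        return []  # only one version exists, so no alternates to declare
    return [
        f'<link rel="alternate" hreflang="{loc}" '
        f'href="{base}/{country(loc)}/{slug}" />'
        for loc in locales
    ]

print(hreflang_tags("widget"))  # tags for both en-GB and de-DE
print(hreflang_tags("gadget"))  # [] -- only in stock in GB
```

Whether this runs server-side or in the sitemap generator, the key property is the same: the pair of tags appears and disappears together as stock changes, so you never point hreflang at a page that doesn’t exist. (Tag Manager is a poor fit here, since hreflang injected by JavaScript may not be picked up reliably.)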
Do location-service-name pages have ranking power?
If I understand, you mean pages for specific locations?
You can overdo this and basically be creating doorway pages, but in general yes a page about a location is going to be more relevant.
As a counterpoint, a single page covering multiple locations is probably going to be stronger.
Do 301 redirects pass all the link equity? For example, we have a lot of client sites that link to us in the footer, but the page we used to link to is no longer on our site, and we have since redirected that page to the homepage. Wondering if we should spend the effort to go fix all those links and point them directly to the homepage?
Depends on the page they were pointing to.
If it’s relevant, then yes all the value should pass.
If the page content is completely different like all the links were about blue widget and now you point to your homepage about Christmas, then probably not.
My market is very local to my geographical region (New Zealand) and I’m currently not using a CDN provider (only a quick local server). In terms of ranking, in your opinion would a CDN make any ranking differences?
Probably not. As long as your host has good uptime and can handle the traffic load you’re probably good.
I would like to know why an Australian-based site (it has a .com URL, not .com.au) would rank better for US audiences. Australia is mentioned frequently (GMB, H tags, alt text, map, author, etc.). It isn’t a major issue, but it would be good to improve the Aussie rank.
It’s possible that you have more competition for those queries in Australia.
Hard to say for sure on this without seeing it.