David Harry

 

Source: SEOSignalsLab

Pick His Brain!

I’d like to introduce one of our members, David Harry, for our next ‘Pick His Brain’ session and I want to thank him for the participation.

David is a world class SEO forensic specialist. He has worked with numerous brands and large websites for the last two decades.

He is also known as the Clint Eastwood of SEO forensics in some circles.

Please note that he is not afraid to speak his mind and can be salty from time to time, as that’s his style.

Anyhow, please feel free to pick his brain on SEO forensics and SEO audit.

Here are the rules.

1) I’ll let the thread go on until he asks me to stop. Theoretically, this thread can continue until the Facebook stock value goes to zero.

2) Please, no snarky remarks. I will not tolerate any intentional negativity. We are here to learn from each other’s success and strategies.

3) Please do not PM him and bother him. If you have a private question, ask for his permission on this thread when appropriate.

#PickHisBrain

Table of Contents

What do you believe to be the most powerful on-page SEO signal that can move the needle for a website, e.g. H1 tags?

Hard to really say because when it comes to “moving the needle” that often depends on the current status of the site or page in question.

Certainly the TITLE element is going to be strong, but after that it often depends on how the rest of the page is structured and what state the existing content is in.

And also what the query targets are.

Internal links, for example, can be very important in moving the needle for “quick wins” when we start a project.

When doing an SEO audit, what are the common mistakes you see?

Tools… people far too often grab a tool and report on the data.

That’s backwards. We want to establish the needs of the client and the goals of the website before we start an audit.

Then identify what data we need, and then we seek out which tools will supply us with the data.

It is the mind of the auditor which holds the value, not the data.

What are some things you do to optimize site speed?

Some standard ones are minifying scripts and then looking for common images.

Those might be part of the template, or in ecom situations images related to product attributes are often shared across multiple products.

By optimizing those you can make a larger impact than working on a page-by-page basis.

Beyond that, most tools (I use a GTmetrix account) will have a fairly good list.
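To make the “common images” idea concrete, here’s a minimal sketch (my own helper, not a tool David mentions) that finds images referenced on more than one page — the template and product-attribute images where one optimization pays off site-wide:

```python
from collections import Counter
from html.parser import HTMLParser

class ImgCollector(HTMLParser):
    """Collect every <img src> on a page."""
    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)

def shared_images(pages):
    """Return images referenced on more than one page, most-shared first.

    `pages` maps a URL to its raw HTML. Images that recur across pages
    are worth optimizing first: one fix improves every page using them.
    """
    counts = Counter()
    for html in pages.values():
        p = ImgCollector()
        p.feed(html)
        counts.update(set(p.srcs))  # count each image once per page
    return [(src, n) for src, n in counts.most_common() if n > 1]
```

Feed it the HTML from a small crawl and the top of the list is where compression effort buys the most.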


Does post frequency aid site rankings? What are your best link building tactics?

That’s 2 dammit…. one per customer lol.

Anyway, while there are a bunch of scoring elements related to temporal factors, post frequency really isn’t a thing.

That being said, if you’re pumping out relevant quality content, it can certainly help with the EAT elements with Google.

It can also help build brand, visibility and most certainly increase the chances of gaining natural links.

So, directly? No. Indirectly? Certainly

Sadly, I don’t really “build links”…. prefer to “attract” them.

As such, much of the above comes into play. Add to that a strong social strategy, networking etc.

And that tends to be the land I play in.

What is the best way to combine external CSS & JS files due to a large number of HTTP requests?

Send it to my programmers? Seriously, I have “people” for that.

My company does SEO and web development.

I just give that kinda stuff to the right person…

To be honest it often depends on the platform… hell, WP even has plugins for that

For example with WP

https://en-ca.wordpress.org/plugins/w3-total-cache/

https://premium.wpmudev.org/project/wp-hummingbird/

https://wordpress.org/plugins/hummingbird-performance/ (free version)
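If you do want to see what those plugins are doing under the hood, a toy bundler is only a few lines. This sketch (a hypothetical helper, not code from any of the tools above) concatenates CSS files into one request and does a very light minify; a real pipeline would use cssnano or the plugins linked above:

```python
import re
from pathlib import Path

def bundle_css(paths, out_path):
    """Concatenate CSS files into one bundle and lightly minify it.

    Fewer files means fewer HTTP requests. The minify here only strips
    comments and collapses whitespace -- deliberately naive.
    """
    css = "\n".join(Path(p).read_text() for p in paths)
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # strip /* comments */
    css = re.sub(r"\s+", " ", css)                    # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)      # tighten punctuation
    Path(out_path).write_text(css.strip())
    return out_path
```

Same idea applies to JS, though script minification is much trickier and best left to a real build tool.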

Why do you think people attribute AI super powers to Google on all ends when there’s so little AI in use by them and it isn’t practical cost wise to execute AI on all factors considered for the SERPs? What AIs (apart from rankbrain) are in use right now by Google? And which ones do you think will get implemented in the next 2-5 years?

Indeed a lot of folks like to spew about Google’s AI/ML stuff.

And I think that’s often to look smart or otherwise find an excuse as to why they’re not doing as well in Google as they feel they should be.

Now cost wise, they’re kinda banking on it helping reduce computing, so it may indeed ultimately be cheaper, if not just faster IMO.

As for how much Google uses, it’s kinda surprising to be honest.

I like the Google Brain project (not to be confused with RankBrain) which is here; https://en.wikipedia.org/wiki/Google_Brain

You can see a LOT of the places they’re playing in via the AI site and blog;

https://ai.google/

https://ai.googleblog.com/

Do you think onpage optimization is mathematical (hitting the values Google expects for a node and type of page) or is there more to it than that?

Well, all things with Google are mathematical.

But a lot of the time I break it down into technical (ensuring the page is optimal in how it’s presented/renders etc) and then almost a UX approach to ensure the page is satisfying the users need.

I don’t stress over phrase ratios and over targeting terms we want that page to come up for.

I often rather tweak things over time… as Google changes, we change the page in question.

1. Can you share some good practices on how to “funnel” link juice from these pages to your money pages? Does good internal linking do the job? 2. How important is that the topic of the money page and the linkable asset be the same/relevant? I ask because ahrefs says that page-level links (their UR metric) has the highest correlation with rankings, not the links to the domain in general. If that’s the case, is my strategy still feasible?

1. Yea I tend to focus on the internal linking from strong pages, navigation etc… then I would also look at the click-depth of a target page to ensure it’s not too deep.

Oh and speaking of internal links and the passing of external equity, it never hurts to prune the unwanted pages from time to time.
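The click-depth check mentioned above is easy to sketch as a breadth-first walk from the home page (a hypothetical helper; `links` is assumed to come from your own crawl data, e.g. a Screaming Frog export):

```python
from collections import deque

def click_depth(links, home="/"):
    """Breadth-first click depth of every page reachable from home.

    `links` maps each URL to the URLs it links out to. Pages sitting
    too many clicks deep are candidates for better internal linking,
    since equity from strong pages flows poorly to them.
    """
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth
```

Anything deeper than three or four clicks from home is usually worth a second look.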

2. That’s going to depend on the site. Is the entire site on a related market/niche? Or does it have multiple angles? That would dictate things for me.

If the core site concepts are tightly related I ain’t gonna stress how tight the relevance is.

Conversely, if there’s a wide variance in concepts/topics then it’s going to matter a bit more.

And well, correlation isn’t causation so I am not a huge fan of third party made up scoring metrics (DA/PA yada yada).

Again, I consider things on a conceptual level (things not strings).

Sure, a link with tighter relevance might be a bit of a boost, but I don’t think that’s huge.

For me it’s more about how the external signals are being passed through the site and the IA of the site in general.

As for linkable assets, I posted some ideas on evergreen content recently here;

https://www.facebook.com/groups/SEOSignalsLab/permalink/1907430839574369/


Audits – I recently published a framework for the group in this thread here –

https://www.facebook.com/…/permalink/2053971828253602/

Should you disavow low quality backlinks that have linked to your website through no fault of your own?

ALWAYS. It really doesn’t matter where the crap comes from… kill it!!!
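For reference, the disavow file Google’s tool takes is plain text: `#` comment lines, `domain:` lines to kill an entire domain, and full URLs for individual pages. (The domains below are made up for illustration.)

```
# Spam directory that scraped our pages -- kill the whole domain
domain:spammy-directory.example

# One-off bad links, listed as full URLs
https://crap-blog.example/best-widgets-2018/
```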

What do you think about nofollowing about/TOS/contact and legal pages? Do you think it’s a good idea to nofollow links to less ranking-important internal pages to save ranking power and link flow for the more important ones?

Well, from what we know PageRank sculpting isn’t really a “thing” it just ends up as a hanging node.

So, I doubt there’s much to be gained.

On the flip-side, those actually do tend to fall into EAT (expertise authority trust) considerations.

Especially with YMYL type query spaces and have more value than most folks actually understand (especially post–Aug core algo update IMO)

Do you have a specific page & site structure / architecture template that you use religiously… or is the page & site structure / architecture different for each site and each page on each site? Do you have a preferred page structure?

Yea that’s really going to “depend” as it’s very much related to the site in question as far as the size of it, the needs, goals… conversion points (primary and secondary) etc.

I don’t really have a cookie-cutter approach on that

How do you integrate LSI in a Joomla site?

Well, sadly that’s the secret sauce…. if I told you, well… u know the rest.

What are your absolute “must haves” when it comes to tools when starting an organic SEO campaign? What does a typical SEO campaign look like process wise including those tools?

Must have? I guess I have to go with Analytics and Search Console then.

After that it depends on what I am doing (SEO strategy, audit, content strategy, competitor analysis etc).

Certainly some of the others for me include Screaming Frog/SiteBulb, Majestic/Ahrefs, SEMrush etc.

As for a campaign I’ll go with the classic “depends”. It still comes back to the needs of the client, the goals of the site etc.

Is it a new site? Older site? Small site? Large site? Ecom? Informational? Cookie cutter SEO ain’t my thing.

But in general, I tend to work; audit, competitor analysis, quick wins coming right out of the gate.

I would also take stock of the other areas such as social, CRO, PR teams etc.

After those are done, we start to develop a strategy to move forward with.

When you’re dealing with a large (few M pages) legacy news publishing site that uses scala based platform and nobody in dev/management/journalists really understands SEO, which sized hammer would you use?

Uhm… is C4 an option??? Jesus man.


Much like most larger sites/corps you need to get upper level buy-in or it’s going to be a lot of sleepless nights.

It’s bad enough that we get the whole “10% implementation, 100% blame” in a lot of those situations, but that platform is kinda limited even at that.

The SEO team really does need to be part of the editorial guidelines for publishing and have some friends in the dev team that can get things done.

Large sites can be a bitch and things can get out of control in a hurry.

Methinks about the largest I’ve dealt with to date was 25m pages… and plenty of others over 1m.

It can be a nightmare.

What are the common elements you’ve seen on pages that have been hit on the August 1st update?

Well, that’s a tough one for sure.

That being said, we certainly ‘seem’ to be seeing a few areas; EAT – Expertise Authority Trust – which comes from the Google raters guide.

In the ecom spaces (and other YMYL query spaces) we certainly noted some correlations as far as EAT related elements are concerned.

From more full-fledged contact/tos/privacy policies etc.. to higher levels of FAQs and other expertise elements.

Other studies have cited YMYL (your money or your life) sites seemingly taking a hit, but Google said it wasn’t “targeted” at those… whatever that means LOL.

So yea, we certainly did see some stuff related to EAT across the board as something we’ve been looking at with clients and with competitor analysis stuff.

Query Classifications – we also noticed some changes that were somewhat like 2013’s Hummingbird stuff as far as some changes in how some query spaces were being treated.

Some sites started to pop up in the top 10 that were nowhere to be seen previously.

In addition to that, we noticed some larger players getting a boost as well (though more likely attributed to the above ‘authority’ stuff, not query classifications).

To learn more about query classifications, I wrote this years ago, might help get a sense of what it is;

http://vervedevelopments.com/…/query-classification…

I hope that helps some, as always we never really know… it’s Google.

But if you’re having issues/losses, feel free to hit me up.

Could you please list some of your “quick wins” you do regarding on page optimization?

Ok when we talk about “quick wins” I am referring to that situation where you get a new client and want to try and make some form of an impact early on to justify your expense.

We really don’t want to be waiting 4-6 months to show some type of ROI on our services.

For my firm, we tend to start off by looking at a few things;

Ranking pages – whether it’s via Search Console or ranking reports (or both), we like to take a closer look at pages ranking between positions #5–12.

Meaning; Google does like them, just not enough. From there we grab the ones that seemingly will have value (traffic and targeting) and tighten those up.

You can look at internal link ratios, TITLE elements, on-page elements etc.

Also look at SERP CTR and tighten up the call to action via the meta-description.

These are pages that we can usually move the needle on early and show some ROI for the client while we work on the bigger fish and larger issues of the site.
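That position #5–12 filter can be run directly against a Search Console performance export. A minimal sketch (hypothetical helper; the row fields are assumed to match a standard GSC page/query export):

```python
def quick_win_pages(rows, lo=5.0, hi=12.0, min_impressions=100):
    """Filter Search Console data down to 'quick win' candidates.

    `rows` are dicts with page, query, impressions and average position.
    Pages averaging between positions 5 and 12 already have Google's
    interest; tightening titles, internal links and on-page elements
    tends to move them fastest. Sorted by impressions so the highest
    traffic potential comes first.
    """
    hits = [r for r in rows
            if lo <= r["position"] <= hi and r["impressions"] >= min_impressions]
    return sorted(hits, key=lambda r: r["impressions"], reverse=True)
```

The impressions threshold is just a guess at “will have value” — tune it to the site’s traffic levels.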

CRO – again, much like the end part above, from meta-descriptions increasing SERP CTR, we also look at the conversion rate optimization of the important pages and sometimes the traffic flow.

Heck, even the UX while you’re at it.

In short, early on we can maximize the value of the traffic we’re already getting, before we worry about attracting new/more traffic. Ya know?

I hope that helps some….

Can you share a little how you decide to internally link? E.g. Do you have any sort of limits on these internal links? I.e is 20 to a moneypage too much? How do you choose your anchor texts? Which pages do you choose to link from? Would love to learn a bit more about this.

For us it really comes from establishing our targeting pages by mapping terms to pages.

Once we’ve prioritized those we start to look at the internal link ratios and adapt from core elements such as navigation.

After that we look at which pages (beyond the home page) have external equity to them, assess them topically, then start to link internally more based upon our prioritization.

As for anchor texts, as I mentioned elsewhere in this thread, I don’t really stress that.

I do what’s natural. It might be exact match, partial match or even “read more” etc.

I’m not big on the value of internal anchor text and tend to look at it more from a UX perspective.

I would like to hear your thoughts on using curated content and interlinking that content on your site in order to build EAT.

Well, it’s certainly a good strategy, but one of the new things that we’re considering more is the recent additions to the Google Raters Guide in the form of content creators.

One of the additions used the terminology “Creator of the Main Content” (MC). And the low quality section added, “reputation of the creator of the content”.

Now this does seem to be aimed more at the world of YMYL (your money or your life) but it certainly is something I’d be considering with all forms of content.

That being said, I would imagine that curated content that has some semantic, topical or quality relevance to trusted documents might be able to still score well, regardless of the reputation of the content creator.

To my mind it just is something that when we can, we should.

When I have clients with content that is created by someone that is a known entity in a market, I am going to highlight that.

To the extent of using mark-up, links to known profiles (LinkedIn, Wiki etc.) and so on. In short, we’ll try to leverage it.
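As one concrete example of that kind of mark-up, schema.org’s `author` property on an `Article` can carry a `Person` whose `sameAs` links point at known profiles (the name and URLs below are invented for illustration):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article title",
  "author": {
    "@type": "Person",
    "name": "Jane Expert",
    "sameAs": [
      "https://www.linkedin.com/in/jane-expert",
      "https://en.wikipedia.org/wiki/Jane_Expert"
    ]
  }
}
```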

As for interlinking, we always try to work on a smart taxonomy that builds on the IA across the site and then not only link, but work the navigational elements to best build on that path.

Related from recent Google Raters Guide; http://www.thesempost.com/google-search-quality-rater…/…

For a local service business stuck on position 3 to 5 what would you do to bump it up to get in the top 3?

Well, I gotta tell ya I ain’t the “local SEO” guy.

I’ve managed local on an international/national scale for corps with dozens of outlets, but that’s a bit different.

Hell, I even manage some campaigns with local elements, but I’m no expert in that world.

Maybe Doc Sheldon or Ammon Johns might have more useful info for ya – we can’t be good at EVERYTHING, am I right?

Does the anchor text matter, or the power of the page linking to my page?

Both? lol.

Seriously though, we can consider the anchor text, the equity of the page and even the authority of the site in question.

And hey, there are even some Google patents that look at historic/temporal elements.


Such as links over time to the page in question that’s linking to you.

If a page hasn’t been getting new links in a long time, Google might treat that page as less relevant.

I personally don’t think that anchor text is what it used to be… but it does still hold value.

When building brand links for an EMD, how do you avoid penalties?

Same as anything really… avoid crap sites. I remember talking to Matt Cutts about a forensic client gig a few years back.

And how they’d done outreach with a free product, but hadn’t even asked for a link.

Bloggers being bloggers, they linked.

But his advice was to avoid crap sites and sites that have posts about everything from duct tape to jewelry.

The low quality and topical dilution stands out as low quality to Google.

So, EMD or not…it’s all about LOCATION LOCATION LOCATION.

What’s your strategy with internal linking? Do you go aggressive and use your exact match everywhere, or do you mix it up and use different partial matches and different anchor texts?

Not to sound like a Google “fan boy” but I actually tend to consider UX more than anything.

Given that internal anchors aren’t likely a huge scoring element, I want the usability to be key.

Remember, we’re here for conversions at the end of the day… (unless informational and ‘views’ are the money).

As such I am not only thinking what might be good for Google, but what might be good for the site as well.

I’d use a combination of partials, exact match, stemmings and even the odd “click here” or “read more”.

For me it’s more about the internal link ratios, the IA and how I am passing equity through the site… Ya know?

What is an internal link ratio?

Sorry, that’s just my thing for discussing how many internal links are pointing at various pages on a site.

A Googler once explained it to me as “the more you link to a page internally, the more important we feel it is to you. The less you link to a page, the less important” (paraphrasing).
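That “internal link ratio” idea can be sketched from crawl data: count inbound internal links per page and express each as a share of all internal links on the site (a hypothetical helper, my naming not David’s):

```python
from collections import Counter

def internal_link_ratios(links):
    """Inbound internal link counts and ratios per page.

    `links` maps each page to the pages it links out to. The more a
    page is linked to internally, the more important the site signals
    it is; the ratio is each page's share of all internal links.
    """
    inbound = Counter()
    for _, targets in links.items():
        inbound.update(targets)
    total = sum(inbound.values())
    return {page: (n, n / total) for page, n in inbound.most_common()}
```

Compare the top of this list against the pages you actually want to rank — mismatches are where to adjust navigation and in-content links.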

What builds Authority for the site and what makes trust for the site/page?

Ok, we never know exactly what Google thinks or does, but here’s a view of some things we look at with my firm;

Trust Elements

As per the EAT and YMYL elements, the concepts we want to work around can include;

Customer Service

Contact information

Delivery options

Payment options

Returns and cancellations

Warranties

Privacy information

Terms of Use and TOS

Authority Elements

About Us

Author tags (for creator elements)

FAQs

Buying Guides

Product Videos

Buying and maintenance advice

Testimonials

Manufacturers and suppliers

And of course for expertise we tend to focus on evergreen content such as;

Collections of Statistics

Original Research

Case Studies

Free and Paid Tools available Online

“How to” Guides

Mistakes to be avoided (common mistakes)

Resource Guide

Tip Roundups

History and Background

How-to (beginner to advanced)

FAQ sections

Best or Worst Practices

Posts that can be updated over time

Glossaries

Tutorials

Again, that’s my view on it… what Google does? Who knows…

What is your experience with IFTTT syndication networks?

Don’t have any sorry.

How do you leverage social to gain links? Are there any strategies you can share aside from the “search buzzsumo for people sharing similar content” which can generate traffic short term but doesn’t guarantee links?

To be honest I don’t manage our clients social media campaigns.

That being said, I do work with them and occasionally audit client’s social strategies.

What we focus on is building out strong social media presence and we don’t over stress the target audiences (unless we’re doing a paid ad campaign).

From there we leverage it with targeted content (ours and others), with engagement and PR teams as well, if they have one in place.

For me the real consideration is that social rarely gets us direct conversions and is more part of the sales funnel.

Attribution with social is tricky… a lot of times they discover you on social, then research you via Google.

Do you have any templates you use that you would be willing to share related to seo?

I shared an audit template recently, other than that, nothing on hand.

We’ll have some in the SEO Dojo when it re-launches:

1. Table of Contents

2. Domain information

2.1 Domain analysis

2.2 Geo Targeting

2.3 DNS and glue

2.4 CMS status

2.5 Past or present penalties

2.6 CDNs or other off-site assets

3. General Onsite Analysis

3.1 Titles and Meta Data

3.1.1 TITLE length

3.1.2 TITLE targeting/duplication

3.1.3 Meta-D CTAs

3.2 Prominence Elements

3.3 Image Alts

3.4 Duplicate Content

3.4.1 Thin content

3.5 Phrase Diversity

3.6 URL Structures

3.7 Site/Page Speed

3.8 Mobile Readiness

4. Technical On-site Analysis

4.1 Page Naming Conventions

4.2 Site Structure

4.2.1 Architecture

4.2.2 Faceted navigation

4.2.3 Breadcrumbs

4.2.4 Click depth

4.3 Canonical Issues

4.3.1 Page Canonicals

4.3.2 Rel Prev/Next

4.3.3 WWW vs non-WWW

4.4 Cookies/Session IDs

4.5 Robots.txt

4.6 URL Re-writing (Apache/Microsoft IIS etc.)

4.7 Header status codes

4.8 Error Codes (Search Console)

4.9 Site Maps

4.10 Structured Data / rich media

4.11 HTTPs implementation

5. On-site Link Profile

5.1 Internal links / Navigation

5.2 Keyword Mapping

5.3 Outbound links

5.4 Broken Links

6. Off-site Link Profile

6.1 Summary

6.2 Link Diversity

6.3 Link Text(s)

6.4 Velocity

6.5 Potential for link toxicity

7. Search Engine Visibility

7.1 Indexation

7.1.1 Over/Under indexation

7.1.2 Search Console

7.2 Existing Ranking(s)

7.2.1 Past ranking reports / search console

7.2.2 Query classification diversity

7.3 Analytics analysis

7.3.1 Channels

7.3.2 Organic

7.3.3 Conversions

7.3.4 Annotations

7.4 Local needs/presence

7.4.1 Citations and reviews

7.5 Social media presence (in engines)

7.6 Brand knowledge graph presence

8. Action Plan

8.1 Summary

8.2 Short Term Recommendations

8.2.1 Quick wins

8.3 Long Term Recommendations

SEO Audit Start Up Needs

Things we’d like to have (from the client) where possible;

• Google Webmaster Tools (access)

• Google Analytics or related (access)

• Keyword strategy (core and stemming/modified terms)

• Page mapping of terms and target pages (if any)

• Historic rankings (if any)

• Geo-local targeting (if applicable)

• Past SEO work (if recorded)

• Past developer change logs

• List of other domains/microsites (if applicable)

• Competitive analysis (known query competitors)

How come I find you so funny and (some) people find you salty/irritating? Especially since you’re Canadian… known for good behaviour and civilized manners. P.S.: Been to Canada, and loved it. Hope I can get back sometime.

Hehe well shit bro, I do tend to rub folks the wrong way cause I don’t stand for the bullshit and am passionate as a MoFo.

What is the correct keyword density for a 1k-word text about skateboarding tricks and how many of those keywords should be LSI?

Sigh…. just had to do it huh? I saw ya typing… just waiting to see..

You did not disappoint – as for the question – no comment hehe.

 

 
