EPISODE #14

Conquering Technical SEO with Michael King

Welcome back Marketing Microscopers! We appreciate you checking out the 14th installment of the company podcast!

This one is focused on technical SEO.

We proudly welcome Michael King to the Microscope! King is an SEO expert who serves as Managing Director at iPullRank. He has been featured on Moz and has spoken at SEO events across the world!

Manish, Taral, myself, and King kick off the episode talking about technical SEO as a whole. King gets into the specifics about the role it plays in 2020, the impact of fast-evolving tech, and how to optimize websites built in the latest JS languages.

We then shift gears to site navigation, where Taral asks some good questions about top-level navigation, its importance from a user point of view, and the click depth of pages. King dives into the details of URL structure and how it impacts large eCommerce sites' SEO, and provides some expert tidbits on how to build it out!

We begin wrapping up the first segment of the show by discussing loading speed. King gives us his honest take on SEOs relying on Google's PageSpeed Insights and its suggestions. He provides some excellent words of wisdom on which factors to prioritize when analyzing loading speeds, as well as some good tools to use for more accuracy.

After a brief intermission, we get into the thick of e-commerce and how to implement an in-depth SEO audit – especially for sites with tons of products. King provides some expert advice on managing crawl budgets – and how sites that operate in multiple countries can avoid duplicate content issues.

As the show comes to a close, I ask King some more general questions related to SEO as a whole – like whether linking out to high authority sites actually makes a difference in rankings. We briefly discuss tiered linking and its impact on “relevancy”, as well as Google’s outlook for 2020.

King closes out the show by providing five pieces of wisdom for webmasters!

If technical SEO is your game, this episode is a must-listen!

To hear more episodes of The Marketing Microscope, check out our website or find us on Stitcher!

Read The Full Podcast Transcript

Kevin Svec: Good morning, marketers. Welcome to the 14th installment of the Marketing Microscope podcast, brought to you by E2M Solutions, a full-service digital agency specializing in content marketing, web design, e-commerce, SEO, and copywriting.
This is your host, Kevin Svec, Chief Content Strategist at E2M Solutions, and joining me today from E2M we've got Manish Dudharejia, founder of the company, and SEO specialist Taral Patel. On the show today, we are welcoming Michael King, technical SEO expert and lead consultant at iPullRank.
Michael King: Hey, hey, hey!
Kevin Svec: Michael, how are you doing today?
Michael King: I am fantastic and happy to be here.
Kevin Svec: Yeah, we're really happy you're joining us. We've got a really good show planned for you. Now, Michael, I know that you are one of the top guys in the SEO industry. You've been a contributor on Moz, you've spoken at SEO events all over the world, and you probably need no introduction to our listeners. But if you wouldn't mind, tell us just a little bit more about yourself and your role in the SEO industry.
Michael King: Cool, yeah. Like you said, I'm somebody who is pretty involved in a lot of leadership stuff. My background is in computer science, and I've also spent a few years (eight years) doing music for a living, so I come at this with a very interesting mix of perspectives. I've worked with some pretty large brands; iPullRank has been around for five years now, almost six at this point. And, you know, we primarily do SEO and content strategy for a lot of enterprise brands.
Kevin Svec: Very cool. We’re really psyched to have you on the show. So, just to direct this to Taral. I know you wanted to start off with the more technical aspects of SEO. So, Taral, if you want to take the ball here, you can dive right into the questions.
Manish Dudharejia: Yeah, I guess, I’ll take the technical SEO questions first.
Kevin Svec: Oh, go ahead, Manish.
Manish Dudharejia: So Mike, I've also been in this industry for over 10 years and have been following you and reading your stuff, so I do know that you have been deep into technical SEO. We definitely wanted to start with some questions related to that.
So, to start with: in 2020, how far do you think we have come in terms of technical SEO, and how is it becoming more important these days? And what is the importance of technical SEO relative to content? Content is still king, but if you had to put weight on one, which would you consider heavier, content or technical SEO? Or is it all about balancing both?
Michael King: I mean, to answer your question directly, it is about balancing both, but it is also about what you think is content vs. what you think is technical. Because I think that there are many technical elements to content that are coming more and more into the forefront with respect to our ability to leverage natural language processing and to form our content such that it matches up with the statistical expectations of a search engine. So, I think that there are technical components to content that are coming more and more to the forefront, because we have a lot of tools out there that will help you, you know, with your optimization, and help you with things like natural language generation. I completely believe that there is going to be a tool very soon where it's going to be like, here's my keyword, help me generate a piece of content based on my brand voice, based on my products, or whatever it is, and then it's going to be pressing a button. You're going to have a perfectly optimized piece of content that you then just have to adjust a bit more so that it works from a conversion perspective. I completely believe that's going to be a product that someone has in the next, you know, 3 to 5 years.
So I think that there are definitely technical components to content. But again, if we are just going back to the way people think about content vs. technical, I think they are equally important. You know, I think that the way sites are built is becoming even more complex. You know, everything is JavaScript-driven at this point, not just a site here and there that is built that way. And then you've got to make sure you have the right content strategies that align with what you want people to accomplish against your business objectives. So, I completely believe that the two have equal weight.
Manish Dudharejia: No, that makes sense and I totally agree with you. You mentioned that somebody might be building a product like that and you wouldn't be surprised if that kind of product is out in the next couple of years, but do you think it is totally doable? I mean, do you think Google won't be able to catch it as manipulation? Because these days we are talking a lot about creating content for humans, and that it has to be created by real people. So, if someone is going to build a product like that, and if we are going to have completely optimized content, then I am left wondering what the role of creative content writers will be.
Michael King: So, I believe, there will always be value in a creative content writer. I don’t think that an algorithm will ever be as creative as a person, at least not, you know, in the near term, right? So, I think there’s always going to be value in that. And I also don’t make a distinction between content written for people vs. content written for search engines. I don’t think content should be written for search engines.
I think you should always be looking to balance things between these two audiences. When you talk about personas, one of your personas will be Google, right? So you have to satisfy all your personas. And as far as, can they detect it, absolutely! There are already tools for detecting GPT-2-generated content. So, that's definitely a possibility, but I don't think it is necessarily a problem. Because if that content is perfectly optimized but has utility for users, then who cares where it came from? They don't care whether a person wrote it or a computer did, as long as it has value for the person, and I believe that, to that point, user signals are going to be even more important.
Google always talks about, at least over the last three to five years, that they are a machine learning company. They are using machine learning as part of their scoring system these days. With machine learning, you have to have something to validate whether or not the answer was correct. So, if you are building a model of some kind, you have to have a feedback loop to indicate if this is a right answer. So, what is a better right answer than those user signals? And that just means that as we get to the point where more and more people are able to generate content that is perfectly optimized at scale, they have to leverage those signals to inform these algorithms to understand whether or not it is a good result.
Manish Dudharejia: No, absolutely. I agree with you, and I guess user search intent and user signals will play a major role in deciding whether it will be successful or not. You also mentioned that these days lots of developers are using JavaScript technologies like Angular, Node, and React for quick loading times, and they have been getting consistently more popular in the developer community. That is fair, because if you build your site using Angular and Node, your loading time will be as low as one or two seconds. So, how do you think this technology impacts SEO? I mean, if I am building a website with PHP vs. Angular, do you think Google will look at these sites differently? And which one will be easier for Google to crawl and index? So, what are your thoughts specifically on doing SEO for sites built on JavaScript-based technologies?
Michael King: Yeah, good question. So, I don't think they necessarily care what technology it is. I mean, they care with respect to how they can see the content and so on, but if you are building something in PHP vs. something in Node, I don't think they necessarily care what technology you use in that case. Now, they do care about load times, of course, because you've got to think about the computational expense. Before you can even think about what the end user experiences when they click on your site, you've got to think about the computational expense of Google crawling the entirety of the web. So if something takes 2 seconds, they can crawl way more for cheaper than if it takes 20 seconds. So, starting right there, you have to make sure that it makes sense, you know, for them to do it. Then, beyond that, we have to think about what the user's experience is and so on. So, I would say, you've got to use whatever makes sense for your use case.
Like, if something is in PHP, everything is going to be server-side rendered, so everything is going to be perfect from Google's perspective. If it is something like Angular (Angular always makes me laugh, because Google built it but it has so many problems with Google crawling it), and it makes sense for you to use it because it's more of an actual application than a website, then some components of the site may not need to be seen by Google, and that's fine. But at the same time you have to make sure that you are optimizing for them to be able to crawl the site effectively and fast, so that they can crawl more pages and actually see it.
So, the main thing to note though is:
This is something that we've learnt recently from the things discussed in Google's recent talks: they only use the render tree when they are considering things for rankings. So, they don't have to load everything. They don't have to do the painting phase when they are considering things for rankings. So, that in itself chops off some of the potential time. They don't load ads if they're blocked by the ad server itself. So there are things on your page that you may not have considered before. I think the general thinking earlier was, 'well, okay, a website looks the same way to Google as it loads in the browser.' That's not necessarily true. They are very adherent to both robots.txt and the render tree. But, at the end of the day, you still want to make sure that you are following that progressive enhancement ethos. I know that they say dynamic rendering is the way to go now, but there are so many instances where they may not render those pages. So it is better to follow that progressive enhancement ethos, where all the actual content that you want on the page for the user is served from the server side, and then you just add those enhancements on the client side.
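For readers who want to see what that ethos looks like in practice, here is a minimal, hypothetical sketch (the pages, endpoint, and products are invented): the content that matters for search is already in the server-rendered HTML, and JavaScript only layers enhancements on top.

```html
<!-- Served from the server: the content Google needs is already in the HTML -->
<section id="products">
  <h2>Kitchen Tables</h2>
  <ul>
    <li><a href="/kitchen/tables/oak-dining-table">Oak Dining Table</a></li>
    <li><a href="/kitchen/tables/walnut-bar-table">Walnut Bar Table</a></li>
  </ul>
</section>

<!-- Client-side enhancement only: sorting is added after load, but nothing
     above depends on this script running -->
<script>
  fetch('/api/products?sort=price')          // hypothetical endpoint
    .then(function (res) { return res.json(); })
    .then(function (items) { /* re-render the list with sort controls */ });
</script>
```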
Manish Dudharejia: Yeah, that's what I was going to ask, because you know, most of the sites built with Angular and these types of JavaScript technologies have an issue where the content is fetched from the cache rather than from the server side. It pretty much doesn't load and Google cannot crawl it. But I think what you said makes total sense. I know you mentioned crawling time and the budget for that; what do you think about crawl budget? I mean, for a large website with millions of pages, how can a webmaster manage the crawl budget?
Michael King: Yeah, so (Joy Ford?) gave a great presentation on this concept recently. And one of the things that she showed (I don't remember the exact technical terms right now) was a load balancer handling more load. So, basically, the more capacity the site had, the higher the crawl budget would go. So, effectively it really just comes down to the architecture of your site and what your site can handle, in addition to the authority that Google associates with your site. So, obviously a site like cnn.com gets crawled thousands of times per second, whereas someone's little blog might get crawled just a hundred times a day. So, it's a function of what you can actually handle and then also what Google thinks is important and worth crawling.
The key thing to do, though, is whenever you are at a point where you need to do a migration and you see a drop in traffic for a long period of time, probably like 3 to 6 months, there's a section in Google Search Console where you can give feedback to the crawling team. In that form you can say that you made a big change and ask them to ramp up the crawl. They will do that for a few days, and basically expand your crawl budget for that time period.
But, for instance, if you've added more nodes to your server and then you want them to crawl more on a regular basis, I suspect that if you wrote in the same form, 'hey, we've enhanced our server architecture, we'd love you guys to crawl more,' they would probably do that as well.
Manish Dudharejia: That's very interesting. I think this answer will help a lot of people understand how to manage their crawl budget. Great answer. So, moving on, I know that you also talked about render blocking. How do you think Googlebot renders the information in JavaScript? I mean, can you explain it in a simple way? And also, what are the best ways to identify render-blocking elements on a page?
Michael King: When you say render blocking, are we specifically talking about the critical rendering path in page speed, or are you just talking about how Google renders pages?
Manish Dudharejia: Yeah, generally, how does Google render pages and the information stored in JavaScript?
Michael King: Right. Google renders pages very similarly to how your browser renders pages. Googlebot is a modified version of headless Chromium, and so, as far as I understand, like I said, they just render pages up to the render tree. They don't do the whole painting process. Basically, the HTML is parsed, and then they start downloading all the assets; they construct the Document Object Model, then the CSS Object Model. And then, as the JavaScript runs, it alters both of those things. So, to explain that in a simple form, you know, imagine that you turn the TV on, your TV pings your cable company, and then the cable company sends the information about where all the pixels need to be placed on your TV screen (all those dots). Then it gets the color information, which further enhances what you see on the TV. Then you also get the capability to do all the things with the remote, and as you push buttons on your remote it changes the information it gets from the cable company and finally shows that to you. All that stuff happens pretty instantaneously with your television, whereas with your browser it takes a bit more time because all these operations are going out to different servers and so on.
So, the key thing to know is that the painting process is not something that Google is doing. The same way that you might see content in your browser when you turn off CSS, where you see the hierarchy of different objects, that's effectively what Googlebot is looking at in the render tree.
So, what that indicates is that the whole process of layout and actually showing stuff doesn't have to happen for Googlebot.
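Googlebot's exact pipeline isn't public beyond what King describes here, but you can approximate the gap between raw and rendered HTML yourself. Below is a minimal sketch in Python using Playwright to drive headless Chromium (the URL is a placeholder, and this is only a rough stand-in for what a web rendering service sees versus what a plain HTTP fetch returns).

```python
# pip install requests playwright && playwright install chromium
import requests
from playwright.sync_api import sync_playwright

URL = "https://example.com/some-js-driven-page"  # placeholder

# What a non-rendering crawler gets: the raw HTML response
raw_html = requests.get(URL, timeout=30).text

# What a headless-Chromium renderer sees after the DOM/CSSOM are built
# and JavaScript has run (roughly analogous to a rendering service)
with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

print("raw length:", len(raw_html), "| rendered length:", len(rendered_html))
# A large gap usually means important content only exists after rendering.
```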
Manish Dudharejia: Yeah, that was a great example. It makes it so easy to understand how it works. So, let's move on to the next set of questions. We have some questions about site navigation. Taral will be taking those questions.
Taral Patel: Hello. As we all know, site navigation is a huge topic and we could go on asking questions about it, so I'll just limit them to a few before we move on to other topics like URL structure and loading speed. My first question is this: we have heard a lot about top-level navigation and its importance in understanding site structure and how Google sees your website. What is your opinion on how important it is from an SEO point of view? We all know that it is important from a user point of view, but what about SEO?
Michael King: Yeah, absolutely. I think that top-level navigation is how you basically lay out the taxonomy of your website. I think you'd have to talk to somebody from information architecture to get the real expert view on this type of stuff, but I think your file structure and your navigation structure give a clear indication of what is important across the site and what the site is about. One of the things that Bruce Clay popularized is the idea of silos and topical silos. All of this is reinforced by navigation.
Of course, your internal links tend to be more important, because after a certain point Google considers navigation a boilerplate element. So they'd be like, 'Okay, we understand that you use it as a basis for understanding this website, but it may not be the most important way to reinforce these ideas.'
The in-context links are better because there's text around them, so they can understand what the page is actually about. But nevertheless, the top-level navigation is incredibly important just to say, this section is about this, and this section goes with this other section. It reinforces these ideas around keyword relevance as well.
Taral Patel: So, if we have a lot of important elements, or main page elements we want users to navigate from, how should we prioritize the top-level navigation?
Michael King: With top-level navigation, some sites change it as you go. For instance, it may update to reflect the secondary sections of that area of the site. Let's say we are talking about an e-commerce site for a furniture store. You've got the kitchen area, and when you click on something in the kitchen, the top-level navigation now becomes all the subsets of kitchen. So, some sites do it that way, and I think that's actually pretty smart, because then the top-level navigation starts to act like internal links, since they are just relevant to that area.
But that's not to say that you can't get back to the other top-level sections. So, for instance, you may have bedroom, dining room, and so on. Those top-level areas may still be there so you can get between topics and such. But the other part of the top-level navigation is all about the kitchen, since that's where you are.
So, I think there are a variety of approaches that you can take. As you all know, fewer links on a given page tend to be more powerful, so some sites take that approach. I think that works quite well for them.
Manish Dudharejia: That was a wonderful way of explaining top-level navigation.
Taral Patel: Moving on, my next question is about one of the issues faced by large eCommerce websites that have a lot of categories and subcategories in their product pages. The issue is the click depth of pages. How much impact does click depth have on the final page?
Michael King: Yeah, that is definitely something that we thought about a lot earlier in SEO (about 5 or so years ago). And the general line of thinking was that the closer to the homepage, the better. You definitely don't want things that are more than 5 levels deep. But I think the reality of the web is that (especially in e-commerce) there's just so much stuff that Google has perhaps had to recalibrate that a bit.
If you've got bedroom, bathroom, and kitchen all mixed together under the same roof, it will basically be difficult for the search engine to understand what the site is about, because everything is about everything. So, having a structural hierarchy, going back to the concept of siloing, really lends itself to the idea of topic modeling. You have a section of your directory structure that's about bathroom, and then there's something within it about bidets or toilets or whatever. It becomes very obvious to a search engine that this section is about these things, so it is easier for them to reinforce those ideas and inform rankings.
Taral Patel: Okay, thank you for that answer. So, let's move ahead with URL structure. I just have one question, again about e-commerce websites. Lately I've been seeing that a lot of e-commerce websites have been using a separate URL structure for their product pages. The normal structure is: Domain Name/Category/Product. What they have been doing is Domain Name/Product. Do you think it makes any sense to have a completely separate URL structure for a product page? Or should we just go about it in the normal way?
Michael King: I think it depends on how many products you have. So, it goes back to the same idea that we were just talking about. If your site is straight-up direct-to-consumer and you sell just three products, then it's fine; Domain/Product is fine. But if your e-commerce site has millions of products, it doesn't really make any sense, because again it is really difficult for you to establish those topics and rank as well as possible for a site that big.
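To make the contrast concrete, here is what the two structures might look like for a hypothetical furniture store. The siloed version makes the topical hierarchy obvious to a crawler; the flat version gives it little to work with once the catalog grows.

```
Siloed:  example.com/bathroom/
         example.com/bathroom/toilets/
         example.com/bathroom/toilets/compact-dual-flush-toilet

Flat:    example.com/compact-dual-flush-toilet
```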
Manish Dudharejia: Yeah, I think I agree with that. That makes sense. It really depends on the number of products, the length of your category names, and things like that. Alright, I think the next set of questions we have is related to loading speed.
Taral Patel: So, while looking into page loading issues, many SEO experts rely heavily on Google's PageSpeed Insights. When I started my career in SEO, PageSpeed Insights was the only tool that I knew. So, how reliable do you think PageSpeed Insights is, and to what extent should its suggestions be followed?
Michael King: I think the thing about PageSpeed Insights is that you are not going to get very far with it in dealing with developers. In fact, developers hate when SEOs come in with screenshots of suggestions from PageSpeed Insights. So, perhaps you can start there, but I would recommend that you use Lighthouse instead. Lighthouse is going to give more specific recommendations. It's also going to get you more familiar with the things in Chrome's developer tools. It's going to give you all the right information, so I would say look at the Network tab, look at the timeline. You will get way more specific insights from there. The timeline and the Network tab are going to give you stuff more similar to the tools that show you the waterfall, like GTmetrix. It will also show you the things that are taking the longest.
Developers appreciate specificity. They don't appreciate when you just throw a URL into a tool and it gives you a list of things that may or may not actually be issues, or may or may not be things they can fix. So, for instance, those tools are going to say things like, you've got render-blocking JavaScript, and so on and so forth.
But, in some cases, you may not be able to do anything about it. It may be a third-party script; it may be a Facebook tracking pixel or something. So, they hate when you come in and say all these things about fixing them. So, I would say definitely get away from PageSpeed Insights and use what a developer might actually use.
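Lighthouse ships inside Chrome DevTools, and it is also available as a standalone command-line tool via npm; a typical run looks something like this (the URL is a placeholder).

```
npm install -g lighthouse
lighthouse https://example.com --output html --output-path ./report.html --view
```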
Taral Patel: I think what you said was really important. I believe it will give our audience many more options for what they can use as alternatives to PageSpeed Insights. Okay, moving on to the last question I have related to loading speed: many people consider different factors for loading speed. Some say we should look at First Paint, some say First Contentful Paint, while others say we should look at First Meaningful Paint. So, my question is, what is the best factor to judge loading speed?
Michael King: I don't think anyone has been definitive on that. Like I said, Googlebot isn't going to paint, so it doesn't care as much about the paint. However, they may be using those metrics separately after they run things through the WRS. I am not really sure exactly at which stage they are determining page speed. But I would definitely take a look at Time to Interactive; I would also look at Time to First Byte. Those two are the ones that I tend to focus on, but I don't think we've had a definitive answer around what Google cares about most.
Kevin Svec: Thanks a lot, Michael. I think all that was very cool. We are about to go to intermission, as that's all we had for the first section. We should be back in about a minute.
Kevin Svec: Thanks for listening to the 14th installment of the Marketing Microscope podcast brought to you by E2M solutions. Hopefully, you are getting some good nuggets from Michael King on Technical SEO and how to apply it to your own strategy. Are you ready for the Adobe BC ‘End of Life’ on March 26th? It’ll come faster than you think.
If you haven't developed a plan to migrate your site, don't wait any longer. On the E2M Blog, Hiren Modi, leader of E2M's web division, has outlined many different migration options to choose from, including DUDA, Webflow, Treepl, and many more. Don't be left behind; get your site migrated to the perfect home as Adobe BC becomes a thing of the past. For more information, visit our website at www.e2msolutions.com/blog/.
Our guest today is Michael King, Technical SEO expert and lead consultant at iPullRank. And, just to dive right back into the questions here, we are going to talk about e-commerce, which we touched upon in the earlier section but will go in depth here, and Manish, if you want to take the questions here, feel free to go right ahead.
Manish Dudharejia: Sure. So, Michael, let's talk about e-commerce sites specifically, because we focus a lot on e-commerce SEO and we often face a lot of challenges. We've observed that these challenges are very common in the industry. So, my first question is: if you have a very large enterprise e-commerce website with hundreds of thousands of products, how do you go about creating an audit report for these kinds of large sites, and what are some of your favorite tools for performing these kinds of SEO audits?
Michael King: We worked on a pretty popular marketplace that had two billion pages of text; at any point it could also have 5 billion pages. We realized that it really depends on what the goals are. In some cases, it may be enough to just crawl a few million pages and get a sense of what to do. In other cases you may need to crawl hundreds of millions of pages, because the issues you spotted may be bigger than you thought they were once you get a bigger sample size. It just really depends on what the resources are and so on.
But after that, it is about how we come up with things that are going to scale. How are we going to identify things that can be rolled out without you having to write copy for every single page? And so, the reality of that from the standpoint of e-commerce sites is that it's all going to start from the category page and then build out from there. So, that's typically where we spend most of our time as far as things that we want to do that are more bespoke. Then we figure out how to scale things in terms of how the product pages are built, effectively.
So, when we talk about a marketplace, it’s a bit easier because you’ve got people uploading their content, the things that they want to sell, and you can just nudge them on the backend with forms. For example, if someone is uploading their products, you can put validation steps like what the title can be and how long it can be, how much copy needs to be there in the description. You can give examples of things that they can write and also keywords they can use as they are filling those things out.
So, again it’s all about scale and figuring out how you can get other people to optimize these things as you go, but if you are thinking of an e-commerce site where you don’t have a ton of resources then it’s all about what you can do programmatically as we go.
Think about how you can programmatically build internal links, or how you can programmatically filter the copy you may be getting from manufacturers so that it can be better and different, or even what type of copy you can add to it. So again, things like natural language generation really come into play. There are a lot of tools out there that currently exist; one that I know of is called Phrasetech. It will use your data model to generate copy about products and so on, so that you have more opportunities for that content to rank.
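This isn't how Phrasetech or any particular tool actually works, but as a toy illustration of generating product copy programmatically from a structured data model, a sketch might look like this (the fields and template are invented).

```python
# Toy sketch: fill a copy template from structured product attributes
products = [
    {"name": "Oak Dining Table", "material": "oak", "seats": 6, "room": "kitchen"},
    {"name": "Walnut Bar Table", "material": "walnut", "seats": 2, "room": "kitchen"},
]

TEMPLATE = (
    "The {name} is a solid {material} piece for the {room}, "
    "comfortably seating {seats}."
)

for product in products:
    description = TEMPLATE.format(**product)
    print(description)
    # In practice you would vary sentence structure, pull in specs or reviews,
    # and have a human review the output before publishing it.
```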
Manish Dudharejia: Cool, yeah. I think that clears up a lot of things. So, when you have a site that is pretty much built and the client is managing everything, like the products, then tools like DeepCrawl, Screaming Frog, and Semrush all have certain URL limitations when you want to run an audit. Also, their enterprise-level plans are too expensive for some clients. You mentioned that in such cases we should first go after categories and then prioritize like that, but is there any favorite tool you'd recommend to our audience, one you prefer working with?
Michael King: Yeah, absolutely. I definitely prefer Botify, and I definitely prefer to get access to log files for those types of sites, because again you need to know where to prioritize. The thing about Botify is that I really like their capability to let me cross-tabulate any of the data points they have in the data set. I really love what they've done with the tool.
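Log files are how you see where Googlebot actually spends its crawl budget. As a minimal sketch of that kind of analysis (assuming a standard combined-format access log; the file path is a placeholder), something like the following counts Googlebot hits per top-level site section.

```python
# Count Googlebot requests per top-level site section from an access log
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder path
line_re = re.compile(
    r'"(?:GET|POST) (?P<path>\S+) HTTP/[^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = line_re.search(line)
        if m and "Googlebot" in m.group("ua"):
            section = "/" + m.group("path").lstrip("/").split("/", 1)[0]
            hits[section] += 1

for section, count in hits.most_common(10):
    print(f"{section}: {count} Googlebot requests")

# Note: anything can claim to be Googlebot in its user agent, so verify
# suspicious IPs with a reverse DNS lookup before trusting these numbers.
```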
Manish Dudharejia: That's great to know. I think knowing about these kinds of tools helps a lot of people make their jobs a lot easier. So, my next question is: what is the best way to resolve faceted URL issues for large e-commerce sites? This is probably the most common question that comes up.
Michael King: Faceted navigation, yes. It is a combination of not creating too many paths around those URLs and, if you do, making sure that you've got good canonicalization in place. I think out of the box a lot of e-commerce sites are like, 'okay, we've got new facets, let's generate a net-new URL for any permutation or combination of filters,' and so you still want those to be pointing back to the top-level version of the page. So, if you are filtering down to, say, blue products or products that are extra large, don't create a brand new URL for that if there's no value in it.
But I will say this: we've seen a lot of instances where that type of duplication works very well for certain sites. So, it's definitely worth A/B testing. Don't just go crazy with canonicalization without doing an A/B test first. We have done that in the past, and we've seen that the page we canonicalized to did not capture all the traffic of the duplicate page that existed. You can't just take that as a best practice. You should always A/B test first. E-commerce sites are some of the best types of sites to run A/B tests on.
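For reference, the canonicalization King is describing is just a link element on the filtered URL pointing back to the top-level version; the URLs below are hypothetical.

```html
<!-- On https://example.com/kitchen/tables?color=blue&size=xl -->
<link rel="canonical" href="https://example.com/kitchen/tables" />
```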
Manish Dudharejia: True, I agree. So, you recommend we should not go for canonicalization in the first place, but leave those faceted URLs as is and see how it goes. If they perform really well, then you don't do anything about them, right?
Michael King: Yes. But, you’ve definitely got to keep it as a situational thing rather than just always doing that and assuming it’s going to work, because sometimes it doesn’t.
Manish Dudharejia: Yeah, absolutely, it makes sense. So my last question, specifically in the e-commerce section, is related to international SEO. Let's say we have two domains, .ca and .com, and the content of both sites is pretty much the same. What are the things that SEO people need to take care of to ensure there are no duplicate content issues, and to make sure that the .com site appears in Google USA and the .ca site appears in Canada?
Michael King: The first thing is that with the top-level domains, in Google Search Console you can set the country you want each one to show up for. You also want to have hreflang tags between the two, and also localization of the content. So it's not just about having the same language. There is definitely slang, or words that they have in Canada and not in the USA, so if the product warrants it, we have to make sure that we use them.
Also, in Canada you have French Canada, where they speak French. You may want to have some of that content be in French, because some of those people will only look at things if they're in French. So, it's not just about doing the hreflang thing; you also have to think about the audience and what resonates with them in that content.
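The hreflang markup for that kind of .com/.ca setup looks roughly like this, placed in the head of every version of the page (the URLs are placeholders); each page should reference itself as well as its alternates, and an x-default can catch everyone else.

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/product" />
<link rel="alternate" hreflang="en-ca" href="https://example.ca/product" />
<link rel="alternate" hreflang="fr-ca" href="https://example.ca/fr/product" />
<link rel="alternate" hreflang="x-default" href="https://example.com/product" />
```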
Manish Dudharejia: Yes, absolutely. Now, you mentioned that every country has its own slang or terminology when it comes to language, but if you have hundreds of thousands of products, it becomes very difficult to go through every single product and make such small changes. In that case, you are left with pretty much no option but to use the same content. How do you go about that then?
Michael King: Yeah, I think that is resource dependent. If you have the resources, there are firms that do localization of content; Welocalize is one of those. But if you don't have the resources, then I completely agree with what you are saying, you are pretty much stuck with only the hreflang approach.
Manish Dudharejia: Right. So, Kevin, I know that you have some general questions, so please go ahead!
Kevin Svec: Yeah, to shift gears and get away from all the technical stuff, just to close out the show here, I want to get into SEO as a whole. John Mueller recently talked about how linking out to high authority domains doesn't necessarily impact how the content will rank. I'm curious to get your thoughts on that. Do you think this is true? Do you agree with it?
Michael King: So, did he mean linking out to high domain authority sites from your site, or is it the other way around?
Kevin Svec: Yeah, yeah. Linking out. Where you write a piece of content linking out to higher authority domains.
Michael King: I think that in the past a lot of the idea in SEO was about placing yourself in a good neighborhood, and a lot has been presented on this subject. However, in my experience I haven't seen that (who you link to) make a significant difference. But I do know that having parity on subject matter between links is more powerful than getting links from random places.
So, that is to say, if you have a page about red Nike shoes, getting a link from a page that is also talking about red Nike shoes is more powerful than getting a link from a page that is talking about water balloons.
So, I believe that could be important but as far as linking out is concerned, I don’t know how much impact that really has as an actionable technique to improve your rankings.
Kevin Svec: My take on it is that Google has repeatedly said to create content that is of value to the user, and Google will reward you for it. My humble opinion on linking out is that it should help a little bit in rankings.
For example, let's say we are writing a blog post on the best types of content to utilize in 2020, and you have a section talking about how important infographics are. So you talk about the importance, you give some statistics, and you maybe include a link to a website where you can easily create infographics.
To me that sounds like it really helps the user. You talk about the importance and then you provide a credible resource where the user can actually take action on it. In my opinion, that should help a little bit on how content ranks. Maybe that’ll be the case in the future, maybe not. That’s just my take on it.
Moving on, I know there's a lot of terminology when it comes to link building, like Tier 1 links, Tier 2 links, and all that kind of stuff. I want to know how you think this factors into the big concept of relevancy that we keep hearing about after Google updates. We hear this term 'relevancy' so often. So, how do you think this type of terminology factors into that?
Michael King: I don't really think it does. I mean, when we had that terminology, like Tier 1 vs. Tier 2, that mostly meant it was a more expensive link. So, Tier 1 was expensive. Tier 2 was something that wasn't very hard to get, mostly low authority domains. But Tier 1 meant that we were going after, like, DA 90. I don't think it necessarily impacts relevancy, because that wasn't really the factor, at least for us.
But I do think that relevancy does impact the value of links. I think it is one of the more important factors that we kind of overlook, because when we think about links, we are always thinking about volume rather than actual value. Most people think that more links get you higher domain authority, higher PageRank, and so on, but so does the quality aspect of it, in terms of relevancy.
We have seen that if you build fewer links that are more relevant, it just works better. Everything ranks better than if you just get as many links as you can from random places.
Kevin Svec: Right, and that goes back to what you were talking about earlier, when you said that if you are talking about Nike shoes, it won't help to get a link from a page about water balloons. That's really not going to do much. And I think the paradigm is really starting to shift on the whole relevancy thing in terms of Google rankings.
Relevancy is really taking over and it’s not about how many links you can get but about how valuable they are to the reader.
So, I am interested to get your take. Do you think that the higher the domain authority, the better the rankings these days? Or is it more about how much value you can actually add? And is the importance of how authoritative the domain is kind of diminishing?
Michael King: So, domain authority is not something that Google uses. All these metrics that we use are our triangulations or estimations of things that Google might be using. So we have no idea how many links they are actually counting, no idea what the measures are; we understand at a base level what PageRank is, but PageRank has also changed so much. We have topical PageRank and so on and so forth.
So everything we have is our best guess. So, domain authority as a concept has never mattered. But it’s what we’ve used to try to understand what might matter.
Kevin Svec: So, that kind of segues into my next question about Domain Authority 2.0. I've been reading a lot about it, and the shift has been toward a higher emphasis on EAT, at least in the way Moz calculates the domain authority of certain websites. I am interested to get your thoughts on it, what kind of experience you've had with Domain Authority 2.0, and how EAT factors in.
Michael King: Actually, I don't have much experience with that. I do know that Russ and his team at Moz did some very smart things in putting it together, but I am not completely aware of everything that was done, so I haven't really looked at it through the lens of EAT. In fact, there are some people on my team who are experts on EAT. Faji Mohammed, who is on our team, wrote a blog post that was cited in documentation on Google's side. That really speaks to how we look at EAT, but I wouldn't say that I am the foremost expert in our organization on it.
Kevin Svec: Okay, sure. Maybe we'll have to get her on a future episode of the Marketing Microscope one of these days.
Michael King: Yeah, we should!
Manish Dudharejia: That’s a great idea.
Kevin Svec: Yeah. So before we close out the show, I would like to talk about where Google is heading in the next decade. We are seeing all this content come around at this time of the year, you know, all of the wrap-ups of the 2010s and where things are headed in 2020. I'm interested to get your take on it. What do you see as the major shifts in Google's goals?
Michael King: I think Google is going to continue to try to be predictive based on both the implicit and explicit information that they get. I think that visual search is something they are going to push a lot more, with Google Lens and all these products. Something like, I pull my phone out and you tell me what this building is. And I also think that Google is going to continue to build more and more context on us, but cut more and more people out of it. So, there was a report yesterday that said they are going to phase out cookies the same way that Apple and all these other browsers have been doing. That's not going to hurt Google, because Google has so many ways to better understand you based on how you've logged in and so on. So, I think that from the product perspective, they're going to try to keep us on Google, but there are going to be so many more ways for us to use Google through the various different applications.
Kevin Svec: Yeah, personally I'm just really interested to see how visual search is going to shape up in the next few years. That's definitely going to be an interesting thing to watch.
Michael King: Right and how do you even optimize for that? You can’t!
Kevin Svec: Yeah, it's all just going to be AI bots, you know. Visual scanning. Yeah, it's definitely going to be an interesting decade. Cool, so Michael, I just want to close out the show, and I want some final words of wisdom from you. If you had to share 5 things that webmasters should start looking at fixing immediately, what would they be?
Michael King: Yeah, the number one thing that I always look for is broken link targets. So, where do you have external links pointing to pages on your site that have since disappeared? That seems to be one of the biggest wins once you 301 them to targets that are comparable to what you had before.
Also, your internal linking structure (we talked a lot about e-commerce today) is incredibly powerful. We've worked with clients and tested this out: if you build a certain number of links within your site to a given page, that increases its rankings in organic search. I think the eBay team pioneered this. There's a blog post by Dennis G. where he talks about the concept of building more internal links to do what I just talked about, and they built a whole system to support it, where they push things from page 2 to page 1 just by building more internal links. So, look at your internal linking structure, and look at broken redirects and 404s and fix those.
Looking at XML sitemaps can also be incredibly powerful, especially for e-commerce sites. We made adjustments to those, and it changed dramatically how Google was crawling the site, and we saw a lot of improvements there.
Another quick win, although I hate to say it because it's an easy one, is metadata. Metadata is a big one to sort out, because you can be in the same position and drive more clicks by improving your page title and/or your meta description. So that's four right there. What's the fifth one?
I would say structured data. You know, Google continues to create more and more opportunities where they are the presentation layer of the web. Using things like those 'how-to' and 'FAQ' results, you can jump to the first page without even currently ranking well in the search results. So I recommend everybody look for the structured data opportunities that are relevant to their site and implement those right away.
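As an example of the FAQ-style structured data King mentions, a minimal JSON-LD block looks like this (the question and answer text are invented); Google's Rich Results Test will tell you whether a page qualifies for the enhanced presentation.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do you ship to Canada?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Yes, we ship to all Canadian provinces within 5-7 business days."
    }
  }]
}
</script>
```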
Kevin Svec: Perfect! All right, Michael, that was a perfect five-part wrap-up, and I really appreciate it. So that should do it for our time today, which is just around an hour. We really appreciate you talking with us; it means a lot. And again, awesome talking shop with you! We hope to stay in contact and talk again in the future.
Michael King: Absolutely! Thanks for having me, that was awesome! Loved the questions and love what you guys are doing. If anyone wants to check us out, we are at www.ipullrank.com and iPullRank on Twitter. Thanks again for giving us the opportunity to have this discussion.
Kevin Svec: Thanks again Michael, we appreciate it.
Thanks again for checking out the 14th episode of the Marketing Microscope podcast, brought to you by E2M Solutions. Hope you got some good tidbits on technical SEO and how to take your site to the next level. For more episodes of the Marketing Microscope podcast, visit our website at www.e2msolutions.com/podcast/. You can also catch our podcast on Stitcher. See you next time.