Written by Brian Winters on May 1, 2020
Brian Winters is an SEO expert at Social Brothers. Simply put, his job is to ensure that the websites we build or modify appear as high as possible in the search results. He does that in different ways: it is always an interplay between on-site SEO, off-site SEO and technical SEO.
Social Brothers interviews SEO specialist Brian to discuss technical SEO. It seems more complicated than on-page SEO, mainly because it is less concrete. Yet it is very important for your website's findability. No matter how good your website's content is, technical issues can prevent it from ranking high in Google. So it's high time for a conversation with our SEO expert.
What exactly is technical SEO and how is it different from on-site and off-site SEO?
“Technical SEO is the interplay between technology and content. By technology I mean everything we don't see: everything that goes on behind the website. That is the job of the developers. The content is the visible part. SEO that is purely focused on content is on-site SEO. Read more about on-site SEO here.
For technical SEO, the Marketing and Development departments of Social Brothers work very closely together. This ensures the high SEO score of our websites. The collaboration is logical, because technical SEO sits on the border between technology and content. Take, for example, a 404 error: the user sees the error, but the cause lies in the technology.”
What do you pay attention to when improving technical SEO?
“As an SEO expert, I pay most attention to the health of a website, such as the absence of broken links. Links that don't work are one of the most common technical SEO problems. But I actually look at everything that is broken and should work, such as images that don't load, CSS files that aren't served in their smallest (minified) form, and so on.
We want to keep errors to a minimum. When Social Brothers builds a website, we aim for a minimum SEMrush score of 85%. A former teacher of mine was blown away when I told him that, because anything above 60% is already good. For us, 85% is realistic, because we know what Google likes in the technical and content area and we have all the specialists in-house: developers, content writers, designers and marketers. In short, everything that contributes to a healthy website. That is why we want to deliver a certain quality.”
Which program helps you the most with technical SEO?
“For technical SEO I like to work with the program SEMrush. This program measures the health of a website and indicates which errors there are. These are divided into three categories, from most to least severe: errors, warnings and notices.
Errors include broken links, pages that cannot be crawled, duplicate meta descriptions, and mistakes in the robots.txt file or XML sitemap. The XML sitemap is important because it acts as a set of directions for your website: it tells search engines how the site is structured.
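For illustration, a minimal XML sitemap is just a list of your page URLs in a fixed format (the URLs and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog</loc>
  </url>
</urlset>
```

Search engines find this file via a reference in robots.txt or because you submit it in Google Search Console.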
Warnings are often about too little text or missing alt attributes on images. An alt attribute is a short description of an image that you don't see yourself, but Google does. Other warnings include duplicate content or page titles longer than 60 characters. Finally, notices are minor issues that have little impact, such as wrong redirects.
For SEO we also use Yoast, Google Search Console and Google Analytics, combined with Google Tag Manager.”
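Simple checks like these, for example the 60-character title limit and missing alt attributes mentioned above, can be automated. A rough sketch using only Python's standard library (the HTML and the limit are made up for illustration; this is not how SEMrush works internally):

```python
from html.parser import HTMLParser

class SEOAudit(HTMLParser):
    """Collects page titles over 60 characters and counts <img> tags without alt text."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.long_titles = []
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "img" and not dict(attrs).get("alt"):
            self.missing_alt += 1  # no alt attribute, or an empty one

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title and len(data) > 60:
            self.long_titles.append(data)

# Hypothetical page: one image is missing its alt text.
html = ('<html><head><title>Short title</title></head>'
        '<body><img src="a.png"><img src="b.png" alt="logo"></body></html>')
audit = SEOAudit()
audit.feed(html)
print(audit.missing_alt)  # → 1
```

A real audit tool would crawl every page of the site and run checks like this on each one.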
Read more about the most common SEO mistakes.
What makes SEO so interesting for you as an SEO expert?
“It's not static, so there's always work to be done, especially on the marketing side. Customers add pages or write blogs, and that always requires optimization. Competitors keep busy too, and you want to stay a few steps ahead of them with innovations. It takes time, but it's worth it. SEO is essentially free, but the results only show in the long term. That also means you only see later whether an adjustment turned out badly. That long term is partly because Google considers reliability important: it takes a while to build up authority.”
Technical SEO tips from our SEO expert
How do I make sure my website is accessible to crawlers?
“The most important thing is safety. Have a website with HTTPS. The S stands for 'secure' and indicates that your website has a security certificate. If you don't have one, Google is more likely to decide not to crawl the website. You also make a website trustworthy through reliable content, for example by not luring people in with keywords and then talking about a completely different topic. Provide links to safe websites and keep broken links to a minimum. All of this contributes to reliability.
The structure of the XML sitemap and internal linking are also important for crawler accessibility. How is the site built? How are your pages connected to each other? All pages should be reachable from the homepage in as few clicks as possible; beyond three clicks won't work. Google looks at this from the user's perspective: if the user has to click through too often to reach the right information, the user has dropped out too.”
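The "as few clicks as possible" rule can be checked with a breadth-first search over the internal link graph. A minimal sketch (the site structure below is made up):

```python
from collections import deque

def click_depths(links, home):
    """Breadth-first search over the internal link graph; returns the
    minimum number of clicks from the homepage to each reachable page."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site structure: each page maps to the pages it links to.
site = {
    "/": ["/services", "/blog"],
    "/services": ["/services/seo"],
    "/blog": ["/blog/technical-seo"],
}
depths = click_depths(site, "/")
print(max(depths.values()))  # → 2: every page is within three clicks
```

Pages that never show up in the result are unreachable from the homepage, which is exactly the kind of accessibility problem a crawler would run into as well.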
How do I make sure my website is indexable?
“Google Search Console is important here. With it you can request indexing from Google. Indexing also happens if you don't request it, just more slowly. In addition, you can choose not to have certain pages indexed, such as the thank-you page of a newsletter or a personal shopping cart. This content is uninteresting and irrelevant to a search engine user, so you also want to prevent it from appearing in the search results.”
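For illustration, keeping such a page out of the index is usually done with a robots meta tag in the page itself (a generic example, not specific to any site):

```html
<!-- In the <head> of e.g. a thank-you page you don't want indexed -->
<meta name="robots" content="noindex">
```

Search engines that respect the tag will still crawl the page but leave it out of their search results.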
How does duplicate content arise? And how do I prevent it?
“Usually it comes from copying and pasting, for example of meta descriptions. If you can't get around it, it's best to use a canonical link. With it you effectively tell search engines: these ten pages are the same, but only look at this one.
Duplicate content can also come from content that is automatically generated. You can prevent this by taking it into account in your code. That is why we at Social Brothers also work closely with the developers.”
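A canonical link itself is a single tag in the `<head>` of each duplicate page, pointing at the preferred version (the URL below is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/preferred-page/">
```

Search engines then treat the pages carrying this tag as variants of that one preferred URL instead of as competing duplicates.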
How do broken links arise on my website? And how do I prevent them?
“Imagine you have a link to a blog, but you change the URL. The old URL no longer exists and existing links to it no longer work. That is why you should always create a redirect when you change a page URL. The redirect sends visitors from the old link on to the new page.
In addition, it is useful to check links regularly. Doing this manually takes a lot of time; fortunately, SEMrush can help. The program indicates on which pages links don't work.”
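A tool-agnostic sketch of such a link check (the site data and status lookup below are simulated; a real checker would issue HTTP requests for each link):

```python
from urllib.parse import urljoin

def find_broken_links(pages, fetch_status):
    """Return (page, link target) pairs whose target does not answer with HTTP 200.

    `pages` maps a page URL to the list of links found on it;
    `fetch_status` returns the HTTP status code for a URL (in a real
    crawler this would be an HTTP HEAD or GET request).
    """
    broken = []
    for page, links in pages.items():
        for link in links:
            target = urljoin(page, link)
            if fetch_status(target) != 200:
                broken.append((page, target))
    return broken

# Simulated site: /old-blog was renamed without a redirect, so it now 404s.
statuses = {
    "https://example.com/": 200,
    "https://example.com/blog": 200,
}
pages = {
    "https://example.com/": ["/blog", "/old-blog"],
}
print(find_broken_links(pages, lambda url: statuses.get(url, 404)))
# → [('https://example.com/', 'https://example.com/old-blog')]
```

The fix for each reported pair is the one from the interview: either update the link or add a redirect from the old URL to the new one.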
What else would you like to share as an SEO expert?
“I often see that customers don't think about SEO when creating their content. For example, titles are very, very long, or the title doesn't match the text on the page itself. So my tip is: make sure your title briefly reflects the topic you are going to discuss in your text. That makes it much easier to include your keywords in it.
And people don't always ask themselves whether their topic is a good keyword. Ask yourself which keyword you want to be found for, and don't write a text just because it looks nice. If people don't search for it, that's a real shame: then you have a very nice text that will never be found.
For technical SEO you need a close collaboration between SEO expert and developers and for on-site SEO a close collaboration between copywriter and SEO expert. That collaboration between developers, marketers and content writers is crucial for SEO.”
Want to know more?
Do you want to know more about SEO and how to optimize it? Then check out our complete SEO handbook.
Check out our SEO services here. Do you need help optimizing your website? Then feel free to contact us. Social Brothers is happy to help you.