
TECHNICAL SEO AS A BASIS FOR ONLINE SEARCH - SEMALT TIPS




Ranking in Google starts with optimizing the technology behind your website. 


By optimizing your website with technical SEO, you ensure that Google's bots can
easily find and index your content. This is the first step to appearing in
Google search results. 


In this guide, we'll explain what technical SEO is, why it's so important and
how you can optimize it.


WHAT IS TECHNICAL SEO?

Technical SEO is the part of search engine optimization that helps you improve
the technology behind your website. This process helps search engines like
Google to better find, crawl, understand and index your website pages. 


WHY IS TECHNICAL SEO IMPORTANT?

Technical SEO should actually be the starting point of your search engine
optimization process. The technical aspect of SEO ensures that search engines
can easily find and index your content. 


You can write as much good content as you want and fully optimize it with
on-page SEO, but if the technical side of your website is not in order, that
content will not be found, which means it cannot rank in Google.


By applying technical SEO correctly, your pages become easier to find and can
rank higher.


IMPROVE YOUR WEBSITE'S TECHNICAL SEO

Now that we know what technical SEO is and its importance, let's talk about the
important points that will allow you to improve the technical SEO of your
site.  


SITE STRUCTURE AND NAVIGATION




Structure and navigation on a website are the starting point of technical SEO.
If these aspects are not in order, crawlers cannot find your site or index it
properly. 


The site structure influences almost all other optimizations that you perform on
the website. So, it's important to get this right first, to make the rest a lot
easier. 


As a starting point, aim for a flat site structure. This means that all the
pages on your site are only a few clicks away from each other, so Google and
other search engines can easily crawl 100% of your pages.


Complex site structures often produce 'orphan pages': pages with no internal
links pointing to them. Crawlers can hardly find these pages.


INTERNAL LINKING




Crawlers follow links they come across on pages. Internal links thus help
crawlers to crawl multiple pages on your website.


An article without internal links is a dead end, and an article to which no
internal links point will not be found.


Internal links from high-authority pages are more valuable. These pages are
crawled more often, which increases the chance that a crawler will find an
article, post or page linked from them.
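
As an illustration, an internal link is simply an HTML anchor that points to
another URL on the same domain; the URL and anchor text below are made-up
examples:

    <!-- Internal link from an authoritative page to a deeper article -->
    <a href="/blog/technical-seo-checklist">Read our technical SEO checklist</a>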


LINK DEPTH

Link depth indicates how many layers a page sits below the homepage. With a
flat and clear site structure, no page is more than 4 to 5 links deep.


A shallower link structure ensures that pages are found more easily. It is
often the articles at the bottom of the site structure that receive few links,
and these pages are therefore difficult for crawlers to find.


INDEXING & CRAWLING




The main goal of technical SEO is to make a site easily crawlable and indexable.
To optimize these processes, you need to know which pages are causing problems.
You can do this using several tools: 
 * The Dedicated SEO Dashboard
 * Google Search Console

Below, we will explain in more detail how one of these tools can help make
your site easier to crawl and index.


DEDICATED SEO DASHBOARD

The Dedicated SEO Dashboard is a technical SEO tool from which you can easily
get a coverage report. This is an overview that shows which pages Google can't
reach or index.


To request this report, go to the tool's interface in the Report Center
category. 


 
The details of the report show whether there are any problems. By clicking on
a problem, you can see which URLs are causing it. With this information, you
can improve those pages.


You can also see if important pages are missing from the index or if certain
pages are indexed incorrectly. You can make optimizations based on this.


It is important to have control over what goes into the Google index. This way,
the Googlebot also knows better how to crawl your web content. 


When it comes to technical SEO, this tool can help you understand the
following: 
 * How many URLs were crawled
 * How deep your website structure is
 * How fast your pages load
 * Which pages are missing a meta tag or have a meta tag that is too short or
   too long


XML SITEMAPS

An XML sitemap is a kind of table of contents for your website. It makes it
easier for crawlers to index your website because a full list of URLs is
available.


An XML sitemap can also contain metadata for each URL (see the example after
this list), such as:
 * When the page was last modified. This allows a crawler to determine whether
   the page should be re-indexed or not.
 * How often the page changes. Important pages change more often than other
   pages, which helps Google determine which pages are more important.
 * How important a page is relative to the rest of the site. No-index pages do
   not belong in a sitemap, and if you have duplicate content, put only the
   canonical URL in the sitemap.
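
A minimal sitemap entry could look like this; the domain and dates are made-up
examples, and the schema reference is the standard sitemaps.org namespace:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- The canonical URL of the page (example domain) -->
        <loc>https://www.example.com/blog/technical-seo/</loc>
        <!-- When the page was last modified -->
        <lastmod>2023-08-15</lastmod>
        <!-- How often the page is expected to change -->
        <changefreq>monthly</changefreq>
        <!-- Relative importance within the site (0.0 to 1.0) -->
        <priority>0.8</priority>
      </url>
    </urlset>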

You can check in Google Search Console which sitemap Google currently has for
your website. To do this, go to 'Sitemaps' in the left-hand menu under the
heading 'Index'.


Scroll through the sitemap to see if any pages are missing or if there are pages
that do not need to be indexed.


DUPLICATE CONTENT




When it comes to duplicate content, this can be interpreted in two ways, namely:
 1. Content that you use multiple times on your website
 2. Content that does not only appear on your website, but also on other
    websites

Duplicate content, in either sense, can negatively impact your rankings. So,
it is important to know whether you have duplicate content and, if so, how to
deal with it.


The Dedicated SEO Dashboard is a technical SEO tool that checks whether the
content on your website is used on other websites.


Duplicate content matching another website can mean two things:
 1. Another website copied you
 2. You copied another website

In the first case, you don't have to take any action, because Google sees you
as the original source. In the second case, it is best to rewrite your content
to avoid penalties.


You can check who the original source is by searching for the duplicate
content between "double quotes" in Google. The website that appears first is
considered the original one.


THIN CONTENT

Thin content, also known as lean content, refers to pages with very few
words.


These pages can have a negative impact on your ranking. Google values
informative content that actually informs the reader about a topic.


A page with fewer than 300 words is generally considered thin. It is best to
give these pages a no-index tag so that they are not indexed and therefore do
not affect your rankings.
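
As a minimal sketch, a no-index instruction is a single meta tag placed in the
head of the thin page you want to keep out of the index:

    <!-- In the <head> of the thin page: allow crawling, but do not index -->
    <meta name="robots" content="noindex">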


BLOCK ROBOTS




Your website's robots.txt file can help tell Google which pages can and cannot
be crawled. 


So, you can block crawlers from certain pages. Blocking Googlebot prevents it
from crawling and rendering the URL, so the page's content cannot influence
your rankings. Keep in mind, though, that robots.txt controls crawling rather
than indexing: a blocked URL can still end up in the index if other pages link
to it, so use a no-index tag for pages that must stay out of the index
entirely.


An important part of technical SEO is to keep the robots.txt file up to date.
This way you can keep crawlers away from duplicate content. 
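
A minimal robots.txt sketch, assuming made-up paths, could look like this:

    # robots.txt is served from the root of the domain,
    # e.g. https://www.example.com/robots.txt (example domain)
    User-agent: *
    # Keep crawlers out of internal search results and print versions (example paths)
    Disallow: /search/
    Disallow: /print/
    # Point crawlers to the XML sitemap
    Sitemap: https://www.example.com/sitemap.xml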


CANONICAL TAGS

In some cases, you want to use duplicate content on your website. For example,
in the case of a webshop where products are available in multiple colours or
sizes.


You can indicate with a canonical tag which page Google should index. 


The variant pages that point to the main page are not indexed, and any link
value from those pages is assigned to the main page.
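
As an illustration, a colour variant of a product can point to the main product
page with a canonical tag; the URLs below are made-up examples:

    <!-- In the <head> of /shop/t-shirt-blue/, pointing to the main product page -->
    <link rel="canonical" href="https://www.example.com/shop/t-shirt/">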


HREFLANG TAGS




Hreflang is a part of technical SEO that applies to websites that are
available in multiple languages. Using hreflang, you tell the crawler which
pages are equivalent to each other.


An hreflang tag tells Google which language you're using on a specific page.
The search engine can then display that version to users who search in that
language.


Based on a user's language settings and location, Google knows which language
version of the page should be displayed.


For example, Google will distinguish between the Dutch and English versions of
a website that is available in both languages: it will show the English version
to a user in the United Kingdom and the Dutch version to a user in the
Netherlands.
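
For that Dutch/English example, the hreflang annotations in the head of each
version might look like this (the URLs are made-up examples; every version
lists all language versions, including itself):

    <link rel="alternate" hreflang="en-gb" href="https://www.example.com/en/">
    <link rel="alternate" hreflang="nl-nl" href="https://www.example.com/nl/">
    <!-- Fallback for users whose language or region matches neither version -->
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/en/">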


PAGE SPEED

Another important aspect of technical SEO is the speed of your website. Several
points are considered, such as:
 * How long it takes for the first pixel to load
 * How long it takes for the first interaction to take place
 * How long it takes for an action to be performed

The speed is determined separately for each page. By optimizing the speed of
your pages, you improve the user experience, which is an important factor in
how Google ranks your pages.
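
Two simple HTML-level optimizations that address these points, as a sketch with
made-up file names: lazy-load images that sit below the fold and defer
non-critical JavaScript so it does not block the first paint:

    <!-- Image below the fold: only loaded when the user scrolls near it -->
    <img src="/images/team-photo.jpg" alt="Our team" loading="lazy">
    <!-- Non-critical script: downloaded in parallel, executed after parsing -->
    <script src="/js/analytics.js" defer></script>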


CORE WEB VITALS




Google's Core Web Vitals are metrics that measure the user experience of a
website. By optimizing these metrics with the help of technical SEO, you get
your website technically in order.


The following metrics make up the Core Web Vitals:
 * LCP. This stands for Largest Contentful Paint and indicates how long it
   takes for the largest content element on the page to load.
 * FID. This stands for First Input Delay and measures the delay between a
   user's first interaction, such as a click, and the browser's response.
 * CLS. This stands for Cumulative Layout Shift and measures how much the page
   layout shifts unexpectedly while loading, which can make actions difficult
   to perform (see the sketch below).
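
A common cause of layout shift is an image without reserved space. A minimal
sketch of the fix, with a made-up file name, is to declare the width and height
so the browser reserves room before the file loads:

    <!-- Declared dimensions let the browser reserve space, so the text below
         the image does not jump once it loads -->
    <img src="/images/hero-banner.jpg" alt="Hero banner" width="1200" height="400">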


SSL

An SSL certificate makes a website more secure. You can recognize a secure
website by its address starting with HTTPS instead of HTTP.


Such a certificate secures the data exchange. Information exchanged between the
website and the visitor cannot be intercepted by a third party. 


The SSL certificate, therefore, offers security for you as the website owner as
well as for the user:
 * It keeps the content of the website safe
 * It keeps the data users enter on the website safe


MOBILE-FIRST INDEXING




More and more users are accessing the internet via their mobile phones. There
has been a significant shift in recent years. Where 10 years ago most sites were
still visited via desktops, now mobile use has the upper hand.


Google indexes pages according to its mobile-first principles. So, it first
looks at how fast, accessible and user-friendly a mobile page is. 


A website that is optimized for mobile use is therefore indispensable.
Fortunately, Google has developed its own tool to test mobile-friendliness.


After performing the test, you will see which technical SEO points you can
still improve for the mobile version of your website. Work through these points
to make sure the mobile version of your site is in good shape.
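
The technical baseline for a mobile-friendly page is the viewport meta tag;
without it, mobile browsers render the page at desktop width. A minimal sketch:

    <!-- In the <head>: scale the page to the width of the device's screen -->
    <meta name="viewport" content="width=device-width, initial-scale=1">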


HTML, CSS AND JAVASCRIPT

If you have built a website yourself, you have probably come across these
terms. They are the code with which your website is created.


Even when you use a page builder on a WordPress website, this code still plays
a big role behind the scenes.


It is important from a technical point of view that you have these codes in
order. Google is less good at rendering complex structures. 


There is a chance that Google will not be able to properly assess the usability
of your website if the codes are not in order. It is therefore best to keep your
website as simple as possible.


To better understand what you can pay attention to when it comes to technical
SEO, we list below what the different codes do:
 * HTML. The HTML code provides the site structure. Think of titles,
   subheadings, body text, and so on. It is best to ensure that your most
   important navigation functions are written in this language.
 * CSS. The CSS codes make everything look nice. Think of colours, fonts and the
   general appearance of your website.
 * JavaScript. You can add interactive elements to your page using JavaScript.
   Too much JavaScript can cause rendering problems. Google may not be able to
   get a complete picture of your pages because of this.
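
As a rough sketch of this division of labour, the skeleton below keeps the
navigation and headings in plain HTML, the styling in a separate CSS file and
the interactivity in a separate script; the file names and URLs are made-up
examples:

    <!DOCTYPE html>
    <html lang="en">
      <head>
        <title>Technical SEO guide</title>
        <!-- CSS: appearance only -->
        <link rel="stylesheet" href="/css/style.css">
      </head>
      <body>
        <!-- Main navigation in plain HTML, so crawlers can always follow it -->
        <nav>
          <a href="/blog/">Blog</a>
          <a href="/contact/">Contact</a>
        </nav>
        <main>
          <h1>Technical SEO guide</h1>
          <h2>Site structure</h2>
          <p>Body text goes here.</p>
        </main>
        <!-- JavaScript: interactive extras, loaded without blocking rendering -->
        <script src="/js/menu.js" defer></script>
      </body>
    </html>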


DEAD LINKS




Dead links are links that lead to 404 pages: the target page no longer
exists, yet a crawler still ends up there.


According to Google, it is absolutely no problem for technical SEO if you have
dead links to external pages. For internal links, however, this is a different
story.


A broken internal link makes it more difficult for Googlebot to crawl your
website. This can cause indexing problems, which in turn can leave pages unable
to rank.


So, while Google states that broken links do not directly lead to lower
rankings, they can certainly have consequences. Do a dead-link audit once in a
while to make sure all your internal links still work.


STRUCTURED DATA




Structured data helps crawlers correctly interpret the information on your
website. With the help of structured data, you can, for example, indicate
whether your page describes a recipe, how-to, review or product. 


By applying structured data to your website, your pages can qualify for rich
snippets in Google's search results. These enhanced results stand out among the
organic listings, so structured data can provide a significant boost to organic
traffic.
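
As a sketch, structured data is usually added as a JSON-LD script in the head
of the page, using the schema.org vocabulary. The example below marks a page up
as a product; the names and values are made up:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example T-shirt",
      "description": "A plain cotton T-shirt (example data).",
      "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "EUR"
      }
    }
    </script>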


If you want to learn more about SEO and website promotion, we invite you to
visit our Semalt blog.




