
Good practices to avoid classic SEO errors

by Neo Anderson
Posted: Jun 29, 2019

The success of a showcase or online sales site rests on an effective SEO strategy. The company's or e-merchant's offer must be permanently visible to, and understandable by, search engines in order to offer visitors an optimal experience on the platform.

But maintaining good SEO often involves errors, the most common of which can fortunately be avoided or quickly corrected through a few good practices.

1. Complete the title and meta description tags

The HTML title tag plays a vital role in informing crawlers about the content of a page, and it influences the page's SEO. If it is missing or incorrect, robots may skip the page because they cannot index it precisely.

  • Tools such as Botify or Screaming Frog can be used alongside Google's Search Console to detect pages with empty, missing, or duplicate tags.
  • A page template can generate all the tags automatically, which can then be edited manually one by one thereafter.
  • A monthly crawl is recommended to prevent title-tag errors from accumulating.

Similarly, the meta description tag, which summarises the content of a page, influences the click-through rate. This snippet must be unique, optimized, and of the right length to attract users.

Setting up a page template that includes all the tags, meta description included, together with a regular crawl, is the best protection against this kind of error.
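
To make the idea concrete, here is a minimal sketch of such a check in Python, assuming the requests and beautifulsoup4 packages and an illustrative list of URLs; dedicated crawlers such as Screaming Frog or Botify do the same thing at scale:

```python
# Minimal sketch: flag missing, empty, or duplicate <title> and meta description tags.
# Assumes the `requests` and `beautifulsoup4` packages; the URL list is illustrative.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/",
    "https://www.example.com/products",
]

titles = defaultdict(list)
descriptions = defaultdict(list)

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = (meta.get("content") or "").strip() if meta else ""

    if not title:
        print(f"[title missing/empty] {url}")
    if not description:
        print(f"[meta description missing/empty] {url}")

    titles[title].append(url)
    descriptions[description].append(url)

# Duplicate tags across pages hurt SEO just as much as missing ones.
for tag_name, groups in (("title", titles), ("meta description", descriptions)):
    for value, pages in groups.items():
        if value and len(pages) > 1:
            print(f"[duplicate {tag_name}] {value!r} used on {len(pages)} pages")
```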

2. Delete duplicate content

Duplicate content disrupts the crawler's indexing of the pages involved, and therefore their ranking. In addition, inbound links are spread across identical pages, which limits their weight.

The easiest way to make sure your content is unique is a manual Google search on an excerpt of the text, but there are also tools and sites dedicated to checking for possible duplicates automatically.

If duplications are identified, you must:

  • Delete or edit one of the pages,
  • Or de-index the one with the least SEO value.
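
For illustration, the sketch below fingerprints the normalised text of each page to spot exact duplicates, assuming Python with the requests and beautifulsoup4 packages and placeholder URLs; near-duplicates need fuzzier comparison, which the dedicated tools mentioned above handle better:

```python
# Minimal sketch: detect pages whose main text is identical after normalisation.
# Exact-duplicate detection only; near-duplicates require similarity measures.
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/red-shoes",
    "https://www.example.com/shoes-red",
]

fingerprints = defaultdict(list)

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Crude content extraction: strip scripts/styles, collapse whitespace, lowercase.
    for tag in soup(["script", "style"]):
        tag.decompose()
    text = " ".join(soup.get_text().lower().split())
    fingerprints[hashlib.sha256(text.encode()).hexdigest()].append(url)

for digest, pages in fingerprints.items():
    if len(pages) > 1:
        print(f"Possible duplicate content: {pages}")
```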

3. Minimize the loading time of the page

To rank well on Google, a site must load very quickly on all devices, especially on mobile. Various tools make it possible to measure this loading time. Several KPIs are analysed:

  • Server response time,
  • Time to display the full content of the page,
  • Time before the page becomes interactive,
  • etc.
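
As a rough illustration, the Python sketch below (assuming the requests package and placeholder URLs) measures two of these KPIs, server response time and total download time; browser-side metrics such as time to interactivity require real-browser tooling like Lighthouse or WebPageTest:

```python
# Minimal sketch: measure server response time (approximated by time-to-headers)
# and total download time for a list of pages. Assumes `requests`; the URLs are placeholders.
import time

import requests

urls = ["https://www.example.com/", "https://www.example.com/checkout"]

for url in urls:
    start = time.perf_counter()
    response = requests.get(url, timeout=30)
    total = time.perf_counter() - start

    # response.elapsed stops when the response headers arrive,
    # so it approximates server response time.
    server_ms = response.elapsed.total_seconds() * 1000
    print(f"{url}: server response {server_ms:.0f} ms, full download {total * 1000:.0f} ms")
```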

Depending on the results, the technical teams implement the appropriate solutions: compression of resources, cache management, optimization of the back-end code and database queries, etc.

A sudden increase in the load time can be the result of an external attack, a problem with the host or the launch of a new feature.

4. Verify that Googlebot can crawl the URL of the page

A drop in a page's position in the SERPs can be caused by the bot's inability to crawl it. To check that a URL is indexed by Google, simply search for it on the engine or crawl it as Googlebot would.

In the event of an anomaly, robots.txt monitoring tools or log analysis make it possible to identify its origin and take the necessary measures. Common causes include:

  • Crawling blocked in the robots.txt,
  • Excessive loading time,
  • Orphan page (no internal links pointing to it),
  • Broken JavaScript,
  • etc.
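
The first cause, a crawl blocked in robots.txt, can be checked with nothing more than the Python standard library; the sketch below uses placeholder URLs:

```python
# Minimal sketch: check whether robots.txt allows Googlebot to crawl a set of URLs.
# Standard library only; the domain and URLs are placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://www.example.com/robots.txt")
robots.read()

urls = [
    "https://www.example.com/category/shoes",
    "https://www.example.com/cart",
]

for url in urls:
    if robots.can_fetch("Googlebot", url):
        print(f"OK      Googlebot may crawl {url}")
    else:
        print(f"BLOCKED {url} is disallowed for Googlebot in robots.txt")
```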

5. Avoid or fix 4XX errors and errors in the XML sitemap

When a request is made for a page that is not part of the site structure or that no longer exists, a client error (4XX) is returned, the 404 being the most common. To avoid being downgraded by Google, regular internal crawls and analysis of the Apache/Nginx server logs are needed to understand the cause of the problem.

Three options are available to the SEO team:

  • Delete dead links,
  • Correct the faulty pages,
  • Update links to redirect visitors to a new page.

It is also essential to check the status of the sitemaps and to run non-regression tests with the IT department, even if errors on this front have a smaller impact. The details of the various anomalies encountered by Google appear in the Search Console.

Another safeguard is to check that an e-commerce site's XML sitemap files are valid before submitting them to Google.
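
As a simple safeguard combining both points, the sketch below (Python, assuming the requests package and a placeholder sitemap URL) parses an XML sitemap and flags entries that answer with a 4XX status:

```python
# Minimal sketch: parse an XML sitemap and report URLs that return a 4XX status.
# Assumes `requests`; the sitemap URL is a placeholder, and large sitemaps
# should be sampled or throttled before running a check like this.
import xml.etree.ElementTree as ET

import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=30).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    # Some servers reject HEAD; switch to requests.get if that is the case.
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if 400 <= status < 500:
        print(f"{status} {url}  <- fix, redirect, or drop from the sitemap")
```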

About the Author

Engineer by education, marketing influencer by profession, and creative writer by passion, Neo has been involved in branding, advertising, and consulting in the domain of digital marketing over the years.
