Removing an online store from the Yandex «Baden-Baden» filter
Subject: sales of car tires and wheels.
Objective: remove the site from the Baden-Baden filter before the winter tire sales season (October) while maintaining its high positions in Google.
Filter discovered and work begun: May 2020.
Number of product items: constantly changing; the site was a tire hypermarket with about 5,000 products.
Analyzed: texts on 160 pages.
Positions recovered: September 2020.
Duration of the text work: 3 months.
Total duration of the work: 5 months.
The beginning of the story

A customer whose business was selling wheels and tires for cars contacted our company for help: his site had fallen under the Baden-Baden filter in Yandex. Until 2017 the site had been promoted by another SEO company, and afterwards by the client himself. Our objective was to get the filter removed and raise the site in the search results by October, the start of the winter tire sales season. It was also vital not to lose the site's positions in Google: had the outcome been unfavorable, the client's business would have been left without a major source of income until spring. Our specialists knew that the entire online store would have to be reworked in a short time, but the client trusted us with the job.

We found that the main problem was a large number of auto-generated, non-unique texts. The site had hundreds of pages for tires of standard sizes, differing only in wheel diameter and in tire width and height. To avoid writing a text for each such page, the client had created a single template in which only the size values changed.
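To make the scale of the problem concrete, here is a minimal sketch (not taken from the project's actual tooling) of how such templated pages can be flagged: texts that differ only in the tire size share most of their word n-grams, and a simple Jaccard similarity over shingles exposes that. The sample texts are hypothetical.

```python
# Minimal sketch: flagging near-duplicate templated texts by shingle similarity.
# The page texts below are hypothetical illustrations.

def shingles(text: str, n: int = 3) -> set:
    """Split text into overlapping word n-grams (shingles)."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: shared shingles / total distinct shingles."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Two templated texts that differ only in the tire size:
page_a = "buy winter tires 205/55 R16 with free delivery in our tire hypermarket"
page_b = "buy winter tires 195/65 R15 with free delivery in our tire hypermarket"

similarity = jaccard(shingles(page_a), shingles(page_b))
print(f"similarity: {similarity:.2f}")  # → similarity: 0.43
```

Even in this 12-word example nearly half the shingles coincide; on full-length templated pages the similarity approaches 1.0, which is exactly what search engines penalize as non-unique content.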

The quality of the unique texts also left much to be desired: they were uninformative, overspammed, and full of phrases that sound unnatural in live speech and immediately give them away as SEO texts.

The message in Yandex.Webmaster confirmed it:

None of the 5,000 selected search queries made it into the top 10:

Initial percentage of keywords in the top 10:

• Google: 57%

• Yandex: 0%

Traffic to the site from Yandex had dropped significantly, while the site had kept its positions in Google:

Work on the site was carried out from May to September 2020 (5 months).
The results achieved:

1. During the first month (May) we collected a complete semantic core of more than 5,000 queries. Our team then checked the quality of all the texts on the site and drew up a content plan for them. We analyzed both technical indicators (spam, redundancy, uniqueness) and the usefulness of each text for the user.
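As an illustration of the "spam" indicator mentioned above, here is a minimal sketch of a keyword-density check. The sample text, keyword, and the 5% threshold are hypothetical, not the project's actual tooling.

```python
# Minimal sketch of a "spam" (keyword density) check.
# Naive whitespace split: punctuation attached to words is ignored here.

def keyword_density(text: str, keyword: str) -> float:
    """Share of the text's words taken up by a single keyword."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

text = ("Winter tires in our store: buy winter tires cheaply, "
        "winter tires with delivery, winter tires for any car.")
density = keyword_density(text, "tires")
print(f"density of 'tires': {density:.0%}")  # → density of 'tires': 22%
if density > 0.05:  # threshold is an assumption for illustration
    print("page looks overspammed")
```

A density this high on a short text is a typical sign of an overspammed SEO page; real audits combine several such metrics with a manual readability review.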

We also found logical duplicates: pages that targeted the same group of queries. For example, two landing pages had been created for tires of the same brand. Our team configured 301 redirects from the duplicate pages to the main ones.
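A 301 redirect of this kind is usually configured at the web-server level. Below is a minimal sketch for nginx; the domain and URLs are hypothetical placeholders, since the actual paths on the client's site are not shown in this case study.

```nginx
# Hypothetical example: two landing pages existed for the same tire brand,
# so the duplicate permanently redirects (301) to the main page.
server {
    listen 80;
    server_name example.com;

    location = /tires/brand-duplicate/ {
        return 301 /tires/brand/;
    }
}
```

A permanent (301) redirect tells search engines to transfer the duplicate's ranking signals to the main page instead of letting the two compete.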


2. From June to August our specialists processed 160 pages with texts: some texts were edited or expanded, others written from scratch. This was a difficult task, since it required creating unique, high-quality content for standard pages full of products with almost identical characteristics. We also corrected the meta tags.
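Meta tag corrections on such pages usually mean replacing templated, keyword-stuffed tags with unique, descriptive ones. A hypothetical before/after sketch (the values are invented for illustration; the project's actual tags are not shown in this case study):

```html
<!-- Before: templated and overspammed, near-identical on hundreds of pages -->
<title>Tires buy cheap tires winter tires</title>

<!-- After: unique and descriptive (values are hypothetical) -->
<title>Winter tires 205/55 R16 — buy with delivery</title>
<meta name="description"
      content="Winter tires 205/55 R16 in stock: prices, availability, delivery terms.">
```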

In September, the fifth month, we sent a request through Yandex.Webmaster to lift the filter, and soon afterwards the filter was successfully removed.

With the filter removed, the site climbed higher in the search results:


3. Our specialists also conducted a full technical audit of the site and provided recommendations for fixing errors. Among the issues fixed:

• A large number of internal links returning 301 and 404 response codes. Such links are dangerous: the site can be treated as a doorway, i.e. a site that serves no purpose other than redirecting users elsewhere, while a 404 code simply means the page doesn't exist.
• Scripts, style files, and images blocked in robots.txt. This flaw hides from the search engine the styles that shape the page's appearance; without them, the page looks messy and unfinished to the robot.
• Errors in sitemap.xml. This file is a list of the site's pages for the search engine: it makes crawling easier and reports when each page was last changed, so it is especially important for a site as large as proshina.by.
• Page duplicates caused by URLs written in different letter cases. Duplicates are pages accessible at different URLs but containing the same or similar content; they compete with each other in the search results and damage the site's rankings.
• Problems with pagination pages, to which the system appended various parameters, producing numerous duplicates and other flaws.
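Several of these fixes come together in a single robots.txt file. Below is a hypothetical sketch, not the client's actual file: assets are explicitly open to crawling so the search engine can render pages correctly, parameterized duplicates are closed off, and sitemap.xml is declared.

```text
# Hypothetical robots.txt sketch (paths and parameters are illustrative)
User-agent: *
Allow: /*.css          # styles must be crawlable so pages render correctly
Allow: /*.js
Allow: /images/
Disallow: /*?sort=     # parameterized duplicates of pagination pages

Sitemap: https://example.com/sitemap.xml
```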

While the site was being taken out from under the filter, the improved texts also raised the share of keywords in the Google top from 57% to 71%:

As a result, by October (at the time of the filter's removal) the share of keywords in the Yandex top 10 reached 39%:

Owing to comprehensive work on the site (improving the tags, the texts, and the technical condition), we got the Baden-Baden filter removed, secured favorable positions in Yandex, and even improved the positions in Google.

Our work enabled the client to increase winter tire sales, thus also improving road safety :)
