Why does your website lose organic traffic?

As we already mentioned in one of the previous articles, organic traffic is much more stable than traffic from other sources.

It is relatively hard to lose, and if it declines, it usually does so gradually - except in cases where webmasters roll out a new website and fail to preserve the URLs previously indexed by the search engines (see the Indexing section below). So you normally have enough time to notice the decline and stop it, provided, of course, that you keep track of your search engine marketing performance in Google Analytics.
To understand the reasons why you can lose organic traffic, you first need to know how web search works in general.

How does web search work?

If you want to know what to blame for the loss of traffic and which actions will ensure the quickest recovery, you should understand some of the principles search engines work by. Specifically, you need to know the stages of data processing they go through to find your pages, rank them, and place them on the search results page.
Every search engine processes the data iteratively in three phases: crawling, indexing, and ranking.

#1 Crawling. Special search robots designed by the search engine vendor constantly crawl the global web, entering every conceivable domain and page and registering all changes and new pages that have appeared recently. It normally takes Google’s robots about two weeks to crawl the whole of the Internet and start their journey anew. That is why you usually have to wait a while before your new pages appear in the search results, even at the bottom of the list.
Every new website the robots discover is crawled end to end. The robot first enters the website’s main page and then checks every link it finds, both internal links and links leading to other websites. The robot also looks for a file called sitemap.xml, where all of the website’s pages are listed, to make sure no page is missed, even if no direct link leads to it. All the pages that are not blocked from crawling (according to the website’s robots.txt file) are registered.
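To make this concrete, here is a minimal sketch using Python’s standard urllib.robotparser module. The robots.txt contents and URLs are hypothetical, but they show how a Disallow rule hides pages from crawlers and how the Sitemap directive points robots to the page list:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: the Disallow rule blocks crawlers from /private/,
# and the Sitemap line tells them where the full page list lives.
ROBOTS_TXT = """User-agent: *
Disallow: /private/
Sitemap: https://example.com/sitemap.xml"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/private/draft"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))      # True
print(rp.site_maps())  # ['https://example.com/sitemap.xml'] (Python 3.8+)
```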
Note that crawler visits are not counted as user traffic: Google Analytics takes care of distinguishing robots from humans.

#2 Indexing. All the data collected by the robots is processed and systemized by the search engine. The website in general and each of its pages are scrutinized and described in the search engine’s index according to the following criteria:
- Page title
- Page description
- Page keywords
- Types of content present
- Number of internal and external links
- Image titles
- Video metadata, etc.

Indexing allows the search engine to keep all the data it needs to perform web searches (mainly, to rank the pages) without storing the content itself.
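As a rough illustration of the kind of data an index records, here is a minimal sketch using Python’s standard html.parser that pulls out a page title, meta description, and links, three of the criteria listed above. The HTML is a made-up example; real indexers are, of course, far more sophisticated:

```python
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Collects a few of the signals a search index records for a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")
        elif tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

extractor = MetaExtractor()
extractor.feed(
    "<html><head><title>Demo page</title>"
    '<meta name="description" content="A short demo description.">'
    "</head><body><a href='/about'>About us</a></body></html>"
)
print(extractor.title)        # Demo page
print(extractor.description)  # A short demo description.
print(extractor.links)        # ['/about']
```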

#3 Ranking. Ranking is performed every time the search engine receives a search query from a user. After the query is submitted, the search engine decides which web pages to offer the user and in which order. There are several algorithms at work here, including machine learning, natural language processing, and manual rules set by the search engine’s employees.

The actual ranking rules are kept secret (according to information revealed by Google, there are over 250 of them), but the intuition behind them is relevance and usefulness. In other words, the search engine tries to build a ranking that provides the user with the options most likely to resolve their problem, answer their question, or satisfy their desire.

5 steps to learn why your organic traffic is declining

Keeping these search engine working principles in mind, we can think of a number of reasons you may lose your organic traffic. Let us list the most frequent ones.

#0 Make sure it is really a loss, not just an error. An error can occur because of a change to your GA tracking code: the traffic is fine, but some recent changes to the website’s code broke the tracking code or made it irrelevant. So start by checking that your tracking code is installed properly. In Google Analytics, open “Admin” → “Tracking Info” → “Tracking Code” and check the status. If it is working properly, move on to the checks below. If not, follow the instructions from the GA Help section to fix it.
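Beyond the status shown in the GA interface, you can also verify that a tracking snippet is actually present in the pages being served. Below is a minimal sketch; the page URL is hypothetical, and the marker strings are an assumption covering the common GA snippets (gtag.js and the older analytics.js):

```python
import urllib.request

PAGE = "https://example.com/"  # hypothetical page to check
MARKERS = (
    "googletagmanager.com/gtag/js",       # gtag.js snippet
    "google-analytics.com/analytics.js",  # older analytics.js snippet
)

html = urllib.request.urlopen(PAGE).read().decode("utf-8", errors="replace")
found = [marker for marker in MARKERS if marker in html]
print("Tracking snippet found:", found if found else "none - check your GA installation")
```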

#1. Check the Google Search Console for important diagnostic information. This is the best way to learn what happened:
- Notifications of website errors or downtime;
- Changes to the list of internal links and external backlinks;
- Crawl rate and the dates of recent crawling;
- Performance dynamics for specific keywords and specific pages;
- Messages from Google that can point to specific changes, including manual actions, that led to the downturn.
At the very least, you will learn when the decline began and which sources, keywords, or pages started underperforming. Knowing the timing, you can decide whether the drop is connected to specific changes you have made. You may find a loss of ranking for specific important keywords or notice an indexing issue.

#2. Check whether the algorithms of the search engine where you noticed the decline have changed. For example, Google updates its algorithms over 600 times a year. They never reveal the actual details, but they do announce major changes, especially decisions to penalize websites for specific kinds of content or behavior. If you are not aware of such changes, you may have a hard time wondering why Google started ranking your website 100 positions lower overnight.

#3. Check whether your website is losing backlinks or some of its internal links have stopped working. If some donor websites deleted links to your website, this causes a loss of referral traffic, which is easy to notice. But as a side effect, Google notices that your website is losing authority and starts ranking it lower. You can also consult a service such as Ahrefs to learn about backlink losses.
Broken internal links are also bad because they tell Google that your website is losing integrity and becoming harder to navigate. It is quite easy to break internal links: change a page’s URL, forget to update all the internal links leading to it, and they now lead nowhere, returning a 404 page. Use specialized scanners to reveal such breakages and fix them; a minimal sketch of the idea follows.
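Before reaching for a full-blown scanner, a single-page sketch in Python’s standard library can already surface broken internal links. The site root is hypothetical, and a real scanner would follow links recursively rather than checking only one page:

```python
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Gathers every href found on a single page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

START = "https://example.com/"  # hypothetical site root

page = urllib.request.urlopen(START).read().decode("utf-8", errors="replace")
collector = LinkCollector()
collector.feed(page)

for href in set(collector.links):
    url = urljoin(START, href)
    if urlparse(url).netloc != urlparse(START).netloc:
        continue  # skip external links; only internal breakage matters here
    try:
        urllib.request.urlopen(url)
    except urllib.error.HTTPError as err:
        print(f"{url} -> HTTP {err.code}")  # broken links (e.g. 404) show up here
```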

#4. Make sure you have taken care of all the needed changes while migrating your website. If you have recently done a redesign or moved your website to new hosting, this may have caused undesirable changes to the website parameters the search engines take into account. These include:
- Loss of backlinks and internal links discussed above;
- Wrong links to images;
- Issues with website availability or speed - one of Google’s top ranking factors;
- Redirect script failures (see the redirect check sketch after this list);
- Lost content and metatags;
- New information architecture.
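At least the redirect part of this checklist is easy to verify yourself: confirm that each old URL still lands on its intended new location. Here is a minimal sketch with a hypothetical URL mapping; urlopen follows redirects, and geturl() reports where the request finally landed:

```python
import urllib.error
import urllib.request

# Hypothetical mapping of pre-migration URLs to their intended new locations
REDIRECTS = {
    "https://example.com/old-page": "https://example.com/new-page",
    "https://example.com/old-blog/post": "https://example.com/blog/post",
}

for old, expected in REDIRECTS.items():
    try:
        final = urllib.request.urlopen(old).geturl()  # URL after following redirects
    except urllib.error.HTTPError as err:
        print(f"{old} -> HTTP {err.code} (redirect broken)")
        continue
    print(f"{old} -> {'OK' if final == expected else 'unexpected target: ' + final}")
```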

#5. Lastly, check the quality of your content - and that of your competitors. It is widely known that longer texts are typical of highly ranked websites, with the average content length for successful websites being around 2,500 characters per text.
Besides content size, Google also evaluates its quality, both directly (using natural language processing) and indirectly (by observing user behavior).

What does Google consider bad content? You can refer to the following criteria:
- Poorly structured superficial texts;
- Low-quality or borrowed images;
- A confusing, hard-to-navigate interface;
- Overwhelming pop-up windows, notifications and ads;
- Too many “doorway” and dynamic pages with low organic traffic. Consider either improving or removing them. You can also block them from crawling using robots.txt (as in the sketch in the Crawling section above); note that to remove pages already in the index, a noindex meta tag works better.

If, after checking all the factors above, you still have not found the reason, that probably means your competitors have started outperforming you on some of those same factors. Check the search results pages, Alexa rank, and other popular rankings to figure out which competitors have recently overtaken you, and investigate their recent changes: what content they have posted, where they have gotten new backlinks, etc.

With this knowledge, you have every chance to stop the loss and regain your organic traffic.