
Google Updates - The most important changes to the Google algorithm

Panda, Penguin, Hummingbird, RankBrain, BERT and whatever else they are called - Google is constantly testing and changing its comprehensive algorithm to improve search results for users. Below you will find an overview of the most important Google updates of recent years and insights into their effects. That way you are well prepared and know what to look out for when optimizing your website so that it is not devalued by the next Google update.


Latest Google updates

Over the past year, Google rolled out several updates. The most recent major, confirmed updates are briefly summarized here; we go into more detail further below.

December 2020 Core Update

Shortly before Christmas, Google rolled out a new core update on December 4th, 2020. The first effects of the update are already noticeable. According to Sistrix, dictionaries and encyclopedias are the websites most affected this time. It remains to be seen which other industries will be affected once the update has fully rolled out.

May 2020 Core Update

At the beginning of May, Google announced another core update, which was rolled out shortly afterwards. To assess the exact effects of the update and determine which topics and industries it has hit this time, we will have to wait a little longer. Sistrix has already made first observations and shared them on its blog. A clear limitation to individual industries cannot yet be derived from them, even if pages from the health sector seem to be among those affected this time.

January 2020 Core Update

In mid-January 2020, Google rolled out a new core update. As with the core updates of the previous year, websites from the health and financial sectors were again disproportionately affected. These either gained significantly in visibility or lost some positions. According to Sistrix, the trustworthiness of the domains was reassessed in the course of the update. Sites that were affected by previous core updates are very likely to be affected again. The fluctuations in visibility seem to decrease with each update, but the range of affected domains keeps expanding.

November 2019 Local Search Update (Bedlam Update)

In November 2019 there was a new update that affects local search. With the help of neural matching, search queries with local intent can now be interpreted even better. Neural matching is based on artificial intelligence (AI). For local businesses, however, there is no need for action.

Basics: Update vs. Data Refresh

Not all optimizations that Google makes are automatically updates. Anyone who digs deeper into the subject of Google updates will sooner or later also stumble across the term "data refresh".

The difference is easy to explain: An update is an actual change to the Google algorithm, such as the first introduction of the Panda or Penguin update. Such updates are, in effect, filters that are placed over the algorithm and relate to a certain aspect of the overall algorithm, which consists of a large number of ranking factors. These filters, however, always need a database on which they can work. If Google improves a filter such as Penguin and wants to weight individual factors of the filter differently, only the database is updated; a data refresh is carried out. However, if Google identifies new signals and aspects and includes them in the filter, then this is an update.

Updates therefore have a much greater effect on websites and their ranking than a data refresh (source: Sistrix.de).

Panda Update (2011)

One of the best-known updates is the Panda update. It was first rolled out in 2011 and serves as a quality filter for Google with regard to website content. With the help of this filter, the content of websites is checked for quality and, above all, for its added value for the user.

Since the first roll-out, four updated versions have been published at irregular intervals. The Panda Updates 4.0 and 4.1 in 2014 in particular had a significant impact on website rankings in search results. After that, updates to the Panda update were no longer communicated, as the filter now functions as an integral part of the Google algorithm.

What are the effects of the Panda update?

Websites with good editorial content were rated better as a result of the Panda update and were able to gain ranking positions. Conversely, websites with dubious and poor content lost visibility because they were displaced by other search results. With each subsequent Panda update, Google added more signals to the filter that help distinguish high-quality content from low-quality content.

According to Searchmetrics, the 4.1 update had some negative effects on international players such as HRS.de or Nokia.com. Smaller and medium-sized pages with high quality content, on the other hand, were able to gain visibility.

In order to benefit from the Panda update, it is important to offer your own high-quality content on your website that primarily aims at added value for the user. You should avoid duplicate content and information copied 1-to-1 from external sources. Google has explained in its webmaster blog what distinguishes high-quality content.

Penguin Update (2012)

The Penguin update was first rolled out and communicated by Google in April 2012. It is another Google quality filter, this time revolving around the identification of spam links and backlinks. With the help of this filter, Google recognizes websites that have unnatural backlink patterns or engage in keyword stuffing. These are punished by losing rankings or by being removed from the Google index entirely. Since Penguin 4.0, the quality filter has become even better, according to Google: the filter is now able to identify spam signals at URL level and only devalues the URLs with spammy links.

There have been four updated versions since the filter was first rolled out. Since Penguin 4.0 it has been integrated into the core algorithm. Since September 2016, these updates are no longer communicated separately, as Penguin now works in real time. According to Google, the filter is updated continuously and without notice.

What are the effects of the Penguin Update?

Websites that have engaged in unnatural and dubious link building, or have artificially pushed their pages up the rankings with the help of keyword stuffing, are rigorously penalized by Google. Google has meanwhile become very good and sensitive at identifying unnatural backlinks. You can find tips on sustainable link building at OMT.

When Penguin was not yet integrated into the core algorithm in real time but was rolled out manually by Google, the impact of a Penguin update could always be recognized easily by a website's visibility rising or falling. Since Penguin now takes effect in real time every time a URL is crawled, the effects are much harder to identify. In general, however, the real-time filter is a positive development, as pages affected by Penguin no longer have to wait for the next roll-out but can see the effects of their link management work directly.

Hummingbird Update - a new Google algorithm (2013)

In August 2013, Google confirmed the launch of a new generation of the Google algorithm - fittingly, in time for the 15th birthday of Google search. Hummingbird is not just a change to the algorithm; it represents a completely new algorithm which has since formed the basis of the search. It should make it possible to deliver search results faster and, above all, more precisely.

The goal of Hummingbird is to better understand the user's intention behind a search query and to deliver more suitable search results accordingly. Instead of looking at the individual words of a search query and interpreting them in isolation, Google has since Hummingbird been able to analyze the search query as a whole and establish a semantic relationship between the individual words. This makes the user intention clearer, and the orientation and intent of a website can also be interpreted better with the help of Hummingbird.

Above all, conversational searches, i.e. search queries phrased as a conversation or question, can be interpreted better. For example, Google can now very reliably recognize the intent behind search queries such as "Show me pictures of Cologne Cathedral" or "Will it rain today?":

Search result for search query "Will it rain today?"

Search result for the query: "Show me pictures of Cologne Cathedral"

The update also affects voice search, which should improve enormously, since spoken search queries are often formulated as whole sentences.

Hummingbird is a step towards semantic search and represents the switch to a new generation of search, which was continued with RankBrain and BERT (see below). Almost 90% of all search queries were affected by Hummingbird at the time of the roll-out.

What are the effects of the Hummingbird Update?

According to Searchmetrics, the diversity of search results decreased with Hummingbird, because more similar results are displayed for semantically identical search queries. The number of different URLs in the search results decreased by almost 6%. This is especially the case with semantically similar keywords.

A little tip on the side: With structured data, you can better align your website with semantic searches, as the data available on the website can be better read and interpreted by search engines.
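As a minimal sketch of what such structured data could look like: the following Python snippet builds JSON-LD markup for an article page using schema.org's Article type. The headline, author and other values are placeholders, and in practice the resulting JSON would be embedded in the page inside a <script type="application/ld+json"> tag.

```python
import json

# Hypothetical JSON-LD markup for an article page, built as a plain dict.
# Property names follow schema.org's Article type; the values are placeholders.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Google Updates - The most important changes to the Google algorithm",
    "author": {"@type": "Person", "name": "Mareike Doll"},
    "about": ["Google algorithm", "SEO", "core updates"],
}

# Serialize the dict; this string would go into a <script type="application/ld+json"> tag.
print(json.dumps(article_markup, indent=2, ensure_ascii=False))
```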

Google Mobile Update & mobile-first Index (2015)

In February 2015, Google announced a mobile update for April 21, 2015 and thus launched its push to make mobile search more user-friendly. Since more and more users perform Google searches on their smartphones, pages that are optimized for mobile devices should also appear accordingly in mobile search results.

This update was referred to by the media as "Mobilegeddon" because a significant impact on the search results was feared. Some webmasters did not use the time until the update to optimize their websites for mobile devices and had to accept major losses in visibility. Overall, however, the feared fluctuations were kept within limits.

A second announced mobile update took place in May 2016, but it had a much smaller effect. Most websites had by then been optimized for mobile use, as Daniel Furch from Searchmetrics confirms and as is also suspected on moz.com.

Then, in May 2018, Google rolled out mobile-first indexing. This means that Google now evaluates websites based on the content viewed on mobile devices. Previously, the desktop version of a website was primarily used to determine its relevance for a specific search query. Content that is hidden on smartphones, for example to increase user-friendliness, is not taken into account for the ranking. Exceptions are contents that become visible with a click, such as content in accordions.

According to Google, the first wave of index changes started with websites that corresponded to best practices for mobile-first indexing. All websites that have already been switched from desktop to mobile-first indexing received a notification in the Google Search Console.

Notice in the Google Search Console that Mobile First indexing has been rolled out for a website

If you want to test your mobile site, you can do so with the help of a Google tool.

What are the effects of the mobile update & the mobile-first index?

As announced by Google, websites with responsive design or separate mobile websites were positioned better in smartphone search results after the update. According to Searchmetrics, websites that were affected by the first update have largely improved and now also have a mobile-friendly page. In general, the proportion of websites with a mobile-friendly page has increased significantly: according to Searchmetrics, their share rose from 76% in May 2015 to 98% in May 2016.

In its best practices, Google has recorded the theoretical effects of the mobile-first index:

If you have this type of website, the following happens:

  • Desktop version only: Your website is only available as a desktop version and you don't have a mobile-friendly version. → No change; the mobile version is the same as the desktop version.
  • Responsive website: Your website adapts to the screen size of the user. → No change; the mobile version is the same as the desktop version.
  • Canonical AMP website: All of your pages are created as AMP HTML. → No change; the mobile version is the same as the desktop version.
  • Separate URLs: Each desktop URL has an equivalent for mobile users, but the URL is different (also known as an m-dot website). → Google prefers the mobile URLs for indexing.
  • Dynamic serving: Your website serves different content depending on the user's device, but all users see the same URL. → Google prefers the mobile-optimized content for indexing.
  • AMP and non-AMP: Your website has both an AMP and a non-AMP version of a page, and users see two different URLs. → Google prefers the mobile version of the non-AMP URL for indexing.

Another positive aspect is that with the mobile-first index, hamburger menus and accordions, i.e. elements that only reveal more content on mobile devices after a click, are finally recognized by Google. The content behind these elements flows into the evaluation of a page, as Barry Schwartz sums it up. Previously, Google was often unable to read and evaluate such "hidden" content correctly.

RankBrain - Machine Learning in the Google Algorithm (2015)

In October 2015, Google announced that RankBrain had already been part of the algorithm for several months and that, alongside content and backlinks, it counts as the third most important ranking factor. RankBrain is based on machine learning.

Ryte defines machine learning as follows:

Machine learning is an important area of computer science and part of artificial intelligence. Computer programs based on machine learning can use algorithms to independently find solutions to new and unknown problems. Machine learning is also used in online marketing and web analysis.

According to Google Senior Research Scientist Greg Corrado, the difference between RankBrain and all other Google ranking factors is that the findings behind the previous ranking factors were based on internal research, whereas RankBrain gains this knowledge itself through machine learning.

In general, Google's aim with RankBrain is to analyze user search queries so well that websites are also displayed in the search results that may not match the exact wording of the user's search query. RankBrain should help to interpret especially very long (long-tail) or generic search queries better. (Sources: searchengineland.com and Searchmetrics)

In short, RankBrain makes it easier to deal with new or very generic search queries by linking them to similar, already known search queries and learning, based on statistical experience, to better identify the search intention behind the user's queries. Once again, the user intention is in the foreground.
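To make the idea more tangible, here is a purely conceptual Python sketch - explicitly not Google's actual implementation - of how a new query could be linked to similar, already known queries via vector similarity. The queries and vectors are made up; in a real system they would come from a learned model.

```python
from math import sqrt

# Toy "query vectors" for queries the system already knows.
# These numbers are invented purely for illustration.
known_queries = {
    "how tall is the eiffel tower": [0.9, 0.1, 0.0],
    "weather in cologne today": [0.0, 0.2, 0.9],
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def most_similar_known_query(new_vector):
    # Link an unseen query to the closest known query by vector similarity.
    return max(known_queries, key=lambda q: cosine_similarity(known_queries[q], new_vector))

# Made-up vector for a new long-tail query about a famous landmark's height.
new_query_vector = [0.8, 0.2, 0.1]
print(most_similar_known_query(new_query_vector))  # -> "how tall is the eiffel tower"
```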

What are the effects of RankBrain?

Since RankBrain is used for every search query, it influences the rankings of a large number of websites. To date, Google has officially given only a few comprehensible examples of how exactly RankBrain works. However, Kai Spriestersbach has collected some good examples. After RankBrain, a significantly more meaningful search result was displayed for the following search query than before:

Search results for the search query "Why are PDFs so weak" before RankBrain

Search results for the search query "Why are PDFs so weak" according to RankBrain

RankBrain also helps to interpret search queries better in Germany. Before RankBrain, the question "Wer ist Abs?" ("Who is Abs?") was only associated with the more obvious reference to the anti-lock braking system (ABS). Google now shows other, more relevant pages for this search query, as RankBrain recognizes that the question "WHO is Abs?" is looking for a person rather than a system.

Search results for the search query "Wer ist Abs" before RankBrain

Search results for the search query "Wer ist Abs" according to RankBrain

According to Kai Spriestersbach, other ranking factors such as backlinks and keywords in the title have lost importance since RankBrain, as Google is now able, based on machine-learned statistical experience, to establish relationships between search queries that were previously not possible. User signals and the user intention behind a search query play a central role here.

You can find more information about Rankbrain in the article "Artificial intelligence in use or: What is RankBrain?"

Quality Update or Phantom Update (2015)

Again and again there are unexpected and unexplained fluctuations in the search results, because Google tests changes to the algorithm and does not communicate them as it did with the Panda or Penguin updates. In general, updates of this type appear to be changes to the core of the algorithm (core updates), which, according to Google, relate to the quality signals of the core algorithm. However, Google does not comment on the exact nature of the affected ranking signals. In May, June and December 2015, several of these updates caused turbulence in the search results. Since there were no official names for these updates, they came to be called quality or phantom updates. In mid-2016 it was also suspected that another phantom update had been rolled out (Searchmetrics and internetworld reported on this).

According to Searchmetrics, the phantom updates focus on quality and user intention. High-quality, relevant and unique content and a good website structure that makes it easy to access all content are rated positively. According to the Quality Rater Guidelines published by Google, the focus when evaluating websites is primarily on how well the user's intention is satisfied.

What are the effects of the Phantom Update?

According to Sistrix, domains lost between 20% and 60% of their visibility worldwide in the calendar week from May 4th, 2015 to May 11th, 2015. Google then confirmed that it had made a change to the core algorithm.

However, the update does not appear to target a specific website type, but had an impact on different kinds of websites. According to Searchmetrics, user signals above all were decisive. If the content of a page is not tailored thematically or structurally to the user's intention, this has a long-term effect on the visibility of a website.

Fred Update (2017)

At the beginning of March 2017, Barry Schwartz and other well-known figures in the industry once again triggered big discussions about another update to the core algorithm. More or less for fun, the update was unofficially christened Fred.

Although Fred was officially confirmed at the end of March, the exact mode of action of the update is not one hundred percent clear, as Google disclosed little information about it. Gary Illyes pointed out, however, that pages that were penalized had specifically violated the current webmaster guidelines.

In an analysis of the effects of the Fred update, Sistrix came to the conclusion that this was again an optimization of the ranking criteria that deal with the quality of the page content of a website. Here, too, the focus is on the added value of the content for the user.

What are the effects of the Fred Update?

According to Sistrix's analysis, the losers are above all websites that offer inferior content and are heavily ad-laden, i.e. have many advertising banners above the fold (in the directly visible area of a page). This is also confirmed by Barry Schwartz's analytical findings. Pages with very outdated or very thin content, as well as "SEO texts" over-optimized with keywords, are rated negatively. Likewise, sites whose main goal is to attract many users via Google with poor-quality content and to monetize them via affiliate programs instead of offering added value lose out.

Domains that have already been affected by Panda and / or Penguin updates are again mostly among the losers this time and suffered ranking losses of 40 to 100+ positions.

User Localization Update - More relevance for local search (2017)

Towards the end of October 2017, Google announced that it is no longer possible to run targeted search queries in specific Google indexes by entering the respective country-code top-level domain (ccTLD). Previously, based on the entered ccTLD, you could see the specific search results for each country even if you were not physically in that country; this is no longer possible. Instead, the search results of the Google index corresponding to your physical location are played out by default.

Specifically, this means that if you are in Germany and google something there, SERPs from the German index (google.de) will be displayed. However, if you are on vacation in France and are looking for the weather there, you will automatically receive the most relevant search result, namely the weather forecast for your location from the French index (google.fr), and not the weather at home in Germany. The goal of this update is to increase the relevance of the search results for the user and to take the individual (local) context of the user more into account.

From now on, your physical location determines the search results that are played out to you. The name user localization update probably goes back to Searchmetrics; Google itself did not give this update a specific name.

What are the effects of the localization update?

Searchmetrics has written an interesting article on the subject and asked industry experts for their opinions. You should take the following key messages with you:

  • The rating of websites does not change. Tracking the rankings per country may be a challenge for tool providers in this area
  • Global portals may find it more difficult to appear in local searches in the future, as more relevance is assigned to local results
  • International SEO standards such as hreflang, regional targeting with the help of the Google Search Console and the use of the relevant terms per country and language should be checked again if you have not yet taken them into account in your international SEO strategy (see the sketch after this list)
  • In addition, the update supports the relevance of mobile search queries, as these are very often local, according to Searchengineland and Google
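If hreflang has not been part of your setup so far, the following minimal Python sketch shows the general shape of hreflang annotations that would go into the <head> of each page version. The domains and language codes are placeholder assumptions; each version should list all alternates, including itself, plus an x-default fallback.

```python
# Placeholder URLs for country/language versions of the same page.
ALTERNATES = {
    "de-DE": "https://www.example.com/de/",
    "en-US": "https://www.example.com/en/",
    "fr-FR": "https://www.example.com/fr/",
    "x-default": "https://www.example.com/",
}

def hreflang_tags(alternates):
    # Build one <link rel="alternate"> tag per language/region version.
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    )

print(hreflang_tags(ALTERNATES))
```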

Speed Update (2018)

Most of you have probably known for a long time that PageSpeed is an important ranking factor. With the Speed Update in July 2018, however, Google once again underlined the importance of this factor, above all for mobile search. Until then, the evaluation of pages focused mainly on the loading times of the desktop version of your website. With the Speed Update, the PageSpeed of the mobile website is the focus of the evaluation. Mobile pages that load very slowly are at a disadvantage in terms of ranking.

The loading time of a mobile page is critical with regard to the bounce rate

What are the effects of the Page Speed Update?

According to Google Webmasters, the Speed Update only affects pages that deliver the slowest user experience and therefore only a small number of pages and search queries. Regardless of the technology of a page, only the loading time itself is considered.

According to Searchmetrics, only a small number of very slow mobile websites are affected. Due to the early announcement of the update and the generally quite open communication from Google when it comes to mobile updates, website operators had enough time to prepare for the topic.

There are a number of tools for checking your own website's loading-time performance. Among other things, these include PageSpeed Insights from Google itself. Here you can simply enter the URL of the page to be analyzed and receive a report on the desktop and smartphone performance of that page.

Result of the PageSpeed Insights test for a URL
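If you would rather query the measurement programmatically, the sketch below uses the public PageSpeed Insights API (v5). The URL is a placeholder, an API key is only needed for higher request volumes, and the exact response structure may change over time, so treat the field access as illustrative rather than definitive.

```python
import json
import urllib.parse
import urllib.request

def pagespeed_performance_score(page_url: str, strategy: str = "mobile") -> float:
    """Fetch the Lighthouse performance score (0 to 1) for a URL."""
    params = urllib.parse.urlencode({"url": page_url, "strategy": strategy})
    endpoint = f"https://www.googleapis.com/pagespeedonline/v5/runPagespeed?{params}"
    with urllib.request.urlopen(endpoint) as response:
        data = json.load(response)
    # Field names as currently documented; adjust if the API response changes.
    return data["lighthouseResult"]["categories"]["performance"]["score"]

if __name__ == "__main__":
    print(pagespeed_performance_score("https://www.example.com/", strategy="mobile"))
```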

Since October 2019 there has also been a report on loading times in the Search Console, with which you can see which of your URLs load particularly slowly or particularly quickly. This is an indication that the PageSpeed topic will remain relevant in the future.

Report on loading times in the Google Search Console

Medic Update I & II (2018)

At the beginning of August 2018, Google rolled out a big core algorithm update worldwide that influenced the rankings of a strikingly large number of pages from the health sector. For this reason, Barry Schwartz coined the term Medic Update for these changes to the Google algorithm.

It turned out that the Medic Update not only had an impact on pages from the health sector, but affected pages in the YMYL category ("Your Money, Your Life") in general. In addition to health pages, finance, fitness and wellness pages were accordingly also affected. Some of them saw drops or increases in visibility of up to 60%.

Google kept a low profile in its statements about this update, too. Presumably, however, an entire industry was not deliberately re-evaluated; rather, adjustments were made that affect a certain type of page that happens to be common in these subject areas.

At the beginning of October 2018, another medic update took place, whereby the visibility of some previously dropped pages normalized again, while others continued to gain or lose visibility. Obviously, Google has readjusted here and fixed bugs.

Google only commented on this core update to a very limited extent, but stressed the importance of high-quality content as a ranking factor.

Source: https://twitter.com/searchliaison/status/1050447185177800704

What are the effects of the Medic updates?

The Medic updates seem to be mostly about the trustworthiness of sites. Important factors here are expertise, authoritativeness and trustworthiness, in short E-A-T. The updates are therefore also called E-A-T updates. Pages with very sensitive content are particularly affected, and the "trust" factor plays a special role for them. Site operators should therefore continue to pay attention to high-quality and relevant content, but should also highlight the expertise of the authors in a way the search engine can recognize.

Google also emphasizes this in its General Guidelines under point 2.3:

We have very high Page Quality rating standards for YMYL pages because low quality YMYL pages could potentially negatively impact users' happiness, health, financial stability, or safety.

Google's E-A-T score gives a good insight into which factors a text should meet in order to be considered trustworthy.

For pages with a high E-A-T score on medical topics, the following is recommended under point 3.2:

High E-A-T medical advice should be written or produced by people or organizations with appropriate medical expertise or accreditation. High E-A-T medical advice or information should be written or produced in a professional style and should be edited, reviewed, and updated on a regular basis.

The Medic Update only reminded us once more how important quality content is for the ranking of websites. Website operators should therefore continue to invest in high-quality content, provide scientific evidence where necessary, and also ensure a high level of trustworthiness on their own site, e.g. through an imprint, terms and conditions, and contact options.

Core algorithm updates 2019

In 2019, core updates came at fairly regular intervals (March, June and September), which were also confirmed by Google and in some cases even announced via Twitter. They were named after the month in which they were rolled out.

What is noticeable about these updates is that the affected pages show changes in visibility very quickly and very clearly. There are clear losers, but also clear winners of the updates. Domains that had already felt the effects of the Medic or E-A-T updates from 2018 are now mostly among the affected pages as well. It can also be seen that pages that reacted to the March update were also influenced by the two subsequent updates. This can be seen very clearly at netdoktor.de, for example:

Visibility shows strong changes at the time of the core updates

You can find an overview of the changes in visibility in large portals from the health sector from Martin Missfeld.

March 2019 Core Update

The March 2019 Core Update mainly affected domains from the "Your Money Your Life" (YMYL) environment and pages from the health sector. According to Searchmetrics, pages with strong branding and a broad range of topics were among the winners of the update, while niche websites tended to be among the losers. In addition, sites with good user signals seem to have benefited from the update.

June 2019 Core Update

In June, Google announced a core update via Twitter for the first time. Again, many websites from the YMYL area were affected, and domains that had lost visibility in March were able to catch up again in June. In addition to the medical and pharmaceutical sites that had already reacted to the Medic updates in 2018, websites from the news sector and content pages now also belonged to those affected. Searchmetrics also saw an increase in video carousels in the SERPs.

September 2019 Core Update

The September 2019 Core Update was again announced by Google via Twitter. According to Sistrix, pages from the travel sector were particularly affected, but there was also renewed movement in the SERPs in the health and finance sectors. Searchmetrics also noted an upgrade of video content, so that video carousels now more often appear ahead of the actual organic results in the SERPs.

Effects of the core updates

The aim of the core updates is to evaluate the content of websites even better and to provide users with the best possible content for their search query. According to Google, websites whose SEO performance was hit by the updates are not necessarily doing something wrong and being penalized for it. Rather, new competitors have arrived who deliver better content.

These are therefore not penalties for violating Google guidelines. Accordingly, there is nothing on the page that can simply be fixed to quickly regain the old visibility. Orientation is provided by the websites that gained visibility through the update; from Google's point of view, these seem to be doing something right. The focus should be on the quality of the content, and greater attention should be paid to the search intention when creating it. E-A-T continues to play a major role.

In the Webmaster Forum, Google has provided a list of questions with which webmasters can critically review their content. You should also familiarize yourself with the Google Quality Rater Guidelines.

BERT update (2019)

In October 2019, Google released a new update intended to help understand complex search queries correctly and deliver the corresponding search results. The update bears the name BERT, short for "Bidirectional Encoder Representations from Transformers". According to Google's own statements, this update is the biggest development of the search engine in the past five years.

BERT is based on Natural Language Processing (NLP) and thus follows on from Hummingbird and the RankBrain algorithm. The aim is to better understand search queries in their entirety and to interpret individual words in the context of the sentence instead of each word in isolation. This mainly relates to prepositions such as "to" or "for", which previously could not be evaluated in the context of the search query. As an example, Google cites the search query "2019 brazil traveler to usa need a visa". The preposition "to" plays a crucial role in the meaning of the query: the user is looking for information for travelers from Brazil who want to enter the USA. Google can now recognize this and display the corresponding information. The example shows that this was not always possible before:

Search results before and after the BERT update

For us, this means that search queries that are made in natural language or in whole sentences can be better understood and corresponding search results can be delivered according to the search intention. The improvements affect both the usual search results and the display of featured snippets.

What are the effects of the BERT update?

For now, BERT is only active for US English. According to Google, approx. 10% of all English-language searches in the USA are affected by it. But knowledge gained from one language can be transferred to another. As a result, effects may also be felt in other languages before the update itself is rolled out for the respective language. So even if BERT has not yet arrived in Germany, our search results can still benefit from it.

There is no "optimizing for BERT", since this update is not about a new evaluation of websites but about a better understanding of the search query. Nevertheless, it makes sense in this context to provide exactly the content users are looking for on your own website and to formulate it as simply and clearly as possible. You can find more about this at SEOSüdWest.

Conclusion

The latest Google updates show that Google is turning two screws: the evaluation of websites and the interpretation of search queries.

On the one hand, the search engine is getting better and better at judging the quality of a website. Websites with poor or outdated content lose their relevance and thus their ranking positions; websites with good user signals, on the other hand, gain visibility. In addition to excellent content, the usability of a website counts: operation on mobile devices and loading times are the focus of optimization.

On the other hand, the Google algorithm is learning to interpret the search queries it receives more precisely. This allows the search engine to display search results in a more targeted way. It also makes communication with the search engine easier for the user, since search queries phrased in spoken language can be evaluated better.

As a rule of thumb, you should remember:

Anyone who implements SEO to a high standard, works seriously and transparently, brings in creativity and always keeps the context of the user in mind need not fear Panda, Penguin, RankBrain, BERT and co. In addition, you shouldn't ignore your mobile users: make sure that your website also offers good usability on the go, presents relevant content and performs well.

 

Photo credits: Some of the images used come from stramark.

Mareike Doll

Mareike heads the SEO team at lunapark. She originally comes from the humanities, having studied German and Italian philology. She started out in online marketing as an online editor and is now at home in the world of search engines. At lunapark, in addition to leading the SEO team, she is responsible for editing the company blog and is passionate about writing specialist articles. She also gives workshops and has appeared as a speaker at various SEO conferences.

