You published a new site, months have passed, and you still can’t find it anywhere near the first Google page. Sounds familiar? If you recognized yourself in this situation, it’s the right time to look for the reasons behind it and start with a site audit to find out what’s wrong and what should be changed. Often, individuals and companies aren’t even aware that their site isn’t well optimized; they’re pretty sure they did everything they should to make it eligible for a good organic ranking. So, recognizing that there is an issue that needs to be resolved is the first step in allowing your site to reach a higher ranking and outperform competitors.
The thing about SEO is that we need to look at it in a holistic way, as it includes different parts: technical aspects, content-related ones, but also building quality links and performing various kinds of analysis. Frequently, a wide range of issues occurs because, while developing the site, the focus was put on only one part of SEO and all the others were neglected. It’s also not unusual for people to assume that certain activities aren’t necessary and leave them out, which later results in lower rankings.
That said, in the following lines I will give a brief overview of the SEO issues I’ve encountered most frequently in my work. I’ll also try to explain why these aspects are important and how they can impact overall SEO performance and rankings. The goal is to help you understand what you might have done wrong and how you can fix it going forward.
Without further ado, let’s dive into the most common SEO mistakes that can give you headaches.
Trying to Rank for the Wrong Keywords
The recipe for ranking among the highest Google positions is quite simple - the first step is to make sure that you’re using the right keywords for every page. One of the most frequent mistakes is optimizing a site for generic keywords, or for ones that don’t really correlate with your business. If you are running a business in a pretty saturated industry, for instance, beauty or retail, you have to be careful about what your main target keyword is. You shouldn’t go for a generic one, like ‘hairdresser’ or ‘makeup artist’, as you will face a lot of competitors, probably even worldwide, which means you won’t be able to rank high for those phrases. It’s useful to pay attention to keyword search volumes, as they indicate popularity, but try to target phrases that appear in many search queries yet aren’t too competitive, which you can check with a quick look in a keyword research tool. So, the key is in finding the right balance.
Longer, more specific keywords give a better chance of ranking higher in organic searches, but they also have lower search volume, i.e., fewer people include them in their queries. One efficient approach is to optimize different pages of your site for different long-tail keywords, so that you can attract more organic traffic and compete for more phrases related to your business. It will give you far better results than targeting only one highly competitive keyword, even if you think that keyword is significant.
Another thing worth mentioning here is that you must avoid using keywords on your site that have nothing to do with the nature of your business and your offer. Some companies think they can “cheat the system” by using popular phrases that aren’t closely related to them and that this will bring more website visitors. But even if that were the case, those visitors would leave the site shortly after if they couldn’t find the desired information on it. In the worst-case scenario, your site could even be penalized for using irrelevant keywords, so you have to be very careful about which phrases you’re adding and targeting.
Creating Content Before Doing SEO Research
This is somewhat related to the previous mistake of adding the wrong keywords to your site’s pages. If you want to increase your chances of obtaining a higher organic ranking, you need to conduct thorough research and analysis before you start writing the content. Many think that SEO work becomes relevant once everything is prepared and published, but it’s actually the opposite. To optimize the quality of the content on your site, you need to complete different kinds of research before you jump into the writing process. This will not only help you find good keywords to target but will also allow you to get to know your audience better and understand their needs. Content can be a powerful tool for getting more traffic and reaching your target group, but it must have some value and be useful to the readers, so they are interested in staying longer on the site and visiting it more often.
In order to have meaningful and informative content on the site, you should focus on several kinds of analysis: finding relevant keywords that are related to your business and present in search queries, and gathering data about your target audience to discover their interests, motivations, online behavior, and so on. It can also be extremely helpful to do a competitor analysis, as it will provide valuable insights into your competitors’ activities and help you figure out why some of them rank better than you. One good suggestion is to go through the content of highly ranked competitors and analyze it in detail to see which keywords they used, how they structured their articles, whether their content has some special value, and so on.
Preparation is really the key factor in successful site optimization, and it’s almost mandatory to collect quality data before you actually start writing the content for the site, as it will give you everything you need for building the right structure and making it appealing to readers. Content is one of the significant ranking factors, and it can’t be neglected or be of average quality if you expect your site to be found among the first Google search results. Hence, don’t find excuses or be lazy; start researching and write something that will be useful to the community.
Not Paying Attention to the Site’s Loading Speed
A fast loading speed is essential not only for a good user experience but also for achieving good search rankings. This is something you simply can’t overlook if you expect to rank high. Modern websites usually contain many elements, such as images, GIFs, videos, JavaScript code, and various functionalities, and all of these can slow down the site if they’re not optimized properly.
As of August 2021, page experience signals were included among the Google Search ranking factors, and one of the most important signals is Core Web Vitals, which measures various performance metrics and the overall user experience on a page. Within this report, three things are measured: first input delay (FID), largest contentful paint (LCP), and cumulative layout shift (CLS). The first two are closely related to loading speed: FID measures the delay between a user’s first interaction with your site and the moment the browser can respond to it, while LCP monitors how quickly the main, and usually the biggest, element of a web page is loaded.
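To make the thresholds concrete, here is a minimal Python sketch that classifies measured values against Google’s published “good” thresholds for these metrics (LCP within 2.5 seconds, FID within 100 milliseconds, CLS at most 0.1). The function name and dictionary layout are illustrative, not part of any real tool’s API:

```python
# Google's published "good" thresholds for the three Core Web Vitals.
GOOD_THRESHOLDS = {
    "lcp_seconds": 2.5,  # Largest Contentful Paint
    "fid_ms": 100,       # First Input Delay
    "cls": 0.1,          # Cumulative Layout Shift (unitless)
}

def assess_core_web_vitals(metrics: dict) -> dict:
    """Return {metric: True/False}, where True means the value is 'good'."""
    return {
        name: metrics[name] <= limit
        for name, limit in GOOD_THRESHOLDS.items()
        if name in metrics
    }
```

For example, a page with an LCP of 3.1 s, an FID of 80 ms, and a CLS of 0.05 would pass on responsiveness and layout stability but fail on loading speed.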
Slow load time is a very common issue found on various sites, as many companies just don’t recognize how important it is that pages load quickly, ideally within a few seconds. Luckily, there are some great tools available, like Google PageSpeed Insights, that can help you quickly check your site speed and get suggestions on how to fix existing issues. Some issues will require a fix from a developer, but there will definitely also be many SEO-related tasks, so it’s important to go through those reports thoroughly. Problems usually occur with large images, too many plugins and fonts, embedded media, some design elements, and so on. If users have to wait several seconds to see the first element on the page, that can have a very negative impact on their experience and can make them leave the site quickly and never come back. Consequently, this can lead to high bounce rates and low traffic, which algorithms will perceive as a negative signal and rank your site lower.
Not Optimizing Images For Web
Many websites have great designs, and it’s obvious effort went into creating creative visuals, but on the other hand, not much attention is paid to how those design elements will impact search rankings. The major problem comes from the fact that the designers who prepare images for a site are often graphic designers rather than web designers, so they’re not familiar with web standards. If there isn’t an SEO specialist who works closely with the designer and can give the right input about adequate image sizes, it often happens that the media elements uploaded to the site are too big, which can have a very negative impact on performance and ranking. In my personal working experience, I’ve seen many sites that look pretty good from a design perspective and contain some really creative elements and quality images, but those images are too big and simply not suitable for the web.
A general rule of thumb says that images uploaded to a site shouldn’t be bigger than 200 KB, and although it’s not mandatory to follow this strictly, it’s still necessary to keep them somewhere around this size, as otherwise they can slow down the site. Believe it or not, I’ve even found 5 MB images on sites, and those sites had significantly low scores and poor organic rankings. While planning and developing a site, it’s crucial to find the right balance between creativity and functionality; no matter how good-looking your site is, it won’t help you get the desired position on Google if it’s slow and poorly optimized. Thus, before uploading images to the site, check their size, and if you notice an image is too big, resize it with Photoshop, or even better with an online tool like TinyPNG or Squoosh, as they will help you do the job pretty quickly while retaining the original quality of the images.
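Checking sizes by hand gets tedious on larger sites, so here is a small Python sketch that walks a local folder and flags images above the ~200 KB rule of thumb mentioned above. The function name, extension list, and limit are illustrative assumptions; adjust them to your own workflow:

```python
import os

# Rule-of-thumb limit from the article: ~200 KB per image.
SIZE_LIMIT_BYTES = 200 * 1024
IMAGE_EXTENSIONS = (".jpg", ".jpeg", ".png", ".gif", ".webp")

def find_oversized_images(root: str, limit: int = SIZE_LIMIT_BYTES) -> list:
    """Return (path, size_in_bytes) pairs for images above the limit."""
    oversized = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(IMAGE_EXTENSIONS):
                path = os.path.join(dirpath, name)
                size = os.path.getsize(path)
                if size > limit:
                    oversized.append((path, size))
    return oversized
```

Running it over your assets folder before an upload gives you a quick list of candidates for TinyPNG or Squoosh.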
Another thing that has to be mentioned here, and that’s related to images, are so-called alternative tags, or more popularly, alt tags, which are also often overlooked on many sites. They are basically HTML attributes of images that describe their content. In practice, we will once in a while be in a situation where an image on our website won’t render due to an improper filename, an incorrect file path, or some server-related issue. If something like that happens, the content we put inside the alt attribute will appear on the page, and users will still be able to tell what the image is about. Besides users, these tags are also useful for crawlers, as they help them understand what the information on the page is about. This is a very good SEO practice with a real impact, especially on the accessibility side, which contributes to the overall page experience. Without alt tags, screen readers can’t convey the content of our images to blind or visually impaired users, which is more than enough reason to always add them to every image.
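In HTML, the alt text is just an attribute on the `img` element. The filename and description below are placeholder examples:

```html
<!-- Descriptive alt text: shown if the image fails to load,
     read aloud by screen readers, and used by crawlers. -->
<img src="images/red-lipstick-swatches.jpg"
     alt="Swatches of five red lipstick shades on a forearm">
```

A good alt text describes what the image shows in plain language; stuffing it with keywords defeats its purpose for both users and search engines.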
Having Poor UX on Mobile Devices
Many modern sites have mobile apps that accompany the desktop version, but even those that don’t should pay a lot of attention to how the site is shown and rendered on mobile devices. Research from portals like Statista has shown that, as of 2022, mobile internet traffic accounts for almost 55 percent of total web traffic. This is strong proof of how important good mobile optimization is: if your site offers a poor user experience on smartphones and tablets (it’s slow, some links don’t work, and there are extensive cumulative layout shifts around images), visitors will likely leave it quickly and never come back. I have seen a bunch of sites with solid desktop performance that load slowly on mobile phones and have small fonts and bad navigation, which completely ruins a good experience.
We should also take into consideration the fact that Google has been switching to mobile-first indexing, which means it looks primarily at the mobile version of a site to decide how high it should rank. So, no matter how well the desktop version of your site is set up, if the mobile version isn’t responsive or has serious bugs, this will surely reflect negatively in your organic rankings.
For many elements on your site, you can do a manual mobile-friendliness check to see if everything runs as it is supposed to and whether there are any obstacles preventing a smooth user experience. The simplest way is to open the mobile version of your site and try clicking on every link and button and scrolling down the pages to see how images render; this will help you check that everything functions properly. But if you want a broader picture of your site’s performance on mobile devices, you can use tools like Google Lighthouse and Google Search Console, as they will indicate exactly what should be modified. There you will get all the necessary information on different aspects, from the Core Web Vitals report to the site’s speed on mobile devices, issues with visual elements, and so on. It’s highly recommended to analyze the site’s performance on all devices before it’s actually launched; this way you’ll avoid unnecessary trouble, and your website visitors will be satisfied, which will improve their session durations.
Having Broken Links and Duplicate Content
One of the most frequent errors I’ve come across recently is that many sites, even ones that belong to big and famous companies, have broken links and dead pages that haven’t been properly redirected to a new source. This is really bad from the user experience perspective: people get disappointed if they hurry to find some valuable information on your site and instead see a “404 page not found” message. Even if that happens only once, there’s a huge chance those users won’t ever come back, as the site will look unprofessional to them and clearly not recently updated. That’s why, as a website owner, you should regularly review all pages and links on the site to ensure they all work, and if you notice a dead page, you should either redirect it to a new URL or remove it if it’s no longer necessary.
For smaller sites, this can easily be done manually: just click on every page and link to check that everything runs smoothly. For bigger ones, you can use a tool like Screaming Frog, which I personally use and can confirm is pretty good, to detect all issues of this kind. If you neglect non-functional pages and links, you really risk a poor ranking, as this isn’t just a bad signal for your audience; it also has a very negative impact on the algorithms. Sites with many broken links are usually positioned lower, as this tells Google the site isn’t updated frequently, and it definitely won’t be eager to show such sites among the first search results.
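To illustrate what a crawler does under the hood, here is a minimal Python sketch of the first half of a link audit: collecting every link on a page so each URL can then be requested and checked for a 404. It uses only the standard library; tools like Screaming Frog automate this across a whole site:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather the href of every <a> tag encountered in a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list:
    collector = LinkCollector()
    collector.feed(html)
    return collector.links
```

The next step, not shown here, would be to request each collected URL and record any that return a 404 or other error status.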
Another thing that can be mentioned in this context, and that can also be found very often, is duplicate content. By duplicate content, we primarily mean any piece of content that is either similar or completely identical to other content placed on the same site. Although this doesn’t mean, as many people think, that sites with duplicated content will be penalized, it’s still very likely to have a significant impact on organic rankings. Search engines don’t like showing multiple versions of the same content in search results, so they will choose the one that seems more relevant to users.
So, if you have duplicates, some pages on your site will surely be affected, as they won’t be ranked anywhere. There are many reasons why there might be duplicated content on a site, and it often doesn’t happen on purpose, so if you recently changed something on the site, migrated it to a new domain, or similar, make sure you crawl it thoroughly with a tool that will show you any duplications. If you find duplicated pages, a good idea is to use a canonical tag to point to the preferred URL. These tags help in such situations, as they tell search engines that one page is a duplicate of another and which of the duplicate pages to consider the primary one for indexing.
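The canonical tag itself is a single line in the page’s head; the URL below is a placeholder example:

```html
<!-- Placed in the <head> of the duplicate page, this tells search
     engines which version of the content to treat as primary. -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```

Note that the tag is a hint rather than a directive, so it works best when the duplicate pages also link consistently to the preferred URL.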
Neglecting Meta Tags
Lastly, I will point out another common mistake when it comes to good SEO, one that has been especially present in the past few years. It refers to meta tags: many people, even some who work as SEO professionals, have started wrongly believing that they no longer need to be added to pages and completely overlook them. Even though meta tags aren’t a direct ranking factor, they are still important because they influence how your site appears in search results and how many people will be attracted to click and visit it. This especially applies to the meta description tag, which is shown directly below the title in search results, and you definitely don’t want to leave its wording to Google.
Additionally, they can be a factor in some other, less traditional search results, like the Knowledge Graph, Google image search, voice search, and similar. When algorithms crawl a site, they look at the whole structure, and they value sites that have everything implemented and respect all the rules. Nowadays, most website platforms already have built-in sections for completing meta tags, especially description ones, for each page, so this shouldn’t be neglected by any means. While adding meta tags, it’s also essential to make sure they really have meaning and logic and that relevant keywords are included in them. This also means you will need to do some research before implementing them, to be sure you’ve found the most suitable terms that will help you reach the most adequate audience.
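In the markup itself, both tags live in the page’s head. The brand, page, and wording below are placeholder examples:

```html
<head>
  <!-- The title appears as the clickable headline in search results;
       the meta description appears below it. -->
  <title>Bridal Hairstyles | Blog | Glow Studio</title>
  <meta name="description"
        content="Browse our gallery of bridal hairstyles and book a
                 trial appointment with our stylists.">
</head>
```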
Besides, you will also need to focus on the optimal length, which for meta titles is around 50-60 characters, while a meta description ideally doesn’t exceed 160 characters; otherwise it won’t be fully displayed in search results, which doesn’t look good. If you add these tags through a CMS, you will see clear guidelines about the limits, so it won’t be hard to respect the recommendations, but if you are inserting them directly into the code, you will have to check the character count manually. Furthermore, it’s highly recommended to write title tags and meta descriptions in a uniform way, i.e., to follow a similar pattern for all pages. For example, what I usually use in my work when it comes to titles is a scheme that looks something like this - Primary Keyword | Page Name | Brand Name. Whenever you can, it’s a good idea to insert keywords into the tags to try to boost the ranking this way as well, but make sure not to overload descriptions with them, as that can produce a counter-effect.
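The manual character counting mentioned above is easy to automate. This Python sketch builds a title following the Primary Keyword | Page Name | Brand Name scheme and flags tags likely to be truncated; the function names and limits (60 and 160 characters) are illustrative, based on the recommendations above:

```python
# Approximate display limits before search results truncate the text.
TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 160

def build_title(primary_keyword: str, page_name: str, brand: str) -> str:
    """Assemble a title following the 'Keyword | Page | Brand' scheme."""
    return f"{primary_keyword} | {page_name} | {brand}"

def within_limits(title: str, description: str) -> dict:
    """Flag tags that are likely to be cut off in search results."""
    return {
        "title_ok": len(title) <= TITLE_LIMIT,
        "description_ok": len(description) <= DESCRIPTION_LIMIT,
    }
```

A check like this fits naturally into a publishing pipeline, so no page goes live with a title or description that will be cut off.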
In this pretty descriptive article, my main goal was to point out some of the mistakes that are often repeated on many different websites and that prevent good optimization. They’re not just mistakes made by people who have nothing to do with SEO; in fact, it frequently happens that those actively working in this field also miss some important elements, either because they are not aware of them or because they don’t consider them significant for high rankings. We all know that SEO requires a lot of work and effort and that big results definitely won’t come overnight. But to be on the right path, it’s necessary to recognize all the elements that can have an impact on ranking and to do everything in your power to optimize the site in the best possible way. As I mentioned in the introductory part of this article, SEO must be looked at in a holistic way, and you have to pay attention to various elements at once if you expect to see your site among the first positions in search results.
To summarize: when you start working on a site’s optimization, focus on its speed, make sure you have great content that contains relevant keywords, add meta tags, check the functionality of links, test the experience on mobile devices, and so on. Some SEO mistakes can easily be spotted just by visiting the site, and even regular users can notice that something’s wrong. But there are also those that aren’t so obvious and that require a thorough site audit to check all elements and find the existing issues. That’s why you’ll need to examine your site both manually and with the help of tools, to understand it deeply and fix all the errors that prevent it from ranking well.
From this text, you can surely see that there are many mistakes and factors that can influence high rankings, and unfortunately, overlooking even one element can make you pay for it. Sometimes literally: if you don’t achieve the desired results with organic rankings, you may need to turn to paid campaigns, which seems pretty disappointing. Therefore, to avoid getting into this situation, read all the suggestions given here, as they can serve as guidelines on how to fix some common issues and what to concentrate on while optimizing the site. Good luck! :)