1. Do the web pages contain multiple H1 tags?
  2. Is the website easy to crawl?
  3. Are error pages configured correctly?
  4. Does the navigation use JavaScript?
  5. Do URLs resolve to a single case?
  6. Is your site using a flat architecture?
  7. Is there thin content on the website?
  8. Are you planning to reuse existing code or create a site from scratch?
  9. Does the site have any Schema.org structured data, if applicable?
  10. Does the site have an XML sitemap?
  11. Are landing pages not properly optimized?
  12. Is the Robots.txt file free of errors?
  13. Does the website use responsive design?
  14. Are CSS and JavaScript blocked in Robots.txt?
  15. Are there excessive dynamic URLs being used across the site?
  16. Is the site plagued by too many links?
  17. Does the site have daisy-chained redirects, and do they exceed five hops?
  18. Are links on the site served with JavaScript?
  19. Is the anchor text in the site's link profile too optimized?
  20. Is your canonicalization implemented correctly on your website?
  21. Are the pictures on the site too big?
  22. Is the site missing the schema markup for videos?
  23. Does the site have all of the required page titles?
  24. Does the site have all the required meta descriptions?
  25. Is the page speed of the top landing pages longer than 2-3 seconds?
  26. Does the website use browser caching?
  27. Does the website use a content delivery network?
  28. Has the website content been optimized for targeted keyword phrases?
  29. How deeply has the website content been optimized?
  30. Have you done keyword research on the website?
  31. Has the website content been proofread?
  32. Have the images on the website been optimized?
  33. Is the website following web development best practices?
  34. Was an HTTPS migration performed correctly?
  35. Has a new disavow file been sent to the correct Google Search Console profile?
  36. Have the GSC settings been transferred to the new account?
  37. Did you make a note of the migration in Google Analytics?
  38. Is the implementation of social media carried out correctly on the website?
  39. Are the lead submission forms working properly?
  40. Are lead tracking scripts working properly?
  41. Is call tracking set up properly?
  42. Is the site using excessive inline CSS and JavaScript?
  43. Are the correct GSC / GA accounts linked? Are they properly linked?
  44. Are your URLs too long?
  45. How targeted are keywords on the website?
  46. Is Google Analytics properly set up on the website at all?
  47. Is Google Tag Manager working properly?
  48. Does the site have any other major quality issues?
  49. Is your reporting data accurate?
  50. Is my website actually done?

Sometimes you look at your own website and wonder whether it is actually any good.

It can be helpful to take a step back, take a critical look at your work, and ask unbiased questions.

Something can go wrong on any website.

Problems can arise and then get ignored, or seem not urgent enough to resolve right away.

Minor technical SEO problems can pile up, and new libraries can add more and more delay to pages, until eventually a website becomes unusable.

That's why it is important to evaluate the quality of your website before it reaches that point.

What are the shortcomings of your website?

How can you fix them and make the site work properly?

Read on for 50 questions you should ask to evaluate the quality of your website.

1. Do the web pages contain multiple H1 tags?

The page should not contain multiple H1 tags.

Multiple H1s may no longer be a significant problem for Google, but they are still a problem for screen readers. Best practice is to have only one.

The H1 tag indicates the focus of the page and should therefore only appear once on the page.

If you put more than one H1 tag on the page, the focus of the page will be watered down.

The H1 is the page's thesis statement. It's important to keep it simple.
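
If you want to check this at scale rather than eyeballing each template, a short script can count the H1 tags on any URL. Here is a minimal sketch, assuming Python with the third-party requests and beautifulsoup4 packages installed; the URLs are placeholders.

import requests
from bs4 import BeautifulSoup

def count_h1_tags(url):
    # Fetch the page and parse the HTML.
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return len(soup.find_all("h1"))

for url in ["https://www.example.com/", "https://www.example.com/about/"]:
    count = count_h1_tags(url)
    flag = "OK" if count == 1 else "CHECK"
    print(f"{flag}: {url} has {count} H1 tag(s)")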

2. Is the website easy to crawl?

Try crawling your site with a third-party tool like Screaming Frog, check your crawl errors in Google Search Console, or run a page through Google's mobile-friendly test.

If you can't crawl your website, search engines probably can't crawl it effectively either.

3. Are the error pages configured correctly?

Problems can arise if error pages are not configured correctly.

Many websites have soft 404 pages that don't send a 404 signal to Google. Instead, they say 404 to users but return a 200 HTTP status code, which tells search engines the page is real content and indexable.

This can lead to crawling and indexing conflicts.

The safe solution is to make sure every page returns the status code it actually represents.

4xx pages should return a 4xx status code.

5xx pages should return a 5xx status code.

Any other configuration creates confusion and masks the actual problems.
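
One way to spot soft 404s is to request a URL you know doesn't exist and look at the status code that actually comes back. Here is a minimal sketch, assuming Python with the requests package; the URL is a placeholder.

import requests

# A path that should not exist on your site.
url = "https://www.example.com/this-page-should-not-exist-12345"

response = requests.get(url, allow_redirects=True, timeout=10)

if response.status_code == 200:
    print(f"Possible soft 404: {url} returned 200")
else:
    print(f"{url} returned {response.status_code} (a missing page should return 404 or 410)")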

Identifying and fixing problems this way can improve the quality of your website.

Are your error pages configured correctly?

4. Does the navigation use JavaScript?

If your navigation relies on JavaScript to work, you will hurt cross-platform and cross-browser compatibility.

Use plain HTML links to take the user to a new URL.

Many of the same visual effects can be achieved with plain CSS3, which should be used for navigation styling instead.

If your website uses a responsive design, this should already be part of it.

Seriously, drop the JavaScript.

5. Do URLs resolve to a single case?

Much like canonicalization issues, URLs that resolve in more than one case (for example, /Page and /page) can lead to duplicate content problems, because search engines see each case variation as a separate URL.

One trick for taking care of multi-case URLs is to add a lowercase rewrite rule to the website's .htaccess file.

This resolves all case variations to the single canonical URL you choose.

You can also check the on-page canonical tag to see whether it matches, and whether separate pages exist depending on whether or not the URL ends with a trailing slash.
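
A quick way to test this is to request a mixed-case variant of a known URL and see whether it permanently redirects to the lowercase canonical version. Here is a minimal sketch, assuming Python with the requests package; both URLs are placeholders.

import requests

canonical = "https://www.example.com/widgets/blue-widget/"
mixed_case = "https://www.example.com/Widgets/Blue-Widget/"

# Don't follow redirects so we can inspect the first response directly.
response = requests.get(mixed_case, allow_redirects=False, timeout=10)
location = response.headers.get("Location")  # may be relative on some servers

if response.status_code in (301, 308) and location == canonical:
    print("OK: mixed-case URL permanently redirects to the lowercase canonical")
elif response.status_code == 200:
    print("Problem: mixed-case URL resolves as its own 200 page (duplicate content risk)")
else:
    print(f"Check manually: status {response.status_code}, Location: {location}")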

6. Is your site using a flat architecture?

Using a flat architecture is fine, but it doesn't lend itself well to topical focus and organization.

With a flat architecture, all of your pages are usually stored in the root directory and there is no focus on topics or related topics.

With a siloed architecture, you can group web pages by topic and subtopic and organize them with a link structure that further reinforces that topical focus.

This, in turn, helps search engines better understand what you're trying to rank for.

For sites with a narrower topical focus, a flat architecture might be the better way to go, but you should still consider siloing the navigation where it makes sense.

7. Is there thin content on the website?

Short-form content is not necessarily a problem, as long as it is high quality and answers the user's query with the information that query calls for.

Content should not be measured by the number of words, but by quality, uniqueness, authority, relevance and trust.

How can you tell if the content is thin? Ask yourself the following questions:

  • Is that quality content?
  • Is the content unique (written originally, so the same text doesn't appear anywhere else, whether elsewhere on Google or on your own site)?
  • Does the content substantially answer the user's query?
  • Is it relevant and does it create trust when you visit the page?

8. Are you planning to reuse existing code or create a site from scratch?

Copying and pasting code is not as straightforward as you might expect.

Have you ever seen websites that appear to have errors in every line of code when checked with the W3C validator?

This is usually because the developer copied and pasted the code written for one DOCTYPE and used it for another.

If you copy and paste the code for XHTML 1.0 into an HTML 5 DOCTYPE, you can expect thousands of errors.

This is why it is important to keep this in mind when moving a website to WordPress. Check the DOCTYPE used.

This can affect cross-browser and cross-platform compatibility.

9. Does the site have Schema.org structured data, if applicable?

Schema markup is key to earning rich snippets on Google.

Even if your site doesn't lend itself to certain structured data types, there are still ways to add structured markup.

First, doing an entity check is helpful in determining what your site is already doing.

If there is none, you can correct that by adding schema markup to:

  • Navigation.
  • Logo.
  • Phone number.
  • Certain elements of content that are standard on any website.

Make sure you are using only one schema format (for example, JSON-LD rather than a mix of formats) and that the markup conforms to Google's guidelines.
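
To illustrate the kind of markup that covers those basics, here is a minimal sketch that builds an Organization JSON-LD block with Python's json module; the name, URL, logo, and phone number are placeholder values, and the output would be embedded in the page head inside a script tag of type application/ld+json.

import json

# Placeholder values - swap in your real organization details.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Widgets Inc.",
    "url": "https://www.example.com/",
    "logo": "https://www.example.com/images/logo.png",
    "contactPoint": {
        "@type": "ContactPoint",
        "telephone": "+1-555-555-0100",
        "contactType": "customer service"
    }
}

# Print the JSON-LD payload to paste into the page template.
print(json.dumps(organization, indent=2))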

10. Does the site have an XML sitemap?

A single item rarely leads to an improvement in a website's overall quality, but this is one of those rare birds.

An XML sitemap makes it much easier for search engines to crawl your website.

Things like 4xx and 5xx errors in the sitemap, non-canonical URLs in the sitemap, blocked pages in the sitemap, oversized sitemaps, and other issues should be investigated to determine how the sitemap affects the quality of a website.
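
You can also sanity-check the sitemap itself with a short script that fetches it and flags any listed URL that doesn't return a 200. This is a minimal sketch for a simple urlset sitemap, assuming Python with the requests package; the sitemap URL is a placeholder, and a large sitemap would need throttling.

import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Parse all <loc> entries out of the sitemap.
root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS) if loc.text]

for url in urls:
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        # Redirects, 4xx, and 5xx entries all weaken the sitemap's usefulness.
        print(f"{status}: {url}")

print(f"Checked {len(urls)} URLs from the sitemap.")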

11. Are landing pages not properly optimized?

Sometimes optimizing multiple landing pages for the same keywords on the website doesn't make your website more relevant.

This can lead to the cannibalization of keywords.

Having multiple landing pages for the same keyword can hurt your link equity as well.

If other websites link to several of your pages on the same topic, the link equity gets diluted across all of the pages on that topic.

12. Is the Robots.txt file free of errors?

Robots.txt can be a big problem if the website owner hasn't configured it properly.

One of the things I come across during website audits is a robots.txt file that is not configured properly.

Too often I see websites with indexing problems and the following code has been added unnecessarily:

Disallow: /

This blocks all crawlers from crawling anything on the website, starting from the root folder.

Sometimes you need to prohibit robots from crawling a certain part of your website – and that's fine.

However, check your robots.txt to make sure that Googlebot can crawl your most important pages.
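
Python's standard library can run that check for you. Here is a minimal sketch using urllib.robotparser; the domain and the list of important pages are placeholders.

from urllib.robotparser import RobotFileParser

robots_url = "https://www.example.com/robots.txt"
important_pages = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/blog/",
]

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()

for page in important_pages:
    if parser.can_fetch("Googlebot", page):
        print(f"OK: Googlebot may crawl {page}")
    else:
        print(f"BLOCKED: robots.txt prevents Googlebot from crawling {page}")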

13. Does the website use responsive design?

Gone are the days of separate websites for mobile devices (you know, the websites that use subdomains for the mobile website: "mobile.example.com" or "m.example.com").

Thanks to the responsive design, this is no longer necessary.

Instead, the modern method uses HTML 5 and CSS 3 media queries to create a beautiful design.

This is all the more important now that Google's mobile-first index has arrived.

14. Are CSS and JavaScript blocked in Robots.txt?

It's important to go through these as robots.txt shouldn't be blocking any CSS or JS resources.

Google sent a bulk warning about blocking CSS and JS resources in July 2015.

In short, don't do it.

You can check whether this is causing problems using Google's mobile-friendly test.

15. Are there excessive dynamic URLs being used across the site?

Determining the number of dynamic URLs and whether they are a problem can be a challenge.

The best way to do this is to see if the number of dynamic URLs outweighs the static URLs of the site.

In this case, there may be an issue with dynamic URLs affecting crawlability.

This makes it difficult for search engines to understand your website and its content.

For more information, check out the Ultimate Guide to an SEO Friendly URL Structure.

16. Is the site plagued by too many links?

Too many links can be a problem, but not in the way you might think.

Google no longer penalizes pages for having more than 100 links on a page (John Mueller confirmed this in 2014).

However, if you have more than that amount – possibly significantly more – it can be considered a spam signal.

After a certain point, Google just gives up looking at all of these links.

17. Does the site have daisy-chained redirects, and do they exceed five hops?

Google may follow up to five redirects in a chain, but chains can still cause problems.

Redirects cause even more problems once they get into excessive territory – beyond five hops.

So it is a good idea to keep any redirect chain to two hops or fewer, provided that doing so does not undo previous SEO work on the site.

Redirects are annoying to users too and after a certain point, users will give up.
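
Counting the hops in a chain is straightforward, since the requests library records every intermediate redirect it follows. Here is a minimal sketch, assuming Python with the requests package; the starting URL is a placeholder.

import requests

start_url = "https://example.com/old-page"

response = requests.get(start_url, allow_redirects=True, timeout=10)
hops = response.history  # one entry per redirect that was followed

print(f"{start_url} -> {response.url} in {len(hops)} redirect(s)")
for hop in hops:
    print(f"  {hop.status_code}: {hop.url}")

if len(hops) >= 5:
    print("Warning: this chain is long enough that Google may stop following it.")
elif len(hops) > 2:
    print("Consider collapsing this chain into a single redirect.")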

18. Are links on the site served with JavaScript?

Serving navigation links with JavaScript is a bad idea, as it limits compatibility between browsers and platforms and hurts the user experience.

When in doubt, do not serve links with JavaScript – just use plain HTML links.

19. Is the anchor text in the site's link profile too optimized?

If your site has a link profile with overly optimized, repetitive anchor text, this can potentially trigger action against the site (whether algorithmic or a manual penalty).

Ideally, your site should have a healthy mix of anchor text pointing to your pages.

The right balance follows something like a 20% rule: roughly 20% branded anchors, 20% exact match, 20% topical or partial match, 20% naked URLs, and so on.

The challenge is to achieve that balance in the link profile without leaving a footprint that identifies it as manipulated.

20. Is your canonicalization implemented correctly on your website?

Canonicalization ensures that Google sees the URL that you prefer.

In short, you use a snippet of code to declare which URL Google should treat as the preferred source for a given piece of content.

Getting this right heads off several problems at once, including:

  • The dilution of inbound link equity.
  • Self-cannibalization in the SERPs (where multiple versions of the same URL compete for rankings).
  • Inefficient crawling, where search engines waste crawl budget fetching the same content over and over.

To avoid canonicalization issues, use a single URL for each piece of publicly available content on the site.

The preferred solution is to use 301 redirects to redirect all non-canonical versions of URLs to the canonical version.

Decide which URL structure and format to use early in the web development phase and use that as the canonical version of the URL.
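
A simple spot check is to compare each page's rel=canonical tag against the URL you expect it to declare. Here is a minimal sketch, assuming Python with the requests and beautifulsoup4 packages; the URL mapping is a placeholder.

import requests
from bs4 import BeautifulSoup

pages = {
    # URL we crawl -> URL we expect the canonical tag to declare
    "https://www.example.com/widgets/?sort=price": "https://www.example.com/widgets/",
    "https://www.example.com/widgets/": "https://www.example.com/widgets/",
}

for url, expected in pages.items():
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    canonical = tag.get("href") if tag else None
    status = "OK" if canonical == expected else "MISMATCH"
    print(f"{status}: {url} -> canonical {canonical}")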

21. Are the pictures on the site too big?

If the images on your website are too large, you run the risk of encountering load time issues, especially with Core Web Vitals.

If your page is loading 2 MB of images, that is a significant problem: it is unnecessary weight, and it is a missed opportunity if you never identify the issue in the first place.

Run a Lighthouse report to see which images can be compressed.
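
If you prefer a script to complement the Lighthouse report, you can also list every image on a page along with its transfer size. Here is a minimal sketch, assuming Python with the requests and beautifulsoup4 packages; the page URL and the 200 KB threshold are placeholder values.

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page_url = "https://www.example.com/"
threshold_bytes = 200 * 1024  # flag anything over roughly 200 KB

soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")

for img in soup.find_all("img", src=True):
    img_url = urljoin(page_url, img["src"])
    head = requests.head(img_url, allow_redirects=True, timeout=10)
    # Some servers omit Content-Length on HEAD; those images are skipped here.
    size = int(head.headers.get("Content-Length", 0))
    if size > threshold_bytes:
        print(f"{size / 1024:.0f} KB: {img_url}")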

22. Is the site missing the schema markup for videos?

It is also possible to add Schema.org structured data to videos.

The VideoObject type lets you mark up all of your videos with schema.

Make sure to add transcripts for videos on the page for accessibility and content.

23. Does the site have all of the required page titles?

Missing SEO titles on a website can be a problem.

If your website is missing an SEO title, Google can automatically generate a title based on your content.

You never want Google to automatically generate titles and descriptions.

You don't want to leave anything to chance when it comes to properly optimizing a website. Therefore, all page titles should be written manually.

24. Does the site have all the required meta descriptions?

If your website doesn't have meta descriptions, Google can automatically create one based on your content. This isn't always the one you want to add to search results.

Don't leave this to chance: always write a custom meta description for each page.
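
To catch missing titles (question 23) and missing meta descriptions at the same time across a set of pages, a simple check works. Here is a minimal sketch, assuming Python with the requests and beautifulsoup4 packages; the URL list is a placeholder that you would normally feed from a sitemap or crawler export.

import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/blog/some-post/",
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = (meta.get("content") or "").strip() if meta else ""
    if not title:
        print(f"Missing <title>: {url}")
    if not description:
        print(f"Missing meta description: {url}")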

You should also consider meta keywords. Google and Bing may have indicated that these are not used in search rankings, but other search engines continue to use them. It's a mistake to have such a narrow focus that you're only optimizing for Google and Bing.

There is also the concept of the linear distribution of keywords, which helps reinforce relevance throughout the document.

While stuffing meta keywords won't add to your rankings, adding meta keywords carefully and strategically can add relevant signals to the document.

This can only hurt if you're sending out spam and Google decides to use it as a spam signal to nail your website.

25. Is the page speed of the top landing pages longer than 2-3 seconds?

It is important to test and find out the actual page speed of your top landing pages.

This can make or break the performance of your website.

If your website takes 15 seconds to load, that's bad.

Always make sure that your site takes less than a second to load.

While Google's recommendation is 2-3 seconds, the name of the game is to be faster than the recommendation and faster than your competition.

This, too, will only gain in importance with the Page Experience Update and the Core Web Vitals.

26. Does the website use browser caching?

It is important to take advantage of browser caching as it is a component of faster website speed.

To take advantage of browser caching, you can add the following lines of code to your .htaccess file. Please be sure to read the documentation on how to use them.

PLEASE NOTE: Use this code at your own risk and only after reading the documentation on its use. The author assumes no liability that this code does not work for your website.

## EXPIRES CACHING ##
ExpiresActive On
ExpiresByType image/jpg "access 1 year"
ExpiresByType image/jpeg "access 1 year"
ExpiresByType image/gif "access 1 year"
ExpiresByType image/png "access 1 year"
ExpiresByType text/css "access 1 month"
ExpiresByType text/html "access 1 month"
ExpiresByType application/pdf "access 1 month"
ExpiresByType text/x-javascript "access 1 month"
ExpiresByType application/x-shockwave-flash "access 1 month"
ExpiresByType image/x-icon "access 1 year"
ExpiresDefault "access 1 month"
## EXPIRES CACHING ##

27. Does the website use a content delivery network?

Using a content delivery network (CDN) can increase site speed by reducing the physical distance between servers and visitors, which cuts the time it takes people in those locations to load the website.

Depending on the size of your website, a CDN can help improve performance significantly.

28. Has the website content been optimized for targeted keyword phrases?

It's usually easy to tell when a website has been properly optimized for targeted keyword phrases.

Keywords tend to stick out like a sore thumb when not used naturally.

You know how it is. Spam text optimized for widgets reads something like this: “These widgets are the most impressive widgets in the history of widgets. We promise these widgets will rock your widget world.”

Well-optimized keywords read naturally alongside the surrounding text. Studying the copy closely will likely help you identify them.

On the other hand, if there is so much spammy text that it hurts the reader's experience, it may be time to scrap some of the content and rewrite it from scratch.

29. How deeply has the website content been optimized?

Just as there are different levels of link acquisition, there are also different levels of content optimization.

Some optimizations are done at the surface level, depending on the initial scope of the content execution mandate.

Other optimizations are more detailed, with images, links, and keywords fully optimized.

Some questions you may want to ask to make sure your website content is properly optimized include:

  • Does my content contain targeted keywords throughout the body copy?
  • Does my content have headings that have been optimized with keyword variations?
  • Where appropriate, does my content include lists, images, and quotations? Don't add these elements randomly; they should be contextually relevant and support the content.
  • Does my content include bold and italic text for emphasis where necessary?
  • Does my content read well?

30. Did you do keyword research on the website?

Adding keywords everywhere without a strategy doesn't work well. It feels unnatural and strange to read.

You need to know things like search volume, how to target those words appropriately, and how to figure out what to do next.

This is where keyword research comes in.

You wouldn't create a website without first researching your target market, would you?

Likewise, you wouldn't write any content without doing targeted keyword research.

31. Has the content of the website been proofread?

Did you proofread the content of your website before posting?

I can't tell you the number of times I've performed an audit and found silly mistakes like grammatical errors, misspellings, and other major issues.

Make sure you proofread your content before posting. It saves a lot of editing work down the line, especially in situations where the SEO pro would otherwise have to do much of the editing.

32. Have the images on the website been optimized?

Image optimization includes keyword phrases in the file name, appropriate dimensions and file size, fast load times, and making sure the images are optimized for Google Image Search.

Image sizes should fit the design of your website rather than looking out of place.

You wouldn't include images that would be completely irrelevant if you did the marketing right, would you?

Also, don't add images that appear to be completely spamming your audience.

33. Is the website following web development best practices?

This is a big one.

Websites violate the basics of web development best practices in any number of ways – from polyglot documents, to code that fails W3C validation, to excessive load times.

Now, I know I will hear from developers that some of my usual requirements are "unrealistic."

However, if you've been practicing these development techniques for years, they aren't all that difficult.

It just takes a slightly different mindset than you may be used to – the familiar mentality of always building the biggest, best, and therefore most impressive website you can.

Instead, the mindset should be to create the lightest, least resource-intensive site.

Yes, I know a lot of websites don't follow W3C standards. But if your client is paying you and requests it, you need to know the details and how to make sure your site passes the validator.

Coming up with excuses will only make you look unprofessional.

  • Are things like loading times of 1-2 seconds unrealistic? Not if you are properly using CSS sprites and lossless compression in Adobe Photoshop.
  • Are fewer than 2-3 HTTP requests unrealistic? Not if you structure the site properly and get rid of unnecessary WordPress scripts that take up valuable code real estate.
  • Do you want to get some quick page load times? Get rid of WordPress completely and code the site yourself. You will remove at least 1.5 seconds of loading time due to WordPress alone.

Stop being an armchair developer and become a professional web developer. Broaden your horizons!

Think different.

Be different. Be honest. Be the best.

Stop thinking that web development best practices are unrealistic – because the only unrealistic thing is your attitude and how much you don't want to work hard.

And learn something new, instead of machine-gunning out website development work in the name of profit.

34. Was an HTTPS migration carried out correctly?

If you are setting up your site for proper HTTPS migrations, you will need to purchase a website security certificate.

One of the first steps is to complete the purchase of this certificate. If you don't do this step correctly, you could completely screw up your HTTPS migration later.

Suppose you purchased an SSL certificate but chose an option that only covers a single subdomain. You may have accidentally created hundreds of errors simply by selecting the wrong option during the purchase process.

For this reason, it is best to always consider at least a wildcard SSL certificate that covers all variants of your domain.

This is usually a little more expensive, but it prevents these types of errors.

35. Has a new disavow file been sent to the correct Google Search Console profile?

You would be surprised how often this comes up during website audits: sometimes a disavow file was never submitted to the new HTTPS profile in Google Search Console (GSC).

Sometimes an HTTPS profile was never created in GSC at all, and the current profile either under-reports or over-reports, depending on the implementation.

Fortunately, the update is pretty easy – just make sure to transfer the old HTTP disavow file to the new HTTPS profile and update it regularly.

36. Have the GSC settings been transferred to the new account?

Settings can also cause problems with an HTTPS migration.

Suppose you set up the HTTP property as www, but then set the domain in the new GSC property to non-www (or something else that differs from the original profile).

This is an example where incorrect GSC settings can cause problems with an HTTPS migration.

37. Did you make a note of the migration in Google Analytics?

Failure to note important changes, revisions, or other changes to the website may affect your decision-making later.

If exact details are not recorded as notes in Google Analytics (GA), you can end up flying blind when later website changes depend on those details.

Here is an example: assume there was a major overhaul of the content, a penalty arrived later, and in between the department head and the SEO changed.

Noting the original change in Google Analytics helps future SEO folks understand what happened before it started affecting the website in the here and now.

38. Is the implementation of social media on the website carried out correctly?

This is a common occurrence during audits.

I see cases where social media links weren't fully updated when things changed (e.g., when social media efforts were consolidated onto a particular platform), or where smaller things, like how potential customer interactions are handled, aren't strictly kosher.

These things will affect the quality of your website.

If you are constantly machine-gunning out your social posts and not properly interacting with customers, you're doing it wrong.

39. Are the lead submission forms working properly?

If a lead generation form isn't working properly, you may not be receiving all the leads you could be.

If there's a typo in an email address, or a typo in a line of code that breaks the form, it needs to be fixed.

Make regular maintenance of your lead generation forms a high priority. This helps prevent issues like under-reporting of leads and bad information being submitted.

Nothing is worse than pulling information from a form and discovering that the phone number is one digit off, or that the email address is wrong because of a coding error rather than a submission error.

40. Are lead tracking scripts working properly?

Running ongoing tests of your lead tracking scripts is critical to making sure your website is working properly.

If your lead tracking scripts ever break and you collect bad submissions over a weekend, it can turn your customer acquisition into a nightmare.

41. Is call tracking set up properly?

I remember working with a client at an agency who had call tracking set up on their website.

Everything seemed to be working correctly.

When I called the client and discussed the matter, everything still seemed to be in order.

Then we discussed the phone number, and the client mentioned that they had changed it a while back.

It was one digit off.

You can imagine the client's reaction when I told them which phone number was on the website.

It's easy to overlook something as simple as checking the phone number amid ever more complex website optimizations.

That's why it's important to always take a step back, test things, and talk to your client to make sure your implementations are working correctly across the board.

42. Is the site using excessive inline CSS and JavaScript?

To touch on an earlier topic, inline CSS and JavaScript are terrible once they become excessive.

Falling back on inline implementations of CSS and JavaScript leads to excessive render times in the browser and can hurt cross-browser and cross-platform functionality.

It's best to avoid these in your web development: make sure new styles are always added to the CSS stylesheet, and that new JavaScript is written and referenced properly in external files rather than inline.

43. Are the correct GSC / GA accounts linked? Are they properly linked?

You wouldn't believe how often this has happened when I've taken over a website: I look at the GSC or GA accounts, and they aren't reporting correctly or otherwise aren't working.

It turns out the GA or GSC account was switched to a different account at some point and nobody bothered to update the website accordingly. Or some other strange scenario.

That's why it is doubly important to always verify the GSC and GA accounts and make sure the correct profiles are implemented on the site.

44. Are your URLs too long?

Making sure URLs are relatively short and avoiding extra-long URLs (over 100 characters) helps prevent user experience problems.

It's important to note that much longer URLs can cause user experience issues.

When in doubt, if you have two URLs to choose between in a redirect scenario and one is shorter than the other, use the shorter version.

It is also considered a standard SEO best practice to keep URLs under 100 characters – the reason comes down to usability and user experience.

Google can process longer URLs, but shorter URLs are much easier to parse, copy and paste, and share on social media.

Things can also get pretty messy. Longer URLs, especially dynamic ones, can wreck your analytics data.

Suppose you have a dynamic URL with parameters.

For whatever reason, this URL is updated several times a month, generating new variations of the same URL with the same content and updated parameters.

If URLs in this situation are very long, it can be difficult to sift through all of the analytics data and identify what is what.

This is where shorter URLs come into play. They make such a process easier to manage, help ensure one page URL per unique piece of content, and you don't run the risk of corrupting the site's reporting data.

It all depends on your industry and what you do. This advice may not make as much sense for an ecommerce site that necessarily has many URLs with parameters like these.

In that kind of situation, a different process for handling such URLs may be preferable.
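
If you want to flag over-long URLs systematically, the check only takes a few lines. Here is a minimal sketch in Python that reads URLs from a plain text file (one per line, such as a crawler or sitemap export) and flags anything over 100 characters; the file name is a placeholder.

LIMIT = 100  # characters, per the rule of thumb above

with open("urls.txt", encoding="utf-8") as handle:
    urls = [line.strip() for line in handle if line.strip()]

long_urls = [u for u in urls if len(u) > LIMIT]

for url in sorted(long_urls, key=len, reverse=True):
    print(f"{len(url)} chars: {url}")

print(f"{len(long_urls)} of {len(urls)} URLs exceed {LIMIT} characters.")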

45. How Targeted Are Keywords on the Site?

You can have the best content in the world.

Your technical SEO can exceed 100 percent and be the best, fastest-loading website ever. But, in the end, keywords are the name of the game.

Keyword queries are how Google understands what people are searching for.

The more targeted your keywords are, the easier it is for Google to discern where to place your site in the search results.

What exactly is meant by targeted keywords? These are any words that users use to find your site that are mapped to queries from Google.

And what is the best, most awesome method to use to optimize these keywords?

The keyword optimization concept of linear distribution applies. It’s not about how many keywords you can add to the page, but more about what linear distribution tells the search engines.

It is better to have keywords sprinkled throughout the text evenly (from the title tag, description, and meta keywords down to the bottom of the page) than stuff everything up the wazoo with keywords.

I don’t think you can randomly stuff keywords into a page with high keyword density and make it work for long. That’s just random keyword spamming, and the search engines don’t like that.

There is a significant difference between spamming the search engines and keyword targeting. Just make sure your site adheres to proper keyword targeting for the latter and that you are not seen as a spammer.

46. Is Google Analytics Even Setup Properly on the Site?

Expanding on the earlier discussion about linking the right accounts: even just setting up Google Analytics correctly can be overlooked by the most experienced SEO professionals.

While not always identified during an audit, it is a detail that can wreak havoc on reporting data later.

During a site migration or redesign, it can be easy to miss an errant Google Analytics installation or to assume the current implementation is working correctly.

Even during domain changes, overall technical domain implementations, and other site-wide changes, always ensure the proper Google Analytics and GSC implementations are working and set up properly on-site.

You do not want to run into the situation later where an implementation went wrong, and you don’t know why content was not correctly performing when it was posted.

47. Is Google Tag Manager Working Properly?

If you use Google Tag Manager (GTM) for your reporting, it is also important to test to ensure it is working correctly.

If your reporting implementations are not working, they can end up underreporting or over-reporting, and you can make decisions based on false positives being presented by errant data.

Using preview and debug modes, Google Tag Assistant, and using Screaming Frog can be excellent means to that end.

For example, identifying pages that don’t have Google Tag Manager code added is easy with Screaming Frog.

Using custom search and extraction can help you do this. This method can find pages that, quite simply, just do not have GTM installed.
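
The same idea also works as a small script if Screaming Frog isn't handy: fetch each page and look for the GTM container snippet in the HTML. Here is a minimal sketch, assuming Python with the requests package; the URL list is a placeholder, and the string checks assume the standard gtm.js snippet.

import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/contact/",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    # The standard GTM snippet references googletagmanager.com and a GTM-XXXX container ID.
    has_gtm = "googletagmanager.com/gtm.js" in html or "GTM-" in html
    print(f"{'OK' if has_gtm else 'MISSING GTM'}: {url}")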

Google Tag Assistant, a Chrome extension for Google Tag Manager, can help you troubleshoot GTM, Google Analytics, and AdWords.

It works by recording a browsing session and then reporting on everything that happened and how those data interactions will show up in GA.

48. Does the Site Otherwise Have Any Other Major Quality Issues?

Other quality issues that can impact your site include lousy design.

Be honest.

Look at your site and other competitors in the space.

How much do you like your site in comparison to those competitors? This is one of those things that you just can’t put a finger on.

It must be felt out or otherwise navigated through intangibles like a gut instinct. If you don’t like what your design is doing, it may be time to go back to the drawing board and start again from scratch.

Other issues to keep an eye out for include errors within the content, grainy images, plug-ins that aren’t working, or anything that negatively impacts your site from a reporting point of view.

It may not even be a penalty either. It may merely be errant reporting due to a plug-in implementation that went wrong.

49. Is Your Reporting Data Accurate?

A consistent theme I wanted to weave throughout this article is inaccuracy in reporting.

We often rely on GSC and GA reporting data when making decisions, so it is essential to make sure that your GSC and GA implementations are 100 percent correct.

Dark traffic, or hidden traffic, can be a problem if not dealt with properly.

This can skew a massive chunk of what you think you know about your visitor traffic statistics.

That can be a major problem!

Analytics platforms, including Google, have a hard time tracking every single kind of traffic source.

50. Is My Website Actually Done?

A website is never done!

Evaluating the quality of a website is an ongoing process that is never truly finished. It’s important to stick to a regular schedule.

Perhaps schedule website audits to occur every year or so.

That way, you can continue an evaluation process that identifies issues and gets them in the development queue before they become problems.
