  1. What are the Google Webmaster Guidelines?
  2. Make sure your keywords are relevant
  3. Make sure that pages on your site can be reached by a link from another findable page
  4. Limit the number of links on a page to a reasonable number
  5. Use the Robots.txt file to manage your crawl budget
  6. Create a useful, information-rich site and write pages that clearly and accurately describe your content
  7. Think about the words users would type to find your pages
  8. Design your site so that it has a clear conceptual page hierarchy
  9. Make sure all website assets are fully crawlable and indexable
  10. Make the important content of your website visible by default
  11. Google's Webmaster Guidelines are really guidelines

Google's Webmaster Guidelines are fundamental to achieving sustainable SEO results using techniques that meet Google's expectations.

If you don't follow the guidelines, you can expect either an algorithmic devaluation or a manual action.

In the most severe cases, you can expect to be completely banned from Google SERPs.

A full understanding of the guidelines is the only way to avoid missteps and future damage to your website. Of course, there is always more than one way to interpret the guidelines.

Fortunately, Googlers like John Mueller and Gary Illyes are available on Twitter to answer most questions about these guidelines.

The guidelines are mostly cut and dried.

However, you need to check the guidelines regularly because Google updates them. Google doesn't necessarily announce these changes, so you need to stay alert.

One of the most recent changes to the guidelines was when Google added a line recommending that your websites be coded in valid HTML and CSS and validated using the W3C validator.

This does not necessarily mean that valid code will help you in search, but it can help from a user experience perspective thanks to better cross-browser compatibility.

This is a different type of guide than most of the others.

The aim of this guide is to provide possible solutions to common problems, so that you have actionable advice you can use the next time you launch a website.

Doesn't that sound like fun?

What are the Google Webmaster Guidelines?

These guidelines are divided into:

  • Webmaster Guidelines.
  • General guidelines.
  • Content-specific guidelines.
  • Quality guidelines.

Webmaster Guidelines are more general best practices that can help you build your website so that it appears more easily in Google searches.

The remaining guideline categories cover practices that, if ignored or violated, can keep your website from appearing in search.

General guidelines are the best practices that will help your website perform as well as it can in the Google SERPs (Search Engine Results Pages).

Content-specific guidelines are more specific to the different types of content on your website, such as images, videos, and more.

The Quality Guidelines cover all techniques that are prohibited and could result in your site being banned from the SERPs.

As if that weren't enough, using these techniques can also result in manual action being taken against your website.

These guidelines are aimed at making sure that you are not writing spam content and that you are writing content for people, not search engine spiders.

It's easier said than done, however. Building websites that adhere to Google's webmaster guidelines is a challenge.

But when you fully understand them, you have cleared the first hurdle.

The next hurdle is applying them to your websites in practice. But as with any SEO challenge, practice makes perfect!

Make sure your keywords are relevant

Your website should be easy to find. One way to determine keyword relevance is to examine the top-ranked websites and see which keywords are used in their page titles and meta descriptions.

Another option is to do a competitive analysis of the top websites in your niche. There are many tools that you can use to see how websites use keywords on the page.
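
One quick way to start is to pull each competitor's title and meta description programmatically. Here is a minimal sketch, assuming the requests and beautifulsoup4 packages are installed; the URL is a hypothetical competitor page:

import requests
from bs4 import BeautifulSoup

def get_title_and_description(url):
    # Fetch the page and parse out the <title> and meta description,
    # the two places competitors usually put their target keywords.
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "").strip() if meta else ""
    return title, description

title, description = get_title_and_description("https://www.example.com/")
print("Title:", title)
print("Meta description:", description)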

Common problem

You have a website with zero non-branded keywords. The client gave you a list of keywords and you want to optimize the website for those keywords. The problem is that the entire copy contains only branded keywords, and on-page optimization was never taken into account.

Solution

The solution here is not that easy. You would need to do keyword research and in-depth competitive analysis to find the optimal keywords to target for your market.

Make sure that pages on your site can be reached by a link from another findable page

Google recommends that every page on your website be reachable through at least one link from another page. Links make up the World Wide Web, so it makes sense that your primary means of navigation is links. These can come from your navigation menu, breadcrumbs, or contextual links.

Links should also be crawlable. Crawlable links ensure a great user experience and make it easy for Google to crawl and understand your website. Avoid generic anchor text when creating these links; instead, use keyword phrases that describe the linked page.

A siloed website architecture is best, as it strengthens the topical relevance of pages on your website and arranges them in a hierarchical structure that Google can understand. It also helps reinforce your topical focus.

Common problem

You come across a site that has orphaned pages everywhere, including in the sitemap.

Solution

Make sure that at least one link on the site points to every page you want indexed. If a page shouldn't be part of your website, either delete it entirely or noindex it.
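
To find orphans at scale, you can compare the URLs in the sitemap against the URLs actually linked from crawlable pages. Here is a minimal sketch, assuming requests and beautifulsoup4 are installed; the domain and sitemap URL are hypothetical, and the crawl is capped so this stays a rough spot check rather than a full audit:

from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

SITEMAP = "https://www.example.com/sitemap.xml"
HOME = "https://www.example.com/"

# Collect every URL the sitemap claims exists. Sitemap tags are lowercase,
# so html.parser finds <loc> without needing an XML parser installed.
sitemap_urls = {loc.get_text(strip=True)
                for loc in BeautifulSoup(requests.get(SITEMAP, timeout=10).text,
                                         "html.parser").find_all("loc")}

# Do a small breadth-first crawl from the homepage, collecting internal links.
seen, queue, linked = set(), [HOME], set()
while queue and len(seen) < 200:  # cap the crawl; this is only a spot check
    page = queue.pop(0)
    if page in seen:
        continue
    seen.add(page)
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        url = urljoin(page, a["href"]).split("#")[0]
        if urlparse(url).netloc == urlparse(HOME).netloc:
            linked.add(url)
            queue.append(url)

# Anything in the sitemap that no crawled page links to is a candidate orphan.
for orphan in sorted(sitemap_urls - linked):
    print("Possible orphan:", orphan)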

Limit the number of links on a page to a reasonable number

In the past, Google announced that you shouldn't use more than 100 links per page.

It is better to have links that are useful to the user than to stick to a specific number. In fact, sticking to a specific number can be harmful if it negatively impacts the user experience.

Google's guidelines now state that a page should contain a few thousand links at most. It is not unreasonable to assume that Google uses an excessive number of links as a spam signal.

If you want to go beyond that, do so at your own risk. Then again, John Mueller has stated that Google doesn't care much about internal link counts and that you can do what you want.

Common problem

You have a website with over 10,000 links per page. This creates problems with Google crawling your website.

Solution

This depends on the size and type of your website. If your website requires it, reduce the number of links per page to fewer than a few thousand.
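
A quick audit script can flag the worst offenders. Here is a minimal sketch, assuming requests and beautifulsoup4 are installed; the URL and the ceiling of 3,000 links are hypothetical:

import requests
from bs4 import BeautifulSoup

def count_links(url, limit=3000):
    # Count every anchor with an href and compare against a rough ceiling.
    html = requests.get(url, timeout=10).text
    links = BeautifulSoup(html, "html.parser").find_all("a", href=True)
    verdict = "over the ceiling" if len(links) > limit else "within reason"
    print(f"{url}: {len(links)} links ({verdict})")

count_links("https://www.example.com/")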

Use the Robots.txt file to manage your crawl budget

Optimizing the crawl budget is an important part of ensuring that Google can crawl your website efficiently and easily.

This process makes it more efficient and easier for Google to crawl your website. You optimize your crawl budget in two ways: through the links on your website and through robots.txt.

The method we are primarily concerned with in this step is to use robots.txt to optimize the crawl budget. This guide from Google will tell you everything you need to know about the robots.txt file and how it can affect crawling.

Common problem

You come across a site whose robots.txt file contains the following line:

Disallow: /

This line tells crawlers not to crawl the site at all.

Solution

Delete this line.

Common problem

You come across a site whose robots.txt file doesn't include a sitemap directive, which is considered an SEO best practice.

Solution

Make sure to add a directive specifying the location of your sitemap file, such as the following:

Sitemap: https://www.example.com/sitemap.xml
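
After making changes like these, you can verify the result programmatically. Here is a minimal sketch using Python's standard-library robots.txt parser; the domain and URLs are hypothetical:

from urllib.robotparser import RobotFileParser

# Point the parser at the live robots.txt file and fetch it.
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

# Check whether Googlebot may fetch a few key URLs after the fix.
for url in ["https://www.example.com/", "https://www.example.com/blog/"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "-> crawlable" if allowed else "-> blocked by robots.txt")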

Create a useful, information-rich site and write pages that clearly and accurately describe your content

According to the guidelines, Google prefers websites with a lot of information. What counts as "a lot" is dictated by the industry, so a competitive analysis is critical to identifying the websites that are considered "information-rich".

This “information-rich” requirement varies from industry to industry, which is why such a competitive analysis is necessary.

The competitive analysis should reveal the following:

  • What other websites write about.
  • How they write about these topics.
  • How their websites are structured, among other things.

With this data, you can create a site that complies with these guidelines.

Common problem

You have a website full of thin, short content that is not valuable.

But let's be clear here: the number of words is not the be-all and end-all of the content. It's about the quality, depth and breadth of content.

Back to our website – you find it is full of thin content.

Solution

A comprehensive content strategy is required to overcome the content weaknesses of this website.

Think about the words users would type to find your pages

When doing keyword research, it's important to figure out how people are searching for your website. If you don't know the words users are typing, all the keyword research in the world is in vain.

This is where effective keyword research comes into play.

To do keyword research effectively, you need to consider things like your prospect's intent when searching for a phrase.

For example, for someone earlier in the purchase funnel, everything revolves around research. They wouldn't search for the same keywords as someone at the bottom of the buying funnel (i.e., someone who is about to buy).

Additionally, you also need to consider your prospect's mindset – what do they think when they search for these keywords?

Once you have completed the keyword research phase of your project, you need to do some on-page optimization. This on-page optimization process usually ensures that every page on your website mentions that page's target keyword phrase.

You can't do SEO without effective keyword research and targeting. Otherwise, you're not really doing SEO.
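
A simple script can spot-check that on-page work. Here is a minimal sketch, assuming requests and beautifulsoup4 are installed; the URL and keyword phrase are hypothetical:

import requests
from bs4 import BeautifulSoup

def check_keyword(url, keyword):
    # Look for the target phrase in the title, the first <h1>, and the body text.
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    keyword = keyword.lower()
    title = (soup.title.string or "") if soup.title else ""
    h1 = soup.h1.get_text() if soup.h1 else ""
    body = soup.get_text()
    print("In <title>:", keyword in title.lower())
    print("In <h1>:", keyword in h1.lower())
    print("In body text:", keyword in body.lower())

check_keyword("https://www.example.com/blue-widgets/", "blue widgets")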

Common problem

You come across a website that only has branded keyword phrases and hasn't done much to differentiate itself in the marketplace.

Through your research, you find that they haven't updated their blog with a wide variety of keyword topics and have instead focused only on branded posts.

Solution

The solution to this is pretty simple: create content around targeted keyword phrases with real topical relevance, rather than around branded keywords.

This goes back to the basics of SEO, or SEO 101: find the keywords your users would type in to find these pages, and make sure your website has those words on its pages.

In fact, this is part of Google's general guidelines for helping people better understand your pages.

Design your site so that it has a clear conceptual page hierarchy

What is a clear conceptual page hierarchy? It means that your site is organized according to topical relevance.

You arrange your site's main topics as top-level pages, with subtopics nested below them. These are called SEO silos. SEO silos are a great way to organize the pages of your website by topic.

The deeper the conceptual page hierarchy, the better. This signals to Google that your website covers the topic in depth.

There are two schools of thought on this guideline. One is that you should never stray from a flat architecture, meaning that no page should be more than three clicks from the home page.

The other school of thought involves siloing: creating a clear conceptual page hierarchy that explores the full breadth and depth of your topic.

Ideally, you need to create a website architecture that makes sense for your topic. SEO siloing helps by making sure that your website is as detailed as possible on your topic.

SEO siloing also provides a cohesive organization of topical pages and discussions. For this reason, and because SEO silos have been observed to get great results, I recommend that most websites covering dense topics build a silo architecture appropriate for the topic.
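
For example, a silo for a hypothetical landscaping site might arrange its URLs like this (the topics and paths here are purely illustrative):

/services/                      (main topic page)
/services/lawn-care/            (subtopic supporting the main topic)
/services/lawn-care/aeration/   (supporting article, one level deeper)
/services/tree-trimming/        (another subtopic in the same silo)

Each layer links up to its parent and down to its children, which keeps the topical signal concentrated within the silo.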

Common problem

You come across a website with pages scattered all over the place, with little thought given to organization, linking, or overall site architecture. The pages are effectively random, with no organizational flow.

Solution

You can fix this problem by creating a siloed website architecture in line with what your competitors are doing. The idea is that this architecture helps strengthen your topical focus and improve your rankings through entity relationships between your pages and through topical reinforcement.

This topical reinforcement then creates greater relevance for your keyword phrases.

Make sure all website assets are fully crawlable and indexable

First, consider why an asset might not be fully crawlable and indexable.

Well, there are situations where blocking CSS (Cascading Style Sheets) and JS (JavaScript) files is acceptable.

  • First, when you've blocked them because they had trouble playing well together on the server.
  • Second, when you've blocked them because of some other conflict. Google has guidelines for that, too.

Google's guidelines on the matter state:

"To help Google fully understand the content of your website, crawl any website asset that would significantly affect page rendering: such as CSS and JavaScript files that affect the understanding of the pages. " The Google indexing system renders a web page as the user would see it, including images, CSS and JavaScript files. To see which page elements Googlebot cannot crawl, use the url inspection tool to debug instructions in your robots.txt file and use the robots.txt tester tool. "

This is important. You don't want to block CSS and JavaScript.

All of the elements are important to ensure that Google fully understands the context of your page.

Mostly, website owners block CSS and JavaScript through robots.txt. Sometimes this is due to conflicts with other site files. In other cases, the files cause problems when the page is fully rendered.

If your site's files have that much trouble rendering, it's time for a complete website redesign.

Common problem

You come across a site that has CSS and JavaScript blocked in robots.txt.

Solution

Unblock CSS and JavaScript in robots.txt. And if the files conflict that badly (and are a mess in general), clean them up; you want the cleanest online presence possible.
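
For example, removing lines like the following from robots.txt (the directory names here are hypothetical) lets Googlebot fetch the assets it needs to render your pages:

Disallow: /css/
Disallow: /js/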

Make the important content of your website visible by default

Google's guidelines say that the most important content on your website should be visible by default. This means users shouldn't need buttons, tabs, or other navigation elements to display this content.

Google also states that it considers "this content less accessible to users" and believes "that you should make your most important information visible in the default page view."

Tabbed content – yes, this falls under content that is less accessible to users.

Why?

Consider the following: you have a tabbed content block on your page. Only the first tab's content is visible to users until they click the second tab to reveal it, and so on.

From Google's perspective, this type of content is less accessible.

While this may be a relatively minor consideration, you don't want to overdo it, especially not on your home page.

Make sure that all tabbed content is fully visible.

Common problem

You are assigned a website that contains tabbed content. How do you proceed?

Solution

Recommend the client to create a version of the content that is fully visible.

For example, turn tabbed content into regular paragraphs that flow down the page.

Google's Webmaster Guidelines are really guidelines

For SEOs, Google's Webmaster Guidelines are just that: guidelines. You could argue that they are merely "guidelines" and not hard rules.

But watch out: if you violate them egregiously, you could be banned from the SERPs. I'd rather stay on Google's good side. Manual actions are ugly.

Don't say we didn't warn you.

Of course, the penalties can range from algorithmic devaluations to direct manual actions. It all depends on the severity of the violation.

And individual pages and folders can be devalued when it comes to Penguin issues. Don't forget that the real-time Penguin algorithm is very granular in this regard.

However, it is important to note that not all guideline violations result in penalties. Some lead to problems with crawling and indexing, which can also affect your rankings. Others lead to serious manual actions, e.g. spammy links pointing back to your website.

When you receive a manual action, it is important to remain calm. You likely brought it on yourself through link spam or some other type of spam on your website.

The best thing you can do right now is to investigate and work with Google to remove the manual action.

Generally, if you've spent a lot of time getting into trouble, Google expects you to spend the same amount of time resolving the issues before you're back in its good graces.

In other cases, the website is in such terrible shape that the only solution is to scrap it and start over.

With this knowledge, you should be able to tell if certain techniques have caused serious problems.

That being said, it's definitely worth following Google's guidelines from the start.

Although results come more slowly than with more aggressive methods, we recommend this approach. It will help you maintain a more stable online presence, without suffering manual actions or algorithmic devaluations.

It's your decision.

Featured image credit: Paulo Bobita
