One of the great things about doing SEO at an agency is that you're constantly working on projects you may never have researched before. At an agency, you see such a wide variety of websites that you can develop a more holistic perspective on the algorithm and work with all kinds of unique problems and implementations.

This year, one of the most interesting projects we've worked on at Go Fish Digital revolved around helping a large media company get into Google's Top Stories for major one-day events.

While doing competitive research for the project, we found that one way many websites seem to be doing this is by using a schema type called LiveBlogPosting. This led us to a fairly thorough study of what this structured data type is, how websites use it, and what impact it could have on Top Stories visibility.

Today I want to share everything we've learned about this schema type and draw conclusions about what it means for the future of search.

Who does it apply to?

The LiveBlogPosting schema is most relevant to websites where getting into Google's Top Stories is a priority. These are usually publishers that post news on a regular basis. Ideally, AMP will already be implemented, as the vast majority of Top Stories URLs are AMP-compliant (though this is not required).

Why non-publisher websites should still care

Even if your website isn't a publisher competing for Top Stories results, this article may still provide useful information. You may not be able to implement the structured data directly at this point, but I think we can use these findings to draw conclusions about where search engines might be heading.

If Google is rewarding articles that are regularly updated, and even granting rich features to that content, it could be an indication that Google is trying to incentivize the creation of more real-time content. This structured data may be an attempt to help Google fill a gap it has in providing real-time results to its users.

While it makes sense that "freshness" ranking factors apply most to publishers, there are interesting tests non-publishers can run to gauge whether keeping their site's content fresh makes a difference.

What is the LiveBlogPosting schema?

The LiveBlogPosting schema type consists of structured data that you can use to signal to search engines that your content is being updated in real time. This gives search engines contextual signals that the page is being updated frequently for a period of time.

LiveBlogPosting is defined on Schema.org as a subtype of the Article structured data type. The official definition is: "A blog post intended to provide a rolling textual coverage of an ongoing event through continuous updates."
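On a live page, the type is typically embedded as a JSON-LD block in the page's HTML. Here's a minimal skeleton of the type; all values are placeholders for illustration, not taken from any real site:

```json
{
  "@context": "https://schema.org",
  "@type": "LiveBlogPosting",
  "headline": "Live coverage of an ongoing event",
  "coverageStartTime": "2020-11-03T08:00:00-05:00",
  "coverageEndTime": "2020-11-03T23:59:00-05:00",
  "liveBlogUpdate": []
}
```

The coverage window tells search engines when to expect frequent changes, and each new post during the event is appended to the liveBlogUpdate array (covered in more detail later in this article).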

Imagine a columnist watching a soccer match and live-blogging it. After every notable play, the columnist updates the post with what happened and its outcome. Each time they do, the structured data is also updated to indicate that the article was recently changed.

Articles with LiveBlogPosting structured data are often shown in Google's Top Stories feature. A "Live" indicator appears in the upper-left corner of the thumbnail to signal to users that live updates are being made to the page.

Two Top Stories results with the "Live" tag

The image above shows an example of two publishers (The Washington Post and CNN) implementing the LiveBlogPosting schema on their pages for the term "coronavirus." It is likely that this structured data type greatly improves their Top Stories visibility.

Why is this structured data important?

Now, you may be wondering why this schema matters in the first place, especially if you don't have the resources to have an editor posting updates to content all day long.

We have monitored Google's use of this structured data, particularly among publishers. Stories with this structured data type appear to have significantly improved visibility in the SERPs, and we can see publishers using it aggressively for large events.

For example, the following screenshot shows the mobile SERP for the query "US election" on November 3, 2020. Note that four of the seven results in the carousel use the LiveBlogPosting schema. Also, under that carousel, you'll see that the same CNN page is included in the organic results with the "Live" tag next to it:

Now let's look at the same query the day after the election, November 4, 2020. We still see publishers making heavy use of this structured data type. In this result, five of the first seven Top Stories results use it.

In addition, CNN is able to "double-dip" and claim an additional organic result with the same URL already shown in Top Stories. This is another common outcome of implementing LiveBlogPosting.

In fact, this type of live blog was one of CNN's core strategies for ranking well during the US election.

Here's how they implemented this strategy:

  1. Create a new URL each day (to signal freshness)
  2. Apply the LiveBlogPosting schema and update that URL continuously
  3. Make sure each update has its own timestamp

Below are some examples of URLs CNN posted during this event. Each day, a new URL was published with LiveBlogPosting markup attached:

Here's another telling result for the November 4, 2020 "US election" query. We can see that The New York Times ranked #2 on mobile for that term. While the ranking page isn't a live blog, an AMP carousel appears beneath the result. Their strategy was to live-blog the results of each individual state:

It's clear that publishers use this schema type heavily for highly competitive news queries around big events. We often see this strategy lead to outstanding visibility in Top Stories and even in the organic results.

How do you implement the LiveBlogPosting schema?

So you have a big event you want to optimize for, and you want to implement the LiveBlogPosting schema. What should you do?

1. Get whitelisted

The first thing you need to do is get Google to whitelist you. If a Google representative is in contact with your organization, I recommend reaching out to them. There isn't a lot of information on this, and we can even see that Google previously removed its help documentation for it. However, the form to request access to the Live Coverage Pilot is still available.

This makes sense, as Google may not want news sites of questionable credibility to access this feature. The fact that Google limits the number of websites with access is another indication that the feature could be quite powerful.

2. Technical implementation

Next, you'll need to work with a developer to implement LiveBlogPosting structured data on your site. There are several key properties to consider, such as:

  1. coverageStartTime: When the live blog's coverage starts
  2. coverageEndTime: When the live blog's coverage ends
  3. liveBlogUpdate: A property representing an update to the live blog. This is perhaps the most important property. Each update includes:
    1. headline: The headline of the blog update
    2. articleBody: The full text of the blog update
    3. datePublished: The time the update was originally published
    4. dateModified: The time the update was last modified

Below is an example of how CNN implemented this on one of their live blogs. The example shows two "liveBlogUpdate" properties reporting on the November 3, 2020 election.
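To give a concrete sense of the structure, here is a hypothetical sketch of what such JSON-LD markup typically looks like. All headlines, body text, and timestamps below are placeholder values for illustration, not CNN's actual markup:

```json
{
  "@context": "https://schema.org",
  "@type": "LiveBlogPosting",
  "headline": "Election Day 2020: live results and updates",
  "coverageStartTime": "2020-11-03T06:00:00-05:00",
  "coverageEndTime": "2020-11-03T23:59:00-05:00",
  "liveBlogUpdate": [
    {
      "@type": "BlogPosting",
      "headline": "Polls open on the East Coast",
      "articleBody": "Voting has begun in several East Coast states...",
      "datePublished": "2020-11-03T07:01:00-05:00",
      "dateModified": "2020-11-03T07:05:00-05:00"
    },
    {
      "@type": "BlogPosting",
      "headline": "First exit polling data released",
      "articleBody": "Early exit polls suggest record turnout...",
      "datePublished": "2020-11-03T17:30:00-05:00"
    }
  ]
}
```

Note that each entry in the liveBlogUpdate array carries its own datePublished timestamp, which is what signals to search engines that new content was just added to the page.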

Case study

As mentioned, many of these findings were uncovered while researching for a specific client interested in improving visibility for several large one-day events. Because the client was agile, they were able to get LiveBlogPosting structured data up and running on their site in a relatively short time. We then tested whether this structured data helped improve visibility for very competitive "head" keywords throughout the day.

While we can't share too much information about each win we saw, visibility improved significantly for the competitive terms the live blog was associated with. Looking in Search Console, we saw year-over-year improvements in clicks and impressions between +200% and +600% for many of these terms. Sampling throughout the day, we often found the live blog ranking in positions 1-3 (the first carousel) of Top Stories. The implementation appeared to be a huge success in improving visibility for this section of the SERPs.

Google vs. Twitter and the need for real-time updates

The question then arises: why is Google giving the LiveBlogPosting structured data type such prominence? Is it because the page is likely to have very comprehensive content? Does it improve E-A-T in some way?

My interpretation is that the success of this feature highlights one of the weaknesses of a search engine, and how Google is trying to compensate for it. One of the core problems for a search engine is that operating in real time is much harder. When something happens in the world, it takes time for search engines to surface that information to users. Not only must the information be published, but Google must then crawl, index, and evaluate it.

By that point, the news may already be available on platforms like Twitter. One of the main reasons users navigate away from Google to Twitter is that they're looking for information they want to know now, rather than waiting 30 minutes to an hour for it to appear in Google News.

For example, if I'm watching the Steelers and see that one of our players is unfortunate enough to get injured, I don't start searching Google hoping the answer will surface. Instead, I immediately jump over to Twitter and refresh like crazy to see if a sports beat writer has posted any news about it.

I believe this is why Google created a schema type that signals a page is being updated in real time. It gives Google confidence that a trusted publisher has created content that should be crawled and served to users much more frequently, since the information is more likely to be current and accurate. By granting rich features and better visibility to articles that use this structured data, Google incentivizes the creation of real-time content that keeps searchers on its platform.

This also suggests that showing search engines that content is current and regularly updated may be an increasingly important factor in the algorithm. Dan Hinckley, CTO of Go Fish Digital, suggested that search engines may need to give preference to recently updated articles, because Google may not be able to trust that older articles still contain accurate information. For a search engine to trust the accuracy of its results, then, it may be important that content is kept up to date.


You really never know what paths your SEO work will take, and this has been by far one of the most interesting ones in my time in the industry. By examining this one example, we not only figured out part of the Top Stories algorithm, but also gained insight into where the algorithm may be headed.

It's entirely possible that Google will continue to incentivize and reward "real-time" content in order to better compete with platforms like Twitter. I'd be very interested to see further research on the LiveBlogPosting schema, or on Google's continued preference for freshly updated content.

