Googlebot is pretty good, actually very good, at backing off when its crawling causes a website to stop loading. Googlebot was designed to recognize this and withdraw. But if you have a page, or a set of pages, that is very slow because your database queries are inefficient, Google says don't blame them.
Gary Illyes, Google's self-described house elf and Chief of Sunshine and Happiness (he has many titles), wrote on Twitter: "If you have a site that uses a significant amount of server CPU when accessed – possibly due to database queries? – you need to fix/optimize it. If Googlebot eats up your server quota because it accesses that page, that's on you, not Googlebot."
If you have a page that uses a significant amount of server CPU to access – possibly due to database queries? – you have to fix/optimize it. If Googlebot uses up your server quota because it accesses this page, that's on you, not Googlebot pic.twitter.com/yvheCEYg33
– Gary Illyes (@methode) July 16, 2020
If only Google could automatically help you optimize your queries, add some indexes or something …
But the truth is, yes, Gary is 100% right here. Make sure your database queries are built to handle traffic, not only from Googlebot but also from the users Google Search may send your way.
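To make the advice concrete, here is a minimal sketch of the kind of fix Gary is describing, using SQLite as a stand-in for whatever database your site runs on. The table and column names are hypothetical; the point is that a query scanning the whole table on every page load burns CPU for every visit, crawler or human, while a single index turns it into a cheap lookup.

```python
# Hypothetical "posts" table: a page looks up one row by its URL slug.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, slug TEXT, body TEXT)")
conn.executemany(
    "INSERT INTO posts (slug, body) VALUES (?, ?)",
    [(f"post-{i}", "lorem ipsum") for i in range(10_000)],
)

query = "SELECT body FROM posts WHERE slug = ?"

# Without an index, the database scans every row on each request.
plan_before = conn.execute(f"EXPLAIN QUERY PLAN {query}", ("post-9999",)).fetchone()[-1]
print(plan_before)

# One index makes the same query an indexed search instead of a full scan.
conn.execute("CREATE INDEX idx_posts_slug ON posts (slug)")
plan_after = conn.execute(f"EXPLAIN QUERY PLAN {query}", ("post-9999",)).fetchone()[-1]
print(plan_after)
```

Running `EXPLAIN QUERY PLAN` (or its equivalent, such as `EXPLAIN ANALYZE` in PostgreSQL and MySQL) on the queries behind your slowest pages is the quickest way to spot the full-table scans that make a crawl feel like a denial of service.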
Forum discussion on Twitter.