Botify's BQL API can be used in two ways: interactive queries and asynchronous exports.
This section walks through a few questions to help you choose the best-suited tool, along with information about the limitations of each.
Do you need to retrieve a large volume of data? If yes, an export is the way to go.
For example, if you want to retrieve data for every URL Botify crawled on your website, the result can consist of millions of rows, a case very well suited for exports.
Do you need fresh data about a single item in real time? For example, in a CMS you might want to display Botify SEO data when viewing one specific page.
In this case, a query is better suited, since you fetch the data in real time and get the freshest information.
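As a sketch, a single-page BQL query scoped to one URL might be built like this. The collection identifier, field names, and filter structure below are illustrative assumptions for the example, not the exact schema of your project:

```python
# Hypothetical BQL query payload for fetching SEO data about one URL.
# The collection id ("crawl.20240101"), field names, and filter shape
# are assumptions for illustration; check your project's actual schema.
def build_single_url_query(url):
    """Return a BQL-style query body restricted to a single URL."""
    return {
        "collections": ["crawl.20240101"],  # assumed collection id
        "query": {
            "dimensions": ["url"],
            "metrics": [],
            "filters": {"field": "url", "predicate": "eq", "value": url},
        },
    }

payload = build_single_url_query("https://example.com/page")
```

The payload would then be sent as the JSON body of the query request; because it targets one URL, a single page of results is enough and no pagination is needed.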
Do you want to feed an internal data lake? You might have an internal data lake into which you want to pipe your SEO data.
In that case, export your data directly into the data lake so that it can be ingested and update your reports automatically.
Your collections (the data sources available for querying) update at different frequencies. The two main cases are:
- Crawl data: updated each time a crawl runs, which depends on your recurrence settings and your plan. The crawl frequency is often monthly, weekly, or daily.
- Other data sources: updated daily with fresh data.
Querying your data interactively through the API has a response limit of 2000 rows per request. The API supports pagination, but if you need more than 10 pages, that is a sign an export would probably be more appropriate.
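The pagination trade-off can be sketched as a loop that stops and recommends an export once too many pages are needed. Here `fetch_page` is a stand-in for the real HTTP call; the 2000-row page size and 10-page threshold come from the limits described above:

```python
PAGE_SIZE = 2000   # interactive query response limit
MAX_PAGES = 10     # beyond this, an export is the better tool

def collect_rows(fetch_page, max_pages=MAX_PAGES):
    """Paginate through query results, stopping early if the dataset
    is too large for interactive querying.

    fetch_page(page) -> list of rows (at most PAGE_SIZE each); it is
    a stand-in for the real API call, not an actual Botify client.
    Returns (rows, needs_export).
    """
    rows = []
    for page in range(1, max_pages + 1):
        batch = fetch_page(page)
        rows.extend(batch)
        if len(batch) < PAGE_SIZE:
            return rows, False   # short page: this was the last one
    return rows, True            # hit the page cap: switch to an export

# Usage with a fake in-memory data source of 4500 rows (3 pages):
data = [{"url": f"https://example.com/{i}"} for i in range(4500)]
rows, needs_export = collect_rows(
    lambda p: data[(p - 1) * PAGE_SIZE : p * PAGE_SIZE]
)
```

With 4500 rows the loop fetches three pages and finishes normally; a dataset larger than 20,000 rows would hit the cap and set `needs_export` to `True`.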
Exporting your data creates a job in a queue system, so it is not prioritized the same way an interactive query is.