Without knowing what the data is for, how static it is, how much of it there is, your budget, or what the queries are doing, I'm going to put down an idea I've often used to work around systems with no simple rate limiting. The main reason: users will often find ways around rate limiting anyway, so let's explore another possibility. It rests on some assumptions that may not hold for your situation, but hopefully it helps.
1 Word: Caching
Instead of rate limiting, if possible, ask: how can I remove the desire for someone to spam requests in the first place? One option is something like Cloudflare, where the data is served from a host piped through Cloudflare and you let them handle the requests. Services like Cloudflare often include automatic rate limiting (they can ban IPs that spam too much too quickly, if you want that), but more importantly they offer good caching with very little effort. A Cloudflare Worker, for instance, could query your data and cache the result for quite a long time.
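The core idea, independent of any particular provider, is a time-to-live (TTL) cache in front of the expensive query. Here's a minimal in-process sketch in Python; `expensive_query`, `get_data`, and the 10-minute TTL are all illustrative names and numbers, not any real API:

```python
import time

# Hypothetical stand-in for whatever slow/expensive data source you have.
CALL_COUNT = 0

def expensive_query():
    global CALL_COUNT
    CALL_COUNT += 1
    return {"data": "result", "computed_at": time.time()}

_cache = {}        # key -> (expires_at, value)
TTL_SECONDS = 600  # cache for 10 minutes

def get_data(key="default"):
    """Serve from cache while fresh; only hit the backend on a miss."""
    now = time.time()
    entry = _cache.get(key)
    if entry and entry[0] > now:
        return entry[1]
    value = expensive_query()
    _cache[key] = (now + TTL_SECONDS, value)
    return value

# 1 request or 1000: within a TTL window the backend is queried once.
for _ in range(1000):
    get_data()
print(CALL_COUNT)  # 1
```

A CDN like Cloudflare does essentially this for you at the edge, so the requests never even reach your server while the cache is warm.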
This is just one quick example; keep an open mind and explore the many possibilities. If you can cache it, cache it and don't rate limit, because once you can cache it you have far more options available. You could also use a service like Google Cloud Functions, AWS Lambda, or App Service timer triggers to periodically push the results users call for most often into a cache/storage/hosting layer that doesn't require you to open the gates to direct user interaction needing rate limits. If you go the caching route and do it correctly, it won't matter whether they make 1 request or 1,000 requests in 10 minutes: if your cache says 10 minutes, it's going to be 10 minutes before they see new data. People don't really spam when they're going to get the same answer no matter what.
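The timer-driven variant above can be sketched like this: a scheduled job precomputes popular results into a store, and user-facing requests only ever read that store, so users never touch the backend directly. All the names here (`scheduled_refresh`, `results_store`, etc.) are hypothetical; in a real deployment the scheduled job would be a cron-triggered cloud function and the store would be object storage, Redis, or a CDN:

```python
# In-memory stand-in for a shared store (object storage, Redis, CDN, ...).
results_store = {}

def expensive_report(report_id):
    # Placeholder for the slow query users keep asking for.
    return f"report-{report_id}-contents"

def scheduled_refresh(popular_ids):
    """Runs on a timer (say, every 10 minutes), never per user request."""
    for rid in popular_ids:
        results_store[rid] = expensive_report(rid)

def handle_user_request(report_id):
    """User requests only read precomputed data; no direct backend access."""
    return results_store.get(report_id, "not ready yet")

scheduled_refresh(["daily", "weekly"])
print(handle_user_request("daily"))   # report-daily-contents
print(handle_user_request("hourly"))  # not ready yet
```

Since the request path can't trigger the expensive work at all, spamming it buys the user nothing.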
Good luck!