Time Series for 132 kb/second data

Hi guys. I could use some input from MongoDB veterans.

My company is thinking about using MongoDB to store incoming sensitive data every 5 minutes. The data we retrieve from the 3rd party is already valid JSON, which helps. The issue is that we need to make these API calls via Azure Functions to fetch the updated data (132 kb per second).

Are time-series collections appropriate for this? From my understanding, we can't store a lot of data with them.

Hi @Gindu_Dheer,

I’m wondering if the new Data API would work in this situation. I’m not that familiar with the Azure platform, so I can’t tell whether you can use a MongoDB driver in Azure Functions. If you can’t, you can use the Data API to write the incoming data into your MongoDB collection over HTTPS.
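To illustrate, here is a minimal sketch of what a Data API `insertOne` call could look like from plain Python. The App ID in the URL, the API key, and the cluster/database/collection names are all placeholders I made up, not values from your setup:

```python
import json
import urllib.request

# Placeholder endpoint -- substitute your own Data API App ID.
DATA_API_URL = (
    "https://data.mongodb-api.com/app/<app-id>/endpoint/data/v1/action/insertOne"
)

def build_insert_request(reading: dict, api_key: str) -> urllib.request.Request:
    """Build the Data API insertOne request for one incoming sensor reading."""
    payload = {
        "dataSource": "Cluster0",    # placeholder cluster name
        "database": "telemetry",     # placeholder database name
        "collection": "sensor_data", # placeholder collection name
        "document": reading,         # the JSON you got from the 3rd party
    }
    return urllib.request.Request(
        DATA_API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )

# To actually send it from your Azure Function:
# urllib.request.urlopen(build_insert_request(incoming_json, my_api_key))
```

Since it’s just an HTTPS POST, this works from anywhere Azure Functions can make outbound HTTP calls, with no driver dependency.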

Apart from that, time-series collections are made precisely for time-series data: you specify the timeField, metaField, and granularity, which MongoDB then uses in the background to implement the bucket pattern and optimize queries & aggregations.
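To make that concrete, here is a sketch of creating such a collection with PyMongo; the collection and field names are just illustrative assumptions for a 5-minute sensor feed:

```python
# Time-series options for a 5-minute sensor feed. Field names here
# ("ts", "sensorId", "sensor_data") are assumptions, not from your schema.
ts_options = {
    "timeField": "ts",         # required: the BSON date of each measurement
    "metaField": "sensorId",   # optional: label identifying the data source
    "granularity": "minutes",  # roughly matches a 5-minute ingest interval
}

def create_sensor_collection(db):
    """Create a time-series collection (MongoDB 5.0+ / PyMongo 3.12+).

    MongoDB uses timeField and metaField behind the scenes to bucket
    documents together and speed up time-based queries and aggregations.
    """
    return db.create_collection("sensor_data", timeseries=ts_options)

# Usage (requires a running MongoDB 5.0+ deployment):
# from pymongo import MongoClient
# db = MongoClient("mongodb://localhost:27017")["telemetry"]
# create_sensor_collection(db)
```

Choosing a granularity close to your real ingest rate matters, because it controls how documents are grouped into buckets.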

Time-series collections are a fairly recent addition to the MongoDB ecosystem, and I wouldn’t be honest if I didn’t mention that they come with a bunch of restrictions which can be a problem in certain cases. For example, you mentioned that your data is sensitive, so you might consider using Client-Side Field Level Encryption (CSFLE) to protect it; however, time-series collections don’t support CSFLE, at least for now.