BI Solutions in MongoDB


Our company uses MongoDB Atlas (M10) and we are now evaluating visualization solutions. We currently use MongoDB Charts, but we also want to use Tableau, either through the MongoDB Atlas BI Connector or with an ETL approach into a data warehouse.

I want to know if it’s possible to connect the MongoDB BI Connector to our production cluster and build all the dashboards we need in Tableau without affecting write performance in the database (we have a mobile app in production that writes to MongoDB). Is doing this within the “good practices” of a BI solution in MongoDB, or is it preferable to move all the data to a data warehouse?

If connecting the BI Connector directly to the production cluster is good practice, I would like to know what I have to configure in the BI Connector in order to extract the data correctly.

Thank you.

Hi There! I will try to give you some information so you are able to make the best decision based on your needs. It sounds like you are happy with Charts, but want to also use Tableau for a dashboard.

You are correct that you could extract data from MongoDB, place it in another data store like a data warehouse, and then connect it to Tableau. I would personally only go this route if you have a need to do so. Some organizations already have a data warehouse with governance around it, so it makes sense for additional reporting data to land there. I have also seen data warehouses used (or have created them myself) to bring together data from various places (beyond one database) for the purpose of organizing and serving up data for analysts to report from. There are a lot of great use cases for a data warehouse, but the setup and monitoring of the ETL can be a large feat.

Alternatively, if you don’t have a specific use case or an existing scenario that requires a data warehouse, you can use the MongoDB BI Connector, which works well with Tableau. If your use case is simply to connect and build dashboards using your MongoDB Atlas data, this is probably the easiest and fastest method. As you mentioned, though, there may be some performance impact on your cluster.

Here are some things you can make note of when investigating performance:

  1. Upgrading the cluster to larger instances is better for performance
    “The MongoDB Connector for Business Intelligence for Atlas is only available for M10 and larger clusters.
    The BI Connector is a powerful tool which provides users SQL-based access to their MongoDB databases. As a result, the BI Connector performs operations which may be CPU and memory intensive. Given the limited hardware resources on M10 and M20 cluster tiers, you may experience performance degradation of the cluster when enabling the BI Connector. If this occurs, upgrade to an M30 or larger cluster or disable the BI Connector.”

  2. Add an analytics node to the cluster and have the BIC read from that node

  3. To mimic the performance impact of the BI Connector on your database, an equivalent test would be to repeatedly run your dashboard queries as MQL aggregation pipelines. The BI Connector shouldn’t put any locks on the database during the query.
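As a rough sketch of what such a test could look like: the BI Connector translates SQL into aggregation pipelines, so a Tableau query like `SELECT region, COUNT(*) FROM growers GROUP BY region` corresponds approximately to the pipeline below, which you could run repeatedly and time. All collection and field names here are made up for illustration; the commented-out driver calls assume PyMongo.

```python
# A SQL query like:  SELECT region, COUNT(*) FROM growers GROUP BY region
# roughly corresponds to this MQL aggregation pipeline (field and collection
# names are illustrative, not from the original post):
pipeline = [
    {"$group": {"_id": "$region", "count": {"$sum": 1}}},
    {"$sort": {"count": -1}},
]

# With PyMongo against an Atlas cluster it would be run as (not executed here):
#   from pymongo import MongoClient
#   client = MongoClient("mongodb+srv://user:pass@cluster.example.mongodb.net")
#   results = list(client["farm"]["growers"].aggregate(pipeline))
#
# To direct such analytical reads at an analytics node (point 2 above),
# the connection string can add:
#   ?readPreference=secondary&readPreferenceTags=nodeType:ANALYTICS

# A tiny in-memory evaluation of the same $group stage, to show what it computes:
def group_count(docs, field):
    """Count documents per distinct value of `field` (mirrors the $group stage)."""
    counts = {}
    for doc in docs:
        counts[doc[field]] = counts.get(doc[field], 0) + 1
    return counts

sample = [{"region": "north"}, {"region": "south"}, {"region": "north"}]
print(group_count(sample, "region"))  # {'north': 2, 'south': 1}
```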

I would think you would be fine using the BI Connector (though I might need to understand more of your needs) instead of a data warehouse. But I would highly suggest that you do some due diligence to test performance (as mentioned above).

To configure the BI Connector for Atlas you can reference this doc to enable and connect:
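For reference, once the BI Connector is enabled, Atlas displays the connection details you plug into Tableau’s MySQL-compatible connector. They typically take roughly this shape (hostname and username below are placeholders, not real values):

```
Host:     <hostname shown in Atlas after enabling the BI Connector>
Port:     27015  (the BI Connector's default listening port)
Username: <atlas-database-user>?source=admin
Password: <that user's Atlas password>
TLS/SSL:  required
```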

Let me know if you get stuck, or if you want to share more about your specific use case and I will be happy to help.
Best, Alexi Antonino


Thank you Alexi so much for your detailed and clear answer, it was really helpful!

This weekend I will do some performance tests with the BIC on an analytics node created in M10 to see how it performs. I think I will be able to see how many resources I’m using in the cluster monitoring inside Atlas, but I would also like to know more about mimicking the performance using MQL aggregation pipelines. The tools that I use to write queries are MongoDB Compass and PyMongo. If it’s necessary to learn another tool to make this test, tell me and I will learn it!

Regarding our use case, we are a startup working in agriculture, and we write information about growers to MongoDB. For our business it is necessary to have information on every grower at all times in order to make sure that things are going well. The growers aren’t using the app all the time, so there aren’t too many writes to the database. But it is important for us to show this information to plenty of users at different levels (growers, investors, our own business, and so on), so we need a lot of dashboards because everyone needs to see the data in a different way. We only use MongoDB as a data source; we don’t need to combine information from other sources. We have only 100 MB of data right now, but we are growing at a fast pace, so it’s important that our BI solution scales well. We probably don’t have the best MongoDB design in our database, so I’m worried about performance (but I will check this last point when doing the tests).

I have a few more questions, but I don’t want to overwhelm you. Thank you very much for your willingness to help.

Hello again Fryderyk - Thanks so much for giving me some background and context on how you’re using MongoDB and analytics. This is so helpful. I am encouraged that you are making sure your solution will scale over time. Your plan sounds good: run queries that mimic the queries needed for your dashboard objects, as MQL aggregation pipelines. You can do this with MongoDB Compass, the MongoDB Shell, or from the Atlas interface under “Browse Collections” and then the Aggregation tab. I have not personally used PyMongo, but it seems that it would give you a baseline to understand performance as is, and as you grow.
Getting a baseline of performance is key, so that you can be proactive to tune/tweak over time. I know customers with big data will adjust some db settings and/or reconsider their data model. It really all depends on variables like those you provided within your use case. So for now, get a baseline and if all is running great, set a threshold (upper limit) to make sure you make adjustments when needed.
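As a sketch of what “getting a baseline” could look like in Python: the timing helper below is generic stdlib code; only the commented-out driver call assumes PyMongo and made-up collection names.

```python
import time
import statistics

def baseline(run_query, repeats=20):
    """Time repeated executions of `run_query` and report median and p95
    latency in milliseconds, so an upper-limit threshold can be set later."""
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        run_query()
        samples.append((time.perf_counter() - start) * 1000.0)
    samples.sort()
    p95 = samples[min(len(samples) - 1, int(round(0.95 * len(samples))) - 1)]
    return {"median_ms": statistics.median(samples), "p95_ms": p95}

# In practice run_query would execute one of your dashboard aggregations,
# e.g. with PyMongo (names illustrative):
#   run_query = lambda: list(collection.aggregate(pipeline))
stats = baseline(lambda: sum(range(10_000)))  # stand-in workload for the demo
print(sorted(stats))  # ['median_ms', 'p95_ms']
```

Recording these numbers now, and re-running the same script as the data grows, gives you exactly the baseline-plus-threshold setup described above.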
Here is a link that might help you with your testing:

Not sure if you have already enabled a BIC and connected it to Tableau yet, but the process is pretty quick and documented here:

Best to you!

