Is the number of Atlas Search Dynamic Mappings unlimited?

Hello everyone!

I noticed that the docs for Dynamic Mappings make no mention of the number of fields that can be indexed, which leads me to believe that it is unlimited. Here is the text I am referring to:

Atlas Search automatically indexes the fields of supported types in each document. … Use dynamic mappings if your schema changes regularly or is unknown

I’m not doing anything fancy with types, as all searchable fields will be text in my case. However, we allow our users to create arbitrary properties, so one user might have firstName while another has first_name in their account. Manually creating index mappings for each property would get out of hand, and dynamic mappings seem to be the solution.
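For reference, here's a minimal sketch of what a dynamic-mappings index definition looks like; the index name, database, and collection are hypothetical, and on a live Atlas cluster it would be created with pymongo's `create_search_index` (4.5+):

```python
# Minimal sketch: an Atlas Search index definition with dynamic mappings.
# With "dynamic": True, Atlas Search indexes all fields of supported types,
# so arbitrary user-defined properties (firstName, first_name, ...) are
# picked up automatically. Names below are hypothetical.
index_definition = {
    "mappings": {
        "dynamic": True  # index every field of a supported type automatically
    }
}

# On a live Atlas cluster this could be created with pymongo, e.g.:
# from pymongo import MongoClient
# client = MongoClient("<connection string>")
# client.mydb.applicants.create_search_index(
#     {"name": "applicants_dynamic", "definition": index_definition}
# )
```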

I noticed Elasticsearch has a default limit of 1,000 fields and recommends flattening if the number of fields is unknown - I can’t link the docs because I’m a new user :frowning:

We’re in the process of migrating from DynamoDB to MongoDB, and would obviously prefer to have our search on Atlas instead of Elastic since we’re already here.

For more context, we are building an applicant management system where each organization can create questions for applicants to answer, and I would like to provide search functionality over any (text) field on their applicants.
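On the query side, a sketch of how this could work with a dynamic-mappings index: the `$search` `text` operator accepts a wildcard path, which matches every indexed field without naming each property. The index name "default" and the sample query string are assumptions.

```python
# Sketch: a $search stage using a wildcard path, so the query runs against
# every dynamically indexed field regardless of what property names
# tenants invent. Index name and query string are made up.
search_stage = {
    "$search": {
        "index": "default",
        "text": {
            "query": "engineer",
            "path": {"wildcard": "*"},  # search all indexed fields
        },
    }
}

# On a live cluster this would run as:
# results = db.applicants.aggregate([search_stage, {"$limit": 10}])
```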

@joswayski First of all, I want to welcome you to the MongoDB community. We are so excited to have you, and I am particularly interested in the amazing application you have built. Please DM me on Twitter if you’d like to chat more about it.

Secondly, we do not limit the number of fields in an index, though once you are in the range of thousands of fields you could see performance issues unless you scale your boxes. You have a few options.

  1. To avoid scaling boxes linearly, you could spin up a new dynamic index for each tenant. I don’t expect any single tenant to create more than a few thousand fields.

  2. You can keep the approach you have today for simplicity and keep an eye on the number of fields, since we don’t limit the number on our dedicated tier. I have seen customers go to 180,000 fields; this is obviously an anti-pattern, but Atlas Search may still work. I’m not sure what the absolute largest number would be before a mapping explosion, but I know the subsystem will choke at 2.147 billion fields (the 32-bit signed integer limit). Your page cache will probably pop long before that number; I’d bet in the millions of unique fields on very hefty machines.
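A rough sketch of option 1, assuming one collection (and therefore one dynamic search index) per tenant; the tenant IDs and naming scheme below are made up for illustration:

```python
# Sketch of option 1: one dynamic Atlas Search index per tenant, so each
# index's field count is bounded by what that single tenant creates.
# Tenant IDs and the naming scheme are hypothetical.
def tenant_index_model(tenant_id: str) -> dict:
    return {
        "name": f"search_{tenant_id}",
        "definition": {"mappings": {"dynamic": True}},
    }

# On a live cluster, one index per tenant collection, e.g.:
# for tenant_id in ["acme", "globex"]:
#     client.mydb[f"applicants_{tenant_id}"].create_search_index(
#         tenant_index_model(tenant_id)
#     )
```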

Please note: dynamic mappings do not index boolean fields by default today. Those need to be mapped explicitly for now, but that will change.


This topic was automatically closed 5 days after the last reply. New replies are no longer allowed.