Error creating index with null value

Laravel version: 9
PHP version: 8.2
jenssegers/mongodb package version: 3.9

I'm trying to create a unique index; here is my code:

```php
try {
    Schema::table('invoice_services', function (Blueprint $collection) {
        $collection->index('company_id');
        $collection->index('numRps');
        $collection->index('serieRps');

        $options['sparse'] = true;
        $options['unique'] = true;
        $collection->index(['company_id', 'numRps', 'serieRps'], 'uniques', null, $options);
    });
} catch (\Exception $ex) {
    Log::debug($ex->getMessage());
}
```

But even with sparse set so that entries with null values are ignored, I get the following error message:

Index build failed: fd1868d8-db3c-4367-b818-42939da3bd5d: Collection invoice_hmlg.invoice_services ( 34a28ecd-882d-49b4-8e8b-e6e90b08b932 ) :: caused by :: E11000 duplicate key error collection: invoice_hmlg.invoice_services index: company_id_1_numRps_1_serieRps_1 dup key: { company_id: "5b6da812e014611c866da693", numRps: null, serieRps: null }

Any help appreciated. Thanks in advance.

It looks like you are building a unique index, but there are multiple documents with the same key. A unique index won't allow two documents to share the same values for the indexed fields, so to resolve this you will need to make sure there are no duplicates before creating the index (one way to check is sketched below the quoted error).

E11000 duplicate key error collection: invoice_hmlg.invoice_services index: company_id_1_numRps_1_serieRps_1 dup key: { company_id: "5b6da812e014611c866da693", numRps: null, serieRps: null }
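For reference, here is a hedged sketch of how you could list the colliding documents using the query builder's raw() method. It assumes the MongoDB connection is named 'mongodb'; the collection and field names are taken from the error message, so adjust them to your setup:

```php
use Illuminate\Support\Facades\DB;

// List (company_id, numRps, serieRps) combinations that occur more than once,
// so they can be cleaned up before the unique index is built.
$duplicates = DB::connection('mongodb')
    ->collection('invoice_services')
    ->raw(function ($collection) {
        return $collection->aggregate([
            ['$group' => [
                '_id' => [
                    'company_id' => '$company_id',
                    'numRps'     => '$numRps',
                    'serieRps'   => '$serieRps',
                ],
                'count' => ['$sum' => 1],
                'ids'   => ['$push' => '$_id'],
            ]],
            ['$match' => ['count' => ['$gt' => 1]]],
        ]);
    });

foreach ($duplicates as $group) {
    // $group->_id holds the duplicated key, $group->ids the documents involved.
}
```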

Cross-referencing with Erro ao criar índice com valor nulo · jenssegers laravel-mongodb · Discussion #2524 · GitHub where I’ve answered the question already.

@Kleber_Marioti please cross-link discussions that you create in multiple forums so people can check if the question has been answered.

@tapiocaPENGUIN yes, that is correct. The fact that OP is using sparse and unique at the same time indicates that they have fields that shouldn't count towards the uniqueness constraint when they are null. Note that a sparse compound index still includes a document as long as at least one of the indexed fields is present, so documents that have a company_id but no numRps/serieRps are indexed with null keys and collide. In that case, removing duplicates is impossible, but it is possible to work around the problem using partial indexes.
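For completeness, a minimal sketch of that workaround in the same Blueprint style as the original snippet, assuming the options array is passed through to createIndex and that numRps/serieRps are stored as strings when present (adjust the $type filter otherwise). A partialFilterExpression with $type excludes documents where those fields are missing or explicitly null, unlike $exists: true, which still matches fields stored as an explicit null:

```php
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Schema;
use Jenssegers\Mongodb\Schema\Blueprint;

try {
    Schema::table('invoice_services', function (Blueprint $collection) {
        // Only documents where both fields hold a string value are indexed,
        // so documents with missing or null numRps/serieRps no longer collide.
        $options = [
            'unique' => true,
            'partialFilterExpression' => [
                'numRps'   => ['$type' => 'string'],
                'serieRps' => ['$type' => 'string'],
            ],
        ];

        $collection->index(['company_id', 'numRps', 'serieRps'], 'uniques', null, $options);
    });
} catch (\Exception $ex) {
    Log::debug($ex->getMessage());
}
```

Note that the sparse option has to be dropped from that call: MongoDB does not allow sparse and partialFilterExpression to be combined on the same index.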