Constant duplicate key error even when unique is set to false in the schema

I have the following schema:

const mongoose = require('mongoose');

const productSchema = new mongoose.Schema(
  {
    name: { type: String, required: true, unique: false },
    slug: { type: String, required: true, unique: false },
    category: { type: String, required: true, unique: false },
    image: { type: String, required: true, unique: false },
    price: { type: Number, unique: false },
    countInStock: { type: Number, required: true, unique: false },
    brand: { type: String, required: true, unique: false },
    rating: { type: Number, required: true, unique: false },
    reviews: { type: Number, required: true, unique: false },
    description: { type: String, required: true, unique: false },
  },
  {
    timestamps: true, // adds createdAt / updatedAt
  }
);

Even though I've set unique: false on every field, I always get MongoBulkWriteError: E11000 duplicate key error collection: react-store.Products index: price_1 dup key: { price: 123 }
How does that even happen? There is no unique constraint anywhere; I don't want price to be unique and I've explicitly said so in the schema.
I'm losing my mind here.

FIXED IT! What a nasty one: it turns out Mongoose doesn't remove existing indexes, so you need to explicitly drop the stale index to get rid of it. In the shell:
> db.Products.dropIndex('price_1')
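
If it helps anyone else, you can confirm the stale index is there before dropping it by listing the collection's indexes (the collection name below is taken from the error message):

> db.Products.getIndexes()   // the leftover unique index still shows up here as 'price_1'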

What a miserable experience this was. If you ever set a field to unique: true, switching it back to unique: false in the schema is not enough: the unique index already exists in MongoDB and stays there until you drop it manually from the shell or re-sync the indexes from Mongoose, as sketched below.
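
If you'd rather fix it from the application instead of the shell, Mongoose's Model.syncIndexes() drops any indexes on the collection that are no longer declared in the schema and builds the ones that are. A minimal sketch, assuming the productSchema from the original post, a local connection URI (adjust to your own deployment), and that the collection is literally named Products as in the error message:

const mongoose = require('mongoose');

// productSchema is the schema from the post above; the third argument pins the
// collection name to 'Products' to match the one in the error message.
const Product = mongoose.model('Product', productSchema, 'Products');

async function cleanUpIndexes() {
  // Assumed local connection string -- replace with your own URI.
  await mongoose.connect('mongodb://127.0.0.1:27017/react-store');

  // Drops indexes that exist in MongoDB but are no longer declared in the schema
  // (e.g. the old price_1 unique index) and creates any that are missing.
  const dropped = await Product.syncIndexes();
  console.log('Dropped indexes:', dropped);

  await mongoose.disconnect();
}

cleanUpIndexes().catch(console.error);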

