Kafka Source Connector and Schema Registry

I’m trying to get the MongoDB Source Connector 1.5 working with Avro and Schema Registry. I’m facing the following issues (in all cases, output.schema.value is provided):

  1. When auto.register.schemas is true, the schema created in the registry is identical to the one provided in the connector creation request, except that it is missing the namespace fields, even though enhanced.avro.schema.support is set to true.
  2. When auto.register.schemas is false, use.latest.version is true, latest.compatible.strict is false, and the schema is registered in the schema registry externally, I get UnresolvedUnionException: Not in union, even though the schema is the same as in the scenario above.
  3. When auto.register.schemas is false and use.latest.version is false, I get Schema not found; error code: 40403.

How should I configure the source connector so that I can provide the schema externally (i.e., neither inferred nor auto-registered) and preserve the namespace information in the schema registry?
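For context, here is a minimal sketch of the connector configuration I would expect for the externally-provided-schema setup. The connector name, connection URI, database/collection names, registry URL, and the example schema are placeholders of my own, not taken from the thread; the property names themselves are the standard MongoDB Source Connector and Confluent AvroConverter settings:

```json
{
  "name": "mongo-source-avro",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
    "connection.uri": "mongodb://localhost:27017",
    "database": "mydb",
    "collection": "mycoll",

    "output.format.value": "schema",
    "output.schema.value": "{\"type\":\"record\",\"name\":\"MyRecord\",\"namespace\":\"com.example\",\"fields\":[{\"name\":\"_id\",\"type\":\"string\"}]}",

    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081",
    "value.converter.auto.register.schemas": "false",
    "value.converter.use.latest.version": "true",
    "value.converter.latest.compatible.strict": "false",
    "value.converter.enhanced.avro.schema.support": "true"
  }
}
```

Note that the converter-level settings are prefixed with `value.converter.`; setting them without the prefix has no effect on the converter.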

Also, serialization failures do not result in failed records being sent to the dead letter queue. Is this expected behavior?

@kobasad_N_A did you find a resolution to this? We are facing the same issue.

@Oli_Allan1 With a lot of trial and error, I found that my schema definition was incorrect, so I had to fix it field by field over a number of iterations until it worked. I ended up setting auto.register.schemas to false and use.latest.version to true, registering the schema in the registry manually, and providing the same schema in the connector create request.
For some reason, ingestion of existing documents failed with UnresolvedUnionException: Not in union when certain fields were declared as optional, while other optional fields worked fine. It is still not clear to me why this happened.
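As a possible explanation (not confirmed by this thread): Avro raises UnresolvedUnionException when a produced value does not match any branch of the declared union, so an optional field can fail if the actual data type differs from the union branch (e.g. the document holds an int where the union declares long, or a BSON type the connector maps differently). The conventional way to declare an optional field is a null-first union with a null default, sketched here with hypothetical field names:

```json
{
  "type": "record",
  "name": "MyRecord",
  "namespace": "com.example",
  "fields": [
    {"name": "_id", "type": "string"},
    {"name": "count", "type": ["null", "long"], "default": null},
    {"name": "label", "type": ["null", "string"], "default": null}
  ]
}
```

If one optional field works and another fails with the same union shape, comparing the actual BSON types of the failing field against the declared union branches is a reasonable first debugging step.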


Thank you for sharing that; it is much appreciated, @kobasad_N_A.
