DecodeException when creating a MongoDB connector

Hi,
When I create a MongoDB connector in OpenShift, I get a DecodeException:
Failed to decode:No content to map due to end-of-input
at [Source: (io.netty.buffer.ByteBufInputStream); line 1, column: 0]
reason: DecodeException

My connector JAR file is located in /opt/kafka/plugins, and when I specify the connector class,
it recognizes the plugin as a valid connector type, yet I still get this error…
Has anyone encountered this issue?
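
For context, the way I register the connector is roughly the following (a minimal sketch against the Kafka Connect REST API; the service URL and the MongoDB URI/database/collection are placeholders, not my actual values):

```python
import json

import requests  # third-party HTTP client: pip install requests

# Kafka Connect REST API endpoint; placeholder for the Connect
# service/route in my OpenShift cluster.
CONNECT_URL = "http://my-connect-cluster-connect-api:8083"

# Minimal source connector definition; URI, database and collection
# are placeholders here.
connector = {
    "name": "mongo-source",
    "config": {
        "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
        "connection.uri": "mongodb://mongodb:27017",
        "database": "mydb",
        "collection": "mycollection",
    },
}

# POST /connectors creates the connector on the Connect worker.
resp = requests.post(
    f"{CONNECT_URL}/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
)
print(resp.status_code, resp.text)
```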

I can make some guesses, but I am not completely sure. There are Kafka connector builds that bundle all of their dependencies (Avro, for example), and I don't know whether the build you used includes its dependencies or not. I'm also not sure which libraries are available by default on OpenShift out of the box.
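
One thing you could check is what the Connect worker itself reports as loaded plugins via its /connector-plugins REST endpoint, to confirm that both the source and sink classes (and their versions) actually show up. A quick sketch; the service URL is a placeholder for your Connect route:

```python
import requests  # pip install requests

# Placeholder for your Connect service/route in OpenShift.
CONNECT_URL = "http://my-connect-cluster-connect-api:8083"

# GET /connector-plugins lists every connector plugin the worker loaded
# from its plugin path, with class name, type (source/sink) and version.
for plugin in requests.get(f"{CONNECT_URL}/connector-plugins").json():
    print(plugin["class"], plugin.get("type"), plugin.get("version"))
```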

I have tried the newest MongoDB plugin (mongo-kafka-1.5.1-all.jar),
and now the sink connector works properly, but the source connector still throws the same decoding exception…
Any ideas why the sink works but the source does not?