
Status

Functionality: Experimental
Support status: Unknown
Support provided by: Evolveum
Origin: Evolveum
Target systems: Kafka server

Description

The Kafka connector reads and writes data on a Kafka server, formatted according to an Avro schema obtained from a Schema Registry server. The connector was tested with midPoint 4.1. It can work as a Consumer (only reads data from a topic), a Producer (only writes data to a topic), or both (writes data to one topic and reads data from another topic).


Framework: ConnId 1.5.0.10

Bundle name: com.evolveum.polygon.connector.kafka

Connector name: connector-kafka

Capabilities and Features


Provisioning: YES

Live Synchronization: YES (the connector does not support the typical read operation; identities are read via Live Synchronization)

Password: YES

Activation: YES

Paging support: NO

Scripting: NO

Versions

ONLY SNAPSHOT

Licensing

The connector itself is available under the terms of the Apache License 2.0. To the best of our knowledge, no extra license is needed to use this connector. (Additional licensing terms and conditions may apply to the services with which the connector is used.)

Known limitations

The following limitations were identified while creating this connector:

  • The typical read operation is not supported; identities are read via Live Synchronization.

  • createOp, updateDeltaOp and deleteOp are supported; however, updateDeltaOp needs the whole object, i.e. all changed and unchanged attributes.

  • Avro schemas define more types, but midPoint supports only the primitive types (boolean, double, bytes, float, int, long and string), arrays of a primitive type, records, and unions of two types where one is null and the other is an array or a primitive type. Therefore midPoint does not support enum, map and fixed.

Schema

The connector schema is generated from the Avro schema. When the connector is only a Consumer, the Avro schema is fetched from the Schema Registry server. When the connector is a Producer, the schema is read from a file and pushed to the Schema Registry server. An attribute in the Avro schema is optional if it has a default value; otherwise it is required. For example:

Example of avro schema
{
  "namespace": "com.evolveum.test",
  "type": "record",
  "name": "test_schema",
  "fields": [
    {
      "name": "username",
      "type": "string"
    },
    {
      "name": "full_name",
      "type": ["string", "null"],
      "default": null
    },
    {
      "name": "favorite_number",
      "type": ["int", "null"],
      "default": null
    },
    {
      "name": "favorite_color_array",
      "type": [
        {
          "type": "array",
          "items": "string"
        },
        "null"
      ],
      "default": null
    },
    {
      "name": "address",
      "type": {
        "name": "address_insade",
        "type": "record",
        "fields": [
          {
            "name": "street",
            "type": ["string", "null"],
            "default": null
          },
          {
            "name": "number",
            "type": ["int", "null"],
            "default": null
          }
        ]
      }
    }
  ]
}
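To illustrate the optionality rule above (a field with a default value is optional, otherwise it is required), the top-level fields of the example schema can be classified with plain Python. This is an illustrative sketch using only the standard library, not part of the connector itself:

```python
import json

# Top-level fields of the example Avro schema above, abbreviated.
schema = json.loads("""
{
  "namespace": "com.evolveum.test",
  "type": "record",
  "name": "test_schema",
  "fields": [
    {"name": "username", "type": "string"},
    {"name": "full_name", "type": ["string", "null"], "default": null},
    {"name": "favorite_number", "type": ["int", "null"], "default": null}
  ]
}
""")

def classify(field):
    # A field is optional when it declares a default value; otherwise it is required.
    return "optional" if "default" in field else "required"

for f in schema["fields"]:
    print(f["name"], classify(f))
# username required
# full_name optional
# favorite_number optional
```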


Certificate Renewal

This connector supports automatic renewal of the certificate and private key used for communication with the Kafka server and the Schema Registry server.

Configuration

Schema Registry

schemaRegistryUrl (String, required): URL of the Schema Registry to which this client connects, for example http://localhost:9090/api/v1.
pathToMorePropertiesForSchemaRegistry (String, optional): Path to a file with additional configuration properties for the Schema Registry client.
schemaRegistrySslProtocol (String, optional): SSL protocol for the Schema Registry.

Certificate Renewal

ssoUrlRenewal (String, optional): URL of the SSO service for the certificate renewal service.
serviceUrlRenewal (String, optional): URL of the certificate renewal service.
usernameRenewal (String, optional): Username for authentication to the SSO service.
passwordRenewal (GuardedString, optional): Password for authentication to the SSO service.
clientIdRenewal (String, optional): Client id for authentication to the SSO service.
intervalForCertificateRenewal (Integer, optional): Interval in minutes that defines how long before its expiration a certificate is renewed. There is no default value; without it, only the actual time is compared with the expiration time.
sslPrivateKeyEntryAlias (String, optional): Alias of the private key in the keystore.
sslPrivateKeyEntryPassword (GuardedString, optional): Password of the private key in the keystore.
sslTrustCertificateAliasPrefix (String, optional): Every alias of a certificate that should be renewed has to start with this prefix. The suffix is a number starting from 0; for example, with the prefix 'caroot' the aliases have to be 'caroot0', 'caroot1', 'caroot2', and so on. If one number is missing, the following aliases will not be processed.
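The alias numbering rule for sslTrustCertificateAliasPrefix can be sketched as follows. The `collect_aliases` helper and the sample alias set are hypothetical illustrations, not part of the connector API:

```python
def collect_aliases(prefix, available):
    """Collect prefix0, prefix1, ... until the first missing number;
    aliases after a gap are not processed."""
    result = []
    i = 0
    while f"{prefix}{i}" in available:
        result.append(f"{prefix}{i}")
        i += 1
    return result

# 'caroot3' is never reached because 'caroot2' is missing.
keystore_aliases = {"caroot0", "caroot1", "caroot3"}
print(collect_aliases("caroot", keystore_aliases))  # ['caroot0', 'caroot1']
```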

Common Properties for Consumer and Producer

useOfConnector (String, required): The Kafka connector can be used as a Consumer, a Producer, or both. A Consumer reads data from the Kafka server and a Producer writes data to the Kafka server. Possible values are 'CONSUMER', 'PRODUCER' and 'CONSUMER_AND_PRODUCER'.
uniqueAttribute (String, required): Name of the unique attribute in the Avro schema.
nameAttribute (String, optional): Name attribute for an account in the resource. In most cases it is equal to the unique attribute, but there can be differences.
passwordAttribute (String, optional): Password attribute for an account in the resource.
bootstrapServers (String, required): A comma-separated list of host and port pairs that are the addresses of the Kafka brokers.
nameOfSchema (String, required): Name of the Avro schema to use. When the connector is only a Consumer, this schema is fetched from the Schema Registry server; when the connector is a Producer, the schema is read from a file and pushed to the Schema Registry server.
kafkaSecurityProtocol (String, optional): Security protocol for the Kafka server.
sslKeyStoreType (String, optional): SSL key store type used for the Kafka server and the Schema Registry server.
sslKeyStorePath (String, optional): SSL key store path used for the Kafka server and the Schema Registry server.
sslKeyStorePassword (GuardedString, optional): SSL key store password used for the Kafka server and the Schema Registry server.
sslKeyStoreProvider (String, optional): SSL key store provider used for the Kafka server and the Schema Registry server.
sslKeyPassword (GuardedString, optional): SSL key password used for the Kafka server and the Schema Registry server.
sslKeyManagerFactoryProvider (String, optional): SSL key manager factory provider used for the Kafka server and the Schema Registry server.
sslKeyManagerFactoryAlgorithm (String, optional): SSL key manager factory algorithm used for the Kafka server and the Schema Registry server.
sslTrustStoreType (String, optional): SSL trust store type used for the Kafka server and the Schema Registry server.
sslTrustStorePath (String, optional): SSL trust store path used for the Kafka server and the Schema Registry server.
sslTrustStorePassword (GuardedString, optional): SSL trust store password used for the Kafka server and the Schema Registry server.
sslTrustStoreProvider (String, optional): SSL trust store provider used for the Kafka server and the Schema Registry server.
sslTrustManagerFactoryProvider (String, optional): SSL trust manager factory provider used for the Kafka server and the Schema Registry server.
sslTrustManagerFactoryAlgorithm (String, optional): SSL trust manager factory algorithm used for the Kafka server and the Schema Registry server.

Consumer

The properties marked as required below are required only when the connector is used as a Consumer.

consumerNameOfTopic (String, required): Name of the topic from which the connector will read.
consumerVersionOfSchema (Integer, required): Version of the Avro schema that the connector uses. If the connector is a Producer, this property is updated automatically.
consumerGroupId (String, required): A unique string that identifies the consumer group this consumer belongs to.
consumerPartitionOfTopic (String, optional): Comma-separated list of topic partitions from which the connector will read, for example '1,2,3,5-7'. Default value is 0.
consumerDurationIfFail (Integer, optional): The time, in minutes, spent waiting in poll if data is not available in the buffer. Default value is 2.
consumerMaxRecords (Integer, optional): The maximum number of records returned in a single call.
pathToMorePropertiesForConsumer (String, optional): Path to a file with additional configuration properties for the Consumer.
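The comma-and-range syntax of consumerPartitionOfTopic (e.g. '1,2,3,5-7') expands as in the following sketch. The `parse_partitions` helper is hypothetical, shown only to illustrate the list format, and is not the connector's actual code:

```python
def parse_partitions(spec):
    """Expand a comma-separated partition list such as '1,2,3,5-7'
    into the individual partition numbers."""
    partitions = []
    for part in spec.split(","):
        part = part.strip()
        if "-" in part:
            lo, hi = part.split("-")
            partitions.extend(range(int(lo), int(hi) + 1))
        else:
            partitions.append(int(part))
    return partitions

print(parse_partitions("1,2,3,5-7"))  # [1, 2, 3, 5, 6, 7]
```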

Producer

The properties marked as required below are required only when the connector is used as a Producer.

producerPathToFileContainingSchema (String, required): Path to the file that contains the Avro schema.
producerNameOfTopic (String, required): Name of the topic to which the connector will write.
pathToMorePropertiesForProducer (String, optional): Path to a file with additional configuration properties for the Producer.
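Putting the properties together, a midPoint resource configuration for the Consumer mode might look like the following sketch. The icfc namespace is the standard midPoint connector configuration schema; the gen namespace URI (in particular the connector class segment) is an assumption derived from the bundle name above, and all property values are placeholders:

```xml
<connectorConfiguration
    xmlns:icfc="http://midpoint.evolveum.com/xml/ns/public/connector/icf-1/connector-schema-3">
  <!-- The connector class name in the gen namespace URI is an assumption -->
  <icfc:configurationProperties
      xmlns:gen="http://midpoint.evolveum.com/xml/ns/public/connector/icf-1/bundle/com.evolveum.polygon.connector.kafka/com.evolveum.polygon.connector.kafka.KafkaConnector">
    <gen:useOfConnector>CONSUMER</gen:useOfConnector>
    <gen:uniqueAttribute>username</gen:uniqueAttribute>
    <gen:bootstrapServers>localhost:9092</gen:bootstrapServers>
    <gen:nameOfSchema>test_schema</gen:nameOfSchema>
    <gen:schemaRegistryUrl>http://localhost:9090/api/v1</gen:schemaRegistryUrl>
    <gen:consumerNameOfTopic>midpoint-accounts</gen:consumerNameOfTopic>
    <gen:consumerVersionOfSchema>1</gen:consumerVersionOfSchema>
    <gen:consumerGroupId>midpoint-consumer</gen:consumerGroupId>
  </icfc:configurationProperties>
</connectorConfiguration>
```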