How To
Summary
You can validate your Kerberos configuration outside of CDC replication before you attempt to set up replication. Taking this approach can reduce the overall complexity of the task.
Steps
Step 1: Validate keytab, principal name, and Kerberos client configuration by using the command line
This step validates the keytab file, principal name, and Kerberos client configuration that you received from your Kerberos administrator.
First, print the contents of the keytab file. This command lists all the keys stored in the keytab file, along with the encryption algorithms and corresponding principals. Verify that the principal name that you received is on the list.
klist -k -e -K -t FILE:/path/to/keytab
Next, manually request a Kerberos ticket. The command prints debug information on standard output.
KRB5_TRACE=/dev/stdout kinit -V -k -t /path/to/keytab principalName
Check whether the ticket was successfully acquired by consulting the contents of the credentials cache.
klist
Once the test is done, destroy the tickets in the credentials cache.
kdestroy
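The Step 1 checks above can be wrapped in a small shell function so that they are easy to rerun while troubleshooting. This is a sketch: the function name `validate_keytab` is hypothetical, and you supply your own keytab path and principal name as arguments.

```shell
# Sketch: run the Step 1 validation commands in sequence.
# validate_keytab is a hypothetical helper name, not part of the product.
validate_keytab() {
    keytab=$1
    principal=$2
    # List the keys in the keytab; verify that your principal appears here
    klist -k -e -K -t "FILE:$keytab" || return 1
    # Request a ticket, printing Kerberos trace output for troubleshooting
    KRB5_TRACE=/dev/stdout kinit -V -k -t "$keytab" "$principal" || return 1
    # Confirm the ticket is in the credentials cache, then clean up
    klist
    kdestroy
}
```

Invoke it as `validate_keytab /path/to/keytab principalName`; a non-zero return code tells you which stage to investigate in the trace output.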
Step 2: Validate IDR CDC for Kafka configuration
Once you determine that the keytab file, principal name, and Kerberos client configuration are correct, move on to testing your IDR CDC for Kafka configuration.
You will be validating:
- The JAAS file.
- The kafkaproducer.properties file.
- The kafkaconsumer.properties file.
For instructions on how to populate the files, refer to the article How to install and configure the CDC replication engine for Apache Kafka®.
You can find the relevant information in the "Specify Kafka producer properties" and "Create a JAAS file" sections.
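For orientation, a JAAS file for Kerberos authentication typically looks like the sketch below. This sketch assumes the IBM JRE that ships with CDC (hence the com.ibm.security login module, which matches the IBM debug properties used later in this article); the keytab path and principal name are placeholders, and the linked article remains the authoritative reference for the format.

```
KafkaClient {
    com.ibm.security.auth.module.Krb5LoginModule required
    useKeytab="file:///path/to/keytab"
    principal="principalName"
    credsType=both;
};
```

The producer and consumer properties files must likewise enable SASL/Kerberos, typically with entries such as `security.protocol=SASL_PLAINTEXT` and `sasl.kerberos.service.name=kafka`; the service name is an assumption here, so check it against your broker configuration.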
To conduct the test, first log in to the Kafka server and issue these commands to produce to a Kafka topic.
export JAVA_HOME=/cdc/install/dir/jre64/jre
export SCHEMA_REGISTRY_OPTS="-Djava.security.auth.login.config=/path/to/cdc-jaas.conf -Djava.security.krb5.conf=/etc/krb5.conf"
echo 16830912 | /kafka/install/dir/bin/kafka-avro-console-producer --broker-list brokerHostname:brokerPort --topic topicName --producer.config /path/to/kafkaproducer.properties --property schema.registry.url=http://schemaHost:schemaPort --property value.schema='{"type":"int"}'
Next, consume the data that you just produced.
/kafka/install/dir/bin/kafka-avro-console-consumer --bootstrap-server brokerHostname:brokerPort --topic topicName --consumer.config /path/to/kafkaconsumer.properties --property schema.registry.url=http://schemaHost:schemaPort --from-beginning
If your Kafka environment does not use a schema registry, issue these commands.
export JAVA_HOME=/cdc/install/dir/jre64/jre
export KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/cdc-jaas.conf -Djava.security.krb5.conf=/etc/krb5.conf"
echo 16830912 | /kafka/install/dir/bin/kafka-console-producer --broker-list brokerHostname:brokerPort --topic topicName --producer.config /path/to/kafkaproducer.properties
/kafka/install/dir/bin/kafka-console-consumer --bootstrap-server brokerHostname:brokerPort --topic topicName --consumer.config /path/to/kafkaconsumer.properties --from-beginning
If you need Kerberos debug information, set the SCHEMA_REGISTRY_OPTS or KAFKA_OPTS environment variable as follows.
export SCHEMA_REGISTRY_OPTS="-Djava.security.auth.login.config=/path/to/cdc-jaas.conf -Djava.security.krb5.conf=/etc/krb5.conf -Dcom.ibm.security.jgss.debug=all -Dcom.ibm.security.krb5.Krb5Debug=all"
export KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/cdc-jaas.conf -Djava.security.krb5.conf=/etc/krb5.conf -Dcom.ibm.security.jgss.debug=all -Dcom.ibm.security.krb5.Krb5Debug=all"
Document Location
Worldwide
Product Synonym
IDR;IBM Data Replication;IIDR;IBM InfoSphere Data Replication;CDC;Change Data Capture
Document Information
Modified date:
19 November 2020
UID
ibm10967415