Use Kerberos with MS Active Directory in Kafka

This article describes the first steps of working with Kafka after Kerberos with MS Active Directory (LDAP) has been installed.

NOTE
Kerberos LDAP authentication is enabled as described in the article Kerberos with MS Active Directory.

Verify installed Kerberos LDAP

  1. Check the Kerberos settings in the cluster configuration.

    Go to the cluster configuration, select Show advanced, find the section with Kerberos settings, and expand it.

    Installed ADS cluster configuration settings for Kerberos
  2. Check the security and authentication configuration of the Kafka brokers.

    On each host with a Kafka broker, issue the command:

    $ sudo vim /usr/lib/kafka/config/server.properties

    Make sure that in the server.properties file of each Kafka broker the lines defining the security protocol are set to SASL_PLAINTEXT and the lines defining the authentication mechanism are set to GSSAPI:

    security.inter.broker.protocol=SASL_PLAINTEXT
    sasl.mechanism.inter.broker.protocol=GSSAPI
    sasl.enabled.mechanisms=GSSAPI
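
    The same values can also be checked without opening an editor, for example with grep (a quick check using the broker configuration path shown above):

    $ grep -E '^(security\.inter\.broker\.protocol|sasl\.mechanism\.inter\.broker\.protocol|sasl\.enabled\.mechanisms)=' /usr/lib/kafka/config/server.properties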
  3. Verify that after Kerberos SASL is enabled, the listeners parameter in the Kafka service settings has changed from PLAINTEXT://:9092 to SASL_PLAINTEXT://:9092.

    Kafka service settings
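
    The same value can also be checked on the broker host (a quick check, assuming the server.properties path used above); the output should contain the SASL_PLAINTEXT listener:

    $ grep '^listeners' /usr/lib/kafka/config/server.properties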
  4. Using the ldapsearch command-line utility from a Kafka broker host, you can check the principals generated in the AD.RANGER-TEST environment under the specified Container DN.

    Enter the command, specifying the Admin DN, Container DN, and user password that were set when configuring LDAP in ADCM:

    $ ldapsearch -v -H ldaps://ldap_host -x -D "cn=admin,ou=kerberos,ou=adh,dc=ad,dc=ranger-test" -b "ou=admin_ou,ou=kerberos,ou=adh,dc=ad,dc=ranger-test" -w Password

    As a result, information is displayed for each account created in LDAP for each instance of each ADS service. Sample output for one account:

    # kafka/sov-ads-test-5.ru-central1.internal, admin_ou, kerberos, adh, ad.ranger-test
    dn: CN=kafka/sov-ads-test-5.ru-central1.internal,OU=admin_ou,OU=kerberos,OU=adh,DC=ad,DC=ranger-test
    objectClass: top
    objectClass: person
    objectClass: organizationalPerson
    objectClass: user
    cn: kafka/sov-ads-test-5.ru-central1.internal
    distinguishedName: CN=kafka/sov-ads-test-5.ru-central1.internal,OU=admin_ou,OU=kerberos,OU=adh,DC=ad,DC=ranger-test
    instanceType: 4
    whenCreated: 20220816100038.0Z
    whenChanged: 20220816100042.0Z
    uSNCreated: 14950769
    uSNChanged: 14950786
    name: kafka/sov-ads-test-5.ru-central1.internal
    objectGUID:: S/7BU9kZcE6pOwqxKUDJRQ==
    userAccountControl: 66048
    badPwdCount: 0
    codePage: 0
    countryCode: 0
    badPasswordTime: 0
    lastLogoff: 0
    lastLogon: 133053546143620502
    pwdLastSet: 133051176384557312
    primaryGroupID: 513
    objectSid:: AQUAAAAAAAUVAAAAKcXXWJ6nBwdKJbrR0v8QAA==
    accountExpires: 0
    logonCount: 50
    sAMAccountName: $IUV110-ESD8M6LCFJGL
    sAMAccountType: 805306368
    userPrincipalName: kafka/sov-ads-test-5.ru-central1.internal@AD.RANGER-TEST
    servicePrincipalName: kafka/sov-ads-test-5.ru-central1.internal
    objectCategory: CN=Person,CN=Schema,CN=Configuration,DC=ad,DC=ranger-test
    dSCorePropagationData: 16010101000000.0Z
    lastLogonTimestamp: 133051176422839302
    NOTE
    For a complete description of the ldapsearch command-line utility features and applicable options, see ldapsearch.
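
    To limit the output to Kafka principals only, an LDAP filter and a list of attributes can be appended to the same command (the filter and attribute list below are only an example):

    $ ldapsearch -v -H ldaps://ldap_host -x -D "cn=admin,ou=kerberos,ou=adh,dc=ad,dc=ranger-test" -b "ou=admin_ou,ou=kerberos,ou=adh,dc=ad,dc=ranger-test" -w Password "(servicePrincipalName=kafka/*)" dn userPrincipalName servicePrincipalName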
  5. Check that the *.service.keytab key storage files are present on the hosts with installed services.

    On each host with installed services, issue the command:

    $ ls -la /etc/security/keytabs/

    The file listing shows that *.service.keytab files have been created for each service installed on the host:

    total 16
    drwxr-xr-x. 2 root       root        102 Aug  9 20:55 .
    drwxr-xr-x. 7 root       root       4096 Aug  9 19:38 ..
    -rw-------. 1 dockerroot dockerroot  890 Aug  9 20:55 kafka-manager.service.keytab
    -rw-------. 1 kafka      kafka       826 Aug  9 20:54 kafka.service.keytab
    -rw-------. 1 zookeeper  zookeeper   858 Aug  9 20:55 zookeeper.service.keytab
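
    The principals recorded in a keytab can be viewed with the klist utility (an example for the Kafka keytab listed above):

    $ sudo klist -kt /etc/security/keytabs/kafka.service.keytab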

Client connection to Kafka with LDAP authentication

By default, every user in the Active Directory database who has a principal in the given realm can connect to the Kafka cluster and perform actions on topics.

Create a JAAS file for a user

A JAAS file (Java Authentication and Authorization Service) must be created for all principals. It specifies how tickets for a particular principal will be used.

NOTE
  • A description of the JAAS file and the options used is given in the article Krb5LoginModule.

  • Client connection data can also be specified in the KafkaClient section of the kafka-jaas.conf file when using the JAAS file template for the Kafka service.

For broker principals, connection parameters are automatically created in the kafka-jaas.conf file after kerberization. To view the contents of the file, enter the following command:

$ sudo vim /usr/lib/kafka/config/kafka-jaas.conf
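
A typical KafkaServer section in this file references the broker keytab and principal, for example (a sketch; the exact contents generated for your cluster may differ):

KafkaServer {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/etc/security/keytabs/kafka.service.keytab"
    principal="kafka/sov-ads-test-5.ru-central1.internal@AD.RANGER-TEST";
};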

For client principals, you need to create the JAAS file yourself.

  1. Run command:

    $ sudo vim /tmp/client.jaas
  2. Write data to the file:

    KafkaClient {
        com.sun.security.auth.module.Krb5LoginModule required
        useTicketCache=true;
    };

    Here useTicketCache specifies whether the ticket for this user is obtained from the ticket cache. If this parameter is set to true, you must obtain a user ticket (for example, with kinit) before connecting to Kafka. A keytab-based variant is shown below.
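
    If the client principal has its own keytab file (as the reader user does later in this article), the ticket cache can be bypassed and the keytab referenced directly. A minimal sketch, assuming the keytab path and principal from the reader example:

    KafkaClient {
        com.sun.security.auth.module.Krb5LoginModule required
        useKeyTab=true
        storeKey=true
        keyTab="/tmp/reader.user.keytab"
        principal="reader@ADS-KAFKA.LOCAL";
    };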

Create a .properties configuration file for the user

To create a .properties configuration file for the user, run the command:

$ sudo vim /tmp/client.properties

Fill the file with data:

security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
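
Alternatively, the JAAS settings can be placed directly in this file via the sasl.jaas.config property, so that exporting a separate JAAS file through KAFKA_OPTS is not required (a sketch equivalent to the client.jaas file created above):

security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required useTicketCache=true;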

Connect a user to Kafka (create tickets) and work with .sh scripts

  1. Open a terminal session and connect to one of the Kafka brokers.

  2. Create a ticket for a user by entering a password:

    $ kinit -p admin-kafka@AD.RANGER-TEST
    Password for admin-kafka@AD.RANGER-TEST:
    NOTE
    • In this example, admin-kafka is a user with an entry in the Active Directory database and a principal for the AD.RANGER-TEST realm.

    • For a complete description of the kinit command functions and applicable options, see kinit.

  3. Check ticket:

    $ klist
    Ticket cache: FILE:/tmp/krb5cc_1000
    Default principal: admin-kafka@AD.RANGER-TEST
    
    Valid starting       Expires              Service principal
    08/19/2022 14:18:33  08/20/2022 00:18:33  krbtgt/AD.RANGER-TEST@AD.RANGER-TEST
    	renew until 08/20/2022 14:17:35
  4. Export the generated client.jaas file as a JVM option for the given user using the KAFKA_OPTS environment variable:

    $ export KAFKA_OPTS="-Djava.security.auth.login.config=/tmp/client.jaas"
  5. Create a topic by specifying the path to the created client.properties file:

    $ /usr/lib/kafka/bin/kafka-topics.sh --create --topic test-topic --bootstrap-server sov-ads-test-1.ru-central1.internal:9092,sov-ads-test-2.ru-central1.internal:9092,sov-ads-test-3.ru-central1.internal:9092 --command-config /tmp/client.properties

    Get a confirmation:

    Created topic test-topic.
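
    If necessary, the created topic can be inspected with the --describe option of the same script, using the same bootstrap servers and client configuration:

    $ /usr/lib/kafka/bin/kafka-topics.sh --describe --topic test-topic --bootstrap-server sov-ads-test-1.ru-central1.internal:9092,sov-ads-test-2.ru-central1.internal:9092,sov-ads-test-3.ru-central1.internal:9092 --command-config /tmp/client.properties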
  6. Write a message to the topic, specifying the path to the created client.properties file:

    $ /usr/lib/kafka/bin/kafka-console-producer.sh --topic test-topic --bootstrap-server sov-ads-test-1.ru-central1.internal:9092,sov-ads-test-2.ru-central1.internal:9092,sov-ads-test-3.ru-central1.internal:9092 --producer.config /tmp/client.properties
    >One
    >Two
    >Three
    >Four
    >Five
  7. Open a second terminal session and connect to one of the Kafka brokers.

  8. Create a ticket for the user reader using its keytab file:

    $ kinit -k reader@ADS-KAFKA.LOCAL -t /tmp/reader.user.keytab
  9. Check ticket:

    $ klist
    Ticket cache: FILE:/tmp/krb5cc_1000
    Default principal: reader@ADS-KAFKA.LOCAL
    
    Valid starting       Expires              Service principal
    08/10/2022 21:30:47  08/11/2022 21:30:47  krbtgt/ADS-KAFKA.LOCAL@ADS-KAFKA.LOCAL
  10. Export the generated client.jaas file as a JVM option for the given user using the KAFKA_OPTS environment variable:

    $ export KAFKA_OPTS="-Djava.security.auth.login.config=/tmp/client.jaas"
  11. Read messages from a topic by specifying the path to the created client.properties file:

    $ /usr/lib/kafka/bin/kafka-console-consumer.sh --topic test-topic --from-beginning  --bootstrap-server sov-ads-test-1.ru-central1.internal:9092,sov-ads-test-2.ru-central1.internal:9092,sov-ads-test-3.ru-central1.internal:9092 --consumer.config /tmp/client.properties

    Messages received:

    One
    Two
    Three
    Four
    Five

    Verify that the received messages are correct.
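
    When work is finished, the obtained Kerberos tickets can be removed from the ticket cache with the standard kdestroy utility:

    $ kdestroy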
