Console Output

[INFO] 
[INFO] --- scala-maven-plugin:4.3.0:testCompile (scala-test-compile-first) @ spark-streaming-kafka-0-10_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file: /home/jenkins/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.10__52.0-1.3.1_20191012T045515.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.12.10,1.6.0,null)
[INFO] Compiling 6 Scala sources and 4 Java sources to /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10/target/scala-2.12/test-classes ...
OpenJDK 64-Bit Server VM warning: CodeCache is full. Compiler has been disabled.
OpenJDK 64-Bit Server VM warning: Try increasing the code cache size using -XX:ReservedCodeCacheSize=
CodeCache: size=1048576Kb used=1042288Kb max_used=1042326Kb free=6288Kb
 bounds [0x00007fd8d4000000, 0x00007fd914000000, 0x00007fd914000000]
 total_blobs=249532 nmethods=248885 adapters=550
 compilation: disabled (not enough contiguous free space left)
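
The block above is the JVM reporting that the 1 GiB code cache (size=1048576Kb) filled up mid-build, so JIT compilation was switched off for the rest of the run; the build continues, as the "Done compiling" line below shows, just without further JIT work. The remedy is the flag the warning itself names, i.e. raising -XX:ReservedCodeCacheSize in the JVM options passed to this build (any concrete value, say 2g, is an assumption, not something this log prescribes). A minimal Scala sketch for watching the same pool over JMX, should you want to confirm a fix:

    import java.lang.management.ManagementFactory
    import scala.collection.JavaConverters._

    object CodeCacheProbe {
      def main(args: Array[String]): Unit = {
        // Segmented code caches expose pools named "CodeHeap '...'";
        // older JVMs expose a single pool named "Code Cache".
        ManagementFactory.getMemoryPoolMXBeans.asScala
          .filter(_.getName.contains("Code"))
          .foreach(p => println(s"${p.getName}: used=${p.getUsage.getUsed} max=${p.getUsage.getMax}"))
      }
    }
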
[INFO] Done compiling.
[INFO] compile in 28.8 s
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M5:test (default-test) @ spark-streaming-kafka-0-10_2.12 ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.spark.streaming.kafka010.JavaDirectKafkaStreamSuite
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.53 s - in org.apache.spark.streaming.kafka010.JavaDirectKafkaStreamSuite
[INFO] Running org.apache.spark.streaming.kafka010.JavaLocationStrategySuite
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 s - in org.apache.spark.streaming.kafka010.JavaLocationStrategySuite
[INFO] Running org.apache.spark.streaming.kafka010.JavaKafkaRDDSuite
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.305 s - in org.apache.spark.streaming.kafka010.JavaKafkaRDDSuite
[INFO] Running org.apache.spark.streaming.kafka010.JavaConsumerStrategySuite
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 s - in org.apache.spark.streaming.kafka010.JavaConsumerStrategySuite
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M5:test (test) @ spark-streaming-kafka-0-10_2.12 ---
[INFO] Skipping execution of surefire because it has already been run for this configuration
[INFO] 
[INFO] --- scalatest-maven-plugin:2.0.0:test (test) @ spark-streaming-kafka-0-10_2.12 ---
Discovery starting.
Discovery completed in 709 milliseconds.
Run starting. Expected test count is: 21
KafkaRDDSuite:
- basic usage
- compacted topic
- iterator boundary conditions
- executor sorting
DirectKafkaStreamSuite:
- basic stream receiving with multiple topics and smallest starting offset
- pattern based subscription
- receiving from largest starting offset
- creating stream by offset
- offset recovery
- offset recovery from kafka
- Direct Kafka stream report input information
- maxMessagesPerPartition with backpressure disabled
- maxMessagesPerPartition with no lag
- maxMessagesPerPartition respects max rate
- using rate controller
- backpressure.initialRate should honor maxRatePerPartition
- use backpressure.initialRate with backpressure
- maxMessagesPerPartition with zero offset and rate equal to the specified minimum with default 1
KafkaDataConsumerSuite:
- KafkaDataConsumer reuse in case of same groupId and TopicPartition
- new KafkaDataConsumer instance in case of Task retry
- concurrent use of KafkaDataConsumer
Run completed in 1 minute, 8 seconds.
Total number of tests run: 21
Suites: completed 4, aborted 0
Tests: succeeded 21, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
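
For orientation, the DirectKafkaStreamSuite cases above exercise the direct stream API of spark-streaming-kafka-0-10. A minimal sketch of that API; the broker address, topic and group id are placeholders, not values from this log:

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010._

    val conf = new SparkConf().setMaster("local[2]").setAppName("direct-stream-sketch")
    val ssc = new StreamingContext(conf, Seconds(5))
    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "localhost:9092",           // assumed broker
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "sketch-group",             // assumed group id
      "auto.offset.reset"  -> "earliest"                  // the "smallest starting offset" case above
    )
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,                // cf. JavaLocationStrategySuite
      ConsumerStrategies.Subscribe[String, String](Seq("topicA"), kafkaParams)
    )
    stream.map(r => (r.key, r.value)).print()
    ssc.start()
    ssc.awaitTermination()
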
[INFO] 
[INFO] -------------< org.apache.spark:spark-sql-kafka-0-10_2.12 >-------------
[INFO] Building Kafka 0.10+ Source for Structured Streaming 3.1.3-SNAPSHOT [26/31]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-versions) @ spark-sql-kafka-0-10_2.12 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-no-duplicate-dependencies) @ spark-sql-kafka-0-10_2.12 ---
[INFO] 
[INFO] --- mvn-scalafmt_2.12:1.0.3:format (default) @ spark-sql-kafka-0-10_2.12 ---
[WARNING] format.skipSources set, ignoring main directories
[WARNING] format.skipTestSources set, ignoring validateOnly directories
[WARNING] No sources specified, skipping formatting
[INFO] 
[INFO] --- scala-maven-plugin:4.3.0:add-source (eclipse-add-source) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Add Source directory: /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/src/main/scala
[INFO] Add Test Source directory: /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/src/test/scala
[INFO] 
[INFO] --- maven-dependency-plugin:3.1.1:build-classpath (default-cli) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Dependencies classpath:
/home/jenkins/.m2/repository/org/apache/kerby/kerb-simplekdc/1.0.1/kerb-simplekdc-1.0.1.jar:/home/jenkins/.m2/repository/org/threeten/threeten-extra/1.5.0/threeten-extra-1.5.0.jar:/home/jenkins/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-log4j12/1.7.30/slf4j-log4j12-1.7.30.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-column/1.10.1/parquet-column-1.10.1.jar:/home/jenkins/.m2/repository/com/thoughtworks/paranamer/paranamer/2.8/paranamer-2.8.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.10.0/jackson-annotations-2.10.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-core/1.0.1/kerb-core-1.0.1.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/3.2.0/hadoop-mapreduce-client-core-3.2.0.jar:/home/jenkins/.m2/repository/com/google/code/findbugs/jsr305/3.0.0/jsr305-3.0.0.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-text/1.6/commons-text-1.6.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.10.0/jackson-core-2.10.0.jar:/home/jenkins/.m2/repository/org/tukaani/xz/1.5/xz-1.5.jar:/home/jenkins/.m2/repository/org/apache/htrace/htrace-core4/4.1.0-incubating/htrace-core4-4.1.0-incubating.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-annotations/3.2.0/hadoop-annotations-3.2.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-util/1.0.1/kerb-util-1.0.1.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/module/jackson-module-jaxb-annotations/2.10.0/jackson-module-jaxb-annotations-2.10.0.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-hadoop/1.10.1/parquet-hadoop-1.10.1.jar:/home/jenkins/.m2/repository/org/scala-lang/scala-library/2.12.10/scala-library-2.12.10.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-xdr/1.0.1/kerby-xdr-1.0.1.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/home/jenkins/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/home/jenkins/.m2/repository/org/lz4/lz4-java/1.7.1/lz4-java-1.7.1.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-hdfs-client/3.2.0/hadoop-hdfs-client-3.2.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-config/1.0.1/kerby-config-1.0.1.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-token-provider/target/scala-2.12/classes:/home/jenkins/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-pkix/1.0.1/kerby-pkix-1.0.1.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/common/tags/target/scala-2.12/classes:/home/jenkins/.m2/repository/com/nimbusds/nimbus-jose-jwt/4.41.1/nimbus-jose-jwt-4.41.1.jar:/home/jenkins/.m2/repository/javax/activation/activation/1.1.1/activation-1.1.1.jar:/home/jenkins/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-encoding/1.10.1/parquet-encoding-1.10.1.jar:/home/jenkins/.m2/repository/org/apache/httpcomponents/httpcore/4.4.12/httpcore-4.4.12.jar:/home/jenkins/.m2/repository/commons-beanutils/commons-beanutils/1.9.4/commons-beanutils-1.9.4.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-server/1.0.1/kerb-server-1.0.1.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-math3/3.4.1/commons-math3-3.4.1.jar:/home/jenkins/.m2/repository/org/apache/orc/orc-shims/1.5.13/orc-shims-1.5.13.jar:/home/jenkins/.m2/reposi
tory/org/apache/hadoop/hadoop-yarn-client/3.2.0/hadoop-yarn-client-3.2.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-client/1.0.1/kerb-client-1.0.1.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-client/3.2.0/hadoop-client-3.2.0.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/jaxrs/jackson-jaxrs-base/2.9.5/jackson-jaxrs-base-2.9.5.jar:/home/jenkins/.m2/repository/org/apache/avro/avro-mapred/1.8.2/avro-mapred-1.8.2-hadoop2.jar:/home/jenkins/.m2/repository/dnsjava/dnsjava/2.1.7/dnsjava-2.1.7.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-configuration2/2.1.1/commons-configuration2-2.1.1.jar:/home/jenkins/.m2/repository/net/minidev/json-smart/2.3/json-smart-2.3.jar:/home/jenkins/.m2/repository/org/apache/kerby/token-provider/1.0.1/token-provider-1.0.1.jar:/home/jenkins/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-framework/2.13.0/curator-framework-2.13.0.jar:/home/jenkins/.m2/repository/com/github/luben/zstd-jni/1.4.8-1/zstd-jni-1.4.8-1.jar:/home/jenkins/.m2/repository/org/apache/avro/avro/1.8.2/avro-1.8.2.jar:/home/jenkins/.m2/repository/net/minidev/accessors-smart/1.2/accessors-smart-1.2.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/jaxrs/jackson-jaxrs-json-provider/2.9.5/jackson-jaxrs-json-provider-2.9.5.jar:/home/jenkins/.m2/repository/com/google/re2j/re2j/1.1/re2j-1.1.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-util/1.0.1/kerby-util-1.0.1.jar:/home/jenkins/.m2/repository/org/apache/orc/orc-core/1.5.13/orc-core-1.5.13.jar:/home/jenkins/.m2/repository/com/squareup/okio/okio/1.14.0/okio-1.14.0.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/3.2.0/hadoop-mapreduce-client-jobclient-3.2.0.jar:/home/jenkins/.m2/repository/javax/xml/bind/jaxb-api/2.2.11/jaxb-api-2.2.11.jar:/home/jenkins/.m2/repository/javax/servlet/jsp/jsp-api/2.1/jsp-api-2.1.jar:/home/jenkins/.m2/repository/org/apache/hive/hive-storage-api/2.7.2/hive-storage-api-2.7.2.jar:/home/jenkins/.m2/repository/jakarta/activation/jakarta.activation-api/1.2.1/jakarta.activation-api-1.2.1.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-recipes/2.13.0/curator-recipes-2.13.0.jar:/home/jenkins/.m2/repository/org/apache/avro/avro-ipc/1.8.2/avro-ipc-1.8.2.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-api/1.7.30/slf4j-api-1.7.30.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.0/jackson-databind-2.10.0.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-common/3.2.0/hadoop-mapreduce-client-common-3.2.0.jar:/home/jenkins/.m2/repository/com/google/code/gson/gson/2.2.4/gson-2.2.4.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-common/1.10.1/parquet-common-1.10.1.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-client/2.13.0/curator-client-2.13.0.jar:/home/jenkins/.m2/repository/com/github/stephenc/jcip/jcip-annotations/1.0-1/jcip-annotations-1.0-1.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-admin/1.0.1/kerb-admin-1.0.1.jar:/home/jenkins/.m2/repository/commons-io/commons-io/2.5/commons-io-2.5.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-format/2.4.0/parquet-format-2.4.0.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-compress/1.21/commons-compress-1.21.jar:/home/jenkins/.m2/repository/io/airlift/aircompressor/0.10/aircompressor-0.10.jar:/home/jenkins/.m2/repos
itory/org/apache/kafka/kafka-clients/2.6.0/kafka-clients-2.6.0.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-lang3/3.10/commons-lang3-3.10.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-yarn-api/3.2.0/hadoop-yarn-api-3.2.0.jar:/home/jenkins/.m2/repository/org/xerial/snappy/snappy-java/1.1.8.2/snappy-java-1.1.8.2.jar:/home/jenkins/.m2/repository/jakarta/xml/bind/jakarta.xml.bind-api/2.3.2/jakarta.xml.bind-api-2.3.2.jar:/home/jenkins/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/home/jenkins/.m2/repository/com/fasterxml/woodstox/woodstox-core/5.0.3/woodstox-core-5.0.3.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-yarn-common/3.2.0/hadoop-yarn-common-3.2.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-asn1/1.0.1/kerby-asn1-1.0.1.jar:/home/jenkins/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/jenkins/.m2/repository/org/codehaus/woodstox/stax2-api/3.1.4/stax2-api-3.1.4.jar:/home/jenkins/.m2/repository/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-jackson/1.10.1/parquet-jackson-1.10.1.jar:/home/jenkins/.m2/repository/com/squareup/okhttp/okhttp/2.7.5/okhttp-2.7.5.jar:/home/jenkins/.m2/repository/org/apache/httpcomponents/httpclient/4.5.6/httpclient-4.5.6.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-identity/1.0.1/kerb-identity-1.0.1.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-common/3.2.0/hadoop-common-3.2.0.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-pool2/2.6.2/commons-pool2-2.6.2.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-auth/3.2.0/hadoop-auth-3.2.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-common/1.0.1/kerb-common-1.0.1.jar:/home/jenkins/.m2/repository/org/apache/orc/orc-mapreduce/1.5.13/orc-mapreduce-1.5.13.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-crypto/1.0.1/kerb-crypto-1.0.1.jar
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) @ spark-sql-kafka-0-10_2.12 ---
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.1:compile (default-compile) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Not compiling main sources
[INFO] 
[INFO] --- scala-maven-plugin:4.3.0:compile (scala-compile-first) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file: /home/jenkins/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.10__52.0-1.3.1_20191012T045515.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.12.10,1.6.0,null)
[INFO] Compiling 30 Scala sources and 1 Java source to /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/scala-2.12/classes ...
[INFO] Done compiling.
[INFO] compile in 57.3 s
[INFO] 
[INFO] --- maven-antrun-plugin:1.8:run (create-tmp-dir) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 13 resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.1:testCompile (default-testCompile) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Not compiling test sources
[INFO] 
[INFO] --- maven-dependency-plugin:3.1.1:build-classpath (generate-test-classpath) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Dependencies classpath:
/home/jenkins/.m2/repository/org/apache/kerby/kerb-simplekdc/1.0.1/kerb-simplekdc-1.0.1.jar:/home/jenkins/.m2/repository/net/jcip/jcip-annotations/1.0/jcip-annotations-1.0.jar:/home/jenkins/.m2/repository/com/google/crypto/tink/tink/1.6.0/tink-1.6.0.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/external/jakarta.inject/2.6.1/jakarta.inject-2.6.1.jar:/home/jenkins/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-freespec_2.12/3.2.3/scalatest-freespec_2.12-3.2.3.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-funsuite_2.12/3.2.3/scalatest-funsuite_2.12-3.2.3.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-column/1.10.1/parquet-column-1.10.1.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-plus/9.4.40.v20210413/jetty-plus-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/jakarta/servlet/jakarta.servlet-api/4.0.3/jakarta.servlet-api-4.0.3.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-funspec_2.12/3.2.3/scalatest-funspec_2.12-3.2.3.jar:/home/jenkins/.m2/repository/org/apache/arrow/arrow-memory-netty/2.0.0/arrow-memory-netty-2.0.0.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/3.2.0/hadoop-mapreduce-client-core-3.2.0.jar:/home/jenkins/.m2/repository/com/google/code/findbugs/jsr305/3.0.0/jsr305-3.0.0.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/media/jersey-media-jaxb/2.30/jersey-media-jaxb-2.30.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/module/jackson-module-scala_2.12/2.10.0/jackson-module-scala_2.12-2.10.0.jar:/home/jenkins/.m2/repository/org/json4s/json4s-core_2.12/3.7.0-M5/json4s-core_2.12-3.7.0-M5.jar:/home/jenkins/.m2/repository/org/apache/htrace/htrace-core4/4.1.0-incubating/htrace-core4-4.1.0-incubating.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/dataformat/jackson-dataformat-csv/2.10.2/jackson-dataformat-csv-2.10.2.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/module/jackson-module-jaxb-annotations/2.10.0/jackson-module-jaxb-annotations-2.10.0.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-http/9.4.40.v20210413/jetty-http-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/scala-lang/scala-library/2.12.10/scala-library-2.12.10.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-jndi/9.4.40.v20210413/jetty-jndi-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-xdr/1.0.1/kerby-xdr-1.0.1.jar:/home/jenkins/.m2/repository/org/apache/arrow/arrow-format/2.0.0/arrow-format-2.0.0.jar:/home/jenkins/.m2/repository/org/apache-extras/beanshell/bsh/2.0b6/bsh-2.0b6.jar:/home/jenkins/.m2/repository/net/bytebuddy/byte-buddy/1.10.13/byte-buddy-1.10.13.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-util/9.4.40.v20210413/jetty-util-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/com/twitter/chill-java/0.9.5/chill-java-0.9.5.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/launcher/target/scala-2.12/classes:/home/jenkins/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/home/jenkins/.m2/repository/org/scalatestplus/scalatestplus-scalacheck_2.12/3.1.0.0-RC2/scalatestplus-scalacheck_2.12-3.1.0.0-RC2.jar:/home/jenkins/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/core/target/scala-2.12/test-classes:/home/jenkins/.m2/repository/org/scalatest/scalatest-wordspec_2.12/3.2.3/scalatest-wordspec_2.12-3.2.3.jar:/home/jenkins/.m2/repository/com/yammer/metrics
/metrics-core/2.2.0/metrics-core-2.2.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-pkix/1.0.1/kerby-pkix-1.0.1.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-edge-driver/3.141.59/selenium-edge-driver-3.141.59.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-server/9.4.40.v20210413/jetty-server-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-exec/1.3/commons-exec-1.3.jar:/home/jenkins/.m2/repository/com/nimbusds/nimbus-jose-jwt/4.41.1/nimbus-jose-jwt-4.41.1.jar:/home/jenkins/.m2/repository/javax/activation/activation/1.1.1/activation-1.1.1.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-encoding/1.10.1/parquet-encoding-1.10.1.jar:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.5.7/zookeeper-3.5.7.jar:/home/jenkins/.m2/repository/org/scalactic/scalactic_2.12/3.2.3/scalactic_2.12-3.2.3.jar:/home/jenkins/.m2/repository/commons-beanutils/commons-beanutils/1.9.4/commons-beanutils-1.9.4.jar:/home/jenkins/.m2/repository/org/apache/ivy/ivy/2.4.0/ivy-2.4.0.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/common/network-shuffle/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/apache/kerby/kerb-server/1.0.1/kerb-server-1.0.1.jar:/home/jenkins/.m2/repository/com/twitter/chill_2.12/0.9.5/chill_2.12-0.9.5.jar:/home/jenkins/.m2/repository/net/razorvine/pyrolite/4.30/pyrolite-4.30.jar:/home/jenkins/.m2/repository/org/apache/orc/orc-shims/1.5.13/orc-shims-1.5.13.jar:/home/jenkins/.m2/repository/jakarta/validation/jakarta.validation-api/2.0.2/jakarta.validation-api-2.0.2.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/inject/jersey-hk2/2.30/jersey-hk2-2.30.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/core/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/json4s/json4s-ast_2.12/3.7.0-M5/json4s-ast_2.12-3.7.0-M5.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/common/sketch/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/scala-lang/modules/scala-java8-compat_2.12/0.9.1/scala-java8-compat_2.12-0.9.1.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-client/3.2.0/hadoop-client-3.2.0.jar:/home/jenkins/.m2/repository/org/apache/avro/avro-mapred/1.8.2/avro-mapred-1.8.2-hadoop2.jar:/home/jenkins/.m2/repository/dnsjava/dnsjava/2.1.7/dnsjava-2.1.7.jar:/home/jenkins/.m2/repository/org/scala-lang/scala-reflect/2.12.10/scala-reflect-2.12.10.jar:/home/jenkins/.m2/repository/org/jmock/jmock-testjar/2.12.0/jmock-testjar-2.12.0.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-configuration2/2.1.1/commons-configuration2-2.1.1.jar:/home/jenkins/.m2/repository/net/minidev/json-smart/2.3/json-smart-2.3.jar:/home/jenkins/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/home/jenkins/.m2/repository/org/codehaus/janino/commons-compiler/3.0.16/commons-compiler-3.0.16.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-framework/2.13.0/curator-framework-2.13.0.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-core/4.1.1/metrics-core-4.1.1.jar:/home/jenkins/.m2/repository/junit/junit/4.13.1/junit-4.13.1.jar:/home/jenkins/.m2/repository/org/apache/avro/avro/1.8.2/avro-1.8.2.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/jaxrs/jackson-jaxrs-json-provider/2.9.5/jackson-jaxrs-json-provider-2.9.5.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-json/4.1.1/metrics-json-4.1.1.jar:/home/jenkins/.m2/repository/io/netty/netty-common/4.1.45.Final/netty-common-4.1.45.Final.jar:/home/jenkins/.m2/reposi
tory/com/google/re2j/re2j/1.1/re2j-1.1.jar:/home/jenkins/.m2/repository/org/roaringbitmap/shims/0.9.0/shims-0.9.0.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-security/9.4.40.v20210413/jetty-security-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/osgi-resource-locator/1.0.3/osgi-resource-locator-1.0.3.jar:/home/jenkins/.m2/repository/net/bytebuddy/byte-buddy-agent/1.10.13/byte-buddy-agent-1.10.13.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-util/1.0.1/kerby-util-1.0.1.jar:/home/jenkins/.m2/repository/org/apache/orc/orc-core/1.5.13/orc-core-1.5.13.jar:/home/jenkins/.m2/repository/org/scala-sbt/test-interface/1.0/test-interface-1.0.jar:/home/jenkins/.m2/repository/com/squareup/okio/okio/1.14.0/okio-1.14.0.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-continuation/9.4.40.v20210413/jetty-continuation-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/ow2/asm/asm/7.1/asm-7.1.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/sql/core/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/home/jenkins/.m2/repository/javax/xml/bind/jaxb-api/2.2.11/jaxb-api-2.2.11.jar:/home/jenkins/.m2/repository/org/apache/kafka/kafka_2.12/2.6.0/kafka_2.12-2.6.0.jar:/home/jenkins/.m2/repository/javax/servlet/jsp/jsp-api/2.1/jsp-api-2.1.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-servlet/9.4.40.v20210413/jetty-servlet-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/apache/hive/hive-storage-api/2.7.2/hive-storage-api-2.7.2.jar:/home/jenkins/.m2/repository/jakarta/activation/jakarta.activation-api/1.2.1/jakarta.activation-api-1.2.1.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-api/1.7.30/slf4j-api-1.7.30.jar:/home/jenkins/.m2/repository/com/ning/compress-lzf/1.0.3/compress-lzf-1.0.3.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/containers/jersey-container-servlet/2.30/jersey-container-servlet-2.30.jar:/home/jenkins/.m2/repository/com/google/code/gson/gson/2.2.4/gson-2.2.4.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-remote-driver/3.141.59/selenium-remote-driver-3.141.59.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-common/1.10.1/parquet-common-1.10.1.jar:/home/jenkins/.m2/repository/com/github/stephenc/jcip/jcip-annotations/1.0-1/jcip-annotations-1.0-1.jar:/home/jenkins/.m2/repository/org/javassist/javassist/3.25.0-GA/javassist-3.25.0-GA.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-servlets/9.4.40.v20210413/jetty-servlets-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/containers/jersey-container-servlet-core/2.30/jersey-container-servlet-core-2.30.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/sql/catalyst/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/apache/commons/commons-compress/1.21/commons-compress-1.21.jar:/home/jenkins/.m2/repository/io/airlift/aircompressor/0.10/aircompressor-0.10.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-flatspec_2.12/3.2.3/scalatest-flatspec_2.12-3.2.3.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-yarn-api/3.2.0/hadoop-yarn-api-3.2.0.jar:/home/jenkins/.m2/repository/org/mockito/mockito-core/3.4.6/mockito-core-3.4.6.jar:/home/jenkins/.m2/repository/org/xerial/snappy/snappy-java/1.1.8.2/snappy-java-1.1.8.2.jar:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper-jute/3.5.7/zookeeper-jute-3.5.7.jar:/home/jenkins/.m2/repository/org/scalacheck/scalacheck_2.12/1.14.2/scalacheck_2.1
2-1.14.2.jar:/home/jenkins/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-yarn-common/3.2.0/hadoop-yarn-common-3.2.0.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/sql/catalyst/target/spark-catalyst_2.12-3.1.3-SNAPSHOT-tests.jar:/home/jenkins/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/jenkins/.m2/repository/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-compatible/3.2.3/scalatest-compatible-3.2.3.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-propspec_2.12/3.2.3/scalatest-propspec_2.12-3.2.3.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-jackson/1.10.1/parquet-jackson-1.10.1.jar:/home/jenkins/.m2/repository/org/apache/httpcomponents/httpclient/4.5.6/httpclient-4.5.6.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-identity/1.0.1/kerb-identity-1.0.1.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-common/3.2.0/hadoop-common-3.2.0.jar:/home/jenkins/.m2/repository/org/apache/orc/orc-mapreduce/1.5.13/orc-mapreduce-1.5.13.jar:/home/jenkins/.m2/repository/org/jmock/jmock-legacy/2.12.0/jmock-legacy-2.12.0.jar:/home/jenkins/.m2/repository/org/scalatestplus/scalatestplus-mockito_2.12/1.0.0-SNAP5/scalatestplus-mockito_2.12-1.0.0-SNAP5.jar:/home/jenkins/.m2/repository/org/threeten/threeten-extra/1.5.0/threeten-extra-1.5.0.jar:/home/jenkins/.m2/repository/com/clearspring/analytics/stream/2.9.6/stream-2.9.6.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/hk2-locator/2.6.1/hk2-locator-2.6.1.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-log4j12/1.7.30/slf4j-log4j12-1.7.30.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/hk2-utils/2.6.1/hk2-utils-2.6.1.jar:/home/jenkins/.m2/repository/com/thoughtworks/paranamer/paranamer/2.8/paranamer-2.8.jar:/home/jenkins/.m2/repository/io/netty/netty-all/4.1.51.Final/netty-all-4.1.51.Final.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.10.0/jackson-annotations-2.10.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-core/1.0.1/kerb-core-1.0.1.jar:/home/jenkins/.m2/repository/io/netty/netty-buffer/4.1.45.Final/netty-buffer-4.1.45.Final.jar:/home/jenkins/.m2/repository/org/objenesis/objenesis/2.6/objenesis-2.6.jar:/home/jenkins/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-text/1.6/commons-text-1.6.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.10.0/jackson-core-2.10.0.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-diagrams_2.12/3.2.3/scalatest-diagrams_2.12-3.2.3.jar:/home/jenkins/.m2/repository/jakarta/ws/rs/jakarta.ws.rs-api/2.1.6/jakarta.ws.rs-api-2.1.6.jar:/home/jenkins/.m2/repository/org/tukaani/xz/1.5/xz-1.5.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/external/aopalliance-repackaged/2.6.1/aopalliance-repackaged-2.6.1.jar:/home/jenkins/.m2/repository/org/apache/arrow/arrow-memory-core/2.0.0/arrow-memory-core-2.0.0.jar:/home/jenkins/.m2/repository/org/jmock/jmock-junit4/2.12.0/jmock-junit4-2.12.0.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-annotations/3.2.0/hadoop-annotations-3.2.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-util/1.0.1/kerb-util-1.0.1.jar:/home/jenkins/.m2/repository/io/netty/netty-codec/4.1.45.Final/netty-codec-4.1.45.Final.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-hadoop/1.10.1/parquet-h
adoop-1.10.1.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/module/jackson-module-paranamer/2.10.0/jackson-module-paranamer-2.10.0.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/common/tags/target/scala-2.12/test-classes:/home/jenkins/.m2/repository/com/esotericsoftware/kryo-shaded/4.0.2/kryo-shaded-4.0.2.jar:/home/jenkins/.m2/repository/com/esotericsoftware/minlog/1.3.0/minlog-1.3.0.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/home/jenkins/.m2/repository/net/sf/py4j/py4j/0.10.9/py4j-0.10.9.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest_2.12/3.2.3/scalatest_2.12-3.2.3.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-util-ajax/9.4.40.v20210413/jetty-util-ajax-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-mustmatchers_2.12/3.2.3/scalatest-mustmatchers_2.12-3.2.3.jar:/home/jenkins/.m2/repository/com/squareup/okhttp3/okhttp/3.11.0/okhttp-3.11.0.jar:/home/jenkins/.m2/repository/io/netty/netty-resolver/4.1.45.Final/netty-resolver-4.1.45.Final.jar:/home/jenkins/.m2/repository/org/slf4j/jul-to-slf4j/1.7.30/jul-to-slf4j-1.7.30.jar:/home/jenkins/.m2/repository/org/json4s/json4s-scalap_2.12/3.7.0-M5/json4s-scalap_2.12-3.7.0-M5.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-ie-driver/3.141.59/selenium-ie-driver-3.141.59.jar:/home/jenkins/.m2/repository/org/lz4/lz4-java/1.7.1/lz4-java-1.7.1.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-hdfs-client/3.2.0/hadoop-hdfs-client-3.2.0.jar:/home/jenkins/.m2/repository/com/google/flatbuffers/flatbuffers-java/1.9.0/flatbuffers-java-1.9.0.jar:/home/jenkins/.m2/repository/org/hamcrest/hamcrest/2.1/hamcrest-2.1.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/sql/core/target/spark-sql_2.12-3.1.3-SNAPSHOT-tests.jar:/home/jenkins/.m2/repository/org/jmock/jmock/2.12.0/jmock-2.12.0.jar:/home/jenkins/.m2/repository/org/json4s/json4s-jackson_2.12/3.7.0-M5/json4s-jackson_2.12-3.7.0-M5.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-config/1.0.1/kerby-config-1.0.1.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-token-provider/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/scala-lang/modules/scala-parser-combinators_2.12/1.1.2/scala-parser-combinators_2.12-1.1.2.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-graphite/4.1.1/metrics-graphite-4.1.1.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/common/unsafe/target/scala-2.12/classes:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/common/tags/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/codehaus/janino/janino/3.0.16/janino-3.0.16.jar:/home/jenkins/.m2/repository/io/netty/netty-transport/4.1.45.Final/netty-transport-4.1.45.Final.jar:/home/jenkins/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar:/home/jenkins/.m2/repository/com/google/guava/guava/14.0.1/guava-14.0.1.jar:/home/jenkins/.m2/repository/org/apache/httpcomponents/httpcore/4.4.12/httpcore-4.4.12.jar:/home/jenkins/.m2/repository/com/typesafe/scala-logging/scala-logging_2.12/3.9.2/scala-logging_2.12-3.9.2.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-xml/9.4.40.v20210413/jetty-xml-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-io/9.4.40.v20210413/jetty-io-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/apache/arrow/arrow-vector/2.0.0/arrow-vector-2.0.0.jar:/home/jenkins/.m2/repository/javax/servlet/javax
.servlet-api/3.1.0/javax.servlet-api-3.1.0.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-math3/3.4.1/commons-math3-3.4.1.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-yarn-client/3.2.0/hadoop-yarn-client-3.2.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-client/1.0.1/kerb-client-1.0.1.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-core_2.12/3.2.3/scalatest-core_2.12-3.2.3.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-token-provider/target/scala-2.12/test-classes:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-webapp/9.4.40.v20210413/jetty-webapp-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/core/jersey-common/2.30/jersey-common-2.30.jar:/home/jenkins/.m2/repository/oro/oro/2.0.8/oro-2.0.8.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/jaxrs/jackson-jaxrs-base/2.9.5/jackson-jaxrs-base-2.9.5.jar:/home/jenkins/.m2/repository/org/jmock/jmock-imposters/2.12.0/jmock-imposters-2.12.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/token-provider/1.0.1/token-provider-1.0.1.jar:/home/jenkins/.m2/repository/io/netty/netty-handler/4.1.45.Final/netty-handler-4.1.45.Final.jar:/home/jenkins/.m2/repository/com/novocode/junit-interface/0.11/junit-interface-0.11.jar:/home/jenkins/.m2/repository/org/apache/xbean/xbean-asm7-shaded/4.15/xbean-asm7-shaded-4.15.jar:/home/jenkins/.m2/repository/cglib/cglib/3.2.8/cglib-3.2.8.jar:/home/jenkins/.m2/repository/org/antlr/antlr4-runtime/4.8-1/antlr4-runtime-4.8-1.jar:/home/jenkins/.m2/repository/com/github/luben/zstd-jni/1.4.8-1/zstd-jni-1.4.8-1.jar:/home/jenkins/.m2/repository/org/hamcrest/hamcrest-library/1.3/hamcrest-library-1.3.jar:/home/jenkins/.m2/repository/net/minidev/accessors-smart/1.2/accessors-smart-1.2.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/hk2-api/2.6.1/hk2-api-2.6.1.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-chrome-driver/3.141.59/selenium-chrome-driver-3.141.59.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-opera-driver/3.141.59/selenium-opera-driver-3.141.59.jar:/home/jenkins/.m2/repository/jakarta/annotation/jakarta.annotation-api/1.3.5/jakarta.annotation-api-1.3.5.jar:/home/jenkins/.m2/repository/org/apache/yetus/audience-annotations/0.5.0/audience-annotations-0.5.0.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-jvm/4.1.1/metrics-jvm-4.1.1.jar:/home/jenkins/.m2/repository/org/scalatestplus/scalatestplus-selenium_2.12/1.0.0-SNAP5/scalatestplus-selenium_2.12-1.0.0-SNAP5.jar:/home/jenkins/.m2/repository/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/3.2.0/hadoop-mapreduce-client-jobclient-3.2.0.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-support/3.141.59/selenium-support-3.141.59.jar:/home/jenkins/.m2/repository/org/scala-lang/modules/scala-xml_2.12/1.2.0/scala-xml_2.12-1.2.0.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-recipes/2.13.0/curator-recipes-2.13.0.jar:/home/jenkins/.m2/repository/org/apache/avro/avro-ipc/1.8.2/avro-ipc-1.8.2.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/datatype/jackson-datatype-jdk8/2.10.2/jackson-datatype-jdk8-2.10.2.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.0/jackson-databind-2.10.0.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-common/3.2.0/hadoop-mapreduce-client-common-3.2.0.jar:/home/jenkins/.m2/repository/org/apache/curator/cura
tor-client/2.13.0/curator-client-2.13.0.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-client/9.4.40.v20210413/jetty-client-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-admin/1.0.1/kerb-admin-1.0.1.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/common/kvstore/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/apache/commons/commons-crypto/1.1.0/commons-crypto-1.1.0.jar:/home/jenkins/.m2/repository/commons-io/commons-io/2.5/commons-io-2.5.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-format/2.4.0/parquet-format-2.4.0.jar:/home/jenkins/.m2/repository/net/sf/jopt-simple/jopt-simple/3.2/jopt-simple-3.2.jar:/home/jenkins/.m2/repository/org/apache/kafka/kafka-clients/2.6.0/kafka-clients-2.6.0.jar:/home/jenkins/.m2/repository/io/netty/netty-transport-native-unix-common/4.1.45.Final/netty-transport-native-unix-common-4.1.45.Final.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-lang3/3.10/commons-lang3-3.10.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-refspec_2.12/3.2.3/scalatest-refspec_2.12-3.2.3.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-api/3.141.59/selenium-api-3.141.59.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-java/3.141.59/selenium-java-3.141.59.jar:/home/jenkins/.m2/repository/io/netty/netty-transport-native-epoll/4.1.45.Final/netty-transport-native-epoll-4.1.45.Final.jar:/home/jenkins/.m2/repository/org/roaringbitmap/RoaringBitmap/0.9.0/RoaringBitmap-0.9.0.jar:/home/jenkins/.m2/repository/jakarta/xml/bind/jakarta.xml.bind-api/2.3.2/jakarta.xml.bind-api-2.3.2.jar:/home/jenkins/.m2/repository/org/scala-lang/modules/scala-collection-compat_2.12/2.1.6/scala-collection-compat_2.12-2.1.6.jar:/home/jenkins/.m2/repository/com/fasterxml/woodstox/woodstox-core/5.0.3/woodstox-core-5.0.3.jar:/home/jenkins/.m2/repository/com/univocity/univocity-parsers/2.9.1/univocity-parsers-2.9.1.jar:/home/jenkins/.m2/repository/org/slf4j/jcl-over-slf4j/1.7.30/jcl-over-slf4j-1.7.30.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-asn1/1.0.1/kerby-asn1-1.0.1.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/core/jersey-server/2.30/jersey-server-2.30.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-jmx/4.1.1/metrics-jmx-4.1.1.jar:/home/jenkins/.m2/repository/org/codehaus/woodstox/stax2-api/3.1.4/stax2-api-3.1.4.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/core/jersey-client/2.30/jersey-client-2.30.jar:/home/jenkins/.m2/repository/com/google/code/findbugs/annotations/3.0.1/annotations-3.0.1.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-featurespec_2.12/3.2.3/scalatest-featurespec_2.12-3.2.3.jar:/home/jenkins/.m2/repository/com/squareup/okhttp/okhttp/2.7.5/okhttp-2.7.5.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-firefox-driver/3.141.59/selenium-firefox-driver-3.141.59.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-proxy/9.4.40.v20210413/jetty-proxy-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-safari-driver/3.141.59/selenium-safari-driver-3.141.59.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-pool2/2.6.2/commons-pool2-2.6.2.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-shouldmatchers_2.12/3.2.3/scalatest-shouldmatchers_2.12-3.2.3.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-auth/3.2.0/hadoop-auth-3.2.0.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-minikdc/3.2.0/hadoop-minikdc-3.2.0.jar:/home/jenkins/.m2/repository
/org/apache/kerby/kerb-common/1.0.1/kerb-common-1.0.1.jar:/home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/common/network-common/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/apache/kerby/kerb-crypto/1.0.1/kerb-crypto-1.0.1.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-matchers-core_2.12/3.2.3/scalatest-matchers-core_2.12-3.2.3.jar
[INFO] 
[INFO] --- scala-maven-plugin:4.3.0:testCompile (scala-test-compile-first) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file: /home/jenkins/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.10__52.0-1.3.1_20191012T045515.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.12.10,1.6.0,null)
[INFO] Compiling 21 Scala sources to /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/scala-2.12/test-classes ...
[INFO] Done compiling.
[INFO] compile in 285.2 s
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M5:test (default-test) @ spark-sql-kafka-0-10_2.12 ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M5:test (test) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Skipping execution of surefire because it has already been run for this configuration
[INFO] 
[INFO] --- scalatest-maven-plugin:2.0.0:test (test) @ spark-sql-kafka-0-10_2.12 ---
Discovery starting.
Discovery completed in 1 second, 249 milliseconds.
Run starting. Expected test count is: 411
KafkaSourceProviderSuite:
- batch mode - options should be handled as case-insensitive
- micro-batch mode - options should be handled as case-insensitive
- continuous mode - options should be handled as case-insensitive
ConsumerStrategySuite:
- createAdmin must create admin properly
- AssignStrategy.assignedTopicPartitions must give back all assigned
- AssignStrategy.assignedTopicPartitions must skip invalid partitions
- SubscribeStrategy.assignedTopicPartitions must give back all assigned
- SubscribePatternStrategy.assignedTopicPartitions must give back all assigned
FetchedDataPoolSuite:
- acquire fetched data from multiple keys
- continuous use of fetched data from single key
- multiple tasks referring same key continuously using fetched data
- evict idle fetched data
- invalidate key
KafkaRelationSuiteV1:
- explicit earliest to latest offsets
- default starting and ending offsets
- explicit offsets
- default starting and ending offsets with headers
- timestamp provided for starting and ending
- timestamp provided for starting, offset provided for ending
- timestamp provided for ending, offset provided for starting
- timestamp provided for starting, ending not provided
- timestamp provided for ending, starting not provided
- no matched offset for timestamp - startingOffsets
- no matched offset for timestamp - endingOffsets
- reuse same dataframe in query
- test late binding start offsets
- bad batch query options
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-30656: minPartitions
- V1 Source is used when set through SQLConf
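
KafkaRelationSuiteV1 above covers the batch read path of this module. A minimal batch query with explicit earliest-to-latest offsets, assuming a local broker and a topic name that are not taken from this log:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().master("local[*]").appName("kafka-batch-sketch").getOrCreate()
    val df = spark.read
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // assumed
      .option("subscribe", "topicA")                       // assumed
      .option("startingOffsets", "earliest")
      .option("endingOffsets", "latest")
      .load()
    // Fixed source schema (cf. the "Kafka column types" cases below): key, value,
    // topic, partition, offset, timestamp, timestampType, plus headers when enabled.
    df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)").show()
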
KafkaMicroBatchV2SourceSuite:
- cannot stop Kafka stream
- assign from latest offsets (failOnDataLoss: true)
- assign from earliest offsets (failOnDataLoss: true)
- assign from specific offsets (failOnDataLoss: true)
- assign from specific timestamps (failOnDataLoss: true)
- subscribing topic by name from latest offsets (failOnDataLoss: true)
- subscribing topic by name from earliest offsets (failOnDataLoss: true)
- subscribing topic by name from specific offsets (failOnDataLoss: true)
- subscribing topic by name from specific timestamps (failOnDataLoss: true)
- subscribing topic by pattern from latest offsets (failOnDataLoss: true)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: true)
- assign from latest offsets (failOnDataLoss: false)
- assign from earliest offsets (failOnDataLoss: false)
- assign from specific offsets (failOnDataLoss: false)
- assign from specific timestamps (failOnDataLoss: false)
- subscribing topic by name from latest offsets (failOnDataLoss: false)
- subscribing topic by name from earliest offsets (failOnDataLoss: false)
- subscribing topic by name from specific offsets (failOnDataLoss: false)
- subscribing topic by name from specific timestamps (failOnDataLoss: false)
- subscribing topic by pattern from latest offsets (failOnDataLoss: false)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: false)
- bad source options
- unsupported kafka configs
- get offsets from case insensitive parameters
- Kafka column types
- (de)serialization of initial offsets
- SPARK-26718 Rate limit set to Long.Max should not overflow integer during end offset calculation
- maxOffsetsPerTrigger
- input row metrics
- subscribing topic by pattern with topic deletions
- subscribe topic by pattern with topic recreation between batches
- ensure that initial offset are written with an extra byte in the beginning (SPARK-19517)
- deserialization of initial offset written by Spark 2.1.0 (SPARK-19517)
- deserialization of initial offset written by future version
- KafkaSource with watermark
- delete a topic when a Spark job is running
- SPARK-22956: currentPartitionOffsets should be set when no new data comes in
- allow group.id prefix
- allow group.id override
- ensure stream-stream self-join generates only one offset in log and correct metrics
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-25495: FetchedData.reset should reset all fields
- SPARK-27494: read kafka record containing null key/values.
- SPARK-30656: minPartitions
- V2 Source is used by default
- minPartitions is supported
- default config of includeHeader doesn't break existing query from Spark 2.4
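
The micro-batch suites (this V2 run, with V1 and WithAdmin variants repeating the same matrix below) drive the streaming read path; failOnDataLoss, startingOffsets, maxOffsetsPerTrigger and minPartitions in the test names are all reader options. A streaming counterpart of the batch sketch above, reusing its spark session; the values are again assumptions:

    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribePattern", "topic.*")     // pattern subscription, as tested above
      .option("startingOffsets", "latest")
      .option("failOnDataLoss", "false")         // tolerate deleted topics or aged-out offsets
      .option("maxOffsetsPerTrigger", "10000")   // cap the records read per micro-batch
      .option("minPartitions", "6")              // split large offset ranges (cf. SPARK-30656)
      .load()
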
KafkaRelationSuiteWithAdminV2:
- explicit earliest to latest offsets
- default starting and ending offsets
- explicit offsets
- default starting and ending offsets with headers
- timestamp provided for starting and ending
- timestamp provided for starting, offset provided for ending
- timestamp provided for ending, offset provided for starting
- timestamp provided for starting, ending not provided
- timestamp provided for ending, starting not provided
- no matched offset for timestamp - startingOffsets
- no matched offset for timestamp - endingOffsets
- reuse same dataframe in query
- test late binding start offsets
- bad batch query options
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-30656: minPartitions
- V2 Source is used when set through SQLConf
KafkaSinkMicroBatchStreamingSuite:
- streaming - write to kafka with topic field
- streaming - write w/o topic field, with topic option
- streaming - topic field and topic option
- streaming - write data with bad schema
- streaming - write data with valid schema but wrong types
- streaming - write to non-existing topic
- streaming - exception on config serializer
- streaming - sink progress is produced
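
KafkaSinkMicroBatchStreamingSuite above tests the write side: the sink needs a string or binary value column, and the destination can come either from a topic column in the data or from the topic option. A hedged sketch that feeds the sink from the built-in rate source; broker, topic and checkpoint path are assumptions:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().master("local[*]").appName("kafka-sink-sketch").getOrCreate()
    val query = spark.readStream
      .format("rate").load()                                 // toy source: (timestamp, value) rows
      .selectExpr("CAST(value AS STRING) AS value")          // the sink requires a value column
      .writeStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")   // assumed
      .option("topic", "outputTopic")                        // topic option, not a topic column
      .option("checkpointLocation", "/tmp/kafka-sink-ckpt")  // assumed path
      .start()
    query.awaitTermination()
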
KafkaContinuousSourceStressForDontFailOnDataLossSuite:
- stress test for failOnDataLoss=false
KafkaContinuousSinkSuite:
- streaming - write to kafka with topic field
- streaming - write w/o topic field, with topic option
- streaming - topic field and topic option
- streaming - write data with bad schema
- streaming - write data with valid schema but wrong types
- streaming - write to non-existing topic
- streaming - exception on config serializer
- generic - write big data with small producer buffer
KafkaOffsetRangeCalculatorSuite:
- with no minPartition: N TopicPartitions to N offset ranges
- with no minPartition: empty ranges ignored
- with minPartition = 3: N TopicPartitions to N offset ranges
- with minPartition = 4: 1 TopicPartition to N offset ranges
- with minPartition = 3: N skewed TopicPartitions to M offset ranges
- with minPartition = 4: SPARK-30656: ignore empty ranges and split the rest
- with minPartition = 3: SPARK-30656: N very skewed TopicPartitions to M offset ranges
- with minPartition = 1: SPARK-30656: minPartitions less than the length of topic partitions
- with minPartition = 3: range inexact multiple of minPartitions
- with minPartition = 3: empty ranges ignored
- with minPartition = 6: SPARK-28489: never drop offsets
KafkaMicroBatchV1SourceWithAdminSuite:
- cannot stop Kafka stream
- assign from latest offsets (failOnDataLoss: true)
- assign from earliest offsets (failOnDataLoss: true)
- assign from specific offsets (failOnDataLoss: true)
- assign from specific timestamps (failOnDataLoss: true)
- subscribing topic by name from latest offsets (failOnDataLoss: true)
- subscribing topic by name from earliest offsets (failOnDataLoss: true)
- subscribing topic by name from specific offsets (failOnDataLoss: true)
- subscribing topic by name from specific timestamps (failOnDataLoss: true)
- subscribing topic by pattern from latest offsets (failOnDataLoss: true)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: true)
- assign from latest offsets (failOnDataLoss: false)
- assign from earliest offsets (failOnDataLoss: false)
- assign from specific offsets (failOnDataLoss: false)
- assign from specific timestamps (failOnDataLoss: false)
- subscribing topic by name from latest offsets (failOnDataLoss: false)
- subscribing topic by name from earliest offsets (failOnDataLoss: false)
- subscribing topic by name from specific offsets (failOnDataLoss: false)
- subscribing topic by name from specific timestamps (failOnDataLoss: false)
- subscribing topic by pattern from latest offsets (failOnDataLoss: false)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: false)
- bad source options
- unsupported kafka configs
- get offsets from case insensitive parameters
- Kafka column types
- (de)serialization of initial offsets
- SPARK-26718 Rate limit set to Long.Max should not overflow integer during end offset calculation
- maxOffsetsPerTrigger
- input row metrics
- subscribing topic by pattern with topic deletions
- subscribe topic by pattern with topic recreation between batches
- ensure that initial offset are written with an extra byte in the beginning (SPARK-19517)
- deserialization of initial offset written by Spark 2.1.0 (SPARK-19517)
- deserialization of initial offset written by future version
- KafkaSource with watermark
- delete a topic when a Spark job is running
- SPARK-22956: currentPartitionOffsets should be set when no new data comes in
- allow group.id prefix
- allow group.id override
- ensure stream-stream self-join generates only one offset in log and correct metrics
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-25495: FetchedData.reset should reset all fields
- SPARK-27494: read kafka record containing null key/values.
- SPARK-30656: minPartitions
- V1 Source is used when disabled through SQLConf
InternalKafkaConsumerPoolSuite:
- basic multiple borrows and returns for single key
- basic borrow and return for multiple keys
- borrow more than soft max capacity from pool which is neither free space nor idle object
- borrow more than soft max capacity from pool frees up idle objects automatically
- evicting idle objects on background
KafkaDataConsumerSuite:
- SPARK-19886: Report error cause correctly in reportDataLoss
- new KafkaDataConsumer instance in case of Task retry
- same KafkaDataConsumer instance in case of same token
- new KafkaDataConsumer instance in case of token renewal
- SPARK-23623: concurrent use of KafkaDataConsumer
- SPARK-25151 Handles multiple tasks in executor fetching same (topic, partition) pair
- SPARK-25151 Handles multiple tasks in executor fetching same (topic, partition) pair and same offset (edge-case) - data in use
- SPARK-25151 Handles multiple tasks in executor fetching same (topic, partition) pair and same offset (edge-case) - data not in use
KafkaRelationSuiteV2:
- explicit earliest to latest offsets
- default starting and ending offsets
- explicit offsets
- default starting and ending offsets with headers
- timestamp provided for starting and ending
- timestamp provided for starting, offset provided for ending
- timestamp provided for ending, offset provided for starting
- timestamp provided for starting, ending not provided
- timestamp provided for ending, starting not provided
- no matched offset for timestamp - startingOffsets
- no matched offset for timestamp - endingOffsets
- reuse same dataframe in query
- test late binding start offsets
- bad batch query options
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-30656: minPartitions
- V2 Source is used when set through SQLConf
KafkaMicroBatchV1SourceSuite:
- cannot stop Kafka stream
- assign from latest offsets (failOnDataLoss: true)
- assign from earliest offsets (failOnDataLoss: true)
- assign from specific offsets (failOnDataLoss: true)
- assign from specific timestamps (failOnDataLoss: true)
- subscribing topic by name from latest offsets (failOnDataLoss: true)
- subscribing topic by name from earliest offsets (failOnDataLoss: true)
- subscribing topic by name from specific offsets (failOnDataLoss: true)
- subscribing topic by name from specific timestamps (failOnDataLoss: true)
- subscribing topic by pattern from latest offsets (failOnDataLoss: true)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: true)
- assign from latest offsets (failOnDataLoss: false)
- assign from earliest offsets (failOnDataLoss: false)
- assign from specific offsets (failOnDataLoss: false)
- assign from specific timestamps (failOnDataLoss: false)
- subscribing topic by name from latest offsets (failOnDataLoss: false)
- subscribing topic by name from earliest offsets (failOnDataLoss: false)
- subscribing topic by name from specific offsets (failOnDataLoss: false)
- subscribing topic by name from specific timestamps (failOnDataLoss: false)
- subscribing topic by pattern from latest offsets (failOnDataLoss: false)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: false)
- bad source options
- unsupported kafka configs
- get offsets from case insensitive parameters
- Kafka column types
- (de)serialization of initial offsets
- SPARK-26718 Rate limit set to Long.Max should not overflow integer during end offset calculation
- maxOffsetsPerTrigger
- input row metrics
- subscribing topic by pattern with topic deletions
- subscribe topic by pattern with topic recreation between batches
- ensure that initial offset are written with an extra byte in the beginning (SPARK-19517)
- deserialization of initial offset written by Spark 2.1.0 (SPARK-19517)
- deserialization of initial offset written by future version
- KafkaSource with watermark
- delete a topic when a Spark job is running
- SPARK-22956: currentPartitionOffsets should be set when no new data comes in
- allow group.id prefix
- allow group.id override
- ensure stream-stream self-join generates only one offset in log and correct metrics
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-25495: FetchedData.reset should reset all fields
- SPARK-27494: read kafka record containing null key/values.
- SPARK-30656: minPartitions
- V1 Source is used when disabled through SQLConf
KafkaRelationSuiteWithAdminV1:
- explicit earliest to latest offsets
- default starting and ending offsets
- explicit offsets
- default starting and ending offsets with headers
- timestamp provided for starting and ending
- timestamp provided for starting, offset provided for ending
- timestamp provided for ending, offset provided for starting
- timestamp provided for starting, ending not provided
- timestamp provided for ending, starting not provided
- no matched offset for timestamp - startingOffsets
- no matched offset for timestamp - endingOffsets
- reuse same dataframe in query
- test late binding start offsets
- bad batch query options
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-30656: minPartitions
- V1 Source is used when set through SQLConf
JsonUtilsSuite:
- parsing partitions
- parsing partitionOffsets
KafkaDelegationTokenSuite:
Java config name: /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-6594e58d-e5f1-4fcc-a150-2549e20c0169/1638531097408/krb5.conf
Loaded from Java config
>>> KdcAccessibility: reset
>>> KdcAccessibility: reset
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): zookeeper
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 66; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): zookeeper
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 74; type: 16
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=148
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=148
>>>DEBUG: TCPClient reading 163 bytes
>>> KrbKdcReq send: #bytes read=163
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:39073
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Fri Dec 03 03:31:37 PST 2021 1638531097000
	 suSec is 100
	 error code is 25
	 error Message is Additional pre-authentication required
	 sname is zookeeper/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=235
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=235
>>>DEBUG: TCPClient reading 545 bytes
>>> KrbKdcReq send: #bytes read=545
>>> KdcAccessibility: remove localhost:39073
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply zookeeper/localhost
Java config name: /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-6594e58d-e5f1-4fcc-a150-2549e20c0169/1638531097408/krb5.conf
Loaded from Java config
>>> KdcAccessibility: reset
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): zkclient
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 65; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): zkclient
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 73; type: 16
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=147
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=147
>>>DEBUG: TCPClient reading 162 bytes
>>> KrbKdcReq send: #bytes read=162
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:39073
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Fri Dec 03 03:31:37 PST 2021 1638531097000
	 suSec is 100
	 error code is 25
	 error Message is Additional pre-authentication required
	 sname is zkclient/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=234
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=234
>>>DEBUG: TCPClient reading 543 bytes
>>> KrbKdcReq send: #bytes read=543
>>> KdcAccessibility: remove localhost:39073
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply zkclient/localhost
Java config name: /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-6594e58d-e5f1-4fcc-a150-2549e20c0169/1638531097408/krb5.conf
Loaded from Java config
>>> KdcAccessibility: reset
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=147
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=147
>>>DEBUG: TCPClient reading 162 bytes
>>> KrbKdcReq send: #bytes read=162
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:39073
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Fri Dec 03 03:31:38 PST 2021 1638531098000
	 suSec is 100
	 error code is 25
	 error Message is Additional pre-authentication required
	 sname is zkclient/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=234
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=234
>>>DEBUG: TCPClient reading 543 bytes
>>> KrbKdcReq send: #bytes read=543
>>> KdcAccessibility: remove localhost:39073
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply zkclient/localhost
Found KeyTab /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-69270809-043a-418c-8f34-8b0737e1c95c/zookeeper.keytab for zookeeper/localhost@EXAMPLE.COM
Found ticket for zookeeper/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:37 PDT 2024
Java config name: /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-6594e58d-e5f1-4fcc-a150-2549e20c0169/1638531097408/krb5.conf
Loaded from Java config
Found ticket for zkclient/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:38 PDT 2024
>>> KdcAccessibility: reset
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=147
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=147
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for zkclient/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:38 PDT 2024
Service ticket not found in the subject
>>>DEBUG: TCPClient reading 162 bytes
>>> KrbKdcReq send: #bytes read=162
>>> Credentials serviceCredsSingle: same realm
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:39073
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Fri Dec 03 03:31:38 PST 2021 1638531098000
	 suSec is 100
	 error code is 25
	 error Message is Additional pre-authentication required
	 sname is zkclient/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tgs_enctypes: 17.
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> CksumType: sun.security.krb5.internal.crypto.HmacSha1Aes128CksumType
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=234
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=234
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=578
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=578
>>>DEBUG: TCPClient reading 543 bytes
>>> KrbKdcReq send: #bytes read=543
>>> KdcAccessibility: remove localhost:39073
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply zkclient/localhost
Found KeyTab /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-69270809-043a-418c-8f34-8b0737e1c95c/zookeeper.keytab for zookeeper/localhost@EXAMPLE.COM
Found ticket for zookeeper/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:37 PDT 2024
Found ticket for zkclient/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:38 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for zkclient/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:38 PDT 2024
Service ticket not found in the subject
>>> Credentials serviceCredsSingle: same realm
default etypes for default_tgs_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> CksumType: sun.security.krb5.internal.crypto.HmacSha1Aes128CksumType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=578
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=578
>>>DEBUG: TCPClient reading 540 bytes
>>>DEBUG: TCPClient reading 540 bytes
>>> KrbKdcReq send: #bytes read=540
>>> KrbKdcReq send: #bytes read=540
>>> KdcAccessibility: remove localhost:39073
>>> KdcAccessibility: remove localhost:39073
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> TGS credentials serviceCredsSingle:
>>> DEBUG: ----Credentials----
	client: zkclient/localhost@EXAMPLE.COM
	server: zookeeper/localhost@EXAMPLE.COM
	ticket: sname: zookeeper/localhost@EXAMPLE.COM
	endTime: 1724931098000
        ----Credentials end----
>>> TGS credentials serviceCredsSingle:
>>> DEBUG: ----Credentials----
	client: zkclient/localhost@EXAMPLE.COM
	server: zookeeper/localhost@EXAMPLE.COM
	ticket: sname: zookeeper/localhost@EXAMPLE.COM
	endTime: 1724931098000
        ----Credentials end----
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 41323747
Krb5Context setting peerSeqNumber to: 41323747
Krb5Context setting mySeqNumber to: 581512453
Krb5Context setting peerSeqNumber to: 581512453
Created InitSecContextToken:
0000: 01 00 6E 82 01 E8 30 82   01 E4 A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 FF  ................
0020: 61 81 FC 30 81 F9 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 21 30 1F A0 03  XAMPLE.COM.!0...
0040: 02 01 03 A1 18 30 16 1B   09 7A 6F 6F 6B 65 65 70  .....0...zookeep
0050: 65 72 1B 09 6C 6F 63 61   6C 68 6F 73 74 A3 81 BF  er..localhost...
0060: 30 81 BC A0 03 02 01 11   A1 03 02 01 01 A2 81 AF  0...............
0070: 04 81 AC B0 D5 DA FA 45   8B D9 20 78 E2 8E 94 61  .......E.. x...a
0080: E8 E5 43 B5 3D 8F 57 C8   F5 35 9B 80 66 F1 67 9E  ..C.=.W..5..f.g.
0090: 05 74 E0 56 20 54 8F E9   6E C5 42 B6 3D BD BC 08  .t.V T..n.B.=...
00A0: BE 12 BB 4F 30 E6 D4 C5   C2 1A 1F 4F CE 71 2B DA  ...O0......O.q+.
00B0: 9D C9 BF A4 C3 D1 38 E4   93 E8 8C 95 D5 F1 2A 47  ......8.......*G
00C0: FD FC 2C 1D E6 C0 CE 9C   3C 27 E1 17 D5 A7 4A 0F  ..,.....<'....J.
00D0: 03 04 AD D9 BB 5D 9E F9   B6 83 9D 09 85 33 33 CE  .....].......33.
00E0: 97 A6 65 43 1A 9B 97 88   48 13 62 B4 C8 A2 DA CE  ..eC....H.b.....
00F0: 59 A5 A6 2A 4B EF 7B 4B   1D E6 9C 53 93 BD B6 E5  Y..*K..K...S....
0100: CF BE 99 CD 0E AF 97 E2   70 6A E4 99 ED 9E 12 68  ........pj.....h
0110: 57 12 1C F6 30 59 DF 7F   D5 85 D1 2A 02 FC 9B A4  W...0Y.....*....
0120: 81 CC 30 81 C9 A0 03 02   01 11 A2 81 C1 04 81 BE  ..0.............
0130: 8C BF D6 96 CE 2B 60 14   70 D3 14 A7 0E 5C 19 C4  .....+`.p....\..
0140: 8A CC 8B A7 E5 E7 A9 75   35 FF 63 5D 69 BA 8B EE  .......u5.c]i...
0150: 00 E8 49 79 ED 70 47 30   29 F6 EC 7C 1D 46 08 03  ..Iy.pG0)....F..
0160: CB 17 BF 96 FA D2 66 97   31 78 C0 06 1A BF 03 F6  ......f.1x......
0170: ED 35 CB A7 F0 F1 4E AB   14 4E A8 CF EF CF C8 A1  .5....N..N......
0180: 9E 2B 32 4A 7A 2A 50 20   01 8B E5 FE 44 2E 49 DE  .+2Jz*P ....D.I.
0190: 8F D9 BB 7B 04 25 D0 3F   33 6E 71 C7 10 7B 1B 49  .....%.?3nq....I
01A0: 46 77 4F 2E C2 A5 8C 90   76 6E B9 29 66 CC F0 E4  FwO.....vn.)f...
01B0: 90 26 BB 6D 80 46 00 38   B2 F4 66 11 87 86 6F E8  .&.m.F.8..f...o.
01C0: 76 58 3F 1E 74 99 FC 11   E6 F7 77 CE 91 54 28 3A  vX?.t.....w..T(:
01D0: A6 CA 86 F9 5C 2D 10 AF   1C C1 8B D5 07 CE FD 8A  ....\-..........
01E0: 2D B0 EE 36 8D C5 2A FB   8B DF 11 84 07 DA        -..6..*.......

Created InitSecContextToken:
0000: 01 00 6E 82 01 E8 30 82   01 E4 A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 FF  ................
0020: 61 81 FC 30 81 F9 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 21 30 1F A0 03  XAMPLE.COM.!0...
0040: 02 01 03 A1 18 30 16 1B   09 7A 6F 6F 6B 65 65 70  .....0...zookeep
0050: 65 72 1B 09 6C 6F 63 61   6C 68 6F 73 74 A3 81 BF  er..localhost...
0060: 30 81 BC A0 03 02 01 11   A1 03 02 01 01 A2 81 AF  0...............
0070: 04 81 AC DA 13 74 49 3B   FC A5 1C 22 2F 6C 7C 06  .....tI;..."/l..
0080: D5 89 67 C2 17 51 D2 38   DC A7 0A 02 AB 4A FA C3  ..g..Q.8.....J..
0090: 20 12 3C F7 96 8C D3 B9   C9 A1 B4 F5 01 08 DD CE   .<.............
00A0: C3 36 C4 7C 96 B5 E9 43   2E 1E FC CA F6 DD 01 65  .6.....C.......e
00B0: D8 35 3D FB AB 15 3D 3A   9B 4E E6 28 FC 39 47 E6  .5=...=:.N.(.9G.
00C0: 3C 23 63 F7 12 3E 8A CD   51 CC 94 2E A1 95 3D E4  <#c..>..Q.....=.
00D0: 54 99 F0 20 99 1F 1E 80   23 5A 31 67 29 D3 B7 F7  T.. ....#Z1g)...
00E0: 32 6D 49 B2 EA 67 90 95   F0 68 AD 36 3F E6 1B 0B  2mI..g...h.6?...
00F0: 09 DB C0 84 74 85 39 B9   8E C8 7C 5C 89 5C AC BC  ....t.9....\.\..
0100: E6 56 13 2C 9A E5 F0 3F   E8 C7 0F 67 53 AB 5A B1  .V.,...?...gS.Z.
0110: C2 64 AC 43 4D 94 DA 30   6F 44 DF E1 D5 43 C9 A4  .d.CM..0oD...C..
0120: 81 CC 30 81 C9 A0 03 02   01 11 A2 81 C1 04 81 BE  ..0.............
0130: F0 D9 C3 68 48 13 8A A3   1D F5 69 F9 CE 3F 6E EB  ...hH.....i..?n.
0140: D8 5F 8A D5 1A 63 BA 9C   71 10 40 AA A8 FA 88 57  ._...c..q.@....W
0150: 34 E5 B4 92 72 B9 CD C4   55 71 4D 5C 05 BD 63 3C  4...r...UqM\..c<
0160: FE 55 BA 1E 04 21 14 16   36 A9 A4 35 55 CE 6C 38  .U...!..6..5U.l8
0170: 9F 37 85 D7 78 B8 4C 69   F1 CB 74 EF EC E5 C0 B6  .7..x.Li..t.....
0180: 40 E6 B9 2B D4 9B 57 18   9F 16 A3 2B F2 89 CD E4  @..+..W....+....
0190: 48 F0 1A 97 72 8B D7 96   D6 74 F7 97 BA A2 EE E2  H...r....t......
01A0: D6 87 A8 4B B4 ED A3 4D   6C 9F B4 12 A7 79 7C 4B  ...K...Ml....y.K
01B0: 9C 5C FC E0 97 77 9E 38   89 77 B6 CD A1 B1 39 6B  .\...w.8.w....9k
01C0: 25 28 55 35 4B DB E6 D8   10 A8 C3 9D 9E D6 A8 47  %(U5K..........G
01D0: C2 DF 68 20 23 7A 81 9A   43 79 5A 08 CE 27 16 C1  ..h #z..CyZ..'..
01E0: 0A 47 91 F2 F8 9F 57 AA   54 49 34 BA 11 1C        .G....W.TI4...

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 16version: 1
Added key: 17version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638531098/884739/583DC0EF7B8D65B8D9BFA33885BBB84F/zkclient/localhost@EXAMPLE.COM to zkclient/localhost@EXAMPLE.COM|zookeeper/localhost@EXAMPLE.COM
>>> KrbApReq: authenticate succeed.
Krb5Context setting peerSeqNumber to: 581512453
MemoryCache: add 1638531098/884736/38A2732A7313EA1D4574D4880E75C751/zkclient/localhost@EXAMPLE.COM to zkclient/localhost@EXAMPLE.COM|zookeeper/localhost@EXAMPLE.COM
Krb5Context setting mySeqNumber to: 581512453
>>> KrbApReq: authenticate succeed.
Krb5Context setting peerSeqNumber to: 41323747
Krb5Context setting mySeqNumber to: 41323747
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 22 a9 2d 05 01 01 00 00 eb 80 57 b2 af 32 a8 cd fb 4f 1f c7 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 02 76 8c e3 01 01 00 00 a3 57 c0 a4 41 fa 55 d0 c7 37 0b 8b ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 02 76 8c e3 01 01 00 00 a3 57 c0 a4 41 fa 55 d0 c7 37 0b 8b ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 22 a9 2d 05 01 01 00 00 eb 80 57 b2 af 32 a8 cd fb 4f 1f c7 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: data=[01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 22 a9 2d 05 01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 58 8a b7 66 c6 22 67 ad ad eb 03 70 ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 02 76 8c e3 01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d bf bb 14 63 38 10 24 47 51 65 cd e6 ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 22 a9 2d 05 01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 58 8a b7 66 c6 22 67 ad ad eb 03 70 ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 02 76 8c e3 01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d bf bb 14 63 38 10 24 47 51 65 cd e6 ]
Krb5Context.unwrap: data=[01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.unwrap: data=[01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): kafka
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 62; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): kafka
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 70; type: 16
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=143
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=143
>>>DEBUG: TCPClient reading 159 bytes
>>> KrbKdcReq send: #bytes read=159
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:39073
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Fri Dec 03 03:31:39 PST 2021 1638531099000
	 suSec is 100
	 error code is 25
	 error Message is Additional pre-authentication required
	 sname is kafka/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=229
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=229
>>>DEBUG: TCPClient reading 537 bytes
>>> KrbKdcReq send: #bytes read=537
>>> KdcAccessibility: remove localhost:39073
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply kafka/localhost
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): client
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 63; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): client
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 71; type: 16
Looking for keys for: client/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: client/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=144
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=144
>>>DEBUG: TCPClient reading 160 bytes
>>> KrbKdcReq send: #bytes read=160
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:39073
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Fri Dec 03 03:31:39 PST 2021 1638531099000
	 suSec is 100
	 error code is 25
	 error Message is Additional pre-authentication required
	 sname is client/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: client/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: client/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=231
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=231
>>>DEBUG: TCPClient reading 539 bytes
>>> KrbKdcReq send: #bytes read=539
>>> KdcAccessibility: remove localhost:39073
Looking for keys for: client/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
>>> KrbAsRep cons in KrbAsReq.getReply client/localhost
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Service ticket not found in the subject
>>> Credentials serviceCredsSingle: same realm
default etypes for default_tgs_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> CksumType: sun.security.krb5.internal.crypto.HmacSha1Aes128CksumType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Found KeyTab /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-69270809-043a-418c-8f34-8b0737e1c95c/kafka.keytab for kafka/localhost@EXAMPLE.COM
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=568
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=568
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Service ticket not found in the subject
>>> Credentials serviceCredsSingle: same realm
Found KeyTab /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-69270809-043a-418c-8f34-8b0737e1c95c/kafka.keytab for kafka/localhost@EXAMPLE.COM
default etypes for default_tgs_enctypes: 17.
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> CksumType: sun.security.krb5.internal.crypto.HmacSha1Aes128CksumType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>>DEBUG: TCPClient reading 526 bytes
>>> KrbKdcReq send: #bytes read=526
>>> KdcAccessibility: remove localhost:39073
>>> KrbKdcReq send: kdc=localhost TCP:39073, timeout=30000, number of retries =3, #bytes=570
>>> KDCCommunication: kdc=localhost TCP:39073, timeout=30000,Attempt =1, #bytes=570
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> TGS credentials serviceCredsSingle:
>>> DEBUG: ----Credentials----
	client: kafka/localhost@EXAMPLE.COM
	server: kafka/localhost@EXAMPLE.COM
	ticket: sname: kafka/localhost@EXAMPLE.COM
	endTime: 1724931099000
        ----Credentials end----
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 613130673
Krb5Context setting peerSeqNumber to: 613130673
Created InitSecContextToken:
0000: 01 00 6E 82 01 DE 30 82   01 DA A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 F8  ................
0020: 61 81 F5 30 81 F2 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0040: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0050: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BC 30 81 B9 A0  localhost...0...
0060: 03 02 01 11 A1 03 02 01   01 A2 81 AC 04 81 A9 64  ...............d
0070: D8 90 E3 A1 8B B8 9F A8   F8 DD 85 12 BF 9E E5 9C  ................
0080: 6A 41 DC 22 A6 85 37 8B   9D D1 D1 B8 44 D9 2B F0  jA."..7.....D.+.
0090: 75 DC F9 00 31 F7 5A 2B   8F 8B 88 EF 36 C7 19 D9  u...1.Z+....6...
00A0: F5 52 71 99 6A 92 29 47   67 1D 95 86 ED 87 94 81  .Rq.j.)Gg.......
00B0: FB 4B A7 16 A2 55 A4 47   91 65 61 E8 DF CA 42 D5  .K...U.G.ea...B.
00C0: 4C C1 68 09 2D DD 25 21   6A 71 49 69 BC 67 EC 9F  L.h.-.%!jqIi.g..
00D0: C2 F5 E9 87 36 C4 A3 2D   AC FE FB C5 F6 65 FC A5  ....6..-.....e..
00E0: 78 EA D8 97 52 59 BF 8D   57 5D EE 57 5F 04 C9 1F  x...RY..W].W_...
00F0: 58 3C 72 DD 40 21 9D CA   91 58 2E E6 E1 07 FB AA  X<r.@!...X......
0100: 0E 32 CF 33 76 19 A4 A2   37 DC 9E AE D1 20 59 58  .2.3v...7.... YX
0110: 88 1A A4 B6 78 7D 1A 8E   A4 81 C9 30 81 C6 A0 03  ....x......0....
0120: 02 01 11 A2 81 BE 04 81   BB 8B C4 EB 9F 4E 26 A7  .............N&.
0130: C5 1B 2A 06 FD 73 3A A0   9E 8F 78 0E BD 0E 32 92  ..*..s:...x...2.
0140: 76 C5 F4 EC B5 3D F5 14   E9 BF 2F 40 F8 B3 5A 4D  v....=..../@..ZM
0150: 79 62 8B 82 14 A3 40 BA   3E 4C EB BD B6 42 F0 D3  yb....@.>L...B..
0160: CA 81 6E 4D 05 FB 2F CB   10 55 DF 8F C8 03 92 C8  ..nM../..U......
0170: 26 38 E9 28 88 64 61 E4   37 ED 67 FB D8 EE 6B A3  &8.(.da.7.g...k.
0180: 32 B4 CC B0 DB 1F ED 8B   D2 FD A7 AF 20 85 20 B6  2........... . .
0190: BC B7 BD 49 0F CA 86 50   50 01 6F 27 66 F8 DB D2  ...I...PP.o'f...
01A0: 18 CD 20 96 D0 8E 42 6C   E5 08 3B 68 1F 8F 7D 55  .. ...Bl..;h...U
01B0: 9E 43 12 18 2E BB EE 1C   7C B0 91 71 C1 A5 6D 66  .C.........q..mf
01C0: 81 68 37 2F FA E7 A4 F9   49 D1 3E 5D 3C 98 77 83  .h7/....I.>]<.w.
01D0: F8 B3 83 86 F3 AF 8D E6   E6 58 E8 7B DC 7B 69 7F  .........X....i.
01E0: 1F CD F9 EA                                        ....

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638531099/087242/1D72ABBB3F5AC5AD8EC73F03125EC778/kafka/localhost@EXAMPLE.COM to kafka/localhost@EXAMPLE.COM|kafka/localhost@EXAMPLE.COM
>>> KrbApReq: authenticate succeed.
Krb5Context setting peerSeqNumber to: 613130673
Krb5Context setting mySeqNumber to: 613130673
Krb5Context.wrap: data=[01 01 00 00 ]
>>>DEBUG: TCPClient reading 528 bytes
>>> KrbKdcReq send: #bytes read=528
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 24 8b a1 b1 01 01 00 00 fc c4 15 4b a1 9f 25 e4 5d f7 4f b6 ]
>>> KdcAccessibility: remove localhost:39073
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 24 8b a1 b1 01 01 00 00 fc c4 15 4b a1 9f 25 e4 5d f7 4f b6 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
>>> TGS credentials serviceCredsSingle:
>>> DEBUG: ----Credentials----
	client: client/localhost@EXAMPLE.COM
	server: kafka/localhost@EXAMPLE.COM
	ticket: sname: kafka/localhost@EXAMPLE.COM
	endTime: 1724931099000
        ----Credentials end----
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 24 8b a1 b1 01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 0a 92 4d c7 3b 59 5c a8 a1 f4 9a 94 ]
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 24 8b a1 b1 01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 0a 92 4d c7 3b 59 5c a8 a1 f4 9a 94 ]
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context.unwrap: data=[01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context setting mySeqNumber to: 502980259
Krb5Context setting peerSeqNumber to: 502980259
Created InitSecContextToken:
0000: 01 00 6E 82 01 E0 30 82   01 DC A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 F9  ................
0020: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0040: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0050: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0060: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA AD  ................
0070: 92 61 F4 4D 2E 6C F1 46   C8 23 31 BB AB B8 F1 1E  .a.M.l.F.#1.....
0080: 38 25 50 B3 FD 37 25 70   EF EA 37 A8 39 64 31 13  8%P..7%p..7.9d1.
0090: DD 94 A1 DB 3D B9 67 E1   70 F6 21 72 57 7A 6C 0A  ....=.g.p.!rWzl.
00A0: 29 A8 AC 31 95 38 6A BB   CE FF C4 49 82 8C 1B 18  )..1.8j....I....
00B0: 3E 4F 14 6C 27 67 59 22   E1 E3 68 B9 9F BB 50 94  >O.l'gY"..h...P.
00C0: 28 70 9F 21 E0 63 09 4F   0F DB 96 C7 16 3D 28 1B  (p.!.c.O.....=(.
00D0: 24 42 42 1A FD CB B5 41   4E 80 5C EB 9A 91 BF AD  $BB....AN.\.....
00E0: 3E 63 42 D0 91 53 A0 80   61 3A CA 7B 16 0A 28 13  >cB..S..a:....(.
00F0: 60 85 D0 FF F1 19 CB 5F   2B 31 F2 40 97 B1 E2 03  `......_+1.@....
0100: 12 31 F9 DB F7 21 22 A0   FE 7E AD D7 6E 10 7F A0  .1...!".....n...
0110: 43 A6 54 B6 35 44 23 3C   C3 A4 81 CA 30 81 C7 A0  C.T.5D#<....0...
0120: 03 02 01 11 A2 81 BF 04   81 BC 2C D8 67 4C E4 12  ..........,.gL..
0130: A6 6B B5 8A 59 4A 38 2E   A0 D7 30 9F E8 45 3A AB  .k..YJ8...0..E:.
0140: A6 D8 41 97 C1 F9 57 66   A1 0F 84 DE 56 B5 13 6E  ..A...Wf....V..n
0150: CD 0D 79 93 7C CD 5C F0   CC 78 72 B3 3C 26 8E 39  ..y...\..xr.<&.9
0160: DE A1 7E 6B 6C D2 36 38   6B 45 3B 45 9A 54 0E 57  ...kl.68kE;E.T.W
0170: 8E 59 92 16 8F 73 E0 23   D2 34 D9 16 21 DE 13 D7  .Y...s.#.4..!...
0180: 66 F0 F8 DF F9 40 19 93   44 5F 04 4D 7F 19 9D B0  f....@..D_.M....
0190: D3 97 04 BA EE 62 CB 25   5C DC 24 66 02 BB 7A C6  .....b.%\.$f..z.
01A0: 7B D1 6B 42 88 FA E4 91   69 59 44 C7 13 34 22 7E  ..kB....iYD..4".
01B0: 99 79 ED 1F 84 69 C6 23   59 FB 18 85 52 9B A4 24  .y...i.#Y...R..$
01C0: F2 07 FD EF A3 2A A7 19   26 19 DB A2 36 33 FD 72  .....*..&...63.r
01D0: B9 89 7A 8B F7 1E 83 1E   85 BF BE F9 24 75 50 F1  ..z.........$uP.
01E0: B6 64 C4 29 B9 69                                  .d.).i

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638531099/094609/ECD7175FD75658E22D086C575AF22451/client/localhost@EXAMPLE.COM to client/localhost@EXAMPLE.COM|kafka/localhost@EXAMPLE.COM
>>> KrbApReq: authenticate succeed.
Krb5Context setting peerSeqNumber to: 502980259
Krb5Context setting mySeqNumber to: 502980259
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 1d fa de a3 01 01 00 00 08 65 a1 c3 78 fc db e1 8d 35 dc 0c ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 1d fa de a3 01 01 00 00 08 65 a1 c3 78 fc db e1 8d 35 dc 0c ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 1d fa de a3 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 63 5b b9 eb e8 6f 88 bb bd 8f a3 bd ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 1d fa de a3 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 63 5b b9 eb e8 6f 88 bb bd 8f a3 bd ]
Krb5Context.unwrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Found ticket for client/localhost@EXAMPLE.COM to go to kafka/localhost@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Found KeyTab /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-69270809-043a-418c-8f34-8b0737e1c95c/kafka.keytab for kafka/localhost@EXAMPLE.COM
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Found service ticket in the subject
Ticket (hex) = 
0000: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0010: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0020: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0030: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0040: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA AD  ................
0050: 92 61 F4 4D 2E 6C F1 46   C8 23 31 BB AB B8 F1 1E  .a.M.l.F.#1.....
0060: 38 25 50 B3 FD 37 25 70   EF EA 37 A8 39 64 31 13  8%P..7%p..7.9d1.
0070: DD 94 A1 DB 3D B9 67 E1   70 F6 21 72 57 7A 6C 0A  ....=.g.p.!rWzl.
0080: 29 A8 AC 31 95 38 6A BB   CE FF C4 49 82 8C 1B 18  )..1.8j....I....
0090: 3E 4F 14 6C 27 67 59 22   E1 E3 68 B9 9F BB 50 94  >O.l'gY"..h...P.
00A0: 28 70 9F 21 E0 63 09 4F   0F DB 96 C7 16 3D 28 1B  (p.!.c.O.....=(.
00B0: 24 42 42 1A FD CB B5 41   4E 80 5C EB 9A 91 BF AD  $BB....AN.\.....
00C0: 3E 63 42 D0 91 53 A0 80   61 3A CA 7B 16 0A 28 13  >cB..S..a:....(.
00D0: 60 85 D0 FF F1 19 CB 5F   2B 31 F2 40 97 B1 E2 03  `......_+1.@....
00E0: 12 31 F9 DB F7 21 22 A0   FE 7E AD D7 6E 10 7F A0  .1...!".....n...
00F0: 43 A6 54 B6 35 44 23 3C   C3                       C.T.5D#<.

Client Principal = client/localhost@EXAMPLE.COM
Server Principal = kafka/localhost@EXAMPLE.COM
Session Key = EncryptionKey: keyType=17 keyBytes (hex dump)=
0000: 06 B6 C1 29 A6 E0 CB 0B   3D C6 E8 1F 90 C7 F0 61  ...)....=......a


Forwardable Ticket false
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket false
Initial Ticket false
Auth Time = Fri Dec 03 03:31:39 PST 2021
Start Time = Fri Dec 03 03:31:39 PST 2021
End Time = Thu Aug 29 04:31:39 PDT 2024
Renew Till = null
Client Addresses  Null 
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 1032727452
Krb5Context setting peerSeqNumber to: 1032727452
Created InitSecContextToken:
0000: 01 00 6E 82 01 E0 30 82   01 DC A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 F9  ................
0020: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0040: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0050: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0060: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA AD  ................
0070: 92 61 F4 4D 2E 6C F1 46   C8 23 31 BB AB B8 F1 1E  .a.M.l.F.#1.....
0080: 38 25 50 B3 FD 37 25 70   EF EA 37 A8 39 64 31 13  8%P..7%p..7.9d1.
0090: DD 94 A1 DB 3D B9 67 E1   70 F6 21 72 57 7A 6C 0A  ....=.g.p.!rWzl.
00A0: 29 A8 AC 31 95 38 6A BB   CE FF C4 49 82 8C 1B 18  )..1.8j....I....
00B0: 3E 4F 14 6C 27 67 59 22   E1 E3 68 B9 9F BB 50 94  >O.l'gY"..h...P.
00C0: 28 70 9F 21 E0 63 09 4F   0F DB 96 C7 16 3D 28 1B  (p.!.c.O.....=(.
00D0: 24 42 42 1A FD CB B5 41   4E 80 5C EB 9A 91 BF AD  $BB....AN.\.....
00E0: 3E 63 42 D0 91 53 A0 80   61 3A CA 7B 16 0A 28 13  >cB..S..a:....(.
00F0: 60 85 D0 FF F1 19 CB 5F   2B 31 F2 40 97 B1 E2 03  `......_+1.@....
0100: 12 31 F9 DB F7 21 22 A0   FE 7E AD D7 6E 10 7F A0  .1...!".....n...
0110: 43 A6 54 B6 35 44 23 3C   C3 A4 81 CA 30 81 C7 A0  C.T.5D#<....0...
0120: 03 02 01 11 A2 81 BF 04   81 BC CE BE C0 86 50 8E  ..............P.
0130: A2 23 26 D4 0D EA E1 FC   34 5B 19 81 50 BE A2 B2  .#&.....4[..P...
0140: B9 39 09 7B AE 30 CC 26   63 92 18 D7 D5 2B FD C5  .9...0.&c....+..
0150: 43 04 5F 96 E5 66 CF D8   6A B9 F3 F5 64 47 F3 04  C._..f..j...dG..
0160: A9 FA B7 EB 89 70 1E CB   3D 65 6C 1B F0 DE 84 B8  .....p..=el.....
0170: 98 04 0E 64 A4 0A 8F 81   66 3C A5 94 0D FE 69 C2  ...d....f<....i.
0180: CC 75 68 26 5B A9 46 27   54 0A 8F 06 D3 C2 1A 63  .uh&[.F'T......c
0190: FD B2 36 C7 E9 51 CA 38   FE 3B 34 D0 43 6C 6B 9E  ..6..Q.8.;4.Clk.
01A0: 1E 17 62 42 F1 D6 4B ED   0A 64 92 EB A2 C2 0B 74  ..bB..K..d.....t
01B0: 75 95 10 CA 1F 36 F1 37   BB F4 F6 CA F9 61 F6 81  u....6.7.....a..
01C0: F7 4C DE 2E 56 6E BC C4   DF 91 B9 5B E2 AC D4 BD  .L..Vn.....[....
01D0: 7F 90 90 5F 16 F6 11 B4   15 D1 ED 89 45 FB 2B E2  ..._........E.+.
01E0: EC 66 D9 1F DF FD                                  .f....

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638531099/173395/8D79040D4CF4255F40753BFA11B5F17B/client/localhost@EXAMPLE.COM to client/localhost@EXAMPLE.COM|kafka/localhost@EXAMPLE.COM
>>> KrbApReq: authenticate succeed.
Krb5Context setting peerSeqNumber to: 1032727452
Krb5Context setting mySeqNumber to: 1032727452
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 3d 8e 2b 9c 01 01 00 00 f5 ce cd 6e 8f 6e db 0a 0e 89 81 a1 ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 3d 8e 2b 9c 01 01 00 00 f5 ce cd 6e 8f 6e db 0a 0e 89 81 a1 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 3d 8e 2b 9c 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 17 d1 df 51 c1 d9 47 d7 68 d2 8b 64 ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 3d 8e 2b 9c 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 17 d1 df 51 c1 d9 47 d7 68 d2 8b 64 ]
Krb5Context.unwrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Found KeyTab /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-69270809-043a-418c-8f34-8b0737e1c95c/kafka.keytab for kafka/localhost@EXAMPLE.COM
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Found ticket for client/localhost@EXAMPLE.COM to go to kafka/localhost@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Found service ticket in the subject
Ticket (hex) = 
0000: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0010: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0020: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0030: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0040: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA AD  ................
0050: 92 61 F4 4D 2E 6C F1 46   C8 23 31 BB AB B8 F1 1E  .a.M.l.F.#1.....
0060: 38 25 50 B3 FD 37 25 70   EF EA 37 A8 39 64 31 13  8%P..7%p..7.9d1.
0070: DD 94 A1 DB 3D B9 67 E1   70 F6 21 72 57 7A 6C 0A  ....=.g.p.!rWzl.
0080: 29 A8 AC 31 95 38 6A BB   CE FF C4 49 82 8C 1B 18  )..1.8j....I....
0090: 3E 4F 14 6C 27 67 59 22   E1 E3 68 B9 9F BB 50 94  >O.l'gY"..h...P.
00A0: 28 70 9F 21 E0 63 09 4F   0F DB 96 C7 16 3D 28 1B  (p.!.c.O.....=(.
00B0: 24 42 42 1A FD CB B5 41   4E 80 5C EB 9A 91 BF AD  $BB....AN.\.....
00C0: 3E 63 42 D0 91 53 A0 80   61 3A CA 7B 16 0A 28 13  >cB..S..a:....(.
00D0: 60 85 D0 FF F1 19 CB 5F   2B 31 F2 40 97 B1 E2 03  `......_+1.@....
00E0: 12 31 F9 DB F7 21 22 A0   FE 7E AD D7 6E 10 7F A0  .1...!".....n...
00F0: 43 A6 54 B6 35 44 23 3C   C3                       C.T.5D#<.

Client Principal = client/localhost@EXAMPLE.COM
Server Principal = kafka/localhost@EXAMPLE.COM
Session Key = EncryptionKey: keyType=17 keyBytes (hex dump)=
0000: 06 B6 C1 29 A6 E0 CB 0B   3D C6 E8 1F 90 C7 F0 61  ...)....=......a


Forwardable Ticket false
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket false
Initial Ticket false
Auth Time = Fri Dec 03 03:31:39 PST 2021
Start Time = Fri Dec 03 03:31:39 PST 2021
End Time = Thu Aug 29 04:31:39 PDT 2024
Renew Till = null
Client Addresses  Null 
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 874228705
Krb5Context setting peerSeqNumber to: 874228705
Created InitSecContextToken:
0000: 01 00 6E 82 01 E0 30 82   01 DC A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 F9  ................
0020: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0040: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0050: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0060: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA AD  ................
0070: 92 61 F4 4D 2E 6C F1 46   C8 23 31 BB AB B8 F1 1E  .a.M.l.F.#1.....
0080: 38 25 50 B3 FD 37 25 70   EF EA 37 A8 39 64 31 13  8%P..7%p..7.9d1.
0090: DD 94 A1 DB 3D B9 67 E1   70 F6 21 72 57 7A 6C 0A  ....=.g.p.!rWzl.
00A0: 29 A8 AC 31 95 38 6A BB   CE FF C4 49 82 8C 1B 18  )..1.8j....I....
00B0: 3E 4F 14 6C 27 67 59 22   E1 E3 68 B9 9F BB 50 94  >O.l'gY"..h...P.
00C0: 28 70 9F 21 E0 63 09 4F   0F DB 96 C7 16 3D 28 1B  (p.!.c.O.....=(.
00D0: 24 42 42 1A FD CB B5 41   4E 80 5C EB 9A 91 BF AD  $BB....AN.\.....
00E0: 3E 63 42 D0 91 53 A0 80   61 3A CA 7B 16 0A 28 13  >cB..S..a:....(.
00F0: 60 85 D0 FF F1 19 CB 5F   2B 31 F2 40 97 B1 E2 03  `......_+1.@....
0100: 12 31 F9 DB F7 21 22 A0   FE 7E AD D7 6E 10 7F A0  .1...!".....n...
0110: 43 A6 54 B6 35 44 23 3C   C3 A4 81 CA 30 81 C7 A0  C.T.5D#<....0...
0120: 03 02 01 11 A2 81 BF 04   81 BC 94 18 F3 E0 76 40  ..............v@
0130: 1F 4C 41 AC 76 C8 89 6D   F3 D0 5B 33 CD CC D7 29  .LA.v..m..[3...)
0140: 8B 38 94 F2 82 D4 C6 45   E0 4E 05 4A 1F 52 A9 FA  .8.....E.N.J.R..
0150: CE 06 73 1C 23 63 ED D0   D9 D6 DA 1B 1D FB EF 10  ..s.#c..........
0160: F2 89 9B CE 61 1C F5 5F   DE B5 8B 75 82 F4 D3 65  ....a.._...u...e
0170: 4E 8C 57 F6 C3 9E 61 BB   BA D1 1E A5 5F 42 0D 70  N.W...a....._B.p
0180: 66 D5 E5 DA 54 BF 0C 5B   E4 4C 4A 3C 58 E5 ED 5B  f...T..[.LJ<X..[
0190: 0B FD 74 F2 04 61 5F 4A   9C 3E 15 31 6D 67 71 D6  ..t..a_J.>.1mgq.
01A0: FF A2 98 9E 61 C1 F6 8C   11 A5 7D 55 20 A4 B0 C4  ....a......U ...
01B0: 77 F7 A8 D2 85 8B A6 E1   F6 8C 83 BB 79 99 03 05  w...........y...
01C0: 32 73 AA 3F 26 98 44 16   34 96 72 AF CB F9 68 C3  2s.?&.D.4.r...h.
01D0: F7 50 39 DF B8 E8 5C E0   D7 B3 56 C5 3B 7C 3D BA  .P9...\...V.;.=.
01E0: 3F 1E 1C AD F4 B1                                  ?.....

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638531099/183107/AD20A86621BB30907D906BD73FA1256E/client/localhost@EXAMPLE.COM to client/localhost@EXAMPLE.COM|kafka/localhost@EXAMPLE.COM
>>> KrbApReq: authenticate succeed.
Krb5Context setting peerSeqNumber to: 874228705
Krb5Context setting mySeqNumber to: 874228705
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 34 1b ab e1 01 01 00 00 5b b8 4a 1a 56 5d 9a 3a 95 a4 5a 7b ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 34 1b ab e1 01 01 00 00 5b b8 4a 1a 56 5d 9a 3a 95 a4 5a 7b ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 34 1b ab e1 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d db 23 81 09 4b 5d e9 4b 79 1b 8b 73 ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 34 1b ab e1 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d db 23 81 09 4b 5d e9 4b 79 1b 8b 73 ]
Krb5Context.unwrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Found ticket for client/localhost@EXAMPLE.COM to go to kafka/localhost@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Found KeyTab /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-69270809-043a-418c-8f34-8b0737e1c95c/kafka.keytab for kafka/localhost@EXAMPLE.COM
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Found service ticket in the subject
Ticket (hex) = 
0000: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0010: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0020: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0030: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0040: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA AD  ................
0050: 92 61 F4 4D 2E 6C F1 46   C8 23 31 BB AB B8 F1 1E  .a.M.l.F.#1.....
0060: 38 25 50 B3 FD 37 25 70   EF EA 37 A8 39 64 31 13  8%P..7%p..7.9d1.
0070: DD 94 A1 DB 3D B9 67 E1   70 F6 21 72 57 7A 6C 0A  ....=.g.p.!rWzl.
0080: 29 A8 AC 31 95 38 6A BB   CE FF C4 49 82 8C 1B 18  )..1.8j....I....
0090: 3E 4F 14 6C 27 67 59 22   E1 E3 68 B9 9F BB 50 94  >O.l'gY"..h...P.
00A0: 28 70 9F 21 E0 63 09 4F   0F DB 96 C7 16 3D 28 1B  (p.!.c.O.....=(.
00B0: 24 42 42 1A FD CB B5 41   4E 80 5C EB 9A 91 BF AD  $BB....AN.\.....
00C0: 3E 63 42 D0 91 53 A0 80   61 3A CA 7B 16 0A 28 13  >cB..S..a:....(.
00D0: 60 85 D0 FF F1 19 CB 5F   2B 31 F2 40 97 B1 E2 03  `......_+1.@....
00E0: 12 31 F9 DB F7 21 22 A0   FE 7E AD D7 6E 10 7F A0  .1...!".....n...
00F0: 43 A6 54 B6 35 44 23 3C   C3                       C.T.5D#<.

Client Principal = client/localhost@EXAMPLE.COM
Server Principal = kafka/localhost@EXAMPLE.COM
Session Key = EncryptionKey: keyType=17 keyBytes (hex dump)=
0000: 06 B6 C1 29 A6 E0 CB 0B   3D C6 E8 1F 90 C7 F0 61  ...)....=......a


Forwardable Ticket false
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket false
Initial Ticket false
Auth Time = Fri Dec 03 03:31:39 PST 2021
Start Time = Fri Dec 03 03:31:39 PST 2021
End Time = Thu Aug 29 04:31:39 PDT 2024
Renew Till = null
Client Addresses  Null 
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 573120651
Krb5Context setting peerSeqNumber to: 573120651
Created InitSecContextToken:
0000: 01 00 6E 82 01 E0 30 82   01 DC A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 F9  ................
0020: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0040: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0050: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0060: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA AD  ................
0070: 92 61 F4 4D 2E 6C F1 46   C8 23 31 BB AB B8 F1 1E  .a.M.l.F.#1.....
0080: 38 25 50 B3 FD 37 25 70   EF EA 37 A8 39 64 31 13  8%P..7%p..7.9d1.
0090: DD 94 A1 DB 3D B9 67 E1   70 F6 21 72 57 7A 6C 0A  ....=.g.p.!rWzl.
00A0: 29 A8 AC 31 95 38 6A BB   CE FF C4 49 82 8C 1B 18  )..1.8j....I....
00B0: 3E 4F 14 6C 27 67 59 22   E1 E3 68 B9 9F BB 50 94  >O.l'gY"..h...P.
00C0: 28 70 9F 21 E0 63 09 4F   0F DB 96 C7 16 3D 28 1B  (p.!.c.O.....=(.
00D0: 24 42 42 1A FD CB B5 41   4E 80 5C EB 9A 91 BF AD  $BB....AN.\.....
00E0: 3E 63 42 D0 91 53 A0 80   61 3A CA 7B 16 0A 28 13  >cB..S..a:....(.
00F0: 60 85 D0 FF F1 19 CB 5F   2B 31 F2 40 97 B1 E2 03  `......_+1.@....
0100: 12 31 F9 DB F7 21 22 A0   FE 7E AD D7 6E 10 7F A0  .1...!".....n...
0110: 43 A6 54 B6 35 44 23 3C   C3 A4 81 CA 30 81 C7 A0  C.T.5D#<....0...
0120: 03 02 01 11 A2 81 BF 04   81 BC 0A 81 8C F0 47 2B  ..............G+
0130: 00 86 6A 78 3F EF E9 90   4E A1 C8 F2 D5 9A 6E B4  ..jx?...N.....n.
0140: F2 C5 35 93 E6 48 56 F2   B6 68 02 E5 F7 A8 50 39  ..5..HV..h....P9
0150: 78 D3 90 FD 82 C4 D7 5E   9C 6E 87 F2 46 39 4D 81  x......^.n..F9M.
0160: 1B 5F 5F 53 07 F7 26 F8   40 75 03 4A A7 DE AC 23  .__S..&.@u.J...#
0170: C0 FD 7A 8F CD F0 BE 47   34 E3 78 BF 45 5D 8C 7D  ..z....G4.x.E]..
0180: AE 99 96 8C 09 80 9A 4C   26 2C 3D 17 25 46 0A 89  .......L&,=.%F..
0190: 0F AD 21 AC 8A 56 E9 80   10 4E C4 AA E0 0A 5A AF  ..!..V...N....Z.
01A0: 82 CB 26 CD 30 BF 96 A4   0B 9A 33 9F 1F 26 4C 29  ..&.0.....3..&L)
01B0: AA 00 AD 11 6D 20 D8 B8   FB 0C E0 7D 95 AD 32 45  ....m ........2E
01C0: 06 6F AA E4 C8 BC F8 C3   DB F3 04 C2 8A 38 1D 8E  .o...........8..
01D0: 43 C0 21 B0 B0 47 C4 06   57 6B E0 A4 12 10 54 29  C.!..G..Wk....T)
01E0: A4 B7 77 F3 F8 C3                                  ..w...

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16 version: 1
Added key: 17 version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638531099/316933/78B436984AAB0598B396D5ED7FD368B5/client/localhost@EXAMPLE.COM to client/localhost@EXAMPLE.COM|kafka/localhost@EXAMPLE.COM
>>> KrbApReq: authentication succeeded.
Krb5Context setting peerSeqNumber to: 573120651
Krb5Context setting mySeqNumber to: 573120651
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 22 29 20 8b 01 01 00 00 88 fc 4b 9c 4d 23 79 ff ea c8 dc 85 ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 22 29 20 8b 01 01 00 00 88 fc 4b 9c 4d 23 79 ff ea c8 dc 85 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 22 29 20 8b 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 2b 89 f5 32 33 82 42 82 d9 85 ec 24 ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 22 29 20 8b 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 2b 89 f5 32 33 82 42 82 d9 85 ec 24 ]
Krb5Context.unwrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
- Roundtrip
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Found ticket for kafka/localhost@EXAMPLE.COM to go to kafka/localhost@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Found KeyTab /home/jenkins/workspace/spark-branch-3.1-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-69270809-043a-418c-8f34-8b0737e1c95c/kafka.keytab for kafka/localhost@EXAMPLE.COM
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Thu Aug 29 04:31:39 PDT 2024
Found service ticket in the subjectTicket (hex) = 
0000: 61 81 F5 30 81 F2 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0010: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0020: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0030: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BC 30 81 B9 A0  localhost...0...
0040: 03 02 01 11 A1 03 02 01   01 A2 81 AC 04 81 A9 64  ...............d
0050: D8 90 E3 A1 8B B8 9F A8   F8 DD 85 12 BF 9E E5 9C  ................
0060: 6A 41 DC 22 A6 85 37 8B   9D D1 D1 B8 44 D9 2B F0  jA."..7.....D.+.
0070: 75 DC F9 00 31 F7 5A 2B   8F 8B 88 EF 36 C7 19 D9  u...1.Z+....6...
0080: F5 52 71 99 6A 92 29 47   67 1D 95 86 ED 87 94 81  .Rq.j.)Gg.......
0090: FB 4B A7 16 A2 55 A4 47   91 65 61 E8 DF CA 42 D5  .K...U.G.ea...B.
00A0: 4C C1 68 09 2D DD 25 21   6A 71 49 69 BC 67 EC 9F  L.h.-.%!jqIi.g..
00B0: C2 F5 E9 87 36 C4 A3 2D   AC FE FB C5 F6 65 FC A5  ....6..-.....e..
00C0: 78 EA D8 97 52 59 BF 8D   57 5D EE 57 5F 04 C9 1F  x...RY..W].W_...
00D0: 58 3C 72 DD 40 21 9D CA   91 58 2E E6 E1 07 FB AA  X<r.@!...X......
00E0: 0E 32 CF 33 76 19 A4 A2   37 DC 9E AE D1 20 59 58  .2.3v...7.... YX
00F0: 88 1A A4 B6 78 7D 1A 8E                            ....x...

Client Principal = kafka/localhost@EXAMPLE.COM
Server Principal = kafka/localhost@EXAMPLE.COM
Session Key = EncryptionKey: keyType=17 keyBytes (hex dump)=
0000: 56 FB 32 3A FC 96 99 A0   4B 3A 68 31 83 AB 95 90  V.2:....K:h1....


Forwardable Ticket false
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket false
Initial Ticket false
Auth Time = Fri Dec 03 03:31:39 PST 2021
Start Time = Fri Dec 03 03:31:39 PST 2021
End Time = Thu Aug 29 04:31:39 PDT 2024
Renew Till = null
Client Addresses  Null 
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 581273597
Krb5Context setting peerSeqNumber to: 581273597
Created InitSecContextToken:
0000: 01 00 6E 82 01 DE 30 82   01 DA A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 F8  ................
0020: 61 81 F5 30 81 F2 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0040: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0050: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BC 30 81 B9 A0  localhost...0...
0060: 03 02 01 11 A1 03 02 01   01 A2 81 AC 04 81 A9 64  ...............d
0070: D8 90 E3 A1 8B B8 9F A8   F8 DD 85 12 BF 9E E5 9C  ................
0080: 6A 41 DC 22 A6 85 37 8B   9D D1 D1 B8 44 D9 2B F0  jA."..7.....D.+.
0090: 75 DC F9 00 31 F7 5A 2B   8F 8B 88 EF 36 C7 19 D9  u...1.Z+....6...
00A0: F5 52 71 99 6A 92 29 47   67 1D 95 86 ED 87 94 81  .Rq.j.)Gg.......
00B0: FB 4B A7 16 A2 55 A4 47   91 65 61 E8 DF CA 42 D5  .K...U.G.ea...B.
00C0: 4C C1 68 09 2D DD 25 21   6A 71 49 69 BC 67 EC 9F  L.h.-.%!jqIi.g..
00D0: C2 F5 E9 87 36 C4 A3 2D   AC FE FB C5 F6 65 FC A5  ....6..-.....e..
00E0: 78 EA D8 97 52 59 BF 8D   57 5D EE 57 5F 04 C9 1F  x...RY..W].W_...
00F0: 58 3C 72 DD 40 21 9D CA   91 58 2E E6 E1 07 FB AA  X<r.@!...X......
0100: 0E 32 CF 33 76 19 A4 A2   37 DC 9E AE D1 20 59 58  .2.3v...7.... YX
0110: 88 1A A4 B6 78 7D 1A 8E   A4 81 C9 30 81 C6 A0 03  ....x......0....
0120: 02 01 11 A2 81 BE 04 81   BB F1 76 8D 35 04 E2 05  ..........v.5...
0130: 20 A3 C1 BA FA B3 85 72   B4 24 6C 0F FC 64 46 DA   ......r.$l..dF.
0140: 4A E6 63 25 15 30 04 E9   FC E1 51 97 50 03 62 EF  J.c%.0....Q.P.b.
0150: 74 95 5C 06 91 0E F1 9F   37 D1 9D B6 9F AE A3 0F  t.\.....7.......
0160: 81 60 2D 7B 9C D3 52 CF   9B 7D 4D 07 D6 DC BD A9  .`-...R...M.....
0170: 14 A2 63 E6 CF ED 57 25   B7 7D 96 73 30 6D A6 7B  ..c...W%...s0m..
0180: 01 AF 5D 86 73 D7 5C 35   71 B5 4C FC F2 A8 0A C7  ..].s.\5q.L.....
0190: F0 C8 EB 39 20 3E 3A 46   3F 14 85 52 24 90 76 BD  ...9 >:F?..R$.v.
01A0: FA 66 41 2B 6D 95 06 A4   2F FA 91 FD 3F 13 1E C6  .fA+m.../...?...
01B0: A3 8E A0 21 F7 BC 6C C8   6A A8 61 08 EE 85 04 01  ...!..l.j.a.....
01C0: 62 69 B7 0C B3 73 83 8F   42 4E 77 76 B8 C2 6E 7A  bi...s..BNwv..nz
01D0: B5 96 83 BD D2 BF 8F 13   B4 BE A0 2F AA 8B 2C 3B  .........../..,;
01E0: 87 75 6C 0E                                        .ul.

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16 version: 1
Added key: 17 version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638531100/692300/7C13CC328766321934221E376EE90FBB/kafka/localhost@EXAMPLE.COM to kafka/localhost@EXAMPLE.COM|kafka/localhost@EXAMPLE.COM
>>> KrbApReq: authentication succeeded.
Krb5Context setting peerSeqNumber to: 581273597
Krb5Context setting mySeqNumber to: 581273597
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 22 a5 87 fd 01 01 00 00 b8 a0 e0 6a 6b 13 9d af 70 c3 87 fe ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 22 a5 87 fd 01 01 00 00 b8 a0 e0 6a 6b 13 9d af 70 c3 87 fe ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 22 a5 87 fd 01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d e9 c1 90 ff 1d 6b 0e 93 cf f2 07 89 ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 22 a5 87 fd 01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d e9 c1 90 ff 1d 6b 0e 93 cf f2 07 89 ]
Krb5Context.unwrap: data=[01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
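The Krb5Context lines above trace a complete JGSS/Kerberos handshake: the initiator's initSecContext produces the InitSecContextToken dumped earlier, the acceptor's acceptSecContext validates it against the kafka/localhost keytab, and the wrap/unwrap pairs protect the subsequent SASL exchange. A minimal sketch of the initiator side using the standard org.ietf.jgss API is below. It assumes a Kerberos credential is already available to the JVM (the test harness obtains one from its generated keytab); the principal name is taken from the log, and the name-type choice is one of several that work for the krb5 mechanism.

    import org.ietf.jgss.{GSSContext, GSSManager, GSSName, Oid}

    object GssHandshakeSketch {
      def main(args: Array[String]): Unit = {
        val krb5 = new Oid("1.2.840.113554.1.2.2") // Kerberos v5 mechanism OID
        val mgr  = GSSManager.getInstance()
        // Peer principal as seen in the log; NT_USER_NAME lets krb5 parse the
        // full "service/host@REALM" form.
        val peer = mgr.createName("kafka/localhost@EXAMPLE.COM", GSSName.NT_USER_NAME)
        val ctx  = mgr.createContext(peer, krb5, null /* default credential */,
                                     GSSContext.DEFAULT_LIFETIME)
        ctx.requestMutualAuth(true)
        // First call emits the InitSecContextToken hex-dumped in the log; in a
        // real exchange the acceptor's acceptSecContext reply is fed back in
        // until isEstablished, after which wrap/unwrap protect application data.
        val token = ctx.initSecContext(Array.emptyByteArray, 0, 0)
        println(s"InitSecContextToken: ${token.length} bytes")
        ctx.dispose()
      }
    }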
KafkaSourceOffsetSuite:
- comparison {"t":{"0":1}} <=> {"t":{"0":2}}
- comparison {"t":{"1":0,"0":1}} <=> {"t":{"1":1,"0":2}}
- comparison {"t":{"0":1},"T":{"0":0}} <=> {"t":{"0":2},"T":{"0":1}}
- comparison {"t":{"0":1}} <=> {"t":{"1":1,"0":2}}
- comparison {"t":{"0":1}} <=> {"t":{"1":3,"0":2}}
- basic serialization - deserialization
- OffsetSeqLog serialization - deserialization
- read Spark 2.1.0 offset format
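The comparison test names above embed the JSON wire format of a Kafka source offset: a map from topic to a map from partition to offset, so {"t":{"0":1}} means topic "t", partition 0, offset 1. A hedged sketch of decoding that shape with json4s (a library Spark already depends on) follows; the object and method names are illustrative, not Spark's internals.

    import org.json4s.DefaultFormats
    import org.json4s.jackson.Serialization

    object OffsetJsonSketch {
      implicit val formats: DefaultFormats.type = DefaultFormats

      // Flatten {"topic":{"partition":offset}} into (topic, partition) -> offset.
      def partitionOffsets(json: String): Map[(String, Int), Long] =
        Serialization.read[Map[String, Map[Int, Long]]](json).flatMap {
          case (topic, parts) =>
            parts.map { case (partition, offset) => (topic, partition) -> offset }
        }

      def main(args: Array[String]): Unit = {
        // {"t":{"0":1}} -> Map(("t", 0) -> 1)
        println(partitionOffsets("""{"t":{"0":1}}"""))
      }
    }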
KafkaOffsetReaderSuite:
- isolationLevel must give back default isolation level when not set
- isolationLevel must give back READ_UNCOMMITTED when set
- isolationLevel must give back READ_COMMITTED when set
- isolationLevel must throw exception when invalid isolation level set
- SPARK-30656: getOffsetRangesFromUnresolvedOffsets - using specific offsets with useDeprecatedOffsetFetching true
- SPARK-30656: getOffsetRangesFromUnresolvedOffsets - using specific offsets with useDeprecatedOffsetFetching false
- SPARK-30656: getOffsetRangesFromUnresolvedOffsets - using special offsets with useDeprecatedOffsetFetching true
- SPARK-30656: getOffsetRangesFromUnresolvedOffsets - using special offsets with useDeprecatedOffsetFetching false
- SPARK-30656: getOffsetRangesFromUnresolvedOffsets - multiple topic partitions with useDeprecatedOffsetFetching true
- SPARK-30656: getOffsetRangesFromUnresolvedOffsets - multiple topic partitions with useDeprecatedOffsetFetching false
- SPARK-30656: getOffsetRangesFromResolvedOffsets with useDeprecatedOffsetFetching true
- SPARK-30656: getOffsetRangesFromResolvedOffsets with useDeprecatedOffsetFetching false
KafkaSinkBatchSuiteV2:
- batch - write to kafka
- batch - partition column and partitioner priorities
- batch - null topic field value, and no topic option
- SPARK-20496: batch - enforce analyzed plans
- batch - unsupported save modes
- generic - write big data with small producer buffer
KafkaSourceStressSuite:
- stress test with multiple topics and partitions
KafkaDontFailOnDataLossSuite:
- failOnDataLoss=false should not return duplicated records: microbatch v1
- failOnDataLoss=false should not return duplicated records: microbatch v2
- failOnDataLoss=false should not return duplicated records: continuous processing
- failOnDataLoss=false should not return duplicated records: batch v1
- failOnDataLoss=false should not return duplicated records: batch v2
KafkaSinkBatchSuiteV1:
- batch - write to kafka
- batch - partition column and partitioner priorities
- batch - null topic field value, and no topic option
- SPARK-20496: batch - enforce analyzed plans
- batch - unsupported save modes
KafkaSparkConfSuite:
- deprecated configs
KafkaContinuousSourceSuite:
- cannot stop Kafka stream
- assign from latest offsets (failOnDataLoss: true)
- assign from earliest offsets (failOnDataLoss: true)
- assign from specific offsets (failOnDataLoss: true)
- assign from specific timestamps (failOnDataLoss: true)
- subscribing topic by name from latest offsets (failOnDataLoss: true)
- subscribing topic by name from earliest offsets (failOnDataLoss: true)
- subscribing topic by name from specific offsets (failOnDataLoss: true)
- subscribing topic by name from specific timestamps (failOnDataLoss: true)
- subscribing topic by pattern from latest offsets (failOnDataLoss: true)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: true)
- assign from latest offsets (failOnDataLoss: false)
- assign from earliest offsets (failOnDataLoss: false)
- assign from specific offsets (failOnDataLoss: false)
- assign from specific timestamps (failOnDataLoss: false)
- subscribing topic by name from latest offsets (failOnDataLoss: false)
- subscribing topic by name from earliest offsets (failOnDataLoss: false)
- subscribing topic by name from specific offsets (failOnDataLoss: false)
- subscribing topic by name from specific timestamps (failOnDataLoss: false)
- subscribing topic by pattern from latest offsets (failOnDataLoss: false)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: false)
- bad source options
- unsupported kafka configs
- get offsets from case insensitive parameters
- Kafka column types
- ensure continuous stream is being used
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-27494: read kafka record containing null key/values.
KafkaSourceStressForDontFailOnDataLossSuite:
- stress test for failOnDataLoss=false
Build timed out (after 480 minutes). Marking the build as aborted.
Build was aborted
Archiving artifacts
KafkaContinuousSourceTopicDeletionSuite:
org.apache.spark.sql.kafka010.KafkaContinuousSourceTopicDeletionSuite *** ABORTED ***
  java.lang.IllegalStateException: Shutdown hooks cannot be modified during shutdown.
  at org.apache.spark.util.SparkShutdownHookManager.add(ShutdownHookManager.scala:195)
  at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:153)
  at org.apache.spark.storage.DiskBlockManager.addShutdownHook(DiskBlockManager.scala:158)
  at org.apache.spark.storage.DiskBlockManager.<init>(DiskBlockManager.scala:55)
  at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:191)
  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:394)
  at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
  at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:458)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:137)
  ...
KafkaMicroBatchV2SourceWithAdminSuite:
org.apache.spark.sql.kafka010.KafkaMicroBatchV2SourceWithAdminSuite *** ABORTED ***
  java.lang.IllegalStateException: Shutdown hooks cannot be modified during shutdown.
  at org.apache.spark.util.SparkShutdownHookManager.add(ShutdownHookManager.scala:195)
  at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:153)
  at org.apache.spark.storage.DiskBlockManager.addShutdownHook(DiskBlockManager.scala:158)
  at org.apache.spark.storage.DiskBlockManager.<init>(DiskBlockManager.scala:55)
  at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:191)
  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:394)
  at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
  at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:458)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:137)
  ...
InternalKafkaProducerPoolSuite:
org.apache.spark.sql.kafka010.producer.InternalKafkaProducerPoolSuite *** ABORTED ***
  java.lang.IllegalStateException: Shutdown hooks cannot be modified during shutdown.
  at org.apache.spark.util.SparkShutdownHookManager.add(ShutdownHookManager.scala:195)
  at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:153)
  at org.apache.spark.storage.DiskBlockManager.addShutdownHook(DiskBlockManager.scala:158)
  at org.apache.spark.storage.DiskBlockManager.<init>(DiskBlockManager.scala:55)
  at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:191)
  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:394)
  at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
  at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:458)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:137)
  ...
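All three aborts share one cause visible earlier in the log: the 480-minute Jenkins timeout fired mid-run, JVM shutdown began, and any suite that subsequently tried to create a SparkContext failed when DiskBlockManager attempted to register a shutdown hook. Spark's SparkShutdownHookManager raises its own IllegalStateException for this; the plain JDK API rejects late registration the same way, as this sketch demonstrates (names are illustrative):

    // Reproduces the failure mode behind the three aborts above using the
    // plain JDK API rather than Spark's ShutdownHookManager: registering a
    // hook while the JVM is already shutting down throws IllegalStateException.
    object ShutdownHookSketch {
      def main(args: Array[String]): Unit = {
        Runtime.getRuntime.addShutdownHook(new Thread(() => {
          // We are now inside shutdown; further registration is rejected,
          // just as later suites were rejected after the build timeout.
          try {
            Runtime.getRuntime.addShutdownHook(new Thread(() => ()))
          } catch {
            case e: IllegalStateException =>
              System.err.println(s"rejected: ${e.getMessage}") // "Shutdown in progress"
          }
        }))
        // main returns, the JVM exits, and the outer hook runs during shutdown.
      }
    }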
Run completed in 19 minutes, 57 seconds.
Total number of tests run: 353
Suites: completed 28, aborted 3
Tests: succeeded 353, failed 0, canceled 0, ignored 0, pending 0
*** 3 SUITES ABORTED ***
[INFO] 
[INFO] ---------< org.apache.spark:spark-streaming-kinesis-asl_2.12 >----------
[INFO] Building Spark Kinesis Integration 3.1.3-SNAPSHOT                [27/31]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-versions) @ spark-streaming-kinesis-asl_2.12 ---
+ retcode2=143
+ [[ 0 -ne 0 ]]
+ [[ 143 -ne 0 ]]
+ [[ 0 -ne 0 ]]
+ [[ 143 -ne 0 ]]
+ echo 'Testing Spark with Maven failed'
Testing Spark with Maven failed
+ exit 1
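Exit status 143 is the conventional code for a process terminated by SIGTERM (128 + 15), consistent with the timeout-driven abort above: Jenkins killed the Maven process, and the wrapper script, seeing the non-zero status, converted it into an explicit exit 1.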
Recording test results
[Checks API] No suitable checks publisher found.
Finished: ABORTED