Console Output

Skipping 19,887 KB..
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) @ spark-sql-kafka-0-10_2.12 ---
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.1:compile (default-compile) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Not compiling main sources
[INFO] 
[INFO] --- scala-maven-plugin:4.3.0:compile (scala-compile-first) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file: /home/jenkins/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.15__52.0-1.3.1_20191012T045515.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.12.15,1.7.6,null)
[INFO] Compiling 31 Scala sources and 1 Java source to /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/scala-2.12/classes ...
[INFO] Done compiling.
[INFO] compile in 37.6 s
[INFO] 
[INFO] --- maven-antrun-plugin:1.8:run (create-tmp-dir) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.7:testResources (default-testResources) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 13 resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.1:testCompile (default-testCompile) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Not compiling test sources
[INFO] 
[INFO] --- maven-dependency-plugin:3.1.1:build-classpath (generate-test-classpath) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Dependencies classpath:
/home/jenkins/.m2/repository/org/scalatest/scalatest-matchers-core_2.12/3.2.9/scalatest-matchers-core_2.12-3.2.9.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-simplekdc/1.0.1/kerb-simplekdc-1.0.1.jar:/home/jenkins/.m2/repository/net/jcip/jcip-annotations/1.0/jcip-annotations-1.0.jar:/home/jenkins/.m2/repository/com/google/crypto/tink/tink/1.6.0/tink-1.6.0.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/external/jakarta.inject/2.6.1/jakarta.inject-2.6.1.jar:/home/jenkins/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-core_2.12/3.2.9/scalatest-core_2.12-3.2.9.jar:/home/jenkins/.m2/repository/jakarta/servlet/jakarta.servlet-api/4.0.3/jakarta.servlet-api-4.0.3.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-propspec_2.12/3.2.9/scalatest-propspec_2.12-3.2.9.jar:/home/jenkins/.m2/repository/org/rocksdb/rocksdbjni/6.20.3/rocksdbjni-6.20.3.jar:/home/jenkins/.m2/repository/org/json4s/json4s-jackson_2.12/3.7.0-M11/json4s-jackson_2.12-3.7.0-M11.jar:/home/jenkins/.m2/repository/com/google/code/findbugs/jsr305/3.0.0/jsr305-3.0.0.jar:/home/jenkins/.m2/repository/com/shapesecurity/salvation2/3.0.0/salvation2-3.0.0.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest_2.12/3.2.9/scalatest_2.12-3.2.9.jar:/home/jenkins/.m2/repository/net/sourceforge/htmlunit/neko-htmlunit/2.50.0/neko-htmlunit-2.50.0.jar:/home/jenkins/.m2/repository/org/apache/htrace/htrace-core4/4.1.0-incubating/htrace-core4-4.1.0-incubating.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-column/1.12.2/parquet-column-1.12.2.jar:/home/jenkins/.m2/repository/commons-codec/commons-codec/1.15/commons-codec-1.15.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/sql/core/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-client/9.4.43.v20210629/jetty-client-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-xdr/1.0.1/kerby-xdr-1.0.1.jar:/home/jenkins/.m2/repository/xerces/xercesImpl/2.12.0/xercesImpl-2.12.0.jar:/home/jenkins/.m2/repository/org/apache-extras/beanshell/bsh/2.0b6/bsh-2.0b6.jar:/home/jenkins/.m2/repository/net/bytebuddy/byte-buddy/1.10.13/byte-buddy-1.10.13.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-diagrams_2.12/3.2.9/scalatest-diagrams_2.12-3.2.9.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-graphite/4.2.2/metrics-graphite-4.2.2.jar:/home/jenkins/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-encoding/1.12.2/parquet-encoding-1.12.2.jar:/home/jenkins/.m2/repository/org/apache/httpcomponents/httpclient/4.5.13/httpclient-4.5.13.jar:/home/jenkins/.m2/repository/org/jetbrains/annotations/17.0.0/annotations-17.0.0.jar:/home/jenkins/.m2/repository/net/sourceforge/argparse4j/argparse4j/0.7.0/argparse4j-0.7.0.jar:/home/jenkins/.m2/repository/com/yammer/metrics/metrics-core/2.2.0/metrics-core-2.2.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-pkix/1.0.1/kerby-pkix-1.0.1.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-mustmatchers_2.12/3.2.9/scalatest-mustmatchers_2.12-3.2.9.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-edge-driver/3.141.59/selenium-edge-driver-3.141.59.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-exec/1.3/commons-exec-1.3.jar:/home/jenkins/.m2/repository/javax/activation/activation/1.1.1/activation-1.1.1.jar:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.5.7/zookeeper-3.5.7.jar:/home/jenkins/.m2/repository/commons-cli/commons-cli/1.5.0/commons-cli-1.5.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-server/1.0.1/kerb-server-1.0.1.jar:/home/jenkins/.m2/repository/jakarta/validation/jakarta.validation-api/2.0.2/jakarta.validation-api-2.0.2.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/common/network-common/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/scala-lang/modules/scala-java8-compat_2.12/0.9.1/scala-java8-compat_2.12-0.9.1.jar:/home/jenkins/.m2/repository/org/scala-lang/scala-reflect/2.12.15/scala-reflect-2.12.15.jar:/home/jenkins/.m2/repository/org/apache/arrow/arrow-format/6.0.0/arrow-format-6.0.0.jar:/home/jenkins/.m2/repository/org/jmock/jmock-testjar/2.12.0/jmock-testjar-2.12.0.jar:/home/jenkins/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/home/jenkins/.m2/repository/org/codehaus/janino/commons-compiler/3.0.16/commons-compiler-3.0.16.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-framework/2.13.0/curator-framework-2.13.0.jar:/home/jenkins/.m2/repository/io/netty/netty-all/4.1.68.Final/netty-all-4.1.68.Final.jar:/home/jenkins/.m2/repository/org/bouncycastle/bcprov-jdk15on/1.60/bcprov-jdk15on-1.60.jar:/home/jenkins/.m2/repository/junit/junit/4.13.1/junit-4.13.1.jar:/home/jenkins/.m2/repository/org/xerial/snappy/snappy-java/1.1.8.4/snappy-java-1.1.8.4.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/dataformat/jackson-dataformat-csv/2.10.5/jackson-dataformat-csv-2.10.5.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/core/jersey-client/2.34/jersey-client-2.34.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-format-structures/1.12.2/parquet-format-structures-1.12.2.jar:/home/jenkins/.m2/repository/org/apache/avro/avro-mapred/1.11.0/avro-mapred-1.11.0.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/osgi-resource-locator/1.0.3/osgi-resource-locator-1.0.3.jar:/home/jenkins/.m2/repository/net/bytebuddy/byte-buddy-agent/1.10.13/byte-buddy-agent-1.10.13.jar:/home/jenkins/.m2/repository/org/apache/arrow/arrow-memory-netty/6.0.0/arrow-memory-netty-6.0.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-util/1.0.1/kerby-util-1.0.1.jar:/home/jenkins/.m2/repository/org/scala-sbt/test-interface/1.0/test-interface-1.0.jar:/home/jenkins/.m2/repository/com/squareup/okio/okio/1.14.0/okio-1.14.0.jar:/home/jenkins/.m2/repository/org/ow2/asm/asm/7.1/asm-7.1.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-servlets/9.4.43.v20210629/jetty-servlets-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.13.0/jackson-core-2.13.0.jar:/home/jenkins/.m2/repository/org/apache/hive/hive-storage-api/2.7.2/hive-storage-api-2.7.2.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-common/1.12.2/parquet-common-1.12.2.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-api/1.7.30/slf4j-api-1.7.30.jar:/home/jenkins/.m2/repository/com/ning/compress-lzf/1.0.3/compress-lzf-1.0.3.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-util/9.4.43.v20210629/jetty-util-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/com/github/luben/zstd-jni/1.5.0-4/zstd-jni-1.5.0-4.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-jvm/4.2.2/metrics-jvm-4.2.2.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-util-ajax/9.4.43.v20210629/jetty-util-ajax-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-remote-driver/3.141.59/selenium-remote-driver-3.141.59.jar:/home/jenkins/.m2/repository/org/apache/httpcomponents/httpmime/4.5.13/httpmime-4.5.13.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.13.0/jackson-databind-2.13.0.jar:/home/jenkins/.m2/repository/org/scala-lang/modules/scala-collection-compat_2.12/2.3.0/scala-collection-compat_2.12-2.3.0.jar:/home/jenkins/.m2/repository/org/javassist/javassist/3.25.0-GA/javassist-3.25.0-GA.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-compress/1.21/commons-compress-1.21.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-featurespec_2.12/3.2.9/scalatest-featurespec_2.12-3.2.9.jar:/home/jenkins/.m2/repository/org/apache/arrow/arrow-vector/6.0.0/arrow-vector-6.0.0.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/core/target/scala-2.12/test-classes:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-token-provider/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/scalatestplus/mockito-3-4_2.12/3.2.9.0/mockito-3-4_2.12-3.2.9.0.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-plus/9.4.43.v20210629/jetty-plus-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/org/mockito/mockito-core/3.4.6/mockito-core-3.4.6.jar:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper-jute/3.5.7/zookeeper-jute-3.5.7.jar:/home/jenkins/.m2/repository/org/apache/kafka/kafka_2.12/2.8.1/kafka_2.12-2.8.1.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-jmx/4.2.2/metrics-jmx-4.2.2.jar:/home/jenkins/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/htmlunit-driver/2.50.0/htmlunit-driver-2.50.0.jar:/home/jenkins/.m2/repository/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-minikdc/3.3.1/hadoop-minikdc-3.3.1.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-token-provider/target/scala-2.12/test-classes:/home/jenkins/.m2/repository/net/sf/py4j/py4j/0.10.9.2/py4j-0.10.9.2.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-identity/1.0.1/kerb-identity-1.0.1.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/core/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/scalatest/scalatest-refspec_2.12/3.2.9/scalatest-refspec_2.12-3.2.9.jar:/home/jenkins/.m2/repository/org/jmock/jmock-legacy/2.12.0/jmock-legacy-2.12.0.jar:/home/jenkins/.m2/repository/org/apache/parquet/parquet-jackson/1.12.2/parquet-jackson-1.12.2.jar:/home/jenkins/.m2/repository/org/threeten/threeten-extra/1.5.0/threeten-extra-1.5.0.jar:/home/jenkins/.m2/repository/com/clearspring/analytics/stream/2.9.6/stream-2.9.6.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/hk2-locator/2.6.1/hk2-locator-2.6.1.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-log4j12/1.7.30/slf4j-log4j12-1.7.30.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/hk2-utils/2.6.1/hk2-utils-2.6.1.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/websocket/websocket-common/9.4.40.v20210413/websocket-common-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/com/thoughtworks/paranamer/paranamer/2.8/paranamer-2.8.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-security/9.4.43.v20210629/jetty-security-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-core/1.0.1/kerb-core-1.0.1.jar:/home/jenkins/.m2/repository/org/apache/httpcomponents/httpcore/4.4.14/httpcore-4.4.14.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-lang3/3.12.0/commons-lang3-3.12.0.jar:/home/jenkins/.m2/repository/org/objenesis/objenesis/2.6/objenesis-2.6.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-shouldmatchers_2.12/3.2.9/scalatest-shouldmatchers_2.12-3.2.9.jar:/home/jenkins/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar:/home/jenkins/.m2/repository/xml-apis/xml-apis/1.4.01/xml-apis-1.4.01.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-text/1.6/commons-text-1.6.jar:/home/jenkins/.m2/repository/jakarta/ws/rs/jakarta.ws.rs-api/2.1.6/jakarta.ws.rs-api-2.1.6.jar:/home/jenkins/.m2/repository/net/sourceforge/htmlunit/htmlunit/2.50.0/htmlunit-2.50.0.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/external/aopalliance-repackaged/2.6.1/aopalliance-repackaged-2.6.1.jar:/home/jenkins/.m2/repository/org/jmock/jmock-junit4/2.12.0/jmock-junit4-2.12.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-util/1.0.1/kerb-util-1.0.1.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-funsuite_2.12/3.2.9/scalatest-funsuite_2.12-3.2.9.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-freespec_2.12/3.2.9/scalatest-freespec_2.12-3.2.9.jar:/home/jenkins/.m2/repository/xalan/serializer/2.7.2/serializer-2.7.2.jar:/home/jenkins/.m2/repository/com/esotericsoftware/kryo-shaded/4.0.2/kryo-shaded-4.0.2.jar:/home/jenkins/.m2/repository/com/esotericsoftware/minlog/1.3.0/minlog-1.3.0.jar:/home/jenkins/.m2/repository/xalan/xalan/2.7.2/xalan-2.7.2.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-client-api/3.3.1/hadoop-client-api-3.3.1.jar:/home/jenkins/.m2/repository/com/google/flatbuffers/flatbuffers-java/1.12.0/flatbuffers-java-1.12.0.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-funspec_2.12/3.2.9/scalatest-funspec_2.12-3.2.9.jar:/home/jenkins/.m2/repository/net/sourceforge/htmlunit/htmlunit-cssparser/1.7.0/htmlunit-cssparser-1.7.0.jar:/home/jenkins/.m2/repository/com/squareup/okhttp3/okhttp/3.11.0/okhttp-3.11.0.jar:/home/jenkins/.m2/repository/org/slf4j/jul-to-slf4j/1.7.30/jul-to-slf4j-1.7.30.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-continuation/9.4.43.v20210629/jetty-continuation-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/datatype/jackson-datatype-jdk8/2.10.5/jackson-datatype-jdk8-2.10.5.jar:/home/jenkins/.m2/repository/org/apache/orc/orc-shims/1.7.1/orc-shims-1.7.1.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/core/jersey-server/2.34/jersey-server-2.34.jar:/home/jenkins/.m2/repository/org/roaringbitmap/RoaringBitmap/0.9.22/RoaringBitmap-0.9.22.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-servlet/9.4.43.v20210629/jetty-servlet-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-ie-driver/3.141.59/selenium-ie-driver-3.141.59.jar:/home/jenkins/.m2/repository/org/hamcrest/hamcrest/2.1/hamcrest-2.1.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/sql/catalyst/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/jmock/jmock/2.12.0/jmock-2.12.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-config/1.0.1/kerby-config-1.0.1.jar:/home/jenkins/.m2/repository/com/google/code/gson/gson/2.8.6/gson-2.8.6.jar:/home/jenkins/.m2/repository/org/antlr/antlr4-runtime/4.8/antlr4-runtime-4.8.jar:/home/jenkins/.m2/repository/org/json4s/json4s-core_2.12/3.7.0-M11/json4s-core_2.12-3.7.0-M11.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/containers/jersey-container-servlet-core/2.34/jersey-container-servlet-core-2.34.jar:/home/jenkins/.m2/repository/commons-io/commons-io/2.11.0/commons-io-2.11.0.jar:/home/jenkins/.m2/repository/org/scala-lang/modules/scala-parser-combinators_2.12/1.1.2/scala-parser-combinators_2.12-1.1.2.jar:/home/jenkins/.m2/repository/org/apache/orc/orc-core/1.7.1/orc-core-1.7.1.jar:/home/jenkins/.m2/repository/org/apache/kafka/kafka-metadata/2.8.1/kafka-metadata-2.8.1.jar:/home/jenkins/.m2/repository/org/scalatestplus/selenium-3-141_2.12/3.2.9.0/selenium-3-141_2.12-3.2.9.0.jar:/home/jenkins/.m2/repository/org/codehaus/janino/janino/3.0.16/janino-3.0.16.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/common/kvstore/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/apache/parquet/parquet-hadoop/1.12.2/parquet-hadoop-1.12.2.jar:/home/jenkins/.m2/repository/com/google/guava/guava/14.0.1/guava-14.0.1.jar:/home/jenkins/.m2/repository/com/typesafe/scala-logging/scala-logging_2.12/3.9.2/scala-logging_2.12-3.9.2.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/websocket/websocket-client/9.4.40.v20210413/websocket-client-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/scala-lang/scala-library/2.12.15/scala-library-2.12.15.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-webapp/9.4.43.v20210629/jetty-webapp-9.4.43.v20210629.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/sql/catalyst/target/spark-catalyst_2.12-3.3.0-SNAPSHOT-tests.jar:/home/jenkins/.m2/repository/javax/servlet/javax.servlet-api/3.1.0/javax.servlet-api-3.1.0.jar:/home/jenkins/.m2/repository/org/apache/xbean/xbean-asm9-shaded/4.20/xbean-asm9-shaded-4.20.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-math3/3.4.1/commons-math3-3.4.1.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-client/1.0.1/kerb-client-1.0.1.jar:/home/jenkins/.m2/repository/org/tukaani/xz/1.8/xz-1.8.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-wordspec_2.12/3.2.9/scalatest-wordspec_2.12-3.2.9.jar:/home/jenkins/.m2/repository/com/twitter/chill_2.12/0.10.0/chill_2.12-0.10.0.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-io/9.4.43.v20210629/jetty-io-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/oro/oro/2.0.8/oro-2.0.8.jar:/home/jenkins/.m2/repository/org/apache/ivy/ivy/2.5.0/ivy-2.5.0.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-server/9.4.43.v20210629/jetty-server-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/org/json4s/json4s-scalap_2.12/3.7.0-M11/json4s-scalap_2.12-3.7.0-M11.jar:/home/jenkins/.m2/repository/org/scalactic/scalactic_2.12/3.2.9/scalactic_2.12-3.2.9.jar:/home/jenkins/.m2/repository/org/roaringbitmap/shims/0.9.22/shims-0.9.22.jar:/home/jenkins/.m2/repository/org/jmock/jmock-imposters/2.12.0/jmock-imposters-2.12.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/token-provider/1.0.1/token-provider-1.0.1.jar:/home/jenkins/.m2/repository/com/novocode/junit-interface/0.11/junit-interface-0.11.jar:/home/jenkins/.m2/repository/org/scalacheck/scalacheck_2.12/1.15.4/scalacheck_2.12-1.15.4.jar:/home/jenkins/.m2/repository/org/apache/avro/avro/1.11.0/avro-1.11.0.jar:/home/jenkins/.m2/repository/cglib/cglib/3.2.8/cglib-3.2.8.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-compatible/3.2.9/scalatest-compatible-3.2.9.jar:/home/jenkins/.m2/repository/org/lz4/lz4-java/1.8.0/lz4-java-1.8.0.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-xml/9.4.43.v20210629/jetty-xml-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/org/hamcrest/hamcrest-library/1.3/hamcrest-library-1.3.jar:/home/jenkins/.m2/repository/net/minidev/json-smart/1.3.1/json-smart-1.3.1.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/hk2-api/2.6.1/hk2-api-2.6.1.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-chrome-driver/3.141.59/selenium-chrome-driver-3.141.59.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-opera-driver/3.141.59/selenium-opera-driver-3.141.59.jar:/home/jenkins/.m2/repository/jakarta/annotation/jakarta.annotation-api/1.3.5/jakarta.annotation-api-1.3.5.jar:/home/jenkins/.m2/repository/org/apache/yetus/audience-annotations/0.5.0/audience-annotations-0.5.0.jar:/home/jenkins/.m2/repository/org/scalatest/scalatest-flatspec_2.12/3.2.9/scalatest-flatspec_2.12-3.2.9.jar:/home/jenkins/.m2/repository/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar:/home/jenkins/.m2/repository/io/airlift/aircompressor/0.21/aircompressor-0.21.jar:/home/jenkins/.m2/repository/com/twitter/chill-java/0.10.0/chill-java-0.10.0.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-json/4.2.2/metrics-json-4.2.2.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-support/3.141.59/selenium-support-3.141.59.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-http/9.4.43.v20210629/jetty-http-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/org/scala-lang/modules/scala-xml_2.12/1.2.0/scala-xml_2.12-1.2.0.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-recipes/2.13.0/curator-recipes-2.13.0.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/common/tags/target/scala-2.12/test-classes:/home/jenkins/.m2/repository/org/apache/kafka/kafka-raft/2.8.1/kafka-raft-2.8.1.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/websocket/websocket-api/9.4.40.v20210413/websocket-api-9.4.40.v20210413.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-jndi/9.4.43.v20210629/jetty-jndi-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-core/4.2.2/metrics-core-4.2.2.jar:/home/jenkins/.m2/repository/org/apache/kafka/kafka-clients/2.8.1/kafka-clients-2.8.1.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-client/2.13.0/curator-client-2.13.0.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-admin/1.0.1/kerb-admin-1.0.1.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-crypto/1.1.0/commons-crypto-1.1.0.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/launcher/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/glassfish/jersey/core/jersey-common/2.34/jersey-common-2.34.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.13.0/jackson-annotations-2.13.0.jar:/home/jenkins/.m2/repository/org/apache/avro/avro-ipc/1.11.0/avro-ipc-1.11.0.jar:/home/jenkins/.m2/repository/net/sf/jopt-simple/jopt-simple/3.2/jopt-simple-3.2.jar:/home/jenkins/.m2/repository/org/brotli/dec/0.1.2/dec-0.1.2.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/sql/core/target/spark-sql_2.12-3.3.0-SNAPSHOT-tests.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/module/jackson-module-scala_2.12/2.13.0/jackson-module-scala_2.12-2.13.0.jar:/home/jenkins/.m2/repository/org/apache/arrow/arrow-memory-core/6.0.0/arrow-memory-core-6.0.0.jar:/home/jenkins/.m2/repository/net/razorvine/pickle/1.2/pickle-1.2.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-api/3.141.59/selenium-api-3.141.59.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-java/3.141.59/selenium-java-3.141.59.jar:/home/jenkins/.m2/repository/net/sourceforge/htmlunit/htmlunit-core-js/2.50.0/htmlunit-core-js-2.50.0.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/common/tags/target/scala-2.12/classes:/home/jenkins/.m2/repository/com/univocity/univocity-parsers/2.9.1/univocity-parsers-2.9.1.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/containers/jersey-container-servlet/2.34/jersey-container-servlet-2.34.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-client-runtime/3.3.1/hadoop-client-runtime-3.3.1.jar:/home/jenkins/.m2/repository/org/slf4j/jcl-over-slf4j/1.7.30/jcl-over-slf4j-1.7.30.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/inject/jersey-hk2/2.34/jersey-hk2-2.34.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerby-asn1/1.0.1/kerby-asn1-1.0.1.jar:/home/jenkins/.m2/repository/org/eclipse/jetty/jetty-proxy/9.4.43.v20210629/jetty-proxy-9.4.43.v20210629.jar:/home/jenkins/.m2/repository/com/google/code/findbugs/annotations/3.0.1/annotations-3.0.1.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/common/sketch/target/scala-2.12/classes:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/common/network-shuffle/target/scala-2.12/classes:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/common/unsafe/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-firefox-driver/3.141.59/selenium-firefox-driver-3.141.59.jar:/home/jenkins/.m2/repository/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar:/home/jenkins/.m2/repository/org/json4s/json4s-ast_2.12/3.7.0-M11/json4s-ast_2.12-3.7.0-M11.jar:/home/jenkins/.m2/repository/com/nimbusds/nimbus-jose-jwt/3.10/nimbus-jose-jwt-3.10.jar:/home/jenkins/.m2/repository/org/seleniumhq/selenium/selenium-safari-driver/3.141.59/selenium-safari-driver-3.141.59.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-pool2/2.11.1/commons-pool2-2.11.1.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-common/1.0.1/kerb-common-1.0.1.jar:/home/jenkins/.m2/repository/org/scalatestplus/scalacheck-1-15_2.12/3.2.9.0/scalacheck-1-15_2.12-3.2.9.0.jar:/home/jenkins/.m2/repository/org/apache/orc/orc-mapreduce/1.7.1/orc-mapreduce-1.7.1.jar:/home/jenkins/.m2/repository/org/apache/kerby/kerb-crypto/1.0.1/kerb-crypto-1.0.1.jar
[INFO] 
[INFO] --- scala-maven-plugin:4.3.0:testCompile (scala-test-compile-first) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file: /home/jenkins/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.15__52.0-1.3.1_20191012T045515.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.12.15,1.7.6,null)
[INFO] Compiling 21 Scala sources to /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/scala-2.12/test-classes ...
[INFO] Done compiling.
[INFO] compile in 225.1 s
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M5:test (default-test) @ spark-sql-kafka-0-10_2.12 ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-surefire-plugin:3.0.0-M5:test (test) @ spark-sql-kafka-0-10_2.12 ---
[INFO] Skipping execution of surefire because it has already been run for this configuration
[INFO] 
[INFO] --- scalatest-maven-plugin:2.0.2:test (test) @ spark-sql-kafka-0-10_2.12 ---
Discovery starting.
Discovery completed in 1 second, 84 milliseconds.
Run starting. Expected test count is: 487
KafkaSourceProviderSuite:
- batch mode - options should be handled as case-insensitive
- micro-batch mode - options should be handled as case-insensitive
- continuous mode - options should be handled as case-insensitive
ConsumerStrategySuite:
- createAdmin must create admin properly
- AssignStrategy.assignedTopicPartitions must give back all assigned
- AssignStrategy.assignedTopicPartitions must skip invalid partitions
- SubscribeStrategy.assignedTopicPartitions must give back all assigned
- SubscribePatternStrategy.assignedTopicPartitions must give back all assigned
FetchedDataPoolSuite:
- acquire fetched data from multiple keys
- continuous use of fetched data from single key
- multiple tasks referring same key continuously using fetched data
- evict idle fetched data
- invalidate key
KafkaRelationSuiteV1:
- explicit earliest to latest offsets
- default starting and ending offsets
- explicit offsets
- default starting and ending offsets with headers
- timestamp provided for starting and ending
- timestamp provided for starting, offset provided for ending
- timestamp provided for ending, offset provided for starting
- timestamp provided for starting, ending not provided
- timestamp provided for ending, starting not provided
- global timestamp provided for starting and ending
- no matched offset for timestamp - startingOffsets
- preferences on offset related options
- no matched offset for timestamp - endingOffsets
- reuse same dataframe in query
- test late binding start offsets
- bad batch query options
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-30656: minPartitions
- V1 Source is used when set through SQLConf
KafkaMicroBatchV2SourceSuite:
- cannot stop Kafka stream
- assign from latest offsets (failOnDataLoss: true)
- assign from earliest offsets (failOnDataLoss: true)
- assign from specific offsets (failOnDataLoss: true)
- assign from specific timestamps (failOnDataLoss: true)
- assign from global timestamp per topic (failOnDataLoss: true)
- subscribing topic by name from latest offsets (failOnDataLoss: true)
- subscribing topic by name from earliest offsets (failOnDataLoss: true)
- subscribing topic by name from specific offsets (failOnDataLoss: true)
- subscribing topic by name from specific timestamps (failOnDataLoss: true)
- subscribing topic by name from global timestamp per topic (failOnDataLoss: true)
- subscribing topic by pattern from latest offsets (failOnDataLoss: true)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: true)
- subscribing topic by pattern from global timestamp per topic (failOnDataLoss: true)
- assign from latest offsets (failOnDataLoss: false)
- assign from earliest offsets (failOnDataLoss: false)
- assign from specific offsets (failOnDataLoss: false)
- assign from specific timestamps (failOnDataLoss: false)
- assign from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by name from latest offsets (failOnDataLoss: false)
- subscribing topic by name from earliest offsets (failOnDataLoss: false)
- subscribing topic by name from specific offsets (failOnDataLoss: false)
- subscribing topic by name from specific timestamps (failOnDataLoss: false)
- subscribing topic by name from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by pattern from latest offsets (failOnDataLoss: false)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: false)
- subscribing topic by pattern from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by name from specific timestamps with non-matching starting offset
- subscribing topic by name from global timestamp per topic with non-matching starting offset
- subscribing topic by pattern from specific timestamps with non-matching starting offset
- subscribing topic by pattern from global timestamp per topic with non-matching starting offset
- bad source options
- unsupported kafka configs
- get offsets from case insensitive parameters
- Kafka column types
- (de)serialization of initial offsets
- SPARK-26718 Rate limit set to Long.Max should not overflow integer during end offset calculation
- maxOffsetsPerTrigger
- minOffsetsPerTrigger
- compositeReadLimit
- input row metrics
- subscribing topic by pattern with topic deletions
- subscribe topic by pattern with topic recreation between batches
- ensure that initial offset are written with an extra byte in the beginning (SPARK-19517)
- deserialization of initial offset written by Spark 2.1.0 (SPARK-19517)
- deserialization of initial offset written by future version
- KafkaSource with watermark
- delete a topic when a Spark job is running
- SPARK-22956: currentPartitionOffsets should be set when no new data comes in
- allow group.id prefix
- allow group.id override
- ensure stream-stream self-join generates only one offset in log and correct metrics
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-25495: FetchedData.reset should reset all fields
- SPARK-27494: read kafka record containing null key/values.
- SPARK-30656: minPartitions
- V2 Source is used by default
- minPartitions is supported
- default config of includeHeader doesn't break existing query from Spark 2.4
- test custom metrics - with rate limit
- test custom metrics - no rate limit
- test custom metrics - corner cases
KafkaRelationSuiteWithAdminV2:
- explicit earliest to latest offsets
- default starting and ending offsets
- explicit offsets
- default starting and ending offsets with headers
- timestamp provided for starting and ending
- timestamp provided for starting, offset provided for ending
- timestamp provided for ending, offset provided for starting
- timestamp provided for starting, ending not provided
- timestamp provided for ending, starting not provided
- global timestamp provided for starting and ending
- no matched offset for timestamp - startingOffsets
- preferences on offset related options
- no matched offset for timestamp - endingOffsets
- reuse same dataframe in query
- test late binding start offsets
- bad batch query options
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-30656: minPartitions
- V2 Source is used when set through SQLConf
KafkaSinkMicroBatchStreamingSuite:
- streaming - write to kafka with topic field
- streaming - write w/o topic field, with topic option
- streaming - topic field and topic option
- streaming - write data with bad schema
- streaming - write data with valid schema but wrong types
- streaming - write to non-existing topic
- streaming - exception on config serializer
- streaming - sink progress is produced
KafkaContinuousSourceStressForDontFailOnDataLossSuite:
- stress test for failOnDataLoss=false
KafkaContinuousSinkSuite:
- streaming - write to kafka with topic field
- streaming - write w/o topic field, with topic option
- streaming - topic field and topic option
- streaming - write data with bad schema
- streaming - write data with valid schema but wrong types
- streaming - write to non-existing topic
- streaming - exception on config serializer
- generic - write big data with small producer buffer
KafkaOffsetRangeCalculatorSuite:
- with no minPartition: N TopicPartitions to N offset ranges
- with no minPartition: empty ranges ignored
- with minPartition = 3: N TopicPartitions to N offset ranges
- with minPartition = 3: N TopicPartitions to N offset ranges with executors
- with minPartition = 4: 1 TopicPartition to N offset ranges
- with minPartition = 4: N skewed TopicPartitions to M offset ranges
- with minPartition = 4: SPARK-30656: ignore empty ranges and split the rest
- with minPartition = 4: SPARK-30656: N very skewed TopicPartitions to M offset ranges
- with minPartition = 1: SPARK-30656: minPartitions less than the length of topic partitions
- with minPartition = 3: range inexact multiple of minPartitions
- with minPartition = 4: empty ranges ignored
- with minPartition = 6: SPARK-28489: never drop offsets
- with minPartition = 9: SPARK-36576: 0 small unsplit ranges and 3 large split ranges
- with minPartition = 6: SPARK-36576: 1 small unsplit range and 2 large split ranges
- with minPartition = 6: SPARK-36576: 2 small unsplit ranges and 1 large split range
KafkaMicroBatchV1SourceWithAdminSuite:
- cannot stop Kafka stream
- assign from latest offsets (failOnDataLoss: true)
- assign from earliest offsets (failOnDataLoss: true)
- assign from specific offsets (failOnDataLoss: true)
- assign from specific timestamps (failOnDataLoss: true)
- assign from global timestamp per topic (failOnDataLoss: true)
- subscribing topic by name from latest offsets (failOnDataLoss: true)
- subscribing topic by name from earliest offsets (failOnDataLoss: true)
- subscribing topic by name from specific offsets (failOnDataLoss: true)
- subscribing topic by name from specific timestamps (failOnDataLoss: true)
- subscribing topic by name from global timestamp per topic (failOnDataLoss: true)
- subscribing topic by pattern from latest offsets (failOnDataLoss: true)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: true)
- subscribing topic by pattern from global timestamp per topic (failOnDataLoss: true)
- assign from latest offsets (failOnDataLoss: false)
- assign from earliest offsets (failOnDataLoss: false)
- assign from specific offsets (failOnDataLoss: false)
- assign from specific timestamps (failOnDataLoss: false)
- assign from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by name from latest offsets (failOnDataLoss: false)
- subscribing topic by name from earliest offsets (failOnDataLoss: false)
- subscribing topic by name from specific offsets (failOnDataLoss: false)
- subscribing topic by name from specific timestamps (failOnDataLoss: false)
- subscribing topic by name from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by pattern from latest offsets (failOnDataLoss: false)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: false)
- subscribing topic by pattern from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by name from specific timestamps with non-matching starting offset
- subscribing topic by name from global timestamp per topic with non-matching starting offset
- subscribing topic by pattern from specific timestamps with non-matching starting offset
- subscribing topic by pattern from global timestamp per topic with non-matching starting offset
- bad source options
- unsupported kafka configs
- get offsets from case insensitive parameters
- Kafka column types
- (de)serialization of initial offsets
- SPARK-26718 Rate limit set to Long.Max should not overflow integer during end offset calculation
- maxOffsetsPerTrigger
- minOffsetsPerTrigger
- compositeReadLimit
- input row metrics
- subscribing topic by pattern with topic deletions
- subscribe topic by pattern with topic recreation between batches
- ensure that initial offset are written with an extra byte in the beginning (SPARK-19517)
- deserialization of initial offset written by Spark 2.1.0 (SPARK-19517)
- deserialization of initial offset written by future version
- KafkaSource with watermark
- delete a topic when a Spark job is running
- SPARK-22956: currentPartitionOffsets should be set when no new data comes in
- allow group.id prefix
- allow group.id override
- ensure stream-stream self-join generates only one offset in log and correct metrics
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-25495: FetchedData.reset should reset all fields
- SPARK-27494: read kafka record containing null key/values.
- SPARK-30656: minPartitions
- V1 Source is used when disabled through SQLConf
InternalKafkaConsumerPoolSuite:
- basic multiple borrows and returns for single key
- basic borrow and return for multiple keys
- borrow more than soft max capacity from pool which is neither free space nor idle object
- borrow more than soft max capacity from pool frees up idle objects automatically
- evicting idle objects on background
KafkaDataConsumerSuite:
- SPARK-19886: Report error cause correctly in reportDataLoss
- new KafkaDataConsumer instance in case of Task retry
- same KafkaDataConsumer instance in case of same token
- new KafkaDataConsumer instance in case of token renewal
- SPARK-23623: concurrent use of KafkaDataConsumer
- SPARK-25151 Handles multiple tasks in executor fetching same (topic, partition) pair
- SPARK-25151 Handles multiple tasks in executor fetching same (topic, partition) pair and same offset (edge-case) - data in use
- SPARK-25151 Handles multiple tasks in executor fetching same (topic, partition) pair and same offset (edge-case) - data not in use
KafkaRelationSuiteV2:
- explicit earliest to latest offsets
- default starting and ending offsets
- explicit offsets
- default starting and ending offsets with headers
- timestamp provided for starting and ending
- timestamp provided for starting, offset provided for ending
- timestamp provided for ending, offset provided for starting
- timestamp provided for starting, ending not provided
- timestamp provided for ending, starting not provided
- global timestamp provided for starting and ending
- no matched offset for timestamp - startingOffsets
- preferences on offset related options
- no matched offset for timestamp - endingOffsets
- reuse same dataframe in query
- test late binding start offsets
- bad batch query options
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-30656: minPartitions
- V2 Source is used when set through SQLConf
KafkaMicroBatchV1SourceSuite:
- cannot stop Kafka stream
- assign from latest offsets (failOnDataLoss: true)
- assign from earliest offsets (failOnDataLoss: true)
- assign from specific offsets (failOnDataLoss: true)
- assign from specific timestamps (failOnDataLoss: true)
- assign from global timestamp per topic (failOnDataLoss: true)
- subscribing topic by name from latest offsets (failOnDataLoss: true)
- subscribing topic by name from earliest offsets (failOnDataLoss: true)
- subscribing topic by name from specific offsets (failOnDataLoss: true)
- subscribing topic by name from specific timestamps (failOnDataLoss: true)
- subscribing topic by name from global timestamp per topic (failOnDataLoss: true)
- subscribing topic by pattern from latest offsets (failOnDataLoss: true)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: true)
- subscribing topic by pattern from global timestamp per topic (failOnDataLoss: true)
- assign from latest offsets (failOnDataLoss: false)
- assign from earliest offsets (failOnDataLoss: false)
- assign from specific offsets (failOnDataLoss: false)
- assign from specific timestamps (failOnDataLoss: false)
- assign from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by name from latest offsets (failOnDataLoss: false)
- subscribing topic by name from earliest offsets (failOnDataLoss: false)
- subscribing topic by name from specific offsets (failOnDataLoss: false)
- subscribing topic by name from specific timestamps (failOnDataLoss: false)
- subscribing topic by name from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by pattern from latest offsets (failOnDataLoss: false)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: false)
- subscribing topic by pattern from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by name from specific timestamps with non-matching starting offset
- subscribing topic by name from global timestamp per topic with non-matching starting offset
- subscribing topic by pattern from specific timestamps with non-matching starting offset
- subscribing topic by pattern from global timestamp per topic with non-matching starting offset
- bad source options
- unsupported kafka configs
- get offsets from case insensitive parameters
- Kafka column types
- (de)serialization of initial offsets
- SPARK-26718 Rate limit set to Long.Max should not overflow integer during end offset calculation
- maxOffsetsPerTrigger
- minOffsetsPerTrigger
- compositeReadLimit
- input row metrics
- subscribing topic by pattern with topic deletions
- subscribe topic by pattern with topic recreation between batches
- ensure that initial offset are written with an extra byte in the beginning (SPARK-19517)
- deserialization of initial offset written by Spark 2.1.0 (SPARK-19517)
- deserialization of initial offset written by future version
- KafkaSource with watermark
- delete a topic when a Spark job is running
- SPARK-22956: currentPartitionOffsets should be set when no new data comes in
- allow group.id prefix
- allow group.id override
- ensure stream-stream self-join generates only one offset in log and correct metrics
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-25495: FetchedData.reset should reset all fields
- SPARK-27494: read kafka record containing null key/values.
- SPARK-30656: minPartitions
- V1 Source is used when disabled through SQLConf
KafkaRelationSuiteWithAdminV1:
- explicit earliest to latest offsets
- default starting and ending offsets
- explicit offsets
- default starting and ending offsets with headers
- timestamp provided for starting and ending
- timestamp provided for starting, offset provided for ending
- timestamp provided for ending, offset provided for starting
- timestamp provided for starting, ending not provided
- timestamp provided for ending, starting not provided
- global timestamp provided for starting and ending
- no matched offset for timestamp - startingOffsets
- preferences on offset related options
- no matched offset for timestamp - endingOffsets
- reuse same dataframe in query
- test late binding start offsets
- bad batch query options
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-30656: minPartitions
- V1 Source is used when set through SQLConf
JsonUtilsSuite:
- parsing partitions
- parsing partitionOffsets
KafkaDelegationTokenSuite:
Java config name: /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-7142dcc9-c0b6-43ba-b06b-352dc368466b/1638054067678/krb5.conf
Loaded from Java config
>>> KdcAccessibility: reset
>>> KdcAccessibility: reset
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): zookeeper
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 66; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): zookeeper
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 74; type: 16
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=148
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=148
>>>DEBUG: TCPClient reading 163 bytes
>>> KrbKdcReq send: #bytes read=163
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:46465
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Sat Nov 27 15:01:07 PST 2021 1638054067000
	 suSec is 100
	 error code is 25
	 error Message is Additional pre-authentication required
	 sname is zookeeper/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=235
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=235
>>>DEBUG: TCPClient reading 545 bytes
>>> KrbKdcReq send: #bytes read=545
>>> KdcAccessibility: remove localhost:46465
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply zookeeper/localhost
Java config name: /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-7142dcc9-c0b6-43ba-b06b-352dc368466b/1638054067678/krb5.conf
Loaded from Java config
>>> KdcAccessibility: reset
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): zkclient
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 65; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): zkclient
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 73; type: 16
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=147
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=147
>>>DEBUG: TCPClient reading 162 bytes
>>> KrbKdcReq send: #bytes read=162
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:46465
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Sat Nov 27 15:01:08 PST 2021 1638054068000
	 suSec is 100
	 error code is 25
	 error Message is Additional pre-authentication required
	 sname is zkclient/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=233
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=233
>>>DEBUG: TCPClient reading 543 bytes
>>> KrbKdcReq send: #bytes read=543
>>> KdcAccessibility: remove localhost:46465
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply zkclient/localhost
Java config name: /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-7142dcc9-c0b6-43ba-b06b-352dc368466b/1638054067678/krb5.conf
Loaded from Java config
>>> KdcAccessibility: reset
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=147
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=147
>>>DEBUG: TCPClient reading 162 bytes
>>> KrbKdcReq send: #bytes read=162
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:46465
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Sat Nov 27 15:01:09 PST 2021 1638054069000
	 suSec is 100
	 error code is 25
	 error Message is Additional pre-authentication required
	 sname is zkclient/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=234
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=234
>>>DEBUG: TCPClient reading 543 bytes
>>> KrbKdcReq send: #bytes read=543
>>> KdcAccessibility: remove localhost:46465
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply zkclient/localhost
Found KeyTab /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-4d12aed7-6c84-4089-9d77-2b057e69226b/zookeeper.keytab for zookeeper/localhost@EXAMPLE.COM
Found ticket for zookeeper/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:07 PDT 2024
Found ticket for zkclient/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:09 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Java config name: /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-7142dcc9-c0b6-43ba-b06b-352dc368466b/1638054067678/krb5.conf
Found ticket for zkclient/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:09 PDT 2024
Service ticket not found in the subject
Loaded from Java config
>>> KdcAccessibility: reset
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> Credentials serviceCredsSingle: same realm
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=147
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=147
default etypes for default_tgs_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> CksumType: sun.security.krb5.internal.crypto.HmacSha1Aes128CksumType
>>>DEBUG: TCPClient reading 162 bytes
>>> KrbKdcReq send: #bytes read=162
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:46465
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Sat Nov 27 15:01:09 PST 2021 1638054069000
	 suSec is 100
	 error code is 25
	 error Message is Additional pre-authentication required
	 sname is zkclient/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: zkclient/localhost@EXAMPLE.COM
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Added key: 16version: 1
Added key: 17version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=234
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=234
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=578
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=578
>>>DEBUG: TCPClient reading 543 bytes
>>> KrbKdcReq send: #bytes read=543
>>> KdcAccessibility: remove localhost:46465
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16version: 1
Added key: 17version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply zkclient/localhost
>>>DEBUG: TCPClient reading 540 bytes
>>> KrbKdcReq send: #bytes read=540
>>> KdcAccessibility: remove localhost:46465
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> TGS credentials serviceCredsSingle:
>>> DEBUG: ----Credentials----
	client: zkclient/localhost@EXAMPLE.COM
	server: zookeeper/localhost@EXAMPLE.COM
	ticket: sname: zookeeper/localhost@EXAMPLE.COM
	endTime: 1724454069000
        ----Credentials end----
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 281406264
Krb5Context setting peerSeqNumber to: 281406264
Created InitSecContextToken:
0000: 01 00 6E 82 01 E8 30 82   01 E4 A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 FF  ................
0020: 61 81 FC 30 81 F9 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 21 30 1F A0 03  XAMPLE.COM.!0...
0040: 02 01 03 A1 18 30 16 1B   09 7A 6F 6F 6B 65 65 70  .....0...zookeep
0050: 65 72 1B 09 6C 6F 63 61   6C 68 6F 73 74 A3 81 BF  er..localhost...
0060: 30 81 BC A0 03 02 01 11   A1 03 02 01 01 A2 81 AF  0...............
0070: 04 81 AC A5 F1 54 CE BF   EC 2D 75 A4 62 06 A9 F4  .....T...-u.b...
0080: 17 B7 0E 5E F2 B5 71 79   80 71 8F 5E FA DB 7C 5E  ...^..qy.q.^...^
0090: C6 D8 D7 81 32 4B 60 95   2C 19 80 2E 53 CD F4 49  ....2K`.,...S..I
00A0: BC 6D 3C 50 8D A6 08 A5   BC 3D 36 69 EB 85 1A 30  .m<P.....=6i...0
00B0: 15 FC 11 DD CD D7 C1 37   96 55 99 4B FE 7E D7 A6  .......7.U.K....
00C0: 1E A1 23 81 B9 2B 73 05   10 BB 93 A5 3C 2A 97 E5  ..#..+s.....<*..
00D0: AB AC E0 46 2A DE B4 C4   DD 78 FE F7 60 B2 0D AF  ...F*....x..`...
00E0: A6 7B 14 66 5E B3 0A 53   EB B0 B8 DE 00 70 6D 20  ...f^..S.....pm 
00F0: 7E 78 16 33 29 C5 D8 90   D3 B9 46 39 15 D4 18 D0  .x.3).....F9....
0100: 05 8D 2A FB C3 78 52 D4   9A 00 80 F2 54 07 8F F6  ..*..xR.....T...
0110: 6F 4F EA 02 3A 09 D6 4E   69 3C 64 DC A8 56 DD A4  oO..:..Ni<d..V..
0120: 81 CC 30 81 C9 A0 03 02   01 11 A2 81 C1 04 81 BE  ..0.............
0130: B0 CE D1 FB FF 21 EC 0C   CB 5B 13 59 69 1E 78 F3  .....!...[.Yi.x.
0140: 9A D1 52 8D 8A 69 23 98   24 BE 60 10 21 FC C4 00  ..R..i#.$.`.!...
0150: 58 25 D8 29 FE AB 24 9F   FC 6B 37 A2 0D BD 8C 58  X%.)..$..k7....X
0160: 36 33 EC E3 F8 73 1A 26   93 EA 8E AF 5A 43 40 E5  63...s.&....ZC@.
0170: D6 0B B9 4E 36 2A 3C F9   3C 34 91 FF CF B6 0C 35  ...N6*<.<4.....5
0180: 1D B2 82 22 95 1F 6A F1   83 C7 A8 98 40 BD 7A 2B  ..."..j.....@.z+
0190: 63 E5 81 5C F3 E4 62 2F   E4 F4 92 75 55 32 B9 4A  c..\..b/...uU2.J
01A0: 41 89 52 C7 8A 88 72 33   03 2A 62 AD E0 70 BB 9A  A.R...r3.*b..p..
01B0: 05 23 DF EA 40 E2 E4 2D   68 C8 2E 77 73 EF 81 07  .#..@..-h..ws...
01C0: 39 15 8D F4 7C 27 87 23   83 8D 30 2E 31 D8 D5 35  9....'.#..0.1..5
01D0: EE F6 DA 1E 87 66 C0 B3   C8 6C 50 7F DE 48 D0 68  .....f...lP..H.h
01E0: 2D 15 7E B7 5C 65 58 35   6C 83 5D D2 68 05        -...\eX5l.].h.

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638054069/226226/E9830689E5942F09E7B57C4472CD440C/zkclient/localhost@EXAMPLE.COM to zkclient/localhost@EXAMPLE.COM|zookeeper/localhost@EXAMPLE.COM
>>> KrbApReq: authentication succeeded.
Krb5Context setting peerSeqNumber to: 281406264
Krb5Context setting mySeqNumber to: 281406264
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 10 c5 eb 38 01 01 00 00 31 3e c1 cb ee fc ab 75 38 bc fc 46 ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 10 c5 eb 38 01 01 00 00 31 3e c1 cb ee fc ab 75 38 bc fc 46 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 10 c5 eb 38 01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 94 90 33 58 99 9b 28 db 01 cf 22 2d ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 10 c5 eb 38 01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 94 90 33 58 99 9b 28 db 01 cf 22 2d ]
Krb5Context.unwrap: data=[01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
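
The initSecContext / acceptSecContext / wrap / unwrap cycle above is a standard JGSS handshake between the zkclient and zookeeper principals. A minimal initiator-side sketch, assuming it runs under Subject.doAs with a Subject from a keytab login like the one sketched earlier; the principal names come from the log, everything else is illustrative:

    import org.ietf.jgss.{GSSContext, GSSManager, GSSName, Oid}

    val krb5Mech = new Oid("1.2.840.113554.1.2.2")   // Kerberos V5 mechanism OID
    val manager  = GSSManager.getInstance()
    val peer     = manager.createName("zookeeper/localhost@EXAMPLE.COM", GSSName.NT_USER_NAME)
    val context  = manager.createContext(peer, krb5Mech, null, GSSContext.DEFAULT_LIFETIME)
    context.requestMutualAuth(true)

    // The first call prints "Entered Krb5Context.initSecContext with state=STATE_NEW"
    // and yields the InitSecContextToken hex-dumped above.
    var token = Array.emptyByteArray
    token = context.initSecContext(token, 0, token.length)
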
Java config name: /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-7142dcc9-c0b6-43ba-b06b-352dc368466b/1638054067678/krb5.conf
Loaded from Java config
>>> KdcAccessibility: reset
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=147
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=147
>>>DEBUG: TCPClient reading 162 bytes
>>> KrbKdcReq send: #bytes read=162
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:46465
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Sat Nov 27 15:01:10 PST 2021 1638054070000
	 suSec is 100
	 error code is 25
	 error message is Additional pre-authentication required
	 sname is zkclient/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=234
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=234
>>>DEBUG: TCPClient reading 543 bytes
>>> KrbKdcReq send: #bytes read=543
>>> KdcAccessibility: remove localhost:46465
Looking for keys for: zkclient/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply zkclient/localhost
Found KeyTab /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-4d12aed7-6c84-4089-9d77-2b057e69226b/zookeeper.keytab for zookeeper/localhost@EXAMPLE.COM
Found ticket for zookeeper/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:07 PDT 2024
Found ticket for zkclient/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for zkclient/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Service ticket not found in the subject
>>> Credentials serviceCredsSingle: same realm
default etypes for default_tgs_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> CksumType: sun.security.krb5.internal.crypto.HmacSha1Aes128CksumType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=578
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=578
>>>DEBUG: TCPClient reading 540 bytes
>>> KrbKdcReq send: #bytes read=540
>>> KdcAccessibility: remove localhost:46465
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> TGS credentials serviceCredsSingle:
>>> DEBUG: ----Credentials----
	client: zkclient/localhost@EXAMPLE.COM
	server: zookeeper/localhost@EXAMPLE.COM
	ticket: sname: zookeeper/localhost@EXAMPLE.COM
	endTime: 1724454070000
        ----Credentials end----
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 149419032
Krb5Context setting peerSeqNumber to: 149419032
Created InitSecContextToken:
0000: 01 00 6E 82 01 E8 30 82   01 E4 A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 FF  ................
0020: 61 81 FC 30 81 F9 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 21 30 1F A0 03  XAMPLE.COM.!0...
0040: 02 01 03 A1 18 30 16 1B   09 7A 6F 6F 6B 65 65 70  .....0...zookeep
0050: 65 72 1B 09 6C 6F 63 61   6C 68 6F 73 74 A3 81 BF  er..localhost...
0060: 30 81 BC A0 03 02 01 11   A1 03 02 01 01 A2 81 AF  0...............
0070: 04 81 AC A8 98 0D 94 66   98 F1 5B 96 7B 86 D8 09  .......f..[.....
0080: 70 0C 4C 81 BB 8B 2E 5C   46 95 30 F2 5E 51 E6 BD  p.L....\F.0.^Q..
0090: A0 44 65 73 AF E8 45 4F   5D 1A 8B 0C D8 23 06 F9  .Des..EO]....#..
00A0: 23 71 82 27 31 28 A2 77   B8 1A D8 03 54 CC F4 E1  #q.'1(.w....T...
00B0: F7 74 7B FD 32 F1 9F 39   CC 6F B1 A2 23 C2 BE 28  .t..2..9.o..#..(
00C0: 4F A3 5A D5 6C 6B E8 83   FB ED BD A5 0F AF 4A 80  O.Z.lk........J.
00D0: BE 68 2A 08 68 27 C7 E1   FC DA A6 7D 95 E2 91 E8  .h*.h'..........
00E0: 5C 88 E7 88 45 F2 53 F7   2C 91 9B C0 59 5A 45 75  \...E.S.,...YZEu
00F0: DF 5B C5 4E 95 BD 52 51   54 49 9F DD 8F 67 E4 BC  .[.N..RQTI...g..
0100: 0F 1D 30 60 12 9D 8A 67   8F A7 5B 3A F7 EF 3E A8  ..0`...g..[:..>.
0110: F2 88 F8 D8 F9 5F 8F 0D   E9 DA FE 9C DB D0 D3 A4  ....._..........
0120: 81 CC 30 81 C9 A0 03 02   01 11 A2 81 C1 04 81 BE  ..0.............
0130: 98 DE 09 D2 44 D8 51 78   F4 5A 14 17 67 6C 9C F4  ....D.Qx.Z..gl..
0140: F7 FD 2C F8 FA A6 12 5F   B8 4D 2D 9E 3B 6B 08 49  ..,...._.M-.;k.I
0150: BA AB 68 11 25 D9 28 CC   46 D3 F2 C0 94 9F AA 33  ..h.%.(.F......3
0160: CD 7B 62 2B 3B FD 22 C3   85 B1 6F 4E 68 DB C4 5C  ..b+;."...oNh..\
0170: DB 12 B8 8C F0 C2 00 BC   0D 74 E0 B0 89 0C 8D C7  .........t......
0180: 30 54 7C 2C 0F 25 CC E0   5E 65 B3 0C AB 48 30 65  0T.,.%..^e...H0e
0190: F6 38 FD 00 46 E5 18 B7   5A 90 1A 48 C2 8D 99 D1  .8..F...Z..H....
01A0: D1 A1 BF D1 FD 66 4F 24   EC 8B EB 51 54 6D F6 9B  .....fO$...QTm..
01B0: 90 57 19 AF 3F A8 AF 40   2D B7 30 E6 31 76 B4 AA  .W..?..@-.0.1v..
01C0: B3 4A A2 0A 7E 9A D5 7F   A3 BF 45 3D 90 B7 FF 7D  .J........E=....
01D0: 0D 01 E7 65 55 C2 22 52   0D 66 DE 57 38 3D CA 92  ...eU."R.f.W8=..
01E0: 80 94 FA 45 78 13 B4 84   A8 D6 A9 F5 2E 52        ...Ex........R

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: zookeeper/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638054070/379012/BE3D534BA14D0B42B5280EA27B20D047/zkclient/localhost@EXAMPLE.COM to zkclient/localhost@EXAMPLE.COM|zookeeper/localhost@EXAMPLE.COM
>>> KrbApReq: authentication succeeded.
Krb5Context setting peerSeqNumber to: 149419032
Krb5Context setting mySeqNumber to: 149419032
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 08 e7 f4 18 01 01 00 00 1d d5 c6 f9 9e 0a bf e6 a5 7f 28 d4 ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 08 e7 f4 18 01 01 00 00 1d d5 c6 f9 9e 0a bf e6 a5 7f 28 d4 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 08 e7 f4 18 01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d af 24 34 7e 3c 34 da c9 2d a5 8b 32 ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 08 e7 f4 18 01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d af 24 34 7e 3c 34 da c9 2d a5 8b 32 ]
Krb5Context.unwrap: data=[01 01 00 00 7a 6b 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
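
The short wrap/unwrap payloads above match the SASL/GSSAPI security-layer negotiation of RFC 4752: four octets, a security-layer bitmask plus a 3-byte maximum buffer size, with the client's reply carrying the authorization identity (here the zkclient principal) appended. A small decoder, assuming that layout:

    // Decode the 4-byte negotiation token, e.g. data=[01 01 00 00] above:
    // layer bitmask 0x01 = "no security layer", max buffer 0x010000 = 65536.
    def decodeSaslGssapi(data: Array[Byte]): (Int, Int) = {
      val layers = data(0) & 0xff
      val maxBuf = ((data(1) & 0xff) << 16) | ((data(2) & 0xff) << 8) | (data(3) & 0xff)
      (layers, maxBuf)
    }
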
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): kafka
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 62; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): kafka
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 70; type: 16
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=143
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=143
>>>DEBUG: TCPClient reading 159 bytes
>>> KrbKdcReq send: #bytes read=159
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:46465
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Sat Nov 27 15:01:10 PST 2021 1638054070000
	 suSec is 100
	 error code is 25
	 error message is Additional pre-authentication required
	 sname is kafka/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=230
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=230
>>>DEBUG: TCPClient reading 537 bytes
>>> KrbKdcReq send: #bytes read=537
>>> KdcAccessibility: remove localhost:46465
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsRep cons in KrbAsReq.getReply kafka/localhost
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): client
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 63; type: 17
>>> KeyTabInputStream, readName(): EXAMPLE.COM
>>> KeyTabInputStream, readName(): client
>>> KeyTabInputStream, readName(): localhost
>>> KeyTab: load() entry length: 71; type: 16
Looking for keys for: client/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
Looking for keys for: client/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
default etypes for default_tkt_enctypes: 17.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=144
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=144
>>>DEBUG: TCPClient reading 160 bytes
>>> KrbKdcReq send: #bytes read=160
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

>>> KdcAccessibility: remove localhost:46465
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
	 sTime is Sat Nov 27 15:01:10 PST 2021 1638054070000
	 suSec is 100
	 error code is 25
	 error message is Additional pre-authentication required
	 sname is client/localhost@EXAMPLE.COM
	 eData provided.
	 msgType is 30
>>>Pre-Authentication Data:
	 PA-DATA type = 19
	 PA-ETYPE-INFO2 etype = 17, salt = null, s2kparams = null

KRBError received: Additional pre-authentication required
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
default etypes for default_tkt_enctypes: 17.
Looking for keys for: client/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
Looking for keys for: client/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
default etypes for default_tkt_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=231
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=231
>>>DEBUG: TCPClient reading 539 bytes
>>> KrbKdcReq send: #bytes read=539
>>> KdcAccessibility: remove localhost:46465
Looking for keys for: client/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Service ticket not found in the subject
>>> Credentials serviceCredsSingle: same realm
default etypes for default_tgs_enctypes: 17.
>>> KrbAsRep cons in KrbAsReq.getReply client/localhost
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> CksumType: sun.security.krb5.internal.crypto.HmacSha1Aes128CksumType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=568
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=568
Found KeyTab /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-4d12aed7-6c84-4089-9d77-2b057e69226b/kafka.keytab for kafka/localhost@EXAMPLE.COM
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Service ticket not found in the subject
>>> Credentials serviceCredsSingle: same realm
default etypes for default_tgs_enctypes: 17.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> CksumType: sun.security.krb5.internal.crypto.HmacSha1Aes128CksumType
Found KeyTab /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-4d12aed7-6c84-4089-9d77-2b057e69226b/kafka.keytab for kafka/localhost@EXAMPLE.COM
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
>>> KrbKdcReq send: kdc=localhost TCP:46465, timeout=30000, number of retries =3, #bytes=570
>>> KDCCommunication: kdc=localhost TCP:46465, timeout=30000,Attempt =1, #bytes=570
>>>DEBUG: TCPClient reading 526 bytes
>>> KrbKdcReq send: #bytes read=526
>>> KdcAccessibility: remove localhost:46465
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> TGS credentials serviceCredsSingle:
>>> DEBUG: ----Credentials----
	client: kafka/localhost@EXAMPLE.COM
	server: kafka/localhost@EXAMPLE.COM
	ticket: sname: kafka/localhost@EXAMPLE.COM
	endTime: 1724454070000
        ----Credentials end----
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 462716540
Krb5Context setting peerSeqNumber to: 462716540
Created InitSecContextToken:
0000: 01 00 6E 82 01 DE 30 82   01 DA A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 F8  ................
0020: 61 81 F5 30 81 F2 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0040: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0050: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BC 30 81 B9 A0  localhost...0...
0060: 03 02 01 11 A1 03 02 01   01 A2 81 AC 04 81 A9 84  ................
0070: 43 9E 24 FA 92 1D 9A 59   1D 8E 55 36 05 D6 BA 3E  C.$....Y..U6...>
0080: 7F 37 27 F9 5C C5 52 77   98 D7 27 95 FD D4 14 84  .7'.\.Rw..'.....
0090: 21 0F 81 EA 76 21 3D A3   BB CE E9 C9 91 2B CA 03  !...v!=......+..
00A0: E4 17 C5 54 74 31 75 6B   CD A1 62 91 36 50 40 4F  ...Tt1uk..b.6P@O
00B0: 44 DE 16 77 83 C2 45 D7   34 A2 8D 88 7A 7C 14 DA  D..w..E.4...z...
00C0: 44 F0 9A B2 EA 75 D0 EC   05 EE 71 63 4E 46 B6 56  D....u....qcNF.V
00D0: FB B5 4C 90 C4 2A A5 DC   C0 DA 2B FA 40 AB 7F 2D  ..L..*....+.@..-
00E0: DB 0F BE 27 C1 A1 44 96   90 3B CB C0 38 5D 43 14  ...'..D..;..8]C.
00F0: 80 06 29 8F 03 00 01 AE   97 8E 96 8C 28 83 6E F0  ..).........(.n.
0100: A5 78 91 27 48 A5 2C BB   C9 86 97 0C D4 3C C2 95  .x.'H.,......<..
0110: 2D ED 47 93 4C B6 D8 B4   A4 81 C9 30 81 C6 A0 03  -.G.L......0....
0120: 02 01 11 A2 81 BE 04 81   BB 62 25 CE D2 71 88 E1  .........b%..q..
0130: F4 9E 48 C9 FF A3 0A B2   16 84 1D C8 61 ED C8 28  ..H.........a..(
0140: 22 37 1D CB 19 54 A3 AD   6F 47 CB AB D6 F3 FE C9  "7...T..oG......
0150: 78 9C FB FB EC B7 D4 7C   78 03 AC F1 24 69 73 A5  x.......x...$is.
0160: 7A 05 30 07 29 6B 7E 0E   2E 1D 3C 49 60 C8 52 A3  z.0.)k....<I`.R.
0170: 59 BC 17 88 4B DC AC 21   F1 BC D5 FD D1 95 6B 95  Y...K..!......k.
0180: A2 16 14 F0 6F BE 26 A3   25 4F 81 2E 65 CE 9B E5  ....o.&.%O..e...
0190: 7C 75 4E A1 0D 31 62 1D   B1 1F B6 BD C3 11 FB 71  .uN..1b........q
01A0: DB 05 9C 9C C9 E7 9A 44   6D 4F 1F 37 44 05 4F AD  .......DmO.7D.O.
01B0: 13 09 17 65 45 90 21 D2   5F 4C 1D 45 2B 63 26 CF  ...eE.!._L.E+c&.
01C0: 12 E9 D5 B3 A7 1E 8E 93   94 15 3D A9 32 97 86 3D  ..........=.2..=
01D0: 3D 0D 64 9C 3D AE 66 89   2F 9C BB D6 AD F1 89 39  =.d.=.f./......9
01E0: 44 58 97 39                                        DX.9

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638054070/677932/E8C133ADC3AD7F650FC63730F9E804D0/kafka/localhost@EXAMPLE.COM to kafka/localhost@EXAMPLE.COM|kafka/localhost@EXAMPLE.COM
>>> KrbApReq: authentication succeeded.
Krb5Context setting peerSeqNumber to: 462716540
Krb5Context setting mySeqNumber to: 462716540
>>>DEBUG: TCPClient reading 528 bytes
>>> KrbKdcReq send: #bytes read=528
>>> KdcAccessibility: remove localhost:46465
Krb5Context.wrap: data=[01 01 00 00 ]
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 1b 94 7e 7c 01 01 00 00 63 04 f0 b2 ad a6 3c 88 d2 df 15 de ]
>>> TGS credentials serviceCredsSingle:
>>> DEBUG: ----Credentials----
	client: client/localhost@EXAMPLE.COM
	server: kafka/localhost@EXAMPLE.COM
	ticket: sname: kafka/localhost@EXAMPLE.COM
	endTime: 1724454070000
        ----Credentials end----
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 1b 94 7e 7c 01 01 00 00 63 04 f0 b2 ad a6 3c 88 d2 df 15 de ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context setting mySeqNumber to: 827660987
Krb5Context setting peerSeqNumber to: 827660987
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 1b 94 7e 7c 01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d f7 1a a6 6f 69 60 2a 08 04 92 5b ab ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 1b 94 7e 7c 01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d f7 1a a6 6f 69 60 2a 08 04 92 5b ab ]
Created InitSecContextToken:
0000: 01 00 6E 82 01 E0 30 82   01 DC A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 F9  ................
0020: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0040: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0050: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0060: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA 60  ...............`
0070: 1E A4 25 A7 31 C5 D3 B0   10 35 B0 7D 0B CA 41 BA  ..%.1....5....A.
0080: 66 87 3C 2F 2D 29 B1 BB   1B 6A E9 9E 07 1B 4A FE  f.</-)...j....J.
0090: E9 4C CD C2 AA 20 B6 4F   0A 87 6A 21 AC 8D 88 13  .L... .O..j!....
00A0: 4C D1 EE 99 0D BF A3 92   B2 9C 3B 10 A7 0C 0D B8  L.........;.....
00B0: 5A 42 19 D2 9A 1B 78 F3   4A 31 EA CE 9F A0 C6 26  ZB....x.J1.....&
00C0: EA C1 5F 0C 0D E6 B1 B0   C2 A7 A3 83 11 6F A5 49  .._..........o.I
00D0: C4 BA 58 B3 1D D3 2C E7   E8 B8 88 97 F4 68 B6 DA  ..X...,......h..
00E0: 7D 42 FA F2 72 FB 80 F0   90 9C 92 9C BF B3 32 17  .B..r.........2.
00F0: 19 84 66 33 A8 E6 8B DF   E1 E5 3E D2 4E 2C 95 E2  ..f3......>.N,..
0100: 96 70 47 02 8A 39 CA E1   F5 1E B9 4F 42 08 1B F9  .pG..9.....OB...
0110: 5D 26 70 05 26 41 F6 CD   2A A4 81 CA 30 81 C7 A0  ]&p.&A..*...0...
0120: 03 02 01 11 A2 81 BF 04   81 BC 1C BC 4D CD 8B EF  ............M...
0130: 74 5A 57 E4 57 51 E0 2B   14 99 29 47 54 D2 50 87  tZW.WQ.+..)GT.P.
0140: E8 73 FB 0D 70 95 5C CB   55 90 87 71 DE 3F D6 11  .s..p.\.U..q.?..
0150: FF 26 1F F5 FA D8 F1 CF   DE 24 68 CE 6B 07 94 DE  .&.......$h.k...
0160: A1 2A 2E 6A 37 64 EC BB   C5 57 B3 63 AA 06 6C A8  .*.j7d...W.c..l.
0170: F3 76 C0 F1 2A BC D1 78   2B 6B DD 47 D9 38 DD 46  .v..*..x+k.G.8.F
0180: 52 0A 7C 5A E2 2D 0A F8   E7 1A B4 42 50 C1 CE C2  R..Z.-.....BP...
0190: A1 4D 20 1A A4 F2 52 B1   C7 60 64 E2 7D B4 E3 95  .M ...R..`d.....
01A0: F2 7C DC 77 1C E1 14 12   4C 57 FB 5C FC E6 A7 66  ...w....LW.\...f
01B0: A6 1A 8B 40 7E 52 6F BD   A8 5A C9 29 BE A4 D7 3F  ...@.Ro..Z.)...?
01C0: 61 8E 5E 05 EA 76 6D 42   1D 20 97 AC B0 25 94 B1  a.^..vmB. ...%..
01D0: 06 77 E3 7C C3 94 05 D3   C2 F1 93 E3 D0 92 24 D6  .w............$.
01E0: 35 5E 95 20 D0 FC                                  5^. ..

Krb5Context.unwrap: data=[01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638054070/685136/B172E0E91B3BAC00F34C611606B15634/client/localhost@EXAMPLE.COM to client/localhost@EXAMPLE.COM|kafka/localhost@EXAMPLE.COM
>>> KrbApReq: authentication succeeded.
Krb5Context setting peerSeqNumber to: 827660987
Krb5Context setting mySeqNumber to: 827660987
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 31 55 1a bb 01 01 00 00 b9 29 e7 29 2a e2 fb b5 56 11 26 ab ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 31 55 1a bb 01 01 00 00 b9 29 e7 29 2a e2 fb b5 56 11 26 ab ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 31 55 1a bb 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d c8 c6 f1 2c 6d 39 86 3e 11 11 f8 14 ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 31 55 1a bb 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d c8 c6 f1 2c 6d 39 86 3e 11 11 f8 14 ]
Krb5Context.unwrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Found ticket for client/localhost@EXAMPLE.COM to go to kafka/localhost@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Found KeyTab /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-4d12aed7-6c84-4089-9d77-2b057e69226b/kafka.keytab for kafka/localhost@EXAMPLE.COM
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Found service ticket in the subject
Ticket (hex) =
0000: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0010: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0020: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0030: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0040: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA 60  ...............`
0050: 1E A4 25 A7 31 C5 D3 B0   10 35 B0 7D 0B CA 41 BA  ..%.1....5....A.
0060: 66 87 3C 2F 2D 29 B1 BB   1B 6A E9 9E 07 1B 4A FE  f.</-)...j....J.
0070: E9 4C CD C2 AA 20 B6 4F   0A 87 6A 21 AC 8D 88 13  .L... .O..j!....
0080: 4C D1 EE 99 0D BF A3 92   B2 9C 3B 10 A7 0C 0D B8  L.........;.....
0090: 5A 42 19 D2 9A 1B 78 F3   4A 31 EA CE 9F A0 C6 26  ZB....x.J1.....&
00A0: EA C1 5F 0C 0D E6 B1 B0   C2 A7 A3 83 11 6F A5 49  .._..........o.I
00B0: C4 BA 58 B3 1D D3 2C E7   E8 B8 88 97 F4 68 B6 DA  ..X...,......h..
00C0: 7D 42 FA F2 72 FB 80 F0   90 9C 92 9C BF B3 32 17  .B..r.........2.
00D0: 19 84 66 33 A8 E6 8B DF   E1 E5 3E D2 4E 2C 95 E2  ..f3......>.N,..
00E0: 96 70 47 02 8A 39 CA E1   F5 1E B9 4F 42 08 1B F9  .pG..9.....OB...
00F0: 5D 26 70 05 26 41 F6 CD   2A                       ]&p.&A..*

Client Principal = client/localhost@EXAMPLE.COM
Server Principal = kafka/localhost@EXAMPLE.COM
Session Key = EncryptionKey: keyType=17 keyBytes (hex dump)=
0000: 2D 1C 39 F0 2A 7D 72 BE   8F 61 5A 12 5E 43 34 0E  -.9.*.r..aZ.^C4.


Forwardable Ticket false
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket false
Initial Ticket false
Auth Time = Sat Nov 27 15:01:10 PST 2021
Start Time = Sat Nov 27 15:01:10 PST 2021
End Time = Fri Aug 23 16:01:10 PDT 2024
Renew Till = null
Client Addresses = Null
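
The ticket summary above (principals, session key, flags, lifetimes) is also available programmatically; a minimal sketch using the stock KerberosTicket API, assuming a logged-in Subject is in scope:

    import javax.security.auth.Subject
    import javax.security.auth.kerberos.KerberosTicket
    import scala.collection.JavaConverters._

    // Print the same attributes the debug dump above shows.
    def describeTickets(subject: Subject): Unit =
      subject.getPrivateCredentials(classOf[KerberosTicket]).asScala.foreach { t =>
        println(s"client=${t.getClient} server=${t.getServer} " +
                s"forwardable=${t.isForwardable} renewable=${t.isRenewable} " +
                s"authTime=${t.getAuthTime} endTime=${t.getEndTime}")
      }
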
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 504079825
Krb5Context setting peerSeqNumber to: 504079825
Created InitSecContextToken:
0000: 01 00 6E 82 01 E0 30 82   01 DC A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 F9  ................
0020: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0040: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0050: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0060: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA 60  ...............`
0070: 1E A4 25 A7 31 C5 D3 B0   10 35 B0 7D 0B CA 41 BA  ..%.1....5....A.
0080: 66 87 3C 2F 2D 29 B1 BB   1B 6A E9 9E 07 1B 4A FE  f.</-)...j....J.
0090: E9 4C CD C2 AA 20 B6 4F   0A 87 6A 21 AC 8D 88 13  .L... .O..j!....
00A0: 4C D1 EE 99 0D BF A3 92   B2 9C 3B 10 A7 0C 0D B8  L.........;.....
00B0: 5A 42 19 D2 9A 1B 78 F3   4A 31 EA CE 9F A0 C6 26  ZB....x.J1.....&
00C0: EA C1 5F 0C 0D E6 B1 B0   C2 A7 A3 83 11 6F A5 49  .._..........o.I
00D0: C4 BA 58 B3 1D D3 2C E7   E8 B8 88 97 F4 68 B6 DA  ..X...,......h..
00E0: 7D 42 FA F2 72 FB 80 F0   90 9C 92 9C BF B3 32 17  .B..r.........2.
00F0: 19 84 66 33 A8 E6 8B DF   E1 E5 3E D2 4E 2C 95 E2  ..f3......>.N,..
0100: 96 70 47 02 8A 39 CA E1   F5 1E B9 4F 42 08 1B F9  .pG..9.....OB...
0110: 5D 26 70 05 26 41 F6 CD   2A A4 81 CA 30 81 C7 A0  ]&p.&A..*...0...
0120: 03 02 01 11 A2 81 BF 04   81 BC 6A 4E A6 17 92 40  ..........jN...@
0130: C8 41 9B 6C 40 50 1E FC   74 9A 63 05 3D B8 76 B3  .A.l@P..t.c.=.v.
0140: A4 0A E0 06 7B 6C D5 C8   AA DE 8B 2E DB FD 76 DE  .....l........v.
0150: F7 23 2A 2F 06 F2 DD 9D   9E C9 1A 69 53 5A EB 6E  .#*/.......iSZ.n
0160: 30 6A 6D 20 E1 24 90 3E   ED A5 1D 16 05 45 FB 79  0jm .$.>.....E.y
0170: BA 66 60 50 54 58 0B C8   A0 B8 8A C0 3F C2 29 DC  .f`PTX......?.).
0180: 28 47 17 70 B8 BE 36 5B   8C 00 1B E6 91 94 37 92  (G.p..6[......7.
0190: 54 18 9E 63 85 F5 EA 17   6E 13 DC 4C E9 B8 9C 68  T..c....n..L...h
01A0: 99 8E 29 86 6E E7 8B 52   C7 5B 43 D5 29 E5 C9 56  ..).n..R.[C.)..V
01B0: 12 35 E4 CF F1 0B 48 4A   C1 90 70 AA ED 1C 1D AC  .5....HJ..p.....
01C0: 8F B9 4B 51 ED 10 6C C6   6B DB D5 61 43 4E 51 AF  ..KQ..l.k..aCNQ.
01D0: 06 FA 6E 49 A3 80 29 D4   68 52 D1 FC 54 F3 AA C0  ..nI..).hR..T...
01E0: C0 1B 42 4F 47 F0                                  ..BOG.

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638054070/746673/F0EF01A509D8CFE78A8719C7EA3E2C3A/client/localhost@EXAMPLE.COM to client/localhost@EXAMPLE.COM|kafka/localhost@EXAMPLE.COM
>>> KrbApReq: authentication succeeded.
Krb5Context setting peerSeqNumber to: 504079825
Krb5Context setting mySeqNumber to: 504079825
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 1e 0b a5 d1 01 01 00 00 b2 de 7a ab 36 2e 12 48 da 20 1f ab ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 1e 0b a5 d1 01 01 00 00 b2 de 7a ab 36 2e 12 48 da 20 1f ab ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 1e 0b a5 d1 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 58 48 a3 95 77 37 c9 1b c1 8f b2 7c ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 1e 0b a5 d1 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 58 48 a3 95 77 37 c9 1b c1 8f b2 7c ]
Krb5Context.unwrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Found ticket for client/localhost@EXAMPLE.COM to go to kafka/localhost@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Found KeyTab /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-4d12aed7-6c84-4089-9d77-2b057e69226b/kafka.keytab for kafka/localhost@EXAMPLE.COM
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Found service ticket in the subject
Ticket (hex) =
0000: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0010: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0020: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0030: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0040: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA 60  ...............`
0050: 1E A4 25 A7 31 C5 D3 B0   10 35 B0 7D 0B CA 41 BA  ..%.1....5....A.
0060: 66 87 3C 2F 2D 29 B1 BB   1B 6A E9 9E 07 1B 4A FE  f.</-)...j....J.
0070: E9 4C CD C2 AA 20 B6 4F   0A 87 6A 21 AC 8D 88 13  .L... .O..j!....
0080: 4C D1 EE 99 0D BF A3 92   B2 9C 3B 10 A7 0C 0D B8  L.........;.....
0090: 5A 42 19 D2 9A 1B 78 F3   4A 31 EA CE 9F A0 C6 26  ZB....x.J1.....&
00A0: EA C1 5F 0C 0D E6 B1 B0   C2 A7 A3 83 11 6F A5 49  .._..........o.I
00B0: C4 BA 58 B3 1D D3 2C E7   E8 B8 88 97 F4 68 B6 DA  ..X...,......h..
00C0: 7D 42 FA F2 72 FB 80 F0   90 9C 92 9C BF B3 32 17  .B..r.........2.
00D0: 19 84 66 33 A8 E6 8B DF   E1 E5 3E D2 4E 2C 95 E2  ..f3......>.N,..
00E0: 96 70 47 02 8A 39 CA E1   F5 1E B9 4F 42 08 1B F9  .pG..9.....OB...
00F0: 5D 26 70 05 26 41 F6 CD   2A                       ]&p.&A..*

Client Principal = client/localhost@EXAMPLE.COM
Server Principal = kafka/localhost@EXAMPLE.COM
Session Key = EncryptionKey: keyType=17 keyBytes (hex dump)=
0000: 2D 1C 39 F0 2A 7D 72 BE   8F 61 5A 12 5E 43 34 0E  -.9.*.r..aZ.^C4.


Forwardable Ticket false
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket false
Initial Ticket false
Auth Time = Sat Nov 27 15:01:10 PST 2021
Start Time = Sat Nov 27 15:01:10 PST 2021
End Time = Fri Aug 23 16:01:10 PDT 2024
Renew Till = null
Client Addresses = Null
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 613036713
Krb5Context setting peerSeqNumber to: 613036713
Created InitSecContextToken:
0000: 01 00 6E 82 01 E0 30 82   01 DC A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 F9  ................
0020: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0040: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0050: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0060: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA 60  ...............`
0070: 1E A4 25 A7 31 C5 D3 B0   10 35 B0 7D 0B CA 41 BA  ..%.1....5....A.
0080: 66 87 3C 2F 2D 29 B1 BB   1B 6A E9 9E 07 1B 4A FE  f.</-)...j....J.
0090: E9 4C CD C2 AA 20 B6 4F   0A 87 6A 21 AC 8D 88 13  .L... .O..j!....
00A0: 4C D1 EE 99 0D BF A3 92   B2 9C 3B 10 A7 0C 0D B8  L.........;.....
00B0: 5A 42 19 D2 9A 1B 78 F3   4A 31 EA CE 9F A0 C6 26  ZB....x.J1.....&
00C0: EA C1 5F 0C 0D E6 B1 B0   C2 A7 A3 83 11 6F A5 49  .._..........o.I
00D0: C4 BA 58 B3 1D D3 2C E7   E8 B8 88 97 F4 68 B6 DA  ..X...,......h..
00E0: 7D 42 FA F2 72 FB 80 F0   90 9C 92 9C BF B3 32 17  .B..r.........2.
00F0: 19 84 66 33 A8 E6 8B DF   E1 E5 3E D2 4E 2C 95 E2  ..f3......>.N,..
0100: 96 70 47 02 8A 39 CA E1   F5 1E B9 4F 42 08 1B F9  .pG..9.....OB...
0110: 5D 26 70 05 26 41 F6 CD   2A A4 81 CA 30 81 C7 A0  ]&p.&A..*...0...
0120: 03 02 01 11 A2 81 BF 04   81 BC A5 AC 1E 48 A7 99  .............H..
0130: 1F A9 E3 A3 10 DD 11 7F   91 74 D3 28 AB 1E EB 0C  .........t.(....
0140: 22 2C 79 DE 2F 4D 41 CB   AF A9 4C FA 9E 5A A2 F3  ",y./MA...L..Z..
0150: A3 53 E5 19 C2 82 1C 4D   D1 14 19 F3 BD 73 69 E1  .S.....M.....si.
0160: 08 97 93 FA 72 D3 6E 4A   55 9E 60 DA E0 BE D4 4C  ....r.nJU.`....L
0170: 42 A8 F0 7C 4F 7A 76 C2   7A 93 38 73 99 40 28 9A  B...Ozv.z.8s.@(.
0180: CB 72 D4 81 4C 50 04 F1   D9 B2 7F 45 C2 E7 92 15  .r..LP.....E....
0190: 2B DF 3C B9 FF A2 64 6B   51 36 8B 6E 19 55 3C 1A  +.<...dkQ6.n.U<.
01A0: E6 06 DC 0E EB 4A 84 ED   02 C3 BA 11 10 FA AD 48  .....J.........H
01B0: 3C 1B 25 7D D2 BB 37 41   29 FA 0C B4 D9 36 AB FD  <.%...7A)....6..
01C0: FA 69 8F AE 23 04 08 49   F0 A3 BA 96 4E 54 3A 8C  .i..#..I....NT:.
01D0: AF 96 A0 CB EF D1 E2 96   0F C4 44 50 2D B5 CE 70  ..........DP-..p
01E0: A8 1E 74 0F 3C AC                                  ..t.<.

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638054070/758790/A2DF504C0EECB68A247A8C5C1B82F1A1/client/localhost@EXAMPLE.COM to client/localhost@EXAMPLE.COM|kafka/localhost@EXAMPLE.COM
>>> KrbApReq: authentication succeeded.
Krb5Context setting peerSeqNumber to: 613036713
Krb5Context setting mySeqNumber to: 613036713
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 24 8a 32 a9 01 01 00 00 80 4f 36 50 9f 94 f5 0f f0 2d 89 24 ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 24 8a 32 a9 01 01 00 00 80 4f 36 50 9f 94 f5 0f f0 2d 89 24 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 24 8a 32 a9 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 09 25 05 b6 38 25 43 77 c8 87 5e 75 ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 24 8a 32 a9 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d 09 25 05 b6 38 25 43 77 c8 87 5e 75 ]
Krb5Context.unwrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for client/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Found ticket for client/localhost@EXAMPLE.COM to go to kafka/localhost@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Found service ticket in the subject
Ticket (hex) =
0000: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0010: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0020: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0030: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0040: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA 60  ...............`
0050: 1E A4 25 A7 31 C5 D3 B0   10 35 B0 7D 0B CA 41 BA  ..%.1....5....A.
0060: 66 87 3C 2F 2D 29 B1 BB   1B 6A E9 9E 07 1B 4A FE  f.</-)...j....J.
0070: E9 4C CD C2 AA 20 B6 4F   0A 87 6A 21 AC 8D 88 13  .L... .O..j!....
0080: 4C D1 EE 99 0D BF A3 92   B2 9C 3B 10 A7 0C 0D B8  L.........;.....
0090: 5A 42 19 D2 9A 1B 78 F3   4A 31 EA CE 9F A0 C6 26  ZB....x.J1.....&
00A0: EA C1 5F 0C 0D E6 B1 B0   C2 A7 A3 83 11 6F A5 49  .._..........o.I
00B0: C4 BA 58 B3 1D D3 2C E7   E8 B8 88 97 F4 68 B6 DA  ..X...,......h..
00C0: 7D 42 FA F2 72 FB 80 F0   90 9C 92 9C BF B3 32 17  .B..r.........2.
00D0: 19 84 66 33 A8 E6 8B DF   E1 E5 3E D2 4E 2C 95 E2  ..f3......>.N,..
00E0: 96 70 47 02 8A 39 CA E1   F5 1E B9 4F 42 08 1B F9  .pG..9.....OB...
00F0: 5D 26 70 05 26 41 F6 CD   2A                       ]&p.&A..*

Client Principal = client/localhost@EXAMPLE.COM
Server Principal = kafka/localhost@EXAMPLE.COM
Session Key = EncryptionKey: keyType=17 keyBytes (hex dump)=
0000: 2D 1C 39 F0 2A 7D 72 BE   8F 61 5A 12 5E 43 34 0E  -.9.*.r..aZ.^C4.


Forwardable Ticket false
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket false
Initial Ticket false
Auth Time = Sat Nov 27 15:01:10 PST 2021
Start Time = Sat Nov 27 15:01:10 PST 2021
End Time = Fri Aug 23 16:01:10 PDT 2024
Renew Till = null
Client Addresses = Null
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
Found KeyTab /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-4d12aed7-6c84-4089-9d77-2b057e69226b/kafka.keytab for kafka/localhost@EXAMPLE.COM
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Krb5Context setting mySeqNumber to: 820227385
Krb5Context setting peerSeqNumber to: 820227385
Created InitSecContextToken:
0000: 01 00 6E 82 01 E0 30 82   01 DC A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 F9  ................
0020: 61 81 F6 30 81 F3 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0040: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0050: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BD 30 81 BA A0  localhost...0...
0060: 03 02 01 11 A1 03 02 01   01 A2 81 AD 04 81 AA 60  ...............`
0070: 1E A4 25 A7 31 C5 D3 B0   10 35 B0 7D 0B CA 41 BA  ..%.1....5....A.
0080: 66 87 3C 2F 2D 29 B1 BB   1B 6A E9 9E 07 1B 4A FE  f.</-)...j....J.
0090: E9 4C CD C2 AA 20 B6 4F   0A 87 6A 21 AC 8D 88 13  .L... .O..j!....
00A0: 4C D1 EE 99 0D BF A3 92   B2 9C 3B 10 A7 0C 0D B8  L.........;.....
00B0: 5A 42 19 D2 9A 1B 78 F3   4A 31 EA CE 9F A0 C6 26  ZB....x.J1.....&
00C0: EA C1 5F 0C 0D E6 B1 B0   C2 A7 A3 83 11 6F A5 49  .._..........o.I
00D0: C4 BA 58 B3 1D D3 2C E7   E8 B8 88 97 F4 68 B6 DA  ..X...,......h..
00E0: 7D 42 FA F2 72 FB 80 F0   90 9C 92 9C BF B3 32 17  .B..r.........2.
00F0: 19 84 66 33 A8 E6 8B DF   E1 E5 3E D2 4E 2C 95 E2  ..f3......>.N,..
0100: 96 70 47 02 8A 39 CA E1   F5 1E B9 4F 42 08 1B F9  .pG..9.....OB...
0110: 5D 26 70 05 26 41 F6 CD   2A A4 81 CA 30 81 C7 A0  ]&p.&A..*...0...
0120: 03 02 01 11 A2 81 BF 04   81 BC 41 8E 66 42 43 46  ..........A.fBCF
0130: D5 95 F2 49 3D 5D A1 23   3B C0 F0 16 59 3B 23 F9  ...I=].#;...Y;#.
0140: EF B3 BA A4 B8 57 00 D7   A9 BD B2 69 4C 6B A1 E3  .....W.....iLk..
0150: 0C 56 7C 32 37 B9 2C CA   DD 1A 18 7F BF D5 F7 27  .V.27.,........'
0160: A5 2C DD 14 47 A4 03 10   2C 41 1D 66 39 FE 18 C5  .,..G...,A.f9...
0170: 24 F2 01 4D B5 FE FE D3   4B 8F 9B 0D 1E 73 94 A8  $..M....K....s..
0180: 44 64 2A A5 B7 B6 A1 6F   F1 C3 B2 E3 F9 6F FA 79  Dd*....o.....o.y
0190: 6C 08 A4 6C 65 72 CF 2E   60 D4 CE E1 3F CC E3 4B  l..ler..`...?..K
01A0: 13 56 0C F8 7A 08 FC 85   47 A6 A4 26 12 E3 D2 63  .V..z...G..&...c
01B0: A2 45 33 06 F6 83 D2 87   A7 82 AC 65 7C 29 2E F4  .E3........e.)..
01C0: 42 01 D7 1B 35 77 5C 38   50 9C 2C C2 85 1F 81 D2  B...5w\8P.,.....
01D0: 2B 6E D4 84 00 7C 8A 57   22 3D 64 01 23 0D 26 A1  +n.....W"=d.#.&.
01E0: 20 CD F4 6A 88 53                                   ..j.S

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638054070/892459/31BED171DFB2DC5B62EF91F7E53993B7/client/localhost@EXAMPLE.COM to client/localhost@EXAMPLE.COM|kafka/localhost@EXAMPLE.COM
>>> KrbApReq: authentication succeeded.
Krb5Context setting peerSeqNumber to: 820227385
Krb5Context setting mySeqNumber to: 820227385
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 30 e3 ad 39 01 01 00 00 d3 ca f0 a7 83 72 9c 2a 18 6f 09 a8 ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 30 e3 ad 39 01 01 00 00 d3 ca f0 a7 83 72 9c 2a 18 6f 09 a8 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 30 e3 ad 39 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d bf 47 cc 68 fb 60 b3 94 13 df 9b 8d ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 30 e3 ad 39 01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d bf 47 cc 68 fb 60 b3 94 13 df 9b 8d ]
Krb5Context.unwrap: data=[01 01 00 00 63 6c 69 65 6e 74 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
- Roundtrip
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Found KeyTab /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kafka-0-10-sql/target/tmp/spark-4d12aed7-6c84-4089-9d77-2b057e69226b/kafka.keytab for kafka/localhost@EXAMPLE.COM
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for kafka/localhost@EXAMPLE.COM to go to krbtgt/EXAMPLE.COM@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Found ticket for kafka/localhost@EXAMPLE.COM to go to kafka/localhost@EXAMPLE.COM expiring on Fri Aug 23 16:01:10 PDT 2024
Found service ticket in the subject
Ticket (hex) =
0000: 61 81 F5 30 81 F2 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0010: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0020: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0030: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BC 30 81 B9 A0  localhost...0...
0040: 03 02 01 11 A1 03 02 01   01 A2 81 AC 04 81 A9 84  ................
0050: 43 9E 24 FA 92 1D 9A 59   1D 8E 55 36 05 D6 BA 3E  C.$....Y..U6...>
0060: 7F 37 27 F9 5C C5 52 77   98 D7 27 95 FD D4 14 84  .7'.\.Rw..'.....
0070: 21 0F 81 EA 76 21 3D A3   BB CE E9 C9 91 2B CA 03  !...v!=......+..
0080: E4 17 C5 54 74 31 75 6B   CD A1 62 91 36 50 40 4F  ...Tt1uk..b.6P@O
0090: 44 DE 16 77 83 C2 45 D7   34 A2 8D 88 7A 7C 14 DA  D..w..E.4...z...
00A0: 44 F0 9A B2 EA 75 D0 EC   05 EE 71 63 4E 46 B6 56  D....u....qcNF.V
00B0: FB B5 4C 90 C4 2A A5 DC   C0 DA 2B FA 40 AB 7F 2D  ..L..*....+.@..-
00C0: DB 0F BE 27 C1 A1 44 96   90 3B CB C0 38 5D 43 14  ...'..D..;..8]C.
00D0: 80 06 29 8F 03 00 01 AE   97 8E 96 8C 28 83 6E F0  ..).........(.n.
00E0: A5 78 91 27 48 A5 2C BB   C9 86 97 0C D4 3C C2 95  .x.'H.,......<..
00F0: 2D ED 47 93 4C B6 D8 B4                            -.G.L...

Client Principal = kafka/localhost@EXAMPLE.COM
Server Principal = kafka/localhost@EXAMPLE.COM
Session Key = EncryptionKey: keyType=17 keyBytes (hex dump)=
0000: 69 ED 20 B0 14 31 30 E0   A3 50 04 9D 4A C5 D5 D3  i. ..10..P..J...


Forwardable Ticket false
Forwarded Ticket false
Proxiable Ticket false
Proxy Ticket false
Postdated Ticket false
Renewable Ticket false
Initial Ticket false
Auth Time = Sat Nov 27 15:01:10 PST 2021
Start Time = Sat Nov 27 15:01:10 PST 2021
End Time = Fri Aug 23 16:01:10 PDT 2024
Renew Till = null
Client Addresses = Null
>>> KrbApReq: APOptions are 00000000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 708848541
Krb5Context setting peerSeqNumber to: 708848541
Created InitSecContextToken:
0000: 01 00 6E 82 01 DE 30 82   01 DA A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 00 00 00 00 A3 81 F8  ................
0020: 61 81 F5 30 81 F2 A0 03   02 01 05 A1 0D 1B 0B 45  a..0...........E
0030: 58 41 4D 50 4C 45 2E 43   4F 4D A2 1D 30 1B A0 03  XAMPLE.COM..0...
0040: 02 01 03 A1 14 30 12 1B   05 6B 61 66 6B 61 1B 09  .....0...kafka..
0050: 6C 6F 63 61 6C 68 6F 73   74 A3 81 BC 30 81 B9 A0  localhost...0...
0060: 03 02 01 11 A1 03 02 01   01 A2 81 AC 04 81 A9 84  ................
0070: 43 9E 24 FA 92 1D 9A 59   1D 8E 55 36 05 D6 BA 3E  C.$....Y..U6...>
0080: 7F 37 27 F9 5C C5 52 77   98 D7 27 95 FD D4 14 84  .7'.\.Rw..'.....
0090: 21 0F 81 EA 76 21 3D A3   BB CE E9 C9 91 2B CA 03  !...v!=......+..
00A0: E4 17 C5 54 74 31 75 6B   CD A1 62 91 36 50 40 4F  ...Tt1uk..b.6P@O
00B0: 44 DE 16 77 83 C2 45 D7   34 A2 8D 88 7A 7C 14 DA  D..w..E.4...z...
00C0: 44 F0 9A B2 EA 75 D0 EC   05 EE 71 63 4E 46 B6 56  D....u....qcNF.V
00D0: FB B5 4C 90 C4 2A A5 DC   C0 DA 2B FA 40 AB 7F 2D  ..L..*....+.@..-
00E0: DB 0F BE 27 C1 A1 44 96   90 3B CB C0 38 5D 43 14  ...'..D..;..8]C.
00F0: 80 06 29 8F 03 00 01 AE   97 8E 96 8C 28 83 6E F0  ..).........(.n.
0100: A5 78 91 27 48 A5 2C BB   C9 86 97 0C D4 3C C2 95  .x.'H.,......<..
0110: 2D ED 47 93 4C B6 D8 B4   A4 81 C9 30 81 C6 A0 03  -.G.L......0....
0120: 02 01 11 A2 81 BE 04 81   BB 71 DD C8 6A D5 24 3E  .........q..j.$>
0130: 92 C6 2E 8D 40 0C DF DC   04 F8 A2 C5 19 7E A6 7B  ....@...........
0140: 18 99 BA 23 D4 B4 8D 63   FF 27 4D B9 6B A1 E3 58  ...#...c.'M.k..X
0150: BE EF 7D 92 A8 BE BA 2A   E6 B5 97 7F E8 7F 26 A6  .......*......&.
0160: 70 23 D3 0D 57 6F 27 87   21 58 55 E4 00 AC 1F 29  p#..Wo'.!XU....)
0170: 92 10 41 E4 62 79 AF 3C   A3 E4 9D AD C2 EA E1 3B  ..A.by.<.......;
0180: D2 DD 63 81 00 04 25 B3   11 49 09 34 3A E1 00 44  ..c...%..I.4:..D
0190: A1 D5 E0 29 BC E6 19 7A   C4 ED 8A E5 B3 75 72 D6  ...)...z.....ur.
01A0: E0 4C 88 CD 90 75 CE C7   ED AD 24 70 0B 47 16 6A  .L...u....$p.G.j
01B0: 91 55 F3 C2 F8 11 F3 2E   04 E8 C5 66 E8 1E 20 00  .U.........f.. .
01C0: 2A AB 77 60 91 F8 5C 85   04 CC 09 2D 60 D8 B1 49  *.w`..\....-`..I
01D0: 8E A9 C0 03 F8 FC 2A A1   DC 9C 88 93 65 58 18 1C  ......*.....eX..
01E0: 36 B7 31 2D                                        6.1-

Entered Krb5Context.acceptSecContext with state=STATE_NEW
Looking for keys for: kafka/localhost@EXAMPLE.COM
Added key: 16, version: 1
Added key: 17, version: 1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Using builtin default etypes for permitted_enctypes
default etypes for permitted_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
MemoryCache: add 1638054072/517796/6751E9404175785ADD4B17F10EF1645F/kafka/localhost@EXAMPLE.COM to kafka/localhost@EXAMPLE.COM|kafka/localhost@EXAMPLE.COM
>>> KrbApReq: authentication succeeded.
Krb5Context setting peerSeqNumber to: 708848541
Krb5Context setting mySeqNumber to: 708848541
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 2a 40 2b 9d 01 01 00 00 db 5e 4c cd e6 f3 d6 e9 b8 d7 06 7b ]
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 2a 40 2b 9d 01 01 00 00 db 5e 4c cd e6 f3 d6 e9 b8 d7 06 7b ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 2a 40 2b 9d 01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d c6 bb 0f b7 f7 89 40 8a 71 be 2c 8d ]
Krb5Context.unwrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 2a 40 2b 9d 01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d c6 bb 0f b7 f7 89 40 8a 71 be 2c 8d ]
Krb5Context.unwrap: data=[01 01 00 00 6b 61 66 6b 61 2f 6c 6f 63 61 6c 68 6f 73 74 40 45 58 41 4d 50 4c 45 2e 43 4f 4d ]
KafkaSourceOffsetSuite:
- comparison {"t":{"0":1}} <=> {"t":{"0":2}}
- comparison {"t":{"1":0,"0":1}} <=> {"t":{"1":1,"0":2}}
- comparison {"t":{"0":1},"T":{"0":0}} <=> {"t":{"0":2},"T":{"0":1}}
- comparison {"t":{"0":1}} <=> {"t":{"1":1,"0":2}}
- comparison {"t":{"0":1}} <=> {"t":{"1":3,"0":2}}
- basic serialization - deserialization
- OffsetSeqLog serialization - deserialization
- read Spark 2.1.0 offset format
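
The comparison strings above are serialized Kafka source offsets: a JSON map of topic to partition to offset. A sketch of that shape; the types below are illustrative, not the suite's internal classes:

    import org.apache.kafka.common.TopicPartition

    // {"t":{"1":0,"0":1}} corresponds to partition offsets like these:
    val offsets: Map[TopicPartition, Long] = Map(
      new TopicPartition("t", 0) -> 1L,
      new TopicPartition("t", 1) -> 0L)

    // Grouped per topic, the structure that gets serialized to JSON:
    val perTopic: Map[String, Map[Int, Long]] =
      offsets.groupBy(_._1.topic).map { case (topic, m) =>
        topic -> m.map { case (tp, off) => tp.partition -> off }
      }
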
KafkaOffsetReaderSuite:
- isolationLevel must give back default isolation level when not set
- isolationLevel must give back READ_UNCOMMITTED when set
- isolationLevel must give back READ_COMMITTED when set
- isolationLevel must throw exception when invalid isolation level set
- SPARK-30656: getOffsetRangesFromUnresolvedOffsets - using specific offsets with useDeprecatedOffsetFetching true
- SPARK-30656: getOffsetRangesFromUnresolvedOffsets - using specific offsets with useDeprecatedOffsetFetching false
- SPARK-30656: getOffsetRangesFromUnresolvedOffsets - using special offsets with useDeprecatedOffsetFetching true
- SPARK-30656: getOffsetRangesFromUnresolvedOffsets - using special offsets with useDeprecatedOffsetFetching false
- SPARK-30656: getOffsetRangesFromUnresolvedOffsets - multiple topic partitions with useDeprecatedOffsetFetching true
- SPARK-30656: getOffsetRangesFromUnresolvedOffsets - multiple topic partitions with useDeprecatedOffsetFetching false
- SPARK-30656: getOffsetRangesFromResolvedOffsets with useDeprecatedOffsetFetching true
- SPARK-30656: getOffsetRangesFromResolvedOffsets with useDeprecatedOffsetFetching false
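Two knobs are covered by this suite: the consumer's isolation.level, passed through with the kafka. prefix, and the SPARK-30656 switch between the deprecated consumer-based offset fetching and the newer AdminClient-based path. A sketch of both (the SQL conf name below is my reading of SPARK-30656 and should be treated as an assumption):

    // read_committed hides records from aborted Kafka transactions; the default is read_uncommitted.
    spark.conf.set("spark.sql.streaming.kafka.useDeprecatedOffsetFetching", "false") // AdminClient path
    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "t")
      .option("kafka.isolation.level", "read_committed")
      .load()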
KafkaSinkBatchSuiteV2:
- batch - write to kafka
- batch - partition column and partitioner priorities
- batch - null topic field value, and no topic option
- SPARK-20496: batch - enforce analyzed plans
- batch - unsupported save modes
- generic - write big data with small producer buffer
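The batch sink tests above write a DataFrame with value (and optionally key and topic) columns using the same kafka format. A minimal sketch (broker and topic hypothetical):

    import spark.implicits._

    Seq(("k1", "v1"), ("k2", "v2")).toDF("key", "value")
      .write
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("topic", "t") // the topic option overrides any topic column in the data
      .save()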
KafkaSourceStressSuite:
- stress test with multiple topics and partitions
KafkaDontFailOnDataLossSuite:
- failOnDataLoss=false should not return duplicated records: microbatch v1
- failOnDataLoss=false should not return duplicated records: microbatch v2
- failOnDataLoss=false should not return duplicated records: continuous processing
- failOnDataLoss=false should not return duplicated records: batch v1
- failOnDataLoss=false should not return duplicated records: batch v2
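These tests stress failOnDataLoss=false: when requested offsets are gone (topic deleted, retention expired), the source logs a warning and keeps going instead of failing the query, and the suite verifies that doing so still never returns duplicated records. It is set like any other source option, .option("failOnDataLoss", "false"), across all four execution modes listed above.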
KafkaSinkBatchSuiteV1:
- batch - write to kafka
- batch - partition column and partitioner priorities
- batch - null topic field value, and no topic option
- SPARK-20496: batch - enforce analyzed plans
- batch - unsupported save modes
KafkaSparkConfSuite:
- deprecated configs
KafkaContinuousSourceSuite:
- cannot stop Kafka stream
- assign from latest offsets (failOnDataLoss: true)
- assign from earliest offsets (failOnDataLoss: true)
- assign from specific offsets (failOnDataLoss: true)
- assign from specific timestamps (failOnDataLoss: true)
- assign from global timestamp per topic (failOnDataLoss: true)
- subscribing topic by name from latest offsets (failOnDataLoss: true)
- subscribing topic by name from earliest offsets (failOnDataLoss: true)
- subscribing topic by name from specific offsets (failOnDataLoss: true)
- subscribing topic by name from specific timestamps (failOnDataLoss: true)
- subscribing topic by name from global timestamp per topic (failOnDataLoss: true)
- subscribing topic by pattern from latest offsets (failOnDataLoss: true)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: true)
- subscribing topic by pattern from global timestamp per topic (failOnDataLoss: true)
- assign from latest offsets (failOnDataLoss: false)
- assign from earliest offsets (failOnDataLoss: false)
- assign from specific offsets (failOnDataLoss: false)
- assign from specific timestamps (failOnDataLoss: false)
- assign from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by name from latest offsets (failOnDataLoss: false)
- subscribing topic by name from earliest offsets (failOnDataLoss: false)
- subscribing topic by name from specific offsets (failOnDataLoss: false)
- subscribing topic by name from specific timestamps (failOnDataLoss: false)
- subscribing topic by name from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by pattern from latest offsets (failOnDataLoss: false)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific offsets (failOnDataLoss: false)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: false)
- subscribing topic by pattern from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by name from specific timestamps with non-matching starting offset
- subscribing topic by name from global timestamp per topic with non-matching starting offset
- subscribing topic by pattern from specific timestamps with non-matching starting offset
- subscribing topic by pattern from global timestamp per topic with non-matching starting offset
- bad source options
- unsupported kafka configs
- get offsets from case insensitive parameters
- Kafka column types
- ensure continuous stream is being used
- read Kafka transactional messages: read_committed
- read Kafka transactional messages: read_uncommitted
- SPARK-27494: read kafka record containing null key/values.
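This suite replays the option matrix under continuous processing, which is selected solely by the trigger on the sink side. A sketch, reusing the df from the reader sketch earlier (checkpoint path hypothetical):

    import org.apache.spark.sql.streaming.Trigger

    val query = df.writeStream
      .format("console")
      .option("checkpointLocation", "/tmp/ckpt-sketch") // hypothetical path
      .trigger(Trigger.Continuous("1 second"))          // epoch interval; engages the continuous engine
      .start()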
KafkaSourceStressForDontFailOnDataLossSuite:
- stress test for failOnDataLoss=false
KafkaContinuousSourceTopicDeletionSuite:
- ensure continuous stream is being used
- subscribing topic by pattern with topic deletions
KafkaMicroBatchV2SourceWithAdminSuite:
- cannot stop Kafka stream
- assign from latest offsets (failOnDataLoss: true)
- assign from earliest offsets (failOnDataLoss: true)
- assign from specific offsets (failOnDataLoss: true)
- assign from specific timestamps (failOnDataLoss: true)
- assign from global timestamp per topic (failOnDataLoss: true)
- subscribing topic by name from latest offsets (failOnDataLoss: true)
- subscribing topic by name from earliest offsets (failOnDataLoss: true)
- subscribing topic by name from specific offsets (failOnDataLoss: true)
- subscribing topic by name from specific timestamps (failOnDataLoss: true)
- subscribing topic by name from global timestamp per topic (failOnDataLoss: true)
- subscribing topic by pattern from latest offsets (failOnDataLoss: true)
- subscribing topic by pattern from earliest offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific offsets (failOnDataLoss: true)
- subscribing topic by pattern from specific timestamps (failOnDataLoss: true)
- subscribing topic by pattern from global timestamp per topic (failOnDataLoss: true)
- assign from latest offsets (failOnDataLoss: false)
- assign from earliest offsets (failOnDataLoss: false)
- assign from specific offsets (failOnDataLoss: false)
- assign from specific timestamps (failOnDataLoss: false)
- assign from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by name from latest offsets (failOnDataLoss: false)
- subscribing topic by name from earliest offsets (failOnDataLoss: false)
- subscribing topic by name from specific offsets (failOnDataLoss: false)
- subscribing topic by name from specific timestamps (failOnDataLoss: false)
- subscribing topic by name from global timestamp per topic (failOnDataLoss: false)
- subscribing topic by pattern from latest offsets (failOnDataLoss: false)
Build timed out (after 480 minutes). Marking the build as aborted.
Build was aborted
Archiving artifacts
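At this point the 480-minute wall-clock limit has fired mid-suite: Jenkins aborts the build and signals the running Maven/test JVMs, which is the most plausible root cause of every failure below, rather than anything Kafka-specific.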
- subscribing topic by pattern from earliest offsets (failOnDataLoss: false) *** FAILED ***
  org.apache.spark.sql.streaming.StreamingQueryException: Query [id = 5842981d-0643-424e-8a13-bc3e7e451c2b, runId = 133e1c29-a717-40d1-b8d4-96fe85fc0211] terminated with exception: null
  at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:325)
  at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:209)
  ...
  Cause: java.lang.NullPointerException:
  at org.apache.spark.sql.kafka010.KafkaOffsetReaderAdmin.getSortedExecutorList(KafkaOffsetReaderAdmin.scala:436)
  at org.apache.spark.sql.kafka010.KafkaOffsetReaderAdmin.getOffsetRangesFromResolvedOffsets(KafkaOffsetReaderAdmin.scala:491)
  at org.apache.spark.sql.kafka010.KafkaMicroBatchStream.planInputPartitions(KafkaMicroBatchStream.scala:183)
  at org.apache.spark.sql.execution.datasources.v2.MicroBatchScanExec.partitions$lzycompute(MicroBatchScanExec.scala:44)
  at org.apache.spark.sql.execution.datasources.v2.MicroBatchScanExec.partitions(MicroBatchScanExec.scala:44)
  at org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase.supportsColumnar(DataSourceV2ScanExecBase.scala:93)
  at org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase.supportsColumnar$(DataSourceV2ScanExecBase.scala:92)
  at org.apache.spark.sql.execution.datasources.v2.MicroBatchScanExec.supportsColumnar(MicroBatchScanExec.scala:29)
  at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Strategy.apply(DataSourceV2Strategy.scala:150)
  at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$1(QueryPlanner.scala:63)
  ...
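The first casualty is a NullPointerException inside KafkaOffsetReaderAdmin.getSortedExecutorList, the step that sorts live executor locations while planning Kafka input partitions; once the aborted SparkContext begins tearing down, the executor information it consults is presumably already gone. Every subsequent test in the suite fails faster, on the stopped-context guard shown next.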
- subscribing topic by pattern from specific offsets (failOnDataLoss: false) *** FAILED ***
  java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:

org.apache.spark.sql.kafka010.KafkaMicroBatchV2SourceWithAdminSuite.beforeAll(KafkaMicroBatchSourceSuite.scala:1319)
org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:62)
org.scalatest.Suite.callExecuteOnSuite$1(Suite.scala:1175)
org.scalatest.Suite.$anonfun$runNestedSuites$1(Suite.scala:1222)
scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
org.scalatest.Suite.runNestedSuites(Suite.scala:1220)
org.scalatest.Suite.runNestedSuites$(Suite.scala:1154)
org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
org.scalatest.Suite.run(Suite.scala:1109)
org.scalatest.Suite.run$(Suite.scala:1094)
org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1322)
org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1316)
scala.collection.immutable.List.foreach(List.scala:431)

The currently active SparkContext was created at:

(No active SparkContext.)
  at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:119)
  at org.apache.spark.sql.SparkSession.<init>(SparkSession.scala:109)
  at org.apache.spark.sql.SparkSession.cloneSession(SparkSession.scala:272)
  at org.apache.spark.sql.execution.streaming.StreamExecution.<init>(StreamExecution.scala:196)
  at org.apache.spark.sql.execution.streaming.MicroBatchExecution.<init>(MicroBatchExecution.scala:47)
  at org.apache.spark.sql.streaming.StreamingQueryManager.createQuery(StreamingQueryManager.scala:275)
  at org.apache.spark.sql.streaming.StreamingQueryManager.startQuery(StreamingQueryManager.scala:322)
  at org.apache.spark.sql.streaming.StreamTest.executeAction$1(StreamTest.scala:549)
  at org.apache.spark.sql.streaming.StreamTest.$anonfun$testStream$56(StreamTest.scala:786)
  at org.apache.spark.sql.streaming.StreamTest.$anonfun$testStream$56$adapted(StreamTest.scala:773)
  ...
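All remaining failures are this same IllegalStateException out of SparkContext.assertNotStopped. The behavior is easy to reproduce in isolation; a minimal sketch:

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("stopped-repro"))
    sc.stop()
    // Any further method call now fails with:
    //   java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
    sc.parallelize(1 to 10).count()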
- subscribing topic by pattern from specific timestamps (failOnDataLoss: false) *** FAILED ***
  java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
  ... (same creation site and stack trace as the first stopped-SparkContext failure above)
- subscribing topic by pattern from global timestamp per topic (failOnDataLoss: false) *** FAILED ***
  java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
  ... (same creation site and stack trace as the first stopped-SparkContext failure above)
- subscribing topic by name from specific timestamps with non-matching starting offset *** FAILED ***
  java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
  ... (same creation site and stack trace as the first stopped-SparkContext failure above)
- subscribing topic by name from global timestamp per topic with non-matching starting offset *** FAILED ***
  java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
  ... (same creation site and stack trace as the first stopped-SparkContext failure above)
- subscribing topic by pattern from specific timestamps with non-matching starting offset *** FAILED ***
  java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
  ... (same creation site and stack trace as the first stopped-SparkContext failure above)
- subscribing topic by pattern from global timestamp per topic with non-matching starting offset *** FAILED ***
  java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
  ... (same creation site and stack trace as the first stopped-SparkContext failure above)
- bad source options
- unsupported kafka configs
- get offsets from case insensitive parameters
[INFO] 
[INFO] ---------< org.apache.spark:spark-streaming-kinesis-asl_2.12 >----------
[INFO] Building Spark Kinesis Integration 3.3.0-SNAPSHOT                [27/31]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-versions) @ spark-streaming-kinesis-asl_2.12 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-no-duplicate-dependencies) @ spark-streaming-kinesis-asl_2.12 ---
[INFO] 
[INFO] --- mvn-scalafmt_2.12:1.0.4:format (default) @ spark-streaming-kinesis-asl_2.12 ---
[WARNING] format.skipSources set, ignoring main directories
[WARNING] format.skipTestSources set, ignoring validateOnly directories
[WARNING] No sources specified, skipping formatting
[INFO] 
[INFO] --- scala-maven-plugin:4.3.0:add-source (eclipse-add-source) @ spark-streaming-kinesis-asl_2.12 ---
[INFO] Add Source directory: /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kinesis-asl/src/main/scala
[INFO] Add Test Source directory: /home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/external/kinesis-asl/src/test/scala
[INFO] 
[INFO] --- maven-dependency-plugin:3.1.1:build-classpath (default-cli) @ spark-streaming-kinesis-asl_2.12 ---
[INFO] Dependencies classpath:
/home/jenkins/.m2/repository/com/clearspring/analytics/stream/2.9.6/stream-2.9.6.jar:/home/jenkins/.m2/repository/com/google/crypto/tink/tink/1.6.0/tink-1.6.0.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/external/jakarta.inject/2.6.1/jakarta.inject-2.6.1.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/hk2-locator/2.6.1/hk2-locator-2.6.1.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-log4j12/1.7.30/slf4j-log4j12-1.7.30.jar:/home/jenkins/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/hk2-utils/2.6.1/hk2-utils-2.6.1.jar:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper-jute/3.6.2/zookeeper-jute-3.6.2.jar:/home/jenkins/.m2/repository/jakarta/servlet/jakarta.servlet-api/4.0.3/jakarta.servlet-api-4.0.3.jar:/home/jenkins/.m2/repository/com/thoughtworks/paranamer/paranamer/2.8/paranamer-2.8.jar:/home/jenkins/.m2/repository/org/apache/httpcomponents/httpcore/4.4.14/httpcore-4.4.14.jar:/home/jenkins/.m2/repository/org/json4s/json4s-jackson_2.12/3.7.0-M11/json4s-jackson_2.12-3.7.0-M11.jar:/home/jenkins/.m2/repository/com/google/code/findbugs/jsr305/3.0.0/jsr305-3.0.0.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-lang3/3.12.0/commons-lang3-3.12.0.jar:/home/jenkins/.m2/repository/org/objenesis/objenesis/2.6/objenesis-2.6.jar:/home/jenkins/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-text/1.6/commons-text-1.6.jar:/home/jenkins/.m2/repository/jakarta/ws/rs/jakarta.ws.rs-api/2.1.6/jakarta.ws.rs-api-2.1.6.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/external/aopalliance-repackaged/2.6.1/aopalliance-repackaged-2.6.1.jar:/home/jenkins/.m2/repository/com/amazonaws/aws-java-sdk-core/1.11.655/aws-java-sdk-core-1.11.655.jar:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.6.2/zookeeper-3.6.2.jar:/home/jenkins/.m2/repository/org/apache/htrace/htrace-core4/4.1.0-incubating/htrace-core4-4.1.0-incubating.jar:/home/jenkins/.m2/repository/commons-codec/commons-codec/1.15/commons-codec-1.15.jar:/home/jenkins/.m2/repository/com/amazonaws/aws-java-sdk-sts/1.11.655/aws-java-sdk-sts-1.11.655.jar:/home/jenkins/.m2/repository/com/amazonaws/aws-java-sdk-cloudwatch/1.11.655/aws-java-sdk-cloudwatch-1.11.655.jar:/home/jenkins/.m2/repository/joda-time/joda-time/2.10.12/joda-time-2.10.12.jar:/home/jenkins/.m2/repository/com/esotericsoftware/kryo-shaded/4.0.2/kryo-shaded-4.0.2.jar:/home/jenkins/.m2/repository/com/esotericsoftware/minlog/1.3.0/minlog-1.3.0.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-client-api/3.3.1/hadoop-client-api-3.3.1.jar:/home/jenkins/.m2/repository/org/slf4j/jul-to-slf4j/1.7.30/jul-to-slf4j-1.7.30.jar:/home/jenkins/.m2/repository/com/amazonaws/aws-java-sdk-kms/1.11.655/aws-java-sdk-kms-1.11.655.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/core/jersey-server/2.34/jersey-server-2.34.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-graphite/4.2.2/metrics-graphite-4.2.2.jar:/home/jenkins/.m2/repository/org/roaringbitmap/RoaringBitmap/0.9.22/RoaringBitmap-0.9.22.jar:/home/jenkins/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/home/jenkins/.m2/repository/org/apache/httpcomponents/httpclient/4.5.13/httpclient-4.5.13.jar:/home/jenkins/.m2/repository/com/google/code/gson/gson/2.8.6/gson-2.8.6.jar:/home/jenkins/.m2/repository/org/json4s/json4s-core_2.12/3.7.0-M11/json4s-core_2.12-3.7.0-M11.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/c
ontainers/jersey-container-servlet-core/2.34/jersey-container-servlet-core-2.34.jar:/home/jenkins/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/home/jenkins/.m2/repository/commons-io/commons-io/2.11.0/commons-io-2.11.0.jar:/home/jenkins/.m2/repository/com/amazonaws/aws-java-sdk-s3/1.11.655/aws-java-sdk-s3-1.11.655.jar:/home/jenkins/.m2/repository/com/amazonaws/aws-java-sdk-dynamodb/1.11.655/aws-java-sdk-dynamodb-1.11.655.jar:/home/jenkins/.m2/repository/javax/activation/activation/1.1.1/activation-1.1.1.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/common/kvstore/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/scala-lang/scala-library/2.12.15/scala-library-2.12.15.jar:/home/jenkins/.m2/repository/org/apache/xbean/xbean-asm9-shaded/4.20/xbean-asm9-shaded-4.20.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-math3/3.4.1/commons-math3-3.4.1.jar:/home/jenkins/.m2/repository/jakarta/validation/jakarta.validation-api/2.0.2/jakarta.validation-api-2.0.2.jar:/home/jenkins/.m2/repository/org/tukaani/xz/1.8/xz-1.8.jar:/home/jenkins/.m2/repository/com/amazonaws/amazon-kinesis-client/1.12.0/amazon-kinesis-client-1.12.0.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/common/network-common/target/scala-2.12/classes:/home/jenkins/.m2/repository/com/twitter/chill_2.12/0.10.0/chill_2.12-0.10.0.jar:/home/jenkins/.m2/repository/org/scala-lang/scala-reflect/2.12.15/scala-reflect-2.12.15.jar:/home/jenkins/.m2/repository/oro/oro/2.0.8/oro-2.0.8.jar:/home/jenkins/.m2/repository/org/apache/ivy/ivy/2.5.0/ivy-2.5.0.jar:/home/jenkins/.m2/repository/org/json4s/json4s-scalap_2.12/3.7.0-M11/json4s-scalap_2.12-3.7.0-M11.jar:/home/jenkins/.m2/repository/org/roaringbitmap/shims/0.9.22/shims-0.9.22.jar:/home/jenkins/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-framework/2.13.0/curator-framework-2.13.0.jar:/home/jenkins/.m2/repository/io/netty/netty-all/4.1.68.Final/netty-all-4.1.68.Final.jar:/home/jenkins/.m2/repository/org/apache/avro/avro/1.11.0/avro-1.11.0.jar:/home/jenkins/.m2/repository/org/lz4/lz4-java/1.8.0/lz4-java-1.8.0.jar:/home/jenkins/.m2/repository/org/xerial/snappy/snappy-java/1.1.8.4/snappy-java-1.1.8.4.jar:/home/jenkins/.m2/repository/com/amazonaws/aws-java-sdk-kinesis/1.11.655/aws-java-sdk-kinesis-1.11.655.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/hk2-api/2.6.1/hk2-api-2.6.1.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/core/jersey-client/2.34/jersey-client-2.34.jar:/home/jenkins/.m2/repository/jakarta/annotation/jakarta.annotation-api/1.3.5/jakarta.annotation-api-1.3.5.jar:/home/jenkins/.m2/repository/org/apache/avro/avro-mapred/1.11.0/avro-mapred-1.11.0.jar:/home/jenkins/.m2/repository/org/glassfish/hk2/osgi-resource-locator/1.0.3/osgi-resource-locator-1.0.3.jar:/home/jenkins/.m2/repository/org/apache/yetus/audience-annotations/0.5.0/audience-annotations-0.5.0.jar:/home/jenkins/.m2/repository/com/twitter/chill-java/0.10.0/chill-java-0.10.0.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-json/4.2.2/metrics-json-4.2.2.jar:/home/jenkins/.m2/repository/org/scala-lang/modules/scala-xml_2.12/1.2.0/scala-xml_2.12-1.2.0.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.13.0/jackson-core-2.13.0.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-recipes/2.13.0/curator-recipes-2.13.0.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-api/1.7.30/slf4j-api-1.7.30.jar:/home/jenkins/.m2/repository/com/ning/compress-lzf/1.0.
3/compress-lzf-1.0.3.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-core/4.2.2/metrics-core-4.2.2.jar:/home/jenkins/.m2/repository/com/github/luben/zstd-jni/1.5.0-4/zstd-jni-1.5.0-4.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-jvm/4.2.2/metrics-jvm-4.2.2.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-client/2.13.0/curator-client-2.13.0.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.13.0/jackson-databind-2.13.0.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-crypto/1.1.0/commons-crypto-1.1.0.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/launcher/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/javassist/javassist/3.25.0-GA/javassist-3.25.0-GA.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/core/jersey-common/2.34/jersey-common-2.34.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.13.0/jackson-annotations-2.13.0.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-compress/1.21/commons-compress-1.21.jar:/home/jenkins/.m2/repository/org/apache/avro/avro-ipc/1.11.0/avro-ipc-1.11.0.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/dataformat/jackson-dataformat-cbor/2.13.0/jackson-dataformat-cbor-2.13.0.jar:/home/jenkins/.m2/repository/com/fasterxml/jackson/module/jackson-module-scala_2.12/2.13.0/jackson-module-scala_2.12-2.13.0.jar:/home/jenkins/.m2/repository/net/razorvine/pickle/1.2/pickle-1.2.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/common/tags/target/scala-2.12/classes:/home/jenkins/.m2/repository/org/glassfish/jersey/containers/jersey-container-servlet/2.34/jersey-container-servlet-2.34.jar:/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-client-runtime/3.3.1/hadoop-client-runtime-3.3.1.jar:/home/jenkins/.m2/repository/org/slf4j/jcl-over-slf4j/1.7.30/jcl-over-slf4j-1.7.30.jar:/home/jenkins/.m2/repository/io/dropwizard/metrics/metrics-jmx/4.2.2/metrics-jmx-4.2.2.jar:/home/jenkins/.m2/repository/org/glassfish/jersey/inject/jersey-hk2/2.34/jersey-hk2-2.34.jar:/home/jenkins/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/streaming/target/scala-2.12/classes:/home/jenkins/.m2/repository/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/common/network-shuffle/target/scala-2.12/classes:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/common/unsafe/target/scala-2.12/classes:/home/jenkins/.m2/repository/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar:/home/jenkins/.m2/repository/net/sf/py4j/py4j/0.10.9.2/py4j-0.10.9.2.jar:/home/jenkins/.m2/repository/org/json4s/json4s-ast_2.12/3.7.0-M11/json4s-ast_2.12-3.7.0-M11.jar:/home/jenkins/.m2/repository/com/amazonaws/jmespath-java/1.11.655/jmespath-java-1.11.655.jar:/home/jenkins/.m2/repository/software/amazon/ion/ion-java/1.0.2/ion-java-1.0.2.jar:/home/jenkins/workspace/spark-master-test-maven-hadoop-3.2/core/target/scala-2.12/classes
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) @ spark-streaming-kinesis-asl_2.12 ---
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ spark-streaming-kinesis-asl_2.12 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.8.1:compile (default-compile) @ spark-streaming-kinesis-asl_2.12 ---
[INFO] Not compiling main sources
[INFO] 
[INFO] --- scala-maven-plugin:4.3.0:compile (scala-compile-first) @ spark-streaming-kinesis-asl_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file: /home/jenkins/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.15__52.0-1.3.1_20191012T045515.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.12.15,1.7.6,null)
+ retcode2=143
+ [[ 0 -ne 0 ]]
+ [[ 143 -ne 0 ]]
+ [[ 0 -ne 0 ]]
+ [[ 143 -ne 0 ]]
+ echo 'Testing Spark with Maven failed'
Testing Spark with Maven failed
+ exit 1
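For the record: retcode2=143 is 128 + 15, i.e. the Maven JVM died on SIGTERM from the timeout rather than failing on its own; the wrapper script then prints the failure message and exits 1, and Jenkins records the run as ABORTED below.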
Recording test results
[Checks API] No suitable checks publisher found.
Finished: ABORTED