Failed
Console Output

Skipping 30,357 KB..
20:05:43.795 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:05:43.795 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:05:43.830 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:05:43.830 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:05:43.830 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:05:43.873 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:05:43.873 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:05:43.873 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:05:43.883 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - 1L + 1 (712 milliseconds)
[info] - 1L + 1.0 (720 milliseconds)
[info] - 1L + 1L (707 milliseconds)
[info] - 1L + 1S (673 milliseconds)
[info] - 1L + 1Y (690 milliseconds)
[info] - 1L + '1' (733 milliseconds)
[info] - 1S + 1 (662 milliseconds)
[info] - 1S + 1.0 (678 milliseconds)
[info] - 1S + 1L (678 milliseconds)
[info] - 1S + 1S (677 milliseconds)
[info] - 1S + 1Y (682 milliseconds)
[info] - 1S + '1' (664 milliseconds)
[info] - 1Y + 1 (684 milliseconds)
[info] - 1Y + 1.0 (681 milliseconds)
[info] - 1Y + 1L (718 milliseconds)
[info] - 1Y + 1S (758 milliseconds)
[info] - 1Y + 1Y (745 milliseconds)
[info] - 1Y + '1' (736 milliseconds)
[info] - '1' + 1 (737 milliseconds)
[info] - '1' + 1.0 (724 milliseconds)
[info] - '1' + 1L (772 milliseconds)
[info] - '1' + 1S (714 milliseconds)
20:05:59.352 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:05:59.352 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:05:59.394 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:05:59.395 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:05:59.395 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:05:59.442 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:05:59.442 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:05:59.442 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:05:59.451 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - '1' + 1Y (697 milliseconds)
20:06:00.058 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:00.059 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:00.103 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:00.103 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:00.103 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:00.153 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:00.153 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:00.153 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:00.169 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - '1' + '1' (729 milliseconds)
20:06:00.773 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:00.773 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:00.809 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:00.809 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:00.809 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:00.857 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:00.857 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:00.857 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:00.868 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - case when then 1 else null end  (717 milliseconds)
20:06:01.499 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:01.499 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:01.545 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:01.545 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:01.545 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:01.592 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:01.592 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:01.593 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:01.602 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - case when then null else 1 end  (753 milliseconds)
20:06:02.257 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:02.257 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:02.304 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:02.304 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:02.304 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:02.350 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:02.351 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:02.351 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:02.360 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - case when then 1.0 else null end  (713 milliseconds)
20:06:02.961 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:02.961 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:02.997 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:02.997 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:02.997 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:03.066 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:03.066 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:03.067 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:03.076 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - case when then null else 1.0 end  (701 milliseconds)
20:06:03.666 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:03.666 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:03.703 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:03.704 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:03.704 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:03.749 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:03.749 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:03.749 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:03.760 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - case when then 1L else null end  (693 milliseconds)
20:06:04.352 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:04.352 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:04.391 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:04.391 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:04.391 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:04.437 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:04.437 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:04.437 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:04.446 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - case when then null else 1L end  (693 milliseconds)
20:06:05.051 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:05.052 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:05.100 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:05.100 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:05.101 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:05.153 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:05.153 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:05.154 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:05.163 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - case when then 1S else null end  (705 milliseconds)
20:06:05.754 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:05.754 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:05.791 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:05.791 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:05.791 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:05.839 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:05.839 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:05.839 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:05.848 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - case when then null else 1S end  (672 milliseconds)
20:06:06.431 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:06.431 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:06.470 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:06.470 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:06.470 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:06.518 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:06.518 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:06.518 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:06.527 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - case when then 1Y else null end  (681 milliseconds)
20:06:07.106 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:07.106 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:07.145 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:07.145 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:07.145 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:07.190 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:07.190 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:07.190 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:07.203 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - case when then null else 1Y end  (670 milliseconds)
[info] - [SPARK-2210] boolean cast on boolean value should be removed (33 milliseconds)
20:06:07.813 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:07.813 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:07.855 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:07.855 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:07.855 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:07.903 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:07.903 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:07.903 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
[info] HiveParquetSuite:
[info] - Case insensitive attribute names (943 milliseconds)
[info] - SELECT on Parquet table (683 milliseconds)
[info] - Simple column projection + filter on Parquet table (859 milliseconds)
20:06:10.414 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
[info] - Converting Hive to Parquet Table via saveAsParquetFile (1 second, 602 milliseconds)
[info] - INSERT OVERWRITE TABLE Parquet table (2 seconds, 121 milliseconds)
[info] - SPARK-25206: wrong records are returned by filter pushdown when Hive metastore schema and parquet schema are in different letter cases (707 milliseconds)
20:06:14.898 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/targettable specified for non-external table:targettable
[info] - SPARK-25271: write empty map into hive parquet table (866 milliseconds)
20:06:15.761 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:15.761 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:15.797 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:15.797 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:15.797 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:15.842 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:15.842 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:15.842 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
[info] ScriptTransformationSuite:
[info] - cat without SerDe (223 milliseconds)
[info] - cat with LazySimpleSerDe (164 milliseconds)
20:06:17.391 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread: Thread-ScriptTransformation-Feed exit cause by: 
java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator.$anonfun$doExecute$1(ScriptTransformationSuite.scala:336)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.$anonfun$run$1(ScriptTransformationExec.scala:290)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1934)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformationExec.scala:278)
20:06:17.393 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 15931.0 (TID 29184)
java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator.$anonfun$doExecute$1(ScriptTransformationSuite.scala:336)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.$anonfun$run$1(ScriptTransformationExec.scala:290)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1934)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformationExec.scala:278)
20:06:17.398 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 15931.0 (TID 29184, 192.168.122.1, executor driver): java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator.$anonfun$doExecute$1(ScriptTransformationSuite.scala:336)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.$anonfun$run$1(ScriptTransformationExec.scala:290)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1934)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformationExec.scala:278)

20:06:17.398 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 15931.0 failed 1 times; aborting job
20:06:17.401 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread: 
[info] - script transformation should not swallow errors from upstream operators (no serde) (1 second, 171 milliseconds)
20:06:18.548 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread: Thread-ScriptTransformation-Feed exit cause by: 
java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator.$anonfun$doExecute$1(ScriptTransformationSuite.scala:336)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.$anonfun$run$1(ScriptTransformationExec.scala:313)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1934)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformationExec.scala:278)
20:06:18.551 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 15933.0 (TID 29186)
java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator.$anonfun$doExecute$1(ScriptTransformationSuite.scala:336)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.$anonfun$run$1(ScriptTransformationExec.scala:313)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1934)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformationExec.scala:278)
20:06:18.553 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 15933.0 (TID 29186, 192.168.122.1, executor driver): java.lang.IllegalArgumentException: intentional exception
	at org.apache.spark.sql.hive.execution.ExceptionInjectingOperator.$anonfun$doExecute$1(ScriptTransformationSuite.scala:336)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
	at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
	at scala.collection.Iterator.foreach(Iterator.scala:941)
	at scala.collection.Iterator.foreach$(Iterator.scala:941)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.$anonfun$run$1(ScriptTransformationExec.scala:313)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1934)
	at org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread.run(ScriptTransformationExec.scala:278)

20:06:18.553 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 15933.0 failed 1 times; aborting job
[info] - script transformation should not swallow errors from upstream operators (with serde) (1 second, 145 milliseconds)
20:06:18.558 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread: 
20:06:18.695 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread: /bin/bash: some_non_existent_command: command not found

20:06:18.698 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationExec: /bin/bash: some_non_existent_command: command not found

20:06:18.698 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 15934.0 (TID 29187)
org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.checkFailureAndPropagate(ScriptTransformationExec.scala:155)
	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.hasNext(ScriptTransformationExec.scala:198)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
	at org.apache.spark.sql.execution.SparkPlan.$anonfun$getByteArrayRdd$1(SparkPlan.scala:345)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:872)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:872)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:127)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:462)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:465)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.checkFailureAndPropagate(ScriptTransformationExec.scala:155)
	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.hasNext(ScriptTransformationExec.scala:187)
	... 15 more
20:06:18.700 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 15934.0 (TID 29187, 192.168.122.1, executor driver): org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.checkFailureAndPropagate(ScriptTransformationExec.scala:155)
	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.hasNext(ScriptTransformationExec.scala:198)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
	at org.apache.spark.sql.execution.SparkPlan.$anonfun$getByteArrayRdd$1(SparkPlan.scala:345)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:872)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:872)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:127)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:462)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:465)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.checkFailureAndPropagate(ScriptTransformationExec.scala:155)
	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.hasNext(ScriptTransformationExec.scala:187)
	... 15 more

20:06:18.701 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 15934.0 failed 1 times; aborting job
[info] - SPARK-14400 script transformation should fail for bad script command (146 milliseconds)
[info] - SPARK-24339 verify the result after pruning the unused columns (198 milliseconds)
[info] - SPARK-25990: TRANSFORM should handle different data types correctly (359 milliseconds)
20:06:19.389 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationExec: /bin/bash: some_non_existent_command: command not found

20:06:19.389 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread: /bin/bash: some_non_existent_command: command not found

20:06:19.389 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationExec: /bin/bash: some_non_existent_command: command not found

20:06:19.392 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 15939.0 (TID 29192)
org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.checkFailureAndPropagate(ScriptTransformationExec.scala:155)
	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.hasNext(ScriptTransformationExec.scala:198)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
	at org.apache.spark.sql.execution.SparkPlan.$anonfun$getByteArrayRdd$1(SparkPlan.scala:345)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:872)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:872)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:127)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:462)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:465)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.checkFailureAndPropagate(ScriptTransformationExec.scala:155)
	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.hasNext(ScriptTransformationExec.scala:166)
	... 15 more
20:06:19.394 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 15939.0 (TID 29192, 192.168.122.1, executor driver): org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.checkFailureAndPropagate(ScriptTransformationExec.scala:155)
	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.hasNext(ScriptTransformationExec.scala:198)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
	at org.apache.spark.sql.execution.SparkPlan.$anonfun$getByteArrayRdd$1(SparkPlan.scala:345)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:872)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:872)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:127)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:462)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:465)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.checkFailureAndPropagate(ScriptTransformationExec.scala:155)
	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.hasNext(ScriptTransformationExec.scala:166)
	... 15 more

20:06:19.394 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 15939.0 failed 1 times; aborting job
[info] - SPARK-30973: TRANSFORM should wait for the termination of the script (no serde) (138 milliseconds)
20:06:19.526 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationExec: /bin/bash: some_non_existent_command: command not found

20:06:19.526 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationWriterThread: /bin/bash: some_non_existent_command: command not found

20:06:19.526 ERROR org.apache.spark.sql.hive.execution.ScriptTransformationExec: /bin/bash: some_non_existent_command: command not found

20:06:19.529 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 15940.0 (TID 29193)
org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.checkFailureAndPropagate(ScriptTransformationExec.scala:155)
	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.hasNext(ScriptTransformationExec.scala:198)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
	at org.apache.spark.sql.execution.SparkPlan.$anonfun$getByteArrayRdd$1(SparkPlan.scala:345)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:872)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:872)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:127)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:462)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:465)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.checkFailureAndPropagate(ScriptTransformationExec.scala:155)
	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.hasNext(ScriptTransformationExec.scala:187)
	... 15 more
20:06:19.531 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 15940.0 (TID 29193, 192.168.122.1, executor driver): org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.checkFailureAndPropagate(ScriptTransformationExec.scala:155)
	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.hasNext(ScriptTransformationExec.scala:198)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:458)
	at org.apache.spark.sql.execution.SparkPlan.$anonfun$getByteArrayRdd$1(SparkPlan.scala:345)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2(RDD.scala:872)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsInternal$2$adapted(RDD.scala:872)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:349)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:313)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
	at org.apache.spark.scheduler.Task.run(Task.scala:127)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:462)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:465)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Subprocess exited with status 127. Error: /bin/bash: some_non_existent_command: command not found

	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.checkFailureAndPropagate(ScriptTransformationExec.scala:155)
	at org.apache.spark.sql.hive.execution.ScriptTransformationExec$$anon$1.hasNext(ScriptTransformationExec.scala:187)
	... 15 more

20:06:19.531 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 15940.0 failed 1 times; aborting job
[info] - SPARK-30973: TRANSFORM should wait for the termination of the script (with serde) (132 milliseconds)
[info] - SPARK-32608: Script Transform ROW FORMAT DELIMIT value should format value (395 milliseconds)
20:06:19.996 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:19.996 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:19.996 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:20.044 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:20.044 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:20.045 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
[info] ListTablesSuite:
20:06:20.066 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/hivelisttablessuitetable specified for non-external table:hivelisttablessuitetable
20:06:20.237 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database listtablessuitedb, returning NoSuchObjectException
20:06:20.397 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/listtablessuitedb.db/hiveindblisttablessuitetable specified for non-external table:hiveindblisttablessuitetable
[info] - get all tables of current database (493 milliseconds)
[info] - getting all tables with a database name (290 milliseconds)
20:06:21.652 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:21.652 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:21.653 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:21.698 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:21.699 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:21.699 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
[info] TestHiveSuite:
20:06:21.715 WARN org.apache.hadoop.hive.metastore.HiveMetaStore: Location: file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src specified for non-external table:src
20:06:22.233 WARN org.apache.hadoop.hive.common.FileUtils: File file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src does not exist; Force to delete it.
20:06:22.233 ERROR org.apache.hadoop.hive.common.FileUtils: Failed to delete file:/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/target/tmp/warehouse-46202497-12ff-43b0-bcda-5975db0e1a96/src
20:06:22.273 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:22.273 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:22.273 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:22.315 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:22.315 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:22.316 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:22.388 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:22.389 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:22.389 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:22.439 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:22.439 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:22.439 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
[info] - load test table based on case sensitivity (736 milliseconds)
[info] - SPARK-15887: hive-site.xml should be loaded (1 millisecond)
20:06:22.515 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:22.515 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:22.515 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20:06:22.577 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
20:06:22.577 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20:06:22.577 WARN org.apache.hadoop.hive.conf.HiveConf: HiveConf of name hive.stats.retries.wait does not exist
[info] Test run started
[info] Test org.apache.spark.sql.hive.JavaDataFrameSuite.testUDAF started
[info] Test org.apache.spark.sql.hive.JavaDataFrameSuite.saveTableAndQueryIt started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 1.947s
[info] Test run started
[info] Test org.apache.spark.sql.hive.JavaMetastoreDataSourcesSuite.saveTableAndQueryIt started
20:06:25.057 WARN org.apache.spark.sql.hive.test.TestHiveExternalCatalog: Couldn't find corresponding Hive SerDe for data source provider org.apache.spark.sql.json. Persisting data source table `default`.`javasavedtable` into Hive metastore in Spark SQL specific format, which is NOT compatible with Hive.
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.61s
[info] ScalaTest
[info] Run completed in 2 hours, 1 minute, 15 seconds.
[info] Total number of tests run: 3687
[info] Suites: completed 132, aborted 0
[info] Tests: succeeded 3687, failed 0, canceled 0, ignored 598, pending 0
[info] All tests passed.
[info] Passed: Total 3690, Failed 0, Errors 0, Passed 3690, Ignored 598
[success] Total time: 7343 s, completed May 16, 2021 8:06:57 PM

========================================================================
Running PySpark tests
========================================================================
Running PySpark tests. Output is in /home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/python/unit-tests.log
Will test against the following Python executables: ['python3.6', 'python2.7', 'pypy3']
Will test the following Python modules: ['pyspark-core', 'pyspark-ml', 'pyspark-mllib', 'pyspark-sql', 'pyspark-streaming']
python3.6 python_implementation is CPython
python3.6 version is: Python 3.6.8 :: Anaconda, Inc.
python2.7 python_implementation is CPython
python2.7 version is: Python 2.7.16 :: Anaconda, Inc.
pypy3 python_implementation is PyPy
pypy3 version is: Python 3.6.9 (7.3.1+dfsg-4, Apr 22 2020, 05:15:29)
[PyPy 7.3.1 with GCC 9.3.0]
Starting test(pypy3): pyspark.sql.tests.test_column
Starting test(pypy3): pyspark.sql.tests.test_catalog
Starting test(pypy3): pyspark.sql.tests.test_conf
Starting test(pypy3): pyspark.sql.tests.test_arrow
Starting test(pypy3): pyspark.sql.tests.test_dataframe
Starting test(pypy3): pyspark.sql.tests.test_functions
Starting test(pypy3): pyspark.sql.tests.test_datasources
Starting test(pypy3): pyspark.sql.tests.test_context
Finished test(pypy3): pyspark.sql.tests.test_arrow (0s) ... 59 tests were skipped
Starting test(pypy3): pyspark.sql.tests.test_group
Finished test(pypy3): pyspark.sql.tests.test_conf (13s)
Starting test(pypy3): pyspark.sql.tests.test_pandas_cogrouped_map
Finished test(pypy3): pyspark.sql.tests.test_pandas_cogrouped_map (0s) ... 16 tests were skipped
Starting test(pypy3): pyspark.sql.tests.test_pandas_grouped_map
Finished test(pypy3): pyspark.sql.tests.test_pandas_grouped_map (0s) ... 21 tests were skipped
Starting test(pypy3): pyspark.sql.tests.test_pandas_map
Finished test(pypy3): pyspark.sql.tests.test_pandas_map (1s) ... 7 tests were skipped
Starting test(pypy3): pyspark.sql.tests.test_pandas_udf
Finished test(pypy3): pyspark.sql.tests.test_pandas_udf (1s) ... 6 tests were skipped
Starting test(pypy3): pyspark.sql.tests.test_pandas_udf_grouped_agg
Finished test(pypy3): pyspark.sql.tests.test_pandas_udf_grouped_agg (1s) ... 16 tests were skipped
Starting test(pypy3): pyspark.sql.tests.test_pandas_udf_scalar
Finished test(pypy3): pyspark.sql.tests.test_pandas_udf_scalar (1s) ... 49 tests were skipped
Starting test(pypy3): pyspark.sql.tests.test_pandas_udf_typehints
Finished test(pypy3): pyspark.sql.tests.test_pandas_udf_typehints (1s) ... 10 tests were skipped
Starting test(pypy3): pyspark.sql.tests.test_pandas_udf_window
Finished test(pypy3): pyspark.sql.tests.test_pandas_udf_window (1s) ... 14 tests were skipped
Starting test(pypy3): pyspark.sql.tests.test_readwriter
Finished test(pypy3): pyspark.sql.tests.test_catalog (23s)
Starting test(pypy3): pyspark.sql.tests.test_serde
Finished test(pypy3): pyspark.sql.tests.test_group (23s)
Starting test(pypy3): pyspark.sql.tests.test_session
Finished test(pypy3): pyspark.sql.tests.test_column (26s)
Starting test(pypy3): pyspark.sql.tests.test_streaming
Finished test(pypy3): pyspark.sql.tests.test_context (28s)
Starting test(pypy3): pyspark.sql.tests.test_types
Finished test(pypy3): pyspark.sql.tests.test_datasources (28s)
Starting test(pypy3): pyspark.sql.tests.test_udf
Finished test(pypy3): pyspark.sql.tests.test_functions (47s)
Starting test(pypy3): pyspark.sql.tests.test_utils
Finished test(pypy3): pyspark.sql.tests.test_dataframe (50s) ... 10 tests were skipped
Starting test(pypy3): pyspark.streaming.tests.test_context
Finished test(pypy3): pyspark.sql.tests.test_serde (32s)
Starting test(pypy3): pyspark.streaming.tests.test_dstream
Finished test(pypy3): pyspark.streaming.tests.test_dstream (1s) ... 21 tests were skipped
Starting test(pypy3): pyspark.streaming.tests.test_kinesis
Finished test(pypy3): pyspark.streaming.tests.test_kinesis (0s) ... 2 tests were skipped
Starting test(pypy3): pyspark.streaming.tests.test_listener
Finished test(pypy3): pyspark.sql.tests.test_readwriter (37s)
Starting test(pypy3): pyspark.tests.test_appsubmit
Finished test(pypy3): pyspark.sql.tests.test_session (37s)
Starting test(pypy3): pyspark.tests.test_broadcast
Finished test(pypy3): pyspark.sql.tests.test_utils (15s)
Starting test(pypy3): pyspark.tests.test_conf
Finished test(pypy3): pyspark.sql.tests.test_streaming (43s)
Starting test(pypy3): pyspark.tests.test_context
Finished test(pypy3): pyspark.streaming.tests.test_listener (12s)
Starting test(pypy3): pyspark.tests.test_daemon
Finished test(pypy3): pyspark.tests.test_daemon (6s)
Starting test(pypy3): pyspark.tests.test_join
Finished test(pypy3): pyspark.streaming.tests.test_context (30s)
Starting test(pypy3): pyspark.tests.test_profiler
Finished test(pypy3): pyspark.sql.tests.test_types (55s)
Starting test(pypy3): pyspark.tests.test_rdd
Finished test(pypy3): pyspark.tests.test_conf (24s)
Starting test(pypy3): pyspark.tests.test_rddbarrier
Finished test(pypy3): pyspark.sql.tests.test_udf (63s)
Starting test(pypy3): pyspark.tests.test_readwrite
Finished test(pypy3): pyspark.tests.test_join (14s)
Starting test(pypy3): pyspark.tests.test_serializers
Finished test(pypy3): pyspark.tests.test_serializers (0s) ... 2 tests were skipped
Starting test(pypy3): pyspark.tests.test_shuffle
Finished test(pypy3): pyspark.tests.test_profiler (15s)
Starting test(pypy3): pyspark.tests.test_taskcontext
Finished test(pypy3): pyspark.tests.test_rddbarrier (11s)
Starting test(pypy3): pyspark.tests.test_util
Finished test(pypy3): pyspark.tests.test_util (10s)
Starting test(pypy3): pyspark.tests.test_worker
Finished test(pypy3): pyspark.tests.test_broadcast (46s)
Starting test(python2.7): pyspark.ml.tests.test_algorithms
Finished test(pypy3): pyspark.tests.test_shuffle (19s)
Starting test(python2.7): pyspark.ml.tests.test_base
Finished test(pypy3): pyspark.tests.test_readwrite (27s) ... 3 tests were skipped
Starting test(python2.7): pyspark.ml.tests.test_evaluation
Finished test(python2.7): pyspark.ml.tests.test_base (18s)
Starting test(python2.7): pyspark.ml.tests.test_feature
Finished test(pypy3): pyspark.tests.test_worker (25s)
Starting test(python2.7): pyspark.ml.tests.test_image
Finished test(pypy3): pyspark.tests.test_context (67s)
Starting test(python2.7): pyspark.ml.tests.test_linalg
Finished test(python2.7): pyspark.ml.tests.test_evaluation (19s)
Starting test(python2.7): pyspark.ml.tests.test_param
Finished test(python2.7): pyspark.ml.tests.test_image (19s)
Starting test(python2.7): pyspark.ml.tests.test_persistence
Finished test(python2.7): pyspark.ml.tests.test_param (25s)
Starting test(python2.7): pyspark.ml.tests.test_pipeline
Finished test(python2.7): pyspark.ml.tests.test_feature (33s)
Starting test(python2.7): pyspark.ml.tests.test_stat
Finished test(pypy3): pyspark.tests.test_taskcontext (76s)
Starting test(python2.7): pyspark.ml.tests.test_training_summary
Finished test(python2.7): pyspark.ml.tests.test_pipeline (6s)
Starting test(python2.7): pyspark.ml.tests.test_tuning
Finished test(python2.7): pyspark.ml.tests.test_linalg (36s)
Starting test(python2.7): pyspark.ml.tests.test_util
Finished test(python2.7): pyspark.ml.tests.test_stat (18s)
Starting test(python2.7): pyspark.ml.tests.test_wrapper
Finished test(pypy3): pyspark.tests.test_appsubmit (124s)
Starting test(python2.7): pyspark.mllib.tests.test_algorithms
Finished test(python2.7): pyspark.ml.tests.test_util (25s)
Starting test(python2.7): pyspark.mllib.tests.test_feature
Finished test(pypy3): pyspark.tests.test_rdd (119s)
Starting test(python2.7): pyspark.mllib.tests.test_linalg
Finished test(python2.7): pyspark.ml.tests.test_wrapper (22s)
Starting test(python2.7): pyspark.mllib.tests.test_stat
Finished test(python2.7): pyspark.ml.tests.test_training_summary (38s)
Starting test(python2.7): pyspark.mllib.tests.test_streaming_algorithms
Finished test(python2.7): pyspark.ml.tests.test_algorithms (102s)
Starting test(python2.7): pyspark.mllib.tests.test_util
Finished test(python2.7): pyspark.ml.tests.test_persistence (64s)
Starting test(python2.7): pyspark.sql.tests.test_arrow
Finished test(python2.7): pyspark.sql.tests.test_arrow (1s) ... 59 tests were skipped
Starting test(python2.7): pyspark.sql.tests.test_catalog
Finished test(python2.7): pyspark.mllib.tests.test_util (16s)
Starting test(python2.7): pyspark.sql.tests.test_column
Finished test(python2.7): pyspark.mllib.tests.test_feature (34s)
Starting test(python2.7): pyspark.sql.tests.test_conf
Finished test(python2.7): pyspark.mllib.tests.test_stat (31s)
Starting test(python2.7): pyspark.sql.tests.test_context
Finished test(python2.7): pyspark.sql.tests.test_catalog (24s)
Starting test(python2.7): pyspark.sql.tests.test_dataframe
Finished test(python2.7): pyspark.sql.tests.test_conf (13s)
Starting test(python2.7): pyspark.sql.tests.test_datasources
Finished test(python2.7): pyspark.sql.tests.test_column (23s)
Starting test(python2.7): pyspark.sql.tests.test_functions
Finished test(python2.7): pyspark.mllib.tests.test_algorithms (75s)
Starting test(python2.7): pyspark.sql.tests.test_group
Finished test(python2.7): pyspark.sql.tests.test_context (24s)
Starting test(python2.7): pyspark.sql.tests.test_pandas_cogrouped_map
Finished test(python2.7): pyspark.sql.tests.test_pandas_cogrouped_map (1s) ... 16 tests were skipped
Starting test(python2.7): pyspark.sql.tests.test_pandas_grouped_map
Finished test(python2.7): pyspark.sql.tests.test_pandas_grouped_map (1s) ... 21 tests were skipped
Starting test(python2.7): pyspark.sql.tests.test_pandas_map
Finished test(python2.7): pyspark.sql.tests.test_pandas_map (1s) ... 7 tests were skipped
Starting test(python2.7): pyspark.sql.tests.test_pandas_udf
Finished test(python2.7): pyspark.sql.tests.test_pandas_udf (1s) ... 6 tests were skipped
Starting test(python2.7): pyspark.sql.tests.test_pandas_udf_grouped_agg
Finished test(python2.7): pyspark.sql.tests.test_pandas_udf_grouped_agg (1s) ... 16 tests were skipped
Starting test(python2.7): pyspark.sql.tests.test_pandas_udf_scalar
Finished test(python2.7): pyspark.sql.tests.test_pandas_udf_scalar (1s) ... 49 tests were skipped
Starting test(python2.7): pyspark.sql.tests.test_pandas_udf_typehints
Finished test(python2.7): pyspark.sql.tests.test_datasources (24s)
Starting test(python2.7): pyspark.sql.tests.test_pandas_udf_window
Finished test(python2.7): pyspark.sql.tests.test_pandas_udf_typehints (1s) ... 10 tests were skipped
Starting test(python2.7): pyspark.sql.tests.test_readwriter
Finished test(python2.7): pyspark.sql.tests.test_pandas_udf_window (1s) ... 14 tests were skipped
Starting test(python2.7): pyspark.sql.tests.test_serde
Finished test(python2.7): pyspark.mllib.tests.test_linalg (71s)
Starting test(python2.7): pyspark.sql.tests.test_session
Finished test(python2.7): pyspark.sql.tests.test_group (23s)
Starting test(python2.7): pyspark.sql.tests.test_streaming
Finished test(python2.7): pyspark.sql.tests.test_functions (40s)
Starting test(python2.7): pyspark.sql.tests.test_types
Finished test(python2.7): pyspark.sql.tests.test_dataframe (54s) ... 3 tests were skipped
Starting test(python2.7): pyspark.sql.tests.test_udf
Finished test(python2.7): pyspark.sql.tests.test_serde (30s)
Starting test(python2.7): pyspark.sql.tests.test_utils
Finished test(python2.7): pyspark.sql.tests.test_readwriter (36s)
Starting test(python2.7): pyspark.streaming.tests.test_context
Finished test(python2.7): pyspark.sql.tests.test_session (35s)
Starting test(python2.7): pyspark.streaming.tests.test_dstream
Finished test(python2.7): pyspark.sql.tests.test_streaming (37s)
Starting test(python2.7): pyspark.streaming.tests.test_kinesis
Finished test(python2.7): pyspark.sql.tests.test_utils (16s)
Starting test(python2.7): pyspark.streaming.tests.test_listener
Finished test(python2.7): pyspark.streaming.tests.test_kinesis (0s) ... 2 tests were skipped
Starting test(python2.7): pyspark.tests.test_appsubmit
Finished test(python2.7): pyspark.streaming.tests.test_listener (12s)
Starting test(python2.7): pyspark.tests.test_broadcast
Finished test(python2.7): pyspark.streaming.tests.test_context (27s)
Starting test(python2.7): pyspark.tests.test_conf
Finished test(python2.7): pyspark.sql.tests.test_types (56s) ... 2 tests were skipped
Starting test(python2.7): pyspark.tests.test_context
Finished test(python2.7): pyspark.sql.tests.test_udf (55s)
Starting test(python2.7): pyspark.tests.test_daemon
Finished test(python2.7): pyspark.tests.test_conf (18s)
Starting test(python2.7): pyspark.tests.test_join
Finished test(python2.7): pyspark.tests.test_daemon (5s)
Starting test(python2.7): pyspark.tests.test_profiler
Finished test(python2.7): pyspark.mllib.tests.test_streaming_algorithms (150s)
Starting test(python2.7): pyspark.tests.test_rdd
Finished test(python2.7): pyspark.tests.test_join (12s)
Starting test(python2.7): pyspark.tests.test_rddbarrier
Finished test(python2.7): pyspark.tests.test_profiler (13s)
Starting test(python2.7): pyspark.tests.test_readwrite
Finished test(python2.7): pyspark.tests.test_broadcast (45s)
Starting test(python2.7): pyspark.tests.test_serializers
Finished test(python2.7): pyspark.tests.test_rddbarrier (11s)
Starting test(python2.7): pyspark.tests.test_shuffle
Finished test(python2.7): pyspark.tests.test_serializers (12s)
Starting test(python2.7): pyspark.tests.test_taskcontext
Finished test(python2.7): pyspark.tests.test_shuffle (15s)
Starting test(python2.7): pyspark.tests.test_util
Finished test(python2.7): pyspark.tests.test_readwrite (28s)
Starting test(python2.7): pyspark.tests.test_worker
Finished test(python2.7): pyspark.tests.test_util (7s)
Starting test(python3.6): pyspark.ml.tests.test_algorithms
Finished test(python2.7): pyspark.tests.test_context (60s)
Starting test(python3.6): pyspark.ml.tests.test_base
Finished test(python2.7): pyspark.tests.test_worker (19s)
Starting test(python3.6): pyspark.ml.tests.test_evaluation
Finished test(python3.6): pyspark.ml.tests.test_base (18s)
Starting test(python3.6): pyspark.ml.tests.test_feature
Finished test(python2.7): pyspark.tests.test_appsubmit (107s)
Starting test(python3.6): pyspark.ml.tests.test_image
Finished test(python2.7): pyspark.streaming.tests.test_dstream (121s)
Starting test(python3.6): pyspark.ml.tests.test_linalg
Finished test(python3.6): pyspark.ml.tests.test_evaluation (23s)
Starting test(python3.6): pyspark.ml.tests.test_param
Finished test(python2.7): pyspark.ml.tests.test_tuning (274s)
Starting test(python3.6): pyspark.ml.tests.test_persistence
Finished test(python3.6): pyspark.ml.tests.test_image (19s)
Starting test(python3.6): pyspark.ml.tests.test_pipeline
Finished test(python3.6): pyspark.ml.tests.test_pipeline (7s)
Starting test(python3.6): pyspark.ml.tests.test_stat
Finished test(python2.7): pyspark.tests.test_taskcontext (66s)
Starting test(python3.6): pyspark.ml.tests.test_training_summary
Finished test(python3.6): pyspark.ml.tests.test_feature (33s)
Starting test(python3.6): pyspark.ml.tests.test_tuning
Finished test(python2.7): pyspark.tests.test_rdd (103s)
Starting test(python3.6): pyspark.ml.tests.test_util
Finished test(python3.6): pyspark.ml.tests.test_param (23s)
Starting test(python3.6): pyspark.ml.tests.test_wrapper
Finished test(python3.6): pyspark.ml.tests.test_linalg (37s)
Starting test(python3.6): pyspark.mllib.tests.test_algorithms
Finished test(python3.6): pyspark.ml.tests.test_stat (21s)
Starting test(python3.6): pyspark.mllib.tests.test_feature
Finished test(python3.6): pyspark.ml.tests.test_wrapper (23s)
Starting test(python3.6): pyspark.mllib.tests.test_linalg
Finished test(python3.6): pyspark.ml.tests.test_util (28s)
Starting test(python3.6): pyspark.mllib.tests.test_stat
Finished test(python3.6): pyspark.ml.tests.test_training_summary (38s)
Starting test(python3.6): pyspark.mllib.tests.test_streaming_algorithms
Finished test(python3.6): pyspark.ml.tests.test_algorithms (103s)
Starting test(python3.6): pyspark.mllib.tests.test_util
Finished test(python3.6): pyspark.ml.tests.test_persistence (67s)
Starting test(python3.6): pyspark.sql.tests.test_arrow
Finished test(python3.6): pyspark.mllib.tests.test_feature (39s)
Starting test(python3.6): pyspark.sql.tests.test_catalog
Finished test(python3.6): pyspark.mllib.tests.test_util (16s)
Starting test(python3.6): pyspark.sql.tests.test_column
Finished test(python3.6): pyspark.mllib.tests.test_stat (34s)
Starting test(python3.6): pyspark.sql.tests.test_conf
Finished test(python3.6): pyspark.sql.tests.test_conf (14s)
Starting test(python3.6): pyspark.sql.tests.test_context
Finished test(python3.6): pyspark.sql.tests.test_catalog (24s)
Starting test(python3.6): pyspark.sql.tests.test_dataframe
Finished test(python3.6): pyspark.sql.tests.test_column (24s)
Starting test(python3.6): pyspark.sql.tests.test_datasources
Finished test(python3.6): pyspark.mllib.tests.test_algorithms (84s)
Starting test(python3.6): pyspark.sql.tests.test_functions
Finished test(python3.6): pyspark.sql.tests.test_arrow (43s)
Starting test(python3.6): pyspark.sql.tests.test_group
Finished test(python3.6): pyspark.sql.tests.test_context (23s)
Starting test(python3.6): pyspark.sql.tests.test_pandas_cogrouped_map
Finished test(python3.6): pyspark.mllib.tests.test_linalg (74s)
Starting test(python3.6): pyspark.sql.tests.test_pandas_grouped_map
Finished test(python3.6): pyspark.sql.tests.test_datasources (25s)
Starting test(python3.6): pyspark.sql.tests.test_pandas_map
Finished test(python3.6): pyspark.sql.tests.test_group (23s)
Starting test(python3.6): pyspark.sql.tests.test_pandas_udf
Finished test(python3.6): pyspark.sql.tests.test_dataframe (51s) ... 2 tests were skipped
Starting test(python3.6): pyspark.sql.tests.test_pandas_udf_grouped_agg
Finished test(python3.6): pyspark.sql.tests.test_pandas_map (27s)
Starting test(python3.6): pyspark.sql.tests.test_pandas_udf_scalar
Finished test(python3.6): pyspark.sql.tests.test_functions (45s)
Starting test(python3.6): pyspark.sql.tests.test_pandas_udf_typehints
Finished test(python3.6): pyspark.sql.tests.test_pandas_udf (30s)
Starting test(python3.6): pyspark.sql.tests.test_pandas_udf_window
Finished test(python3.6): pyspark.sql.tests.test_pandas_grouped_map (59s)
Starting test(python3.6): pyspark.sql.tests.test_readwriter
Finished test(python3.6): pyspark.sql.tests.test_pandas_udf_typehints (26s)
Starting test(python3.6): pyspark.sql.tests.test_serde
Finished test(python3.6): pyspark.sql.tests.test_serde (33s)
Starting test(python3.6): pyspark.sql.tests.test_session
Finished test(python3.6): pyspark.sql.tests.test_readwriter (40s)
Starting test(python3.6): pyspark.sql.tests.test_streaming
Finished test(python3.6): pyspark.sql.tests.test_pandas_udf_window (62s)
Starting test(python3.6): pyspark.sql.tests.test_types
Finished test(python3.6): pyspark.mllib.tests.test_streaming_algorithms (186s)
Starting test(python3.6): pyspark.sql.tests.test_udf
Finished test(python3.6): pyspark.sql.tests.test_pandas_udf_scalar (92s)
Starting test(python3.6): pyspark.sql.tests.test_utils
Finished test(python3.6): pyspark.sql.tests.test_session (35s)
Starting test(python3.6): pyspark.streaming.tests.test_context
Finished test(python3.6): pyspark.sql.tests.test_pandas_udf_grouped_agg (113s)
Starting test(python3.6): pyspark.streaming.tests.test_dstream
Finished test(python3.6): pyspark.sql.tests.test_pandas_cogrouped_map (143s)
Starting test(python3.6): pyspark.streaming.tests.test_kinesis
Finished test(python3.6): pyspark.sql.tests.test_utils (16s)
Starting test(python3.6): pyspark.streaming.tests.test_listener
Finished test(python3.6): pyspark.streaming.tests.test_kinesis (0s) ... 2 tests were skipped
Starting test(python3.6): pyspark.tests.test_appsubmit
Finished test(python3.6): pyspark.sql.tests.test_streaming (42s)
Starting test(python3.6): pyspark.tests.test_broadcast
Finished test(python3.6): pyspark.streaming.tests.test_listener (13s)
Starting test(python3.6): pyspark.tests.test_conf
Finished test(python3.6): pyspark.streaming.tests.test_context (27s)
Starting test(python3.6): pyspark.tests.test_context
Finished test(python3.6): pyspark.sql.tests.test_types (51s)
Starting test(python3.6): pyspark.tests.test_daemon
Finished test(python3.6): pyspark.tests.test_daemon (5s)
Starting test(python3.6): pyspark.tests.test_join
Finished test(python3.6): pyspark.sql.tests.test_udf (57s)
Starting test(python3.6): pyspark.tests.test_profiler
Finished test(python3.6): pyspark.tests.test_conf (19s)
Starting test(python3.6): pyspark.tests.test_rdd
Finished test(python3.6): pyspark.tests.test_join (12s)
Starting test(python3.6): pyspark.tests.test_rddbarrier
Finished test(python3.6): pyspark.ml.tests.test_tuning (286s)
Starting test(python3.6): pyspark.tests.test_readwrite
Finished test(python3.6): pyspark.tests.test_broadcast (44s)
Starting test(python3.6): pyspark.tests.test_serializers
Finished test(python3.6): pyspark.tests.test_profiler (14s)
Starting test(python3.6): pyspark.tests.test_shuffle
Finished test(python3.6): pyspark.tests.test_rddbarrier (10s)
Starting test(python3.6): pyspark.tests.test_taskcontext
Finished test(python3.6): pyspark.tests.test_serializers (14s)
Starting test(python3.6): pyspark.tests.test_util
Finished test(python3.6): pyspark.tests.test_shuffle (13s)
Starting test(python3.6): pyspark.tests.test_worker
Finished test(python3.6): pyspark.tests.test_readwrite (23s) ... 3 tests were skipped
Starting test(pypy3): pyspark.accumulators
Finished test(python3.6): pyspark.tests.test_util (7s)
Starting test(pypy3): pyspark.broadcast
Finished test(pypy3): pyspark.accumulators (9s)
Starting test(pypy3): pyspark.conf
Finished test(pypy3): pyspark.broadcast (9s)
Starting test(pypy3): pyspark.context
Finished test(python3.6): pyspark.tests.test_context (61s)
Starting test(pypy3): pyspark.profiler
Finished test(pypy3): pyspark.conf (6s)
Starting test(pypy3): pyspark.rdd
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/python/lib/pyspark.zip/pyspark/sql/types.py:1495: UserWarning: The environment variable 'PYSPARK_ROW_FIELD_SORTING_ENABLED' is deprecated and will be removed in future versions of Spark
  warnings.warn("The environment variable 'PYSPARK_ROW_FIELD_SORTING_ENABLED' "
Current mem limits: -1 of max -1
Setting mem limits to 2147483648 of max 2147483648
Running tests...
----------------------------------------------------------------------
  test_memory_limit (pyspark.tests.test_worker.WorkerMemoryTest) ... OK (8.827s)
  test_reuse_worker_of_parallelize_xrange (pyspark.tests.test_worker.WorkerReuseTest) ... FAIL (2.186s)
  test_accumulator_when_reuse_worker (pyspark.tests.test_worker.WorkerTests) ... OK (1.407s)
  test_after_exception (pyspark.tests.test_worker.WorkerTests) ... OK (0.359s)
  test_after_jvm_exception (pyspark.tests.test_worker.WorkerTests) ... OK (0.838s)
  test_after_non_exception_error (pyspark.tests.test_worker.WorkerTests) ... OK (0.132s)
  test_cancel_task (pyspark.tests.test_worker.WorkerTests) ... OK (4.169s)
  test_python_exception_non_hanging (pyspark.tests.test_worker.WorkerTests) ... OK (0.108s)
  test_reuse_worker_after_take (pyspark.tests.test_worker.WorkerTests) ... OK (0.226s)
  test_with_different_versions_of_python (pyspark.tests.test_worker.WorkerTests) ... OK (0.212s)

======================================================================
ERROR [2.186s]: test_reuse_worker_of_parallelize_xrange (pyspark.tests.test_worker.WorkerReuseTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/python/pyspark/tests/test_worker.py", line 185, in test_reuse_worker_of_parallelize_xrange
    self.assertTrue(pid in previous_pids)
AssertionError: False is not true

----------------------------------------------------------------------
Ran 10 tests in 19.527s

FAILED (errors=1)

Generating XML reports...

Had test failures in pyspark.tests.test_worker with python3.6; see logs.
[error] running /home/jenkins/workspace/spark-branch-3.0-test-sbt-hadoop-3.2-hive-2.3/python/run-tests --parallelism=8 ; received return code 255
Process leaked file descriptors. See https://jenkins.io/redirect/troubleshooting/process-leaked-file-descriptors for more information
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
[Checks API] No suitable checks publisher found.
Finished: FAILURE