Console Output

Skipping 40,294 KB..
copying deps/jars/kubernetes-model-events-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-extensions-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-flowcontrol-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-metrics-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-networking-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-node-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-policy-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-rbac-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-scheduling-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-storageclass-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/lapack-2.2.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/libthrift-0.12.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/logging-interceptor-3.12.12.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/lz4-java-1.8.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/macro-compat_2.12-1.1.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/mesos-1.4.3-shaded-protobuf.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/metrics-core-4.2.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/metrics-graphite-4.2.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/metrics-jmx-4.2.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/metrics-json-4.2.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/metrics-jvm-4.2.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/mockito-3-4_2.12-3.2.9.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/mockito-core-3.4.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/neko-htmlunit-2.39.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/netty-all-4.1.68.Final.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/objenesis-2.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/okhttp-3.12.12.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/okio-1.15.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/orc-core-1.7.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.7.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/orc-shims-1.7.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/parquet-column-1.12.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/parquet-common-1.12.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/parquet-encoding-1.12.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/parquet-format-structures-1.12.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.12.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/parquet-jackson-1.12.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/pickle-1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/pmml-model-1.4.8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/protobuf-java-3.14.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/py4j-0.10.9.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/remotetea-oncrpc-1.1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/rocksdbjni-6.20.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scala-collection-compat_2.12-2.2.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scala-compiler-2.12.15.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scala-library-2.12.15.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.12-1.1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scala-reflect-2.12.15.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scala-xml_2.12-1.3.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalacheck-1-15_2.12-3.2.9.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalacheck_2.12-1.15.4.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalactic_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-compatible-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-core_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-diagrams_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-featurespec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-flatspec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-freespec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-funspec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-funsuite_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-matchers-core_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-mustmatchers_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-propspec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-refspec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-shouldmatchers_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-wordspec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-3-141_2.12-3.2.9.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-api-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-chrome-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-edge-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-firefox-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-ie-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-java-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-opera-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-remote-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-safari-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-support-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/serializer-2.7.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/shapeless_2.12-2.3.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/shims-0.9.22.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/silencer-lib_2.12.15-1.7.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.32.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.30.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/snakeyaml-1.28.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/snappy-java-1.1.8.4.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.3.0-SNAPSHOT-tests.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-catalyst_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-core_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-ganglia-lgpl_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-graphx_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-hadoop-cloud_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-hive_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-kubernetes_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-kvstore_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-launcher_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-mesos_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-mllib-local_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-mllib_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-network-common_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-network-shuffle_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-repl_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-sketch_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-sql_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-streaming_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-tags_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-unsafe_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-yarn_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spire-macros_2.12-0.17.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spire-platform_2.12-0.17.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spire-util_2.12-0.17.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spire_2.12-0.17.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/stream-2.9.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/test-interface-1.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/threeten-extra-1.5.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/tink-1.6.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/transaction-api-1.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/univocity-parsers-2.9.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/velocity-1.5.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/websocket-api-9.4.27.v20200227.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/websocket-client-9.4.27.v20200227.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/websocket-common-9.4.27.v20200227.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/wildfly-openssl-1.0.7.Final.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/xalan-2.7.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/xbean-asm9-shaded-4.20.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/xercesImpl-2.12.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/xml-apis-1.4.01.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/xz-1.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/zookeeper-3.6.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/zookeeper-jute-3.6.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/zstd-jni-1.5.0-4.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-CC0.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-bootstrap.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-copybutton.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-datatables.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-join.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-json-formatter.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-matchMedia-polyfill.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-mustache.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-respond.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-vis-timeline.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/sbin/spark-config.sh -> pyspark-3.3.0.dev0/deps/sbin
copying deps/sbin/spark-daemon.sh -> pyspark-3.3.0.dev0/deps/sbin
copying deps/sbin/start-history-server.sh -> pyspark-3.3.0.dev0/deps/sbin
copying deps/sbin/stop-history-server.sh -> pyspark-3.3.0.dev0/deps/sbin
copying lib/py4j-0.10.9.2-src.zip -> pyspark-3.3.0.dev0/lib
copying lib/pyspark.zip -> pyspark-3.3.0.dev0/lib
copying pyspark/__init__.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/__init__.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/_globals.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/_typing.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/accumulators.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/broadcast.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/conf.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/context.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/context.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/daemon.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/files.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/install.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/join.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/profiler.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/profiler.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/py.typed -> pyspark-3.3.0.dev0/pyspark
copying pyspark/rdd.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/rdd.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/serializers.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/shell.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/status.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/util.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/version.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/worker.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-3.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-3.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-3.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-3.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-3.3.0.dev0/pyspark.egg-info
copying pyspark/cloudpickle/__init__.py -> pyspark-3.3.0.dev0/pyspark/cloudpickle
copying pyspark/cloudpickle/cloudpickle.py -> pyspark-3.3.0.dev0/pyspark/cloudpickle
copying pyspark/cloudpickle/cloudpickle_fast.py -> pyspark-3.3.0.dev0/pyspark/cloudpickle
copying pyspark/cloudpickle/compat.py -> pyspark-3.3.0.dev0/pyspark/cloudpickle
copying pyspark/ml/__init__.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/_typing.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/base.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/classification.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/clustering.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/feature.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/fpm.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/functions.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/functions.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/image.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/regression.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/stat.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/tree.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/tree.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/tuning.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/util.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-3.3.0.dev0/pyspark/ml/linalg
copying pyspark/ml/linalg/__init__.pyi -> pyspark-3.3.0.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-3.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/__init__.pyi -> pyspark-3.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-3.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.pyi -> pyspark-3.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-3.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.pyi -> pyspark-3.3.0.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/_typing.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/random.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/util.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-3.3.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/__init__.pyi -> pyspark-3.3.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-3.3.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.pyi -> pyspark-3.3.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-3.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-3.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-3.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.pyi -> pyspark-3.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-3.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-3.3.0.dev0/pyspark/mllib/stat
copying pyspark/pandas/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/_typing.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/accessors.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/base.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/categorical.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/config.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/datetimes.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/exceptions.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/extensions.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/frame.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/generic.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/groupby.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/indexing.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/internal.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/ml.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/mlflow.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/namespace.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/numpy_compat.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/series.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/sql_formatter.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/sql_processor.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/strings.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/utils.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/window.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/data_type_ops/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/base.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/binary_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/boolean_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/categorical_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/complex_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/date_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/datetime_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/null_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/num_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/string_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/udt_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/indexes/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/indexes
copying pyspark/pandas/indexes/base.py -> pyspark-3.3.0.dev0/pyspark/pandas/indexes
copying pyspark/pandas/indexes/category.py -> pyspark-3.3.0.dev0/pyspark/pandas/indexes
copying pyspark/pandas/indexes/datetimes.py -> pyspark-3.3.0.dev0/pyspark/pandas/indexes
copying pyspark/pandas/indexes/multi.py -> pyspark-3.3.0.dev0/pyspark/pandas/indexes
copying pyspark/pandas/indexes/numeric.py -> pyspark-3.3.0.dev0/pyspark/pandas/indexes
copying pyspark/pandas/missing/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/missing/common.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/missing/frame.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/missing/groupby.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/missing/indexes.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/missing/series.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/missing/window.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/plot/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/plot
copying pyspark/pandas/plot/core.py -> pyspark-3.3.0.dev0/pyspark/pandas/plot
copying pyspark/pandas/plot/matplotlib.py -> pyspark-3.3.0.dev0/pyspark/pandas/plot
copying pyspark/pandas/plot/plotly.py -> pyspark-3.3.0.dev0/pyspark/pandas/plot
copying pyspark/pandas/spark/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/spark
copying pyspark/pandas/spark/accessors.py -> pyspark-3.3.0.dev0/pyspark/pandas/spark
copying pyspark/pandas/spark/functions.py -> pyspark-3.3.0.dev0/pyspark/pandas/spark
copying pyspark/pandas/spark/utils.py -> pyspark-3.3.0.dev0/pyspark/pandas/spark
copying pyspark/pandas/typedef/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/typedef
copying pyspark/pandas/typedef/string_typehints.py -> pyspark-3.3.0.dev0/pyspark/pandas/typedef
copying pyspark/pandas/typedef/typehints.py -> pyspark-3.3.0.dev0/pyspark/pandas/typedef
copying pyspark/pandas/usage_logging/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/usage_logging
copying pyspark/pandas/usage_logging/usage_logger.py -> pyspark-3.3.0.dev0/pyspark/pandas/usage_logging
copying pyspark/python/pyspark/shell.py -> pyspark-3.3.0.dev0/pyspark/python/pyspark
copying pyspark/resource/__init__.py -> pyspark-3.3.0.dev0/pyspark/resource
copying pyspark/resource/information.py -> pyspark-3.3.0.dev0/pyspark/resource
copying pyspark/resource/profile.py -> pyspark-3.3.0.dev0/pyspark/resource
copying pyspark/resource/requests.py -> pyspark-3.3.0.dev0/pyspark/resource
copying pyspark/sql/__init__.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/_typing.pyi -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/observation.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/avro/__init__.py -> pyspark-3.3.0.dev0/pyspark/sql/avro
copying pyspark/sql/avro/functions.py -> pyspark-3.3.0.dev0/pyspark/sql/avro
copying pyspark/sql/pandas/__init__.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/conversion.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/functions.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/functions.pyi -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/group_ops.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/map_ops.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/serializers.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/typehints.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/types.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/utils.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/_typing/__init__.pyi -> pyspark-3.3.0.dev0/pyspark/sql/pandas/_typing
copying pyspark/sql/pandas/_typing/protocols/__init__.pyi -> pyspark-3.3.0.dev0/pyspark/sql/pandas/_typing/protocols
copying pyspark/sql/pandas/_typing/protocols/frame.pyi -> pyspark-3.3.0.dev0/pyspark/sql/pandas/_typing/protocols
copying pyspark/sql/pandas/_typing/protocols/series.pyi -> pyspark-3.3.0.dev0/pyspark/sql/pandas/_typing/protocols
copying pyspark/streaming/__init__.py -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/context.pyi -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.pyi -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-3.3.0.dev0/pyspark/streaming
Writing pyspark-3.3.0.dev0/setup.cfg
creating dist
Creating tar archive
removing 'pyspark-3.3.0.dev0' (and everything under it)
Installing dist into virtual env
Processing ./python/dist/pyspark-3.3.0.dev0.tar.gz
Collecting py4j==0.10.9.2
  Downloading py4j-0.10.9.2-py2.py3-none-any.whl (198 kB)
Building wheels for collected packages: pyspark
  Building wheel for pyspark (setup.py): started
  Building wheel for pyspark (setup.py): finished with status 'done'
  Created wheel for pyspark: filename=pyspark-3.3.0.dev0-py2.py3-none-any.whl size=492372955 sha256=1f47e30b1b50469cc10b378c103975395e5e1b7416335647dee31e0ea8bb51f2
  Stored in directory: /tmp/pip-ephem-wheel-cache-ovez_3fe/wheels/aa/cf/7b/8503f86eb8bc05b575fde9a670eb24e8fb1159f99d81bf5306
Successfully built pyspark
Installing collected packages: py4j, pyspark
Successfully installed py4j-0.10.9.2 pyspark-3.3.0.dev0
Run basic sanity check on pip installed version with spark-submit
21/11/26 07:23:52 WARN Utils: Your hostname, research-jenkins-worker-08 resolves to a loopback address: 127.0.1.1; using 192.168.10.28 instead (on interface igb0)
21/11/26 07:23:52 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/11/26 07:23:53 INFO SparkContext: Running Spark version 3.3.0-SNAPSHOT
21/11/26 07:23:53 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/11/26 07:23:54 INFO ResourceUtils: ==============================================================
21/11/26 07:23:54 INFO ResourceUtils: No custom resources configured for spark.driver.
21/11/26 07:23:54 INFO ResourceUtils: ==============================================================
21/11/26 07:23:54 INFO SparkContext: Submitted application: PipSanityCheck
21/11/26 07:23:54 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
21/11/26 07:23:54 INFO ResourceProfile: Limiting resource is cpu
21/11/26 07:23:54 INFO ResourceProfileManager: Added ResourceProfile id: 0
21/11/26 07:23:54 INFO SecurityManager: Changing view acls to: jenkins
21/11/26 07:23:54 INFO SecurityManager: Changing modify acls to: jenkins
21/11/26 07:23:54 INFO SecurityManager: Changing view acls groups to: 
21/11/26 07:23:54 INFO SecurityManager: Changing modify acls groups to: 
21/11/26 07:23:54 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
21/11/26 07:23:54 INFO Utils: Successfully started service 'sparkDriver' on port 42141.
21/11/26 07:23:54 INFO SparkEnv: Registering MapOutputTracker
21/11/26 07:23:54 INFO SparkEnv: Registering BlockManagerMaster
21/11/26 07:23:54 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/11/26 07:23:54 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/11/26 07:23:54 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
21/11/26 07:23:54 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-72b48949-aa5f-4501-9ce9-0c4349407b91
21/11/26 07:23:54 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
21/11/26 07:23:54 INFO SparkEnv: Registering OutputCommitCoordinator
21/11/26 07:23:54 INFO log: Logging initialized @2730ms to org.eclipse.jetty.util.log.Slf4jLog
21/11/26 07:23:54 INFO Server: jetty-9.4.43.v20210629; built: 2021-06-30T11:07:22.254Z; git: 526006ecfa3af7f1a27ef3a288e2bef7ea9dd7e8; jvm 1.8.0_282-8u282-b08-0ubuntu1~20.04-b08
21/11/26 07:23:54 INFO Server: Started @2883ms
21/11/26 07:23:54 INFO AbstractConnector: Started ServerConnector@13eacca{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
21/11/26 07:23:54 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/11/26 07:23:54 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5ab604b5{/,null,AVAILABLE,@Spark}
21/11/26 07:23:54 INFO Executor: Starting executor ID driver on host 192.168.10.28
21/11/26 07:23:54 INFO Executor: Starting executor with user classpath (userClassPathFirst = false): ''
21/11/26 07:23:54 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 45153.
21/11/26 07:23:54 INFO NettyBlockTransferService: Server created on 192.168.10.28:45153
21/11/26 07:23:54 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/11/26 07:23:55 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.10.28, 45153, None)
21/11/26 07:23:55 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.10.28:45153 with 366.3 MiB RAM, BlockManagerId(driver, 192.168.10.28, 45153, None)
21/11/26 07:23:55 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.10.28, 45153, None)
21/11/26 07:23:55 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.10.28, 45153, None)
21/11/26 07:23:55 INFO ContextHandler: Stopped o.e.j.s.ServletContextHandler@5ab604b5{/,null,STOPPED,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@70baa16a{/jobs,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2cf45867{/jobs/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@56b292b9{/jobs/job,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5adae7d5{/jobs/job/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1d737a2b{/stages,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@166cae88{/stages/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3fbec8f6{/stages/stage,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1d41b9c4{/stages/stage/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3179ae06{/stages/pool,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@17b3ac5d{/stages/pool/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5aba5843{/storage,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3278a87e{/storage/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6247d7bd{/storage/rdd,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@34da62e0{/storage/rdd/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3d093f96{/environment,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@290bfc6c{/environment/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@438a656a{/executors,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@71a0633c{/executors/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@eb9b48d{/executors/threadDump,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@b82d8a8{/executors/threadDump/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@179339a8{/static,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@21db1d80{/,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@58d4e61f{/api,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3356957f{/jobs/job/kill,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@37812ee0{/stages/stage/kill,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1de789c7{/metrics/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir.
21/11/26 07:23:55 INFO SharedState: Warehouse path is 'file:/spark-warehouse'.
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@17856acc{/SQL,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2ad8dbe2{/SQL/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@71bdb175{/SQL/execution,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3b5aaa64{/SQL/execution/json,null,AVAILABLE,@Spark}
21/11/26 07:23:55 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5cf3c73b{/static/sql,null,AVAILABLE,@Spark}
21/11/26 07:23:56 INFO SparkContext: Starting job: reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/dev/pip-sanity-check.py:28
21/11/26 07:23:56 INFO DAGScheduler: Got job 0 (reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/dev/pip-sanity-check.py:28) with 10 output partitions
21/11/26 07:23:56 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/dev/pip-sanity-check.py:28)
21/11/26 07:23:56 INFO DAGScheduler: Parents of final stage: List()
21/11/26 07:23:56 INFO DAGScheduler: Missing parents: List()
21/11/26 07:23:56 INFO DAGScheduler: Submitting ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/dev/pip-sanity-check.py:28), which has no missing parents
21/11/26 07:23:56 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 7.1 KiB, free 366.3 MiB)
21/11/26 07:23:57 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 4.5 KiB, free 366.3 MiB)
21/11/26 07:23:57 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.10.28:45153 (size: 4.5 KiB, free: 366.3 MiB)
21/11/26 07:23:57 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1474
21/11/26 07:23:57 INFO DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/dev/pip-sanity-check.py:28) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
21/11/26 07:23:57 INFO TaskSchedulerImpl: Adding task set 0.0 with 10 tasks resource profile 0
21/11/26 07:23:57 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0) (192.168.10.28, executor driver, partition 0, PROCESS_LOCAL, 4433 bytes) taskResourceAssignments Map()
21/11/26 07:23:57 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1) (192.168.10.28, executor driver, partition 1, PROCESS_LOCAL, 4433 bytes) taskResourceAssignments Map()
21/11/26 07:23:57 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2) (192.168.10.28, executor driver, partition 2, PROCESS_LOCAL, 4433 bytes) taskResourceAssignments Map()
21/11/26 07:23:57 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3) (192.168.10.28, executor driver, partition 3, PROCESS_LOCAL, 4433 bytes) taskResourceAssignments Map()
21/11/26 07:23:57 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4) (192.168.10.28, executor driver, partition 4, PROCESS_LOCAL, 4433 bytes) taskResourceAssignments Map()
21/11/26 07:23:57 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5) (192.168.10.28, executor driver, partition 5, PROCESS_LOCAL, 4433 bytes) taskResourceAssignments Map()
21/11/26 07:23:57 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6) (192.168.10.28, executor driver, partition 6, PROCESS_LOCAL, 4433 bytes) taskResourceAssignments Map()
21/11/26 07:23:57 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7) (192.168.10.28, executor driver, partition 7, PROCESS_LOCAL, 4433 bytes) taskResourceAssignments Map()
21/11/26 07:23:57 INFO TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8) (192.168.10.28, executor driver, partition 8, PROCESS_LOCAL, 4433 bytes) taskResourceAssignments Map()
21/11/26 07:23:57 INFO TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9) (192.168.10.28, executor driver, partition 9, PROCESS_LOCAL, 4433 bytes) taskResourceAssignments Map()
21/11/26 07:23:57 INFO Executor: Running task 6.0 in stage 0.0 (TID 6)
21/11/26 07:23:57 INFO Executor: Running task 3.0 in stage 0.0 (TID 3)
21/11/26 07:23:57 INFO Executor: Running task 7.0 in stage 0.0 (TID 7)
21/11/26 07:23:57 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
21/11/26 07:23:57 INFO Executor: Running task 4.0 in stage 0.0 (TID 4)
21/11/26 07:23:57 INFO Executor: Running task 8.0 in stage 0.0 (TID 8)
21/11/26 07:23:57 INFO Executor: Running task 5.0 in stage 0.0 (TID 5)
21/11/26 07:23:57 INFO Executor: Running task 2.0 in stage 0.0 (TID 2)
21/11/26 07:23:57 INFO Executor: Running task 9.0 in stage 0.0 (TID 9)
21/11/26 07:23:57 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
21/11/26 07:23:58 INFO PythonRunner: Times: total = 547, boot = 536, init = 10, finish = 1
21/11/26 07:23:58 INFO PythonRunner: Times: total = 547, boot = 541, init = 6, finish = 0
21/11/26 07:23:58 INFO PythonRunner: Times: total = 547, boot = 526, init = 20, finish = 1
21/11/26 07:23:58 INFO PythonRunner: Times: total = 547, boot = 531, init = 15, finish = 1
21/11/26 07:23:58 INFO PythonRunner: Times: total = 553, boot = 546, init = 6, finish = 1
21/11/26 07:23:58 INFO PythonRunner: Times: total = 557, boot = 551, init = 5, finish = 1
21/11/26 07:23:58 INFO PythonRunner: Times: total = 563, boot = 557, init = 5, finish = 1
21/11/26 07:23:58 INFO PythonRunner: Times: total = 568, boot = 563, init = 5, finish = 0
21/11/26 07:23:58 INFO PythonRunner: Times: total = 573, boot = 567, init = 6, finish = 0
21/11/26 07:23:58 INFO PythonRunner: Times: total = 578, boot = 572, init = 6, finish = 0
21/11/26 07:23:58 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1354 bytes result sent to driver
21/11/26 07:23:58 INFO Executor: Finished task 8.0 in stage 0.0 (TID 8). 1355 bytes result sent to driver
21/11/26 07:23:58 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1354 bytes result sent to driver
21/11/26 07:23:58 INFO Executor: Finished task 5.0 in stage 0.0 (TID 5). 1355 bytes result sent to driver
21/11/26 07:23:58 INFO Executor: Finished task 7.0 in stage 0.0 (TID 7). 1355 bytes result sent to driver
21/11/26 07:23:58 INFO Executor: Finished task 4.0 in stage 0.0 (TID 4). 1355 bytes result sent to driver
21/11/26 07:23:58 INFO Executor: Finished task 2.0 in stage 0.0 (TID 2). 1354 bytes result sent to driver
21/11/26 07:23:58 INFO Executor: Finished task 3.0 in stage 0.0 (TID 3). 1355 bytes result sent to driver
21/11/26 07:23:58 INFO Executor: Finished task 9.0 in stage 0.0 (TID 9). 1355 bytes result sent to driver
21/11/26 07:23:58 INFO Executor: Finished task 6.0 in stage 0.0 (TID 6). 1355 bytes result sent to driver
21/11/26 07:23:58 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 995 ms on 192.168.10.28 (executor driver) (1/10)
21/11/26 07:23:58 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 999 ms on 192.168.10.28 (executor driver) (2/10)
21/11/26 07:23:58 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 1001 ms on 192.168.10.28 (executor driver) (3/10)
21/11/26 07:23:58 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1019 ms on 192.168.10.28 (executor driver) (4/10)
21/11/26 07:23:58 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 999 ms on 192.168.10.28 (executor driver) (5/10)
21/11/26 07:23:58 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 997 ms on 192.168.10.28 (executor driver) (6/10)
21/11/26 07:23:58 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 995 ms on 192.168.10.28 (executor driver) (7/10)
21/11/26 07:23:58 INFO TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 994 ms on 192.168.10.28 (executor driver) (8/10)
21/11/26 07:23:58 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 997 ms on 192.168.10.28 (executor driver) (9/10)
21/11/26 07:23:58 INFO TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 994 ms on 192.168.10.28 (executor driver) (10/10)
21/11/26 07:23:58 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
21/11/26 07:23:58 INFO PythonAccumulatorV2: Connected to AccumulatorServer at host: 127.0.0.1 port: 38419
21/11/26 07:23:58 INFO DAGScheduler: ResultStage 0 (reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/dev/pip-sanity-check.py:28) finished in 1.224 s
21/11/26 07:23:58 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
21/11/26 07:23:58 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
21/11/26 07:23:58 INFO DAGScheduler: Job 0 finished: reduce at /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/dev/pip-sanity-check.py:28, took 1.291253 s
Successfully ran pip sanity check
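The sanity check above (dev/pip-sanity-check.py, whose source is not shown in this log) runs a small distributed reduce through the pip-installed package and verifies the result. A minimal sketch of that shape, assuming illustrative names and the 10 partitions visible in the log; the pyspark calls are left commented so the arithmetic can be checked without a Spark install:

```python
from functools import reduce
from operator import add

# Whatever the partitioning, a sum-style reduce over range(n)
# must equal n*(n-1)//2 - that is what a sanity check can assert on.
def expected_sum(n: int) -> int:
    return reduce(add, range(n))

assert expected_sum(100) == 100 * 99 // 2  # 4950

# With pyspark installed, the distributed equivalent (names illustrative):
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.appName("PipSanityCheck").getOrCreate()
# value = spark.sparkContext.parallelize(range(100), 10).reduce(add)
# assert value == expected_sum(100)
# spark.stop()
```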
21/11/26 07:23:58 INFO AbstractConnector: Stopped Spark@13eacca{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
21/11/26 07:23:58 INFO SparkUI: Stopped Spark web UI at http://192.168.10.28:4040
21/11/26 07:23:58 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/11/26 07:23:58 INFO MemoryStore: MemoryStore cleared
21/11/26 07:23:58 INFO BlockManager: BlockManager stopped
21/11/26 07:23:58 INFO BlockManagerMaster: BlockManagerMaster stopped
21/11/26 07:23:58 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/11/26 07:23:58 INFO SparkContext: Successfully stopped SparkContext
21/11/26 07:23:59 INFO ShutdownHookManager: Shutdown hook called
21/11/26 07:23:59 INFO ShutdownHookManager: Deleting directory /tmp/spark-c528589b-6e45-419b-8e8a-f30f418f76af
21/11/26 07:23:59 INFO ShutdownHookManager: Deleting directory /tmp/spark-d72c409e-c5ce-4e95-b722-e3cdfc4ff0fc
21/11/26 07:23:59 INFO ShutdownHookManager: Deleting directory /tmp/spark-d72c409e-c5ce-4e95-b722-e3cdfc4ff0fc/pyspark-d6e4f014-33de-4b81-913a-a58b4e73ed4f
Run basic sanity check with import-based usage
21/11/26 07:24:00 WARN Utils: Your hostname, research-jenkins-worker-08 resolves to a loopback address: 127.0.1.1; using 192.168.10.28 instead (on interface igb0)
21/11/26 07:24:00 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
21/11/26 07:24:01 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

[Stage 0:>                                                        (0 + 10) / 10]
Successfully ran pip sanity check
Run the tests for context.py
21/11/26 07:24:08 WARN Utils: Your hostname, research-jenkins-worker-08 resolves to a loopback address: 127.0.1.1; using 192.168.10.28 instead (on interface igb0)
21/11/26 07:24:08 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
21/11/26 07:24:08 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/11/26 07:24:10 WARN Utils: Your hostname, research-jenkins-worker-08 resolves to a loopback address: 127.0.1.1; using 192.168.10.28 instead (on interface igb0)
21/11/26 07:24:10 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
21/11/26 07:24:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

[Stage 0:>                                                          (0 + 4) / 4]

[Stage 10:>                                                         (0 + 4) / 4]

[Stage 10:>                 (0 + 4) / 4][Stage 11:>                 (0 + 0) / 2]
21/11/26 07:24:23 WARN PythonRunner: Incomplete task 1.0 in stage 10 (TID 40) interrupted: Attempting to kill Python Worker
21/11/26 07:24:23 WARN PythonRunner: Incomplete task 0.0 in stage 10 (TID 39) interrupted: Attempting to kill Python Worker
21/11/26 07:24:23 WARN PythonRunner: Incomplete task 2.0 in stage 10 (TID 41) interrupted: Attempting to kill Python Worker
21/11/26 07:24:23 WARN PythonRunner: Incomplete task 3.0 in stage 10 (TID 42) interrupted: Attempting to kill Python Worker

[Stage 10:>                 (0 + 4) / 4][Stage 11:>                 (0 + 2) / 2]
21/11/26 07:24:23 WARN TaskSetManager: Lost task 3.0 in stage 10.0 (TID 42) (192.168.10.28 executor driver): TaskKilled (Stage cancelled)
21/11/26 07:24:23 WARN TaskSetManager: Lost task 1.0 in stage 10.0 (TID 40) (192.168.10.28 executor driver): TaskKilled (Stage cancelled)
21/11/26 07:24:23 WARN TaskSetManager: Lost task 2.0 in stage 10.0 (TID 41) (192.168.10.28 executor driver): TaskKilled (Stage cancelled)
21/11/26 07:24:23 WARN TaskSetManager: Lost task 0.0 in stage 10.0 (TID 39) (192.168.10.28 executor driver): TaskKilled (Stage cancelled)
DeprecationWarning: 'source deactivate' is deprecated. Use 'conda deactivate'.
Testing pip installation with python 3.9
Using /tmp/tmp.PVpg6xqDGH for virtualenv
Collecting package metadata (current_repodata.json): ...working... done
Solving environment: ...working... failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done


==> WARNING: A newer version of conda exists. <==
  current version: 4.7.12
  latest version: 4.10.3

Please update conda by running

    $ conda update -n base -c defaults conda



## Package Plan ##

  environment location: /tmp/tmp.PVpg6xqDGH/3.9

  added / updated specs:
    - numpy
    - pandas
    - pip
    - python=3.9
    - setuptools


The following NEW packages will be INSTALLED:

  _libgcc_mutex      pkgs/main/linux-64::_libgcc_mutex-0.1-main
  blas               pkgs/main/linux-64::blas-1.0-mkl
  bottleneck         pkgs/main/linux-64::bottleneck-1.3.2-py39hdd57654_1
  ca-certificates    pkgs/main/linux-64::ca-certificates-2021.10.26-h06a4308_2
  certifi            pkgs/main/linux-64::certifi-2021.10.8-py39h06a4308_0
  intel-openmp       pkgs/main/linux-64::intel-openmp-2021.4.0-h06a4308_3561
  ld_impl_linux-64   pkgs/main/linux-64::ld_impl_linux-64-2.35.1-h7274673_9
  libffi             pkgs/main/linux-64::libffi-3.3-he6710b0_2
  libgcc-ng          pkgs/main/linux-64::libgcc-ng-9.1.0-hdf63c60_0
  libstdcxx-ng       pkgs/main/linux-64::libstdcxx-ng-9.1.0-hdf63c60_0
  mkl                pkgs/main/linux-64::mkl-2021.4.0-h06a4308_640
  mkl-service        pkgs/main/linux-64::mkl-service-2.4.0-py39h7f8727e_0
  mkl_fft            pkgs/main/linux-64::mkl_fft-1.3.1-py39hd3c417c_0
  mkl_random         pkgs/main/linux-64::mkl_random-1.2.2-py39h51133e4_0
  ncurses            pkgs/main/linux-64::ncurses-6.3-h7f8727e_2
  numexpr            pkgs/main/linux-64::numexpr-2.7.3-py39h22e1b3c_1
  numpy              pkgs/main/linux-64::numpy-1.21.2-py39h20f2e39_0
  numpy-base         pkgs/main/linux-64::numpy-base-1.21.2-py39h79a1101_0
  openssl            pkgs/main/linux-64::openssl-1.1.1l-h7f8727e_0
  pandas             pkgs/main/linux-64::pandas-1.3.4-py39h8c16a72_0
  pip                pkgs/main/linux-64::pip-21.2.4-py39h06a4308_0
  python             pkgs/main/linux-64::python-3.9.7-h12debd9_1
  python-dateutil    pkgs/main/noarch::python-dateutil-2.8.2-pyhd3eb1b0_0
  pytz               pkgs/main/noarch::pytz-2021.3-pyhd3eb1b0_0
  readline           pkgs/main/linux-64::readline-8.1-h27cfd23_0
  setuptools         pkgs/main/linux-64::setuptools-58.0.4-py39h06a4308_0
  six                pkgs/main/noarch::six-1.16.0-pyhd3eb1b0_0
  sqlite             pkgs/main/linux-64::sqlite-3.36.0-hc218d9a_0
  tk                 pkgs/main/linux-64::tk-8.6.11-h1ccaba5_0
  tzdata             pkgs/main/noarch::tzdata-2021e-hda174b7_0
  wheel              pkgs/main/noarch::wheel-0.37.0-pyhd3eb1b0_1
  xz                 pkgs/main/linux-64::xz-5.2.5-h7b6447c_0
  zlib               pkgs/main/linux-64::zlib-1.2.11-h7b6447c_3


Preparing transaction: ...working... done
Verifying transaction: ...working... done
Executing transaction: ...working... done
#
# To activate this environment, use
#
#     $ conda activate /tmp/tmp.PVpg6xqDGH/3.9
#
# To deactivate an active environment, use
#
#     $ conda deactivate

Creating pip installable source dist
/tmp/tmp.PVpg6xqDGH/3.9/lib/python3.9/site-packages/setuptools/dist.py:717: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
  warnings.warn(
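The UserWarning above is setuptools flagging a dash-separated key in the packaging metadata; the underscore spelling it asks for looks like this (fragment is illustrative, not the project's actual setup.cfg):

```ini
# setup.cfg - underscore key avoids the dash-separated deprecation warning
[metadata]
description_file = README.md
```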
running sdist
running egg_info
creating pyspark.egg-info
writing pyspark.egg-info/PKG-INFO
writing dependency_links to pyspark.egg-info/dependency_links.txt
writing requirements to pyspark.egg-info/requires.txt
writing top-level names to pyspark.egg-info/top_level.txt
writing manifest file 'pyspark.egg-info/SOURCES.txt'
package init file 'deps/bin/__init__.py' not found (or not a regular file)
package init file 'deps/sbin/__init__.py' not found (or not a regular file)
package init file 'deps/jars/__init__.py' not found (or not a regular file)
package init file 'pyspark/python/pyspark/__init__.py' not found (or not a regular file)
package init file 'lib/__init__.py' not found (or not a regular file)
package init file 'deps/data/__init__.py' not found (or not a regular file)
package init file 'deps/licenses/__init__.py' not found (or not a regular file)
reading manifest file 'pyspark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '.DS_Store' found anywhere in distribution
writing manifest file 'pyspark.egg-info/SOURCES.txt'
running check
creating pyspark-3.3.0.dev0
creating pyspark-3.3.0.dev0/deps
creating pyspark-3.3.0.dev0/deps/bin
creating pyspark-3.3.0.dev0/deps/data
creating pyspark-3.3.0.dev0/deps/data/graphx
creating pyspark-3.3.0.dev0/deps/data/mllib
creating pyspark-3.3.0.dev0/deps/data/mllib/als
creating pyspark-3.3.0.dev0/deps/data/mllib/images
creating pyspark-3.3.0.dev0/deps/data/mllib/images/origin
creating pyspark-3.3.0.dev0/deps/data/mllib/images/origin/kittens
creating pyspark-3.3.0.dev0/deps/data/mllib/images/partitioned
creating pyspark-3.3.0.dev0/deps/data/mllib/images/partitioned/cls=kittens
creating pyspark-3.3.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
creating pyspark-3.3.0.dev0/deps/data/mllib/ridge-data
creating pyspark-3.3.0.dev0/deps/data/streaming
creating pyspark-3.3.0.dev0/deps/examples
creating pyspark-3.3.0.dev0/deps/examples/ml
creating pyspark-3.3.0.dev0/deps/examples/mllib
creating pyspark-3.3.0.dev0/deps/examples/sql
creating pyspark-3.3.0.dev0/deps/examples/sql/streaming
creating pyspark-3.3.0.dev0/deps/examples/streaming
creating pyspark-3.3.0.dev0/deps/jars
creating pyspark-3.3.0.dev0/deps/licenses
creating pyspark-3.3.0.dev0/deps/sbin
creating pyspark-3.3.0.dev0/lib
creating pyspark-3.3.0.dev0/pyspark
creating pyspark-3.3.0.dev0/pyspark.egg-info
creating pyspark-3.3.0.dev0/pyspark/cloudpickle
creating pyspark-3.3.0.dev0/pyspark/ml
creating pyspark-3.3.0.dev0/pyspark/ml/linalg
creating pyspark-3.3.0.dev0/pyspark/ml/param
creating pyspark-3.3.0.dev0/pyspark/mllib
creating pyspark-3.3.0.dev0/pyspark/mllib/linalg
creating pyspark-3.3.0.dev0/pyspark/mllib/stat
creating pyspark-3.3.0.dev0/pyspark/pandas
creating pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
creating pyspark-3.3.0.dev0/pyspark/pandas/indexes
creating pyspark-3.3.0.dev0/pyspark/pandas/missing
creating pyspark-3.3.0.dev0/pyspark/pandas/plot
creating pyspark-3.3.0.dev0/pyspark/pandas/spark
creating pyspark-3.3.0.dev0/pyspark/pandas/typedef
creating pyspark-3.3.0.dev0/pyspark/pandas/usage_logging
creating pyspark-3.3.0.dev0/pyspark/python
creating pyspark-3.3.0.dev0/pyspark/python/pyspark
creating pyspark-3.3.0.dev0/pyspark/resource
creating pyspark-3.3.0.dev0/pyspark/sql
creating pyspark-3.3.0.dev0/pyspark/sql/avro
creating pyspark-3.3.0.dev0/pyspark/sql/pandas
creating pyspark-3.3.0.dev0/pyspark/sql/pandas/_typing
creating pyspark-3.3.0.dev0/pyspark/sql/pandas/_typing/protocols
creating pyspark-3.3.0.dev0/pyspark/streaming
copying files to pyspark-3.3.0.dev0...
copying MANIFEST.in -> pyspark-3.3.0.dev0
copying README.md -> pyspark-3.3.0.dev0
copying setup.cfg -> pyspark-3.3.0.dev0
copying setup.py -> pyspark-3.3.0.dev0
copying deps/bin/beeline -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/beeline.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/docker-image-tool.sh -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/find-spark-home -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/find-spark-home.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/load-spark-env.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/load-spark-env.sh -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/pyspark -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/pyspark.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/pyspark2.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/run-example -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/run-example.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/spark-class -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/spark-class.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/spark-class2.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/spark-shell -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/spark-shell.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/spark-shell2.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/spark-sql -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/spark-sql.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/spark-sql2.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/spark-submit -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/spark-submit.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/spark-submit2.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/sparkR -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/sparkR.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/bin/sparkR2.cmd -> pyspark-3.3.0.dev0/deps/bin
copying deps/data/graphx/followers.txt -> pyspark-3.3.0.dev0/deps/data/graphx
copying deps/data/graphx/users.txt -> pyspark-3.3.0.dev0/deps/data/graphx
copying deps/data/mllib/gmm_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/iris_libsvm.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/kmeans_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/pagerank_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/pic_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_binary_classification_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_fpgrowth.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_isotonic_regression_libsvm_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_kmeans_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_libsvm_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_libsvm_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_linear_regression_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_movielens_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_multiclass_classification_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/sample_svm_data.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/streaming_kmeans_data_test.txt -> pyspark-3.3.0.dev0/deps/data/mllib
copying deps/data/mllib/als/sample_movielens_ratings.txt -> pyspark-3.3.0.dev0/deps/data/mllib/als
copying deps/data/mllib/als/test.data -> pyspark-3.3.0.dev0/deps/data/mllib/als
copying deps/data/mllib/images/license.txt -> pyspark-3.3.0.dev0/deps/data/mllib/images
copying deps/data/mllib/images/origin/license.txt -> pyspark-3.3.0.dev0/deps/data/mllib/images/origin
copying deps/data/mllib/images/origin/kittens/not-image.txt -> pyspark-3.3.0.dev0/deps/data/mllib/images/origin/kittens
copying deps/data/mllib/images/partitioned/cls=kittens/date=2018-01/not-image.txt -> pyspark-3.3.0.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
copying deps/data/mllib/ridge-data/lpsa.data -> pyspark-3.3.0.dev0/deps/data/mllib/ridge-data
copying deps/data/streaming/AFINN-111.txt -> pyspark-3.3.0.dev0/deps/data/streaming
copying deps/examples/__init__.py -> pyspark-3.3.0.dev0/deps/examples
copying deps/examples/als.py -> pyspark-3.3.0.dev0/deps/examples
copying deps/examples/avro_inputformat.py -> pyspark-3.3.0.dev0/deps/examples
copying deps/examples/kmeans.py -> pyspark-3.3.0.dev0/deps/examples
copying deps/examples/logistic_regression.py -> pyspark-3.3.0.dev0/deps/examples
copying deps/examples/pagerank.py -> pyspark-3.3.0.dev0/deps/examples
copying deps/examples/parquet_inputformat.py -> pyspark-3.3.0.dev0/deps/examples
copying deps/examples/pi.py -> pyspark-3.3.0.dev0/deps/examples
copying deps/examples/sort.py -> pyspark-3.3.0.dev0/deps/examples
copying deps/examples/status_api_demo.py -> pyspark-3.3.0.dev0/deps/examples
copying deps/examples/transitive_closure.py -> pyspark-3.3.0.dev0/deps/examples
copying deps/examples/wordcount.py -> pyspark-3.3.0.dev0/deps/examples
copying deps/examples/ml/aft_survival_regression.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/als_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/binarizer_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/bisecting_k_means_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketed_random_projection_lsh_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/bucketizer_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/chi_square_test_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/chisq_selector_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/correlation_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/count_vectorizer_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/cross_validator.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/dataframe_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/dct_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_classification_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_regression_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/elementwise_product_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/estimator_transformer_param_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/feature_hasher_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/fm_classifier_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/fm_regressor_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/fpgrowth_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/gaussian_mixture_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/generalized_linear_regression_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_classifier_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_regressor_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/imputer_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/index_to_string_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/interaction_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/isotonic_regression_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/kmeans_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/lda_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/linearsvc.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/min_max_scaler_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/n_gram_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/naive_bayes_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/normalizer_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/one_vs_rest_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/onehot_encoder_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/pca_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/pipeline_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/power_iteration_clustering_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/prefixspan_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/rformula_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/robust_scaler_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/sql_transformer.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/standard_scaler_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/stopwords_remover_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/string_indexer_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/summarizer_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/tf_idf_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/tokenizer_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/train_validation_split.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/univariate_feature_selector_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/variance_threshold_selector_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_assembler_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_indexer_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_size_hint_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-3.3.0.dev0/deps/examples/ml
copying deps/examples/mllib/__init__.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-3.3.0.dev0/deps/examples/mllib
copying deps/examples/sql/__init__.py -> pyspark-3.3.0.dev0/deps/examples/sql
copying deps/examples/sql/arrow.py -> pyspark-3.3.0.dev0/deps/examples/sql
copying deps/examples/sql/basic.py -> pyspark-3.3.0.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-3.3.0.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-3.3.0.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-3.3.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-3.3.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-3.3.0.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_sessionization.py -> pyspark-3.3.0.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/__init__.py -> pyspark-3.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-3.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-3.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-3.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-3.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-3.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-3.3.0.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-3.3.0.dev0/deps/examples/streaming
copying deps/jars/HikariCP-2.5.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/JLargeArrays-1.5.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/JTransforms-3.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.9.22.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/aircompressor-0.21.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/algebra_2.12-2.0.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/aliyun-java-sdk-core-3.4.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/aliyun-java-sdk-ecs-4.2.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/aliyun-java-sdk-ram-3.0.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/aliyun-java-sdk-sts-3.0.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/aliyun-sdk-oss-3.4.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/annotations-17.0.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/antlr-runtime-3.5.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.6.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/arpack-2.2.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/arrow-format-6.0.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/arrow-memory-core-6.0.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/arrow-memory-netty-6.0.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/arrow-vector-6.0.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/audience-annotations-0.12.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/avro-1.10.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/avro-ipc-1.11.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/avro-mapred-1.11.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/aws-java-sdk-bundle-1.11.901.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/azure-data-lake-store-sdk-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/azure-keyvault-core-1.0.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/azure-storage-7.0.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/blas-2.2.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/breeze-macros_2.12-1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/breeze_2.12-1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/byte-buddy-1.10.13.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/byte-buddy-agent-1.10.13.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/cats-kernel_2.12-2.1.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/chill-java-0.10.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/chill_2.12-0.10.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-cli-1.5.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-codec-1.15.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-compiler-3.0.16.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-compress-1.20.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-crypto-1.1.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-exec-1.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-io-2.11.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-lang3-3.12.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-net-3.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/commons-text-1.8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/cos_api-bundle-5.6.19.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/curator-client-2.13.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/curator-framework-2.13.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/curator-recipes-2.13.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-4.2.4.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/datanucleus-core-4.1.17.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-4.1.19.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/dec-0.1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/derby-10.14.2.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/flatbuffers-java-1.12.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/generex-1.0.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/gmetric4j-1.0.10.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/gson-2.8.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hadoop-aliyun-3.3.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hadoop-annotations-3.3.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hadoop-aws-3.3.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hadoop-azure-3.3.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hadoop-azure-datalake-3.3.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hadoop-client-api-3.3.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hadoop-client-runtime-3.3.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hadoop-cloud-storage-3.3.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hadoop-cos-3.3.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hadoop-openstack-3.3.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hadoop-shaded-guava-1.1.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-3.3.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hamcrest-core-1.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-beeline-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-cli-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-common-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-exec-2.3.9-core.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-jdbc-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-llap-client-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-llap-common-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-metastore-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-serde-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-service-rpc-3.1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-shims-0.23-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-shims-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-shims-common-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-shims-scheduler-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-storage-api-2.7.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hive-vector-code-gen-2.3.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hk2-api-2.6.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hk2-locator-2.6.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/hk2-utils-2.6.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/htmlunit-2.39.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/htmlunit-core-js-2.39.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/htmlunit-cssparser-1.5.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/htmlunit-driver-2.39.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/htrace-core4-4.1.0-incubating.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/httpclient-4.5.13.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/httpcore-4.4.14.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/httpmime-4.5.12.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/istack-commons-runtime-3.0.8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/ivy-2.5.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jackson-annotations-2.13.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jackson-core-2.13.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jackson-databind-2.13.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-cbor-2.13.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.13.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jackson-datatype-jsr310-2.13.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.12-2.13.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jakarta.annotation-api-1.3.5.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jakarta.inject-2.6.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jakarta.servlet-api-4.0.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jakarta.validation-api-2.0.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jakarta.ws.rs-api-2.1.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jakarta.xml.bind-api-2.3.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/janino-3.0.16.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/javassist-3.25.0-GA.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/javax.annotation-api-1.3.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/javax.jdo-3.2.0-m3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jaxb-runtime-2.3.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.30.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jdom-1.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jersey-client-2.34.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jersey-common-2.34.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.34.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.34.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jersey-hk2-2.34.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jersey-server-2.34.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jettison-1.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-client-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-continuation-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-http-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-io-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-jndi-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-plus-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-proxy-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-security-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-server-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-servlet-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-servlets-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-util-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-util-ajax-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-webapp-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jetty-xml-9.4.43.v20210629.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jline-2.14.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/joda-time-2.10.12.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/json-1.8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/json4s-ast_2.12-3.7.0-M11.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/json4s-core_2.12-3.7.0-M11.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/json4s-jackson_2.12-3.7.0-M11.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/json4s-scalap_2.12-3.7.0-M11.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.30.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/junit-4.13.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/junit-interface-0.11.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kryo-shaded-4.0.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-client-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-admissionregistration-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-apiextensions-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-apps-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-autoscaling-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-batch-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-certificates-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-common-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-coordination-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-core-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-discovery-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-events-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-extensions-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-flowcontrol-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-metrics-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-networking-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-node-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-policy-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-rbac-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-scheduling-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/kubernetes-model-storageclass-5.10.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/lapack-2.2.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/libthrift-0.12.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/logging-interceptor-3.12.12.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/lz4-java-1.8.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/macro-compat_2.12-1.1.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/mesos-1.4.3-shaded-protobuf.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/metrics-core-4.2.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/metrics-graphite-4.2.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/metrics-jmx-4.2.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/metrics-json-4.2.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/metrics-jvm-4.2.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/mockito-3-4_2.12-3.2.9.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/mockito-core-3.4.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/neko-htmlunit-2.39.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/netty-all-4.1.68.Final.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/objenesis-2.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/okhttp-3.12.12.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/okio-1.15.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/orc-core-1.7.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.7.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/orc-shims-1.7.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/parquet-column-1.12.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/parquet-common-1.12.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/parquet-encoding-1.12.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/parquet-format-structures-1.12.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.12.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/parquet-jackson-1.12.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/pickle-1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/pmml-model-1.4.8.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/protobuf-java-3.14.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/py4j-0.10.9.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/remotetea-oncrpc-1.1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/rocksdbjni-6.20.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scala-collection-compat_2.12-2.2.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scala-compiler-2.12.15.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scala-library-2.12.15.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.12-1.1.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scala-reflect-2.12.15.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scala-xml_2.12-1.3.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalacheck-1-15_2.12-3.2.9.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalacheck_2.12-1.15.4.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalactic_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-compatible-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-core_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-diagrams_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-featurespec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-flatspec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-freespec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-funspec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-funsuite_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-matchers-core_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-mustmatchers_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-propspec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-refspec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-shouldmatchers_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest-wordspec_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/scalatest_2.12-3.2.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-3-141_2.12-3.2.9.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-api-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-chrome-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-edge-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-firefox-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-ie-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-java-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-opera-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-remote-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-safari-driver-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/selenium-support-3.141.59.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/serializer-2.7.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/shapeless_2.12-2.3.3.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/shims-0.9.22.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/silencer-lib_2.12.15-1.7.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.32.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.30.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/snakeyaml-1.28.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/snappy-java-1.1.8.4.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.3.0-SNAPSHOT-tests.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-assembly_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-catalyst_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-core_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-ganglia-lgpl_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-graphx_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-hadoop-cloud_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-hive_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-kubernetes_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-kvstore_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-launcher_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-mesos_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-mllib-local_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-mllib_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-network-common_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-network-shuffle_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-repl_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-sketch_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-sql_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-streaming_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-tags_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-unsafe_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spark-yarn_2.12-3.3.0-SNAPSHOT.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spire-macros_2.12-0.17.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spire-platform_2.12-0.17.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spire-util_2.12-0.17.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/spire_2.12-0.17.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/stream-2.9.6.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/test-interface-1.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/threeten-extra-1.5.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/tink-1.6.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/transaction-api-1.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/univocity-parsers-2.9.1.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/velocity-1.5.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/websocket-api-9.4.27.v20200227.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/websocket-client-9.4.27.v20200227.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/websocket-common-9.4.27.v20200227.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/wildfly-openssl-1.0.7.Final.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/xalan-2.7.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/xbean-asm9-shaded-4.20.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/xercesImpl-2.12.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/xml-apis-1.4.01.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/xz-1.9.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/zookeeper-3.6.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/zookeeper-jute-3.6.2.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/jars/zstd-jni-1.5.0-4.jar -> pyspark-3.3.0.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-CC0.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-bootstrap.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-copybutton.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-datatables.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-join.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-json-formatter.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-matchMedia-polyfill.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-mustache.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-respond.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/licenses/LICENSE-vis-timeline.txt -> pyspark-3.3.0.dev0/deps/licenses
copying deps/sbin/spark-config.sh -> pyspark-3.3.0.dev0/deps/sbin
copying deps/sbin/spark-daemon.sh -> pyspark-3.3.0.dev0/deps/sbin
copying deps/sbin/start-history-server.sh -> pyspark-3.3.0.dev0/deps/sbin
copying deps/sbin/stop-history-server.sh -> pyspark-3.3.0.dev0/deps/sbin
copying lib/py4j-0.10.9.2-src.zip -> pyspark-3.3.0.dev0/lib
copying lib/pyspark.zip -> pyspark-3.3.0.dev0/lib
copying pyspark/__init__.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/__init__.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/_globals.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/_typing.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/accumulators.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/broadcast.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/conf.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/context.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/context.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/daemon.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/files.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/install.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/join.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/profiler.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/profiler.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/py.typed -> pyspark-3.3.0.dev0/pyspark
copying pyspark/rdd.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/rdd.pyi -> pyspark-3.3.0.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/serializers.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/shell.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/status.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/util.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/version.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark/worker.py -> pyspark-3.3.0.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-3.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-3.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-3.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-3.3.0.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-3.3.0.dev0/pyspark.egg-info
copying pyspark/cloudpickle/__init__.py -> pyspark-3.3.0.dev0/pyspark/cloudpickle
copying pyspark/cloudpickle/cloudpickle.py -> pyspark-3.3.0.dev0/pyspark/cloudpickle
copying pyspark/cloudpickle/cloudpickle_fast.py -> pyspark-3.3.0.dev0/pyspark/cloudpickle
copying pyspark/cloudpickle/compat.py -> pyspark-3.3.0.dev0/pyspark/cloudpickle
copying pyspark/ml/__init__.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/_typing.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/base.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/classification.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/clustering.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/evaluation.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/feature.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/fpm.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/functions.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/functions.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/image.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/pipeline.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/recommendation.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/regression.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/stat.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/tree.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/tree.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/tuning.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/util.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/wrapper.pyi -> pyspark-3.3.0.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-3.3.0.dev0/pyspark/ml/linalg
copying pyspark/ml/linalg/__init__.pyi -> pyspark-3.3.0.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-3.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/__init__.pyi -> pyspark-3.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-3.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.pyi -> pyspark-3.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-3.3.0.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.pyi -> pyspark-3.3.0.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/_typing.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/classification.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/clustering.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/feature.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/fpm.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/random.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/regression.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/tree.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/util.pyi -> pyspark-3.3.0.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-3.3.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/__init__.pyi -> pyspark-3.3.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-3.3.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.pyi -> pyspark-3.3.0.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-3.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-3.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-3.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.pyi -> pyspark-3.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-3.3.0.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-3.3.0.dev0/pyspark/mllib/stat
copying pyspark/pandas/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/_typing.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/accessors.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/base.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/categorical.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/config.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/datetimes.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/exceptions.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/extensions.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/frame.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/generic.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/groupby.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/indexing.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/internal.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/ml.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/mlflow.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/namespace.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/numpy_compat.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/series.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/sql_formatter.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/sql_processor.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/strings.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/utils.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/window.py -> pyspark-3.3.0.dev0/pyspark/pandas
copying pyspark/pandas/data_type_ops/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/base.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/binary_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/boolean_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/categorical_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/complex_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/date_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/datetime_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/null_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/num_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/string_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/data_type_ops/udt_ops.py -> pyspark-3.3.0.dev0/pyspark/pandas/data_type_ops
copying pyspark/pandas/indexes/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/indexes
copying pyspark/pandas/indexes/base.py -> pyspark-3.3.0.dev0/pyspark/pandas/indexes
copying pyspark/pandas/indexes/category.py -> pyspark-3.3.0.dev0/pyspark/pandas/indexes
copying pyspark/pandas/indexes/datetimes.py -> pyspark-3.3.0.dev0/pyspark/pandas/indexes
copying pyspark/pandas/indexes/multi.py -> pyspark-3.3.0.dev0/pyspark/pandas/indexes
copying pyspark/pandas/indexes/numeric.py -> pyspark-3.3.0.dev0/pyspark/pandas/indexes
copying pyspark/pandas/missing/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/missing/common.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/missing/frame.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/missing/groupby.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/missing/indexes.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/missing/series.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/missing/window.py -> pyspark-3.3.0.dev0/pyspark/pandas/missing
copying pyspark/pandas/plot/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/plot
copying pyspark/pandas/plot/core.py -> pyspark-3.3.0.dev0/pyspark/pandas/plot
copying pyspark/pandas/plot/matplotlib.py -> pyspark-3.3.0.dev0/pyspark/pandas/plot
copying pyspark/pandas/plot/plotly.py -> pyspark-3.3.0.dev0/pyspark/pandas/plot
copying pyspark/pandas/spark/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/spark
copying pyspark/pandas/spark/accessors.py -> pyspark-3.3.0.dev0/pyspark/pandas/spark
copying pyspark/pandas/spark/functions.py -> pyspark-3.3.0.dev0/pyspark/pandas/spark
copying pyspark/pandas/spark/utils.py -> pyspark-3.3.0.dev0/pyspark/pandas/spark
copying pyspark/pandas/typedef/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/typedef
copying pyspark/pandas/typedef/string_typehints.py -> pyspark-3.3.0.dev0/pyspark/pandas/typedef
copying pyspark/pandas/typedef/typehints.py -> pyspark-3.3.0.dev0/pyspark/pandas/typedef
copying pyspark/pandas/usage_logging/__init__.py -> pyspark-3.3.0.dev0/pyspark/pandas/usage_logging
copying pyspark/pandas/usage_logging/usage_logger.py -> pyspark-3.3.0.dev0/pyspark/pandas/usage_logging
copying pyspark/python/pyspark/shell.py -> pyspark-3.3.0.dev0/pyspark/python/pyspark
copying pyspark/resource/__init__.py -> pyspark-3.3.0.dev0/pyspark/resource
copying pyspark/resource/information.py -> pyspark-3.3.0.dev0/pyspark/resource
copying pyspark/resource/profile.py -> pyspark-3.3.0.dev0/pyspark/resource
copying pyspark/resource/requests.py -> pyspark-3.3.0.dev0/pyspark/resource
copying pyspark/sql/__init__.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/_typing.pyi -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/observation.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-3.3.0.dev0/pyspark/sql
copying pyspark/sql/avro/__init__.py -> pyspark-3.3.0.dev0/pyspark/sql/avro
copying pyspark/sql/avro/functions.py -> pyspark-3.3.0.dev0/pyspark/sql/avro
copying pyspark/sql/pandas/__init__.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/conversion.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/functions.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/functions.pyi -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/group_ops.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/map_ops.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/serializers.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/typehints.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/types.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/utils.py -> pyspark-3.3.0.dev0/pyspark/sql/pandas
copying pyspark/sql/pandas/_typing/__init__.pyi -> pyspark-3.3.0.dev0/pyspark/sql/pandas/_typing
copying pyspark/sql/pandas/_typing/protocols/__init__.pyi -> pyspark-3.3.0.dev0/pyspark/sql/pandas/_typing/protocols
copying pyspark/sql/pandas/_typing/protocols/frame.pyi -> pyspark-3.3.0.dev0/pyspark/sql/pandas/_typing/protocols
copying pyspark/sql/pandas/_typing/protocols/series.pyi -> pyspark-3.3.0.dev0/pyspark/sql/pandas/_typing/protocols
copying pyspark/streaming/__init__.py -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/context.pyi -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/dstream.pyi -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-3.3.0.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-3.3.0.dev0/pyspark/streaming
Writing pyspark-3.3.0.dev0/setup.cfg
Creating tar archive
removing 'pyspark-3.3.0.dev0' (and everything under it)
Installing dist into virtual env
Obtaining file:///home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/python
Collecting py4j==0.10.9.2
  Downloading py4j-0.10.9.2-py2.py3-none-any.whl (198 kB)
Installing collected packages: py4j, pyspark
  Running setup.py develop for pyspark
Successfully installed py4j-0.10.9.2 pyspark-3.3.0.dev0
Run basic sanity check on pip installed version with spark-submit
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
	at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
	at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
	at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:222)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:127)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:75)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:83)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Successfully ran pip sanity check
Run basic sanity check with import based
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
	at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
	at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
	at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:222)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:127)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:75)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:83)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
[Stage 0:>                                                        (0 + 10) / 10]
Successfully ran pip sanity check
Run the tests for context.py
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
	at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
	at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
	at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:222)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:127)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:75)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:83)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                          (0 + 4) / 4]
[Stage 10:>                                                         (0 + 4) / 4]
[Stage 10:>                 (0 + 4) / 4][Stage 11:>                 (0 + 0) / 2]
DeprecationWarning: 'source deactivate' is deprecated. Use 'conda deactivate'.
Cleaning up temporary directory - /tmp/tmp.PVpg6xqDGH

========================================================================
Running SparkR tests
========================================================================

Attaching package: ‘SparkR’

The following objects are masked from ‘package:testthat’:

    describe, not

The following objects are masked from ‘package:stats’:

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from ‘package:base’:

    as.data.frame, colnames, colnames<-, drop, endsWith, intersect,
    rank, rbind, sample, startsWith, subset, summary, transform, union

Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2
[ FAIL 0 | WARN 0 | SKIP 0 | PASS 13 ]
binary_function: 
binary functions: ...........
binaryFile: 
functions on binary files: ....
broadcast: 
broadcast variables: ..
client: 
functions in client.R: .....
context: 
test functions in sparkR.R: ..............................................
includePackage: 
include R packages: ..
jvm_api: 
JVM API: ..
mllib_classification: 
MLlib classification algorithms, except for tree-based algorithms: ...........................................................................
mllib_clustering: 
MLlib clustering algorithms: ......................................................................
mllib_fpm: 
MLlib frequent pattern mining: ......
mllib_recommendation: 
MLlib recommendation algorithms: ........
mllib_regression: 
MLlib regression algorithms, except for tree-based algorithms: ........................................................................................................................................
mllib_stat: 
MLlib statistics algorithms: ........
mllib_tree: 
MLlib tree-based algorithms: ..............................................................................................
parallelize_collect: 
parallelize() and collect(): .............................
rdd: 
basic RDD functions: ............................................................................................................................................................................................................................................................................................................................................................................................................................................
Serde: 
SerDe functionality: .......................................
shuffle: 
partitionBy, groupByKey, reduceByKey etc.: ....................
sparkR: 
functions in sparkR.R: ....
sparkSQL_arrow: 
SparkSQL Arrow optimization: SSSSSSSSSSS
sparkSQL_eager: 
test show SparkDataFrame when eager execution is enabled.: ......
sparkSQL: 
SparkSQL functions: .......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
streaming: 
Structured Streaming: ..........................................
take: 
tests RDD function take(): ................
textFile: 
the textFile() function: ..............
utils: 
functions in utils.R: ..............................................
Windows: 
Windows-specific tests: S

══ Skipped ═════════════════════════════════════════════════════════════════════
1. createDataFrame/collect Arrow optimization (test_sparkSQL_arrow.R:28:3) - Reason: arrow cannot be loaded

2. createDataFrame/collect Arrow optimization - many partitions (partition order test) (test_sparkSQL_arrow.R:45:3) - Reason: arrow cannot be loaded

3. createDataFrame/collect Arrow optimization - type specification (test_sparkSQL_arrow.R:51:3) - Reason: arrow cannot be loaded

4. dapply() Arrow optimization (test_sparkSQL_arrow.R:75:3) - Reason: arrow cannot be loaded

5. dapply() Arrow optimization - type specification (test_sparkSQL_arrow.R:109:3) - Reason: arrow cannot be loaded

6. dapply() Arrow optimization - type specification (date and timestamp) (test_sparkSQL_arrow.R:138:3) - Reason: arrow cannot be loaded

7. gapply() Arrow optimization (test_sparkSQL_arrow.R:147:3) - Reason: arrow cannot be loaded

8. gapply() Arrow optimization - type specification (test_sparkSQL_arrow.R:190:3) - Reason: arrow cannot be loaded

9. gapply() Arrow optimization - type specification (date and timestamp) (test_sparkSQL_arrow.R:222:3) - Reason: arrow cannot be loaded

10. Arrow optimization - unsupported types (test_sparkSQL_arrow.R:233:3) - Reason: arrow cannot be loaded

11. SPARK-32478: gapply() Arrow optimization - error message for schema mismatch (test_sparkSQL_arrow.R:244:3) - Reason: arrow cannot be loaded

12. sparkJars tag in SparkContext (test_Windows.R:22:5) - Reason: This test is only for Windows, skipped

══ DONE ════════════════════════════════════════════════════════════════════════
Using R_SCRIPT_PATH = /usr/bin
++++ dirname /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/install-dev.sh
+++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R
+++ pwd
++ FWDIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R
++ LIB_DIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/lib
++ mkdir -p /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/lib
++ pushd /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R
++ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/find-r.sh
+++ '[' -z /usr/bin ']'
++ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/create-rd.sh
+++ set -o pipefail
+++ set -e
+++++ dirname /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/create-rd.sh
++++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R
++++ pwd
+++ FWDIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R
+++ pushd /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R
+++ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/find-r.sh
++++ '[' -z /usr/bin ']'
+++ /usr/bin/Rscript -e ' if(requireNamespace("devtools", quietly=TRUE)) { setwd("/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R"); devtools::document(pkg="./pkg", roclets="rd") }'
Updating SparkR documentation
ℹ Loading SparkR
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘endsWith’ in package ‘SparkR’
Creating a new generic function for ‘startsWith’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
++ /usr/bin/R CMD INSTALL --library=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/lib /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/pkg/
* installing *source* package ‘SparkR’ ...
** using staged installation
** R
** inst
** byte-compile and prepare package for lazy loading
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘endsWith’ in package ‘SparkR’
Creating a new generic function for ‘startsWith’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation path
* DONE (SparkR)
++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/lib
++ jar cfM /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/lib/sparkr.zip SparkR
++ popd
++ cd /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/..
++ pwd
+ SPARK_HOME=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2
+ . /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/bin/load-spark-env.sh
++ '[' -z /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2 ']'
++ SPARK_ENV_SH=spark-env.sh
++ '[' -z '' ']'
++ export SPARK_ENV_LOADED=1
++ SPARK_ENV_LOADED=1
++ export SPARK_CONF_DIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/conf
++ SPARK_CONF_DIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/conf
++ SPARK_ENV_SH=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/conf/spark-env.sh
++ [[ -f /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/conf/spark-env.sh ]]
++ '[' -z '' ']'
++ SCALA_VERSION_1=2.13
++ SCALA_VERSION_2=2.12
++ ASSEMBLY_DIR_1=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/assembly/target/scala-2.13
++ ASSEMBLY_DIR_2=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/assembly/target/scala-2.12
++ ENV_VARIABLE_DOC=https://spark.apache.org/docs/latest/configuration.html#environment-variables
++ [[ -d /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/assembly/target/scala-2.13 ]]
++ [[ -d /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/assembly/target/scala-2.13 ]]
++ export SPARK_SCALA_VERSION=2.12
++ SPARK_SCALA_VERSION=2.12
+ '[' -f /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/RELEASE ']'
+ SPARK_JARS_DIR=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/assembly/target/scala-2.12/jars
+ '[' -d /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/assembly/target/scala-2.12/jars ']'
+ SPARK_HOME=/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2
+ /usr/bin/R CMD build /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/pkg
* checking for file ‘/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/pkg/DESCRIPTION’ ... OK
* preparing ‘SparkR’:
* checking DESCRIPTION meta-information ... OK
* installing the package to build vignettes
* creating vignettes ... OK
* checking for LF line-endings in source and make files and shell scripts
* checking for empty or unneeded directories
* building ‘SparkR_3.3.0.tar.gz’

+ find pkg/vignettes/. -not -name . -not -name '*.Rmd' -not -name '*.md' -not -name '*.pdf' -not -name '*.html' -delete
++ grep Version /home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/pkg/DESCRIPTION
++ awk '{print $NF}'
+ VERSION=3.3.0
+ CRAN_CHECK_OPTIONS=--as-cran
+ '[' -n 1 ']'
+ CRAN_CHECK_OPTIONS='--as-cran --no-tests'
+ '[' -n 1 ']'
+ CRAN_CHECK_OPTIONS='--as-cran --no-tests --no-manual --no-vignettes'
+ echo 'Running CRAN check with --as-cran --no-tests --no-manual --no-vignettes options'
Running CRAN check with --as-cran --no-tests --no-manual --no-vignettes options
+ export _R_CHECK_FORCE_SUGGESTS_=FALSE
+ _R_CHECK_FORCE_SUGGESTS_=FALSE
+ '[' -n 1 ']'
+ '[' -n 1 ']'
+ /usr/bin/R CMD check --as-cran --no-tests --no-manual --no-vignettes SparkR_3.3.0.tar.gz
* using log directory ‘/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/SparkR.Rcheck’
* using R version 3.6.3 (2020-02-29)
* using platform: x86_64-pc-linux-gnu (64-bit)
* using session charset: UTF-8
* using options ‘--no-tests --no-manual --no-vignettes --as-cran’
* checking for file ‘SparkR/DESCRIPTION’ ... OK
* checking extension type ... Package
* this is package ‘SparkR’ version ‘3.3.0’
* package encoding: UTF-8
* checking CRAN incoming feasibility ... NOTE
Maintainer: ‘Felix Cheung <felixcheung@apache.org>’

New submission

Package was archived on CRAN

CRAN repository db overrides:
  X-CRAN-Comment: Archived on 2021-06-28 as issues were not corrected
    in time.

  Should use tools::R_user_dir().
* checking package namespace information ... OK
* checking package dependencies ... NOTE
Package suggested but not available for checking: ‘arrow’
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for executable files ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... OK
* checking for sufficient/correct file permissions ... OK
* checking whether package ‘SparkR’ can be installed ... OK
* checking installed package size ... OK
* checking package directory ... OK
* checking for future file timestamps ... OK
* checking ‘build’ directory ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking R files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... OK
* checking whether the package can be loaded with stated dependencies ... OK
* checking whether the package can be unloaded cleanly ... OK
* checking whether the namespace can be loaded with stated dependencies ... OK
* checking whether the namespace can be unloaded cleanly ... OK
* checking loading without being on the library search path ... OK
* checking use of S3 registration ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... OK
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd line widths ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking installed files from ‘inst/doc’ ... OK
* checking files in ‘vignettes’ ... OK
* checking examples ... OK
* checking for unstated dependencies in ‘tests’ ... OK
* checking tests ... SKIPPED
* checking for unstated dependencies in vignettes ... OK
* checking package vignettes in ‘inst/doc’ ... OK
* checking running R code from vignettes ... SKIPPED
* checking re-building of vignette outputs ... SKIPPED
* checking for detritus in the temp directory ... OK
* DONE

Status: 2 NOTEs
See
  ‘/home/jenkins/workspace/spark-master-test-sbt-hadoop-3.2/R/SparkR.Rcheck/00check.log’
for details.


+ popd
Tests passed.
Archiving artifacts
Recording test results
[Checks API] No suitable checks publisher found.
Finished: SUCCESS