Console Output

[24,237 KB of earlier console output skipped]
Finished test(python3.6): pyspark.sql.dataframe (45s)
Starting test(python3.6): pyspark.sql.session
Finished test(python3.6): pyspark.sql.functions (48s)
Starting test(python3.6): pyspark.sql.streaming
Finished test(python3.6): pyspark.sql.readwriter (21s)
Starting test(python3.6): pyspark.sql.types
Finished test(python3.6): pyspark.sql.session (18s)
Starting test(python3.6): pyspark.sql.udf
Finished test(python3.6): pyspark.ml.tests (220s)
Starting test(python3.6): pyspark.sql.window
Finished test(python3.6): pyspark.sql.types (7s)
Starting test(python3.6): pyspark.streaming.util
Finished test(python3.6): pyspark.sql.window (5s)
Starting test(python3.6): pyspark.test_broadcast
Finished test(python3.6): pyspark.streaming.util (0s)
Starting test(python3.6): pyspark.test_serializers
Finished test(python3.6): pyspark.test_serializers (0s)
Starting test(python3.6): pyspark.util
Finished test(python3.6): pyspark.util (0s)
Finished test(python3.6): pyspark.sql.streaming (14s)
Finished test(python3.6): pyspark.sql.udf (16s)
Finished test(python3.6): pyspark.test_broadcast (25s)
Tests passed in 1355 seconds

Skipped tests in pyspark.sql.tests with python2.7:
    test_unbounded_frames (pyspark.sql.tests.HiveContextSQLTests) ... skipped "Unittest < 3.3 doesn't support mocking"
    test_create_dataframe_required_pandas_not_found (pyspark.sql.tests.SQLTests) ... skipped 'Required Pandas was found.'
    test_to_pandas_required_pandas_not_found (pyspark.sql.tests.SQLTests) ... skipped 'Required Pandas was found.'
    test_type_annotation (pyspark.sql.tests.ScalarPandasUDFTests) ... skipped 'Type hints are supported from Python 3.5.'

========================================================================
Running PySpark packaging tests
========================================================================
Constructing virtual env for testing
Using conda virtual environments
Testing pip installation with python 3.5
Using /tmp/tmp.MhcJSkguAC for virtualenv
Collecting package metadata (current_repodata.json): ...working... done
Solving environment: ...working... failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done


==> WARNING: A newer version of conda exists. <==
  current version: 4.7.12
  latest version: 4.10.1

Please update conda by running

    $ conda update -n base -c defaults conda



## Package Plan ##

  environment location: /tmp/tmp.MhcJSkguAC/3.5

  added / updated specs:
    - numpy
    - pandas
    - pip
    - python=3.5
    - setuptools


The following NEW packages will be INSTALLED:

  _libgcc_mutex      pkgs/main/linux-64::_libgcc_mutex-0.1-main
  blas               pkgs/main/linux-64::blas-1.0-mkl
  ca-certificates    pkgs/main/linux-64::ca-certificates-2021.4.13-h06a4308_1
  certifi            pkgs/main/noarch::certifi-2020.6.20-pyhd3eb1b0_3
  intel-openmp       pkgs/main/linux-64::intel-openmp-2021.2.0-h06a4308_610
  libedit            pkgs/main/linux-64::libedit-3.1.20210216-h27cfd23_1
  libffi             pkgs/main/linux-64::libffi-3.2.1-hf484d3e_1007
  libgcc-ng          pkgs/main/linux-64::libgcc-ng-9.1.0-hdf63c60_0
  libgfortran-ng     pkgs/main/linux-64::libgfortran-ng-7.3.0-hdf63c60_0
  libstdcxx-ng       pkgs/main/linux-64::libstdcxx-ng-9.1.0-hdf63c60_0
  mkl                pkgs/main/linux-64::mkl-2018.0.3-1
  mkl_fft            pkgs/main/linux-64::mkl_fft-1.0.6-py35h7dd41cf_0
  mkl_random         pkgs/main/linux-64::mkl_random-1.0.1-py35h4414c95_1
  ncurses            pkgs/main/linux-64::ncurses-6.2-he6710b0_1
  numpy              pkgs/main/linux-64::numpy-1.15.2-py35h1d66e8a_0
  numpy-base         pkgs/main/linux-64::numpy-base-1.15.2-py35h81de0dd_0
  openssl            pkgs/main/linux-64::openssl-1.0.2u-h7b6447c_0
  pandas             pkgs/main/linux-64::pandas-0.23.4-py35h04863e7_0
  pip                pkgs/main/linux-64::pip-10.0.1-py35_0
  python             pkgs/main/linux-64::python-3.5.6-hc3d631a_0
  python-dateutil    pkgs/main/noarch::python-dateutil-2.8.1-pyhd3eb1b0_0
  pytz               pkgs/main/noarch::pytz-2021.1-pyhd3eb1b0_0
  readline           pkgs/main/linux-64::readline-7.0-h7b6447c_5
  setuptools         pkgs/main/linux-64::setuptools-40.2.0-py35_0
  six                pkgs/main/noarch::six-1.15.0-pyhd3eb1b0_0
  sqlite             pkgs/main/linux-64::sqlite-3.33.0-h62c20be_0
  tbb                pkgs/main/linux-64::tbb-2021.2.0-hff7bd54_0
  tbb4py             pkgs/main/linux-64::tbb4py-2018.0.5-py35h6bb024c_0
  tk                 pkgs/main/linux-64::tk-8.6.10-hbc83047_0
  wheel              pkgs/main/noarch::wheel-0.36.2-pyhd3eb1b0_0
  xz                 pkgs/main/linux-64::xz-5.2.5-h7b6447c_0
  zlib               pkgs/main/linux-64::zlib-1.2.11-h7b6447c_3


Preparing transaction: ...working... done
Verifying transaction: ...working... done
Executing transaction: ...working... done
#
# To activate this environment, use
#
#     $ conda activate /tmp/tmp.MhcJSkguAC/3.5
#
# To deactivate an active environment, use
#
#     $ conda deactivate

Creating pip installable source dist
Could not import pypandoc - required to package PySpark
zip_safe flag not set; analyzing archive contents...
pypandoc.__pycache__.__init__.cpython-35: module references __file__

Installed /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/python/.eggs/pypandoc-1.5-py3.5.egg
running sdist
running egg_info
creating pyspark.egg-info
writing pyspark.egg-info/PKG-INFO
writing requirements to pyspark.egg-info/requires.txt
writing dependency_links to pyspark.egg-info/dependency_links.txt
writing top-level names to pyspark.egg-info/top_level.txt
writing manifest file 'pyspark.egg-info/SOURCES.txt'
package init file 'deps/bin/__init__.py' not found (or not a regular file)
package init file 'deps/jars/__init__.py' not found (or not a regular file)
package init file 'pyspark/python/pyspark/__init__.py' not found (or not a regular file)
package init file 'lib/__init__.py' not found (or not a regular file)
package init file 'deps/data/__init__.py' not found (or not a regular file)
package init file 'deps/licenses/__init__.py' not found (or not a regular file)
package init file 'deps/examples/__init__.py' not found (or not a regular file)
reading manifest file 'pyspark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '.DS_Store' found anywhere in distribution
writing manifest file 'pyspark.egg-info/SOURCES.txt'
running check
creating pyspark-2.4.9.dev0
creating pyspark-2.4.9.dev0/deps
creating pyspark-2.4.9.dev0/deps/bin
creating pyspark-2.4.9.dev0/deps/data
creating pyspark-2.4.9.dev0/deps/data/graphx
creating pyspark-2.4.9.dev0/deps/data/mllib
creating pyspark-2.4.9.dev0/deps/data/mllib/als
creating pyspark-2.4.9.dev0/deps/data/mllib/images
creating pyspark-2.4.9.dev0/deps/data/mllib/images/origin
creating pyspark-2.4.9.dev0/deps/data/mllib/images/origin/kittens
creating pyspark-2.4.9.dev0/deps/data/mllib/images/partitioned
creating pyspark-2.4.9.dev0/deps/data/mllib/images/partitioned/cls=kittens
creating pyspark-2.4.9.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
creating pyspark-2.4.9.dev0/deps/data/mllib/ridge-data
creating pyspark-2.4.9.dev0/deps/data/streaming
creating pyspark-2.4.9.dev0/deps/examples
creating pyspark-2.4.9.dev0/deps/examples/ml
creating pyspark-2.4.9.dev0/deps/examples/mllib
creating pyspark-2.4.9.dev0/deps/examples/sql
creating pyspark-2.4.9.dev0/deps/examples/sql/streaming
creating pyspark-2.4.9.dev0/deps/examples/streaming
creating pyspark-2.4.9.dev0/deps/jars
creating pyspark-2.4.9.dev0/deps/licenses
creating pyspark-2.4.9.dev0/lib
creating pyspark-2.4.9.dev0/pyspark
creating pyspark-2.4.9.dev0/pyspark.egg-info
creating pyspark-2.4.9.dev0/pyspark/ml
creating pyspark-2.4.9.dev0/pyspark/ml/linalg
creating pyspark-2.4.9.dev0/pyspark/ml/param
creating pyspark-2.4.9.dev0/pyspark/mllib
creating pyspark-2.4.9.dev0/pyspark/mllib/linalg
creating pyspark-2.4.9.dev0/pyspark/mllib/stat
creating pyspark-2.4.9.dev0/pyspark/python
creating pyspark-2.4.9.dev0/pyspark/python/pyspark
creating pyspark-2.4.9.dev0/pyspark/sql
creating pyspark-2.4.9.dev0/pyspark/streaming
copying files to pyspark-2.4.9.dev0...
copying MANIFEST.in -> pyspark-2.4.9.dev0
copying README.md -> pyspark-2.4.9.dev0
copying setup.cfg -> pyspark-2.4.9.dev0
copying setup.py -> pyspark-2.4.9.dev0
copying deps/bin/beeline -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/beeline.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/docker-image-tool.sh -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/find-spark-home -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/find-spark-home.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/load-spark-env.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/load-spark-env.sh -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/pyspark -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/pyspark.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/pyspark2.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/run-example -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/run-example.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-class -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-class.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-class2.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-shell -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-shell.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-shell2.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-sql -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-sql.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-sql2.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-submit -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-submit.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-submit2.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/sparkR -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/sparkR.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/sparkR2.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/data/graphx/followers.txt -> pyspark-2.4.9.dev0/deps/data/graphx
copying deps/data/graphx/users.txt -> pyspark-2.4.9.dev0/deps/data/graphx
copying deps/data/mllib/gmm_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/iris_libsvm.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/kmeans_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/pagerank_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/pic_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_binary_classification_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_fpgrowth.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_isotonic_regression_libsvm_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_kmeans_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_libsvm_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_libsvm_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_linear_regression_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_movielens_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_multiclass_classification_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_svm_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/streaming_kmeans_data_test.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/als/sample_movielens_ratings.txt -> pyspark-2.4.9.dev0/deps/data/mllib/als
copying deps/data/mllib/als/test.data -> pyspark-2.4.9.dev0/deps/data/mllib/als
copying deps/data/mllib/images/license.txt -> pyspark-2.4.9.dev0/deps/data/mllib/images
copying deps/data/mllib/images/origin/license.txt -> pyspark-2.4.9.dev0/deps/data/mllib/images/origin
copying deps/data/mllib/images/origin/kittens/not-image.txt -> pyspark-2.4.9.dev0/deps/data/mllib/images/origin/kittens
copying deps/data/mllib/images/partitioned/cls=kittens/date=2018-01/not-image.txt -> pyspark-2.4.9.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
copying deps/data/mllib/ridge-data/lpsa.data -> pyspark-2.4.9.dev0/deps/data/mllib/ridge-data
copying deps/data/streaming/AFINN-111.txt -> pyspark-2.4.9.dev0/deps/data/streaming
copying deps/examples/als.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/avro_inputformat.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/kmeans.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/logistic_regression.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/pagerank.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/parquet_inputformat.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/pi.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/sort.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/status_api_demo.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/transitive_closure.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/wordcount.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/ml/aft_survival_regression.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/als_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/binarizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/bisecting_k_means_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/bucketed_random_projection_lsh_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/bucketizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/chi_square_test_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/chisq_selector_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/correlation_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/count_vectorizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/cross_validator.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/dataframe_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/dct_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_classification_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/elementwise_product_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/estimator_transformer_param_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/feature_hasher_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/fpgrowth_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/gaussian_mixture_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/generalized_linear_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_classifier_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_regressor_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/imputer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/index_to_string_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/isotonic_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/kmeans_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/lda_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/linearsvc.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/min_max_scaler_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/n_gram_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/naive_bayes_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/normalizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/one_vs_rest_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/onehot_encoder_estimator_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/pca_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/pipeline_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/prefixspan_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/rformula_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/sql_transformer.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/standard_scaler_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/stopwords_remover_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/string_indexer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/summarizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/tf_idf_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/tokenizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/train_validation_split.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/vector_assembler_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/vector_indexer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/vector_size_hint_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/sql/arrow.py -> pyspark-2.4.9.dev0/deps/examples/sql
copying deps/examples/sql/basic.py -> pyspark-2.4.9.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-2.4.9.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-2.4.9.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-2.4.9.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/direct_kafka_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/flume_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/kafka_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/jars/JavaEWAH-0.3.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.7.45.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/aircompressor-0.10.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/antlr-2.7.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/antlr-runtime-3.4.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/aopalliance-1.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.4.0-b34.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/apache-log4j-extras-1.2.17.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/apacheds-i18n-2.0.0-M15.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/apacheds-kerberos-codec-2.0.0-M15.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/api-asn1-api-1.0.0-M20.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/api-util-1.0.0-M20.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/arrow-format-0.10.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/arrow-memory-0.10.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/arrow-vector-0.10.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/avro-1.8.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/avro-ipc-1.8.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/avro-mapred-1.8.2-hadoop2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/breeze-macros_2.11-0.13.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/breeze_2.11-0.13.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/calcite-avatica-1.2.0-incubating.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/calcite-core-1.2.0-incubating.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/calcite-linq4j-1.2.0-incubating.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/cglib-2.2.1-v20090111.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/chill-java-0.9.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/chill_2.11-0.9.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-beanutils-1.7.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-cli-1.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-codec-1.10.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-compiler-3.0.16.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-compress-1.8.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-configuration-1.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-crypto-1.0.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-digester-1.8.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-httpclient-3.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-io-2.4.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-lang3-3.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-net-3.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/curator-client-2.7.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/curator-framework-2.7.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/curator-recipes-2.7.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-3.2.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/datanucleus-core-3.2.10.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-3.2.9.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/derby-10.12.1.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/eigenbase-properties-1.1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/flatbuffers-1.2.0-3f79e055.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/generex-1.0.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/gmetric4j-1.0.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/gson-2.2.4.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/guice-3.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-annotations-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-auth-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-client-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-common-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-hdfs-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-app-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-common-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-core-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-jobclient-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-shuffle-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-yarn-api-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-yarn-client-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-yarn-common-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-common-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hive-beeline-1.2.1.spark2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hive-cli-1.2.1.spark2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hive-exec-1.2.1.spark2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hive-jdbc-1.2.1.spark2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hive-metastore-1.2.1.spark2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hk2-api-2.4.0-b34.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hk2-locator-2.4.0-b34.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hk2-utils-2.4.0-b34.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hppc-0.7.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/htrace-core-3.1.0-incubating.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/httpclient-4.5.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/httpcore-4.4.10.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/ivy-2.4.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-annotations-2.6.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-core-2.7.9.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-databind-2.6.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.6.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-jaxrs-1.9.13.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-module-jaxb-annotations-2.9.9.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-module-paranamer-2.7.9.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.11-2.6.7.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-xc-1.9.13.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/janino-3.0.16.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javassist-3.18.1-GA.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javax.annotation-api-1.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javax.inject-1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javax.inject-2.4.0-b34.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javax.servlet-api-3.1.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javax.ws.rs-api-2.0.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jaxb-api-2.2.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.16.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-client-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-common-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-guava-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-media-jaxb-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-server-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jettison-1.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-6.1.26.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-client-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-continuation-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-http-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-io-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-jndi-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-plus-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-proxy-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-security-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-server-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-servlet-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-servlets-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-util-6.1.26.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-util-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-util-ajax-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-webapp-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-xml-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jline-2.14.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/joda-time-2.9.9.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/json4s-ast_2.11-3.5.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/json4s-core_2.11-3.5.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/json4s-jackson_2.11-3.5.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/json4s-scalap_2.11-3.5.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jsp-api-2.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jtransforms-2.4.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.26.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/kryo-shaded-4.0.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/kubernetes-client-4.6.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/kubernetes-model-4.6.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/kubernetes-model-common-4.6.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/libthrift-0.9.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/logging-interceptor-3.12.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/lz4-java-1.4.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/machinist_2.11-0.6.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/macro-compat_2.11-1.1.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/mesos-1.4.0-shaded-protobuf.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/metrics-core-3.1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/metrics-ganglia-3.1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/metrics-graphite-3.1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/metrics-json-3.1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/metrics-jvm-3.1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/netty-3.9.9.Final.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/netty-all-4.1.47.Final.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/objenesis-2.5.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/okhttp-3.12.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/okio-1.15.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/oncrpc-1.0.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/orc-core-1.5.5-nohive.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.5.5-nohive.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/orc-shims-1.5.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-column-1.10.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-common-1.10.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-encoding-1.10.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-format-2.4.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.10.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-hadoop-bundle-1.6.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-jackson-1.10.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/pmml-model-1.2.15.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/pmml-schema-1.2.15.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/protobuf-java-2.5.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/py4j-0.10.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/pyrolite-4.13.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/scala-compiler-2.11.12.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/scala-library-2.11.12.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.11-1.1.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/scala-reflect-2.11.12.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/scala-xml_2.11-1.0.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/shapeless_2.11-2.3.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/shims-0.7.45.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.26.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.16.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/snakeyaml-1.15.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/snappy-0.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/snappy-java-1.1.8.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-assembly_2.11-2.4.9-SNAPSHOT-tests.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-assembly_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-catalyst_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-core_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-ganglia-lgpl_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-graphx_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-hive_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-kubernetes_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-kvstore_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-launcher_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-mesos_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-mllib-local_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-mllib_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-network-common_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-network-shuffle_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-repl_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-sketch_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-sql_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-streaming_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-tags_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-unsafe_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-yarn_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spire-macros_2.11-0.13.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spire_2.11-0.13.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/stax-api-1.0-2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/stream-2.7.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/stringtemplate-3.2.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/univocity-parsers-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/validation-api-1.1.0.Final.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/xbean-asm6-shaded-4.8.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/xercesImpl-2.9.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/xml-apis-1.3.04.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/xmlenc-0.52.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/xz-1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/zookeeper-3.4.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/zstd-jni-1.4.4-3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-CC0.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-bootstrap.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-datatables.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-heapq.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-join.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-json-formatter.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-matchMedia-polyfill.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-mustache.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-respond.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-vis-timeline.txt -> pyspark-2.4.9.dev0/deps/licenses
copying lib/py4j-0.10.7-src.zip -> pyspark-2.4.9.dev0/lib
copying lib/pyspark.zip -> pyspark-2.4.9.dev0/lib
copying pyspark/__init__.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/_globals.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/cloudpickle.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/conf.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/context.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/daemon.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/files.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/heapq3.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/join.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/profiler.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/rdd.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/serializers.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/shell.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/status.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/test_broadcast.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/test_serializers.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/tests.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/util.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/version.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/worker.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-2.4.9.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-2.4.9.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-2.4.9.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-2.4.9.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-2.4.9.dev0/pyspark.egg-info
copying pyspark/ml/__init__.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/tests.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-2.4.9.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-2.4.9.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-2.4.9.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-2.4.9.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/tests.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-2.4.9.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-2.4.9.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-2.4.9.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-2.4.9.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-2.4.9.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-2.4.9.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-2.4.9.dev0/pyspark/mllib/stat
copying pyspark/python/pyspark/shell.py -> pyspark-2.4.9.dev0/pyspark/python/pyspark
copying pyspark/sql/__init__.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/tests.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/streaming/__init__.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/flume.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/kafka.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/tests.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-2.4.9.dev0/pyspark/streaming
Writing pyspark-2.4.9.dev0/setup.cfg
creating dist
Creating tar archive
removing 'pyspark-2.4.9.dev0' (and everything under it)
Installing dist into virtual env
Processing ./python/dist/pyspark-2.4.9.dev0.tar.gz
Collecting py4j==0.10.7 (from pyspark==2.4.9.dev0)
  Downloading https://files.pythonhosted.org/packages/e3/53/c737818eb9a7dc32a7cd4f1396e787bd94200c3997c72c1dbe028587bd76/py4j-0.10.7-py2.py3-none-any.whl (197kB)
mkl-random 1.0.1 requires cython, which is not installed.
Installing collected packages: py4j, pyspark
  Running setup.py install for pyspark: started
    Running setup.py install for pyspark: finished with status 'done'
Successfully installed py4j-0.10.7 pyspark-2.4.9.dev0
You are using pip version 10.0.1, however version 20.3.4 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Run basic sanity check on pip installed version with spark-submit
21/05/07 11:45:11 WARN Utils: Your hostname, research-jenkins-worker-03 resolves to a loopback address: 127.0.1.1; using 192.168.122.1 instead (on interface virbr0)
21/05/07 11:45:11 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
21/05/07 11:45:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/05/07 11:45:12 INFO SparkContext: Running Spark version 2.4.9-SNAPSHOT
21/05/07 11:45:12 INFO SparkContext: Submitted application: PipSanityCheck
21/05/07 11:45:12 INFO SecurityManager: Changing view acls to: jenkins
21/05/07 11:45:12 INFO SecurityManager: Changing modify acls to: jenkins
21/05/07 11:45:12 INFO SecurityManager: Changing view acls groups to: 
21/05/07 11:45:12 INFO SecurityManager: Changing modify acls groups to: 
21/05/07 11:45:12 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
21/05/07 11:45:12 INFO Utils: Successfully started service 'sparkDriver' on port 46547.
21/05/07 11:45:12 INFO SparkEnv: Registering MapOutputTracker
21/05/07 11:45:12 INFO SparkEnv: Registering BlockManagerMaster
21/05/07 11:45:12 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/05/07 11:45:12 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/05/07 11:45:13 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-43cf90b0-13d4-4549-90cf-ce87ea8eea87
21/05/07 11:45:13 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
21/05/07 11:45:13 INFO SparkEnv: Registering OutputCommitCoordinator
21/05/07 11:45:13 INFO log: Logging initialized @2882ms to org.eclipse.jetty.util.log.Slf4jLog
21/05/07 11:45:13 INFO Server: jetty-9.4.40.v20210413; built: 2021-04-13T20:42:42.668Z; git: b881a572662e1943a14ae12e7e1207989f218b74; jvm 1.8.0_275-8u275-b01-0ubuntu1~20.04-b01
21/05/07 11:45:13 INFO Server: Started @3000ms
21/05/07 11:45:13 INFO AbstractConnector: Started ServerConnector@3ecd1c8a{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
21/05/07 11:45:13 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7285c11c{/jobs,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@14f24885{/jobs/json,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3efbce2a{/jobs/job,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@391363a1{/jobs/job/json,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3fa3a266{/stages,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5fc9a126{/stages/json,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@39bd4c6e{/stages/stage,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2b514992{/stages/stage/json,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@785cbeaa{/stages/pool,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6a2be284{/stages/pool/json,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4b71a373{/storage,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6a4d6e56{/storage/json,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7779245b{/storage/rdd,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@42f3c7f2{/storage/rdd/json,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@45f9ee4{/environment,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5616fe7{/environment/json,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@3b04d8ec{/executors,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@67145429{/executors/json,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@1fc44e91{/executors/threadDump,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@115404d9{/executors/threadDump/json,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@c0a6da7{/static,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5bdbfb21{/,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@951da3{/api,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@34d74688{/jobs/job/kill,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5247dc72{/stages/stage/kill,null,AVAILABLE,@Spark}
21/05/07 11:45:13 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.122.1:4040
21/05/07 11:45:13 INFO Executor: Starting executor ID driver on host localhost
21/05/07 11:45:13 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44419.
21/05/07 11:45:13 INFO NettyBlockTransferService: Server created on 192.168.122.1:44419
21/05/07 11:45:13 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/05/07 11:45:13 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.122.1, 44419, None)
21/05/07 11:45:13 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.122.1:44419 with 366.3 MB RAM, BlockManagerId(driver, 192.168.122.1, 44419, None)
21/05/07 11:45:13 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.122.1, 44419, None)
21/05/07 11:45:13 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.122.1, 44419, None)
21/05/07 11:45:13 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@a34204d{/metrics/json,null,AVAILABLE,@Spark}
21/05/07 11:45:14 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/spark-warehouse').
21/05/07 11:45:14 INFO SharedState: Warehouse path is 'file:/spark-warehouse'.
21/05/07 11:45:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5e08ca09{/SQL,null,AVAILABLE,@Spark}
21/05/07 11:45:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@274cfc60{/SQL/json,null,AVAILABLE,@Spark}
21/05/07 11:45:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@673c5d4f{/SQL/execution,null,AVAILABLE,@Spark}
21/05/07 11:45:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5523ff81{/SQL/execution/json,null,AVAILABLE,@Spark}
21/05/07 11:45:14 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@288a73c9{/static/sql,null,AVAILABLE,@Spark}
21/05/07 11:45:14 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
21/05/07 11:45:15 INFO SparkContext: Starting job: reduce at /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/dev/pip-sanity-check.py:32
21/05/07 11:45:15 INFO DAGScheduler: Got job 0 (reduce at /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/dev/pip-sanity-check.py:32) with 10 output partitions
21/05/07 11:45:15 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/dev/pip-sanity-check.py:32)
21/05/07 11:45:15 INFO DAGScheduler: Parents of final stage: List()
21/05/07 11:45:15 INFO DAGScheduler: Missing parents: List()
21/05/07 11:45:15 INFO DAGScheduler: Submitting ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/dev/pip-sanity-check.py:32), which has no missing parents
21/05/07 11:45:15 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 6.1 KB, free 366.3 MB)
21/05/07 11:45:15 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 4.1 KB, free 366.3 MB)
21/05/07 11:45:15 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.122.1:44419 (size: 4.1 KB, free: 366.3 MB)
21/05/07 11:45:15 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1184
21/05/07 11:45:15 INFO DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (PythonRDD[1] at reduce at /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/dev/pip-sanity-check.py:32) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
21/05/07 11:45:15 INFO TaskSchedulerImpl: Adding task set 0.0 with 10 tasks
21/05/07 11:45:15 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7852 bytes)
21/05/07 11:45:15 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, executor driver, partition 1, PROCESS_LOCAL, 7852 bytes)
21/05/07 11:45:15 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, localhost, executor driver, partition 2, PROCESS_LOCAL, 7852 bytes)
21/05/07 11:45:15 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, localhost, executor driver, partition 3, PROCESS_LOCAL, 7852 bytes)
21/05/07 11:45:15 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, localhost, executor driver, partition 4, PROCESS_LOCAL, 7852 bytes)
21/05/07 11:45:15 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, localhost, executor driver, partition 5, PROCESS_LOCAL, 7852 bytes)
21/05/07 11:45:15 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, localhost, executor driver, partition 6, PROCESS_LOCAL, 7852 bytes)
21/05/07 11:45:15 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, localhost, executor driver, partition 7, PROCESS_LOCAL, 7852 bytes)
21/05/07 11:45:15 INFO TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, localhost, executor driver, partition 8, PROCESS_LOCAL, 7852 bytes)
21/05/07 11:45:15 INFO TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, localhost, executor driver, partition 9, PROCESS_LOCAL, 7852 bytes)
21/05/07 11:45:15 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
21/05/07 11:45:15 INFO Executor: Running task 3.0 in stage 0.0 (TID 3)
21/05/07 11:45:15 INFO Executor: Running task 2.0 in stage 0.0 (TID 2)
21/05/07 11:45:15 INFO Executor: Running task 4.0 in stage 0.0 (TID 4)
21/05/07 11:45:15 INFO Executor: Running task 8.0 in stage 0.0 (TID 8)
21/05/07 11:45:15 INFO Executor: Running task 9.0 in stage 0.0 (TID 9)
21/05/07 11:45:15 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
21/05/07 11:45:15 INFO Executor: Running task 5.0 in stage 0.0 (TID 5)
21/05/07 11:45:15 INFO Executor: Running task 7.0 in stage 0.0 (TID 7)
21/05/07 11:45:15 INFO Executor: Running task 6.0 in stage 0.0 (TID 6)
21/05/07 11:45:16 INFO PythonRunner: Times: total = 449, boot = 446, init = 3, finish = 0
21/05/07 11:45:16 INFO PythonRunner: Times: total = 449, boot = 438, init = 11, finish = 0
21/05/07 11:45:16 INFO PythonRunner: Times: total = 453, boot = 450, init = 3, finish = 0
21/05/07 11:45:16 INFO PythonRunner: Times: total = 456, boot = 453, init = 3, finish = 0
21/05/07 11:45:16 INFO PythonRunner: Times: total = 460, boot = 457, init = 3, finish = 0
21/05/07 11:45:16 INFO PythonRunner: Times: total = 463, boot = 460, init = 3, finish = 0
21/05/07 11:45:16 INFO PythonRunner: Times: total = 467, boot = 464, init = 2, finish = 1
21/05/07 11:45:16 INFO Executor: Finished task 4.0 in stage 0.0 (TID 4). 1462 bytes result sent to driver
21/05/07 11:45:16 INFO Executor: Finished task 6.0 in stage 0.0 (TID 6). 1462 bytes result sent to driver
21/05/07 11:45:16 INFO Executor: Finished task 2.0 in stage 0.0 (TID 2). 1418 bytes result sent to driver
21/05/07 11:45:16 INFO Executor: Finished task 5.0 in stage 0.0 (TID 5). 1419 bytes result sent to driver
21/05/07 11:45:16 INFO PythonRunner: Times: total = 471, boot = 468, init = 3, finish = 0
21/05/07 11:45:16 INFO Executor: Finished task 8.0 in stage 0.0 (TID 8). 1462 bytes result sent to driver
21/05/07 11:45:16 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1461 bytes result sent to driver
21/05/07 11:45:16 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1461 bytes result sent to driver
21/05/07 11:45:16 INFO Executor: Finished task 9.0 in stage 0.0 (TID 9). 1462 bytes result sent to driver
21/05/07 11:45:16 INFO PythonRunner: Times: total = 502, boot = 471, init = 31, finish = 0
21/05/07 11:45:16 INFO PythonRunner: Times: total = 504, boot = 500, init = 3, finish = 1
21/05/07 11:45:16 INFO Executor: Finished task 3.0 in stage 0.0 (TID 3). 1462 bytes result sent to driver
21/05/07 11:45:16 INFO Executor: Finished task 7.0 in stage 0.0 (TID 7). 1462 bytes result sent to driver
21/05/07 11:45:16 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 626 ms on localhost (executor driver) (1/10)
21/05/07 11:45:16 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 628 ms on localhost (executor driver) (2/10)
21/05/07 11:45:16 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 626 ms on localhost (executor driver) (3/10)
21/05/07 11:45:16 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 627 ms on localhost (executor driver) (4/10)
21/05/07 11:45:16 INFO TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 626 ms on localhost (executor driver) (5/10)
21/05/07 11:45:16 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 632 ms on localhost (executor driver) (6/10)
21/05/07 11:45:16 INFO TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 626 ms on localhost (executor driver) (7/10)
21/05/07 11:45:16 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 656 ms on localhost (executor driver) (8/10)
21/05/07 11:45:16 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 632 ms on localhost (executor driver) (9/10)
21/05/07 11:45:16 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 629 ms on localhost (executor driver) (10/10)
21/05/07 11:45:16 INFO PythonAccumulatorV2: Connected to AccumulatorServer at host: 127.0.0.1 port: 39239
21/05/07 11:45:16 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
21/05/07 11:45:16 INFO DAGScheduler: ResultStage 0 (reduce at /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/dev/pip-sanity-check.py:32) finished in 0.825 s
21/05/07 11:45:16 INFO DAGScheduler: Job 0 finished: reduce at /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/dev/pip-sanity-check.py:32, took 0.877269 s
Successfully ran pip sanity check
21/05/07 11:45:16 INFO AbstractConnector: Stopped Spark@3ecd1c8a{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
21/05/07 11:45:16 INFO SparkUI: Stopped Spark web UI at http://192.168.122.1:4040
21/05/07 11:45:16 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/05/07 11:45:16 INFO MemoryStore: MemoryStore cleared
21/05/07 11:45:16 INFO BlockManager: BlockManager stopped
21/05/07 11:45:16 INFO BlockManagerMaster: BlockManagerMaster stopped
21/05/07 11:45:16 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/05/07 11:45:16 INFO SparkContext: Successfully stopped SparkContext
21/05/07 11:45:17 INFO ShutdownHookManager: Shutdown hook called
21/05/07 11:45:17 INFO ShutdownHookManager: Deleting directory /tmp/spark-87559fa3-91cc-4d74-a90b-430970b27812/pyspark-ecd0bed8-23f0-402e-b18a-7e46ba9d8406
21/05/07 11:45:17 INFO ShutdownHookManager: Deleting directory /tmp/spark-87559fa3-91cc-4d74-a90b-430970b27812
21/05/07 11:45:17 INFO ShutdownHookManager: Deleting directory /tmp/spark-09119ddd-07fb-40b5-b0da-5de29790f549
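For context on the check above: the spark-submit step runs dev/pip-sanity-check.py, and the log confirms only an application named PipSanityCheck, a reduce job over 10 partitions, and the closing message "Successfully ran pip sanity check". Below is a minimal sketch of what such a sanity-check script does, assuming a SparkSession-based reduce job; the actual file in the Spark repository may differ in its details.

    # Hedged sketch of a pip sanity check similar to dev/pip-sanity-check.py.
    # Assumptions: only the app name "PipSanityCheck", the 10-partition reduce,
    # and the final success message are confirmed by the log above.
    import sys
    from pyspark.sql import SparkSession

    if __name__ == "__main__":
        spark = SparkSession.builder.appName("PipSanityCheck").getOrCreate()
        sc = spark.sparkContext
        # Stand-in job: sum 0..99 across 10 partitions with a reduce.
        value = sc.parallelize(range(100), 10).reduce(lambda a, b: a + b)
        if value != 4950:
            print("Value {0} did not match expected value.".format(value))
            sys.exit(-1)
        print("Successfully ran pip sanity check")
        spark.stop()

The import-based check that follows appears to exercise the same job by importing the pip-installed pyspark package from a plain Python interpreter instead of going through spark-submit.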
Run basic sanity check with import-based test
21/05/07 11:45:18 WARN Utils: Your hostname, research-jenkins-worker-03 resolves to a loopback address: 127.0.1.1; using 192.168.122.1 instead (on interface virbr0)
21/05/07 11:45:18 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
21/05/07 11:45:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                        (0 + 10) / 10]
Successfully ran pip sanity check
Run the tests for context.py
21/05/07 11:45:24 WARN Utils: Your hostname, research-jenkins-worker-03 resolves to a loopback address: 127.0.1.1; using 192.168.122.1 instead (on interface virbr0)
21/05/07 11:45:24 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
21/05/07 11:45:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
21/05/07 11:45:27 WARN Utils: Your hostname, research-jenkins-worker-03 resolves to a loopback address: 127.0.1.1; using 192.168.122.1 instead (on interface virbr0)
21/05/07 11:45:27 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
21/05/07 11:45:28 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                          (0 + 4) / 4]

[Stage 10:>                                                         (0 + 4) / 4]
[Stage 10:>                 (0 + 4) / 4][Stage 11:>                 (0 + 0) / 2]
21/05/07 11:45:40 WARN PythonRunner: Incomplete task 3.42 in stage 10 (TID 42) interrupted: Attempting to kill Python Worker
21/05/07 11:45:40 WARN PythonRunner: Incomplete task 1.40 in stage 10 (TID 40) interrupted: Attempting to kill Python Worker
21/05/07 11:45:40 WARN PythonRunner: Incomplete task 2.41 in stage 10 (TID 41) interrupted: Attempting to kill Python Worker
21/05/07 11:45:40 WARN PythonRunner: Incomplete task 0.39 in stage 10 (TID 39) interrupted: Attempting to kill Python Worker
21/05/07 11:45:40 WARN TaskSetManager: Lost task 1.0 in stage 10.0 (TID 40, localhost, executor driver): TaskKilled (Stage cancelled)
21/05/07 11:45:40 WARN TaskSetManager: Lost task 2.0 in stage 10.0 (TID 41, localhost, executor driver): TaskKilled (Stage cancelled)
21/05/07 11:45:40 WARN TaskSetManager: Lost task 3.0 in stage 10.0 (TID 42, localhost, executor driver): TaskKilled (Stage cancelled)
21/05/07 11:45:40 WARN TaskSetManager: Lost task 0.0 in stage 10.0 (TID 39, localhost, executor driver): TaskKilled (Stage cancelled)

DeprecationWarning: 'source deactivate' is deprecated. Use 'conda deactivate'.
Testing pip installation with python 3.5
Using /tmp/tmp.MhcJSkguAC for virtualenv
Collecting package metadata (current_repodata.json): ...working... done
Solving environment: ...working... failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done


==> WARNING: A newer version of conda exists. <==
  current version: 4.7.12
  latest version: 4.10.1

Please update conda by running

    $ conda update -n base -c defaults conda



## Package Plan ##

  environment location: /tmp/tmp.MhcJSkguAC/3.5

  added / updated specs:
    - numpy
    - pandas
    - pip
    - python=3.5
    - setuptools


The following NEW packages will be INSTALLED:

  _libgcc_mutex      pkgs/main/linux-64::_libgcc_mutex-0.1-main
  blas               pkgs/main/linux-64::blas-1.0-mkl
  ca-certificates    pkgs/main/linux-64::ca-certificates-2021.4.13-h06a4308_1
  certifi            pkgs/main/noarch::certifi-2020.6.20-pyhd3eb1b0_3
  intel-openmp       pkgs/main/linux-64::intel-openmp-2021.2.0-h06a4308_610
  libedit            pkgs/main/linux-64::libedit-3.1.20210216-h27cfd23_1
  libffi             pkgs/main/linux-64::libffi-3.2.1-hf484d3e_1007
  libgcc-ng          pkgs/main/linux-64::libgcc-ng-9.1.0-hdf63c60_0
  libgfortran-ng     pkgs/main/linux-64::libgfortran-ng-7.3.0-hdf63c60_0
  libstdcxx-ng       pkgs/main/linux-64::libstdcxx-ng-9.1.0-hdf63c60_0
  mkl                pkgs/main/linux-64::mkl-2018.0.3-1
  mkl_fft            pkgs/main/linux-64::mkl_fft-1.0.6-py35h7dd41cf_0
  mkl_random         pkgs/main/linux-64::mkl_random-1.0.1-py35h4414c95_1
  ncurses            pkgs/main/linux-64::ncurses-6.2-he6710b0_1
  numpy              pkgs/main/linux-64::numpy-1.15.2-py35h1d66e8a_0
  numpy-base         pkgs/main/linux-64::numpy-base-1.15.2-py35h81de0dd_0
  openssl            pkgs/main/linux-64::openssl-1.0.2u-h7b6447c_0
  pandas             pkgs/main/linux-64::pandas-0.23.4-py35h04863e7_0
  pip                pkgs/main/linux-64::pip-10.0.1-py35_0
  python             pkgs/main/linux-64::python-3.5.6-hc3d631a_0
  python-dateutil    pkgs/main/noarch::python-dateutil-2.8.1-pyhd3eb1b0_0
  pytz               pkgs/main/noarch::pytz-2021.1-pyhd3eb1b0_0
  readline           pkgs/main/linux-64::readline-7.0-h7b6447c_5
  setuptools         pkgs/main/linux-64::setuptools-40.2.0-py35_0
  six                pkgs/main/noarch::six-1.15.0-pyhd3eb1b0_0
  sqlite             pkgs/main/linux-64::sqlite-3.33.0-h62c20be_0
  tbb                pkgs/main/linux-64::tbb-2021.2.0-hff7bd54_0
  tbb4py             pkgs/main/linux-64::tbb4py-2018.0.5-py35h6bb024c_0
  tk                 pkgs/main/linux-64::tk-8.6.10-hbc83047_0
  wheel              pkgs/main/noarch::wheel-0.36.2-pyhd3eb1b0_0
  xz                 pkgs/main/linux-64::xz-5.2.5-h7b6447c_0
  zlib               pkgs/main/linux-64::zlib-1.2.11-h7b6447c_3


Preparing transaction: ...working... done
Verifying transaction: ...working... done
Executing transaction: ...working... done
#
# To activate this environment, use
#
#     $ conda activate /tmp/tmp.MhcJSkguAC/3.5
#
# To deactivate an active environment, use
#
#     $ conda deactivate

Creating pip installable source dist
running sdist
running egg_info
creating pyspark.egg-info
writing top-level names to pyspark.egg-info/top_level.txt
writing pyspark.egg-info/PKG-INFO
writing requirements to pyspark.egg-info/requires.txt
writing dependency_links to pyspark.egg-info/dependency_links.txt
writing manifest file 'pyspark.egg-info/SOURCES.txt'
Could not import pypandoc - required to package PySpark
package init file 'deps/bin/__init__.py' not found (or not a regular file)
package init file 'deps/jars/__init__.py' not found (or not a regular file)
package init file 'pyspark/python/pyspark/__init__.py' not found (or not a regular file)
package init file 'lib/__init__.py' not found (or not a regular file)
package init file 'deps/data/__init__.py' not found (or not a regular file)
package init file 'deps/licenses/__init__.py' not found (or not a regular file)
package init file 'deps/examples/__init__.py' not found (or not a regular file)
reading manifest file 'pyspark.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '.DS_Store' found anywhere in distribution
writing manifest file 'pyspark.egg-info/SOURCES.txt'
running check
creating pyspark-2.4.9.dev0
creating pyspark-2.4.9.dev0/deps
creating pyspark-2.4.9.dev0/deps/bin
creating pyspark-2.4.9.dev0/deps/data
creating pyspark-2.4.9.dev0/deps/data/graphx
creating pyspark-2.4.9.dev0/deps/data/mllib
creating pyspark-2.4.9.dev0/deps/data/mllib/als
creating pyspark-2.4.9.dev0/deps/data/mllib/images
creating pyspark-2.4.9.dev0/deps/data/mllib/images/origin
creating pyspark-2.4.9.dev0/deps/data/mllib/images/origin/kittens
creating pyspark-2.4.9.dev0/deps/data/mllib/images/partitioned
creating pyspark-2.4.9.dev0/deps/data/mllib/images/partitioned/cls=kittens
creating pyspark-2.4.9.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
creating pyspark-2.4.9.dev0/deps/data/mllib/ridge-data
creating pyspark-2.4.9.dev0/deps/data/streaming
creating pyspark-2.4.9.dev0/deps/examples
creating pyspark-2.4.9.dev0/deps/examples/ml
creating pyspark-2.4.9.dev0/deps/examples/mllib
creating pyspark-2.4.9.dev0/deps/examples/sql
creating pyspark-2.4.9.dev0/deps/examples/sql/streaming
creating pyspark-2.4.9.dev0/deps/examples/streaming
creating pyspark-2.4.9.dev0/deps/jars
creating pyspark-2.4.9.dev0/deps/licenses
creating pyspark-2.4.9.dev0/lib
creating pyspark-2.4.9.dev0/pyspark
creating pyspark-2.4.9.dev0/pyspark.egg-info
creating pyspark-2.4.9.dev0/pyspark/ml
creating pyspark-2.4.9.dev0/pyspark/ml/linalg
creating pyspark-2.4.9.dev0/pyspark/ml/param
creating pyspark-2.4.9.dev0/pyspark/mllib
creating pyspark-2.4.9.dev0/pyspark/mllib/linalg
creating pyspark-2.4.9.dev0/pyspark/mllib/stat
creating pyspark-2.4.9.dev0/pyspark/python
creating pyspark-2.4.9.dev0/pyspark/python/pyspark
creating pyspark-2.4.9.dev0/pyspark/sql
creating pyspark-2.4.9.dev0/pyspark/streaming
copying files to pyspark-2.4.9.dev0...
copying MANIFEST.in -> pyspark-2.4.9.dev0
copying README.md -> pyspark-2.4.9.dev0
copying setup.cfg -> pyspark-2.4.9.dev0
copying setup.py -> pyspark-2.4.9.dev0
copying deps/bin/beeline -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/beeline.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/docker-image-tool.sh -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/find-spark-home -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/find-spark-home.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/load-spark-env.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/load-spark-env.sh -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/pyspark -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/pyspark.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/pyspark2.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/run-example -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/run-example.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-class -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-class.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-class2.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-shell -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-shell.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-shell2.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-sql -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-sql.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-sql2.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-submit -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-submit.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/spark-submit2.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/sparkR -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/sparkR.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/bin/sparkR2.cmd -> pyspark-2.4.9.dev0/deps/bin
copying deps/data/graphx/followers.txt -> pyspark-2.4.9.dev0/deps/data/graphx
copying deps/data/graphx/users.txt -> pyspark-2.4.9.dev0/deps/data/graphx
copying deps/data/mllib/gmm_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/iris_libsvm.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/kmeans_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/pagerank_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/pic_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_binary_classification_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_fpgrowth.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_isotonic_regression_libsvm_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_kmeans_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_lda_libsvm_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_libsvm_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_linear_regression_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_movielens_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_multiclass_classification_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/sample_svm_data.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/streaming_kmeans_data_test.txt -> pyspark-2.4.9.dev0/deps/data/mllib
copying deps/data/mllib/als/sample_movielens_ratings.txt -> pyspark-2.4.9.dev0/deps/data/mllib/als
copying deps/data/mllib/als/test.data -> pyspark-2.4.9.dev0/deps/data/mllib/als
copying deps/data/mllib/images/license.txt -> pyspark-2.4.9.dev0/deps/data/mllib/images
copying deps/data/mllib/images/origin/license.txt -> pyspark-2.4.9.dev0/deps/data/mllib/images/origin
copying deps/data/mllib/images/origin/kittens/not-image.txt -> pyspark-2.4.9.dev0/deps/data/mllib/images/origin/kittens
copying deps/data/mllib/images/partitioned/cls=kittens/date=2018-01/not-image.txt -> pyspark-2.4.9.dev0/deps/data/mllib/images/partitioned/cls=kittens/date=2018-01
copying deps/data/mllib/ridge-data/lpsa.data -> pyspark-2.4.9.dev0/deps/data/mllib/ridge-data
copying deps/data/streaming/AFINN-111.txt -> pyspark-2.4.9.dev0/deps/data/streaming
copying deps/examples/als.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/avro_inputformat.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/kmeans.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/logistic_regression.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/pagerank.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/parquet_inputformat.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/pi.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/sort.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/status_api_demo.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/transitive_closure.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/wordcount.py -> pyspark-2.4.9.dev0/deps/examples
copying deps/examples/ml/aft_survival_regression.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/als_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/binarizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/bisecting_k_means_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/bucketed_random_projection_lsh_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/bucketizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/chi_square_test_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/chisq_selector_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/correlation_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/count_vectorizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/cross_validator.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/dataframe_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/dct_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_classification_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/decision_tree_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/elementwise_product_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/estimator_transformer_param_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/feature_hasher_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/fpgrowth_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/gaussian_mixture_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/generalized_linear_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_classifier_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/gradient_boosted_tree_regressor_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/imputer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/index_to_string_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/isotonic_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/kmeans_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/lda_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/linear_regression_with_elastic_net.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/linearsvc.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_summary_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/logistic_regression_with_elastic_net.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/max_abs_scaler_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/min_hash_lsh_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/min_max_scaler_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/multiclass_logistic_regression_with_elastic_net.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/multilayer_perceptron_classification.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/n_gram_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/naive_bayes_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/normalizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/one_vs_rest_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/onehot_encoder_estimator_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/pca_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/pipeline_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/polynomial_expansion_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/prefixspan_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/quantile_discretizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_classifier_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/random_forest_regressor_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/rformula_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/sql_transformer.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/standard_scaler_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/stopwords_remover_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/string_indexer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/summarizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/tf_idf_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/tokenizer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/train_validation_split.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/vector_assembler_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/vector_indexer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/vector_size_hint_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/vector_slicer_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/ml/word2vec_example.py -> pyspark-2.4.9.dev0/deps/examples/ml
copying deps/examples/mllib/binary_classification_metrics_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/bisecting_k_means_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/correlations_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_classification_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/decision_tree_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/elementwise_product_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/fpgrowth_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/gaussian_mixture_model.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_classification_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/gradient_boosting_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/hypothesis_testing_kolmogorov_smirnov_test_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/isotonic_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/k_means_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/kernel_density_estimation_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/kmeans.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/latent_dirichlet_allocation_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/linear_regression_with_sgd_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/logistic_regression_with_lbfgs_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_class_metrics_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/multi_label_metrics_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/naive_bayes_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/normalizer_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/pca_rowmatrix_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/power_iteration_clustering_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_classification_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/random_forest_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/random_rdd_generation.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/ranking_metrics_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/recommendation_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/regression_metrics_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/sampled_rdds.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/standard_scaler_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/stratified_sampling_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_k_means_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/streaming_linear_regression_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/summary_statistics_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/svd_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/svm_with_sgd_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/tf_idf_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/mllib/word2vec_example.py -> pyspark-2.4.9.dev0/deps/examples/mllib
copying deps/examples/sql/arrow.py -> pyspark-2.4.9.dev0/deps/examples/sql
copying deps/examples/sql/basic.py -> pyspark-2.4.9.dev0/deps/examples/sql
copying deps/examples/sql/datasource.py -> pyspark-2.4.9.dev0/deps/examples/sql
copying deps/examples/sql/hive.py -> pyspark-2.4.9.dev0/deps/examples/sql
copying deps/examples/sql/streaming/structured_kafka_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/sql/streaming
copying deps/examples/sql/streaming/structured_network_wordcount_windowed.py -> pyspark-2.4.9.dev0/deps/examples/sql/streaming
copying deps/examples/streaming/direct_kafka_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/flume_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/hdfs_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/kafka_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/network_wordjoinsentiments.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/queue_stream.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/recoverable_network_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/sql_network_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/examples/streaming/stateful_network_wordcount.py -> pyspark-2.4.9.dev0/deps/examples/streaming
copying deps/jars/JavaEWAH-0.3.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/RoaringBitmap-0.7.45.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/ST4-4.0.4.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/activation-1.1.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/aircompressor-0.10.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/antlr-2.7.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/antlr-runtime-3.4.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/antlr4-runtime-4.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/aopalliance-1.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/aopalliance-repackaged-2.4.0-b34.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/apache-log4j-extras-1.2.17.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/apacheds-i18n-2.0.0-M15.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/apacheds-kerberos-codec-2.0.0-M15.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/api-asn1-api-1.0.0-M20.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/api-util-1.0.0-M20.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/arpack_combined_all-0.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/arrow-format-0.10.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/arrow-memory-0.10.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/arrow-vector-0.10.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/automaton-1.11-8.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/avro-1.8.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/avro-ipc-1.8.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/avro-mapred-1.8.2-hadoop2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/bonecp-0.8.0.RELEASE.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/breeze-macros_2.11-0.13.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/breeze_2.11-0.13.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/calcite-avatica-1.2.0-incubating.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/calcite-core-1.2.0-incubating.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/calcite-linq4j-1.2.0-incubating.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/cglib-2.2.1-v20090111.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/chill-java-0.9.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/chill_2.11-0.9.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-beanutils-1.7.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-cli-1.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-codec-1.10.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-collections-3.2.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-compiler-3.0.16.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-compress-1.8.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-configuration-1.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-crypto-1.0.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-dbcp-1.4.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-digester-1.8.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-httpclient-3.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-io-2.4.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-lang-2.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-lang3-3.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-logging-1.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-math3-3.4.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-net-3.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/commons-pool-1.5.4.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/compress-lzf-1.0.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/core-1.1.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/curator-client-2.7.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/curator-framework-2.7.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/curator-recipes-2.7.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/datanucleus-api-jdo-3.2.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/datanucleus-core-3.2.10.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/datanucleus-rdbms-3.2.9.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/derby-10.12.1.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/eigenbase-properties-1.1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/flatbuffers-1.2.0-3f79e055.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/generex-1.0.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/gmetric4j-1.0.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/gson-2.2.4.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/guava-14.0.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/guice-3.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-annotations-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-auth-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-client-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-common-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-hdfs-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-app-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-common-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-core-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-jobclient-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-mapreduce-client-shuffle-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-yarn-api-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-yarn-client-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-yarn-common-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-common-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hadoop-yarn-server-web-proxy-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hive-beeline-1.2.1.spark2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hive-cli-1.2.1.spark2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hive-exec-1.2.1.spark2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hive-jdbc-1.2.1.spark2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hive-metastore-1.2.1.spark2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hk2-api-2.4.0-b34.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hk2-locator-2.4.0-b34.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hk2-utils-2.4.0-b34.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/hppc-0.7.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/htrace-core-3.1.0-incubating.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/httpclient-4.5.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/httpcore-4.4.10.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/ivy-2.4.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-annotations-2.6.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-core-2.7.9.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-core-asl-1.9.13.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-databind-2.6.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-dataformat-yaml-2.6.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-jaxrs-1.9.13.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-mapper-asl-1.9.13.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-module-jaxb-annotations-2.9.9.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-module-paranamer-2.7.9.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-module-scala_2.11-2.6.7.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jackson-xc-1.9.13.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/janino-3.0.16.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javassist-3.18.1-GA.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javax.annotation-api-1.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javax.inject-1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javax.inject-2.4.0-b34.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javax.servlet-api-3.1.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javax.ws.rs-api-2.0.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/javolution-5.5.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jaxb-api-2.2.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jcl-over-slf4j-1.7.16.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jdo-api-3.0.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-client-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-common-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-container-servlet-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-container-servlet-core-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-guava-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-media-jaxb-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jersey-server-2.22.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jettison-1.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-6.1.26.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-client-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-continuation-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-http-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-io-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-jndi-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-plus-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-proxy-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-security-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-server-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-servlet-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-servlets-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-util-6.1.26.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-util-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-util-ajax-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-webapp-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jetty-xml-9.4.40.v20210413.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jline-2.14.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/joda-time-2.9.9.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jodd-core-3.5.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jpam-1.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/json4s-ast_2.11-3.5.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/json4s-core_2.11-3.5.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/json4s-jackson_2.11-3.5.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/json4s-scalap_2.11-3.5.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jsp-api-2.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jsr305-3.0.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jta-1.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jtransforms-2.4.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/jul-to-slf4j-1.7.26.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/kryo-shaded-4.0.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/kubernetes-client-4.6.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/kubernetes-model-4.6.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/kubernetes-model-common-4.6.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/leveldbjni-all-1.8.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/libfb303-0.9.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/libthrift-0.9.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/log4j-1.2.17.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/logging-interceptor-3.12.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/lz4-java-1.4.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/machinist_2.11-0.6.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/macro-compat_2.11-1.1.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/mesos-1.4.0-shaded-protobuf.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/metrics-core-3.1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/metrics-ganglia-3.1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/metrics-graphite-3.1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/metrics-json-3.1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/metrics-jvm-3.1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/minlog-1.3.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/netty-3.9.9.Final.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/netty-all-4.1.47.Final.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/objenesis-2.5.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/okhttp-3.12.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/okio-1.15.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/oncrpc-1.0.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/opencsv-2.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/orc-core-1.5.5-nohive.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/orc-mapreduce-1.5.5-nohive.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/orc-shims-1.5.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/oro-2.0.8.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/osgi-resource-locator-1.0.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/paranamer-2.8.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-column-1.10.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-common-1.10.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-encoding-1.10.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-format-2.4.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-hadoop-1.10.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-hadoop-bundle-1.6.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/parquet-jackson-1.10.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/pmml-model-1.2.15.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/pmml-schema-1.2.15.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/protobuf-java-2.5.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/py4j-0.10.7.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/pyrolite-4.13.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/scala-compiler-2.11.12.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/scala-library-2.11.12.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/scala-parser-combinators_2.11-1.1.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/scala-reflect-2.11.12.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/scala-xml_2.11-1.0.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/shapeless_2.11-2.3.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/shims-0.7.45.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/slf4j-api-1.7.26.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/slf4j-log4j12-1.7.16.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/snakeyaml-1.15.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/snappy-0.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/snappy-java-1.1.8.2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-assembly_2.11-2.4.9-SNAPSHOT-tests.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-assembly_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-catalyst_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-core_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-ganglia-lgpl_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-graphx_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-hive-thriftserver_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-hive_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-kubernetes_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-kvstore_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-launcher_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-mesos_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-mllib-local_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-mllib_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-network-common_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-network-shuffle_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-repl_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-sketch_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-sql_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-streaming_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-tags_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-unsafe_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spark-yarn_2.11-2.4.9-SNAPSHOT.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spire-macros_2.11-0.13.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/spire_2.11-0.13.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/stax-api-1.0-2.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/stax-api-1.0.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/stream-2.7.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/stringtemplate-3.2.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/super-csv-2.2.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/univocity-parsers-2.7.3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/unused-1.0.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/validation-api-1.1.0.Final.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/xbean-asm6-shaded-4.8.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/xercesImpl-2.9.1.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/xml-apis-1.3.04.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/xmlenc-0.52.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/xz-1.5.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/zjsonpatch-0.3.0.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/zookeeper-3.4.6.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/jars/zstd-jni-1.4.4-3.jar -> pyspark-2.4.9.dev0/deps/jars
copying deps/licenses/LICENSE-AnchorJS.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-CC0.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-bootstrap.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-cloudpickle.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-d3.min.js.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-dagre-d3.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-datatables.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-graphlib-dot.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-heapq.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-join.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-jquery.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-json-formatter.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-matchMedia-polyfill.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-modernizr.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-mustache.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-py4j.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-respond.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-sbt-launch-lib.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-sorttable.js.txt -> pyspark-2.4.9.dev0/deps/licenses
copying deps/licenses/LICENSE-vis-timeline.txt -> pyspark-2.4.9.dev0/deps/licenses
copying lib/py4j-0.10.7-src.zip -> pyspark-2.4.9.dev0/lib
copying lib/pyspark.zip -> pyspark-2.4.9.dev0/lib
copying pyspark/__init__.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/_globals.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/accumulators.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/broadcast.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/cloudpickle.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/conf.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/context.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/daemon.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/files.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/find_spark_home.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/heapq3.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/java_gateway.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/join.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/profiler.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/rdd.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/rddsampler.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/resultiterable.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/serializers.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/shell.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/shuffle.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/statcounter.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/status.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/storagelevel.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/taskcontext.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/test_broadcast.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/test_serializers.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/tests.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/traceback_utils.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/util.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/version.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark/worker.py -> pyspark-2.4.9.dev0/pyspark
copying pyspark.egg-info/PKG-INFO -> pyspark-2.4.9.dev0/pyspark.egg-info
copying pyspark.egg-info/SOURCES.txt -> pyspark-2.4.9.dev0/pyspark.egg-info
copying pyspark.egg-info/dependency_links.txt -> pyspark-2.4.9.dev0/pyspark.egg-info
copying pyspark.egg-info/requires.txt -> pyspark-2.4.9.dev0/pyspark.egg-info
copying pyspark.egg-info/top_level.txt -> pyspark-2.4.9.dev0/pyspark.egg-info
copying pyspark/ml/__init__.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/base.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/classification.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/clustering.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/common.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/evaluation.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/feature.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/fpm.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/image.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/pipeline.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/recommendation.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/regression.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/stat.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/tests.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/tuning.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/util.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/wrapper.py -> pyspark-2.4.9.dev0/pyspark/ml
copying pyspark/ml/linalg/__init__.py -> pyspark-2.4.9.dev0/pyspark/ml/linalg
copying pyspark/ml/param/__init__.py -> pyspark-2.4.9.dev0/pyspark/ml/param
copying pyspark/ml/param/_shared_params_code_gen.py -> pyspark-2.4.9.dev0/pyspark/ml/param
copying pyspark/ml/param/shared.py -> pyspark-2.4.9.dev0/pyspark/ml/param
copying pyspark/mllib/__init__.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/classification.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/clustering.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/common.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/evaluation.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/feature.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/fpm.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/random.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/recommendation.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/regression.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/tests.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/tree.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/util.py -> pyspark-2.4.9.dev0/pyspark/mllib
copying pyspark/mllib/linalg/__init__.py -> pyspark-2.4.9.dev0/pyspark/mllib/linalg
copying pyspark/mllib/linalg/distributed.py -> pyspark-2.4.9.dev0/pyspark/mllib/linalg
copying pyspark/mllib/stat/KernelDensity.py -> pyspark-2.4.9.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/__init__.py -> pyspark-2.4.9.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/_statistics.py -> pyspark-2.4.9.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/distribution.py -> pyspark-2.4.9.dev0/pyspark/mllib/stat
copying pyspark/mllib/stat/test.py -> pyspark-2.4.9.dev0/pyspark/mllib/stat
copying pyspark/python/pyspark/shell.py -> pyspark-2.4.9.dev0/pyspark/python/pyspark
copying pyspark/sql/__init__.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/catalog.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/column.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/conf.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/context.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/dataframe.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/functions.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/group.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/readwriter.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/session.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/streaming.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/tests.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/types.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/udf.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/utils.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/sql/window.py -> pyspark-2.4.9.dev0/pyspark/sql
copying pyspark/streaming/__init__.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/context.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/dstream.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/flume.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/kafka.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/kinesis.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/listener.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/tests.py -> pyspark-2.4.9.dev0/pyspark/streaming
copying pyspark/streaming/util.py -> pyspark-2.4.9.dev0/pyspark/streaming
Writing pyspark-2.4.9.dev0/setup.cfg
Creating tar archive
removing 'pyspark-2.4.9.dev0' (and everything under it)
Installing dist into virtual env
Obtaining file:///home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/python
Collecting py4j==0.10.7 (from pyspark==2.4.9.dev0)
  Downloading https://files.pythonhosted.org/packages/e3/53/c737818eb9a7dc32a7cd4f1396e787bd94200c3997c72c1dbe028587bd76/py4j-0.10.7-py2.py3-none-any.whl (197kB)
mkl-random 1.0.1 requires cython, which is not installed.
Installing collected packages: py4j, pyspark
  Running setup.py develop for pyspark
Successfully installed py4j-0.10.7 pyspark
You are using pip version 10.0.1, however version 20.3.4 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
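For reference, the quickest way to confirm that the editable install above resolves from the virtualenv is a version import check along the following lines (a minimal sketch; pyspark exposes __version__, and the value here would be the dev build produced by this job):

    import pyspark
    # Expect the dev version built by this job, e.g. "2.4.9.dev0".
    print(pyspark.__version__)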
Run basic sanity check on pip installed version with spark-submit
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:120)
	at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:108)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:71)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:79)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Successfully ran pip sanity check
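The spark-submit sanity check above launches a small job against the pip-installed package; a minimal sketch of that kind of check (illustrative only, not the actual script used by this build) could look like:

    # sanity_check.py -- illustrative sketch of a spark-submit sanity check
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("pip-sanity-check").getOrCreate()
    sc = spark.sparkContext
    # Run a trivial distributed job and verify the result.
    total = sc.parallelize(range(100), 10).sum()
    assert total == sum(range(100)), "unexpected result from sample job"
    spark.stop()
    print("sanity check passed")

submitted with something like: spark-submit sanity_check.py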
Run basic import-based sanity check
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:120)
	at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:108)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:71)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:79)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                        (0 + 10) / 10]
[Stage 0:=====>                                                    (1 + 9) / 10]
                                                                                
Successfully ran pip sanity check
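The import-based check differs in that the Spark session is created from a plain Python process via the installed package, which exercises find_spark_home and the jars bundled with the pip package; a hypothetical sketch, assuming a local master:

    # import_check.py -- illustrative sketch of an import-based sanity check
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .master("local[4]")
             .appName("import-check")
             .getOrCreate())
    # A trivial DataFrame job confirms the session and executors work.
    assert spark.range(100).count() == 100
    spark.stop()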
Run the tests for context.py
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:120)
	at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:108)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:71)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:79)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: target/unit-tests.log (No such file or directory)
	at java.io.FileOutputStream.open0(Native Method)
	at java.io.FileOutputStream.open(FileOutputStream.java:270)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:133)
	at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
	at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
	at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
	at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
	at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
	at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
	at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:120)
	at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:108)
	at org.apache.spark.deploy.SparkSubmit.initializeLogIfNecessary(SparkSubmit.scala:71)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:79)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

[Stage 0:>                                                          (0 + 4) / 4]
                                                                                

[Stage 10:>                                                         (0 + 4) / 4]
[Stage 10:>                 (0 + 4) / 4][Stage 11:>                 (0 + 0) / 2]
                                                                                
DeprecationWarning: 'source deactivate' is deprecated. Use 'conda deactivate'.
Cleaning up temporary directory - /tmp/tmp.MhcJSkguAC

========================================================================
Running SparkR tests
========================================================================

Attaching package: ‘SparkR’

The following objects are masked from ‘package:testthat’:

    describe, not

The following objects are masked from ‘package:stats’:

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from ‘package:base’:

    as.data.frame, colnames, colnames<-, drop, endsWith, intersect,
    rank, rbind, sample, startsWith, subset, summary, transform, union

Spark package found in SPARK_HOME: /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7
binary_function: 
binary functions: ...........
binaryFile: 
functions on binary files: ....
broadcast: 
broadcast variables: ..
client: 
functions in client.R: .....
context: 
test functions in sparkR.R: ..................................................
includePackage: 
include R packages: ..
jvm_api: 
JVM API: ..
mllib_classification: 
MLlib classification algorithms, except for tree-based algorithms: ......................................................................
mllib_clustering: 
MLlib clustering algorithms: .....................................................................
mllib_fpm: 
MLlib frequent pattern mining: .....
mllib_recommendation: 
MLlib recommendation algorithms: ........
mllib_regression: 
MLlib regression algorithms, except for tree-based algorithms: ................................................................................................................................
mllib_stat: 
MLlib statistics algorithms: ........
mllib_tree: 
MLlib tree-based algorithms: ..............................................................................................
parallelize_collect: 
parallelize() and collect(): .............................
rdd: 
basic RDD functions: ............................................................................................................................................................................................................................................................................................................................................................................................................................................
Serde: 
SerDe functionality: .......................................
shuffle: 
partitionBy, groupByKey, reduceByKey etc.: ....................
sparkR: 
functions in sparkR.R: ....
sparkSQL: 
SparkSQL functions: .....................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
streaming: 
Structured Streaming: ..........................................
take: 
tests RDD function take(): ................
textFile: 
the textFile() function: ..............
utils: 
functions in utils.R: .............................................
Windows: 
Windows-specific tests: S

══ Skipped ═════════════════════════════════════════════════════════════════════
1. sparkJars tag in SparkContext (test_Windows.R:22:5) - Reason: This test is only for Windows, skipped

══ DONE ════════════════════════════════════════════════════════════════════════
Using R_SCRIPT_PATH = /usr/bin
++++ dirname /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/install-dev.sh
+++ cd /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R
+++ pwd
++ FWDIR=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R
++ LIB_DIR=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/lib
++ mkdir -p /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/lib
++ pushd /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R
++ . /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/find-r.sh
+++ '[' -z /usr/bin ']'
++ . /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/create-rd.sh
+++ set -o pipefail
+++ set -e
+++++ dirname /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/create-rd.sh
++++ cd /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R
++++ pwd
+++ FWDIR=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R
+++ pushd /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R
+++ . /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/find-r.sh
++++ '[' -z /usr/bin ']'
+++ /usr/bin/Rscript -e ' if("devtools" %in% rownames(installed.packages())) { library(devtools); devtools::document(pkg="./pkg", roclets=c("rd")) }'
Loading required package: usethis
Updating SparkR documentation
Loading SparkR
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘endsWith’ in package ‘SparkR’
Creating a new generic function for ‘startsWith’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
Warning: [/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/pkg/R/SQLContext.R:592] @name May only use one @name per block
Warning: [/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/pkg/R/SQLContext.R:733] @name May only use one @name per block
++ /usr/bin/R CMD INSTALL --library=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/lib /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/pkg/
* installing *source* package ‘SparkR’ ...
** using staged installation
** R
** inst
** byte-compile and prepare package for lazy loading
Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
Creating a new generic function for ‘colnames’ in package ‘SparkR’
Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
Creating a new generic function for ‘cov’ in package ‘SparkR’
Creating a new generic function for ‘drop’ in package ‘SparkR’
Creating a new generic function for ‘na.omit’ in package ‘SparkR’
Creating a new generic function for ‘filter’ in package ‘SparkR’
Creating a new generic function for ‘intersect’ in package ‘SparkR’
Creating a new generic function for ‘sample’ in package ‘SparkR’
Creating a new generic function for ‘transform’ in package ‘SparkR’
Creating a new generic function for ‘subset’ in package ‘SparkR’
Creating a new generic function for ‘summary’ in package ‘SparkR’
Creating a new generic function for ‘union’ in package ‘SparkR’
Creating a new generic function for ‘endsWith’ in package ‘SparkR’
Creating a new generic function for ‘startsWith’ in package ‘SparkR’
Creating a new generic function for ‘lag’ in package ‘SparkR’
Creating a new generic function for ‘rank’ in package ‘SparkR’
Creating a new generic function for ‘sd’ in package ‘SparkR’
Creating a new generic function for ‘var’ in package ‘SparkR’
Creating a new generic function for ‘window’ in package ‘SparkR’
Creating a new generic function for ‘predict’ in package ‘SparkR’
Creating a new generic function for ‘rbind’ in package ‘SparkR’
Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation path
* DONE (SparkR)
++ cd /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/lib
++ jar cfM /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/lib/sparkr.zip SparkR
++ popd
++ cd /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/..
++ pwd
+ SPARK_HOME=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7
+ . /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/bin/load-spark-env.sh
++ '[' -z /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7 ']'
++ '[' -z '' ']'
++ export SPARK_ENV_LOADED=1
++ SPARK_ENV_LOADED=1
++ export SPARK_CONF_DIR=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/conf
++ SPARK_CONF_DIR=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/conf
++ '[' -f /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/conf/spark-env.sh ']'
++ '[' -z '' ']'
++ ASSEMBLY_DIR2=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/assembly/target/scala-2.11
++ ASSEMBLY_DIR1=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/assembly/target/scala-2.12
++ [[ -d /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/assembly/target/scala-2.11 ]]
++ [[ -d /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/assembly/target/scala-2.12 ]]
++ '[' -d /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/assembly/target/scala-2.11 ']'
++ export SPARK_SCALA_VERSION=2.11
++ SPARK_SCALA_VERSION=2.11
+ '[' -f /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/RELEASE ']'
+ SPARK_JARS_DIR=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/assembly/target/scala-2.11/jars
+ '[' -d /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/assembly/target/scala-2.11/jars ']'
+ SPARK_HOME=/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7
+ /usr/bin/R CMD build /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/pkg
* checking for file ‘/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/pkg/DESCRIPTION’ ... OK
* preparing ‘SparkR’:
* checking DESCRIPTION meta-information ... OK
* installing the package to build vignettes
* creating vignettes ... OK
* checking for LF line-endings in source and make files and shell scripts
* checking for empty or unneeded directories
* building ‘SparkR_2.4.9.tar.gz’

+ find pkg/vignettes/. -not -name . -not -name '*.Rmd' -not -name '*.md' -not -name '*.pdf' -not -name '*.html' -delete
++ grep Version /home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/pkg/DESCRIPTION
++ awk '{print $NF}'
+ VERSION=2.4.9
+ CRAN_CHECK_OPTIONS=--as-cran
+ '[' -n 1 ']'
+ CRAN_CHECK_OPTIONS='--as-cran --no-tests'
+ '[' -n 1 ']'
+ CRAN_CHECK_OPTIONS='--as-cran --no-tests --no-manual --no-vignettes'
+ echo 'Running CRAN check with --as-cran --no-tests --no-manual --no-vignettes options'
Running CRAN check with --as-cran --no-tests --no-manual --no-vignettes options
+ '[' -n 1 ']'
+ '[' -n 1 ']'
+ /usr/bin/R CMD check --as-cran --no-tests --no-manual --no-vignettes SparkR_2.4.9.tar.gz
* using log directory ‘/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/SparkR.Rcheck’
* using R version 3.6.3 (2020-02-29)
* using platform: x86_64-pc-linux-gnu (64-bit)
* using session charset: UTF-8
* using options ‘--no-tests --no-manual --no-vignettes --as-cran’
* checking for file ‘SparkR/DESCRIPTION’ ... OK
* checking extension type ... Package
* this is package ‘SparkR’ version ‘2.4.9’
* package encoding: UTF-8
* checking CRAN incoming feasibility ... NOTE
Maintainer: ‘Shivaram Venkataraman <shivaram@cs.berkeley.edu>’

New submission

Package was archived on CRAN

CRAN repository db overrides:
  X-CRAN-Comment: Archived on 2020-07-10 as check problems were not
    corrected in time.

  including overly restrictive and incorrect Java version requirement.
* checking package namespace information ... OK
* checking package dependencies ... OK
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for executable files ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... OK
* checking for sufficient/correct file permissions ... OK
* checking whether package ‘SparkR’ can be installed ... OK
* checking installed package size ... OK
* checking package directory ... OK
* checking for future file timestamps ... OK
* checking ‘build’ directory ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking R files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... OK
* checking whether the package can be loaded with stated dependencies ... OK
* checking whether the package can be unloaded cleanly ... OK
* checking whether the namespace can be loaded with stated dependencies ... OK
* checking whether the namespace can be unloaded cleanly ... OK
* checking loading without being on the library search path ... OK
* checking use of S3 registration ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... OK
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd line widths ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking installed files from ‘inst/doc’ ... OK
* checking files in ‘vignettes’ ... OK
* checking examples ... OK
* checking for unstated dependencies in ‘tests’ ... OK
* checking tests ... SKIPPED
* checking for unstated dependencies in vignettes ... OK
* checking package vignettes in ‘inst/doc’ ... OK
* checking running R code from vignettes ... SKIPPED
* checking re-building of vignette outputs ... SKIPPED
* checking for detritus in the temp directory ... OK
* DONE

Status: 1 NOTE
See
  ‘/home/jenkins/workspace/spark-branch-2.4-test-sbt-hadoop-2.7/R/SparkR.Rcheck/00check.log’
for details.


+ popd
Tests passed.
Archiving artifacts
Recording test results
[Checks API] No suitable checks publisher found.
Finished: SUCCESS