
Build / Python-only (master, Python 3.13) #411

Triggered via schedule: November 11, 2025 20:00
Status: Success
Total duration: 1h 45m 55s
Artifacts: 9

build_python_3.13.yml

on: schedule
Run / Check changes (37s)
Run / Base image build (57s)
Run / Protobuf breaking change detection and Python CodeGen check (0s)
Run / Java 17 build with Maven
Run / Java 25 build with Maven
Run / Run TPC-DS queries with SF=1
Run / Run Docker integration tests
Run / Run Spark on Kubernetes Integration test
Run / Run Spark UI tests (0s)
Matrix: Run / build
Run / Build modules: sparkr (0s)
Run / Linters, licenses, and dependencies
Run / Documentation generation (0s)
Matrix: Run / pyspark

Annotations

3 warnings, all from Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger:

- No files were found with the provided path: **/target/test-reports/*.xml **/target/surefire-reports/*.xml. No artifacts will be uploaded.
- WARNING conda.cli.main_config:_set_key(451): Key auto_activate_base is an alias of auto_activate; setting value with latter
- The 'defaults' channel might have been added implicitly. If this is intentional, add 'defaults' to the 'channels' list. Otherwise, consider setting 'conda-remove-defaults' to 'true'.
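
The first warning above means the two report globs matched no files in the job workspace, so the test-report upload step had nothing to collect. As a minimal sketch (run from an assumed local repository checkout, not part of the workflow itself), the same patterns can be checked with pathlib:

    # Reproduce the report globs locally to see whether any
    # JUnit XML test reports exist under the current directory.
    from pathlib import Path

    patterns = ["**/target/test-reports/*.xml", "**/target/surefire-reports/*.xml"]
    matches = [p for pattern in patterns for p in Path(".").glob(pattern)]

    if matches:
        for path in sorted(matches):
            print(path)
    else:
        # Same condition the CI warning reports: nothing to upload.
        print("No files were found with the provided path(s).")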

Artifacts

Produced during runtime
Name | Size | Digest
apache~spark~LO1RGP.dockerbuild | 28.5 KB | sha256:320ceeca6e3d6ae1083c72714da70e2ff45f2306800865e44415a503d50457e5
test-results-pyspark-connect--17-hadoop3-hive2.3-python3.13 | 227 KB | sha256:1add07c1663356878aec1105ebc3a8551214a93ea231fbacc11d68496569d725
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines--17-hadoop3-hive2.3-python3.13 | 170 KB | sha256:2e2e37549bc57558dab5ba5d2b986325268230572b863c7099933fe72f90775d
test-results-pyspark-pandas--17-hadoop3-hive2.3-python3.13 | 196 KB | sha256:f01d0c1174ec9b4ce44233b9608352fed2fc62d37d9f16ccd395e954364e783b
test-results-pyspark-pandas-connect--17-hadoop3-hive2.3-python3.13 | 206 KB | sha256:4d9618a81cac54cbe32cf7cf0fda799f65bef0eb8c860707f595ddc3f3f58897
test-results-pyspark-pandas-slow--17-hadoop3-hive2.3-python3.13 | 173 KB | sha256:5e5d035f040e9cbfbcd243486b098b3872202e5b671194ff508e4bf3da19ea0c
test-results-pyspark-pandas-slow-connect--17-hadoop3-hive2.3-python3.13 | 183 KB | sha256:e528c1c36d4d0e7cf162cd531118c0cb55827325d9eee9fa9baef4daca963a79
test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3-python3.13 | 250 KB | sha256:c0aa51af87e6edfb49534420684e95c54d065cf6329ce460a8b3bf9667b8df2b
test-results-pyspark-structured-streaming, pyspark-structured-streaming-connect--17-hadoop3-hive2.3-python3.13 | 41 KB | sha256:9f65707032fa4eaa80e987f6fbce022a1401b4c0681c87d1ae24e873e20d4ec6
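
Each artifact's sha256 digest above can be checked after download. The sketch below is a generic verification helper, assuming the listed digest covers the downloaded file exactly as it lands on disk; the local file name used here is hypothetical:

    # Verify a downloaded artifact file against a digest of the form "sha256:<hex>".
    import hashlib
    from pathlib import Path

    def verify_sha256(path: str, expected: str) -> bool:
        """Return True if the file's sha256 matches the expected 'sha256:<hex>' digest."""
        digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
        return f"sha256:{digest}" == expected

    # Hypothetical local path; digest taken from the table above.
    ok = verify_sha256(
        "apache~spark~LO1RGP.dockerbuild",
        "sha256:320ceeca6e3d6ae1083c72714da70e2ff45f2306800865e44415a503d50457e5",
    )
    print("digest OK" if ok else "digest MISMATCH")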