40 commits
1800a04
build: add Spark 4.0.2 version property and Scala 2.13 support
tkaymak Apr 17, 2026
435b395
refactor: make shared Spark source compatible with Scala 2.12 and 2.13
tkaymak Apr 17, 2026
335c340
build: add runners/spark/4/ build configuration
tkaymak Apr 17, 2026
7b2745b
feat: add Spark 4 structured streaming runner source
tkaymak Apr 17, 2026
b5de876
ci: add Spark 4 PreCommit and PostCommit workflows
tkaymak Apr 17, 2026
792801e
Add PreCommit Java Spark4 Versions workflow
tkaymak Apr 17, 2026
2c4c9ca
Add cancellation support to Spark pipeline execution
tkaymak Apr 17, 2026
1d38a7f
Remove unused endOfData() call in close method
tkaymak Apr 17, 2026
3709a9c
build: add Spark 4 job-server and container modules
tkaymak Apr 17, 2026
32947d3
build: remove spark.driver.host workaround from Spark 4 build
tkaymak Apr 17, 2026
ce7701b
docs: add Spark 4 runner entry to CHANGES.md
tkaymak Apr 17, 2026
e2e0135
docs: explain classic.SparkSession downcast in BoundedDatasetFactory
tkaymak Apr 17, 2026
8762e66
fix: log warning when neither WrappedArray nor ArraySeq class is found
tkaymak Apr 17, 2026
87e90dc
build: compare spark_version numerically via isSparkAtLeast helper
tkaymak Apr 17, 2026
864114b
[Spark Runner] Slim Spark 4 to override-only files
tkaymak Apr 17, 2026
83b3c8e
[Spark Runner] Use java.io.Serializable in DoFnRunnerFactory base
tkaymak Apr 21, 2026
a665cf3
[Spark Runner] Null-guard error message logging in EvaluationContext …
tkaymak Apr 21, 2026
1e2c8d7
[Spark Runner] Cancel execution future and use Beam-vendored Guava in…
tkaymak Apr 21, 2026
d0960eb
ci: re-trigger to clear flaky UnboundedScheduledExecutorServiceTest
tkaymak Apr 21, 2026
1e14c83
[Spark Runner] Fix maxTimestamp to handle multi-window values
tkaymak Apr 21, 2026
3b6dae2
[Spark Runner] Drop unchecked cast in BoundedDatasetFactory.split
tkaymak Apr 21, 2026
5636e59
[Spark Runner] Drop redundant Iterator cast in Spark 4 GroupByKeyTran…
tkaymak Apr 21, 2026
2bbd22b
[Spark Runner] Spark 4 EncoderFactory: stable constructor lookup + do…
tkaymak Apr 21, 2026
5bfcdd9
[Spark Runner] Fix Javadoc/comment typos flagged by Gemini
tkaymak Apr 21, 2026
4f6f47f
[Spark Runner] Switch EncoderFactory.invoke on the right constructor
tkaymak Apr 21, 2026
a08517b
Update CHANGES.md
tkaymak Apr 21, 2026
b8d49a1
[Spark 4] Drop redundant Collection cast in GroupByKeyHelpers
tkaymak Apr 22, 2026
3a29b47
[Spark 4] Add module README with slf4j-jdk14 known-issue note
tkaymak Apr 22, 2026
3631077
[runners-spark] Address Gemini nits: use encoder terminology in excep…
tkaymak Apr 22, 2026
ac00cc9
Trigger Build
tkaymak Apr 22, 2026
16639c8
ci: re-trigger to clear flaky FlinkRequiresStableInputTest
tkaymak Apr 23, 2026
8ebed02
ci: re-trigger to clear flaky SqsIOWriteBatchesTest.testWriteBatchesT…
tkaymak Apr 24, 2026
08eff30
ci: re-trigger to clear Maven Central 403 on Windows wordcount
tkaymak Apr 24, 2026
aa4e68e
ci: re-trigger to clear flaky Spotless + GCP IO Direct PreCommits
tkaymak Apr 25, 2026
f2d27e4
[runners-spark] Cover Scala-array fallback and EvaluationContext erro…
tkaymak Apr 25, 2026
17aae9b
[runners-spark] spotless: inline two findFirstAvailableClass calls in…
tkaymak Apr 26, 2026
8a55bb2
flaky SqsIOWriteBatchesTest retry
tkaymak Apr 28, 2026
467897d
flaky ExampleEchoPipelineTest retry
tkaymak Apr 28, 2026
2bf3607
rebase cleanup: drop duplicate isSparkAtLeast helper now in master vi…
tkaymak Apr 30, 2026
58bd932
Address @Abacn 2026-05-07 review
tkaymak May 7, 2026
@@ -0,0 +1,3 @@
{
"comment": "Modify this file in a trivial way to cause this test suite to run"
}
@@ -0,0 +1,3 @@
{
"comment": "Modify this file in a trivial way to cause this test suite to run"
Contributor:

Neither trigger file is needed: one is not yet effective, and the other is a leftover from a deleted workflow.

}
1 change: 1 addition & 0 deletions .github/workflows/README.md
@@ -373,6 +373,7 @@ PostCommit Jobs run in a schedule against master branch and generally do not get
| [ PostCommit Java ValidatesRunner Spark Java8 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark_Java8.yml) | N/A |`beam_PostCommit_Java_ValidatesRunner_Spark_Java8.json`| [![.github/workflows/beam_PostCommit_Java_ValidatesRunner_Spark_Java8.yml](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark_Java8.yml/badge.svg?event=schedule)](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark_Java8.yml?query=event%3Aschedule) |
| [ PostCommit Java ValidatesRunner Spark ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark.yml) | N/A |`beam_PostCommit_Java_ValidatesRunner_Spark.json`| [![.github/workflows/beam_PostCommit_Java_ValidatesRunner_Spark.yml](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark.yml/badge.svg?event=schedule)](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark.yml?query=event%3Aschedule) |
| [ PostCommit Java ValidatesRunner SparkStructuredStreaming ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.yml) | N/A |`beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.json`| [![.github/workflows/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.yml](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.yml/badge.svg?event=schedule)](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming.yml?query=event%3Aschedule) |
| [ PostCommit Java ValidatesRunner Spark4StructuredStreaming ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark4StructuredStreaming.yml) | N/A |`beam_PostCommit_Java_ValidatesRunner_Spark4StructuredStreaming.json`| [![.github/workflows/beam_PostCommit_Java_ValidatesRunner_Spark4StructuredStreaming.yml](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark4StructuredStreaming.yml/badge.svg?event=schedule)](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Spark4StructuredStreaming.yml?query=event%3Aschedule) |
| [ PostCommit Java ValidatesRunner Twister2 ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Twister2.yml) | N/A |`beam_PostCommit_Java_ValidatesRunner_Twister2.json`| [![.github/workflows/beam_PostCommit_Java_ValidatesRunner_Twister2.yml](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Twister2.yml/badge.svg?event=schedule)](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_Twister2.yml?query=event%3Aschedule) |
| [ PostCommit Java ValidatesRunner ULR ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_ULR.yml) | N/A |`beam_PostCommit_Java_ValidatesRunner_ULR.json`| [![.github/workflows/beam_PostCommit_Java_ValidatesRunner_ULR.yml](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_ULR.yml/badge.svg?event=schedule)](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java_ValidatesRunner_ULR.yml?query=event%3Aschedule) |
| [ PostCommit Java ](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java.yml) | N/A |`beam_PostCommit_Java.json`| [![.github/workflows/beam_PostCommit_Java.yml](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java.yml/badge.svg?event=schedule)](https://github.com/apache/beam/actions/workflows/beam_PostCommit_Java.yml?query=event%3Aschedule) |
@@ -0,0 +1,97 @@
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

name: PostCommit Java ValidatesRunner Spark4 StructuredStreaming

on:
  schedule:
    - cron: '45 4/6 * * *'
  pull_request_target:
    paths: ['release/trigger_all_tests.json', '.github/trigger_files/beam_PostCommit_Java_ValidatesRunner_Spark4StructuredStreaming.json']
  workflow_dispatch:

# Setting explicit permissions for the action to avoid the default permissions which are `write-all` in case of pull_request_target event
permissions:
  actions: write
  pull-requests: write
  checks: write
  contents: read
  deployments: read
  id-token: none
  issues: write
  discussions: read
  packages: read
  pages: read
  repository-projects: read
  security-events: read
  statuses: read

# This allows a subsequently queued workflow run to interrupt previous runs
concurrency:
  group: '${{ github.workflow }} @ ${{ github.event.pull_request.number || github.sha || github.head_ref || github.ref }}-${{ github.event.schedule || github.event.comment.id || github.event.sender.login }}'
  cancel-in-progress: true

env:
  DEVELOCITY_ACCESS_KEY: ${{ secrets.DEVELOCITY_ACCESS_KEY }}
  GRADLE_ENTERPRISE_CACHE_USERNAME: ${{ secrets.GE_CACHE_USERNAME }}
  GRADLE_ENTERPRISE_CACHE_PASSWORD: ${{ secrets.GE_CACHE_PASSWORD }}

jobs:
  beam_PostCommit_Java_ValidatesRunner_Spark4StructuredStreaming:
    name: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
    runs-on: [self-hosted, ubuntu-24.04, main]
    timeout-minutes: 120
    strategy:
      matrix:
        job_name: [beam_PostCommit_Java_ValidatesRunner_Spark4StructuredStreaming]
        job_phrase: [Run Spark4 StructuredStreaming ValidatesRunner]
    if: |
      github.event_name == 'workflow_dispatch' ||
      github.event_name == 'pull_request_target' ||
      (github.event_name == 'schedule' && github.repository == 'apache/beam') ||
      github.event.comment.body == 'Run Spark4 StructuredStreaming ValidatesRunner'
    steps:
      - uses: actions/checkout@v4
      - name: Setup repository
        uses: ./.github/actions/setup-action
        with:
          comment_phrase: ${{ matrix.job_phrase }}
          github_token: ${{ secrets.GITHUB_TOKEN }}
          github_job: ${{ matrix.job_name }} (${{ matrix.job_phrase }})
      - name: Setup environment
        uses: ./.github/actions/setup-environment-action
        with:
          java-version: '17'
      - name: run validatesStructuredStreamingRunnerBatch script
        uses: ./.github/actions/gradle-command-self-hosted-action
        with:
          gradle-command: :runners:spark:4:validatesStructuredStreamingRunnerBatch
          arguments: |
            -PtestJavaVersion=17 \
            -PdisableSpotlessCheck=true \
      - name: Archive JUnit Test Results
        uses: actions/upload-artifact@v4
        if: ${{ !success() }}
        with:
          name: JUnit Test Results
          path: "**/build/reports/tests/"
      - name: Publish JUnit Test Results
        uses: EnricoMi/publish-unit-test-result-action@v2
        if: always()
        with:
          commit: '${{ env.prsha || env.GITHUB_SHA }}'
          comment_mode: ${{ github.event_name == 'issue_comment' && 'always' || 'off' }}
          files: '**/build/test-results/**/*.xml'
          large_files: true
1 change: 0 additions & 1 deletion CHANGES.md
@@ -60,7 +60,6 @@
## Highlights

* New highly anticipated feature X added to Python SDK ([#X](https://github.com/apache/beam/issues/X)).
* New highly anticipated feature Y added to Java SDK ([#Y](https://github.com/apache/beam/issues/Y)).
Contributor:

Possibly a rebase error. Just revert the change on this branch.


## I/Os

2 changes: 1 addition & 1 deletion gradle.properties
@@ -41,6 +41,6 @@ docker_image_default_repo_prefix=beam_
# supported flink versions
flink_versions=1.17,1.18,1.19,1.20,2.0
# supported spark versions
spark_versions=3
spark_versions=3,4
# supported python versions
python_versions=3.10,3.11,3.12,3.13,3.14
73 changes: 73 additions & 0 deletions runners/spark/4/README.md
@@ -0,0 +1,73 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Apache Beam Spark 4 Runner

Experimental Beam runner for Apache Spark 4 (batch-only). Built on the shared
`runners/spark` source base via `spark_runner.gradle`'s per-version
source-overrides mechanism: this module contributes the small set of files
under `src/main/java/.../structuredstreaming/` that diverge from the Spark 3
implementation. See the parent `runners/spark/` module for the bulk of the
runner code.

## Requirements

* **Spark 4.0.2** (and other Spark 4.0.x patch releases)
* **Scala 2.13**
* **Java 17** (Spark 4 does not run on earlier JDKs)

## Status

Batch only. Streaming is tracked in
[#36841](https://github.com/apache/beam/issues/36841).
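## Running the batch ValidatesRunner suite

For local verification, the same Gradle task that the PostCommit workflow invokes can be run directly (task name and flags are taken from the workflow added in this PR; JDK 17 is required):

```shell
# Runs the Spark 4 structured-streaming batch ValidatesRunner suite,
# mirroring the PostCommit workflow's gradle-command step.
./gradlew :runners:spark:4:validatesStructuredStreamingRunnerBatch \
    -PtestJavaVersion=17 \
    -PdisableSpotlessCheck=true
```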

## Known issues

### `StackOverflowError` from `slf4j-jdk14` on the runtime classpath

Spark 4 ships `org.slf4j:jul-to-slf4j` to route `java.util.logging` records
into SLF4J. If `org.slf4j:slf4j-jdk14`, which routes the other direction
(SLF4J → JUL), is also resolved at runtime, the first log line triggers
infinite mutual recursion:

```
java.lang.StackOverflowError
at org.slf4j.bridge.SLF4JBridgeHandler.publish(...)
at java.util.logging.Logger.log(...)
at org.slf4j.impl.JDK14LoggerAdapter.log(...)
at org.slf4j.bridge.SLF4JBridgeHandler.publish(...)
...
```

This is the same condition that broke the Spark 3 runner in
[#26985](https://github.com/apache/beam/issues/26985), fixed in
[#27001](https://github.com/apache/beam/pull/27001).

The shared `spark_runner.gradle` already excludes `slf4j-jdk14` from the
runner module's own `configurations.all`, so in-tree builds are unaffected.
Downstream Gradle consumers that assemble a runtime classpath against
`beam-runners-spark-4` should mirror that exclude:

```groovy
configurations.all {
exclude group: "org.slf4j", module: "slf4j-jdk14"
}
```

For Maven, exclude `org.slf4j:slf4j-jdk14` from any dependency that pulls it
transitively (commonly the Beam SDK harness and several IO connectors).
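For Maven, a sketch of the corresponding exclusion is shown below. The artifact used here, `beam-sdks-java-harness`, is only an illustrative example of a dependency that may pull `slf4j-jdk14` transitively; inspect your own dependency tree (e.g. with `mvn dependency:tree`) to find the actual culprits, and `${beam.version}` is a placeholder property:

```xml
<dependency>
  <groupId>org.apache.beam</groupId>
  <artifactId>beam-sdks-java-harness</artifactId>
  <version>${beam.version}</version>
  <exclusions>
    <!-- Avoid the SLF4J <-> JUL logging loop described above -->
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-jdk14</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```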
53 changes: 53 additions & 0 deletions runners/spark/4/build.gradle
@@ -0,0 +1,53 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

def basePath = '..'
/* All properties required for loading the Spark build script */
project.ext {
  spark_major = '4'
  // Spark 4 version as defined in BeamModulePlugin; requires Scala 2.13 and Java 17
  spark_version = spark4_version
  spark_scala_version = '2.13'
  archives_base_name = 'beam-runners-spark-4'
}

// Load the main build script which contains all build logic.
// spark_runner.gradle handles the per-version source-overrides Copy:
// shared base (runners/spark/src/) + previous majors + this module's ./src/ are
// merged into build/source-overrides/src using DuplicatesStrategy.INCLUDE so the
// 11 files under runners/spark/4/src/.../structuredstreaming/ override the
// shared-base versions.
apply from: "$basePath/spark_runner.gradle"

// Spark 4 always requires Java 17, so unconditionally add the --add-opens flags
// required by Kryo and other libraries that use reflection on JDK internals.
test {
  jvmArgs "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED",
      "--add-opens=java.base/java.nio=ALL-UNNAMED",
      "--add-opens=java.base/java.util=ALL-UNNAMED",
      "--add-opens=java.base/java.lang.invoke=ALL-UNNAMED"
}

// Exclude DStream-based streaming tests from the shared-base copy: the Spark 4 module
// supports only structured streaming (batch) and does not include legacy DStream support.
// Streaming test utilities also depend on kafka.server.KafkaServerStartable which was
// removed in Kafka 2.8.0 (the first Kafka version with a _2.13 artifact).
tasks.named("copyTestSourceOverrides") {
  exclude "**/translation/streaming/**"
}

31 changes: 31 additions & 0 deletions runners/spark/4/job-server/build.gradle
@@ -0,0 +1,31 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

def basePath = '../../job-server'

project.ext {
  // Look for the source code in the parent module
  main_source_dirs = ["$basePath/src/main/java"]
  test_source_dirs = ["$basePath/src/test/java"]
  main_resources_dirs = ["$basePath/src/main/resources"]
  test_resources_dirs = ["$basePath/src/test/resources"]
  archives_base_name = 'beam-runners-spark-4-job-server'
}

// Load the main build script which contains all build logic.
apply from: "$basePath/spark_job_server.gradle"