Update release policy page #688
Conversation
| <li><strong>FEATURE</strong>: Feature releases will typically contain new features, improvements, and bug fixes.
| Each feature release will have a merge window where new patches can be merged, a QA window when

| <li><strong>MAJOR</strong>: Major releases occur annually, third-party dependency
| upgrades, and major code refactoring. All releases with the same major version number will have
Major code refactoring does not need to wait for the yearly major release; we should mention breaking changes here instead.
| <td>6 months</td>
| </tr>
| <tr>
| <td>LTS (final feature of a major)</td>
shall we make it explicit that every x.3 release is the LTS?
Since this PR already mentions two exceptions like the following, it would be great to be more specific by giving examples like 5.3 or 6.3 in addition to "every x.3 release".
As an exception from the normal versioning policy, version 3.5.x has an ...
For example, Spark 4.5 (the final 4.x feature release) would be maintained for 18 months from its release date.
| <ul>
| <li><strong>MAINTENANCE</strong>: Maintenance releases will occur on an ad hoc basis depending on specific patches
| introduced (e.g. critical bug fixes and security patches) and their urgency. In general these releases
| are designed to patch bugs. However, higher level libraries may introduce small features, such as a
- First, what is the definition of "higher level libraries"? In this context, it could mean every module except `Spark Core`; in other words, `SQL`, `MLLIB`, `GRAPHX`, and so on.
- Second, why do we need to allow this in maintenance releases? I believe we had better keep the policy simple. In other words, no new features at all.
However, higher level libraries may introduce small features, such as a new algorithm, provided they are entirely additive and isolated from existing code paths.
Bootstrapping is always difficult. Given that, why don't we give an explicit example like a 2026 and 2027 release plan, @HyukjinKwon and @cloud-fan?
For example, I guessed the following, but it seems that this PR has a different timeline.
- 4.2.0 (The original plan): 2026 Summer + 18 months.
- 4.3.0 (LTS based on SPIP)
- 4.3.0 RC1 will start after 3 months from the release day of Apache Spark 4.2.0 in 2026.
- In 2027, Spark 5 will start as the annual release with the quarterly feature releases.
- Spark 5.0: 2027.1
- Spark 5.1: 2027.4
- Spark 5.2: 2027.7
- Spark 5.3: 2027.10 (LTS based on SPIP)
I just want to understand our plan of migration toward this SPIP, so an explicit example for 2026 and 2027 would be perfect for me and everyone. The above is only my personal understanding.
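The quarterly cadence and maintenance windows guessed above can be sketched in code. This is only an illustration of the commenter's hypothetical timeline: the 5.x version numbers, the 2027 start date, and treating x.3 as the 18-month LTS are assumptions from this comment, not a committed plan.

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Shift a date forward by a whole number of months (day clamped to 1)."""
    total = d.year * 12 + (d.month - 1) + months
    return date(total // 12, total % 12 + 1, 1)

def schedule(major: int, start: date):
    """Yield (version, release_date, end_of_maintenance) for x.0 through x.3,
    assuming a feature release every 3 months, each maintained for 6 months,
    with the final x.3 release treated as LTS (18 months)."""
    for minor in range(4):
        released = add_months(start, 3 * minor)
        window = 18 if minor == 3 else 6  # x.3 is the LTS in this sketch
        yield f"{major}.{minor}", released, add_months(released, window)

for version, released, eol in schedule(5, date(2027, 1, 1)):
    print(version, released, eol)
```

Under these assumptions, 5.0 through 5.2 each get a 6-month maintenance window, and 5.3 (released 2027.10) would be maintained into 2029.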
cc @peter-toth, too.
| The following table summarizes the maintenance window for each release type:
| | Release Type | Cadence | Maintenance Window |
| | ----- | ----- | ----- |
Don't we need a Major type with version (x.0.0) here?
| | Release Type | Cadence | Maintenance Window |
| | ----- | ----- | ----- |
| | Feature (x.y) | Every 3 months | 6 months |
Shall we use version (x.1+.0) for Feature and (x.y.1+) for Maintenance?
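The version-pattern suggestion above (Major as x.0.0, Feature as x.1+.0, Maintenance as x.y.1+) can be sketched as a tiny classifier. These rules are the reviewer's proposal, not settled policy:

```python
def release_type(version: str) -> str:
    """Classify a version string under the proposed patterns:
    Major = x.0.0, Feature = x.1+.0, Maintenance = x.y.1+."""
    major, minor, patch = (int(p) for p in version.split("."))
    if patch > 0:
        return "Maintenance"
    if minor == 0:
        return "Major"
    return "Feature"

print(release_type("5.0.0"))  # Major
print(release_type("5.2.0"))  # Feature
print(release_type("5.2.1"))  # Maintenance
```

Making the Major row explicit in the table would let readers apply exactly this kind of rule.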
This PR updates the Release Policy page per SPIP: Accelerating Apache Spark Release Cadence.