flink 1.19 #53

Draft · wants to merge 48 commits into master

Changes from all commits · 48 commits
3d67a2b  Update for 1.19.0 (lincoln-lil, Feb 7, 2024)
74057eb  [FLINK-34360][ci] Adds cleanup to job_init action (XComp, Feb 6, 2024)
cfdf05c  [hotfix][build] Fixes error message when printing the 15 biggest dire… (XComp, Feb 6, 2024)
8b28900  [FLINK-34200][test] Fix the bug that AutoRescalingITCase.testCheckpoi… (1996fanrui, Feb 1, 2024)
296f140  [FLINK-34420] Correct hadoop.tar.gz download url (rkhachatryan, Feb 11, 2024)
3775656  [FLINK-34344] Pass JobID to CheckpointStatsTracker (rkhachatryan, Feb 2, 2024)
04d3b1b  [FLINK-33958] Fix IntervalJoin restore test flakiness (bvarghese1, Jan 25, 2024)
994850d  [FLINK-34422] Migrate BatchTestBase subclass to jUnit5 (zentol, Feb 11, 2024)
3fcbe3d  [FLINK-34422][test] BatchTestBase uses MiniClusterExtension (zentol, Feb 11, 2024)
e7ac988  [hotfix] Rename ProtobufSQLITCaseTest -> ProtobufSQLITCase (zentol, Feb 12, 2024)
891a975  [FLINK-33960][JUnit5 migration] Migrate SlotSharingSlotAllocatorTest … (1996fanrui, Jan 1, 2024)
d039c6f  [FLINK-33960][Scheduler] Fix the bug that Adaptive Scheduler doesn't … (1996fanrui, Jan 1, 2024)
b7c935d  [hotfix][ci] Removes obsolete line (XComp, Feb 13, 2024)
c6c1f79  [FLINK-34418][ci] Mounts /mnt folder to /root (XComp, Feb 16, 2024)
f903ce2  [hotfix][test] Removes duplicate Apache header (XComp, Feb 15, 2024)
ec73bc5  [FLINK-22765][test] Hardens ExceptionUtilsITCase#testIsMetaspaceOutOf… (XComp, Feb 15, 2024)
8cf2996  [hotfix][test] Assert the slot allocation eventually succeed in dedic… (KarmaGYZ, Feb 19, 2024)
45d4dc1  [FLINK-34434][slotmanager] Complete the returnedFuture when slot remo… (KarmaGYZ, Feb 19, 2024)
4f7cc9f  [hotfix][docs] Integrate mongodb v1.1 docs (leonardBang, Feb 20, 2024)
a9bec20  [hotfix][docs] Update the versions of mongodb supported by mongodb-co… (Jiabao-Sun, Feb 20, 2024)
b7ea090  [hotfix][tests] Disables cool down phase for faster test execution (1996fanrui, Feb 1, 2024)
c91029b  [FLINK-34336][test] Fix the bug that AutoRescalingITCase may hang som… (1996fanrui, Feb 2, 2024)
a25fca9  [FLINK-34202][python] Optimize Python nightly CI time (#24321) (HuangXingBo, Feb 21, 2024)
d19e886  [FLINK-34479][documentation] Fix missed changelog configs in the docu… (masteryhx, Feb 21, 2024)
f21ee01  [FLINK-34476][table-planner] Consider assignment operator during TVF … (twalthr, Feb 22, 2024)
dd77ee5  [FLINK-34496] Break circular dependency in static initialization (zentol, Feb 23, 2024)
0af2540  [FLINK-34265][doc] Add the doc of named parameters (#24377) (hackergin, Feb 26, 2024)
d743ee3  [FLINK-34518][runtime] Fixes AdaptiveScheduler#suspend bug when the j… (XComp, Feb 26, 2024)
62d1b8f  [hotfix][runtime] Refactors suspend and cancel logic (XComp, Feb 26, 2024)
3c04316  [BP-1.19][FLINK-34274][runtime] Implicitly disable resource wait time… (XComp, Feb 28, 2024)
628ae78  [FLINK-34498] GSFileSystemFactory should not log full Flink config (jeyhunkarimov, Feb 22, 2024)
5016325  [FLINK-34499] Configuration#toString hides sensitive values (zentol, Feb 22, 2024)
12ea64c  [FLINK-33436][docs] Add the docs of built-in async-profiler (yuchen-ecnu, Feb 28, 2024)
161defe  [FLINK-34522][core] Changing the Time to Duration for StateTtlConfig.… (1996fanrui, Feb 27, 2024)
697d2b6  [hotfix] Fix the StateTtlConfig#newBuilder doc from Time to Duration (1996fanrui, Feb 27, 2024)
7618bde  [Hotfix] Fix Duration class can't load for pyflink (RocMarshal, Mar 1, 2024)
fa738bb  [FLINK-34582][realse][python] Updates cibuildwheel to support cpython… (HuangXingBo, Mar 6, 2024)
837f8e5  Revert "[FLINK-33532][network] Move the serialization of ShuffleDescr… (Mar 6, 2024)
75c88fa  [FLINK-34616][python] Fix python dist dir doesn't clean when open met… (liuyongvs, Mar 8, 2024)
0a85a08  [FLINK-34622][docs] fix typo in execution_mode.md (yuchen-ecnu, Mar 8, 2024)
c6d96b7  [FLINK-34617][docs] Correct the Javadoc of org.apache.flink.api.commo… (Myasuka, Mar 7, 2024)
943d9a4  [FLINK-33798][statebackend/rocksdb] automatically clean up rocksdb l… (liming30, Mar 13, 2024)
4d5327d  [FLINK-29114][connector][filesystem] Fix issue of file overwriting ca… (LadyForest, Mar 14, 2024)
4f7f6a9  [FLINK-34571][test] Fix flaky test SortMergeResultPartitionReadSchedu… (reswqa, Mar 8, 2024)
511814b  [FLINK-34593][release] Add release note for version 1.19 (lincoln-lil, Mar 18, 2024)
a6a4667  [FLINK-34716][release] Build 1.19 docs in GitHub Action and mark 1.19… (lincoln-lil, Mar 18, 2024)
b6ba252  [hotfix] Fix maven property typo in root pom.xml (RyanSkraba, Mar 19, 2024)
10dc234  Remove raw (juha-aiven, Feb 4, 2022)
24 changes: 24 additions & 0 deletions .github/actions/job_init/action.yml
@@ -44,6 +44,30 @@ runs:
echo "GHA_PIPELINE_START_TIME=${job_start_time}" >> "${GITHUB_ENV}"
echo "The job's start time is set to ${job_start_time}."

- name: "Pre-cleanup Disk Info"
shell: bash
run: df -h

- name: "Delete unused binaries"
shell: bash
run: |
# inspired by https://github.com/easimon/maximize-build-space
for label_with_path in \
"Android SDK:/usr/local/lib/android" \
"CodeQL:/opt/hostedtoolcache/CodeQL" \
".NET:/usr/share/dotnet"; do
dependency_name="$(echo "${label_with_path}" | cut -d: -f1)"
dependency_path="$(echo "${label_with_path}" | cut -d: -f2)"

if [ -d "${dependency_path}" ]; then
echo "[INFO] Deleting binaries of ${dependency_name} in ${dependency_path}."
sudo rm -rf "${dependency_path}"
df -h
else
echo "[INFO] The directory '${dependency_path}' doesn't exist. ${dependency_name} won't be removed."
fi
done

- name: "Set JDK version to ${{ inputs.jdk_version }}"
shell: bash
run: |
6 changes: 3 additions & 3 deletions .github/workflows/docs.yml
@@ -28,9 +28,9 @@ jobs:
      matrix:
        branch:
          - master
+         - release-1.19
          - release-1.18
          - release-1.17
-         - release-1.16
steps:
- uses: actions/checkout@v3
with:
@@ -42,8 +42,8 @@
echo "flink_branch=${currentBranch}" >> ${GITHUB_ENV}

if [ "${currentBranch}" = "master" ]; then
echo "flink_alias=release-1.19" >> ${GITHUB_ENV}
elif [ "${currentBranch}" = "release-1.18" ]; then
echo "flink_alias=release-1.20" >> ${GITHUB_ENV}
elif [ "${currentBranch}" = "release-1.19" ]; then
echo "flink_alias=stable" >> ${GITHUB_ENV}
fi
- name: Build documentation
14 changes: 14 additions & 0 deletions .github/workflows/template.flink-ci.yml
@@ -177,6 +177,9 @@ jobs:
# --init makes the process in the container being started as an init process which will clean up any daemon processes during shutdown
# --privileged allows writing coredumps in docker (FLINK-16973)
options: --init --privileged
+      # the /mnt folder is a separate disk mounted to the host filesystem with more free disk space that can be utilized
+      volumes:
+        - /mnt:/root
env:
# timeout in minutes - this environment variable is required by uploading_watchdog.sh
GHA_JOB_TIMEOUT: 240
@@ -251,6 +254,17 @@ jobs:
${{ inputs.environment }} PROFILE="$PROFILE -Pgithub-actions" ./tools/azure-pipelines/uploading_watchdog.sh \
./tools/ci/test_controller.sh ${{ matrix.module }}

- name: "Post-build Disk Info"
if: ${{ always() }}
shell: bash
run: df -h

- name: "Top 15 biggest directories in terms of used disk space"
if: ${{ always() }}
shell: bash
run: |
du -ah --exclude="proc" -t100M . | sort -h -r | head -n 15

- name: "Post-process build artifacts"
working-directory: ${{ env.CONTAINER_LOCAL_WORKING_DIR }}
run: find ${{ steps.test-run.outputs.debug-files-output-dir }} -type f -exec rename 's/[:<>|*?]/-/' {} \;
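The post-processing step above renames debug artifacts to strip characters that are invalid on some filesystems. For readers who don't parse Perl `rename` one-liners, here is a rough Java equivalent of that expression; a sketch only, the class name is illustrative and not part of this PR:

```java
import java.util.regex.Pattern;

public class FilenameSanitizer {
    // Same character class as the rename expression above. Like the original
    // 's/[:<>|*?]/-/' (no /g flag), only the first offending character is replaced.
    private static final Pattern INVALID = Pattern.compile("[:<>|*?]");

    public static String sanitize(String name) {
        return INVALID.matcher(name).replaceFirst("-");
    }

    public static void main(String[] args) {
        System.out.println(sanitize("mvn-1:log")); // prints mvn-1-log
    }
}
```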
23 changes: 12 additions & 11 deletions docs/config.toml
@@ -14,7 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.

-baseURL = '//nightlies.apache.org/flink/flink-docs-master'
+baseURL = '//nightlies.apache.org/flink/flink-docs-release-1.19'
languageCode = "en-us"
title = "Apache Flink"
enableGitInfo = false
@@ -24,7 +24,7 @@ pygmentsUseClasses = true
[params]
# Flag whether this is a stable version or not.
# Used for the quickstart page.
-IsStable = false
+IsStable = true

# Flag to indicate whether an outdated warning should be shown.
ShowOutDatedWarning = false
@@ -34,14 +34,14 @@ pygmentsUseClasses = true
# we change the version for the complete docs when forking of a release branch
# etc.
# The full version string as referenced in Maven (e.g. 1.2.1)
Version = "1.19-SNAPSHOT"
Version = "1.19.0"

# For stable releases, leave the bugfix version out (e.g. 1.2). For snapshot
# release this should be the same as the regular version
VersionTitle = "1.19-SNAPSHOT"
VersionTitle = "1.19"

# The branch for this version of Apache Flink
Branch = "master"
Branch = "release-1.19"

# The github repository for Apache Flink
Repo = "//github.com/apache/flink"
@@ -60,22 +60,23 @@ pygmentsUseClasses = true

ZhDownloadPage = "//flink.apache.org/zh/downloads.html"

-JavaDocs = "//nightlies.apache.org/flink/flink-docs-master/api/java/"
+JavaDocs = "//nightlies.apache.org/flink/flink-docs-release-1.19/api/java/"

-ScalaDocs = "//nightlies.apache.org/flink/flink-docs-master/api/scala/index.html#org.apache.flink.api.scala.package"
+ScalaDocs = "//nightlies.apache.org/flink/flink-docs-release-1.19/api/scala/index.html#org.apache.flink.api.scala.package"

-PyDocs = "//nightlies.apache.org/flink/flink-docs-master/api/python/"
+PyDocs = "//nightlies.apache.org/flink/flink-docs-release-1.19/api/python/"

# External links at the bottom
# of the menu
MenuLinks = [
["Project Homepage", "//flink.apache.org"],
["JavaDocs", "//nightlies.apache.org/flink/flink-docs-master/api/java/"],
["ScalaDocs", "//nightlies.apache.org/flink/flink-docs-master/api/scala/index.html#org.apache.flink.api.scala.package"],
["PyDocs", "//nightlies.apache.org/flink/flink-docs-master/api/python/"]
["JavaDocs", "//nightlies.apache.org/flink/flink-docs-release-1.19/api/java/"],
["ScalaDocs", "//nightlies.apache.org/flink/flink-docs-release-1.19/api/scala/index.html#org.apache.flink.api.scala.package/"],
["PyDocs", "//nightlies.apache.org/flink/flink-docs-release-1.19/api/python/"]
]

PreviousDocs = [
["1.19", "http://nightlies.apache.org/flink/flink-docs-release-1.19"],
["1.18", "http://nightlies.apache.org/flink/flink-docs-release-1.18"],
["1.17", "http://nightlies.apache.org/flink/flink-docs-release-1.17"],
["1.16", "http://nightlies.apache.org/flink/flink-docs-release-1.16"],
3 changes: 2 additions & 1 deletion docs/content.zh/_index.md
@@ -85,7 +85,8 @@ under the License.
For some reason Hugo will only allow linking to the
release notes if there is a leading '/' and file extension.
-->
-请参阅 [Flink 1.19]({{< ref "/release-notes/flink-1.18.md" >}}),
+请参阅 [Flink 1.19]({{< ref "/release-notes/flink-1.19.md" >}}),
+[Flink 1.18]({{< ref "/release-notes/flink-1.18.md" >}}),
[Flink 1.17]({{< ref "/release-notes/flink-1.17.md" >}}),
[Flink 1.16]({{< ref "/release-notes/flink-1.16.md" >}}),
[Flink 1.15]({{< ref "/release-notes/flink-1.15.md" >}}),
2 changes: 1 addition & 1 deletion docs/content.zh/docs/connectors/table/overview.md
@@ -110,7 +110,7 @@ Flink natively support various connectors. The following tables list all availab
</tr>
<tr>
<td><a href="{{< ref "docs/connectors/table/mongodb" >}}">MongoDB</a></td>
-<td></td>
+<td>3.6.x & 4.x & 5.x & 6.0.x</td>
<td>Bounded Scan, Lookup</td>
<td>Streaming Sink, Batch Sink</td>
</tr>
2 changes: 1 addition & 1 deletion docs/content.zh/docs/deployment/config.md
@@ -466,7 +466,7 @@ Advanced options to tune RocksDB and RocksDB checkpoints.
### State Changelog Options

Please refer to [State Backends]({{< ref "docs/ops/state/state_backends#enabling-changelog" >}}) for information on
-using State Changelog.
+using State Changelog. {{< generated/state_backend_changelog_section >}}

#### FileSystem-based Changelog options

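The added shortcode pulls the changelog option table into the page. For readers unfamiliar with the feature, a minimal sketch of enabling it from application code, assuming the standard `StreamExecutionEnvironment#enableChangelogStateBackend` API (the class name below is illustrative, not part of this PR):

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ChangelogToggleSketch {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Mirrors the 'state.backend.changelog.enabled' configuration key that the
        // generated section referenced above documents.
        env.enableChangelogStateBackend(true);
    }
}
```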
2 changes: 1 addition & 1 deletion docs/content.zh/docs/dev/datastream/execution_mode.md
@@ -49,7 +49,7 @@ Apache Flink 对流处理和批处理采取统一的处理方式，这意味着

## 配置批执行模式

-执行模式可以通过 `execute.runtime-mode` 设置来配置。有三种可选的值:
+执行模式可以通过 `execution.runtime-mode` 设置来配置。有三种可选的值:

- `STREAMING`: 经典 DataStream 执行模式(默认)
- `BATCH`: 在 DataStream API 上进行批量式执行
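The typo fix above matters because only the corrected key is read by the runtime. For illustration, the same mode can also be selected programmatically; a minimal sketch assuming standard DataStream API usage (the class name is illustrative):

```java
import org.apache.flink.api.common.RuntimeExecutionMode;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class RuntimeModeSketch {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Equivalent to 'execution.runtime-mode: BATCH'; STREAMING is the default,
        // and AUTOMATIC picks a mode based on the boundedness of the sources.
        env.setRuntimeMode(RuntimeExecutionMode.BATCH);
    }
}
```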
29 changes: 15 additions & 14 deletions docs/content.zh/docs/dev/datastream/fault-tolerance/state.md
@@ -305,7 +305,7 @@ import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.time.Time;

StateTtlConfig ttlConfig = StateTtlConfig
-    .newBuilder(Time.seconds(1))
+    .newBuilder(Duration.ofSeconds(1))
.setUpdateType(StateTtlConfig.UpdateType.OnCreateAndWrite)
.setStateVisibility(StateTtlConfig.StateVisibility.NeverReturnExpired)
.build();
@@ -321,7 +321,7 @@ import org.apache.flink.api.common.state.ValueStateDescriptor
import org.apache.flink.api.common.time.Time

val ttlConfig = StateTtlConfig
-    .newBuilder(Time.seconds(1))
+    .newBuilder(Duration.ofSeconds(1))
.setUpdateType(StateTtlConfig.UpdateType.OnCreateAndWrite)
.setStateVisibility(StateTtlConfig.StateVisibility.NeverReturnExpired)
.build
@@ -399,7 +399,7 @@ Heap state backend 会额外存储一个包括用户状态以及时间戳的 Jav
import org.apache.flink.api.common.state.StateTtlConfig;

StateTtlConfig ttlConfig = StateTtlConfig
-    .newBuilder(Time.seconds(1))
+    .newBuilder(Duration.ofSeconds(1))
.disableCleanupInBackground()
.build();
```
@@ -409,7 +409,7 @@ StateTtlConfig ttlConfig = StateTtlConfig
import org.apache.flink.api.common.state.StateTtlConfig

val ttlConfig = StateTtlConfig
-    .newBuilder(Time.seconds(1))
+    .newBuilder(Duration.ofSeconds(1))
.disableCleanupInBackground
.build
```
@@ -441,7 +441,7 @@ import org.apache.flink.api.common.state.StateTtlConfig;
import org.apache.flink.api.common.time.Time;

StateTtlConfig ttlConfig = StateTtlConfig
-    .newBuilder(Time.seconds(1))
+    .newBuilder(Duration.ofSeconds(1))
.cleanupFullSnapshot()
.build();
```
@@ -452,7 +452,7 @@
import org.apache.flink.api.common.time.Time

val ttlConfig = StateTtlConfig
.newBuilder(Time.seconds(1))
.newBuilder(Duration.ofSeconds(1))
.cleanupFullSnapshot
.build
```
Expand Down Expand Up @@ -487,7 +487,7 @@ ttl_config = StateTtlConfig \
```java
import org.apache.flink.api.common.state.StateTtlConfig;
StateTtlConfig ttlConfig = StateTtlConfig
.newBuilder(Time.seconds(1))
.newBuilder(Duration.ofSeconds(1))
.cleanupIncrementally(10, true)
.build();
```
Expand All @@ -496,7 +496,7 @@ import org.apache.flink.api.common.state.StateTtlConfig;
```scala
import org.apache.flink.api.common.state.StateTtlConfig
val ttlConfig = StateTtlConfig
.newBuilder(Time.seconds(1))
.newBuilder(Duration.ofSeconds(1))
.cleanupIncrementally(10, true)
.build
```
Expand Down Expand Up @@ -537,8 +537,8 @@ Flink 提供的 RocksDB 压缩过滤器会在压缩时过滤掉已经过期的
import org.apache.flink.api.common.state.StateTtlConfig;

StateTtlConfig ttlConfig = StateTtlConfig
.newBuilder(Time.seconds(1))
.cleanupInRocksdbCompactFilter(1000, Time.hours(1))
.newBuilder(Duration.ofSeconds(1))
.cleanupInRocksdbCompactFilter(1000, Duration.ofHours(1))
.build();
```
{{< /tab >}}
Expand All @@ -547,19 +547,20 @@ StateTtlConfig ttlConfig = StateTtlConfig
import org.apache.flink.api.common.state.StateTtlConfig

val ttlConfig = StateTtlConfig
-    .newBuilder(Time.seconds(1))
-    .cleanupInRocksdbCompactFilter(1000, Time.hours(1))
+    .newBuilder(Duration.ofSeconds(1))
+    .cleanupInRocksdbCompactFilter(1000, Duration.ofHours(1))
.build
```
{{< /tab >}}
{{< tab "Python" >}}
```python
+from pyflink.common import Duration
from pyflink.common.time import Time
from pyflink.datastream.state import StateTtlConfig

ttl_config = StateTtlConfig \
.new_builder(Time.seconds(1)) \
-    .cleanup_in_rocksdb_compact_filter(1000, Time.hours(1)) \
+    .cleanup_in_rocksdb_compact_filter(1000, Duration.of_hours(1)) \
.build()
```
{{< /tab >}}
@@ -573,7 +574,7 @@ RocksDB backend 的默认后台清理策略会每处理 1000 条数据进行一
定期压缩可以加速过期状态条目的清理,特别是对于很少访问的状态条目。
比这个值早的文件将被选取进行压缩,并重新写入与之前相同的 Level 中。
该功能可以确保文件定期通过压缩过滤器压缩。
-您可以通过`StateTtlConfig.newBuilder(...).cleanupInRocksdbCompactFilter(long queryTimeAfterNumEntries, Time periodicCompactionTime)`
+您可以通过`StateTtlConfig.newBuilder(...).cleanupInRocksdbCompactFilter(long queryTimeAfterNumEntries, Duration periodicCompactionTime)`
方法设定定期压缩的时间。
定期压缩的时间的默认值是 30 天。
您可以将其设置为 0 以关闭定期压缩或设置一个较小的值以加速过期状态条目的清理,但它将会触发更多压缩。
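To make the last point concrete, a minimal sketch (assuming the `Duration`-based signature introduced in this release) that disables periodic compaction by passing a zero duration; illustrative only, not part of the diff:

```java
import java.time.Duration;
import org.apache.flink.api.common.state.StateTtlConfig;

StateTtlConfig ttlConfig = StateTtlConfig
    .newBuilder(Duration.ofSeconds(1))
    // Query the compaction filter's cached timestamp every 1000 processed entries;
    // Duration.ZERO turns periodic compaction off entirely.
    .cleanupInRocksdbCompactFilter(1000, Duration.ZERO)
    .build();
```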