[MINOR][DNM][TESTING] Flink bundle testing 5.01.1 #11330

Closed (112 commits)
7b5b6c7
Move version to 0.15.0-SNAPSHOT
yihua Feb 26, 2024
6f25f41
[HUDI-6825] Use UTF_8 to encode String to byte array in all places (#…
yihua Feb 23, 2024
232255e
[HUDI-6826] Port BloomFilter related classes from Hadoop library to r…
yihua Sep 12, 2023
d0e98e1
[HUDI-6850] Add tests and docs for ported Bloom Filter classes (#9700)
yihua Sep 13, 2023
ec91bbc
[MINOR] Update cleaner docs (#9716)
jonvex Sep 14, 2023
3998ef6
[MINOR] Move hoodie hfile/orc reader/writer test cases from hudi-clie…
Mulavar Sep 17, 2023
82bd765
[MINOR] Mark advanced configs and fix since version (#9757)
yihua Sep 21, 2023
52c42f8
[HUDI-53] Update RFC-8 for Metadata based Record Index (#9775)
prashantwason Sep 28, 2023
69d0998
[MINOR] Update DOAP with 0.14.0 Release (#9803)
prashantwason Sep 28, 2023
1911c27
[HUDI-7016] Fix bundling of RoaringBitmap dependency (#9963)
yihua Nov 1, 2023
7acc41e
[HUDI-6993] Support Flink 1.18 (#9949)
PrabhuJoseph Feb 26, 2024
8fc4135
[HUDI-7082] Add Flink 1.14 and Spark 3.13 docker image script (#10066)
danny0405 Nov 13, 2023
c072007
[HUDI-7016] Fix bundling of RoaringBitmap in hudi-utilities-bundle (#…
codope Nov 14, 2023
ae80cbd
[HUDI-6806] Support Spark 3.5.0 (#9717)
CTTY Feb 26, 2024
1605c28
[HUDI-7113] Update release scripts and docs for Spark 3.5 support (#1…
yihua Nov 17, 2023
149ca9a
[HUDI-7072] Remove support for Flink 1.13 (#10052)
beyond1920 Nov 19, 2023
d1366d8
[MINOR] Add Hopsworks File System to StorageSchemes (#10141)
SirOibaf Nov 20, 2023
008320c
[HUDI-7207] Sequentially delete complete instant files in archival to…
majian1998 Feb 22, 2024
af3f258
[HUDI-4699] Claiming RFC for auto record key generation (#10357)
nsivabalan Dec 18, 2023
50119d2
[HUDI-4699] Adding RFC for auto record key generation (#10365)
nsivabalan Dec 19, 2023
155a66c
[HUDI-7190] Fix nested columns vectorized read for spark33+ legacy fo…
stream2000 Feb 26, 2024
e1625b1
[HUDI-7213] When using wrong tabe.type value in hudi catalog happends…
LXin96 Dec 21, 2023
a8ef9d4
[HUDI-7242] Avoid unnecessary bigquery table update when using sync t…
jp0317 Dec 22, 2023
353d281
[MINOR] Merge logs into check instant file of HoodieActiveTimeline.tr…
zhuanshenbsj1 Dec 23, 2023
5faefcd
[MINOR] DataStream need in closeure in FileSystemBasedLockProvider (#…
xuzifu666 Dec 27, 2023
1be7447
[HUDI-7249] Disable mor compaction scheduling when using append mode …
hehuiyuan Dec 28, 2023
94a162a
[HUDI-7268] HoodieFlinkStreamer should disable compaction in pipeline…
xuzifu666 Jan 1, 2024
acace8f
[HUDI-7260] Fix call repair_overwrite_hoodie_props failure error due …
empcl Jan 2, 2024
2601a0e
[MINOR] Fix ArchivalUtils Logger named (#10436)
eric9204 Jan 3, 2024
595d230
[HUDI-7198] Create nested node path if does not exist for zookeeper. …
harsh1231 Jan 4, 2024
37ff8fe
[HUDI-7271] Copy a conf in ClusteringOperator to avoid configuration …
LXin96 Jan 5, 2024
91d7983
[MINOR] Updating doap file for 0.14.1 release (#10439)
nsivabalan Jan 5, 2024
60b073f
[HUDI-7266] Add clustering metric for flink (#10420)
LXin96 Jan 7, 2024
6ffc817
[MINOR] Disable flaky test (#10449)
jonvex Jan 8, 2024
ef1ccce
[HUDI-7279] make sampling rate configurable for BOUNDED_IN_MEMORY exe…
waitingF Jan 10, 2024
fc587b3
[HUDI-5973] Fixing refreshing of schemas in HoodieStreamer continuous…
nsivabalan Jan 10, 2024
b712666
[MINOR] Fix unit tests (#10362)
geserdugarov Jan 10, 2024
d1dd4a4
[HUDI-7284] Stream sync doesn't differentiate replace commits (#10467)
jonvex Jan 10, 2024
c0e59e9
[HUDI-7241] Avoid always broadcast HUDI relation if not using HoodieS…
beyond1920 Jan 10, 2024
26df317
[MINOR] Fix usages of orElse (#10435)
the-other-tim-brown Jan 10, 2024
fcd6cd9
[MINOR] Avoid resource leaks (#10345)
the-other-tim-brown Jan 11, 2024
cdefb4b
[HUDI-7288] Fix ArrayIndexOutOfBoundsException when upgrade nonPartit…
beyond1920 Jan 11, 2024
ef7f523
[MINOR] Turning on publishing of test results to Azure Devops (#10477)
vinothchandar Jan 11, 2024
635d0c6
[MINOR] Parallelized the check for existence of files in IncrementalR…
prashantwason Jan 12, 2024
8546cbf
[HUDI-7282] Avoid verification failure due to append writing of the c…
Akihito-Liang Jan 12, 2024
744befe
[HUDI-6902] Use mvnw command for hadoo-mr test (#10474)
linliu-code Jan 12, 2024
36eeb94
[HUDI-6902] Give minimum memory for unit tests (#10469)
linliu-code Jan 12, 2024
da6a490
[HUDI-7278] make bloom filter skippable for CPU saving (#10457)
waitingF Jan 12, 2024
7d97216
[HUDI-7293] Incremental read of insert table using rebalance strategy…
empcl Jan 14, 2024
2b2e1a0
[HUDI-7286] Flink get hudi index type ignore case sensitive (#10476)
Akihito-Liang Jan 16, 2024
0de5f07
[HUDI-6092] Set the timeout for the forked JVM (#10496)
linliu-code Jan 16, 2024
d414b60
[MINOR] Clean default Hadoop configuration values in tests (#10495)
linliu-code Jan 16, 2024
9ddcfb1
[HUDI-7300] Merge schema in ParuqetDFSSource (#10199)
Jan 17, 2024
5bc160b
[MINOR] Fix eager rollback mdt ut (#10506)
KnightChess Jan 17, 2024
8048c99
[HUDI-7296] Reduce CI Time by Minimizing Duplicate Code Coverage in T…
jonvex Jan 17, 2024
7c13eb3
[HUDI-7246] Fix Data Skipping Issue: No Results When Query Conditions…
majian1998 Jan 18, 2024
2337270
[HUDI-7170] Implement HFile reader independent of HBase (#10241)
yihua Feb 26, 2024
a508d54
[HUDI-6902] Fix a unit test (#10513)
linliu-code Jan 18, 2024
3facb0a
[HUDI-6902] Shutdown metric hooks properly (#10520)
linliu-code Jan 18, 2024
e8f34c3
[HUDI-7305] Fix cast exception for byte/short/float partitioned field…
stream2000 Jan 19, 2024
975ba22
[HUDI-7297] Fix ambiguous error message when field type defined in sc…
paul8263 Feb 26, 2024
cefc530
[HUDI-7309] Disable constructing AND & OR filter predicates when filt…
paul8263 Jan 19, 2024
0705849
[HUDI-7284] Fix cluster stream sync check (#10501)
jonvex Feb 26, 2024
4361432
[HUDI-7314] Hudi Create table support index type check (#10536)
xuzifu666 Jan 19, 2024
ccb5993
[HUDI-7277] Fix `hoodie.bulkinsert.shuffle.parallelism` not activated…
KnightChess Jan 20, 2024
38525de
[MINOR] Added descriptive exception if column present in required avr…
prathit06 Jan 20, 2024
e5cabe6
[HUDI-7315] Disable constructing NOT filter predicate when pushing do…
paul8263 Jan 20, 2024
c9cdc2a
[HUDI-7317] FlinkTableFactory snatifyCheck should contains index type…
xuzifu666 Jan 22, 2024
288898e
[HUDI-7303] Fix date field type unexpectedly convert to Long when usi…
paul8263 Jan 23, 2024
1554908
[MINOR] Reduce UT spark-datasource test times (#10547)
vinothchandar Jan 23, 2024
1b37ee2
[HUDI-7237] Hudi Streamer: Handle edge case with null schema, minor c…
the-other-tim-brown Jan 24, 2024
cef039f
[HUDI-7316] AbstractHoodieLogRecordReader should accept HoodieTableMe…
kbuci Jan 24, 2024
492daf0
[HUDI-7311] Add implicit literal type conversion before filter push d…
paul8263 Jan 24, 2024
126010b
[HUDI-7228] Fix eager closure of log reader input streams with log re…
nsivabalan Feb 26, 2024
9002a02
[HUDI-7298] Write bad records to error table in more cases instead of…
jonvex Feb 27, 2024
31adbb9
[HUDI-7323] Use a schema supplier instead of a static value (#10549)
the-other-tim-brown Jan 25, 2024
6f27d81
[HUDI-7327] remove meta cols from incoming schema in stream sync (#10…
jonvex Feb 27, 2024
54a3b67
[HUDI-6230] Handle aws glue partition index (#8743)
parisni Jan 26, 2024
e76f2e8
[MINOR] add logger to CompactionPlanOperator & ClusteringPlanOperator…
eric9204 Jan 26, 2024
6dd4bea
[HUDI-7308] LockManager::unlock should not call updateLockHeldTimerMe…
kbuci Jan 27, 2024
86e3ca6
[HUDI-7335] Create hudi-hadoop-common for hadoop-specific implementat…
yihua Feb 27, 2024
b5200bf
[HUDI-7351] Fix missing implementation for glue metastore schema retr…
parisni Jan 29, 2024
005c758
[HUDI-7336] Introduce new HoodieStorage abstraction (#10567)
yihua Jan 29, 2024
e00e2d7
[HUDI-7342] Use BaseFileUtils to hide format-specific logic in Hoodie…
yihua Jan 29, 2024
a058344
[HUDI-7218] Integrate new HFile reader with file reader factory (#10330)
yihua Feb 27, 2024
8fda151
[HUDI-6902] Disable a flaky test (#10551)
linliu-code Jan 29, 2024
90ca4f0
[HUDI-7346] Remove usage of org.apache.hadoop.hbase.util.Bytes (#10574)
yihua Feb 27, 2024
97ce215
[HUDI-7343] Replace Path.SEPARATOR with HoodieLocation.SEPARATOR (#10…
yihua Feb 27, 2024
4d49fa4
[HUDI-7345] Remove usage of org.apache.hadoop.util.VersionUtil (#10571)
yihua Jan 31, 2024
bcfcd9f
[HUDI-7344] Use Java <Input/Output>Stream instead of FSData<Input/Out…
yihua Feb 27, 2024
e38c731
[HUDI-7347] Introduce SeekableDataInputStream for random access (#10575)
yihua Feb 27, 2024
aef157a
[MINOR] Add serialVersionUID to HoodieRecord class (#10592)
KingdomWG Feb 1, 2024
104fa7d
[HUDI-6902] Fix a test about timestamp format (#10606)
linliu-code Feb 2, 2024
cb2d94b
[HUDI-6868] Support extracting passwords from credential store for Hi…
ad1happy2go Feb 2, 2024
fa6e499
[Hudi-6902] Fix the timestamp format in hive test (#10610)
linliu-code Feb 3, 2024
4a04292
[HUDI-7284] Fix bad method name getLastPendingClusterCommit to getLas…
jonvex Feb 3, 2024
692f0d1
[HUDI-7351] Implement partition pushdown for glue (#10604)
parisni Feb 4, 2024
18f10ba
[HUDI-7375] Disable a flaky test method (#10627)
linliu-code Feb 5, 2024
b8b88cf
[HUDI-7366] Fix HoodieLocation with encoded paths (#10602)
yihua Feb 6, 2024
d17ae75
[HUDI-7338] Bump HBase, Pulsar, Jetty version (#10223)
CTTY Feb 6, 2024
51a364c
[HUDI-7367] Add makeQualified APIs (#10607)
yihua Feb 7, 2024
66ac9ff
[HUDI-7351] Handle case when glue expression larger than 2048 limit (…
parisni Feb 8, 2024
e03a88c
[HUDI-7392] Fix connection leak causing lingering CLOSE_WAIT (#10636)
voonhous Feb 8, 2024
9911497
[HUDI-7387] Serializable Class need contains serialVersionUID to keep…
xuzifu666 Feb 8, 2024
32fe3b6
[MINOR] fix typo (#10634)
lxliyou001 Feb 8, 2024
8436feb
[HUDI-7394] Fix run script of hudi-timeline-server-bundle (#10640)
voonhous Feb 8, 2024
09f3fb5
[HUDI-7373] revert config hoodie.write.handle.missing.cols.with.lossl…
jonvex Feb 8, 2024
a0ebac8
[HUDI-6902] Containerize the Azure CI (#10512)
linliu-code Feb 10, 2024
40cc538
[HUDI-7707] Enable bundle validation on Java 8 and 11
yihua May 27, 2024
287e4f3
Bundle validation only
yihua May 27, 2024
e880168
Fix ci_run.sh
yihua May 27, 2024
b36297b
Remove maven cache
yihua May 27, 2024
326 changes: 25 additions & 301 deletions .github/workflows/bot.yml

Large diffs are not rendered by default.

31 changes: 31 additions & 0 deletions Dockerfile
@@ -0,0 +1,31 @@
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Use a home-made image as the base, which includes:
# ubuntu:latest
# git
# thrift
# maven
# java8
FROM apachehudi/hudi-ci-bundle-validation-base:azure_ci_test_base_new

CMD ["java", "-version"]

# Set the working directory to /hudi
WORKDIR /hudi

# Copy git repo into the working directory
COPY . /hudi
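For local experimentation, the image can be built and run roughly as follows (this is a sketch: the `hudi-ci-local` tag is an arbitrary choice, and a running Docker daemon is assumed):

```shell
# Build the CI image from the repo root (tag name is illustrative)
docker build -t hudi-ci-local .

# The default CMD prints the bundled JDK version
docker run --rm hudi-ci-local

# The repo is copied into /hudi, so builds can run inside the container, e.g.
docker run --rm hudi-ci-local mvn -v
```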
15 changes: 14 additions & 1 deletion LICENSE
@@ -291,7 +291,20 @@ This product includes code from Apache Hadoop

* org.apache.hudi.common.bloom.InternalDynamicBloomFilter.java adapted from org.apache.hadoop.util.bloom.DynamicBloomFilter.java

* org.apache.hudi.common.bloom.InternalFilter copied from classes in org.apache.hadoop.util.bloom package
* org.apache.hudi.common.bloom.InternalFilter.java adapted from org.apache.hadoop.util.bloom.Filter.java
and org.apache.hadoop.io.Writable.java

* org.apache.hudi.common.bloom.InternalBloomFilter adapted from org.apache.hadoop.util.bloom.BloomFilter.java

* org.apache.hudi.common.bloom.Key.java adapted from org.apache.hadoop.util.bloom.Key.java

* org.apache.hudi.common.bloom.HashFunction.java ported from org.apache.hadoop.util.bloom.HashFunction.java

* org.apache.hudi.common.util.hash.Hash.java ported from org.apache.hadoop.util.hash.Hash.java

* org.apache.hudi.common.util.hash.JenkinsHash.java ported from org.apache.hadoop.util.hash.JenkinsHash.java

* org.apache.hudi.common.util.hash.MurmurHash.java ported from org.apache.hadoop.util.hash.MurmurHash.java

with the following license

21 changes: 9 additions & 12 deletions README.md
@@ -66,8 +66,8 @@ git clone https://github.com/apache/hudi.git && cd hudi
mvn clean package -DskipTests

# Start command
spark-3.2.3-bin-hadoop3.2/bin/spark-shell \
--jars `ls packaging/hudi-spark-bundle/target/hudi-spark3.2-bundle_2.12-*.*.*-SNAPSHOT.jar` \
spark-3.5.0-bin-hadoop3/bin/spark-shell \
--jars `ls packaging/hudi-spark-bundle/target/hudi-spark3.5-bundle_2.12-*.*.*-SNAPSHOT.jar` \
--conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
--conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension' \
--conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
@@ -85,7 +85,7 @@ mvn clean javadoc:aggregate -Pjavadocs
### Build with different Spark versions

The default Spark 2.x version supported is 2.4.4. The default Spark 3.x version, corresponding to `spark3` profile is
3.4.0. The default Scala version is 2.12. Refer to the table below for building with different Spark and Scala versions.
3.5.0. The default Scala version is 2.12. Refer to the table below for building with different Spark and Scala versions.

| Maven build options | Expected Spark bundle jar name | Notes |
|:--------------------------|:---------------------------------------------|:-------------------------------------------------|
@@ -96,9 +96,10 @@ The default Spark 2.x version supported is 2.4.4. The default Spark 3.x version,
| `-Dspark3.2` | hudi-spark3.2-bundle_2.12 | For Spark 3.2.x and Scala 2.12 (same as default) |
| `-Dspark3.3` | hudi-spark3.3-bundle_2.12 | For Spark 3.3.x and Scala 2.12 |
| `-Dspark3.4` | hudi-spark3.4-bundle_2.12 | For Spark 3.4.x and Scala 2.12 |
| `-Dspark3.5` | hudi-spark3.5-bundle_2.12 | For Spark 3.5.x and Scala 2.12 |
| `-Dspark2 -Dscala-2.11` | hudi-spark-bundle_2.11 (legacy bundle name) | For Spark 2.4.4 and Scala 2.11 |
| `-Dspark2 -Dscala-2.12` | hudi-spark-bundle_2.12 (legacy bundle name) | For Spark 2.4.4 and Scala 2.12 |
| `-Dspark3` | hudi-spark3-bundle_2.12 (legacy bundle name) | For Spark 3.4.x and Scala 2.12 |
| `-Dspark3` | hudi-spark3-bundle_2.12 (legacy bundle name) | For Spark 3.5.x and Scala 2.12 |

For example,
```
@@ -118,20 +119,19 @@ Starting from version 0.11, Hudi no longer requires `spark-avro` to be specified

### Build with different Flink versions

The default Flink version supported is 1.17. The default Flink 1.17.x version, corresponding to `flink1.17` profile is 1.17.0.
The default Flink version supported is 1.18. The default Flink 1.18.x version, corresponding to `flink1.18` profile is 1.18.0.
Flink has been Scala-free since 1.15.x, so there is no need to specify the Scala version for Flink 1.15.x and above.
Refer to the table below for building with different Flink and Scala versions.

| Maven build options | Expected Flink bundle jar name | Notes |
|:---------------------------|:-------------------------------|:---------------------------------|
| (empty) | hudi-flink1.17-bundle | For Flink 1.17 (default options) |
| `-Dflink1.17` | hudi-flink1.17-bundle | For Flink 1.17 (same as default) |
| (empty) | hudi-flink1.18-bundle | For Flink 1.18 (default options) |
| `-Dflink1.18` | hudi-flink1.18-bundle | For Flink 1.18 (same as default) |
| `-Dflink1.17` | hudi-flink1.17-bundle | For Flink 1.17 |
| `-Dflink1.16` | hudi-flink1.16-bundle | For Flink 1.16 |
| `-Dflink1.15` | hudi-flink1.15-bundle | For Flink 1.15 |
| `-Dflink1.14` | hudi-flink1.14-bundle | For Flink 1.14 and Scala 2.12 |
| `-Dflink1.14 -Dscala-2.11` | hudi-flink1.14-bundle | For Flink 1.14 and Scala 2.11 |
| `-Dflink1.13` | hudi-flink1.13-bundle | For Flink 1.13 and Scala 2.12 |
| `-Dflink1.13 -Dscala-2.11` | hudi-flink1.13-bundle | For Flink 1.13 and Scala 2.11 |

For example,
```
@@ -140,9 +140,6 @@ mvn clean package -DskipTests -Dflink1.15

# Build against Flink 1.14.x and Scala 2.11
mvn clean package -DskipTests -Dflink1.14 -Dscala-2.11

# Build against Flink 1.13.x and Scala 2.12
mvn clean package -DskipTests -Dflink1.13
```

## Running Tests
194 changes: 99 additions & 95 deletions azure-pipelines-20230430.yml
@@ -14,7 +14,7 @@
# limitations under the License.

# NOTE:
# This config file defines how Azure CI runs tests with Spark 2.4 and Flink 1.17 profiles.
# This config file defines how Azure CI runs tests with Spark 2.4 and Flink 1.18 profiles.
# PRs will need to keep in sync with master's version to trigger the CI runs.

trigger:
@@ -32,15 +32,16 @@ parameters:
- 'hudi-common'
- 'hudi-flink-datasource'
- 'hudi-flink-datasource/hudi-flink'
- 'hudi-flink-datasource/hudi-flink1.13.x'
- 'hudi-flink-datasource/hudi-flink1.14.x'
- 'hudi-flink-datasource/hudi-flink1.15.x'
- 'hudi-flink-datasource/hudi-flink1.16.x'
- 'hudi-flink-datasource/hudi-flink1.17.x'
- 'hudi-flink-datasource/hudi-flink1.18.x'
- name: job2Modules
type: object
default:
- 'hudi-client/hudi-spark-client'
- 'hudi-spark-datasource/hudi-spark'
- name: job3UTModules
type: object
default:
@@ -64,11 +65,11 @@ parameters:
- '!hudi-examples/hudi-examples-spark'
- '!hudi-flink-datasource'
- '!hudi-flink-datasource/hudi-flink'
- '!hudi-flink-datasource/hudi-flink1.13.x'
- '!hudi-flink-datasource/hudi-flink1.14.x'
- '!hudi-flink-datasource/hudi-flink1.15.x'
- '!hudi-flink-datasource/hudi-flink1.16.x'
- '!hudi-flink-datasource/hudi-flink1.17.x'
- '!hudi-flink-datasource/hudi-flink1.18.x'
- '!hudi-spark-datasource'
- '!hudi-spark-datasource/hudi-spark'
- '!hudi-spark-datasource/hudi-spark3.2.x'
@@ -87,16 +88,17 @@ parameters:
- '!hudi-examples/hudi-examples-spark'
- '!hudi-flink-datasource'
- '!hudi-flink-datasource/hudi-flink'
- '!hudi-flink-datasource/hudi-flink1.13.x'
- '!hudi-flink-datasource/hudi-flink1.14.x'
- '!hudi-flink-datasource/hudi-flink1.15.x'
- '!hudi-flink-datasource/hudi-flink1.16.x'
- '!hudi-flink-datasource/hudi-flink1.17.x'
- '!hudi-flink-datasource/hudi-flink1.18.x'
- '!hudi-spark-datasource/hudi-spark'

variables:
BUILD_PROFILES: '-Dscala-2.12 -Dspark3.2 -Dflink1.17'
BUILD_PROFILES: '-Dscala-2.12 -Dspark3.2 -Dflink1.18'
PLUGIN_OPTS: '-Dcheckstyle.skip=true -Drat.skip=true -Djacoco.skip=true -ntp -B -V -Pwarn-log -Dorg.slf4j.simpleLogger.log.org.apache.maven.plugins.shade=warn -Dorg.slf4j.simpleLogger.log.org.apache.maven.plugins.dependency=warn'
MVN_OPTS_INSTALL: '-Phudi-platform-service -DskipTests $(BUILD_PROFILES) $(PLUGIN_OPTS) -Dmaven.wagon.httpconnectionManager.ttlSeconds=25 -Dmaven.wagon.http.retryHandler.count=5'
MVN_OPTS_INSTALL: '-DskipTests $(BUILD_PROFILES) $(PLUGIN_OPTS) -Dmaven.wagon.httpconnectionManager.ttlSeconds=25 -Dmaven.wagon.http.retryHandler.count=5'
MVN_OPTS_TEST: '-fae -Pwarn-log $(BUILD_PROFILES) $(PLUGIN_OPTS)'
JOB1_MODULES: ${{ join(',',parameters.job1Modules) }}
JOB2_MODULES: ${{ join(',',parameters.job2Modules) }}
@@ -106,118 +108,120 @@ variables:

stages:
- stage: test
variables:
- name: DOCKER_BUILDKIT
value: 1
jobs:
- job: UT_FT_1
displayName: UT FT common & flink & UT client/spark-client
timeoutInMinutes: '150'
steps:
- task: Maven@4
displayName: maven install
- task: Docker@2
displayName: "login to docker"
inputs:
mavenPomFile: 'pom.xml'
goals: 'clean install'
options: $(MVN_OPTS_INSTALL)
publishJUnitResults: false
jdkVersionOption: '1.8'
- task: Maven@4
displayName: UT common flink client/spark-client
command: "login"
containerRegistry: "apachehudi-docker-hub"
- task: Docker@2
displayName: "load repo into image"
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Punit-tests -pl $(JOB1_MODULES),hudi-client/hudi-spark-client
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- task: Maven@4
displayName: FT common flink
containerRegistry: 'apachehudi-docker-hub'
repository: 'apachehudi/hudi-ci-bundle-validation-base'
command: 'build'
Dockerfile: '**/Dockerfile'
ImageName: $(Build.BuildId)
- task: Docker@2
displayName: "UT FT common flink client/spark-client"
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Pfunctional-tests -pl $(JOB1_MODULES)
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- script: |
grep "testcase" */target/surefire-reports/*.xml */*/target/surefire-reports/*.xml | awk -F'"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100
displayName: Top 100 long-running testcases
containerRegistry: 'apachehudi-docker-hub'
repository: 'apachehudi/hudi-ci-bundle-validation-base'
command: 'run'
arguments: >
-i docker.io/apachehudi/hudi-ci-bundle-validation-base:$(Build.BuildId)
/bin/bash -c "mvn clean install $(MVN_OPTS_INSTALL)
&& mvn test $(MVN_OPTS_TEST) -Punit-tests -pl $(JOB1_MODULES),hudi-client/hudi-spark-client
&& mvn test $(MVN_OPTS_TEST) -Pfunctional-tests -pl $(JOB1_MODULES)
&& grep \"testcase\" */target/surefire-reports/*.xml */*/target/surefire-reports/*.xml | awk -F'\"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100"
- job: UT_FT_2
displayName: FT client/spark-client
displayName: FT client/spark-client & hudi-spark-datasource/hudi-spark
timeoutInMinutes: '150'
steps:
- task: Maven@4
displayName: maven install
- task: Docker@2
displayName: "login to docker"
inputs:
mavenPomFile: 'pom.xml'
goals: 'clean install'
options: $(MVN_OPTS_INSTALL)
publishJUnitResults: false
jdkVersionOption: '1.8'
- task: Maven@4
displayName: FT client/spark-client
command: "login"
containerRegistry: "apachehudi-docker-hub"
- task: Docker@2
displayName: "load repo into image"
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Pfunctional-tests -pl $(JOB2_MODULES)
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- script: |
grep "testcase" */target/surefire-reports/*.xml */*/target/surefire-reports/*.xml | awk -F'"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100
displayName: Top 100 long-running testcases
containerRegistry: 'apachehudi-docker-hub'
repository: 'apachehudi/hudi-ci-bundle-validation-base'
command: 'build'
Dockerfile: '**/Dockerfile'
ImageName: $(Build.BuildId)
- task: Docker@2
displayName: "FT client/spark-client & hudi-spark-datasource/hudi-spark"
inputs:
containerRegistry: 'apachehudi-docker-hub'
repository: 'apachehudi/hudi-ci-bundle-validation-base'
command: 'run'
arguments: >
-i docker.io/apachehudi/hudi-ci-bundle-validation-base:$(Build.BuildId)
/bin/bash -c "mvn clean install $(MVN_OPTS_INSTALL)
&& mvn test $(MVN_OPTS_TEST) -Pfunctional-tests -pl $(JOB2_MODULES)
&& grep \"testcase\" */target/surefire-reports/*.xml */*/target/surefire-reports/*.xml | awk -F'\"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100"
- job: UT_FT_3
displayName: UT spark-datasource
timeoutInMinutes: '240'
steps:
- task: Maven@4
displayName: maven install
- task: Docker@2
displayName: "login to docker"
inputs:
command: "login"
containerRegistry: "apachehudi-docker-hub"
- task: Docker@2
displayName: "load repo into image"
inputs:
mavenPomFile: 'pom.xml'
goals: 'clean install'
options: $(MVN_OPTS_INSTALL)
publishJUnitResults: false
jdkVersionOption: '1.8'
- task: Maven@4
displayName: UT spark-datasource
containerRegistry: 'apachehudi-docker-hub'
repository: 'apachehudi/hudi-ci-bundle-validation-base'
command: 'build'
Dockerfile: '**/Dockerfile'
ImageName: $(Build.BuildId)
- task: Docker@2
displayName: "UT spark-datasource"
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Punit-tests -pl $(JOB3_MODULES)
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- script: |
grep "testcase" */target/surefire-reports/*.xml */*/target/surefire-reports/*.xml | awk -F'"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100
displayName: Top 100 long-running testcases
containerRegistry: 'apachehudi-docker-hub'
repository: 'apachehudi/hudi-ci-bundle-validation-base'
command: 'run'
arguments: >
-i docker.io/apachehudi/hudi-ci-bundle-validation-base:$(Build.BuildId)
/bin/bash -c "mvn clean install $(MVN_OPTS_INSTALL) && mvn test $(MVN_OPTS_TEST) -Punit-tests -pl $(JOB3_MODULES)
&& grep \"testcase\" */target/surefire-reports/*.xml */*/target/surefire-reports/*.xml | awk -F'\"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100"
- job: UT_FT_4
displayName: UT FT other modules
timeoutInMinutes: '240'
steps:
- task: Maven@4
displayName: maven install
- task: Docker@2
displayName: "login to docker hub"
inputs:
mavenPomFile: 'pom.xml'
goals: 'clean install'
options: $(MVN_OPTS_INSTALL)
publishJUnitResults: false
jdkVersionOption: '1.8'
- task: Maven@4
displayName: UT other modules
command: "login"
containerRegistry: "apachehudi-docker-hub"
- task: Docker@2
displayName: "load repo into image"
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Punit-tests -pl $(JOB4_UT_MODULES)
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- task: Maven@4
displayName: FT other modules
containerRegistry: 'apachehudi-docker-hub'
repository: 'apachehudi/hudi-ci-bundle-validation-base'
command: 'build'
Dockerfile: '**/Dockerfile'
ImageName: $(Build.BuildId)
- task: Docker@2
displayName: "UT FT other modules"
inputs:
mavenPomFile: 'pom.xml'
goals: 'test'
options: $(MVN_OPTS_TEST) -Pfunctional-tests -pl $(JOB4_FT_MODULES)
publishJUnitResults: false
jdkVersionOption: '1.8'
mavenOptions: '-Xmx4g'
- script: |
grep "testcase" */target/surefire-reports/*.xml */*/target/surefire-reports/*.xml | awk -F'"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100
displayName: Top 100 long-running testcases
containerRegistry: 'apachehudi-docker-hub'
repository: 'apachehudi/hudi-ci-bundle-validation-base'
command: 'run'
arguments: >
-i docker.io/apachehudi/hudi-ci-bundle-validation-base:$(Build.BuildId)
/bin/bash -c "mvn clean install $(MVN_OPTS_INSTALL) -Phudi-platform-service -Pthrift-gen-source
&& mvn test $(MVN_OPTS_TEST) -Punit-tests -pl $(JOB4_UT_MODULES)
&& mvn test $(MVN_OPTS_TEST) -Pfunctional-tests -pl $(JOB4_FT_MODULES)
&& grep \"testcase\" */target/surefire-reports/*.xml */*/target/surefire-reports/*.xml | awk -F'\"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100"
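Each CI job above ends by piping the surefire XML reports through `grep`/`awk` to list the slowest tests. A standalone sketch of that extraction, using a fabricated report file (the directory, test names, and times are illustrative only):

```shell
# Fabricate a minimal surefire report (illustrative names and times)
mkdir -p demo/target/surefire-reports
cat > demo/target/surefire-reports/TEST-demo.xml <<'EOF'
<testcase name="testFast" classname="org.example.DemoTest" time="0.12"/>
<testcase name="testSlow" classname="org.example.DemoTest" time="9.87"/>
EOF

# Same extraction as the CI step: splitting on double quotes puts the
# attribute values at $2 (name), $4 (classname), $6 (time); then sort
# numerically by the leading time field, descending
grep "testcase" demo/target/surefire-reports/*.xml \
  | awk -F'"' ' { print $6,$4,$2 } ' | sort -nr | head -n 100
# -> 9.87 org.example.DemoTest testSlow
#    0.12 org.example.DemoTest testFast
```

Note this relies on each `testcase` element sitting on its own line, which is how surefire emits its reports.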