
[SPARK-32444][SQL] Infer filters from DPP #29243

Closed · wants to merge 12 commits into from

Conversation

@wangyum (Member) commented Jul 26, 2020

What changes were proposed in this pull request?

This PR adds support for inferring filters from DPP (dynamic partition pruning).

For the test suite below, both fact_stats and code_stats are partitioned by store_id.
DPP adds a new predicate, fact_stats.store_id IN dynamicpruning#2723. Based on the join condition t1.store_id = t2.store_id, we can infer code_stats.store_id IN dynamicpruning#2723.
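The inference step can be sketched roughly as follows. This is a hypothetical Python illustration, not the actual Catalyst rule; `infer_dpp_filters` and the string predicates are made up for the example. The idea is simply that a dynamic pruning predicate on one side of an equi-join condition can be copied to the other side:

```python
# Hypothetical sketch: propagate a DPP predicate across equi-join columns.
def infer_dpp_filters(equalities, dpp_filters):
    """equalities: iterable of (col_a, col_b) pairs from join conditions.
    dpp_filters: dict mapping a column to its dynamic pruning predicate.
    Returns a new dict with predicates propagated to equal columns."""
    inferred = dict(dpp_filters)
    changed = True
    while changed:  # repeat so chains like a = b, b = c also propagate
        changed = False
        for a, b in equalities:
            for src, dst in ((a, b), (b, a)):
                if src in inferred and dst not in inferred:
                    inferred[dst] = inferred[src]
                    changed = True
    return inferred

filters = infer_dpp_filters(
    [("fact_stats.store_id", "code_stats.store_id")],
    {"fact_stats.store_id": "IN dynamicpruning#2723"},
)
# The predicate is now recorded for both join sides.
```

In the real optimizer this propagation would be driven by the join's equality constraints, analogous to how InferFiltersFromConstraints infers ordinary filters.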

Why are the changes needed?

Improve query performance.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Unit test.

@SparkQA commented Jul 26, 2020

Test build #126568 has finished for PR 29243 at commit 69be17b.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Jul 26, 2020

Test build #126574 has finished for PR 29243 at commit f001b27.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Jul 27, 2020

Test build #126641 has finished for PR 29243 at commit bcc81be.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@bart-samwel

The partition pruning already prunes all partitions that don't satisfy that predicate, so all remaining rows will satisfy that predicate, right? Where is the performance gain then?

@wangyum (Member Author) commented Aug 19, 2020

Before this PR, we can only prune on the fact_stats.store_id column:

== Physical Plan ==
*(3) Project [store_id#2705, code#2706, product_id#2708L]
+- *(3) BroadcastHashJoin [cast(store_id#2705 as bigint)], [store_id#2709L], Inner, BuildRight, false
   :- *(3) Project [store_id#2705, code#2706]
   :  +- *(3) BroadcastHashJoin [store_id#2705], [store_id#2707], Inner, BuildRight, false
   :     :- *(3) ColumnarToRow
   :     :  +- FileScan parquet default.fact_stats[store_id#2705] Batched: true, DataFilters: [], Format: Parquet, Location: InMemoryFileIndex[file:/Users/yumwang/spark/SPARK-27227/sql/core/spark-warehouse/org.apache.spark..., PartitionFilters: [isnotnull(store_id#2705), dynamicpruningexpression(cast(store_id#2705 as bigint) IN subquery#2716)], PushedFilters: [], ReadSchema: struct<>
   :     :        +- Subquery subquery#2716, [id=#245]
   :     :           +- *(2) HashAggregate(keys=[store_id#2709L#2715L], functions=[])
   :     :              +- Exchange hashpartitioning(store_id#2709L#2715L, 5), true, [id=#241]
   :     :                 +- *(1) HashAggregate(keys=[store_id#2709L AS store_id#2709L#2715L], functions=[])
   :     :                    +- *(1) Filter ((isnotnull(product_id#2708L) AND (product_id#2708L < 3)) AND isnotnull(store_id#2709L))
   :     :                       +- *(1) ColumnarToRow
   :     :                          +- FileScan parquet default.product[product_id#2708L,store_id#2709L] Batched: true, DataFilters: [isnotnull(product_id#2708L), (product_id#2708L < 3), isnotnull(store_id#2709L)], Format: Parquet, Location: InMemoryFileIndex[file:/Users/yumwang/spark/SPARK-27227/sql/core/spark-warehouse/org.apache.spark..., PartitionFilters: [], PushedFilters: [IsNotNull(product_id), LessThan(product_id,3), IsNotNull(store_id)], ReadSchema: struct<product_id:bigint,store_id:bigint>
   :     +- BroadcastExchange HashedRelationBroadcastMode(List(cast(input[1, int, true] as bigint)),false), [id=#275]
   :        +- *(1) ColumnarToRow
   :           +- FileScan parquet default.code_stats[code#2706,store_id#2707] Batched: true, DataFilters: [], Format: Parquet, Location: InMemoryFileIndex[file:/Users/yumwang/spark/SPARK-27227/sql/core/spark-warehouse/org.apache.spark..., PartitionFilters: [isnotnull(store_id#2707)], PushedFilters: [], ReadSchema: struct<code:int>
   +- BroadcastExchange HashedRelationBroadcastMode(List(input[1, bigint, false]),false), [id=#283]
      +- *(2) Filter ((isnotnull(product_id#2708L) AND (product_id#2708L < 3)) AND isnotnull(store_id#2709L))
         +- *(2) ColumnarToRow
            +- FileScan parquet default.product[product_id#2708L,store_id#2709L] Batched: true, DataFilters: [isnotnull(product_id#2708L), (product_id#2708L < 3), isnotnull(store_id#2709L)], Format: Parquet, Location: InMemoryFileIndex[file:/Users/yumwang/spark/SPARK-27227/sql/core/spark-warehouse/org.apache.spark..., PartitionFilters: [], PushedFilters: [IsNotNull(product_id), LessThan(product_id,3), IsNotNull(store_id)], ReadSchema: struct<product_id:bigint,store_id:bigint>

After this PR, we can also prune on the code_stats.store_id column:

== Physical Plan ==
*(3) Project [store_id#2705, code#2706, product_id#2708L]
+- *(3) BroadcastHashJoin [cast(store_id#2705 as bigint)], [store_id#2709L], Inner, BuildRight, false
   :- *(3) Project [store_id#2705, code#2706]
   :  +- *(3) BroadcastHashJoin [store_id#2705], [store_id#2707], Inner, BuildRight, false
   :     :- *(3) ColumnarToRow
   :     :  +- FileScan parquet default.fact_stats[store_id#2705] Batched: true, DataFilters: [], Format: Parquet, Location: InMemoryFileIndex[file:/Users/yumwang/spark/SPARK-27227/sql/core/spark-warehouse/org.apache.spark..., PartitionFilters: [isnotnull(store_id#2705), dynamicpruningexpression(cast(store_id#2705 as bigint) IN subquery#2716)], PushedFilters: [], ReadSchema: struct<>
   :     :        +- Subquery subquery#2716, [id=#250]
   :     :           +- *(2) HashAggregate(keys=[store_id#2709L#2715L], functions=[])
   :     :              +- Exchange hashpartitioning(store_id#2709L#2715L, 5), true, [id=#246]
   :     :                 +- *(1) HashAggregate(keys=[store_id#2709L AS store_id#2709L#2715L], functions=[])
   :     :                    +- *(1) Filter ((isnotnull(product_id#2708L) AND (product_id#2708L < 3)) AND isnotnull(store_id#2709L))
   :     :                       +- *(1) ColumnarToRow
   :     :                          +- FileScan parquet default.product[product_id#2708L,store_id#2709L] Batched: true, DataFilters: [isnotnull(product_id#2708L), (product_id#2708L < 3), isnotnull(store_id#2709L)], Format: Parquet, Location: InMemoryFileIndex[file:/Users/yumwang/spark/SPARK-27227/sql/core/spark-warehouse/org.apache.spark..., PartitionFilters: [], PushedFilters: [IsNotNull(product_id), LessThan(product_id,3), IsNotNull(store_id)], ReadSchema: struct<product_id:bigint,store_id:bigint>
   :     +- BroadcastExchange HashedRelationBroadcastMode(List(cast(input[1, int, true] as bigint)),false), [id=#309]
   :        +- *(1) ColumnarToRow
   :           +- FileScan parquet default.code_stats[code#2706,store_id#2707] Batched: true, DataFilters: [], Format: Parquet, Location: InMemoryFileIndex[file:/Users/yumwang/spark/SPARK-27227/sql/core/spark-warehouse/org.apache.spark..., PartitionFilters: [isnotnull(store_id#2707), dynamicpruningexpression(cast(store_id#2707 as bigint) IN subquery#2718)], PushedFilters: [], ReadSchema: struct<code:int>
   :                 +- Subquery subquery#2718, [id=#279]
   :                    +- *(2) HashAggregate(keys=[store_id#2709L#2717L], functions=[])
   :                       +- Exchange hashpartitioning(store_id#2709L#2717L, 5), true, [id=#275]
   :                          +- *(1) HashAggregate(keys=[store_id#2709L AS store_id#2709L#2717L], functions=[])
   :                             +- *(1) Filter ((isnotnull(product_id#2708L) AND (product_id#2708L < 3)) AND isnotnull(store_id#2709L))
   :                                +- *(1) ColumnarToRow
   :                                   +- FileScan parquet default.product[product_id#2708L,store_id#2709L] Batched: true, DataFilters: [isnotnull(product_id#2708L), (product_id#2708L < 3), isnotnull(store_id#2709L)], Format: Parquet, Location: InMemoryFileIndex[file:/Users/yumwang/spark/SPARK-27227/sql/core/spark-warehouse/org.apache.spark..., PartitionFilters: [], PushedFilters: [IsNotNull(product_id), LessThan(product_id,3), IsNotNull(store_id)], ReadSchema: struct<product_id:bigint,store_id:bigint>
   +- BroadcastExchange HashedRelationBroadcastMode(List(input[1, bigint, false]),false), [id=#317]
      +- *(2) Filter ((isnotnull(product_id#2708L) AND (product_id#2708L < 3)) AND isnotnull(store_id#2709L))
         +- *(2) ColumnarToRow
            +- FileScan parquet default.product[product_id#2708L,store_id#2709L] Batched: true, DataFilters: [isnotnull(product_id#2708L), (product_id#2708L < 3), isnotnull(store_id#2709L)], Format: Parquet, Location: InMemoryFileIndex[file:/Users/yumwang/spark/SPARK-27227/sql/core/spark-warehouse/org.apache.spark..., PartitionFilters: [], PushedFilters: [IsNotNull(product_id), LessThan(product_id,3), IsNotNull(store_id)], ReadSchema: struct<product_id:bigint,store_id:bigint>

@bart-samwel

Ah, that makes sense. Thanks for the example!

@maryannxue (Contributor)

We don't just apply DPP to any column, right? Shouldn't we check that the inferred filter is applied on partition columns?

In that sense, we should fix the DPP rule so that it applies the same dimension-table filter to partition columns from different tables. In other words, the DPP rule itself should generate one filter on fact_stats.store_id and another on code_stats.store_id.

@wangyum (Member Author) commented Aug 21, 2020

Yes, we should only infer the filter on partition columns.

The DPP rule itself cannot generate a filter on code_stats.store_id, because that would need to happen after PushDownPredicates.

I added logic to CleanupDynamicPruningFilters to remove all DynamicPruning filters that are not on partition columns.
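The cleanup step can be sketched as a simple predicate over the inferred filters. This is a hypothetical illustration, not the actual CleanupDynamicPruningFilters rule; `cleanup_dpp_filters` and the column names are made up for the example:

```python
# Hypothetical sketch: keep only DPP predicates on partition columns.
def cleanup_dpp_filters(dpp_filters, partition_columns):
    """dpp_filters: dict column -> predicate.
    partition_columns: set of columns known to be partition columns.
    DPP predicates on non-partition columns cannot prune any files,
    so they are dropped rather than evaluated row by row."""
    return {col: pred for col, pred in dpp_filters.items()
            if col in partition_columns}

kept = cleanup_dpp_filters(
    {"code_stats.store_id": "IN dynamicpruning#2723",
     "code_stats.code": "IN dynamicpruning#2723"},  # inferred, but not a partition column
    partition_columns={"fact_stats.store_id", "code_stats.store_id"},
)
# Only the store_id predicate survives the cleanup.
```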

@SparkQA commented Aug 21, 2020

Test build #127734 has finished for PR 29243 at commit 98f7275.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@maryannxue (Contributor)

> I added logic to CleanupDynamicPruningFilters to remove all DynamicPruning filters that are not on partition columns.

It is better now. Still, the pruningHasBenefit flag carried over from the original DPP filter might not be correct for the new filter.

@SparkQA commented Aug 24, 2020

Test build #127835 has finished for PR 29243 at commit c436bc4.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@wangyum (Member Author) commented Aug 24, 2020

I have fixed pruningHasBenefit for the new filter, and this change will infer more filters for subqueries.
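The point of the fix above is that the benefit check must be re-evaluated per filter rather than carried over. A rough, hypothetical sketch of such a per-filter check (Spark's real pruningHasBenefit heuristic is more involved; `pruning_has_benefit` and the numbers below are made up for the example):

```python
# Hypothetical sketch: DPP pays off only if the bytes we expect to skip
# on THIS scan outweigh the cost of evaluating the pruning subquery.
def pruning_has_benefit(scan_size_bytes, filter_ratio, subquery_cost_bytes):
    """scan_size_bytes: estimated size of the scan the filter is applied to.
    filter_ratio: estimated fraction of the scan the filter would prune.
    subquery_cost_bytes: rough cost of computing the pruning subquery."""
    return scan_size_bytes * filter_ratio > subquery_cost_bytes

# The original fact-table scan and the scan the filter was inferred for can
# disagree, so the flag computed for one must not be reused for the other:
pruning_has_benefit(10_000_000, 0.9, 1_000_000)  # large scan: worth pruning
pruning_has_benefit(50_000, 0.9, 1_000_000)      # tiny scan: not worth it
```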

# Conflicts:
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q10.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q10/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q14a.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q14a/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q14b.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q14b/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q16.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q16/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q23a.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q23a/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q23b.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q23b/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q33.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q33/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q35.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q35/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q5.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q5/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q56.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q56/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q58.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q58/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q60.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q60/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q69.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q69/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q70.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q70/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q83.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q83/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q93.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q93/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q94.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q94/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q95.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q95/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q10a.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q10a/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q14.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q14/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q14a.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q14a/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q35.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q35/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q35a.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q35a/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q5a.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q5a/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q70a.sf100/explain.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q70a/explain.txt
#	sql/core/src/test/scala/org/apache/spark/sql/DynamicPartitionPruningSuite.scala
@SparkQA commented Aug 28, 2020

Test build #127987 has finished for PR 29243 at commit b5bb9a2.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.

@wangyum (Member Author) commented Aug 28, 2020

This change is similar to #22778.

@wangyum (Member Author) commented Aug 28, 2020

retest this please

@SparkQA commented Aug 28, 2020

Test build #127990 has finished for PR 29243 at commit b5bb9a2.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@github-actions bot commented Dec 7, 2020

We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable.
If you'd like to revive this PR, please reopen it and ask a committer to remove the Stale tag!

@github-actions github-actions bot added the Stale label Dec 7, 2020
@github-actions github-actions bot closed this Dec 8, 2020
# Conflicts:
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q10.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q10/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q14a.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q14a/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q14b.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q14b/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q16.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q16/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q23a.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q23a/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q23b.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q23b/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q33.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q35.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q35/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q5.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q5/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q69.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q69/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q93.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q93/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q94.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q94/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q95.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v1_4/q95/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q10a.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q10a/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q14.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q14/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q14a.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q14a/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q35.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q35/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q35a.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q35a/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q5a.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q5a/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q70a.sf100/simplified.txt
#	sql/core/src/test/resources/tpcds-plan-stability/approved-plans-v2_7/q70a/simplified.txt
#	sql/core/src/test/scala/org/apache/spark/sql/DynamicPartitionPruningSuite.scala
@wangyum wangyum reopened this Feb 12, 2021
@wangyum wangyum removed the Stale label Feb 12, 2021
@SparkQA commented Feb 12, 2021

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/39700/

@SparkQA commented Feb 12, 2021

Kubernetes integration test status failure
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/39700/

@SparkQA commented Feb 12, 2021

Test build #135118 has finished for PR 29243 at commit b5bb9a2.

  • This patch passes all tests.
  • This patch does not merge cleanly.
  • This patch adds no public classes.

@SparkQA commented Feb 12, 2021

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/39706/

@SparkQA commented Feb 12, 2021

Kubernetes integration test status success
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/39706/

@SparkQA commented Feb 12, 2021

Test build #135125 has finished for PR 29243 at commit 9e9a633.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Feb 13, 2021

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/39717/

@SparkQA commented Feb 13, 2021

Kubernetes integration test status failure
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/39717/

@SparkQA commented Feb 13, 2021

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/39718/

@SparkQA commented Feb 13, 2021

Kubernetes integration test status failure
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/39718/

@SparkQA commented Feb 13, 2021

Test build #135136 has finished for PR 29243 at commit ba71ead.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Feb 13, 2021

Test build #135137 has finished for PR 29243 at commit c63b162.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@wangyum wangyum closed this Mar 26, 2021
4 participants