[SPARK-18949][SQL] Add recoverPartitions API to Catalog
### What changes were proposed in this pull request?
Currently, we only have a SQL interface for recovering all the partitions in the directory of a table and updating the catalog: `MSCK REPAIR TABLE` or `ALTER TABLE table RECOVER PARTITIONS`. (Actually, `MSCK` is very hard to remember, and it is not obvious what it means.)
After the new "Scalable Partition Handling" feature, table repair becomes much more important for making the data in a newly created data source partitioned table visible.
Thus, this PR adds it to the Catalog interface. After this PR, users can repair a table by calling:
```Scala
spark.catalog.recoverPartitions("testTable")
```
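Since the diff also touches `python/pyspark/sql/catalog.py`, the same call is exposed from PySpark. A minimal sketch, assuming an active `SparkSession` and an existing partitioned table; the table name `testTable` is illustrative:

```python
# PySpark counterpart of the Scala call above.
# Recovers all partitions found in the table's directory and updates the
# catalog, equivalent to SQL `MSCK REPAIR TABLE testTable`.
spark.catalog.recoverPartitions("testTable")
```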
### How was this patch tested?
Modified the existing test cases.
Author: gatorsmile <[email protected]>
Closes #16356 from gatorsmile/repairTable.
Showing 5 changed files with 32 additions and 4 deletions.
- +3 −1 project/MimaExcludes.scala
- +5 −0 python/pyspark/sql/catalog.py
- +7 −0 sql/core/src/main/scala/org/apache/spark/sql/catalog/Catalog.scala
- +14 −0 sql/core/src/main/scala/org/apache/spark/sql/internal/CatalogImpl.scala
- +3 −3 sql/hive/src/test/scala/org/apache/spark/sql/hive/PartitionProviderCompatibilitySuite.scala
0 comments on commit 24c0c94