How can I save a table to Hive so that it is a managed table (MANAGED_TABLE)?



When I save a table without specifying an explicit path, the Hive metastore ends up with a bogus "path" property pointing to "/user/hive/warehouse" instead of "/hive/warehouse". If I explicitly set the path I want with .option("path", "/hive/warehouse"), everything works, but Hive creates an external table. Is there a way to save a managed table to the Hive metastore so that the "path" property is not bogus and matches the file location in Hive?

from pyspark.sql import SparkSession

spark = SparkSession.builder.master(master_url).enableHiveSupport().getOrCreate()
df = spark.range(100)
# no explicit path: Hive shows a managed table, but with the wrong "path" property
df.write.saveAsTable("test1")
# explicit path: the location is correct, but Hive registers an external table
df.write.option("path", "/hive/warehouse").saveAsTable("test2")

hive> describe formatted test1;
OK
# col_name              data_type               comment             
id                      bigint                                      
# Detailed Table Information         
Database:               default                  
Owner:                  root                     
CreateTime:             Fri Mar 10 18:53:07 UTC 2017     
LastAccessTime:         UNKNOWN                  
Protect Mode:           None                     
Retention:              0                        
Location:               file:/hive/warehouse/test1 
Table Type:             MANAGED_TABLE            
Table Parameters:        
    spark.sql.sources.provider  parquet             
    spark.sql.sources.schema.numParts   1                   
    spark.sql.sources.schema.part.0 {"type":"struct","fields":[{"name":"id","type":"long","nullable":true,"metadata":{}}]}
    transient_lastDdlTime   1489171987          
# Storage Information        
SerDe Library:          org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe  
InputFormat:            org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat    
OutputFormat:           org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat   
Compressed:             No                       
Num Buckets:            -1                       
Bucket Columns:         []                       
Sort Columns:           []                       
Storage Desc Params:         
    path                    file:/user/hive/warehouse/test1
    serialization.format    1                   
Time taken: 0.423 seconds, Fetched: 30 row(s)

hive> describe formatted test2;
OK
# col_name              data_type               comment             
id                      bigint                                      
# Detailed Table Information         
Database:               default                  
Owner:                  root                     
CreateTime:             Fri Mar 10 16:02:07 UTC 2017     
LastAccessTime:         UNKNOWN                  
Protect Mode:           None                     
Retention:              0                        
Location:               file:/hive/warehouse/test2   
Table Type:             EXTERNAL_TABLE           
Table Parameters:        
    COLUMN_STATS_ACCURATE   false               
    EXTERNAL                TRUE                
    numFiles                2                   
    numRows                 -1                  
    rawDataSize             -1                  
    spark.sql.sources.provider  parquet             
    spark.sql.sources.schema.numParts   1                   
    spark.sql.sources.schema.part.0 {"type":"struct","fields":[{"name":"id","type":"long","nullable":true,"metadata":{}}]}
    totalSize               4755                
    transient_lastDdlTime   1489161727          
# Storage Information        
SerDe Library:          org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe  
InputFormat:            org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat    
OutputFormat:           org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat   
Compressed:             No                       
Num Buckets:            -1                       
Bucket Columns:         []                       
Sort Columns:           []                       
Storage Desc Params:         
    path                    file:/hive/warehouse/test2
    serialization.format    1                   
Time taken: 0.402 seconds, Fetched: 36 row(s)
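
The warehouse directory the Spark session itself is configured with can be printed for comparison; a minimal diagnostic sketch using the same session as above:

# Sketch: print the session's configured warehouse directory and compare it
# with the Location and "path" values reported by Hive above.
print(spark.conf.get("spark.sql.warehouse.dir"))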

I fixed the problem. In case anyone runs into something similar, here is my fix.

This problem with the incorrect "path" parameter only occurs when saving tables into the default Hive database (as shown above). That made me think the "old" default database was still using the old value of hive.metastore.warehouse.dir, while newly created databases pick up the new value.
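
One way to check this (a minimal sketch, assuming the same PySpark session with Hive support enabled) is to list the databases known to the catalog and compare their recorded locations with the intended warehouse directory:

# Sketch: each Database entry exposes the locationUri stored in the metastore,
# so a default database still pointing at the old warehouse directory stands out.
for db in spark.catalog.listDatabases():
    print(db.name, db.locationUri)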

So the fix was to drop the default database and recreate it; now all databases created in the Hive metastore use the correct value of hive.metastore.warehouse.dir.

spark.sql("create database testdb")
spark.sql("use testdb")
df.write.saveAsTable("test3")
hive> describe formatted test.test3;
OK
# col_name              data_type               comment             
id                      bigint                                      
# Detailed Table Information         
Database:               testdb                   
Owner:                  root                     
CreateTime:             Fri Mar 10 22:10:10 UTC 2017     
LastAccessTime:         UNKNOWN                  
Protect Mode:           None                     
Retention:              0                        
Location:               file:/hive/warehouse/test.db/test3   
Table Type:             MANAGED_TABLE            
Table Parameters:        
    COLUMN_STATS_ACCURATE   false               
    numFiles                1                   
    numRows                 -1                  
    rawDataSize             -1                  
    spark.sql.sources.provider  parquet             
    spark.sql.sources.schema.numParts   1                   
    spark.sql.sources.schema.part.0 {"type":"struct","fields":[{"name":"id","type":"long","nullable":true,"metadata":{}}]}
    totalSize               409                 
    transient_lastDdlTime   1489183810          
# Storage Information        
SerDe Library:          org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe  
InputFormat:            org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat    
OutputFormat:           org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat   
Compressed:             No                       
Num Buckets:            -1                       
Bucket Columns:         []                       
Sort Columns:           []                       
Storage Desc Params:         
    path                    file:/hive/warehouse/test.db/test3
    serialization.format    1                   
Time taken: 0.243 seconds, Fetched: 35 row(s)
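
For completeness, the table type can also be checked from the Spark side; a small sketch assuming the same session and the testdb database created above:

# Sketch: Table entries expose a tableType field ('MANAGED' or 'EXTERNAL'),
# so test3 should now show up as a managed table.
for t in spark.catalog.listTables("testdb"):
    print(t.name, t.tableType)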

hive.metastore.warehouse.dir

  • Default Value: /user/hive/warehouse
  • Added In: Hive 0.2.0

  Location of default database for the warehouse.

https://cwiki.apache.org/confluence/display/Hive/Configuration+Properties
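
As a related note (an illustrative sketch, not part of the fix above): since Spark 2.0 the warehouse location is normally controlled by spark.sql.warehouse.dir, which can be set explicitly when the session is built so that databases created afterwards land in the intended directory:

from pyspark.sql import SparkSession

# Sketch: pin the warehouse location when building the session.
# Note that this only affects databases created after the setting is in place;
# it does not rewrite the location already stored for an existing database.
spark = (SparkSession.builder
         .master(master_url)  # master_url as in the snippets above
         .config("spark.sql.warehouse.dir", "/hive/warehouse")
         .enableHiveSupport()
         .getOrCreate())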
