Hive -> Kafka: inserting into a Hive Kafka-integration external table fails


# Exception I got:
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.io.ParquetHiveRecord cannot be cast to org.apache.hadoop.hive.kafka.KafkaWritable

# create non-native external Hive table
CREATE EXTERNAL TABLE IF NOT EXISTS weatherHive (
  lng DOUBLE, lat DOUBLE, avg_tmpr_f DOUBLE, avg_tmpr_c DOUBLE, wthr_date STRING)
PARTITIONED BY (year INT, month INT, day INT)
TBLPROPERTIES (
  'kafka.topic' = 'weatherHive',
  'kafka.bootstrap.servers' = 'sandbox-hdp:6667',
  'kafka.serde.class' = 'org.apache.hadoop.hive.serde2.avro.AvroSerDe');

# insert test data
INSERT INTO TABLE weatherHive VALUES (-111, 22, 80, 23, '2016-10-01', 2020, 10, 1);

Then I got the exception above. What should I do? Do I need to do anything with the handler jar? I loaded it into Hive with:

add jar hdfs://sandbox-hdp.hortonworks.com:8020/sandbox/jars/kafka-handler-3.1.2000.7.0.3.0-79.jar;

You may be missing the class "org.apache.hadoop.hive.kafka.KafkaSerDe". See this answer.
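More likely, the ClassCastException happens because the DDL above never names the Kafka storage handler, so Hive builds a regular (Parquet-backed) table and then cannot cast its records to KafkaWritable. A sketch of a corrected DDL, assuming the kafka-handler jar added above is on the classpath (the `STORED BY` class name is the one the handler jar provides; Kafka-backed tables do not support `PARTITIONED BY`, so year/month/day become ordinary columns here):

```sql
-- Corrected table definition (sketch): the key change is STORED BY the
-- Kafka storage handler, which makes Hive write KafkaWritable records
-- instead of falling back to its default Parquet record format.
CREATE EXTERNAL TABLE IF NOT EXISTS weatherHive (
  lng DOUBLE, lat DOUBLE, avg_tmpr_f DOUBLE, avg_tmpr_c DOUBLE,
  wthr_date STRING, year INT, month INT, day INT)
STORED BY 'org.apache.hadoop.hive.kafka.KafkaStorageHandler'
TBLPROPERTIES (
  'kafka.topic' = 'weatherHive',
  'kafka.bootstrap.servers' = 'sandbox-hdp:6667',
  'kafka.serde.class' = 'org.apache.hadoop.hive.serde2.avro.AvroSerDe');
```

Note that the storage handler also exposes Kafka metadata columns (`__key`, `__partition`, `__offset`, `__timestamp`), so an INSERT into this table may need placeholder values for them; check the Hive Kafka integration documentation for your distribution.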