I am trying to pass a fourth parameter (targetFileCount) to the method below:
val config = ConfigFactory.load("market_opt_partition.properties")
val targetFileCount = config.getInt(Code)

writeArray1.par.foreach {
  case (df, path, tog, targetFileCount) => Utility.write(df, path, tog, targetFileCount)
}

object Utility {
  def write(sourceDf: DataFrame, path: String, toggle: String, targetFileCount: Int): Unit
}
But I am facing the following error:
Error:(368, 12) constructor cannot be instantiated to expected type;
found : (T1, T2, T3, T4)
required: (org.apache.spark.sql.DataFrame, String, String)
case (df, path, tog, targetFileCount) => Utility.write(df, path, tog, targetFileCount)
Error:(368, 67) not found: value df
case (df, path, tog, targetFileCount) => Utility.write(df, path, tog, targetFileCount)
Please let me know how to correct this.
writeArray1 contains Tuple3 elements of (org.apache.spark.sql.DataFrame, String, String), so a pattern match that destructures four parameters cannot succeed.
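To see the arity requirement in isolation, here is a minimal sketch (with made-up values) showing that a Tuple3 can be destructured into exactly three binders and no more:

val triple: (Int, String, String) = (1, "a", "b")
triple match {
  case (x, y, z) => println(s"$x $y $z") // compiles: three binders for a Tuple3
}
// case (x, y, z, w) => ... would not compile: a Tuple4 pattern
// cannot be instantiated to the expected Tuple3 type.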
Another example:
val l = List(5)
l.map { case (a, b) => a.toString }
also produces the same kind of error:
error: constructor cannot be instantiated to expected type;
found : (T1, T2)
required: Int
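The fix for this toy example is to match the actual shape of the elements. List(5) holds plain Ints, not pairs, so bind a single value:

val l = List(5)
l.map { a => a.toString } // or simply: l.map(_.toString)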
As noted above, each element of writeArray1.par is a Tuple3 of (org.apache.spark.sql.DataFrame, String, String), so only three components can be destructured in the pattern; targetFileCount must instead be captured from the enclosing scope. Use the following:
val config = ConfigFactory.load("market_opt_partition.properties")
val targetFileCount = config.getInt(Code)

writeArray1.par.foreach {
  case (df, path, tog) => Utility.write(df, path, tog, targetFileCount)
}

object Utility {
  def write(sourceDf: DataFrame, path: String, toggle: String, targetFileCount: Int): Unit
}
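For completeness, here is a self-contained sketch of the corrected flow. The config key "target.file.count", the repartition-based body of write, and the overwrite save mode are illustrative assumptions, not part of the original code:

import com.typesafe.config.ConfigFactory
import org.apache.spark.sql.DataFrame

object Utility {
  def write(sourceDf: DataFrame, path: String, toggle: String, targetFileCount: Int): Unit = {
    // Hypothetical body: repartition to the requested number of output files
    // before writing. The toggle parameter is unused in this sketch.
    sourceDf.repartition(targetFileCount).write.mode("overwrite").save(path)
  }
}

val config = ConfigFactory.load("market_opt_partition.properties")
val targetFileCount = config.getInt("target.file.count") // hypothetical key name

// writeArray1 is assumed to be a Seq[(DataFrame, String, String)] built elsewhere.
writeArray1.par.foreach {
  case (df, path, tog) =>
    // targetFileCount is captured from the enclosing scope rather than destructured.
    Utility.write(df, path, tog, targetFileCount)
}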