Spark DataFrame and renaming multiple columns (Java)



Is there a better way to prefix or rename all (or multiple) columns of a given SparkSQL DataFrame at the same time than calling dataFrame.withColumnRenamed() over and over?

An example would be if I want to detect changes (using a full outer join). Then I am left with two DataFrames that have the same structure.

I suggest using the select() method to do this. In fact, withColumnRenamed() itself uses select() internally. Here is an example of how to rename multiple columns:

import org.apache.spark.sql.functions._
val someDataframe: DataFrame = ...
val initialColumnNames = Seq("a", "b", "c")
val renamedColumns = initialColumnNames.map(name => col(name).as(s"renamed_$name"))
someDataframe.select(renamedColumns : _*)
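The same select-based idea can be sketched in Java by building the `name AS renamed_name` expressions as plain strings and then handing them to `Dataset.selectExpr(...)`. The sketch below only constructs the strings, so it runs without a Spark dependency; the column names and the `renamed_` prefix are assumptions for illustration:

```java
import java.util.List;
import java.util.stream.Collectors;

public class RenameAll {
    // Build "name AS renamed_name" SQL expressions that could be passed to
    // Dataset.selectExpr(...) in Spark; here we only construct the strings.
    public static List<String> prefixExprs(List<String> columns, String prefix) {
        return columns.stream()
                .map(name -> name + " AS " + prefix + name)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> exprs = prefixExprs(List.of("a", "b", "c"), "renamed_");
        System.out.println(exprs);
        // In Spark this would then be:
        // someDataframe.selectExpr(exprs.toArray(new String[0]))
    }
}
```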

I think this method can help you.

// Renames every column of the Dataset to camelCase, using the
// underscoreToCamelCase helper defined below.
public static Dataset<Row> renameDataFrame(Dataset<Row> dataset) {
    for (String column : dataset.columns()) {
        dataset = dataset.withColumnRenamed(column, SystemUtils.underscoreToCamelCase(column));
    }
    return dataset;
}

public static String underscoreToCamelCase(String underscoreName) {
    StringBuilder result = new StringBuilder();
    if (underscoreName != null && underscoreName.length() > 0) {
        boolean upperCaseNext = false;
        for (int i = 0; i < underscoreName.length(); i++) {
            char ch = underscoreName.charAt(i);
            if (ch == '_') {
                upperCaseNext = true;
            } else if (upperCaseNext) {
                result.append(Character.toUpperCase(ch));
                upperCaseNext = false;
            } else {
                result.append(ch);
            }
        }
    }
    return result.toString();
}
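As a quick sanity check of the conversion logic, the snippet below repeats the same algorithm in a self-contained class (the sample column names are made up):

```java
public class UnderscoreDemo {
    // Same conversion as the helper above: drop underscores and
    // upper-case the character that follows each one.
    public static String underscoreToCamelCase(String underscoreName) {
        StringBuilder result = new StringBuilder();
        if (underscoreName != null && underscoreName.length() > 0) {
            boolean upperCaseNext = false;
            for (int i = 0; i < underscoreName.length(); i++) {
                char ch = underscoreName.charAt(i);
                if (ch == '_') {
                    upperCaseNext = true;
                } else if (upperCaseNext) {
                    result.append(Character.toUpperCase(ch));
                    upperCaseNext = false;
                } else {
                    result.append(ch);
                }
            }
        }
        return result.toString();
    }

    public static void main(String[] args) {
        System.out.println(underscoreToCamelCase("user_first_name")); // userFirstName
        System.out.println(underscoreToCamelCase("id"));              // id
    }
}
```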

I have found the answer

df1_r = df1.select(*(col(x).alias(x + '_df1') for x in df1.columns))

on Stack Overflow here (see the end of the accepted answer)

// Note: the renamed result must be accumulated across iterations,
// otherwise only the last rename survives.
var newsales_var = newsales
for (a <- 0 to newsales.columns.length - 1) {
  val new_c = newsales.columns(a).replace('(', '_').replace(')', ' ').trim
  newsales_var = newsales_var.withColumnRenamed(newsales.columns(a), new_c)
}
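The character cleanup in the loop above (stripping the parentheses that Spark aggregate functions put into column names) can be checked in isolation. This Java sketch mirrors the same replace/trim chain; the sample column names are made up:

```java
public class SanitizeColumns {
    // Mirrors the loop body above: '(' -> '_', ')' -> ' ', then trim.
    public static String sanitize(String column) {
        return column.replace('(', '_').replace(')', ' ').trim();
    }

    public static void main(String[] args) {
        // Aggregations like df.agg(sum("amount")) yield names such as "sum(amount)".
        System.out.println(sanitize("sum(amount)")); // sum_amount
        System.out.println(sanitize("avg(price)"));  // avg_price
    }
}
```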
