How do I convert a byte array to an org.apache.spark.sql.Column?
For example:
import org.apache.spark.sql.Column
...
def toColumn(bytes: Array[Byte]): Column = {
  // todo
}
Using lit should do the job. It wraps the byte array as a BinaryType literal column; note that lit lives in org.apache.spark.sql.functions, so it needs to be imported:

import org.apache.spark.sql.functions.lit

def toColumn(bytes: Array[Byte]): Column = {
  lit(bytes)
}
val df = spark.range(1)
df.withColumn("byte", toColumn(Array[Byte](1,1,1,1))).show
+---+-------------+
| id| byte|
+---+-------------+
| 0|[01 01 01 01]|
+---+-------------+