Spark adds hidden parameter to the constructor of a Scala class



I don't know how to explain this, but Spark seems to add a hidden (implicit?) parameter to class constructors. This is the code I tried in spark-shell (in a regular Scala shell the parameter list is empty):

scala> class A {}
defined class A
scala> classOf[A].getConstructors()(0).getAnnotatedParameterTypes
res0: Array[java.lang.reflect.AnnotatedType] = Array(sun.reflect.annotation.AnnotatedTypeFactory$AnnotatedTypeBaseImpl@5ed65e4b)

Because of this parameter I cannot pass a custom InputFormat class to Spark's hadoopFile function. Any hints on what is going on here, or at least on how to create a class with a no-argument constructor?

The behavior seems to be the same in the plain Scala REPL:

$ scala
Welcome to Scala 2.13.3 (Java HotSpot(TM) 64-Bit GraalVM EE 19.3.0, Java 1.8.0_231).
Type in expressions for evaluation. Or try :help.
scala> class A {}
class A
scala> classOf[A].getConstructors()(0).getAnnotatedParameterTypes
val res0: Array[java.lang.reflect.AnnotatedType] = Array(sun.reflect.annotation.AnnotatedTypeFactory$AnnotatedTypeBaseImpl@383864d5)
scala> classOf[A].getConstructors()(0).getParameters
val res1: Array[java.lang.reflect.Parameter] = Array(final $iw $outer)

The REPL nests the classes (every line in the REPL is wrapped in an instantiation of an outer class). This adds an instance of the outer class as a parameter to the constructor ($outer is the name of the parameter, $iw is the outer class). You can reproduce the behavior as follows:

class X {
  class A {}
}

object App {
  def main(args: Array[String]): Unit = {
    val x = new X
    println(classOf[x.A].getConstructors()(0).getAnnotatedParameterTypes.mkString(","))
    // sun.reflect.annotation.AnnotatedTypeFactory$AnnotatedTypeBaseImpl@2f7c7260
    println(classOf[x.A].getConstructors()(0).getParameters.mkString(","))
    // final X $outer
  }
}
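
To make the correspondence explicit (this instantiation step is my addition to the example above): reflectively constructing such a nested class requires passing the enclosing instance as the hidden first argument, exactly as will be done for the REPL-defined class below.

object Demo {
  def main(args: Array[String]): Unit = {
    val x = new X
    // Supply the enclosing instance for the hidden $outer parameter:
    val a = classOf[x.A].getConstructors()(0).newInstance(x)
    println(a) // something like X$A@282ffbf5
    // Source-level equivalent; here the compiler passes x for us:
    val a2 = new x.A
  }
}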

If you run the REPL with the compiler option -Xprint:typer enabled (scala -Xprint:typer or spark-shell -Xprint:typer), you will see:

$ scala -Xprint:typer
Welcome to Scala 2.13.3 (Java HotSpot(TM) 64-Bit GraalVM EE 19.3.0, Java 1.8.0_231).
Type in expressions for evaluation. Or try :help.
scala> class A
[[syntax trees at end of                     typer]] // <console>
package $line3 {
  sealed class $read extends AnyRef with Serializable {
    def <init>(): $line3.$read = {
      $read.super.<init>();
      ()
    };
    sealed class $iw extends AnyRef with java.io.Serializable {
      def <init>(): $iw = {
        $iw.super.<init>();
        ()
      };
      class A extends scala.AnyRef {
        def <init>(): A = {
          A.super.<init>();
          ()
        }
      }
    };
    private[this] val $iw: $iw = new $read.this.$iw();
    <stable> <accessor> def $iw: $iw = $read.this.$iw
  };
  object $read extends scala.AnyRef with java.io.Serializable {
    def <init>(): type = {
      $read.super.<init>();
      ()
    };
    private[this] val INSTANCE: $line3.$read = new $read();
    <stable> <accessor> def INSTANCE: $line3.$read = $read.this.INSTANCE;
    <synthetic> private def writeReplace(): Object = new scala.runtime.ModuleSerializationProxy(classOf[$line3.$read$])
  }
}
class A

So the additional constructor parameter $outer can be supplied as $line3.$read.INSTANCE.$iw:

scala> classOf[A].getConstructors()(0).newInstance($line3.$read.INSTANCE.$iw)
...
val res0: Object = A@282ffbf5

Note that this encoding can change between Scala versions. For example, the spark-shell of Spark 3.0.1 (pre-built for Hadoop 3.2) uses Scala 2.12.10, and there it should be $lineXXX.$read.INSTANCE.$iw.$iw rather than $lineXXX.$read.INSTANCE.$iw:

$ spark-shell -Xprint:typer
20/11/25 16:32:16 WARN Utils: Your hostname, dmitin-HP-Pavilion-Laptop resolves to a loopback address: 127.0.1.1; using 192.168.0.103 instead (on interface wlo1)
20/11/25 16:32:16 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/11/25 16:32:16 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.0.103:4040
Spark context available as 'sc' (master = local[*], app id = local-1606314741512).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.1
      /_/

Using Scala version 2.12.10 (Java HotSpot(TM) 64-Bit GraalVM EE 19.3.0, Java 1.8.0_231)
Type in expressions to have them evaluated.
Type :help for more information.
scala> class A
[[syntax trees at end of                     typer]] // <console>
package $line14 {
  sealed class $read extends AnyRef with java.io.Serializable {
    def <init>(): $line14.$read = {
      $read.super.<init>();
      ()
    };
    sealed class $iw extends AnyRef with java.io.Serializable {
      def <init>(): $read.this.$iw = {
        $iw.super.<init>();
        ()
      };
      sealed class $iw extends AnyRef with java.io.Serializable {
        def <init>(): $iw = {
          $iw.super.<init>();
          ()
        };
        class A extends scala.AnyRef {
          def <init>(): A = {
            A.super.<init>();
            ()
          }
        }
      };
      private[this] val $iw: $iw = new $iw.this.$iw();
      <stable> <accessor> def $iw: $iw = $iw.this.$iw
    };
    private[this] val $iw: $read.this.$iw = new $read.this.$iw();
    <stable> <accessor> def $iw: $read.this.$iw = $read.this.$iw
  };
  object $read extends scala.AnyRef with Serializable {
    def <init>(): $line14.$read.type = {
      $read.super.<init>();
      ()
    };
    private[this] val INSTANCE: $line14.$read = new $read();
    <stable> <accessor> def INSTANCE: $line14.$read = $read.this.INSTANCE;
    <synthetic> private def readResolve(): Object = $line14.$read
  }
}
defined class A
scala> classOf[A].getConstructors()(0).newInstance($line14.$read.INSTANCE.$iw.$iw)
...
res0: Any = A@6621ab0c
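
Since the exact wrapper path differs between versions, one way to avoid hardcoding $lineXXX.$read.INSTANCE.$iw(.$iw) is to derive the $outer argument from the constructor itself. The helper below is my own untested sketch, not part of the original answer; it assumes (as the -Xprint:typer output above suggests) that the $read companion object exposes an INSTANCE accessor and that every wrapper level exposes a $iw accessor:

// Hypothetical helper: instantiate a REPL-defined class by walking
// the wrapper accessors to build the hidden $outer argument.
def instantiateReplClass(clazz: Class[_]): AnyRef = {
  val ctor = clazz.getConstructors()(0)
  if (ctor.getParameterCount == 0) return ctor.newInstance().asInstanceOf[AnyRef]
  // Enclosing-class chain of the $outer type, from $read down to the innermost $iw:
  val outerType = ctor.getParameterTypes()(0)
  val chain = Iterator.iterate[Class[_]](outerType)(_.getEnclosingClass).takeWhile(_ != null).toList.reverse
  // $read's companion class ($read$) holds the singleton as its MODULE$ field:
  val module = Class.forName(chain.head.getName + "$").getField("MODULE$").get(null)
  var outer = module.getClass.getMethod("INSTANCE").invoke(module)
  // Descend one $iw accessor per remaining nesting level:
  for (_ <- chain.tail) outer = outer.getClass.getMethod("$iw").invoke(outer)
  ctor.newInstance(outer).asInstanceOf[AnyRef]
}

The intended usage in either REPL (again, untested) would be instantiateReplClass(classOf[A]).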

With Scala 2.12.6, scala -Xprint:typer produces:

$ ./scala -Xprint:typer
Welcome to Scala 2.12.6 (Java HotSpot(TM) 64-Bit GraalVM EE 19.3.0, Java 1.8.0_231).
Type in expressions for evaluation. Or try :help.
scala> class A
[[syntax trees at end of                     typer]] // <console>
package $line3 {
  object $read extends scala.AnyRef {
    def <init>(): $line3.$read.type = {
      $read.super.<init>();
      ()
    };
    object $iw extends scala.AnyRef {
      def <init>(): type = {
        $iw.super.<init>();
        ()
      };
      object $iw extends scala.AnyRef {
        def <init>(): type = {
          $iw.super.<init>();
          ()
        };
        class A extends scala.AnyRef {
          def <init>(): A = {
            A.super.<init>();
            ()
          }
        }
      }
    }
  }
}
defined class A

So there class A is nested inside objects ($line3.$read.$iw.$iw) rather than inside a class, and in that case the additional parameter is not added to A's constructor:

object X {
  class A {}
}

object App {
  def main(args: Array[String]): Unit = {
    val x = X
    println(classOf[x.A].getConstructors()(0).getAnnotatedParameterTypes.toList)
    // List()
    println(classOf[x.A].getConstructors()(0).getParameters.toList)
    // List()
  }
}
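
Finally, back to the original question of getting a no-argument constructor in the first place: a workaround worth trying (my suggestion, not part of the answer above) is the Scala 2 REPL's :paste -raw command, which compiles the pasted source as a top-level compilation unit instead of nesting it inside the wrapper, so the resulting class has a genuine no-argument constructor that could be passed to hadoopFile:

scala> :paste -raw
// Entering paste mode (ctrl-D to finish)

package mypkg   // hypothetical package name; raw mode accepts full compilation units
class A

// press Ctrl-D to compile the unit with scalac

scala> classOf[mypkg.A].getConstructors()(0).getParameters.length
// expected: 0, since mypkg.A is now a top-level class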
