PredictionIO-0.10.0 - No suitable driver found for jdbc:postgresql://localhost/pio
PredictionIO 0.10.0 on Ubuntu 16.04 returns the error "Exception in thread "main" java.sql.SQLException: No suitable driver found for jdbc:postgresql://localhost/pio" when I run bin/pio import --appid 1 --input engine/data/stopwords.json.

Where does PIO look for the driver?

Running bin/pio status works with no problems:

[INFO] [Console$] Inspecting PredictionIO...
[INFO] [Console$] PredictionIO 0.10.0-incubating is installed at /home/homedir/mnt/predictionio/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating
[INFO] [Console$] Inspecting Apache Spark...
[INFO] [Console$] Apache Spark is installed at /home/homedir/mnt/predictionio/apache-predictionio-0.10.0-incubating/PredictionIO-0.10.0-incubating/vendors/spark-1.5.1-bin-hadoop2.6
[INFO] [Console$] Apache Spark 1.5.1 detected (meets minimum requirement of 1.3.0)
[INFO] [Console$] Inspecting storage backend connections...
[INFO] [Storage$] Verifying Meta Data Backend (Source: PGSQL)...
[INFO] [Storage$] Verifying Model Data Backend (Source: PGSQL)...
[INFO] [Storage$] Verifying Event Data Backend (Source: PGSQL)...
[INFO] [Storage$] Test writing to Event Store (App Id 0)...
[INFO] [Console$] (sleeping 5 seconds for all messages to show up...)
[INFO] [Console$] Your system is all ready to go.

The PostgreSQL JDBC jar is installed at /usr/lib/jvm/postgresql-42.0.0.jar.
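A quick way to rule out a bad jar is to confirm the file exists and actually contains the org.postgresql.Driver class that DriverManager looks for. This is a minimal sketch, assuming the jar path from the question; the check_driver_jar helper name is my own:

```shell
# Verify that a PostgreSQL JDBC jar exists and contains the driver class.
check_driver_jar() {
  jar="$1"
  if [ ! -f "$jar" ]; then
    echo "jar not found: $jar"
  elif unzip -l "$jar" | grep -q 'org/postgresql/Driver.class'; then
    echo "driver class present"
  else
    echo "driver class missing"
  fi
}

# Path taken from the question; adjust for your install.
check_driver_jar /usr/lib/jvm/postgresql-42.0.0.jar
```

If the class is present, the problem is not the jar itself but that Spark's driver classpath does not include it, which matches the stack trace below.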

bin/pio import --appid 1 --input engine/data/stopwords.json
    [INFO] [Remoting] Starting remoting
    [INFO] [Remoting] Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.1.66:41940]
    [WARN] [MetricsSystem] Using default name DAGScheduler for source because spark.app.id is not set.
    Exception in thread "main" java.sql.SQLException: No suitable driver found for jdbc:postgresql://localhost/pio
            at java.sql.DriverManager.getConnection(DriverManager.java:689)
            at java.sql.DriverManager.getConnection(DriverManager.java:208)
            at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:188)
            at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$getConnector$1.apply(JDBCRDD.scala:181)
            at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.createConnection(JdbcUtils.scala:39)
            at org.apache.spark.sql.DataFrameWriter.jdbc(DataFrameWriter.scala:253)
            at org.apache.predictionio.data.storage.jdbc.JDBCPEvents.write(JDBCPEvents.scala:162)
            at org.apache.predictionio.tools.imprt.FileToEvents$$anonfun$main$1.apply(FileToEvents.scala:101)
            at org.apache.predictionio.tools.imprt.FileToEvents$$anonfun$main$1.apply(FileToEvents.scala:68)
            at scala.Option.map(Option.scala:145)
            at org.apache.predictionio.tools.imprt.FileToEvents$.main(FileToEvents.scala:68)
            at org.apache.predictionio.tools.imprt.FileToEvents.main(FileToEvents.scala)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
            at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
            at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
            at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
            at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Tell PIO where to find the jar on the command line:

bin/pio import --appid 1 --input engine/data/emails.json -- --driver-class-path /usr/lib/jvm/postgresql-42.0.0.jar

Alternatively, by default PIO looks for the jar at $PIO_HOME/lib/postgresql-9.4-1204.jdbc41.jar, as defined in the file $PIO_HOME/conf/pio-env.sh, for example:

POSTGRES_JDBC_DRIVER=$PIO_HOME/lib/postgresql-9.4-1204.jdbc41.jar
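To make the fix permanent instead of passing --driver-class-path on every import, the same variable can point at the jar actually installed on this machine. This is a sketch assuming the /usr/lib/jvm path from the question; verify the path against your install:

```shell
# $PIO_HOME/conf/pio-env.sh
# Point PIO at the actual location of the PostgreSQL JDBC jar
# (path taken from the question; adjust for your install):
POSTGRES_JDBC_DRIVER=/usr/lib/jvm/postgresql-42.0.0.jar
```

After editing pio-env.sh, re-run bin/pio status to confirm the storage backends still verify before retrying the import.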
