GeoSpark transform SQL function fails



I am using GeoSpark 1.3.1 and am trying to find all geo points contained in a circle, given a center and a radius in meters. To do this, I want to transform the center from degrees to meters, create the circle (using ST_Buffer), transform the returned polygon back to degrees, and then apply the ST_Contains function in a join against all the geo points. See the SQL below:

WITH point_data AS (
    SELECT
        ST_Point(CAST(c.lon as Decimal(24,20)), CAST(c.lat as Decimal(24,20))) as geo_point
    FROM point_data_view as c
)
SELECT * FROM point_data as pd
WHERE ST_Contains(ST_Transform(ST_Buffer(ST_Transform(ST_Point(<LON>, <LAT>), 'epsg:4326', 'epsg:3857'), 1000.0), 'epsg:3857', 'epsg:4326'), pd.geo_point) = true
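
For reference, the query runs in a session where the GeoSpark SQL functions have been registered via GeoSparkSQLRegistrator; a minimal sketch of that setup (the app name and the circleQuery value are illustrative placeholders):

import org.apache.spark.serializer.KryoSerializer
import org.apache.spark.sql.SparkSession
import org.datasyslab.geospark.serde.GeoSparkKryoRegistrator
import org.datasyslab.geosparksql.utils.GeoSparkSQLRegistrator

val spark = SparkSession.builder()
  .appName("geo-circle-query") // illustrative
  .config("spark.serializer", classOf[KryoSerializer].getName)
  .config("spark.kryo.registrator", classOf[GeoSparkKryoRegistrator].getName)
  .getOrCreate()

// Registers ST_Point, ST_Transform, ST_Buffer, ST_Contains, etc. as Spark SQL functions
GeoSparkSQLRegistrator.registerAll(spark)

val circleQuery = "..." // the SQL above, with <LON>/<LAT> filled in
val result = spark.sql(circleQuery)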

However, when I add the dependencies to my pom file as described in the GeoSpark guide and create an uber jar that I submit (using spark2-submit), I get the following error (only when the ST_Transform function is used):

java.lang.NoSuchMethodError: org.hsqldb.DatabaseURL.parseURL(Ljava/lang/String;ZZ)Lorg/hsqldb/persist/HsqlProperties;
at org.hsqldb.jdbc.JDBCDriver.getConnection(Unknown Source)
at org.hsqldb.jdbc.JDBCDataSource.getConnection(Unknown Source)
at org.hsqldb.jdbc.JDBCDataSource.getConnection(Unknown Source)
at org.geotools.referencing.factory.epsg.DirectEpsgFactory.getConnection(DirectEpsgFactory.java:3302)
at org.geotools.referencing.factory.epsg.ThreadedEpsgFactory.createBackingStore(ThreadedEpsgFactory.java:436)
at org.geotools.referencing.factory.DeferredAuthorityFactory.getBackingStore(DeferredAuthorityFactory.java:133)
at org.geotools.referencing.factory.BufferedAuthorityFactory.isAvailable(BufferedAuthorityFactory.java:235)
at org.geotools.referencing.factory.DeferredAuthorityFactory.isAvailable(DeferredAuthorityFactory.java:119)
at org.geotools.factory.FactoryRegistry.isAvailable(FactoryRegistry.java:667)
at org.geotools.factory.FactoryRegistry.isAcceptable(FactoryRegistry.java:501)
at org.geotools.factory.FactoryRegistry.getServiceImplementation(FactoryRegistry.java:437)
at org.geotools.factory.FactoryRegistry.getServiceProvider(FactoryRegistry.java:365)
at org.geotools.factory.FactoryCreator.getServiceProvider(FactoryCreator.java:145)
at org.geotools.referencing.ReferencingFactoryFinder.getAuthorityFactory(ReferencingFactoryFinder.java:220)
at org.geotools.referencing.ReferencingFactoryFinder.getCRSAuthorityFactory(ReferencingFactoryFinder.java:440)
at org.geotools.referencing.factory.epsg.LongitudeFirstFactory.createBackingStore(LongitudeFirstFactory.java:192)
at org.geotools.referencing.factory.DeferredAuthorityFactory.getBackingStore(DeferredAuthorityFactory.java:133)
at org.geotools.referencing.factory.BufferedAuthorityFactory.isAvailable(BufferedAuthorityFactory.java:235)
at org.geotools.referencing.factory.DeferredAuthorityFactory.isAvailable(DeferredAuthorityFactory.java:119)
at org.geotools.factory.FactoryRegistry.isAvailable(FactoryRegistry.java:667)
at org.geotools.factory.FactoryRegistry.isAcceptable(FactoryRegistry.java:501)
at org.geotools.factory.FactoryRegistry$1.filter(FactoryRegistry.java:192)
at javax.imageio.spi.FilterIterator.advance(ServiceRegistry.java:834)
at javax.imageio.spi.FilterIterator.<init>(ServiceRegistry.java:828)
at javax.imageio.spi.ServiceRegistry.getServiceProviders(ServiceRegistry.java:519)
at org.geotools.factory.FactoryRegistry.getServiceProviders(FactoryRegistry.java:197)
at org.geotools.referencing.ReferencingFactoryFinder.getFactories(ReferencingFactoryFinder.java:180)
at org.geotools.referencing.ReferencingFactoryFinder.getCRSAuthorityFactories(ReferencingFactoryFinder.java:455)
at org.geotools.referencing.DefaultAuthorityFactory.getBackingFactory(DefaultAuthorityFactory.java:89)
at org.geotools.referencing.DefaultAuthorityFactory.<init>(DefaultAuthorityFactory.java:69)
at org.geotools.referencing.CRS.getAuthorityFactory(CRS.java:263)
at org.geotools.referencing.CRS.decode(CRS.java:525)
at org.geotools.referencing.CRS.decode(CRS.java:453)
at org.apache.spark.sql.geosparksql.expressions.ST_Transform.eval(Functions.scala:237)

I have tried shading and relocating "org.hsqldb" in my build, but that changed nothing, and I get the same error when I leave the GeoSpark jars out of the uber jar and instead load them as part of spark2-submit. I really cannot find a way around this, and it only happens when I use the ST_Transform function.

It looks like the org.hsqldb version on my Spark platform is older than the one GeoSpark ships with!
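
One way to confirm which copy wins at runtime is to ask the classloader where the driver class actually comes from (a diagnostic sketch, e.g. to run in spark2-shell on the cluster):

// Prints the jar that org.hsqldb.jdbc.JDBCDriver is loaded from. If this
// points at a platform/Hadoop jar rather than the uber jar, the older
// hsqldb is shadowing the version that GeoSpark's GeoTools code expects.
val driverClass = Class.forName("org.hsqldb.jdbc.JDBCDriver")
println(driverClass.getProtectionDomain.getCodeSource.getLocation)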

My shading setup looks like this:

<properties>
    <geoSparkVersion>1.3.1</geoSparkVersion>
    <sparkVersion>2.3.0</sparkVersion>
    <clouderaPackage>cloudera2</clouderaPackage>
    <scalaVersion>2.11.0</scalaVersion>
    <scalaBinaryVersion>2.11</scalaBinaryVersion>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scalaBinaryVersion}</artifactId>
        <version>${sparkVersion}.${clouderaPackage}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scalaBinaryVersion}</artifactId>
        <version>${sparkVersion}.${clouderaPackage}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.datasyslab</groupId>
        <artifactId>geospark</artifactId>
        <version>${geoSparkVersion}</version>
    </dependency>
    <dependency>
        <groupId>org.datasyslab</groupId>
        <artifactId>geospark-sql_2.3</artifactId>
        <version>${geoSparkVersion}</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.1</version>
            <configuration>
                <shadedArtifactAttached>true</shadedArtifactAttached>
                <finalName>${artifactId}-${version}-${jarNameWithDependencies}</finalName>
                <filters>
                    <filter>
                        <artifact>*:*</artifact>
                        <excludes>
                            <exclude>META-INF/*.SF</exclude>
                            <exclude>META-INF/*.DSA</exclude>
                            <exclude>META-INF/*.RSA</exclude>
                        </excludes>
                    </filter>
                </filters>
                <relocations>
                    <relocation>
                        <pattern>org.hsqldb</pattern>
                        <shadedPattern>shaded.org.hsqldb</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

Looks like a classic dependency problem: your uber jar probably contains a different version of the org.hsqldb library than the one on your cluster. You should try excluding org.hsqldb.* from your dependencies, or shading it. I am guessing you are using the maven-shade-plugin? If so, you can see how to exclude dependencies here: https://maven.apache.org/plugins/maven-shade-plugin/examples/includes-excludes.html
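
For example, a minimal sketch of such an exclusion against the POM above (assuming the geospark artifact is what pulls hsqldb in transitively; run mvn dependency:tree to find the real carrier):

<dependency>
    <groupId>org.datasyslab</groupId>
    <artifactId>geospark</artifactId>
    <version>${geoSparkVersion}</version>
    <exclusions>
        <!-- assumption: geospark drags in org.hsqldb:hsqldb transitively;
             adjust to whichever dependency dependency:tree actually shows -->
        <exclusion>
            <groupId>org.hsqldb</groupId>
            <artifactId>hsqldb</artifactId>
        </exclusion>
    </exclusions>
</dependency>

Since the stale copy in your case seems to be the one on the cluster, the opposite approach may also be worth a try: Spark's (experimental) spark.driver.userClassPathFirst and spark.executor.userClassPathFirst settings make the classes in your uber jar win over the platform's.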
