Recursively searching for files with a wildcard in Apache Beam

I am writing an Apache Beam pipeline in Scala using Spotify's Scio library. I want to search a directory recursively for files on a file system, which could be HDFS, Alluxio, or GCS. A pattern like *.jar should find all matching files under the provided directory and all of its subdirectories.

The Apache Beam SDK provides the org.apache.beam.sdk.io.FileIO class for this purpose; with pipeline.apply(FileIO.match().filepattern(filesPattern)) I can find files at a single directory level.

How can I make it search recursively for all files matching the provided pattern?
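For example, I would expect something along these lines to work (a sketch with a placeholder path; I have not confirmed whether "**" globs are supported across HDFS, Alluxio, and GCS):

import org.apache.beam.sdk.Pipeline
import org.apache.beam.sdk.io.FileIO

// Sketch: pass a recursive glob directly to FileIO.match.
// Whether "**" is honored depends on the FileSystem implementation.
// Note: `match` needs backticks in Scala because it is a reserved word.
val pipeline = Pipeline.create()
val matched = pipeline.apply(
  FileIO.`match`().filepattern("hdfs:///data/input/**/*.jar") // placeholder path
)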

For now I am trying another approach: create a resourceId from the provided pattern, get the pattern's current directory, and then resolve all subdirectories of the current directory using the resourceId.resolve() method. But this throws an exception.

import org.apache.beam.sdk.io.FileSystems
import org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions

val currentDir = FileSystems.matchNewResource(filesPattern, false).getCurrentDirectory
val childDir = currentDir.resolve("{@literal *}", StandardResolveOptions.RESOLVE_DIRECTORY)

For currentDir.resolve, I get the following exception:

------------------------------------------------------------
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:546)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:427)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:813)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:287)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1050)
at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1126)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1126)
Caused by: java.lang.IllegalArgumentException: Illegal character in path at index 0: {@literal *}/
at java.net.URI.create(URI.java:852)
at java.net.URI.resolve(URI.java:1036)
at org.apache.beam.sdk.io.hdfs.HadoopResourceId.resolve(HadoopResourceId.java:46)
at com.sparkcognition.foundation.ingest.jobs.copyjob.FileOperations$.findFiles(BinaryFilesSink.scala:110)
at com.sparkcognition.foundation.ingest.jobs.copyjob.BinaryFilesSink$.main(BinaryFilesSink.scala:39)
at com.sparkcognition.foundation.ingest.jobs.copyjob.BinaryFilesSink.main(BinaryFilesSink.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
... 12 more
Caused by: java.net.URISyntaxException: Illegal character in path at index 0: {@literal *}/
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.checkChars(URI.java:3021)
at java.net.URI$Parser.parseHierarchical(URI.java:3105)
at java.net.URI$Parser.parse(URI.java:3063)
at java.net.URI.<init>(URI.java:588)
at java.net.URI.create(URI.java:850)
... 22 more

Please advise: what is the correct way to search for files recursively with Apache Beam?

Reference: https://beam.apache.org/releases/javadoc/2.11.0/index.html?org/apache/beam/sdk/io/fs/ResourceId.html

It looks like you copied some code from a faulty javadoc. Some old versions of the sample code were published with errors around the asterisks: "{@literal *}" is javadoc escaping that leaked into the example, and passing it verbatim produces exactly the URISyntaxException you are seeing.

To find all files under currentDir:

val childDir = currentDir.resolve("**", StandardResolveOptions.RESOLVE_FILES)
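
The resolved ResourceId can then be turned back into a match spec. A minimal sketch (the base directory is a placeholder, and matching "**" still relies on the target FileSystem's glob handling):

import org.apache.beam.sdk.io.FileSystems
import org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions

// Resolve "**" against the base directory, then match the resulting spec.
// (`match` needs backticks in Scala because it is a reserved word.)
val baseDir = FileSystems.matchNewResource("hdfs:///data/input/", true).getCurrentDirectory
val allFiles = baseDir.resolve("**", StandardResolveOptions.RESOLVE_FILES)
val result = FileSystems.`match`(allFiles.toString)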
