How to enable CORS in WebHDFS - HDFS - Hadoop - Origin http://localhost:4200



When I try to access WebHDFS from my Angular 6 application, I get the errors shown below. It seems to me I have tried almost everything, including changing the settings in core-site.xml and hdfs-site.xml, unfortunately without any positive result. Apparently Hadoop most likely needs to be configured correctly. Does anyone know how to solve this problem?

[Error] Origin http://localhost:4200 is not allowed by Access-Control-Allow-Origin.
[Error] XMLHttpRequest cannot load http://192.168.0.16:9870/webhdfs/v1/user/myuser/myfile.csv?op=CREATE&user.name=myuser&createflag=&createparent=true&overwrite=false due to access control checks.
[Error] Failed to load resource: Origin http://localhost:4200 is not allowed by Access-Control-Allow-Origin. (myfile.csv, line 0)
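For context, the failing request follows the WebHDFS REST URL scheme (`/webhdfs/v1/<path>?op=...`). A minimal TypeScript sketch of how such a URL is assembled; the host, port, path, and user below simply mirror the values in the error messages above:

```typescript
// Build a WebHDFS REST URL like the one in the failing request above.
// Host, port, and user mirror the error messages; adjust for your cluster.
function webhdfsUrl(
  host: string,
  port: number,
  path: string,
  op: string,
  params: Record<string, string> = {}
): string {
  const query = new URLSearchParams({ op, ...params }).toString();
  return `http://${host}:${port}/webhdfs/v1${path}?${query}`;
}

const createUrl = webhdfsUrl('192.168.0.16', 9870, '/user/myuser/myfile.csv', 'CREATE', {
  'user.name': 'myuser',
  overwrite: 'false',
});
console.log(createUrl);
// http://192.168.0.16:9870/webhdfs/v1/user/myuser/myfile.csv?op=CREATE&user.name=myuser&overwrite=false
```

Note that a WebHDFS CREATE is a two-step PUT: the NameNode answers with a 307 redirect to a DataNode, and the file content is then PUT to that second URL, so in a browser both responses must carry the CORS headers.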

From the documentation:

To enable cross-origin support (CORS), set the following configuration parameters:

Add org.apache.hadoop.security.HttpCrossOriginFilterInitializer to hadoop.http.filter.initializers in core-site.xml. You also need to set the following properties in core-site.xml:

hadoop.http.cross-origin.enabled = true

hadoop.http.cross-origin.allowed-origins = *

hadoop.http.cross-origin.allowed-methods = GET,POST,HEAD,DELETE,OPTIONS

hadoop.http.cross-origin.allowed-headers = X-Requested-With,Content-Type,Accept,Origin

hadoop.http.cross-origin.max-age = 1800

You should configure hdfs-site.xml and add this property (note that disabling permission checking is insecure and is only advisable in a development environment):

<property>
<name>dfs.permissions</name>
<value>false</value>
<description>If "true", enable permission checking in HDFS. If "false", permission checking is turned off, but all other behavior is unchanged. Switching from one parameter value to the other does not change the mode, owner or group of files or directories.</description>
</property>

In core-site.xml, add this if it is not already there...

<property>
<name>hadoop.http.filter.initializers</name>
<value>org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.security.HttpCrossOriginFilterInitializer</value>
<description>A comma separated list of class names. Each class in the list
must extend org.apache.hadoop.http.FilterInitializer. The corresponding
Filter will be initialized. Then, the Filter will be applied to all user
facing jsp and servlet web pages.  The ordering of the list defines the
ordering of the filters.</description>
</property>
<property>
<name>hadoop.http.cross-origin.enabled</name>
<value>true</value>
<description>Enables cross origin support for all web-services</description>
</property>
<property>
<name>hadoop.http.cross-origin.allowed-origins</name>
<value>*</value>
<description>Comma separated list of origins that are allowed, wildcards (*) and patterns allowed</description>
</property>
<property>
<name>hadoop.http.cross-origin.allowed-methods</name>
<value>GET,POST,HEAD,PUT,OPTIONS,DELETE</value>
<description>Comma separated list of methods that are allowed</description>
</property>
<property>
<name>hadoop.http.cross-origin.allowed-headers</name>
<value>X-Requested-With,Content-Type,Accept,Origin,WWW-Authenticate,Accept-Encoding,Transfer-Encoding</value>
<description>Comma separated list of headers that are allowed</description>
</property>
<property>
<name>hadoop.http.cross-origin.max-age</name>
<value>1800</value>
<description>Number of seconds a pre-flighted request can be cached</description>
</property>
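The allowed-origins value accepts a bare `*`, exact origins, and, as the description above says, patterns containing `*`. As a rough illustration only (this is not Hadoop's actual filter code), matching along those lines can be sketched as:

```typescript
// Rough sketch of matching an Origin header against a comma-separated
// allowed-origins list with "*" wildcards, as described for
// hadoop.http.cross-origin.allowed-origins. Not Hadoop's actual implementation.
function originAllowed(allowedOrigins: string, origin: string): boolean {
  return allowedOrigins.split(',').map(s => s.trim()).some(pattern => {
    if (pattern === '*') return true;
    if (pattern.includes('*')) {
      // Turn the wildcard pattern into an anchored regular expression.
      const re = new RegExp('^' + pattern.split('*').map(escapeRegExp).join('.*') + '$');
      return re.test(origin);
    }
    return pattern === origin;
  });
}

function escapeRegExp(s: string): string {
  return s.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

// "*" matches any origin, so http://localhost:4200 from the question is allowed:
originAllowed('*', 'http://localhost:4200');                      // true
originAllowed('http://*.example.com', 'http://app.example.com');  // true
originAllowed('http://example.com', 'http://localhost:4200');     // false
```

For production, replacing `*` with the exact origin of the Angular app (e.g. its deployed URL) is safer than allowing everything.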

You must append these properties to etc/hadoop/core-site.xml (and restart the HDFS daemons for the changes to take effect):

<configuration>
<property>
<name>hadoop.http.cross-origin.enabled</name>
<value>true</value>
</property>
<property>
<name>hadoop.http.cross-origin.allowed-origins</name>
<value>*</value>
</property>
<property>
<name>hadoop.http.cross-origin.allowed-methods</name>
<value>GET,POST,HEAD,PUT,OPTIONS,DELETE</value>
</property>
<property>
<name>hadoop.http.cross-origin.allowed-headers</name>
<value>X-Requested-With,Content-Type,Accept,Origin</value>
</property>
<property>
<name>hadoop.http.cross-origin.max-age</name>
<value>1800</value>
</property>
</configuration>
