HUE file browser can create HDFS sub-directories/folders but cannot upload files to HDFS



I get an

Error: Undefined message

in HUE every time I try to upload a file. I can create a sub-directory/folder in HDFS, but file uploads do not work.

I tried copying a file to HDFS from the Linux CLI as the hadoop user, and that worked fine.

The Hue user is hadoop

The HDFS directory owner is hadoop:hadoop
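To rule Hue itself out, the same impersonated access (user.name=hue&doas=hadoop, visible in the error log below) can be probed directly against the HttpFS gateway on port 14000. This is only a hypothetical sketch; the hostname is a placeholder for the redacted instance address:

# Probe the HttpFS endpoint Hue talks to, outside of Hue. Checks that the
# hue -> hadoop impersonation can at least read the target directory.
import requests

HTTPFS = "http://<httpfs-host>:14000/webhdfs/v1"   # placeholder host
params = {"op": "GETFILESTATUS", "user.name": "hue", "doas": "hadoop"}

resp = requests.get(HTTPFS + "/user/hadoop/Test-Data", params=params)
print(resp.status_code)   # 200 with a FileStatus JSON body if impersonation works
print(resp.json())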

Edit: adding the error

ERROR    Internal Server Error: /filebrowser/upload/file
Traceback (most recent call last):
File "/usr/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11.20-py2.7.egg/django/core/handlers/exception.py", line 41, in inner
response = get_response(request)
File "/usr/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11.20-py2.7.egg/django/core/handlers/base.py", line 249, in _legacy_get_response
response = self._get_response(request)
File "/usr/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11.20-py2.7.egg/django/core/handlers/base.py", line 178, in _get_response
response = middleware_method(request, callback, callback_args, callback_kwargs)
File "/usr/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11.20-py2.7.egg/django/middleware/csrf.py", line 300, in process_view
request_csrf_token = request.POST.get('csrfmiddlewaretoken', '')
File "/usr/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11.20-py2.7.egg/django/core/handlers/wsgi.py", line 126, in _get_post
self._load_post_and_files()
File "/usr/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11.20-py2.7.egg/django/http/request.py", line 299, in _load_post_and_files
self._post, self._files = self.parse_file_upload(self.META, data)
File "/usr/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11.20-py2.7.egg/django/http/request.py", line 258, in parse_file_upload
return parser.parse()
File "/usr/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11.20-py2.7.egg/django/http/multipartparser.py", line 269, in parse
self._close_files()
File "/usr/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11.20-py2.7.egg/django/http/multipartparser.py", line 316, in _close_files
handler.file.close()
AttributeError: 'NoneType' object has no attribute 'close'
[12/Apr/2020 22:48:51 -0700] upload       DEBUG    HDFSfileUploadHandler receive_data_chunk
[12/Apr/2020 22:48:51 -0700] upload       ERROR    Not using HDFS upload handler: 
[12/Apr/2020 22:48:51 -0700] resource     ERROR    All 1 clients failed: {'http://IRedactedMyinstanceIdentHere.ap-southeast-1.compute.internal:14000/webhdfs/v1': u'500 Server Error: Internal Server Error for url: http://IRedactedMyinstanceIdentHere.ap-southeast-1.compute.internal:14000/webhdfs/v1/user/hadoop/Test-Data?op=CHECKACCESS&fsaction=rw-&user.name=hue&doas=hadoop\n{"RemoteException":{"message":"java.lang.IllegalArgumentException: No enum constant org.apache.hadoop.fs.http.client.HttpFSFileSystem.Operation.CHECKACCESS","exception":"QueryParamException","javaClassName":"com.sun.jersey.api.ParamException$QueryParamException"}}\n'}
[12/Apr/2020 22:48:51 -0700] resource     ERROR    Caught exception from http://IRedactedMyinstanceIdentHere.ap-southeast-1.compute.internal:14000/webhdfs/v1: 500 Server Error: Internal Server Error for url: http://IRedactedMyinstanceIdentHere.ap-southeast-1.compute.internal:14000/webhdfs/v1/user/hadoop/Test-Data?op=CHECKACCESS&fsaction=rw-&user.name=hue&doas=hadoop
{"RemoteException":{"message":"java.lang.IllegalArgumentException: No enum constant org.apache.hadoop.fs.http.client.HttpFSFileSystem.Operation.CHECKACCESS","exception":"QueryParamException","javaClassName":"com.sun.jersey.api.ParamException$QueryParamException"}}
(error 500)

As you can see from the error message, it reports that there is no match for the query parameter Hue passes when it tries to perform the CHECKACCESS operation:

http://IRedactedMyinstanceIdentHere.ap-southeast-1.compute.internal:14000/webhdfs/v1/user/hadoop/Test-Data?op=CHECKACCESS&fsaction=rw-&user.name=hue&doas=hadoop
{"RemoteException":{"message":"java.lang.IllegalArgumentException: No enum constant org.apache.hadoop.fs.http.client.HttpFSFileSystem.Operation.CHECKACCESS","exception":"QueryParamException","javaClassName":"com.sun.jersey.api.ParamException$QueryParamException"}}

This operation appears to be missing from HttpFS in some Hadoop versions, and the missing HttpFS CHECKACCESS operation is a known bug.
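One way to confirm that the gap is specific to the HttpFS gateway is to issue the same operation against the NameNode's own WebHDFS endpoint, which gained CHECKACCESS before HttpFS did. The sketch below assumes the default WebHDFS port (50070 on Hadoop 2.x, 9870 on 3.x) and a placeholder hostname; check your cluster:

import requests

WEBHDFS = "http://<namenode-host>:50070/webhdfs/v1"   # assumed port, adjust as needed
params = {"op": "CHECKACCESS", "fsaction": "rw-", "user.name": "hue", "doas": "hadoop"}

resp = requests.get(WEBHDFS + "/user/hadoop/Test-Data", params=params)
print(resp.status_code)   # 200 with an empty body when CHECKACCESS is supported
print(repr(resp.text))    # compare with the 500 returned by HttpFS on port 14000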

Latest update