I am trying to set up a 4-node HDP cluster with Ambari, but the Accumulo client install fails with the following error:
> stderr:
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/ACCUMULO/1.6.1.2.2.0/package/scripts/accumulo_client.py", line 66, in &lt;module&gt;
>     AccumuloClient().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/ACCUMULO/1.6.1.2.2.0/package/scripts/accumulo_client.py", line 37, in install
>     self.install_packages(env)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 693, in install_packages
>     retry_count=agent_stack_retry_count)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
>     self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 51, in install_package
>     self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 86, in checked_call_with_retries
>     return self._call_with_retries(cmd, is_checked=True, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 98, in _call_with_retries
>     code, out = func(cmd, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
>     result = function(command, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
>     tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
>     raise ExecutionFailed(err_msg, code, out, err)
> resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install accumulo_2_6_3_0_235' returned 1.
> Error: Package: hadoop_2_6_3_0_235-hdfs-2.7.3.2.6.3.0-235.x86_64 (HDP-2.6)
>            Requires: libtirpc-devel
>  You could try using --skip-broken to work around the problem
>  You could try running: rpm -Va --nofiles --nodigest
>
> stdout:
> 2018-03-10 09:59:32,386 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
> 2018-03-10 09:59:32,394 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
> User Group mapping (user_group) is missing in the hostLevelParams
> 2018-03-10 09:59:32,395 - Group['livy'] {}
> 2018-03-10 09:59:32,397 - Adding group Group['livy']
> 2018-03-10 09:59:32,432 - Group['spark'] {}
> 2018-03-10 09:59:32,432 - Adding group Group['spark']
> 2018-03-10 09:59:32,451 - Group['zeppelin'] {}
> 2018-03-10 09:59:32,452 - Adding group Group['zeppelin']
> 2018-03-10 09:59:32,468 - Group['hadoop'] {}
> 2018-03-10 09:59:32,469 - Adding group Group['hadoop']
> 2018-03-10 09:59:32,484 - Group['users'] {}
> 2018-03-10 09:59:32,484 - Group['knox'] {}
> 2018-03-10 09:59:32,484 - Adding group Group['knox']
> 2018-03-10 09:59:32,500 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,501 - Adding user User['hive']
> 2018-03-10 09:59:32,537 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,537 - Adding user User['storm']
> 2018-03-10 09:59:32,562 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,563 - Adding user User['infra-solr']
> 2018-03-10 09:59:32,587 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,587 - Adding user User['zookeeper']
> 2018-03-10 09:59:32,613 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,614 - Adding user User['atlas']
> 2018-03-10 09:59:32,645 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
> 2018-03-10 09:59:32,645 - Adding user User['oozie']
> 2018-03-10 09:59:32,673 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,673 - Adding user User['ams']
> 2018-03-10 09:59:32,699 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
> 2018-03-10 09:59:32,700 - Adding user User['falcon']
> 2018-03-10 09:59:32,724 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
> 2018-03-10 09:59:32,724 - Adding user User['tez']
> 2018-03-10 09:59:32,749 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop']}
> 2018-03-10 09:59:32,749 - Adding user User['zeppelin']
> 2018-03-10 09:59:32,774 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,775 - Adding user User['accumulo']
> 2018-03-10 09:59:32,800 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,800 - Adding user User['livy']
> 2018-03-10 09:59:32,825 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,826 - Adding user User['mahout']
> 2018-03-10 09:59:32,851 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,851 - Adding user User['spark']
> 2018-03-10 09:59:32,877 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
> 2018-03-10 09:59:32,877 - Adding user User['ambari-qa']
> 2018-03-10 09:59:32,903 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,903 - Adding user User['flume']
> 2018-03-10 09:59:32,928 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,929 - Adding user User['kafka']
> 2018-03-10 09:59:32,953 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,954 - Adding user User['hdfs']
> 2018-03-10 09:59:32,979 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:32,979 - Adding user User['sqoop']
> 2018-03-10 09:59:33,003 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:33,004 - Adding user User['yarn']
> 2018-03-10 09:59:33,030 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:33,030 - Adding user User['hbase']
> 2018-03-10 09:59:33,056 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:33,056 - Adding user User['hcat']
> 2018-03-10 09:59:33,082 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:33,082 - Adding user User['mapred']
> 2018-03-10 09:59:33,107 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2018-03-10 09:59:33,107 - Adding user User['knox']
> 2018-03-10 09:59:33,132 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2018-03-10 09:59:33,136 - Writing File['/var/lib/ambari-agent/tmp/changeUid.sh'] because it doesn't exist
> 2018-03-10 09:59:33,136 - Changing permission for /var/lib/ambari-agent/tmp/changeUid.sh from 644 to 555
> 2018-03-10 09:59:33,136 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
> 2018-03-10 09:59:33,141 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
> 2018-03-10 09:59:33,142 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
> 2018-03-10 09:59:33,142 - Creating directory Directory['/tmp/hbase-hbase'] since it doesn't exist.
> 2018-03-10 09:59:33,143 - Changing owner for /tmp/hbase-hbase from 0 to hbase
> 2018-03-10 09:59:33,143 - Changing permission for /tmp/hbase-hbase from 755 to 775
> 2018-03-10 09:59:33,144 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2018-03-10 09:59:33,145 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
> 2018-03-10 09:59:33,149 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
> 2018-03-10 09:59:33,150 - Group['hdfs'] {}
> 2018-03-10 09:59:33,150 - Adding group Group['hdfs']
> 2018-03-10 09:59:33,166 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
> 2018-03-10 09:59:33,167 - Modifying user hdfs
> 2018-03-10 09:59:33,186 - FS Type:
> 2018-03-10 09:59:33,187 - Directory['/etc/hadoop'] {'mode': 0755}
> 2018-03-10 09:59:33,187 - Creating directory Directory['/etc/hadoop'] since it doesn't exist.
> 2018-03-10 09:59:33,188 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
> 2018-03-10 09:59:33,188 - Creating directory Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] since it doesn't exist.
> 2018-03-10 09:59:33,188 - Changing owner for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hdfs
> 2018-03-10 09:59:33,188 - Changing group for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hadoop
> 2018-03-10 09:59:33,189 - Changing permission for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 755 to 1777
> 2018-03-10 09:59:33,189 - Directory['/var/lib/ambari-agent/tmp/AMBARI-artifacts/'] {'create_parents': True}
> 2018-03-10 09:59:33,189 - Creating directory Directory['/var/lib/ambari-agent/tmp/AMBARI-artifacts/'] since it doesn't exist.
> 2018-03-10 09:59:33,190 - File['/var/lib/ambari-agent/tmp/jdk-8u112-linux-x64.tar.gz'] {'content': DownloadSource('http://ip-172-31-26-225.ec2.internal:8080/resources//jdk-8u112-linux-x64.tar.gz'), 'not_if': 'test -f /var/lib/ambari-agent/tmp/jdk-8u112-linux-x64.tar.gz'}
> 2018-03-10 09:59:33,193 - Downloading the file from http://ip-172-31-26-225.ec2.internal:8080/resources//jdk-8u112-linux-x64.tar.gz
> 2018-03-10 09:59:39,498 - File['/var/lib/ambari-agent/tmp/jdk-8u112-linux-x64.tar.gz'] {'mode': 0755}
> 2018-03-10 09:59:39,643 - Changing permission for /var/lib/ambari-agent/tmp/jdk-8u112-linux-x64.tar.gz from 644 to 755
> 2018-03-10 09:59:39,644 - Directory['/usr/jdk64'] {}
> 2018-03-10 09:59:39,644 - Creating directory Directory['/usr/jdk64'] since it doesn't exist.
> 2018-03-10 09:59:39,645 - Execute[('chmod', 'a+x', u'/usr/jdk64')] {'sudo': True}
> 2018-03-10 09:59:40,467 - Execute['cd /var/lib/ambari-agent/tmp/jdk_tmp_p6UwwR && tar -xf /var/lib/ambari-agent/tmp/jdk-8u112-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/tmp/jdk_tmp_p6UwwR/* /usr/jdk64'] {}
> 2018-03-10 09:59:54,976 - Directory['/var/lib/ambari-agent/tmp/jdk_tmp_p6UwwR'] {'action': ['delete']}
> 2018-03-10 09:59:54,976 - Removing directory Directory['/var/lib/ambari-agent/tmp/jdk_tmp_p6UwwR'] and all its content
> 2018-03-10 09:59:55,089 - File['/usr/jdk64/jdk1.8.0_112/bin/java'] {'mode': 0755, 'cd_access': 'a'}
> 2018-03-10 09:59:55,090 - Execute[('chmod', '-R', '755', u'/usr/jdk64/jdk1.8.0_112')] {'sudo': True}
> 2018-03-10 09:59:55,160 - Initializing 2 repositories
> 2018-03-10 09:59:55,161 - Repository['HDP-2.6'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
> 2018-03-10 09:59:55,176 - File['/etc/yum.repos.d/HDP.repo'] {'content': InlineTemplate(...)}
> 2018-03-10 09:59:55,177 - Writing File['/etc/yum.repos.d/HDP.repo'] because it doesn't exist
> 2018-03-10 09:59:55,178 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
> 2018-03-10 09:59:55,180 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': InlineTemplate(...)}
> 2018-03-10 09:59:55,181 - Writing File['/etc/yum.repos.d/HDP-UTILS.repo'] because it doesn't exist
> 2018-03-10 09:59:55,181 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
> 2018-03-10 09:59:55,659 - Installing package unzip ('/usr/bin/yum -d 0 -e 0 -y install unzip')
> 2018-03-10 09:59:58,922 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
> 2018-03-10 09:59:58,958 - Skipping installation of existing package curl
> 2018-03-10 09:59:58,959 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
> 2018-03-10 09:59:58,994 - Installing package hdp-select ('/usr/bin/yum -d 0 -e 0 -y install hdp-select')
> 2018-03-10 10:00:00,717 - checked_call['rpm -q --queryformat '%{version}-%{release}' hdp-select | sed -e 's/\.el[0-9]//g''] {'stderr': -1}
> 2018-03-10 10:00:00,752 - checked_call returned (0, '2.6.3.0-235', '')
> 2018-03-10 10:00:00,753 - Package['accumulo_2_6_3_0_235'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
> 2018-03-10 10:00:00,884 - Installing package accumulo_2_6_3_0_235 ('/usr/bin/yum -d 0 -e 0 -y install accumulo_2_6_3_0_235')
> 2018-03-10 10:00:03,797 - Execution of '/usr/bin/yum -d 0 -e 0 -y install accumulo_2_6_3_0_235' returned 1.
> Error: Package: hadoop_2_6_3_0_235-hdfs-2.7.3.2.6.3.0-235.x86_64 (HDP-2.6)
>            Requires: libtirpc-devel
>  You could try using --skip-broken to work around the problem
>  You could try running: rpm -Va --nofiles --nodigest
> 2018-03-10 10:00:03,797 - Failed to install package accumulo_2_6_3_0_235. Executing '/usr/bin/yum clean metadata'
> 2018-03-10 10:00:04,015 - Retrying to install package accumulo_2_6_3_0_235 after 30 seconds
>
> Command failed after 1 tries
The Hadoop package that Accumulo depends on requires libtirpc-devel, and libtirpc-devel cannot be resolved from any repository enabled on your system.
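You can confirm this on the failing node before changing anything (a quick check using standard yum tooling):

    # Ask yum which enabled repo, if any, provides the missing dependency;
    # on an affected host this comes back with "No matches found".
    yum provides libtirpc-devel

    # List the repositories yum is currently drawing from, to see whether
    # the optional RHEL channel is among them.
    yum repolist enabled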
Per https://community.hortonworks.com/questions/96763/hdp-26-ambari-install-fails-on-rhel-7-on-libtirpc.html, it seems you only need to enable one additional yum repository, "rhel-7-server-optional-rpms" (or its RHUI equivalent on EC2):
    # yum-config-manager --enable rhui-REGION-rhel-server-optional
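A minimal remediation sketch along those lines; the exact repository id is an assumption that depends on how the host is subscribed (EC2/RHUI hosts, like the ec2.internal node in your log, typically use rhui-REGION-rhel-server-optional, while hosts registered through subscription-manager use rhel-7-server-optional-rpms):

    # EC2/RHUI host (matches the ec2.internal hostname in the log above).
    # yum-config-manager ships in the yum-utils package.
    yum-config-manager --enable rhui-REGION-rhel-server-optional

    # On a host registered through subscription-manager, use this instead:
    # subscription-manager repos --enable rhel-7-server-optional-rpms

    # Verify the dependency now resolves before going back to Ambari:
    yum install -y libtirpc-devel

Once libtirpc-devel installs cleanly on every node, hit Retry on the failed Accumulo client install in the Ambari UI; yum should then resolve the dependency and the install should complete.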