Executing Hadoop commands remotely from a Java application with JSch



Hi, I am trying to run Hadoop commands such as "hadoop fs -ls" remotely from a Java application. The Java application is on my local machine and Hadoop is on an AWS server.

First I establish an SSH connection, and that works. I can also run plain Linux commands from the Java code, but Hadoop commands do not return any output.
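
For reference, the session and channelExec fields used in the code below are set up roughly like this (a minimal sketch only; the host name, user, and key path are placeholders, and connectSSH is the helper that appears commented out in my method):

import com.jcraft.jsch.ChannelExec;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.JSchException;
import com.jcraft.jsch.Session;

private Session session;
private ChannelExec channelExec;

// Minimal sketch of the SSH setup; host, user, and key path are placeholders.
private void connectSSH() throws JSchException {
    JSch jsch = new JSch();
    jsch.addIdentity("/path/to/aws-key.pem");          // private key for the AWS host
    session = jsch.getSession("ec2-user", "my-aws-host", 22);
    session.setConfig("StrictHostKeyChecking", "no");  // skip host key prompt in this sketch
    session.connect();
}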

The behavior is the same as when I run the command below:

//ssddff is a non-existent file
cd ssddff

I am using the correct command. It works fine when run directly on the server, but when issued from the Java code the Hadoop command returns nothing and the call just hangs.

hadoop fs -ls
/usr/local/hadoop/bin/hadoop fs -ls
ssh.getSSHResponse("hadoop fs -ls")
public String getSSHResponse(String command) {
    StringBuilder response = null;
    try {
        // connectSSH();
        if (channelExec == null) {
            channelExec = (ChannelExec) session.openChannel("exec");
        }

        channelExec.setCommand(command);
        InputStream inputStream = channelExec.getInputStream();
        channelExec.connect();

        byte[] buffer = new byte[8192];
        int decodedLength;
        response = new StringBuilder();
        // when debugging, execution stops here
        while ((decodedLength = inputStream.read(buffer, 0, buffer.length)) > 0) {
            response.append(new String(buffer, 0, decodedLength));
        }
    } catch (JSchException e) {
        log.error("JSchException");
    } catch (IOException e) {
        e.printStackTrace();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return response.toString();
}

I also tried the full path to the binary:

/usr/local/hadoop/bin/hadoop fs -ls

I solved the problem by opening a new channel for each command.

public String getSSHResponse(String command) {
    StringBuilder response = new StringBuilder();
    try {
        // Open a fresh exec channel for every command; a JSch exec channel
        // cannot be reused once it has already run a command.
        channelExec = (ChannelExec) session.openChannel("exec");

        channelExec.setCommand(command);
        InputStream inputStream = channelExec.getInputStream();
        channelExec.connect();

        byte[] buffer = new byte[8192];
        int decodedLength;
        while ((decodedLength = inputStream.read(buffer, 0, buffer.length)) > 0) {
            response.append(new String(buffer, 0, decodedLength));
        }
    } catch (JSchException e) {
        log.error("JSchException");
    } catch (IOException e) {
        e.printStackTrace();
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        // Release the channel so repeated calls do not leak channels.
        if (channelExec != null) {
            channelExec.disconnect();
        }
    }
    return response.toString();
}
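
With this version, every call gets its own exec channel, so consecutive commands both return their output. A usage sketch (assuming ssh is the instance of the class that holds the session):

// Each call opens and disconnects its own exec channel, so the second
// command no longer hangs on the read.
String listing = ssh.getSSHResponse("/usr/local/hadoop/bin/hadoop fs -ls");
String uptime  = ssh.getSSHResponse("uptime");
System.out.println(listing);
System.out.println(uptime);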