Java program reports "unable to connect" when accessing the Hadoop HDFS service.

    I had deployed Hadoop's HDFS in a virtual machine, and every operation I tried from the CentOS command line (reading, uploading, and deleting files) worked fine. But when I wrote Java code to operate on HDFS files, something strange happened.

     The code had only gotten as far as obtaining the FileSystem object:

// Loads the default Hadoop configuration (core-site.xml etc. on the classpath)
Configuration configuration = new Configuration();
FileSystem fileSystem = FileSystem.get(configuration);
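With no URI argument, FileSystem.get(conf) resolves the NameNode address from the fs.defaultFS property in core-site.xml on the client's classpath. For this setup the entry would presumably look something like the following (host and port taken from the exception message; treat this as a sketch, not the exact file used here):

```xml
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://hadoop.node.master:8020</value>
</property>
```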

    Running it threw the following exception:

Exception in thread "main" java.net.ConnectException: Call From DESKTOP-1A7JSVH/192.168.139.1 to hadoop.node.master:8020 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1415)
    at org.apache.hadoop.ipc.Client.call(Client.java:1364)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy14.getBlockLocations(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy14.getBlockLocations(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:225)
    at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1165)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1155)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1145)
    at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:268)
    at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:235)
    at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:228)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1318)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:293)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:289)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:289)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
    at com.my.hadoop.HadoopTest.main(HadoopTest.java:28)
Caused by: java.net.ConnectException: Connection refused: no further information
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:606)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:700)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1463)
    at org.apache.hadoop.ipc.Client.call(Client.java:1382)
    ... 24 more
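The root cause in the chain is java.net.ConnectException: Connection refused, which at the TCP level simply means that no process accepted a connection at the exact address and port the client dialed. A minimal plain-java.net illustration, with no Hadoop involved (the class name is just for this demo):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;

public class ConnectionRefusedDemo {
    // Attempt a TCP connection and report what happened.
    static String tryConnect(String host, int port) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), 1000);
            return "connected";
        } catch (IOException e) {
            return e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) throws IOException {
        // Grab a free port, then close it so nothing is listening there.
        int port;
        try (ServerSocket probe = new ServerSocket(0)) {
            port = probe.getLocalPort();
        }
        // Connecting to a port with no listener is refused immediately.
        System.out.println(tryConnect("127.0.0.1", port)); // ConnectException
    }
}
```

So the question is not whether the NameNode process is running, but whether anything is listening at the address:port the client actually resolved.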

The troubleshooting went as follows.

       At first I suspected the HDFS service port 8020 itself, so on the virtual machine server I ran netstat -ntlp | grep 8020 and got this:

         tcp        0      0 127.0.0.1:8020          0.0.0.0:*               LISTEN      4753/java

       The service port was listening and the firewall was already off; besides, opening http://hadoop.node.master:50070 showed the HDFS web UI was healthy. For a while I was stumped.

      Finally I took another look at the netstat output: the local address shown was 127.0.0.1, meaning port 8020 was bound only to the loopback interface, which could well be the cause. So I ran vim /etc/hosts and changed the line

127.0.0.1 hadoop.node.master

to

192.168.139.130 hadoop.node.master
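This is exactly the loopback-binding trap: a server socket bound to 127.0.0.1 accepts connections addressed to the loopback address, but refuses connections addressed to the machine's external IP, even from the machine itself. A small demonstration in plain Java (class and method names are just for this sketch):

```java
import java.io.IOException;
import java.net.Inet4Address;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.NetworkInterface;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketException;
import java.util.Collections;

public class BindAddressDemo {
    static boolean canConnect(String host, int port) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), 1000);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    // Find a non-loopback IPv4 address of this machine, if any.
    static String externalIPv4() throws SocketException {
        for (NetworkInterface ni : Collections.list(NetworkInterface.getNetworkInterfaces())) {
            for (InetAddress a : Collections.list(ni.getInetAddresses())) {
                if (a instanceof Inet4Address && !a.isLoopbackAddress()) {
                    return a.getHostAddress();
                }
            }
        }
        return null;
    }

    public static void main(String[] args) throws IOException {
        // Bind only to the loopback interface, as the NameNode did here.
        try (ServerSocket server = new ServerSocket()) {
            server.bind(new InetSocketAddress("127.0.0.1", 0));
            int port = server.getLocalPort();
            System.out.println("via 127.0.0.1: " + canConnect("127.0.0.1", port));
            String ext = externalIPv4();
            if (ext != null) {
                // Same machine, same port, but a different address: refused.
                System.out.println("via " + ext + ": " + canConnect(ext, port));
            }
        }
    }
}
```

The NameNode binds to whatever address the hostname in its configuration resolves to, so mapping hadoop.node.master to 127.0.0.1 in /etc/hosts made it listen on loopback only; remote clients like the Windows machine at 192.168.139.1 were then refused.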

Then I restarted the NameNode and DataNode services, ran the Java program again, and it worked.
