Connection error sandbox-hdp.hortonworks.com/172.17.0.2 to sandbox.hortonworks.com:8020

Explorer

Hi,

I have installed the Hortonworks Sandbox version --- and followed this tutorial: https://hortonworks.com/tutorial/learning-the-ropes-of-the-hortonworks-sandbox/. Everything seems to be up and running.

I want to use HBase and therefore followed this tutorial: https://hortonworks.com/hadoop-tutorial/introduction-apache-hbase-concepts-apache-phoenix-new-backup... However, when I try to import data into HBase with ImportTsv using the following statement:

hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator=, -Dimporttsv.columns="HBASE_ROW_KEY,events:driverId,events:driverName,events:eventTime,events:eventType,events:latitudeColumn,events:longitudeColumn,events:routeId,events:routeName,events:truckId" driver_dangerous_event hdfs://sandbox.hortonworks.com:/tmp/data.csv
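
For context: the -Dimporttsv.columns mapping assigns each CSV field to a qualifier in the events column family, with HBASE_ROW_KEY marking the row-key field. The target table comes from the tutorial; if it does not exist yet, it can be created from the HBase shell roughly like this (table and column-family names taken from the command above):

# In the HBase shell: create the table with its 'events' column family
create 'driver_dangerous_event', 'events'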

I get the following error:

Exception in thread "main" java.net.ConnectException: Call From sandbox-hdp.hortonworks.com/172.17.0.2 to sandbox.hortonworks.com:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:801)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
        at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1558)
        at org.apache.hadoop.ipc.Client.call(Client.java:1498)
        at org.apache.hadoop.ipc.Client.call(Client.java:1398)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
        at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:823)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
        at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2165)
        at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1442)
        at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1438)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1454)
        at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
        at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
        at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1715)
        at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:294)
        at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:265)
        at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:387)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:301)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:318)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
        at org.apache.hadoop.hbase.mapreduce.ImportTsv.run(ImportTsv.java:721)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
        at org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:725)
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:650)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:745)
        at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1620)
        at org.apache.hadoop.ipc.Client.call(Client.java:1451)
        ... 36 more

What I understand from this is that I can't connect to the NameNode (sandbox.hortonworks.com:8020), yet everything seems to be up and running. I can copy data from my local machine to HDFS using this command:

hadoop fs -copyFromLocal ~/data.csv /tmp

So I do seem to be able to connect to HDFS.
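
A sanity check that helps pin this down (assuming the standard HDFS CLI is on the path) is to compare the NameNode address in the client configuration with the one in the failing URI:

# Ask the client configuration which NameNode address it actually uses;
# this is what "hadoop fs -copyFromLocal" resolves implicitly.
hdfs getconf -confKey fs.defaultFS

# Listing the file through the explicit URI from the ImportTsv call should
# fail with the same "Connection refused" if the hostname does not match
# the configured NameNode, while the bare path (which uses fs.defaultFS) works:
hadoop fs -ls hdfs://sandbox.hortonworks.com:8020/tmp/data.csv
hadoop fs -ls /tmp/data.csv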

I would be really grateful if someone could point me in the right direction.

//Rebecca


5 REPLIES

Explorer

I solved this by changing the HDFS address to hdfs://sandbox-hdp.hortonworks.com:/tmp/data.csv:

hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator=, -Dimporttsv.columns="HBASE_ROW_KEY,events:driverId,events:driverName,events:eventTime,events:eventType,events:latitudeColumn,events:longitudeColumn,events:routeId,events:routeName,events:truckId" driver_dangerous_event hdfs://sandbox-hdp.hortonworks.com:/tmp/data.csv
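
The hint was already in the stack trace: "Call From sandbox-hdp.hortonworks.com/172.17.0.2" shows the hostname the sandbox knows itself by, and the NameNode appears to listen there rather than at the old sandbox.hortonworks.com name. To verify the corrected address before re-running the import (standard HDFS CLI):

# The NameNode should now answer at the corrected hostname
hadoop fs -ls hdfs://sandbox-hdp.hortonworks.com:8020/tmp/data.csv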

Super Collaborator

Hi @Rebecca B, I am sorry to say that the tutorial you are trying to run has not officially been updated to work with the latest HDP 2.6.4 sandbox. Please use tutorials that are published on the Hortonworks tutorials page.
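
If you want to double-check which HDP version a sandbox is running before choosing a tutorial, one option (assuming an HDP-based sandbox, where the hdp-select utility is installed) is:

# List the HDP stack version(s) installed on this machine
hdp-select versions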

Explorer

Thank you for your answer @gdeleon. Is there any way to get the 2.5 sandbox instead? I wanted to use HBase for a case study I'm doing, and it would be great to be able to run the tutorial. I did manage to solve the issue by changing the HDFS address to hdfs://sandbox-hdp.hortonworks.com:/tmp/data.csv, so maybe I'll be able to work around the rest of the changes too, but it would probably be easier if I could use the older version of the sandbox.

Explorer

Found where I can download sandbox 2.5, and I will download that instead. @gdeleon

Super Collaborator

Hi @Rebecca B, you can download old sandboxes in the "Hortonworks Sandbox Archive" section of the download page. Please keep in mind, though, that these are no longer supported.