<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: I try to run commands at the terminal but get a connection refused error in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/32268#M5293</link>
    <description>&lt;P&gt;I did not set any variables in the session.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This is what I have in .bash_profile:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;# .bash_profile&lt;/P&gt;&lt;P&gt;# Get the aliases and functions&lt;BR /&gt;if [ -f ~/.bashrc ]; then&lt;BR /&gt;. ~/.bashrc&lt;BR /&gt;fi&lt;/P&gt;&lt;P&gt;# User specific environment and startup programs&lt;/P&gt;&lt;P&gt;export PATH=$PATH:$HOME/bin:$HADOOP_HOME/bin&lt;BR /&gt;export CLASSPATH=/usr/lib/hadoop/client-0.20/\*:/usr/lib/hadoop/\*&lt;BR /&gt;export AVRO_CLASSPATH=/usr/lib/avro&lt;/P&gt;&lt;P&gt;alias lart="ls -lart"&lt;BR /&gt;set -o vi&lt;/P&gt;</description>
    <pubDate>Thu, 24 Sep 2015 18:01:19 GMT</pubDate>
    <dc:creator>DeepakTanna</dc:creator>
    <dc:date>2015-09-24T18:01:19Z</dc:date>
    <item>
      <title>I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25535#M5280</link>
      <description>&lt;P&gt;Hi,&amp;nbsp;&lt;/P&gt;&lt;P&gt;I start my quickstart VM and enter the following in the terminal:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;sqoop import-all-tables \&lt;BR /&gt;-m 1 \&lt;BR /&gt;--connect jdbc:mysql://quickstart.cloudera:3306/retail_db \&lt;BR /&gt;--username=retail_dba \&lt;BR /&gt;--password=cloudera \&lt;BR /&gt;--compression-codec=snappy \&lt;BR /&gt;--as-avrodatafile \&lt;BR /&gt;--warehouse-dir=/user/hive/warehouse&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;but I get the following error:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;failed on connection exception: java.net.ConnectException: Connection refused; For more details see: &lt;A target="_blank" href="http://wiki.apache.org/hadoop/ConnectionRefused"&gt;http://wiki.apache.org/hadoop/ConnectionRefused&lt;/A&gt;&lt;BR /&gt;Streaming Command Failed!&lt;BR /&gt;Error in mr(map = map, reduce = reduce, combine = combine, in.folder = if (is.list(input)) { :&lt;BR /&gt;hadoop streaming failed with error code 5&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any help is appreciated. I am just a beginner and don't know much yet. Thanks.&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 09:24:01 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25535#M5280</guid>
      <dc:creator>eugenerory</dc:creator>
      <dc:date>2022-09-16T09:24:01Z</dc:date>
    </item>
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25544#M5281</link>
      <description>&lt;P&gt;Perhaps see if you can connect directly to the MySQL database from the command line.&amp;nbsp; Here is how it looks for me in the quickstart VM:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT size="1" face="courier new,courier"&gt;[cloudera@quickstart ~]$ &lt;STRONG&gt;mysql --user=retail_dba --password=cloudera&lt;/STRONG&gt;&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;Welcome to the MySQL monitor.&amp;nbsp; Commands end with ; or \g.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;Your MySQL connection id is 26097&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;Server version: 5.1.66 Source distribution&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT size="1" face="courier new,courier"&gt;Copyright (c) 2000, 2012, Oracle and/or its affiliates. All rights reserved.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT size="1" face="courier new,courier"&gt;Oracle is a registered trademark of Oracle Corporation and/or its&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;affiliates. Other names may be trademarks of their respective&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;owners.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT size="1" face="courier new,courier"&gt;Type 'help;' or '\h' for help.
Type '\c' to clear the current input statement.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT size="1" face="courier new,courier"&gt;mysql&amp;gt; use retail_db&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;Reading table information for completion of table and column names&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;You can turn off this feature to get a quicker startup with -A&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT size="1" face="courier new,courier"&gt;Database changed&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;mysql&amp;gt; show tables;&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;+---------------------+&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;| Tables_in_retail_db |&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;+---------------------+&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;| categories&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; |&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;| customers&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; |&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;| departments&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; |&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;| order_items&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; |&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;| orders&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; |&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;| 
products&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; |&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;+---------------------+&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;6 rows in set (0.00 sec)&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT size="1" face="courier new,courier"&gt;mysql&amp;gt; &lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If that fails, then perhaps MySQL is not running?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Morgan&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 13 Mar 2015 15:16:23 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25544#M5281</guid>
      <dc:creator>Morgan</dc:creator>
      <dc:date>2015-03-13T15:16:23Z</dc:date>
    </item>
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25549#M5282</link>
      <description>&lt;P&gt;Morgan,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks for your reply. I can connect to mysql - I don't think that is the problem.&amp;nbsp;&lt;/P&gt;&lt;P&gt;The problem is that I can't connect to something and MapReduce jobs cannot be performed.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In the very first tutorial on cloudera, it reads "You should first log in to the Master Node of your cluster using SSH - you can get the credentials using the instructions on Your Cloudera Cluster. ".&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I don't know how to do this. I'm just using the cloudera quickstart VM via VirtualBox. I start the VM and open the terminal, and enter the lines&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;sqoop import-all-tables \&lt;BR /&gt;-m 1 \&lt;BR /&gt;--connect jdbc:mysql://quickstart.cloudera:3306/retail_db \&lt;BR /&gt;--username=retail_dba \&lt;BR /&gt;--password=cloudera \&lt;BR /&gt;--compression-codec=snappy \&lt;BR /&gt;--as-avrodatafile \&lt;BR /&gt;--warehouse-dir=/user/hive/warehouse&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;and I get a connection refused error. The same error happens if I try to use R and Hadoop together. It seems like I can't connect to a server or something? Do I really have to log in to a server after starting my VM? I'm just trying to learn and all of it is new to me. Thanks for your help.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Here is the complete output I get after I run the above commands. I am on Mac OS X 10.7.5, using the cloudera quickstart VM via VirtualBox.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Warning: /usr/lib/sqoop/../accumulo does not exist!
Accumulo imports will fail.&lt;BR /&gt;Please set $ACCUMULO_HOME to the root of your Accumulo installation.&lt;BR /&gt;15/03/13 09:44:56 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.0&lt;BR /&gt;15/03/13 09:44:56 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.&lt;BR /&gt;15/03/13 09:44:56 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.&lt;BR /&gt;15/03/13 09:44:56 INFO tool.CodeGenTool: Beginning code generation&lt;BR /&gt;15/03/13 09:44:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;15/03/13 09:44:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;15/03/13 09:44:56 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-0.20-mapreduce&lt;BR /&gt;Note: /tmp/sqoop-cloudera/compile/47d81c933f89fd992607ae4a35707074/categories.java uses or overrides a deprecated API.&lt;BR /&gt;Note: Recompile with -Xlint:deprecation for details.&lt;BR /&gt;15/03/13 09:44:59 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/47d81c933f89fd992607ae4a35707074/categories.jar&lt;BR /&gt;15/03/13 09:44:59 WARN manager.MySQLManager: It looks like you are importing from mysql.&lt;BR /&gt;15/03/13 09:44:59 WARN manager.MySQLManager: This transfer can be faster! 
Use the --direct&lt;BR /&gt;15/03/13 09:44:59 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.&lt;BR /&gt;15/03/13 09:44:59 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)&lt;BR /&gt;15/03/13 09:44:59 INFO mapreduce.ImportJobBase: Beginning import of categories&lt;BR /&gt;15/03/13 09:45:00 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;15/03/13 09:45:00 INFO mapreduce.DataDrivenImportJob: Writing Avro schema file: /tmp/sqoop-cloudera/compile/47d81c933f89fd992607ae4a35707074/sqoop_import_categories.avsc&lt;BR /&gt;15/03/13 09:45:02 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/03/13 09:45:03 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/03/13 09:45:04 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/03/13 09:45:05 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/03/13 09:45:06 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/03/13 09:45:07 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. 
Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/03/13 09:45:08 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/03/13 09:45:09 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/03/13 09:45:10 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/03/13 09:45:11 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/03/13 09:45:11 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:java.net.ConnectException: Call From quickstart.cloudera/127.0.0.1 to localhost:8021 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: &lt;A target="_blank" href="http://wiki.apache.org/hadoop/ConnectionRefused"&gt;http://wiki.apache.org/hadoop/ConnectionRefused&lt;/A&gt;&lt;BR /&gt;15/03/13 09:45:11 ERROR tool.ImportAllTablesTool: Encountered IOException running import job: java.net.ConnectException: Call From quickstart.cloudera/127.0.0.1 to localhost:8021 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: &lt;A target="_blank" 
href="http://wiki.apache.org/hadoop/ConnectionRefused"&gt;http://wiki.apache.org/hadoop/ConnectionRefused&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;ER&lt;/P&gt;</description>
      <pubDate>Fri, 13 Mar 2015 16:49:43 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25549#M5282</guid>
      <dc:creator>eugenerory</dc:creator>
      <dc:date>2015-03-13T16:49:43Z</dc:date>
    </item>
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25552#M5283</link>
      <description>&lt;P&gt;&amp;gt;&amp;gt; In the very first tutorial on cloudera, it reads "You should first log in to the Master Node of your cluster using SSH - you can get the credentials using the instructions on Your Cloudera Cluster. "&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;It's a little confusing whether you're running these commands on your host machine, or on the VM. If you're reading the tutorial hosted on a website somewhere, it's written with you running this on a fully-distributed cluster in mind and SSH'ing in to the machine. There's a modified copy hosted on the VM itself (just go to localhost in the web browser in the VM, or on your host, as port-forwarding should work for VirtualBox) that (in my copy at least) just tells you to click on the terminal icon on the VM's desktop and enter commands there. Which version of the VM are you using, and where do you see that text? It should be possible to SSH into the VM, and even run these commands from your host machine, but doing so requires a lot of network configuration to be set up correctly - it won't be set up that way by default and it can be complicated to get it working consistently on different hosts - which is why I recommend just using the terminal on the VM's desktop.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The root cause of your connection refused error appears to be that Sqoop is trying to use MR1. The VM is set up to use MR2 / YARN by default, so that is probably why MR1 is not running and you can't connect. Cloudera supports running both MR1 and MR2, but you can't have a machine configured as a client to both at the same time. When I run this on my copy of the VM (and in all recent versions) Sqoop is definitely using MR2 / YARN. Have you changed any other configurations before running Sqoop? Is it possible you've got Sqoop installed on your host machine and it's configured differently than Sqoop in the VM?&lt;/P&gt;</description>
      <pubDate>Fri, 13 Mar 2015 16:59:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25552#M5283</guid>
      <dc:creator>Sean</dc:creator>
      <dc:date>2015-03-13T16:59:29Z</dc:date>
    </item>
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25554#M5284</link>
      <description>&lt;P&gt;Sean, thank you for your response.&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am running these commands on the VM. I am just using the terminal &lt;SPAN&gt;on my VM's desktop.&lt;/SPAN&gt; I am reading the text that says "log in to the master node of your cluster using SSH" in the web browser that is opened automatically upon starting the VM, at the address &lt;A target="_blank" href="http://quickstart.cloudera/#/tutorial/ingest_structured_data"&gt;http://quickstart.cloudera/#/tutorial/ingest_structured_data&lt;/A&gt;. I am using Oracle VM VirtualBox Manager 4.3.20.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I don't think I made any configuration changes before running sqoop. I just opened cloudera-quickstart-vm-5.3.0-0-virtualbox-disk1.vmdk using VirtualBox.&lt;/P&gt;&lt;P&gt;I made some changes to use R and Hadoop together using the blog at&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;A target="_blank" href="http://blogr-cs.blogspot.com/2012/12/integration-of-r-rstudio-and-hadoop-in.html"&gt;http://blogr-cs.blogspot.com/2012/12/integration-of-r-rstudio-and-hadoop-in.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;but I think those are irrelevant.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I do not have sqoop on my host machine. I'd really appreciate it if you could suggest some solutions I can understand and implement. Thank you.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;ER&lt;/P&gt;</description>
      <pubDate>Fri, 13 Mar 2015 17:31:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25554#M5284</guid>
      <dc:creator>eugenerory</dc:creator>
      <dc:date>2015-03-13T17:31:57Z</dc:date>
    </item>
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25558#M5285</link>
      <description>&lt;P&gt;ER,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am fairly new to this also.&amp;nbsp; Started with the Virtualbox quickstart VM running on Windows host.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;FWIW, here is what I get when I run the same command...&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT size="1" face="courier new,courier"&gt;cloudera@quickstart morgan]$ sqoop import-all-tables \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;gt; -m 1 \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;gt; --connect jdbc:mysql://quickstart.cloudera:3306/retail_db \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;gt; --username=retail_dba \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;gt; --password=cloudera \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;gt; --compression-codec=snappy \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;gt; --as-avrodatafile \&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;gt; --warehouse-dir=/user/hive/warehouse&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;Please set $ACCUMULO_HOME to the root of your Accumulo installation.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:04 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.0&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:04 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. 
Consider using -P instead.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:04 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:05 INFO tool.CodeGenTool: Beginning code generation&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:05 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;Note: /tmp/sqoop-cloudera/compile/034c37aed57826a53538f7603ccaa6c1/categories.java uses or overrides a deprecated API.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;Note: Recompile with -Xlint:deprecation for details.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:09 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/034c37aed57826a53538f7603ccaa6c1/categories.jar&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:09 WARN manager.MySQLManager: It looks like you are importing from mysql.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:09 WARN manager.MySQLManager: This transfer can be faster! 
Use the --direct&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:09 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:09 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:09 INFO mapreduce.ImportJobBase: Beginning import of categories&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:09 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:09 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:12 INFO mapreduce.DataDrivenImportJob: Writing Avro schema file: /tmp/sqoop-cloudera/compile/034c37aed57826a53538f7603ccaa6c1/sqoop_import_categories.avsc&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:12 INFO Configuration.deprecation: mapred.map.tasks is deprecated. 
Instead, use mapreduce.job.maps&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:12 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:15 INFO db.DBInputFormat: Using read commited transaction isolation&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:15 INFO mapreduce.JobSubmitter: number of splits:1&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:15 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1425573450783_0059&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:16 INFO impl.YarnClientImpl: Submitted application application_1425573450783_0059&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:16 INFO mapreduce.Job: The url to track the job: &lt;A target="_blank" href="http://quickstart.cloudera:8088/proxy/application_1425573450783_0059/"&gt;http://quickstart.cloudera:8088/proxy/application_1425573450783_0059/&lt;/A&gt;&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:16 INFO mapreduce.Job: Running job: job_1425573450783_0059&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:29 INFO mapreduce.Job: Job job_1425573450783_0059 running in uber mode : false&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:29 INFO mapreduce.Job:&amp;nbsp; map 0% reduce 0%&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:41 INFO mapreduce.Job:&amp;nbsp; map 100% reduce 0%&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:41 INFO mapreduce.Job: Job job_1425573450783_0059 completed successfully&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:41 INFO mapreduce.Job: 
Counters: 30&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;File System Counters&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;FILE: Number of bytes read=0&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;FILE: Number of bytes written=131709&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;FILE: Number of read operations=0&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;FILE: Number of large read operations=0&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;FILE: Number of write operations=0&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;HDFS: Number of bytes read=87&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;HDFS: Number of bytes written=1344&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;HDFS: Number of read operations=4&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;HDFS: Number of large read operations=0&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;HDFS: Number of write operations=2&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;Job Counters &lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Launched map tasks=1&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Other local map tasks=1&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Total time spent by all maps in occupied slots (ms)=9535&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Total time spent by all reduces in occupied 
slots (ms)=0&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Total time spent by all map tasks (ms)=9535&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Total vcore-seconds taken by all map tasks=9535&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Total megabyte-seconds taken by all map tasks=9763840&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;Map-Reduce Framework&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Map input records=58&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Map output records=58&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Input split bytes=87&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Spilled Records=0&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Failed Shuffles=0&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Merged Map outputs=0&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;GC time elapsed (ms)=118&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;CPU time spent (ms)=1430&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Physical memory (bytes) snapshot=118579200&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Virtual memory (bytes) snapshot=856969216&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Total committed heap usage (bytes)=60751872&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;File Input Format Counters &lt;/FONT&gt;&lt;BR /&gt;&lt;FONT 
size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Bytes Read=0&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;File Output Format Counters &lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;&amp;nbsp;&amp;nbsp;Bytes Written=1344&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:41 INFO mapreduce.ImportJobBase: Transferred 1.3125 KB in 29.6161 seconds (45.3808 bytes/sec)&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT size="1" face="courier new,courier"&gt;15/03/13 13:47:41 INFO mapreduce.ImportJobBase: Retrieved 58 records.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I'm not sure why you are getting this error:&lt;/P&gt;&lt;P&gt;&lt;FONT face="courier new,courier"&gt;Retrying connect to server: localhost/127.0.0.1:8021.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In fact, in my VM, I don't have a listener on port 8021, but do have one on 8020.&amp;nbsp; Maybe someone more knowledgeable can address that?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Have you tried a restart of the VM?&amp;nbsp; If you do that, give it some time for all the background processes to fire up before you try sqoop again.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Morgan&lt;/P&gt;</description>
      <pubDate>Fri, 13 Mar 2015 18:07:06 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25558#M5285</guid>
      <dc:creator>Morgan</dc:creator>
      <dc:date>2015-03-13T18:07:06Z</dc:date>
    </item>
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25562#M5286</link>
      <description>&lt;P&gt;After reviewing the blog post, I noticed that it is written for the CDH 4.1.1 VM. I'm afraid there have been a number of changes since then that might be complicating things. The primary change, and the one that I think is complicating Sqoop for you, is that in CDH 4 we recommend MR1 for production, whereas in CDH 5 YARN has stabilized and we now recommend MR2 for production because of the superior resource management.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I believe the following line is responsible for setting up your environment such that Sqoop is trying to use MR1 when it is not running:&lt;/P&gt;&lt;PRE&gt;ln -s /etc/default/hadoop-0.20-mapreduce /etc/profile.d/hadoop.sh&lt;/PRE&gt;&lt;P&gt;You could either try getting rid of that symlink and anything else that's telling the system to use MR1, or you could stop YARN / MR2 and use MR1 instead. I'll try to post some instructions for doing the latter shortly...&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 13 Mar 2015 18:13:45 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25562#M5286</guid>
      <dc:creator>Sean</dc:creator>
      <dc:date>2015-03-13T18:13:45Z</dc:date>
    </item>
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25563#M5287</link>
      <description>&lt;P&gt;To answer Morgan's question, port 8020 is the HDFS NameNode, port 8021 is the JobTracker in MR1, which is where you would have submitted jobs in CDH 4. It can still be used in CDH 5, but as it is not the default, you'll need to switch around some configuration and services (and understand that the rest of the tutorial may not work exactly as expected because of the switch - I'd suggest perhaps starting with a fresh copy of the tutorial to be sure everything in the tutorial will work and not conflict with what you've been doing in R).&lt;/P&gt;</description>
      <pubDate>Fri, 13 Mar 2015 18:15:17 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25563#M5287</guid>
      <dc:creator>Sean</dc:creator>
      <dc:date>2015-03-13T18:15:17Z</dc:date>
    </item>
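A quick way to check which of these daemons is actually listening from inside the VM is a pure-bash TCP probe (this is a generic sketch, not Cloudera-specific; the port numbers are the defaults discussed above, and port 8032 for the YARN ResourceManager appears later in this thread's logs):

```shell
#!/usr/bin/env bash
# Probe a TCP port using bash's /dev/tcp redirection; prints "open" or "refused".
# Default ports: 8020 = HDFS NameNode IPC, 8021 = MR1 JobTracker,
#                8032 = YARN ResourceManager.
probe() {
  local host=$1 port=$2
  # The connect attempt happens in a child shell so a hung connection
  # can be capped at 2 seconds by timeout(1).
  if timeout 2 bash -c "exec 3<>/dev/tcp/$host/$port" 2>/dev/null; then
    echo "$host:$port open"
  else
    echo "$host:$port refused"
  fi
}

probe localhost 8020
probe localhost 8021
probe localhost 8032
```

On a CDH 5 QuickStart VM in its default configuration you would expect 8020 and 8032 to be open and 8021 refused, which matches what Morgan observed.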
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25566#M5288</link>
      <description>&lt;P&gt;I believe this procedure should get you switched over from YARN / MR2 to MR1. After running it I was able to compute pi using MR1:&lt;/P&gt;&lt;PRE&gt;for service in mapreduce-historyserver yarn-nodemanager yarn-proxyserver yarn-resourcemanager; do
    sudo service hadoop-${service} stop
    sudo chkconfig hadoop-${service} off
done

sudo yum remove -y hadoop-conf-pseudo
sudo yum install -y hadoop-0.20-conf-pseudo

for service in 0.20-mapreduce-jobtracker 0.20-mapreduce-tasktracker; do
    sudo service hadoop-${service} start
    sudo chkconfig hadoop-${service} on
done&lt;/PRE&gt;&lt;P&gt;&amp;nbsp; It stops and disables the MR2 / YARN services, swaps the configuration files, then starts and enables the MR1 services. Again, the tutorial is not written to be used (or tested) with MR1, so it's possible you'll run into some other issues. I can't think of any specific incompatibilities - just recommending that if you want to walk through the tutorial, you do it with an environment as close to the original VM as possible - otherwise who knows what differences may be involved.&lt;/P&gt;</description>
      <pubDate>Fri, 13 Mar 2015 19:08:13 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25566#M5288</guid>
      <dc:creator>Sean</dc:creator>
      <dc:date>2015-03-13T19:08:13Z</dc:date>
    </item>
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25653#M5289</link>
      <description>&lt;P&gt;Sean,&amp;nbsp;&lt;/P&gt;&lt;P&gt;Your procedure for stopping MR2/YARN and starting MR1 solved the problem.&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am not sure if you are familiar with R but my purpose is to set up R and Hadoop together. I did this using that blog I sent the link for. The mapreduce jobs run now and an output file is created as a result of a very simple 3 line R test code. But when I try to access that file, &amp;nbsp;I get an "output file does not exist" error, which is given below. Any comments here that could help me proceed would be&amp;nbsp;very much appreciated. Thanks. &amp;nbsp; -ER&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Exception in thread "main" java.io.FileNotFoundException: File does not exist: hdfs://localhost:8020/user/cloudera/0&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1093)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1085)&lt;BR /&gt;at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1085)&lt;BR /&gt;at org.apache.hadoop.streaming.DumpTypedBytes.run(DumpTypedBytes.java:76)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)&lt;BR /&gt;at org.apache.hadoop.streaming.HadoopStreaming.main(HadoopStreaming.java:41)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.util.RunJar.main(RunJar.java:212)&lt;BR /&gt;Exception in thread "main" java.io.FileNotFoundException: File does not 
exist: hdfs://localhost:8020/user/cloudera/128432&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1093)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1085)&lt;BR /&gt;at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1085)&lt;BR /&gt;at org.apache.hadoop.streaming.DumpTypedBytes.run(DumpTypedBytes.java:76)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)&lt;BR /&gt;at org.apache.hadoop.streaming.HadoopStreaming.main(HadoopStreaming.java:41)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.util.RunJar.main(RunJar.java:212)&lt;BR /&gt;Exception in thread "main" java.io.FileNotFoundException: File does not exist: hdfs://localhost:8020/user/cloudera/422&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1093)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1085)&lt;BR /&gt;at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1085)&lt;BR /&gt;at org.apache.hadoop.streaming.DumpTypedBytes.run(DumpTypedBytes.java:76)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)&lt;BR /&gt;at 
org.apache.hadoop.streaming.HadoopStreaming.main(HadoopStreaming.java:41)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.util.RunJar.main(RunJar.java:212)&lt;BR /&gt;Exception in thread "main" java.io.FileNotFoundException: File does not exist: hdfs://localhost:8020/user/cloudera/122&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1093)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1085)&lt;BR /&gt;at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)&lt;BR /&gt;at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1085)&lt;BR /&gt;at org.apache.hadoop.streaming.DumpTypedBytes.run(DumpTypedBytes.java:76)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)&lt;BR /&gt;at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)&lt;BR /&gt;at org.apache.hadoop.streaming.HadoopStreaming.main(HadoopStreaming.java:41)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.hadoop.util.RunJar.main(RunJar.java:212)&lt;/P&gt;</description>
      <pubDate>Tue, 17 Mar 2015 16:50:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25653#M5289</guid>
      <dc:creator>eugenerory</dc:creator>
      <dc:date>2015-03-17T16:50:18Z</dc:date>
    </item>
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25661#M5290</link>
      <description>&lt;P&gt;I'm afraid I'm not very familiar with R and running it against Hadoop. My first thought is that perhaps the program that creates the files and the program that looks for the files are running as different users? /user/cloudera is the default working directory for the cloudera user, but other users will default to other directories. e.g. if 'root' asks for a file called '0', unless there's an absolute path with it, it means /user/root/0. Is it possible these files exist under a different user's home directory?&lt;/P&gt;</description>
      <pubDate>Tue, 17 Mar 2015 18:35:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/25661#M5290</guid>
      <dc:creator>Sean</dc:creator>
      <dc:date>2015-03-17T18:35:57Z</dc:date>
    </item>
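The resolution rule Sean describes can be sketched with a tiny helper (purely illustrative and hypothetical - HDFS applies this rule internally; it is not a real client function):

```shell
# Sketch of HDFS relative-path resolution: a bare name like "0" is anchored
# at the calling user's HDFS home directory, /user/<username>, while an
# absolute path is used unchanged. Illustrative only.
hdfs_resolve() {
  local user=$1 path=$2
  case $path in
    /*) echo "$path" ;;              # absolute path: used as-is
    *)  echo "/user/$user/$path" ;;  # relative path: anchored at the home dir
  esac
}

hdfs_resolve cloudera 0       # -> /user/cloudera/0
hdfs_resolve root 0           # -> /user/root/0
hdfs_resolve root /tmp/out    # -> /tmp/out
```

So if the R job ran under a different user (e.g. via sudo), its output file "0" would land under that user's home directory, and `hadoop fs -ls /user/root` (run with sufficient permission) would be one way to check.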
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/32115#M5291</link>
      <description>&lt;P&gt;I faced the same issue and surprisingly it worked when I prefixed the sqoop command with sudo - I don't understand why, as the cloudera user should have the same privileges&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;sqoop import-all-tables \&lt;BR /&gt;&amp;gt; -m 1 \&lt;BR /&gt;&amp;gt; --connect jdbc:mysql://quickstart:3306/retail_db \&lt;BR /&gt;&amp;gt; --username=retail_dba \&lt;BR /&gt;&amp;gt; --password=cloudera \&lt;BR /&gt;&amp;gt; --compression-codec=snappy \&lt;BR /&gt;&amp;gt; --as-avrodatafile \&lt;BR /&gt;&amp;gt; --warehouse-dir=/user/hive/warehouse&lt;BR /&gt;Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.&lt;BR /&gt;Please set $ACCUMULO_HOME to the root of your Accumulo installation.&lt;BR /&gt;/usr/lib/hadoop-0.20-mapreduce/hadoop-core-2.6.0-mr1-cdh5.4.0.jar&lt;BR /&gt;/usr/lib/hadoop-0.20-mapreduce/hadoop-core-mr1.jar&lt;BR /&gt;15/09/21 20:32:17 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.4.0&lt;BR /&gt;15/09/21 20:32:17 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. 
Consider using -P instead.&lt;BR /&gt;15/09/21 20:32:18 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.&lt;BR /&gt;15/09/21 20:32:19 INFO tool.CodeGenTool: Beginning code generation&lt;BR /&gt;15/09/21 20:32:19 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;15/09/21 20:32:20 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;15/09/21 20:32:20 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-0.20-mapreduce&lt;BR /&gt;Note: /tmp/sqoop-cloudera/compile/554ca9a57edc7fb8771c0729223df56c/categories.java uses or overrides a deprecated API.&lt;BR /&gt;Note: Recompile with -Xlint:deprecation for details.&lt;BR /&gt;15/09/21 20:32:24 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/554ca9a57edc7fb8771c0729223df56c/categories.jar&lt;BR /&gt;15/09/21 20:32:24 WARN manager.MySQLManager: It looks like you are importing from mysql.&lt;BR /&gt;15/09/21 20:32:24 WARN manager.MySQLManager: This transfer can be faster! Use the --direct&lt;BR /&gt;15/09/21 20:32:24 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.&lt;BR /&gt;15/09/21 20:32:24 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)&lt;BR /&gt;15/09/21 20:32:24 INFO mapreduce.ImportJobBase: Beginning import of categories&lt;BR /&gt;15/09/21 20:32:26 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;15/09/21 20:32:27 INFO mapreduce.DataDrivenImportJob: Writing Avro schema file: /tmp/sqoop-cloudera/compile/554ca9a57edc7fb8771c0729223df56c/sqoop_import_categories.avsc&lt;BR /&gt;15/09/21 20:32:29 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. 
Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/09/21 20:32:30 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/09/21 20:32:31 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/09/21 20:32:32 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/09/21 20:32:33 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/09/21 20:32:34 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/09/21 20:32:35 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/09/21 20:32:36 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/09/21 20:32:37 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/09/21 20:32:38 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:8021. 
Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)&lt;BR /&gt;15/09/21 20:32:38 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:java.net.ConnectException: Call From quickstart.cloudera/127.0.0.1 to localhost:8021 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: &lt;A href="http://wiki.apache.org/hadoop/ConnectionRefused" target="_blank"&gt;http://wiki.apache.org/hadoop/ConnectionRefused&lt;/A&gt;&lt;BR /&gt;15/09/21 20:32:38 ERROR tool.ImportAllTablesTool: Encountered IOException running import job: java.net.ConnectException: Call From quickstart.cloudera/127.0.0.1 to localhost:8021 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: &lt;A href="http://wiki.apache.org/hadoop/ConnectionRefused" target="_blank"&gt;http://wiki.apache.org/hadoop/ConnectionRefused&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;[cloudera@quickstart ClouderaGettingStartedCode]$ sudo sqoop import-all-tables -m 1 --connect jdbc:mysql://quickstart:3306/retail_db --username=retail_dba --password=cloudera --compression-codec=snappy --as-avrodatafile --warehouse-dir=/user/hive/warehouse&lt;BR /&gt;Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.&lt;BR /&gt;Please set $ACCUMULO_HOME to the root of your Accumulo installation.&lt;BR /&gt;15/09/21 20:36:25 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.4.0&lt;BR /&gt;15/09/21 20:36:25 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. 
Consider using -P instead.&lt;BR /&gt;15/09/21 20:36:25 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.&lt;BR /&gt;15/09/21 20:36:26 INFO tool.CodeGenTool: Beginning code generation&lt;BR /&gt;15/09/21 20:36:26 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;15/09/21 20:36:26 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;15/09/21 20:36:26 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce&lt;BR /&gt;Note: /tmp/sqoop-root/compile/d58fcf6562850c1a3a17a3fe48bfea6d/categories.java uses or overrides a deprecated API.&lt;BR /&gt;Note: Recompile with -Xlint:deprecation for details.&lt;BR /&gt;15/09/21 20:36:31 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/d58fcf6562850c1a3a17a3fe48bfea6d/categories.jar&lt;BR /&gt;15/09/21 20:36:31 WARN manager.MySQLManager: It looks like you are importing from mysql.&lt;BR /&gt;15/09/21 20:36:31 WARN manager.MySQLManager: This transfer can be faster! Use the --direct&lt;BR /&gt;15/09/21 20:36:31 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.&lt;BR /&gt;15/09/21 20:36:31 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)&lt;BR /&gt;15/09/21 20:36:31 INFO mapreduce.ImportJobBase: Beginning import of categories&lt;BR /&gt;15/09/21 20:36:31 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address&lt;BR /&gt;15/09/21 20:36:32 INFO Configuration.deprecation: mapred.jar is deprecated. 
Instead, use mapreduce.job.jar&lt;BR /&gt;15/09/21 20:36:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1&lt;BR /&gt;15/09/21 20:36:34 INFO mapreduce.DataDrivenImportJob: Writing Avro schema file: /tmp/sqoop-root/compile/d58fcf6562850c1a3a17a3fe48bfea6d/sqoop_import_categories.avsc&lt;BR /&gt;15/09/21 20:36:34 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps&lt;BR /&gt;15/09/21 20:36:34 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032&lt;BR /&gt;15/09/21 20:36:45 INFO db.DBInputFormat: Using read commited transaction isolation&lt;BR /&gt;15/09/21 20:36:45 INFO mapreduce.JobSubmitter: number of splits:1&lt;BR /&gt;15/09/21 20:36:46 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1437690038831_0009&lt;BR /&gt;15/09/21 20:36:51 INFO impl.YarnClientImpl: Submitted application application_1437690038831_0009&lt;BR /&gt;15/09/21 20:36:52 INFO mapreduce.Job: The url to track the job: &lt;A href="http://quickstart.cloudera:8088/proxy/application_1437690038831_0009/" target="_blank"&gt;http://quickstart.cloudera:8088/proxy/application_1437690038831_0009/&lt;/A&gt;&lt;BR /&gt;15/09/21 20:36:52 INFO mapreduce.Job: Running job: job_1437690038831_0009&lt;BR /&gt;15/09/21 20:38:11 INFO mapreduce.Job: Job job_1437690038831_0009 running in uber mode : false&lt;BR /&gt;15/09/21 20:38:11 INFO mapreduce.Job: map 0% reduce 0%&lt;BR /&gt;15/09/21 20:39:11 INFO mapreduce.Job: map 100% reduce 0%&lt;BR /&gt;15/09/21 20:39:15 INFO mapreduce.Job: Job job_1437690038831_0009 completed successfully&lt;BR /&gt;15/09/21 20:39:16 INFO mapreduce.Job: Counters: 30&lt;BR /&gt;File System Counters&lt;BR /&gt;FILE: Number of bytes read=0&lt;BR /&gt;FILE: Number of bytes written=135070&lt;BR /&gt;FILE: Number of read operations=0&lt;BR /&gt;FILE: Number of large read operations=0&lt;BR /&gt;FILE: Number of write operations=0&lt;BR /&gt;HDFS: Number of bytes read=87&lt;BR /&gt;HDFS: 
Number of bytes written=1344&lt;BR /&gt;HDFS: Number of read operations=4&lt;BR /&gt;HDFS: Number of large read operations=0&lt;BR /&gt;HDFS: Number of write operations=2&lt;BR /&gt;Job Counters&lt;BR /&gt;Launched map tasks=1&lt;BR /&gt;Other local map tasks=1&lt;BR /&gt;Total time spent by all maps in occupied slots (ms)=55012&lt;BR /&gt;Total time spent by all reduces in occupied slots (ms)=0&lt;BR /&gt;Total time spent by all map tasks (ms)=55012&lt;BR /&gt;Total vcore-seconds taken by all map tasks=55012&lt;BR /&gt;Total megabyte-seconds taken by all map tasks=56332288&lt;BR /&gt;Map-Reduce Framework&lt;BR /&gt;Map input records=58&lt;BR /&gt;Map output records=58&lt;BR /&gt;Input split bytes=87&lt;BR /&gt;Spilled Records=0&lt;BR /&gt;Failed Shuffles=0&lt;BR /&gt;Merged Map outputs=0&lt;BR /&gt;GC time elapsed (ms)=404&lt;BR /&gt;CPU time spent (ms)=2000&lt;BR /&gt;Physical memory (bytes) snapshot=114446336&lt;BR /&gt;Virtual memory (bytes) snapshot=1508089856&lt;BR /&gt;Total committed heap usage (bytes)=60882944&lt;BR /&gt;File Input Format Counters&lt;BR /&gt;Bytes Read=0&lt;BR /&gt;File Output Format Counters&lt;BR /&gt;Bytes Written=1344&lt;BR /&gt;15/09/21 20:39:16 INFO mapreduce.ImportJobBase: Transferred 1.3125 KB in 161.5896 seconds (8.3174 bytes/sec)&lt;BR /&gt;15/09/21 20:39:16 INFO mapreduce.ImportJobBase: Retrieved 58 records.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;And likewise all tables were imported. Is there any permission I must grant user cloudera as root?&lt;/P&gt;</description>
      <pubDate>Tue, 22 Sep 2015 04:05:34 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/32115#M5291</guid>
      <dc:creator>DeepakTanna</dc:creator>
      <dc:date>2015-09-22T04:05:34Z</dc:date>
    </item>
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/32260#M5292</link>
      <description>&lt;P&gt;I'm not sure why that's happening to you. The cloudera user should be set up pretty similarly to the root user - I can't imagine why one would try and use MR1 and the other would use YARN, unless there was an environment variable set in that terminal or something.&lt;/P&gt;</description>
      <pubDate>Thu, 24 Sep 2015 12:47:19 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/32260#M5292</guid>
      <dc:creator>Sean</dc:creator>
      <dc:date>2015-09-24T12:47:19Z</dc:date>
    </item>
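One way to chase down the environment-variable theory is to dump the variables that commonly steer Hadoop client tools between MR1 and MR2/YARN (a generic sketch, not CDH-specific; note that sudo resets most of the environment by default, which is one way `sqoop` and `sudo sqoop` can end up behaving differently):

```shell
# Print the Hadoop-related variables in the current shell; "unset" marks
# variables that are not set at all. Run the same loop in the other user's
# shell (or via sudo) and compare the two listings.
for v in HADOOP_MAPRED_HOME HADOOP_HOME HADOOP_CONF_DIR CLASSPATH; do
  eval "val=\${$v-unset}"
  printf '%s=%s\n' "$v" "$val"
done
```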
    <item>
      <title>Re: I try to run commands at the terminal but get a connection refused error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/32268#M5293</link>
      <description>&lt;P&gt;I did not set any variables in the session.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This is what&amp;nbsp;I have in .bash_profile:&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;# .bash_profile&lt;/P&gt;&lt;P&gt;# Get the aliases and functions&lt;BR /&gt;if [ -f ~/.bashrc ]; then&lt;BR /&gt;. ~/.bashrc&lt;BR /&gt;fi&lt;/P&gt;&lt;P&gt;# User specific environment and startup programs&lt;/P&gt;&lt;P&gt;export PATH=$PATH:$HOME/bin:$HADOOP_HOME/bin&lt;BR /&gt;export CLASSPATH=/usr/lib/hadoop/client-0.20/\*:/usr/lib/hadoop/\*&lt;BR /&gt;export AVRO_CLASSPATH=/usr/lib/avro&lt;/P&gt;&lt;P&gt;alias lart="ls -lart"&lt;BR /&gt;set -o vi&lt;/P&gt;</description>
      <pubDate>Thu, 24 Sep 2015 18:01:19 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/I-try-to-run-commands-at-the-terminal-but-get-a-connection/m-p/32268#M5293</guid>
      <dc:creator>DeepakTanna</dc:creator>
      <dc:date>2015-09-24T18:01:19Z</dc:date>
    </item>
  </channel>
</rss>

