Support Questions

Getting error "Failed on local exception: "

New Contributor

Hi All,

I am running CDH 5.3.2 on a single Ubuntu box, and I am trying to run the sample MapReduce (mrjob) example below on my Hadoop machine.

I am getting the error below and cannot figure out why. It seems to involve Google's Protocol Buffers (the "invalid wire type" in the message is a protobuf error):


STDERR: mkdir: Failed on local exception: Protocol message tag had invalid wire type.; Host Details : local host is: ""; destination host is: "localhost":9000;



Can you let me know how to overcome this issue? I tried Google, but all the suggestions I found (I tested them all) went nowhere:


hduser@hadoop1:~$ python mrjob-master/mrjob/examples/ /home/hduser/mrjob-master/README.rst -r hadoop -v
Deprecated option hdfs_scratch_dir has been renamed to hadoop_tmp_dir
Unexpected option hdfs_tmp_dir
looking for configs in /home/hduser/.mrjob.conf
using configs in /home/hduser/.mrjob.conf
Active configuration:

{'bootstrap_mrjob': None,
 'check_input_paths': True,
 'cleanup': ['ALL'],
 'cleanup_on_failure': ['NONE'],
 'cmdenv': {},
 'hadoop_bin': None,
 'hadoop_extra_args': [],
 'hadoop_home': None,
 'hadoop_streaming_jar': None,
 'hadoop_tmp_dir': 'tmp/mrjob',
 'hadoop_version': '0.20',
 'interpreter': None,
 'jobconf': {},
 'label': None,
 'local_tmp_dir': '/tmp',
 'owner': 'hduser',
 'python_archives': [],
 'python_bin': None,
 'setup': [],
 'setup_cmds': [],
 'setup_scripts': [],
 'sh_bin': ['sh', '-ex'],
 'steps_interpreter': None,
 'steps_python_bin': None,
 'strict_protocols': True,
 'upload_archives': [],
 'upload_files': []}

Hadoop streaming jar is /home/hduser/Desktop/hadoop-2.5.0-cdh5.3.2/share/hadoop/tools/lib/hadoop-streaming-2.5.0-cdh5.3.2.jar

creating tmp directory /tmp/mr_word_freq_count.hduser.20150820.213521.092743

archiving /usr/local/lib/python2.7/dist-packages/mrjob-0.5.0_dev-py2.7.egg/mrjob -> /tmp/mr_word_freq_count.hduser.20150820.213521.092743/mrjob.tar.gz as mrjob/
writing wrapper script to /tmp/mr_word_freq_count.hduser.20150820.213521.092743/

WRAPPER: # store $PWD



WRAPPER: # obtain exclusive file lock

WRAPPER: exec 9>/tmp/wrapper.lock.mr_word_freq_count.hduser.20150820.213521.092743

WRAPPER: python -c 'import fcntl; fcntl.flock(9, fcntl.LOCK_EX)'


WRAPPER: # setup commands


WRAPPER:   export PYTHONPATH=$__mrjob_PWD/mrjob.tar.gz:$PYTHONPATH

WRAPPER: } 0</dev/null 1>&2


WRAPPER: # release exclusive file lock

WRAPPER: exec 9>&-


WRAPPER: # run task from the original working directory

WRAPPER: cd $__mrjob_PWD


Making directory hdfs:///user/hduser/tmp/mrjob/mr_word_freq_count.hduser.20150820.213521.092743/files/ on HDFS

> /home/hduser/Desktop/hadoop-2.5.0-cdh5.3.2/bin/hadoop version

Using Hadoop version 2.5.0

> /home/hduser/Desktop/hadoop-2.5.0-cdh5.3.2/bin/hadoop fs -mkdir -p hdfs:///user/hduser/tmp/mrjob/mr_word_freq_count.hduser.20150820.213521.092743/files/

STDERR: 15/08/20 14:35:22 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

STDERR: mkdir: Failed on local exception: Protocol message tag had invalid wire type.; Host Details : local host is: ""; destination host is: "localhost":9000;
Traceback (most recent call last):
  File "mrjob-master/mrjob/examples/", line 37, in <module>
  File "/usr/local/lib/python2.7/dist-packages/mrjob-0.5.0_dev-py2.7.egg/mrjob/", line 433, in run
  File "/usr/local/lib/python2.7/dist-packages/mrjob-0.5.0_dev-py2.7.egg/mrjob/", line 451, in execute
    super(MRJob, self).execute()
  File "/usr/local/lib/python2.7/dist-packages/mrjob-0.5.0_dev-py2.7.egg/mrjob/", line 160, in execute
  File "/usr/local/lib/python2.7/dist-packages/mrjob-0.5.0_dev-py2.7.egg/mrjob/", line 227, in run_job
  File "/usr/local/lib/python2.7/dist-packages/mrjob-0.5.0_dev-py2.7.egg/mrjob/", line 452, in run
  File "/usr/local/lib/python2.7/dist-packages/mrjob-0.5.0_dev-py2.7.egg/mrjob/", line 234, in _run
  File "/usr/local/lib/python2.7/dist-packages/mrjob-0.5.0_dev-py2.7.egg/mrjob/", line 261, in _upload_local_files_to_hdfs
  File "/usr/local/lib/python2.7/dist-packages/mrjob-0.5.0_dev-py2.7.egg/mrjob/", line 281, in _mkdir_on_hdfs
    self.invoke_hadoop(['fs', '-mkdir', '-p', path])
  File "/usr/local/lib/python2.7/dist-packages/mrjob-0.5.0_dev-py2.7.egg/mrjob/fs/", line 101, in invoke_hadoop
    raise CalledProcessError(proc.returncode, args)
subprocess.CalledProcessError: Command '['/home/hduser/Desktop/hadoop-2.5.0-cdh5.3.2/bin/hadoop', 'fs', '-mkdir', '-p', 'hdfs:///user/hduser/tmp/mrjob/mr_word_freq_count.hduser.20150820.213521.092743/files/']' returned non-zero exit status 1
hduser@hadoop1:~$



hduser@hadoop1:~$ jps

23519 NodeManager

23192 ResourceManager

23667 Jps

23029 SecondaryNameNode

22842 DataNode



hduser@hadoop1:~$ sudo ufw status | grep 9000

9000                       ALLOW       Anywhere

9000 (v6)                  ALLOW       Anywhere (v6)



hduser@hadoop1:~$ telnet localhost 9000


Connected to localhost.localdomain.

Escape character is '^]'.

SSH-2.0-OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.3




hduser@hadoop1:~/Desktop/hadoop-dns-checker-master$ ./ my_hosts
==== ====
's password:
sending incremental file list
created directory hadoop-dns

sent 2,449 bytes  received 106 bytes  393.08 bytes/sec
total size is 2,620  speedup is 1.03
's password:

# self check...

-- host :

   host lookup : success (

   reverse lookup : success (

   is reachable : yes

# end self check

==== Running on : =====

-- host :

   host lookup : success (

   reverse lookup : success (

   is reachable : yes



hduser@hadoop1:~$ cat hadoop-2.5.0-cdh5.3.2/etc/hadoop/core-site.xml












hduser@hadoop1:~$ cat hadoop-2.5.0-cdh5.3.2/etc/hadoop/hdfs-site.xml








hduser@hadoop1:~$ cat hadoop-2.5.0-cdh5.3.2/etc/hadoop/mapred-site.xml









hduser@hadoop1:~$ cat hadoop-2.5.0-cdh5.3.2/etc/hadoop/ | grep HADOOP_CONF_DIR
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/etc/hadoop"}



hduser@hadoop1:~$ hostname --fqdn



hduser@hadoop1:~$ cat /etc/hosts

#           localhost

#           myhost-1

#       localhost  ubuntu

#  ubuntu

#  ubuntu

#     myhost-1

#        myhost-1
    localhost.localdomain       localhost    hadoop1


# The following lines are desirable for IPv6 capable hosts

::1     ip6-localhost ip6-loopback

fe00::0 ip6-localnet

ff00::0 ip6-mcastprefix

ff02::1 ip6-allnodes

ff02::2 ip6-allrouters



hduser@hadoop1:~$ hostname



Appreciate the help.


Re: Getting error "Failed on local exception: "

Master Guru
Are you positive your cluster serves its NameNode on port 9000, and not 8020? You typically receive such errors when (1) the client version does not match the server's, or (2) the port is not serving a protocol-buffer-based service but an HTTP or other service instead.

You can confirm your cluster's version by visiting the NameNode Web UI, and you can confirm the port either on the Web UI or by running "netstat -anp | grep NNPID | grep LISTEN" as root on the NameNode box, replacing NNPID with the NameNode process's PID. It will list several LISTEN entries; check whether 9000 is among them.

Re: Getting error "Failed on local exception: "

New Contributor

Hi Harsh,

Thanks for your reply.

I think you might be correct: I do not see the NameNode process running. I checked my core-site.xml file and it shows localhost:9000, but the process itself is not up.
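For reference, a minimal pseudo-distributed core-site.xml typically declares the NameNode address like this (generic values, not the poster's actual file, which is not shown above):

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```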

I formatted my NameNode and restarted the services, but I get the same result.


root@hadoop1:/home/hduser/hadoop-2.5.0-cdh5.3.2/sbin# netstat -anp | grep NNPID | grep LISTEN
root@hadoop1:/home/hduser/hadoop-2.5.0-cdh5.3.2/sbin# whoami
root@hadoop1:/home/hduser/hadoop-2.5.0-cdh5.3.2/sbin# su hduser
hduser@hadoop1:~/hadoop-2.5.0-cdh5.3.2/sbin$ jps
7568 NodeManager
7241 ResourceManager
7078 SecondaryNameNode
6896 DataNode
9842 Jps
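The jps listing above is itself the answer: there is no NameNode line, so nothing is serving HDFS IPC and grepping netstat for its PID can match nothing. That check can be sketched like this (using the pasted jps output as sample input; in real use, pipe jps straight in):

```shell
# Sample input: the jps output pasted above (it has no NameNode line).
jps_output='7568 NodeManager
7241 ResourceManager
7078 SecondaryNameNode
6896 DataNode
9842 Jps'

# Match a line whose second field is exactly "NameNode"
# (so SecondaryNameNode does not match).
NNPID=$(printf '%s\n' "$jps_output" | awk '$2 == "NameNode" {print $1}')

if [ -z "$NNPID" ]; then
    echo "NameNode is not running - check its log before anything else"
else
    netstat -anp | grep "$NNPID" | grep LISTEN
fi
```

With the NameNode absent, the next place to look is its log under the Hadoop logs directory (a file named like hadoop-<user>-namenode-<host>.log), which usually records why the daemon exited, for example a name directory left inconsistent by the format.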

Could you let me know which files, scripts, or logs I should look at to troubleshoot further?



Re: Getting error "Failed on local exception: "

New Contributor

Hi, I hit the same error. Check your Hadoop version with "hadoop version". Mine turned out to be 2.0.0-cdh4.2.1, so I changed my Gradle dependencies to match:

compile group: 'org.apache.hadoop', name: 'hadoop-core', version: '2.0.0-mr1-cdh4.2.1', ext: 'pom'
compile group: 'org.apache.hadoop', name: 'hadoop-hdfs', version: '2.0.0-cdh4.2.1'
compile group: 'org.apache.hadoop', name: 'hadoop-common', version: '2.0.0-cdh4.2.1'
compile group: 'org.apache.hadoop', name: 'hadoop-mapreduce-client-core', version: '2.0.0-cdh4.2.1'
compile group: 'org.apache.hadoop', name: 'hadoop-client', version: '2.0.0-cdh4.2.1'

It works!
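The general fix here is keeping the client-side artifact versions in lockstep with what the cluster actually runs. A sketch of pulling the version token out of "hadoop version" output so it can be pasted into the Gradle coordinates above (sample output inlined; really you would pipe the command in):

```shell
# Sample first line of `hadoop version` output on a CDH4 cluster.
version_output='Hadoop 2.0.0-cdh4.2.1'

# Extract the version token; this is the string the Gradle lines need.
ver=$(printf '%s\n' "$version_output" | awk '$1 == "Hadoop" {print $2}')
echo "$ver"
```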