Member since: 04-27-2016
Posts: 22
Kudos Received: 0
Solutions: 0
09-21-2018
10:56 AM
It is still failing; I keep trying. I will let you know once it is done.
09-19-2018
12:57 PM
I newly installed HDP 3.0 on my computer. Before, I was using HDP 2.6.x and it was working fine, but after installing HDP 3.0 the newly introduced YARN Registry DNS component does not start, showing an "address already in use" bind exception. I tried it on two computers separately, each with a fresh installation of Ubuntu 16.04 and HDP 3.0, and both machines show the same error. I have searched a lot with no luck, and there is no proper solution available on the internet. Your help is appreciated. Thanks in advance.
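A hedged sketch, not an authoritative fix: on Ubuntu, the YARN Registry DNS server defaults to port 53, which is often already held by a local resolver (e.g. dnsmasq or systemd-resolved), which would produce exactly this bind error. A first diagnostic step might look like:

```shell
# See which process, if any, already owns the DNS port (port 53).
sudo ss -tulpn | grep ':53 ' || echo "port 53 appears free"
# If another service owns it, either stop/disable that service, or change the
# Registry DNS port via the Hadoop property hadoop.registry.dns.bind-port
# (e.g. to 5353) in Ambari, then restart YARN.
```

The property name comes from the Hadoop 3 registry documentation; the exact Ambari config section may differ by HDP build.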
08-29-2018
09:45 AM
Thanks for your reply, but I want to fetch HBase columns in Atlas 1.0 on HDP 2.6; in my case an upgrade is not possible. How do I install the HBase hook on HDP 2.6?
08-22-2018
10:35 AM
I have installed Atlas 1.0 on HDP 2.6, and I am able to query and see the Hive tables, columns, and queries in the Atlas UI. What I want is to query and fetch the HBase table names, column qualifiers, and column families in the Atlas UI. Let me know how to fetch HBase data in Atlas, and whether I need to change any configuration at the cluster level.
05-09-2018
11:13 AM
I tried your steps again and I am getting the same error. My cluster is a multi-node distributed cluster.
05-02-2018
09:40 AM
I have set up a multi-node cluster with Hortonworks HDP 2.6. I am trying to add the Flume service to my cluster and I am getting the following error. stderr: /var/lib/ambari-agent/data/errors-5518.txt
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/FLUME/1.4.0.2.0/package/scripts/flume_handler.py", line 122, in <module>
FlumeHandler().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/FLUME/1.4.0.2.0/package/scripts/flume_handler.py", line 45, in install
self.install_packages(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 821, in install_packages
retry_count=agent_stack_retry_count)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 53, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/apt.py", line 75, in wrapper
return function_to_decorate(self, name, *args[2:])
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/apt.py", line 376, in install_package
self.checked_call_with_retries(cmd, sudo=True, env=INSTALL_CMD_ENV, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 266, in checked_call_with_retries
return self._call_with_retries(cmd, is_checked=True, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 283, in _call_with_retries
code, out = func(cmd, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install flume-2-6-4-0-91' returned 100. Reading package lists...
Building dependency tree...
Reading state information...
The following NEW packages will be installed:
flume-2-6-4-0-91
0 upgraded, 1 newly installed, 0 to remove and 2 not upgraded.
Need to get 74.0 MB of archives.
After this operation, 86.6 MB of additional disk space will be used.
Get:1 http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0 HDP/main amd64 flume-2-6-4-0-91 all 1.5.2.2.6.4.0-91 [74.0 MB]
Err:1 http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0 HDP/main i386 flume-2-6-4-0-91 all 1.5.2.2.6.4.0-91
Hash Sum mismatch
Get:1 http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0 HDP/main i386 flume-2-6-4-0-91 all 1.5.2.2.6.4.0-91 [74.0 MB]
Err:1 http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0 HDP/main i386 flume-2-6-4-0-91 all 1.5.2.2.6.4.0-91
Hash Sum mismatch
Fetched 400 B in 0s (746 B/s)
E: Failed to fetch http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0/pool/main/f/flume/flume-2-6-4-0-91_1.5.2.2.6.4.0-91_all.deb Hash Sum mismatch
E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing? stdout: /var/lib/ambari-agent/data/output-5518.txt
2018-05-02 13:25:19,415 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-05-02 13:25:19,419 - Using hadoop conf dir: /usr/hdp/2.6.4.0-91/hadoop/conf
2018-05-02 13:25:19,419 - Group['hdfs'] {}
2018-05-02 13:25:19,420 - Group['hadoop'] {}
2018-05-02 13:25:19,420 - Group['users'] {}
2018-05-02 13:25:19,420 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-02 13:25:19,421 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-02 13:25:19,421 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-02 13:25:19,422 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-02 13:25:19,422 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-02 13:25:19,423 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-02 13:25:19,423 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-02 13:25:19,424 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-05-02 13:25:19,424 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-02 13:25:19,425 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-05-02 13:25:19,425 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-02 13:25:19,426 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-02 13:25:19,426 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-02 13:25:19,427 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-02 13:25:19,427 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-05-02 13:25:19,428 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-02 13:25:19,428 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-05-02 13:25:19,432 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-05-02 13:25:19,432 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-05-02 13:25:19,433 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-02 13:25:19,434 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-05-02 13:25:19,434 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-05-02 13:25:19,439 - call returned (0, '1015')
2018-05-02 13:25:19,439 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-05-02 13:25:19,443 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1015'] due to not_if
2018-05-02 13:25:19,443 - Group['hdfs'] {}
2018-05-02 13:25:19,443 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-05-02 13:25:19,444 - FS Type:
2018-05-02 13:25:19,444 - Directory['/etc/hadoop'] {'mode': 0755}
2018-05-02 13:25:19,452 - File['/usr/hdp/2.6.4.0-91/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-05-02 13:25:19,453 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-05-02 13:25:19,465 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-05-02 13:25:19,469 - File['/tmp/tmpB_grzd'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0 HDP main'}
2018-05-02 13:25:19,470 - Writing File['/tmp/tmpB_grzd'] because contents don't match
2018-05-02 13:25:19,470 - File['/tmp/tmpsicpFn'] {'content': StaticFile('/etc/apt/sources.list.d/ambari-hdp-1.list')}
2018-05-02 13:25:19,470 - Writing File['/tmp/tmpsicpFn'] because contents don't match
2018-05-02 13:25:19,470 - File['/etc/apt/sources.list.d/ambari-hdp-1.list'] {'content': StaticFile('/tmp/tmpB_grzd')}
2018-05-02 13:25:19,471 - Writing File['/etc/apt/sources.list.d/ambari-hdp-1.list'] because contents don't match
2018-05-02 13:25:19,471 - checked_call[['apt-get', 'update', '-qq', '-o', u'Dir::Etc::sourcelist=sources.list.d/ambari-hdp-1.list', '-o', 'Dir::Etc::sourceparts=-', '-o', 'APT::Get::List-Cleanup=0']] {'sudo': True, 'quiet': False}
2018-05-02 13:25:20,248 - checked_call returned (0, '')
2018-05-02 13:25:20,249 - Repository['HDP-2.6-GPL-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/ubuntu16/2.x/updates/2.6.4.0', 'action': ['create'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-05-02 13:25:20,250 - File['/tmp/tmpAMTk7U'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0 HDP main\ndeb http://public-repo-1.hortonworks.com/HDP-GPL/ubuntu16/2.x/updates/2.6.4.0 HDP-GPL main'}
2018-05-02 13:25:20,250 - Writing File['/tmp/tmpAMTk7U'] because contents don't match
2018-05-02 13:25:20,250 - File['/tmp/tmpkRe0G7'] {'content': StaticFile('/etc/apt/sources.list.d/ambari-hdp-1.list')}
2018-05-02 13:25:20,250 - Writing File['/tmp/tmpkRe0G7'] because contents don't match
2018-05-02 13:25:20,251 - File['/etc/apt/sources.list.d/ambari-hdp-1.list'] {'content': StaticFile('/tmp/tmpAMTk7U')}
2018-05-02 13:25:20,251 - Writing File['/etc/apt/sources.list.d/ambari-hdp-1.list'] because contents don't match
2018-05-02 13:25:20,251 - checked_call[['apt-get', 'update', '-qq', '-o', u'Dir::Etc::sourcelist=sources.list.d/ambari-hdp-1.list', '-o', 'Dir::Etc::sourceparts=-', '-o', 'APT::Get::List-Cleanup=0']] {'sudo': True, 'quiet': False}
2018-05-02 13:25:20,805 - checked_call returned (0, '')
2018-05-02 13:25:20,805 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/ubuntu16', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-05-02 13:25:20,807 - File['/tmp/tmpbhTasl'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0 HDP main\ndeb http://public-repo-1.hortonworks.com/HDP-GPL/ubuntu16/2.x/updates/2.6.4.0 HDP-GPL main\ndeb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/ubuntu16 HDP-UTILS main'}
2018-05-02 13:25:20,807 - Writing File['/tmp/tmpbhTasl'] because contents don't match
2018-05-02 13:25:20,807 - File['/tmp/tmptkSbVn'] {'content': StaticFile('/etc/apt/sources.list.d/ambari-hdp-1.list')}
2018-05-02 13:25:20,807 - Writing File['/tmp/tmptkSbVn'] because contents don't match
2018-05-02 13:25:20,807 - File['/etc/apt/sources.list.d/ambari-hdp-1.list'] {'content': StaticFile('/tmp/tmpbhTasl')}
2018-05-02 13:25:20,808 - Writing File['/etc/apt/sources.list.d/ambari-hdp-1.list'] because contents don't match
2018-05-02 13:25:20,808 - checked_call[['apt-get', 'update', '-qq', '-o', u'Dir::Etc::sourcelist=sources.list.d/ambari-hdp-1.list', '-o', 'Dir::Etc::sourceparts=-', '-o', 'APT::Get::List-Cleanup=0']] {'sudo': True, 'quiet': False}
2018-05-02 13:25:21,520 - checked_call returned (0, '')
2018-05-02 13:25:21,520 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-05-02 13:25:21,551 - Skipping installation of existing package unzip
2018-05-02 13:25:21,551 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-05-02 13:25:21,580 - Skipping installation of existing package curl
2018-05-02 13:25:21,581 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-05-02 13:25:21,610 - Skipping installation of existing package hdp-select
2018-05-02 13:25:21,613 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-05-02 13:25:21,823 - Command repositories: HDP-2.6-repo-1, HDP-2.6-GPL-repo-1, HDP-UTILS-1.1.0.22-repo-1
2018-05-02 13:25:21,823 - Applicable repositories: HDP-2.6-repo-1, HDP-2.6-GPL-repo-1, HDP-UTILS-1.1.0.22-repo-1
2018-05-02 13:25:24,496 - Looking for matching packages in the following repositories: public-repo-1.hortonworks.com_HDP_ubuntu16_2.x_updates_2.6.4.0, public-repo-1.hortonworks.com_HDP-GPL_ubuntu16_2.x_updates_2.6.4.0, public-repo-1.hortonworks.com_HDP-UTILS-1.1.0.22_repos_ubuntu16
2018-05-02 13:25:24,497 - Package['flume-2-6-4-0-91'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-05-02 13:25:24,529 - Installing package flume-2-6-4-0-91 ('/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install flume-2-6-4-0-91')
2018-05-02 13:25:25,286 - Execution of '/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install flume-2-6-4-0-91' returned 100. Reading package lists...
Building dependency tree...
Reading state information...
The following NEW packages will be installed:
flume-2-6-4-0-91
0 upgraded, 1 newly installed, 0 to remove and 2 not upgraded.
Need to get 74.0 MB of archives.
After this operation, 86.6 MB of additional disk space will be used.
Get:1 http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0 HDP/main amd64 flume-2-6-4-0-91 all 1.5.2.2.6.4.0-91 [74.0 MB]
Err:1 http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0 HDP/main i386 flume-2-6-4-0-91 all 1.5.2.2.6.4.0-91
Hash Sum mismatch
Get:1 http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0 HDP/main i386 flume-2-6-4-0-91 all 1.5.2.2.6.4.0-91 [74.0 MB]
Err:1 http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0 HDP/main i386 flume-2-6-4-0-91 all 1.5.2.2.6.4.0-91
Hash Sum mismatch
Fetched 400 B in 0s (1298 B/s)
E: Failed to fetch http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.4.0/pool/main/f/flume/flume-2-6-4-0-91_1.5.2.2.6.4.0-91_all.deb Hash Sum mismatch
E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?
2018-05-02 13:25:25,286 - Failed to install package flume-2-6-4-0-91. Executing '/usr/bin/apt-get update -qq'
2018-05-02 13:25:36,417 - Retrying to install package flume-2-6-4-0-91 after 30 seconds
2018-05-02 13:26:07,445 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
Command failed after 1 tries
Can anyone help me resolve this? Thanks in advance.
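A hedged sketch of a common remedy: an apt "Hash Sum mismatch" usually means the locally cached package index no longer matches what the repository serves, and clearing apt's cache before re-fetching the lists often resolves it.

```shell
# Clear apt's cached packages and index files, then re-fetch the repo lists.
sudo apt-get clean
sudo rm -rf /var/lib/apt/lists/*
sudo apt-get update
# Then retry the Flume install from Ambari.
```

If the mismatch persists, an intermediate proxy or mirror corrupting the .deb download would be worth ruling out.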
Labels:
- Apache Ambari
- Apache Hadoop
12-05-2017
07:31 AM
I want to load the HDFS current-date directory into Hive. Here are my HDFS directory paths:
/user/hadoop/2017-12-04/
/user/hadoop/2017-12-05/
/user/hadoop/2017-12-06/
Here is my Hive table definition:
create external table test_table(id int, name string) row format delimited fields terminated by ',' location 'user/hadoop/$current_date';
Help me load the data into Hive from the folder whose name is the current date.
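A hedged sketch of one way to do this, assuming the directories are named YYYY-MM-DD as shown: build today's path in the shell, then repoint the external table at it each day (the `hive -e` line is illustrative and requires a Hive client on the host).

```shell
# Build today's HDFS directory path from the current date.
today=$(date +%Y-%m-%d)
dir="/user/hadoop/${today}"
echo "$dir"
# Repoint the external table at today's directory (illustrative command):
# hive -e "ALTER TABLE test_table SET LOCATION 'hdfs://${dir}'"
```

An alternative would be a table partitioned by date, with a daily `ALTER TABLE ... ADD PARTITION` pointing at each dated directory.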
Labels:
- Apache Hadoop
- Apache Hive
11-24-2017
10:04 AM
Hi,
I have created a table in a Hive database through an Oozie workflow.
But when I try to drop the table through the same Oozie workflow, the table becomes inaccessible,
and I am also not able to drop it. It throws the exception below:
hive> drop table oozie_test;
FAILED:
Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:java.lang.IllegalArgumentException: Wrong FS:
hdfs://xxxxxxxx:8020/apps/hive/warehouse/oozie_test.db/oozie_test, expected: hdfs://yyyyyyyyyy)
hive>
I know this is a NameNode HA problem, so I gathered all the HA-related info from hdfs-site.xml and added it to the workflow file.
job.properties:
# properties
nameNode = hdfs://xxxxxxxxx:8020
jobTracker = xyz:8050
queueName=default
oozie.use.system.libpath=true
oozie.libpath=${nameNode}/user/oozie/share/lib
oozie.wf.application.path=${nameNode}/user/abc/Oozie_POC/oozie_ha_workflow.xml
workflow.xml:
<workflow-app name="sample-wf" xmlns="uri:oozie:workflow:0.1">
  <start to="myfirsthivejob-raguvaran"/>
  <action name="myfirsthivejob">
    <hive xmlns="uri:oozie:hive-action:0.4">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>mapred.job.queue.name</name>
          <value>${queueName}</value>
        </property>
        <property>
          <name>mapred.compress.map.output</name>
          <value>true</value>
        </property>
        <property>
          <name>oozie.hive.defaults</name>
          <value>/hdp/apps/2.5.3.0-37/hive/hive-site.xml</value>
        </property>
        <property>
          <name>oozie.action.sharelib.for.hive</name>
          <value>hive</value>
        </property>
        <property>
          <name>dfs.nameservices</name>
          <value>yyyyyyyy</value>
        </property>
        <property>
          <name>dfs.client.failover.proxy.provider.remotehdfs</name>
          <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
        </property>
        <property>
          <name>dfs.ha.automatic-failover.enabled.remotehdfs</name>
          <value>true</value>
        </property>
        <property>
          <name>dfs.ha.namenodes.yyyyyyyy</name>
          <value>nn1,nn2</value>
        </property>
        <property>
          <name>dfs.namenode.rpc-address.yyyyyyyy.nn1</name>
          <value>xxxxxxxx:8020</value>
        </property>
        <property>
          <name>dfs.namenode.http-address.yyyyyyyy.nn1</name>
          <value>xxxxxxxx:50070</value>
        </property>
        <property>
          <name>dfs.namenode.https-address.yyyyyyyy.nn1</name>
          <value>xxxxxxxxx:50470</value>
        </property>
        <property>
          <name>dfs.namenode.rpc-address.localmaster.nn2</name>
          <value>namenode2:8020</value>
        </property>
        <property>
          <name>dfs.namenode.http-address.localmaster.nn2</name>
          <value>namenode2:50070</value>
        </property>
        <property>
          <name>dfs.namenode.https-address.localmaster.nn2</name>
          <value>namenode2:50470</value>
        </property>
        <property>
          <name>dfs.ha.automatic-failover.enabled</name>
          <value>true</value>
        </property>
      </configuration>
      <script>example.hql</script>
      <param>InputDir=/user/abc/Oozie_POC</param>
      <!-- <param>OutputDir=${jobOutput}</param> -->
    </hive>
    <ok to="end"/>
    <error to="kill_job"/>
  </action>
  <kill name="kill_job">
    <message>Job failed</message>
  </kill>
  <end name="end"/>
</workflow-app>
Does anyone have any idea on this?
Any help is appreciated.
Thanks
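A hedged guess at the usual fix for a "Wrong FS ... expected: hdfs://yyyyyyyyyy" error under NameNode HA: the job should address HDFS by its logical nameservice ID rather than one NameNode's host:port, so that paths resolve the same way the metastore recorded them. In job.properties that would look like:

```
# Sketch only; "yyyyyyyy" is the nameservice ID taken from the error message.
nameNode = hdfs://yyyyyyyy
```

The HA failover properties would then live in the cluster's hdfs-site.xml (or the action configuration) so the client can resolve the nameservice to nn1/nn2.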
Labels:
- Apache Hadoop
- Apache Hive
- Apache Oozie
11-03-2017
10:31 AM
I have a table called 'test_analysis' with values like the following:
ID NAME DOB LOCATION
1 bob 08/10/1985 NEW JERSEY
1 bob 08/10/1985 NEW YORK
1 bob 08/10/1985 NORTH CAROLINA
2 John 26/11/1990 OKLAHOMA
I want output like:
ID NAME DOB LOCATION
1 bob 08/10/1985 NEW JERSEY,NEW YORK,NORTH CAROLINA
2 John 26/11/1990 OKLAHOMA
Please help me form a Hive query to get the expected output.
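A hedged sketch of one way to get this shape of output in Hive: `collect_set` gathers the distinct locations per group and `concat_ws` joins them with commas (column and table names taken from the question).

```sql
-- Group repeated rows and fold their locations into one comma-separated string.
SELECT id, name, dob,
       concat_ws(',', collect_set(location)) AS location
FROM test_analysis
GROUP BY id, name, dob;
```

Note that `collect_set` drops duplicates and does not guarantee ordering; `collect_list` would keep duplicates instead.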
Labels:
- Apache Hive
05-02-2017
01:32 PM
In the job.properties file I have hard-coded the Oracle DB username and password for a Sqoop action, and the workflow.xml file picks them up at run time. The problem is that in the Oozie editor/dashboard, if I look at the running job's Sqoop action, the DB username and password are visible in the Configuration tab. I want the DB username and password in the Oozie Configuration tab to be hidden or masked with asterisks. Is it possible to do that? If yes, can anyone please suggest a solution?
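A hedged sketch of a common workaround (file paths here are illustrative): rather than masking the value in the Oozie UI, keep the password out of job.properties entirely by using Sqoop's `--password-file` option, pointing at an HDFS file readable only by the submitting user.

```shell
# Write the password to a private HDFS file (no trailing newline).
printf '%s' 'secret' | hdfs dfs -put - /user/oozie/.oracle_pass
hdfs dfs -chmod 400 /user/oozie/.oracle_pass
# In the Sqoop action arguments, replace "--password <value>" with:
#   --password-file /user/oozie/.oracle_pass
```

With this, the Configuration tab only shows the file path, not the credential itself.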
Labels:
- Apache Oozie
04-19-2017
03:05 PM
Please find the steps I followed:
1. Logged in to 'https://console.aws.amazon.com/ec2/'.
2. Clicked on 'Sign in to the Console' and entered the details.
3. In EC2, launched the instance.
4. On the left-hand side, clicked on 'Community AMIs' (community-ami.png).
5. When I try to search for 'Hortonworks', I get no results found (hortonworks-not-found.png).
Help me find the practice exam EC2 AMI. Thanks in advance.
01-27-2017
05:34 AM
"hadoop fs -ls /user/hue" works and I am able to see the list of directories, but how do I enter this directory using 'cd'?
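A hedged note, since the question assumes HDFS behaves like the local disk: HDFS is a separate distributed filesystem, not a mounted directory, so `cd` cannot enter `/user/hue`. Instead, each `hadoop fs` command takes the full HDFS path (the file name below is illustrative):

```shell
# List a directory, list a subdirectory, and print a file, all by full HDFS path.
hadoop fs -ls /user/hue
hadoop fs -ls /user/hue/certification_tutorials
hadoop fs -cat /user/hue/certification_tutorials/somefile.txt
```

These commands require an HDFS client and a running cluster, e.g. inside the sandbox VM.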
01-25-2017
08:19 AM
So far I have created a directory called certification_tutorials through the Hue file browser in my local sandbox; its path is "/user/hue/certification_tutorials". When I try to access the same directory through PuTTY, it always shows "No such file or directory", and I cannot find the directory when I use the ls command. My present working directory is /root. I am able to go into the /home directory, but after that I am not able to enter a /user/ directory. Can anyone please help me with how to access "/user/hue/certification_tutorials" using PuTTY? Thanks in advance.
Labels:
- Apache Hadoop
- Cloudera Hue
05-10-2016
01:49 PM
I have tried the following, but I am not able to create the table:
create external table tweets(name string, time int, tweet string)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
TBLPROPERTIES ('avro.schema.url'='http://trvlhaddv2gw1.tsh.thomson.com:8888/filebrowser/view/user/c400351/tweet.avsc');
I did not get any error messages.
05-10-2016
07:51 AM
I was trying to create a Hive table to store an Avro file, and I have put my Avro schema (.avsc file) and my Avro file in a single location. Could anyone help me create the table in Hive?
Labels:
- Apache Hive