Member since: 03-23-2018
20 Posts
0 Kudos Received
0 Solutions
04-25-2018
11:00 AM
org.apache.pig.backend.executionengine.ExecException: ERROR 4010: Cannot find hadoop configurations in classpath (neither hadoop-site.xml nor core-site.xml was found in the classpath). If you plan to use local mode, please put -x local option in command line
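The error message itself names the two ways out: put a Hadoop configuration (core-site.xml) on the classpath, or run Pig in local mode. A minimal sketch of the local-mode invocation, assuming Pig is on the PATH (the script name is a placeholder):

```shell
# Run Pig against the local filesystem instead of a Hadoop cluster,
# as the ERROR 4010 message suggests ("put -x local option in command line").
pig -x local myscript.pig
```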
Labels:
- Apache Pig
- Apache Zeppelin
04-24-2018
12:21 PM
Please help me with a Pig script to count the lines in a file.
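A minimal sketch of a line count in Pig Latin, assuming the input path 'input.txt' is a placeholder; TextLoader reads each line as one chararray field:

```pig
-- Load every line of the file as a single field
lines   = LOAD 'input.txt' USING TextLoader() AS (line:chararray);
-- Put all records into one group so we can count them together
grouped = GROUP lines ALL;
-- COUNT_STAR counts every record, including empty lines
counted = FOREACH grouped GENERATE COUNT_STAR(lines) AS line_count;
DUMP counted;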
Labels:
- Apache Pig
04-16-2018
12:55 PM
When I add the path of ngdbc.jar to the Spark interpreter, all the other interpreters are deleted, and my notebooks also stop running. If anyone knows the reason, please share it with me.
Labels:
- Apache Zeppelin
04-16-2018
09:00 AM
I tried the documentation provided at the link above, but I ran into some errors with that code.
04-16-2018
07:16 AM
I want to view my SQL data in Apache Zeppelin as a map visualization, so please help me get this working.
Tags:
- maps
- visualization
04-11-2018
02:24 PM
My use case is to split a log file that contains both success and failure logs. Using NiFi, I need to separate them and store them in different files.
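In NiFi this is typically done with a routing processor keyed on line content; outside NiFi, the same split can be sketched in a few lines of Python. The "SUCCESS"/"ERROR" markers below are assumptions about the log format, not something the post specifies:

```python
def split_log(lines):
    """Split an iterable of log lines into (success_lines, failure_lines).

    Lines containing a failure marker go to the failure list; everything
    else is treated as success. The markers are hypothetical.
    """
    success, failure = [], []
    for line in lines:
        if "FAILURE" in line or "ERROR" in line:
            failure.append(line)
        else:
            success.append(line)
    return success, failure

ok, bad = split_log([
    "2018-04-11 INFO  job finished SUCCESS",
    "2018-04-11 ERROR job failed",
])
print(len(ok), len(bad))  # 1 1
```

The same keyword test would drive the routing decision in a NiFi flow, with each route written to its own output file.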
04-11-2018
11:15 AM
Can you please recommend a video that covers the Twitter configuration?
04-10-2018
02:08 PM
Can you please tell me how to configure Twitter in NiFi to get the data?
Tags:
- nifi-processor
Labels:
- Apache NiFi
04-10-2018
01:12 PM
Can I separate success logs and failure logs from a log file using NiFi?
Labels:
- Apache NiFi
04-09-2018
10:00 AM
Thank you.
04-09-2018
04:48 AM
I have done an Ambari-managed installation, and the version is HDF 3.1.
04-07-2018
07:00 AM
I am new to Hortonworks DataFlow. I just installed HDF, but I don't understand how to get started or how to work with NiFi. Can anyone please help me with this?
Tags:
- help
04-05-2018
12:45 PM
stderr:
2018-04-05 12:40:04,109 - The 'registry' component did not advertise a version. This may indicate a problem with the component packaging. However, the stack-select tool was able to report a single version installed (3.1.1.0-35). This is the version that will be reported.
2018-04-05 12:40:04,330 - Failed to find mysql-java-connector jar. Make sure you followed the steps to register mysql driver
2018-04-05 12:40:04,404 - The 'registry' component did not advertise a version. This may indicate a problem with the component packaging. However, the stack-select tool was able to report a single version installed (3.1.1.0-35). This is the version that will be reported.
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/registry_server.py", line 170, in <module>
RegistryServer().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/registry_server.py", line 59, in install
import params
File "/var/lib/ambari-agent/cache/common-services/REGISTRY/0.3.0/package/scripts/params.py", line 170, in <module>
raise Fail('Unable to establish jdbc connection to your ' + registry_storage_type + ' instance.')
resource_management.core.exceptions.Fail: Unable to establish jdbc connection to your mysql instance.
stdout:
2018-04-05 12:40:03,654 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -> 3.1
User Group mapping (user_group) is missing in the hostLevelParams
2018-04-05 12:40:03,657 - Group['nifiregistry'] {}
2018-04-05 12:40:03,659 - Group['hadoop'] {}
2018-04-05 12:40:03,659 - Group['nifi'] {}
2018-04-05 12:40:03,659 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-05 12:40:03,660 - call['/var/lib/ambari-agent/tmp/changeUid.sh streamline'] {}
2018-04-05 12:40:03,668 - call returned (0, '1001')
2018-04-05 12:40:03,668 - User['streamline'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1001}
2018-04-05 12:40:03,669 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-05 12:40:03,670 - call['/var/lib/ambari-agent/tmp/changeUid.sh logsearch'] {}
2018-04-05 12:40:03,677 - call returned (0, '1003')
2018-04-05 12:40:03,678 - User['logsearch'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1003}
2018-04-05 12:40:03,679 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-05 12:40:03,680 - call['/var/lib/ambari-agent/tmp/changeUid.sh registry'] {}
2018-04-05 12:40:03,687 - call returned (0, '1004')
2018-04-05 12:40:03,687 - User['registry'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1004}
2018-04-05 12:40:03,688 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-05 12:40:03,689 - call['/var/lib/ambari-agent/tmp/changeUid.sh storm'] {}
2018-04-05 12:40:03,696 - call returned (0, '1005')
2018-04-05 12:40:03,696 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1005}
2018-04-05 12:40:03,697 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-05 12:40:03,698 - call['/var/lib/ambari-agent/tmp/changeUid.sh infra-solr'] {}
2018-04-05 12:40:03,705 - call returned (0, '1006')
2018-04-05 12:40:03,706 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1006}
2018-04-05 12:40:03,706 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-05 12:40:03,707 - call['/var/lib/ambari-agent/tmp/changeUid.sh zookeeper'] {}
2018-04-05 12:40:03,714 - call returned (0, '1007')
2018-04-05 12:40:03,715 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1007}
2018-04-05 12:40:03,716 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-05 12:40:03,716 - call['/var/lib/ambari-agent/tmp/changeUid.sh ams'] {}
2018-04-05 12:40:03,724 - call returned (0, '1008')
2018-04-05 12:40:03,724 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1008}
2018-04-05 12:40:03,725 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2018-04-05 12:40:03,726 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-05 12:40:03,726 - call['/var/lib/ambari-agent/tmp/changeUid.sh kafka'] {}
2018-04-05 12:40:03,734 - call returned (0, '1010')
2018-04-05 12:40:03,734 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1010}
2018-04-05 12:40:03,735 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-05 12:40:03,736 - call['/var/lib/ambari-agent/tmp/changeUid.sh nifiregistry'] {}
2018-04-05 12:40:03,743 - call returned (0, '1011')
2018-04-05 12:40:03,743 - User['nifiregistry'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1011}
2018-04-05 12:40:03,744 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-05 12:40:03,745 - call['/var/lib/ambari-agent/tmp/changeUid.sh nifi'] {}
2018-04-05 12:40:03,752 - call returned (0, '1012')
2018-04-05 12:40:03,752 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1012}
2018-04-05 12:40:03,753 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-04-05 12:40:03,754 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-04-05 12:40:03,759 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-04-05 12:40:03,774 - Repository['HDF-3.1-repo-2'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.1.1.0', 'action': ['create'], 'components': [u'HDF', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdf-2', 'mirror_list': None}
2018-04-05 12:40:03,781 - File['/etc/yum.repos.d/ambari-hdf-2.repo'] {'content': '[HDF-3.1-repo-2]\nname=HDF-3.1-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.1.1.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-04-05 12:40:03,782 - Writing File['/etc/yum.repos.d/ambari-hdf-2.repo'] because contents don't match
2018-04-05 12:40:03,783 - Repository['HDP-UTILS-1.1.0.21-repo-2'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdf-2', 'mirror_list': None}
2018-04-05 12:40:03,786 - File['/etc/yum.repos.d/ambari-hdf-2.repo'] {'content': '[HDF-3.1-repo-2]\nname=HDF-3.1-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDF/centos7/3.x/updates/3.1.1.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-2]\nname=HDP-UTILS-1.1.0.21-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-04-05 12:40:03,786 - Writing File['/etc/yum.repos.d/ambari-hdf-2.repo'] because contents don't match
2018-04-05 12:40:03,786 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-04-05 12:40:03,932 - Skipping installation of existing package unzip
2018-04-05 12:40:03,932 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-04-05 12:40:03,981 - Skipping installation of existing package curl
2018-04-05 12:40:03,981 - Package['hdf-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-04-05 12:40:04,031 - Skipping installation of existing package hdf-select
2018-04-05 12:40:04,086 - call[('ambari-python-wrap', u'/usr/bin/hdf-select', 'versions')] {}
2018-04-05 12:40:04,109 - call returned (0, '3.1.1.0-35')
2018-04-05 12:40:04,109 - The 'registry' component did not advertise a version. This may indicate a problem with the component packaging. However, the stack-select tool was able to report a single version installed (3.1.1.0-35). This is the version that will be reported.
2018-04-05 12:40:04,322 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -> 3.1
2018-04-05 12:40:04,330 - Failed to find mysql-java-connector jar. Make sure you followed the steps to register mysql driver
2018-04-05 12:40:04,330 - Users should register the mysql java driver jar.
2018-04-05 12:40:04,331 - yum install mysql-connector-java*
2018-04-05 12:40:04,331 - sudo ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
2018-04-05 12:40:04,381 - call[('ambari-python-wrap', u'/usr/bin/hdf-select', 'versions')] {}
2018-04-05 12:40:04,403 - call returned (0, '3.1.1.0-35')
2018-04-05 12:40:04,404 - The 'registry' component did not advertise a version. This may indicate a problem with the component packaging. However, the stack-select tool was able to report a single version installed (3.1.1.0-35). This is the version that will be reported.
Command failed after 1 tries
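The stderr names the cause (missing mysql-java-connector jar) and the stdout prints the remedy itself. Collected from the log output as a sketch, to be run on the Ambari server host; the package name and jar path mirror the log and may differ per distribution:

```shell
# Install the MySQL JDBC driver (package name as printed in the log)
yum install mysql-connector-java*
# Register the driver with Ambari so the Registry install can reach MySQL
sudo ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
```

After registering the driver, retry the Registry Server install from Ambari.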
Tags:
- schema-registry
Labels:
- Schema Registry
04-04-2018
06:47 AM
03-26-2018
05:01 AM
INFO 2018-03-26 04:59:24,672 hostname.py:67 - agent:hostname_script configuration not defined thus read hostname 'ip-172-31-15-165.us-west-1.compute.internal' using socket.getfqdn().
ERROR 2018-03-26 04:59:24,672 main.py:244 - Ambari agent machine hostname (ip-172-31-15-165.us-west-1.compute.internal) does not match expected ambari server hostname (ec2-54-153-3-119.us-west-1.compute.amazonaws.com). Aborting registration. Please check hostname, hostname -f and /etc/hosts file to confirm your hostname is setup correctly
INFO 2018-03-26 04:59:24,673 ExitHelper.py:56 - Performing cleanup before exiting...
", None)
Connection to ec2-54-153-3-119.us-west-1.compute.amazonaws.com closed.
(attachment: screenshot-16.png)
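The error's own advice is to check `hostname`, `hostname -f`, and /etc/hosts. A diagnostic sketch, to be run on the agent machine:

```shell
# The agent registers under the name socket.getfqdn() resolves;
# compare it with what the Ambari server expects.
hostname          # short hostname
hostname -f       # fully qualified name, roughly what socket.getfqdn() returns
cat /etc/hosts    # the expected FQDN should map to the right address here
```

On EC2 the internal name (ip-172-31-...compute.internal) and the public name (ec2-54-...amazonaws.com) differ, so /etc/hosts or the agent's registered hostname must be aligned with whichever name the server uses.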
03-23-2018
01:26 PM
After doing the above steps, the problem was still not solved.
03-23-2018
11:11 AM
While starting the Hive Metastore and HiveServer2, I got the error attached below. Can anyone please help me solve it?
Labels:
- Apache Hive