Member since: 01-06-2016
Posts: 36
Kudos Received: 22
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1628 | 04-29-2016 02:51 PM
 | 1935 | 03-10-2016 07:41 PM
 | 5168 | 02-15-2016 08:22 PM
03-06-2017
10:29 PM
1 Kudo
Hi - Looking for better methods of reducing the region count of existing tables. We have a few tables that were originally pre-split with far too many regions, and over time issues like poor performance and compaction storms have become evident. With smaller tables I've been using Export/Import or CopyTable to move the data into new tables with fewer regions, but for larger tables (TBs) those jobs are very challenging to run to completion. Are there any better strategies for accomplishing this? In some cases the region counts are so high that manually merging them is not feasible, so I find myself back at Export/Import.
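For context, here is the rough shape of the two approaches I mean, as a minimal sketch (table, column-family, and encoded region names are made up for illustration):

# Online merge of two adjacent regions by encoded region name, from the HBase shell
echo "merge_region 'e4b9a9c3c1f2a4d5b6c7d8e9f0a1b2c3', 'f5c0b0d4d2e3b5c6d7e8f9a0b1c2d3e4'" | hbase shell

# Or: create a smaller pre-split target table and copy into it with CopyTable
echo "create 'my_table_small', 'cf', SPLITS => ['2','4','6','8','a','c','e']" | hbase shell
hbase org.apache.hadoop.hbase.mapreduce.CopyTable --new.name=my_table_small my_table

The merge route is what becomes impractical at very high region counts, which is why I keep falling back to the copy route.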
Labels:
- Apache HBase
10-27-2016
08:24 PM
I added a few nodes to install Storm, but Ambari (2.2.2.0) is apparently trying to use the wrong version.
On the destination node, the path /var/lib/ambari-agent/cache/common-services/STORM contains 2 versions, 0.9.1 and 0.9.1.2.1.
The 0.9.1 directory is empty and is the one Ambari is trying to execute scripts in; the other directory contains the items needed for installation. So the install fails with errors like:
Caught an exception while executing custom service command: <class 'ambari_agent.AgentException.AgentException'>: 'Script /var/lib/ambari-agent/cache/common-services/STORM/0.9.1/package/scripts/drpc_server.py does not exist'; 'Script /var/lib/ambari-agent/cache/common-services/STORM/0.9.1/package/scripts/drpc_server.py does not exist'
I suppose the question is: how do I get Ambari back into shape here?
Any help/thoughts are greatly appreciated.
edit: I found that my repo_version table in the Ambari database has 2 different entries for the stack_id I'm using:
53 | 2.4.0.0-169 | HDP-2.4.0.0 | [{"repositories":[{"Repositories/repo_id":"HDP-2.4","Repositories/base_url":"http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.4.0.0","Repositories/repo_name":"HDP"},{"Repositories/repo_id":"HDP-UTILS-1.1.0.20","Repositories/base_url":"http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6","Repositories/repo_name":"HDP-UTILS"}],"OperatingSystems/os_type":"redhat6"}] | 51
103 | 2.4.2.0-258 | HDP-2.4.2.0 | [{"repositories":[{"Repositories/repo_id":"HDP-2.4","Repositories/base_url":"http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.4.2.0","Repositories/repo_name":"HDP"},{"Repositories/repo_id":"HDP-UTILS-1.1.0.20","Repositories/base_url":"http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6","Repositories/repo_name":"HDP-UTILS"}],"OperatingSystems/os_type":"redhat6"}] | 51
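For anyone who wants to check the same thing, the rows above came straight from the external Ambari database; a query along these lines should reproduce them (the database/user names and exact column names are assumptions based on the output and my schema, so adjust as needed):

# Assumes an external PostgreSQL Ambari database named "ambari"
psql -U ambari -d ambari -c \
  "SELECT repo_version_id, version, display_name, repositories, stack_id FROM repo_version WHERE stack_id = 51;"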
Labels:
- Apache Ambari
09-26-2016
08:58 PM
Thanks for your response. I'll follow that course. Cheers
09-23-2016
03:09 PM
2 Kudos
Hello - I have a Kafka cluster with a dedicated (not managed by Ambari) ZooKeeper ensemble for which I need to change the hostnames. No Ambari, no HDP, nothing except Kafka and ZK. Beyond updating Kafka's configuration files, is there more to consider (e.g. consumers retaining their offsets)? The ZK ensemble will remain untouched aside from the hostname change. Thanks.
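To be concrete, the only Kafka-side change I have in mind is the broker ZooKeeper connection string, roughly like this (hostnames and the config path are placeholders; the ensemble itself is unchanged):

# On each broker: point zookeeper.connect at the new hostnames, then do a rolling restart
# Old-style consumers that commit offsets to ZK have their own zookeeper.connect to update as well
sed -i 's|^zookeeper.connect=.*|zookeeper.connect=new-zk1.example.com:2181,new-zk2.example.com:2181,new-zk3.example.com:2181|' /path/to/kafka/config/server.properties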
Labels:
- Apache Kafka
09-12-2016
05:09 PM
Ah, thanks. Feeling very silly I left that param in. 🙂
09-11-2016
08:34 PM
1 Kudo
Hello - I'm having trouble using the Sqoop CLI with HCatalog. The error is always:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.mapreduce.HCatOutputFormat not found
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$2.call(MRAppMaster.java:519)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$2.call(MRAppMaster.java:499)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1598)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:499)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:285)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$5.run(MRAppMaster.java:1556)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1553)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1486)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.mapreduce.HCatOutputFormat not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
at org.apache.hadoop.mapreduce.task.JobContextImpl.getOutputFormatClass(JobContextImpl.java:222)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$2.call(MRAppMaster.java:515)
... 11 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.mapreduce.HCatOutputFormat not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
... 13 more
I added the sqoop-env settings suggested in another post via the Ambari template, with no luck. The job works fine as an Oozie Sqoop action, but fails with the above when executed via the CLI:
$ sqoop import --skip-dist-cache --username xx --password-file /dir/xx.dat --connect jdbc:postgresql://server.x.lan/xx --split-by download_id --hcatalog-table project_xx_0 --hcatalog-database default --query "select [lots of stuff with several joins] AND \$CONDITIONS"
So this is a Sqoop free-form query import from a PostgreSQL database/table into an existing Hive/HCatalog table. Any help or tips are greatly appreciated. Thanks.
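In case it helps with diagnosing, a quick sanity check I can run is confirming which jar actually carries the class on the cluster nodes (the HDP-style path below is an assumption; adjust to the actual layout):

# Locate the hive-hcatalog-core jar and confirm it contains HCatOutputFormat
find /usr/hdp/current/hive-webhcat/share/hcatalog -name 'hive-hcatalog-core*.jar'
unzip -l /usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core-*.jar | grep HCatOutputFormat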
Labels:
- Apache HCatalog
- Apache Sqoop
07-08-2016
04:27 PM
1 Kudo
@Artem Ervits This is a long-standing bug in Ambari setup. For an external Postgres database the script is: Ambari-DDL-Postgres-CREATE.sql
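For reference, the script ships with the ambari-server package; loading it into an external database looks roughly like this (the resources path is the usual default and the database/user names are placeholders):

# Run against the external Ambari database as its owner
psql -U ambari -d ambari -f /var/lib/ambari-server/resources/Ambari-DDL-Postgres-CREATE.sql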
05-11-2016
05:50 PM
I am upgrading to Ambari 2.2.2 from 2.2, and when executing $ambari-server upgrade I am receiving the following warnings:
"Updating properties in ambari.properties ...
WARNING: Can not find ambari-env.sh.rpmsave file from previous version, skipping restore of environment settings
Fixing database objects owner"
I am using an external PostgreSQL database. I do have backups, but what is this going to wipe out?
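In case it matters for an answer, the only state I plan to compare before and after the upgrade is the object ownership in the external database, e.g. (database and user names are placeholders):

# List tables and their owners in the Ambari database before running the upgrade
psql -U ambari -d ambari -c "\dt"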
Labels:
- Apache Ambari