Member since: 12-09-2015
Posts: 115
Kudos Received: 43
Solutions: 12
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 6340 | 07-10-2017 09:38 PM |
|  | 3917 | 04-10-2017 03:24 PM |
|  | 670 | 03-04-2017 04:08 PM |
|  | 2325 | 02-17-2017 10:42 PM |
|  | 3430 | 02-17-2017 10:41 PM |
03-14-2016
03:26 PM
1 Kudo
Execution engine from MR to TEZ
03-14-2016
03:25 PM
1 Kudo
We just upgraded HDP from 2.2 to 2.3.4 and switched the default execution engine from MR to Tez. The table we are querying has 2.4 billion rows:
select * from lowes_clickstream.clickstream_mobile limit 100;
Error: java.io.IOException: java.lang.IndexOutOfBoundsException: toIndex = 993 (state=,code=0)
0: jdbc:hive2://hadoop-sa:8443/default>
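A quick way to confirm whether the failure is Tez-specific is to switch the session back to MapReduce and re-run the same query. A minimal sketch, assuming the connection string from the prompt above; hive.execution.engine is a standard Hive session property:

```bash
# Hedged sketch: re-run the query with the engine switched back to MapReduce
# for this one session, to confirm the IndexOutOfBoundsException is Tez-specific.
beeline -u "jdbc:hive2://hadoop-sa:8443/default" \
  -e "set hive.execution.engine=mr; select * from lowes_clickstream.clickstream_mobile limit 100;"
```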
Labels:
- Apache Tez
02-05-2016
12:35 AM
@Neeraj Sabharwal Yeah, we did follow up with support and they said it is a known issue; I posted the details below. All we need to do is change the DB engine for all the tables that are MyISAM to InnoDB. Thanks for your response though.
02-05-2016
12:33 AM
1 Kudo
This is a known issue for MySQL databases, and we need to convert all of the Ambari tables to the InnoDB engine. You can use the SQL below to generate the commands that alter the tables to InnoDB:
SELECT CONCAT('ALTER TABLE ', TABLE_NAME, ' ENGINE=InnoDB;') FROM information_schema.TABLES WHERE TABLE_SCHEMA = 'ambaridb' AND ENGINE = 'MyISAM' AND TABLE_TYPE = 'BASE TABLE';
Once you get the output, remove the ' | ' table borders and run the ALTER commands. After the ALTERs complete, commit and exit mysql, and then run ambari-server upgrade. A scripted version of the same steps is sketched below.
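For reference, a minimal sketch that runs the generator in batch mode (which avoids the ' | ' borders entirely) and then applies the output. It assumes the Ambari schema is named ambaridb and that the mysql client can log in as a user with ALTER privileges; adjust names and credentials to your setup.

```bash
# Hedged sketch: generate the ALTER statements in batch mode (-N -B drops the
# column header and table borders), then apply them and retry the upgrade.
mysql -u ambari -p -N -B -e "
  SELECT CONCAT('ALTER TABLE ', TABLE_NAME, ' ENGINE=InnoDB;')
  FROM information_schema.TABLES
  WHERE TABLE_SCHEMA = 'ambaridb'
    AND ENGINE = 'MyISAM'
    AND TABLE_TYPE = 'BASE TABLE';" > convert_to_innodb.sql

mysql -u ambari -p ambaridb < convert_to_innodb.sql   # run the generated ALTERs
ambari-server upgrade                                 # then retry the Ambari upgrade
```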
02-04-2016
12:10 AM
@Artem Ervits No, it is not.
02-03-2016
06:50 PM
@Artem Ervits Error executing query: ALTER TABLE cluster_version ADD CONSTRAINT FK_cluster_version_cluster_id FOREIGN KEY (cluster_id) REFERENCES clusters (cluster_id). The clusters table has one row and the cluster_version table is empty; that was the reason it was not able to add the FK.
02-03-2016
06:19 PM
@Artem Ervits The HW doc doesn't say anything about that... It only states: if your current Ambari version is 1.6.1 or below, you must upgrade the Ambari Server version to 1.7 before upgrading to version 2.2.
02-03-2016
06:09 PM
1 Kudo
ambari upgrade: upgrading ambari-server from 1.7 to 2.2 with MySQL 5.6. While running ambari-server upgrade it failed with org.apache.ambari.server.AmbariException: Cannot add foreign key constraint. Any workaround? I couldn't even create the FK manually. Is it a known issue?
Labels:
- Apache Ambari
01-27-2016
07:37 PM
@Neeraj Sabharwal This is a beautiful feature, but we are hitting a bug with the Hive settings in Ambari. Support is working on a case. In the meantime, is there any way I can make my group the default group in Ambari?
01-27-2016
05:59 PM
@Neeraj Sabharwal I understand what you are saying, but our requirement is as follows:
- Mast2: HS2 on port 10001, authentication None; users connect through Knox on 8443.
- Mast1: HS2 on port 10000; I can stop that instance and deploy HS2 through Ambari.
If I do that, can each HiveServer2 in Ambari have individual settings? On Mast1 I would like to enable HS2 with LDAP authentication. If we can do that, it will make my work easy: I can simply enable Ranger authorization in the Hive settings in Ambari, which will control both.
01-27-2016
04:18 PM
@Neeraj Sabharwal How do I add another hive-plugin from Ambari? If I install a new HiveServer2 through Ambari, can I make different settings for each HiveServer2, like one with no authentication and the other with AD authentication?
01-27-2016
03:19 PM
I have 2.3.4 installed, and the ranger-hive-plugin is installed on one of the master nodes, where HiveServer2 is running on port 10001. I would like to enable the ranger-hive-plugin on another master node (mast1), where HiveServer2 is running on port 10000. Is there any way to enable the plugin on that node through Ambari? I did try editing the Hive install.properties on mast1, enabled it, and bounced Hive, but that did not help. Just FYI: the HiveServer2 running on mast1 is not registered with Ambari.
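Since the HiveServer2 on mast1 is not Ambari-managed, the plugin usually has to be enabled by hand there. A minimal sketch, assuming the Ranger Hive plugin bits and its enable-hive-plugin.sh script live under /usr/hdp/current/ranger-hive-plugin; the path and the install.properties keys depend on the HDP/Ranger version:

```bash
# Hedged sketch: manually enable the Ranger Hive plugin on mast1.
cd /usr/hdp/current/ranger-hive-plugin

# Point the plugin at the Ranger admin and the right repository first
# (POLICY_MGR_URL, REPOSITORY_NAME, audit settings) in install.properties.
vi install.properties

./enable-hive-plugin.sh        # wires the Ranger authorizer into hive-site.xml

# Restart the manually managed HiveServer2 on port 10000 afterwards so it
# picks up the new authorizer settings.
```

Because this HS2 instance sits outside Ambari, Ambari config pushes will not overwrite these files, but they also will not keep them in sync.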
Labels:
- Apache Hive
- Apache Ranger
01-25-2016
03:36 PM
5 Kudos
I'm new to Spark and would appreciate some answers.
- How can I enable Spark authentication against LDAP in a non-kerberized environment?
- It looks like Spark connects to the metastore directly; how can we force it to connect to HiveServer2?
- Is there any way to suppress all the INFO output it prints when we start/exit spark-sql?
- Every session I start takes the next free UI port above 4040, and the HTTP and Spark driver services bind to random ports; is there any way to force Spark to use a fixed subset of ports instead? (A sketch covering the last two points follows.)
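On the last two points, a minimal sketch, assuming an HDP-style spark-client layout; spark.ui.port, spark.driver.port and spark.blockManager.port are standard Spark settings, and the port numbers here are only examples:

```bash
# Hedged sketch: quiet the INFO chatter and pin the UI/driver/block-manager
# ports for a spark-sql session.
cd /usr/hdp/current/spark-client/conf
cp log4j.properties.template log4j.properties
sed -i 's/^log4j.rootCategory=INFO/log4j.rootCategory=WARN/' log4j.properties

spark-sql \
  --conf spark.ui.port=4050 \
  --conf spark.driver.port=45000 \
  --conf spark.blockManager.port=45010
```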
Labels:
- Apache Spark
01-21-2016
11:27 AM
@Neeraj Sabharwal @Predrag Monodic @Jonas Straub @Artem Ervits Thanks for looking at the post. I know this is a strange issue. I did clear the cache and tried all the obvious things; nothing helped. I opened a ticket with support and we learned it is a BUG; support forwarded it to the Dev team. There is a weird workaround for this issue: to see the settings, I have to click on YARN first, go to the YARN config, and then click on Hive, which then shows the Hive settings. It only works if I follow that sequence.
01-21-2016
01:25 AM
1 Kudo
Is there any command which can give us the list of components that are registered with a znode in real time? I want to see which HiveServer2 instances are registered with ZooKeeper, etc.
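For HiveServer2 dynamic service discovery, the instances register as child znodes under the namespace set by hive.server2.zookeeper.namespace (hiveserver2 by default), so listing that znode shows what is registered right now. A minimal sketch; the ZooKeeper host and client path are assumptions:

```bash
# Hedged sketch: list the HiveServer2 instances currently registered in
# ZooKeeper (default namespace "hiveserver2"; adjust host and znode as needed).
/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server zk-host:2181 ls /hiveserver2
```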
Labels:
- Apache Hive
01-20-2016
10:45 PM
1 Kudo
@Kuldeep Kulkarni It is simple; we had the same problem and I simply edited the hive.distro file at /usr/hdp/2.2.0.0-2041/hive/bin/hive.distro. Find the line that tests only $SERVICE, comment it out, and replace it with a version that also checks that $USER is your shared/service id, as shown in the sketch below. Let me know how it goes.
01-20-2016
10:34 PM
1 Kudo
Upgraded from HDP 2.2 to HDP 2.3.4, and I can't see anything under Hive where the memory and other important parameters are normally displayed in Ambari (see attached screen-shot-2016-01-20-at-53419-pm.png).
Labels:
- Apache Hive
01-14-2016
01:17 AM
3 Kudos
It happens when Ambari is upgraded to 2.2; this is a known issue: https://issues.apache.org/jira/browse/AMBARI-13325. Disable the bucket cache by erasing the values of the following properties in Ambari and then restart the HBase service (a scripted sketch follows the list):
- hbase.bucketcache.ioengine
- hbase.bucketcache.percentage.in.combinedcache
- hbase.bucketcache.size
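A minimal sketch of the same change done from the command line, assuming the configs.sh helper that ships with ambari-server; its path and argument syntax vary between Ambari versions, and the admin credentials, Ambari host and cluster name here are placeholders:

```bash
# Hedged sketch: remove the three bucket-cache properties from hbase-site,
# then restart HBase from Ambari.
CFG=/var/lib/ambari-server/resources/scripts/configs.sh
$CFG -u admin -p admin delete ambari-host c1 hbase-site hbase.bucketcache.ioengine
$CFG -u admin -p admin delete ambari-host c1 hbase-site hbase.bucketcache.percentage.in.combinedcache
$CFG -u admin -p admin delete ambari-host c1 hbase-site hbase.bucketcache.size
```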
01-14-2016
12:20 AM
@Enis The value is set to 4G; isn't that big enough? We hardly use HBase.
01-14-2016
12:15 AM
1 Kudo
Not sure which parameter I have to change:
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hbase.regionserver.HRegionServer.constructRegionServer(HRegionServer.java:2484)
... 5 more
Caused by: java.lang.OutOfMemoryError: Direct buffer memory
at java.nio.Bits.reserveMemory(Bits.java:658)
at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:123)
at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:306)
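The allocation that fails here is off-heap (direct) memory, so if the bucket cache stays enabled, the RegionServer JVM needs a -XX:MaxDirectMemorySize at least as large as hbase.bucketcache.size. A minimal sketch of the hbase-env setting, with an illustrative size; in practice this is set through Ambari's hbase-env template:

```bash
# Hedged sketch: give the RegionServer enough direct memory to back the
# offheap bucket cache (size is illustrative).
export HBASE_REGIONSERVER_OPTS="$HBASE_REGIONSERVER_OPTS -XX:MaxDirectMemorySize=5g"
```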
Labels:
- Apache HBase
12-30-2015
04:16 PM
We are on Ambari 2.1.2 and we are able to send SNMP traps to external monitoring servers. However, our monitoring folks say they don't see the complete message, so they cannot format it on their side. Is this related to the https://issues.apache.org/jira/browse/AMBARI-13205 bug?
Labels:
- Apache Ambari
12-30-2015
01:30 PM
@bsaini I think I figured out the issue. It is a bug that appears when we upgrade the Ambari database from MySQL 5.1 to a later version; as of now the only options are to upgrade Ambari to 2.2 or roll back the changes. Thanks for taking the time to answer my question, appreciate it! https://na9.salesforce.com/articles/en_US/Issue/Ambari-log-reports-You-have-an-error-in-your-SQL-syntax-check-the-manual-that-corresponds-to-your-MySQL-server-version-for-the-right?popup=true
12-30-2015
12:26 AM
I had a tough time with Ambari 1.7 pointing to MySQL 5.6 on a remote host. The Ambari Server was using all of the available ports on the source host to make connections to port 3306 on the remote host; at one point I saw 28,000 ports being used by Ambari to connect. This is killing us because it even grabbed 50070 at one time, and I had a tough time figuring that out. Finally I stopped ambari-server and all is well. Did anyone face this issue? If so, any idea on a fix, apart from downgrading MySQL or upgrading Ambari to 2.1.2?
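A simple way to watch this happen is to count the connections the Ambari host holds open to MySQL. A minimal sketch, run on the Ambari server host:

```bash
# Hedged sketch: count established connections from this host to MySQL (3306)
# to see the ephemeral-port usage grow.
netstat -ant | awk '$5 ~ /:3306$/ && $6 == "ESTABLISHED"' | wc -l
```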
Labels:
- Apache Ambari
12-28-2015
04:06 PM
Thanks Scott, but it is sad that it doesn't support AD...
12-28-2015
04:06 PM
Thanks for the swift response.
12-28-2015
02:31 PM
I'm trying to Sqoop data (from a Linux server) out of SQL Server. I was able to do it with a local account on the SQL Server, but I want to try it with Windows authentication.
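For reference, a minimal sketch of the SQL-Server-authentication variant that did work, using the standard Microsoft JDBC URL form; the host, database, table, credentials and target directory are placeholders:

```bash
# Hedged sketch: Sqoop import from SQL Server using a local SQL Server account.
sqoop import \
  --connect "jdbc:sqlserver://sqlserver-host:1433;databaseName=sales" \
  --username sqoop_user -P \
  --table customers \
  --target-dir /user/sqoop/customers \
  -m 1
```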
Labels:
- Apache Sqoop
12-16-2015
05:00 PM
2 Kudos
Users connect to Hive through Knox using AD credentials, and we integrated HDFS with AD groups. Now HDFS is not able to recognize local user groups.
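A quick way to see what the NameNode actually resolves after the AD integration is to compare hdfs groups with the local OS groups. A minimal sketch; someuser is a placeholder:

```bash
# Hedged sketch: what groups does HDFS (the NameNode's group mapping) see?
hdfs groups someuser
# ...versus the local groups on the NameNode host.
id someuser
```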
Labels:
- Apache Hadoop