Member since: 06-08-2014
Posts: 33
Kudos Received: 6
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
| 1174 | 11-23-2018 11:18 AM
12-07-2018
01:15 PM
I have enabled ACID transactions in Hive 3 on HDP 3 at the cluster level. When I try to disable it at the session level using the properties below and then create a new managed table with TBLPROPERTIES ('orc.compress'='ZLIB'),('transactional'='true'), it gives the error below:
set hive.optimize.index.filter=false;
set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager;
set hive.compactor.initiator.on=false;
set hive.compactor.worker.threads=0;
set hive.strict.managed.tables=false;
INFO : Starting task [Stage-0:DDL] in serial mode
ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Table ref_edw4x_qn1useh1.dummy failed strict managed table checks due to the following reason: Table is marked as a managed table but is not transactional.)
INFO : Completed executing command(queryId=hive_20181207104022_6a3478f0-9b8a-44db-a9a4-4d2ab5fe2b11); Time taken: 0.076 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Table ref_edw4x_qn1useh1.dummy failed strict managed table checks due to the following reason: Table is marked as a managed table but is not transactional.) (state=08S01,code=1)
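For reference: the error text suggests the strict managed check is enforced on the metastore side, which session-level SET statements in HiveServer2 do not reach. A minimal sketch of the usual alternative on HDP 3, keeping the non-ACID ORC data in an external table instead (table name and location are placeholders):

```sql
-- Sketch: under hive.strict.managed.tables, managed tables must be
-- transactional, so non-ACID ORC data is normally kept external.
CREATE EXTERNAL TABLE ref_edw4x_qn1useh1.dummy_ext (id INT)
STORED AS ORC
LOCATION '/warehouse/external/dummy_ext'   -- placeholder path
TBLPROPERTIES ('orc.compress'='ZLIB');
```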
Tags:
- Data Processing
- Hive
Labels:
- Apache Hive
11-23-2018
11:18 AM
@Prabhu M @Mahesh Balakrishnan: The issue has been resolved. It was a bug in Tez 0.9.0 (HDP-3.0.0.0), which has been fixed in Tez 0.9.1 (HDP-3.0.1.0).
11-12-2018
06:07 PM
18/11/12 10:37:25 [main]: INFO jdbc.HiveConnection: Connected to server1.test.com:10000
Connected to: Apache Hive (version 3.0.0.3.0.0.0-1334)
Driver: Hive JDBC (version 3.0.0.3.0.0.0-1334)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 3.0.0.3.0.0.0-1334 by Apache Hive
0: jdbc:hive2://server1.test.com:> select count(*) from lnd_edw4x_qn1useh1.redemption_fact_t;
INFO : Compiling command(queryId=hive_20181112103728_c02d0353-978c-457c-bde7-9ae45fcb1216): select count(*) from lnd_edw4x_qn1useh1.redemption_fact_t
INFO : Semantic Analysis Completed
INFO : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:_c0, type:bigint, comment:null)], properties:null)
INFO : Completed compiling command(queryId=hive_20181112103728_c02d0353-978c-457c-bde7-9ae45fcb1216); Time taken: 1.218 seconds
INFO : Executing command(queryId=hive_20181112103728_c02d0353-978c-457c-bde7-9ae45fcb1216): select count(*) from lnd_edw4x_qn1useh1.redemption_fact_t
INFO : Query ID = hive_20181112103728_c02d0353-978c-457c-bde7-9ae45fcb1216
INFO : Total jobs = 1
INFO : Launching Job 1 out of 1
INFO : Starting task [Stage-1:MAPRED] in serial mode
INFO : Subscribed to counters: [] for queryId: hive_20181112103728_c02d0353-978c-457c-bde7-9ae45fcb1216
INFO : Tez session hasn't been created yet. Opening session
INFO : Dag name: select count(*) from ...h1.redemption_fact_t (Stage-1)
ERROR : Status: Failed
ERROR : Vertex failed, vertexName=Map 1, vertexId=vertex_1542008498326_0022_1_00, diagnostics=[Vertex vertex_1542008498326_0022_1_00 [Map 1] killed/failed due to:ROOT_INPUT_INIT_FAILURE, Vertex Input: redemption_fact_t initializer failed, vertex=vertex_1542008498326_0022_1_00 [Map 1], java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.adl.AdlFileSystem not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2595)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3320)
I already have the jars below in the Hadoop/Hive lib directories, but it is still not working: hadoop-azure-3.0.0.3.0.0.0-1334.jar, hadoop-azure-datalake-3.0.0.3.0.0.0-1334.jar, azure-data-lake-store-sdk-2.2.7.jar
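Since the ClassNotFoundException is raised in the Tez vertex initializer rather than in HiveServer2 itself, the jars most likely also need to reach the Tez containers, not just the local Hadoop/Hive lib directories. A hedged sketch of one way to do that (the HDFS path is a placeholder, and the property edit should be verified against your tez-site.xml):

```bash
# Publish the Azure Data Lake jars on HDFS so Tez can localize them;
# run from the directory that contains the jars.
hdfs dfs -mkdir -p /hdp/apps/extra-jars
hdfs dfs -put hadoop-azure-3.0.0.3.0.0.0-1334.jar \
              hadoop-azure-datalake-3.0.0.3.0.0.0-1334.jar \
              azure-data-lake-store-sdk-2.2.7.jar /hdp/apps/extra-jars/
# Then append hdfs:///hdp/apps/extra-jars to the existing tez.lib.uris value
# (via Ambari) and restart HiveServer2 so new Tez sessions pick it up.
```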
Labels:
- Apache Hive
07-06-2018
01:50 PM
I have only the HiveServer2 service on Server2.
... View more
07-06-2018
12:47 PM
I have configured HiveServer2 in HA mode and started it at INFO level. It is generating logs at INFO level on server1, but on server2 it is generating them at DEBUG level. I also checked the configuration; there is no difference between the two HiveServers. Below are the running HiveServer2 processes on server1 & server2.
Server1:~> ps -ef | grep hiveserver2
hive 6717 1 2 Jul02 ? 02:02:18 /usr/lib/jvm/jdk1.8.0_71/bin/java -Xmx1024m -Dhdp.version=2.5.3.0-37 -Djava.net.preferIPv4Stack=true -Dhdp.version=2.5.3.0-37 -Dhadoop.log.dir=/hadoop/var/log/hadoop/hive -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.3.0-37/hadoop -Dhadoop.id.str=hive -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.5.3.0-37/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx1024m -Xmx12065m -Dlog4j.configurationFile=hive-log4j2.properties -Djava.util.logging.config.file=/usr/hdp/2.5.3.0-37/hive/bin/../conf/parquet-logging.properties -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/hdp/2.5.3.0-37/hive/lib/hive-service-1.2.1000.2.5.3.0-37.jar org.apache.hive.service.server.HiveServer2 --hiveconf hive.aux.jars.path=file:///usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar -hiveconf hive.metastore.uris= -hiveconf hive.log.file=hiveserver2.log -hiveconf hive.log.dir=/hadoop/var/log/hive
=======================================
Server2:~> ps -ef | grep hiveserver2
hive 4459 1 3 Jul02 ? 03:30:22 /usr/lib/jvm/jdk1.8.0_71/bin/java -Xmx1024m -Dhdp.version=2.5.3.0-37 -Djava.net.preferIPv4Stack=true -Dhdp.version=2.5.3.0-37 -Dhadoop.log.dir=/hadoop/var/log/hadoop/hive -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.5.3.0-37/hadoop -Dhadoop.id.str=hive -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.5.3.0-37/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx1024m -Xmx12065m -Dlog4j.configurationFile=hive-log4j2.properties -Djava.util.logging.config.file=/usr/hdp/2.5.3.0-37/hive/bin/../conf/parquet-logging.properties -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/hdp/2.5.3.0-37/hive/lib/hive-service-1.2.1000.2.5.3.0-37.jar org.apache.hive.service.server.HiveServer2 --hiveconf hive.aux.jars.path=file:///usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar -hiveconf hive.metastore.uris= -hiveconf hive.log.file=hiveserver2.log -hiveconf hive.log.dir=/hadoop/var/log/hive
Please respond.
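One hedged thing to check: both JVMs point at -Dlog4j.configurationFile=hive-log4j2.properties, which is resolved from each host's Hive conf dir, so the effective level is whatever that local file says, regardless of what Ambari displays. A quick comparison sketch (the conf path assumes the standard HDP symlink):

```bash
# Run on both server1 and server2 and compare; a stray local override of
# property.hive.log.level=DEBUG on server2 would explain the difference.
grep -i 'rootLogger.level\|hive.log.level' /etc/hive/conf/hive-log4j2.properties
```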
Labels:
- Apache Hive
08-17-2017
06:25 PM
Hey Nitin, I changed the configs as follows:
- object class=user
- Distinguished name attribute=distinguishedName
- Group member attribute=member
and re-synced the users. It is working now. Thanks for the help.
08-17-2017
06:22 PM
Yes, this helps a lot. Thanks for the quick answer.
08-17-2017
06:08 PM
I have the below set of properties:
authentication.ldap.groupMembershipAttr=memberOf
authentication.ldap.groupNamingAttr=cn
authentication.ldap.groupObjectClass=group
authentication.ldap.userObjectClass=person
authentication.ldap.usernameAttribute=sAMAccountName
authentication.ldap.dnAttribute=dn
08-17-2017
04:52 PM
1 Kudo
Is there a way to list the users/jobs that are exhausting the YARN queues, i.e. those that ran with more than 4 TB of memory and 400 containers in the last 2 weeks?
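One avenue is the ResourceManager REST API, which reports per-application aggregate usage; a minimal sketch (the hostname is a placeholder, and field availability should be verified against your Hadoop version):

```bash
# Apps finished in the last 2 weeks, with aggregate memory/vcore usage per user/queue.
SINCE=$(( ($(date +%s) - 14*24*3600) * 1000 ))   # epoch millis, 2 weeks ago
curl -s "http://rm-host:8088/ws/v1/cluster/apps?finishedTimeBegin=${SINCE}" \
  | python -c 'import sys, json
for a in json.load(sys.stdin)["apps"]["app"]:
    print(a["user"], a["queue"], a.get("memorySeconds"), a.get("vcoreSeconds"))'
```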
Tags:
- Hadoop Core
- jobs
- YARN
Labels:
- Apache YARN
08-14-2017
05:55 PM
1 Kudo
I have synced users and groups into Ambari using:
ambari-server sync-ldap --users /home/centos/users.txt --groups /tmp/groups.txt
Using python /usr/bin/python
Syncing with LDAP...
Enter Ambari Admin login: admin
Enter Ambari Admin password:
Syncing specified users and groups...
Completed LDAP Sync.
Summary:
memberships:
removed = 0
created = 0
users:
skipped = 0
removed = 0
updated = 0
created = 4
groups:
updated = 0
removed = 0
created = 0
But the users are not mapped to any group; the sync does not add users to the groups. Which parameters should I check in ambari.properties?
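For reference, a sketch of the entries to inspect, with the values that resolved this per the 08-17-2017 reply earlier in this feed (for AD, membership is read from the group's member attribute, which holds full user DNs):

```bash
grep 'authentication.ldap' /etc/ambari-server/conf/ambari.properties
# Working values per the accepted fix in this thread:
#   authentication.ldap.userObjectClass=user
#   authentication.ldap.dnAttribute=distinguishedName
#   authentication.ldap.groupMembershipAttr=member
```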
Labels:
- Apache Ambari
08-11-2017
07:08 PM
Thanks @nshelke, that explains everything.
08-11-2017
07:08 PM
Thanks for the descriptive answer.
08-11-2017
04:06 PM
I want to set this field to a blank value, and I am getting the error:
ERROR 1452 (23000): Cannot add or update a child row: a foreign key constraint fails (`ambari`.`clusters`, CONSTRAINT `FK_clusters_upgrade_id` FOREIGN KEY (`upgrade_id`) REFERENCES `upgrade` (`upgrade_id`))
08-11-2017
03:18 PM
1 Kudo
I checked the Ambari DB and saw some inconsistencies while upgrading it:
mysql> select cluster_id,resource_id,upgrade_id,provisioning_state,security_type,desired_stack_id from clusters;
+------------+-------------+------------+--------------------+---------------+------------------+
| cluster_id | resource_id | upgrade_id | provisioning_state | security_type | desired_stack_id |
+------------+-------------+------------+--------------------+---------------+------------------+
| 2 | 4 | NULL | INSTALLED | KERBEROS | 2 |
+------------+-------------+------------+--------------------+---------------+------------------+
1 row in set (0.00 sec)
mysql> update clusters SET upgrade_id = '' where cluster_id = 2;
ERROR 1452 (23000): Cannot add or update a child row: a foreign key constraint fails (`ambari`.`clusters`, CONSTRAINT `FK_clusters_upgrade_id` FOREIGN KEY (`upgrade_id`) REFERENCES `upgrade` (`upgrade_id`))
I am getting the above error while updating upgrade_id. How can I update it? Please help.
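For reference, a minimal sketch of the usual way to clear the reference: an empty string is treated as a (nonexistent) key value and trips the FK, whereas NULL does not, assuming the column is nullable. Back up the ambari database first.

```sql
-- Clear the dangling upgrade reference instead of pointing it at ''.
UPDATE clusters SET upgrade_id = NULL WHERE cluster_id = 2;
```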
Labels:
- Apache Ambari
07-11-2017
07:11 PM
@nshelke, thanks for the quick reply. I will try your suggestion.
07-11-2017
07:03 PM
1 Kudo
WARN [qtp-ambari-agent-4198] HeartBeatHandler:411 - Received registration request from host with non compatible agent version, hostname=xxx, agentVersion=2.4.3.0, serverVersion=2.5.1.0
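For reference, this registration warning means the agent package (2.4.3.0) on that host is older than the server (2.5.1.0). A hedged sketch of the usual fix, assuming a yum-based host with the 2.5.1.0 Ambari repo configured:

```bash
yum clean all
yum upgrade -y ambari-agent   # bring the agent up to the server's 2.5.1.0
ambari-agent restart
```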
Labels:
- Apache Ambari
06-26-2017
05:53 PM
@Jay SenSharma: I am using Ambari 2.4.2.0. I also tried the property "authentication.ldap.username.forceLowercase" in the ambari.properties file, but it still gives the same error. Is there any other way to convert UPPERCASE or CaMeLcase usernames to lowercase in Ambari 2.4.2.0, other than upgrading Ambari?
06-26-2017
12:28 PM
We have an AD/Kerberos-integrated HDP cluster in which the AD users are capitalized (e.g. Abc) while the same users' principals are lowercase (e.g. abc@REALM.COM). When we try to use the Hive view, it shows the error message: "Service 'userhome' check failed: File does not exist: /user/Abc". User abc's home directory is present under /user/abc instead of /user/Abc. Please suggest how we can use the Hive view.
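A hedged stopgap sketch, assuming only the 'userhome' check is blocking the view (it works around the check rather than fixing the case mismatch itself):

```bash
# Create a home directory matching the case of the Ambari login name;
# run as a user with HDFS superuser credentials (kinit first on Kerberos).
sudo -u hdfs hdfs dfs -mkdir -p /user/Abc
sudo -u hdfs hdfs dfs -chown Abc:hdfs /user/Abc
```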
Labels:
- Apache Hadoop
06-07-2017
10:33 AM
Users are logging in with a user ID & password.
06-06-2017
05:41 AM
@Jay SenSharma Any update? Any help would be appreciated.
06-06-2017
05:30 AM
It gives me an error while running the job: permission denied for user yarn. I am not sure why, since I am running this Oozie action as the arjun user, yet I still get this error.
06-06-2017
05:29 AM
@nshelke I tried your suggestion; I am just not sure how to add the user in HDFS. Can you guide me?
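For reference, a minimal sketch: "adding a user in HDFS" usually just means creating a home directory owned by that user (arjun here, per this thread):

```bash
# Run as a user with HDFS superuser credentials.
sudo -u hdfs hdfs dfs -mkdir -p /user/arjun
sudo -u hdfs hdfs dfs -chown arjun:hdfs /user/arjun
```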
06-06-2017
05:01 AM
I have an Oozie workflow that launches a Sqoop job against Hive. While importing, it throws the error below:
[main] ERROR org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:397)
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:384)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:342)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:246)
at org.apache.sqoop.tool.CreateHiveTableTool.run(CreateHiveTableTool.java:58)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:202)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:182)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:51)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:48)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Can someone help me out on this?
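For reference, HiveConf ships in the Oozie Hive sharelib, which a Sqoop action does not pull in by default. A hedged sketch (the Oozie URL is a placeholder; the property names are standard Oozie):

```bash
# Confirm what the sharelib contains.
oozie admin -oozie http://oozie-host:11000/oozie -shareliblist
# Then, in the workflow's job.properties, make the Sqoop action
# include the hive jars as well:
#   oozie.use.system.libpath=true
#   oozie.action.sharelib.for.sqoop=sqoop,hive
```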
Labels:
- Apache Hive
- Apache Oozie
- Apache Sqoop
06-04-2017
08:04 AM
@Jay SenSharma It gives the same issue on another system as well, so it is not about the browser cache. I cleared the browser cache and tried again; still no luck.
06-04-2017
06:35 AM
Log file (nifi-user.log):
2017-06-02 13:39:13,641 INFO [NiFi Web Server-11029] o.a.n.w.s.NiFiAuthenticationFilter Authentication success for Arjun-More
2017-06-02 13:39:13,653 INFO [NiFi Web Server-10949] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<user1><UKTEHDPHDF01P.europe.odcorp.net><CN=UKTEHDPHDF03P.europe.odcorp.net, OU=Hadoop>) GET https://test1.example.com:7071/nifi-api/flow/current-user (source ip: 10.128.198.42)
2017-06-02 13:39:13,653 INFO [NiFi Web Server-10949] o.a.n.w.s.NiFiAuthenticationFilter Authentication success for user1
2017-06-02 13:39:13,838 INFO [NiFi Web Server-11021] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJBcmp1bi1Nb3JlQEVVUk9QRS5PRENPUlAuTkVUIiwiaXNzIjoiS2VyYmVyb3NTZXJ2aWNlIiwiYXVkIjoiS2VyYmVyb3NTZXJ2aWNlIiwicHJlZmVycmVkX3VzZXJuYW1lIjoiQXJqdW4tTW9yZUBFVVJPUEUuT0RDT1JQLk5FVCIsImtpZCI6NSwiZXhwIjoxNDk2NDUwMzUzLCJpYXQiOjE0OTY0MDcxNTN9.xsEmnpj-muk60JdqKRsyOBn66dw6hJANTG8zgEv-eQ8) GET https://test1.example.com:7071/nifi-api/flow/client-id (source ip: 10.148.171.33)
2017-06-02 13:39:13,838 INFO [NiFi Web Server-10803] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJBcmp1bi1Nb3JlQEVVUk9QRS5PRENPUlAuTkVUIiwiaXNzIjoiS2VyYmVyb3NTZXJ2aWNlIiwiYXVkIjoiS2VyYmVyb3NTZXJ2aWNlIiwicHJlZmVycmVkX3VzZXJuYW1lIjoiQXJqdW4tTW9yZUBFVVJPUEUuT0RDT1JQLk5FVCIsImtpZCI6NSwiZXhwIjoxNDk2NDUwMzUzLCJpYXQiOjE0OTY0MDcxNTN9.xsEmnpj-muk60JdqKRsyOBn66dw6hJANTG8zgEv-eQ8) GET https://test1.example.com:7071/nifi-api/flow/config (source ip: 10.148.171.33)
2017-06-02 13:39:13,838 INFO [NiFi Web Server-11021] o.a.n.w.s.NiFiAuthenticationFilter Authentication success for user1
2017-06-02 13:39:13,838 INFO [NiFi Web Server-10803] o.a.n.w.s.NiFiAuthenticationFilter Authentication success for user1
2017-06-02 13:39:13,838 INFO [NiFi Web Server-10959] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJBcmp1bi1Nb3JlQEVVUk9QRS5PRENPUlAuTkVUIiwiaXNzIjoiS2VyYmVyb3NTZXJ2aWNlIiwiYXVkIjoiS2VyYmVyb3NTZXJ2aWNlIiwicHJlZmVycmVkX3VzZXJuYW1lIjoiQXJqdW4tTW9yZUBFVVJPUEUuT0RDT1JQLk5FVCIsImtpZCI6NSwiZXhwIjoxNDk2NDUwMzUzLCJpYXQiOjE0OTY0MDcxNTN9.xsEmnpj-muk60JdqKRsyOBn66dw6hJANTG8zgEv-eQ8) GET https://test1.example.com:7071/nifi-api/flow/cluster/summary (source ip: 10.148.171.33)
2017-06-02 13:39:13,839 INFO [NiFi Web Server-10959] o.a.n.w.s.NiFiAuthenticationFilter Authentication success for user1
2017-06-02 13:39:13,852 INFO [NiFi Web Server-10551] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (<user1><UKTEHDPHDF01P.europe.odcorp.net><CN=UKTEHDPHDF03P.europe.odcorp.net, OU=Hadoop>) GET https://test1.example.com:7071/nifi-api/flow/config (source ip: 10.134.197.41)
2017-06-02 13:39:13,852 INFO [NiFi Web Server-10551] o.a.n.w.s.NiFiAuthenticationFilter Authentication success for user1
2017-06-02 13:39:13,868 INFO [NiFi Web Server-11029] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJBcmp1bi1Nb3JlQEVVUk9QRS5PRENPUlAuTkVUIiwiaXNzIjoiS2VyYmVyb3NTZXJ2aWNlIiwiYXVkIjoiS2VyYmVyb3NTZXJ2aWNlIiwicHJlZmVycmVkX3VzZXJuYW1lIjoiQXJqdW4tTW9yZUBFVVJPUEUuT0RDT1JQLk5FVCIsImtpZCI6NSwiZXhwIjoxNDk2NDUwMzUzLCJpYXQiOjE0OTY0MDcxNTN9.xsEmnpj-muk60JdqKRsyOBn66dw6hJANTG8zgEv-eQ8) GET https://test1.example.com:7071/nifi-api/flow/banners (source ip: 10.148.171.33)
2017-06-02 13:39:13,868 INFO [NiFi Web Server-11029] o.a.n.w.s.NiFiAuthenticationFilter Authentication success for user1
2017-06-02 13:39:14,067 INFO [NiFi Web Server-10961] o.a.n.w.s.NiFiAuthenticationFilter Attempting request for (eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJBcmp1bi1Nb3JlQEVVUk9QRS5PRENPUlAuTkVUIiwiaXNzIjoiS2VyYmVyb3NTZXJ2aWNlIiwiYXVkIjoiS2VyYmVyb3NTZXJ2aWNlIiwicHJlZmVycmVkX3VzZXJuYW1lIjoiQXJqdW4tTW9yZUBFVVJPUEUuT0RDT1JQLk5FVCIsImtpZCI6NSwiZXhwIjoxNDk2NDUwMzUzLCJpYXQiOjE0OTY0MDcxNTN9.xsEmnpj-muk60JdqKRsyOBn66dw6hJANTG8zgEv-eQ8) GET https://test1.example.com:7071/nifi-api/flow/processor-types (source ip: 10.148.171.33)
2017-06-02 13:39:14,067 INFO [NiFi Web Server-10961] o.a.n.w.s.NiFiAuthenticationFilter Authentication success for user1
Labels:
- Apache NiFi
06-01-2017
07:41 AM
@Jay SenSharma
The /hbase-unsecure znode is not present in my case; there is no znode with that name.
06-01-2017
07:22 AM
1 Kudo
I am getting an error in HBase while starting it in HDP (HDP-2.3.2): the HBase Master aborts with a TableExistsException.
2016-09-17 04:47:58,238 FATAL [master:hb-qa:60000] master.HMaster: Master server abort: loaded coprocessors are: []
2016-09-17 04:47:58,239 FATAL [master:hb-qa:60000] master.HMaster: Unhandled exception. Starting shutdown.
org.apache.hadoop.hbase.TableExistsException: hbase:namespace
at org.apache.hadoop.hbase.master.handler.CreateTableHandler.prepare(CreateTableHandler.java:133)
at org.apache.hadoop.hbase.master.TableNamespaceManager.createNamespaceTable(TableNamespaceManager.java:232)
at org.apache.hadoop.hbase.master.TableNamespaceManager.start(TableNamespaceManager.java:86)
at org.apache.hadoop.hbase.master.HMaster.initNamespace(HMaster.java:1046)
at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:925)
at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:605)
at java.lang.Thread.run(Thread.java:745)
Can someone help me with this?
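For reference, a hedged sketch of the usual remediation, assuming stale ZooKeeper state from a previous install: a TableExistsException for hbase:namespace at startup typically means HBase's parent znode still holds old table metadata. The path depends on zookeeper.znode.parent (/hbase-unsecure is the unsecured HDP default; per the follow-up above, check what actually exists). Stop HBase first.

```bash
hbase zkcli                # opens an interactive ZooKeeper shell
# At the zkcli prompt:
#   ls /                   -- locate the actual HBase parent znode
#   rmr /hbase-unsecure    -- placeholder; remove the znode found above
```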
04-12-2017
06:49 PM
Yes @Shashant Panwar, but both of these worked for me, and I already mentioned this manual Apache Hadoop client installation approach in the comment above.