Member since: 05-30-2018
Posts: 1322
Kudos Received: 715
Solutions: 148
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3421 | 08-20-2018 08:26 PM
 | 1522 | 08-15-2018 01:59 PM
 | 1921 | 08-13-2018 02:20 PM
 | 3431 | 07-23-2018 04:37 PM
 | 4139 | 07-19-2018 12:52 PM
03-14-2016
04:16 AM
5 Kudos
Aggregate functions in Hive queries are not working for me on a Zeppelin HDP 2.4 cluster. Zeppelin version: zeppelin-server-0.6.0.2.4.0.0-169.jar. I run "show tables" and it works fine. I am able to delete, create, and load tables. I am able to select * from a table. However, if I run any select count(*), sum(*), etc. query, it immediately fails:
java.lang.NullPointerException
at org.apache.zeppelin.hive.HiveInterpreter.getConnection(HiveInterpreter.java:184)
at org.apache.zeppelin.hive.HiveInterpreter.getStatement(HiveInterpreter.java:204)
at org.apache.zeppelin.hive.HiveInterpreter.executeSql(HiveInterpreter.java:233)
at org.apache.zeppelin.hive.HiveInterpreter.interpret(HiveInterpreter.java:328)
at org.apache.zeppelin.interpreter.ClassloaderInterpreter.interpret(ClassloaderInterpreter.java:57)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:295)
at org.apache.zeppelin.scheduler.Job.run(Job.java:171)
at org.apache.zeppelin.scheduler.ParallelScheduler$JobRunner.run(ParallelScheduler.java:162)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

I ran the same query on beeline and it returned results fine.
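An NPE at HiveInterpreter.getConnection usually means the interpreter could not build a JDBC connection from its settings, rather than a problem with Hive itself. One thing worth checking is the hive interpreter configuration in Zeppelin's Interpreter menu. A minimal sketch, assuming a hypothetical HiveServer2 host and the default port (all values below are placeholders):

```properties
# Zeppelin hive interpreter properties (Interpreter menu); host is a placeholder
default.driver=org.apache.hive.jdbc.HiveDriver
default.url=jdbc:hive2://hiveserver2-host:10000
default.user=hive
default.password=
```

If these are unset or wrong, metadata-only commands may still succeed while queries that launch jobs fail at connection time.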
03-11-2016
02:08 PM
1 Kudo
@Artem Ervits
I don't see why Ranger and Ranger KMS must be on the same host. Here are instructions on installing KMS on another host.
03-10-2016
10:07 PM
2 Kudos
Once I finished configuring the placement of all the services during the install, I clicked Deploy and it failed. I couldn't catch the failure. Now when I log into Ambari I see this: I click on "Launch Install Wizard" and it takes me to the Ambari home page. Clearly none of the services got installed and it won't let me start over. It is very confusing. I called the REST service to determine what cluster it thinks I have running:

curl --user admin:xxxx http://xxxxx.cloud.hortonworks.com/api/v1/clusters

which returns: curl: (52) Empty reply from server. Basically, how do I start over?
Labels:
- Apache Ambari
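One way to start over, assuming the cluster resource was at least partially created, is to delete it through the Ambari REST API and rerun the install wizard. Host, credentials, and cluster name below are placeholders; the X-Requested-By header is required by Ambari for modifying calls:

```shell
# List the cluster name Ambari has registered (credentials/host are placeholders)
curl --user admin:xxxx http://ambari-host:8080/api/v1/clusters

# Delete the half-built cluster so the install wizard can start fresh
curl --user admin:xxxx -H "X-Requested-By: ambari" \
  -X DELETE http://ambari-host:8080/api/v1/clusters/CLUSTERNAME
```

If no cluster resource exists at all, a heavier option is `ambari-server reset` on the Ambari host, which wipes the server database and forces a completely clean start.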
03-09-2016
04:04 PM
1 Kudo
Can a client connect using a lower QOP level like auth-int or auth if hive.server2.thrift.sasl.qop is set to auth-conf on HiveServer2?
Labels:
- Apache Hive
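The client side of this can be exercised from beeline: `sasl.qop` in the Hive JDBC URL requests a QOP level, and with the server pinned to auth-conf, a weaker request should fail to negotiate rather than silently downgrade. A hypothetical connection attempt (host and principal are placeholders):

```shell
# Request a weaker QOP (auth) against a server requiring auth-conf; the SASL
# negotiation should be rejected rather than downgraded
beeline -u "jdbc:hive2://hs2-host:10000/default;principal=hive/_HOST@EXAMPLE.COM;sasl.qop=auth"
```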
03-09-2016
03:46 PM
1 Kudo
Try this:

CREATE EXTERNAL TABLE tableName
PARTITIONED BY (ingestiondatetime BIGINT, recordtype STRING)
ROW FORMAT SERDE
'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT
'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
TBLPROPERTIES ('avro.schema.url'='hdfs:///user/file.avsc');
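Since the table above is partitioned, one follow-up worth noting: Hive won't see any data under an external partitioned table until the partitions are registered. A hypothetical example through beeline (connection URL, partition values, and HDFS path are all placeholders):

```shell
# Register an existing HDFS directory as a partition of the external table
# (all values below are placeholders)
beeline -u "jdbc:hive2://hs2-host:10000/default" -e "
ALTER TABLE tableName ADD PARTITION (ingestiondatetime=20160309, recordtype='event')
LOCATION '/user/data/20160309/event';"
```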
03-09-2016
04:38 AM
1 Kudo
@Ancil McBarnett Awesome feedback! I have a follow-up (maybe I will ask another HCC question). Can a client connect using a lower QOP level like auth-int or auth if hive.server2.thrift.sasl.qop is set to auth-conf on HiveServer2?
03-09-2016
03:29 AM
1 Kudo
On a Kerberized cluster, I understand that for JDBC/ODBC via TCP only SASL is supported. How do I enable SASL via Ambari for HiveServer2? I only see an SSL button.
Labels:
- Apache Hive
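As far as I can tell there is no separate SASL toggle in Ambari: with Kerberos authentication and the binary transport, HiveServer2 already speaks SASL over the Thrift/TCP channel. A sketch of the relevant hive-site properties (Ambari > Hive > Configs); the qop value shown is illustrative:

```properties
# HiveServer2 settings relevant to SASL on the binary (TCP) transport; values are illustrative
hive.server2.authentication=KERBEROS
hive.server2.transport.mode=binary
hive.server2.thrift.sasl.qop=auth
```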
03-08-2016
08:40 PM
2 Kudos
Can Ranger use KTS (Key Trustee Server) and KMS (Key Management Server) from another distribution, i.e. Cloudera or MapR?
Labels:
- Apache Ranger
03-08-2016
07:51 PM
@Ancil McBarnett Does that mean maintaining two different metadata repositories?
03-08-2016
04:32 AM
Does Knox only use the HTTP transport mode? Is it able to use TCP? The reason I ask: if I connect to a Hive (Kerberized) cluster over JDBC using TCP (the default method to connect to Hive), would I have to use HTTP instead if Knox were enabled?
Labels:
- Apache Hive
- Apache Knox
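On the Knox question above: Knox proxies Hive over HTTP only, so a JDBC client going through Knox has to switch from the binary/TCP transport to HTTP in the connection URL. A hypothetical beeline invocation (host, port, and topology name are placeholders):

```shell
# Hive via Knox requires transportMode=http; "default" is the Knox topology name (placeholder)
beeline -u "jdbc:hive2://knox-host:8443/;ssl=true;transportMode=http;httpPath=gateway/default/hive"
```

Clients connecting directly to HiveServer2 (not through Knox) can keep using the binary TCP transport.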