Member since
12-30-2015
164
Posts
29
Kudos Received
10
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 14225 | 01-07-2019 06:17 AM |
| | 696 | 12-27-2018 07:28 AM |
| | 2625 | 11-26-2018 10:12 AM |
| | 662 | 11-16-2018 12:15 PM |
| | 2462 | 10-22-2018 09:31 AM |
10-09-2019
09:28 AM
PFA the error logs below:

```
19/10/09 16:09:32 DEBUG ServletHandler: chain=org.apache.hadoop.security.authentication.server.AuthenticationFilter-418c020b->org.apache.spark.ui.JettyUtils$$anon$3-75e710b@986efce7==org.apache.spark.ui.JettyUtils$$anon$3,jsp=null,order=-1,inst=true
19/10/09 16:09:32 DEBUG ServletHandler: call filter org.apache.hadoop.security.authentication.server.AuthenticationFilter-418c020b
19/10/09 16:09:32 DEBUG AuthenticationFilter: Got token null from httpRequest http://ip-10-0-10.184.************:18081/
19/10/09 16:09:32 DEBUG AuthenticationFilter: Request [http://ip-10-0-10-184.*****:18081/] triggering authentication. handler: class org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler
19/10/09 16:09:32 DEBUG AuthenticationFilter: Authentication exception: java.lang.IllegalArgumentException
org.apache.hadoop.security.authentication.client.AuthenticationException: java.lang.IllegalArgumentException
	at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.authenticate(KerberosAuthenticationHandler.java:306)
	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:536)
	at org.spark_project.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
	at org.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
	at org.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
	at org.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
	at org.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
	at org.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.spark_project.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:493)
	at org.spark_project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
	at org.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
	at org.spark_project.jetty.server.Server.handle(Server.java:539)
	at org.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:333)
	at org.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
	at org.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
	at org.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:108)
	at org.spark_project.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
	at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
	at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
	at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
	at org.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
	at org.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException
	at java.nio.Buffer.limit(Buffer.java:275)
	at org.apache.hadoop.security.authentication.util.KerberosUtil$DER.<init>(KerberosUtil.java:365)
	at org.apache.hadoop.security.authentication.util.KerberosUtil$DER.<init>(KerberosUtil.java:358)
	at org.apache.hadoop.security.authentication.util.KerberosUtil.getTokenServerName(KerberosUtil.java:291)
	at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.authenticate(KerberosAuthenticationHandler.java:285)
	... 22 more
19/10/09 16:09:32 DEBUG GzipHttpOutputInterceptor: org.spark_project.jetty.server.handler.gzip.GzipHttpOutputInterceptor@17d4d832 exclude by status 403
19/10/09 16:09:32 DEBUG HttpChannel: sendResponse info=null content=HeapByteBuffer@26ea8849[p=0,l=365,c=32768,r=365]={<<<<html>\n<head>\n<me.../body>\n</html>\n>>>\x00\x00...\x00} complete=true committing=true callback=Blocker@137652aa{null}
19/10/09 16:09:32 DEBUG HttpChannel: COMMIT for / on HttpChannelOverHttp@4d71d816{r=2,c=true,a=DISPATCHED,uri=//ip-10-0-10-184.******:18081/}
403 java.lang.IllegalArgumentException HTTP/1.1
Date: Wed, 09 Oct 2019 16:09:32 GMT
Set-Cookie: hadoop.auth=; HttpOnly
Cache-Control: must-revalidate,no-cache,no-store
Content-Type: text/html;charset=iso-8859-1
```
10-08-2019
11:07 PM
Hi @Shelton, I have tried the code below but I am still facing the same issue:

```
{% if security_enabled %}
export SPARK_HISTORY_OPTS='-Dspark.ui.filters=org.apache.hadoop.security.authentication.server.AuthenticationFilter -Dspark.org.apache.hadoop.security.authentication.server.AuthenticationFilter.params="type=kerberos,kerberos.principal={{spnego_principal}},kerberos.keytab={{spnego_keytab}}"'
{% endif %}
```

Do we need to enable SPNEGO authentication in the browser?
07-08-2019
09:49 AM
Dear community members, we are unable to connect to the Spark History Server UI even though we have enabled SPNEGO authentication in the Advanced spark-env config as follows:

```
export SPARK_HISTORY_OPTS='-Dspark.ui.filters=org.apache.hadoop.security.authentication.server.AuthenticationFilter -Dspark.org.apache.hadoop.security.authentication.server.AuthenticationFilter.params="type=kerberos,kerberos.principal=HTTP/ip-0-0-0-0@HDPCLUSTER.LOCAL,kerberos.keytab=/etc/security/keytabs/spnego.service.keytab"'
```

Any help to resolve the above issue is much appreciated. Thanks in advance.
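One frequent cause of the filter silently failing is a malformed `params` value, since it is a plain comma-separated `key=value` list. A minimal Python sketch (a hypothetical helper, not part of Spark or Hadoop) that parses the list and surfaces typos:

```python
def parse_filter_params(params):
    """Parse a comma-separated key=value list, as used by the
    AuthenticationFilter.params setting, into a dict.
    Raises ValueError on a pair with no '=' (a common typo)."""
    result = {}
    for pair in params.split(","):
        key, sep, value = pair.partition("=")  # split on the FIRST '=' only
        if not sep:
            raise ValueError(f"malformed pair: {pair!r}")
        result[key.strip()] = value.strip()
    return result

params = ("type=kerberos,"
          "kerberos.principal=HTTP/ip-0-0-0-0@HDPCLUSTER.LOCAL,"
          "kerberos.keytab=/etc/security/keytabs/spnego.service.keytab")
parsed = parse_filter_params(params)
```

Checking `parsed["kerberos.principal"]` and `parsed["kerberos.keytab"]` against the output of `klist -kt` on the keytab is a quick way to confirm the principal actually exists.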
Labels:
- Apache Spark
06-19-2019
01:24 PM
Hi @Jay Kumar SenSharma, I am able to connect by using the connection properties below:

```
custom.postgres.jdbc.name=postgresql-jdbc.jar
previous.custom.postgres.jdbc.name=postgresql-jdbc.jar
server.jdbc.connection-pool=internal
server.jdbc.database=postgres
server.jdbc.database_name=ambari
server.jdbc.driver=org.postgresql.Driver
server.jdbc.hostname=localhost
server.jdbc.port=5432
server.jdbc.postgres.schema=ambari
server.jdbc.rca.driver=org.postgresql.Driver
server.jdbc.rca.url=jdbc:postgresql://*.*.*.*:5432/ambari?ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory
server.jdbc.rca.user.name=ambari
server.jdbc.rca.user.passwd=${alias=ambari.db.password}
server.jdbc.url=jdbc:postgresql://*.*.*.*:5432/ambari?ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory
server.jdbc.user.passwd=${alias=ambari.db.password}
```

If I remove the `sslfactory=org.postgresql.ssl.NonValidatingFactory` parameter from the JDBC URL, the connection stops working.
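`NonValidatingFactory` skips server-certificate validation, so if the connection only works with it, the Postgres server certificate is most likely not trusted by the JVM's truststore. A small Python sketch (hostname is a placeholder) for inspecting which SSL-related parameters a JDBC URL carries, treating everything after `?` as an ordinary query string:

```python
from urllib.parse import parse_qs

def jdbc_query_params(jdbc_url):
    """Extract query parameters from a JDBC URL like
    jdbc:postgresql://host:5432/ambari?ssl=true&sslfactory=...
    Returns a dict of single-valued parameters."""
    _, _, query = jdbc_url.partition("?")
    return {k: v[0] for k, v in parse_qs(query).items()}

url = ("jdbc:postgresql://db.example.com:5432/ambari"
       "?ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory")
params = jdbc_query_params(url)
```

With this, one can confirm `ssl=true` survived edits to `ambari.properties` before restarting the server.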
06-18-2019
10:59 AM
Dear Community Members, I am not able to connect to an external Postgres DB with SSL enabled from ambari-server. I changed the JDBC URL in the ambari.properties file to make the SSL connection (followed https://community.hortonworks.com/questions/209938/how-to-setup-ambari-with-an-external-postgresql-db.html ), but no luck. Any help is much appreciated! Thank you in advance :)
Labels:
- Apache Ambari
05-27-2019
08:08 AM
Hi @Predrag Minovic, thank you for sharing your input. There are no issues with the ZooKeeper namespace; I have double-checked it, i.e. `jdbc:hive2://hadoop-zknode01:2181,hadoop-zknode02:2181,hadoop-zknode03:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2`
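In a service-discovery URL like this, the client first contacts the ZooKeeper quorum listed before the `/`, then looks up HiveServer2 under the given namespace. A minimal Python sketch (helper name is illustrative) that pulls the quorum out of such a URL, useful for verifying each `host:port` is reachable separately:

```python
def zk_quorum_from_hive_url(url):
    """Return the list of ZooKeeper host:port entries from a
    jdbc:hive2://...;serviceDiscoveryMode=zooKeeper;... URL."""
    prefix = "jdbc:hive2://"
    if not url.startswith(prefix):
        raise ValueError("not a hive2 JDBC URL")
    body = url[len(prefix):]
    hosts, _, _ = body.partition("/")  # everything before '/' is the quorum
    return hosts.split(",")

url = ("jdbc:hive2://hadoop-zknode01:2181,hadoop-zknode02:2181,"
       "hadoop-zknode03:2181/;serviceDiscoveryMode=zooKeeper;"
       "zooKeeperNamespace=hiveserver2")
quorum = zk_quorum_from_hive_url(url)
```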
05-20-2019
01:36 PM
Hello All, I installed HDP 3.1 recently and enabled Hive LLAP. I am able to connect to HiveServer2 using the Hive CLI, but when I try to connect to Hive from Beeline it throws a NullPointerException. I haven't found any errors in the HiveServer2 logs:

```
2019-05-20 13:28:10 INFO HiveConnection:203 - Will try to open client transport with JDBC Uri: jdbc:hive2://hive.server2.instance.uri=hadoop-master-nn1:10000;hive.server2.authentication=NONE;hive.server2.transport.mode=binary;hive.server2.thrift.sasl.qop=auth;hive.server2.thrift.bind.host=hadoop-master-nn1;hive.server2.thrift.port=10000;hive.server2.use.SSL=false/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
java.lang.NullPointerException
```

Any suggestion to resolve the above issue?
05-16-2019
08:01 AM
Dear community members, I am not able to see the Hive table data when running a SELECT query, although a COUNT query returns the row count. No errors were observed in the logs. Create table syntax:

```sql
CREATE TABLE `demandpriority`(
  `demandtypekey` int,
  `itemkey` int,
  `locationkey` int,
  `demandcostraw` float,
  `scpdemandbuildlatelimit` float,
  `demandvcuftraw` float,
  `spdemandqty` float,
  `demandvcasesraw` float,
  `demanddebugflag` float,
  `demandquantityraw` float,
  `demandwcasesraw` float,
  `spdemandpriority` float,
  `scpdemandbuildaheadlimit` float,
  `demandretailraw` float,
  `scpdemandincrementalflag` boolean,
  `demandwcuftraw` float)
PARTITIONED BY (
  `versionkey` int,
  `timekey` date)
CLUSTERED BY (itemkey) INTO 100 BUCKETS
ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
LOCATION 'hdfs://88888/warehouse/tablespace/managed/hive/schema_wmt1dept.db/demandpriority'
TBLPROPERTIES (
  'bucketing_version'='2',
  'transactional'='true',
  'transactional_properties'='default',
  'transient_lastDdlTime'='1557938173')
```

Note: I have tried MSCK REPAIR and ANALYZE commands but still no luck.
Labels:
- Apache Hive
05-07-2019
11:38 AM
Please check the below article: https://antnix07.blogspot.com/2017/05/normal-0-false-false-false-en-us-x-none.html
03-05-2019
12:54 PM
Log in to the ZooKeeper CLI using `/usr/hdp/current/zookeeper-server/bin/zkCli.sh` and run `ls /`. The above command will display all znodes in the cluster.
02-19-2019
07:36 AM
Hi @Dan Hops, it seems you are working with a single-node cluster and have configured LLAP on top of Hive with 100% YARN utilization. Please configure LLAP to use 25% of your YARN capacity.
02-13-2019
09:56 AM
Hi @Kashif Amir, another way is to automate this task with a shell or Python script: you can get the table creation statement by using `desc formatted db.tablename`.
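As a sketch of that automation (the JDBC URL, table names, and output directory are placeholders), a Python snippet can generate one Beeline invocation per table that dumps its DDL to a file; `SHOW CREATE TABLE` is used here as it emits a ready-to-run statement:

```python
def ddl_dump_commands(tables, output_dir="/tmp/ddl"):
    """Build shell commands that dump each table's CREATE TABLE
    statement to a file via Beeline (URL/paths are illustrative)."""
    jdbc = "jdbc:hive2://hiveserver:10000/"
    return [
        f'beeline -u "{jdbc}" --silent=true '
        f'-e "SHOW CREATE TABLE {t}" '
        f'> {output_dir}/{t.replace(".", "_")}.sql'
        for t in tables
    ]

cmds = ddl_dump_commands(["sales.orders", "sales.items"])
```

The generated strings can be written to a script or fed to `subprocess.run` on a host with Beeline installed.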
02-13-2019
09:00 AM
Hi @Kashif Amir, what HDP version are you using? If it is HDP 2.x, please see https://falcon.apache.org/FalconDocumentation.html#Retention
02-13-2019
07:22 AM
Hi @Giridharan C, for the syntax please follow: https://github.com/hortonworks-spark/spark-llap/tree/master
02-12-2019
12:04 PM
Please go through https://hortonworks.com/wp-content/uploads/2013/04/Hortonworks-Hive-ODBC-Driver-User-Guide.pdf
02-12-2019
11:59 AM
Setup guide: https://community.hortonworks.com/articles/223626/integrating-apache-hive-with-apache-spark-hive-war.html
02-12-2019
11:56 AM
Hi Giridharan, you need to use the Hive Warehouse Connector to connect to Hive databases from HDP 3 onwards. Please see https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.1.0/integrating-hive/content/hive_hivewarehouseconnector_for_handling_apache_spark_data.html
02-12-2019
11:50 AM
This seems to be a hostname resolution issue. Can you try to connect using the IP address?
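To confirm whether resolution is the problem before changing connection strings, a quick Python check (hostnames here are illustrative) using the standard library resolver:

```python
import socket

def resolves(hostname):
    """Return the IPv4 address a hostname resolves to, or None
    if DNS/hosts-file lookup fails."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

addr = resolves("localhost")  # on the real cluster, try the service host instead
```

If `resolves("your-service-host")` returns None while the IP works, fix `/etc/hosts` or DNS rather than the service config.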
02-08-2019
08:02 AM
Hi @Madhura Mhatre, this may not be the perfect answer, but you can try it this way. First verify the process below on one DataNode; if it works correctly, replicate it on the others:

1. Create a new config group containing just that DataNode; for example, assume it is configured with /data1 (1.5 TB).
2. Override the dfs.datanode.data.dir parameter in the new config group: remove /data1 and add /data2 (1 TB) and /data3 (1 TB).
3. Save the changes and restart the required services.
4. Blocks are automatically re-replicated from other DataNodes, since the old drive is missing from the configuration.

Note: the cluster may slow down due to the heavy data transfer from the other DataNodes.
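The override in step 2 is just a rewrite of a comma-separated directory list. A small Python sketch of that transformation (a hypothetical helper, not an Ambari API), handy when scripting the change across config groups:

```python
def replace_data_dirs(current, remove, add):
    """Rewrite a comma-separated dfs.datanode.data.dir value:
    drop the retired mount(s), append the new ones, keep order,
    and avoid duplicates."""
    dirs = [d.strip() for d in current.split(",") if d.strip()]
    dirs = [d for d in dirs if d not in remove]
    dirs.extend(d for d in add if d not in dirs)
    return ",".join(dirs)

new_value = replace_data_dirs("/data1", remove={"/data1"}, add=["/data2", "/data3"])
```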
01-21-2019
11:07 AM
Hi @Dinesh Singh, Check this article https://community.hortonworks.com/articles/9148/troubleshooting-an-oozie-flow.html
01-11-2019
10:35 AM
Try the following, let's assume your hive.tez.container.size=2048:

```
set hive.tez.java.opts=-Xmx1640m;                     -- 0.8 times hive.tez.container.size
set tez.runtime.io.sort.mb=820;                       -- 0.4 times hive.tez.container.size
set tez.runtime.unordered.output.buffer.size-mb=205;  -- 0.1 times hive.tez.container.size
```
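These ratios (0.8 / 0.4 / 0.1 of the container size) can be computed for any container size; a Python sketch of the rule of thumb (the values above are rounded slightly upward from the exact products):

```python
def tez_memory_settings(container_mb):
    """Derive Tez memory settings from hive.tez.container.size
    using the 0.8 / 0.4 / 0.1 rule of thumb (approximate values;
    round to taste in practice)."""
    return {
        "hive.tez.java.opts": f"-Xmx{int(container_mb * 0.8)}m",
        "tez.runtime.io.sort.mb": int(container_mb * 0.4),
        "tez.runtime.unordered.output.buffer.size-mb": int(container_mb * 0.1),
    }

settings = tez_memory_settings(2048)
```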
01-10-2019
11:42 AM
Hi @Shobhna Dhami, can you please check for any OutOfMemoryError entries in the log file? Please also check the /var/log/messages log file. If it is an OOM, you need to add more memory to that server.
01-10-2019
09:57 AM
Hi Vikash, this should help you: https://community.hortonworks.com/questions/91265/oozie-hive-action-class-not-found-exception.html
01-10-2019
08:18 AM
Hi @Naeem Ullah Khan, it looks like an issue with the proxy server. If you are using a proxy server, please add it in /etc/yum.conf.
01-08-2019
06:47 AM
Hi @chaouki trabelsi, please find the below example:

```
Test-NetConnection 10.0.10.133 -Port 8080

ComputerName     : 10.0.10.133
RemoteAddress    : 10.0.10.133
RemotePort       : 8080
InterfaceAlias   : Ethernet 2
SourceAddress    : 10.242.2.27
TcpTestSucceeded : True
```

TcpTestSucceeded should be True. If it is False, you need to open the ports in the firewalls.
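The same check can be done from any machine without PowerShell; a Python sketch of the `TcpTestSucceeded` logic (host/port values are examples) using a plain TCP connect:

```python
import socket

def port_open(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds
    within the timeout, False otherwise (refused or filtered)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

For example, `port_open("10.0.10.133", 8080)` mirrors the PowerShell check above; False means the service is down or a firewall is blocking the path.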
01-07-2019
12:28 PM
Hi @chaouki trabelsi, please check the firewall on the Windows server to accept connections from Hive. You can simply check it from PowerShell using the syntax below:

```
Test-NetConnection hiveserver-ip -Port 10000
```

If the connection status is False, you need to check the firewall on both sides (Linux + Windows).
01-07-2019
07:29 AM
Hi @Rajesh Sampath, please follow the article below: https://github.com/hortonworks-spark/spark-llap/tree/master
01-07-2019
06:17 AM
1 Kudo
Hi Vinay, use the code below to connect to Hive and list the databases:

```
spark-shell --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://hiveserverip:10000/" \
  --conf spark.datasource.hive.warehouse.load.staging.dir="/tmp" \
  --conf spark.hadoop.hive.zookeeper.quorum="zookeeperquoremip:2181" \
  --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.0.0.0-1634.jar
```

```scala
val hive = com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder.session(spark).build()
hive.showDatabases().show(100, false)
```

Reference article: https://github.com/hortonworks-spark/spark-llap/tree/master
01-02-2019
12:21 PM
Hi Vinay, can you post the Spark code you are using to display the databases?