Member since: 02-24-2016
Posts: 175
Kudos Received: 56
Solutions: 3
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1933 | 06-16-2017 10:40 AM |
| | 16560 | 05-27-2016 04:06 PM |
| | 1633 | 03-17-2016 01:29 PM |
08-12-2016
02:31 PM
@Kevin Minder, @hkropp, @Alex Miller, @Ramesh Mani, @Vipin Rathor Adding experts!!
08-02-2016
04:08 PM
Hi there, We have Knox version 2.4.2.0-258 deployed in two environments (say Prod-A and Prod-B). Everything starts fine through Ambari, but when I try to connect to Knox it doesn't work. I checked gateway.jks under knox/data/security/keystore and it has a valid chain of certs.
2016-08-02 15:29:23,969 DEBUG nio.ssl (SslConnection.java:wrap(475)) - SCEP@fe16e5{l(/IP1:35840)<->r(/IP2:8444),s=1,open=true,ishut=false,oshut=false,rb=false,wb=false,w=true,i=1r}-{SslConnection@4ddce8ac SSL NEED_WRAP i/o/u=0/0/0 ishut=false oshut=false {AsyncHttpConnection@20993ce7,g=HttpGenerator{s=0,h=-1,b=-1,c=-1},p=HttpParser{s=-14,l=0,c=0},r=0}}
javax.net.ssl.SSLHandshakeException: no cipher suites in common
at sun.security.ssl.Handshaker.checkThrown(Handshaker.java:1431)
at sun.security.ssl.SSLEngineImpl.checkTaskThrown(SSLEngineImpl.java:535)
at sun.security.ssl.SSLEngineImpl.writeAppRecord(SSLEngineImpl.java:1214)
at sun.security.ssl.SSLEngineImpl.wrap(SSLEngineImpl.java:1186)
at javax.net.ssl.SSLEngine.wrap(SSLEngine.java:469)
at org.eclipse.jetty.io.nio.SslConnection.wrap(SslConnection.java:460)
at org.eclipse.jetty.io.nio.SslConnection.process(SslConnection.java:386)
at org.eclipse.jetty.io.nio.SslConnection.access$900(SslConnection.java:48)
at org.eclipse.jetty.io.nio.SslConnection$SslEndPoint.fill(SslConnection.java:678)
at org.eclipse.jetty.http.HttpParser.fill(HttpParser.java:1044)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:280)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
at org.eclipse.jetty.io.nio.SslConnection.handle(SslConnection.java:196)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:667)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Thread.java:745)
Caused by: javax.net.ssl.SSLHandshakeException: no cipher suites in common
at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
at sun.security.ssl.SSLEngineImpl.fatal(SSLEngineImpl.java:1666)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:304)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:292)
at sun.security.ssl.ServerHandshaker.chooseCipherSuite(ServerHandshaker.java:1035)
at sun.security.ssl.ServerHandshaker.clientHello(ServerHandshaker.java:738)
at sun.security.ssl.ServerHandshaker.processMessage(ServerHandshaker.java:221)
at sun.security.ssl.Handshaker.processLoop(Handshaker.java:979)
at sun.security.ssl.Handshaker$1.run(Handshaker.java:919)
at sun.security.ssl.Handshaker$1.run(Handshaker.java:916)
at java.security.AccessController.doPrivileged(Native Method)
at sun.security.ssl.Handshaker$DelegatedTask.run(Handshaker.java:1369)
at org.eclipse.jetty.io.nio.SslConnection.process(SslConnection.java:375)
... 12 more
Here are the commands I used to import the host-specific certificate into the JKS. All of the following were tried, but I get the same error.
keytool -import -alias gateway-identity -keyalg RSA -keystore gateway.jks -trustcacerts -file /etc/pki/tls/certs/Prod-A-HostFQDN.cer -storepass JKSP@ssword
keytool -import -alias gateway-identity -keyalg RSA -keystore gateway.jks -file /etc/pki/tls/certs/Prod-A-HostFQDN.cer -storepass JKSP@ssword
keytool -import -alias gateway-identity -keystore gateway.jks -file /etc/pki/tls/certs/Prod-A-HostFQDN.cer -storepass JKSP@ssword
Can anyone say what the issue is and how to go about this? Regards
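One thing I am not sure about: keytool -import of a .cer alone may only create a trustedCertEntry, leaving the server with no private key to negotiate ciphers with. A minimal sketch of checking this, and of importing the key pair via PKCS12 instead (the .key path below is a placeholder; I am not certain these are the recommended steps for Knox):

```bash
# Check the entry type of gateway-identity in gateway.jks (run from
# knox/data/security/keystores). "trustedCertEntry" would mean only the
# public cert was imported; the SSL listener needs a "PrivateKeyEntry".
keytool -list -v -keystore gateway.jks -storepass JKSP@ssword \
  | grep -E "Alias name|Entry type"

# Hypothetical way to import the key pair instead (the .key path is a
# placeholder for wherever the matching private key lives):
openssl pkcs12 -export \
  -in /etc/pki/tls/certs/Prod-A-HostFQDN.cer \
  -inkey /etc/pki/tls/private/Prod-A-HostFQDN.key \
  -name gateway-identity -out gateway-identity.p12

keytool -importkeystore \
  -srckeystore gateway-identity.p12 -srcstoretype PKCS12 \
  -destkeystore gateway.jks -deststoretype JKS \
  -alias gateway-identity
```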
Labels: Apache Knox
06-17-2016
06:41 AM
1 Kudo
Guys,
- How do we configure Hadoop/Hive for the scale of queries hitting it from the API layer? How many concurrent connections to WebHDFS can be supported?
- How many concurrent queries can be executed on Hive?
- What data access layer does one then use from the service layer of other platforms into Hive/Hadoop? Is it just a JDBC connection at that point (see the sketch below) or something else?
Thanks.
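To illustrate what I mean by "just a JDBC connection", here is a minimal sketch using Beeline against HiveServer2 (hostname, port, credentials, and table below are placeholders, not our real setup):

```bash
# Minimal Beeline/JDBC connection to HiveServer2 (all values are placeholders).
beeline -u "jdbc:hive2://hiveserver2-host:10000/default" \
        -n myuser -p mypassword \
        -e "SELECT COUNT(*) FROM some_table;"
```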
Labels: Apache Hive
06-09-2016
04:33 PM
Hi @Jitendra Yadav, I restarted a couple of times and it worked smoothly, so I did not do anything else. Thank you 🙂
06-09-2016
10:22 AM
Yeah I found that after posting this 🙂 Thanks @Jitendra Yadav
06-09-2016
09:16 AM
Guys, I want to clean up all of the history of the applications run on Spark. Because of some earlier issues, some of the applications are still shown as "incomplete", and when I check the /spark-history directory I see those applications with an .inprogress extension, for example: /spark-history/application_14653124594105_0007.inprogress. When I try killing it with the yarn application -kill command, it says that application_14653124594105_0007 has already finished. I deleted the appname.inprogress file from /spark-history, but it keeps appearing in the UI, and when I click its link the UI shows an error saying: Application not found. What is the safe way to delete this history so it no longer shows up in the History Server? Many thanks.
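To make it concrete, here is roughly what I ran (same application ID and path as above); whether anything more, such as a History Server restart, is needed afterwards is part of what I'm asking:

```bash
# YARN already reports the application as finished:
yarn application -kill application_14653124594105_0007   # "has already finished"

# Remove the stale .inprogress event log from the history directory:
hdfs dfs -ls /spark-history
hdfs dfs -rm /spark-history/application_14653124594105_0007.inprogress
```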
Labels: Apache Spark
06-08-2016
03:15 PM
Guys, When I am trying to start the Spark History Server component, I see an IllegalArgumentException and I'm not sure why it is popping up! The log also warns: Please use the new key 'spark.yarn.am.waitTime' instead.
16/06/08 15:04:32 ERROR FsHistoryProvider: Exception encountered when attempting to load application log hdfs://hdfsfederation/spark-history/.a5555e556-3301-433e-44de-23311665ed
java.lang.IllegalArgumentException: Codec [a5555e556-3301-433e-44de-23311665ed] is not available. Consider setting spark.io.compression.codec=lzf
at org.apache.spark.io.CompressionCodec$$anonfun$createCodec$1.apply(CompressionCodec.scala:77)
at org.apache.spark.io.CompressionCodec$$anonfun$createCodec$1.apply(CompressionCodec.scala:77)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:77)
I also tried setting spark.io.compression.codec=lzf (the error didn't go away). I also tried setting spark.io.compression.codec=org.apache.spark.io.LZFCompressionCodec; that does not fix the error either. Has anyone faced this situation? This is on HDP 2.4, Spark 1.6.0.
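One guess I have not confirmed: the file named in the error is a hidden leftover in /spark-history, and the History Server seems to treat the text after the last dot as a compression codec name. A sketch of what I would try in order to remove it (assuming it is not a real event log):

```bash
# List everything in the history directory, including hidden leftovers;
# the error above points at /spark-history/.a5555e556-3301-433e-44de-23311665ed.
hdfs dfs -ls /spark-history

# Remove the stray file so FsHistoryProvider no longer tries to parse it.
hdfs dfs -rm /spark-history/.a5555e556-3301-433e-44de-23311665ed
```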
Labels: Apache Spark
06-07-2016
04:05 PM
@Sri Bandaru You can use Quest authentication/authorization services. We use it in production to grant a TGT when you log in to the box.
06-07-2016
03:49 PM
@Alex Miller Well, in our scenario both ADs are more or less replicas. Anyway, I got this fixed.
06-07-2016
03:48 PM
Well I fixed this using REST APIs. Thanks.