Member since: 05-09-2016
Posts: 280
Kudos Received: 58
Solutions: 31

My Accepted Solutions
Title | Views | Posted
---|---|---
| 3744 | 03-28-2018 02:12 PM
| 3022 | 01-09-2018 09:05 PM
| 1649 | 12-13-2016 05:07 AM
| 5023 | 12-12-2016 02:57 AM
| 4309 | 12-08-2016 07:08 PM
09-30-2016 04:23 AM

@Andrew Ryan Thank you for trying so hard with me. I tried both commands. The first one did not go through; it ends with the following message: "This command requires a running instance of Knox to be present on the same machine. It will execute a test to make sure all services are accessible through the gateway URLs. Errors are reported and suggestions to resolve any problems are returned. JSON formatted." The second one reported that LDAP authentication was successful. I also checked that there are no Knox zombie processes running. I have attached the gateway.log file: gatewaylog.txt
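For reference, the two checks discussed above can be run from the Knox CLI. A minimal sketch, assuming the HDP 2.5 sandbox layout, the demo LDAP credentials used elsewhere in this thread, and the knox_sample cluster name (host and port are also assumptions):

```bash
# Sketch only: assumed GATEWAY_HOME, cluster name, and demo LDAP credentials.
cd /usr/hdp/current/knox-server

# End-to-end service test; needs a running Knox instance on the same machine
# (this is the command whose usage text is quoted above).
bin/knoxcli.sh service-test --cluster knox_sample --hostname localhost --port 8443 --u admin --p admin-password

# LDAP authentication test for the topology (the check that reported success here).
bin/knoxcli.sh user-auth-test --cluster knox_sample --u admin --p admin-password
```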
09-29-2016 11:52 PM

There was nothing for guest, but I did get something, though it is just a warning:

WARN webapp.WebAppContext (WebAppContext.java:doStart(514)) - Failed startup of context o.e.j.w.WebAppContext@62811100{/gateway/knox_sample,null,null}{/usr/hdp/2.5.0.0-1245/knox/bin/../data/deployments/knox_sample.topo.157233d52e0/%2F}
java.lang.IllegalStateException: Failed to delete temp dir /var/lib/knox/data-2.5.0.0-1245/deployments/knox_sample.topo.157233d52e0/%2F/META-INF/temp

This warning appears for all 4 topologies.
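A clean-up that is sometimes tried for this "Failed to delete temp dir" symptom is to stop the gateway, remove the generated deployment directories so Knox redeploys every topology from scratch, and start it again. A sketch under those assumptions (paths taken from the warning above; not a confirmed fix for this thread, and if Knox is managed by Ambari the stop/start should be done from Ambari instead):

```bash
# Sketch only: force Knox to redeploy its topologies. Paths come from the warning above.
su -l knox -c '/usr/hdp/current/knox-server/bin/gateway.sh stop'

# Remove the generated deployments; Knox rebuilds them from conf/topologies at startup.
rm -rf /var/lib/knox/data-2.5.0.0-1245/deployments/*

su -l knox -c '/usr/hdp/current/knox-server/bin/gateway.sh start'
```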
09-29-2016 04:46 PM

Hello @Andrew Ryan, I just compared both of them and they are the same. I also found that I am not able to connect to the other topologies (default, admin, and knoxsso) either.
09-29-2016 04:21 AM

Yes, I am able to contact webhdfs without Knox. Further, I am not able to access the sample file that you have attached.
09-28-2016 11:19 PM

Thank you @Andrew Ryan for replying. I checked /usr/hdp/current/knox-server/conf/topologies/knox_sample.xml; it has this section:

<service>
  <role>WEBHDFS</role>
  <url>http://sandbox.hortonworks.com:50070/webhdfs</url>
</service>

Also, the webhdfs server is listening on port 50070. The log file does not report any error:

2016-09-28 18:53:39,285 INFO hadoop.gateway (GatewayServer.java:logSysProp(193)) - System Property: user.name=knox
2016-09-28 18:53:39,287 INFO hadoop.gateway (GatewayServer.java:logSysProp(193)) - System Property: user.dir=/home/knox
2016-09-28 18:53:39,287 INFO hadoop.gateway (GatewayServer.java:logSysProp(193)) - System Property: java.runtime.name=OpenJDK Runtime Environment
2016-09-28 18:53:39,287 INFO hadoop.gateway (GatewayServer.java:logSysProp(193)) - System Property: java.runtime.version=1.7.0_111-mockbuild_2016_07_27_10_11-b00
2016-09-28 18:53:39,287 INFO hadoop.gateway (GatewayServer.java:logSysProp(193)) - System Property: java.home=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.111.x86_64/jre
2016-09-28 18:53:39,522 INFO hadoop.gateway (GatewayConfigImpl.java:loadConfigResource(321)) - Loading configuration resource jar:file:/usr/hdp/2.5.0.0-1245/knox/bin/../lib/gateway-server-0.9.0.2.5.0.0-1245.jar!/conf/gateway-default.xml
2016-09-28 18:53:39,530 INFO hadoop.gateway (GatewayConfigImpl.java:loadConfigFile(309)) - Loading configuration file /usr/hdp/2.5.0.0-1245/knox/bin/../conf/gateway-site.xml
2016-09-28 18:53:39,614 INFO hadoop.gateway (GatewayConfigImpl.java:initGatewayHomeDir(253)) - Using /usr/hdp/2.5.0.0-1245/knox/bin/.. as GATEWAY_HOME via system property.
2016-09-28 18:53:40,111 INFO hadoop.gateway (JettySSLService.java:init(95)) - Credential store for the gateway instance found - no need to create one.
2016-09-28 18:53:40,151 INFO hadoop.gateway (JettySSLService.java:init(117)) - Keystore for the gateway instance found - no need to create one.
2016-09-28 18:53:40,156 INFO hadoop.gateway (JettySSLService.java:logAndValidateCertificate(146)) - The Gateway SSL certificate is issued to hostname: sandbox.hortonworks.com.
2016-09-28 18:53:40,157 INFO hadoop.gateway (JettySSLService.java:logAndValidateCertificate(149)) - The Gateway SSL certificate is valid between: 9/13/16 10:56 AM and 9/13/17 10:56 AM.
2016-09-28 18:53:40,454 INFO hadoop.gateway (GatewayServer.java:startGateway(279)) - Starting gateway...
2016-09-28 18:53:40,797 INFO hadoop.gateway (GatewayServer.java:start(379)) - Loading topologies from directory: /usr/hdp/2.5.0.0-1245/knox/bin/../conf/topologies
2016-09-28 18:53:40,950 INFO hadoop.gateway (GatewayServer.java:handleCreateDeployment(655)) - Loading topology knoxsso from /usr/hdp/2.5.0.0-1245/knox/bin/../data/deployments/knoxsso.topo.157239f6c28
2016-09-28 18:53:40,951 INFO hadoop.gateway (GatewayServer.java:internalActivateTopology(524)) - Activating topology knoxsso
2016-09-28 18:53:40,974 INFO hadoop.gateway (GatewayServer.java:internalActivateArchive(534)) - Activating topology knoxsso archive %2F
2016-09-28 18:53:41,015 INFO hadoop.gateway (GatewayServer.java:internalActivateArchive(534)) - Activating topology knoxsso archive %2Fknoxauth
2016-09-28 18:53:41,100 INFO hadoop.gateway (GatewayServer.java:handleCreateDeployment(655)) - Loading topology admin from /usr/hdp/2.5.0.0-1245/knox/bin/../data/deployments/admin.topo.15723310670
2016-09-28 18:53:41,101 INFO hadoop.gateway (GatewayServer.java:internalActivateTopology(524)) - Activating topology admin
2016-09-28 18:53:41,102 INFO hadoop.gateway (GatewayServer.java:internalActivateArchive(534)) - Activating topology admin archive %2F
2016-09-28 18:53:41,103 INFO hadoop.gateway (GatewayServer.java:handleCreateDeployment(655)) - Loading topology default from /usr/hdp/2.5.0.0-1245/knox/bin/../data/deployments/default.topo.15723310670
2016-09-28 18:53:41,103 INFO hadoop.gateway (GatewayServer.java:internalActivateTopology(524)) - Activating topology default
2016-09-28 18:53:41,104 INFO hadoop.gateway (GatewayServer.java:internalActivateArchive(534)) - Activating topology default archive %2F
2016-09-28 18:53:41,105 INFO hadoop.gateway (GatewayServer.java:handleCreateDeployment(655)) - Loading topology knox_sample from /usr/hdp/2.5.0.0-1245/knox/bin/../data/deployments/knox_sample.topo.157233d52e0
2016-09-28 18:53:41,105 INFO hadoop.gateway (GatewayServer.java:internalActivateTopology(524)) - Activating topology knox_sample
2016-09-28 18:53:41,106 INFO hadoop.gateway (GatewayServer.java:internalActivateArchive(534)) - Activating topology knox_sample archive %2F
2016-09-28 18:53:41,289 INFO hadoop.gateway (GatewayServer.java:start(395)) - Monitoring topologies in directory: /usr/hdp/2.5.0.0-1245/knox/bin/../conf/topologies
2016-09-28 18:53:41,290 INFO hadoop.gateway (GatewayServer.java:startGateway(294)) - Started gateway on port 8,443.
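Since the topology points straight at WebHDFS on port 50070, one quick sanity check is to call WebHDFS directly, bypassing Knox, and compare the result with the gateway call. A minimal sketch using the standard WebHDFS REST API (the user name is only an example):

```bash
# Call WebHDFS directly on the NameNode, bypassing Knox.
# "guest" is just an example; any user known to HDFS works for a read-only LISTSTATUS.
curl -sS 'http://sandbox.hortonworks.com:50070/webhdfs/v1/?op=LISTSTATUS&user.name=guest'
```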
09-27-2016 06:44 PM

Hello guys, I am using the HDP 2.5 Sandbox and have started Knox as well as the Demo LDAP service. Now when I run:

curl -k -u admin:admin-password 'https://127.0.0.1:8443/gateway/knox_sample/webhdfs/v1?op=LISTSTATUS'

I get the following response:

<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>
<title>Error 503 </title>
</head>
<body>
<h2>HTTP ERROR: 503</h2>
<p>Problem accessing /gateway/knox_sample/webhdfs/v1. Reason:
<pre> Service Unavailable</pre></p>
<hr /><i><small>Powered by Jetty://</small></i>
</body>
</html>

Has anyone experienced this before? Thanks in advance.
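When chasing a 503 like this, it can help to repeat the request verbosely while following the gateway log, so the failing dispatch shows up next to the response. A sketch, assuming the usual HDP sandbox log location:

```bash
# Same request, but verbose, so response headers and the TLS handshake are visible.
curl -ivk -u admin:admin-password 'https://127.0.0.1:8443/gateway/knox_sample/webhdfs/v1?op=LISTSTATUS'

# In a second terminal, follow the gateway log while repeating the request
# (log path assumed for an HDP sandbox; adjust if Knox logs elsewhere).
tail -f /var/log/knox/gateway.log
```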
Labels:
- Apache Hadoop
- Apache Knox
09-27-2016 03:52 PM

Hi @Robbert Naastepad, as spotted by @Michael Young, you can try changing the data type of the totmiles column to double. Drop the riskfactor table from Hive and create it again with:

drop table riskfactor;
CREATE TABLE riskfactor (driverid string, events bigint, totmiles double, riskfactor float) STORED AS ORC;

Let us know if this works.
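If you prefer to do it from the shell, the same drop/recreate can be run with the Hive CLI and then verified; only the table definition comes from the answer above, the verification step is an assumption:

```bash
# Sketch: recreate the table with totmiles as double, then confirm the column type.
hive -e "DROP TABLE IF EXISTS riskfactor;
CREATE TABLE riskfactor (driverid string, events bigint, totmiles double, riskfactor float) STORED AS ORC;"

# totmiles should now be reported as double.
hive -e "DESCRIBE riskfactor;"
```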
09-23-2016 06:15 PM

@Vasilis Vagias, you just have to go to Ambari => Services => Hive => Configs and change the value of the property atlas.hook.hive.synchronous to true; it is false by default. You can also follow the Cross Component Lineage tutorial, where we walk through lineage for MySQL-Sqoop-Hive and Kafka-Storm: http://hortonworks.com/hadoop-tutorial/cross-component-lineage-apache-atlas/
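To double-check what the Hive hook is currently using outside the Ambari UI, you can grep the client configuration on the Hive host. A sketch; the exact file that carries the property varies by HDP version, so the whole conf directory is searched (an assumption, not something from the thread):

```bash
# Check the current value of the Atlas hook setting on the Hive host.
# Depending on the HDP version, the property lives in hive-site.xml or in the Atlas hook properties file.
grep -rn 'atlas.hook.hive.synchronous' /etc/hive/conf/ 2>/dev/null
```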
09-03-2016 06:50 AM

It's working now; I was missing the -n option in the command. It should be:

echo -n tom | sha256sum
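The difference is only the trailing newline that plain echo appends, which changes the digest. A quick illustration:

```bash
# Without -n, echo appends a newline, so the hash covers "tom" plus "\n".
echo tom | sha256sum

# With -n, only the three bytes of "tom" are hashed.
echo -n tom | sha256sum

# printf without a newline gives the same digest as echo -n.
printf 'tom' | sha256sum
```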
09-03-2016 12:14 AM

It worked, thanks a lot @jhorsch. I was missing some mandatory fields in the JSON payload, which was the reason for the error.