Member since: 08-11-2017
Posts: 9
Kudos Received: 0
Solutions: 0
10-24-2017
04:26 PM
Hi, I have an uncommon question: what is Tez, in general? I've read a lot over the last few days (including the official paper and the various descriptions on the Hortonworks/Tez/... websites), and I still don't get the point.
So far I have understood that it's an improvement over MR, because it offers DAGs so that intermediate HDFS writes can be avoided. It is also more of an interface for tools like Pig and Hive than for application developers, and for DAG-related applications you should rather use Spark. Why exactly?
And how does Tez work? How are the DAGs executed? I've read several times that it's more a task executor than an engine. Given that statement, I'm asking myself: which executor is used? MR? I couldn't find anything written about this. Additionally, in cluster-architecture diagrams, Tez sits below MR, Spark and other engines. Or did I misunderstand this completely and there is an engine running behind Tez?
It would be great if someone could shed some light on this. Thanks, Warius
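Edit: one concrete way to see the DAG idea is the orderedwordcount example that ships with Tez: it runs tokenize → sum → order as a single three-vertex DAG, so there is no intermediate HDFS write between the stages (plain MR would need two chained jobs for that). A minimal sketch of running it; the jar path and the HDFS input/output directories are assumptions based on an HDP 2.6 layout:
[usr@HOST ~]$ # jar location is an assumption, adjust to your install
[usr@HOST ~]$ hadoop jar /usr/hdp/current/tez-client/tez-examples-*.jar orderedwordcount /tmp/wc-in /tmp/wc-out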
Labels:
- Apache Tez
10-12-2017
01:58 PM
Oh, sorry, I forgot to mention that I also tried this, and the same error as in the other attempts appears:
[usr@HOST ~]$ mysql -u root -p -hlocalhost
Enter password:
ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: YES)
10-12-2017
01:43 PM
Hello,
for this fix I need root access to the MySQL instance on the node where Hive is running. But I noticed that I never set a root password anywhere during the installation process: not during the Ambari installation and not during the HDP installation. I only set the Hive DB password (which is not the root password). When I try to execute the command suggested in the SO question above (mysql -u root -p -hlocalhost), the Hive password doesn't work, and neither does the default password 'admin'. So, is there a trick? What is the password? Or did I make a mistake somewhere? I'd like to have full access and control over my cluster and its components ;D Thanks, Warius
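Edit: for the record, the generic way to regain root access to a MySQL instance with an unknown password is to restart it with grant checks disabled and set a new one. A sketch, assuming MySQL 5.5/5.6 with systemd (on MySQL 5.7+ the column is authentication_string, and the service may be called mariadb instead of mysqld):
[usr@HOST ~]$ sudo systemctl stop mysqld
[usr@HOST ~]$ sudo mysqld_safe --skip-grant-tables --skip-networking &
[usr@HOST ~]$ mysql -u root
mysql> UPDATE mysql.user SET Password=PASSWORD('NewRootPass') WHERE User='root';
mysql> FLUSH PRIVILEGES;
mysql> quit
[usr@HOST ~]$ sudo mysqladmin shutdown
[usr@HOST ~]$ sudo systemctl start mysqld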
Labels:
- Apache Hive
10-08-2017
05:25 PM
Hello. So, are there no options other than using Kerberos? Perhaps a nice feature request for Spark...
09-01-2017
05:03 PM
Hi, okay. Currently I'm not using Kerberos. So another way would be nice.
08-29-2017
11:12 AM
Hi,
similar to this question, I want to disable the "kill" link for a listed active stage in the ApplicationMaster web UI of a Spark job (see picture below). Is this possible?
Thanks!
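Edit: one setting that looks relevant is spark.ui.killEnabled, which controls whether jobs and stages can be killed from the web UI (it defaults to true). A minimal sketch of setting it at submit time; the class and jar names are placeholders:
[usr@HOST ~]$ spark-submit --conf spark.ui.killEnabled=false --class com.example.MyApp my-app.jar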
Labels:
- Apache Spark
- Apache YARN
08-11-2017
06:07 PM
Of course! I also use HTTPS for Ambari Web, with port 8443. I think that's the solution, great! Thank you. Can I change the port by re-executing step 3 in this article?
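Edit: for reference, re-running ambari-server setup-security and picking the "Enable HTTPS for Ambari server" option prompts again for the SSL port, so a different port can be entered there (this assumes the article's step 3 is the setup-security step); the server then needs a restart:
[usr@HOST ~]$ ambari-server setup-security
[usr@HOST ~]$ ambari-server restart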
08-11-2017
05:02 PM
Hi there.
I'm setting up a small cluster of three nodes with Ambari. The nodes are all freshly installed and used only for HDP.
During the setup everything works well, but when the Knox Gateway is started, the whole start process aborts due to a recurring error:
2017-08-11 16:03:38,222 INFO hadoop.gateway (GatewayServer.java:logSysProp(197)) - System Property: user.name=knox
2017-08-11 16:03:38,224 INFO hadoop.gateway (GatewayServer.java:logSysProp(197)) - System Property: user.dir=/data/home/knox
2017-08-11 16:03:38,224 INFO hadoop.gateway (GatewayServer.java:logSysProp(197)) - System Property: java.runtime.name=OpenJDK Runtime Environment
2017-08-11 16:03:38,224 INFO hadoop.gateway (GatewayServer.java:logSysProp(197)) - System Property: java.runtime.version=1.8.0_141-b16
2017-08-11 16:03:38,224 INFO hadoop.gateway (GatewayServer.java:logSysProp(197)) - System Property: java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.141-2.b16.el7_4.x86_64/jre
2017-08-11 16:03:38,355 INFO hadoop.gateway (GatewayConfigImpl.java:loadConfigResource(367)) - Loading configuration resource jar:file:/usr/hdp/2.6.1.0-129/knox/bin/../lib/gateway-server-0.12.0.2.6.1.0-129.jar!/conf/gateway-default.xml
2017-08-11 16:03:38,360 INFO hadoop.gateway (GatewayConfigImpl.java:loadConfigFile(355)) - Loading configuration file /usr/hdp/2.6.1.0-129/knox/bin/../conf/gateway-site.xml
2017-08-11 16:03:38,380 INFO hadoop.gateway (GatewayConfigImpl.java:initGatewayHomeDir(299)) - Using /usr/hdp/2.6.1.0-129/knox/bin/.. as GATEWAY_HOME via system property.
2017-08-11 16:03:38,695 INFO hadoop.gateway (JettySSLService.java:init(95)) - Credential store for the gateway instance found - no need to create one.
2017-08-11 16:03:38,706 INFO hadoop.gateway (JettySSLService.java:init(117)) - Keystore for the gateway instance found - no need to create one.
2017-08-11 16:03:38,709 INFO hadoop.gateway (JettySSLService.java:logAndValidateCertificate(146)) - The Gateway SSL certificate is issued to hostname: xxx.yyy.zz.
2017-08-11 16:03:38,710 INFO hadoop.gateway (JettySSLService.java:logAndValidateCertificate(149)) - The Gateway SSL certificate is valid between: 8/11/17 4:03 PM and 8/11/18 4:03 PM.
2017-08-11 16:03:38,945 INFO hadoop.gateway (GatewayServer.java:startGateway(283)) - Starting gateway...
2017-08-11 16:03:39,054 FATAL hadoop.gateway (GatewayServer.java:main(155)) - Failed to start gateway: java.net.BindException: Address already in use (Bind failed)
(Source: /var/log/knox/gateway.log)
So, "Address already in use" is the problem, which I try to solve. I assume that port 8443 is meant. I don't understand that. Nothing different than HDP is running on the machine, that could occupy the port, I also didn't change anything corresponding during the cluster-configuration, so all ports are the default ones and there shouldn't be any conflicts.
Can anybody help me with this problem?
Tags:
- knox-gateway
- Security
Labels:
- Apache Knox