Member since: 10-20-2016
Posts: 106
Kudos Received: 0
Solutions: 0
12-23-2019
01:45 AM
I have created a new notebook in Zeppelin but am unable to open it. Can someone help with this?
Attaching the screenshot.
Labels: Apache Zeppelin
12-20-2019
05:00 AM
Hi Team,
I have installed the NiFi service in an existing HDP cluster and need to integrate it with AD, since the NiFi web UI currently has no authentication. Can someone help with this?
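For AD-backed login, NiFi uses an LDAP login identity provider; note that NiFi only enforces username/password authentication when the UI is served over HTTPS, so TLS needs to be set up first. Below is a sketch of a login-identity-providers.xml entry — the AD host, bind DN, password, and search base are all placeholders to adapt to your domain:

```xml
<!-- Sketch only: every host name, DN and password below is a placeholder. -->
<provider>
    <identifier>ldap-provider</identifier>
    <class>org.apache.nifi.ldap.LdapProvider</class>
    <property name="Authentication Strategy">SIMPLE</property>
    <property name="Manager DN">CN=nifi-bind,OU=ServiceAccounts,DC=example,DC=com</property>
    <property name="Manager Password">changeme</property>
    <property name="Url">ldap://ad.example.com:389</property>
    <property name="User Search Base">OU=Users,DC=example,DC=com</property>
    <property name="User Search Filter">sAMAccountName={0}</property>
    <property name="Identity Strategy">USE_USERNAME</property>
    <property name="Authentication Expiration">12 hours</property>
</provider>
```

The provider is then selected in nifi.properties via nifi.security.user.login.identity.provider=ldap-provider (on an Ambari-managed install, make the equivalent change through the NiFi configuration screens in Ambari).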
Labels: Apache NiFi
12-20-2019
03:53 AM
@Shelton I changed it to 644, but after starting the NodeManager it reverts to 444.
Before: -rw-r--r-- 1 yarn hadoop 6 Dec 20 05:00 hadoop-yarn-nodemanager.pid
After: -r--r--r-- 1 yarn hadoop 6 Dec 20 05:00 hadoop-yarn-nodemanager.pid
I am not able to find out why it changes back to 444 even though I set the permission manually.
12-20-2019
02:07 AM
@Shelton I tried the solution below, but the pid file is still created with 444 permissions even after multiple restarts:
-r--r--r-- 1 yarn hadoop 6 Dec 20 05:00 hadoop-yarn-nodemanager.pid
The issue above still persists:
resource_management.core.exceptions.ExecutionFailed: Execution of 'ulimit -c unlimited; export HADOOP_LIBEXEC_DIR=/usr/hdp/3.0.1.0-187/hadoop/libexec && /usr/hdp/3.0.1.0-187/hadoop-yarn/bin/yarn --config /usr/hdp/3.0.1.0-187/hadoop/conf --daemon start nodemanager' returned 1.
-bash: line 0: ulimit: core file size: cannot modify limit: Operation not permitted
/usr/hdp/3.0.1.0-187/hadoop/libexec/hadoop-functions.sh: line 1847: /var/run/hadoop-yarn/yarn/hadoop-yarn-nodemanager.pid: Permission denied
ERROR: Cannot write nodemanager pid /var/run/hadoop-yarn/yarn/hadoop-yarn-nodemanager.pid.
/usr/hdp/3.0.1.0-187/hadoop/libexec/hadoop-functions.sh: line 1866: /var/log/hadoop-yarn/yarn/hadoop-yarn-nodemanager-Hostname.org.out: Permission denied
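One commonly reported cause of this symptom is a stale pid file (or a pid directory owned by the wrong user) left behind by an earlier failed start, which the yarn user then cannot overwrite. The sketch below exercises the fix pattern against a scratch directory so it is safe to run anywhere; on the actual NodeManager host you would substitute PID_DIR=/var/run/hadoop-yarn/yarn, run as root, and additionally run chown -R yarn:hadoop on the directory before restarting from Ambari:

```shell
# Demonstration of the pid-file fix pattern against a scratch directory.
# On the real host: PID_DIR=/var/run/hadoop-yarn/yarn, run as root,
# and also: chown -R yarn:hadoop "$PID_DIR"
PID_DIR="$(mktemp -d)/yarn"
mkdir -p "$PID_DIR"
chmod 755 "$PID_DIR"

# Simulate the stale, read-only pid file left behind by a failed start.
touch "$PID_DIR/hadoop-yarn-nodemanager.pid"
chmod 444 "$PID_DIR/hadoop-yarn-nodemanager.pid"

# The fix: remove the stale pid file so the daemon can recreate it.
rm -f "$PID_DIR/hadoop-yarn-nodemanager.pid"

# After the fix, the daemon user can write its pid again with sane modes.
echo 12345 > "$PID_DIR/hadoop-yarn-nodemanager.pid"
chmod 644 "$PID_DIR/hadoop-yarn-nodemanager.pid"
ls -l "$PID_DIR/hadoop-yarn-nodemanager.pid"
```

The same check applies to the log path in the error (/var/log/hadoop-yarn/yarn): it must be owned and writable by the yarn user as well.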
12-19-2019
05:16 AM
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/YARN/package/scripts/nodemanager.py", line 102, in <module>
    Nodemanager().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 351, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/YARN/package/scripts/nodemanager.py", line 53, in start
    service('nodemanager', action='start')
  File "/usr/lib/ambari-agent/lib/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/YARN/package/scripts/service.py", line 93, in service
    Execute(daemon_cmd, user = usr, not_if = check_process)
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 263, in action_run
    returns=self.resource.returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy, returns=returns)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ulimit -c unlimited; export HADOOP_LIBEXEC_DIR=/usr/hdp/3.0.1.0-187/hadoop/libexec && /usr/hdp/3.0.1.0-187/hadoop-yarn/bin/yarn --config /usr/hdp/3.0.1.0-187/hadoop/conf --daemon start nodemanager' returned 1.
-bash: line 0: ulimit: core file size: cannot modify limit: Operation not permitted
/usr/hdp/3.0.1.0-187/hadoop/libexec/hadoop-functions.sh: line 1847: /var/run/hadoop-yarn/yarn/hadoop-yarn-nodemanager.pid: Permission denied
ERROR: Cannot write nodemanager pid /var/run/hadoop-yarn/yarn/hadoop-yarn-nodemanager.pid.
/usr/hdp/3.0.1.0-187/hadoop/libexec/hadoop-functions.sh: line 1866: /var/log/hadoop-yarn/yarn/hadoop-yarn-nodemanager
Labels: Apache YARN
12-19-2019
05:00 AM
I installed a DataNode and tried to start it via Ambari, but it throws an error. I checked the logs in /var/log/hadoop/hdfs but could not understand the issue. Has anyone faced this issue before?
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1417)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:500)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2782)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2690)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2732)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2876)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2900)
2019-12-19 07:42:35,127 INFO server.AbstractConnector (AbstractConnector.java:doStart(278)) - Started ServerConnector@53812a9b{HTTP/1.1,[http/1.1]}{localhost:41704}
2019-12-19 07:42:35,127 INFO server.Server (Server.java:doStart(414)) - Started @3594ms
2019-12-19 07:42:35,131 INFO server.AbstractConnector (AbstractConnector.java:doStop(318)) - Stopped ServerConnector@53812a9b{HTTP/1.1,[http/1.1]}{localhost:0}
2019-12-19 07:42:35,131 INFO handler.ContextHandler (ContextHandler.java:doStop(910)) - Stopped o.e.j.w.WebAppContext@cb191ca{/,null,UNAVAILABLE}{/datanode}
2019-12-19 07:42:35,137 INFO datanode.DataNode (DataNode.java:shutdown(2134)) - Shutdown complete.
2019-12-19 07:42:35,137 ERROR datanode.DataNode (DataNode.java:secureMain(2883)) - Exception in secureMain
java.io.IOException: Problem starting http server
        at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:1165)
        at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.<init>(DatanodeHttpServer.java:141)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startInfoServer(DataNode.java:957)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1417)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:500)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2782)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2690)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2732)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2876)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2900)
Caused by: java.lang.NullPointerException
        at org.eclipse.jetty.util.IO.delete(IO.java:344)
        at org.eclipse.jetty.webapp.WebInfConfiguration.deconfigure(WebInfConfiguration.java:195)
        at org.eclipse.jetty.webapp.WebAppContext.stopContext(WebAppContext.java:1380)
        at org.eclipse.jetty.server.handler.ContextHandler.doStop(ContextHandler.java:880)
        at org.eclipse.jetty.servlet.ServletContextHandler.doStop(ServletContextHandler.java:272)
        at org.eclipse.jetty.webapp.WebAppContext.doStop(WebAppContext.java:546)
        at org.eclipse.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:89)
        at org.eclipse.jetty.util.component.ContainerLifeCycle.stop(ContainerLifeCycle.java:142)
        at org.eclipse.jetty.util.component.ContainerLifeCycle.doStop(ContainerLifeCycle.java:160)
        at org.eclipse.jetty.server.handler.AbstractHandler.doStop(AbstractHandler.java:73)
        at org.eclipse.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:89)
        at org.eclipse.jetty.util.component.ContainerLifeCycle.stop(ContainerLifeCycle.java:142)
        at org.eclipse.jetty.util.component.ContainerLifeCycle.doStop(ContainerLifeCycle.java:160)
        at org.eclipse.jetty.server.handler.AbstractHandler.doStop(AbstractHandler.java:73)
        at org.eclipse.jetty.server.Server.doStop(Server.java:493)
        at org.eclipse.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:89)
        at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:1155)
        ... 9 more
2019-12-19 07:42:35,139 INFO util.ExitUtil (ExitUtil.java:terminate(210)) - Exiting with status 1: java.io.IOException: Problem starting http server
2019-12-19 07:42:35,143 INFO datanode.DataNode (LogAdapter.java:info(51)) - SHUTDOWN_MSG:
/************************************************************
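The root failure here is "Problem starting http server"; the NullPointerException in WebInfConfiguration.deconfigure happens while Jetty tears down the half-started web app. Two things commonly reported with this symptom are the DataNode HTTP port already being in use and problems with the temp directory Jetty unpacks the web app into. The diagnostics below are a sketch only — port 50075 is the HDP default for dfs.datanode.http.address, so verify it against your own hdfs-site.xml:

```shell
# Diagnostic sketch for a DataNode "Problem starting http server" failure.
# 50075 is the HDP default for dfs.datanode.http.address; adjust as needed.
HTTP_PORT=50075

# 1. Is another process already bound to the DataNode HTTP port?
if command -v ss >/dev/null 2>&1; then
  ss -ltn | grep -q ":$HTTP_PORT " \
    && echo "port $HTTP_PORT in use" \
    || echo "port $HTTP_PORT free"
fi

# 2. Jetty unpacks the DataNode web app under java.io.tmpdir (/tmp by
#    default); a full or non-writable /tmp has been reported as a cause
#    of the NullPointerException seen during web app teardown.
TMP_PERMS=$(stat -c %a /tmp)
echo "permissions on /tmp: $TMP_PERMS (expect 1777)"
df -h /tmp
```

If the port is taken, either stop the conflicting process or change dfs.datanode.http.address; if /tmp is full or has wrong permissions, free space or restore mode 1777 and restart the DataNode from Ambari.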
Labels: Apache Ambari
11-28-2019
03:12 AM
@Shelton I can see some improvement in the UI after clearing the old logs from the DB.