Member since: 03-02-2021
Posts: 25
Kudos Received: 2
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 4058 | 03-16-2021 08:53 AM |
05-04-2021
05:10 AM
It's an existing cluster where I am trying to add Ranger.
05-04-2021
12:45 AM
Ranger installation fails because the Ranger fileset is missing from the repo. I don't see any ranger-admin package in the repository https://archive.cloudera.com/p/HDP/3.x/3.1.4.0/centos7-ppc/ranger/

raise RuntimeError(message)
RuntimeError: Failed to execute command '/usr/bin/yum -y install ranger_3_1_4_0_315-admin', exited with code '1', message: 'https://archive.cloudera.com/p/HDP/3.x/3.1.4.0/centos7-ppc/ranger/ranger_3_1_4_0_315-admin-1.2.0.3.1.4.0-315.ppc64le.rpm: [Errno 14] Error 404 - The requested URL returned error: 404 Not Found
Trying other mirror.
To address this issue please refer to the below knowledge base article
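For what it's worth, a quick way to confirm whether the package is really absent at that path (note that archive.cloudera.com/p/ repos sit behind paywall credentials, and the yum repo ID below is a guess, so check /etc/yum.repos.d/ for the real one):

# Hit the exact RPM URL from the error and look at the status line (expect 404 if it is missing):
curl -sI https://archive.cloudera.com/p/HDP/3.x/3.1.4.0/centos7-ppc/ranger/ranger_3_1_4_0_315-admin-1.2.0.3.1.4.0-315.ppc64le.rpm | head -1
# List what the configured HDP repo actually publishes for Ranger (repo ID assumed):
yum --disablerepo='*' --enablerepo='HDP-3.1*' list available 'ranger*'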
04-26-2021
07:27 AM
Getting the below error after integrating with Kerberos:

Error starting ResourceManager
org.apache.hadoop.service.ServiceStateException: java.io.IOException: DestHost:destPort ces1pub.pbm.ihost.com:8020 , LocalHost:localPort cdp2pub.pbm.ihost.com/129.40.6.167:0. Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.
    at org.apache.hadoop.service.ServiceStateException.convert(ServiceStateException.java:105)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:203)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:866)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1269)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1310)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1306)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1898)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1306)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1357)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1547)
Caused by: java.io.IOException: DestHost:destPort ces1pub.pbm.ihost.com:8020 , LocalHost:localPort cdp2pub.pbm.ihost.com/129.40.6.167:0. Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:806)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1566)
    at org.apache.hadoop.ipc.Client.call(Client.java:1508)
    at org.apache.hadoop.ipc.Client.call(Client.java:1405)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
    at com.sun.proxy.$Proxy89.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:666)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:431)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:166)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:158)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:96)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:362)
    at com.sun.proxy.$Proxy90.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2463)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2439)
    at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1476)
    at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1473)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1490)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1465)
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:2374)
    at org.apache.hadoop.yarn.server.resourcemanager.recovery.FileSystemRMStateStore$3.run(FileSystemRMStateStore.java:679)
    at org.apache.hadoop.yarn.server.resourcemanager.recovery.FileSystemRMStateStore$3.run(FileSystemRMStateStore.java:676)
    at org.apache.hadoop.yarn.server.resourcemanager.recovery.FileSystemRMStateStore$FSAction.runWithRetries(FileSystemRMStateStore.java:792)
    at org.apache.hadoop.yarn.server.resourcemanager.recovery.FileSystemRMStateStore.mkdirsWithRetries(FileSystemRMStateStore.java:682)
    at org.apache.hadoop.yarn.server.resourcemanager.recovery.FileSystemRMStateStore.startInternal(FileSystemRMStateStore.java:160)
    at org.apache.hadoop.yarn.server.resourcemanager.recovery.RMStateStore.serviceStart(RMStateStore.java:824)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
    ... 12 more
Caused by: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:851)
    at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:413)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1636)
    at org.apache.hadoop.ipc.Client.call
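From what I can tell, the "fall back to SIMPLE auth" message usually means the HDFS side (ces1pub here) is still answering without Kerberos while this ResourceManager host expects secure connections. A rough comparison of the two hosts with standard Hadoop tools (the conf path is the usual one and may differ on this cluster):

# Run on both hosts -- the effective value should be 'kerberos' everywhere:
hdfs getconf -confKey hadoop.security.authentication
grep -A1 'hadoop.security.authentication' /etc/hadoop/conf/core-site.xml
# And confirm a valid ticket exists on the ResourceManager host:
klist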
Labels:
- Cloudera Data Platform (CDP)
- Kerberos
04-22-2021
06:55 AM
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 314, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh -H -E /usr/hdp/3.1.4.0-315/hadoop/bin/hdfs --config /usr/hdp/3.1.4.0-315/hadoop/conf --daemon start datanode' returned 1. ERROR: Cannot set priority of datanode process 45359
stdout:
2021-04-22 03:25:38,875 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=3.1.4.0-315 -> 3.1.4.0-315
2021-04-22 03:25:38,931 - Using hadoop conf dir: /usr/hdp/3.1.4.0-315/hadoop/conf
2021-04-22 03:25:39,273 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=3.1.4.0-315 -> 3.1.4.0-315
2021-04-22 03:25:39,289 - Using hadoop conf dir: /usr/hdp/3.1.4.0-315/hadoop/conf
2021-04-22 03:25:39,292 - Group['hdfs'] {}
2021-04-22 03:25:39,294 - Group['hadoop'] {}
2021-04-22 03:25:39,295 - Group['users'] {}
2021-04-22 03:25:39,296 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2021-04-22 03:25:39,297 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
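The "Cannot set priority of datanode process" line from the hdfs --daemon wrapper is generic; the actual failure normally shows up in the DataNode .log/.out files. The paths below are the usual HDP defaults and may differ on this cluster:

# Tail the DataNode logs on the failing host for the real startup error:
tail -n 100 /var/log/hadoop/hdfs/hadoop-hdfs-datanode-$(hostname -f).log
tail -n 50 /var/log/hadoop/hdfs/hadoop-hdfs-datanode-$(hostname -f).out
# On a Kerberized cluster, also review the secure DataNode settings:
grep -E 'dfs\.datanode\.(address|http\.address|keytab|kerberos)' /etc/hadoop/conf/hdfs-site.xml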
Labels:
- Kerberos
03-18-2021
03:23 AM
One more thing: I only see livy, md, and angular as the interpreter options. Why am I not seeing the python and sh options as well?
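In case it is useful to see what the server has registered (as opposed to what the UI shows), the interpreter settings can be listed over Zeppelin's REST API; the port and credentials here are only assumptions for this cluster:

# List the registered interpreter settings (adjust host, port and login to your Zeppelin server):
curl -s -u admin:admin http://localhost:8885/api/interpreter/setting | python -m json.tool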
03-18-2021
03:21 AM
@Scharan the Livy URL is http://localhost:8998. When I tried to access the web UI it was not responding, so I restarted the Livy server:

[root@cdp1pub spark]# export SPARK_HOME=/opt/cloudera/parcels/CDH-7.1.4-1.cdh7.1.4.p0.6300266/lib/spark/
[root@cdp1pub hadoop]# export HADOOP_CONF_DIR=/opt/cloudera/parcels/CDH-7.1.4-1.cdh7.1.4.p0.6300266/etc/hadoop
[root@cdp1pub hadoop]# /opt/cloudera/parcels/CDH-7.1.4-1.cdh7.1.4.p0.6300266/lib/livy2/bin/livy-server start
starting java -cp /opt/cloudera/parcels/CDH-7.1.4-1.cdh7.1.4.p0.6300266/lib/livy2/jars/*:/opt/cloudera/parcels/CDH-7.1.4-1.cdh7.1.4.p0.6300266/lib/livy2/conf:/opt/cloudera/parcels/CDH-7.1.4-1.cdh7.1.4.p0.6300266/etc/hadoop: org.apache.livy.server.LivyServer, logging to /opt/cloudera/parcels/CDH-7.1.4-1.cdh7.1.4.p0.6300266/lib/livy2/logs/livy-root-server.out
[root@cdp1pub hadoop]# /opt/cloudera/parcels/CDH-7.1.4-1.cdh7.1.4.p0.6300266/lib/livy2/bin/livy-server status
livy-server is running (pid: 24354)
[root@cdp1pub hadoop]#

Now I am able to log in and %pyspark is not throwing any error. Thanks a lot for the help.
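A quick follow-up check that the restarted server keeps answering on the same URL, using standard Livy REST endpoints:

curl -s http://localhost:8998/sessions   # should return a JSON list of sessions
curl -s http://localhost:8998/version    # Livy version info (the call the Zeppelin interpreter makes)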
03-17-2021
09:16 AM
If I try to check the interpreter settings in the Zeppelin UI, I get the below error, which shows that I don't have the required permissions, even though I am logged in as the admin user.
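For reference, access to the interpreter page is governed by Zeppelin's shiro.ini ([urls] and [roles] sections); the path below is only where the file usually sits, so adjust it for this install:

# Check which role the interpreter URL requires and whether the 'admin' user maps to it:
grep -A 20 '\[urls\]' /etc/zeppelin/conf/shiro.ini
grep -A 10 '\[roles\]' /etc/zeppelin/conf/shiro.ini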
03-17-2021
03:45 AM
%pyspark

java.net.ConnectException: Connection refused (Connection refused)
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:607)
    at java.net.Socket.connect(Socket.java:556)
    at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:463)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:558)
    at sun.net.www.http.HttpClient.<init>(HttpClient.java:242)
    at sun.net.www.http.HttpClient.New(HttpClient.java:339)
    at sun.net.www.http.HttpClient.New(HttpClient.java:357)
    at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1226)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1162)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1056)
    at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:990)
    at org.springframework.http.client.SimpleBufferingClientHttpRequest.executeInternal(SimpleBufferingClientHttpRequest.java:78)
    at org.springframework.http.client.AbstractBufferingClientHttpRequest.executeInternal(AbstractBufferingClientHttpRequest.java:48)
    at org.springframework.http.client.AbstractClientHttpRequest.execute(AbstractClientHttpRequest.java:53)
    at org.springframework.web.client.RestTemplate.doExecute(RestTemplate.java:661)
    at org.springframework.web.client.RestTemplate.execute(RestTemplate.java:622)
    at org.springframework.web.client.RestTemplate.exchange(RestTemplate.java:540)
    at org.apache.zeppelin.livy.BaseLivyInterpreter.callRestAPI(BaseLivyInterpreter.java:706)
    at org.apache.zeppelin.livy.BaseLivyInterpreter.callRestAPI(BaseLivyInterpreter.java:686)
    at org.apache.zeppelin.livy.BaseLivyInterpreter.getLivyVersion(BaseLivyInterpreter.java:472)
    at org.apache.zeppelin.livy.BaseLivyInterpreter.open(BaseLivyInterpreter.java:161)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:616)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:188)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:140)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
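The connection refused in BaseLivyInterpreter.getLivyVersion just means nothing answered on the configured zeppelin.livy.url. A quick check on the Livy host (port assumed to be Livy's default 8998):

# Confirm something is listening on the Livy port:
ss -ltnp | grep 8998
# Same endpoint the interpreter calls when it fails above:
curl -sv http://localhost:8998/version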
03-16-2021
11:48 PM
I am able to log in; however, if I click on the interpreter option, I get an error saying that I don't have the permission.