Member since: 08-16-2019
Posts: 42
Kudos Received: 1
Solutions: 1

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1315 | 06-03-2020 07:21 PM |
09-22-2023 06:59 AM
Hi, I am using sandbox HDP-3.0.1.0 with Ambari version 2.7.1.0. My hadoop distcp command is stuck and I am not sure what is happening:

```
[root@sandbox-hdp bin]# hadoop distcp hdfs://sandbox-hdp.hortonworks.com:8020/warehouse/tablespace/managed/hive/foodmart.db/currency hdfs://sandbox-hdp.hortonworks.com:8020/sandbox/tutorial-files/20575/
ERROR: Tools helper /usr/hdp/3.0.1.0-187/hadoop/libexec/tools/hadoop-distcp.sh was not found.
23/09/22 13:37:35 INFO tools.DistCp: Input Options: DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, ignoreFailures=false, overwrite=false, append=false, useDiff=false, useRdiff=false, fromSnapshot=null, toSnapshot=null, skipCRC=false, blocking=true, numListstatusThreads=0, maxMaps=20, mapBandwidth=0.0, copyStrategy='uniformsize', preserveStatus=[BLOCKSIZE], atomicWorkPath=null, logPath=null, sourceFileListing=null, sourcePaths=[hdfs://sandbox-hdp.hortonworks.com:8020/warehouse/tablespace/managed/hive/foodmart.db/currency], targetPath=hdfs://sandbox-hdp.hortonworks.com:8020/sandbox/tutorial-files/20575, filtersFile='null', blocksPerChunk=0, copyBufferSize=8192, verboseLog=false}, sourcePaths=[hdfs://sandbox-hdp.hortonworks.com:8020/warehouse/tablespace/managed/hive/foodmart.db/currency], targetPathExists=true, preserveRawXattrs=false
23/09/22 13:37:35 INFO client.RMProxy: Connecting to ResourceManager at sandbox-hdp.hortonworks.com/172.18.0.3:8050
23/09/22 13:37:35 INFO client.AHSProxy: Connecting to Application History server at sandbox-hdp.hortonworks.com/172.18.0.3:10200
23/09/22 13:37:35 INFO tools.SimpleCopyListing: Paths (files+dirs) cnt = 3; dirCnt = 2
23/09/22 13:37:35 INFO tools.SimpleCopyListing: Build file listing completed.
23/09/22 13:37:35 INFO tools.DistCp: Number of paths in the copy list: 3
23/09/22 13:37:35 INFO tools.DistCp: Number of paths in the copy list: 3
23/09/22 13:37:35 INFO client.RMProxy: Connecting to ResourceManager at sandbox-hdp.hortonworks.com/172.18.0.3:8050
23/09/22 13:37:35 INFO client.AHSProxy: Connecting to Application History server at sandbox-hdp.hortonworks.com/172.18.0.3:10200
23/09/22 13:37:35 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /user/root/.staging/job_1695389098784_0002
23/09/22 13:37:35 INFO mapreduce.JobSubmitter: number of splits:2
23/09/22 13:37:36 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1695389098784_0002
23/09/22 13:37:36 INFO mapreduce.JobSubmitter: Executing with tokens: []
23/09/22 13:37:36 INFO conf.Configuration: found resource resource-types.xml at file:/etc/hadoop/3.0.1.0-187/0/resource-types.xml
23/09/22 13:37:36 INFO impl.YarnClientImpl: Submitted application application_1695389098784_0002
23/09/22 13:37:36 INFO mapreduce.Job: The url to track the job: http://sandbox-hdp.hortonworks.com:8088/proxy/application_1695389098784_0002/
23/09/22 13:37:36 INFO tools.DistCp: DistCp job-id: job_1695389098784_0002
23/09/22 13:37:36 INFO mapreduce.Job: Running job: job_1695389098784_0002
23/09/22 13:42:28 INFO mapreduce.Job: Job job_1695389098784_0002 running in uber mode : false
23/09/22 13:42:28 INFO mapreduce.Job: map 0% reduce 0%
```
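For anyone investigating, a few diagnostic commands that may show why the job sits at map 0%; this is a sketch using the standard YARN CLI, with the application id taken from the log above:

```
# Check whether the application got containers or is still waiting on
# resources (state, progress, diagnostics):
yarn application -status application_1695389098784_0002

# Pull container logs for the job to see where the mappers stall:
yarn logs -applicationId application_1695389098784_0002

# On a single-node sandbox a saturated queue is a common cause; list what
# else is holding resources:
yarn application -list -appStates RUNNING,ACCEPTED
```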
11-06-2020 08:55 AM
Hi @TimothySpann, could you please help here?
11-06-2020 07:00 AM
Hi all, I am running Spark (Livy) from NiFi, currently on HDP 3.0/HDF 3.1, the CDA version. The NiFi flow completed successfully, but several Livy sessions kept executing in YARN. I tried to kill some of them, but they re-appeared in YARN. This is causing a problem: the Livy sessions consume CPUs even though I am not running any Spark application through Livy/NiFi. I am not sure how to resolve this issue and need your help here.
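For reference, a sketch of inspecting and closing the leftover sessions through Livy's own REST API (8998 is Livy's default port; host and session id are placeholders). If the NiFi Livy controller service keeps a session pool it may recreate sessions until it is disabled, which is an assumption about this flow:

```
# List the sessions Livy still considers alive:
curl http://<livy-host>:8998/sessions

# Delete a specific session through Livy instead of killing the YARN app,
# so Livy itself tears it down:
curl -X DELETE http://<livy-host>:8998/sessions/<session-id>
```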
Labels:
- Apache NiFi
- Apache Spark
- Apache YARN
06-03-2020 07:21 PM
There was an access issue with the files event-processor.log and das-webapp.log. Granting access to those files resolved the DAS Web UI issue.
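For anyone hitting the same symptom, a minimal sketch of the kind of permission fix involved, assuming the DAS services run as a local das user (the user and group names here are placeholders, not confirmed from the post):

```
# Hypothetical ownership/permission fix for the DAS log files; substitute
# the actual service user and group from your installation:
chown das:hadoop /var/log/das/event-processor.log /var/log/das/das-webapp.log
chmod 644 /var/log/das/event-processor.log /var/log/das/das-webapp.log
```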
06-03-2020 06:40 PM
Hi, there was an issue with replication of Hive metadata to DAS. I was trying to resolve it by resetting the tables, following https://docs.cloudera.com/HDPDocuments/DAS/DAS-1.2.1/troubleshooting/content/das_replication_failure_in_event_processor.html. After executing the SQL commands successfully, I tried to start DAS: "Data Analytics Studio Event Processor" started successfully and is running, but "Data Analytics Studio Webapp" started successfully yet is not running.

Configuration details:
- HDP - 3.0.1
- DAS - 1.0.2.0.0

Steps I used:
1) Stopped DAS from Ambari.
2) Took a backup of the /var/log/das log files and created empty event-processor.log and das-webapp.log files.
3) Executed the reset-tables SQL commands against the das database in postgres.
4) Started DAS from Ambari.

Other observations:
1) After starting DAS, nothing is written to event-processor.log or das-webapp.log.
2) But I can see that das.db_replication_info is getting refreshed, with id = 2, database_name = *, and valid values for last_replication_id, last_replication_start_time, last_replication_end_time, and next_replication_start_time.

Could you please help me fix the replication issue in DAS? Are there separate instructions for fixing the replication issue in DAS 1.0.2.0.0, since I followed the instructions for DAS 1.2.1? Let me know if any other information is needed. Thanks!
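For context, a minimal sketch of how the replication bookkeeping mentioned above can be inspected, assuming the DAS metadata lives in a local postgres database named das with a das schema (database, schema, and user names vary by installation):

```
# Inspect the row described above (id, database_name, replication ids/times):
sudo -u postgres psql -d das -c "SELECT * FROM das.db_replication_info;"
```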
05-31-2020 09:18 AM
1 Kudo
Hi @Shelton, did you get a chance to look into this issue? I need help on this. Thanks
03-19-2020 08:35 PM
Hi @Shelton, the reason for this issue is https://docs.cloudera.com/HDPDocuments/DAS/DAS-1.4.4/troubleshooting/content/das_replication_failure_in_event_processor.html which I validated at my end: /var/log/das/event-processor.log is showing "Notification events are missing in the meta store" and that replication is unsuccessful. I tried to follow the instructions in the above link and submit this command:

```
curl -H 'X-Requested-By: das' -H 'Cookie: JSESSIONID=<session id cookie>' http(s)://<hostname>:<port>/api/replicationDump/reset
```

Please refer to the attached screen print. The command did not work and returned:

```
{"code":404,"message": "HTTP 404 Not Found"}
```

Could you please help here?
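In case it helps reproduce, a sketch of capturing and reusing the session cookie with curl rather than pasting it by hand; this assumes the DAS web app issues a JSESSIONID on a plain request, and hostname/port are placeholders:

```
# Store cookies from an initial request to the DAS web UI:
curl -c /tmp/das-cookies.txt http://<hostname>:<port>/

# Reuse the stored cookie jar for the reset call:
curl -b /tmp/das-cookies.txt -H 'X-Requested-By: das' \
  http://<hostname>:<port>/api/replicationDump/reset
```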
12-27-2019 06:12 AM
Hi @TimothySpann, could you please let me know if you were able to resolve this issue? I am facing a similar one:

```
Traceback (most recent call last):
  File "testhive.py", line 1, in <module>
    from pyhive import hive
  File "/usr/local/lib/python2.7/site-packages/pyhive/hive.py", line 10, in <module>
    from TCLIService import TCLIService
  File "/usr/local/lib/python2.7/site-packages/TCLIService/TCLIService.py", line 9, in <module>
    from thrift.Thrift import TType, TMessageType, TFrozenDict, TException, TApplicationException
ImportError: No module named thrift.Thrift
```

Any ideas?
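The traceback suggests the thrift package is missing from the Python 2.7 environment pyhive is running in; a hedged guess at a fix, using package names from PyPI rather than anything confirmed in the thread:

```
# Install the missing thrift bindings into the same environment pyhive uses;
# sasl and thrift_sasl are commonly needed alongside pyhive for Hive:
pip install thrift
pip install sasl thrift_sasl
```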
11-25-2019 10:44 AM
@KWiseman Agreed, I did not have any option other than to re-install HDP.
11-09-2019 10:48 PM
Hi Cloudera team, following up on this issue.