Member since: 02-23-2018
Posts: 108
Kudos Received: 12
Solutions: 9
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1849 | 04-11-2019 12:32 AM |
| | 3312 | 03-28-2019 04:14 AM |
| | 1973 | 01-22-2019 12:20 AM |
06-18-2019
04:57 AM
Hi @urbanlad20, Your error is: File does not exist: /user/spark/spark2ApplicationHistory/.5e9b4c52-032a-4469-b278-80aa6254cfdf Can you restart the Spark installation? The error seems related to it. Regards, Manu.
06-18-2019
02:49 AM
Hi @Hail2Ichi, Try updating the Impala JDBC connector. Regards, Manu.
05-22-2019
02:44 AM
Hi @rdbb, It depends on your objective. There is no restriction, but do you really need an edge node? Regards, Manu.
04-11-2019
12:32 AM
Hi @DataMike, You need to restart all affected components; otherwise the change will not take effect. Regards, Manu.
03-28-2019
04:14 AM
Hi @Adilm, You are right: there is no single sizing table; you must study your scenario (HA, security, number of accesses, ...). Some questions: How many users? How much data? All documentation is available here, according to your version: https://www.cloudera.com/documentation/enterprise/latest.html Regards, Manu.
03-28-2019
03:40 AM
Hi @Adilm, 1) If you want to migrate all the data, you can compress it and place it on other nodes/servers, so you do not need 20TB of disk. However, if you need the data to remain available, you have these scenarios: - Replication factor 10: you need 20TB per server. - Replication factor 1: you only need 20TB distributed across 10 servers. - Best: replication factor 5 with 4TB per server. 2) It depends: you need one NameNode, one SecondaryNameNode, and, for example, 8 DataNodes. Pay attention to the resources of your hosts. Regards, Manu.
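The storage trade-off above can be sketched with a back-of-the-envelope calculation. This is a hedged illustration, not Cloudera guidance: the 2TB raw data size and the 10-node cluster are assumptions inferred from the figures in this thread.

```shell
# Illustrative HDFS sizing arithmetic (assumed: 2 TB of raw data, 10 DataNodes).
# Required raw storage = data size x replication factor, spread across the nodes.
DATA_TB=2
NODES=10
for RF in 1 5 10; do
  TOTAL=$((DATA_TB * RF))
  PER_NODE=$(awk -v t="$TOTAL" -v n="$NODES" 'BEGIN { printf "%.1f", t / n }')
  echo "RF=$RF total=${TOTAL}TB per-node=${PER_NODE}TB"
done
# e.g. RF=10 gives total=20TB, per-node=2.0TB
```

The loop only shows the raw-capacity math; real sizing also needs headroom for temporary files, compaction, and non-HDFS usage on each disk.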
03-25-2019
05:12 AM
Hi @IVenkatesh, Can you paste logs from some of the services? All of your services are stopped. Regards, Manu.
03-19-2019
04:12 AM
Hi @MantuDeka, Try stopping some unnecessary services that are running. Regards, Manu.
01-29-2019
04:48 AM
Hi @Tulasi, Try killing the process running on the port used by the YARN services, and then try restarting. Regards, Manu.
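A sketch of that check follows. It is hedged: 8088 is the default YARN ResourceManager web UI port, `lsof` availability is an assumption, and the restart step depends on your deployment; substitute your cluster's actual port and restart command.

```shell
# Find whatever is listening on a YARN port and stop it before restarting.
# The port value and restart mechanism are assumptions for illustration.
PORT=8088   # default YARN ResourceManager web UI port; adjust as needed
PID=$(lsof -t -i TCP:"$PORT" -s TCP:LISTEN 2>/dev/null | head -n 1)
if [ -n "$PID" ]; then
  echo "Port $PORT is held by PID $PID; stopping it"
  kill "$PID"
else
  echo "Nothing is listening on port $PORT"
fi
# Then restart the YARN service from Cloudera Manager (or your init system).
```

Checking which process holds the port before restarting avoids the "address already in use" failure loop that a blind restart can hit.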
01-22-2019
12:20 AM
1 Kudo
Hi @Vinn, Yes, you will only get an error if a service tries to write to them, but a new log file will be created immediately. Regards, Manu.
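That behavior is easy to demonstrate with a throwaway file. This is generic Unix behavior, not specific to any Cloudera service, and the path used here is hypothetical.

```shell
# Simulate a service log being deleted: the next write through the same
# path simply recreates the file, which is why a new log appears immediately.
LOG=/tmp/demo_service.log          # hypothetical log path
echo "old entry" >> "$LOG"         # service writes its log
rm -f "$LOG"                       # log file deleted out from under it
echo "new entry" >> "$LOG"         # next write recreates the file
cat "$LOG"                         # prints only: new entry
```

One caveat: a process that keeps an open file descriptor on the deleted file continues writing to the old inode until it reopens the path (typically at restart or log rotation), so disk space is not reclaimed until then.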