Member since: 01-19-2017
Posts: 3676
Kudos Received: 632
Solutions: 372
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 554 | 06-04-2025 11:36 PM |
| | 1102 | 03-23-2025 05:23 AM |
| | 561 | 03-17-2025 10:18 AM |
| | 2104 | 03-05-2025 01:34 PM |
| | 1317 | 03-03-2025 01:09 PM |
10-05-2019
04:24 AM
@erkansirin78 Great!! It worked out for you. If this answer addressed your question, please take a moment to log in, click the thumbs-up button, and mark it as a solution. That helps the Cloudera Community find the fix for these kinds of errors quickly.
10-04-2019
01:05 PM
@saivenkatg55 What are the permissions on that file? Check them with:

```
$ ls -l /var/log/hadoop-yarn/yarn/hadoop-yarn-nodemanager-<host_name>.org.out
```

It should show `-rw-r--r-- 1 yarn hadoop`. If it does not, set the mode:

```
# chmod 644 /var/log/hadoop-yarn/yarn/hadoop-yarn-nodemanager-<host_name>.org.out
```

Ownership should be `yarn:hadoop`.
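As a self-contained sketch of that check-and-fix sequence (using a throwaway file under /tmp instead of the real NodeManager log, and GNU `stat` for verification):

```shell
# Demo of the permission fix against a placeholder file.
# On the real host, substitute the actual
# /var/log/hadoop-yarn/yarn/hadoop-yarn-nodemanager-<host_name>.org.out path;
# the chown step additionally needs root and an existing yarn user.
LOG=/tmp/hadoop-yarn-nodemanager-demo.out
touch "$LOG"

chmod 644 "$LOG"             # -> -rw-r--r--
# chown yarn:hadoop "$LOG"   # uncomment on the real host, as root

ls -l "$LOG"                 # verify mode and ownership
stat -c '%a' "$LOG"          # prints 644
```
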
10-04-2019
12:54 PM
@Alieer It looks like a whitespace issue in the line continuations of your command. Please try copying and pasting the version below:

```
$ sqoop import-all-tables \
  -m 1 \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username=retail_dba \
  --password=cloudera \
  --compression-codec=snappy \
  --as-parquetfile \
  --warehouse-dir=/user/hive/warehouse \
  --hive-import
```
10-04-2019
12:24 PM
@erkansirin78 It seems you have an issue with your known_hosts file: the error points at line 3 (`/c/Users/user/.ssh/known_hosts:3`). Most likely the remote host's SSH encryption keys changed, which SSH treats as a possible security hole. You can purge that specific line from your known_hosts file:

```
$ sed -i 3d ~/.ssh/known_hosts
```

Note that your known_hosts file lives on the C: drive under /c/Users/user/ on Windows 10, so adjust the path accordingly.

Alternatively, you can disable strict host-key checking in your SSH configuration file, typically stored at ~/.ssh/config. An example Host block:

```
Host 101
    HostName yourip|hostname
    User your_Userid
    IdentityFile /path/to/keyfile
    Port 22
    StrictHostKeyChecking no
```

The key addition is the last line, `StrictHostKeyChecking no`, which does just that. Please do that and revert.
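As an alternative to deleting by line number, `ssh-keygen -R` removes every key for a given host and keeps a `.old` backup of the file. A minimal sketch against a throwaway known_hosts file (the /tmp paths and hostnames here are demo placeholders, not the poster's real file):

```shell
# Demo: drop one host's entry from a known_hosts file with ssh-keygen -R.
rm -f /tmp/demo_key /tmp/demo_key.pub /tmp/demo_known_hosts /tmp/demo_known_hosts.old
ssh-keygen -q -t ed25519 -N '' -f /tmp/demo_key    # throwaway key pair
PUB=$(cut -d' ' -f1,2 /tmp/demo_key.pub)           # "ssh-ed25519 AAAA..."

printf '%s\n' "goodhost $PUB" "stalehost $PUB" > /tmp/demo_known_hosts

# Remove every entry for stalehost; a backup goes to /tmp/demo_known_hosts.old
ssh-keygen -R stalehost -f /tmp/demo_known_hosts

grep -c 'stalehost' /tmp/demo_known_hosts          # prints 0 (entry gone)
```
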
10-01-2019
12:31 PM
2 Kudos
@ThanhP I had the same situation with a Sandbox I had not used for a while. Here is my config: I disabled all the network adapters except one, which I attached to my LAN using the Bridged Adapter, with 12 GB of RAM and 4 CPUs (see attached screenshots). Once my Sandbox had booted, I accessed the CLI and managed to turn off safe mode, as the NameNode was not starting. You should access the Linux CLI through the web interface. Can you try that and revert?
09-26-2019
12:44 PM
@anbazhagan_muth Looking at your scripts: the Kafka scripts should run like shell scripts, so from the Kafka install directory your syntax should look like this:

```
kafka/bin/kafka-console-consumer.sh --bootstrap-server quickstart.cloudera:9092 --topic smoke --from-beginning
```

Please revert.
09-25-2019
12:14 PM
@Manoj690 Go to Ambari > Hive > CONFIGS > ADVANCED > Custom hive-site and add hive.users.in.admin.role, set to the comma-separated list of users who require admin-role authorization (such as the user hive). Restart the Hive services for the change to take effect. The permission-denied error should be fixed after adding hive.users.in.admin.role=hive and restarting Hive, because properties listed in hive.conf.restricted.list cannot be reset with hiveconf. Please do that and revert.
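For reference, the same setting expressed as a hive-site.xml property fragment (the value `hive` is just the example user from above; list additional admin users comma-separated):

```xml
<property>
  <name>hive.users.in.admin.role</name>
  <value>hive</value>
</property>
```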
09-25-2019
11:41 AM
@irfangk1 You can definitely use a MySQL database alongside PostgreSQL or any other supported database for the different HDP components that need back-end databases; see the screenshots for the Ranger and Ambari DB options. Keep in mind, though, that mixing databases adds extra administrative complexity to your cluster management, such as backups and script reusability. HTH. Happy hadooping.
09-25-2019
11:09 AM
@jhc I noted the database refresh took very long. Did you refresh manually? Please do that and update this thread. Happy hadooping.