Member since: 01-19-2017
Posts: 3676
Kudos Received: 632
Solutions: 372
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 533 | 06-04-2025 11:36 PM |
| | 1072 | 03-23-2025 05:23 AM |
| | 553 | 03-17-2025 10:18 AM |
| | 2061 | 03-05-2025 01:34 PM |
| | 1289 | 03-03-2025 01:09 PM |
12-02-2019 01:42 AM
@mike_bronson7 That's correct, you need to give the path of the directory 🙂 i.e. usually in HDFS:

$ hdfs dfs -chown airflow:hdfs /path/in/hdfs/where/you/failed/to/write

As you didn't include the path, I assumed you'd append it to the -chown command.
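To double-check that the ownership change stuck, a small sketch using stat; the path is the same placeholder as above:

```bash
# Print just the owner and group of the path after the chown
hdfs dfs -stat "%u:%g" /path/in/hdfs/where/you/failed/to/write
# Expected output: airflow:hdfs
```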
12-02-2019 01:27 AM
@Kou_Bou Great!
12-02-2019 12:45 AM
@mike_bronson7 The gid is just a numeric value indicating the group's id, but the valid groups for airflow are [airflow and hdfs].

$ hdfs dfs -chown airflow:hdfs /path/to/the/directory

should do the magic. Please revert. Cheers
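As a quick way to confirm which groups HDFS actually resolves for the airflow user, a small sketch (the output line is illustrative):

```bash
# List the groups HDFS resolves for the user
hdfs groups airflow
# Illustrative output: airflow : airflow hdfs
```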
12-02-2019 12:03 AM
@naveensangam @jepe_desu In reference to the Invalid KDC administrator credentials issue raised by @naveensangam, I wrote a walkthrough of the solution that resolved the issue for other users like @jepe_desu who had encountered exactly the same problem. @naveensangam, can you update the thread if my solution resolved your issue, or if not, share what errors you got? Once you accept an answer it can be referenced by other members with similar issues rather than starting a new thread. Happy hadooping
12-01-2019 11:56 PM
1 Kudo
@mike_bronson7 That's a classic permissions issue: "airflow" is trying to write to that directory but has no permissions, as it's owned by the alapati user, hence the inode=alapati in the error. The easiest solution is to grant ownership as the hdfs user:

$ hdfs dfs -chown airflow:${airflow_group} /path/to/the/directory

Most components like Spark, Hive and Sqoop need to access HDFS.
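For illustration, a minimal sketch of diagnosing and then fixing the ownership; /path/to/the/directory is a placeholder for the directory named in the error:

```bash
# Show the current owner and permissions of the directory itself (-d)
hdfs dfs -ls -d /path/to/the/directory

# Only the HDFS superuser (or the current owner) may chown, so run it as hdfs
sudo -u hdfs hdfs dfs -chown airflow:hdfs /path/to/the/directory
```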
11-30-2019 02:02 AM
@vishal6193 Can you share how you did your ADLS setup? Please have a look at the Hadoop Azure Support documentation for Azure Blob Storage, in particular the jars and the credentials to create in a secure environment.
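As an illustration, one way to keep the storage key out of plain-text configs is the Hadoop credential provider; the account name and jceks path below are hypothetical placeholders:

```bash
# Store the Azure account key in a JCEKS credential store instead of core-site.xml
# (youraccount and the provider path are placeholders; the command prompts for the key)
hadoop credential create fs.azure.account.key.youraccount.blob.core.windows.net \
  -provider jceks://hdfs/user/admin/azure.jceks
```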
11-29-2019 09:26 AM
1 Kudo
@Manoj690 Can you run the below commands? You shouldn't copy the driver the way you did; the below steps should resolve that issue.

Check the yum repo
# yum repolist

Install the MySQL driver
# yum install -y mysql-connector-java

Launch the Ambari setup
# ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar

Please revert
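Before running the setup command, a quick sketch to confirm the package actually installed the jar at the path the command references:

```bash
# The yum package should have placed the connector jar at this path
ls -l /usr/share/java/mysql-connector-java.jar
```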
11-28-2019 04:23 AM
@saivenkatg55 Great, can you gauge the improvement in percentage? Can you back up your ambari.properties and make the below changes to further enhance Ambari performance? I usually script that as a post-install step. Can you then take the time to Accept the solution and close the thread, so other members can reference it should they encounter a similar situation? Purging the Ambari database from time to time should be an Admin maintenance task.

Back up the ambari server properties file
# cp /etc/ambari-server/conf/ambari.properties /etc/ambari-server/conf/ambari.properties.ORIG

Change the timeout and connection-pool settings of the ambari server
# echo 'server.startup.web.timeout=120' >> /etc/ambari-server/conf/ambari.properties
# echo 'server.jdbc.connection-pool.acquisition-size=5' >> /etc/ambari-server/conf/ambari.properties
# echo 'server.jdbc.connection-pool.max-age=0' >> /etc/ambari-server/conf/ambari.properties
# echo 'server.jdbc.connection-pool.max-idle-time=14400' >> /etc/ambari-server/conf/ambari.properties
# echo 'server.jdbc.connection-pool.max-idle-time-excess=0' >> /etc/ambari-server/conf/ambari.properties
# echo 'server.jdbc.connection-pool.idle-test-interval=7200' >> /etc/ambari-server/conf/ambari.properties

Hope that helps. Happy hadooping
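After appending the properties, a small sketch for sanity-checking the file and picking the values up with a restart:

```bash
# List the appended pool settings; any key appearing twice means the
# echo commands were run more than once
grep 'server.jdbc.connection-pool' /etc/ambari-server/conf/ambari.properties

# Restart Ambari server so the new values take effect
ambari-server restart
```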
11-28-2019 01:37 AM
@Kou_Bou I just deployed a cluster yesterday and never encountered that error. Can you share your documented steps? If the document is too big to attach, use Google. I am sure you might be skipping a step, but before that can you do the 2 steps below?

Force the agent to use TLSv1.2, in the [security] section:
vi /etc/ambari-agent/conf/ambari-agent.ini
force_https_protocol=PROTOCOL_TLSv1_2

Disable Python certificate verification:
vi /etc/python/cert-verification.cfg
[https]
verify=disable

Then retry; that should resolve your problem.
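After editing both files, the agent needs a restart to pick up the change; a minimal sketch:

```bash
# Restart the agent so it re-reads ambari-agent.ini
ambari-agent restart

# Watch the agent log to confirm it registers with the server
tail -f /var/log/ambari-agent/ambari-agent.log
```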
11-28-2019 01:28 AM
@lalprasanth That's obvious: you have a double $$ in the sqoop command. You forgot to remove the extra one from the command I posted; please do that and revert. Thank you
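For context, a tiny sketch of why a stray double $ mangles the value when the sqoop command runs through a shell; MY_TABLE is a hypothetical stand-in for the variable in your command:

```bash
# Hypothetical variable standing in for the one in the thread
MY_TABLE=customers
echo "--table $MY_TABLE"     # expands correctly: --table customers
echo "--table $$MY_TABLE"    # $$ is the shell's PID: --table 12345MY_TABLE
```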