Member since: 01-19-2017
Posts: 3679
Kudos Received: 632
Solutions: 372

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1001 | 06-04-2025 11:36 PM |
| | 1568 | 03-23-2025 05:23 AM |
| | 784 | 03-17-2025 10:18 AM |
| | 2819 | 03-05-2025 01:34 PM |
| | 1861 | 03-03-2025 01:09 PM |
12-02-2019
02:06 PM
@mike_bronson7 You can change the ownership of the HDFS directory to airflow:hadoop, but please don't run the -chown command on / itself. It should target something like /users/airflow/xxx. Please let me know.
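A minimal sketch of that change, assuming /users/airflow/xxx stands in for the actual directory reported in your error (adjust the path to your environment):

```bash
# Change ownership of the specific HDFS directory only, never /
# (add -R if subdirectories should be changed as well)
hdfs dfs -chown airflow:hadoop /users/airflow/xxx
```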
12-02-2019
01:59 PM
@rwinters You could be right that your issue is an incompatibility. Can you check exactly which version of Postgres you upgraded to? It's usually advisable to check the Hortonworks support matrix before launching any upgrade; I only hope it's a dev environment. See the screenshot: there you can see that PostgreSQL 10.7 is only compatible with Ambari 2.7.4 and HDP 3.1.4. The support matrix is a very useful tool. Happy hadooping
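A quick sketch for confirming the installed PostgreSQL version before checking it against the support matrix (client and server versions can differ, so it's worth checking both):

```bash
# Client version
psql --version
# Server version, queried from within psql as the postgres user
sudo -u postgres psql -c "SELECT version();"
```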
12-02-2019
02:25 AM
@mike_bronson7 The hadoop group encapsulates all the service users, including hdfs. If you run # cat /etc/group you should see something like hadoop:x:1007:yarn-ats,hive,storm,infra-solr,zookeeper,oozie,atlas,ams,ranger,tez,zeppelin,kms,accumulo,livy,druid,spark,ambari-qa,kafka,hdfs,sqoop,yarn,mapred,hbase,knox So the -chown should only target the directory shown in the diagnostics logs. NEVER run the -chown command on /, which is the root directory!! Can you share your log please?
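A small sketch for double-checking group membership before running the -chown (assumption: airflow is the user being granted access):

```bash
# List the members of the hadoop group
grep '^hadoop:' /etc/group
# Show which groups the airflow user belongs to, as seen by the OS and by HDFS
id airflow
hdfs groups airflow
```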
12-02-2019
01:42 AM
@mike_bronson7 That's correct, you need to give the path of the directory 🙂 i.e. usually in HDFS: $ hdfs dfs -chown airflow:hdfs /path/in/hdfs/where/you/failed/to/write As you didn't include the path, I assumed you'd supply it with the -chown command.
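A minimal sketch for verifying the result afterwards (the path is the same illustrative placeholder as in the command above):

```bash
# -d lists the directory entry itself so you can confirm the new owner and group
hdfs dfs -ls -d /path/in/hdfs/where/you/failed/to/write
```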
12-02-2019
01:27 AM
@Kou_Bou Great !
12-02-2019
12:45 AM
@mike_bronson7 The gid is just a numeric value identifying the group; the valid groups for airflow are [airflow and hdfs]. $ hdfs dfs -chown airflow:hdfs should do the magic, please revert. Cheers
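A small sketch showing how the numeric gid maps to the group names for a user (assumption: the airflow account exists on the node; the numbers in the sample output are illustrative):

```bash
# Prints the numeric uid/gid together with the corresponding names,
# e.g. uid=1010(airflow) gid=1010(airflow) groups=1010(airflow),1006(hdfs)
id airflow
```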
12-02-2019
12:03 AM
@naveensangam @jepe_desu In reference to the Invalid KDC administrator credentials issue raised by @naveensangam, I wrote a walkthrough of the solution that resolved the issue for other users like @jepe_desu who had encountered exactly the same problem. @naveensangam, can you update the thread if my solution resolved your issue, or if not, share what errors you are seeing? Once you accept an answer it can be referenced by other members with similar issues rather than starting a new thread. Happy hadooping
12-01-2019
11:56 PM
1 Kudo
@mike_bronson7 That's a classic permissions issue: "airflow" is trying to write to that directory but has no permissions, as it's owned by the alapati user (hence inode=alapati in the error). The easiest solution is to grant the permissions as the hdfs user: $ hdfs dfs -chown airflow:{$airflow_group} Most components, like Spark, Hive and Sqoop, need to access HDFS.
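A minimal sketch of running that as the hdfs superuser (assumptions: the target path and the group here are placeholders you would replace with the values from your error):

```bash
# The hdfs user is the HDFS superuser, so it can change ownership of any path
sudo -u hdfs hdfs dfs -chown airflow:hadoop /path/from/the/error
```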
11-30-2019
02:02 AM
@vishal6193 Can you share how you did your ADLS setup? Please have a look at the Hadoop Azure Support: Azure Blob Storage documentation, in particular the jars and the credentials to create in a secure environment.
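As a rough sketch, a working Azure Blob Storage setup usually lets you list the container once the hadoop-azure jar is on the classpath and the account key is configured (the storage account and container names below are illustrative):

```bash
# Assumes core-site.xml contains the key property
#   fs.azure.account.key.<storage-account>.blob.core.windows.net
hadoop fs -ls wasb://mycontainer@mystorage.blob.core.windows.net/
```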
11-29-2019
09:26 AM
1 Kudo
@Manoj690 Can you run the commands below? You shouldn't copy the driver the way you did. The following steps should resolve the issue:
1. Check the yum repo: # yum repolist
2. Install the MySQL driver: # yum install -y mysql-connector-java
3. Launch the Ambari setup: # ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
Please revert
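As a quick follow-up sketch, you could confirm the driver jar landed where Ambari expects it and then restart the server (the jar path below is the default location installed by the yum package; adjust if yours differs):

```bash
# Confirm the connector jar exists at the path passed to --jdbc-driver
ls -l /usr/share/java/mysql-connector-java.jar
# Restart Ambari so it picks up the new JDBC configuration
ambari-server restart
```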