I run `cat DDL.sql | hive` as user XXX, but during execution I get this error:
[XXX@host-001 db]$ cat DDL.sql | hive
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=YYY, access=WRITE, inode="/user/XXX/databases/data_tracing_rep/db_report/filedate=2016-09-07/r=60":XXX:hdfs:drwxr-xr-x
Where does user YYY come from? It is not set anywhere in my environment.
If you have doAs disabled, your queries will run as the Hive service user rather than as the submitting user. Check that your directory allows the Hive user to write files there. If not (e.g. the permissions are rwx------ or something similar), the Hive user will get a permission error. Since I don't know what the YYY user is, I can't verify that this is the right answer, but it's something else to check.
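A minimal sketch of how to check this, assuming the path from the error message; XXX and YYY are the placeholder users from the question, and the config path may differ on your distribution:

```shell
# 1. Check whether impersonation (doAs) is enabled for HiveServer2.
#    If hive.server2.enable.doAs is false, queries run as the "hive"
#    service user rather than as the user who submitted them.
grep -A1 'hive.server2.enable.doAs' /etc/hive/conf/hive-site.xml

# 2. Inspect who owns the target directory and what its mode is:
hdfs dfs -ls -d "/user/XXX/databases/data_tracing_rep/db_report/filedate=2016-09-07"

# 3. If the user in the error (YYY) needs write access, either widen the
#    mode or change ownership -- run as an HDFS superuser:
hdfs dfs -chmod -R 775 "/user/XXX/databases/data_tracing_rep/db_report"
# or
hdfs dfs -chown -R YYY:hdfs "/user/XXX/databases/data_tracing_rep/db_report"
```

These commands require a running cluster, so treat them as a template rather than something to paste verbatim.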
The only other thing that comes to mind is whether the "XXX" user exists on the Namenode and/or whether it belongs to the group "hadoop".
If they aren't in the group "hadoop", check the hadoop-policy.xml file for a setting called security.client.protocol.acl that may be set to "hadoop". This is a way to prevent users outside that group from accessing HDFS.
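For reference, this is roughly what that entry looks like in hadoop-policy.xml (a sketch; the default value is usually `*`, meaning everyone, but an administrator may have restricted it as shown here):

```xml
<!-- Restricts ClientProtocol (the RPCs HDFS clients use to talk to the
     Namenode) to members of the "hadoop" group. -->
<property>
  <name>security.client.protocol.acl</name>
  <value>hadoop</value>
</property>
```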
Note that the user account must exist on the Namenode as well. When you submit a request from an Edgenode, where you obviously have the user account, the id (string version) is sent to the Namenode. The Namenode is responsible for authorization and does a group lookup of the user on the Namenode host. If the user doesn't exist there, OR their groups aren't the same as they were on the Edgenode where you launched the Hive command, you'll see issues like this.
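A quick way to compare how the user resolves on the two hosts (a sketch; XXX is the placeholder user from the question, and the second command asks the Namenode to resolve groups through its configured group mapping):

```shell
# Run on both the Edgenode and the Namenode; the group lists should match.
id XXX                 # local account and group membership on this host

# Ask HDFS (i.e. the Namenode side) how it resolves the user's groups:
hdfs groups XXX
```

If `id XXX` fails on the Namenode, or the two group lists differ, that mismatch is a likely cause of the AccessControlException.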