Created 07-11-2018 01:34 AM
I have created a new HDP 2.6.5 cluster on Azure VMs. I have configured it to access Azure ADLS as described in:
# Have the following set in core-site.xml
fs.adl.oauth2.access.token.provider.type=ClientCredential
fs.adl.oauth2.client.id=<app id>
fs.adl.oauth2.credential=<key>
fs.adl.oauth2.refresh.url=https://login.microsoftonline.com/<...>/oauth2/token
dfs.adls.home.hostname=mystore.azuredatalakestore.net
dfs.adls.home.mountpoint=/clusters/mycluster/
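For reference, those same settings in core-site.xml property form would look roughly like the sketch below (the app id, key, and tenant values are placeholders, just as above):

```xml
<!-- Sketch of the equivalent core-site.xml entries; values are placeholders -->
<property>
  <name>fs.adl.oauth2.access.token.provider.type</name>
  <value>ClientCredential</value>
</property>
<property>
  <name>fs.adl.oauth2.client.id</name>
  <value><!-- app id --></value>
</property>
<property>
  <name>fs.adl.oauth2.credential</name>
  <value><!-- key --></value>
</property>
<property>
  <name>fs.adl.oauth2.refresh.url</name>
  <value>https://login.microsoftonline.com/&lt;...&gt;/oauth2/token</value>
</property>
<property>
  <name>dfs.adls.home.hostname</name>
  <value>mystore.azuredatalakestore.net</value>
</property>
<property>
  <name>dfs.adls.home.mountpoint</name>
  <value>/clusters/mycluster/</value>
</property>
```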
And I can successfully access files through the HDFS CLI:
$ hadoop fs -ls adl://home/hive/warehouse/hivesampletable
Found 1 items
-rwxrwxrwx+ 1 66eb9aa9-958c-4a1c-898b-2b5c34576b3c ca4d06f4-3100-48a0-acb9-330eaba7dd18 4955715 2018-06-29 21:42 adl://home/hive/warehouse/hivesampletable/HiveSampleData.txt
Hive is pointed to an existing, shared metastore DB that contains tables backed by ADLS. These were created with other HDInsight clusters. Hive is unable to access any tables or databases that have ADLS as the data location, failing with the following error:
Error: Error while compiling statement: FAILED: SemanticException java.lang.IllegalArgumentException: No value for dfs.adls.home.hostname found in conf file. (state=42000,code=40000)
Note that dfs.adls.home.hostname is definitely set in core-site.xml.
Anyone recognize this?
Created 07-11-2018 11:30 AM
Could you share the HiveServer2 exception backtrace? I'm not sure what's wrong, but that would probably help locate the source of the problem. I would think having it in core-site.xml should be enough.
Created 07-11-2018 12:28 PM
Going by the exception, I would ensure that core-site.xml is populated with the updated configuration on all the cluster nodes.
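One way to do that check is to grep each node's core-site.xml for the property. The sketch below builds a throwaway sample file to demonstrate the check; on a real HDP node you would point it at something like /etc/hadoop/conf/core-site.xml (that path, and running it via ssh on every host, are assumptions about your layout):

```shell
# Demo: verify a property is present in a core-site.xml file.
# On a real node, set CONF=/etc/hadoop/conf/core-site.xml instead.
CONF=/tmp/demo-core-site.xml
cat > "$CONF" <<'EOF'
<configuration>
  <property>
    <name>dfs.adls.home.hostname</name>
    <value>mystore.azuredatalakestore.net</value>
  </property>
</configuration>
EOF
# The check itself: count occurrences of the property name.
# Repeat this on every node (e.g. in an ssh loop over your host list).
grep -c 'dfs.adls.home.hostname' "$CONF"
```

If any node prints 0, that node's client configs are stale and need to be redeployed (e.g. via Ambari's "Restart All Required" / refresh client configs).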