Member since: 11-26-2018 · Posts: 5 · Kudos Received: 0 · Solutions: 0
11-28-2018
05:22 PM
Thanks Akhil! I did forget to set nonProxyHosts in the ambari-env.sh file. Now everything works fine.
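For anyone hitting the same issue, the fix was along these lines (a sketch only; the path and the existing contents of AMBARI_JVM_ARGS on your install will differ, and the host patterns below are illustrative):

```shell
# /var/lib/ambari-server/ambari-env.sh (typical location on HDP installs)
# Append http.nonProxyHosts so the Ambari JVM talks to cluster hosts directly
# instead of routing WebHDFS calls through the configured HTTP proxy.
export AMBARI_JVM_ARGS="$AMBARI_JVM_ARGS -Dhttp.nonProxyHosts=localhost|127.0.0.1|*.example.internal"
```

followed by an `ambari-server restart` so the views pick up the change.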
11-27-2018
04:05 AM
@rtheron Yes, I just realized the Knox gateway is on. I turned it off and restarted Hive and HDFS, but the error remains. Should I restart the Ambari server? I'll need to do that in the office tomorrow and give it a try. Thanks again
11-27-2018
02:34 AM
@rtheron Thanks much for the quick response, but are you sure it's a file-permission issue? Plain `hdfs dfs` commands work just fine. The strange thing is that Ambari is able to create the Hive job folder (e.g. /user/admin/hive/jobs/hive-job-75-2018-11-26_05-14) and write the correct "query.hql" DDL file into it, but it is somehow unable to write the execution results to that very same folder. By the way, I can't seem to find the impersonation settings under Hive -> Configs -> Advanced. Do you know where they are? Thanks
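For reference, the impersonation setting usually meant in this context is the `hive.server2.enable.doAs` property (whether this is the exact setting rtheron had in mind is an assumption on my part). It looks like this in hive-site.xml:

```xml
<!-- hive-site.xml: when true, HiveServer2 runs queries as the connecting
     end user (e.g. "admin") rather than as the "hive" service user. -->
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>
```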
11-26-2018
11:20 PM
@rtheron Actually, both "hive" and "admin" (the end user used for signing into Ambari) were added as proxy users; the error is the same.
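For completeness, proxy-user entries of this shape in core-site.xml are what is being described here (the user name mirrors the one in the thread; matching `hadoop.proxyuser.admin.*` entries would cover the "admin" user, and whether wildcards are appropriate for your environment is a judgment call):

```xml
<!-- core-site.xml: allow the "hive" service user to impersonate end users. -->
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
```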
11-26-2018
06:20 PM
I am trying to run the HDP tutorial (the trucking example) on an HDP 2.6.5 cluster. I was able to upload the CSV data files into HDFS, and when uploading a new table from trucks.csv the table preview works fine, but I get a "ServiceFormattedException" when I click the "Create" button, with the following stack trace in the Ambari server logs:

org.apache.ambari.view.utils.hdfs.HdfsApiException: HDFS020 Could not write file /user/admin/hive/jobs/hive-job-54-2018-11-25_10-49/logs
at org.apache.ambari.view.utils.hdfs.HdfsUtil.putStringToFile(HdfsUtil.java:57)
at org.apache.ambari.view.hive20.resources.jobs.viewJobs.JobControllerImpl.setupLogFile(JobControllerImpl.java:220)
at org.apache.ambari.view.hive20.resources.jobs.viewJobs.JobControllerImpl.setupLogFileIfNotPresent(JobControllerImpl.java:189)
at org.apache.ambari.view.hive20.resources.jobs.viewJobs.JobControllerImpl.afterCreation(JobControllerImpl.java:182)
at org.apache.ambari.view.hive20.resources.jobs.viewJobs.JobResourceManager.create(JobResourceManager.java:56)
at org.apache.ambari.view.hive20.resources.jobs.JobServiceInternal.createJob(JobServiceInternal.java:27)
at org.apache.ambari.view.hive20.resources.browser.DDLProxy.createJob(DDLProxy.java:384)
at org.apache.ambari.view.hive20.resources.browser.DDLProxy.createTable(DDLProxy.java:256)
at org.apache.ambari.view.hive20.resources.browser.DDLService.createTable(DDLService.java:147)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
Caused by: java.io.IOException: Unexpected HTTP response: code=504 != 201, op=CREATE, message=Gateway Timeout
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:467)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:114)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$FsPathOutputStreamRunner$1.close(WebHdfsFileSystem.java:950)
at org.apache.ambari.view.utils.hdfs.HdfsUtil$1.run(HdfsUtil.java:51)
at org.apache.ambari.view.utils.hdfs.HdfsUtil$1.run(HdfsUtil.java:46)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.ambari.view.utils.hdfs.HdfsApi.execute(HdfsApi.java:513)
at org.apache.ambari.view.utils.hdfs.HdfsUtil.putStringToFile(HdfsUtil.java:46)
... 105 more
Caused by: java.io.IOException: Content-Type "text/html" is incompatible with "application/json" (parsed="text/html")
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.jsonParse(WebHdfsFileSystem.java:443)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:465)
... 114 more

Any ideas on what's causing the content-type error? Why is the Ambari client not setting the content type correctly when calling the HDFS API? By the way, I have already added hadoop.proxyuser.root.groups=* and hadoop.proxyuser.root.hosts=*, so the exception is not caused by the Ambari user being unable to write to the HDFS volume.
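As a side note on the error itself: the WebHDFS client refuses to parse a response body as JSON when its Content-Type is not application/json, which is exactly what happens when a gateway or proxy answers with an HTML error page (such as a 504) instead of the NameNode. A minimal sketch of that behavior (my own illustration, not the actual Hadoop code):

```python
import json

def parse_webhdfs_response(content_type: str, body: str) -> dict:
    # Loosely mimics WebHdfsFileSystem's content-type validation:
    # anything other than application/json is rejected before parsing.
    if not content_type.startswith("application/json"):
        raise IOError(
            f'Content-Type "{content_type}" is incompatible with "application/json"'
        )
    return json.loads(body)

# A 504 page emitted by a proxy (e.g. Knox or a corporate HTTP proxy)
# arrives as text/html, reproducing the IOException in the stack trace:
try:
    parse_webhdfs_response("text/html", "<html>504 Gateway Timeout</html>")
except IOError as exc:
    print("rejected:", exc)

# A real WebHDFS success response is JSON and parses cleanly:
ok = parse_webhdfs_response("application/json", '{"boolean": true}')
print(ok)
```

So the "incompatible" message is a symptom, not the cause: something between Ambari and the NameNode is answering with HTML.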