Support Questions


Unable to open File View from Ambari UI

Rising Star

I am unable to open the File View from the Ambari UI because the HDFS service check fails. The error I am getting is ""fs.defaultFS" is not configured". The value of this property is already present in core-site.xml, but it does not work.

Is there anywhere else I need to set this property, in addition to core-site.xml?
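For reference, the property is set in core-site.xml in the usual way; a minimal fragment looks like this (the NameNode host and port below are placeholders for illustration, not values from this cluster):

```xml
<!-- core-site.xml fragment (illustrative only; replace host/port with your NameNode address) -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://namenode.example.com:8020</value>
</property>
```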

Thanks

Rahul

1 ACCEPTED SOLUTION

Master Mentor

@rahul gulati

Local/remote cluster configuration can be done at the view level, as described in:

https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.0.0/bk_ambari-administration/content/registerin...

Once a remote cluster is configured, you can create view instances that use it:

"Manage Ambari" --> Views --> Files --> Create Instance.

[Image: 12520-remote-cluster-used-in-fileview.png]


As you can see in the image above, I am using "ErieClusterRemote", a remote cluster, for my File View. The default view instance uses the "Local Cluster".


16 REPLIES


Rising Star

@Jay SenSharma

I have created new views named "File View New" and "Hive View New" on the local cluster. My checks now pass and I am able to browse the HDFS directories, but there are two things I observed:

1) When I try to open a file stored in HDFS, I get the same error:

16 Feb 2017 07:27:58,295 ERROR [ambari-client-thread-7609] ContainerResponse:537 - Mapped exception to response: 500 (Internal Server Error) org.apache.ambari.view.commons.exceptions.ServiceFormattedException

2) Under the Hive View, when I run any command I see an error like "Could not write to /user/admin/hive/jobs" (HDFSSApiException). I have created the directory using the commands mentioned below, but I still cannot see why this is happening. I used the link below to create the admin directory in HDFS.

I am getting the second error mentioned in the link below:

https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.0.0/bk_ambari_views_guide/content/troubleshooti...

Master Mentor

@rahul gulati

Have you also changed the permissions/ownership of the directory "/user/admin"?

sudo -u hdfs hdfs dfs -mkdir /user/admin
sudo -u hdfs hdfs dfs -chown admin:hadoop /user/admin
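If the Hive View still cannot write to /user/admin/hive/jobs, it may also be worth pre-creating that full path and fixing ownership recursively. A sketch, assuming the same admin user and hadoop group as in the commands above (adjust for your cluster):

```shell
# Pre-create the Hive View jobs path and hand the whole tree to admin.
# User/group names match the commands above; verify they fit your cluster.
sudo -u hdfs hdfs dfs -mkdir -p /user/admin/hive/jobs
sudo -u hdfs hdfs dfs -chown -R admin:hadoop /user/admin
sudo -u hdfs hdfs dfs -ls /user/admin/hive
```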


Rising Star

@Jay SenSharma

Yes. I have even set 777 permissions as well:

hadoop fs -ls /user
Found 8 items
drwxrwxrwx - admin hadoop 0 2017-02-16 07:33 /user/admin

It looks good, but I do not know why it is not working.

Master Mentor

@rahul gulati

Can you please share the error you are getting when it fails to write content to HDFS?

Rising Star

@Jay SenSharma

The error is mentioned below:

16 Feb 2017 09:11:22,733 ERROR [ambari-client-thread-10030] ContainerResponse:537 - Mapped exception to response: 500 (Internal Server Error) org.apache.ambari.view.hive2.utils.ServiceFormattedException
    at org.apache.ambari.view.hive2.resources.jobs.viewJobs.JobControllerImpl.setupQueryFile(JobControllerImpl.java:270)
    at org.apache.ambari.view.hive2.resources.jobs.viewJobs.JobControllerImpl.setupQueryFileIfNotPresent(JobControllerImpl.java:178)
    at org.apache.ambari.view.hive2.resources.jobs.viewJobs.JobControllerImpl.afterCreation(JobControllerImpl.java:164)
    at org.apache.ambari.view.hive2.resources.jobs.viewJobs.JobResourceManager.create(JobResourceManager.java:56)
    at org.apache.ambari.view.hive2.resources.jobs.JobService.create(JobService.java:523)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
    at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)

Rising Star

@Jay SenSharma

Following the link below, I have added all the respective properties in the custom core-site.xml (via Ambari), but with no success.

http://docs.hortonworks.com/HDPDocuments/Ambari-2.2.2.0/bk_ambari_views_guide/content/_configuring_y...

I have added four more properties:

hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=*

hadoop.proxyuser.admin.groups=*
hadoop.proxyuser.admin.hosts=*
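For completeness, in the raw core-site.xml these key/value pairs correspond to property entries like the following (same values as above; "*" is wide open and would normally be tightened in production). HDFS typically needs a restart after changing proxy-user settings for them to take effect:

```xml
<!-- Proxy-user entries matching the key/value pairs above.
     "*" permits impersonation from any host / for any group. -->
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.admin.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.admin.groups</name>
  <value>*</value>
</property>
```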