Member since: 01-03-2017
Posts: 11
Kudos Received: 0
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 5320 | 01-03-2017 09:37 AM |
05-11-2017
09:33 AM
Yes, it was this. I just updated all the services and it works 🙂 Thanks a lot @nshelke
05-11-2017
08:45 AM
@nshelke Thanks a lot! Updating all the services worked. This is the output of yum list installed | grep ambari on the server:
ambari-agent.x86_64 2.5.0.3-7 @ambari-2.5.0.3
ambari-metrics-grafana.x86_64 2.5.0.3-7 @ambari-2.5.0.3
ambari-metrics-hadoop-sink.x86_64 2.5.0.3-7 @ambari-2.5.0.3
ambari-metrics-monitor.x86_64 2.5.0.3-7 @ambari-2.5.0.3
ambari-server.x86_64 2.5.0.3-7 @ambari-2.5.0.3
This is the output of yum list installed | grep ambari on the agents:
ambari-agent.x86_64 2.5.0.3-7 @ambari-2.5.0.3
ambari-metrics-hadoop-sink.x86_64 2.5.0.3-7 @ambari-2.5.0.3
ambari-metrics-monitor.x86_64 2.5.0.3-7 @ambari-2.5.0.3
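A quick way to confirm the server and every agent ended up on the same Ambari build is to extract the version column from the yum output and count the distinct values. This is a sketch, not part of the original thread; the sample lines mirror the listings above:

```shell
# Sketch: count distinct Ambari package versions in "yum list installed" output.
# Feed it the combined output collected from the server and each agent.
check_versions() {
  # yum lines look like: "ambari-agent.x86_64  2.5.0.3-7  @ambari-2.5.0.3"
  awk '{print $2}' | sort -u | wc -l
}

sample_output="ambari-agent.x86_64 2.5.0.3-7 @ambari-2.5.0.3
ambari-server.x86_64 2.5.0.3-7 @ambari-2.5.0.3"

unique=$(printf '%s\n' "$sample_output" | check_versions)
echo "distinct versions: $unique"   # 1 means every package is on the same build
```

If the count is greater than 1, some host still has packages from the old repository and needs the same update the thread describes.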
05-11-2017
07:48 AM
I upgraded the Ambari server and agents from version 2.4.2 to 2.5. After upgrading, I access the Ambari dashboard at http://hdp-master1.ikerlan.es:8080 and some services are in Heartbeat Lost status. The services on the ambari-server host are in that status, but HDFS, for example, works when I access hdp-master1.ikerlan.es:50070.
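When hosts show Heartbeat Lost after an upgrade, a typical first step (a hedged sketch, not the confirmed fix from this thread, which turned out to be updating all the agent packages) is to restart the agent on each affected host and check its log:

```shell
# Hedged sketch: first checks on a host showing "Heartbeat Lost" (run as root).
ambari-agent status     # is the agent process running at all?
ambari-agent restart    # restart so the host re-registers with the server

# If the heartbeat still does not return, the agent log usually says why:
tail -n 50 /var/log/ambari-agent/ambari-agent.log
```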
Labels:
- Apache Ambari
03-21-2017
01:48 PM
I only disabled the local machine firewall on the master and on the agents, and it worked for me.
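On CentOS 7 the commands for this would be roughly the following (a sketch; note that opening the specific Ambari/HDP ports is generally safer than disabling the firewall outright):

```shell
# Hedged sketch: disable the local firewall on CentOS 7 (master and each agent).
systemctl stop firewalld       # stop the firewall now
systemctl disable firewalld    # keep it off across reboots
systemctl status firewalld     # confirm it reports "inactive (dead)"
```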
01-03-2017
10:10 AM
Thanks a lot, it works great! 🙂
01-03-2017
09:52 AM
Why can't I upload a table to the Hive view? When I try to upload, it throws the following error: E090 HDFS020 Could not write file /user/admin/hive/jobs/hive-job-11-2017-01-03_06-05/query.hql [HdfsApiException]. I have set all the permissions in the HDFS custom core-site, but the upload still fails with the error above. I am using CentOS 7 with Ambari HDP. My firewalld is disabled. Could it be SELinux?
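Two things worth checking for an HDFS write failure like this (a hedged sketch; the paths follow the error message above and the fix in this thread turned out to be the firewall, not SELinux):

```shell
# Hedged sketch: checks when the Hive view cannot write under /user/admin.
getenforce          # "Enforcing" means SELinux could be interfering
setenforce 0        # temporarily switch SELinux to permissive (root, for testing)

# Make sure the 'admin' user actually owns its HDFS home directory
# (run as the HDFS superuser):
sudo -u hdfs hdfs dfs -mkdir -p /user/admin
sudo -u hdfs hdfs dfs -chown -R admin:hdfs /user/admin
```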
Labels:
- Apache Hadoop
- Apache Hive
01-03-2017
09:37 AM
Okay, it works... The simple solution was to disable the firewall. Thanks a lot.
01-03-2017
09:33 AM
Yes, I can open it. This is my output: [root@hdp-master1 ~]# hdfs dfs -cat /tmp/data/geolocation.csv | head
truckid,driverid,event,latitude,longitude,city,state,velocity,event_ind,idling_ind
A54,A54,normal,38.440467,-122.714431,Santa Rosa,California,17,0,0
A20,A20,normal,36.977173,-121.899402,Aptos,California,27,0,0
A40,A40,overspeed,37.957702,-121.29078,Stockton,California,77,1,0
A31,A31,normal,39.409608,-123.355566,Willits,California,22,0,0
A71,A71,normal,33.683947,-117.794694,Irvine,California,43,0,0
A50,A50,normal,38.40765,-122.947713,Occidental,California,0,0,1
A51,A51,normal,37.639097,-120.996878,Modesto,California,0,0,1
A19,A19,normal,37.962146,-122.345526,San Pablo,California,0,0,1
A77,A77,normal,37.962146,-122.345526,San Pablo,California,25,0,0
cat: Unable to write to output stream.
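For the record, the "Unable to write to output stream" line at the end is expected and harmless: head exits after ten lines and closes the pipe, so the writer gets a broken-pipe error. The same behavior is reproducible locally without HDFS:

```shell
# `head` closes the pipe after N lines; the upstream writer then fails to write.
# Plain `yes` shows exactly the same pattern as `hdfs dfs -cat ... | head`:
yes | head -n 3    # prints three lines of "y", then yes is stopped by SIGPIPE
```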
01-03-2017
09:18 AM
Yes, the file itself is fine: if I upload it from local, the view detects and shows the data. The problem is uploading from HDFS. I have given full permissions to that file and, recursively, to its directories. The output is the following: [root@hdp-master1 ~]# hdfs dfs -ls /tmp/data/geolocation.csv
-rwxrwxrwx 3 admin hdfs 526677 2017-01-02 11:24 /tmp/data/geolocation.csv
01-03-2017
09:08 AM
I am using Ambari HDP on a CentOS 7 cluster with 1 master and 2 slaves. I have checked it and it does not contain any whitespace. It is very strange :S. Thanks a lot.