Member since: 09-18-2015
Posts: 3274
Kudos Received: 1159
Solutions: 426
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 2625 | 11-01-2016 05:43 PM |
|  | 8754 | 11-01-2016 05:36 PM |
|  | 4925 | 07-01-2016 03:20 PM |
|  | 8267 | 05-25-2016 11:36 AM |
|  | 4434 | 05-24-2016 05:27 PM |
01-18-2016
03:20 PM
3 Kudos
Original post. Use case: access control on table customer, excluding column SSN. User hive has access to see only the name column; access to the SSN column is restricted. Column-level security can be configured in a couple of clicks in the Ranger UI.
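Conceptually, the Ranger policy grants user hive SELECT on customer.name and nothing else. A minimal Python sketch of that check semantics (the policy structure and function name are illustrative, not Ranger's API; table, column, and user names come from the post):

```python
# Minimal sketch of column-level access semantics (not the Ranger implementation):
# user "hive" may SELECT only "name" on table "customer"; everything else is denied.
POLICY = {
    ("customer", "name"): {"hive"},   # allowed (table, column) -> set of users
}

def can_select(user: str, table: str, column: str) -> bool:
    """Return True only if the user is granted SELECT on that exact column."""
    return user in POLICY.get((table, column), set())

print(can_select("hive", "customer", "name"))  # allowed
print(can_select("hive", "customer", "ssn"))   # restricted
```

The point of the sketch is that the deny is the default: any (table, column) pair not explicitly granted resolves to no access.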
01-18-2016
03:16 PM
1 Kudo
@AR You can use yum install hadoop-* if that particular node is not part of the Ambari stack. Check http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_installing_manually_book/content/ch_getting_ready_chapter.html for details. For example: yum install hadoop hadoop-hdfs hadoop-libhdfs hadoop-yarn hadoop-mapreduce hadoop-client openssl. You can pick just hadoop-client. It's a really good doc.
01-18-2016
03:05 PM
@Gagan Dutt Perfect! Please accept the best answer to close the loop 🙂 It was a lot of work indeed 😛
01-18-2016
01:13 PM
@Guilherme Braccialli https://community.hortonworks.com/articles/10365/apache-zeppelin-and-sparkr.html
01-18-2016
01:09 PM
1 Kudo
@Gagan Dutt That's the real problem. Try telnet 192.168.64.133 8080; if it works, then http://192.168.64.133:8080 should too.
01-18-2016
12:47 PM
@Gagan Dutt A few checks from your laptop: 1) telnet 127.0.0.1 8080 (working?) 2) once you are in the VM, run ifconfig -a 3) if check 1 works, then http://127.0.0.1:8080 should work. I am assuming you are not on a VPN and no firewall is blocking the connection.
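The telnet check above is just a TCP connect test; the same probe can be sketched in Python (host and port are the ones from the post, the function name is mine):

```python
# Rough equivalent of "telnet host port": a TCP connect that succeeds
# within the timeout means the port is reachable and open.
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print(port_open("127.0.0.1", 8080))
```

If this returns False from the laptop but the service is up inside the VM, the problem is port forwarding or a firewall, not Ambari itself.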
01-18-2016
12:39 PM
@Anshul Sisodia I hope you are not hitting this: https://issues.apache.org/jira/browse/HDFS-770
01-18-2016
12:26 PM
1 Kudo
@Lars Kinder Thanks for sharing the data format. I was able to reproduce it:
2016-01-18 12:20:55,528 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2016-01-18 12:20:55,537 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias b. Backend error : org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing [POUserFunc (Name: POUserFunc(org.apache.pig.builtin.ToDate2ARGS)[datetime] - scope-4 Operator Key: scope-4) children: null at []]: java.lang.IllegalArgumentException: Invalid format: "2009-10-08 12:00:00" is malformed at "-10-08 12:00:00"
Details at logfile: /home/hdfs/pig_1453119614848.log
It worked with this:
b = Foreach a Generate ToDate(date, 'yyyy-MM-dd HH:mm:ss') as dateString;
2016-01-18 12:25:57,495 [main] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
2016-01-18 12:25:57,496 [main] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.15:8050
2016-01-18 12:25:57,500 [main] INFO org.apache.hadoop.mapred.ClientServiceDelegate - Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting to job history server
2016-01-18 12:25:57,604 [main] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
2016-01-18 12:25:57,604 [main] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.15:8050
2016-01-18 12:25:57,609 [main] INFO org.apache.hadoop.mapred.ClientServiceDelegate - Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting to job history server
2016-01-18 12:25:57,647 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Success!
2016-01-18 12:25:57,650 [main] INFO org.apache.pig.data.SchemaTupleBackend - Key [pig.schematuple] was not set... will not generate code.
2016-01-18 12:25:57,663 [main] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2016-01-18 12:25:57,664 [main] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
(2009-10-08T12:00:00.000Z)
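The underlying failure is the same in any date library: "2009-10-08 12:00:00" uses a space separator, not the 'T' of strict ISO 8601, so a default ISO parser rejects it (as the "malformed at" error shows) while an explicit 'yyyy-MM-dd HH:mm:ss' pattern succeeds. The same behavior sketched in Python for illustration:

```python
from datetime import datetime

raw = "2009-10-08 12:00:00"          # the timestamp from the post

# Parsing with an explicit pattern -- analogous to the working
# ToDate(date, 'yyyy-MM-dd HH:mm:ss') -- succeeds:
parsed = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S")
print(parsed)

# Forcing a strict ISO-style pattern with a 'T' separator, which is
# roughly what the one-argument ToDate expects, fails:
try:
    datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S")
except ValueError as exc:
    print("rejected:", exc)
```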
01-18-2016
12:14 PM
1 Kudo
@Anilkumar Panda I like your approach. This is not different from a typical RDBMS or any other application stack.
1) Shut down all services using Ambari. You can use the REST API to shut down services if you like (you can create a shell script to call the stop function on the services you have installed).
2) Shut down ambari-agent on all nodes.
3) Shut down ambari-server.
4) Reboot all nodes as required.
5) Restart ambari-server, the agents, and the services, in that order. (chkconfig on ambari-server and ambari-agent helps bring the services up on reboot.)
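Step 1 can be scripted against the Ambari REST API by PUT-ing ServiceInfo state = INSTALLED (Ambari's "stopped" state) to the cluster's services endpoint. A sketch that only builds the request, without sending it; the host, port, and cluster name are placeholders, and you should verify the endpoint against your Ambari version:

```python
import json
import urllib.request

def build_stop_all_request(host: str, cluster: str) -> urllib.request.Request:
    """Build (but do not send) the Ambari REST call that stops all services.

    Setting ServiceInfo.state to INSTALLED asks Ambari to stop every service.
    host and cluster are illustrative placeholders.
    """
    url = f"http://{host}:8080/api/v1/clusters/{cluster}/services"
    body = {
        "RequestInfo": {"context": "Stop all services before maintenance"},
        "Body": {"ServiceInfo": {"state": "INSTALLED"}},
    }
    req = urllib.request.Request(url, data=json.dumps(body).encode(), method="PUT")
    req.add_header("X-Requested-By", "ambari")  # header Ambari requires on writes
    return req

req = build_stop_all_request("ambari.example.com", "mycluster")
print(req.get_method(), req.full_url)
```

To actually send it you would add basic-auth credentials and pass the request to urllib.request.urlopen; the symmetric start is the same PUT with state STARTED.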