Member since: 11-02-2016
Posts: 31
Kudos Received: 4
Solutions: 0
12-15-2017
06:51 PM
Thank you bkosaraju
12-14-2017
09:56 PM
Hello all, I want to ingest application logs from application servers into HDFS using Flume version 1.5. Do I need to install a Flume agent (client) on these application servers? How can I pull these application logs without installing a Flume agent? Note that these servers are not part of the Hadoop cluster. Can you please help? Thanks, JN
Labels:
- Apache Flume
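For anyone landing on this thread later: Flume generally needs a lightweight agent wherever the logs originate, with a second-tier collector agent on a Hadoop edge node doing the HDFS writes. A minimal two-tier sketch (the hostnames, paths, and port below are hypothetical placeholders, not values from this thread):

```properties
# Agent "app" on the application server (not part of the cluster): spooldir -> avro
app.sources = s1
app.channels = c1
app.sinks = k1
app.sources.s1.type = spooldir
app.sources.s1.spoolDir = /var/log/myapp/spool      # hypothetical path
app.sources.s1.channels = c1
app.channels.c1.type = memory
app.sinks.k1.type = avro
app.sinks.k1.hostname = edge-node.example.com       # hypothetical host
app.sinks.k1.port = 4141
app.sinks.k1.channel = c1

# Agent "coll" on the Hadoop edge node: avro -> hdfs
coll.sources = r1
coll.channels = c1
coll.sinks = h1
coll.sources.r1.type = avro
coll.sources.r1.bind = 0.0.0.0
coll.sources.r1.port = 4141
coll.sources.r1.channels = c1
coll.channels.c1.type = memory
coll.sinks.h1.type = hdfs
coll.sinks.h1.hdfs.path = hdfs://namenode:8020/flume/applogs/%Y%m%d
coll.sinks.h1.hdfs.useLocalTimeStamp = true
coll.sinks.h1.channel = c1
```

Each agent is started separately, e.g. `flume-ng agent --conf conf --conf-file app.conf --name app` on the application server and the same command with `--name coll` on the edge node.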
08-09-2017
01:38 PM
Hi Sahi, thanks for your post. I am now able to use the to_utc_timestamp() and from_utc_timestamp() functions. Thanks.
08-07-2017
08:58 PM
Hello experts, the query below returns PST time in Hive, but how can I convert this PST time to UTC time? Please help.

select from_unixtime(unix_timestamp()-1*60*60*4, 'yyyyMMddHH') as pst_time;

Thanks in advance, JN
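For later readers: as the reply above notes, Hive's built-in to_utc_timestamp() and from_utc_timestamp() handle this conversion without a hand-computed epoch offset (a fixed `-1*60*60*4` offset also breaks whenever daylight saving time changes; a named time zone does not). A small sketch:

```sql
-- Convert the server-local (Pacific) timestamp to UTC
SELECT to_utc_timestamp(current_timestamp, 'America/Los_Angeles') AS utc_time;

-- And back from UTC to Pacific time
SELECT from_utc_timestamp(current_timestamp, 'America/Los_Angeles') AS pst_time;
```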
07-12-2017
05:43 PM
Hello all, how can I set a variable in Hive and insert its value as a partition? With the commands below, the partition does not get the datetime from the variable I set; it gets the constant string instead.

set hiveVar:var= from_unixtime(unix_timestamp()-1*60*60*4, 'yyyyMMddHH');
INSERT INTO TABLE tmp PARTITION (datehour='${hiveVar:var}') SELECT * FROM tmp2;

Please help. Thanks, JN
Labels:
- Apache Hive
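A note for future readers: Hive variable substitution is purely textual, so `${hiveVar:var}` pastes the unevaluated expression string into the statement rather than the computed datetime, which is why the partition ends up constant. One way around this (a sketch, assuming `tmp` is partitioned by a string column `datehour` and `tmp2` has the matching non-partition columns) is a dynamic-partition insert that computes the value in the SELECT:

```sql
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

INSERT INTO TABLE tmp PARTITION (datehour)
SELECT t.*, from_unixtime(unix_timestamp() - 1*60*60*4, 'yyyyMMddHH') AS datehour
FROM tmp2 t;
```

With dynamic partitioning, the partition value comes from the last column of the SELECT, so it is evaluated at query time instead of substituted as text.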
07-06-2017
01:54 PM
Thanks Kuldeep. It's working now.
07-05-2017
09:29 PM
Hello all, I am getting the error below when I try to submit a workflow from the Workflow Manager View in Ambari:

at java.lang.Thread.run(Thread.java:745) Caused by: org.apache.hadoop.security.authorize.AuthorizationException: User: root is not allowed to impersonate admin

I also added these properties to core-site:

hadoop.proxyuser.root.groups="users"
hadoop.proxyuser.root.hosts=ambari-server.hostname

Ambari runs as root, but I log in to Ambari as admin. Could you please help me avoid this AuthorizationException? Many thanks, JT Ng
Labels:
- Apache Ambari
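For reference: this impersonation error means the proxyuser rules in core-site.xml do not cover the user/host combination being used. The `admin` user must belong to one of the groups named in `hadoop.proxyuser.root.groups`, and the originating host must match `hadoop.proxyuser.root.hosts`. A sketch of a permissive setting for testing (the wildcard values are illustrative; tighten them for production):

```xml
<!-- core-site.xml: allow the root user to impersonate other users -->
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
```

HDFS (and dependent services) must be restarted after changing these properties for them to take effect.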
06-07-2017
06:58 PM
2 Kudos
Hi all, I am getting this error in Hive:

info=[Error: Failure while running task: java.lang.IllegalArgumentException: tez.runtime.io.sort.mb 2027 should be larger than 0 and should be less than the available task memory (MB): 538. Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:12, Vertex vertex_1496756949291_0733_1_00 [Map 1] killed/failed due to: OWN_TASK_FAILURE] Vertex killed, vertexName=Reducer 2, vertexId=vertex_1496756949291_0733_1_01, diagnostics=[Vertex received Kill while in RUNNING state., Vertex did not succeed due to OTHER_VERTEX_FAILURE, failedTasks:0 killedTasks:101, Vertex vertex_1496756949291_0733_1_01 [Reducer 2] killed/failed due to: OTHER_VERTEX_FAILURE] DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1

The error is triggered by this query: select ordernum, count(*) from status group by ordernum having count(*) > 1;

I've changed tez.runtime.io.sort.mb from 2027 to 409, but I am still getting the same error reporting tez.runtime.io.sort.mb 2027. Please help. Thanks
Labels:
- Apache Hive
- Apache Tez
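For later readers: the exception says the sort buffer (2027 MB) must be smaller than the task's available memory (538 MB), so either the buffer must shrink or the container must grow. If a smaller value appears to be ignored, it is likely being overridden at another scope (for example in tez-site.xml). A sketch of setting both at the session level before running the query (the sizes are illustrative):

```sql
-- Give the Tez container more memory and keep the sort buffer well below it
SET hive.tez.container.size = 2048;   -- MB, illustrative value
SET tez.runtime.io.sort.mb = 409;     -- must be less than available task memory
SELECT ordernum, count(*) FROM status GROUP BY ordernum HAVING count(*) > 1;
```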
06-07-2017
02:12 PM
Hello all, I have around 20 workflows with a lot of dependencies between them. Can I have one job.properties file for all 20 workflows? What about the coordinator and bundle? Should they use the same job.properties, or do I need a separate job.properties for each workflow, each coordinator, and the bundle? For example:

/bundle.xml
/job.properties (for bundle properties)
/workflowA
  /coordinator.xml
  /workflow.xml
  /job.properties
/workflowB
  /coordinator.xml
  /workflow.xml
  /job.properties

Please advise. Thanks
Labels:
- Apache Oozie
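One common pattern for this situation: job.properties is only read by the Oozie client at submission time, so a single file next to the bundle is enough. The bundle passes parameters down to each coordinator, and each coordinator passes them to its workflow, so per-workflow job.properties files are not required. A sketch of such a shared file (the hostnames and paths are hypothetical placeholders):

```properties
# job.properties used to submit the bundle (illustrative values)
nameNode=hdfs://namenode:8020
jobTracker=resourcemanager:8032
oozie.use.system.libpath=true
appRoot=${nameNode}/user/me/apps
oozie.bundle.application.path=${appRoot}/bundle.xml
```

The bundle would then be submitted with `oozie job -config job.properties -run`, and inside bundle.xml each coordinator entry can set its own app-path and configuration overrides.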