Member since: 05-11-2016
Posts: 24
Kudos Received: 5
Solutions: 2
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 20166 | 10-31-2016 06:44 PM |
| | 6776 | 06-07-2016 09:44 PM |
10-31-2016
06:44 PM
1 Kudo
We can use the Scala API below to read the file and execute it: sqlContext.sql(scala.io.Source.fromFile("/vzhome/agaram8/HQLScripts/count.hql").getLines().mkString(" "))
10-31-2016
02:47 PM
I went through this Stackoverflow link, but I don't see any 'open' API in Spark... I'm getting a compiler error...
10-31-2016
02:41 PM
Please follow the steps below; they should resolve the issue.
1) Create a "lib" folder directly next to workflow.xml.
2) Copy sparkUber.jar to the lib folder.
3) In the job.properties file, add oozie.use.system.libpath=true.
4) Run: hdfs dfs -put /path/to/sparkUber.jar /<oozie_workflow_name>/lib/
10-31-2016
02:04 PM
This link explains how to execute Hive SQL using the spark-sql shell, but I want to call the file programmatically, not through the shell.
10-28-2016
08:20 PM
1 Kudo
I want to read an HQL file in a Spark job. This HQL creates a table by joining 3-4 other tables. I don't want to write the SQL statement in the Spark job; instead I want to pass the HQL file as an argument to the Spark job and then run it. Is this possible in Spark?
Labels:
- Apache Spark
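The approach above can be sketched in Scala: read the whole HQL file into a single SQL string and hand it to the SQL context. This is a minimal sketch; the object name `RunHql` and the idea of passing the file path as a spark-submit argument are illustrative assumptions, and the actual `sqlContext.sql` call is left commented since it needs a live Spark/Hive context.

```scala
import scala.io.Source

// Minimal sketch (names are illustrative). Pass the HQL file path as a
// spark-submit argument and execute its contents through the SQL context.
object RunHql {
  // Read the whole HQL file and join the lines into one SQL string.
  def loadHql(path: String): String =
    Source.fromFile(path).getLines().mkString(" ")

  def main(args: Array[String]): Unit = {
    val query = loadHql(args(0))
    // With a live HiveContext (Spark 1.x) or SparkSession (Spark 2.x):
    // sqlContext.sql(query)
    println(query)
  }
}
```

Note that `getLines()` returns an iterator, so the `mkString(" ")` call is what actually flattens the file into a single statement the SQL context can run.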
10-13-2016
03:19 PM
We want to restrict access to only a specified set of tables (200 tables) using Ranger Hive/HDFS policies. To achieve this, we created HDFS and Hive policies using the REST API. In the HDFS policy we individually list out all the HDFS paths, i.e. /apps/hive/warehouse/test.db/table1, /apps/hive/warehouse/test.db/table2, so that users cannot bypass the Ranger Hive policies and access all the tables using the Hive CLI or an HDFS client. The problem is that if the number of HDFS paths is more than 128, the REST API call for the HDFS policy fails with the error below:

{"statusCode":1,"msgDesc":"Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.5.2.v20131113-a7346c6): org.eclipse.persistence.exceptions.DatabaseException\nInternal Exception: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Out of range value for column 'sort_order' at row 1\nError Code: 1264\nCall: INSERT INTO x_policy_resource_map (ADDED_BY_ID, CREATE_TIME, sort_order, resource_id, UPDATE_TIME, UPD_BY_ID, value) VALUES (?, ?, ?, ?, ?, ?, ?)\n\tbind => [7 parameters bound]\nQuery: InsertObjectQuery(XXPolicyResourceMap [XXDBBase={createTime={Wed Oct 12 22:00:32 EDT 2016} updateTime={Thu Oct 13 13:56:38 EDT 2016} addedByUserId={1} updatedByUserId={1} } id=null, resourceId=167, value=/apps/hive/warehouse/dev_tbls.db/abcd, order=128])"}
... View more
Labels:
- Apache Ranger
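The error suggests the backing `sort_order` column overflows once a single policy carries more than 128 resources. A hedged workaround, assuming that limit holds in your Ranger version, is to split the path list across several policies. The sketch below only does the chunking; the 128 cap, the policy-naming scheme, and the idea of one REST POST per chunk are assumptions for illustration.

```scala
// Sketch: split a large HDFS path list into chunks that stay at or under the
// observed 128-resource limit, one Ranger policy per chunk. The limit value
// and naming scheme are assumptions to verify against your Ranger version.
object PolicyChunker {
  val MaxResourcesPerPolicy = 128

  // Partition the paths into groups of at most MaxResourcesPerPolicy.
  def chunkPaths(paths: Seq[String]): Seq[Seq[String]] =
    paths.grouped(MaxResourcesPerPolicy).map(_.toSeq).toSeq

  def main(args: Array[String]): Unit = {
    val paths = (1 to 200).map(i => s"/apps/hive/warehouse/test.db/table$i")
    chunkPaths(paths).zipWithIndex.foreach { case (chunk, i) =>
      // Each chunk would be submitted as its own policy via the REST API,
      // e.g. a hypothetical name like s"hdfs_tables_part_$i".
      println(s"hdfs_tables_part_$i: ${chunk.size} resources")
    }
  }
}
```

With 200 paths this yields two policies (128 and 72 resources), each safely under the apparent column limit.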
09-28-2016
09:09 PM
I'm not getting any response when I run the REST API call.
09-28-2016
09:06 PM
In the Hive schema, I have 300 tables and my requirement is to provide access to 100 of them to a set of users. Is there any way in Ranger to add tables in bulk while creating a policy? In the Ranger Create Policy UI, I can only add tables one by one. Is there a way to write a script that adds tables to a Ranger policy? Thanks for your help!
Labels:
- Apache Ranger
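Since the policy is created through the REST API rather than the UI, the table list can be generated by a script instead of typed one table at a time. The sketch below only builds the comma-separated table value for the request body; the exact JSON field name for tables varies by Ranger API version, so treat it as an assumption to verify.

```scala
// Sketch: build the table list for a Ranger policy request body from a list
// of table names, so 100 tables can go into one REST call instead of the UI.
// The "tables" field name is an assumption -- check your Ranger API version.
object TableListBuilder {
  // Join table names into the comma-separated form the policy APIs accept.
  def commaList(tables: Seq[String]): String = tables.mkString(",")

  def main(args: Array[String]): Unit = {
    val tables = (1 to 100).map(i => s"table$i")
    println(s"""{"tables": "${commaList(tables)}"}""")
  }
}
```

In practice the table names would come from `SHOW TABLES` output or a text file, one name per line, rather than being generated.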
09-28-2016
08:33 PM
I have created an internal user in the Ranger UI. Now I want to delete this user, but I do not see any option in the Ranger UI to delete a user. I also tried to delete it using the REST API, but with no luck. I used the following API call to delete the user: curl -u admin:admin -H 'X-Requested-By: ambari' -X DELETE http://<host_name>:6080/service/xusers/users/userName/<user_name>; Please let me know how I can delete a user in Ranger.
Labels:
- Apache Ranger
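One likely reason the name-based DELETE fails is that the xusers API deletes by numeric id rather than by user name: look the id up first, then issue the DELETE against the id. The sketch below only assembles the two URLs; the endpoint paths and the `forceDelete=true` parameter are assumptions to verify against your Ranger version's REST documentation.

```scala
// Sketch: delete a Ranger internal user in two steps -- GET the user's numeric
// id by name, then DELETE by id. The endpoint paths and the forceDelete
// parameter are assumptions; verify them for your Ranger version.
object RangerUserDelete {
  // Step 1: GET this URL (with admin credentials) and read "id" from the JSON.
  def lookupUrl(host: String, userName: String): String =
    s"http://$host:6080/service/xusers/users/userName/$userName"

  // Step 2: curl -u admin:admin -X DELETE "<this URL>"
  def deleteUrl(host: String, userId: Long): String =
    s"http://$host:6080/service/xusers/users/$userId?forceDelete=true"

  def main(args: Array[String]): Unit = {
    println(lookupUrl("ranger-host", "testuser"))
    println(deleteUrl("ranger-host", 42))
  }
}
```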