Member since: 12-19-2016
Posts: 149
Kudos Received: 15
Solutions: 2
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3893 | 04-04-2017 03:01 PM |
| | 1686 | 01-17-2017 10:44 AM |
08-13-2020
07:58 AM
@stevenmatison Yes, I got your point! But when you create a Hive table with varchar of a sufficient length, can the column datatype change from varchar to string automatically? When I create a view out of that table, the datatype is getting changed to string.
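For context, a minimal HiveQL sketch of the scenario described above (table, view, and column names are hypothetical):
CREATE TABLE customers (name VARCHAR(100), city VARCHAR(50));
CREATE VIEW customers_v AS SELECT name, city FROM customers;
DESCRIBE customers_v; -- this is where the view's columns are reported as string rather than varchar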
04-17-2020
03:02 PM
@testingsauce I am also facing this issue. I saved the DataFrame in Hive using saveAsTable, but when I try to fetch results using hiveContext.sql(query), it doesn't return anything. Badly stuck, please help.
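A couple of HiveQL checks can help narrow this down, e.g. confirming the table is registered and readable (the table name below is hypothetical):
SHOW TABLES LIKE 'my_saved_table';
DESCRIBE FORMATTED my_saved_table; -- shows the storage format and location that saveAsTable produced
SELECT COUNT(*) FROM my_saved_table; -- confirms the data itself is readable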
11-19-2017
05:42 PM
@Marco Almeida Thank you for your help. I did try various versions of phoenix-core but had no luck; let me try this.
08-29-2017
06:26 PM
As @ssattiraju mentioned, you may use a file with commands, providing it as a command-line parameter. One quick note: if one of the commands fails, the script will stop executing. To avoid that, you may use a simple redirect instead: phoenix-sqlline localhost < file.sql
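For illustration, file.sql would just contain one SQL statement per line; the table and column names here are hypothetical:
CREATE TABLE IF NOT EXISTS events (id BIGINT NOT NULL PRIMARY KEY, payload VARCHAR);
UPSERT INTO events VALUES (1, 'hello');
SELECT * FROM events;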
06-16-2017
02:35 PM
1 Kudo
@Dinesh Das
Prohibition against dataset combinations: https://community.hortonworks.com/content/kbentry/63664/how-to-create-a-ranger-policy-that-prohibits-combi.html
Data expiry-based access policy: https://community.hortonworks.com/articles/92083/using-atlas-and-ranger-to-enforce-data-expiration.html
Location-specific access policies: https://community.hortonworks.com/articles/57314/customizing-ranger-policies-with-dynamic-context.html
As always, if you find this post helpful, don't forget to "accept" the answer.
06-14-2017
03:07 PM
1 Kudo
Hi @Dinesh Das You might have better luck passing the actual TRUNCATE TABLE statement to sqoop eval via the --query option instead of trying to --call the stored procedure (I think stored-procedure calls are only supported in sqoop export, not sqoop eval, but I could be mistaken). As always, if you find this post useful, please "accept" the answer.
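To be concrete, the statement handed to --query would just be the plain SQL, quoted on the sqoop command line (the table name here is hypothetical):
TRUNCATE TABLE staging_orders;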
06-10-2017
09:48 AM
Perfect answer!
06-02-2017
08:05 AM
You need to create an external table, then load data either from the local filesystem or from HDFS.
CREATE EXTERNAL TABLE <table_name> (column1 <datatype>, column2 <datatype>, column3 <datatype>, ...) ROW FORMAT DELIMITED STORED AS SEQUENCEFILE LOCATION '<location_path>';
Now load the data:
LOAD DATA INPATH '<HDFS path>' INTO TABLE <table_name>;
(To load from the local filesystem instead, use LOAD DATA LOCAL INPATH '<local path>' INTO TABLE <table_name>;)
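For example, with hypothetical names and paths:
CREATE EXTERNAL TABLE web_logs (ip STRING, ts STRING, url STRING) ROW FORMAT DELIMITED STORED AS SEQUENCEFILE LOCATION '/data/web_logs';
LOAD DATA INPATH '/staging/web_logs/day1.seq' INTO TABLE web_logs;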
06-05-2017
05:21 PM
You can also use the atlas-client artifact (Java) to import the xls metadata into Atlas. I'd recommend going through atlas.incubator.apache.org/api/v2/index.html for in-depth REST documentation for Atlas (0.8 onwards). Don't forget to upvote/accept the answer if you find it useful.