Member since: 10-01-2015
Posts: 3933
Kudos Received: 1150
Solutions: 374
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3361 | 05-03-2017 05:13 PM
 | 2790 | 05-02-2017 08:38 AM
 | 3065 | 05-02-2017 08:13 AM
 | 3002 | 04-10-2017 10:51 PM
 | 1508 | 03-28-2017 02:27 AM
11-20-2015
02:27 AM
According to http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_Sys_Admin_Guides/content/ref-db219cd6-c586-49c1-bc56-c9c1c5475276.1.html, this property is still advised to be configured, yet the JIRA for its deprecation is https://issues.apache.org/jira/browse/HBASE-11520. Is there a better instruction set to follow for enabling BucketCache than the doc we provide?
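For context, the setup I'm after boils down to something like this sketch of hbase-site.xml for the post-HBASE-11520 style; the 4096 MB size is a placeholder I picked, and hbase-env.sh also needs HBASE_OFFHEAPSIZE set above it:

<!-- hbase-site.xml: off-heap BucketCache without the deprecated
     hbase.bucketcache.percentage.in.combinedcache property -->
<property>
  <name>hbase.bucketcache.ioengine</name>
  <value>offheap</value>
</property>
<property>
  <name>hbase.bucketcache.size</name>
  <value>4096</value> <!-- placeholder: cache size in MB -->
</property>
<!-- in hbase-env.sh (assuming HBase 1.x style):
     export HBASE_OFFHEAPSIZE=5g   must exceed hbase.bucketcache.size -->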
Labels:
- Apache HBase
11-19-2015
08:55 PM
2 Kudos
Load the JSON as a string with column name "json", then use get_json_object:

SELECT get_json_object(json, '$.id') AS ID,
       get_json_object(json, '$.person.last_name') AS LastName,
       get_json_object(json, '$.person.first_name') AS FirstName,
       get_json_object(json, '$.person.email') AS email,
       get_json_object(json, '$.person.location.address') AS Address,
       get_json_object(json, '$.person.location.city') AS City,
       get_json_object(json, '$.person.location.state') AS State,
       get_json_object(json, '$.person.location.zipcode') AS Zip,
       get_json_object(json, '$.person.text') AS Text,
       get_json_object(json, '$.person.url') AS URL
FROM HBASE_JSON_TABLE;

OR use json_tuple with lateral views:

SELECT id, lastName, firstName, email, city, state, text, url
FROM hbase_json_table A
LATERAL VIEW json_tuple(A.json, 'id', 'person') B AS id, person
LATERAL VIEW json_tuple(person, 'last_name', 'first_name', 'email', 'text', 'url', 'location') C AS lastName, firstName, email, text, url, loc
LATERAL VIEW json_tuple(loc, 'city', 'state') D AS city, state;

OR use the JSON SerDe:

CREATE EXTERNAL TABLE json_serde_table (
id string,
person struct<email:string, first_name:string, last_name:string, location:struct<address:string, city:string, state:string, zipcode:string>, text:string, url:string>)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION '/tmp/json/';
SELECT id, person.first_name, person.last_name, person.email,
person.location.address, person.location.city, person.location.state,
person.location.zipcode, person.text, person.url
FROM json_serde_table LIMIT 5;
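The SerDe table expects one JSON document per line under /tmp/json/; a made-up sample record matching the struct above (all values are placeholders) would be:

{"id":"1","person":{"email":"jdoe@example.com","first_name":"John","last_name":"Doe","location":{"address":"123 Main St","city":"Springfield","state":"IL","zipcode":"62701"},"text":"sample text","url":"http://example.com"}}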
11-18-2015
11:54 PM
I think you should be able to just compile against hbase-client and hadoop-client, with versions matching your cluster.
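If it helps, a minimal Maven dependency sketch; the versions shown are placeholders and should be matched to the HBase and Hadoop running on your cluster:

<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-client</artifactId>
  <version>1.1.2</version> <!-- placeholder: match your HBase version -->
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.7.1</version> <!-- placeholder: match your Hadoop version -->
</dependency>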
11-18-2015
07:57 PM
That's not what I was asking, but thanks.
11-18-2015
02:08 PM
2 Kudos
You can also use Microsoft Visual Studio (https://azure.microsoft.com/en-us/updates/updated-visual-studio-tools-for-apache-hive/) and Oracle SQL Developer (http://www.oracle.com/technetwork/developer-tools/sql-developer/overview/index-097090.html).
11-12-2015
02:10 PM
I believe the issue is that RHEL comes bundled with mysql-connector-java, or at least that once you try to install MySQL, yum grabs the package from the RHEL repo and ignores the package from our repo. If there's a way to make our repo take precedence, then I think we should be good.
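One way to test that precedence idea, as a sketch: the yum priorities plugin lets a repo win regardless of version. The repo file name HDP-UTILS.repo below is an assumption and may differ on your system:

# install the yum priorities plugin
yum install -y yum-plugin-priorities
# in /etc/yum.repos.d/HDP-UTILS.repo, add under the repo section:
#   priority=1
# (lower number = higher precedence; repos without one default to 99)
# then check which build yum would now install
yum list mysql-connector-java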
11-12-2015
01:50 AM
1 Kudo
@rmolina@hortonworks.com absolutely unchanged. Please run "rpm -qa | grep mysql-connector-java"; the RHEL version of the MySQL connector always supersedes the version provided in HDP-UTILS, even when the HDP-UTILS version is higher. Uninstalling the connector and installing the one from HDP-UTILS, which is v. 29, still didn't make a difference, so there's something else at play besides the version issue. SQOOP-1400 is included in Sqoop 1.4.6 and we ship 1.4.6, so I can't explain the problem. Here's an easy way to recreate the issue on a vanilla Sandbox VM:

# Create Hive table
drop table if exists export_table;
create table export_table (key int, value string)
row format delimited fields terminated by ",";

# populate Hive with dummy data
insert into export_table values("1", "ExportedValue");

# confirm Hive table has data
select * from export_table;

# export table to MySQL; the MySQL table must exist
su mysql
mysql -u root
create database export;
use export;
create table exported (rowkey int, value varchar(20));
exit;

# on the HDP 2.3.2 Sandbox, the SQOOP-1400 bug applies;
# use --driver com.mysql.jdbc.Driver to overcome the problem
# sqoop export from a Hive table into MySQL
sqoop export --connect jdbc:mysql://127.0.0.1/export --username hive --password hive --table exported --direct --export-dir /apps/hive/warehouse/export_table --driver com.mysql.jdbc.Driver

If you exclude the last flag, "--driver com.mysql.jdbc.Driver", the error will occur.
11-07-2015
10:01 PM
I don't have access to internal JIRAs. As for a workaround: when you do a yum update, the OS will report that v. 17 is the latest even though we ship v. 29 in the HDP-UTILS repo. Removing the connector and replacing it with v. 29 will not work either; I tried it. Adding the --driver flag is the only way I was able to get it to work. I also agree we should ship a later version. The whole reason for this post is to get SQOOP-1400 included in the next release of the Sandbox. What is more surprising is that we ship Sqoop 1.4.6, which should include the fix, but it seems it doesn't? @Neeraj @Deepesh
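To see the version skew directly, a quick check (output will vary by box):

# list every mysql-connector-java build each enabled repo offers
yum --showduplicates list mysql-connector-java
# show which repo the currently installed package came from
yum info mysql-connector-java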
11-06-2015
01:25 AM
You're sqooping master, model, and reportdb; I'm not sure you need to do that, so I would limit the job to just the tables you need. Other than that, please check the ulimit of the user executing the job, both in the foreground and in the background: http://www.commandlinefu.com/commands/view/9893/find-ulimit-values-of-currently-running-process.
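The linked trick amounts to reading the limits straight out of /proc; a sketch, assuming the job shows up in the process list under "sqoop":

# find the pid of the running sqoop job
PID=$(pgrep -f sqoop | head -1)
# print its effective limits, including max open files
cat /proc/$PID/limits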