Member since: 10-02-2015
Posts: 9
Kudos Received: 3
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1913 | 06-03-2016 06:54 AM
 | 2804 | 05-23-2016 08:38 PM
07-27-2016
09:12 PM
From the Ambari UI you can see HBase status and performance metrics.
07-13-2016
06:23 AM
You can use a load balancer between the client and the region servers. I do not have read/write benchmarks comparing the Phoenix thick and thin clients; I would assume they are comparable, so it depends on how easy each is to implement and deploy in your application. The Phoenix thick client uses a JDBC connection to HBase, so the overhead should be minimal. If your application can tolerate read delay, you can use one cluster for reads and another cluster for writes, and set up replication between them.
06-22-2016
11:08 PM
Can you try putting the below first? (Pig needs the load assigned to a relation, and the ZooKeeper quorum passed as a quoted argument, e.g. a -param value.)

raw = LOAD 'hbase://table/JOURNEY_OFICINA_HBASE' USING org.apache.phoenix.pig.PhoenixHBaseLoader('$zkQuorum');
06-22-2016
07:39 PM
Looks like you need to put this in your Pig script. Note that a Pig AS schema takes Pig types (chararray, long, and so on), not SQL types like CHAR(4) or NOT NULL constraints:

raw_data = LOAD 'hdfs:/user/xx/journey_oficina_hbase' USING PigStorage(',') AS (
    CODNRBEENF:chararray,
    FECHAOPRCNF:chararray,
    CODINTERNO:chararray,
    CODTXF:chararray,
    FREQ:long
);
06-14-2016
06:43 AM
You need more on the classpath, especially if you are using features such as UDFs. Add these to your classpath: hbase_config_path, hadoop_common_jar, hadoop_hdfs_jar, hadoop_conf, hadoop_classpath.
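As a rough sketch of what those entries might look like, assuming an HDP-style layout (every path below is an assumption; adjust to your installation):

```shell
# Illustrative paths only (assumptions) -- substitute your own install locations.
export HBASE_CONF_DIR=/etc/hbase/conf        # hbase_config_path
export HADOOP_CONF_DIR=/etc/hadoop/conf      # hadoop_conf
# hadoop_common_jar, hadoop_hdfs_jar and the rest of hadoop_classpath
# can usually be pulled in together via `hadoop classpath`:
export PIG_CLASSPATH="$HBASE_CONF_DIR:$HADOOP_CONF_DIR:$(hadoop classpath)"
```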
06-03-2016
06:54 AM
During installation you can choose to install HBase. If you ran into an issue like Java errors when starting Hive, the installation had problems. You can debug from a lower layer, for example checking whether HDFS commands can list directories. Or is it feasible for you to follow the link and re-install?
06-03-2016
06:27 AM
1 Kudo
Please follow this link to meet the requirements: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0-Win/bk_QuickStart_HDPWin/content/inst_HDPWin.html
05-23-2016
08:58 PM
Sqoop does not currently permit importing a relational table, all at once, into an HBase table with multiple column families. To work around this limitation, create the HBase table first and then run a separate Sqoop import for each column family.
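A sketch of that workaround as commands. The table, database, and column names here are hypothetical, and the commands assume a reachable database and HBase cluster, so treat this as a template rather than something to paste verbatim:

```shell
# 1) Pre-create the HBase table with both column families (names are examples):
echo "create 'customers', 'cf_profile', 'cf_orders'" | hbase shell

# 2) Run one Sqoop import per column family, selecting the matching columns
#    each time (connection string and columns are assumptions):
sqoop import --connect jdbc:mysql://dbhost/sales --table customers \
  --columns "id,name,email" \
  --hbase-table customers --column-family cf_profile --hbase-row-key id

sqoop import --connect jdbc:mysql://dbhost/sales --table customers \
  --columns "id,last_order,total_spend" \
  --hbase-table customers --column-family cf_orders --hbase-row-key id
```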
05-23-2016
08:38 PM
2 Kudos
Your insert statement did not contain the key. It should be:

insert into dchandra.trial_dest(key, pat_id) select src.key, src.pat_id from dchandra.trial_src src join dchandra.trial_dest dest on src.key=dest.key;
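The point generalizes beyond Hive: if the insert's column list omits the join key, the inserted rows carry no key value at all. A minimal illustration using Python's sqlite3 in place of Hive (table and column names follow the post, but this is a sketch, not the original environment):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Stand-ins for dchandra.trial_src / dchandra.trial_dest.
cur.execute("CREATE TABLE trial_src (key INTEGER, pat_id TEXT)")
cur.execute("CREATE TABLE trial_dest (key INTEGER, pat_id TEXT)")
cur.executemany("INSERT INTO trial_src VALUES (?, ?)", [(1, "A"), (2, "B")])
# An earlier insert that omitted pat_id leaves a row with key only:
cur.execute("INSERT INTO trial_dest (key) VALUES (1)")

# Including the key in the column list keeps the new rows aligned with it:
cur.execute("""
    INSERT INTO trial_dest (key, pat_id)
    SELECT src.key, src.pat_id
    FROM trial_src src JOIN trial_dest dest ON src.key = dest.key
""")
print(cur.execute("SELECT key, pat_id FROM trial_dest ORDER BY rowid").fetchall())
# → [(1, None), (1, 'A')]
```

Dropping `key` from the column list (as in the original statement) would instead insert rows whose key is NULL, which is why the corrected version selects and inserts it explicitly.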