Member since: 04-04-2016
Posts: 147
Kudos Received: 40
Solutions: 16
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1188 | 07-22-2016 12:37 AM
 | 4337 | 07-21-2016 11:48 PM
 | 1620 | 07-21-2016 11:28 PM
 | 2272 | 07-21-2016 09:53 PM
 | 3389 | 07-08-2016 07:56 PM
06-30-2016
08:03 PM
2 Kudos
Hi, I am looking for the answer to one of the RFP questions: the customer's core systems are Unicode-based and support multiple languages, even though English is their corporate language. Can you let me know which languages are supported? I am not really sure of the answer. Can you please help? Thanks, Sujitha
06-24-2016
12:41 AM
2 Kudos
How to make a MySQL database Hive's metastore instance:

Install MySQL if not available (on a Mac, via Homebrew):
brew update
brew doctor
brew upgrade
brew install mysql
mysql.server restart
mysql_secure_installation

Log in to MySQL:
mysql -u root -p
Enter password:

Happy MySQL learning….

MySQL is already installed on the Hortonworks sandbox. Steps: confirm with mysql -u root -p

Import an already available database into MySQL (Ref: https://dev.mysql.com/doc/employee/en/employees-installation.html):
shell> tar -xjf $HOME/Downloads/employees_db-full-1.0.6.tar.bz2
shell> cd employees_db/
shell> mysql -t < employees.sql

With this, installation of the employee db in MySQL is complete.

Configuration of the MySQL instance with Hive: create the MySQL metastore database for Hive:
[root@host]# mysqladmin -u root create hivedb
mysql> USE hivedb;
mysql> CREATE USER 'hive'@'localhost' IDENTIFIED BY 'hive';
mysql> GRANT ALL PRIVILEGES ON *.* TO 'hive'@'localhost';

With this, the MySQL database is set up to be Hive's new metastore.
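Note that creating the database and user alone does not switch Hive over; hive-site.xml must also point at it. A minimal sketch of the relevant properties, assuming the hivedb database and hive user created above and MySQL running locally (adjust host, port, and credentials to your setup):

<!-- hive-site.xml: metastore backed by the MySQL database created above -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <!-- assumes MySQL on localhost:3306; change to your metastore host -->
  <value>jdbc:mysql://localhost:3306/hivedb</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
</property>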
Suppose we want to perform a full import of the ‘employees’ and ‘salaries’ tables into HDP. Tables created in Hive (the empl database matches the --hive-table targets below):

Create database empl;
Use empl;
CREATE EXTERNAL TABLE IF NOT EXISTS employees ( emp_no INT, birth_date DATE, first_name VARCHAR(14), last_name VARCHAR(16), gender STRING, hire_date DATE ) STORED AS TEXTFILE;
CREATE TABLE IF NOT EXISTS salaries ( emp_no INT, salary INT, from_date DATE, to_date DATE ) STORED AS TEXTFILE;

sqoop import --connect jdbc:mysql://172.16.16.128:3306/employees --username=hive --password=hive --driver com.mysql.jdbc.Driver --table=employees --hive-import --hive-table=empl.employees --target-dir=wp_users_import --direct

sqoop import --connect jdbc:mysql://172.16.16.128:3306/employees --username=hive --password=hive --driver com.mysql.jdbc.Driver --table=salaries --hive-import --hive-table=empl.salaries --target-dir=wp_users_import --direct
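A quick way to sanity-check the imports (not part of the original steps; assumes the two commands above completed successfully) is to compare row counts on both sides:

hive -e "SELECT COUNT(*) FROM empl.employees;"
sqoop eval --connect jdbc:mysql://172.16.16.128:3306/employees --username hive --password hive --query "SELECT COUNT(*) FROM employees"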
Suppose we need to perform some cleansing of the data using Hive's regex expressions:

use empl;
drop table empl.empl_clean;
show tables;
create table empl.empl_clean(emp_no INT, birth_date STRING, first_name STRING, last_name STRING, gender STRING, hire_date STRING);
insert overwrite table empl.empl_clean
SELECT regexp_replace(employees.emp_no, '\t', '') emp_no,
regexp_replace(employees.birth_date, '\t', '') birth_date,
regexp_replace(employees.first_name, '\t', '') first_name,
regexp_replace(employees.last_name, '\t', '') last_name,
regexp_replace(employees.gender, '\t', '') gender,
regexp_replace(employees.hire_date, '\t', '') hire_date
from empl.employees;
select * from empl.empl_clean limit 100;

Cleansing the salaries table:

use empl;
drop table empl.salary_clean;
create table empl.salary_clean(emp_no INT, salary INT, from_date STRING, to_date STRING);
insert overwrite table empl.salary_clean
SELECT regexp_replace(salaries.emp_no, '\t', '') emp_no,
regexp_replace(salaries.salary, '\t', '') salary,
regexp_replace(salaries.from_date, '\t', '') from_date,
regexp_replace(salaries.to_date, '\t', '') to_date
from empl.salaries;
select * from empl.salary_clean limit 100;

Happy Learning….
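With both tables cleansed, an illustrative follow-up query (my own example against the tables created above) joins them to pull names alongside salary history:

SELECT e.emp_no, e.first_name, e.last_name, s.salary, s.from_date, s.to_date
FROM empl.empl_clean e
JOIN empl.salary_clean s ON e.emp_no = s.emp_no
LIMIT 100;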
06-23-2016
11:05 PM
Hi @Adel Quazani,
You can add the libraries in Zeppelin with import statements. For example:
import org.apache.spark.rdd._
import scala.collection.JavaConverters._
import au.com.bytecode.opencsv.CSVReader
Hope that answers your question. Thanks, Sujitha Sanku
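Note that the imports only resolve if the jar is already on the interpreter's classpath. If it is not, one way to pull it in (assuming Zeppelin's %dep interpreter is enabled; run this in its own paragraph before the first Spark paragraph) is:

%dep
z.load("net.sf.opencsv:opencsv:2.3")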
06-21-2016
08:59 PM
6 Kudos
SQOOP CONNECTIONS:

Sqoop is a tool designed to transfer data between Hadoop and relational databases. You can use Sqoop to import data from a relational database management system (RDBMS) such as MySQL or Oracle into the Hadoop Distributed File System (HDFS), transform the data in Hadoop MapReduce, and then export the data back into an RDBMS. Reference: Sqoop user guide: https://sqoop.apache.org/docs/1.4.0-incubating/SqoopUserGuide.html
JDBC ORACLE: Examples for Import:

sqoop-import --connect jdbc:oracle:thin:@db.test.com:PORT:INSTANCE_NAME --table DW_DATAMART.HCM_EMPLOYEE_D --fields-terminated-by '\t' --lines-terminated-by '\n' --username SSANKU -P
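Before importing, it can help to confirm connectivity and see what is available (a quick check following the same flag pattern as above, not one of the original examples):

sqoop-list-tables --connect jdbc:oracle:thin:@db.test.com:PORT:INSTANCE_NAME --username SSANKU -P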
JDBC ORACLE: Example for Select:

The eval tool allows users to quickly run simple SQL queries against a database; results are printed to the console. This allows users to preview their import queries and ensure they import the data they expect.

sqoop-eval --connect jdbc:oracle:thin:@db.test.com:PORT:INSTANCE_NAME --username SSANKU -P --query "SELECT * FROM DW_DATAMART.HCM_COMPANY_D"
JDBC INFORMIX: Examples for Import:

sqoop-import --connect jdbc:informix-sqli://4jane.soi.com:15062/common:INFORMIXSERVER=ids_4jane --driver com.informix.jdbc.IfxDriver --table portal_request_params --username username -P
Sqoop Import to HBASE table: Examples:

sqoop-import --connect jdbc:oracle:thin:@db.test.com:PORT:INSTANCE_NAME --username ssanku -P --table DW_DATAMART.PAY_PAY_CHK_OPTION_D --hbase-table DW_DATAMART.PAY_PAY_CHK_OPTION_D --column-family cf1 --hbase-create-table

If no primary key is defined on the Oracle table, supply --split-by (and, where needed, --hbase-row-key):

sqoop-import --connect jdbc:oracle:thin:@db.test.com:1725:hrlites --username ssanku -P --table PSMERCHANTID --hbase-table PSMERCHANTID --column-family cf --hbase-row-key MERCHANTID --hbase-create-table --split-by MERCHANTID

sqoop-import --connect jdbc:oracle:thin:@db.test.com:PORT:INSTANCE_NAME --username ssanku -P --table DW_DATAMART.PAY_PAYGROUP_D --hbase-table DW_DATAMART.PAY_PAYGROUP_D --column-family cf1 --hbase-create-table

sqoop-import --connect jdbc:oracle:thin:@db.test.com:1725:hrlites --username ssanku -P --table PSMERCHANTID --hbase-table PSMERCHANTID --column-family cf --hbase-create-table --split-by MERCHANTID
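To confirm an HBase import landed (a quick check, assuming the hbase shell is on the path; table name taken from the example above):

echo "scan 'PSMERCHANTID', {LIMIT => 5}" | hbase shell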
Sqoop Import to HIVE table from Mysql Database: Examples:

sqoop import --connect jdbc:mysql://172.16.16.128:3306/employees --username=hive --password=hive --driver com.mysql.jdbc.Driver --table=employees --hive-import --hive-table=empl.employees --target-dir=wp_users_import --direct

sqoop import --connect jdbc:mysql://172.16.16.128:3306/employees --username=hive --password=hive --driver com.mysql.jdbc.Driver --table=salaries --hive-import --hive-table=empl.salaries --target-dir=wp_users_import --direct
06-06-2016
05:02 PM
Hi Sunile, Thanks for the quick response. Yes, Oozie being a workflow scheduler, I am looking for somewhere these metrics and best-case documentation are specified. As you mentioned, I will look into the use cases in the Yahoo repo; I hadn't looked there. Many Thanks, Sujitha
06-06-2016
04:56 PM
Hi, I am looking for use cases for Oozie. When I look around I find information at https://oozie.apache.org/docs/4.2.0/, but no use cases there. Can you please point me to documentation with steps for using Oozie? Any help is highly appreciated. Many Thanks, Sujitha
Labels: Apache Oozie
06-06-2016
04:31 PM
Hi Geetha, I came across the same error. This issue comes up when the process sits in the ACCEPTED state and doesn't move to the RUNNING state. It can be due to an error creating the Hive tables within the process, or an issue with the HBase table. In my case it was a Hive table creation error. It can be resolved by killing the process stuck in ACCEPTED and then creating the tables manually. Hope this helps. Thanks, Sujitha
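If it helps, a job stuck in the ACCEPTED state can be killed from the command line (a generic sketch, assuming the process surfaces as a YARN application; take the application id from the -list output):

yarn application -list
yarn application -kill <application_id>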
05-04-2016
04:52 PM
Hi, I was able to resolve this issue. It was due to a missing Phoenix jar. After I placed it in HBase's lib directory, all the region servers came up. Thanks, Sujitha
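For anyone hitting the same thing, the fix looks roughly like this (paths and jar name are illustrative for an HDP-style layout; adjust the Phoenix version and HBase lib location to your install):

cp /usr/hdp/current/phoenix-client/phoenix-*-server.jar /usr/hdp/current/hbase-regionserver/lib/
# then restart the region servers (e.g. from Ambari)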
05-03-2016
09:57 PM
screen-shot-2016-05-03-at-23835-pm.png
screen-shot-2016-05-03-at-23855-pm.png
Hi, I am trying to restart my HBase, but none of my region servers are alive. When I click "restart the affected," I get the attached error. The same happens with Oozie and Mahout. Is it something similar to these?
https://issues.apache.org/jira/browse/AMBARI-12834
https://community.hortonworks.com/articles/28543/how-to-fix-kerberos-client-in-invalid-state-invali.html
Please advise. Thanks, Sujitha