Member since: 05-22-2018
Posts: 69
Kudos Received: 1
Solutions: 2
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3942 | 06-07-2018 05:33 AM |
| | 976 | 05-30-2018 06:30 AM |
05-29-2018
05:37 AM
Thank you @Shu. I didn't notice this minor mistake. Your suggestion works. Jay.
05-28-2018
01:52 PM
Hi All, I have one table in an MsSQL database. I want to import that table into Hive using the --target-dir parameter. I have selected the default database in MsSQL for Hive. This is what I tried:

sqoop import --connect jdbc:sqlserver://<HOST>:<PORT> --username XXXX --password XXXX --table <mssql_table> --hive-import --hive-table <hivedatabase.hivetable> --create-hive-table --target-dir '<PATH_WHERE_TO_STORE_IMPORTED_DATA>' --fields-terminated-by',' -m 1

P.S.: I have also tried with --warehouse-dir.

Regards, Jay.
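For reference, a minimal sketch of the same import with the delimiter option and its value separated by a space (`--fields-terminated-by ','`), since Sqoop parses them as separate command-line arguments; whether this is the minor mistake acknowledged later in the thread is an assumption, and all host, credential, and table values remain placeholders:

```bash
# Same import as above; the only change is the space before ',' (placeholders throughout)
sqoop import \
  --connect jdbc:sqlserver://<HOST>:<PORT> \
  --username XXXX --password XXXX \
  --table <mssql_table> \
  --hive-import \
  --hive-table <hivedatabase.hivetable> \
  --create-hive-table \
  --target-dir '<PATH_WHERE_TO_STORE_IMPORTED_DATA>' \
  --fields-terminated-by ',' \
  -m 1
```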
Labels:
- Apache Hadoop
- Apache Hive
- Apache Sqoop
05-25-2018
08:17 AM
Thank you @Geoffrey Shelton Okot, it helps. Actually, I was thinking it would create a user in CentOS only, but it has created a user in HDFS as well.
05-24-2018
12:53 PM
@Geoffrey Shelton Okot Is this for CentOS? I am asking because I am using the Hortonworks Sandbox on an Oracle virtual machine.
05-24-2018
11:40 AM
I have read many documents and tried many solutions. I created a directory for `<user_name>` with the following commands and gave ownership and permissions on the folder as the hdfs user.

[root@sandbox ~]# su hdfs
[hdfs@sandbox root]$ hdfs dfs -mkdir /user/tempuser
[hdfs@sandbox root]$ hdfs dfs -chown tempuser:hdfs /user/tempuser
[hdfs@sandbox root]$ hdfs dfs -chmod 700 /user/tempuser
[hdfs@sandbox root]$ su tempuser
su: user tempuser does not exist
[hdfs@sandbox root]$

But it is throwing the error "user <username> does not exist". Please help me to resolve it. Regards, Jay.
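A minimal sketch of the step that appears to be missing, assuming the goal is a local account named tempuser on the sandbox: `hdfs dfs -mkdir` only creates the HDFS home directory, so the Linux account still has to be created with `useradd` before `su tempuser` can work.

```bash
# Run as root on the sandbox: create the OS account first (tempuser is an assumed name)
useradd tempuser
passwd tempuser                     # optional: set a password for interactive logins

# Then create and hand over the HDFS home directory as the hdfs superuser
su - hdfs -c "hdfs dfs -mkdir -p /user/tempuser"
su - hdfs -c "hdfs dfs -chown tempuser:hdfs /user/tempuser"
su - hdfs -c "hdfs dfs -chmod 700 /user/tempuser"

# su now succeeds because the OS account exists
su - tempuser
```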
Labels:
- Apache Hadoop
05-22-2018
04:23 AM
Hi @Vinod369, I have done the same things you mentioned, but then I face the error below.

java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.SqoopMain not found

Please help me to resolve this error. Regards, Jay
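Not an answer from this thread, just a hedged sketch of how one might check that the Oozie share library containing SqoopMain is actually visible to the server; the Oozie URL below is the sandbox default and an assumption. The job.properties posted for this workflow already sets oozie.use.system.libpath=true, which is the switch that makes the sharelib reachable from the action.

```bash
# List the Sqoop entries in the Oozie share library (URL is the sandbox default, an assumption)
oozie admin -oozie http://sandbox.hortonworks.com:11000/oozie -shareliblist sqoop

# If jars were added or changed on HDFS, ask the server to re-scan the sharelib
oozie admin -oozie http://sandbox.hortonworks.com:11000/oozie -sharelibupdate
```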
05-22-2018
03:35 AM
My scenario is a little bit different. I want to import one table from MsSQL into a HIVE table, so I copied sqljdbc42.jar to the Sqoop lib directory in HDFS. But I am still facing the same errors. Please help me.
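A hedged sketch of an alternative place to put the driver: jars in the `lib/` directory next to `workflow.xml` are added to the action classpath automatically, so a per-workflow copy avoids depending on a global lib directory (paths are placeholders, not values from this thread).

```bash
# Place the SQL Server JDBC driver next to the workflow; Oozie automatically adds
# jars found in <app-path>/lib to the action classpath (paths are placeholders)
hdfs dfs -mkdir -p /user/<your_user>/<app_path>/lib
hdfs dfs -put -f sqljdbc42.jar /user/<your_user>/<app_path>/lib/
```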
05-21-2018
08:11 AM
I am getting an error while executing an Oozie workflow for Sqoop. I can execute list-databases to list all databases in MsSQL, and the Sqoop command also works well when I execute it outside the workflow. Now I am trying to import a table from the MsSQL database into Hive using Sqoop, but while trying to import the table I get the error below. Please help to resolve this.

Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

Sqoop command:

sqoop import --connect jdbc:sqlserver://<IP:PORT> --username <username> --password <password> --table <SQL_Tbale> --hive-import --hive-table <hivedb.hivetbl> -m 1

The following files are being used for the workflow.

job.properties

nameNode=hdfs://sandbox.hortonworks.com:8020
jobTracker=sandbox.hortonworks.com:8050
queueName=default
appPath=${nameNode}/<HDFS_path_where_workflow.xml_file>
oozie.use.system.libpath=true
oozie.libpath=${nameNode}/user/oozie/share/lib/lib_20161025075203/
oozie.wf.application.path=${appPath}

workflow.xml

<workflow-app name="SqoopImportAction" xmlns="uri:oozie:workflow:0.4">
<start to="sqoop-node"/>
<action name="sqoop-node">
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<job-xml>${appPath}/hive-site.xml</job-xml>
<command>import --connect jdbc:sqlserver://<IP:PORT> --username <username> --password <password> --table <SQL_Tbale> --hive-import --hive-table <hivedb.hivetbl> -m 1 </command>
<archive>${appPath}/sqljdbc42.jar</archive>
</sqoop>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>

P.S.: I have also tried the following:
- Copying sqljdbc42.jar into appPath.
- Copying sqljdbc42.jar into the Oozie lib path, i.e. ${nameNode}/user/oozie/share/lib/lib_20161025075203/sqoop.
- Copying hive-exec*.jar into the Oozie lib path.
- Running the workflow as the hdfs, oozie, and root users.

Please help me to resolve this. Thanks in advance.
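Not a fix from this thread, only a hedged sketch of how one might dig past the generic exit code [1] to the underlying stack trace; the Oozie URL is the sandbox default, and the job and application IDs are placeholders.

```bash
# Validate the workflow definition before (re)submitting it
oozie validate workflow.xml

# After a failed run, inspect the job to find the failing action and its external (YARN) id
oozie job -oozie http://sandbox.hortonworks.com:11000/oozie -info <oozie_job_id>

# Pull the launcher logs, which usually contain the real Sqoop/Hive stack trace
yarn logs -applicationId <yarn_application_id>
```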
Labels:
- Apache Oozie
- Apache Sqoop
04-24-2018
06:15 AM
I want to use `Oozie` to store a `PIG` result into a `HIVE` table. I want to load files from an HDFS directory, store the result at a specific location in HDFS using Oozie, and then process the PIG output in HIVE. For that, I have created HIVE and PIG scripts, but I also want to create an Oozie workflow to schedule the whole process. So, I have created the following files for the Oozie workflow.

`job.properties`

nameNode=hdfs://sandbox.hortonworks.com:8020
jobTracker=sandbox.hortonworks.com:8050
oozie.libpath=${nameNode}/<lib_jar_path>
oozie.wf.application.path=${nameNode}/<working_directory_path> //HDFS Directory
appPath=${nameNode}/<application_path> //HDFS Directory
queueName=default
oozie.use.system.libpath=true
oozie.wf.rerun.failnodes=true
outputDir=${appPath}/output

`workflow.xml`

<workflow-app name="OozieDemoPigHive" xmlns="uri:oozie:workflow:0.2">
<start to="fork-node"/>
<fork name="fork-node">
<path start="pig-job" />
<path start="hive-job" />
</fork>
<join name = "join-node" to ="sub-end"/>
<action name="pig-node">
<pig>
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<prepare>
<delete path="${outputDir}"/>
</prepare>
<script>demo.pig</script>
</pig>
<ok to="join-node"/>
<error to="kill"/>
</action>
<action name="hive-node">
<hive xmlns="uri:oozie:hive-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<job-xml>${appPath}/hive-site.xml</job-xml>
<configuration>
<property>
<name>oozie.hive.defaults</name>
<value>${appPath}/hive-site.xml</value>
</property>
<property>
<name>hadoop.proxyuser.oozie.hosts</name>
<value>*</value>
</property>
<property>
<name>hadoop.proxyuser.oozie.groups</name>
<value>*</value>
</property>
</configuration>
<script>hive.hql</script>
</hive>
<ok to="join-node"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>"Killed job due to error: ${wf:errorMessage(wf:lastErrorNode())}"</message>
</kill>
<end name="sub-end"/>
</workflow-app>

`Oozie job:`

sudo -u oozie oozie job -oozie http://127.0.0.1:11000/oozie -config job.properties -run

But when I execute the above job, I get this error:

`Error: E0708 : E0708: Invalid transition, node [fork-node] transition [pig-job]`

What am I doing wrong here? Can anyone assist me?
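A hedged sketch of how the fork could be rewired so that each <path start> points at an action name that actually exists in the posted workflow (the actions are named pig-node and hive-node, while the fork references pig-job and hive-job); this illustrates the E0708 "invalid transition" cause and is not necessarily the accepted fix.

```xml
<!-- Each <path start> must name a node defined in the workflow; the posted
     actions are pig-node and hive-node, so the fork should point at them. -->
<fork name="fork-node">
    <path start="pig-node"/>
    <path start="hive-node"/>
</fork>
```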
Labels:
- Apache Hive
- Apache Oozie
- Apache Pig
04-12-2018
12:03 PM
@Shu Oh! I see. Thank you so much. I didn't notice those minor mistakes.