Member since 04-20-2017 · 5 Posts · 2 Kudos Received · 0 Solutions
04-26-2017 02:01 AM
My bad, that was the only thing I didn't try, lol. BTW, this is the solution. Thanks!
04-20-2017 08:09 PM
I'm trying to use the --query option in Sqoop to import data from MSSQL. My concern is: how can we declare which schema to use with --query against MSSQL? My script:

sqoop import \
  --options-file sqoop/aw_mssql.cfg \
  --query "select BusinessEntityId, LoginID, cast(OrganizationNode as string) from Employee where \$CONDITIONS" \
  --hive-table employees \
  --hive-database mssql \
  -- --schema=HumanResources

This still produces the error "Invalid object name 'Employee'". I also tried --connect "jdbc:sqlserver://192.168.1.17;database=AdventureWorks;schema=HumanResources", but that failed as well.
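A minimal sketch of the usual fix, for reference: with a free-form --query, Sqoop passes the SQL to SQL Server verbatim, so the schema can be qualified directly in the FROM clause instead of relying on the connector's -- --schema argument (which applies to --table imports). Two assumptions in the sketch below: T-SQL has no string type, so the cast targets nvarchar instead, and BusinessEntityID is used as the split key (--split-by is required for --query imports unless you run with -m 1).

sqoop import \
  --options-file sqoop/aw_mssql.cfg \
  --query "SELECT BusinessEntityID, LoginID, CAST(OrganizationNode AS nvarchar(100)) AS OrganizationNode FROM HumanResources.Employee WHERE \$CONDITIONS" \
  --split-by BusinessEntityID \
  --hive-table employees \
  --hive-database mssql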
Labels:
- Apache Sqoop
04-20-2017 07:30 PM
@saranvisa Already tried using --query and casting the column as TIMESTAMP, but it was still converted to BIGINT in the Hive table. Here's my query:

sqoop import \
  --connect "jdbc:oracle:thin:@192.168.1.17:1521:XE" \
  --username "xxxxx" \
  --password-file /user/cloudera/credentials/ora.password \
  --as-parquetfile \
  --hive-import \
  --hive-overwrite \
  --target-dir /user/cloudera/oracle \
  --compression-codec snappy \
  --query "select EMPLOYEE_ID, START_DATE, END_DATE, JOB_ID, DEPARTMENT_ID, cast(TSTAMP as TIMESTAMP) from SQOOPDB.JOB_HISTORY where \$CONDITIONS" \
  --hive-table job_history9 \
  --hive-database oracle \
  -m 1
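A hedged sketch of one common workaround, reusing the command above: Sqoop's Parquet import physically stores dates and timestamps as epoch milliseconds (hence the BIGINT), so a cast inside the query cannot survive the Parquet write. Forcing the column to a Java String with --map-column-java keeps a readable value that can be converted later on the Hive side; whether that suits the pipeline depends on downstream consumers.

sqoop import \
  --connect "jdbc:oracle:thin:@192.168.1.17:1521:XE" \
  --username "xxxxx" \
  --password-file /user/cloudera/credentials/ora.password \
  --as-parquetfile \
  --hive-import \
  --hive-overwrite \
  --target-dir /user/cloudera/oracle \
  --compression-codec snappy \
  --map-column-java TSTAMP=String \
  --query "select EMPLOYEE_ID, START_DATE, END_DATE, JOB_ID, DEPARTMENT_ID, TSTAMP from SQOOPDB.JOB_HISTORY where \$CONDITIONS" \
  --hive-table job_history9 \
  --hive-database oracle \
  -m 1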
04-20-2017 01:52 AM
I'm trying to load data from Oracle to Hive as Parquet. Every time I load a table with a date/timestamp column into Hive, it automatically converts those columns to BIGINT. Is it possible to load timestamp/date formats into Hive using Sqoop as a Parquet file? I already tried creating the table first in Hive and then using Impala to LOAD DATA INPATH the Parquet file. That still failed with the error "file XX has an incompatible Parquet schema for column XX column: TIMESTAMP".
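For completeness, a minimal read-side sketch (table and column names here are hypothetical): since the Parquet file holds epoch milliseconds, a Hive view can expose the BIGINT as a timestamp without re-importing, assuming Hive's cast from a numeric treats the value as seconds since the epoch.

-- Assumes job_history.tstamp is epoch milliseconds, as written by Sqoop
CREATE VIEW job_history_v AS
SELECT employee_id,
       CAST(tstamp / 1000 AS TIMESTAMP) AS tstamp_ts
FROM job_history;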
Labels:
- Apache Hive
- Apache Impala
- Apache Sqoop