Member since
02-16-2016
176
Posts
197
Kudos Received
17
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 4126 | 11-18-2016 08:48 PM
 | 7498 | 08-23-2016 04:13 PM
 | 1973 | 03-26-2016 12:01 PM
 | 1823 | 03-15-2016 12:12 AM
 | 17514 | 03-14-2016 10:54 PM
03-03-2016
04:48 PM
1 Kudo
@mike pal What is the error you are receiving? For any text file separated by 'I', you can use the following properties while creating the Hive table:

STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'

and remove:

ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES ( "separatorChar" = " ", "quoteChar" = '"', "escapeChar" = "\\" )

That should do the trick.
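For context, a minimal sketch of what the full DDL might look like; the table name, columns, and location here are hypothetical, and the delimiter character is taken from the question as written:

```sql
-- Sketch only: table name, columns, and LOCATION are assumptions,
-- not from the original thread.
CREATE EXTERNAL TABLE my_table (
  id   INT,
  name STRING
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY 'I'
STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION '/user/hive/data/my_table';
```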
03-03-2016
01:28 AM
1 Kudo
@Swapnil Prabhu What do you mean by granting permission to /var/run/ambari-server? Did you grant a particular user execute permission? Please be more specific.
03-03-2016
01:22 AM
1 Kudo
@saichand varanasi For the next run, you need to set --last-value to the last value returned by the first run. See https://sqoop.apache.org/docs/1.4.0-incubating/SqoopUserGuide.html#id1764421, section 7.2.7, on how to do incremental imports. You can use a sqoop job or Oozie to automate this process.
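A minimal sketch of the saved-job approach; the connection string, table, and check column are placeholders, not from this thread. A saved sqoop job records the updated last-value in its metastore after each run, so you don't have to pass it by hand:

```shell
# Hypothetical connection details. The saved job tracks --last-value
# for you: after each run, sqoop stores the new high-water mark.
sqoop job --create daily_orders -- import \
  --connect "jdbc:mysql://dbhost/mydb" \
  --username admin -P \
  --table ORDERS \
  --incremental append \
  --check-column order_id \
  --last-value 0

# Each execution imports only rows with order_id above the stored value:
sqoop job --exec daily_orders
```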
03-03-2016
01:01 AM
3 Kudos
@Sunella Zag It looks like the alerts resource doesn't allow filtering of fields; only fields=* seems to work, and specifying any field list returns the default fields.

http://ambari_server:8080/api/v1/clusters/appslogs/alerts?Alert/state.in(WARNING,CRITICAL,UNKNOWN)&Alert/maintenance_state=OFF&fields=Alert/state,Alert/host,Alert/text

For the services resource, I was able to specify a list of fields and get only the required fields in my response:

http://ambari_server:8080/api/v1/clusters/hdp_dev/services/HDFS?fields=alerts_summary

So this seems to be a problem only with the alerts resource.
03-01-2016
11:43 AM
1 Kudo
Thank you @Benjamin Leonhardi. That was my thought process as well. I am planning to skip Oozie, run my sqoop scripts directly, capture the output, and grep for the keys. Thanks for validating it.
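A minimal sketch of the capture-and-grep step. The boxed result table below is a simulated stand-in for real sqoop eval output (the actual sqoop invocation and the output format are assumptions):

```shell
# Simulated output; the real command would be something like:
#   output=$(sqoop eval --connect "$JDBC_URL" --query "SELECT COUNT(*) FROM src")
output=$(printf '%s\n' \
  '------------' \
  '| COUNT(*) |' \
  '------------' \
  '| 12345    |' \
  '------------')

# Pull the first numeric value out of the boxed result table
total=$(printf '%s\n' "$output" | grep -oE '[0-9]+' | head -n 1)
echo "control_total=$total"
```

The extracted value can then be inserted into a control-totals table in a separate step.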
03-01-2016
04:04 AM
2 Kudos
I have an Oozie job that performs a sqoop import action. I want to add a sqoop eval node to this job that runs a sqoop eval query on the source database to get control totals and inserts them into another Hive table. Is there a way to capture the output of a sqoop eval node in Oozie?
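For reference, Oozie's sqoop action does not expose <capture-output/>, but the shell action does: a wrapper script can run sqoop eval and emit key=value lines that later actions read via wf:actionData(). A sketch, where the action name, script name, and transition targets are all hypothetical:

```xml
<!-- Sketch: wrap "sqoop eval" in a shell action so its output can be
     captured. run_eval.sh (hypothetical) must print key=value lines. -->
<action name="eval-control-totals">
  <shell xmlns="uri:oozie:shell-action:0.3">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <exec>run_eval.sh</exec>
    <file>${wf:appPath()}/run_eval.sh</file>
    <capture-output/>
  </shell>
  <ok to="next-step"/>
  <error to="fail"/>
</action>
```

A downstream action could then reference ${wf:actionData('eval-control-totals')['control_total']}.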
Labels:
- Apache Oozie
- Apache Sqoop
03-01-2016
03:46 AM
1 Kudo
@Kevin Vasko @jdere Can you check if hive.server2.enable.doAs is set to true? Also, check this JIRA: https://issues.apache.org/jira/browse/HIVE-5160 Which version of Hive are you using?
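For reference, this property lives in hive-site.xml; when enabled, HiveServer2 runs queries as the connecting end user rather than as the hive service user:

```xml
<!-- hive-site.xml -->
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>
```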
02-26-2016
01:28 AM
2 Kudos
The SAP HANA driver doesn't accept a schema name in the table parameter. You need to pass the schema name as part of the connection string instead:

sqoop import --connect "jdbc:sap://<server>:30015/?currentschema=ABC" --driver "com.sap.db.jdbc.Driver" --username admin --password **** --table TABLE1 -m 1
02-26-2016
01:24 AM
3 Kudos
Sqoop to fetch data from SAP HANA is giving the following error:

Error: java.io.IOException: SQLException in nextKeyValue
    at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)

Here is my sqoop statement:

sqoop import --connect "jdbc:sap://<server>:30015" --driver "com.sap.db.jdbc.Driver" --username admin --password **** --table ABC.TABLE1 -m 1
Labels:
- Apache Sqoop
02-25-2016
01:10 AM
1 Kudo
@Neeraj Sabharwal @Sunile Manjee Are you suggesting one default policy at the root level per repo with delegated admin rights, with individual users in the group managing additional policies? e.g., we could create one Hive policy with root privileges and assign it to the dba group with delegated admin rights? Then the DBA group could create any further Hive policies.