Member since: 05-16-2016
Posts: 270
Kudos Received: 18
Solutions: 4

My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1717 | 07-23-2016 11:36 AM |
 | 3057 | 07-23-2016 11:35 AM |
 | 1567 | 06-05-2016 10:41 AM |
 | 1162 | 06-05-2016 10:37 AM |
06-23-2016
06:17 PM
1 Kudo
Assuming you have Hive 0.14 or later:

ALTER TABLE MAGNETO.SALES_FLAT_ORDER SET SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde';
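If the CSV uses a non-default delimiter or quoting, the SerDe properties can be set in the same statement. A minimal sketch, using the standard OpenCSVSerde options separatorChar and quoteChar; the values here are illustrative, adjust them to the actual CSV layout:

```sql
-- Sketch: switch the table to OpenCSVSerde and configure it in one step.
ALTER TABLE MAGNETO.SALES_FLAT_ORDER
  SET SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
  WITH SERDEPROPERTIES ("separatorChar" = ",", "quoteChar" = "\"");
```

Note that OpenCSVSerde exposes every column as STRING, so numeric columns may need casts in queries.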
06-20-2016
06:14 AM
Could you please share the steps that resolved the issue and mark as best answer? Thanks, Sindhu
06-16-2016
04:17 PM
SELECT TO_DATE(created_at),
       DATEDIFF(TO_DATE(current_date()), TO_DATE(sales_flat_order.created_at)) AS delay,
       COUNT(*) AS NumberOfOrders
FROM magentodb.sales_flat_order
WHERE status IN ('packed', 'cod_confirmed')
GROUP BY TO_DATE(created_at),
         DATEDIFF(TO_DATE(current_date()), TO_DATE(sales_flat_order.created_at));
06-10-2016
05:36 AM
1 Kudo
@sameer lail Go to Ambari -> HDFS and search for dfs.datanode.data.dir, then go to your Linux server and navigate to the respective directories to see the data. Example:

[root@adcp DATA]# pwd
/DATA
[root@adcp DATA]# ls -lrt
drwxr-xr-x. 3 root root 4096 Dec 15 2014 sdd1
drwxr-xr-x. 3 root root 4096 Dec 15 2014 sdc1
drwxr-xr-x. 3 root root 4096 Dec 15 2014 sdf1
drwxr-xr-x. 3 root root 4096 Dec 15 2014 sde1
drwxr-xr-x. 3 root root 4096 Dec 15 2014 sdg1
06-06-2016
10:44 AM
3 Kudos
If your external table points to a location in HDFS and you put more CSV files into that location, with the same schema as the defined table, Hive will pick up the new data automatically. See the sketch below.
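A minimal sketch of that setup; the table name, columns, and path are hypothetical. The LOCATION clause is what lets Hive see any new file dropped into the directory:

```sql
-- Hypothetical external table over a directory of CSV files.
CREATE EXTERNAL TABLE sales_csv (
  order_id INT,
  status   STRING,
  total    DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/user/hive/external/sales_csv';

-- Any file copied into that directory afterwards, e.g. with
--   hadoop fs -put new_orders.csv /user/hive/external/sales_csv/
-- is visible to the next SELECT without any further DDL.
```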
06-09-2016
03:02 AM
4 Kudos
@sameer lail It is not stupid what you did. CSV is a file format, not a data structure in R. What you could do is create a data frame with a single column holding all the comma-separated values, then use an HDFS write to output that as a file with a .csv extension. Another option is to write a map-reduce job with R and the streaming API and set the output format to CSV. If any of my responses were helpful, please don't forget to vote for them.
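A minimal sketch of the first option, assuming the rhdfs package from RHadoop is installed and HADOOP_CMD points at the Hadoop client; all data, paths, and names below are hypothetical:

```r
# Sketch: build a data frame, write it as CSV locally,
# then copy the file into HDFS with rhdfs.
Sys.setenv("HADOOP_CMD" = "/usr/hdp/current/hadoop-client/bin/hadoop")
library(rhdfs)
hdfs.init()

df <- data.frame(id = 1:3, name = c("a", "b", "c"))  # hypothetical data

local_path <- "/tmp/df.csv"           # hypothetical local staging path
hdfs_path  <- "/user/sameer/df.csv"   # hypothetical HDFS destination
write.csv(df, local_path, row.names = FALSE)
hdfs.put(local_path, hdfs_path)
```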
06-06-2016
07:25 PM
You can try the following:

li <- read.table(textConnection(c), sep = ",")
06-05-2016
10:37 AM
Alright! It took a fair amount of time to come up with this fix. I have shared the problem with an elaborate solution here: http://www.yourtechchick.com/sqoop/sqoop-job-fails-to-record-password-after-running-for-the-first-time-problem-fix/
05-31-2016
10:29 AM
4 Kudos
@sameer lail Please set the HADOOP_CMD variable inside R before the hdfs call:

Sys.setenv("HADOOP_CMD" = "/usr/hdp/current/hadoop-client/bin/hadoop")
06-05-2016
10:41 AM
Alright. It was pretty tricky and unintuitive, but I have shared an elaborate solution here: http://www.yourtechchick.com/sqoop/sqoop-job-fails-to-record-password-after-running-for-the-first-time-problem-fix/