Created 11-08-2017 04:00 PM
I believe I've followed the steps from the tutorial exactly:
I get this error on the Insert-from-temp-to-actual step: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
I've tried the following variations, all with the same result: Hive View 1.0, a file with different line endings, and loading from my local machine.
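For reference, the "insert-from-temp-to-actual" step boils down to HiveQL along these lines. This is a minimal sketch only; the table and column names are illustrative (the tutorial's real trucks table has more columns), and the HDFS path assumes the maria_dev user:

    -- Stage the CSV in a temporary TEXTFILE table (what the Hive View
    -- upload does behind the scenes).
    CREATE TABLE trucks_temp (
      driverid STRING,
      truckid  STRING,
      model    STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE;

    -- Move the uploaded file from HDFS into the staging table.
    LOAD DATA INPATH '/user/maria_dev/trucks.csv' INTO TABLE trucks_temp;

    -- Copy the rows into the final ORC table. This INSERT is the step
    -- that launches the Tez job, so the "return code 1 from ...TezTask"
    -- error surfaces here.
    CREATE TABLE trucks (
      driverid STRING,
      truckid  STRING,
      model    STRING
    )
    STORED AS ORC;

    INSERT INTO TABLE trucks SELECT * FROM trucks_temp;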
Created 11-09-2017 05:38 PM
I've just downloaded a fresh copy of the HDP 2.6.1 sandbox and attempted to load the tutorial's trucks.csv into HDFS and into Hive using Hive View 2.0, following the steps in the tutorial you are using. With the maria_dev user, I was able to load the data into both HDFS and Hive without a problem.

I've seen errors around this twice in the past: once with VMware (can you try with VirtualBox?) and once with not quite enough RAM assigned to the VM. If you can confirm you have at least 8 GB assigned to the VM, and can paste the entire error line, that could help us find a resolution.
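If the VM itself checks out, a couple of standard Hive settings can help narrow a TezTask failure down. This is a diagnostic sketch only, run from the Hive View query tab before rerunning the failing INSERT; the container size value is illustrative:

    -- Rerun the failing INSERT on MapReduce instead of Tez to see whether
    -- the failure is Tez-specific (often a sign of a memory problem).
    SET hive.execution.engine=mr;

    -- Or stay on Tez but give its containers more memory; the value is in
    -- MB and must fit within YARN's maximum container size.
    SET hive.execution.engine=tez;
    SET hive.tez.container.size=1024;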
Created 12-06-2017 09:06 PM
VirtualBox worked for me. And maybe VMware would have worked as well, had I reset to the original machine image. Thanks.
Created 12-11-2017 03:47 AM
Where can I download trucks.csv and geolocation.csv?
Created 12-11-2017 03:55 AM
Please advise where I can download the geolocation and trucks CSV files.
Created 12-11-2017 04:12 AM
Hi @Siva,
There is a hotlink in the tutorial, right at the point you are asked to download:
1. Download the sample sensor data contained in a compressed (.zip) folder here:
2. Save the Geolocation.zip file to your computer, then extract the files. You should see a Geolocation folder that contains the following files:
geolocation.csv – This is the collected geolocation data from the trucks. It contains records showing truck location, date, time, type of event, speed, etc.
trucks.csv – This data was exported from a relational database; it shows information on truck models, driverid, truckid, and aggregated mileage.
The link goes here: