Member since: 05-15-2017
Posts: 86
Kudos Received: 12
Solutions: 4
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 7822 | 06-13-2017 12:53 AM |
| | 2027 | 06-03-2017 03:47 PM |
| | 1869 | 05-16-2017 08:00 PM |
| | 717 | 02-04-2016 02:50 AM |
06-14-2017
01:03 AM
Hi All, I am also facing the same issue. I am logged into Ambari as the admin user and am trying to create/move files to and from /user/root/satish/. Here are the details on the folder permissions:

[root@sandbox sat]# hadoop fs -ls /user/root/satish/
Found 3 items
drwxr-xr-x - root hdfs 0 2017-06-14 00:54 /user/root/satish/input
drwxr-xr-x - root hdfs 0 2017-06-14 00:55 /user/root/satish/output
drwxr-xr-x - root hdfs 0 2017-06-14 00:55 /user/root/satish/scripts

I also tried with a different path and got the same error. Please let me know if I am missing anything here.

Error: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
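If the MoveTask failure turns out to be permission-related, one thing worth checking (a hedged suggestion, not a confirmed fix) is whether the user Hive actually runs queries as can write to the target directory. The paths below come from the post; the chmod/chown values are illustrative and should be adjusted to your security policy:

```shell
# Check the effective permissions on the target directory itself
hadoop fs -ls -d /user/root/satish

# If Hive executes as a different user (e.g. 'hive'), it may need write
# access here; this opens the tree up fairly permissively:
hadoop fs -chmod -R 775 /user/root/satish
hadoop fs -chown -R root:hdfs /user/root/satish
```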
06-13-2017
12:55 AM
When we override the path, Hive won't create a <dbname>.db folder; it assumes that the specified path itself is the main folder for the mentioned database.
06-13-2017
12:53 AM
I found that this is how Hive works when I provide the path (overriding the default). It uses the specified path itself as the main DB folder, the way "warehouse" is the DB folder under the default /user/hive/warehouse/, and creates all the tables underneath it.
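A minimal sketch of the behaviour described above (database, table, and path names are made up for illustration):

```sql
-- Default location: Hive creates /user/hive/warehouse/mydb.db
CREATE DATABASE mydb;

-- Overridden location: Hive treats the given path itself as the DB folder,
-- so no mydb2.db directory is created under it
CREATE DATABASE mydb2 LOCATION '/user/hive/mydb2_data';

-- Tables land directly under the specified path, e.g. /user/hive/mydb2_data/t1
CREATE TABLE mydb2.t1 (id INT);
```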
06-11-2017
03:37 AM
@Gobi Subramani Here are the details on when to use MR vs Spark. Other useful links: http://www.infoworld.com/article/3014440/big-data/five-things-you-need-to-know-about-hadoop-v-apache-spark.html https://www.xplenty.com/blog/2014/11/apache-spark-vs-hadoop-mapreduce/ If this information is useful, please accept this answer.
06-10-2017
03:37 PM
@Girish Chaudhari Note that when you change the location of the table with the ALTER command, the old data files are not moved to the new location. On your issue: 1) Do you have any data files in the mentioned path? 2) Did you get any warnings/errors when you executed the ALTER command?
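As a sketch of the point above (table name and paths are hypothetical), the data has to be moved by hand after the ALTER, since Hive only repoints the metadata:

```shell
# Repoint the table at a new directory; existing files are NOT moved
hive -e "ALTER TABLE mytable SET LOCATION 'hdfs:///data/new_path';"

# Move the old data files across manually so the table still sees them
hadoop fs -mkdir -p /data/new_path
hadoop fs -mv /data/old_path/* /data/new_path/
```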
06-09-2017
03:00 PM
When I create a database without passing a location, Hive creates it in the default location as <dbname>.db, e.g.:

hive> CREATE DATABASE Override1;
OK
Time taken: 0.06 seconds

This creates an Override1.db folder in the default path:

[cloudera@quickstart ~]$ hadoop fs -ls /user/hive/warehouse
Found 1 items
drwxrwxrwx - cloudera supergroup 0 2017-06-09 14:46 /user/hive/warehouse/override1.db
[cloudera@quickstart ~]$

Why doesn't it create a <dbname>.db folder when you pass a location while creating a database (overriding the default path)?
06-09-2017
10:31 AM
Good to know this.
06-08-2017
01:31 AM
@pratik vagyani The best and easiest way is the upload option in the Ambari Files view.
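If Ambari is not at hand, the same upload can be done from the command line; the local file and HDFS directory below are placeholders:

```shell
# Create the target HDFS directory and copy the local file into it
hdfs dfs -mkdir -p /user/pratik/data
hdfs dfs -put localfile.csv /user/pratik/data/
```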
06-08-2017
12:00 AM
@Benjamin Hopp Looks like it's a bug, because CHAR is always fixed-length. In this case you have declared it as char(10); even though you assigned only 3 characters to it, it should not truncate the rest.
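A small reproduction sketch of the expected CHAR semantics (table name hypothetical): CHAR(10) should pad a 3-character value with trailing spaces out to the declared length rather than truncate anything.

```sql
CREATE TABLE char_demo (c CHAR(10));
INSERT INTO char_demo VALUES ('abc');

-- The declared length stays fixed at 10; trailing pad spaces are
-- ignored in comparisons but the column type remains CHAR(10)
SELECT c, length(c) FROM char_demo;
```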
06-07-2017
07:53 PM
@Pooja Chawda My question here is: Hive should create the <dbname>.db folder (as it does in the default path), so why is it not creating the <dbname>.db folder when I pass a location, overriding the default? There should be a Testing.db folder under the testing folder, right (in your case)? It's strange behaviour when you override the database's default path.
06-07-2017
07:37 PM
@John Cleveland This is what I did: changed the script to dump all the data.

truck_events = LOAD '/user/satu/test.csv' USING PigStorage(',')
AS (driverId:int, truckId:int, eventTime:chararray,
eventType:chararray, longitude:double, latitude:double,
eventKey:chararray, correlationId:long, driverName:chararray,
routeId:long,routeName:chararray,eventDate:chararray);
--DESCRIBE truck_events;
DUMP truck_events;
--truck_events_subset = LIMIT truck_events 100;
--DESCRIBE truck_events_subset;
--DUMP truck_events_subset;

The job completed in 62 seconds.
06-07-2017
03:41 PM
@ozac Can you try without renaming the directory from a1 to a2? Create a separate directory named a2 and then alter the table.
06-07-2017
03:59 AM
If the target table is a Hive table, then you can use --hive-overwrite.
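A hedged Sqoop import sketch showing where the flag fits; the connection string, credentials, and table names are placeholders:

```shell
# Import an RDBMS table into Hive, replacing existing table data
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username user -P \
  --table orders \
  --hive-import \
  --hive-table orders \
  --hive-overwrite
```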
06-07-2017
03:32 AM
@ozac When you alter the location of a Hive table, it will not move the data from the old location to the new one (if any data is present). It will only read data from the new location. The error shows that the required data file is not present; make sure the data file is also present in the new location.

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. cannot find dir = hdfs://sandbox.hortonworks.com:8020/tmp/a2/000000_0_copy_1 in pathToPartitionInfo: [hdfs:/tmp/a2]
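A sketch of the check suggested above. The /tmp/a2 path and file name come from the error message; the assumption that the file is still sitting under the old /tmp/a1 directory is mine:

```shell
# Verify the file the query expects actually exists in the new location
hadoop fs -ls /tmp/a2

# If the data is still in the old directory, move it across
hadoop fs -mv /tmp/a1/000000_0_copy_1 /tmp/a2/
```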
06-07-2017
03:08 AM
@Amey Shirke I don't think you can use the INSERT INTO command in a Pig script. Another way is to use Sqoop to load the data that the Pig script produced into RDBMS tables.
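A Sqoop export sketch for loading Pig output into an RDBMS table; the connection string, target table, export directory, and delimiter are all hypothetical:

```shell
# Push comma-delimited Pig output files from HDFS into an RDBMS table
sqoop export \
  --connect jdbc:mysql://dbhost/reports \
  --username user -P \
  --table pig_results \
  --export-dir /user/amey/pig_output \
  --input-fields-terminated-by ','
```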
06-07-2017
02:29 AM
@Stinger It won't give you the file size, but it will give you the path where those files are stored, so you can check that path for details like file size, permissions, etc.
06-07-2017
02:23 AM
@John Cleveland Can you execute these commands on grunt and let me know the results? It should load file from the mentioned path as part of your LOAD command.
06-07-2017
12:13 AM
@John Cleveland Your script looks correct, and I see the below error in your log file. Not sure why it's unable to set up the load function.

2017-06-06 19:14:44,999 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2081: Unable to setup the load function.

I ran the same script with the same dataset and it completed within a minute. PFA for the screenshots.
06-06-2017
07:44 PM
Access the path where these files are stored to find the size of the split files. You can do a DESCRIBE EXTENDED/FORMATTED tablename to find the exact path of the files.
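A sketch of the two steps above (table name and warehouse path are hypothetical):

```shell
# 1) Find where the table's files live
hive -e "DESCRIBE FORMATTED mytable;" | grep -i location

# 2) List the split files with human-readable sizes
hadoop fs -du -h /user/hive/warehouse/mytable
```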
06-04-2017
09:47 PM
@Geoffrey Shelton Okot The default path is /apps/hive/warehouse. But for this database I specified a different path at creation, and I am not able to find the .db folder in that path; I can only see the table folder when I create a table. The .db folder should be created under the same path, right?
06-04-2017
03:22 PM
@John Cleveland Could you please post the complete script? I don't see where the relation join_data is defined in your script. Also, post the log details.
06-04-2017
11:26 AM
Hi, could anyone please let me know in which version of Hive HiveServer2 was introduced?
Labels: Hive
06-04-2017
06:43 AM
Hi Hive Users, I have created a Hive database, overriding the default path, as follows:

CREATE DATABASE diff LOCATION '/user/cloudera/sat';

After creating this database, I accessed the path to see whether a .db folder was created, but I don't see anything with that name there. I even created a table in this database, and I only see a new folder named test. Not sure why it has not created a diff.db folder in this path. Please refer to the attached screenshots for more details, and let me know if I am missing anything here.
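One way to confirm which path Hive actually registered for the database (a verification sketch; the database name comes from the post above):

```sql
-- Prints the database's registered HDFS location in its output
DESCRIBE DATABASE EXTENDED diff;
```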
Labels: Hive
06-04-2017
12:13 AM
@Geoffrey Shelton Okot I tried to create a table in this DB; when I did, I saw a folder with the same name as the table under the /user/hive path, but no folder ending in .db. PFA for the screenshots.
06-03-2017
11:57 PM
@Geoffrey Shelton Okot I created this database overriding the default path (I used user/hive). I used both the hdfs dfs and hadoop fs commands to check for the .db folder. I don't see anything on the path that was specified at creation. Please see the attached screenshot.
06-03-2017
03:55 PM
I cross-checked the path from the command prompt with the hadoop command, and I don't see any .db folder there either. Please let me know if I am missing anything here.
06-03-2017
03:50 PM
I used this command to change my password.