Created on 09-20-2016 06:21 PM - edited 09-16-2022 03:40 AM
Hi,
Loading the data into Hive from HDFS or from the local filesystem succeeds, but every column shows NULL instead of the data. Kindly help.
1) While loading the data from HDFS:
0: jdbc:hive2://localhost:10000> create table transaction(sr int,id int,amount int,product string,city string,date string);
No rows affected (0.114 seconds)
0: jdbc:hive2://localhost:10000> show tables;
+---------------+--+
| tab_name |
+---------------+--+
| transaction |
| transaction1 |
+---------------+--+
2 rows selected (0.046 seconds)
0: jdbc:hive2://localhost:10000> load data inpath '/priyanka/txn' into table transaction;
No rows affected (0.568 seconds)
0: jdbc:hive2://localhost:10000> select * from transaction;
+-----------------+-----------------+---------------------+----------------------+-------------------+-------------------+--+
| transaction.sr | transaction.id | transaction.amount | transaction.product | transaction.city | transaction.date |
+-----------------+-----------------+---------------------+----------------------+-------------------+-------------------+--+
| NULL | NULL | NULL | NULL | NULL | NULL |
| NULL | NULL | NULL | NULL | NULL | NULL |
| NULL | NULL | NULL | NULL | NULL | NULL |
| NULL | NULL | NULL | NULL | NULL | NULL |
| NULL | NULL | NULL | NULL | NULL | NULL |
| NULL | NULL | NULL | NULL | NULL | NULL |
| NULL | NULL | NULL | NULL | NULL | NULL |
+-----------------+-----------------+---------------------+----------------------+-------------------+-------------------+--+
7 rows selected (0.154 seconds)
2) While loading the data from local:
0: jdbc:hive2://localhost:10000> LOAD DATA LOCAL INPATH 'home/cloudera/txn' INTO table transaction1;
No rows affected (2.394 seconds)
0: jdbc:hive2://localhost:10000> select * from transaction1;
+------------------+------------------+----------------------+-----------------------+--------------------+--------------------+--+
| transaction1.sr | transaction1.id | transaction1.amount | transaction1.product | transaction1.city | transaction1.date |
+------------------+------------------+----------------------+-----------------------+--------------------+--------------------+--+
| NULL | NULL | NULL | NULL | NULL | NULL |
| NULL | NULL | NULL | NULL | NULL | NULL |
| NULL | NULL | NULL | NULL | NULL | NULL |
| NULL | NULL | NULL | NULL | NULL | NULL |
| NULL | NULL | NULL | NULL | NULL | NULL |
| NULL | NULL | NULL | NULL | NULL | NULL |
| NULL | NULL | NULL | NULL | NULL | NULL |
+------------------+------------------+----------------------+-----------------------+--------------------+--------------------+--+
7 rows selected (1.279 seconds)
Created 09-21-2016 01:44 PM
I think the problem is that you haven't defined the ROW FORMAT for your Hive table. Hive needs to know how to separate rows in the input file (the default line terminator is '\n') and how to separate fields within each row (the default field terminator is the Ctrl-A character, '\001', not a comma). Because your file uses a different delimiter, Hive cannot parse the fields and returns NULL for every column. For example:
CREATE TABLE test(name STRING, value STRING) ROW FORMAT
DELIMITED FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
cat test.txt
name1,value1
name2,value2
name3,value3
Then load the file above:
LOAD DATA LOCAL INPATH '/tmp/test.txt' INTO TABLE test;
select * from test;
This should work.
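Applied to your transaction table, here is a minimal sketch, assuming /priyanka/txn is a comma-separated text file (adjust FIELDS TERMINATED BY to whatever delimiter your file really uses). Also note that your earlier LOAD DATA INPATH already moved /priyanka/txn into the Hive warehouse directory, so you may need to copy the file back to that HDFS path before reloading:

DROP TABLE IF EXISTS transaction;

-- `date` is backticked because it is a reserved word in newer Hive versions
CREATE TABLE transaction(
  sr INT,
  id INT,
  amount INT,
  product STRING,
  city STRING,
  `date` STRING)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
STORED AS TEXTFILE;

LOAD DATA INPATH '/priyanka/txn' INTO TABLE transaction;
SELECT * FROM transaction LIMIT 5;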
If you can provide the content of your file, I can give you a more specific answer.
Hope this helps
Created on 09-22-2016 02:08 PM - edited 09-22-2016 02:13 PM
Thanks. I want to know something: is it necessary to put the file in the /tmp location, or can it work from any other location on HDFS? Suppose I have a file at /hello/employee.txt. Can't I use that employee.txt file from this path to load the data?
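For reference, the /tmp/test.txt path in the example above is a local filesystem path (because of the LOCAL keyword); LOAD DATA INPATH without LOCAL accepts any HDFS path, including /hello/employee.txt. A minimal sketch, assuming a hypothetical employee table whose comma-separated columns match that file:

-- Hypothetical table definition; the column list is an assumption, adjust it to match employee.txt
CREATE TABLE IF NOT EXISTS employee(id INT, name STRING, city STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

-- No LOCAL keyword: the path is resolved on HDFS and the file is moved into the table's warehouse directory
LOAD DATA INPATH '/hello/employee.txt' INTO TABLE employee;
SELECT * FROM employee LIMIT 5;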
Created 04-19-2019 05:52 AM
We are facing another issue: only a partial number of rows was inserted. The source table has 7,070,150,676 rows. The insert query ran, we could see that the same number of rows was transferred from the source table, and the Beeline screenshot also shows numRows = 7,070,150,676, but only 500,912,361 rows were inserted into the target Hive table.
The user is using QueryGrid.
Please help us find the issue.