The most common way to do a bulk load into Hive is not via JDBC but by ingesting the files into HDFS and then using LOAD DATA or CREATE EXTERNAL TABLE plus INSERT ... SELECT (the first if your table has exactly the format of your data, the second if you want to apply transformations on the way in). The newest version of HDP also ships with Hive transactions enabled, which would allow you to insert large amounts of data through JDBC; however, they are better suited to streaming ingest and update situations and are still pretty new.
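If it helps, here is a minimal HiveQL sketch of both paths; the table names, columns, delimiter, and HDFS paths are all hypothetical placeholders:

```sql
-- Option 1: LOAD DATA, when the files on HDFS already match the table layout.
CREATE TABLE sales (id INT, amount DOUBLE, sold_on STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE;

-- Moves the files under /staging/sales into the table's warehouse directory.
LOAD DATA INPATH '/staging/sales' INTO TABLE sales;

-- Option 2: an external staging table plus INSERT ... SELECT,
-- when the data needs transforming on the way in.
CREATE EXTERNAL TABLE sales_raw (id STRING, amount STRING, sold_on STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '/staging/sales_raw';

INSERT INTO TABLE sales
SELECT CAST(id AS INT), CAST(amount AS DOUBLE), sold_on
FROM sales_raw;
```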
I want to get a row-wise log for the bulk insert, at least for failed insertions, e.g. which row failed, identified by some key value. Any idea how I can achieve that?
Yes. Whatever the reason an insertion fails, I want it captured in a log file.
Does anyone know a solution?
I created a file containing around 1000 CREATE TABLE statements and ran beeline -f filename. The logs are stored in hive.log; we can check the log file for the entries from "Parsing command - create table .." through "Updated the size of the table", along with the other messages in between.
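For example, something like the sketch below; the file name, JDBC URL, and log path are hypothetical, and the exact hive.log location depends on your log4j configuration:

```sql
-- tables.sql: one statement per table, roughly 1000 of them.
CREATE TABLE t0001 (id INT, name STRING);
CREATE TABLE t0002 (id INT, name STRING);
-- ...

-- Run the whole file non-interactively:
--   beeline -u 'jdbc:hive2://localhost:10000' -f tables.sql
--
-- Then scan hive.log for each statement's lifecycle, e.g.:
--   grep 'Parsing command' /var/log/hive/hive.log
--   grep 'Updated the size of the table' /var/log/hive/hive.log
```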