Member since: 05-16-2016
785 Posts
114 Kudos Received
39 Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
| 1810 | 06-12-2019 09:27 AM
| 3005 | 05-27-2019 08:29 AM
| 5027 | 05-27-2018 08:49 AM
| 4404 | 05-05-2018 10:47 PM
| 2747 | 05-05-2018 07:32 AM
11-04-2016
09:01 PM
We had this exception for a while and it went away by itself. As far as I know, this exception occurs when the NameNode's block locations are not fresh. Check whether you have an HDFS block skew condition. If you see this often, it is a real problem, because it clearly indicates that a block is missing; otherwise you can ignore it.
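If it helps, a quick way to check for missing or corrupt blocks is the standard fsck tool; a minimal sketch, assuming you can run it against the filesystem root:

# report overall HDFS health and list files with corrupt blocks
hdfs fsck / -list-corruptfileblocks

# more detail, including block locations per file
hdfs fsck / -files -blocks -locations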
11-04-2016
12:07 AM
Can you verify where your logs are pointing? Also verify this in your hive-site.xml and make sure the value is true:
<property>
  <name>hive.server2.logging.operation.enabled</name>
  <value>true</value>
</property>
The above should help you. Also, for further information, refer to this link: https://cwiki.apache.org/confluence/display/Hive/AdminManual+Configuration#AdminManualConfiguration-LogFiles
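To confirm how operation logging is set up on your install, a small sketch, assuming a typical config path of /etc/hive/conf/hive-site.xml (adjust to your layout); hive.server2.logging.operation.log.location is the related HiveServer2 property that controls where the logs are written:

# see whether the operation-logging properties are set in your config
grep -A1 "hive.server2.logging.operation" /etc/hive/conf/hive-site.xml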
11-02-2016
07:06 PM
1 Kudo
Sqoop export transfers data to the database using INSERT statements. As soon as the user fires the export command, Sqoop connects to the database to fetch the metadata about the table. The only prerequisite for the sqoop export command is that the table (the --table parameter) must exist prior to running Sqoop. Whether the table has a primary key or not is up to your design. The user has to make sure there are no constraint violations while performing the Sqoop export (i.e., the INSERTs).
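For illustration, a minimal export sketch; the connection string, credentials, table name sales, and HDFS path are all hypothetical placeholders:

sqoop export \
--connect jdbc:mysql://dbhost:3306/retail_db \
--username retail_dba \
--password cloudera \
--table sales \
--export-dir /user/hive/warehouse/sales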
11-02-2016
04:31 AM
2 Kudos
1. Type SHOW TABLES in Hive and note down the tables.
2. Check under /user/hive/warehouse/ using Hue -> File Browser or the command line whether the customers or categories folders are already populated. If so, remove them using Hue -> File Browser -> Delete or a DROP TABLE command from Hive (a cleanup sketch follows after the script), then re-run the script and please let me know.
Or simply change the last line of the script, adding --hive-overwrite:
sqoop import-all-tables \
-m 1 \
--connect jdbc:mysql://quickstart:3306/retail_db \
--username=retail_dba \
--password=cloudera \
--compression-codec=snappy \
--as-sequencefile \
--warehouse-dir=/user/hive/warehouse \
--hive-overwrite
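As a reference for step 2, a minimal cleanup sketch, assuming the default warehouse path and the categories table (substitute your own table names):

# remove the leftover warehouse folder
hdfs dfs -rm -r /user/hive/warehouse/categories

# or drop the table from Hive instead
hive -e "DROP TABLE IF EXISTS categories;"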
11-01-2016
09:35 PM
Could you replace --as-parquetfile with --as-sequencefile and let me know if you are able to get past the error?
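For illustration, the swap in a minimal import; the connection details, table name, and target directory are placeholders following the QuickStart VM convention, not taken from your setup:

# use --as-sequencefile in place of --as-parquetfile
sqoop import \
--connect jdbc:mysql://quickstart:3306/retail_db \
--username retail_dba \
--password cloudera \
--table customers \
--as-sequencefile \
--target-dir /user/cloudera/customers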
11-01-2016
07:43 PM
It's throwing a ClassCastException, meaning you are trying to cast java.lang.String to org.apache.avro.generic.IndexedRecord, which is not compatible. Could you provide the table schema and your sqoop import command?
11-01-2016
04:42 AM
Could you let us know the version you are using? Also, as far as I am concerned, only LEFT SEMI JOIN is supported in Hive.
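For reference, a minimal LEFT SEMI JOIN sketch, assuming hypothetical tables orders and customers (note that only the left table's columns can appear in the SELECT):

hive -e "SELECT o.* FROM orders o LEFT SEMI JOIN customers c ON (o.customer_id = c.id);"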
10-24-2016
10:39 PM
1 Kudo
You have to bucket the Hive table, but not sort it. Streaming to an unpartitioned table is currently not supported. In your case, please check the schema of your table m_tel_record.
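For reference, a sketch of a streaming-compatible DDL; the columns here are hypothetical, but the bucketing, ORC storage, and transactional property reflect the documented Hive streaming requirements:

hive -e "CREATE TABLE m_tel_record (id BIGINT, payload STRING)
PARTITIONED BY (dt STRING)
CLUSTERED BY (id) INTO 4 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional'='true');"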
10-20-2016
09:42 PM
Execute the commands below to get better insight:
SHOW LOCKS <TABLE_NAME>;
SHOW LOCKS <TABLE_NAME> EXTENDED;
SHOW LOCKS <TABLE_NAME> PARTITION (<PARTITION_DESC>);
SHOW LOCKS <TABLE_NAME> PARTITION (<PARTITION_DESC>) EXTENDED;
Does your Hive support concurrency? hive.support.concurrency defaults to false. Are you using HiveServer2?
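To check the current value quickly, a one-liner sketch (SET with a property name but no value prints the current setting):

hive -e "SET hive.support.concurrency;"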