Member since: 02-21-2019
Posts: 12
Kudos Received: 0
Solutions: 0
04-11-2021
07:53 PM
Try these 2 approaches:

1. Increase the Java Heap Size of HiveServer2:
   - Go to the Hive service.
   - Click the Configuration tab.
   - Search for "Java Heap Size of HiveServer2 in Bytes" and increase the value.
   - Click Save Changes.
   - Restart HiveServer2.
2. In Hive > Configs > Custom hive-site.xml, add the following property: hive.metastore.event.db.notification.api.auth=false
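For the second approach, the property from step 2 would look like this in hive-site.xml form (the XML wrapper is the standard hive-site format; only the property name and value come from the post):

```xml
<property>
  <name>hive.metastore.event.db.notification.api.auth</name>
  <value>false</value>
</property>
```

Restart the affected Hive services after the change so the new configuration takes effect.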
05-11-2019
06:09 PM
Hi, It looks like you are running Spark in cluster mode and your ApplicationMaster is running out of memory. In cluster mode the Driver runs inside the AM; I can see that you have a Driver of 110GB and executor memory of 12GB. Have you tried increasing both to see if that helps? I don't know by how much, but increase slowly and keep trying. That said, 110GB of driver memory seems like a lot; I am wondering what kind of dataset this Spark job is processing, and how large the data volume is. Cheers, Eric
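As a sketch, the driver and executor memory settings being discussed are the ones passed to spark-submit; the class name and jar below are placeholders, and the values are the ones mentioned in the post, not recommendations:

```shell
# Hypothetical spark-submit invocation showing where the two memory
# settings live; in cluster mode --driver-memory sizes the AM container.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 110g \
  --executor-memory 12g \
  --class com.example.MyJob \
  my-job.jar
```

The same values can also be set as spark.driver.memory and spark.executor.memory via --conf or spark-defaults.conf.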
03-07-2019
09:08 PM
It appears that you're trying to use Sqoop's internal handling of DATE/TIMESTAMP data types, instead of the Strings that the Oracle connector converts them to. Have you tried the option described at https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_java_sql_timestamp? -Doraoop.timestamp.string=false You shouldn't need to map the column types manually with this approach.
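A minimal sketch of where that property goes on a Sqoop command line; the connection string, credentials, table, and target directory are placeholders, and note that -D generic options must come immediately after the tool name. The oraoop.* properties apply when the direct Oracle connector is in use:

```shell
# Hypothetical sqoop import showing placement of the OraOop property
sqoop import \
  -Doraoop.timestamp.string=false \
  --direct \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott \
  --table MY_TABLE \
  --target-dir /data/my_table
```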
02-22-2019
04:37 PM
1 Kudo
Yes, try Oracle's TO_TIMESTAMP() function with an explicit format mask if needed.
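For illustration, a minimal Oracle query using TO_TIMESTAMP with a format mask; the literal value and mask here are made-up examples:

```sql
-- Hypothetical example: parse a string into an Oracle TIMESTAMP
SELECT TO_TIMESTAMP('2019-02-22 16:37:00', 'YYYY-MM-DD HH24:MI:SS')
FROM dual;
```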