Member since: 03-06-2016
Posts: 18
Kudos Received: 16
Solutions: 2

My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 2832 | 04-06-2016 03:50 PM |
|  | 973 | 03-07-2016 05:20 AM |
07-15-2016 07:51 PM
Below is the SHOW CREATE TABLE output:

```sql
CREATE TABLE `test`(
  `id` string)
ROW FORMAT SERDE
  'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
STORED AS INPUTFORMAT
  'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  'hdfs://path'
TBLPROPERTIES (
  'COLUMN_STATS_ACCURATE'='true',
  'numFiles'='1',
  'numRows'='1',
  'rawDataSize'='10',
  'totalSize'='11',
  'transient_lastDdlTime'='1468612262');
```
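Since this is a plain text table, one way to check the underlying file for hidden characters (a hedged diagnostic sketch; 'hdfs://path' is the placeholder location from the output above):

```bash
# Dump the first rows with non-printing characters made visible;
# stray carriage returns (\r) show up as ^M at the end of each value.
hdfs dfs -cat 'hdfs://path/*' | head -5 | cat -A
```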
07-15-2016 05:35 PM
I am trying to cast a string column to int (cast(id as int)) on an external text-format table, and it returns NULL values. Results below:

```sql
select id from test limit 5;
123023342
005957304
null
002191996
null

select cast(id as int) from test limit 5;
NULL
NULL
NULL
NULL
NULL
```
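A minimal diagnostic sketch (table and column names taken from the post): comparing LENGTH(id) against the visible digits usually exposes invisible characters, such as Windows \r line endings, that make CAST return NULL even when the raw SELECT looks clean:

```bash
# If LENGTH(id) exceeds the number of digits shown, the values carry hidden
# characters; casting after stripping non-digits confirms the diagnosis.
hive -e "SELECT id, LENGTH(id), CAST(id AS INT), CAST(regexp_replace(id, '[^0-9]', '') AS INT) FROM test LIMIT 5;"
```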
04-14-2016 04:05 PM
2 Kudos
Thank you all. I resolved the issue by switching /etc/hive/conf/hive-site.xml to the Spark client's copy, /usr/hdp/current/spark-client/conf/hive-site.xml, in --files (in yarn-cluster mode the driver reads its Hive configuration from the files shipped with the job, so it needs a Spark-compatible hive-site.xml):

--files /usr/hdp/current/spark-client/conf/hive-site.xml,/etc/tez/conf/tez-site.xml

Working command:

```bash
spark-submit --class Test.App --verbose --master yarn-cluster \
  --num-executors 2 --driver-memory 4g --executor-memory 4g \
  --executor-cores 2 --driver-cores 2 \
  --conf spark.yarn.jar=hdfs://hdfspath/oozie/spark-assembly-1.5.2.2.3.4.0-3485-hadoop2.7.1.2.3.4.0-3485.jar \
  --files /usr/hdp/current/spark-client/conf/hive-site.xml,/etc/tez/conf/tez-site.xml \
  --jars hdfs://hdfspath/oozie/datanucleus-api-jdo-3.2.6.jar,hdfs://hdfs//oozie/datanucleus-core-3.2.10.jar,hdfs://hdfspath/datanucleus-rdbms-3.2.9.jar,hdfs://hdfs/oozie/mysql-connector-java.jar,hdfs://hdfspath/share/lib/hive/tez-api-0.7.0.2.3.4.0-3485.jar,hdfs://hdfspath/share/lib/hive/tez-dag-0.7.0.2.3.4.0-3485.jar \
  hdfs://hdfspath/oozie/Test.jar 2016-04-11
```
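For reference, a quick way to see what actually differed between the two files (paths from this thread; a hedged sketch, since the exact deltas depend on the cluster):

```bash
# Compare the server-side Hive config against the Spark client's copy;
# the Spark client version is the one the driver needs in yarn-cluster mode.
diff /etc/hive/conf/hive-site.xml /usr/hdp/current/spark-client/conf/hive-site.xml
```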
04-13-2016 09:13 PM
Thanks for your response. I am trying with the parameters below and still getting the same OOM:

```bash
--files /etc/hive/conf/hive-site.xml --files /etc/tez/conf/tez-site.xml \
--driver-class-path hive-site.xml,tez-site.xml \
--conf spark.driver.extraJavaOptions="-XX:MaxPermSize=1120m" \
--driver-java-options "-Djavax.jdo.option.ConnectionURL=jdbc:mysql://testip/hive?createDatabaseIfNotExist=true -Dhive.metastore.uris=thrift://testip:9083"
```

Table details: external table, text file format, size 10 MB.
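A hedged side note on the flags above: with spark-submit, repeating --files typically replaces the earlier value rather than accumulating, so only tez-site.xml may actually be shipped. A single comma-separated list avoids that (class and jar names taken from this thread):

```bash
# One --files flag with a comma-separated list, so both configs reach the driver.
spark-submit --class Test.App --master yarn-cluster \
  --files /etc/hive/conf/hive-site.xml,/etc/tez/conf/tez-site.xml \
  hdfs://hdfspath/oozie/Test.jar
```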
04-13-2016 09:08 PM
Thanks for your response. Even after lowering all the memory settings to 1 GB, I still got the same OOM error. The issue happens only with HiveContext; it works fine with SparkContext and SQLContext.

Command:

```bash
--driver-memory 1g --executor-memory 1g --executor-cores 2 --driver-cores 2 \
--conf spark.yarn.driver.memoryOverhead=200 \
--driver-java-options "-XX:MaxPermSize=128m"
```
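That only HiveContext fails points at permanent generation (class metadata) rather than heap: initializing HiveContext loads the Hive metastore client, DataNucleus, and Thrift classes, so heap flags like --driver-memory don't help. A hedged sketch of raising PermGen on the driver (512m is an assumed value, not taken from this thread; the class and jar names are from it):

```bash
# Raise the driver's PermGen, which HiveContext class loading exhausts first.
spark-submit --class Test.App --master yarn-cluster \
  --conf spark.driver.extraJavaOptions="-XX:MaxPermSize=512m" \
  hdfs://hdfspath/oozie/Test.jar
```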
04-12-2016 10:44 PM
I am trying to submit Spark SQL Scala code in yarn-cluster mode and getting an OOM exception in the driver.

Command used:

```bash
spark-submit --class Test.App --verbose --master yarn-cluster \
  --num-executors 2 --driver-memory 5000m --executor-memory 5000m \
  --executor-cores 2 --driver-cores 2 \
  --conf spark.yarn.driver.memoryOverhead=1024 \
  --conf spark.driver.maxResultSize=5g \
  --driver-java-options "-XX:MaxPermSize=1000m" \
  --conf spark.yarn.jar=hdfs://hdfspath/oozie/spark-assembly-1.5.2.2.3.4.0-3485-hadoop2.7.1.2.3.4.0-3485.jar \
  --jars hdfs://hdfspath/oozie/datanucleus-api-jdo-3.2.6.jar,hdfs://hdfs//oozie/datanucleus-core-3.2.10.jar,hdfs://hdfspath/datanucleus-rdbms-3.2.9.jar,hdfs://hdfs/oozie/mysql-connector-java.jar,hdfs://hdfspath/share/lib/hive/tez-api-0.7.0.2.3.4.0-3485.jar,hdfs://hdfspath/share/lib/hive/tez-dag-0.7.0.2.3.4.0-3485.jar \
  --conf spark.driver.extraJavaOptions="-XX:MaxPermSize=1120m",hive.metastore.uris=thrift://testip:9083,hive.server2.thrift.http.port=10001,hive.server2.thrift.port=10000 \
  --driver-java-options "-Djavax.jdo.option.ConnectionURL=jdbc:mysql://testip/hive?createDatabaseIfNotExist=true -Dhive.metastore.uris=thrift://testip:9083 -Dhive.server2.thrift.port=10000 -Dhive.metastore.warehouse.dir=/apps/hive/warehouse" \
  --files hdfs://hdfspath/oozie/hive-tez-site.xml \
  --driver-class-path hive-tez-site.xml \
  hdfs://hdfspath/oozie/Test.jar 2016-04-11
```

Error details:

```
16/04/12 18:25:19 INFO hive.HiveContext: default warehouse location is /apps/hive/warehouse
16/04/12 18:25:19 INFO hive.HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
16/04/12 18:25:19 INFO client.ClientWrapper: Inspected Hadoop version: 2.2.0
16/04/12 18:25:19 INFO client.ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.2.0
16/04/12 18:25:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/04/12 18:25:20 INFO hive.metastore: Trying to connect to metastore with URI thrift://test.com:9083
16/04/12 18:25:20 INFO hive.metastore: Connected to metastore.
Exception in thread "Driver"
Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "Driver"
16/04/12 18:25:24 INFO spark.SparkContext: Invoking stop() from shutdown hook
16/04/12 18:25:24 INFO history.YarnHistoryService: Application end event: SparkListenerApplicationEnd(1460499924685)
```

The same code works fine with spark-submit in yarn-client mode. I get this exception only when using HiveContext. Thanks in advance.
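One hedged observation on the command above: --driver-java-options is passed twice, and with spark-submit a repeated flag typically replaces the earlier value, so -XX:MaxPermSize=1000m may never reach the driver JVM. Folding all driver JVM options into a single flag rules that out (values taken from the post):

```bash
# All driver JVM options in one --driver-java-options flag.
spark-submit --class Test.App --master yarn-cluster \
  --driver-java-options "-XX:MaxPermSize=1000m -Djavax.jdo.option.ConnectionURL=jdbc:mysql://testip/hive?createDatabaseIfNotExist=true -Dhive.metastore.uris=thrift://testip:9083" \
  hdfs://hdfspath/oozie/Test.jar
```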
04-06-2016 03:50 PM
It is working fine after removing the duplicate HDP Teradata connector jar from the Sqoop lib location.
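For anyone hitting the same issue, a quick check for duplicate connector jars (the lib path is an assumption for a typical HDP layout):

```bash
# More than one Teradata connector jar here means the classpath can load a stale copy.
ls -l /usr/hdp/current/sqoop-client/lib/ | grep -i teradata
```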
03-16-2016 01:45 AM
Find below the command used to export text data from HDFS to Teradata:

```bash
sqoop export -Dmapred.job.queue.name=xxxxx \
  --connect jdbc:teradata://xxxxxx/Database=xxxxxx \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --username xxxxxx --password xxxxxx \
  --table xxxxxx_teradata_table_name \
  --export-dir /hdfs_data_path/insert_dt=2016-03-01 \
  --input-null-string '\\N' --input-null-non-string '\\N'
```

I tried with --input-null-string '\\N' --input-null-non-string '\\N' and got the same exception. It looks like a version mismatch issue. Version details: HDP Teradata connector 1.4.1, HDP 2.3.4.
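As background, Hive writes NULL into text files as the two-character sequence \N, which is what --input-null-string '\\N' and --input-null-non-string '\\N' translate back to SQL NULL on export. A quick sanity check that the files really use that marker (export dir taken from the command above):

```bash
# Inspect a few raw rows of the export directory for the \N null marker.
hdfs dfs -cat '/hdfs_data_path/insert_dt=2016-03-01/*' | head -5
```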
03-16-2016 01:35 AM
Hi Neeraj, is there a newer HDP-Teradata connector for HDP 2.3.4? https://community.hortonworks.com/questions/22957/sqoop-export-with-hdp-teradata-connector-error.html
03-14-2016 09:25 PM
2 Kudos
Trying to export data from HDFS to Teradata and got the error below:

```
2016-03-14 17:18:09,688 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster for application appattempt_1454014691973_216609_000001
2016-03-14 17:18:10,010 FATAL [main] org.apache.hadoop.conf.Configuration: error parsing conf job.xml
org.xml.sax.SAXParseException; systemId: file://temp/yarn/local/usercache/*****/appcache/application_1454014691973_216609/container_e14_1454014691973_216609_01_000001/job.xml; lineNumber: 90; columnNumber: 62; Character reference "" is an invalid XML character.
    at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
    at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
    at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2480)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2549)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2502)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2405)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1232)
    at org.apache.hadoop.mapreduce.v2.util.MRWebAppUtil.initialize(MRWebAppUtil.java:51)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1482)
2016-03-14 17:18:10,012 FATAL [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Error starting MRAppMaster
java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: file:///temp/yarn/local/usercache/****/appcache/application_1454014691973_216609/container_e14_1454014691973_216609_01_000001/job.xml; lineNumber: 90; columnNumber: 62; Character reference "" is an invalid XML character.
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2645)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2502)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2405)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1232)
    at org.apache.hadoop.mapreduce.v2.util.MRWebAppUtil.initialize(MRWebAppUtil.java:51)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1482)
Caused by: org.xml.sax.SAXParseException; systemId: file:///apps/opt/data07/hadoop/yarn/local/usercache/rajenne/appcache/application_1454014691973_216609/container_e14_1454014691973_216609_01_000001/job.xml; lineNumber: 90; columnNumber: 62; Character reference "" is an invalid XML character.
    at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
    at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
    at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2480)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2549)
    ... 5 more
2016-03-14 17:18:10,014 INFO [main] org.apache.hadoop.util.ExitUtil: Exiting with status 1
```
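A hedged diagnostic sketch: the parser points at line 90, column 62 of the staged job.xml, so printing that line (from the container directory named in the trace while it still exists, or from a saved copy of the generated job.xml) shows which property carries the invalid character reference:

```bash
# Show line 90 of the offending job.xml with control characters made visible.
sed -n '90p' job.xml | cat -v
```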
Labels:
- Apache Hadoop
- Apache Sqoop