
Sqoop import is failing with ERROR com.jolbox.bonecp.BoneCP - Unable to start/stop JMX


Below is my Oozie launcher log from Hue; I hope someone can help me with this. I have configured a MySQL database for the Hive metastore, and the same import works when I run it from the command line (CLI).
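For reference, this is the equivalent command that runs successfully for me from the CLI (same arguments that Oozie reports in the log below; password masked):

```shell
# Equivalent Sqoop CLI invocation, assembled from the argument list in the Oozie log.
# The password is a placeholder; substitute the real credential (or use --password-file).
sqoop import \
  --connect jdbc:oracle:thin:@10.35.3.43:1522:EE \
  --username admin \
  --password '********' \
  --target-dir /user/hdfs/Example34ff \
  --num-mappers 1 \
  --table BACKOFFICE \
  --hive-import \
  --hive-table bbackofffice23
```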

--------------------

Setting up log4j2
log4j2 configuration file created at /yarn/nm/usercache/hdfs/appcache/application_1550256416047_0009/container_1550256416047_0009_01_000001/sqoop-log4j2.xml
Sqoop command arguments :
             import
             --connect
             jdbc:oracle:thin:@10.35.3.43:1522:EE
             --username
             admin
             --password
             ********
             --target-dir
             /user/hdfs/Example34ff
             --num-mappers
             1
             --table
             BACKOFFICE
             --hive-import
             --hive-table
             bbackofffice23
Fetching child yarn jobs
tag id : oozie-e1e38d764777f51e0f7a66c39de66671
No child applications found
=================================================================

>>> Invoking Sqoop command line now >>>

14:46:16.777 [main] INFO  org.apache.hadoop.conf.Configuration.deprecation - mapred.jar is deprecated. Instead, use mapreduce.job.jar
14:46:16.936 [main] INFO  org.apache.hadoop.conf.Configuration.deprecation - mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14:46:17.337 [main] INFO  org.apache.hadoop.mapreduce.JobResourceUploader - Disabling Erasure Coding for path: /user/hdfs/.staging/job_1550256416047_0010
14:46:21.104 [main] INFO  org.apache.hadoop.mapreduce.JobSubmitter - number of splits:1
14:46:21.225 [main] INFO  org.apache.hadoop.mapreduce.JobSubmitter - Submitting tokens for job: job_1550256416047_0010
14:46:21.228 [main] INFO  org.apache.hadoop.mapreduce.JobSubmitter - Executing with tokens: [Kind: YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId { application_id { id: 9 cluster_timestamp: 1550256416047 } attemptId: 1 } keyId: 414400168)]
14:49:30.871 [main] INFO  org.apache.hadoop.yarn.client.api.impl.YarnClientImpl - Submitted application application_1550256416047_0010
14:49:31.023 [main] INFO  org.apache.hadoop.mapreduce.Job - The url to track the job: http://clouutilityhost2.com:8088/proxy/application_1550256416047_0010/
14:49:31.024 [main] INFO  org.apache.hadoop.mapreduce.Job - Running job: job_1550256416047_0010
14:50:08.134 [main] INFO  org.apache.hadoop.mapreduce.Job - Job job_1550256416047_0010 running in uber mode : false
14:50:08.140 [main] INFO  org.apache.hadoop.mapreduce.Job -  map 0% reduce 0%
14:51:15.921 [main] INFO  org.apache.hadoop.mapreduce.Job -  map 100% reduce 0%
14:51:16.956 [main] INFO  org.apache.hadoop.mapreduce.Job - Job job_1550256416047_0010 completed successfully
14:51:17.206 [main] INFO  org.apache.hadoop.mapreduce.Job - Counters: 32
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=446794
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=85
		HDFS: Number of bytes written=181
		HDFS: Number of read operations=6
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
	Job Counters 
		Launched map tasks=1
		Other local map tasks=1
		Total time spent by all maps in occupied slots (ms)=6599
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=6599
		Total vcore-milliseconds taken by all map tasks=6599
		Total megabyte-milliseconds taken by all map tasks=6757376
	Map-Reduce Framework
		Map input records=4
		Map output records=4
		Input split bytes=85
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=190
		CPU time spent (ms)=6220
		Physical memory (bytes) snapshot=339554304
		Virtual memory (bytes) snapshot=2601136128
		Total committed heap usage (bytes)=314572800
		Peak Map Physical memory (bytes)=339554304
		Peak Map Virtual memory (bytes)=2601136128
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=181
14:51:17.386 [main] INFO  org.apache.hadoop.hive.conf.HiveConf - Found configuration file null
2019-02-15 19:51:17,696 main WARN locateContext called with URI jar:file:/yarn/nm/filecache/2026/hive-common.jar!/hive-log4j2.properties. Existing LoggerContext has URI file:/yarn/nm/usercache/hdfs/appcache/application_1550256416047_0009/container_1550256416047_0009_01_000001/sqoop-log4j2.xml
14:51:17.697 [main] WARN  org.apache.hadoop.hive.common.LogUtils - hive-site.xml not found on CLASSPATH
14:51:17.892 [main] INFO  SessionState - 
Logging initialized using configuration in jar:file:/yarn/nm/filecache/2026/hive-common.jar!/hive-log4j2.properties Async: false
14:51:18.081 [main] INFO  org.apache.hadoop.hive.ql.session.SessionState - Created HDFS directory: /tmp/hive/hdfs/3818e77f-cb96-45a7-8485-34ba9c472179
14:51:18.096 [main] INFO  org.apache.hadoop.hive.ql.session.SessionState - Created local directory: /tmp/yarn/3818e77f-cb96-45a7-8485-34ba9c472179
14:51:18.115 [main] INFO  org.apache.hadoop.hive.ql.session.SessionState - Created HDFS directory: /tmp/hive/hdfs/3818e77f-cb96-45a7-8485-34ba9c472179/_tmp_space.db
14:51:18.117 [main] INFO  org.apache.hadoop.hive.conf.HiveConf - Using the default value passed in for log id: 3818e77f-cb96-45a7-8485-34ba9c472179
14:51:18.118 [main] INFO  org.apache.hadoop.hive.ql.session.SessionState - Updating thread name to 3818e77f-cb96-45a7-8485-34ba9c472179 main
14:51:18.119 [3818e77f-cb96-45a7-8485-34ba9c472179 main] INFO  org.apache.hadoop.hive.conf.HiveConf - Using the default value passed in for log id: 3818e77f-cb96-45a7-8485-34ba9c472179
14:51:18.246 [3818e77f-cb96-45a7-8485-34ba9c472179 main] INFO  org.apache.hadoop.hive.ql.Driver - Compiling command(queryId=yarn_20190215195118_7cbf96d5-bf90-4017-a107-e0e86bf8a9b0): CREATE TABLE IF NOT EXISTS `bbackofffice23` ( `ID_BACKOFFICE` DOUBLE, `NOMBRE` STRING, `DESCRIPCION` STRING, `ESTADO` STRING) COMMENT 'Imported by sqoop on 2019/02/15 19:51:17' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
14:51:21.038 [3818e77f-cb96-45a7-8485-34ba9c472179 main] INFO  org.apache.hadoop.hive.metastore.HiveMetaStore - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
14:51:21.084 [3818e77f-cb96-45a7-8485-34ba9c472179 main] INFO  org.apache.hadoop.hive.metastore.ObjectStore - ObjectStore, initialize called
14:51:31.536 [3818e77f-cb96-45a7-8485-34ba9c472179 main] ERROR com.jolbox.bonecp.BoneCP - Unable to start/stop JMX
java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
	at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472) ~[?:1.8.0_141]
	at java.lang.SecurityManager.checkPermission(SecurityManager.java:585) ~[?:1.8.0_141]
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanTrustPermission(DefaultMBeanServerInterceptor.java:1848) ~[?:1.8.0_141]
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:322) ~[?:1.8.0_141]
	at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522) ~[?:1.8.0_141]
	at com.jolbox.bonecp.BoneCP.registerUnregisterJMX(BoneCP.java:528) [bonecp-0.8.0.RELEASE.jar:?]
	at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:500) [bonecp-0.8.0.RELEASE.jar:?]
	at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120) [bonecp-0.8.0.RELEASE.jar:?]
	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483) [datanucleus-rdbms-4.1.7.jar:?]
	at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296) [datanucleus-rdbms-4.1.7.jar:?]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_141]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [?:1.8.0_141]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [?:1.8.0_141]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [?:1.8.0_141]
	at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606) [datanucleus-core-4.1.6.jar:?]
	at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301) [datanucleus-core-4.1.6.jar:?]
	at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133) [datanucleus-core-4.1.6.jar:?]
	at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:420) [datanucleus-core-4.1.6.jar:?]
	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:821) [datanucleus-api-jdo-4.2.1.jar:?]
	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:338) [datanucleus-api-jdo-4.2.1.jar:?]
	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:217) [datanucleus-api-jdo-4.2.1.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_141]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_141]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_141]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_141]
	at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965) [jdo-api-3.0.1.jar:3.0.1]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_141]
	at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960) [jdo-api-3.0.1.jar:3.0.1]
	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166) [jdo-api-3.0.1.jar:3.0.1]
	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808) [jdo-api-3.0.1.jar:3.0.1]
	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701) [jdo-api-3.0.1.jar:3.0.1]
	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:517) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:546) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:401) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:338) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:299) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:612) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:578) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:572) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:639) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6869) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:248) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_141]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [?:1.8.0_141]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [?:1.8.0_141]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [?:1.8.0_141]
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1700) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3581) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3633) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3613) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3867) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:247) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:230) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:387) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:331) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:311) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:287) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.createHiveDB(BaseSemanticAnalyzer.java:228) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.<init>(BaseSemanticAnalyzer.java:207) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.<init>(SemanticAnalyzer.java:368) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.<init>(CalcitePlanner.java:233) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzerFactory.get(SemanticAnalyzerFactory.java:304) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:537) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1359) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1488) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1278) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1268) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:409) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:342) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:489) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:505) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:808) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:774) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:701) [hive-cli.jar:2.1.1-cdh6.0.1]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_141]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_141]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_141]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_141]
	at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:341) [sqoop.jar:?]
	at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:246) [sqoop.jar:?]
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:543) [sqoop.jar:?]
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:634) [sqoop.jar:?]
	at org.apache.sqoop.Sqoop.run(Sqoop.java:145) [sqoop.jar:?]
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181) [sqoop.jar:?]
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:232) [sqoop.jar:?]
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:241) [sqoop.jar:?]
	at org.apache.sqoop.Sqoop.main(Sqoop.java:250) [sqoop.jar:?]
	at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:214) [oozie-sharelib-sqoop-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:199) [oozie-sharelib-sqoop-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:101) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:51) [oozie-sharelib-sqoop-5.0.0-cdh6.0.1.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_141]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_141]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_141]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_141]
	at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:410) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:55) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:223) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_141]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_141]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1726) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_141]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_141]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1726) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
14:51:32.458 [3818e77f-cb96-45a7-8485-34ba9c472179 main] INFO  org.apache.hadoop.hive.metastore.ObjectStore - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
14:51:32.555 [3818e77f-cb96-45a7-8485-34ba9c472179 main] ERROR com.jolbox.bonecp.BoneCP - Unable to start/stop JMX
java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
	at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472) ~[?:1.8.0_141]
	at java.lang.SecurityManager.checkPermission(SecurityManager.java:585) ~[?:1.8.0_141]
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanTrustPermission(DefaultMBeanServerInterceptor.java:1848) ~[?:1.8.0_141]
	at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:322) ~[?:1.8.0_141]
	at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522) ~[?:1.8.0_141]
	at com.jolbox.bonecp.BoneCP.registerUnregisterJMX(BoneCP.java:528) [bonecp-0.8.0.RELEASE.jar:?]
	at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:500) [bonecp-0.8.0.RELEASE.jar:?]
	at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120) [bonecp-0.8.0.RELEASE.jar:?]
	at org.datanucleus.store.rdbms.ConnectionProviderPriorityList.getConnection(ConnectionProviderPriorityList.java:57) [datanucleus-rdbms-4.1.7.jar:?]
	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:402) [datanucleus-rdbms-4.1.7.jar:?]
	at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getXAResource(ConnectionFactoryImpl.java:361) [datanucleus-rdbms-4.1.7.jar:?]
	at org.datanucleus.store.connection.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:316) [datanucleus-core-4.1.6.jar:?]
	at org.datanucleus.store.connection.AbstractConnectionFactory.getConnection(AbstractConnectionFactory.java:84) [datanucleus-core-4.1.6.jar:?]
	at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:347) [datanucleus-core-4.1.6.jar:?]
	at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:310) [datanucleus-core-4.1.6.jar:?]
	at org.datanucleus.store.rdbms.query.SQLQuery.performExecute(SQLQuery.java:628) [datanucleus-rdbms-4.1.7.jar:?]
	at org.datanucleus.store.query.Query.executeQuery(Query.java:1844) [datanucleus-core-4.1.6.jar:?]
	at org.datanucleus.store.rdbms.query.SQLQuery.executeWithArray(SQLQuery.java:807) [datanucleus-rdbms-4.1.7.jar:?]
	at org.datanucleus.store.query.Query.execute(Query.java:1715) [datanucleus-core-4.1.6.jar:?]
	at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:371) [datanucleus-api-jdo-4.2.1.jar:?]
	at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:213) [datanucleus-api-jdo-4.2.1.jar:?]
	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.runTestQuery(MetaStoreDirectSql.java:243) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:146) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:406) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:338) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:299) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:612) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:578) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:572) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:639) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6869) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:248) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_141]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [?:1.8.0_141]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [?:1.8.0_141]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [?:1.8.0_141]
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1700) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3581) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3633) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3613) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3867) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:247) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:230) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:387) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:331) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:311) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:287) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.createHiveDB(BaseSemanticAnalyzer.java:228) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.<init>(BaseSemanticAnalyzer.java:207) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.<init>(SemanticAnalyzer.java:368) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.<init>(CalcitePlanner.java:233) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzerFactory.get(SemanticAnalyzerFactory.java:304) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:537) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1359) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1488) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1278) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1268) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:409) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:342) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:489) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:505) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:808) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:774) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:701) [hive-cli.jar:2.1.1-cdh6.0.1]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_141]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_141]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_141]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_141]
	at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:341) [sqoop.jar:?]
	at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:246) [sqoop.jar:?]
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:543) [sqoop.jar:?]
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:634) [sqoop.jar:?]
	at org.apache.sqoop.Sqoop.run(Sqoop.java:145) [sqoop.jar:?]
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181) [sqoop.jar:?]
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:232) [sqoop.jar:?]
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:241) [sqoop.jar:?]
	at org.apache.sqoop.Sqoop.main(Sqoop.java:250) [sqoop.jar:?]
	at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:214) [oozie-sharelib-sqoop-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:199) [oozie-sharelib-sqoop-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:101) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:51) [oozie-sharelib-sqoop-5.0.0-cdh6.0.1.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_141]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_141]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_141]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_141]
	at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:410) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:55) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:223) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_141]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_141]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1726) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_141]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_141]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1726) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
14:51:32.621 [3818e77f-cb96-45a7-8485-34ba9c472179 main] WARN  org.apache.hadoop.hive.metastore.MetaStoreDirectSql - Self-test query [select "DB_ID" from "DBS"] failed; direct SQL is disabled
javax.jdo.JDODataStoreException: Error executing SQL query "select "DB_ID" from "DBS"".
	at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:543) ~[datanucleus-api-jdo-4.2.1.jar:?]
	at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:388) ~[datanucleus-api-jdo-4.2.1.jar:?]
	at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:213) ~[datanucleus-api-jdo-4.2.1.jar:?]
	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.runTestQuery(MetaStoreDirectSql.java:243) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:146) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:406) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:338) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:299) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:612) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:578) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:572) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:639) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6869) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:248) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_141]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) [?:1.8.0_141]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [?:1.8.0_141]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) [?:1.8.0_141]
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1700) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101) [hive-metastore.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3581) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3633) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3613) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3867) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:247) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:230) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:387) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:331) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:311) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:287) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.createHiveDB(BaseSemanticAnalyzer.java:228) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.<init>(BaseSemanticAnalyzer.java:207) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.<init>(SemanticAnalyzer.java:368) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.<init>(CalcitePlanner.java:233) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzerFactory.get(SemanticAnalyzerFactory.java:304) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:537) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1359) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1488) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1278) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1268) [hive-exec-core.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:409) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:342) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:489) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:505) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:808) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:774) [hive-cli.jar:2.1.1-cdh6.0.1]
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:701) [hive-cli.jar:2.1.1-cdh6.0.1]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_141]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_141]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_141]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_141]
	at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:341) [sqoop.jar:?]
	at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:246) [sqoop.jar:?]
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:543) [sqoop.jar:?]
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:634) [sqoop.jar:?]
	at org.apache.sqoop.Sqoop.run(Sqoop.java:145) [sqoop.jar:?]
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181) [sqoop.jar:?]
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:232) [sqoop.jar:?]
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:241) [sqoop.jar:?]
	at org.apache.sqoop.Sqoop.main(Sqoop.java:250) [sqoop.jar:?]
	at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:214) [oozie-sharelib-sqoop-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:199) [oozie-sharelib-sqoop-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:101) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:51) [oozie-sharelib-sqoop-5.0.0-cdh6.0.1.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_141]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_141]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_141]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_141]
	at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:410) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:55) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:223) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_141]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_141]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1726) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_141]
	at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_141]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1726) [hadoop-common-3.0.0-cdh6.0.1.jar:?]
	at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141) [oozie-sharelib-oozie-5.0.0-cdh6.0.1.jar:?]
Caused by: java.sql.SQLSyntaxErrorException: Table/View 'DBS' does not exist.
	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.ConnectionChild.handleException(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.EmbedPreparedStatement.<init>(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.EmbedPreparedStatement42.<init>(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.jdbc.Driver42.newEmbedPreparedStatement(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.EmbedConnection.prepareStatement(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.EmbedConnection.prepareStatement(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at com.jolbox.bonecp.ConnectionHandle.prepareStatement(ConnectionHandle.java:1193) ~[bonecp-0.8.0.RELEASE.jar:?]
	at org.datanucleus.store.rdbms.SQLController.getStatementForQuery(SQLController.java:345) ~[datanucleus-rdbms-4.1.7.jar:?]
	at org.datanucleus.store.rdbms.query.RDBMSQueryUtils.getPreparedStatementForQuery(RDBMSQueryUtils.java:211) ~[datanucleus-rdbms-4.1.7.jar:?]
	at org.datanucleus.store.rdbms.query.SQLQuery.performExecute(SQLQuery.java:633) ~[datanucleus-rdbms-4.1.7.jar:?]
	at org.datanucleus.store.query.Query.executeQuery(Query.java:1844) ~[datanucleus-core-4.1.6.jar:?]
	at org.datanucleus.store.rdbms.query.SQLQuery.executeWithArray(SQLQuery.java:807) ~[datanucleus-rdbms-4.1.7.jar:?]
	at org.datanucleus.store.query.Query.execute(Query.java:1715) ~[datanucleus-core-4.1.6.jar:?]
	at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:371) ~[datanucleus-api-jdo-4.2.1.jar:?]
	... 91 more
Caused by: org.apache.derby.iapi.error.StandardException: Table/View 'DBS' does not exist.
	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.sql.compile.FromBaseTable.bindTableDescriptor(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.sql.compile.FromBaseTable.bindNonVTITables(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.sql.compile.FromList.bindTables(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.sql.compile.SelectNode.bindNonVTITables(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.sql.compile.DMLStatementNode.bindTables(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.sql.compile.DMLStatementNode.bind(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.sql.compile.CursorNode.bindStatement(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.sql.GenericStatement.prepMinion(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.sql.GenericStatement.prepare(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.sql.conn.GenericLanguageConnectionContext.prepareInternalStatement(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.EmbedPreparedStatement.<init>(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.EmbedPreparedStatement42.<init>(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.jdbc.Driver42.newEmbedPreparedStatement(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.EmbedConnection.prepareStatement(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at org.apache.derby.impl.jdbc.EmbedConnection.prepareStatement(Unknown Source) ~[derby-10.14.1.0.jar:?]
	at com.jolbox.bonecp.ConnectionHandle.prepareStatement(ConnectionHandle.java:1193) ~[bonecp-0.8.0.RELEASE.jar:?]
	at org.datanucleus.store.rdbms.SQLController.getStatementForQuery(SQLController.java:345) ~[datanucleus-rdbms-4.1.7.jar:?]

	

 


Re: Sqoop importing is failing with ERROR com.jolbox.bonecp.BoneCP - Unable to start/stop JMX

The central problem is this:

> 14:51:17.697 [main] WARN org.apache.hadoop.hive.common.LogUtils - hive-site.xml not found on CLASSPATH

For Sqoop to discover your Hive Metastore service or database, it must be supplied with the appropriate configuration. Without a hive-site.xml on the classpath, Hive falls back to an embedded Derby metastore in the container's working directory, which is why the self-test query above fails with "Table/View 'DBS' does not exist". Please try adding your client hive-site.xml to the Oozie workflow's lib directory so that Sqoop's Hive invocation can discover your existing metastore correctly.
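One way to ship the file (a sketch only; the action name, transitions, and property names below are placeholders, not taken from your actual workflow) is a <file> element inside the Sqoop action in workflow.xml:

```xml
<action name="sqoop-import">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <command>import --connect jdbc:oracle:thin:@10.35.3.43:1522:EE ...</command>
        <!-- Ships hive-site.xml (uploaded next to workflow.xml on HDFS)
             into the action's working directory so it lands on the
             classpath of the Hive invocation -->
        <file>hive-site.xml#hive-site.xml</file>
    </sqoop>
    <ok to="end"/>
    <error to="kill"/>
</action>
```

Alternatively, copying the client hive-site.xml into the workflow application's lib/ directory on HDFS (e.g. with hdfs dfs -put) has the same effect, since everything in lib/ is added to the launcher classpath automatically.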