2017-10-16 08:11:02,224 INFO [HiveServer2-Handler-Pool: Thread-52]: session.HiveSessionImpl (HiveSessionImpl.java:acquireAfterOpLock(333)) - We are setting the hadoop caller context to 2baabf5f-ab92-4ffd-ae57-1f81becec533 for thread HiveServer2-Handler-Pool: Thread-52
2017-10-16 08:11:02,225 INFO [HiveServer2-Handler-Pool: Thread-52]: log.PerfLogger (PerfLogger.java:PerfLogBegin(149)) -
2017-10-16 08:11:02,225 INFO [HiveServer2-Handler-Pool: Thread-52]: log.PerfLogger (PerfLogger.java:PerfLogBegin(149)) -
2017-10-16 08:11:02,226 INFO [HiveServer2-Handler-Pool: Thread-52]: ql.Driver (Driver.java:compile(427)) - We are setting the hadoop caller context from HIVE_SSN_ID:2baabf5f-ab92-4ffd-ae57-1f81becec533 to hive_20171016081102_bc3b8867-326f-4646-8244-1ea3f9eeccef
2017-10-16 08:11:02,227 INFO [HiveServer2-Handler-Pool: Thread-52]: log.PerfLogger (PerfLogger.java:PerfLogBegin(149)) -
2017-10-16 08:11:02,227 INFO [HiveServer2-Handler-Pool: Thread-52]: parse.ParseDriver (ParseDriver.java:parse(185)) - Parsing command: SELECT COUNT(*) FROM sandbox.number_generator WHERE 1 = 2
2017-10-16 08:11:02,228 INFO [HiveServer2-Handler-Pool: Thread-52]: parse.ParseDriver (ParseDriver.java:parse(209)) - Parse Completed
2017-10-16 08:11:02,229 INFO [HiveServer2-Handler-Pool: Thread-52]: log.PerfLogger (PerfLogger.java:PerfLogEnd(177)) -
2017-10-16 08:11:02,229 INFO [HiveServer2-Handler-Pool: Thread-52]: log.PerfLogger (PerfLogger.java:PerfLogBegin(149)) -
2017-10-16 08:11:02,234 INFO [HiveServer2-Handler-Pool: Thread-52]: parse.CalcitePlanner (SemanticAnalyzer.java:analyzeInternal(10436)) - Starting Semantic Analysis
2017-10-16 08:11:02,235 INFO [HiveServer2-Handler-Pool: Thread-52]: sqlstd.SQLStdHiveAccessController (SQLStdHiveAccessController.java:<init>(95)) - Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=2baabf5f-ab92-4ffd-ae57-1f81becec533, clientType=HIVESERVER2]
2017-10-16 08:11:02,236 INFO [HiveServer2-Handler-Pool: Thread-52]: parse.CalcitePlanner (SemanticAnalyzer.java:genResolvedParseTree(10383)) - Completed phase 1 of Semantic Analysis
2017-10-16 08:11:02,236 INFO [HiveServer2-Handler-Pool: Thread-52]: parse.CalcitePlanner (SemanticAnalyzer.java:getMetaData(1628)) - Get metadata for source tables
2017-10-16 08:11:02,236 INFO [HiveServer2-Handler-Pool: Thread-52]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(860)) - 2: get_table : db=sandbox tbl=number_generator
2017-10-16 08:11:02,237 INFO [HiveServer2-Handler-Pool: Thread-52]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(411)) - ugi=anonymous ip=unknown-ip-addr cmd=get_table : db=sandbox tbl=number_generator
2017-10-16 08:11:02,275 INFO [HiveServer2-Handler-Pool: Thread-52]: parse.CalcitePlanner (SemanticAnalyzer.java:getMetaData(1766)) - Get metadata for subqueries
2017-10-16 08:11:02,275 INFO [HiveServer2-Handler-Pool: Thread-52]: parse.CalcitePlanner (SemanticAnalyzer.java:getMetaData(1790)) - Get metadata for destination tables
2017-10-16 08:11:02,276 ERROR [HiveServer2-Handler-Pool: Thread-52]: hdfs.KeyProviderCache (KeyProviderCache.java:createKeyProviderURI(87)) - Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
2017-10-16 08:11:02,281 INFO [HiveServer2-Handler-Pool: Thread-52]: ql.Context (Context.java:getMRScratchDir(448)) - New scratch dir is hdfs://hive-host:8020/tmp/hive/anonymous/2baabf5f-ab92-4ffd-ae57-1f81becec533/hive_2017-10-16_08-11-02_227_4905545243586525498-7
2017-10-16 08:11:02,281 INFO [HiveServer2-Handler-Pool: Thread-52]: parse.CalcitePlanner (SemanticAnalyzer.java:genResolvedParseTree(10387)) - Completed getting MetaData in Semantic Analysis
2017-10-16 08:11:02,282 INFO [HiveServer2-Handler-Pool: Thread-52]: parse.BaseSemanticAnalyzer (CalcitePlanner.java:canCBOHandleAst(399)) - Not invoking CBO because the statement has too few joins
2017-10-16 08:11:02,285 INFO [HiveServer2-Handler-Pool: Thread-52]: common.FileUtils (FileUtils.java:mkdir(573)) - Creating directory if it doesn't exist: hdfs://hive-host:8020/tmp/hive/anonymous/2baabf5f-ab92-4ffd-ae57-1f81becec533/hive_2017-10-16_08-11-02_227_4905545243586525498-7/-mr-10000/.hive-staging_hive_2017-10-16_08-11-02_227_4905545243586525498-7
2017-10-16 08:11:02,325 WARN [HiveServer2-Handler-Pool: Thread-52]: optimizer.ConstantPropagateProcFactory (ConstantPropagateProcFactory.java:process(1055)) - Filter expression GenericUDFOPEqual(Const int 1, Const int 2) holds false!
2017-10-16 08:11:02,326 INFO [HiveServer2-Handler-Pool: Thread-52]: ppd.OpProcFactory (OpProcFactory.java:process(694)) - Processing for FS(205)
2017-10-16 08:11:02,326 INFO [HiveServer2-Handler-Pool: Thread-52]: ppd.OpProcFactory (OpProcFactory.java:process(694)) - Processing for SEL(204)
2017-10-16 08:11:02,327 INFO [HiveServer2-Handler-Pool: Thread-52]: ppd.OpProcFactory (OpProcFactory.java:process(694)) - Processing for GBY(203)
2017-10-16 08:11:02,327 INFO [HiveServer2-Handler-Pool: Thread-52]: ppd.OpProcFactory (OpProcFactory.java:process(694)) - Processing for RS(202)
2017-10-16 08:11:02,328 INFO [HiveServer2-Handler-Pool: Thread-52]: ppd.OpProcFactory (OpProcFactory.java:process(694)) - Processing for GBY(201)
2017-10-16 08:11:02,328 INFO [HiveServer2-Handler-Pool: Thread-52]: ppd.OpProcFactory (OpProcFactory.java:process(694)) - Processing for SEL(200)
2017-10-16 08:11:02,329 INFO [HiveServer2-Handler-Pool: Thread-52]: ppd.OpProcFactory (OpProcFactory.java:process(441)) - Processing for FIL(199)
2017-10-16 08:11:02,329 INFO [HiveServer2-Handler-Pool: Thread-52]: ppd.OpProcFactory (OpProcFactory.java:process(413)) - Processing for TS(198)
2017-10-16 08:11:02,330 WARN [HiveServer2-Handler-Pool: Thread-52]: optimizer.ConstantPropagateProcFactory (ConstantPropagateProcFactory.java:process(1055)) - Filter expression Const boolean false holds false!
2017-10-16 08:11:02,335 INFO [HiveServer2-Handler-Pool: Thread-52]: log.PerfLogger (PerfLogger.java:PerfLogBegin(149)) -
2017-10-16 08:11:02,335 INFO [HiveServer2-Handler-Pool: Thread-52]: log.PerfLogger (PerfLogger.java:PerfLogEnd(177)) -
2017-10-16 08:11:02,336 INFO [HiveServer2-Handler-Pool: Thread-52]: optimizer.ColumnPrunerProcFactory (ColumnPrunerProcFactory.java:pruneReduceSinkOperator(866)) - RS 202 oldColExprMap: {VALUE._col0=Column[_col0]}
2017-10-16 08:11:02,336 INFO [HiveServer2-Handler-Pool: Thread-52]: optimizer.ColumnPrunerProcFactory (ColumnPrunerProcFactory.java:pruneReduceSinkOperator(915)) - RS 202 newColExprMap: {VALUE._col0=Column[_col0]}
2017-10-16 08:11:02,338 WARN [HiveServer2-Handler-Pool: Thread-52]: optimizer.ConstantPropagateProcFactory (ConstantPropagateProcFactory.java:process(1055)) - Filter expression Const boolean false holds false!
2017-10-16 08:11:02,523 INFO [HiveServer2-Handler-Pool: Thread-52]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(860)) - 2: get_table_statistics_req: db=sandbox table=number_generator
2017-10-16 08:11:02,523 INFO [HiveServer2-Handler-Pool: Thread-52]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(411)) - ugi=anonymous ip=unknown-ip-addr cmd=get_table_statistics_req: db=sandbox table=number_generator
2017-10-16 08:11:02,563 INFO [HiveServer2-Handler-Pool: Thread-52]: optimizer.SetReducerParallelism (SetReducerParallelism.java:process(108)) - Number of reducers determined to be: 1
2017-10-16 08:11:02,564 INFO [HiveServer2-Handler-Pool: Thread-52]: parse.TezCompiler (TezCompiler.java:runCycleAnalysisForPartitionPruning(157)) - Cycle free: true
2017-10-16 08:11:02,565 INFO [HiveServer2-Handler-Pool: Thread-52]: physical.NullScanOptimizer (NullScanOptimizer.java:process(129)) - Found where false TableScan. TS[198]
2017-10-16 08:11:02,565 INFO [HiveServer2-Handler-Pool: Thread-52]: ql.Context (Context.java:getMRScratchDir(448)) - New scratch dir is hdfs://hive-host:8020/tmp/hive/anonymous/2baabf5f-ab92-4ffd-ae57-1f81becec533/hive_2017-10-16_08-11-02_227_4905545243586525498-7
2017-10-16 08:11:02,566 INFO [HiveServer2-Handler-Pool: Thread-52]: physical.Vectorizer (Vectorizer.java:validateMapWork(693)) - Validating MapWork...
2017-10-16 08:11:02,566 INFO [HiveServer2-Handler-Pool: Thread-52]: physical.Vectorizer (Vectorizer.java:verifyAndSetVectorPartDesc(533)) - The input format org.apache.hadoop.hive.ql.io.OneNullRowInputFormat of path -mr-10003sandbox.number_generator{} doesn't provide vectorized input
2017-10-16 08:11:02,567 INFO [HiveServer2-Handler-Pool: Thread-52]: parse.CalcitePlanner (SemanticAnalyzer.java:analyzeInternal(10554)) - Completed plan generation
2017-10-16 08:11:02,567 INFO [HiveServer2-Handler-Pool: Thread-52]: ql.Driver (Driver.java:compile(477)) - Semantic Analysis Completed
2017-10-16 08:11:02,567 INFO [HiveServer2-Handler-Pool: Thread-52]: log.PerfLogger (PerfLogger.java:PerfLogEnd(177)) -
2017-10-16 08:11:02,569 INFO [HiveServer2-Handler-Pool: Thread-52]: exec.ListSinkOperator (Operator.java:initialize(333)) - Initializing operator OP[207]
2017-10-16 08:11:02,569 INFO [HiveServer2-Handler-Pool: Thread-52]: ql.Driver (Driver.java:getSchema(253)) - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:_c0, type:bigint, comment:null)], properties:null)
2017-10-16 08:11:02,569 INFO [HiveServer2-Handler-Pool: Thread-52]: log.PerfLogger (PerfLogger.java:PerfLogEnd(177)) -
2017-10-16 08:11:02,570 INFO [HiveServer2-Handler-Pool: Thread-52]: ql.Driver (Driver.java:compile(559)) - We are resetting the hadoop caller context to HIVE_SSN_ID:2baabf5f-ab92-4ffd-ae57-1f81becec533
2017-10-16 08:11:02,570 INFO [HiveServer2-Handler-Pool: Thread-52]: session.HiveSessionImpl (HiveSessionImpl.java:releaseBeforeOpLock(357)) - We are resetting the hadoop caller context for thread HiveServer2-Handler-Pool: Thread-52
2017-10-16 08:11:02,575 INFO [HiveServer2-Background-Pool: Thread-4011]: ql.Driver (Driver.java:checkConcurrency(173)) - Concurrency mode is disabled, not creating a lock manager
2017-10-16 08:11:02,575 INFO [HiveServer2-Background-Pool: Thread-4011]: log.PerfLogger (PerfLogger.java:PerfLogBegin(149)) -
2017-10-16 08:11:02,575 INFO [HiveServer2-Background-Pool: Thread-4011]: ql.Driver (Driver.java:execute(1408)) - Setting caller context to query id hive_20171016081102_bc3b8867-326f-4646-8244-1ea3f9eeccef
2017-10-16 08:11:02,576 INFO [HiveServer2-Background-Pool: Thread-4011]: ql.Driver (Driver.java:execute(1411)) - Starting command(queryId=hive_20171016081102_bc3b8867-326f-4646-8244-1ea3f9eeccef): SELECT COUNT(*) FROM sandbox.number_generator WHERE 1 = 2
2017-10-16 08:11:02,577 INFO [HiveServer2-Background-Pool: Thread-4011]: hooks.ATSHook (ATSHook.java:<init>(114)) - Created ATS Hook
2017-10-16 08:11:02,577 INFO [HiveServer2-Background-Pool: Thread-4011]: log.PerfLogger (PerfLogger.java:PerfLogBegin(149)) -
2017-10-16 08:11:02,579 INFO [HiveServer2-Background-Pool: Thread-4011]: log.PerfLogger (PerfLogger.java:PerfLogEnd(177)) -
2017-10-16 08:11:02,580 INFO [HiveServer2-Background-Pool: Thread-4011]: ql.Driver (SessionState.java:printInfo(984)) - Query ID = hive_20171016081102_bc3b8867-326f-4646-8244-1ea3f9eeccef
2017-10-16 08:11:02,580 INFO [HiveServer2-Background-Pool: Thread-4011]: ql.Driver (SessionState.java:printInfo(984)) - Total jobs = 1
2017-10-16 08:11:02,581 INFO [HiveServer2-Background-Pool: Thread-4011]: log.PerfLogger (PerfLogger.java:PerfLogBegin(149)) -
2017-10-16 08:11:02,582 INFO [HiveServer2-Background-Pool: Thread-4011]: ql.Driver (SessionState.java:printInfo(984)) - Launching Job 1 out of 1
2017-10-16 08:11:02,582 INFO [HiveServer2-Background-Pool: Thread-4011]: ql.Driver (Driver.java:launchTask(1746)) - Starting task [Stage-1:MAPRED] in serial mode
2017-10-16 08:11:02,583 INFO [HiveServer2-Background-Pool: Thread-4011]: tez.TezSessionPoolManager (TezSessionPoolManager.java:getSession(132)) - QueueName: null nonDefaultUser: true defaultQueuePool: null blockingQueueLength: -1
2017-10-16 08:11:02,583 INFO [HiveServer2-Background-Pool: Thread-4011]: tez.TezSessionPoolManager (TezSessionPoolManager.java:getNewSessionState(162)) - Created a new session for queue: null session id: 4996084a-5e40-498a-af46-11d324e7e41a
2017-10-16 08:11:02,634 INFO [ATS Logger 0]: hooks.ATSHook (ATSHook.java:createTimelineDomain(123)) - ATS domain created:hive_2baabf5f-ab92-4ffd-ae57-1f81becec533(anonymous,hive,anonymous,hive)
2017-10-16 08:11:02,638 INFO [ATS Logger 0]: hooks.ATSHook (ATSHook.java:createPreHookEvent(302)) - Received pre-hook notification for :hive_20171016081102_bc3b8867-326f-4646-8244-1ea3f9eeccef
2017-10-16 08:11:02,693 INFO [HiveServer2-Background-Pool: Thread-4011]: ql.Context (Context.java:getMRScratchDir(448)) - New scratch dir is hdfs://hive-host:8020/tmp/hive/anonymous/2baabf5f-ab92-4ffd-ae57-1f81becec533/hive_2017-10-16_08-11-02_227_4905545243586525498-18
2017-10-16 08:11:02,694 INFO [HiveServer2-Background-Pool: Thread-4011]: exec.Task (TezTask.java:updateSession(285)) - Tez session hasn't been created yet. Opening session
2017-10-16 08:11:02,694 INFO [HiveServer2-Background-Pool: Thread-4011]: tez.TezSessionState (TezSessionState.java:open(129)) - Opening the session with id 4996084a-5e40-498a-af46-11d324e7e41a for thread HiveServer2-Background-Pool: Thread-4011 log trace id - query id - hive_20171016081102_bc3b8867-326f-4646-8244-1ea3f9eeccef
2017-10-16 08:11:02,695 INFO [HiveServer2-Background-Pool: Thread-4011]: tez.TezSessionState (TezSessionState.java:open(145)) - User of session id 4996084a-5e40-498a-af46-11d324e7e41a is anonymous
2017-10-16 08:11:02,700 INFO [HiveServer2-Background-Pool: Thread-4011]: tez.DagUtils (DagUtils.java:localizeResource(974)) - Localizing resource because it does not exist: file:/usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar to dest: hdfs://hive-host:8020/tmp/hive/anonymous/_tez_session_dir/4996084a-5e40-498a-af46-11d324e7e41a/hive-hcatalog-core.jar
2017-10-16 08:11:02,732 INFO [HiveServer2-Background-Pool: Thread-4011]: tez.DagUtils (DagUtils.java:createLocalResource(724)) - Resource modification time: 1508130662730
2017-10-16 08:11:02,736 ERROR [HiveServer2-Background-Pool: Thread-4011]: exec.Task (TezTask.java:execute(226)) - Failed to execute tez graph.
org.apache.hadoop.security.AccessControlException: Permission denied: user=anonymous, access=WRITE, inode="/user/anonymous":hdfs:hdfs:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1955)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1939)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1922)
    at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4150)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1109)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:645)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3075)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:3043)
    at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1181)
    at org.apache.hadoop.hdfs.DistributedFileSystem$25.doCall(DistributedFileSystem.java:1177)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1177)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1169)
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1924)
    at org.apache.hadoop.hive.ql.exec.tez.DagUtils.getDefaultDestDir(DagUtils.java:787)
    at org.apache.hadoop.hive.ql.exec.tez.DagUtils.getHiveJarDirectory(DagUtils.java:897)
    at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.createJarLocalResource(TezSessionState.java:367)
    at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:161)
    at org.apache.hadoop.hive.ql.exec.tez.TezTask.updateSession(TezTask.java:286)
    at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:165)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1748)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1494)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1291)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1158)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1153)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:197)
    at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:76)
    at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:253)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:264)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=anonymous, access=WRITE, inode="/user/anonymous":hdfs:hdfs:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1955)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1939)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1922)
    at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4150)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1109)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:645)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1554)
    at org.apache.hadoop.ipc.Client.call(Client.java:1498)
    at org.apache.hadoop.ipc.Client.call(Client.java:1398)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at com.sun.proxy.$Proxy19.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:610)
    at sun.reflect.GeneratedMethodAccessor25.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
    at com.sun.proxy.$Proxy20.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3073)
    ... 34 more
2017-10-16 08:11:02,738 INFO [HiveServer2-Background-Pool: Thread-4011]: hooks.ATSHook (ATSHook.java:<init>(114)) - Created ATS Hook
2017-10-16 08:11:02,739 INFO [HiveServer2-Background-Pool: Thread-4011]: log.PerfLogger (PerfLogger.java:PerfLogBegin(149)) -
2017-10-16 08:11:02,740 INFO [HiveServer2-Background-Pool: Thread-4011]: log.PerfLogger (PerfLogger.java:PerfLogEnd(177)) -
2017-10-16 08:11:02,740 INFO [ATS Logger 0]: hooks.ATSHook (ATSHook.java:createPostHookEvent(362)) - Received post-hook notification for :hive_20171016081102_bc3b8867-326f-4646-8244-1ea3f9eeccef
2017-10-16 08:11:02,741 ERROR [HiveServer2-Background-Pool: Thread-4011]: ql.Driver (SessionState.java:printError(993)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
2017-10-16 08:11:02,741 INFO [HiveServer2-Background-Pool: Thread-4011]: ql.Driver (Driver.java:execute(1638)) - Resetting the caller context to HIVE_SSN_ID:2baabf5f-ab92-4ffd-ae57-1f81becec533
2017-10-16 08:11:02,742 INFO [HiveServer2-Background-Pool: Thread-4011]: log.PerfLogger (PerfLogger.java:PerfLogEnd(177)) -
2017-10-16 08:11:02,742 INFO [HiveServer2-Background-Pool: Thread-4011]: log.PerfLogger (PerfLogger.java:PerfLogBegin(149)) -
2017-10-16 08:11:02,742 INFO [HiveServer2-Background-Pool: Thread-4011]: log.PerfLogger (PerfLogger.java:PerfLogEnd(177)) -
2017-10-16 08:11:02,743 ERROR [HiveServer2-Background-Pool: Thread-4011]: operation.Operation (SQLOperation.java:run(256)) - Error running hive query:
org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
    at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:323)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:199)
    at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:76)
    at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:253)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:264)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
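Note on the failure above: the query itself compiles fine; the Tez task dies because DagUtils.getDefaultDestDir() tries to create a directory under /user/anonymous, which is owned by hdfs:hdfs and not writable by the anonymous user. A minimal remediation sketch follows, assuming the fix chosen is to pre-create that HDFS home directory as the HDFS superuser (the class name CreateAnonymousHomeDir, the fs.defaultFS value, and the chosen ownership are illustrative assumptions, not something taken from this log):

// Hypothetical sketch: pre-create /user/anonymous so Tez session resources can be written there.
// Must run with HDFS superuser privileges; adjust the NameNode URI and ownership for your cluster.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class CreateAnonymousHomeDir {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://hive-host:8020"); // NameNode seen in the scratch-dir paths above
        try (FileSystem fs = FileSystem.get(conf)) {
            Path home = new Path("/user/anonymous");
            if (!fs.exists(home)) {
                // rwxr-xr-x, then hand ownership to the querying user so it gets WRITE access
                fs.mkdirs(home, new FsPermission((short) 0755));
                fs.setOwner(home, "anonymous", "hdfs");
            }
        }
    }
}

An equivalent one-off change with the hdfs CLI (or letting connections run under a real user instead of anonymous, e.g. via hive.server2.enable.doAs and proper authentication) would achieve the same result; the key point is that the user executing the Tez session needs a writable home directory on HDFS.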