Member since: 10-31-2016
Posts: 81
Kudos Received: 1
Solutions: 0
11-05-2018
04:32 AM
@Sandeep Nemuri Thanks
11-02-2018
11:56 AM
@Sandeep Nemuri Thanks so much. One more scenario: different business users access Hive with different user names from the same server. Is it possible, when a user logs in to the server, for that user's queries to be assigned to a predefined queue automatically? Scenario as below. Thanks in advance
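In case it helps: one common way to do this with the YARN Capacity Scheduler is static queue mappings, which route a submitting user's (or group's) jobs to a predefined queue automatically, without the user specifying one. A sketch of the relevant capacity-scheduler.xml properties — the user names (alice, bob), group name (bi-users), and queue names here are placeholders, not from this thread:

```xml
<!-- capacity-scheduler.xml: map users/groups to predefined queues
     (u:<user>:<queue> and g:<group>:<queue> entries, comma-separated) -->
<property>
  <name>yarn.scheduler.capacity.queue-mappings</name>
  <value>u:alice:analytics,u:bob:etl,g:bi-users:reports</value>
</property>
<!-- When true, the mapping wins even if the user requests another queue -->
<property>
  <name>yarn.scheduler.capacity.queue-mappings-override.enable</name>
  <value>true</value>
</property>
```

There is also a `u:%user:%user` form that maps each user to a queue of the same name, if one queue per user is the goal.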
11-02-2018
09:22 AM
Labels:
- Apache Hive
- Apache YARN
08-13-2018
08:47 AM
I want to export a Hive table to a JSON file for analysis.
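One approach, once the table has been dumped to delimited text (e.g. `hive -e 'SELECT * FROM my_table' > out.tsv`, where `my_table` is a placeholder name), is a small script that turns each row into a JSON object. A minimal sketch, assuming tab-delimited output and known column names:

```python
import csv
import json

def tsv_to_json_lines(tsv_text, columns):
    """Convert tab-separated rows (as produced by `hive -e ... > out.tsv`)
    into a list of JSON strings, one object per row."""
    reader = csv.reader(tsv_text.strip().splitlines(), delimiter="\t")
    return [json.dumps(dict(zip(columns, row))) for row in reader]

# Example with made-up data:
rows = "1\talice\n2\tbob"
print(tsv_to_json_lines(rows, ["id", "name"]))
# → ['{"id": "1", "name": "alice"}', '{"id": "2", "name": "bob"}']
```

All values come out as strings here; for typed JSON, a serde-based export inside Hive itself (e.g. a CTAS into a JsonSerDe-backed table) may be a better fit.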
Labels:
- Apache Hive
02-06-2018
10:50 AM
com.informatica.platform.dtm.executor.hive.boot.storagehandler.INFAInputFormat$INFAInputFormatRecordReader.close(INFAInputFormat.java:138)
at org.apache.hadoop.hive.ql.io.HiveRecordReader.doClose(HiveRecordReader.java:50)
at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.close(HiveContextAwareRecordReader.java:104)
at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat$TezGroupedSplitsRecordReader.initNextRecordReader(TezGroupedSplitsInputFormat.java:174)
at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat$TezGroupedSplitsRecordReader.next(TezGroupedSplitsInputFormat.java:142)
at org.apache.tez.mapreduce.lib.MRReaderMapred.next(MRReaderMapred.java:113)
at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:61)
... 16 more
2018-02-06 13:43:47,094 [INFO] [TezChild] |runtime.LogicalIOProcessorRuntimeTask|: Final Counters for attempt_1517897399576_0044_1_00_000000_2: Counters: 27 [[File System Counters FILE_BYTES_READ=53220][org.apache.tez.common.counters.TaskCounter GC_TIME_MILLIS=308, CPU_MILLISECONDS=35160, PHYSICAL_MEMORY_BYTES=953155584, VIRTUAL_MEMORY_BYTES=24876220416, COMMITTED_HEAP_BYTES=953155584, INPUT_RECORDS_PROCESSED=0, INPUT_SPLIT_LENGTH_BYTES=1, OUTPUT_RECORDS=0][HIVE DESERIALIZE_ERRORS=0, RECORDS_IN_Map_1=0, RECORDS_OUT_1_default.w6196205737321119958_read_mb_leadassignto=0][Informatica Transformation Counters BytesThroughput_Read_mb_leadassigntobvertical1=0, ProcessedBytes_Read_mb_leadassigntobvertical1=0, ProcessedRows_Read_mb_leadassigntobvertical1=0, RejectedBytes_Read_mb_leadassigntobvertical1=0, RejectedRows_Read_mb_leadassigntobvertical1=0, RowsThroughput_Read_mb_leadassigntobvertical1=0][Informatica Transformation Counters_Map_1_INPUT_w6196205737321119958_infa_read_mb_leadas BytesThroughput_Read_mb_leadassigntobvertical1=0, ProcessedBytes_Read_mb_leadassigntobvertical1=0, ProcessedRows_Read_mb_leadassigntobvertical1=0, RejectedBytes_Read_mb_leadassigntobvertical1=0, RejectedRows_Read_mb_leadassigntobvertical1=0, RowsThroughput_Read_mb_leadassigntobvertical1=0][TaskCounter_Map_1_INPUT_w6196205737321119958_infa_read_mb_leadas INPUT_RECORDS_PROCESSED=0, INPUT_SPLIT_LENGTH_BYTES=1][TaskCounter_Map_1_OUTPUT_out_Map_1 OUTPUT_RECORDS=0]]
2018-02-06 13:43:47,094 [INFO] [TezChild] |runtime.LogicalIOProcessorRuntimeTask|: Joining on EventRouter
2018-02-06 13:43:47,094 [INFO] [TezChild] |runtime.LogicalIOProcessorRuntimeTask|: Closed processor for vertex=Map 1, index=0
2018-02-06 13:43:47,095 [WARN] [TezChild] |runtime.LogicalIOProcessorRuntimeTask|: Ignoring exception when closing input w6196205737321119958_infa_read_mb_leadassigntobvertical1_m_load_mms_dl_his_mb_leadassigntobvertical(cleanup). Exception class=java.io.IOException, message=com.informatica.powercenter.sdk.dtm.DTMException: Internal error. The DTM instance has shut down. Contact Informatica Global Customer Support.
2018-02-06 13:43:47,095 [INFO] [TezChild] |runtime.LogicalIOProcessorRuntimeTask|: Closed input for vertex=Map 1, sourceVertex=w6196205737321119958_infa_read_mb_leadassigntobvertical1_m_load_mms_dl_his_mb_leadassigntobvertical
2018-02-06 13:43:47,095 [INFO] [TezChild] |output.MROutput|: out_Map 1 closed
2018-02-06 13:43:47,095 [INFO] [TezChild] |runtime.LogicalIOProcessorRuntimeTask|: Closed input for vertex=Map 1, sourceVertex=out_Map 1
2018-02-06 13:43:47,096 [INFO] [main] |task.TezChild|: Shutdown invoked for container container_e117_1517897399576_0044_01_000004
2018-02-06 13:43:47,096 [INFO] [main] |task.TezChild|: Shutting down container container_e117_1517897399576_0044_01_000004
End of LogType:syslog_attempt_1517897399576_0044_1_00_000000_2
02-05-2018
08:38 AM
workflow-log.txt @Sivaprasanna I am using Informatica BDM version 10.0.1. The source is MySQL and the target is a Hive table. While executing the Informatica workflow I am getting the error as mentioned. FYR, I am attaching the workflow log. Thanks.
02-03-2018
04:17 PM
Hi, while running the Informatica mapping I am getting the error mentioned below.
2018-02-03 21:37:30.604 <DefaultWorkManager-WorkerThread-7> INFO: [2904][2018-02-03 21:37:29.095] Invoke M_load_bm_dl_his_sla_ticket_submaster_invoke: Faulting: Mapping-error : java.lang.Exception: [MPSVCCMN_10094] The Mapping Service Module failed to run the job with ID [MdOS0gj8EeiQDeexJuc-4Q] because of the following error: [HIVE_1070] The Integration Service failed to run Hive query [exec0_query_6] for task [exec0] due to following error: Hive error code [2], Hive message [FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1517663055306_0153_1_00, diagnostics=[Task failed, taskId=task_1517663055306_0153_1_00_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Failure while running task:java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.IOException: Mapping execution failed with the following error:
ODL_26128 Database error encountered in connection object [pri_sqoop_dwh3] with the following error message: [Database error encountered in connection object [com.informatica.adapter.infajdbc.InfaJDBCConnectInfo] with the following error message: [There is no runtime plugin entry for OSType[LINUX], objectName[com.informatica.adapter.infajdbc.InfaJDBCConnectInfo] and interfaceName[INFASQLDataAdapter].
]
The Data Integration Service could not find the run-time OSGi bundle for the adapter [com.informatica.adapter.infajdbc.InfaJDBCConnectInfo] for the operating system [LINUX]. Copy the adapter run-time OSGi bundle and verify that you have set the correct library name in the plugin.xml file.]
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.IOException: Mapping execution failed with the following error:
ODL_26128 Database error encountered in connection object [pri_sqoop_dwh3] with the following error message: [Database error encountered in connection object [com.informatica.adapter.infajdbc.InfaJDBCConnectInfo] with the following error message: [There is no runtime plugin entry for OSType[LINUX], objectName[com.informatica.adapter.infajdbc.InfaJDBCConnectInfo] and interfaceName[INFASQLDataAdapter].
] Thanks, Kotesh.
02-02-2018
01:29 PM
@Sandeep Nemuri Can you help me with how to change to the active NameNode, or to the nameservice? Thanks
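For reference, the usual fix for the StandbyException above is to point clients at the HA nameservice ID rather than a single NameNode host, so requests follow whichever NameNode is active; `hdfs haadmin -getServiceState <nn-id>` shows the current state of each NameNode. A sketch of the relevant core-site.xml entry — `mycluster` is a placeholder nameservice ID, not taken from this cluster:

```xml
<!-- core-site.xml: use the HA nameservice ID, not an individual NameNode host -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://mycluster</value>
</property>
```

The nameservice itself (and its `dfs.ha.namenodes.*` / failover proxy settings) is defined in hdfs-site.xml, which client tools such as Informatica need on their classpath.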
02-01-2018
01:01 PM
Hi @Jay Kumar SenSharma, while writing to Hive through Informatica IDQ I am getting the error below. following directory: [/home/minfaadmin/informatica102].
2018-02-01 15:58:59.438 <LdtmCompile-pool-2-thread-23> INFO: [AUTOINST_3029] The cluster Hadoop distribution type is: [hortonworks_2.6].
2018-02-01 15:58:59.784 <LdtmCompile-pool-2-thread-23> INFO: [AUTOINST_3001] The MD5 hexadecimal value for the Informatica archive is [d152bc991f7cf7a905c05b22e494b30e].
2018-02-01 15:58:59.785 <LdtmCompile-pool-2-thread-23> INFO: [AUTOINST_3026] The Informatica archive already exists on the Data Integration Service machine. No archiving necessary.
2018-02-01 16:01:42.561 <LdtmCompile-pool-2-thread-23> SEVERE: Data integration service failed to create DTM instance because of the following error:
java.lang.RuntimeException: [java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby
at org.apache.hadoop.hdfs.server.namenode.ha.StandbyState.checkOperation(StandbyState.java:87)
at org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.checkOperation(NameNode.java:2000)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOperation(FSNamesystem.java:1377)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4105)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1136)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:854)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)
]
2018-02-01 16:01:42.564 <LdtmCompile-pool-2-thread-23> SEVERE: [DSCMN_10282] The Integration Service failed to submit the mapping [m_load_mms_dl_his_mb_followupdet] because of the following error: [java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby
at org.apache.hadoop.hdfs.server.namenode.ha.StandbyState.checkOperation(StandbyState.java:87)
at org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.checkOperation(NameNode.java:2000)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOperation(FSNamesystem.java:1377)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4105)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1136)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:854)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)
].
com.informatica.ds.common.exceptions.DataServiceRuntimeException: [DSCMN_10282] The Integration Service failed to submit the mapping [m_load_mms_dl_his_mb_followupdet] because of the following error: [java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby
at WA].
Labels:
- Apache Hive