
Hive INSERT SQL throws org.apache.hadoop.mapred.InvalidInputException


New Contributor

Hi All,

I am using Ambari 2.2.2 and HDP 2.4

I have enabled the ACID properties of Hive using Ambari.

I have created a table from the Hive command prompt; here is the SQL:

CREATE TABLE variables_new ( id INT, a_cost INT , b_cost INT ) CLUSTERED BY (id) INTO 2 BUCKETS STORED AS ORC TBLPROPERTIES("transactional"="true");
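For reference, a transactional table like this also depends on the transaction manager being configured cluster-wide. A sketch of the hive-site.xml settings usually required for ACID on Hive 1.x / HDP 2.4 (check these against your actual Ambari configuration; values shown are the commonly documented ones, not taken from this cluster):

```xml
<!-- Typical hive-site.xml settings for ACID tables (verify per HDP version) -->
<property>
  <name>hive.support.concurrency</name>
  <value>true</value>
</property>
<property>
  <name>hive.txn.manager</name>
  <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
</property>
<property>
  <name>hive.compactor.initiator.on</name>
  <value>true</value>
</property>
<property>
  <name>hive.compactor.worker.threads</name>
  <value>1</value>
</property>
<property>
  <name>hive.enforce.bucketing</name>
  <value>true</value>
</property>
```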

I am trying to connect to Hive from my PHP source code, using a Thrift library from GitHub. So far I am able to connect to Hive, list databases, list tables, and run SELECT queries. Now I need to insert, update, and delete rows from my PHP code.

While running an INSERT command, I am getting the following error.

SQL: INSERT INTO TABLE DB_NAME.variables_new VALUES (2, 10, 20)
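For reference, the same statement can be reproduced outside the PHP client with Beeline (the host below is a placeholder for my HiveServer2 node):

```shell
# Run the INSERT directly against HiveServer2 to rule out the PHP client
beeline -u "jdbc:hive2://<hiveserver2-host>:10000/DB_NAME" -n hive \
  -e "INSERT INTO TABLE variables_new VALUES (2, 10, 20);"
```

The error below occurs either way, so it does not look specific to the Thrift client.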

Error:

Vertex failed, vertexName=Map 1, vertexId=vertex_1471845991138_0085_1_00, diagnostics= Vertex vertex_1471845991138_0085_1_00 Map 1 killed/failed due to:ROOT_INPUT_INIT_FAILURE, Vertex Input: values__tmp__table__1 initializer failed, vertex=vertex_1471845991138_0085_1_00 Map 1
org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://y:8020/tmp/hive/hive/9a4f4ac7-be23-4df2-b524-d0a48ec9c076/_tmp_space.db/Values__Tmp__Table__1
    at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:287)
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:229)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:307)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:409)
    at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:155)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:273)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:266)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:266)
    at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:253)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Vertex killed, vertexName=Reducer 2, vertexId=vertex_1471845991138_0085_1_01, diagnostics= Vertex received Kill in INITED state., Vertex vertex_1471845991138_0085_1_01 Reducer 2 killed/failed due to:OTHER_VERTEX_FAILURE

DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1

Please advise where I am making a mistake.

Thanks in advance,

Paresh Kendre (kendreparesh@gmail.com)



Re: Hive INSERT SQL throws org.apache.hadoop.mapred.InvalidInputException

@Paresh Kendre

Hi. I'd double-check the path hdfs://y:8020/tmp/hive/hive/9a4f4ac7-be23-4df2-b524-d0a48ec9c076/_tmp_space.db/Values__Tmp__Table__1 and make sure the user has write access to the /tmp scratch space. It doesn't appear to be an issue with the way you created the ACID table.
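A quick way to inspect the scratch directory (the UUID-named path in the error is transient per session, so look at the parent directories; these commands assume you can run the HDFS CLI as an admin user):

```shell
# Inspect ownership and permissions of the Hive scratch space
hdfs dfs -ls /tmp/hive
hdfs dfs -ls /tmp/hive/hive

# If the querying user cannot write there, loosen permissions
# (777 is for testing only; tighten again once confirmed)
hdfs dfs -chmod -R 777 /tmp/hive
```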


Re: Hive INSERT SQL throws org.apache.hadoop.mapred.InvalidInputException

New Contributor

Hi @Scott Shaw,

Thanks a lot for your reply.

I just cross-checked and found that the user does have write permission.

Now I am getting the following error:

ThriftSQL\HiveQuery Object ( [_resp:ThriftSQL\HiveQuery:private] => ThriftSQL\TExecuteStatementResp Object ( [status] => ThriftSQL\TStatus Object ( [statusCode] => 0 [infoMessages] => [sqlState] => [errorCode] => [errorMessage] => )

[operationHandle] => ThriftSQL\TOperationHandle Object ( [operationId] => ThriftSQL\THandleIdentifier Object ( [guid] => ������H����(�Sg [secret] => ��g�=zK�{�=U# )

[operationType] => 0 [hasResultSet] => [modifiedRowCount] => )

)

[_client:ThriftSQL\HiveQuery:private] => ThriftSQL\TCLIServiceClient Object ( [input_:protected] => Thrift\Protocol\TBinaryProtocol Object ( [strictRead_:protected] => [strictWrite_:protected] => 1 [trans_:protected] => Thrift\Transport\TSaslClientTransport Object ( [transport_:Thrift\Transport\TSaslClientTransport:private] => Thrift\Transport\TSocket Object ( [handle_:Thrift\Transport\TSocket:private] => Resource id #25 [host_:protected] => 172.16.1.150 [port_:protected] => 10000 [sendTimeoutSec_:Thrift\Transport\TSocket:private] => 0 [sendTimeoutUsec_:Thrift\Transport\TSocket:private] => 100000 [recvTimeoutSec_:Thrift\Transport\TSocket:private] => 0 [recvTimeoutUsec_:Thrift\Transport\TSocket:private] => 750000 [persist_:protected] => [debug_:protected] => [debugHandler_:protected] => error_log )

[username_:protected] => hive [password_:protected] => hive [saslComplete_:Thrift\Transport\TSaslClientTransport:private] => 1 [readBuffer_:Thrift\Transport\TSaslClientTransport:private] => [writeBuffer_:Thrift\Transport\TSaslClientTransport:private] => )

)

[output_:protected] => Thrift\Protocol\TBinaryProtocol Object ( [strictRead_:protected] => [strictWrite_:protected] => 1 [trans_:protected] => Thrift\Transport\TSaslClientTransport Object ( [transport_:Thrift\Transport\TSaslClientTransport:private] => Thrift\Transport\TSocket Object ( [handle_:Thrift\Transport\TSocket:private] => Resource id #25 [host_:protected] => 172.16.1.150 [port_:protected] => 10000 [sendTimeoutSec_:Thrift\Transport\TSocket:private] => 0 [sendTimeoutUsec_:Thrift\Transport\TSocket:private] => 100000 [recvTimeoutSec_:Thrift\Transport\TSocket:private] => 0 [recvTimeoutUsec_:Thrift\Transport\TSocket:private] => 750000 [persist_:protected] => [debug_:protected] => [debugHandler_:protected] => error_log )

[username_:protected] => hive [password_:protected] => hive [saslComplete_:Thrift\Transport\TSaslClientTransport:private] => 1 [readBuffer_:Thrift\Transport\TSaslClientTransport:private] => [writeBuffer_:Thrift\Transport\TSaslClientTransport:private] => )

)

[seqid_:protected] => 0 )

[_ready:ThriftSQL\HiveQuery:private] => )

Can you please advise what this error means?

-Paresh (kendreparesh@gmail.com)


Re: Hive INSERT SQL throws org.apache.hadoop.mapred.InvalidInputException

New Contributor

In addition, in the hiveserver2.log file I found the following errors:

(KeyProviderCache.java:createKeyProviderURI(87)) - Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!

2016-08-29 11:37:52,391 ERROR [HiveServer2-Handler-Pool: Thread-59]: txn.TxnHandler (TxnHandler.java:unlock(591)) - No lock in w mode found for unlock(UnlockRequest(lockid:248))

NoSuchLockException(message:No such lock lockid:248)

2016-08-29 12:28:36,534 ERROR [HiveServer2-Handler-Pool: Thread-59]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invoke(159)) - NoSuchLockException(message:No such lock lockid:258)

Error running hive query: HiveSQLException: Illegal Operation state transition from CLOSED to FINISHED

This may help you guide me to where I am wrong. I am digging on the internet regarding the same, but any help is appreciated. If I find a solution I'll update this thread.

-Paresh


Re: Hive INSERT SQL throws org.apache.hadoop.mapred.InvalidInputException

@Paresh Kendre

Lock issues are seen when using BoneCP for JDBC connection pooling, which is controlled by the property "datanucleus.connectionPoolingType". Please change the property to datanucleus.connectionPoolingType=dbcp, restart the Hive services, and then try the query again.
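For reference, the property lives in hive-site.xml (editable in Ambari under Hive > Configs > Advanced); a sketch of the change:

```xml
<!-- Switch the metastore's JDBC connection pool from BoneCP to DBCP -->
<property>
  <name>datanucleus.connectionPoolingType</name>
  <value>dbcp</value>
</property>
```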

Hope this helps.

Thanks and Regards,

Sindhu


Re: Hive INSERT SQL throws org.apache.hadoop.mapred.InvalidInputException

New Contributor

Hi, thanks for your suggestion. Can you please have a look at the errors that I found in the log file?

  • (KeyProviderCache.java:createKeyProviderURI(87)) - Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
  • 2016-08-29 11:37:52,391 ERROR [HiveServer2-Handler-Pool: Thread-59]: txn.TxnHandler (TxnHandler.java:unlock(591)) - No lock in w mode found for unlock(UnlockRequest(lockid:248))
  • NoSuchLockException(message:No such lock lockid:248)
  • 2016-08-29 12:28:36,534 ERROR [HiveServer2-Handler-Pool: Thread-59]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invoke(159)) - NoSuchLockException(message:No such lock lockid:258)
  • Error running hive query: HiveSQLException: Illegal Operation state transition from CLOSED to FINISHED

Re: Hive INSERT SQL throws org.apache.hadoop.mapred.InvalidInputException

New Contributor

Hi Paresh,

I was also working on a web application that needs a PHP-to-Hive connection. I am using the Hortonworks Sandbox with Ambari 2.2.2 and HDP 2.4.

I am using hive-hs2-php-thrift from GitHub:

https://github.com/QwertyManiac/hive-hs2-php-thrift

I am able to connect to Hive with the HiveServer2 setting hive.server2.authentication=NOSASL.

I am also able to run basic queries such as SHOW DATABASES, SHOW TABLES, and SELECT * FROM xademo.customer_details WHERE customer_details.phone_number = '5553947406'.

But while running a query with LIKE, such as SELECT * FROM xademo.customer_details WHERE customer_details.phone_number LIKE '5553947406', I get the following error:

[errorMessage] => Error while processing statement: FAILED: Hive Internal Error: org.apache.atlas.AtlasServiceException(Metadata service API CREATE_ENTITY failed with status 400(Bad Request) Response Body ({"error":"Null value not allowed for multiplicty Multiplicity{lower=1, upper=1, isUnique=false}","stackTrace":"org.apache.atlas.typesystem.types.ValueConversionException$NullConversionException: Null value not allowed for multiplicty Multiplicity{lower=1, upper=1, isUnique=false}\n\tat org.apache.atlas.typesystem.types.DataTypes$PrimitiveType.convertNull(DataTypes.java:93)\n\tat org.apache.atlas.typesystem.types.DataTypes$StringType.convert(DataTypes.java:469)\n\tat org.apache.atlas.typesystem.types.DataTypes$StringType.convert(DataTypes.java:452)\n\tat org.apache.atlas.typesystem.persistence.StructInstance.set(StructInstance.java:118)\n\tat org.apache.atlas.typesystem.types.ClassType.convert(ClassType.java:141)\n\tat org.apache.atlas.services.DefaultMetadataService.deserializeClassInstance(DefaultMetadataService.java:252)\n\tat org.apache.atlas.services.DefaultMetadataService.createEntity(DefaultMetadataService.java:230)\n\tat org.apache.atlas.web.resources.EntityResource.submit(EntityResource.java:96)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\n\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.lang.reflect.Method.invoke(Method.java:606)\n\tat com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)\n\tat com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)\n\tat com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)\n\tat 
com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288)\n\tat com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)\n\tat com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)\n\tat com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)\n\tat com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1469)\n\tat com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1400)\n\tat com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349)\n\tat com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339)\n\tat com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:409)\n\tat com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:558)\n\tat com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:733)\n\tat javax.servlet.http.HttpServlet.service(HttpServlet.java:790)\n\tat com.google.inject.servlet.ServletDefinition.doServiceImpl(ServletDefinition.java:287)\n\tat com.google.inject.servlet.ServletDefinition.doService(ServletDefinition.java:277)\n\tat com.google.inject.servlet.ServletDefinition.service(ServletDefinition.java:182)\n\tat com.google.inject.servlet.ManagedServletPipeline.service(ManagedServletPipeline.java:91)\n\tat com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:85)\n\tat org.apache.atlas.web.filters.AuditFilter.doFilter(AuditFilter.java:67)\n\tat com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:82)\n\tat com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:119)\n\tat com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:133)\n\tat 
com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:130)\n\tat com.google.inject.servlet.GuiceFilter$Context.call(GuiceFilter.java:203)\n\tat com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:130)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)\n\tat org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)\n\tat org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)\n\tat org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)\n\tat org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)\n\tat org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)\n\tat org.eclipse.jetty.server.Server.handle(Server.java:499)\n\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:310)\n\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)\n\tat org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)\n\tat java.lang.Thread.run(Thread.java:745)\n"}))
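The stack trace above is raised by the Atlas Hive hook (invoked through hive.exec.post.hooks) while it tries to register metadata for the query, not by query execution itself. A commonly suggested workaround on the sandbox, assuming Atlas lineage is not needed, is to remove the hook from hive-site.xml via Ambari and restart HiveServer2 (check the current value of the property first, since it may list several hooks separated by commas):

```xml
<!-- hive-site.xml: remove org.apache.atlas.hive.hook.HiveHook from the list.
     If the Atlas hook is the only value, the property can be left empty. -->
<property>
  <name>hive.exec.post.hooks</name>
  <value></value>
</property>
```

This only suppresses the Atlas integration; fixing the underlying Atlas metadata error would keep lineage tracking working.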

Have you ever seen this kind of error while running SQL queries from a PHP Hive client?

Thanks and Regards,

Sandesh