Member since: 02-11-2019
Posts: 78
Kudos Received: 1
Solutions: 0
03-21-2020
09:35 AM
Still struggling with this... See exception stack below.

2020-03-21 12:27:31,694 ERROR [IPC Server handler 10 on 45536] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1584785234978_6403_m_000000_0 - exited : com.teradata.connector.common.exception.ConnectorException: index outof boundary
at com.teradata.connector.teradata.converter.TeradataConverter.convert(TeradataConverter.java:179)
at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:111)
at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:70)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:670)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
at com.teradata.connector.common.ConnectorMMapper.map(ConnectorMMapper.java:134)
at com.teradata.connector.common.ConnectorMMapper.run(ConnectorMMapper.java:122)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
03-20-2020
09:37 AM
Thanks, EricL. At least I know it will work.
03-18-2020
01:04 PM
I'm trying to export data from an HDFS location to Teradata. I have created a table with the same schema in Teradata.

Export command:

sqoop export --connect jdbc:teradata://teradataserver/Database=dbname --username xxxx --password xxxx --table teradataTbl --export-dir /hdfs/parquet/files/path/

Exception:

2020-03-18 14:32:00,754 ERROR [IPC Server handler 3 on 41836] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1584475869533_13501_m_000002_0 - exited : com.teradata.connector.common.exception.ConnectorException: index outof boundary
at com.teradata.connector.teradata.converter.TeradataConverter.convert(TeradataConverter.java:179)
at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:111)
at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:70)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:670)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
at com.teradata.connector.common.ConnectorMMapper.map(ConnectorMMapper.java:134)
at com.teradata.connector.common.ConnectorMMapper.run(ConnectorMMapper.java:122)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
Labels:
- Apache Hive
- Apache Sqoop
03-06-2020
07:52 AM
Thanks a million. Got the same issue after re-adding a host that was removed prior to the upgrade from CDH 5.6 to CDH 6.2. Fixed it by deleting /var/lib/cloudera-scm-agent/cm_guid on the node.
01-27-2020
02:04 PM
Hi,
I'm getting the error below while trying to submit a Scala jar built in IntelliJ using Maven.
Spark version: 2.3.0
Scala version: 2.11.11
Command used:
spark2-submit --master="yarn" --deploy-mode="cluster" --queue root.myyarnqueue --executor-memory 12G --driver-memory 12G --class MyClassName /projects/myscala.jar arg_1 arg_2
Error Message
java.lang.ClassNotFoundException: MyClassName
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:239)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
20/01/27 15:54:47 INFO util.ShutdownHookManager: Shutdown hook called
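A ClassNotFoundException at submit time usually means the class named in --class is not actually packaged in the jar, or that the fully qualified name (including the package) was not given. One quick sanity check is to list the jar's entries, since a jar is just a zip archive. A minimal sketch in Python (the jar path and class name here are just the ones from the command above, used as placeholders):

```python
import zipfile

def class_in_jar(jar_path, fqcn):
    """Return True if the fully qualified class name has a .class entry in the jar."""
    entry = fqcn.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:  # a jar is an ordinary zip archive
        return entry in jar.namelist()

# e.g. class_in_jar("/projects/myscala.jar", "MyClassName")
```

If the class lives in a package, --class must be given the fully qualified name (e.g. com.example.MyClassName), not the bare class name.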
Labels:
- Apache Spark
01-18-2020
10:39 PM
What is the most efficient way to get counts of records meeting different search criteria from a Hive table?
1. count all records where column-a is NULL
2. count all records where column-b in [1, 3, 5]
3. count all records where column-c = 'xxx'
etc.
There are a couple hundred of these counts, in groups of 3 or 4.
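One common pattern for this is conditional aggregation: compute many counts in a single scan of the table with SUM over CASE expressions, instead of running one query per count. A minimal sketch of the idea, using SQLite from Python as a stand-in for Hive (table and column names here are made up to mirror the three criteria above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a TEXT, b INTEGER, c TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?, ?)", [
    (None, 1, "xxx"),
    ("v",  3, "yyy"),
    (None, 2, "xxx"),
    ("w",  5, "zzz"),
])

# A single pass over the table produces all three counts at once.
row = conn.execute("""
    SELECT
        SUM(CASE WHEN a IS NULL THEN 1 ELSE 0 END)      AS a_null,
        SUM(CASE WHEN b IN (1, 3, 5) THEN 1 ELSE 0 END) AS b_in_set,
        SUM(CASE WHEN c = 'xxx' THEN 1 ELSE 0 END)      AS c_xxx
    FROM t
""").fetchone()
print(row)  # (2, 3, 2)
```

In HiveQL the same shape works with SUM(IF(cond, 1, 0)) or SUM(CASE WHEN cond THEN 1 ELSE 0 END); with a couple hundred conditions, batching them into a handful of such single-scan queries avoids rescanning the table for every individual count.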
Labels:
- Apache Hive
01-14-2020
07:43 AM
I'm having the same issue... everything works fine, but these config warnings are on all services in Cloudera Manager.
12-19-2019
11:56 AM
We don't have any indexes or collections in this cluster. Just trying to use it for Solr Search functions now. We were just running the tutorial to validate the Solr service configuration. Any steps we can take to re-initialize the service as new and get rid of any leftover artifacts from 5.14 would also be good, as we don't have anything in the Search service yet. Regards
12-19-2019
11:19 AM
We weren't using the Solr service in the previous cluster, so we just removed the service during the upgrade and added it back after the upgrade. Following the tutorial, we are able to create the 2 shard cores and the test collection; just the execute query throws an exception.
12-18-2019
11:50 AM
Completed the tutorial, but querying fails.
Env:
Migrated from CDH 5.14 to CDH 6.2
Solr 7.4.0
Browsing to mysolrserver:8983/solr/test_collection/select?q=*:*&wt=json&indent=true
returns this error below.
{ "responseHeader":{ "zkConnected":true, "status":500, "QTime":1, "params":{ "q":"*:*", "indent":"true", "wt":"json"}}, "error":{ "trace":"java.lang.NullPointerException\n\tat org.apache.sentry.binding.solr.authz.SentrySolrPluginImpl.getShortUserName(SentrySolrPluginImpl.java:413)\n\tat org.apache.solr.handler.component.DocAuthorizationComponent.getUserName(DocAuthorizationComponent.java:80)\n\tat org.apache.solr.handler.component.DocAuthorizationComponent.prepare(DocAuthorizationComponent.java:95)\n\tat org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:272)\n\tat org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:199)\n\tat org.apache.solr.core.SolrCore.execute(SolrCore.java:2548)\n\tat org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:764)\n\tat org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:521)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:376)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:322)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)\n\tat org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)\n\tat org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)\n\tat org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)\n\tat org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:513)\n\tat org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\tat 
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)\n\tat org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)\n\tat org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)\n\tat org.eclipse.jetty.server.Server.handle(Server.java:539)\n\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333)\n\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)\n\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)\n\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:108)\n\tat org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)\n\tat org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)\n\tat org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)\n\tat org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)\n\tat java.lang.Thread.run(Thread.java:748)\n", "code":500}}