Member since: 07-26-2017
Posts: 16
Kudos Received: 2
Solutions: 0
06-25-2018
05:36 PM
Actually, I was waiting for an answer; my problem was not solved. I switched to another solution that doesn't involve XML files. Sorry, @Anji Raju
02-23-2018
01:44 AM
Hi community! I'm having problems with HDP 2.6.1 Hive View 2.0. When I open Hive View, it freezes on the service check and the SQL window doesn't show (see attached file). The Hive View 2.0 log has the following errors:
22 Feb 2018 20:31:11,456 INFO [HiveViewActorSystem-akka.actor.default-dispatcher-71] [HIVE 2.0.0 AUTO_HIVE20_INSTANCE] DeathWatch:43 - Registration for Actor[akka://HiveViewActorSystem/user/068b8839-5385-4b45-b59a-f478c52c13ad:syncjdbcConnector#-1472728613] at Thu Feb 22 20:31:11 ECT 2018
22 Feb 2018 20:31:11,598 INFO [HiveViewActorSystem-akka.actor.result-dispatcher-72] [HIVE 2.0.0 AUTO_HIVE20_INSTANCE] StatementExecutor:90 - Statement executor is executing statement: show databases like '*', Statement id: 0, JobId: SYNC JOB
22 Feb 2018 20:32:11,473 ERROR [ambari-client-thread-4181] [HIVE 2.0.0 AUTO_HIVE20_INSTANCE] DDLDelegatorImpl:238 - Query timed out to fetch table description for user: mamv141114
java.util.concurrent.TimeoutException: deadline passed
at akka.actor.dsl.Inbox$InboxActor$$anonfun$receive$1.applyOrElse(Inbox.scala:117)
at scala.PartialFunction$AndThen.applyOrElse(PartialFunction.scala:189)
at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
at akka.actor.dsl.Inbox$InboxActor.aroundReceive(Inbox.scala:62)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
22 Feb 2018 20:32:11,475 ERROR [ambari-client-thread-4181] [HIVE 2.0.0 AUTO_HIVE20_INSTANCE] ServiceFormattedException:97 - Query timed out to fetch table description for user: mamv141114
22 Feb 2018 20:32:11,475 ERROR [ambari-client-thread-4181] [HIVE 2.0.0 AUTO_HIVE20_INSTANCE] ServiceFormattedException:98 - java.util.concurrent.TimeoutException: deadline passed
java.util.concurrent.TimeoutException: deadline passed
at akka.actor.dsl.Inbox$InboxActor$$anonfun$receive$1.applyOrElse(Inbox.scala:117)
at scala.PartialFunction$AndThen.applyOrElse(PartialFunction.scala:189)
at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
at akka.actor.dsl.Inbox$InboxActor.aroundReceive(Inbox.scala:62)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Has anyone faced the same problem? I hope someone can help me.
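One knob worth trying in the meantime (an assumption on my part, not a confirmed fix): the DDLDelegatorImpl timeout suggests the view's sync query to HiveServer2 doesn't return within the allowed window, so raising Ambari's view request timeouts in /etc/ambari-server/conf/ambari.properties might help. A minimal sketch:
# Values are in milliseconds; the figures below are illustrative only
views.ambari.request.connect.timeout.millis=120000
views.ambari.request.read.timeout.millis=300000
Restart the Ambari server afterwards (ambari-server restart) so the change takes effect.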
09-28-2017
01:44 PM
I have created an XML table with the XML SerDe in Hive (I'm using Hortonworks HDP 2.6.1):
CREATE EXTERNAL TABLE xml_factura(id STRING, version STRING, ambiente STRING, tipoEmision STRING, razonSocial STRING,
nombreComercial STRING, ruc STRING,claveAcceso STRING,codDoc STRING,estab STRING,
ptoEmi STRING,secuencial STRING,dirMatriz STRING,fechaEmision STRING,dirEstablecimiento STRING,
contribuyenteEspecial STRING, obligadoContabilidad STRING,tipoIdentificacionComprador STRING,
razonSocialComprador STRING,identificacionComprador STRING,totalSinImpuestos STRING,
totalDescuento STRING,totalImpuesto ARRAY <map<STRING,STRING>>,propina STRING,
importeTotal STRING,moneda STRING,detalle ARRAY <map<STRING,STRING>>,
infoAdicional STRUCT<nombre:STRING,campoAdicional:STRING>)
ROW FORMAT SERDE 'com.ibm.spss.hive.serde2.xml.XmlSerDe'
WITH SERDEPROPERTIES ( "column.xpath.id"="/factura/@id",
"column.xpath.version"="/factura/@version",
"column.xpath.ambiente"="/factura/infoTributaria/ambiente/text()",
"column.xpath.tipoEmision"="/factura/infoTributaria/tipoEmision/text()",
"column.xpath.razonSocial"="/factura/infoTributaria/razonSocial/text()",
"column.xpath.nombreComercial"="/factura/infoTributaria/nombreComercial/text()",
"column.xpath.ruc"="/factura/infoTributaria/ruc/text()",
"column.xpath.razonSocial"="/factura/infoTributaria/razonSocial/text()",
"column.xpath.claveAcceso"="/factura/infoTributaria/claveAcceso/text()",
"column.xpath.codDoc"="/factura/infoTributaria/codDoc/text()",
"column.xpath.estab"="/factura/infoTributaria/estab/text()",
"column.xpath.ptoEmi"="/factura/infoTributaria/ptoEmi/text()",
"column.xpath.secuencial"="/factura/infoTributaria/secuencial/text()",
"column.xpath.dirMatriz"="/factura/infoTributaria/dirMatriz/text()",
"column.xpath.fechaEmision"="/factura/infoFactura/fechaEmision/text()",
"column.xpath.dirEstablecimiento"="/factura/infoFactura/dirEstablecimiento/text()",
"column.xpath.contribuyenteEspecial"="/factura/infoFactura/contribuyenteEspecial/text()",
"column.xpath.obligadoContabilidad"="/factura/infoFactura/obligadoContabilidad/text()",
"column.xpath.tipoIdentificacionComprador"="/factura/infoFactura/tipoIdentificacionComprador/text()",
"column.xpath.razonSocialComprador"="/factura/infoFactura/razonSocialComprador/text()",
"column.xpath.identificacionComprador"="/factura/infoFactura/identificacionComprador/text()",
"column.xpath.totalSinImpuestos"="/factura/infoFactura/totalSinImpuestos/text()",
"column.xpath.totalDescuento"="/factura/infoFactura/totalDescuento/text()",
"column.xpath.totalImpuesto"="/factura/infoFactura/totalConImpuestos/totalImpuesto",
"column.xpath.propina"="/factura/infoFactura/propina/text()",
"column.xpath.importeTotal"="/factura/infoFactura/importeTotal/text()",
"column.xpath.moneda"="/factura/infoFactura/moneda/text()",
"column.xpath.detalle"="/factura/detalles/detalle",
"column.xpath.infoAdicional"="/factura/infoAdicional/campoAdicional" )
STORED AS INPUTFORMAT 'com.ibm.spss.hive.serde2.xml.XmlInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat'
LOCATION '/user/mamv141114/xmlfacturas'
TBLPROPERTIES ( "xmlinput.start"="<factura id", "xmlinput.end"="</factura>" )
I have uploaded three XML files and I'm able to query the table (SELECT *, SELECT count(*), etc.). Everything works fine. Then I created the following ORC table:
CREATE TABLE xml_factura_orc (id STRING, version STRING, ambiente STRING, tipoEmision STRING, razonSocial STRING,
nombreComercial STRING, ruc STRING, claveAcceso STRING, codDoc STRING, estab STRING,
ptoEmi STRING, secuencial STRING, dirMatriz STRING, fechaEmision STRING, dirEstablecimiento STRING,
contribuyenteEspecial STRING, obligadoContabilidad STRING, tipoIdentificacionComprador STRING,
razonSocialComprador STRING, identificacionComprador STRING, totalSinImpuestos STRING,
totalDescuento STRING, totalImpuesto ARRAY<map<STRING,STRING>>, propina STRING,
importeTotal STRING, moneda STRING, detalle ARRAY<map<STRING,STRING>>,
infoAdicional STRUCT<nombre:STRING,campoAdicional:STRING>)
STORED AS ORC TBLPROPERTIES ("orc.compress"="SNAPPY");
When I try to insert all the data from the XML table into this ORC table with this statement:
INSERT OVERWRITE TABLE facturas.xml_factura_orc
SELECT id, version, ambiente, tipoEmision, razonSocial, nombreComercial, ruc, claveAcceso, codDoc, estab, ptoEmi, secuencial, dirMatriz, fechaEmision, dirEstablecimiento, contribuyenteEspecial, obligadoContabilidad, tipoIdentificacionComprador, razonSocialComprador, identificacionComprador, totalSinImpuestos, totalDescuento, totalImpuesto, propina, importeTotal, moneda, detalle, infoAdicional FROM facturas.xml_factura;
I get the following error:
Error: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1506550557761_0005_1_00, diagnostics=[Vertex vertex_1506550557761_0005_1_00 [Map 1] killed/failed due to:INIT_FAILURE, Fail to create InputInitializerManager, org.apache.tez.dag.api.TezReflectionException: Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator
at org.apache.tez.common.ReflectionUtils.getNewInstance(ReflectionUtils.java:70)
at org.apache.tez.common.ReflectionUtils.createClazzInstance(ReflectionUtils.java:89)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$1.run(RootInputInitializerManager.java:151)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$1.run(RootInputInitializerManager.java:148)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.tez.dag.app.dag.RootInputInitializerManager.createInitializer(RootInputInitializerManager.java:148)
at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInputInitializers(RootInputInitializerManager.java:121)
at org.apache.tez.dag.app.dag.impl.VertexImpl.setupInputInitializerManager(VertexImpl.java:4620)
at org.apache.tez.dag.app.dag.impl.VertexImpl.access$4400(VertexImpl.java:202)
at org.apache.tez.dag.app.dag.impl.VertexImpl$InitTransition.handleInitEvent(VertexImpl.java:3436)
at org.apache.tez.dag.app.dag.impl.VertexImpl$InitTransition.transition(VertexImpl.java:3385)
at org.apache.tez.dag.app.dag.impl.VertexImpl$InitTransition.transition(VertexImpl.java:3366)
at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
at org.apache.tez.state.StateMachineTez.doTransition(StateMachineTez.java:57)
at org.apache.tez.dag.app.dag.impl.VertexImpl.handle(VertexImpl.java:1938)
at org.apache.tez.dag.app.dag.impl.VertexImpl.handle(VertexImpl.java:201)
at org.apache.tez.dag.app.DAGAppMaster$VertexEventDispatcher.handle(DAGAppMaster.java:2080)
at org.apache.tez.dag.app.DAGAppMaster$VertexEventDispatcher.handle(DAGAppMaster.java:2066)
at org.apache.tez.common.AsyncDispatcher.dispatch(AsyncDispatcher.java:184)
at org.apache.tez.common.AsyncDispatcher$1.run(AsyncDispatcher.java:115)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.tez.common.ReflectionUtils.getNewInstance(ReflectionUtils.java:68)
... 25 more
Caused by: java.lang.RuntimeException: Failed to load plan: hdfs://ndatos01.sri.ad:8020/tmp/hive/mamv141114/1762793d-62ac-45d8-ae09-2900aa4851f9/hive_2017-09-28_08-14-12_349_3611498623986481792-11/mamv141114/_tez_scratch_dir/7bed50d7-f0f0-424d-ab95-5d9797635eba/map.xml: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: com.ibm.spss.hive.serde2.xml.XmlInputFormat
Serialization trace:
inputFileFormatClass (org.apache.hadoop.hive.ql.plan.PartitionDesc)
aliasToPartnInfo (org.apache.hadoop.hive.ql.plan.MapWork)
at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:479)
at org.apache.hadoop.hive.ql.exec.Utilities.getMapWork(Utilities.java:318)
at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.<init>(HiveSplitGenerator.java:101)
... 30 more
Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: com.ibm.spss.hive.serde2.xml.XmlInputFormat
Serialization trace:
inputFileFormatClass (org.apache.hadoop.hive.ql.plan.PartitionDesc)
aliasToPartnInfo (org.apache.hadoop.hive.ql.plan.MapWork)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:138)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:238)
at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:226)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObjectOrNull(Kryo.java:745)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:113)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:139)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:672)
at org.apache.hadoop.hive.ql.exec.Utilities.deserializeObjectByKryo(Utilities.java:1182)
at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:1069)
at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:1083)
at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:439)
... 32 more
Caused by: java.lang.ClassNotFoundException: com.ibm.spss.hive.serde2.xml.XmlInputFormat
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:136)
... 50 more
]DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0 (state=08S01,code=2)
Where do I have to put hivexmlserde.jar so this INSERT works? Has anyone faced the same problem? I have copied the jar into several lib directories without success, so I think there's somewhere else I need to copy it. Is there another way to read a batch of XML files and insert them into an ORC table or another Hive table? I'd really appreciate it if anyone can help me.
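For anyone landing here later: the ClassNotFoundException is raised inside the Tez containers, which do not see jars that live only on the HiveServer2/CLI classpath. Two common workarounds, sketched under the assumption that the jar is named hivexmlserde-1.0.5.3.jar (the name and paths are my assumptions; adjust them to your setup).
Per session, stage the jar on HDFS and register it right before the INSERT:
hdfs dfs -mkdir -p /user/mamv141114/jars
hdfs dfs -put hivexmlserde-1.0.5.3.jar /user/mamv141114/jars/
-- then, in the same Hive session that runs the INSERT:
ADD JAR hdfs:///user/mamv141114/jars/hivexmlserde-1.0.5.3.jar;
INSERT OVERWRITE TABLE facturas.xml_factura_orc SELECT ... FROM facturas.xml_factura;
Or cluster-wide: copy the jar into Hive's auxlib directory on every node and restart HiveServer2 from Ambari:
sudo mkdir -p /usr/hdp/current/hive-server2/auxlib
sudo cp hivexmlserde-1.0.5.3.jar /usr/hdp/current/hive-server2/auxlib/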
Labels: Hortonworks Data Platform (HDP)
08-21-2017
08:51 PM
Hi @Geoffrey Shelton Okot: I killed the Atlas process via the command line and started the service from the Ambari web interface, and now it works OK. Thanks for your help.
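In case it helps someone else, this is roughly what that looked like (a minimal sketch; the PID-file path is an assumption and may differ on your install):
ps -ef | grep -i atlas                 # find the Metadata Server PID
sudo kill <pid>                        # signal it as root; the Errno 1 came from signalling as the wrong user
sudo rm -f /var/run/atlas/atlas.pid    # clear a stale PID file if one is left behind
Then start the Atlas service again from the Ambari web interface.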
08-21-2017
08:28 PM
Thanks for your answer. I've created the directory, and now another error is shown:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/ATLAS/0.1.0.2.3/package/scripts/metadata_server.py", line 181, in <module>
MetadataServer().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/ATLAS/0.1.0.2.3/package/scripts/metadata_server.py", line 145, in stop
user=params.metadata_user,
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'source /usr/hdp/current/atlas-server/conf/atlas-env.sh; /usr/hdp/current/atlas-server/bin/atlas_stop.py' returned 255. Exception: [Errno 1] Operation not permitted
Traceback (most recent call last):
File "/usr/hdp/current/atlas-server/bin/atlas_stop.py", line 78, in <module>
returncode = main()
File "/usr/hdp/current/atlas-server/bin/atlas_stop.py", line 49, in main
os.kill(pid, SIGTERM)
OSError: [Errno 1] Operation not permitted
08-18-2017
02:31 PM
Hello everyone: I'm getting the following error while trying to restart the Atlas server:
stderr: /var/lib/ambari-agent/data/errors-410.txt
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
BeforeAnyHook().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 29, in hook
setup_users()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 53, in setup_users
set_uid(params.smoke_user, params.smoke_user_dirs)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 137, in set_uid
mode=0555)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 120, in action_create
raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/var/lib/ambari-agent/tmp/changeUid.sh'] failed, parent directory /var/lib/ambari-agent/tmp doesn't exist
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-410.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-410.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/hook.py', 'START', '/var/lib/ambari-agent/data/command-410.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START', '/var/lib/ambari-agent/data/structured-out-410.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']
stdout: /var/lib/ambari-agent/data/output-410.txt
2017-08-18 09:29:16,185 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-08-18 09:29:16,519 - Stack Feature Version Info: stack_version=2.6, version=2.6.1.0-129, current_cluster_version=2.6.1.0-129 -> 2.6.1.0-129
2017-08-18 09:29:16,529 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-08-18 09:29:16,531 - Group['livy'] {}
2017-08-18 09:29:16,532 - Group['spark'] {}
2017-08-18 09:29:16,532 - Group['zeppelin'] {}
2017-08-18 09:29:16,533 - Group['hadoop'] {}
2017-08-18 09:29:16,533 - Group['users'] {}
2017-08-18 09:29:16,533 - Group['knox'] {}
2017-08-18 09:29:16,533 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,534 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,535 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,536 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,537 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,538 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-08-18 09:29:16,538 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,539 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-08-18 09:29:16,540 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-08-18 09:29:16,541 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop']}
2017-08-18 09:29:16,542 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,542 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,543 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,544 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,545 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-08-18 09:29:16,545 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,546 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,547 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,548 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,549 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,549 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,550 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,551 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,552 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 09:29:16,553 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-410.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-410.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/hook.py', 'START', '/var/lib/ambari-agent/data/command-410.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START', '/var/lib/ambari-agent/data/structured-out-410.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']
Command failed after 1 tries
Thanks in advance for your help.
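For reference, the missing directory named in the traceback can be recreated by hand before retrying (a minimal sketch; ownership and mode are assumptions based on a default ambari-agent install, where the agent runs as root):
sudo mkdir -p /var/lib/ambari-agent/tmp
sudo chown root:root /var/lib/ambari-agent/tmp
sudo chmod 755 /var/lib/ambari-agent/tmp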
Labels: Hortonworks Data Platform (HDP)
08-18-2017
02:25 PM
Hello everyone: I'm getting the following error when I try to start the Activity Analyzer in Ambari (HDP-2.6):
stderr: /var/lib/ambari-agent/data/errors-408.txt
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
BeforeAnyHook().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 29, in hook
setup_users()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 53, in setup_users
set_uid(params.smoke_user, params.smoke_user_dirs)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 137, in set_uid
mode=0555)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 120, in action_create
raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/var/lib/ambari-agent/tmp/changeUid.sh'] failed, parent directory /var/lib/ambari-agent/tmp doesn't exist
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-408.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-408.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']
out: /var/lib/ambari-agent/data/output-408.txt
2017-08-18 08:44:28,388 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-08-18 08:44:28,722 - Stack Feature Version Info: stack_version=2.6, version=2.6.1.0-129, current_cluster_version=2.6.1.0-129 -> 2.6.1.0-129
2017-08-18 08:44:28,732 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-08-18 08:44:28,733 - Group['livy'] {}
2017-08-18 08:44:28,735 - Group['spark'] {}
2017-08-18 08:44:28,735 - Group['zeppelin'] {}
2017-08-18 08:44:28,735 - Group['hadoop'] {}
2017-08-18 08:44:28,735 - Group['users'] {}
2017-08-18 08:44:28,736 - Group['knox'] {}
2017-08-18 08:44:28,736 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,737 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,738 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,738 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,739 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,740 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-08-18 08:44:28,741 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,742 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-08-18 08:44:28,742 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-08-18 08:44:28,743 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop']}
2017-08-18 08:44:28,744 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,745 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,746 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,746 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,747 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2017-08-18 08:44:28,748 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,749 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,750 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,750 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,751 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,752 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,753 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,754 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,754 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2017-08-18 08:44:28,755 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-408.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-408.json', 'INFO', '/var/lib/ambari-agent/tmp', 'PROTOCOL_TLSv1', '']
Command failed after 1 tries
Has anyone found a solution to this error? It is the same missing /var/lib/ambari-agent/tmp directory as in my Atlas post above. Thanks for your help.
Labels: Hortonworks Data Platform (HDP)
07-26-2017
09:32 PM
1 Kudo
Hi @Jay SenSharma. Thanks for your support; you've guided me to finally find a solution. I did the following:
Queried the installed Ambari RPMs:
rpm -qa | grep ambari
ambari-metrics-monitor-2.5.1.0-159.x86_64
ambari-infra-solr-2.5.1.0-159.noarch
ambari-infra-solr-client-2.5.1.0-159.noarch
ambari-metrics-grafana-2.5.1.0-159.x86_64
ambari-metrics-hadoop-sink-2.5.1.0-159.x86_64
ambari-agent-2.5.1.0-159.x86_64
Removed ambari-infra-solr-client:
yum remove ambari-infra-solr-client
Loaded plugins: langpacks, product-id, search-disabled-repos, subscription-manager
This system is not registered to Red Hat Subscription Management. You can use subscription-manager to register.
Resolving Dependencies
There are unfinished transactions remaining. You might consider running yum-complete-transaction, or "yum-complete-transaction --cleanup-only" and "yum history redo last", first to finish them. If those don't work you'll have to try removing/installing packages by hand (maybe package-cleanup can help).
--> Running transaction check
---> Package ambari-infra-solr-client.noarch 0:2.5.1.0-159 will be erased
--> Finished Dependency Resolution
Dependencies Resolved
=======================================================================================================================
Package Arch Version Repository Size
=======================================================================================================================
Removing:
ambari-infra-solr-client noarch 2.5.1.0-159 installed 25 M
Transaction Summary
=======================================================================================================================
Remove 1 Package
Installed size: 25 M
Is this ok [y/N]: y
Downloading packages:
Running transaction check
Running transaction test
Transaction test succeeded
Running transaction
Erasing : ambari-infra-solr-client-2.5.1.0-159.noarch 1/1
warning: file /usr/lib/ambari-infra-solr-client/solrCloudCli.sh: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/log4j.properties: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/zookeeper-3.4.6.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/woodstox-core-asl-4.4.1.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/utility-1.0.0.0-SNAPSHOT.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/tools-1.7.0.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/stax2-api-3.1.4.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/solr-solrj-5.5.2.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/slf4j-log4j12-1.7.2.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/slf4j-api-1.7.2.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/objenesis-2.2.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/noggit-0.6.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/log4j-1.2.17.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/junit-4.10.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/jcl-over-slf4j-1.7.7.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/jackson-mapper-asl-1.9.13.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/jackson-core-asl-1.9.9.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/httpmime-4.4.1.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/httpcore-4.4.1.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/httpclient-4.4.1.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/hamcrest-core-1.1.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/guava-16.0.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/easymock-3.4.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/commons-logging-1.1.1.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/commons-lang-2.5.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/commons-io-2.1.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/commons-collections-3.2.2.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/commons-codec-1.8.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/commons-cli-1.3.1.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/commons-beanutils-1.9.2.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/checkstyle-6.19.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/antlr4-runtime-4.5.3.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/antlr-2.7.7.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs/ambari-logsearch-solr-client-2.5.1.0.159.jar: remove failed: No such file or directory
warning: file /usr/lib/ambari-infra-solr-client/libs: remove failed: No such file or directory
Verifying : ambari-infra-solr-client-2.5.1.0-159.noarch 1/1
Removed:
ambari-infra-solr-client.noarch 0:2.5.1.0-159
Complete!
Here I noticed: "warning: file /usr/lib/ambari-infra-solr-client/libs: remove failed: No such file or directory". That's really odd.
Installed the Ambari Infra Solr client again:
yum install ambari-infra-solr-client
....
Installed:
ambari-infra-solr-client.noarch 0:2.5.1.0-159
Complete!
After these steps, the ambari-infra installation via the web page shows a success message. Yes!
Thanks for your time and support
I hope these errors are reviewed and corrected in future releases.
07-26-2017
08:45 PM
Hi again @Jay SenSharma. I've checked the package installation, and it is already installed on this node. Could this be a problem with the Python script that creates the configuration files for each component? Thanks for your time.
07-26-2017
07:46 PM
Hi everyone. I've spent several days trying to install HDP 2.6.1.0 on a RHEL 7.2 cluster. My current problem is that Apache Spark installs on three nodes and fails on one. I've checked the Python scripts and they are the same. Do I need something different on the node that's failing?
The log for the successful installation is:
2017-07-26 14:24:38,858 - /etc/hadoop/conf is already linked to /etc/hadoop/2.6.1.0-129/0
2017-07-26 14:24:38,858 - /etc/mahout/conf is already linked to /etc/mahout/2.6.1.0-129/0
2017-07-26 14:24:38,858 - Skipping /etc/storm/conf as it does not exist.
2017-07-26 14:24:38,858 - /etc/atlas/conf is already linked to /etc/atlas/2.6.1.0-129/0
2017-07-26 14:24:38,858 - Skipping /etc/ranger/admin/conf as it does not exist.
2017-07-26 14:24:38,859 - /etc/flume/conf is already linked to /etc/flume/2.6.1.0-129/0
2017-07-26 14:24:38,859 - /etc/sqoop/conf is already linked to /etc/sqoop/2.6.1.0-129/0
2017-07-26 14:24:38,859 - /etc/accumulo/conf is already linked to /etc/accumulo/2.6.1.0-129/0
2017-07-26 14:24:38,860 - Skipping /etc/phoenix/conf as it does not exist.
2017-07-26 14:24:38,860 - /etc/storm-slider-client/conf is already linked to /etc/storm-slider-client/2.6.1.0-129/0
2017-07-26 14:24:38,860 - /etc/slider/conf is already linked to /etc/slider/2.6.1.0-129/0
2017-07-26 14:24:38,860 - Skipping /etc/zeppelin/conf as it does not exist.
2017-07-26 14:24:38,861 - /etc/hive-webhcat/conf is already linked to /etc/hive-webhcat/2.6.1.0-129/0
2017-07-26 14:24:38,861 - /etc/hive-hcatalog/conf is already linked to /etc/hive-hcatalog/2.6.1.0-129/0
2017-07-26 14:24:38,861 - /etc/falcon/conf is already linked to /etc/falcon/2.6.1.0-129/0
2017-07-26 14:24:38,861 - Skipping /etc/knox/conf as it does not exist.
2017-07-26 14:24:38,862 - /etc/pig/conf is already linked to /etc/pig/2.6.1.0-129/0
2017-07-26 14:24:38,862 - /etc/spark2/conf is already linked to /etc/spark2/2.6.1.0-129/0
2017-07-26 14:24:38,862 - /etc/hive/conf is already linked to /etc/hive/2.6.1.0-129/0
Command completed successfully!
And the log for the failing installation is:
stderr: /var/lib/ambari-agent/data/errors-599.txt
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 120, in action_create
raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/spark-client/conf/spark-defaults.conf'] failed, parent directory /usr/hdp/current/spark-client/conf doesn't exist
stdout: /var/lib/ambari-agent/data/output-599.txt
......
2017-07-26 14:24:23,893 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-07-26 14:24:23,907 - call['ambari-python-wrap /usr/bin/hdp-select status spark-client'] {'timeout': 20}
2017-07-26 14:24:23,939 - call returned (0, 'spark-client - 2.6.1.0-129')
2017-07-26 14:24:23,948 - Directory['/var/run/spark'] {'owner': 'spark', 'create_parents': True, 'group': 'hadoop', 'mode': 0775}
2017-07-26 14:24:23,949 - Directory['/var/log/spark'] {'owner': 'spark', 'group': 'hadoop', 'create_parents': True, 'mode': 0775}
2017-07-26 14:24:23,950 - PropertiesFile['/usr/hdp/current/spark-client/conf/spark-defaults.conf'] {'owner': 'spark', 'key_value_delimiter': ' ', 'group': 'spark', 'mode': 0644, 'properties': ...}
2017-07-26 14:24:23,958 - Generating properties file: /usr/hdp/current/spark-client/conf/spark-defaults.conf
2017-07-26 14:24:23,959 - File['/usr/hdp/current/spark-client/conf/spark-defaults.conf'] {'owner': 'spark', 'content': InlineTemplate(...), 'group': 'spark', 'mode': 0644}
Command failed after 1 tries
The step "Generating properties file: /usr/hdp/current/spark-client/conf/spark-defaults.conf" fails on only one node. As you can see, the other installation shows successful file creation.
Thanks to anyone who can give me some advice; we are really interested in using this suite for some use cases in our company. Regards, Miguel
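For context, in the standard HDP layout /usr/hdp/current/spark-client/conf is a chain of symlinks that should end in a real directory such as /etc/spark/2.6.1.0-129/0, so the first thing worth checking on the failing node is whether that chain is broken (a sketch only; the exact link targets are assumptions inferred from the analogous spark2 line in the successful log above):
ls -ld /usr/hdp/current/spark-client        # should point at /usr/hdp/2.6.1.0-129/spark
ls -ld /usr/hdp/current/spark-client/conf   # should resolve to a real directory
ls -ld /etc/spark/2.6.1.0-129/0             # the expected final target
# If the target directory is missing, recreating it and re-pointing the
# link may unblock the install (verify the paths first):
sudo mkdir -p /etc/spark/2.6.1.0-129/0
sudo ln -sfn /etc/spark/2.6.1.0-129/0 /usr/hdp/2.6.1.0-129/spark/conf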
Labels: Hortonworks Data Platform (HDP)
07-26-2017
07:32 PM
Just wondering whether it is advisable to install this product from scratch. Currently I'm having some issues with installation. Could someone advise which Hortonworks version is mature enough for installation and testing?
07-26-2017
06:49 PM
Hi Jay. Thanks for your answer. I've removed this component from the installation, and now a similar problem occurs with "Spark Client". Do you have the line to install ambari-infra manually with yum (yum install ambari.....)? Thanks for your time and support.
07-26-2017
04:57 PM
1 Kudo
Thanks a lot for this post! It helped me a lot with removing HDP-2.6.1.0 from my cluster. I used Terminator and the following sequence of commands (hope it helps someone else). [See the attached file] completely-uninstall-hw.txt
07-26-2017
04:45 PM
Hi, I'm installing HDP-2.6.1.0 on a RHEL 7.2 cluster (4 servers). I've configured local repos for RHEL and Ambari like this:
HDP-2.6.1.0 (HDP Version - HDP-2.6.1.0)
HDP-UTILS-1.1.0.21 (Hortonworks Data Platform Version - HDP-UTILS-1.1.0.21)
ambari-2.5.1.0 (ambari Version - ambari-2.5.1.0)
mysql-community-server (MySQL Community Server 5.7 repository)
redhat-sri ("redhat cd")
repolist: 5,073
I followed the installation steps, but installation of "Infra Solr Client" fails with this error:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/infra_solr_client.py", line 51, in <module>
InfraSolrClient().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/infra_solr_client.py", line 30, in install
self.configure(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 119, in locking_configure
original_configure(obj, *args, **kw)
File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/infra_solr_client.py", line 35, in configure
setup_infra_solr(name ='client')
File "/var/lib/ambari-agent/cache/common-services/AMBARI_INFRA/0.1.0/package/scripts/setup_infra_solr.py", line 132, in setup_infra_solr
solr_cloud_util.setup_solr_client(params.config)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/solr_cloud_util.py", line 225, in setup_solr_client
content=StaticFile(solrCliFilename)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 123, in action_create
content = self._get_content()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 160, in _get_content
return content()
File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 52, in __call__
return self.get_content()
File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 76, in get_content
raise Fail("{0} Source file {1} is not found".format(repr(self), path))
resource_management.core.exceptions.Fail: StaticFile('/usr/lib/ambari-infra-solr-client/solrCloudCli.sh') Source file /usr/lib/ambari-infra-solr-client/solrCloudCli.sh is not found
Has anyone faced the same problem?
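As it later turned out (see my 07-26-2017 09:32 PM post above), the missing solrCloudCli.sh meant the package files were never properly laid down; a clean remove and reinstall of the client package restored them. A condensed sketch:
sudo yum remove -y ambari-infra-solr-client
sudo yum install -y ambari-infra-solr-client
ls -l /usr/lib/ambari-infra-solr-client/solrCloudCli.sh   # should exist now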
Labels: Hortonworks Data Platform (HDP)