Member since
08-26-2017
9
Posts
0
Kudos Received
1
Solution
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 5807 | 10-29-2018 07:39 PM
10-29-2018
07:39 PM
I asked the same question on GitHub and they answered me; this is what worked for me.
See vineshcpaul's answer, which I copy here:
If you have the HDP 2.6.5 VM deployed, there should be a file named generate-proxy-deploy-script.sh under /sandbox/deploy-scripts/assets. Try editing that script to add the port you want to forward. There are two arrays defined in the script; one is called tcpPortsHDP. Add a new entry in the following format, taking port 23232 as the one you need to forward:
[23232]=23232
Once the change is made, rerun the script:
cd /sandbox/deploy-scripts
assets/generate-proxy-deploy-script.sh
This generates a new script, proxy-deploy.sh, under /sandbox/deploy-scripts/sandbox/proxy/. Execute proxy-deploy.sh to deploy the new reverse proxy:
cd /sandbox/deploy-scripts
sandbox/proxy/proxy-deploy.sh
Also, the port needs to be forwarded in VirtualBox as well:
VirtualBox -> Settings -> Network -> Advanced -> Port Forwarding -> add a new port-forwarding rule
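The array edit described above can be sketched as follows. This is a hypothetical mock of the script (the real one lives at /sandbox/deploy-scripts/assets/generate-proxy-deploy-script.sh and has many more entries); it only illustrates inserting the [23232]=23232 entry into the tcpPortsHDP array:

```shell
# Mock of the tcpPortsHDP array from generate-proxy-deploy-script.sh
# (hypothetical contents, for illustration only).
cat > /tmp/generate-proxy-deploy-script.sh <<'EOF'
declare -A tcpPortsHDP=(
  [8080]=8080
  [50070]=50070
)
EOF

# Insert the new forwarding entry before the closing paren of the array.
sed -i 's/^)/  [23232]=23232\n)/' /tmp/generate-proxy-deploy-script.sh

# Confirm the entry is present.
grep -F '[23232]=23232' /tmp/generate-proxy-deploy-script.sh
```

On the real sandbox you would edit the actual file (by hand or with a similar sed one-liner) and then rerun it as shown above.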
10-09-2018
09:32 PM
Hello @Geoffrey Shelton Okot, sorry for my late response, but that tutorial no longer works for version 2.6.5 of the sandbox. Just trying to execute the first command already gives an error, and the 'start_scripts' directory is not found under '/root'. I know that tutorial works well up to version 2.6.4, but that is not why I am asking. So again, is there a way to add port forwarding in the new version?
10-03-2018
07:42 PM
Hello community! I have a problem with the latest version of the HDP Sandbox (2.6.5). I know about the article explaining the new architecture, but I still don't understand how I can add new ports. Could someone please explain in simple steps how to do this? I would like to add the MySQL port. Any help would be appreciated!
Labels:
- Hortonworks Data Platform (HDP)
10-02-2018
10:48 PM
Same problem here while trying to insert in the Sandbox HDP 2.6.4. And MapReduce mode is very slow too; it would be great to be able to use Hive with Tez.
10-02-2018
10:43 PM
Hello @rtheron, today I tried to insert rows through the Hive Web UI that comes with Ambari (logged in as admin) and they inserted fine. I also tried changing the execution engine of Hive from Tez to MapReduce, and it worked: it filled the Date table in Hive. Coincidentally, I saw this other question today that is very similar to my case: https://community.hortonworks.com/questions/222722/hive-query-fails-in-tez-runs-in-mr-mode.html Do you have another idea about the root of the problem?
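For reference, the engine switch described here is a standard Hive session setting (hive.execution.engine); shown below as a config fragment you would run in Beeline or the Hive view before the INSERT:

```sql
-- Fall back to MapReduce for this session only
SET hive.execution.engine=mr;
-- ... run the failing INSERT here ...
-- switch back to Tez afterwards
SET hive.execution.engine=tez;
```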
10-01-2018
04:40 PM
Hello everybody! I am trying to fill a Date dimension in Hive using PDI Spoon transformations; my environment is the HDP Sandbox 2.6.4. I already filled some small dimension tables using PDI (v8.1), but for some reason at most 57 rows get inserted; after that, the jobs begin to throw errors like the following: 2018/09/29 14:41:36 - D_Date.0 - ERROR (version 8.1.0.0-365, build 8.1.0.0-365 from 2018-04-30 09.42.24 by buildguy) : Because of an error, this step can't continue:
2018/09/29 14:41:36 - D_Date.0 - ERROR (version 8.1.0.0-365, build 8.1.0.0-365 from 2018-04-30 09.42.24 by buildguy) : org.pentaho.di.core.exception.KettleException:
2018/09/29 14:41:36 - D_Date.0 - Error inserting row into table [dim_fecha] with values: [2015/01/01 00:00:00], [54], [2015/02/23 00:00:00.000], [20150223,0], [2015,0], [1,0], [2,0], [23,0], [1,0], [February], [Feb], [Monday], [Mon], [1,0], [1,0], [2,0], [54,0]
2018/09/29 14:41:36 - D_Date.0 -
2018/09/29 14:41:36 - D_Date.0 - Error inserting/updating row
2018/09/29 14:41:36 - D_Date.0 - Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1538242990698_0009_54_00, diagnostics=[Task failed, taskId=task_1538242990698_0009_54_00_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Failure while running task:java.lang.RuntimeException: java.lang.RuntimeException: Hive Runtime Error while closing operators
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:173)
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:139)
at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:347)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:194)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:185)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:185)
at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:181)
at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Hive Runtime Error while closing operators
at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:370)
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:164)
... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /apps/hive/warehouse/awv_almacen.db/dim_fecha/.hive-staging_hive_2018-09-29_18-41-23_686_5852176182341767434-4/_task_tmp.-ext-10000/_tmp.000000_0 could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1719)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3368)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3292)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:850)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:504)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2347)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator$FSPaths.closeWriters(FileSinkOperator.java:202)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.closeOp(FileSinkOperator.java:1046)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:620)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:634)
at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.close(MapRecordProcessor.java:346)
... 15 more
Spoon's log output is longer than that but keeps saying more or less the same thing. Does anyone have an idea of what the error might be? I would really appreciate some help 🙂 I am still very new to the Hadoop world. Regards.
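For what it's worth, the underlying "could only be replicated to 0 nodes instead of minReplication (=1)" message usually points at the single sandbox DataNode being out of usable space (or excluded), not at Hive or PDI itself. A hedged sketch of what to look for, parsing a made-up excerpt of `hdfs dfsadmin -report` output (on a real sandbox you would run the command itself; the numbers here are invented for illustration):

```shell
# Sample (made-up) excerpt of `hdfs dfsadmin -report` output.
report='Configured Capacity: 44716605440 (41.65 GB)
DFS Used: 44564944896 (41.50 GB)
DFS Remaining: 151660544 (144.63 MB)'

# Extract the remaining-bytes figure; near-zero remaining space is the
# classic cause of "could only be replicated to 0 nodes".
remaining=$(printf '%s\n' "$report" | awk -F'[:(]' '/DFS Remaining/ {gsub(/ /, "", $2); print $2}')
echo "DFS Remaining bytes: $remaining"
```

If the remaining space is close to zero, freeing HDFS space (or enlarging the VM disk) is the first thing to try before blaming the Tez vs. MapReduce choice.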
Labels:
- Apache Hive
- Apache Tez