Member since: 09-27-2016 | Posts: 8 | Kudos Received: 3 | Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1255 | 10-14-2016 05:43 PM
10-14-2016 05:43 PM
1 Kudo
Never mind, I got it figured out. I only needed to configure the host-only adapter as the first adapter and the NAT adapter as the second, then set the host-only adapter's details in the VirtualBox Manager to 192.168.100.100 with subnet mask 255.255.255.0, and enable its DHCP server with server address 192.168.100.100, subnet mask 255.255.255.0, lowest address 192.168.100.101, and highest address 192.168.100.254.
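The same setup can also be scripted with VBoxManage instead of clicking through the GUI. A sketch, assuming the host-only interface comes up as vboxnet0 and the VM is named "Hortonworks Sandbox HDP 2.5" (check yours with `VBoxManage list hostonlyifs` and `VBoxManage list vms`):

```shell
# Create a host-only interface and give it the address from the post above
VBoxManage hostonlyif create
VBoxManage hostonlyif ipconfig vboxnet0 --ip 192.168.100.100 --netmask 255.255.255.0

# Enable a DHCP server on that interface with the same range
VBoxManage dhcpserver add --ifname vboxnet0 --ip 192.168.100.100 \
    --netmask 255.255.255.0 --lowerip 192.168.100.101 \
    --upperip 192.168.100.254 --enable

# Host-only as adapter 1, NAT as adapter 2 (VM must be powered off)
VBoxManage modifyvm "Hortonworks Sandbox HDP 2.5" \
    --nic1 hostonly --hostonlyadapter1 vboxnet0 --nic2 nat
```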
10-14-2016 05:34 PM
1 Kudo
I was wondering how I can add an extra host-only adapter to the HDP 2.5 VirtualBox sandbox. I want to test replicating an Oracle database running on my laptop into the sandbox, using Attunity or Oracle GoldenGate for replication, but I need to access the cluster through a host-only network.
Labels:
- Hortonworks Data Platform (HDP)
10-03-2016 09:09 AM
Hello Cindy,
I had to use the finalresults table as well; the riskfactor table is empty after the Spark tutorial. Regards, Robbert.
10-03-2016 09:07 AM
Hi @Greg Keys, %hive works as it should, but I was hoping the tutorial would also cover using a JDBC driver. Anyway, thanks for the answer. At least I was able to play around with the graphs in Zeppelin. Regards,
Robbert
09-29-2016 12:58 PM
Sorry @Greg Keys, but the tutorial explicitly states to use %jdbc(hive) for connecting to Hive. After SELECT * FROM riskfactor the script should show data from the table. The tutorial then shows that a graph can be made, but I run into this error.
09-29-2016 10:53 AM
1 Kudo
I am going through the following lab: http://hortonworks.com/hadoop-tutorial/hello-world-an-introduction-to-hadoop-hcatalog-hive-and-pig/#section_7 and in paragraph 6.2 (execute a hive query) I am trying to run the following code in Zeppelin:
%jbdc(hive)
SELECT * FROM riskfactor
When I execute this code I run into the error "prefix not found". What am I doing wrong here? This seems pretty straightforward. Regards, Robbert
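For reference: Zeppelin's "prefix not found" error means the text between % and the parenthesis does not match any configured interpreter, so the prefix must be spelled exactly as the interpreter is named. The form the tutorial intends (assuming the jdbc interpreter is bound to the note, as in the stock sandbox) would be:

```
%jdbc(hive)
SELECT * FROM riskfactor
```

Note the paragraph above uses %jbdc, which is not a configured interpreter name.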
Labels:
- Apache Hive
- Apache Zeppelin
09-28-2016 06:44 AM
@Michael Young and @mrizvi, thanks for your answers. This works like a charm. I promise next time I will take a closer look at the errors. I am an Oracle guy used to getting errors that tell him exactly what went wrong, and I have to get used to sifting through log messages from the whole stack.
09-27-2016 12:54 PM
Hello, I am stepping through this part of the HDP 2.5 tutorial: https://github.com/hortonworks/tutorials/blob/hdp-2.5/tutorials/hortonworks/hello-hdp-an-introduction-to-hadoop/hello-hdp-section-5.md I have executed this statement in the Hive view in Ambari under maria_dev:
CREATE TABLE riskfactor (driverid string, events bigint, totmiles bigint, riskfactor float) STORED AS ORC;
I have checked that the table is present in the default db, and it is. After executing the following Pig script:
a = LOAD 'geolocation' using org.apache.hive.hcatalog.pig.HCatLoader();
b = filter a by event != 'normal';
c = foreach b generate driverid, event, (int) '1' as occurance;
d = group c by driverid;
e = foreach d generate group as driverid, SUM(c.occurance) as t_occ;
g = LOAD 'drivermileage' using org.apache.hive.hcatalog.pig.HCatLoader();
h = join e by driverid, g by driverid;
final_data = foreach h generate $0 as driverid, $1 as events, $3 as totmiles, (float) $3/$1 as riskfactor;
store final_data into 'riskfactor' using org.apache.hive.hcatalog.pig.HCatStorer();
I get the following errors:
ls: cannot access /hadoop/yarn/local/usercache/maria_dev/appcache/application_1474973150203_0003/container_1474973150203_0003_01_000002/hive.tar.gz/hive/lib/slf4j-api-*.jar: No such file or directory
ls: cannot access /hadoop/yarn/local/usercache/maria_dev/appcache/application_1474973150203_0003/container_1474973150203_0003_01_000002/hive.tar.gz/hive/hcatalog/lib/*hbase-storage-handler-*.jar: No such file or directory
WARNING: Use "yarn jar" to launch YARN applications.
16/09/27 11:51:21 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
16/09/27 11:51:21 INFO pig.ExecTypeProvider: Trying ExecType : MAPREDUCE
16/09/27 11:51:21 INFO pig.ExecTypeProvider: Trying ExecType : TEZ_LOCAL
16/09/27 11:51:21 INFO pig.ExecTypeProvider: Trying ExecType : TEZ
16/09/27 11:51:21 INFO pig.ExecTypeProvider: Picked TEZ as the ExecType
2016-09-27 11:51:21,605 [main] INFO org.apache.pig.Main - Apache Pig version 0.16.0.2.5.0.0-1245 (rexported) compiled Aug 26 2016, 02:07:35
2016-09-27 11:51:21,605 [main] INFO org.apache.pig.Main - Logging error messages to: /hadoop/yarn/local/usercache/maria_dev/appcache/application_1474973150203_0003/container_1474973150203_0003_01_000002/pig_1474977081603.log
2016-09-27 11:51:23,260 [main] INFO org.apache.pig.impl.util.Utils - Default bootup file /home/yarn/.pigbootup not found
2016-09-27 11:51:23,453 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://sandbox.hortonworks.com:8020
2016-09-27 11:51:24,818 [main] INFO org.apache.pig.PigServer - Pig Script ID for the session: PIG-script.pig-8ca435c7-920a-4f44-953e-454a42973ab8
2016-09-27 11:51:25,478 [main] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
2016-09-27 11:51:25,671 [main] INFO org.apache.pig.backend.hadoop.PigATSClient - Created ATS Hook
2016-09-27 11:51:27,037 [main] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.metastore.local does not exist
2016-09-27 11:51:27,107 [main] INFO hive.metastore - Trying to connect to metastore with URI thrift://sandbox.hortonworks.com:9083
2016-09-27 11:51:27,170 [main] INFO hive.metastore - Connected to metastore.
2016-09-27 11:51:27,904 [main] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.metastore.local does not exist
2016-09-27 11:51:27,906 [main] INFO hive.metastore - Trying to connect to metastore with URI thrift://sandbox.hortonworks.com:9083
2016-09-27 11:51:27,909 [main] INFO hive.metastore - Connected to metastore.
2016-09-27 11:51:28,140 [main] WARN org.apache.pig.newplan.BaseOperatorPlan - Encountered Warning IMPLICIT_CAST_TO_FLOAT 1 time(s).
2016-09-27 11:51:28,237 [main] WARN org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.metastore.local does not exist
2016-09-27 11:51:28,317 [main] INFO hive.metastore - Trying to connect to metastore with URI thrift://sandbox.hortonworks.com:9083
2016-09-27 11:51:28,325 [main] INFO hive.metastore - Connected to metastore.
2016-09-27 11:51:28,723 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 0:
<file script.pig, line 9, column 0> Output Location Validation Failed for: 'riskfactor More info to follow:
Pig 'double' type in column 2(0-based) cannot map to HCat 'BIGINT'type. Target filed must be of HCat type {DOUBLE}
Details at logfile: /hadoop/yarn/local/usercache/maria_dev/appcache/application_1474973150203_0003/container_1474973150203_0003_01_000002/pig_1474977081603.log
2016-09-27 11:51:28,746 [main] INFO org.apache.pig.Main - Pig script completed in 7 seconds and 330 milliseconds (7330 ms)
When I executed the script for the very first time I did not see any errors, but the riskfactor table was still empty when it should have been populated.
Is there somebody who can help?
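One direction to look at: the error complains that Pig is handing a double to a BIGINT column (column 2, 0-based, i.e. totmiles). A sketch of a possible fix, assuming drivermileage.totmiles is stored as DOUBLE in Hive (which is what the Pig 'double' in the error suggests), is to cast explicitly in the last foreach so the generated schema matches the riskfactor table exactly:

```pig
-- explicit casts so final_data's schema matches riskfactor
-- (driverid string, events bigint, totmiles bigint, riskfactor float)
final_data = foreach h generate $0 as driverid,
                                (long) $1 as events,
                                (long) $3 as totmiles,
                                (float) ((double) $3 / (double) $1) as riskfactor;
```

Alternatively, the riskfactor table could be recreated with totmiles as DOUBLE so no cast is needed; either way the point is that HCatStorer requires Pig's output schema and the Hive table schema to agree type-for-type.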