Member since: 02-21-2017
Posts: 34
Kudos Received: 0
Solutions: 1

My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 8677 | 02-20-2017 03:51 AM |
02-28-2017
06:17 AM
1) metatool -listFSRoot
aruna@aruna:~/hive_demo$ metatool -listFSRoot
Initializing HiveMetaTool..
17/02/28 14:07:46 INFO metastore.ObjectStore: ObjectStore, initialize called
17/02/28 14:07:46 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
17/02/28 14:07:46 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
17/02/28 14:07:58 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
17/02/28 14:08:00 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/02/28 14:08:00 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/02/28 14:08:08 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/02/28 14:08:08 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/02/28 14:08:10 INFO metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
17/02/28 14:08:10 INFO metastore.ObjectStore: Initialized ObjectStore
Listing FS Roots..
2) hdfs getconf -nnRpcAddresses
aruna@aruna:~/hive_demo$ hdfs getconf -nnRpcAddresses
localhost:9000
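(If the FS root listed above ever turned out not to match the NameNode address, my understanding is that the same metatool can rewrite it. This is only a sketch with placeholder host names, not something I have run on my setup:)
# Sketch: point the metastore FS root at the current NameNode address.
# "hdfs://old-host:9000" is a placeholder for whatever -listFSRoot reports.
metatool -updateLocation hdfs://localhost:9000 hdfs://old-host:9000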
02-24-2017
05:57 AM
I installed HBase and started it. But when I check the running daemons using "sudo jps", it shows me only the following:
aruna@aruna:~/hbase-1.2.4$ sudo jps
[sudo] password for aruna:
6722 HRegionServer
6525 HQuorumPeer
6975 Jps
1) When installing, I added the following to the hbase-site.xml file:
<property>
<name>hbase.rootdir</name>
<value>/home/aruna/hbase-1.2.4/hbasestorage</value>
</property>
<property>
<name>hbase.zookeeper.quorum</name>
<value>localhost</value>
</property>
<property>
<name>hbase.cluster.distributed</name>
<value>true</value>
</property>
<property>
<name>hbase.zookeeper.property.dataDir</name>
<value>/home/aruna/hbase-1.2.4/hbasestorage/zookeeper</value>
</property>
<property>
<name>hbase.zookeeper.property.clientPort</name>
<value>2181</value>
</property>
2) I added the following to the hbase-env.sh file:
export JAVA_HOME=/usr/lib/jvm/java-7-oracle
export HBASE_MANAGES_ZK=true
I start HBase using the command below:
./bin/start-hbase.sh
I don't know why the HMaster daemon is not running.
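For what it's worth, the first thing I plan to check is the HBase master log to see why HMaster exits. The log file name below is only my guess based on my user and host name:
# Sketch: inspect the most recent master log entries (file name assumed, may differ).
tail -n 100 /home/aruna/hbase-1.2.4/logs/hbase-aruna-master-aruna.log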
Labels:
- Apache HBase
02-23-2017
06:21 AM
@Sindhu below are the results.
1) describe formatted telecom.recharge;
hive> describe formatted telecom.recharge;
OK
# col_name data_type comment
cell_no int
city string
name string
# Detailed Table Information
Database: telecom
Owner: aruna
CreateTime: Wed Feb 22 12:40:21 SGT 2017
LastAccessTime: UNKNOWN
Protect Mode: None
Retention: 0
Location: hdfs://localhost:9000/user/hive/warehouse/telecom.db/recharge
Table Type: MANAGED_TABLE
Table Parameters:
COLUMN_STATS_ACCURATE true
numFiles 2
totalSize 93
transient_lastDdlTime 1487773869
# Storage Information
SerDe Library: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
InputFormat: org.apache.hadoop.mapred.TextInputFormat
OutputFormat: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
Compressed: No
Num Buckets: -1
Bucket Columns: []
Sort Columns: []
Storage Desc Params:
field.delim ,
serialization.format ,
Time taken: 0.141 seconds, Fetched: 32 row(s)
2) hadoop fs -ls /user/hive/warehouse
aruna@aruna:~/hadoop-2.7.3/sbin$ hadoop fs -ls /user/hive/warehouse
Found 4 items
drwxrwxr-x - aruna supergroup 0 2017-02-23 12:00 /user/hive/warehouse/myreatail.db
drwxrwxr-x - aruna supergroup 0 2017-02-22 19:45 /user/hive/warehouse/telecom.db
drwxrwxr-x - aruna supergroup 0 2017-02-22 12:23 /user/hive/warehouse/telecome_bkp.db
drwxrwxr-x - aruna supergroup 0 2017-02-22 12:26 /user/hive/warehouse/telecome_bkp_extend.db
3) hadoop fs -ls /user/hive/warehouse/telecom.db
aruna@aruna:~/hadoop-2.7.3/sbin$ hadoop fs -ls /user/hive/warehouse/telecom.db
Found 3 items
drwxrwxr-x - aruna supergroup 0 2017-02-22 22:31 /user/hive/warehouse/telecom.db/recharge
drwxrwxr-x - aruna supergroup 0 2017-02-22 17:48 /user/hive/warehouse/telecom.db/recharged
drwxrwxr-x - aruna supergroup 0 2017-02-22 19:45 /user/hive/warehouse/telecom.db/recharged2
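If it helps, I can also dump the raw data files under the recharge directory. The wildcard below assumes the files sit directly in that folder:
# Sketch: print the raw contents of the recharge table's data files.
hadoop fs -cat /user/hive/warehouse/telecom.db/recharge/*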
02-23-2017
06:16 AM
I created a table from Hive and added some data using a txt file. When I run the query below, the last 4 lines show as NULL.
hive> select * from txnrecords;
84 05-26-2011 4006771 41.0 39 Exercise & Fitness Machine Weight Accessories Singapore Changi
85 05-26-2011 4006771 41.0 39 Exercise & Fitness Machine Weight Accessories Singapore Changi
NULL NULL NULL NULL NULL NULL NULL NULL NULL
NULL NULL NULL NULL NULL NULL NULL NULL NULL
NULL NULL NULL NULL NULL NULL NULL NULL NULL
NULL NULL NULL NULL NULL NULL NULL NULL NULL
1) How can I remove these NULL values?
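For now I can hide them with a filter like the one below (I am assuming the first column is named txnno; the real column name may differ), but I would like to understand where the NULL rows come from:
# Sketch: skip the all-NULL rows when querying ("txnno" is an assumed column name).
hive -e "SELECT * FROM txnrecords WHERE txnno IS NOT NULL;"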
Labels:
- Apache Hadoop
- Apache Hive
02-22-2017
06:46 AM
Ok. Even though a database (telecom) and a table (recharge) are already created, I can't find the hive/warehouse folder. Since I didn't set them in hive-site.xml, that folder may have been created in a default location? So where can that folder be? When I was installing Hive, I created the folders and added permissions using the commands below.
hadoop fs -mkdir -p /user/hive/warehouse //create folder
hadoop fs -chmod g+w /user/hive/warehouse //add permission
hadoop fs -chmod g+w /tmp //add permission
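To double-check which directory Hive actually resolves, I believe the effective value can be printed from the client and then listed on HDFS. This is just what I intend to try:
# Sketch: print the warehouse directory Hive uses at runtime, then list it on HDFS.
hive -e "set hive.metastore.warehouse.dir;"
hadoop fs -ls /user/hive/warehouse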
02-22-2017
06:34 AM
@Jay SenSharma, below are the results for the queries.
1) hive shell;
hive> hive shell;
NoViableAltException(26@[])
at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1071)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:202)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:396)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1170)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
FAILED: ParseException line 1:0 cannot recognize input near 'hive' 'shell' '<EOF>'
2) use telecom;
hive> use telecom;
OK
Time taken: 0.02 seconds
3) select * from recharge;
hive> select * from recharge;
OK
9999999 ind abc
6399 usa mnb
9087 uk sasadda
Time taken: 0.351 seconds, Fetched: 3 row(s)
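Side note on item 1: I suspect the ParseException appears because "hive shell" is not a HiveQL statement, so typing it inside the CLI cannot be parsed. Assuming that is right, the two usages would look like this:
# Sketch: start the Hive CLI from the OS shell, then run only HiveQL inside it.
aruna@aruna:~$ hive
hive> use telecom;
hive> select * from recharge;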
02-22-2017
06:05 AM
I don't have a hive-site.xml file in the conf folder. There is a hive-default.xml.template file. It has the hive.metastore.warehouse.dir value as below:
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/user/hive/warehouse</value>
<description>location of default database for the warehouse</description>
</property>
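If a hive-site.xml is needed, I assume I can create a minimal one in the conf folder and copy just this property into it. This is only a sketch of what I would try, not something I have verified:
# Sketch: create a minimal hive-site.xml with the warehouse property copied from the template above.
cat > /home/aruna/apache-hive-1.2.1-bin/conf/hive-site.xml <<'EOF'
<configuration>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
  </property>
</configuration>
EOF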
02-22-2017
05:54 AM
When I was installing Hive, I set the following in the .bashrc file:
export HIVE_HOME=/home/aruna/apache-hive-1.2.1-bin
export PATH=$PATH:/home/aruna/apache-hive-1.2.1-bin/bin
Then I created the folders:
hadoop fs -mkdir -p /user/hive/warehouse //create folder
hadoop fs -chmod g+w /user/hive/warehouse //add permission
hadoop fs -chmod g+w /tmp //add permission
and added the Hadoop path to the hive-config.sh file:
export HADOOP_HOME=/home/aruna/hadoop-2.7.3
These are the things I have done. I don't remember setting the "hive.metastore.warehouse.dir" value.
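To rule out an environment issue, I also plan to re-source .bashrc and confirm which hive binary is being picked up:
# Sketch: verify the Hive environment variables and binary currently in effect.
source ~/.bashrc
echo "$HIVE_HOME"
which hive
hive --version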
02-22-2017
05:34 AM
I installed Hive and created a table. When I checked the table properties using "describe extended recharge", I found that the table was created in the location below:
location:hdfs://localhost:9000/user/hive/warehouse/telecom.db/recharge
I tried to find that location inside the user folder, but I can't find the hive/warehouse folder.
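Since the location above is an HDFS URI, I assume it has to be browsed with hadoop fs rather than on the local disk. This is what I intend to try:
# Sketch: list the table directory on HDFS instead of the local filesystem.
hadoop fs -ls hdfs://localhost:9000/user/hive/warehouse/telecom.db/recharge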
Labels:
- Apache Hadoop