Member since
01-18-2017
7
Posts
1
Kudos Received
1
Solution
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2387 | 01-18-2017 06:50 PM
02-08-2017 10:51 PM
1 Kudo
I want to point out a couple of things. `date` is a Hive keyword; to name a column `date` you need to quote it in backticks, as in `` `date` string``. Also, if you declare a table as partitioned (in your case `PARTITIONED BY (`date` string)`), the partition column is implicitly treated as the last column of the table, so there is no need to list it in the schema again:

```sql
CREATE EXTERNAL TABLE user (
  userId BIGINT,
  type INT,
  level TINYINT
)
PARTITIONED BY (`date` STRING)
LOCATION '/external_table_path';
```

As @Steven O'Neill mentioned, when ingesting data into HDFS you should create sub-directories like `date=2016-01-01` under your table location. Then your `ALTER TABLE` statement will work.
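To make the last step concrete, here is a minimal sketch of registering such a directory as a partition (the partition value `2016-01-01` and the path are illustrative assumptions, matching the directory-naming example above):

```sql
-- Assuming data was ingested to /external_table_path/date=2016-01-01
ALTER TABLE user ADD PARTITION (`date` = '2016-01-01')
  LOCATION '/external_table_path/date=2016-01-01';
```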
01-23-2017 04:46 PM
Do any of the experts here know the root cause of this issue? What I did was a workaround to get things rolling, but it doesn't look like a meaningful solution.
01-18-2017 06:50 PM
It worked for me after I modified the HDFS core-site.xml and removed `com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec` from the `io.compression.codecs` parameter. I also removed `io.compression.codec.lzo.class` (which was set to `com.hadoop.compression.lzo.LzoCodec`). Then I restarted MapReduce, YARN, Hive, and Oozie.
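For reference, a sketch of what the edited property might look like in core-site.xml after removing the LZO entries (the exact codec list here is an assumption and depends on what your cluster had configured besides LZO):

```xml
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```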
01-18-2017 04:41 PM
@Kuldeep Kulkarni
Can you help? I have copied hadoop-lzo-0.6.0.2.5.0.0.-1245.jar to /usr/hdp/2.5.0.0-1245/oozie/share/lib/hive on the Oozie master, and it is also in /user/oozie/share/lib/hive in HDFS. I can see in the MapReduce task logs that this jar is copied, but I am still getting the LzoCodec not found error.
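For what it's worth, a sketch of the usual way to make Oozie pick up a jar added to the sharelib (the paths match the ones above; the Oozie server URL is an assumption and must match your cluster):

```shell
# Put the jar into the Oozie sharelib in HDFS (path from the post above)
hdfs dfs -put hadoop-lzo-0.6.0.2.5.0.0.-1245.jar /user/oozie/share/lib/hive/

# Tell the Oozie server to refresh the sharelib without a restart
# (the -oozie URL is an assumption; adjust host and port for your cluster)
oozie admin -oozie http://oozie-host:11000/oozie -sharelibupdate
```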
01-18-2017 04:36 PM
Yes, they are installed on all the nodes. `rpm -ql hadooplzo_2_5_0_0_1245` and `rpm -ql hadooplzo_2_5_0_0_1245-native` give the expected output on all the nodes.
01-18-2017 03:42 PM
I have a Hive workflow that inserts into table2 by selecting from table1. Table2 is in ORC format. Every instance of the Hive workflow fails with a `Class com.hadoop.compression.lzo.LzoCodec not found` error. I am able to run the same insert command directly at the Hive command prompt and it works fine (with both Tez and MapReduce as execution engines), so I can't figure out why it fails when run through Oozie. I have tried various steps: using ZLIB and SNAPPY as compression formats, copying the LZO jar files to /var/lib/oozie and restarting Oozie, and including the paths to these jars in the classpath and library path. But I am still getting the same error. Any help is greatly appreciated.
Labels:
- Apache Falcon
- Apache Hadoop
- Apache Oozie