07-18-2018 08:19 AM
I tried with a custom processor, but that didn't work either. In the end I settled for a workaround: the data is compressed in Hive instead.
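A minimal sketch of what that workaround could look like, assuming the raw, uncompressed text is landed in a staging directory first. The table names, the /test/raw path, and this exact flow are assumptions for illustration, not details from the post:

-- Hypothetical staging table over the uncompressed files delivered by NiFi.
CREATE EXTERNAL TABLE test_raw (txt string)
LOCATION '/test/raw';

-- Have Hive compress its own output with Hadoop's Snappy codec.
SET hive.exec.compress.output=true;
SET mapreduce.output.fileoutputformat.compress=true;
SET mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.SnappyCodec;

-- Rewrite the data through Hive; the files it produces use the Hadoop
-- Snappy framing, so both Hive queries and `hdfs dfs -text` can read them.
CREATE TABLE test_snappy_hive
STORED AS TEXTFILE
AS SELECT txt FROM test_raw;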
07-16-2018 09:12 AM
I have tried your code:

CREATE EXTERNAL TABLE sourcetable (json string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ","
STORED AS TEXTFILE
LOCATION '/test/snappy'
TBLPROPERTIES ("orc.compress"="snappy");

without success. The result is the same as above: 0 rows. I have found that the snappy codec version in NiFi differs from the one in HDFS. I'll try to create a custom processor in NiFi that uses the same snappy codec version as HDFS.
07-13-2018 07:25 AM
Hello guys, I have a problem with reading snappy files from HDFS. From the beginning:

1. Files are compressed in Apache NiFi, on a separate cluster, in a CompressContent processor.
2. Files are sent to HDFS directly from NiFi, to /test/snappy.
3. An external table is created in Hive to read the data:

CREATE EXTERNAL TABLE test_snappy(
  txt string)
LOCATION
  '/test/snappy';

4. A simple query, Select * from test_snappy;, returns 0 rows.
5. The hdfs -text command returns an error:

$ hdfs dfs -text /test/snappy/dummy_text.txt.snappy
18/07/13 08:46:47 INFO compress.CodecPool: Got brand-new decompressor [.snappy]
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at org.apache.hadoop.io.compress.BlockDecompressorStream.getCompressedData(BlockDecompressorStream.java:123)
at org.apache.hadoop.io.compress.BlockDecompressorStream.decompress(BlockDecompressorStream.java:98)
at org.apache.hadoop.io.compress.DecompressorStream.read(DecompressorStream.java:105)
at java.io.InputStream.read(InputStream.java:101)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:87)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:61)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:121)
at org.apache.hadoop.fs.shell.Display$Cat.printToStdout(Display.java:106)
at org.apache.hadoop.fs.shell.Display$Cat.processPath(Display.java:101)
at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:317)
at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:289)
at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:271)
at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:255)
at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:118)
at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:315)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:372)

Here is my test file, dummy_text.txt.snappy: https://we.tl/pPUMQU028X

Do you have any clues?
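A plausible explanation for the OutOfMemoryError, offered as an assumption rather than a confirmed diagnosis: Hadoop's BlockDecompressorStream expects a Hadoop-specific Snappy container that begins with 4-byte big-endian length fields, while NiFi's CompressContent writes a different snappy stream format; the first bytes of that stream get read as an enormous block length, and the decompressor fails trying to allocate a buffer of that size. One way to test the framing rather than the codec version is to let Hive write a Snappy file with the cluster's own codec and compare (the /test/snappy_check path is hypothetical):

-- Write a small Snappy-compressed file using the cluster's Hadoop codec.
SET hive.exec.compress.output=true;
SET mapreduce.output.fileoutputformat.compress=true;
SET mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.SnappyCodec;
INSERT OVERWRITE DIRECTORY '/test/snappy_check'
SELECT 'dummy text line';

-- From the Hive CLI: this file should decode cleanly with -text,
-- unlike the NiFi-produced one, pointing at the framing, not the data.
dfs -text /test/snappy_check/*;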
Labels:
- Apache Hadoop
- Apache Hive
- Apache NiFi