Member since: 11-28-2016
Posts: 18
Kudos Received: 0
Solutions: 0
08-04-2017
12:47 PM
Could you please guide me on how to migrate our current Hadoop cluster to EMC Isilon?
06-27-2017
09:10 AM
Hi, Thanks.
06-23-2017
12:42 PM
Could you please share the document on best practices for setting up HDP 2.6 with Ambari 2.5 on EMC Isilon (HDP 2.6.1.0 with Ambari version 2.5.1.0)?
03-13-2017
07:05 AM
I resolved it temporarily by performing the step below: I removed “com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec” from the “io.compression.codecs” property in the core-site.xml file.
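For reference, a minimal sketch of what the trimmed property in core-site.xml might look like after that change (the remaining codec list is an assumption and will differ per cluster; only the two LZO classes were dropped):

<property>
  <name>io.compression.codecs</name>
  <!-- LZO codecs removed; the remaining codecs shown here are assumed, not taken from the actual cluster -->
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>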
03-03-2017
08:57 AM
2017-03-03 07:59:59,090 ERROR [Timer-Driven Process Thread-8] o.apache.nifi.processors.hadoop.PutHDFS PutHDFS[id=931441fa-015a-1000-d89f-19ccb802a271] Failed to write to HDFS due to java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
2017-03-03 07:59:59,091 ERROR [Timer-Driven Process Thread-8] o.apache.nifi.processors.hadoop.PutHDFS
java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.
    at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:139) ~[hadoop-common-2.7.3.jar:na]
    at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:180) ~[hadoop-common-2.7.3.jar:na]
    at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor.getCompressionCodec(AbstractHadoopProcessor.java:398) ~[nifi-hdfs-processors-1.1.0.2.1.2.0-10.jar:1.1.0.2.1.2.0-10]
    at org.apache.nifi.processors.hadoop.PutHDFS$1.run(PutHDFS.java:251) ~[nifi-hdfs-processors-1.1.0.2.1.2.0-10.jar:1.1.0.2.1.2.0-10]
    at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_101]
    at javax.security.auth.Subject.doAs(Subject.java:360) [na:1.8.0_101]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1678) [hadoop-common-2.7.3.jar:na]
    at org.apache.nifi.processors.hadoop.PutHDFS.onTrigger(PutHDFS.java:230) [nifi-hdfs-processors-1.1.0.2.1.2.0-10.jar:1.1.0.2.1.2.0-10]
    at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) [nifi-api-1.1.0.2.1.2.0-10.jar:1.1.0.2.1.2.0-10]
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1099) [nifi-framework-core-1.1.0.2.1.2.0-10.jar:1.1.0.2.1.2.0-10]
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-1.1.0.2.1.2.0-10.jar:1.1.0.2.1.2.0-10]
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) [nifi-framework-core-1.1.0.2.1.2.0-10.jar:1.1.0.2.1.2.0-10]
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) [nifi-framework-core-1.1.0.2.1.2.0-10.jar:1.1.0.2.1.2.0-10]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_101]
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_101]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_101]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_101]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_101]
Caused by: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzoCodec not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101) ~[hadoop-common-2.7.3.jar:na]
    at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:132) ~[hadoop-common-2.7.3.jar:na]
    ... 19 common frames omitted
Labels: Apache Hadoop
11-29-2016
07:46 PM
Thank you very much, it's working fine now.
11-29-2016
08:20 AM
I am trying to load an ORC table with a non-ORC file, but during Ambari installation I changed hive.default.fileformat to ORC (the default is TextFile). See attached screenshots 1.png and 2.png.
11-29-2016
05:44 AM
Thank you, I will follow that, but during Ambari installation I changed hive.default.fileformat to ORC (the default is TextFile).
11-29-2016
05:21 AM
Yes, I am trying to load an ORC table with a non-ORC file. May I know how to convert CSV to ORC, or how to create an external Hive table?
11-29-2016
04:26 AM
org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 30019]: The file that you are trying to load does not match the file format of the destination table. Destination table is stored as ORC but the file being loaded is not a valid ORC file. See attached screenshot 1.png.
Labels: Hortonworks Data Platform (HDP)
11-28-2016
08:06 PM
Thanks, could you please share your email address? My issue: I am practicing from http://hortonworks.com/hadoop-tutorial/how-to-process-data-with-apache-hive/, but my hive.default.fileformat is ORC. See attached screenshots 1.png and 2.png.
11-28-2016
08:03 PM
Thanks. I am practicing from http://hortonworks.com/hadoop-tutorial/how-to-process-data-with-apache-hive/, but my hive.default.fileformat is ORC. Please find the screenshots attached as well (1.png, 2.png).
11-28-2016
08:00 PM
I am practicing from http://hortonworks.com/hadoop-tutorial/how-to-process-data-with-apache-hive/, but my hive.default.fileformat is ORC. See attached screenshots 1.png and 2.png.
11-28-2016
07:19 PM
org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 30019]: The file that you are trying to load does not match the file format of the destination table. Destination table is stored as ORC but the file being loaded is not a valid ORC file.
Tags: Data Processing, orc
Labels: Apache Hive
11-28-2016
07:13 PM
Thanks, but when I run the query I get the error below:
org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 30019]: The file that you are trying to load does not match the file format of the destination table. Destination table is stored as ORC but the file being loaded is not a valid ORC file.
11-28-2016
09:46 AM
How do I change it? I initially configured the default as ORC.
11-28-2016
09:33 AM
Hi, I would like to run the query: LOAD DATA INPATH '/user/maria_dev/drivers.csv' OVERWRITE INTO TABLE temp_drivers; But my hive.default.fileformat is ORC and the data is in .csv format. See attached screenshot help.png.
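A minimal HiveQL sketch of the usual workaround for this mismatch, assuming a staging table (the name temp_drivers_txt and the single col_value STRING column are illustrative assumptions, not taken from the post): load the raw CSV into a TEXTFILE table, then let Hive write the ORC files via INSERT ... SELECT.

-- Staging table stored as TEXTFILE so the raw CSV can be loaded directly
CREATE TABLE temp_drivers_txt (col_value STRING)
STORED AS TEXTFILE;

LOAD DATA INPATH '/user/maria_dev/drivers.csv'
OVERWRITE INTO TABLE temp_drivers_txt;

-- ORC destination table; Hive produces valid ORC files when it is populated by a query
CREATE TABLE temp_drivers (col_value STRING)
STORED AS ORC;

INSERT OVERWRITE TABLE temp_drivers
SELECT col_value FROM temp_drivers_txt;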
Labels: Apache Hive