04-26-2018
12:53 PM
Hi, I have a simple Oozie shell action that echoes "hello world", and I am trying to override the shell action's log4j configuration. I tried setting the MapReduce properties of the shell action, but it still does not work. Please let me know how this can be achieved.

My bash script looks like this:

<code>#!/bin/bash
echo "log4j logger checking the target"
# End of script tasks, returning error code</code>

I added the following property to the workflow and tried to execute it:

<code><property>
    <name>mapred.child.java.opts</name>
    <value>-Dlog4j.configuration=${log4jConfig}</value>
</property></code>

The log4j content is similar to this:

<code>log4j.rootLogger=${hadoop.root.logger}
hadoop.root.logger=INFO,oozie
log4j.appender.oozie=org.apache.log4j.ConsoleAppender
log4j.appender.oozie.target=System.err
log4j.appender.oozie.layout=org.apache.log4j.PatternLayout
log4j.appender.oozie.layout.ConversionPattern=[%p] %d{yyyy-MM-dd'T'HH:mm:ss.SSSZ} [sds-to-hdfs][%c] %m%n</code>

I want to change the logging target to System.out just to check whether the log4j configuration is overridden, but it is still not overridden. Please let me know if there is something wrong.
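For context, a minimal sketch of one common approach (an assumption, not confirmed by this thread): launcher-side JVM options are usually set with the `oozie.launcher.` prefix, and a custom log4j file can be shipped alongside the script with a `<file>` element so the `-Dlog4j.configuration` path resolves in the container's working directory. The file name `custom-log4j.properties` and the script name are illustrative only.

```xml
<!-- Hypothetical shell action sketch; property/file names are
     illustrative, not taken from the original workflow. -->
<shell xmlns="uri:oozie:shell-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <configuration>
        <property>
            <!-- oozie.launcher.* properties target the launcher job
                 rather than the child MapReduce tasks -->
            <name>oozie.launcher.mapred.child.java.opts</name>
            <value>-Dlog4j.configuration=custom-log4j.properties</value>
        </property>
    </configuration>
    <exec>script.sh</exec>
    <file>script.sh</file>
    <!-- ship the log4j file so the relative path above resolves -->
    <file>custom-log4j.properties</file>
</shell>
```

Whether the override takes effect can then be checked by switching `log4j.appender.oozie.target` to `System.out` in the shipped file, as described above.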
12-04-2016
06:35 AM
The error seems to be a data type mismatch between the dataset and the case class. Check each column's data type first: use the CSV API to read the CSV file and print the schema, e.g.:

<code>val ebaydf = sqlContext.read.format("com.databricks.spark.csv")
  .option("header", "true")
  .option("inferSchema", "true")
  .load(path)
ebaydf.printSchema()</code>
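To illustrate the mismatch, here is a hedged Scala sketch (the case class and column names `Ebay`, `auctionid`, `bid`, `price` are assumptions for illustration, not from this thread). Without `inferSchema`, every CSV column is read as a String, so mapping onto numeric fields in a case class fails:

```scala
// Illustrative only: field names and types are assumed.
case class Ebay(auctionid: String, bid: Double, price: Double)

val ebaydf = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "true")
  .option("inferSchema", "true") // without this, all columns are Strings
  .load(path)

ebaydf.printSchema()             // compare these types with the case class
val ds = ebaydf.as[Ebay]         // fails if the column types do not line up
```

If `printSchema()` shows a column type that differs from the corresponding case class field, either cast the column explicitly or adjust the case class.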
08-04-2016
06:39 PM
It worked, thanks!
10-17-2016
03:35 AM
Hi Matt Burgess, how can I import data from MySQL to Hive using the NiFi PutHDFS and PutHiveQL processors? I already get an ORC file, but I cannot put the ORC file into Hive. I am using NiFi 1.0.