Member since: 04-05-2017
Posts: 12
Kudos Received: 1
Solutions: 2

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 11758 | 04-25-2017 01:53 PM
 | 11878 | 04-17-2017 12:56 PM
08-02-2017 01:13 PM
Your answer is very confusing. Is there a single example of using a job.properties file to provide the --username and --password values needed by a jdbc:sqlserver:..... connection? Do I need to add Arguments? I can add arguments for the entire Sqoop action command within my Hue/Oozie workflow editor, and it works with HARDCODED username and password values, but I don't want hardcoded values. How about a screenshot showing the Sqoop command arguments and a separate screenshot showing the Properties page? Does anyone at Cloudera have one of these to share with the community? It would make the brute-force, trial-and-error approach of trying every possible combination much easier. Thanks.
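For the record, here is the kind of setup I am trying to reach: a minimal sketch, assuming Oozie substitutes ${} parameters from job.properties into the action. The property names db_user/db_password and the connection details are placeholders I made up, not a confirmed Cloudera example.

    # job.properties
    db_user=sqoop_svc
    db_password=secret

    # Sqoop command arguments in the Hue/Oozie workflow editor,
    # referencing the properties instead of hardcoded values:
    import --connect "jdbc:sqlserver://dbhost:1433;databaseName=mydb" --username ${db_user} --password ${db_password} --table mytable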
06-14-2017 04:21 PM
I'm not sure how to identify the Kafka broker version easily. I did find the file kafka_2.10-0.10.1.2.1.0.0-165.jar.asc in the /libs folder where Kafka is installed, so I am assuming I am running Kafka 0.10.1. I did get both the ConsumeKafka and ConsumeKafka_0_10 processors to work. Thanks. Now off to figure out why PutHiveStreaming doesn't work, but that will be for a different post.
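For anyone else trying this, the broker version can be inferred from the jar filenames: in kafka_2.10-0.10.1.2.1.0.0-165, the 2.10 is the Scala version and 0.10.1 is the Kafka version. A quick check (a sketch; the install path below is a guess for an HDF layout and may differ on your box):

    ls /usr/hdf/current/kafka-broker/libs/kafka_*.jar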
06-14-2017 12:14 PM
I am unable to get the GetKafka processor to connect to Kafka. I have a single-server install of HDF-2.1; NiFi, Kafka, and ZooKeeper are all installed on the same server. There is no security. I have verified the ZooKeeper port by connecting to ZooKeeper from other machines like this:

    $ telnet 192.168.99.100 2181
    Trying 192.168.99.100...
    Connected to cb675348f5c8.
    Escape character is '^]'.
    (Ctrl+C to disconnect)

Here is the error I get from the GetKafka processor:

Any help would be greatly appreciated.
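A side note on the verification above: a connected telnet session only proves the port is open. ZooKeeper's standard four-letter-word check confirms the server is actually healthy (a sketch, reusing my host and port):

    $ echo ruok | nc 192.168.99.100 2181
    imok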
Labels:
- Apache Kafka
- Apache NiFi
04-25-2017 01:53 PM
Here is the actual solution to uninstalling the lower version of Java (the RPM package jdk-1.6.0_31-fcs.x86_64, installed under /usr/java).

1. Determine if this Java version was installed using RPM:
   - Open a terminal window and log in as the superuser.
   - List the installed packages: rpm -qa
   - Copy the long list of packages to a text editor and search for 'jdk-1.6.0_31-fcs.x86_64'. If found, it was installed using RPM and should be uninstalled using RPM.
2. Go back to the terminal window and uninstall with RPM:
   - Verify the directory is there by listing the contents of /usr/java: cd /usr/java then ls -l
   - Check what your JAVA_HOME variable is set to: echo $JAVA_HOME (mine returned the jdk-1.6.0_31 install)
   - Remove the lower version of Java: rpm -e jdk-1.6.0_31-fcs.x86_64
   - Verify the directory has been removed: ls -l
3. Exit the terminal window.
4. Open a new terminal window, log in as the superuser, and verify that JAVA_HOME has changed to the correct version of Java: echo $JAVA_HOME. The result should be the correct version of Java, not the old jdk-1.6.0_31-fcs.x86_64 (or whichever version you needed to remove).
5. Repeat steps 2-4 for each node in your cluster.

This is what I actually had to do to remove the offending version of Java and ensure that all nodes were running the same version; the whole sequence is condensed in the sketch below.
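Condensed, the same sequence looks like this on each node (a sketch; the package name and paths match my system and will differ on yours):

    $ rpm -qa | grep -i jdk            # confirm the old JDK came from an RPM
    jdk-1.6.0_31-fcs.x86_64
    $ echo $JAVA_HOME                  # see which Java is currently active
    /usr/java/jdk1.6.0_31
    $ rpm -e jdk-1.6.0_31-fcs.x86_64   # remove the old package
    $ ls -l /usr/java                  # verify the old directory is gone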
04-17-2017 12:56 PM
We uninstalled the earlier version of Java from all nodes, restarted everything, and confirmed that $JAVA_HOME now points to the correct version of Java. This corrected the issue, and we were able to validate the environment.
04-14-2017 01:16 PM
Found that my $JAVA_HOME returns /usr/java/jdk1.6.0_31. Unsure how to change this to /usr/java/jdk1.7.0_67-cloudera for all users.
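In case it helps anyone later: one common way to change this for all users is a script in /etc/profile.d, which login shells source automatically. A sketch (the file name java.sh is my choice; create it as root):

    # /etc/profile.d/java.sh
    export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera
    export PATH=$JAVA_HOME/bin:$PATH

Users pick it up on their next login (or by sourcing the file).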
04-14-2017 10:41 AM
Clean install of CDH 5.10. Running any hadoop command returns the "Unsupported major.minor version 51.0" error message on any node in the cluster. Found while working through Testing the Installation. Occurs with or without the "Java Home Directory" setting override in Hosts Configuration in Cloudera Manager.

    hdfs@hadoop1:/> hadoop fs
    Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    Could not find the main class: org.apache.hadoop.fs.FsShell. Program will exit.

    login as: root
    Using keyboard-interactive authentication.
    Password:
    Last login: Fri Apr 14 12:31:18 2017 from 10.4.4.44
    hadoop1:~ # sudo su hdfs
    hdfs@hadoop1:/root> hadoop version
    Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/util/VersionInfo : Unsupported major.minor version 51.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    Could not find the main class: org.apache.hadoop.util.VersionInfo. Program will exit.

There are two versions of Java in /usr/java:

    jdk1.6.0_31
    jdk1.7.0_67-cloudera

$PATH = /usr/local/bin:/usr/bin:/bin:/usr/bin/X11:/usr/games
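For context: major.minor version 51.0 is the Java 7 class-file format, so these Hadoop classes were compiled for Java 7 but are being executed by the Java 6 that my PATH resolves to. A quick way to confirm which java is being picked up (a sketch; output will vary):

    $ which java       # which binary the shell resolves
    $ java -version    # if this reports 1.6.x, that is the mismatch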
Labels:
- Apache Hive
- Cloudera Manager
- HDFS