Member since: 11-21-2015
Posts: 15
Kudos Received: 1
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
| 2010 | 03-30-2017 07:26 AM
| 3635 | 11-25-2015 08:12 AM
03-30-2017
07:26 AM
I uninstalled Cloudera Manager and also removed OpenJDK using yum remove java*. I started over with the master node, added the two new hosts later, and the installation completed successfully.
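For anyone hitting the same OpenJDK conflict, the removal step can be sketched as below. The package names are assumptions taken from a typical CentOS 7 host, so confirm them with rpm -qa first; targeting the openjdk packages explicitly is also safer than yum remove java*, which can sweep up a separately installed Oracle JDK.

```shell
# Assumed OpenJDK package names on CentOS 7; verify on your hosts
# with: rpm -qa | grep -i jdk
OPENJDK_PKGS="java-1.8.0-openjdk java-1.8.0-openjdk-headless"

# Preview what would be removed before touching anything.
for pkg in $OPENJDK_PKGS; do
  echo "would remove: $pkg"
done

# The actual removal (run as root) once the list looks right:
#   yum remove -y java-1.8.0-openjdk java-1.8.0-openjdk-headless
#   java -version   # should now resolve to the remaining JDK, if any
```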
03-28-2017
11:55 AM
At this moment I have skipped the Cloudera Manager JDK installation, since I have already downloaded and installed JDK 1.8 on all the hosts and set $JAVA_HOME as well. But java -version still points to OpenJDK:

[root@namenode1 ~]# rpm -aq | grep -i jdk
java-1.8.0-openjdk-headless-1.8.0.121-0.b13.el7_3.x86_64
java-1.8.0-openjdk-1.8.0.121-0.b13.el7_3.x86_64
jdk-1.6.0_31-fcs.x86_64
copy-jdk-configs-1.2-1.el7.noarch
[root@namenode1 ~]# java -version
openjdk version "1.8.0_121"
OpenJDK Runtime Environment (build 1.8.0_121-b13)
OpenJDK 64-Bit Server VM (build 25.121-b13, mixed mode)
[root@namenode1 ~]# echo $JAVA_HOME
/usr/java/jdk1.8.0_121
[root@namenode1 ~]#

Let me try to remove OpenJDK.
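Setting $JAVA_HOME alone will not change what java -version reports, because /usr/bin/java on CentOS is a symlink chain managed by the alternatives system. A minimal sketch of repointing it, assuming the Oracle JDK path shown in the output above:

```shell
# The Oracle JDK location reported by `echo $JAVA_HOME` above.
ORACLE_JAVA=/usr/java/jdk1.8.0_121/bin/java
echo "want /usr/bin/java -> $ORACLE_JAVA"

# Inspect where /usr/bin/java currently points, then repoint it
# via alternatives (run as root):
#   readlink -f /usr/bin/java
#   alternatives --install /usr/bin/java java /usr/java/jdk1.8.0_121/bin/java 2
#   alternatives --set java /usr/java/jdk1.8.0_121/bin/java
#   java -version   # should now report the Oracle build
```

This avoids removing the OpenJDK rpms at all, which matters if other packages depend on them.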
03-28-2017
09:49 AM
@surajacharya What if the screen freezes and shows no progress: the download shows 100%, but Distribute is 0/4, Unpack 0/4, and Activate 0/4?
03-28-2017
06:48 AM
I faced this issue too; after uninstalling and reinstalling, it did not recur, but I then ended up with heartbeat failures.
03-28-2017
06:20 AM
Hello All, I am trying to set up a 4-node cluster on CentOS 7 in VirtualBox using CDH Installation Path A (the automated way, with parcels). I was able to download the required parcels, and cloudera-scm-server and cloudera-scm-agent are running successfully on the master node and the datanodes. But after the download reaches 100%, it is unable to distribute to any of the hosts. I see that /var/log/cloudera-scm-server/cloudera-scm-server.log has heartbeat failure messages:

java.net.ConnectException: No route to host to http://datanode2.example.com:9000/heartbeat
java.net.ConnectException: No route to host to http://datanode3.example.com:9000/heartbeat
java.net.ConnectException: No route to host to http://datanode2.example.com:9000/heartbeat
java.net.ConnectException: connection timed out to http://datanode1.example.com:9000/heartbeat
java.net.ConnectException: connection timed out to http://datanode3.example.com:9000/heartbeat
java.net.ConnectException: No route to host to http://datanode2.example.com:9000/heartbeat
java.net.ConnectException: connection timed out to http://datanode1.example.com:9000/heartbeat
java.net.ConnectException: connection timed out to http://datanode3.example.com:9000/heartbeat

My /etc/hosts:

[root@namenode1 cloudera-scm-server]# cat /etc/hosts
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1 localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.2.200 namenode1.example.com namenode1
192.168.2.201 datanode1.example.com datanode1
192.168.2.202 datanode2.example.com datanode2
192.168.2.203 datanode3.example.com datanode3
[root@namenode1 cloudera-scm-server]#

Note: I have disabled the firewalld service on all the hosts. Am I missing something? I have tried plenty of installs and uninstalls, so I am waiting for any valuable input. Cloudera Community Team, please let me know if you need any more information to resolve this issue.
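A "No route to host" on the heartbeat port usually means something between the hosts is rejecting the connection (leftover iptables rules are a common cause even with firewalld disabled) or the agent is not listening. A checklist sketch, using the hosts and port from the log above (service and tool names are the usual CentOS 7 ones; adjust as needed):

```shell
# Hosts and agent heartbeat port taken from the log messages above.
HOSTS="datanode1.example.com datanode2.example.com datanode3.example.com"
PORT=9000

for h in $HOSTS; do
  echo "check $h:$PORT"
done

# On each datanode (run as root):
#   systemctl status cloudera-scm-agent   # is the agent up?
#   ss -ltn | grep :9000                  # is it bound to the port?
#   iptables -L -n                        # raw rules can survive firewalld being off
# From the Cloudera Manager host:
#   nc -zv datanode1.example.com 9000     # can the port actually be reached?
```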
01-28-2017
07:54 PM
I am able to submit an Oozie workflow that sqoops a MySQL table (retail_db.orders) into an HDFS location as an Avro data file, and the job completes successfully. However, I am unable to locate the .avsc (schema) file in the Oozie location. Any idea where those supporting files (.avsc, .java) are saved for jobs submitted through Oozie? Whenever we run a Sqoop import from the CLI, the corresponding .avsc and .java files are created in the directory from which the import was invoked.
01-28-2017
07:44 PM
Sqoop version: Sqoop 1.4.5-cdh5.4.2
CDH QuickStart VM: cdh5.4.2
Directory structure: /home/cloudera/review/sqoop
01-28-2017
07:39 PM
By the way, I have cleared the CCA175 Exam 🙂
01-23-2017
02:44 PM
Hello, I have created a Sqoop job:

[cloudera@quickstart sqoop]$ sqoop job --create orders_import -- import --connect "jdbc:mysql://${hostname}:3306/retail_db" --table orders --target-dir /user/cloudera/cca175/review/sqoop/orders --username root --password cloudera -m 5

I triggered the job above and it completed successfully. But when I tried to override it to import the data in Avro format into a new directory, it threw an exception:

[cloudera@quickstart sqoop]$ sqoop job --exec orders_import -- --delete-target-dir --target-dir /user/cloudera/cca175/review/sqoop/orders_as_avro --as-avrodatafile
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/01/23 14:36:33 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.8.0
Enter password:
17/01/23 14:36:38 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/01/23 14:36:38 INFO tool.CodeGenTool: Beginning code generation
17/01/23 14:36:38 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM orders AS t LIMIT 1
17/01/23 14:36:38 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM orders AS t LIMIT 1
17/01/23 14:36:38 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-cloudera/compile/da508ace8c9dcaa1eae5a2005fb413db/orders.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/01/23 14:36:41 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/da508ace8c9dcaa1eae5a2005fb413db/orders.jar
17/01/23 14:36:41 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/01/23 14:36:41 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/01/23 14:36:41 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/01/23 14:36:41 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/01/23 14:36:41 INFO mapreduce.ImportJobBase: Beginning import of orders
17/01/23 14:36:41 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/01/23 14:36:42 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/01/23 14:36:42 INFO client.RMProxy: Connecting to ResourceManager at quickstart.cloudera/127.0.0.1:8032
17/01/23 14:36:43 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://quickstart.cloudera:8020/user/cloudera/cca175/review/sqoop/orders already exists
17/01/23 14:36:43 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://quickstart.cloudera:8020/user/cloudera/cca175/review/sqoop/orders already exists
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:270)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:203)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:176)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:273)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
    at org.apache.sqoop.tool.JobTool.execJob(JobTool.java:213)
    at org.apache.sqoop.tool.JobTool.run(JobTool.java:268)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
[cloudera@quickstart sqoop]$

Is there any limitation on overriding parameters when executing a saved Sqoop job? Note: I am using the CDH QuickStart VM.
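From the trace, the run still tried to write to the job's stored --target-dir, so the overrides after -- apparently never took effect. A sketch of two workarounds, reusing the paths and options from the post above (untested against this exact Sqoop build):

```shell
# Directory the failed run complained about, and the intended new one.
OLD_DIR=/user/cloudera/cca175/review/sqoop/orders
NEW_DIR=/user/cloudera/cca175/review/sqoop/orders_as_avro
echo "old: $OLD_DIR"
echo "new: $NEW_DIR"

# Workaround 1: clear the old output directory, then re-exec the job:
#   hadoop fs -rm -r /user/cloudera/cca175/review/sqoop/orders
# Workaround 2: bake the Avro options into a second saved job instead
# of overriding at exec time:
#   sqoop job --create orders_import_avro -- import \
#     --connect "jdbc:mysql://${hostname}:3306/retail_db" \
#     --table orders --as-avrodatafile \
#     --target-dir /user/cloudera/cca175/review/sqoop/orders_as_avro \
#     --username root -P -m 5
```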
01-21-2017
04:00 PM
Hello All, I am preparing for the CCA175 Spark and Hadoop Developer certification. I would just like to confirm whether the CDH VM environment has hive-hcatalog-core.jar. I am planning to use this jar if a situation requires creating a Hive table for a given .json data file.
01-21-2017
03:56 PM
I tried to execute the following DDL:

CREATE TABLE employee_exp_json (
  id INT,
  fname STRING,
  lname STRING,
  profession STRING,
  experience INT,
  exp_service STRING
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
STORED AS TEXTFILE;

But got an error:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Cannot validate serde: org.apache.hive.hcatalog.data.JsonSerDe

Some posts say I need to add hive-hcatalog-core.jar, but I am unable to locate that jar in the CDH QuickStart VM 5.4.2.0. It would be great if you could suggest how to fix this.
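A sketch of locating and registering the SerDe jar. The path below is where it typically lives on CDH installs, so treat it as an assumption and search for the jar first if it is not there:

```shell
# Search for the jar anywhere on the VM (slow but thorough):
#   find / -name 'hive-hcatalog-core*.jar' 2>/dev/null
# Typical CDH location (an assumption; verify it exists on your VM):
HCAT_JAR=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar
echo "ADD JAR $HCAT_JAR;"
# Then, inside the hive shell before running the CREATE TABLE:
#   hive> ADD JAR /usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar;
```

Once the jar is on the session classpath, Hive should be able to validate org.apache.hive.hcatalog.data.JsonSerDe.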
12-19-2015
07:01 PM
1 Kudo
Hello All - Today I successfully installed Docker on my laptop, then executed the pull command, and the pull was also successful:

$ docker pull cloudera/quickstart:latest
$ docker images
REPOSITORY            TAG     IMAGE ID      CREATED      VIRTUAL SIZE
cloudera/quickstart   latest  7e0ff6dfbec6  2 weeks ago  6.213 GB

But when I executed the docker run command, I received this error:

$ docker run --hostname=quickstart.cloudera --privileged=true -t -i cloudera/quickstart /usr/bin/docker-quickstart
exec: "C:/Program Files/Git/usr/bin/docker-quickstart": stat C:/Program Files/Git/usr/bin/docker-quickstart: no such file or directory
Error response from daemon: Cannot start container 2cd2aa170885fe4ff33c34fac3ea317a3b3151cc7140575f92a218d657ad8a6d: [8] System error: exec: "C:/Program Files/Git/usr/bin/docker-quickstart": stat C:/Program Files/Git/usr/bin/docker-quickstart: no such file or directory

However, the following commands successfully return results:

$ docker run cloudera/quickstart which hadoop
/usr/bin/hadoop
$ docker run cloudera/quickstart hadoop version
Hadoop 2.6.0-cdh5.5.0
Subversion http://github.com/cloudera/hadoop -r fd21232cef7b8c1f536965897ce20f50b83ee7b2
Compiled by jenkins on 2015-11-09T20:37Z
Compiled with protoc 2.5.0
From source with checksum 98e07176d1787150a6a9c087627562c
This command was run using /usr/jars/hadoop-common-2.6.0-cdh5.5.0.jar

Any idea why docker run fails when I specify /usr/bin/docker-quickstart, and instead tries to find C:/Program Files/Git/usr/bin/docker-quickstart? Surprisingly, when I executed the same command in the VirtualBox VM that was installed as part of the Docker installation, it ran fine, although "hue" failed. I saw a similar post with the same error, but there was no resolution, so I am creating a separate post.

Thanks,
Gokul
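The mangled path is the giveaway: Git Bash (MSYS) rewrites arguments that look like absolute POSIX paths into Windows paths before the command runs, which is how /usr/bin/docker-quickstart became C:/Program Files/Git/usr/bin/docker-quickstart. Two common workarounds, sketched below (the MSYS_NO_PATHCONV switch applies to Git for Windows):

```shell
# Workaround 1: disable MSYS path conversion for the invocation.
export MSYS_NO_PATHCONV=1
echo "path conversion disabled: MSYS_NO_PATHCONV=$MSYS_NO_PATHCONV"
#   MSYS_NO_PATHCONV=1 docker run --hostname=quickstart.cloudera \
#     --privileged=true -t -i cloudera/quickstart /usr/bin/docker-quickstart

# Workaround 2: double the leading slash so MSYS leaves the path alone.
#   docker run --hostname=quickstart.cloudera --privileged=true -t -i \
#     cloudera/quickstart //usr/bin/docker-quickstart
```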
11-25-2015
08:12 AM
Hello Sean - Thanks for your response. I tried this as the solution, and after restarting my VM and rebooting CentOS I could see the changes! Thanks, Sean!
11-21-2015
06:47 PM
Hello All - I tried to execute a simple WordCount MapReduce job using a Hue Oozie workflow, but ended up with the following error:

Error Code: JA006
Error Message: JA006: Call From quickstart.cloudera/127.0.0.1 to quickstart.cloudera:10020 failed on connection exception: java.net.ConnectException: Connection refused;

But the same job executes successfully from the CLI with the following command:

hadoop jar learning.jar /user/cloudera/Data/FruitCount/fruits.txt /user/cloudera/Data/FruitCount/output

Can you please help if you have faced this scenario?

job.properties:

oozie.use.system.libpath=True
nameNode=hdfs://quickstart.cloudera:8020
jobTracker=quickstart.cloudera:8032

The Java package name is mapReduce and the jar is learning.jar.

workflow.xml:

<workflow-app name="My_Workflow" xmlns="uri:oozie:workflow:0.5">
    <start to="mapreduce-c6a1"/>
    <kill name="Kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="mapreduce-c6a1">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.mapper.class</name>
                    <value>mapReduce.FruitCountMapper</value>
                </property>
                <property>
                    <name>mapred.reducer.class</name>
                    <value>mapReduce.FruitCountReducer</value>
                </property>
                <property>
                    <name>mapred.input.dir</name>
                    <value>/user/cloudera/Data/FruitCount/fruits.txt</value>
                </property>
                <property>
                    <name>mapred.output.dir</name>
                    <value>/user/cloudera/Data/FruitCount/NewOutput</value>
                </property>
            </configuration>
        </map-reduce>
        <ok to="End"/>
        <error to="Kill"/>
    </action>
    <end name="End"/>
</workflow-app>

I also faced the same error when I ran the job from the Oozie CLI; the job -info actions show:

ID                                                   Status        Ext ID                  Ext Status  Err Code
0000001-151121170809805-oozie-oozi-W@mapreduce-c6a1  START_MANUAL  job_1448154472842_0004  RUNNING     JA006
0000001-151121170809805-oozie-oozi-W@:start:         OK            -                       OK          -

Thanks,
Gokul
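Port 10020 in the JA006 message is the MapReduce JobHistory server, which Oozie contacts after launching the MR job, so "Connection refused" suggests that service is not running on the QuickStart VM. A sketch of checking and starting it (the service name is the usual CDH package name; adjust if yours differs):

```shell
# The port from the JA006 error is the JobHistory server's RPC port.
JHS_PORT=10020
echo "expecting JobHistory server on port $JHS_PORT"

# On the VM (run as root):
#   service hadoop-mapreduce-historyserver status
#   service hadoop-mapreduce-historyserver start
#   ss -ltn | grep :10020   # confirm it is now listening
```

This would also explain why the plain `hadoop jar` run succeeds: the CLI does not need the history server to complete, while Oozie's launcher does.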
11-21-2015
05:28 PM
Dear All - I'm new to CDH and have successfully installed CDH 5.4. I tried to change the default background image but was unable to do it. Can you please help if you have faced this issue or scenario? Thanks in advance!