Member since: 07-21-2014
Posts: 141
Kudos Received: 8
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2179 | 02-01-2017 04:49 PM
 | 1395 | 01-15-2015 01:57 PM
 | 2198 | 01-05-2015 12:59 PM
05-02-2017
06:29 PM
Thanks, this is very useful. How would one go about getting the application name? Is it the app name or the app ID or something else? Thanks!
05-01-2017
02:54 PM
Thanks for the update.

> Are you looking for support on a particular environment?

Yes, looking for Debian (Wheezy or Jessie) support.
04-28-2017
12:54 PM
Trying to understand the requirements for CDSW, which seems to be available only on RHEL/CentOS: https://www.cloudera.com/documentation/data-science-workbench/latest/topics/cdsw_requirements_supported_versions.html#operating_system_req But Docker containers are supported across other Linux distributions. Are there plans to make CDSW available on other distributions?
Labels:
- Cloudera Manager
04-12-2017
03:31 PM
I got 5 clusters managed by CM and everything is on the latest CDH 5.10. In order to evaluate the Java upgrade, I would like to test the upgrade on a test cluster and minimize the issues when upgrading the prod clusters.
04-12-2017
02:39 PM
Thanks for the response. Since the CM host manages the clusters, I wanted to check: can the hosts of one of the clusters (running the services) be on, say, Java 8 while the CM host is still on Java 7?
04-12-2017
12:46 PM
Is it possible to have the CM host running a different version of Java than all the hosts of a cluster? Thanks!
Labels:
- Manual Installation
04-04-2017
11:24 AM
Can anyone weigh in on the possibility of running the cluster hosts, except the Cloudera Manager host, on JDK 1.8? Thanks!
03-17-2017
08:01 PM
I run into this error when I try to start the NodeManager:

~~~~
Error starting NodeManager
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Recieved SHUTDOWN signal from Resourcemanager ,Registration of NodeManager failed, Message from ResourceManager: Disallowed NodeManager from <host>, Sending SHUTDOWN signal to the NodeManager.
    at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.serviceStart(NodeStatusUpdaterImpl.java:215)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
    at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceStart(NodeManager.java:288)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.yarn.server.nodemanager.NodeManager.initAndStartNodeManager(NodeManager.java:522)
    at org.apache.hadoop.yarn.server.nodemanager.NodeManager.main(NodeManager.java:568)
Caused by: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Recieved SHUTDOWN signal from Resourcemanager ,Registration of NodeManager failed, Message from ResourceManager: Disallowed NodeManager from <host>, Sending SHUTDOWN signal to the NodeManager.
    at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.registerWithRM(NodeStatusUpdaterImpl.java:283)
    at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.serviceStart(NodeStatusUpdaterImpl.java:209)
~~~~

Any idea how to overcome this error and start the NodeManagers? Thanks!
Labels:
- Manual Installation
03-17-2017
10:25 AM
2 Kudos
While attempting to apply a host template, I keep running into this error/notification: "Host must have a single version of CDH installed". I didn't notice multiple CM agents running on the host; is there anything else I need to check? Thanks!
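In case it helps narrow it down, here is a hypothetical pair of checks for leftover CDH bits on the host (the paths assume a parcel install on RHEL/CentOS, and the package names are just what I'd grep for):

~~~~
# Hypothetical checks for multiple CDH versions on one host
ls /opt/cloudera/parcels/              # parcel installs: more than one CDH-* directory activated or left behind?
rpm -qa | grep -i -e cdh -e hadoop     # package installs: CDH packages left over from an older install?
~~~~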
Labels:
- Cloudera Manager
- Manual Installation
03-16-2017
03:35 PM
CDH and Cloudera Manager requirements are mentioned here: https://www.cloudera.com/documentation/enterprise/release-notes/topics/rn_consolidated_pcm.html

> "... Cloudera does not support mixed environments, all nodes in your cluster must be running the same major JDK version."

If I want to run the hosts of the cluster on JDK 8, can I still have CM running on JDK 7? Thanks!
03-12-2017
11:24 PM
I currently have my CDH 5.9 cluster on JDK 1.7 and wanted to know if it's safe to upgrade the hosts to JDK 1.8. Also, is it possible to keep Cloudera Manager on JDK 1.7 while the cluster is upgraded to JDK 1.8? Thanks!
Labels:
- Manual Installation
02-02-2017
12:08 PM
How does one go about deploying a Spark streaming job on a CDH cluster? Does the job need to be deployed as a CSD parcel? Thanks!
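For what it's worth, a Spark streaming application is normally submitted with spark-submit rather than packaged as a CSD or parcel (those are generally for distributing services and binaries, not individual jobs). A minimal sketch, where the class and jar names are hypothetical:

~~~~
# Minimal sketch: submit a Spark streaming application to YARN (class and jar names are hypothetical)
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyStreamingJob \
  /path/to/my-streaming-job.jar
~~~~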
Labels:
- Apache Spark
02-01-2017
04:49 PM
OK, I do notice the CDH 5.10 parcel now, and it requires Cloudera Manager to be updated before updating the CDH parcel.
02-01-2017
04:35 PM
Is this still the right way to configure the repo URL and get the latest CDH when available: http://archive.cloudera.com/cdh5/parcels/{latest_supported}/

I don't seem to find CDH 5.10 via Cloudera Manager. If I look at the 'latest' directory, I notice the CDH version is 5.3.10: http://archive.cloudera.com/cdh5/parcels/latest/

Please let me know if the repo URL needs to be updated. Thanks!
Labels:
- Cloudera Manager
01-16-2017
03:48 PM
Is there an R package/parcel available to install across the CDH cluster? Thanks!
01-12-2017
04:24 PM
Given the size of the dataset, I believe the data already fits in memory, so caching is not providing any additional performance improvement. Thanks!
01-11-2017
08:20 PM
I have a table of size ~1 GB and I tried to set up HDFS caching as described in the 'Using HDFS Caching with Impala' doc, with a replication factor of 2. I noticed that the queries seem to perform better without HDFS caching. I'm using CDH 5.8.2. Is there anything I might be missing, or anything I can check to understand why that is the case? Thanks!
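For reference, this is roughly the setup being described, sketched with hypothetical pool and table names: the cache pool is created with hdfs cacheadmin, the table is marked as cached from Impala, and the cacheadmin listings show whether the data actually got cached.

~~~~
# Sketch of the HDFS caching setup described above (pool and table names are hypothetical)
hdfs cacheadmin -addPool impala_pool                   # create a cache pool
hdfs cacheadmin -listPools -stats                      # confirm the pool and its limits
impala-shell -q "ALTER TABLE my_table SET CACHED IN 'impala_pool' WITH REPLICATION = 2"
hdfs cacheadmin -listDirectives -stats                 # compare BYTES_NEEDED vs BYTES_CACHED
~~~~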
Labels:
- Apache Impala
10-31-2016
09:47 PM
Thanks for the tutorial. Are there any workflow examples of how to handle update and delete transactions and how to replicate them into HBase/Phoenix?
02-22-2016
02:46 PM
Thanks srowen! I did notice that I'm able to import 'spark.graphx._', but in the known issues for CDH 5.5 it's marked as not supported, so I was checking: http://www.cloudera.com/documentation/enterprise/latest/topics/cdh_rn_spark_ki.html

> "GraphX not supported. Cloudera does not support GraphX."

It probably means it's not supported in these forums 🙂
02-22-2016
02:32 PM
Is GraphX available in the latest CDH distro? In general, how do I check whether a specific package or library is available in CDH? Thanks!
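On the general 'how to check' part, one hedged approach for a parcel install (the paths below assume the default parcel location) is to look for the classes inside the Spark assembly jar that ships with CDH:

~~~~
# Assumes a parcel-based install under the default /opt/cloudera/parcels path
find /opt/cloudera/parcels/CDH/ -name 'spark-assembly*.jar'       # locate the Spark assembly jar
unzip -l "$(find /opt/cloudera/parcels/CDH/ -name 'spark-assembly*.jar' | head -1)" | grep -i graphx
~~~~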
09-16-2015
10:21 PM
I got the Oozie job to run after adding all the args under Parameters. Thanks!
09-16-2015
09:56 PM
Thanks Harsha! I'm using the Oozie editor in Hue 3.6 and do not see any option to add 'args'. I attempted to use 'parameters' but upon submitting got an exception about invalid XML. Please let me know how to add the args.
09-16-2015
06:43 PM
I've configured an Oozie workflow to run this Sqoop job:

~~~~
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
    <job-tracker>my-node:8032</job-tracker>
    <name-node>hdfs://my-node:8020</name-node>
    <command>import --connect "jdbc:mysql://<url>" --username "dummy" --password "pwd" --query "SELECT col1, col2, CAST(REPLACE(SUBSTR(datecol,1,7), '-', '') AS UNSIGNED) as dateyyymm FROM src_table WHERE \$CONDITIONS" --target-dir /user/hive/warehouse/hive_table --fetch-size 0 --hive-import --hive-drop-import-delims --hive-table hive_table -m1</command>
    <file>/tmp/hive-site.xml#hive-site.xml</file>
</sqoop>
~~~~

The Sqoop command works fine if run from the command prompt, but I keep running into these errors via Oozie:

~~~~
[uber-SubtaskRunner] ERROR org.apache.sqoop.tool.BaseSqoopTool - Error parsing arguments for import:
[uber-SubtaskRunner] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: col1
[uber-SubtaskRunner] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: col2
[uber-SubtaskRunner] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: CAST(REPLACE(SUBSTR(datecol,1,7),
[uber-SubtaskRunner] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: '-',
[uber-SubtaskRunner] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: '')
[uber-SubtaskRunner] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: AS
[uber-SubtaskRunner] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: UNSIGNED)
[uber-SubtaskRunner] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: as
[uber-SubtaskRunner] ERROR org.apache.sqoop.tool.BaseSqoopTool - Unrecognized argument: dateyyymm
...
~~~~
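For context on the fix mentioned in the later posts above: the sqoop action's <command> element is split on whitespace, so the quoting inside --query is not preserved. The same action can pass one token per <arg> element instead, roughly like the sketch below (connection details and the full query are elided as in the original, and the \$ escaping from the command-line version shouldn't be needed since no shell is involved):

~~~~
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
    <job-tracker>my-node:8032</job-tracker>
    <name-node>hdfs://my-node:8020</name-node>
    <arg>import</arg>
    <arg>--connect</arg>
    <arg>jdbc:mysql://...</arg>
    <arg>--username</arg>
    <arg>dummy</arg>
    <arg>--password</arg>
    <arg>pwd</arg>
    <arg>--query</arg>
    <arg>SELECT col1, col2, ... FROM src_table WHERE $CONDITIONS</arg>
    <arg>--target-dir</arg>
    <arg>/user/hive/warehouse/hive_table</arg>
    <!-- the remaining options (--fetch-size 0, --hive-import, --hive-drop-import-delims,
         --hive-table hive_table, -m1) are passed the same way, one <arg> per token -->
    <file>/tmp/hive-site.xml#hive-site.xml</file>
</sqoop>
~~~~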
Labels:
- Apache Oozie
06-19-2015
07:56 AM
Thanks for the info. I did 'Deploy Client Configuration' but still do not see hive-site.xml updated with the configuration I provided in the safety valve field via CM. Please let me know if there is any other way to verify the deployment. Thanks!
06-19-2015
02:03 AM
I'm facing the same error as well when I add this configuration to Hue's hue_safety_valve.ini via CM:

~~~~
[hbase]
hbase_clusters=(Cluster|some-hbase-thrift-server:9090)
~~~~

* I've verified that 'Enable HBase Thrift Server Framed Transport' is unchecked
* thrift_transport is set to 'buffered' in Hue

Is there a way to download the hbase-site.xml via CM? Please let me know if there is any other config to update. Thanks!
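On the question of downloading hbase-site.xml via CM: if I remember correctly, the CM REST API exposes a per-service clientConfig endpoint, something along these lines (the host, port, credentials, API version, and cluster/service names below are all assumptions):

~~~~
# Assumed CM API call: fetch the HBase client configuration bundle (should contain hbase-site.xml)
curl -u admin:admin \
  "http://cm-host:7180/api/v10/clusters/Cluster%201/services/hbase/clientConfig" \
  -o hbase-clientconfig.zip
~~~~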
06-19-2015
12:21 AM
I've updated the configuration for Hive using CM under "Hive Service Advanced Configuration Snippet (Safety Valve) for hive-site.xml", providing this configuration:

~~~~
<property>
    <name>hbase.zookeeper.quorum</name>
    <value>some.zookeeper.node</value>
</property>
~~~~

After restarting the Hive service, I don't see this config in the /etc/hive/conf.cloudera.hive/hive-site.xml config file. It seems to already have this property set to some other value. Does adding a safety valve with an existing property overwrite the value in hive-site.xml or append to it? Thanks!
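As a quick way to verify what actually got deployed, a hypothetical one-liner using the client config path mentioned above:

~~~~
# Show the value currently deployed for hbase.zookeeper.quorum in the Hive client config
grep -A1 'hbase.zookeeper.quorum' /etc/hive/conf.cloudera.hive/hive-site.xml
~~~~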
04-20-2015
03:35 PM
I need to process nested JSON. How did you go about mapping the fields to the nested AVRO schema?
03-12-2015
04:11 PM
Sorry, was a copy/paste error. I did have a comma between pagename and year but got the error I pasted. Thanks!
03-12-2015
04:05 PM
I have a 'log' table which is currently partitioned by year, month, and day. I'm looking to create a partitioned view on top of the 'log' table but am running into this error:

~~~~
hive> CREATE VIEW log_view PARTITIONED ON (pagename,year,month,day) AS SELECT pagename year,month,day,uid,properties FROM log;
FAILED: SemanticException [Error 10093]: Rightmost columns in view output do not match PARTITIONED ON clause
~~~~

What's the right way to create a partitioned view? I'm using Hive 0.13 in CDH 5.3.2. Thanks!
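For what it's worth, that error message is about column ordering: Hive expects the columns listed in PARTITIONED ON to be the rightmost columns of the view's SELECT list. A sketch of the same view with the columns reordered, assuming uid and properties are the only non-partition columns needed:

~~~~
hive> CREATE VIEW log_view PARTITIONED ON (pagename, year, month, day)
    > AS SELECT uid, properties, pagename, year, month, day FROM log;
~~~~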
01-15-2015
01:57 PM
Thanks Joey, looks like "/usr/bin/flume-ng agent ... -Duser.home=/user/xyz -c /path/to/jar1:/path/to/jar2" seems to have done the trick by prepending to the java.class.path.