Member since: 10-01-2018
Posts: 802
Kudos Received: 143
Solutions: 130

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3060 | 04-15-2022 09:39 AM |
| | 2470 | 03-16-2022 06:22 AM |
| | 6536 | 03-02-2022 09:44 PM |
| | 2900 | 03-02-2022 08:40 PM |
| | 1909 | 01-05-2022 07:01 AM |
03-02-2021
07:31 AM
@ryu You should increase the jute buffer on both the client and the server, for example by setting -Djute.maxbuffer=100000000 in hive-env. That value is just an example; the point is to raise it above whatever it is currently. Apply the same change on the ZooKeeper side as well. A minimal sketch follows.
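A minimal sketch of where the property might go, assuming a standard hive-env.sh on the client and zookeeper-env.sh on the server; the value is illustrative, not a recommendation:

```bash
# hive-env.sh (client side): append the property to the existing JVM options;
# 100000000 bytes is illustrative -- size it to your largest znode payload
export HADOOP_OPTS="$HADOOP_OPTS -Djute.maxbuffer=100000000"

# zookeeper-env.sh (server side): SERVER_JVMFLAGS is read by the ZooKeeper
# start scripts, so the server JVM picks the property up on restart
export SERVER_JVMFLAGS="-Djute.maxbuffer=100000000 ${SERVER_JVMFLAGS}"
```

Use the same value on both sides; a mismatch just moves the failure from one end to the other.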
03-02-2021
07:22 AM
@codecracker Take a look at this doc: https://docs.cloudera.com/documentation/data-science-workbench/1-6-x/topics/cdsw_editors_pycharm.html You could also try installing the package at the project level; a rough example is below.
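As a rough sketch, assuming a CDSW session terminal; the package name is only an example:

```bash
# Run inside a CDSW session terminal. Packages installed this way land in the
# project's own filesystem, so they persist across sessions for that project.
pip3 install pandas
```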
03-02-2021
07:17 AM
@sass There may be an issue with the curl version itself, or you can try adding -k to the curl command. Also check out this: https://access.redhat.com/solutions/2989291
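For illustration, -k (--insecure) makes curl skip TLS certificate verification, which helps confirm whether the failure is certificate-related; the host and port here are placeholders:

```bash
# -v prints the TLS handshake details; -k skips certificate verification
curl -k -v https://cm-host.example.com:7183/api/version
```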
03-02-2021
03:55 AM
1 Kudo
@Cluster-CDP I can see bionic parcels as well. Please make sure you are using the correct document and method to install Spark. Follow the docs below:
https://docs.cloudera.com/runtime/7.2.7/cds-3/topics/spark-spark-3-packaging.html
https://docs.cloudera.com/runtime/7.2.7/cds-3/topics/spark-install-spark-3-parcel.html
03-01-2021
10:24 AM
@Alex_IT From my Oracle experience, there are two options for migrating the same ORACLE_HOME [DB] from 12c to 19c. If you are running 12.1.0.2, you have a direct upgrade path (see the attached matrix), and with this option you won't need to change the hostname. The other option is to export your current schemas (CM, Oozie, Hive, Hue, Ranger, etc.), install a fresh Oracle 19c box with an empty database, and import the old schemas. That route can be a challenge, as you might have to rebuild indexes or recompile some database packages, but both are doable; a sketch of the export/import route is below. Hope that helps.
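A rough sketch of the export/import route using Oracle Data Pump; the connect strings, schema names, and directory object are placeholders for your environment:

```bash
# On the 12c source: export the CDH service schemas (names are examples)
expdp system@ORCL12 schemas=SCM,HIVE,HUE,OOZIE,RANGER \
  directory=DP_DIR dumpfile=cdh_schemas.dmp logfile=exp_cdh.log

# On the empty 19c target: import the same schemas from the dump file
impdp system@ORCL19 schemas=SCM,HIVE,HUE,OOZIE,RANGER \
  directory=DP_DIR dumpfile=cdh_schemas.dmp logfile=imp_cdh.log
```

After the import, recompile invalid objects and rebuild any unusable indexes before pointing the services at the new database.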
02-25-2021
12:59 AM
@ThinkBig Upgrades from HDP 3.x to CDP Private Cloud Base are not currently supported. The best approach is to build a new CDP cluster and then migrate the data using Replication Manager or whichever alternative suits you. Refer to the supported upgrade paths: https://docs.cloudera.com/cdp-private-cloud/latest/upgrade/topics/cdpdc-upgrade-paths.html Use of Replication Manager: https://docs.cloudera.com/cdp-private-cloud/latest/data-migration/topics/cdp-data-migration-replication-manager-to-cdp-data-center.html I would also encourage you to contact Cloudera to plan the migration properly, since you already have a subscription.
02-25-2021
12:18 AM
Bumping again; still trying to disable all core dumps for applications that run inside the containers.
02-24-2021
05:42 AM
Thanks @GangWar for the response. Are there any resources you are aware of that might help me use Ambari to manage Hadoop components after installing them from open source? Also, is it possible to upgrade the Hadoop components of an existing HDP cluster via open source?
02-24-2021
01:48 AM
@GangWar I know the steps to install Big SQL on CDH, but I am looking for the correct installer, as I couldn't find it. I'm looking for the installer (bin file or rpm files) ...
02-24-2021
12:20 AM
@sass You can check the supported database versions below and then choose one to upgrade to. https://docs.cloudera.com/documentation/enterprise/release-notes/topics/rn_consolidated_pcm.html#cdh_cm_supported_db The packages you listed are SQL client packages, not the database server itself, so upgrading the DB server should take care of those dependencies. The CM server just needs a database running in a healthy state with its configuration set correctly; a quick check is sketched below.
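As a quick post-upgrade sanity check, assuming a MySQL-backed Cloudera Manager; the host and user are placeholders:

```bash
# Confirm the upgraded server is up and reachable with the CM credentials
mysql -h db-host.example.com -u scm -p -e "SELECT VERSION();"
```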