Member since: 01-19-2017
Posts: 3679
Kudos Received: 632
Solutions: 372

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 912 | 06-04-2025 11:36 PM |
| | 1510 | 03-23-2025 05:23 AM |
| | 744 | 03-17-2025 10:18 AM |
| | 2696 | 03-05-2025 01:34 PM |
| | 1790 | 03-03-2025 01:09 PM |
06-06-2018
07:44 AM
@Praveen Atmakuri For the existing data, you can accomplish this with mysqldump: take a backup of the source database and then load the dump into the Azure MySQL DB.
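A minimal sketch of that flow, assuming a source database named mydb and an Azure Database for MySQL server named myserver (host, user, and database names are placeholders; adjust them to your environment):

# Dump the source database (schema + data) to a file
mysqldump -h source-host -u root -p --databases mydb > mydb_dump.sql

# Load the dump into the Azure MySQL server
# (single-server logins use the user@servername format)
mysql -h myserver.mysql.database.azure.com -u admin_user@myserver -p < mydb_dump.sql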
06-06-2018
06:55 AM
@Praveen Atmakuri There is a better, almost out-of-the-box option if you want to preserve your MySQL setup and avoid migration downtime. See: What is Azure Database for MySQL? Hope that helps
06-06-2018
06:09 AM
@Sriram If you are using DNS, then that should be fine, as hostname resolution is handled automatically. Have you encountered any issues? Here is the Check DNS document.
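A quick sanity check you can run on each host, assuming forward and reverse DNS are set up (the hostname and IP below are placeholders):

# Should print the fully qualified domain name, e.g. node1.example.com
hostname -f

# Forward lookup: the FQDN should resolve to the host's IP
nslookup node1.example.com

# Reverse lookup: the IP should resolve back to the same FQDN
nslookup 10.0.0.11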
06-05-2018
03:40 PM
@Michael Bronson There are 3 ways to avoid updates: 1) disable the ambari and hdp* repos in /etc/yum.repos.d/*.repo (or change their baseurl to a non-existent host), 2) map the repo hostname to an unreachable address in the /etc/hosts file, or 3) just block the public IP of public-repo-1.hortonworks.com. Hope that helps
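A rough sketch of the first two options, assuming the default Ambari/HDP repo file names under /etc/yum.repos.d/ (adjust to the actual file names on your hosts):

# Option 1: disable the ambari and HDP repos so yum skips them
sudo sed -i 's/^enabled=1/enabled=0/' /etc/yum.repos.d/ambari.repo /etc/yum.repos.d/HDP*.repo

# Option 2: point the repo hostname at an unreachable address in /etc/hosts
echo "127.0.0.2  public-repo-1.hortonworks.com" | sudo tee -a /etc/hosts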
06-05-2018
06:04 AM
1 Kudo
@Sriram FQDN is required if you are not using DNS/reverse DNS. See: http://docs.hortonworks.com/HDPDocuments/Ambari-2.2.2.0/bk_Installing_HDP_AMB/content/_check_dns.html You can manage this by maintaining /etc/hosts, but whenever an IP changes in the environment you have to update the entries in the hosts file on every node. FQDN resolution via DNS is recommended for reasons like this: users do not need a local /etc/hosts in their environment to reach the cluster. Hope that helps
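If you do go the /etc/hosts route, the entries would look something like the lines below, kept in sync on every node (host names and IPs here are made up for illustration):

# /etc/hosts on every cluster node (and on the Ambari server)
10.0.0.11  master1.example.com  master1
10.0.0.12  worker1.example.com  worker1
10.0.0.13  worker2.example.com  worker2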
06-04-2018
02:06 PM
@JAy PaTel If that is the case, you might need to script it in advance; see "using Sqoop to fetch many tables in parallel". Hope that helps
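A rough sketch of such a script, assuming a plain text file tables.txt with one table name per line and a MySQL source (connection details and credentials are placeholders):

#!/bin/bash
# Kick off one sqoop import per table in the background, then wait for all of them.
# Keep the list short or add throttling; too many parallel imports can overload the source DB.
while read -r table; do
  sqoop import \
    --connect jdbc:mysql://dbhost:3306/mydb \
    --username myuser \
    --password mypassword \
    --table "$table" \
    --hive-import \
    --hive-table "$table" &
done < tables.txt
wait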
06-04-2018
01:48 PM
@JAy PaTel Why go the lengthy, tedious way of excluding 98 tables when you can just mention the 2 tables you want imported?
06-04-2018
01:46 PM
@shrinivas acharya Here is a valid command to import the emp table from MySQL into the emp table in Hive:

sqoop import \
  --connect jdbc:mysql://localhost:3306/blp \
  --username root \
  --password Root@123 \
  --table emp \
  --target-dir /blp/test/emp \
  --fields-terminated-by "," \
  --hive-import \
  --create-hive-table \
  --hive-table emp

Please revert
06-04-2018
01:11 PM
@JAy PaTel When you want to import ONLY a subset of the database, you don't use "import-all-tables". Note that --table accepts a single table name, so run one import per table:

$ sqoop import \
  --connect "jdbc:sqlserver://<HOST>:<port>;databasename=<mssql_database_name>" \
  --username foo \
  --password foo \
  --table mssql_table1 \
  --hive-import

Then repeat the same command with --table mssql_table2. Please try the above method and revert
06-04-2018
12:15 PM
@shrinivas acharya Can you open this link http://hdpcluster2.blpclean.com:8088/cluster/app/application_1528094305283_0011 and copy the error message from the logs? Is SSL enabled?