Member since: 02-02-2016
Posts: 583
Kudos Received: 518
Solutions: 98
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3274 | 09-16-2016 11:56 AM
 | 1375 | 09-13-2016 08:47 PM
 | 5482 | 09-06-2016 11:00 AM
 | 3182 | 08-05-2016 11:51 AM
 | 5261 | 08-03-2016 02:58 PM
06-08-2016
01:18 PM
@Washington Nascimento Thanks for confirming. Please accept the answer to close this thread.
06-08-2016
12:51 PM
5 Kudos
@Eric Periard You don't need to restart any service; you can force a failover with the command below.
hdfs haadmin -failover
See http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/HDFSCommands.html#haadmin for the full syntax.
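A full invocation might look like the following sketch. The service IDs `nn1` and `nn2` are hypothetical placeholders; the real values come from `dfs.ha.namenodes.<nameservice>` in your cluster's hdfs-site.xml.

```shell
# Force a manual failover from the currently active NameNode (nn1)
# to the standby (nn2). nn1/nn2 are illustrative service IDs.
hdfs haadmin -failover nn1 nn2

# Verify which NameNode is now active.
hdfs haadmin -getServiceState nn2
```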
06-08-2016
11:39 AM
2 Kudos
@sankar rao
To install the R dependency packages, run the following from the R shell:
install.packages(c("rJava", "Rcpp", "RJSONIO", "bitops", "digest", "functional", "stringr", "plyr", "reshape2", "dplyr", "R.methodsS3", "caTools", "Hmisc"))
The RHadoop packages must be downloaded and installed from source. Download rhdfs, rhbase, rmr2, and plyrmr from https://github.com/RevolutionAnalytics/RHadoop/wiki and install them:
install.packages("<path>/rhdfs_<version>.tar.gz", repos=NULL, type="source")
install.packages("<path>/rmr2_<version>.tar.gz", repos=NULL, type="source")
install.packages("<path>/plyrmr_<version>.tar.gz", repos=NULL, type="source")
install.packages("<path>/rhbase_<version>.tar.gz", repos=NULL, type="source")
Here is a good doc link for R on Hadoop: http://www.rdatamining.com/big-data/r-hadoop-setup-guide
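Before loading rhdfs and rmr2 in an R session, the Hadoop environment variables they read usually need to be set. A minimal sketch, assuming an HDP-style layout (both paths are illustrative and should be adjusted to your installation):

```shell
# Illustrative paths; adjust to where hadoop and the streaming jar
# actually live on your cluster.
export HADOOP_CMD=/usr/bin/hadoop
export HADOOP_STREAMING=/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar
```

With these exported, `library(rhdfs); hdfs.init()` should find the Hadoop binaries.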
06-07-2016
10:03 PM
I still believe this can be fixed with different quoting; since I don't have SAP or the SAP driver installed, I can't test multiple variations locally. Please give the syntax below another try.
"SCHEMA.\"/BI0/TCUSTOMER\""
06-07-2016
09:05 PM
2 Kudos
@Josh Persinger Please escape the table name like this; it should resolve the issue:
--table "TestSchema\".\"/BIC/AZDSOV1_100"
This is similar to the bug https://issues.apache.org/jira/browse/SQOOP-1722
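In context, a full Sqoop command might look like the sketch below. The JDBC URL, username, and target directory are placeholders, not values from this thread; only the `--table` escaping is the point.

```shell
# Sketch: the escaped table name keeps the embedded "/" characters
# from breaking Sqoop's identifier quoting. Connection details are
# placeholders -- substitute your own.
sqoop import \
  --connect "jdbc:sap://<sap-host>:<port>" \
  --username <user> -P \
  --table "TestSchema\".\"/BIC/AZDSOV1_100" \
  --target-dir /user/<user>/azdsov1_100
```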
06-07-2016
03:08 PM
2 Kudos
@Washington Nascimento Try one of the two commands below. Either export the truststore for all Hadoop clients:
export HADOOP_OPTS=" -Djavax.net.ssl.trustStore=/home/user/truststore-xxxxx.jks"
or pass it directly on the Sqoop command line:
sqoop import -Dmapred.child.java.opts=" -Djavax.net.ssl.trustStore=/home/user/truststore-xxxxx.jks"
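The first approach can be sanity-checked before launching Sqoop; a minimal sketch, reusing the example truststore path from above:

```shell
# Export the truststore option, then confirm it is visible in the
# environment that sqoop will inherit.
export HADOOP_OPTS="-Djavax.net.ssl.trustStore=/home/user/truststore-xxxxx.jks"
echo "$HADOOP_OPTS"
```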
06-07-2016
01:40 PM
4 Kudos
@Kaliyug Antagonist If prod doesn't have internet access, I would recommend setting up a local repo; it is the easier option. Please follow the docs below.
Download the .tar files for the HDP and HDP-UTILS repos: https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.1.0/bk_Installing_HDP_AMB/content/_hdp_24_repositories.html
For creating the local repo: https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.1.0/bk_Installing_HDP_AMB/content/_getting_started_setting_up_a_local_repository.html
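The core of the local-repo setup can be sketched as follows. The web root and tarball names are illustrative (they depend on the OS and HDP version you downloaded); the linked docs have the authoritative steps.

```shell
# Sketch: serve the downloaded HDP tarballs from an existing web server.
# Paths and tarball names are illustrative -- match them to your downloads.
mkdir -p /var/www/html/hdp
tar -xzvf HDP-<version>-centos6-rpm.tar.gz -C /var/www/html/hdp
tar -xzvf HDP-UTILS-<version>-centos6.tar.gz -C /var/www/html/hdp
# Then point Ambari's repo Base URL at, e.g.:
#   http://<webserver-host>/hdp/HDP/centos6/2.x/updates/<version>
```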
06-07-2016
12:39 PM
4 Kudos
@David Tam Can you try adding the native library dir to the AMBARI_JVM_ARGS variable and see if that resolves this?
File: /var/lib/ambari-server/ambari-env.sh
-Djava.library.path=/usr/hdp/<version>/hadoop/lib/native
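The edited line in ambari-env.sh might look like the fragment below. The existing flags shown are illustrative; keep whatever options your file already contains and only append the `-Djava.library.path` entry.

```shell
# /var/lib/ambari-server/ambari-env.sh (fragment; existing flags illustrative)
export AMBARI_JVM_ARGS="$AMBARI_JVM_ARGS -Djava.library.path=/usr/hdp/<version>/hadoop/lib/native"
```

Restart ambari-server after the change so the new JVM arguments take effect.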
06-07-2016
12:17 PM
@Saurabh Kumar Is this happening with a single user or with all users? Did you check your Hive impersonation setting?
06-07-2016
11:30 AM
Are you using Ambari to change these parameters, or are you adding them manually?