Member since
09-18-2015
3274
Posts
1159
Kudos Received
426
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2144 | 11-01-2016 05:43 PM |
| | 6550 | 11-01-2016 05:36 PM |
| | 4174 | 07-01-2016 03:20 PM |
| | 7127 | 05-25-2016 11:36 AM |
| | 3458 | 05-24-2016 05:27 PM |
03-05-2016
05:35 AM
1 Kudo
@Michael Dennis Uanang I did notice that, and that's why you may have to run that statement.
03-05-2016
03:44 AM
@Nirvana India You can check this through these docs: http://www.cloudera.com/documentation/manager/5-0-x/Cloudera-Manager-Managing-Clusters/cm5mc_sentry_config.html and http://www.cloudera.com/documentation/enterprise/5-2-x/topics/cm_mc_hive_service.html
03-05-2016
03:24 AM
1 Kudo
@Colton Rodgers "Error: Nothing to do" --> I have experienced this when the repo files are not correct. Please make sure that you are using the correct repo.
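As a hedged sketch of the repo check suggested above: a repo file that yum can use needs a `baseurl` and `enabled=1`. The file path, repo ID, and URL below are placeholders for illustration, not values from the thread (a real HDP repo file lives under `/etc/yum.repos.d/`).

```shell
# Illustrative repo file; on a real node this would be /etc/yum.repos.d/hdp.repo
cat > /tmp/hdp.repo <<'EOF'
[HDP-2.4]
name=HDP-2.4
baseurl=http://localhost/hdp/centos7/2.x/updates/2.4.0.0
enabled=1
gpgcheck=0
EOF

# A repo yum can actually use must declare a baseurl and be enabled
grep -q '^baseurl=' /tmp/hdp.repo && grep -q '^enabled=1' /tmp/hdp.repo \
  && echo "repo file looks sane" \
  || echo "repo file is missing baseurl or is disabled"
```

If the file passes this check but yum still reports "Nothing to do", run `yum clean all` and `yum repolist` to confirm the repo is actually being picked up.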
03-05-2016
03:15 AM
@Ali Gouta Schintalapani is a Kafka committer 😉 http://kafka.apache.org/committers.html
03-05-2016
03:10 AM
1 Kudo
@Alan McShane Thanks for sharing this information. Did you get any information on the bug number?
03-05-2016
02:59 AM
@Kyle Prins See this thread: https://community.hortonworks.com/questions/15506/error-cannot-retrieve-repository-metadata-repomdxm.html The above thread may not completely apply to your case, since you are working with a local repo. Please check iptables, and run wget against the file to confirm it actually exists at that URL.
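As a sketch of the wget check suggested above: yum's first fetch from a repo is `<baseurl>/repodata/repomd.xml`, so that exact URL is the one worth testing. The helper name and the baseurl below are assumptions for illustration; substitute your local repo's baseurl.

```shell
# Build the repomd.xml URL that yum will try to fetch first
repomd_url() {
  # ${1%/} strips a trailing slash so we don't emit a double slash
  printf '%s/repodata/repomd.xml\n' "${1%/}"
}

URL=$(repomd_url "http://localrepo.example.com/hdp/centos7/2.x/updates/2.4.0.0")
echo "$URL"
# Then, from the failing node:
#   wget -q --spider "$URL" && echo reachable || echo unreachable
#   iptables -L -n    # confirm the repo host's port 80/443 is not blocked
```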
03-05-2016
02:51 AM
1 Kudo
@Divakar Annapureddy Which user are you running the view as? See this: "We were having the same issue with a user account. I think this happens if the user hasn't invoked the hive shell even once. We asked the user that was having this problem to log in to the hive shell and then try accessing the Hive view through Ambari. As soon as the hive shell was invoked, the error went away. That tells me that something is missing from the user profile which Ambari is looking for if the user hasn't logged into the hive shell even once."
03-05-2016
02:40 AM
@rkanchu Please see this from the technical side: https://community.hortonworks.com/articles/4689/getting-started-with-sas-and-hadoop.html

1) Leveraging HDFS for flat files using the SAS Filename statement. The SAS Filename statement allows a SAS programmer to set up a pointer to an inbound or outbound filesystem directory. With Hadoop, the Filename statement can reference an HDFS directory. Once the file reference is established, it can be used within a SAS Data Step on an Infile or File statement. This enables SAS programmers to read and write flat files to and from HDFS inline within their programs.

2) Leveraging HDFS for SAS libraries using the Libname statement. SAS also implemented the SPDE engine on the Libname statement to support using HDFS to store SAS tables (data sets). Once a library reference is established via the Libname statement, SAS programmers can use the libref on a Data or Set statement within a SAS Data Step, or as input to a SAS procedure. There are minor limitations to this method compared with a standard filesystem for SAS libraries; the SAS documentation provides the details.

3) Accessing HiveServer2 directly. SAS implemented a Libname statement to set up a SAS library reference to HiveServer2, mostly for read access to Hive tables. Once the library reference is established (it uses a JDBC connection), SAS programmers can use HiveServer2 tables within their SAS programs, as input to a SET statement or on a DATA= statement within a SAS procedure. SAS has also implemented dynamic push-down-in-database capabilities that take standard statistical procedures such as Proc Summary, Means, and Freq, generate a complex HiveQL statement on the user's behalf, and send it to HiveServer2 for execution. This allows a significant portion of the math to take place in Hadoop.

4) Executing HDFS, Pig, Hive, and MapReduce inline within a SAS program. SAS created Proc Hadoop, a procedure available with this product, to enable SAS programmers to execute, inline within a SAS program, any HDFS, Pig, Hive, or MapReduce script or program that was created outside of SAS.

I hope you find this information useful as you get started using SAS Access to Hadoop.
03-05-2016
02:37 AM
@S Srinivasa I believe you have mixed multiple questions together here, so I will start with the last comment. 1) Make sure that you can ssh without a password from the Ambari server to all the hosts, and, importantly, to the Ambari server itself: on the Ambari server, run "ssh localhost", and if you cannot log in without a password, run "cd ~/.ssh", then "cat id_rsa.pub >> authorized_keys", then "ssh localhost" again. 2) If you are not using passwordless ssh, then make sure that you install the agents manually and register them with the Ambari server. host=namenode.teg
ERROR: Bootstrap of host namenode.teg fails because previous action finished with non-zero exit code (255)
ERROR MESSAGE: Permission denied (publickey,gssapi-keyex,gssapi-with-mic).
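The authorized_keys fix above can be sketched as follows. This version uses a scratch directory with a fake key so it is safe to run anywhere; on a real Ambari server the directory is `~/.ssh` and the key is `id_rsa.pub`.

```shell
# Scratch directory standing in for ~/.ssh (assumption for illustration)
SSH_DIR=$(mktemp -d)
echo "ssh-rsa AAAAB3FAKEKEY ambari@server" > "$SSH_DIR/id_rsa.pub"

# Append the public key to authorized_keys, as in the answer above
cat "$SSH_DIR/id_rsa.pub" >> "$SSH_DIR/authorized_keys"

# sshd rejects authorized_keys that are group/world writable, so tighten perms
chmod 600 "$SSH_DIR/authorized_keys"

grep -qF "ambari@server" "$SSH_DIR/authorized_keys" && echo "key installed"
```

On a real host, verify with `ssh -o BatchMode=yes localhost true`, which fails instead of prompting if passwordless login is still broken.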
03-05-2016
02:26 AM
@Edgar Daeds FAILED: SemanticException Unable to determine if hdfs://MY_CLUSTER/apps/hive/warehouse/db_name.db/table_name is encrypted: Permission denied: user=my_user (not hive), access=EXECUTE, inode="/apps/hive/warehouse/db_name.db/table_name" Do you have encryption in place? my_user does not have execute (x) permission on that table's directory.
https://community.hortonworks.com/articles/10367/apache-ranger-and-hive-column-level-security.html
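One way to confirm the missing EXECUTE bit is to look at the permission string that `hdfs dfs -ls -d /apps/hive/warehouse/db_name.db/table_name` prints. The sketch below parses such a mode string; the helper name and the sample mode are illustrative assumptions, and it checks the "other" execute bit, which is what applies when my_user is neither the owner nor in the group.

```shell
# Check whether the "other" class has execute on a mode string like drwxr-x---
has_other_x() {
  # character 10 of a 10-char mode string is the "other" execute bit
  [ "$(printf '%s' "$1" | cut -c10)" = "x" ]
}

MODE=drwxr-x---   # e.g. taken from the hdfs dfs -ls -d output
if has_other_x "$MODE"; then
  echo "my_user has execute via 'other'"
else
  echo "no execute for 'other': grant access via HDFS ACLs or Ranger"
fi
```

Also check for extended ACLs with `hdfs dfs -getfacl` and, if encryption zones are in use, `hdfs crypto -listZones` (as the HDFS superuser); granting a targeted ACL or a Ranger policy is preferable to loosening the POSIX bits.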