Member since: 10-28-2016
Posts: 392
Kudos Received: 7
Solutions: 20

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3481 | 03-12-2018 02:28 AM |
| | 5204 | 12-18-2017 11:41 PM |
| | 3648 | 07-17-2017 07:01 PM |
| | 2577 | 07-13-2017 07:20 PM |
| | 8234 | 07-12-2017 08:31 PM |
01-14-2017
01:41 AM
@Neeraj Sabharwal, @Sunile Manjee - any ideas on this?
01-14-2017
01:29 AM
```
curl -iku guest:guest-password -X PUT 'https://sandbox.hortonworks.com:8443/gateway/default/webhdfs/v1/user/guest/knox-sample?op=MKDIRS&permission=777'
```

I made the change highlighted - `gateway/default` instead of `gateway/knox_sample` - and was able to make it work. I'm not sure if that is the correct fix, though: do I need to rename the default Knox gateway (to knox_sample)? I'm new to Knox, so I appreciate your responses.
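In case it helps anyone else hitting this: the path segment after `/gateway/` is the name of a Knox topology file, so `/gateway/default` works because a `default.xml` topology ships with the sandbox, while no `knox_sample.xml` topology exists. Rather than renaming anything, a `knox_sample` topology can be added alongside the default one. A minimal sketch, using a directory under `/tmp` to stand in for the real topologies directory (on an HDP sandbox that is typically `/usr/hdp/current/knox-server/conf/topologies`):

```shell
# Simulate the Knox topology directory; on an HDP sandbox the real
# location is typically /usr/hdp/current/knox-server/conf/topologies.
topo_dir=/tmp/knox-topologies-demo
rm -rf "$topo_dir" && mkdir -p "$topo_dir"
echo '<topology/>' > "$topo_dir/default.xml"

# Each topology file name becomes a URL context:
#   default.xml -> /gateway/default
# To serve /gateway/knox_sample, deploy a topology with that name
# (Knox hot-deploys files dropped into the topologies directory).
cp "$topo_dir/default.xml" "$topo_dir/knox_sample.xml"
ls "$topo_dir"
```

The copied file would of course need the real service definitions from the sandbox's `default.xml`, not the empty placeholder used in this demo.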
01-13-2017
11:28 PM
Hi all, I'm trying to secure HDP using Apache Knox, following the HDP Apache Knox tutorial, and I'm running into the following issue when firing this command:

```
curl -iku guest:guest-password -X PUT 'https://sandbox.hortonworks.com:8443/gateway/knox_sample/webhdfs/v1/user/guest/knox-sample?op=MKDIRS&permission=777'
```

Any ideas on this?
```
[root@sandbox knox-server]# curl -iku guest:guest-password -X put 'https://sandbox.hortonworks.com:8443/gateway/knox_sample/webhdfs/v1/user/guest/knox-sample?op=MKDIRS&permission=777'
HTTP/1.1 404 Not Found
Date: Thu, 12 Jan 2017 06:24:17 GMT
Cache-Control: must-revalidate,no-cache,no-store
Content-Type: text/html; charset=ISO-8859-1
Content-Length: 331
Server: Jetty(9.2.15.v20160210)

<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>
<title>Error 404 </title>
</head>
<body>
<h2>HTTP ERROR: 404</h2>
<p>Problem accessing /gateway/knox_sample/webhdfs/v1/user/guest/knox-sample. Reason:
<pre>    Not Found</pre></p>
<hr /><i><small>Powered by Jetty://</small></i>
</body>
</html>
```
01-12-2017
07:42 PM
Hello - is it possible to have multiple OpenTSDB instances, each pointing to a separate HBase table for storing time series data? E.g. install 1 writes its data to table tsdb, install 2 writes to table tsdb2, and so on. Which files do I need to change to make this work? Essentially, the idea is to segregate the time series data based on functionality. Please let me know your inputs. Thanks!
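Not something verified on this cluster, but OpenTSDB 2.x reads its HBase table names from `opentsdb.conf`, so running one TSD per config file, each with its own tables and port, should give the segregation described above. A sketch for the second instance, assuming the stock 2.x property names (the file path and port number here are made up):

```
# /etc/opentsdb/opentsdb-instance2.conf  (hypothetical path)
tsd.network.port = 4243
tsd.storage.hbase.data_table = tsdb2
tsd.storage.hbase.uid_table = tsdb2-uid
tsd.storage.hbase.meta_table = tsdb2-meta
tsd.storage.hbase.tree_table = tsdb2-tree
```

Each instance would then be started against its own file, e.g. `tsdb tsd --config=/etc/opentsdb/opentsdb-instance2.conf`, and the extra tables created in HBase beforehand the same way the default `tsdb` tables were.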
01-10-2017
07:49 PM
```
HADOOP_CLASSPATH=$(ls -1 /usr/hdp/current/hbase-client/lib/*.jar | tr '\n' ':') hadoop jar mr.jar hbase.labfiles.Exercise5.ProductAnalyzer -libjars $(ls -1 /usr/hdp/current/hbase-client/lib/*.jar | tr '\n' ',')
```

This worked finally! The change highlighted is globbing `lib/*.jar` instead of `lib/*`.
01-10-2017
07:28 AM
Hi all, I'm running a MapReduce job that copies from an HBase table and puts the data into HDFS, and I'm running into an error. Any ideas on this?

Code snippet:

```
public class ProductAnalyzer extends Configured implements Tool {
    .....
    public static void main(String[] args) throws Exception {
        System.out.println(" in main111");
        System.exit(ToolRunner.run(new Configuration(), new ProductAnalyzer(), args));
    }

    public int run(String[] args) throws Exception {
        System.out.println(" In run ");
        String table = "sales_fact";
        String output = "/tmp/exercise5/output1";
        ...
    }
}
```

Command:

```
HADOOP_CLASSPATH=$(ls -1 /usr/hdp/current/hbase-client/lib/* | tr '\n', ':') hadoop jar mr.jar hbase.labfiles.Exercise5.ProductAnalyzer -libjars $(ls -1 /usr/hdp/current/hbase-client/lib/* | tr '\n', ',')
```

Error:

```
WARNING: Use "yarn jar" to launch YARN applications.
 in main111
Exception in thread "main" java.lang.IllegalArgumentException: File name can't be empty string
        at org.apache.hadoop.util.GenericOptionsParser.validateFiles(GenericOptionsParser.java:390)
        at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:299)
        at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:487)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at hbase.labfiles.Exercise5.ProductAnalyzer.main(ProductAnalyzer.java:71)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
```

The program errors out in the call to `System.exit(ToolRunner.run(new Configuration(), new ProductAnalyzer(), args));` in main(); run() is never called. Any ideas?
Labels:
- Apache Hadoop
- Apache HBase
01-10-2017
06:16 AM
@rguruvannagari - actually the file /usr/share/java/ojdbc6.jar does not exist, though the symlink exists. This is the default installation, so I'm wondering how the symlink can exist when the file does not. What needs to be done to fix this - drop the symlink?
01-10-2017
06:10 AM
Hi all, I'm running a MapReduce Java program using the command shown below; it scans an HBase table and puts the data into HDFS. mr.jar has the .class files.

```
HADOOP_CLASSPATH=$(ls -1 /usr/hdp/current/hbase-client/lib/* | tr '\n', ':') yarn jar mr.jar hbase.labfiles.Exercise5.ProductAnalyzer -libjars $(ls -1 /usr/hdp/current/hbase-client/lib/* | tr '\n', ',')
```

This gives the following error:

```
Exception in thread "main" java.io.FileNotFoundException: File /usr/hdp/current/hbase-client/lib/ojdbc6.jar does not exist.
        at org.apache.hadoop.util.GenericOptionsParser.validateFiles(GenericOptionsParser.java:405)
        at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:299)
        at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:487)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at hbase.labfiles.Exercise5.ProductAnalyzer.main(ProductAnalyzer.java:69)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
```

However, the file actually exists:

```
[root@sandbox Exercise5]# ls -lrt /usr/hdp/current/hbase-client/lib/ojdbc6.jar
lrwxrwxrwx 1 root root 26 2016-03-14 14:11 /usr/hdp/current/hbase-client/lib/ojdbc6.jar -> /usr/share/java/ojdbc6.jar
```

Any ideas what the issue is and how to fix it?
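The `ls` output and the stack trace are consistent with each other: the symlink exists, but its target `/usr/share/java/ojdbc6.jar` does not, so anything that resolves the link (as GenericOptionsParser does when validating `-libjars` files) reports the file missing. A quick way to spot dangling links like this, demonstrated on a throwaway directory under `/tmp` (on the sandbox you would point `find` at `/usr/hdp/current/hbase-client/lib` instead):

```shell
# Build a directory containing one good and one dangling symlink.
demo=/tmp/symlink-demo
rm -rf "$demo" && mkdir -p "$demo"
touch "$demo/real.jar"
ln -s "$demo/real.jar" "$demo/good.jar"
ln -s "$demo/missing-target.jar" "$demo/broken.jar"

# With -L, find follows links, so anything still reported as type l
# is a link whose target cannot be resolved, i.e. a dangling one.
find -L "$demo" -type l
```

Removing the dangling link (or excluding it via a `*.jar`-style glob that skips broken entries at build time) keeps it out of the `-libjars` list.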
Labels:
- Apache Hadoop
- Apache HBase
01-05-2017
09:51 PM
The issue is fixed: there was a conflict, since the MapReduce component (which was also added to the classpath) ships a different netty jar from the one used by OpenTSDB. I removed the MapReduce jars from the CLASSPATH to fix the issue.
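For anyone debugging a similar conflict: listing every netty jar visible from each classpath entry usually surfaces the duplicate quickly. A rough sketch, with made-up directory names standing in for the OpenTSDB and MapReduce lib directories:

```shell
# Lay out two classpath directories that each ship their own netty.
rm -rf /tmp/cp-demo
mkdir -p /tmp/cp-demo/tsdb-lib /tmp/cp-demo/mr-lib
touch /tmp/cp-demo/tsdb-lib/netty-3.9.4.Final.jar
touch /tmp/cp-demo/mr-lib/netty-3.6.2.Final.jar
classpath="/tmp/cp-demo/tsdb-lib:/tmp/cp-demo/mr-lib"

# Split the classpath on ':' and list any netty jars per entry;
# more than one line of output means a potential version conflict.
echo "$classpath" | tr ':' '\n' | while read -r dir; do
  ls "$dir"/netty-*.jar 2>/dev/null
done
```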