Member since: 05-22-2017
Posts: 126
Kudos Received: 16
Solutions: 14
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2471 | 02-07-2019 11:03 AM |
| | 6649 | 08-09-2018 05:08 AM |
| | 1256 | 07-06-2018 07:51 AM |
| | 3168 | 06-22-2018 02:28 PM |
| | 3199 | 05-29-2018 01:14 PM |
09-17-2020 12:49 AM
You can use the public Hortonworks repo: https://repo.hortonworks.com/content/groups/public/. You may not find the exact version you mentioned, but you can browse the repo and pick the dependency that matches your cluster version. You can try the dependency below:

<dependency>
    <groupId>com.hortonworks.shc</groupId>
    <artifactId>shc-core</artifactId>
    <version>1.1.0.3.1.5.0-152</version>
</dependency>

Let me know if it works. It should be compatible.
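For Maven to resolve that artifact, the Hortonworks public repo also needs to be declared in the POM; a minimal sketch (the id and name are arbitrary labels, only the URL matters):

```xml
<repositories>
  <repository>
    <!-- arbitrary id/name; URL is the Hortonworks public group -->
    <id>hortonworks-public</id>
    <name>Hortonworks Public</name>
    <url>https://repo.hortonworks.com/content/groups/public/</url>
  </repository>
</repositories>
```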
02-07-2019 11:03 AM
You can copy hbase-site.xml into a directory of your own and make the changes in that copy. Then export the variable below and launch sqlline:

export HBASE_CONF_DIR=<new directory where you copied hbase-site.xml>
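A sketch of the full sequence (the directory, config location, and ZooKeeper quorum are placeholders; adjust for your cluster):

```sh
# copy the cluster config into a private directory and edit the copy
mkdir -p ~/hbase-conf
cp /etc/hbase/conf/hbase-site.xml ~/hbase-conf/

# point the client at the modified config, then launch sqlline
export HBASE_CONF_DIR=~/hbase-conf
/usr/hdp/current/phoenix-client/bin/sqlline.py <zk-host>:2181:/hbase
```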
08-09-2018 05:59 AM
Below are the high-level requirements for connecting to a secure HBase cluster:

- hbase-client dependency
- HBase config files (hbase-site.xml, hdfs-site.xml)
- Kerberos config file (krb5.conf) and a keytab for the user

The POM file and sample code are given below.

Java class (change the paths for the config files and the Kerberos-related parameters):

package com.hortonworks.hbase;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.security.UserGroupInformation;

public class HBaseConnection {

    public static void main(String[] args) throws IOException {
        // System properties (change paths/properties according to your env);
        // copy krb5.conf from the cluster
        System.setProperty("java.security.krb5.conf", "/Users/schhabra/krb5.conf");
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "true");

        // Configuration (change paths/properties according to your env);
        // copy hbase-site.xml and hdfs-site.xml from the cluster and point to them here
        Configuration configuration = HBaseConfiguration.create();
        configuration.set("hadoop.security.authentication", "kerberos");
        configuration.addResource(new Path("file:///Users/schhabra/hbase-site.xml"));
        configuration.addResource(new Path("file:///Users/schhabra/hdfs-site.xml"));
        UserGroupInformation.setConfiguration(configuration);

        // User information (change principal and keytab according to your env)
        UserGroupInformation.loginUserFromKeytab("ambari-qa-c1201@HWX.COM",
                "/Users/schhabra/smokeuser.headless.keytab");

        // Connection
        Connection connection =
                ConnectionFactory.createConnection(HBaseConfiguration.create(configuration));
        System.out.println(connection.getAdmin()
                .isTableAvailable(TableName.valueOf("SYSTEM.STATS")));

        // Scan the "test" table and print each row
        Scan scan1 = new Scan();
        Table table = connection.getTable(TableName.valueOf("test"));
        ResultScanner scanner = table.getScanner(scan1);
        for (Result result : scanner) {
            System.out.println(result);
        }
        scanner.close();
        table.close();
        connection.close();
    }
}

POM (dependencies):

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.hortonworks</groupId>
  <artifactId>hbase</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>
  <name>hbase</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <repositories>
    <repository>
      <id>HDP</id>
      <name>HDP Releases</name>
      <!--url>http://repo.hortonworks.com/content/repositories/releases/</url-->
      <url>http://repo.hortonworks.com/content/groups/public</url>
    </repository>
  </repositories>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-client</artifactId>
      <version>1.1.2.2.5.0.0-1245</version>
    </dependency>
  </dependencies>
</project>
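One way to compile and run the class from the project directory is with the Maven exec plugin; a sketch (standard exec:java usage, not something specific to this project):

```sh
# compile the project and run the sample class on the runtime classpath
mvn compile exec:java -Dexec.mainClass=com.hortonworks.hbase.HBaseConnection
```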
08-09-2018 05:08 AM
@Michael Graml It is not possible; if the coordinator is killed, its workflows will be killed as well.
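For context, this is the behavior when a coordinator is killed through the Oozie CLI (host and job id below are placeholders); the workflow instances it has materialized are killed along with it:

```sh
# killing the coordinator also kills its running workflow instances
oozie job -oozie http://<oozie-host>:11000/oozie -kill <coordinator-job-id>
```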
07-30-2018 08:43 PM
Check whether you can telnet to the ResourceManager on port 8050, and also check the netstat output on the RM machine to see whether there are any connections from the node on which the service check is running.
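A quick sketch of both checks (hostnames are placeholders):

```sh
# from the node running the service check: is the RM RPC port reachable?
telnet <rm-host> 8050

# on the RM machine: are there connections from that node?
netstat -an | grep 8050
```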
07-30-2018 08:33 PM
1 Kudo
Ensure that the Phoenix Query Server has an updated hbase-site.xml with phoenix.schema.isNamespaceMappingEnabled=true, and restart PQS/HBase after making the change.
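For reference, the property as it would appear in hbase-site.xml (the same setting must be applied on both the server and client side):

```xml
<property>
  <name>phoenix.schema.isNamespaceMappingEnabled</name>
  <value>true</value>
</property>
```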
07-06-2018 08:31 AM
You can refer to this code: https://github.com/apache/spark/tree/master/examples. You can import it into IntelliJ and try connecting.
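A minimal sketch of getting the examples locally before importing the project into IntelliJ:

```sh
# clone the Spark repo; the examples module lives under examples/
git clone https://github.com/apache/spark.git
cd spark/examples
```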
07-06-2018 07:51 AM
1 Kudo
I have created an article for exactly this: https://community.hortonworks.com/articles/201959/override-log4j-property-file-via-oozie-workflow-fo.html. Please refer to it.
07-06-2018 07:50 AM
Put the log4j file on an HDFS path and then reference that HDFS path in the workflow to override the defaults.

Sample log4j file:

#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License. See accompanying LICENSE file.
#
# Define some default values that can be overridden by system properties
hadoop.root.logger=DEBUG,CLA
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hadoop.root.logger}, EventCounter
# Logging Threshold
log4j.threshold=ALL
#
# ContainerLog Appender
#
#Default values
yarn.app.container.log.dir=null
yarn.app.container.log.filesize=100
log4j.appender.CLA=org.apache.hadoop.yarn.ContainerLogAppender
log4j.appender.CLA.containerLogDir=${yarn.app.container.log.dir}
log4j.appender.CLA.totalLogFileSize=${yarn.app.container.log.filesize}
log4j.appender.CLA.layout=org.apache.log4j.PatternLayout
log4j.appender.CLA.layout.ConversionPattern=%d{ISO8601} %p [%t] %c: %m%n
log4j.appender.CRLA=org.apache.hadoop.yarn.ContainerRollingLogAppender
log4j.appender.CRLA.containerLogDir=${yarn.app.container.log.dir}
log4j.appender.CRLA.maximumFileSize=${yarn.app.container.log.filesize}
log4j.appender.CRLA.maxBackupIndex=${yarn.app.container.log.backups}
log4j.appender.CRLA.layout=org.apache.log4j.PatternLayout
log4j.appender.CRLA.layout.ConversionPattern=%d{ISO8601} %p [%t] %c: %m%n
#
# Event Counter Appender
# Sends counts of logging messages at different severity levels to Hadoop Metrics.
#
log4j.appender.EventCounter=org.apache.hadoop.log.metrics.EventCounter
Sample workflow.xml:

<workflow-app name="javaaction" xmlns="uri:oozie:workflow:0.5">
<global>
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
</global>
<start to="java-action"/>
<kill name="kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<action name="java-action">
<java>
<configuration>
<property>
<name>oozie.launcher.mapreduce.task.classpath.user.precedence</name>
<value>true</value>
</property>
<property>
<name>oozie.launcher.mapreduce.user.classpath.first</name>
<value>true</value>
</property>
<property>
<name>oozie.launcher.mapred.job.name</name>
<value>test</value>
</property>
<property>
<name>oozie.launcher.mapreduce.job.log4j-properties-file</name>
<value>${nameNode}/tmp/log4j.properties</value>
</property>
</configuration>
<main-class>WordCount2</main-class>
<arg>${nameNode}/tmp/input</arg>
<arg>${nameNode}/tmp/output2</arg>
</java>
<ok to="end"/>
<error to="kill"/>
</action>
<end name="end"/>
</workflow-app>
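As a usage note, the file referenced by oozie.launcher.mapreduce.job.log4j-properties-file must already exist at that HDFS path when the workflow runs; a sketch, using the path from the workflow above:

```sh
# upload the custom log4j file to the HDFS path referenced in the workflow
hdfs dfs -put log4j.properties /tmp/log4j.properties
hdfs dfs -ls /tmp/log4j.properties
```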