Member since
05-22-2017
126
Posts
16
Kudos Received
14
Solutions
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 2529 | 02-07-2019 11:03 AM |
 | 6859 | 08-09-2018 05:08 AM |
 | 1292 | 07-06-2018 07:51 AM |
 | 3244 | 06-22-2018 02:28 PM |
 | 3312 | 05-29-2018 01:14 PM |
11-17-2022
03:09 AM
Hi, this is a bug in Phoenix: if you use UPSERT with a null value, it "inserts" the field with 0x00 0x00 byte values, and the value cannot be changed afterwards. See https://issues.apache.org/jira/browse/PHOENIX-6583 Please check the above bug JIRA.
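A minimal sketch of the statement that triggers the behavior described in the JIRA (the table and column names here are hypothetical, for illustration only):

```sql
-- Hypothetical Phoenix table with a nullable VARCHAR column.
CREATE TABLE IF NOT EXISTS DEMO (PK VARCHAR PRIMARY KEY, COL VARCHAR);

-- Upserting an explicit NULL: per the linked PHOENIX-6583 report, the
-- cell ends up written with 0x00 0x00 byte values instead of being
-- treated as empty, and later upserts cannot change it.
UPSERT INTO DEMO (PK, COL) VALUES ('row1', NULL);
```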
06-16-2021
04:01 AM
Hi @schhabra1, is there a way to connect to HBase via Knox? I have posted the question at the link below; can you please take a look? https://community.cloudera.com/t5/Support-Questions/Connect-to-HBase-via-KNOX-using-HBase-Java-client/td-p/318765 Thanks.
03-08-2021
06:17 PM
Sorry for disturbing. I am working on a toy project which needs to insert a Spark DataFrame into HBase. Versions: Apache Kafka 2.2.1, Apache Spark 2.4.0, Apache HBase 2.1.4, CDH 6.3.2. After reading some posts about Spark HBase connectors, I decided to use the Hortonworks Spark HBase connector. I am wondering if I need the HBase client configuration file hbase-site.xml for the Hortonworks Spark HBase connector when I am working in a CDH environment? Thanks for your help in advance!
04-15-2020
05:56 PM
You really helped me! Ambari had mapreduce.application.classpath wrong and I never thought to check it. Thank you!
07-06-2018
07:50 AM
Put the log4j file on an HDFS path and then use that HDFS path in the workflow to override the defaults. Sample log4j file:
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License. See accompanying LICENSE file.
#
# Define some default values that can be overridden by system properties
hadoop.root.logger=DEBUG,CLA
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hadoop.root.logger}, EventCounter
# Logging Threshold
log4j.threshold=ALL
#
# ContainerLog Appender
#
#Default values
yarn.app.container.log.dir=null
yarn.app.container.log.filesize=100
log4j.appender.CLA=org.apache.hadoop.yarn.ContainerLogAppender
log4j.appender.CLA.containerLogDir=${yarn.app.container.log.dir}
log4j.appender.CLA.totalLogFileSize=${yarn.app.container.log.filesize}
log4j.appender.CLA.layout=org.apache.log4j.PatternLayout
log4j.appender.CLA.layout.ConversionPattern=%d{ISO8601} %p [%t] %c: %m%n
log4j.appender.CRLA=org.apache.hadoop.yarn.ContainerRollingLogAppender
log4j.appender.CRLA.containerLogDir=${yarn.app.container.log.dir}
log4j.appender.CRLA.maximumFileSize=${yarn.app.container.log.filesize}
log4j.appender.CRLA.maxBackupIndex=${yarn.app.container.log.backups}
log4j.appender.CRLA.layout=org.apache.log4j.PatternLayout
log4j.appender.CRLA.layout.ConversionPattern=%d{ISO8601} %p [%t] %c: %m%n
#
# Event Counter Appender
# Sends counts of logging messages at different severity levels to Hadoop Metrics.
#
log4j.appender.EventCounter=org.apache.hadoop.log.metrics.EventCounter
Sample workflow.xml:
<workflow-app name="javaaction" xmlns="uri:oozie:workflow:0.5">
<global>
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
</global>
<start to="java-action"/>
<kill name="kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<action name="java-action">
<java>
<configuration>
<property>
<name>oozie.launcher.mapreduce.task.classpath.user.precedence</name>
<value>true</value>
</property>
<property>
<name>oozie.launcher.mapreduce.user.classpath.first</name>
<value>true</value>
</property>
<property>
<name>oozie.launcher.mapred.job.name</name>
<value>test</value>
</property>
<property>
<name>oozie.launcher.mapreduce.job.log4j-properties-file</name>
<value>${nameNode}/tmp/log4j.properties</value>
</property>
</configuration>
<main-class>WordCount2</main-class>
<arg>${nameNode}/tmp/input</arg>
<arg>${nameNode}/tmp/output2</arg>
</java>
<ok to="end"/>
<error to="kill"/>
</action>
<end name="end"/>
</workflow-app>
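To tie this together, both the workflow and the custom log4j file live on HDFS. A minimal job.properties sketch for submitting the workflow (the hostnames, ports, and application path here are assumptions for illustration; use your cluster's values):

```
# Assumed cluster endpoints; replace with your own.
nameNode=hdfs://namenode-host:8020
jobTracker=resourcemanager-host:8032
# Assumed HDFS directory containing the workflow.xml above.
oozie.wf.application.path=${nameNode}/tmp/javaaction
```

Upload the custom log4j.properties to the HDFS path referenced by oozie.launcher.mapreduce.job.log4j-properties-file (here ${nameNode}/tmp/log4j.properties) before submitting the job, e.g. with `oozie job -config job.properties -run`.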
10-15-2018
12:24 PM
All columns are mapped as VARCHAR. Thanks.
04-27-2018
08:21 AM
@raj pati I believe the communication issue was resolved after removing the external jars from the HBase libs.
04-10-2018
07:58 AM
@Dinesh Jadhav The error "Server not found in Kerberos database" usually occurs when the KDC is unable to find an entry for the service principal requested when connecting to the service. (Mechanism level: Server not found in Kerberos database (7) - LOOKING_UP_SERVER) Can you share the below modified files:
- krb5.conf
- kdc.conf
- kadm5.acl
What values do you have for these parameters:
- oozie.service.HadoopAccessorService.kerberos.enabled
- local.realm
- oozie.service.HadoopAccessorService.keytab.file
- oozie.service.HadoopAccessorService.kerberos.principal
- oozie.authentication.type
- oozie.authentication.kerberos.principal
- oozie.authentication.kerberos.name.rules
Oozie uses a JAAS configuration for Kerberos login; can you share it as well?
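For reference, the realm and KDC entries that this error usually traces back to live in krb5.conf; a minimal sketch (the realm and hostnames are placeholders, not values from your cluster):

```
[libdefaults]
  default_realm = EXAMPLE.COM

[realms]
  EXAMPLE.COM = {
    kdc = kdc-host.example.com
    admin_server = kdc-host.example.com
  }

[domain_realm]
  .example.com = EXAMPLE.COM
  example.com = EXAMPLE.COM
```

A mismatch between the host part of the requested service principal and what the KDC actually has registered is the usual cause of LOOKING_UP_SERVER, so comparing these entries against the principals in the keytab is a good first check.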