Member since: 02-21-2016
Posts: 35
Kudos Received: 19
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 9404 | 04-12-2016 04:45 AM
04-04-2016
10:21 PM
1 Kudo
I tried to enable the history server using the link, but I could only get as far as `hdfs dfs -mkdir -p /app-logs`; that step was failing and I could not proceed. Now when I run the Pig script, which creates a MapReduce job, it fails with the error below. Any idea?

Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=mapred, access=READ, inode="/mr-history/tmp/hdfs/job_1459806783854_0001-1459807556718-hdfs-PigLatin%3ADefaultJobName-1459807582179-1-1-SUCCEEDED-default-1459807564263.jhist":hdfs:hdfs:-rwxrwx---
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
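The trace shows the mapred user being denied READ access to a .jhist file under /mr-history/tmp that is owned hdfs:hdfs with mode 770. A hedged sketch of the usual JobHistory/log-aggregation directory setup on HDP (the mapred:hadoop and yarn:hadoop owners are assumptions from a default install; adjust to your cluster):

```bash
# Create the history and aggregated-log directories as the HDFS superuser,
# then open them up so jobs from any user can write into them and the
# history server (mapred) can read the resulting .jhist files.
sudo -u hdfs hdfs dfs -mkdir -p /mr-history/tmp /mr-history/done /app-logs
sudo -u hdfs hdfs dfs -chmod -R 1777 /mr-history/tmp /app-logs
sudo -u hdfs hdfs dfs -chown -R mapred:hadoop /mr-history
sudo -u hdfs hdfs dfs -chown yarn:hadoop /app-logs
```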
Labels:
- Apache Hadoop
- Apache Pig
04-04-2016
06:29 PM
[hdfs@ bin]$ ls -lrt
total 32
-rwxr-xr-x. 1 root root 1857 Dec 15 22:25 rcc
-rwxr-xr-x. 1 root root 6594 Dec 15 22:25 hadoop.distro
-rwxr-xr-x. 1 root root 775 Dec 15 22:26 yarn
-rwxr-xr-x. 1 root root 782 Dec 15 22:26 mapred
-rwxr-xr-x. 1 root root 775 Dec 15 22:26 hdfs
-rwxr-xr-x. 1 root root 807 Dec 15 22:26 hadoop-fuse-dfs
-rwxr-xr-x. 1 root root 772 Apr 4 14:21 hadoop
[hdfs@ bin]$ cat hadoop
#!/bin/bash
# Autodetect JAVA_HOME if not defined
if [ -e /usr/libexec/bigtop-detect-javahome ]; then
  . /usr/libexec/bigtop-detect-javahome
elif [ -e /usr/lib/bigtop-utils/bigtop-detect-javahome ]; then
  . /usr/lib/bigtop-utils/bigtop-detect-javahome
fi
export HADOOP_HOME=${HADOOP_HOME:-/usr/hdp/2.3.4.0-3485/hadoop}
export HADOOP_MAPRED_HOME=${HADOOP_MAPRED_HOME:-/usr/hdp/2.3.4.0-3485/hadoop-mapreduce}
export HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-/usr/hdp/2.3.4.0-3485/hadoop-yarn}
export HADOOP_LIBEXEC_DIR=${HADOOP_HOME}/libexec
export HDP_VERSION=${HDP_VERSION:-2.3.4.0-3485}
export HADOOP_OPTS="${HADOOP_OPTS} -Dhdp.version=${HDP_VERSION}"
export YARN_OPTS="${YARN_OPTS} -Dhdp.version=${HDP_VERSION}"
exec /usr/hdp/2.3.4.0-3485//hadoop/bin/hadoop.distro "$@"
[hdfs@ip- bin]$ echo $HADOOP_HOME

[hdfs@ip-bin]$ pwd
/usr/hdp/current/hadoop-client/bin
[hdfs@ip-bin]$
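Note that the exports inside bin/hadoop only exist for the lifetime of that wrapper script's own process, which is why echo $HADOOP_HOME prints nothing in the interactive shell above. A minimal sketch, assuming the HDP 2.3.4 paths from the script, of setting it for the login shell as well:

```bash
# Set the variable in the interactive shell (or in ~/.bash_profile to
# make it persistent); the wrapper's exports never reach this shell.
export HADOOP_HOME=/usr/hdp/2.3.4.0-3485/hadoop
echo "$HADOOP_HOME"
```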
04-04-2016
06:28 PM
Can you or someone help me with the main question? I am not able to figure out the reason for the job failure. After googling, I think it may be related to the variable not being set. I am adding more details above showing how the variables come back null. Could you please help?
04-03-2016
06:15 AM
Thanks, this is what I was looking for. Although these logs do not give much information. Do you know if there is a way to make the logs more readable? Or perhaps I'll have to ask another question.
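For what it's worth, a hedged sketch of getting more verbose client-side output from Pig (the script name and log path are placeholders):

```bash
# -l pins the client log file location, -d raises the debug level.
pig -l /tmp/pig_client.log -d DEBUG myscript.pig
```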
04-03-2016
03:25 AM
Yes, you are right. It is writing logs to the same file, but only when the Pig script itself fails. It is not writing a log when the MapReduce job fails. I need to analyse the MapReduce logs in order to resolve the issue.
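If log aggregation is enabled on the cluster, the per-container MapReduce logs can be pulled with the yarn CLI. A minimal sketch (the application ID is a placeholder; take it from the ResourceManager UI or the job's console output, swapping the job_ prefix for application_):

```bash
# Fetch the aggregated container logs for the failed MapReduce job.
yarn logs -applicationId application_<cluster-timestamp>_<id>
```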
04-03-2016
01:58 AM
2 Kudos
My Pig script fails when I try to do an ORDER BY. I want to check the logs, but I can't find their location. I learned from the post that it should be set in /etc/hadoop/hadoop-env.sh, but in that file everything is commented out, as shown below. Does that mean the logs are not being written? How can I set those values so I can look into the logs? My script is failing again and again, and I want to check the logs to troubleshoot it.

Also, my hadoop-env.sh seems to be at a different path (/etc/hadoop/conf.install). Is there anything wrong with that?

# On secure datanodes, user to run the datanode as after dropping privileges.
# This **MUST** be uncommented to enable secure HDFS if using privileged ports
# to provide authentication of data transfer protocol. This **MUST NOT** be
# defined if SASL is configured for authentication of data transfer protocol
# using non-privileged ports.
#export HADOOP_SECURE_DN_USER=${HADOOP_SECURE_DN_USER}

# Where log files are stored. $HADOOP_HOME/logs by default.
#export HADOOP_LOG_DIR=${HADOOP_LOG_DIR}/$USER
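A minimal sketch of actually setting the log directory, assuming /var/log/hadoop exists and is writable by the daemon users; uncommenting and editing the line in hadoop-env.sh would look like:

```bash
# Where log files are stored; was commented out above.
export HADOOP_LOG_DIR=/var/log/hadoop/$USER
```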
Labels:
- Apache Hadoop
- Apache Pig
03-15-2016
02:10 AM
I am trying to access the command-line interface from the server where I have Hortonworks installed, but I am getting the error below. Can I not access the command line?

[root@ip-xxx-xx-xx ec2-user]# hive
WARNING: Use "yarn jar" to launch YARN applications.
Logging initialized using configuration in file:/etc/hive/2.3.4.0-3485/0/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user/root":hdfs:hdfs:drwxr-xr-x
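The error says the root user has no writable home directory in HDFS. A hedged sketch of the usual fix, run as the hdfs superuser (the ownership shown is an assumption; match your environment):

```bash
# Create root's HDFS home directory so Hive can write scratch data there.
sudo -u hdfs hdfs dfs -mkdir -p /user/root
sudo -u hdfs hdfs dfs -chown root:root /user/root
```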
Labels:
- Apache Hive
02-21-2016
10:53 PM
1 Kudo
Thank you very much for your help; it is working now. Will you be able to answer my other question? You mentioned adding the property hadoop.proxyuser.hive.hosts=*, but the Hortonworks documentation says to add hadoop.proxyuser.root.hosts=*. I have added both, but I am not sure why, or which one is actually working.
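For reference, a hedged note: both pairs of properties live in core-site.xml, and the pair that matters is the one matching the OS user HiveServer2 actually runs as (typically hive on HDP, so the hive.* pair is likely the one in effect). Shown in the same key=value shorthand the thread uses:

```
hadoop.proxyuser.hive.hosts=*
hadoop.proxyuser.hive.groups=*
hadoop.proxyuser.root.hosts=*
hadoop.proxyuser.root.groups=*
```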
02-21-2016
10:30 PM
1 Kudo
I already have these properties added. Let me give you the brief: these were running fine until I restarted my AWS server and its DNS changed, and now I see these errors. Right now I am getting the error below:

org.apache.ambari.view.hive.client.HiveClientException: H060 Unable to open Hive session: org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
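Since the timeout started right after the DNS change, it may be worth confirming HiveServer2 is up and reachable on its port before digging into the view itself. A minimal sketch (port 10000 is the default; the host is a placeholder):

```bash
# Is HiveServer2 listening?
netstat -tlnp | grep 10000
# Can we open a session directly, bypassing the Ambari view?
beeline -u "jdbc:hive2://<new-private-dns>:10000" -e "show databases;"
```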
02-21-2016
09:43 PM
1 Kudo
I had to restart my AWS server, which caused its public DNS to change. To open the Ambari server I had to connect to the new DNS on port 8080, and I was able to connect easily. But when I connect to the Hive view, it gives the error below.

H060 Unable to open Hive session: org.apache.thrift.protocol.TProtocolException: Required field 'serverProtocolVersion' is unset! Struct:TOpenSessionResp(status:TStatus(statusCode:ERROR_STATUS, infoMessages:[*org.apache.
Labels:
- Apache Hive