Member since: 09-14-2015
Posts: 111
Kudos Received: 28
Solutions: 13
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1315 | 07-06-2017 08:16 PM
 | 5580 | 07-05-2017 04:57 PM
 | 3302 | 07-05-2017 04:52 PM
 | 3812 | 12-30-2016 09:29 PM
 | 1602 | 12-30-2016 09:14 PM
11-03-2023
06:08 AM
@manish1 This solution doesn't work for me. Any other suggestion?
09-15-2023
02:06 AM
I use CDH 6.3.2 with Hive 2.1 and Hadoop 3.0, running Hive on Spark in YARN cluster mode, with hive.merge.sparkfiles=true and hive.merge.orcfile.stripe.level=true. This configuration merges the 1099 reduce output files into one file when the result is small, but the merged file then contains about 1099 stripes, and reading it is very slow. When I tried hive.merge.orcfile.stripe.level=false, the result was what I wanted: one small file with one stripe that reads fast. Can anyone explain the difference between true and false, and why hive.merge.orcfile.stripe.level=true is the default?
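As I understand it, with stripe.level=true the merge task fast-concatenates the existing ORC stripes from the small files (cheap merge, but all ~1099 tiny stripes survive), while false rewrites the rows into new full-size stripes (slower merge, faster reads). A rough way to compare the two, a sketch where the table and file paths are hypothetical:

```shell
# Sketch (cluster commands; table t_merged and warehouse path are hypothetical):
# count the stripes in a merged ORC file with Hive's built-in dump tool
hive --orcfiledump /warehouse/db.db/t_merged/000000_0 | grep -c 'Stripe:'
# re-run the merge with stripe-level concatenation disabled so rows are rewritten
beeline -u "jdbc:hive2://hs2-host:10000/default" -e "
SET hive.merge.sparkfiles=true;
SET hive.merge.orcfile.stripe.level=false;
INSERT OVERWRITE TABLE t_merged SELECT * FROM t;"
```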
03-21-2020
07:27 AM
I have a similar issue connecting to Hive from SQuirreL. I use Beeline version 3.1.0.3.0.1.0-187 and connect to the Hortonworks image through a VM. Here are the jars I added, but the connection is refused with the error "Unexpected Error occurred attempting to open an SQL connection. class java.net.ConnectException: Connection refused: connect":

hive-jdbc-3.1.0.3.0.1.0-187.jar
hive-jdbc-3.1.0.3.0.1.0-187-sources.jar
hive-jdbc-3.1.0.3.0.1.0-187-standalone.jar

JDBC URL: jdbc:hive2://sandbox-hdp.hortonworks.com:2181/default

Any idea how to fix this?
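One likely cause: port 2181 is ZooKeeper, and a JDBC connection through it only works when the URL asks for service discovery. A sketch of both URL forms, assuming the default sandbox hostname and the default hiveserver2 ZooKeeper namespace:

```shell
# Sketch: two ways to reach HiveServer2 on the HDP sandbox (hostnames/ports are sandbox defaults).
# Direct connection to HiveServer2 (port 10000 must be forwarded from the VM):
beeline -u "jdbc:hive2://sandbox-hdp.hortonworks.com:10000/default" -n hive
# Via ZooKeeper (port 2181) -- the discovery parameters are required in the URL:
beeline -u "jdbc:hive2://sandbox-hdp.hortonworks.com:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2" -n hive
```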
10-09-2017
08:13 PM
It is a bz2-compressed file, and I get an error about the codec when trying to get the new file:

INFO compress.CodecPool: Got brand-new decompressor [.bz2]
text: Unable to write to output stream.
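For reference, hdfs dfs -text decompresses known codecs such as .bz2 before printing, while -cat streams the raw compressed bytes. The local bzip2 tool shows the same decompression; the HDFS path below is hypothetical:

```shell
# On a cluster, -text auto-decompresses .bz2 (path is hypothetical):
#   hdfs dfs -text /data/input/file.bz2 | head
# The same decompression locally with bzip2:
echo "hello world" > /tmp/demo.txt
bzip2 -f /tmp/demo.txt            # creates /tmp/demo.txt.bz2
bzip2 -dc /tmp/demo.txt.bz2       # prints the original line back
```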
10-10-2017
03:01 PM
2 Kudos
@Lou Richard Good to know that the issue is resolved. It will be great if you can mark this HCC thread as Answered by clicking the "Accept" button, so that other HCC users can quickly find the solution when they encounter the same issue. As it was a long thread, here is a brief summary for HCC users who might run into this.

Issue: Atlas installation was failing with the following error:

File "/usr/hdp/2.6.1.0-129/atlas/bin/atlas_config.py", line 232, in runProcess
  p = subprocess.Popen(commandline, stdout=stdoutFile, stderr=stderrFile, shell=shell)
File "/usr/lib64/python2.7/subprocess.py", line 711, in __init__
  errread, errwrite)
File "/usr/lib64/python2.7/subprocess.py", line 1327, in _execute_child
  raise child_exception
OSError: [Errno 2] No such file or directory

Solution: Make sure that JAVA_HOME is set correctly on the host, points to a valid JDK (not a JRE), and is set properly in the global environment variables. Then re-extract the war:

# cd /usr/hdp/2.6.1.0-129/atlas/server/webapp/
# mv atlas ../
# /usr/lib/jvm/jre-1.8.0-openjdk/bin/jar -xf atlas.war

Cause: the "jar" utility comes with the JDK (inside $JAVA_HOME/bin) and is used by the "atlas_config.py" script to extract atlas.war.
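A quick way to check the JAVA_HOME condition described above, a sketch where the OpenJDK path is an example rather than something from the original thread:

```shell
# Sketch: verify JAVA_HOME points at a JDK, not a JRE (the JDK path below is hypothetical).
echo "$JAVA_HOME"          # must be set and non-empty
ls "$JAVA_HOME/bin/jar"    # "jar" ships with a JDK; a bare JRE does not have it
# Export it globally if missing (adjust the path to your JDK install):
echo 'export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk' > /etc/profile.d/java.sh
```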
07-28-2017
09:37 PM
1 Kudo
Hi @Manish Gupta, I haven't seen an easy way to do this in a normal Ambari-based install yet.
07-07-2017
05:21 PM
@Jorge Luis Hernandez Olmos I'm happy it worked for you. That sometimes happens, and I always prefer to take care of MySQL from the CLI first. As always, if you find this post helpful, don't forget to "accept" the answer.
07-06-2017
02:41 AM
How can we identify a job that is using more resources than its allocation?
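A few YARN CLI commands can surface per-application resource usage; a sketch, where the application id shown is hypothetical:

```shell
# Sketch: spotting heavy YARN applications from the CLI.
yarn top                 # live view of running apps with their memory/vcore usage
yarn application -list   # all running applications and their allocated resources
# Detailed resource report for one application (id is hypothetical):
yarn application -status application_1499300000000_0001
```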
06-15-2017
01:28 PM
1 Kudo
@Manish Gupta After the installation, use the following steps to replace the default service account with a custom service account.

For Zeppelin:

1. Stop the Zeppelin service from Ambari.
2. Change the Zeppelin user from the Ambari server using configs.sh (this command is only available on the ambari-server host):

# /var/lib/ambari-server/resources/scripts/configs.sh -u <AmbariAdminUser> -p <AmbariAdminUserPassword> set localhost <Cluster-name> zeppelin-env zeppelin_user <ZEP-USER>

3. Set the ownership on the Zeppelin log and run directories:

# chown -R <ZEP-USER>:hadoop /var/log/zeppelin
# chown -R <ZEP-USER>:hadoop /var/run/zeppelin

4. Start the Zeppelin service from Ambari.
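Filled in with hypothetical values (admin credentials "admin"/"admin", cluster "mycluster", new service user "zepsvc"), the steps above would look like:

```shell
# Sketch with hypothetical values: admin/admin, cluster "mycluster", service user "zepsvc".
/var/lib/ambari-server/resources/scripts/configs.sh -u admin -p admin \
  set localhost mycluster zeppelin-env zeppelin_user zepsvc
chown -R zepsvc:hadoop /var/log/zeppelin
chown -R zepsvc:hadoop /var/run/zeppelin
```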