Member since: 03-15-2017
Posts: 24
Kudos Received: 3
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 902 | 09-29-2017 12:28 PM |
09-29-2017 12:28 PM

OK, I have the solution. We were using ODBC via OTL, so we needed to set a particular flag, "ImplicitSelect", to "true" directly at the connection level. When this flag is not set, SHOW queries return empty results. Thanks, Sylvain.
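For anyone landing here later, a minimal sketch of what that connection-level setting could look like from a client. Only "ImplicitSelect" comes from this thread; the driver name, host, and Kerberos keywords below are placeholder assumptions to adapt to your own setup:

```python
# Sketch only: every keyword except ImplicitSelect is an assumption.
import pyodbc

conn_str = (
    "Driver=Hortonworks Hive ODBC Driver;"   # assumed driver name
    "Host=hiveserver2.example.com;"          # placeholder host
    "Port=10000;"
    "AuthMech=1;"                            # Kerberos, matching this thread's setup
    "KrbServiceName=hive;"
    "KrbHostFQDN=hiveserver2.example.com;"
    "ImplicitSelect=true;"                   # without this, SHOW returned no rows
)
conn = pyodbc.connect(conn_str, autocommit=True)
for row in conn.cursor().execute("SHOW TABLES"):
    print(row)
```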
09-29-2017 07:05 AM

No, I'm using my own software based on the Hortonworks ODBC driver (SHOW queries used to work without Kerberos).
09-28-2017 05:06 PM

Hi, I'm facing the following problem: when I run SHOW queries through the Hortonworks ODBC driver against my Kerberos-enabled cluster, I get an empty result (no error, just an empty result). However, SELECT queries work, and so do CREATE DATABASE and CREATE TABLE queries.

Note: I can run SHOW queries from the beeline client.

Note 2: I enabled TRACE logging for the driver, and I see one error: "Sep 28 18:48:03.508 ERROR 2741458688 Connection::SQLSetConnectAttr: [Hortonworks][ODBC] (11470) Transactions are not supported."

Any idea? Thanks in advance, Sylvain.
Labels:
- Apache Hive
07-12-2017 04:26 PM

Thanks. I had simply misunderstood the meaning of the property "Reserved space for HDFS". I thought it was the disk space set aside for HDFS file storage ...
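To make the misunderstanding concrete, a small sketch of the arithmetic. The 300000000000-byte figure is from the question below; the raw disk size is an assumption. "Reserved space for HDFS" corresponds to dfs.datanode.du.reserved, i.e. space kept away from HDFS for the OS and other services, not space granted to HDFS:

```python
# Sketch: why reserving (nearly) the whole disk produces
# "Capacity Used:[100%], Capacity Remaining:[0]".
disk_bytes = 300_000_000_000      # assumed raw size of the DataNode volume
reserved_bytes = 300_000_000_000  # dfs.datanode.du.reserved, as set in the question
hdfs_capacity = max(disk_bytes - reserved_bytes, 0)
print(hdfs_capacity)              # 0 -> nothing left for HDFS blocks
```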
07-03-2017 04:37 PM

Hi, I am facing an alert while starting my services in Ambari: HDFS tells me that it has no more disk space available, even though I haven't written anything to HDFS yet. The alert is the following: Capacity Used:[100%, 36864], Capacity Remaining:[0]. The thing is that I have set 300000000000 bytes of capacity for HDFS (property "Reserved space for HDFS"), so I can't see what my problem is ... Thanks in advance for your answer! Sylvain.
Labels:
- Apache Hadoop
- Hortonworks Cloudbreak
06-30-2017 12:14 PM

It looks like it was indeed. I killed the process and did not get the error anymore. Thanks.
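For anyone hitting the same startup failure, a rough sketch of the "kill the stale process, then retry" fix. The Nfs3 main-class name and paths assume a stock HDP layout and should be double-checked on your node:

```python
# Sketch: clear the stale nfs3 daemon (pid 3138 in the traceback below), then
# start it again. Run as root on the affected node.
import subprocess

# The start script refuses to run while an old nfs3 process is still alive.
subprocess.run(["pkill", "-f", "org.apache.hadoop.hdfs.nfs.nfs3.Nfs3"], check=False)

# Then retry the start (or let Ambari do it).
subprocess.run([
    "/usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh",
    "--config", "/usr/hdp/current/hadoop-client/conf",
    "start", "nfs3",
], check=True)
```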
06-26-2017 07:28 AM

Hi, I just installed a 4-node Hadoop cluster, and I can't start the services because start-up fails when starting the NFSGateway on my 4th node, where it throws: resource_management.core.exceptions.ExecutionFailed. The whole error is: Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/nfsgateway.py", line 89, in <module>
NFSGateway().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/nfsgateway.py", line 58, in start
nfsgateway(action="start")
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_nfsgateway.py", line 74, in nfsgateway
create_log_dir=True
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/utils.py", line 274, in service
Execute(daemon_cmd, not_if=process_id_exists_command, environment=hadoop_env_exports)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
tries=self.resource.tries, try_sleep=self.resource.try_sleep)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh -H -E /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start nfs3' returned 1. nfs3 running as process 3138. Stop it first.

Thanks in advance for your answer! Sylvain.
Labels:
- Apache Hadoop
03-30-2017 11:49 AM

Hi @Predrag Minovic and @ssathish, I tried WebHCat and it works. I had already tried the Distributed Shell application, but I didn't succeed; maybe I did something wrong. I think I will try Oozie soon. Thanks for your help.
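For reference, a sketch of what the WebHCat submission for the wordcount job from the question below might look like, assuming WebHCat on its default port 50111 and the examples jar copied into HDFS first; the HDFS jar path and main-class name are assumptions:

```python
# Sketch: submit the wordcount example through WebHCat's REST API instead of
# the ResourceManager's /ws/v1/cluster/apps endpoint.
import requests

resp = requests.post(
    "http://serverip:50111/templeton/v1/mapreduce/jar",
    data={
        "user.name": "yarn",
        # WebHCat expects an HDFS path; assumed location after something like:
        #   hdfs dfs -put hadoop-mapreduce-examples.jar /user/yarn/
        "jar": "/user/yarn/hadoop-mapreduce-examples.jar",
        "class": "org.apache.hadoop.examples.ExampleDriver",  # assumed driver class
        "arg": ["wordcount", "/user/yarn/input", "/user/yarn/output10"],
    },
)
print(resp.json())  # returns a job id you can poll at /templeton/v1/jobs/<id>
```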
03-30-2017 06:54 AM

I just have another question: do temporary tables have better performance than normal tables?
03-29-2017 09:08 AM

Yes, I can connect to Hive with the ODBC driver. The problem was that a new session was generated on each call from my client. Thanks, Sylvain.
03-29-2017 09:07 AM

Thank you. As you said, the client I am using generates a new session on each call; that was the problem. Sylvain.
03-28-2017 02:07 PM
My cluster is not kerberized and I do not have ranger policies for Hive.
03-28-2017 02:06 PM

Yes, the tables created via ODBC show up in beeline. Creating a database via ODBC works too. The problem only appears with temporary tables.
03-28-2017 01:46 PM

Thank you mliem, I am able to access Hive through beeline. I can create my temporary table and see it appear with a "show tables" command in beeline. I tried creating my temporary table in the default database and in a specified one, but I get the same result. Sylvain.
03-28-2017 01:22 PM

Hi, I am facing the following problem: using the Hortonworks ODBC driver for Hive, I can create a temporary table (hiveserver2.log tells me it is done without error), but I can't query that table with a SELECT afterwards. When I try to do so, I get a "table not found" error. Does this mean the ODBC driver does not keep the session between two Hive calls? Is there something I am doing wrong, or some configuration I need to set? I believe that since Hive 0.14 the "temporary table" part of the ODBC driver configuration is no longer necessary, or is it? Thanks in advance, Sylvain.
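The replies above pin this on session scoping. A minimal sketch of the distinction, with a placeholder DSN: Hive temporary tables live and die with one session, so the CREATE and the SELECT must travel over the same connection.

```python
# Sketch: Hive TEMPORARY tables are session-scoped, so a client that opens a
# fresh connection per statement will never see them again.
import pyodbc

conn = pyodbc.connect("DSN=HiveProd", autocommit=True)  # placeholder DSN
cur = conn.cursor()
cur.execute("CREATE TEMPORARY TABLE tmp_demo (id INT)")
cur.execute("SELECT * FROM tmp_demo")  # same session: works
conn.close()

conn2 = pyodbc.connect("DSN=HiveProd", autocommit=True)
# New session: tmp_demo is gone, so this would raise "table not found".
# conn2.cursor().execute("SELECT * FROM tmp_demo")
```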
Tags:
- Data Processing
- Hive

Labels:
- Apache Hive
03-27-2017 08:15 AM
Thanks @ssathish, here are my log files.
03-23-2017 02:37 PM

I forgot this:
Ambari version: 2.4.2.0
Hadoop version: 2.5.0.0-1245, installed from Ambari (not a sandbox)
03-22-2017 06:38 PM

Hello, I'm trying to execute some of the existing examples using the REST API (with or without the Knox gateway). It seems to work, but the task is always marked as failed in the YARN web UI. I use hadoop-mapreduce-examples.jar to launch a wordcount example. It creates a sub-task which is properly executed and creates all the expected files in /user/yarn/output10. But the parent task always fails with exit code 0 (looks like a success 😞, but it's not).

I send a POST request to this URL (I ran the new-application command beforehand to get my application id): https://guest:guest-password@serverip:8443/gateway/default/resourcemanager/v1/cluster/apps (I also tried without Knox, http://serverip:8088/ws/v1/cluster/apps, with the same result). And I transmit this JSON command:

{
  "application-id": "application_1490174038938_0018",
  "application-name": "WordCount",
  "am-container-spec": {
    "commands": {
      "command": "yarn jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar wordcount /user/yarn/input /user/yarn/output10"
    }
  },
  "max-app-attempts": 1,
  "application-type": "YARN"
}

I tried a lot of other JSON configurations with the same result (this one is the simplest I tried). Do you have any idea what I did wrong? Thanks in advance.
Labels:
- Apache Knox
- Apache YARN
03-20-2017 01:04 PM
It should be alright now.
03-20-2017 10:30 AM

Hi, I am running Hadoop on a 3-node cluster (3 virtual machines) with respectively 20 GB, 10 GB, and 10 GB of disk space available. When I run this command on the namenode:

hadoop fs -df -h /

I get the following result: [screenshot]

When I run this command:

hadoop fs -du -s -h /

I get the following result: [screenshot]

Knowing that the replication factor is set to 3, shouldn't I get 3 × 2.7 = 8.1 G in the first screenshot? I tried the expunge command and it did not change the result. Thanks in advance! Sylvain.
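A small sketch of the arithmetic being asked about, using the numbers from the post (2.7 G of logical data, replication factor 3): du reports logical file size, while df reports raw cluster usage, which counts every replica.

```python
# Sketch: du vs. df with replication, using the post's numbers.
logical_gb = 2.7              # hadoop fs -du -s: logical size of the data
replication = 3
raw_gb = logical_gb * replication
print(raw_gb)                 # 8.1 -> expected "DFS Used" in hadoop fs -df
# If df shows less, some blocks may still be under-replicated, or the files
# may have been written with a lower per-file replication factor.
```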
Tags:
- Hadoop Core
- HDFS

Labels:
- Apache Hadoop
03-16-2017 04:43 PM
1 Kudo
Hi, I don't understand what is happening. I am desperately trying to get a simple "SHOW VIEWS" command to work, and I always get a ParseException error (cannot recognize input near 'show' 'views'). I believe the command should not return an error even if no views exist? I created a view just in case anyway. Here are the commands I tried, all of which returned ParseException errors:

SHOW VIEWS;
SHOW VIEWS IN mydatabase;
SHOW VIEWS FROM mydatabase;

Very simple ones. I am running a 2.4 version of Hive, so I believe it is not a version issue. Thanks in advance! Sylvain.
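It may be a version issue after all: SHOW VIEWS was only added in Hive 2.2.0 (HIVE-14558), and "2.4" here is most likely the HDP platform version, which ships Hive 1.2.x. A sketch of a client-side fallback for older Hive, with placeholder connection details:

```python
# Sketch: list views on a Hive release that predates SHOW VIEWS. The DSN is
# a placeholder; DESCRIBE FORMATTED marks views as VIRTUAL_VIEW.
import pyodbc

conn = pyodbc.connect("DSN=HiveProd", autocommit=True)
cur = conn.cursor()
cur.execute("USE mydatabase")
tables = [name for (name,) in cur.execute("SHOW TABLES").fetchall()]
views = []
for name in tables:
    rows = cur.execute("DESCRIBE FORMATTED " + name).fetchall()
    if any("VIRTUAL_VIEW" in str(row) for row in rows):
        views.append(name)
print(views)
```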
Tags:
- Data Processing
- Hive

Labels:
- Apache Hive
03-16-2017 04:36 PM

Thank you, it seems like TBLPROPERTIES doesn't like French characters ...
03-15-2017 02:17 PM
2 Kudos
Hi, I am trying to set custom table properties (using the "ALTER TABLE ... SET TBLPROPERTIES ..." command) that contain special characters like 'ç' or 'é'. The problem is that I get '\u00e7' and '\u00e9' back when I execute a "DESCRIBE TABLE ... FORMATTED" command. Is there a way to get the proper encoding in return?

Here is my command:

ALTER TABLE mydatabase.mytable SET TBLPROPERTIES ('Test'='François');

Here is the result from DESCRIBE TABLE ... FORMATTED:

Test Fran\u00e7ois

Thanks in advance for your answer! Sylvain.
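If the driver insists on returning Java-style \uXXXX escapes, one workaround is to decode them client-side; a minimal sketch in plain Python, assuming nothing beyond the escaped value shown above:

```python
# Sketch: turn the \uXXXX escapes from DESCRIBE FORMATTED back into the
# original characters.
raw = r"Fran\u00e7ois"  # value as returned by DESCRIBE FORMATTED
print(raw.encode("ascii").decode("unicode_escape"))  # -> François
```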
Tags:
- Data Processing
- Hive
- Upgrade to HDP 2.5.3 : ConcurrentModificationException When Executing Insert Overwrite : Hive

Labels:
- Apache Hive