Member since
09-23-2015
800
Posts
898
Kudos Received
185
Solutions
07-15-2024
01:53 AM
Hi team, I want to configure the following principal-to-user mappings: "yarn-user/hdp01-node.lab.contoso.com@LAB.CONTOSO.COM" to "yarn-user", "yarn-user/hdp02-node.lab.contoso.com@LAB.CONTOSO.COM" to "yarn-user", "yarn-user/hdp03-node.lab.contoso.com@LAB.CONTOSO.COM" to "yarn-user". Please advise on a rule. Thanks
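Assuming these are Kerberos principals to be mapped through Hadoop's hadoop.security.auth_to_local setting (an assumption, since the post does not name the mechanism), a single rule of this shape would map all three host-qualified principals to the short name yarn-user:

```text
# Sketch of an auth_to_local rule; realm and principal names are taken from the post.
# [2:$1@$0] rewrites a two-component principal like
# yarn-user/hdp01-node.lab.contoso.com@LAB.CONTOSO.COM into yarn-user@LAB.CONTOSO.COM,
# the regex filters on that string, and the s/// replaces the whole match with yarn-user.
RULE:[2:$1@$0](yarn-user@LAB\.CONTOSO\.COM)s/.*/yarn-user/
DEFAULT
```

One rule covers all three hosts because the host component ($2) is dropped by the [2:$1@$0] format string before the regex is applied.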
07-06-2016
12:03 PM
You mean to exclude two columns? That one would definitely work: (id1|id2)?+.+ Your version would say: id1 once or not at all, followed by id2 once or not at all, followed by anything else. So it should work too, I think.
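For context, this is Hive's regex column selection: a quoted regex in the SELECT list matches column names, and the possessive group makes the excluded names unmatchable. A sketch with a hypothetical table name, assuming the standard Hive setting for regex identifiers:

```sql
-- Regex column specification requires this setting in Hive:
SET hive.support.quoted.identifiers=none;

-- Select every column of my_table EXCEPT id1 and id2.
-- (id1|id2)?+ possessively consumes an excluded name if present, so the
-- following .+ cannot match the remainder and those columns are dropped.
SELECT `(id1|id2)?+.+` FROM my_table;
```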
06-16-2016
05:16 PM
That is amazing!
05-08-2016
02:01 AM
@Benjamin Leonhardi With the release of Yarn.Next, the containers will receive their own IP address and get registered in DNS. The FQDN will be available via a REST call to Yarn. If the current Yarn container dies, the docker container will start in a different Yarn container somewhere in the cluster. As long as all clients are pointing at the FQDN of the application, the outage will be nearly transparent. In the meantime, there are several options using only Slider, but they require some scripting or registration in Zookeeper. If you run: slider lookup --id application_1462448051179_0002
2016-05-08 01:55:51,676 [main] INFO impl.TimelineClientImpl - Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
2016-05-08 01:55:53,847 [main] WARN shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2016-05-08 01:55:53,868 [main] INFO client.RMProxy - Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.15:8050
{
"applicationId" : "application_1462448051179_0002",
"applicationAttemptId" : "appattempt_1462448051179_0002_000001",
"name" : "biologicsmanufacturingui",
"applicationType" : "org-apache-slider",
"user" : "root",
"queue" : "default",
"host" : "sandbox.hortonworks.com",
"rpcPort" : 1024,
"state" : "RUNNING",
"diagnostics" : "",
"url" : "http://sandbox.hortonworks.com:8088/proxy/application_1462448051179_0002/",
"startTime" : 1462454411514,
"finishTime" : 0,
"finalStatus" : "UNDEFINED",
"origTrackingUrl" : "http://sandbox.hortonworks.com:1025",
"progress" : 1.0
}
2016-05-08 01:55:54,542 [main] INFO util.ExitUtil - Exiting with status 0
You do get the host the container is currently bound to. Since the instructions bind the docker container to the host IP, this would allow URL discovery, but as I said, not out of the box. This article is merely the harbinger of Yarn.Next, as that will integrate the PaaS capabilities into Yarn itself, including application registration and discovery.
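As a sketch of the scripted discovery step mentioned above, the JSON portion of the slider lookup output can be parsed to recover the current host and tracking URL. Field names are taken from the output shown; the helper function itself is hypothetical:

```python
import json

# Abbreviated JSON block as printed by `slider lookup --id <applicationId>` above.
lookup_output = """
{
  "applicationId" : "application_1462448051179_0002",
  "host" : "sandbox.hortonworks.com",
  "rpcPort" : 1024,
  "state" : "RUNNING",
  "origTrackingUrl" : "http://sandbox.hortonworks.com:1025"
}
"""

def container_endpoint(raw):
    """Extract the host currently running the container and its tracking URL."""
    info = json.loads(raw)
    return info["host"], info["origTrackingUrl"]

host, url = container_endpoint(lookup_output)
print(host)  # sandbox.hortonworks.com
print(url)   # http://sandbox.hortonworks.com:1025
```

A wrapper script could re-run this after a container restart to find the new host, which is exactly the discovery gap that Yarn.Next closes natively.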
03-31-2017
02:35 AM
great article!
12-19-2017
02:45 PM
Hi, as suggested, I implemented the hive2 action with a password file, but I am getting the below exception:
ERROR: Error: E0701 : E0701: XML schema error, cvc-complex-type.2.4.a: Invalid content was found starting with element 'argument'. One of '{"uri:oozie:hive2-action:0.1":file, "uri:oozie:hive2-action:0.1":archive}' is expected.
My workflow:
<workflow-app name="wf_Cred_Test" xmlns="uri:oozie:workflow:0.4">
  <start to="HivePartitionAction" />
  <action name="HivePartitionAction">
    <hive2 xmlns="uri:oozie:hive2-action:0.1">
      <job-tracker>${jobTracker}</job-tracker>
      <jdbc-url>${jdbcURL}</jdbc-url>
      <script>script/addPartition.sql</script>
      <param>runtimeEnvironment=${runtimeEnvironment}</param>
      <file>script/addPartition.sql</file>
      <argument>-wpassfile</argument>
      <file>/tmp/dev/app/workflow/wf_Cred_Test/script/passfile#passfile</file>
    </hive2>
    <ok to="end" />
    <error to="kill" />
  </action>
  <kill name="kill">
    <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end" />
</workflow-app>
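The E0701 error is the schema validator complaining about element order, not about the password file itself: once a <file> element appears inside <hive2>, only further <file> or <archive> elements are allowed. Assuming the hive2-action:0.1 schema places <argument> before <file> in its sequence (my reading of the error message, not confirmed in the post), moving the <argument> ahead of all <file> elements should validate. Same values as in the post:

```xml
<hive2 xmlns="uri:oozie:hive2-action:0.1">
  <job-tracker>${jobTracker}</job-tracker>
  <jdbc-url>${jdbcURL}</jdbc-url>
  <script>script/addPartition.sql</script>
  <param>runtimeEnvironment=${runtimeEnvironment}</param>
  <!-- <argument> elements must precede any <file>/<archive> elements -->
  <argument>-wpassfile</argument>
  <file>script/addPartition.sql</file>
  <file>/tmp/dev/app/workflow/wf_Cred_Test/script/passfile#passfile</file>
</hive2>
```

If 0.1 rejected <argument> entirely, switching the action to a later hive2-action schema version would be the alternative.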
09-30-2015
06:38 AM
4 Kudos
In some situations, Ranger is not an option but a Hive authorization scheme is advisable. It is possible to use SQLStdAuth for this. However, it comes with a couple of caveats. See the Hive Wiki.

1. Configuring SQLStdAuth ( HDP 2.3 )
- In Ambari, select Authorization: SQLStdAuth. This will set all related configuration parameters like hive.enable.authorization and disable doAs.
- Add admin users to the admin role. If you have a user you want to be admin, add him to hive.users.in.admin.role=hive,hue,myadmin

2. Prepare HDFS
Since all queries now run as the hive user, it needs read/write rights on all files in HDFS. This includes the load directories for external tables. Ideally, change the owner of the warehouse folder to hive and set access rights to 700. I also added hive to an ETL group and made all load folders readable AND writable by this group.

3. Create roles
In Hive, as an admin user:
- Become admin: SET ROLE ADMIN;
- Create roles:
CREATE ROLE BI; ( should have read rights to all tables )
CREATE ROLE ETL; ( should have read/write rights to all tables )
- Add users to roles:
GRANT ETL TO USER ETLUSER;
GRANT BI TO USER BIUSER;
- Make the ETL role owner of the database so it can create tables in it:
ALTER DATABASE DEFAULT SET OWNER ROLE ETL;
- Make a table readable by BI:
GRANT SELECT ON MYTABLE TO BI;
- Make a table readable and writable by ETL:
GRANT ALL ON MYTABLE TO ETL;
NOTE: I did not find a way to make a ROLE the owner of a table, so only the table owner or admin can drop tables; the ETL user can still insert, drop partitions, etc.

4. Beeline parameters
SQLStdAuth restricts access to Hive config parameters to a whitelist. In older environments, Hive scripts would be parameterized with configuration parameters, e.g. -hiveconf day=20150201. This will no longer work, since those parameters are not on the whitelist. You can instead use beeline --hivevar day=20150201
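To illustrate that last point, a parameterized script references the variable with ${...}, and beeline substitutes it at invocation time. Table, script, and host names here are hypothetical:

```sql
-- my_load.sql (hypothetical script); ${day} is substituted by beeline, so no
-- whitelisted hiveconf parameter is needed under SQLStdAuth.
ALTER TABLE sales ADD PARTITION (day='${day}');

-- Invoked from the shell as:
--   beeline -u jdbc:hive2://myhost:10000 --hivevar day=20150201 -f my_load.sql
```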
08-01-2016
05:53 PM
Hi Florian, did you get an answer to your question? Or is it the expected behavior of the Hive connection in Eclipse?
08-30-2017
01:13 PM
Hi, I'm trying to configure this in my Ambari 2.5.0.3 with Hive 1.2.1. When I try to connect to Hive using JDBC, the following error is thrown: WARN jdbc.HiveConnection: Failed to connect to localhost:10500 Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10500: Peer indicated failure: Error validating the login (state=08S01,code=0)