Member since
02-01-2019
650
Posts
143
Kudos Received
117
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2836 | 04-01-2019 09:53 AM
 | 1466 | 04-01-2019 09:34 AM
 | 6962 | 01-28-2019 03:50 PM
 | 1581 | 11-08-2018 09:26 AM
 | 3832 | 11-08-2018 08:55 AM
08-07-2018
12:10 PM
1 Kudo
@Rahul P Could you please point me to the doc that says we can use Spark as the execution engine in Hive? AFAIK we don't have an option to set Spark as the execution engine in Hive directly. What is new is the HiveWarehouseConnector for Spark: since all Hive tables are transactional from HDP 3.0, the HiveWarehouseConnector was introduced in Spark to access Hive tables. Ref: https://docs.hortonworks.com/HDPDocuments/HDP3/HDP-3.0.0/migrating-data/content/hive_hivewarehouseconnector_for_handling_apache_spark_data.html
08-03-2018
03:17 PM
I don't think we have a way to write an .xlsx file directly; however, you can write CSV or TSV, both of which Excel can open.
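As a sketch of the CSV route (the sample data here is made up; it just assumes tab-separated input like Hive's CLI output), Python's standard csv module can rewrite TSV text into a .csv that Excel opens directly:

```python
import csv
import io

def tsv_to_csv(tsv_text):
    """Convert tab-separated text to comma-separated text,
    quoting fields as needed so Excel parses them correctly."""
    out = io.StringIO()
    writer = csv.writer(out)
    for row in csv.reader(io.StringIO(tsv_text), delimiter="\t"):
        writer.writerow(row)
    return out.getvalue()

# A field containing a comma gets quoted automatically.
print(tsv_to_csv("id\tname\n1\tAlice, Jr.\n"))
```

The csv writer handles quoting and escaping, which a plain `str.replace("\t", ",")` would get wrong for values that themselves contain commas.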
08-03-2018
03:13 PM
@Edward Samson Seems like that specific version is not present on GitHub. The closest we have is https://github.com/hortonworks/hadoop-release/tree/HDP-2.6.5.3-tag
08-03-2018
03:09 PM
You can use the below command, which generates a TSV file (Hive CLI output is tab-separated by default): hive -e 'select * from table' > output.tsv
07-20-2018
11:53 AM
@Bhushan Kandalkar The issue is with how Python reports the OS name; you can see the full details in this SO thread. If you want a newer Python version, you can use Anaconda to install Python in another path and leave the default one as it is.
07-19-2018
08:27 AM
@Bhushan Kandalkar
It seems the OS type is not supported. The agent reports OS type debianjessie/sid: 18 Jul 2018 07:33:54,768 WARN [qtp-ambari-agent-82] HeartBeatHandler:387 - Received registration request from host with not supported os type, hostname=hadmgrndcc03-1.lifeway.org, serverOsType=ubuntu14, agentOsType=debianjessie/sid
07-16-2018
03:28 PM
1 Kudo
@Luiz Fernando Lacerda Prina As per the JIRA it should be configurable. Please refer to the following from the HBase official doc: "Recent versions of HBase also support setting time to live on a per cell basis. See HBASE-10560 for more information. Cell TTLs are submitted as an attribute on mutation requests (Appends, Increments, Puts, etc.) using Mutation#setTTL. If the TTL attribute is set, it will be applied to all cells updated on the server by the operation. There are two notable differences between cell TTL handling and ColumnFamily TTLs: Cell TTLs are expressed in units of milliseconds instead of seconds. A cell TTL cannot extend the effective lifetime of a cell beyond a ColumnFamily level TTL setting."
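To illustrate the units difference quoted above, here is a small sketch converting a family-level TTL (seconds) to the equivalent per-cell value (milliseconds); the 7-day figure is just an example:

```python
# ColumnFamily TTLs are expressed in seconds;
# cell TTLs (Mutation#setTTL) are expressed in milliseconds.
cf_ttl_seconds = 7 * 24 * 3600       # a 7-day family-level TTL
cell_ttl_ms = cf_ttl_seconds * 1000  # the same lifetime as a cell TTL
print(cell_ttl_ms)  # 604800000
```

Forgetting the factor of 1000 is an easy way to set a cell TTL 1000x shorter than intended.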
07-16-2018
08:17 AM
@Mukesh Chouhan To use jars present on the local filesystem, please follow the steps below. 1. Place the jars in a directory on the Livy node and add that directory to `livy.file.local-dir-whitelist`; this configuration should be set in livy.conf. 2. Add all the required jars to the "jars" field in the curl command; note they should be added in URI format with the "file" scheme, like "file://<livy.file.local-dir-whitelist>/xxx.jar".
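A minimal sketch of building that request body before posting it with curl. The directory, jar names, and class name here are hypothetical placeholders; only the `file`/`jars` field shapes and the `file://` URI scheme follow from the answer above:

```python
import json

# Hypothetical whitelisted directory and jar names; substitute your own.
whitelist_dir = "/opt/livy/local-jars"
extra_jars = ["myapp.jar", "deps.jar"]

payload = {
    "file": f"file://{whitelist_dir}/myapp.jar",   # main application jar
    "className": "com.example.Main",               # hypothetical entry point
    "jars": [f"file://{whitelist_dir}/{j}" for j in extra_jars],
}

print(json.dumps(payload, indent=2))
```

The printed JSON can then be sent as the curl request body (e.g. saved to a file and posted with `-d @payload.json`).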
07-15-2018
09:34 AM
@Sundar Lakshmanan "sort" was for the requirement mentioned in the question details. You can use the command @rguruvannagari mentioned in the comment above to sort by the date (3rd column).
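As an illustration of sorting lines by a date in the 3rd column (the file names, sizes, and date format here are invented for the example), in Python:

```python
from datetime import datetime

lines = [
    "a.txt 120 2018-07-10",
    "b.txt 300 2018-07-01",
    "c.txt 250 2018-07-15",
]

# Sort by the 3rd whitespace-separated column, parsed as a real date
# rather than compared as a plain string.
sorted_lines = sorted(
    lines,
    key=lambda ln: datetime.strptime(ln.split()[2], "%Y-%m-%d"),
)
print(sorted_lines[0])  # b.txt 300 2018-07-01 (earliest date)
```

Parsing the column with `strptime` keeps the sort correct even for date formats where lexicographic order and chronological order disagree (e.g. MM-DD-YYYY).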