Member since: 09-18-2015
Posts: 3274
Kudos Received: 1159
Solutions: 426
12-08-2020
01:49 PM
Hi, I have permanently deleted the data. Is there any way we can recover it?
08-13-2020
09:08 AM
@torafca5 Could you please try downloading the jar from the link below? http://www.congiu.net/hive-json-serde/1.3.8/hdp23/json-serde-1.3.8-jar-with-dependencies.jar Once the jar is downloaded, move it to /usr/hdp/3.0.1.0-187/hive/lib, and place it on all the nodes hosting Hive services. Also, please make sure you are not using LLAP (HiveServer2 Interactive) to connect to Hive; the ADD JAR command does not work with LLAP. Implementing the above recommendation should help overcome this issue.
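Once the jar is in place, a non-LLAP session can register it and declare a table over JSON files. A rough sketch (the table name, columns, and location are hypothetical; the SerDe class is the one shipped in that jar):

```sql
-- register the jar for this session (via HiveServer2, not LLAP)
ADD JAR /usr/hdp/3.0.1.0-187/hive/lib/json-serde-1.3.8-jar-with-dependencies.jar;

-- hypothetical table over files containing one JSON object per line
CREATE EXTERNAL TABLE events_json (
  id STRING,
  ts BIGINT
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
STORED AS TEXTFILE
LOCATION '/data/events';
```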
12-17-2019
07:48 AM
Hi all, here are more details about the above: https://community.cloudera.com/t5/Support-Questions/HDInsight-Vs-HDP-Service-on-Azure-Vs-HDP-on-Azure-IaaS/m-p/166424 Thanks, HadoopHelp
11-01-2016
05:31 PM
Demo
Extract data from images and store it in HDFS. Documents smaller than 10 MB are stored in HBase;
documents larger than 10 MB land in HDFS, with their metadata in HBase.
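The size-based routing above can be sketched as a small shell helper. The 10 MB cutoff is as described; the actual hbase/hdfs put commands are cluster-specific, so they are left as comments:

```shell
#!/bin/sh
# Route a document by size: files under 10 MB go to HBase, larger ones to HDFS
# (with only their metadata kept in HBase). route_target just picks the store.
route_target() {
  # GNU stat first, BSD stat as a fallback
  size=$(stat -c%s "$1" 2>/dev/null || stat -f%z "$1" 2>/dev/null)
  if [ "$size" -lt $((10 * 1024 * 1024)) ]; then
    echo hbase   # e.g. put the extracted content into an HBase cell
  else
    echo hdfs    # e.g. hdfs dfs -put "$1" /data/docs/ plus metadata row in HBase
  fi
}
```

Usage: `route_target scan001.tif` prints `hbase` or `hdfs`, which a driver script can dispatch on.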
Part 1 - https://www.linkedin.com/pulse/cds-content-data-store-nosql-part-1-co-dev-neeraj-sabharwal
07-16-2016
12:05 AM
2 Kudos
OpenHAB - Build your smart home in no time!
Welcome to http://www.openhab.org/
A vendor and technology agnostic open source automation software for your home.
OpenHAB is a mature, open source home automation platform that runs on a variety of hardware and is protocol agnostic, meaning it can connect to nearly any home automation hardware on the market today. If you’ve been frustrated with the number of manufacturer specific apps you need to run just to control your lights, then I’ve got great news for you: OpenHAB is the solution you’ve been looking for – it’s the most flexible smart home hub you’ll ever find. Source
Demo:
Go to http://www.openhab.org/getting-started/downloads.html
Download Runtime core and Demo files
Extract the Runtime core files into a directory called openHAB, then extract the Demo files into the same directory.
Now download the openHAB app on your smartphone. I am using iOS; once the app launches, disable the Demo setting and enter your machine's local address, e.g. https://192.x.x.x:8443.
You will then be able to control the room settings from your phone while openHAB runs on your machine or a Raspberry Pi.
For now, just for fun, I am running this on my Mac and playing with it from my iPhone.
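The Demo package ships its own configuration; to get a feel for the file format, a minimal hand-written setup looks roughly like this (item and sitemap names are hypothetical; paths follow the openHAB 1.x layout used by the Runtime core download):

```
// configurations/items/home.items -- a hypothetical switch item
Switch Light_Living "Living Room Light"

// configurations/sitemaps/home.sitemap -- how the app renders it
sitemap home label="My Home" {
    Frame label="Living Room" {
        Switch item=Light_Living
    }
}
```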
Docs and Examples
If you want to test it like a "pro" then follow this example
11-30-2016
06:38 PM
To get started with the HDCloud for AWS general availability version, visit http://docs.hortonworks.com/HDPDocuments/HDCloudAWS/HDCloudAWS-1.8.0/bk_hdcloud-aws/content/index.html
07-05-2016
03:00 PM
Chronos is a replacement for cron:
a fault-tolerant job scheduler for Mesos that handles dependencies and ISO 8601-based schedules.
Marathon is a framework for Mesos designed to launch long-running applications and, in Mesosphere, serves as a replacement for a traditional init system.
In Mesosphere, Chronos complements Marathon: it provides another way to run applications, according to a schedule or other conditions, such as the completion of another job. It can also schedule jobs on multiple Mesos slave nodes, and provides statistics about job failures and successes. Source
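A Chronos job is defined as JSON and posted to its REST API; a minimal scheduled job might look like this (the name, command, and schedule below are illustrative, not from the demo):

```json
{
  "name": "nightly-cleanup",
  "command": "echo cleanup",
  "schedule": "R/2016-07-06T02:00:00Z/P1D",
  "owner": "ops@example.com"
}
```

The `schedule` field uses ISO 8601 repeating intervals: number of repetitions, start time, then the period (here, daily, repeating forever).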
Install https://mesos.github.io/chronos/docs/ and gist
Demo
Part 1 - https://www.linkedin.com/pulse/data-center-operating-system-dcos-part-1-neeraj-sabharwal Part 2 - https://www.linkedin.com/pulse/apache-marathon-part-2-neeraj-sabharwal
07-05-2016
11:30 AM
1 Kudo
You need Mesos to run this - Post 1
What is Apache Marathon?
Marathon is a production-grade container orchestration platform for Mesosphere's Datacenter Operating System (DC/OS) and Apache Mesos.
I am launching multiple applications using Marathon, while Mesos provides the underlying framework to run them.
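A Marathon application is also just a JSON definition posted to its REST API (`/v2/apps`); a minimal example might look like this (the id, command, and sizing are illustrative):

```json
{
  "id": "/demo-app",
  "cmd": "python -m SimpleHTTPServer $PORT",
  "cpus": 0.1,
  "mem": 32,
  "instances": 2
}
```

Marathon injects `$PORT` for each task and keeps the requested number of instances running, restarting them on failure.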
Demo
More reading https://mesosphere.github.io/marathon/ Gist & Application example
07-05-2016
04:03 AM
1 Kudo
DC/OS - a new kind of operating system that spans all of the servers in a physical or cloud-based datacenter, and runs on top of any Linux distribution.
Source
Projects
More details https://docs.mesosphere.com/overview/components/
Let's cover Mesos in this post
Frameworks (applications running on Mesos) http://mesos.apache.org/documentation/latest/frameworks/
I used http://mesos.apache.org/gettingstarted/ to install Mesos on my local machine. I am launching the C++, Java, and Python example frameworks in this demo.
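For reference, the commands involved follow the getting-started guide and assume you have built Mesos from source (run from the build directory; ports and paths are the guide's defaults):

```shell
# start a master and an agent locally
./bin/mesos-master.sh --ip=127.0.0.1 --work_dir=/var/lib/mesos
./bin/mesos-agent.sh --master=127.0.0.1:5050 --work_dir=/var/lib/mesos

# run the example frameworks shipped with the source tree
./src/test-framework --master=127.0.0.1:5050            # C++
./src/examples/java/test-framework 127.0.0.1:5050       # Java
./src/examples/python/test-framework 127.0.0.1:5050     # Python
```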
Mesos demo
More reading
07-04-2016
01:51 PM
4 Kudos
Hive: Apache Hive is a data warehouse infrastructure built on top of Hadoop for providing data summarization, query, and analysis.
HBase: Apache HBase™ is the Hadoop database: a distributed, scalable, big data store.
Hawq: http://hawq.incubator.apache.org/
PXF: PXF is an extensible framework that allows HAWQ to query external system data
Let's learn Query federation
This topic describes how to access Hive data using PXF.
Link
Previously, in order to query Hive tables using HAWQ and PXF, you needed to create an external table in PXF that described the target table's Hive metadata. Since HAWQ is now integrated with HCatalog, HAWQ can use metadata stored in HCatalog instead of external tables created for PXF. HCatalog is built on top of the Hive metastore and incorporates Hive's DDL. This provides several advantages:
- You do not need to know the table schema of your Hive tables.
- You do not need to manually enter information about Hive table location or format.
- If Hive table metadata changes, HCatalog provides updated metadata. This is in contrast to the use of static external PXF tables to define Hive table metadata for HAWQ.
HAWQ retrieves table metadata from HCatalog using PXF. HAWQ creates in-memory catalog tables from the retrieved metadata. If a table is referenced multiple times in a transaction, HAWQ uses its in-memory metadata to reduce external calls to HCatalog. PXF queries Hive using table metadata that is stored in the HAWQ in-memory catalog tables. Table metadata is dropped at the end of the transaction.
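With the HCatalog integration described above, a Hive table can be queried from HAWQ simply by referencing it through the reserved hcatalog database; the database and table names below are hypothetical:

```sql
-- query Hive table "sales" in Hive database "default" directly from HAWQ,
-- with no external table definition needed
SELECT * FROM hcatalog.default.sales;
```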
Demo
Tools used: Hive, HAWQ, Zeppelin
HBase tables
Follow this to create the HBase tables:
perl create_hbase_tables.pl
Create table in HAWQ to access HBASE table
Note: the port is 51200, not 50070.
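The HAWQ-side definition for the HBase route goes through a PXF external table; a sketch with a hypothetical table and column family (the PXF HBase profile maps the row key to `recordkey` and columns as "family:qualifier"):

```sql
-- hypothetical HAWQ external table over HBase table "sales" via PXF
-- (note the PXF port 51200 in the LOCATION, not 50070)
CREATE EXTERNAL TABLE hbase_sales (
  recordkey TEXT,
  "cf1:saledate" TEXT,
  "cf1:amount" TEXT
)
LOCATION ('pxf://namenode:51200/sales?PROFILE=HBase')
FORMAT 'CUSTOM' (formatter='pxfwritable_import');
```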
Links
Gist
PXF docs
Must see this
Zeppelin interpreter settings