Member since: 01-17-2017
Posts: 20
Kudos Received: 3
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4660 | 01-17-2017 05:08 PM |
01-17-2017 05:46 PM
When Hive writes a timestamp value to Parquet, it converts the local time to UTC, and when it reads the data back out, it converts UTC back to local time.
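The round trip above can be sketched in plain Python. This is an illustration of the convert-to-UTC-on-write, convert-back-on-read behavior, not Hive code; the time zone `America/Los_Angeles` is an assumed example zone.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Assumed example zone for the writer/reader; not from the original post.
LOCAL = ZoneInfo("America/Los_Angeles")
UTC = ZoneInfo("UTC")

def write_to_parquet(local_naive: datetime) -> datetime:
    # Hive interprets the naive value in the writer's local zone,
    # then stores the corresponding UTC instant in Parquet.
    return local_naive.replace(tzinfo=LOCAL).astimezone(UTC)

def read_from_parquet(stored_utc: datetime) -> datetime:
    # On read, the stored UTC instant is shifted back to local time.
    return stored_utc.astimezone(LOCAL).replace(tzinfo=None)

ts = datetime(2017, 1, 17, 17, 8)  # naive local timestamp
assert read_from_parquet(write_to_parquet(ts)) == ts  # round-trips cleanly
```

Note that the round trip is lossless only when the writer and reader use the same local zone; reading the file from a machine in a different zone shifts the value, which is a common source of confusion with Hive timestamps in Parquet.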
01-17-2017 05:42 PM
Yes. Use a Spark-HBase connector such as spark-hbase-connector.
01-17-2017 05:41 PM
Spark properties control most application parameters and can be set by using a SparkConf object, or through Java system properties. Environment variables can be used to set per-machine settings, such as the IP address, through the conf/spark-env.sh script on each node. Logging can be configured through log4j.properties.
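The three configuration mechanisms above can be sketched with small file fragments. All property values and the IP address here are placeholders, not taken from the original post.

```
# conf/spark-defaults.conf — Spark properties (also settable via SparkConf
# in application code, or via Java system properties with -D flags)
spark.app.name         MyApp
spark.executor.memory  4g

# conf/spark-env.sh — per-machine environment settings on each node
SPARK_LOCAL_IP=192.168.1.10

# conf/log4j.properties — logging configuration
log4j.rootCategory=INFO, console
```

The usual precedence is: properties set explicitly on a SparkConf in the application take priority over flags passed to spark-submit, which take priority over values in spark-defaults.conf.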
01-17-2017 05:39 PM
Amazon Elastic MapReduce (EMR) builds proprietary versions of Apache Hadoop, Hive, and Pig optimized for running on Amazon Web Services. Amazon EMR provides a hosted Hadoop framework running on the web-scale infrastructure of Amazon Elastic Compute Cloud (EC2) and Simple Storage Service (S3).
01-17-2017 05:36 PM
With a `SparkSession`, applications can create DataFrames from an existing `RDD`, from a Hive table, or from Spark data sources. As an example, the Spark SQL programming guide creates a DataFrame from the content of a JSON file (see the `create_df` example in `scala/org/apache/spark/examples/sql/SparkSQLExample.scala` in the Spark repository).
01-17-2017 05:23 PM
In Impala, a table is created with the `CREATE TABLE` statement. The `PARTITIONED BY` clause partitions the data files based on the values of one or more specified columns.
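A minimal sketch of the syntax, with hypothetical table and column names chosen for illustration:

```sql
-- Illustrative only: table, columns, and partition keys are placeholders.
CREATE TABLE sales (
  id     BIGINT,
  amount DOUBLE
)
PARTITIONED BY (year INT, month INT)
STORED AS PARQUET;
```

Note that in Impala (as in Hive) the partition columns are not repeated in the main column list; each distinct `(year, month)` combination gets its own directory of data files.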
01-17-2017 05:20 PM
1 Kudo
A schema or protocol may not contain multiple definitions of a fullname. Further, a name must be defined before it is used ("before" in the depth-first, left-to-right traversal of the JSON parse tree, where the `types` attribute of a protocol is always deemed to come "before" the `messages` attribute).
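A hypothetical schema fragment illustrating the define-before-use rule: the record `example.Node` is defined inline in the first field, so the second field may reference it by fullname, because the definition comes earlier in the depth-first, left-to-right traversal.

```json
{
  "type": "record",
  "name": "Pair",
  "namespace": "example",
  "fields": [
    {"name": "left",
     "type": {"type": "record", "name": "Node",
              "fields": [{"name": "value", "type": "int"}]}},
    {"name": "right", "type": "example.Node"}
  ]
}
```

Swapping the two fields would make the schema invalid, since `example.Node` would then be referenced before its definition.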
01-17-2017 05:18 PM
The input data set consists of three tables, as shown with the following table creation statements in the Impala SQL dialect.
01-17-2017 05:16 PM
448 [GB]  hdfs://aewb-analytics-staging-name.example.com:8020/user/hive/warehouse/mybigtable
  8 [GB]  hdfs://aewb-analytics-staging-name.example.com:8020/user/hive/warehouse/anotherone
  0 [GB]  hdfs://aewb-analytics-staging-name.example.com:8020/user/hive/warehouse/tinyone
01-17-2017 05:08 PM
1 Kudo
A smoke test is scripted, either as a written set of tests or as an automated test. A smoke test is designed to touch every part of the application in a cursory way: it is shallow and wide. Smoke testing is conducted to check whether the most crucial functions of a program work, without bothering with finer details (build verification is a typical example). It is a basic health check of a build before taking the application into in-depth testing.