Member since: 03-04-2019
Posts: 67
Kudos Received: 2
Solutions: 3
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 6043 | 03-18-2020 01:42 AM |
| | 3864 | 12-16-2019 04:17 AM |
03-18-2020
02:39 AM
Hi @Logica. I think you need to place the hive-site.xml file into Spark's conf directory. Please follow the steps in the link below for running a Hive query or accessing a Hive table through PySpark: https://acadgild.com/blog/how-to-access-hive-tables-to-spark-sql Thanks, HadoopHelp
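Before copying hive-site.xml into Spark's conf directory, it can help to confirm the file actually carries the metastore URI Spark will need. Here is a small pure-Python sketch for that check; the sample XML content and the function name are assumed examples, not from the original post:

```python
import xml.etree.ElementTree as ET

def get_hive_property(xml_text, name):
    """Return the value of a named property from hive-site.xml content."""
    root = ET.fromstring(xml_text)
    for prop in root.findall("property"):
        if prop.findtext("name") == name:
            return prop.findtext("value")
    return None

# Assumed example content; a real hive-site.xml usually lives under /etc/hive/conf
sample = """<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>"""

print(get_hive_property(sample, "hive.metastore.uris"))
```

If this returns None for `hive.metastore.uris`, Spark would fall back to a local embedded metastore and would not see the cluster's Hive tables.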
03-18-2020
01:42 AM
Hi @Logica. Please check whether a database is selected before running the query. Below is sample code for reading a Hive table:

```python
from pyspark.context import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext('local', 'example')
hc = HiveContext(sc)

# Read a raw CSV file from HDFS
tf1 = sc.textFile("/user/BigData/nooo/SparkTest/train.csv")

# Read a Hive table from PySpark
hc.sql("use default")  # select the database here
spf = hc.sql("SELECT * FROM tempaz LIMIT 100")
spf.show(5)
```

Thanks, HadoopHelp
03-03-2020
06:06 AM
Dear all,
I created a Hive temporary table as below, but how can we identify whether a table is temporary or not?

```sql
CREATE TEMPORARY TABLE IF NOT EXISTS employee ( eid int, name String,
salary String, destination String)
COMMENT 'Employee details'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
STORED AS TEXTFILE;
```

I used the command below to describe it, but the output does not say whether the table is temporary:
```
describe formatted employee;

# col_name              data_type       comment
eid                     int
name                    string
salary                  string
destination             string

# Detailed Table Information
Database:               h7
OwnerType:              USER
Owner:                  ****
CreateTime:             Tue Mar 03 08:50:28 EST 2020
LastAccessTime:         UNKNOWN
Retention:              0
Location:               hdfs:***********
Table Type:             MANAGED_TABLE
Table Parameters:
    comment             Employee details

# Storage Information
SerDe Library:          org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
InputFormat:            org.apache.hadoop.mapred.TextInputFormat
OutputFormat:           org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
Compressed:             No
Num Buckets:            -1
Bucket Columns:         []
Sort Columns:           []
Storage Desc Params:
    field.delim         \t
    line.delim          \n
```
Thanks
HadoopHelp
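Note that in the output above the temporary table still reports `Table Type: MANAGED_TABLE`, so the `DESCRIBE FORMATTED` output alone does not reveal that it is temporary; the practical signal is that a temporary table is visible only in the session that created it. If you do want to pull the table type out of such output programmatically, a small pure-Python sketch (function name hypothetical) could look like this:

```python
def get_table_type(describe_output):
    """Extract the 'Table Type' value from DESCRIBE FORMATTED output."""
    for line in describe_output.splitlines():
        if line.strip().startswith("Table Type:"):
            return line.split("Table Type:", 1)[1].strip()
    return None

# Trimmed-down sample of the output shown in the post
sample = """
Database:   h7
Table Type: MANAGED_TABLE
Compressed: No
"""

print(get_table_type(sample))
```

A cross-session check (e.g. running `SHOW TABLES` from a second beeline session and seeing the table absent) remains the reliable way to confirm a table is temporary.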
Labels:
- Apache Hive
- Apache Impala
02-03-2020
07:17 AM
Hi @mike_bronson7. I think you need to continue here: https://community.cloudera.com/t5/Support-Questions/how-to-know-if-any-service-in-ambari-cluster-need-to-restart/td-p/228707 Thanks, HadoopHelp
12-18-2019
02:22 AM
Hi @cjervis. Please use the download link below; I just verified it: https://www.cloudera.com/downloads/hortonworks-sandbox/hdp.html Thanks, HadoopHelp
12-17-2019
07:54 AM
Hi all. Here are all the steps for doing the same: https://www.oreilly.com/library/view/hadoop-with-python/9781492048435/ch01.html Thanks, HadoopHelp
12-17-2019
07:48 AM
Hi all. Here are more details about the above: https://community.cloudera.com/t5/Support-Questions/HDInsight-Vs-HDP-Service-on-Azure-Vs-HDP-on-Azure-IaaS/m-p/166424 Thanks, HadoopHelp
12-16-2019
05:06 AM
Hi. Please try the step below:

```python
df = spark.read.format("csv").option("header", "true").load("csvfile.csv")
```

Just remove "hdfs:///" from the path. Also try creating a separate directory under your user directory (or elsewhere), load the data there, and pass that path in your code. Thanks, HadoopHelp
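The advice above, dropping the scheme prefix so the loader receives a plain absolute path, can be sketched as a small helper. This is a pure-Python illustration; the function name is hypothetical:

```python
def strip_hdfs_scheme(path):
    """Drop an 'hdfs://' or 'hdfs:///' scheme prefix, keeping the absolute path."""
    prefix = "hdfs://"
    if not path.startswith(prefix):
        return path
    rest = path[len(prefix):]
    # 'hdfs:///user/...' leaves rest starting with '/';
    # 'hdfs://host:8020/user/...' has an authority part to drop first
    if rest.startswith("/"):
        return rest
    slash = rest.find("/")
    return rest[slash:] if slash != -1 else "/"

print(strip_hdfs_scheme("hdfs:///user/data/csvfile.csv"))  # -> /user/data/csvfile.csv
```

The same idea applies to any scheme-qualified URI when the reader expects a path relative to the default filesystem.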
12-16-2019
04:17 AM
Hi @rohitmalhotra. Thanks! I found the solution already. The problem was that I was not able to get the exact HDFS path, as I mentioned in my last post. I checked the core-site.xml file for the Azure HDInsight cluster's default filesystem path. Thanks, HadoopHelp
11-17-2019
11:55 PM
Hi @rohitmalhotra. I want to create a Hive table on top of the HDInsight HDFS path. Is it possible to create a Hive table on the HDInsight HDFS path, or do we have to use the blob container address directly? I think I am not able to find the HDFS path on HDInsight. Thanks, HadoopHelp