10-15-2018 11:45 PM
I am trying to access an already existing Hive table from PySpark.
For example, the table "department" exists in Hive in the default database.
Error message:
18/10/15 22:01:23 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
18/10/15 22:02:35 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.1.0-cdh5.13.0
18/10/15 22:02:38 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
I checked the files below; they are the same.
Any help on how to set up the HiveContext from PySpark is highly appreciated.
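A minimal sketch of what usually works on CDH with the Spark 1.x API the question mentions: build a `HiveContext` on top of the `SparkContext` and query the table with SQL. This assumes `hive-site.xml` is visible to Spark (e.g. copied or symlinked into `$SPARK_HOME/conf`), otherwise Spark falls back to a local Derby metastore and you get exactly the "Failed to get database default, returning NoSuchObjectException" warning shown above. Table and app names here are illustrative.

```python
from pyspark import SparkContext
from pyspark.sql import HiveContext

# hive-site.xml must be on the driver's classpath (e.g. in $SPARK_HOME/conf)
# so Spark connects to the real Hive metastore instead of a local Derby one.
sc = SparkContext(appName="hive-access-example")
hc = HiveContext(sc)

# Query the existing Hive table in the default database.
df = hc.sql("SELECT * FROM default.department")
df.show()
```

On Spark 2.x the equivalent is `SparkSession.builder.enableHiveSupport().getOrCreate()`, after which `spark.sql("SELECT * FROM default.department")` behaves the same way; `HiveContext` still works there but is deprecated.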