
pyspark sql query


Hi,

I want to convert a SQL query to PySpark. Could you please help me with this?

Can someone take a look at the code below and let me know where I'm going wrong?

 

import os
import sys
from pyspark.sql import SQLContext
from pyspark import SparkContext, SparkConf
from pyspark.sql.types import *
from pyspark.sql.session import SparkSession

# Reuse an existing SparkContext if one is running, otherwise create a new one
sc = SparkContext.getOrCreate()
spark = SparkSession(sc)

sql_context = SQLContext(sc)

# The SQL statement must be passed as a quoted string
run_query = sql_context.sql("select location_consumer_id, effective_date from pearson_person limit 3")
run_query.show()
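
In case it is useful, here is a minimal sketch of the same query written against the SparkSession API directly via spark.sql(), assuming the pearson_person table is already registered in the metastore; the appName value is just a placeholder:

from pyspark.sql import SparkSession

# Build (or reuse) a SparkSession; enableHiveSupport() lets spark.sql()
# see tables registered in the Hive metastore
spark = (SparkSession.builder
         .appName("pearson_person_query")   # placeholder application name
         .enableHiveSupport()
         .getOrCreate())

# spark.sql() takes the statement as a string and returns a DataFrame
df = spark.sql("""
    SELECT location_consumer_id, effective_date
    FROM pearson_person
    LIMIT 3
""")
df.show()

With this approach a separate SQLContext is not needed, since SparkSession exposes the same sql() entry point.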

 

 

Thanks,
Lokesh