The query below works in Spark 1.6.0, but when I run the same query in Spark 2.0 it fails:
>>> dt = sqlContext.sql("select * from change_points where end_last_dttm > '2015-12-31' and mask = 'MW'").cache()
16/10/03 14:52:19 WARN BoneCPConfig: Max Connections < 1. Setting to 20
16/10/03 14:52:21 WARN BoneCPConfig: Max Connections < 1. Setting to 20
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/opt/cloudera/parcels/SPARK2-2.0.0.cloudera.beta1-1.cdh5.7.0.p0.108015/lib/spark2/python/pyspark/sql/context.py", line 360, in sql
return self.sparkSession.sql(sqlQuery)
File "/opt/cloudera/parcels/SPARK2-2.0.0.cloudera.beta1-1.cdh5.7.0.p0.108015/lib/spark2/python/pyspark/sql/session.py", line 543, in sql
return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
File "/opt/cloudera/parcels/SPARK2-2.0.0.cloudera.beta1-1.cdh5.7.0.p0.108015/lib/spark2/python/lib/py4j-0.10.3-src.zip/py4j/java_gateway.py", line 1133, in __call__
File "/opt/cloudera/parcels/SPARK2-2.0.0.cloudera.beta1-1.cdh5.7.0.p0.108015/lib/spark2/python/pyspark/sql/utils.py", line 69, in deco
raise AnalysisException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.AnalysisException: u'Table or view not found: change_points; line 1 pos 14'
It says the ‘change_points’ table is not found, but the table is right there in Hive.
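For what it's worth, a minimal way to check whether the Spark 2.0 session can see the Hive catalog at all. This is only a sketch under an assumption: the BoneCP warnings in the output often mean the session fell back to an embedded Derby metastore instead of connecting to the real Hive metastore, in which case Hive tables won't be visible. `enableHiveSupport()` is the standard Spark 2.0 builder call; whether it fixes this particular Cloudera beta parcel is the open question.

```python
from pyspark.sql import SparkSession

# Build a session with Hive support explicitly enabled. Without this
# (or without hive-site.xml on the driver classpath), Spark 2.0 uses
# an in-memory catalog that cannot see Hive tables like change_points.
spark = (SparkSession.builder
         .appName("hive-catalog-check")
         .enableHiveSupport()
         .getOrCreate())

# List what this session's catalog actually contains; change_points
# should appear here if the Hive metastore connection is working.
for t in spark.catalog.listTables():
    print(t.name, t.tableType)

# The original query, issued through the new session.
dt = spark.sql("select * from change_points "
               "where end_last_dttm > '2015-12-31' and mask = 'MW'").cache()
```

If `listTables()` comes back empty (or only shows temporary views), the session is not talking to the Hive metastore, which would explain the AnalysisException even though the table exists.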