With newer versions of Spark, the sqlContext is no longer created for you by default; you have to instantiate it explicitly:
scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
warning: there was one deprecation warning; re-run with -deprecation for details
sqlContext: org.apache.spark.sql.SQLContext = org.apache.spark.sql.SQLContext@6179af64
scala> import sqlContext.implicits._
import sqlContext.implicits._
scala> sqlContext.sql("describe mytable")
res2: org.apache.spark.sql.DataFrame = [col_name: string, data_type: string ... 1 more field]
I'm working with Spark 2.3.2.
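Note the deprecation warning in the transcript: since Spark 2.0 the recommended entry point is the SparkSession, which spark-shell exposes as `spark`. A minimal sketch of the equivalent calls, assuming you are inside spark-shell 2.x where `spark` is predefined:

```scala
// Reuse the session's SQLContext instead of constructing a new one
// (avoids the deprecation warning from `new SQLContext(sc)`):
val sqlContext = spark.sqlContext
import sqlContext.implicits._

// Or skip SQLContext entirely and run SQL through the session itself:
spark.sql("describe mytable").show()
```

Both routes return the same `DataFrame` as in the transcript above; the `SQLContext` constructor is kept only for backward compatibility.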