11-15-2016 03:15 PM
@Anchika Agarwal
Assuming that reading from and writing to Teradata works much like MySQL or PostgreSQL, you will need to include the Teradata JDBC driver on the Spark classpath.
$ bin/spark-shell --jars teradata-jdbc.jar
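(The SPARK_CLASSPATH environment variable is deprecated; --jars also ships the driver to the executors.) Note that some Teradata JDBC driver versions are distributed as two jars, terajdbc4.jar and tdgssconfig.jar; if yours is, include both:
$ bin/spark-shell --jars terajdbc4.jar,tdgssconfig.jar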
Use the following code in the Spark shell, modifying the parameters as needed:
scala> val jdbcUsername = "USER_NAME"
scala> val jdbcPassword = "PASSWORD"
scala> val jdbcHostname = "HOSTNAME"
scala> val jdbcPort = 1025  // 1025 is Teradata's default; change if yours differs
scala> val jdbcDatabase = "DATABASE"
scala> // Teradata JDBC URLs take comma-separated options after the host,
scala> // not the ?user=...&password=... query syntax of MySQL/PostgreSQL
scala> val jdbcUrl = s"jdbc:teradata://${jdbcHostname}/DATABASE=${jdbcDatabase},DBS_PORT=${jdbcPort}"
scala> val connectionProperties = new java.util.Properties()
scala> connectionProperties.put("user", jdbcUsername)      // credentials go in the Properties
scala> connectionProperties.put("password", jdbcPassword)  // object used by write.jdbc below
scala> Class.forName("com.teradata.jdbc.TeraDriver")       // Teradata's driver class is TeraDriver
scala> import java.sql.DriverManager
scala> val connection = DriverManager.getConnection(jdbcUrl, jdbcUsername, jdbcPassword)
scala> connection.isClosed()  // should return false if the connection succeeded
scala> sqlContext.table("jdbcDF").withColumnRenamed("table", "table_number").write.jdbc(jdbcUrl, "tablename", connectionProperties)
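For the read direction, here is a minimal sketch using the same URL and properties; the table name source_table is a placeholder for an existing table in your Teradata database:
scala> val teradataDF = sqlContext.read.jdbc(jdbcUrl, "source_table", connectionProperties)
scala> teradataDF.printSchema()  // verify the schema Spark inferred from Teradata
scala> teradataDF.show(5)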