SET LOCATION in dbWriteTable in R
Labels: Apache Hive, Cloudera Hue
Created 11-10-2021 04:43 PM
I am using RStudio and I need to save some tables in my sandbox, but I cannot specify the location: by default the tables are written to the path hdfs://haprod/warehouse/tablespace/managed/hive/cld_ml_bi_eng.db/iris. How can I specify the location when writing the table, so that it is created with LOCATION '/sandbox/CLD_ML_BI_ENG/iris'?
library(DBI)
library(odbc)
con_Hive <- dbConnect(odbc::odbc(), "Hive_produccion", database = "cld_ml_bi_eng", encoding = "latin1")
# Hive column names cannot contain dots, so replace them with underscores
colnames(iris) <- gsub(pattern = "\\.", replacement = "_", x = colnames(iris))
# This creates the table under the default warehouse path
dbWriteTable(conn = con_Hive, name = Id(schema = "cld_ml_bi_eng", table = "iris"), value = iris)
# Desired: create the table with LOCATION '/sandbox/CLD_ML_BI_ENG/iris'
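As the reply below notes, the location is easiest to set when the table is created. A minimal sketch of that approach, assuming the con_Hive connection above and that your user is allowed to create external tables; the DDL, column types, and paths here are illustrative, not a confirmed setup:

# Create the table yourself with the desired LOCATION
dbExecute(con_Hive, "
  CREATE EXTERNAL TABLE cld_ml_bi_eng.iris (
    sepal_length DOUBLE,
    sepal_width  DOUBLE,
    petal_length DOUBLE,
    petal_width  DOUBLE,
    species      STRING
  )
  LOCATION '/sandbox/CLD_ML_BI_ENG/iris'
")

# Then append the data into the pre-created table instead of letting
# dbWriteTable create it in the default warehouse path
dbWriteTable(conn = con_Hive,
             name = Id(schema = "cld_ml_bi_eng", table = "iris"),
             value = iris,
             append = TRUE)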
Created 11-12-2021 02:15 AM
AFAIK the table location should be set at the time of creation.
You can, however, point an existing table at files in a different location: use ALTER TABLE jsont1 SET LOCATION "hdfs://mycluster:8020/jsam/j1"; to change the location, and then move the files manually from the old location to the new one.
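Applied to the table from the original post, that statement could be run from the same DBI connection; a minimal sketch, assuming the con_Hive connection above and that the full hdfs://haprod/... URI is the right namespace for this cluster (the path is taken from the question and is illustrative):

# Point the existing table at the sandbox path
dbExecute(con_Hive, "
  ALTER TABLE cld_ml_bi_eng.iris
  SET LOCATION 'hdfs://haprod/sandbox/CLD_ML_BI_ENG/iris'
")

Any data files already written to the old warehouse directory then have to be moved by hand (for example with hdfs dfs -mv) into the new location.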
