Member since: 12-09-2015
Posts: 43
Kudos Received: 18
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 13336 | 12-17-2015 07:27 AM
01-20-2016 12:34 PM
I don't have a timestamp column in my table. I want to know how to do the incremental import without a DATE-format column.
01-20-2016 06:24 AM
Thanks, but I want to know how to do this with Sqoop incremental import metadata and get the updated column value into the Hive table.
01-20-2016 05:07 AM
1 Kudo
MySQL table:

no | student name | dept
---|---|---
1 | siva | IT
2 | raj | cse

I created a Sqoop incremental job to move the data into a Hive table (`sqoop job --exec student_info`).

Hive table:

no | student name | dept
---|---|---
1 | siva | IT
2 | raj | cse

This works fine. Then I updated the `dept` column value in the MySQL table from IT to EEE for id 1:

no | student name | dept
---|---|---
1 | siva | EEE

Now when I run the Sqoop incremental import job again (`sqoop job --exec student_info`), it shows this message:

16/01/20 04:41:42 INFO tool.ImportTool: Incremental import based on column `id`
16/01/20 04:41:42 INFO tool.ImportTool: No new rows detected since last import.
[root@sandbox ~]

The updated data does not move into the Hive table. I want to know how to move the updated value into the Hive table, or, if that is not possible, how to move it into a NoSQL (HBase) table.
Labels:
- Apache HBase
- Apache Hive
- Apache Sqoop
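Sqoop's `--incremental append` mode only detects new rows (here, ids greater than the last imported `id`), so in-place updates are never picked up. A sketch of two common workarounds; the connection string, credentials, and the `last_modified` column are illustrative assumptions, not taken from the original job:

```shell
# Option 1: lastmodified mode, which re-imports rows whose timestamp column
# changed and merges them on the key column. This requires the MySQL table
# to carry a last-updated timestamp column (assumed here as last_modified).
sqoop job --create student_info_upd -- import \
  --connect jdbc:mysql://localhost/studentdb \
  --username root -P \
  --table student \
  --incremental lastmodified \
  --check-column last_modified \
  --merge-key no \
  --target-dir /user/hive/warehouse/student_info

sqoop job --exec student_info_upd

# Option 2: import into HBase instead. A put on an existing row key
# overwrites the old cell values, so updated rows simply replace old ones.
sqoop import \
  --connect jdbc:mysql://localhost/studentdb \
  --username root -P \
  --table student \
  --hbase-table student_info \
  --column-family info \
  --hbase-row-key no
```

Without a timestamp column in the source table, the usual fallbacks are a periodic full re-import or the HBase route.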
01-07-2016 05:37 AM
1 Kudo
I want to visualize Hive table data with d3.js, but I don't know how to connect d3.js to a Hive table. Can anyone help? Also, if you know any other open-source visualization tools, please let me know.
Labels:
- Apache Hive
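d3.js runs in the browser and cannot talk to Hive directly; a common pattern is to export the query result to CSV and let d3 load that file. A sketch, where the host, port, query, and file names are assumptions to adapt to your cluster:

```shell
# Export a Hive query result as CSV with beeline (HiveServer2 client)
beeline -u jdbc:hive2://localhost:10000/default \
  --outputformat=csv2 --silent=true \
  -e "SELECT dept, COUNT(*) AS cnt FROM student GROUP BY dept" > dept_counts.csv

# Serve dept_counts.csv next to your d3 page; in the browser, load it with:
#   d3.csv("dept_counts.csv", function(rows) { /* build the chart */ });
```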
12-29-2015 07:13 PM
1 Kudo
Through Ambari, or with any curl command you know, I want to know how to upgrade the Hive version in Hortonworks, and whether it will cause any issues in the future.
12-29-2015 04:26 AM
I already have that jar file in my Hortonworks installation. I want to know how to compile and execute the file on that path.
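A sketch of compiling and running the JDBC client on the sandbox; the `/usr/hdp/current/...` classpath globs are an assumption based on typical HDP layouts, so adjust them to wherever your Hive and Hadoop client jars actually live:

```shell
# Compile against the Hive/Hadoop client jars already on the box
javac -cp "/usr/hdp/current/hive-client/lib/*:/usr/hdp/current/hadoop-client/*" \
  HiveJdbcClient.java

# Run with the same jars on the classpath, plus the current directory
java -cp ".:/usr/hdp/current/hive-client/lib/*:/usr/hdp/current/hadoop-client/*" \
  HiveJdbcClient
```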
12-28-2015 03:33 PM
I want to execute the program below in Hortonworks. Can someone help me?

```java
import java.sql.SQLException;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.DriverManager;

public class HiveJdbcClient {
    // HiveServer1 driver; for HiveServer2 use "org.apache.hive.jdbc.HiveDriver"
    // with a "jdbc:hive2://" connection URL instead.
    private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

    public static void main(String[] args) throws SQLException {
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        }
        Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");
        Statement stmt = con.createStatement();
        String tableName = "testHiveDriverTable";
        // DDL and load statements return no result set, so use execute()
        // rather than executeQuery()
        stmt.execute("drop table " + tableName);
        stmt.execute("create table " + tableName + " (key int, value string)");
        // show tables
        String sql = "show tables '" + tableName + "'";
        System.out.println("Running: " + sql);
        ResultSet res = stmt.executeQuery(sql);
        if (res.next()) {
            System.out.println(res.getString(1));
        }
        // describe table
        sql = "describe " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1) + "\t" + res.getString(2));
        }
        // load data into table
        // NOTE: filepath has to be local to the hive server
        // NOTE: /tmp/a.txt is a ctrl-A separated file with two fields per line
        String filepath = "/tmp/a.txt";
        sql = "load data local inpath '" + filepath + "' into table " + tableName;
        System.out.println("Running: " + sql);
        stmt.execute(sql);
        // select * query
        sql = "select * from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
        }
        // regular hive query
        sql = "select count(1) from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1));
        }
    }
}
```
Labels:
- Apache Hadoop
- Apache Hive
12-17-2015 07:27 AM
2 Kudos
Use the reflect UDF to generate UUIDs: `reflect("java.util.UUID", "randomUUID")`
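Under the hood, `reflect` looks up the named class and static method by reflection and invokes it per row. The equivalent call in plain Java, as a standalone illustration:

```java
import java.lang.reflect.Method;

public class ReflectDemo {
    public static void main(String[] args) throws Exception {
        // Mirror what Hive's reflect("java.util.UUID", "randomUUID") does:
        // resolve the class and static method by name, then invoke it.
        Method m = Class.forName("java.util.UUID").getMethod("randomUUID");
        Object uuid = m.invoke(null);  // null receiver: static method
        System.out.println(uuid.toString());  // a random UUID each run
    }
}
```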
12-09-2015 02:26 PM
2 Kudos
Thanks a lot, but one minor correction to your comment: `-put` needs a `-` argument (space-separated, meaning read from stdin) before the destination file path:

hadoop fs -text /hdfs_path/compressed_file.gz | hadoop fs -put - /hdfs_path/uncompressed-file.txt
12-09-2015 01:37 PM
I want to extract the file within HDFS. I don't want to extract the file outside HDFS and then put the file back into HDFS again.