Member since: 05-24-2016
Posts: 17
Kudos Received: 3
Solutions: 0
02-21-2017
04:41 AM
I am using HDP 2.3.

JDBC code:

    // 1. Open the connection
    Connection con = DriverManager.getConnection("jdbc:hive2://ppp.qqq.net:10000/default", "hive", "hive");
    Statement stmt = con.createStatement();

    // 2. Find the timestamp
    ResultSet res_max = stmt.executeQuery(history_max_time_stamp_query);
    if (res_max.next()) {
        time_stamp = res_max.getString(1);
        System.out.println(res_max.getString(1));
    }

    // 3. Delete the matching rows
    while (history_delete_res1.next()) {
        String delete = "delete from " + orc_table_name + " where " + primary_key + "=" + value;
        System.out.println("delete: " + delete);
        System.out.println("=====================");
        stmt.execute(delete);   // DELETE is DML, so execute()/executeUpdate() rather than executeQuery()
    }

JARs used:

    javac -cp /usr/hdp/2.3.4.7-4/hadoop/hadoop-common-2.7.1.2.3.4.7-4.jar:/usr/hdp/2.3.4.7-4/hive/lib/hive-common-1.2.1.2.3.4.7-4.jar:/usr/hdp/2.3.4.7-4/hive/lib/hive-jdbc-1.2.1.2.3.4.7-4-standalone.jar:. hive_connection.java
    java -cp /usr/hdp/2.3.4.7-4/hadoop/hadoop-common-2.7.1.2.3.4.7-4.jar:/usr/hdp/2.3.4.7-4/hive/lib/hive-common-1.2.1.2.3.4.7-4.jar:/usr/hdp/2.3.4.7-4/hive/lib/hive-jdbc-1.2.1.2.3.4.7-4-standalone.jar:. hive_connection

Table description: ORC with transactional=true (I am able to delete records through the Hive shell).

Error:

    Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations.

I am able to delete data through the Hive shell but not through the JDBC connection.
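For comparison, a minimal sketch (in Scala over plain java.sql) of issuing the same kind of DELETE with the ACID-related settings passed as hiveconf entries on the JDBC URL. Whether per-session settings are sufficient here, rather than enabling ACID for the whole HiveServer2 in hive-site.xml, is an assumption, and the table name and predicate are placeholders:

    import java.sql.DriverManager

    // assumption: pass the ACID settings per session via the "?hive_conf_list" part of the URL
    val url = "jdbc:hive2://ppp.qqq.net:10000/default" +
      "?hive.support.concurrency=true" +
      ";hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager"
    val con = DriverManager.getConnection(url, "hive", "hive")
    val stmt = con.createStatement()
    // execute() rather than executeQuery(): DELETE does not return a ResultSet
    stmt.execute("delete from my_orc_table where id = 1")   // placeholder table and predicate
    con.close()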
Labels: Apache Hive
12-22-2016
07:26 AM
Sample data:

    | 123| 001|   |0|20161219|
    | 124| 002|   |1|20161219|
    | 124| 003|002|1|20161219|

How do I use a Spark DataFrame together with Spark core functions like map in Scala? How do I put a variable value into each row of the DataFrame, and is that even possible, given that a DataFrame is immutable? If I convert the DataFrame to an RDD, how do I change the third column of each line to a variable value that increments by 1 for every line?
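A DataFrame cannot be updated in place, but a new RDD (or DataFrame) can be built from it. A minimal sketch, assuming the running counter belongs in the line_number column (index 1 in the sample above; adjust the index if a different column is meant): zipWithIndex on the underlying RDD gives an incrementing value per line.

    import org.apache.spark.sql.Row

    // assumption: overwrite the line_number column (index 1) with a zero-padded counter
    val withCounter = df.rdd.zipWithIndex.map { case (r, i) =>
      Row.fromSeq(r.toSeq.updated(1, f"${i + 1}%03d"))
    }
    withCounter.collect().foreach(println)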
12-21-2016
01:21 PM
1) val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
2) val df = sqlContext.sql("select * from v_main_test")
3) df.show()

    |vbeln|line_number|parent_line_number|flag|      dt|
    |  123|           |                  |   0|20161219|
    |  124|           |                  |   1|20161219|

    df.map(row => {
      val row1 = row.getAs[String]("vbeln")
      val make = if (row1.toLowerCase == "125") "S" else "its 123"
      Row(row(0), make, row(0))
    }).collect().foreach(println)

Why doesn't Row work here? I have to use a map function because I need to process each line and set its line_number and parent_line_number. The output should be: line_number becomes 001 (no new line needed), but for flag = 1 the same line has to appear twice, like this:

    | 123| 001|   |0|20161219|
    | 124| 002|   |1|20161219|
    | 124| 003|002|1|20161219|

How do I combine a Spark DataFrame with Spark core functions like map in Scala?
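A minimal sketch of one way this could be done, assuming the column order shown by df.show() above, that the flag column decides which rows get duplicated, and that the copies can simply continue numbering after the last base row (the count() call implies a small table). Row has to be imported explicitly:

    import org.apache.spark.sql.Row

    // pass 1: give every row a sequential, zero-padded line_number (001, 002, ...)
    val base = df.rdd.zipWithIndex.map { case (r, idx) =>
      Row(r.get(0), f"${idx + 1}%03d", "", r.get(3), r.get(4))
    }

    // pass 2: for each flag == 1 row, emit a copy whose parent_line_number is the
    // line_number assigned in pass 1; its own number continues the sequence
    val n = base.count()
    val copies = base.filter(_.get(3).toString == "1").zipWithIndex.map { case (r, i) =>
      Row(r.get(0), f"${n + i + 1}%03d", r.getString(1), r.get(3), r.get(4))
    }

    base.union(copies).collect().foreach(println)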
Labels: Apache Spark
12-09-2016
09:55 AM
I have more than six hive-site.xml files:

    /etc/hive/conf.install/hive-site.xml
    /etc/hive/2.3.4.7-4/0/hive-site.xml
    /etc/hive/2.3.4.7-4/0/conf.server/hive-site.xml
    /var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/configuration/hive-site.xml
    /var/lib/ambari-agent/cache/stacks/HDP/2.0.6.GlusterFS/services/HIVE/configuration/hive-site.xml
    /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/HIVE/configuration/hive-site.xml
    /var/lib/ambari-agent/cache/stacks/HDP/2.1.GlusterFS/services/HIVE/configuration/hive-site.xml
    /var/lib/ambari-agent/cache/stacks/HDP/2.2/services/HIVE/configuration/hive-site.xml
    /var/lib/ambari-agent/cache/stacks/HDP/2.3/services/HIVE/configuration/hive-site.xml
    /var/lib/ambari-agent/cache/stacks/HDP/2.3.GlusterFS/services/HIVE/configuration/hive-site.xml
    /var/lib/smartsense/hst-agent/data/tmp/hdp-dev-n0-a-00044692-c-00013299_hdpdev_0_2016-09-04_20-00-01/services/HIVE/conf/hive-site.xml
    /var/lib/smartsense/hst-agent/data/tmp/hdp-dev-n0-a-00044692-c-00013299_hdpdev_0_2016-09-04_20-00-01/services/HIVE/conf/conf.server/hive-site.xml
    /usr/hdp/2.3.4.7-4/etc/hive/conf.dist/hive-site.xml

Which one should I refer to? I need to check what the hive.aux.jars.path directory is: which file should I add it to, what do all of these files mean, and are they really used?
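One way to see which hive-site.xml a client actually resolves, and what hive.aux.jars.path is set to, is to ask the Hadoop Configuration directly. A minimal sketch; that the file on the client classpath (typically the one under /etc/hive/conf) is the one being picked up is an assumption about the setup:

    import org.apache.hadoop.conf.Configuration

    val conf = new Configuration(false)
    conf.addResource("hive-site.xml")                             // the first hive-site.xml on the classpath wins
    println("loaded from: " + conf.getResource("hive-site.xml"))  // which file that actually is
    println("hive.aux.jars.path = " + conf.get("hive.aux.jars.path"))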
Labels: Apache Hive
08-08-2016
03:56 AM
2 Kudos
Hi, I am using HDP 2.3.0. HDFS log:

ERROR datanode.DataNode (DataXceiver.java:run(278)) - hdp:50010:DataXceiver error processing unknown operation src: /127.0.0.1:34584 dst: /127.0.0.1:50010
java.io.EOFException
at java.io.DataInputStream.readShort(DataInputStream.java:315)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
at java.lang.Thread.run(Thread.java:745)
2016-08-08 00:26:21,672 ERROR datanode.DataNode (LogAdapter.java:error(69)) - RECEIVED SIGNAL 15: SIGTERM
2016-08-08 00:26:50,714 INFO datanode.DataNode (LogAdapter.java:info(45)) - STARTUP_MSG:
java.net.BindException: Problem binding to [0.0.0.0:50010] java.net.BindException: Address already in use; For more details see: http://wiki.apache.org/hadoop/BindException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
...
Caused by: java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:437)
...
2016-08-08 00:26:51,970 INFO util.ExitUtil (ExitUtil.java:terminate(124)) - Exiting with status 1
2016-08-08 00:26:51,974 INFO datanode.DataNode (LogAdapter.java:info(45)) - SHUTDOWN_MSG:

Checks so far:
1) ps -ef | grep datanode : no DataNode process is running.
2) nc -l 50010 : the port is already in use (a small bind-test sketch also follows the report below).
3) netstat -nap | grep 50010 : many connections, but all in CLOSE_WAIT, and I am not able to find a process id.
4) hadoop dfsadmin -report : Dead datanodes (1):
Name: 10.0.0.14:50010 (hdp-n4)
Hostname: hdp-n4
Decommission Status : Normal
Configured Capacity: 0 (0 B)
DFS Used: 0 (0 B)
Non DFS Used: 0 (0 B)
DFS Remaining: 0 (0 B)
DFS Used%: 100.00%
DFS Remaining%: 0.00%
Configured Cache Capacity: 0 (0 B)
Cache Used: 0 (0 B)
Cache Remaining: 0 (0 B)
Cache Used%: 100.00%
Cache Remaining%: 0.00%
Xceivers: 0
Last contact: Fri Aug 05 20:51:37 UTC 2016
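As referenced in check 2 above, a minimal bind-test sketch (the port number is taken from the log) that confirms, independently of the DataNode, whether 50010 can currently be bound:

    import java.net.{BindException, ServerSocket}

    // try to bind the DataNode streaming port; if something still holds it,
    // this fails with the same "Address already in use" BindException
    try {
      val s = new ServerSocket(50010)
      println("port 50010 is free")
      s.close()
    } catch {
      case e: BindException => println("port 50010 is still in use: " + e.getMessage)
    }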
Labels: Apache Hadoop
05-24-2016
05:30 AM
I am able to access and create files in DFS. I checked the fsck status from the same node: it is healthy.
05-24-2016
02:56 AM
ERROR datanode.DataNode (DataXceiver.java:run(278)) - hostname:50010:DataXceiver error processing unknown operation src: /127.0.0.1:49632 dst: /127.0.0.1:50010
java.io.EOFException
at java.io.DataInputStream.readShort(DataInputStream.java:315)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.readOp(Receiver.java:58)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:227)
at java.lang.Thread.run(Thread.java:745)

I am just adding a new DataNode through Ambari and am facing this issue. In Ambari the node shows as working, but in the logs I keep seeing this ERROR. I have checked that my firewall is off and that /etc/hosts is set up correctly (SSH works fine).
Labels: Apache Hadoop