Member since: 05-05-2017
Posts: 33
Kudos Received: 1
Solutions: 0
07-30-2018
06:25 PM
Dear @Felix Albani, I just had a look at the link you sent me and ran some tests. The command hbase backup set add backtest mytestTable gives me an error: org.apache.hadoop.hbase.TableNotFoundException: Table 'hbase:backup' was not found, got: WRONG_ID. Why is this happening? Thanks
07-30-2018
04:32 PM
Dear All, I am very new to HBase and HBase snapshots, and I want to ask a question about them. When I take an HBase snapshot of a table, does the snapshot continue to track changes to that table? If it does not, does that mean I need to keep deleting the old snapshot and taking a new one, for example once a week? Thanks, Bin Ye
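For context, this is roughly how the "delete the old snapshot and take a new one each week" idea would look with the HBase client Admin API. This is a minimal sketch: the snapshot and table names are placeholders, and the weekly scheduling itself would live in cron or a scheduler.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class WeeklySnapshot {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();   // reads hbase-site.xml from the classpath
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Admin admin = connection.getAdmin()) {
            String snapshotName = "mytestTable_weekly";      // placeholder snapshot name
            TableName table = TableName.valueOf("mytestTable");
            try {
                admin.deleteSnapshot(snapshotName);          // drop last week's snapshot if it exists
            } catch (Exception e) {
                // no previous snapshot yet; nothing to delete
            }
            admin.snapshot(snapshotName, table);             // point-in-time snapshot of the table
        }
    }
}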
Labels:
- Apache Hadoop
- Apache HBase
07-19-2018
01:23 PM
Dear All, may I ask a question about software architecture? I want to build an image recognition application with Apache Spark. My original thinking is: 1. Spring Boot (RESTful API) receives an image; 2. Spring Boot sends the image to Spark; 3. Spark processes the image and sends the result back to Spring Boot; 4. Spring Boot returns the result to the user. Is this a good way to develop such an application, and are there any examples I can find?
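To make the idea concrete, here is a rough sketch of steps 1 and 2 (not a working setup: the upload path, jar location, main class and master URL are all placeholder assumptions). A Spring Boot endpoint accepts the image and submits a Spark job with SparkLauncher; the result for steps 3 and 4 would then come back asynchronously, for example through a table or queue that the API polls.

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import org.apache.spark.launcher.SparkLauncher;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class ImageRecognitionController {

    @PostMapping("/images/recognize")
    public ResponseEntity<String> recognize(@RequestParam("file") MultipartFile file) throws Exception {
        // Step 1: receive the image and stage it where the Spark cluster can read it.
        Path staged = Paths.get("/shared/images", file.getOriginalFilename());   // placeholder location
        Files.write(staged, file.getBytes());

        // Step 2: submit a Spark job for this image; the jar and main class are hypothetical.
        new SparkLauncher()
                .setMaster("yarn")
                .setAppResource("/opt/jobs/spark-recognition.jar")
                .setMainClass("com.example.RecognizeImageJob")
                .addAppArgs(staged.toString())
                .startApplication();

        // Steps 3 and 4 would be handled asynchronously once the job finishes.
        return ResponseEntity.accepted().body("image submitted for recognition");
    }
}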
Labels:
- Apache Spark
01-26-2018
11:24 PM
Hi All, I have a small HBase cluster and a REST server which sends data to it. However, it currently connects through /hbase-unsecure. How can I enable security on the HBase cluster so that it uses /hbase-secure, and how can I make HBase accept only data sent from the REST server? The REST server sends SQL commands to Apache Phoenix, and Apache Phoenix inserts the data into HBase. I just want very basic username and password authentication. Thanks, Bin Ye
Labels:
01-16-2018
08:35 PM
How to disable the connection pool?
01-16-2018
02:43 PM
Hi All, I have a Spring Boot application that connects to Phoenix on HBase. I have a POST endpoint like this: @RequestMapping(method = RequestMethod.POST, value = "data/batchinsert/")
public ResponseEntity<?> batchInsertHaEnvrironmentData(@RequestBody List<HaEnvrironmentData> tts) {
int i = heds.batchInsert(tts);
if (i <= 0)
return new ResponseEntity<String>("failed", HttpStatus.NO_CONTENT);
return new ResponseEntity<String>("Successed", HttpStatus.OK);
}
The DAO: public int batchInsert(List<HaEnvrironmentData> eds){
String sql=EvotionSqlUtils.getUpsertSql("HA_ENVRIRONMENT_DATA", true, HaEnvrironmentEnum.class, null);
BatchPreparedStatementSetter bpss = new BatchPreparedStatementSetter() {
@Override
public void setValues(PreparedStatement ps, int i) throws SQLException {
HaEnvrironmentData ed=eds.get(i);
Connection conn= getConnection();
ps.setString(1, ed.getId());
ps.setTimestamp(2, ed.getRecord_date());
ps.setString(3, ed.getAid_id());
ps.setShort(4, ed.getEnvironment_classification_unit());
}
@Override
public int getBatchSize() {
return eds.size();
}
};
int[] counts=jdbcTemplate.batchUpdate(sql, bpss);
return counts.length;
}
The data list looks like this:
[
{"id":"EX2017000021",
"record_date":1515747937238,"type":2,
"aid_id":"82E3735E",
"environment_classification_unit":0},
....
]
This works fine with two or three items in the list. However, it fails when I try about 100 items, and shows the error below: org.apache.tomcat.jdbc.pool.PoolExhaustedException: [http-nio-8080-exec-2] Timeout: Pool empty. Unable to fetch a connection in 30 seconds, none available[size:100; busy:100; idle:0; lastwait:30000].
at org.apache.tomcat.jdbc.pool.ConnectionPool.borrowConnection(ConnectionPool.java:706)
at org.apache.tomcat.jdbc.pool.ConnectionPool.getConnection(ConnectionPool.java:198)
at org.apache.tomcat.jdbc.pool.DataSourceProxy.getConnection(DataSourceProxy.java:132)
at org.evotion.datarepository.dao.BaseDaoImpl.getConnection(BaseDaoImpl.java:115)
at org.evotion.datarepository.dao.HaEnvrironmentDataDaoImpl.insert(HaEnvrironmentDataDaoImpl.java:43)
at org.evotion.datarepository.dao.HaEnvrironmentDataDaoImpl.batchInsert(HaEnvrironmentDataDaoImpl.java:55)
at org.evotion.datarepository.dao.HaEnvrironmentDataDaoImpl$$FastClassBySpringCGLIB$$618b5314.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204)
By the way, JdbcTemplate is supposed to release connections automatically, so why does the connection pool get exhausted every time I call the POST endpoint above? Please help me. Many thanks in advance.
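One thing I notice in the stack trace is the getConnection() call inside setValues: if every row of the batch borrows a pooled connection that is never returned, a batch of 100 rows would empty a pool of 100 connections. For comparison, a version of the DAO without that call would look like this (JdbcTemplate checks out a single connection for the whole batch and releases it when batchUpdate returns):

public int batchInsert(final List<HaEnvrironmentData> eds) {
    String sql = EvotionSqlUtils.getUpsertSql("HA_ENVRIRONMENT_DATA", true, HaEnvrironmentEnum.class, null);
    BatchPreparedStatementSetter bpss = new BatchPreparedStatementSetter() {
        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            HaEnvrironmentData ed = eds.get(i);
            // no getConnection() here: the PreparedStatement already belongs to the
            // connection JdbcTemplate checked out for this batch
            ps.setString(1, ed.getId());
            ps.setTimestamp(2, ed.getRecord_date());
            ps.setString(3, ed.getAid_id());
            ps.setShort(4, ed.getEnvironment_classification_unit());
        }
        @Override
        public int getBatchSize() {
            return eds.size();
        }
    };
    return jdbcTemplate.batchUpdate(sql, bpss).length;
}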
Labels:
- Apache HBase
- Apache Phoenix
- Apache Spark
09-27-2017
12:40 PM
Many thanks for your reply. I found the errors: I have two versions of Hadoop installed, and the config folder is a symlink to another location.
09-22-2017
08:32 PM
Hi @Ajay, when I format the NameNode, it prints: 17/09/22 21:54:03 ERROR namenode.NameNode: Failed to start namenode.
java.io.IOException: Cannot create directory /hadoop/hdfs/namenode/current
In fact, hdfs-site.xml is configured to store data in file:/home/hduser/hadoop_store/hdfs/namenode, which is a completely different location, so I don't understand how this can happen. Do you have any idea why? Many thanks in advance. Bin Ye
09-22-2017
04:53 PM
Hi, I manually installed Hadoop 2.7.4, but the NameNode never starts; I can see that the DataNode and SecondaryNameNode start. core-site.xml: <property>
<name>hadoop.tmp.dir</name>
<value>/home/hduser/tmp</value>
<description>A base for other temporary directories.</description>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://evotion00:54310</value>
</property>
hdfs-site.xml: <property>
<name>dfs.replication</name>
<value>3</value>
<description>Default block replication.
The actual number of replications can be specified when the file is created.
The default is used if replication is not specified in create time.
</description>
</property>
<property>
<name>dfs.name.dir</name>
<value>file:/home/hduser/hadoop_store/hdfs/namenode</value>
</property>
<property>
<name>dfs.data.dir</name>
<value>file:/home/hduser/hadoop_store/hdfs/datanode</value>
</property>
I tried hadoop namenode -format and a restart, but it is not working. Many thanks in advance.
Labels:
- Apache Hadoop
09-07-2017
10:29 AM
Hi, I am using Phoenix, HBase and Spring, with JdbcTemplate to connect to HBase. I have the following code: Person p = new Person();
p.setName("Jone");
p.setAge(23);
p.setSex("male");
String sql="UPSERT INTO TABLE_1 (NAME, AGE, SEX) VALUES (?, ?, ?)";
Object[] params = {p.getName(), p.getAge(), p.getSex()};
jdbcTemplate.update(sql, params); Now I want to use reflection to construct the Object array, and I have the following code: BeanWrapperImpl wrapper = new BeanWrapperImpl(p);
ArrayList<Object> param = new ArrayList<Object>();
Object value1 = wrapper.getPropertyValue("name");
Object value2 = wrapper.getPropertyValue("age");
Object value3 = wrapper.getPropertyValue("sex");
param.add(value1);
param.add(value2);
param.add(value3);
jdbcTemplate.update(sql, param);
But it fails. It looks like getPropertyValue returns values typed as Object, while jdbcTemplate.update() seems to require the actual types. Can anyone help me?
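One thing worth checking, sketched here under the assumption that p, sql and jdbcTemplate are the same objects as above: jdbcTemplate.update(String, Object...) expects an array or varargs and binds each element by its runtime type, so values coming back from getPropertyValue as Object are generally fine; converting the list with toArray() before the call may be all that is needed.

BeanWrapperImpl wrapper = new BeanWrapperImpl(p);
List<Object> param = new ArrayList<Object>();
for (String property : new String[] {"name", "age", "sex"}) {
    // getPropertyValue returning Object is fine; the driver sees the runtime type at bind time
    param.add(wrapper.getPropertyValue(property));
}
jdbcTemplate.update(sql, param.toArray());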
Labels:
- Apache Phoenix
09-06-2017
01:47 PM
I have an Apache Phoenix sequence which is defined as: CREATE SEQUENCE ID.COUNT; When I insert data through sqlline.py, COUNT starts from 1. If I insert data through the JDBC template, COUNT starts from 101, then 201; it looks as if 01 is automatically appended to the end of COUNT. How can I fix this issue?
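If the jumps are caused by sequence value caching (a Phoenix client reserves a block of sequence values per connection; the block size defaults to 100 via phoenix.sequence.cacheSize), then recreating the sequence with a cache of 1 should hand out consecutive values, at the cost of a server round trip per value. A sketch of that idea, run through the same JdbcTemplate; note that dropping the sequence resets it:

// Recreate the sequence without client-side value caching.
jdbcTemplate.execute("DROP SEQUENCE IF EXISTS ID.COUNT");
jdbcTemplate.execute("CREATE SEQUENCE ID.COUNT CACHE 1");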
Labels:
- Apache Phoenix
07-27-2017
11:57 AM
Hi Ankit, Thanks for your information. I solved the problem by changing long[] to Long[].
07-27-2017
11:54 AM
Hi Josh, Thanks for your information. Bin Ye
07-25-2017
12:47 PM
Hi All, I need to store usernames and passwords in Phoenix/HBase, and this table needs to be encrypted. Are there any examples or materials I can read? Bin Ye
Labels:
- Apache HBase
- Apache Phoenix
07-13-2017
06:06 PM
Hi, I have a Spring Boot application. It runs on localhost, but when I deploy the WAR file on the server it cannot start and produces these errors: SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/binye/.m2/repository/ch/qos/logback/logback-classic/1.1.11/logback-classic-1.1.11.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/binye/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
However, I could not find which dependencies conflict. My pom.xml is below: <?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.evotion.datarepository</groupId>
<artifactId>datarepository</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>war</packaging>
<name>datarepository</name>
<description>Data repository</description>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.5.4.RELEASE</version>
<!-- <relativePath /> lookup parent from repository -->
</parent>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<!-- <start-class>org.evotion.datarepository.ServletInitializer.java</start-class> -->
<java.version>1.8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jersey</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.phoenix</groupId>
<artifactId>phoenix-core</artifactId>
<version>4.10.0-HBase-1.2</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
Please help me.
Labels:
- Apache Phoenix
07-06-2017
11:07 AM
Hi All, I have a Phoenix table with a column whose type is an array. The table is defined as below: CREATE TABLE IF NOT EXISTS ENVI_DATA (
PATIENT_ID VARCHAR(50) NOT NULL,
RECORD_DATE TIMESTAMP NOT NULL,
STEPS INTEGER NULL,
STEPS_TIME TIMESTAMP NULL,
SPEED FLOAT NULL,
SPEED_TIME TIMESTAMP NULL,
USER_ROUTES BIGINT[] NULL,
LATITUDE INTEGER NULL,
LONGITUDE INTEGER NULL,
LOCATION_RECORD_TIME TIMESTAMP NULL,
CONSTRAINT PK PRIMARY KEY
(
PATIENT_ID,
MOBILE_ID,
RECORD_DATE
)
) VERSIONS=3,SALT_BUCKETS=16;
The USER_ROUTES BIGINT[] column is mapped to long[] in the program, and I have a method to query the data: public EnviData query(String id){
String querySql= "SELECT * FROM ENVI_DATA where PATIENT_ID=?";
RowMapper<EnviData> rowMapper = new BeanPropertyRowMapper<EnviData>(EnviData.class);
return jdbcTemplate.queryForObject(querySql, rowMapper, id);
}
However, I get the following error: "Failed to convert property value of type 'org.apache.phoenix.schema.types.PhoenixArray$PrimitiveLongPhoenixArray' to required type 'long[]' for property 'USER_ROUTES'", which means USER_ROUTES cannot be converted to long[]. Do I need to write my own row mapper for this?
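A hand-written RowMapper is one way around it. The sketch below unwraps the array column explicitly and only shows a few of the columns; the setter names on EnviData are assumed, and the cast relies on Phoenix returning a primitive long[] from Array.getArray() for a BIGINT[] column, which matches the error message above but is worth verifying.

RowMapper<EnviData> rowMapper = new RowMapper<EnviData>() {
    @Override
    public EnviData mapRow(ResultSet rs, int rowNum) throws SQLException {
        EnviData data = new EnviData();
        data.setPatient_id(rs.getString("PATIENT_ID"));      // assumed setter names
        data.setRecord_date(rs.getTimestamp("RECORD_DATE"));
        java.sql.Array routes = rs.getArray("USER_ROUTES");
        if (routes != null) {
            data.setUser_routes((long[]) routes.getArray()); // unwrap the Phoenix BIGINT[] value
        }
        // ... map the remaining columns the same way ...
        return data;
    }
};
return jdbcTemplate.queryForObject(querySql, rowMapper, id);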
Labels:
- Apache Phoenix
07-05-2017
11:25 PM
Hi, I can ssh to all hosts without a password. I checked all hosts: the ambari-agents are running fine and /etc/ambari-agent/conf/ambari-agent.ini points to the correct server, but it is still not working.
07-05-2017
10:34 PM
Yes, the host that succeeded is the one running the server. Could you tell me how to route to the server? Many thanks.
07-05-2017
09:37 PM
Hi All, I installed Ambari on Amazon EC2 with three virtual machines running Ubuntu 16.04. I followed the guide from: https://docs.hortonworks.com/HDPDocuments/Ambari-2.5.1.0/bk_ambari-installation/content/download_the_ambari_repo_ubuntu16.html But I got the following error (screenshot not shown). I can confirm that the Ambari agents are running on all hosts, /etc/ambari-agent/conf/ambari-agent.ini is correct, and ssh is working fine. I don't know what I missed.
Labels:
- Apache Ambari
07-01-2017
10:35 AM
Hi, I am using Spring Boot's JdbcTemplate to connect to a Phoenix/HBase database. However, I have a table with 100 columns, so every row I insert has to supply 100 values. For example, the insert: public void insert(Ttsnihl m){
Object[] params = new Object[]{
m.getPATIENT_ID(),
m.getRECORD_DATE(),
m.getSTART_TIME(),
m.getEND_TIME(),
.....// 100 object array
};
jdbcTemplate.update(insertSql, params);
}
And: public void insertBatch(final List<Ttsnihl> ts){
BatchPreparedStatementSetter bpss = new BatchPreparedStatementSetter() {
@Override
public void setValues(PreparedStatement ps, int i) throws SQLException {
Ttsnihl t=ts.get(i);
ps.setString(1, t.getPATIENT_ID());
ps.setTimestamp(2, t.getRECORD_DATE());
ps.setTimestamp(3, t.getSTART_TIME());
ps.setTimestamp(4, t.getEND_TIME());
..... // 100 ps.set()functions
}
@Override
public int getBatchSize() {
return ts.size();
}
};
jdbcTemplate.batchUpdate(insertSql, bpss);
}
I want to ask: is there any smarter way to do this? Many thanks in advance!
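One possible shortcut, sketched under the assumption that the column order in insertSql is fixed and known: drive the binding from a single ordered list of property names with Spring's BeanWrapperImpl, instead of writing a hundred getter calls by hand. The property names below are illustrative, not the real column list.

// Ordered list of bean properties, matching the column order in insertSql.
private static final String[] PROPERTY_ORDER = {
        "PATIENT_ID", "RECORD_DATE", "START_TIME", "END_TIME" /* ... remaining properties ... */
};

public void insert(Ttsnihl m) {
    BeanWrapperImpl wrapper = new BeanWrapperImpl(m);
    Object[] params = new Object[PROPERTY_ORDER.length];
    for (int i = 0; i < PROPERTY_ORDER.length; i++) {
        params[i] = wrapper.getPropertyValue(PROPERTY_ORDER[i]);
    }
    jdbcTemplate.update(insertSql, params);
}

The batch version could build its parameters the same way inside setValues, reusing the same PROPERTY_ORDER array.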
Labels:
- Apache Phoenix
06-30-2017
09:40 AM
Hi, many thanks. May I also ask: Long[] is already an array object, so why do I still need to call the createArrayOf function?
06-30-2017
09:36 AM
Many thanks, I just realised I need to convert long[] to Long[].
06-29-2017
05:25 PM
Hi, I am using JdbcTemplate to connect to an Apache Phoenix database, and I have a problem with a record that stores an array (long[]). In the database the column is defined as "USER_ROUTES BIGINT[] NULL"; in Java I use long[] instead. How do I bind an array as a parameter of the PreparedStatement? For example: BatchPreparedStatementSetter bpss = new BatchPreparedStatementSetter() {
    @Override
    public void setValues(PreparedStatement ps, int i) throws SQLException {
        EnviData ed = eds.get(i);
        ps.setString(1, ed.getPATIENT_ID());
        ps.setInt(2, ed.getMOBILE_ID());
        ps.setArray(3, ed.getUSER_ROUTES());
    }
}
The function getUSER_ROUTES() returns a long[], but the program shows an error. How can I solve it?
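For what it is worth, PreparedStatement.setArray expects a java.sql.Array rather than a raw long[], so one way to bind the column (a sketch, assuming the third parameter of the UPSERT is the BIGINT[] column) is to box the values into a Long[] and wrap them with Connection.createArrayOf:

@Override
public void setValues(PreparedStatement ps, int i) throws SQLException {
    EnviData ed = eds.get(i);
    ps.setString(1, ed.getPATIENT_ID());
    ps.setInt(2, ed.getMOBILE_ID());

    long[] routes = ed.getUSER_ROUTES();
    Long[] boxed = new Long[routes.length];
    for (int j = 0; j < routes.length; j++) {
        boxed[j] = routes[j];                 // box each primitive value
    }
    // build the java.sql.Array on the statement's own connection (no extra pooled connection is borrowed)
    java.sql.Array sqlArray = ps.getConnection().createArrayOf("BIGINT", boxed);
    ps.setArray(3, sqlArray);
}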
Labels:
- Apache Phoenix
06-22-2017
05:36 PM
Thanks for your information. I just had a look at the Phoenix ORM PHO, but I don't understand it very clearly. Could you give me an example? For instance, say we have an entity class class Person{ int id; int name; int age; int sex; } and a RestController that accepts an HTTP POST request. How could I use PHO to receive the data and insert it into Phoenix? Thank you very much for your help. Bin Ye
06-22-2017
11:23 AM
I am currently trying to build a Spring Boot REST API to insert data into Apache Phoenix. If I could use JPA to insert data into Phoenix, that would be very convenient.
Labels:
- Apache Phoenix
05-12-2017
01:01 PM
@Greg Keys Hi, thanks for asking. I decided to split the table and store the data separately. Thanks.
05-05-2017
08:56 PM
Many thanks for your help, you have really done me a big favour. I will have a look at the materials you provided first. Thanks.