Member since 05-05-2017

Posts: 33
Kudos Received: 1
Solutions: 0
			
    
	
		
		
07-30-2018 06:25 PM

Dear @Felix Albani,

I just had a look at the link you sent me and ran some tests. The command

hbase backup set add backtest mytestTable

gives me the following error:

org.apache.hadoop.hbase.TableNotFoundException: Table 'hbase:backup' was not found, got: WRONG_ID.

Why is this happening?

Thanks
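(A hedged aside on the error above: the exception says the hbase:backup system table does not exist, which typically means the backup-and-restore feature has not been enabled in hbase-site.xml on the cluster. Assuming you have access to the HBase shell, you can confirm this directly:)

hbase shell
# list the system tables in the 'hbase' namespace; 'hbase:backup' should appear once backups are enabled
list_namespace_tables 'hbase'
# or check for the backup system table by name
exists 'hbase:backup'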
						
					
    
	
		
		
07-30-2018 04:32 PM

Dear All,

I am very new to HBase and HBase snapshots, and I have a question about them. When I take an HBase snapshot of a table, does the snapshot continuously track the changes made to that table afterwards? If it does not, does that mean I need to keep rotating snapshots, deleting the old one and taking a new one, say once a week?

Thanks,
Bin Ye
						
					
Labels: Apache Hadoop, Apache HBase
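(For reference, a minimal HBase shell sketch of the weekly rotation described above; a snapshot is point-in-time and does not track later changes. The table and snapshot names are placeholders.)

hbase shell
# take a point-in-time snapshot of the table
snapshot 'my_table', 'my_table_snap_week31'
# list the snapshots that currently exist
list_snapshots
# drop last week's snapshot once the new one has been taken
delete_snapshot 'my_table_snap_week30'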
			
    
	
		
		
07-19-2018 01:23 PM

Dear All,

May I ask a question about software architecture? I want to build an image recognition application with Apache Spark. My original thinking is:

1. Spring Boot (RESTful API) receives an image.
2. Spring Boot sends the image to Spark.
3. Spark processes the image and sends the result back to Spring Boot.
4. Spring Boot returns the result to the user.

Is this a good way to develop such an application? Are there any examples I can look at?
						
					
Labels: Apache Spark
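(A rough, hedged illustration of step 2 in the post above: one option is to keep the Spark job as a separate application and have the Spring Boot service submit it programmatically through Spark's launcher API. The jar path, main class, and image path below are hypothetical placeholders, and the sketch assumes the job runs on YARN.)

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class ImageJobLauncher {
    public static void main(String[] args) throws Exception {
        // Hand an uploaded image (already written to HDFS) to a separate Spark job.
        SparkAppHandle handle = new SparkLauncher()
                .setMaster("yarn")                                  // assumption: jobs run on YARN
                .setAppResource("/opt/jobs/image-recognition.jar")  // hypothetical Spark job jar
                .setMainClass("com.example.ImageRecognitionJob")    // hypothetical driver main class
                .addAppArgs("hdfs:///uploads/image-123.png")        // hypothetical image location
                .startApplication();                                // returns immediately; poll the handle
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);                                     // a real service would track this asynchronously
        }
    }
}

(Submitting one Spark application per request adds noticeable startup latency, so for an interactive API a long-running Spark job that consumes requests from a queue is often a better fit than launching a job per image.)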
			
    
	
		
		
09-27-2017 12:40 PM

Many thanks for your reply. I found the error: I have two versions of Hadoop installed, and the config folder is a symlink to another location.
						
					
    
	
		
		
09-22-2017 08:32 PM

Hi @Ajay,

When I format the NameNode, it prints out:

17/09/22 21:54:03 ERROR namenode.NameNode: Failed to start namenode.
java.io.IOException: Cannot create directory /hadoop/hdfs/namenode/current

In fact, hdfs-site.xml is configured to store the data in file:/home/hduser/hadoop_store/hdfs/namenode. I don't understand how this can happen; it is a completely different location. Do you have any idea why this is happening?

Many thanks in advance,
Bin Ye
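(A hedged check, assuming the standard Hadoop client scripts are on the PATH: since the NameNode is trying to use /hadoop/hdfs/namenode rather than the configured path, the configuration actually being read is probably not the file that was edited. These two commands show which configuration wins:)

# which configuration directory the Hadoop scripts are pointed at
echo $HADOOP_CONF_DIR
# which NameNode directory the client actually resolves from that configuration
hdfs getconf -confKey dfs.namenode.name.dir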
						
					
    
	
		
		
09-22-2017 04:53 PM

Hi, I manually installed Hadoop 2.7.4, but the NameNode never starts. I can see that the DataNode and SecondaryNameNode are starting.

The core-site.xml:

<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/hduser/tmp</value>
  <description>A base for other temporary directories.</description>
</property>
<property>
  <name>fs.default.name</name>
  <value>hdfs://evotion00:54310</value>
</property>

The hdfs-site.xml:

<property>
  <name>dfs.replication</name>
  <value>3</value>
  <description>Default block replication.
  The actual number of replications can be specified when the file is created.
  The default is used if replication is not specified in create time.
  </description>
</property>
<property>
  <name>dfs.name.dir</name>
  <value>file:/home/hduser/hadoop_store/hdfs/namenode</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>file:/home/hduser/hadoop_store/hdfs/datanode</value>
</property>

I tried hadoop namenode -format and restarted, but it is not working. Many thanks in advance.
						
					
Labels: Apache Hadoop
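(A hedged side note on the configuration above: fs.default.name, dfs.name.dir, and dfs.data.dir are the deprecated Hadoop 1.x property names. They still work in 2.7.4, so they are not necessarily the cause, but the current equivalents are shown below with the same hostname and paths as in the post. The actual reason the NameNode fails to start is usually spelled out in its log under $HADOOP_HOME/logs.)

<!-- core-site.xml -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://evotion00:54310</value>
</property>

<!-- hdfs-site.xml -->
<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:/home/hduser/hadoop_store/hdfs/namenode</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:/home/hduser/hadoop_store/hdfs/datanode</value>
</property>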
			
    
	
		
		
07-27-2017 11:54 AM

Hi Josh,

Thanks for your information.

Bin Ye
						
					
    
	
		
		
07-25-2017 12:47 PM

Hi All,

I need to store usernames and passwords in Phoenix/HBase, and this table needs to be encrypted. Are there any examples or materials I can read?

Bin Ye
						
					
Labels: Apache HBase, Apache Phoenix
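(One hedged pointer for the question above: HBase supports transparent encryption of data at rest, enabled per column family once a crypto key provider has been configured in hbase-site.xml. The HBase shell sketch below assumes that configuration is already in place; the table and column family names are placeholders.)

hbase shell
# create a table whose column family is stored encrypted on disk with AES
create 'USER_CREDENTIALS', {NAME => 'cf', ENCRYPTION => 'AES'}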
			
    
	
		
		
07-13-2017 06:06 PM

Hi, I have a Spring Boot application. It runs fine on localhost, but when I deploy it as a WAR file on the server, it does not start and produces these errors:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/binye/.m2/repository/ch/qos/logback/logback-classic/1.1.11/logback-classic-1.1.11.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/binye/.m2/repository/org/slf4j/slf4j-log4j12/1.7.25/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
However, I cannot find which dependencies are in conflict. My pom.xml is below:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<groupId>org.evotion.datarepository</groupId>
	<artifactId>datarepository</artifactId>
	<version>0.0.1-SNAPSHOT</version>
	<packaging>war</packaging>
	<name>datarepository</name>
	<description>Data repository</description>
	<parent>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-parent</artifactId>
		<version>1.5.4.RELEASE</version>
		<!-- <relativePath /> lookup parent from repository -->
	</parent>
	<properties>
		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
		<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
		<!-- <start-class>org.evotion.datarepository.ServletInitializer.java</start-class> -->
		<java.version>1.8</java.version>
	</properties>
	<dependencies>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-jersey</artifactId>
		</dependency>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-web</artifactId>
		</dependency>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-tomcat</artifactId>
			<scope>provided</scope>
		</dependency>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-test</artifactId>
			<scope>test</scope>
		</dependency>
		<dependency>
			<groupId>org.apache.phoenix</groupId>
			<artifactId>phoenix-core</artifactId>
			<version>4.10.0-HBase-1.2</version>
		</dependency>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-jdbc</artifactId>
		</dependency>
	</dependencies>
	<build>
		<plugins>
			<plugin>
				<groupId>org.springframework.boot</groupId>
				<artifactId>spring-boot-maven-plugin</artifactId>
			</plugin>
		</plugins>
	</build>
</project>
Please help me.
Labels: Apache Phoenix
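(A hedged suggestion for the post above: the message means two SLF4J backends are on the classpath, logback from the Spring Boot starters and slf4j-log4j12 pulled in transitively, most likely via phoenix-core's Hadoop/HBase dependencies. Running mvn dependency:tree shows exactly which dependency brings it in; one common fix is to keep logback and exclude the log4j binding, roughly as sketched here:)

<dependency>
	<groupId>org.apache.phoenix</groupId>
	<artifactId>phoenix-core</artifactId>
	<version>4.10.0-HBase-1.2</version>
	<exclusions>
		<exclusion>
			<groupId>org.slf4j</groupId>
			<artifactId>slf4j-log4j12</artifactId>
		</exclusion>
	</exclusions>
</dependency>

(Note that the multiple-bindings message itself is only a warning; if the WAR still fails to deploy after the classpath is cleaned up, the server's own log should show the underlying exception.)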
 
        