Member since 06-08-2017

26 Posts · 0 Kudos Received · 2 Solutions
        My Accepted Solutions
| Title | Views | Posted | 
|---|---|---|
| | 2080 | 04-21-2021 03:12 AM |
| | 14351 | 01-24-2020 06:27 AM |

10-04-2021 07:18 AM

Hi. I'm setting up a CDP 7.1.6 cluster. I've created a local parcel repo as described HERE. The web server is up and running and I can access the repository from the browser, but in CM the same URL doesn't enable the greyed-out Continue button. The parcels are there, manifest.json is OK, and file permissions and ownership are as described. I can't understand where I'm going wrong.
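To show what I've already verified, this is roughly the kind of check I run against the repo; the host, port and directory below are placeholders rather than my real layout:

    # Placeholder host/port/path -- only meant to illustrate the checks, not my actual repo
    # The directory served by the web server holds the parcels, their .sha files and manifest.json
    ls -l /var/www/html/cdh7/
    #   CDH-7.1.6-*-el7.parcel
    #   CDH-7.1.6-*-el7.parcel.sha
    #   manifest.json

    # The same URL pasted into CM's "Remote Parcel Repository URLs" answers from the CM host
    curl -sI http://repo-host:8900/cdh7/manifest.json    # expect HTTP/1.1 200 OK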
						
					

04-21-2021 03:12 AM

Just to say that a different approach, with MiNiFi on each machine, meets the needs. Thanks.
						
					

04-15-2021 07:50 AM

Hi there. I'm using NiFi to get data from remote Windows servers with the GetSmbFile processor. It works, but I have a number of servers exposing data on SMB shares and I'm trying to understand whether I can catalog all my remote servers in a SQL table with the needed parameters (e.g. hostname, user, password, share) to be passed to that processor. GetSmbFile doesn't support Expression Language, so I haven't found a working solution, nor anything helpful online. Is this achievable without writing my own processor or running external code? Any help is appreciated.
						
					

Labels: Apache NiFi
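To make the question concrete, this is the kind of "external code" fallback I'd rather avoid: a script that reads the catalog and pulls the files with smbclient. The database, table and column names below are made up for illustration:

    # Hypothetical fallback -- catalog database, table and columns are placeholders
    mysql -N -B -e "SELECT hostname, user, pass, share FROM smb_servers" catalog |
    while read -r host user pass share; do
        # pull everything from the share into the current directory
        smbclient "//$host/$share" -U "$user%$pass" -c 'prompt; recurse; mget *' \
            && echo "fetched from //$host/$share"
    done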
 
			
    
	
		
		

04-08-2021 11:36 PM

I've tried to set one value at a time and restart CM; nothing changes. It seems to be a known bug in that CDH release. I had the opportunity to open an SR through my customer's account, and support answered that the message can be dismissed. Thanks for your answers.
						
					

03-08-2021 06:03 AM

Hi all. I'm facing a CM configuration error I can't understand. I've Kerberized a Cloudera 5.16 cluster that authenticates against an AD domain controller, and I've set the aes256-cts and aes256-cts-hmac-sha1-96 encryption types. CM reports a configuration issue. I have a separate cluster, configured with the same values and authenticating against the same AD domain controller, that shows no errors. I'm struggling with this error but can't solve it. Any idea?
						
					

Labels: Cloudera Manager, Kerberos
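For context, this is the kind of comparison I've been running on a node of each cluster; the realm, principal and paths below are placeholders, not my real values:

    # Placeholder realm/principal -- just the comparison run on a node of each cluster
    grep -E 'permitted_enctypes|default_tkt_enctypes|default_tgs_enctypes' /etc/krb5.conf

    # Get a ticket from the AD KDC and check which encryption type it was issued with
    kinit testuser@EXAMPLE.COM
    klist -e    # expect aes256-cts-hmac-sha1-96 in the Etype field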
 
			
    
	
		
		

07-09-2020 02:09 AM

Hi. I have a file with a 128 MB block size and I'd like to change the existing file's block size using:

    hdfs dfs -mv /user/myfile.txt /tmp
    hdfs dfs -D dfs.blocksize=268435456 -cp /tmp/myfile.txt /user

That works. But when I try to use distcp with -p to preserve the original file's attributes, the target file's block size doesn't change:

    hadoop distcp -p -D dfs.block.size=268435456 /tmp/myfile.txt /user/myfile.txt

I can't understand where I'm going wrong.
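To show what I mean by "doesn't change", the block size of each copy can be checked like this (same sample paths as above; %o prints the block size in bytes):

    # 134217728 = 128 MB, 268435456 = 256 MB
    hdfs dfs -stat "blocksize of %n: %o" /tmp/myfile.txt
    hdfs dfs -stat "blocksize of %n: %o" /user/myfile.txt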
						
					

04-27-2020 07:49 AM

Hi. Does your client machine have a valid Kerberos client setup? Is its krb5.conf (krb5.ini on Windows machines) consistent with the underlying Kerberos server setup?

Stefano
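As a quick client-side sanity check, something along these lines should work; the realm and principal are placeholders:

    # Placeholder realm/principal -- generic client-side check, adapt to your setup
    cat /etc/krb5.conf            # default_realm and kdc entries must point at the cluster's KDC
    kinit someuser@EXAMPLE.COM    # must succeed against that KDC
    klist                         # a valid TGT should now be listed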
						
					

04-16-2020 12:35 AM

Thanks for your answer; the details are below.

I am running Hive and Impala on CDH 6.1.1.

I've set these parameters:

    SET hive.support.concurrency = true;
    SET hive.exec.dynamic.partition.mode = nonstrict;
    SET hive.compactor.initiator.on = true;
    SET hive.compactor.worker.threads = 1;
    SET hive.txn.manager = org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;

Then I've created a sample table:

    CREATE TABLE MYDIM (key int, name string, zip string, is_current boolean)
    CLUSTERED BY (key) INTO 3 BUCKETS
    STORED AS ORC TBLPROPERTIES ('transactional'='true');

I've loaded some data:

    INSERT INTO MYDIM VALUES
      (1, 'bob',   '95136', true),
      (2, 'joe',   '70068', true),
      (3, 'steve', '22150', true);

I run a SELECT in Hive and the records are returned:

    SELECT * FROM MYDIM;
    mydim.key   mydim.name   mydim.zip   mydim.is_current
    1           bob          95136       true
    2           joe          70068       true
    3           steve        22150       true

I can also update and delete:

    UPDATE MYDIM SET NAME = 'svasi' WHERE KEY=3;
    SELECT * FROM MYDIM;
    mydim.key   mydim.name   mydim.zip   mydim.is_current
    1           bob          95136       true
    2           joe          70068       true
    3           svasi        22150       true

The table is visible in Impala, but no results are returned when I run SELECT * FROM mydim. I've tried INVALIDATE METADATA, but the Impala query still reads no records.

Thanks for your help.
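For reference, the Impala side looks roughly like this (the impalad host below is a placeholder for my actual one):

    # Placeholder impalad host -- the statements are the ones described above
    impala-shell -i impalad-host:21000 -q "INVALIDATE METADATA mydim;"
    impala-shell -i impalad-host:21000 -q "SELECT * FROM mydim;"    # returns 0 rows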
						
					

04-15-2020 07:10 AM

Hi Community. I've created a transactional table in Hive and I'm able to perform CRUD operations on it. When I try to query the same table from Impala, my query returns 0 rows. I've already invalidated the metadata for that table but still cannot see any of the existing records. Where am I going wrong?
						
					

Labels: Apache Hive, Apache Impala
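If it helps, the table properties can be confirmed on the Hive side with something like this (the JDBC URL and table name are placeholders):

    # Placeholder HiveServer2 URL and table name -- just confirming the table is transactional
    beeline -u "jdbc:hive2://hiveserver2-host:10000/default" \
            -e "DESCRIBE FORMATTED mytable;" | grep -i transactional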