Member since 05-23-2016 · 4 Posts · 2 Kudos Received · 0 Solutions
    
	
		
		
11-04-2017 12:19 PM
Hi @Jeff Watson. You are correct about SAS's handling of the String datatype. Good catch! One of my customers also had to deal with this, and String datatype conversions can perform very poorly in SAS.

With SAS/ACCESS to Hadoop you can set the libname option DBMAX_TEXT (added in the SAS 9.4M1 release) to globally restrict the character length of all columns read into SAS. However, for restricting column size, SAS specifically recommends using the VARCHAR datatype in Hive whenever possible:
http://support.sas.com/documentation/cdl/en/acreldb/67473/HTML/default/viewer.htm#n1aqglg4ftdj04n1eyvh2l3367ql.htm
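A minimal sketch of the DBMAX_TEXT approach (the server, port, schema, libref, and table names here are placeholders, and your connection options will differ):

    /* Cap every Hive String column at 1024 characters instead of the default $32767 */
    libname myhive hadoop server="hive.example.com" port=10000 schema=default
            dbmax_text=1024;

    /* Columns typed as String in Hive now arrive as $1024 character variables */
    data work.preview;
       set myhive.table_a (obs=100);
    run;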
Use Case: Large Table, All Columns of Type String

Table A stored in Hive has 40 columns, all of type String, with 500M rows. By default, SAS/ACCESS converts String to $32K, that is, a 32K-long character variable per column. The math for a table this size yields roughly a 1.2MB row length (40 columns x 32K) times 500M rows, which brings the system to a halt: far too large to store in LASR or WORK. The following techniques can be used to work around the challenge in SAS, and they all work:
1. Use CHAR and VARCHAR in Hive instead of String (sketched below).
2. Set the libname option DBMAX_TEXT to globally restrict the character length of all columns read into SAS (sketched above).
3. In Hive, use SET TBLPROPERTIES with SASFMT properties to store SAS formats on the schema in Hive (sketched below).
4. Add formatting to the SAS code during inbound reads (see the DATA step sketch below).
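For items 1 and 3, a hedged sketch using SAS explicit pass-through; the connection options, table, and column names are hypothetical, and the SASFMT table-property syntax is the one described in the SAS/ACCESS documentation linked above:

    proc sql;
       connect to hadoop (server="hive.example.com" port=10000 schema=default);

       /* Item 1: bounded VARCHAR instead of String when (re)creating the table */
       execute (
          create table table_a_v (
             id       bigint,
             comments varchar(200)
          )
       ) by hadoop;

       /* Item 3: a SASFMT table property tells SAS/ACCESS to read the String
          column "comments" as a 200-byte character variable */
       execute (
          alter table table_a set tblproperties ('SASFMT:comments'='CHAR(200)')
       ) by hadoop;

       disconnect from hadoop;
    quit;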
 
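For item 4, the overrides can be applied directly in the DATA step that reads the table; the libref and column names below are illustrative, and Sequence is assumed to be numeric (e.g., BIGINT) in Hive:

    data work.table_a_small;
       /* Placing LENGTH before SET caps the character column at $200 instead of
          the default $32767 (SAS may log a multiple-lengths warning) */
       length comments $200;
       set myhive.table_a (keep=sequence comments);
       /* Attach a display format to the numeric key column on the way in */
       format sequence 10.;
    run;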
For example, a column such as Sequence can be read with Length 8, Informat 10., and Format 10.

I hope this helps.
						
					