Solr index not getting created on Hive table if the column count is greater than 30

Hi All,

I am trying to create a Solr index from a Hive table. Below are the steps I followed and the issue I am facing:

Steps:

1.) Created a base table in Hive:

hive> create table solrinput3(username string) row format delimited fields terminated by ',';

2.) Loaded sample data into the table 'solrinput3':

hive> insert into solrinput3 values('sanvi');

3.) Added the Solr Hive SerDe JAR:

hive> ADD JAR /opt/lucidworks-hdpsearch/hive/solr-hive-serde-2.2.5.jar;

4.) Created a Solr-Hive integrated external table:

CREATE EXTERNAL TABLE dbname.solrtest (title STRING)
STORED BY 'com.lucidworks.hadoop.hive.LWStorageHandler'
LOCATION '/lob/test/hive_test'
TBLPROPERTIES('solr.server.url' = 'http://XXXX.XXX.XXX:8983/solr',
              'solr.collection' = 'myproj_collection1',
              'solr.query' = '*:*');

5.) Populated the indexed table from the base table:

hive> insert overwrite table solrtest select * from solrinput3;

The above steps successfully created a Solr index for the table 'solrtest'.
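For what it's worth, a quick way to sanity-check the write is to read back through the same external table, since the storage handler also reads from the collection using the 'solr.query' property:

hive> select * from solrtest;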

This scenario uses only one column field, 'title'. But when we tried the same approach with more than 30 column fields, the Solr index was not created, and no error was thrown either.
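For reference, the wide-table definition looked roughly like the sketch below. The 31 placeholder columns, the table names 'solrtest_wide' and 'solrinput_wide', and the location are illustrative, not our real schema; the storage handler and TBLPROPERTIES are the same as above.

-- Hypothetical sketch of the failing case: same setup as the working
-- example, but with 31 columns instead of one.
CREATE EXTERNAL TABLE dbname.solrtest_wide (
  col01 STRING, col02 STRING, col03 STRING, col04 STRING,
  col05 STRING, col06 STRING, col07 STRING, col08 STRING,
  col09 STRING, col10 STRING, col11 STRING, col12 STRING,
  col13 STRING, col14 STRING, col15 STRING, col16 STRING,
  col17 STRING, col18 STRING, col19 STRING, col20 STRING,
  col21 STRING, col22 STRING, col23 STRING, col24 STRING,
  col25 STRING, col26 STRING, col27 STRING, col28 STRING,
  col29 STRING, col30 STRING, col31 STRING)
STORED BY 'com.lucidworks.hadoop.hive.LWStorageHandler'
LOCATION '/lob/test/hive_test_wide'
TBLPROPERTIES('solr.server.url' = 'http://XXXX.XXX.XXX:8983/solr',
              'solr.collection' = 'myproj_collection1',
              'solr.query' = '*:*');

-- The load completes without any error, yet no documents appear in the
-- collection ('solrinput_wide' is a matching 31-column base table):
insert overwrite table solrtest_wide select * from solrinput_wide;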

Is there any limit on the number of column fields?

How can we overcome this problem?

Could someone please help address this problem?

Thank You