This looks like a permission issue. As the exception indicates, please verify that the table directory has the appropriate write permissions.
If it is just a VM that you are playing around with, you can disable permission checking in HDFS by setting the property below to false in hdfs-site.xml:

<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>

After setting it, please restart all the services.
Alternatively, you have to give write permissions on the data directory:

sudo -u hdfs hadoop fs -chmod -R 777 /solr/twitter_demo/core_node1/data/tlog
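To confirm that the chmod took effect, you can list the directory afterwards and check the permission bits. A minimal sketch (the path is the one from the command above; adjust it to your own table directory):

```shell
# Show the permission bits and owner of the directory itself (-d),
# rather than listing its contents
hadoop fs -ls -d /solr/twitter_demo/core_node1/data/tlog
```

If the output does not show drwxrwxrwx, the chmod did not apply and you should re-check the path.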
Thanks again for the reply.
The stated property is not there in hdfs-site.xml.
Shall I add it?
I am not using Solr.
I want to run Hive analysis on Twitter data that came in via Flume.
Yes, please add the property if it is not there, provided you are OK with disabling the security checks.
Also restart everything, in order to publish the change in hdfs-site.xml.
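On a typical quickstart VM the restart usually looks something like the sketch below. The exact service names vary by distribution, so treat these as assumptions and substitute the names used on your system:

```shell
# Restart the HDFS daemons so the new hdfs-site.xml is picked up
# (CDH-style service names; yours may differ)
sudo service hadoop-hdfs-namenode restart
sudo service hadoop-hdfs-datanode restart

# Restart Hive so it reconnects to HDFS with the new settings
sudo service hive-server2 restart
```

On a cluster managed by Cloudera Manager or Ambari, do the restart from the management UI instead so the configuration is redeployed everywhere.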
I have added the property as you stated and restarted, but the error persists.
Appreciate your help.
This time it didn't complain about the permission error that you got last time.
Please issue the commands below:

MSCK REPAIR TABLE table_name;
ANALYZE TABLE table_name COMPUTE STATISTICS;

The above should refresh the table metadata; then try SELECT * FROM table_name;

Could you also verify whether your hive/lib folder contains hive-serdes-1.0-SNAPSHOT.jar? If not, please download it, add it there, and execute the SELECT statement again.
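If you are unsure whether the jar is present, a quick check from the shell looks like this. Note that HIVE_HOME here is an assumption about where Hive is installed; use your actual installation path:

```shell
# Look for the Twitter SerDe jar in Hive's lib directory
# (HIVE_HOME is an assumed environment variable; adjust to your install)
ls "$HIVE_HOME/lib" | grep -i serdes
```

If the jar is missing, you can also add it for the current session from the Hive CLI with ADD JAR /path/to/hive-serdes-1.0-SNAPSHOT.jar; before running the SELECT.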