"org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version" error

Expert Contributor

Hi,

I am unable to start the "Hive Metastore" service through Ambari. I keep getting the following error. The material I found on the internet suggests that it could be caused by trying to read schema information from an older metastore. However, I used Ambari to install the cluster and all the services, so the metastore schema should be up to date.

I have also observed that the "HiveServer2" service is unstable. When I start it, it comes up and shows as green in the dashboard, but after some time it stops on its own, and I do not know why. When I started writing this post it was green; by the time I finished, it was red, so I restarted it.

Some more information: the "MySQL Server" service is always red and its status is always "install failed". I found out through this forum that, since I had installed MySQL manually before the first step of the Ambari installation, Ambari cannot install this service. Is this causing the issue? Here is the error:

Metastore connection URL: jdbc:mysql://mycomputer.mydomain.com/hive?createDatabaseIfNotExist=true
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hiveuser
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
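To narrow this down, one thing that can help is asking the schematool directly what it sees, and checking the metastore's VERSION table. This is only a rough sketch, assuming the standard HDP install path and the hive database / hiveuser account from the connection URL above:

# Ask schematool which schema version it can read from the metastore
/usr/hdp/current/hive-metastore/bin/schematool -dbType mysql -info

# Look at the VERSION table in the metastore database directly
mysql -u hiveuser -p -e "SELECT SCHEMA_VERSION, VERSION_COMMENT FROM hive.VERSION;"

If the second command errors out or returns no rows, the VERSION entry the metastore needs is missing.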
1 ACCEPTED SOLUTION

Master Guru

This occurs when the database is there but somehow corrupted. "Failed to get schema version" means the schematool cannot find the table entry (the metastore's VERSION table) that contains the Hive schema version. Can you run any query against the metastore? I wouldn't think so.

How about recreating it and pointing Ambari at the new, correct one? Guidelines below:

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_installing_manually_book/content/validat...
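A rough sketch of what that looks like from the command line, assuming the hive database and hiveuser account from the connection URL in the question ('hivepassword' is a placeholder, use your own password and adjust the host as needed):

# Drop the corrupted metastore database and create a fresh one
mysql -u root -p -e "DROP DATABASE hive; CREATE DATABASE hive;"

# Re-grant access to the metastore user (MySQL 5.x syntax)
mysql -u root -p -e "GRANT ALL PRIVILEGES ON hive.* TO 'hiveuser'@'%' IDENTIFIED BY 'hivepassword'; FLUSH PRIVILEGES;"

# Recreate the metastore schema
/usr/hdp/current/hive-metastore/bin/schematool -initSchema -dbType mysql

Then make sure the Hive configs in Ambari point at this database and restart the Hive services.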


24 REPLIES

Contributor

Hello guys,

I am facing the same issue and have also followed the steps you took to resolve it, but I still get the same error. I have the MySQL root password, but when I run show databases; (in order to drop the Hive database) I get the following error:

ERROR 1820 (HY000): You must reset your password using ALTER USER statement before executing this statement.

Please tell me how to fix this as soon as possible.

Thanks in advance.


That message indicates that your user password may have expired. However, you should be able to run a SET command:

SET PASSWORD = PASSWORD('new_updated_password');

After this, your problem should go away.
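If SET PASSWORD is rejected on your MySQL version, the ALTER USER form that the error message itself mentions should work too (the account name here is an assumption, use the one you log in with):

ALTER USER 'root'@'localhost' IDENTIFIED BY 'new_updated_password';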

Explorer

Hi, I am following this post, but when I executed the command "$HIVE_HOME/bin/schematool -initSchema -dbType mysql"

I got "No such file or directory".

Master Guru

You need to replace HIVE_HOME with the actual path: /usr/hdp/<version>/hive/

Also, you need a Hive client installed on the node where you run it.
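For example, assuming the <version> placeholder is filled in with the HDP version actually installed on that node:

# Point HIVE_HOME at the real install path, then run schematool
export HIVE_HOME=/usr/hdp/<version>/hive
$HIVE_HOME/bin/schematool -initSchema -dbType mysql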

Explorer

@Benjamin Leonhardi

Thanks. I am able to find the actual path, but how do I replace it? Every time I get "HIVE_HOME not found".

Please share how to install the Hive client.

Is this the right document to refer to: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.0/bk_installing_manually_book/content/ch_inst...

Explorer

@Benjamin Leonhardi I tried to install it through "yum install hive-hcatalog".

Please find the attached screenshot:

6046-2016-07-25-3.png

Master Guru

Are you using HDP? Then you would install them through Ambari: Hosts -> Add Client.

Explorer
@Benjamin Leonhardi

Yes, I am using HDP

Master Guru

So in Ambari, go to Hosts, select the host you want, and press the big Add+ button.

Master Guru

You still will not have HIVE_HOME because the scripts set it dynamically. You need to replace that placeholder with /usr/hdp/<your HDP version>/hive (look up the version on the Linux host).
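To find the version string to plug in, something like this should work on the node (hdp-select ships with HDP, but it may not be on your path):

# List the installed HDP version directories
ls /usr/hdp

# Or ask hdp-select, if it is available
hdp-select versions

# Then run schematool with that version filled in
/usr/hdp/<version from the listing above>/hive/bin/schematool -initSchema -dbType mysql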