
Can't start HDFS after expiration of Enterprise Trial

Contributor

Hello All,

 

I was away from my Cloudera cluster for a bit and the Enterprise trial expired in the meantime.

 

When I got back I needed to update my system (RHEL 7.x, sudo yum update), and after I restarted the system, my HDFS and pretty much everything else seems to be kaput. Mostly I care about HDFS right now.

 

The cloudera-scm-agent and cloudera-scm-server seem to be running fine.

 

I don't have any entries in my HDFS log from Dec 19th on, which is when I think I applied the updates and restarted.

 

Thanks for any ideas you might have.

 

Joe

 

P.S. I've secured this server with Active Directory.

1 ACCEPTED SOLUTION

Contributor

After further review, this seems to have been caused by updating my Red Hat 7.2 packages (sudo yum update).

 

I noticed in my HDFS Configuration that I was getting a notification:

 

Mismatched CDH versions: host has NONE but role expect 5

 

Via Google I was able to see that others had fixed the problem by removing OpenJDK from their system and using the Oracle- or Cloudera-provided JDK. Since I had the Cloudera version, I went ahead and made the configuration changes necessary to use that (for me this was /usr/java/jdk1.7.0_67-cloudera).
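For anyone hitting the same symptom, a rough sketch of how to check for the conflicting OpenJDK packages and confirm the Cloudera JDK path is below. This is not from the original post; the JDK path is the one reported above and may differ on your host.

```shell
# Sketch: list installed OpenJDK packages (the candidates to remove) and
# check that the Cloudera-provided JDK directory exists.
rpm -qa 2>/dev/null | grep -i openjdk || echo "no OpenJDK packages found"

# Path reported above; adjust for your installation.
CLOUDERA_JDK=/usr/java/jdk1.7.0_67-cloudera
if [ -d "$CLOUDERA_JDK" ]; then
  echo "Cloudera JDK present at $CLOUDERA_JDK"
else
  echo "Cloudera JDK not found at $CLOUDERA_JDK"
fi
```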

 

 


3 REPLIES

Master Collaborator

> I was away from my Cloudera cluster for a bit and the Enterprise trial expired in the meantime.

When a Cloudera Enterprise license expires, the following occurs [0]:

 

 

- Cloudera Enterprise Data Hub Edition Trial - enterprise features are no longer available; see the table in [1].
- Cloudera Enterprise - the Cloudera Manager Admin Console displays a banner indicating license expiration. Contact Cloudera Support to receive an updated license. In the meantime, all enterprise features continue to be available.

 

Is your CDH deployment package-based or parcel-based?
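(For readers unsure how to answer this, one rough heuristic, assuming the default install locations, is to look for the parcel directory:)

```shell
# Sketch: parcels unpack under /opt/cloudera/parcels, while package-based
# installs place CDH under /usr/lib (e.g. /usr/lib/hadoop). Default paths assumed.
if [ -d /opt/cloudera/parcels/CDH ]; then
  echo "parcel-based deployment"
elif [ -d /usr/lib/hadoop ]; then
  echo "package-based deployment"
else
  echo "no CDH install detected at the default locations"
fi
```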

 

Do you know what your yum update changed - did it update any CDH/Cloudera packages?

Can you check your yum log, /var/log/yum.log?
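One way to narrow that down is to filter the yum log for Java- or Hadoop-related entries. A sketch follows; the sample line only mimics RHEL 7's yum.log format so the command is runnable as-is.

```shell
# Sketch: scan the yum log for packages relevant to CDH. A sample line in
# RHEL 7's /var/log/yum.log format stands in for the real file here.
echo 'Dec 19 10:02:31 Updated: 1:java-1.7.0-openjdk-1.7.0.131-2.6.9.0.el7_3.x86_64' > /tmp/yum.log.sample
grep -iE 'jdk|java|cloudera|hadoop|hdfs|zookeeper' /tmp/yum.log.sample

# On a real host, point grep at the actual log instead:
#   grep -iE 'jdk|java|cloudera|hadoop|hdfs|zookeeper' /var/log/yum.log
```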

 

How did you determine that everything is _kaput_? Is there any error/stack trace that you can share?

 

Best,

 

Michalis

 

[0] https://www.cloudera.com/documentation/enterprise/latest/topics/cm_ag_licenses.html#cmug_topic_13_7_...

[1] https://www.cloudera.com/documentation/enterprise/latest/topics/cm_ig_feature_differences.html

Contributor

I'm not seeing those error messages.  Would they be in a log somewhere?

 

No Cloudera packages were updated.  I can give you a complete list if you desire but it is quite large.

 

When I try to run a very simple command, I get a connection refused:

 

[JHS4@pp-hadoop-sec0 ~]$ hdfs dfs -ls
ls: Call From pp-hadoop-sec0.accelrys.net/10.106.15.166 to pp-hadoop-sec0.accelrys.net:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
[JHS4@pp-hadoop-sec0 ~]$

Also, I'm unable to access the Hadoop overview web page, the one that shows up on port 50070 ("Page cannot be displayed").
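A connection refused on 8020 plus an unreachable 50070 page usually means the NameNode process is not running at all. A quick check, assuming the default ports from the post:

```shell
# Sketch: see whether anything is listening on the NameNode RPC port (8020)
# and the NameNode web UI port (50070). Uses ss from iproute2; if ss is
# missing, both ports will simply report NOT listening.
for port in 8020 50070; do
  if ss -ltn 2>/dev/null | grep -q ":$port "; then
    echo "port $port: listening"
  else
    echo "port $port: NOT listening"
  fi
done
```

If both report NOT listening, the NameNode role never started; its role logs (typically under /var/log/hadoop-hdfs on CDH) are the next place to look.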

 

Joe

 

 

 
