Planning to upgrade from HDP 2.6.0 to 3.1.0. Can we use the old client with the new server and vice versa, or do we need to upgrade the client-side tools as well? Can anyone please help me here?
When you upgrade from one major version to another, the Ambari upgrade process takes care of upgrading the clients too. The client version must also match the core HDP version, as new libraries/jars are delivered for both the core and the client software. You won't need to do a manual client upgrade unless you are using off-the-shelf Hadoop.
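To make the "client must match core" point concrete, here is a minimal sketch of a startup check that compares the major.minor part of a client and server HDP version string. The class name and the example version strings are assumptions, not something from the thread; in practice you would obtain the real versions from the client jars and from the cluster.

```java
// Sketch (class name and version strings are assumptions): after an HDP
// major upgrade, client and core versions must agree on major.minor, since
// new jars are delivered for both. A check like this surfaces a stale
// client early instead of as a cryptic RPC error later.
public class HdpVersionCheck {
    // Reduce a full HDP version like "3.1.0.0-78" to "3.1"
    static String majorMinor(String version) {
        String[] parts = version.split("\\.");
        return parts[0] + "." + parts[1];
    }

    public static void main(String[] args) {
        String clientVersion = "2.6.0.3-8";   // e.g. read from the old client jars
        String serverVersion = "3.1.0.0-78";  // e.g. reported by the upgraded cluster
        if (!majorMinor(clientVersion).equals(majorMinor(serverVersion))) {
            System.out.println("MISMATCH: client " + clientVersion
                    + " vs server " + serverVersion + " -- upgrade the client");
        } else {
            System.out.println("OK: client and server versions match");
        }
    }
}
```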
Just read the post-upgrade steps carefully, e.g. for Hive.
@Geoffrey Shelton Okot
The Ambari-managed client side should be fine, but thinking about the application (client) side: do we need to upgrade any clients (jars/libs) on the application side? Or will the old clients still work with the new server and vice versa?
Can you share which applications are plugged into your Hadoop cluster? Usually, if you have e.g. a standalone NiFi cluster storing some data in HDFS on an HDP cluster, you will need to copy over the hdfs-site.xml and core-site.xml; if it's Presto, you do likewise. Unless you share which application it is I can't be of much use, but that gives you an idea of the post-upgrade steps to take.
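The config-copy step above can be sketched as a small helper. All paths and the class name here are assumptions; a real run would use the cluster node's config directory (commonly /etc/hadoop/conf) and the application's own conf directory, while the demo below uses temporary directories so it runs standalone.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

// Sketch (paths are assumptions): copy the refreshed client configs from a
// cluster node's config directory into an external application's config
// directory, so the app picks up any settings the upgrade changed.
public class SyncClientConfigs {
    static void syncConfigs(Path clusterConf, Path appConf) throws IOException {
        for (String name : new String[] {"hdfs-site.xml", "core-site.xml"}) {
            Files.copy(clusterConf.resolve(name), appConf.resolve(name),
                       StandardCopyOption.REPLACE_EXISTING);
        }
    }

    public static void main(String[] args) throws IOException {
        // Temporary directories stand in for /etc/hadoop/conf and the
        // application's conf dir, purely so this sketch runs anywhere.
        Path src = Files.createTempDirectory("clusterConf");
        Path dst = Files.createTempDirectory("appConf");
        Files.writeString(src.resolve("hdfs-site.xml"), "<configuration/>");
        Files.writeString(src.resolve("core-site.xml"), "<configuration/>");
        syncConfigs(src, dst);
        System.out.println("copied: " + Files.exists(dst.resolve("hdfs-site.xml"))
                + " " + Files.exists(dst.resolve("core-site.xml")));
    }
}
```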
I am connecting a Java application to the Hadoop cluster using connection pools.
Maybe you should validate and copy over the jars below. I haven't been in your situation, but this is my best guess.
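One hedged way to validate which client jar the application actually loaded after swapping libraries is to read the implementation version from the jar's manifest. The class name below is an assumption; in a real application you would pass a Hadoop client class such as org.apache.hadoop.fs.FileSystem.class, but a JDK class is used here only so the sketch runs standalone.

```java
// Sketch: log the manifest version of a marker class so a stale client
// jar shows up at startup. Falls back to "unknown" when the manifest
// carries no Implementation-Version entry.
public class LoadedJarVersion {
    static String describe(Class<?> marker) {
        Package p = marker.getPackage();
        if (p != null && p.getImplementationVersion() != null) {
            return p.getImplementationVersion();
        }
        return "unknown";
    }

    public static void main(String[] args) {
        // Real usage: describe(org.apache.hadoop.fs.FileSystem.class)
        System.out.println("loaded client version: " + describe(String.class));
    }
}
```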