Is there a dependency in Ambari/HDP such that we have to install the Oracle JDBC jars on all slave nodes?
Hadoop uses the JDBC jars that ship with Oracle Instant Client to connect to Oracle backend databases, and Hadoop administrators use sqlplus to verify the database and troubleshoot.
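For example, a quick connectivity check from an edge node might look like the sketch below; the host, port, service name, and credentials are placeholders, not values from your environment:

# Verify the Oracle database is reachable before pointing Hadoop at it.
# dbhost, 1521, ORCL, and the login are all hypothetical placeholders.
sqlplus hadoop_user/secret@//dbhost:1521/ORCL <<'EOF'
SELECT banner FROM v$version;
EXIT;
EOF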
Ambari uses a JDBC driver to connect to databases, so it is important to have a compatible driver. If the existing driver is not compatible, a new one can be pushed to all the nodes using the following command:
ambari-server setup --jdbc-db=oracle --jdbc-driver=/pathtodriver/driver
If the customer is already using Oracle 11g, the existing JDBC driver is most likely fine. Are they facing any issues in Ambari or other products? The jar should also be present in the product-specific folders if those products use Oracle as a metastore.
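As a concrete sketch, assuming the driver sits at /usr/share/java/ojdbc7.jar, the setup run and a follow-up check might look like this (Ambari stages the jar under its resources directory; the exact path assumes a standard Ambari layout):

# Register the Oracle driver with Ambari; the jar path is an assumption.
ambari-server setup --jdbc-db=oracle --jdbc-driver=/usr/share/java/ojdbc7.jar

# Verify Ambari copied the jar into its resources directory.
ls -l /var/lib/ambari-server/resources/ojdbc7.jar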
Once ambari-server setup has been executed with the new JDBC jar, what is the next step to push the jar to the Hadoop services: Hive, Oozie, and Ranger?
Also, --jdbc-driver takes the full jar path, such as /usr/share/java/ojdbc7.jar; could it be a directory path such as /usr/share/java instead?
Not on all slave nodes, just the edge/client nodes. Sqoop distributes the JDBC jar from its lib directory to any slave node that runs a Sqoop job.
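For illustration, assuming an HDP-style layout where Sqoop's lib directory is /usr/hdp/current/sqoop-client/lib, the workflow on the edge node might look like the following; the connection details and table name are placeholders:

# On the edge/client node only, drop the driver into Sqoop's lib directory.
cp /usr/share/java/ojdbc7.jar /usr/hdp/current/sqoop-client/lib/

# A Sqoop job launched from this node ships the jar to whichever slave
# nodes run its map tasks, so no per-node install is needed.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username hadoop_user -P \
  --table EMPLOYEES \
  --target-dir /user/hadoop_user/employees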