
Oozie not restarting


Hello,

I recently upgraded Ambari to 2.4 and did all the post-upgrade steps. Now when I restart services, Oozie fails and won't let me restart the other services. I thought it was a Kerberos issue, so I went to disable Kerberos since it's not really needed, but when it goes to shut down Oozie it fails again. I have no idea why. Below is the error output. It seems to be looking for a file or configuration, but I have no idea where it is located. Any help would be appreciated.

Thanks in advance.

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_client.py", line 76, in <module>
    OozieClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 680, in restart
    self.install(env)
  File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_client.py", line 38, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_client.py", line 45, in configure
    oozie(is_server=False)
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie.py", line 143, in oozie
    content=Template("oozie.conf.j2")
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 123, in action_create
    content = self._get_content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 160, in _get_content
    return content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 51, in __call__
    return self.get_content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 142, in get_content
    rendered = self.template.render(self.context)
  File "/usr/lib/python2.6/site-packages/ambari_jinja2/environment.py", line 891, in render
    return self.environment.handle_exception(exc_info, True)
  File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/templates/oozie.conf.j2", line 35, in top-level template code
    {{oozie_user}}   - nproc    {{oozie_user_nproc_limit}}
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/config_dictionary.py", line 73, in __getattr__
    raise Fail("Configuration parameter '" + self.name + "' was not found in configurations dictionary!")
resource_management.core.exceptions.Fail: Configuration parameter 'oozie_user_nofile_limit' was not found in configurations dictionary!
1 ACCEPTED SOLUTION

Master Guru
@steve coyle

It looks like you are missing the below parameter in the Oozie environment (oozie-env).

23385-screen-shot-2017-08-01-at-105114-am.png

Please add it and restart the required services via Ambari.
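
Based on the error, the two keys the template needs are oozie_user_nofile_limit and oozie_user_nproc_limit, and they belong in oozie-env. A minimal sketch of what to add under Oozie > Configs in Ambari (the values shown are typical defaults, not necessarily what your cluster needs, so adjust them to your environment):

# Custom oozie-env properties read by oozie.conf.j2
oozie_user_nofile_limit=32000
oozie_user_nproc_limit=16000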


11 REPLIES

Master Mentor

@steve coyle

Have you checked in:

limits_conf_dir = /etc/security/limits.d
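
For example, assuming the default limits_conf_dir above, you can check what the agent last rendered on disk:

# Show the limits file that Ambari renders from oozie.conf.j2
cat /etc/security/limits.d/oozie.conf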


@Geoffrey Shelton Okot


In the oozie.conf file I have:

oozie - nofile 32000

oozie - nproc 1600

I am using Ambari to configure everything, though, and I don't see anything about the Oozie file limits in the config section at all.
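
One way to double-check from the command line what Ambari actually has stored for oozie-env (a rough sketch; AMBARI_HOST, CLUSTER_NAME and the admin credentials are placeholders for your own values):

# List the oozie-env configuration versions (tags) known to Ambari
curl -u admin:admin "http://AMBARI_HOST:8080/api/v1/clusters/CLUSTER_NAME/configurations?type=oozie-env"
# Then fetch a specific tag to see its properties and confirm whether the limit keys are present
curl -u admin:admin "http://AMBARI_HOST:8080/api/v1/clusters/CLUSTER_NAME/configurations?type=oozie-env&tag=TAG_FROM_PREVIOUS_OUTPUT"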

Master Mentor

@steve coyle

The problem is with the below parameter:

oozie_user_nproc_limit

Check this document.
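
For context, the oozie.conf.j2 template referenced in the traceback fills in both limits from oozie-env, so both oozie_user_nofile_limit and oozie_user_nproc_limit must be defined there; the missing nofile key is what raises the Fail. The relevant template lines presumably look roughly like this (the nproc line is the one quoted in the traceback, the nofile line is implied by the error message):

{{oozie_user}}   - nofile   {{oozie_user_nofile_limit}}
{{oozie_user}}   - nproc    {{oozie_user_nproc_limit}}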


@Geoffrey Shelton Okot

OK, I changed the limits and restarted, and got the same error.

I then took a look at the Oozie server and gave that a restart. I got an error about a missing directory, /var/tmp/oozie, so I made that directory and cleared that error.

I restarted the server

and got the same dang error, now for the server:

  File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_server.py", line 215, in <module>
    OozieServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_server.py", line 88, in start
    self.configure(env)
  File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_server.py", line 82, in configure
    oozie(is_server=True)
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie.py", line 143, in oozie
    content=Template("oozie.conf.j2")
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 123, in action_create
    content = self._get_content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 160, in _get_content
    return content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 51, in __call__
    return self.get_content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 142, in get_content
    rendered = self.template.render(self.context)
  File "/usr/lib/python2.6/site-packages/ambari_jinja2/environment.py", line 891, in render
    return self.environment.handle_exception(exc_info, True)
  File "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/templates/oozie.conf.j2", line 35, in top-level template code
    {{oozie_user}}   - nproc    {{oozie_user_nproc_limit}}
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/config_dictionary.py", line 73, in __getattr__
    raise Fail("Configuration parameter '" + self.name + "' was not found in configurations dictionary!")
resource_management.core.exceptions.Fail: Configuration parameter 'oozie_user_nofile_limit' was not found in configurations dictionary!

Here is the stdout for the server, if that might help:

2017-08-01 13:15:51,918 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.3.0.0-2557
2017-08-01 13:15:51,918 - Checking if need to create versioned conf dir /etc/hadoop/2.3.0.0-2557/0
2017-08-01 13:15:51,919 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.3.0.0-2557', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2017-08-01 13:15:51,952 - call returned (1, '/etc/hadoop/2.3.0.0-2557/0 exist already', '')
2017-08-01 13:15:51,953 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.3.0.0-2557', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2017-08-01 13:15:51,987 - checked_call returned (0, '')
2017-08-01 13:15:51,989 - Ensuring that hadoop has the correct symlink structure
2017-08-01 13:15:51,989 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-08-01 13:15:52,141 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.3.0.0-2557
2017-08-01 13:15:52,142 - Checking if need to create versioned conf dir /etc/hadoop/2.3.0.0-2557/0
2017-08-01 13:15:52,142 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.3.0.0-2557', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2017-08-01 13:15:52,174 - call returned (1, '/etc/hadoop/2.3.0.0-2557/0 exist already', '')
2017-08-01 13:15:52,174 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.3.0.0-2557', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2017-08-01 13:15:52,209 - checked_call returned (0, '')
2017-08-01 13:15:52,210 - Ensuring that hadoop has the correct symlink structure
2017-08-01 13:15:52,211 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-08-01 13:15:52,213 - Group['spark'] {}
2017-08-01 13:15:52,216 - Group['ranger'] {}
2017-08-01 13:15:52,216 - Group['hadoop'] {}
2017-08-01 13:15:52,217 - Group['users'] {}
2017-08-01 13:15:52,217 - Group['knox'] {}
2017-08-01 13:15:52,218 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,219 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,220 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,221 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-08-01 13:15:52,222 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,222 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,223 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-08-01 13:15:52,225 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger']}
2017-08-01 13:15:52,226 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-08-01 13:15:52,228 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,229 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,231 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-08-01 13:15:52,233 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,234 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,236 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,238 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,239 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,241 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,242 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,244 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,246 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-08-01 13:15:52,247 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-08-01 13:15:52,251 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-08-01 13:15:52,261 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-08-01 13:15:52,262 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-08-01 13:15:52,264 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-08-01 13:15:52,267 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-08-01 13:15:52,276 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-08-01 13:15:52,277 - Group['hdfs'] {}
2017-08-01 13:15:52,277 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2017-08-01 13:15:52,279 - FS Type: 
2017-08-01 13:15:52,279 - Directory['/etc/hadoop'] {'mode': 0755}
2017-08-01 13:15:52,308 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
2017-08-01 13:15:52,309 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-08-01 13:15:52,324 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2017-08-01 13:15:52,334 - Skipping Execute[('setenforce', '0')] due to not_if
2017-08-01 13:15:52,334 - Directory['/var/log/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2017-08-01 13:15:52,339 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2017-08-01 13:15:52,340 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2017-08-01 13:15:52,350 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'root'}
2017-08-01 13:15:52,352 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'root'}
2017-08-01 13:15:52,353 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2017-08-01 13:15:52,371 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs', 'group': 'hadoop'}
2017-08-01 13:15:52,372 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2017-08-01 13:15:52,373 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2017-08-01 13:15:52,379 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2017-08-01 13:15:52,386 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2017-08-01 13:15:52,643 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.3.0.0-2557
2017-08-01 13:15:52,643 - Checking if need to create versioned conf dir /etc/hadoop/2.3.0.0-2557/0
2017-08-01 13:15:52,644 - call[('ambari-python-wrap', '/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.3.0.0-2557', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2017-08-01 13:15:52,671 - call returned (1, '/etc/hadoop/2.3.0.0-2557/0 exist already', '')
2017-08-01 13:15:52,672 - checked_call[('ambari-python-wrap', '/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.3.0.0-2557', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2017-08-01 13:15:52,701 - checked_call returned (0, '')
2017-08-01 13:15:52,702 - Ensuring that hadoop has the correct symlink structure
2017-08-01 13:15:52,702 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-08-01 13:15:52,712 - checked_call['rpm -q --queryformat '%{version}-%{release}' hdp-select | sed -e 's/\.el[0-9]//g''] {'stderr': -1}
2017-08-01 13:15:52,762 - checked_call returned (0, '2.5.6.0-40', '')
2017-08-01 13:15:52,768 - HdfsResource['/user/oozie'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'dfs_type': '', 'default_fs': 'hdfs://amaya.dge.local:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs@DGE.LOCAL', 'user': 'hdfs', 'owner': 'oozie', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', u'/apps/falcon', u'/mr-history/done', u'/app-logs', u'/tmp'], 'mode': 0775}
2017-08-01 13:15:52,770 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@DGE.LOCAL'] {'user': 'hdfs'}
2017-08-01 13:15:52,835 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET --negotiate -u : '"'"'http://amaya.dge.local:50070/webhdfs/v1/user/oozie?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpUr0RTf 2>/tmp/tmpp13S1c''] {'logoutput': None, 'quiet': False}
2017-08-01 13:15:52,887 - call returned (0, '')
2017-08-01 13:15:52,888 - HdfsResource[None] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'dfs_type': '', 'default_fs': 'hdfs://amaya.dge.local:8020', 'hdfs_resource_ignore_file': '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': 'hdfs@DGE.LOCAL', 'user': 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'immutable_paths': [u'/apps/hive/warehouse', u'/apps/falcon', u'/mr-history/done', u'/app-logs', u'/tmp']}
2017-08-01 13:15:52,889 - Directory['/usr/hdp/current/oozie-server/conf'] {'owner': 'oozie', 'create_parents': True, 'group': 'hadoop'}
2017-08-01 13:15:52,889 - XmlConfig['oozie-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/oozie-server/conf', 'mode': 0664, 'configuration_attributes': {}, 'owner': 'oozie', 'configurations': ...}
2017-08-01 13:15:52,906 - Generating config: /usr/hdp/current/oozie-server/conf/oozie-site.xml
2017-08-01 13:15:52,906 - File['/usr/hdp/current/oozie-server/conf/oozie-site.xml'] {'owner': 'oozie', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0664, 'encoding': 'UTF-8'}
2017-08-01 13:15:52,947 - File['/usr/hdp/current/oozie-server/conf/oozie-env.sh'] {'content': InlineTemplate(...), 'owner': 'oozie', 'group': 'hadoop'}
2017-08-01 13:15:52,948 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2017-08-01 13:15:52,952 - File['/etc/security/limits.d/oozie.conf'] {'content': Template('oozie.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}


Command failed after 1 tries

Could it be a Kerberos issue? I've been thinking of removing Kerberos; it's not really needed since the cluster is in a secure network. However, I couldn't remove it before because Oozie wouldn't shut down. Since I did the server restart it has stopped working anyway, so I guess I can proceed with that now? lol



Thanks so much for your help, it's really appreciated.

Master Guru
@steve coyle

It looks like you are missing the below parameter in the Oozie environment (oozie-env).

23385-screen-shot-2017-08-01-at-105114-am.png

Please add it and restart the required services via Ambari.



@Kuldeep Kulkarni

I don't have that variable in Ambari?

Here is what my advanced oozie-env looks like:

23386-oozie-ambari.jpg

Master Guru

@steve coyle

You can add it via the configs.sh/configs.py script.

Here is my configuration for your reference.

23387-screen-shot-2017-08-01-at-111611-am.png
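
A rough sketch of doing it with the configs.sh script that ships with the Ambari server (admin/admin, AMBARI_HOST and CLUSTER_NAME are placeholders for your own values, and the limit values shown are typical defaults, so adjust them to your environment):

# configs.sh usually lives with the Ambari server's helper scripts
cd /var/lib/ambari-server/resources/scripts
# Add the missing keys to oozie-env
./configs.sh -u admin -p admin set AMBARI_HOST CLUSTER_NAME oozie-env oozie_user_nofile_limit 32000
./configs.sh -u admin -p admin set AMBARI_HOST CLUSTER_NAME oozie-env oozie_user_nproc_limit 16000
# Verify the keys are now present
./configs.sh -u admin -p admin get AMBARI_HOST CLUSTER_NAME oozie-env

Afterwards, restart Oozie (and any other services Ambari flags) so the new values are picked up.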


@Kuldeep Kulkarni

Do you happen to have something on how to do that? I've never done anything like that before.