
java.lang.IllegalStateException(zip file closed)

Expert Contributor

We are currently using CDH 5.8.3, and most of our Oozie Hive actions are failing frequently with the following error:


ERROR : Ended Job = job_xx with exception 'java.lang.IllegalStateException(zip file closed)'
java.lang.IllegalStateException: zip file closed
at java.util.zip.ZipFile.ensureOpen(ZipFile.java:634)
at java.util.zip.ZipFile.getEntry(ZipFile.java:305)
at java.util.jar.JarFile.getEntry(JarFile.java:227)
at sun.net.www.protocol.jar.URLJarFile.getEntry(URLJarFile.java:128)
at sun.net.www.protocol.jar.JarURLConnection.connect(JarURLConnection.java:132)
at sun.net.www.protocol.jar.JarURLConnection.getInputStream(JarURLConnection.java:150)
at java.net.URLClassLoader.getResourceAsStream(URLClassLoader.java:233)
at javax.xml.parsers.SecuritySupport$4.run(SecuritySupport.java:94)
at java.security.AccessController.doPrivileged(Native Method)
at javax.xml.parsers.SecuritySupport.getResourceAsStream(SecuritySupport.java:87)
at javax.xml.parsers.FactoryFinder.findJarServiceProvider(FactoryFinder.java:283)
at javax.xml.parsers.FactoryFinder.find(FactoryFinder.java:255)
at javax.xml.parsers.DocumentBuilderFactory.newInstance(DocumentBuilderFactory.java:121)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2526)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2503)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2409)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:982)
at org.apache.hadoop.mapred.JobConf.checkAndWarnDeprecation(JobConf.java:2032)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:484)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:474)
at org.apache.hadoop.mapreduce.Cluster.getJob(Cluster.java:210)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:604)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:602)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hadoop.mapred.JobClient.getJobUsingCluster(JobClient.java:602)
at org.apache.hadoop.mapred.JobClient.getJobInner(JobClient.java:612)
at org.apache.hadoop.mapred.JobClient.getJob(JobClient.java:642)
at org.apache.hadoop.hive.ql.exec.mr.HadoopJobExecHelper.progress(HadoopJobExecHelper.java:289)
at org.apache.hadoop.hive.ql.exec.mr.HadoopJobExecHelper.progress(HadoopJobExecHelper.java:549)
at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:435)
at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1782)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1539)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1318)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1127)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1120)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:178)
at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:72)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:232)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:245)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)


Please help me to resolve this error.

5 REPLIES

Champion
What libraries or archive files is the job using?

Judging by the stack trace, it fails while loading the Hadoop Configuration: the XML parser factory tries to read a service-provider resource from a JAR on the classpath through a jar: URL, but the underlying zip/jar file has already been closed.

Expert Contributor

The file is a simple Beeline HQL script that inserts data through an Oozie Hive2 action. We have been using it for a couple of years and have never run into this issue before.

The following is the Oozie action:

<action name="hive-action-prime-stage-summary-incr">
    <hive2 xmlns="uri:oozie:hive2-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <job-xml>${hiveConfDir}/hive-site.xml</job-xml>
        <jdbc-url>${beeline_jdbc_url}</jdbc-url>
        <script>${oozie_script_path_prime}/hql/stage_summary_incr.hql</script>
        <param>database_destination=${primeDataBaseName}</param>
        <param>tenantid=${xyz}</param>
        <param>version_number=${version}</param>
        <param>database_source=${udmDataBaseName}</param>
        <param>hive_job_metastore_databasename=${hive_job_metastore_databasename}</param>
        <param>hiveUDFJarPath=${ciUDFJarPath}</param>
        <argument>-wpf</argument>
        <file>${hiveConfDir}/hive-site.xml#hive-site.xml</file>
        <file>${nameNode}${impala_udfs}/pf#pf</file>
    </hive2>
    <ok to="joiningS"/>
    <error to="kill_mail"/>
</action>
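
For reference, each <param> above is passed to the Beeline session as a script variable, so the HQL refers to them as ${name}. The real stage_summary_incr.hql is longer, but it is essentially a parameterized insert along these lines (simplified, with placeholder table and column names):

-- Illustrative sketch only: shows how the action's <param> values are consumed
-- in the script; the table and column names are placeholders, not the real ones.
USE ${database_destination};

INSERT INTO TABLE stage_summary
SELECT
    d.tenant_id,
    d.metric_name,
    d.metric_value,
    '${version_number}' AS version_number
FROM ${database_source}.stage_detail d      -- source database supplied via <param>
WHERE d.tenant_id = '${tenantid}';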

New Contributor

Is there any solution to this problem?

I am hitting the same problem with OpenJDK and CDH 5.13.

Expert Contributor

Foolbear,

Since this is a Hive2 action and the job connects through JDBC, the following configuration is probably superfluous and should be removed. All UDF interactions are handled by HiveServer2 and are hidden from the client.

<param>hiveUDFJarPath=${ciUDFJarPath}</param>

Also remove any references to UDFs in <file>${hiveConfDir}/hive-site.xml#hive-site.xml</file>.

Thanks.

Expert Contributor

If you are running an ADD JAR statement in the HQL file, please reconsider and instead install the JAR in HiveServer2 as a permanent UDF.
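
For example, with the UDF JAR uploaded to HDFS, a one-time statement along these lines registers it as a permanent function in HiveServer2 (the database, function, class, and JAR path below are placeholders):

-- Illustrative only: register the UDF once instead of running ADD JAR in every script.
-- The database name, function name, class, and HDFS path are placeholders.
CREATE FUNCTION prime_db.parse_fields
  AS 'com.example.hive.udf.ParseFields'
  USING JAR 'hdfs:///apps/hive/udfs/ci-udfs.jar';

After that, the HQL scripts can call prime_db.parse_fields() directly, and the ADD JAR / hiveUDFJarPath plumbing can be dropped. See the Cloudera documentation below for the full procedure on a managed cluster.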

https://www.cloudera.com/documentation/enterprise/5-12-x/topics/cm_mc_hive_udf.html

https://issues.apache.org/jira/browse/HADOOP-13809

https://issues.apache.org/jira/browse/HIVE-11681