Apache Flink - InvalidPathException: Nul character not allowed
Labels: Apache Flink
Created 08-16-2022 11:11 AM
Hello,
I am running Flink 1.14.4 with 3 jobs in it. JVM: openjdk version "1.8.0_312". All jobs are running without any obvious issues; however, I see the exception below in the log. I'm not able to understand what is causing it.
Appreciate any help on this.
Exception: java.nio.file.InvalidPathException: Nul character not allowed (full stack trace below)
2022-08-15 16:35:15.350 [flink-rest-server-netty-worker-thread-5] WARN org.apache.flink.runtime.rest.FileUploadHandler - File upload failed.
java.nio.file.InvalidPathException: Nul character not allowed: bash' '-c' #cmdlinux})).(#p new java.lang.ProcessBuilder(#cmds)).(#p.redirectErrorStream(true)).(#process #p.start()).(#ros (@org.apache.struts2.ServletActionContext@getResponse().getOutputStream())).(@org.apache.commons.io.IOUtils@copy(#process.getInputStream() #ros)).(#ros.flush())}b
at sun.nio.fs.UnixPath.checkNotNul(UnixPath.java:93)
at sun.nio.fs.UnixPath.normalizeAndCheck(UnixPath.java:83)
at sun.nio.fs.UnixPath.<init>(UnixPath.java:71)
at sun.nio.fs.UnixFileSystem.getPath(UnixFileSystem.java:281)
at sun.nio.fs.AbstractPath.resolve(AbstractPath.java:53)
at org.apache.flink.runtime.rest.FileUploadHandler.channelRead0(FileUploadHandler.java:189)
at org.apache.flink.runtime.rest.FileUploadHandler.channelRead0(FileUploadHandler.java:71)
at org.apache.flink.shaded.netty4.io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at org.apache.flink.shaded.netty4.io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436)
at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:324)
at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:296)
at org.apache.flink.shaded.netty4.io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251)
at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at org.apache.flink.shaded.netty4.io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:719)
at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655)
at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581)
at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
at org.apache.flink.shaded.netty4.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at java.lang.Thread.run(Thread.java:748)
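For illustration, judging by the stack trace (flink-rest-server-netty-worker-thread, FileUploadHandler), the exception is raised while the REST server resolves a client-supplied multipart file name against its upload directory, and a NUL byte in that name is rejected by the Unix path implementation. A minimal, self-contained sketch of the same failure follows; the directory and file name are made up, not taken from the log.

    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class NulCharacterPathDemo {
        public static void main(String[] args) {
            // Hypothetical upload directory, for illustration only.
            Path uploadDir = Paths.get("/tmp/flink-web-upload");

            // A client-supplied multipart file name with an embedded NUL byte
            // (made up here; the real content is whatever the client sent).
            String fileName = "bash\u0000-c";

            // On a Unix file system this throws
            // java.nio.file.InvalidPathException: Nul character not allowed,
            // via the same UnixPath.checkNotNul check seen in the stack trace above.
            Path resolved = uploadDir.resolve(fileName);
            System.out.println(resolved);
        }
    }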
Thanks,
Prabu.
Created 08-27-2022 06:00 PM
@spserd,
Do these errors appear only once in the logs, or are they recurring?
What's the command line you used to run the job?
Cheers,
André
Created 08-29-2022 12:19 PM
1. The error is recurring in the logs.
2. To start/stop Flink, I use 'service flink-jobmanager start/stop'.
3. To start the Flink job, I use: $FLINK_HOME/bin/flink run -d /opt/flink-jobs/lib/job1.jar --pathToPropertyFile /opt/flink-jobs/conf/job1.properties
4. To cancel a Flink job, I get the ID from the 'flink list' command and use: $FLINK_HOME/bin/flink cancel <ID>
Please let me know if you need anything.
Appreciate your help on this.
Thanks,
Prabu.
Created 08-29-2022 04:34 PM
Not sure what's going on. What's your job doing?
Created 09-12-2022 07:40 AM
Hi @araujo,
My apologies for coming back late.
The job reads from 2 Kafka sources and sends the data to a Kinesis stream (sink).
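For context, a pipeline of that shape (two Kafka sources unioned into a single Kinesis sink) would look roughly like the sketch below on Flink 1.14, using the FlinkKinesisProducer from the flink-connector-kinesis artifact. The broker address, topic names, group id, region and stream name are placeholders, not details from this thread.

    import java.util.Properties;

    import org.apache.flink.api.common.eventtime.WatermarkStrategy;
    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.connector.kafka.source.KafkaSource;
    import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisProducer;
    import org.apache.flink.streaming.connectors.kinesis.config.AWSConfigConstants;

    public class KafkaToKinesisJobSketch {

        // Builds one of the two Kafka sources (all connection details are placeholders).
        private static KafkaSource<String> kafkaSource(String topic) {
            return KafkaSource.<String>builder()
                    .setBootstrapServers("broker:9092")
                    .setTopics(topic)
                    .setGroupId("job1")
                    .setStartingOffsets(OffsetsInitializer.latest())
                    .setValueOnlyDeserializer(new SimpleStringSchema())
                    .build();
        }

        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Two Kafka sources, merged into one stream.
            DataStream<String> merged = env
                    .fromSource(kafkaSource("topic-a"), WatermarkStrategy.noWatermarks(), "kafka-source-1")
                    .union(env.fromSource(kafkaSource("topic-b"), WatermarkStrategy.noWatermarks(), "kafka-source-2"));

            // Kinesis sink; credentials come from the default AWS provider chain.
            Properties producerConfig = new Properties();
            producerConfig.put(AWSConfigConstants.AWS_REGION, "us-east-1"); // placeholder
            FlinkKinesisProducer<String> kinesis =
                    new FlinkKinesisProducer<>(new SimpleStringSchema(), producerConfig);
            kinesis.setDefaultStream("output-stream"); // placeholder
            kinesis.setDefaultPartition("0");

            merged.addSink(kinesis);
            env.execute("kafka-to-kinesis");
        }
    }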
Thanks,
Prabu.