Member since: 11-10-2024
Posts: 6
Kudos Received: 6
Solutions: 0
03-28-2025 03:47 AM
* This has been addressed as part of a support case.
* The Tez job failed with the error below:

Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 104
Serialization trace:
columnTypeResolvers (org.apache.hadoop.hive.ql.exec.UnionOperator)
tableDesc (org.apache.hadoop.hive.ql.plan.PartitionDesc)
aliasToPartnInfo (org.apache.hadoop.hive.ql.plan.MapWork)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:137)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:693)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClass(SerializationUtilities.java:186)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:118)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:543)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:731)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:219)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:543)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$PartitionDescSerializer.read(SerializationUtilities.java:580)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$PartitionDescSerializer.read(SerializationUtilities.java:572)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:813)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:181)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:161)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:39)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:731)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:219)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:543)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:709)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:211)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializeObjectByKryo(SerializationUtilities.java:755)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:661)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:638)
at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:492)
... 22 more

* Cause: serialization-related classes were loaded from an older, conflicting version of hive-exec (hive-exec-<version>.jar).
* Fix: remove the older version of the jar from the HS2 classpath and the aux-jars path to overcome the problem.
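A minimal sketch of a check for the conflicting jars (the directories below are assumptions; substitute the lib and aux-jars paths your HiveServer2 instance actually uses):

#!/bin/bash
# Hypothetical paths; adjust to your deployment.
for dir in /opt/cloudera/parcels/CDH/lib/hive/lib /usr/local/hive/auxlib; do
    # More than one hive-exec version under these paths indicates a conflict.
    find "$dir" -name 'hive-exec-*.jar' 2>/dev/null
done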
03-06-2025 07:42 AM
Assuming it's a MapReduce job, since you're looking for MapReduce I/O counter information, here's a script that calculates the counter info:

[hive@node4 ~]$ cat get_io_counters.sh
#!/bin/bash
# Ensure a job ID is provided
if [ "$#" -ne 1 ]; then
    echo "Usage: $0 <job_id>"
    exit 1
fi

JOB_ID=$1

# Extract I/O counters from the MapReduce job status
mapred job -status "$JOB_ID" | egrep -A 1 'File Input Format Counters|File Output Format Counters' | awk -F'=' '
    /File Input Format Counters/  {getline; bytes_read=$2}
    /File Output Format Counters/ {getline; bytes_written=$2}
    END {
        total_io_mb = (bytes_read + bytes_written) / (1024 * 1024)
        printf "BYTES_READ=%d\nBYTES_WRITTEN=%d\nTOTAL_IO_MB=%.2f\n", bytes_read, bytes_written, total_io_mb
    }'
[hive@node4 ~]$

Sample output:

[hive@node4 ~]$ ./get_io_counters.sh job_1741272271547_0007
25/03/06 15:38:34 INFO client.RMProxy: Connecting to ResourceManager at node3.playground-ggangadharan.coelab.cloudera.com/10.129.117.75:8032
25/03/06 15:38:35 INFO mapred.ClientServiceDelegate: Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting to job history server
BYTES_READ=288894
BYTES_WRITTEN=348894
TOTAL_IO_MB=0.61
[hive@node4 ~]$
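For a quick one-off check without the awk parsing, the same two counters can be queried directly; a sketch assuming the stock Hadoop counter group names:

# Counter group names assume the standard FileInputFormat/FileOutputFormat
# counter classes; the job ID is reused from the sample run above.
JOB_ID=job_1741272271547_0007
mapred job -counter "$JOB_ID" \
    org.apache.hadoop.mapreduce.lib.input.FileInputFormatCounter BYTES_READ
mapred job -counter "$JOB_ID" \
    org.apache.hadoop.mapreduce.lib.output.FileOutputFormatCounter BYTES_WRITTEN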
11-17-2024 11:14 PM
1 Kudo
Hi Everyone, I am Emmanuel Katto from Dubai, United Arab Emirates (UAE). We encountered an issue on our production Kudu cluster where a tablet server failed due to a disk failure and the WAL catalog was lost. After installing a new disk and clearing the data directory following the Kudu documentation (Rebuilding Kudu), we restarted the failed tablet server. However, after restarting, we noticed that the kudu ksck command showed two tablet servers with different UUIDs for the same server, and one of them had a "WRONG SERVER_UUID" status.
Questions:
What could be the cause of this error?
How can we avoid this issue in the future?
Is there a way to resolve this problem without restarting the master server?
We also found the kudu tserver unregister command, which appears to be used for removing tablet servers with incorrect UUIDs, but we didn't find this mentioned in the official documentation.
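For context, a minimal sketch of the inspection commands in question (master addresses are placeholders):

# Placeholder master addresses; substitute your cluster's Kudu masters.
MASTERS=master1:7051,master2:7051,master3:7051

# List registered tablet servers with their UUIDs and RPC addresses.
kudu tserver list "$MASTERS"

# Full cluster health check; this is where the WRONG SERVER_UUID status appears.
kudu cluster ksck "$MASTERS"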
Regards
Emmanuel Katto
Labels: Apache Flink
11-13-2024 09:56 PM
1 Kudo
Hey everyone,
I’m encountering a "401 Unauthorized" error while configuring the SiteToSite HTTPS Provenance Reporting Task in NiFi. I’ve double-checked the credentials and the configuration, but I’m still getting the error.
Has anyone else run into this issue or have suggestions on what might be causing it? Any guidance or troubleshooting tips would be much appreciated!
Looking forward to your insights!
Regards
Emmanuel Katto
Labels: Apache Ambari
11-13-2024 03:32 AM
1 Kudo
Hi team,
I'm trying to set up AES decryption in Apache NiFi using the DecryptContent processor for an encryption process based on AES-128 CTR mode. I've successfully implemented AES decryption locally with Node.js, but I’m running into some trouble replicating it in NiFi.
Here are the details of the encryption setup:
Encrypted Text: c6 c7 4b 49 0d cf 5c 20 87 0a e0 cd c4 a7 bf 94 d8
Key: 3E 9B 26 FE 46 4F 6D 2D 2F 69 5D 87 8A 07 93 74
IV: 2d 2c 83 42 00 74 1b 16 20 c0 7d 13 20 00 00 00
Correct Result: 14 25 79 ed a8 ff a7 00 00 e5 03 00 00 be 03 00 00
I've confirmed that my key and IV are correct. I’m using AES-128, CTR mode, and NoPadding for the encryption. The issue arises when I try to decrypt using NiFi’s DecryptContent processor. Here's what I've tried so far:
Cipher Algorithm Mode: Set to CTR
Cipher Algorithm Padding: Set to NoPadding
Key Specification Format: Set to RAW
For the incoming FlowFile content, I've set it as:
c6c74b490dcf5c20870ae0cdc4a7bf94d84E69466949562d2c834200741b1620c07d1320000000
(I also experimented with adding 4E6946694956 as the NiFi IV delimiter.)
Despite these settings, I get the following error: "Wrong IV length: must be 16 bytes long"
It seems like NiFi is interpreting the data as a regular string rather than HEX, which may be the source of the issue.
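As a sanity check outside NiFi, a minimal sketch (assuming openssl and xxd are available) that runs the same decryption from the shell; openssl's -K and -iv flags take the key and IV as hex, which sidesteps the string-vs-HEX question entirely:

# Key, IV, and ciphertext copied from the values listed above.
echo 'c6c74b490dcf5c20870ae0cdc4a7bf94d8' | xxd -r -p | \
  openssl enc -d -aes-128-ctr \
    -K 3E9B26FE464F6D2D2F695D878A079374 \
    -iv 2d2c834200741b1620c07d1320000000 | xxd -p
# Expected result per the post: 142579eda8ffa70000e5030000be030000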
I would appreciate any suggestions or insights from the team:
Is there a specific way to input HEX data into NiFi to ensure the IV and content are correctly processed?
Should I be formatting the data differently, or is there a setting in the DecryptContent processor I might have missed?
Are there any additional configuration steps or pitfalls I should be aware of when dealing with AES decryption in CTR mode within NiFi?
Thanks in advance for your help!
Best regards, Emmanuel Katto
Labels: Apache NiFi
11-11-2024 08:24 PM
1 Kudo
Hi everyone, I'm Emmanuel Katto from Dubai, United Arab Emirates (UAE), working on decrypting data using AES-128 in CTR mode in Apache NiFi, and I could really use some help or suggestions on how to configure it correctly. Here's what I've done so far:
Local Setup (Node.js)
Text to Decrypt: c6 c7 4b 49 0d cf 5c 20 87 0a e0 cd c4 a7 bf 94 d8
Key: 3E 9B 26 FE 46 4F 6D 2D 2F 69 5D 87 8A 07 93 74
Initialization Vector (IV): 2d 2c 83 42 00 74 1b 16 20 c0 7d 13 20 00 00 00
Correct Decryption Result: 14 25 79 ed a8 ff a7 00 00 e5 03 00 00 be 03 00 00
In my Node.js setup, this works perfectly, and I can decrypt the content using AES-128 CTR with no padding.
NiFi Setup (DecryptContent Processor)
I am trying to achieve the same decryption in Apache NiFi using the DecryptContent processor. I’ve configured it as follows:
Cipher Algorithm Mode: CTR
Cipher Algorithm Padding: NoPadding
Key Specification Format: RAW
For the incoming FlowFile content, I’ve set it to:
c6c74b490dcf5c20870ae0cdc4a7bf94d84E69466949562d2c834200741b1620c07d1320000000
However, I get an error: "Wrong IV length: must be 16 bytes long". This error suggests that NiFi is interpreting the content as a normal string and not as HEX values.
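One sketch worth noting (output file name hypothetical): NiFi processors operate on the FlowFile's raw bytes, so if the content arrives as hex text it has to be decoded to binary before it reaches DecryptContent, for example:

# Decode the hex string (including the 4E6946694956 "NiFiIV" delimiter bytes)
# into raw binary; the output file name here is hypothetical.
echo 'c6c74b490dcf5c20870ae0cdc4a7bf94d84E69466949562d2c834200741b1620c07d1320000000' \
  | xxd -r -p > flowfile_content.bin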
My Questions:
How do I correctly provide the IV and encrypted content in HEX format to the DecryptContent processor?
Is there any configuration I’ve missed to specify the content as HEX?
Is the IV delimiter (4E6946694956) necessary in this case, or should I be providing the IV as part of the content differently?
Would appreciate any guidance or suggestions from anyone who has worked with AES decryption in NiFi using CTR mode. Thanks in advance!
Regards
Emmanuel Katto
Labels: Apache NiFi