Member since: 03-01-2022
Posts: 15
Kudos Received: 0
Solutions: 0
09-14-2022
01:07 PM
@Asim- Unless your final table has to be a Hive managed (ACID) table, you could incrementally update the Hive table directly with Sqoop, e.g.:

sqoop import \
  --connect jdbc:oracle:thin:@xx.xx.xx.xx:1521:ORCL \
  --table EMPLOYEE \
  --username user1 \
  --password welcome1 \
  --incremental lastmodified \
  --merge-key employee_id \
  --check-column emp_timestamp \
  --target-dir /usr/hive/warehouse/external/empdata/

Otherwise, the way you are doing it is actually the way Cloudera recommends.
08-30-2022
04:10 AM
Hi @Asim- You can check the following link for Spark and dbt integration: https://community.cloudera.com/t5/Innovation-Blog/Running-dbt-core-with-adapters-for-Hive-Spark-and-Impala/ba-p/350384
08-02-2022
06:34 PM
@Asim- For JDBC as well, you need HWC for managed tables. Here is the example for Spark2; but, as mentioned earlier, for Spark3 there is no way to connect to Hive ACID tables from Apache Spark other than HWC, and that is not yet a supported feature for Spark 3.2 / CDS 3.2 in CDP 7.1.7. Marking this thread as closed; if you have any issues related to external tables, kindly start a new Support-Questions thread for better tracking of the issue and documentation. Thanks
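Since the original Spark2 example is not preserved above, here is a rough PySpark sketch of what an HWC read/write against a managed (ACID) table typically looks like; the jar/zip paths, the HiveServer2 JDBC URL, and the table names are placeholders rather than values from this thread.

# Launch with the HWC assembly jar and Python bindings (paths/URL are placeholders):
#   pyspark --jars /path/to/hive-warehouse-connector-assembly-<version>.jar \
#           --py-files /path/to/pyspark_hwc-<version>.zip \
#           --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://<hs2-host>:10000/default"

from pyspark_llap import HiveWarehouseSession

# Build an HWC session on top of the existing SparkSession (spark)
hive = HiveWarehouseSession.session(spark).build()

# Read from a managed (ACID) Hive table through HWC
df = hive.executeQuery("SELECT * FROM default.employee")
df.show()

# Write back to a managed table through HWC
df.write.format(HiveWarehouseSession.HIVE_WAREHOUSE_CONNECTOR) \
    .option("table", "default.employee_copy") \
    .save()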
07-20-2022
06:40 AM
Hi @Asim- Can you upload a screenshot of the ListenSyslog processor's properties? The link below may help you set the socket buffer size:
[1] https://access.redhat.com/documentation/en-us/red_hat_enterprise_linux/5/html/tuning_and_optimizing_red_hat_enterprise_linux_for_oracle_9i_and_10g_databases/sect-oracle_9i_and_10g_tuning_guide-adjusting_network_settings-changing_network_kernel_settings
NOTE: Please check with your system admin before making any changes at the OS level.
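As a quick sanity check of those kernel limits (an illustrative Python sketch, not anything NiFi-specific), you can ask the OS for a larger UDP receive buffer and see what it actually grants; on Linux the grant is capped by net.core.rmem_max, which is what ultimately limits how much ListenSyslog can buffer.

import socket

# Request a 4 MB receive buffer on a throwaway UDP socket
requested = 4 * 1024 * 1024
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, requested)

# On Linux the kernel reports roughly 2 * min(requested, net.core.rmem_max),
# so a value far below 2 * requested means the OS limit is the bottleneck.
granted = s.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF)
print("requested:", requested, "granted:", granted)
s.close()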
03-07-2022
03:52 AM
{
"_id": "620e6275034f4fe64f1ce2ef",
"patientorderitems": [
{
"_id": "620e6275034f4fe64f1ce2f0",
"patientorderlogs": [
{
"_id": "620e6275034f4fe64f1ce2f1",
"useruid": "6031edd256afd66888232d6e",
"departmentuid": "602f6a3494ce862c04aa49d2"
},
{
"_id": "621efc35da15edd34560da80",
"useruid": "6032021359f2cf686ae807ba",
"departmentuid": "602f6a3494ce862c04aa49d5"
},
{
"_id": "6220a702061f33f4abe8a2a6",
"useruid": "604f3cb743027274a8c565de",
"departmentuid": "604c5393864aa9012d79e986"
},
{
"_id": "6220a70f65ca50f522598a85",
"useruid": "604f3cb743027274a8c565de",
"departmentuid": "604c5393864aa9012d79e986"
},
{
"_id": "6220a717145139f53cfb6143",
"useruid": "604f3cb743027274a8c565de",
"departmentuid": "604c5393864aa9012d79e986"
}
]
}
]
}

Dear araujo, can I have a JOLT specification for this JSON in the same format as the last JSON?
03-03-2022
12:33 AM
@Asim-
> There is not Failure FlowFile so I don't think its cause this isssue
It doesn't hurt to make the modification and try. What I noticed is that if there is any exception during the execution of your code, due to the problem in the line I mentioned, no error is shown in the UI and no flowfiles go to either the success or failure relationship. Please make the change and test to see if anything else appears. Cheers, André
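For context, here is a minimal sketch of the kind of defensive routing being discussed, assuming the script runs in an ExecuteScript processor with the python (Jython) engine; this is a generic pattern, not necessarily the exact change suggested earlier in the thread. Wrapping the body in try/except makes failures visible by routing the flowfile to the failure relationship instead of letting it disappear silently.

# ExecuteScript (python/Jython) body -- generic error-routing pattern
flowFile = session.get()
if flowFile is not None:
    try:
        # ... your processing logic goes here ...
        session.transfer(flowFile, REL_SUCCESS)
    except Exception as e:
        # Surface the error instead of dropping the flowfile silently
        log.error("Script failed: " + str(e))
        session.transfer(flowFile, REL_FAILURE)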