Member since: 09-19-2020
Posts: 46
Kudos Received: 1
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 2299 | 07-13-2021 12:09 AM |
06-28-2022 05:32 PM
Hello Team, we are doing CDC by pushing data to Kafka, and another pipeline reads that data from Kafka and writes it to Kudu. Whenever we restart the second pipeline (read from Kafka to Kudu), I notice thousands of old records coming through again. How does Kafka keep checkpoints, and is there a setting to change this behaviour? Thanks, Roshan
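For context on the question above: Kafka tracks consumer progress as committed offsets per consumer group (stored in the internal __consumer_offsets topic), not as checkpoints inside the broker's data. Where a restarted consumer resumes depends on its group.id and offset settings. A minimal sketch of the relevant consumer properties (the group.id value is a placeholder, not taken from the post):

```properties
# Consumers sharing this group.id resume from the group's last committed offset.
group.id=kudu-writer-pipeline
# Used only when the group has NO committed offset (or it has expired):
#   earliest -> re-read the topic from the beginning; latest -> only new records.
auto.offset.reset=earliest
# Commit offsets automatically at this interval, or disable and commit
# manually only after the batch has been durably written to Kudu.
enable.auto.commit=true
auto.commit.interval.ms=5000
# Broker-side retention of committed offsets (broker config, shown for reference):
# offsets.retention.minutes=10080
```

If thousands of old records reappear after a restart, the usual causes are a changed group.id, offsets not committed before shutdown, or auto.offset.reset=earliest taking effect after offset expiry.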
Labels:
- Apache Kafka
- Apache Kudu
05-18-2022 01:11 AM
Hello Team, could you please advise how UPDATE with a table alias works in Impala?

update cbs_cubes.TB_JDV_CBS_NEW set a.SUB_SERVICE_CODE_V=b.SUB_SERVICE_CODE_V from cbs_cubes.update1_sub_service_code b where a.SUB_SERVICE_CODE_V=b.SUB_SERVICE_CODE_V

[Cloudera][ImpalaJDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: TStatus(statusCode:ERROR_STATUS, sqlState:HY000, errorMessage:AnalysisException: 'cbs_cubes.TB_JDV_CBS_NEW' is not a valid table alias or reference. ), Query: update cbs_cubes.TB_JDV_CBS_NEW set a.SUB_SERVICE_CODE_V=b.SUB_SERVICE_CODE_V from cbs_cubes.update1_sub_service_code b where a.SUB_SERVICE_CODE_V=b.SUB_SERVICE_CODE_V.

Thanks, Roshan
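A note on the failing statement above: in Impala, UPDATE with joins works only on Kudu tables, and the alias (`a`) must be introduced in the FROM clause; the statement then updates via that alias. The query in the post references `a` without ever declaring it, which is what the AnalysisException complains about. A hedged sketch using the tables and columns from the post (not a verified fix):

```sql
-- Declare both aliases in the FROM clause and update through the alias of
-- the Kudu table being modified.
UPDATE a
SET a.SUB_SERVICE_CODE_V = b.SUB_SERVICE_CODE_V
FROM cbs_cubes.TB_JDV_CBS_NEW a
     JOIN cbs_cubes.update1_sub_service_code b
       ON a.SUB_SERVICE_CODE_V = b.SUB_SERVICE_CODE_V;
```

Note that the original joins and assigns on the same column, which would assign each row its existing value; presumably the join key was meant to be a different column.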
Labels:
- Apache Impala
- Apache Kudu
04-25-2022 12:52 AM
Hello Team, can you please advise why the connection to Kafka fails? Is the issue with the keytab? The keytab works fine on a Linux machine.

sasl.jaas.config= com.sun.security.auth.module.Krb5LoginModule required useKeyTab=true keyTab="D:\services.kerberos.keytab" principal="services/rb-hadoop-06.mtg.local@INNOV.LOCAL";

org.apache.kafka.common.KafkaException: Failed to create new KafkaAdminClient
    at org.apache.kafka.clients.admin.KafkaAdminClient.createInternal(KafkaAdminClient.java:540)
    at org.apache.kafka.clients.admin.Admin.create(Admin.java:134)
    at io.conduktor.kafka.plugins.PluginsAwareKafkaAdmin$.$anonfun$create$2(PluginsAwareKafkaAdmin.scala:17)
    at io.conduktor.plugins.PluginsResources$$anon$1.executeWithPlugins(PluginsResources.scala:74)
    at io.conduktor.kafka.plugins.PluginsAwareKafkaAdmin$.$anonfun$create$1(PluginsAwareKafkaAdmin.scala:17)
    at scala.util.Try$.apply(Try.scala:210)
    at io.conduktor.kafka.plugins.PluginsAwareKafkaAdmin$.create(PluginsAwareKafkaAdmin.scala:17)
    at io.conduktor.kafka.plugins.PluginsAwareKafkaAdmin.create(PluginsAwareKafkaAdmin.scala)
    at io.conduktor.kafka.KafkaClientMaker.makeAdminClient(KafkaClientMaker.kt:103)
    at io.conduktor.kafka.KafkaClientMaker.checkConnectivity(KafkaClientMaker.kt:79)
    at io.conduktor.views.clusterconfiguration.KafkaCluster$kafkaClusterConfigTab$1$2$1$11$1$2$1$1.invokeSuspend(KafkaCluster.kt:334)
    at io.conduktor.views.clusterconfiguration.KafkaCluster$kafkaClusterConfigTab$1$2$1$11$1$2$1$1.invoke(KafkaCluster.kt)
    at io.conduktor.views.clusterconfiguration.KafkaCluster$kafkaClusterConfigTab$1$2$1$11$1$2$1$1.invoke(KafkaCluster.kt)
    at io.conduktor.JavaFxExtensionsKt$launchWithProgressCancelable$job$1.invokeSuspend(JavaFxExtensions.kt:353)
    at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
    at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
    at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:571)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:678)
    at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:665)
Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: null (68)
    at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:184)
    at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:192)
    at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:81)
    at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:105)
    at org.apache.kafka.clients.admin.KafkaAdminClient.createInternal(KafkaAdminClient.java:513)
    ... 19 more
Caused by: javax.security.auth.login.LoginException: null (68)
    at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Unknown Source)
    at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.login(Unknown Source)
    at java.base/javax.security.auth.login.LoginContext.invoke(Unknown Source)
    at java.base/javax.security.auth.login.LoginContext$4.run(Unknown Source)
    at java.base/javax.security.auth.login.LoginContext$4.run(Unknown Source)
    at java.base/java.security.AccessController.doPrivileged(Unknown Source)
    at java.base/javax.security.auth.login.LoginContext.invokePriv(Unknown Source)
    at java.base/javax.security.auth.login.LoginContext.login(Unknown Source)
    at org.apache.kafka.common.security.authenticator.AbstractLogin.login(AbstractLogin.java:60)
    at org.apache.kafka.common.security.kerberos.KerberosLogin.login(KerberosLogin.java:103)
    at org.apache.kafka.common.security.authenticator.LoginManager.<init>(LoginManager.java:62)
    at org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:105)
    at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:170)
    ... 23 more
Caused by: KrbException: null (68)
    at java.security.jgss/sun.security.krb5.KrbAsRep.<init>(Unknown Source)
    at java.security.jgss/sun.security.krb5.KrbAsReqBuilder.send(Unknown Source)
    at java.security.jgss/sun.security.krb5.KrbAsReqBuilder.action(Unknown Source)
    ... 36 more
Caused by: KrbException: Identifier doesn't match expected value (906)
    at java.security.jgss/sun.security.krb5.internal.KDCRep.init(Unknown Source)
    at java.security.jgss/sun.security.krb5.internal.ASRep.init(Unknown Source)
    at java.security.jgss/sun.security.krb5.internal.ASRep.<init>(Unknown Source)
    ... 39 more
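Two observations on the trace above, offered as hedged suggestions rather than a verified fix. Kerberos error 68 at the AS exchange is KDC_ERR_WRONG_REALM, and "Identifier doesn't match expected value (906)" means the client could not decode the KDC's reply; together these often point to a realm/krb5.conf mismatch on the Windows client (note the host domain mtg.local vs the principal realm INNOV.LOCAL) or a keytab corrupted in transfer (e.g. copied via FTP in ASCII mode). Separately, backslashes inside a JAAS string are escape characters, so a Windows keytab path should use forward slashes or doubled backslashes:

```properties
# Sketch only: same JAAS entry as in the post, with the path rewritten so the
# backslash is not treated as an escape. Verify the realm mapping for
# *.mtg.local hosts in the client's krb5.conf [domain_realm] section.
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
  useKeyTab=true \
  keyTab="D:/services.kerberos.keytab" \
  principal="services/rb-hadoop-06.mtg.local@INNOV.LOCAL";
```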
Labels:
- Apache Kafka
03-28-2022 08:16 AM
Dear Team, kindly advise on why connection from our CDC tools to Kudu is not working? What does KuduWriterException: End of file: unable to send message: Other end of pipe was closed (error 0) mean? at java.lang.Thread.run(Thread.java:750) com.webaction.common.exc.ConnectionException: Couldn't establish connection with Kudu ( RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051 ) at com.striim.proc.Connection.KuduWriterConnection.connect(KuduWriterConnection.java:83) at com.striim.proc.Connection.KuduWriterConnection.<init>(KuduWriterConnection.java:44) at com.striim.proc.Connection.KuduWriterConnection.getConnection(KuduWriterConnection.java:100) at com.striim.proc.KuduWriter.initializeKuduClient(KuduWriter.java:224) at com.striim.proc.KuduWriter.initWriter(KuduWriter.java:132) at com.webaction.utils.writers.common.RetriableBaseDataStoreWriter.init(RetriableBaseDataStoreWriter.java:78) at com.webaction.runtime.components.Target.start(Target.java:377) at com.webaction.runtime.components.Flow.start(Flow.java:431) at com.webaction.runtime.components.Flow.start(Flow.java:375) at com.webaction.runtime.components.Flow.start(Flow.java:335) at com.webaction.runtime.components.Flow$2.run(Flow.java:1612) at java.lang.Thread.run(Thread.java:750) 
com.webaction.common.exc.ConnectionException: Couldn't establish connection with Kudu ( RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051,RB-HADOOP-03.mtg.local:7051,RB-HADOOP-04.mtg.local:7051,RB-HADOOP-05.mtg.local:7051 ) at com.striim.proc.Connection.KuduWriterConnection.connect(KuduWriterConnection.java:83) at com.striim.proc.Connection.KuduWriterConnection.<init>(KuduWriterConnection.java:44) at com.striim.proc.Connection.KuduWriterConnection.getConnection(KuduWriterConnection.java:100) at com.striim.proc.KuduWriter.initializeKuduClient(KuduWriter.java:224) at com.striim.proc.KuduWriter.initWriter(KuduWriter.java:132) at com.webaction.utils.writers.common.RetriableBaseDataStoreWriter.init(RetriableBaseDataStoreWriter.java:78) at com.webaction.runtime.components.Target.start(Target.java:377) at com.webaction.runtime.components.Flow.start(Flow.java:431) at com.webaction.runtime.components.Flow.start(Flow.java:375) at com.webaction.runtime.components.Flow.start(Flow.java:335) at com.webaction.runtime.components.Flow$2.run(Flow.java:1612) at java.lang.Thread.run(Thread.java:750) 
com.striim.proc.exception.KuduWriterException: End of file: unable to send message: Other end of pipe was closed (error 0)
    at com.striim.proc.CheckPoint.CheckpointTableImpl.checkForCheckpointTable(CheckpointTableImpl.java:133)
    at com.striim.proc.CheckPoint.CheckpointTableImpl.<init>(CheckpointTableImpl.java:44)
    at com.striim.proc.KuduWriter.initWriter(KuduWriter.java:152)
    at com.webaction.utils.writers.common.RetriableBaseDataStoreWriter.init(RetriableBaseDataStoreWriter.java:78)
    at com.webaction.runtime.components.Target.start(Target.java:377)
    at com.webaction.runtime.components.Flow.start(Flow.java:431)
    at com.webaction.runtime.components.Flow.start(Flow.java:375)
    at com.webaction.runtime.components.Flow.start(Flow.java:335)
    at com.webaction.runtime.components.Flow$2.run(Flow.java:1612)
    at java.lang.Thread.run(Thread.java:750)

Command RESUME failed so application CDC.CDC_CBS_CUST_CBS_NP is put in HALTED state

com.striim.proc.exception.KuduWriterException: End of file: unable to send message: Other end of pipe was closed (error 0)
    at com.striim.proc.CheckPoint.CheckpointTableImpl.checkForCheckpointTable(CheckpointTableImpl.java:133)
    at com.striim.proc.CheckPoint.CheckpointTableImpl.<init>(CheckpointTableImpl.java:44)
    at com.striim.proc.KuduWriter.initWriter(KuduWriter.java:152)
    at com.webaction.utils.writers.common.RetriableBaseDataStoreWriter.init(RetriableBaseDataStoreWriter.java:78)
    at com.webaction.runtime.components.Target.start(Target.java:377)
    at com.webaction.runtime.components.Flow.start(Flow.java:431)
    at com.webaction.runtime.components.Flow.start(Flow.java:375)
    at com.webaction.runtime.components.Flow.start(Flow.java:335)
    at com.webaction.runtime.components.Flow$2.run(Flow.java:1612)
    at java.lang.Thread.run(Thread.java:750)

com.striim.proc.exception.KuduWriterException: End of file: unable to send message: Other end of pipe was closed (error 0)
    at com.striim.proc.CheckPoint.CheckpointTableImpl.checkForCheckpointTable(CheckpointTableImpl.java:133)
    at com.striim.proc.CheckPoint.CheckpointTableImpl.<init>(CheckpointTableImpl.java:44)
    at com.striim.proc.KuduWriter.initWriter(KuduWriter.java:152)
    at com.webaction.utils.writers.common.RetriableBaseDataStoreWriter.init(RetriableBaseDataStoreWriter.java:78)
    at com.webaction.runtime.components.Target.start(Target.java:377)
    at com.webaction.runtime.components.Flow.start(Flow.java:431)
    at com.webaction.runtime.components.Flow.start(Flow.java:375)
    at com.webaction.runtime.components.Flow.start(Flow.java:335)
    at com.webaction.runtime.components.Flow$2.run(Flow.java:1612)

Thanks, Roshan
10-20-2021 07:21 AM
Hello Team, we are seeing a performance issue when sending data to Kafka using JSON as the format. Performance is very slow; it is much faster when using the DSV parser. How can we fine-tune the JSON formatter? Regards, Roshan
Labels:
- Apache Kafka
- Apache Zookeeper
09-14-2021 03:38 AM
Dear Team, can you please advise why VARCHAR is not accepted when replicating from Oracle to Kudu, even though I can create a table with VARCHAR on Kudu? Error from our CDC tool:

Error while writing batch on table impala::cbs.cb_ad_x Reason account_type_v isn't [Type: string], it's varchar. Cause: account_type_v isn't [Type: string], it's varchar

CREATE TABLE cbs.cb_address_xm (
  addressid varchar(65535),
  account_link_code_n BIGINT,
  account_type_v varchar(65535),
  address_type_n INT,
  account_type varchar(65535),
  address_format varchar(65535),
  address_type INT,
  building varchar(65535),
  city varchar(65535),
  city_desc varchar(65535),
  country varchar(65535),
  country_desc varchar(65535),
  district varchar(65535),
  district_desc varchar(65535),
  floor varchar(65535),
  landmark varchar(65535),
  postal_code varchar(65535),
  po_code varchar(65535),
  street varchar(65535),
  street_desc varchar(65535),
  sub_locality_code varchar(65535),
  sub_locality_desc varchar(65535),
  op_insert_date TIMESTAMP,
  op_update_date TIMESTAMP,
  state_desc varchar(65535),
  PRIMARY KEY (addressid)
)
PARTITION BY HASH (addressid) PARTITIONS 16
STORED AS KUDU;

Regards, Roshan
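On the error above: Kudu's VARCHAR type is a relatively recent addition, and a CDC tool built against an older Kudu client library only knows STRING, so it rejects columns the server reports as VARCHAR (which is what "account_type_v isn't [Type: string], it's varchar" suggests). A hedged workaround sketch, not a verified fix: declare the text columns as STRING on the Kudu side so the writer sees the type it expects.

```sql
-- Sketch: STRING instead of VARCHAR(n). Oracle's length enforcement is lost;
-- re-apply it downstream if it matters. Only a few columns are shown; the
-- remaining text columns from the original DDL would change the same way.
CREATE TABLE cbs.cb_address_xm (
  addressid STRING,
  account_link_code_n BIGINT,
  account_type_v STRING,
  -- ... remaining varchar(65535) columns likewise as STRING ...
  op_insert_date TIMESTAMP,
  op_update_date TIMESTAMP,
  PRIMARY KEY (addressid)
)
PARTITION BY HASH (addressid) PARTITIONS 16
STORED AS KUDU;
```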
Labels:
- Apache Impala
- Apache Kudu
08-12-2021 10:45 PM
Hello Team, I have the following file to load on Kudu. 1829;BN=0;UNIT=VOLUME_ALL;IN=0;TC=0;TCC=0;CT=;FU=1000001;CU=54274;FB=61701;FL=ugw9811_3828500385_360_27153 0=5126742111750858;U=23059268534;SI=6;SG=1;SR=7;SN=BROWSING;SC=BROWSING;BS=60342256;BR=2581143;TU=2021-04-27 14:02:47;TF=2021-04-27 00:00:00;TA=2021-04-27 14:02:47;TB=2021-04-27 00:00:00;TE=2021-04-27 14:02:47;TS=1619517767;D=16292;R=151;E=0;UDR_cu=0;UDR_fb=BROWSING;DCM=0;UP=Prepaid;ST=BROWSING;MSISDN=23059268534;APN=orange;SGSN=196.192.13.113;GGSN=196.192.13.113;IMSI=617010014925066;BU1=23292;BN=0;UNIT=VOLUME_ALL;IN=0;TC=0;TCC=62923399;CT=;FU=1000000;CU=3586;FB=61701;FL=ugw9811_3828490275_312_8799 0=5126752111750858;U=23059268534;SI=6;SG=1;SR=7;SN=BROWSING;SC=BROWSING;BS=0;BR=0;TU=2021-04-27 14:02:47;TF=2021-04-27 00:00:00;TA=2021-04-27 14:02:47;TB=2021-04-27 00:00:00;TE=2021-04-27 14:02:47;TS=1619517767;D=16292;R=151;E=0;UDR_cu=0;UDR_fb=BROWSING;DCM=0;UP=Prepaid;ST=BROWSING;MSISDN=23059268534;APN=orange;SGSN=196.192.13.113;GGSN=196.192.13.113;IMSI=617010014925066;BU1=21829;BN=0;UNIT=VOLUME_ALL;IN=0;TC=0;TCC=0;CT=;FU=1000001;CU=3586;FB=61701;FL=ugw9811_3828490275_312_8799 How can I proceed using Spark SQL? 
Table structure on Kudu: CREATE EXTERNAL TABLE cdr.mobile_datadbs ( id BIGINT NOT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, msisdn STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, serviceid INT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, servicegroup INT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, servicerev INT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, servicename STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, serviceclass STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, downlink INT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, uplink INT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, storedtime TIMESTAMP NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, firstaccesstime TIMESTAMP NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, lastaccesstime TIMESTAMP NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, servicebegintime TIMESTAMP NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, sessionendtime TIMESTAMP NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, cdrcreatedtime BIGINT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, duration BIGINT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, hitsperreq BIGINT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, errors1 BIGINT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, udrcu INT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, udrfb STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, status1 STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, userprofile STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, servicetype STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, subsmsisdn STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, apn STRING NULL ENCODING AUTO_ENCODING 
COMPRESSION DEFAULT_COMPRESSION, sgsnaddress STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, ggsnaddress STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, imsi STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, bonusunit STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, bn INT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, unit STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, instatus INT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, totalcost BIGINT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, totalcharge BIGINT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, terminationcause BIGINT NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, firstrequestedurl STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, cellidinfo STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, idpname STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, failureslist STRING NULL ENCODING AUTO_ENCODING COMPRESSION DEFAULT_COMPRESSION, PRIMARY KEY (id) ) PARTITION BY HASH (id) PARTITIONS 16 STORED AS KUDU TBLPROPERTIES ('external.table.purge'='TRUE', 'kudu.master_addresses'='rb-hadoop-03.mtg.local,rb-hadoop-04.mtg.local,rb-hadoop-05.mtg.local') Thanks, Roshan
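The records above are semicolon-delimited key=value fields, so before any SQL they need to be parsed into columns. A minimal, stdlib-only parsing sketch; in Spark the same function could be applied per line (e.g. `spark.sparkContext.textFile(path).map(parse_record)` and then `spark.createDataFrame(...)` before writing to Kudu). The mapping of field names to the Kudu columns (MSISDN -> msisdn, SI -> serviceid, and so on) is an assumption that must be confirmed against the CDR documentation.

```python
def parse_record(line: str) -> dict:
    """Split a 'K=V;K=V;...' record into a dict; empty values become None."""
    fields = {}
    for pair in line.strip().split(";"):
        if not pair:
            continue
        key, sep, value = pair.partition("=")
        if not sep:  # token without '=', keep it with no value
            fields[key] = None
            continue
        fields[key] = value if value != "" else None
    return fields

# Example on a fragment of one record from the file above:
sample = "U=23059268534;SI=6;SG=1;SN=BROWSING;BS=60342256;BR=2581143;CT=;D=16292"
rec = parse_record(sample)
print(rec["U"], rec["SI"], rec["CT"])  # 23059268534 6 None
```

Timestamps (TU, TF, ...) and numeric fields would then be cast explicitly (e.g. with Spark SQL `CAST`/`to_timestamp`) to match the Kudu column types.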
Labels:
- Apache Impala
- Apache Kudu
- Apache Spark
08-12-2021 09:48 PM
Hi, yes, I would like to re-use your Oracle functions and use Zeppelin as the notebook. We are using Impala as the SQL engine. How do I translate that SQL function to the Oracle dialect? Regards, Roshan
08-10-2021 05:16 AM
Hello Team, can you please advise why Kudu returns 0 rows? The row count of the base table is the same on both sides.

Oracle - returns 21000 rows:
select BankOrgProfile.* from AR.HZ_ORGANIZATION_PROFILES BankOrgProfile where SYSDATE between TRUNC(BankOrgProfile.effective_start_date) and NVL(TRUNC(BankOrgProfile.effective_end_date), SYSDATE+1) ;

Kudu (Impala) - returns 0 rows:
select BankOrgProfile.* from oracle_financial.HZ_ORGANIZATION_PROFILES BankOrgProfile where current_date() between cast(BankOrgProfile.effective_end_date as date) and nvl(cast(BankOrgProfile.effective_end_date as date), adddate(current_date(),1)) ;

Thanks, Roshan
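The likely culprit in the Impala query above: it uses effective_end_date in BOTH bounds of the BETWEEN, so a row matches only when today falls between its end date and its end date, which is almost never true. The Oracle original uses effective_start_date as the lower bound. A hedged sketch mirroring the Oracle predicate (untested against the actual schema):

```sql
-- Lower bound corrected to effective_start_date, matching the Oracle query.
SELECT BankOrgProfile.*
FROM oracle_financial.HZ_ORGANIZATION_PROFILES BankOrgProfile
WHERE current_date()
      BETWEEN CAST(BankOrgProfile.effective_start_date AS DATE)
          AND NVL(CAST(BankOrgProfile.effective_end_date AS DATE),
                  ADDDATE(current_date(), 1));
```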
Labels:
- Apache Impala
- Apache Kudu