Support Questions


Sqoop kafka atlas JAAS configuration error

Expert Contributor

I am trying to import the result of a SQL query from an RDBMS into a Hive table. I have used this query hundreds of times, but today I got a strange error related to Kafka and Atlas.

18/07/06 17:17:49 ERROR security.InMemoryJAASConfiguration: Unable to add JAAS configuration for client [KafkaClient] as it is missing param [atlas.jaas.KafkaClient.loginModuleName]. Skipping JAAS config for [KafkaClient]


org.apache.atlas.notification.NotificationException: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Expiring 1 record(s) for ATLAS_HOOK-0 due to 30078 ms has passed since batch creation plus linger time
        at org.apache.atlas.kafka.KafkaNotification.sendInternalToProducer(KafkaNotification.java:239)
        at org.apache.atlas.kafka.KafkaNotification.sendInternal(KafkaNotification.java:212)
        at org.apache.atlas.notification.AbstractNotification.send(AbstractNotification.java:114)
        at org.apache.atlas.hook.AtlasHook.notifyEntitiesInternal(AtlasHook.java:143)
        at org.apache.atlas.hook.AtlasHook.notifyEntities(AtlasHook.java:128)
        at org.apache.atlas.sqoop.hook.SqoopHook.publish(SqoopHook.java:190)
        at org.apache.atlas.sqoop.hook.SqoopHook.publish(SqoopHook.java:51)
        at org.apache.sqoop.mapreduce.PublishJobData.publishJobData(PublishJobData.java:52)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:284)
        at org.apache.sqoop.manager.SqlManager.importQuery(SqlManager.java:748)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:509)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
Caused by: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Expiring 1 record(s) for ATLAS_HOOK-0 due to 30078 ms has passed since batch creation plus linger time
        at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.valueOrError(FutureRecordMetadata.java:65)
        at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:52)
        at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:25)
        at org.apache.atlas.kafka.KafkaNotification.sendInternalToProducer(KafkaNotification.java:230)
        ... 17 more


Despite this error, the Sqoop import sometimes works and sometimes doesn't. When it fails, it creates the Hive table but leaves it empty.
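For what it's worth, the JAAS message is only a warning; the actual failure is the Atlas Sqoop hook timing out while publishing to the ATLAS_HOOK Kafka topic, which usually points at unreachable or wrongly configured Kafka brokers on the client side. A minimal sketch of the client-side Atlas properties to check (hostnames and ports below are placeholders, not values from this thread):

```properties
# atlas-application.properties on the Sqoop client host (hypothetical values).
# The TimeoutException for ATLAS_HOOK-0 typically means the hook cannot
# reach these brokers, so the batch expires after linger time.
atlas.kafka.bootstrap.servers=broker1.example.com:6667
atlas.kafka.zookeeper.connect=zk1.example.com:2181

# With Kerberos disabled, the atlas.jaas.KafkaClient.* properties are not
# required; the "missing param" message is benign and the config is skipped.
```

If Atlas lineage for Sqoop jobs is not needed at all, another common workaround is to remove the Atlas hook registration (`sqoop.job.data.publish.class`) from sqoop-site.xml, so the import no longer blocks on Kafka.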


Expert Contributor

Hi @Sandeep Nemuri, thank you. The first configs.py command worked, but the second one gave the following:

Traceback (most recent call last):
  File "/var/lib/ambari-server/resources/scripts/configs.py", line 364, in <module>
    sys.exit(main())
  File "/var/lib/ambari-server/resources/scripts/configs.py", line 357, in main
    return delete_properties(cluster, config_type, action_args, accessor)
  File "/var/lib/ambari-server/resources/scripts/configs.py", line 248, in delete_properties
    update_config(cluster, config_type, delete_specific_property(config_name), accessor)
  File "/var/lib/ambari-server/resources/scripts/configs.py", line 131, in update_config
    properties, attributes = config_updater(cluster, config_type, accessor)
  File "/var/lib/ambari-server/resources/scripts/configs.py", line 195, in update
    properties, attributes = get_current_config(cluster, config_type, accessor)
  File "/var/lib/ambari-server/resources/scripts/configs.py", line 123, in get_current_config
    config_tag = get_config_tag(cluster, config_type, accessor)
  File "/var/lib/ambari-server/resources/scripts/configs.py", line 94, in get_config_tag
    response = accessor(DESIRED_CONFIGS_URL.format(cluster))
  File "/var/lib/ambari-server/resources/scripts/configs.py", line 89, in do_request
    raise Exception('Problem with accessing api. Reason: {0}'.format(exc))
Exception: Problem with accessing api. Reason: HTTP Error 404: Not Found
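An HTTP 404 from configs.py usually means the cluster name or config type in the request URL does not exist on the Ambari server. A hedged way to check both before re-running the script (host, port, and credentials below are placeholders):

```shell
# Placeholders: adjust the Ambari host, port, and admin credentials.
AMBARI=http://ambari.example.com:8080

# List the cluster names Ambari knows about; the cluster-name argument
# passed to configs.py must match one of these exactly (case-sensitive).
curl -s -u admin:admin "$AMBARI/api/v1/clusters"

# List the desired config types for a given cluster; the config type
# passed to configs.py must appear here, otherwise the script fails
# with "HTTP Error 404: Not Found" as in the traceback above.
curl -s -u admin:admin "$AMBARI/api/v1/clusters/MYCLUSTER?fields=Clusters/desired_configs"
```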

Expert Contributor

I tried Sqoop again and got the following errors, but the import was successful:

 ERROR security.InMemoryJAASConfiguration: Unable to add JAAS configuration for client [KafkaClient] as it is missing param [atlas.jaas.KafkaClient.loginModuleName]. Skipping JAAS config for [KafkaClient]

and

org.apache.atlas.notification.NotificationException: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Expiring 1 record(s) for ATLAS_HOOK-0 due to 30075 ms has passed since batch creation plus linger time
        at org.apache.atlas.kafka.KafkaNotification.sendInternalToProducer(KafkaNotification.java:239)
        at org.apache.atlas.kafka.KafkaNotification.sendInternal(KafkaNotification.java:212)
        at org.apache.atlas.notification.AbstractNotification.send(AbstractNotification.java:114)
        at org.apache.atlas.hook.AtlasHook.notifyEntitiesInternal(AtlasHook.java:143)
        at org.apache.atlas.hook.AtlasHook.notifyEntities(AtlasHook.java:128)
        at org.apache.atlas.sqoop.hook.SqoopHook.publish(SqoopHook.java:190)
        at org.apache.atlas.sqoop.hook.SqoopHook.publish(SqoopHook.java:51)
        at org.apache.sqoop.mapreduce.PublishJobData.publishJobData(PublishJobData.java:52)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:284)
        at org.apache.sqoop.manager.SqlManager.importQuery(SqlManager.java:748)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:509)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
Caused by: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Expiring 1 record(s) for ATLAS_HOOK-0 due to 30075 ms has passed since batch creation plus linger time
        at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.valueOrError(FutureRecordMetadata.java:65)
        at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:52)
        at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:25)
        at org.apache.atlas.kafka.KafkaNotification.sendInternalToProducer(KafkaNotification.java:230)
        ... 17 more
Caused by: org.apache.kafka.common.errors.TimeoutException: Expiring 1 record(s) for ATLAS_HOOK-0 due to 30075 ms has passed since batch creation plus linger time

Expert Contributor

Hi @svaddi: HDP 2.6.3, Sqoop import, source RDBMS is Sybase, destination is a Hive table, Kerberos is not enabled. Query:

sqoop import --connect jdbc:sybase:Tds:XX.XX.XX.XX:2640 --driver com.sybase.jdbc4.jdbc.SybDriver --username XXX --password XXX --target-dir /dw/iq_aktar/XXX --query "select * FROM table and \$CONDITIONS " --m 1 --hive-import --hive-overwrite --hive-table XX.XXX --hive-drop-import-delims
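One thing worth noting about the command above: Sqoop's free-form query import requires the literal `$CONDITIONS` token inside a `WHERE` clause, and the query here uses `and \$CONDITIONS` without `WHERE`. If the posted command is verbatim (and not just anonymized), a corrected sketch would look like this (connection details, paths, and table names are placeholders):

```shell
# Placeholders throughout; the WHERE $CONDITIONS clause is the point.
# Single quotes around --query avoid having to escape $CONDITIONS.
sqoop import \
  --connect jdbc:sybase:Tds:db.example.com:2640 \
  --driver com.sybase.jdbc4.jdbc.SybDriver \
  --username USER --password PASS \
  --target-dir /dw/iq_aktar/example \
  --query 'SELECT * FROM example_table WHERE $CONDITIONS' \
  -m 1 \
  --hive-import --hive-overwrite \
  --hive-table example_db.example_table \
  --hive-drop-import-delims
```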