I am trying to import data from a PostgreSQL RDBMS and create the corresponding table in Hive. The data gets loaded into HDFS, but the Hive table is not created. I get the errors below when I run the following command:
sqoop import \
  --connect jdbc:postgresql://<host>/<databasename> \
  --username <user_name> -P \
  --table poc \
  --hive-import \
  --create-hive-table \
  --hive-table hive.poc \
  --delete-target-dir \
  -- --schema live
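(For reference, one way to confirm the table really is missing is to query HiveServer2 directly, e.g. with beeline; the host and port here are placeholders, and the database/table names are the ones from the command above:)

beeline -u "jdbc:hive2://<hiveserver2_host>:10000" -e "SHOW TABLES IN hive LIKE 'poc';"

An empty result here, together with the imported files sitting in HDFS, matches the behaviour described above.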
This is the output I get:
18/09/18 15:18:08 INFO mapreduce.Job: Job job_1536855412397_0032 completed successfully
18/09/18 15:18:08 INFO mapreduce.Job: Counters: 30
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=736388
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=470
        HDFS: Number of bytes written=546122
        HDFS: Number of read operations=16
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=8
    Job Counters
        Launched map tasks=4
        Other local map tasks=4
        Total time spent by all maps in occupied slots (ms)=33191
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=33191
        Total vcore-milliseconds taken by all map tasks=33191
        Total megabyte-milliseconds taken by all map tasks=135950336
    Map-Reduce Framework
        Map input records=4264
        Map output records=4264
        Input split bytes=470
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=271
        CPU time spent (ms)=7380
        Physical memory (bytes) snapshot=1624289280
        Virtual memory (bytes) snapshot=22467031040
        Total committed heap usage (bytes)=2267021312
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=546122
18/09/18 15:18:08 INFO mapreduce.ImportJobBase: Transferred 533.3223 KB in 28.2538 seconds (18.8761 KB/sec)
18/09/18 15:18:08 INFO mapreduce.ImportJobBase: Retrieved 4264 records.
18/09/18 15:18:08 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners
18/09/18 15:18:08 INFO atlas.ApplicationProperties: Looking for atlas-application.properties in classpath
18/09/18 15:18:08 INFO atlas.ApplicationProperties: Loading atlas-application.properties from file:/etc/sqoop/2.6.5.0-292/0/atlas-application.properties
18/09/18 15:18:08 INFO kafka.KafkaNotification: ==> KafkaNotification()
18/09/18 15:18:08 INFO kafka.KafkaNotification: <== KafkaNotification()
18/09/18 15:18:08 INFO hook.AtlasHook: Created Atlas Hook
18/09/18 15:18:08 INFO kafka.KafkaNotification: ==> KafkaNotification.createProducer()
18/09/18 15:18:08 INFO producer.ProducerConfig: ProducerConfig values:
    acks = 1
    batch.size = 16384
    bootstrap.servers = [host:6667]
    buffer.memory = 33554432
    client.id =
    compression.type = none
    connections.max.idle.ms = 540000
    enable.idempotence = false
    interceptor.classes = null
    key.serializer = class org.apache.kafka.common.serialization.StringSerializer
    linger.ms = 0
    max.block.ms = 60000
    max.in.flight.requests.per.connection = 5
    max.request.size = 1048576
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retries = 0
    retry.backoff.ms = 100
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = kafka
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXTSASL
    send.buffer.bytes = 131072
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    transaction.timeout.ms = 60000
    transactional.id = null
    value.serializer = class org.apache.kafka.common.serialization.StringSerializer
18/09/18 15:18:08 INFO authenticator.AbstractLogin: Successfully logged in.
18/09/18 15:18:08 INFO kerberos.KerberosLogin: [Principal=null]: TGT refresh thread started.
18/09/18 15:18:08 INFO kerberos.KerberosLogin: [Principal=null]: TGT valid starting at: Tue Sep 18 14:58:46 CEST 2018
18/09/18 15:18:08 INFO kerberos.KerberosLogin: [Principal=null]: TGT expires: Wed Sep 19 00:58:46 CEST 2018
18/09/18 15:18:08 INFO kerberos.KerberosLogin: [Principal=null]: TGT refresh sleeping until: Tue Sep 18 23:12:21 CEST 2018
18/09/18 15:18:08 WARN producer.ProducerConfig: The configuration 'key.deserializer' was supplied but isn't a known config.
18/09/18 15:18:08 WARN producer.ProducerConfig: The configuration 'value.deserializer' was supplied but isn't a known config.
18/09/18 15:18:08 WARN producer.ProducerConfig: The configuration 'hook.group.id' was supplied but isn't a known config.
18/09/18 15:18:08 WARN producer.ProducerConfig: The configuration 'zookeeper.connection.timeout.ms' was supplied but isn't a known config.
18/09/18 15:18:08 WARN producer.ProducerConfig: The configuration 'zookeeper.session.timeout.ms' was supplied but isn't a known config.
18/09/18 15:18:08 WARN producer.ProducerConfig: The configuration 'enable.auto.commit' was supplied but isn't a known config.
18/09/18 15:18:08 WARN producer.ProducerConfig: The configuration 'zookeeper.connect' was supplied but isn't a known config.
18/09/18 15:18:08 WARN producer.ProducerConfig: The configuration 'zookeeper.sync.time.ms' was supplied but isn't a known config.
18/09/18 15:18:08 WARN producer.ProducerConfig: The configuration 'session.timeout.ms' was supplied but isn't a known config.
18/09/18 15:18:08 WARN producer.ProducerConfig: The configuration 'auto.offset.reset' was supplied but isn't a known config.
18/09/18 15:18:08 INFO utils.AppInfoParser: Kafka version : 1.0.0.2.6.5.0-292
18/09/18 15:18:08 INFO utils.AppInfoParser: Kafka commitId : 2ff1ddae17fb8503
18/09/18 15:18:08 INFO kafka.KafkaNotification: <== KafkaNotification.createProducer()
18/09/18 15:19:08 ERROR hook.AtlasHook: Failed to send notification - attempt #1; error=java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.
18/09/18 15:20:09 ERROR hook.AtlasHook: Failed to send notification - attempt #2; error=java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.
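(For context: "Failed to update metadata after 60000 ms" means the Atlas hook's Kafka producer never obtained topic metadata from the broker at host:6667, which on a Kerberized cluster is commonly either a connectivity problem or missing authorization on the ATLAS_HOOK topic. One way to check that the topic exists, with a placeholder ZooKeeper address and the standard HDP script path assumed, is:)

/usr/hdp/current/kafka-broker/bin/kafka-topics.sh --describe --zookeeper <zk_host>:2181 --topic ATLAS_HOOK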
However, as recommended in an older post, I have already granted the required permissions via kafka-acls.sh:
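(For illustration, a sketch of what such a grant typically looks like; the ZooKeeper address and the principal are placeholders, and ATLAS_HOOK is the topic the Sqoop Atlas hook publishes to:)

/usr/hdp/current/kafka-broker/bin/kafka-acls.sh \
  --authorizer-properties zookeeper.connect=<zk_host>:2181 \
  --add --allow-principal User:<sqoop_user> \
  --producer --topic ATLAS_HOOK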