Member since: 10-01-2016
Posts: 156
Kudos Received: 8
Solutions: 6
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 7962 | 04-04-2019 09:41 PM |
| | 3100 | 06-04-2018 08:34 AM |
| | 1437 | 05-23-2018 01:03 PM |
| | 2937 | 05-21-2018 07:12 AM |
| | 1798 | 05-08-2018 10:48 AM |
05-27-2019 04:30 PM
I checked my Tez directory:

[root@node2 ~]# ls -l /usr/hdp/3.1.0.0-78/tez/
total 0
drwxr-xr-x 2 tez hadoop 44 May 27 00:57 conf
lrwxrwxrwx 1 root root 13 May 27 01:12 conf;5ceb0f32 -> /etc/tez/conf
lrwxrwxrwx 1 root root 13 May 27 01:34 conf;5ceb1477 -> /etc/tez/conf
lrwxrwxrwx 1 root root 13 May 27 02:09 conf;5ceb1c24 -> /etc/tez/conf
lrwxrwxrwx 1 root root 13 May 27 03:58 conf;5ceb3650 -> /etc/tez/conf
lrwxrwxrwx 1 root root 13 May 27 06:50 conf;5ceb5dd1 -> /etc/tez/conf
lrwxrwxrwx 1 root root 13 May 27 18:55 conf;5cec0874 -> /etc/tez/conf
lrwxrwxrwx 1 root root 13 May 27 19:13 conf;5cec0c93 -> /etc/tez/conf
lrwxrwxrwx 1 root root 13 May 27 19:17 conf;5cec0d9c -> /etc/tez/conf
lrwxrwxrwx 1 root root 13 May 27 19:21 conf;5cec0e81 -> /etc/tez/conf

I think every unsuccessful install attempt added a new symlink. I removed everything:

[root@node2 ~]# rm -rf /usr/hdp/3.1.0.0-78/tez/*

Then, using Ambari, I reinstalled the Tez client on this node only. It worked.
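For anyone who wants a gentler cleanup, deleting only the leftover conf;<hex> symlinks should also work. A minimal sketch, assuming the same HDP 3.1.0.0-78 path as above (I did not run it in this exact form):

# Remove only the dangling "conf;<hex>" symlinks left behind by failed
# install attempts, keeping the real conf directory in place.
for link in /usr/hdp/3.1.0.0-78/tez/conf\;*; do
  [ -L "$link" ] && rm -f "$link"
done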
05-25-2019 06:17 PM
Thank you @Stylianos Sideridis. The issue has settled down. Next time I'll keep it in mind. Thanks again.
05-09-2019 01:36 AM
Thank you very much @Geoffrey Shelton Okot. The job worked. One last thing: I can't see any table in the Hive database azhadoop. My query:

sqoop import --connect jdbc:mysql://sandbox-hdp.hortonworks.com/azhadoop --driver com.mysql.jdbc.Driver --username root --password hadoop --query "select * from iris_mysql WHERE \$CONDITIONS" --m 1 --hive-import --hive-table azhadoop.iris_hive --target-dir /tmp/hive_temp

The result of the MR job:

19/05/08 21:33:10 INFO mapreduce.Job: Counters: 30
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=172694
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=87
HDFS: Number of bytes written=4574
HDFS: Number of read operations=4
HDFS: Number of large read operations=0
HDFS: Number of write operations=2
Job Counters
Launched map tasks=1
Other local map tasks=1
Total time spent by all maps in occupied slots (ms)=26964
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=3852
Total vcore-milliseconds taken by all map tasks=3852
Total megabyte-milliseconds taken by all map tasks=5916672
Map-Reduce Framework
Map input records=151
Map output records=151
Input split bytes=87
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=135
CPU time spent (ms)=1310
Physical memory (bytes) snapshot=241512448
Virtual memory (bytes) snapshot=3256225792
Total committed heap usage (bytes)=152567808
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=4574
19/05/08 21:33:10 INFO mapreduce.ImportJobBase: Transferred 4.4668 KB in 26.0204 seconds (175.7852 bytes/sec)
19/05/08 21:33:10 INFO mapreduce.ImportJobBase: Retrieved 151 records.
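To see whether the import actually created the table, a quick check is a SHOW TABLES against the target database. A minimal sketch, assuming HiveServer2 listens on the sandbox's default port 10000 (the port is my assumption, not confirmed in this thread):

# List the tables in the azhadoop database to confirm iris_hive exists.
beeline -u jdbc:hive2://sandbox-hdp.hortonworks.com:10000 -e "SHOW TABLES IN azhadoop;"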
05-07-2019 09:22 PM
Hi @Geoffrey Shelton Okot, thanks again. Interestingly, adding --driver made the ERROR disappear. But another problem showed up:

[root@sandbox-hdp ~]# sqoop import --connect jdbc:mysql://sandbox-hdp.hortonworks.com/azhadoop --driver com.mysql.jdbc.Driver --username root --password hadoop --query "select * from iris_mysql WHERE \$CONDITIONS" --m 1 --hive-import --hive-table azhadoop.iris_hive --target-dir /tmp/hive_temp
Warning: /usr/hdp/2.6.4.0-91/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
19/05/07 21:04:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.4.0-91
19/05/07 21:04:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/05/07 21:04:19 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
19/05/07 21:04:19 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
19/05/07 21:04:20 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
19/05/07 21:04:20 INFO manager.SqlManager: Using default fetchSize of 1000
19/05/07 21:04:20 INFO tool.CodeGenTool: Beginning code generation
19/05/07 21:04:20 INFO manager.SqlManager: Executing SQL statement: select * from iris_mysql WHERE (1 = 0)
19/05/07 21:04:20 INFO manager.SqlManager: Executing SQL statement: select * from iris_mysql WHERE (1 = 0)
19/05/07 21:04:20 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.6.4.0-91/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/3e81cb85d0e8a571138759f1babfc886/QueryResult.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
19/05/07 21:04:22 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/3e81cb85d0e8a571138759f1babfc886/QueryResult.jar
19/05/07 21:04:22 INFO mapreduce.ImportJobBase: Beginning query import.
19/05/07 21:04:23 INFO client.RMProxy: Connecting to ResourceManager at sandbox-hdp.hortonworks.com/172.17.0.2:8032
19/05/07 21:04:23 INFO client.AHSProxy: Connecting to Application History server at sandbox-hdp.hortonworks.com/172.17.0.2:10200
19/05/07 21:04:26 INFO db.DBInputFormat: Using read commited transaction isolation
19/05/07 21:04:26 INFO mapreduce.JobSubmitter: number of splits:1
19/05/07 21:04:27 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1557245169101_0001
19/05/07 21:04:27 INFO impl.YarnClientImpl: Submitted application application_1557245169101_0001
19/05/07 21:04:27 INFO mapreduce.Job: The url to track the job: http://sandbox-hdp.hortonworks.com:8088/proxy/application_1557245169101_0001/
19/05/07 21:04:27 INFO mapreduce.Job: Running job: job_1557245169101_0001
19/05/07 21:04:40 INFO mapreduce.Job: Job job_1557245169101_0001 running in uber mode : false
19/05/07 21:04:40 INFO mapreduce.Job: map 0% reduce 0%

It doesn't move; it's stuck at the MapReduce job. No progress: the job holds 2550 MB of memory from YARN and its status stays RUNNING. No error, but no progress. How can anyone import a query from MySQL to Hive in a sandbox?
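A map task stuck at 0% on a sandbox is very often YARN running out of free container memory rather than a Sqoop problem. A diagnostic sketch with the standard YARN CLI (the application ID is the one from the log above; the commands themselves are not from this thread):

# See which running applications are holding the cluster's memory.
yarn application -list -appStates RUNNING
# Show the state and diagnostics of the stuck import job itself.
yarn application -status application_1557245169101_0001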
04-29-2019 08:27 AM
Hi @Geoffrey Shelton Okot, thanks for your answer. But are you sure this is a driver problem? I think the driver works fine, because I am able to import from MySQL to HDFS.
04-27-2019 02:34 PM
In HDP Sandbox 2.6.4 I can import from MySQL to HDFS, but when I try to import from MySQL to Hive with the following command:

[maria_dev@sandbox-hdp ~]$ sqoop import --connect jdbc:mysql://sandbox-hdp.hortonworks.com/azhadoop --username root --password hadoop --query 'select * from iris_mysql WHERE $CONDITIONS' --m 1 --hive-import --hive-table azhadoop.iris_hive --target-dir /tmp/hive_temp

I get this error:

19/04/27 14:22:19 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@4b8ee4de is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@4b8ee4de is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
Labels:
- Apache Hive
- Apache Sqoop
04-04-2019 09:41 PM
1 Kudo
In this post, Hortonworks says to change the port number if you come across this kind of conflict: for example, if port 6000 is already bound, change it to 6001 in the 'HDP_2.6.5_deploy-scripts\sandbox\proxy\proxy-deploy.sh' file. I changed 50111 to 50112, 50113, etc. and retried, but nothing helped. I had to comment out the following ports:

#-p 50111:50111 \
#-p 50095:50095 \
#-p 50079:50079 \
#-p 50075:50075 \
#-p 50070:50070 \

Only then could I start sandbox-proxy.
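Before commenting ports out, it can help to see what is actually holding 50111. A hedged sketch for Windows 10, using built-in commands (not from the original post; on Windows the bind can fail even with no process on the port if Hyper-V/WinNAT has reserved the range):

:: Show which process, if any, is bound to port 50111.
netstat -ano | findstr :50111
:: List the TCP port ranges reserved by Hyper-V/WinNAT; a port inside an
:: excluded range fails with "Permission denied" even when unused.
netsh interface ipv4 show excludedportrange protocol=tcp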
04-04-2019 12:33 PM
I use Windows 10 Pro. With Docker, I try to start HDP Sandbox 2.6.5. "docker container start sandbox-hdp" works fine, but when I try to start the proxy with "docker container start sandbox-proxy" I get the following error:

Error response from daemon: driver failed programming external connectivity on endpoint sandbox-proxy (26aa3064fa2cf790e2f529ed1c865944013893b8f14259ef5109407b5b95e08c): Error starting userland proxy: Bind for 0.0.0.0:50111: unexpected error Permission denied
Error: failed to start containers: sandbox-proxy