Member since: 04-11-2018
Posts: 6
Kudos Received: 0
Solutions: 0
02-17-2017
12:45 AM
@Jay SenSharma thank you very much! The problem was the path.
02-17-2017
12:28 AM
Hey guys, I'm trying to run a simple script through the VM, but I'm getting the following error. Does anyone know how to fix it? Thanks.

Pig commands:

grunt> load_data = load 'user/root/pig_demo.txt';
grunt> dump load_data
Pig Log errors:

Pig Stack Trace
---------------
ERROR 1066: Unable to open iterator for alias load_data. Backend error : java.lang.IllegalStateException: Job in state DEFINE instead of RUNNING
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias load_data. Backend error : java.lang.IllegalStateException: Job in state DEFINE instead of RUNNING
at org.apache.pig.PigServer.openIterator(PigServer.java:1009)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:747)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:376)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:231)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:206)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
at org.apache.pig.Main.run(Main.java:566)
at org.apache.pig.Main.main(Main.java:178)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 0: java.lang.IllegalStateException: Job in state DEFINE instead of RUNNING
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.getStats(MapReduceLauncher.java:822)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:452)
at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.launchPig(HExecutionEngine.java:308)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1474)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1459)
at org.apache.pig.PigServer.storeEx(PigServer.java:1118)
at org.apache.pig.PigServer.store(PigServer.java:1081)
at org.apache.pig.PigServer.openIterator(PigServer.java:994)
... 13 more
Caused by: java.lang.IllegalStateException: Job in state DEFINE instead of RUNNING
at org.apache.hadoop.mapreduce.Job.ensureState(Job.java:292)
at org.apache.hadoop.mapreduce.Job.getTaskReports(Job.java:534)
at org.apache.pig.backend.hadoop.executionengine.shims.HadoopShims.getTaskReports(HadoopShims.java:235)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.getStats(MapReduceLauncher.java:801)
... 20 more
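The "Job in state DEFINE instead of RUNNING" error typically surfaces when the MapReduce job never launches, often because the input path cannot be resolved. Since the accepted answer above points to the path, a likely correction (an assumption: that the file was uploaded to /user/root/pig_demo.txt in HDFS) is to make the LOAD path absolute:

-- Sketch of a likely fix, assuming the file lives at /user/root/pig_demo.txt in HDFS
load_data = LOAD '/user/root/pig_demo.txt';
DUMP load_data;

Before retrying, the file's presence can be confirmed from the shell with `hadoop fs -ls /user/root/pig_demo.txt`; without the leading slash, Pig resolves the path relative to the user's HDFS home directory, which may not be what was intended.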
Labels:
Apache Pig
02-15-2017
09:18 PM
Thank you @Sunile Manjee, it worked after starting the Atlas dependencies (Ambari Infra, HBase, and Kafka). Junior.
02-11-2017
02:27 AM
The command is:

sqoop import \
--connect "jdbc:mysql://sandbox.hortonworks.com:3306/retail_db" \
--username=root \
--password=hadoop \
--table departments \
--hive-home /apps/hive/warehouse \
--hive-import \
--hive-overwrite \
--hive-table sqoop_import.departments \
--outdir java_files

The output is:

Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/02/11 02:15:06 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
17/02/11 02:15:06 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/02/11 02:15:06 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
17/02/11 02:15:06 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
17/02/11 02:15:06 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/02/11 02:15:06 INFO tool.CodeGenTool: Beginning code generation
17/02/11 02:15:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
17/02/11 02:15:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `departments` AS t LIMIT 1
17/02/11 02:15:06 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/eef451e1cc95fb2071ebe74f5d9371e9/departments.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/02/11 02:15:08 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/eef451e1cc95fb2071ebe74f5d9371e9/departments.jar
17/02/11 02:15:08 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/02/11 02:15:08 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/02/11 02:15:08 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/02/11 02:15:08 INFO mapreduce.ImportJobBase: Beginning import of departments
17/02/11 02:15:09 INFO impl.TimelineClientImpl: Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
17/02/11 02:15:09 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8050
17/02/11 02:15:09 INFO client.AHSProxy: Connecting to Application History server at sandbox.hortonworks.com/172.17.0.2:10200
17/02/11 02:15:14 INFO db.DBInputFormat: Using read commited transaction isolation
17/02/11 02:15:14 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`department_id`), MAX(`department_id`) FROM `departments`
17/02/11 02:15:14 INFO db.IntegerSplitter: Split size: 1; Num splits: 4 from: 2 to: 7
17/02/11 02:15:14 INFO mapreduce.JobSubmitter: number of splits:4
17/02/11 02:15:14 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1486771879029_0004
17/02/11 02:15:15 INFO impl.YarnClientImpl: Submitted application application_1486771879029_0004
17/02/11 02:15:15 INFO mapreduce.Job: The url to track the job: http://sandbox.hortonworks.com:8088/proxy/application_1486771879029_0004/
17/02/11 02:15:15 INFO mapreduce.Job: Running job: job_1486771879029_0004
17/02/11 02:15:21 INFO mapreduce.Job: Job job_1486771879029_0004 running in uber mode : false
17/02/11 02:15:21 INFO mapreduce.Job: map 0% reduce 0%
17/02/11 02:15:29 INFO mapreduce.Job: map 50% reduce 0%
17/02/11 02:15:30 INFO mapreduce.Job: map 75% reduce 0%
17/02/11 02:15:31 INFO mapreduce.Job: map 100% reduce 0%
17/02/11 02:15:31 INFO mapreduce.Job: Job job_1486771879029_0004 completed successfully
17/02/11 02:15:31 INFO mapreduce.Job: Counters: 30
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=652000
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=481
HDFS: Number of bytes written=60
HDFS: Number of read operations=16
HDFS: Number of large read operations=0
HDFS: Number of write operations=8
Job Counters
Launched map tasks=4
Other local map tasks=4
Total time spent by all maps in occupied slots (ms)=19510
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=19510
Total vcore-milliseconds taken by all map tasks=19510
Total megabyte-milliseconds taken by all map tasks=4877500
Map-Reduce Framework
Map input records=6
Map output records=6
Input split bytes=481
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=930
CPU time spent (ms)=3690
Physical memory (bytes) snapshot=548311040
Virtual memory (bytes) snapshot=7742504960
Total committed heap usage (bytes)=176160768
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=60
17/02/11 02:15:31 INFO mapreduce.ImportJobBase: Transferred 60 bytes in 22.7172 seconds (2.6412 bytes/sec)
17/02/11 02:15:31 INFO mapreduce.ImportJobBase: Retrieved 6 records.
17/02/11 02:15:31 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners
17/02/11 02:15:31 INFO atlas.ApplicationProperties: Looking for atlas-application.properties in classpath
17/02/11 02:15:31 INFO atlas.ApplicationProperties: Loading atlas-application.properties from file:/etc/sqoop/2.5.0.0-1245/0/atlas-application.properties
17/02/11 02:15:32 ERROR security.InMemoryJAASConfiguration: Unable to add JAAS configuration for client [KafkaClient] as it is missing param [atlas.jaas.KafkaClient.loginModuleName]. Skipping JAAS config for [KafkaClient]
17/02/11 02:15:32 INFO hook.AtlasHook: Created Atlas Hook
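The counters above show 6 records transferred, and the Atlas JAAS message is only a skipped-configuration warning, not a failure. To confirm the rows actually landed in the Hive table, a quick check (a sketch; assumes the hive CLI is available on the sandbox, as it is on HDP 2.5) could be:

# Hypothetical verification step: query the table created by --hive-import
hive -e "SELECT * FROM sqoop_import.departments;"

The same check can be done from beeline or the Hive view in Ambari; the row count should match the "Retrieved 6 records" line in the Sqoop output.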
Labels:
Apache Sqoop
02-05-2017
11:50 PM
Thank you Sindhu, I was facing the same problem. Now it works.