Member since: 01-07-2016
Posts: 89
Kudos Received: 20
Solutions: 6
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 6158 | 02-05-2016 02:17 PM |
| | 3339 | 02-05-2016 12:56 AM |
| | 1799 | 01-29-2016 03:24 AM |
| | 756 | 01-20-2016 03:52 PM |
| | 776 | 01-20-2016 08:48 AM |
08-22-2016
09:18 AM
Hi, not really, it's the same issue:
[08S01]: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Current user : adf_admin is not allowed to grant role. User has to belong to ADMIN role and have it as current role, for this action. Otherwise, grantor need to have ADMIN OPTION on role being granted and have it as a current role for this action.
... View more
08-19-2016
03:36 PM
I tried GRANT admin TO USER adf_admin; and got the error below:
[08S01]: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Current user : adf_admin is not allowed to grant role. User has to belong to ADMIN role and have it as current role, for this action. Otherwise, grantor need to have ADMIN OPTION on role being granted and have it as a current role for this action.
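For later readers: the error message says the grantor must have ADMIN as its *current* role, not merely be a member of it. A minimal sketch of the sequence, assuming it is run by a user already listed in hive.users.in.admin.role (user name taken from the thread):

```sql
-- Membership alone is not enough; switch the current role first,
-- as the DDLTask error message asks for.
SET ROLE ADMIN;

-- Then the grant from the thread should be permitted:
GRANT admin TO USER adf_admin;
```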
... View more
08-19-2016
03:34 PM
Hello @Brandon Wilson, I'm wondering how I can do this? I've been googling and can't find anything related to granting admin privileges to the adf_admin user (or any other user). I thought I could do this by putting the user name into the conf variable "hive.users.in.admin.role".
Please let me know. Thank you.
... View more
08-19-2016
10:37 AM
Hi, I'm trying to set up SQL Standard-based Authorization based on https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_dataintegration/content/hive-013-feature-sql-standard-based-grant-revoke.html but apparently it DOESN'T work. These are the recommended values:
-hiveconf hive.metastore.uris=' ' (a space inside single quotation marks)
-hiveconf hive.security.authorization.manager=org.apache.hadoop.hive.ql.security.authorization.MetaStoreAuthzAPIAuthorizerEmbedOnly
My Ambari Hive setup has other values: hive.metastore.uris=thrift://blabla.com:9083 and hive.security.authorization.manager=org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdConfOnlyAuthorizerFactory
Right now, when I try show roles; as the user defined in hive.users.in.admin.role, I get the error below:
[08S01]: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Current user : adf_admin is not allowed to list roles. User has to belong to ADMIN role and have it as current role, for this action.
Thank you
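For reference, a sketch of the HiveServer2-side hive-site.xml fragment for SQL Standard-based authorization (property names from the post; the SQLStdHiveAuthorizerFactory value and the enabled flag are assumptions based on the standard setup, not something the thread confirms, and required restarts/other settings are omitted):

```xml
<!-- Sketch: SQL Standard-based authorization on HiveServer2 -->
<property>
  <name>hive.security.authorization.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.users.in.admin.role</name>
  <value>adf_admin</value>
</property>
<property>
  <!-- ConfOnlyAuthorizerFactory (seen in the post) only validates config;
       the full authorizer is assumed to be this factory instead -->
  <name>hive.security.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory</value>
</property>
```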
... View more
Labels:
- Apache Hive
07-28-2016
07:20 AM
So I found the appropriate components, but it doesn't convert the file properly. Any idea? The input file is binary.
... View more
07-27-2016
04:11 PM
Hi, where can I find the character set values that are accepted by the ConvertCharacterSet processor? Also, which component can I use to load a CSV file and dump the results into the converted CSV file?
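Since NiFi's ConvertCharacterSet runs on the JVM, the accepted names are (I believe, this is an assumption) whatever java.nio.charset supports on that JVM. A small stand-alone check, using nothing beyond the JDK:

```java
import java.nio.charset.Charset;

public class ListCharsets {
    public static void main(String[] args) {
        // Print every canonical charset name this JVM accepts
        // (UTF-8 and ISO-8859-1 are guaranteed by the Java spec).
        Charset.availableCharsets().keySet().forEach(System.out::println);
        // Aliases such as "latin1" are resolved too:
        System.out.println("latin1 supported: " + Charset.isSupported("latin1"));
    }
}
```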
... View more
02-09-2016
01:37 PM
It's strange that you can't reproduce the error. Does it work for you?
Application application_1454923438220_0007 failed 2 times due to AM Container for appattempt_1454923438220_0007_000002 exited with exitCode: 1
For more detailed output, check application tracking page: http://sandbox.hortonworks.com:8088/cluster/app/application_1454923438220_0007 Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_e10_1454923438220_0007_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:576)
at org.apache.hadoop.util.Shell.run(Shell.java:487)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:753)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
... View more
02-05-2016
04:01 PM
1 Kudo
Well, CSVExcelStorage doesn't work either....
2016-02-05 16:01:28,917 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2016-02-05 16:01:29,745 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias sourceData
Details at logfile: /home/hdfs/pig_1454687855333.log
I'm confused... what is it?
... View more
02-05-2016
02:45 PM
1 Kudo
There is 100% no problem with the input dataset; I kept only the first 5 records in the file and it's the same issue.
... View more
02-05-2016
02:18 PM
You should fix this FORUM website; it's a pain to format text, paste code, etc....
... View more
02-05-2016
02:17 PM
2 Kudos
I created sample code; it works FINE.

import java.io.BufferedInputStream;
import java.io.IOException;
import java.net.URI;
import org.apache.avro.Schema;
import org.apache.avro.file.DataFileStream;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

BufferedInputStream inStream = null;
String inputF = "hdfs://CustomerData-20160128-1501807.avro";
Path inPath = new Path(inputF);
try {
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://sandbox.hortonworks.com:8020");
    FileSystem fs = FileSystem.get(URI.create(inputF), conf);
    // fs.open() returns a stream, which DataFileStream can consume
    // (unlike DataFileReader, which needs a local File)
    inStream = new BufferedInputStream(fs.open(inPath));
} catch (IOException e) {
    e.printStackTrace();
}
DataFileStream<GenericRecord> reader = new DataFileStream<>(inStream, new GenericDatumReader<GenericRecord>());
Schema schema = reader.getSchema();
System.out.println(schema.toString());
... View more
02-05-2016
02:06 PM
I'm trying to write sample Java code... but per https://hadoop.apache.org/docs/r2.6.1/api/org/apache/hadoop/conf/Configuration.html:
[root@sandbox deploy-4]# find / -name core-default.xml
[root@sandbox deploy-4]# find / -name core-site.xml
There are no such files in the sandbox. How can I get through this step? Thanks
... View more
02-05-2016
01:08 PM
Can you call avro-tools-1.7.4.jar from within a Pig script? Also, is it possible to access files stored on HDFS using avro-tools?
... View more
02-05-2016
11:36 AM
1 Kudo
Hi, I want to read the metadata from an Avro file stored in HDFS using the Avro API ( https://avro.apache.org/docs/1.4.1/api/java/org/apache/avro/file/DataFileReader.html ). The Avro DataFileReader accepts only File objects. Is it somehow possible to read data from a file stored on HDFS instead of data stored on the local fs? Thank you
... View more
Labels:
- Apache Hadoop
02-05-2016
09:03 AM
1 Kudo
This is odd: when I do grunt> b = limit sourceData 5; grunt> dump b; it works. But when I don't limit the result set and just execute dump sourceData; I hit the same error.
... View more
02-05-2016
08:43 AM
1 Kudo
Then what kind of issue with the environment could it be? I only executed the mentioned command, nothing else.
... View more
02-05-2016
12:59 AM
You can find the dataset here: https://drive.google.com/file/d/0B6RZ_9vVuTEcTHllU1dIR2VBY1E/view?usp=sharing Thank you
... View more
02-05-2016
12:56 AM
1 Kudo
FYI, per https://issues.apache.org/jira/browse/PIG-4793: org.apache.pig.piggybank.storage.avro.AvroStorage is deprecated; use AvroStorage('schema', '-d'). This works.
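For later readers, a minimal Pig sketch of the replacement call (the path is hypothetical; the 'schema' and '-d' arguments are taken verbatim from PIG-4793 as quoted above, so check the JIRA for their exact semantics):

```pig
-- built-in AvroStorage instead of the deprecated piggybank one
sourceData = LOAD '/src/CustomerData.avro' USING AvroStorage('schema', '-d');
DUMP sourceData;
```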
... View more
02-05-2016
12:39 AM
1 Kudo
Needless to say, this is insane. Yes, grunt with -x mapreduce; I tried -x tez but:
2016-02-05 00:37:42,172 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias sourceData
Details at logfile: /home/hdfs/pig_1454632554431.log
Privileges are correct:
drwxr-xr-x - hdfs hdfs 0 2016-02-04 23:55 /src
The delimiter is ';'. Any idea?
... View more
02-05-2016
12:13 AM
1 Kudo
Hi, I am trying to execute a Pig script in mapreduce mode. The script is simple:
grunt> sourceData = load 'hdfs://sandbox.hortonworks.com:8020/src/CustomerData.csv' using PigStorage(';') as (nullname: chararray, customerId: chararray, VIN: chararray, Birthdate: chararray, Mileage: chararray, Fuel_Consumption: chararray);
The file is stored in HDFS:
hadoop fs -ls hdfs://sandbox.hortonworks.com:8020/src/CustomerData.csv
-rw-r--r-- 3 hdfs hdfs 6828 2016-02-04 23:55 hdfs://sandbox.hortonworks.com:8020/src/CustomerData.csv
The error that I got:
Failed Jobs:
JobId Alias Feature Message Outputs
job_1454609613558_0003 sourceData MAP_ONLY Message: Job failed! hdfs://sandbox.hortonworks.com:8020/tmp/temp-710368608/tmp-1611282262
Input(s):
Failed to read data from "hdfs://sandbox.hortonworks.com:8020/src/CustomerData.csv"
Output(s):
Failed to produce result in "hdfs://sandbox.hortonworks.com:8020/tmp/temp-710368608/tmp-1611282262"
Pig Stack Trace
---------------
ERROR 1066: Unable to open iterator for alias sourceData
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias sourceData
at org.apache.pig.PigServer.openIterator(PigServer.java:935)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:754)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:376)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:230)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
at org.apache.pig.Main.run(Main.java:565)
at org.apache.pig.Main.main(Main.java:177)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.io.IOException: Job terminated with anomalous status FAILED
at org.apache.pig.PigServer.openIterator(PigServer.java:927)
... 13 more
... View more
Tags:
- Data Processing
- Pig
Labels:
- Apache Pig
02-01-2016
12:27 PM
Ah, I already did... my question was why it's there... when I use local mode it's not there. Anyway, there is no reply from anyone behind AvroStorage... that's pretty odd.
... View more
02-01-2016
12:20 PM
Sure, but the input data contains all the fields, so my question is why it generates [null] as part of the datatype. Also, still no luck with https://issues.apache.org/jira/browse/PIG-4793
... View more
02-01-2016
10:34 AM
One more important observation: when I dump data into Avro using store outputSet into 'avrostorage' using AvroStorage(); the schema inside the Avro file looks like:
{"type":"record","name":"pig_output","fields":[{"name":"name","type":["null","string"]},{"name":"customerId","type":["null","string"]},{"name":"VIN","type":["null","string"]},{"name":"Birthdate","type":["null","string"]},{"name":"Mileage","type":["null","string"]},{"name":"Fuel_Consumption","type":["null","string"]}]}
Why does each field contain null?
... View more
01-31-2016
02:26 PM
Is there any update on this?
... View more
01-29-2016
06:54 PM
Well, I can't live with that workaround, that's the problem. What is HCC?
... View more
01-29-2016
06:10 PM
Yes, it works for me also, but when I use STORE outputSet INTO '/avro-dest/Test-20160129-1401822' USING org.apache.pig.piggybank.storage.avro.AvroStorage and I define the schema as part of AvroStorage( schema )... it doesn't work ;-(((
... View more
01-29-2016
05:38 PM
OK, I added the line:
outputSet = foreach outputSet generate $0 as (name:chararray), $1 as (customerId:chararray), $2 as (VIN:chararray), $3 as (Birthdate:chararray), $4 as (Mileage:chararray), $5 as (Fuel_Consumption:chararray);
and successfully created the output Avro file using: store outputSet into 'avrostorage' using AvroStorage();
When I try to store the output file using the code below, it fails:
/10.0.1.47:8050
2016-01-29 17:24:39,600 [main] ERROR org.apache.pig.tools.pigstats.mapreduce.MRPigStatsUtil - 1 map reduce job(s) failed!
At this point I clearly have no idea what else I can do.
STORE outputSet INTO '/avro-dest/Test-20160129-1401822' USING org.apache.pig.piggybank.storage.avro.AvroStorage('no_schema_check', 'schema', '{"type":"record","name":"test","fields":[{"name":"name","type":"string","title":"Customer name","description":"non Surrogate Key for joining files on the BDP","DataOwner":"Bank","ValidityDate":"2015.12.22","ValidityOption":"Delete","DataSensitivityLevel":"0","FieldPosition":"1"},{"name":"customerId","type":"string","title":"customer Id","description":"non sensitive field of customer Id","DataOwner":"Bank","ValidityDate":"2015.12.22","ValidityOption":"Retain","DataSensitivityLevel":"0","FieldPosition":"2"},{"name":"VIN","type":"string","title":"Customer VIN","description":"Customer VIN","DataOwner":"Bank","ValidityDate":"2015.12.22","ValidityOption":"Delete","DataSensitivityLevel":"1","FieldPosition":"3"},{"name":"Birthdate","type":"string","title":"Customer birthdate","description":"Customer birthdate","DataOwner":"Bank","ValidityDate":"2015.12.22","ValidityOption":"Delete","DataSensitivityLevel":"1","FieldPosition":"4"},{"name":"Mileage","type":"string","title":"Customer mileage","description":"Customer mileage","DataOwner":"Bank","ValidityDate":"2015.12.22","ValidityOption":"Delete","DataSensitivityLevel":"0","FieldPosition":"5"},{"name":"Fuel_Consumption","type":"string","title":"Customer fule consumption","description":"Customer fuel consumption","DataOwner":"Bank","ValidityDate":"2015.12.22","ValidityOption":"Delete","DataSensitivityLevel":"0","FieldPosition":"6"}]}');
... View more
01-29-2016
05:10 PM
Oops, sorry, my fault... I don't have that source stored in HDFS... time to stop debugging for today :-)
... View more
01-29-2016
05:06 PM
I don't know what happened, but I can't load any Avro file in mapreduce mode...
grunt> sensitiveSet = load '/t-spool-dir/Test-20160129-1401822-ttp.avro' USING AvroStorage();
2016-01-29 17:06:00,668 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1200: null
Details at logfile: /tmp/hsperfdata_hdfs/pig_1454087102249.log
Pig Stack Trace
---------------
ERROR 1200: null
Failed to parse: null
at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:201)
at org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1707)
at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1680)
at org.apache.pig.PigServer.registerQuery(PigServer.java:623)
at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1082)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:505)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:230)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
at org.apache.pig.Main.run(Main.java:565)
at org.apache.pig.Main.main(Main.java:177)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.NullPointerException
at org.apache.pig.builtin.AvroStorage.getAvroSchema(AvroStorage.java:298)
at org.apache.pig.builtin.AvroStorage.getAvroSchema(AvroStorage.java:282)
at org.apache.pig.builtin.AvroStorage.getSchema(AvroStorage.java:256)
at org.apache.pig.newplan.logical.relational.LOLoad.getSchemaFromMetaData(LOLoad.java:175)
at org.apache.pig.newplan.logical.relational.LOLoad.<init>(LOLoad.java:89)
at org.apache.pig.parser.LogicalPlanBuilder.buildLoadOp(LogicalPlanBuilder.java:901)
at org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3568)
at org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1625)
at org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:1102)
at org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:560)
at org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:421)
at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:191)
... 16 more
/tmp/hsperfdata_hdfs/pig_1454087102249.log
... View more