Member since: 01-26-2021
Posts: 11
Kudos Received: 1
Solutions: 0
11-23-2021
02:43 AM
Hi @lwmonster @Shrilesh, please add the Linux paths of these three jars to the Dependencies section of the hbase interpreter: hbase-client.jar, hbase-protocol.jar, hbase-common.jar.
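As a sketch, assuming an HDP-style layout where the HBase client libraries live under /usr/hdp/current/hbase-client/lib (the install root and any version suffixes will differ per cluster), the Dependencies section of the hbase interpreter would get entries along these lines:

```
/usr/hdp/current/hbase-client/lib/hbase-client.jar
/usr/hdp/current/hbase-client/lib/hbase-protocol.jar
/usr/hdp/current/hbase-client/lib/hbase-common.jar
```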
11-21-2021
07:42 PM
Hi Guys, I am currently working with Zeppelin running on an HDP cluster; the Zeppelin version is 0.7.3. I configured the Pig interpreter by manually importing the jars from the Maven repository. When I run a simple script that loads a file and dumps the output to the console, it works properly in local mode with the input file in the local (Linux) file system. However, when I run it in Tez mode the script fails with the exception below. The same script works when I run Pig in Tez mode from the CLI. I have set HADOOP_CONF_DIR and TEZ_CONF_DIR in zeppelin-env.sh.

The script is:

data = LOAD '/path/sample.txt' as (C1:chararray,C2:chararray,C3:chararray);
b = FILTER data by C1 is not null;
DUMP b;

The exception is:

org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias b
    at org.apache.pig.PigServer.openIterator(PigServer.java:1020)
    at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:782)
    at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:383)
    at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:230)
    at org.apache.pig.PigServer.registerScript(PigServer.java:781)
    at org.apache.pig.PigServer.registerScript(PigServer.java:858)
    at org.apache.pig.PigServer.registerScript(PigServer.java:821)
    at org.apache.zeppelin.pig.PigInterpreter.interpret(PigInterpreter.java:100)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:97)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:498)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Couldn't retrieve job.
    at org.apache.pig.PigServer.store(PigServer.java:1084)
    at org.apache.pig.PigServer.openIterator(PigServer.java:995)
    ... 18 more

Please help me with your inputs.
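For reference, the HADOOP_CONF_DIR and TEZ_CONF_DIR settings mentioned above would look something like the following in zeppelin-env.sh; the /etc/... paths here are assumptions based on a typical HDP layout, not the poster's actual values:

```
# zeppelin-env.sh -- configuration directory exports (paths are illustrative)
export HADOOP_CONF_DIR=/etc/hadoop/conf
export TEZ_CONF_DIR=/etc/tez/conf
```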
Labels:
- Apache Pig
- Apache Zeppelin
04-05-2021
10:21 PM
Hi @Love-Nifi, can you please let me know which processor you are trying to use?
04-05-2021
09:10 PM
Hi @ram_g, can you please let me know how much time it takes processor A to create a single flow file of 100 records?
04-05-2021
05:56 AM
Hi @giovannimori, can you please try to UNION the two subqueries and form a single view?
04-05-2021
05:48 AM
1 Kudo
Hi @ram_g The oldest-flow-file prioritizer determines the oldest flow file by the amount of time the flow file has been in the flow. That makes processor B pick a file effectively at random when more than one flow file comes out of processor A. Can you please elaborate on the issue you are facing with the FirstInFirstOut prioritizer? Also, please see if you can limit the queue's threshold to 1, so there is always at most one flow file in the queue, eliminating the confusion caused by the prioritizer.
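As a sketch of that second suggestion, these are the relevant settings on the connection between processors A and B (NiFi UI, connection Settings tab); the values shown are illustrative, not confirmed for this flow:

```
Selected Prioritizers:           FirstInFirstOutPrioritizer
Back Pressure Object Threshold:  1
```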
04-05-2021
03:43 AM
Hi Guys, I am trying to migrate an ORC table from one cluster to another. The export and import do not throw any errors, and I can see folders and files being created, but the imported table does not contain any data. The table is created with all the correct columns. The same procedure works properly for a table stored as textfile. The Hive version is 3.1. Please provide your inputs. Regards, Magudeswaran R.
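For context, a hedged sketch of the export/import flow being described, using HiveQL's EXPORT/IMPORT statements; the database/table names and staging path here are hypothetical, and the poster's actual statements may differ:

```
-- On the source cluster (names and paths are illustrative):
EXPORT TABLE db1.orc_table TO '/tmp/orc_table_export';

-- Copy the exported directory to the target cluster (e.g. with distcp), then on the target:
IMPORT TABLE db1.orc_table FROM '/tmp/orc_table_export';
```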
Labels:
- Apache Hive
02-11-2021
03:40 AM
Hi @GangWar @AmirMirza Thank you for the inputs. Another new observation: the issue occurs when ambari-server connects to MariaDB, as there were two different databases. When the database value is changed, authentication works and Ambari is able to sync data from LDAP.
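For reference, the database settings in question live in /etc/ambari-server/conf/ambari.properties. A sketch with the standard Ambari property names, but with illustrative values that are assumptions, not the poster's actual configuration:

```
# /etc/ambari-server/conf/ambari.properties (values are illustrative)
server.jdbc.database=mysql
server.jdbc.database_name=ambari
server.jdbc.hostname=mariadb-host.example.com
server.jdbc.user.name=ambari
```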
01-27-2021
07:04 PM
@GangWar Do you mean logging in to the Ambari web UI using the Ambari administrator ID? Login to the Ambari web UI also works properly. But during the setup-ldap command it prompts for the Ambari admin user and password, after which the following error appears:

Enter Ambari Admin password:
Fetching LDAP configuration from DB.
ERROR: Exiting with exit code 1.
REASON: Error while fetching LDAP configuration. Error details: HTTP Error 403: Forbidden

Is it trying to connect to MySQL/MariaDB to validate the admin credentials? Is it because of web server limitations on the MariaDB host that I am getting a 403? Please share your thoughts on this.
01-27-2021
03:05 AM
Hi @GangWar Thank you for the reply. The same credentials work from the manager node; the issue happens only on the edge node. The Ambari Server version is 2.7.3.0.
01-26-2021
08:48 PM
Hi, I am pretty new to setting up LDAP with Ambari Server, so I wanted to understand where the details entered during LDAP setup are stored/cached. Whenever I run ambari-server setup-ldap, I get the following error:

Fetching LDAP configuration from DB.
ERROR: Exiting with exit code 1.
REASON: Error while fetching LDAP configuration. Error details: HTTP Error 403: Forbidden

I am not sure what exactly "DB" means here, so please help me understand what happens in the background when I run an LDAP setup or sync. I am trying to connect to an AD hosted on a remote machine.
Labels:
- Apache Ambari