Member since: 06-07-2016
Posts: 923
Kudos Received: 322
Solutions: 115
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4127 | 10-18-2017 10:19 PM |
| | 4361 | 10-18-2017 09:51 PM |
| | 14906 | 09-21-2017 01:35 PM |
| | 1859 | 08-04-2017 02:00 PM |
| | 2431 | 07-31-2017 03:02 PM |
08-29-2016
11:33 PM
@kishore sanchina Looking at your logs, you are running out of PermGen space. You need to increase PermGen and likely your Java heap as well. How much memory do you have?

2016-08-29 06:58:42,186 ERROR [NiFi Web Server-96] org.apache.nifi.NiFi An Unknown Error Occurred in Thread Thread[NiFi Web Server-96,5,main]: java.lang.OutOfMemoryError: PermGen space

You need to increase the following in your conf/bootstrap.conf file. I would double them for now to start with:

#java.arg.11=-XX:PermSize=128M
#java.arg.12=-XX:MaxPermSize=128M

And then possibly update the following as well:

# JVM memory settings
java.arg.2=-Xms1024m
java.arg.3=-Xmx1024m

It would be very difficult for me or anyone else to optimize these for your specific load. You need to figure out, based on your load, how much memory to assign to PermGen as well as the Java heap. This may take some trial and error.
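As a rough illustration only, assuming the stock 128M PermGen and 1024m heap defaults quoted above, doubling them in conf/bootstrap.conf might look like the snippet below (the java.arg.* numbering is taken from the defaults shown here and may differ in your install; the PermGen lines need the leading # removed):

```
# conf/bootstrap.conf -- example values only, tune for your own load
# PermGen: uncommented and doubled from the 128M defaults
java.arg.11=-XX:PermSize=256M
java.arg.12=-XX:MaxPermSize=256M

# JVM heap: doubled from the 1024m defaults
java.arg.2=-Xms2048m
java.arg.3=-Xmx2048m
```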
08-29-2016
11:15 PM
@kishore sanchina Can you please share the output of your nifi-app.log file? It should be under the logs folder in your NiFi installation.
08-29-2016
02:58 PM
@ScipioTheYounger I would say only one type of authorization is really available for the metastore, and it is the storage-based one. The second option is pretty much useless: it is the legacy mode, and it allows users to grant themselves whatever permissions they want.
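For reference, a minimal sketch of what enabling storage-based authorization for the metastore typically looks like in hive-site.xml; the property names are from stock Apache Hive, so verify them against your Ambari/HDP configs before applying:

```xml
<!-- hive-site.xml (metastore side) - sketch only -->
<property>
  <name>hive.metastore.pre.event.listeners</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.AuthorizationPreEventListener</value>
</property>
<property>
  <name>hive.security.metastore.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider</value>
</property>
<property>
  <name>hive.security.metastore.authenticator.manager</name>
  <value>org.apache.hadoop.hive.ql.security.HadoopDefaultMetastoreAuthenticator</value>
</property>
```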
08-29-2016
02:07 PM
@abhi singh Can you please share your code for the evaluate method?
08-27-2016
08:05 PM
Hi @SBandaru What I meant was that this link has a link to download the Hortonworks Connector for Teradata. So what I was wondering is whether you are using the default Sqoop connector for Teradata or the Hortonworks Connector for Teradata.
08-26-2016
05:00 PM
@Pierre Villard
Did you try --queryband, or something like the following (notice the dash): --query-band DC=BP?
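For illustration, a sketch of how that option might be passed on a Sqoop import against Teradata. The host, credentials and table are placeholders, and --query-band is the connector-specific option discussed above, so check your connector's documentation for the exact spelling (Teradata query band strings normally end with a semicolon):

```sh
# Sketch only: host/user/table are placeholders
sqoop import \
  --connect jdbc:teradata://td-host/Database=mydb \
  --username myuser -P \
  --table MY_TABLE \
  --query-band "DC=BP;" \
  --target-dir /user/myuser/my_table
```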
08-26-2016
03:09 PM
@Saifullah Sajjad I don't see the error log. Did you attach it?
08-25-2016
10:51 PM
@Johnny Fugers That question is a little ambiguous. Do you mean whether Hadoop provides out-of-the-box tools where you push data in and it tells you what distribution you have? The answer is no. But, just as you would outside of Hadoop, you can assume a distribution for your data and then verify whether the data agrees with your assumption. That you can certainly do. Use Spark for that; check this link. Or use Python; check PySpark as well. Or even R.
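As a rough PySpark sketch (the file path and column position are made up), one way to eyeball the distribution is to bucket a numeric column into a histogram and compare the shape against what you assumed:

```python
# Sketch only: 'values.csv' and the numeric column index are assumptions
from pyspark import SparkContext

sc = SparkContext(appName="distribution-check")

# Load one numeric column as floats
values = sc.textFile("hdfs:///tmp/values.csv") \
           .map(lambda line: float(line.split(",")[0]))

# Bucket into 20 bins; compare the counts against the shape you expect
bucket_edges, counts = values.histogram(20)
for left, right, count in zip(bucket_edges[:-1], bucket_edges[1:], counts):
    print("[%.2f, %.2f): %d" % (left, right, count))

sc.stop()
```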
08-25-2016
09:45 PM
2 Kudos
@Johnny Fugers First, find out whether your data is normally distributed. If not, what is its distribution? That will pretty much determine which test you should use. If the data is not normally distributed, you can transform it to a normal form. So, first know your distribution. Are you familiar with Grubbs' test? You would have to write your own UDF to run that in Hive. But why limit yourself to Hive? You can read Hive data using Spark, and Spark MLlib provides several out-of-the-box tools to do just that. Your data can still be read through Hive for everything else you are doing with it, while at the same time you use Spark on the same data. Check this link.
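A minimal sketch of the MLlib route (PySpark; the database, table, and column names are placeholders): read the Hive table through the HiveContext, then run the built-in Kolmogorov-Smirnov test against a normal distribution to check normality before picking an outlier test such as Grubbs':

```python
# Sketch only: mydb.my_table and metric_value are placeholders
from pyspark import SparkContext
from pyspark.sql import HiveContext
from pyspark.mllib.stat import Statistics

sc = SparkContext(appName="normality-check")
sqlContext = HiveContext(sc)

# Pull the numeric column out of Hive as an RDD of floats
rdd = sqlContext.sql("SELECT metric_value FROM mydb.my_table") \
                .rdd.map(lambda row: float(row[0]))

# Kolmogorov-Smirnov test against a standard normal; standardize your
# data first if it is not already centered and scaled.
result = Statistics.kolmogorovSmirnovTest(rdd, "norm", 0.0, 1.0)
print(result)

sc.stop()
```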
08-25-2016
08:58 PM
@SBandaru Are you using the default connector that comes with Sqoop? If yes, then you need to use the Hortonworks Connector for Teradata. Please see the instructions, including the download link, here.
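For illustration (host, database, and table are placeholders): with only the generic JDBC path you typically have to point Sqoop at the Teradata JDBC driver yourself, as in the sketch below, whereas once the Hortonworks Connector for Teradata is installed per those instructions it handles jdbc:teradata:// URLs without the --driver override:

```sh
# Generic JDBC route (default Sqoop, no Hortonworks connector) - sketch only
sqoop import \
  --connect jdbc:teradata://td-host/Database=mydb \
  --driver com.teradata.jdbc.TeraDriver \
  --username myuser -P \
  --table MY_TABLE \
  --target-dir /user/myuser/my_table
```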