Member since: 01-28-2015
Posts: 61
Kudos Received: 35
Solutions: 0
04-08-2018
12:34 PM
I am ingesting data from MS SQL Server 2016 into Hive using NiFi, and the workflow is as follows: GenerateTableFetch -> ExtractText -> ExecuteSQL -> UpdateAttribute -> ConvertAvroToORC -> PutHDFS -> ReplaceText -> PutHiveQL. Using the MS SQL 2012+ database type gives an error in GenerateTableFetch; I got a solution for that from a question on Stack Overflow. My GenerateTableFetch generates: SELECT * FROM (SELECT TOP 50000 *, ROW_NUMBER() OVER(ORDER BY asc) rnum FROM ABCD.dbo.DEFG) A WHERE rnum > 0 AND rnum <= 50000. How do I configure the ReplaceText processor to replace "ORDER BY asc" with "ORDER BY newid() asc", so that I can feed the result into the ExecuteSQL processor in my existing workflow?
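Something like the following ReplaceText configuration is what I have in mind (an untested sketch; the \s+ in the regex is an assumption about the exact whitespace GenerateTableFetch emits):
Search Value: ORDER BY\s+asc
Replacement Value: ORDER BY newid() asc
Replacement Strategy: Regex Replace
Evaluation Mode: Entire text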
Labels: Apache NiFi
04-08-2018
11:37 AM
But this approach will be painful if you have 2000+ tables to move from MS SQL using NiFi. Also, using GenerateTableFetch gives an error ("Order by clause cannot be null or empty when using row paging"); what to do in that case? A rough sketch of the property I suspect is missing is below.
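My current guess (an assumption, not verified) is that with the MS SQL 2012+ type, GenerateTableFetch needs a column it can order on for the row paging, roughly:
Database Type: MS SQL 2012+
Table Name: ${db.table.name}
Maximum-value Columns: id    (assumes every table has a usable key column, which is exactly what is hard to guarantee with 2000+ heterogeneous tables)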
02-23-2018
05:43 AM
@AndrewLim Worked like a charm. Looking at your response, I figured that I never defined schema.name and hence the errors! Thank you for the prompt response!
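For future readers, the missing piece looked roughly like this (column names are invented for illustration; my real schema differs):
Set schema.name = csvschema (e.g. in UpdateAttribute, or as the Schema Name property on the reader).
Register csvschema in an AvroSchemaRegistry controller service:
{ "type": "record", "name": "csvschema", "fields": [ {"name": "id", "type": "int"}, {"name": "name", "type": "string"} ] }
In CSVReader: Schema Access Strategy = Use 'Schema Name' Property, Schema Registry = AvroSchemaRegistry.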
02-22-2018
10:06 AM
I have a simple CSV file whose content is as follows: 1,QWER
2,TYUI
3,ASDF
4,GHJK
5,ZXCV
I want to move the content of this file into a MySQL table, so I have created the following flow (refer to 1.png). I have configured the PutDatabaseRecord processor as shown (refer to 2.png), and the CSVReader looks like 3.png. I am getting the error shown in 4.png. Can you help me configure my CSVReader? I guess it is the reason I am not able to push my CSV records into the MySQL table. Any help is appreciated. Reference link: https://community.hortonworks.com/questions/102559/please-suggest-me-stepshow-i-can-insert-csv-filein.html
Labels: Apache NiFi
10-23-2017
10:21 AM
I am using a JDBC driver to connect my NiFi 1.3.0 instance to an MS SQL 2012 server instance. If I use the QueryDatabaseTable processor, I am able to fetch the incremental data for a specific table, with the output as just one flow file. Now I want to fetch data using a custom query (a single query that fetches different columns from multiple tables), so I am using the ExecuteSQL processor to run it, but the output of that processor is multiple flow files. I need only one flow file, so that I can write code to fetch the data incrementally; it would be easy to track one flow file. Any idea how this can be achieved?
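One idea I am considering (an untested sketch) is to merge the Avro outputs back into a single flow file with a MergeContent processor after ExecuteSQL:
Merge Strategy: Bin-Packing Algorithm
Merge Format: Avro
Minimum Number of Entries: high enough to cover one query's output (this threshold is an assumption; it depends on how the result set gets split upstream)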
Labels: Apache NiFi
10-23-2017
09:51 AM
The error was resolved!
09-01-2017
08:13 AM
I have a table in Postgres with the following schema and values:
1111,vadfva/advasdvdv/avdva/adv=,,f,2017-06-02 11:59:00.543744+05:30,2017-06-02 11:59:00.543799+05:30,asdf@DFASDF.com,123455,1234567901,,f,f,general,"{""state"": ""asdsfag"", ""pincode"": ""1234456"", ""address_city"": ""asdffgd"", ""address_line_1"": ""ASDFAFA"", ""address_line_2"": """", ""address_landmark"": """"}",,,1976-12-25 05:30:00+05:30,ASDF,ASDF,0,SD sxvadafc,t,SDFSDF,,f,f,0
When I use the QueryDatabaseTable processor on tables without complex columns, this works, but the table above has a JSON entry in one of its columns. My guess is that is what is triggering the error, since the type of the address column is "jsonb". The error is as follows: ERROR [Timer-Driven Process Thread-2] o.a.n.p.standard.QueryDatabaseTable QueryDatabaseTable[id=9c7e1e9b-1088-115d-c426-d788e86d9ea7] Unable to execute SQL select query SELECT * FROM asdfad due to org.apache.nifi.processor.exception.ProcessException: Error during database query or conversion of records to Avro.: {}
org.apache.nifi.processor.exception.ProcessException: Error during database query or conversion of records to Avro.
at org.apache.nifi.processors.standard.QueryDatabaseTable.lambda$onTrigger$13(QueryDatabaseTable.java:305)
at org.apache.nifi.controller.repository.StandardProcessSession.write(StandardProcessSession.java:2529)
at org.apache.nifi.processors.standard.QueryDatabaseTable.onTrigger(QueryDatabaseTable.java:299)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
at org.apache.nifi.controller.scheduling.QuartzSchedulingAgent$2.run(QuartzSchedulingAgent.java:165)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException: createSchema: Unknown SQL type 1111 / jsonb (table: account_user, column: address) cannot be converted to Avro type
at org.apache.nifi.processors.standard.util.JdbcCommon.createSchema(JdbcCommon.java:564)
at org.apache.nifi.processors.standard.util.JdbcCommon.convertToAvroStream(JdbcCommon.java:242)
at org.apache.nifi.processors.standard.QueryDatabaseTable.lambda$onTrigger$13(QueryDatabaseTable.java:303)
... 13 common frames omitted
The QueryDatabaseTable processor is configured as in the image attached to this question. I referred to a similar question posted on the forum, but there was no solution there either; the link to that question is: https://community.hortonworks.com/questions/36464/how-to-use-nifi-to-incrementally-ingest-data-from.html. Any idea why the error is caused? It works for all the other tables except this one. I tried changing the database type from Default to Oracle, but it does not help. Is there a workaround for this?
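One workaround I am considering (a sketch, not yet tried): cast the jsonb column to text in the query itself, which means switching from QueryDatabaseTable to a processor that accepts a custom query and handling the incremental logic myself. Roughly:
SELECT id, email, address::text AS address, ... FROM account_user
(the abbreviated column list is illustrative; ::text is standard PostgreSQL casting, so the driver should then report a plain character type that converts to Avro).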
Labels: Apache NiFi
08-17-2017
09:01 AM
@Jay SenSharma I removed "?ssl=true" after the database name in the DB connection URL, but I still get an error that the controller service is disabled. I have pasted that error above as well.
08-16-2017
09:27 AM
DBCPConnectionPool is configured as in the attached image (dbconnec image). The Database Connection URL is: jdbc:postgresql://<IP>:<port>/<database name>?ssl=true. I get the following error when I set "org.postgresql.ds.PGPoolingDataSource" as the Database Driver Class Name in the DBCPConnectionPool: ERROR [Timer-Driven Process Thread-3] o.a.n.p.standard.QueryDatabaseTable QueryDatabaseTable[id=015d1028-9c7e-1e9b-821a-7e21dd312b82] Unable to execute SQL select query SELECT * FROM account_transaction due to org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (SSL error: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target): {}
org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (SSL error: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target)
at org.apache.nifi.dbcp.DBCPConnectionPool.getConnection(DBCPConnectionPool.java:275)
at sun.reflect.GeneratedMethodAccessor711.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:89)
at com.sun.proxy.$Proxy128.getConnection(Unknown Source)
at org.apache.nifi.processors.standard.QueryDatabaseTable.onTrigger(QueryDatabaseTable.java:266)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (SSL error: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target)
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549)
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388)
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
at org.apache.nifi.dbcp.DBCPConnectionPool.getConnection(DBCPConnectionPool.java:272)
... 17 common frames omitted
Caused by: org.postgresql.util.PSQLException: SSL error: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at org.postgresql.ssl.MakeSSL.convert(MakeSSL.java:67)
at org.postgresql.core.v3.ConnectionFactoryImpl.enableSSL(ConnectionFactoryImpl.java:359)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:148)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:194)
at org.postgresql.Driver.makeConnection(Driver.java:450)
at org.postgresql.Driver.connect(Driver.java:252)
at org.apache.nifi.dbcp.DriverShim.connect(DriverShim.java:46)
at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38)
at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
at org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556)
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545)
... 20 common frames omitted
Caused by: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1949)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:302)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:296)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1514)
at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:216)
at sun.security.ssl.Handshaker.processLoop(Handshaker.java:1026)
at sun.security.ssl.Handshaker.process_record(Handshaker.java:961)
at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1062)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1375)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1403)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1387)
at org.postgresql.ssl.MakeSSL.convert(MakeSSL.java:62)
... 31 common frames omitted
Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:387)
at sun.security.validator.PKIXValidator.engineValidate(PKIXValidator.java:292)
at sun.security.validator.Validator.validate(Validator.java:260)
at sun.security.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:324)
at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:229)
at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:124)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1496)
... 39 common frames omitted
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.provider.certpath.SunCertPathBuilder.build(SunCertPathBuilder.java:141)
at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:126)
at java.security.cert.CertPathBuilder.build(CertPathBuilder.java:280)
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:382)
... 45 common frames omitted
And, when I put "org.postgresql.ds.PGPoolingDataSource" in the Database Driver Class Name in the DBCPConnectionPool and change the DB connection URL to "jdbc:postgresql://X.X.X.X:5432/abc", I get the following error: ,906 WARN [Timer-Driven Process Thread-3] o.a.n.c.t.ContinuallyRunProcessorTask
java.lang.IllegalStateException: Cannot invoke method public abstract java.sql.Connection org.apache.nifi.dbcp.DBCPService.getConnection() throws org.apache.nifi.processor.exception.ProcessException on Controller Service with identifier 015d1030-9c7e-1e9b-0b6a-4d8d0c01e807 because the Controller Service is disabled
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:84)
at com.sun.proxy.$Proxy128.getConnection(Unknown Source)
at org.apache.nifi.processors.standard.QueryDatabaseTable.onTrigger(QueryDatabaseTable.java:266)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
2017-08-16 09:38:57,906 ERROR [Timer-Driven Process Thread-4] o.a.n.p.standard.QueryDatabaseTable QueryDatabaseTable[id=015d1028-9c7e-1e9b-821a-7e21dd312b82] QueryDatabaseTable[id=015d1028-9c7e-1e9b-821a-7e21dd312b82] failed to process session due to java.lang.IllegalStateException: Cannot invoke method public abstract java.sql.Connection org.apache.nifi.dbcp.DBCPService.getConnection() throws org.apache.nifi.processor.exception.ProcessException on Controller Service with identifier 015d1030-9c7e-1e9b-0b6a-4d8d0c01e807 because the Controller Service is disabled: {}
java.lang.IllegalStateException: Cannot invoke method public abstract java.sql.Connection org.apache.nifi.dbcp.DBCPService.getConnection() throws org.apache.nifi.processor.exception.ProcessException on Controller Service with identifier 015d1030-9c7e-1e9b-0b6a-4d8d0c01e807 because the Controller Service is disabled
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:84)
at com.sun.proxy.$Proxy128.getConnection(Unknown Source)
at org.apache.nifi.processors.standard.QueryDatabaseTable.onTrigger(QueryDatabaseTable.java:266)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Also, if I add "org.postgresql.Driver" as the Database Driver Class Name in the DBCPConnectionPool and keep the DB connection URL as "jdbc:postgresql://X.X.X.X:5432/abc", I get the following error: 229 ERROR [Timer-Driven Process Thread-3] o.a.n.p.standard.QueryDatabaseTable QueryDatabaseTable[id=015d1028-9c7e-1e9b-821a-7e21dd312b82] Unable to execute SQL select query SELECT * FROM XXXX due to org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (FATAL: no pg_hba.conf entry for host "X.X.X.X", user "abc", database "abc", SSL off): {}
org.apache.nifi.processor.exception.ProcessException: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (FATAL: no pg_hba.conf entry for host "X.X.X.X", user "abc", database "abc", SSL off)
at org.apache.nifi.dbcp.DBCPConnectionPool.getConnection(DBCPConnectionPool.java:275)
at sun.reflect.GeneratedMethodAccessor711.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.nifi.controller.service.StandardControllerServiceInvocationHandler.invoke(StandardControllerServiceInvocationHandler.java:89)
at com.sun.proxy.$Proxy128.getConnection(Unknown Source)
at org.apache.nifi.processors.standard.QueryDatabaseTable.onTrigger(QueryDatabaseTable.java:266)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1120)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (FATAL: no pg_hba.conf entry for host "X.X.X.X", user "abc", database "abc", SSL off)
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549)
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388)
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
at org.apache.nifi.dbcp.DBCPConnectionPool.getConnection(DBCPConnectionPool.java:272)
... 17 common frames omitted
Caused by: org.postgresql.util.PSQLException: FATAL: no pg_hba.conf entry for host "X.X.X.X", user "abc", database "abc", SSL off
at org.postgresql.core.v3.ConnectionFactoryImpl.doAuthentication(ConnectionFactoryImpl.java:438)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:222)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:194)
at org.postgresql.Driver.makeConnection(Driver.java:450)
at org.postgresql.Driver.connect(Driver.java:252)
at org.apache.nifi.dbcp.DriverShim.connect(DriverShim.java:46)
at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38)
at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
at org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556)
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545)
... 20 common frames omitted
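Reading the three errors together, my working theory (an assumption, not a confirmed fix) is: the driver class should be org.postgresql.Driver in every case (PGPoolingDataSource is a DataSource implementation, not a java.sql.Driver, which is probably why that controller service never enabled); the PKIX failure means the JVM running NiFi does not trust the Postgres server certificate; and the pg_hba.conf error means the server insists on SSL. So the sketch I plan to try is either importing the server certificate into the truststore NiFi's JVM uses (assuming it can be exported as server.crt):
keytool -import -alias pgserver -file server.crt -keystore $JAVA_HOME/jre/lib/security/cacerts
while keeping ?ssl=true in the URL, or, for testing only, telling the driver to skip certificate validation:
jdbc:postgresql://X.X.X.X:5432/abc?ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory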
Labels: Apache NiFi
03-16-2016
01:54 PM
Yes, we increased the root space without hampering the installation. We followed a simple process of increasing the size: we created an external virtual disk and merged it with root.
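For anyone who wants the details, the steps were roughly the following (a sketch from memory; the device name is illustrative, and this assumes an LVM root with XFS, as on a default CentOS 7 install):
pvcreate /dev/sdb                              # the new virtual disk added in vSphere
vgextend centos /dev/sdb                       # add it to the root volume group
lvextend -l +100%FREE /dev/mapper/centos-root  # grow the root logical volume
xfs_growfs /                                   # grow the filesystem online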
02-26-2016
09:45 AM
I am trying to upgrade the storage, and I have taken a snapshot of the cluster. Just to be sure: I need to increase the /dev/mapper/centos-root storage size, right?
02-25-2016
12:48 PM
Yes, I am using / for Hadoop. These partitions were done automatically by Ambari, so I want to increase the HDFS size. For the POC I wanted to import a table of size 70 GB, but because of the current HDFS size I am able to import only 30+ GB, and the job hangs, with alerts all over Ambari about the disk usage.
02-25-2016
12:22 PM
1 Kudo
My /home partition is already 130 GB, but I am not able to use it for HDFS, as mentioned above per the gadget. My concern is that it should not hamper my HDP installation, which I have already done on it.
02-25-2016
11:55 AM
1 Kudo
Yes, the cluster is on VMware vSphere.
02-25-2016
11:45 AM
3 Kudos
In my 3-node cluster installation for a POC, the 3rd node is the datanode, and it has a disk space of about 200 GB. As per the widget, my current HDFS usage is as follows: DFS Used: 512.8 MB (1.02%); non-DFS used: 8.1 GB (16.52%); remaining: 40.4 GB (82.46%). When I do df -h to check the disk size, I can see a lot of space is taken by tmpfs, as shown in the following screenshot: How can I increase my HDFS disk size?
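One option I am looking at (an untested sketch; the paths are illustrative) is to add a directory on a partition with free space to the DataNode directories, e.g. in Ambari -> HDFS -> Configs -> DataNode directories (dfs.datanode.data.dir):
/hadoop/hdfs/data,/home/hdfs/data
followed by a DataNode restart, so HDFS could also use the 130 GB /home partition mentioned above.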
Labels: Apache Hadoop
02-25-2016
09:08 AM
1 Kudo
I guess I'll wait for the driver release.
02-24-2016
12:19 PM
To be honest, I did a lot of steps, and I am not sure which one actually resolved it! I'll list all that I did: I installed Knox manually, changed the firewall settings, and checked that all ports were accessible from all the nodes. From what I can tell, changing the port number for fs.default.name from 50070 to 8020 resolved it.
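For reference, the relevant core-site.xml entry ended up looking like this (a sketch; fs.defaultFS is the newer name for fs.default.name):
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://node1.dtitsupport247.net:8020</value>
</property>
8020 is the NameNode RPC port that clients should use; 50070 is only the NameNode web UI.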
02-24-2016
11:15 AM
@Neeraj Sabharwal Yes, I was able to read that from my laptop; only the ones on node2 and node3 were not accessible.
02-24-2016
10:26 AM
1 Kudo
Yes, I have created a symlink, but after creating it I still get the same error while installing the driver.
02-24-2016
09:48 AM
1 Kudo
The documentation says it should have CentOS 5.0 or 6.0. I also referred to https://community.hortonworks.com/questions/2210/libsasl-version-fro-hive-obdc-rhel-7.html. I have a 3-node cluster of HDP 2.3 on CentOS 7; when I try installing, I get the following output: [root@node1 enggusr]# yum --nogpgcheck localinstall hive-odbc-native-2.0.5.1005-1.el6.x86_64.rpm
Loaded plugins: fastestmirror, priorities
Examining hive-odbc-native-2.0.5.1005-1.el6.x86_64.rpm: hive-odbc-native-2.0.5.1005-1.x86_64
Marking hive-odbc-native-2.0.5.1005-1.el6.x86_64.rpm to be installed
Resolving Dependencies
--> Running transaction check
---> Package hive-odbc-native.x86_64 0:2.0.5.1005-1 will be installed
--> Processing Dependency: cyrus-sasl-gssapi(x86-64) >= 2.1.22 for package: hive-odbc-native-2.0.5.1005-1.x86_64
HDP-2.3 | 2.9 kB 00:00
HDP-UTILS-1.1.0.20 | 2.9 kB 00:00
Updates-ambari-2.2.0.0 | 2.9 kB 00:00
base | 3.6 kB 00:00
extras | 3.4 kB 00:00
mysql-connectors-community | 2.5 kB 00:00
mysql-tools-community | 2.5 kB 00:00
mysql56-community | 2.5 kB 00:00
updates | 3.4 kB 00:00
Loading mirror speeds from cached hostfile
* base: ftp.iitm.ac.in
* extras: ftp.iitm.ac.in
* updates: ftp.iitm.ac.in
--> Processing Dependency: cyrus-sasl-plain(x86-64) >= 2.1.22 for package: hive-odbc-native-2.0.5.1005-1.x86_64
--> Processing Dependency: libsasl2.so.2()(64bit) for package: hive-odbc-native-2.0.5.1005-1.x86_64
--> Running transaction check
---> Package cyrus-sasl-gssapi.x86_64 0:2.1.26-20.el7_2 will be installed
---> Package cyrus-sasl-plain.x86_64 0:2.1.26-20.el7_2 will be installed
---> Package hive-odbc-native.x86_64 0:2.0.5.1005-1 will be installed
--> Processing Dependency: libsasl2.so.2()(64bit) for package: hive-odbc-native-2.0.5.1005-1.x86_64
--> Finished Dependency Resolution
Error: Package: hive-odbc-native-2.0.5.1005-1.x86_64 (/hive-odbc-native-2.0.5.1005-1.el6.x86_64)
Requires: libsasl2.so.2()(64bit)
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
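For reference, the symlink workaround from the linked thread looks like this (it did not resolve it for me, as per my replies above; CentOS 7 ships only libsasl2.so.3, while this el6 RPM wants libsasl2.so.2):
yum install -y cyrus-sasl-lib
ln -s /usr/lib64/libsasl2.so.3 /usr/lib64/libsasl2.so.2
yum --nogpgcheck localinstall hive-odbc-native-2.0.5.1005-1.el6.x86_64.rpm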
Labels: Apache Hadoop, Apache Hive
02-24-2016
09:00 AM
Thank you all! @Geoffrey Shelton Okot @Artem Ervits @Neeraj Sabharwal for the expert solutions here!
02-24-2016
08:59 AM
@Geoffrey Shelton Okot I have checked the configs and they all looked fine, except that I was unable to navigate to http://node1.dtitsupport247.net:50070/webhdfs/ even locally, so I checked the Knox services; I could see the Knox gateway installed and all the users and directories in place. I then did a yum install knox, changed the port from 50070 back to 8020, checked a few config files (also referring to some related issues on the forum), restarted the cluster, and brought all the components up. It's working now! Finally!
02-23-2016
12:06 PM
@Geoffrey Shelton Okot Yes, each of those steps has been followed; the issue is still the same.
02-22-2016
11:45 AM
@Neeraj Sabharwal Yes, the datanode and the namenode are up, but I noticed on the HDFS service page that from the quick links I am unable to see anything on the NameNode UI link; it says the webpage is not available. I guess that is the reason why it is not able to establish a connection, if I am not wrong? What can I do here? My core-site.xml has <value>hdfs://node1.dtitsupport247.net:8020</value>.
Even changing the port number here from 8020 to 50070 does not help; none of the quick links open, it says the page does not exist. I tried running: ps -ef | grep hadoop | grep -P 'namenode|datanode|tasktracker|jobtracker' and I have attached the output: outpt1.txt
02-22-2016
10:21 AM
1 Kudo
@Neeraj Sabharwal Yes, I have been able to successfully bring up the HDFS services, but for MapReduce the history server gives an error. Not sure why I am getting this error: resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X PUT -T /usr/hdp/2.3.4.0-3485/hadoop/mapreduce.tar.gz 'http://node1.dtitsupport247.net:50070/webhdfs/v1/hdp/apps/2.3.4.0-3485/mapreduce/mapreduce.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444' 1>/tmp/tmpZ5Y51c 2>/tmp/tmpnaJYGu' returned 7. curl: (7) Failed connect to node3.dtitsupport247.net:50075; No route to host
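"No route to host" usually points at the firewall rather than Hadoop itself, so this is what I plan to check on node3 (a sketch; assumes CentOS 7 with firewalld):
firewall-cmd --permanent --add-port=50075/tcp   # 50075 is the DataNode HTTP port that webhdfs redirects the upload to
firewall-cmd --reload
curl -v http://node3.dtitsupport247.net:50075    # then re-test from node1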
02-22-2016
10:00 AM
1 Kudo
I clicked Next and completed the installation, but many services are not running. Do you have a link I can refer to? I am not sure what caused the installation failure, so it is very difficult to decide what step to take. I am installing it for the first time.
02-19-2016
03:25 PM
1 Kudo
@Neeraj Sabharwal This is a fresh installation of a cluster; there is no Kerberos configured here. I even refreshed this page, but the same thing happens. I don't want to reconfigure and reinstall; can we do something here?
02-19-2016
03:02 PM
2 Kudos
I get the following screen while installing because of the error. The installation has not completed because ZooKeeper fails to pass the smoke test, halting the subsequent processes and showing "100% complete" with a warning error on the installation page. I have attached the zookeeper.txt file, which shows the error.
Labels: Apache Ambari, Apache Hadoop
02-17-2016
10:13 AM
I too agree! Thank you @Rahul Pathak
02-17-2016
08:03 AM
@Neeraj Sabharwal @Artem Ervits I hope this thread helps in understanding the error.