New Cluster Install (Design Question)

Rising Star

Hi all,

I downloaded and ran the Sandbox, but I want to install Hadoop on a server in a test environment. I tried a standalone Windows installation by following the guide but failed. I think the guide has some missing parts, such as formatting HDFS, configuring Derby with Hive, etc.
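
For reference, the pieces I believe the guide skips look roughly like this (my own sketch from the generic Apache documentation, not verified on the Windows package, so command names may differ there):

# format the NameNode once, before the first start (wipes any existing HDFS metadata)
hdfs namenode -format

# create the Hive metastore schema in the embedded Derby database
schematool -dbType derby -initSchema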

I think I need some help with the architecture. If I install Ambari on a CentOS VM, can I add Windows nodes with Ambari?

Thanks

Regards 😉

Özgür

1 ACCEPTED SOLUTION

Master Mentor
@Özgür Akdemirci

If I install Ambari on a CentOS VM, can I add Windows nodes with Ambari? - No

I recommend building a CentOS image, configuring all the networking pieces, and then adding the nodes.
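
As a rough sketch of those networking pieces on each CentOS node before you register it in Ambari (CentOS 7 commands; the hostnames and IPs below are made up for illustration):

# every node needs a resolvable FQDN; put all nodes in /etc/hosts on all nodes
echo "192.168.56.101  node1.example.com  node1" >> /etc/hosts

# passwordless SSH from the Ambari server to every node
ssh-copy-id root@node1.example.com

# keep clocks in sync; stop the firewall and SELinux from blocking agent registration
yum install -y ntp
systemctl enable ntpd; systemctl start ntpd
systemctl stop firewalld; systemctl disable firewalld
setenforce 0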


14 REPLIES

Master Mentor

@Özgür Akdemirci

I will send a note to the docs team. Thanks for finding this out.

Re: "If I install Ambari on a CentOS VM, can I add Windows nodes with Ambari?"

The answer is no, because there is no Ambari for Windows; the bits are different.

Rising Star

Thank you @Neeraj Sabharwal & @Artem Ervits 😉

Before moving to a cluster, I want to finish the Windows single-node installation successfully. Below are the errors.

Also, what should my next step be when trying Hadoop standalone on Windows? Should I follow the Hello World tutorial for the Sandbox?

I really appreciate your help.

Regards

After running smoke tests, I got errors like:

1- Hive Smoke Test

NestedThrowablesStackTrace: The query returned more than one instance BUT either unique is set to true or only aggregates are to be returned, so should have returned one result maximum
org.datanucleus.store.query.QueryNotUniqueException: The query returned more than one instance BUT either unique is set to true or only aggregates are to be returned, so should have returned one result maximum
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1822)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:243)
    at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:540)
    at org.apache.hadoop.hive.metastore.ObjectStore.convertToMTable(ObjectStore.java:1085)
    at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:813)
    at sun.reflect.GeneratedMethodAccessor37.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
    at com.sun.proxy.$Proxy0.createTable(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1416)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1449)
    at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
    at com.sun.proxy.$Proxy18.create_table_with_environment_context(Unknown Source)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:9200)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:9184)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
    at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)

Run-HiveSmokeTest : Hive Smoke Test: FAILED
At line:1 char:1
+ Run-HiveSmokeTest
+ ~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-HiveSmokeTest

2- HiveServer2 smoke test - drop table, create table and describe table

Connecting to jdbc:hive2://XYZXYZ:10001/
16/01/14 16:59:59 INFO jdbc.Utils: Supplied authorities: XYZXYZ:10001
16/01/14 16:59:59 INFO jdbc.Utils: Resolved authority: XYZXYZ:10001
16/01/14 16:59:59 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://XYZXYZ:10001/
Connected to: Apache Hive (version 1.2.1.2.3.0.0-2557)
Driver: Hive JDBC (version 1.2.1.2.3.0.0-2557)
Transaction isolation: TRANSACTION_REPEATABLE_READ
No rows affected (2.429 seconds)
Beeline version 1.2.1.2.3.0.0-2557 by Apache Hive
Closing: 0: jdbc:hive2://XYZXYZ:10001/
Connecting to jdbc:hive2://XYZXYZ:10001/
16/01/14 17:00:03 INFO jdbc.Utils: Supplied authorities: XYZXYZ:10001
16/01/14 17:00:03 INFO jdbc.Utils: Resolved authority: XYZXYZ:10001
16/01/14 17:00:03 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://XYZXYZ:10001/
Connected to: Apache Hive (version 1.2.1.2.3.0.0-2557)
Driver: Hive JDBC (version 1.2.1.2.3.0.0-2557)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:For direct MetaStore DB connections, we don't support retries at the client level.) (state=08S01,code=1)
Closing: 0: jdbc:hive2://XYZXYZ:10001/

Run-HiveServer2SmokeTest : HiveServer2 Smoke Test: FAILED
At line:1 char:1
+ Run-HiveServer2SmokeTest
+ ~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Run-HiveServer2SmokeTest
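
My own guess (not confirmed anywhere, just how I read the trace) is that both failures point at the Hive metastore: the QueryNotUniqueException on getMDatabase would mean more than one row matched a database name. If anyone wants the check, a query like this against the metastore database should show duplicates (standard Hive metastore tables; run it with whatever SQL client reaches that database):

SELECT NAME, COUNT(*) FROM DBS GROUP BY NAME HAVING COUNT(*) > 1;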

Rising Star
@Neeraj Sabharwal

I really appreciate your useful directions. I gave up on Windows and instead installed a 6-node virtual CentOS 7 cluster (on Windows VirtualBox). The cluster has the following nodes:

1 edge node, 1 NameNode, 1 Secondary NameNode, 3 DataNodes.

It's working now ;) and I can use Ambari without any red errors.
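
For anyone double-checking a similar layout from the command line, the standard HDFS commands should do it (nothing here is specific to my setup):

# should report 3 live DataNodes
hdfs dfsadmin -report

# quick sanity check that HDFS accepts writes
hdfs dfs -mkdir -p /tmp/smoketest
hdfs dfs -ls /tmp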

I began the Hello World tutorial.

In the Hive section of the tutorial, I created the geolocation_stage table with a CREATE statement, and the statement executed without an error. But I could not see my staging table under the default database.

An interesting point: if I use the search database textbox, I can see the staging table.

I also verified the table with a SQL statement against the MySQL hive metastore database: "select * from TBLS;"
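
A slightly more targeted check against the same MySQL metastore (standard metastore tables, so this is just my assumption about how to read them) shows which Hive database the table actually landed in:

SELECT d.NAME AS db_name, t.TBL_NAME
FROM TBLS t
JOIN DBS d ON t.DB_ID = d.DB_ID
WHERE t.TBL_NAME = 'geolocation_stage';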

Am I on the right track? 😄

Thanks

Regards

Özgür

Rising Star

What I meant is that the table I created is not shown under the default database; refresh does not work either.

Master Mentor

@Özgür Akdemirci Could you do me a favor, please? 🙂 If you post this as a new question, it will be more helpful, since it is a new issue. Accept one of the answers in this thread only if it was helpful.

When you post the new question, please include the output of:

hive

show tables;
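
and, if you can, also the output of these (standard HiveQL, just to rule out the table having been created in a different database):

show databases;
use default;
show tables;
describe formatted geolocation_stage;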