Member since: 09-10-2015
Posts: 95
Kudos Received: 166
Solutions: 34
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1316 | 11-04-2016 04:56 PM |
| | 1093 | 10-21-2016 07:13 PM |
| | 2182 | 03-09-2016 07:00 PM |
| | 2562 | 01-28-2016 12:27 AM |
| | 1399 | 12-10-2015 03:09 PM |
07-13-2018
06:59 PM
5 Kudos
A common question when planning a new implementation or an upgrade is "what's supported?" We've traditionally answered this with documentation, but as we release more products and support more platforms, it has become difficult for our customers to understand product interoperability. To solve this problem, we've created a new interactive web application that Hortonworks Support users can use to quickly answer questions like the following:

- Which version of Ambari works with HDF 3.1.2?
- Which Hortonworks products are certified with RHEL 7.4?
- Which databases are supported with both HDP 2.6.4 and HDF 3.1.2?

These questions would usually be tough to answer because multiple documentation sections would have to be referenced and compared, but with the Hortonworks Support Matrix, answers are only a few clicks away. Just navigate to https://supportmatrix.hortonworks.com, log in using your Hortonworks Support Portal credentials, and click one or more products to see what we support. Note that you must have a Hortonworks Support Portal account to access this information.
05-15-2018
05:56 PM
For more details, please refer to this article: https://community.hortonworks.com/articles/188269/javapython-updates-and-ambari-agent-tls-settings.html
05-03-2018
04:43 PM
27 Kudos
JDK/Python Changes Causing Ambari Server/Agent Registration Issues

Recent JDK and Python updates have introduced behavior changes that can affect the Ambari Server to Ambari Agent registration process. The Ambari Server and Ambari Agent use TLS to register with each other securely. Recent JDK and Python releases increase security by eliminating the use of insecure cipher suites and protocols. These changes ensure that more secure protocols and cipher suites are used by the Ambari Server when setting up its TLS sockets, and as a result, the Ambari Agent's Python client must be configured to use a later version of the TLS protocol to communicate with the Ambari Server.

You may be affected by these changes if you've recently upgraded your Python or JDK version, your Ambari Agents are no longer heartbeating back to the Ambari Server, and you see entries that resemble these in your Ambari Agent logs:

```
WARNING 2018-04-24 16:35:10,989 NetUtil.py:124 - Server at https://***.***.***.***:8440 is not reachable, sleeping for 10 seconds...
INFO 2018-04-24 16:35:20,990 NetUtil.py:70 - Connecting to https://***.***.***.***:8440/ca
ERROR 2018-04-24 16:35:20,991 NetUtil.py:96 - EOF occurred in violation of protocol (_ssl.c:579)
ERROR 2018-04-24 16:35:20,991 NetUtil.py:97 - SSLError: Failed to connect. Please check openssl library versions.
```

When debugging this problem by starting the Ambari Server with -Djavax.net.debug=all, you will also see the following in the Ambari Server logs:

```
qtp-ambari-agent-40, fatal error: 40: Client requested protocol TLSv1 not enabled or not supported
javax.net.ssl.SSLHandshakeException: Client requested protocol TLSv1 not enabled or not supported
qtp-ambari-agent-40, SEND TLSv1.2 ALERT: fatal, description = handshake_failure
qtp-ambari-agent-40, WRITE: TLSv1.2 Alert, length = 2
qtp-ambari-agent-40, fatal: engine already closed. Rethrowing javax.net.ssl.SSLHandshakeException: Client requested protocol TLSv1 not enabled or not supported
```
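As a side check, independent of Ambari, you can probe which TLS versions the server socket accepts with openssl (a sketch; replace ambari.example.com with your Ambari Server host):

```bash
# Should complete a handshake when the server permits TLSv1.2
openssl s_client -connect ambari.example.com:8440 -tls1_2 </dev/null

# Should fail with a handshake error once the JDK has disabled TLSv1
openssl s_client -connect ambari.example.com:8440 -tls1 </dev/null
```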
This is a telltale sign that the Ambari Agent is trying to communicate with the Ambari Server using TLSv1 instead of the TLS version mandated by the upgraded JDK, which is TLSv1.2. There are two situations to consider when solving this problem:

1. If you are running CentOS 6 or SLES 11, the bundled version of Python (2.6.x) does not work with TLSv1.2, so you must make changes to your newly updated JDK in order to proceed.
2. If you are running CentOS 7, Debian 7, Ubuntu 14 & 16, or SLES 12, the bundled version of Python (2.7.x) does work with TLSv1.2, so you only have to configure the Ambari Agent to use TLSv1.2 in order to proceed.

Solution for CentOS 7, Debian 7, Ubuntu 14 & 16, or SLES 12 (Python 2.7)

To solve this problem, configure the Ambari Agent to use TLSv1.2 when communicating with the Ambari Server by editing each Ambari Agent's /etc/ambari-agent/conf/ambari-agent.ini file and adding the following property to the security section:

```
[security]
force_https_protocol=PROTOCOL_TLSv1_2
```

Once this configuration change has been made, restart the Ambari Agent. After restarting, you should no longer see the errors in the Ambari Agent logs, and in the Ambari Server UI you'll notice that all Ambari Agents are once again heartbeating.
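On larger clusters you may want to script this change. A minimal sketch, assuming root SSH access to the agent hosts, GNU sed, and placeholder hostnames:

```bash
# Hypothetical host list; run from a node with SSH access to the agents.
for host in agent1.example.com agent2.example.com; do
  # Append the setting under the existing [security] section, then restart the agent
  ssh root@"$host" 'sed -i "/^\[security\]/a force_https_protocol=PROTOCOL_TLSv1_2" /etc/ambari-agent/conf/ambari-agent.ini && ambari-agent restart'
done
```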
Solution for CentOS 6 or SLES 11 (Python 2.6)

In this scenario, the preferred and most secure path forward is to upgrade your operating system to a version that ships Python 2.7. If that cannot be done, the only way forward is to weaken the security of your JDK installation by editing the java.security file in the JDK used by the Ambari Server and making the following changes:

1. Locate the jdk.tls.disabledAlgorithms property and remove the 3DES_EDE_CBC reference.
2. Save the file, and restart the Ambari Server.

If there are any other questions, or special circumstances not covered in this post, please create a support ticket with Hortonworks Support.
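For reference, a sketch of the java.security edit described above; the file path and the exact algorithm list vary by JDK vendor and update, so the values shown are illustrative only:

```bash
# Find the property in the JDK used by the Ambari Server (JDK 8 layout shown)
grep -n "jdk.tls.disabledAlgorithms" "$JAVA_HOME/jre/lib/security/java.security"

# Before (illustrative): jdk.tls.disabledAlgorithms=SSLv3, RC4, MD5withRSA, DH keySize < 1024, 3DES_EDE_CBC
# After:                 jdk.tls.disabledAlgorithms=SSLv3, RC4, MD5withRSA, DH keySize < 1024

# Restart the Ambari Server to pick up the change
ambari-server restart
```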
01-17-2017
09:33 PM
HTTPS proxy support was released in SmartSense 1.3.0. For more information, see the following links:

- http://docs.hortonworks.com/HDPDocuments/SS1/SmartSense-1.3.1/bk_installation/content/bundle_transport.html
- http://docs.hortonworks.com/HDPDocuments/SS1/SmartSense-1.3.1/bk_installation/content/HTTPS_upload.html
11-04-2016
04:56 PM
2 Kudos
Hey @Mark Miller, we are not planning to ship Solr 6.x until late next year.
10-21-2016
07:13 PM
1 Kudo
In this case it sounds like you're looking to 'take over' an existing Solr instance. Unfortunately, that is not something we support at the moment. The Ambari Infra Solr instance is used only by our components that depend on Solr to index data (Atlas, Ranger, Log Search Tech Preview), so you should consider these Solr instances restricted for use by HDP Stack components only.
07-06-2016
04:24 PM
13 Kudos
Hortonworks Data Platform Artifacts

Developing solutions with Hadoop commonly requires the use of several different HDP component libraries. Whether you're building solutions with Pig, Spark, Cascading, or HBase, at some point extensions will need to be created, and the artifacts for each component will need to be used.
This guide serves as an overview of where to find those artifacts and how to get them quickly integrated with your preferred build tool and IDE.

Repositories

At Hortonworks, we store all of our artifacts in a public Sonatype Nexus repository. That repository can be easily accessed and searched for commonly used library, source code, and javadoc archives simply by navigating to http://repo.hortonworks.com.

Artifacts

Jar files containing compiled classes, sources, and javadocs are all available in our public repository, and finding the right artifact with the right version is as easy as searching the repository for the classes you need to resolve. For example, if you are creating a solution that requires a class such as org.apache.hadoop.fs.FileSystem, you can simply search our public repository for the artifact that contains that class using the search capabilities available through http://repo.hortonworks.com. Searching for that class will locate the hadoop-common artifact, which is part of the org.apache.hadoop group. There will be multiple artifacts, each with a different version. Artifacts in our repository use a seven-digit version scheme plus a build number. So if we're looking at the 2.7.1.2.3.2.0-2650 version of this artifact:
- The first three digits (2.7.1) signify the Apache Hadoop base version.
- The next four digits (2.3.2.0) signify the Hortonworks Data Platform release.
- The number after the hyphen (2650) signifies the build number.

As you look for the right artifact, it's important to use the artifact version that corresponds to the HDP version you plan to deploy to. You can determine this by running hdp-select versions from the command line, or in Ambari by navigating to Admin > Stack and Versions. If neither of these is available in your version of HDP or Ambari, you can use yum, zypper, or dpkg to query the RPM or Debian packages installed for HDP and note their versions.
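For example, on a cluster node (a sketch; hdp-select ships with HDP, and the package query shown is RHEL/CentOS-specific):

```bash
# Print the installed HDP version(s), e.g. 2.3.2.0-2650
hdp-select versions

# Fallback when hdp-select is unavailable: note versions from the installed packages
rpm -qa | grep -i hadoop
```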
Once the right artifact has been found with the version that corresponds to your target HDP environment, it's time to configure your build tool to both resolve our repository and include the artifact as a dependency. The following sections outline how to do both with commonly used build tools: Maven, SBT, and Gradle.

Maven Setup

Apache Maven is an incredibly flexible build tool used by many Hadoop ecosystem projects. In this section we outline the updates to your project's pom.xml file that are required to start resolving HDP artifacts.

Repository Configuration

The pom.xml file enables flexible definition of project dependencies and build procedures. To add the Hortonworks repository to your project, allowing HDP artifacts to be resolved, edit the <repositories/> section and add a <repository/> entry as illustrated below:

```xml
<repositories>
  <repository>
    <id>HDP</id>
    <name>HDP Releases</name>
    <url>http://repo.hortonworks.com/content/repositories/releases/</url>
  </repository>
</repositories>
```
Artifact Configuration

Dependencies are added to Maven using the <dependency/> tag within the <dependencies/> section of the pom.xml. To add a dependency such as hadoop-common, add this fragment:

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>2.7.1.2.3.2.0-2650</version>
</dependency>
```
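As a quick sanity check that the artifact resolves from the Hortonworks repository, you can inspect the project's dependency graph (a sketch, run from the project root; dependency:tree is a standard Maven dependency plugin goal):

```bash
# Show how hadoop-common resolves in the project's dependency graph
mvn dependency:tree -Dincludes=org.apache.hadoop:hadoop-common
```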
Once both the repository has been added to the <repositories/> section and the artifacts have been added to the <dependencies/>, a simple mvn compile can be issued from the base directory of your project to ensure that proper syntax has been used and the appropriate dependencies are downloaded.

Source & Javadoc

When using Maven with an IDE, it is often helpful to have the accompanying javadoc and source code. To obtain both from our repository for the artifacts defined in your pom.xml, run the following commands from the base directory of your project:

```bash
mvn dependency:sources
mvn dependency:resolve -Dclassifier=javadoc
```

SBT Setup

The Scala Build Tool is commonly used with Scala-based projects, and provides simple configuration and many flexible options for dependency and build management.

Repository Configuration

In order for SBT projects to resolve Hortonworks Data Platform dependencies, an additional resolvers entry must be added to your build.sbt file, or equivalent, as illustrated below:

```scala
resolvers += "Hortonworks Releases" at "http://repo.hortonworks.com/content/repositories/releases/"
```

Artifact Configuration

Dependencies can be added to SBT's libraryDependencies as illustrated below:

```scala
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.1.2.3.2.0-2650"
```

To explicitly ask SBT to also download source code and javadocs, an alternate notation can be used:

```scala
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.1.2.3.2.0-2650" withSources() withJavadoc()
```

Once both the repository has been added to resolvers and the artifacts have been added to libraryDependencies, a simple sbt compile can be issued from the base directory of your project to ensure that proper syntax has been used and the appropriate dependencies are downloaded.

Gradle Setup

The Gradle build management tool is used frequently in open source Java projects, and provides a simple Groovy-based DSL for project dependency and build definition.

Plugin Configuration

Gradle uses plugins to add new tasks, domain objects, and conventions to your Gradle build. Add the following plugins to your build.gradle file, or equivalent, as illustrated below:

```groovy
apply plugin: 'java'
apply plugin: 'maven'
apply plugin: 'idea'    // Pick the IDE appropriate for you
apply plugin: 'eclipse' // Pick the IDE appropriate for you
```
Repository Configuration

In order for Gradle projects to resolve Hortonworks Data Platform dependencies, an additional entry must be added to your build.gradle file, or equivalent, as illustrated below:

```groovy
repositories {
  maven { url "http://repo.hortonworks.com/content/repositories/releases/" }
}
```
Artifact Configuration

Dependencies can be added to Gradle's dependencies section as illustrated below:

```groovy
dependencies {
  compile group: "org.apache.hadoop", name: "hadoop-common", version: "2.7.1.2.3.2.0-2650"
}
```

To have Gradle download the accompanying javadoc and source code for use with your IDE, configure the corresponding IDE plugin:

```groovy
idea { // Pick the IDE appropriate for you
  module {
    downloadJavadoc = true
    downloadSources = true
  }
}

eclipse { // Pick the IDE appropriate for you
  classpath {
    downloadSources = true
    downloadJavadoc = true
  }
}
```
Once both the repositories and the dependencies have been added to the build file, a simple gradle clean build can be issued from the base directory of your project to ensure that proper syntax has been used and the appropriate dependencies are downloaded.
03-23-2016
02:35 PM
Can you try restarting ambari-server and running the Kerberos wizard again, this time specifying the correct information on the first attempt?
03-23-2016
01:39 PM
1 Kudo
We've recently found an issue where a case-insensitive match is not performed on the groupMembershipAttr. So if you have groupMembershipAttr=CN, it won't match any objects in the directory because the wrong LDAP search filter is applied. Can you check that and set groupMembershipAttr=cn instead?
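If it helps to double-check, here is a sketch assuming the setting lives in Ambari's ambari.properties (the file location and property name are assumptions; adjust for your environment):

```bash
# Inspect the configured group membership attribute (hypothetical location)
grep -i "groupMembershipAttr" /etc/ambari-server/conf/ambari.properties

# After changing CN to cn, restart the Ambari Server so the new filter takes effect
ambari-server restart
```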
03-17-2016
04:05 PM
1 Kudo
Which directory server are you using?