Member since: 05-24-2017
Posts: 9
Kudos Received: 1
Solutions: 0
10-16-2018
11:50 PM
Is it possible to batch insert into a table that contains a field of type array<string>? What is the syntax for that?
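A minimal sketch of one common way to do this, assuming the table is a Hive table such as CREATE TABLE target (id INT, tags ARRAY<STRING>); the connection URL, table, and column names below are placeholders. Hive does not accept complex-type constructors in INSERT ... VALUES, so a batch load is usually staged through a plain text table and the array built with split():

beeline -u jdbc:hive2://sandbox.hortonworks.com:10000 -e "
  -- staging table, loaded from the raw files:
  --   CREATE TABLE staging (id INT, tags_csv STRING)
  --   ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';
  INSERT INTO TABLE target
  SELECT id, split(tags_csv, ',') FROM staging;
"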
02-25-2018
03:54 AM
I am trying to configure the HDP 2.6.4 VM sandbox. I need to open up port 9997 (the Accumulo TServer port), since it isn't open by default. I tried to follow the port forwarding instructions below, but I can't get past the first step: I get a connection refused when trying to connect to port 2122. What am I doing wrong?
ssh -p 2122 root@sandbox.hortonworks.com
https://hortonworks.com/tutorial/sandbox-port-forwarding-guide/section/2/
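A hedged debugging sketch, assuming the VirtualBox build of the sandbox; the VM name below is a guess, so use whatever VBoxManage list vms prints. A connection refused on 2122 usually means no NAT rule maps that host port to the VM's sshd, so listing the rules is the quickest first check, and a host-to-VM forward for 9997 can be added while the VM is running (the Docker container inside the sandbox VM still has to expose 9997, which is what the later steps of the guide cover):

VBoxManage list vms
VBoxManage showvminfo "Hortonworks Docker Sandbox HDP" | grep -i rule
VBoxManage controlvm "Hortonworks Docker Sandbox HDP" natpf1 "accumulo-tserver,tcp,,9997,,9997"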
Tags: Hadoop Core, hdp-2.6
Labels: Hortonworks Data Platform (HDP)
02-21-2018
02:25 AM
@Jay Kumar SenSharma Yes, I have followed the steps on that link. When I run the "ambari-server install-mpack..." command, it fails with the error "...the file already exists."
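A hedged sketch of what usually clears that error: a previous attempt left the mpack files on disk, so either force the reinstall or remove the old mpack first. The path and mpack name below are placeholders, and the --force option depends on the Ambari version in the sandbox:

ambari-server install-mpack --mpack=/path/to/your-mpack.tar.gz --force --verbose
# or remove the earlier copy, then install again:
ambari-server uninstall-mpack --mpack-name=<mpack-name> --verbose
ambari-server install-mpack --mpack=/path/to/your-mpack.tar.gz --verbose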
02-21-2018
02:04 AM
1 Kudo
I get the following error when installing Solr via Ambari on the HDP 2.6.4 Sandbox. Any ideas?
...
2018-02-21 01:22:15,539 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-02-21 01:22:15,575 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-21 01:22:15,638 - Skipping installation of existing package unzip
2018-02-21 01:22:15,638 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-21 01:22:15,647 - Skipping installation of existing package curl
2018-02-21 01:22:15,647 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-21 01:22:15,655 - Skipping installation of existing package hdp-select
2018-02-21 01:22:15,657 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-02-21 01:22:15,657 - Skipping stack-select on SOLR because it does not exist in the stack-select package structure.
2018-02-21 01:22:15,831 - Using hadoop conf dir: /usr/hdp/2.6.4.0-91/hadoop/conf
2018-02-21 01:22:15,832 - Package['lucidworks-hdpsearch'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-02-21 01:22:15,892 - Installing package lucidworks-hdpsearch ('/usr/bin/yum -d 0 -e 0 -y install lucidworks-hdpsearch')
2018-02-21 01:22:21,791 - Execution of '/usr/bin/yum -d 0 -e 0 -y install lucidworks-hdpsearch' returned 1. Error: Nothing to do
2018-02-21 01:22:21,791 - Failed to install package lucidworks-hdpsearch. Executing '/usr/bin/yum clean metadata'
2018-02-21 01:22:22,005 - Retrying to install package lucidworks-hdpsearch after 30 seconds
2018-02-21 01:23:39,626 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-02-21 01:23:39,627 - Skipping stack-select on SOLR because it does not exist in the stack-select package structure.
Command failed after 1 tries
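"Error: Nothing to do" from yum generally means the package is either already installed (for example by an earlier attempt) or not visible in any enabled repo. A hedged sketch of checks to run on the sandbox before retrying the install from Ambari:

yum list installed | grep -i lucidworks   # left behind by a previous try?
yum clean all                             # drop stale repo metadata
yum list available | grep -i lucidworks   # is any enabled repo actually serving it?
# if an earlier copy shows up as installed, removing it lets the Ambari retry succeed:
yum remove -y lucidworks-hdpsearch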
06-02-2017
11:38 PM
Yes. Here's the exact curl command I used:
curl -v 'http://sandbox.hortonworks.com:8983/solr/test_core/update?commit=true&separator=%09&escape=%5c' --data-binary @/opt/solr/ingestion/data/test/data_tab.txt -H 'Content-type:application/csv'
06-01-2017
05:02 PM
Yes, I did try the raw values. With an actual tab as the separator and \ as the escape, I got the error "invalid char between encapsulated token end delimiter". I also tried \t for the separator, which resulted in the error "invalid separator:'\t'".
05-31-2017
10:24 PM
I'm trying to ingest a tab-delimited file using the PutSolrContentStream processor. Following the example at https://cwiki.apache.org/confluence/display/solr/Uploading+Data+with+Index+Handlers, I added two properties to the processor: separator with a value of %09 and escape with a value of %5c. That failed with the error "Invalid separator:'%09'". I was able to ingest the file successfully using the curl command from the wiki page. Does anyone have ideas on how to configure PutSolrContentStream to work with tab-delimited files?
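A hedged observation plus a quick check: the "Invalid separator:'%09'" message suggests Solr received the literal characters %09 rather than a tab, which would mean the processor passes dynamic property values as-is (curl only needs %09 because the value sits inside a URL); that is consistent with the raw-value attempt above producing a different, CSV-level error. One quick check is to confirm the file itself really is tab-delimited by dumping its raw bytes; the path is the one from the curl example above:

head -1 /opt/solr/ingestion/data/test/data_tab.txt | od -c | head
# tabs show up as \t in the od -c output; anything else means the file is not truly tab-delimited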
Labels: Apache NiFi
05-24-2017
12:36 AM
The ListFile processor detects all files in a directory on startup, but files added afterwards are not always detected. Does anyone know why this would happen? The new files have different names, and all files are owned by the same user, so it isn't a permissions issue. To give more detail: initially there were 255 files, which were all detected. I added 220 new files, but only 45 of those were detected. I then added another 230 files, but only 18 of those were detected. Any files added after that were not detected at all...
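A hedged hypothesis worth checking: ListFile keeps state on the most recent modification timestamp it has listed and skips files whose mtime is not newer than that, so files copied in with their original (older) timestamps preserved can be silently ignored. Comparing the mtimes of detected vs. undetected files is a quick test (the directory path below is a placeholder):

stat -c '%y %n' /path/to/listfile/dir/* | sort
# if the undetected files all carry mtimes older than the newest already-listed file,
# touching them (or copying without preserving timestamps) should let ListFile pick them up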
Labels: Apache NiFi