Member since: 08-21-2013
Posts: 146
Kudos Received: 25
Solutions: 34
My Accepted Solutions
Title | Views | Posted
---|---|---
| 3114 | 10-24-2016 10:43 AM
| 6911 | 03-13-2016 02:15 PM
| 3554 | 12-11-2015 01:48 AM
| 3015 | 11-23-2015 12:11 PM
| 2776 | 07-06-2015 10:40 AM
05-01-2019 12:27 PM
Can you please share your morphlines.conf? I am stuck in a similar situation.
07-11-2018 11:07 PM
While reading a Parquet file, how can I convert the Parquet DECIMAL datatype to a String?
04-12-2018 06:33 PM
In fact, we can use Jackson to solve this problem, and it is universal for any JSON data:

morphlines: [
  {
    id: convertJsonToAvro
    importCommands: [ "org.kitesdk.**" ]
    commands: [
      # read the JSON blob
      { readJson: {} }

      # extract all top-level JSON fields into record fields via inline Java
      {
        java {
          imports: """
            import com.fasterxml.jackson.databind.ObjectMapper;
            import org.kitesdk.morphline.base.Fields;
            import java.io.IOException;
            import java.util.Map;
            import java.util.Set;
          """
          code: """
            String jsonStr = record.getFirstValue(Fields.ATTACHMENT_BODY).toString();
            ObjectMapper mapper = new ObjectMapper();
            Map<String, Object> map = null;
            try {
              map = (Map<String, Object>) mapper.readValue(jsonStr, Map.class);
            } catch (IOException e) {
              e.printStackTrace();
            }
            if (map == null) {
              // nothing parsed; pass the record through unchanged
              return child.process(record);
            }
            Set<String> keySet = map.keySet();
            for (String key : keySet) {
              record.put(key, map.get(key));
            }
            return child.process(record);
          """
        }
      }

      # convert the extracted fields to an Avro object
      # described by the schema in this file
      { toAvro {
        schemaFile: /etc/flume/conf/a1/like_user_event_realtime.avsc
      } }

      # { logInfo { format: "loginfo: {}", args: ["@{}"] } }

      # serialize the object as containerless binary Avro
      { writeAvroToByteArray: {
        format: containerlessBinary
      } }
    ]
  }
]
02-18-2018 01:05 AM
Please share the final command you are executing to index. I am getting the following exception:

2018-02-18 14:54:01,759 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow start threshold not met. completedMapsForReduceSlowstart 192
2018-02-18 14:54:01,765 ERROR [IPC Server handler 15 on 61527] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1516267882526_0659_m_000002_3 - exited : org.kitesdk.morphline.api.MorphlineRuntimeException: java.lang.IllegalArgumentException: Illegal character in scheme name at index 0: 2018-01-10 05:31:10,2,100,100,12,1,1515542470144,311480275243412,18052600405,5808,,310,590,6,190370299670,513,,334,020,7,52941000779800,513,,334,020,7,52941000779800,0,2,3,0,,6,,1,0,52941000779800,,190370299070,0,0,,0,0,,0,0,,0,0,0,0
at org.kitesdk.morphline.base.FaultTolerance.handleException(FaultTolerance.java:73)
at org.apache.solr.hadoop.morphline.MorphlineMapRunner.map(MorphlineMapRunner.java:220)
at org.apache.solr.hadoop.morphline.MorphlineMapper.map(MorphlineMapper.java:86)
at org.apache.solr.hadoop.morphline.MorphlineMapper.map(MorphlineMapper.java:54)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.IllegalArgumentException: Illegal character in scheme name at index 0: 2018-01-10 05:31:10,2,100,100,12,1,1515542470144,311480275243412,18052600405,5808,,310,590,6,190370299670,513,,334,020,7,52941000779800,513,,334,020,7,52941000779800,0,2,3,0,,6,,1,0,52941000779800,,190370299070,0,0,,0,0,,0,0,,0,0,0,0
at java.net.URI.create(URI.java:852)
at org.apache.solr.hadoop.PathParts.stringToUri(PathParts.java:128)
at org.apache.solr.hadoop.PathParts.<init>(PathParts.java:48)
at org.apache.solr.hadoop.morphline.MorphlineMapRunner.map(MorphlineMapRunner.java:192)
... 10 more
Caused by: java.net.URISyntaxException: Illegal character in scheme name at index 0: 2018-01-10 05:31:10,2,100,100,12,1,1515542470144,311480275243412,18052600405,5808,,310,590,6,190370299670,513,,334,020,7,52941000779800,513,,334,020,7,52941000779800,0,2,3,0,,6,,1,0,52941000779800,,190370299070,0,0,,0,0,,0,0,,0,0,0,0
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.checkChars(URI.java:3021)
at java.net.URI$Parser.checkChar(URI.java:3031)
at java.net.URI$Parser.parse(URI.java:3047)
at java.net.URI.<init>(URI.java:588)
at java.net.URI.create(URI.java:850)
... 13 more

Command:

hadoop jar /opt/cloudera/parcels/CDH-5.10.2-1.cdh5.10.2.p0.5/lib/solr/contrib/mr/search-mr-1.0.0-cdh5.10.2-job.jar org.apache.solr.hadoop.MapReduceIndexerTool --solr-home-dir /var/lib/hadoop-hdfs/senario_config --morphline-file /var/lib/hadoop-hdfs/senario_config/conf/morphline.conf --output-dir hdfs://10.10.16.134:8020/solr/senario_collection/core_node1/data/index --input-list hdfs://10.10.16.134:8020/user/hdfs/AMIT/nwsecp-emaster-20180115-175804.log --shards 1
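A likely cause, judging from the stack trace: --input-list expects a file whose lines are HDFS URIs, but here it points at the raw .log data file, so the tool tries to parse each CSV record as a URI. A minimal sketch of why java.net.URI rejects such a line (the sample strings are taken from the command and stack trace above):

```java
import java.net.URI;

public class InputListUriCheck {
    public static void main(String[] args) {
        // A proper --input-list entry parses cleanly as a URI with a scheme:
        URI ok = URI.create("hdfs://10.10.16.134:8020/user/hdfs/AMIT/nwsecp-emaster-20180115-175804.log");
        System.out.println(ok.getScheme()); // prints "hdfs"

        // A raw CSV record starts with a digit, which is illegal as the first
        // character of a URI scheme, hence "Illegal character in scheme name at index 0":
        try {
            URI.create("2018-01-10 05:31:10,2,100,100,12,1");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

If that is the issue, either pass the data file as a regular input argument (or via --input-dir) instead of --input-list, or make the --input-list file contain one HDFS URI per line.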
10-25-2016 01:12 AM
Fantastic! Thanks, whosch.
03-13-2016 03:10 PM
Excellent, that solved the problem, along with some unknown fields in the schema.xml, for which I have now used sanitizeUnknownSolrFields in the morphline. Thanks for your help.
02-28-2016 03:24 AM
How did you make it work? Could you describe it in detail? Thanks a lot. ^_^
12-11-2015 01:48 AM
On YARN the parameters are called mapreduce.map.java.opts and mapreduce.reduce.java.opts. Wolfgang.
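A sketch of how these might be set in mapred-site.xml; the -Xmx values are illustrative assumptions, not from the original post:

```xml
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx2g</value>
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx2g</value>
</property>
```

The same properties can also be passed per job, e.g. with -D mapreduce.map.java.opts=-Xmx2g on the hadoop jar command line.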
11-23-2015 11:00 PM
Custom morphline commands are deployed by adding the jar with the custom code to the hbase-indexer Java classpath. The morphline runs inside the hbase-indexer processes which are separate from the hbase processes. It has no impact on the stability of the hbase service.
07-09-2015 06:03 AM
Thanks for the quick response. After I enable the Key-Value Store Indexer service and edit the service-wide morphlines (textbox) config as you suggested above, should I still maintain the morphlines.conf file in the /etc/hbase-solr/conf directory that I created for batch indexing? In other words, if I create morphlinesX.conf, morphlinesY.conf, and morphlinesZ.conf, should I also update the service-wide morphlines configuration on the KV Store Indexer?

My observation is that when I have those 3 morphlines files in /etc/hbase-solr/conf and enable the KV Store Indexer service with the default configuration, the corresponding 3 collections become active and start generating Solr index documents.

Further, after updating KV Store Indexer service --> Configuration --> Service-Wide --> Morphlines --> morphlines file (textbox) and deploying the client configuration, where is it updated? Where can I verify the deployed configuration from the CM admin console? Please clarify.