Member since: 05-24-2018
Posts: 25
Kudos Received: 1
Solutions: 0
06-26-2019
01:29 PM
A Hive query gives null values when trying to pull data from JSON that has an @date field. Below are the sample data and the query I am using.

Sample data:
{"name":"jai","@date":"2015-06-15"}
{"name":"pri","@date":"2017-08-25"}

DDL and query:
CREATE TABLE json_obj(json string) LOCATION '/user/*************/';

select get_json_object(json_obj.json,'$.name') as name,
       get_json_object(json_obj.json,'$.@date') as date
from json_obj;

Output:
name   date
jai    null
pri    null

Expected output:
name   date
jai    2015-06-15
pri    2017-08-25
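For reference, a commonly suggested workaround (not verified against this exact table) is json_tuple, which looks up literal field names instead of using get_json_object's JSONPath syntax, so a key such as "@date" may still be readable. A minimal sketch, assuming a Hive-enabled SparkSession that can see the json_obj table; the same LATERAL VIEW query can also be run directly in Hive.

# Hedged sketch: json_tuple treats its arguments as literal key names, so it is
# not restricted by the characters JSONPath accepts. Assumes an existing
# SparkSession with Hive support and access to the json_obj table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

df = spark.sql("""
    SELECT t.name, t.dt AS `date`
    FROM json_obj
    LATERAL VIEW json_tuple(json_obj.json, 'name', '@date') t AS name, dt
""")
df.show()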
Labels: Apache Hive
06-18-2018
07:35 AM
Hi Snadeep, did you find a solution for this? If so, can you please share the query here?
06-16-2018
06:37 AM
Hi @Rajkumar Singh, were you able to fix the problem? I have the same requirement and am also getting null values. Can you share the DDL and the select query for this?
06-13-2018
03:20 PM
Is there any way to check the count of versions kept for a column family in HBase?
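A hedged sketch of one way to look at this from a client; the host, table, row and column names below are placeholders, not from the original post. HappyBase can return every stored version of a cell, so counting the returned cells shows how many versions HBase actually kept, while the VERSIONS attribute of the column family itself is visible with describe 'tablename' in the HBase shell.

# Hedged sketch using the HappyBase Thrift client; host, table, row and column
# are placeholders. Requires a running HBase Thrift server.
import happybase

connection = happybase.Connection('hbase-thrift-host')  # placeholder host
table = connection.table('testtable')                   # placeholder table

# All stored versions of one cell (up to 'versions'); the length of the list is
# the number of versions HBase actually kept for that cell.
cells = table.cells(b'row1', b'cf:a2', versions=100, include_timestamp=True)
print('versions stored for this cell:', len(cells))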
Labels: Apache HBase
05-24-2018
04:26 PM
Hi, I have a use case where I need to load JSON data into HBase using PySpark, with a row key and 3 column families. Can anyone please help me with how to do this? Below is the JSON I want to load.

{
  "ticid": "1496",
  "ticlocation": "vizag",
  "custnum": "222",
  "Comments": {
    "comment": [
      {"commentno": "1", "desc": "journey", "passengerseat": {"intele": "09"}, "passengerloc": {"intele": "s15"}},
      {"commentno": "5", "desc": " food", "passengerseat": {"intele": "09"}, "passengerloc": {"intele": "s15"}},
      {"commentno": "12", "desc": " service", "passengerseat": {"intele": "09"}, "passengerloc": {"intele": "s15"}}
    ]
  },
  "Rails": {
    "Rail": [
      {"Traino": "AP1545", "startcity": "vizag", "passengerseat": "5"},
      {"Traino": "AP1555", "startcity": "HYD", "passengerseat": "15A"}
    ]
  }
}

Mapping:
- ticid is the row key
- ticlocation and custnum need to be in column family 1
- Comments needs to be column family 2
- Rails needs to be column family 3
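A minimal sketch of one possible approach, assuming a file of such records with one JSON object per line, an existing HBase table with column families cf1, cf2 and cf3, and a reachable HBase Thrift server. The table name, Thrift host and file path are placeholders, not from the original post, and storing the nested sections as JSON strings is only one of several reasonable layouts.

# Hedged sketch: read the JSON with PySpark, then write each record to HBase
# through HappyBase. Host, table and path are placeholders; cf1/cf2/cf3 must exist.
import json
import happybase
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
records = spark.read.json('/user/placeholder/tickets.json')  # one JSON object per line

def write_partition(rows):
    connection = happybase.Connection('hbase-thrift-host')   # placeholder host
    table = connection.table('tickets')                      # placeholder table
    for row in rows:
        data = row.asDict(recursive=True)
        rowkey = data['ticid'].encode('utf-8')
        table.put(rowkey, {
            b'cf1:ticlocation': data['ticlocation'].encode('utf-8'),
            b'cf1:custnum': data['custnum'].encode('utf-8'),
            b'cf2:comments': json.dumps(data['Comments']).encode('utf-8'),
            b'cf3:rails': json.dumps(data['Rails']).encode('utf-8'),
        })
    connection.close()

records.foreachPartition(write_partition)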
Labels: Apache HBase
05-07-2018
03:47 PM
Hi @Shu, thanks for the help. I have a JSON file in the format below. I am creating the DDL, which works well, but when I do a select query it throws an error.

JSON file:
{"purchaseid": {"ticid": "1496","ticlocation": "vizag","custnum": "222","Travleinfo": {"Trav": {"fname": "ramu","mname": "g","freq": {"fre": {"frequencynumber": "9","frequnecystatus": "na"}},"food": {"foodpref": [{"foodcode": "9","foodcodeSegment": "chic"},{"foodcode": "22","foodcodeSegment": "veg"},{"foodcode": "36","foodcodeSegment": "idl"}] },"Seats": { "Seat": [{"seatberth": "upper","loc": "s15"},{"seatberth": "lower","loc": "s215"},{"seatberth": "upper","loc": "s3"}] },"stations": { "station": [{"sationname": "vizag","trainnum": "c197"},{"sationname": "hyd","trainnum": "hyd187"},{"sationname": "wrgl","trainnum": "wr1822"}]}}},"Comments": {"comment": [{"commentno": "1","desc": "journey","passengerseat": { "intele": "09" },"passengerloc": { "intele": "s15" }},{"commentno": "5","desc": " food","passengerseat": { "intele": "09" },"passengerloc": { "intele": "s15" }},{"commentno": "12","desc": " service","passengerseat": { "intele": "09" },"passengerloc": { "intele": "s15" } }]}}}

DDL:
Create EXTERNAL TABLE SRGMSBI1417.json14_07_05_01(
  purchaseid struct<ticid:string,ticlocation:string,custnum:string,
    Travleinfo:struct<
      trav:struct<fname:string,lname:string,mname:string,
        freq:struct<fre:array<struct<frequencynumber:string,frequnecystatus:string>>>,
        food:struct<foodpref:array<struct<foodcode:string,foodcodeSegment:string>>>,
        Seats:struct<Seat:array<struct<seatberth:string,loc:string>>>,
        stations:struct<station:array<struct<sationname:string,trainnum:string>>>
      >>,
    Comments:struct<Comment:array<struct<commentno:string,desc:string,passengerseat:struct<intele:string>,passengerloc:struct<intele:string>>>>
  >
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
location '/user/srgmsbi1417/json14_07_05_01';

Error:
Failed with exception java.io.IOException:org.apache.hadoop.hive.serde2.SerDeException: java.io.IOException: Start of Array
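One hedged way to narrow down where the DDL and the data disagree (not a fix in itself) is to let Spark infer the schema from the same file and compare it field by field with the hand-written struct above; the path below is the one given in the DDL's location clause.

# Hedged sketch: compare Spark's inferred schema with the hand-written Hive struct.
# The path is taken from the DDL location above.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = spark.read.json('/user/srgmsbi1417/json14_07_05_01')
df.printSchema()   # shows which fields Spark sees as struct vs array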
05-07-2018
11:32 AM
@Shu Thanks for the help. I have a JSON file in the format below. I am creating the DDL, which works well, but when I do a select query it throws an error.

JSON file:
{"purchaseid": {"ticid": "1496","ticlocation": "vizag","custnum": "222","Travleinfo": {"Trav": {"fname": "ramu","mname": "g","freq": {"fre": {"frequencynumber": "9","frequnecystatus": "na"}},"food": {"foodpref": [{"foodcode": "9","foodcodeSegment": "chic"},{"foodcode": "22","foodcodeSegment": "veg"},{"foodcode": "36","foodcodeSegment": "idl"}] },"Seats": { "Seat": [{"seatberth": "upper","loc": "s15"},{"seatberth": "lower","loc": "s215"},{"seatberth": "upper","loc": "s3"}] },"stations": { "station": [{"sationname": "vizag","trainnum": "c197"},{"sationname": "hyd","trainnum": "hyd187"},{"sationname": "wrgl","trainnum": "wr1822"}]}}},"Comments": {"comment": [{"commentno": "1","desc": "journey","passengerseat": { "intele": "09" },"passengerloc": { "intele": "s15" }},{"commentno": "5","desc": " food","passengerseat": { "intele": "09" },"passengerloc": { "intele": "s15" }},{"commentno": "12","desc": " service","passengerseat": { "intele": "09" },"passengerloc": { "intele": "s15" } }]}}}

DDL:
Create EXTERNAL TABLE SRGMSBI1417.json14_07_05_01(
purchaseid struct<ticid:string,ticlocation:string,custnum :string,
Travleinfo:struct<
trav:struct<fname:string,lname:string,mname:string,
freq :struct<fre:array<struct<frequencynumber:string,frequnecystatus:string>>>,
food :struct<foodpref:array<struct<foodcode:string,foodcodeSegment:string>>>,
Seats :struct<Seat:array<struct<seatberth:string,loc:string>>>,
stations :struct<station:array<struct<sationname:string,trainnum:string>>>
>>,
Comments :struct<Comment:array<struct<commentno:string,desc:string,passengerseat :struct<intele :string>,passengerloc :struct<intele :string>
>>>
>
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
location '/user/srgmsbi1417/json14_07_05_01';

Error:
Failed with exception java.io.IOException:org.apache.hadoop.hive.serde2.SerDeException: java.io.IOException: Start of Array
05-01-2018
05:23 AM
Hi @Shu, I am still getting the same error. Please find the attached error screenshot: error.png
05-01-2018
04:18 AM
Hi, I am trying to create a Hive table on top of a JSON data file but am getting the error below. Here are my JSON file, Hive DDL, and the error.

JSON file:
{
"purchaseid": {
"ticketnumber": "23546852222",
"location": "vizag",
"Travelerhistory": {
"trav": {
"fname": "ramu",
"lname": "gogi",
"travelingarea": {
"destination": {
"stationid": "KAJKL",
"stationname": "hyd"
}
},
"food": {
"foodpref": [{
"foodcode": "CK567",
"foodcodeSegment": "NOVEG"
},
{
"foodcode": "MM98",
"foodcodeSegment": "VEG"
}
]
}
}
}
}
}

Hive DDL:
add jar /home/**********/json-serde-1.3.7-jar-with-dependencies.jar;
CREATE external TABLE ds1414(
ticketnumber string,
location string,
Travelerhistory ARRAY<struct<trav :struct<fname:string,lname:string,
travelingarea :ARRAY<struct<destination :struct<stationid:string,stationname:string>>>,
food :ARRAY<struct<foodpref :struct<foodcode:string,foodcodeSegment:string>>>
>>>
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
location '/user/***********/json2/'
;

Error:
select * from ds1414;
OK
Failed with exception java.io.IOException:org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: expected close marker for OBJECT (from [Source: java.io.ByteArrayInputStream@205df5dc; line: 1, column: 0])
at [Source: java.io.ByteArrayInputStream@205df5dc; line: 1, column: 3]
Time taken: 0.212 seconds
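The Hive JSON SerDes generally expect one complete JSON document per line, so a pretty-printed file like the one above tends to produce exactly this "expected close marker for OBJECT" error. A small sketch of one way to compact the file before loading it; the input and output file names are placeholders.

# Hedged sketch: rewrite a pretty-printed JSON file as a single line so that a
# line-oriented Hive JSON SerDe can parse it. File names are placeholders.
import json

with open('purchase_pretty.json') as src:
    record = json.load(src)                 # the whole multi-line document

with open('purchase_oneline.json', 'w') as dst:
    dst.write(json.dumps(record) + '\n')    # one record per line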
Labels: Apache Hive
05-01-2018
04:10 AM
Hi, I have JSON as below:
{
"purchaseid": {
"ticketnumber": "23546852222",
"location": "vizag",
"Travelerhistory": {
"trav": {
"fname": "ramu",
"lname": "gogi",
"travelingarea": {
"destination": {
"stationid": "KAJKL",
"stationname": "hyd"
}
},
"food": {
"foodpref": [{
"foodcode": "CK567",
"foodcodeSegment": "NOVEG"
},
{
"foodcode": "MM98",
"foodcodeSegment": "VEG"
}
]
}
}
}
}
}

I am creating the Hive table structure as below:
add jar /home/**********/json-serde-1.3.7-jar-with-dependencies.jar;
CREATE external TABLE ds1414(
ticketnumber string,
location string,
Travelerhistory ARRAY<struct<trav :struct<fname:string,lname:string,
travelingarea :ARRAY<struct<destination :struct<stationid:string,stationname:string>>>,
food :ARRAY<struct<foodpref :struct<foodcode:string,foodcodeSegment:string>>>
>>>
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
location '/user/***********/json2/'
;

When I do a select statement it throws the error below. How do I need to handle this?

Failed with exception java.io.IOException:org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException: Unexpected end-of-input: expected close marker for OBJECT (from [Source: java.io.ByteArrayInputStream@205df5dc; line: 1, column: 0])
at [Source: java.io.ByteArrayInputStream@205df5dc; line: 1, column: 3]
Time taken: 0.212 seconds
04-12-2018
06:05 PM
Hi, I am trying to filter on a row key which, in my case, is a combination of 3 columns separated by a space. I want to query with filter conditions on 2 of those columns. Below is how my data looks in HBase and the HBase queries I am using. I want to query on row keys that contain the values row1 and rad. How can I do that?

hbase(main):003:0> scan 'testspaces'
ROW COLUMN+CELL
row1 gud column=cf:a2, timestamp=1523548834897, value=value1e
row1 rad column=cf:a2, timestamp=1523548716606, value=value1e
row1 sec column=cf:a2, timestamp=1523548822010, value=value1e
row2 rad exam column=cf:a3, timestamp=1523548741273, value=vale1e
4 row(s) in 0.0150 seconds
scan 'testspaces',{FILTER =>"(PrefixFilter('row1') OR PrefixFilter('sec')"}
scan 'testspaces', { LIMIT => 3, FILTER => "org.apache.hadoop.hbase.filter.RowFilter( =, 'row1') AND ValueFilter( =, 'sec'}
scan 'testspaces', {FILTER => org.apache.hadoop.hbase.filter.RowFilter.new(CompareFilter::CompareOp.valueOf('EQUAL'),SubstringComparator.new("sec"))
AND org.apache.hadoop.hbase.filter.RowFilter.new(CompareFilter::CompareOp.valueOf('EQUAL'),SubstringComparator.new("row1"))}
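A hedged sketch of one way to get the rows whose key starts with row1 and also contains rad, using the HappyBase Thrift client instead of the shell; the Thrift host is a placeholder, the table name is taken from the scan above.

# Hedged sketch with HappyBase: prefix scan on 'row1', then keep only the keys
# that also contain 'rad'. Host is a placeholder; requires the Thrift server.
import happybase

connection = happybase.Connection('hbase-thrift-host')   # placeholder host
table = connection.table('testspaces')

for key, data in table.scan(row_prefix=b'row1'):
    if b'rad' in key:                 # second part of the composite row key
        print(key, data)

If you prefer to stay in the shell, the usual pattern is something along the lines of scan 'testspaces', {FILTER => "PrefixFilter('row1') AND RowFilter(=, 'substring:rad')"}, though the exact filter-language syntax can vary with the HBase version.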
Labels: Apache HBase, Apache Ranger
03-12-2018
05:50 PM
Hi Trivedi, thanks for your reply. Can you please answer this in detail? I was able to create DDL on the other part of the XML, which consists of arrays and structs, but if you look at the XML I attached, Segments is a tag that contains Segment elements within it, and the xse:type is manager, vp, or svp. Can you please help in creating the exact schema for this?
03-12-2018
03:04 PM
Hi, I have an XML file in the below format. Can anyone help me create a Hive DDL on top of this XML?

<root>
  <root1>
    <id>4545482361</id>
    <joiningdate>1/3/2010</joiningdate>
    <Segments>
      <Segment xse:type="manager"><cityworked>Hyd</cityworked><reports>john</reports><salary>150000</salary><datestarted>1/3/2012</datestarted></Segment>
      <Segment xse:type="manager"><cityworked>Hyd</cityworked><reports>mike</reports><salary>225000</salary><datestarted>1/9/2014</datestarted></Segment>
      <Segment xse:type="VP"><cityworked>mumbai</cityworked><datestarted>1/9/2014</datestarted><subemployees><Fname>ram</Fname><Lname>Achanta</Lname><Desgination>Director of IT</Desgination></subemployees></Segment>
      <Segment xse:type="SVP"><Staus>currentposition</Staus><numberofemployees>10</numberofemployees></Segment>
    </Segments>
  </root1>
</root>
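As a hedged aside while the DDL question is open: a quick way to see the structure that has to be modelled (and to flatten the Segment elements if an XML SerDe turns out to be awkward) is to parse the document with Python's standard library. The snippet below only inspects an abbreviated copy of the sample above and invents no Hive schema; because the xse: prefix is not bound to a namespace in the sample, the attribute is shown as a plain name for illustration.

# Hedged sketch: inspect the Segment elements of an abbreviated sample with
# ElementTree. The xse: prefix would need a namespace declaration to be valid
# XML, so a plain 'type' attribute is used here for illustration only.
import xml.etree.ElementTree as ET

sample = '''<root><root1><id>4545482361</id><joiningdate>1/3/2010</joiningdate>
<Segments>
  <Segment type="manager"><cityworked>Hyd</cityworked><reports>john</reports><salary>150000</salary></Segment>
  <Segment type="VP"><cityworked>mumbai</cityworked><datestarted>1/9/2014</datestarted></Segment>
</Segments></root1></root>'''

root = ET.fromstring(sample)
for segment in root.iter('Segment'):
    fields = {child.tag: child.text for child in segment}
    print(segment.get('type'), fields)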
Labels: Apache Hive
05-02-2017
06:14 PM
I am using the below Chef resource to install the Solr service:

execute 'Installed SOLR_SERVER component' do
  command "curl -u admin:admin -X PUT -d '{\"HostRoles\": {\"state\": \"INSTALLED\"}}' -H 'X-Requested-By:ambari' http://ec2-54-18-98-88.us-west-2.compute.amazonaws.com:8080/api/v1/clusters/ambari/hosts/ip-000-00-00.us-west-2.compute.internal/host_components/SOLR_SERVER"
end
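For comparison, the same Ambari REST call can be issued from Python, which sometimes makes the response easier to check than a curl wrapped in a Chef resource. This is only a sketch reusing the URL, credentials and payload from the recipe above.

# Hedged sketch: the same PUT request as the curl command above, via the
# 'requests' library, so the HTTP status and body can be inspected directly.
import requests

url = ('http://ec2-54-18-98-88.us-west-2.compute.amazonaws.com:8080'
       '/api/v1/clusters/ambari/hosts/ip-000-00-00.us-west-2.compute.internal'
       '/host_components/SOLR_SERVER')

response = requests.put(
    url,
    auth=('admin', 'admin'),
    headers={'X-Requested-By': 'ambari'},
    json={'HostRoles': {'state': 'INSTALLED'}},
)
print(response.status_code, response.text)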
05-02-2017
06:12 PM
I have followed the same steps as in http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.3/bk_solr-search-installation/content/ch_hdp-search-install-ambari.html
05-02-2017
03:05 PM
Hi, I am new to Hadoop. I am trying to add the Solr service in Ambari by using a recipe and executing it with Chef. My code works fine and adds the Solr component, but the installation is failing with the error below.

stderr: /var/lib/ambari-agent/data/errors-670.txt

Repository HDP-UTILS-1.1.0.21 is listed more than once in the configuration
Repository HDP-UTILS-1.1.0.21 is listed more than once in the configuration
Repository HDP-UTILS-1.1.0.21 is listed more than once in the configuration
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/SOLR/5.5.2.2.5/package/scripts/solr.py", line 101, in <module>
Solr().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/SOLR/5.5.2.2.5/package/scripts/solr.py", line 15, in install
import params
File "/var/lib/ambari-agent/cache/common-services/SOLR/5.5.2.2.5/package/scripts/params.py", line 12, in <module>
import status_params
File "/var/lib/ambari-agent/cache/common-services/SOLR/5.5.2.2.5/package/scripts/status_params.py", line 13, in <module>
solr_config_pid_file = format('{solr_config_pid_dir}/solr-{solr_config_port}.pid')
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/format.py", line 95, in format
return ConfigurationFormatter().format(format_string, args, **result)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/format.py", line 59, in format
result_protected = self.vformat(format_string, args, all_params)
File "/usr/lib64/python2.6/string.py", line 549, in vformat
result = self._vformat(format_string, args, kwargs, used_args, 2)
File "/usr/lib64/python2.6/string.py", line 582, in _vformat
result.append(self.format_field(obj, format_spec))
File "/usr/lib64/python2.6/string.py", line 599, in format_field
return format(value, format_spec)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/config_dictionary.py", line 73, in __getattr__
raise Fail("Configuration parameter '" + self.name + "' was not found in configurations dictionary!")
resource_management.core.exceptions.Fail: Configuration parameter 'solr-config-env' was not found in configurations dictionary!
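The last line says Ambari could not find a configuration type named solr-config-env for the cluster. A hedged way to confirm whether that config type exists at all is to ask the Ambari API directly; the host, cluster name and credentials below mirror the earlier recipe and may need adjusting.

# Hedged sketch: check whether any 'solr-config-env' configuration versions
# exist on the cluster. Host, cluster name and credentials are taken from the
# earlier recipe and may need adjusting.
import requests

ambari = 'http://ec2-54-18-98-88.us-west-2.compute.amazonaws.com:8080'
resp = requests.get(
    ambari + '/api/v1/clusters/ambari/configurations?type=solr-config-env',
    auth=('admin', 'admin'),
    headers={'X-Requested-By': 'ambari'},
)
print(resp.status_code)
print(resp.text)   # an empty "items" list suggests the config type was never created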
Labels: Apache Solr