Member since: 05-09-2016
Posts: 280
Kudos Received: 58
Solutions: 31
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3745 | 03-28-2018 02:12 PM
 | 3022 | 01-09-2018 09:05 PM
 | 1649 | 12-13-2016 05:07 AM
 | 5029 | 12-12-2016 02:57 AM
 | 4311 | 12-08-2016 07:08 PM
07-11-2016
06:54 PM
The json_staging table has the following data:

{"key": 1, "driverId": "123", "driverName":"James", "eventTime":"2015-08-21 12:23:45.231", "eventType":"Normal", "latitudeColumn":"38.440467", "longitudeColumn":"-122.714431", "routeId":"345", "routeName":"San Francisco to San Diego", "truckId":"67"}
{"key": 2, "driverId": "352", "driverName":"John", "eventTime":"2015-09-24 10:45:56.289", "eventType":"Abnormal", "latitudeColumn":"33.19587", "longitudeColumn":"-117.379483", "routeId":"315", "routeName":"San Jose to Los Angeles", "truckId":"23"}
{"key": 3, "driverId": "657", "driverName":"Tim", "eventTime":"2016-05-02 05:45:11.009", "eventType":"Normal", "latitudeColumn":"34.44805", "longitudeColumn":"-119.242889", "routeId":"169", "routeName":"San Mateo to Fremont", "truckId":"29"}
07-11-2016
06:33 PM
I am doing the insert like this:

INSERT OVERWRITE TABLE hbase_table_json SELECT
get_json_object(json_staging.json, "$.key") AS key,
get_json_object(json_staging.json, "$.driverId") AS driverId,
get_json_object(json_staging.json, "$.driverName") AS driverName
get_json_object(json_staging.json, "$.eventTime") AS eventTime,
get_json_object(json_staging.json, "$.eventType") AS eventType,
get_json_object(json_staging.json, "$.latitudeColumn") AS latitudeColumn,
get_json_object(json_staging.json, "$.longitudeColumn") AS longitudeColumn,
get_json_object(json_staging.json, "$.routeId") AS routeId,
get_json_object(json_staging.json, "$.routeName") AS routeName,
get_json_object(json_staging.json, "$.truckId") AS truckId
FROM json_staging;
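Looking at this statement, the ParseException at line 5:2 reported below is most likely just the missing comma after the driverName projection, which makes the parser see get_json_object where it expects a keyword. A corrected version of the same statement, purely as a sketch, would be:

-- Same statement with the comma added after the driverName column (sketch only).
INSERT OVERWRITE TABLE hbase_table_json SELECT
  get_json_object(json_staging.json, "$.key") AS key,
  get_json_object(json_staging.json, "$.driverId") AS driverId,
  get_json_object(json_staging.json, "$.driverName") AS driverName,
  get_json_object(json_staging.json, "$.eventTime") AS eventTime,
  get_json_object(json_staging.json, "$.eventType") AS eventType,
  get_json_object(json_staging.json, "$.latitudeColumn") AS latitudeColumn,
  get_json_object(json_staging.json, "$.longitudeColumn") AS longitudeColumn,
  get_json_object(json_staging.json, "$.routeId") AS routeId,
  get_json_object(json_staging.json, "$.routeName") AS routeName,
  get_json_object(json_staging.json, "$.truckId") AS truckId
FROM json_staging;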
07-11-2016
06:27 PM
Using the HDP 2.5 TP Sandbox, I created a Hive table using the HBaseStorageHandler. I then created another table, json_staging, and loaded the JSON file into it. Now I want to insert that JSON data into the first table using the simple get_json_object UDF, but I am getting the following exception. Has anyone experienced this error before? Please help.

java.lang.Exception: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: ParseException line 5:2 Failed to recognize predicate 'get_json_object'. Failed rule: 'regularBody' in statement
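For context, the table definitions are not shown in this thread; a minimal sketch of the kind of setup described, with column names inferred from the INSERT statement and the sample JSON, and with the HBase column-family mapping being purely illustrative, would be:

-- Staging table: a single STRING column holding each raw JSON record (names are assumptions).
CREATE TABLE json_staging (json STRING);

-- Hive table backed by HBase via the HBaseStorageHandler (column mapping is illustrative).
CREATE TABLE hbase_table_json (
  key STRING, driverId STRING, driverName STRING, eventTime STRING,
  eventType STRING, latitudeColumn STRING, longitudeColumn STRING,
  routeId STRING, routeName STRING, truckId STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" =
  ":key,d:driverId,d:driverName,d:eventTime,d:eventType,d:latitudeColumn,d:longitudeColumn,d:routeId,d:routeName,d:truckId")
TBLPROPERTIES ("hbase.table.name" = "hbase_table_json");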
Labels:
- Apache Hive
07-11-2016
05:23 PM
I guess I got confused by the error message. Thank you, @Benjamin Leonhardi.
07-11-2016
05:22 PM
Thank you so much, @Constantin Stanca.
07-11-2016
05:13 PM
Using Beeline running in the HDP 2.5 TP Sandbox, I created a table, say json_staging, and then issued the command:

LOAD DATA LOCAL INPATH '/root/hbase_data.json' INTO TABLE json_staging;

It fails with the message:

Error: Error while compiling statement: FAILED: SemanticException Line 1:23 Invalid path "/root/hbase_data.json": No files matching path file:/root/hbase_data.json (state=42000,code=40000)

The file does exist at the specified location on the Linux file system, and all users have read permission on it. The workaround is to first copy the file to the /home/hive directory; then I can specify the file as 'hbase_data.json' and it is loaded successfully. It fails if the file is anywhere else. Can anyone please explain the reason?
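One thing worth noting (this is general HiveServer2 behaviour, not something stated in the thread): with Beeline, LOCAL INPATH is resolved on the HiveServer2 host by the user running HiveServer2 (typically hive), and that user normally cannot traverse /root even when the file itself is readable. A sketch of two common workarounds, with /tmp used purely as an illustrative staging location:

-- Option 1: stage the file somewhere the hive user can read,
-- e.g. cp /root/hbase_data.json /tmp/ on the sandbox, then:
LOAD DATA LOCAL INPATH '/tmp/hbase_data.json' INTO TABLE json_staging;

-- Option 2: put the file into HDFS first
-- (hdfs dfs -put /root/hbase_data.json /tmp/) and load it without LOCAL:
LOAD DATA INPATH '/tmp/hbase_data.json' INTO TABLE json_staging;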
Labels:
- Apache Hive
07-10-2016
06:47 PM
What version of HDP are you using? Is it a sandbox or a cluster?
07-09-2016
10:43 AM
Hi @Qi Wang, there is a bug on the Ambari side: it is not generating hiveserver2-site.xml. As a result, any changes made in the Advanced hiveserver2-site section in Ambari are not reflected (we make changes in hiveserver2-site.xml for Ranger), so if you disable authorization from the general settings as mentioned above, your Ranger policies will not work as expected. This issue has been raised and should be resolved in upcoming Sandbox releases.
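As a quick check (the property names below are the standard Hive/Ranger ones, not taken from this thread), you can print the relevant settings from a Beeline session to see whether the Ranger authorizer is actually in effect:

-- Run from Beeline; if hiveserver2-site.xml is not being generated,
-- these will show the stock defaults rather than the Ranger values.
SET hive.security.authorization.enabled;
SET hive.security.authorization.manager;
-- With Ranger active, the manager is typically
-- org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizerFactory.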
07-09-2016
10:38 AM
1 Kudo
I have successfully integrated Phoenix with SQuirreL and also published an article on how to set it up. Please have a look here.
07-09-2016
10:31 AM
3 Kudos
Installing SQuirreL

Download the SQuirreL jar from here, as per your operating system. Open the terminal, go to the directory where you downloaded the file, and run the following command:

java -jar squirrel-sql-3.7.1-MACOSX-install.jar

Once the installer dialog pops up, follow the instructions to install SQuirreL onto your system. You can select optional installs if you like; I am choosing only the base and standard install. If you want other plugins, you can select them from the list given.

Configuring SQuirreL and Phoenix Integration

1. Make sure your Sandbox and HBase are up and running.

2. Open additional ports. You need all the HBase and RegionServer ports forwarded: 16000, 16010, 16020, 16030. To do this, click the Settings button for the VM instance, then click the Network button in the pop-up window. There is a port forwarding button at the bottom; click it, and add each of the ports above that you don't currently have.

3. Copy the Phoenix client jar to SQuirreL. SSH into your Sandbox terminal as the root user and check the location of the Phoenix client jar by navigating to /usr/hdp/2.5.0.0-817/phoenix/ (check your HDP version). Now go back to your local machine's terminal and run the following command to copy the jar from the sandbox to the local machine:

scp -P 2222 root@127.0.0.1:/usr/hdp/2.5.0.0-817/phoenix/phoenix-4.7.0.2.5.0.0-817-client.jar ~

Now copy this jar into the SQuirreL lib directory. On a Mac, go to Applications and click Open in Finder, right-click on SQuirreL SQL and click Show Package Contents, then navigate to Contents > Resources > Java > lib and paste the Phoenix client jar there.

4. Add sandbox.hortonworks.com to the /private/etc/hosts file. Type the following command:

sudo vi /private/etc/hosts

Add the entry for sandbox.hortonworks.com and save the file.

5. Add the Phoenix driver in SQuirreL. Open SQuirreL, click the Drivers tab on the left side of the window, and click the plus button to create a new driver. Enter this information into the driver creation window:

Name: Phoenix
Example URL: jdbc:phoenix:sandbox.hortonworks.com:2181:/hbase-unsecure
Website URL: (leave it blank)
Class Name: org.apache.phoenix.jdbc.PhoenixDriver

Click OK.

6. Create an alias. Switch to the Aliases tab and click the plus button to create a new alias. Enter this information in the alias creation window:

Name – PhoenixOnHortonworksSandbox
Driver – Phoenix
URL – auto-populated when you select the driver: jdbc:phoenix:sandbox.hortonworks.com:2181:/hbase-unsecure
User Name – root
Password – the same password that you use for ssh

Once you've filled out the above information, click Test and then select Connect. A box should pop up saying "Connection successful". Click OK, then OK again to create the alias.

7. Connect. Double-click on your newly created alias and click Connect. You should now be successfully connected and able to run SQL queries.
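Once connected, a quick way to confirm everything works end-to-end is to run a small Phoenix query from the SQuirreL SQL tab; the table name and values below are purely illustrative:

-- Create a small test table, write a row, and read it back (illustrative names).
CREATE TABLE IF NOT EXISTS squirrel_smoke_test (id INTEGER PRIMARY KEY, name VARCHAR);
UPSERT INTO squirrel_smoke_test VALUES (1, 'hello from SQuirreL');
-- Commit if auto-commit is disabled in SQuirreL, then:
SELECT * FROM squirrel_smoke_test;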
Labels: