What is the recommended way of moving mainframe data in Packed-Decimal format to Hive, either as text or ORC format?
Created 09-21-2016 02:51 AM
I know Syncsort is a possible solution here, but I wanted to check whether HDF can do the job, and whether there are any recommendations other than Syncsort.
Created 09-21-2016 03:08 AM
@hduraiswamy - in order of preference
- SyncSort
- Use the mainframe's native JDBC services. Often unacceptable, since the mainframe must consume additional MIPS converting to JDBC types before sending data over the network
- Use this open serde, which unfortunately skips everything except fixed-length fields, severely limiting its usefulness
- I've heard of LegStar being used for similar projects, but I'm not sure of the details.
Created 09-21-2016 03:02 AM
Most ETL vendors have mainframe integration. For near-realtime integration, you may want to look into Attunity.
Created 09-21-2016 03:17 AM
Thanks @ajaysingh
Created 09-21-2016 03:17 AM
Thanks @Randy Gelhausen
Created 09-21-2016 01:51 PM
Java Library + Spark => Magic
https://github.com/gmallard/packed-decimal
You could also build this as a dataflow in NiFi:
1. Get the file via NiFi GetFile
2. Run it through ExecuteStreamCommand, invoking the packed-decimal Java class
2b. Or hand it off via Kafka/JMS to a Java or Spark program
3. Insert or save as ORC
4. Create a Hive table on top
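The decode step above is the only non-trivial part. As an illustration only (this is not the linked library's API; the `PackedDecimal` class, `decode` method, and its parameters are hypothetical), here is a minimal Java sketch of unpacking an IBM COMP-3 field: each byte carries two BCD digits, and the low nibble of the last byte is the sign (0xD = negative, 0xC/0xF = positive/unsigned), with the decimal point implied by the COBOL picture clause's scale.

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class PackedDecimal {
    // Decode a COMP-3 (packed-decimal) byte array into a BigDecimal.
    // scale = number of implied decimal places from the PIC clause,
    // e.g. PIC S9(3)V99 COMP-3 has scale 2.
    public static BigDecimal decode(byte[] bytes, int scale) {
        StringBuilder digits = new StringBuilder();
        for (int i = 0; i < bytes.length; i++) {
            int hi = (bytes[i] >> 4) & 0x0F; // high nibble: always a digit
            int lo = bytes[i] & 0x0F;        // low nibble: digit, or sign on last byte
            digits.append(hi);
            if (i < bytes.length - 1) {
                digits.append(lo);
            } else if (lo == 0x0D) {
                digits.insert(0, '-');       // 0xD marks a negative value
            }
        }
        return new BigDecimal(new BigInteger(digits.toString()))
                .movePointLeft(scale);
    }

    public static void main(String[] args) {
        // 0x12 0x34 0x5C packs +12345; with scale 2 this is 123.45
        byte[] packed = { 0x12, 0x34, 0x5C };
        System.out.println(decode(packed, 2)); // prints 123.45
    }
}
```

A decoder along these lines can be wrapped as the script called by ExecuteStreamCommand, or mapped over records in Spark before writing ORC. Note that real records also need the copybook's field offsets and an EBCDIC-to-ASCII conversion for the character fields, which is where a library or SyncSort earns its keep.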
