Member since
04-04-2022
191
Posts
5
Kudos Received
9
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 232 | 06-11-2025 02:18 AM
 | 238 | 03-26-2025 01:54 PM
 | 367 | 01-08-2025 02:51 AM
 | 541 | 01-08-2025 02:46 AM
 | 694 | 01-08-2025 02:40 AM
07-14-2025
01:09 PM
Hello @moekraft Thank you for raising your concern. At the moment, we do not have definitive information regarding the supportability of this request. However, I will go ahead and create an internal JIRA to track it. Your request will be reviewed and addressed soon.
06-11-2025
02:18 AM
Hello @Artem_Kuzin I looked into this issue, and it appears to be a bug in CDP version 7.3.1 that has been resolved in version 7.3.2.0.
03-26-2025
01:54 PM
Hello @snm1523 Add a Connection: close header to your curl requests to ensure the connection is closed after each request. Alternatively, use --no-keepalive in the curl command to prevent persistent connections. If necessary, adjust the server settings to force connections to close after each request.
03-26-2025
01:51 PM
Hello @nowy19 Thanks for posting your query; here is a detailed answer. It seems that Atlas checks whether the relationship exists within the uploaded batch, rather than between the batch and already-uploaded entities. There are a few approaches you could consider to avoid timeouts during bulk uploads:

1. Upload related entities in separate batches: It is possible to upload related entities in separate batches, but you need to ensure that dependencies are respected between batches. If relationships between entities need to be established, upload them in an order that allows the relationships to be checked and linked after the entities are uploaded.
2. Batch size management: If timeouts are an issue, consider reducing the batch size for uploads. Smaller batches reduce the load on the system and help avoid timeouts; this may involve splitting larger datasets into smaller, more manageable chunks.
3. Optimize the Atlas configuration: Adjusting some Atlas settings, such as increasing the batch size limit or optimizing the backing database (e.g., with indexing), may help handle larger uploads more efficiently.
4. Asynchronous upload strategy: If possible, upload entities asynchronously to avoid long-running operations that can lead to timeouts. This lets the system handle multiple requests in parallel without being overwhelmed.
5. Increase timeout settings: If you are still encountering timeouts during bulk uploads, you could also adjust the timeout settings for the upload process, either at the Atlas server or the API level, if that is feasible.

If you want to upload everything in one batch but avoid timeouts, breaking the process into smaller, logical steps while maintaining the required relationships is usually the most effective approach.
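The batch-splitting idea above can be sketched as follows; a minimal Python sketch, where the entity definitions, the `hive_table` type, and the batch size are illustrative only (a real client would POST each batch to the Atlas bulk entity API):

```python
def chunk_entities(entities, batch_size):
    """Split a list of entity definitions into smaller batches."""
    for start in range(0, len(entities), batch_size):
        yield entities[start:start + batch_size]

# Illustrative entity payloads; in practice these would come from your export.
entities = [
    {"typeName": "hive_table", "attributes": {"qualifiedName": f"db.tbl_{i}"}}
    for i in range(10)
]

# Upload parents before children so relationships can resolve against
# entities that are already in Atlas.
batches = list(chunk_entities(entities, batch_size=4))
print([len(b) for b in batches])  # 10 entities in batches of 4 -> [4, 4, 2]
```

The design choice here is to keep each batch small enough to stay under the server timeout while preserving a dependency-respecting upload order.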
01-22-2025
05:25 AM
Hello @DreamDelerium Thanks for sharing this question with us. I checked both datasources, and the lineage data should be the same for both, because datasystem_datatransfer is part of datasystem_datasource, so the origin of the lineage data is the same. On your first question, whether creating this second lineage would impact the first: no, it would not. Please let me know if you need any clarification.
01-08-2025
02:51 AM
Error Code: ATLAS-404-00-007 — "Invalid instance creation/updation parameters passed: type_name.entity_name: mandatory attribute value missing in type type_name." This error indicates that, when creating or updating an entity in Apache Atlas (or a similar system), a required attribute value for that entity is missing or not provided. Specifically, the entity's type (indicated as type_name.entity_name) is missing a mandatory attribute value defined for that type. Error Code: ATLAS-400-00-08A — This error typically occurs when you try to upload or import a ZIP file that is either empty or does not contain any valid data. Verify that the ZIP file you are attempting to upload actually contains data: check its contents and ensure it is not empty. If it should contain data, recreate the ZIP file or make sure it is properly packaged before importing.
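For the ATLAS-400-00-08A case, a quick sanity check on the archive before importing can save a round trip; a minimal Python sketch, where the file path and entry name are illustrative only:

```python
import zipfile

def zip_is_valid(path):
    """Return True if the file is a readable ZIP with at least one entry."""
    if not zipfile.is_zipfile(path):
        return False
    with zipfile.ZipFile(path) as zf:
        return len(zf.namelist()) > 0

# Build a small non-empty archive to demonstrate the check.
with zipfile.ZipFile("/tmp/export_check.zip", "w") as zf:
    zf.writestr("export.json", '{"entities": []}')

print(zip_is_valid("/tmp/export_check.zip"))  # True: readable and non-empty
```

If this returns False, recreate or repackage the ZIP before retrying the import.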
01-08-2025
02:46 AM
Hello @DreamDelerium Thanks for sharing this question with us. I checked both datasources, and the lineage data should be the same for both, because datasystem_datatransfer is part of datasystem_datasource, so the origin of the lineage data is the same. On your first question, whether creating this second lineage would impact the first: no, it would not. Please let me know if you need any clarification.
01-08-2025
02:40 AM
@dhughes20 Please check this JIRA: https://issues.apache.org/jira/browse/ATLAS-1729
01-08-2025
02:38 AM
@dhughes20 This looks like a bug. Please check https://issues.apache.org/jira/browse/ATLAS-3958
08-28-2024
08:57 AM
@hadoopranger Is your HBase RegionServer healthy? Execute this command to get the HBase region health status: hbase hbck -details &> /tmp/hbck_details_$(date +"%Y_%m_%d_%H_%M_%S").txt