Oryx max-age params

Explorer

Hi guys,

Could someone explain the purpose of max-age-data-hours and max-age-model-hours, and how they work? For example, suppose I set max-age-data-hours to 7 days (168 hours) and max-age-model-hours to 31 days (744 hours). How will this behave in practice?
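In config terms I mean something like this (I'm writing the full key paths from memory, so treat them as my assumption):

    # Excerpt from my Oryx .conf (HOCON)
    oryx.batch.storage.max-age-data-hours  = 168   # 7 days of input data
    oryx.batch.storage.max-age-model-hours = 744   # 31 days of models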


Thanks,

Matus

1 ACCEPTED SOLUTION

Master Collaborator

This affects only the historical input data and model data stored on HDFS. Every time the batch layer runs, it checks what is stored against these settings and deletes any data or models older than the given age.
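
If it helps to see the mechanism, the cleanup is conceptually just this (a minimal sketch, not Oryx's actual code; the timestamped-directory layout and class name are assumptions for illustration):

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Sketch: delete generation subdirectories older than maxAgeHours,
    // assuming each generation is a directory under basePath named with
    // its creation time in epoch milliseconds.
    public final class MaxAgeCleanup {
      public static void deleteOlderThan(Path basePath, int maxAgeHours) throws IOException {
        if (maxAgeHours < 0) {
          return; // negative means "keep forever"
        }
        FileSystem fs = FileSystem.get(new Configuration());
        long cutoff = System.currentTimeMillis() - maxAgeHours * 3_600_000L;
        for (FileStatus status : fs.listStatus(basePath)) {
          long timestamp;
          try {
            timestamp = Long.parseLong(status.getPath().getName());
          } catch (NumberFormatException nfe) {
            continue; // not a generation directory
          }
          if (timestamp < cutoff) {
            fs.delete(status.getPath(), true); // recursively remove the old generation
          }
        }
      }
    }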


This does not affect the age of data stored in Kafka topics. The input topic's retention doesn't matter much; it just needs to be long enough that the batch process still sees all data since the last batch by the time it runs. The update topic's retention should be long enough that at least one model is always retained somewhere in the topic, so it too should be at least as long as the batch interval. If it's much longer than that, the speed/serving processes will waste time sifting through old updates on startup to catch up.
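
For example, giving both topics about a day of retention would look like this (the topic names and ZooKeeper connect string are assumptions; substitute your own):

    kafka-configs.sh --zookeeper localhost:2181 --alter \
      --entity-type topics --entity-name OryxInput \
      --add-config retention.ms=86400000
    kafka-configs.sh --zookeeper localhost:2181 --alter \
      --entity-type topics --entity-name OryxUpdate \
      --add-config retention.ms=86400000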


The effect of deleting old input data is that it will no longer be used in building future models. Deleting old models has essentially no effect, with one exception: sometimes a model is too large to send via Kafka, so it is stored on HDFS and only a reference to its location is published on the update topic. If such a model is deleted from HDFS while a message on the update topic still references it, the reference is simply ignored. That's no big deal, but it does mean you shouldn't delete old models too aggressively.
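
In code terms, the consumer of the update topic just skips a reference it can't resolve. Roughly (again a sketch; the path-reference message handling and names are assumptions):

    import java.io.IOException;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Sketch: an update message carries an HDFS location rather than an
    // inline model. If the referenced model has been aged off HDFS, the
    // update is simply skipped.
    final class ModelUpdateHandler {
      static void handlePathReference(FileSystem fs, String hdfsLocation) throws IOException {
        Path modelPath = new Path(hdfsLocation);
        if (!fs.exists(modelPath)) {
          return; // model deleted by max-age cleanup; ignore this update
        }
        // ... load and apply the model from modelPath ...
      }
    }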


A good general rule: batch interval < topic retention < max-age settings.
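
Concretely, a consistent set of values might look like this (I'm quoting the generation-interval key from memory, so double-check it against the Oryx config reference):

    # Batch interval: 6 hours
    oryx.batch.streaming.generation-interval-sec = 21600
    # Kafka topic retention (set on the topics themselves): 24 hours
    #   retention.ms = 86400000
    # Max ages: 7 days of data, 31 days of models
    oryx.batch.storage.max-age-data-hours  = 168
    oryx.batch.storage.max-age-model-hours = 744
    # 6h (interval) < 24h (retention) < 168h / 744h (max ages)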


5 REPLIES


Explorer

So, if I understand correctly, these max-age parameters will not affect the created/updated model in any way, right? They basically affect only the storage Oryx uses. Please correct me if I am wrong.

Master Collaborator

Well, the data that remains stored certainly affects future models: all retained historical data is used to build each new model.

Explorer
Okay, so for example:
I build the model for the first time on 1 year of historical data (I ingest the full year in the first training run), but max-age-data-hours is set to 3 months. What happens in this case? Or, if this option is set to 3 months, will the batch layer read only 3 months of data from Kafka, based on each datum's timestamp?

Explorer
Could anyone give a detailed explanation, please?