Created 09-06-2016 02:27 AM
Hi guys,
Could some of you explain the purpose of max-age-data-hours and max-age-model-hours, and how they work? For example, suppose I set max-age-data-hours to 7 days and max-age-model-hours to 31 days. How will this behave in practice?
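Concretely, I mean something like this in the config file (assuming the key names under oryx.batch.storage as in Oryx 2's reference.conf; 7 days = 168 hours, 31 days = 744 hours):

    oryx {
      batch {
        storage {
          # hypothetical excerpt; values are in hours
          max-age-data-hours  = 168   # keep input data for 7 days
          max-age-model-hours = 744   # keep models for 31 days
        }
      }
    }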
Thanks,
Matus
Created 09-06-2016 02:59 AM
This affects only historical input data and model data stored on HDFS. Every time the batch layer runs, it checks the stored data against these settings and deletes any data or models older than the given age.
This does not affect the age of data stored in Kafka topics. The input topic's retention doesn't matter much; it just needs to be long enough that the batch process still sees all data since the last batch by the time it runs. The update topic's retention should be long enough that at least one model is always retained somewhere in the topic, so it too should be at least as long as the batch interval. If it's much longer, the speed/serving processes will waste time sifting through old data on startup to catch up.
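Topic retention is set with the stock Kafka tooling. A sketch, assuming the default Oryx topic name (OryxInput) and a local ZooKeeper; adjust both to your cluster:

    # give the input topic 24 hours of retention, comfortably longer than the batch interval
    kafka-configs.sh --zookeeper localhost:2181 --alter \
      --entity-type topics --entity-name OryxInput \
      --add-config retention.ms=86400000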
The effect of deleting old input data is that it will no longer be used in building future models. There's really no effect of deleting old models, with one exception: in some cases a model is stored on HDFS but is too large to send via Kafka, in which case a reference to its HDFS location is sent instead. If a model is deleted from HDFS but is still referenced on the Kafka update topic, it will be ignored. That's no big deal, but it does mean you shouldn't delete old models too aggressively.
batch interval < topic retention times < max age settings is a good general rule.
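For instance, with made-up numbers (and assuming oryx.batch.streaming.generation-interval-sec is where the batch interval is set):

    batch interval:       6 hours   (generation-interval-sec = 21600)
    topic retention:      24 hours  (retention.ms = 86400000)
    max-age-data-hours:   168       (7 days)
    max-age-model-hours:  744       (31 days)

Each value comfortably exceeds the one before it, so every batch run sees all new input, at least one model always survives in the update topic, and the HDFS cleanup never outpaces Kafka.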
Created 09-07-2016 12:52 AM
So, if I understand correctly, these max-age parameters won't affect the created/updated model in any way, right? They basically affect only Oryx's storage usage. Correct me if I'm wrong, please.
Created 09-07-2016 12:54 AM
Well, the data that is stored certainly affects future models. All historical data is used to build models.