Ohad Eytan, M.Sc. Thesis Seminar
Thursday, 15.2.2018, 13:30
Trying to predict which items are likely to be accessed in the near future is the basis for most cache management policies. Storage workloads are often characterized by their level of frequency bias and their level of recency bias. The former captures how well access frequency serves as a predictor for future accesses, while the latter indicates how well access recency predicts the future. Existing cache management policies have been grappling with the fact that different workloads exhibit different levels of frequency and recency bias, which makes developing a silver-bullet solution a daunting task.
In this talk, I will present an adaptivity mechanism for software cache management schemes that offer tuning parameters targeting the frequency vs. recency bias. The goal is to take such a scheme and automatically tune its parameters for best performance based on the workload, without any manual intervention. We study two approaches to this problem: a hill climbing solution and an indicator-based solution. In hill climbing, we repeatedly reconfigure the system, hoping to find its best setting. In the indicator approach, we estimate the workload's frequency vs. recency bias and adjust the parameters accordingly in a single step. We perform an extensive evaluation of the schemes and adaptation mechanisms over a large selection of workloads with varying characteristics. From this study, we derive the first software cache management policy that is competitive for all tested workloads.
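The hill climbing approach can be illustrated with a minimal sketch. This is not the thesis implementation; it assumes only that the cache exposes a single parameter in [0, 1] trading frequency bias against recency bias, and that the hit ratio at a given setting can be measured over a window of requests (modeled here as a hypothetical black-box function `hit_ratio`).

```python
def hill_climb(hit_ratio, start=0.5, step=0.05, rounds=40):
    """Repeatedly reconfigure the cache parameter, keeping a change
    only if the measured hit ratio improves; otherwise reverse the
    climbing direction. A hedged sketch, not the actual scheme."""
    param = start
    best = hit_ratio(param)        # hit ratio over one measurement window
    direction = 1.0
    for _ in range(rounds):
        candidate = min(1.0, max(0.0, param + direction * step))
        score = hit_ratio(candidate)
        if score > best:           # keep the reconfiguration
            param, best = candidate, score
        else:                      # flip direction and try the other way
            direction = -direction
    return param

# Toy stand-in for a workload whose best setting happens to be 0.8
# (i.e., a frequency-biased workload); a real run would replay traces.
tuned = hill_climb(lambda p: 1.0 - (p - 0.8) ** 2)
```

The key design point is that hill climbing needs no model of the workload: it only compares hit ratios before and after each reconfiguration, which is why it must probe repeatedly, whereas the indicator approach jumps to a setting in one step.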