What is a multi-armed bandit?
A multi-armed bandit solution is a more complex version of A/B testing that uses machine learning algorithms to dynamically allocate traffic to variations.
https://www.optimizely.com/optimization-glossary/multi-armed-bandit/

Coffee Extraction and How to Taste It
Extraction is everything that the water takes from the coffee. This post will cover some basic extraction theory and the tastes associated with over, under and ideal coffee extractions.
https://www.baristahustle.com/blog/coffee-extraction-and-how-to-taste-it/

A Week with Elixir (joearms.github.io)
https://joearms.github.io/published/2013-05-31-a-week-with-elixir.html
"All data that might change in the future should be tagged with a version number."
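The dynamic traffic allocation the Optimizely entry describes can be sketched with an epsilon-greedy bandit. This is a minimal illustration, not Optimizely's actual algorithm; the variation names and conversion rates below are made up:

```python
import random

def epsilon_greedy(rewards_by_arm, epsilon=0.1):
    """Pick an arm: explore at random with probability epsilon,
    otherwise exploit the arm with the best observed mean reward."""
    arms = list(rewards_by_arm)
    if random.random() < epsilon:
        return random.choice(arms)

    def mean(arm):
        r = rewards_by_arm[arm]
        return sum(r) / len(r) if r else 0.0

    return max(arms, key=mean)

# Observed rewards (1 = conversion, 0 = no conversion) per variation.
history = {"A": [], "B": []}
true_rates = {"A": 0.05, "B": 0.12}  # hypothetical conversion rates

random.seed(42)
for _ in range(5000):
    arm = epsilon_greedy(history)
    reward = 1 if random.random() < true_rates[arm] else 0
    history[arm].append(reward)

# Unlike a fixed 50/50 A/B split, most traffic drifts to the
# better-performing variation as evidence accumulates.
print({arm: len(r) for arm, r in history.items()})
```

The exploration rate `epsilon` controls the trade-off: higher values keep testing the weaker variation longer, lower values commit to the current leader sooner.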