Recent blog posts
If Your Models Are Underperforming, Build Better Datasets
Working with data can be hard. You might spend hours on your model or analysis without getting any reasonable results. At that point, it might be tempting to blame your performance issues on the wrong…
Bayesian Machine Learning and Julia are a match made in heaven
As I argued in an earlier article, Bayesian Machine Learning can be quite powerful. Building actual Bayesian models in Python, however, is sometimes a bit of a hassle. Most solutions that you will find online…
Implementing Neural Networks in 16 lines of raw Julia
When it comes to building Neural Networks and Deep Learning models, TensorFlow and PyTorch are the de facto standard. While everyday models are quickly implemented out of the box, customized algorithms can, at times, result in quite verbose-looking…
When is Bayesian Machine Learning actually useful?
When it comes to Bayesian Machine Learning, you likely either love it or prefer to stay at a safe distance from anything Bayesian. Given that current state-of-the-art models hardly ever mention Bayes at all, there…
Multi-output Gradient Boosting for probability distributions
Gradient Boosting is arguably one of the most popular Machine Learning algorithms today. Combining multiple weak learners to generate a strong one seems almost too good to be true. Nevertheless, popular packages like xgboost regularly…
You DO need math for Machine Learning
Apparently, it has become very popular to convince aspiring Machine Learning engineers that learning theory is overrated in practice. Authors of such articles like to claim that they are able to do a good…