I am very happy to announce that (after many months) my interactive course on Hyperparameter Tuning in R has now been officially launched on DataCamp!

Course description: For many machine learning problems, simply running a model out-of-the-box and getting a prediction is not enough; you want the best model with the most accurate prediction. One way to perfect your model is with hyperparameter tuning, which means optimizing the settings for that specific model.
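To give a flavor of what hyperparameter tuning looks like in R, here is a minimal sketch of my own (not taken from the course material), assuming the caret and randomForest packages are installed: a grid search over the mtry parameter of a random forest with 5-fold cross-validation.

```r
library(caret)

# Illustrative sketch: tune the mtry hyperparameter of a random forest
# via grid search with 5-fold cross-validation on the built-in iris data.
set.seed(42)
tune_grid <- expand.grid(mtry = c(1, 2, 3, 4))
ctrl <- trainControl(method = "cv", number = 5)

rf_tuned <- train(Species ~ .,
                  data = iris,
                  method = "rf",
                  tuneGrid = tune_grid,
                  trControl = ctrl)

rf_tuned$bestTune  # the mtry value with the best cross-validated accuracy
```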
These are the slides from a lecture I gave at the School of Applied Sciences in Münster, in which I talked about Real-World Data Science and showed examples of Fraud Detection, Customer Churn & Predictive Maintenance.

Real-World Data Science (Fraud Detection, Customer Churn & Predictive Maintenance) by Shirin Glander

The slides were created with xaringan.
As with the other videos from our codecentric.ai Bootcamp (Random Forests, Neural Nets & Gradient Boosting), I am again sharing an English version of the script (plus R code) for this most recent addition on How Convolutional Neural Nets work. In this lesson, I explain how computers learn to see; that is, how they learn to recognize images or objects in images. One of the most commonly used approaches to teaching computers “vision” is Convolutional Neural Nets.
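As a small illustration of the idea (my own sketch, not code from the lesson), here is what a tiny convolutional architecture looks like in R with the keras package; the layer sizes and the 28 x 28 input shape are arbitrary assumptions for grayscale images such as MNIST.

```r
library(keras)

# A tiny convolutional net: conv layers learn local image filters,
# pooling layers shrink the feature maps, and the dense layers at the
# end perform the actual classification.
model <- keras_model_sequential() %>%
  layer_conv_2d(filters = 32, kernel_size = c(3, 3), activation = "relu",
                input_shape = c(28, 28, 1)) %>%
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_conv_2d(filters = 64, kernel_size = c(3, 3), activation = "relu") %>%
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_flatten() %>%
  layer_dense(units = 64, activation = "relu") %>%
  layer_dense(units = 10, activation = "softmax")

model %>% compile(optimizer = "adam",
                  loss = "categorical_crossentropy",
                  metrics = "accuracy")
```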
This is the code that accompanies a book chapter on customer churn that I have written for the German dpunkt Verlag. The book is in German and will probably appear in February: https://www.dpunkt.de/buecher/13208/9783864906107-data-science.html. The code below can be used to recreate all figures and analyses from this book chapter. Because the content is exclusive to the book, my descriptions around the code had to be minimal, but I'm sure you can get the gist even without the book.
Update: There is now a recording of the meetup on YouTube. Here you find my slides from the TWiML & AI EMEA Meetup about Trust in ML models, where I presented the Anchors paper by Carlos Guestrin et al. I have also just written two articles for the German IT magazine iX on the same topic of Explaining Black-Box Machine Learning Models: a short article in iX 12/2018
In recent videos, I covered Random Forests and Neural Nets as part of the codecentric.ai Bootcamp; in the most recent video, I covered Gradient Boosting and XGBoost. You can find the video on YouTube and the slides on slides.com. Both are again in German, with code examples in Python. But below, you will find the English version of the content, plus code examples in R for caret, xgboost and h2o. :-)
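As a taste of the R side, here is a minimal gradient-boosting sketch with the xgboost package (my own illustration, not the exact code from the post), using the agaricus mushroom data that ships with the package:

```r
library(xgboost)

# Binary classification with gradient-boosted trees on the bundled
# agaricus data (edible vs. poisonous mushrooms).
data(agaricus.train, package = "xgboost")
data(agaricus.test, package = "xgboost")

dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)
dtest  <- xgb.DMatrix(agaricus.test$data, label = agaricus.test$label)

params <- list(objective = "binary:logistic",
               eta = 0.1,      # learning rate: how much each new tree contributes
               max_depth = 3)  # depth of the individual trees

bst <- xgb.train(params = params,
                 data = dtrain,
                 nrounds = 50,
                 watchlist = list(train = dtrain, test = dtest),
                 verbose = 0)

pred <- predict(bst, dtest)  # predicted probabilities on the test set
```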
In my last blogpost, about Random Forests, I introduced the codecentric.ai Bootcamp. The next part I published was about Neural Networks and Deep Learning. Every video of our bootcamp will have example code and tasks to promote hands-on learning. While the practical parts of the bootcamp will be using Python, below you will find the English R version of this Neural Nets Practical Example, where I explain how neural nets learn and how the concepts and techniques translate to training neural nets in R with the H2O Deep Learning function.
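For orientation, here is a minimal sketch of what a call to the H2O Deep Learning function looks like (an illustration of my own on the iris data, not the post's actual example; hidden-layer sizes and epochs are arbitrary assumptions):

```r
library(h2o)

# Start a local H2O cluster and train a small feed-forward net.
h2o.init()

iris_h2o <- as.h2o(iris)
splits <- h2o.splitFrame(iris_h2o, ratios = 0.8, seed = 42)

dl_model <- h2o.deeplearning(x = 1:4,            # predictor columns
                             y = "Species",      # response column
                             training_frame = splits[[1]],
                             validation_frame = splits[[2]],
                             hidden = c(10, 10), # two small hidden layers
                             epochs = 50,
                             seed = 42)

h2o.performance(dl_model, valid = TRUE)  # metrics on the validation split
```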