How Kaggle makes GPUs accessible to 5 million data scientists

This post was first published on NVIDIA’s developer blog on Oct 26th, 2020. Engineers and designers at Kaggle work hard behind the scenes to make it easy for our 5 million data scientist users to focus on learning and improving their deep learning models instead of ML ops tasks like environment setup and resource provisioning. Kaggle’s community comes to the platform to learn and apply their skills in machine learning competitions. Keep Reading...

Making public mistakes & code memes: The origin of my famous Titanic notebook

Yesterday, it was a great honor to be a guest on Nick Wan’s data science livestream on Twitch to talk through bad (and less bad) data viz that I’ve created. It was a lot of fun! Talking about the history of my popular Titanic R notebook on Kaggle was a great opportunity to reflect on my data science journey. Its explosive success was entirely unintended, but as a result I have a couple of insights to share about the experience and how I apply them in my role as a product manager at Kaggle today. Keep Reading...

How our team improved perceived reliability of Kaggle Notebooks

I’ve worked at Kaggle on-and-off since 2016. In that time, THE most consistent source of user feedback has been the reliability of Kaggle Notebooks. Sessions were slow to start and, far worse, users would sometimes lose hours of work. While progress had been made over the years, we’d never systematically addressed the problems. Over the past half year or so, since I rejoined Kaggle, the Notebooks team has renewed its focus on reliability. Keep Reading...