Posts

Showing posts with the label performance metrics

Classification Metrics and Their Use Cases

In this blog, we will discuss commonly used classification metrics. We will cover Accuracy Score, Confusion Matrix, Precision, Recall, F-Score, and ROC-AUC, and then learn how to extend them to multi-class classification. We will also discuss which metric is most suitable in which scenario. First, let's understand some important terms used throughout the blog:
True Positive (TP): you predict that an observation belongs to a class and it actually does belong to that class.
True Negative (TN): you predict that an observation does not belong to a class and it actually does not belong to that class.
False Positive (FP): you predict that an observation belongs to a class and it actually does not belong to that class.
False Negative (FN): you predict that an observation does not belong to a class and it actually does belong to that class.
All classification metrics work on these four te...
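As a quick illustration of how these four counts feed the metrics named in the excerpt, here is a minimal sketch using scikit-learn on a toy set of binary labels (the label values, predictions, and probabilities are invented purely for the example):

```python
from sklearn.metrics import (accuracy_score, confusion_matrix, f1_score,
                             precision_score, recall_score, roc_auc_score)

# Toy binary labels and model outputs (illustrative values only)
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
y_prob = [0.9, 0.2, 0.4, 0.8, 0.1, 0.6, 0.7, 0.3]  # predicted probability of class 1

# Confusion matrix: rows are true labels, columns are predicted labels
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"TP={tp}, TN={tn}, FP={fp}, FN={fn}")

print("Accuracy :", accuracy_score(y_true, y_pred))   # (TP + TN) / total
print("Precision:", precision_score(y_true, y_pred))  # TP / (TP + FP)
print("Recall   :", recall_score(y_true, y_pred))     # TP / (TP + FN)
print("F1-score :", f1_score(y_true, y_pred))         # harmonic mean of precision and recall
print("ROC-AUC  :", roc_auc_score(y_true, y_prob))    # needs scores/probabilities, not hard labels
```

For the multi-class case mentioned above, the same precision, recall, and F-score functions accept an average argument ('macro', 'micro', or 'weighted') that controls how the per-class scores are combined.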

Data Science Performance Metrics for Everyone

Accuracy, recall, precision, sensitivity, specificity, and more: data scientists use so many performance metrics! How do you explain all of them to audiences with non-technical backgrounds? As a data scientist, I find it challenging, fun, and critical to my job to describe these concepts to everyone. This blog post explains many performance metrics using common language and pictures so everyone at your company can understand them. Recently, I developed a machine learning model to predict which patients on dialysis will be admitted to the hospital in the next week. This model has received a lot of attention in my company (Fresenius Medical Care North America), so I have presented its details to a wide range of audiences, including data scientists, data analysts, nurses, physicians, and even the C-suite. From experience, I have learned that everyone interprets 'accuracy' differently, so I have to be very careful to explai...
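Since the excerpt names accuracy, recall, precision, sensitivity, and specificity, here is a small sketch, with invented confusion-matrix counts, showing how all of them come from the same four numbers:

```python
# Hypothetical confusion-matrix counts for a weekly admission model
# (the numbers are invented purely for illustration)
tp, tn, fp, fn = 30, 900, 50, 20

accuracy    = (tp + tn) / (tp + tn + fp + fn)  # fraction of all predictions that were right
precision   = tp / (tp + fp)                   # of the patients flagged, how many were admitted
recall      = tp / (tp + fn)                   # of the admitted patients, how many were flagged
sensitivity = recall                           # sensitivity is another name for recall
specificity = tn / (tn + fp)                   # of the non-admitted patients, how many were not flagged

print(f"accuracy={accuracy:.3f}, precision={precision:.3f}, "
      f"recall/sensitivity={recall:.3f}, specificity={specificity:.3f}")
```

With counts this imbalanced, accuracy alone looks excellent even when many admissions are missed, which is exactly why the other metrics matter when explaining model performance.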