Comparing model performance to a baseline score
While it is great that our model achieves a high accuracy score of 91.7 percent, it is also important to compare this to a baseline score. We dig deeper into this concept in this section.
How to do it...
This section walks through the steps to calculate the baseline accuracy.
- Execute the following script to retrieve the mean value of the label column from the describe() method:
predictionDF.describe('label').show()
- Subtract the mean value from 1 to calculate the baseline accuracy (see the sketch after these steps).
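The two steps can also be combined into a single snippet. The following is only a minimal sketch, assuming predictionDF is the prediction DataFrame from the previous section and that its label column encodes escalate as 1 and do_not_escalate as 0:

from pyspark.sql import functions as F

# Inspect the summary statistics of the label column
predictionDF.describe('label').show()

# Pull the mean out programmatically rather than reading it off the printed table
label_mean = predictionDF.agg(F.mean('label')).first()[0]

# Baseline accuracy: always predict the majority class.
# When escalations (label = 1) are the minority class, this is simply 1 - mean.
baseline_accuracy = max(label_mean, 1 - label_mean)
print('Baseline accuracy: {:.1%}'.format(baseline_accuracy))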
How it works...
This section explains the concept behind the baseline accuracy and how we can use it to understand the effectiveness of our model.
- What if every chat conversation was flagged for do_not_escalate, or vice versa? Would we have a baseline accuracy higher than 91.7 percent? The easiest way to figure this out is to run the describe() method on the label column from predictionDF using the following script (a short illustration of this reasoning follows the list):
predictionDF.describe('label').show()
- The output of the script can be seen in the...
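To make the reasoning concrete, here is a small illustration using hypothetical class balances; the actual mean comes from the describe() output above, and these numbers are only for demonstration:

# Hypothetical means of the label column (fraction of conversations escalated)
for hypothetical_mean in [0.5, 0.25, 0.05]:
    baseline = max(hypothetical_mean, 1 - hypothetical_mean)
    print('mean = {:.2f} -> baseline accuracy = {:.0%}'.format(hypothetical_mean, baseline))

# With a mean of 0.5 the baseline is only 50%, so 91.7% is a clear improvement;
# with a mean of 0.05 the baseline is already 95%, and 91.7% would underperform it.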