Confusion Matrix Basics Quiz

By ProProfs AI | Questions: 15 | Updated: May 1, 2026

1. In a confusion matrix, what does the term 'True Positive' (TP) represent?

Explanation

True Positive (TP) refers to instances where the model correctly identifies a positive case: both the model's prediction and the actual outcome are positive. This count is essential for evaluating a classifier's performance, since metrics such as precision and recall are built from it.
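
As a tiny illustration with made-up labels (not part of the quiz), TPs can be counted directly from paired actual/predicted values:

    # Hypothetical labels: 1 = positive, 0 = negative
    actual    = [1, 0, 1, 1, 0, 0, 1, 0]
    predicted = [1, 0, 0, 1, 1, 0, 1, 0]

    # A True Positive is a case where the prediction and the actual label are both positive.
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    print(tp)  # 3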

About This Quiz

This Confusion Matrix Basics Quiz helps you master key concepts in model evaluation and classification performance. Learn to interpret true positives, false positives, true negatives, and false negatives. Understand how confusion matrices reveal model strengths and weaknesses, and why they matter for building reliable machine learning systems. Perfect for students learning data science fundamentals.

2. A confusion matrix has 4 quadrants. Which quadrant contains False Negatives?

Explanation

In a confusion matrix, the quadrants represent the four possible outcomes of a prediction. A False Negative occurs when the model predicts the negative class for a case that is actually positive. Which quadrant holds the False Negatives depends on how the matrix is laid out: with actual classes on the rows (positive class listed first) and predicted classes on the columns, they sit in the top-right cell, while layouts that list the negative class first (as many software libraries do) place them in the bottom-left. Either way, the defining feature is that actual positives have been misclassified as negatives.
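
As a small sketch of one common layout, using scikit-learn (a library choice assumed here, not named in the quiz) and invented labels:

    from sklearn.metrics import confusion_matrix

    # Hypothetical binary labels: 1 = positive, 0 = negative
    actual    = [1, 1, 0, 0, 1, 0]
    predicted = [1, 0, 0, 1, 1, 0]

    # scikit-learn orders classes as [0, 1], with actual classes on rows and
    # predicted classes on columns, so the matrix reads [[TN, FP], [FN, TP]]
    # and the single False Negative lands in the bottom-left cell.
    cm = confusion_matrix(actual, predicted)
    tn, fp, fn, tp = cm.ravel()
    print(cm)   # [[2 1]
                #  [1 2]]
    print(fn)   # 1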


3. In medical testing, a False Negative means the test says a patient is healthy when they actually have the disease. Why is this dangerous?

Explanation

A False Negative in medical testing can be dangerous because it leads to a misdiagnosis, causing the patient to believe they are healthy. Consequently, they may not seek necessary treatment, allowing the disease to progress unchecked, which can result in severe health complications or even death.


4. Accuracy is calculated as (TP + TN) / (TP + TN + FP + FN). What does this formula measure?

Explanation

Accuracy measures the proportion of all predictions that the model gets right, encompassing both true positives (TP) and true negatives (TN). Dividing the sum of TP and TN by the total number of predictions (TP + TN + FP + FN) gives an overall measure of how effectively the model classifies both positive and negative cases.
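
A minimal sketch of the formula, using made-up counts (not taken from the quiz):

    # Hypothetical confusion-matrix counts
    tp, tn, fp, fn = 40, 50, 5, 5

    accuracy = (tp + tn) / (tp + tn + fp + fn)
    print(accuracy)  # 0.9 -> 90% of all predictions (positive and negative) were correct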


5. Precision is TP / (TP + FP). In spam email filtering, what does precision tell us?

Explanation

Precision in spam email filtering measures the accuracy of the filter in identifying spam. Specifically, it indicates the proportion of emails that the filter has marked as spam that are indeed spam. This helps assess the effectiveness of the filter in avoiding false positives, ensuring that legitimate emails are not incorrectly classified as spam.
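
A rough sketch with invented spam-filter counts:

    # Hypothetical spam-filter counts
    tp = 90   # emails flagged as spam that really are spam
    fp = 10   # legitimate emails wrongly flagged as spam

    precision = tp / (tp + fp)
    print(precision)  # 0.9 -> 90% of the emails flagged as spam were truly spam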


6. Recall (sensitivity) is TP / (TP + FN). In cancer detection, high recall is critical because:

Explanation

High recall in cancer detection is essential as it ensures that the majority of actual cancer cases are identified. This minimizes the risk of missed diagnoses, allowing for timely treatment and improving patient outcomes. Prioritizing recall helps to effectively detect true positives, which is crucial in managing cancer effectively.
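
A rough sketch with invented screening counts:

    # Hypothetical cancer-screening counts
    tp = 95   # actual cancer cases the model caught
    fn = 5    # actual cancer cases the model missed

    recall = tp / (tp + fn)
    print(recall)  # 0.95 -> 95% of the real cancer cases were detected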


7. A model predicts 100 emails as spam: 80 are actually spam (TP), 20 are legitimate (FP). What is the precision?

Explanation

Precision is calculated as the ratio of true positives (TP) to the sum of true positives and false positives (FP). In this case, with 80 true spam emails and 20 false positives, precision equals 80 / (80 + 20) = 80 / 100 = 0.80, indicating that 80% of the predicted spam emails were actually spam.
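
The same arithmetic as a quick check, using the counts given in the question:

    tp, fp = 80, 20              # counts from the question
    print(tp / (tp + fp))        # 0.8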


8. There are 150 actual disease cases. The model catches 120 of them (TP) and misses 30 (FN). What is the recall?

Explanation

Recall is calculated as the ratio of true positives (TP) to the sum of true positives and false negatives (FN). In this case, recall = TP / (TP + FN) = 120 / (120 + 30) = 120 / 150 = 0.80. This indicates that the model successfully identifies 80% of actual disease cases.
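
Again as a quick check with the question's counts:

    tp, fn = 120, 30             # counts from the question
    print(tp / (tp + fn))        # 0.8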


9. When is a model with high precision but low recall problematic?

Explanation

A model with high precision but low recall is problematic in scenarios where failing to identify positive cases can have severe consequences, such as in disease diagnosis. In such cases, the cost of missing true positives outweighs the benefit of correctly identifying negatives, making it crucial to improve recall despite the potential increase in false positives.


10. The F1-score balances precision and recall. When should you use F1-score instead of accuracy?

Explanation

F1-score is particularly useful in scenarios with imbalanced classes because it considers both precision and recall, providing a better measure of a model's performance on the minority class. In such cases, accuracy alone can be misleading, as it may not reflect the model's ability to correctly identify the less frequent class.
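
A small sketch of how the harmonic mean behaves when precision and recall diverge (numbers invented for illustration):

    # Hypothetical scores on an imbalanced problem
    precision, recall = 0.90, 0.30

    f1 = 2 * precision * recall / (precision + recall)
    print(round(f1, 2))  # 0.45 -> the low recall pulls the F1-score down,
                         # a weakness that raw accuracy could easily hide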


11. A confusion matrix shows TP=50, TN=45, FP=5, FN=0. What does FN=0 indicate?

Explanation

FN=0 indicates that the model did not miss any actual positive cases: every instance that was truly positive was predicted as positive. Equivalently, recall = TP / (TP + FN) = 50 / 50 = 1.0, so the model identifies positive instances without a single false negative.


12. In a dataset with 1000 samples, 950 are negative and 50 are positive. A model predicts all samples as negative. What is its accuracy?

Explanation

In this scenario, the model predicts all samples as negative. Since there are 950 negative samples, the model correctly identifies 950 out of 1000 total samples. Accuracy is the number of correct predictions divided by the total number of samples, so accuracy = 950/1000 = 0.95, or 95%. Note that this seemingly high score is achieved while detecting none of the 50 positive cases, which is exactly why accuracy alone can be misleading on imbalanced data.
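
A minimal sketch of this scenario, using scikit-learn (an assumed tool, not named in the quiz):

    from sklearn.metrics import accuracy_score, recall_score

    # 950 negatives and 50 positives, as in the question
    y_true = [0] * 950 + [1] * 50
    y_pred = [0] * 1000              # the model predicts "negative" for everything

    print(accuracy_score(y_true, y_pred))  # 0.95 -> looks impressive
    print(recall_score(y_true, y_pred))    # 0.0  -> yet not a single positive case is caught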


13. The specificity of a model is TN / (TN + FP). What does specificity measure?
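
A minimal sketch of the formula quoted in the question, with invented counts:

    # Hypothetical counts
    tn, fp = 85, 15

    specificity = tn / (tn + fp)
    print(specificity)  # 0.85 -> 85% of actual negatives were correctly identified as negative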


14. A model has precision=0.9 and recall=0.6. Which statement is true?
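
One hypothetical set of counts that is consistent with precision = 0.9 and recall = 0.6:

    # Counts invented so that the two metrics come out as stated
    tp, fp, fn = 90, 10, 60

    print(tp / (tp + fp))  # 0.9 -> 90% of positive predictions are correct
    print(tp / (tp + fn))  # 0.6 -> but 40% of the actual positives are missed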


15. Why is a confusion matrix more informative than accuracy alone for imbalanced datasets?
