CLASS X (AI)
EVALUATION WORKSHEET
1. What is the main goal of evaluation in AI models?
   Choice 1: To improve the training dataset
   Choice 2: To understand model reliability
   Choice 3: To increase model complexity
   Choice 4: To reduce data usage

2. What phenomenon occurs when a model remembers the training set?
   Choice 1: Underfitting
   Choice 2: Overfitting
   Choice 3: Generalization
   Choice 4: Data leakage

3. What term describes when both prediction and reality indicate no fire?
   Choice 1: True Positive
   Choice 2: True Negative
   Choice 3: False Positive
   Choice 4: False Negative

4. What is a False Positive in the context of forest fire prediction?
   Choice 1: Fire predicted when there is none
   Choice 2: Fire predicted correctly
   Choice 3: No fire predicted when there is one
   Choice 4: No fire predicted correctly

5. Which matrix helps in mapping prediction against reality?
   Choice 1: Decision matrix
   Choice 2: Confusion matrix
   Choice 3: Accuracy matrix
   Choice 4: Performance matrix

6. What does accuracy measure in model evaluation?
   Choice 1: Total predictions made
   Choice 2: Percentage of correct predictions
   Choice 3: Number of training samples
   Choice 4: Complexity of the model

7. What is a False Negative in the forest fire scenario?
   Choice 1: Fire predicted correctly
   Choice 2: No fire predicted correctly
   Choice 3: No fire predicted when there is one
   Choice 4: Fire predicted when there is none

8. Why shouldn't training data be used for evaluation?
   Choice 1: It leads to overfitting
   Choice 2: It increases accuracy
   Choice 3: It improves model training
   Choice 4: It reduces computation time

9. What is a True Positive?
   Choice 1: Correct prediction of no fire
   Choice 2: Correct prediction of fire
   Choice 3: Incorrect prediction of fire
   Choice 4: Incorrect prediction of no fire

10. What does the confusion matrix NOT provide?
    Choice 1: Evaluation metrics
    Choice 2: Understanding of model performance
    Choice 3: Prediction results
    Choice 4: Mapping of predictions and reality
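Note: Several of the questions above turn on how a 2x2 confusion matrix is read and how accuracy is computed from it. As a quick reference, here is a minimal Python sketch; the fire/no-fire counts are made up for illustration and are not part of the worksheet:

    # Confusion-matrix cells for the forest fire example (illustrative counts)
    TP = 40   # fire predicted, and there really was a fire
    TN = 50   # no fire predicted, and there was none
    FP = 5    # fire predicted, but there was none (false alarm)
    FN = 5    # no fire predicted, but there was one (missed fire)

    total = TP + TN + FP + FN
    accuracy = (TP + TN) / total          # fraction of correct predictions
    print(f"Accuracy: {accuracy:.0%}")    # Accuracy: 90%

Because accuracy lumps True Positives and True Negatives together, it can stay high even when a rare event such as an actual fire is usually missed, which is why the next set of questions introduces Precision and Recall.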
11. What does True Positive (TP) indicate?
    Choice 1: Correctly predicted positive cases
    Choice 2: Incorrectly predicted positive cases
    Choice 3: Correctly predicted negative cases
    Choice 4: Incorrectly predicted negative cases

12. What is True Negative (TN)?
    Choice 1: Correctly predicted negative cases
    Choice 2: Incorrectly predicted negative cases
    Choice 3: Correctly predicted positive cases
    Choice 4: Incorrectly predicted positive cases

13. What does a high Precision value imply?
    Choice 1: More False Positives
    Choice 2: Fewer False Alarms
    Choice 3: More False Negatives
    Choice 4: More True Negatives

14. Which of the following increases if there are many False Positives?
    Choice 1: Precision
    Choice 2: Recall
    Choice 3: Accuracy
    Choice 4: False Negative rate

15. What does Recall measure?
    Choice 1: Fraction of actual positives correctly identified
    Choice 2: Fraction of actual negatives correctly identified
    Choice 3: Total number of correct predictions
    Choice 4: Total number of incorrect predictions

16. In the context of fire detection, what is a False Negative?
    Choice 1: Fire detected when there is none
    Choice 2: Fire not detected when there is one
    Choice 3: Correct detection of fire
    Choice 4: Correct detection of no fire

17. What happens if Precision is low?
    Choice 1: More false alarms occur
    Choice 2: Fewer false alarms occur
    Choice 3: Increased True Positives
    Choice 4: Increased True Negatives

18. Which metric does not consider False Negatives?
    Choice 1: Precision
    Choice 2: Recall
    Choice 3: Accuracy
    Choice 4: F1 Score

19. What is the consequence of a model with 100% Precision?
    Choice 1: No False Positives
    Choice 2: No False Negatives
    Choice 3: More True Negatives
    Choice 4: More False Alarms

20. If a model has high Recall, what does it signify?
    Choice 1: It correctly identifies most positive cases
    Choice 2: It misses many positive cases
    Choice 3: It has many false alarms
    Choice 4: It has low accuracy
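Note: The Precision and Recall questions above can be checked with the same kind of sketch, again with made-up counts:

    TP, FP, FN = 40, 10, 5       # illustrative counts, not from the worksheet

    precision = TP / (TP + FP)   # of all predicted fires, how many were real?
    recall    = TP / (TP + FN)   # of all actual fires, how many were detected?

    print(f"Precision: {precision:.2f}")   # 0.80 -- False Positives (false alarms) pull this down
    print(f"Recall:    {recall:.2f}")      # 0.89 -- False Negatives (missed fires) pull this down

Both fractions share True Positives in the numerator; they differ only in which error type joins the denominator, which is exactly what questions 21-23 below ask about.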
21. What do Precision and Recall both share in their numerator?
    Choice 1: True Positives
    Choice 2: False Positives
    Choice 3: False Negatives
    Choice 4: True Negatives

22. What does Precision measure in its denominator?
    Choice 1: False Positives
    Choice 2: True Positives
    Choice 3: True Negatives
    Choice 4: False Negatives

23. What does Recall consider in its denominator?
    Choice 1: True Positives
    Choice 2: False Negatives
    Choice 3: False Positives
    Choice 4: True Negatives

24. What is the purpose of the F1 Score?
    Choice 1: To measure accuracy
    Choice 2: To count True Positives
    Choice 3: To evaluate model speed
    Choice 4: To balance Precision and Recall

25. When is the F1 Score considered perfect?
    Choice 1: When Precision is 0
    Choice 2: When Recall is 0
    Choice 3: When F1 Score is 0.5
    Choice 4: When both Precision and Recall are 1

26. What range does the F1 Score fall within?
    Choice 1: 0 to 10
    Choice 2: 0 to 1
    Choice 3: 0 to 100
    Choice 4: 0 to 50

27. If a model has high Precision but low Recall, what can be said?
    Choice 1: The model is performing poorly
    Choice 2: The model has balanced performance
    Choice 3: The model is biased towards recall
    Choice 4: The model is performing well

28. What happens to the F1 Score if both Precision and Recall are low?
    Choice 1: It increases
    Choice 2: It decreases
    Choice 3: It remains unchanged
    Choice 4: It becomes perfect

29. Which scenario indicates good model performance?
    Choice 1: High Precision, Low Recall
    Choice 2: Low Precision, High Recall
    Choice 3: High F1 Score
    Choice 4: Low F1 Score

30. Which measure is NOT directly considered in the F1 Score calculation?
    Choice 1: Precision
    Choice 2: Recall
    Choice 3: True Positives
    Choice 4: False Negatives
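Note: The F1 Score combines Precision and Recall as their harmonic mean. A minimal sketch; the helper f1_score below is written for this note and is not a library function:

    def f1_score(precision, recall):
        # Harmonic mean of Precision and Recall; defined as 0 when both are 0.
        if precision + recall == 0:
            return 0.0
        return 2 * precision * recall / (precision + recall)

    print(f"{f1_score(1.0, 1.0):.2f}")   # 1.00 -> perfect F1 (both Precision and Recall are 1)
    print(f"{f1_score(0.9, 0.1):.2f}")   # 0.18 -> high Precision but low Recall drags F1 down
    print(f"{f1_score(0.0, 0.0):.2f}")   # 0.00 -> lowest value; F1 always lies between 0 and 1

Because the harmonic mean is dominated by the smaller of its two inputs, a model cannot earn a high F1 Score by excelling at only one of Precision or Recall.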