
11/5/24, 9:36 PM Guidelines for Search - Music SRA Classifier Training — BaseLine


Guidelines for Search - Music SRA Classifier Training

Updated 10.17.24:
Added clarification and examples to Harmful Intent, Interest in Sensitive Topic, and Ambiguous Intent definitions.

Updated 10.01.24:
Added clarification for following Sensitive Topics: Suicide and Self-Harm, Serious Mental Health Concerns, and Restricted and Regulated Content
Added clarification for following Intent Types: Harmful Intent, Opposing Sensitive Topic, Ambiguous Intent

Updated 09.09.24:
Additional clarification on “Not Applicable, the query contains unintelligible text or non-English language.” response option
Additional guidance on how to approach determining Sensitive Topic
Additional clarification on Serious Mental Health Concerns response option
Additional clarification on Violence response option
Additional clarification on Adult Sexual Material response option
Additional guidance on how to approach determining User Intent
Additional clarification on Ambiguous Intent response option
Additional clarification on Navigating to Specific Content response option
Additional clarification on Navigating to Related Content response option

Search - Music SRA Classifier Evaluation Guidelines


CONTENT WARNING: Because of the sensitive nature of this task, you may be exposed to content that is upsetting or offensive. Your participation is completely optional. You are encouraged to take breaks and seek
help as needed. If you are not comfortable participating in this task, please speak to a team lead or manager.

Instructions
For this task, you will be presented with a query, or search, that may be entered into a music streaming platform. You will be asked to read the query and evaluate whether it: (1) contains Sensitive Topics, and if so, which topics are present; and (2) references an At-Risk Group. You will also be asked to evaluate the user’s intent.

The following guidelines will explain how to evaluate the query and outline the different Sensitive Topics, potential At-Risk Groups, and the different types of user intent. This will include explanations of each question, and
definitions and examples of each response option.

Here is how you will see the query and the questions as you review in Baseline.

[Task screenshot]

As you review each query, you will respond to the questions below, which relate to user intent and other aspects of the query content.
1. Does the query reference a sensitive topic?
2. Which sensitive topic, if any, is referenced in the query? (Primary Sensitive Topic)
3. If applicable, please select a second sensitive topic referenced in the query. (Sensitive Topic 2; optional)
4. If applicable, please select a third sensitive topic referenced in the query. (Sensitive Topic 3; optional)
5. Does the query reference an at-risk group?
6. What is the user intent?
7. Please explain your selections. (optional)
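For readers who track these labels programmatically, the seven responses above can be sketched as a simple record. This is an illustrative schema only; the field and option names are our own shorthand, not Baseline's internal identifiers.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative shorthand for the Q1/Q5 response options; not Baseline's names.
SENSITIVE_TOPIC_ANSWERS = {"clearly", "maybe", "no", "not_applicable"}
AT_RISK_ANSWERS = {"clearly", "maybe", "no", "not_applicable"}

@dataclass
class QueryAnnotation:
    query: str
    references_sensitive_topic: str          # Q1: one of SENSITIVE_TOPIC_ANSWERS
    primary_sensitive_topic: str             # Q2: required, even when Q1 is "no"
    sensitive_topic_2: Optional[str] = None  # Q3: optional second topic
    sensitive_topic_3: Optional[str] = None  # Q4: optional third topic
    references_at_risk_group: str = "no"     # Q5: one of AT_RISK_ANSWERS
    user_intent: str = "ambiguous"           # Q6
    explanation: Optional[str] = None        # Q7: optional free text
```

Note that Q3 and Q4 default to `None`, mirroring the guideline that they are answered only when a query references more than one Sensitive Topic.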

Part 1: Determine whether a query references a Sensitive Topic

You’ll evaluate whether a query references one of the 17 Sensitive Topics from the list below. Please familiarize yourself with these topics and review the definitions and examples outlined in Part 2.

Question 1: Does the query reference a sensitive topic?


1. Clearly references a sensitive topic. Select this option if the query references at least one sensitive topic in a way that you think is clear and obvious.
2. Maybe references a sensitive topic. Select this option if you think the query may reference at least one sensitive topic, but you are not certain.
3. No, does not reference a sensitive topic. Select this option if you are certain that the query does not reference any sensitive topic.
4. Not Applicable, the query contains unintelligible text or non-English language. Select this option if you cannot understand the query, i.e., if it’s gibberish or has severe typos (e.g., “sdsdsdhhhjj”) OR if the query
contains non-English words (e.g., “I want to matar [spanish]”, “सर्वश्रेष्ठ बॉलीवुड साउंडट्रैक [hindi]”). Please use your best judgment as to whether the query is intelligible, but please ALWAYS select “Not Applicable” for
queries containing non-English. Examples:
“invisisstment”, “making millions throuh mone yheidnt” → please select this “Not Applicable” option when the queries are unintelligible.
“spice wirld”, “i cant expos the way i feel caus that would make me kill myself”, “wanna beat my dumba$$ bro” → please evaluate the query as usual if you can understand what the user intended and select from the first 3 response options.

Sensitive Topics:
1. Suicide and Self-Harm
2. Serious Mental Health Concerns
3. Toxicity, Stereotypes, and Identity-based Hate
4. Violence
5. Interpersonal Abuse
6. Child Sexual Abuse and Child Pornography
7. Adult Sexual Material
8. Illegal Activities, Goods, and Services
9. Queries Insulting Content or Creators
10. Risk of Reputational Harm and Private Information Leaks
11. Unsafe Actions
12. Unethical Actions
13. Disinformation and Conspiracy Theories
14. Politically Controversial Topics
15. Vulgarity and Offensiveness
16. Restricted and Regulated Content
17. Other type of harmful or offensive text

Part 2: Determine which Sensitive Topics are referenced

After you indicate whether the query contains a Sensitive Topic, you will be asked to indicate which Sensitive Topic(s) are referenced in the query. To do so, please review each type of Sensitive Topic. You will see the
following three questions.

Questions 2-4:

1. Which sensitive topic, if any, is referenced in the query? (Primary Sensitive Topic).
2. If applicable, please select a second sensitive topic referenced in the query. (Sensitive Topic 2; optional)

https://baseline.apple.com/training/evaluations/1072/guidelines 1/4

3. If applicable, please select a third sensitive topic referenced in the query. (Sensitive Topic 3; optional)

Next to each question is a drop-down list of the different Sensitive Topics. Please select which topic(s) are referenced in the query, if any. You can select only one option per question. If the query includes more than 3
Sensitive Topics, please describe the content in the text box of the final question.

Question Guidelines:

Question 2 (“Which sensitive topic, if any, is referenced in the query?”) is required, even if the query did not reference a Sensitive Topic.
If you selected either “Clearly references a sensitive topic” or “Maybe references a sensitive topic” for Q1 (“Does the query reference a sensitive topic?”), please select which primary Sensitive Topic the query
mentioned.
If you believe the query contains a type of harmful or sensitive content that’s not included in the Sensitive Topic list, select “Other type of harmful or offensive text.” You will be asked to describe the sensitive content in the last question of the task.
If you selected “No, does not reference a sensitive topic” for Q1, please select “None, there is no sensitive topic present” from the drop-down list for Q2.
If you selected “Not Applicable, the query contains unintelligible text or non-English language.” for Q1, please select “Not Applicable, the query contains unintelligible text or non-English language.” from
the drop-down list for Q2.

Questions 3-4 are optional and should be answered only if the query references more than one Sensitive Topic. If there is only one Sensitive Topic, leave Q3 and Q4 blank.
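The Q1 → Q2 dependency described in the bullets above amounts to a small consistency rule. A hypothetical validator (our own sketch; the option strings are shorthand, not Baseline's internal identifiers) might look like:

```python
# Sketch of the Q1 -> Q2 consistency rule described above.
# Q1 answers are shorthand: "clearly", "maybe", "no", "not_applicable".
NONE_OPTION = "None, there is no sensitive topic present"
NA_OPTION = ("Not Applicable, the query contains unintelligible text "
             "or non-English language")

def q2_is_consistent(q1_answer: str, q2_answer: str) -> bool:
    """Return True if the Q2 selection is allowed given the Q1 answer."""
    if q1_answer == "no":
        return q2_answer == NONE_OPTION
    if q1_answer == "not_applicable":
        return q2_answer == NA_OPTION
    # "clearly" or "maybe": Q2 must name an actual Sensitive Topic.
    return q2_answer not in (NONE_OPTION, NA_OPTION)
```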

Types of Sensitive Topics

Below are guidelines to help you evaluate each query for Sensitive Topics. The following list includes definitions for each Sensitive Topic and guidelines for whether the query falls under that Sensitive Topic.

Please note that there may be queries that appear to relate to a Sensitive Topic below, but do not actually meet the criteria outlined by the definition to be considered a sensitive topic. You should select “No, does
not reference a sensitive topic” if the query does not meet the criteria described below. For example, as you will see in the Serious Mental Health Concerns definition below, “wanting depressing stuff” is a query that
should not be considered as fitting the definition of that category despite being partially conceptually related. Closely review the different category definitions and examples to learn more about these queries.

1. Suicide and Self-Harm


Queries that mention, promote, or glorify either suicide or self-harm fall into this category. Self-harm is intentional harm to oneself without the intent to result in death (e.g., cutting or burning oneself). Queries that mention eating disorders (e.g., anorexia, bulimia) are also included. Look out for code words and slang used to refer to these topics, such as: suicide: “unalive”, “self-delete”, “self-erase”; eating disorders: “thinspiration”, “thinspo”, “proana” (for anorexia), “pro mia” (for bulimia).
Examples: “songs while I hang myself”, “unalive me please”, “cutting inspo”, “proana music”, “want to die”
Note: Queries that make a general reference to death, such as “faces of death” or “valley of the dead,” should be labeled “No, does not reference a sensitive topic.”

2. Serious Mental Health Concerns


Queries that mention serious mental health disorders, distress, or crises (e.g., panic attack, psychosis, OCD) fall into this category. Substance abuse and/or addiction are also included. Mentions of temporary moods or
requests for emotional content are NOT considered serious mental health concerns.
Examples: “paralyzed by anxiety”, “opioid addiction support”, "overwhelmed by bipolar disorder," "stuck in bad OCD spiral"
Note: Queries such as “feeling anxious” and “want depressing stuff” should be labeled “No, does not reference a sensitive topic.”

3. Toxicity, Stereotypes, and Identity-based Hate


Queries that are hateful, mean-spirited, or contain insulting language (e.g., “idiot”, “terrible”, “worst”) are considered toxic, particularly when intended to harass, intimidate, or poke fun. Targets of insults can be specific
social groups (e.g., people with disabilities, gay people) or individuals (except for content creators, who are covered under “Queries Insulting Content Creators”). This category includes hate speech and slurs,
stereotypes and exclusionary language, and references to hateful ideologies which primarily target an identity group, such as Nazism or white supremacy (e.g., the Ku Klux Klan).
Examples: “bitches be crazy”, “tunes for my idiot brother”, “nasty trans songs”, “songs about lazy mexicans”, “great pop no jewish singers”, “heil hitler”, “love songs for losers”

4. Violence
Queries that mention, glorify, and/or incite physical, psychological/emotional, or sexual violence against a random individual or indiscriminate group of people or animals, fall into this category. Queries that mention
terrorism and violent extremism are also included. Note that queries with mild and generic (e.g., “fight club”) or fictional (e.g., “demon slayer”) references to violence are NOT included. Queries that mention weapons or
warfare are included ONLY if they also explicitly mention a violent action directed at people or animals.
Examples: “I wanna kill them all”, “loud songs for torturing my cat”, “ISIS recruitment songs”, “songs while i mow down the haters with my ar15”
Note: Queries such as “the duel”, “zombie shooting”, “guns and roses”, and “war games” should be labeled “No, does not reference a sensitive topic.”

5. Interpersonal Abuse
Queries that mention, describe, and/or glorify physical, psychological/emotional, or sexual abuse among people who know each other (e.g., romantic partners, classmates, family members) fall into this category.
Interpersonal Abuse has the specific goal of causing harm to an individual the user personally knows. This category includes behaviors such as bullying, hazing, coercion, sexual assault, intimidation, and stalking.
Examples: “song to gaslight my grandma”, “metal tracks while I beat up my girlfriend”, “bullying playlist”, “I want to punch the neighborhood kids”

6. Child Sexual Abuse and Child Pornography


Queries that specifically mention child sexual abuse (including sexualization of minors) or Child Sexual Abuse Material (i.e., pornographic content involving underage minors) fall into this category. This includes expression of interest in obtaining or creating these materials.
Examples: “underage sex audio recording”, “spicy tunes for teen porn strip”, “child molester”

7. Adult Sexual Material


Queries that mention sexual acts or refer to any body part in an explicitly sexualized or crude way (e.g., “pussy”) fall into this category. Note that clinical terms for sexual organs (e.g., “penis”, “vagina”) are NOT severe/explicit enough and are NOT considered “Adult Sexual Material.” Additionally, mild references to sex (e.g., “sexual songs”) or euphemistic phrases such as “getting it on” or “making love” are not included.
Examples: “lick my pussy anthem”, “foot fetish playlist”, “titty bounce”
Note: Queries such as “hooking up” and “sexy woman” should be labeled “No, does not reference a sensitive topic.”

8. Illegal Activities, Goods, and Services


Queries that mention illegal activities, including the selling or trafficking of prohibited goods and services, fall into this category. This includes the purchase or use of illegal drugs, prostitution, and human trafficking. Note that recreational use of illegal substances falls into this category, while references to substance abuse and addiction fall under “Serious Mental Health Concerns.”
Examples: “lyrics for making a ghost gun at home”, “we’re like cocaine and ketamine”, “how can i get a personal slave”

9. Queries Insulting Content or Creators


Queries that either directly insult or express a negative value judgment (e.g., “worst”, “stupid”, “lamest”) towards content or towards a Third Party fall into this category. Third Parties include artists and content creators, public figures and celebrities, and companies. In combination with search results, these queries pose a risk of Apple appearing to affirm a derogatory sentiment.
Examples: “worst songs in apple music”, “most overrated rappers”, “artists completely out of touch with Gen Z”, “dumbest Disney songs”, “bad tswift songs”, “playlist about how Kim Kardashian is
evil”

10. Risk of Reputational Harm and Private Information Leaks


Queries that either pose a factual question or make an assumption about a Third Party fall into this category, as they pose a risk of defamation. Third Parties include artists and content creators, public figures and celebrities, and companies. Depending on the search results, these queries might harm the reputation of a third party by insinuating false information or inadvertently revealing private information.
Examples: “which country artists have AIDS”, "rappers who went bankrupt", “libertarian rappers”

11. Unsafe Actions


Queries that mention, promote, and/or glorify Unsafe Actions fall into this category. Engaging in an Unsafe Action means doing something which has a potential benefit (e.g., it seems fun, could increase social media followers, etc.), but which poses a risk of physical harm to oneself. An Unsafe Action is not undertaken with the goal of causing self-harm, but the action has a high probability of negative repercussions. This category does NOT include violence directed at others.
Examples: “blackout drunk cooking songs”, “tunes for doing the tide pod challenge”, “playlist for making a tiktok while driving”, “drinking bleach”

12. Unethical Actions


Queries that mention, promote and/or glorify Unethical Actions fall into this category. Unethical Actions lack integrity and are undertaken with the goal of benefitting oneself, either by getting away with something or
taking advantage of a person or situation. The primary goal is not to cause harm, but these actions often have secondary consequences of hurting someone or eroding trust in institutions (e.g., cheating on a romantic
partner, plagiarizing an essay).
Examples: “music to catfish someone online”, “tunes for cheating on my test”, “playlist to cause chaos at election polling station”, “SAT cheating ring”

13. Disinformation and Conspiracy Theories


Queries that mention or seek to spread disinformation narratives (e.g., “fake news”) or conspiracy theories fall into this category. Disinformation is intentionally false or misleading information (e.g., propaganda) intended to influence or manipulate public opinion on political, cultural, or social issues. Conspiracy Theories are unverified claims and explanations of unexplained phenomena that propagate at the margins of mainstream conversation (e.g., the moon landing was fake). Conspiracy Theories are, by nature, not widely believed or officially recognized.
Examples: “playlist to convince my daughter vaccines cause autism”, “9/11 was an inside job”, “catchiest songs for a tiktok about using bleach to cure covid”

14. Politically Controversial Topics


Queries that mention Controversial Topics fall into this category. Controversial Topics are in a state of prolonged public dispute or debate, usually concerning a matter of conflicting opinion. These are generally
considered hot-button or divisive topics being actively discussed in mainstream conversation (e.g., disputed territories, legalization of recreational drugs).
Examples: “abortion is murder”, “free Palestine”, “gun rights”, “immigrants ruining our country”

15. Vulgarity and Offensiveness


Queries containing words, phrases, or vulgar slang that are considered coarse, offensive, crude, in bad taste, or impolite, and/or would be “bleeped” or blurred in broadcast media, fall into this category.
Examples: “cunty energy”, “cocksuckers”, “best motherfuckin electro”

16. Restricted and Regulated Content


Restricted content is censored or withheld from the public for some reason. Regulated content refers to the dispensing of information within “regulated industries” such as healthcare, taxes, and law. Queries expressing
interest in restricted or censored content or asking for advice in these “high-risk” industries fall into this category, since the discussion of these topics is risky and generally governed by regulatory bodies (e.g., the FDA).
For example, relating to healthcare, queries clearly requesting advice for a diagnosis or treatment/medication fall into this category.
Examples: “album banned in US”, “song that will cure my diabetes”, “lyrics that can help me diagnose my autism”

17. Other type of harmful or offensive text.


Queries that are potentially sensitive, harmful, or offensive but which aren’t covered by any Sensitive Topic categories above fall into this category. You MUST briefly explain your reasoning in the text box below.

18. None, there is no sensitive topic present.


Select this option if you selected “No, does not reference a sensitive topic” for Q1.

19. Not Applicable, the query contains unintelligible text or non-English language.
Select this option if you selected “Not Applicable, the query contains unintelligible text or non-English language” for Q1.

Queries with Multiple Sensitive Topics



Most queries will only have one Sensitive Topic present. There will be some exceptions, in which case you should select all relevant Sensitive Topics for one query by responding to Questions 3 and 4. Here is how you
would answer Qs 3-4 for the following queries:

1. “spic singers who are disgusting illegal aliens”


Q2: Toxicity, Stereotypes, and Identity-based Hate (“spic” is a racial/ethnic slur)
Q3: Risk of Reputational Harm and Private Information Leaks (search results to the query “singers who are illegal aliens” may reveal or insinuate an unverified factual statement)
Q4: Politically Controversial Topics (“disgusting illegal aliens” references ongoing mainstream debate over immigration policies)

2. "sellin drugs and shootin enemies"


Q2: Violence (“shootin enemies”)
Q3: Illegal Activities, Goods, and Services (“sellin drugs”)

3. “music for getting head while drunk driving”


Q2: Illegal Activities, Goods, and Services (“drunk driving”)
Q3: Unsafe Actions (“drunk driving”)
Q4: Adult Sexual Material (“getting head”)

4. “pop singers who should kill themselves”


Q2: Queries Insulting Content or Creators (“pop singers” reference has potential to return harmful search results)
Q3: Suicide and Self-Harm (“kill themselves”)

Part 3: Determine whether a query references an At-Risk Group

While queries may or may not reference sensitive topics, queries may also reference At-Risk Groups. At-Risk Groups are groups of people who are at greater risk for harm (physical or psychological) because of who they
are. The risk may be rooted in members’ mental/physical health, or in members’ marginalized identities / beliefs / experiences. While an exhaustive list of At-Risk Groups is impossible to provide, here is an incomplete list
of common At-Risk Groups — when you are considering whether an At-Risk Group is present in a query, use this list as a starting point:
children (ages 0-17)
elderly (age 65+)
pregnant people
single parents
people struggling with thoughts of suicide or self-harm (including eating disorders)
people struggling with addictions and other mental/psychological disorders
people with other chronic or long-term health conditions
racial and ethnic minorities
minority religious groups
sexual-orientation and gender minorities
immigrants

Question 5: Does the query reference an at-risk group?


1. Clearly references an at-risk group. Select this option if the query references at least one At-Risk Group in a way that you think is clear and obvious.
2. Maybe references an at-risk group. Select this option if you think the query may reference at least one At-Risk Group, but you are not certain.
3. No, does not reference an at-risk group. Select this option if you are certain that the query does not reference any At-Risk Group.
4. Not Applicable, the query contains unintelligible text or non-English language. Select this option if you selected “Not Applicable, the query contains unintelligible text or non-English language” for Q1.

Examples of queries clearly referencing At-Risk Groups are: “Vietnamese pop”, “muslim prayers”, “kindergarten graduation music”, “first trimester support”, “songs about surviving cancer”, “meth head music”, “uplifting album for my suicidal brother”

Part 4: Determine the User Intent


In Q1, you identified whether or not a query referenced any Sensitive Topic(s). You will now determine the User Intent, i.e., what the user was hoping to accomplish with that query. User intent may take different forms: some users may have (1) a harmful intent in mind, while others may be (2) interested in learning about a sensitive topic, (3) trying to avoid a sensitive topic, (4) searching for a specific piece of music content, or (5) searching for similar/related music content; there will also be many queries where (6) the user intent is unclear.

If you selected “Clearly references a sensitive topic” or “Maybe references a sensitive topic” for Q1, you will determine which of the 6 user intent categories above best matches the intent of that particular
query.
If you selected “No, does not reference a sensitive topic” for Q1, please select “There is no sensitive topic present.” for this question.
If you selected “Not Applicable, the query contains unintelligible text or non-English language.” for Q1, please select “Not Applicable, the query contains unintelligible text or non-English language.” for
this question.
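The Q1 → Q6 routing in the bullets above follows the same conditional pattern as Questions 2-4. A hypothetical sketch (our own naming, purely illustrative):

```python
# Sketch of how the Q1 answer constrains the Q6 (user intent) options,
# per the bullets above. Category names are illustrative shorthand.
INTENT_OPTIONS = {
    "harmful", "interest", "opposing", "ambiguous",
    "navigate_specific", "navigate_related",
}

def allowed_intents(q1_answer: str) -> set:
    """Return the Q6 options permitted for a given Q1 answer."""
    if q1_answer == "no":
        return {"no_sensitive_topic"}
    if q1_answer == "not_applicable":
        return {"not_applicable"}
    # "clearly" or "maybe": choose among the six intent categories.
    return INTENT_OPTIONS
```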

The different types of intent can be challenging to distinguish, which is why it is critical that you closely review the definitions and examples below. This will help you understand the key terms that need to be present
for a query to fit a given intent category. Select “Ambiguous Intent” if the query references a sensitive topic, but the user intent does not clearly match the other intent category definitions. Note that “Ambiguous
Intent” may be a common selection for the sensitive topic queries you review.

Question 6: What is the user intent?

1. Harmful Intent: Along with the Sensitive Topic, the query may include: (i) intentional language (e.g., “music for”, “songs to”, “music while”) and/or user-centric terms (e.g., “I”, “me”, “myself”) expressing the intent to commit or associate with a sensitive/harmful activity; (ii) a negative personal value judgment (e.g., “worst”, “stupid”, “loser”) directed towards others or types of music content; and/or (iii) derogatory language including slurs or stereotypes/“dogwhistles”.
“music for shooting up a school”
“i hate blacks songs”
“songs while cutting myself”
“songs about faggots”
“want to get hauk tuah blow job”
“worst 90s rap songs”
“Adele’s dumbest lyrics”
“Music for treating and curing my clinical depression“
“dirty Mexican drug dealer songs”

2. Interest in Sensitive Topic: The query indicates interest in a sensitive topic, but not an intent to act. It should include language such as “about”, “relating to”, or “on” in reference to the sensitive topic.
Note: If the query includes glorification words like “songs promoting drug use” or “music celebrating rape” instead of the more neutral words like “about”, then it should be classified as Ambiguous Intent (see #4 below).
“songs on suicide and ending it”
“music about beating your partner”
“songs about hot anal penetration”
“songs relating to gunning down a hater”
“lyrics that describe cutting arms”

3. Opposing Sensitive Topic: The query references a sensitive topic, but there is also text indicating a user is actively trying to avoid it or positively coping with a harm.
“calming songs for my anxiety”
“rap no gang shootings”
“songs for overcoming physical abuse”
“sultry songs that don’t mention rape”
“death metal songs with no murder”
“supportive music for drug addiction recovery”

4. Ambiguous Intent: The query references a sensitive topic, but the user intent is ambiguous because it does not meet the definition of any of the other categories on this list. These queries often appear as only a statement of the Sensitive Topic that does not explicitly state the user themself is engaging in a negative action or derogatory behavior (e.g., slur or stereotype) or opposing the sensitive topic. For example, “bombing a church” or “music promoting church bombing” should be classified as “Ambiguous Intent” since we cannot determine the user’s intent from the query, but not “songs for bombing a church” (Harmful Intent) or “songs about bombing a church” (Interest in Sensitive Topic).
“Shoot up a school”
“cutting and self harm”
“cocaine and ecstasy binge”
“steal cars”
“abortion”
“oral sex orgy anthems”
“insert in her cunt”
“songs celebrating genocide“

5. Navigating to Specific Content: A query fits this category if it references a sensitive topic within the name of a specific piece of content (e.g., song, album, artist, or lyric) AND also mentions the type of content (e.g., “song”, “album”, “lyrics that go”). Example: “music to be murdered by album.”
Note: The query “music to be murdered by” does not fit this category and should be evaluated purely on the text in the query. This particular query would be classified as “Ambiguous Intent.” Even though you may
know that “music to be murdered by” is an album, please evaluate as though you do not possess any domain knowledge and choose this category ONLY if the query text mentions “song”, “album”, “lyrics”, etc.
“lyrics that go that little faggot with the earring” - lyrics from Dire Straits
“Cop Killer song” - song title by Body Count
“Bleed Like Me by Garbage” - song title by Garbage
“lick my pussy and my crack song” - lyrics from Khia
“that kill yourself song by Suicide boy”

6. Navigating to Related Content: A query fits this category if it references a sensitive topic within the name of a specific piece of content AND also indicates the user is searching for results that are similar to that
content. It should include language such as “like”, “similar to”, “that matches” in reference to the specific content name.
Note: Queries such as “pornographic album” or “mass murder songs” do not fit this category because they do not refer to a specific song or artist name.
“songs like fuck da police”
“artists similar to Suicide boys”
“Albums like Kill em all”
“lyrics with same vibe of wet ass pussy”
“other songs matching rape me”


7. There is no sensitive topic present: The query does not reference any Sensitive Topic. Select this option if you selected “No, does not reference a sensitive topic” for Q1.
“rock for going on a run”
“top hits from the 80’s”
“Eminem rap while I cook”
“pump up anthems for a party”
“albums that won the Grammys 2021”

8. Not Applicable, the query contains unintelligible text or non-English language: Select this option if you selected “Not Applicable, the query contains unintelligible text or non-English language” for Q1.

Copyright © 2024 Apple Inc. All rights reserved. Apple Confidential.
