10 Training Evaluation
SAP SuccessFactors
Learning Administration -
Training Evaluations
SAP SE Copyrights and Trademarks
© 2018 SAP SE. All rights reserved.
No part of this publication may be reproduced or transmitted in any form or for any purpose without the express
permission of SAP SE. The information contained herein may be changed without prior notice.
Some software products marketed by SAP SE and its distributors contain proprietary software components of other
software vendors.
Microsoft, Windows, Excel, Outlook, and PowerPoint are registered trademarks of Microsoft Corporation.
IBM, DB2, DB2 Universal Database, System i, System i5, System p, System p5, System x, System
z, System z10, System z9, z10, z9, iSeries, pSeries, xSeries, zSeries, eServer, z/VM, z/OS, i5/OS,
S/390, OS/390, OS/400, AS/400, S/390 Parallel Enterprise Server, PowerVM, Power Architecture,
POWER6+, POWER6, POWER5+, POWER5, POWER, OpenPower, PowerPC, BatchPipes,
BladeCenter, System Storage, GPFS, HACMP, RETAIN, DB2 Connect, RACF, Redbooks, OS/2,
Parallel Sysplex, MVS/ESA, AIX, Intelligent Miner, WebSphere, Netfinity, Tivoli and Informix are
trademarks or registered trademarks of IBM Corporation.
Linux is the registered trademark of Linus Torvalds in the U.S. and other countries.
Adobe, the Adobe logo, Acrobat, PostScript, and Reader are either trademarks or registered trademarks of
Adobe Systems Incorporated in the United States and/or other countries.
Oracle is a registered trademark of Oracle Corporation.
UNIX, X/Open, OSF/1, and Motif are registered trademarks of the Open Group.
Citrix, ICA, Program Neighborhood, MetaFrame, WinFrame, VideoFrame, and MultiWin are trademarks or
registered trademarks of Citrix Systems, Inc.
HTML, XML, XHTML and W3C are trademarks or registered trademarks of W3C®, World Wide Web
Consortium, Massachusetts Institute of Technology.
Java is a registered trademark of Sun Microsystems, Inc.
SAP, SAP Fiori, SAPUI5, R/3, SAP NW Gateway, SAP NetWeaver, Duet, PartnerEdge, ByDesign,
SAP BusinessObjects Explorer, StreamWork, and other SAP products and services mentioned herein as well as
their respective logos are trademarks or registered trademarks of SAP SE in Germany and other countries.
Business Objects and the Business Objects logo, BusinessObjects, Crystal Reports, Crystal Decisions, Web
Intelligence, Xcelsius, and other Business Objects products and services mentioned herein as well as their
respective logos are trademarks or registered trademarks of Business Objects Software Ltd. Business Objects is
an SAP company.
Sybase and Adaptive Server, iAnywhere, Sybase 365, SQL Anywhere, and other Sybase products and
services mentioned herein as well as their respective logos are trademarks or registered trademarks of
Sybase, Inc. Sybase is an SAP company.
All other product and service names mentioned are the trademarks of their respective companies. Data contained in
this document serves informational purposes only. National product specifications may vary.
These materials are subject to change without notice. These materials are provided by SAP SE and its
affiliated companies ("SAP Group") for informational purposes only, without representation or warranty of any
kind, and SAP Group shall not be liable for errors or omissions with respect to the materials. The only
warranties for SAP Group products and services are those that are set forth in the express warranty
statements accompanying such products and services, if any. Nothing herein should be construed as
constituting an additional warranty.
Use / Example or Visualization:

Demonstration by Instructor: A hint or advanced detail is shown or clarified by the instructor. Please indicate to the instructor when you reach any of these points.

Warning or Caution: A word of caution, generally used to point out limitations or actions with potential negative impact that need to be considered consciously.

Hint: A hint, tip, or additional detail that helps increase performance of the solution or improve understanding of the solution.

Additional Information: An indicator pointing to additional information or techniques beyond the scope of the exercise but of potential interest to the participant.

Discussion/Group Exercise: Used to indicate that collaboration is required to conclude a given exercise. Collaboration can be a discussion or a virtual collaboration.

Solution or SAP-Specific Term: For example, Flavors are transaction-specific screen personalizations created and rendered using SAP Screen Personas.
Course Introduction
Overview
Through discussion, demonstration, and hands-on computer lab work, this course teaches
you the concepts and terminology associated with the training evaluation feature in the
SAP SuccessFactors Learning Management System (LMS). You will develop a working
knowledge of this model for use in implementation of your training evaluation program.
You will also gain basic skills in how to use SAP SuccessFactors Learning to create and
modify questionnaire surveys, associate surveys to items, and report on survey results
using the step-by-step, hands-on lab exercises.
Target Audience
This eLearning is intended for SAP SuccessFactors Learning administrators (admins)
responsible for adding and/or editing item (course) records with training evaluations.
Unit Objectives
This unit contains six lessons:
Do not block pop-up windows in your browser. Please unblock pop-up windows so that the
application may function as designed.
When you first log in to SAP SuccessFactors Learning, the following message banner may
display at the top of the window:
This training assumes that your SAP SuccessFactors Learning administrator role is
associated with all available workflows in the system. If your role does not include certain
workflows, those tabs and pages will be grayed out and/or inaccessible.
Required Fields
Your system administrator configured specific fields throughout the system as required
based on your organization’s business rules and processes. These required fields are
indicated with a red asterisk (*). You must input data in these fields before you are allowed
to progress.
If you are using SAP SuccessFactors Learning for training, the fields displayed and
marked as required may not reflect the settings you will encounter when accessing your
organization’s system. Your system administrator can provide you with a list of the
required fields for your organization.
Throughout the guide, you may encounter icons that call out various types of information.
The following table illustrates how this guide uses icons to indicate different types of notes,
comments, activities, exercises, etc. that support the text.
Activity: Indicates an activity for you to complete that helps reinforce the
information you just learned.
Best Practice: Indicates helpful hints and tips or other guidance that further
explains the information it accompanies.
Exercise: Indicates a hands-on exercise. Follow the step-by-step process
outlined to perform specific tasks in the instance.
https://community.successfactors.com/
Additional Resources
For more information about SAP SuccessFactors, refer to these resources:
Objectives
Level 1: Reaction – What learners thought and felt about the training
Level 2: Learning – The resulting increase in knowledge or capability
Level 3: Behavior – Extent of behavior and capability improvement and
implementation/application
Level 4: Results – The effects on the business or environment resulting from
the learner’s performance
Terms and Definitions
The following table lists the terms and definitions associated with the training evaluation
model in SAP SuccessFactors Learning.
Consider this scenario: You are on a Human Resources team responsible for the design
and implementation of communication courses designed to improve the quality of all forms
of communication throughout your company. One of the courses created focuses on
dealing with customers, conflict, and confrontation. This is an online course and is
available to all employees.
Using the scenario, adhere to the following steps to successfully implement a training
evaluation model:
With the training evaluation tools in SAP SuccessFactors Learning, an admin can create
surveys to support an item-level evaluation and a follow-up evaluation. Using Question
Editor to create pre- and post-exams rounds out the necessary tools in SAP
SuccessFactors Learning to fully support the implementation of the first three levels of a
training evaluation model.
Once Questionnaire Surveys are created, they are associated with items on the
Evaluations tab of an item record or from the survey record.
Item evaluation (user satisfaction) surveys may be automatically assigned to users when a
learning event is recorded for the item and the completion status is configured to trigger
the survey type. Alternatively, an instructor may trigger the user satisfaction survey prior to
recording the learning event.
Follow-up surveys are automatically assigned to users and/or their supervisors after a pre-
determined amount of time has passed since the learning event was recorded for the item.
The Follow-Up Evaluation automatic process manager (APM) must run in order to assign
the follow-up surveys. This process will also notify the users and/or supervisors when
either type of survey is assigned to them.
Pre-Configured Requirements
The ability to support a training evaluation model within SAP SuccessFactors Learning
relies on some pre-configured entities. For example, the pre- and post-exams used for a
learning evaluation are exams created using Question Editor.
Note: Please refer to the SAP SuccessFactors Learning: Online Exams and
Quizzes guide for additional information on how to create objectives, questions,
and exams.
Item Record
Surveys are associated with item records. For the purpose of this course, create an online
item focused on dealing with customers, conflict, and confrontation.
Rating Scale
Rating scale questions can be used with surveys. It is recommended, when implementing
item evaluation and follow-up evaluation surveys, to use at least two applicable rating
scales. For example, the first rating scale could be a five-point scale used for an item
evaluation survey. The second rating scale could be a frequency scale used for a follow-up
evaluation survey.
Conclusion
In this lesson, you learned the concepts and terminology associated with the training
evaluation model.
Knowledge Check
Use what you learned in this lesson to answer the following questions.
1. True or false: The final step for any training program is a summative evaluation in which you measure how effectively the training program accomplished its stated goals.
2. The fourth level of evaluation measures ______, and can be the most difficult to implement for a training program.
a. Learning
b. Reaction
c. Behavior
d. Results
Objectives
This type of evaluation can reveal valuable data if the questions asked are more complex.
For example, this questionnaire could move beyond how well the users liked the training to
questions about:
Questions asked on a typical item evaluation survey cover the basics of a course and can
be grouped according to focus area. For example, one set of questions might be related to
the effectiveness of the instructor, and another set of questions might inquire about the
usefulness of the materials. Because this type of evaluation is so easy and inexpensive to
administer, it is usually conducted in most organizations.
Question Types
When building your item evaluation, you must create questions and answer choices. There
are four types of questions:
Rating Scale: Use a question type of rating scale when you want to get
quantitative results.
One Choice: Use this question type when you want the user to choose one correct
answer from a group. Additional possible answer choices can be added by selecting
the green plus (+) icon at the end of the question type.
Multiple Choice: Use this question type when you want the user to be able to
choose multiple correct answers from a group. Additional possible answer choices
can be added by selecting the green plus (+) icon at the end of the question type.
Open Ended: Use this question type when you want the user to type an answer
(up to 3,990 characters).
Best Practice: Use the same rating scale for every question throughout a survey so that
the system can provide a mean score per page, across the entire survey, per item, and
per instructor.
Survey Pages
Pages act as an organization system for your survey. You can add a title for the page and
you can add instructions for the page. For example, you might create a page called
"Classroom Conditions" and add instructions that tell users to please evaluate the physical
classroom space.
Ensure that the questions being asked are applicable and relevant. For example, a set of
questions on how well an instructor kept the class engaged might not apply to a purely
online course.
Note: It is important to keep in mind the type of training event the survey will be used for
and how you want the pages in your survey to behave.
Multi-Instructor Surveys
At times, more than one instructor may be facilitating a training session. In these
instances, the survey tool can be configured to prompt the user to evaluate the
effectiveness of all instructors involved.
Pages are particularly important if your survey covers more than one instructor. If your
survey evaluates more than one instructor, then create a single page for all questions
related to instructors. This allows the system to repeat that page for each instructor. Select
Instructor from the Resource Type drop-down menu.
Note: It is very important to select the appropriate Resource Type when creating the
Survey page and to indicate the resource when the learning event is recorded. If the
learning event is recorded without the resource, the Survey page will not be available to
the user.
When responding to a survey, instructor-specific pages are repeated for each instructor
and the user is alerted that responses are related to the instructor specified on the page.
Please see the Reports module to view sample outputs of instructor results.
Localization
When creating questionnaire surveys for use with multiple languages, the values for each
language (locale) may be entered by selecting the globe icon next to localized fields, as
shown in the following figure. Localized fields include: survey name, survey description,
survey instructions, page titles, page instructions, question text, and question responses.
Creating a Survey
To create a survey, navigate to Admin Center > Learning > Learning Administration >
Learning > Questionnaire Survey and select Add New. Enter a Survey ID and Name and
select an Evaluation Level. Next you will enter the Survey Description and Comments,
then select Domain. Check the Active checkbox and click Add.
To create the survey questions, go to the Questions tab and enter the Survey Instructions.
Enter Page 1 title and instructions and click the Add Question icon in the top right corner of
that section. In the text box enter the question stem, then select the Question Type and
Rating Scale. When you are done creating questions on page 1, click Save Draft. Repeat
this process for all pages of the survey.
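The structure described above (a survey containing pages, and each page containing a title, instructions, and questions) can be sketched as plain data. This is an illustration only, not an SAP SuccessFactors API; all class and field names are hypothetical.

```python
# Illustrative sketch of the survey structure described above.
# These classes are hypothetical; they are NOT part of any SAP SuccessFactors API.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Question:
    stem: str                 # the question text entered in the text box
    question_type: str        # "Rating Scale", "One Choice", "Multiple Choice", "Open Ended"
    rating_scale: str = ""    # e.g. "5-Point Scale" (rating-scale questions only)

@dataclass
class Page:
    title: str
    instructions: str
    questions: List[Question] = field(default_factory=list)

@dataclass
class Survey:
    survey_id: str
    name: str
    evaluation_level: str     # e.g. "Item Evaluation: User Satisfaction"
    pages: List[Page] = field(default_factory=list)

# Build a one-page draft survey, mirroring the steps above.
survey = Survey("HR-EVAL-001", "Communication Course Evaluation",
                "Item Evaluation: User Satisfaction")
page1 = Page("Classroom Conditions",
             "Please evaluate the physical classroom space.")
page1.questions.append(Question("The room was comfortable.",
                                "Rating Scale", "5-Point Scale"))
page1.questions.append(Question("Additional comments:", "Open Ended"))
survey.pages.append(page1)
print(len(survey.pages), len(page1.questions))
```

The same page-by-page pattern repeats for every page of the survey before publishing.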
To configure options and notifications for the survey, select the Options tab. If you would
like the survey to be anonymous, click Yes. Check the Required for Item Completion
checkbox and enter the number of Days to Complete survey from the date of assignment.
Determine whether you want to include a comments field for each question. When done,
click Apply Changes.
If necessary, you may change the standard notification that goes out when the survey is
assigned by clicking on the Notifications tab.
To preview and publish the survey, select the Questions tab and click Preview. Then
select Draft from the drop-down menu. Click Publish and the survey is now ready to use.
Exercise - Create Draft Survey and Add Questions: Item
Evaluation
Scenario
An online course on how to deal with customers, conflict, and confrontation has been
created. You are responsible for creating the item evaluation survey to be assigned to all
users immediately after the course is completed. This is the general course evaluation
survey to be used after every online HR course.
Task
Write down questions for each page of this survey. Each question will be a rating scale
type using the five-point scale (created previously). There should be an open-ended
comments question at the end of each page.
In this exercise, you will create a draft survey and add questions to an Item Evaluation.
Configuration Options
Each item evaluation survey can be configured uniquely. A survey can be configured to be
submitted anonymously by users, which prevents an administrator accessing reports from
tying specific feedback to a specific user. A survey can also be required for item completion,
causing the item to sit in a pending state until the survey has been completed and
submitted by the user, at which time the item is moved to the user's Learning History.
In order to ensure there is a defined period in which users should submit feedback, a
configurable number of days can be entered to determine a date by which the user should
complete and submit the survey. If the user does not complete the survey in the timeframe
configured, the survey will move into an overdue status. Even though the survey is
considered overdue, the user will still have the ability to open, complete, and submit the
survey.
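The due-date behavior described above can be illustrated with simple date arithmetic. This is a sketch of the described behavior, not SAP code; the function name and signature are invented for illustration.

```python
# Sketch of the due-date/overdue behavior described above (illustrative only).
from datetime import date, timedelta

def survey_status(assigned_on: date, days_to_complete: int, today: date) -> str:
    """Return the survey's status based on the configured completion window.

    An overdue survey can still be opened, completed, and submitted,
    so this status never blocks submission."""
    due_date = assigned_on + timedelta(days=days_to_complete)
    return "overdue" if today > due_date else "assigned"

# Survey assigned Jan 1 with 14 days to complete: due Jan 15.
assigned = date(2018, 1, 1)
print(survey_status(assigned, 14, date(2018, 1, 10)))  # assigned
print(survey_status(assigned, 14, date(2018, 1, 20)))  # overdue
```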
The admin has the ability to remove item and follow-up evaluations from user records,
which is especially useful for overdue surveys. This can be done by accessing the Surveys
tab of the user record or using the User Needs Management tool.
A comments section can be added after each question for additional feedback per
question, as shown in the following figure.
Notifications
There are two notification templates associated with the questionnaire survey functionality:
The notification template may be modified on each individual survey, as shown in the
following figure. It is recommended to embed a questionnaire survey direct link into the
notification template to generate an actual link within the email.
Direct Links
The Direct Links tool allows admins to build and send a direct link to users. When users
click the links, they go to specific pages in SAP SuccessFactors Learning.
Questionnaire Survey direct links can be created to take users directly to the evaluation
instead of going through their learning assignment.
You can send the links to users so that they can access the pages by clicking the link.
Users are asked to provide login information if they are not already logged in.
You can embed the links in any environment that accepts URL links. For example, you can
add the link to an e-mail, notification message or a Web page.
The use of direct links does not compromise security. Users will have access to only those
sections of the application that their assigned workflow restrictions allow.
Once you have reviewed your questions and layout, close the preview and return to the
Questions tab. Click the Publish button to officially publish and make this survey available.
Once a survey is published, it can be associated with one or more items.
Alternatively, an association between the item and a survey can be created within an item
record from the Evaluations tab.
From the questionnaire survey record, an admin can view the number of users assigned
who have completed the item and been assigned to the survey, the number of completed
surveys, the percentage of assigned surveys that have been completed, and a mean
score for the survey results. The mean score is the average of all the rating scale question
responses across all completed surveys for that item. For each item associated with the
survey, the system calculates the mean score only if, for all surveys completed for that
item, all the rating scale type questions use the same rating scale as specified in the
Rating Scale field.
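The mean-score rule described above (average all rating-scale responses across all completed surveys, but only when every rating-scale question uses the scale specified in the Rating Scale field) can be sketched as follows. The function and data shapes are hypothetical, for illustration only.

```python
# Illustrative sketch of the mean-score rule described above (not SAP code).
from typing import List, Optional, Tuple

# Each completed survey is a list of (rating_scale_name, numeric_response) pairs.
def mean_score(completed_surveys: List[List[Tuple[str, int]]],
               required_scale: str) -> Optional[float]:
    """Average all rating-scale responses across all completed surveys,
    but only if every response uses the rating scale specified in the
    Rating Scale field; otherwise no mean score is calculated."""
    responses = [r for survey in completed_surveys for (_, r) in survey]
    scales = {scale for survey in completed_surveys for (scale, _) in survey}
    if not responses or scales != {required_scale}:
        return None  # mixed scales or no data: the system calculates no mean
    return sum(responses) / len(responses)

surveys = [[("5-point", 4), ("5-point", 5)], [("5-point", 3)]]
print(mean_score(surveys, "5-point"))                          # 4.0
print(mean_score([[("5-point", 4), ("freq", 2)]], "5-point"))  # None
```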
It is also possible to run a report from this screen on item evaluation results.
It is only possible to associate one survey per item. Therefore, if a survey is already
associated with an item, you will receive a warning message when attempting to associate
another survey. You are given the option to replace the previous survey with the current
one, or to cancel the association process.
Note: Depending on how the survey was configured, the Days to complete field and
the Required for Item Completion checkbox may or may not be already populated once
the survey is added. If necessary, change these fields and apply changes.
Conclusion
In this lesson, you built the first survey to support evaluation of the users’ reaction to the
training.
1. The most effective question type to use in order to obtain quantitative results is:
a. Open Ended
b. One Choice
c. Multiple Response
d. Rating Scale
2. True or false: An item may have more than one Item Evaluation: User Satisfaction (level 1) questionnaire survey associated with it.
Upon completion of this lesson, you will be able to associate exams with an item as pre-
and/or post-exams.
Set Up Learning Evaluations
Configuring an item to support learning evaluations is simple if the exams to be used as
pre- and post-exams have already been created. Otherwise, it is encouraged that you
refer to the SAP SuccessFactors Learning: Online Exams and Quizzes guide for additional
information on how to create objectives, how to create questions using Question Editor,
and how to create an exam from those questions. Although it is optional to associate
objectives with questions using Question Editor, it is highly encouraged to use this
methodology when creating questions to be used in pre- and post-exams that are part of
an evaluation model. Learning evaluation reports display results per objective; however,
this information is not available if the questions are not tied to objectives.
The purpose of a pre- and post-exam is to assess the user population’s knowledge of the
content prior to the training, and then just after the training. If the training program is
effective, there will be a demonstrable increase in knowledge when the two exam results
are compared.
Note: An item’s original classification may change when configuring to support a learning
evaluation. If the item is online, associating one or more exam objects does not affect the
online classification. If the item is instructor-led, associating one or more exam objects will
change the classification from instructor-led to blended.
In the Learning Evaluation: Mastery of Content section, click the drop-down menu for pre-
exam and post-exam and select the desired exams. When done, click Apply Changes.
Note: The exam object must first be associated with the item twice before it can be
selected as pretest or posttest.
Conclusion
In this lesson, you created an association between previously created exams and an item
as pre- and post-exams.
You should now be able to associate exams with an item as pre- and/or post-exams.
Knowledge Check
Use what you learned in this lesson to answer the following questions.
1. True or false: It is a best practice to associate objectives with questions for reporting
and analysis.
2. True or false: Level 2 evaluations are created the same way as Level 1 & Level
3 evaluations, within the Questionnaire Survey tool.
Objectives
Ideally, this measurement is conducted three to six months after the training program. By
allowing some time to pass, learners have the opportunity to implement new skills, and
retention rates can be checked. Observation surveys, sometimes called behavioral
scorecards, are used. Surveys can be completed by the user and/or the user's
supervisor.
Functionally, the follow-up evaluation questionnaire survey is created in the same way as
an item evaluation questionnaire survey. Navigate to Learning > Questionnaire Survey and
select Add New. Enter a Survey ID and Name and select an Evaluation Level, in this case,
Follow-up Evaluation. Next you will enter the Survey Description and Comments, and then
select Domain. Check the Active checkbox and click Add.
The follow-up survey has a very different purpose than the item evaluation survey;
therefore, the line of questioning is radically different. For example, follow-up survey
questions evaluating a sales training program might include:
These questions can be reworded to best fit the need of the program and evaluation
methodology. For example, the same questions above may be reworded for use with a
rating scale defined by frequency:
How often did the representative open each customer dialogue with a product benefit
statement, followed by a request to proceed?
How often was the representative able to analyze and describe to you the
category of customers' objections as either valid, misinformation, or smokescreen?
To create the survey questions, go to the Questions tab and enter the Survey Instructions.
Enter Page 1 Title and Instructions and click the Add Question icon in the top right corner
of that section. In the text box enter the question stem, then select the Question Type and
Rating Scale. When you are done creating questions on page 1, click Save Draft. Repeat
this process for all pages of the survey.
Configuration Options
A follow-up survey is assigned to a user and/or the user’s supervisor a configurable
number of days after item completion and is triggered by an automatic process manager
(APM).
You can configure each follow-up evaluation survey to uniquely fit your business needs.
Select the Options tab and make modifications in the Edit the Survey Defaults section.
Unlike item evaluation surveys, follow-up surveys may not be anonymous, since they help
determine how well a specific user is applying the knowledge gained now that they have
completed the course content and have had a chance to use that knowledge on the job.
You also have the option to include a comments section for each question. When done
selecting your options, click Apply Changes.
A follow-up survey can be completed by the user (employee) only, the supervisor only, or
both. It may also be made required for participants to complete.
A comments section can be added after each question for additional feedback per
question.
To preview and publish the survey, select the Questions tab and click Preview. Then
select Draft from the drop-down menu. Click Publish and the survey is now ready to use.
Exercise - Configure Options for Draft Survey: Follow-up Evaluation
Scenario
When the online course on how to deal with customers, conflict, and confrontation is
finished by employees, they will need to complete a follow-up evaluation survey. You are
responsible for creating the questions for the follow-up evaluation survey to be assigned to
all users and supervisors. In the next Exercise, we will configure the survey.
Task
Using the previous exercises in this course as a guide, complete the following tasks on
your own:
Create a follow-up evaluation survey using the sample questions on the next page.
Write additional questions for each page of this survey. Each question will use a rating
scale type using the five-point frequency scale (created previously). The comments
question at the end of each page is an open-ended question type.
In this exercise, you will configure options for Draft Survey: Follow-up Evaluation.
Conclusion
In this lesson, you built the second survey to support evaluation of the users’ behavior
after the training, and assessed their application of knowledge gained.
Knowledge Check
Use what you learned in this lesson to answer the following questions.
Upon completion of this lesson, you will be able to launch and complete assigned surveys.
The user can launch the survey by clicking the survey title from the My Learning
Assignments tile. The survey can be launched from the mobile app as well.
When launched, if the survey has been configured to be anonymous, it will be stated
above the survey.
If the survey is configured to be required for item completion, a warning message displays
to the user upon submission if he/she has missed or skipped any questions in the survey.
On the other hand, if the survey is optional (not required for item completion), then the
questions within the survey may also be considered optional, and the user has the ability
to skip questions when submitting. This is configurable in the LMS_ADMIN system
configuration file by changing the property UsersCanSubmitIncompleteOptionalSurveys
from false to true. Furthermore, with an optional survey, the user has the ability to
completely remove the survey from his/her Learning Plan, similar to a self-assigned item.
Once a survey has been submitted, it disappears from the user's view. There is no record
of a survey completion in a user's learning history.
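As a sketch of the configuration change described above, the property edit in the LMS_ADMIN system configuration file would look roughly like this (the property name is taken from the text; the comment and file layout are illustrative):

```properties
# LMS_ADMIN system configuration file (excerpt, illustrative layout)
# Allow users to submit an optional survey with unanswered questions:
UsersCanSubmitIncompleteOptionalSurveys=true
```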
When the user gets to the last page of the survey, a Submit button is available to submit
the completed survey, as shown in the figure below. As a user completes the survey, there
is an option to save the survey, which allows the user to complete it at a later time.
The online course on how to deal with customers, conflict, and confrontation needs to be
assigned to users. You are responsible for assigning this course to the appropriate
employees.
Task
Note: To test a new survey, it is recommended that the admin publish it, associate it to
an item, and then record a learning event for the item. This will assign the survey to the
user for testing.
Conclusion
This lesson demonstrated how a user would access and complete training evaluations.
2. True or false: If a survey is optional, the user has the ability to remove the
survey from his/her Learning Plan.
Upon completion of this lesson, you will be able to access and launch analysis reports.
Reports Overview
There are specific reports to support analysis of each step of the training evaluation
process.
The Item Evaluation Report shows the mean score (the average results of the rating scale
questions) for each survey, survey page, and survey question, as well as the percentage of
users who selected each response.
The Item Evaluation by Individual Response Report shows each user’s responses to the
survey questions.
The Item Evaluation by Instructor Report shows the mean score (the average results of the
rating scale questions) for each survey and survey page, grouped by instructor.
1. Navigate to Reports
2. Type evaluation in the Search box
3. Check the Learning checkbox in the Category section (you may have to un-check all
the other options)
4. Check the Published checkbox in the Publication Status section
5. Check the Admin checkbox in the Application section
6. Click Submit
7. Click the report title to open the report search window. You might need to click
the (+) to get to the available reports
8. Enter the report criteria
9. Click Run Report
1. Navigate to Reports
2. Type evaluation in the Search box
3. Check the Learning checkbox in the Category section
4. Check the Published checkbox in the Publication Status section
5. Check the Admin checkbox in the Application section
6. Click Submit
7. Click the report title to open the report search window. You might need to click
the (+) to get to the available reports
8. Enter the report criteria
9. Click Run Report
The Follow-up Evaluation by Individual Response Report shows the mean score (the
average results of the rating scale questions) for each follow-up survey and survey page.
1. Navigate to Reports
2. Type evaluation in the Search box
3. Check the Learning checkbox in the Category section
4. Check the Published checkbox in the Publication Status section
5. Check the Admin checkbox in the Application section
6. Click Submit
7. Click the report title to open the report search window. You might need to click
the (+) to get to the available reports
8. Enter the report criteria
9. Click Run Report
Conclusion
This lesson demonstrated how to access and run evaluation related reports.
Unit Summary
In this unit, you covered:
Appendix A
Customizing Survey Thumbnail
The thumbnail associated with questionnaire surveys may be customized by an admin.
Navigate to System Admin > Configuration > Images. Browse for and upload the desired
custom survey image that will represent a survey record on the user's Learning Plan.
Questionnaire Survey notifications can be customized either at the global level or at the
survey level. Notification customizations can be done by editing the body of the
notification. HTML may be added to improve the notification’s appearance to the end user.
Syntax tags, which represent data about the user and the item being surveyed, may be
included in the notifications. For example, the follow-up survey notification may be
configured to display the full name of the employee with the use of syntax tags.
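As a sketch of what a customized notification body with HTML and syntax tags might look like, consider the fragment below. The tag names are hypothetical placeholders; consult your system's syntax tag reference for the exact tags available.

```html
<!-- Illustrative notification body (excerpt). The tag names below are
     hypothetical placeholders, not confirmed SAP syntax tags. -->
<p>Dear <STUDENT_FULL_NAME>,</p>
<p>A follow-up evaluation for <ITEM_TITLE> has been assigned to you.
   Please complete it by <SURVEY_DUE_DATE>.</p>
```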
The Notification Template Editor may also be used to customize your survey notifications.
Navigate to the Notification tab in the left menu of the survey record and select the
LAUNCH EDITOR link.