Vol 3 myQA Machines User's Guide
Vol. 3
myQA Machines
SW Version: 2.12 | Release: 2019-002
Notice
This manual is an integral part of the myQA® system and should always be kept at hand. If the manual
is missing, immediately contact IBA Dosimetry GmbH for a copy.
Observance of the manual instructions is required for proper performance and correct operation of the
myQA® system. The myQA® system and its accessories must not be used for any other purpose than
what is described in the accompanying documentation (intended use). Violation will result in loss of
warranty.
IBA Dosimetry GmbH does not accept liability for injury to personnel or damage to equipment that may
result from misuse of this equipment, failure to observe the hazard notices contained in this manual, or
failure to observe local health and safety regulations.
Under no circumstances shall IBA Dosimetry GmbH be liable for incidental or coincidental damage
arising from use of the equipment described in this document.
No part of the accompanying documentation may be translated or reproduced without written
permission of IBA Dosimetry, unless reproduction is carried out for the sole purpose of being used by
several people in the same department.
The user must treat the accompanying documentation like any other copyrighted material. If part of the
accompanying documentation is provided in electronic form, these files shall not be modified in any
way. IBA Dosimetry and its suppliers retain title and all ownership rights to the accompanying
documentation (in either electronic or printed form).
1. Introduction
1.1. Intended Use
1.2. General Product Description
1.2.1. Generic Tests and Plugin Tests
1.2.2. Supported Measurement Devices
1.3. Scope of Delivery
1.4. Related Documents
2. Getting Started with the Software
2.1. Step 1 – Set up the Equipment
2.2. Step 2 – Create a Protocol Template
2.3. Step 3 – Create a Machine Protocol
2.4. Step 4 – Execute the Tests
2.5. Step 5 – Print a Report
3. Software Description
3.1. myQA Machines User Interface Overview
3.2. General Description of a Protocol Template and Device Protocol
3.3. Test Setup
     Test Setup Ribbon and Context Menu
     Machines & Templates Panel
     Protocol Tree Panel
     Template / Definition Panel
     Error Indication
3.4. Test Run
     Test Run Ribbon and Context Menu
     Agenda Panel
     Task View Panel
     Task/Test Info Panel
     Skip a Task
     Finish a Task
3.5. Run an Imported Excel Spreadsheet Test
     Create a Machine Protocol
     Map the Excel Template to the Acceptance Criteria
     Performing a Test
3.6. Test Repository Page
     Ribbon and Context Menu
     Sorting the Listed Items
     Display Filter
     Print a Report
Available Accessories
■ Energy Verification plates for photons and electrons
■ Build-up plates
■ Primus L phantom
■ miniPhantom
■ Gantry mounts for the Elekta or Varian linear accelerators
■ Sphinx V1 or Sphinx Compact device
■ Other QA phantoms and accessories
2.2. Step 2 – Create a Protocol Template
Click the workspace selection button and then select myQA Machines (1).
Note that the machines are listed in the Machines & Templates panel (2).
Select the Edit icon (3) in the ribbon to enter the Edit mode.
Select Protocol Templates (2a) in the Machines & Templates panel and then Protocol Templates in
the Protocol Tree panel (4).
Under Protocol Templates (4a) in the Protocol Tree panel (4), select a predefined protocol template
(e.g., TG-142 General (4b)), and then click Copy (5) in the ribbon.
Select Protocol Templates (4a) again and then click Paste (6) in the ribbon.
Note that the name of the new protocol template has a suffix, (n), for the nth copy of the same
protocol template (4c).
(Optional) Change the name of the protocol template to be specific to each machine and add
comments to the protocol:
Select a protocol template in the Protocol Tree panel (4) and then edit the name and description in
the Definition panel (7).
Check the test parameters: select a test in the Protocol Tree panel (4) and edit it in the Definition
panel (7) if necessary, e.g. if the Result Type is Numeric, click Edit under Acceptance Criteria,
and then edit the parameters in the popup dialog (7a).
(The Machines & Templates panel and ribbon are hidden in this example)
Note that when the cursor is placed on a protocol template, its description is displayed (8b).
Select a machine protocol in the Protocol Tree panel (4), and edit its name in the Definition panel
(7) if desired.
Select a machine protocol in the Protocol Tree panel (4) and then click the Set Active icon (9) in
the ribbon so that this protocol will appear in Test Run.
If necessary, the acceptance criteria of tests can still be edited in the Definition panel (7).
Click the Save icon (10) in the ribbon.
For a Pass-Warning-Fail test, select an option.
Optionally, to add or edit a note for a task/test, click the Edit button under Task/Test in the
Task/Test Note panel (4), and enter the text.
Click Finish.
Repeat the steps in the Task View panel (3) until all tests are finished. Click Finish (5) in the ribbon.
Select the next task if applicable and go through the above steps until all tasks are done for this
machine.
3. Software Description
3.1. myQA Machines User Interface Overview
The myQA Machines window follows Microsoft Windows standards and consists of several sub-areas:
1a: Workspace selection button – displays the title and the icon of the currently selected workspace.
Click this button to open the dropdown list and then select the desired workspace.
1b: myQA Machines navigation panel – navigates the myQA Machines workspace via buttons. myQA
Machines consists of three navigation pages:
Test Setup
Test Run
Test Repository
1c: Workspace Navigation panel collapse button – click it to hide or show the Workspace
Navigation panel.
2: Ribbon – contains the commands of the currently selected navigation page in 1b.
2a: Ribbon collapse button – click it to hide or show the ribbon.
Table 3.1. Description of the Test Setup ribbon; (*) these functions are also available in context menus.

Group | Icon name | Shortcut key | Description
Locations | Clinics | - | Opens a list of clinics for selection. Only the clinics assigned to the current user are in the list. Only the machines for the selected clinics are displayed in the Machines & Templates panel.
Edit | Edit | - | Enters the Edit mode. If the Edit icon is grayed out, the page is in Edit mode; otherwise, it is in View mode.
Edit | Save | - | Saves the changes made in Edit mode and switches the page to View mode (see Section 3.3.1.5).
Edit | Cancel | - | Cancels all changes made in Edit mode and switches the page to View mode (see Section 3.3.1.5).
Clipboard | Paste* | Ctrl + V | Pastes the contents from the clipboard.
Clipboard | Cut* | Ctrl + X | Copies the selected item to the clipboard and deletes it.
Clipboard | Copy* | Ctrl + C | Copies the selected item to the clipboard.
Item | New* | Ctrl + N | In Edit mode: Creates an empty user-defined protocol/task/test (see Section 3.3.1.2).
Item | Delete* | Del | In Edit mode: Deletes the selected items (protocol/task/test) in the Protocol Tree panel (see Section 3.3.1.4).
Protocol Template | Import | - | In Edit mode: Imports a protocol template from an XML file.
Protocol Template | Export* | - | In View mode: Exports the selected protocol template into an XML file.
Protocol Template | From Template* | - | In Edit mode, for a selected machine: Opens a drop-down list of available templates. The selected template is copied to the selected machine.
Protocol Template | Create as Template* | - | In Edit mode, for a selected machine: Creates a copy of the selected machine protocol as a protocol template.
Protocol Template | Set Active* | - | In Edit mode: Sets the selected machine protocol as the active protocol. Note: Only tests within an active protocol can be performed in the Test Run page.
Data | Refresh | - | Refreshes the Test Setup data from the database.
View | Sort by Category | - | Sorts the tests in the Protocol Tree by category.
View mode
In View mode, users can view the Test Setup details and export protocol templates. No modifications
are allowed.
Edit mode
In Edit mode, protocols, tasks, and tests can be created or edited. While in Edit mode, the user cannot
switch to any other myQA Machines workspace, e.g. Test Run. Note that switching to another
application main workspace, e.g., myQA Platform, is still possible.
Only one user can edit the test setup at a time. If a user tries to switch to Edit mode while
another user is already in Edit mode, an information message pops up.
To exit Edit mode, click the Save/Cancel icon in the ribbon (see Section 3.3.1.5).
By clicking the Cancel button, the cut item will not be recovered in the Protocol of Machine 1. However,
the cut item is still in the clipboard and can be pasted back to the Protocol of Machine 1 using the Paste
function in the ribbon or context menu.
Be aware that the clipboard is overwritten when another cut or paste action is performed, and emptied
when Save/Cancel is executed.
A protocol template/machine protocol can contain one or more tasks and a task can contain one or
more tests.
Item | Title
Protocol template | Protocol template
Task template | Task template
Test template | Test template
Protocol | Protocol Definition
Task | Task Definition
Test | Test Definition
The recurrence pattern is crucial for the tests belonging to the task, since it determines the schedule
on which the tests are executed in Test Run.
The different possibilities to set the recurrence pattern are depicted in the screenshot above:
■ Tests that have no specific recurrence pattern: select None (i).
■ Tests that have to be performed with a certain frequency can be grouped into tasks with a
recurrence pattern matching this frequency. The setup of the recurrence is fully flexible. For
example, in the default protocols TG-142 General and TG-142 PlugIn the following tasks are defined:
Daily QA: every week in combination with every weekday
Weekly QA: every week in combination with any day of the week
Monthly QA: every month on the first/second/etc. (ii) weekday/Monday/etc. (iii) of the month
Annually QA: every 12 months on the first/second/etc. (ii) weekday/Monday/etc. (iii) of the month.
When a task is scheduled, the first due date that fits the recurrence pattern should be set as the start
date. It can be entered directly into the Start Date text box or selected using the calendar. Otherwise,
the current date is set as the start date by default.
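As an illustration only (this is not part of the myQA software), a "monthly on the first Monday"-style rule like the Monthly QA pattern above can be evaluated with a short script; the function names are my own:

```python
import calendar
from datetime import date

def first_weekday_of_month(year: int, month: int, weekday: int) -> date:
    """Date of the first given weekday (0 = Monday ... 6 = Sunday) in a month."""
    first = date(year, month, 1)
    return first.replace(day=1 + (weekday - first.weekday()) % 7)

def next_due(after: date, weekday: int) -> date:
    """First 'monthly on the first <weekday>' due date on or after `after`."""
    candidate = first_weekday_of_month(after.year, after.month, weekday)
    if candidate >= after:
        return candidate
    # The pattern day has already passed this month -> roll over to the next month.
    year, month = (after.year + 1, 1) if after.month == 12 else (after.year, after.month + 1)
    return first_weekday_of_month(year, month, weekday)

# First Monday on or after 2019-05-15:
print(next_due(date(2019, 5, 15), calendar.MONDAY))  # 2019-06-03
```

A date computed this way would be the natural value to enter as the Start Date.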
The user also has the possibility to attach different objects to the task definition (iv). Section 3.3.4.5
provides a detailed description of the objects that can be attached and the corresponding workflow.
Add
Click the Add button to open the Add dialog:
Delete
Select a tolerance, and then click Delete.
Create a test
In the Edit mode, select a task under a machine protocol in the Protocol Tree panel.
Create a test from an existing test:
o Select a test and then click Copy in the ribbon or the context menu.
o Select a task template and then click Paste in the ribbon or the context menu. The selected
test will be copied under this task template.
o The acceptance criteria can be edited by clicking the Edit button in the Test Definition panel.
For a test, the Category and Result Type are shown in the upper-right corner of the Test
Definition panel and cannot be edited. However, the acceptance criteria can be changed. For
Dosimetry Plugin tests, reference baselines must be defined if not yet available. See
Section 4.5.1 for defining the baselines for the Dosimetry Plugin tests.
o Enter/edit all required parameters. Tolerances for the Acceptance Criteria can be edited,
added, and deleted using the corresponding buttons. See details in the Create a test template
section above.
Note that tests created with the New command can also be edited, added, and deleted in the
Test Definition panel.
By clicking OK, the test will be displayed in the Test Definition panel. Some parameters can be
edited in the Definition panel (see Section 3.3.4).
Note: The changes made to the dictionary by adding new words are stored per user.
Attach an object
In Edit mode, drag and drop an object (file, shortcut, or URL link) with the mouse into the drop-file
area (i), or
click the Attach icon (ii) and browse the computer for the object.
The attached object will appear below the drop-file area (iii). The attachments are ordered by date of
addition with the most recent at the top.
Delete an attachment
Click the Delete icon (iv).
Open an attachment
In Edit mode, click the Open icon (v).
View mode
Note: Changes in the Test Setup cannot be saved as long as validation errors have not been resolved.
The ribbon functions can also be accessed via a context menu. Right-click a listed item to open a
context menu containing the available functionality. Only applicable functions are enabled. For
example, if you right-click an unfinished test, only Skip is enabled; whereas for a finished test, Reset
and Override Status are enabled.
Machines:
i: Name of the machine
ii: Name of the active protocol
iii: Number of the tasks in the protocol
Active Tasks:
i: Name of the task
ii: Task due date
iii: Number of executed tests/number of tests
Machines:
The Machines group shows the machines as configured in the Equipment Setup including the active
protocol and the number of tasks in this active protocol.
If a machine has an active protocol, right-clicking it opens a context menu with the Add
Unscheduled Task command. A task in this protocol can be selected as an unscheduled task
and listed in Active Tasks (see screenshot below left).
Active Tasks:
The Active Tasks group displays the active tasks of the selected machine that are not yet finished
within the current scheduling interval, as well as the unscheduled tasks.
Calendar
Click the calendar and select a new date for the task and then click OK.
The reschedule action will be automatically shown in the Note of the Finish Task dialog (See
Section 3.4.6).
By placing the cursor on a machine/task, a tooltip appears and displays information of this item.
Task View panel with example: the tests are grouped by Category
The Task View panel has the following features:
■ The test category is shown under the test name.
■ The test status is shown with the following symbols:
: Skipped test
If the measured result should be the average of several measurements, click the Calculator button,
enter all results, and then click Apply Result; the average will be entered into the Actual box.
For a Pass/Warning/Fail test, click the Pass, Warning, or Fail button according to the test result.
Test/task executions are saved every time the test/task execution status changes.
Select the desired status from the Override with dropdown list, optionally enter a note, and then
click Override.
Skip a test
Select a test in the Task View panel, and then click Skip in the Test group of the ribbon.
Click the Edit button of Attachments to display the attached files or enable the attachment editing:
Attach a file: drag a file to the Drag files here area or click the Attach icon, and then browse and open the file.
Overdue tasks can be removed from the Active Tasks list by right clicking on an overdue task and
choosing one of the three alternatives provided in the drop-down menu (ii):
■ Skip,
■ Skip All Overdue, or
■ Finish.
The procedure of finishing a task will be discussed in the next section.
To skip a single overdue task,
select the task in the Active Tasks panel, click Skip in the context menu or in the ribbon; the Skip
Task dialog opens (see screenshot below). Enter a comment in the Note box if desired, and then
click Skip. If the task contains unfinished tests, a warning is displayed in the dialog.
If a scheduled task has been missed more than once, only the oldest overdue task will be
displayed in the Active Tasks panel, and only the oldest overdue task will be affected by the Skip
function. Subsequently, the second oldest overdue task will be displayed in the Active Tasks panel.
To skip all overdue tasks,
select the task in the Active Tasks panel, choose Skip All Overdue in the context menu; the Skip
Task dialog opens (see screenshot below). Enter a comment in the Note box if desired, and then
click Skip. If the tasks contain unfinished tests, a warning is displayed in the dialog.
Finish a Task
Select a task in the Agenda panel, click Finish in the Task group of the ribbon; the Finish Task
dialog opens.
Enter a comment in the Note box if desired, and then click Finish.
Due next:
Shows the next due date of the task execution.
Warning display:
If the task contains unfinished tests, clicking Finish in the ribbon sets the unfinished tests to
Skipped; a warning describing the situation is displayed in the dialog:
IMPORTANT NOTICE
WRONG STATUS OF MACHINE
Accidentally finishing a task before all tests are executed, or incomplete/wrong
tasks, may lead to an incorrect conclusion about the status of a machine.
CAUTION
INACTIVE FUNCTIONAL MACHINE
An incomplete or wrong task may lead to an inactive functional machine.
In this example, cell A1 is reserved for the date. The orange and green cells contain measured raw
values, whereas the yellow cells contain calculated averages, which are the result of Excel formulas.
Each average value will be mapped to the corresponding acceptance criterion in the Test Run page for
the QA test. The following section outlines the steps necessary to map cells to acceptance criteria in
myQA.
Mapping parameters
In the Test Run page, right-click anywhere in the row of the acceptance criteria, except for the
Actual text field, and then select Copy in the context menu; or press the Shift key and left-click in
the Actual text field to generate and save an identifier to the clipboard. The identifier is later used
to map cells to conditions in myQA.
WARNING
ENSURE CORRECT MAPPING
It is the user's responsibility to ensure that the relevant cells in the Excel template are
correctly mapped to the corresponding acceptance criteria in the Test Run page.
There are two ways to map the identifier to a cell in the template:
Select the cell (1) to be mapped, click in the Name Box (2) at the upper left of Excel, press
Ctrl + V, and then press Enter.
In the Excel template, right-click the cell to be mapped (A1 in this example), and select Define
Name…. In the Name box of the opened New Name dialog, press Ctrl + V and then click
OK.
The maximum length of a cell name supported by Excel is 254 characters. If the acceptance
criterion name contains characters that are not allowed by Excel, e.g., #, /, etc., they will be
converted into ASCII hexadecimal numbers during the generation of the identifier.
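The exact identifier format used by myQA is not documented here, but the kind of conversion described (disallowed characters replaced by their ASCII hexadecimal codes, length capped at 254) can be sketched as follows; the function and the allowed-character set are illustrative assumptions:

```python
def to_excel_name(criterion: str, max_len: int = 254) -> str:
    """Illustrative sketch: build an Excel-safe defined name by replacing
    characters Excel rejects (e.g. '#', '/', spaces) with ASCII hex codes."""
    parts = []
    for ch in criterion:
        if ch.isalnum() or ch in ("_", "."):
            parts.append(ch)
        else:
            parts.append(format(ord(ch), "X"))  # '#' -> '23', '/' -> '2F', ' ' -> '20'
    name = "".join(parts)
    if name and name[0].isdigit():  # Excel defined names must not start with a digit
        name = "_" + name
    return name[:max_len]           # respect the length limit mentioned above

print(to_excel_name("Output #1 / 6MV"))  # Output20231202F206MV
```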
Note: Cell identifiers in a spreadsheet template must be unique.
Note: When entering the values in the Excel spreadsheet, ensure that they have the same units as the
unit settings in myQA Platform > Options > Platform > Site > Units.
Performing a Test
The Excel template and spreadsheet files must be closed before they can be used in the Test Run page.
Click the Import button, and then select the spreadsheet to be tested in the file browser that opens.
If the template and spreadsheet files are in the same folder and only one template file is in this
folder, no second file browser querying for the template will appear. The SW automatically
uses the only template present in the folder for the mapping during import.
If there is more than one template or no template in the folder, the browser opens for selecting a
template.
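The folder lookup described above could be sketched like this; the function name, glob pattern, and the .xltx extension are assumptions for illustration, not the SW's documented behavior:

```python
from pathlib import Path

def find_template(spreadsheet: Path, pattern: str = "*template*.xltx"):
    """Return the template to use for import: the single matching template
    in the spreadsheet's folder, or None if the user must be prompted."""
    templates = sorted(spreadsheet.parent.glob(pattern))
    if len(templates) == 1:
        return templates[0]  # exactly one template -> used automatically
    return None              # zero or several templates -> open a file browser
```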
The Import Result dialog will open and report the import status.
As with other myQA tests, the test results are shown in the Status column. The imported parameters
are shown in the Actual column:
WARNING
CHECK MAPPING WHEN USING THE TEMPLATE FOR THE FIRST TIME
To ensure that the mapping in the template is correct, it is recommended to
check the values in the Actual column against the values in the original table
when using the template for the first time.
The Test Repository ribbon contains a set of predefined filters to easily sort and identify the tests that
are documented in this page. The table below contains the description of the icons in the Test
Repository ribbon:
Please note that the filters in the Test Repository ribbon can be combined and are stored per user
in the database.
Display Filter
By clicking the filter icon on the right of a column head cell, the Filter dialog for this column opens.
The items of this column can be selected for display. The rest of the columns update automatically.
There are two kinds of filters:
■ Number filter: Finishing Date and Result columns. Numbers in an exact format are used for
filtering. It is more convenient to use the range filters (IsLessThan / IsLessThanOrEqualTo /
IsGreaterThan / IsGreaterThanOrEqualTo). The "IsEqualTo" filter is only valid if the
complete number is entered correctly.
Print a Report
Report | Description
Report Summary A4 | Provides a list of protocol/task/test names of the selected tests in A4 paper format.
Report Summary Letter | Provides a list of protocol/task/test names of the selected tests in Letter paper format.
Report Detailed | Provides a detailed description of the selected tests.
In the Report for Selected Tests window, click one of the Export buttons to export the report with
the desired format or the Print button to print a hard copy of the report.
Table 3.X. Description of the commands in the Report for Selected Tests window
Command (screenshot no.) | Description
Navigate back in history (1) | Navigates back in history.
Navigate forward in history (2) | Navigates forward in history.
Stop (3) | Stops a running action.
Refresh (4) | Refreshes the report view.
First page (5) | Goes to the first page.
Previous page (6) | Goes to the previous page.
Current page (7) | Shows the page number of the displayed page.
Total page (8) | Shows the total number of pages.
Next page (9) | Goes to the next page.
Last page (10) | Goes to the last page.
Page Setup… (11) | Opens the Page Setup dialog for setting up paper, size, source, orientation, and margins.
Switch to print preview (12) | Shows the print preview.
Print report (13) | Opens the Print dialog for printing.
Export (14) | Opens the export format menu for selecting a format to export the report. The formats include PDF, Excel, Rich Text, TIFF, Web Archive, and XPS Document.
CAUTION
THE SAME SETUP FOR BOTH TEST SETUP AND TEST RUN
The measurement setup in test setup and test run must be the same. It is
recommended to describe the test procedure and indicate the following
parameters in the Description box of the Test Definition dialog (see Section
4.5):
■ Field Size
■ Energy
■ Gantry Angle
■ Detector type
■ Detector orientation
■ Detector build-up /depth.
IMPORTANT NOTICE
ASSIGN TEST PARAMETERS
Only the person who is responsible for QA can define the parameters of the
tests. The parameters given in the user manual are only examples.
IMPORTANT NOTICE
THE OUTPUT CALIBRATION SHOULD NOT BE OLDER THAN THREE MONTHS
If the output calibration is older than 3 months, the SW will display a warning. It
is recommended to perform the Output Calibration again.
In the current SW version, the analysis method for the penumbra calculation for FFF beams (see
Section 5.2, Vol. 6) is sensitive to the correctness of the device positioning. In addition, to ensure
correct results, the field size is limited to 20 cm × 20 cm.
Profile Constancy
With the Profile Constancy analysis, up to four profiles can be analyzed in an open field simultaneously:
■ Inline Profile (1)
■ Crossline Profile (2)
■ Diagonal Profiles (3 and 4)
Various parameters, e.g., field width, penumbra, symmetry, and flatness, are computed according to
the predefined active protocol with the selected analysis method, e.g., DIN, IEC-60976, etc.
By specifying tolerances, you can verify your existing reference values when repeating the
measurement process. A status display (Passed/Warning/Failed/Skipped) shows immediately whether
the measurement has been performed successfully within the defined tolerances.
Note: The Dolphin profiles are rescaled from SDD to SAD before the analysis parameters are
calculated. The rescale is a simple geometry projection, without fit or artificial changes of the
detector response. The main inline, crossline as well as the diagonal profiles are automatically
saved in the myQA database as raw data.
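The SDD-to-SAD rescaling in the note above is a plain similar-triangles projection of the off-axis positions (the detector readings themselves are untouched). A minimal sketch, assuming SAD = 100 cm and millimetre units — not IBA's implementation:

```python
def rescale_profile(positions_mm, sdd_mm, sad_mm=1000.0):
    """Project off-axis positions measured at the source-detector distance (SDD)
    onto the source-axis distance (SAD) plane by similar triangles."""
    return [x * sad_mm / sdd_mm for x in positions_mm]

# A point 110 mm off-axis at SDD = 1100 mm projects to 100 mm at SAD = 1000 mm:
print(rescale_profile([-110.0, 0.0, 110.0], sdd_mm=1100.0))  # [-100.0, 0.0, 100.0]
```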
Note: Due to the size limitation of the device, field size for the Dolphin profile analysis should not
exceed 35 cm × 35 cm.
Note: The two measurements (open field, wedge) need to be performed with the same monitor units
(dose) to ensure a correct result.
IMPORTANT NOTICE
PREPARE THE DEVICE
Check whether the measurement device is properly connected and is ready to
use.
Make sure that the device is in the correct position.
Check the connection between the device and the PC (Ethernet or direct
connection).
Clean the surface of the device of any contamination.
IMPORTANT NOTICE
DEVICE SETUP AND MEASUREMENT CONFIGURATION
Setting up measurement devices and their configuration in the SW should be done
by experienced users or under the supervision of an experienced user to ensure the
device and SW are properly prepared for measurements.
See Chapter 9, Vol.1 for specific hardware information such as installation and positioning, as well as
other possible applications and limits.
The StarTrack/MatriXX can be placed in different orientations under a linac; the orientation affects
the detector settings in the SW. The following drawings show the possible orientations of a table-mounted
detector relative to the linac. Note that a clockwise rotation yields a positive angle. Normally, the device
is placed at 0° under the linac for the dosimetry QA measurements.
WARNING
MEASUREMENT DEVICE ORIENTATION
The measurement device orientation for Output Calibration and QA tests should be the same.
Otherwise the Output Calibration may be incorrect and lead to wrong dose
delivery and wrong results in QA tests.
StarTrack
Place the build-up plate on the StarTrack with the same orientation as the StarTrack and insert the two
pins in the two diagonal corners of the StarTrack in the corner holes of the plate.
MatriXX
Place the build-up plate on the MatriXX with the same orientation as the MatriXX and align the three
edges to the corresponding lines on the MatriXX:
A table-mounted StarTrack
Use the room lasers and the StarTrack alignment markers to position the StarTrack. Set the SDD =
100 cm.
Place the Photon Energy Verification plate on the active area of the StarTrack.
IMPORTANT NOTICE
CALIBRATE THE ABSOLUTE DOSE OUTPUT REGULARLY
Absolute drift of a device may be unnoticeable. Absolute dose calibration should
be repeated regularly.
IMPORTANT NOTICE
OUTPUT CALIBRATION SHOULD NOT BE OLDER THAN THREE MONTHS
If the output calibration is older than 3 months, the SW will display a warning in
the Test Setup and Test Run. It is recommended to perform the output calibration
again.
IMPORTANT NOTICE
USING A NON-DEFAULT CALIBRATION
If the default calibration or user uniformity correction is not used for the
measurements, it is the user's responsibility to ensure that the selected calibration
delivers acceptable results.
WARNING
MEASUREMENT DEVICE ORIENTATION
The field size given in the tests in the Dosimetry Plugin is at the isocenter.
Please note:
It is possible to add attachments to protocols, tasks and tests of Protocol Templates and Protocol
Definitions as described in Section 3.3.4.5.
In order to define a Test Template for Profile Constancy, two more parameters are required: profile
directions (1) and the analysis method (2). These two parameters affect the acceptance criteria entries
and cannot be edited in the Test Definition panel for a machine protocol.
The available analysis methods for both electron and photon beams are:
Note that Warning and Error Tolerances are given only for the analysis method based on the
International Electrotechnical Commission Standard IEC-60976 (see screenshot below) in the software.
The tolerances for other protocols can be taken e.g. from the acceptance criteria of the machine
vendor, or clinical QA standards. The users need to enter these values manually.
See Section 4.5.1 for creating the baselines for the Dosimetry Plugin tests.
1: Combo box for the available measurement devices. Select the one to be connected to the SW.
For the same beam quality, the output calibration marked with the active icon in the myQA Platform >
Equipment setup > Calibrations box will be selected. The latest output calibration is set as active by
default. However, if a different calibration should be used, it can be selected with the Edit button in the
Calibrations box.
If no output calibration with the same beam quality is available, a warning sign (7) appears in the
Action column. The tooltip displays the possible causes:
Actions column: contains the Start (8), Reset (9), and/or Paste (10) action buttons.
Paste button: in the myQA FastTrack workspace, measured values can be copied to a clipboard (see
Section 2.5.1.1, Vol. 4, myQA FastTrack User's Guide) and pasted here as expected values with the
Paste button. When the Paste button is disabled, a tooltip shows the possible causes:
Measured column: the measured result can be entered directly, measured, or computed as the average
of several measurements using the Calculator (11).
Baseline measurement
If the measurement should be re-run, click the Reset button; the Measurement button (8) will
appear again and the Measured box becomes empty. The new measurement can then be run.
If the measured result should be the average of several measurements:
Write down each measurement result.
Click the Calculator button (11) and enter all measured results.
If temperature and pressure correction should be applied, checkmark the KTp box and enter the
temperature and pressure values measured by reliable external devices in the Temperature and
Pressure boxes, respectively.
Note: Do not checkmark the KTp box if the entered values have already been corrected during
their measurement!
Optionally, if a user-defined factor should be applied to the average, enter it in the Factor field.
Click the Apply Result button; the result will automatically be filled into the Measured box.
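As an illustration only (this code is not part of the myQA software), the averaging steps above can be sketched as follows. The manual does not state the exact KTp formula used by the Calculator; the sketch assumes the standard temperature-pressure correction for a vented ion chamber with reference conditions T0 = 20 °C and P0 = 1013.25 hPa, which are assumptions here.

```python
# Sketch of the Calculator's averaging with optional KTp correction and a
# user-defined factor. The correction formula and reference conditions are
# assumptions, not taken from the manual.

def ktp_factor(temp_c, pressure_hpa, t0_c=20.0, p0_hpa=1013.25):
    """Standard temperature-pressure correction factor (assumed form)."""
    return ((273.15 + temp_c) / (273.15 + t0_c)) * (p0_hpa / pressure_hpa)

def averaged_result(measurements, temp_c=None, pressure_hpa=None, factor=1.0):
    """Average several readings; optionally apply KTp and a user factor."""
    avg = sum(measurements) / len(measurements)
    if temp_c is not None and pressure_hpa is not None:
        avg *= ktp_factor(temp_c, pressure_hpa)
    return avg * factor
```

At reference conditions the correction factor is exactly 1, so only the user factor changes the plain average.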
The expected values can be taken from a measurement by clicking the Measurement button (8), taken
from an imported one by clicking the Import Measured Data button (12), pasted with the Paste
button (10) if the measurement has been copied from FastTrack (see Section 2.5.1.1, Vol. 4, myQA
FastTrack User's Guide), or entered manually.
After a measurement is done, the reference detector type will be shown.
Right-click a value in the table; the context menu opens.
IMPORTANT NOTICE
ASSIGN MEASURED DATA TO MACHINE
Be aware that the data imported in the myQA Platform workspace will be
assigned to the selected machine and can be used only for this machine in the
myQA Machines workspace.
Select a measurement (row); its properties are shown in the right panel.
Click OK; the data is imported into the Test Definition panel.
If there is a valid output calibration for the selected beam quality, the Measurement button is
available. If at least one expected value is inserted, the Beam Quality box is deactivated. The
energy can be changed only after all expected values are deleted (the Beam Quality combo box
becomes active again).
In the Wedge Direction column, the inline profiles are automatically indicated as direction “In/Out” or
“Out/In” whereas the crossline profiles are automatically indicated as the wedge direction “Right to Left”
or “Left to Right”.
Click Save in the ribbon when all test templates are created. The template can be used for as many
machines as desired.
After the machine protocol is set up, select the protocol in the Protocol Tree View panel, and click Set
Active in the Test Setup ribbon. Finally click Save in the Test Setup ribbon to exit the Edit mode. The
protocol is now ready to be run in the Test Run page.
Note: When measuring wedge constancy with Dolphin in combination with an accelerator where the
direction of the wedge is changed by rotating the collimator, only one wedge direction must be
measured.
■ Source and target machine are of different type or model, yet the machine properties are
completely identical
After pasting, the software warns the user that the source and target machines are of different
type or model
After clicking OK, all tests are copied, including description, reference values and tolerances
■ Energies and wedge angles of source and target machine are completely different
After pasting, the software warns the user that the source and target machines are of different
type or model
After clicking OK, the software informs about the status of the copy/paste process
In this case, the protocol and the tasks, including description and schedule, will be copied, but
no tests
                Linac A      Linac B
Energies        6 MV         6 MV
                10 MV FFF    10 MV
                15 MV        -
                6 MeV        6 MeV
                12 MeV       9 MeV
Wedge angles    15°          15°
                45°          30°
Example of two Linacs with some identical (green) and some different properties (red).
Therefore, the machine protocol of Linac A will contain five profile constancy tests:
For simplicity, we created a protocol for Linac A containing only one task with the beam profile
constancy tests.
After clicking OK, the software informs about the status of the copy/paste process:
As depicted in the screenshot above, the profile constancy tests for the energies of Linac A that are not
available in Linac B have been removed from the clipboard before pasting and will therefore not be
copied.
After the copy/paste action, the protocol of Linac B contains only the profile constancy tests for the
identical energies with Linac A:
All information contained in these tests (test description, baselines, tolerances) have been preserved
through the copy/paste action. However, the user shall revise these values and adapt them to the target
machine if necessary.
For simplicity, we created a protocol for the source machine Linac A containing only one task with the
energy constancy test for all available energies.
After clicking OK, the software informs about the status of the copy/paste process:
As depicted in the screenshot above, the energy constancy test has been removed from the clipboard
before pasting and will therefore not be copied, although Linac A and Linac B have two energies in
common. After the copy/paste action, only the empty protocol and task have been copied to the target
machine:
Generally, if one or more energies of the source and target machines are different, the energy
constancy test cannot be copied.
For simplicity, we created a protocol for the source machine Linac A containing only one task with the
output constancy test for all available energies.
After clicking OK, the software informs about the status of the copy/paste process:
As depicted in the screenshot above, the output constancy test has been removed from the clipboard
before pasting and will therefore not be copied, although Linac A and Linac B have two energies in
common. After the copy/paste action, only the empty protocol and task have been copied to the target
machine:
Generally, if one or more energies of the source and target machines are different, the output constancy
test cannot be copied.
After copying the protocol and pasting it to the Linac B, the software warns the user that the two
machines are not of the same type or model:
After clicking OK, the software informs about the status of the copy/paste process:
As depicted in the screenshot above, both wedge constancy tests have been removed from the
clipboard before pasting and will therefore not be copied, although Linac A and Linac B have one
flattened beam energy in common. After the copy/paste action, only the empty protocol and task have
been copied to the target machine:
Generally, both the energy and the wedge angles of the source and target machine have to match in
order for the wedge factor test to be successfully copied.
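The copy/paste behavior described in the preceding sections can be summarized as a filter over the clipboard tests. The sketch below is illustrative only; the function and field names are hypothetical and not part of the myQA API. It encodes one reading of the rules above: profile constancy tests are copied per shared energy, energy and output constancy tests only when all energies match, and wedge constancy tests only when both the energy lists and the wedge angle lists match completely.

```python
# Illustrative sketch of the copy/paste compatibility rules (hypothetical
# names; not the myQA implementation).

def copyable_tests(source, target, tests):
    """Filter clipboard tests by the rules for machines of different type."""
    shared_energies = set(source["energies"]) & set(target["energies"])
    same_energies = set(source["energies"]) == set(target["energies"])
    same_wedges = set(source["wedge_angles"]) == set(target["wedge_angles"])
    kept = []
    for t in tests:
        if t["type"] == "profile_constancy":
            # Copied only for energies available on both machines.
            if t["energy"] in shared_energies:
                kept.append(t)
        elif t["type"] in ("energy_constancy", "output_constancy"):
            # Dropped entirely unless ALL energies match.
            if same_energies:
                kept.append(t)
        elif t["type"] == "wedge_constancy":
            # Both the energies and the wedge angles must match completely.
            if same_energies and same_wedges:
                kept.append(t)
    return kept
```

Applied to the Linac A / Linac B example above, only the profile constancy tests for 6 MV and 6 MeV survive the paste; the energy, output, and wedge constancy tests are removed from the clipboard.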
By selecting this box, the measurement(s) will be populated to other tests that have the same
parameters including the same machine, energy, inline and crossline field sizes, gantry angle; for
Wedge Constancy tests: with/without wedge and wedge direction (i.e., left-to-right or right-to-left),
whichever is applicable. In this way, the number of measurements can be minimized.
Note: Pressing the Reset button in the Action column resets only the measurement in the current test.
It does not affect measurements that have already been populated to other applicable tests.
Note: If Share Measurement is activated, please ensure that the correct number of MU is irradiated
in case the output constancy test is one of the shared tests.
By clicking one of the icons, e.g. Inline profiles, the inline profiles will be added to the Compare page.
The SW performs the following actions automatically:
The Compare page opens, displaying the measured (measured in Test Setup) and expected
profiles (measured in Test Run) of the selected direction. They are also listed in Data Import.
The expected profile is set as reference.
If a new direction is selected in myQA Machines, the previous profiles will be deleted from
Compare and Data Import.
Table 5.1. The CBCT QA Plugin test types and test parameters
The CBCT QA Plugin provides Individual tests for each test type and a combined test for all test result
parameters.
For phantoms where the positions of the individual modules are unknown or where they could be
modified by the user, an approach based on the Pearson Correlation Coefficient [7] is applied. Here, the
reference image for each module that is foreseen for analysis needs to be stored. The correlation
coefficient between the reference image and each slice of the DICOM stack is calculated. The slice with
the maximum value of the correlation coefficient is proposed to the user.
An example is given in the figures below, with the correlation coefficient as a function of the slice
number on the left. The middle and right parts of the figure show the reference image and the found slice.
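The slice-matching approach above can be sketched as follows. This is an illustration only (not the myQA implementation): the stored reference image of a module is correlated with every slice of the imported stack, and the slice with the highest Pearson correlation coefficient is proposed.

```python
# Sketch of correlation-based slice matching: compare a module's reference
# image against each slice of the DICOM stack and propose the best match.
import numpy as np

def best_matching_slice(reference, stack):
    """Return (index, r) of the stack slice most correlated with reference."""
    ref = reference.ravel().astype(float)
    scores = [np.corrcoef(ref, s.ravel().astype(float))[0, 1] for s in stack]
    idx = int(np.argmax(scores))
    return idx, scores[idx]
```

A slice that is an intensity-scaled copy of the reference yields a correlation coefficient of 1 and is therefore selected even if the window/level differs.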
Geometric Distortion
This test compares distances obtained from an image with the known values [3]. For this purpose,
distances between two structures of the phantom (holes or inserts) are extracted from the image; see an
example below.
Example - phantom, CT slice containing the CTP404 module. ROIs are set on two pairs of structures
for distance extraction.
Spatial Resolution
The Spatial Resolution test verifies the capability of resolving narrow lines via the MTF (Modulation
Transfer Function). The phantom modules used for measurements of the spatial resolution contain line
bar groups. Within a group, the distance between two adjacent pairs is the same; the distances differ
among groups. Spatial frequency is defined as the number of line pairs per mm (unit: lp/mm) within a
group.
An ROI should include all bars in a group but the area should not exceed the outer edges of the bars.
An image of the CTP404 module of the CatPhan 504 phantom used for LCV, CNR and HU constancy
tests.
The LCV test checks how well two different materials (e.g., fat and water in the human body) with close
HU values can be distinguished in a CBCT image.
To demonstrate the LCV test, ROI 5 and ROI 6 in the example cover polystyrene (PS) and Low-density
polyethylene (LDPE). The HU values of these materials are similar to fat and water inserts, respectively.
The LCV is defined as [4]:

    LCV [%] = (Reference Contrast / 10) / (Measured Contrast / Measured Noise)

with:
Reference Contrast = CT_PS − CT_LDPE, where CT_PS and CT_LDPE are the nominal HU values of the two materials,
Measured Contrast = M_PS − M_LDPE, where M_PS and M_LDPE are the mean measured HU values in the two ROIs,
Measured Noise = (SD_PS + SD_LDPE) / 2, where SD_PS and SD_LDPE are the standard deviations of the two ROIs.

To obtain meaningful measured data, the ROIs should not cover any area outside the inserts; e.g., in
the example given in the figure above, the ROIs are inside the insert borders.
An example acceptance criterion from a linac vendor is LCV < 2% [4][5].
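As an illustration only (not part of the myQA software), the LCV calculation above can be sketched in Python. The noise is taken as the average of the two ROI standard deviations, matching the published EFOMP/ESTRO/IAEA definition; the input numbers in the usage below are made-up examples, not calibration data.

```python
# Sketch of the LCV calculation from the nominal HU values of the two
# materials and the pixel statistics of the two ROIs.
import numpy as np

def lcv_percent(ct_ps, ct_ldpe, roi_ps, roi_ldpe):
    """Low Contrast Visibility from two ROIs (arrays of HU values)."""
    reference_contrast = ct_ps - ct_ldpe
    measured_contrast = np.mean(roi_ps) - np.mean(roi_ldpe)
    measured_noise = (np.std(roi_ps) + np.std(roi_ldpe)) / 2.0
    return (reference_contrast / 10.0) / (measured_contrast / measured_noise)
```

With ROIs that reproduce the nominal contrast exactly, the LCV reduces to the measured noise divided by one tenth of the contrast.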
CNR
The CNR is defined as [5]:

    CNR = (M1 − M2) / (0.5 × (SD1 + SD2))

where M1 and M2 are the mean HU values in ROIs covering two different inserts, and SD1 and SD2 are
the standard deviations in these ROIs.
Uniformity
The uniformity test is done with a homogeneous slice of a phantom (see an example below).
Example: Slice used for the uniformity test with five ROIs.
It is recommended to use a central ROI together with more than two peripheral ROIs. In a first analysis
step it is determined which of the ROIs is the central one. The mean values of all ROIs are stored. The
raw pixel values are used for the calculations of this test.
The following uniformity values in % are calculated:

Overall uniformity

    Overall-uniformity = 100 × (1 − (M_max − M_min) / M_avg)

where M_max and M_min are the maximum and minimum mean values of the ROIs and M_avg is the
average mean value of all ROIs.

Minimum-uniformity

Minimum-uniformity is the relative difference of the maximum mean value among the peripheral ROIs to
the central ROI:

    Minimum-uniformity = 100 × (1 − M_diff_max / M_center)

where M_diff_max is the maximum mean value among the peripheral ROIs and M_center is the mean value of
the center ROI.
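The two uniformity metrics above can be sketched as follows (illustrative only, not the myQA implementation). Note one assumption: the symbol M_diff_max is read here as the largest deviation of a peripheral ROI mean from the central ROI mean, which keeps the metric at 100% for a perfectly uniform image, consistent with the overall-uniformity scale; the manual's own wording is ambiguous on this point.

```python
# Sketch of the CBCT uniformity metrics computed from ROI mean values.

def overall_uniformity(roi_means):
    """Overall-uniformity = 100 * (1 - (Mmax - Mmin) / Mavg)."""
    m_avg = sum(roi_means) / len(roi_means)
    return 100.0 * (1.0 - (max(roi_means) - min(roi_means)) / m_avg)

def minimum_uniformity(center_mean, peripheral_means):
    """Minimum-uniformity = 100 * (1 - Mdiff_max / Mcenter), where
    Mdiff_max is taken as the largest deviation of a peripheral ROI mean
    from the central ROI mean (an assumption; see the lead-in)."""
    m_diff_max = max(abs(m - center_mean) for m in peripheral_means)
    return 100.0 * (1.0 - m_diff_max / center_mean)
```

For a perfectly homogeneous phantom both functions return 100%.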
Select CBCT kV/MV for Category (1) (for information display only)
Select one of the CBCT Result Type (2), e.g., CBCT: Contrast for an individual test or CBCT:
Combined for all test result types.
Define Energy (3): e.g., select the kV radio button and enter the kV value.
Note that when Undefined is selected, the exact value of the energy cannot be entered.
Note: Only tests with the same defined energy or those with Undefined selected can be shared.
Migrated tests from a previous version will have an undefined energy by default. A manual
adjustment in the Test Setup to match the actually used energy type and value is advised.
Click OK (4). The test is listed in the Protocol Tree panel and displayed in the Test Definition
panel.
Repeat the previous steps to create more tests as needed.
In the Test Definition panel, click Open Module (5). The myQA - Imaging QA – TestSetup
CBCT window opens with default settings.
Click the Import folder box in the Equipment setup section. Browse and select the folder
containing the image or image stack as reference images and then click Open Folder. Select the
desired series in the Select Series dialog.
Once the images are imported and the ROIs have already been defined (e.g., for an SW-supported
phantom), the SW automatically selects the correct slice and assigns it to a designated module
as a reference image. It is recommended to check whether the auto-slice selection is correct:
o Select a phantom module in the Modules box. In the Images section, check whether the
predefined ROIs are correctly placed. Do this for all modules.
o If the ROIs are only slightly misplaced, use the functions provided in the Tools panel to
move the ROIs (Section 5.3.3.3).
After reference images for all modules are set, click the Process button in the Baseline values
section. The Expected and tolerance (Warn and Fail) values will be filled. Edit the tolerances if
necessary by changing the values in the corresponding boxes.
Click the OK button. The reference images and configuration files are listed in the Test
Definition section.
Repeat the above “Define the test baseline” step for each CBCT QA test if desired.
Click Set Active in the ribbon if the protocol is not yet set as an active protocol.
Click Save in the ribbon.
Note: For this step, you can also create a test template first, and then copy the template into the
machine protocol, see Section 3.3 for instructions.
Click the Start button. The myQA – Imaging QA – TestRun CBCT window opens.
Hint: If no setup is saved, the Start button in Test Run is disabled.
Click the Import folder button. Browse and select the CBCT image folder. The SW loads all images
taken for each module.
As with importing a reference image, once the images are imported the SW automatically selects the
correct slice based on the ROIs and assigns it to a designated module as a test image (active
image). It is recommended to check whether the auto-slice selection is correct:
Select a phantom module in the Modules box. In the Images section, check whether the image
is similar to the reference image and the ROIs are correctly placed. Do this for all modules.
If the ROIs are only slightly misplaced, use the functions provided in the Tools panel to move
the ROIs (Section 5.3.3.3).
If a different slice needs to be selected, scroll through the slices to search for the right image for
this module. Once found and the ROIs are also properly defined, click the Is active image
button.
If necessary, the Image view can be displayed in full window by clicking the icon, and the user
can zoom the image further. See Section 5.3.3 for more information on the myQA – Imaging
QA – TestRun CBCT window.
Once the test images for all modules are properly selected, click Process in the Baseline values
section. See the description of test results in Section 5.3.3.5.
Click OK; results for executed CBCT tests that share the same module are displayed if the Share
Results checkbox is checked.
Repeat the above step to execute the next test until all tests are finished.
Click OK to close the myQA – Imaging QA – TestRun CBCT window.
Click Finish in the Test Run ribbon and then Finish in the Finish Task dialog.
Afterwards, the test results will appear in myQA Cockpit and a report can be created in the Test
Repository page.
Machine
This field displays the machine that was used to perform the QA. It is for information display only.
SID (mm)
This field displays the source-to-imager distance (SID) in mm. The SID is typically imported from the
evaluated image, but it can be edited if necessary. The SID is used to calculate distances at the
isocenter and is used in ROI placement and in the scaling discrepancy test.
In Test Run, if the imported file has a SID different from the one used to create the baselines, a
warning message pops up:
SAD (mm)
This field displays the Source to Axis Distance (SAD) that
was used to perform the QA. The SAD comes typically from
the myQA Platform > Equipment setup, but it can be
edited if necessary. The default SAD value is 1000 mm.
Phantom model
The phantom model is only selectable during the Test
setup. Select the phantom used for the CT scan. The
dropdown list shows the phantoms containing predefined
ROIs for tests.
Module
Select different modules of the selected phantom to display their respective images and results.
To edit the name of a phantom or module, select the item and then enter the new name.
Full display
To facilitate the inspection of the image or ROIs, the image can be displayed in full view by clicking the
full view button in the lower-left corner:
Zoom in/out
Place the mouse on an image and scroll the mouse wheel up/down to zoom in/out.
Resize/move ROI
Place the cursor on the edge of an ROI; you can drag and drop to change the size of the ROI. Place the
cursor inside an ROI; you can drag and drop to move the ROI to any position in the image.
Rotates the selected ROI around the ROI center with the defined angle
Rotates the selected ROI around the phantom center with the defined angle
Rotates all ROIs around the phantom center with the defined angle
Rotates all ROIs around the phantom center with an optimized angle
If ROIs in the SW are not aligned with the desired regions on your CT image, use the Move and/or
Rotate functionalities for the ROIs. The rotate function is especially useful for modules for the spatial
resolution test of the CatPhan® phantoms.
ROI Coordinates
The ROI coordinates are (x, y) pairs that define a sequence of boundary points for positioning the
selected ROI on the image. ROI coordinates are measured from the center of the phantom. The SW
automatically detects the center of each module when results are calculated, even without a reference
image.
Click the Edit button to open the Edit regions of interest dialog. You can edit or delete the existing
ROIs, and add new ROIs.
Edit a ROI
Select the module and then the ROI to be edited in the Region of interest section. Edit the parameters
in the Coordinates and Test sections.
ROI coordinates:
Circle: x, y coordinates of center and radius
Rectangle: x, y coordinates of four corners
ROI coordinates are measured from the center of the phantom. The SW automatically detects the
center of each module when results are calculated, even without a reference image.
Note that an existing ROI can be resized and moved by dragging and dropping with the mouse, and
rotated in the Tools panel.
Delete a ROI
In the Region of interest section, select the module and the ROI, then click Delete.
Add a ROI
In the Region of interest section, select the module and an ROI with the same test as the one you
want to add, then click Add. A new ROI name is added at the end of the list.
In the Coordinates section, select the ROI shape (circle or rectangle) and edit the coordinates.
Enter the name of the material if applicable.
In the Test section, select the test name and enter the parameter if required.
Note: The SW does not check the ROI coordinates against the borders of the image. Ensure the
defined ROI does not exceed the image borders.
Expected values
Enter the expected values for the test result of interest in the units indicated in parentheses next to the
result (e.g., %, mm, or HU). The absolute differences between computed results and their baseline
value are calculated to compare with tolerances. See AAPM’s TG-142 for guidance in determining an
expected value for each result.
To disable an analysis, leave the expected value box of a parameter empty; this parameter will not be
analyzed.
Passed
The deviation from the expected value does not exceed the warning tolerance.
Warning
Enter a positive value for the minimum deviation from the baseline that will generate a warning sign in
the display. The warning value should be less than or equal to the value entered in the Fail column
(equal if you wish to use only pass or fail indications).
Failed
Enter a positive value for the minimum deviation from the expected value that will generate a failure sign
in the display. The fail value should be greater than or equal to the warn value.
Process button
Click the Process button to recalculate results.
Once a parameter is modified, a message appears next to the Process button to warn the user to
repeat the process:
An example of an MTF chart. The coordinates are displayed by pressing the mouse on the curve.
5.4. References
[1] E. Klein, "Task Group 142 report: Quality assurance of medical accelerators," Med. Phys., vol. 36,
p. 4197, 2009.
[2] DIN, "Leistungsmerkmale zur Bildgebung von Röntgeneinrichtungen für die Computertomographie,"
DIN EN 61223-2-6, 2008.
[3] H. de las Heras-Gala, "Quality Control in Cone Beam Computed Tomography EFOMP-ESTRO-
IAEA Protocol," http://dx.medra.org/10.19285/CBCTEFOMP.V1.0.2017.06, 2017.
[4] ELEKTA, "XVI 5.0 Customer Acceptance Tests," 2013.
[5] Lehmann, "Commissioning experience with cone-beam computed tomography for image-guided
radiation therapy," Journal of Applied Clinical Medical Physics, p. 21, 2007.
[6] Kaur, "Image Recognition using Coefficient of Correlation and Structural Similarity Index in
Uncontrolled Environment," International Journal of Computer Applications, vol. 59, no. 5, p. 32,
2012.
Table 6.1. The EPID QA Plugin test types and test parameters
The SW provides Individual tests for each test type and a combined test for all test result parameters.
Supported file formats: DICOM RT Image and tiff image.
Illustration of principle setup of an EPID measurement with a phantom placed on the table.
The figure below shows an image of a supported phantom, the QC-3 phantom from Standard Imaging.
The length scale in mm is projected to the isocenter and the displayed size corresponds to the actual
dimensions of the phantom. The defined ROIs for the different tests are indicated and numbered on the image.
Supplier    Phantom
PTW         EPID-QC
Mobius      MC2 MV
IBA         Primus A
IBA         DIGI-13
In the image of a PTW EPID QC phantom below, the circles in the corners are automatically detected.
The center of the large circle connecting the four corner-circles is used as the center of the phantom.
For automatic rotation, the rotation angle is determined via the positions of the four circles in the
corners. This angle is taken into account for placing the ROIs.
Image of the PTW EPID QC phantom with detected circles at the phantom center and corners
Mobius MC2 MV
For the Mobius MC2 MV phantom, a rectangle and a circle in the middle are detected. The center of the
circle is the center of the phantom. The nominal orientation angle of the rectangle is −45°. The deviation
from this angle corresponds to the rotation of the phantom.
The SI QC-3 and kv1 phantoms for MV and kV radiation, respectively, have a very similar structure.
Rectangular structures are detected for the determination of the phantom center and the rotation angle.
The cross in the middle is detected for the MV-QA and kV-QA phantoms from Sun Nuclear.
SNC MV-QA (left) and kV-QA (right) phantoms with detected rectangles
The Varian Las Vegas and IBA DIGI-13 phantoms feature external rectangular structures, which serve
for the determination of the phantom centers.
Varian Las Vegas (left) and IBA DIGI-13 (right) phantoms with detected rectangular structures
The Leeds TOR 18FG phantom has an external circular shape and internal rectangular structures. The
detected outer circle determines the phantom center. The centers of the two small rectangles are used
for the calculation of the rotation angle. The shown orientation corresponds to a rotation angle of 0
degrees.
Leeds TOR 18FG phantom with detected circular and rectangular structures.
IBA Primus A
The center of the IBA Primus A phantom is automatically determined via detection of the central
cross, indicated by the red circle in the figure.
Scaling Discrepancy
This test compares a distance extracted from the image with a known value [5]. For this purpose,
distances between two structures of the phantom are extracted from the image. In the example below,
two circular ROIs (9 and 10) are placed on two diagonal corners of the phantom.
Scaling discrepancy test - the distance between the two circular ROIs is determined
The distance between the centers of the two ROIs is measured. The measured distance (d_measured) is
then compared to its nominal value (d_nominal, defined by the phantom geometry):

    Δd = d_measured − d_nominal

Δd denotes the scaling discrepancy, which is stored in the myQA database.
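The scaling discrepancy can be sketched as follows (illustrative only, not the myQA implementation). One assumption is flagged: distances measured in the imager plane are projected to the isocenter with the factor SAD/SID, which is the standard divergent-beam scaling; the manual states that the SID is used to calculate distances at the isocenter but does not give the formula.

```python
# Sketch of the scaling discrepancy test: distance between the centers of
# the two circular ROIs, projected to the isocenter plane, minus the
# nominal distance from the phantom geometry.
import math

def scaling_discrepancy(center_a, center_b, d_nominal, sad=1000.0, sid=1000.0):
    """Delta-d in mm; centers are (x, y) positions in mm in the imager plane."""
    imager_distance = math.dist(center_a, center_b)
    # Assumed projection from the imager plane to the isocenter via SAD/SID.
    d_measured = imager_distance * sad / sid
    return d_measured - d_nominal
```

With the phantom at the isocenter plane (SID = SAD = 1000 mm, the default), no projection is applied and the result is simply the measured minus the nominal distance.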
Spatial Resolution
The spatial resolution test is the same as for the CBCT case (see Section 5.2.3). The example below
shows an image of a SI QC-3 phantom where the structures covered by ROIs 1-5 serve for the spatial
resolution determination.
The CNR is defined as [5]:

    CNR = (M1 − M2) / (0.5 × (SD1 + SD2))

where M1 and M2 are the mean signal values in the two ROIs, and SD1 and SD2 are the standard
deviations in the two ROIs.
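The CNR formula above translates directly into code; the sketch below is illustrative only and computes the ROI means and standard deviations from raw pixel arrays.

```python
# Sketch of the CNR calculation from two ROI pixel arrays.
import numpy as np

def cnr(roi1, roi2):
    """CNR = (M1 - M2) / (0.5 * (SD1 + SD2))."""
    m1, m2 = np.mean(roi1), np.mean(roi2)
    sd1, sd2 = np.std(roi1), np.std(roi2)
    return (m1 - m2) / (0.5 * (sd1 + sd2))
```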
Minimum uniformity
The ROIs for the uniformity test must be in homogeneous regions of the phantom. In the example
image above, the ROIs for this test are ROIs 7-10. Uniformity is calculated for each ROI; it is defined
as:
    Uniformity = (1 − (S90 − S10) / (S90 + S10)) × 100%
S90 and S10 denote the 90% and 10% percentiles of the signals, respectively. Using the 90% and 10%
percentiles instead of the maximum and minimum values has the advantage of being less sensitive to
single defective or noisy pixels. The value stored in the database is the minimum uniformity value
obtained from the individual ROIs.
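The percentile-based uniformity above can be sketched as follows (illustrative only, not the myQA implementation); the stored value is the minimum over the ROIs.

```python
# Sketch of the percentile-based EPID uniformity metric.
import numpy as np

def roi_uniformity(roi):
    """Uniformity = (1 - (S90 - S10) / (S90 + S10)) * 100 for one ROI."""
    s90, s10 = np.percentile(roi, 90), np.percentile(roi, 10)
    return (1.0 - (s90 - s10) / (s90 + s10)) * 100.0

def minimum_uniformity(rois):
    """Value stored in the database: the minimum over all ROIs."""
    return min(roi_uniformity(r) for r in rois)
```

A perfectly flat ROI yields 100%, and a single outlier pixel barely changes the result because the 90%/10% percentiles, not the extremes, enter the formula.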
The phantom is aligned to the center of the image. The MV or kV image has been acquired.
Note: The baselines and settings defined in Test Setup will be used for every execution in Test Run.
However, they cannot be changed in Test Run.
Procedure
Click Edit in the Test Setup ribbon to enter the Edit mode.
Select the desired machine in the Machines & Templates panel.
In the Protocol Tree panel, click New in the ribbon to create a Task.
With the task selected, click New in the ribbon to create tests:
In the New Test Definition dialog (see next page), enter the required name of the test and
optionally a description.
Select Category (1) (Planar MV (EPID), for information display only).
Select Result Type (2) (e.g., EPID (MV) / Planar (kV): Combined or Contrast).
Implementation will be automatically selected according to the Result Type.
Define Energy (3): select the MV radio button and enter the MV value.
When Undefined is selected, the exact value of the energy cannot be entered.
In the Test Definition panel, click Open Module (5). The myQA – Imaging QA – TestSetup
window opens. Note that detailed information and handling in the myQA – Imaging QA –
TestSetup window are provided in Section 5.3.3 and Section 6.5.3.
myQA - Imaging QA - Test Setup CBCT window – the Equipment setup and Images sections
Inspect the predefined ROIs on the reference image. If necessary, modify them.
o If desired, the Images view can be displayed in full window by clicking the icon, and the
user can zoom the image further.
o The ROIs on the test image on the right can be resized and moved by dragging and dropping
o Use the tools on the Tools panel to change the brightness and contrast of the images if
necessary (for instruction see Section 5.3.3.3). Changes of the brightness and contrast have
no influence on the calculation of the test results.
o The image can be displayed in several options of color palettes by selecting one in the Color
Palette dropdown menu.
o The ROIs can be edited using the Edit button in the Regions of Interest panel (for instruction
see Section 5.3.3.4).
After checking the ROIs, click the Process button in the Baseline values section. The Expected
and tolerance (Warn and Fail) values will be filled. Edit the tolerances if necessary by changing
the values in the corresponding boxes.
Click OK; the reference image and configuration files are listed in the Test Definition panel.
The current configuration files will be deleted by clicking Reset configuration or Open Module.
Repeat the above steps until all baselines for other EPID QA tests are created.
Click Set Active in the ribbon if you want to perform the tests.
Click Save in the ribbon.
Note: In this procedure, you can also create a test template first, and then copy the template into the
machine protocol, see Section 3.3 for instructions.
Procedure
In the Test Run workspace, select the corresponding protocol and task in the Agenda panel
In the content area, select a test.
Note that the test results of the four EPID tests can be shared with each other. Checkmark the Share
Results box; by starting one test, all four EPID tests will be performed.
Test Run workspace – an EPID task is selected (before tests are executed)
For a combined test, the Share Results box does not appear since it is redundant.
Inspect the predefined ROIs on the test image. If necessary, modify them.
o If desired, the Image view can be displayed in full window by clicking the icon, and the
user can zoom the image further.
o The ROIs on the test image on the right can be resized and moved by dragging and dropping
o Use the tools on the Tools panel to change the brightness and contrast of the images if
necessary (for instruction see Section 5.3.3.3); ROIs can be edited using the Edit button in
the Regions of Interest panel (for instruction see Section 5.3.3.4).
Test results (Example) - Baseline values and MTF chart sections. In this example, more EPID
tests are defined and Share results is enabled; all the tests were run automatically and the results
are displayed.
Test Run workspace – an EPID task is selected (after tests are executed)
Click Finish in the ribbon and then Finish in the Finish Task dialog. Afterwards, the test results will
appear in myQA Cockpit and a report can be created in the Test Repository page.
To edit the name of a phantom, select the item and then enter the new name.
Click OK
In the myQA - Imaging - Test Setup EPID Window, select the newly created phantom in the
Phantom model dropdown list.
Click Import file, browse and select the image and click Open to load it in the Images view.
In the Region of Interest panel, define ROIs (see Section 5.3.3.4 for instructions).
Click OK
Click Process and then check whether the Expected values and the tolerances (Warn and Fail) are
ok. If the expected value is not ok, check the ROIs. Edit the tolerances if necessary.
Click OK to close the window.
6.6. References
[1] E. E. Klein, "Task Group 142 report: Quality assurance of medical accelerators," Med. Phys., vol. 36,
p. 4197, 2009.
[2] DIN, "Elektronische Bildempfänger (EPID) – Konstanzprüfung," DIN 6847-6, 2012.
[3] DIN, "Leistungsmerkmale zur Bildgebung von Röntgeneinrichtungen für die
Computertomographie," DIN EN 61223-2-6, 2008.
[4] D. Dance, "Diagnostic Radiology Physics," IAEA, 2014.
[5] H. de las Heras-Gala, "Quality Control in Cone Beam Computed Tomography EFOMP-ESTRO-
IAEA Protocol," http://dx.medra.org/10.19285/CBCTEFOMP.V1.0.2017.06, 2017.
Table 6.1. The MLC QA Plugin test types and test parameters
The test procedures of these three test types are similar. However, the Picket Fence requires analysis
for only one gantry angle position; whereas the other two test types require analysis for four gantry
angle positions: 0°, 90°, 180°, and 270°.
The MLC QA Plugin provides individual tests for each test type and a combined test for all test
types.
Calculations
Automatic image pre-processing
Irradiated peaks
For each MLC and each stripe, the irradiated peak is identified.
A peak is defined by field size and center position as:
Results
The results will be displayed on the loaded image and a green mark (red cross) will be positioned on
the corresponding leaf if the result passes (fails).
Define the test baselines. The baseline for a test is defined once in Test Setup and will be used in
every execution of the test.
In the Test Definition panel, click the Open Module button. The myQA – Imaging QA –
TestSetup MLC window opens (for detailed information / instruction about this window, see
Section 7.3.3).
Click the Import file box, browse and select the reference image, then click Open.
The image is loaded into the Images section and the SID, SAD, and gantry angle from the
DICOM file are displayed in the Equipment setup section.
The Images section is displayed in full view and further enlarged. The coordinates and intensity
of the cursor position are also displayed.
Once the isocenter position is correctly defined, start the calculation by clicking the Process
button in the Baseline values section. The expected values will be determined.
Inspect the tolerances (in the Warn and Fail columns) provided in the SW. Edit them if
necessary.
Note: The predefined expected values and tolerances are example values coming from test images,
protocols or best practice and users should inspect these values.
myQA – Imaging QA – TestSetup MLC window – Baseline values and Deviation histogram
sections (after process)
Click OK. The reference image and configuration files will be listed. The current reference image
and configuration files can be deleted by clicking Reset Configuration.
Test Run workspace – an MLC QA task is selected (before tests are executed)
Note that the Leaf Position Accuracy test requires analysis at four gantry angle positions: 0°,
90°, 180°, and 270°. Click the corresponding Start button to execute the test for each gantry angle.
Since the default isocenter position is the center of the image, check whether it is correct. As in Test
Setup, the isocenter position can be changed by drag and drop with the mouse. See detailed
instructions in Section 7.3.3.2.
myQA – Imaging QA – TestRun MLC window - the Images section is displayed in full view and the
test image is further enlarged. The coordinates and intensity of the cursor position are displayed.
Since the three MLC QA test types use the same images, the analysis results for one test can be
propagated to the other MLC QA tests in the test list by selecting the Share results box. This box is
available as long as there is more than one similar test in one task.
Please note that results are only shared among tests of the same type that either have a matching
baseline configuration or no baseline configuration.
Since the images of Leaf Position Accuracy and Picket Fence tests are gantry angle dependent, the
results can be shared only for the tests with the same gantry angle. For the image of the Picket Fence
test, the gantry angle from Test Setup is taken.
myQA – Imaging QA – TestRun MLC window. The sections highlighted with red box are not available
in the myQA – Imaging QA – TestSetup MLC window
In Test Run, if the SID / SAD / gantry angle / collimator angle value of the imported test image is
different from the value in Test Setup, a warning appears and the test setup value is shown when
hovering over the warning symbol. See an example below:
Click No to start importing again; click Yes to load the image, in which case a warning sign appears.
The analysis can still be executed.
MLC model
Specifying the MLC model allows the SW to overlay leaf numbers on the image and to determine which
leaves had errors. By default, the isocenter point is assumed at the center of the EPID image. The user
can modify the isocenter position (see Section 7.3.3.2).
MLC Image
For importing the reference image (in Test Setup) or test image (in Test Run). A reference image is an
image of an MLC module used to calculate baseline values.
Offset tolerance (mm)
This is the offset of the measured peak position against the expected peak position. This tolerance
decides whether the measured peak passes or fails.
To edit a module, select the MLC module in the MLC section; in the Leaves section, edit the number
of the leaves and widths. Leaves can be added or deleted with the Add or Delete button. Use the
Up and Down buttons to re-arrange the order of the leaves if necessary.
To delete a module, select the MLC module and click the Delete button in the MLC section.
To add a new module, click the Add button in the MLC section, and then enter the module name in
the new space on the list (the MLC module name must be unique). In the Leaves section, click the
Add button, and then edit the number of the leaves and width. Repeat this until all leaves with
different widths are entered.
Click OK / Cancel to confirm / cancel the change.
Isocenter
Coordinates (in pixel and mm) and
intensity of a pixel at the cursor position
The Tools panels for Test setup and Test run are very similar. Except for Brightness & Contrast (which
is available for both reference and test images), the tools apply to the active image of the workspace.
Passed
The expected value does not exceed the warning tolerance.
Warning
Enter a positive value for the minimum deviation from the expected value that will generate a warning sign
in the display. The warning value should be less than or equal to the value entered in the Fail column.
Failed
Enter a positive value for the minimum deviation from the expected value that will generate a failure sign
in the display. The fail value should be greater than or equal to the warn value.
Process button
Click the Process button to calculate results.
During the Test setup, after processing is done, the user can modify any of the warning and fail
tolerances as desired.
Besides the results displayed in the Baseline values section, the values of the following parameters
per strip are also displayed in the Detailed Results report:
The stripes are identified by a number (numbering goes from left to right) and by a color. One or more
stripes can be hidden by deselecting them in the legend.
Left: principle of data acquisition, middle: open field, right: VMAT field.
Analysis steps:
Read in open and VMAT field images.
Extract information from file headers (if available):
o Pixel spacing
o SID and SAD
Normalize VMAT field to open field to account for flatness and symmetry of the beam
Detect center of field
Via image processing tools (the same as for the EPID Plugin; see Section 6.3). This method is based
on the detection of a rectangle in the open field images. See the examples shown in the figure below;
positive and negative contrast are treated automatically, and the fields can also exceed the image in
one direction.
Profiles of the open and VMAT fields are shown in the left figure below. Furthermore, vertical dashed
lines mark the found field center position (red) and ROI positions (green). The right figure below shows
the VMAT field after normalization to the open field. In this example, six different dose rates (from 125
to 440 MU/min) were irradiated. The ROIs are placed in the middle of the irradiation zones.
Left: horizontal profiles of open and VMAT fields. Right: Normalized VMAT field with ROIs and applied
MU/min.
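The normalization described above is a pixel-wise ratio of the VMAT field to the open field, which divides out the beam's flatness and symmetry so that only the dose-rate pattern remains. A minimal sketch of this calculation using NumPy (function and variable names are illustrative, not the product's internal API):

```python
import numpy as np

def normalize_vmat_to_open(vmat: np.ndarray, open_field: np.ndarray,
                           eps: float = 1e-9) -> np.ndarray:
    """Pixel-wise ratio of the VMAT field to the open field.

    Dividing by the open field removes beam flatness and symmetry,
    leaving only the dose-rate pattern of the VMAT delivery.
    """
    if vmat.shape != open_field.shape:
        raise ValueError("images must have the same shape")
    return vmat / np.maximum(open_field, eps)  # guard against zero pixels

def roi_stats(image: np.ndarray, top: int, left: int,
              height: int, width: int) -> tuple[float, float]:
    """Mean and standard deviation inside a rectangular ROI."""
    roi = image[top:top + height, left:left + width]
    return float(roi.mean()), float(roi.std())
```

For a perfectly uniform pair of fields the ratio is constant, so every ROI reports the same mean and zero standard deviation; in practice each ROI's mean reflects the dose rate of its irradiation zone.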
The phantom is aligned to the center of the image. The EPID open field and MLC field images are
measured. The images can be in DICOM (RTImage), .tiff, or .opg file format.
Note: The baselines and settings defined in Test Setup will be used for every execution in Test Run.
However, they cannot be changed in Test Run.
Note that detailed information and handling in the myQA – Imaging QA – TestSetup VMAT
window are provided in Section 8.2.3.
In the Equipment setup section (1):
Machine: Displays the machine selected in the beginning of this procedure.
SID and SAD: Default is 1000 mm. SID will be updated upon importing images.
ROI model: Click the dropdown box and select a desired ROI set in the predefined ROI set list.
Open field: Click the corresponding Import file button and load the desired open field image.
VMAT field: Click the corresponding Import file button and load the desired VMAT field image.
Select the Normalize ROI results to 1.0 check box if desired.
To view the imported images, select the corresponding tab on the top of the Image view (2). The
display of an image can be adjusted with the tools in the Tools panel.
In the Profile chart section (5), the inline / crossline profiles of the open and VMAT fields are
displayed.
The ratio of the two images is automatically calculated and displayed in the Ratio tab together
with the ROIs. The center of the image is automatically identified. However, it can be re-located
by drag and drop. The predefined ROIs will be automatically shifted. The ROI values can be
viewed and edited in the Region of Interest panel.
The ROI mean values and standard deviations are shown by clicking the Show detailed
results button.
Pre-conditions
The protocol is set to active in Test Setup.
Procedure
In the Test Run workspace, select the corresponding protocol and task in the Agenda panel
In the content area, select the test and then click Start.
Test Run workspace – a VMAT task is selected (before tests are executed)
The myQA – Imaging QA – TestRun VMAT window opens. The equipment setup parameters,
normalization, expected values, and tolerances set up in the Test setup workspace are displayed.
They cannot be modified in the Test run workspace, except that SID and SAD can be adapted to
the actual setup.
Click Open field - Import and load the open field image file.
Each ROI mean value and standard deviation are shown by clicking the Show detailed results
button.
Test Run workspace – a VMAT task is selected (after tests are executed)
Click Finish in the ribbon and then Finish in the Finish Task dialog. Afterwards, the test results will
appear in myQA Cockpit and a report can be created in the Test Repository page.
Machine
This field displays the machine that was used to perform the QA. It is for information display only.
SID (mm)
This field displays the source to imager distance (SID) in mm. The SID is typically imported from the
evaluated image, but it can be edited if necessary. If the SID in the imported file differs from the current
definition, a message pops up upon importing; click Yes to update the value to the one in the imported
file.
The SID is used to calculate distances at the isocenter and used in ROI placement and scaling
discrepancy.
SAD (mm)
This field displays the Source to Axis Distance (SAD) that was used to perform the QA. The SAD
comes typically from the myQA Platform > Equipment setup, but it can be edited if necessary. The
default SAD value is 1000 mm.
ROI model
The ROI model is only selectable during the Test setup. Select a predefined ROI model for the test in
the dropdown list.
To edit and delete an existing ROI set or add a new ROI set, click the Edit button (see Section 8.2.3.4).
Examples for Image view in Test Setup (left) and Test Run (right)
Isocenter
The isocenter is assumed at the center of the image; it is automatically located by default and
displayed with a “+” on the active ratio image. It can be relocated by drag and drop and
reset to default by clicking Isocenter – Reset in the Tools panel (Section 8.2.3.3).
Resize/move ROI: Place the cursor on the edge of an ROI and drag and drop to change the size of
the ROI. Place the cursor inside an ROI and drag and drop to move the ROI to any position in the
image.
Full display
To facilitate the inspection of the image or ROIs, the image can be displayed in full view by clicking the
full view button in the lower-left corner:
To edit the name of a ROI module, select the item and then enter the new name.
To delete a module, select the item and click the corresponding Delete button.
Note: The SW does not check the ROI coordinates against the borders of the image. Ensure the
defined ROI does not exceed the image borders.
ROI Coordinates
ROI coordinates are (x, y) pairs at four corner points of the area, measured from the center of the
phantom. The SW automatically detects the center of each irradiation field when results are calculated,
even without a reference image. The center can be moved (see the step below)
In the Image view section of the myQA - Imaging - Test Setup VMAT window, adjust position and
area of the ROIs by drag and drop:
Zoom the image to the area of interest, select a color palette and adjust the brightness and
contrast to have the best view of the image.
Check the isocenter position with the help of the profiles in the Profiles chart display. Ensure the
isocenter is at the center of the profiles.
Adjust the size and position of each ROI so that it is in the center of the irradiated area.
Click OK when done.
For the Expected, Warn, and Fail Values table, comparison result symbols, Process button, and
disabling analysis of a parameter, see Section 5.3.3.5.
The chart can be displayed in full window by clicking the button in the upper right corner.
8.3. References
[1] C. C. Ling, "COMMISSIONING AND QUALITY ASSURANCE OF RAPIDARC RADIOTHERAPY,"
Int. J. Radiation Oncology Biol. Phys., pp. 575-581, 2008.
Image acquisition
In general, any phantom designed for the Winston-Lutz test can be used, e.g., QUASAR™ IsoCenter
Cube from Modus Medical Devices Inc., Winston Lutz Phantom from Standard Imaging Inc. An example
using the IBA cylindrical phantom is described in Section 9.4.1.
See the table below for the Winston-Lutz images acquisition techniques and for which types of the
measurements they are suitable.
The images can have any size and can be inverted. For 3D results, the measurements must be
repeated at the cardinal angles 0°, 90°, 180°, and 270°. If multiple images are processed, they must all
have the same size and the same characteristics, e.g., SDD and DPI.
Note: The field size depends on the phantom used for the test. It should be big enough to contain the
WL ball. However, it should not be so big that it also covers irrelevant parts of the phantom,
e.g., the legs of the phantom, or any other object, e.g., the couch, that generates a pattern
interfering with the contour analysis.
WARNING
3D ANALYSIS
Four images with the following angle information should be provided to calculate
the 3D error correctly:
▪ Gantry angles are 0°, 90°, 180°, and 270°, respectively
▪ Same collimator angle for all 4 images
▪ Same couch angle for all 4 images.
2 Klein EE, Hanley J, Bayouth J, et al. Task Group 142 report: quality assurance of medical
accelerators. Med. Phys. 2009; 36(9): 4197-4212
Winston-Lutz test
3: Display of the selected image and the analysis results (if available) (see Section 8.2.3)
4: Results
5: General information:
SAD: taken from the value defined in myQA Platform > Equipment Setup.
SDD: it is read from the DICOM data. For the TIFF images, the SDD value needs to be
entered upon loading the images. The SDD is editable.
DPI: Dots per Inch of the images.
(Last column) Click any in this column to delete the selected image
Loading Images
Load image
Click Load Images to open the dropdown list:
DICOM: Loads DICOM images. See Section 11.1.2 for more instruction.
Tiff: Loads TIFF images. Select Tiff, in the Open browser, select the TIFF images, click Open, and
enter the SDD value in the popup dialog and click OK.
For loading multiple images, the order of the images follows the order of the filename for the TIFF
images, and of the ID (content of the nametag in the DICOM header) for DICOM images (same as in
the DICOM browser).
IMPORTANT NOTICE
GANTRY ANGLE INFORMATION
The SW only imports gantry angles from DICOM data if those are present in the
DICOM header. For TIFF images, the gantry angle needs to be defined
manually.
Perform analysis
Check that the checkboxes of the images to be tested are selected.
The displayed coordinates are the EPID’s coordinates for images from the EPID and the image’s
coordinates for TIFF images.
D↑ or D→: projection of the 2D deviation vector on vertical or horizontal axis
Dv: length of the 2D deviation vector
: When selected (on), it will zoom to the mouse-defined rectangle (place the cursor on the upper
corner, press and drag to the lower diagonal corner, and then release the mouse).
When un-selected (off), it is a pan tool: place the mouse anywhere on the image, press and drag
the image to the desired position.
Perform Analysis
Inner contour threshold: minimum intensity of the inner pattern threshold. This value determines the
edge of the center object.
Outer contour threshold: minimum intensity of the outer pattern threshold. This value determines the
edge of the radiation field.
Both thresholds are percentages (0-100) of the gray scale, corresponding to absolute values in the range 0-255.
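The percent-to-grayscale relation stated above can be made explicit. A minimal sketch, assuming an 8-bit gray scale (the function name is illustrative and not part of the SW):

```python
def threshold_to_gray(percent: float) -> int:
    """Convert a contour threshold given in percent (0-100) to the
    corresponding absolute 8-bit gray value (0-255)."""
    if not 0.0 <= percent <= 100.0:
        raise ValueError("threshold must be in the range 0-100")
    return round(percent / 100.0 * 255.0)
```

For example, a 40 % threshold corresponds to gray value 102, and 100 % corresponds to 255.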
After images are selected, the SW automatically searches for the contour pattern and uses the inner and
outer contours as defaults. They can be changed by setting different numbers in the Inner / Outer
contour box. By pressing the Refresh button, they are reset to the defaults.
Upon loading images, the SW automatically calculates the outer and inner contours using the first
image in the Image Parameter Table.
Click the Start button (1) on the top of the table to perform the test with the default contours
defined in (2).
Selecting/de-selecting an image by clicking the selection box will trigger calculation for this image
(displaying/removing the results for this image).
By clicking the Start button , all selected images are calculated. All 2D results will be displayed in
the table of the image parameters.
3D deviation calculation
IMPORTANT NOTICE
REQUIREMENTS FOR 3D CALCULATION
The 3D results are provided only if the following conditions are fulfilled:
• Only four images with gantry angle information, 0°, 90°, 180°, and 270°,
respectively, are selected for calculation.
• Collimator angle and couch angle are the same for all four images.
• All the images have the same DPI.
Note: If gantry angles are changed, the calculation does not re-start automatically. Calculation will start
after clicking the Start button.
Note that you can also create a test template first, and then copy the template into the machine
protocol. See Section 3.3 for instructions.
Click .
See the results listed in the Dv (deviation) and State (Passed/Warning/Failed/Undefined) columns
and in the Results boxes.
Click Finish; the results will be displayed.
2D Deviation Calculation
Assuming the coordinates of the outer contour center are OC(OC_horizontal, OC_vertical) and the
coordinates of the inner contour center are IC(IC_horizontal, IC_vertical), we have
Deviation along the horizontal direction:
D_h_pxl = OC_horizontal − IC_horizontal
Deviation along the vertical direction:
D_v_pxl = OC_vertical − IC_vertical
2D Euclidean distance between the centers of the outer and inner contours:
D_pxl = sqrt(D_h_pxl² + D_v_pxl²)
where the coordinates, deviations, and the distance are expressed in pixels.
Since 1 inch = 25.4 mm, the following formula converts pixels to mm:
A_mm = A_pxl / DPI × 25.4
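The 2D deviation calculation and the pixel-to-mm conversion described above can be sketched in a few lines of Python (a hedged illustration; the function names are not part of the SW):

```python
import math

def deviation_2d_pxl(oc: tuple[float, float],
                     ic: tuple[float, float]) -> tuple[float, float, float]:
    """2D deviation between the outer contour center OC (radiation field)
    and the inner contour center IC (ball), in pixels.

    Returns (D_h_pxl, D_v_pxl, D_pxl), where D_pxl is the 2D Euclidean
    distance between the two centers.
    """
    d_h = oc[0] - ic[0]  # deviation along the horizontal direction
    d_v = oc[1] - ic[1]  # deviation along the vertical direction
    return d_h, d_v, math.hypot(d_h, d_v)

def pxl_to_mm(a_pxl: float, dpi: float) -> float:
    """Convert a length in pixels to mm (1 inch = 25.4 mm)."""
    return a_pxl / dpi * 25.4
```

For example, with OC = (103, 54), IC = (100, 50), and an image at 100 DPI, the deviation is 5 pixels, i.e. 1.27 mm at the imager plane.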
Image directions
AP (anterior to posterior) = 0°
PA (posterior to anterior) = 180°
RT (patient’s right) = 90°
LT (patient’s left) = 270°
Definitions of Parameters
Bx (for each direction): distance in x between the inner contour center and the outer contour center
By (for each direction): distance in y between the inner contour center and the outer contour center
D3d: 3D Euclidean distance between the centers of the outer and inner contours.
3D deviation formula
Bx = −Bx_PA + (Bx_RT + Bx_LT) / 2
By = (By_RT + By_LT) / 2
Bz = (Bx_RT − Bx_LT) / 2
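Under the conditions in the notice above (four cardinal images with matching collimator/couch angles and DPI), the per-image deviations combine into the 3D offset components. A minimal sketch; combining Bx, By, and Bz into D3d as a Euclidean norm is our reading of the D3d definition above, and the function name is illustrative:

```python
import math

def deviation_3d(bx_pa: float, bx_rt: float, by_rt: float,
                 bx_lt: float, by_lt: float) -> tuple[float, float, float, float]:
    """3D offset components from the per-image deviations of the
    PA (180°), RT (90°), and LT (270°) images, per the formulas above."""
    bx = -bx_pa + (bx_rt + bx_lt) / 2.0
    by = (by_rt + by_lt) / 2.0
    bz = (bx_rt - bx_lt) / 2.0
    # Assumption: D3d, defined as the 3D Euclidean distance between the
    # contour centers, is the Euclidean norm of the three components.
    d3d = math.sqrt(bx * bx + by * by + bz * bz)
    return bx, by, bz, d3d
```
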
3 Grimm J, Grimm SL, Das IJ, et al. J Appl Clin Med Phys. 2011; 12(1):182-198
10.2.2.1. Sphinx V1
Normally, the Sphinx phantom is oriented on the patient couch as shown in the photo below. However,
the Lynx, which is attached to the Sphinx, can be oriented in 4 positions. The Sphinx phantom is
aligned through the X-ray imaging system; the in-room lasers can be checked on the markers engraved
on the phantom frame. See Section 7.3.3 in Vol. 1, myQA Platform and Devices User’s Guide, for the
detailed procedure for setting up the Sphinx V1 system.
Illustration of the Lynx orientation using the shape of the Lynx. The BEV goes vertically into the paper.
For example, in the BEV, a Lynx rotated clockwise by 90˚ is at +90˚.
Illustration of the Sphinx Compact orientation at gantry angles of 90°, 0°, and 270°
The phantom is aligned with the in-room lasers to the central fiducial and the markers engraved on the
phantom frame. The photo below shows a setup in which the Sphinx Compact is oriented for a gantry
angle of 90°. The PLD and template included in myQA are made for the gantry at 90°.
Example: the orientation of the Sphinx Compact with gantry angle at 90˚
See Section 7.3.4 in Vol. 1, myQA Platform and Devices User’s Guide, for the detailed procedure for
setting up the Sphinx Compact system.
Baseline Measurements
Initial measurements can be performed to generate baseline and tolerance values. These values are
then entered into a Sphinx Plugin protocol (see Section 10.3.1.2) for future analysis.
The baseline includes an X-ray image, a DICOM CT image of the Sphinx phantom, output
measurement (with e.g. DOSE 1 and PPC05 chamber), and a complete beam image. These
measurements should be done before using the Sphinx Plugin on a recurrent basis for Machine QA.
They will later be used as a reference for comparison. Therefore, the baseline should be
taken with the same measurement setup as for the subsequent daily QA checks.
An example of a PLD image for the Sphinx V1 system. All layers are selected for display.
An example of a PLD image for the Sphinx Compact system. All layers are selected for display. The PLD
has been created to use the Sphinx Compact with the gantry at 90°.
An example of a PLD (Sphinx Plugin-PLD Template.pld) shown in the figure above can be found in
C:\ProgramData\IBA Dosimetry\MachineQA\SphinxSamples
Click the Import *.pld button and browse for, select, and open the PLD file. See Section 10.3.1.3 for
more information.
In the Geometry group, edit the values according to the actual setup if necessary.
Defining the tests – it can be done by directly entering the baseline and user-defined tolerance
values. See Section 10.2.3 for the baseline measurements.
There are six test categories. Based on the procedure, they can be separated into two groups for
description.
The procedure for defining a test of the Couch Translation category is similar to the above. Here the
expected values and tolerances are taken from the analysis of the X-ray image.
Group 2: the procedure for defining a test in the other group of categories (Energy, Homogeneity,
Spot, and Coincidence):
1. Create a new test
2. Enter a name
3. Define the OOI (Object Of Interest)
4. Enter the expected values and tolerances.
See an example of the procedure for defining an Energy test below.
(Sphinx V1)
(Sphinx Compact)
Repeat the above steps to define a test for another energy until all the desired energy tests are
defined. The procedure for defining the Energy, Homogeneity, Spot, or Coincidence test is
similar to the above.
Click the OK button to finish editing the test.
After all tests are defined for the protocol template, click Save in the ribbon.
: Resets the image to the default display, which is the full size of the image, i.e. all patterns defined
in the PLD are shown.
: Zooms or resets the zoom by placing the mouse on the new center, and then rotating the mouse
wheel up or down.
: By selecting it, the cursor becomes a double arrow when placed on an edge of an OOI. Hold
down the left mouse button and drag the edge to the desired position. When placed inside an OOI,
the cursor becomes a four-way arrow. Hold down the left mouse button and drag the OOI to the
desired position.
Cursor: coordinate of the intersection of the two cursor lines.
Place the mouse on a cursor line, hold down the left mouse button, and drag the line to the desired position.
Settings group
1: for entering frame duration time (sec), the default value is 2 seconds
2a (only for Sphinx V1): for entering iris aperture (%); default: 100%
2b (only for Sphinx Compact): for entering the gain. The smaller the value, the greater the signal
3: starts background measurement
4: connects/disconnects the detector for measurements
WARNING
SPHINX COMPACT SETTINGS
The default values for the gain and frame duration are:
- Frame duration = 2 sec
- Gain = 4 pF
Geometry group
Rotation: for entering the orientation of the phantom. The orientation entered will be indicated with a
green corner of the outer edge of the full image. See the screenshot under PLD File group on the
previous page.
Phantom orientation indication: a corner of the outer edge of the full image becomes green, starting
from upper-left corner: 0˚, 90˚, 180˚, -90˚. See Section 10.2.2.1 (Sphinx V1) and Section 10.2.2.2
(Sphinx Compact) for the definition of phantom orientations.
By default, the following values are set in the Geometry group:
*In the Sphinx V1 system, the 20 mm shift in the Y-axis is due to the housing of the device; this is also
the shift when the device is oriented at 0˚ and aligned with the room lasers.
Illustration of the shift coordinate. The BEV goes vertically into the paper.
Analysis group
The Analysis group contains six tabs (1) for the supported test categories provided in the Sphinx
Plugin. To create a test,
Click the category tab for the desired test type (1), and select Create new; the test parameter
template (2) will open.
Edit the parameters.
This test will be listed in the dropdown menu of this tab.
Note: When entering values for the OOI coordinates, do not enter values beyond the image size
(< -149.5 mm or > 149.5 mm). Otherwise an Error Analysis message will appear. Then click OK
and re-enter a proper value.
Check through the parameters (3) that were defined in the copied template for all tests in the
Analysis group. If necessary, edit them so that they fit to the machine that is to be tested.
To activate the protocol for test run, select the protocol (4) and then click Set Active (5) in the
ribbon.
Click Save (6) in the ribbon.
1: Agenda panel: select a task for a machine to be run, 2: perform measurements/load images for
analysis, 3: define analysis settings and view results, 4: Task/Test Info panel (Section 3.4.4)
Note: 3a and 3b settings will be saved and used every day. It is recommended that the Iris opening
(3b) is at least 30 % to avoid image distortion.
4 The parameter 'BackgroundSamplingTime' in the myQA configuration file is for the measurement of
the intrinsic noise and is 20 s by default. To better match the use case, one can change the
parameter 'MQA_SphinxPlugin_BackgroundSamplingTime' in the configuration file (open
IBADos.CSP.Run.Shell.exe.config in C:\Program Files (x86)\IBA Dosimetry\myQA).
To make the changes effective, save the file and restart myQA.
3. Perform automatic measurement (synced with the PT delivery machine) and analyze it ( ); see
Section 10.3.3.2.3.
Note: Before a test run, please check the pixel defect filtering parameters edited in the myQA Platform
- Options - myQA PT - Sphinx Plugin workspace. The default parameter values can be viewed
and edited (see Section 5.4.1.4, Vol. 1, myQA Platform User’s Guide, where the defect filtering
algorithm is also described).
Images acquired with a Sphinx Compact have the following default settings:
■ Uniformity calibration applied. The uniformity calibration file loaded in the Equipment Setup
workspace is selected by default. The filename is displayed in the Test Run workspace.
■ Depth calibration applied. The depth calibration file loaded in the Equipment Setup workspace is
selected by default.
■ Defect pixel correction applied. The defect correction parameters are taken from the Sphinx
Settings in the myQA Platform - Options workspace.
The status (“true” or “false”) of the following parameters can be changed in the configuration files:
Geometrical correction
Median filter
WARNING
SATURATED IMAGES
If the image is saturated, a warning sign will appear in the Measure &
Analysis group. By placing the cursor on the warning sign, the following warning
message appears:
“The image appears to contain <n> saturated frames”, where <n> is the actual
number of frames where saturation was detected.
If it is an imported image, it is not qualified to be used for analysis. If it is a
measured image, the measurement should be repeated with a different gain
value of the Sphinx Compact or with a modified map by decreasing MU / spot
where the image is saturated.
Note: if desired, the parameters for a test can be edited by selecting the tab of this test (3c). The
parameters in the Actual column will be filled during the measurement or the user can enter
them manually.
6. Select the detector (2a), click button (2b) to connect the Sphinx Plugin to the device, and enter the
measurement time (2c) and gain (for Sphinx Compact) or iris aperture (for Sphinx V1) (2d), click (2e)
to measure the background.
7. Click , and the measurement starts. After the measurement finishes, analyses for the test categories
Energy, Homogeneity, Spot, and Coincidence are automatically executed.
8. Perform analysis for the test categories, Output and Couch Translation (see Section 10.3.3.3 for
the description of the tests).
9. (Optional) click the Save Image button. The image will be saved in DICOM format.
10. To finish the tests, click Finish in the lower-right corner. The summary of the test results is
displayed.
For more information of the Finish Task dialog, see Section 3.4.6
12. Click the Finish button to finish the task, or click the Finish & Report button to finish the task and
create a task report. For more information about a report, see Sections 2.5 and 10.4.
An example of a Lynx map (Sphinx Plugin - Measurement Example.dcm) and a test protocol (Sphinx
Plugin-Protocol Template.xml) are provided in C:\ProgramData\IBA
Dosimetry\MachineQA\SphinxSamples. Users can use this example to practice workflow 2.
Note: If desired, the parameters for a test can be edited by selecting the tab of that test (3c). The
parameters in the Actual column are filled during the measurement, or the user can enter them
manually.
2. Click . In the Load Image Manually window, select an image and click OK. For
detailed instructions on DICOM import, see Section 11.1.
If the DICOM image is older than one hour, a warning message is displayed. To continue the import,
click OK; the image is displayed.
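The one-hour age check can be sketched as follows. This is a minimal illustration; the DICOM date and time string formats are standard, but the exact tags the software reads are not specified here.

```python
from datetime import datetime, timedelta

def is_older_than(acq_date, acq_time, max_age=timedelta(hours=1), now=None):
    """Return True if a DICOM acquisition timestamp is older than max_age.

    acq_date: DICOM DA string, e.g. "20190315"
    acq_time: DICOM TM string, e.g. "142500" (fractional seconds ignored here)
    """
    acquired = datetime.strptime(acq_date + acq_time[:6], "%Y%m%d%H%M%S")
    now = now or datetime.now()
    return now - acquired > max_age

# An image acquired 90 minutes before "now" would trigger the warning
now = datetime(2019, 3, 15, 16, 0, 0)
print(is_older_than("20190315", "143000", now=now))
```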
WARNING
QUALITY OF IMPORTED IMAGES
The software does not check the imported images. Please ensure that they are
suitable for the comparison.
3. Click . The analysis for the test categories Energy, Homogeneity, Spot, and Coincidence will be
executed.
4. Perform the analysis for the test categories Output and Couch Translation (see Section 10.3.3.3).
8. Click the Finish button to finish the task, or click the Finish & Report button to finish the task and
create a task report. For more information about reports, see Sections 2.5 and 10.4.
This workflow is only for IBA PT Users with PTS R11 (and above).
1. In Agenda, select a machine to be tested under Machines (1a), and a task to be run in the Active
Tasks list (1b).
Note: If desired, the parameters for a test can be edited by selecting the tab of that test (3c). The
parameters in the Actual column are filled during the measurement, or the user can enter
them manually.
2. Select the detector (2a), click the button (2b) to connect the Sphinx Plugin to the device, and enter the
iris aperture (for Sphinx V1) or gain (for Sphinx Compact) (2d). Then click (2e) to measure the
background.
Note: It is not required to enter the measurement time (2c). It will be retrieved automatically from the
TCS (Therapy Control System).
3. Click , and the process info dialog opens. See sub-section, Measurement and Analysis
(Automation, IBA PT Users only), below for more information.
After the measurement finishes, the analysis for the test categories Energy, Homogeneity, Spot,
Coincidence, and Couch Translation is executed.
4. Perform the analysis for the test category Output (see Section 10.3.3.3).
5. (Optional) the image can be saved in DICOM format using the Save Image button.
To finish the tests, click Finish in the lower-right corner. The summary of the test results is
displayed.
6. To finish the task, click Finish in the ribbon. If the overall status should be changed, checkmark the
Override status box, and then select a new status in the combo box.
7. Click the Finish button to finish the task, or click the Finish & Report button to finish the task and
create a task report. For more information about reports, see Sections 2.5 and 10.4.
Starting from PTS R11, the myQA Sphinx Plugin and IBA PTS can communicate in order to synchronize
the beam and Lynx acquisition.
Cursor: displays the coordinates and intensity at the intersection of the two cursor lines.
Click a category tab and then select a test; its test results can be viewed in the Results panel. For the
test categories that show a graph in the Results panel (i.e. Energy, Coincidence, and Homogeneity),
the intensity at the cursor position on the curve is displayed in the lower left of the graph.
In the homogeneity graph, the flattened region which is considered for flatness calculation is highlighted
in a different color.
Click the Couch Translation tab, select the test, and then enter the actual values from the
X-ray image analysis.
By selecting one or more tests and then clicking Report (3) in the ribbon, a test report is created. It
can be exported as *.html, *.rtf, or *.pdf, or printed, by clicking the corresponding buttons in the
upper-right area (4).
If the user defines an OOI which does not include the signal over the RW3 wedge (2, see the screenshot
below) the algorithm cannot properly run the analysis and an error message is displayed (3).
Once the correct OOI is defined, the first derivative of the raw signal is calculated in order to identify the
physical edge of the corresponding RW3 block. This method was first deployed in
myQA 2016-002.
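The edge-detection step can be sketched as follows. This is a simplified, hypothetical illustration of locating an edge at the extremum of the first derivative; the pixel spacing and the synthetic signal are assumptions.

```python
import numpy as np

def find_edge(signal, spacing=1.0):
    """Locate the physical edge of a block as the position of the steepest
    change in the raw signal (extremum of the first derivative).

    signal: 1-D raw signal sampled along the OOI
    spacing: pixel spacing in mm (assumed uniform)
    Returns the edge position in mm from the first sample.
    """
    derivative = np.gradient(signal, spacing)
    edge_index = int(np.argmax(np.abs(derivative)))
    return edge_index * spacing

# Synthetic step signal: flat high level, then a sharp drop near index 50
signal = np.concatenate([np.full(50, 100.0), np.full(50, 10.0)])
print(find_edge(signal))
```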
The final depth-dose curve is calculated by assigning a value of depth to each pixel of the image. To
account for different phantom materials in the Sphinx Compact, all depth values obtained in a Sphinx
Compact test are converted to water-equivalent depth via the default depth conversion (*.dc) file set in
the myQA Platform - Equipment Setup workspace.
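The depth conversion can be sketched as follows. This is a minimal illustration; the actual *.dc file format is not documented here, so a simple table of (geometric depth, water-equivalent depth) pairs is assumed.

```python
import numpy as np

def to_water_equivalent(depths_mm, conversion_table):
    """Convert geometric depths to water-equivalent depths by linear
    interpolation in a (geometric_mm, water_mm) lookup table.

    conversion_table: monotonically increasing pairs, an assumed stand-in
    for the data carried by a *.dc depth conversion file.
    """
    geo, water = zip(*conversion_table)
    return np.interp(depths_mm, geo, water)

# Hypothetical table: phantom material slightly denser than water (~1.045)
table = [(0.0, 0.0), (100.0, 104.5), (200.0, 209.0)]
converted = to_water_equivalent([50.0, 150.0], table)
print(converted)
```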
S = Σᵢ rᵢ²

where

rᵢ = Yᵢ − k · exp(−0.5 · ((Xᵢ − µ) / σ)²)
where k, µ and σ are free parameters and X, Y the raw data from the image.
The searched parameters are defined as:
Spot Position = µ
Spot Sigma = σ
Spot Skewness is defined as the third moment of the Gaussian distribution. To reduce the influence of
noise on the evaluation of the skewness, a threshold of 5% (relative to the spot maximum) is set and
only points above this threshold are considered.
Spot Intensity is defined as the percentage ratio between the maximum of the spot and the
maximum over the entire image.
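The fit described above can be illustrated with a short sketch. Note that this stand-in recovers the Gaussian parameters by a linear least-squares fit of a parabola to ln(Y), which is exact only for clean Gaussian data; the product minimizes the residuals rᵢ with a nonlinear fit.

```python
import numpy as np

def fit_gaussian(x, y):
    """Estimate (k, mu, sigma) for y ≈ k·exp(-0.5·((x-mu)/sigma)²).

    Stand-in for the nonlinear minimization of S = Σ r_i²: fit a parabola
    to ln(y) by linear least squares and read the parameters off the
    polynomial coefficients.
    """
    mask = y > 0.05 * y.max()          # 5% threshold, as for the skewness
    c2, c1, c0 = np.polyfit(x[mask], np.log(y[mask]), 2)
    sigma = np.sqrt(-0.5 / c2)
    mu = -c1 / (2.0 * c2)
    k = np.exp(c0 - c1**2 / (4.0 * c2))
    return k, mu, sigma

# Synthetic spot profile with k = 2, mu = 1.5, sigma = 2
x = np.linspace(-10, 10, 201)
y = 2.0 * np.exp(-0.5 * ((x - 1.5) / 2.0) ** 2)
k, mu, sigma = fit_gaussian(x, y)      # Spot Position = mu, Spot Sigma = sigma
```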
[Figures: Coincidence test – raw image; Coincidence test – binary image with spot and fiducial contours]
The software automatically searches for the maximum in the OOI defined by the user.
The software automatically searches for the 20% isodose, the 50% isodose, and the 80% isodose.
The software automatically calculates the field size (defined as the size at the 50% isodose) in both the X
and Y dimensions.
The software automatically calculates the penumbra (defined as the distance between the 80%
and 20% isodose levels).
The software automatically identifies the major axis (M_axis) and the minor axis (m_axis) of the
irradiated field.
The Uniform Field Region is defined along the major axis as Field Size − 2·Penumbra (of each side).
This method has been deployed for the first time in myQA 2016-002.
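The analysis steps above can be sketched for a one-dimensional profile. This is a simplified illustration; the product works on the two-dimensional image and also determines the major and minor axes.

```python
import numpy as np

def crossing(x, y, level):
    """Left and right positions where a normalised profile crosses `level`,
    found by linear interpolation between neighbouring samples."""
    above = y >= level
    left = np.where(~above[:-1] & above[1:])[0][0]
    right = np.where(above[:-1] & ~above[1:])[0][-1]
    lerp = lambda i: x[i] + (level - y[i]) * (x[i+1] - x[i]) / (y[i+1] - y[i])
    return lerp(left), lerp(right)

def analyze_profile(x, y):
    """Field size (width at the 50% isodose), penumbrae (distance between
    the 80% and 20% isodose levels), and Uniform Field Region."""
    y = y / y.max()
    l20, r20 = crossing(x, y, 0.2)
    l80, r80 = crossing(x, y, 0.8)
    l50, r50 = crossing(x, y, 0.5)
    field_size = r50 - l50
    penumbra = (l80 - l20, r20 - r80)            # (left, right)
    uniform_region = field_size - (penumbra[0] + penumbra[1])
    return field_size, penumbra, uniform_region

# Trapezoidal test profile: flat top, 10 mm linear falloff on each side
x = np.linspace(-50.0, 50.0, 1001)
y = np.clip((30.0 - np.abs(x)) / 10.0, 0.0, 1.0)
fs, pen, ufr = analyze_profile(x, y)
```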
File based
With this mode, the DICOM data are imported off-line, e.g. from a local or network storage.
Click the File based tab.
Type in the location of the folder in the Folder text box, or click the Folder browser button, then
select the desired folder and click OK.
Notes:
By checking the Look in subdirectories box, the studies in the subdirectories are also searched.
By clicking Add additional files …, a standard Open browser will open. More image files can be
added to the list in the box on the right side of the window. However, if Update is then clicked, the list
will be reloaded from the folder currently selected in the Folder text box, and the items added with Add
additional files … are removed.
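The folder scan with the Look in subdirectories option can be sketched as follows. This is a minimal illustration; it identifies DICOM Part 10 files by the standard "DICM" marker at byte offset 128 and does not reflect the software's actual import logic.

```python
import os
import tempfile

def find_dicom_files(folder, look_in_subdirectories=False):
    """Collect DICOM Part 10 files (those with the 'DICM' magic bytes at
    offset 128) from a folder, optionally descending into subdirectories."""
    matches = []
    for root, dirs, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            with open(path, "rb") as f:
                f.seek(128)
                if f.read(4) == b"DICM":
                    matches.append(path)
        if not look_in_subdirectories:
            break                       # only scan the top-level folder
    return matches

# Demo: one fake Part 10 file and one plain text file in a temp folder
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "a.dcm"), "wb") as f:
        f.write(b"\x00" * 128 + b"DICM")
    with open(os.path.join(d, "notes.txt"), "w") as f:
        f.write("not dicom")
    found = [os.path.basename(p) for p in find_dicom_files(d)]
```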
Query/Retrieve
With this mode, the DICOM data are retrieved from a DICOM Server.
Click the Query/Retrieve tab.
Enter the parameters to find the correct patient. You can search for a single attribute or for a
combination of several attributes by:
Patient Name
Listening
In the Listening mode, the DICOM browser listens to the network to receive data. A status bar at the
bottom of the window shows whether the application is ready to receive data.
To use the Listening method, the Select a study to import window must be open and the
Listening tab selected.
Click the Listening button. Whenever data is received from a DICOM Server, the available items
are updated accordingly.
DICOMDIR
With the DICOMDIR mode, the DICOMDIR files can be imported.
Click the DICOMDIR tab.
Select the root DICOM directory using the Path button to browse for the directory.
Click OK. The available studies are displayed.
When performing measurements with dosimetry Plugins, the user has the possibility to apply the
temperature and pressure correction (kTp) within the Machines workspace. The workflow is the
same whether the measurements are performed in Test Setup for baseline definition or in Test
Run.
In the following, the workflow is described using an example in Test Run.
To start the measurement, click the Measurement button (2). The results are automatically
temperature / pressure corrected.
Note: The temperature and pressure values entered for a dosimetry test are automatically applied to the
other dosimetry tests of the same session.
If the single measurements were performed without kTp correction, checkmark the KTp box and
enter the temperature and pressure values (5).
Click Apply Result.
Note: Do not checkmark the KTp box if the entered values have already been corrected during their
measurement! Otherwise, the correction will be applied twice.