
Gauteng Department of Education

e-Admissions Performance Test Report

Documented by:
Stephan Erasmus
[email protected]

June 2024

Services: App Performance Management, Test Management, Performance Testing, Test Automation, Network Testing, Robotics Testing, Predictive & Root Cause Analysis, Robotics Process Automation, APM & QA Consulting, Solution Support, Security Testing, Solution Training
Document sign-off

Organisation   Name              Date           Signature
UJ             Babu Paul         18 July 2024
EGOV           Tebogo Phashake   17 July 2024



Contents
1 Performance Testing Summary....................................................................................................... 3
1.1 Project Background ................................................................................................................. 3
1.2 Purpose ................................................................................................................................... 3
1.3 Performance Testing Approach .............................................................................................. 3
1.4 Performance Testing Requirements ....................................................................................... 3
2 Performance Test Scenario Design and Data .................................................................................. 4
2.1 Ramp-up, Pacing and Duration ............................................................................................... 4
2.2 Think Times ............................................................................................................................. 4
2.3 Azure Plan reference ............................................................................................................... 5
2.4 Data Issues .............................................................................................................................. 5
3 Performance Test Result ................................................................................................................. 6
3.1 Results for 10 000VU Test in Performance Environment ....................................................... 6
3.1.1 Virtual User Concurrency (GDE2024_RegApp_Full_10000VU.5) ................................... 7
3.1.2 Transaction Summary (GDE2024_RegApp_Full_10000VU.5) ......................................... 7
3.1.3 Transaction Response Times (GDE2024_RegApp_Full_10000VU.5) .............................. 8
3.1.4 Request Rates (GDE2024_RegApp_Full_10000VU.5) ..................................................... 8
3.2 Results for 20 000VU Test in Performance Environment ....................................................... 9
3.2.1 Virtual User Concurrency (GDE2024_RegApp_Full_20000VU.2) ................................... 9
3.2.2 Transaction Summary (GDE2024_RegApp_Full_20000VU.2) ......................................... 9
3.2.3 Response Times (GDE2024_RegApp_Full_20000VU.2) ................................................ 10
3.2.4 Request Rates (GDE2024_RegApp_Full_20000VU.2) ................................................... 11
3.3 Results for 40000VU Test in Performance Environment ...................................................... 11
3.3.1 Virtual User Concurrency (GDE_Admissions_40k.17)................................................... 12
3.3.2 Transaction Summary (GDE_Admissions_40k.17) ........................................................ 12
3.3.3 Response Times (GDE_Admissions_40k.17) ................................................................. 13
3.3.4 Request Rates (GDE_Admissions_40k.17) .................................................................... 14
3.3.5 System Resources (GDE_Admissions_40k.17) .............................................................. 14
3.4 Results for 40 000VU Soak Test in Production Slot Environment ......................................... 15
3.4.1 Virtual User Concurrency (GDE2024_RegApp_40000VU_Enduro_Prod_NoDHA.10) .. 15
3.4.2 Transaction Summary (GDE2024_RegApp_40000VU_Enduro_Prod_NoDHA.10) ....... 16
3.4.3 Response Times (GDE2024_RegApp_40000VU_Enduro_Prod_NoDHA.10) ................ 16
3.4.4 Request Rates (GDE2024_RegApp_40000VU_Enduro_Prod_NoDHA.10) ................... 17
3.4.5 System Resources (GDE2024_RegApp_40000VU_Enduro_Prod_NoDHA.10) ............. 18
3.5 Results for 230VU Admin Portal Test in Performance Environment .................................... 18
3.5.1 Virtual User Concurrency .............................................................................................. 19
3.5.2 Transaction Summary ................................................................................................... 19

Page 1

3.5.3 Response Times ............................................................................................................ 20


3.5.4 Request Rates ............................................................................................................... 20
4 Conclusion ..................................................................................................................................... 21
4.1 Summary ............................................................................................................................... 21
4.1.1 ID Verification Service ................................................................................................... 21
5 Recommendations ........................................................................................................................ 22
5.1 Start Planning Earlier ............................................................................................................ 22
5.2 Roles and responsibilities...................................................................................................... 22
5.3 VPN and server access .......................................................................................................... 22
5.4 DHA Testing ........................................................................................................................... 22
5.5 Increase the testing scope .................................................................................................... 22
5.6 Introduce Network Testing ................................................................................................... 23
5.7 Monitoring ............................................................................................................................ 23

Page 2

1 Performance Testing Summary

1.1 Project Background

The Gauteng Department of Education (GDE) e-Admissions application has undergone only minor changes leading up to this performance testing project; however, it is still prudent to ensure that these changes did not impact the responsiveness or stability of the application.

An additional objective for the 2024 release of this application was to improve the performance of
the backend process for bulk application transfers.

To this end, the Department engaged the services of the University of Johannesburg (UJ) to manage the performance testing elements of the delivery. IT Ecology, the provider selected by UJ for performance testing and monitoring, was engaged to perform this function. This document outlines the various performance tests executed.

1.2 Purpose

The purpose of the performance testing executed against the e-Admissions application was primarily to ensure the stability of the system under the high load it would face on launch day and throughout the remainder of the admissions cycle. Additionally, the purpose of the backend performance testing was to ascertain whether the bulk application transfer process could handle the anticipated load while performing as expected.

1.3 Performance Testing Approach

The frontend performance tests were executed against the public-facing e-Admissions portal. The tests were structured to reflect the normal usage pattern of a parent registering a learner and applying to a school. The simulated load was based on the load observed in previous years, plus a 10% margin, as per the requirements below. In this round of testing, both Grade 8 and Grade 1 user journeys were included in the performance tests.

Since no documented information was available as to what should be expected of the bulk application transfer process in terms of transaction rate or responsiveness, tests were based on estimates discussed with the GDE in order to set realistic goals. The backend performance test was not executed at the same time as the frontend tests because, in production, the backend process does not run at volume on launch day when frontend load is at its peak.

1.4 Performance Testing Requirements

The following requirements were identified prior to testing:

A 40 000-user test performing more than 100 000 transactions in a 1-hour period whilst maintaining a success rate of more than 95% would be considered a pass. The test must not experience any critical application failure from which the system cannot recover within a reasonable timeframe, so that the 95%+ success rate can be maintained. Additionally, it is a requirement that the application remain stable over an extended period of time; this is tested with a 4-hour soak test.

Lastly, the estimated requirement for the bulk application transfer backend test is that 200 users should be able to execute 4 bulk transfers of 25 learners each in an hour, yielding 800 bulk transfers that move a total of 20 000 learners in an hour.

Page 3

2 Performance Test Scenario Design and Data


In order to meet the abovementioned requirements for the frontend, the following settings were configured in our tests. These tests were executed iteratively, starting with smaller tests at lower virtual user counts and ultimately leading up to the 40 000-user test using these settings for the e-Admissions frontend (public portal). Similarly, smaller tests were executed for the e-Admissions backend (admin portal), culminating in a 230-user test to meet the requirements, specifically for bulk transfers.

2.1 Ramp-up, Pacing and Duration

This section shows the ramp-up time, the pacing time and the duration of each type of test that was executed.

Ramp-up time is defined as the time taken for all virtual users in the test to start transacting, after which every user is either actively transacting or waiting out its pacing time.

Pacing time is defined as the amount of time a virtual user waits after finishing an admissions application before starting the next one. For the larger tests, the ramp-up time was aligned with the pacing time so that there are minimal spikes in user load. Pacing time is introduced so that the goal number of transactions is achieved within the duration of the test, without unrealistically loading the system by injecting requests at an elevated rate.
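As a rough illustration of how a pacing value can be derived from the transaction goal, the sketch below spreads the target number of full-flow iterations evenly over the test duration and subtracts the active time per iteration. This is an assumption-laden sketch only: the load tool may measure pacing from the start of an iteration rather than its end, and the configured values in the tables below were also constrained by ramp-up alignment and tuned from shakeout measurements, so they will not be reproduced exactly by this formula.

```python
def pacing_seconds(virtual_users: int,
                   target_iterations: int,
                   test_duration_s: float,
                   avg_iteration_s: float) -> float:
    """Rough pacing estimate: divide the test duration by the number of
    iterations each virtual user must complete, then subtract the time a
    user spends actively transacting (responses plus think times)."""
    iterations_per_user = target_iterations / virtual_users
    cycle_time_s = test_duration_s / iterations_per_user
    return max(0.0, cycle_time_s - avg_iteration_s)

# Illustrative figures only: 40 000 users targeting 120 000 full flows in a
# 60-minute window, assuming roughly 120 s of response and think time per flow.
print(pacing_seconds(40_000, 120_000, 3600, 120))  # ~1080 s
```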

Settings on Frontend tests:


Virtual User Count    Ramp-up time (hh:mm:ss)    Pacing time (hh:mm:ss)    Expected test duration (minutes)
50                    00:13:20                   00:13:20                  30
1000                  00:13:20                   00:13:20                  30
10000                 00:26:40                   00:26:40                  60
20000                 00:26:40                   00:26:40                  60
40000                 00:26:40                   00:26:40                  60
40000                 00:26:40                   00:26:40                  240

Settings on Backend tests:


Virtual User Count    Ramp-up time (hh:mm:ss)    Pacing time (hh:mm:ss)    Test duration (minutes)
100                   00:13:20                   00:13:20                  60
230                   00:13:20                   00:13:20                  60

2.2 Think Times

Think times between transactions for all tests were set to an average of 3 seconds per simulated user interaction. This simulates user interaction time, such as typing or selecting from dropdowns.
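A minimal sketch of how a 3-second average think time might be randomised per simulated interaction is shown below (a uniform distribution is assumed here purely for illustration; the load tool's own think-time model may differ).

```python
import random

def think_time(mean_s: float = 3.0, spread: float = 0.5) -> float:
    """Sample a think time uniformly between mean_s*(1-spread) and
    mean_s*(1+spread); the long-run average stays at roughly mean_s."""
    return random.uniform(mean_s * (1 - spread), mean_s * (1 + spread))
```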

Page 4

2.3 Azure Plan reference

Throughout the results shown in section 3, reference is made to the tier of Azure App Service used. These tiers are referred to by the abbreviations in the table below, which also shows the resources available for each of the tiers used.

Instance    Cores    RAM        Storage
S1          1        1.75 GB    50 GB
S2          2        3.50 GB    50 GB
S3          4        7 GB       50 GB
P1v2        1        3.50 GB    250 GB
P2v2        2        7 GB       250 GB
P3v2        4        14 GB      250 GB
P1v3        2        8 GB       250 GB
P2v3        4        16 GB      250 GB
P3v3        8        32 GB      250 GB

2.4 Data Issues

In previous years, the data used for performance tests was based on an extract of the data generated in production the year before. Initially this approach was to be followed again, and an attempt was made to clean the provided data file by deleting records that were found to be causing DHA validation errors. This was quickly abandoned, as it would take far too long considering the datafile holds about 250 000 records. Because the larger (40 000-user) tests require a much larger data set, and at the request of the GDE to include both Grade 8 and Grade 1 functionality in the final tests, a new script was developed to represent the Grade 1 flow. The data file was then separated into data for each of the scripts (Gr1 and Gr8) respectively.

To remedy the DHA validation issues, a script was created that queried DHA using the direct DHA service URL. This script submitted the IDs in the provided datafile and saved the response to each ID request. This allowed IDs that are not found on DHA to be identified and removed from the datafile. Next, the names/surnames and gender of both learners and parents found in the DHA service responses were saved. Lastly, the date of birth was derived from the SA ID number itself (the first six digits encode the date of birth), to ensure that the "date of birth does not match the ID" errors no longer occur. This data was compiled into spreadsheets used by the performance test scripts.
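A simplified sketch of this data-preparation step is shown below. The endpoint, response fields and column names are placeholders assumed for illustration; the actual DHA service URL, response format and spreadsheet layout used by the test scripts are not reproduced here.

```python
import csv
import requests

DHA_LOOKUP_URL = "https://dha-gateway.example/verify"  # placeholder, not the real service URL

def build_clean_datafile(in_path: str, out_path: str) -> None:
    """Query the ID lookup service for every ID in the source file, drop IDs
    that are not found, and keep the returned names, gender and derived DOB."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=["id_number", "surname", "names", "gender", "dob"])
        writer.writeheader()
        for row in reader:
            id_number = row["id_number"]  # assumed column name in the source extract
            resp = requests.get(DHA_LOOKUP_URL, params={"id": id_number}, timeout=30)
            if resp.status_code != 200:
                continue  # drop IDs that cannot be verified against the service
            person = resp.json()
            writer.writerow({
                "id_number": id_number,
                "surname": person.get("surname"),
                "names": person.get("names"),
                "gender": person.get("gender"),
                # the first six digits of a South African ID encode YYMMDD of birth
                "dob": id_number[:6],
            })
```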

This indeed eliminated most of the “DHA challenge” type of errors found last year and in this year’s
tests before the data was “cleaned” as described above.

Page 5

3 Performance Test Result


Shakeout tests (10 user) are typically executed to ensure that the testing scripts and environments
are set up correctly. The shakeout test result also provides an initial average response time
measurement which is then used during the design of bigger tests to calculate the pacing time
required to achieve transaction objectives.

Originally, the previous year’s Grade 8 script was amended to accommodate changes in the 2024
release. This script was used for the initial 50VU test which proved that the script worked as
expected and the overall environment setup was in place.

Next, a 1000-user test was executed to verify that the data tables used in our scripts were consumed correctly during the test. This also showed that, as in previous years, errors due to data validation against DHA were problematic when using the previous year's data extract. Additionally, other errors due to issues in the datafile content were found.

Tests were therefore executed with DHA disabled so as to only test the e-Admissions application,
whilst the data file was recreated using the process described in the previous section.

For the frontend tests, further load tests were then executed at 10000 virtual users, 20000 virtual
users and 40000 virtual users with the 40000 virtual user scenario also being tested in a soak test.
The backend tests were executed for 230 virtual users as per requirements. The last successful test
of each of the abovementioned tests is described in the subsections below.

3.1 Results for 10 000VU Test in Performance Environment

Test ID: GDE2024_RegApp_Full_10000VU.5


Date: 10:40 (28/05/2024)

The 10 000-user test was executed with up to 10 app services of type P3V3, Redis set to P2 and the General-Purpose database plan with 40 vCores. Multiple 10 000-user tests were executed.

The initial 10 000-user test was attempted with DHA integration in place and the originally extracted datafile, but this showed a significant number of DHA validation errors throughout the larger dataset. Tests were therefore executed without DHA validation to prove that the application itself would hold up, which it did. Since previous years' testing only included a script representing Grade 8 functionality, these initial 10 000-user tests were executed using the newly developed Grade 1 script and associated data.

The 10 000-user test was successful and ran with very few environmental issues. Transaction rates and response times were within acceptable thresholds.

Page 6

3.1.1 Virtual User Concurrency (GDE2024_RegApp_Full_10000VU.5)

3.1.2 Transaction Summary (GDE2024_RegApp_Full_10000VU.5)


Transaction name Passed Failures Failure %
Full Flow 29891 109 0.36%
GDEReg_HomePage 30000 0
GDERegGr1_01_Click_Register 30000 0
GDERegGr1_06_Input_ParentID 30000 0
GDERegGr1_07_Parent_DHA_Validation 30000 0
GDERegGr1_08_Input_CellphoneNumber 30000 0
GDERegGr1_14_Select_TermsConditions_OK 30000 0
GDERegGr1_18_Select_Submit 29999 1
GDERegGr1_22_Input_LearnerID 29999 0
GDERegGr1_23_Learner_DHA_Validation 29999 0
GDERegGr1_29_Select_ApplyToSchools 29999 0
GDERegGr1_31_Select_SubmitApplications 29891 108
GDERegGr1_32_Select_Dashboard 29891 0
GDERegGr1_Logout 29891 0

Page 7

3.1.3 Transaction Response Times (GDE2024_RegApp_Full_10000VU.5)

Transaction                                    Average (s)   Minimum (s)   Maximum (s)   Transactions per second
GDEReg_HomePage 0.382 0.192 7.158 5.74
GDERegGr1_01_Click_Register 0.014 0 3.943 5.746
GDERegGr1_06_Input_ParentID 0.042 0.015 4.083 5.751
GDERegGr1_07_Parent_DHA_Validation 0.033 0.001 3.939 5.751
GDERegGr1_08_Input_CellphoneNumber 0.038 0.012 4.939 5.751
GDERegGr1_14_Select_TermsConditions_OK 0.02 0 6.798 5.751
GDERegGr1_18_Select_Submit 0.644 0.418 16.836 5.75
GDERegGr1_22_Input_LearnerID 0.043 0.011 4.061 5.75
GDERegGr1_23_Learner_DHA_Validation 0.034 0.008 4.01 5.75
GDERegGr1_29_Select_ApplyToSchools 0.473 0.301 8.264 5.75
GDERegGr1_31_Select_SubmitApplications 0.294 0.179 5.152 5.73
GDERegGr1_32_Select_Dashboard 0.138 0.083 4.34 5.73
GDERegGr1_Logout 0.225 0.123 5.228 5.73

3.1.4 Request Rates (GDE2024_RegApp_Full_10000VU.5)

Page 8

3.2 Results for 20 000VU Test in Performance Environment

Test ID: GDE2024_RegApp_Full_20000VU.2


Date: 13:20 (28/05/2024)

Following the success of the 10 000-user test, the 20 000-user test was attempted, again using the Grade 1 script and associated data. The environment for the 20 000-user test was configured from the outset as per the learnings from last year: Redis was set to P2 and the application services to P3V3. Microsoft has introduced changes to some of the Azure plans, which now offer better value on the General-Purpose plan; the initial 20 000-user test was therefore attempted with the database on General Purpose with 40 vCores.
This test did not perform as expected, so another test was executed with the database remaining on General Purpose but scaled to 80 vCores, as per the findings from the previous year, and this test passed as expected. This second 20 000-user test ran without any glitches and passed all required metrics.

3.2.1 Virtual User Concurrency (GDE2024_RegApp_Full_20000VU.2)

3.2.2 Transaction Summary (GDE2024_RegApp_Full_20000VU.2)


Transaction name Passed Failures Failure %
Full_Flow 59794 206 0.34%
GDEReg_HomePage 60000 0
GDERegGr1_01_Click_Register 60000 0
GDERegGr1_06_Input_ParentID 60000 0
GDERegGr1_07_Parent_DHA_Validation 60000 0
GDERegGr1_08_Input_CellphoneNumber 60000 0
GDERegGr1_14_Select_TermsConditions_OK 60000 0
GDERegGr1_18_Select_Submit 59998 2
GDERegGr1_22_Input_LearnerID 59998 0
GDERegGr1_23_Learner_DHA_Validation 59998 0
GDERegGr1_29_Select_ApplyToSchools 59998 0
GDERegGr1_31_Select_SubmitApplications 59794 204
GDERegGr1_32_Select_Dashboard 59794 0
GDERegGr1_Logout 59794 0

Page 9

3.2.3 Response Times (GDE2024_RegApp_Full_20000VU.2)

Transaction name                               Average (s)   Minimum (s)   Maximum (s)   Transactions per second
GDEReg_HomePage 0.397 0.115 10.113 11.47
GDERegGr1_01_Click_Register 0.014 0 1.811 11.484
GDERegGr1_06_Input_ParentID 0.043 0.011 2.902 11.515
GDERegGr1_07_Parent_DHA_Validation 0.032 0 2.72 11.517
GDERegGr1_08_Input_CellphoneNumber 0.037 0.012 2.729 11.517
GDERegGr1_14_Select_TermsConditions_OK 0.019 0 1.483 11.517
GDERegGr1_18_Select_Submit 0.676 0.407 5.701 11.516
GDERegGr1_22_Input_LearnerID 0.045 0.014 2.697 11.516
GDERegGr1_23_Learner_DHA_Validation 0.033 0.008 2.563 11.516
GDERegGr1_29_Select_ApplyToSchools 0.511 0.281 5.754 11.516
GDERegGr1_31_Select_SubmitApplications 0.347 0.173 3.999 11.477
GDERegGr1_32_Select_Dashboard 0.154 0.079 2.517 11.477
GDERegGr1_Logout 0.205 0.114 3.279 11.477

Page 10

3.2.4 Request Rates (GDE2024_RegApp_Full_20000VU.2)

3.3 Results for 40 000VU Test in Performance Environment

Test ID: GDE_Admissions_40k.17


Date: 21:00 (30/05/2024)

Additional injectors were added to the test infrastructure to accommodate the 40 000-user load. Unfortunately, many tests failed due to CPU issues on the injectors. These were rectified once the cause was identified: Windows Defender exclusions for the Eggplant folders and the injector process were added to each injector, which brought CPU usage down to acceptable levels.
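For reference, the sketch below shows one way such exclusions could be applied on a Windows injector. The folder paths and process name are assumptions for illustration only; the actual Eggplant Performance install locations and injector process on these machines may differ, and the commands must be run from an elevated session.

```python
# Illustrative sketch: add Microsoft Defender exclusions for a load injector.
# Paths and the process name below are assumed, not the actual values used.
import subprocess

EXCLUSION_PATHS = [r"C:\Eggplant", r"C:\EggplantData"]  # assumed install/workspace folders
EXCLUSION_PROCESSES = ["injector.exe"]                  # assumed injector process name

def add_defender_exclusions() -> None:
    for path in EXCLUSION_PATHS:
        subprocess.run(
            ["powershell", "-NoProfile", "-Command",
             f"Add-MpPreference -ExclusionPath '{path}'"],
            check=True)
    for proc in EXCLUSION_PROCESSES:
        subprocess.run(
            ["powershell", "-NoProfile", "-Command",
             f"Add-MpPreference -ExclusionProcess '{proc}'"],
            check=True)

if __name__ == "__main__":
    add_defender_exclusions()
```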

The 40 000-user tests were executed with both the Grade 1 and the Grade 8 script, in order to test all functionality, but also because the data available for only Grade 8 or only Grade 1 would have been insufficient for this test. These tests were executed with the environment configurations as per the learnings from the previous year, with the exception of the database, which was initially set to Business Critical with only 40 vCores. When this did not provide the desired result, it was decided to go back to the proven configuration of Business Critical with 80 vCores.

A successful 40 000-user test was run on the performance environment using the parameters below:
• Redis set to P2 tier
• API App Services: 18 instances (P3V3)
• Public Portal App Service: 10 instances (S3)
• Database upgraded to Business Critical and set at 80 vCores
• Communication to the DHA enabled

The successful 40 000-user test was executed in the performance test environment with DHA integration enabled, as the new dataset was by this time available for use in the Grade 1 and Grade 8 scripts respectively. The test was successful in terms of the number of applications completed and the response times achieved. The Boxfusion team also verified that the available resources remained optimal throughout the duration of the test. An identical test including DHA was also executed in the production (slot) environment two days before go-live, which yielded identical results.

Page 11

3.3.1 Virtual User Concurrency (GDE_Admissions_40k.17)

3.3.2 Transaction Summary (GDE_Admissions_40k.17)


Transaction name Passed Failures Failure %
Full_Flow (Gr1 and Gr8) 119190 810 0.67%
GDEReg_HomePage 120000 0
GDERegGr1_01_Click_Register 60000 0
GDERegGr1_06_Input_ParentID 60000 0
GDERegGr1_07_Parent_DHA_Validation 59993 7
GDERegGr1_08_Input_CellphoneNumber 59993 0
GDERegGr1_14_Select_TermsConditions_OK 59993 0
GDERegGr1_18_Select_Submit 59993 0
GDERegGr1_22_Input_LearnerID 59993 0
GDERegGr1_23_Learner_DHA_Validation 59993 0
GDERegGr1_29_Select_ApplyToSchools 59978 15
GDERegGr1_31_Select_SubmitApplications 59644 334
GDERegGr1_32_Select_Dashboard 59644 0
GDERegGr1_Logout 59644 0
GDERegGr8_06_Input_ParentID 60000 0
GDERegGr8_07_Parent_DHA_Validation 59991 9
GDERegGr8_08_Input_CellphoneNumber 59991 0
GDERegGr8_18_Select_Submit 59989 2
GDERegGr8_22_Input_LearnerID 59989 0
GDERegGr8_23_Learner_DHA_Validation 59977 12
GDERegGr8_31_Input_CurrentSchool 59977 0
GDERegGr8_32_Select_ApplyToSchools 59977 0
GDERegGr8_34_Submit_Applications 59555 422
GDERegGr8_35_Select_UploadDocuments 59553 2
GDERegGr8_36_Upload_ParentID 59551 2
GDERegGr8_37_Upload_ProofOfAddress 59551 0
GDERegGr8_38_Upload_LearnerBirthCertificate 59549 2
GDERegGr8_39_Upload_LearnerLatestReport 59547 2
GDERegGr8_40_Select_BackToDashboard 59546 1
GDERegGr8_Logout 59546 0

Page 12

3.3.3 Response Times (GDE_Admissions_40k.17)

Transaction name                               Average (s)   Minimum (s)   Maximum (s)   Transactions per second
GDEReg_HomePage 1.115 0.13 47.584 33.598
GDERegGr1_01_Click_Register 0.012 0 7.753 16.796
GDERegGr1_06_Input_ParentID 0.081 0.04 12.053 16.836
GDERegGr1_07_Parent_DHA_Validation 0.031 0.004 7.768 16.85
GDERegGr1_08_Input_CellphoneNumber 0.04 0.006 7.907 16.852
GDERegGr1_14_Select_TermsConditions_OK 0.016 0 7.75 16.852
GDERegGr1_18_Select_Submit 0.579 0.362 9.698 16.852
GDERegGr1_22_Input_LearnerID 0.077 0.035 12.061 16.852
GDERegGr1_23_Learner_DHA_Validation 0.031 0.006 8.004 16.852
GDERegGr1_29_Select_ApplyToSchools 0.407 0.251 8.477 16.85
GDERegGr1_31_Select_SubmitApplications 0.256 0.157 7.971 16.791
GDERegGr1_32_Select_Dashboard 0.168 0.081 8.133 16.791
GDERegGr1_Logout 0.225 0.111 21.137 16.791
GDERegGr8_06_Input_ParentID 0.088 0.041 12.281 16.815
GDERegGr8_07_Parent_DHA_Validation 0.038 0.005 7.362 16.853
GDERegGr8_08_Input_CellphoneNumber 0.049 0.001 11.97 16.869
GDERegGr8_18_Select_Submit 0.858 0.452 34.227 16.869
GDERegGr8_22_Input_LearnerID 0.086 0.037 12.009 16.869
GDERegGr8_23_Learner_DHA_Validation 0.039 0.002 7.761 16.867
GDERegGr8_31_Input_CurrentSchool 0.11 0.044 23.054 16.867
GDERegGr8_32_Select_ApplyToSchools 0.543 0.276 26.914 16.867
GDERegGr8_34_Submit_Applications 0.328 0.168 25.94 16.793
GDERegGr8_35_Select_UploadDocuments 0.108 0.035 7.871 16.793
GDERegGr8_36_Upload_ParentID 0.19 0.092 13.763 16.792
GDERegGr8_37_Upload_ProofOfAddress 0.184 0.092 14.626 16.792
GDERegGr8_38_Upload_LearnerBirthCertificate 0.199 0.099 15.106 16.792
GDERegGr8_39_Upload_LearnerLatestReport 0.188 0.091 12.081 16.792
GDERegGr8_40_Select_BackToDashboard 0.19 0.086 7.971 16.792
GDERegGr8_Logout 0.828 0.286 25.954 16.792

Page 13

3.3.4 Request Rates (GDE_Admissions_40k.17)

3.3.5 System Resources (GDE_Admissions_40k.17)


All Azure system resources and measured metrics were well within acceptable norms and had ample spare capacity.

Resources on the DHA gateway servers were also observed; load balancing and responsiveness from DHA were as expected.

Page 14

3.4 Results for 40 000VU Soak Test in Production Slot Environment

Test ID: GDE2024_RegApp_40000VU_Enduro_Prod_NoDHA.10


Date: 10:19 (05/06/2024)

As in previous years, part of the requirements for performance testing was that a 4-hour soak test be run to simulate a longer period of activity against the system and prove its stability.

Due to the elections, DHA did not allow any high-volume tests to be run during the day. DHA maintenance slots, however, start at 22:30, leaving little time to run this test in the evenings. It was therefore decided to run this test in the production (slot) environment without DHA in play.

The environment for this test was originally configured as per the successful test last year; however, multiple runs showed that the 18 API app service instances did not cope. Some app services would at times take on significantly more load than others, causing them to crash. That in turn aggravated the situation, as fewer app services were available to take on the load until the crashed instances were returned to service.

Because of these findings, the app services were increased to 24 instances, yielding the following configuration for the final and successful test:
• Redis set to P2 tier
• API App Services: 24 instances (P3V3) and Public Portal App Service: 10 instances (P3V3)
• Database upgraded to Business Critical and set at 80 vCores
• DHA not enabled

The final test achieved excellent response times across the Grade 8 and Grade 1 flows; however, the test ran somewhat faster than expected and completed the target transaction count of 100 000 transactions in three hours instead of four. Since this was considered long enough for the purpose of the endurance test, and since no resources showed any signs of a gradually worsening bottleneck, it was decided that this test could be accepted as a pass for the endurance test.
Error rates also remained below 5%; however, all tests revealed HTTP 404 warnings for some JavaScript requests. These do not affect end users, as they are not displayed on screen. On investigation it was found that an ID in the URL for these requests had changed between the time the scripts were recorded and the time these tests were executed. Since this ID could not be correlated, it will simply need to be replaced with the new ID in future tests; as it does not affect actual users, the test is considered a pass. The test was rerun two days before go-live with similar results.

3.4.1 Virtual User Concurrency (GDE2024_RegApp_40000VU_Enduro_Prod_NoDHA.10)

Page 15

3.4.2 Transaction Summary (GDE2024_RegApp_40000VU_Enduro_Prod_NoDHA.10)


Transaction name Passed Failures Failure %
Full_Flow (Gr1 and Gr8) 237992 1024 0.853%
GDEReg_HomePage 120000 0
GDERegGr1_01_Click_Register 60000 0
GDERegGr1_06_Input_ParentID 60000 0
GDERegGr1_07_Parent_DHA_Validation 60000 0
GDERegGr1_08_Input_CellphoneNumber 60000 0
GDERegGr1_14_Select_TermsConditions_OK 60000 0
GDERegGr1_18_Select_Submit 60000 0
GDERegGr1_22_Input_LearnerID 60000 0
GDERegGr1_23_Learner_DHA_Validation 60000 0
GDERegGr1_29_Select_ApplyToSchools 59998 2
GDERegGr1_31_Select_SubmitApplications 59566 432
GDERegGr1_32_Select_Dashboard 59566 0
GDERegGr1_Logout 59566 0
GDERegGr8_06_Input_ParentID 60000 0
GDERegGr8_07_Parent_DHA_Validation 60000 0
GDERegGr8_08_Input_CellphoneNumber 60000 0
GDERegGr8_18_Select_Submit 60000 0
GDERegGr8_22_Input_LearnerID 60000 0
GDERegGr8_23_Learner_DHA_Validation 59999 1
GDERegGr8_31_Input_CurrentSchool 59999 0
GDERegGr8_32_Select_ApplyToSchools 59998 1
GDERegGr8_34_Submit_Applications 59410 588
GDERegGr8_35_Select_UploadDocuments 59410 0
GDERegGr8_36_Upload_ParentID 59410 0
GDERegGr8_37_Upload_ProofOfAddress 59410 0
GDERegGr8_38_Upload_LearnerBirthCertificate 59410 0
GDERegGr8_39_Upload_LearnerLatestReport 59410 0
GDERegGr8_40_Select_BackToDashboard 59410 0
GDERegGr8_Logout 59410 0

3.4.3 Response Times (GDE2024_RegApp_40000VU_Enduro_Prod_NoDHA.10)

Page 16

Transaction name                               Average (s)   Minimum (s)   Maximum (s)   Transactions per second
GDEReg_HomePage 1.33 0.122 65.508 22.193
GDERegGr1_01_Click_Register 0.014 0 7.905 11.097
GDERegGr1_06_Input_ParentID 0.048 0.008 22.036 11.101
GDERegGr1_07_Parent_DHA_Validation 0.034 0.002 23.155 11.104
GDERegGr1_08_Input_CellphoneNumber 0.04 0.005 19.858 11.108
GDERegGr1_14_Select_TermsConditions_OK 0 0 0.176 11.111
GDERegGr1_18_Select_Submit 0.582 0.371 27.772 11.115
GDERegGr1_22_Input_LearnerID 0.042 0.007 14.288 11.12
GDERegGr1_23_Learner_DHA_Validation 0.034 0.002 22.541 11.124
GDERegGr1_29_Select_ApplyToSchools 0.404 0.256 25.073 11.128
GDERegGr1_31_Select_SubmitApplications 0.249 0.159 26.379 11.089
GDERegGr1_32_Select_Dashboard 0.169 0.074 25.273 11.089
GDERegGr1_Logout 0.213 0.101 23.457 11.089
GDERegGr8_06_Input_ParentID 0.056 0.011 23.726 11.105
GDERegGr8_07_Parent_DHA_Validation 0.047 0.006 23.505 11.108
GDERegGr8_08_Input_CellphoneNumber 0.053 0.007 24.17 11.112
GDERegGr8_18_Select_Submit 0.857 0.436 28.875 11.117
GDERegGr8_22_Input_LearnerID 0.054 0.011 24.527 11.121
GDERegGr8_23_Learner_DHA_Validation 0.045 0 21.914 11.126
GDERegGr8_31_Input_CurrentSchool 0.112 0.043 22.852 11.13
GDERegGr8_32_Select_ApplyToSchools 0.514 0.259 25.718 11.133
GDERegGr8_34_Submit_Applications 0.311 0.17 25.643 11.08
GDERegGr8_35_Select_UploadDocuments 0.103 0.049 24.275 11.08
GDERegGr8_36_Upload_ParentID 0.192 0.092 24.006 11.08
GDERegGr8_37_Upload_ProofOfAddress 0.191 0.092 25.425 11.08
GDERegGr8_38_Upload_LearnerBirthCertificate 0.207 0.095 25.778 11.08
GDERegGr8_39_Upload_LearnerLatestReport 0.204 0.09 26.75 11.08
GDERegGr8_40_Select_BackToDashboard 0.196 0.079 22.656 11.08
GDERegGr8_Logout 0.693 0.28 23.467 11.08

3.4.4 Request Rates (GDE2024_RegApp_40000VU_Enduro_Prod_NoDHA.10)

Page 17

3.4.5 System Resources (GDE2024_RegApp_40000VU_Enduro_Prod_NoDHA.10)


All Azure system resources and measured metrics were well within acceptable norms and had ample spare capacity.

3.5 Results for 230VU Admin Portal Test in Performance Environment

Test ID: GDE_Bulk_Transfers_230VU_Test.3


Date: 07:23 (13/06/2024)

The admin portal is a separate web portal used by Admissions administrators to manage the applications received from the public-facing portal. It sources its data from the same database, via the same backend API service used by the e-Admissions application.

The backend service and database were therefore originally retained at low usage levels. However, it was very quickly found that the database is used heavily and that complex queries cause the database CPU to reach its capacity. In the final test shown here, considered successful based on the requirements, the database tier was upgraded to Business Critical and set at 80 vCores.

Response times are still not what would be expected of a transactional system; however, considering the operations performed, they are deemed acceptable as long as no timeouts are experienced (timeouts did occur in production in previous years). This objective was achieved with this environment configuration, with no failed transactions reported for this test.
Since heavy usage of the admin portal only takes place later in the year, it was recommended that further testing and/or tuning be performed before then. Better performance requirements are also needed to determine what is and is not acceptable to users. Real user experience monitoring is therefore recommended to identify how the application is used and during which usage patterns users find it difficult to work with the application.

Page 18

3.5.1 Virtual User Concurrency

3.5.2 Transaction Summary


Transaction name Passed Failures Failure %
Full_Flow (Bulk Transfer) 1150 0 0.0%
01_Navigate_to_landing_Page 1150 0
02_Login 1150 0
GDE_BBT_01_Click_Transfer_Management 1150 0
GDE_BBT_02_Change_Page_to_30 1150 0
GDE_BBT_03_ReOrder_Transfer_Application_Type 1150 0
GDE_BBT_05_Click_Bulk_Transfers 1150 0
GDE_BBT_06_Select_School 1150 0
GDE_BBT_07_Click_Transfer 1150 0
03_Logout 1150 0

Page 19

3.5.3 Response Times

Transaction name                               Average (s)   Minimum (s)   Maximum (s)   Transactions per second
01_Navigate_to_landing_Page 0.551 0.26 9.542 0.263
02_Login 3.664 2.34 13.068 0.264
GDE_BBT_01_Click_Transfer_Management 9.161 8.145 17.421 0.264
GDE_BBT_02_Change_Page_to_30 9.054 7.946 16.884 0.264
GDE_BBT_03_ReOrder_Transfer_Application_Type 9.327 8.151 12.76 0.264
GDE_BBT_05_Click_Bulk_Transfers 2.509 0.466 9.175 0.264
GDE_BBT_06_Select_School 0.085 0.039 8.709 0.264
GDE_BBT_07_Click_Transfer 25.726 18.318 36.205 0.264
03_Logout 0.039 0.019 0.985 0.264

3.5.4 Request Rates

Page 20

4 Conclusion

4.1 Summary

The performance testing completed on the 2024 release of the GDE Online Admissions system ended with mostly favourable results, but highlighted one main issue, as in the previous year.

4.1.1 ID Verification Service


Even with a much cleaner dataset created by utilising the DHA NPR service directly, this integration still created the most challenges during the performance testing:
1. Load balancing across the gateway servers did not work initially; restarting the app pools resolved the issue.
2. DHA restricted our access during the day, which meant any large tests had to be executed after 18:00. This was further restricted at times due to the elections that took place around the time of this project.
3. A TLS version change occurred, most likely at DHA. All performance tests prior to the production tests were concluded by 6 June, and this change occurred after that date. As a result, the gateway servers, which did not support the now-enforced TLS 1.3, failed to provide access to DHA. Extensive troubleshooting over many days identified the root cause as the operating system version, Windows Server 2012. Once the OS was upgraded on the three servers, the service was restored.
4. The above root cause analysis showed clearly that there was limited knowledge of the SLA for these servers, and it therefore took many meetings over several days to understand who was responsible for assisting in the matter.
5. Since GDE was under the impression that the servers, including the OS, were managed by Vodacom, no maintenance or patching had been done since 2019. In addition, staff had left GDE, and it turned out that IT Ecology were the only party with VPN and server admin access to these gateway servers.
6. These servers were previously monitored by Dynatrace to establish whether the load balancing worked well, whether the resources on the gateway servers were within acceptable levels, and whether the service response was degrading at any time and what the cause was. It was decided not to acquire Dynatrace this year, and a decision was taken to implement MS AppInsights for the required metrics instead. This was done very late in the project, after all but the production performance testing was completed. It is still unclear whether the monitoring available on the day of go-live will meet the requirements and perform as expected without potentially overloading the servers or slowing down the DHA service.
7. The connection from the application into the gateway servers uses HTTP rather than HTTPS, as was also identified by a recent penetration test, posing unnecessary risk.
8. The service code hosted on the gateway servers was written by SITA and is the property of SITA. When analysing the type of problem described above, this means that GDE, eGOV, DHA, SITA, Vodacom and Boxfusion may all be required in the same meeting, which caused significant delays to root cause analysis.

Page 21

5 Recommendations

5.1 Start Planning Earlier

Having performed the performance testing for e-Admissions for around 7 years now, IT Ecology have in previous rounds advised that this project should start sooner, to leave more time to effectively plan, test and fix any issues identified.

5.2 Roles and responsibilities

When a problem such as the TLS issue occurs, GDE needs to be in a position to address and resolve it much faster. Instead, this round it became apparent that there was a lack of awareness as to who is responsible for maintaining the operating system of the DHA gateway servers at Vodacom. An SLA could not be sourced immediately to clarify roles and responsibilities with regard to the code hosted on IIS (written by SITA), the operating system on the servers (GDE responsibility) and the VM itself (Vodacom responsibility). It took many meetings including GDE, eGOV, SITA, DHA, Boxfusion and Vodacom to understand who holds responsibility for these servers, which significantly impacted the resolution time.

5.3 VPN and server access

The abovementioned servers have not been maintained or patched for many years. GDE staff who had VPN access to the servers have in the meantime left GDE, and IT Ecology were the only team able to access these servers. This is clearly not ideal, since IT Ecology is only engaged on an ad hoc basis for the yearly performance testing. Had a different company been chosen for performance testing this year, nobody would have been able to access these servers and they would potentially have needed to be rebuilt from scratch.

5.4 DHA Testing

The constraint that cost significant time this year was having to test without DHA enabled during normal office hours; testing with DHA was only allowed from 18:00 onwards. This caused large gaps in the testing schedule whenever DHA was required, with work placed on hold until 18:00. Evening testing also does not allow a direct comparison with the conditions expected on the day of go-live: during the day the DHA service is under more strenuous and continuous use, whereas in the evening normal usage of the DHA system is largely absent. From information gathered in previous years, most parents looking to register their children do so during normal business hours. No slots were, however, requested to test with DHA during normal working hours, and the test results should therefore be regarded as reflecting better-than-normal circumstances.

5.5 Increase the testing scope

The testing scope of this project focuses entirely on only three requirements, namely that the system must support 40 000 users submitting 100 000 applications in an hour, with the ability to recover from any failures such that overall availability is 95% or better.

A 3-second response time is requested, but with traditional performance testing this response time is only measured as server turnaround time and not as user experience time, which is likely much higher and possibly higher than what is acceptable to most users, especially considering the large home page. More testing is therefore recommended that would:

Page 22

1. Measure or at least estimate the user experience time (a sketch follows this list)
2. Identify what can be done to improve user experience
3. Retest to identify if changes had the desired effect on user experience
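As an illustration of the first point, browser navigation timings can be captured alongside protocol-level measurements to estimate user-perceived load time rather than only server turnaround time. The sketch below is an assumption-laden example using Selenium with Chrome; the portal URL is a placeholder, and the thresholds that matter would still need to be agreed with GDE.

```python
from selenium import webdriver

PORTAL_URL = "https://www.gdeadmissions.gov.za/"  # placeholder; confirm the live portal URL

driver = webdriver.Chrome()  # assumes Chrome and a matching driver are available
try:
    driver.get(PORTAL_URL)
    # Navigation Timing Level 2 entry for the page that has just loaded
    nav = driver.execute_script(
        "return performance.getEntriesByType('navigation')[0].toJSON();")
    ttfb_ms = nav["responseStart"] - nav["requestStart"]   # server turnaround
    page_load_ms = nav["loadEventEnd"] - nav["startTime"]  # user-perceived load
    print(f"TTFB: {ttfb_ms:.0f} ms, full page load: {page_load_ms:.0f} ms")
finally:
    driver.quit()
```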

5.6 Introduce Network Testing

Not all real-life users have access to the same standard of network. For example, mobile network speeds can vary with the distance between the tower and the user's device, which is vastly different from a user on a wired connection such as fibre.
Network testing plays a crucial role in optimising network performance. By identifying bandwidth constraints, congestion, latency, packet loss and out-of-sequence packets, organisations can enhance data transmission and ensure a seamless user experience. We recommend incorporating network testing into regular operations to proactively address network issues and improve overall network efficiency.
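As a rough, back-of-envelope illustration of why this matters, the figures below are assumed values only (not measurements from this project), but they show how the same page payload can take many times longer to load over a weak mobile link than over fibre.

```python
def page_load_estimate(page_mb: float, bandwidth_mbps: float,
                       rtt_ms: float, round_trips: int = 10) -> float:
    """Very rough page-load estimate: transfer time plus protocol round trips.
    All inputs are illustrative assumptions, not measured values."""
    transfer_s = (page_mb * 8) / bandwidth_mbps
    latency_s = (rtt_ms / 1000) * round_trips
    return transfer_s + latency_s

# Assumed profiles: urban fibre vs a weak mobile signal far from the tower.
print(page_load_estimate(3.0, 50, 15))   # fibre:  ~0.6 s
print(page_load_estimate(3.0, 2, 120))   # mobile: ~13 s
```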

5.7 Monitoring

Monitoring should be further extended to incorporate the whole application. As part of the performance testing engagement, IT Ecology and Microsoft have enabled monitoring of various metrics on the gateway servers, since a decision was taken to no longer utilise Dynatrace, which performed this function previously. The current monitoring may suffice for go-live; however, it does not offer the same capabilities in its current form:

1. Currently, only requests and response times coming into the gateway service are measured. No measurements are available of the actual requests to, and responses from, DHA and their associated response times. It is therefore not easy to determine whether a possible response time degradation is due to the gateway services or to DHA (a sketch of one possible approach follows this list).
2. User experience monitoring is also recommended, as it will provide details of why certain users may have challenges interacting with the system. This may be due to a slow connection, a specific ISP, or a browser or browser version that does not perform as expected. Such monitoring is not available now that Dynatrace has been replaced.
3. User experience monitoring is also recommended to provide the necessary input for performance testing of the admin portal. Very loose requirements have been provided for admin portal performance testing, and any outcomes from such an exercise are only as valuable as the inputs provided for it. Learning from user experience monitoring how users interact with the system is extremely helpful when deciding how and where to make improvements to the application. There is even an option to completely re-architect the bulk transfer process, but such a decision should be guided by data that shows what the issues are, when they occur and why they occur.
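One possible way to close the first gap above is to record the DHA round-trip time as a custom Application Insights metric from the gateway's outbound calls. The sketch below uses the Python Azure Monitor OpenTelemetry distro purely for illustration; the actual gateway service is hosted on IIS and would use the equivalent .NET instrumentation, and the connection string shown is a placeholder.

```python
import time
from azure.monitor.opentelemetry import configure_azure_monitor
from opentelemetry import metrics

# Placeholder connection string; use the Application Insights resource configured for go-live.
configure_azure_monitor(
    connection_string="InstrumentationKey=00000000-0000-0000-0000-000000000000")

meter = metrics.get_meter("dha.gateway")
dha_latency_ms = meter.create_histogram(
    "dha_response_time_ms", unit="ms",
    description="Round-trip time of outbound calls to the DHA NPR service")

def timed_dha_call(do_request):
    """Wrap an outbound DHA request and record its round-trip time."""
    start = time.perf_counter()
    try:
        return do_request()
    finally:
        dha_latency_ms.record((time.perf_counter() - start) * 1000)
```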

Page 23
