Pilot Aptitude Testing Guide
2nd Edition
NOTICE
DISCLAIMER. The information contained in this publication is subject to constant review in the light of changing government requirements and regulations. No subscriber or other reader should act on the basis of any such information without referring to applicable laws and regulations and/or without taking appropriate professional advice. Although every effort has been made to ensure accuracy, the International Air Transport Association shall not be held responsible for any loss or damage caused by errors, omissions, misprints or misinterpretation of the contents hereof. Furthermore, the International Air Transport Association expressly disclaims any and all liability to any person or entity, whether a purchaser of this publication or not, in respect of anything done or omitted, and the consequences of anything done or omitted, by any such person or entity in reliance on the contents of this publication.

© International Air Transport Association. All Rights Reserved. No part of this publication may be reproduced, recast, reformatted or transmitted in any form by any means, electronic or mechanical, including photocopying, recording or any information storage and retrieval system, without the prior written permission from:

Senior Vice President
Safety, Operations and Infrastructure
International Air Transport Association
800 Place Victoria
P.O. Box 113
Montreal, Quebec
CANADA H4Z 1M1
Guidance Material and Best Practices for Pilot Aptitude Testing
ISBN 978-92-9233-779-7
© 2012 International Air Transport Association. All rights reserved.
Montreal – Geneva
TABLE OF CONTENTS
1 INTRODUCTION
2 PILOT APTITUDE TESTING
   2.1 Safety and Human Performance
   2.2 Efficiency
   2.3 Fair Testing
   2.4 Risks of Not Testing Aptitude
   2.5 Quality Assurance
3 ITQI ONLINE PILOT SELECTION SURVEY
   3.1 Outline of the Survey
   3.2 Summary of Survey Questions
   3.3 Lessons Learned from the Industry Survey
      3.3.1 Accepted Approach
      3.3.2 Selection Systems May Display a Lack of Conceptual Basis
      3.3.3 Strengths and Weaknesses – Methodologies Being Improved
      3.3.4 Ready-Entry Pilots (Low Experience) Are a Diverse Group
      3.3.5 Selection for Captains and First Officers Undervalued
4 LEGAL PROVISIONS FOR APTITUDE TESTING
   4.1 General
   4.2 Survey
   4.3 Existing Regulations
   4.4 Data Protection and Professional Standards
5 APTITUDE TESTING AND RECRUITMENT / HIRING
   5.1 Testing Supports Recruitment
   5.2 Screening and Selection
   5.3 Mechanism: Structured Aptitude Testing and Recruitment
   5.4 Cultural Diversity
   5.5 Hiring Decision
6 PREDICTING PERFORMANCE OF PILOTS
   6.1 Test Reliability
   6.2 Test Validity
   6.3 Norm
   6.4 Measurement Scales
7 MEASURING DIMENSIONS / TESTING INSTRUMENTS
   7.1 Measuring Dimensions of Pilot Aptitude Testing
   7.2 Testing Instruments
      7.2.1 Questionnaires
      7.2.2 Free-Style Interviews
      7.2.3 Semi-Standardized Interviews
      7.2.4 Targeted Selection
      7.2.5 Paper-Pencil Psychometric Tests
      7.2.6 PC-Based Psychometric Tests
      7.2.7 Work Samples
      7.2.8 Simulation-Based Testing of Operational Competencies
      7.2.9 Fixed-Base Simulators
      7.2.10 Full-Flight Simulators
   7.3 Motivation of Applicants
   7.4 IATA Matrix – Pilot Aptitude Testing
8 DESIGNING PILOT APTITUDE TESTING SYSTEMS
   8.1 Elements of a Pilot Aptitude Testing System
   8.2 Test Criterion
   8.3 Personality Requirements
   8.4 Test Battery
   8.5 Arrangement of Stages
   8.6 Content of Stages
9 ADMINISTRATION OF APTITUDE TESTING SYSTEMS
   9.1 Selection Team
   9.2 Duration of Tests
   9.3 Outsourcing / Acquisition of Tests
   9.4 Presentation of Results to the Applicant
   9.5 Preparation Courses
   9.6 Reapplication
   9.7 Validity Period of Results
   9.8 Evaluation of the Aptitude Testing System
10 FINANCIAL ASPECTS
   10.1 Cost Effectiveness of Aptitude Testing Systems
   10.2 Structure and Contributions
ACRONYMS
A/C – Aircraft
ATPL – Airline Transport Pilot License
ATO – Approved Training Organization
CPL – Commercial Pilot License
EASA – European Aviation Safety Agency
FAA – Federal Aviation Administration (US)
FBS – Fixed-Base Simulator
FCL – Flight Crew Licensing
FCLPT – Flight Crew Licensing and Training Panel
FFS – Full-Flight Simulator
FTO – Flight Training Organization
ICAO – International Civil Aviation Organization
IOE – Initial Operating Experience
IR – Instrument Rating
IT – Information Technology
ITQI – IATA Training and Qualification Initiative
JAA – Joint Aviation Authorities (Europe, pre-EASA)
JAR – Joint Aviation Requirements
KSA – Knowledge, Skills and Attitudes
MPA – Multi-Crew Aeroplane
MPL – Multi-Crew Pilot License
PANS-TRG – Procedures for Air Navigation Services: Training
PC – Personal Computer
TEM – Threat and Error Management
FOREWORD
Dear Colleagues,

It is my pleasure to present the second edition of the IATA Guidance Material and Best Practices for Pilot Aptitude Testing. The creation of this manual was identified as one of the IATA Training and Qualification Initiative's priorities to preserve and improve operational safety in civil aviation. The data within the manual is based on an online survey that was conducted in the summer of 2009. Despite the clear benefits of a proper pilot selection process, the results showed that only a minority of airlines have a specific selection system in place that is structured and scientifically based. This guidance material covers many of the aspects necessary to understand, construct and run a successful Pilot Aptitude Testing system. I would like to thank the members of the ITQI Project Group Selection and the airlines that have contributed to the development of this material. To obtain more information about the different publications that were developed under ITQI, please consult our website at: www.iata.org/itqi

Best regards,
Günther Matschnigg, Senior Vice President, IATA Safety, Operations & Infrastructure
EXECUTIVE SUMMARY
Manual objective: to support all airlines. In recognition of the changing industry environment, this manual is designed to enable airline recruitment managers to implement modern, practical pilot aptitude testing systems in their organizations. Operational decision-makers, aiming to recruit the best candidates, face the dilemma of selecting between similar testing systems offered by various providers using varying terminology. This manual should enable a more informed selection.

Assumed pilot applicant pool: A direct relationship between recruitment pool size and the success of pilot aptitude testing (PAT) has been observed: the larger the recruitment pool, the better the PAT results. An adequate supply of pre-qualified and interested applicants from which to select (for an airline career) is an assumed basis for this manual, which deals with selection and pilot aptitude testing. Initiatives to address a shrinking recruitment pool are beyond the scope of this manual.

Selection systems: The term pilot aptitude testing is used as a hypernym, overarching all areas of pilot selection including aptitude diagnostics (basic abilities, specific/operational abilities, social competencies and personality traits).

Measurement dimensions: The primary measurement dimensions of pilot aptitude tests are:
a) Basic abilities (physical and mental)
b) Operational competencies
c) Social competencies
d) Personality traits
System performance: The performance of an aptitude testing system can be measured by an evaluation of the following factors:
a) Test reliability
b) Test validity (especially predictive validity)
c) Ratio of the selection rate (number of candidates accepted versus number tested) to the hit rate (on-the-job success rate with regard to the test criterion)

Testing tools of choice: The least qualified testing instruments are free-style interviews, while some of the best qualified testing instruments involve psychometric testing. Classic flight-simulator checks are suitable for quantifying the amount and type of training needed for selected personnel, and provide some confidence in the validity of previous experience in the case of ready-entry pilots, but they are less suitable for testing aptitude. Simulation-based testing of operational competencies is best performed on specifically programmed (PC-based) low-fidelity simulators, since these provide high values of predictive validity. Multi-stage testing systems (less expensive screening procedures first, costly selection procedures last) are most advisable.

Selection team and result: Hiring decisions should be made by a dedicated selection team. In the interest of safety and fairness, and assuming that the aptitude testing system has been professionally developed, implemented and validated, the hiring decision should be based solely on test results.
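The selection rate and hit rate above can be made concrete with a short calculation. The sketch below is illustrative only: the function names and all figures are invented for this example and are not taken from the manual or the survey.

```python
def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of tested applicants who pass the selection stage."""
    return selected / applicants

def hit_rate(successful: int, selected: int) -> float:
    """Fraction of selected candidates who later meet the test
    criterion (e.g. successfully complete line training)."""
    return successful / selected

# Invented figures: 400 applicants tested, 60 accepted,
# 54 of those later meet the test criterion.
print(f"selection rate: {selection_rate(60, 400):.2%}")  # 15.00%
print(f"hit rate:       {hit_rate(54, 60):.2%}")         # 90.00%
```

A stricter selection stage typically lowers the selection rate while raising the hit rate; evaluating both together shows whether added testing cost is buying better on-the-job outcomes.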
Regulatory issues: Medical examination, language proficiency and the ability to comprehend training course content are specific ICAO requirements for training. National regulators worldwide have been reluctant to develop guidance on personality, yet this criterion is most important for flight crew. There are some general guidelines for assuring the best psychological aptitude of applicants, but there is a lack of definitive material available. Equal opportunity legislation, data protection rules, legal provisions for professional aptitude testing and aspects of cultural diversity must be considered to ensure that ethical and legal aptitude testing is achieved. Benefits: Professional aptitude testing for airline pilots, if correctly implemented, can contribute considerably to cost savings and enhanced safety for an airline. Selection is the first point of action, where no costs have yet been sunk, and improving this part of the process is critical to the avoidance of future risk and cost. The costs associated with implementing an effective aptitude testing system are significantly lower than subsequent costs of high failure rates resulting from immature selection. Benefits include enhanced safety, lower overall training costs, improved training and operational performance, more positive working environments, reduced labor turnover, and enhanced reputation of the airline brand.
1 INTRODUCTION
Human factors: IATA supports investments in human performance, since human factors continue to account for the majority of aircraft accidents. The airline accident rate has plateaued in recent years, and more growth is likely to lead to more accidents unless this rate can be further reduced. More effective PAT, as part of the IATA Training and Qualification Initiative (ITQI), has become an important initiative to enable quantifiable reductions in accident rates.

Identified needs: Historically, military organizations and large operators with high volumes of applicants have had access to mature selection systems. Ironically, smaller organizations, which tend to have the highest turnover of personnel, are often less capable of developing and maintaining an effective aptitude testing process. Over time the product of suboptimal PAT may become a safety issue, especially as the experienced pool dries up. As airlines face industry growth, decreasing numbers of experienced airline and military pilots will be available for direct entry, and a large number of operators will be forced to source from the general aviation market and cadet entry.

Survey: In order to address the PAT challenge from industry evidence, IATA conducted an online survey of member airlines and associated operators. This survey was designed to review industry selection practices in support of the production of this manual, posing 91 questions covering areas of organization/training/hiring, psychology, methodology and financial aspects of PAT. Full details of the survey and results can be found in Section 3.

Single source document: The manual contains, in a single-source document, a summarized overview of the most important aspects of aptitude testing. Although large well-established operators may already enjoy mature selection processes, the industry is changing rapidly, and a review of the PAT manual by all operators is likely to support greater efficiency and system safety.
Improving the selection process: Rather than just gearing up for more pilot training, we must also ask the question: Who are we looking for to man future commercial airliner flight decks? In consideration of new needs, selection must be re-geared.

Attributes to enhance: Identify the fundamental attributes of the new-generation pilot pool that support teamwork development, and capitalize on their improved self-learning styles and rapid knowledge acquisition.

Target attributes: The operational requirements of airliners from generation 1 to 4 are well understood, and generation 5 is some time away. The target attributes required for a safe airline-piloting career are therefore universally standard and should be considered independent of culture and generation.

Adapting to change: Nevertheless, contemporary characteristics and motivations of new generations from different regions must be accommodated in the selection process, and appropriate adjustments made to maximize efficiency. Pilot aptitude testing therefore warrants a fresh review for the benefit of operators with well-established legacy selection processes as well as for start-up operators. This manual attempts to provide a toolbox of solutions and best practices for all.

Safety and growth: There are concerns that under severe industry cost pressures, the quality of selection and training may have subsided towards minimum standards, or at least failed to adapt to needs. Despite regular industry commitments to safety, and recognition that competent pilot performance is critical, current selection processes may not be optimal for new industry requirements. While crew training is a driver for operational safety, the quality of recruits entering the industry is the primary key. In periods of strong growth, pilot supply comes under increased pressure. If the quality of new entrants declines, longer-term operational safety may be compromised, and the task of training organizations becomes more challenging.
Management input: The design phase of an aptitude testing system requires high management attention (definition of the job requirements, application/re-application criteria, presentation of results, evaluation procedures) and the involvement of aviation psychology expertise or professionals with experience and expertise in the aviation human factors arena.

Management questions: Are current systems valid? Despite some skepticism amongst pilots regarding the viability of testing human behavior, there are numerous examples of aviators who initially failed entry-level testing with legacy carriers and subsequently became highly successful professional pilots for other carriers. Ironically, many of these pilots were re-examined at a later date and hired as ready/direct-entry pilots by the original testing airline, ultimately enjoying a seamless career. Some of these pilots were subsequently promoted to leadership positions within the legacy carriers because of their previous experience. This illustrates the need to clarify the following questions: How accurate are current industry selection practices? Is there enough research data available for the development of a consensus on job requirements and aptitudes? Are our aptitude testing instruments capable of measuring with reliability and validity, and of providing a long-term prognosis of the behavior of a candidate?
Study: An in-depth study of the fundamentals will answer these key questions. Additionally, this process can reveal areas of aptitude testing that work well and are commonly accepted, as well as those areas of testing where relatively new methods are under development.

Career disclosure to applicants: Today few pilot candidates in selection are fully advised of the volatile nature of the airline industry. While effective selection requires that the employer learn about the applicant, the applicant also has the right to know about the career ahead, including both its negative and positive aspects. This is important to avoid later disappointment and loss of motivation, which can translate into risk. Clarity regarding career prospects is vital at the outset: time and money spent at the start of a career are better spent than the millions lost downstream to failure or demotivation among those with false expectations. Applicants should have realistic expectations at the start.

Length of the selection process: Often only a few days of actual selection time are allocated to this important primary task, hardly sufficient to provide the level of confidence needed in an individual who may be employed for the next 30 years. A more extensive selection and grading process should be considered, including motivation, flight suitability, and simulation assessment over a reasonable time period. The output of this process is one of the most important investments in the industry. As a guide, military air forces and some airlines already allocate 3-4 weeks for the complete process.

Simulation-based testing: The highest degree of difficulty becomes apparent when testing the operational competencies of pilots in complex and dynamic situations. In order to measure, evaluate and predict this dimension of human behavior reliably, adequate simulation-based environments need to be developed and used.
Currently, these testing environments are only available on a limited basis in very few testing facilities. Note: A simulation-based testing environment must not be confused with full-flight simulator testing, which is traditionally used in assessing flying skills and for determining the type and quantity of training needed for new pilots. Selecting team players: Airline operations are now so complex that pilots cannot be expected to know or understand systems to the extent previously required, but they need to know where to look and access information. As the systems become more complex, more focus should be on multi-crew operations. Although routine activities on flight decks have been automated, many new airline system challenges have emerged, demanding responsive management and teamwork skills at ever-increasing levels of quality.
INTRODUCTION SUMMARY
A major upgrade of the airline pilot selection process should consider the following:

1. Attributes, qualities, motivations, and attitudes of new generation pilot pools are changing, and job requirements have also changed since earlier selection processes were established.
2. Be aware of parallel initiatives such as the ICAO Next Generation of Aviation Professionals (NGAP) initiative and the IATA Training and Qualification Initiative (ITQI), working to re-invigorate youth interest in piloting careers to secure the large volume and quality of applicants needed.
3. Continuous feedback from operations back to selection is essential for continuous improvement.
4. As growth and other factors impact safety, selection of the right personnel becomes even more critical.

The rapid development of hardware has exposed unresolved limitations in human-ware in cockpits, but humans remain the last line of defense in the error chain. Improved pre-education, selection, and training are the most fruitful targets for improvement if the accident rate is to be driven down further. ICAO, through the NGAP initiative, and IATA, through ITQI, have both recognized the importance of selection and training of professionals.
2 PILOT APTITUDE TESTING

2.1 SAFETY AND HUMAN PERFORMANCE
Process variety: The development of a qualified diagnostic system for aptitude testing is time consuming and can be costly. Historically, military organizations and large operators with high quantities of applicants generally have access to robust and mature screening and selection systems. Smaller organizations, which tend to have the highest turnover of personnel, are usually less sophisticated in developing and maintaining an effective aptitude testing process. The latter may be considered a safety issue, especially as the pool of experienced applicants grows smaller. Growth: As airlines face growth, fewer experienced airline and military pilots will be available and a large number of medium and small size operators will need to recruit their staff from the general aviation market. In some regions there is no significant general aviation, and the shrinking pool of experiential knowledge will present more dramatic challenges for operators who will become increasingly dependent on cadet entry. This scenario necessitates even more rigor in all selection processes.
2.2 EFFICIENCY
The implementation of a professional aptitude testing system has proven to be highly effective and efficient. It has become more affordable, but may still be perceived as a high cost. If correctly implemented, however, an effective aptitude evaluation process for pilots can contribute to considerable cost savings for the airline.
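As a rough illustration of why testing can pay for itself, the following sketch compares the cost of a testing stage with the training cost it avoids through a lower failure rate. The cost model and every figure in it are invented for illustration; they are not IATA data and real business cases will differ.

```python
# Illustrative cost model (all figures and assumptions invented).
def net_saving(applicants: int, test_cost_per_applicant: float,
               hires: int, training_cost: float,
               failure_rate_without_testing: float,
               failure_rate_with_testing: float) -> float:
    """Training cost avoided by better selection, minus testing cost."""
    testing_cost = applicants * test_cost_per_applicant
    failures_avoided = hires * (failure_rate_without_testing
                                - failure_rate_with_testing)
    return failures_avoided * training_cost - testing_cost

# Assumed: 400 applicants tested at $500 each; 60 hires; $100,000
# training cost per pilot; failure rate drops from 15% to 5%.
print(round(net_saving(400, 500.0, 60, 100_000.0, 0.15, 0.05)))  # prints 400000
```

Even under modest assumptions, a few avoided training failures can recover the entire cost of a structured testing stage, which is the cost-effectiveness argument made in Section 10.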
Note: Hit rate is defined as the success rate with regard to the selection criterion.
3 ITQI ONLINE PILOT SELECTION SURVEY

3.1 OUTLINE OF THE SURVEY
Confidentiality: De-identification of respondents and confidentiality of collected data was assured by IATA.

Response summary: The following is a summary of the survey's response rate:
- 110 institutions responded to the invitation by logging in
- 6 institutions stated reasons for not being able to answer the survey
- 66 institutions completed parts or all of the survey:
  - 53 completed Part I
  - 19 completed Part II
  - 19 completed Part III
  - 12 completed all three parts
Survey observations:
a) Accepted approach: the categories used to evaluate the issue of pilot selection, and the questions asked, were deemed acceptable by the respondents.
b) Current selection systems appear to lack a conceptual basis: there is a need for conceptual support in setting up an efficient selection system.
c) Strengths and weaknesses: the mix of changes made and changes desired affected the methodology used more than organizational issues or the efficiency of testing systems.
d) Ready-entry pilots (low experience/direct entry) are a diverse group: predicting the performance of this group seems to be especially difficult. This is the least homogeneous group, with neither license nor flying-hour levels sufficient to clearly assess pilot competency.
e) Selection for captains and first officers is undervalued: most selection systems have been established for ab-initio candidates, and these display a high degree of sophistication. Fewer and less methodically qualified selection systems are in place for first officers. Selection systems used for captains display the least developmental quality and maturity.

Note: The cornerstones of an airline's safety culture are the leadership skills of its first officers and captains. Therefore, investments in the professional testing process are of vital importance.
3.2 SUMMARY OF SURVEY QUESTIONS
PART I Organization-Training-Hiring
Question: Which types of operations are you performing?
Question: Which kind of personnel are you employing/recruiting?
Question: How many candidates have been tested in total during the recent three years?
Question: How many candidates passed the selection process successfully in the last year?
Question: How many candidates did you hire (did you give a contract) in the last year?
Question: How long has your selection system been in place?
Question: Are you offering selection for other companies?
Question: What actions do you take to ensure a sufficient number of applicants?
Question: Which way, according to your experience, is the most effective one?
Question: Are there any preconditions for the candidates to be accepted in your selection process?
Question: How long has the definition of preconditions been in place?
Question: Has your state a legal requirement for selection of pilots besides ICAO medical provisions and language proficiency?
Question: Does your reg. authority perform any selection in addition to ICAO medical provisions and language proficiency?
Question: Does your reg. authority delegate any selection in addition to ICAO medical provisions and language proficiency?
Question: Does your reg. authority supervise any selection in addition to ICAO medical provisions and language proficiency?
Question: Do you employ foreign nationals?
Question: Is there any restriction with regard to the number of foreign nationals?
Question: Do you tailor recruitment campaigns to specific target groups?
Question: If yes, according to which criteria?
Question: Who is performing the selection?
Question: In the case your own company performs the selection partly or in total, do you have a special procedure to identify selection team members?
Question: What is your process to identify selection team members?
Question: From where do they get their qualification for this function?
Question: Who decides about hiring of pilots in your company?
Question: Is the decision (hiring) solely based on results of the pilot selection system?
Question: How many years do you keep your selection results valid in the case you cannot immediately hire candidates and put them on a waiting list?
Question: How is the result of your selection process presented to the candidates?
Question: How is the result of your selection process presented to the hiring decision maker?
Question: Do you perform a reference check on Ready Entry/FO/Capt. candidates?
Question: Is your selection system incorporated in the QMS of your company?
Question: Is the organization performing the selection certified?
Question: Who maintains the selection system in terms of QM?
Guidance Material and Best Practices for Pilot Aptitude Testing

PART II: Financial Aspects
- What are the costs of your selection per candidate and group?
- What are the costs per new hire?
- How much do you invest per year in recruitment?
- How much does the candidate contribute towards the costs of selection?
- How much does the candidate contribute towards the costs of training?
- Do you receive any government incentives for recruiting, training, or staff retention?
ITQI Online Pilot Selection Survey

3.3 LESSONS LEARNED FROM THE INDUSTRY SURVEY

3.3.1 ACCEPTED APPROACH
The participation rate in the ITQI survey, subsequent feedback from participants, and the ongoing evaluation of the survey results show that the categories used to evaluate pilot selection, and the questions asked, were accepted by respondents and can be used in the further development and optimization of more standardized selection systems.
3.3.2 SELECTION SYSTEMS MAY DISPLAY A LACK OF CONCEPTUAL BASIS
Very few institutions (13 of 66) stated that they run a selection concept. Ten organizations have dedicated concepts for different target groups in place. The others also perform selection but do not refer to their procedures as a concept. This could be an indication that there is a need for conceptual support in setting up efficient selection systems, which is one of the intended purposes of this IATA manual. Furthermore, most institutions running a selection system do not offer such services to third parties.
3.3.3 STRENGTHS AND WEAKNESSES: METHODOLOGIES BEING IMPROVED
Organizations that monitor their selection system stated that their programs have changed significantly in recent years. Half would like to make more changes but are limited by various factors. Changes made and changes desired affect the methodology (reliability, validity and evaluation) but are less focused on organizational efficiency (time, costs and automation). A significant challenge is that professional selection systems can only be evaluated and maintained by collecting the necessary data from all parties in the system. Processing this data over the career path of a pilot into a statistically and scientifically accurate format is time-consuming. Consequently, most organizations applying a stable selection system consider their methodical criteria (high reliability, quality of the evaluation procedure and high validity) as strengths, followed secondarily by economics. The weaknesses they identified were (a) requirements for test-operator qualification, (b) a low degree of automation and (c) economy of time.
3.3.4
Recruiting groups:
The survey grouped applicants into four categories: (1) Captains, (2) First Officers, (3) Ready-Entry (Direct Entry) Pilots with low experience, and (4) Ab-initio graduates.
Performance prediction:
Predicting the performance of ready-entry pilots (low experience) is especially difficult because this group is not homogeneous, and neither licenses nor flying hours can reliably describe a pilot's competency. Numerous industry experts agree, from direct observation in extensive instructional and evaluation settings, that there is a wide variety of individual performance for any given volume of total flight experience in a log-book, and that total flight experience is not a reliable measure of competence. A structured selection system is therefore vital in removing this ambiguity.
Career performance:
While many opinions abound, only long-term collection of career performance data in sufficient volume can convincingly expose the optimal selection criteria and show which demographics (age, education, origin, previous employment) lead to the highest probability of career success.
3.3.5
Most selection systems have been established for ab-initio candidates, and these display a high degree of sophistication. Fewer, and less methodically qualified, selection systems are in place for first officers, and those existing for captains display the lowest levels of system maturity and robustness. This may stem from the legacy belief that hours equal competence.
Performance predictions:
Every operator seeks to make predictions about the future performance of its staff, such as the future suitability of first officers for promotion to command. However, long-term predictions of pilot performance derived from earlier selection processes can only be made optimally when pilots remain in a continuously stable and controlled working environment fostering continuous education, professional development, and regular assessments of job performance. Normally, this situation can only be assumed to exist within mature organizations with positive organizational and safety cultures and well-established career development systems.
4.2 SURVEY
The legal survey part contained four questions concerning requirements, conduct and supervision of selection procedures by the authorities of the concerned state. Except for Language Proficiency (Level 4) requirements, no institution reported any involvement of their authority.
CAAC (China)
In addition to Medical Fitness and Language Proficiency, CCAR Part 61 (16 Dec 2004) imposes the following requirements:
61.103 Eligibility requirements for student pilots: (b) be of good moral character.
61.153 Eligibility requirements for commercial pilot certificate: (b) be of good moral character; (d) at least graduate from high school.
61.183 Eligibility requirements for ATPL: (b) be of good moral character; (d) at least graduate from high school.
The English version contains no guidance material.
CASA (Australia)
Besides Medical Fitness and Language Proficiency, the following applies: CAR 5.09 Flight crew license issue and refusal: (1)(c) is a fit and proper person to hold the license.
FAA (USA)
14 CFR Part 61 requires for commercial pilots the relevant medical examination and to read, speak, write and understand the English language and: 61.153 (c) Be of good moral character
DGCA (INDIA)
Civil Aviation Requirements (CAR) Section 7, Series B, Flight Crew Standards, Training and Licensing:
1 Introduction: In general, an applicant for the issue of a license should meet the requirements in respect of age, basic educational qualification, medical fitness ...
7.3 Commercial Pilot: The applicant should have (a) passed the 10+2 standard examination with Physics and Mathematics subjects from a recognized Board/University, or an equivalent examination.
TRANSPORT CANADA
The flight training unit shall ensure that an applicant, before being admitted to an integrated course, has a secondary school diploma or equivalent in accordance with the personal licensing standards.
Legal Provisions for Aptitude Testing

4.4 DATA PROTECTION AND PROFESSIONAL STANDARDS
In many areas of the world, a legislative framework relating to personal data privacy and protection is in place. For example, European Union Council Directive 95/46/EC sets out the requirements for the protection of individuals with regard to the processing of personal data and the free movement of such data. It is important to be aware that such data also includes the reports generated by computers or human testers during aptitude testing.

Outside the EU: Testers beyond the legislation of the EU might find it useful to consider the key principles of the directive, as it provides a minimum set of requirements for processing personal data fairly and lawfully. In any case, data protection is an important issue to be addressed before putting a pilot selection system in place.

Professional standards: In addition to legal provisions, professional psychological societies have developed standards for the development, application and validation of aptitude testing (for example, the Society for Industrial and Organizational Psychology in the USA and the British Psychological Society in Great Britain).

Note: It is strongly suggested that legal counsel be consulted for interpretation and application of the legal requirements concerning aptitude testing of personnel. The authors of this manual are not lawyers and the information given should not be taken as legal advice.
The following are key questions in this process:
(a) Who is responsible for the process?
(b) Who defines the requirements?
(c) Who performs the testing?
(d) Who takes the hiring decision?
(e) Is the decision solely based on the results of aptitude testing?
(f) How and by whom is the recruitment process / selection system maintained (evaluated)?
Combined requirements:
The variety of requirements dictates that the definition of the requirement profile ideally be a combined task of the human resources, flight operations and training departments. Senior personnel of these departments are able to identify both the successes and the potential problem areas that the company has experienced with its existing group of pilots.
Biographical data:
The profile is normally extended by biographical data (e.g., age, sex, nationality, language, licensing, education, school degree, hobbies, sports, family, interests).
Note: Operators may adapt the suggested definitions to their specific requirements.
Additional criteria:
Additional target group characteristics like gender, age, language, nationality, school level and marks, flying hours, etc. may be added.
Documentation:
The recruitment process requires documentation.
Types of tests:
The testing process itself can thereafter be designed by identifying the types of tests to be included in the selection system. Since different testing methodologies are available, it is essential to choose adequate tests for each specific step. A closer look at the pros and cons of various test batteries (a sequence or set of tests) reveals widespread common errors, especially regarding who should conduct the testing and which methodologies are most effective (see Section 8, Construction of Pilot Selection Systems).
Team Decision:
In most cases, the hiring decision is made as a group by the designated airline selection team. Assuming that the selection system was based on a thorough definition of requirements, the hiring decision can be based primarily on the testing results. Other factors such as administrative aspects, availability of applicants, legal aspects and flying experience should be incorporated in the definition of requirements and are therefore covered by the screening process.
Test quality:
There is consensus that predictions depend on the quality of the tests and of the feedback data used for their evaluation. Without developing a specific statistical formula, the following criteria describe the principles of effective testing that best determine the quality of its predictions.
Reliability coefficient:
Test reliability is expressed in terms of a reliability coefficient, which provides an estimate of the precision of measurement of a test. The higher the reliability coefficient, the smaller the margin of error around a test score. Common methods of determining reliability are:

Test/re-test reliability: Comparison of the results of the same group taking the same test again after a certain time. This method provides information about the stability of the test.

Alternate form reliability: If several forms of a test exist that claim to measure the same ability, the reliability coefficient can be obtained by comparing the results of the same group of individuals taking the different forms.

Internal consistency: If, for example, the results of a test from one group are split into two halves by taking the individual scores on all odd-numbered questions and the scores on all even-numbered questions, the internal consistency can be measured by correlating the two halves.
6.3 NORM
Knowing that a candidate has reached a score of 70% is quite meaningless without a norm. At best, it can only serve to provide a ranking within the tested group.

Comparison: Norms are necessary to enable objective interpretation, rating, and grading of test results, and they are preconditions for meaningful pass/fail decisions. By comparing the candidate's performance with the norm, we can determine how far it lies above or below the performance of the comparison group. The choice of the norm is crucial, and the comparison group should consist of people who have applied for the same or similar jobs in the industry. For example, comparing the performance of ab-initio cadets in their initial type rating skill test with the skill test results of first officers transitioning from a short-haul to a long-haul type, or of first officers being upgraded to captain on the same type, could lead to very different and easily misinterpreted conclusions.
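As an illustration of norm-referenced interpretation, the sketch below converts a raw score into a T-value (mean 50, standard deviation 10) relative to a norm group of comparable applicants. The norm data and function name are hypothetical, not prescribed by this manual.

```python
# Illustrative sketch: norm-referenced scoring on the T-scale.
# The norm group data below are invented for demonstration.
from statistics import mean, stdev

norm_group = [54, 61, 58, 70, 66, 49, 63, 57, 72, 60]  # scores of comparable applicants

def t_value(raw_score, norm):
    z = (raw_score - mean(norm)) / stdev(norm)  # standardize against the norm
    return 50 + 10 * z                          # T-scale: mean 50, SD 10

print(round(t_value(70, norm_group), 1))  # the same raw score, interpreted relative to the norm
```

A raw score equal to the norm-group mean maps to exactly 50; the choice of norm group therefore directly determines how a candidate's score is interpreted.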
Common process:
The following measuring dimensions represent a sample of what is common in pilot aptitude testing. They are not applied equally to all four groups of pilots (ab-initio, ready-entry pilots, first officers and captains).

Ab-initio cadets & captains: Ab-initio cadets usually experience the most comprehensive test program, while captains receive the least. However, it is not that simple, as the test program is usually adapted to the respective group. Ab-initio cadets, for example, cannot be expected to have aviation-specific technical knowledge; however, they must possess a good foundation of academic qualifications, such as physics, in order to acquire knowledge during their later training and career. In a similar way, this applies to some social competencies such as leadership and team play.

Knowledge & social competency: Captains need to show professional aviation knowledge and should be strong in the social competencies needed to practice good Threat and Error Management (TEM) and Crew Resource Management (CRM). Some may have particularly strong instructional and/or managerial skills. Ready-entry pilots and first officers need to be flexible, with the ability to adapt to new environments, challenges and policies. Basic mental abilities are vital, playing a constant and crucial role for all groups, because they are necessary for each individual to cope with changes in his/her working environment (upgrading, type conversions, new procedures, etc.).
The particular job requirements for pilots can be characterized by the dynamics caused by motion, time, content and complexity of the task to be performed. The ability to master these requirements is called operational competency. Operational competencies should be tested in addition to basic mental abilities because the two do not necessarily correlate highly (i.e., the most intelligent candidates are not always the most suitable for the operational challenges of the job).

b) Operational competencies comprise the abilities to perform the following:

Psychomotor tasks:
Compensatory tracking
Pursuit tracking
Multitasking
Spatial orientation (3-dimensional, dynamic)
Movement anticipation
Information processing
Strategic competencies:
Prevention
Minimizing risk
Compensatory strategies
Dealing with ambiguity

Management competencies:
Planning
Organization
Prioritization
Decision-making
Problem-solving
Collection of information
Derivation of hypotheses
Hypothesis checks
Derivation of task concepts
c) Social competencies:
Communication skills (non-verbal / paraverbal / verbal)
Cooperation
Assertiveness
Leadership competencies

d) Personality traits:
Self-discipline
Self-critical attitudes
Stress management
Self-organization
Professional aspiration level
Questionnaires are the most used instruments. They are perfectly suitable for collecting facts such as biographical data, but not very suitable for testing psychological criteria. The following are examples of relevant biographical data:
School and university grades
English language proficiency
Mathematics and physics
Computer skills
Interest in sciences
General education (cultures, history, politics and languages)
Sports
Interest in travelling
Interest in leisure activities to compensate for stress
Fanatic fixation on flying
Biographical data evaluation is a cost-effective way of screening. Successful candidates from this stage are basically qualified for all professions requiring a defined set of academic prerequisites; however, they still need to be tested for their pilot-specific aptitude. Screenings based on biographical data ensure that only qualified candidates proceed to the later and more costly stages of selection.
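A biographical screen of this kind can be as simple as a rule-based filter. The sketch below is purely hypothetical; the criteria, field names and thresholds are invented and would in practice come from the operator's own requirement profile.

```python
# Hypothetical rule-based biographical screening (cheap first stage).
# Criteria and thresholds are invented for illustration only.
def passes_screening(applicant):
    checks = [
        applicant.get("secondary_school_diploma", False),
        applicant.get("english_level", 0) >= 4,       # e.g., ICAO Level 4 or better
        applicant.get("math_physics_grade", 0) >= 3,  # hypothetical grade scale
    ]
    return all(checks)

applicants = [
    {"name": "A", "secondary_school_diploma": True, "english_level": 5, "math_physics_grade": 4},
    {"name": "B", "secondary_school_diploma": True, "english_level": 3, "math_physics_grade": 4},
]
shortlist = [a["name"] for a in applicants if passes_screening(a)]
print(shortlist)  # only applicants meeting all preconditions proceed
```

Because each check is cheap to evaluate, such a filter keeps the costlier psychometric and simulator stages reserved for basically qualified candidates.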
7.2.2
FREE-STYLE INTERVIEWS
Expert ratings acquired from free-style interviews are commonly used, but they are very subjective and methodically weak. They are not suitable for measuring aptitude, nor for selection or the elimination of the weakest candidates. If used at all, they may serve to introduce people to each other, but should not carry significant weight in screening or selection decisions. Because free-style interviews lack standardization, each applicant usually faces a different scenario, and comparisons between applicants are very difficult to make. Using a rating system can at least improve the systematic aspect of observation. Holistic judgments by experienced experts based on free-style interviews can sometimes produce surprisingly high hit rates; however, this can only be determined by evaluating the results after a certain amount of time (years), and when a new person takes over the job, continuity is not assured and the newcomer cannot build on the predecessor's experience.
Semi-standardized interviews follow a prescribed set of questions and evaluation criteria. They are quite demanding for the test operator. Provided that the interviewer applies professional questioning and communication techniques, semi-standardized interviews can be successful in capturing personality traits and social competence.
7.2.4
TARGETED SELECTION
In a targeted selection system, the interviewer collects job-related behavior from an applicant's past history. Interviewers are trained to focus their interview skills and selection decisions on standardized interview principles, and performance is rated by applying scores from defined scales. (The underlying requirement dimensions can be developed in an empirical process using expert knowledge, in which case they reflect the desired company culture.)
7.2.5
Paper-pencil testing has been commonly used and is qualified to evaluate basic mental abilities (i.e., intelligence). These tests can play an important role in the screening process. They are usually administered to large groups of applicants, but the evaluation is time-consuming. Paper-pencil tests have now largely been replaced by PC-based testing.
7.2.6
PC-based psychometric tests require some IT infrastructure and can be web-based. Similar to paper-pencil psychometric tests, they are reliable and very cost-efficient for testing basic mental abilities.
7.2.7
WORK SAMPLES
The principle of a work sample is to create a task typical of the job to be performed, then observe the results and provide feedback. Work samples are typically used when screening applicants in the airplane (military) or when testing licensed pilots in a flight simulator. The value of work samples is highly dependent on the standardization of the exercises and the quality of the observation personnel. If performed by well-trained and experienced experts, work samples can be of good value because of their realistic content.
7.2.8
Simulation-based testing of operational competencies can combine the realism of work samples with the advantages of psychometric testing. This form of testing addresses the ability of pilots to solve complex tasks in dynamic environments. Such scenarios are more realistic and comprise interactions of multiple requirements, which must be controlled by pursuing certain (professional) strategies. They require not only rational performance (strategies) but also include emotional (fear and fun) and motivational (confidence and commitment) aspects. Testing scenarios are quite complex and tend to be less rigid than psychometric testing. They can also capture the ability to solve problems in unstructured situations (unexpected emergency situations which cannot be drilled by procedures). Simulation-based testing of operational competencies can be performed on specifically programmed (PC-based) low-fidelity simulators. These provide high values of predictive validity.
7.2.9
FIXED-BASE SIMULATORS
Fixed-Base Simulators (FBS) are used to provide testing in work sample scenarios. Their value depends on the standardization of the exercises and the expertise of the observation personnel.
7.2.10 FULL-FLIGHT SIMULATORS
Assessments in Full-Flight Simulators (FFS) seem, at first sight, ideal because they offer the highest degree of realism, reproducing the actual dynamics and complexity of the pilot's working environment. Flight checks in full-flight simulators are commonly used to test the flying skills of ready-entry pilots, first officers and captains, and are valuable tools to complement, but not to replace, aptitude testing.

Note: From a diagnostic point of view, work samples in full-flight simulators are quite demanding (standardization of the scenarios, disturbances, quality of observation, complexity of instruction, inter-rater reliability, etc.). Simple arrangements do not necessarily produce the kind and quality of data required for valid aptitude testing and are therefore not advised as replacements for classic means of aptitude testing.
Instructions must be understandable and should challenge performance adequately for the measuring dimensions. Some motivational screening tests with strong success rates are available today; by design, the best of these ensure that rehearsed answers will not succeed.
Matrix 1
Matrix 1 proposes one possible method of allocating measuring dimensions and instruments to the four target groups; other solutions are possible.
Measuring dimensions and instruments, applied as appropriate across the four target groups (Ab-initio, Ready Entry, First Officer, Captain):

Screening: Biographical data
Instruments: biographical and career data (school degree, school marks, professional education documents)

Basic mental abilities
Memory capacity; logic abilities; speed and accuracy of information processing (perception, classification, transformation); spatial abilities (static); technical comprehension; reasoning (information processing with basic figures); long-term concentration; allocation of attention; multi-tasking (different skills combined); psychomotor abilities (pursuit tracking, compensatory tracking); spatial abilities (dynamic)
Instruments: paper-pencil tests; PC-based psychometric tests

Composite mental abilities
Attention allocation; multi-tasking (different skills combined); situation awareness
Instruments: psychometric apparatus tests; simulator-based tests / work samples; psychometric tests for operational competences

Social-interactional competences
Cooperation skills; assertiveness; conflict prevention and solution; cooperative problem-solving; leadership

Personality traits
Basic professional motivation; stress coping with social confrontation; stress coping with information load; stress coping with time pressure; self-discipline; self-criticism; captains additionally: safety motivation
Instruments: semi-standardized interviews
Matrix 2
Matrix 2 additionally shows a sample set of appropriate measuring scales.
Phases | Measuring dimensions | Target group | Instrument | Scales
Screening | Age, education type and level, grades, flying hours, etc. | All | Check | Yes/no
Screening | Knowledge: mathematics, physics, English language skills | Ab-initio, Ready Entry | | School grades; rank rows
 | | Ab-initio, Ready Entry, FO; Captains (long-term concentration) | Work sample (simulation-based) or psychometric test (PC) | T-values; rank rows
 | | Ab-initio, Ready Entry, FO | Low-fidelity simulator (PC) | Rank rows; qualitative discrimination
 | | All | Group tests by means of work samples in connection with a rating system | Rank rows; qualitative discrimination
 | | All | Semi-standardized interview in connection with a rating system | Rank rows; qualitative discrimination
Job analysis:
Job analysis involves two steps. First, the purpose of the job needs to be established (why the job is needed in the market and what function it serves). This step is called the job description. It must not be confused with the second step, which clarifies what kinds of abilities applicants should possess to perform the job well. These are called personal requirements and sometimes also job requirements. Job analysis with a job description and personal requirements is important because it requires a detailed analysis of the organization's objectives, company values and future challenges. This task requires significant management input. It is important to establish a scientifically detailed definition of the essential employee qualities necessary for excellence in job performance. Besides basic general abilities, professional abilities and certain personality traits (sometimes operator-specific) will be preferred.
System goals:
Once these prerequisites are established, the organization can move toward building and maintaining an efficient and effective aptitude testing system capable of achieving the following goals:
Identifying the most suitable staff for the job
Delivering selected personnel at the lowest possible cost
Providing a fair and legally defensible architecture
8.1 ELEMENTS OF A PILOT APTITUDE TESTING SYSTEM
To achieve the goal of improving staff performance, whether during training or during operations, the testing system must contain certain elements arranged in a sequential order to function effectively. This is referred to as a structured testing system, whose elements are tabled below:
Key question: What is our problem / goal?
Required action: Define the criterion (or criteria) to be achieved by the aptitude testing system.

Key question: How do we measure?
Required action: Perform a job analysis. Translate the job/personal requirements and the performance criteria constituting the desired good work into scientific psychological terms (KSAs: Knowledge, Skills, Abilities/Attitudes and personality traits) that can be measured by psychological tests.

Key question: Which tests serve us best?
Required action: Decide on the test battery (set of tests/measuring instruments) and their sequence. Decide on the selection team members.

Key question: How do we get from test results to a hiring decision?
Required action: Combine all test scores (profile). Define cut-off criteria, decide at which stages to exclude applicants, and decide when and by whom the hiring decision is taken. Construct the evaluation system by implementing a data feedback process from training/operations back to the selection team, and assure a supporting IT environment to enable data management.
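The score-combination and cut-off steps described above can be sketched as a small decision function. The dimensions, weights and cut-off values below are purely hypothetical, invented for illustration; an operator would derive them from its own job analysis.

```python
# Hypothetical sketch: combining per-dimension test scores (profile) into a
# weighted composite and applying cut-off criteria. All values are invented.
WEIGHTS = {"mental_abilities": 0.4, "operational": 0.3, "social": 0.2, "personality": 0.1}
CUTOFFS = {"mental_abilities": 40, "operational": 40, "social": 35, "personality": 35}  # minimum T-values

def hiring_recommendation(profile):
    # Exclude an applicant if any single dimension falls below its cut-off...
    if any(profile[d] < CUTOFFS[d] for d in WEIGHTS):
        return "reject"
    # ...then judge the remaining applicants on the weighted composite score.
    composite = sum(WEIGHTS[d] * profile[d] for d in WEIGHTS)
    return "recommend" if composite >= 50 else "hold"

print(hiring_recommendation({"mental_abilities": 58, "operational": 52, "social": 49, "personality": 55}))
```

Per-dimension cut-offs prevent a strength in one area from masking a disqualifying weakness in another, which a composite score alone cannot guarantee.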
Note: The described process should be carefully documented for several reasons. First, selection is part of the recruitment process and should be included in the quality assurance documentation of the company. Second, the construction, implementation and maintenance of the selection system itself are facilitated by good documentation. Third, the documentation may be required to prove the fairness of the system against legal claims.
Measurement scales:
Binary measuring scales such as pass/fail are often used but are not very helpful. Scales should be carefully developed, and the achievable performance levels must contain qualitative descriptions. Performance criteria for measuring pilot performance during training should coincide with the criteria used in operations. This is usually the case when the training institution is closely connected to an associated airline (for the MPL, this is a regulatory requirement).
6 P. MASCHKE and K.-M. GOETERS, Deutsches Institut für Luft- und Raumfahrt e.V., 1999.
7 Test battery: a set of aptitude tests/selection instruments.
Multi-stage selection procedures are advisable because they reduce the cost per applicant. To achieve this, the less expensive tests are conducted first: only successful candidates from Stage 2A continue to the more expensive Stage 2B. Multi-stage testing can be separated in time (e.g., if evaluation of the results is time-consuming) and/or location (e.g., parts of the test battery are installed at a fixed location while other tests can be administered close to the applicants' homes or are available online). However, depending on the required time and test arrangements, both stages can also be completed on the same day.
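The cost logic of ordering stages from cheap to expensive can be illustrated with a small model. All costs and pass rates below are invented for demonstration and do not come from the survey data.

```python
# Illustrative model: cost per hire in a multi-stage selection system.
# Running cheap stages first means expensive tests are taken only by
# survivors of earlier stages. All numbers are hypothetical.
def cost_per_hire(applicants, stages):
    """stages: list of (cost_per_candidate, pass_rate) applied in sequence."""
    total_cost, remaining = 0.0, float(applicants)
    for cost, pass_rate in stages:
        total_cost += remaining * cost  # everyone entering the stage is tested
        remaining *= pass_rate          # only successful candidates continue
    return total_cost / remaining       # cost per finally selected candidate

staged = [(20, 0.5), (100, 0.5), (500, 0.8)]           # cheap screening first
expensive_first = [(500, 0.8), (100, 0.5), (20, 0.5)]  # same tests, reversed order
print(cost_per_hire(1000, staged), cost_per_hire(1000, expensive_first))
```

With identical tests and pass rates, the staged ordering yields the same number of selected candidates at a fraction of the total testing cost.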
Stage 1 - Screening
Stage 2A - Selection/screening
Stage 2B - Selection
Stage 2C - Flight simulator testing (Ready Entry, FO, CPT only)
Stage 3
Note: Full-Flight Simulators (FFS) seem ideal selection devices as they offer the highest degree of realism, and they are commonly used to test ready-entry pilots, first officers and captains. They are valuable for assessing type-specific flying skills, for determining the training measures needed to integrate newly hired staff into the organization, and for validating previous experience data provided. However, from a scientific perspective, the prediction of future performance is difficult in full-flight simulators because these devices do not produce the extent and quality of data needed for such diagnostics. FFS are normally not used to assess ab-initio cadets or ready-entry pilots with very low experience because of the high effort of instruction and pre-exercising that would be necessary before the assessment; it would also be difficult to measure performance clearly in the desired test-relevant measuring dimensions while avoiding measuring the ability to comprehend the given simulator instruction. Competency-based pilot training courses, especially the MPL course, combine multi-stage selection procedures with continuous/progressive assessment of pilot performance during the initial career.
Note: During MPL courses, the final hiring confirmation decision is normally taken after the successful completion of the course, at the end of the IOE phase. In this case, 15 to 18 months are available to continuously assess the performance of cadets/pilots. Competency-based training, including the subsequent course evaluation process, requires that continuous on-the-job performance data be recorded throughout an entire career. This data must be fed back into the selection system so that the tests can be continuously validated, allowing continuous improvement of the selection system. Data-rich selection systems are capable of coping with even small pools of applicants.
Measuring dimensions:
The survey showed that the highest number of measuring dimensions was applied during ab-initio testing (biographical data, mental abilities, pilot-specific competencies, personality traits and social abilities). With increasing flight experience (from ready-entry pilots to first officers and captains), the number of measuring dimensions and their methodical qualification were reduced (a fact that should be questioned) and flight checks in full-flight simulators were added (see Section 3.3, Lessons Learned from the Industry Survey).
Pre-knowledge of tests:
The same applies to the fact that tests cannot be kept secret over a long period of time. In many cases, organizations provide applicants with a general overview in advance; applicants who have completed the tests pass information to others, and career counselors and preparation institutes investigate test content to optimize their own preparation courses. Organizations should therefore establish a policy on how to protect their aptitude testing procedures.
10 FINANCIAL ASPECTS
Pilot aptitude testing is considered an integral part of hiring and staff retention; its costs should therefore be related to the overall costs of hiring and staff retention. Appropriate testing of staff and the effort needed to retain selected staff are closely correlated: good selection is perceived as a stabilizing factor for the organization and reduces the effort of staff retention. Although the costs of testing can be tracked easily, the costs of recruitment and hiring are more complex, and staff retention costs might be even more difficult to judge.
A variable requirement:
Depending on the market situation, the demand for pilots and the attractiveness of the organization looking for new staff are variables in a cycle. As a result, selection can be a permanent or a sporadic task, and a simple or an elaborate one. Most organizations today rely on the internet to advertise pilot jobs. However, staff members' personal contacts with potential applicants may also play an important role. Government incentives for recruiting, training or staff retention do not seem to be available in aviation.
Financial Aspects
5. Who pays?
   - the organization
   - the applicant
   - the applicant prepays and is reimbursed by the organization
   - the organization prepays and is reimbursed by the applicant

6. Which costs are shared among the involved parties?
   - costs of testing
   - costs of training
   - costs of administration
Expert outsourcing:
When building the hiring process, outsourcing of certain modules should be considered. A number of expert service providers offer professional pilot aptitude testing solutions at globally competitive prices. Some of these providers use the internet to administer tests and are therefore location-independent and cost-effective.
Conclusion:
A professionally administered Pilot Aptitude Testing (PAT) program may be viewed as a significant cost at first, but in relation to the far greater potential cost of a failed selection process, it must be seen as the first critical step in developing a piloting career. As such it can be seen as the cement poured into the foundations of an effective airline Safety Management System.
APPENDIX 1
Data Processing and Report, 18 December 2009, ITQI Selection Working Group
CONTENT
History of the Questionnaire
Description of the Sample
Why was the Questionnaire not Answered (Reasons)
Types of Operations Performed by the Answering Institutions
Results of the Questions
1. Kind of Personnel Employed
2. Amount and Costs of Pilot Selection
3. Preconditions for Being Accepted as a Candidate for Pilot Selection
4. Share of Costs the Candidates have to Pay and Incentives of the Government
5. Role of Government and Regulatory Authority
6. Specification of the Selection Concept with Regard to Special Groups
7. Evaluation of your Selection System and Structure
8. Lessons Learned and Changes Made
9. Specifications with Regard to Characteristics of the Own Company
10. Methodical Aspects of your Selection System
11. Composition of your Selection Team
12. Requirement Dimensions and Selection Procedure
13. Quality Management of the Concept
I. HISTORY OF THE QUESTIONNAIRE
The questionnaire was developed by the ITQI Working Group Selection from October 2008 until June 2009 and distributed online to Legacy Carriers, Regional Airlines, Business Aviation, Cargo Carriers, Pilot Training Organizations, Universities with Aviation Faculties and Pilot Selection Providers. The online survey started at the end of June 2009 and finished in early August 2009. From September till December 2009, data processing took place. Evaluation, analysis and the creation of the first draft of the relevant guidance material were finished in January 2010. The questionnaire was realized as an online concept which automatically presented the questions. Most of the questions could be answered with yes/no or by selecting one or more answering categories from a list of alternatives. In some cases, free-text answers were allowed.
II. DESCRIPTION OF THE SAMPLE
110 institutions logged in. 66 institutions filled in one or two parts of the questionnaire or the whole questionnaire. The following table II.1. shows how many institutions answered which part of the questionnaire.

Tab. II.1.: Frequencies of institutions answering parts of the questionnaire
Part       Frequency of institutions
Part I     53
Part II    19
Part III   19

53 institutions filled in part I, 19 filled in part II and 19 filled in part III (see Attachment 1). The following table II.2. shows how many institutions answered how many parts of the questionnaire.

Tab. II.2.: How many institutions answered how many parts of the questionnaire
No. of parts                Frequency of institutions
1 part                      53
2 parts (1+2; 2+3; 1+3)     1
3 parts (1+2+3)             12

53 institutions filled in only one part (1, 2 or 3). 1 institution filled in two parts (1 and 2, 2 and 3, or 1 and 3). 12 institutions filled in the whole questionnaire (all three parts).
The institutions which did not answer the questionnaire but answered the question above in no case stated that they do not have a formalized selection system in place (see table III.1, no. 4). The following table III.2. shows additional reasons for non-response mentioned by some institutions.

Tab. III.2.: Other reasons
No.   Category                                                                  Frequency   Percent
1     Our selection system has been outsourced to CAE, the so-called
      'Pilot Provisioning Program'                                              1           0.9
2     We are a maintenance training organization and do not recruit pilots      1           0.9
So there have also been institutions which logged in but did not answer the questionnaire because they did not perform any selection (see table III.2, no. 2).
IV. TYPES OF OPERATIONS PERFORMED BY THE ANSWERING INSTITUTIONS
In order to give a first impression of the target group which answered the questionnaire, question 35 is put at the very beginning of this report. It asks how many institutions perform which type(s) of operation(s).

Question 35: Which types of operations are you performing?

A total of 57 institutions dealt with this question, which was included in all parts of the questionnaire. 6 institutions did not make any input. 43 institutions mentioned between 1 and 7 types of operation. In total, 69 institutions marked one or more types of operation (in various combinations); there are 12 different combinations.

Tab. IV.1.: Types of operation
Type of operation                 No. of mentions   Cumulative
Corporate                         23                23
Regional                          8                 31
Legacy                            8                 39
Approved Training Organization    13                52
Cargo                             7                 59
Low Cost Carrier                  4                 63
Non Scheduled Charter             6                 69
Other operations                  4                 73
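The second numeric column of Tab. IV.1 is a running total of the per-type counts; a minimal sketch (in Python, with the counts transcribed from the table) reproduces it:

```python
from itertools import accumulate

# Per-type mention counts transcribed from Tab. IV.1
counts = [23, 8, 8, 13, 7, 4, 6, 4]

# Running totals, matching the cumulative column: 23, 31, 39, 52, 59, 63, 69, 73
print(list(accumulate(counts)))
```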
Beyond the types of operations predefined by the questionnaire, 4 institutions introduced additional operations. The following table IV.2. specifies this category.

Tab. IV.2.: Other operations
Category                                                                        Frequency
Aircraft manufacture, flying demonstration, training and transportation trips   1
Carrier                                                                         1
The following table IV.3. shows how many institutions perform which combinations of operations.

Tab. IV.3.: Types of operations and number of institutions
No.   Types of Operations                                                                      No. of Institutions
1     Corporate                                                                                14
2     Regional                                                                                 4
3     Legacy                                                                                   4
4     Approved Training Organization                                                           8
5     Cargo                                                                                    1
6     Low Cost Carrier                                                                         0
7     Non Scheduled Charter                                                                    0
8     Corporate / Regional                                                                     1
9     Corporate / Cargo                                                                        1
10    Regional / Approved Training Organization                                                1
11    Corporate / Non Scheduled Charter                                                        1
12    Corporate / Approved Training Organization / Cargo / Low Cost Carrier                    1
13    Regional / Approved Training Organization / Low Cost Carrier / Non Scheduled Charter     1
14    Corporate / Regional / Legacy / Approved Training Organization / Cargo /
      Low Cost Carrier / Non Scheduled Charter                                                 2
V. RESULTS OF THE QUESTIONS

1. Kind of Personnel Employed
Question 1: Which kind of personnel are you employing/recruiting?

This question was answered by 51 institutions. The following table 1.1. shows how many institutions employ which kind of target groups. Most of the institutions employ candidates from several different groups.

Tab. 1.1.: No. of institutions and type of groups employed by the answering institutions
Group                          No. of institutions
Ab Initio                      30
Ready Entry (low experience)   20
FOs                            27
Cpt.s                          18
Most institutions (n = 30) recruit Ab Initio candidates; the fewest (n = 18) recruit Cpt.s. The following table 1.2. shows in more detail how many institutions employ which types of staff. Some institutions do not employ any staff.

Tab. 1.2.: Candidates employed (rows 1 to 20)
No.   Ab Initio   Ready Entry (low experience)   FOs   Cpt.s   Sum of groups
1     1           0                              0     0       1
2     0           0                              0     1       1
3     0           1                              1     0       2
4     0           1                              0     0       1
5     0           1                              1     1       3
6     1           1                              1     0       3
7     0           0                              0     1       1
8     0           0                              0     0       0
9     1           1                              1     1       4
10    1           0                              0     0       1
11    1           0                              1     1       3
12    0           0                              0     0       0
13    1           1                              1     0       3
14    1           1                              0     0       2
15    1           0                              0     0       1
16    1           0                              0     0       1
17    1           1                              1     0       3
18    0           0                              0     0       0
19    1           0                              1     1       3
20    0           0                              1     0       1
Tab. 1.2. (continued): rows 21 to 51. Only the FOs, Cpt.s and sum columns are recoverable for these rows; the Ab Initio and Ready Entry columns are not shown.
FOs:             1 0 1 1 1 1 1 0 1 1 1 0 0 0 0 0 1 1 1 1 1 1 0 1 0 0 1 0 0 0 1
Cpt.s:           1 1 0 0 0 0 1 0 0 0 1 0 0 1 0 1 0 1 0 1 1 1 0 0 0 0 1 0 0 0 1
Sums of groups:  3 2 2 2 3 3 3 1 2 2 3 2 2 1 1 1 2 2 1 2 2 3 1 3 4 2 2 2 2 1 3
15 of the institutions employ only one group, 17 employ two different groups, 14 employ three different groups, and 2 employ candidates from every group. 3 institutions do not employ candidates from any group at all (entities which provide selection for others). As there is no formal definition of the expression 'low experience' to characterize the Ready Entry group, the institutions were asked how they apply this term.
The following table 1.3. shows the definitions of 'low experience' stated by the different institutions. The data are based on the answers of 20 institutions; each definition was stated by one institution.

Tab. 1.3.: Definition of low experience
No.   Category
1     <1500
2     <500hdv
3     200 hours
4     200 hrs CPL Multi IR
5     250 Flying hours & Multi-Engine
6     50 Flying hours (Licensed Pilots) or ex-military pilots with 1000 Fly
7     350 hrs heavy twin, fixed wing
8     4 year college students
9     500 h
10    500 hours
11    500 hrs
12    CIVIL AVIATION COLLEGE, 230 HOURS OF FLT TIME, ME, IR
13    Civil Aviation College, 230 Hours of flying time, ME, IR
14    CPL IR MPA CPL IR FI IRI CRI SFI TRI TRE
15    Graduation from Civil Aviation College. 230 Hours of flying. Holding ME, IR
16    Less than 2000 hr total with little or no jet experience
17    Less than 500h
18    Licensed specialists (i.e. Flight Navigators, Flight Engineers)
19    Low experience
The definitions show significant differences in the understanding of 'low experience'. Reasons for this could include the type of operation, the aircraft used, differences in education systems, legal requirements, and also historical influences.
2. Amount and Costs of Pilot Selection
Question 2: For which of the defined groups do you have a selection concept in place?

This question refers to the institutions which have a selection concept in place for one or several target groups. It was answered by 13 institutions; 10 of them confirmed having a concept for one or several of the defined target groups. Compared with the data from question 55 (see above), these are far fewer institutions (25/10). This difference reflects the different numbers of institutions which answered the respective questions.

Tab. 2.1.: Number of institutions which have their own selection concept in place
Category      Yes   Percent of 13   No   Percent of 13
Ab Initio     8     61.5            5    38.5
Ready Entry   6     46.2            7    53.8
FOs           7     53.8            6    46.2
Cpt.s         5     38.5            8    61.5
The underlying answers per institution (1 = selection concept in place; the Ready Entry column is not recoverable):
Ab Initio   1 1 1 0 1 1 0 1 1 1   (total 8)
FOs         0 1 0 1 1 1 1 1 0 1   (total 8)
Cpt.s       1 1 0 1 0 1 1 0 0 0   (total 5)
This data does not show any systematic connection between the selected groups and the performed operations. There are 8 institutions which select Ab Initios and 8 which select FOs; 6 select Ready Entries and 5 select Cpt.s. A perceptible trend toward selection systems covering several groups can be made out. Only one institution (no. 3) has a selection system for just one group (Ab Initio). The lines with no type of operations belong to the 5 institutions which perform selection (services, consulting) but no operations.
Question 3: How many candidates have been tested in total during the recent three years?

The following tables 2.4. to 2.15. show how many candidates of the different career groups were tested by the institutions in the years 2006, 2007 and 2008.
A. Ab Initio
12 institutions answered the question with regard to the year 2006.

Ab Initio: Year 2006
Tab. 2.4.: Tested Ab Initio candidates 2006
No.   Candidates   Institutions   Percent
1     0            1              8.3
2     10           1              8.3
3     21           1              8.3
4     36           1              8.3
5     49           1              8.3
6     100          1              8.3
7     116          1              8.3
8     135          1              8.3
9     204          2              16.7
10    1000         1              8.3
11    4862         1              8.3
Total of institutions:  12        100.0

In total, n = 6737 Ab Initio candidates were tested by the 12 represented institutions in the year 2006. 12 institutions answered the question with regard to the year 2007.

Ab Initio: Year 2007
Tab. 2.5.: Tested Ab Initio candidates 2007
No.   Candidates   Institutions   Percent
1     0            1              8.3
2     10           1              8.3
3     22           1              8.3
4     40           1              8.3
5     62           1              8.3
6     141          1              8.3
7     151          1              8.3
8     200          1              8.3
9     243          2              16.7
[rows 10 to 12 not recoverable]
Total of institutions:  12        100.0

All in all, n = 6527 Ab Initio candidates were tested by the 12 represented institutions in the year 2007. 15 institutions answered the question with regard to the year 2008.

Ab Initio: Year 2008
Tab. 2.6.: Tested Ab Initio candidates 2008
No.   Candidates   Institutions   Percent
1     12           1              6.7
2     20           1              6.7
3     25           1              6.7
4     35           1              6.7
5     40           1              6.7
6     46           1              6.7
7     50           1              6.7
8     130          1              6.7
9     177          1              6.7
10    200          1              6.7
11    280          1              6.7
12    300          2              13.3
13    1000         1              6.7
14    4997         1              6.7
Total of institutions:  15        100.0

In total, n = 7612 Ab Initio candidates were tested by the 15 represented institutions in the year 2008.
B. Ready Entry
The following tables 2.7. to 2.9. show how many candidates from the group of the Ready Entries were tested in the years 2006, 2007 and 2008. With regard to the year 2006, the question was answered by 10 institutions.

Ready Entry: Year 2006
Tab. 2.7.: Tested Ready Entry candidates 2006
No.   Candidates   Institutions   Percent
1     0            2              20.0
2     8            1              10.0
3     15           2              20.0
4     61           2              20.0
5     75           1              10.0
6     160          1              10.0
7     600          1              10.0
Total of institutions:  10        100.0

All in all, n = 995 Ready Entry candidates were tested by the 10 represented institutions in the year 2006. With regard to the year 2007, the question was answered by 11 institutions.

Ready Entry: Year 2007
Tab. 2.8.: Tested Ready Entry candidates 2007
No.   Candidates   Institutions   Percent
1     0            2              18.2
2     8            1              9.1
3     9            1              9.1
4     10           1              9.1
5     20           1              9.1
6     55           1              9.1
7     56           2              18.2
8     129          1              9.1
9     600          1              9.1
Total of institutions:  11        100.0

All in all, n = 943 Ready Entry candidates were tested by the 11 represented institutions in the year 2007. With regard to the year 2008, the question was answered by 13 institutions.
C. FOs
The following tables 2.10. to 2.12. show how many FO candidates were tested by the institutions. With regard to the year 2006, the question was answered by 9 institutions.

FOs: Year 2006
Tab. 2.10.: Tested FOs in 2006
No.   Candidates   Institutions   Percent
1     0            2              22.2
2     15           1              11.1
3     21           2              22.2
4     40           1              11.1
5     55           1              11.1
6     141          1              11.1
7     300          1              11.1
Total of institutions:  9         100.0

All in all, n = 593 FOs were tested by the 9 represented institutions in the year 2006.
With regard to the year 2007, the question was answered by 12 institutions.

FOs: Year 2007
Tab. 2.11.: Tested FOs in 2007
No.   Candidates   Institutions   Percent
1     0            1              8.3
2     3            1              8.3
3     5            1              8.3
4     9            1              8.3
5     17           1              8.3
6     22           1              8.3
7     60           1              8.3
8     65           2              16.7
9     134          1              8.3
10    300          1              8.3
11    750          1              8.3
Total of institutions:  12        100.0

All in all, n = 1430 FOs were tested by the 12 represented institutions in the year 2007. With regard to the year 2008, the question was answered by 15 institutions.

FOs: Year 2008
Tab. 2.12.: Tested FOs in 2008
No.   Candidates   Institutions   Percent
1     0            1              6.7
2     2            1              6.7
3     5            1              6.7
4     6            1              6.7
5     8            1              6.7
6     10           1              6.7
7     13           1              6.7
8     15           1              6.7
9     35           1              6.7
10    42           1              6.7
11    60           1              6.7
12    63           1              6.7
13    65           1              6.7
14    200          1              6.7
15    750          1              6.7
Total of institutions:  15        100.0

All in all, n = 1274 FOs were tested by the 15 represented institutions in the year 2008.
D. Cpt.s

The following tables 2.13. to 2.15. show how many Cpt. candidates were tested in the years 2006 to 2008 by the participating institutions. With regard to the year 2006, the question was answered by 10 institutions.

Cpt.s: Year 2006
Tab. 2.13.: Tested Cpt.s in 2006
No.   Candidates   Institutions   Percent
1     0            2              20.0
2     1            1              10.0
3     3            1              10.0
4     4            1              10.0
5     12           1              10.0
6     40           1              10.0
7     65           1              10.0
8     140          1              10.0
9     300          1              10.0
Total of institutions:  10        100.0

All in all, n = 565 Cpt.s were tested by the 10 represented institutions in the year 2006. With regard to the year 2007, the question was answered by 9 institutions.

Cpt.s: Year 2007
Tab. 2.14.: Tested Cpt.s in 2007
No.   Candidates   Institutions   Percent
1     0            2              22.2
2     10           1              11.1
3     11           1              11.1
4     19           1              11.1
5     55           1              11.1
6     60           1              11.1
7     78           1              11.1
8     300          1              11.1
Total of institutions:  9         100.0

All in all, n = 533 Cpt.s were tested by the 9 represented institutions in the year 2007.
With regard to the year 2008, the question was answered by 14 institutions.

Cpt.s: Year 2008
Tab. 2.15.: Tested Cpt.s in 2008
No.   Candidates   Institutions   Percent
1     0            3              21.4
2     1            2              14.3
3     2            1              7.1
4     5            1              7.1
5     10           2              14.3
6     19           1              7.1
7     60           2              14.3
8     69           1              7.1
9     100          1              7.1
Total of institutions:  14        100.0

All in all, n = 337 Cpt.s were tested by the 14 represented institutions in the year 2008.

Question 4: How many candidates passed the selection process successfully in the last year?

The following tables 2.16. to 2.19. show how many candidates of the different groups passed the selection within the different institutions in 2008. The following question was answered by 19 institutions.

Ab Initio
Tab. 2.16.: Passed Ab Initio candidates in 2008
No.   Candidates   Institutions   Percent
1     0            2              10.5
2     5            1              5.3
3     6            1              5.3
4     9            1              5.3
5     12           3              15.8
6     13           1              5.3
7     14           1              5.3
8     16           1              5.3
9     31           1              5.3
10    61           2              10.5
11    80           2              10.5
12    162          1              5.3
13    200          1              5.3
14    384          1              5.3
Total of institutions:  19        100.0

With regard to the Ab Initios, at the answering institutions n = 1158 candidates passed the selection process successfully.
[Tab. 2.17. (passed Ready Entry candidates in 2008) is not recoverable.]

With regard to the Ready Entries, at the answering institutions n = 250 candidates passed the selection process successfully. The following question was answered by 14 institutions.

FOs
Tab. 2.18.: Passed FO candidates in 2008
No.   Candidates   Institutions   Percent
1     2            2              14.3
2     3            1              7.1
3     4            1              7.1
4     6            1              7.1
5     7            1              7.1
6     8            1              7.1
7     10           1              7.1
8     11           1              7.1
9     20           1              7.1
10    45           1              7.1
11    100          1              7.1
12    350          1              7.1
13    500          1              7.1
Total of institutions:  14        100.0

With regard to the FOs, at the answering institutions n = 1068 candidates passed the selection process successfully.
The following question was answered by 13 institutions.

Cpt.s
Tab. 2.19.: Passed Cpt. candidates in 2008
No.   Candidates   Institutions   Percent
1     1            2              15.4
2     2            1              7.7
3     5            1              7.7
4     8            2              15.4
5     9            1              7.7
6     12           1              7.7
7     17           1              7.7
8     20           1              7.7
9     28           1              7.7
10    100          1              7.7
11    130          1              7.7
Total of institutions:  13        100.0

With regard to the Cpt.s, at the answering institutions n = 341 candidates passed the selection process successfully.

Question 5: How many candidates did you hire (give a contract to) in the last year?

The following tables 2.20. to 2.23. show how many candidates from the different career groups were hired by the answering institutions in the year 2008. The following question was answered by 15 institutions.

Ab Initio
Tab. 2.20.: Hired Ab Initio candidates in 2008
No.   Candidates   Institutions   Percent
1     5            1              6.7
2     6            1              6.7
3     9            1              6.7
4     12           2              13.3
5     14           1              6.7
6     15           1              6.7
7     31           1              6.7
8     35           1              6.7
9     58           1              6.7
10    60           2              13.3
11    140          1              6.7
12    168          1              6.7
13    199          1              6.7
Total of institutions:  15        100.0

All in all, n = 824 Ab Initio candidates were hired by the answering institutions.
The following question was answered by 15 institutions.

Cpt.s
Tab. 2.23.: Hired Cpt. candidates in 2008
No.   Candidates   Institutions   Percent
1     0            1              6.7
2     1            3              20.0
3     2            1              6.7
4     3            1              6.7
5     5            1              6.7
6     8            1              6.7
7     9            1              6.7
8     17           1              6.7
9     19           1              6.7
10    20           1              6.7
11    22           1              6.7
12    28           1              6.7
13    130          1              6.7
Total of institutions:  15        100.0

All in all, n = 266 Cpt. candidates were hired by the answering institutions. The following table 2.24. shows the overall number of institutions in each case, as well as the tested, accepted and hired candidates with regard to the different groups. Please keep in mind that the values of tested, passed and hired candidates in table 2.24. cannot be correlated column by column, as different numbers of institutions are contained in the cumulative values.

Tab. 2.24.: Tested, accepted and hired candidates with regard to the different groups in 2008
Group       Inst.   Tested   Inst.   Passed   Inst.   Hired
Ab Initio   16      12609    19      1158     15      824
Ready       13      868      11      250      10      181
FOs         15      1274     14      1068     15      1000
Cpt.s       4       341      13      341      15      266

The table above shows that the biggest fluctuation exists in the Ab Initio sector; the smallest applies to Cpt.s. The following tables 2.25. to 2.28. show the relations between tested and passed candidates of the four groups. In this case the two data sets refer to the same institutions and groups. The following table 2.25. indicates the relation between tested and passed Ab Initio candidates at the different institutions.
FOs
Tab. 2.27.: Tested and passed FO candidates
No.   Tested in 2008   Passed the selection process
1     6                2
2     10               6
3     8                4
4     15               2
5     5                3
6     750              500
7     60               10
8     42               11
9     200              100
10    13               7
11    65               20
12    63               45
13    2                8

The following table 2.28. indicates the relation between tested and passed Cpt. candidates at the different institutions.

Cpt.s
Tab. 2.28.: Tested and passed Cpt. candidates
No.   Tested in 2008   Passed the selection process
1     1                1
2     60               28
3     10               8
4     5                8
5     60               12
6     100              100
7     10               9
8     1                1
9     2                2
10    69               20
11    0                5

The selection rate, i.e. the number of accepted applicants in relation to the number of tested applicants, varies significantly between the particular institutions. The following table 2.29. is based on the data of 26 institutions and shows how many institutions employed how many candidates in which combinations. It could be misleading to relate these figures to the figures of the relation between tested and successful candidates (cf. Tab. 2.25. to 2.28.), because the employed persons often do not refer to the same year, and successful candidates often do not get employed in the year of the selection.
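The selection rate described above (accepted applicants divided by tested applicants) can be computed directly from such tested/passed pairs. A minimal sketch, using a few pairs transcribed from Tab. 2.28 (entries with zero tested candidates are skipped to avoid division by zero):

```python
# (tested, passed) pairs for Cpt. candidates, transcribed from Tab. 2.28
pairs = [(1, 1), (60, 28), (10, 8), (60, 12), (100, 100), (69, 20)]

for tested, passed in pairs:
    rate = passed / tested  # selection rate per institution
    print(f"tested={tested:3d}  passed={passed:3d}  rate={rate:.1%}")
```

The spread of the printed rates, from roughly 20% up to 100%, illustrates how strongly the selection rate varies between institutions.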
The number of hired candidates in the group of the Ab Initios ranges from 5 to 350, in the group of the Ready Entries from 0 to 73, in the group of the FOs from 2 to 500 and in the group of the Cpt.s from 0 to 130. The group of the FOs shows the biggest fluctuation.
The following table 2.30. is based on the data of 26 institutions and gives an overview of the number of hired candidates in relation to the type of operation. The Ab Initio and sum columns could not be recovered; a dot marks an empty cell.

Tab. 2.30.: Hired candidates with regard to operations (1 = operation performed)
No.   Corp.  Reg.  Leg.  ATO  LCC  Cargo  Non-sch.   Ready Entry  FOs  Cpt.s
1     1      0     0     0    0    0      0          .            2    1
2     1      0     0     0    0    0      0          .            .    2
3     1      0     0     0    0    0      0          73           6    28
4     1      0     0     0    0    0      0          .            4    8
5     1      0     0     0    0    0      0          2            2    .
6     0      0     0     1    0    0      0          .            .    .
7     0      0     0     0    1    0      0          .            35   19
8     0      1     0     0    0    0      0          .            2    3
9     1      0     0     0    0    0      0          .            .    17
10    1      0     0     0    0    0      0          16           .    .
11    0      0     0     0    0    0      0          .            500  .
12    0      1     0     1    0    0      0          8            18   22
13    0      1     0     0    0    0      0          .            11   .
14    1      0     1     1    1    1      1          0            24   0
15    1      0     0     1    1    1      0          56           .    .
16    0      0     0     1    0    0      0          10           .    .
17    0      0     0     1    0    0      0          .            .    .
18    0      1     0     0    0    0      0          .            7    9
19    1      0     0     0    0    0      0          .            .    1
20    1      0     0     0    0    0      0          .            .    1
21    1      0     0     0    0    0      0          .            20   20
22    1      0     0     0    0    0      0          4            11   .
23    1      0     0     0    1    0      1          6            .    .
24    0      0     0     0    0    0      0          .            350  130
25    1      0     0     0    1    0      1          6            .    .
26    0      0     1     0    0    0      0          .            8    5
The following tables 2.31. to 2.34. show in which year the different institutions installed their selection systems for the different groups. The following question was answered by 23 institutions.

Ab Initio
Tab. 2.31.: Selection system in place: Ab Initio
No.   In place since   Frequency   Percent
1     1956             1           4.3
2     1973             2           8.7
3     1988             3           13.0
4     1996             1           4.3
5     1998             1           4.3
6     1999             2           8.7
7     2003             2           8.7
8     2004             1           4.3
9     2005             2           8.7
10    2006             4           17.4
11    2007             2           8.7
12    2008             2           8.7
Total of institutions:  23         100.0

The range goes from 1956 to 2008. The following question was answered by 17 institutions.

Ready Entry
Tab. 2.32.: Selection system in place: Ready Entry
No.   In place since   Frequency   Percent
1     1955             2           11.8
2     1956             1           5.9
3     1980             1           5.9
4     1988             1           5.9
5     1992             1           5.9
6     1994             1           5.9
7     1995             1           5.9
8     1998             1           5.9
9     1999             1           5.9
10    2000             2           11.8
11    2003             1           5.9
12    2004             1           5.9
13    2005             1           5.9
14    2007             1           5.9
15    2008             1           5.9
Total of institutions:  17         100.0
The range goes from 1955 to 2008. The following question was answered by 23 institutions.

FOs
Tab. 2.33.: Selection system in place: FOs
No.   In place since   Frequency   Percent
1     1946             1           4.3
2     1956             1           4.3
3     1970             1           4.3
4     1980             1           4.3
5     1994             1           4.3
6     1995             1           4.3
7     1998             2           8.7
8     1999             1           4.3
9     2000             3           13.0
10    2003             1           4.3
11    2004             1           4.3
12    2005             2           8.7
13    2006             1           4.3
14    2007             5           21.7
15    2008             1           4.3
Total of institutions:  23         100.0

The range goes from 1946 to 2008. The following question was answered by 16 institutions.

Cpt.s
Tab. 2.34.: Selection system in place: Cpt.s
No.   In place since   Frequency   Percent
1     1970             1           6.3
2     1980             1           6.3
3     1989             1           6.3
4     1994             1           6.3
5     1998             1           6.3
6     1999             1           6.3
7     2000             3           18.8
8     2003             1           6.3
9     2004             1           6.3
10    2006             2           12.5
11    2007             3           18.8
Total of institutions:  16         100.0

The range goes from 1970 to 2007.
9 of the 33 institutions which answered question 7 offer selection for other companies, too; 24 institutions do not. The following table 2.36. shows how many institutions offer selection services to other companies, and for which target group. This question was answered by 42 institutions (answers with regard to several groups were possible).

Tab. 2.36.: Institutions offering selection services for other companies
Group         Yes   Percent (Yes)   No   Percent (No)
Ab Initio     7     16.7            35   83.3
Ready Entry   7     16.7            35   83.3
FOs           6     14.3            36   85.7
Cpt.s         3     7.1             39   92.9

Question 8:
What are the costs of your selection per candidate and group (US$ per candidate)?
The following table 2.37. shows which costs the institutions incur for the selection per candidate in the different groups. The left column contains the cost levels; the columns to the right show how many institutions have such costs for which type of selection and group. The table is only partially recoverable.

Tab. 2.37.: Costs for selection per candidate in different career groups (partially recoverable)
Cost levels (US$): 2000, 2940, 4110
Ab Initio Outsourced: 0 0 0 / Ab Initio In House: 1 / Ready Entry Outsourced: 0 0 0 / FOs Outsourced: 1 1 1 / Capt.s Outsourced: 1 / Capt.s In House: 0 0 0
Question 9: What are the costs per new hire, including advertisement, selection and administration (US$ per candidate/group)?
This question was answered by 7 institutions. The following table 2.38. shows which costs per new hire were stated how often by the different institutions. The left column lists the cost levels; the following columns show how many institutions stated such costs per new hire for each group.

Tab. 2.38.: Costs per new hire (frequencies of institutions on the respective cost level, per group)
Level: costs (US$)   Ab Initio (n = 7)   Ready Entry (n = 5)   FOs (n = 5)   Capt.s (n = 4)
0                    3                   2                     1             1
500                  1                   0                     0             0
1000                 1                   0                     0             1
1500                 0                   0                     1             1
4000                 0                   0                     1             0
4500                 0                   1                     0             0
5000                 1                   1                     0             0
8000                 0                   0                     1             1
12000                0                   0                     1             0
13050                1                   0                     0             0
16000                0                   1                     0             0

The costs per Ab Initio new hire range from 0 to 13,050 US$. The costs per Ready Entry new hire range from 0 to 16,000 US$. The costs per FO new hire range from 0 to 12,000 US$. The costs per Cpt. new hire range from 0 to 8,000 US$. For the hiring of captains, the lowest costs are stated.
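The frequency rows of Tab. 2.38 can be condensed into simple per-group summaries; a minimal sketch for the Ab Initio column (cost level in US$ mapped to the number of institutions reporting it, as transcribed above):

```python
# Cost level (US$) -> number of Ab Initio institutions (n = 7), from Tab. 2.38
ab_initio = {0: 3, 500: 1, 1000: 1, 5000: 1, 13050: 1}

n = sum(ab_initio.values())
lo, hi = min(ab_initio), max(ab_initio)
mean = sum(cost * count for cost, count in ab_initio.items()) / n
print(f"n={n}, range={lo}-{hi} US$, mean={mean:.0f} US$")
```

This reproduces the 0 to 13,050 US$ range stated above; the frequency-weighted mean adds a summary figure the table itself does not report.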
Question 10:
What actions do you take to ensure a sufficient number of applicants (several answers are allowed)?
This question was answered by 41 institutions. The following table 2.39. shows how often the different institutions use a certain type of advertisement.

Tab. 2.39.: Number of institutions which use a certain type of advertisement
No.   Means of advertisement                               No. of institutions
1     General newspapers                                   12
2     Specialized aeronautical publications                13
3     Homepage                                             25
4     Personal contacts of staff to potential applicants   12
5     Organized meetings with schools/companies            11
Obviously, the homepage is by far the most frequently used type of advertisement. Concerning the other means, there is no significant difference in the frequency of their use.
Question 11: Which type of advertisement is, according to your experience, the most effective one?
This question was answered by 31 institutions. The following table 2.41. gives an overview of the types of advertisement which were judged to be the most effective by the institutions.

Tab. 2.41.: Effectiveness of types of advertisement
No.   Category                                                 Judged most effective   Percent
1     Advertisement in general newspapers                      4                       12.9
2     Advertisement in specialized aeronautical publications   4                       12.9
3     Homepage                                                 15                      48.4
4     Personal contacts of staff to potential applicants       7                       22.6
5     Organized meetings with schools/companies                1                       3.2
Total of institutions/judgements:                              31                      100.0
The homepage was judged to be by far the most effective type of advertisement.

Question 12: How much do you invest per year in recruitment?
The following table 2.42. shows how much the 5 answering institutions spend on recruitment per year.

Tab. 2.42.: Investment in recruitment
No.   Level of costs (US$)   Frequency of institutions   Percent
1     0                      2                           40.0
2     300,000                1                           20.0
3     400,000                1                           20.0
4     4,900,000              1                           20.0
Total of institutions:       5                           100.0
3. Preconditions for Being Accepted as a Candidate for Pilot Selection
Question 13: Are there any preconditions for the candidates to be accepted in your selection process?
The question was answered by 40 institutions. The following table 3.1. shows how many institutions establish preconditions for the acceptance of a candidate for the selection.

Tab. 3.1.: Preconditions for acceptance of candidates in selection
No.   Category   Frequency   Percent
1     Yes        36          90.0
2     No         4           10.0
Total of institutions:  40    100.0

36 of the 40 answering institutions stated that they have preconditions for their selection. These are necessary because of the requirements of the pilot profession. There are significant differences, however, in the type of required preconditions and the level at which they are established (cf. the following tables).

Ab Initio: Are there any preconditions for the candidates to be accepted in your selection process? The question was answered by 36 institutions. The following table 3.2. shows which preconditions the institutions require for admission to the selection process concerning the Ab Initio group.

Tab. 3.2.: Preconditions: Ab Initio
No.   Category                        Frequency (yes)   Percent of n = 36
1     Gender                          1                 2.8
2     Age                             16                44.4
3     School degree                   15                41.7
4     School marks                    7                 19.4
5     Flying hours                    8                 22.2
6     Nationality                     8                 22.2
7     Ethnic groups                   0                 0
8     Knowledge of foreign language   3                 8.3
9     Other preconditions             6                 16.7
6 of the 36 institutions which answered the question require preconditions other than the predefined categories.
The minimum age ranges from 18 to 21; one institution stated a minimum age of 17. The higher values refer to the upper age limit, which ranges from 26 to 45 years.

Specifications with regard to school degree: Ab Initio
Tab. 3.4.: Specifications: school degree (each stated by one institution)
No.   Category
1     >= bac+1
2     A university graduate
3     Bachelor's Degree
4     Bachelor
5     Fachgebundene oder allgemeine Hochschulreife (German university entrance qualification)
6     GCE A Levels
7     Graduated High School
8     Graduation of university
9     HIGH SCHOOL
10    High School Diploma
11    High school graduated
12    High school/university/bachelor
13    Irish Leaving Cert
14    Pre-college and college
15    Sciences preferred
Appendix 1
Specifications with regard to school marks: Ab Initio
Tab. 3.5.: Specifications: school marks
No.  Categories                 Frequency
1    A level in Physics, Maths  1
2    Any subjects               2
3    College degree preferred   1
4    English, Math, Science     1
5    Maths and physics          1
6    SCIENCE                    1
Mathematics and Physics were stated most frequently.

Specifications with regard to flying hours: Ab Initio
Tab. 3.6.: Specifications: flying hours
No.  Categories                                          Frequency
1    0                                                   2
2    235                                                 1
3    250                                                 1
4    250 hr/ME IR                                        1
5    500 hours                                           1
6    500/ME/IR for ZFTT, 150/ME/IR for full type rating  1
7    Minimum 200                                         1
The range goes from 0 to 500 flying hours. There seem to be different definitions of the term "Ab Initio". It may also be that the question was not clear to everybody and that some institutions stated the flying hours accumulated during their respective ab-initio course.

Ready Entry: Are there any preconditions for the candidates to be accepted in your selection process? This question was answered by 36 institutions. The following table 3.7. shows how often institutions define preconditions for accepting Ready Entry candidates for selection.

Tab. 3.7.: Preconditions: Ready Entry
No.  Categories                     Frequency (yes)  Percent of n = 36
1    Gender                         1                2,8
2    Age                            6                16,7
3    School degree                  5                13,9
4    School marks                   4                11,1
5    Flying hours                   9                25,0
6    Nationality                    5                13,9
7    Ethnic groups                  0                0
8    Knowledge of foreign language  5                13,9
9    Other preconditions            2                5,6
2 institutions introduced types of preconditions which were not given in the questionnaire.
Specifications with regard to flying hours (number and specifications)
Tab. 3.11.: Specifications: flying hours
No.  Categories                                                    Frequency  Percent
1    200                                                           1          ,9
2    250 hours CPL/IR with frozen ATPL subjects                    1          ,9
3    250 hours for licenced pilots/1000 hours for military pilots  1          ,9
4    350 hr heavy twin multi engine                                1          ,9
5    500                                                           3          2,7
6    500 hours                                                     1          ,9
7    500 on type                                                   1          ,9

The requirement of flying hours as a precondition ranges from 200 to 500 hours.
Specifications with regard to nationality
Tab. 3.12.: Specifications: nationality
No.  Categories  Frequency  Percent
1    Brasilian   1          ,9
2    EEC         1          ,9
3    EU          1          ,9
4    Korean      1          ,9
5    Maltese     1          ,9

Specifications with regard to ethnic groups
Tab. 3.13.: Specifications: ethnic groups
No.  Categories  Frequency  Percent
1    0           110        100,0

Specifications with regard to knowledge of a foreign language (in the case the native language is English)
Tab. 3.14.: Specifications: foreign language
No.  Categories                       Frequency  Percent
1    0                                31         86,1
2    Knowledge of a foreign language  5          13,9
Total                                 36         100,0
The following tables 3.15. - 3.16. contain statements on preconditions which were not predefined in the questionnaire.

Other preconditions?
Tab. 3.15.: Other preconditions
No.  Categories            Frequency  Percent
1    0                     34         94,4
2    Other preconditions?  2          5,6
Total of institutions      36         100,0
Specifications with regard to other preconditions
Tab. 3.16.: Specifications: other preconditions
No.  Categories                                  Frequency  Percent
1    Civil Aviation College graduate             1          ,9
2    Graduation from the Civil Aviation College  1          ,9

FOs: Are there any preconditions for the candidates to be accepted in your selection process? This question was answered by 36 institutions.
The number of flying hours plays the biggest role for FOs. The following tables 3.18. - 3.23. specify the categories used in table 3.17. for the group of FOs.

Specifications with regard to age
Tab. 3.18.: Specifications: age
No.  Age limits        Frequency
1    18-37             1
2    18                1
3    21                1
4    58                1
5    max 40 years      1
6    Minimum 18 years  1
Values for the minimum and the maximum age are specified. The minimum age ranges from 18 to 21 years and the maximum age from 37 to 58 years.

Specifications with regard to school degree
Tab. 3.19.: Specifications: school degree
No.  Categories                    Frequency
1    Fachgeb. oder allg. HS-Reife  1
2    High School Diploma           1
3    Post-secondary entrance       1
4    University                    1
High school diplomas and even university degrees are required as preconditions for selecting FOs.

Specifications with regard to school marks
Tab. 3.20.: Specifications: school marks
No.  Categories   Frequency
1    High School  1

1 institution asks for high school marks as a precondition for FOs.
Specifications with regard to flying hours (number and specifications)
Tab. 3.21.: Specifications: flying hours
No.  Categories                                               Frequency
1    1000 hours on high performance types                     1
2    1500                                                     2
3    200 hour                                                 1
4    200 total                                                1
5    2000                                                     1
6    250                                                      1
7    250 hr ME IR                                             1
8    500                                                      1
9    500 hours                                                1
10   600 total time, 300 MCC                                  1
11   Depends on age                                           1
12   Min. 1500 TT with min. 500 jet or turboprop >5,7 t MTOW  1
13   Minimum 200                                              1
14   Minimum 2000 hrs fixed wing                              1
15   Total 1000 hours/500 hours for applying A/C type         1
Also with regard to flying hours the requirements show substantial variety; the range goes from 200 to 2000 hours.

Specifications with regard to nationality
Tab. 3.22.: Specifications: nationality
No.  Categories                        Frequency
1    Brasilian                         1
2    EEC                               1
3    EU                                1
4    Jordanian                         1
5    Nationals of Trinidad and Tobago  1
6    Vietnamese & others               1
Some institutions require a specific nationality; others restrict eligibility to political entities such as the EU.

Specifications with regard to ethnic groups
Tab. 3.23.: Specifications: ethnic groups
No.  Categories  Frequency  Percent
1    0           110        100,0
Cpt.s: Are there any preconditions for the candidates to be accepted in your selection process? This question was answered by 36 institutions. The following table 3.25. shows the frequencies of certain profiles of preconditions the institutions defined for their selection of Cpt.s.

Survey: Cpt.s: preconditions for the candidates to be accepted for which group?
Tab. 3.25.: Preconditions: Cpt.s
No.  Categories                     Frequency (yes)  Percent of n = 36
1    Gender                         1                2,8
2    Age                            5                14,0
3    School degree                  3                8,4
4    School marks                   1                2,8
5    Flying hours                   12               33,6
6    Nationality                    4                11,2
7    Ethnic groups                  0                0,0
8    Knowledge of foreign language  2                5,6
9    Other preconditions            5                14,0
The following tables 3.26. to 3.31. show the specifications of criteria for the Cpt.s.

Specifications with regard to age
Tab. 3.26.: Specifications: age
No.  Age limits        Frequency  Percent
1    18                1          ,9
2    58                1          ,9
3    Minimum 18 years  1          ,9
4                                 ,9
5                                 ,9
The range for the minimum age goes from 18 to 21 years and for the maximum age from 57 to 58.
Specifications with regard to school degree
Tab. 3.27.: Specifications: school degree
No.  Categories           Frequency  Percent
1    BS                   1          ,9
2    High School Diploma  1          ,9
3    University           1          ,9

Specifications with regard to school marks
Tab. 3.28.: Specifications: school marks
No.  Categories   Frequency  Percent
1    High school  1          ,9

Specifications with regard to flying hours (number and specifications)
Tab. 3.29.: Specifications: flying hours
No.  Categories                                                  Frequency  Percent
1    1500 hours                                                  1          ,9
2    2 000 hr PIC on Jet more 20 tons                            1          ,9
3    3 000                                                       1          ,9
4    3 000 hours                                                 1          ,9
5    3 500                                                       1          ,9
6    5 000                                                       1          ,9
7    6 000                                                       1          ,9
8    7 000 hours total, 3 000 PIC, international routes          1          ,9
9    8000                                                        1          ,9
10   Min. 3 000 TT with min. 2 000 jet or turboprop >5,7 t MTOW  1          ,9
11   Minimum 3000                                                1          ,9
12   Total 5 000 hours/500 hours of PIC time                     1          ,9

The required minimum experience ranges from 1500 to 8000 hours.

Specifications with regard to nationality
Tab. 3.30.: Specifications: nationality
No.  Categories                        Frequency  Percent
1    Brasilian                         1          ,9
2    EU                                1          ,9
3    Nationals of Trinidad and Tobago  1          ,9
4    Vietnamese & others               1          ,9
Specifications with regard to ethnic groups
Tab. 3.31.: Specifications: ethnic groups
No.  Categories  Frequency  Percent
1    0           110        100

Ethnic groups are neither excluded nor privileged by any of the institutions.
The following tables 3.33. to 3.36. show how long the above-mentioned specifications have been in place for the different groups.

Ab Initio
Tab. 3.33.: Preconditions in place: Ab Initio
No.  Categories                  Frequency  Percent
1    1 year                      1          ,9
2    10                          1          ,9
3    10 years                    2          1,8
4    10 years; review each year  1          ,9
5    1956                        1          ,9
6    20                          1          ,9
7    2006                        1          ,9
8    2007                        1          ,9
9    21 years                    1          ,9
10   3                           1          ,9
11   3 years                     2          1,8
12   Always                      1          ,9
13   For 36 years                2          1,8
14   More than 20 years          1          ,9
15   Since 2003                  1          ,9
16   Since 2008                  1          ,9

The range for Ab Initios goes from 1 to 36 years.

Ready Entry
Tab. 3.34.: Preconditions in place: Ready Entry
No.  Categories  Frequency  Percent
1    1           1          ,9
2    10          1          ,9
3    10 years    2          1,8
4    1999        1          ,9
5    2 years     1          ,9
6    20          1          ,9
7    20 years    1          ,9
8    2000        1          ,9
9    2007        1          ,9
Tab. 3.34.: Preconditions in place: Ready Entry (continued)
No.  Categories          Frequency  Percent
10   6 years             1          ,9
11   For 54 years        2          1,8
12   ID                  1          ,9
13   More than 20 years  1          ,9

The range for Ready Entries goes from 1 to 54 years.

FOs
Tab. 3.35.: Preconditions in place: FOs
No.  Categories        Frequency  Percent
1    1                 1          ,9
2    10 years          3          2,7
3    1970              1          ,9
4    1995              1          ,9
5    1999              1          ,9
6    2                 1          ,9
7    2 years           1          ,9
8    2000              1          ,9
9    2005              1          ,9
10   2006              1          ,9
11   2007              2          1,8
12   30                1          ,9
13   5                 1          ,9
14   6 years           1          ,9
15   at least 2 years  1          ,9
16   ID                1          ,9

The range for FOs goes from 1 to 30 years.

Cpt.s
Tab. 3.36.: Preconditions in place: Cpt.s
No.  Categories          Frequency  Percent
1    10 years            2          1,8
2    1970                1          ,9
3    1995                1          ,9
4    2 years             1          ,9
5    20                  1          ,9
6    2006                1          ,9
7    2007                2          1,8
8    3                   1          ,9
9    30                  1          ,9
10   6 years             1          ,9
11   More than 20 years  1          ,9

The range for Cpt.s goes from 1 to 30 years.
Question 15: How much does the candidate contribute toward the costs of selection?
The following table 4.1. shows the amounts candidates in the different groups contribute toward the costs of selection.

Tab. 4.1.: Contribution toward costs of selection
No.  Categories           Amount (US$)
1    Ab initio (n = 7)    0
2    Ready entry (n = 5)  0
3    FOs (n = 6)          0
4    Cpt.s (n = 4)        500

In 4 institutions the Cpt.s have to pay US$ 500 for selection.

Question 16: How much does the candidate contribute toward the costs of training?
The following tables 4.2. to 4.5. show the contributions of candidates in the different groups toward the costs of training. This question was answered by 7 institutions.

Ab Initio
Tab. 4.2.: Contributions towards the costs of training
No.  Categories  Frequency  Percent
1    0           4          57,1
2    45 000      1          14,3
3    84 000      1          14,3
4    120 000     1          14,3
Total            7          100,0

In the Ab Initio group the share ranges from 45 000 to 120 000 US$.

The following table contains data from 5 institutions.

Ready Entry
Tab. 4.3.: Contributions towards the costs of training
No.  Categories  Frequency  Percent
1    0           4          80,0
2    30 000      1          20,0
Total            5          100,0

1 institution stated an amount of 30 000 US$ for the Ready Entry group.
The following table contains the data of 6 institutions.

FOs
Tab. 4.4.: Contributions towards the costs of training
No.  Categories  Frequency  Percent
1    0           6          100,0

None of the institutions requires FOs to share the costs of training.

The data in the following table are based on the answers of 4 institutions.

Cpt.s
Tab. 4.5.: Contributions towards the costs of training
No.  Categories  Frequency  Percent
1    0           4          100,0

None of the institutions requires Cpt.s to share the costs of training.
5.
Questions 17 - 19:
The following table 5.1. shows how many institutions get any government incentives for recruiting (Q. 17), training (Q. 18) and staff retention (Q. 19).

Survey of questions 17 to 19:
Tab. 5.1.: Government incentives
No.  Categories                Amount (US$)
1    Recruiting (n = 10)       0
2    Training (n = 11)         0
3    Staff retention (n = 11)  0

None of the institutions gets any government support in terms of funding for recruitment, training or staff retention.

Question 20: Does your state have a legal requirement for the selection of pilots besides ICAO medical provisions and language proficiency?
The following tables 5.2. to 5.5. show which legal requirements influence the type of selection for the different groups.
The data in the following table 5.3. are based on the answers of 3 institutions.

Ready Entry
Tab. 5.3.: Legal requirements of the government
No.  Categories                                     Frequency
1    Class 1                                        1
2    Medical clearance by state aviation authority  1
3    Security check                                 1
The data in the following table 5.4. are based on the answers of 3 institutions.

FOs
Tab. 5.4.: Legal requirements of the government
No.  Categories           Frequency
1    NATIONAL LICENSE     1
2    Level 4 and Class 1  1
The data in the following table 5.5. are based on the answers of 3 institutions.

Cpt.s
Tab. 5.5.: Legal requirements of the government
No.  Categories                                     Frequency
1    FAA, JARS OR NATIONAL LICENSE                  1
2    Medical clearance by state aviation authority  1

Question 21: Does your reg. authority perform any selection in addition to ICAO medical provisions and language proficiency?
The following table 5.6. shows the number of institutions confirming that the reg. authority performs selection in addition to medical provisions and language proficiency.

Tab. 5.6.: Additional type of selection
No.  Categories  Frequency  Percent
1    Yes         1          3,1
2    No          31         96,9
Total            32         100,0

One institution stated that the reg. authority performs additional selection. The following table 5.7. specifies the type of selection the government performs; the data refer to one institution only.

Tab. 5.7.: Additional type of selection
No.  Groups       Type of selection  No. of institutions
1    Ab initio    ELP Level 4        1
2    Ready entry  ELP Level 4        1
3    FOs          ELP Level 4        1
4    Cpt.s        ELP Level 4        1

Question 22:
Does your reg. authority delegate any selection in addition to ICAO medical provisions and language proficiency?
This question was answered by 31 institutions. The following table 5.8. shows at how many institutions the reg. authority delegates selection in addition to ICAO medical provisions and language proficiency.

Tab. 5.8.: Delegation of additional selection
No.  Categories  Frequency  Percent
1    Yes         3          2,7
2    No          28         25,5
Total            31         28,2
3 institutions stated that delegation happens; 28 institutions stated that it does not.
This question was answered by 31 institutions. The following table 5.10. shows at how many institutions the reg. authority supervises selection according to certain criteria.

Tab. 5.10.: Supervision of selection by the government
No.  Categories  Frequency  Percent
1    Yes         4          3,6
2    No          27         24,5
Total            31         28,2

4 institutions reported that the reg. authority supervises selection in addition to ICAO medical provisions and language proficiency; in 27 cases the institutions denied any supervision by the reg. authority. The following table 5.11. specifies the statements of the 4 institutions which confirmed supervision.

Tab. 5.11.: Specification of supervised selection
No.  Groups       Type of selection  No. of institutions
1    Ab initio    ELP Level 4        1
2    Ready entry  ELP Level 4        1
3    FOs          ELP Level 4        1
4    Cpt.s        ELP Level 4        1
6. Specification of the Selection Concept with Regard to Special Groups
Question 24: Do you employ foreign nationals?

This question was answered by 31 institutions. The following table 6.1. shows how many institutions employ foreign nationals.

Tab. 6.1.: Foreign nationals
No.  Categories  Frequency  Percent
1    Yes         24         77,4
2    No          7          22,6
Total            31         100,0

24 institutions employ foreign nationals; 7 do not. The following table 6.2. gives an overview of which foreign nationals the institutions employ, per group.
Tab. 6.2.: Survey: Do you employ foreign nationals? (free-text responses, per group: Ab Initio, Ready Entry, FOs, Cpt.s)

The institutions' free-text responses include: All nationalities meeting our requirements; All Nations; Any allowed to work in the United States; Currently (Taiwan, Europe); Austrian; CAPTAINS ONLY; Currently (Europe, Southeast Asia, South America, United States); D, F, UK, I, S, ...; EU; From many countries; More than 50 nationalities; UK, GER, FRA; EEC; EU citizens; Malaysia, India, HK, UK, Aus; None; Short term contracts only; Various; Yes.
Question 24a: Is there any restriction with regard to the number of foreign nationals? This question was answered by 31 institutions. The following table 6.3. shows whether there are any restrictions with regard to the maximum number of foreign nationals.

Tab. 6.3.: Restrictions
No.  Categories  Frequency  Percent
1    0           31         100,0

All institutions stated that there are no restrictions on the number of foreign nationals who could be employed.

Question 25: Do you tailor recruitment campaigns to specific target groups?
This question was answered by 32 institutions. The following table 6.4. shows how many institutions have specified their selection concept with regard to special target groups.

Tab. 6.4.: Recruitment campaigns
No.  Categories  Frequency  Percent
1    Yes         9          28,1
2    No          23         71,9
Total            32         100,0

9 institutions have adapted their selection system to the needs of special target groups; 23 have not.
Question 26: If yes, according to which criteria?
This question was answered by 39 institutions. The following table 6.5. shows how often institutions have adapted their selection concept to target groups, and with regard to which criteria.

Survey: Criteria for adaptation to groups
Tab. 6.5.: Criteria for adaptation
Categories      No. of institutions (Yes)  Percent (Yes)  No. of institutions (No)  Percent (No)
Gender          0                          0              39                        100
Career status   2                          5,1            37                        94,9
Military/Civil  3                          7,7            36                        92,3
School levels   5                          12,8           34                        87,2
Ethnic groups   0                          0              39                        100,0
Nationality     2                          5,1            37                        94,9
The following table 6.6. shows for which additional groups adaptations have been made according to additional criteria.

Other, please describe.
Tab. 6.6.: Specification of criteria
No.  Categories               Frequency  Percent
1    0                        37         33,6
2    Other, please describe.  2          1,8
Total                         39         35,5
2 of the 39 institutions who answered this question have specified their concept according to criteria which were not mentioned in the list above. The following table 6.7. lists the additional criteria.

Other, please describe.
Tab. 6.7.: Specification: Other
No.  Categories                                    Frequency  Percent
1    Airline sponsored cadets                      1          ,9
2    Right to live and work in targeted base area  1          ,9

Question 27: Do you address cultural diversity in your selection system?
For this question none of the institutions used any category that was not offered by the list in the questionnaire.

Question 28: Based on which criteria did you adapt your selection concept to cultural diversity?
The following table 6.10. shows how often institutions have adapted their concept to cultural diversity, and according to which criteria. This question was answered by 4 institutions; multiple answers were allowed.

Tab. 6.10.: Criteria on which the adaptation was based
Criteria based on                                                     No. of institutions (Yes)  No. of institutions (No)
Law                                                                   1                          3
Organizational needs in company                                       3                          1
Needs of the school/training department                               2                          2
Practical needs with regard to performance of groups                  3                          1
Empirical findings provided by scientific analysis of selection data  1                          3
Other criteria? If "yes", please describe.
Tab. 6.11.: Other criteria
No.  Categories  Frequency  Percent
1    0           110        100,0

Beyond the criteria given in the questionnaire, no additional criteria were used by the institutions.

Question 29: Do you accept test results of other institutions?
This question was answered by 11 institutions. The following table 6.12. shows the number of institutions who accept tests of other institutions.

Tab. 6.12.: Accepting test results of other institutions
No.  Categories  Frequency  Percent
1    Yes         3          27,3
2    No          8          72,7
Total            11         100,0

3 of 11 institutions accept test results of other institutions.

Question 29a: Are there any preconditions for the acceptance of test results provided by other institutions? The following table 6.13. shows the number of institutions who accept test results of other institutions only under certain conditions.

Tab. 6.13.: Preconditions for acceptance
No.  Categories  Frequency  Percent
1    Yes         4          100,0

4 of 11 institutions defined conditions under which they accept test results of other institutions. The following table 6.14. shows the institutions' statements about the kind of conditions.

If yes, which preconditions?
Tab. 6.14.: Specification of preconditions
No.  Categories    Frequency
1    JAA operator  1
Question 30:
This question was answered by 12 institutions. The following table 7.1. shows how many institutions consider which aspects of their selection systems to be strengths (answers in several categories were allowed).

Survey: The strengths of your selection system?
Tab. 7.1.: Strengths of selection systems
Categories                                                       Yes (n)  Yes (% of n = 12)  No (n)  No (% of n = 12)
Economy in time                                                  4        33,3               8       66,7
Economy in costs                                                 7        58,3               5       41,7
No. of successful candidates with regard to later career phases  6        50,0               6       50,0
Results of empirical evaluation                                  3        25,0               9       75,0
High reliability                                                 9        75,0               3       25,0
High validity                                                    7        58,3               5       41,7
Quality of the evaluation procedure                              9        75,0               3       25,0
Degree of automation                                             3        25,0               9       75,0
Combination of tests                                             7        58,3               5       41,7
Flexibility for different groups                                 3        25,0               9       75,0
Requirements for test operator qualification                     1        8,3                11      91,7
Methodical criteria (high reliability, quality of the evaluation procedure, high validity) are most frequently considered strengths of the selection systems, followed by criteria such as economy and combination of tests. Economic criteria play a substantial role, and the criterion "no. of successful candidates with regard to later career phases" (n = 6) is also substantial. 9 of 12 institutions consider high reliability a special strength of their selection system, 7 of 12 consider high validity a special strength, and 9 of 12 consider the quality of the evaluation procedure a special strength.

The following table 7.2. contains a statement about strengths of selection systems referring to criteria other than the categories mentioned in table 7.1.

Other strengths, please describe.
Tab. 7.2.: Other strengths
No.  Categories                                        Frequency  Percent
1    Tailored to our operations and corporate culture  1          ,9
Question 31: What do you consider the weaknesses of your selection system?
This question was answered by 11 institutions. The following table 7.3. shows the number of institutions who consider certain aspects weaknesses of their selection system.

Survey: Weaknesses of the selection system
Tab. 7.3.: Weaknesses of selection systems
Categories                                                       No. of institutions
Economy in time                                                  4
Economy in costs                                                 3
No. of successful candidates with regard to later career phases  0
Results of empirical evaluation                                  0
Low reliability                                                  0
Low validity                                                     0
Quality of the evaluation procedure                              0
Degree of automation                                             5
Combination of tests                                             1
Flexibility for different groups                                 2
Requirements for test operator qualification                     9
Most frequently, "requirements for test operator qualification" was mentioned as a weakness of the systems (n = 9), followed by a low degree of automation (n = 5). Lack of economy in time was mentioned 4 times. The following table 7.4. contains an additional category introduced by an institution to describe a weakness of its system.

Other weaknesses, please describe.
Tab. 7.4.: Other weaknesses
No.  Category                        Frequency  Percent
1    Test conducted in English only  1          ,9
8.
Question 32:
This question was answered by 11 institutions. The following table 8.1. shows how many institutions have made significant changes to their selection systems.

Did you make significant changes to your selection system in the past?
Tab. 8.1.: Significant changes
No.  Categories  Frequency  Percent
1    Yes         9          81,8
2    No          2          18,2
Total            11         100,0
This question was answered by 9 institutions. The following table 8.2. gives an overview of the lessons learned which were the reasons for the institutions to make the changes.

Tab. 8.2.: Lessons learned (frequency 1 each)
No.  Categories
1    Adaptation to emphasize evaluation of working in unstructured environments after having changed our sim training to include more unstructured (no specific checklist available) events requiring application of past learnt system knowledge and basic flying skill
2    Cognitive testing should be in home language if not English
3    I took my military experience into our company
4    Job profiles in aviation have changed
5    Multi candidate assessment group exercise
6    Omitment of grading and psycho tests for pilots graduated at own ATO. The quality of the school is such that that's not required anymore. Results within the connected airline are good and steady.
7    Situation awareness
8    Update norms, validity analysis, attrition analysis, training of selection panel members
9    We found that the best for our selection process is the combination of references research, interview and the simulator assessment
Question 32.B: With regard to which parameters did you make changes (several answers possible)? This question was answered by 10 institutions. The following table 8.3. shows how many institutions made changes with regard to which aspects of their selection systems.

Tab. 8.3.: Parameters according to which changes have been made
Categories                                                            Yes (n)  Yes (% of n = 10)  No (n)  No (% of n = 10)
Economy in time                                                       2        20,0               8       80,0
Economy in costs                                                      2        20,0               8       80,0
Optimization of rate of successful candidates in later career phases  2        20,0               8       80,0
Empirical evaluation                                                  3        30,0               7       70,0
Optimization of reliability                                           5        50,0               5       50,0
Optimization of validity                                              5        50,0               5       50,0
Optimization of the evaluation procedure                              4        40,0               6       60,0
Degree of automation                                                  3        30,0               7       70,0
Combination of tests                                                  4        40,0               6       60,0
Flexibility for different groups                                      2        20,0               8       80,0
Requirements for test-operator qualification                          0        0                  10      100,0
Optimizations of methodical aspects are the most frequent changes. The following table 8.4. contains an additional category which was introduced by one institution.

Other parameters?
Tab. 8.4.: Other parameters
No.  Categories                                                                                                                                                                                                 Frequency  Percent
1    Ab Initio cadets test should include practical exercise in basic FTD replicating piston trainer to determine level of "mechanical intuition" and ability to divide attention operating machinery not necessarily flying  1          ,9

Question 33: If you could, would you make any changes to your selection system?
This question was answered by 11 institutions. The following table 8.5. shows how many institutions would make changes if they could.

Tab. 8.5.: Changes wished to be made
No.  Categories  Frequency  Percent
1    Yes         6          54,5
2    No          5          45,5
Total            11         100,0

6 institutions would make changes to their selection system if they had the possibility to do so; 5 institutions do not see any need for changes. The following table 8.6. contains the types of changes the institutions would make.

If "yes", what have been your lessons learned?
Tab. 8.6.: Lessons learned
No.  Categories                                                                           Frequency  Percent
1    Automation                                                                           1          ,9
2    If cadets are ESL then tests should be in home language; add FTD practical exercise  1          ,9
3    Improve user interface. Though not critical but good to have.                        1          ,9
4    More like the military tests                                                         1          ,9
5    Regard future developments like MPL                                                  1          ,9
6    Would add the mandatory simulator assessment                                         1          ,9
The following table 8.8. shows how many institutions would like to change their selection system according to categories not mentioned in the above list.

Other parameters?
Tab. 8.8.: Other parameters
No.  Categories  Frequency  Percent
1    0           110        100,0

No additional category was introduced by the institutions with regard to the type of changes.

Question 34: Do you have data about empirical evaluation of your measuring dimensions/tests/test battery?
This question was answered by 10 institutions. The following table 8.9. shows the number of institutions who have data about the empirical evaluation of the measuring categories of their system.

Do you have data about empirical evaluation of your measuring dimensions/tests/test battery? If "yes", please specify.
Tab. 8.9.: Data about empirical evaluation
No.  Categories  Frequency  Percent
1    Yes         5          50,0
2    No          5          50,0
Total            10         100,0

5 of the 10 institutions who answered this question have empirical data about measuring dimensions/tests/batteries.
The following table 8.10. gives a survey of the types of measuring dimensions/tests/batteries the institutions have data about, separately for the groups.

Survey: Do you have data about empirical evaluation of your measuring dimensions/tests/test battery?
Tab. 8.10.: Specification of data about empirical evaluation
Group        Focus of evaluation   Measuring dimension/test/test battery               Frequency
Ab Initio    Tests/test battery    COMPASS score                                       1
Ab Initio    Tests/test battery    Composite score                                     1
Ab Initio    Tests/test battery    COMPASS & 10P                                       1
Ab Initio    Reliability           0,75                                                1
Ab Initio    Reliability           80% if expert English, ~50% if ICAO 4               1
Ab Initio    Predictive validity   0,35                                                1
Ab Initio    Predictive validity   80% if expert English, ~50% if ICAO 4               1
Ab Initio    Predictive validity   97%                                                 1
Ab Initio    Predictive validity   We have several internal and external publications  1
Ready Entry  Measuring dimensions  Composite score                                     1
Ready Entry  Reliability           0,75                                                1
Ready Entry  Predictive validity   0,35                                                1
FOs          Measuring dimensions 0; Tests/test battery 0; Reliability 0; Predictive validity 0
Cpt.s        Measuring dimensions 0; Tests/test battery 0; Reliability 0; Predictive validity 0
For FOs and Cpt.s there are no data about the empirical evaluation of measuring dimensions/tests/batteries. The following table 8.11. contains specifications about the types of data based on empirical evaluation; it allows a comparison of the groups for which a test system is available at the 4 institutions who answered the question.

Question 34: Specifications with regard to the type of data on empirical evaluation
Tab. 8.11.: Specifications about type of data
Case  Groups                               Reliability  Measuring dimensions  Predictive validity                    Tests/test battery
1     Ab Initio; Ready entry, l.e.; Cpt.s  0,75         composite score       0,35                                   -
2     Ab Initio; Ready entry, l.e.         0            0                     80% if expert English, ~50% if ICAO 4  0
3     Ab Initio                            0            0                     publications                           0
4     Ab Initio                            0            0                     97%                                    0
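The reliability (0,75) and predictive validity (0,35) figures reported above are, by convention in selection research, correlation coefficients; predictive validity in particular is usually the Pearson correlation between selection scores and a later training or job-performance criterion. A minimal sketch of that computation; the scores below are invented for illustration and are not survey data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between selection scores xs and a later criterion ys."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: aptitude-test composite scores and later training grades.
test_scores = [52, 61, 70, 74, 80, 85, 90]
training_grades = [2.1, 2.8, 2.5, 3.2, 3.0, 3.6, 3.4]
print(round(pearson_r(test_scores, training_grades), 2))
```

A validity of 0,35 would mean the test explains roughly 12% (r squared) of the variance in the later criterion, which is typical for single predictors in pilot selection.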
9.
Question 36:
This question was answered by 11 institutions. The following table 9.1. contains the number of institutions who have adapted their selection system to their type(s) of operation.

Is your selection system tailored in a special way to your type of operation?
Tab. 9.1.: Tailoring of system to type of operation(s)
No.  Categories  Frequency  Percent
1    Yes         5          45,5
2    No          6          54,5
Total            11         100,0
At 5 of 11 institutions the selection system is tailored in a special way to the type of their operation.
Question 37:
If "yes", according to which special characteristics of your operation is your selection tailored?
This question was answered by 5 institutions. The following table 9.2. shows the special features of the operation to which the system has been adapted.

Tab. 9.2.: Type of characteristics the system has been tailored for (frequency 1, percent ,9 each)
No.  Categories
1    Bridge course at the flight academy is given on the same type of aircraft as the initial type within the airline (if hired). The results are used iso a separate grading
2    Commercial airline operations
3    Company safety culture, team redundancy
4    Emphasizes the need for seeing the whole picture of the operation, not only the pilot's view. We evaluate the candidates for their aptitude to contribute to all aspects of the operation (working on projects, taking over postholder positions)
5    We "tailor" our selection procedure for every customer. Which means different profiles, languages, test procedures and result reporting.
Question 38: Which requirements in the selection concept cover the special characteristics of your operation?
This question was answered by 8 institutions. The following table 9.3. contains data about the requirement dimensions which were the reason for the adaptation.

Tab. 9.3.: Requirements of selection system covering the characteristics (frequency 1, percent ,9 each)
No.  Categories
1    Ability to work an unknown problem and ability to present topics to an auditorium
2    Assessment centre, team cooperation and communication, adherence to procedures, commandability
3    English fluency
4    English language proficiency
5    For Capts, relevant experience. For Ab Initio level, assessment of aptitude and attitude via biodata, selection tests, and interviews.
6    Good cooperation between the ATO and the airline
7    None
8    We do not have an own operation.

Question 39:
Do you distinguish in your selection system between psychologically based requirements and requirements which are due to the special interests/needs of your company?
This question was answered by 11 institutions. The following table 9.4. shows how many institutions differentiate between psychologically based requirements and requirements which are based on special interests/needs of the company.

Tab. 9.4.: Psychological and administrative requirements
No.  Category                   Frequency  Percent
1    Yes                        4          36,4
2    No                         7          63,6
     Total                      11         100,0

4 institutions differentiate between the two types of requirements.
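The frequency/percent tables throughout this appendix all follow the same simple arithmetic: each category's count divided by the number of valid answers. A minimal sketch (the `frequency_table` helper and the answer list are illustrative, not part of the survey tooling):

```python
# Sketch of how the Tab. 9.x / 10.x frequency tables are derived from
# raw answers. The answer list below is hypothetical data shaped like
# Tab. 9.4. (4 "Yes", 7 "No" out of 11 valid answers).
from collections import Counter

def frequency_table(answers):
    """Return (category, frequency, percent of valid answers) rows,
    sorted by descending frequency."""
    counts = Counter(answers)
    n = sum(counts.values())
    return [(cat, freq, round(100.0 * freq / n, 1))
            for cat, freq in counts.most_common()]

rows = frequency_table(["Yes"] * 4 + ["No"] * 7)
print(rows)  # [('No', 7, 63.6), ('Yes', 4, 36.4)]
```

This reproduces the 36,4% / 63,6% split reported in table 9.4. (with a decimal point instead of the report's decimal comma).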
This question was answered by 3 institutions. The following table 9.5. describes the requirements which are integrated in the selection system according to the needs of the company.

Survey: If "yes", please describe the requirements resulting from special interests/needs of the company/ops.

Tab. 9.5.: Specification of requirements of the company
No.  Category                                                               Frequency  Percent
1    Empirically determined, e.g., factors relevant to passing ground
     school and actual flying.                                              1          0,9
2    Two-step selection: 1.) general abilities, 2.) company specific
     requirements                                                           1          0,9
3    We have e.g. as customers also international Airforces with special
     needs.                                                                 1          0,9

Question 41:
Do you address any specific characteristics of your target groups in the concept of your selection systems?
This question was answered by 10 institutions. The following table 9.6. shows how many institutions shape their concept of selection according to the specific features of the target groups.

Tab. 9.6.: Specific characteristics of the target groups addressed by your concept of selection systems?
No.  Category                   Frequency  Percent
1    Yes                        6          60,0
2    No                         4          40,0
     Total                      10         100,0

6 institutions stated that they address specific features of the target group in their selection concept; 4 do not. The following table 9.7. describes how many institutions have adapted their concept with regard to which criteria. This question was answered by 8 institutions.

Survey: Which type of characteristics do you address in the concept of your selection systems? (Per group)

Tab. 9.7.: Characteristics of candidates, addressed by selection concept
Characteristics     No. of institutions (Yes)  No. of institutions (No)
Ab Initio
  Gender            2                          6
  Age               5                          3
  Language          5                          3
  School level      6                          2
  School marks      3                          5
  Nationality       3                          5
  Ethnic group      0                          0
Tab. 9.7.: Characteristics of candidates, addressed by selection concept (continued)
Characteristics     No. of institutions (Yes)  No. of institutions (No)
Ab Initio (continued)
  Flying hours      2                          6
  ATPL              1                          7
  Instructor Rating 0                          0
  Type Rating       0                          0
Ready Entry
  Gender            2                          6
  Age               3                          5
  Language          5                          3
  School level      4                          4
  School marks      2                          6
  Nationality       2                          6
  Ethnic group      0                          0
  Flying hours      6                          2
  ATPL              4                          4
  Instructor Rating 0                          0
  Type Rating       1                          7
FOs
  Gender            0                          0
  Age               2                          6
  Language          2                          6
  School level      1                          7
  School marks      0                          0
  Nationality       1                          7
  Ethnic group      0                          0
  Flying hours      2                          6
  ATPL              2                          6
  Instructor Rating 0                          0
  Type Rating       0                          0
Cpt.s
  Gender            2                          6
  Age               3                          5
  Language          3                          5
  School level      0                          0
  School marks      0                          0
  Nationality       2                          6
  Ethnic group      0                          0
  Flying hours      3                          5
  ATPL              3                          5
  Instructor Rating 0                          0
  Type Rating       2                          6

In the Ab Initio group age, language and school level play the most important role. In the Ready Entry group language, school level, flying hours and ATPL play the most important role. The same criteria are the most important ones in the FOs and Cpt.s groups.
For the other groups no additional criteria were introduced.

Question 42: In which way do you adapt your selection concept to these characteristics?
This question was answered by 12 institutions. The following table 9.9. shows in which way the selection systems have been adapted to the special characteristics of the target groups.

Survey: Modes of adaptation

Tab. 9.9.: Modes of adaptation (n = 12)
Category                                           Yes  Percent (Yes)  No  Percent (No)
Selection of type of tests/requirement dimensions  7    58,3           5   41,7
Number of tests                                    5    41,7           7   58,3
Type or number of selection phases                 4    33,3           8   66,7
Definition of norms                                3    25,0           9   75,0
Language of tests                                  3    25,0           9   75,0
Preconditions for being accepted for the tests     7    58,3           5   41,7
Most frequently the preconditions for candidates to be accepted for the tests (n = 7) and the type of requirement dimensions (n = 7) are adapted to the target groups, followed by the number of tests (n = 5) and the types and numbers of the selection phases (n = 4). Language of tests (n = 3) and norms (n = 3) play a role least often. The following table 9.10. shows how many institutions have performed an adaptation according to additional criteria not included in the list above.

Other ways? If "yes", please describe.

Tab. 9.10.: Specification: Other modes
No.  Category  Frequency  Percent
1    0         110        100,0
10. Methodical Aspects of your Selection System
Question 43:
Which type of selection instruments do you use for the different groups?

12 of the 110 institutions who logged in to the questionnaire answered the questions which refer to the use of a selection system. The following tables 10.1. and 10.2. show how many institutions use which types of selection instruments for the different groups.

Survey: Which type of selection instruments do you use for the different groups?

Tab. 10.1.: Type of selection system used for the different groups (part 1)
             Questionnaires  Free style  Semi-standardized  Paper-pencil tests  Apparatus tests
Groups                       interviews  interviews         (psychometric)      (psychometric)
Ab Initio    7               4           7                  3                   4
Ready Entry  8               3           5                  3                   3
FOs          6               4           5                  3                   3
Capt.s       4               3           3                  2                   1
Sums         25              14          20                 11                  11

Tab. 10.2.: Type of selection system used for the different groups (part 2)
Groups       Groups  Work samples  Simulation based psychometric tests
Ab Initio    5       3             1
Ready Entry  5       3             2
FOs          6       1             1
Capt.s       3       1             1
Sums         19      8             5
Questionnaires (n = 25) are used most frequently, followed by semi-standardized interviews (n = 20) and groups (n = 19). PC-based psychometric tests (n = 17) are also mentioned with substantial frequency. Psychometric apparatus tests are mentioned 11 times, as are full flight simulators (n = 11). Work samples are mentioned 8 times. For Ab Initio and Ready Entry candidates no additional instruments were mentioned. For FOs and Cpt.s one additional instrument was introduced: presentation of behavioral scenarios, in 1 case each. The following table 10.3. shows the combinations of the selection instruments used, with regard to institutions and groups. The columns 1-12 represent the institutions; the rows show the groups and the instruments.
Tab. 10.3.: Selection instruments, groups and institutions in combination
Case numbers                 1  2  3  4  5  6  7  8  9  10  11  12
Simulation based psychometric tests
  Ab Initio                  0  0  0  0  0  1  0  0  0  0   0   0
  Ready Entry                0  0  0  0  0  1  0  0  1  0   0   0
  FOs                        0  0  0  0  0  1  0  0  0  0   0   0
  Cpt.s                      0  0  0  0  0  1  0  0  0  0   0   0
Fixed base simulator
  Ab Initio                  0  0  1  0  0  0  0  0  0  0   0   0
  Ready Entry                0  0  0  0  0  0  0  0  0  0   0   0
  FOs                        0  0  0  0  0  0  0  0  0  0   0   0
  Cpt.s                      0  0  0  0  0  0  0  0  0  0   0   0
Full flight simulator
  Ab Initio                  0  0  0  0  0  0  0  0  0  0   0   0
  Ready Entry                0  1  0  1  0  0  0  0  1  0   0   1
  FOs                        0  1  0  1  1  0  0  0  0  0   0   1
  Cpt.s                      1  1  0  1  0  0  0  0  0  0   0   0

Question 44: Which grading system do you apply for the description of results in your selection system?
The following table 10.4. shows how many institutions apply which types of grading systems. This table is based on data of 12 institutions.

Survey: Grading systems

Tab. 10.4.: Frequencies of institutions with regard to the grading systems used
No.  Grading systems used                           Yes  No
Ab Initio
1    Verbal description of performance              6    6
2    Pass/fail                                      6    6
3    Qualitative classification in several classes  4    8
4    Rank rows                                      1    11
5    Percentage ranks (0-100)                       3    9
6    Stanine values (1-9)                           4    8
7    T-values (0-100)                               1    11
Ready Entry
1    Verbal description of performance              4    8
2    Pass/fail                                      5    7
3    Qualitative classification in several classes  3    9
4    Rank rows                                      1    11
5    Percentage ranks (0-100)                       2    10
6    Stanine values (1-9)                           4    8
7    T-values (0-100)                               1    11
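The stanine and T-value scales named in the table are textbook psychometric standard-score transformations; the survey does not specify how respondents compute them. A minimal sketch of the conventional definitions (T-scale: mean 50, SD 10; stanine: nine half-SD bands centred on 5):

```python
# Hedged sketch of the standard score transformations behind two of the
# grading scales in Tab. 10.4. These are the conventional textbook
# formulas, not a procedure taken from the survey itself.
import math

def t_value(raw, mean, sd):
    """T-scale: mean 50, standard deviation 10."""
    return 50.0 + 10.0 * (raw - mean) / sd

def stanine(raw, mean, sd):
    """Stanine: nine bands of half an SD each, centred on 5, clipped to 1..9."""
    z = (raw - mean) / sd
    return min(9, max(1, math.floor(2.0 * z + 5.5)))

# A raw test score of 65 in a norm group with mean 50 and SD 10:
print(t_value(65, 50, 10))   # 65.0
print(stanine(65, 50, 10))   # 8
```

Percentage ranks (0-100), the third metric scale in the table, are simply the percentile of a raw score within the norm group.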
Tab. 10.5.: Grading systems and groups (part 1)
Case no. of institut.             1  2  3  4  5  6  7  8  9  10  11  12  Sums
Verbal description Ab Initio      1  0  1  0  1  0  0  1  1  1   0   0   6
Verbal description Ready Entry    1  0  0  0  0  0  0  1  1  1   0   0   4
Verbal description FO             0  0  0  0  1  0  1  0  0  1   0   0   3
Verbal description Cpt.           1  0  0  0  0  0  1  0  0  0   0   0   2
Pass/fail Ab Initio               1  1  1  0  1  0  0  1  1  0   0   0   6
Pass/fail Ready Entry             1  1  0  1  0  0  0  1  1  0   0   0   5
Pass/fail FO                      0  1  0  1  1  0  0  0  0  0   0   0   3
Pass/fail Cpt.                    1  1  0  1  0  0  0  0  0  0   0   0   3
Qualitative classif. Ab Initio    1  0  0  0  1  0  0  1  1  0   0   0   4
Qualitative classif. Ready Entry  1  0  0  0  0  0  0  1  1  0   0   0   3
Qualitative classif. FO           0  0  0  0  1  0  1  0  0  0   0   0   2
Qualitative classif. Cpt.         1  0  0  0  0  0  1  0  0  0   0   0   2
Rank rows Ab Initio               1  0  0  0  0  0  0  0  0  0   0   0   1
Rank rows Ready Entry             1  0  0  0  0  0  0  0  0  0   0   0   1
Rank rows FO                      0  0  0  0  0  0  0  0  0  0   0   0   0
Rank rows Cpt.                    1  0  0  0  0  0  0  0  0  0   0   0   1
Percentage-ranks Ab Initio        1  0  1  0  0  0  0  1  0  0   0   0   3
Tab. 10.5.: Grading systems and groups (part 2)
Case no. of institut.          1  2  3  4  5  6  7  8  9  10  11  12  Sums
Percentage-ranks Ready Entry   1  0  0  0  0  0  0  1  0  0   0   0   2
Percentage-ranks FO            0  0  0  0  0  0  0  0  0  0   0   0   0
Percentage-ranks Cpt.          0  0  0  0  0  0  0  0  0  0   0   0   0
Stanine values Ab Initio       1  0  0  0  0  1  0  0  0  1   0   1   4
Stanine values Ready Entry     1  0  0  0  0  1  0  0  0  1   0   1   4
Stanine values FO              0  0  0  0  0  1  0  0  0  1   0   1   3
Stanine values Cpt.            0  0  0  0  0  1  0  0  0  0   0   0   1
T-values Ab Initio             1  0  0  0  0  0  0  0  0  0   0   0   1
T-values Ready Entry           1  0  0  0  0  0  0  0  0  0   0   0   1
T-values FO                    0  0  0  0  0  0  0  0  0  0   0   0   0
T-values Cpt.                  0  0  0  0  0  0  0  0  0  0   0   0   0
Verbal description and pass/fail play the most important role.

Question 45.A: Grading systems for flight school training

This question was answered by 6 institutions. The following table 10.6. shows how many institutions use which types of grading systems for examinations.

Which grading system do you use for the performance evaluation in the following types (see tab. 9.7. through 9.20.) of examination?

Tab. 10.6.: Grading systems for types of examination
No.  Category                                                               Frequency  Percent
1    Pass/fail                                                              3          50,0
2    Qualitative classification in several classes, like: "below
     standard", "standard", "above standard"                                3          50,0
     Total No. of institutions                                              6          100,0
3 institutions use pass/fail and a further 3 use qualitative classification in several classes, like: "below standard", "standard", "above standard". The following table 10.7. contains a survey of grading systems which are used by the institutions for the evaluation of performance in flight school training.

Survey: Type of grading systems for flight school training.

Tab. 10.7.: Grading system for flight school training
No.  Category                                                               Frequency of institutions
1    Verbal description of performance (strengths and weaknesses)           0
2    Pass/fail                                                              1
3    Qualitative classification in several classes, like: "below
     standard", "standard", "above standard"                                0
4    Rank rows                                                              0
5    Empirically based numerical expert rating system                       0
6    Other system, please describe.                                         0
Except pass/fail there is no use of any additional type of grading system in flight school training. The following table 10.8. shows a specification, delivered by one institution, for the pass/fail category.

Pass/Fail
Tab. 10.8.: Specification
No.  Category          Frequency  Percent
1    Competency based  1          0,9

Question 45.B: Grading system for initial type rating?

This question was answered by 8 institutions (multiple answers were allowed). The following table 10.9. contains statements about the frequency with which institutions use different grading systems for initial type rating.

Survey: Grading systems for initial type rating.

Tab. 10.9.: Grading system for initial type rating
No.  Category                                                               Frequency of institutions
1    Verbal description of performance (strengths and weaknesses)           0
2    Pass/fail                                                              5
3    Qualitative classification in several classes, like: "below
     standard", "standard", "above standard"                                2
4    Rank rows                                                              1
5    Empirically based numerical expert rating system                       1
6    Other system, please describe.                                         0
Here the variety is bigger. Pass/fail is used most frequently. Verbal description of performance (strengths and weaknesses) and other systems are not used for initial type rating. The following tables 10.10. and 10.11. specify the statements to the questions above.

Pass/Fail
Tab. 10.10.: Specification
No.  Category          Frequency  Percent
1    Competency based  1          0,9

Empirically based numerical expert rating system
Tab. 10.11.: Specification
No.  Category  Frequency  Percent
1    0 - 100%  1          0,9

The other criteria were not specified.
Question 45.C: Grading system for LOFT scenarios in type rating

This question was answered by 7 institutions. The following table 10.12. contains statements on how many institutions use which grading system for LOFT scenarios in type ratings.

Survey: Types of grading systems for LOFT scenarios in type rating.

Tab. 10.12.: Grading system for LOFT scenarios in type rating
No.  Category                                                               Frequency of institutions
1    Verbal description of performance (strengths and weaknesses)           0
2    Pass/fail                                                              3
3    Qualitative classification in several classes, like: "below
     standard", "standard", "above standard"                                3
4    Rank rows                                                              1
5    Empirically based numerical expert rating system                       0
6    Other system, please describe.                                         0
Pass/fail and qualitative classification in several classes, like: "below standard", "standard", "above standard" are each mentioned 3 times for LOFT scenarios. Rank rows are used in one case. The following tables 10.13. and 10.14. specify table 10.12. above.

Pass/Fail
Tab. 10.13.: Specification
No.  Category          Frequency  Percent
1    Competency based  1          0,9

The other criteria were not mentioned, except:

Rank rows
Tab. 10.14.: Specification
No.  Category                               Frequency  Percent
1    1 Excellent; 0 not acceptable (ranks)  1          0,9

Question 45.D: Grading system for Line Training

This question was answered by 9 institutions.
Pass/fail is used 5 times, qualitative classification in several classes 3 times and verbal description of performance (strengths and weaknesses) 1 time. The last three categories of the list above are not used for line training. The following table 10.16. shows a specification of a grading system used by one institution.

Qualitative classification in several classes, like: "below standard", "standard", "above standard"
Tab. 10.16.: Specification
No.  Category      Frequency  Percent
1    ATP standard  1          0,9
Question 45.E: Grading system for Check flights

The following table 10.17. shows how many institutions use which grading system for check flights.

Survey: Types of grading systems for LOFT scenarios in check flights.

Tab. 10.17.: Grading system for LOFT scenarios in check flights
No.  Category                                                               Frequency of institutions
1    Verbal description of performance (strengths and weaknesses)           1
2    Pass/fail                                                              6
3    Qualitative classification in several classes, like: "below
     standard", "standard", "above standard"                                2
4    Rank rows                                                              0
5    Empirically based numerical expert rating system                       0
6    Other system, please describe.                                         0
Pass/fail is mentioned 6 times, qualitative classification in several classes 2 times and verbal description of performance 1 time. The remaining systems in the list are not used for check flights.
The following tables 10.18. and 10.19. specify the statements of the table above.

Pass/Fail
Tab. 10.18.: Specification
No.  Category  Frequency  Percent
1    1         2          1,8
Qualitative classification in several classes, like: "below standard", "standard", "above standard"
Tab. 10.19.: Specification
No.  Category      Frequency  Percent
1    ATP standard  1          0,9

Question 46:
Are there any grading levels (positive, negative) which have obligatory consequences for the candidate?
This question was answered by 11 institutions. The following table 10.20. indicates how many institutions use grading systems with certain levels which have obligatory consequences for the candidate.

Are there any grading levels (positive, negative) which have obligatory consequences for the candidate?

Tab. 10.20.: Obligatory consequences
Category  Number
Yes       9
No        2

At 9 institutions there are obligatory consequences; at 2 institutions there are none. The following table 10.21. specifies the statements of the table above.

If "yes", please describe

Tab. 10.21.: Specifications with regard to obligatory consequences
No.  Category                                                               Frequency  Percent
1    Fail means retrain once and retest. Another fail means termination
     during the initial phase of employment.                                1          0,9
2    For selection, just a pass/fail; for checks, pass, fail or remedial
     action, e.g. reinforcement training.                                   1          0,9
3    It is a pass or fail system. If an individual does not acquire a
     certain level of grades, we expel that individual.                     1          0,9
4    Low grading requires further training.                                 1          0,9
5    Result of grading, if given.                                           1          0,9
6    SCHOOL LEVEL AERONAUTICS QUALIFICATIONS                                1          0,9
7    We have fixed "cut off" scores for different groups, based on
     statistics.                                                            1          0,9
There are concepts which lead directly to exclusion and concepts which are linked with support measures. Which concept makes sense in which case depends on the type of weaknesses, and is also partly answered from an operational and economical point of view.
Question 47:
Do you only refuse candidates at the end of the whole selection procedure or do you decide after each step?
This question was answered by 11 institutions. The following table 10.22. shows how many institutions make selective decisions at the end of the selection procedure or earlier.

Do you only refuse a candidate at the end of the whole selection procedure or do you decide after each step?

Tab. 10.22.: Phase for selection decisions
No.  Category                   Frequency  Percent
1    After the whole procedure  4          36,4
2    Step by step               7          63,6
     Total No. of institutions  11         100,0

4 institutions make decisions at the end of the whole procedure; 7 institutions decide earlier, i.e. they select in several steps. The following table 10.23. shows after which steps the institutions make decisions in terms of the exclusion of candidates.

After which steps do you refuse unsuccessful candidates?

Tab. 10.23.: Steps after which decisions are made
No.  Category                            Frequency
1    Application                         1
2    Basic Assessment                    1
3    Grading                             1
4    Initial interview                   1
5    Pre-selection (computer tests)      1
6    Shortlisting                        1
7    Simulator Screening                 1
8    AC                                  1
9    Initial interview                   1
10   Medical Check                       1
11   Psycho-technical tests              1
12   Second interview with HR            1
13   Simulator Screening                 1
14   Stage 1 Interview                   1
15   Final interview and aptitude test   1
16   First day of Final Assessment       1
17   Interview with the selection board  1
18   Psychological assessment            1
19   Simulator worksample                1
20   Stage 2 Interview                   1
21   Interview                           1
22   Management Interview                1
Tab. 10.23.: Steps after which decisions are made (continued)
No.  Category                            Frequency
23   Medical and statutory clearances    1
24   Second day of Final Assessment      1
25   Flight Grading                      1
With regard to this question only few correlations can be identified, partly because the respective expressions are unclear: there is no standardized categorisation of such steps.

Question 48: Are there any measuring dimensions or instruments which have an accentuated higher weight for your evaluation than other ones?
This question was answered by 11 institutions. The following table 10.24. shows how many institutions weight particular measuring dimensions higher than others in the evaluation of the results.

Are there any measuring dimensions or instruments for selection which have an accentuated higher weight for your evaluation than other ones?

Tab. 10.24.: Measuring dimensions with higher weight
No.  Category                   Frequency  Percent
1    Yes                        3          27,3
2    No                         8          72,7
     Total No. of institutions  11         100,0

3 institutions apply this approach; 8 do not. The following table 10.25. shows which measuring dimensions the institutions weight higher.

Survey of measuring dimensions with higher weight

Tab. 10.25.: Specifications
No.  Measuring dimension                   Number of institutions
1    0-100%                                1
2    Manipulative skills                   1
3    Multi-tasking                         1
4    Situation awareness                   1
5    Decision making under time            1
6    Aptitude tests for cognitive ability  1
7    Computer tests                        1
8    Flight/FFS grading                    1
9    Interviews for interpersonal skills   1
10   Biodata for relevant life             1
Question 49:
This question was answered, depending on the different groups, by different numbers of institutions. The data is shown per group in the following table.
Question 49.A: Which method for empirical evaluation did you apply to your selection system?

The following table 10.26. shows how many institutions rely on personal judgement of responsible staff and on career analyses of exemplified candidates in the empirical evaluation process concerning the particular groups.

Overview of methods for the different groups

Tab. 10.26.: Methods of empirical evaluation
                                           Ab Initio  Ready  FOs  Cpt.s  Sums
Personal judgement of responsible staff    3          2      3    2      10
Career analyses of exemplified candidates  4          3      4    2      13
No. of answering institutions              7          5      7    4
Career analyses of exemplified candidates (n = 13) are applied more often than personal judgement of responsible staff (n = 10).

Question 49.B: Hit rate with regard to criteria during further career?

This question was answered, depending on the type of group, by different numbers of institutions. The following table 10.27. indicates how many institutions rely on the hit rate during further stages of the career as a method of empirical validation with regard to the particular groups.

Survey: Hit rate with regard to criteria during further career?

Tab. 10.27.: Hit rate
Groups       Yes  No  Total no. of institutions
Ab Initio    7    1   8
Ready Entry  6    0   6
FOs          4    0   4
Cpt.s        3    0   3
The following table 10.28 shows which career steps are used by how many institutions as criteria for the evaluation of the hit rate.
The data refer to the answers of 12 institutions.

Survey: Hit rate with regard to criteria during further career?

Tab. 10.28.: Hit rate
No.  Criteria                  Yes  No
Ab Initios
1    Theoretical examinations  5    7
2    Flight school training    6    6
3    Type rating               3    9
4    IOE results               0    12
5    Proficiency checks        1    11
6    Upgrading                 0    12
7    HR development data       1    11
8    Sickness rate             0    12
9    Other criteria            0    12
Ready Entries
1    Theoretical examinations  3    9
2    Flight school training    3    9
3    Type rating               5    7
4    IOE results               2    10
5    Proficiency checks        3    9
6    Upgrading                 2    10
7    HR development data       1    11
8    Sickness rate             1    11
9    Other criteria            0    12
FOs
1    Theoretical examinations  2    10
2    Flight school training    0    12
3    Type rating               5    7
4    IOE results               3    9
5    Proficiency checks        4    8
6    Upgrading                 2    10
7    HR development data       0    12
8    Sickness rate             1    11
9    Other criteria            0    12
Cpt.s
1    Theoretical examinations  1    11
2    Flight school training    0    12
3    Type rating               2    10
4    IOE results               2    10
5    Proficiency checks        4    8
6    Upgrading                 2    10
7    HR development data       0    12
8    Sickness rate             0    12
9    Other criteria            0    12
Apart from the criteria given in the questionnaire, no further categories were mentioned.
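The hit-rate evaluation described above reduces to a simple proportion: of the candidates accepted by the selection system, how many later passed a career criterion such as flight school training. A minimal sketch (the data are invented for illustration):

```python
# Illustrative sketch of the "hit rate" criterion used for empirical
# validation: the share of accepted candidates who later pass a career
# criterion (e.g. flight school training). The outcome data are made up.

def hit_rate(outcomes):
    """outcomes: booleans, True = a selected candidate later passed
    the criterion (a 'hit')."""
    return sum(outcomes) / len(outcomes)

# Example: 9 of 10 selected Ab Initio candidates passed flight school training.
print(hit_rate([True] * 9 + [False]))  # 0.9
```

Note that a hit rate computed only on accepted candidates says nothing about rejected candidates who would have succeeded; that restriction-of-range problem is one reason the more formal mathematical procedures of Question 50 exist.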
Tab. 10.30.: Mathematical correlation
No.  Category                  No. of institutions (Yes)
FOs
1    Theoretical examinations  0
2    Flight school training    1
3    Type rating               2
4    IOE results               1
5    Proficiency checks        2
6    Upgrading                 1
7    HR development data       0
8    Sickness rate             0
Cpt.s
1    Theoretical examinations  0
2    Flight school training    0
3    Type rating               1
4    IOE results               1
5    Proficiency checks        2
6    Upgrading                 0
7    HR development data       0
8    Sickness rate             0
For Ab Initio and Ready Entry candidates, methodically more sophisticated processes appear significantly more often than for FOs and Cpt.s. One reason could be that selection processes are generally applied less frequently to the latter groups, but also that the types of process applied for these groups (cf. question 40) are harder to handle methodically.

Question 50: Which type of mathematical procedure did you apply for empirical evaluation?
This question was answered by 12 institutions. The following table 10.31. shows how many institutions apply a certain mathematical method for empirical evaluation with regard to the different career groups.

Survey: Frequency of institutions which applied a mathematical procedure

Tab. 10.31.: Mathematical method
Category     Yes  Percent (Yes)  No
Ab Initio    4    33,3           8
Ready Entry  4    33,3           8
FOs          4    33,3           8
Cpt.s        3    25,0           9
For Ab Initios, Ready Entries and FOs a certain mathematical method was applied in 4 cases each; for Cpt.s in only 3 cases. The following table 10.32. gives a survey of which mathematical method was applied for the different groups by the institutions.
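One mathematical procedure commonly used for this kind of empirical evaluation is a correlation between selection scores and a later training criterion. The survey does not say which procedure the respondents actually used; the following Pearson correlation sketch, with invented numbers, only illustrates the idea:

```python
# Illustrative sketch of a Pearson correlation between selection test
# scores and a later training criterion (hypothetical ground-school
# grades). The procedure choice and all numbers are assumptions, not
# taken from the survey responses.
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

scores = [55, 60, 48, 70, 65]        # selection test scores (invented)
grades = [3.0, 3.2, 2.5, 3.8, 3.4]   # later ground-school grades (invented)
print(round(pearson_r(scores, grades), 2))  # 0.99
```

In practice such validity coefficients are attenuated by restriction of range, since criterion data exist only for the candidates who were accepted.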
Question 51.A: What are your failure rates during Ab Initio courses?

The question was answered by 7 institutions. The following table 10.33. shows which percentage of candidates failed Ab Initio training at how many institutions.

Tab. 10.33.: Ab Initio training
No.  Failure rate %  Frequency
1    0               1
2    2               1
3    3               1
4    5               1
5    8               1
6    10              1
7    100             1
     Total           7
The second column lists the percentage of candidates who did not pass the Ab Initio training.

Question 51.B, C, D, E: What are your failure rates during Type rating courses?

These questions were answered by different numbers of institutions depending on the type of training. The following table 10.34. shows at how many institutions which percentages of the particular career groups do not pass the different types of training (failure rate).
The left column contains the type of training; the remaining columns contain the failure rate percentages for the different career groups, together with the number of institutions reporting a failure rate at this height.

Tab. 10.34.: Overview on failure rates
                        Ab Initios       Ready Entries    FOs              Cpt.s
Categories              %    Institut.   %    Institut.   %    Institut.   Institut.
Type rating 1           0    1           0    2           0    2           2
Type rating 2           0    1           2    1           100  2           0
Type rating 3           100  1           0    0           0    0           0
Line Familiarization 1  0    1           0    1           0    2           2
Line Familiarization 2  3    1           5    1           10   1           1
Proficiency Checks 1    2    1           0    1           0    1           1
Proficiency Checks 2    10   1           5    1           5    1           1
Proficiency Checks 3    0    0           0    0           8    1           1
Upgrading 1             5    1           3    1           0    1           0
Upgrading 2             100  1           0    0           4    1           0
Upgrading 3             0    0           0    0           5    1           0
Upgrading 4             0    0           0    0           10   1           -

The failure rate percentages for the Cpt.s group are not contained in the data.
Questions 52 and 53: Describe the composition of your selection team and their tasks

The questions referring to different aspects of selection in this chapter 11 were partly answered by a larger number of institutions than question 2 (n = 10) (see page 7) at the beginning of the questionnaire. This may be caused by different understandings of the questions. Question 2 focusses on a selection concept, i.e. a formalized procedure like a selection system, whereas question 52 focusses on a selection team, which may also be responsible for selection procedures like screening or on-the-job selection. This question was answered by 11 institutions. The following table 11.1. shows how often which types of staff are responsible for the selection. Several answers were allowed. The data are based on the answers of 11 institutions.

Survey on different types of selection staff

Tab. 11.1.: Types of selection staff
Category                                               No. of institutions
Qualified psychologist                                 8
Trained staff of the psychologist or your company      7
Administration staff of your company (non-pilots)      8
Postholders (pilots). If "yes", in which function(s)?  9
8 institutions employ qualified psychologists, 7 work with trained staff of the psychologist or the company, 8 with administration staff of the company (non-pilots) and 9 with postholders (pilots). The following table 11.2. specifies the statement about postholders by defining their departmental function more closely. The data are based on the answers of 8 institutions.

Postholders (pilots). If "yes", in which function(s)?

Tab. 11.2.: Postholders (pilots)
No.  Category                                Frequency  Percent
1    FOP                                     1          0,9
2    HR-manager, Pilot Recruitment Manager   1          0,9
3    Management pilots                       1          0,9
4    Manager Flight Operations               1          0,9
5    PILOT MANAGEMENT                        1          0,9
6    Training Captains                       1          0,9
7    TRE                                     1          0,9
8    We work together as an expert team      1          0,9

In most cases management staff are concerned.
The following table 11.3. shows the combination of types of staff per institution. The left column contains the case number of the institution, followed by the single functions which occur. The rightmost column shows the general function of the postholders in each case. The data are based on the answers of 11 institutions.

Tab. 11.3.: Combinations of types of staff
No.  Qualified      Trained staff of the     Administration staff  Postholders (pilots)
     psychologists  psychologist or company  of the company
1    1              1                        1                     Management pilots
2    0              0                        1                     0
3    0              1                        1                     Training Captains
4    0              1                        1                     --
5    0              1                        1                     FOP
6    1              1                        0                     We work together as an expert team
7    1              1                        1                     Manager Flight Operations
8    1              0                        1                     0
9    0              0                        1                     HR-manager, Pilot Recruitment Manager, VP Pilot Management
10   1              1                        0                     -
11   1              0                        0                     TRE

Question 54:
This question was answered by 12 institutions. The following table 11.4. shows how often which type of task is done by which type of staff. The left column contains the general professional function; the following columns show how often this type of staff is responsible for which function. The numbers within the fields describe the number of institutions in which this particular function applies to the respective general professional function.

Survey: For which functions are they responsible in the selection process?

Tab. 11.4.: Responsibility in the selection process
Functions covered: Organization; Running tests; Performing interviews; Monitoring performance and evaluation in groups; Monitoring performance and evaluation in simulator; Data interpretation and performance evaluation
The postholders form the biggest group (n = 36), followed by the psychologists (n = 34) and the administration staff of the company (non-pilots) (n = 30). Trained staff of the psychologist form the smallest group (n = 26).

Question 54.A: Other function(s)? If "yes", please specify.

This question was answered by one institution. The following table 11.5. contains the specification of an institution on the question about the functions (see table above).

Tab. 11.5.: Specification
No.  Category                                      Frequency  Percent
1    We have a registered training for our staff.  1          0,9
Question 55:
This question was answered by 33 institutions. The following table 11.6. shows how many of the institutions perform selection themselves and how many cooperate with another company in terms of selection.

Who is performing the selection?

Tab. 11.6.: Who is performing the selection?
No.  Category                        Frequency  Percent
1    My own company                  25         22,7
2    Both of them (each one a part)  8          7,3
     Total                           33         30,0
25 of the 33 institutions who answered the question stated that their company performs the selection on its own. 8 institutions stated that they cooperate with other companies. None of the institutions stated that the selection is performed exclusively by another company. The following table 11.7. gives a survey of the persons/departments of the own company responsible for selection, in the case of a performance without the help of another institution.

Survey: Which person/department of the company is responsible?

Tab. 11.7.: Persons/departments responsible for selection
No.  Category                                                               Frequency of institutions
1    ATO Head of Training                                                   1
2    Department in the division responsible for pilot selection             1
3    Flight Crew Recruitment Section of Flight Operations Department        1
4    Flight Operation Manager                                               1
5    Flight Operations                                                      1
6    Flight Operations and Human Resources for the air carrier. Myself
     for my company                                                         1
7    Flight Operations Division                                             1
8    Flight Operations with HR coordination                                 1
9    Flight OPS, Training Center                                            1
10   For Ab Initio: Flight Training Center. For FO and Captain: Flight
     Crew 919                                                               1
11   Head of Training                                                       1
12   HR                                                                     1
13   Human Resource Department - responsible of the airline                 1
14   Human resources / training department                                  1
15   Human Resources Department                                             1
16   Manager Crew Resources of the Crew Resources Department                1
17   Manager Pilot Recruitment and Transfer                                 1
18   Multiple                                                               1
19   HUMAN RESOURCES                                                        1
20   Selection team run by our Selection Manager                            1
21   Training Manager                                                       1
The following table 11.8. shows which person/department is responsible for performing the selection, in the case of a performance with the help of another institution. This question was answered by 8 institutions; each entry was named by one institution (frequency 1).

Survey: Which person/department in your company is responsible for performing the selection?

Tab. 11.8.: Person/department responsible for selection
1. Accountable Manager & Manager Flight Operations
2. Head of Trg. and the HR
3. FLIGHT OPERATION HUMAN RESOURCES
4. Human Resources SPL/OP Pilot Recruitment Manager
5. Leadership team/Human Resources
6. Myself and 2 others
7. OPS RESOURCE MANAGER
8. Recruitment department, management pilots
Question 56: In the case your own company performs the selection partly or in total, do you have a special procedure to identify selection team members?

This question was answered by 32 institutions. The following table 11.10. shows how many institutions have special procedures to identify their selection team members. The much bigger number of institutions who answered this question, compared with qu. 2, may result from a different understanding of the question.

Tab. 11.10.: Frequency of special procedure to identify selection team members
No.  Categories  Frequency  Percent
1    Yes         24          75.0
2    No           8          25.0
     Total       32         100.0

24 institutions have such special procedures to identify their selection team members. 8 institutions do not have such special procedures.
Question 57: If yes in qu. 56: What is your process to identify selection team members?

This question was answered by 21 institutions. The following table 11.11. contains the particular measures/qualifications. Each entry was named by one institution (frequency 1).

Tab. 11.11.: Process to identify selection team members
1.  1 HR manager, 1 Pilot Recruitment Manager; the vice-presidents of the "flying" units circulate as chairmen of the selection board, i.a.w. the vice-presidents of the B777, B744, B737, MD11/A330 units
2.  Academic background and experience
3.  Board, written in Part D
4.  CHECK AIRMAN TYPE RATED FOR THE SPECIFIC TYPE OF AIRCRAFT WHICH THE HIRING IS FOR; OPERATION MANAGEMENT MEMBER
5.  Flight experience, experience in human resources, management, operations experience
6.  FOP Management Team or Selection Interview
7.  HR filter using various metrics, screening by reporting leadership, interview process
8.  Human Resources Department of the Division selects team members
9.  Identify best candidates, perform training, give them hands-on experience and only let them select with a Senior Selection team member
10. Instructor rating, experience, knowing the local traditions and habits, CRM and basic psychology
11. Interviews
12. Language Proficiency and Aero Experience
13. Leadership team members
14. Management pilot
15. Management staff who have undergone training
16. Meeting minimum qualifications which must include experience in pilot selection at an airline or flight training organization
17. Nomination of experienced TRE by postholder training
18. The interview team composed of the Assistant General Manager Aircrew, Manager Crew Resources, Crew Resources Specialist and Management Pilots
19. They are pilots, TRE, TRI and staff of Flight Training Center & Flight Crew 919
20. They form part of Flight Operations Management Team
21. We choose members from branch management based on their experience and provide limited training from the human resources department
Question 58: From where do they get their qualification for this function?

This question was answered by 25 institutions. The following table 11.12. shows how, and from where, the members of the selection teams get their qualification.

Tab. 11.12.: Qualification
Categories                      Yes (No. of institutions)  No (No. of institutions)
On the job training             19                          6
Experience in former functions  17                          8
Special training                17                          8
Regular recurrent training       5                         20
Cooperation with experts        14                         11

At 19 institutions the qualification takes place as on-the-job training, at 17 institutions on the basis of experience in former functions, at 17 institutions as special training, at 5 institutions on the basis of regular recurrent training and at 14 institutions as cooperation with experts.

Other? If "yes", please describe.

Tab. 11.13.: Specification (no entries). There was no other way of qualification mentioned than the given ones.

Question 59: Who decides about hiring of pilots in your company?
This question was answered by 39 institutions. The following table 11.14. shows how often which person/group in an institution decides about the hiring of pilots.

Tab. 11.14.: Who decides about hiring of pilots in your company?
Categories                     Yes (No. of institutions)  No (No. of institutions)
Selection team                 24                         15
Management of the airline       4                         35
HR-responsible of the airline  10                         29
Pilots (Postholders)           13                         26

In 24 institutions the decision is made by the selection team, in 4 cases it is made by the management, in 10 cases it is made by the HR-responsible and in 13 cases it is made by pilots in the respective functions.
The following table 11.19. contains a further specification of who makes the hiring decision.

Other? If "yes", please describe.

Tab. 11.19.: Specification
No.  Categories                                                          Frequency
1    The Recruitment Selection Panel, members include Assistant General
     Manager Aircrew, Manager Crew Resources, Chief Pilot and Manager
     Flying Training.                                                    1

This specification was made by one institution.

Question 60: Is the decision (hiring) solely based on results of the pilot selection system?
This question was answered by 32 institutions. The following table 11.20. shows at how many institutions the decision is solely based on the results of the pilot selection system.

Tab. 11.20.: Base of selection decision
No.  Categories  Frequency
1    Yes         23
2    No           9
     Total       32

23 institutions answered this question with yes, 9 institutions answered this question with no.

Question 61: Which additional factors influence the hiring decisions?
This question was answered by 38 institutions. The following table 11.21. shows how often the institutions take which additional factors into account for the hiring decision.

Tab. 11.21.: Additional influences on the hiring decision
Ab Initio
  Availability of applicants (time)   Yes: 8   No: 30
  Flying experience                   Yes: 6   No: 32
  Salary, requested by the applicant  Yes: 0   No: 38
  Administrative aspects              Yes: 5   No: 33
  Legal aspects                       Yes: 6   No: 32
Ready Entry
  Availability of applicants (time)   Yes: 7   No: 31
  Flying experience                   Yes: 5   No: 33
  Salary, requested by the applicant  Yes: 0   No: 38
  Administrative aspects              Yes: 2   No: 36
  Legal aspects                       Yes: 2   No: 36

The same five factors were surveyed for the FOs and the Cpt.s groups.

Question 61.A: Other factors? If "yes", please describe.
The following tables 11.22. to 11.25. show, for each group separately, which additional factors play a role in the hiring decision. Each entry was named by one institution (frequency 1, percent 0.9).

Ab Initio - Tab. 11.22.: Specification
1. DLR PASS
2. English Test / Personality Test
3. Regulator approval
4. Results of the training course

Ready Entry - Tab. 11.23.: Specification
1. Results of the grading, psycho-test and the interview with the selection board

FOs - Tab. 11.24.: Specification
1. Background reference check
2. Relevant other qualifications beyond flying experience, i.e. academic formation
3. TYPE RATED

Cpt.s - Tab. 11.25.: Specification
1. Relevant other qualifications beyond flying experience, i.e. academic formation
2. TYPE RATED
12.
Question 62: Definition of requirement dimensions

This question was answered by 12 institutions. The following table 12.1. shows how often the requirement dimensions of the selection systems are defined according to which criteria.

Tab. 12.1.: Definition of requirement dimensions
Categories                                           Yes (No. of institutions)  No (No. of institutions)
Defined by the tests we use                          7                          5
Based on personal judgement                          5                          7
Based on work samples                                5                          7
Based on requirement definitions of airline experts  6                          6
Based on scientific requirement analyses             5                          7

Methodically more challenging concepts of definition (work samples, requirement definitions of experts, scientific requirement analyses) appear more often than concepts based upon individual expertise.

Question 62.A: Other ways of definition? If yes, please describe.

Tab. 12.2.: Specification (no entries). No other ways of definition were mentioned.
The following tables 12.3. to 12.7. show who constructed the respective selection system.

Individual: This alternative was not chosen by any of the institutions (Tab. 12.3.).

Function: This alternative was not chosen by any of the institutions (Tab. 12.4.).

Department: This alternative was chosen by 4 institutions.
Tab. 12.5.: Department
No.  Categories         Frequency  Percent
1    Flight Operations  1          0.9
2    Flight OPS         1          0.9
3    Flight Training    1          0.9
4    Human Resources    1          0.9

Institution: This alternative was chosen by 1 institution.
Tab. 12.6.: Institution
No.  Categories        Frequency  Percent
1    Special provider  1          0.9

Company: This alternative was chosen by 2 institutions.
Tab. 12.7.: Company
No.  Categories        Frequency  Percent
1    Airline           1          0.9
2    Special provider  1          0.9
Question 64: How is your selection system structured (measuring dimensions)?

This question was answered by 11 institutions. The following table 12.8. shows which types of evaluation areas/measuring dimensions are applied by how many institutions for the different groups.

Tab. 12.8.: Structure of selection system with regard to measuring areas
Groups       Application  Basic mental  More complex pilot    Personality  Social
             data         abilities     specific competences  features     abilities
Ab Initio    8            7             7                     7            7
Ready Entry  5            5             4                     4            4
FOs          5            5             4                     5            4
Cpt.s        3            2             2                     3            3

The following table 12.9. allows a comparison between the particular institutions (cases 1 to 11) with regard to the different career groups and measuring dimensions. For each selection phase it lists the cases at which the phase is applied to the respective group.

Tab. 12.9.: Comparison of groups with regard to the structure of the selection system
A preselection phase based on application data:
  Ab Initio: cases 1, 2, 3, 5, 7, 8, 9, 11; Ready Entry: cases 1, 2, 7, 8, 11; FOs: cases 2, 5, 6, 7, 11; Cpt.s: cases 1, 2, 6
A selection phase focusing on basic mental abilities:
  Ab Initio: cases 1, 3, 5, 7, 8, 9, 11; Ready Entry: cases 1, 7, 8, 9, 11; FOs: cases 5, 6, 7, 9, 11; Cpt.s: cases 1, 6
A selection phase focusing on more complex, pilot specific competences:
  Ab Initio: cases 1, 3, 5, 7, 8, 9, 11; Ready Entry: cases 1, 8, 9, 11; FOs: cases 5, 6, 9, 11; Cpt.s: cases 1, 6
A selection phase focusing on personality features:
  Ab Initio: cases 1, 3, 5, 7, 8, 9, 11; Ready Entry: cases 1, 8, 9, 11; FOs: cases 4, 5, 6, 9, 11; Cpt.s: cases 1, 4, 6
A selection phase focusing on social abilities:
  Ab Initio: cases 1, 3, 5, 7, 8, 9, 11; Ready Entry: cases 1, 8, 9, 11; FOs: cases 5, 6, 9, 11; Cpt.s: cases 1, 6
Question 65: Do your candidates get any information about the selection procedure in advance?

This question was answered by 10 institutions. The following table 12.10. shows at how many institutions the candidates get information about the selection procedure in advance.

Tab. 12.10.: Information about selection procedure
No.  Categories  Frequency  Percent
1    Yes          7          70.0
2    No           3          30.0
     Total       10         100.0

At 7 institutions the candidates get information in advance. At 3 institutions they do not get any information at this stage. The following table 12.11. shows at how many institutions the candidates of the different groups get which type of information in advance.

Tab. 12.11.: Details about given information
Groups     Duration  Organization  Type of requirement  Type of  How to   Grading  Other?
                                   dimensions           tests    prepare  system
Ab Initio  7         7             6                    6        4        3        0
Ready      6         6             5                    5        3        2        0
FOs        5         5             5                    5        4        2        0
Cpt.s      2         2             2                    1        0        0        0

Organizational information is given most frequently. Information about the type of requirement dimensions and tests is given slightly less frequently. Information about possibilities of preparation and grading systems is given least frequently.
Question 66: Do you accept preparation courses for your selection procedure?
This question was answered by 10 institutions. The following table 12.12. shows how many institutions accept preparation courses for their selection procedure.

Tab. 12.12.: Preparation courses
No.  Categories  Frequency  Percent
1    Yes          3          30.0
2    No           7          70.0
     Total       10         100.0
3 institutions accept preparation courses. 7 institutions do not accept preparation courses.

Question 67: Do you support preparation courses for your selection procedure?

This question was answered by 11 institutions. The following table 12.13. shows how many institutions support preparation courses for their selection procedure.

Tab. 12.13.: Preparation courses
No.  Categories  Frequency  Percent
1    Yes          1           9.1
2    No          10          90.9
     Total       11         100.0

1 institution supports such courses. 10 institutions do not support such courses.

Question 68: For how many years do you keep your selection results valid (years/group) in case you cannot immediately hire the candidates and put them on a waiting list?

This question was answered by 16 institutions for the Ab Initios, by 13 institutions for the Ready Entries, by 18 institutions for the FOs and by 11 institutions for the Cpt.s.
Question 69: Prognosis of suitability for the Cpt.s role

This question was answered by 11 institutions. The following table 12.15. shows at how many institutions the selection system allows, at which stage, a prognosis concerning suitability for the captain's role.

Tab. 12.15.: Stage for prognosis concerning the suitability for the Cpt.s role
Categories  Yes (No. of institutions)  Percent (Yes)  No (No. of institutions)
Ab initio   7                          63.6           4
Ready       7                          63.6           4
FOs         6                          54.5           5
Cpt.s       3                          27.3           8
Question 70: How is the result of your selection process presented to the candidates (several answers are possible)?

This question was answered by 37 institutions. The following table 12.16. shows how often the institutions use which type of presentation of results toward the candidate.

Tab. 12.16.: Presentation of selection results to the candidates
Categories                                        Yes  Percent (Yes)  No
Verbal description of strengths and weaknesses    12   32.4           25
Only information about pass/fail                  17   45.9           20
Only information about position in a rank row      0    0             37
Scale value/with reference to a cut off value      1    2.7           36
Profile of results with important/all dimensions   5   13.5           32

Question 70.A: Other? If yes, please describe.

The following table 12.17. contains further ways of presenting results. Each entry was named by one institution (frequency 1, percent 0.9).

Tab. 12.17.: Specification
1. Copy of paper results goes to
2. In case of failure an offer is made to discuss the (negative) results with the Pilot Recruitment Manager

Question 70.B: Is the feedback given in written form or in a personal conversation?

This question was answered by 37 institutions. The following table 12.18. shows at how many institutions feedback is given in written form and at how many institutions feedback is given in a personal conversation.

Tab. 12.18.: Type of feedback
No.  Categories             Frequency
1    Written form           19
2    Personal conversation  18
     Total                  37

19 institutions give feedback in written form to the candidate. 18 institutions give feedback in a personal conversation with the candidate.
Question 71: How is the result of your selection process presented to the hiring decision maker?

This question was answered by 37 institutions. The following table 12.19. shows how often the institutions use which type of presentation of results toward the hiring decision maker.

Tab. 12.19.: Presentation of results to the hiring decision maker
Categories                                        Yes  Percent (Yes)  No
Verbal description of strengths and weaknesses     7   18.9           30
Only information about pass/fail                   3    8.1           34
Only information about position in a rank row      1    2.7           36
Scale value/with reference to a cut off value      4   10.8           33
Profile of results with important/all dimensions  11   29.7           26
All available information                         20   54.1           17

Other? If "yes", please describe.

Tab. 12.20.: Specification (no entries). Other types of presentation of results were not mentioned by the institutions.

Question 72: How do you ensure data protection of the selection results?
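In Tab. 12.19., unlike the tables that use the 110-institution base, the percentages are relative to the 37 responding institutions, and each Yes/No pair sums to 37. A quick consistency check (a sketch; the variable names are ours):

```python
# Tab. 12.19. rows as (category, yes, percent_yes, no); base = 37 respondents.
rows = [
    ("Verbal description of strengths and weaknesses",   7, 18.9, 30),
    ("Only information about pass/fail",                 3,  8.1, 34),
    ("Only information about position in a rank row",    1,  2.7, 36),
    ("Scale value/with reference to a cut off value",    4, 10.8, 33),
    ("Profile of results with important/all dimensions", 11, 29.7, 26),
    ("All available information",                        20, 54.1, 17),
]
RESPONDENTS = 37
for _, yes, pct, no in rows:
    assert yes + no == RESPONDENTS                    # every respondent answered each item
    assert round(yes / RESPONDENTS * 100, 1) == pct   # percent is relative to respondents
print("Tab. 12.19. is internally consistent")
```

The check passes for all six rows, confirming that the Percent (Yes) column was computed over the 37 respondents.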
This question was answered by 8 institutions. The following table 12.21. contains the mentioned possibilities of data protection. Each concept was named by one institution (frequency 1, percent 0.9).

Tab. 12.21.: Specification: data protection
1. AREA RESTRICTED
2. Data kept confidential by relevant department and supervised by manager
3. Data Privacy Act
4. In company HR systems and due to the small size of the group of people having access to the data
5. Kept secured in the human resources department
6. Managed by our selection provider
7. Restricted access and securing data of information
8. The selection results from different stages are concentrated at a specially assigned flight OPS officer who is responsible to calculate and issue the results of the selection

All 8 institutions name explicit concepts for data protection.
Question 73: Do you perform a reference check on Ready Entry/FO/Cpt. candidates?
This question was answered by 37 institutions. The following table 12.22. shows how many institutions perform a reference check on Ready Entry/FO/Cpt. candidates.

Tab. 12.22.: Reference check
No.  Categories  Frequency  Percent
1    Yes         31          83.8
2    No           6          16.2
     Total       37         100.0

31 institutions perform such a reference check. 6 institutions do not perform such a reference check.

Question 74: Do you allow failed candidates to re-apply?
This question was answered by 10 institutions. The following table 12.23. shows how many institutions allow failed candidates to re-apply.

Tab. 12.23.: Repetition of the test
No.  Categories  Frequency
1    Yes          6
2    No           4
     Total       10

6 institutions allow a repetition of the test, 4 institutions do not allow a repetition of the test.

Question 75: What are the re-applying criteria?

This question was answered by 7 institutions. The following table 12.24. shows how often the institutions allow a re-application under which conditions.

Tab. 12.24.: Re-applying criteria
Categories                                        Yes  Percent (Yes)  No
After a certain time                               5   71.4           2
Performance close to cut off criteria              2   28.6           5
Depending on the type of weakness                  4   57.1           3
If weaknesses can be corrected in a certain time   3   42.9           4
Question 76: Selection of re-applying candidates

This question was answered by 6 institutions. The following table 12.26. indicates how often the institutions apply which selection process to re-applying candidates.

Tab. 12.26.: Selection of re-applying candidates
No.  Categories                      Frequency
1    Same as the first time          4
2    Tests only for the weak points  1
3    A different test battery        1
     Total                           6

4 institutions do not change anything in terms of the selection concept. 1 institution repeats only the tests with low achievements and 1 institution uses a different test battery.

Other? If "yes", please describe.

Tab. 12.27.: Specification (no entries).
None of the institutions stated further concepts.

Question 77: Do you adapt the conditions/standards/procedures during periods of high demand for pilots?

This question was answered by 10 institutions. The following table 12.28. shows how many institutions adapt their conditions/standards/procedures in periods of high demand for pilots.

Tab. 12.28.: Number of institutions who adapt to periods of high demand for pilots
No.  Categories  Frequency  Percent
1    Yes          2          20.0
2    No           8          80.0
     Total       10         100.0

2 institutions adapt their conditions/standards/procedures; 8 institutions do not.
Question 77.A: If yes, in which way?

This question was answered by 11 institutions. The following table 12.29. shows in which way the institutions adapt their conditions/standards/procedures in periods of high demand for pilots.

Tab. 12.29.: Strategies of adaptation to periods of high demand for pilots
Categories                                                 Yes  Percent (Yes)  No  Percent (No)
Recruiting more candidates for being tested                 1    9.1           10   90.9
Accepting candidates on a lower minimal performance level   1    9.1           10   90.9
Reducing requirements for career steps                      0    0             11  100.0

Other? If "yes", please describe.

Tab. 12.30.: Specification (no entries). None of the institutions mentioned further ways of adaptation.

Data Source

Tab. 12.31.: Data Source
No.  Categories                                  Frequency  Percent
1    IATA Pilot Selection Survey_Main Contact     86         78.2
2    IATA Pilot Selection Survey_Other Contacts   24         21.8
     Total                                       110        100.0
13. Quality Management
The following table 13.1. shows at how many institutions the selection concept is certified by a quality management concept. The data is based on the answers of n = 35 institutions which perform the selection themselves or with the help of other institutions. Table 13.1. shows, from left to right, which institutions have a selection concept for which target groups, whether it is incorporated in a QMS, whether the performing organization is certified, and on the right side which instance maintains the selection system according to the QMS concept. Rows without values in the left columns belong to institutions which let other institutions perform their selection.

20 institutions state that their concept is incorporated in a QMS concept; 9 institutions state that this is not the case; 6 institutions do not answer this question at all. 16 institutions state that the performing institution is certified; 9 institutions state that this is not the case; 10 institutions do not answer this question at all. In 5 cases an individual is responsible for the maintenance, in 6 cases an official (function), in 16 cases a department, in 3 cases an institution and in 6 cases a company. There are a few double entries with regard to the categories of differentiation.
Tab. 13.1.: Quality Management

The columns of the table are: No.; selection concept in place for which groups (Ab Initio, Ready Entry, FOs, Cpt.s); "Is your selection system incorporated in the QMS of your company?" (Yes/No); "Is the organization performing the selection certified?" (Yes/No); and "Who maintains the selection system in terms of QM?" (Individual, Function, Department, Institution, Company). Maintaining instances named by the institutions include: Flight operations manager; AirService; Division HR; Flight Planning Department (F/O, CPT) and Human Resources Department (Ab Initio, Ready Entry); flight OPS; training; Human Resources; Airline; training manager; provider; Coordinator; Crew Resources; HT; Airline Group; ROSANA C AMARO.
Attachment 1: Questionnaires filled in

The survey consisted of three parts. Of the 66 participating institutions, 53 filled in Part I, 19 filled in Part II and 19 filled in Part III. 12 institutions completed all three parts, 1 institution completed exactly two parts and 53 institutions completed exactly one part.

Part I was filled in by institutions No. 1 to 39 as well as No. 40, 43, 44, 49, 50, 55, 56, 57, 61, 62, 63, 64, 65 and 66 (53 in total).
Part II was filled in by institutions No. 1, 5, 8, 9, 16, 17, 21, 23, 27, 30, 31, 36, 41, 43, 45, 46, 52, 58 and 59 (19 in total).
Part III was filled in by institutions No. 1, 8, 9, 16, 17, 21, 23, 27, 30, 31, 36, 42, 43, 47, 48, 51, 53, 54 and 60 (19 in total).
All three parts were filled in by institutions No. 1, 8, 9, 16, 17, 21, 23, 27, 30, 31, 36 and 43; institution No. 5 filled in exactly two parts (Parts I and II).
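The part-completion totals of Attachment 1 (53 / 19 / 19 questionnaires, 12 institutions completing all three parts) can be cross-checked from the per-part membership lists; a small sketch (membership sets transcribed from the attachment, variable names ours):

```python
from collections import Counter

# Institutions (by No.) that filled in each part of the questionnaire,
# transcribed from Attachment 1 of the survey appendix.
part_1 = set(range(1, 40)) | {40, 43, 44, 49, 50, 55, 56, 57, 61, 62, 63, 64, 65, 66}
part_2 = {1, 5, 8, 9, 16, 17, 21, 23, 27, 30, 31, 36, 41, 43, 45, 46, 52, 58, 59}
part_3 = {1, 8, 9, 16, 17, 21, 23, 27, 30, 31, 36, 42, 43, 47, 48, 51, 53, 54, 60}

print(len(part_1), len(part_2), len(part_3))   # 53 19 19

# How many parts did each institution complete?
parts_done = Counter()
for part in (part_1, part_2, part_3):
    for no in part:
        parts_done[no] += 1

counts = Counter(parts_done.values())
print(counts[3], counts[2], counts[1])         # 12 1 53
```

The union of the three sets covers all 66 institutions, so every participant returned at least one part.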
Printed in Canada
ISBN 978-92-9233-779-7