Employee Satisfaction Survey: by David C Lusty
QUANTIFY London 18 Rodway Road Roehampton Village London SW15 5DS 08452 41 41 60
[email protected]
QUANTIFY Yorkshire 4 West Parade Wakefield West Yorkshire WF1 1LT 0845 241 3450
[email protected]
www.quantify.co.uk

COPYRIGHT NOTE: This is copyright material. We encourage you to share it with anyone you know who may be interested, but you may only do so by copying the entire document. Any other use or reproduction of this document or any part of it without our written permission is a breach of our copyright.

First published 2001. This edition 2006. Last revised 21 February 2011.
Contents
About the author
About the Book
  Kit components
Information is Power
  Management Information
    Financials
    People and their feelings
    Information about people's feelings
    Employee satisfaction survey
    What's in a name?
Plan the project
  Purpose
  Benefits
    Checklist: Possible objectives of the survey
  Cost
  In house or outsourced?
    Checklist: Reasons to outsource parts of the project
Methodology
  Web or paper
    Reason quoted
    We say
    Checklist: Advantages of a web based survey
    Checklist: Disadvantages of a web based survey
  Sample or census?
    Checklist: Advantages of the Census approach
Develop / agree a questionnaire
  Reuse an old one?
    Checklist: Reasons for reusing an existing questionnaire
    Checklist: Reasons to develop a new questionnaire
  Topic areas
    Checklist: Possible survey topics
  The Development Process
    Checklist: An effective questionnaire
    Focus groups
    Other preliminary research
  Size
  So What?
  Comments
  Classifications
    Checklist: Possible classification systems
  Return address
  Pilot
    Pilot Procedure
  Design
  Translation
  Printing
Maximising response rate
  Checklist: Maximising response rate
Publicise the project
  Checklist: Possible publicity routes
  Preferred completion route: web or paper
Distribute questionnaires
  The questionnaire pack
  The Covering Letter
    Checklist: The Covering letter
  Model text for a covering letter
  Closing date
  N members
  Business Reply Envelope
  Home or work address?
    Checklist: Advantages of internal distribution
    Checklist: Disadvantages of internal distribution
  A Web Option?
    Access controlled, Confidentiality assured
    Offer both routes and secure respondent commitment in advance
Gather responses; Capture data; Analyse results
  Checklist: Reasons to outsource the processing to QUANTIFY
  Progress report
  Tabulation by location or department
  Analysing & reporting the results
    Quantitative results
      Total percentage favourable
      Averages
      Percentiles
      Significance
      Indices
      Importance
      Don't like numbers?
    Qualitative results
      Anecdotal use
      Analytical use
Normative (Benchmarking) Comparisons
  Caution
    Expectation
    Wording
    Response frame
    Sequence
    Conclusion
Take action
  Publish the results
  Agree initiatives
  Action
  Measure again
  Add to your Management Information System (MIS)
Instruct us to help with your employee satisfaction survey
  When you instruct us, you get
    You get
    So that
  When you instruct us, YOU choose
    You choose
    So that
  QUANTIFY Menu of services
    Employee Satisfaction Surveys
      Service Guarantee
    More information
  Other services
    Employment Cycle Research
    Customer Satisfaction
    360° Feedback (Management Strengths)
      Off-the-shelf
      Bespoke
    Teamwork (Internal Customer Satisfaction)
    Quantitative data from written remarks
Kit components

- YourESSRK.pdf; the Research Kit edition of this document, including full instructions for the other components
- ItemsESSRK.xls; the survey items (questions) bank of 1,700 items
- Quantify.dot; a Word add-in providing a useful questionnaire editing toolbar
- QuaireRK.dot; the Word template for creating questionnaires
- Questionnaire.doc; a sample, ready-made questionnaire to use or adapt
- Analysis.xlt; the template for Excel for data capture, subset analysis and statistical tests
- Normstable.xls; an Excel workbook providing normative data to put your results in context
The Kit costs £500 plus VAT and is delivered on CD-ROM. If you would like to order a copy, please email [email protected] or call us on 020 8704 1296.
Information is Power
Have you ever realised that something about your organisation isn't working as it should? Perhaps the way things are currently arranged means that employees feel taken for granted, or overworked and underpaid, and that they might be better off with a different employer. But when you suggest some change to deal with the problem you have identified, the response is "How do you know there's a problem?"

If all you have to go on is your judgement and one or two specific employees whose cases you can quote, it is hard to convince others of the need to change the way things have been done for the last so many years. After all, it seems to work OK and we're no worse than anyone else, are we? And each of your anecdotes represents a sample of one.

Wouldn't your arguments carry much more weight if you were able to back them up with objective data demonstrating the problem? An employee satisfaction survey will provide you with that evidence: management information about how it feels to work in your organisation, in this department compared with that one, for this manager compared with that one. It will show where the arrangements work well, and where they need to be improved.

A well informed manager is an influential manager. An Employee Satisfaction Survey will provide you with information which will give you the power to bring about change for the better. This book is all about how to conduct the survey in the most effective and economical way.
Management Information
Financials

Most organisations have a business plan of one kind or another and a way of comparing what happens in reality with what had been planned. Often the plan for this year has been derived from the actual performance a year earlier, and is expressed as a percentage improvement on prior year. A system of reporting provides management with (fairly) up-to-date information on actual results, usually month by month, with comparisons versus plan and prior year.

This Management Information System will include a line for each source of revenue, all added together to arrive at total revenue; then a line for each cost item, followed by total costs and the consequent profit or loss. Often, there are analysis lines showing revenue per unit or cost per unit; what the unit is depends on the nature of the business. There may be other statistics which help to pinpoint the reasons for any difference between actual performance and the plan or prior year figure. Hotels measure their occupancy rate, the percentage of available rooms which were occupied; car rental companies measure utilisation, the percentage of available cars which were rented out. There may be employee cost statistics, too; say employee cost per guest night, or employee cost per rental.

In 99% of organisations, all these numbers have one thing in common. They are based on money or transactions, or other data which is gathered necessarily in the process of finding and serving customers, billing them and getting paid, and finding, hiring and paying employees. These reports, often referred to as "the financials", are the main tool by which the enterprise is managed, and yet they only include numbers which happen to be available anyway in the business system. Is that all the information needed to manage a modern enterprise?

People and their feelings

People, and the way they feel about what they do, profoundly affect the effectiveness of an organisation and its success in selling its goods and services.
If your employees care about the success of the business, if they are proud of the quality of its outputs and their part in creating them, the business is likely to flourish. If they regard their employer as an adversary, an exploiter who deserves no loyalty or effort from them, the business will suffer. Similarly, if customers trust the organisation, like its products and have good feelings about their contact with it, they will be easier to sell to again and again than if they dread the next time they may be forced to deal with it.

People's feelings affect the bottom line just as much as ingredient cost per cover or energy cost per unit. But the data doesn't exist anywhere in the financial system, so most organisations are managed by watching all the measures which are easy to count, rather than all the measures which matter. It is a bit like steering a supertanker in a fog with a big chunk of the radar screen covered up. If you really mean it when you say "Our people are our greatest asset", why does your management information system tell you so little about this crucial asset and how they feel?
Information about people's feelings

There are two main ways of finding out how people feel about things. You can study what they say, or observe what they do. Getting someone with suitable training to observe and report on all your people isn't practical, for a number of obvious reasons. You might get some useful information by talking to employees or, more to the point, listening to them. Managers should be doing this all the time. But depending on the individual manager's relationship with them, employees with a criticism to express, however constructive, may hold back for various reasons:

- Risk of offending the manager, or incurring his or her wrath
- Risk of being perceived as a troublemaker, or negative
- It feels a bit wet to complain, e.g. about not getting pats on the back
Even if people do open up to their managers, you can't express what they learn in numbers which could be used as a management statistic, and even if you could, the individual interpretation each manager put on what they had found out would make the measure too subjective and inconsistent. What you need is a consistent, objective way of measuring employees' feelings.

Employee satisfaction survey

A survey can gather information using a consistent set of questions, allowing employees to remain anonymous, and thereby freeing them to tell the truth. It can produce a single measure of satisfaction overall, or you can extract various indices including:

- Engagement
- Commitment
- Satisfaction with:
  - Working conditions
  - Pay and benefits
  - Training & development
  - Communication
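If you are curious how such an index can be reduced to a single figure, the usual approach is a "percentage favourable" score: the share of answers in the top categories of the rating scale. The sketch below is purely illustrative; the function name and the 5-point scale are our assumptions, not part of any particular toolset.

```python
from collections import Counter

def percent_favourable(responses, favourable=(4, 5)):
    """Share of answers in the favourable categories of an assumed
    5-point scale (1 = very dissatisfied ... 5 = very satisfied)."""
    counts = Counter(responses)
    answered = sum(counts.values())
    if answered == 0:
        return None  # no usable responses for this item
    return 100.0 * sum(counts[r] for r in favourable) / answered

# Hypothetical ratings for one survey item
ratings = [5, 4, 4, 3, 2, 5, 4, 1, 4, 5]
print(percent_favourable(ratings))  # 70.0
```

The same function, run over the subset of responses for one department or one manager, gives the comparable per-unit figures discussed above.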
There is a more comprehensive checklist of topics you might want to include in a survey under Checklist: Possible survey topics below. Whatever measures you decide you want, they can be calculated for the whole organisation as well as separately for each location, department, and cost / profit centre (to coincide with the way the financials are produced). You probably shouldn't attempt to measure monthly, like the financials. An annual main survey sent to every employee can be supplemented by quarterly tracking sample surveys using just a few key questions, so you can update the employee feelings measures in the Management Information System quarterly.

What's in a name?

Perhaps a rose by any other name would smell as sweet, but the name you give your survey will affect the way it is perceived. If your organisation makes a distinction between staff, who work in offices, and other workers, who don't, you won't want to call it a "staff survey", because some employees will feel it doesn't apply to them. The traditional description "attitude survey" isn't ideal. The word "attitude" has acquired negative associations, and it probably isn't wise to suggest that you think your employees have an attitude. Especially in an organisation which emphasises the importance of customer satisfaction, why not give the message that employee satisfaction is important, too, by calling the employee survey a satisfaction survey, even if some of the measures will be about more than satisfaction? The word "employee" is a bit clunky, but at least it is all-embracing, so unless you have a better word which takes in everyone, we suggest:

- Employee Satisfaction Survey

Or use your organisation's word for your people:

- Colleague Satisfaction Survey
- Partner Satisfaction Survey
- Team Member Satisfaction Survey
Plan the project

Purpose

Specify the purpose or purposes for which the employee survey is to be run.

Benefits

Checklist: Possible objectives of the survey

- Improve communication
  - Demonstrate management's commitment to listening to employees' views
  - Discover aspects of communication employees see as failing
- Provide management information on which to base action to:
  - Improve employee performance / productivity
  - Improve employee retention
  - Reduce employee sickness and other absence
  - Improve quality of service
    - Speed
    - Courtesy
    - Accuracy
  - Improve customer satisfaction
- Study the correlation between employee satisfaction measures and:
  - customer satisfaction
  - employee performance / productivity
  - employee costs
  - bottom line
- Monitor the effect of change
  - Reorganisation
  - Acquisition / merger
- Required by external authority or parent company
  - Government or other body
  - Group requirement
- Required for accreditation
  - ISO quality accreditations
  - Investors in People
- Fulfil commitment made to Trade Union or other employee representative body
Cost
We are often asked "How much does it cost to run an employee satisfaction survey?" And of course, there is no single answer to that question. It depends on the answers to a long list of questions, including:

- How many people do you employ?
- How many do you plan to include in the survey, i.e. will you run a census survey, including everyone, or a sample, in which only selected people receive a questionnaire? For a discussion of the census or sample issue, see Sample or census? below.
- How many questions will there be in the questionnaire?
  - Precoded ones can be answered with a tick in one of several boxes you provide, and are relatively cheap to process
  - Free text prompts, where you ask people to write in an answer, are much more expensive to process
- Do you already have a satisfactory questionnaire you can reuse, or adapt slightly, or will you have to develop one from scratch?
As a rough guide, you should reckon to spend between £500 and £5,000 on getting a consultant to help you develop a new questionnaire, depending on the complexity of what you want to measure, and the rigour with which you decide to conduct the preparatory research and the testing of the draft questionnaire. Spending more than this might further improve the questionnaire's validity (ability to measure what it sets out to measure) or reliability (ability to measure what it measures consistently each time you use it), but the law of diminishing returns applies. Is it worth spending twice as much to get a questionnaire which is 5% better? You can reduce the consultant's fee by carrying out much of the work yourself, but you would be wise to get some experienced, professional input at some stage, to avoid falling into the many pitfalls which lie in wait for the inexperienced questionnaire author. You may find that the cost of your time to do the work will actually exceed what you would have paid a specialist.

Once you have a questionnaire, allow about £2 per head to cover the cost of distributing, gathering, analysing and reporting the results of your survey. This is a very rough rule of thumb and the actual cost may be more or less depending on the size of the questionnaire and the complexity of the analysis you will require. Finally, you should allow for feeding back the results to management and employees and for taking action to initiate change which may be indicated by the results of the survey. As you proceed with the planning of the project and you know the answers to more of the questions, you or your external supplier will be able to estimate the cost of the project more accurately.
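If it helps, the rough arithmetic above can be wrapped up as a back-of-envelope calculator. This is only a sketch: the default development fee is an assumed mid-range figure within the £500 to £5,000 band, and the £2 per head rate is the rule of thumb quoted above, so treat the output as a starting point for budgeting, not a quotation.

```python
def estimate_survey_cost(headcount, development_fee=2000.0, per_head=2.0):
    """Very rough budget in pounds: a one-off questionnaire development
    fee (assumed mid-range default) plus roughly 2 pounds per head for
    distributing, gathering, analysing and reporting. Costs of feeding
    back results and taking action are deliberately excluded."""
    return development_fee + per_head * headcount

# Hypothetical organisation of 800 employees
print(estimate_survey_cost(800))  # 3600.0
```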
In house or outsourced?
Much of the work involved in running a survey can perfectly well be done in house, if there is someone who can devote the time required. You may have the capacity and the skills in house to handle:

- Planning the project
- Drafting the questionnaire
- Printing, mailing
Other jobs are probably best done by a specialist, and involving an independent external professional provides some advantages you can't get any other way.

Checklist: Reasons to outsource parts of the project

- Know-how
  - Questionnaire design. A specialist will help iron out any wrinkles in your draft questionnaire
  - Systems. A specialist has systems and procedures all worked out which anyone in house would have to figure out as they go along, so they'll take much longer, which might cost more than the specialist would have billed
  - Experience. A specialist will have conducted many surveys and knows what works and what doesn't
- Web platform. An external supplier might offer the option to publish your survey on the web. We do!
- Professionalism. You might not have the expertise in house to handle the technicalities of sampling, or assessing the significance of results
- Instant questionnaire. An external supplier may offer an off-the-shelf questionnaire you can use at lower cost than is involved if you develop one of your own
- Confidentiality. Some employees will fear that an outspoken comment on their part might lead to unpleasantness if it becomes known to one of their bosses. So people will be much more willing to give full, frank input if you provide an independent:
  - person to facilitate focus groups for developing a questionnaire
  - destination to return survey responses to
  - facility to analyse the data
- Normative data. Your external supplier might be able to put your survey results in context by making comparisons with data gathered from other employee surveys
- Authority. When communicating the results, and convincing colleagues that some change is required, an external professional's view often carries more weight than anyone in house.
This means it is often best to use outside help to handle:

- Focus group facilitation
- Questionnaire drafting and design
- Response handling, data capture
- Results analysis
- Interpretation and presentation of conclusions
Some suppliers might expect to take over the whole project as a package. Most will be willing to do so if that is what you require, but you might want to find a supplier who, like QUANTIFY, will be pleased to take on any parts of the project you can't handle, or don't choose to, while letting you stay in control and handle the rest.
Methodology
Web or paper
Organisations are increasingly using the web for surveys. The main reasons people give are listed below, with our view of each.

Reason quoted: More reliable distribution channel
We say: Maybe, depending on the way you choose to distribute paper questionnaires.

Reason quoted: You can have response rate updates while the survey is live
We say: We provide daily progress reports whether clients choose paper or web distribution.

Reason quoted: Eliminates manual keying from questionnaires
We say: True, but extra set-up costs might outweigh the saving. You probably need to cut out the data capture for at least 1,000 responses to recoup the extra costs involved in setting up your survey on the web.

Reason quoted: Saves paper
We say: True, it might save some trees, but don't assume that it will save you money.

Reason quoted: Allows quicker reporting when the survey closes
We say: If paper responses have been returning separately to us during the life of the survey, they are almost all keyed by the time you decide to close it, so reporting is just as quick either way.

Reason quoted: Facilitates feedback to employees
We say: You can put your survey results on the web or your intranet if you wish, regardless of how the data were gathered.

Reason quoted: It is the current fashionable way of doing things
We say: True. Some of your people might look down their nose at a paper questionnaire, these days.
We think the web approach is neither better nor worse than paper, just different. That means it may suit some organisations but not others, so what matters is which is suitable for you. We suggest that there can be a very significant advantage in using both routes. See A Web Option? below.

Checklist: Advantages of a web based survey

- Eliminates the need to print all those paper questionnaires
- Many employees prefer it, so it might enhance response rate
- Reduces the data keying workload
On the negative side, although we save on postage and the data keying we don't have to do, the extra set-up work together with the web hosting cost might exceed the saving. You can't fit as many questions on one screen as you can on one page of a paper questionnaire, so there are more virtual page turns involved in completing a questionnaire on the web than on paper, and each one takes longer, even with the fast software and server we use. Also, people mistrust the web, believing that everything they do or view can be traced. If access to your survey requires the use of a key or login, people will take it that you can pick out their individual response.

Checklist: Disadvantages of a web based survey

- Often more expensive than paper and pencil
- May heighten employees' concerns about their confidentiality
- Takes longer to complete on the web than on paper
- Requires a PC, so:
  - People can't do it on the bus or the train on the way to or from work
  - Some people won't be comfortable using a PC
  - Not suitable for people with no access to a PC
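The break-even reasoning behind the "at least 1,000 responses" figure quoted earlier can be made explicit: the extra one-off cost of a web set-up has to be recovered from the keying you no longer do. The cost figures in the sketch below are purely illustrative assumptions, not anyone's actual prices.

```python
def breakeven_responses(web_setup_cost, keying_cost_per_response):
    """Number of paper responses whose manual keying cost would equal
    the extra one-off cost of setting the survey up on the web. Below
    this volume, paper is cheaper; above it, the web route pays off."""
    return web_setup_cost / keying_cost_per_response

# Illustrative figures only: 500 of extra web set-up cost,
# 0.50 to key one paper questionnaire by hand
print(breakeven_responses(500.0, 0.50))  # 1000.0
```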
Sample or census?
Should you give every employee the opportunity to participate (census), or only ask some employees (sample)? Most employee surveys adopt the census approach, which has the following advantages:

Checklist: Advantages of the Census approach

- Bigger sample size makes results more convincing
- Adequate sample sizes even when the results are sliced up into subsets to compare one group with another. A sample survey might not provide statistically significant data on specific subsets
- Nobody feels excluded, or receives the message that their views don't matter
A sample approach might be appropriate to:

- Prevent survey fatigue, particularly if you decide to measure employee satisfaction quarterly. If everyone gets a questionnaire to fill out every three months, they will soon get fed up with it and response rate will decline. You may choose then to conduct an annual census using a comprehensive questionnaire, and in the three other quarters sample a different 1/3 of the people using a short questionnaire with just the key tracking measures.
- Reduce cost. The cost reduction available by sampling might not be as much as you imagined, because a census can be distributed in batches and managers asked just to give everybody a pack (questionnaire, covering letter and reply envelope). A sample needs to be selected from an employee list, and personally addressed, so the distribution process is more expensive. All the same, there may be a saving to be made, so if you employ a large number of people, and are satisfied that you can deal with the perception of those excluded, sampling might be worth considering.
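If you are weighing a sample against a census, a quick margin-of-error calculation shows how convincing subset results would be. The sketch below is the standard textbook formula (95% confidence, worst-case proportion of 0.5) with a finite population correction; it is a rough guide, not a substitute for proper advice on sampling.

```python
import math

def margin_of_error(sample_n, population_n, p=0.5, z=1.96):
    """Approximate margin of error for a survey proportion, with a
    finite population correction. z = 1.96 gives roughly 95% confidence;
    p = 0.5 is the worst case. Useful for judging subset sample sizes."""
    se = math.sqrt(p * (1 - p) / sample_n)
    fpc = math.sqrt((population_n - sample_n) / (population_n - 1))
    return z * se * fpc

# A department of 200 people, of whom only 50 are sampled:
# results for that subset carry roughly a +/-12 point margin
print(round(100 * margin_of_error(50, 200), 1))  # 12.0
```

A figure that wide is often why census surveys are preferred when results must be reported department by department.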
Checklist: Reasons to develop a new questionnaire

- This is to be the first survey conducted
- Old questionnaire doesn't gather required data
- Old questionnaire is poorly designed
  - Doesn't address the current issues
  - Asks two or more questions in one, so staff can't respond to specific points
  - Uses inappropriate response frames
  - Finds out problems but gets no clue to solutions
  - Staff can't understand questions, or give meaningful responses
  - Asks about things which can't or won't be changed
- Old questionnaire is simply out of date
In practice, it is often possible to retain sufficient elements of an old questionnaire to provide for trend comparisons whilst incorporating new items to keep the questionnaire topical, relevant, fresh and interesting.
Topic areas
You need to decide what you need to measure. The following checklist is fairly comprehensive, but if you try to include all these topics you will probably end up with too big a questionnaire. Try to group the questions you ask under several different headings, each of which covers one aspect of the service you deliver. Then, as well as asking employees how satisfied they are with each area, you can ask how important each of them is to people. Your employees may be very dissatisfied with some aspect of their experience at work, but if they regard that aspect as unimportant, this might not matter as much as another area which people are not dissatisfied with to quite the same degree, but which they see as very important. Before you invest effort and resources in dealing with an issue, it is best to know how important it is to your people as well as how satisfied or dissatisfied they are about it.

Checklist: Possible survey topics

- Communication
- Teamwork / relationships
  - with Manager
  - with Colleagues
  - with other Departments
- Management / supervision
  - Strategic direction and leadership
  - Leadership / Management skills
  - Recognition / motivation
  - Technical competence
  - Management of change
- The job
  - Variety
  - Authority
  - Responsibility
  - Stress
  - Sense of achievement
  - Job satisfaction
- The organisation
  - Culture
  - Equal opportunities
  - Job security
  - Career progression
  - Learning and development
- Terms and conditions
  - Working environment
  - Compensation
  - Benefits
- Fulfilment of Personal goals
- Attitude to quality
- Attitude to customer satisfaction
- Identification with Organisation & its goals
- Commitment
- Engagement
- Confidence in the survey process and its outcomes
The Development Process

Most employee surveys are conducted on paper or on the web using self-completion questionnaires, the most exacting kind to prepare, because there will be nobody available to explain what was meant if an informant misinterprets a question. Writing questionnaires which work requires training and experience, and comprehensive instruction in questionnaire writing would fill a small book, but not this one. At QUANTIFY we run a one day seminar for clients who want to develop their own questionnaires. It covers the basic dos and don'ts and will equip someone who already has a good command of English to produce competent questionnaires.

Focus groups

An invaluable part of the process of developing a questionnaire is to get some input from the proposed target group. A good way to do this is to conduct one or more meetings where a reasonably representative group of employees can talk freely about the issues to be included in the survey. Employees are less likely to talk freely if they think that their boss might hear about any criticism they offer. If people think they might suffer as a consequence, or be challenged and expected to justify any comments, they will most likely keep quiet. We find that focus groups facilitated by an independent person from outside the organisation, with no employer representative present, work best.

The outputs from focus groups often lead to things being included in questionnaires which wouldn't have been there otherwise. This avoids producing a questionnaire which asks questions about lots of issues except the one people most want to get a message to management about. The people involved in the focus group can also become advocates of the process, encouraging their colleagues to participate.

The Research Kit edition of this book has a 4 page section here on Running a focus group, including who should participate, choosing a venue and facilitator, and a step-by-step guide to handling the meeting.

Other preliminary research

It may be impractical for a number of reasons to bring employees together to run focus groups. In this case, the preliminary research to inform the development of the questionnaire may be carried out via individual interviews, face to face or on the telephone.
Size
The bigger the questionnaire, the greater the risk that people will put it to one side, or in the bin. On the other hand, a questionnaire with only three questions will cost much the same to print and distribute as a bigger one, so you might feel that you could get more information for your money by using a bigger questionnaire. In practice, you probably want to ask about a wide range of issues, so you won't have trouble thinking of more questions; you are more likely to have a problem keeping the size down to avoid putting people off. The Research Kit edition of this book has a 10 page section here on Writing the questionnaire, including: using the Research Kit's items bank; brevity; use of language; eliminating wasted words; leading questions; positively and negatively expressed items; response frames; odd or even number of options in a rating scale; and more.
So What?
When you have assembled a list of proposed questions, look at each one and write down the answer to this question: if the response to this question is very unfavourable, what would you do about it? If the answer is that management would be unwilling to change anything in response, consider removing the question, because asking it will suggest to your employees a willingness to contemplate change, and lead to frustration when the point is made in the survey results but no change occurs. If it was an issue raised by employees in the focus groups, it is evidently important to them. You need to have a very good reason for deciding that nothing can be done about it, and you need to convince the focus group members that it is a good reason, too. Otherwise, they might lose all faith in the process and, instead of being advocates, become opponents of the survey.
If the answer to the question is that you wouldn't know what you could change to improve the situation, consider whether the question is specific enough. You may not be able to make it any more specific, in which case you may have to compromise and include it as a diagnostic tool: a question which might reveal a problem which you would then need to investigate further.
Comments
Most of the questions should allow response by ticking one of several boxes, but you may want your questionnaire to include prompts which invite free text responses. Sometimes some space at the end of the questionnaire headed Please add any further comments here will be all that is required. Getting anything useful out of the answers is much more time-consuming, and therefore much more expensive, than analysing ticks in boxes, though. There are two main ways to use the responses.

Anecdotally. When the survey results are available, and you are advocating some kind of change, as well as presenting the results drawn from the tick-box answers, you quote some of the remarks employees made. Often, they are pithy and persuasive, and you can point out that it isn't you saying this, it is the employees.

Analytically. By paraphrasing and categorising every remark and counting how many there are in each category, and how many express each general sentiment, you can see which thought was expressed most often.
If you are tempted to include many open questions, is that because you don't know what issues employees are bothered about? You may be able to find out more by conducting focus groups, if you haven't already done so, and make sure that the questionnaire already covers the key concerns employees want to communicate to you. The extra cost of developing a questionnaire which needs fewer free text prompts will be repaid by a saving at the analysis stage. The greater the number of participants in your survey, the bigger that saving will be. An external supplier should be able to advise on the trade-off here.
Classifications
The results will be more valuable if you provide for comparisons between different groups of people, areas, departments etc. This requires that you include on the questionnaire some items where you ask people to place themselves in one of several categories you offer. Depending on your organisation and the subdivision of results which would be helpful in interpreting the messages the survey provides, you may want to ask employees to tick one of several options you provide under some or all of the following headings.

Checklist: Possible classification systems
Department
Location
Job type / function
Job grade / level
Age group
Length of service
These classifications are generally, if loosely, known as demographic items. The department or location split can be very useful because it often divides people up according to the manager they report to. You could do this explicitly, of course, by listing the managers and asking people to tick the one they report to. When you get the results, you might well see some striking differences between the groups managed by different people, and the information will allow you to identify the aspects of different managers' styles which work well, and those which don't, and target development activity accordingly. For a more specific way of providing feedback to managers on how their style is perceived by their team and others, a 360 Feedback or Management feedback system will be helpful. Find out more about these on our web site at www.quantify.co.uk. Some employees will be anxious, despite any promises to the contrary, that classification questions may be used to identify them, so you may be tempted to keep the classification categories as wide as possible and to skip some altogether. This is to deny yourself the chance to use the survey results to their best advantage, however, so it is a decision which should not be taken lightly. The first time you conduct a survey using a new questionnaire, the results don't mean much by themselves. Whatever the pattern of responses to a question, you probably won't know if the overall result is good, bad or indifferent. You may have access to some normative comparisons for some of the questions, but probably not for all of them. The most useful data you can get out of any of the questions will be to see where one group of employees is significantly happier, or less happy, about the issue than another.
This allows you to identify areas where more effective practices are in place, which you may be able to replicate elsewhere, and find the key areas which would benefit from improvement, whether they concern people in a particular department, within a specific age or length of service range, of a specific gender or ethnicity, or whatever it is. Without the classification questions in the questionnaire, you can't do the analysis, so you will never know. Even if you include no classification questions at all, the cynical employee who doesn't believe your assurances that the process is anonymous and confidential will still be convinced that each questionnaire has an invisible or disguised identity mark on it so that you can see how specific people replied. It is wise to put the demographics section at the end of the questionnaire. This way, someone who has already invested the time required to fill out the rest of the questionnaire might decide not to answer these questions but will probably still send it back, so you can at least include it in the overall results. If the same person encountered these questions at the beginning of the questionnaire, they might just throw it in the bin.
Return address
People lose reply envelopes, or mix them up with other mailings they happen to have lying around at the same time, so even though you plan to provide a reply paid envelope, print a reply address on the questionnaire, too. Use a FREEPOST address independent of your organisation if you can.
Pilot
If a newly developed questionnaire includes many questions that haven't previously been used in other questionnaires, it may be worth conducting a pilot exercise. This takes time and costs money, adding to the development cost, so the more people you plan to include in the survey, the easier it becomes to justify the cost of a pilot.

Pilot procedure

Choose a small but reasonably representative group of employees. If you used focus groups to help develop the questionnaire, some of the focus group members would be a good group to use for the pilot. Ask them to complete the draft questionnaire, then gather their feedback on it. At this stage, they should understand that you are not gathering their answers to the main survey, just testing the questionnaire. A good way to gather feedback is to meet up to six pilot targets at a time and work through the questionnaire line by line, asking them to say how they understood each instruction or question, how they went about responding, and what the appearance of the questionnaire suggested to them. You are looking for any indication that the questionnaire might be interpreted in any way you hadn't intended. Encourage them to raise any different interpretation they think is possible. Then you can amend any ambiguous item to ensure that as many people as possible will understand it the way you intended.
Design
As well as the content, the appearance of the questionnaire should be fresh and appealing. Many people don't read the instructions, if any, so the design must make it obvious how they should complete the questionnaire. The physical size of the resulting document is important. Even if you decide you want 100 questions, which is really a few too many, it is possible to fit them on four sides of A4. Alternatively, you could space them out more and make a pretentious booklet of 32 sides. The version on four sides of A4 (printed on A3 and folded to make a nice neat booklet) is much more likely to be completed and returned, though, because the thud 32 pages make when they hit the desk makes filling it out seem like a much bigger chore.
The Research Kit edition of this book has a 6 page section here on Designing the questionnaire, including: using the Research Kit's Word template for questionnaire design; inserting a logo; headings, items and sub-items; creating neat response frames; arranging demographic items; and fitting it all in.
Translation
If the questionnaire needs to be translated, now is the time to do it. To facilitate processing, it is best to keep the layout and page breaks the same in all versions, so that the equivalent question is always in the same place, on the same page. Make sure the translators don't reverse the headings on the response scales!
Printing
For a paper questionnaire, printing and production should be to a standard appropriate to the importance of the project but not extravagant; in keeping with your house style, but distinct from any literature which requires no response. Paper a bit heavier (say 100gsm) than normal office paper (usually 80gsm) invests the survey with a little bit more importance. Make sure that the paper can be written on with ball-point or fountain pen. Glossy paper sometimes resists any attempt to write on it, which defeats the object of the exercise, however pretty the questionnaire looks.
Checklist: Maximising response rate
Involve representatives of the target group in the project
Publicise the survey in advance
Make sure the content is relevant to the people you ask to respond
Produce the questionnaire to an appropriate quality
Make its size manageable, not intimidating
Promise confidentiality
Allow employees to complete in work time if they wish
Provide free postage for return
Provide an independent external agency to handle responses confidentially
Set a suitable time limit
Issue a reminder
Promise a donation to charity or the location's employee social fund for each response received

And if you decide not to guarantee anonymity:
Promise entry in a prize draw for each response received
Promise a voucher or gift for each response received
Checklist: Ways to publicise the survey
Articles in corporate newsletter / magazine
Feature on in house radio / TV
Note enclosed with pay-slips
Email messages
Personalised mailing from Chief Executive / Head of HR
Topic in team briefings / other meetings
Feature on the intranet web pages
Posters
Distribute questionnaires
The questionnaire pack
Each participating employee should receive a pack comprising:
A covering letter
The questionnaire
A business reply envelope

Checklist: Key points for the covering letter
Response is post paid
Promise publication and action on results
Closing date
Urgent and important: do it now
Dear Colleague / Employee Name

Employee Satisfaction Survey

I am writing to invite you to participate in a survey. This is a confidential, anonymous way for every member of the Company Name team to have a say about how it feels to work here. The results will help us to make sure that any changes we make will be changes for the better.

Confidential

Please complete the questionnaire and return it in the envelope provided to the independent company, Quantify, who will handle the analysis for us and protect your confidentiality. You can complete the survey just by ticking boxes if you choose, so it need not take more than a few minutes. You don't need to use a stamp. Please post your response to arrive by closing date. No one at Company Name will see your completed questionnaire. Quantify will provide reports to us, which show the views of different groups of people. To ensure your complete confidentiality, no group with fewer than [N] members will have their views reported. We will let you know what the main results of the survey were, and we will take action to deal with any issues which emerge.

Yours sincerely

Chief Executive

PS Why not do it right now and be sure to have your say?
Sometimes, clients ask us to provide a further covering letter on our letterhead to reassure employees that we really exist, and are promising to protect their confidentiality.
Closing date
The closing date you publish in the letter and elsewhere should be at least a week after the day you expect employees to receive the pack, but not much more, unless shift or rostering arrangements mean you can't predict when people will see the questionnaire. The aim is to create a sufficient sense of urgency to encourage many people to do it straightaway, so it doesn't get put to one side and forgotten, but not so great a sense of panic that people regard it as unreasonable and bin it without even considering joining in.
We suggest you extend the deadline later, when you issue a reminder; see Progress report below.
N members
We will never report to a client the results of any group smaller than three individuals. You may choose to instruct us to apply a different minimum subset size. The results from a subset smaller than about ten will have a very large sampling error anyway, so might not be of much value.
No impact on confidentiality: people can take them home if they wish, and if you have provided a business reply envelope, they can post their response in any pillar box. We find response rates tend to be better.
There are some possible disadvantages, though.

Checklist: Disadvantages of internal distribution
Employees might be distracted from essential work
Employees might collaborate, rather than giving their own view
Managers (paranoid ones?) sometimes forget to distribute questionnaire packs, although we can usually spot this when the responses start coming in and tip you off so that you can encourage them. See Tabulation by location or department below.
Group dynamic might lead to a collective decision not to participate
A Web Option?
You may want some or all informants to complete your survey on the web. Here is how we handle this. For a discussion of the advantages and disadvantages of using the web, see Web or Paper above. You can visit a sample web survey which demonstrates the speed of our system, provides links to selected output illustrations and describes the many aspects which can be configured to suit your preference. Click here to visit the survey.

Access controlled, confidentiality assured

We need to ensure that only authorised informants can participate, while still satisfying them about their confidentiality. In the QUANTIFY system, each informant gets a unique randomly generated respondent key to use to get into the survey. When they are ready to complete the survey, employees visit the site, usually just by clicking on the link provided in an email invitation. The survey opens with the client's logo on the welcome page, then presents the survey items for them to complete by clicking buttons, or typing if text responses are required. When we are ready, we retrieve the data directly from the server and load it into our analysis software along with any data we have keyed from paper questionnaires. The email invitations are generated from data where each employee's name and email address record is allocated one of the unique access keys. When we download the response data, the access keys do not download, so neither we nor our client can relate survey responses to individual informants.

Offer both routes and secure respondent commitment in advance

There are still only a few organisations in which every employee has access to a connected PC with a browser and feels comfortable using it. So the web may be the best medium for many of the informants for your survey, but it may not be suitable for all of them.
Some of the targets for your survey might not have access to a web enabled PC, or perhaps some people will suspect that their responses won't be as anonymous on the web as they would be on paper. So why not let people choose which way they prefer to participate in the survey?
Step 1 Advance notice

Our client provides us with a list of all the participants with name, workplace (internal mailing) address and email address, if they have one. We send an advance notice to every member of the target group to say that the survey will be happening. As usual, this message includes an explanation of the reasons for conducting the survey, a commitment to take action on the results, and assurances of anonymity and confidentiality. Crucially, though, it also asks each participant to choose whether they wish to complete the survey on paper, or on the web. They can respond by using a tear-off strip, or by sending email directly to us. This process has the usual positive effect on response rate which any advance publicity will achieve, but this is heightened by the commitment people make by choosing to participate by one route or another. We use the participant list the client provided to create a preference register, in which each person is allocated as a paper or a web participant. People who don't respond to the initial request are allocated to one route or another using a rule we agree with the client. This could be "Paper unless they opt for the web", or "Web if we have an email address on file, paper if not", or any other appropriate rule.

Step 2 Invitations

We generate invitations to participate, as follows.

Paper participants: a personalised covering letter, with the printed questionnaire and a QUANTIFY business reply envelope. We can mail these direct, or deliver them to the client for distribution.

Web participants: a personalised email invitation to participate, including a link to the survey web site, with a unique respondent key for this participant. We send these directly to participants at the email addresses provided by them, or by the client employer.

Step 3 Data gathering

Responses return to us by post or accumulate on the web site, and we send the client daily progress reports showing responses received so far.

Step 4 Reminders

We issue a reminder. For paper participants it may be just a memo asking managers to remind all their people. For web participants, we send an email reminder to everyone (we don't know who has and who hasn't responded) with their unique access link repeated. If they have already completed the survey but try again, they just get a message saying "This respondent key has already been used."
Step 5 Closure & analysis

When the client agrees with us that the survey should close, we download the response data from the web site, combine it with the keyed data from the paper participants, and proceed with analysis and reporting in the usual way.
Anyone doing this work in house will have to reinvent all our systems and procedures, so you will probably spend more in house than we would bill you. We are always pleased to talk over your project and give an estimate for whatever services you require. There is no charge for an initial discussion. The Research Kit edition of this book has a 10 page section here on Analysing and reporting the results, including: configuring the included analysis tool for your survey; handling incoming responses; data entry; subset analysis; the relationship between sample sizes and the accuracy of the result; assessing the significance of apparent differences of opinion; and more.
Progress report
As the originally published closing date approaches, you should consider extending the closing date and issuing a reminder. If we are handling the replies, we will be sending daily progress reports by email, tracking the responses received each day and the cumulative response so far. The progress report is illustrated in our Report Styles Illustrations document, samples.pdf. It is usually worth extending the closing date by at least a further week, so people have altogether had two weeks or more to respond. Reminders, whether personally addressed or not, can announce the extension of the closing date, and will produce some more responses. There is a trade-off between waiting for more responses, which will lend more weight to the results when you get them, and closing the survey, which will allow you to get results while they are still current, not stale news. You choose.
If the prompt was Have you had an appraisal meeting in the last year? and you report that 42% of informants ticked Yes, everyone knows what that means. If you compare two different groups, and you find that for one group the result was 32% and for another it was 50%, we know what that means, too. What if the prompt was I found my last appraisal meeting useful, and you asked people to tick one of the following boxes?

I found my last appraisal meeting useful
Strongly disagree / Disagree / In between / Agree / Strongly agree
Two different groups might produce results like these:

I found my last appraisal meeting useful
          Group 1   Group 2
Disagree  18%       20%
Agree     20%       18%
We can still compare, but it is a bit harder with more numbers to look at, so to make it simpler, we aggregate the numbers one way or another.
Total percentage favourable
People often report the answers to questions like this by giving the total percentage who ticked the favourable responses, Agree and Strongly agree in this case. This way, group 1 above scores 52% and group 2 only 36%, and we have no difficulty deciding which is the happier group. Now look at these two sets of results.

I found my last appraisal meeting useful
                   Group A   Group B
Strongly disagree  12%       32%
Disagree           20%       20%
In between         32%       12%
Agree              18%       20%
Strongly agree     18%       16%
Using the total favourable ticks approach, they both score 36%, but we don't need to spend long looking at the numbers to see that group B is less happy than group A. This way of reporting results disregards an important part of the data we went to so much trouble to collect. A Strongly agree response is different from an Agree, so it is barmy to treat them as if they were the same. And the Strongly disagree, Disagree and In between responses are all different, so it makes sense to recognise this and treat them differently.
Averages
Here is a way to work out a single number to represent each group's result, taking all the options into account. Give each reply a value, or score, according to the box the person chose to tick, as follows.

A vote for         is worth
Strongly disagree  1
Disagree           2
In between         3
Agree              4
Strongly agree     5
When we work out the average score for each of the above groups we get 3.1 for group A and 2.7 for group B, numbers which reflect the difference between the groups.
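For readers who like to see the arithmetic, here is a minimal Python sketch of the scoring (this is just the scheme described above, not QUANTIFY's own software; the group figures are the ones from the Group A / Group B table):

```python
# Score each box as in the table above:
# Strongly disagree = 1 ... Strongly agree = 5.
SCORES = {"Strongly disagree": 1, "Disagree": 2, "In between": 3,
          "Agree": 4, "Strongly agree": 5}

def average_score(percentages):
    """percentages: dict mapping each option to its % of replies."""
    return sum(SCORES[opt] * pct for opt, pct in percentages.items()) / 100

group_a = {"Strongly disagree": 12, "Disagree": 20, "In between": 32,
           "Agree": 18, "Strongly agree": 18}
group_b = {"Strongly disagree": 32, "Disagree": 20, "In between": 12,
           "Agree": 20, "Strongly agree": 16}

print(round(average_score(group_a), 1))  # 3.1
print(round(average_score(group_b), 1))  # 2.7
```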
Percentiles
Most people are happier thinking about scores out of 100, so we convert these averages to express them as if the scale had been from 0 to 100 instead of 1 to 5. We can do this with all the results, whatever scale they use, and this makes all the results comparable. After the conversion, we report the percentile results for the two groups like this.

I found my last appraisal meeting useful
                   Group A  Group B
Percentile result  52.5     42.5
This makes it easy to compare the two groups and conclude that group A are happier than group B. The Research Kit edition of this book includes detailed instructions here on how to convert average results to percentiles.
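The conversion itself is a simple linear rescaling. The sketch below assumes the obvious formula, which reproduces the 52.5 and 42.5 figures above from the averages of 3.1 and 2.7:

```python
def to_percentile(avg, low=1, high=5):
    """Rescale an average on a low..high rating scale to 0..100."""
    return (avg - low) / (high - low) * 100

print(round(to_percentile(3.1), 1))  # 52.5 (group A)
print(round(to_percentile(2.7), 1))  # 42.5 (group B)
```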
Significance
We might be right to reach this conclusion, but the apparent difference might be just the result of sampling error. To decide what significance we can attach to the difference, we need to calculate the sampling error, which depends on the number of people in each of the groups we are comparing and the standard deviation of their answers. If we dont do this, we might put a lot of resources into dealing with an apparent problem which wasnt a problem in reality, just a product of the sampling process, so-called sampling error. For people who are interested in this sort of thing but dont know how to do the necessary calculations, we offer a one day seminar entitled Getting facts out of figures; A beginners practical guide to descriptive statistics.
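As an illustration only, here is a rough two-sample check of the kind described. The figures are hypothetical, and this textbook formula is a sketch, not necessarily the exact procedure QUANTIFY uses:

```python
import math

def significant_difference(mean1, sd1, n1, mean2, sd2, n2):
    """Crude test: is the gap between two group means more than
    about twice its standard error (roughly 95% confidence)?"""
    standard_error = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return abs(mean1 - mean2) > 1.96 * standard_error

# Hypothetical groups: averages 3.1 and 2.7 on the 1-5 scale,
# standard deviation 1.2 in both, 40 people in each group.
print(significant_difference(3.1, 1.2, 40, 2.7, 1.2, 40))    # False: within sampling error
# The same gap with 200 people per group would be significant.
print(significant_difference(3.1, 1.2, 200, 2.7, 1.2, 200))  # True
```

Notice that the same 0.4 gap in average scores is inconclusive with 40 people per group but convincing with 200: this is why the number of people in each group matters so much.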
Indices
By averaging the responses for several related questions, you can work out a rating for the topic (or cluster) they represent. Express it as a percentile, and you can call it a satisfaction index. If you have several questions about communication, for example, you can average them and express the result as a percentile to provide a Satisfaction with communication Index. You could do the same with each topic your questionnaire covers. If you want some questions to have more effect on the final index than others, the average can be a weighted average.
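A sketch of the index calculation, with hypothetical percentile results for three communication questions (the weighting is optional, as the text says):

```python
def satisfaction_index(percentiles, weights=None):
    """Average several question percentiles into one topic index.
    If weights are given, they must line up with the percentiles."""
    if weights is None:
        weights = [1] * len(percentiles)
    return sum(p * w for p, w in zip(percentiles, weights)) / sum(weights)

communication = [60.0, 52.5, 47.5]  # hypothetical question percentiles
print(round(satisfaction_index(communication), 1))            # 53.3
print(round(satisfaction_index(communication, [2, 1, 1]), 1)) # 55.0, first question counts double
```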
Importance
If you also included in the questionnaire a block of questions asking how important each topic is to employees, you can work out a Priority for action index for each topic. Take the satisfaction index for the topic away from 100 to convert it into a dissatisfaction index. Then multiply by the topic's importance index and divide by 100. The result is a priority for action index. The following illustration shows how the thing people are most dissatisfied with isn't always the top priority for action.

Topic (cluster)         Satisfaction index  Dissatisfaction index  Importance index
Communication           60                  40                     85
Physical environment    52                  48                     52
Recognition and reward  81                  19                     78
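The take-away-from-100-and-multiply rule can be sketched in a few lines of Python, using the figures from the illustration. It shows the point being made: Physical environment has the highest dissatisfaction (48) but Communication comes out as the top priority.

```python
def priority_for_action(satisfaction_index, importance_index):
    """Priority = dissatisfaction index x importance index / 100."""
    return (100 - satisfaction_index) * importance_index / 100

topics = {"Communication": (60, 85),
          "Physical environment": (52, 52),
          "Recognition and reward": (81, 78)}

for name, (satisfaction, importance) in topics.items():
    print(name, priority_for_action(satisfaction, importance))
# Communication 34.0, Physical environment 24.96, Recognition and reward 14.82
```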
Don't like numbers?

Many people don't enjoy playing with numbers like this. They need help, and may find it in house, or get it from an external supplier.

Qualitative results

Research outputs which don't involve numbers are known as qualitative results. They are concerned with what people say in words rather than responding in any way that can readily be counted. The focus groups that we suggest as a way of informing the development of a questionnaire fall into the category of qualitative research. If an employee survey questionnaire includes prompts inviting written responses, any other comments etc., the data the questionnaire gathers under these prompts is qualitative data.
Anecdotal use
One way to use the comments is to quote some of them in support of a case you are making for a change in response to the survey outputs. This can often be useful because the comment might well be expressed in much more forceful, and therefore persuasive, language than would be appropriate for you to use in a management meeting. But you can quote a comment from the survey, and it isn't you saying it. If you only want to use the comments anecdotally, all you need by way of reporting is a transcript of all the remarks written on the questionnaires. The weakness of anecdote is that each one can easily be dismissed as a sample of one; it doesn't necessarily represent a widely held view.
Analytical use
If you analyse the comments and group like remarks together in suitable categories, you can then report how many people said the air conditioning needed to be improved. This remark might be allocated to a main category, Working Environment, with a subheading, Heating and ventilation. Another main category might be Compensation, with a subheading Benefits and a sub-subheading Holidays, and there might be a number of remarks recorded there, suggesting a general increase in holiday entitlement, more holiday to reward long service, more flexible arrangements for fixing the dates of holidays etc. It is useful to look at the remarks both grouped thematically, according to the categories they fall into, and by frequency, listed in descending order of the number of times each was mentioned, disregarding the categories they were allocated to.
Reported by frequency, comments acquire much more persuasive power, because you can see what proportion of those replying to the survey mentioned each point. If you can list the comments this way separately for different groups of employees, by department for example, you can pin down where the air conditioning problem is at its worst, or establish that the concern is pretty general. As well as comparing one group with another within a survey, you can compare one survey with another, so you can see if things have got better or worse since last year's survey. We won't pretend that this sort of analysis is easy. We have invested a good deal of effort in setting up systems to help us carry it out, but it still requires quite a lot of effort and is much more expensive than analysing ticks in boxes. Download our reporting styles illustrations document, samples.pdf, to see illustrations of the ways we report qualitative results quantitatively.
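The counting stage of such an analysis is straightforward once every remark has been paraphrased and coded; the hard, expensive part is the coding itself. A toy sketch, with made-up coded comments:

```python
from collections import Counter

# Hypothetical comments, each already paraphrased and allocated
# to a (category, subheading) pair during coding.
coded_comments = [
    ("Working Environment", "Heating and ventilation"),
    ("Working Environment", "Heating and ventilation"),
    ("Compensation", "Holidays"),
    ("Working Environment", "Heating and ventilation"),
    ("Compensation", "Holidays"),
    ("Communication", "Team briefings"),
]

# Frequency listing: most-mentioned themes first.
for (category, subheading), count in Counter(coded_comments).most_common():
    print(f"{count} x {category}: {subheading}")
```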
To meet this need, we have developed a sophisticated norms management system which allows clients to put their results in context by establishing, say, that their result on a particular question is better than 74% of the responses in our normative database (in statistical terminology, at the 74th percentile of the normative sample). The norms cover a collection of 62 core employee satisfaction items, and the sample sizes vary according to the popularity of each item in clients' surveys over the years. The Research Kit edition of this book has a 3 page section here on Making normative comparisons using the Research Kit's normative table.
Caution
We have some serious reservations about the use of benchmarking or normative data, however, so while we are pleased to be able to offer a service which meets a need expressed by many clients, we offer it with the following health warnings.

Expectation

This is the most serious drawback of benchmarking, because it can actually lead to a "good" organisation getting worse satisfaction scores than a "bad" one. The responses people give to questions about their satisfaction with anything don't depend only on the experience they have had. Exactly the same experience might lead to quite different satisfaction responses depending on what the informant had expected. Someone who had not expected to be treated particularly well might be very satisfied with rather run-of-the-mill treatment, while someone who had expected excellent treatment would be very dissatisfied with exactly the same experience. These two people would give very different answers to the same satisfaction questions despite the fact that their experience was exactly the same. The harder you have worked at satisfaction, the higher the expectation of your informants is likely to be, making it harder for you to get high satisfaction scores. When you compare your satisfaction scores with those obtained in another organisation, you won't know what their informants were expecting. Another organisation might not in the past have tried half as hard as yours, and their informants, expecting to be very shabbily treated, might have been pleasantly surprised even to be treated in a manner you would regard as unacceptably poor. Should you then be concerned if your satisfaction scores are no better than that other organisation's? Should you invest management time and scarce resources in dealing with the "problem"? We don't think so. Of course you won't be comparing your results with just one other organisation's, so the wider norms group will include some "good" and some "bad" organisations.
All the same, if your result doesn't compare as well as you had hoped, you can never be sure whether that is because people are getting a poorer experience from your organisation or because they have learned to expect more from you.
Wording

Clients often find a standard item wording inappropriate to their organisation, so they adapt it to meet their needs. The norms include items using the identical standard wording, but they also include items expressed in equivalent terms. So where the standard item reads "I always get the equipment and/or facilities I need to do my job", we also include items where clients have preferred "I have the equipment and facilities I need to do my job"; or "We always get the equipment and/or facilities we need to enable us to do our jobs"; or "I have sufficient resources to enable me to do my job". This pragmatic approach is intended to provide clients with the widest possible normative group to compare with, but we acknowledge that we are not strictly comparing like with like, so we offer the option of norms based on the smaller samples which include only surveys using the standard text word for word.

Response frame

Clients use different response frames, so while one survey is scored on a four-point scale (Very dissatisfied, Dissatisfied, Satisfied, Very satisfied), another might use a seven-point agreement scale (Strongly disagree, Disagree, Tend to disagree, Neither agree nor disagree, Tend to agree, Agree, Strongly agree). To incorporate data gathered using any response frame, each individual response is expressed in the normative database as a percentile score; that is, converted as if its response frame had been a scale from 0 to 100.

Sequence

The response to a question can be influenced by the other questions which come before it in the questionnaire. Imagine asking people to rate their overall happiness on some scale. If that question is preceded by one asking them to rate the happiness of their marriage or relationship, this narrows the perception of the overall question, so people reply to it much more in the context of their relationship than they would if the general question had been put before the marriage/relationship one.
Even if we restrict ourselves to the identical text of the standard core question, the different questions which have preceded it in the various questionnaires will have coloured responses to the standard question in varying ways which we can't predict or control, so once again we aren't strictly comparing like with like.

Conclusion

All these influences mean that while normative comparisons can help to provide a context for your survey results, they should be approached with great caution, and corroborative evidence should be sought before you invest effort and resources in addressing a perceived competitive disadvantage.
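The 0-to-100 conversion for mixed response frames can be illustrated with a small sketch. A simple linear mapping between equally spaced scale points is assumed here; Quantify's actual conversion may differ:

```python
def to_0_100(response, scale_points):
    """Map a response on a 1..scale_points scale linearly onto 0-100.
    Assumes the scale points are equally spaced."""
    if not 1 <= response <= scale_points:
        raise ValueError("response outside the scale")
    return 100.0 * (response - 1) / (scale_points - 1)

# "Satisfied" (3rd point of 4) and "Tend to agree" (5th point of 7)
print(round(to_0_100(3, 4), 1))  # 66.7
print(round(to_0_100(5, 7), 1))  # 66.7
```

Under this assumption, "Satisfied" on a four-point scale and "Tend to agree" on a seven-point scale land on the same 0-100 score, which is what allows responses from differently framed surveys to sit in one normative database.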
Take action
Publish the results.
You should provide every employee with at least a summary of the results, so everyone knows, or thinks they know, how good or bad they were. This demonstrates your commitment to an open process of improvement and may encourage some people who thought it wasn't worth bothering to reply this time to change their minds next time.
Agree initiatives
Identify the three or four most pressing issues which emerge from the survey results. Involve people at all levels in the decision-making if you can, so that everyone feels part of the push for improvement. Introduce initiatives to address the issues and make them a high priority for everyone from the Chief Executive down. More than three or four high-priority issues for everyone to address, on top of their normal daily duties, will be too much to cope with; and without the Chief Executive's wholehearted commitment, nothing is likely to change much. In a big organisation you may devolve this process to divisions or sites, allowing each one to choose its own key issues and decide on its own initiatives for improvement.
Action
Launch the initiatives, getting as much employee input as possible into the process. Get people at all levels to commit to making things change, and make it clear that everyone shares responsibility for making it work. We can advise on, and facilitate, this process for you.
Measure again
After a suitable interval for the initiatives to take effect, repeat the survey to measure any change. Many employers carry out annual surveys and track the trend on a number of different topic indices. This can be the most productive stage of the survey: the first time you run a survey you can compare results from one area with those from another, or one job type with another, but the second and subsequent times you can use the previous results as a benchmark against which to measure this year's results.

Remember, though, that employee satisfaction is a function of expectation as well as experience: the result you get is a measure of people's perception of the experience of work in the context of the treatment they expected. Exactly the same experience might lead to quite different satisfaction responses depending on what the informant had expected. Someone who had not expected to be treated particularly well might be very pleased if their actual treatment came as a nice surprise, even if it still wasn't particularly wonderful. On the other hand, someone who had expected excellent treatment would be left disappointed by exactly the same not-particularly-wonderful experience, so they would answer a satisfaction question in a survey differently. By showing sufficient interest to operate an employee satisfaction survey, you have probably raised your people's expectations, so it is hard to achieve higher satisfaction measures each time you run a survey; particularly the second and third times, when people are still getting used to the idea that you genuinely care about how they feel.
You get... so that...

- Your people get an independent destination to send replies, so they feel more confident in the confidentiality and more of them reply.
- You get regular updates on the number of responses so far, so you always know what is happening and you stay in control.
- You get your results in your choice of our standard formats only days* after you decide to close the survey to further responses, so the information you get is current, live data, not stale news.
- You get a single point of contact, so you get to know your contact and your contact knows your project.
- You get advice in plain English, not survey jargon, so nobody uses technical terms in an effort to impress, or blind you with science.

* Never more than two weeks. That's guaranteed (see our service guarantee), or there's no charge for the whole project.
You choose... so that...

- Your results are delivered in the way you find most convenient (for example, on paper), so you get outputs set out in a way that suits you.
- You choose any parts of the project you prefer to do yourself, with or without some guidance from us, so you don't have to pay us to do things you could perfectly well do yourself.
- You choose how long to wait for stragglers to reply, so you control the trade-off between response rate and quick results.
- You choose from the menu below the expertise or services you want to outsource, so you are always in control, but we are ready with help whenever you require it.

Menu of services:

- Printing
- Mailing
- Incoming mail handling
- Web, email or telephone interview options
- Data capture
- Analysis and reporting
- Benchmarking
- Interpretation and advice on action
- Implementation
Service Guarantee
We deliver your outputs fast, or there's no charge for the whole project. Provided you choose one of our standard reporting formats and as long as informants' responses are returned directly to us (not to you and then forwarded in a big parcel), we will deliver your survey results no later than two weeks after you instruct us to accept no further responses. If we ever miss this deadline, we will waive our entire fee for the project.
More information
For more information, please visit our web site, quantify.co.uk, or contact:
QUANTIFY London QUANTIFY Yorkshire
18 Rodway Road Roehampton Village London SW15 5DS 020 8704 1296 [email protected]
4 West Parade Wakefield West Yorkshire WF1 1LT 0845 241 3450 [email protected]
Other services
We also help clients to quantify:

Employment Cycle Research

Conduct a Starters Survey and a Leavers Survey to measure people's perceptions at the crucial entry and exit points of their relationship with your organisation. This allows you to compare these people's views with those of mainstream employees, improve your management of the recruitment and induction process, and reduce the number of good employees who choose to leave you.

Customer Satisfaction

90% of dissatisfied customers never complain; they just take their business elsewhere. So counting customer complaints measures a little of the effect without getting to the cause. Why not get objective measures of customer satisfaction, repurchase intention and so on, and really take control of these crucial issues? You can download a free book, Your Customer Satisfaction Survey, from our web site quantify.co.uk.

360 Feedback (Management Strengths)

Give your managers objective feedback on the management behaviours they need to develop.
Off-the-shelf
For a quick start and minimum expense, we offer two low cost off-the-shelf web-based 360 Feedback systems. Full details are available here.
Bespoke
A management feedback system designed specially for you (based on your management competencies and their associated behaviours, if you have defined them) provides managers with guidance and an incentive to improve. Sources of feedback might include the manager's manager, the manager's peers, the manager's direct reports, and other independent observers. Choose a reporting style which allows comparisons between managers, or one which reports only each individual's relative strengths and development needs among your set of management competencies. You can download a free book, Your Management Feedback System, from our web site quantify.co.uk.

Teamwork (Internal Customer Satisfaction)

Give each department a measure of its success in meeting the needs of colleagues who rely on its support to get their jobs done. Each team reviews its own results before managers meet to trade the improvements they promise to others for those they need from other areas. Everyone becomes more aware of the needs of their internal customers, and service to the external customer improves as a result. Improved service = better customer retention = better bottom line. You can download a free book, Your Internal Customer Satisfaction Survey, from our web site quantify.co.uk.

Quantitative data from written remarks

Customer comments cards; employee survey comments; letters of complaint or thanks.
We can summarise any body of written material and provide frequency reports organised thematically, or in descending order of frequency of mention of a specific issue.
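As an illustrative sketch only (not Quantify's actual tooling), a frequency report of this kind can be produced by coding each written remark with one or more themes and counting mentions in descending order; the themes and remarks here are invented:

```python
from collections import Counter

# Invented example: each remark has already been coded with themes
coded_remarks = [
    ["pay", "workload"],
    ["communication"],
    ["workload"],
    ["communication", "pay"],
    ["workload"],
]

counts = Counter(theme for themes in coded_remarks for theme in themes)

# Report in descending order of frequency of mention
for theme, n in counts.most_common():
    print(f"{theme}: {n}")
```

The hard part in practice is the coding step (deciding which themes a free-text remark mentions), which is a judgement call rather than a mechanical count.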