360 Feedback FAQs
This Frequently Asked Questions page has been specifically designed to provide practical, real-world, honest answers to the most commonly asked 360 feedback questions.
- About 360 degree feedback
- Benefits and concerns of 360 feedback
- Purchasing 360 feedback
- Giving and receiving 360 feedback
- Best practice guidelines
About 360 degree feedback
Everything you need to know about the history, core purpose and most common uses of 360 feedback. For an overview of what 360 feedback is and how we use it, watch our video.
-
How do Lumus360 define 360 degree feedback?
Lumus360 define 360 feedback as:
‘The process of collecting perceptions from multiple sources and then collating them into a feedback report that can be used for development or measurement purposes.’
-
What is the history of 360 feedback?
References to multiple-source feedback can be found as far back as 100 years ago! Its first significant use was by the US military in WW1 and then by the German Army in the 1940s. The first record of it being used in a corporate environment was by Esso (now ExxonMobil) in the 1950s. The shift in name from ‘multi-source’ to ‘360 degree feedback’ took place in the mid-1980s, which was also when its use started to become much more widespread. Now, 30 years on, 360 feedback is seen as an established HR / people development practice with multiple applications across the globe.
-
How does 360 work?
The overarching process hasn’t changed much in the last 30 years. The two main components are:
1. The perceptions/ratings of others are sought from multiple sources. These would normally be:
- Line Manager
- Colleagues/Peers
- Direct Reports
The above is not an exclusive list and quite often other groups (Key Stakeholders, Customers, Executive Team/ Board Members etc.) are also invited.
2. Results are then collated into a report (see the sketch after this list) that enables the following to be viewed:
- The extent to which behaviours are seen
- Comparisons between rating groups
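To illustrate how the collation step typically works, here is a minimal sketch in Python. The rater groups, the behaviour and the 1–5 rating scale are assumptions for illustration only, not a description of any specific provider’s system:

```python
from collections import defaultdict
from statistics import mean

# Illustrative ratings only: (rater_group, behaviour, score on an assumed 1-5 scale).
ratings = [
    ("Line Manager",   "Listens to others", 4),
    ("Peers",          "Listens to others", 3),
    ("Peers",          "Listens to others", 5),
    ("Direct Reports", "Listens to others", 2),
    ("Direct Reports", "Listens to others", 3),
    ("Self",           "Listens to others", 4),
]

# Collate: average each behaviour within each rater group, which is the basic
# comparison a 360 report presents (extent seen, and differences between groups).
collated = defaultdict(dict)
for group, behaviour, score in ratings:
    collated[behaviour].setdefault(group, []).append(score)

for behaviour, groups in collated.items():
    print(behaviour)
    for group, scores in groups.items():
        print(f"  {group}: {mean(scores):.1f} (n={len(scores)})")
```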
-
How has the 360 process evolved?
Early iterations were paper based, with the data collated either manually or with the aid of Excel-type software that converted the input ratings into some basic graphs. Today, most systems are cloud based, built with modern web technology and automated as far as possible.
-
What is the purpose of 360 degree feedback?
Different derivatives of 360 feedback have been used for multiple purposes, including recruitment, culture change, personality measurement, customer feedback etc. However, today it is most commonly used to either:
- Support the personal development of employees
- Drive/inform performance decision making
-
What is the difference between 360 feedback, 360 appraisal, 360 evaluation, 360 review and 360 assessment?
It would be nice to say that:
- 360 degree feedback refers to the process when it’s used for development purposes
- 360 appraisal, evaluation, review and assessment are used when the process is linked to performance management
But unfortunately, that’s not the case! In practice, it comes down to personal preference, with each phrase being used to describe multiple implementation scenarios.
-
What are the most common uses for 360 feedback?
Lumus360 have been in the 360 marketplace for over 20 years, and over the past 5 years our clients (300 plus) have used it for the following purposes:
- Performance measurement – 6%
- People development - 94%. The breakdown of this being:
- Executive development coaching (12%)
- As part of a leadership/ Management Development Programme (28%)
- Personal development (to support goal setting) - (47%)
- Talent management (5%)
- Team development (2%)
Over the last 4 years, we have also seen a shift towards running the process as a ‘development’ activity but where the outcomes are taken into an annual (performance management) development planning conversation.
-
Which organisations use 360?
It has been estimated that over a third of U.S. companies and 90% of Fortune 500 companies utilise 360 feedback. In our experience, the cross-sector breakdown is:
- Private sector - 75% (mainly large and medium sized businesses)
- Public sector – 19%
- Third (charity) sector – 7%
Benefits and concerns of 360 feedback
360 feedback drives a ‘love it’ – ‘hate it’ reaction in most people and the following aims to provide a balanced, honest view of the advantages and disadvantages of 360 degree feedback when used within the personal development and performance measurement contexts.
-
What are the advantages of 360 feedback when it’s used for ‘people development’ purposes?
When 360 is used as a ‘development tool’ it has been our experience that some, most or all of the following benefits are achieved:
Measurable individual benefits:
- Improved self-awareness
- Identification of perception blind spots
- Improved understanding of own strengths (as perceived by others)
- Understanding where others believe improvements can be made (or behaviours are not seen)
- Grows own level of feedback maturity
- Provides a good basis for performance improvement
- Improved understanding of leadership expectations
- Places focus on “how” things get done as opposed to what gets done
Other benefits:
- Contributes to growing cultures of openness, feedback and Continuous Improvement
- Supports individual accountability for own development
- Enables/ promotes development dialogue
- Acts as a catalyst for positive change
- Supports and encourages personal development
- Reinforces the link between behaviour and performance
If asked to identify the most important benefit of a 360 degree review, it would be its ability to focus participants on their working approach and how they can do things even better.
-
What are the disadvantages of 360 feedback when it’s used for development purposes?
The following potential disadvantages apply, particularly if the process is not well managed:
- Developmental/ critical feedback may not be received well by the participant resulting in negative / bad/ resentful feelings
- Anonymity can provide a shield for some to use the process for personal revenge
- The reliability (consistency) of people’s interpretation of the questions, rating scale, motives etc brings a ‘subjective’ (as opposed to objective) slant to the feedback/ ratings
- Not all feedback is ‘good quality’ and can leave room for interpretation
- Participants not taking a balanced view of their feedback – focusing more on the developmental ratings/ comments
Having spent over 20 years supporting organisations to get 360 feedback right, and having facilitated feedback for many thousands of participants, we can cite examples of all of the above. However, they can all also be easily avoided!
-
Why do 360 feedback initiatives fail?
This is rarely because of the disadvantages listed above, but typically because of:
- A lack of organisational appetite/ engagement
- Poor communications
- No clear implementation strategy particularly with regards to expected outputs and how they will be supported and measured
-
Are 360 performance reviews effective?
Unfortunately there isn’t a simple ‘yes – no’ answer to this one. When used for ‘performance measurement’ the context totally changes and with that comes a higher risk that it doesn’t deliver what was initially expected. Our brief article 360 appraisal model for performance management provides further useful reading.
-
What are the pros and cons of using 360 degree feedback for performance evaluation?
The following is based on the starting assumption that the process is managed either ‘very well’ or ‘very poorly’.
Advocates of using 360 feedback as part of a performance appraisal/ management process would argue that the pros are:
- Input from multiple stakeholder groups results in a more rounded appraisal/ breadth of perspectives
- Highlights behavioural performance strengths, weakness and gaps
- Supports the identification of development opportunities
- Places focus on core competencies/ behaviour expectations
Opponents to using 360 feedback as a performance management tool would argue the cons are:
- Risk of game playing – You score me high and I’ll return the favour
- Accuracy of results – From a rater reliability quality perspective. i.e. consistency in how the scale and behavioural statements are interpreted
- Rater motivations (conscious or unconscious) and human biases - Everything from not wanting to disadvantage the appraisee to the halo/horns effect
- Identifying sufficient raters – Statistical reliability increases with 7 or more
- Lots of evidence shows that, if done badly, it damages morale/culture
- Defensive ‘rejection of the data’ - Particularly if the feedback is perceived as being unjustified or that other rater motives are at play
- Question quality – Appropriateness to role, observability, vagueness, subjectiveness etc
- The value/weighting of different respondent ratings – Does a Line Manager’s view carry more weight than a Direct Report’s?
- Cost – In time and software
- Typically results in focus being placed on weaknesses / development
-
Why do 360 evaluations not work?
We know of several organisations that use 360 feedback as part of their annual performance management process and it works very well. However, the failure of the majority of organisations to effectively mitigate and manage the potential pitfalls results in the initial ‘great idea’ failing to get past its first annual outing!
Purchasing 360 feedback
Some of the main questions asked by those purchasing 360 feedback. For further information, we strongly recommend The HR Professionals guide to purchasing 360 feedback services.
-
What are the key things to consider when purchasing 360 feedback?
Whilst there are many things to consider when choosing a 360 feedback provider, the core questions to ask yourself are:
- Do we require a 360 feedback specialist or will a general survey tool provider or software specialist do?
- How experienced are they…. Are they an established, credible provider?
- Do we expect the provider to complete any administrative tasks (dealing with bounce back emails, report production etc ) or will we do that?
- Is their system’s user experience (UX) simple and intuitive?
- Do we require a standard questionnaire or is a purpose built one needed?
- How does their total (all in) cost compare to others?
- What other services (coaching, train the coach workshops etc) do we require?
-
What 360 feedback questionnaire should we use?
The key questions to ask yourself here are:
- How will we ensure our questionnaire is ‘fit for purpose’?
- If we use an ‘off the shelf’ questionnaire will the provider help us to update it?
- If we want to use a bespoke/purpose built questionnaire – What’s the cost and time frame involved?
-
What 360 feedback reports will be needed?
Most providers now offer three core reports that you should be able to modify to meet your specific needs. The standard reports are:
- Individual feedback report – Intended for use by the participant
- Comparative reports – Used to measure/monitor an individual's progress over time
- Training needs analysis (group) reports – These reports collate feedback from a group of recipients to produce a high level executive summary
-
How do we make sure our 360 feedback process is GDPR compliant?
In general, your 360 provider is primarily responsible for ensuring that processing activities comply with EU data protection law and that appropriate technical and organisational measures have been implemented to ensure compliance with the GDPR. Therefore, the easiest way to get this right is to:
- Insist they provide a Data Processing Agreement – This is a legally binding contract that will provide you with the legal assurance they are GDPR compliant
- Insist they have a Personal Data Privacy Notice - This notice should explain in simple, concise and plain language what happens to users’ 360 feedback data, and it should be clearly accessible at the point users input into the system
- Ask to see their GDPR/IT/Data Security Policy – This will provide further reassurance that your data is in safe hands
-
How much does a 360 assessment cost?
The wide range of 360 feedback services, a lack of standard pricing terminology and the reluctance of many providers to be upfront (on their website) about prices, makes comparing costs between different providers difficult.
The only effective way to compare costs between several providers is to:
- Be clear about the service you require
- Identify the type and number of reports needed
Then seek a range of quotations based on the above and request that the final summary shows a ‘total cost per participant’.
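As a rough illustration of the ‘total cost per participant’ comparison, the sketch below uses a hypothetical fee structure (a setup fee plus a per-report charge); the provider names and figures are invented for the example:

```python
# Hypothetical quotations, invented purely to illustrate the comparison.
quotes = {
    "Provider A": {"setup_fee": 500, "per_report": 60, "participants": 40},
    "Provider B": {"setup_fee": 0, "per_report": 85, "participants": 40},
}

# Reduce each quote to a single comparable figure: total cost per participant.
for provider, q in quotes.items():
    total = q["setup_fee"] + q["per_report"] * q["participants"]
    print(f"{provider}: total £{total} -> £{total / q['participants']:.2f} per participant")
```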
Giving and receiving 360 feedback
Providing meaningful 360 feedback is very important and requires understanding and skill. The most common questions asked by 360 feedback raters/respondents are:
-
How do I provide good, constructive 360 feedback?
Our recommended guiding principles for completing 360 degree feedback questionnaires are to:
- Be honest – Complete the questionnaire as accurately as possible, detailing the honest perception you have of the participant. This also means sharing your perceptions of the things you believe he/she could further develop – Remember, honest feedback is the best gift you can give someone!
- Provide written comments wherever you can – Participants find these a particularly rich source of useful information and your comments will form a crucial part of the overall feedback report – Please aim to:
- Choose your words carefully ensuring you are both candid and fair
- Provide feedback that is specific and actionable
- Neutralise your feedback comments if you are concerned about anonymity
- Avoid sitting on the fence – Make good use of the rating scale, avoiding ‘middle of the road’ ratings. Limiting yourself to a few of the rating values will make the feedback bland and very difficult for the recipient to identify the development messages
- Be clear about your motives – The aim of the exercise is to support participants to build a genuine awareness of those things they are doing well and those areas they could further develop. If your motive for giving the feedback involves revenge, having a dig, venting or sniping then please think again – this isn’t the place to put the knife in!
-
How do you give negative/ critical/ developmental/ constructive 360 feedback?
Highlighting another person’s development opportunities is a key part of the 360 feedback process but one that many respondents struggle to get right. Use the following to guide your feedback ratings and comments:
- Feedback with respect - Be mindful of the person receiving the feedback, their style, preferences and how it will ‘land’ in their eyes. In short – Honesty with thought
- Share your perception only – Ensure your feedback and ratings reflect your own experience. Avoid sharing tittle-tattle or hearsay
- Be specific – Avoid generic, bland statements – Provide feedback that is specific and actionable
- Offer development ideas – Aim to offer a potential solution
- Honourable intent – Confirm to yourself that the only reason for providing the feedback is to support the person to be even better
- Use the right words – Try and avoid the use of absolute words such as ‘must’, ‘every’, ‘always’, ‘just’, ‘never’ – They are rarely factual and normally create an emotional response
-
What should I write in 360 feedback?
The overarching goal of most 360 feedback processes is to support the participant to:
- Identify and capitalise on their strengths
- Identify and develop aspects of their style/ approach that could be better
With this in mind – You should always aim to provide ‘balanced’ feedback – Sharing your perception of their strengths and development opportunities
-
Are there any examples of how to give 360 feedback?
The SBI model works well - Highlight the Situation, outline the Behaviour observed and reflect back the Impact it had.
Best practice 360 feedback guidelines
Some of the most frequently asked general questions about 360 degree feedback
-
Are 360 reviews anonymous?
Where clients request respondent anonymity, 360 providers normally try to achieve this by:
- Merging respondent scores together to provide an average
- Listing feedback comments in a randomised manner
To assist with the above, best practice also includes insisting on a minimum number (normally 3–4) of responses/completed questionnaires within each of the feedback groups.
Whilst the above does not directly ‘tag’ any comments or scores to an individual, in our experience most participants believe they can identify where low ratings have come from!
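As a rough sketch of the anonymity mechanics described above (group averaging, a minimum response threshold and randomised comment order), the example below uses invented group names, scores and comments; the threshold of 3 is an assumption and will vary by provider:

```python
import random
from statistics import mean

MIN_RESPONSES = 3  # assumed minimum per feedback group; the exact threshold varies by provider

# Invented example data.
group_scores = {
    "Peers": [4, 3, 5, 4],
    "Direct Reports": [2, 3],  # below the minimum, so this group's average would be suppressed
}
comments = ["Great listener", "Could delegate more", "Very supportive"]

# Only show a group average where there are enough responses to protect anonymity.
for group, scores in group_scores.items():
    if len(scores) >= MIN_RESPONSES:
        print(f"{group}: average rating {mean(scores):.1f}")
    else:
        print(f"{group}: not shown (fewer than {MIN_RESPONSES} responses)")

# Present written comments in a randomised order so they cannot be tied to a respondent.
random.shuffle(comments)
print(comments)
```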
-
How often should 360 feedback be used?
Scheduling a repeat of the 360 process 12–18 months down the line allows time for behavioural changes to be implemented and people’s perceptions to change. Using a ‘comparative’ report to highlight areas of movement can support the measurement of perception change.
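To show the kind of movement a ‘comparative’ report highlights, here is a minimal sketch; the competencies and scores are invented for illustration:

```python
# Hypothetical average ratings from two rounds of 360 feedback, 12-18 months apart.
round_1 = {"Communication": 3.2, "Delegation": 2.8, "Strategic thinking": 3.9}
round_2 = {"Communication": 3.8, "Delegation": 3.5, "Strategic thinking": 3.8}

# A comparative report essentially surfaces the change in each area.
for competency, before in round_1.items():
    after = round_2[competency]
    print(f"{competency}: {before:.1f} -> {after:.1f} ({after - before:+.1f})")
```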