
360 Feedback Questionnaire Design – Best Practice Guide

The design of your 360 feedback questionnaire is the single most important factor in determining whether your process delivers meaningful, actionable insights. A well-crafted questionnaire not only ensures the quality and reliability of the feedback but also builds confidence among participants and stakeholders.

Drawing on extensive real-world experience, this guide sets out proven best practice for designing effective 360 feedback questionnaires. We highlight the principles that underpin successful design and provide practical recommendations.


Define the Purpose of your 360 Feedback Questionnaire

Effective design begins with a clear understanding of your goals. Your questionnaire should be purpose-driven, behaviour-based, and future-focused.

Start by asking: what are we trying to achieve with this 360 feedback process?

Common objectives include:

  • Developing leaders and managers
  • Underpinning high-potential programmes
  • Supporting culture change
  • Aligning individual development with organisational goals
  • Supporting performance measurement

Avoid generic templates. Instead, design questions that reflect your organisation’s leadership and management priorities. The strongest questionnaires link directly to the behaviours your organisation values and wants to develop.

For more information on ensuring that your 360 feedback process is anchored to a strategic driver, see our article Strategic Uses of 360 Degree Feedback.


360 Feedback Questionnaire Structure

A balanced questionnaire combines both quantitative and qualitative elements.

Quantitative (tick-box) questions: These provide structured data that can be used to identify trends and make comparisons. Ensure that questions are:

  • Behaviourally focused
  • Clearly worded and measurable

Qualitative (free-text) questions: Participants consistently rate written comments as the most valuable part of their feedback. Use free-text boxes to invite respondents to:

  • Elaborate on their ratings at the end of each competency area
  • Provide overall reflections at the end (e.g. strengths and development needs)

Questionnaire Length – Best Practice

The aim should be to design a questionnaire that can be completed in under 20 minutes. Longer surveys increase fatigue, reduce response quality, and can create significant costs for the organisation in terms of time spent. A concise, well-structured questionnaire also helps maintain participant engagement, encourages higher completion rates, and demonstrates respect for people’s time.

Analysis of data from over 10,000 respondents shows the following average completion times:

  • 30 questions → 9 minutes
  • 40 questions → 10 minutes
  • 50 questions → 12 minutes
  • Free-text responses → 3 minutes per question

Based on these insights, our recommended best practice is:

  • 40 quantitative questions across 3–4 competency areas
  • 1 free-text box per competency area
  • 2 summarising qualitative questions at the end
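
The averages above can be combined into a rough completion-time estimate when planning your questionnaire. A minimal sketch in Python (the function name is illustrative, and the 40-question, 3-box mix is just one example configuration):

```python
# Average completion times from the guide's data (10,000+ respondents).
RATING_TIME_MIN = {30: 9, 40: 10, 50: 12}  # total minutes for N rating questions
FREE_TEXT_MIN = 3  # average minutes per free-text question

def estimated_minutes(rating_questions: int, free_text_questions: int) -> int:
    """Estimate total completion time from the guide's averages.

    rating_questions must be one of the measured sizes (30, 40, 50).
    """
    return RATING_TIME_MIN[rating_questions] + FREE_TEXT_MIN * free_text_questions

# e.g. 40 rating questions plus 3 free-text boxes:
print(estimated_minutes(40, 3))  # 10 rating minutes + 9 free-text minutes = 19
```

Running the numbers like this before launch makes it easy to check a proposed design against the 20-minute target.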

Designing a 360 Feedback Rating Scale

The choice of rating scale should align with the purpose of your 360 feedback, whether it’s performance evaluation or development.

Common types of scales:

  • Judgemental scales (performance-focused): Measure competence, compare individuals to peers, or capture satisfaction scores.
  • Frequency/observation scales (recommended for development): Measure how often behaviours are observed (e.g. Almost Always → Rarely).
  • Agreement scales (e.g. Strongly Agree → Strongly Disagree): More suited to attitude surveys than to 360 feedback.

Best practice for rating scales:

  • Avoid 3-point scales — they’re too limited and reduce reliability.
  • Use 5- to 7-point scales for a balance between reliability and ease of use.
  • Label every point clearly to ensure consistent interpretation.
  • Use one consistent scale format throughout the questionnaire for simplicity and clarity.
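
These principles can be captured in a simple data structure. A minimal sketch, assuming a 5-point frequency scale (the label wording is illustrative, and excluding 'not applicable' responses from averages is a common convention rather than a prescribed rule):

```python
# Illustrative 5-point frequency scale with every point labelled,
# plus a 'not applicable' option for unobserved behaviours.
FREQUENCY_SCALE = {
    5: "Almost Always",
    4: "Often",
    3: "Sometimes",
    2: "Occasionally",
    1: "Rarely",
    0: "Not Applicable / Can't Answer",  # excluded when averaging
}

def average_rating(ratings):
    """Average the observed ratings, ignoring 'not applicable' (0) responses."""
    scored = [r for r in ratings if r != 0]
    return sum(scored) / len(scored) if scored else None

# Three observed ratings and one 'not applicable':
print(average_rating([5, 4, 0, 3]))  # (5 + 4 + 3) / 3 = 4.0
```

Keeping one scale definition like this for the whole questionnaire also makes the consistency rule above easy to enforce in practice.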

Testing and Validation

Before launching your 360 feedback questionnaire, involve a small group of typical respondents and a few senior leaders in a structured review process. The aim is to:

  • Test clarity and relevance – ensure each question is easy to understand, directly relevant, and based on observable behaviours, helping to improve data quality and reduce ambiguity.
  • Validate alignment with purpose – confirm that every item supports the core objectives of the 360 feedback process and reinforces the insights you aim to deliver.

After testing, refine based on feedback by adjusting the wording, structure, and length to ensure the final questionnaire is balanced, concise, and straightforward to complete.


360 Questionnaire Design – Best Practice Summary

Key principles:

  • Focus on quality, not quantity: Use a smaller number of well-designed questions that directly measure the behaviours you want to emphasise. Avoid overwhelming respondents with too many items. Aim for around 40 quantitative questions supported by 5–6 free-text boxes.
  • Group related questions into competency areas: Cluster similar items into 3–4 clear behavioural themes (e.g. Leading Others, Communication, Team Relationships) to:
    • Improve respondent focus
    • Provide context for each question
    • Strengthen the reliability of the data
  • Introduce each section clearly: Use brief contextual introductions to explain what is being measured and guide respondents effectively.
  • Use the right rating scale:
    • Apply a consistent 5–7 point frequency scale across the questionnaire.
    • Include a ‘can’t answer’ or ‘not applicable’ option for situations where respondents have not observed certain behaviours.
  • Pilot and refine before launch: Test the questionnaire with a small group to identify ambiguities, improve clarity, and ensure it works as intended.

360 Feedback Questionnaire Design: Frequently Asked Questions

Q. How many questions should a 360 feedback survey include?

A. Aim for around 40 rating questions grouped into 3–4 competency areas. Include one free-text box per area and two summary questions at the end. This balance keeps surveys concise, maximises engagement, and ensures reliable, high-quality feedback.


Q. Which rating scale works best for 360 feedback?

A. Use a 5–7 point frequency-based scale focused on how often behaviours are observed. Label each point clearly and use one consistent format throughout to improve reliability and make results easier to interpret.


Q. How do you pilot a 360 feedback questionnaire effectively?

A. Test the questionnaire with a small group of typical respondents and senior leaders. Check clarity, relevance, and alignment with objectives. Refine wording, structure, and length based on feedback to ensure the final version is concise and easy to complete.


Q. What are the best practices for 360 feedback questionnaire design?

A. Focus on quality, not quantity. Use clear, behaviour-based questions grouped into logical competencies. Apply a consistent rating scale, include targeted free-text questions, and pilot before launch. Design the questionnaire to deliver actionable, meaningful insights.


