360-Degree Feedback #3: Preparation and Implementation – How to Do It Right
Once you’re clear on what you want to evaluate and who will evaluate whom, the hardest part begins: implementation. This stage decides whether your project succeeds or becomes just another “HR initiative” everyone forgets about in six months. The key is thorough preparation and, above all, earning the trust of people across the organization.
Start Strategically, Not Tactically
The most common mistake I see is focusing on technical details without any strategic thinking. HR managers often start with questions like: “What software should we use?” or “How long will the questionnaire be?” Instead, you should begin somewhere entirely different.
Define a Clear Purpose
Imagine that in a year you’re evaluating the success of your 360-degree feedback process. What would have to change for you to say, “It worked”? Do you want your managers to communicate more effectively? Do you want to reduce team turnover? Improve cross-departmental collaboration?
Pro tip: A clear objective is the foundation of success. Organizations that implement 360-degree feedback just because “everyone else is doing it” often fail. Successful implementations begin with a specific goal, for example: “We want our team leads to be better people managers.” Then the entire process is tailored to that goal – from selecting competencies to follow-up development activities.
Gain Executive Support – And I Mean Real Support
You might think: “Sure, executive support – that’s an HR cliché.” But with 360-degree feedback, it’s absolutely critical. It’s not just about approving the budget. You need leadership to:
- Understand the purpose and expected benefits.
- Actively communicate the importance of the process.
- Lead by example (ideally, by participating themselves).
- Support follow-up development activities.
Pro tip: A formal “yes” from leadership is not enough. If the CEO says, “Okay, let’s do it,” but doesn’t participate or speak about it publicly, employees quickly understand it’s not a real priority. The project then withers, no matter how well prepared. Leaders must actively demonstrate the importance of the process – by participating themselves, regularly addressing it in meetings, asking about results, and supporting follow-up actions. Visible support is just as important as budget allocation.
Communication, Communication, Communication
If there’s one thing you should remember from this, it’s the importance of communication. People naturally fear 360-degree feedback. They worry it will be a “witch hunt,” that someone will be punished for bad results, or that it will be used against them.
What and How to Communicate
Your communication must be clear, honest, and repeated. Key messages include:
- Why we’re doing this: “We want to support the development of our key people and help them lead their teams more effectively.”
- What it means for employees: “You’ll receive valuable feedback to help you grow. This is not about performance evaluation for reward or punishment purposes.”
- How anonymity is protected: “All responses except from your direct manager will be anonymous. Only you and your coach will see individual comments.”
- What happens with the results: “Together, we’ll create an individual development plan. The results will not be used to decide on salary or promotions.”
Pro tip: Invest in information sessions. Organize presentations for everyone involved, where you explain the process, answer questions, and most importantly – let people know who they can contact if they have concerns. Transparency and open communication are the foundation of trust. Often, these sessions determine whether people see the process as helpful or just another “HR thing.”
Creating the Feedback Tool
Now comes the “technical” part – creating the questionnaire. It may seem simple, but a quality questionnaire is an art: it must be detailed enough to provide useful data, yet not so long that it discourages raters.
Structure of an Effective Questionnaire
A proven structure looks like this:
- Introduction (5 minutes): Explains the purpose, emphasizes anonymity, provides estimated completion time (usually 20–25 minutes), and includes a contact for questions.
- Scaled questions (15 minutes): For each competency, 3–5 specific behavioral questions using a five-point scale (Never – Rarely – Sometimes – Often – Always), plus a “Cannot assess” option.
- Open-ended questions (5–10 minutes): 3–4 qualitative questions like “What are this person’s main strengths?” or “Where could they improve the most?”
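The structure above maps naturally onto a small data model, which also makes the 3–5-questions-per-competency guideline easy to check before the questionnaire ships. A hedged Python sketch; all class and field names are illustrative, not from any specific survey tool:

```python
from dataclasses import dataclass, field

@dataclass
class ScaledQuestion:
    competency: str
    text: str  # specific, behavioral phrasing

@dataclass
class Questionnaire:
    competencies: dict[str, list[ScaledQuestion]] = field(default_factory=dict)
    open_questions: list[str] = field(default_factory=list)

    def add_scaled(self, competency: str, text: str) -> None:
        self.competencies.setdefault(competency, []).append(
            ScaledQuestion(competency, text)
        )

    def validate(self) -> list[str]:
        """Flag deviations from the 3-5 scaled / 3-4 open-ended guideline."""
        issues = []
        for comp, qs in self.competencies.items():
            if not 3 <= len(qs) <= 5:
                issues.append(f"{comp}: {len(qs)} questions (aim for 3-5)")
        if not 3 <= len(self.open_questions) <= 4:
            issues.append("aim for 3-4 open-ended questions")
        return issues

q = Questionnaire()
q.add_scaled("Communication", "Actively listens to others without interrupting")
q.add_scaled("Communication", "Provides clear and understandable instructions")
q.open_questions = [
    "What are this person's main strengths?",
    "Where could they improve the most?",
    "What should they start doing?",
]
print(q.validate())  # → ['Communication: 2 questions (aim for 3-5)']
```

The validation step is cheap insurance: it catches an under-specified competency before raters ever see the form.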
Example of a well-phrased scaled question:
Instead of the vague “Is a good communicator,” use: “Actively listens to others without interrupting” or “Provides clear and understandable instructions.”
Selecting and Preparing the Raters
You might be surprised, but the quality of feedback depends less on who you choose and more on how you prepare them. Even the most suitable person can give unhelpful feedback if they don’t know how to do it.
Criteria for Selecting Raters
Raters should:
- Know the person being rated for at least 6 months.
- Work with them regularly.
- Have observed their behavior in relevant situations.
- Be capable of providing honest, constructive feedback.
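The first three criteria can even be pre-screened programmatically if you have basic relationship data. A small illustrative Python sketch; the record fields and the interaction threshold are assumptions for the example, not part of the method (and the fourth criterion, constructive-feedback skill, still needs human judgment):

```python
from dataclasses import dataclass

@dataclass
class Relationship:
    """Hypothetical record of a rater's working relationship with the ratee."""
    rater: str
    months_known: int         # how long they have known the person being rated
    interactions_per_month: int

def eligible_raters(relationships: list[Relationship]) -> list[str]:
    """Apply the tenure and regular-contact criteria from the text.

    The 6-month minimum mirrors the guideline above; the interaction
    threshold is an assumed proxy for "works with them regularly".
    """
    return [
        r.rater
        for r in relationships
        if r.months_known >= 6 and r.interactions_per_month >= 4
    ]

candidates = [
    Relationship("Anna", months_known=18, interactions_per_month=20),
    Relationship("Ben", months_known=3, interactions_per_month=30),    # too new
    Relationship("Clara", months_known=24, interactions_per_month=1),  # too little contact
]
print(eligible_raters(candidates))  # → ['Anna']
```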
Rater Training – The Most Critical Step
This is where most projects go wrong. People know how to rate, but not how to give quality feedback. Effective training should include:
- Basics of constructive feedback: Focus on behaviors, not personality. Instead of “John is arrogant,” say: “John frequently interrupts others during meetings.”
- The SBI method (Situation – Behavior – Impact): “During Monday’s meeting (situation), you interrupted Maria three times (behavior), which made it seem like her opinion wasn’t valued (impact).”
- Avoiding common mistakes:
  - Halo effect (one positive trait skews all ratings)
  - Leniency or harshness bias
  - Recency bias (focusing on recent events only)
Pro tip: Investing in rater training pays off a hundredfold. Organizations that skip training often get vague comments like “He’s okay,” or “She should communicate more.” Proper training teaches raters to give specific, behaviorally-based feedback with concrete examples. The difference in data quality between trained and untrained raters is dramatic.
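One complementary, purely analytical safeguard against leniency and harshness bias is to standardize each rater’s scores against their own average, so a chronically generous or strict rater is compared with their own baseline. This is offered as an illustrative sketch alongside training, not as a step the process above prescribes:

```python
from statistics import mean, pstdev

def standardize_by_rater(scores: dict[str, list[int]]) -> dict[str, list[float]]:
    """Convert each rater's raw scores to z-scores (per-rater mean 0, sd 1),
    partly washing out chronic leniency or harshness.
    A complement to rater training, not a substitute for it."""
    result = {}
    for rater, values in scores.items():
        mu, sigma = mean(values), pstdev(values)
        result[rater] = [
            0.0 if sigma == 0 else round((v - mu) / sigma, 2) for v in values
        ]
    return result

print(standardize_by_rater({
    "lenient_rater": [5, 5, 4, 5],  # mostly top marks
    "harsh_rater": [2, 3, 2, 2],    # mostly low marks
}))
```

After standardization, the lenient rater’s lone 4 and the harsh rater’s lone 3 both stand out as the informative deviations, which is exactly the signal raw averages would hide.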
Timeline – Reality vs. Wishful Thinking
A common mistake is underestimating how much time implementation takes. A quality 360-degree process is not a sprint – it’s a marathon.
A Realistic Timeline Looks Like This:
- Preparation phase (4–6 weeks): Define goals, choose competencies, create the questionnaire, plan communications, train administrators.
- Communication and training (1–2 weeks): Inform employees, train raters, answer questions.
- Data collection (3–4 weeks): Distribute questionnaires, monitor progress, send reminders.
- Processing and feedback (3–4 weeks): Generate reports, conduct individual feedback sessions.
- Development plans (ongoing): Create and implement individual development plans.
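With phase durations fixed, concrete dates follow mechanically, and summing the upper bounds shows why a two-month target is unrealistic: the fixed phases alone take about 16 weeks. A small Python sketch; the start date is arbitrary and the durations use the upper end of each range above:

```python
from datetime import date, timedelta

# Phase durations in weeks, upper bounds of the ranges in the timeline above
PHASES = [
    ("Preparation", 6),
    ("Communication and training", 2),
    ("Data collection", 4),
    ("Processing and feedback", 4),
]

def schedule(start: date) -> list[tuple[str, date, date]]:
    """Turn phase durations into back-to-back start/end dates."""
    plan, cursor = [], start
    for name, weeks in PHASES:
        end = cursor + timedelta(weeks=weeks)
        plan.append((name, cursor, end))
        cursor = end
    return plan

for name, begin, end in schedule(date(2025, 1, 6)):
    print(f"{name}: {begin} → {end}")
```

A real plan would add buffer between phases; as the durations stand, the end-to-end process already spans roughly four months before development plans even begin.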
Pro tip: Don’t rush the timeline. Trying to complete the entire 360° process in two months usually leads to chaos. People are confused, give superficial feedback, and the whole thing feels unprofessional. A realistic timeline with ample buffer time (often twice your initial estimate) provides space for proper preparation, training, and processing. It’s better to postpone than to mess it up by rushing.
In the next part of this series, we will take a look at the actual evaluation process, from data collection to feedback, including practical examples and tips you can use yourself.