EVAL 360

A performance insight tool that lets companies assess and track talent skills and growth.

ROLE

UI/UX, Design Lead

LOCATION

EMEA Remote

TIMELINE

Nov–Dec 2022

STATUS

Unmaintained

Engineering managers often struggle with fragmented performance data. Eval360 centralises this through a multi-source assessment engine.

Developed during the high-stakes HNGi9 cohort, Eval360 is a functional SaaS tool designed to track engineering talent. I led the design of a holistic assessment system, shipping a product that translates complex peer and manager feedback into actionable data.

PROBLEM STATEMENT

“Engineering managers and HR leads need a way to evaluate the skills and growth of engineers within their teams because traditional performance reviews often lack structure, consistency, and real-time insight into technical development.”


Mobile screen showing a radar chart with a popup

Engineering teams are often evaluated based on output, not insight. Performance reviews get reduced to vague scores or generic check-ins, and that leaves both managers and engineers in the dark about what growth actually looks like.

We designed Eval360 as a skill evaluator that is focused on structure, consistency, and clarity. Something that could fit into a team’s workflow without becoming the workflow.

Eval360 was designed to:

Enable engineers to understand their skill strengths and gaps
Facilitate honest, constructive peer and manager feedback in a psychologically safe way
Deliver actionable insights for personal growth and company decision-making
Provide a distraction-free evaluation experience for all users

FigJam SWOT analysis for Eval360

WORKING ON THE EMPLOYEE SKILL ASSESSMENT FLOW

We started with interviews and surveys.

As a team, we spoke with PM interns and junior engineers to understand both sides of the evaluation experience: what was working, what felt unnecessary, and where the gaps were.

As design lead, I also benchmarked existing tools like Pluralsight, HackerRank, and Codility. While they were strong at content or testing, none offered a full-cycle evaluation experience that included automated tasks, self-reflection, peer feedback, and manager reviews in one system.

Flow of Peer review evaluation
We mapped the three core user roles and their journeys:

Engineers (being assessed)
Managers (assigning and reviewing)
Peers (providing additional context)

We designed the employee assessment flow to move users forward step by step:
Receive Assessment → Complete Task → Review → Insight & Growth Plan

Branding assets

CREATING THE LOOK

Eval360 needed to feel structured and built for performance without looking like a spreadsheet.

We kept the visual system clean and corporate through subtle design choices: rounded icons, breathable typography, and consistent spacing and layout rules.
We also built a shared design system of buttons, forms, data cards, tables, and other components to keep things consistent across the team.

Feature 1: Receive Evaluation Invite

TAKE ASSESSMENT ON EVAL360

Receive evaluation invitation

A personalized link takes the engineer straight into the evaluation: a short welcome message, the task title, and a clear CTA.

Design decision: no login walls. Just a direct, focused entry point. 

A detailed screen for users to take assessments
User takes evaluation

Whether it’s multiple-choice or a practical task, we designed the question layout with enough breathing space, grouped sections, and a timer to keep focus sharp.

A filled skill evaluation form
Peer review of evaluation submission

After self-evaluation, each engineer is also reviewed by peers. These reviews are anonymous, and peers rate the same skills the engineer rated themselves on, creating a balanced 360° feedback loop.

Peers use a 1–5 rating scale, with optional comment fields to offer constructive feedback or encouragement. Comments are kept anonymous to reduce bias and encourage honesty.

Evaluation summary
From the dashboard, engineers can click “See All” to view their full evaluation summary.

The evaluation summary shows a simple performance overview, visualized as a spiderweb chart, with each skill broken down in a table alongside scores from the self-review, peer review, and manager.

A suggested growth plan is included, along with practical feedback to help the engineer reflect and move forward.
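As a rough illustration of the data shape behind this view (a hypothetical sketch — the field names and structure are assumptions, not the shipped schema), the per-skill summary could be aggregated from the three review sources like this:

```python
from statistics import mean

# Hypothetical raw 1-5 ratings per skill, grouped by review source.
# An engineer has one self score, several peer scores, one manager score.
ratings = {
    "Problem Solving": {"self": [4], "peer": [3, 4, 5], "manager": [4]},
    "Code Quality":    {"self": [5], "peer": [3, 3, 4], "manager": [3]},
}

def summarize(skill_ratings):
    """Average each source's ratings so the chart and table get one score per source."""
    return {source: round(mean(scores), 1) for source, scores in skill_ratings.items()}

summary = {skill: summarize(sources) for skill, sources in ratings.items()}

for skill, scores in summary.items():
    print(skill, scores)
```

Keeping the three sources separate (rather than collapsing them into one number) is what makes the spiderweb chart useful: gaps between self, peer, and manager scores are exactly the conversations the growth plan should start.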

TAKEAWAYS

I had to think beyond UI. I was designing with developers in mind, planning for data structures, access levels, and review loops. It was my first time designing with scale and real implementation in view.

Designing for evaluation meant designing with care. From tone to layout, everything had to feel intentional and fair. When people feel a system is consistent, they’re more likely to engage honestly.

Product thumbnail

LET'S CONNECT

Copyright © 2026 Jilaga
