
Visualizing Developer Productivity
GitClear analyzes GitHub data and provides engineering teams with deep productivity insights. I worked with founder Bill Harding to design, evaluate, and launch the first version of the tool in 2018.
PART 1
Understanding the Problem
Engineering managers struggled to get insights from GitHub reports.
Software development is a complex process that is notoriously hard to measure. Without reliable data, it’s difficult for managers to evaluate developers and their work effectively. They rely instead on standup meetings, check-ins, and status reports, which take hours and offer limited insight.
Even when data is available, it’s only useful if managers and developers agree on its validity. Programming involves research, decision-making, and feedback in addition to writing code. Many developers don’t believe that code work should be quantified and are wary of being measured at all.
As a manager and programmer, Bill was frustrated by the lack of insight he had into his team’s work and by existing tools. He developed an algorithm that analyzed GitHub commits and assigned a value to each line of code. He asked me to work with him to visualize this new data set for developers.
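To make the idea of per-line scoring concrete, here is a minimal sketch of how a commit-scoring algorithm like this might work. The weights, change categories, and churn discount below are all assumptions for illustration; they are not GitClear's actual Line Impact formula.

```python
# Hypothetical per-line commit scoring (assumed, not GitClear's actual
# Line Impact algorithm): weight each changed line by the kind of change,
# and discount lines that are churned (rewritten again shortly after).

from dataclasses import dataclass

@dataclass
class LineChange:
    kind: str        # "added", "deleted", "moved", or "whitespace"
    churned: bool    # True if the line was rewritten again soon after

# Assumed weights: meaningful additions count most; moves and
# whitespace-only edits count little or nothing.
WEIGHTS = {"added": 1.0, "deleted": 0.25, "moved": 0.125, "whitespace": 0.0}

def commit_impact(changes: list[LineChange]) -> float:
    """Sum weighted line changes, halving credit for churned lines."""
    total = 0.0
    for c in changes:
        weight = WEIGHTS.get(c.kind, 0.0)
        if c.churned:
            weight *= 0.5
        total += weight
    return total

changes = [
    LineChange("added", False),       # 1.0
    LineChange("added", True),        # 0.5 (churned)
    LineChange("moved", False),       # 0.125
    LineChange("whitespace", False),  # 0.0
]
print(commit_impact(changes))  # 1.625
```

The key design idea this sketch captures is that not all lines are equal: a scoring model can reward durable, substantive additions while discounting mechanical or short-lived edits.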
PART 2
Identifying User Outcomes
The leaderboard concept was designed to let managers and team members see real-time updates, click in to review work, and intervene when someone gets stuck.
I conducted several interviews with engineering managers and team members to observe their process, learn about pain points, and hear how they felt about quantifying their work. I identified themes and framed these as user outcomes, using key words from the conversations.
As a manager, I want to:
Evaluate my team using reliable data
Spend less time keeping up with work
Learn how my team is improving over time
Know when to step in and help
As an engineer, I want to:
Be evaluated fairly and transparently
Be recognized for my accomplishments
Get help when I need it so I can get unstuck
PART 3
Ideation and Design
Data visualizations should be clear and useful for managers and members of the team. The goal is to provide transparency and encourage shared language.
Design Principles and Visual Direction
As design got started, we agreed on a set of design principles based on our research:
Design for managers and developers. Data should be transparent, consistent, and easy to understand.
Offer insights, not advice. Give managers information, but don't tell them how to do their job.
Avoid pitting developers against one another. Celebrate achievement or offer to help.
Complement existing workflows. Our tools should reduce work by aligning with GitHub, Jira, and others.
Key words include: bright, clean, electric, friendly, smart, trustworthy, accessible, meaningful, impactful.
Dashboards and Data Visualizations
Our team, which included Bill, two developers, and me, talked extensively about daily work, challenges, and scenarios. We sketched ideas and agreed on four areas of focus: daily activity, team leaderboard, individual skills, and productivity trends.
After we reviewed rounds of sketches and wireframes, I designed a high-fidelity prototype. The dashboards had information about teams and individuals, and focused on graphs and data visualizations. We left some of the details open-ended, assuming that we’d iterate once our team began using the tool.




PART 4
Testing and Iteration
Early user feedback helped us identify practical ways to visualize data. The Performance Review page focused on a specific use case: facilitating conversations between engineering managers and team members.
Observation and Design Changes
During implementation, the Bonanza engineering team began to use the tool as part of their code review workflow. This early feedback helped us refine the algorithm and product design.
The Daily Activity design included alerts and actions, which we thought would help managers know when to review, congratulate, or help. However, our team didn’t seem to use these and found them disruptive.
We replaced this with a recent commits feed, which grouped related changes together. Users could view code changes in our new diff viewer or click through to related Jira projects. This proved much more useful to the dev team. It helped them see the entire body of work in real time and understand how Line Impact scores were being calculated.
We also heard feedback about a need to customize the scoring mechanism. We added calibration settings, which allowed users to exclude file types, adjust default scoring, and set up code categories.
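A rough sketch of how calibration settings like these could be applied is below. The setting names, patterns, and multipliers are hypothetical, invented for illustration rather than taken from GitClear's actual schema.

```python
# Hypothetical calibration settings (names and structure are assumed):
# exclude certain file types, rescale default scores by category, and
# tag files into categories using path patterns.

from fnmatch import fnmatch

settings = {
    "excluded_extensions": [".lock", ".min.js", ".svg"],
    "score_multiplier": {"tests": 0.5, "docs": 0.25},
    "categories": {
        "tests": ["test/*", "spec/*"],
        "docs": ["docs/*", "*.md"],
    },
}

def categorize(path: str) -> str:
    """Return the first category whose pattern matches, else 'source'."""
    for name, patterns in settings["categories"].items():
        if any(fnmatch(path, p) for p in patterns):
            return name
    return "source"

def adjusted_score(path: str, raw_score: float) -> float:
    """Apply exclusions and per-category multipliers to a raw score."""
    if any(path.endswith(ext) for ext in settings["excluded_extensions"]):
        return 0.0
    category = categorize(path)
    return raw_score * settings["score_multiplier"].get(category, 1.0)

print(adjusted_score("Gemfile.lock", 3.0))        # 0.0 (excluded)
print(adjusted_score("test/user_test.rb", 2.0))   # 1.0 (tests halved)
print(adjusted_score("app/models/user.rb", 2.0))  # 2.0 (unchanged)
```

Exposing the scoring knobs this way lets each team encode its own judgment about what counts, which is what made the calibration feature useful for building trust in the numbers.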
Beta Testing and User Feedback
We released a beta version in fall 2017, and began to receive a steady stream of feedback from trials and demo requests. At first, customers were interested but skeptical. For example, many loved the skills graphs, but weren't sure how they'd use the data. Was it reliable? What action would they take? Would their team agree with what was implied by the graphs?
Based on this feedback, we decided to look at more practical ways to visualize data. For example, a common theme during demos was challenges doing annual performance reviews. We realized that our skills and productivity graphs were missing some key information: biggest achievements during the year, performance on different types of projects, and feedback from peers. We designed a new performance review tool to help address this pain point.

PART 5
Measuring and Learning
GitClear received warm reviews from the small group of early adopters.
After extensive testing, we released GitClear in 2018. Bonanza’s engineering team saw significant results and we began to see increased demand for demos and trials.
Bonanza developers reported that they spent fewer hours per week in meetings and reviewing code.
The Bonanza development team’s output increased year over year. The ability to measure helped managers set goals with teams and Line Impact trend graphs made it easy to see progress.
Bonanza managers had more data for annual reviews and project prioritization. This made promotions and raises more transparent and encouraged team buy-in.
Customer adoption was low at first, but rose steadily in 2019. Customers who signed up rated the tool highly, and reported similar time-saving and productivity improvements.
We also learned a lot from our team and early adopters about how we can improve:
It’s important to validate ideas early and often with people from different organizations and backgrounds.
There’s a difference between what people like and what they use. Skills graphs were appealing, but less useful than Annual Review and Standup tools.
Algorithms and data feel objective, but are still driven by human logic and interpretation. It’s important to keep this in mind and actively reduce bias.
Making the data available, clear, and adjustable did a lot to build trust with developers. Even if they didn’t agree with a score, it became easier to talk about why.
GitClear Team
Bill Harding: Founder, Lead Developer
James Spence: Design Director, Lead Product Designer
Liz Johnson: Senior Product Designer
Matthew Kloster: Senior Developer
Eric Salczynski: Senior Developer
Amy Bell: Senior Product Manager
Kevin Andrews: Project Manager