Back in November, we had the idea to collectively start tracking our time.

The timing felt right to do so as our Lanzarote retreat was right around the corner, which would give us a whole month to conduct this team experiment. So we did!

Time Tracking Screenshot 1

First, we elected to set up a company account with Harvest, a time-tracking tool. We then created per-user accounts and asked all team members to track and categorize their time for one month.

Getting on the same page

It's important to note that the goal has never been to go full-on Orwellian or to ensure everyone is cranking hours like Harvey Specter.

Quite the opposite: we saw it as a way to identify potential optimizations in our workflow, and to surface behaviors in how we use our time (for better or worse) that we may not have been aware of.

The goal of tracking our time was (and still is) to improve ourselves and how we work together as a team.

Setting up a process

We elected to follow a strict process around time-tracking:

  1. Install the Harvest desktop widget and iPhone app if you'd like; they may help simplify your time tracking.
  2. Whenever you start a task, start an entry in Harvest.
  3. Whenever you stop that task (even to get up and take a break), stop the entry.
  4. If you come back to the same task, start it again in Harvest. This creates a new entry, but entries for the same task can be linked together in reporting.
  5. With Harvest, link entries to the tickets/PRs you are working on.

Taking the time to set it up right

It turns out one of the hardest parts of this experiment wasn't picking up the habit of tracking time, but rather defining everything we track and putting it into the right buckets.

Creating projects

We initially defined the permanent projects:

  1. Customer Support — All time spent researching or responding to customer support should be tracked in the Customer Support project.
  2. Firefighting — Any issue that requires immediate attention because it is impacting customers.
  3. Fixing Broken Windows — Bug fixes, documentation clean up, etc.
  4. DNSimple Blog — Work performed on the blog—including writing, design improvements, reviewing, publishing, etc.
  5. DNSimple — Administrative tasks, tasks working ON the business (as opposed to FOR the business), tasks that do not fit in any of the above projects.

Other projects exist only while they are being worked on. For example, the API v2 project existed until it was launched in December.

After a bit of back and forth, we broke everything down by project, then by task. We chose to follow the same structure as our GitHub repositories, with a couple of additional items.

Tasks were a bit trickier to define, but we settled on:

  1. Administrative — Email, meetings, setting up new laptop environments.
  2. Process Development — Time spent writing and discussing processes.
  3. Project Management — Time spent defining projects, collaborating with the team to decide the details of the project, and organizing the steps necessary to complete projects.
  4. Programming — Time spent coding.
  5. Writing — Time spent writing blog posts, customer documentation, and internal documentation that is not defining a process.
  6. UI/UX — Design work and user-experience related work.
  7. Review & Feedback — Review of pull requests, feedback on issues.
  8. Research — Time spent researching in preparation for implementing a project.
  9. Evangelism — Writing talks, proposals, presenting at conferences, attending meetups.

From there, most of us used the GitHub integration to track the specifics of a task.

Takeaway: seeing the big picture

With the whole team tracking their every move, we quickly saw the outliers—projects that got a lot of "attention" and others that seemed to go untouched.

For example, we realized that not much love (or time) was going into fixing broken windows (i.e., fixing bugs, improving documentation, etc.), and as a result we decided to bring back WTF day.

On the other hand, seeing how much time we spent on specific projects made it possible to put a "time-cost" on some of them. For example, we could calculate the average cost of writing, publishing, and promoting a blog post—which, incidentally, allowed us to put a number on what its ROI should be.
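To make the "time-cost" idea concrete, here's a minimal sketch of that calculation. The CSV columns, the hourly rate, and the `blog_time_cost` helper are all assumptions for illustration—the real Harvest export format differs and we didn't compute it exactly this way.

```python
import csv
import io

# Hypothetical sample of exported time entries (column names are an
# assumption; a real Harvest export has a different layout).
SAMPLE_CSV = """project,task,hours
DNSimple Blog,Writing,3.5
DNSimple Blog,Review & Feedback,1.0
DNSimple Blog,Writing,2.5
Customer Support,Administrative,4.0
"""

def blog_time_cost(csv_text, hourly_rate, posts_published):
    """Average cost of one blog post: total tracked blog hours,
    times an assumed hourly rate, divided by posts published."""
    reader = csv.DictReader(io.StringIO(csv_text))
    blog_hours = sum(
        float(row["hours"])
        for row in reader
        if row["project"] == "DNSimple Blog"
    )
    return blog_hours * hourly_rate / posts_published

# 7 tracked blog hours at an assumed $100/h across 2 posts: $350 per post.
print(blog_time_cost(SAMPLE_CSV, hourly_rate=100, posts_published=2))
```

Once you have that per-post cost, comparing it to the value a post brings in (signups, traffic, etc.) gives you the ROI target mentioned above.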

The aftermath

One month passed. As agreed, we parted ways with our Orwellian fantasy, and unsurprisingly most of the team stopped tracking their every move as soon as the experiment ended.

Except for David, Simone, and myself. In a future post I'll share what we each got out of it. Stay tuned!