Metrics For Open Practice Projects

Just a quick post with some thoughts I was having last night while setting up our first (rough) Open Practice page about metrics.

One of the tough problems in our market is how to track and measure progress, success and failure. Solving this means coming up with a way to "score" our projects. Clearly, the ultimate proof is in the pudding -- either the site accomplishes its business goals or it doesn't -- but that's somewhat out of our hands. There may also be great value for the Open Practice in projects that don't completely succeed in their mission.

One of the things we've got to break out of is this "win all the time" business mentality. What organizations need is accountability, not internally-oriented spin creating an illusion of success. There are far too many entities out there that have no way to recognize failures, to process and learn from mistakes, and also to identify and salvage the individual pieces that worked from an overall puzzle that didn't.

We have a lot of ambitious ideas and clients, and we're consciously putting ourselves out there as innovators. It's part of our job to sometimes fall flat on our faces, and to make that a net-positive by taking away valuable information. That's what experimentation and the empirical method are all about.

Metrics
I'm getting a little off track. Running this Open Practice is partly about the ethic of experimentation, but it's also about getting data. Quantitative analysis, baby. That means metrics.

Think of it like this: if you had a baseball card for your website, what stats would be on the back? Some standard ideas:

  • Number of users, posts, comments.
  • Email list signups.
  • File downloads, video views, audio listens.
  • General site traffic / Google analytics.
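To make the baseball-card idea concrete, here's a minimal sketch of a stat line for a site. The field names and numbers are placeholders I've made up for illustration, not real project data:

```python
# A toy "baseball card" for a website: a named bundle of counts.
# All values below are made-up placeholders.
from dataclasses import dataclass

@dataclass
class SiteStats:
    users: int
    posts: int
    comments: int
    email_signups: int
    downloads: int

    def card(self) -> str:
        """Render the stats as a one-line 'back of the card' summary."""
        return (f"users={self.users} posts={self.posts} "
                f"comments={self.comments} signups={self.email_signups} "
                f"downloads={self.downloads}")

stats = SiteStats(users=120, posts=340, comments=980,
                  email_signups=75, downloads=410)
print(stats.card())
```

In practice these counts would come out of the site's database or Google Analytics rather than being typed in by hand.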

Those are pretty much established best practices by this point. I'm thinking about how to push the envelope a bit. First of all, there are some more pointed user/community metrics that could be implemented:

Metrics For Community

  • Ratio of successful logins to failed attempts.
  • Number of user-to-user connections on a social network / median number of connections per user.
  • Number of user-invites, emailed pages, etc.
  • Average logins, posts, comments, and comment/post ratings per day/week/month.
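As a sketch of how a couple of these community numbers could be computed, here's a pass over hypothetical raw event data. The input lists are stand-ins for whatever the site actually logs; this isn't any real Drupal API:

```python
# Sketch: compute two of the community metrics above from raw events.
from statistics import median

# Each login attempt is (user_id, succeeded) -- hypothetical log data.
login_attempts = [(1, True), (2, True), (1, False), (3, True), (2, False)]

# connections[user_id] = number of user-to-user connections -- also made up.
connections = {1: 4, 2: 1, 3: 7, 4: 2}

successes = sum(1 for _, ok in login_attempts if ok)
failures = len(login_attempts) - successes
success_ratio = successes / failures if failures else float("inf")

median_connections = median(connections.values())

print(f"login success:failure ratio = {success_ratio:.1f}")
print(f"median connections per user = {median_connections}")
```

The same shape of computation (group, count, ratio or median) covers most of the bullets above once the raw events are available.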

Finally, from the technical Drupal perspective, it might be good to have some code-level statistics that indicate the overall cleanliness of a project and how portable and scalable your success might be:

Metrics For Code-Quality, Scalability, Kludginess

  • List of contributed/custom modules in use.
  • Lines of contributed/custom code, including PHP pages, blocks, etc.
  • Total size in bytes of configuration: variables, views, CCK, etc.
  • Number of database tables, total db size.
  • Avg. queries per page-view, page load time, etc.
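To give a flavor of the database-level stats, here's a sketch using an in-memory SQLite database as a stand-in for a real site's database; the two tables are invented for illustration. (On MySQL, where a real Drupal site would live, the analogous numbers come from information_schema.TABLES instead.)

```python
# Sketch: count tables and measure total db size, using SQLite
# as a stand-in for a real site's database.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two invented tables standing in for a site's schema.
cur.execute("CREATE TABLE node (nid INTEGER PRIMARY KEY, title TEXT)")
cur.execute("CREATE TABLE users (uid INTEGER PRIMARY KEY, name TEXT)")

# Number of tables, from SQLite's catalog.
cur.execute("SELECT COUNT(*) FROM sqlite_master WHERE type = 'table'")
table_count = cur.fetchone()[0]

# Total size in bytes: pages * page size, via PRAGMAs.
page_count = cur.execute("PRAGMA page_count").fetchone()[0]
page_size = cur.execute("PRAGMA page_size").fetchone()[0]
db_bytes = page_count * page_size

print(f"tables: {table_count}, size: {db_bytes} bytes")
```

Module and code-line counts would come from the filesystem rather than the database, but the idea is the same: a handful of cheap queries that can be re-run on a schedule and charted over time.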

I'm going to continue developing these ideas. It should not be too hard to work these things up for our Open Practice projects (like NAN), and I'm looking forward to producing some prototypes.
