You want the analytics? You can't handle the analytics!
This is why Agile and all of the other “user-centered” practices have come to rely so heavily on proxies for the end-user, e.g. the product owner. Make no mistake, “user proxies” compensate for an inherent weakness in most of today’s development practices – a lack of any consistent, reliable, or scalable means to capture runtime intelligence. ...but all is not lost.
Web site development – what can it teach us?
Let’s be honest – hardcore developers don’t consider website designers or the users of those “website builders” to be “real” developers. What do they know about algorithms, distributed architectures, or anything to do with the craft (dare I say art?) of engineering quality software? OK, but guess what? These “wannabe developers” focus on – and demand empirical evidence in support of – how their applications are really being used in the wild. In fact, the most rudimentary “drag and drop” website developer not only expects to gather real-world usage statistics but also knows that this information will be a (the?) critical factor in future development iterations. They know that only a fool would build something with no way to measure BOTH adoption AND the business impact of that adoption.
Yes, that’s right; website developers actually correlate click-by-click behavior with financial results! Now riddle me this - how many non-web applications are developed with that kind of accountability built-in? The answer isn’t even 0 – it’s null.
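To make the point concrete, here is a minimal sketch of the kind of correlation website developers do routinely: joining click events to revenue by session. All of the event data and field names here are hypothetical, purely for illustration.

```python
# Illustrative sketch: correlating click-by-click behavior with revenue.
# The events, sessions, and amounts below are made-up sample data.
from collections import defaultdict

clicks = [
    {"session": "s1", "button": "buy_now"},
    {"session": "s1", "button": "help"},
    {"session": "s2", "button": "buy_now"},
    {"session": "s3", "button": "help"},
]
orders = {"s1": 49.00, "s2": 19.00}  # session id -> revenue from that session

# Attribute each session's revenue to every button clicked in that session.
revenue_by_button = defaultdict(float)
for click in clicks:
    revenue_by_button[click["button"]] += orders.get(click["session"], 0.0)

# Which UI paths actually correlate with money changing hands?
print(dict(revenue_by_button))
```

Crude as it is, even this level of attribution is more financial accountability than most non-web applications ever get.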
You want the analytics? You can’t handle the analytics!
The website developer has even more to teach “real developers.” Website developers have long understood that analytics (when they are good) become, in their own right, bona fide assets – but, here’s the catch – this is only true when they are made public! Knowing something is popular makes it even more popular. So now comes the $64,000 question; if (and we already know it’s a big if) a development team is capturing usage information – how likely is it that they then turn around and share their results with their users, customers or sponsors? (Don’t laugh – it’s a serious question). Users want to benchmark themselves against their peers (usage patterns) and their applications against alternatives (the best tool for the job).
And now it gets a little awkward – if you don’t track usage, you can’t predict results, make corrections, or measure their impact. Developers who don’t incorporate real-world usage patterns into their development process are forced to treat this data as a potential liability. They must work to keep usage analytics confidential and cry foul when others ask to see that very information.
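For developers who do want to track usage, the instrumentation itself can start very small. The sketch below counts feature invocations in-process; every name in it (the class, the app, the features) is invented for illustration – a real implementation would batch these events and send them to a collection endpoint rather than print them.

```python
# A minimal, hypothetical sketch of in-app usage tracking.
import json
import time
from collections import Counter


class UsageTracker:
    """Counts feature invocations in-process.

    A production version would persist or transmit these events;
    this one only aggregates them in memory.
    """

    def __init__(self, app_name):
        self.app_name = app_name
        self.counts = Counter()
        self.started = time.time()

    def track(self, feature):
        # Record one use of a named feature.
        self.counts[feature] += 1

    def snapshot(self):
        # An aggregate report – the kind of thing "open analytics"
        # would share with users and sponsors, not hide from them.
        return {
            "app": self.app_name,
            "uptime_seconds": round(time.time() - self.started),
            "feature_usage": dict(self.counts),
        }


tracker = UsageTracker("MyCodePlexProject")
tracker.track("export_to_csv")
tracker.track("export_to_csv")
tracker.track("search")
print(json.dumps(tracker.snapshot()["feature_usage"], sort_keys=True))
```

The design choice that matters is the last method: `snapshot()` produces an aggregate that is safe to publish, which is exactly the asset the previous paragraphs argue most teams lock away.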
This cannot be healthy. The exclusion of runtime intelligence from traditional development methodologies not only handicaps development, it diminishes the value of the software to those who matter most – the users and sponsors who are denied empirical evidence of their application’s impact.
Open Analytics and CodePlex…
I am using the term “open analytics” here to mean usage analytics that are available simultaneously to all application stakeholders: developers, their sponsors, users, potential users, and (yes) potential competitors. (I am not saying that the application’s source code is public – that would be open source, not open analytics.)
As more and more projects opt-in to share their usage statistics with the rest of the CodePlex community, they will see their software improve in quality and users will have one more metric (in addition to downloads and page views) to help predict the value of CodePlex projects.
If your software is as good as you tell everyone it is – and if you want to make it even better – then open analytics should be a welcome addition to your development arsenal. …but if you secretly fear genuine accountability, well, I guess that’s another story.