Topic: Performance Metrics
This topic has been replaced by TopicSelfAssessment
which expands the scope of the issue. (Metrics only serve as part of a process of self-evaluation.)
The OGC could develop metrics to evaluate its own performance towards achieving its goals. These metrics would contribute to a yearly self-analysis of OGC performance.
As part of the effort of self-evaluation, metrics could help spot areas where OGC activities have changed, slowed down, or disappeared. This might help the OGC identify areas which need attention.
For example, the OGC could track, year by year, the number of new Domain Working Groups and Standards Working Groups formed, the number of standards published, the current average length of time between the formation of a SWG and the publication or rejection of its standard, the number of comments submitted on different standards, the number of unaddressed public comments received during an RFC phase, and other similar passive metrics of activity. Collection of these metrics should be automated to avoid requiring staff attention. The metrics could then be reviewed periodically (e.g., yearly) to spot trends or changes and investigate their causes.
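As a minimal sketch of what one such automated passive metric might look like, the following computes the average SWG lifetime from formation to closure. The record structure, group names, and dates are illustrative assumptions, not actual OGC data:

```python
from datetime import date

# Hypothetical SWG records: (name, formation date, closure date or None).
# All names and dates here are illustrative examples, not real OGC data.
swgs = [
    ("Example SWG A", date(2012, 1, 10), date(2014, 2, 12)),
    ("Example SWG B", date(2011, 6, 1), date(2013, 3, 15)),
    ("Example SWG C", date(2013, 5, 20), None),  # still active
]

def average_swg_duration_days(records):
    """Average number of days from SWG formation to the publication or
    rejection of its standard, counting only SWGs that have closed."""
    closed = [(end - start).days for _, start, end in records if end is not None]
    return sum(closed) / len(closed) if closed else 0.0

print(average_swg_duration_days(swgs))
```

A script like this could be run yearly against whatever database records SWG formation and closure dates, so that no staff attention is needed between reviews.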
Relations to other topics
This topic is related to TopicUpdateVision
, since the metrics will serve to assess how effectively the OGC is reaching its vision.
It would be useful for the OGC to have a number of automated metrics covering the whole gamut of OGC activities that could give a sense of current effectiveness. Examples of such metrics could be:
- Participants: events, activity (in person, wiki, teleconf)
- Local fora: level of activity
- TC: # participants, # new participants, # presentations, votes, ...
- DWG: # presentations, level of activity, new arrivals, departures
- Outreach: number of comments in RFC, unsolicited, other
- State of the Abstract Specification: last revision, citations, use in standards
- Standards: number new, number revised, number without recent revision
- SWGs: duration, number of revisions
- uptake, use, test rates of Discovery, View, Access, Processing services
- time to implement (survey), time to debug after implementation
- number of test suites: new, delay since last revision, unused
- rates of use for testing, new instances declared to the OGC
- rates of certification, level of deployment of certified software
- feedback on use of OGC standards
- sampling of sentiment
- Budget: income versus expenditure
- Membership: level, activity, incoming, leaving, type
- Staff: satisfaction (survey?)
- 26 Jul 2013
Monitor the standards process: "Are any standards languishing?"
ISO seems to have a rule that standards must be renewed periodically.
The OGC has no such rule. There are many standards that seem to have
fallen through the cracks. Where such standards are central to others,
there perhaps ought to be an effort to sustain them.
=> ACTION: The OGC should periodically review all its standards to
assess the health of each standard in terms of revision
activity, addressing change requests, usage and other metrics.
Where appropriate, the OGC should figure out what help it
could bring to standards that are languishing.
- 26 Jul 2013
The Leadership group has not yet discussed this topic.