Cities and counties routinely report on their own performance year to year, and many also compare their metrics with the community next door. But how can they tell whether they're really performing as well as they could, or whether they're comparing apples to apples?

ICMA's approach to local government performance benchmarking simplifies the process by zeroing in on key indicators and providing agreed-upon definitions that help ensure meaningful comparisons. And it's free of charge.

  • 80 key indicators with consistent definitions
  • More than 20,000 data points available
  • Compare with peers or track year-to-year trends
  • No specialized software and no charge

This approach, Open Access Benchmarking, also provides flexibility so that cities and counties can reap the benefits of comparison regardless of their own reporting cycles and software choices.

A significant challenge for benchmarking initiatives is getting everyone on the same page: what to measure, how to measure it, and how and when to collect and analyze the data. Through its Performance Management Advisory Committee, ICMA has found ways to ensure consistency where it’s necessary while also limiting the need for centralized control.

The result is Open Access Benchmarking, led by jurisdictions. The city or county is in the driver’s seat, free from requirements governing software choices, data collection methods, or timeframes for entering, analyzing, or reporting performance data.

This approach is based on a set of Key Performance Indicators and corresponding definitions that were developed by the Advisory Committee and a working group of jurisdictions around the country. Data and definitions are available online, and updated versions of the dataset are posted continuously as we receive new submissions from jurisdictions.

Jurisdictions are welcome to download a copy of the data and use it internally or with a software provider of their choice. In return, ICMA encourages jurisdictions to share their data with ICMA (in Excel or CSV format) so that others can benefit from an updated and growing dataset. 
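Once a jurisdiction has downloaded the dataset, a peer comparison can be as simple as filtering the file to a given metric and year and comparing the local value against the peer median. The sketch below is illustrative only: the column names and metric are hypothetical, not ICMA's actual schema.

```python
import csv
import io
import statistics

# Hypothetical excerpt of a downloaded dataset. Column names and the
# metric name are illustrative assumptions, not ICMA's actual schema.
SAMPLE = """jurisdiction,year,metric,value
Springfield,2022,refuse_tons_per_account,1.9
Shelbyville,2022,refuse_tons_per_account,2.3
Ogdenville,2022,refuse_tons_per_account,2.1
Springfield,2021,refuse_tons_per_account,2.0
"""

def peer_comparison(rows, metric, year, home):
    """Return (home jurisdiction's value, peer median) for one metric/year."""
    values = [float(r["value"]) for r in rows
              if r["metric"] == metric and int(r["year"]) == year]
    home_value = next(float(r["value"]) for r in rows
                      if r["jurisdiction"] == home
                      and r["metric"] == metric and int(r["year"]) == year)
    return home_value, statistics.median(values)

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
home_value, peer_median = peer_comparison(
    rows, "refuse_tons_per_account", 2022, "Springfield")
```

Because the dataset is plain CSV, the same kind of filtering works equally well in Excel, a business-intelligence tool, or any software provider's platform.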

Another challenge for benchmarking is data cleaning. Open Access Benchmarking addresses this with warning flags coded into the response form through which jurisdictions report their data. Jurisdictions are cautioned to review their own data in light of these warnings and to communicate with each other about any data points of concern. And if you wish to follow up with a jurisdiction represented in the database, contact information is provided so you can easily reach the right person.
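The warning flags described above amount to validation rules applied at entry time. A minimal sketch of that idea, with entirely hypothetical field names and thresholds (the actual rules live in ICMA's response form and are not reproduced here):

```python
# Hypothetical warning-flag rules: each field maps to a plausible range.
# Field names and thresholds are illustrative assumptions only.
WARN_RULES = {
    "sworn_officers_per_1000": (0.5, 5.0),
    "refuse_tons_per_account": (0.2, 4.0),
}

def flag_warnings(submission):
    """Return (field, message) warnings for values outside expected ranges.

    Flagged values are not rejected; the jurisdiction is prompted to
    review them, mirroring the advisory nature of the warning flags.
    """
    warnings = []
    for field, (lo, hi) in WARN_RULES.items():
        value = submission.get(field)
        if value is not None and not lo <= value <= hi:
            warnings.append(
                (field, f"{field}={value} outside expected range {lo}-{hi}"))
    return warnings

flags = flag_warnings({"sworn_officers_per_1000": 12.0,
                       "refuse_tons_per_account": 1.8})
```

Note the design choice the text implies: out-of-range values trigger a caution rather than a hard rejection, leaving the jurisdiction in control of its own data.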

Participants in earlier benchmarking initiatives sometimes said that they felt overwhelmed by the sheer volume of data collected. So the Advisory Committee settled on a modest list of just 80 metrics. While this is not sufficient to benchmark every aspect of every department, it is intended to provide at least a key set of data and related ratios for comparing with peer organizations, without overburdening the assigned staff.

Since counties often provide services that cities don’t, the working group also developed a separate list of 54 county-specific metrics.

To hear more about performance management and benchmarking, add the “Performance Management” topic to your interests. And if you have questions, comments, or data to add to this database, please contact