By Gerald Young

Inspiration can be unexpected and come like a lightning bolt. Most often, however, it comes after reflection and critical analysis because the important work of organizing, tracking, and interpreting relevant data has been conducted first. Local governments sit on a wealth of information about their operations, but they don’t always take the time to ensure that it is organized and ready to use.

For 80 years, the International City/County Management Association (ICMA) has encouraged and assisted local governments in collecting, organizing, and using that information. ICMA’s earliest forays into the field resulted in a series of articles in the 1930s. Then, in 1943, came a book by its executive director, Clarence Ridley, and ICMA staff member and future Nobel Prize-winning economist Herbert Simon, entitled Measuring Municipal Activities.

Fiscal data remained a key resource for managers in the years that followed, but the focus was often solely inward-looking: “How did we perform this year?” Where some found value in collecting benchmark information from other jurisdictions, collection was idiosyncratic at best, with each jurisdiction deciding for itself which measures were important and which other jurisdictions to select for comparison.

For the identified comparable communities, either the data were lifted from their budget documents with or without their knowledge, or the measures were communicated in a brief survey, with the recipients doing their best to interpret how those measures should be defined.

 

Standardizing the Process

ICMA recognized both the value of performance data and the hazards of collecting it without some standardization. In the mid-1990s, an idea for a standardized, systematic—and comparative—performance measurement program began to gel and take root among participants in the Large Cities Executive Forum, a group of managers in cities with populations of 250,000 or more.

At the time, the Governmental Accounting Standards Board (GASB) was proposing to establish measures on which local governments would be required to report, and these managers sought to take the initiative rather than waiting for a mandate by some external body.

ICMA provided an institutional home for a pilot performance measurement consortium of 44 jurisdictions. Participants decided on the service areas they would focus on, agreed on the appropriate indicators, and hammered out precise definitions. ICMA joined with the Urban Institute to obtain a grant from the Alfred P. Sloan Foundation to help fund the startup.

In the years that followed, local government managers, budget staff, department heads, and other subject matter experts tackled the detailed questions underlying seemingly simple ones: “What did you spend?” became “What was the actual (not budgeted or estimated) expenditure, excluding overhead or encumbrances?”

Such key terms as overhead, full-time equivalent, hours paid, sworn staff, and completed repairs were spelled out as well. In some cases, each item in the data collection instrument might be accompanied by half a page or more of instructions.

With ICMA serving as the clearinghouse for an annual data collection effort, the communities participating could rely on a consistent source of data from an entity independent of their local media, unions, or other interests, and one that did not rely on an ad hoc inquiry to another jurisdiction.

While apples-to-apples inputs were a key consideration, so, too, was data verification. Each jurisdiction’s submittals were subject to statistical formulas and staff review to identify outliers that needed to be verified or inconsistencies that needed to be corrected.
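The screening step described above can be sketched in a few lines. This is an illustrative outlier check using Tukey’s interquartile-range fences, a common choice for this kind of review; the rule, the threshold, and the sample figures are assumptions for demonstration, not ICMA’s actual verification formulas.

```python
# Illustrative sketch of statistical outlier screening on submitted data.
# The Tukey-fence rule and the sample figures are hypothetical, not
# ICMA's actual verification formulas.
from statistics import quantiles

def flag_outliers(values, k=1.5):
    """Return values outside [Q1 - k*IQR, Q3 + k*IQR] for staff review."""
    q1, _, q3 = quantiles(values, n=4)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < low or v > high]

# Example: a cost-per-unit measure reported by six jurisdictions;
# one submission looks like a data-entry error.
reported = [3800, 3900, 4100, 4200, 4450, 41000]
print(flag_outliers(reported))  # → [41000]
```

A flagged value would then go back to the submitting jurisdiction for verification or correction, rather than being silently dropped.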

The cleaned databases were then provided to the participating jurisdictions and published in an annual report, which provided summary statistics and also jurisdiction-by-jurisdiction graphs showing the performance on key measures. By then dubbed the ICMA Center for Performance Measurement (CPM), the consortium grew in both participation and scope.

While remaining focused on such core local government services as public safety, road maintenance, parks and recreation, libraries, and support services, the program expanded to include risk management, planning and permitting, and sustainability. It also grew to encompass cities, counties, and other local governments, both large and small.

 

From Measurement to Management

The next challenge facing ICMA was how the data would be used. Would each jurisdiction simply incorporate selected graphs into its budgets or presentations to elected officials?

Would that be the end of the discussion or just the beginning? Or would the data be a source for additional research into best practices that lead to improved performance?

To help local governments move from measurement to management, ICMA published What Works: Management Applications of Performance Measurement in Local Government, starting in 2001. What Works presented case studies demonstrating how performance information had resulted in cost savings, management and policy changes, and enhanced dialogue among staff, elected officials, and the public concerning strategic goals, actual performance, and action plans for improvement. Updated editions followed in 2002, 2003, 2008, and 2010.

From FY2005 to FY2006, for example, the city of Dallas, Texas, reduced the average time from initiation of a code complaint to voluntary compliance for all violation types from 28 days to 11 days by:

  • Implementing a new customer relationship management system that improved the ability to monitor cases at every point in the inspection and resolution process.
  • Establishing time standards for resolving cases.
  • Introducing a civil adjudication process that reduced the time required to resolve some cases.

To help participants interpret the data, ICMA also looked beyond individual measures at how multiple components interrelate through correlations among sets of metrics. Response time from dispatch to arrival in public safety emergencies, for example, can be affected by the square mileage served per station, topography, population density, dispatch technology, call volume per apparatus per day, turnout time, and differences in practices for recording arrival time.

 

Networking and Consortia

Where data analysis has led to procedural or organizational change, ICMA has fostered networking among the jurisdictions through participant-led webinars. Examples include exploring efforts in Albany, Oregon, to track and reduce sick leave usage and encouraging departmental business planning in Austin, Texas.

Subgroups also emerged within the larger program, especially among those with a particular policy or geographic connection. This led to the formation of state and regional consortia to encourage ongoing dialogue on relevant performance issues.

 

Tackling Continuing Challenges

Still, some key challenges remained: technology and time.

Technology: Technology has continued to evolve throughout the program, as ICMA has shared surveys and data through CD-ROM, association-hosted websites, off-the-shelf databases, print and PDF documents, and now cloud-based applications.

Performance dashboards for each jurisdiction were introduced in 2008, but these initial reports had limited options for altering the pre-set display. User customization also took a leap that year with an online reporting tool that enabled jurisdictions to identify specific jurisdictions, states, or population cohorts with which to compare their performance and view multiyear trends.

This early online application and subsequent upgrades enabled jurisdictions to identify key measures, fiscal years, and relevant filters to create and save their own graphs. But since the data collection and online reporting tools were not integrated, a lag remained between the release of the cleaned database and the upload to the reporting tool.

Time: Lag time was an issue in other ways as well. Each jurisdiction reported data only once a year, typically about two months after its fiscal year close. Once the data verification process took place and the database was shared, the jurisdictions would already be several months into their next fiscal year.

And since some participants had different fiscal year-ends, those with a June 30 fiscal year close would have to wait until the next spring to see data from the jurisdictions with a December 31 close. By that time, they would already be well into the approval process of their following year’s budget.

Relatedly, time was an issue in completing the annual surveys. With more than 5,000 individual measures, jurisdictions faced a considerable investment of staff time in responding to the surveys each year, let alone analyzing the results.

 

Reimagining the Program

Focusing on those challenges, ICMA embarked on a reimagining of the program in 2013. Over the course of 18 months, staff conducted market research on a wholly revised suite of services, issued an RFP for an integrated software platform, and convened focus groups to help rethink the “core” outcomes and other measures collected.

The result, rolled out at ICMA’s 100th annual conference in Charlotte, is ICMA Insights™, a partnership between ICMA and SAS®, the industry leader in business analytics software.

Based on feedback from participants, the new software provides the flexibility that was previously lacking, quick turnaround time, and a more streamlined set of measures, with the ability to add custom measures. It also offers extensive graphing, reporting, and trend analysis tools, along with built-in forecasting capabilities, all hosted on current technology.

A significant contribution of the focus groups was a careful consideration of the measures to be incorporated into Insights. The program had grown to include more than 5,000 measures—an unsustainable number. Each focus group included representatives from varying levels of local government—city and county managers, budget directors, other department heads, and line staff—as well as members of the academic community.

Feedback was also provided by other professional associations, including the American Library Association, American Association of Code Enforcement, and International Public Management Association for Human Resources. Measures were aligned as appropriate with existing industry standards, including both Uniform Crime Reports (UCR) and the National Incident-Based Reporting System (NIBRS) for crime reporting, and the National Fire Incident Reporting System (NFIRS) for fire incident data.

Rather than trying to measure everything a local government undertakes, the goal was to track only those key items that jurisdictions were most likely to compare with others.

In facilities management, for example, where prior surveys tracked breakout categories relating to a wide range of facility types, the new ICMA Insights system is focused on custodial, repair, and utility measures in administrative and office facilities.

Where policy or procedural considerations may impact the cost or structure of service delivery, an array of descriptive questions also remains, enabling participants to filter the data to the jurisdictions most like their own.

Particular attention was given to the larger outcomes to be achieved by a service (such as minimizing the number of fleet vehicles requiring repeat maintenance within 30 days), rather than just tracking input and output measures (such as the number of staff or number of work orders processed).

Those measures deemed “nice to know” but not of particular analytical value were dropped to minimize the data collection burden on local government staff.

But while the resulting list of 950 measures available in ICMA’s comparative database may fit the needs of most jurisdictions, there are always potential exceptions and localized priorities. What if a jurisdiction also operates an airport, convention center, cemetery, or other specialized program or facility?

To better meet the needs of those jurisdictions without cluttering the system for the more general-purpose local governments, Insights offers the option of adding custom measures, which can be used either in inter-jurisdictional comparisons (e.g., percentage of performing arts center days booked) or in an individual local government’s goal tracking (e.g., “percentage of Fourth Avenue business incubator space occupied”).

Regional consortia or nationwide groups with similar concerns can also establish custom measures that address their key priorities. The Valley Benchmark Cities consortium, for example, described elsewhere in this issue of PM, is using Insights to help facilitate its comparisons among 11 Phoenix-area governments.

Moving forward, annual scrutiny of the standard set of performance measures will remain a routine part of the program, particularly as fields like information technology and its related metrics continue to evolve.

All in all, ICMA Insights represents the newest offering in ICMA’s performance management toolkit, which continues to evolve to take advantage of new technologies, providing real-time information and analysis and carefully vetted metrics to help jurisdictions of all sizes better understand their own operations and learn from the successful practices of their peers.

Through Insights and the new ICMA Center for Performance Analytics, ICMA is positioned to provide both the software and the strategic advice to facilitate data-driven decision making.

As ICMA Executive Director Bob O’Neill has observed, “Through services such as ICMA Insights, which combines industry-leading analytics with one of the largest repositories of U.S. local government performance metrics, we can apply comparative performance tools across vast numbers of local governments and apply predictive analytics to some of the more complex service delivery issues of our time.”

ICMA’s historical commitment to performance management will carry forward as a priority. Performance management not only promises better-managed individual communities, but also represents a fundamental professional practice that will endure even as local governments evolve to address whatever challenges they may face in the future.

 

 
