When Shelters Shift the Risk, Part 4: When the Ledger Lies
What Orange County’s broken shelter math reveals about California’s vanishing accountability
Previously in this series, we’ve followed how shelters from Apple Valley to Sacramento quietly shift danger, cost, and blame onto rescues and communities; this chapter shows what that shift looks like when the numbers themselves stop adding up.
The Orange County animal shelter has a math problem.
According to its own quarterly reports, Orange County Animal Care took in 2,783 stray dogs last year. Its annual report says the number was 2,648. Somewhere between one official county document and another, 135 stray dogs disappear.
The foster program tells a similar story. Add the quarterly totals and you find 355 dogs placed in foster care. The annual report lists only 237: 118 animals who exist in one public record but not in another.
These are not rounding errors. They are living animals whose lives disappear when the paperwork can’t be trusted.

Those discrepancies were not uncovered by an auditor or a grand jury. They were discovered by a local resident, Michael Mavrovouniotis, a retired quantitative investment manager and volunteer data scientist for Social Compassion in Legislation (SCIL), who published his findings in Voice of OC on March 17, 2026.
Mavrovouniotis did not come to this data blind. In 2023, OC Animal Care deleted its published statistics entirely, a pattern documented at the time by Voice of OC. Anticipating that history might repeat, he copied and preserved the 2024 quarterly and annual reports at a public archive before publishing his findings. The shelter’s data integrity problem, in other words, is not new. It is recurring.
In most areas of public life, from public health to law enforcement to finance, numbers that don’t reconcile typically trigger investigations. In animal sheltering, they often trigger nothing at all.
If this series has shown anything, from Apple Valley to Sacramento, it is that animals and dollars rarely disappear all at once. They disappear on paper first. When the ledger lies, danger, cost, and blame migrate toward those with the least power: the animals, the volunteers trying to save them, and the communities paying the bill.
These are not obscure metrics. They are the basic building blocks of any shelter’s story: how many animals came in, how many left, and what happened in between. In any other public system, numbers this fundamental that do not reconcile would be treated as a breakdown in accountability, not a clerical glitch.
Yet OC Animal Care is not being graded by an outside watchdog on its data. It is grading itself. The same executive leadership that touts “progress” and posts celebratory graphics is also publishing reports that do not agree with each other on how many animals even entered the building.
To make matters worse, the same shaky ledger is being used to bill cities for services. As Mavrovouniotis points out, OC Animal Care’s cost‑allocation table appears to include non‑contract cities such as Irvine and Laguna Woods, assigning them hundreds of animals despite the fact they do not receive service from the county shelter.
If confirmed, that would mean the financial map of who pays for this system and who uses it is as unreliable as the intake and foster data. When the inputs into a billing formula are wrong, the invoices are wrong. It is that simple.
And this is not happening in a state with strong, enforced standards for shelter reporting. It is happening in California.
How a statewide data vacuum makes this possible
If Orange County’s numbers seem shockingly loose, consider the landscape they inhabit.
California once maintained a statewide ledger of shelter activity. It was imperfect, but it existed. Today, that system has largely withered. No agency reconciles reports. No authority rejects numbers that do not add up.
Into that vacuum step voluntary dashboards and self-reported metrics: useful tools, but incapable of enforcing truth. When reporting becomes optional, accountability becomes optional too, and the story of sheltering is outsourced to storytellers.
In Parts 2 and 3 of this series, I described how the national dashboards can create a comforting but incomplete picture of “progress” by sampling the shelters that are already motivated to look good, while the most troubled facilities quietly opt out of view.
That is the landscape in which OC Animal Care operates. There is no state system that rejects a county’s report because the quarterlies do not reconcile with the annual. There is no automatic penalty when a public shelter’s own documents disagree on how many animals went through its doors. There is, instead, a vacuum, and vacuums in governance rarely stay empty. They fill with self‑curated narratives.
In that environment, a county shelter can publish irreconcilable reports and still issue press releases about “record” live‑release rates. The architecture is built to allow it.
How bad data shifts risk and blame
Bad data is not incidental. It is how risk shifts from the powerful to the powerless.
The arithmetic glitches at OC Animal Care are not bookkeeping problems. They are the mechanism that allows risk, cost, and blame to be shifted away from the institutions that control the system and onto those with the least influence inside it.
We’ve seen this dynamic in other corners of California.
In Part 1, I wrote about the high‑desert shelter in Apple Valley, where the town used sterilization paperwork and a thicket of rules to pressure rescues, while quietly killing the overwhelming majority of cats that came through the door. On paper, Apple Valley could hold up sterilization certificates and contract clauses as proof of “responsible” policy. It took a close look at the outcomes data to see that, in practice, the system was eliminating cats, not solving the problem.
In Parts 2 and 3, we followed rescues and fosters who took on the animals that municipal shelters did not want to count as “their” responsibility any longer: dogs with serious behavioral risk, animals with costly medical needs, litters from dogs never offered spay/neuter in the first place. Rescues absorbed the bills and the danger. The shelters kept the optics.
OC Animal Care’s numbers show us what that kind of risk‑shifting looks like in spreadsheets:
When animals disappear between quarterly and annual reports, it becomes impossible to track euthanasia trends accurately. A county can claim that killing is down, or that live‑release is up, without being forced to reconcile every intake and every outcome.
When foster numbers do not match across reports, we cannot know how many animals were actually supported in the community, for how long, or with what success. We can only know how many the shelter chooses to count.
When billing tables include non‑contract cities, or misallocate costs, the communities that drive intake and the communities that pay for the system may not be the same. The political pressure to fix what is broken can be quietly shifted from one jurisdiction to another.
At the same time, the people and organizations with the least leverage (rescues, fosters, individual citizens) are often held to a higher standard than the shelter itself.
In Apple Valley, rescues were threatened with the loss of their ability to pull animals over paperwork issues, even as the town’s own data remained opaque. In the AB 631 debates, many rescues feared becoming the easiest targets for new enforcement while municipalities continued to operate behind partial or missing data. Across the state, shelters routinely demand meticulous sterilization documentation and compliance from rescues, while their own intake and outcome reports cannot pass a basic addition test.
The double standard extends to risk data. In Los Angeles, as I have documented, readily available dog‑attack and bite data have been sidelined when they conflicted with “no‑kill” branding and live‑release narratives. The public bears the risk of dangerous animals being placed or released, but cannot see a clear, honest picture of how policy decisions drive that risk.
The pattern is consistent: where data are thin, fragmented, or self‑curated, it is easier to punish the people trying to plug the gaps than to hold the system itself accountable.
OC Animal Care’s irreconcilable ledgers are not an isolated bookkeeping error. They are one more example of a structure in which the story told to the public is allowed to drift away from the reality experienced by animals, rescues, and communities.
What honest shelter data would look like
Trust begins with four questions:
How many animals entered the system?
How many left alive, and by what paths?
How many died, and why?
Who paid for what, and on what basis?
That means building data systems that do what OC Animal Care’s current reporting does not.
At a minimum, an honest public shelter system would include:
Daily census that reconciles to intakes and outcomes. There should be a credible, public count of how many animals are in the building and in foster care each day, and those counts should reconcile to the intake and outcome figures over time.
Quarterly reports that must match the annual. The sum of four quarterlies should equal the annual total for each major category. If an error is discovered, it should be corrected through a public erratum, not quietly buried in a new, conflicting report.
Mandatory statewide reporting with enforcement teeth. Participation should not be voluntary. If a shelter fails to report, or reports numbers that do not reconcile, there should be consequences, up to and including loss of certain state funds.
Transparent, audited cost‑allocation formulas. Cities should be billed based on clearly defined, verifiable measures of use, and non‑contract jurisdictions should not appear in allocation tables. If a shelter claims a city is responsible for a given share of costs, it should be able to show the data that supports that claim.
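The second of these checks, reconciling quarterlies against the annual, is simple enough to sketch in a few lines of code. The function below is illustrative only: the quarterly breakdowns are invented, though their sums mirror the discrepancies reported for OC Animal Care (2,783 vs. 2,648 stray dog intakes; 355 vs. 237 foster placements).

```python
# Illustrative reconciliation check: do four quarterly totals sum to the
# annual figure? The quarterly breakdowns below are invented for
# demonstration; only their sums mirror the article's reported totals.

def reconcile(quarterlies, annual, category):
    """Return the gap between the summed quarterlies and the annual total."""
    quarterly_sum = sum(quarterlies)
    gap = quarterly_sum - annual
    if gap != 0:
        print(f"{category}: quarterlies sum to {quarterly_sum}, "
              f"annual reports {annual} -- {abs(gap)} animals unaccounted for")
    else:
        print(f"{category}: reconciled ({annual})")
    return gap

# Hypothetical quarterly splits whose sums match the published totals:
reconcile([700, 690, 710, 683], 2648, "stray dog intake")  # sums to 2,783
reconcile([90, 88, 92, 85], 237, "foster placements")      # sums to 355
```

A check this trivial is the point: any agency, auditor, or resident with the two sets of reports can run it, which is essentially what Mavrovouniotis did by hand.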
These are not radical demands. They are the same basic standards of accountability we expect from school districts, law‑enforcement agencies, and public‑health departments. We should not accept less rigor when it comes to the systems that control life and death for hundreds of thousands of animals each year.
Accountability, local and statewide
So what should happen next?
In Orange County, the first step is simple. The Board of Supervisors and the contracting cities should commission an independent reconciliation of OC Animal Care’s last several years of intake, outcome, and billing data. Every quarterly and annual report should be forced to add up. Every category should be checked. Every discrepancy should be documented and corrected in public.
Contracts with OC Animal Care should be updated to require reconciled, public reports as a condition of doing business. If the numbers don’t add up, the money stops flowing until they do. City councils, especially in high‑use jurisdictions, should hold hearings on whether their bills rest on accurate data and whether their animals are being accurately counted.
At the state level, California must decide whether it wants to lead or lag on shelter transparency.
Rebuilding a comprehensive, mandatory shelter‑reporting system will not, by itself, solve overcrowding, underfunding, or the fallout from years of “no‑kill” politics. But it is impossible to address those issues honestly without a shared, audited ledger of what is happening in our shelters.
Any future state funding, whether for spay/neuter initiatives, enforcement of access laws like AB 2010, or new shelter‑improvement grants, should be tied to verified, reconcilable data. And any public agency claiming “no‑kill” or “90% live‑release” status should be required to base that claim on a full, reconciled dataset that includes every intake, every outcome, and a credible daily census, not a selectively reported or privately uploaded subset.
In earlier parts of this series, I argued that California’s shelter crisis is no longer primarily a problem of information. We know what is happening to animals on the ground. Rescues know. Volunteers know. Front‑line staff know. The problem is a lack of political will to align law, funding, and oversight with that reality.
Orange County has now given us something else: a glimpse of what happens when the numbers themselves cannot be trusted. When animals can vanish between one report and the next, and nothing in law demands an answer, the ledger becomes one more tool for shifting the risk.
You cannot fix what you cannot see. When the numbers themselves vanish, accountability vanishes too.
The Series
Ed Boks is the former executive director of animal care and control agencies in New York City, Los Angeles, and Maricopa County, and a past board member of the National Animal Control Association. His work has appeared in the Los Angeles Times, New York Times, Newsweek, Real Clear Policy, Sentient Media, and now on Animal Politics, a lively community spanning 48 states and 61 countries.
If you enjoyed this article, consider subscribing to Animal Politics. Your support fuels independent research, reporting, and discussion shaping the future of animal welfare.



Warren Cox, director at 22 shelters over 60 years, 1952-2012, always emphasized to me that the first thing a director should do when he/she arrives for work in the morning is a nose count of dogs & cats, & then do another as the last task before leaving in the evening. I was with him several times at several shelters when he did those nose counts. Because he reconciled the numbers twice every single day, his monthly and yearly data were always precise. If the numbers from either count didn't match inventory plus intakes minus deaths, adoptions, and foster transfers, meaning an animal was missing, he stayed there until the missing animal or animals were accounted for. Thus, on one occasion when I was there, a kitten who had somehow escaped into a drain was found alive & well after 20 minutes of searching by all hands, once Warren sounded the alarm that a kitten was missing. On other occasions dogs were found wandering the halls, having apparently been dumped on the premises instead of being properly admitted. They were microchipped & logged into the system before Warren left. One way or another, Warren made sure the numbers matched, & made sure a whole generation of shelter directors whom he trained did likewise. Those directors are now mostly retired themselves. We can only wish they had passed along a similar work ethic.
There is also a discrepancy at San Jose Animal Care Center between the numbers in its yearly fiscal report and those on its ACS dashboard. Meanwhile, the yearly fiscal report from the city auditor uses the numbers from the ACS dashboard. All of these are, in theory, presented to and approved by the city council.
For example, for FY23-24, the yearly report states that total intake was 13,212 animals while the ACS dashboard says it was 12,242. Here’s a table for the past four fiscal reports (hopefully this is readable):
Fiscal Year | ACS yearly reports | ACS Dashboard | Annual Report on City Services (approx)
FY21-22 | 15,530 | 14,531 | 14,500
FY22-23 | 11,031 | 10,233 | 10,400
FY23-24 | 13,212 | 12,242 | 12,500
FY24-25 | 11,189 | 11,206 | 11,200
Discrepancies were also found in earlier fiscal years. This issue makes it even more difficult to compare different administrations (bug or feature?).