Bad: Apples, Barrels, Crops, and Farmers
A rotten apple can spoil the barrel but a bad farmer can spoil the farm.
In fraud investigations, the causes are often described by the acronym ABC: bad Apple, bad Barrel, bad Crop. A bad apple is a single bad actor (like Bernie Madoff), a bad barrel is a corrupt group within an otherwise non-fraudulent organization (like the Rampart police scandal), and a bad crop is an entirely fraudulent enterprise (such as Theranos). A good friend and colleague, Dean Gialamas, suggested there should be an “F” added to the acronym, for bad Farmer. That is, apples, barrels, and crops don’t go bad on their own: Someone or something has to let them go bad.
Another helpful descriptor of fraud is the fraud triangle, whose three sides are pressure, opportunity, and rationalization. Pressure can be personal financial pressure, such as living beyond one’s income, or workplace financial pressure, such as a shortfall in revenue. The employee sees the situation as unsolvable by normal, approved means and feels unable to speak with others who might help. The employee then exploits an opportunity to defraud their organization, identifying a way to abuse their position to relieve the pressure without being detected. Finally, the employee rationalizes the fraud, justifying the crime so it comports with their moral compass. Rationalizations are often based on external ethical factors, such as a need to take care of loved ones or a perceived dishonesty by a supervisor or employer.
The fraud triangle was designed to describe financial fraud, but it works for other kinds of fraudulent behavior, like the dry labbing Annie Dookhan is infamous for or the drug thefts committed by Sonja Farak. But it turns out they weren’t the only ones being fraudulent about their forensic work: Four additional workers were referred for prosecution in the Massachusetts laboratory scandal. However,
…counting the number of “bad apples” shouldn’t distract us from the reality that the Massachusetts lab scandals were not the work of isolated violators; they were full system crashes. A look at them will reveal system weaknesses that exorcizing individuals will not cure. …Standing to the left of Annie Dookhan (and Sonja Farak, too) were the people who hired them, trained them, supervised them, and devised the laboratory evidence-handling protocols they blithely skated around. Standing to their right was a legion of lab directors and legal system practitioners—prosecutors, defenders, and judges—who failed (throughout the disposition of 30,000 cases) to notice that anything was amiss.
Bad farmers, indeed. We like to blame people; it’s easy. But people don’t fail, systems do. Systems can be hard to analyze or even to perceive. But, as Charles Perrow said,
[V]irtually every system we will examine places “operator error” high on its list of causal factors—generally about 60 to 80 percent of accidents are attributed to this factor. But if, as we shall see time and again, the operator is confronted by unexpected and usually mysterious interactions among failures, saying that he should have zigged instead of zagged is possible only after the fact. Before the accident no one could know what was going on and what should have been done.
Who would suspect that a laboratory technician would dry lab? Uhm, everyone. No one is exempt from ethical professionalism, but the system must guide proper behavior. Get a system that is tightly coupled and highly complex, like a forensic laboratory, and you are bound to have an “accident,” a normal accident in Perrow’s terminology. They are normal in the sense of being a natural outcome of complex, tightly coupled systems: not unexpected, but not predictable, either. In the image below from Perrow’s book, quadrant 2 is the most likely to have a catastrophic normal accident. Think about where forensic laboratories might fit on this chart.
Safety and quality features themselves become part of the system, and that adds complexity. As complexity grows, we’re more likely to encounter failure from unexpected sources. It’s counterintuitive: safety features can reduce safety (cry wolf 90 times and no one listens). Much like prohibition laws, give people restrictive systems and they’ll find ways around them. Personal responsibility is necessary, yes, but so are systems that help prevent fraud, like making sure the person who keeps the blank checks isn’t the one who signs them, or having a tip system for reporting fraud (the most cost-effective way of identifying it). The problem is, someone on the other end has to listen, and that person can’t be part of the fraud. Like your boss.
A forensic laboratory is a system within a system of systems, creating what we call the criminal justice system (it’s not just one system, but that’s how we misguidedly think of it). The problems multiply as outputs from one system become inputs for another connected system:
Cops under pressure to hit a Compstat number stop a guy on Dudley Street. He has a bag in his pocket. He thinks it contains cocaine. In fact, it contains baking soda. The cops seize the bag. A “field test” of the contents is ambiguous, but he has an outstanding warrant for ignoring child support orders, so the cops arrest him and charge him with possessing drugs with intent to distribute.
An assistant district attorney charges a felony trafficking offense. The bag goes to the lab. The lab is overwhelmed; the volume of cases forces triage, shortcuts, “covert work rules.” Today’s “covert work rule” sets the stage for practical drift to another, even more lax, practice tomorrow. Annie Dookhan fakes a test result and certifies it. It “makes sense” to her; it’s despicable, but rational. No one questions the result—maybe no one is able to question the result.
The case comes back to court. The prosecutor has a file, nothing else. The defender has 50 files, and no easy access to a chemist of his or her own. The judge has 40 cases on his docket and needs to get to zero by 4 p.m. A deal is offered: Drop the mandatory minimum, offer six months in the House of Correction.
When normal accidents happen in a system, it’s almost never a bad apple, or even a bad barrel or crop: It’s the farm and the farmer. Dookhan and Farak (and on and on) couldn’t have gotten away with what they did if the system and those overseeing it hadn’t allowed them to. Another way to think of normal accidents is as “sentinel events.” A sentinel event is a significant negative outcome that:
Signals underlying weaknesses in the system or process.
Is likely the result of compound errors.
May provide, if properly analyzed and addressed, important keys to strengthening the system and preventing future adverse events or outcomes.
Sentinel event reviews are a good place to start, but they take time, effort, and honesty. They need to be more prevalent and better supported. We really need to start learning from errors in our criminal justice systems.