Some states require auditors to evaluate labs on a consistent basis. States requiring ISO/IEC 17025 accreditation hold labs accountable to specific criteria, but is it enough? As auditors, we’ve often followed a lab’s ISO audit with our own to ensure labs are conforming to the checklist mandated by the Washington State Liquor and Cannabis Board. We’ve consistently found more deficiencies in those labs than the ISO audit alone identified.
There is significant oversight of the labs that make up the Washington state cannabis testing community. To date, across 69 onsite and paper audits, 358 total deficiencies have been identified and 347 suggestions provided to labs to enhance their analytical capabilities. These audits have resulted in five suspensions that were not lifted until the lab fixed the deficiencies and submitted the requested data proving the correction. One lab could not, or perhaps would not, fix its deficiencies, and its certification to practice analytical testing subsequently expired.
Some in the industry have publicized the problems with labs as though they were universal. In a state like Washington, where traceability data can be obtained by anyone through a simple public records request, anyone can run the numbers and publicize the differences between labs. Other states, like California and Colorado, do not provide traceability data to the public upon request, citing codes governing the confidentiality of licensee data. This may be one of the main reasons Washington has received so much attention from people looking to publish research, whether as part of a formal study or as informal analyses by private citizens. Washington’s numbers are an easy target because they are the only target. Comparing state regulations, alongside the numbers provided above, illustrates the transparency of the Washington cannabis program.
There have been allegations that some labs are “friendly” labs, characterized as providing the data their clients requested rather than the actual data, or falsely inflating potency results so their clients can command the highest price for their commodities. The latter has indeed been discovered in our own audits. But despite labs pointing fingers to see competitors reprimanded, and private analyses of public records revealing statistical differences between subsets of labs, an auditor must find compelling evidence in the documentation requested from a lab before citing it for deficiencies. One such lab, accused by its competitors of falsely inflating potency data, was found to have mistakenly multiplied in its calculations where it should have divided. Interestingly, analysis of corrective action reports revealed that even some of the lab’s clients did not believe the values the lab reported.

Auditors can only observe lab practices, good or bad, while they are actually onsite. These findings came from an unannounced audit. Unannounced audits are useful when performing specific investigations in response to complaints, and the ever-present possibility of one helps keep nefarious practices at bay; they are increasingly utilized in the program. Since auditors cannot police labs 24/7, it is all the more important for clients to submit their concerns to their governing entity. The bottom line is that when auditors are onsite, labs will practice good laboratory procedure, just as they do with Proficiency Testing (PT) samples, as shown by their passing the PT. Honesty and ethics, however, must continue beyond the audit, and that, quite frankly, has eluded some labs.
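The multiply-instead-of-divide error described above is easy to see in miniature. The unit conversion and numbers below are illustrative assumptions, not the lab’s documented calculation, but they show how a single transposed operation can inflate a reported result by orders of magnitude:

```python
# Hypothetical unit conversion: mg of analyte per g of sample -> percent.
# The formula and figures are assumptions for illustration only.

def percent_cannabinoid(mg_per_g: float) -> float:
    """Correct conversion: 1000 mg/g is 100%, so divide by 10."""
    return mg_per_g / 10

def percent_cannabinoid_inflated(mg_per_g: float) -> float:
    """The same conversion with the operation transposed: multiply by 10."""
    return mg_per_g * 10

# A flower measuring 200 mg/g of THC is really 20% THC; the transposed
# calculation reports 2000%, a 100-fold inflation.
correct = percent_cannabinoid(200.0)            # 20.0
inflated = percent_cannabinoid_inflated(200.0)  # 2000.0
```

Values that implausible are exactly the sort of thing a client, or an auditor reviewing corrective action reports, can catch.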
No industry has 100% oversight, and to suggest otherwise is a fallacy. As a frame of reference, the FDA regulates food, drugs, biologics, medical devices, cosmetics, veterinary products, and tobacco products. Washington had 2,882 food facility registrations as of February 2016 and 96 drug establishment registrations as of February 2017, yet the FDA’s Seattle District office issued only 16 total warning letters in 2017. Compared to regulatory agencies such as the FDA, Washington’s cannabis oversight program is extremely active.
Labs will undoubtedly point fingers at their rivals in attempts to usurp their clientele, but there are multiple legitimate reasons why different labs may have different potency averages. Sampling can significantly affect the measured potency, since cannabis is a natural product known to have at least 500 chemical constituents: if two different labs measured two different aliquots of the same flower sample, two different results would be expected. Operator and instrument variability and environmental factors introduce further variation. These factors are accounted for by understanding and reporting the uncertainty of measurement, which Washington state laboratories have been required to calculate since August 2017. In a new industry with a variable matrix and few standard methods available, industry-wide expected uncertainties have yet to be determined. For perspective, when counting CFUs in food microbiology, if the expected value is 10,000 CFUs, it is acceptable to report anywhere from 6,000 to 14,000 CFUs. That might seem like a wide window, but after a 100-fold dilution it corresponds to 60 to 140 colonies on a plate.
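The CFU figures above imply a symmetric ±40% window around the expected count (6,000 to 14,000 at an expected 10,000). A minimal sketch of that arithmetic, with the tolerance as an assumed parameter rather than a regulatory constant:

```python
def acceptance_range(expected_cfu: float, tolerance: float = 0.40):
    """Return the (low, high) acceptable reported counts for a +/- tolerance."""
    delta = expected_cfu * tolerance
    return (expected_cfu - delta, expected_cfu + delta)

def after_dilution(expected_cfu: float, fold: int) -> float:
    """Expected count after an n-fold dilution."""
    return expected_cfu / fold

print(acceptance_range(10_000))                       # (6000.0, 14000.0)
print(acceptance_range(after_dilution(10_000, 100)))  # (60.0, 140.0)
```

The same ±40% band that looks enormous at 10,000 CFUs is a perfectly ordinary spread of 60 to 140 colonies on a plate.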
Total dietary fiber in cereal has an acceptable uncertainty of ±15%. Nationally accredited environmental testing laboratories using standard methods for testing lead in water have an acceptance range of ±14% in proficiency testing rounds.
What’s ultimately important is that labs are held accountable when they buck the system, and doing so requires more than speculation: auditors must judge a lab on the information requested and provided. While a recent, rather exhaustive study showed some Washington labs to be statistically different when comparing higher and lower potency averages, the historical data used in the study was tainted by inaccurate potency values. When considering inflated potency across historical data sets, it is important to note that labs often have kinks to work out when they first enter the program, and those kinks contribute to the average potency found in the traceability system. In the case of the lab that multiplied instead of divided, how many values padding that lab’s average potency were entered into the tracking system? That number is knowable: the start and end dates bounding the inaccuracies would reveal the percentage of values entered into the traceability system during that period.
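The check described above amounts to a simple count. The dates and entries below are hypothetical placeholders; the point is only that, given the start and end dates of a known inaccuracy, the share of affected traceability entries is straightforward to compute:

```python
from datetime import date

def share_in_window(entry_dates, start, end):
    """Fraction of traceability entries recorded between start and end, inclusive."""
    hits = sum(1 for d in entry_dates if start <= d <= end)
    return hits / len(entry_dates)

# Hypothetical: one entry per month for a year, with the calculation error
# known to have spanned April through June.
entries = [date(2017, month, 1) for month in range(1, 13)]
print(share_in_window(entries, date(2017, 4, 1), date(2017, 6, 30)))  # 0.25
```

With public records in hand, anyone could run this calculation against a real entry list.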
It’s also important for readers and members of the cannabis community to know that auditors do not have the final say on recalls or lab closures. Auditors can strongly recommend consequences for noncompliance and submit the supporting evidence, but the governing bodies at the state level have the ultimate authority and responsibility to fix problems found by the compliance programs and to put measures in place to prevent future ones. Requiring accreditation to a standard does not remove the governing bodies’ responsibility to ensure all parties in the industry are putting out safe, quality products to the consumer.