Avoiding the Pitfalls of PGx
Dr. Bronwyn Ramey holds a Ph.D. in molecular microbiology and biochemistry from Indiana University and has been working in the field of pharmacogenomics (PGx) since 2006. Dr. Ramey is also certified by the American Board of Bioanalysis (ABB) as a High Complexity Lab Director (HCLD) and, through her consulting company, has been advising labs on all things pharmacogenetics since 2014.
We sat down with Dr. Ramey to learn more about the challenges facing PGx labs and how consultants, like Dr. Ramey, help these labs improve quality and time to market while avoiding the serious pitfalls that less experienced labs are likely to encounter.
Tell me about your history in PGx.
After completing my Ph.D., I spent the first eight years of my career working as the Director of Technology Implementation at PGXL where I supervised the design and implementation of new molecular assays (PGx in particular), developed protocols, and managed our validations. I also gained experience in clinical software development. In 2014, I decided to branch out on my own and started a consulting company with the goal of helping labs adopt or improve PGx testing. All in all, I have about 20 years of molecular experience and 13 years of experience in pharmacogenomics.
After all this time, I’ve learned a few things. There are some serious pitfalls out there. I’ve seen labs close up shop because of a few uninformed decisions and I really hope that having this profile on ClinicalLabAdvisor.com will get more labs asking questions before they pull the trigger on purchasing instruments, selecting clinical reporting solutions, or even hiring technologists. Some labs, for whatever reason, run straight into big decisions without the needed scientific or medical expertise and it never turns out well.
It seems like a lot of independent labs performing toxicology testing also get into PGx. What pitfalls have you seen these labs fall into? What should our readers avoid?
First of all, the pivot from high-complexity LC-MS testing to high-complexity PGx testing is not as simple as it sounds. There are several new and unique factors that need to be considered, particularly for toxicology labs trying to adopt PGx.
(1) DNA quality is a real issue in PGx. Since everything flows from the extraction process, it’s vital that labs select appropriate equipment and design processes that yield enough high-quality DNA for their downstream applications. Although vendors offer what they call “off-the-shelf” solutions, those products rarely work out of the box. Every lab’s solution requires customization and optimization.
(2) Labs underestimate the complexity of PGx interpretation. Gene targets in pharmacogenomics, such as CYP2D6, are exceptionally complex and may require long probes at several loci. This kind of assay can be difficult to design and, just as importantly, difficult to troubleshoot. Labs need solid SOPs, fail-safe protocols, and well-trained technologists to manage this degree of complexity.
(3) The regulations in PGx are different from those in toxicology. For labs without PGx experience, it’s critical to get access to outside counsel on compliance, reimbursement, accreditation, and preparation for their first inspections.
I’ll give you one example of how this plays out. One of my former clients, a major statewide hospital system, was bringing on PGx testing and—before I was hired—they chose a reporting solution that had an incomplete gene profile for the drugs they were reporting. The company that sold the reporting solution told the hospital that it had a unique study that proved the robustness of their clinician reports. The sad thing was that even a cursory look at this “study” by anyone who understands PGx would have saved this hospital from locking themselves into a contract for this second-rate product. In the end, the solution they chose was scientifically subpar, and—just as bad—they were being gouged on pricing. This is where the old saying that, “an ounce of prevention is worth a pound of cure,” comes into play.
What technologies do your labs utilize for PGx testing and do you have any recommendations on this?
One thing I’d like to make clear, and it’s something that’s not true of every consultant out there, is that I don’t have any agreements with vendors. When I help a lab select the right equipment or technology, it’s completely based upon their needs. I’m proud to say that I don’t receive referral bonuses for steering labs one way or another.
But back to your question: I still prefer SNP-based testing over next-generation sequencing (NGS) for PGx. I feel like NGS is overkill for what we’re trying to achieve in PGx. When my clients are still deciding what equipment to use, I usually present them with three platforms.
(1) The Thermo Fisher OpenArray (QuantStudio 12K)
(2) The LGC Douglas Scientific IntelliQube
(3) And the Agena Bioscience MassARRAY
Each of these has its own advantages and disadvantages. For instance, the Thermo instrument has a higher cost per sample and is a closed system but, on the other hand, all of the panels are stocked and require minimal design. The LGC IntelliQube system has a much lower cost per sample, but it also requires more technical acumen to get it operational. Ultimately, you need to consider not only the merits of the platforms, but also questions like, “How much does it take to validate?”, “What kind of technical expertise do my technologists need?”, “How am I going to report to clinicians?”, and “What kind of flexibility do I want for the long run?”
What kind of advising are you doing on Clinical Lab Advisor?
I’m available for any questions on pharmacogenomics. I can help with method development, troubleshooting, selecting reporting solutions and instrumentation, assay design and validation, SOP development, and even interviewing candidates. Whether you need an expert opinion on a question or just want to book me to interview one of your technologists, I’m here to help!
Founder, Phoenix Laboratory Consulting, LLC