Breaking the code: The potential of the $100 genome

In 2001, sequencing an entire human genome cost $100 million. Since then, the cost has fallen swiftly: by about 2007 it had dropped to $1 million, the price of sequencing the genome of James Watson (a co-discoverer of DNA’s double helix), and by 2013 it had fallen to roughly $3,000 to $4,000.
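To put that decline in perspective, a quick back-of-the-envelope calculation using the figures above (the Moore's-law comparison is my own framing, not from the original milestones):

```python
import math

# Sequencing cost milestones cited above
cost_2001 = 100_000_000  # dollars per genome in 2001
cost_2013 = 3_500        # midpoint of the 2013 range ($3,000-$4,000)

years = 2013 - 2001
fold_drop = cost_2001 / cost_2013      # roughly 28,500x cheaper
halvings = math.log2(fold_drop)        # about 14.8 price halvings
years_per_halving = years / halvings   # well under one year per halving

print(f"{fold_drop:,.0f}x cheaper over {years} years")
print(f"Cost halves roughly every {years_per_halving:.1f} years "
      f"(Moore's law: ~2 years)")
```

In other words, sequencing costs have been halving considerably faster than transistor density, which is why a $100 genome looks plausible rather than fanciful.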

A race is on to reach a price target of $100 per complete genome. That $100 figure is still a few years off, but it is looking more probable every day. Chip-based sequencing, which eliminates the need for expensive reagents and uses relatively cheap equipment, has lowered costs significantly and should allow even cheaper sequencing as throughput speeds increase. Nanopore sequencing, which separates a DNA double helix into its single strands and passes an entire strand through a tiny protein pore that reacts to the individual molecules, is a radical departure from previous technology. It’s not yet ready for clinical use, but it, too, should lower costs and speed up throughput.

Healthcare technology is opening new doors: that drop in cost puts truly personalized medicine on the near horizon. The ability to sequence a genome quickly and cheaply is the gateway to understanding the underlying molecular pathways of disease.

Making sense of all that data

The next big hurdle is analyzing the genome and understanding what it all means. Currently, the exome — the protein-coding portion of the genome, and the part we understand best — represents only about 1 percent of the total. It’s one thing to search out mutations in specific areas of a genome to identify a specific disease risk. It’s far more difficult to understand the implications of all 3 billion base pairs in a genome. Add in the epigenetic information in the controller regions of DNA and you have a monumentally complex data set.
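To give a feel for how monumental that data set is, here is a minimal sketch of the raw data volume implied by one sequenced genome. The read depth and one-byte-per-base figures are my own illustrative assumptions, not numbers from the article:

```python
# Rough scale of one human genome's raw sequencing data.
# Assumption: ~30x coverage, a depth commonly used in clinical sequencing,
# meaning each position is read about 30 times on average.
base_pairs = 3_000_000_000   # ~3 billion base pairs per genome
coverage = 30                # average reads per position (assumed)

raw_bases = base_pairs * coverage      # 90 billion individual base calls
bytes_per_base = 1                     # one ASCII character per base call
gigabytes = raw_bases * bytes_per_base / 1e9

print(f"{raw_bases:,} base calls -> ~{gigabytes:.0f} GB of raw sequence")
```

And that is before quality scores, alignments, or epigenetic annotations, each of which multiplies the storage and compute required per patient.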

To achieve the ability to redirect disease-causing genetics will take a complex toolkit of analytic technology, computing infrastructure, medical research and a healthcare system that supports the end goal.

Analytic technology is moving forward quickly, allowing medical researchers to identify mutations and create therapies based on the results. The pediatric cancer clinical trial led by the Neuroblastoma and Medulloblastoma Translational Research Consortium (NMTRC) and supported by the Translational Genomics Research Institute (TGen) is a model for how to use genetic data, analytics and computing infrastructure to improve treatment. TGen researchers map the genome of tumor cells and use analytic tools on a high-performance computing platform to quickly determine the disease pathways. This helps the oncologists select the most effective form of therapy for each child.

Beyond cancer treatment

Oncology is not the only area where genotyping affects treatment decisions. For example, clinicians now use genetic markers to predict how patients will react to warfarin therapy, a common treatment to prevent clot-related strokes in patients with atrial fibrillation. Older approaches required near-daily blood tests at the beginning of therapy to determine an appropriate dose.

These are baby steps up the mountain of genetic data we need to conquer, but as our sophistication in genetic analysis grows and the speed of processing increases, so will our understanding of our DNA. As with most technology, the rate of progress will accelerate, and the time needed to solve each problem will shrink. It is not unrealistic to hope that my children will live much longer lives than I will, because they will know so much more about how to avoid disease.

Think of it: in the near future, we could identify before birth all the diseases built into a baby’s DNA. A bit farther in the future, we may be able to take steps in utero to prevent those diseases from developing. Any disease caused by an unfavorable genetic variation could be short-circuited, eliminating many of the chronic diseases that not only shorten lives but also make the final decades of life unpleasant and expensive. I like the idea that accidental injuries, not cardiac arrest or Alzheimer’s, could become the most common cause of death for 100-year-olds.

Three steps to unlocking the future

So how do we accomplish that? First, keep investing in healthcare technology. In a previous post on patient-centered care, I noted that we’ve made tremendous progress in crossing the quality chasm, and that technology has been critical in those efforts. We need to increase our sophistication in using analytic tools to improve outcomes and cut costs in patient care and operations. That will help free up funds to invest in new genetic-based treatments when they become available. And the better we are at analytics, the better prepared we will be to use genetic information.

Second, patients need to be confident that insurers and employers can never use genetic data to discriminate against them. Only then will they be willing to test for disease and give themselves the option of preventing it.

Third, we need a system in which everyone has an incentive to prevent disease. That will produce an environment that is willing to pay for genetic testing and for treatments to prevent disease. If the financial risk is perpetually kicked down the road to the next insurer, no one has an incentive to prevent long-term consequences.

With continued investment in technology and research, and continued transformation toward a patient-centered, information-driven healthcare system, our grandchildren will be able to live much longer, and more importantly, much healthier lives than we can even imagine.

I look forward to participating in the Bipartisan Policy Center’s forum on personalized medicine later this week as we culminate the celebration of National Health IT Week. I invite you to join the webcast or follow the #personalizedmedicine conversation on Twitter.

Copyright © 2013 IDG Communications, Inc.
