The last half-century has certainly been an exciting time for humanity when we reflect on the speed of technological change.
As a so-called “millennial” I certainly feel privileged to belong to the first generation to grow up with access to the internet. I am no computer whizz, but I have always been intrigued by the advance of technology as described by Moore’s Law.
Gordon Moore, an Intel co-founder, made an observation in 1965 that the number of transistors per square inch on integrated circuits had doubled every year since their invention - his law was simply the prediction that this trend would continue into the foreseeable future. This prediction rang true for decades - computers and automated machines have become smaller and faster with time, as transistors have become more efficient.
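The compounding effect of that yearly doubling is easy to underestimate. A toy calculation (the starting figure is invented for illustration, not a real transistor count) shows how quickly it runs away:

```python
# Toy illustration of Moore's Law-style growth: a quantity that
# doubles at a fixed interval. Starting figure is made up.
def grow(start, years, doubling_time=1.0):
    """Value after `years` if it doubles every `doubling_time` years."""
    return start * 2 ** (years / doubling_time)

# Doubling every year for 20 years multiplies the count over a million-fold.
print(grow(50, 20))  # 50 * 2**20 = 52428800.0
```

Twenty doublings is a factor of about a million; slow the pace to a doubling every two years and the same twenty years yields only a thousand-fold increase.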
As computers have become more efficient, they have also become more affordable and widespread in our day-to-day lives. Your smartphone is millions of times more powerful than all the computing power NASA used to first put men on the moon in 1969 - a testament to the relentless pace predicted by Moore.
Moore’s Law no longer strictly holds true, as the rate of doubling has slowed due to physical limitations. However, there is one technology that has accelerated far faster in recent years than computing ever has - DNA sequencing.
What is DNA sequencing?
Simply put, it is the reading of the DNA code letter by letter. The letters are the four nitrogenous bases that distinguish DNA’s nucleotides, each commonly abbreviated to a single letter: A (adenine), C (cytosine), G (guanine) and T (thymine).
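In software terms, a DNA sequence is often handled as nothing more than a string of these four letters. A minimal sketch (the fragment below is invented for illustration):

```python
# A DNA sequence can be represented as a plain string of the four base letters.
fragment = "ATGCGTAC"  # made-up fragment for illustration

# Count how often each base appears in the fragment.
counts = {base: fragment.count(base) for base in "ACGT"}
print(counts)  # {'A': 2, 'C': 2, 'G': 2, 'T': 2}
```

Sequencing a genome amounts to recovering a string like this, billions of letters long, from a physical DNA sample.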
In 1953 James Watson and Francis Crick, drawing on X-ray diffraction data produced by Rosalind Franklin, showed how DNA is formed of these nucleotides, arranged in the now famous double helix structure. It wasn’t until 1977, however, that DNA sequencing developed into an efficient and reproducible method.
One of the great leaps forward is credited to a remarkable British scientist, Frederick Sanger. He had already won a Nobel Prize for determining the sequence of chemical links that form the protein chain of insulin (paving the way for it to be synthesized, improving the lives of diabetics ever since). He won a second Nobel Prize - one of only a handful of people ever to do so - for his work on sequencing.
Sanger sequencing, as it is still known and used, could read up to 80 nucleotides in one go. This was a big improvement on previous methods, but it was still very laborious.
Nevertheless, Sanger’s group were able to sequence most of the 5,386 nucleotides of a simple virus. This was the first fully sequenced DNA-based genome (a complete set of genetic material present in an organism).
Constant increase in speed
To put the rate of progress so far into the personal context of my own family’s history: Watson and Crick published their paper on the structure of DNA a few weeks before my maternal grandparents met at a holiday camp in Devon. The first genome of a simple virus was not sequenced until my mother had left university and started teaching. Pretty slow so far, right?
In 1984, the year before I was born, scientists finished the complete sequence of the Epstein-Barr virus (which causes glandular fever). It has over 172,000 nucleotides. That may sound like a lot - but a human genome comprises over 3 billion pairs of nucleotides.
But by the time I was fifteen years old, scientists were able to sequence an entire human genome.
So how did sequencing speed up to such a rate? Thanks to the increasingly powerful and affordable computing predicted by Moore, automated sequencing technology started appearing.
The Human Genome Project
The commercial automated sequencing machines that became available in the late 1980s would become the workhorses for the Human Genome Project, which aimed to map the human genome.
After ten years of work and 3 billion dollars spent, Bill Clinton and Tony Blair announced on 26th June 2000 that the first draft of a human genome had been completed. (Please note that, although it was called “The Human Genome Project”, there is no universal genome; each is unique to the individual. The first published sequence was pieced together from several different individuals.)
President Clinton compared the achievement of sequencing a complete human genome to putting mankind on the moon. The analogy is apt in terms of scientific effort.
But while human space travel has so far reached its limit with the moon landings, whole genome sequencing since the Human Genome Project has really taken off.
Drop in cost per genome
The cost of sequencing has been driven down by the development of high-throughput sequencing, which produces millions of sequences at the same time. This is known as next-generation sequencing.
In 2005 a single sequencing machine run could read a billion nucleotides. By 2014, the rate had climbed to 1.4 trillion nucleotides. Now a single machine in a single laboratory can sequence over 45 whole human genomes (each belonging to a different individual) for less than $1,000 each, in a single day.
Compare that with the time and cost of the Human Genome Project: a head-spinning outpacing of Moore’s Law.
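Taking the figures above at face value (~1 billion nucleotides per run in 2005, ~1.4 trillion in 2014), we can estimate the implied doubling time and compare it with the roughly two-year doubling commonly attributed to Moore's Law:

```python
import math

# Throughput figures quoted above, treated as exact for this rough estimate.
run_2005 = 1e9       # nucleotides per machine run in 2005
run_2014 = 1.4e12    # nucleotides per machine run in 2014
years = 2014 - 2005

# Number of doublings needed to go from the 2005 rate to the 2014 rate.
n_doublings = math.log2(run_2014 / run_2005)
doubling_time = years / n_doublings
print(f"Throughput doubled roughly every {doubling_time:.2f} years")
```

That works out to a doubling roughly every ten months - well ahead of the two-year cadence of late-period Moore's Law.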
Sequencing and cancer
Is any of this relevant to cancer research? Yes: previously, genetic testing was limited to sequencing single genes. Now we can quickly and affordably determine the complete DNA sequence of a patient’s genome and identify all the variants associated with their condition.
It also enables the discovery of novel cancer-associated variants: by comparing tumour and normal DNA, sequencing can provide a comprehensive view of every change in the tumour’s DNA. These changes are being logged in online databases available to cancer researchers everywhere.
This potential to recognise genetic factors by looking at the larger genetic picture of individuals could herald an era of truly personalised medicine, and may make our generally “one size fits all” approach to cancer treatment seem as antiquated to future generations as the early integrated circuits Gordon Moore once helped build.