The Outdated, Reductionist Premise of Genetic Engineering

Perhaps the most obvious attraction of using genetic engineering (otherwise known as ‘biotechnology’ or ‘genetic modification’) as my case study in reductionism is that the entire premise on which it sits – the concept that genes are: “functional units of information which can be characterised precisely, counted, added or subtracted, altered, switched on and off, or moved from one organism or one species to another by means of genetic engineering” – is not only reductionist but, ultimately, outdated and highly simplistic. Yet, remarkably, genetic engineering continues on, with barely a mention of this startling reality.

Without wanting to get bogged down in the highly technical area of molecular biology, the basic situation is this – a wide range of unexpected discoveries in molecular biology over the past decade, primarily arising from the Human Genome Project, have completely rewritten basic gene theory. Researchers found that the human genome is not a tidy collection of independent genes, each sequence of DNA linked to a single function, as had been the basic premise since the discovery of the DNA double helix by Francis Crick and James Watson in 1953. Instead, genes appear to operate in a complex network, interacting and overlapping not only with one another but also with other components, in ways that no one claims to fully understand.

The reality for molecular geneticists today is that the entire concept of a gene has become little more than a simplistic and outdated metaphor. Helen Pearson, in her 2006 Nature article titled ‘Genetics: what is a gene?’, states in her opening lines – “The idea of genes as beads on a DNA string is fast fading. Protein-coding sequences have no clear beginning or end and RNA is a key part of the information package”. In describing the outdated view that many scientists continue to hold on the subject of genes, Pearson goes on – “those at the forefront of genetic research see it as increasingly old-fashioned – a crude approximation that, at best, hides fascinating new complexities and, at worst, blinds its users to useful new paths of enquiry.”

Barry Commoner, a senior scientist at the Center for the Biology of Natural Systems, is a little more blunt in his assessment in an article titled ‘The Spurious Foundation of Genetic Engineering’ – “The experimental data, shorn of dogmatic theories, points to the irreducibility of the living cell, the inherent complexity of which suggests that any artificially altered genetic system, given the magnitude of our ignorance, must sooner or later give rise to unintended, potentially disastrous, consequences.”

The unfortunate reality for the genetic engineering industry is that, far from being the precise and accurate method the industry claims, genetic engineering’s own history points squarely to this uncertainty. Behind every one of the (few) genetic engineering ‘success’ stories lie thousands of unexpected mutations and failures.

That the state of play in molecular biology has changed completely over the past decade cannot be denied. These changes have, without doubt, also challenged the underlying assumptions of genetic engineering. What is perhaps most fascinating in the face of this reality is to ask – why hasn’t the field of genetic engineering come under the intense scrutiny one would expect, given that its entire foundational premise has been eroded?

An article in the business section of the New York Times titled ‘A Challenge to Gene Theory, a Tougher Look at Biotech’ raised exactly this point, stating – “The $73.5 billion global biotech business may soon have to grapple with a discovery that calls into question the scientific principles on which it was founded.” But that article was written in 2007, and surprisingly little has been said since.

What is important to understand about this foundational premise of genetic engineering is that nearly every other premise on which the discipline is based depends on it. This includes, but is by no means limited to, the right to patent genes, and the regulatory process for genetic engineering, which currently treats genetically engineered crops as ‘substantially equivalent’ to natural crops. That is to say, the entire genetic engineering industry is a house of cards built almost solely on the foundation of a scientific principle that is now anything but solid.

Given such shaky foundations, it perhaps becomes easier to understand why scientists tend to get so heated in defending this topic. It is, after all, their entire careers that hang in the balance.

This is an excerpt from Richard’s article entitled “The Complexity of Reductionism: A Case Study of Genetic Engineering.” The full version of the article can be found here. Richard is an agricultural scientist with ten years’ professional experience in the agricultural sector, with a particular focus on agricultural policy. He has recently completed a Masters in Holistic Science at Schumacher College in the UK. Richard is currently working as an independent digital strategy consultant whilst exploring agroecological food stories in the UK and Europe.