Pawan Kumar Dhar

In science, a law describes a true and unchanging relationship among interacting elements. In finding new laws, a scientist generally negotiates three layers: experiment, observation and imagination.

While physicists enjoy a number of laws of energy, mass, momentum and gravity, biologists have in their armoury the Mendelian laws of inheritance, the laws of metabolic scaling and the power laws. However, even the best-known laws of inheritance come with exceptions, such as non-random segregation of chromosomes and homozygous mutants parenting normal offspring.

It is therefore useful to think of a law in biology as the most frequently observed regularity endorsed by majority opinion, rather than as a 'stiff relationship' among interacting components.

Mendelian curiosity

How could Mendel discover the laws of inheritance despite the immense scarcity of data? And why have more laws in biology not been discovered since, despite enormous data and an abundance of technology? The only technological aid Mendel was armed with was a pair of curious eyes! Perhaps the key reason for his success was his clear understanding of the need to find 'constants'. It is interesting that the word 'constant' appears 69 times in his paper!

Mendel 'artificially eliminated' noise from his samples and considered only those plants that exhibited consistent patterns, that is, plants with seven pairs of contrasting features that did not fluctuate with time, weather, nutrition and so on. For this reason, Mendel needed only elementary mathematics (addition and division) to obtain the laws of inheritance.

Mendel chose seven pairs of contrasting characteristics and ensured through in-breeding that each plant consistently exhibited the same feature. Even if he had included an eighth pair or considered only six, he would still have reached the same conclusions. It is important to note that the laws of inheritance are derivable using at most two factors!
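
As a toy illustration of this point (a hypothetical sketch, not Mendel's own procedure), the short Python snippet below enumerates equally likely gametes from one and from two heterozygous factor pairs; simple counting recovers the 3:1 and 9:3:3:1 ratios, which is all the arithmetic the laws require.

```python
from collections import Counter
from itertools import product

def cross(parent1, parent2):
    """Enumerate offspring phenotypes from two parents.

    Each parent is a list of allele pairs, e.g. [("A", "a"), ("B", "b")].
    Treating every gamete combination as equally likely is all that the
    laws of segregation and independent assortment assert.
    """
    gametes1 = list(product(*parent1))
    gametes2 = list(product(*parent2))
    offspring = Counter()
    for g1, g2 in product(gametes1, gametes2):
        # A factor shows the dominant form if at least one upper-case allele is present.
        phenotype = tuple(
            "dominant" if a.isupper() or b.isupper() else "recessive"
            for a, b in zip(g1, g2)
        )
        offspring[phenotype] += 1
    return offspring

# Monohybrid cross (one factor): 3 : 1
print(cross([("A", "a")], [("A", "a")]))
# Dihybrid cross (two factors): 9 : 3 : 3 : 1
print(cross([("A", "a"), ("B", "b")], [("A", "a"), ("B", "b")]))
```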

The data conundrum

In contrast to the Mendelian era, scientists today are inundated with a morass of sequence, expression, metabolome and proteome data. They have access to huge computational power and advanced mathematical methods, but are nowhere close to identifying a network-level equivalent of the Mendelian laws.

This could be because moving from the consistent phenotype level to the dynamic molecular level exposes us to a large body of variables, such as stochastic gene expression and the concentration gradients that influence cell–cell interactions, lowering the possibility of finding a Mendelian equivalent of molecular constants.

The challenge is how to extract biological constants from this vast space of variables: stochastic gene expression, probabilistic molecular interactions, pathways, networks and cell–cell interactions. In this space between genome and phenotype, probabilities, fluctuating concentrations, molecular crowding, context dependencies and emergent phenomena play a significant role. This layer is the domain of statistical laws, not of elementary mathematics.
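
To make the contrast concrete, the sketch below simulates a minimal birth-death model of gene expression (the rate values are arbitrary and chosen only for illustration). Individual runs fluctuate unpredictably; only statistical summaries, such as the mean copy number, behave like constants.

```python
import random

def simulate_expression(k_prod=10.0, k_deg=0.1, t_end=100.0, seed=None):
    """Gillespie-style simulation of a birth-death model of gene expression.

    Protein is produced at rate k_prod and degraded at rate k_deg * n.
    Each run gives a different trajectory; only the distribution of copy
    numbers (mean around k_prod / k_deg) is a reproducible 'constant'.
    """
    rng = random.Random(seed)
    t, n = 0.0, 0
    while t < t_end:
        rate_prod = k_prod
        rate_deg = k_deg * n
        total = rate_prod + rate_deg
        t += rng.expovariate(total)           # waiting time to the next event
        if rng.random() < rate_prod / total:  # choose which event fires
            n += 1
        else:
            n -= 1
    return n

# Individual runs differ widely; the ensemble average settles near k_prod / k_deg = 100.
samples = [simulate_expression(seed=i) for i in range(200)]
print(min(samples), max(samples), sum(samples) / len(samples))
```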

Taking the Mendel route

One could take a cue from the Mendelian approach and look for relationship constants. For instance, one could look for a constant such as a protein consistently interacting with another protein in several organisms under well-defined conditions. It is unlikely that we will ever find an absolute 'interaction constant' common to all organisms. What we should probably expect in biology is a trend restricted by strain, culture condition and metabolic state, rather than an absolute correlation.
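
A toy calculation of such a 'relationship constant' might look like the following; the strains, conditions and presence/absence calls are hypothetical stand-ins for what would, in practice, come from curated interaction databases.

```python
# Hypothetical presence/absence calls for one protein pair across
# (strain, condition) combinations; real data would come from curated
# interaction databases, which are not queried here.
observations = {
    ("strain_A", "rich_medium"): True,
    ("strain_A", "minimal_medium"): True,
    ("strain_B", "rich_medium"): True,
    ("strain_B", "minimal_medium"): False,
    ("strain_C", "rich_medium"): True,
}

def interaction_consistency(obs):
    """Fraction of contexts in which the interaction was detected.

    A value near 1.0 suggests a context-robust 'relationship constant';
    intermediate values suggest a trend restricted by strain, culture
    condition or metabolic state, as argued in the text.
    """
    return sum(obs.values()) / len(obs)

print(f"consistency = {interaction_consistency(observations):.2f}")  # 0.80
```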

It would be useful to identify consistent interaction patterns at the RNA–DNA, protein–DNA, pathway and network, and cell–cell interaction levels, and to build a 'constant chassis' from the sequence level up to the phenotype. Such a 'chassis' could help identify the core biological processes around which the variables operate. It could be assembled from constants at multiple levels, such as structural constants in the form of highly conserved folds and binding domains (helix-turn-helix, zinc finger and leucine zipper), and network-level constants based on power laws and the 'small-world' property. It would also be useful to find relationships among constants at the same level, and among constants connecting different levels, to obtain a hierarchical systems perspective of the constants topography.
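
As a rough sketch of how such network-level constants might be probed (assuming the interaction network is available to networkx, and using a scale-free random graph as a stand-in for real experimental data), one could compare clustering and path length against a size-matched random graph, the usual heuristic test for the small-world property.

```python
import networkx as nx

def largest_component(graph):
    """Restrict to the largest connected component so that average
    shortest path length is well defined."""
    nodes = max(nx.connected_components(graph), key=len)
    return graph.subgraph(nodes)

def small_world_summary(graph):
    """Compare clustering and path length with a random graph of the same
    size and edge count, the usual heuristic for the small-world property."""
    n, m = graph.number_of_nodes(), graph.number_of_edges()
    random_graph = nx.gnm_random_graph(n, m, seed=0)
    return {
        "clustering": nx.average_clustering(graph),
        "clustering_random": nx.average_clustering(random_graph),
        "path_length": nx.average_shortest_path_length(largest_component(graph)),
        "path_length_random": nx.average_shortest_path_length(largest_component(random_graph)),
    }

# A scale-free (Barabasi-Albert) graph stands in for a real interaction
# network, whose edge list would normally be loaded from experimental data.
network = nx.barabasi_albert_graph(500, 3, seed=0)
print(small_world_summary(network))
```

High clustering combined with short paths, relative to the random control, would count as one reusable 'network constant' in the proposed chassis.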

Laws are formal representations of objective reality. They do not necessarily represent the total reality but symbolize a specific feature of the system. The modern thinking is that laws are emergent. The well-known laws relating pressure and volume break down when the number of gas molecules falls below a certain threshold. It is important to note that laws are true only within a certain zone, beyond which uncertainty takes over and the laws break down.
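
A back-of-the-envelope estimate makes this threshold tangible: relative fluctuations of a quantity averaged over N independent molecules scale roughly as 1/sqrt(N), so fluctuations that are invisible for a mole of gas become overwhelming for a handful of molecules. The snippet below is only an order-of-magnitude illustration, not a derivation.

```python
import math

# Relative fluctuation of a quantity averaged over N independent molecules
# scales roughly as 1 / sqrt(N): negligible for a mole of gas, dominant
# once only a handful of molecules remain and the macroscopic law dissolves.
for n_molecules in (6.0e23, 1.0e6, 100, 10):
    print(f"N = {n_molecules:10.0e}   relative fluctuation ~ {1 / math.sqrt(n_molecules):.1e}")
```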

The discovery of laws in physics, built on well-known constants (such as Planck's constant and the speed of light) and on regularities such as the laws of motion, encourages the search for similar regularities in biology. In biology, however, it is difficult to find new laws, as every biological decision is optimal only in a given environmental context.

One should probably look for generalisations at various levels instead. To find such generalisations, it is useful to develop novel measurement technologies that capture the dynamic nature of biological systems and, more importantly, catch emergent properties arising from the group behaviour of interacting components.

After all, laws formalise consistent observations; they do not explain them.