In a previous post on the ecology track, I elaborated on the bane of almost all researchers’ existence – securing research money. Part of that post covered the need to “sell” one’s science in order to secure grants; indeed, the current research climate has forced many of us to become, essentially, glorified science salespeople. For better or worse, that part of science is here to stay…

Last week, my lab group had the fortune to welcome a visiting entrepreneur/researcher who works on the economics of alternative protein sources from plants. For the sake of confidentiality, I’ll be intentionally ambiguous about the details of his work, but it was extremely well received by my colleagues, many of whom work on novel molecular approaches in the agricultural sciences. I too enjoyed his presentation, but I was surprised at how interested everyone was in the economics of his agricultural research, as opposed to the agricultural research itself.

In the end, he was brutally honest in his economic viability assessment – the economics simply doesn’t check out when plant production is scaled to industrial levels. His conclusion – it doesn’t make sense to fund research on improving the growth rates of this plant, because at industrial scales, conventional plant protein sources (e.g. soybeans) always win on cost. Substantiating his opinion, other companies that have attempted commercial production have since gone out of business.

Given that scientists are often overly optimistic and starry-eyed when it comes to research ideas, it is refreshing to hear someone admit, transparently, that an idea simply doesn’t work because of the economics. Many of my colleagues investigate microbial solutions to boost agricultural productivity, so a lot of their projects have a commercial end-goal (e.g. a patent for a strain of bacteria). This got me thinking: whose job is it to ensure that research has commercial viability? The academics, or the industry players?

Argument 1: Researchers should ensure their proposed idea is economically viable before conducting research

On one hand, most research money comes from public funds. Hence, it can be argued that researchers conducting applied research bear a responsibility to use those funds wisely for the benefit of society, which includes ensuring that whatever research output is generated can be scaled and utilized economically.

However, which researcher can guarantee a result? If results could be guaranteed, there wouldn’t be a need for research in the first place! Thus, expecting positive results from a study with absolute certainty is unrealistic. Nevertheless, this doesn’t give academics a free pass to research whatever they want with little odds of success. Between zero and absolute certainty lies a sweet spot where projects have a good chance of succeeding and thus deserve funding, yet are risky enough to produce something truly novel and significant (at the cost of possibly failing outright). Personally, I don’t know where that sweet spot is, nor whether one exists in the first place.

Taking a step back, perhaps a better question to ask would be: is there a way to sieve out economically doomed studies, even if they may produce positive results? This is where we can start incorporating talent from multiple disciplines to make a more informed decision on the economics of applied research. For example, the guest researcher above capitalized on his skills as a process engineer to build an in-house model that simulates the estimated cost of producing said alternative protein, and used sensitivity analysis to identify the variables in the production pipeline that most strongly drive the cost. His analysis showed that the biggest cost hurdle was not cultivation but post-harvest processing. Thus, if there were any research money at stake to scale the production of this plant protein, it should be directed at optimizing the processing step rather than plant cultivation. To my knowledge, however, such research remains underdeveloped and is pursued by only a few individuals and organizations for now.
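To make that concrete, here is a minimal sketch (in Python) of the kind of one-at-a-time sensitivity analysis he described: perturb each input of a simple cost model and see which one moves the cost per kilogram the most. The model structure, parameter names, and numbers below are entirely made up for illustration; none of them come from his actual model.

```python
# Minimal sketch of a techno-economic sensitivity check (illustrative only).
# All parameter names and baseline values are hypothetical placeholders,
# not figures from the guest researcher's model.

def cost_per_kg(params):
    """Toy cost model: cost per kg of protein = (cultivation + harvest + processing) / yield."""
    cultivation = params["land_cost"] + params["fertilizer_cost"]
    harvest = params["harvest_cost"]
    processing = params["drying_energy_cost"] + params["protein_extraction_cost"]
    return (cultivation + harvest + processing) / params["protein_yield_kg"]

baseline = {
    "land_cost": 400.0,              # $ per hectare-season (hypothetical)
    "fertilizer_cost": 150.0,
    "harvest_cost": 100.0,
    "drying_energy_cost": 900.0,
    "protein_extraction_cost": 700.0,
    "protein_yield_kg": 500.0,       # kg protein per hectare-season (hypothetical)
}

base_cost = cost_per_kg(baseline)

# One-at-a-time sensitivity: perturb each input by +10% and record the % change in cost.
for name in baseline:
    perturbed = dict(baseline)
    perturbed[name] *= 1.10
    delta = (cost_per_kg(perturbed) - base_cost) / base_cost * 100
    print(f"{name:>25s}: {delta:+.1f}% change in cost per kg")
```

In a toy model like this, whichever input produces the largest swing in the output is where optimization effort (and research money) would be best spent; in his case, that pointed to post-harvest processing rather than cultivation.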

Argument 2: It is the job of businesspeople and entrepreneurs to figure out the economics of applied research

On the other hand, researchers are mere mortals, and we can only specialize in so many things at a time. Perhaps economics is rightfully not our forte, and we should leave it to the business world to figure out the economics behind scaling up findings from the scientific community.

While I don’t do commercial research, taking such a perspective feels almost…irresponsible? Historically, academics have had a bad rep for ivory tower syndrome, being so preoccupied with their own research that they become deaf to the needs of the outside, non-academic world. Clearly, no sane academic wants to be viewed as an aloof ivory-tower scholar working on things no one else cares about (perhaps the insane ones do desire that?). Perhaps, then, some form of checks and balances needs to be built into the research grant system to filter out applied scientific projects of low economic viability.

Yet, sometimes research isn’t all about generating commercial value. In that case, assessing the value of a proposed study can become very hazy – is it for knowledge generation, or for commercial potential? No doubt, a lot of scientific research would not have existed without commercial interest. Indeed, some of the greatest scientific findings of the 20th century were spurred on by industry – think of the blue LED, or the invention of nylon. However, it is also true that much fundamental research in the past (e.g. the study of prime numbers, quantum mechanics, etc.) did not appear to have significant commercial prospects when it was first conceived, and likely required governmental funding for support. Only much later did we start developing technologies that put this fundamental research to everyday use. In the examples above, prime number generation is a core aspect of modern cryptography, whereas quantum mechanics is at the heart of many technologies used today.

Conclusion

My view is that society should definitely continue to fund fundamental research, for it forms the bedrock of many fields of science. However, such research should be assessed (and funded) differently from more applied, commercial research. A key difference between the two types of research is their generalizability. The former seeks to establish universal “truths” about our world, with the potential for novel applications in the future. The latter seeks to address existing problems, but offers little generalizability otherwise. There is clearly value in both types of research. As to where to draw the line between the two, however, I do not know.

In an era where trust in the scientific enterprise has declined, is there a way to reduce the wastage in scientific output? Biomedical research is especially notorious for the sheer volume of funding it requires while putting out less-than-stellar outputs. Such is the difficulty of studying something as complex as biology (life) itself, which often does not conform to fixed rules the way physics and chemistry do. At the same time, one cannot help but wonder if there are better ways to sieve out the research ideas that have no hope of scaling up before we commit billions of dollars to a pipe dream.

Right now, funding for biotech research is akin to a very sophisticated slot machine. Maybe, at the end of the day, that’s the best we’ve got as scientists – throwing darts in the dark and hoping one lands. Let me know your thoughts if you have a different opinion on how we can fund science with less wastage and more accountability!