The role of the government in research
January 8, 2012
Science has forever been plagued by naysayers who are afraid of us unlocking the secrets of the universe. In the past it has been branded as dangerous, blasphemous, and the devil’s work. The most obvious example for a North American would be the Catholic Church, which until recently labeled any scientific advancement that contradicted God as “heresy”, and threatened and even killed many such “lunatics”. Galileo, for example, was condemned to lifelong house arrest on the charge of heresy for his belief that the Earth revolves around the Sun. Our scientific prowess has grown exponentially since this archaic time. We now live in a so-called “enlightened” society. Science is generally considered an essential and worthy endeavor, and it would be unimaginable for the church to stifle progress in such a way.
Unfortunately, however, not all scientific discoveries merely increase our understanding of the universe, as Galileo’s mind-blowing (at the time) theories did. The more our technological expertise increases, the more science has the power to be dangerous. The invention of the sword was a monumental achievement which had many detrimental moral implications at the time. Yet the invention of the gun brought with it a catastrophically larger amount of killing potential, and the invention of the atom bomb literally blows both of these inventions away. Violent technologies aside, our leaps forward now put us on the brink of many advances which threaten many people’s strongest moral values. Information technology threatens our right to privacy. Genetic engineering raises issues about the morality of self-improvement. Genetic screening brings up the controversial topic of designer babies, and even of aborting babies who will be diseased. The list goes on and on. The implications are staggering. Perhaps, then, the church was right?
It would be near impossible to argue in our day that progress should be stifled, but what about progress that all, or at least the general population, considers morally abhorrent? Should scientific research be guided by moral considerations? Or is the ultimate goal the pursuit of knowledge, whatever its outcome? Most importantly, should the government be allowed to interfere with science? As I write this, I do not have any opinions on the matter; the ethical implications seem too important to draw any conclusions without careful research and thought. Let us first pick the brains of some of the leading theorists in the field.
We will start with Edward Teller, a Senior Research Fellow at the Hoover Institution. A Hungarian-born physicist, he is chiefly known for his contributions to the first demonstration of thermonuclear energy. He says:
Today there is perceived to be a strong contradiction between the results of science and the requirements of morality; for instance, the application of science has led to the development of nuclear weapons, while international morality seems to demand that such results never be applied—and that research leading to them should be stopped.
He believes, however, that contradiction and uncertainty are good, since they lead to a deeper understanding. In 1949, he (indirectly) advised President Truman to continue work on the hydrogen bomb. He defends his decision in several ways. First of all, he has a “firm belief that the pursuit of knowledge and the expansion of human capabilities are intrinsically worthwhile”. He was also afraid of the Soviets achieving military superiority. He finally finds solace in a letter from four of his Russian colleagues. It says that since, for the first time in history, the most powerful weapons ever created were not used, they would become “instrument(s) of human experience, the means of great discoveries, the tool(s) of deep penetration into the secrets of Nature”.
Teller has some valid points. His claim that contradiction and uncertainty are good is quite compelling: how are we supposed to fully understand the universe if we limit ourselves to “moralistic” research? His claim that the pursuit of knowledge is intrinsically worthwhile, however, raises the question: at what cost? It is true that, since the hydrogen bomb has never been used, it has many potentially “good” applications, such as power generation and, ultimately, a greater understanding of the universe. But it would be foolish to think that the threat has passed. A hydrogen bomb still has the potential to destroy. His argument for military supremacy is to me the least compelling, since I believe the creation of weapons is never justified. That, however, is a whole different debate.
Our next expert is Joseph Rotblat, who worked on the development of the atomic bomb before devoting the rest of his life to peace and the abolition of nuclear weapons, for which he was awarded the Nobel Peace Prize. Rotblat strongly opposed Teller’s view that scientists aren’t responsible for the consequences of their research. He ended up working on the bomb because, if “Hitler can have the bomb, then the only way in which we can prevent him from using it against us would be if we also had it”. He soon realized, however, that the Germans did not have the resources to make such a bomb, and that the Americans had less noble goals. He argued that even if a scientist could not predict the applications of his work, he was still responsible for any consequences. However, he believed that it was up to the conscience of the individual scientist, and not to the state, to determine the direction of research, since regulation would be too difficult. He thus suggested a ‘Hippocratic Oath’ for scientists, and believed that ethics should be taught to all scientists. He also wished that, as in medical science, ethical committees would be set up to approve research proposals.
I agree that a scientist should bear responsibility for the direct consequences of his work. However, the inventor of the laser should not be held responsible for the creation of laser weaponry; that takes it too far. Rotblat makes a very interesting point about government regulation: such policies might well be difficult to implement. Perhaps an oath and ethical committees would ensure the morality of science; however, this does not effectively counter Teller’s argument that science and morality must be separated in order to gain a complete understanding of the universe.
During my research, I have come across an article by David Koepsell which closely parallels the one which I am currently writing. He examines the claim that scientific research should proceed without limit since scientists merely unlock the secrets of the universe; it’s the politicians, technologists, and engineers who should be blamed for the unethical applications of scientific discoveries through technologies. He counters, however, that there have been many atrocities committed in the name of science before the application phase. This has led to the development of the “Belmont principles”, which are basically that people (including test subjects) must be treated with respect, the research must have some beneficial intention behind it, and minorities must be treated justly. However, these principles really only apply to bioethics, since other fields don’t require human test subjects. Koepsell argues that “Science proceeds not in a vacuum, but as a socially devised institution”. He therefore wishes for these principles to be expanded to include everyone who could be potentially affected by the research. He also recognizes the claim that many scientific pursuits are “dual-use”, which means that they have the potential to be harmful or helpful. His belief is that scientists should take moral responsibility for their own work, employing the formula L + P > R (L = likelihood of independent discovery and use, P = potential benefit from scientific investigation now, R = risk). He also supports institutional regulation, and is a proponent of the ethical training of scientists.
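Koepsell’s L + P > R formula is really just a decision rule, and it can be sketched in a few lines of code. The function name and the numeric values below are hypothetical, purely for illustration; Koepsell does not prescribe any particular scale for L, P, and R.

```python
def should_proceed(likelihood_independent: float,
                   potential_benefit: float,
                   risk: float) -> bool:
    """Sketch of Koepsell's L + P > R heuristic: proceed with the research
    when the likelihood that the discovery would be made and used
    independently anyway (L), plus the potential benefit of investigating
    now (P), outweighs the risk (R). All values are on an arbitrary
    common scale; the numbers used below are hypothetical."""
    return likelihood_independent + potential_benefit > risk

# A dual-use research line that others would likely pursue anyway:
print(should_proceed(likelihood_independent=0.8,
                     potential_benefit=0.5,
                     risk=1.0))   # True: 0.8 + 0.5 > 1.0

# A high-risk line with little benefit and little chance of
# independent discovery:
print(should_proceed(likelihood_independent=0.1,
                     potential_benefit=0.2,
                     risk=1.0))   # False: 0.1 + 0.2 < 1.0
```

The interesting feature of the rule is the L term: even risky research can be justified if someone else is bound to do it anyway, which is essentially the reasoning Rotblat described about the wartime bomb project.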
I believe that Koepsell’s claim that scientists need to be aware of their surroundings effectively counters Teller’s view that every avenue of science must be explored: putting others in danger is a direct violation of our fundamental human rights, especially if the potential benefits are minimal. I think that a combination of government regulation and ethical training is a good start, but is perhaps not enough.
It is perhaps fitting that the last expert whom I will cite is Albert Einstein: the man who set off the chain reaction that led to the atom bomb. In a letter dated October 1952, he discussed the moral obligation of scientists. He debates whether scientists should simply search for the truth, or whether this truth should have a practical application. To him, the very essence of scientific research is an almost “religious attitude” towards the acquisition of knowledge, with or without practical purpose. He strongly believes that a “man of science” is a proud person who is distressed that his field of work has endangered humanity through weaponry. To him, the man of science is enslaved by the politicians who hold power thanks to scientific discoveries. He believes that the only way to ensure the survival of mankind is to abolish all weaponry. He concludes that if every man of science thinks critically, and applies his thinking, the dangers of science would be greatly reduced.
It follows from Einstein’s train of thought that he would not support any government regulation on science. He would, however, support ethical training, and perhaps Rotblat’s ‘Hippocratic Oath’ as well.
Now that the experts have laid their best arguments on the table, it becomes easier for me to come to a conclusion on this confounding problem. Einstein’s and Teller’s argument that the pursuit of knowledge should not be hindered is very convincing. To me, however, Koepsell’s claim that scientists have a duty to respect the lives and the futures of all those affected by their research trumps all. I believe that every human has a right to live out their lives without mad scientists enabling more ways for them to die. Therefore I support the government regulation of scientific research. I believe that institutions should be created to ensure that the interests of all humans are protected from future implications as well as present ones. I think that individual scientists should take responsibility for their work as well; Rotblat’s ‘Hippocratic Oath’ for scientists would be a big step towards this end. Ethical training would also be necessary to reinforce the necessity of taking our fellow humans into consideration. These steps will lead us on the path to a morally acceptable future.