OPINION article

Front. Phys., 03 July 2018
Sec. Medical Physics and Imaging
Volume 6 - 2018 | https://doi.org/10.3389/fphy.2018.00068

Self-Managed Belief as Part of the “Scientific Method”: Part I—A Guide on Mental Modus Operandi as Exemplified by Research in Nuclear Magnetic Resonance

Csaba Szántay Jr. and Ewald Moser

  • 1 Spectroscopic Research Department, Gedeon Richter Plc., Budapest, Hungary
  • 2 Faculty of Chemical Technology and Biotechnology, Budapest University of Technology and Economics, Budapest, Hungary
  • 3 Center for Medical Physics and Biomedical Engineering, Medical University of Vienna, Vienna, Austria
  • 4 MR Center of Excellence, Medical University of Vienna, Vienna, Austria
  • 5 Department of Psychiatry, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, United States

The Truth is Out There—or Maybe Not…

“Thou shalt not believe.” Some 2500 years ago, the Buddha already told his disciples not simply to believe his sayings, but to cross-check them against their own experience1. From the perspective of modern scientific practice, this advice sounds eminently sensible, and it reflects a key element of what, since Galileo Galilei, has become widely known as the “scientific method” in the natural sciences (which is what we herein mean by “science”). The concept of the “scientific method” is often strongly associated with the ideal that the foundation of science is absolute truth, and that scientific knowledge is a collection of explanations of an objective reality that exists independently of our mind (in other words: “the truth is out there”). Thus (ideally), our established scientific knowledge rests on pure rationality, objectivity, irrefutable logic, and robust experimental proof, as opposed to our beliefs, which can shape our worldviews and our understanding of the world in a more unsubstantiated, more emotional, and more subjective manner.

The reason we have started our discussion with this rather sketchy description of science is that many (or perhaps most) people, including scientists, appear to accept it readily. Indeed, scientists often merrily engage in their daily research activities with a conception of science roughly along these lines and at this level, without bothering to think about it more deeply. There are in fact good psychological reasons for reflexively succumbing to this view: as opposed to, and/or besides, our non-evidence-based beliefs, for which the word “faith” is probably more appropriate (such as religious faith, faith in the candidness of the people important to us, etc.), it gives us an alternative and/or additional sense of existential and mental security to know (believe?) that the body of knowledge produced by science is solid and sound. This vision of science provides a secular belief system, a kind of anchor that our psyche subconsciously but desperately clings to for its inner stability.

However, this is not how science has ever worked in practice. In reality, our body of scientific knowledge is not some “objective entity” that exists entirely separately from the scientists who produced it. In fact, science operates on the basis of plenty of subjective and belief-based inferences. As opposed to the “truth is out there” view that dominated science for a long time, the modern philosophy of science tends to recognize and embrace this inherent “human component” of science to a much larger extent. Accordingly, “scientific truths” are regarded as man-made descriptions of the world (in the form of models and theories with their own scopes and limitations) that have been sufficiently substantiated for us to accept them for the time being, but that are always open to further confirmation or rejection. Quoting Richard Rorty as an example: “Truth cannot be out there—cannot exist independently of the human mind. The world is out there, but descriptions of the world are not. Only descriptions of the world can be true or false” [1].

This view of science conveys, explicitly or implicitly, the ideas that: (a) the concepts of “knowledge” and “belief” are much more elusive and less distinct than they first appear; (b) the notion of “scientific truth” is a dynamic entity pertaining to certain man-made assumptions; (c) the difference between “objective” and “subjective” scientific truths often becomes blurred upon closer inspection; (d) emotions and other psychological factors do in fact play a significant role in how we formulate and understand scientific concepts; (e) human cognition, including the reasoning of scientists, can be fallible, leading to misconceptions, errors, and hidden ambiguities, sometimes even on a mass scale and persisting for a long time. Let us reflect a bit on these attributes so that we can better understand how exactly the Buddha's teaching should be taken to heart in the context of this gloomier and more uncertain, but decidedly more realistic, picture of science.

To Be(lieve) or not to Be(lieve)?

The recognition of the general frailty of human judgment is certainly not new, and attempts to highlight and overcome this problem have led to well-known movements that have been around for decades, such as “critical thinking” and “everyday logic.” Nevertheless, enquiries into this issue have recently been gaining renewed momentum in psychology and the cognitive sciences, often resulting in important and surprising fresh insights, and even contrasting ideas about the nature and function of human reasoning, as represented by Kahneman [2], Pohl [3], Konnikova [4], Sharot [5], and Mercier [6]. The topic also gains special theoretical and practical interest in light of the rapidly evolving applications of artificial intelligence (AI), augmenting the need to better understand which human cognitive functions can or should usefully be replaced or complemented by AI functionalities so as to avoid human error and its propagation, and which realms of human reasoning lie outside the scope of such applications (somewhat more will be said on that later). Quite interestingly, however, there are far fewer examples that address the fallibility of our thinking and the role of our human nature within science in a direct, honest, comprehensive, and practice-oriented manner. This seems to be because scientists are inclined to entertain the illusion that, in line with the truth-is-out-there view of science, their adherence to the “scientific method” and their scientific training make them patently immune to reasoning errors, which are deemed to affect only “everyday” people and “everyday” judgments. In contrast to this romantic notion, science and scientific thinking are in reality fraught with error, and in spite of the self-, ego-, and image-preserving instincts of science and scientists, this is being demonstrated with increasing fervor and candidness, sometimes with quite shocking data, in the literature, as represented here by Allchin [7], Ioannidis [8–11], and Szántay [12]. The main reasons for these mistaken judgments and enduring misconceptions stem from what we, in a recent book, call “mental traps”—emotional or “semi-emotional” (therein termed “emotycal”) psychological entities that can secretly derail the reasoning of even the smartest and most skilled minds [12]. Szántay [12] gives an extensive account of these mental traps, discussing altogether 45 of them (e.g., we confuse familiarity with understanding, we accept intuitively appealing explanations, we confuse mathematical descriptions with physical understanding, etc.), together with several real-life examples of how they can lead to covert scientific errors and common misconceptions. These examples include some erroneous or misleading theoretical NMR concepts that have been widely accepted even by many eminent practitioners of the field for several decades [13–20], and also involve data-interpretation mistakes in the use of NMR spectroscopy and mass spectrometry (MS) [21–29]. Many of these faults arise and become widely accepted not because of simple oversights, logical errors, or a lack of expertise, but because of belief-based mental traps. Note that these theoretical cases represent misguided formulations or an illusory understanding of scientific descriptions, and therefore they are not the kind of problem-solving scenarios that can be addressed by AI tools.
There are of course many problem-solving tasks requiring deduction where mental traps can also kick into action, and where AI tools can sometimes, but not always, be extremely useful in averting mistakes. For example, although modern NMR and MS methods are often regarded as capable of yielding experimental data of such diversity, detail, and quality that the challenge of molecular structure determination is relegated to an almost mechanical task, akin to assembling a jigsaw puzzle from a complete set of well-defined pieces, this is very often not true in practice [30]. In fact, even when such data are available, human deductive mistakes are often made and unrecognized ambiguities linger on, as exemplified by Borman [31] and Carvalho et al. [32]. In many such cases computer-assisted structure determination tools are invaluable in avoiding structural misassignment [23, 33–35]. On the other hand, there are also many structure elucidation problems (typically when sufficient data are lacking, or when some of the premises used as apparently solid puzzle-pieces are wrong) where AI would be of no, or only marginal, help in dodging the mental traps [23]. All this shows the importance of understanding the nature of the mental traps for the sake of good scientific thinking, as one can only avoid something if one knows what to avoid!

As argued previously, many of our mental traps in scientific thinking exist largely because our “emotycs,” of which beliefs and intuitions are a great part, can so easily and stealthily cloud our rationality even when we are in a deeply contemplative mental mode [12]. This notion strays somewhat from the acclaimed model of “fast and slow thinking” [2], in which intuitive and reflective judgments are viewed as separate mental modes. It is probably closer to another recent proposition according to which the line between intuitive and reflective thought is often vague, because human reason, viewed as an adaptive evolutionary trait, serves to a significant degree the purpose of persuasion in our social interactions [6]. Clearly, this is a function that is more effective if the two modes of thought, to which we should also add emotions, are intermingled. Whatever the true nature and purpose of human reason, we assert that humanity's emotional and cognitive sophistication have been evolving hand in hand (through exploration, problem solving, tool making, social interactions, the invention and use of language and mathematics, the faculty of abstraction and self-awareness, etc.). Taking language as an example, in our social interactions spoken words and sentences always carry some emotional component besides conveying lexical information. In that sense we may say that language, our primary vehicle of thought, represents “emotionally modulated” rationality. On account of this complex understanding of our mental functions, beliefs are an inherent part of our thoughts even during analytical reflection, and they can lead our conclusions and views about the world astray, often in rather subtle and covert ways. For example, while studying a scientific field, we are prone to grasp scientific concepts on the belief-driven basis that they must be true simply because so many eminent people have treated them as such for a long time. This belief-component in how we apprehend the world can easily create an illusion of understanding. When it comes to problem solving, on the other hand, the “scientific method” would dictate that: (a) one should take all possibilities into account in an emotionally neutral manner before arriving at a scientific conclusion; (b) in the face of new considerations or evidence, one should be able to reject or modify that conclusion in an equally emotionally neutral manner. A particularly forceful belief-driven mental trap that often corrupts this conduct is widely known as confirmation bias—our tendency to gather, interpret, accept, or dismiss information with a subconscious view to swaying our conclusions in the direction of our preconceived beliefs. Furthermore, our confirmation biases can easily turn our conclusions into convictions. Our convictions are powerful constituents of our self-identity and form a strong social cohesive force in any community, and protecting this identity is one of our most important primal survival instincts. Thus, any new information that seems to contradict such a conclusion-turned-conviction can be perceived as an identity-threatening assault, triggering a reflexive self-defense mechanism that inhibits self-revision. Moreover, attempts to change someone's beliefs are usually made through rational arguments, while our beliefs are notoriously impenetrable to rationality. This imperviousness of our beliefs is known in psychology as belief persistence.
A psychological phenomenon closely related to belief persistence is often referred to as cognitive dissonance. When our convictions are threatened by contradictory evidence, i.e., when “belief” clashes with “knowledge,” a highly stressful mental state can arise. Belief persistence, then, stems from a reflexive tendency to resolve this stress by ignoring or rejecting the evidence rather than by giving up or modifying our belief. Sometimes this works so effectively that our belief becomes even stronger when confronted with a rationally irrefutable piece of contrary evidence.
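As a toy numerical illustration of how confirmation bias can produce belief persistence (our own construction for this article, not a model taken from the cited psychology literature), consider an agent that updates a belief by Bayes' rule but, as confirmation bias would have it, treats disconfirming evidence as less diagnostic than it really is:

```python
import random

random.seed(1)  # reproducible run

def update(belief: float, lr: float) -> float:
    """Bayesian update of P(hypothesis is true) given evidence whose
    likelihood ratio is lr = P(evidence | true) / P(evidence | false)."""
    odds = belief / (1.0 - belief) * lr
    return odds / (1.0 + odds)

def run(discount: float) -> float:
    """The agent starts 90% convinced of a hypothesis that is in fact false,
    so most incoming evidence speaks against it (lr < 1). A biased agent
    blunts each piece of contrary evidence by moving its likelihood ratio
    toward 1 (discount = 0 means no bias, 1 means total dismissal)."""
    belief = 0.9
    for _ in range(50):
        lr = 0.5 if random.random() < 0.7 else 2.0  # ~70% contrary evidence
        if lr < 1.0:                        # disconfirming observation:
            lr += discount * (1.0 - lr)     # the biased agent waters it down
        belief = update(belief, lr)
    return belief

print(f"unbiased agent: P = {run(0.0):.3f}")  # collapses toward 0
print(f"biased agent:   P = {run(0.9):.3f}")  # persists, and can even grow
```

With the contrary evidence blunted, the occasional confirming observations dominate, so the simulated belief not only persists but can actually strengthen in the face of predominantly contrary evidence, mirroring the effect described above.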

It may be argued that a strongly rational mind is less prone to being influenced by beliefs, and that scientists are that very special human “subspecies” which is entirely willing to subordinate belief to reason. Paradoxically, this is not necessarily so! Some people's self-identity and self-esteem depend so much on their sense of being smarter than others that they become particularly resistant to changing their conclusions and convictions when confronted with contradictory evidence or competing ideas.

Furthermore, practitioners of the many fields and professions that employ NMR-based techniques or analyze NMR data (e.g., physicists, chemists, computer scientists, pharmacologists, radiologists, neurologists, neuroscientists, psychologists, psychiatrists, statisticians, well loggers, etc.), working with a multitude of parameters thought to characterize the sample investigated (whether liquid, solid, tissue, ex vivo, or in vivo), seem to have quite varied understandings of the basic physics behind the phenomenon, as well as of the information-extraction process needed to obtain interpretable parameters from raw data. In addition, more often than not, the process of translating measured NMR parameters into information about the structure, function, or metabolism of the sample is largely subconscious and/or simply borrowed from other scientists' views and publications (confirmation bias). The latter may also be reinforced by the current peer review system [36, 37].

Nevertheless, it is through our rationality that we can hope to curb the negative effects of belief in scientific thinking, and the above considerations seem to indicate that, in order to avoid mental traps by bolstering our rational thinking as much as possible, we should subdue our beliefs as much as possible. True enough, in principle the “scientific method” aims to do exactly that, i.e., to minimize the belief-component in a scientific conclusion or theory. This notion has even led to the radical proposition that scientists should acquire the ability and habit of never believing in anything about the real world, except in a probabilistic way [38]. From what we have discussed so far, this may seem a logical idea, apparently in agreement with the Buddha's guidance. This is, however, a critical point in our analysis where, in order to assess the feasibility of such a recommendation, it is worthwhile to reflect again on the deeper conceptual bearings of the more human-centered definition of science. One of the subtleties we should note in that respect is that we can all too easily get accustomed to using words and phrases such as “belief,” “knowledge,” “scientific truth,” or “experimental evidence” without realizing that their meaning is quite elusive. So, a fundamental question that we should ask is: what, in the first place, does it mean to know something and to believe (in) something, and how are these concepts connected? These are age-old and still open philosophical, psychological, and cognitive problems that we feel neither qualified nor presumptuous enough to delve into. However, for our purposes it suffices, and is fair, to state that both knowledge and belief have a range of meanings, which overlap to a significant degree. Let us ponder these terms a bit in the context of the “scientific method.”

On the one hand, what we call our “knowledge” about something (anything!) is a subjective mental object which is necessarily contingent upon premises or theories that we accept as true on the basis of beliefs we often ignore or are ignorant of. The magnitude of the belief-ingredient in “knowing” something depends on the nature of that knowledge and on the (perceived) degree to which it is supported by scientifically valid evidence, but it is never zero. For example, we all “know” for sure that bulk magnetism exists because we all have personal and direct sensory experiential evidence of the phenomenon. However, this knowledge rests on the implicit premise that there exists an external objective reality which is not an illusion of our cognition (cf. the movie “The Matrix”). Nevertheless, for all practical purposes pertaining to our existence and functioning in the world, this kind of “knowing” can be accepted as (semi-)absolute truth, with only a pinch of ingrained belief that we might as well neglect. But we also “know” that the macroscopic phenomenon of magnetism is caused by the microscopic entity called “spin,” of which, however, we have no direct sensory apperception. In fact, the human brain has no means to really grasp the physical essence of the spin; what we can do is create mathematical and geometrical models that emulate or reflect the concept and the behavior of the spin, but an understanding of these models must not be confused with a true understanding of the spin itself (one such model is recalled, purely for illustration, right after this paragraph). Thus, the “knowledge” that macroscopic magnetism is caused by the spin involves a leap of abstraction and rests to a much higher degree on our belief that the pertinent theory of spin physics, as developed by others, is correct (the fact that a theory or model of a phenomenon gives good predictions regarding that phenomenon within a scope of conditions, and/or that it offers a scientifically sound way for us to rationalize the phenomenon, does not necessarily mean that it is a physically accurate description of the phenomenon). One should note, however, that direct experiential knowledge with only a minor belief-component can also be illusory, and is not necessarily “more true” than knowledge gained outside of such experience, which requires more belief. For example, our (correct) knowledge that the Earth revolves around the Sun comes from accepting (believing in) the relevant evidence presented to us by others, whereas our personal sensory experience would lead to the (incorrect) knowledge that it is the other way around.
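To give a concrete instance of such a model-level description (the standard classical Bloch picture, recalled here purely as an illustration, with relaxation omitted), the bulk magnetization vector \(\mathbf{M}\) in a magnetic field \(\mathbf{B}\) obeys

\[
\frac{d\mathbf{M}}{dt} \;=\; \gamma\, \mathbf{M} \times \mathbf{B},
\qquad
\omega_0 \;=\; \gamma B_0 ,
\]

where \(\gamma\) is the gyromagnetic ratio and \(\omega_0\) is the Larmor frequency of precession about a static field \(B_0\). The model predicts the precession of bulk magnetization with remarkable fidelity; yet fluency in these equations is not the same as grasping what spin physically “is,” which is precisely the point made above.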

On the other hand, our beliefs also come in all sorts of shapes, shades, colors and flavors, with the intended meaning of “belief” extending from our faiths and hopes, which may pursue but do not require evidence, to our beliefs in ideas or facts that we have judged to be true on the basis of some direct or indirect piece of “evidence.” Although from the point of view of the correctness and credibility of a scientific conclusion these differences are extremely important, it is all too easy to overlook them in the mental process of understanding scientific concepts and solving scientific problems, which—as already noted—is one of the main sources of the mental traps.

Actually, the concepts of belief and knowledge blend into each other intricately, inevitably, and inseparably, with their ratio depending on the subject and nature of the belief/knowledge in question. Indeed, in a deeper sense belief and knowledge cannot exist without each other; they have a symbiotic and, as will be pointed out below, even a positively synergistic relationship. To emphasize this connection, for the purposes of our discourse we have taken the somewhat playful and adventurous step of welding the words “belief” and “knowledge” together so as to coin the queer but hopefully useful term “beliefedge” (this term also reflects the fact that “scientific truth” is never a yes (1) or no (0), as assumed in Boolean logic, but, as described by fuzzy logic, can take any real number between 0 and 1 [39]).
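To make the Boolean-versus-fuzzy contrast concrete, here is a minimal sketch in Python (our illustration: the propositions and truth degrees are hypothetical; only the min/max/complement operators are the standard ones of Zadeh's fuzzy set theory [39]):

```python
# Boolean logic: a claim is either false (0) or true (1).
# Fuzzy logic (Zadeh [39]): a claim has a degree of truth anywhere in [0, 1].

def fuzzy_and(a: float, b: float) -> float:
    """Standard fuzzy conjunction: the minimum of the two truth degrees."""
    return min(a, b)

def fuzzy_or(a: float, b: float) -> float:
    """Standard fuzzy disjunction: the maximum of the two truth degrees."""
    return max(a, b)

def fuzzy_not(a: float) -> float:
    """Standard fuzzy negation: the complement of the truth degree."""
    return 1.0 - a

# Hypothetical degrees of substantiation for two beliefedges:
model_is_accurate = 0.8   # well supported, but not absolutely certain
premise_is_sound = 0.6    # plausible, less thoroughly tested

print(fuzzy_and(model_is_accurate, premise_is_sound))  # 0.6
print(fuzzy_or(model_is_accurate, premise_is_sound))   # 0.8
print(fuzzy_not(model_is_accurate))                    # ~0.2 (floating point)
```

In Boolean logic both claims would have to be rounded to 0 or 1 before any reasoning could begin; the fuzzy view lets a beliefedge carry its degree of substantiation explicitly.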

A beliefedge should not be thought of as a bounded mental object, but as the result of an information-processing procedure that has led to the judgment that the given beliefedge is true. It is always the process and the beliefedge together that should be the target of our considerations when we want to understand and avoid the mental traps caused by our beliefs. In that respect it is worthwhile to distinguish between pre-conclusion and post-conclusion beliefs, because we often use the word “belief” in somewhat different senses in the two cases. In real-life situations this distinction can be blurred, but we use it here on the precept that, in order to facilitate analysis and to gain a better understanding of a topic, it can be useful to think in terms of extremes. For example, in a statement such as “I believe that this hypothesis is true,” “belief” is still part of the hypothesis-testing procedure, and is thus closer to the concept of faith or hope, while in the context of a beliefedge, “belief” more typically means an accepted assumption or interpretation. In addition, our existing beliefedges are important building blocks in the way we form new beliefedges; thus, faulty beliefedges can easily spawn further misguided ideas in our mental space. In that connection, the concept of a “probabilistic belief” [38] seems to us more fittingly applicable to the realm of pre-conclusion beliefs, where proper analytical (critical) thinking dictates that no commitment to the truth-value of the conclusion should yet have been made. In a post-conclusion context, however, the term “probabilistic belief” appears to be an oxymoron, because such beliefs are inherently non-probabilistic (although a conclusion can be probabilistic in a statistical sense).
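One standard way to give a pre-conclusion “probabilistic belief” operational content is Bayesian updating. The sketch below is a textbook construction offered as an illustration (the scenario and numbers are hypothetical, and this is our gloss rather than the specific formalism of Martins [38]):

```python
def update_belief(prior: float,
                  p_evidence_if_true: float,
                  p_evidence_if_false: float) -> float:
    """Bayes' rule: revise the probability that a hypothesis is true
    after observing one piece of evidence."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1.0 - prior))

# Hypothetical scenario: a tentative structural hypothesis for a molecule.
belief = 0.5                              # pre-conclusion: genuinely undecided

# A cross-peak expected in 90% of spectra if the hypothesis is right,
# but also appearing 30% of the time if it is wrong:
belief = update_belief(belief, 0.9, 0.3)
print(f"after supporting evidence: {belief:.2f}")   # 0.75

# A further observation that is rare (10%) if the hypothesis is right:
belief = update_belief(belief, 0.1, 0.6)
print(f"after contrary evidence:   {belief:.2f}")   # 0.33
```

As long as the belief is held this way it remains revisable; the mental traps discussed above correspond to the moment the running probability is silently frozen into a non-probabilistic post-conclusion belief.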

True enough, our beliefs have a downside in the form of mental traps. However, in light of the above considerations, let us now reflect again on whether exiling all beliefs from human scientific thinking so as to affirm the “scientific method” would be a viable, or even advisable, scenario.

First of all, even if we truly wanted to banish all our beliefs with all our might, doing so would be not only extremely difficult but virtually impossible, because of how and why our beliefs infiltrate our psyche. We, as the human species, not only have a unique capacity to believe, but also an inherent need to believe! As stated above, our self-identity, our mental security, and probably even our consciousness hinge (besides our various competencies) upon our beliefs, which exist partly as sovereign mental entities and partly come packed with our beliefedges. Also, being an ultra-social species, we have an innate need to be fitting members of some cohesive group(s), which, as already noted above, requires a sense of group-identity resting on a common belief system. No concertedly constructive social interaction would be possible without trust, i.e., without being able to believe (note that no “scientific evidence” is required or available here) that the people with whom we connect are honest and have positive intentions, and that they like and respect us to a reasonable degree (or, for that matter, that our spouses really love us!). We even have an inbuilt reflex to safeguard our pertinent social illusions, disregarding signs that would taint them. Moreover, without believing, we would not even be able to immerse ourselves in, say, the plot of a movie. In order to make that experience intellectually and emotionally meaningful, we are quite willing to temporarily let our beliefs fill our mental space, even though our reality-conscious rationality working in the background knows that the narrative is fictional, the scenes are artificial, and all the characters are played by actors. As a somewhat loose but nevertheless valid analogy, we may consider scientific paradigms. According to a common usage of the word, a “paradigm” is a framework of strong premises, visions, and conceptual strategies within which scientists conduct their research. During that research program further assumptions are made, ideas are generated and tested, and entire theories can be built that are consistent with the basic premises of the paradigm—all without questioning the paradigm itself. If the knowledge thus generated starts to contradict the paradigm, scientists are forced to re-examine its validity, which leads to a new paradigm. Until a “paradigm shift” occurs, however, scientists need to, and are quite willing to, believe in the validity of the paradigms that form the basis of their research program [12, 40, 41]. The same goes for axioms and hypotheses, which are taken to be true until proven right or wrong. Here, in practice, “taken to be true” does not simply mean a purely intellectual assumption whose future outcome the scientist views in an emotionally detached manner; more often than not it is treated with considerable emotional commitment (hope!) toward the truth of the assumption. Such beliefs are essential for most scientists, since they provide an intellectual and emotional home base from which they can commence their creative forays, and to which they can return for reevaluation when frustration sets in.
Believing in the success of a research project, in the correctness of our intuition, in the validity of a “hunch,” and in our ability to achieve, is not just a collateral part of the action, but an elementary psychological requisite for having the courage to take risks, to raise and explore new ideas or challenge old ones, and to persevere in the face of difficulties (which, as we all know, abound in all research programs). Because science is partly an individual activity that very much involves our self-identity, and partly a social activity with all its ensuing group dynamics (brainstorming, mutual inspiration or criticism, competitiveness, etc.), and because in reality “scientific thinking” cannot be sharply separated from “everyday thinking” [12], we cannot realistically eliminate the belief-element from scientific conduct. And neither should we! In fact, the faculty of Homo sapiens to believe has served as a unique evolutionary advantage over other species in the advancement of mankind: it is the key to extending an individual's knowledge beyond what is derived from that individual's direct sensory experiences. From the dawn of man, knowledge about the hazards and assets of nature became rapidly and efficiently transferable from individual to individual owing to our species-specific ability to trust, i.e., to believe. No learning and knowledge sharing could occur without an element of belief, since there is no way one can verify the credibility of all received information.

In all, no passion, hope, creativity, knowledge sharing, or collaboration—all essential ingredients of science—would be possible without our capacity to believe! A purely analytically and impartially thinking person might be a very efficient deductive powerhouse with no mental traps, but nevertheless a lousy researcher. The renowned scientist Hans Selye expressed this point fabulously: “The totally unprejudiced individual who gives equal consideration to every possibility would be unfit not only for science but even for survival. The fact is that creative scientists are full of preconceived ideas and passions. They consider certain results likely, others unlikely; they want to prove their pet theories and are very disappointed if they can't. And why shouldn't they be prejudiced? Their prejudices are the most valuable fruits of their experience. Without them they could never choose among the countless possible paths that can be taken.” [42] (This could also be one of the reasons why AI will not easily, if at all, achieve human-like creativity and spontaneity.)

To Know Your Science, Know Your Beliefs!

Of course, if one took the Buddha's instruction literally, and our acceptance of any teaching depended on some form of personal verification instead of simply accepting it at face value—well, that would make knowledge sharing impossibly arduous and slow. Thus, the “scientific method,” and in fact the Buddha's tutelage, should not be understood as a dictum to discard belief from science! Rather, in order to endorse the “scientific method,” we should strive to become conscious of our (often hidden) beliefs and understand their potential role in our mental traps. In other words, we should develop the faculty of managing our beliefs! To push the analogy with the movie “The Matrix” a little further, not acknowledging the existence of our mental traps is like choosing the blue pill (a cultural symbol of blissful ignorance, or of rejecting reality for the sake of our mental comfort), while a commitment to active belief-management is like choosing the red pill (a symbol of accepting and facing up to the much harsher and more uncertain, but genuine, reality).

The true essence of the “scientific method” lies not in the expectation that it will yield a full understanding of the world, but in the modus operandi of the procedure through which we try to approximate that understanding. Such a complete understanding is only wishful thinking, but we can do our best to give logically sound, workable, evidence-supported descriptions of the world. As Sir Francis Bacon said: “science is but an image of the truth.” If, say, someone asks an NMR spectroscopist whether he/she knows what happens in an NMR tube during or after a radio-frequency pulse, and the answer is an unhesitating “yes!”, then this is a clear sign of self-delusion in the sense that his/her beliefedges on the subject contain unrecognized (unmanaged!) belief-components. A scientifically more legitimate answer would be something like this: “I am not quite sure, but I can explain to you various quantum-mechanical and classical models, resting on different premises and approximations, that describe different aspects of the phenomenon with remarkable fidelity and predictive power within the scope of certain conditions.” (Note that although this is the more correct, and also the more honest and more daring, statement, we have typically not been conditioned to give or receive answers like that, or to think about the world along such lines. Scientists, science teachers, and even science students are expected to make confidently unambiguous statements, which contributes greatly to the often self-delusory habit of always trying to understand the world in terms of pristine concepts. This holds even more for medical doctors, who are trained to provide an answer, or at least to suggest a procedure for how to continue, within a short period of time.)
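To illustrate what one such classical model looks like in practice, here is a minimal numerical sketch of the Bloch picture recalled earlier (relaxation omitted, an ideal instantaneous pulse, and an illustrative field value; none of this claims to say what spin “is”):

```python
import numpy as np

GAMMA = 2.675e8                 # 1H gyromagnetic ratio (rad s^-1 T^-1)
B0 = 1.0                        # static field along z, in tesla (illustrative)
OMEGA0 = GAMMA * B0             # Larmor angular frequency

def pulse_90x(M: np.ndarray) -> np.ndarray:
    """Ideal, instantaneous 90-degree RF pulse about the x axis:
    tips longitudinal magnetization into the transverse plane."""
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, 0.0, -1.0],
                   [0.0, 1.0, 0.0]])
    return Rx @ M

def precess(M: np.ndarray, t: float) -> np.ndarray:
    """Free precession about B0 (the z axis) for time t: in the Bloch
    picture without relaxation, M simply rotates at the Larmor frequency."""
    phi = -OMEGA0 * t
    c, s = np.cos(phi), np.sin(phi)
    Rz = np.array([[c, -s, 0.0],
                   [s, c, 0.0],
                   [0.0, 0.0, 1.0]])
    return Rz @ M

M = np.array([0.0, 0.0, 1.0])   # thermal-equilibrium magnetization along z
M = pulse_90x(M)                # after the pulse: M lies in the xy plane

period = 2.0 * np.pi / OMEGA0   # one Larmor period (~23.5 ns at 1 T)
for frac in (0.0, 0.25, 0.5):   # sample the precessing transverse components
    Mx, My, _ = precess(M, frac * period)
    print(f"t = {frac:4.2f} T: Mx = {Mx:+.2f}, My = {My:+.2f}")
```

The sketch reproduces the precession of the bulk magnetization, the very oscillation an NMR coil detects, while staying entirely at the level of a model, in the spirit of the hedged answer above.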

Thus, an adept commitment to the “scientific method” which recognizes the inherent limitations and uncertainties in scientific descriptions puts additional emphasis on the need for conscious belief-management. This means arriving at our beliefedges with meticulous care; distinguishing between unambiguous and belief-based information during that process; understanding the difference between our pre- and post-conclusion beliefs; struggling not to leave any stone unturned before arriving at a conclusion; becoming comfortable with leaving a problem open and not forcing a conclusion until we have more information available; not being reflexively dogmatic in a self-esteem-protective manner; becoming capable of changing our beliefs; etc. A belief-conscious scientist must never claim that he or she knows something with absolute certainty. This, however, is not uncertainty in the sense of indecisiveness, but in the sense of always being open to the possibility of revision or refinement. There is a saying commonly credited to Ambrose Redmoon: courage is not the absence of fear, but rather the judgment that something else is more important than fear. By analogy, we might say: science is not the absence of belief, but rather the judgment that rationality can regulate belief. The attitude advocated in reference [12], called “anthropic awareness” (in short: the faculty of becoming aware of the “human factors” that subconsciously influence our thinking even when our mind is fully engaged in analytical reasoning) was also forged in that spirit.

Scientists should adhere to the “scientific method” not only because it is the principal code of conduct for doing good science. The “scientific method,” which finds many applications in areas beyond science itself (e.g., technological research and development, engineering, medical practice, etc.), must also be fostered and held onto with great care and perseverance because it is the only form of human thinking that purposely aims to gain an understanding of the world by being as objective and meticulous as possible. It is our intellectual sanctuary, our gold standard of how to acquire knowledge by thinking and experimenting rigorously, by providing solid proofs, and by letting ideas and experimental data be further tested. Thus, scientists should be not only the executors, but also the guardians, of the mental modus operandi reflected in the term “scientific method.” However, these roles can be properly fulfilled only by understanding and controlling the inherent, and often hidden, beliefs that can, besides being a major driving force, potentially blemish the “method.”

In Part II we shall provide some examples where mental traps, or limited expertise in the basics of NMR, MRS, or (f)MRI, lead to questionable or even wrong results.

Author Contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We are grateful to all our students and colleagues (including referees) who have questioned our work and in this way helped us to improve.

Footnotes

1. ^ Do not believe anything, no matter where you read it, or who said it, even if I did, unless it is consistent with your own experience and your own common sense. Source: Siddhartha Gautama (563 BC–483 BC).

References

1. Rorty R. Contingency, Irony, and Solidarity. Cambridge: Cambridge University Press (1989).

2. Kahneman D. Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux (2011).

3. Pohl RF, editor. Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgment and Memory. Hove: Psychology Press (2012).

4. Konnikova M. Mastermind: How to Think Like Sherlock Holmes. New York, NY: Viking (2013).

5. Sharot T. The Influential Mind: What the Brain Reveals About Our Power to Change Others. New York, NY: Henry Holt and Company (2017).

6. Mercier H, Sperber D. The Enigma of Reason. A New Theory of Human Understanding. Cambridge, MA: Harvard University Press (2017).

7. Allchin D. Error types. Perspect Sci. (2001) 9:38–59. doi: 10.1162/10636140152947786

8. Ioannidis JPA. Why most published research findings are false. PLoS Med. (2005) 2:e124. doi: 10.1371/journal.pmed.0020124

9. Ioannidis JPA. Why most discovered true associations are inflated. Epidemiology (2008) 19:640–8. doi: 10.1097/EDE.0b013e31818131e7

10. Ioannidis JPA. Why science is not necessarily self-correcting. Perspect Psychol Sci. (2012) 7:645–54. doi: 10.1177/1745691612464056

11. Ioannidis JPA. Why most clinical research is not useful. PLoS Med. (2016) 13:e1002049. doi: 10.1371/journal.pmed.1002049

12. Szántay C Jr, editor. The philosophy of anthropic awareness in scientific thinking. In: Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 3–93.

13. Hanson LG. Is quantum mechanics necessary for understanding magnetic resonance? Concepts Magn Reson Part A (2008) 32A:329–40. doi: 10.1002/cmr.a.20123

14. Hoult DI. The magnetic resonance myth of radio waves. Concepts Magn Reson. (1989) 1:1–5.

15. Hoult DI, Ginsberg NS. The Quantum origins of the NMR signal and spin noise. J Magn Reson. (2001) 148:182–99. doi: 10.1006/jmre.2000.2229

16. Hoult DI. The origins and present status of the radio wave controversy in NMR. Concepts Magn Reson. (2009) 34A:193–216. doi: 10.1002/cmr.a.20142

17. Szántay C Jr, editor. An “anthropically” flavored look at some basic aspects of spin physics using a classical description. In: Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 97–140.

18. Hanson LG. The ups and downs of classical and quantum formulations of magnetic resonance. In: Szántay C Jr, editor. Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 141–71.

19. Szántay C Jr, editor. The RF Pulse and the Uncertainty Principle. In: Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 173–211.

20. Szántay C Jr, editor. On the nature of the RF driving field in NMR (with a lookout on optical rotation). In: Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 213–28.

21. Szakács Z, Sánta Z. NMR Methodological Overview. In: Szántay C Jr, editor. Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 257–89.

22. Háda V, Dékány M. MS Methodological Overview. In: Szántay C Jr, editor. Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 291–315.

23. Béni Z, Szakács Z, Sánta Z. Computer-assisted structure elucidation in NMR. In: Szántay C Jr, editor. Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 317–54.

24. Béni Z. Structure elucidation of a mysterious trace component of ulipristal acetate. In: Szántay C Jr, editor. Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 355–65.

25. Dubrovay Z, Háda V. The adventurous discovery of the structure of a novel vincristine impurity. In: Szántay C Jr, editor. Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 367–76.

26. Szakács Z, Kóti J. An elusive degradation product of ziprasidone. In: Szántay C Jr, editor. Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 377–89.

27. Sánta Z, Háda V. The case of an emotion- and emotycs-laden structure determination of a small synthetic molecule with an unexpected structure. In: Szántay C Jr, editor. Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 391–99.

28. Szántay C Jr, Demeter Á. Self-induced recognition of enantiomers (SIRE) in NMR spectroscopy. In: Szántay C Jr, editor. Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 401–15.

29. Demeter Á. Believe it or not: carbon protonation of the pyrimidine ring. In: Szántay C Jr, editor. Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 417–31.

30. Szántay C Jr, editor. An “Anthropic” Modus Operandi of Structure Elucidation by NMR and MS. In: Anthropic Awareness: The Human Aspects of Scientific Thinking in NMR Spectroscopy and Mass Spectrometry. New York, NY: Elsevier (2015). p. 231–56.

31. Borman S. Tug of War Over Promising Cancer Drug Candidate. Chem. Eng. News, On-Line Edition (2014). Available online at http://cen.acs.org/articles/92/web/2014/05/Tug-War-Over-Promising-Cancer.html

32. Carvalho EM, Pereira FA, Junker J. How well does NMR behave in natural products structure determination? A survey of natural products published in 2007 and 2008. Poster presented at the 50th Experimental NMR Conference. Asilomar, CA.

33. Elyashberg ME, Williams AJ, Blinov KA. Contemporary Computer-Assisted Approaches to Molecular Structure Elucidation. Cambridge: RSC (2012).

34. Elyashberg ME, Williams AJ. Computer-Based Structure Elucidation from Spectral Data. The Art of Solving Problems. Heidelberg: Springer (2015).

35. Buevich AV, Elyashberg ME. Synergistic combination of CASE algorithms and DFT chemical shift predictions: a powerful approach for structure elucidation, verification, and revision. J Nat Prod. (2016) 79:3105–16. doi: 10.1021/acs.jnatprod.6b00799

36. Koehler JJ. The influence of prior beliefs on scientific judgments of evidence quality. Org Behav Hum Dec Proc. (1993) 56:28–55. doi: 10.1006/obhd.1993.1044

37. Nickerson RS. Confirmation bias: a ubiquitous phenomenon in many guises. Rev Gen Psychol. (1998) 2:175–220. doi: 10.1037/1089-2680.2.2.175

38. Martins ACR. Thou shalt not take sides: cognition, logic and the need for changing how we believe. Front Phys. (2016) 4:7. doi: 10.3389/fphy.2016.00007

39. Zadeh LA. Fuzzy sets. Inform. Control (1965) 8:338–53. doi: 10.1016/s0019-9958(65)90241-x

40. Kuhn TS. The Structure of Scientific Revolutions. 3rd ed. Chicago, IL: University of Chicago Press (1979).

41. Lakatos I. Falsification and the methodology of scientific research programmes. In: Lakatos I, Musgrave A, editors. Criticism and the Growth of Knowledge. Cambridge: Cambridge University Press (1970). p. 91–195.

42. Selye H. From Dream to Discovery. On Being a Scientist. New York, NY: McGraw-Hill (1964).

Keywords: science, mental traps, confirmation bias, probabilistic belief, fuzzy logic, NMR spectroscopy

Citation: Szántay C Jr and Moser E (2018) Self-Managed Belief as Part of the “Scientific Method”: Part I—A Guide on Mental Modus Operandi as Exemplified by Research in Nuclear Magnetic Resonance. Front. Phys. 6:68. doi: 10.3389/fphy.2018.00068

Received: 08 February 2018; Accepted: 13 June 2018;
Published: 03 July 2018.

Edited by:

Alex Hansen, Norwegian University of Science and Technology, Norway

Reviewed by:

Federico Giove, Centro Fermi-Museo storico della fisica e Centro Studi e Ricerche Enrico Fermi, Italy

Copyright © Szántay and Moser. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Csaba Szántay Jr., cs.szantay@richter.hu
