On Knowing Things

To know something is so fundamental to the western psyche that considering a definition of knowledge may seem pedestrian. How many times per day do we think or utter the phrase “I know” as a response in the affirmative or to register agreement? We know an infinite number of individual things: our home address, the name of our first pet, famous world capitals, specialized information associated with our professions—the colors in a favorite painting or lines of a sonnet, perhaps. We also know seemingly very complex things: manners associated with a culture in which we were raised, written and spoken language(s), mathematical principles, and how to play a musical instrument are just a few examples. Some of these types of knowledge clearly have active cognitive elements as well as subconscious ones, but all seem to fit in some way within the scope of what we as humans feel justified in claiming we know.

But what is the meaning of knowledge? Such a broad question cannot be tackled as a categorical imperative in an essay of this scope—indeed, as will be discussed below, the meaning of knowledge has evolved to mean different things within different sub-disciplines of the same field, let alone between, say, the sciences and humanities—so we endeavor to consider this question in the context of biomedical science. The fundamental basis of scientific inquiry from the dawn of human existence shares the thread of (i) sensory-based observation, (ii) formation of theory (or, prediction) and (iii) experimentation. Knowledge, then, should emerge from this formula and should require the input of each of the three components. This is an important distinction, wherein scientific knowledge has differed from the non-scientific variety: principles of scientific knowing are reasonably impervious to the Weltanschauung of the masses (e.g. humans knew the world was flat based on scientific principles…until scientific principles allowed star gazers and experimentalists [sea-farers] to prove otherwise), whereas conventional knowledge is not (e.g. ancient Romans knew the correctness of “rendering unto Caesar,” while modern western societies have a more nuanced knowledge of taxation and shared governance). Cultural shifts within societies over time, and between societies at a given moment, establish widely varied understandings of knowing and knowledge; scientific knowledge is not at the whim of cultural, political, or emotional tangents. (As an aside, this is of course not to say that science cannot be politicized, as there are many unfortunate examples where it has been. Furthermore, cultural, political and emotional factors all play a large role in the method and scope of scientific inquiry, as citizens and politicians shift the emphasis of inquiry to new needs and interests of the population; these factors do not, however, affect the fundamental meaning of scientific knowledge.)
Here we argue that the basis for scientific knowledge is uniquely human, and we would define it as the ability to predict and/or reproduce a phenomenon in the natural world through the use of the five senses and, as is increasingly the case in all of science, specialized tools for dissection, data collection and data analysis. Scientific knowledge is universal, which is to say, it is always the same, in any language and under any social construct, until new theory, experiment and observation necessitate a new synthesis.

There is a critical distinction between knowledge, which is pursued and obtained in many fields (including, as dealt with herein, science), and truth, which is the sole purview of mathematics, religion and some disciplines of philosophy. Truth is not the domain of science, because the pursuit of truth is not an unbiased attempt to rationally explain the world: it discards the assumption of fallibility, which itself enables the existence of scientific knowledge. Whether anything is permanent or eternal is a question with interesting cosmic considerations—some would argue central to the drive of human inquiry—but ontological permanence is not within the scope of scientific observation. There is also a distinction to be made between knowledge and information, the latter herein referring to a medium of the former: information is the content, the substance; knowledge includes the retrieval, processing and usage.

The features of human inquiry in biomedical science are changing on a scale that requires reevaluation of what it means to know something and what it means to reproduce an observation. The purpose of this essay is to address how scientific language and reasoning can evolve to respond to this change. We begin with a consideration of the evolution of scientific inquiry, particularly with regard to how the use of tools for data collection have changed the scale of information humans can acquire and how this has in turn changed the means for generating knowledge.

Use and Evolution of Tools, Changes in the Acquisition of Information and Knowledge

Early man distinguished himself from, and out-competed, ancient hominids based on improved cognitive abilities and use of tools. In this regard, humans and their evolutionary predecessors are still thought to be distinct from non-humans based on the human ability to store and process information to make future decisions; apply this to things like prevention of conception or treatment of wounds sustained hunting or in combat, and you have the dawn of biomedical knowledge. Agriculture likewise represents an example of non-medical scientific knowledge. Although precise determinations of when language arose vary (based in part on the definition of language), pre-human primates were verbally communicating 1-2 million years ago, whereas modern language has probably been around for 100,000 years. Knowledge transfer, then, would have taken the form of oral traditions from the time behaviorally modern humans arose (~50,000 years ago) until the emergence of written language, around 3,200 BC in Mesopotamia.

Fast-forward to the 20th century and the emergence of the computer. A mere half century before the present, the totality of human knowledge, while vast, could be reasonably quantified in a straightforward manner: count the number of books. Consider early thinkers like Socrates or experimentalists like Leonardo da Vinci, Galileo Galilei and Benjamin Franklin: these scientists used their senses and engineered tools and methods to acquire knowledge beyond the state-of-the-art of their contemporaries. Through the middle of the last century, humans recorded their observations, theories and data, and performed their calculations and analyses, on paper, storing them in printed books or articles. With the emergence and now ubiquity of computers in all aspects of human existence, the type and scale of information to be obtained and stored has undergone a quantum leap. From everyday things like email and Wikipedia to the advanced calculations performed by engineers, physicists and life scientists—humans now possess an ability to acquire and process information orders of magnitude greater than possible with the human mind alone. (As an interesting aside, it is an unresolved question (Sparrow, Liu et al. 2011) what effect off-loading information storage from our brains to computers and the Internet will have on human cognition, intelligence and creativity, as well as on pure social constructs like communication. Whether delegating cognitive tasks to computers frees our minds for creative pursuit or atrophy is also unknown, although the experiment is ongoing.) But does this information constitute knowledge? When all human knowledge was stored in books, one could reasonably argue that no single person knew the totality of human knowledge, but this sum could be rightfully considered the domain of human civilization as a whole—because someone knew every individual piece of information.
Not so in the present day: much of modern human activity relies on information never processed in the mind of a human being, and that no human being retains for recall. So we are left with the need to determine what, in the vast amounts of information that surround human societies, constitutes knowledge—when can something be said to be known in the present day?

In addition to storing information (some of which one assumes passes muster as knowledge), computers are prime examples of tools that have enabled new types of information to be acquired in the modern era, but they are certainly not the only example. Science and technology are perhaps the areas in which advances in tools have most fundamentally changed what it means to know something. Gone are the days when the curious, innately skilled amateur could make ground-breaking discoveries by simply using a scalpel and scissors. In biomedical research, the frontier of human knowledge has progressed to the nano-scale (one billionth of a meter), wherein new discoveries are made by manipulating and measuring molecules, which can only be visualized and quantified with the most sophisticated (and expensive) instrumentation and using complex methodologies known only to very specialized researchers (see (Chen, Mias et al. 2012) for just one example in the area of large-scale biology). It is virtually impossible for a person without formal laboratory training, most times involving graduate education, to make a novel discovery about basic biology and/or medicine. Similar changes have occurred in the acquisition of knowledge about the universe, driven now by data from telescopes launched deep into space and theories based on observations that would be impossible with the tools our recent ancestors used to observe that the earth was not the center of the cosmos. This change in the scale of human inquiry necessitates a change in the language used to describe knowledge.

What Knowledge Is Not

The most intuitive method of acquiring information about the world is through direct observation with the five senses. Observation, however, does not constitute knowledge. The recalling of an observation and the processing of the observed information are separate from the act of observation, with only the combination of all three constituting knowledge. Consider the dog who observes that begging at the dinner table produces food scraps from his owners: the observation of the food is not enough—for the dog to have knowledge about this experience, and hence for him to manipulate the information to affect his actions, he must identify the event (the begging, in this case) and process, or associate, the event with another fact or outcome (in this case, receiving the leftover bones). Here, the dog can be said to have knowledge that beseeching his owners produces a meal.

Neither information nor data is knowledge, for a similar reason—they are inert and, alone, neither convey nor contain meaning. Meaning is imparted by the mind of the being that processes the information. (As an aside, the phrase “knowledge in books”, ergo, is a contradiction in terms. Information in books only becomes knowledge when a human being reads and comprehends it.) Knowledge, unlike data or information, requires a sentient being for its existence. One should also acknowledge at this juncture the concept of truth, which is neither a prerequisite for, nor a result of, knowledge. For obvious reasons one’s concept of truth is central to any discussion of knowledge; as discussed above, truth is a dogmatic absolute with no place in science.

What Knowledge Is

Knowledge is the processing of past events to influence the present. I know that I left my car keys on the refrigerator and so I fetch them from there on my way out the door in the morning. There need be no philosophically more complex synthesis to understand, at its core, what knowledge is. Why, then, is it so challenging to estimate what humans know as a species, and how knowledge is transferred (or not) between individuals and generations?

Take the area of science: the chemistry professor clearly knows the makeup of a DNA molecule—the chemical structures of the bases, the positioning of the sugars and phosphates, the logic for bonding between the bases in the double helical structure of the molecule…these are immutable facts, tested by thousands of experiments by different scientists all over the world. This information is contained in textbooks and remains knowledge when passed on to new minds through lectures or reading. Here the iterative path to knowledge is ostensibly clear: observation, overt experimentation, and interpretation. But what about scientific knowledge not obtained from textbooks, lectures or personal instruction? Knowledge that seems innate? Take the surgeon. An anatomy professor may know the Latin name of every nerve, muscle and ligament, but you would hire the skilled physician, whose textbook knowledge is perhaps indeed formidable, based instead on her practical knowledge to do your gall bladder surgery or hip replacement. It seems, then, that the physician here knows something that is completely reliant on: (i) her specific mind; (ii) her training; and (iii) her command of her experience in real-time. Similar examples exist in laboratory science, where some individuals just have “good hands” (or not) at the bench, a trait that is present or absent without correlation with knowledge acquired in a didactic manner. Other commonplace examples are a racecar driver or Wall Street trader: these individuals possess a core set of—perhaps universally attainable—facts that their own minds allow them to manipulate in a manner that creates specialized knowledge. The surgeon and the racecar driver possess knowledge of the variety that circumvents cognition, proceeding straight from the subconscious to the action, demonstrating that while cognition may be sufficient for knowledge, it is not necessary.
The antithetical consideration is one in which knowledge exists but is not applied—where cognition is engaged but action does not occur. This knowledge is no less valid than the foregoing examples.

These considerations yield two intriguing implications: first, knowledge need not be communicable to exist; second, while information or truth may be universal, knowledge is not. Neither of these is particularly satisfying from a scientific point of view, in which we pride ourselves on the ability to codify nature and reduce to practice (if not also to principle) that which is observed. What value is there for the scientist in considering definitions of knowledge? What role, if any, does the scientist play in the collective knowledge of the human species?

Evolution of Science, Evolution of Knowledge

Unlike scholars in the humanities, who play a no less fundamental role in the creation of human knowledge, scientists have the unique requirement that such knowledge must be transmittable through the five senses and observable in the same way (there are certain thought experiments here where this may not play out, but suffice it to say, the potentiality of observation by the five senses must exist, even if the reality does not). A novel may create soaring emotions in a reader, but the writer has relied on the theater of the reader’s mind, not any real sensory stimulation, for the creation. The scientist is bound by what can be shown, or more importantly, what can be reproduced. No one need reproduce Beethoven’s Moonlight Sonata for it to be a transcendent observation about the universe (if one chooses to think in such terms). But any scientific observation, no matter how stunning, is meaningless if the observer cannot codify it somehow and if another person cannot reproduce it.

We propose that the semantics around knowledge and the manner in which knowledge is codified by scientists should actively take into account the following basic principles in the current stage of human scientific development: (i) what it means to know a principle in a given scientific discipline should be addressed (debated may be a better term…) explicitly and publicly by scientists in that field, to avoid rampant accumulation of data and review papers; (ii) knowledge should be codified using a logic that is appropriate for the given discipline (an example in genome biology is as follows: DNA is an inviolable genetic code for RNA—an ATG is necessary and sufficient for an AUG, and in turn for a methionine—but whether there is a histone code, wherein specific modifications to chromatin constitute necessary and sufficient conditions for a subsequent event, is an area of active debate (Strahl and Allis 2000; Lee, Smith et al. 2010)); (iii) scientists must become better communicators overall, as has been well recognized, not only with the non-scientific public but also, no less importantly, among scientific disciplines. We need to learn to constantly and actively evolve our language to accommodate our own discoveries and those of other fields, including non-science; (iv) scientists need to learn to think like non-scientists and incorporate, when appropriate, knowledge structures from the humanities. As has been discussed in detail elsewhere (Shlain 1991), human insight, and thus knowledge as well, proceeds by heuristic strides in the humanities and science in parallel, with each of these disciplines, which are often (perhaps inappropriately) segregated in modern education, fundamentally changing the logic structure in the other throughout recorded history; and (v) the codification of knowledge within and between disciplines must be an active process and must be continually revisited as the nature of human inquiry advances.
What makes humans unique is our ability to universalize individual experience—how this process changes to adapt to modern methods of experiencing the world has fundamental implications for the intellectual evolution of the species.

Matthew Thompson, Brad Picha, Thomas M. Vondriska

May 2013

Literature Cited

Chen, R., G. I. Mias, et al. (2012). "Personal omics profiling reveals dynamic molecular and medical phenotypes." Cell 148(6): 1293-1307.

Lee, J. S., E. Smith, et al. (2010). "The language of histone crosstalk." Cell 142(5): 682-685.

Shlain, L. (1991). Art & Physics: Parallel Visions in Space, Time and Light. USA, Perennial.

Sparrow, B., J. Liu, et al. (2011). "Google effects on memory: cognitive consequences of having information at our fingertips." Science 333(6043): 776-778.

Strahl, B. D. and C. D. Allis (2000). "The language of covalent histone modifications." Nature 403(6765): 41-45.