How researchers’ epistemic beliefs influence the quality of their work

Jeanne Ellis Ormrod

As we human beings learn new things every day, we all have ideas about what “knowledge” and “learning” are—ideas that are collectively known as epistemic beliefs. These typically include beliefs about many or all of the following:

  • The certainty of knowledge: Whether knowledge is a fixed, unchanging, absolute “truth” or, instead, a tentative, dynamic entity that will continue to evolve over time.
  • The simplicity and structure of knowledge: Whether knowledge is a collection of discrete, independent facts or, instead, a set of complex and interrelated ideas.
  • The source of knowledge: Whether knowledge comes from outside of learners (i.e., from a teacher or other authority figure) or, instead, is derived and constructed by learners themselves.
  • The criteria for determining truth: Whether an idea is accepted as true when it’s communicated by an expert or, instead, when it’s logically evaluated based on available evidence.
  • The speed of learning: Whether knowledge is acquired quickly, if at all (in which case learners either know something or they don’t, in an all-or-none fashion) or, instead, is acquired gradually over a period of time (in which case learners can partially know something).
  • The nature of learning ability: Whether people’s ability to learn is fixed at birth (i.e., inherited) or, instead, can improve over time with practice and use of better strategies.

Keep in mind that epistemic beliefs aren’t as either–or as I’ve just portrayed them. Most or all of the dimensions I’ve listed are probably continuums rather than strict either–or dichotomies.

Psychologists often use certain terms when referring to various beliefs about the nature of knowledge. Typical of 3-year-olds is a realist view, in which knowledge is the same as what people say or do (e.g., if I tell you that some cows have purple fur with orange spots, you’ll take my word for it). Four-year-olds are more likely to have an absolutist view, in which knowledge isn’t necessarily the same as people’s thoughts or assertions but it’s certain and definite—things are either absolutely right or absolutely wrong. Later on—typically in adolescence at the earliest—some individuals acquire a multiplist view, in which some knowledge is seen as uncertain, with people’s varying opinions all having equal legitimacy. People may or may not eventually acquire an evaluativist view, in which people’s ideas and opinions have more or less merit and legitimacy depending on whether defensible evidence or logic supports them.1

It makes sense to hold an absolutist view about some kinds of knowledge. Certain bits of information are fairly black and white; we usually think of them as “facts.” For example, France is a country in Europe, Christopher Columbus first sailed across the Atlantic in 1492, and two things plus two more things give us four things altogether; these facts are unlikely to change in the foreseeable future. In other situations, a multiplist view makes sense. For example, there isn’t necessarily a single “right” answer to questions such as “What qualities are essential for ‘good’ music?” and “Is it appropriate to burp when you’re a dinner guest in someone else’s home?”

When conducting research on complex issues or problems, however, good researchers adopt an evaluativist perspective: They recognize that a particular premise or conclusion is probably “true” only to the extent that concrete evidence and logic support it. Accordingly, taking an evaluativist view requires researchers to engage in at least three mental processes:

  • Critical thinking. Good researchers never take the things they read or hear at face value. Critical thinking involves evaluating the accuracy, credibility, and worth of information and lines of reasoning. For example, when people read about other individuals’ theories and research findings, they regularly ask themselves such questions as these: “Are there potential shortcomings in this research study that make me question the validity of the researcher’s conclusions?” “Does this researcher’s explanation make sense based on other research findings related to the issue being investigated?” “How might I improve on the research methods used in this study?”
  • Metacognitive reflectiveness. The term metacognition means “thinking about the nature of thinking,” and metacognitive reflectiveness means “thinking about one’s own thinking.” Good researchers regularly reflect on their own thought processes, mentally checking themselves regarding their own logic. For example, they continually ask themselves whether they’re being as objective as possible in their observations, whether their evidence adequately supports their hypotheses and conclusions, and where there might be holes or inconsistencies in the theories they have constructed to explain a phenomenon they are investigating. Metacognitive reflectiveness, then, requires considerable critical thinking.
  • Conceptual change when warranted. Conceptual change involves significantly revising one’s existing beliefs about a topic, enabling new, discrepant information to be better understood and explained. Good researchers regularly revise their beliefs, understandings, and explanations as credible new evidence and theories appear on the scene. In general, they keep open minds about the true nature of the phenomena they are investigating. Researchers who do otherwise—those who stubbornly stick to their own previous explanations even in the face of considerable contradictory information—impede scientific progress as we collectively strive to better understand our physical, psychological, and social worlds.


1 For groundbreaking work on this developmental trend, I refer you to two book chapters by Deanna Kuhn and colleagues:

• Kuhn, D., & Franklin, S. (2006). The second decade: What develops (and how)? In W. Damon & R. M. Lerner (Series Eds.), D. Kuhn & R. Siegler (Vol. Eds.), Handbook of child psychology: Vol. 2. Cognition, perception, and language (6th ed., pp. 953–993). New York, NY: Wiley.

• Kuhn, D., & Weinstock, M. (2002). What is epistemological thinking and why does it matter? In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing (pp. 121–144). Mahwah, NJ: Erlbaum.
