Note: this is a paper I wrote for a course as part of my PhD in philosophy. In a future post, I'll write about the epistemic role of intuitions in x-risks research more specifically, but I thought it'd be good to post some key ideas about the epistemology of science in general before focusing on AI-related risks.
Update: I never wrote a post about intuitions in x-risks research. I did talk about intuitive reasoning in solving alignment in this post, though. Also, Richard has an excellent piece on intuitions that you can find here.
1. Introduction
My aim in this paper is to offer an epistemological account of intuitions from the point of view of philosophy of science. The view I defend is that scientists tend to prioritize and favor the hypotheses that their intuitions lead them to prioritize and favor. I call this the ‘hunch-following thesis’. Many scientists attest to this claim in their reflections, and if it is true, it has remarkable consequences for how scientific research is conducted. It suggests that there is a powerful psychological mechanism, akin to heuristics, that co-exists with scientific methodology and drives research forward. Two questions underlie my project: (1) how do the intuitions of scientific experts differ from those of laypeople?, and (2) how ought the intuitions of scientific experts to be reconciled with pre-scientific intuitions?
I use these questions to consider how pre-scientific and scientific intuitions differ from one another, how intuitions serve in the process of hypothesizing and setting up experiments, and what the relationship between intuitions and theory in science is. Further, I examine whether scientists become better intuiters through their training and work and, if so, what the implications are, assuming that they are responsible epistemic agents. All these considerations should emphasize the value of intuitions in scientific practice and help support the hunch-following thesis.
2. The hunch-following thesis
Scientists tend to prioritize and favor the hypotheses that their intuitions lead them to prioritize and favor. This is the hunch-following thesis, as I call it. It suggests that (1) there is a tendency to hastily reject counterintuitive claims or leave them underexplored (e.g., quantum mechanics), (2) scientists may insist on positions that feel intuitive even when it is difficult to falsify or verify them, and (3) scientists may find themselves not knowing, or not being able to articulate exactly, why they think what they think. It is important to note that hunches, in this sense, are not random; following an intuition when working in science is not like flipping a coin to decide what to do next. Rather, intuitions help scientists identify what is worth investing their time and resources in.
The working definition of intuitions I adopt here is that intuitions consist in dispositions to be attracted to certain beliefs. Because such dispositions are not consciously processed, they count as a kind of fast-and-frugal heuristic. They help the scientist decide what to do next the way heuristics help the baseball player move in the right direction to successfully catch the ball.
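To make the analogy concrete, here is a minimal sketch of the fielder's rule, often called the 'gaze heuristic' in the fast-and-frugal heuristics literature. The code is my toy illustration, not anything drawn from the sources cited here; the function name, the gain parameter, and the numbers are all assumptions. The structural point is that one cue and one adjustment rule do the work of an explicit trajectory computation, much as a hunch does the work of an explicit ranking of hypotheses.

```python
# A toy sketch of a fast-and-frugal heuristic: the gaze heuristic.
# A fielder can catch a fly ball without computing its trajectory:
# fixate on the ball and adjust running speed so that the angle of
# gaze stays roughly constant. All names and numbers are illustrative.

def gaze_heuristic_step(gaze_angle: float, prev_gaze_angle: float,
                        speed: float, gain: float = 0.5) -> float:
    """Return an adjusted running speed.

    If the gaze angle is rising, the ball will land behind the fielder,
    so speed up; if it is falling, slow down. No physics, no prediction:
    a single cue feeds a single adjustment rule.
    """
    delta = gaze_angle - prev_gaze_angle
    return speed + gain * delta

# The angle to the ball increased between two glances, so accelerate.
new_speed = gaze_heuristic_step(gaze_angle=34.0, prev_gaze_angle=32.5, speed=5.0)
print(new_speed)  # 5.75
```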
3. Behind the scenes of scientific practice
3.1. Two kinds of intuitions
Science students begin their training with folk beliefs about how the world works. These are their pre-scientific intuitions; they are formed through interaction with the non-scientific realm of knowledge, broadly understood. The other kind of intuitions is the one scientists develop once their training is complete and they have learned to work according to the rules, methods, and techniques of their scientific paradigm (Kuhn 2012). These intuitions encapsulate tacit knowledge that is essential for scientific reasoning. This follows from the general traits of intuition, namely that it is rapid and operates without conscious awareness (Blancke, Tanghe, and Braeckman 2018).
The main difference between pre-scientific and scientific intuitions is that the latter tend to be part of the scientist’s tacit learning and training in a very specific area of knowledge, while pre-scientific intuitions concern how one perceives and explains the world more generally. It is possible, however, for pre-scientific intuitions to be very specific, goal-oriented, and supportive of one’s problem-solving strategies in life. The difference between the two kinds is primarily in content, not in structure. What intuitions of both kinds have in common is that they are not simple; in fact, they involve a complex of cognitive capacities such as modal reasoning, perception, etc. (Williamson 2004). It makes sense, of course, that a scientist’s modal thinking is largely not qualitatively comparable to that of a layperson, since the content of their thinking ranges over different thematic areas.
There are cases where pre-scientific and scientific intuitions are entangled; in these cases, it is not clear whether a given intuition has been filtered by scientific knowledge. For example, at least at a basic level, both scientists and non-scientists tend to trust the inferences of inductive reasoning. Indeed, induction seems to be one of those psychological mechanisms found at both the pre-theoretical and the theoretically informed levels of understanding. It can be argued, though, that scientific training transforms pre-scientific intuition into a more refined form of intuition. In this sense, we do not have two strict kinds of intuitions but rather a spectrum of more and less refined intuitions.
Whether intuition is a graded concept relates to the question of whether scientists ever do away with their pre-scientific intuitions. In other words, is it possible that scientists reach a stage where they are no longer able to intuit the physical world the way a layperson does? This has to do with the internalization of the scientific worldview while learning and practicing science. There is a sense in which a fully trained scientist can never again see the world the way she did before her training. To understand this, consider how we internalize knowledge about the world at a basic level, e.g., knowledge concerning the rotation of the earth. After a certain point, one cannot even remember how she perceived and explained the world before knowing this piece of information. It is similar for scientists. The degree of internalization is such that it is no longer possible not to conceive of the world through the scientific lens. Granting this, it is reasonable to infer that scientists abandon many of their pre-scientific intuitions because they see the world differently. Their intuitions have a different background from laypeople’s and support different goals.
3.2. The role of intuitions in hypothesizing
To achieve their goals, scientists need to generate well-founded hypotheses that they can empirically test and either falsify or verify (Popper 2002). In practice, science cannot be done without intuitions, because intuitions constitute the driving force behind the process of hypothesizing and setting up experiments. Intuitions are often contrasted with reasoning (Liao 2011), but when working in science, it is impossible to reason without intuitions. Intuitions as a driving force provide the creativity a scientist needs to appropriately direct and organize her research. The hypothesizing process is complex: it requires knowledge, insight, and intuition, all properly combined, to come up with the hypothesis that will go through the testing procedure. Whether the hypothesis is proven right or wrong is not of the highest importance, although, of course, a scientist will want her intuitions to lead her to the best possible hypotheses. If the process is well-structured, the scientist will still benefit. The structure of the process is contingent upon the scientist’s intuitive judgments.
Hypothesizing is a creative process; Bohm emphasized the connection between insight and hypothesis and called the kind of insight used in the scientific endeavor “rational insight” (Bohm 2003). Insight and intuition are both part of what is called “tacit knowledge” (Brock 2015), and while both are necessary for hypothesizing, intuition has the more lasting impact on ‘doing science’. In the making of a hypothesis, the insight usually comes only once, whereas intuition remains to guide conceptual analysis and help the scientist synthesize all the information required for the formulation of the hypothesis. It is tacit knowledge that persists and accompanies scientists throughout their scientific careers.
The process of hypothesizing entails coming up with thought experiments (before conducting empirical ones). Rowbottom writes that “intuitions (and intuition statements) are to thought experiments as perceptions (and observation statements) are to experiments” (Booth and Rowbottom 2014, 119). In a sense, intuition is the main intellectual tool a scientist has during hypothesizing, besides her explicit knowledge of the subject under investigation. The question here is where these intuitions come from. Rowbottom argues that intuitions originate in experience; he defends a view he calls “scientific intuition empiricism”, according to which the conceptual materials used in thought experiments, as well as the intuitions themselves, come from experience.
There is good reason to place the origin of intuitions in experience; it indicates that the process of hypothesizing is still empirical work that applies conceptual analysis to the natural world. In other words, thought experiments are not merely armchair-based reasoning; they are empirically informed and refer to the world of experience. Another way to put this is to say that thought experiments are ‘intuition pumps’ in that they induce the formation of intuitions, always in relation to the empirical understanding of the world. The intuitions arising in hypothesizing might often appear counterintuitive, even to scientists themselves, precisely because scientific intuitions are never theory-free. There is a strong connection between the theoretical knowledge and understanding of a scientist’s task and the intuitive responses elicited during hypothesizing.
3.3. Intuitions and theory-ladenness
Now the question is whether and to what degree theory shapes scientific intuitions. We begin with the supposition that all observation is theory-laden. This means that no observation is free from theoretical beliefs (Franklin 2015). To illustrate this better, consider Duhem’s example:
Enter a laboratory; approach the table crowded with an assortment of apparatus, an electric cell, silk-covered copper wire, small cups of mercury, spools, a mirror mounted on an iron bar; the experimenter is inserting into small openings the metal ends of ebony-headed pins; the iron oscillates, and the mirror attached to it throws a luminous band upon a celluloid scale; the forward-backward motion of this spot enables the physicist to observe the minute oscillations of the iron bar. But ask him what he is doing. Will he answer 'I am studying the oscillations of an iron bar which carries a mirror'? No, he will say that he is measuring the electric resistance of the spools. If you are astonished, if you ask him what his words mean, what relation they have with the phenomena he has been observing and which you have noted at the same time as he, he will answer that your question requires a long explanation and that you should take a course in electricity (Hanson, 1958, pp. 16-17). [1]
This excerpt invites two remarks concerning intuitions. First, it is possible that the more theory a scientist has internalized, the more intuitions she will have concerning her specialized area of research. In this sense, theory feeds the capacity for intuition and makes it more accurate, productive, and useful. Second, theory contributes to shaping very specific intuitions. This means that intuitive responses in scientific work often lead from one step to the next in a more or less straightforward (at least in hindsight) procedure of problem-solving. A scientist’s intuitions, when she is trying to find the solution to a particular problem, help her put forth specific answers which gradually lead to the end result. Intuitions are thus so specific because they are products of an intense intellectual activity that delves deep into conceptual analysis and the elucidation of the scientific notions and problems under examination.
If theory and intuition are so interconnected, it is reasonable to ask whether we can distinguish between the two in scientific reasoning. Drawing from Hanson’s understanding of theory-ladenness and the literature on how perception and observation are theory-laden[2], the most plausible claim is that intuition in science is just as theory-mediated. There cannot be scientific intuition that is isolated from the theoretical study and training of the scientist. Just as it makes no sense to ask a scientist to observe outside a theoretical framework, it makes no sense for a scientist to have intuitions that are not intuitions about something. As a result, it appears that during the puzzle-solving phase of scientific activity, theory and intuition are hard to distinguish. When working within a paradigm, a scientist’s intuitions seem to be part of the theoretical apparatus just as much as her explicit knowledge of the subject. The analogy between observation and experiments, on the one hand, and intuition and thought experiments, on the other (Booth and Rowbottom 2014), helps us reach the conclusion that if observation is theory-laden, then intuition is also theory-laden.
4. Social epistemology of expertise and intuitions
4.1. Scientists as responsible epistemic agents
If knowledge is power, then expertise is even more power. And power creates responsibility. The normative framework of my epistemological account of intuitions treats scientists as responsible epistemic agents, namely, agents whose knowledge confers unique responsibilities. But before examining what this responsibility implies, it is useful to ask the following questions: do scientific experts have ‘better intuitions’ than laypeople? And who gets to determine that? The way one answers these questions is crucial for framing the implications of scientists’ moral and social responsibilities.
According to the expertise defense, two premises help us answer these questions. First, “the intuitive judgments of experts are different from the judgments of novices in a way that is relevant to the truth or falsity of such judgments” (Mizrahi 2015, 53). Second, “the intuitive judgments of experts are better than the intuitive judgments of novices in a way that is relevant to the truth or falsity of such judgments” (ibid.). Note, however, that the expertise defense is originally an argument about philosophy experts and non-experts. Nado points out that in scientific domains, professionals are granted epistemic superiority by default (Nado 2015), because they exhibit refined cognitive skills that laypeople have not developed (Williamson 2011). Their expertise rests upon their training and experience. A simplistic yet appealing analogy: if scientists and laypeople were computers, the former would have both superior software and far more data than the latter.
Moreover, ‘better intuitions’ is a vague term; it can mean that they are more accurate and closer to the truth, or that they are generally more reliable, pragmatically speaking. Focusing on reliability is generally more productive, as it avoids unnecessary metaphysical assumptions about the nature of truth. So, the question is why one should rely upon one’s intuition. The answer is simple and pragmatist: scientists rely on and trust their intuitions to the extent that these bring them the desired results in creating testable hypotheses, solving problems, and formulating theories with high predictive force.
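To make the pragmatist reading concrete, here is a minimal sketch that treats reliance on intuition as trust in a track record. This is my toy illustration under stated assumptions, not a model from the cited literature; the class name, the prior, and the numbers are all made up. The idea is simply that trust in one’s hunches rises or falls with their history of producing fruitful hypotheses.

```python
# A toy model of pragmatic reliance on intuition: trust tracks the
# success rate of past hunches. All names and numbers are illustrative.

class HunchTracker:
    def __init__(self):
        self.hits = 0    # hunches that led to a fruitful hypothesis
        self.misses = 0  # hunches that led nowhere

    def record(self, fruitful: bool) -> None:
        if fruitful:
            self.hits += 1
        else:
            self.misses += 1

    def trust(self) -> float:
        """Estimated reliability with a uniform prior (Laplace's rule)."""
        return (self.hits + 1) / (self.hits + self.misses + 2)

tracker = HunchTracker()
for outcome in [True, True, False, True]:
    tracker.record(outcome)
print(round(tracker.trust(), 2))  # 0.67: trust grows with the track record
```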
Now, if we grant that scientists are responsible epistemic agents and that ‘they know better’, we should be able to propose that they ought to ‘do better’ as well. A set of responsibilities follows: they have to make the right choices about how their inventions will affect social reality. For example, some inventions carry inherent dangers, such as nuclear technologies or artificial intelligence. If scientists are not careful with inventions of this sort, they should be liable for “culpable ignorance” (Medina 2013), which is the product of epistemic vice: not knowing the potential consequences when one ought to know them. Further, scientists are to be held accountable for who gets access to scientific knowledge that could expose humanity to irreparable catastrophes if left in the wrong hands. Lastly, they are responsible for the communication and popularization of science in the social sphere as well as for the education of the next generations. From all the above, the interplay between the scientific and the social becomes apparent.
4.2. Reflective equilibrium
Since science is a social activity, pre-scientific and scientific intuitions co-exist, at least to some degree. This suggests that scientists have to be able to reconcile the two kinds of intuitions and reach a reflective equilibrium. Reaching a reflective equilibrium is a complex process; it involves striking a balance between the folk intuitions the scientist had before her training, and may still have when she is not occupied with her research, and the intuitions that are part of her scientific deliberation and work. Achieving this balance is difficult because no scientist can completely abandon her layperson identity (and this might not be desirable either).
Usually, the term ‘reflective equilibrium’ (Rawls 1999) is used in the broader context of moral reasoning and describes the attempt to reconcile one’s pre-theoretical beliefs with the beliefs acquired after consideration of principles, reflection, and theoretical analysis. To apply this to how intuitions function in scientific practice, suppose that the more a scientist delves into her research, the more her way of reasoning shifts towards something alien to her pre-scientific understanding and reasoning methods. At first glance, this seems useful and productive; it is what a scientist’s cognitive skill consists of, after all. But if we assume that scientists have ‘better intuitions’, does that imply that everyone should ultimately cultivate scientifically enriched intuitions?
A distinction is appropriate at this point. The reflective equilibrium can concern deliberations that are scientific but do not straightforwardly address the internal questions of a scientific task (an internal question asks, for instance, what the next move is in solving a mathematical equation). Such deliberations are interconnected with socio-political decision-making and tend to have an immediate social impact (for example, determining which age groups should receive a vaccine first during a pandemic). Alternatively, the equilibrium can concern a more ‘isolated’ scientific task, in which case there might be conflict between folk and scientific intuitions. This usually happens when a scientific theory is counterintuitive relative to the everyday understanding of the world (which is very common in contemporary science). A well-trained scientist resists the feeling that counterintuitiveness evokes and can continue her research without interruption. The more she has internalized the rules of her paradigm, the less counterintuitive they appear.
From this, it does not follow that everyone, experts and laypeople alike, ought to lean towards a more scientific mindset and thus have scientific intuitions. The kind of intuitions one has, and therefore whether it is necessary to reach a reflective equilibrium, depends on one’s social role and the responsibilities attached to it. For scientists, the responsibility is twofold. They have to be able to resist the pull of the counterintuitive when working on a specific problem. At the same time, they have to serve the purposes of communicating science to society and educating citizens who lack a scientific background but need to be scientifically informed to play an active and responsible social role (for instance, government officials). A scientist has to find the golden mean between what is purely scientifically accurate (if there is such a thing) and what would contribute to the common good. It makes sense to argue here that moral and scientific intuitions dynamically interact with each other, and this is why reaching a reflective equilibrium is central to decision-making.
4.3. Intuition and bias
So far, it has been made clear that intuition is a multidimensional and complex cognitive faculty that generally makes scientists more skillful. However, there seems to be a connection between intuition and cognitive bias. In particular, implicit bias might itself be a form of intuition; implicit bias can take the form of discrimination while lacking explicit prejudiced motivation (Madva and Brownstein 2017). This relates to scientific activity in three ways. The first concerns a scientist’s openness within her field. In other words, it is about a scientist’s intellectual agility, flexibility, and willingness to adjust to new theories and new data without attachment to one theory or one way of interpretation. Sometimes scientists refuse to give up a theory because they strongly believe in its truth or because it is part of their traditional scientific background. This is how bias becomes manifest in science.
To understand this better, consider a case study of how a scientist’s intuition can function as cognitive bias and prevent her from espousing a theory and working within a different paradigm: the example of Max Planck. Despite being prolific and one of the most successful physicists of his time, Planck remained attached to the idea that the propagation of light would be impossible in empty space, because of the metaphysical posit that there must be a medium, the ether (Goldberg 2015). The notion of the ether has a long record in the history of science, dating back to ancient Greek astronomy and cosmology. In that sense, it is not entirely unreasonable that a scientist was unwilling to abandon it; a scientist’s tradition can indeed exercise heavy influence on her way of thinking and conceptualizing. The appeal to one’s tradition, however, is in itself a weak strategy. It suggests that authority is more valuable than evidence and empirical research, a claim that scientists (should) fight against.
Another way to make sense of the connection between cognitive bias and science concerns the phenomenon of self-reflexive cognitive bias (Mugg and Khalidi 2021). In a nutshell, the problem is whether scientists who study cognitive bias (usually cognitive scientists) are subject to the very biases they investigate. Their intuition might be that, since this is their area of research, they are deeply aware of how bias can influence one’s thinking, and so they are more cautious and less vulnerable. But their intuition can also be misleading, or simply a self-fulfilling prophecy; indeed, intuition itself can be a form of bias in this case. Even so, biases can function as useful heuristics for obtaining quick results and increasing effectiveness in problem-solving.
The last point regarding intuition and bias addresses the value judgments scientists make in relation to their discipline. A scientist might have the intuition that scientific knowledge provides her with privileged epistemic access to how the world works. This idea of the superiority of science often leads to scientism, which encapsulates a form of bias that favors science over any other method of obtaining knowledge and navigating the world. This can lead to an exaggerated reverence towards science (Haack 2007) and turn it into an unquestioned epistemic authority. The intuition of the superiority of science is not limited to scientific circles. Scientists and laypeople often share it, and this is largely justified: it rests upon the secularized education of the Western world, the instrumentalization of science for political purposes, and the pragmatic benefits of scientific and technological advancements. It remains, however, an intuition worth questioning.
5. The social role of scientific experts
Treating scientists as responsible epistemic agents involves the assumption that their intuitions have a higher epistemic status than the intuitions of laypeople. As a society, we often (though not homogeneously) rely on scientific intuitions and trust the community of experts when it comes to important decisions. The example of a global pandemic helps show how scientific intuitions function as a guide for action when uncertainty prevails. At times when societies count on scientific intuitions, we essentially observe science in the making. Indeed, at the beginning of a pandemic, treatment protocols are created based on already established scientific knowledge and the intuition of experts. The privileged epistemic status of scientific intuitions is a matter of past success, that is, of effective problem-solving and predictive power.
Generally, the responsibility of scientists seems to stem precisely from the fact that they are privileged epistemic agents. Knowledge and better intuitions create moral obligations; scientists ought to use their privilege to contribute to the common good by offering guidance in difficult decisions and by informing citizens and helping them gain an accurate understanding of what is at stake. Intuitions, therefore, also play a pedagogical role: they allow scientists to tell what is appropriate to communicate to the public, and in what manner. These intuitions function as a bridge between scientific and folk intuitions. Scientists might intuitively figure out the right way to popularize the results of their research and shape the impact of their work upon the social domain.
An open question is what happens when scientific intuitions fail. One view suggests that scientists offer their best theory given the data they have at a specific time; if they are proven wrong in the future, they are not blameworthy. This view presupposes a very light interpretation of moral responsibility. The failure of scientific intuitions often translates into significant social cost. While scientists might have good intentions and high moral standards, social reality tends to be more complicated. The political dynamics and financial motives underlying decisions presented as scientific might make one rather suspicious when it comes to blame attribution. What is needed is well-functioning institutions (Mantzavinos 2020) that enable criticism of scientific activity and regulate the various power relations, in order to avoid arbitrariness and to ensure that we are justified in trusting scientists and their intuitions.
Achieving a reflective equilibrium is a desideratum in the socio-political domain as well, when it comes to judging scientific intuitions. In other words, it is not just experts who are responsible for striking a balance between their different intuitions and beliefs, but also laypeople. Individuals in positions of power and common citizens alike should be able to weigh the various intuitions, try to understand the ways in which science is communicated and popularized, and judge for themselves what it means to reconcile folk and scientific intuitions. Of course, reaching reflective equilibrium is a demanding intellectual task. Perhaps scientists are again in a position of privilege compared to laypeople, as their training equips them with richer cognitive skills; again, this adds to their social and moral responsibility. From all the above, it is safe to conclude that the epistemology of intuitions in science expands beyond the limits of scientific activity; it addresses questions and problems that touch our social, moral, and political lives and invites further deliberation on the interconnection between science and society.
6. Conclusion
Intuitions occupy a central place in science. Studying their epistemology and applying it to scientific reasoning and practice serves two purposes. First, it can provide insights concerning the exercise of science itself and how the scientific community functions. It helps us elucidate how scientists really think and come up with their theories, what kind of tacit knowledge is involved in this process, and how their training transforms them into perceptive and creative individuals. The process of hypothesizing and constructing a theory would not be possible without a highly refined intuitive faculty. Indeed, intuition appears to operate as a driving force in ‘doing science’.
Second, it can cast light on the complex interconnection between science and society. Science is a social activity; its epistemology cannot be separated from the social and political domain. The epistemology of scientific intuitions can deepen our understanding of the role of experts in society and motivate the appropriate questions concerning their moral responsibilities. This can be useful when important decisions have to be made, especially in times of crisis. Nevertheless, the role of expertise in society remains an ongoing debate. The task of determining whether experts are worth trusting goes beyond the analysis of whether experts are better intuiters than laypeople. We should strive to ensure that constructive criticism of scientific activity and scientists is possible, in order to find satisfying justifications whenever we collectively assume that scientific intuitions are reliable.
References
Blancke, Stefaan, Koen B. Tanghe, and Johan Braeckman. 2018. “Intuitions in Science Education and the Public Understanding of Science.” Perspectives on Science and Culture, 223.
Bohm, David. 2003. On Creativity. Edited by Lee Nichol. 1st edition. Routledge.
Booth, Anthony Robert, and Darrell P. Rowbottom, eds. 2014. Intuitions. Illustrated edition. Oxford, United Kingdom: Oxford University Press.
Brewer, William F., and Bruce L. Lambert. 2001. “The Theory-Ladenness of Observation and the Theory-Ladenness of the Rest of the Scientific Process.” Philosophy of Science 68 (S3): S176–86. https://doi.org/10.1086/392907.
Brock, Richard. 2015. “Intuition and Insight: Two Concepts That Illuminate the Tacit in Science Education.” Studies in Science Education 51 (2): 127–67. https://doi.org/10.1080/03057267.2015.1049843.
Duhem, Pierre, and Paul Brouzeng. 2007. La théorie physique : Son objet, sa structure. Paris: Vrin.
Franklin, Allan. 2015. “The Theory-Ladenness of Experiment.” Journal for General Philosophy of Science 46 (1): 155–66. https://doi.org/10.1007/s10838-015-9285-9.
Goldberg, Stanley. 2015. “Max Planck’s Philosophy of Nature and His Elaboration of the Special Theory of Relativity.” In Historical Studies in the Physical Sciences, Volume 7, edited by Russell McCormmach, 125–60. Princeton University Press. https://doi.org/10.1515/9781400870189-004.
Haack, Susan. 2007. Defending Science-Within Reason: Between Scientism And Cynicism. Amherst, N.Y: Prometheus.
Hanson, Norwood Russell. 1958. Patterns of Discovery: An Inquiry into the Conceptual Foundations of Science. 1st edition. Cambridge: Cambridge University Press.
Kuhn, Thomas S. 2012. The Structure of Scientific Revolutions: 50th Anniversary Edition. 4th edition. Chicago; London: University of Chicago Press.
Liao, S. Matthew. 2011. “Bias and Reasoning: Haidt’s Theory of Moral Judgment.” In New Waves in Ethics, 108–27. Springer.
Madva, Alex, and Michael Brownstein. 2017. “Stereotypes, Prejudice, and the Taxonomy of the Implicit Social Mind.” Noûs 52 (January). https://doi.org/10.1111/nous.12182.
Mantzavinos, C. 2020. “Science, Institutions, and Values.” European Journal of Philosophy, September. https://doi.org/10.1111/ejop.12579.
Medina, José. 2013. “Epistemic Responsibility and Culpable Ignorance.” In The Epistemology of Resistance. New York: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199929023.003.0004.
Mizrahi, Moti. 2015. “Three Arguments against the Expertise Defense.” Metaphilosophy 46 (1): 52–64.
Mugg, Joshua, and Muhammad Ali Khalidi. 2021. “Self-Reflexive Cognitive Bias.” European Journal for Philosophy of Science 11 (3): 88. https://doi.org/10.1007/s13194-021-00404-2.
Nado, Jennifer. 2015. “Philosophical Expertise and Scientific Expertise.” Philosophical Psychology 28 (7): 1026–44. https://doi.org/10.1080/09515089.2014.961186.
Popper, Karl. 2002. The Logic of Scientific Discovery. 2nd edition. London: Routledge.
Rawls, John. 1999. A Theory of Justice. 2nd edition. Cambridge, Mass: Belknap Press: An Imprint of Harvard University Press.
Williamson, Timothy. 2004. “Philosophical ‘Intuitions’ and Scepticism about Judgement.” Dialectica 58 (1): 109–53.
———. 2011. “Philosophical Expertise and the Burden of Proof.” Metaphilosophy 42 (April): 215–29. https://doi.org/10.1111/j.1467-9973.2011.01685.x.
[1] Originally found in (Duhem and Brouzeng 2007).
[2] For more on this, see (Brewer and Lambert 2001) and (Franklin 2015).