# Daniel Waxman

I'm an Assistant Professor of Philosophy at Lingnan University, Hong Kong. My research is mostly in the philosophy of mathematics, logic, and epistemology, but I have wide additional interests in metaphysics, metaethics, normative ethics, and political philosophy.

Before coming to Lingnan, I was a Junior Research Fellow at Corpus Christi College, Oxford. I received my PhD in Philosophy from New York University in 2017, where I wrote my dissertation under the supervision of Crispin Wright, Hartry Field, and Cian Dorr. As an undergraduate, I studied Maths and Philosophy at Worcester College, Oxford.

My current research focuses mostly on issues of truth and consistency. I'm interested in making sense of pluralist views about mathematics, according to which (roughly) any consistent mathematical theory is true. For related reasons, I think a lot about theories of truth, formal and otherwise, and about the epistemology of consistency: how we might ever come to have justification or knowledge that our best theories are consistent (especially in light of limitative results like Gödel's incompleteness theorems).

Here is a link to my CV.

My email address is danielwaxman@gmail.com.

## Publications

- “A Metasemantic Challenge for Mathematical Determinacy” with Jared Warren [Abstract | Penultimate Draft | Published]
*Synthese* (forthcoming)

This paper investigates the determinacy of mathematics. We begin by clarifying how we are understanding the notion of determinacy (section 1) before turning to the questions of whether and how famous independence results bear on issues of determinacy in mathematics (section 2). From there, we pose a metasemantic challenge for those who believe that mathematical language is determinate (section 3), motivate two important constraints on attempts to meet our challenge (section 4), and then use these constraints to develop an argument against determinacy (section 5) before offering some brief closing reflections (section 6). We believe our discussion poses a challenge for many significant philosophical theories of mathematics, and puts considerable pressure on views that accept a non-trivial amount of determinacy for even basic arithmetic.

- “Deflationism, Arithmetic, and the Argument from Conservativeness” [Abstract | Penultimate Draft | Published]
*Mind* (2017) 126: 429–463

Many philosophers believe that a deflationary truth theory must conservatively extend any base theory to which it is added (put roughly: talking about truth shouldn't allow us to establish any new claims about any subject-matter that doesn't involve truth). But when applied to arithmetic, it's argued, the imposition of a conservativeness requirement leads to a serious objection to deflationism: for the Gödel sentence for Peano Arithmetic (PA) is not a theorem of PA, but becomes a theorem when PA is extended by adding certain appealing principles governing truth.

I argue in this paper that no such objection succeeds. The issue turns on how we understand the notion of logical consequence implicit in any conservativeness requirement, and whether or not we possess a categorical conception of the natural numbers (i.e. whether we can rule out so-called "non-standard models"). I offer a disjunctive response: if we do possess a categorical conception of arithmetic, then deflationists have principled reason to accept a conservativeness requirement stated in terms of a rich notion of logical consequence according to which the Gödel sentence follows from PA. But if we do not, then the reasons for requiring the derivation of the Gödel sentence lapse, and deflationists are free to accept a conservativeness requirement stated proof-theoretically. Either way, deflationism is in the clear.

[A reply by Julian Murzi and Lorenzo Rossi, entitled "Conservative Deflationism?", is forthcoming in *Philosophical Studies*.]
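For readers unfamiliar with the background, the formal situation driving the objection can be sketched as follows. (This is standard material from the literature on axiomatic truth, not part of the paper itself; "CT[PA]" here denotes PA extended with compositional truth axioms for a predicate Tr, with the induction scheme extended to the enlarged language.)

```latex
% Within CT[PA] one proves, by induction on the length of proofs, that
% every theorem of PA is true, while no sentence of the form 0 = 1 is
% true; hence the theory proves PA's consistency statement:
\mathrm{CT}[\mathrm{PA}] \vdash \mathrm{Con}(\mathrm{PA}).
% By the diagonal construction, provably within PA itself, the
% G\"odel sentence is equivalent to that consistency statement:
\mathrm{PA} \vdash G_{\mathrm{PA}} \leftrightarrow \mathrm{Con}(\mathrm{PA}).
% So the G\"odel sentence, unprovable in PA (assuming PA is
% consistent), becomes a theorem of the truth theory:
\mathrm{CT}[\mathrm{PA}] \vdash G_{\mathrm{PA}}.
```

The conservativeness debate is then over whether this extra arithmetical strength is compatible with the deflationist's claim that truth is an "insubstantial" notion.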

## Work in Progress (please feel free to email me for drafts or more information)

- “Truth, Transmission Failure, and the Justification of Consistency” [Abstract]

Many who have considered the subject believe that our best mathematical theories (e.g. arithmetic, analysis, and set theory) are consistent. But, in light of Gödel's Incompleteness Theorems and for other reasons, it is not at all obvious how this attitude can be rationally defended. In this paper I consider one influential argument (schema) for the consistency of mathematical theories, going via the notion of truth. Roughly, it runs as follows: the axioms of the theory are true, the inference rules of logic preserve truth, so all of the theorems are true; but if the theory were inconsistent, it would have false theorems; so it must be consistent. I argue that this truth-theoretic argument is, on many plausible views of mathematics, epistemically useless: in the sense much discussed in recent epistemology, it is incapable of transmitting justification from its premises to its conclusion.

- “Did Gentzen Prove the Consistency of Arithmetic?” [Abstract]

In 1936, Gerhard Gentzen famously gave a proof of the consistency of Peano arithmetic. There is no disputing that Gentzen provided us with a mathematically valid argument. This paper addresses the distinct question of whether Gentzen's result is properly viewed as a proof in the *epistemic* sense: an argument that can be used to obtain or enhance justification in its conclusion. Although Gentzen himself believed that he had provided a “real vindication” of Peano arithmetic, many subsequent mathematicians and philosophers have disagreed, on the basis that the proof is epistemically circular or otherwise inert. After gently sketching the outlines of Gentzen's proof, I discuss a number of “informal equivalence theses” that seek to equate various formal mathematical systems with intuitive conceptions of mathematical subject-matters. In light of this discussion, I argue that the truth lies somewhere in between the claims of Gentzen and his critics: although the proof is indeed epistemically non-trivial, it falls short of constituting a real vindication of the consistency of Peano arithmetic.

- “Supertasks and Arithmetical Truth” with Jared Warren [Abstract]

Some recent work in the philosophy of physics has argued that general relativity is compatible with the existence of supertask computers, capable of running through infinitely many individual computations in a finite time. A natural thought is that, if true, this implies that arithmetical truth is determinate (at least for e.g. sentences saying that every number has a certain decidable property). In this paper we argue, via a careful analysis of putative arguments from supertasks to determinacy, that this natural thought is mistaken: supertasks are of no help in explaining arithmetical determinacy.

- “Stable and Unstable Theories of Truth and Syntax” with Beau Madison Mount [Abstract]

Recent work on formal theories of truth has revived an approach, due originally to Tarski, on which syntax and truth theories are sharply distinguished ('disentangled') from mathematical base theories. In this paper, we defend a novel philosophical constraint on disentangled theories: we argue that these theories must be epistemically stable in that they must possess an intrinsic motivation justifying no strictly stronger theory. We argue that in a disentangled setting, even if the base theory and the syntax theory are individually stable, they may nevertheless be jointly unstable. We contend that this flaw afflicts many proposals discussed in the literature. We go on to defend a new, stable disentangled theory: double second-order arithmetic.

- “Losing Confidence in Luminosity” with Simon Goldstein [Abstract]

A mental state is luminous if, whenever an agent is in that state, they are in a position to know that they are. Following Williamson's *Knowledge and Its Limits*, a wave of recent work has explored whether any interesting mental states are luminous. One powerful argument against luminosity comes from the connection between knowledge and confidence: that if an agent knows p, then p is true in any nearby world where she has a similar level of confidence in p. Unfortunately, the relevant notion of confidence in the principle above is relatively underexplored.

In this paper, we remedy this gap, providing a precise theory of confidence: an agent's degree of confidence in p is the objective chance they will act in ways that satisfy their desires if p. We use this theory of confidence to propose a variety of interesting constraints on knowledge. We argue that knowledge is not luminous, but for quite different reasons than the existing literature has considered.

- “Rescuing Implicit Definition from Abstractionism” [Abstract]

Neo-Fregeans in the philosophy of mathematics hold that the key to a correct understanding of mathematics is the implicit definition of mathematical language. In this paper, I discuss and advocate the rejection of abstractionism: the constraint (implicit within much of the recent neo-Fregean tradition) according to which all acceptable implicit definitions take the form of abstraction principles. I argue that if we take the axioms of mathematical theories themselves as implicit definitions, a much more attractive and unified view of mathematics results.

- “Is Mathematics Unreasonably Effective?” [Abstract]

Is mathematics unreasonably effective when applied to the natural sciences? Many mathematicians, physicists, and philosophers have argued that it is. I evaluate an argument for this conclusion by Mark Steiner, according to which (i) mathematics is good only insofar as it possesses certain aesthetic virtues, and (ii) the fact that such mathematics is of great use in applications results in a puzzling and problematic anthropocentrism. I respond by arguing that the aesthetic virtues used to evaluate and motivate mathematics are far less straightforwardly anthropocentric than one might presume, and (with reference to case-studies within Galois theory and probabilistic number theory) I show that these virtues are themselves epistemically relevant or at least track generally recognized theoretical virtues, such as explanatory and unifying power, fruitfulness, and importance.

- “The Role of Mathematical Truth in Benacerraf's Dilemma” [Abstract]

In *Mathematical Truth*, Paul Benacerraf famously raised a dilemma for the philosophy of mathematics: very roughly, that mathematics needs a reasonable semantics and a reasonable epistemology but that it's impossible to see how these needs can be simultaneously satisfied. In this paper I argue that (i) Benacerraf's semantic constraint is in fact ambiguous between a stronger and a weaker reading; (ii) only the stronger reading is incompatible with the idea that mathematical truth is an epistemically tractable property; and (iii) the stronger reading is under-motivated.

## Teaching

- At Lingnan:
- Modal Logic (Fall 2018)
- Critical Thinking: Logic and Argumentation (Fall & Spring 2018)

- At Oxford (tutorials/classes):
- Gödel's Incompleteness Theorems (Trinity 2017)
- Frege: Foundations of Arithmetic (Trinity 2017)
- Philosophy of Mathematics (Hilary 2017)
- Philosophical Logic (Hilary 2017)

- At NYU & the New School for Social Research (as primary instructor):
- At NYU (as TA):
- Great Works in Philosophy (Spring 2015, instructor Harvey Lederman)
- Ancient Philosophy (Fall 2014, instructor Jessica Moss)
- Texts and Ideas: Guilt and Sin, Law and Justice (Spring 2014, instructor Tim Maudlin)
- Political Philosophy (Fall 2013, instructor Aaron James)
- Life and Death (Spring 2013, instructor Joshua Gillon)
- Great Works in Philosophy (Fall 2012, instructor Jonathan Cottrell)