# Daniel Waxman

I'm an Assistant Professor of Philosophy at Lingnan University, Hong Kong. My research is mostly in the philosophy of mathematics, logic, and epistemology, and I have wide additional interests in metaphysics, metaethics, normative ethics, and political philosophy.

Before coming to Lingnan, I was a Junior Research Fellow at Corpus Christi College, Oxford. I received my PhD in Philosophy from New York University in 2017, where I wrote my dissertation under the supervision of Crispin Wright, Hartry Field, and Cian Dorr. As an undergraduate, I studied Maths and Philosophy at Worcester College, Oxford.

My current research focuses mostly on issues of truth, determinacy, and consistency. I'm interested in making sense of pluralist views about mathematics, according to which (roughly) any consistent mathematical theory is true. For related reasons, I think a lot about theories of truth, formal and otherwise, and about the epistemology of consistency: how we might ever come to have justification or knowledge that our best theories are consistent (especially in light of limitative results like Gödel's incompleteness theorems).

Here is a link to my CV.

My email address is danielwaxman@gmail.com

## Publications

- “Supertasks and Arithmetical Truth” with Jared Warren [Abstract | PDF coming soon]

*Philosophical Studies* (forthcoming)

Some recent work in the philosophy of physics has argued that general relativity is compatible with the existence of supertask computers, capable of running through infinitely many individual computations in a finite time. A natural thought is that, if true, this implies that arithmetical truth is determinate (at least for e.g. sentences saying that every number has a certain decidable property). In this paper we argue, via a careful analysis of putative arguments from supertask computations to determinacy, that this natural thought is mistaken: supertasks are of no help in explaining arithmetical determinacy.

- “A Metasemantic Challenge for Mathematical Determinacy” with Jared Warren [Abstract | PDF | Journal]

*Synthese* (forthcoming)

This paper presents a challenge for the view that mathematical truth is determinate. We begin by discussing the bearing of famous independence results on the issue, arguing that there is no straightforward argument from independence to indeterminacy. Nevertheless, we present a serious challenge for advocates of determinacy: fundamentally, to give a metasemantic explanation of how it can arise. We defend two constraints on any acceptable explanation and show that they can be used to develop a powerful argument against determinacy. We believe our discussion poses a challenge for many significant philosophical theories of mathematics, applying even to those which accept the determinacy of basic arithmetic.

- “Deflationism, Arithmetic, and the Argument from Conservativeness” [Abstract | PDF | Journal]

*Mind* (2017) 126: 429-463

Many philosophers believe that a deflationary truth theory must conservatively extend any base theory to which it is added (put roughly: talking about truth shouldn't allow us to establish any new claims about any subject-matter that doesn't involve truth). But when applied to arithmetic, it's argued, the imposition of a conservativeness requirement leads to a serious objection to deflationism: for the Gödel sentence for Peano Arithmetic (PA) is not a theorem of PA, but becomes a theorem when PA is extended by adding certain appealing principles governing truth.

I argue in this paper that no such objection succeeds. The issue turns on how we understand the notion of logical consequence implicit in any conservativeness requirement, and whether or not we possess a categorical conception of the natural numbers (i.e. whether we can rule out so-called "non-standard models"). I offer a disjunctive response: if we do possess a categorical conception of arithmetic, then deflationists have principled reason to accept a conservativeness requirement stated in terms of a rich notion of logical consequence according to which the Gödel sentence follows from PA. But if we do not, then the reasons for requiring the derivation of the Gödel sentence lapse, and deflationists are free to accept a conservativeness requirement stated proof-theoretically. Either way, deflationism is in the clear.
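The non-conservativeness phenomenon at issue can be displayed schematically. (A sketch; here CT[PA] stands for PA extended with a compositional truth theory whose induction scheme is extended to the truth predicate, one standard way of cashing out the "appealing principles", and G_PA is the Gödel sentence for PA.)

```latex
\mathrm{PA} \nvdash G_{\mathrm{PA}},
\qquad
\mathrm{CT}[\mathrm{PA}] \vdash \mathrm{Con}(\mathrm{PA}),
\qquad
\mathrm{PA} \vdash \mathrm{Con}(\mathrm{PA}) \leftrightarrow G_{\mathrm{PA}}.
```

So CT[PA] proves a purely arithmetical sentence that PA itself cannot, and the truth theory fails to be proof-theoretically conservative over PA.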

[A reply by Julian Murzi and Lorenzo Rossi, entitled "Conservative Deflationism?", is forthcoming in *Philosophical Studies*.]

## Work in Progress

- “Truth, Transmission Failure, and the Justification of Consistency” [Abstract]

Many who have considered the subject believe that our best mathematical theories (e.g. arithmetic, analysis, and set theory) are consistent. But, in light of Gödel's incompleteness theorems and for other reasons, it is not at all obvious how this attitude can be rationally defended. In this paper I consider one influential argument (schema) for the consistency of mathematical theories, going via the notion of truth. Roughly, it runs as follows: the axioms of the theory are true, the inference rules of logic preserve truth, so all of the theorems are true; but if the theory were inconsistent, it would have false theorems; so it must be consistent. I argue that this truth-theoretic argument is, on many plausible views of mathematics, epistemically useless: in the sense much discussed in recent epistemology, it is incapable of transmitting justification from its premises to its conclusion.

- “Did Gentzen Prove the Consistency of Arithmetic?” [Abstract]

In 1936, Gerhard Gentzen famously gave a proof of the consistency of Peano arithmetic. There is no disputing that Gentzen provided us with a mathematically valid argument. This paper addresses the distinct question of whether Gentzen's result is properly viewed as a proof in the *epistemic* sense: an argument that can be used to obtain or enhance justification in its conclusion. Although Gentzen himself believed that he had provided a “real vindication” of Peano arithmetic, many subsequent mathematicians and philosophers have disagreed, on the basis that the proof is epistemically circular or otherwise inert. After gently sketching the outlines of Gentzen's proof, I discuss a number of “informal equivalence theses” that seek to equate various formal mathematical systems with intuitive conceptions of mathematical subject-matters. In light of this discussion, I argue that the truth lies somewhere in between the claims of Gentzen and his critics: although the proof is indeed epistemically non-trivial, it falls short of constituting a real vindication of the consistency of Peano arithmetic.

- “Stable and Unstable Theories of Truth and Syntax” with Beau Madison Mount [Abstract]

Recent work on formal theories of truth has revived an approach, due originally to Tarski, on which syntax and truth theories are sharply distinguished -- ‘disentangled’ -- from mathematical base theories. In this paper, we defend a novel philosophical constraint on disentangled theories: we argue that these theories must be epistemically stable in that they must possess an intrinsic motivation justifying no strictly stronger theory. We argue that in a disentangled setting, even if the base theory and the syntax theory are individually stable, they may nevertheless be jointly unstable. We contend that this flaw afflicts many proposals discussed in the literature. We go on to defend a new, stable disentangled theory: double second-order arithmetic.

- “Losing Confidence in Luminosity” with Simon Goldstein [Abstract]

A mental state is luminous if, whenever an agent is in that state, they are in a position to know that they are. Following Timothy Williamson's *Knowledge and Its Limits*, a wave of recent work has explored whether there are any non-trivial luminous mental states. A version of Williamson's powerful argument that none exist appeals to a safety-theoretic principle connecting knowledge and confidence: if an agent knows p, then p is true in any nearby scenario where she has a similar level of confidence in p. However, the notion of confidence in this safety principle is relatively underexplored.

This paper attempts to remedy the gap by providing a precise theory of confidence: an agent's degree of confidence in p is the objective chance they will rely on p in practical reasoning and action. This theory of confidence is then used to critically evaluate the anti-luminosity argument, leading to the surprising conclusion that although there are strong reasons for thinking that luminosity does not obtain, they are quite different from those the existing literature has considered.

- “Rescuing Implicit Definition from Abstractionism” [Abstract]

Neo-Fregeans in the philosophy of mathematics hold that the key to a correct understanding of mathematics is the implicit definition of mathematical language. In this paper, I discuss and advocate the rejection of abstractionism: the constraint (implicit within much of the recent neo-Fregean tradition) according to which all acceptable implicit definitions take the form of abstraction principles. I argue that if we take the axioms of mathematical theories themselves as implicit definitions, a much more attractive and unified view of mathematics results.

- “Is Mathematics Unreasonably Effective?” [Abstract]

Is mathematics unreasonably effective when applied to the natural sciences? Many mathematicians, physicists, and philosophers have argued that it is. I evaluate an argument for this conclusion by Mark Steiner, according to which (i) mathematics is good only insofar as it possesses certain aesthetic virtues, and (ii) the fact that such mathematics is of great use in applications results in a puzzling and problematic anthropocentrism. I respond by arguing that the aesthetic virtues used to evaluate and motivate mathematics are far less straightforwardly anthropocentric than one might presume, and (with reference to case-studies within Galois theory and probabilistic number theory) I show that these virtues are themselves epistemically relevant or at least track generally recognized theoretical virtues, such as explanatory and unifying power, fruitfulness, and importance.

- “The Role of Mathematical Truth in Benacerraf's Dilemma” [Abstract]

In *Mathematical Truth*, Paul Benacerraf famously raised a dilemma for the philosophy of mathematics: very roughly, that mathematics needs a reasonable semantics and a reasonable epistemology but that it's impossible to see how these needs can be simultaneously satisfied. In this paper I argue that (i) Benacerraf's semantic constraint is in fact ambiguous between a stronger and a weaker reading; (ii) only the stronger reading is incompatible with the idea that mathematical truth is an epistemically tractable property; and (iii) the stronger reading is under-motivated.

## Teaching

- Lingnan:
- The Nature of Truth (Spring 2019)
- Modal Logic (Fall 2018) [Syllabus]
- Critical Thinking: Logic and Argumentation (Fall & Spring 2018)

- Oxford (tutorials/classes):
- Gödel's Incompleteness Theorems (Trinity 2017)
- Frege: Foundations of Arithmetic (Trinity 2017)
- Philosophy of Mathematics (Hilary 2017)
- Philosophical Logic (Hilary 2017)

- NYU & the New School for Social Research (as primary instructor):
- NYU (as TA):
- Great Works in Philosophy (Spring 2015, instructor Harvey Lederman)
- Ancient Philosophy (Fall 2014, instructor Jessica Moss)
- Texts and Ideas: Guilt and Sin, Law and Justice (Spring 2014, instructor Tim Maudlin)
- Political Philosophy (Fall 2013, instructor Aaron James)
- Life and Death (Spring 2013, instructor Joshua Gillon)
- Great Works in Philosophy (Fall 2012, instructor Jonathan Cottrell)