Similar Documents
20 similar documents found (search time: 609 ms)
1.
Paul Horwich’s Use Theory of Meaning (UTM) depends on his rejection of Paul Grice’s distinction between natural and non-natural meaning and on his Univocality of Meaning Thesis, as he wishes to deflate the meaning-relation to usage. Horwich’s programme of deflating the meaning-relation (i.e. how words, sentences, etc., acquire meaning) to some basic regularity of usage cannot be carried through if the meaning-relation depends on the minds of users. Here, I first give a somewhat detailed account of the distinction between natural and non-natural meaning in order to set the stage for Horwich’s critique of it. I then present Horwich’s critique of the distinction and show how that rejection accords with his overall view of meaning as use. Horwich’s rejection of the distinction between natural and non-natural meaning, I argue in the last section, is ill-founded, and because UTM depends on this rejection, UTM is stillborn.

2.
In early analytic philosophy, one of the most central questions concerned the status of arithmetical objects. Frege argued against the popular conception that we arrive at natural numbers with a psychological process of abstraction. Instead, he wanted to show that arithmetical truths can be derived from the truths of logic, thus eliminating all psychological components. Meanwhile, Dedekind and Peano developed axiomatic systems of arithmetic. The differences between the logicist and axiomatic approaches turned out to be philosophical as well as mathematical. In this paper, I will argue that Dedekind’s approach can be seen as a precursor to modern structuralism and as such, it enjoys many advantages over Frege’s logicism. I also show that from a modern perspective, Frege’s criticism of abstraction and psychologism is one-sided and fails against the psychological processes that modern research suggests to be at the heart of numerical cognition. The approach here is twofold. First, through historical analysis, I will try to build a clear image of what Frege’s and Dedekind’s views on arithmetic were. Then, I will consider those views from the perspective of modern philosophy of mathematics, and in particular, the empirical study of arithmetical cognition. I aim to show that there is nothing to suggest that the axiomatic Dedekind approach could not provide a perfectly adequate basis for philosophy of arithmetic.
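For reference, the Dedekind–Peano axioms discussed above can be stated as follows (a standard first-order formulation with an induction schema; Dedekind’s own formulation was second-order):

\[
\begin{aligned}
&\text{(P1)}\;\; S(n) \neq 0 &&\text{(0 is not a successor)}\\
&\text{(P2)}\;\; S(m) = S(n) \rightarrow m = n &&\text{(the successor function is injective)}\\
&\text{(P3)}\;\; \big(\varphi(0) \wedge \forall n\,(\varphi(n) \rightarrow \varphi(S(n)))\big) \rightarrow \forall n\,\varphi(n) &&\text{(induction, one axiom per formula } \varphi\text{)}
\end{aligned}
\]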

3.
Formal axiomatic theories based on the three-valued logic of Łukasiewicz are considered. Main notions related to these theories, in particular, those of Luk-model, Luk-consistent theory, and Luk-complete theory, are introduced. Logical calculi that describe such theories are defined; counterparts of the classical compactness and completeness theorems are proved. Theories of arithmetic based on Łukasiewicz’s logic and on its constructive (intuitionistic) variant are investigated; the theorem on effective Luk-incompleteness is proved for a large class of arithmetic systems. This theorem is a three-valued counterpart of the famous Gödel theorem on the incompleteness of formal theories. Three-valued counterparts of Presburger’s arithmetic system are defined and proved to be Luk-complete but incomplete in the classical sense. Bibliography: 29 titles. Published in Zapiski Nauchnykh Seminarov POMI, Vol. 304, 2002, pp. 19–74.
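For orientation, here is a minimal sketch of the standard truth functions of Łukasiewicz’s three-valued logic (values 0, 1/2, 1). The illustration and function names are mine, not the paper’s:

    # Łukasiewicz three-valued logic: values 0 (false), 1/2 (undetermined), 1 (true).
    # Negation and implication are primitive; disjunction and conjunction are derived.
    from fractions import Fraction

    HALF = Fraction(1, 2)

    def neg(x):
        return 1 - x

    def implies(x, y):
        # Łukasiewicz implication: min(1, 1 - x + y)
        return min(1, 1 - x + y)

    def disj(x, y):
        # x ∨ y is definable as (x → y) → y, which equals max(x, y)
        return max(x, y)

    def conj(x, y):
        # x ∧ y, by De Morgan duality, equals min(x, y)
        return min(x, y)

    # A formula is a Łukasiewicz tautology if it takes value 1 under every assignment.
    # Example: excluded middle x ∨ ¬x fails at x = 1/2 (it takes value 1/2, not 1).
    assert disj(HALF, neg(HALF)) == HALF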

4.
Adam Hosein. Acta Analytica, 2013, 28(4): 495–508
Rawls developed a contractualist theory of social justice, and Scanlon attempted to extend the Rawlsian framework to develop a theory of rightness, or morality more generally. I argue that there are some good reasons to adopt a contractualist theory of social justice, but that it is a mistake to adopt a contractualist theory of rightness. I begin by illustrating the major features shared by Scanlon’s and Rawls’s theories. I then show that the justification for these features in Rawls’s theory, the centrality of cooperative fairness to social justice, cannot be used to defend their use in Scanlon’s. Finally, I argue that Scanlon has not provided an adequate alternative defense of these features, and show that they create problems when contractualists try to explain major features of our common-sense morality.

5.
The need for modifying axiomatic set theories was caused, in particular, by the development of category theory. The ZF and NBG axiomatic theories turned out to be unsuitable for defining the notion of a model of category theory. The point is that naïve category theory admits constructions such as the category of categories, while constructions like the set of sets are strongly restricted in the ZF and NBG axiomatic theories. Thus, it was required, on the one hand, to restrict constructions similar to the category of categories and, on the other hand, to adapt axiomatic set theory so as to give a definition of a category that survives such restricted constructions. This task was accomplished by introducing the axiom of universality (AU), asserting that each set is an element of a universal set closed under all NBG constructions. Unfortunately, in the theories ZF + AU and NBG + AU, there are too many universal sets (as many as there are ordinals), whereas to solve the problem stated above, a countable collection of universal sets would suffice. For this reason, in 2005, the first-named author introduced local-minimal set theory, which preserves the axiom AU of universality and has an at most countable collection of universal sets. This was achieved at the expense of rejecting the global replacement axiom and using a local replacement axiom for each universal class instead. Local-minimal set theory has 14 axioms and one axiom scheme (of comprehension). It is shown that this axiom scheme can be replaced by finitely many axioms that are special cases of the comprehension scheme. The proof follows Bernays’ scheme, with significant modifications required by the restricted predicativity condition on the formula in the comprehension axiom scheme.
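One way to render the axiom of universality schematically (my gloss, not the paper’s wording):

\[
\text{(AU)}\quad \forall x\,\exists U\,\big(x \in U \wedge \mathrm{Univ}(U)\big),
\]

where \(\mathrm{Univ}(U)\) abbreviates the condition that \(U\) is transitive and closed under the NBG constructions (pairing, union, power set, and so on).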

6.
Ludger Jansen. Metaphysica, 2007, 8(2): 211–220
After a short sketch of Lowe’s account of his four basic categories, I discuss his theory of formal ontological relations and how Lowe wants to account for dispositional predications. I argue that on the ontic level Lowe is a pan-categoricalist, while with regard to the dispositional/categorical distinction he is a language dualist and an exemplification dualist. I argue that Lowe does not present an adequate account of dispositions. From an Aristotelian point of view, Lowe conflates dispositional predication with hôs epi to poly statements about what is normally or mostly the case.

7.
We explore the relationship between Brouwer’s intuitionistic mathematics and Euclidean geometry. Brouwer wrote a paper in 1949 called The contradictority of elementary geometry. In that paper, he showed that a certain classical consequence of the parallel postulate implies Markov’s principle, which he found intuitionistically unacceptable. But Euclid’s geometry, having served as a beacon of clear and correct reasoning for two millennia, is not so easily discarded. Brouwer started from a “theorem” that is not in Euclid, and requires Markov’s principle for its proof. That means that Brouwer’s paper did not address the question whether Euclid’s Elements really requires Markov’s principle. In this paper we show that there is a coherent theory of “non-Markovian Euclidean geometry”. We show in some detail that our theory is an adequate formal rendering of (at least) Euclid’s Book I, and suffices to define geometric arithmetic, thus refining the author’s previous investigations (which include Markov’s principle as an axiom). Philosophically, Brouwer’s proof that his version of the parallel postulate implies Markov’s principle could be read just as well as geometric evidence for the truth of Markov’s principle, if one thinks the geometrical “intersection theorem” with which Brouwer started is geometrically evident.
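For reference, Markov’s principle is the following schema (standard statement, not quoted from the paper): for any decidable predicate \(\varphi\) on the natural numbers,

\[
\neg\neg\,\exists n\,\varphi(n) \;\rightarrow\; \exists n\,\varphi(n).
\]

Classically this is trivial; intuitionistically it licenses an unbounded search justified only by the impossibility of its failure, which is what Brouwer found unacceptable.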

8.
Alethic pluralists argue that truth is a metaphysically robust higher-order property that is multiply realized by a set of diverse and domain-specific subvening alethic properties. The higher-order truth property legitimizes mixed inferences and accounts for a univocal truth predicate. Absent this higher-order property, pluralists lack an account of the validity of mixed inferences and an adequate semantics for the truth predicate, and thereby appear forced to abandon the central tenets of alethic pluralism. I argue that the use of many-valued logics to support pluralism fails to address the pluralist’s metaphysical problem regarding mixed inferences and mixed truth-functional connectives. The high degree of heterogeneity of the alethic realizers (unlike the realizers for pain) challenges the plausibility of a single higher-order functional property. A functional property with such a heterogeneous base cannot be projectable at a theoretically significant level. The problem with mixed inferences and truth functions is but one symptom of this deeper projectability problem.

9.
In On What Matters, Derek Parfit defends a new metaethical theory, which he calls non-realist cognitivism. It claims (i) that normative judgments are beliefs; (ii) that some normative beliefs are true; (iii) that the normative concepts that figure in the propositions that are the contents of normative beliefs are irreducible, unanalysable, and of their own unique kind; and (iv) that neither the natural features of reality nor any additional normative features of reality make the relevant normative beliefs true. The aim of this article is to argue that Parfit’s theory is problematic because its defenders have no resources with which to make sense of the nature of normative truth, an essential element of their view. I do this by showing how the traditional theories of truth are not available to the non-realist cognitivists.

10.
Rough set theory is an important tool for approximate reasoning about data. Axiomatic systems of rough sets are significant for using rough set theory in logical reasoning systems. In this paper, the outer product method is used in the study of rough sets for the first time. By this approach, we propose a unified lower approximation axiomatic system for Pawlak’s rough sets and fuzzy rough sets. As the dual of the axiomatic system for lower approximation, a unified upper approximation axiomatic characterization of rough sets and fuzzy rough sets, without any restriction on the cardinality of the universe, is also given. These rough set axiomatic systems help to clarify the structural features of the various approximation operators.
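To make the two operators concrete, here is a minimal sketch of Pawlak’s classical approximations (not the paper’s axiomatic formalism; the function names are mine). The equivalence relation is given as a partition of a finite universe into indiscernibility classes:

    # Pawlak rough set approximations over a finite universe.
    def lower_approx(partition, X):
        """Elements whose whole equivalence class lies inside X."""
        return {x for block in partition for x in block if block <= X}

    def upper_approx(partition, X):
        """Elements whose equivalence class meets X."""
        return {x for block in partition for x in block if block & X}

    # Example: universe {1..6}, partitioned into indiscernibility classes.
    partition = [frozenset({1, 2}), frozenset({3, 4, 5}), frozenset({6})]
    X = {2, 3, 4, 5}
    assert lower_approx(partition, X) == {3, 4, 5}          # {3,4,5} ⊆ X
    assert upper_approx(partition, X) == {1, 2, 3, 4, 5}    # {1,2} meets X at 2

The duality the paper exploits is visible here: the upper approximation of X is the complement of the lower approximation of the complement of X.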

11.
Deflationism is usually thought to differ from the correspondence theory over whether truth is a substantial property. However, I argue that this notion of a ‘substantial property’ is tendentious. I further argue that the Equivalence Schema alone is sufficient to lead to idealism when combined with a pragmatist theory of truth. Deflationism thus has more powerful metaphysical implications than is generally thought and itself amounts to a kind of correspondence theory.
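The Equivalence Schema referred to is the familiar (stated here for reference):

\[
\text{(ES)}\quad \langle p \rangle \text{ is true} \;\leftrightarrow\; p,
\]

where \(\langle p \rangle\) names the proposition that \(p\).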

12.
Jay Newhard. Acta Analytica, 2014, 29(3): 349–361
Michael Lynch has recently proposed an updated version of alethic functionalism according to which the relation between truth per se and lower-level truth properties is not the realization relation, as might be expected, and as Lynch himself formerly held, but the manifestation relation. I argue that the manifestation relation is merely a resemblance relation and is inadequate to properly relate truth per se to lower-level truth properties. I also argue that alethic functionalism does not justify the claim that truth per se exists, or that truth per se is a functional property. Finally, I suggest a replacement for the manifestation relation. I argue that the resulting theory is a strict improvement over alethic functionalism on two counts, but that the improved theory does not justify the claim that truth per se exists. Since no further improvements to the theory are apparent, the prospects for alethic functionalism are dim.

13.
Realistic Mathematics Education supports students’ formalization of their mathematical activity through guided reinvention. To operationalize “formalization” in a proof-oriented instructional context, I adapt Sjögren’s (2010) claim that formal proof explicates (Carnap, 1950) informal proof. Explication means replacing unscientific or informal concepts with scientific ones. I use Carnap’s criteria for successful explication – similarity, exactness, and fruitfulness – to demonstrate how the elements of mathematical theory – definitions, axioms, theorems, proofs – can each explicate their less formal correlates. This lens supports an express goal of the instructional project, which is to help students coordinate semantic (informal) and syntactic (formal) mathematical activity. I demonstrate the analytical value of the explication lens by applying it to examples of students’ mathematical activity drawn from a design experiment in undergraduate, neutral axiomatic geometry. I analyze the chains of meanings (Thompson, 2013) that emerged when formal elements were presented ready-made alongside those emerging from guided reinvention.

14.
The first attempt at a systematic approach to axiomatic theories of truth was undertaken by Friedman and Sheard (Ann Pure Appl Log 33:1–21, 1987). There, twelve principles, consisting of axioms, axiom schemata, and rules of inference, each embodying a reasonable property of truth, were isolated for study. Working with a base theory of truth conservative over PA, Friedman and Sheard raised the following questions. Which subsets of the Optional Axioms are consistent over the base theory? What are the proof-theoretic strengths of the consistent theories? The first question was answered completely by Friedman and Sheard; all subsets of the Optional Axioms were classified as either consistent or inconsistent, giving rise to nine maximal consistent theories of truth. They also determined the proof-theoretic strength of two subsets of the Optional Axioms. The aim of this paper is to continue the work begun by Friedman and Sheard. We will establish the proof-theoretic strength of all the remaining seven theories and relate their arithmetic part to well-known theories ranging from PA to the theory of \(\Sigma^1_1\) dependent choice.
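To fix ideas, principles of the kind at issue include the following (illustrative examples that are standard in this literature, not a quotation of Friedman and Sheard’s exact list):

\[
\text{T-Intro: from } \varphi \text{ infer } T\ulcorner\varphi\urcorner, \qquad
\text{T-Elim: from } T\ulcorner\varphi\urcorner \text{ infer } \varphi, \qquad
\text{T-Cons: } \neg\big(T\ulcorner\varphi\urcorner \wedge T\ulcorner\neg\varphi\urcorner\big).
\]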

15.
Evolutionary theories are characterised by a different logic than classical scientific theories such as those of physics or chemistry. Since Darwin, evolutionary processes have been captured by algorithmic theories, i.e., theories whose logical core consists of certain fundamental algorithms. In the case of biological evolution this is a genetic algorithm, that is, the interplay of variation and selection (sketched below). Other evolutionary processes, like those of societies and of cognitive development, have to be understood through other algorithms characteristic of their specific domains. These considerations are illustrated with case studies of Darwin, Marx and Piaget.
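A deliberately minimal sketch of the algorithmic core the abstract refers to, variation plus selection; the bit-string encoding and the fitness function are placeholders of my own, not anything from the paper:

    import random

    # Minimal genetic algorithm: individuals are bit strings,
    # fitness counts 1-bits (a placeholder objective).
    def fitness(individual):
        return sum(individual)

    def mutate(individual, rate=0.05):
        # Variation: flip each bit independently with small probability.
        return [1 - bit if random.random() < rate else bit for bit in individual]

    def evolve(pop_size=30, length=20, generations=50):
        population = [[random.randint(0, 1) for _ in range(length)]
                      for _ in range(pop_size)]
        for _ in range(generations):
            # Selection: keep the fitter half of the population...
            population.sort(key=fitness, reverse=True)
            survivors = population[: pop_size // 2]
            # ...and refill it with mutated copies of the survivors.
            population = survivors + [mutate(random.choice(survivors))
                                      for _ in range(pop_size - len(survivors))]
        return max(population, key=fitness)

    best = evolve()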

16.
Davidson’s 1974 argument denying the possibility of incommensurable conceptual schemes is widely interpreted as entailing a denial of metaphysical pluralism. Speakers may group objects differently or have different beliefs about the world, but there is just one world. I argue that a tension arises from three aspects of Davidson’s philosophy: (1) the 1974 argument against conceptual schemes; (2) Davidson’s more recent emphasis on primitive triangulation as a necessary condition for thought and language; and (3) Davidson’s semantic approach to metaphysics, what he calls ‘the method of truth in metaphysics’. After elucidating the tension, I argue that it can be resolved while preserving at least two major tenets of Davidson’s philosophy: (1) conceptual schemes do not carve an uninterpreted reality into different worlds, and (2) truth is objective and non-epistemic. I argue that Davidson is implicitly committed to a plurality of worlds.

17.
Recently, I had a very interesting and friendly e-mail discussion with Professor Parikh on vagueness and fuzzy logic. Parikh has published several papers concerning the notion of vagueness. They contain critical remarks on fuzzy logic and its ability to formalize reasoning under vagueness [10,11]. On the other hand, for some years I have tried to advocate fuzzy logic (in the narrow sense, as Zadeh says, i.e. as formal logical systems formalizing reasoning under vagueness) and, in particular, to show that such systems (of many-valued logic of a certain kind) offer a fully fledged and extremely interesting logic [4, 5]. But this leaves open the question of the intuitive adequacy of many-valued logic as a logic of vagueness. Below I isolate eight questions Parikh asks, add two more, and comment on all of them. Finally, I formulate a problem on truth (in)definability in Łukasiewicz logic which shows, in my opinion, that fuzzy logic is not just “applied logic” but rather belongs to the systems commonly called “philosophical logic”, like modal logics, etc.
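For reference, in the infinite-valued Łukasiewicz logic at issue, truth values range over the real interval \([0,1]\) and the basic connectives are defined by (standard definitions, not quoted from the paper):

\[
\neg x = 1 - x, \qquad x \rightarrow y = \min(1,\, 1 - x + y), \qquad x \otimes y = \max(0,\, x + y - 1).
\]

The three-valued logic of item 3 above is the restriction of these truth functions to \(\{0, \tfrac12, 1\}\).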

18.
The aim of this paper is to pose a problem for theories that claim that belief reports are context dependent. Firstly, I argue that the claim (interpreted in the spirit of moderate contextualism) is committed to verbalism, a theory that derives the context sensitivity of belief reports from the context sensitivity of the psychological verbs used in such reports. Secondly, I argue that verbalism is not an attractive theoretical option because it is in conflict with the non-proto-rigidity of verbs like ‘believe’. Finally, I describe various consequences that the argument has for invariantism and moderate contextualism.

19.
Local set theory     
In 1945, Eilenberg and MacLane introduced the new mathematical notion of a category. Unfortunately, from the very beginning, category theory did not fit into the framework of either Zermelo–Fraenkel set theory or even von Neumann–Bernays–Gödel set-class theory. For this reason, in 1959, MacLane posed the general problem of constructing a new, more flexible, axiomatic set theory which would be an adequate logical basis for the whole of naïve category theory. In this paper, we give axiomatic foundations for local set theory. This theory might be one of the possible solutions of the MacLane problem.

20.
Various attempts at demarcating logic have been undertaken, many of them based on a specific understanding of how logical knowledge is formal and not material. MacFarlane has persuasively shown that the general idea of the formality of logic can be understood in various ways. I consider two of these accounts of formality, namely the requirement of conservativity and the requirement of schematicity of logical vocabulary, as promising candidates for making the all-too-unclear notion of formality more precise, and I study to what degree they can be considered either necessary or sufficient conditions for the logicality of a given piece of vocabulary. Finding both notions unsatisfactory as they stand, I propose combining them and envisage a hierarchy of logicality for the expressions of a given language. Such a hierarchy is complicated and not linear, yet it still offers a valuable explication of both the range and the pragmatic significance of logic, if we combine it with logical expressivism.
