Similar Documents
20 similar records found (search time: 31 ms)
1.
Three basic positions regarding the nature of fundamental properties are: dispositional monism, categorical monism and the mixed view. Dispositional monism apparently involves a regress or circularity, while an unpalatable consequence of categorical monism and the mixed view is that they are committed to quidditism. I discuss Alexander Bird's defence of dispositional monism based on the structuralist approach to identity. I argue that his solution does not help standard dispositional essentialism, as it admits the possibility that two distinct dispositional properties can possess the same stimuli and manifestations. Moreover, Bird's argument can be used to support the mixed view by relieving it of its commitment to quidditism. I briefly analyse an alternative defence of dispositional essentialism based on Leon Horsten's approach to the problem of circularity and impredicativity. I conclude that the best option is to choose Bird's solution but amend the dispositional perspective on properties. According to my proposal, the essences of dispositions are determined not directly by their stimuli and manifestations but by the role each property plays in the structure formed by the stimulus/manifestation relations.

2.
Parametric uncertainty quantification of the Rothermel fire spread model is presented using the Polynomial Chaos expansion method under a Non-Intrusive Spectral Projection (NISP) approach. Several of Rothermel's model input parameters have been considered random, each with a prescribed probability density function. Two different vegetation fire scenarios are considered, and the NISP method's results and performance are compared with four other stochastic methodologies: Sensitivity Derivative Enhanced Sampling, two Monte Carlo techniques, and Global Sensitivity Analysis. The stochastic analysis includes a sensitivity study to quantify the direct influence of each random parameter on the solution. The NISP approach achieved performance three orders of magnitude faster than the traditional Monte Carlo method. NISP's capability to perform uncertainty quantification, combined with its fast convergence, makes it well suited to stochastic prediction of fire spread.
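The contrast the abstract reports (spectral projection at a few quadrature nodes vs. brute-force Monte Carlo sampling) can be sketched on a one-input toy model. The exponential "spread-rate" function, its coefficients, and the wind-speed distribution below are hypothetical stand-ins, not Rothermel's actual model.

```python
import numpy as np

# Toy surrogate for a spread-rate model: R(w) = a * exp(b * w),
# with a single random input w ~ N(mu, sigma^2).
a, b = 1.0, 0.3
mu, sigma = 5.0, 1.0
f = lambda w: a * np.exp(b * w)

# Plain Monte Carlo estimate of the mean spread rate: 100,000 model runs.
rng = np.random.default_rng(0)
mc_mean = f(rng.normal(mu, sigma, 100_000)).mean()

# Non-intrusive spectral projection: evaluate the model at only 5
# Gauss-Hermite nodes (probabilists' weight exp(-x^2/2), total sqrt(2*pi)).
nodes, weights = np.polynomial.hermite_e.hermegauss(5)
nisp_mean = np.sum(weights * f(mu + sigma * nodes)) / np.sqrt(2 * np.pi)

# Closed-form reference: E[a*exp(b*W)] for Gaussian W (lognormal mean).
exact = a * np.exp(b * mu + 0.5 * (b * sigma) ** 2)
```

With 5 model evaluations instead of 100,000, the projection estimate already matches the closed-form mean to several digits, which is the kind of speed-up the abstract attributes to NISP.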

3.
The examination of the now-abandoned behaviorist analysis of the concept of belief can bring to light defects in perspectives such as functionalism and physicalism that are still considered viable. Most such theories have in common that they identify the holding of the belief that p by a subject S with some matter of fact in or about S that is distinct from and independent of p. In the case of behaviorism it is easy to show that this feature of the theory generates incoherence in the first-person point of view, since it opens up the possibility that S could correctly assert "I believe that p" (that is, "I have the complex disposition the behaviorist theory identifies with holding the belief that p") and at the same time deny that p is the case. Parallel incoherence can be developed in the context of other philosophically popular accounts of the nature of belief.

4.
It is well-known that in the class of convex functions the (nonnegative) remainder of the Midpoint Rule of approximate integration is majorized by the remainder of the Trapezoid Rule. Hence the approximation of the integral of a convex function by the Midpoint Rule is better than the analogous approximation by the Trapezoid Rule. Following this fact we examine remainders of certain quadratures in classes of convex functions of higher orders. Our main results state that for 3-convex (5-convex, respectively) functions the remainder of the 2-point (3-point, respectively) Gauss–Legendre quadrature is non-negative and it is not greater than the remainder of Simpson's Rule (4-point Lobatto quadrature, respectively). We also check 2-point Radau quadratures for 2-convex functions to demonstrate that similar results fail to hold for convex functions of even orders. We apply the Peano Kernel Theorem as a main tool of our considerations.
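The orderings the abstract describes can be checked numerically on f(x) = exp(x), which is convex of every order on [0, 1]. This is my own illustration, not the paper's notation; it exercises both the classical Midpoint/Trapezoid comparison and the 2-point Gauss–Legendre/Simpson comparison for 3-convex functions.

```python
import math

f = lambda x: math.exp(x)   # convex of every order on [0, 1]
exact = math.e - 1          # integral of e^x over [0, 1]

# Midpoint vs Trapezoid: for convex f both remainders are nonnegative
# and the Midpoint remainder is majorized by the Trapezoid remainder.
mid  = f(0.5)
trap = (f(0.0) + f(1.0)) / 2
r_mid, r_trap = exact - mid, trap - exact

# 2-point Gauss-Legendre vs Simpson (the 3-convex case of the paper):
# nodes 1/2 +/- 1/(2*sqrt(3)) on [0, 1].
h = 1 / (2 * math.sqrt(3))
gl2     = (f(0.5 - h) + f(0.5 + h)) / 2
simpson = (f(0) + 4 * f(0.5) + f(1)) / 6
r_gl2, r_simpson = exact - gl2, simpson - exact
```

Both pairs come out as the abstract predicts: 0 ≤ r_mid ≤ r_trap and 0 ≤ r_gl2 ≤ r_simpson.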

5.
This article uses dynamic software in Excel to demonstrate several ways in which graphical and numerical approaches can be introduced, both to enhance student understanding of l'Hôpital's Rule and to explain why the Rule actually works to give the 'right' answers. One of the approaches is to visualize what is happening by examining the limits with both l'Hôpital's Rule and the associated Taylor approximation to the function. The dynamic software allows students to experiment with the ideas.
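The article's Excel demonstrations are not reproducible here, but the same numerical idea, comparing the raw ratio, the l'Hôpital replacement, and a Taylor approximation near the limit point, can be sketched in a few lines. The 0/0 example sin(x)/x is my own choice, not necessarily one from the article.

```python
import math

# lim_{x -> 0} sin(x)/x = 1: compare the raw ratio, the l'Hopital
# replacement cos(x)/1, and the Taylor approximation 1 - x^2/6.
for x in (0.1, 0.01, 0.001):
    raw    = math.sin(x) / x
    lhop   = math.cos(x) / 1      # (d/dx sin x) / (d/dx x), evaluated at x
    taylor = 1 - x ** 2 / 6
    print(f"x={x:<6}  raw={raw:.9f}  l'Hopital={lhop:.9f}  Taylor={taylor:.9f}")
```

As x shrinks, all three columns converge to 1, which is the visual point the dynamic worksheets make.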

6.
G. E. Moore famously observed that to assert 'I went to the pictures last Tuesday but I do not believe that I did' would be 'absurd'. Moore calls it a 'paradox' that this absurdity persists despite the fact that what I say about myself might be true. Krista Lawlor and John Perry have proposed an explanation of the absurdity that confines itself to semantic notions while eschewing pragmatic ones. We argue that this explanation faces four objections. We give a better explanation of the absurdity, both in assertion and in belief, that avoids our four objections.

7.
8.
Accuracy arguments are the en vogue route in epistemic justifications of probabilism and further norms governing rational belief. These arguments often depend on the fact that the employed inaccuracy measure is strictly proper. I argue, controversially, that it is ill-advised to assume that the employed inaccuracy measures are strictly proper, and that strictly proper statistical scoring rules are a more natural class of measures of inaccuracy. Building on work in belief elicitation, I show how strictly proper statistical scoring rules can be used to give an epistemic justification of probabilism. An agent's evidence does not play any role in these justifications of probabilism. Principles demanding the maximisation of a generalised entropy do depend on the agent's evidence. In the second part of the paper I show how to simultaneously justify probabilism and such a principle. I also investigate scoring rules which have traditionally been linked with entropies.
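For readers unfamiliar with strict propriety: a scoring rule is strictly proper when honest reporting uniquely minimises expected inaccuracy. A minimal sketch with the Brier (quadratic) score, using an arbitrarily chosen truth-probability p = 0.7:

```python
# Strict propriety of the Brier score: if the event occurs with
# probability p, the expected inaccuracy of a report q,
#   E[(q - X)^2] = p*(q - 1)^2 + (1 - p)*q^2,
# is uniquely minimised at the honest report q = p.
p = 0.7

def expected_brier(q, p=p):
    return p * (q - 1) ** 2 + (1 - p) * q ** 2

# Scan a grid of candidate reports; the minimiser is the true p.
grid = [i / 100 for i in range(101)]
best_q = min(grid, key=expected_brier)
```

Differentiating the expectation gives 2q - 2p, zero only at q = p, which is the elicitation fact the accuracy arguments lean on.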

9.
This paper argues that three plausible principles are mutually inconsistent: (KA) One ought to assert only what one knows; (AP) If it is proper to assert some proposition q, then it is, barring special and not very common circumstances, proper to assert any proposition p from which q has been competently inferred; and (AKN) Some propositions are both properly assertible and known by competent inference from propositions which one does not know. Each pair of two principles constitutes an argument against the remaining principle, but which principle should one drop?

10.
Contextualist theories of knowledge offer a semantic hypothesis to explain the observed contextual variation in what people say they know, and the difficulty people have resolving skeptical paradoxes. Subject-relative versions make the truth conditions of "S knows that p" depend on the standards of the knower's context (Hawthorne and Stanley); speaker-relative versions, on those of the speaker's context (Cohen and DeRose). Speaker contextualism avoids objections to subject contextualism, but is implausible in light of evidence that "know" does not behave like an indexical. I deepen and extend these criticisms in light of recent defenses by contextualists (including Ludlow). Another difficulty is that whether certain standards are salient or intended does not entail that they are the proper standards. A normative form of contextualism, on which the truth of a knowledge claim depends on the proper standards for the context, is more promising, but still unsatisfactory whether the view is speaker- or subject-relative. I defend alternative explanations for the observed linguistic and psychological data: a pragmatic account for some cases and a cognitive account for others. (I presented this paper at the 2004 Bled Conference on Contextualism, sponsored by Maribor and Northern Illinois Universities.)

11.
In this paper, I provide a probabilistic account of factual knowledge, based on the notion of chance, which is a function of an event (or a fact; I will use 'fact' to cover both) given a prior history. This account has some affinity with my chance account of token causation, but it neither relies on it nor presupposes it. Here, I concentrate on the core cases of perceptual knowledge and of knowledge by memory (based on perception). (The account can be extended to the other modes of knowledge, but not in this paper.) The analysis of knowledge presented below is externalist. The underlying intuition guiding the treatment of knowledge in this paper is that knowledge boils down to high token discriminative indicativeness. Type indicativeness and type discriminability are neither necessary nor sufficient for knowledge: the token aspect comes out in the strong dependence on the specific circumstances and chances of the case. The main condition of the first section, the indicativity condition (KI), properly refined, yields pertinent (token) indicativity as a main constituent. Very roughly, it requires that the chance of the content clause p be higher given the subject's believing that p than otherwise. The discriminability condition in question (section 3) captures the sense of discriminability appropriate for knowledge and yields the indicativity condition: it is an extension of the indicativity condition KI. Roughly, the subject's ability to discriminate the object in front of her being red from its being green is captured by holding fixed, in the indicativity condition, the condition "the object in front of her is red or green." A major element in the analysis is the so-called Contrast Class, which governs the scope of discriminability. This is the class of features that have to be taken into account in the discriminability condition, and it is characterized by two constraints.
Very roughly, according to the first constraint, for a feature to be in the contrast class, it must not represent a sub-type of (the feature specified by) the predicate in the content clause. According to the second constraint, which is a central condition with many implications, the chance that the object specified in the content clause has a feature represented in the contrast class must not, under the circumstances, be too low. This constraint, within the framework of the discriminability condition, brings out a major constitutive aspect of knowledge: knowledge amounts to a limited vulnerability of the belief in question to mistakes under the circumstances at hand. The contrast class plays a major role in my treatment of skepticism. The second constraint on the Contrast Class, together with the VHP condition below, brings out precisely the way in which perceptual knowledge is fallible.
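The indicativity condition KI is, at bottom, a chance comparison, and its shape can be sketched on a toy joint distribution. The numbers below are hypothetical, chosen only so that the subject's believing that p raises the chance of p.

```python
# Toy joint distribution over whether p holds and whether the subject
# believes that p (Bp).  KI, roughly: ch(p | Bp) > ch(p | not-Bp).
joint = {("p", "Bp"): 0.45, ("p", "not-Bp"): 0.05,
         ("not-p", "Bp"): 0.02, ("not-p", "not-Bp"): 0.48}

def ch(p_val, b_val):
    # Conditional chance of the p-column given the belief-column.
    total = sum(v for (pv, bv), v in joint.items() if bv == b_val)
    return joint[(p_val, b_val)] / total

ki_holds = ch("p", "Bp") > ch("p", "not-Bp")   # about 0.957 vs 0.094
```

The refinements the abstract describes (holding "red or green" fixed, the Contrast Class) would enter as further conditioning events in `ch`, which this sketch omits.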

12.
In this paper, we argue for a novel three-dimensionalist (3D'ist) solution to the problem of persistence, i.e. cross-temporal identity. We restrict the discussion of persistence to simple substances, which do not have other substances as their parts. The account of simple substances employed in the paper is a trope-nominalist strong nuclear theory (SNT), which develops Peter Simons' trope nominalism. Regarding the distinction between three-dimensionalism (3D) and four-dimensionalism (4D), we follow Michael Della Rocca's formulation, in which 3D explains persistence in virtue of the same entities and 4D in virtue of distinct entities (temporal parts). SNT is a 3D'ist position because it accounts for the persistence of simple substances in virtue of diachronically identical 'nuclear' tropes. The nuclear tropes of a simple substance are necessary for it and mutually rigidly dependent, but distinct. SNT explains qualitative change by tropes that are contingent to a simple substance. We show that it avoids the standard problems of 3D: temporal relativization of ontic predication, Bradley's regress, and coincidence, fission and fusion cases. Temporal relativization is avoided because of the analysis of temporary parts that SNT gives in terms of temporal sub-location, which is an atemporal part–whole relation.

13.
Self-expression, Expressiveness, and Sincerity
This paper examines some aspects of Mitchell Green's account of self-expression. I argue that Green fails to properly address the distinction between success and evidential notions of expression, which prevents him from adequately discussing the relation between these notions. I then consider Green's explanation of how a speech act shows what is within, namely through the liabilities one incurs, and argue that this explanation is mistaken. Rather, the norms governing speech acts and the liabilities incurred give us reason to think that the speaker is in a particular state of mind; they thus support an evidential rather than a success notion. Finally, I suggest that it is because of the sincerity of what is said, rather than the liabilities incurred, that you show what is within.

14.
In this short note an easy and instructive proof of l'Hôpital's Rule is presented, in both the 0/0 and ∞/∞ cases, obtained by translating the theorem into a simple geometric statement about paths in the plane.

15.
The co-maker concept has become accepted practice in many successful global business organizations. This fact has resulted in a class of inventory models known as joint economic lot size (JELS) models. Heretofore such models assumed perfect quality production on the part of the vendor. This paper relaxes this assumption and proposes a quality-adjusted JELS model. In addition, classical optimization methods are used to derive models for the cases of setup cost reduction, quality improvement, and simultaneous setup cost reduction and quality improvement for the quality-adjusted JELS. Numerical results are presented for each of these models. Comparisons are made to the basic quality-adjusted model. Results indicate that all three policies exhibit significantly reduced total cost. However, the simultaneous model results in the lowest cost overall and the smallest lot size. This suggests a synergistic impact of continuous improvement programs that focus on both setup and quality improvement of the vendor's production process. Sensitivity analysis indicates that the simultaneous model is robust and representative of practice.
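The basic JELS trade-off underneath the paper's extensions is an EOQ-style balance between the combined setup/ordering cost and the combined holding cost of buyer and vendor. A lot-for-lot sketch with hypothetical cost figures (this omits the paper's quality adjustment and the setup-reduction/quality-improvement investments):

```python
import math

# Simplified lot-for-lot joint economic lot size: buyer and vendor share
# one lot size Q, so setup/ordering costs and holding costs are pooled.
D = 4800                  # annual demand (units/year), hypothetical
A_b, A_v = 25.0, 400.0    # buyer ordering cost, vendor setup cost per lot
h_b, h_v = 5.0, 4.0       # holding costs per unit per year

# Joint total cost and its EOQ-style minimiser.
cost = lambda Q: D * (A_b + A_v) / Q + (h_b + h_v) * Q / 2
Q_joint = math.sqrt(2 * D * (A_b + A_v) / (h_b + h_v))
```

Evaluating `cost` on either side of `Q_joint` confirms it is the minimiser; the paper's setup-reduction and quality-improvement models then treat `A_v` (and a defect rate) as decision variables on top of this structure.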

16.
Comptes Rendus Mathematique, 2019, 357(11–12): 846–850
Riemann's non-differentiable function is a celebrated example of a continuous but almost nowhere differentiable function. There is strong numerical evidence that one of its complex versions represents a geometric trajectory in experiments related to the binormal flow, or vortex filament equation. In this setting, we analyse certain geometric properties of its image in C. The objective of this note is to show that the Hausdorff dimension of its image is no larger than 4/3 and that it has nowhere a tangent.
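For reference, the classical real form of the function (the note studies a complex version whose image lies in C) is the absolutely convergent series R(x) = Σ sin(n²πx)/n², which a partial sum evaluates directly; the truncation level below is my own choice.

```python
import math

# Partial sums of Riemann's function R(x) = sum_{n>=1} sin(n^2*pi*x)/n^2.
# The series is dominated by sum 1/n^2, so modest truncation already
# approximates R well, even though R is almost nowhere differentiable.
def riemann(x, terms=2000):
    return sum(math.sin(n * n * math.pi * x) / (n * n)
               for n in range(1, terms + 1))
```

A sanity check: at x = 1/2 only odd n contribute sin(n²π/2) = 1, so R(1/2) = Σ_{n odd} 1/n² = π²/8.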

17.
Rule induction is a method of automatically developing rules from sets of examples. Quinlan's ID3 algorithm, which was developed for determinate data, has been extended to deal with statistical data. This paper reports on an experimental comparison with multiple regression.
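ID3's rule-growing step chooses the attribute with the highest entropy-based information gain. A minimal sketch of that criterion on a tiny hypothetical dataset (not Quinlan's data, and without the statistical extension the paper evaluates):

```python
import collections
import math

# Shannon entropy of a multiset of class labels, in bits.
def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n)
                for c in collections.Counter(labels).values())

# Information gain of splitting the examples on one attribute:
# entropy before the split minus the weighted entropy after it.
def info_gain(rows, labels, attr):
    n = len(labels)
    parts = collections.defaultdict(list)
    for row, y in zip(rows, labels):
        parts[row[attr]].append(y)
    remainder = sum(len(p) / n * entropy(p) for p in parts.values())
    return entropy(labels) - remainder

rows   = [("sunny",), ("sunny",), ("rain",), ("rain",)]
labels = ["no", "no", "yes", "yes"]
gain = info_gain(rows, labels, 0)   # the attribute separates the labels perfectly
```

Here the single attribute splits the classes cleanly, so the gain equals the full 1 bit of label entropy; ID3 greedily repeats this choice at every node of the rule tree.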

18.
The topic of this paper is the role played by context in art. In this regard I examine three theories, linked to the names of J. Levinson, G. Currie and D. Davies. Levinson's arguments undermine the structural theory. He finds it objectionable, first, because it makes the individuation of artworks independent of their histories; secondly, because such a consequence fails to recognise that works are created rather than discovered. But if certain general features of provenance are always work-constitutive, as Levinson seems willing to claim, these features must always be essential properties of works. On the other hand, consideration of our modal practice suggests that whether a given general feature of provenance is essential or non-essential depends upon the particular work in question, i.e. is "work-relative". D. Davies builds his performance theory on the basis of a critical evaluation of Currie's action-type hypothesis (ATH). Performances, says Davies, are not to be identified with "basic actions", to which their times belong essentially, but with "doings" that permit the sorts of variation in modal properties required by the work-relativity of modality. He is also a fierce critic of the contextualist account. Contextualism is in his view unable to reflect the fact that aspects of provenance bear upon our modal judgements with variable force. In the second part of the paper I consider Davies's "modality principle". Davies is inclined to defend the claim that labels used for the designation of works are rigid designators. Such a view offers a ground for discussion of the historicity of art. What is meant when people claim that art is an historical concept? I argue that any historical theory implies a two-dimensional notion of "art".
At the end of the paper I suggest that Davies should embrace the theory of contingent identity, rather than the colocationist view, about the relationship between a particular artwork and its physical bearer.

19.
In The Many Faces of Realism and elsewhere, Hilary Putnam has presented an argument for the conclusion that there is no fact of the matter as to how many objects there are. In brief: "Carnap" says that a certain imaginary world contains three objects, x1, x2, and x3. The "Polish logician" says that this same world must contain four further objects (x1 + x2, x1 + x2 + x3, etc.). Putnam maintains that there can be no fact of the matter as to whether the imaginary world contains three or seven objects. I examine Putnam's argument and find it, at bottom, unintelligible.
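The counting dispute is just the combinatorics of unrestricted mereological composition: n atoms generate 2ⁿ − 1 non-empty fusions, so the "Polish logician" gets 7 objects from "Carnap's" 3. A sketch:

```python
from itertools import combinations

# With unrestricted composition, every non-empty set of atoms determines
# a fusion; "Carnap" counts only the atoms themselves.
atoms = ["x1", "x2", "x3"]
fusions = [set(c) for r in range(1, len(atoms) + 1)
           for c in combinations(atoms, r)]

carnap_count = len(atoms)     # 3
polish_count = len(fusions)   # 2**3 - 1 = 7
```

Nothing in this arithmetic settles Putnam's question, of course; it only makes explicit where the two counts come from.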

20.