Similar Documents
20 similar documents found (search time: 312 ms)
1.
Contextualism is supposed to explain why the following argument for skepticism seems plausible: (1) I don’t know that I am not a bodiless brain-in-a-vat (BIV); (2) If I know I have hands, then I know I am not a bodiless BIV; (3) Therefore, I do not know I have hands. Keith DeRose claims that (1) and (2) are “initially plausible.” I claim that (1) is initially plausible only because of an implicit argument that stands behind it; it is not intuitively plausible. The argument DeRose offers is based on the requirement of sensitivity, that is, on the idea that if you know something then you would not believe it if it were false. I criticize the sensitivity requirement, thereby undercutting its support for (1) and the skeptical data that contextualism is meant to explain. While skepticism is not a plausible ground for contextualism, I argue that certain pragmatic considerations are. It is plausible to think that knowing something requires more evidence when more is at stake. The best way to handle skepticism is to criticize the arguments for it. We should not adopt contextualism as a means of accommodating skepticism, even if there are other pragmatic reasons for being a contextualist about knowledge.

2.
3.
Book Vignettes     
Books reviewed in this article: Riley, K. F., Mathematical Methods for the Physical Sciences: An Informal Treatment for Students of Physics and Engineering. Herstein, I. N. and Kaplansky, I., Matters Mathematical. Stone, A. Harris and Sherman, Lloyd R., Spaceship Earth: Life Science. Deaton, John G., M.D., New Parts for Old: The Age of Organ Transplants. Ricciuti, Edward R., To the Brink of Extinction.

4.
The topic of this paper is the role played by context in art. In this regard I examine three theories linked to the names of J. Levinson, G. Currie and D. Davies. Levinson’s arguments undermine the structural theory. He finds it objectionable, first, because it makes the individuation of artworks independent of their histories. Secondly, such a consequence is unacceptable because it fails to recognise that works are created rather than discovered. But if certain general features of provenance are always work-constitutive, as Levinson seems willing to claim, these features must always be essential properties of works. On the other hand, consideration of our modal practice suggests that whether a given general feature of provenance is essential or non-essential depends upon the particular work in question, that is, it is “work-relative”. D. Davies builds his performance theory on the basis of a critical evaluation of Currie’s action-type hypothesis (ATH). Performances, says Davies, are not to be identified with “basic actions” to which their times belong essentially, but with “doings” that permit the sorts of variation in modal properties required by the work-relativity of modality. He is also a fierce critic of the contextualist account. Contextualism is, in his view, unable to reflect the fact that aspects of provenance bear upon our modal judgements with variable force.
In the second part of the paper I consider Davies’s “modality principle”. Davies is inclined to defend the claim that labels used for the designation of works are rigid designators. Such a view offers a ground for discussion of the historicity of art. What is meant when people claim that art is an historical concept? I argue that any historical theory implies a two-dimensional notion of “art”.
At the end of the paper I suggest that Davies should embrace the theory of contingent identity, rather than the colocationist view, concerning the relationship between a particular artwork and its physical bearer.

5.
Contextualist theories of knowledge offer a semantic hypothesis to explain the observed contextual variation in what people say they know, and the difficulty people have resolving skeptical paradoxes. Subject- or speaker-relative versions make the truth conditions of “S knows that p” depend on the standards of either the knower’s context (Hawthorne and Stanley) or those of the speaker’s context (Cohen and DeRose). Speaker contextualism avoids objections to subject contextualism, but is implausible in light of evidence that “know” does not behave like an indexical. I deepen and extend these criticisms in light of recent defenses by contextualists (including Ludlow). Another difficulty is that whether certain standards are salient or intended does not entail that they are the proper standards. A normative form of contextualism, on which the truth of a knowledge claim depends on the proper standards for the context, is more promising, but still unsatisfactory whether the view is speaker- or subject-relative. I defend alternative explanations for the observed linguistic and psychological data: a pragmatic account for some cases and a cognitive account for others. (I presented this paper at the 2004 Bled Conference on Contextualism, sponsored by Maribor and Northern Illinois Universities.)

6.
A stochastic approximation (SA) algorithm with new adaptive step sizes for solving unconstrained minimization problems in a noisy environment is proposed. The new adaptive step-size scheme uses order statistics of a fixed number of previous noisy function values as a criterion for accepting good and rejecting bad steps. The scheme allows the algorithm to move in bigger steps, avoiding steps proportional to $1/k$ when larger steps are expected to improve performance. An algorithm with the new adaptive scheme is defined for a general descent direction, and almost sure convergence is established. The performance of the new algorithm is tested on a set of standard test problems and compared with relevant algorithms. Numerical results support the theoretical expectations and verify the efficiency of the algorithm regardless of the chosen search direction and noise level. Numerical results on problems arising in machine learning are also presented: a linear regression problem is considered using a real data set. The results suggest that the proposed algorithm shows promise.
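The order-statistic idea in this abstract can be illustrated with a toy sketch. The following is my own construction, not the authors' exact scheme: a noisy gradient iteration that enlarges the step when the current noisy function value beats the median of a short window of previous values, and otherwise shrinks it back toward the classical $1/k$ schedule. All names, the window length, and the grow/shrink factors are illustrative choices.

```python
import random

def noisy_f(x, sigma=0.1):
    """Noisy evaluation of f(x) = x^2 (toy objective)."""
    return x * x + random.gauss(0.0, sigma)

def noisy_grad(x, sigma=0.1):
    """Noisy gradient of the toy objective."""
    return 2.0 * x + random.gauss(0.0, sigma)

def sa_adaptive(x0, iters=2000, m=5, grow=1.5, shrink=0.5, a0=0.5, seed=0):
    """SA iteration with an order-statistic step-size rule (illustrative)."""
    random.seed(seed)
    x, step = x0, a0
    history = []  # window of the last m noisy function values
    for k in range(1, iters + 1):
        fk = noisy_f(x)
        if len(history) == m:
            # order-statistic test: the step looks good if the current noisy
            # value beats the median of the recent window
            if fk < sorted(history)[m // 2]:
                step = min(step * grow, a0)        # allow bigger moves
            else:
                step = max(step * shrink, a0 / k)  # fall back toward ~1/k
            history.pop(0)
        history.append(fk)
        x = x - step * noisy_grad(x)
    return x

x = sa_adaptive(5.0)  # should end near the minimizer 0 despite the noise
```

Because the step is capped at `a0 = 0.5`, each update is a contraction for this objective, so the iterate stays near the minimizer once it arrives; only the noise term, scaled by the (shrinking) step, remains.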

7.
Instead of trying to recognize and avoid degenerate steps in the simplex method (as some variants do), we have developed a new Phase I algorithm that is impervious to degeneracy. The new algorithm solves a non-negative least-squares problem in order to find a Phase I solution. In each iteration, a simple two-variable least-squares subproblem is used to select an incoming column to augment a set of independent columns (called basic) to get a strictly better fit to the right-hand side. Although this is analogous in many ways to the simplex method, it can be proved that strict improvement is attained at each iteration, even in the presence of degeneracy. Thus cycling cannot occur, and convergence is guaranteed. This algorithm is closely related to a number of existing algorithms proposed for non-negative least squares and quadratic programs.
When used on the 30 smallest NETLIB linear programming test problems, the new Phase I algorithm was almost 3.5 times faster than a particular implementation of the simplex method; on some problems, it was over 10 times faster. The best results were generally seen on the more degenerate problems.
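The core idea, finding a Phase I point by solving a non-negative least-squares problem, can be sketched as follows. This is a minimal illustration using projected gradient for the NNLS problem, not the paper's two-variable subproblem machinery: if the minimal residual of $\min_{x\ge 0} \|Ax-b\|$ is (near) zero, the minimizer is a feasible point for the LP constraints $Ax=b$, $x\ge 0$.

```python
import numpy as np

def phase1_nnls(A, b, iters=5000):
    """Find x >= 0 with A x ~ b by minimizing ||Ax - b||^2 over x >= 0.

    A zero-residual minimizer is a feasible Phase I point for the LP
    constraints Ax = b, x >= 0. Projected gradient stands in here for the
    paper's least-squares subproblem machinery (illustrative only).
    """
    A, b = np.asarray(A, float), np.asarray(b, float)
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    for _ in range(iters):
        # gradient step on ||Ax - b||^2 / 2, then project onto x >= 0
        x = np.clip(x - (A.T @ (A @ x - b)) / L, 0.0, None)
    return x, float(np.linalg.norm(A @ x - b))

# tiny example: the constraints x1 + x2 = 1, x2 + x3 = 1, x >= 0 are feasible
A = [[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]]
b = [1.0, 1.0]
x, res = phase1_nnls(A, b)  # res near 0 => x is a Phase I feasible point
```

A strictly positive minimal residual would instead certify that the constraint system is infeasible, which is exactly the dichotomy a Phase I procedure must deliver.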

8.
Artin's braid groups currently provide a promising setting for cryptographic applications, since the first cryptosystems using braids were introduced in [I. Anshel, M. Anshel, D. Goldfeld, An algebraic method for public-key cryptography, Math. Res. Lett. 6 (1999) 287-291; I. Anshel, M. Anshel, B. Fisher, D. Goldfeld, New key agreement schemes in braid group cryptography, RSA 2001; K.H. Ko, S.J. Lee, J.H. Cheon, J.W. Han, J.S. Kang, C. Park, New public-key cryptosystem using braid groups, Crypto 2000, pp. 166-184] (see also [V.M. Sidelnikov, M.A. Cherepnev, V.Y. Yashcenko, Systems of open distribution of keys on the basis of noncommutative semigroups, Ross. Acad. Nauk Dokl. 332-5 (1993); English translation: Russian Acad. Sci. Dokl. Math. 48-2 (1994) 384-386]). A variety of key agreement protocols based on braids have been described, but few authentication or signature schemes have been proposed so far. We introduce three authentication schemes based on braids, two of them being zero-knowledge interactive proofs of knowledge. We then discuss their possible implementations, involving normal forms or an alternative braid algorithm, called handle reduction, which can achieve good efficiency under specific requirements.

9.
We study generic unfoldings of homoclinic tangencies of two-dimensional area-preserving diffeomorphisms (conservative Newhouse phenomena) and show that they give rise to invariant hyperbolic sets of arbitrarily large Hausdorff dimension. As applications, we discuss the size of the stochastic layer of the standard map and the Hausdorff dimension of invariant hyperbolic sets for certain restricted three-body problems. We avoid involved technical details and concentrate only on the ideas of the proofs of the presented results.

10.
In this paper I sketch a model for the transition from biologically to culturally based forms of social organization. The impetus for the transition arises from increased individualization among the non-human primates that can be observed as one moves phylogenetically from the Cercopithecoids and Ceboids (Old and New World monkeys) to the hominoids, especially the African apes. Increased individualization introduced a conflict with coherent and stable social integration that was only resolved among the hominid ancestors to modern Homo sapiens by shifting to a cultural/conceptual, rather than a behavioral/biological, basis for social organization. The shift entailed a change from evolution driven by individual fitness to evolution driven by the structural coherency of a conceptual system for social organization; that is, to selection based on group, rather than individual, level traits. Conceptually the transition depended upon the evolution of mental capacities such as a theory of mind and recursion, both of which are absent or occur only in minimal form among the non-human primates.

11.
In a series of papers, Adam Leite has developed a novel view of justification tied to being able to responsibly justify a belief. Leite touts his view as (i) faithful to our ordinary practice of justifying beliefs, (ii) providing a novel response to an epistemological problem of the infinite regress, and (iii) resolving the “persistent interlocutor” problem. Though I find elements of Leite’s view of being able to justify a belief promising, I hold that there are several problems afflicting the overall picture of justification. In this paper, I argue that despite its ambitions, Leite’s view fails to solve the persistent interlocutor problem and does not avoid a vicious regress.

12.
13.
The trust region (TR) method is a class of effective methods for optimization. The conic model can be regarded as a generalized quadratic model, and it retains the good convergence properties of the quadratic model near the minimizer. The Barzilai and Borwein (BB) gradient method is also effective; it can be used for solving large-scale optimization problems while avoiding the expensive computation and storage of matrices. In addition, the BB stepsize is easy to determine without large computational effort. In this paper, based on the conic trust region framework, we employ the generalized BB stepsize and propose a new nonmonotone adaptive trust region method based on a simple conic model for large-scale unconstrained optimization. Unlike the traditional conic model, the Hessian approximation here is a scalar matrix based on the generalized BB stepsize, which results in a simple conic model. By adding the nonmonotone technique and the adaptive technique to the simple conic model, the new method needs less storage and converges faster. The global convergence of the algorithm is established under certain conditions. Numerical results indicate that the new method is effective and attractive for large-scale unconstrained optimization problems.
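The BB stepsize mentioned above is cheap to compute from two successive iterates and gradients, which is why no Hessian matrix needs to be stored. The sketch below illustrates the classical BB1 step, alpha_k = (s^T s)/(s^T y) with s = x_k - x_{k-1} and y = g_k - g_{k-1}, on a plain gradient iteration; it is not the paper's conic trust region method, and the safeguards are my own illustrative choices.

```python
import numpy as np

def bb_gradient(grad, x0, iters=50):
    """Gradient descent with the Barzilai-Borwein (BB1) stepsize.

    The scalar alpha_k = s.s / s.y acts as an inverse-Hessian
    approximation, so only two vectors are stored, never a matrix.
    """
    x = np.asarray(x0, float)
    g = grad(x)
    alpha = 1e-3  # small safeguarded first step (no BB pair exists yet)
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = float(s @ y)
        # safeguard: fall back to the small step when s.y is not positive
        alpha = float(s @ s) / sy if sy > 1e-12 else 1e-3
        x, g = x_new, g_new
    return x

# convex quadratic test: f(x) = 0.5 * x^T diag(1, 10) x, gradient D * x
D = np.array([1.0, 10.0])
x = bb_gradient(lambda x: D * x, [1.0, 1.0])  # converges toward the origin
```

On quadratics the BB iteration is nonmonotone (the objective can rise on individual steps) yet converges, which is precisely why the paper pairs the BB-based model with a nonmonotone acceptance technique.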

14.
In this paper I report findings from a four-year study of beginning elementary school teachers which investigated development in their mathematical knowledge for teaching (MKT). The study took a developmental research approach, in that the teachers and the researcher collaborated to develop the mathematics teaching of the teachers, while also trying to understand how such development occurred and might be facilitated. The Knowledge Quartet (KQ) framework was used as a tool to support focused reflection on the mathematical content of teaching, with the aim of promoting development in mathematical content knowledge. Although I focused primarily on whether and how focused reflection using the KQ would promote development, it was impossible to separate this from other influences, and in this paper I discuss the ways in which reflection was found to interrelate with other areas of influence. I suggest that by helping the teachers to focus on the content of their mathematics teaching, within the context of their experience in classrooms and of working with others, the KQ framework supported development in the MKT of teachers in the study.

15.
Two Triple-I Algorithms for the FMT Problem and Their Reductivity
The FMT (fuzzy modus tollens) problem is studied further, and a general computational formula for the triple-I algorithm for this problem is obtained. A new algorithm for the problem, the triple-I* algorithm, is proposed, and its general computational formula is given. The reductivity of the two algorithms is then discussed and the meaning of each kind of reductivity is clarified: it is proved that the triple-I algorithm for the FMT problem is W-reductive, while the triple-I* algorithm is Z-reductive.

16.
The semantic blindness objection to contextualism challenges the view that there is no incompatibility between (i) denials of external-world knowledge in contexts where radical-deception scenarios are salient, and (ii) affirmations of external-world knowledge in contexts where such scenarios are not salient. Contextualism allegedly attributes a gross and implausible form of semantic incompetence in the use of the concept of knowledge to people who are otherwise quite competent in its use; this blindness supposedly consists in wrongly judging that there is genuine conflict between claims of type (i) and type (ii). We distinguish two broad versions of contextualism: relativistic-content contextualism and categorical-content contextualism. We argue that although the semantic blindness objection evidently is applicable to the former, it does not apply to the latter. We describe a subtle form of conflict between claims of types (i) and (ii), which we call différance-based affirmatory conflict. We argue that people confronted with radical-deception scenarios are prone to experience a form of semantic myopia (as we call it): a failure to distinguish between différance-based affirmatory conflict and outright inconsistency. Attributing such semantic myopia to people who are otherwise competent with the concept of knowledge explains the bafflement about knowledge-claims that so often arises when radical-deception scenarios are made salient. Such myopia is not some crude form of semantic blindness at all; rather, it is an understandable mistake grounded in semantic competence itself: what we call a competence-based performance error.

17.
I consider the problem of weekly assignment of shifts to operators. The shifts are to be assigned by seniority (or by any other employee hierarchy), but every employee is guaranteed to receive at least one shift. I propose a practical solution method for this problem that guarantees a feasible solution. As an application, the solution method is applied to the data provided in the literature on weekly shift assignment for the operators of the New Brunswick Telephone Company. The solution method is also applied to 100 randomly generated problems. The results show that the solution method produces close-to-optimal solutions.
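The seniority-ordered assignment described above can be pictured with a toy sketch. This is my own simplification, not the paper's method or data: operators, taken in seniority order, each receive their most preferred shift that still has capacity. Provided every operator ranks every shift and total capacity is at least the number of operators, everyone receives a shift.

```python
def assign_shifts(operators, prefs, capacity):
    """Greedy seniority-ordered assignment (illustrative toy model).

    operators: list of names, most senior first.
    prefs:     name -> list of shifts in preference order.
    capacity:  shift -> number of available slots.
    """
    remaining = dict(capacity)
    assignment = {}
    for op in operators:            # seniority order
        for shift in prefs[op]:     # preference order
            if remaining.get(shift, 0) > 0:
                remaining[shift] -= 1
                assignment[op] = shift
                break
    return assignment

# hypothetical example data
ops = ["ana", "bo", "cy"]  # most senior first
prefs = {"ana": ["day", "eve"], "bo": ["day", "eve"], "cy": ["day", "eve"]}
cap = {"day": 1, "eve": 2}
assignment = assign_shifts(ops, prefs, cap)
```

Seniority decides who gets the contested "day" slot, while the capacity bookkeeping ensures the juniors are still placed; the paper's method additionally handles the full weekly structure and optimality considerations.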

18.
Recently, external memory graph problems have received considerable attention because massive graphs arise naturally in many applications involving massive data sets. Even though a large number of I/O-efficient graph algorithms have been developed, a number of fundamental problems still remain open.
The results in this paper fall into two main classes. First, we develop an improved algorithm for the problem of computing a minimum spanning tree (MST) of a general undirected graph. Second, we show that on planar undirected graphs the problems of computing a multi-way graph separation and single-source shortest paths (SSSP) can be reduced I/O-efficiently to planar breadth-first search (BFS). Since BFS can be trivially reduced to SSSP by assigning all edges weight one, it follows that in external memory planar BFS, SSSP, and multi-way separation are equivalent: if any of these problems can be solved I/O-efficiently, then all of them can be solved I/O-efficiently in the same bound. Our planar graph results have subsequently been used to obtain I/O-efficient algorithms for all fundamental problems on planar undirected graphs.

19.
This paper shows that the (New)² Welfare Economics provides interesting new ways of classifying externalities in terms of the complexity of the messages required to equate equilibria and optima. In turn, it shows that the study of externalities provides useful new insights into the (New)² Welfare Economics, by exposing problems with the definition of satisfactory informationally decentralized resource allocation mechanisms.

20.

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号