Found 20 similar documents; search took 31 ms.
1.
Rino Falcone Cristiano Castelfranchi 《Computational & Mathematical Organization Theory》2011,17(4):402-418
Trust can be viewed at the same time as an instrument for an agent to select the right partners for achieving its own goals, and as a means of being selected by other potential partners who want to establish a cooperation/collaboration and take advantage of the accumulated trust. In this paper we analyze trust as the agents’ relational capital. Starting from the classical dependence network with potential partners, we introduce an analysis of what it means for an agent to be trusted and how this condition can be strategically used by it for achieving its own goals, that is, why it represents a form of power. The idea of taking another agent’s point of view is especially important if we consider the many studies in social science that connect trust with social-capital-related issues. Although there is great interest in the literature in ‘social capital’ and its powerful effects on the wellbeing of both societies and individuals, it is often not clear exactly what the object under analysis is. Individual trust capital (relational capital) and collective trust capital should not only be disentangled; their relations are quite complicated and even conflicting. To overcome this gap, we propose a study that first attempts to understand what trust is as capital of individuals: in which sense “trust” is a capital; how this capital is built, managed and saved; and, in particular, how this capital is the result of the others’ beliefs and goals. We then aim to analytically study the cognitive dynamics of this object.
2.
We present an analytic approach to address the problem of how sellers can establish and maintain a long-lasting relationship
with a buyer and, at the same time, maximize customer lifetime value (CLV). To model the evolution of a relational exchange
between a seller and a buyer, we extend a well-known mathematical model of “love dynamics.” The growth of each partner’s commitment
to the relationship is a sum of negative and positive terms. The negative term describes each partner’s propensities for opportunism,
while the positive terms describe each partner’s trust in the commitment of the other, and the reaction to marketing efforts.
The seller controls the evolution of the relationship through social relationships and transactional marketing efforts. The
main findings are as follows: (1) Loyal (committed) customers and long-term relationships do not always generate better cash
flows, especially when buyers either look for superior current value in each purchase opportunity or are short-term oriented.
(2) Without mutual trust between partners, the seller should treat old customers over time as new ones, making the reduction of retention costs impossible. (3) Retaining current customers is cheaper than acquiring new ones only if mutual trust between partners overcomes propensities for opportunism and the seller only slightly discounts future cash flows.
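The commitment dynamics described above can be sketched as two coupled ordinary differential equations integrated with a forward-Euler step. This is an illustrative reconstruction, not the paper’s actual model: the linear functional forms and the parameter names (`opportunism`, `trust`, `marketing`) are assumptions.

```python
def simulate_commitment(steps=500, dt=0.1,
                        opportunism=(0.6, 0.7),  # negative term: decay of commitment
                        trust=(0.3, 0.2),        # positive term: reaction to the other's commitment
                        marketing=0.2):          # positive term: seller's marketing effort
    """Forward-Euler integration of a two-partner commitment model.

    x[0] is the seller's commitment, x[1] the buyer's. Each derivative is a
    sum of a negative (opportunism) term and positive (trust, marketing)
    terms, mirroring the structure described in the abstract.
    """
    x = [0.0, 0.0]
    for _ in range(steps):
        dx0 = -opportunism[0] * x[0] + trust[0] * x[1] + marketing
        dx1 = -opportunism[1] * x[1] + trust[1] * x[0]
        x = [x[0] + dt * dx0, x[1] + dt * dx1]
    return x

final = simulate_commitment()
```

With these parameters the opportunism (decay) terms dominate the trust coupling, so the relationship settles at a positive equilibrium rather than growing without bound; weakening opportunism relative to trust makes the system diverge, which is one way to read the role of mutual trust in the abstract.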
3.
In open multiagent systems, agents depend on reputation and trust mechanisms to evaluate the behavior of potential partners. Often these evaluations are associated with a measure of reliability computed by the source agent. However, due to the subjectivity of reputation-related information, this can lead to serious problems when considering communicated social evaluations. In this paper, instead of considering only reliability measures computed by the sources, we provide a mechanism that allows the recipient to decide whether a piece of information is reliable according to its own knowledge. We do so by allowing the agents to engage in an argumentation-based dialog specifically designed for the exchange of social evaluations. We evaluate our framework through simulations. The results show that under most of the tested conditions, agents that use our dialog framework improve the accuracy of their evaluations by a statistically significant margin over agents that do not use it. In particular, the simulations reveal that when the set of agents is heterogeneous (not all agents have the same goals) and agents base part of their inferences on third-party information, it is worth using our dialog protocol.
4.
Matthew Boyle 《Acta Analytica》2010,25(1):9-20
I critically discuss the account of self-knowledge presented in Dorit Bar-On’s Speaking My Mind (OUP 2004), focusing on Bar-On’s understanding of what makes our capacity for self-knowledge puzzling and on her ‘neo-expressivist’
solution to the puzzle. I argue that there is an important aspect of the problem of self-knowledge that Bar-On’s account does
not sufficiently address. A satisfying account of self-knowledge must explain not merely how we are able to make accurate
avowals about our own present mental states, but how we can reasonably regard ourselves as entitled to claim self-knowledge. Addressing this aspect of the problem of self-knowledge requires confronting questions about the
metaphysical nature of mental states, questions that Bar-On’s approach seeks to avoid.
5.
Bojan Žalec 《Acta Analytica》2004,19(33):241-263
The article deals with the development of the philosophy of France Veber (1890–1975), the pupil of Meinong and a main Slovene
philosopher. One of the most important threads of Veber’s philosophy is the consideration of knowledge and factuality, which
may be seen as a driving force of its development. Veber’s philosophical development is usually divided into three phases:
the object theory phase, the phase when he created his philosophy of a person as a creature at the crossing of the natural
and the spiritual world, who as an active, not merely passive subject possesses her own causal powers, and the third phase,
when he supplemented his earlier philosophy with the theory of a special side of our experience which he called hitting-upon-reality.
It is a direct experience of reality, a special kind of intentionality, which is however fundamentally different from presentational
intentionality, which alone is taken into account by object theory or phenomenology. The questions of knowledge and factuality
are closely connected in Veber’s philosophy since, for Veber, knowledge is a kind of, we may say, justified experience the object of which is a factual entity. Hence, if we want
to understand what knowledge is, we must face the challenge of comprehending factuality. There are five stages to be noted
in the development of his epistemology. The first two belong to his object theory phase, the third to his person phase, the
fourth is characterised by his distinguishing and exploring truth and validity with regard to the thought about God, and the
basis of the fifth phase lies in his theory of hitting-upon-reality. In Introduction to Philosophy and The System of Philosophy, that is, in 1921, Veber believed that factuality (“truth”) was a property of the object, which we do present, though we do not present the factuality of this factuality (that is why he distinguishes between merely objective truths and truths that are in addition transcendental truths). In 1923, in The Problems of Contemporary Philosophy and in Science and Religion, he already rejected such a view: there is something that makes things factual, but it is a completely unknown X. Therefore we cannot even say what kind of an entity this factuality is. Some would probably demand the following formulation: if X is an ultimate mystery, we should not even claim that it is an entity. In The Problems of Presentation Production (1928) Veber claimed that factuality is not a property, since this would lead to a regressus ad infinitum. In Philosophy (1930) he related internally correct experience to personal will. In The Book about God (1934) he developed the thesis that factuality depends on the act of God. In The Question of Reality (1939) he importantly modified, developed and enriched the thesis that we do not present reality with his theory of the immediate experience of (hitting upon) factuality.
6.
Johann C. Marek 《Acta Analytica》2011,26(1):53-61
Experiences are interpreted as conscious mental occurrences that are of phenomenal character. There is already a kind of (weak)
intentionality involved with this phenomenal interpretation. A stricter conception of experiences distinguishes between purely
phenomenal experiences and intentional experiences in a narrow sense. Wittgenstein’s account of psychological (experiential)
verbs is adopted: usually, expressing mental states verbally is not describing them. According to this, “I believe” can
be seen as an expression of one’s own belief, but not as an expression of a belief about one’s belief. Hence, the utterance
“I believe it is raining” shows that I believe that it is raining, although it is not said by these words that I believe that it is raining. Thinking thoughts such as “I believe it is raining, but it is not raining”
(a variant of Moore’s paradox) is an absurdity between what is already said by silently uttering “It is not raining” and what
is shown by silently uttering “I believe it is raining.” The paper agrees with a main result of Wittgenstein’s considerations
of Moore’s paradox, namely the view that logical structure, deducibility, and consistency cannot be reduced solely to propositions—besides
a logic of propositions, there is, for example, a logic of assertions and a logic of imperatives.
7.
Enterprises often implement a measurement system to monitor their progress towards their strategic goals. Although this makes it possible to assess the progress of each goal, there is no structured way to reconsider the allocation of resources to those goals and to plan an optimal (or near-optimal) allocation scheme. In this study we propose a genetic approach that matches each goal with an autonomous entity (agent) with a specific resource-sharing behavior. The overall performance is evaluated through a set of functions, and genetic algorithms are used to arrive at approximately optimal behavior schemes. To outline the strategic goals of the enterprise we used the balanced scorecard method. Letting agents deploy their sharing behavior over simulation time, we measure the scorecard’s performance and identify distinguished behaviors, namely recommendations for resource allocation.
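The genetic approach can be illustrated with a toy version: a chromosome is a normalized vector of resource shares over the goals, evolved against a scorecard-style fitness function. Everything concrete here (the encoding, tournament selection, blend crossover, and the sample fitness) is an assumption for illustration, not the study’s implementation.

```python
import random

def evolve_allocation(fitness, n_goals=4, pop_size=30, gens=40, rng=None):
    """Tiny genetic algorithm: each chromosome is a share of resources per
    goal (normalized to sum to 1); tournament selection of size 2, blend
    crossover, and Gaussian mutation on one gene per child."""
    rng = rng or random.Random(0)

    def normalize(c):
        s = sum(c)
        return [x / s for x in c]

    pop = [normalize([rng.random() for _ in range(n_goals)])
           for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a = max(rng.sample(pop, 2), key=fitness)   # tournament winner 1
            b = max(rng.sample(pop, 2), key=fitness)   # tournament winner 2
            child = [(x + y) / 2 for x, y in zip(a, b)]  # blend crossover
            i = rng.randrange(n_goals)
            child[i] = max(1e-9, child[i] + rng.gauss(0, 0.1))  # mutation
            nxt.append(normalize(child))
        pop = nxt
    return max(pop, key=fitness)

# Toy scorecard: goal 0 is twice as valuable, with diminishing returns.
weights = [2.0, 1.0, 1.0, 1.0]
best = evolve_allocation(lambda c: sum(w * x ** 0.5 for w, x in zip(weights, c)))
```

The normalization step keeps every chromosome a feasible allocation, so selection pressure only ever moves shares between goals rather than inventing extra resources.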
8.
Thayer Morrill 《International Journal of Game Theory》2011,40(2):367-385
While a relationship in a social or business network should be mutually beneficial, it is ambiguous whether the relationship
benefits or harms the rest of the network. This paper focuses on the situation where any new relationship imposes a negative
externality on the rest of the network. We model this by assuming an agent’s payoff from a relationship is a decreasing function
of the number of relationships the other agent maintains. We solve for the socially efficient and stable networks. While in
general the two diverge, we demonstrate that they coincide when agents are able to make transfers to their partners.
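A minimal sketch of this payoff structure, assuming the concrete decreasing function 1/deg(j); the paper only requires that an agent’s payoff from a relationship be some decreasing function of the number of relationships the partner maintains.

```python
def payoff(adj, i):
    """Agent i's payoff: each link to a partner j is worth 1/deg(j), a
    stand-in decreasing function of j's number of relationships, so every
    new link j forms imposes a negative externality on j's partners."""
    return sum(1.0 / len(adj[j]) for j in adj[i])

# Triangle network: every agent maintains two links, each worth 1/2.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
total_welfare = sum(payoff(adj, i) for i in adj)
```

Comparing `total_welfare` across candidate networks is the kind of computation behind the paper’s contrast between socially efficient and stable networks.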
9.
David P. Hunt 《Acta Analytica》2007,22(1):3-15
The most serious challenge to Frankfurt-type counterexamples to the Principle of Alternate Possibilities (PAP) comes in the
form of a dilemma: either the counterexample presupposes determinism, in which case it begs the question; or it does not presuppose
determinism, in which case it fails to deliver on its promise to eliminate all alternatives that might plausibly be thought
to satisfy PAP. I respond to this challenge with a counterexample in which considering an alternative course of action is a necessary condition for deciding to act otherwise, and the agent does not in fact consider the alternative. I call this a “buffer case,” because the morally
relevant alternative is “buffered” by the requirement that the agent first consider the alternative. Suppose further that
the agent’s considering an alternative action—entering the buffer zone—is what would trigger the counterfactual intervener.
Then it would appear that PAP-relevant alternatives are out of reach.
I defend this counterexample to PAP against three objections: that considering an alternative is itself a morally relevant alternative; that buffer cases can be shown to contain other alternatives that arguably satisfy PAP; and that even if the agent’s present access to PAP-relevant alternatives were eliminated, PAP could still be satisfied in virtue of earlier alternatives. I conclude that alternative possibilities are a normal symptom, but not an essential constituent, of moral
agency.
10.
Luigi Sauro 《Computational & Mathematical Organization Theory》2006,12(2-3):147-168
It is desirable that artificial agents can help each other when they cannot achieve their goals, or when they profit from
social exchanges. In this work we study coalition formation processes supported by enforced agreements and we define two qualitative
criteria, the do-ut-des property and the composition property, that establish when a coalition is admissible to be formed.
The do-ut-des property is based on a balance between the advantages and the burdens of an agent when it agrees to an enforced agreement. The composition property is a refinement of the do-ut-des property that also takes into account the costs and
the risks deriving from the coalition formation process.
Two relevant aspects distinguish our approach from the solution criteria developed in cooperative game theory. First, the
do-ut-des property and the composition property are not based on an explicit utility function associated with the goals of an agent, and hence they can also be used in cases where the importance that agents give to their own goals is unknown. Second, a coalition has all the information necessary to establish whether it satisfies the do-ut-des property or the composition property, so these two properties can be used even when the whole space of possible coalitions is not known.
Luigi Sauro graduated in Physics at the University “Federico II” of Naples in 2001. From February 2002 to July 2002 he was a collaborator at the SRA division of the IRST Institute (Trento). He received his Ph.D. in Computer Science from the University of Torino in February 2006. Currently he is a member of the Natural Language Processing and Agents Group, directed by Prof. Leonardo Lesmo. His research interests include social reasoning, coalition formation and coordination in multiagent systems.
11.
This paper introduces a model for Distributed Employee Timetabling Problems (DisETPs) and proposes a general architecture for solving DisETPs by using a Multi Agent System (MAS) paradigm. The architecture is composed of a set of autonomous software Scheduling Agents (SAs) that solve the Employee Timetabling Problems (ETP) for each department. Each agent has its own local ETP problem and its own goals. The Scheduling Agents must coordinate
their local solution with the other agents in order to achieve a global solution for the whole organization that yields a
better result with respect to the organization’s global targets. To achieve a coherent and consistent global solution, the
SAs make use of a sophisticated negotiation protocol among scheduling agents that always ends in an agreement (not ensured to
be optimal). The main functionalities of this protocol are agent-to-agent relation definition, a mechanism to approve a chain of Requests for Changes, and an electronic marketplace for bidding on preferred common time slots.
Experimental analysis of the implemented Multi Agent System for the Soroka medical center is presented. The results of our study indicate that the proposed framework has the potential to reduce the cost of transportation for nurses traveling to and from the hospital.
12.
Declan Mungovan Enda Howley Jim Duggan 《Computational & Mathematical Organization Theory》2011,17(2):152-178
In this paper we explore the effect that random social interactions have on the emergence and evolution of social norms in
a simulated population of agents. In our model agents observe the behaviour of others and update their norms based on these
observations. An agent’s norm is influenced by both their own fixed social network plus a second random network that is composed
of a subset of the remaining population. Random interactions are based on a weighted selection algorithm that uses an individual’s
path distance on the network to determine their chance of meeting a stranger. This means that friends-of-friends are more
likely to randomly interact with one another than agents with a higher degree of separation. We then contrast the case where agents make rational, highest-utility decisions about which norm to adopt with one where they use a Markov Decision Process that
associates a weight with the best choice. Finally we examine the effect that these random interactions have on the evolution
of a more complex social norm as it propagates throughout the population. We discover that increasing the frequency and weighting of random interactions results in higher levels of norm convergence, reached in a shorter time, when agents have the choice between two competing alternatives. This can be attributed to more information passing through the population, thereby allowing for
quicker convergence. When the norm is allowed to evolve we observe both global consensus formation and group splintering depending
on the cognitive agent model used.
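The weighted selection of random partners can be sketched as follows: compute shortest-path distances on the agent’s fixed network and weight each non-neighbor by the inverse of its distance, so friends-of-friends (distance 2) are the most likely random contacts. The 1/distance weighting is an assumed stand-in for the paper’s actual weighted selection algorithm.

```python
import random
from collections import deque

def path_distances(adj, src):
    """BFS shortest-path distance from src to every reachable node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def pick_stranger(adj, agent, rng):
    """Pick a random non-neighbor, weighted by 1/distance, so agents at a
    lower degree of separation have a higher chance of meeting."""
    dist = path_distances(adj, agent)
    strangers = [v for v, d in dist.items() if d >= 2]
    weights = [1.0 / dist[v] for v in strangers]
    return rng.choices(strangers, weights=weights, k=1)[0]

# Line network 0-1-2-3: from agent 0, agent 2 (distance 2, weight 1/2)
# is more likely than agent 3 (distance 3, weight 1/3).
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
stranger = pick_stranger(adj, 0, random.Random(42))
```

Each simulation round would combine one interaction drawn this way with interactions over the agent’s fixed social network before the norm update.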
13.
Mark Steen 《Acta Analytica》2011,26(2):135-154
Ned Markosian argues (Australasian Journal of Philosophy 76:213-228, 1998a; Australasian Journal of Philosophy 82:332-340, 2004a; The Monist 87:405-428, 2004b) that simples are ‘maximally continuous’ entities. This leads him to conclude that there could be non-particular ‘stuff’ in addition to things. I first show how an ensuing debate on this issue between McDaniel (Australasian Journal of Philosophy 81(2):265-275, 2003) and Markosian (Australasian Journal of Philosophy 82:332-340, 2004a) ended in deadlock. I attempt to break the deadlock. Markosian’s view entails stuff-thing coincidence, which I show is just
as problematic as the more oft-discussed thing-thing coincidence. Also, the view entails that every particular is only contingently
so. If there is a world W like our own, but with ether, then there would be only one object in W. But, since merely adding ether to a world does not destroy the entities in it, then W contains counterparts of all the entities in the actual world—they just are not things. Hence, if simples are maximally continuous,
then every actual particular is only contingently so. This in turn entails the following disjunction: (i) identity is contingent
or intransitive, or (ii) there are no things at all in the actual world, or (iii) the distinction between stuff and things
is one without a difference. I recommend that we reject this stuff-thing dualism.
14.
Dave Pratt Richard Noss 《International Journal of Computers for Mathematical Learning》2010,15(2):81-97
Our focus is on the design of systems (pedagogical, technical, social) that encourage mathematical abstraction, a process
we refer to as designing for abstraction. In this paper, we draw on detailed design experiments from our research on children’s understanding about chance and distribution
to re-present this work as a case study in designing for abstraction. Through the case study, we elaborate a number of design
heuristics that we claim are also identifiable in the broader literature on designing for mathematical abstraction. Our previous
work on the micro-evolution of mathematical knowledge indicated that new mathematical abstractions are routinely forged in
activity with available tools and representations, coordinated with relatively naïve unstructured knowledge. In this paper,
we identify the role of design in steering the micro-evolution of knowledge towards the focus of the designer’s aspirations.
A significant finding from the current analysis is the identification of a heuristic in designing for abstraction that requires
the intentional blurring of the key mathematical concepts with the tools whose use might foster the construction of that abstraction.
It is commonly recognized that meaningful design constructs emerge from careful analysis of children’s activity in relation
to the designer’s own framework for mathematical abstraction. The case study in this paper emphasizes the insufficiency of
such a model for the relationship between epistemology and design. In fact, the case study characterises the dialectic relationship
between epistemological analysis and design, in which the theoretical foundations of designing for abstraction and for the
micro-evolution of mathematical knowledge can co-emerge.
15.
This paper presents a bilevel fuzzy principal-agent model for optimal nonlinear taxation problems with asymmetric information, in which the government and the monopolist are the principals and the consumer is their agent. Since the assessment of the government and the monopolist about the consumer’s taste is subjective, it is reasonable to characterize this assessment as a fuzzy variable. Moreover, a bilevel fuzzy optimal nonlinear taxation model is developed with the purpose of maximizing the expected social welfare and the monopolist’s expected welfare under the incentive feasible mechanism. An equivalent model for the bilevel fuzzy optimal nonlinear taxation model is presented, and the Pontryagin maximum principle is adopted to obtain the necessary conditions for solutions of the fuzzy optimal nonlinear taxation problems. Finally, a numerical example is given to illustrate the effectiveness of the proposed model; the results demonstrate that the quantity the consumer purchases depends not only on the consumer’s taste but also on the structure of the social welfare.
16.
Dmitry A. Shapiro 《International Journal of Game Theory》2009,38(1):81-106
A popular approach to explain over-contribution in public good games is based on the assumption that people care (either positively
or negatively) about the utility of other participants. Over-contribution then is an outcome of utility maximization where
utility depends on subjects’ own payoffs as well as on the payoffs of other players. In this paper, I study to what extent this assumption of utility interdependence is responsible
for over-contribution. I design three treatments where subjects’ decisions cannot affect opponents’ payoffs and thus utility
interdependence cannot explain cooperative behavior. The main result is that while average contribution in these treatments
is below the benchmark it nonetheless stays well above zero. Even when no one benefits from subjects’ generosity the average
contributions are as high as one third of the endowment and are only 25% below those in the benchmark level. This suggests
that utility interdependence is not the main factor responsible for over-contribution.
I would like to thank my dissertation advisor Shyam Sunder for his valuable suggestions that helped improve this paper. I
am also grateful to Dan Levin, Ben Polak, Klaus Schmidt, Andrew Schotter, Anat Bracha, Danielle Catambay, Rodney Chan, two
anonymous referees and the anonymous Associate Editor for their comments and suggestions. Participants of the 17th International
Conference on Game Theory at Stony Brook University, the 2006 ESA International Meeting and the 10th Biennial Behavioral Decision Research in Management Conference provided valuable feedback at earlier stages of the paper. Finally, I would like to thank the Whitebox Fellowship for its generous support of this study.
17.
Danilo Šuster 《Acta Analytica》2005,20(4):41-52
Davies argues that the ontology of artworks as performances offers a principled way of explaining the work-relativity of modality. Object-oriented contextualist ontologies of art (Levinson) cannot adequately
address the problem of work-relativity of modal properties because they understand looseness in what counts as the same context
as a view that slight differences in the work-constitutive features of provenance are work-relative. I argue that it is more
in the spirit of contextualism to understand looseness as context-dependent. This points to the general problem—the context
of appreciation is not robust enough to ground modal intuitions about objective entities. In general, when epistemology dictates
ontology there is always a threat of anti-realism, scepticism and relativism. Davies also appeals to the modality principle—an
entity’s essential properties are all and only its constitutive properties. Davies understands essentiality in a traditional
way: a property P is an essential property of an object o iff o could not exist and lack P. Kit Fine has recently made a convincing case for the view that the notion of essence is not to
be understood in modal terms. I explore some of the implications of this view for Davies’ modal argument for the performance
theory.
18.
Feichtinger Gustav Hartl Richard F. Kort Peter M. Seidl Andrea Wrzaczek Stefan 《Journal of Optimization Theory and Applications》2022,194(3):878-895
Journal of Optimization Theory and Applications - This paper considers a capital accumulation game where the installation costs of investments are lowered by the firm’s own capital stock...
19.
In this paper, a new hybrid method is proposed for solving nonlinear complementarity problems (NCP) with P0 functions. In the new method, we combine a smoothing nonmonotone trust region method based on a conic model with line search techniques. We reformulate the NCP as a system of semismooth equations using the Fischer-Burmeister function. Using Kanzow’s smooth approximation function to construct the smooth operator, we propose a smoothing nonmonotone trust region algorithm of a conic model for solving the NCP with P0 functions. This differs from classical trust region methods in that when a trial step is not accepted, the method does not resolve the trust region subproblem but generates an iterative point whose steplength is defined by a line search. We prove that every accumulation point of the sequence generated by the algorithm is a solution of the NCP. Under a nonsingularity condition, the superlinear convergence of the algorithm is established without a strict complementarity condition.
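For reference, a sketch of the Fischer-Burmeister function used in such reformulations, together with a smooth approximation in the spirit of Kanzow; the exact smoothing operator and parameterization used in the paper may differ.

```python
import math

def fischer_burmeister(a, b):
    """phi(a, b) = sqrt(a^2 + b^2) - a - b, which equals zero exactly when
    a >= 0, b >= 0 and a*b = 0, so phi(a, b) = 0 encodes a complementarity
    condition as a single (semismooth) equation."""
    return math.sqrt(a * a + b * b) - a - b

def smoothed_fb(a, b, mu):
    """Smoothed variant sqrt(a^2 + b^2 + 2*mu) - a - b: differentiable
    everywhere for mu > 0 and recovers fischer_burmeister as mu -> 0."""
    return math.sqrt(a * a + b * b + 2.0 * mu) - a - b

# Complementarity pair (a, b) = (0, 3) satisfies phi = 0; the infeasible
# pair (-1, 1) does not.
val = fischer_burmeister(0.0, 3.0)
```

Stacking one such equation per complementarity pair turns the NCP into a square system of semismooth equations, which is what the trust region iteration then drives to zero.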
20.
Daniel Breyer 《Acta Analytica》2010,25(2):133-154
A belief is reflectively lucky if it is a matter of luck that the belief is true, given what a subject is aware of on reflection
alone. Various epistemologists have argued that any adequate theory of knowledge should eliminate reflective luck, but doing
so has proven difficult. This article distinguishes between two kinds of reflective luck arguments in the literature: local
arguments and global arguments. It argues that local arguments are best interpreted as demanding, not that one be reflectively
aware of the reliability of the sources of one’s beliefs, but that one’s beliefs be attributable to one as one’s own. The
article then argues that global arguments make illegitimate demands, because they require that we be ultimately answerable
for our beliefs. In the end, the article argues that epistemologists should shift their focus away from reflective luck and
toward the conditions under which beliefs are attributable to cognitive agents.