Similar Literature
20 similar documents found.
1.
This paper studies one of the most important types of measurement that has arisen from the social sciences, the additive conjoint measurement introduced by Debreu (1960) and Luce and Tukey (1964). It also considers the variant we call additive conjoint extensive measurement. Both types of measurement are based on qualitative comparisons between multiattribute alternatives in Cartesian products of sets. This paper initiates a study of their uniqueness for the case in which all sets are finite. It considers uniqueness up to similar positive affine transformations for additive conjoint measurement, and uniqueness up to similar proportionality transformations for additive conjoint extensive measurement. Both types of uniqueness are related to sets of ‘indifference’ comparisons that correspond to sets of linearly independent equations for the measurement representation. After we explicate necessary and sufficient conditions for uniqueness, we explore specific aspects of sets of unique solutions for two-factor (two-set) additive conjoint measurement and two-factor additive conjoint extensive measurement.
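For reference, the two-factor additive conjoint representation under discussion takes the standard form below. This is the textbook formulation following Luce and Tukey, not an equation reproduced from the paper itself:

```latex
% Two-factor additive conjoint representation over A_1 \times A_2:
(a, x) \succsim (b, y) \iff \phi_1(a) + \phi_2(x) \ge \phi_1(b) + \phi_2(y)
% Uniqueness up to similar positive affine transformations
% (common unit \alpha, separate origins \beta_i):
\phi_i' = \alpha \phi_i + \beta_i, \qquad \alpha > 0,\ i = 1, 2
```

For the extensive variant, the admissible transformations drop the additive constants, leaving similar proportionality transformations \(\phi_i' = \alpha \phi_i\).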

2.
Fuzzy Sets and Systems, 1987, 21(2): 201-209
In the same way as ‘comprehensive fuzziness’ deals with σ-fields of labels, ‘extensive fuzziness’ deals with objects. In this paper, the concept of ‘extensive mapping’ is defined in order to justify the notion of ‘measurable fuzzy function’. This is the natural continuation of the article “Comprehensive fuzziness”, recently published in this Journal.

3.
We are given a set of objects, each characterized by a weight and a fragility, and a large number of uncapacitated bins. Our aim is to find the minimum number of bins needed to pack all objects, in such a way that in each bin the sum of the object weights is less than or equal to the smallest fragility of an object in the bin. The problem is known in the literature as the Bin Packing Problem with Fragile Objects, and appears in the telecommunication field, when one has to assign cellular calls to available channels by ensuring that the total noise in a channel does not exceed the noise acceptance limit of a call. We propose a branch-and-bound and several branch-and-price algorithms for the exact solution of the problem, and improve their performance by the use of lower bounds and tailored optimization techniques. In addition we also develop algorithms for the optimal solution of the related knapsack problem with fragile objects. We conduct an extensive computational evaluation on the benchmark set of instances, and show that the proposed algorithms perform very well.
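The feasibility constraint stated above (the weight sum in a bin must not exceed the smallest fragility in that bin) is easy to express in code. The following minimal first-fit sketch is for illustration only; it is a simple greedy heuristic, not the branch-and-bound or branch-and-price algorithms proposed in the paper:

```python
def bin_feasible(weights, fragilities):
    """A bin is feasible iff its total weight does not exceed
    the smallest fragility of any object in it."""
    return sum(weights) <= min(fragilities)

def first_fit_fragile(objects):
    """Greedy first-fit heuristic. `objects` is a list of
    (weight, fragility) pairs; returns a list of bins, each a
    list of the objects packed into it."""
    bins = []
    for w, f in objects:
        for b in bins:
            # Would the bin remain feasible with this object added?
            ws = [o[0] for o in b] + [w]
            fs = [o[1] for o in b] + [f]
            if bin_feasible(ws, fs):
                b.append((w, f))
                break
        else:
            bins.append([(w, f)])  # open a new bin
    return bins
```

Note that adding a very fragile object can make a previously feasible bin infeasible, which is what distinguishes this problem from ordinary bin packing.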

4.
This study examined students' accuracy of measurement estimation for linear distances, different units of measure, task context, and the relationship between estimation accuracy and logical thinking. Middle school students completed a series of tasks that included estimating the length of various objects in different contexts and completed a test of logical thinking ability. Results showed that the students were not able to give accurate estimations for the lengths of familiar objects. Students were also less accurate in estimating in metric units as compared to English or novel units. Estimation accuracy was dependent on the task context. There were significant differences in estimation accuracy for two- versus three-dimensional estimation tasks. There were no significant differences for estimating objects with different orientations or embedded objects. For the tasks requiring the students to estimate in English units, the embedded task and the three-dimensional tasks were correlated with logical thinking. For estimation tasks with novel units, three-dimensional and two-dimensional estimation tasks were significantly correlated with logical thinking. In order to interact effectively with our environment it is essential to possess an intuitive grasp of both dimension and scale and to be able to manipulate such information. Estimation, approximating and measuring are all components of such intuition (Forrester, Latham, & Shire, 1990, p. 283).

5.
Various metrics and functions of metrics are examined. A number of results concerning the identification of similarities and distinctions between objects are presented for different measurement techniques.

6.
Branching structures, also known as topological tree structures, are fundamental to any hierarchical classification that aims to relate objects according to their similarities or dissimilarities. This paper provides a rigorous treatment of these structures, and continues previous work of Colonius and Schulze on H-structures. Thus extensive use is made of the so-called neighbors relation associated with a dissimilarity index. Arbitrary dissimilarity data are then analyzed by comparing their neighbors relations with ideal, that is, tree-like relations: if a neighbors relation matches an ideal relation, then one can readily construct a tree representing the data that is optimal in a certain sense. Finally, some algorithms are proposed for fitting observed data to tree-like data.

7.
A multi-stage information aggregation method based on a generalized incentive control line
To address the problem of aggregating multi-stage evaluation information in time-series dynamic comprehensive evaluation, this paper presents an aggregation method based on a generalized incentive control line. The method captures the decision maker's expectations for the development of the evaluated objects, and applies the dual management mechanism of "control and incentive" during aggregation, thereby clearly embodying the decision maker's intent. Sustained use of the method in practice can favorably guide the evaluated objects toward continued, sound development. Finally, a numerical example verifies the effectiveness of the method.

8.
Thomas Sattig, Metaphysica, 2013, 14(2): 211-223
The problem of the many poses the task of explaining mereological indeterminacy of ordinary objects in a way that sustains our familiar practice of counting these objects. The aim of this essay is to develop a solution to the problem of the many that is based on an account of mereological indeterminacy as having its source in how ordinary objects are, independently of how we represent them. At the center of the account stands a quasi-hylomorphic ontology of ordinary objects as material objects with multiple individual forms.

9.
In the framework of an algebraic approach, we consider a quantum teleportation procedure. It turns out that using the quantum measurement nonlocality hypothesis is unnecessary for describing this procedure. We study the question of what material objects are information carriers for quantum teleportation. Translated from Teoreticheskaya i Matematicheskaya Fizika, Vol. 157, No. 1, pp. 79–98, October 2008.

10.
In 1993, Puczylowski established a general theory of radicals and semisimple classes in a class of objects, constructed from an axiom system, whose elements are called algebras. The aim of this paper is to discuss torsion theory in this most general of algebraic systems and to give some characterizations of it by lattice-theoretic methods.

11.
Quantitative measurement of the similarity of partitions is a problem of particular relevance to the social and behavioral sciences, where experimental procedures necessitate the analysis and comparison of partitions of objects. Since metrics used for this purpose vary considerably in computational complexity, I describe two related metric models that permit methodical enumeration of metrics which may be useful and computationally tractable. Twelve metrics on partitions are identified in this way. Five of them have appeared in the literature, while seven appear to be new. Four of them seem difficult to compute, but efficient algorithms for the remaining eight exist and exhibit time complexities ranging from O(n) to O(n^3), where n is the number of objects in the partitions.  These algorithms are all based on lattice- and graph-theoretic representations of the computational problems.
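As an illustration of a computationally cheap metric on partitions (not necessarily one of the twelve enumerated in the paper), the following counts the object pairs on which two partitions disagree, i.e. pairs placed together in one partition but separated in the other. The naive implementation shown is O(n^2) in the number of objects:

```python
from itertools import combinations

def pair_disagreement(p, q):
    """Count object pairs that are together in one partition but
    separated in the other. `p` and `q` map each object to the
    label of its block; both must cover the same object set."""
    d = 0
    for a, b in combinations(list(p), 2):
        same_p = p[a] == p[b]
        same_q = q[a] == q[b]
        if same_p != same_q:  # the partitions disagree on this pair
            d += 1
    return d
```

This quantity is zero exactly when the two partitions are identical, and it is symmetric, which is what makes it a natural candidate distance.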

12.
In this paper an image-based control for an optomechanical image derotator is implemented. A derotator is an optical system to support measurements on rotating components by tracking their rotational movement. As a consequence, the position and rotational velocity of the measurement object have to be known continuously. In general this would be accomplished by measuring these variables using a rotary encoder. However, not all measuring objects are equipped for this task. As a solution universally applicable to a wide range of measuring objects, an image-based approach is developed in the scope of this work. The object is captured with a high-speed camera to determine its position and velocity by image processing algorithms. To prove the applicability of this concept, a controller using the data acquired with the camera and a controller using data of the rotary encoder are compared. (© 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim)

13.
Existing studies of on-line process control are concerned with economic aspects, and the parameters of the processes are optimized with respect to the average cost per item produced. However, an equally important dimension is the adoption of an efficient maintenance policy. In most cases, only the frequency of the corrective adjustment is evaluated because it is assumed that the equipment becomes “as good as new” after corrective maintenance. For this condition to be met, a sophisticated and detailed corrective adjustment system needs to be employed. The aim of this paper is to propose an integrated economic model incorporating the following two dimensions: on-line process control and a corrective maintenance program. Both dimensions are optimized with respect to the average cost per item. Adjustments are based on the location of the measurement of a quality characteristic of interest in one of three decision zones. Numerical examples illustrate the proposal.

14.
In this paper we consider the problem of locating an axis-parallel rectangle in the plane such that the sum of distances between the rectangle and a finite point set is minimized, where the distance is measured by the Manhattan norm ℓ1. In this way we solve an extension of the Weber problem to extensive facility location. As a model, our problem is appropriate for position sensing of rectangular objects.
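For intuition, the ℓ1 distance from a point to an axis-parallel rectangle decomposes coordinate-wise, which makes the objective of this extensive Weber problem cheap to evaluate for a candidate rectangle. A minimal sketch (the function names and data layout are our own, not from the paper):

```python
def l1_dist_to_rect(p, rect):
    """Manhattan distance from point p = (px, py) to the
    axis-parallel rectangle rect = (x1, y1, x2, y2), with
    x1 <= x2 and y1 <= y2. Zero if p lies inside."""
    px, py = p
    x1, y1, x2, y2 = rect
    # Per-axis overshoot beyond the rectangle's interval (0 if inside).
    dx = max(x1 - px, 0, px - x2)
    dy = max(y1 - py, 0, py - y2)
    return dx + dy

def total_l1_dist(points, rect):
    """Objective value: sum of ℓ1 distances from all points."""
    return sum(l1_dist_to_rect(p, rect) for p in points)
```

Because the objective separates into an x-part and a y-part, the two coordinates can in principle be optimized independently, which is the structural feature such location models typically exploit.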

15.
Minor mathematics refers to the mathematical practices that are often erased by state-sanctioned curricular images of mathematics. We use the idea of a minor mathematics to explore alternative measurement practices. We argue that minor measurement practices have been buried by a ‘major’ settler mathematics, a process of erasure that distributes ‘sensibility’ and formulates conditions of mathematics dis/ability. We emphasize how measuring involves the making and mixing of analogies, and that this involves attending to intensive relationships rather than extensive properties. Our philosophical and historical approach moves from the archeological origins of human measurement activity, to pivotal developments in modern mathematics, to configurations of curriculum. We argue that the project of proliferating multiple mathematics is required in order to disturb narrow (and perhaps white, western, male) images of mathematics, and to open up opportunities for a more pluralist and inclusive school mathematics.

16.
When taken literally, mathematical texts are concerned with objects of a specific kind and learning mathematics, among other things, requires students to make sense of those mathematical objects. Understanding mathematical objects is commonly described as a cognitive construction. It is proposed and substantiated that for those objects to emerge for the individual, the construction processes have to be supplemented by a deliberate decision to view, treat, use, and investigate a structure or a collection of items as a unified object. This decision strongly depends on the mediation by symbols, diagrams, and notational systems.


18.
We study the problem of allocating a set of objects, e.g. houses, tasks, or offices, to a group of people having preferences over these objects. For various reasons, there may be more or fewer objects than initially planned and allocated. How should such unexpected changes be handled? One way is to declare the initial decision irrelevant and reallocate all available objects. Alternatively, one can use the initial decision as starting point in allocating the new objects. Since both perspectives seem equally reasonable, a natural robustness principle on the rule is that it should produce the same outcome no matter which one is taken. We define two robustness properties based on this idea, pertaining to more objects and fewer objects, respectively. We characterize the family of rules that satisfy mild efficiency, fairness and incentives requirements, together with either one of our robustness properties. They are the family of serial dictatorship rules.
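The serial dictatorship rules in the characterization are simple to state algorithmically: agents choose in a fixed priority order, each taking their most-preferred object still available. A minimal sketch (the data layout is our own assumption, not from the paper):

```python
def serial_dictatorship(order, prefs, objects):
    """Serial dictatorship allocation.
    order   : list of agents in priority order
    prefs   : dict mapping each agent to a ranked list of objects
    objects : the objects currently available
    Returns a dict agent -> object (None if nothing acceptable remains)."""
    available = set(objects)
    allocation = {}
    for agent in order:
        # Take the agent's highest-ranked object that is still available.
        pick = next((o for o in prefs[agent] if o in available), None)
        allocation[agent] = pick
        if pick is not None:
            available.discard(pick)
    return allocation
```

The robustness intuition is visible here: because each agent's pick depends only on what higher-priority agents removed, adding an extra, lower-ranked object never changes the earlier picks.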

19.
The Hough transform is a common computer vision algorithm used to detect shapes in a noisy image. Originally the Hough transform was proposed as a technique for detection of straight lines in images. In this paper we study the statistical properties of the Hough transform estimator in the presence of measurement errors. We consider the simple case of detection of one line parameterized in polar coordinates. We show that the estimator is consistent, and possesses a rate of convergence of the cube-root type. We derive its limiting distribution, and study its robustness properties. Numerical results are discussed as well. In particular, based on extensive experiments, we define a “rule of thumb” for the determination of the optimal width parameter of the template used in the algorithm.
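For concreteness, a minimal Hough transform for a single line in the polar parameterization rho = x cos(theta) + y sin(theta) can be sketched as follows. The simple accumulator discretization here is only illustrative; it does not reflect the template or the width parameter studied in the paper:

```python
import math

def hough_line(points, n_theta=180, rho_res=0.05):
    """Estimate one line in polar form (theta, rho), where
    rho = x*cos(theta) + y*sin(theta), by voting in a
    discretized (theta, rho) accumulator and returning the
    parameters of the most-voted cell."""
    acc = {}
    for x, y in points:
        for i in range(n_theta):
            theta = math.pi * i / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            key = (i, round(rho / rho_res))  # quantize rho
            acc[key] = acc.get(key, 0) + 1
    (i, r), _ = max(acc.items(), key=lambda kv: kv[1])
    return math.pi * i / n_theta, r * rho_res
```

A horizontal line y = c, for instance, should come out as theta = pi/2 and rho = c; with measurement errors on the points, the votes smear over neighboring cells, which is where the template width studied in the paper enters.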

20.
Cluster analysis is an unsupervised learning technique for partitioning objects into several clusters. Assuming that noisy objects are included, we propose a soft clustering method which assigns objects that are significantly different from noise into one of the specified number of clusters by controlling decision errors through multiple testing. The parameters of the Gaussian mixture model are estimated from the EM algorithm. Using the estimated probability density function, we formulate a multiple hypothesis test for the clustering problem, and the positive false discovery rate (pFDR) is calculated as our decision error. The proposed procedure classifies objects into significant data or noise simultaneously according to the specified target pFDR level. When applied to real and artificial data sets, it was able to control the target pFDR reasonably well, offering a satisfactory clustering performance.
