Similar Documents
 20 similar documents found (search time: 15 ms)
1.
Graphic representations are powerful tools used by scientists and other professionals to help them understand multifaceted natural phenomena. They can also serve teachers and students as they attempt to understand complex data sets. This study examines pencil‐and‐paper graphs produced by students at the beginning of a 1‐week summer teacher/student institute, as well as computer‐based graphs produced by the same students at the end of the institute. Initial problems with managing the data set and producing meaningful graphs disappeared quickly as students used a process of “building up” to handle the complexity of web‐based data on water quality. This process is examined, as are findings from the institute related to (a) barriers to accessing web‐based data, (b) students' problem‐solving processes, and (c) the promise of this approach for learning about environmental science issues.

2.
Through the GK‐12 program of the National Science Foundation, graduate student fellows in a coastal marine and wetland studies program were trained to present targeted science concepts to middle‐ and high‐school classes through their own research‐based lessons. Initially, they were taught to follow the 5‐E learning cycle in lesson plan development, but a streamlined approach targeting the three attributes of science concepts—macroscopic, model, and symbolic—was found to be a better approach, while still incorporating key facets of the 5‐E model. Evaluation of the level of inquiry in the classrooms was determined using an inquiry scale from 0 to 4, differentiated by the relative number of actions that are student‐centered. The graduate fellows consistently delivered lessons at the targeted levels 2 or 3, guided inquiry. In order to assess student learning, the GK‐12 fellows were trained to develop single‐item pre‐ and post‐assessments designed to probe middle‐level and high‐school students' understanding of the macroscopic, model, and symbolic attributes of targeted science concepts. For the lessons based on the research of the fellows, about 80% of the students showed statistically and practically significant learning gains. The GK‐12 fellows positively impact the classroom and are effective science ambassadors.

3.
Two agents control the areas in which a migrating fish stock is located. The harvesting is sequential. The stock available to Agent 1 depends on the growth of the stock, which in turn depends on the amount left after harvesting by Agent 2. The stock available to Agent 2 is the quantity left after harvesting by Agent 1. Each agent fishes the stock down in each period to an “abandonment level” it deems appropriate. The problem is analyzed as both a noncooperative and a cooperative repeated game with an infinite time horizon. In the noncooperative solution, both agents will harvest the stock if the unit cost of Agent 2 is not too much higher than the unit cost of Agent 1. A cooperative solution, supported by a threat to revert to the noncooperative solution if deviation occurs, permits larger differences in unit costs at which both agents will harvest the stock. The problem is illustrated by a simple numerical example.
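
As a concrete illustration of the harvest timing just described, the sketch below simulates one possible parameterization in Python. The growth function, parameter values, and abandonment levels e1 and e2 are invented for illustration; deriving the equilibrium abandonment levels is the subject of the paper.

```python
# A minimal sketch (hypothetical parameters, not the paper's model) of the
# harvest timing described above: Agent 1 harvests, Agent 2 harvests the
# remainder, and the escapement then grows before returning to Agent 1.

def growth(x, r=1.0, K=1.0):
    """Logistic growth of the escapement between seasons (an assumption)."""
    return x + r * x * (1.0 - x / K)

def simulate(e1=0.5, e2=0.35, x0=1.0, periods=4):
    x = x0
    for t in range(periods):
        h1 = max(x - e1, 0.0)        # Agent 1 fishes the stock down to e1
        h2 = max(x - h1 - e2, 0.0)   # Agent 2 fishes the remainder down to e2
        x = growth(x - h1 - h2)      # escapement grows into the next period
        print(f"period {t}: h1={h1:.3f}  h2={h2:.3f}  next stock={x:.3f}")

simulate()
```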

4.
High‐quality after‐school programs devoted to science have the potential to enhance students' science knowledge and attitudes, which may impact their decisions about pursuing science‐related careers. Because of the unique nature of these informal learning environments, an understanding of the relationships among aspects of students' content knowledge acquisition and attitudes toward science may aid in the development of effective science‐related interventions. We investigated the impact of a semester‐long after‐school intervention utilizing an inquiry‐based infectious diseases curriculum (designed for use after school) on 63 urban students' content knowledge and aspects of their attitudes toward science. Content knowledge increased 24.6% from pretest to posttest. Multiple regression analyses suggested that the “self‐directed effort” subscale of the Simpson–Troost Attitude Questionnaire—Revised best predicted increases in students' science content knowledge, while the construct “science is fun for me” acted as a suppressor. These findings suggest that by strengthening the attitude components most closely associated with gains in content knowledge, such as students' perceptions of their own self‐directed effort, future after‐school programs might improve students' enthusiasm and academic preparedness for additional science coursework.
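
For readers unfamiliar with suppression, the following synthetic sketch (not the study's data or variables) shows the statistical mechanism: a variable nearly uncorrelated with the outcome can still raise R² by absorbing criterion-irrelevant variance in another predictor.

```python
# Illustrative suppressor-effect demo with invented data.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
effort = rng.normal(size=n)                       # stands in for "self-directed effort"
noise  = rng.normal(size=n)
fun    = 0.8 * noise + 0.2 * rng.normal(size=n)   # shares nuisance variance, not signal
x1     = effort + noise                           # observed effort score is contaminated
y      = effort + 0.3 * rng.normal(size=n)        # content-knowledge gain

def r2(cols, y):
    X = np.column_stack([np.ones(len(y)), *cols])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

print("corr(fun, y):", round(np.corrcoef(fun, y)[0, 1], 3))   # ~0
print("R^2, effort score alone:", round(r2([x1], y), 3))
print("R^2, adding suppressor: ", round(r2([x1, fun], y), 3)) # noticeably higher
```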

5.
An enduring concern among science education researchers is the “swing away from science” (Osborne, 2003). One of their central dilemmas is to identify—or construct—a valid outcome measure that could assess curricular effectiveness and predict students' choices of science courses, university majors, or careers in science. Many instruments have been created and variably evaluated. The primary purpose of this paper was to re‐evaluate the psychometric properties of the Image of Science and Scientists Scale (ISSS) (Krajkovich, 1978). In the current study, confirmatory factor analysis (CFA) was used to examine the dimensionality of the 29‐item ISSS, which was administered to 531 middle school students in three San Antonio, Texas school districts at the beginning of the 2004–2005 school year. The results failed to confirm the presumed 1‐factor structure of the ISSS, but instead showed a 3‐factor structure with only marginal fit to the data, even after removal of 12 inadequate items. The three dimensions were “Positive Images of Scientists” (5 items), “Negative Images of Scientists” (9 items), and “Science Avocation” (3 items). The results do not support use of the original form of the ISSS for measuring “attitudes toward science,” “images of scientists,” or “scientific attitudes.” Shortening the scale from 29 to 17 items makes it more feasible to use in a classroom setting. Determining whether the three dimensions identified in our analysis, “Positive Images of Scientists,” “Negative Images of Scientists,” and “Science Avocation,” contain useful assessments of middle school student impressions and attitudes will require independent investigation in other samples.
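
A CFA of this kind can be sketched in Python, for example with the third-party semopy package (lavaan in R is the more common choice). The model syntax, item names, and data file below are placeholders mirroring the 5/9/3 structure reported above, not the actual ISSS items.

```python
# Hedged CFA sketch using semopy; the CSV and item names are hypothetical.
import pandas as pd
import semopy

desc = """
Positive  =~ i1 + i2 + i3 + i4 + i5
Negative  =~ i6 + i7 + i8 + i9 + i10 + i11 + i12 + i13 + i14
Avocation =~ i15 + i16 + i17
"""

df = pd.read_csv("isss_responses.csv")   # hypothetical 17-item response file
model = semopy.Model(desc)
model.fit(df)
print(semopy.calc_stats(model))          # fit indices (chi-square, CFI, RMSEA, ...)
```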

6.
7.
To handle the ubiquitous problem of “dependence learning,” copulas are quickly becoming a pervasive tool across a wide range of data‐driven disciplines encompassing neuroscience, finance, econometrics, genomics, social science, machine learning, healthcare, and many more. At the same time, despite their practical value, the empirical methods of “learning copulas from data” have been unsystematic and full of case‐specific recipes. Taking inspiration from modern LP‐nonparametrics, this paper presents a modest contribution toward a more unified and structured approach to copula modeling that is simultaneously valid for arbitrary combinations of continuous and discrete variables.
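
As a baseline for what “learning a copula from data” involves in the simplest parametric case, the sketch below fits a Gaussian copula via normal scores. This is generic textbook practice, not the LP-nonparametric method proposed in the paper.

```python
# Fit a Gaussian copula: (1) rank-transform each margin to uniforms,
# (2) map to normal scores, (3) estimate the correlation parameter.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.gamma(2.0, size=500)
y = 0.5 * x + rng.normal(size=500)           # dependent, non-normal margins

u = stats.rankdata(x) / (len(x) + 1)         # pseudo-observations in (0, 1)
v = stats.rankdata(y) / (len(y) + 1)
z = stats.norm.ppf(np.column_stack([u, v]))  # normal scores
rho = np.corrcoef(z.T)[0, 1]                 # copula correlation parameter
print(f"fitted Gaussian-copula rho: {rho:.3f}")
```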

8.
Approaches to test score use and test purpose lack the well-developed methodological guidelines and established sources of evidence available for intended score interpretation. We argue in this paper that this lack fails to reflect the ultimate purpose of a test score—to help solve an important problem faced by intended test users. We explore the treatment of intended test purpose and test score use under the chain of assumptions/inferences perspective identified within an argument-based approach to validity. Next, we revisit the notion of test score use and argue that, at least for classroom assessments based on complex constructs, such as learning progressions in math and science, test score use can be more effectively conceptualized as part of a potential solution to a problem, or “job-to-be-done.” We argue for shifting from the definition of validity to the concept of effectiveness. Finally, we illustrate an argument-based approach to test score effectiveness by contrasting effectiveness arguments for interim assessments based on a conventional test blueprint or a test blueprint augmented with learning progressions.

9.
We study the flow of two immiscible fluids of different density and mobility in a porous medium. If the heavier phase lies above the lighter one, the interface is observed to be unstable. The two phases start to mix on a mesoscopic scale and the mixing zone grows in time—an example of evolution of microstructure. A simple set of assumptions on the physics of this two‐phase flow in a porous medium leads to a mathematically ill‐posed problem—when used to establish a continuum free boundary problem. We propose and motivate a relaxation of this “nonconvex” constraint of a phase distribution with a sharp interface on a macroscopic scale. We prove that this approach leads to a mathematically well‐posed problem that predicts shape and evolution of the mixing profile as a function of the density difference and mobility quotient. © 1999 John Wiley & Sons, Inc.

10.
Consumer markets have been studied in great depth, and many techniques have been used to represent them. These have included regression‐based models, logit models, and theoretical market‐level models, such as the NBD‐Dirichlet approach. Although many important contributions and insights have resulted from studies that relied on these models, there is still a need for a model that can more holistically represent the interdependencies of the decisions made by consumers, retailers, and manufacturers. This need is particularly critical when a model must be used repeatedly over time to support decisions in an industrial setting. Although some existing methods can, in principle, represent such complex interdependencies, their capabilities might be outstripped by the level of detail this type of modeling requires in industrial applications. However, a complementary method—agent‐based modeling—shows promise for addressing these issues. Agent‐based models use business‐driven rules for individuals (e.g., individual consumer rules for buying items, individual retailer rules for stocking items, or individual firm rules for advertising items) to determine holistic, system‐level outcomes (e.g., to determine whether brand X's market share is increasing). We applied agent‐based modeling to develop a multi‐scale consumer market model. We then conducted calibration, verification, and validation tests of this model. The model was successfully applied by Procter & Gamble to several challenging business problems, where it directly influenced managerial decision making and produced substantial cost savings. © 2010 Wiley Periodicals, Inc. Complexity, 2010
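
The following toy sketch, with invented rules far simpler than the Procter & Gamble model, illustrates the agent-based idea: individual consumer and retailer rules aggregate into a system-level outcome such as brand share.

```python
# Toy agent-based market: consumer rules plus a retailer stocking rule
# produce an emergent brand-share trajectory.
import random

random.seed(42)
BRANDS = ["X", "Y"]

class Consumer:
    def __init__(self):
        self.preferred = random.choice(BRANDS)
    def buy(self, in_stock):
        # Rule: buy the preferred brand if stocked, otherwise switch brands.
        if self.preferred not in in_stock:
            self.preferred = random.choice(in_stock)
        return self.preferred

shoppers = [Consumer() for _ in range(1000)]
for week in range(5):
    stocked = BRANDS if week < 3 else ["Y"]   # retailer rule: delist X in week 3
    purchases = [c.buy(stocked) for c in shoppers]
    share_x = purchases.count("X") / len(purchases)
    print(f"week {week}: brand X share = {share_x:.2f}")
```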

11.
Formal methods abound in the teaching of probability and statistics. In the Connected Probability project, we explore ways for learners to develop their intuitive conceptions of core probabilistic concepts. This article presents a case study of a learner engaged with a probability paradox. Through engaging with this paradoxical problem, she develops stronger intuitions about notions of randomness and distribution and the connections between them. The case illustrates a Connected Mathematics approach: that primary obstacles to learning probability are conceptual and epistemological; that engagement with paradox can be a powerful means of motivating learners to overcome these obstacles; that overcoming these obstacles involves learners making mathematics—not learning a “received” mathematics—and that, through programming computational models, learners can more powerfully express and refine their mathematical understandings.
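
The particular paradox of the case study is not described in this abstract; as a stand-in, here is the kind of short Monte Carlo model a learner might program for a classic paradox (Monty Hall), where the simulation makes the counterintuitive distribution visible.

```python
# Monte Carlo model of the Monty Hall problem.
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # The host opens a goat door, so switching takes the one remaining
        # door: switching wins exactly when the first pick was wrong.
        wins += (pick != car) if switch else (pick == car)
    return wins / trials

print("stay:  ", play(switch=False))   # ~1/3
print("switch:", play(switch=True))    # ~2/3
```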

12.
In this paper, a multi-valued propositional logic — the logic of agreement — is presented in terms of its model theory and inference system. This formal system is the natural consequence of a new way to approach concepts such as commonsense knowledge, uncertainty, and approximate reasoning — the point of view of agreement. In particular, a possible extension of the Classical Theory of Sets is discussed, based on the idea that, instead of trying to conceptualize sets as “fuzzy” or “vague” entities, it is more adequate to define membership as the result of a partial agreement among a group of individual agents. Furthermore, it is shown that the concept of agreement provides a framework for the development of a formal and sound explanation for concepts (e.g., fuzzy sets) which lack formal semantics. According to the definition of agreement, an individual agent agrees or not with the fact that an object possesses a certain property. A clear distinction is then established between an individual agent — for whom deciding whether an element belongs to a set is just a yes-or-no matter — and a commonsensical agent — the one who interprets the knowledge shared by a certain group of people. Finally, the logic of agreement is presented and discussed. As the existence of several individual agents is assumed, the semantic system is based on the perspective that each individual agent defines her/his own conceptualization of reality. The semantics of the logic of agreement can thus be seen as similar to a possible-worlds semantics, with one world for each individual agent. The proof theory is an extension of a natural deduction system, using supported formulas and incorporating only inference rules. Moreover, the soundness and completeness of the logic of agreement are also presented.

13.
Correspondence analysis, a data analytic technique used to study two‐way cross‐classifications, is applied to social relational data. Such data are frequently termed “sociometric” or “network” data. The method allows one to model forms of relational data and types of empirical relationships not easily analyzed using either standard social network methods or common scaling or clustering techniques. In particular, correspondence analysis allows one to model:

—two‐mode networks (rows and columns of a sociomatrix refer to different objects)

—valued relations (e.g. counts, ratings, or frequencies).

In general, the technique provides scale values for row and column units, visual presentation of relationships among rows and columns, and criteria for assessing “dimensionality” or graphical complexity of the data and goodness‐of‐fit to particular models. Correspondence analysis has recently been the subject of research by Goodman, Haberman, and Gilula, who have termed their approach to the problem “canonical analysis” to reflect its similarity to canonical correlation analysis of continuous multivariate data. This generalization links the technique to more standard categorical data analysis models, and provides a much‐needed statistical justification.

We review both correspondence and canonical analysis, and present these ideas by analyzing relational data on the 1980 monetary donations from corporations to nonprofit organizations in the Minneapolis–St. Paul metropolitan area. We also show how these techniques are related to dyadic independence models, first introduced by Holland, Leinhardt, Fienberg, and Wasserman in the early 1980s. The highlight of this paper is the relationship between correspondence and canonical analysis and these dyadic independence models, which are designed specifically for relational data. The paper concludes with a discussion of this relationship, and some data analyses that illustrate the fact that correspondence analysis models can be used as approximate dyadic independence models.
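
For readers who want the mechanics, correspondence analysis reduces to an SVD of the standardized residuals of a two-way table; the sketch below uses an invented 3x3 table, not the Minneapolis–St. Paul donation data.

```python
# Correspondence analysis by SVD of standardized residuals.
import numpy as np

N = np.array([[10., 2., 1.],      # rows: e.g., corporations (hypothetical)
              [ 3., 8., 2.],      # columns: e.g., nonprofit sectors (hypothetical)
              [ 1., 2., 9.]])
P = N / N.sum()
r, c = P.sum(axis=1), P.sum(axis=0)                  # row and column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardized residuals
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

row_scores = U[:, :2] * sv[:2] / np.sqrt(r)[:, None]     # principal coordinates
col_scores = Vt.T[:, :2] * sv[:2] / np.sqrt(c)[:, None]
print("inertia explained by 2 dims:", (sv[:2]**2).sum() / (sv**2).sum())
```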

14.
Methods for analyzing or learning from “fuzzy data” have attracted increasing attention in recent years. In many cases, however, existing methods (for precise, non-fuzzy data) are extended to the fuzzy case in an ad-hoc manner, and without carefully considering the interpretation of a fuzzy set when being used for modeling data. Distinguishing between an ontic and an epistemic interpretation of fuzzy set-valued data, and focusing on the latter, we argue that a “fuzzification” of learning algorithms based on an application of the generic extension principle is not appropriate. In fact, the extension principle fails to properly exploit the inductive bias underlying statistical and machine learning methods, although this bias, at least in principle, offers a means for “disambiguating” the fuzzy data. Alternatively, we therefore propose a method which is based on the generalization of loss functions in empirical risk minimization, and which performs model identification and data disambiguation simultaneously. Elaborating on the fuzzification of specific types of losses, we establish connections to well-known loss functions in regression and classification. We compare our approach with related methods and illustrate its use in logistic regression for binary classification.
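
The loss-generalization idea can be sketched for the simplest epistemic case, interval-valued regression targets, where the generalized squared loss of a prediction is its distance to the nearest admissible target; fitting then disambiguates the data toward the model. This is a simplified stand-in for the paper's fuzzy-set treatment.

```python
# Empirical risk minimization with interval-valued (epistemic) targets:
# the loss is zero anywhere inside the interval, squared distance outside.
import numpy as np
from scipy.optimize import minimize

X = np.array([0.0, 1.0, 2.0, 3.0])
Y = np.array([[0.0, 0.5],            # each target known only up to an interval
              [0.8, 1.4],
              [1.6, 2.6],
              [2.9, 3.3]])

def risk(theta):
    pred = theta[0] + theta[1] * X
    nearest = np.clip(pred, Y[:, 0], Y[:, 1])   # closest admissible target
    return ((pred - nearest) ** 2).sum()

fit = minimize(risk, x0=[0.0, 0.0])
print("intercept, slope:", fit.x.round(3))      # a line threading the intervals
```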

15.
We have developed a new financial indicator—called the Interest Rate Differentials Adjusted for Volatility (IRDAV) measure—to assist investors in currency markets. On a monthly basis, we rank currency pairs according to this measure and then select a basket of pairs with the highest IRDAV values. Under positive market conditions, an IRDAV based investment strategy (buying a currency with a high interest rate and simultaneously selling a currency with a low interest rate, after adjusting for the volatility of the currency pairs in question) can generate significant returns. However, when the markets turn for the worse and crisis situations evolve, investors exit such money-making strategies suddenly, and—as a result—significant losses can occur. In an effort to minimize these potential losses, we also propose an aggregated Risk Metric that estimates the total risk by looking at various financial indicators across different markets. These risk indicators are used to obtain early signals of evolving crises and to flip the strategy from long to short in a timely fashion, preventing losses and making further gains even during crisis periods. Since our proprietary model is implemented in Excel as a highly nonlinear “black box” computational procedure, we use suitable global optimization methodology and software—the Lipschitz Global Optimizer solver suite linked to Excel—to maximize the performance of the currency basket, based on our selection of key decision variables. After the introduction of the new currency trading model and its implementation, we present numerical results based on actual market data. Our results clearly show the advantages of using global optimization based parameter settings, compared to the typically used “expert estimates” of the key model parameters.
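
Since the IRDAV measure itself is proprietary, the sketch below approximates the ranking step as the interest-rate differential divided by realized volatility; all pairs and numbers are invented.

```python
# Hypothetical monthly ranking step: rank pairs by rate differential over
# volatility, then take the top of the list as the basket.
pairs = {
    # (rate_long, rate_short, annualized vol) -- invented values
    "AUD/JPY": (0.045, 0.001, 0.14),
    "NZD/CHF": (0.050, 0.010, 0.11),
    "EUR/USD": (0.020, 0.025, 0.08),
    "GBP/JPY": (0.040, 0.001, 0.16),
}

def irdav_proxy(rate_long, rate_short, vol):
    return (rate_long - rate_short) / vol

ranked = sorted(pairs, key=lambda p: irdav_proxy(*pairs[p]), reverse=True)
basket = ranked[:2]                      # top-ranked pairs form the basket
for p in ranked:
    print(f"{p}: score = {irdav_proxy(*pairs[p]):.2f}")
print("monthly basket:", basket)
```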

16.
With a view to enhancing our computer implementation, we explore the possibility of an algorithmic optimization of the various proof‐theoretic techniques employed by Kohlenbach for the synthesis of new (and better) effective uniform bounds out of established qualitative proofs in Numerical Functional Analysis. Concretely, we prove that the method (developed by the author in his thesis, as an adaptation to Dialectica interpretations of Berger's original technique for modified realizability and A‐translation) of “colouring” some of the quantifiers as “non‐computational” extends well to ε‐arithmetization, elimination‐of‐extensionality, and model‐interpretation. (© 2009 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim)

17.
This paper derives rational ecological–economic equilibrium outcomes—capital and variable input allocations, harvests, discards, revenue, costs, and stock abundances—in a spatially heterogeneous, multispecies fishery that is regulated with individual fishing quotas (IFQs). The production setting is decentralized; a manager chooses species-specific, seasonal, and spatially nondelineated quotas. Industry controls all aspects of harvesting operations. We present a solution concept and computational algorithm to solve for equilibrium harvests, discards, and profits across species, space, and time (within the regulatory cycle). Applied recursively, the rational equilibrium mapping that we derive can be used to implement management-preferred bioeconomic outcomes. The model offers an essential IFQ regulation-to-outcome mapping that enables more precise implementation of management goals in multiple-species and heterogeneous fishery settings. Recommendations for Resource Managers: Knowing where and when individual tradeable fishing quotas will be utilized across heterogeneous space and time in multiple-species fisheries is essential for effective fisheries management. Ad hoc models, while simple, contribute to “implementation uncertainty,” whereby predicted mortality, discard, cost, and rent outcomes across fish species, space, and time are poorly matched to the realized outcomes implemented by resource users. A model of rational equilibrium mortality, discards, costs, and rent across space and time offers a powerful tool to improve the management of quota-regulated fisheries.

18.
There is a tendency to explain away successful urban schools as the product of heroic efforts by a tireless individual, effectively blaming underperforming schools for a lack of grit and dedication. This study reports the development of a research instrument (School Science Infrastructure, or SSI) and its application to an investigation of equitable science performance by elementary schools. Our efforts to develop a science‐specific instrument to explore associations between school‐level variables and equitable science performance are informed by James Coleman's tripartite notion of social capital: the “wealth” of organizations is encompassed within their social norms, informational channels, and reciprocating relationships. Grounded in school effectiveness research and social capital theory, the instrument that we report on here is a valid and reliable tool to support meso‐level investigations of factors contributing to school variations in science achievement.

19.
Here, it is shown that the “traditional approach” to variable bubble-point problems, using black-oil models, is not consistent, because it violates the “bubble-point conservation law.” In order to have a consistent approach, it is necessary to incorporate shocks—discussed in previous papers—in which the bubble-point is discontinuous. A “consistent approach” is applied to specific examples, and the results are compared with those of the “traditional” one. The conclusion is reached that the “traditional approach” generally yields large errors for the production rates and other parameters of interest in the oil industry. © 1997 John Wiley & Sons, Inc.

20.
The proofs of universally quantified statements in mathematics are given as “schemata” or “prototypes” which may be applied to each specific instance of the quantified variable. Type Theory allows this informal intuition, described by many including Herbrand, to be turned into a rigorous notion. In this constructive approach, where propositions are types, proofs are viewed as terms of the λ‐calculus and act as “proof‐schemata” for universally quantified types. We examine here the critical case of impredicative Type Theory, i.e., Girard's system F, where type quantification ranges over all types. Coherence and decidability properties are proved for prototype proofs in this impredicative context.
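
A standard System F example, not taken from the paper, makes the "proof as prototype" idea concrete: a single term proves the universally quantified statement and can be instantiated at any type.

```latex
% The polymorphic identity is one proof-schema for "for all types alpha,
% alpha implies alpha"; type application instantiates it to each case.
\[
  \Lambda\alpha.\,\lambda x^{\alpha}.\,x \;:\; \forall\alpha.\,\alpha\to\alpha
  \qquad\text{so that}\qquad
  (\Lambda\alpha.\,\lambda x^{\alpha}.\,x)\,[\mathsf{nat}] \;:\; \mathsf{nat}\to\mathsf{nat}
\]
```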
