Similar Documents
20 similar documents found
1.
Constructivism has been a key referent for research into the learning of science for several decades. There is little doubt that the research into learners’ ideas in science stimulated by the constructivist movement has been voluminous, and a great deal is now known about the ways various science topics are commonly understood by learners of different ages. Despite this significant research effort, there have been serious criticisms of this area of work: of its philosophical underpinning, the validity of its most popular constructs, the limited scope of its focus, and its practical value to science teaching. This paper frames this area of work as a Lakatosian Research Programme (RP) and explores the major criticisms of constructivism from that perspective. It is argued that much of the criticism may be considered part of the legitimate academic debate expected within any active RP, i.e. arguments about the auxiliary theory making up the ‘protective belt’ of the programme. It is suggested that shifting the focus from constructivism to ‘contingency in learning’ will allow the RP to draw upon a more diverse range of perspectives, each consistent with the existing hard core of the programme, which will provide potentially fruitful directions for future work and ensure the continuity of a progressive RP into learning science.

2.
Constructivism rejects the metaphysical position that “truth”, and thus knowledge in science, can represent an “objective” reality independent of the knower. It modifies the role of knowledge from “true” representation to functional viability. In this interview, Ernst von Glasersfeld, the leading proponent of Radical Constructivism, underlines the inaccessibility of reality and proposes his view that the function of cognition is adaptive in the biological sense: adaptation is the result of the elimination of all that is not adapted. There is no rational way of knowing anything outside the domain of our experience, and we construct our world of experiences. In addition to these philosophical claims, the interviewee offers some personal insights, along with suggestions about better teaching and problem solving. These are the aspects of constructivism that have had a major impact on instruction and have modified the manner in which many of us teach. The process of teaching as linguistic communication, he says, needs to change so as to involve students actively in the construction of their knowledge. Because knowledge is not a transferable commodity, learning is mainly identified with the activity of constructing personal meaning. The interview also provides glimpses of von Glasersfeld’s life.

3.
It is proposed that molecular phenomena may only be described within the framework of the Complementarity Principle (‘CP’), and that scientific controversy may originate in the essential incompatibility of complementary representations. Complementarity based on the temporal Uncertainty Principle (‘UP’) leads to new insights into transition state theory, microscopic reversibility and the Curtin–Hammett Principle. An empirical application of the ‘CP’ to the structural theory leads to a revision of present concepts of ‘reaction dynamics’, with the Principle of Least Nuclear Motion (‘PLNM’) emerging as a general alternative to electronic theories of reactivity. Indeed, it is argued that the ‘PLNM’ is a better basis for the Woodward–Hoffmann rules than orbital symmetry. A more flexible approach to organic reaction mechanisms is thus indicated. Also, since the basis of the structural theory is fundamentally uncertain, and the present theory of X-ray diffraction is apparently incompatible with the ‘UP’, a reinterpretation of the Bragg equation has been attempted.
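For reference, the standard textbook forms of the two relations at issue here are the time-energy uncertainty relation and the Bragg equation; the abstract does not reproduce the paper’s proposed reinterpretation itself:

```latex
\Delta E \,\Delta t \;\geq\; \frac{\hbar}{2},
\qquad
n\lambda \;=\; 2d\sin\theta
```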

4.
This critical survey argues that the theory conventionally used to interpret kinetic data measured for thermal reactions of initially solid reactants is not always suitable for elucidating reaction chemistry and mechanisms or for identifying reactivity controls. Studies of solid-state decompositions published before the 1960s usually portrayed the reaction rate as determined by Arrhenius-type models closely related to those formulated for homogeneous rate processes, though scientific justifications for these parallels remained incompletely established. Since the 1960s, when thermal analysis techniques were developed, studies of solid-state decompositions contributed to the establishment of the new experimental techniques, but research interest became redirected towards increasing the capabilities of automated equipment to collect, store and later analyze rate changes for selected reactions. Subsequently, much less attention has been directed towards the chemical features of the rate processes studied, which have included a range of reactants much more diverse than the simple solid-state reactions with which early thermokinetic studies were principally concerned. Moreover, the theory applied to these various reactants does not recognize the possible complexities of behaviour, which may include mechanisms involving melting and/or concurrent or consecutive reactions. The situation that has arisen following, and is attributable to, the eclipse of solid-state decomposition studies by thermal analysis is presented here, and its consequences are critically discussed in a historical context. It is concluded that the methods currently used for kinetic and mechanistic investigations, in which all types of thermal reactions are indiscriminately treated by the same inadequate theory, are unsatisfactory. Urgent and fundamental reappraisal of the theoretical foundations of thermokinetic chemical studies is now necessary and overdue. While there are important, but hitherto unrecognized, delusions in thermokinetic methods and theories, an alternative theoretical explanation that accounts for many physical and chemical features of crystolysis reactions has been proposed. However, this novel but general model for the thermal behaviour and properties of solids has likewise been ignored by the thermoanalytical community. The objective of this article is to emphasize the now pressing necessity for an open debate between these unreconciled opinions of different groups of researchers. The ethos of science is that disagreement between rival theories can be resolved by experiment and/or discussion, which may also strengthen the foundations of the subject in the process. As pointed out below, in recent years there has been no movement towards resolving some fundamental differences of opinion in a field that lacks an adequate theory. This should be unacceptable to all concerned. Here some criticisms are made of specific features of the alternative reaction models available, with the stated intention of provoking a debate that might lead to identification of the significant differences between the currently irreconciled views. This could, of course, attract the displeasure of both sides, who will probably criticise me severely. Because I intend to retire completely from this field soon, it does not matter to me if I am considered to be ‘wrong’, if it contributes to us all eventually agreeing to get the science ‘right’.
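As a minimal sketch of the Arrhenius-type analysis the abstract refers to (not the paper’s own data or code), the apparent activation energy and pre-exponential factor can be recovered from rate constants by a linear fit of ln k against 1/T; the data below are synthetic and purely illustrative:

```python
import numpy as np

# Arrhenius model: k(T) = A * exp(-Ea / (R * T)), so ln k is linear in 1/T
# and (A, Ea) follow from a least-squares fit of ln k vs 1/T.
R = 8.314  # gas constant, J mol^-1 K^-1

T = np.array([500.0, 525.0, 550.0, 575.0, 600.0])  # temperature, K
k = 1e13 * np.exp(-150_000 / (R * T))              # synthetic rate constants, s^-1

slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R        # apparent activation energy, J mol^-1
A = np.exp(intercept)  # pre-exponential factor, s^-1
print(f"Ea = {Ea / 1000:.1f} kJ/mol, A = {A:.2e} s^-1")
```

The survey’s point is precisely that such a fit, transplanted from homogeneous kinetics, may not be mechanistically meaningful for reactions of initially solid reactants.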

5.
The SI is the world’s premier scientific instrument and, like all instruments, must be updated as science and technology advance. The CIPM is planning to redefine four SI base units in terms of invariants of nature and has already dubbed this revision the ‘New SI’. This title will confuse most SI users, who are affected only by the structure and notation of the SI, and neither of these has changed. A truly ‘New SI’, to be practical in the information age, would have at minimum an unambiguous, and preferably a machine- and user-friendly, notation. These and other principles for constructing a notation are proposed, and the SI’s historical and current notations are examined against them. Several levels of remediation are suggested.
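As one illustration of what an unambiguous, machine-friendly notation could look like (a hypothetical grammar for this sketch, not the paper’s actual proposal), each term can be an SI symbol followed by an optional signed integer exponent, with terms separated by dots:

```python
import re

# One unit term: letters (symbol) plus optional signed integer exponent.
TERM = re.compile(r"([A-Za-z]+?)(-?\d+)?$")

def parse_unit(expr: str) -> dict:
    """Parse e.g. 'kg.m2.s-3' into {'kg': 1, 'm': 2, 's': -3}."""
    exponents = {}
    for term in expr.split("."):
        m = TERM.match(term)
        if not m:
            raise ValueError(f"malformed unit term: {term!r}")
        symbol, exp = m.group(1), int(m.group(2) or 1)
        exponents[symbol] = exponents.get(symbol, 0) + exp
    return exponents

print(parse_unit("kg.m2.s-3"))  # {'kg': 1, 'm': 2, 's': -3}, i.e. the watt
```

A linear, pure-ASCII grammar like this is trivially parseable, which is the kind of property the paper’s principles ask of any candidate notation.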

6.
In this paper I expand Eric Scerri’s notion of Popper’s naturalised approach to reduction in chemistry and investigate its consequences. I will argue that Popper’s naturalised approach has a number of interesting consequences when applied to the reduction of chemistry to physics. One of them is that it prompts us to look at a ‘bootstrap’ approach to quantum chemistry, based on specific quantum-theoretical theorems and practical considerations that turn quantum ‘theory’ into quantum ‘chemistry’ proper. This approach allows us to investigate some of the principles that drive theory formation in quantum chemistry. These ‘enabling theorems’ place certain limits on the explanatory latitude enjoyed by quantum chemists, and form a first step towards establishing the relationship between chemistry and physics in more detail.

7.
On “Self-Dialogue” Education in Chemistry Teaching
曹少华 (Cao Shaohua). 《化学教育》 (Chemistry Education), 2005, 26(7): 12–13
“Self-dialogue” education emphasizes the learner’s agency and initiative, and it accords with the new curriculum philosophy and with constructivist learning theory. Implementing “self-dialogue” education requires awakening students’ awareness of “self-dialogue” and putting it into practice through forms such as pre-learning “self-knowledge” dialogue, in-learning “generative” dialogue and post-learning “reflective” dialogue.

8.
Most base units in the SI relate to specific sensory qualities our body is able to observe: space, heat, brightness, etc. The base unit ‘mole’ incorporates an intellectual insight: the atomistic perception of the world. This perception is the quintessence of over 300 years of scientific research. That quintessence, from Boyle’s ‘The Sceptical Chymist’ to Perrin’s Nobel Prize in 1926 and Pauling’s ‘The Nature of the Chemical Bond’ in 1939, leads to the conclusion that the base unit of the SI quantity ‘amount of substance’ is not the mole but the dimensionless entity.
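The arithmetic underlying the argument is the defining relation between amount of substance n, the number of entities N, and the Avogadro constant (fixed exactly in the 2019 SI revision):

```latex
n \;=\; \frac{N}{N_\mathrm{A}},
\qquad
N_\mathrm{A} \;=\; 6.022\,140\,76 \times 10^{23}\ \mathrm{mol^{-1}}
```

On the abstract’s reading, a ‘mole’ is then simply shorthand for this fixed, dimensionless number of entities.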

9.
10.
11.
The use of simple linear mathematical models to estimate chemical properties is not a new idea. Albert Einstein used very simple ‘gravity-like’ forces to explain the capillarity of different liquids in 1900–1901. Today such models are used in more complicated situations, and a great many have been developed to analyse interactions between proteins and their ligands. This is not surprising, since proteins are too complicated to model accurately without lengthy numerical analysis, and simple models often do at least as good a job of predicting binding constants as much more computationally expensive methods. One hundred years after Einstein’s ‘miraculous year’, in which he transformed physics, it is instructive to recall some of his even earlier work. As approximations, ‘scoring functions’ are excellent, but it is dangerous to read too much into them. A few cautionary tales are presented for the beginner to the field of ligand-affinity prediction by linear models.
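A minimal sketch of such a linear scoring function, fitted by ordinary least squares; the descriptor names and all numbers here are hypothetical, whereas real scoring functions are calibrated on large sets of protein–ligand complexes with measured binding constants:

```python
import numpy as np

# Hypothetical descriptors per ligand: [h_bonds, lipophilic_contact, rot_bonds]
X = np.array([[3, 120.0, 5],
              [1,  80.0, 2],
              [4, 200.0, 8],
              [2, 150.0, 4],
              [5,  90.0, 6]], dtype=float)
pK = np.array([6.2, 4.8, 7.5, 6.0, 7.1])  # hypothetical measured pK_d values

# Append a column of ones for the intercept, then solve by least squares.
A = np.hstack([X, np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(A, pK, rcond=None)

print("weights:", w[:-1], "intercept:", w[-1])
print("predicted pK_d:", A @ w)  # score = weighted sum of descriptors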

12.
An inevitable consequence of humans living in the Aluminium Age is the presence of aluminium in the brain. This non-essential, neurotoxic metal gains entry to the brain throughout all stages of human development, from the foetus through to old age. Human exposure to myriad forms of this ubiquitous and omnipresent metal makes its presence in the brain inevitable, while the structure and physiology of the brain make it particularly susceptible to the accumulation of aluminium with age. In spite of aluminium’s complete lack of biological essentiality, it participates avidly in brain biochemistry and substitutes for essential metals in critical biochemical processes. The degree to which such substitutions are disruptive and are manifested as biological effects will depend upon the biological availability of aluminium in any particular physical or chemical compartment, and will under all circumstances exert an energy load on the brain. In short, the brain must expend energy in its ‘unconscious’ response to an exposure to biologically available aluminium. There are many examples where ‘biological effect’ has resulted in aluminium-induced neurotoxicity, most potently in conditions that have resulted in an aluminium-associated encephalopathy. However, since aluminium is non-essential and not required by the brain, its biological availability will only rarely achieve such levels of acuity, and it is more pertinent to consider and investigate the brain’s response to much lower though sustained levels of biologically reactive aluminium. This is the level of exposure that defines the putative role of aluminium in chronic neurodegenerative disease; although it has been thoroughly investigated in numerous animal models, the chronic toxicity of aluminium has yet to be addressed experimentally in humans. A feasible test of the ‘aluminium hypothesis’, whereby aluminium in the human brain is implicated in chronic neurodegenerative disease, would be to reduce the brain’s aluminium load to the lowest possible level by non-invasive means. The simplest way this aim can be fulfilled in a significant and relevant population is by facilitating the urinary excretion of aluminium through the regular drinking of a silicic acid-rich mineral water over an extended period. This will lower the body and brain burden of aluminium, and in doing so will test whether brain aluminium contributes significantly to chronic neurodegenerative diseases such as Alzheimer’s and Parkinson’s.

13.
Gyorgy Kepes (1906–2001) established an innovative department at the Massachusetts Institute of Technology, the Center for Advanced Visual Studies, and edited an influential book series. These activities went far beyond a scientific approach to Design; they pointed the way to establishing conscious efforts of design in Science. With progress in materials science and structural chemistry, we witness the realization of Kepes’s dreams and the ingenuity of his ideas.

14.
Excitation–emission matrices (EEMs) from fluorescence spectroscopy may contain characteristic information about different algae species. As a result of the measurements, one obtains a whole stack of EEMs, each corresponding to one species. Such a stack of matrices is to be understood as a cubic data array spanned by the dimensions ‘excitation’, ‘emission’ and ‘species’. The interpretation of higher-dimensional data arrays requires efficient tools from multivariate data analysis. In this paper, it is illustrated how Three-way Principal Components Analysis, as the appropriate generalization of conventional Principal Components Analysis, may serve as a powerful method for the classification of algae species.
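A minimal sketch of the data layout described: a stack of EEMs forms a three-way array (species × excitation × emission). Full three-way PCA (Tucker3) requires a dedicated implementation, so the sketch below shows only the simpler unfold-then-PCA step, on synthetic data:

```python
import numpy as np

# Synthetic stack of EEMs: one (excitation x emission) matrix per species.
rng = np.random.default_rng(0)
n_species, n_ex, n_em = 6, 20, 30
eem_stack = rng.random((n_species, n_ex, n_em))

# Unfold the cube along the species mode, column-centre, then PCA via SVD.
X = eem_stack.reshape(n_species, n_ex * n_em)  # species x (ex*em)
X -= X.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)

scores = U * s                       # species scores on the principal components
print("scores shape:", scores.shape)  # (6, 6); plot the first 2 PCs to classify
```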

15.
In the 18th century, the concept of ‘affinity’, ‘principle’ and ‘element’ dominated chemical discourse, both inside and outside the laboratory. Although much work has been done on these terms and the methodological commitments which guided their usage, most studies over the past two centuries have concentrated on their application as relevant to Lavoisier's oxygen theory and the new nomenclature. Kim's affinity challenges this historiographical trajectory by looking at several French chemists in the light of their private thoughts, public disputations and communal networks. In doing so, she tells a complex story which points to the methodological and practical importance of industrial and medical chemistry. The following review highlights the advantages and snares of such an approach and makes a few historiographical points along the way. This revised version was published online in August 2006 with corrections to the Cover Date.  相似文献   

16.

17.
Desimoni and Brunetti raise some questions about the use of the Eurachem/CITAC guide, because the guide does not discuss an ISO recommendation that, before a test is performed, it should be decided whether it is to be a test for conformity or a test for non-conformity. In response, it is pointed out that although this recommendation is not discussed explicitly, it is of necessity covered by the decision rule, which describes how the measurement uncertainty will be taken into account when accepting or rejecting a product according to its specification and the result of a measurement. Desimoni and Brunetti also propose the introduction of an ‘inconclusive’ zone. We do not think that this is necessary, since the Eurachem/CITAC guide takes the view that action on rejection should be covered by the ‘decision rule’, which can make equivalent provision for confirmation or interpretation.
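A minimal sketch of one possible decision rule of the kind discussed, assuming a single upper specification limit and a coverage factor k = 2; this guard-banded rule is one common choice, not the guide’s only option:

```python
def conforms(result: float, upper_limit: float, u: float, k: float = 2.0) -> bool:
    """Accept only if the result plus expanded uncertainty U = k*u is within the limit."""
    return result + k * u <= upper_limit

print(conforms(result=9.2, upper_limit=10.0, u=0.3))  # True:  9.8 <= 10.0
print(conforms(result=9.6, upper_limit=10.0, u=0.3))  # False: 10.2 > 10.0
```

Under such a rule every result is either accepted or rejected once the uncertainty is accounted for, which is why a separate ‘inconclusive’ zone is argued to be unnecessary.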

18.
The ‘triumph of the anti-phlogistians’ is a familiar story to the historians and philosophers of science who characterize the Chemical Revolution as a broad conceptual shift. The apparent “incommensurability” of the paradigms across the revolutionary divide has caused much anxiety. Chemists could identify phlogiston and oxygen, however, only with different sets of instrumental practices, theoretical schemes and philosophical commitments. Moreover, the substantive counterpart to phlogiston in the new chemistry was not oxygen but caloric. By focusing on the changing visions of the chemical body across the revolutionary divide, with a more sensitive probe into the historical actors’ material manipulations and linguistic usage, we can historicize their laboratory realities and philosophical agenda. An archeology of chemical bodies that configures the fragile stability of the material worlds chemists created in succession promises a philosophical horizon that would recognize our hybrid (natural–artificial) environment as an evolving investigative object of science.

19.
We have investigated the effect of ‘Graham’s salt’ as a phosphorus-containing flame retardant applied to cotton fabric. The optimum loading of this salt to impart flame retardancy was determined to be about 36.78–41.31 g salt per 100 g cotton woven fabric (plain weave, 144 g m−2). Thermogravimetric measurements of pure cotton, the treated cotton fabric and the pure salt were carried out, and the resulting curves were compared and discussed. They reveal that the salt, acting as a dehydrating agent, sensitized the thermal decomposition of the treated substrate. The results support both the ‘Chemical Theory’ and the ‘Coating Theory’, as evidenced by the formation of a carbonaceous residue on the cellulosic substrate during combustion.

20.
In both European legislation relating to the testing of food and the recommendations of the Codex Alimentarius Commission, there is a movement away from specifying particular analytical methods towards specifying performance criteria to which any method used must adhere. This ‘criteria approach’ has hitherto been based on the features traditionally used to describe analytical performance. This paper proposes replacing the traditional features, namely accuracy, applicability, detection limit and limit of determination, linearity, precision, recovery, selectivity and sensitivity, with a single specification, the uncertainty function, which tells us how the uncertainty varies with concentration. The uncertainty function can be used in two ways: either as a ‘fitness function’, which describes the uncertainty that is fit for purpose, or as a ‘characteristic function’, which describes the performance of a defined method applied to a defined range of test materials. Analytical chemists reporting the outcome of method validations are encouraged to do so in future in terms of the uncertainty function. When no uncertainty function is available, existing traditional information can be used to define one that is suitable for ‘off-the-shelf’ method selection. Some illustrative examples of the use of these functions in method selection are appended.
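A minimal sketch of an uncertainty function, using a commonly used two-parameter shape (a constant term dominating near the detection limit and a proportional term dominating at high concentration); the paper does not mandate this particular form, and the parameter values here are illustrative:

```python
import numpy as np

def u(c, alpha=0.01, beta=0.05):
    """u(c) = sqrt(alpha^2 + (beta*c)^2); alpha in concentration units, beta relative."""
    return np.sqrt(alpha**2 + (beta * np.asarray(c))**2)

c = np.array([0.0, 0.1, 1.0, 10.0])
print(u(c))  # approaches alpha as c -> 0, approaches beta*c at high c
```

In method selection, a method is fit for purpose wherever its ‘characteristic function’ lies below the ‘fitness function’ that the application requires.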
